News

Powerful Grid Set To Handle Collider Data

A high-performance computing network called the Worldwide LHC Computing Grid (WLCG) will be formally launched on Friday as part of an international scientific collaboration investigating particle physics, including experiments that recreate conditions resembling those just after the Big Bang.

The computer grid will process an expected 15 million gigabytes of data generated annually. Researchers at CERN, the European particle physics laboratory, will measure high-energy proton collisions generated in the Large Hadron Collider (LHC), a ring-shaped underground particle accelerator located near Geneva.

The WLCG, comprising more than 140 computer centers in 33 countries, will use purpose-built technologies for the project.

"[It will] combine processing power and storage capacity to create a massively powerful and geographically distributed supercomputing resource for physics and a growing number of other disciplines," according to Ian Bird, project manager of the WLCG, in a letter introducing the organization.

The vast amount of data generated by the LHC -- equivalent to filling about 3 million DVDs every year or a stack of CDs more than 20 kilometers high -- can only be handled by a grid of connected computers sharing processing power and storage and then further distributing the information to individual users.
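The article's disc comparisons roughly check out. A quick back-of-envelope sketch, assuming standard single-layer disc capacities (4.7 GB per DVD, 700 MB per CD, 1.2 mm per CD) that are not stated in the article itself:

```python
# Sanity-check the LHC data-volume comparisons from the article.
# Assumed disc specs (not from the article): 4.7 GB DVD,
# 700 MB CD, 1.2 mm CD thickness.

ANNUAL_DATA_GB = 15_000_000  # 15 million gigabytes per year

DVD_CAPACITY_GB = 4.7
dvds_per_year = ANNUAL_DATA_GB / DVD_CAPACITY_GB  # roughly 3.2 million DVDs

CD_CAPACITY_MB = 700
CD_THICKNESS_MM = 1.2
cds_per_year = ANNUAL_DATA_GB * 1000 / CD_CAPACITY_MB
stack_height_km = cds_per_year * CD_THICKNESS_MM / 1_000_000  # roughly 26 km

print(f"{dvds_per_year:,.0f} DVDs; CD stack about {stack_height_km:.0f} km high")
```

Under those assumptions the annual data volume fills just over 3 million DVDs, and the equivalent CD stack comes out well above the 20 kilometers the article cites.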

The WLCG distributes its resources across three tiers. Tier 0 is the CERN Computing Center, a central hub that provides less than 20 percent of the total computing capacity. Tier 1 comprises sites in Canada, France, Germany, Italy, the Netherlands, the Nordic countries, Spain, Taipei and the United Kingdom, plus two sites in the United States. Tier 2 consists of 140 sites grouped into 60 "federations" across 33 countries, which together will provide around 50 percent of the capacity needed to process the LHC data. The Tier 2 sites will feed their data to physics institutes throughout the world, serving both research groups and individual scientists.

The Worldwide LHC Computing Grid, Bird concluded, was conceived nine years ago to serve physics "and a growing number of other disciplines."

About the Author

Jim Barthold is a freelance writer based in Delanco, N.J., covering a variety of technology subjects.
