There's a pretty big battle in the world of high-performance computing (HPC)
and hopefully this will soon affect those of you in IT.
HPC has long been the purview of designers, engineers, 3-D renderers and data
miners. These high-performance boxes cluster massive arrays of processors, often
x86 (and GPUs for the graphics-inclined), and aim all that muscle at a small set
of specialized applications. It's very cool, but unfortunately a bit of a niche.
And many of these systems -- in essence, commodity supercomputers -- have been
running Linux. It's free and nice and scalable across clusters, multicores and
multiprocessors. Windows Server is also showing some spunk in this market, and
the availability of either Linux or Windows means you may be able to apply this
muscle soon to more common data-processing tasks.
Red Hat doesn't want to miss this opportunity and has a new
bundle -- a software stack, if you will -- that includes Linux itself along
with clustering tools and a job scheduler. With so much great commodity hardware
available, this should form the basis of inexpensive and utterly ripping HPC systems.
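If you've never touched this style of computing, the job scheduler is the piece that turns a pile of cluster nodes into a shared resource: users submit batch jobs, and the scheduler decides where and when they run. Here's a minimal sketch of what a batch job script looks like, assuming a PBS-style scheduler for illustration (Red Hat's bundle doesn't specify which scheduler it ships, and the directives, node counts and program names below are hypothetical):

```shell
#!/bin/bash
# Hypothetical PBS-style batch script -- the directives, resource
# requests and application name are illustrative only, not specific
# to Red Hat's HPC bundle.
#PBS -N my_hpc_job            # a name for the job in the queue
#PBS -l nodes=8:ppn=4         # request 8 nodes with 4 processors each
#PBS -l walltime=02:00:00     # kill the job if it runs past two hours

cd "$PBS_O_WORKDIR"           # start in the directory the job was submitted from
mpirun ./my_solver input.dat  # run the MPI application across the allocated nodes
```

In a PBS-style system, you'd submit this with something like `qsub job.sh` and check on it with `qstat`; the point is that the scheduler, not the user, picks which of the cluster's nodes actually do the work.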
Can you see a use for this style of HPC/supercomputer? Super-smart answers
accepted at [email protected].
Posted by Doug Barney on 10/07/2008 at 1:16 PM