
Sunday, October 19, 2008

Business tech born in Cern's Big Bang lab

By Nick Heath

Cutting-edge particle physics is being used to hone new technology that will eventually make its way into enterprises.

The Cern nuclear physics laboratory in Geneva, Switzerland, is helping the tech industry refine the multi-core processors and fat gigabit networks destined for the data centres of tomorrow, through its openlab initiative.

The project sees the IT department at the lab behind the "Big Bang" Large Hadron Collider push cutting-edge kit to breaking point, perfecting it for the lab's own use and for the consumer and business markets.


The lab has partnerships with companies including HP ProCurve, Intel and Oracle, which provide the backbone of its IT infrastructure: its 8,000-server computer centre and its links to the Worldwide LHC Computing Grid, which consists of more than 100,000 processors spread across 33 countries.

Cern CIO Wolfgang von Rueden told silicon.com: "We wait for industry to develop the technology, then we take it and see how far we can push it and feed back to them."

Von Rueden said openlab is currently helping Intel test its new chip designs.

"We are looking at optimising and measuring the performance of new CPUs, doing tests on new architectures. The very first quad core that left the US for Europe came to Cern and we tested that," he said.

Cern has also helped Intel test more than 100 optimisations to the software compilers for its chips, over 50 of which have gone on to be implemented.
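Testing a compiler optimisation typically comes down to building the same source with and without a given setting and timing the results. The Python below illustrates the idea with GCC optimisation levels; the benchmark.c file and the choice of flags are hypothetical stand-ins, not the actual Cern/Intel test cases.

    # Minimal sketch of flag-by-flag compiler testing: build the same
    # source at different optimisation levels and time the binaries.
    # benchmark.c is a hypothetical stand-in for a real workload.
    import subprocess
    import time

    SOURCE = "benchmark.c"

    for flag in ("-O0", "-O2", "-O3"):
        binary = f"bench{flag.replace('-', '_')}"
        subprocess.run(["gcc", flag, SOURCE, "-o", binary], check=True)
        start = time.perf_counter()
        subprocess.run([f"./{binary}"], check=True)
        print(f"{flag}: {time.perf_counter() - start:.3f}s")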

The lab is working with HP ProCurve on automated data collection, allowing it to monitor behaviour across its vast network, which has 70,000 one-gigabit ports exchanging data within the Cern complex alone.

von Rueden said: "We try to gather all of the data together and analyse them and try to understand, in automated ways, what would you call a standard data pattern.

"Then you can find an abnormal data pattern to detect abnormal behaviour on the network. This obviously would have a use in any large data centre."

Cern was at the forefront of testing 10Gbps networks before they became widely available, and has dedicated 10Gbps links to its 11 main tier-one computing centres worldwide.

Von Rueden said the department has already detected abnormal behaviour on the LHC Computing Grid, with some users loading up unauthorised shareware.

Working with Oracle, the lab has also greatly increased the speed at which Cern's databases can be distributed to its computer centres.

Von Rueden described the challenge of collecting and organising the millions of streams of information from the LHC's five detectors.

"This data has to be collected, watched and stored and this pushes the database technology quite far," he said.

This continuous refinement of hardware and software has reduced the space needed for the centre's server racks, freeing Cern to double up its storage, CPUs and network bandwidth to maximise reliability.

