Tuesday, August 5, 2008

Intel Plans Chip to Boost Computer Performance

Washington Post Staff Writer

The computer revolution has been powered by chips that operated at ever-higher frequencies. From the Ataris and Commodores of the late 1970s, which ran at roughly one megahertz, to today's devices that run at nearly four gigahertz, one year's star at the electronics store was typically outdone by higher speeds in the next year's model.

But that era of progress may be drawing to a close.

Today, Intel, the world's largest chipmaker, is revealing details about a new chip that seeks to improve performance not by boosting frequency, but by putting more processors or "cores" onto a single chip.

While so-called dual-core and quad-core chips have become commonplace in recent years, Intel expects to place more than 10 "cores" or processors on its new project, code-named Larrabee.

To keep up with the demands of an increasingly digital world, the "multi-core" or "many-core" approach is necessary, engineers said: if chip frequencies had kept climbing at their historical pace, by 2015 a laptop or desktop computer would generate as much heat as a nuclear reactor.

"There is a fundamental physics issue we can no longer get around," said Anwar Ghuloum, a principal engineer managing an Intel group that addresses the software challenges posed by such chips. "If we kept going as we had been, the heat density on a chip would have equaled the surface of the sun."

Intel is releasing features of the Larrabee project today, in advance of an industry conference scheduled for next week in Los Angeles.

Though first intended for graphics-intensive applications such as games, the multi-core approach could become the model for common desktop computers, company officials said. Engineers expect the chip to be capable of processing about a trillion instructions a second.

The first product based on Larrabee is expected in 2009 or 2010, and Intel officials anticipate that not long after 2010, there will be laptops running on chips with more than 10 cores.

The drawback of the new approach is that it requires an equally dramatic shift in the software industry. Some experts, such as Microsoft co-founder Bill Gates, initially expressed reservations because of the disruptive nature of the transition.

To take advantage of a chip with many processors, software has to be broken into chunks of instructions that can run in parallel on multiple processors. A computer program that now consists of one sequential stream of instructions would have to be parceled into two, four, or more than 10 independent streams, depending on the number of cores.
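The parceling described above can be sketched in a few lines of Python using the standard-library `concurrent.futures` module. The names (`partial_sum`, `parallel_sum`) and the four-way split are illustrative, not anything from Intel or Microsoft; threads are used here for portability, though a real CPU-bound workload in CPython would need processes (or a language without a global interpreter lock) to occupy separate cores.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # The sequential work one core would handle on its slice of the data.
    return sum(chunk)

def parallel_sum(data, n_workers=4):
    # Parcel one sequential job into n_workers independent chunks,
    # run them concurrently, then combine the partial results.
    size = max(1, (len(data) + n_workers - 1) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(partial_sum, chunks)
    return sum(partials)
```

The hard part the article alludes to is not the splitting itself but finding work that is truly independent: chunks that share state or must run in order cannot simply be handed to separate cores.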

Once chips with 10 or more cores reach consumer desktops, much of the world's existing software may have to be rewritten to take advantage of the extra power.

To meet the challenge, new programming languages are being created and technology leaders are encouraging computer science departments at universities to bulk up in courses in parallel processing.

An array of technical possibilities -- in language interpretation, robotics and visual recognition -- depends upon increased processing power.

Some game firms, such as Crytek and Valve, have hailed the advances. But multi-core chips present massive and expensive difficulties.

Executives at Microsoft initially balked at the idea when they met with Intel several times about four years ago.

At the first one, Pat Gelsinger, a senior vice president at Intel, described why the company intended to start developing multi-core and then many-core chips. Gelsinger had been warning the industry of the imminent change for years.

Though Microsoft had been researching the multi-core area since 2001, company officials had hoped to delay the transition.

"It was like, 'thanks very much for your input, Pat. Now, it's wrong, go fix it,'" Gelsinger recalled of the response from Gates and other Microsoft engineers.

Gates and Microsoft were "testing Intel's real sense of needing to make this architectural shift," Microsoft said in a statement. The statement added: "In 2004 it became clear this shift would begin in earnest by the end of the decade."
