Computer R&D rocks on

Recomputing the Future: First of three parts

Think computers have become a commodity, like pork bellies, and computer science an old set of solved problems? Think again.

The computer research agenda is as big as it has ever been, if not bigger. Experts see important breakthroughs and whole new fields of investigation just opening up. Advances will come in natural-language searches, machine learning, computer vision and speech-to-text, as well as new computing architectures to handle those hefty tasks. Beyond the decade mark, Edward D. Lazowska, a professor of computer science at the University of Washington, expects computers based on quantum physics.

Sure, there are problems. Government funding has declined dramatically in the United States, to the point that it's no longer clear whether the nation will maintain its leadership in the face of growing investments by countries such as China (see part two, next week). But even inside Dell Computer Corp. headquarters in Austin, Texas — the beating heart of the so-called commodity computer industry — executives understand the importance of new technology in a sector that has plenty of room to grow (see part three, in two weeks).

Lazowska is quick to disagree with anyone who says the big problems in computer science have been solved.

"That's baloney. We are poised to make several breakthroughs in the next few years," said Lazowska, past chairman of the Computer Research Association, a group of more than 200 university and industry labs. "There will be more coming up in the next decade than there was in the last two decades combined."

"In some ways, I have more to talk about now than I ever had in the past," said Rick Rashid, senior vice president of the 700-person Microsoft Research group. "I see huge progress across a broad swath of areas."

Dick Lampman, the director of Hewlett-Packard Labs, shares that view. "The bubble [of computer research topics] keeps expanding — it's almost exploding," said Lampman, who oversees a team estimated at about 600 researchers. "The biggest shift is that the classic topics are still there, but [now new applications] are shaping and pushing out the frontiers." The new apps span not only today's big corporate computers, but tomorrow's consumer HDTV sets and mobile phones.

"Computer science is at a transition point," said Jitendra Malik, who chairs the CS department at the University of California at Berkeley. "Earlier, what we did only affected technical people, but now everyone uses the Web and e-mail. That means these issues have a broader impact."

Justin Rattner, the director of Intel's Corporate Technology Group, agreed. "We've moved from a community of experts using machines in glass rooms to an environment where the technology is at everyone's fingertips," he said. "We are . . . standing on the shoulders of giants and seeing a whole new set of equally challenging problems."

According to Rattner, today's top computer science problems include designing multicore, multithreaded processors; creating programming environments to harness them; and managing the vast arrays of clustered systems that use them.

Increasingly, the problem set is shifting into the software domain, said Paul Horn, director of IBM Research. Horn promotes a new concept of "services science" to describe the new software agenda.

In the late 1980s, hardware was "about 85 percent of our computer research," he said. "Today, software and services represent a little more than half. Hardware, software and services will end up being comparable [thirds] in size."

The grand challenge here is what Horn calls solutions engineering, which adds a live, personal dimension to applications. For example, tomorrow's social-networking software could automatically optimize systems based on changes it has monitored in end-user behavior.

Such new levels of functionality could emerge hand-in-hand with a radical simplification of the architecture underlying today's software stacks. "There's a lot of good basic research that needs to be done here, a lot of it in exploratory problems," Horn said.

There are always naysayers. "I'm not a big believer in computer science as a science. I think it's more computer engineering," said Glenn Henry, president of x86 chip designer Centaur Technology Inc. (Austin) and a former R&D executive at both IBM and Dell.

"Even the hard-core stuff you do for computer chip design, you do for practical reasons. I don't know that there is a lot of computer research you want to do that is not related to a product," said Henry, who describes himself as a PC guy.

Another IBM Research alumnus stands at the other end of the spectrum. In July, Prabhakar Raghavan took on the job of setting up a world-class research lab for Yahoo. He sees a whole new field of computer science opening up and hopes his group, which now numbers about 40, grows to a point where it files several hundred patent applications a year.

"Our vision is to invent the future of Internet sciences," said Raghavan. "It's a bit of a gamble, an adventure really. You have to figure out what combinations of skills are the right ones for this new science."

Internet science breaks down into five broad fields, according to Raghavan. Many are familiar areas of study given a fresh twist by the Web. At the top of Raghavan's list is information retrieval, aka search, a long-standing field that the Web "completely turned on its head" in the last decade.

Similarly, machine learning and data mining have existed for some time but have taken on a heftier scale in the Internet era. "Yahoo generates 10 Tbytes of data a day. It was a huge milestone 10 years ago when Wal-Mart reported it had reached a terabyte database," said Raghavan. "How do we use all this data? It's hard to find PhDs who have even thought about this."

Likewise, the Internet is creating a new computing utility, with hundreds of thousands of clustered computers serving hundreds of millions of client PCs, TVs and cell phones. The need to provide mainframe-class performance from globally distributed systems in ways simple enough for nontechnical users is opening up issues in both computer architecture and human factors engineering.

"These are disciplines being invented as we speak," Raghavan said. "They place new kinds of demands on computer science."

One last ingredient Raghavan throws into the mix of his Internet-sciences stew at Yahoo is microeconomics. The discipline takes on questions raised by a Web world that may host auctions for a million viewers one moment and calculate the pricing of a custom ad delivered to a single individual the next.
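
The textbook starting point on the auction side is the second-price, or Vickrey, auction, in which the highest bidder wins but pays the runner-up's bid, a rule that rewards truthful bidding. A toy sketch in Python follows; the advertiser names and bid values are invented for illustration and have nothing to do with Yahoo's actual systems or prices.

# A toy second-price (Vickrey) auction for one ad impression.
# Names and bids are invented for illustration; this is not Yahoo code or data.

def second_price_auction(bids):
    """Highest bidder wins the slot but pays the second-highest bid."""
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ranked = sorted(bids.items(), key=lambda item: item[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]
    return winner, price

if __name__ == "__main__":
    bids = {"advertiser_a": 2.50, "advertiser_b": 1.75, "advertiser_c": 0.90}
    winner, price = second_price_auction(bids)
    print(f"{winner} wins the impression and pays ${price:.2f}")
    # -> advertiser_a wins the impression and pays $1.75

At Yahoo's scale, the research questions begin where this toy ends: running such mechanisms millions of times a day and tailoring the price of a single impression to a single individual.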

Today's agenda
When you ask a broad cross-section of computer scientists what they see as today's burning issues, problems in parallelism at the chip and systems level top the list.

"We are at the cusp of a transition to multicore, multithreaded architectures, and we still haven't demonstrated the ease of programming the move will require," said Intel's Rattner, who worked on massively parallel supercomputers earlier in his career. "It's frustrating that, after decades of work in the area, we still don't have the tools and techniques to overcome these big programming obstacles."

Rattner said he has "talked to a few people at Microsoft Research who say this is also at or near the top of their list."

Indeed, the problems with multicore chips "go to the heart of a lot of assumptions about how you write, build and manage software," asserted Microsoft's Rashid, whose team is working on parallel-programming languages, tools and methods.
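
A toy sketch of ours, not code from Microsoft's or Intel's tool efforts, shows one assumption that breaks: two threads bump a shared counter, and without a lock the interleaved read-modify-write quietly loses updates.

# A toy illustration (not Microsoft or Intel code): two threads share one counter.
# Without a lock, the read-modify-write can interleave and updates are lost.
import threading

COUNT = 100_000
counter = 0
lock = threading.Lock()

def unsafe_worker():
    global counter
    for _ in range(COUNT):
        tmp = counter          # read the shared value ...
        counter = tmp + 1      # ... write it back; another thread may have updated it in between

def safe_worker():
    global counter
    for _ in range(COUNT):
        with lock:             # the lock makes the read-modify-write atomic
            counter += 1

def run(worker, threads=4):
    global counter
    counter = 0
    pool = [threading.Thread(target=worker) for _ in range(threads)]
    for t in pool:
        t.start()
    for t in pool:
        t.join()
    return counter

if __name__ == "__main__":
    print("without lock:", run(unsafe_worker))   # frequently less than 400,000 on CPython
    print("with lock:   ", run(safe_worker))     # always 400,000

Scaled up from a toy counter to real applications, that is the burden Rattner says today's tools and techniques still leave on the programmer.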

Parallelism is also a problem at the systems level. "Today, the whole industry is looking at so-called scale-out computing," said IBM's Horn. "Can you build systems out of scalable, commodity parts — without redesigning the application software? That's really the grand challenge in computer science at the systems level."

Indeed, for hypergrowth computer companies such as Yahoo and Google, "pretty much everything we do is in distributed systems," said Alan Eustace, the vice president of engineering for Google who oversees the search company's growing R&D operations.

Google is said to maintain globally distributed clusters of more than 150,000 PCs to run its search engine and other services. For its part, Yahoo has "well into the tens of thousands" of PCs in about 25 data centers, said Raghavan.

Google has done proprietary hardware and software work to create its clusters. Interconnect technology for linking systems is still the biggest bottleneck, said Eustace. "The cost of a connection point dwarfs the cost of the computer. That's a big problem," he said.

The complexity of chip- and systems-level parallelism issues fuels related problems in reliability and simplicity.

"How do you build reliable systems from less-than-reliable or highly variable parts? Just about every layer of tomorrow's systems will have to address this issue," Intel's Rattner said.

A coming era of ubiquitous computing and sensor networks will push the problem to new heights, as microprocessors in all sorts of embedded systems form unreliable nodes sending inconsistent data over troubled interconnects, said Microsoft's Rashid. That problem is driving work in consensus software that makes decisions based on statistical probabilities.
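
The flavor of that approach can be sketched in a few lines, purely as an illustration rather than anything drawn from Microsoft's work: poll whatever nodes answer, take a robust statistic such as the median, and grade the result by how closely the nodes agree.

# A cartoon of deciding from unreliable parts (hypothetical, not Microsoft's software):
# use whatever readings arrive, take a robust statistic, and grade the agreement.
from statistics import median, pstdev

def consensus(readings, min_reports=3, max_spread=2.0):
    """readings maps node name -> reported value; failed nodes simply don't appear."""
    values = list(readings.values())
    if len(values) < min_reports:
        return None, "insufficient reports"
    spread = pstdev(values)                      # how much the nodes disagree
    verdict = "high confidence" if spread <= max_spread else "low confidence"
    return median(values), verdict               # the median resists wild outliers

if __name__ == "__main__":
    steady = {"node_a": 21.25, "node_b": 21.75, "node_c": 21.5}
    print(consensus(steady))                     # (21.5, 'high confidence')

    noisy = dict(steady, node_d=55.0)            # node_d is faulty and reports a wild value
    print(consensus(noisy))                      # (21.625, 'low confidence')

Real sensor-network systems go far beyond this, but the principle Rashid describes is the same: decisions rest on statistical agreement among many imperfect parts rather than on trust in any single node.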

Ironically, the increasing complexity is driving a push toward greater simplicity.

"Today's systems are way too complicated, and it's way too challenging getting the most out of them," said Rattner. "They seem to be designed with no thought about their environment, what the user is doing or might want to do in the next few moments. We have to restructure systems so they are more cognizant of their environment.

"This isn't just about better 3-D GUIs [graphical user interfaces] and speech interfaces. Unless we change something fundamental, you may be able to talk to your system someday, but it will still be too complex," he added.

Meanwhile, new architectures are on the drawing board.

"We are producing the same von Neumann architectures that were developed in the '50s, and they are not scaling anymore," said Bill Dally, chairman of the CS department at Stanford University. "The technology has changed fundamentally, and we have to evolve chip architectures and programming environments to deal with that."

Dally is also chairman of startup Stream Processors Inc., which is developing a novel signal processor for wireless and digital-media applications. It will drive new systems and software architectures that Dally claims will deliver huge performance gains over today's architectures. "I am hopeful, although the barrier to entry is high," he said.