Kumar on vision of large chips for AI in NY Times

8/18/2019 Joseph Park, ECE ILLINOIS

ECE ILLINOIS Professor Rakesh Kumar weighs in on the vision of large chips for AI systems in a New York Times article.

Rakesh Kumar

Computer chips have typically been thought of as tiny; many can fit inside your wallet or even sit on the tip of your finger. However, a Silicon Valley startup called Cerebras is challenging that notion by developing the largest computer chip ever built.

“It is not that people have not been able to build this kind of a chip,” ECE ILLINOIS Professor Rakesh Kumar told The New York Times. “The problem is that they have not been able to build one that is commercially feasible.”

According to The New York Times, the giant chip is about as big as a dinner plate, almost 100 times bigger than a regular computer chip. Cerebras believes the chip "can be used in giant data centers" and can "accelerate the progress of artificial intelligence" in platforms including autonomous cars and digital assistants like Amazon's Alexa.

As artificial intelligence technology advances, the neural networks behind these systems grow more complex and demand specialized computing power. Google and other chip makers have built chips specifically for neural networks, but companies want even more computing power.

AI systems today spread their work across multiple chips, but transferring large amounts of data between chips is slow and limits the speed of analysis. Cerebras' idea is to keep all of the data on one giant chip so that the system can operate faster. This approach, however, comes with many challenges.

A bigger chip leaves more room for manufacturing errors, and it will be difficult to maintain because it consumes large amounts of energy. To counter these problems, Cerebras plans to release the chip as part of a larger machine that uses "elaborate equipment for cooling the silicon with chilled liquid."

Kumar has been working on waferscale computing for a couple of years and recently wrote a paper making the case for waferscale systems; it appeared earlier this year and was highlighted by news media. His student, Matthew Tomei, recently received a Qualcomm Innovation Fellowship to continue work on waferscale systems.

Kumar is also affiliated with the Coordinated Science Laboratory (CSL).

Check out the article from The New York Times here.



This story was published August 18, 2019.