Need to Fit Billions of Transistors on a Chip? Let AI Do It



Artificial intelligence is now helping to design the chips needed to run highly powerful AI code.

Designing a computer chip is complex and intricate work, requiring designers to arrange billions of components on a surface smaller than a fingernail. Decisions at each stage can affect a chip's final performance and reliability, so the best chip designers rely on years of experience and hard-won know-how to squeeze the most performance and power efficiency out of nanoscopic devices. Previous efforts to automate chip design over several decades have met with little success.

But recent advances in AI have made it possible for algorithms to learn some of the dark arts involved in chip design. This should help companies produce more powerful and efficient blueprints in less time. Importantly, the approach may also help engineers co-design AI software, testing different tweaks to the code together with different circuit layouts to find the best configuration of both.

At the same time, the rise of AI has sparked new interest in all kinds of novel chip designs. From cars to medical devices to scientific research, advanced chips are increasingly important to every corner of the economy.

Chip makers, including Nvidia, Google, and IBM, are all testing AI tools that help arrange components and wires on complex chips. The approach could shake up the chip industry, but it may also introduce new engineering complexities, because the kinds of algorithms being deployed can sometimes behave in unexpected ways.

At Nvidia, principal research scientist Haoxing "Mark" Ren is exploring how an AI technique known as reinforcement learning can be used to arrange components on a chip and wire them together. The approach, which lets a machine learn from experience and experimentation, has been central to some major recent advances in AI.

The AI tools Ren is exploring evaluate different chip designs in simulation, training a large artificial neural network to recognize which decisions ultimately produce a high-performing chip. Ren says the approach should reduce the engineering effort required to develop a chip while producing designs that match or exceed the performance of human-made ones.
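The core loop described here can be sketched at toy scale. The snippet below is a hypothetical illustration only: an agent repeatedly tries placements of a handful of components on a tiny grid and keeps whichever placement minimizes estimated wiring length (its negative serving as the reward). Real floorplanning agents like the ones Ren describes use deep neural networks and detailed simulators; the grid size, net list, and pure random search here are all stand-in assumptions.

```python
import random

GRID = 4  # toy 4x4 placement grid (real chips have millions of cells)
NETS = [(0, 1), (1, 2), (0, 3)]  # pairs of components that must be wired

def wirelength(placement):
    """Cost of a placement: sum of Manhattan distances between wired components."""
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in NETS)

def random_placement(rng):
    """Assign each of the 4 components to a distinct grid cell."""
    cells = [(x, y) for x in range(GRID) for y in range(GRID)]
    return rng.sample(cells, 4)

def search(episodes=200, seed=0):
    """Trial-and-error search: keep the placement with the best reward
    (i.e., the lowest wiring cost). A learned policy would replace the
    random proposals with informed ones."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(episodes):
        placement = random_placement(rng)
        cost = wirelength(placement)  # reward = -cost
        if cost < best_cost:
            best, best_cost = placement, cost
    return best, best_cost
```

The point of the sketch is the structure, propose a layout, score it in a (here trivially simple) simulator, and prefer layouts that score better, not the search strategy itself, which in practice is learned rather than random.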

“You can design chips more efficiently,” says Ren. “Also, it gives you the opportunity to explore more of the design space, which means you can make better chips.”

Nvidia began by making graphics cards for gamers, but it quickly saw that the same chips were well suited to running powerful machine learning algorithms, and it is now a leading maker of high-end AI chips. Ren says Nvidia plans to bring chips designed with the help of AI to market, but he declined to say how soon. In the more distant future, he says, “you’ll probably see a big chunk of chips designed with AI.”

Reinforcement learning is most famous for training computers to play complex games, including the board game Go, with superhuman skill, without any explicit instruction in a game's rules or principles of good play. It also shows promise for a range of practical applications, including teaching robots to grasp new objects, flying fighter jets, and algorithmic stock trading.

Song Han, an assistant professor of electrical engineering and computer science at MIT, says reinforcement learning shows great potential for improving chip design because, as with games like Go, it can be difficult to predict good decisions without years of experience and practice.

His research team recently developed a tool that uses reinforcement learning to identify the optimal size for different transistors on a computer chip, exploring different chip designs in simulation. Importantly, it can transfer what it has learned from one type of chip to another, which promises to lower the cost of automating the process. In experiments, the AI tool produced circuit designs that were 2.3 times more energy efficient while generating one-fifth as much interference as designs produced by human engineers. The MIT researchers are developing AI algorithms alongside novel chip designs to get the most out of both.
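Transistor sizing, the problem the MIT tool tackles, can be framed as a search over candidate device widths scored by a circuit simulator. The sketch below is a minimal, hypothetical illustration of that framing: the `simulate` function is a made-up analytic stand-in (wider transistors switch faster but consume more power), not a real SPICE model, and the exhaustive sweep stands in for the learned search the actual tool performs.

```python
def simulate(width):
    """Toy circuit model: returns (delay, power) for a given transistor
    width, in arbitrary units. Wider devices switch faster but leak more.
    A real tool would call a circuit simulator here."""
    delay = 1.0 / width
    power = 0.1 * width ** 2
    return delay, power

def best_width(widths, max_delay):
    """Pick the lowest-power width that still meets the delay target,
    or None if no candidate is feasible."""
    feasible = [(simulate(w)[1], w) for w in widths if simulate(w)[0] <= max_delay]
    return min(feasible)[1] if feasible else None

# Sweep candidate widths from 0.5 to 2.75, as a learned agent would
# explore them; with a delay budget of 1.0, the smallest width whose
# delay 1/w fits the budget wins on power.
candidates = [round(0.5 + 0.25 * i, 2) for i in range(10)]
print(best_width(candidates, max_delay=1.0))  # → 1.0
```

The transfer learning the article mentions corresponds, in this framing, to reusing what the search learned about one circuit's cost surface to warm-start the search on a related circuit, rather than sweeping from scratch.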

Other players in the industry – especially those who invest heavily in the development and use of AI – are also keen to adopt AI as a chip design tool.
