05-08-2021 | By Robin Mitchell
Artificial intelligence has proved to be a useful tool when it comes to identifying patterns and behaviours in datasets that seem completely random. Why is AI so advantageous, what challenges does chip design present, and could AI be used to create future semiconductor products?
Since the turn of the 20th century, humans have dreamed of machines that can think and make decisions for themselves. While some fear that such a machine would lead to a war between machines and their creators, it is more likely that thinking machines will play an important role in human life.
Fast forward to the 21st century, and what we see are not machines that reject their human overlords, but computer systems that can deploy deep learning algorithms to help make sense of untold amounts of data. This is where we see the first major advantage of AI: its ability to handle extremely large datasets.
Trying to find patterns in data is something that humans have been doing since the dawn of time, but humans can only handle so much data. Imagine a scenario where medical researchers have access to a million medical records; no single human can examine that much information, retain it all in memory, and then compare all that data to find patterns.
The second advantage of AI is that it is far more sensitive to changes in data than the human brain. This high degree of sensitivity means that an AI system can identify patterns in datasets that may not be easily seen by humans, and can correctly pick out those patterns even when presented with minimal data.
The third advantage of AI is that it is able to learn from data presented to it. This allows AI systems to improve their algorithms over time, thus improving their performance. Just like a human, AI can learn and essentially gain experience (for lack of a better word) when provided with enough data.
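This idea of improving with exposure to data can be sketched in a few lines. The example below is purely illustrative (none of it comes from the article): a single-weight model fitted by gradient descent, where each pass over the data nudges the model closer to the underlying relationship y = 2x.

```python
# Minimal sketch of "learning from data": the model's weight improves
# with each pass over the training samples. Hypothetical example only.

def train(samples, epochs=100, lr=0.01):
    w = 0.0  # initial guess for the weight
    for _ in range(epochs):
        for x, y in samples:
            error = w * x - y    # how wrong the current model is
            w -= lr * error * x  # nudge the weight to reduce the error
    return w

data = [(x, 2 * x) for x in range(1, 6)]  # samples of y = 2x
w = train(data)
print(round(w, 3))  # converges toward 2.0
```

With more samples or more epochs, the estimate of the weight tightens, which is the sense in which the system "gains experience."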
All of these characteristics of AI are now driving its integration into everyday life, ranging from self-driving vehicles and personalized AI assistants to language processing and much more. But humanity has only scratched the surface of AI, and there is far more that AI could be made to do.
There is no doubt that semiconductor devices are an engineering marvel: billions of nanometer-sized transistors packed into a semiconductor no larger than a fingernail, able to store millions of bits of data and process that same data at extraordinary speeds. While the physical construction of semiconductors and the development of systems on those devices is an engineering marvel, the physical design of individual circuits is actually simpler than one would expect.
The first processors, such as the 4004, had a few thousand transistors on a single device. This may sound like a lot, but because CPUs are separated into different units, such chips can be designed piece by piece. For example, an ALU would be designed and tested on its own, as would a program counter and an instruction decoder. All of these designs can then be combined, connected, and packaged into a single chip.
With so few transistors, each one can be carefully placed and oriented to maximize the performance of the chip. Fast forward to modern designs, and placing each transistor by hand is practically impossible. Instead, designers create small units that are replicated to build larger structures. For example, a designer could lay out an individual memory cell, which is then copied 32 times to create an individual memory location. That location is in turn automatically copied n times, depending on the memory size, with the supporting circuitry placed automatically.
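The replication described above can be sketched in miniature. This is a hedged illustration of the copy-and-paste hierarchy, not a real EDA tool's API; the names `make_word` and `make_array` are invented for the example.

```python
# Hierarchical replication: design one cell, stamp it out to build a
# word, then stamp the word out n times to build the memory array.

def make_word(bits=32):
    # one memory location = the same 1-bit cell copied `bits` times
    return ["cell"] * bits

def make_array(words):
    # the word itself is copied `words` times, depending on memory size
    return [make_word() for _ in range(words)]

array = make_array(1024)          # a 1024-word, 32-bit-wide memory
print(len(array), len(array[0]))  # 1024 32
```

Only the single cell is designed by hand; everything above it is generated, which is why the approach scales to billions of transistors.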
The use of such tools dramatically speeds up the design stage of very-large-scale integrated circuits, but the resulting placement and routing may not be at their most efficient; in fact, the result is most likely very far from the most efficient design.
AI’s ability to identify patterns in large datasets also makes it an ideal candidate for creating the next generation of integrated circuits. If an AI system could be trained on common IC layouts and their measured performance, the result would be an AI that can intelligently place and route transistors on a semiconductor.
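At its simplest, "trained on layouts and their performance" means fitting a model that maps layout features to a performance figure, then using it to score new candidates. The sketch below does this with a one-feature least-squares fit; the feature (total wirelength), the delay numbers, and the function names are all invented for illustration.

```python
# Toy supervised model: learn delay as a linear function of total
# wirelength from past layouts, then predict delay for a new layout.

def fit_linear(xs, ys):
    # closed-form least-squares fit of y = a*x + b for one feature
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# past layouts: feature = total wirelength, target = measured delay (ns)
wirelengths = [10.0, 20.0, 30.0, 40.0]
delays      = [1.1,  2.0,  3.1,  3.9]
a, b = fit_linear(wirelengths, delays)

# score a new candidate layout by its predicted delay
print(a * 25.0 + b)
```

A real placement AI would use far richer features and models, but the loop is the same: predict performance from layout, then search for layouts the model scores well.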
While humans have a general knowledge of what makes a device efficient (such as minimizing trace lengths and grouping common parts), an AI could develop a deeper understanding of how each decision affects a design. What may seem like a counterintuitive design to a human could end up being more efficient than anything ever designed by man.
A good example of this is when NASA used AI to design antennas for space. The resulting design looked like a bunch of bent paperclips stuck together and hardly resembled an antenna, but it turned out to be highly efficient and was produced faster than a human designer could have managed.
Of course, the real question now is: when AI starts to design integrated circuits, will it be used to design AI chips? And if so, is this a sign of exponential technological growth, where each system designs its successor?