Computer Architecture Must Change Every 5 Years

11-04-2021 | By Robin Mitchell

A recent interview between Jim Keller and Lex Fridman explored a concept that, according to Jim Keller, should reshape the computer industry. Who is Jim Keller, what did he say about computer architecture, and why is he right?

Who is Jim Keller?

Jim Keller is a microprocessor engineer who has played a key role in computer architecture for decades. His career began at Harris Corporation, where he designed microprocessor boards, before he joined DEC to develop Alpha processors. In 1998, Keller moved to AMD to help design the next generation of 64-bit consumer CPUs. His CPU architecture work spans both AMD and Intel: he was a co-author of the x86-64 instruction set and the lead architect of the AMD K8.

According to Jim Keller, his ideas on the semiconductor industry place him in a minority of engineers, and the recent interview with Lex Fridman demonstrates these “outside” ideas, such as his belief that Moore’s Law isn’t dead and that computer architecture should change from the ground up every 5 years. It is also worth noting that Jim Keller left Intel in 2020, officially citing personal reasons, though he reportedly disagreed with Intel’s manufacturing strategy (believing that Intel should become fabless).

What did Jim Keller say about computer architecture?

In an interview with Lex Fridman, Jim Keller discussed his work in the computer architecture industry and where the field is going. In essence, Keller explained that as the demands on computers change, chip architectures accumulate features and expand their capabilities to try to enable new workloads. For example, instruction sets can gain new instructions, additional hardware circuitry can be added, and smaller transistors can be used to increase the overall processing power of machines.
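This accumulation is easy to see from software. Below is a minimal sketch (assuming GCC or Clang on an x86 machine; the builtins used are compiler-specific) that reports which of the successive x86 instruction-set extensions the running CPU implements:

```c
#include <stdio.h>

int main(void) {
    /* Initialise CPU feature detection for the builtins below
       (GCC/Clang only; each call takes a compile-time string literal). */
    __builtin_cpu_init();

    /* Each entry is an extension layered onto the original x86 ISA. */
    printf("mmx:     %d\n", __builtin_cpu_supports("mmx"));     /* 1997 */
    printf("sse2:    %d\n", __builtin_cpu_supports("sse2"));    /* 2000 */
    printf("avx:     %d\n", __builtin_cpu_supports("avx"));     /* 2011 */
    printf("avx2:    %d\n", __builtin_cpu_supports("avx2"));    /* 2013 */
    printf("avx512f: %d\n", __builtin_cpu_supports("avx512f")); /* 2016 */
    return 0;
}
```

Every one of these extensions must be carried forward by each new generation of x86 processor, which is exactly the kind of accumulation Keller describes.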

However, during the conversation, Jim Keller pointed out that while CPUs are becoming “more powerful”, much of that gain is only an improvement in average performance: layering new hardware onto older technology lets a system cope with new tasks, but it is a poor method for developing technology. Instead, Jim stated that computer architecture itself, right down to the physical logic gate layouts, should be changed every 5 years.


Why is Jim Keller right about this?

The need to change computer architecture fundamentally is something that I have been talking about for several years, and you only need to look at any desktop PC to see it. An x86 machine designed to run Windows 95 will not be able to run a modern operating system such as Windows 10, for a whole range of reasons, and you would expect this. However, a modern PC can happily run any Windows operating system right back to MS-DOS.

The reason this is possible is that the CPU architecture used in modern computers has utilised the same instruction set for more than 30 years, and this instruction set includes backwards compatibility. While backwards compatibility does help keep systems usable for longer, it also creates a development environment that, instead of producing an efficient machine, produces one that becomes increasingly complex.
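To see this compatibility in action, here is a minimal sketch (assuming Linux on x86-64, and that the system permits mapping writable, executable memory) that executes a raw machine-code encoding of “mov eax, 42” that has been valid since the i386 of 1985:

```c
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    /* Machine code for "mov eax, 42; ret" -- this byte encoding dates
       from the i386 (1985) and still executes on today's x86-64 CPUs. */
    unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

    /* Map a small writable+executable buffer (hardened kernels may deny this). */
    void *buf = mmap(NULL, sizeof code, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) { perror("mmap"); return 1; }

    memcpy(buf, code, sizeof code);
    int (*legacy)(void) = (int (*)(void))buf;
    printf("decades-old encoding returned: %d\n", legacy());

    munmap(buf, sizeof code);
    return 0;
}
```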

Modern computers arguably run very different tasks from the first mainstream PCs. Sure, both may be required to run Excel or Word, but modern systems also need to run AI tasks, cryptographic routines, and secure storage. The vast number of processes that modern users run can also call for multiple physical processors, and yet modern computer designs typically have no more than two physical CPU sockets.
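As a small illustration (a minimal sketch assuming a POSIX system), the number of logical processors the operating system can schedule onto is easy to query at runtime; on a typical desktop, all of them share a single physical socket:

```c
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* Number of logical CPUs the OS can schedule onto right now.
       On most desktops these all share one physical socket, so scaling
       further means more cores per package rather than more sockets. */
    long cpus = sysconf(_SC_NPROCESSORS_ONLN);
    printf("logical CPUs online: %ld\n", cpus);
    return 0;
}
```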

The need for new computer architecture is already being realised by large companies such as Amazon and Apple, who are using more modern architectures such as ARM and RISC-V to create custom SoC processors. While these processors have a reduced instruction set, they are able to perform just as well as, if not better than, their CISC x86/x64 counterparts.
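One reason is decode complexity. The following toy sketch (the fixed-width field layout is invented for illustration and is not a real ISA, though 0x66/0xF2/0xF3 are genuine x86 legacy prefixes) contrasts fixed-width RISC-style decoding, where every field sits at a known bit position, with variable-length CISC-style decoding, which must scan bytes sequentially before the opcode is even located:

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical fixed-width 32-bit instruction: all fields at fixed
   positions, so a decoder can extract them in parallel in one step. */
typedef struct { uint8_t opcode, rd, rs1, rs2; } Insn;

Insn decode_fixed(uint32_t word) {
    Insn i = {
        (uint8_t)(word >> 24),  /* opcode */
        (uint8_t)(word >> 16),  /* destination register */
        (uint8_t)(word >> 8),   /* source register 1 */
        (uint8_t)(word)         /* source register 2 */
    };
    return i;
}

/* Variable-length decode (x86-style, toy subset): the opcode's position
   is unknown until the prefix bytes have been walked sequentially. */
size_t find_opcode_offset(const uint8_t *bytes, size_t len) {
    size_t pos = 0;
    while (pos < len &&
           (bytes[pos] == 0x66 || bytes[pos] == 0xF2 || bytes[pos] == 0xF3))
        pos++;  /* skip legacy prefixes one byte at a time */
    return pos;
}

int main(void) {
    uint8_t x86ish[] = { 0x66, 0xF3, 0x0F };  /* two prefixes, then opcode */
    Insn r = decode_fixed(0x01020304u);
    printf("fixed-width opcode: %u (one step)\n", r.opcode);
    printf("variable-length opcode offset: %zu (after a scan)\n",
           find_opcode_offset(x86ish, sizeof x86ish));
    return 0;
}
```

That sequential dependency is part of why wide x86 decoders are comparatively large and power-hungry, while fixed-width designs can decode many instructions in parallel.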

Another way to recognise the fallacy of clinging to old, compatible architectures is to look at microcontrollers. Imagine if the instruction sets and architectures of 1980s microcontrollers were still being implemented in modern parts. Any engineer worth their salt would want to use a modern design whose physical construction and software platform are redesigned every so often. A good example is the STM8 and STM32; the STM8 is a good basic microcontroller, but STMicroelectronics completely changed the architecture for the STM32 by adopting a 32-bit ARM core, new hardware, and improved software tools.

Overall, modern computer architecture needs to change completely. Destroy bridges, destroy PCIe and, while we are at it, destroy x86, then replace it all with a new topology that works well with modern demands.



By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.