Is AI making designers lazy?

30-03-2021 | By Robin Mitchell

AI has proven to be an extremely useful tool for complex problems. However, AI is increasingly being applied to mundane tasks, raising the question: is AI making designers lazy?

How the Semiconductor Industry has Always Played Cat and Mouse

Ever since the development of the first computers, software engineers have constantly pressured hardware engineers and semiconductor manufacturers to improve their technology. Whenever the underlying hardware improved (with increased clock speeds or faster memory, for example), software engineers would quickly take advantage of the extra performance to develop more complex systems. These new systems would, inevitably, push the hardware to its limits, completing the cycle of technological demand.

Of course, this demand for technology has been overwhelmingly positive for the world as a whole; the desire for more powerful technology has driven the rise of the internet, the move towards digital storage, smart systems, and improved services.


A Culture of Laziness Emerges

While the development of improved technology allows for more complex systems, there is clear evidence of a culture of laziness and a disregard for efficiency among software developers. The best place to see such practices is the evolution of programming languages.

The first programming language available to programmers was assembly, which is nothing more than mnemonics for CPU instructions. When assembly is assembled into machine code, the instructions written by the programmer are not transformed or interpreted, but simply converted into their byte equivalents. For example, on a Z80, LD A, B is converted into 0x78 (01111000), which is then placed directly into memory for the CPU to read and execute.
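This one-to-one mapping from mnemonic to byte can be sketched as a simple lookup table. The snippet below is a toy illustration only (a real assembler also handles operands, labels, and addressing modes); the opcode values shown are the standard Z80 encodings.

```python
# Toy illustration of assembly: each mnemonic maps directly to one opcode byte.
OPCODES = {
    "LD A, B": 0x78,   # 01111000
    "LD B, A": 0x47,   # 01000111
    "NOP":     0x00,
    "HALT":    0x76,
}

def assemble(lines):
    """Translate each mnemonic directly into its byte equivalent."""
    return bytes(OPCODES[line.strip()] for line in lines)

program = assemble(["LD A, B", "HALT"])
print(program.hex())  # 7876
```

Nothing is optimised or rearranged: what the programmer writes is exactly what the CPU executes, which is why assembly gives such tight control over speed and memory.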

For CPUs predating the 1990s, assembly was a very useful language. Such computers were often restricted in memory and had limited CPU capabilities, so coding in assembly gave designers the opportunity to write the most efficient code possible (in terms of both speed and memory usage).

As CPUs became more complex, so did their instruction sets, and coding in assembler on a modern processor is a monumental task. It was around this time that programmers shifted to more abstract languages such as C and C++. These languages have greater memory requirements and produce less efficient code than hand-written assembler routines, but they greatly simplify coding.

Even though C and C++ are less efficient than hand-written assembler, they are still compiled languages: a program is translated ahead of time into machine code that the processor executes directly. As a result, many C and C++ routines are just as fast as their assembler counterparts.

However, the rise of interpreted languages is where laziness in system development first became visible. Interpreted languages such as Python and Java offer many advantages, including cross-platform capabilities and incredibly powerful libraries, but their interpreted nature (each instruction is read and dispatched by a virtual machine on the fly) means they can suffer from serious performance issues.

Interpreted languages allow a designer to quickly develop a solution with powerful features, but such solutions often make poor use of the hardware. As such, systems built on interpreted languages can have unnecessarily high computing requirements that could be far smaller if the system were coded in a more efficient language (such as C or C++).
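The interpreter overhead is easy to observe from within Python itself. The rough benchmark below (timings will vary by machine) compares an explicit Python-level loop against the built-in sum(), which is implemented in C; both compute the same result, but the interpreted loop pays a per-instruction dispatch cost on every iteration.

```python
import timeit

# Rough illustration of interpreter overhead: summing a million integers
# with an explicit Python-level loop versus the C-implemented built-in sum().
data = list(range(1_000_000))

def python_loop():
    total = 0
    for x in data:
        total += x
    return total

loop_time = timeit.timeit(python_loop, number=5)
builtin_time = timeit.timeit(lambda: sum(data), number=5)
print(f"explicit loop: {loop_time:.3f}s, built-in sum: {builtin_time:.3f}s")
```

On most machines the explicit loop is several times slower, despite producing an identical answer, which is the gap the article is describing, scaled down to a single function.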

AI Helps to Encourage Laziness

The world is now facing a new wave of solutions based on an emerging technology: AI. AI differs from traditional programming in that, instead of hard-coding every possibility, the system is shown input data and the corresponding outputs, and adjusts its internal parameters until it can reach the same conclusions on its own.

This means that instead of developing an extremely complex algorithm to solve a problem, a designer instead needs to gather large amounts of data for the AI to learn from. Such solutions are extremely useful in scenarios involving large amounts of highly variable data, such as speech-to-text (which must cope with thousands of accents and voice pitches).
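The learn-from-examples paradigm can be shown at its absolute smallest. In the sketch below (a deliberately trivial stand-in for a real neural network), the rule y = 2x + 1 is never written into the program; the model only sees input/output pairs and adjusts its two internal parameters by gradient descent until it reproduces them.

```python
# Minimal illustration of learning from input/output examples: the model is
# never told the rule y = 2x + 1, only shown data generated by it.
inputs  = [0.0, 1.0, 2.0, 3.0, 4.0]
outputs = [1.0, 3.0, 5.0, 7.0, 9.0]   # produced by the hidden rule y = 2x + 1

w, b = 0.0, 0.0   # internal parameters, adjusted during training
lr = 0.02          # learning rate
for _ in range(5000):
    for x, y in zip(inputs, outputs):
        pred = w * x + b
        err = pred - y
        w -= lr * err * x   # gradient descent on squared error
        b -= lr * err
print(round(w, 2), round(b, 2))  # converges towards 2.0 and 1.0
```

This is the trade the article describes: the designer writes almost no problem-specific logic, but in exchange must supply representative data and accept the training cost.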

However, while AI provides an ideal solution for many complex problems, there is a risk that programmers will default to the AI paradigm instead of spending time developing an algorithm. For example, an AI could be used to control the stability of a drone, but standard PID controllers already achieve this task extremely well.
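To show how little code the classical alternative requires, here is a minimal discrete PID controller driving a crude one-dimensional plant model towards a setpoint. The gains and plant are illustrative only, not tuned for any real drone:

```python
# Minimal discrete PID controller (gains are illustrative, not tuned for
# any real aircraft).
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a crude plant (state simply integrates the control signal)
# towards a setpoint of 1.0 over 200 steps of 10 ms each.
pid = PID(kp=2.0, ki=0.5, kd=0.1)
state = 0.0
for _ in range(200):
    control = pid.update(1.0, state, dt=0.01)
    state += control * 0.01
print(f"state after 2 s: {state:.3f}")
```

A handful of arithmetic operations per control step, no training data, and behaviour that can be analysed and certified: for a well-understood problem like stabilisation, this is hard for an AI solution to justify replacing.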

Another example of AI laziness comes from a recent solution the author has been working on: a conveyor belt of components used in conjunction with a robotic arm. AI could be used to recognise where the objects are, but since all the components are identical, using AI would be a waste of resources. Instead, two images are taken of the belt: one when empty and one when loaded with parts. Subtracting one from the other highlights the areas where components sit, so the robotic arm can quickly locate them without any AI system.
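The frame-differencing idea can be sketched in a few lines. The snippet below is not the author's actual code; it uses plain Python lists as tiny grayscale "images" (a real system would capture camera frames and likely use a library such as OpenCV), and flags any pixel that changed beyond a threshold as belonging to a component:

```python
# Sketch of frame differencing: compare an empty-belt frame against a
# loaded-belt frame and report the pixels that changed significantly.
THRESHOLD = 30  # minimum grayscale difference to count as a component

def find_components(empty_belt, loaded_belt, threshold=THRESHOLD):
    """Return (row, col) coordinates where the two frames differ."""
    coords = []
    for r, (row_a, row_b) in enumerate(zip(empty_belt, loaded_belt)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                coords.append((r, c))
    return coords

empty  = [[10, 10, 10],
          [10, 10, 10]]
loaded = [[10, 200, 10],
          [10, 10, 190]]  # two bright components on the belt
print(find_components(empty, loaded))  # [(0, 1), (1, 2)]
```

A subtraction and a threshold solve the whole task; against that baseline, training and running a neural network to find identical parts is pure overhead.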

The use of AI in trivial solutions is a problem not because of lazy programming, but because of the unnecessary use of processing resources. If designers fall into the habit of using AI to solve everything, this will further put pressure on hardware designers to provide even faster processors as well as dedicated AI co-processors.

While this may seem like a good move, it will instead unnecessarily increase product power consumption, reduce efficiency, and limit device capabilities. One way around this would be to offload AI routines to the cloud, but that only creates other problems, including privacy concerns and the need for an internet connection.

Going back to assembly is out of the question (try reading the documentation for a modern x64 processor), but designers should consider going back to C and C++. Designers should also appreciate that the quickest and simplest solution is not always the right one. If a problem can be solved with a custom algorithm instead of defaulting to AI, not only will the final design be significantly faster, it will also run on far more modest hardware.


By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.
