Field Programmable Analog Arrays – What are they and could they help with unique applications?

13-04-2021 | By Robin Mitchell

Everyone knows what FPGAs are, but have you ever heard of an FPAA? What are FPAAs, what problems do they face, and could they provide a viable solution for future problems?

What is an FPAA?

Field Programmable Gate Arrays, or FPGAs for short, are semiconductor devices that provide designers with large arrays of programmable logic blocks that can be interconnected to create almost any digital circuit one can think of. However, while FPGAs continue to grow in popularity, they are purely digital devices and cannot be applied directly to analogue problems.

An alternative device, called a Field Programmable Analog Array (FPAA), allows designers to program complex analogue circuits. While FPGAs are common and widely available, FPAAs are extremely uncommon and made by only a handful of manufacturers. Like an FPGA, an FPAA contains many analogue blocks, typically built from op-amps, capacitors, resistors, and switches, all of which can be programmed. These analogue blocks are then interconnected to create more elaborate analogue circuits in a near-identical way to FPGAs.


What problems do FPAAs face?

While FPAAs may sound like a straightforward alternative to discrete components or ASICs, the reality is more complicated. In FPGAs, designers already have to consider the propagation delay of signals through their design when operating at high speed.

In FPAAs, however, the problems become far more complex due to the need to consider the parasitic components of the FPAA itself. Such factors include the parasitic capacitance, inductance, and resistance of switch networks, blocks, and I/O. These can be accounted for by routing software but, even then, the software would need to consider the input and output impedance of each analogue block and how those parasitic values affect its behaviour.
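
As a rough illustration of why this matters, every series switch in the routing fabric adds resistance and capacitance to the signal path, and together these form a low-pass filter that limits the bandwidth of whatever block the path feeds. The sketch below is a minimal estimate only; the parasitic values are assumptions chosen purely for illustration, not figures from any real FPAA.

```python
import math

# Assumed (illustrative) parasitics for a single routing switch
R_SWITCH = 1e3       # series on-resistance of one switch, ohms
C_SWITCH = 0.5e-12   # parasitic capacitance added at each switch node, farads

def bandwidth_through_switches(n_switches: int) -> float:
    """Estimate the -3 dB bandwidth of a path through n series switches.

    Treats the accumulated resistance and capacitance as a single-pole
    RC low-pass filter: f_c = 1 / (2 * pi * R * C). This is a pessimistic
    lumped approximation, but it shows the trend.
    """
    r_total = n_switches * R_SWITCH
    c_total = n_switches * C_SWITCH
    return 1.0 / (2 * math.pi * r_total * c_total)

for n in (1, 2, 4, 8):
    print(f"{n} switch(es): ~{bandwidth_through_switches(n) / 1e6:6.1f} MHz")
```

Even with these assumed figures, the usable bandwidth falls from hundreds of megahertz to just a few megahertz after only a handful of switches, which is exactly the kind of effect the routing software has to model.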

The second challenge presented by FPAAs is the complex nature of FPAA blocks compared to FPGA blocks. Digital circuits do not depend on precise analogue behaviour (a signal is simply on or off), so the transistors used in FPGAs can be shrunk to extremely small sizes. This gives FPGAs a very large number of logic blocks and allows for extremely complex designs. FPAAs, however, require a range of analogue components whose performance depends heavily on their physical size. As such, analogue blocks are difficult to shrink while retaining the required capacitance, inductance, and resistance values.

The third challenge presented by FPAAs follows from the previous issue of low block density. Because FPAAs struggle to provide a large number of blocks, a design complex enough to need one will most likely be better served by a custom ASIC. A designer who chooses an ASIC over an FPAA can create far more complex circuits in the same silicon area, and the low demand for FPAAs keeps their unit prices high, so an ASIC could even be the more economical option.

Could FPAAs help solve unique future problems?

FPAAs do have some unique applications, and may even be ideal for solving future complex problems that digital circuits struggle to handle efficiently.

The first potential application for FPAAs is simplifying the support of ageing hardware: systems that rely on ICs which are no longer manufactured, or on complex analogue configurations that could be condensed into a single programmable IC. FPAAs would also allow engineers to alter such a design after deployment, either to perfect its operation or to provide updates that let the circuit operate better.

The second potential use for FPAAs is accelerating complex operations that digital systems struggle with. Before digital computers became mainstream, researchers would often use analogue computers to solve equations and explore relationships between variables. A circuit would be constructed to represent an equation (such as a differential equation or an integral), and the circuit would be operated with its output connected to a scope. The resulting waveform showed the equation's solution, allowing complex problems to be solved extremely quickly.
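
To make the idea concrete, the sketch below digitally emulates the kind of patch described above: a damped oscillator wired as two integrators and a summer. The equation and coefficient values are assumptions chosen purely for illustration; on a real analogue computer the scope would trace x(t) continuously rather than at discrete steps.

```python
# Digital emulation of an analogue-computer patch for the assumed equation
#   x'' = -(c/m) * x' - (k/m) * x   (a damped oscillator)
# On the analogue machine, two integrators hold v = x' and x, and a summer
# forms the acceleration; the scope displays x(t) directly.

DT = 0.001          # integration step, seconds
C_OVER_M = 0.5      # damping coefficient (illustrative)
K_OVER_M = 40.0     # stiffness coefficient (illustrative)

x, v = 1.0, 0.0     # initial conditions (initial charge on the integrators)
for step in range(5001):
    a = -C_OVER_M * v - K_OVER_M * x   # summer output: acceleration
    v += a * DT                        # first integrator: velocity
    x += v * DT                        # second integrator: position
    if step % 1000 == 0:
        print(f"t = {step * DT:4.1f} s   x = {x:+.3f}")
```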

In fact, it is believed that FPAAs could help future AI and advanced systems solve complex numerical methods. According to Professor Jennifer Hasler from the Georgia Institute of Technology, FPAAs can perform matrix factorisation more efficiently than a digital system can (although a digital system is easier to program). To demonstrate this, her team created an analogue computer that could perform command recognition in speech at just 1 µJ per inference, roughly 1,000 times less energy than an equivalent digital system.
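
Taking the published 1 µJ figure at face value, some back-of-the-envelope arithmetic shows why a 1,000-fold energy difference matters for always-on hardware. The inference rate and battery capacity below are assumptions for illustration, not figures from the research.

```python
# Rough energy budget for an always-on speech-command recogniser, comparing
# the quoted ~1 uJ/inference analogue figure with a 1000x digital equivalent.
# Inference rate and battery size are illustrative assumptions, and every
# other system load (microphone, standby power) is ignored.

INFERENCES_PER_SECOND = 10   # assumed always-listening duty cycle
BATTERY_WH = 1.0             # assumed small battery, watt-hours

for label, joules_per_inference in (("analogue", 1e-6), ("digital", 1e-3)):
    average_power_w = joules_per_inference * INFERENCES_PER_SECOND
    battery_j = BATTERY_WH * 3600
    lifetime_days = battery_j / average_power_w / 86400
    print(f"{label:8s}: {average_power_w * 1e6:7.0f} uW average, "
          f"~{lifetime_days:,.0f} days on a {BATTERY_WH} Wh battery")
```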

By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.