How updates can interfere with research

23-08-2021 | By Sam Brown

Recently, a researcher from Harvard discovered that trying to use commercial devices to gather data can present significant challenges. What did the researcher find, why are black-box designs challenging to work with, and how does this relate to AI?


Harvard researcher discovers Apple Watch data interference


Recently, a Harvard researcher decided to investigate the reliability of data from commercial devices to see if they could be used in studies. Under normal conditions, data for studies are gathered using research-grade equipment, which provides highly accurate and reliable readings. However, if commercial devices could be used instead, not only would the cost of research fall by removing the need for research-grade equipment, but the large number of devices already in use could also provide far more data.

However, upon investigating data from an Apple Watch, the researcher noticed significant irregularities in readings from its sensors, suggesting that devices such as the Apple Watch cannot be relied upon for research projects. It turns out that third-party applications reading bio-metrics such as heart rate and temperature cannot directly access raw data from the sensors. Instead, the data is first passed through an algorithm whose function is unknown to the public (i.e. a black box).

It was determined that, between readings taken at different times of the year, an update to the Apple Watch may have changed the sensor-reading algorithm, thus producing different results from the same underlying measurements. The situation is further complicated when devices update silently, without informing the user that the algorithm has changed.
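
To see why this matters, consider a minimal sketch of what a black-box processing step looks like from the researcher's side. The Python below uses entirely hypothetical function names and filter behaviour, not Apple's actual algorithms: the raw signal never changes, but the value the device reports depends on which firmware revision processed it.

    # Hypothetical illustration: the same raw samples yield different
    # reported values once the device's internal algorithm is updated.

    RAW_HEART_RATE_SAMPLES = [71.2, 70.8, 73.5, 72.1, 95.0, 71.9]  # beats per minute

    def firmware_v1(samples):
        # Hypothetical v1 algorithm: simple mean of all samples.
        return sum(samples) / len(samples)

    def firmware_v2(samples):
        # Hypothetical v2 algorithm: discard the extremes before averaging.
        trimmed = sorted(samples)[1:-1]  # drop min and max
        return sum(trimmed) / len(trimmed)

    # The researcher only ever sees the reported value, never the raw samples
    # or the algorithm, so the jump between studies looks like sensor drift.
    print("Reported before update:", round(firmware_v1(RAW_HEART_RATE_SAMPLES), 1))
    print("Reported after update: ", round(firmware_v2(RAW_HEART_RATE_SAMPLES), 1))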


How black boxes cause problems


This demonstration of sensor data producing unexpected results is a prime example of why researchers should work with raw data wherever possible. However, black-box designs cause challenges in other areas too. For example, using functional ICs to offload tasks and calculations can introduce unpredictable behaviour into a design, especially if the internal workings of the IC are not clearly documented. The same applies to software: libraries that offload complex tasks often come with insufficient documentation, and if the software fails, it can be hard to trace where it has gone wrong and how to deal with it, as sketched below.
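
One common mitigation is to wrap the opaque call so that its inputs and outputs are logged and sanity-checked, giving at least some trace to follow when results go wrong. The Python sketch below is only an illustration; the "opaque library" call is a hypothetical stand-in for any poorly documented dependency.

    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("blackbox-wrapper")

    def checked_process(samples):
        """Wrap a hypothetical opaque library call with logging and sanity checks."""
        log.info("input: %d samples, range %.2f to %.2f",
                 len(samples), min(samples), max(samples))

        # result = opaque_library.process(samples)  # the black box itself (hypothetical)
        result = sum(samples) / len(samples)        # stand-in so this sketch runs

        # Sanity-check the output against known physical limits so a silent
        # change inside the black box is caught rather than propagated.
        if not (30.0 <= result <= 220.0):
            raise ValueError(f"black-box output {result!r} outside expected range")
        log.info("output: %.2f", result)
        return result

    checked_process([71.2, 70.8, 73.5, 72.1])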

Sensors are also available as black boxes, especially those with digital serial interfaces such as I2C and SPI. These devices combine an analogue sensing element (such as a temperature probe) with an internal algorithm that processes the measurement before streaming the result to a host processor. While this can reduce the complexity of a design, it also leaves the designer at the mercy of the sensor IC developer. Generally speaking, it is best to read the raw data from the sensor directly, but this is not always possible.
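
As a rough illustration, the Python sketch below reads a digital temperature sensor over I2C from a Linux single-board computer using the smbus2 library. The address, register and scaling shown are typical of a TI TMP102-style part, but should be checked against the datasheet of whichever sensor is actually used. The point is that the returned value has already been conditioned inside the IC: the designer sees degrees Celsius, never the underlying analogue measurement.

    from smbus2 import SMBus  # pip install smbus2

    I2C_BUS = 1          # typical bus number on a Raspberry Pi style board
    SENSOR_ADDR = 0x48   # common default address for TMP102-style sensors
    TEMP_REGISTER = 0x00

    def read_temperature_c():
        """Read the temperature register of a TMP102-style I2C sensor.

        The value returned has already passed through the IC's internal
        conversion; the raw analogue measurement is never exposed.
        """
        with SMBus(I2C_BUS) as bus:
            msb, lsb = bus.read_i2c_block_data(SENSOR_ADDR, TEMP_REGISTER, 2)
        raw = ((msb << 8) | lsb) >> 4      # 12-bit left-justified result
        if raw & 0x800:                    # handle negative temperatures
            raw -= 1 << 12
        return raw * 0.0625                # 0.0625 degC per LSB on this part

    if __name__ == "__main__":
        print(f"Temperature: {read_temperature_c():.2f} degC")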


How does the “black-box” concept affect AI design?


AI is a technology that is rapidly being integrated into everyday life thanks to its ability to learn and infer from incomplete data. However, one major challenge presented by AI is its inability to explain itself and justify how it reached a conclusion.

For example, an AI could be trained to identify shapes on a conveyor belt system with a high degree of reliability. However, there may be times when it picks up a seemingly random object that it believes to be the correct one. In these instances, it would be highly beneficial if the AI could be asked why it chose that object, in order to understand its decision-making process.

But there are currently no AI systems that can do this, and this inability to explain their reasoning means we will never truly understand what an AI is thinking or how it arrives at its conclusions. This makes AI a black-box system whose internal algorithm is entirely unknown. To make matters worse, additional training changes the internal configuration of that algorithm, thereby making the AI produce different results.
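
A minimal sketch of this effect, using scikit-learn purely as an illustration, is shown below: the same borderline input is classified before and after the model is retrained with additional data, and its predicted label can change even though nothing about the input itself has.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Initial training set: two well-separated clusters on a single feature.
    X_initial = np.concatenate([rng.normal(0.0, 0.5, 50),
                                rng.normal(4.0, 0.5, 50)]).reshape(-1, 1)
    y_initial = np.array([0] * 50 + [1] * 50)

    # A borderline input that sits between the two clusters.
    x_query = np.array([[2.1]])

    model = LogisticRegression().fit(X_initial, y_initial)
    print("Prediction before extra training:", model.predict(x_query)[0])

    # Additional training data shifts the decision boundary, so the same
    # query can now land on the other side of it.
    X_extra = rng.normal(2.5, 0.5, 50).reshape(-1, 1)
    y_extra = np.array([0] * 50)
    X_all = np.vstack([X_initial, X_extra])
    y_all = np.concatenate([y_initial, y_extra])

    model_retrained = LogisticRegression().fit(X_all, y_all)
    print("Prediction after extra training: ", model_retrained.predict(x_query)[0])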

Thus, researchers of the future should be cautious when experimenting with AI systems. Results produced one day could be very different the next.

By Sam Brown