Microchip Technology has released a complete, integrated workflow for streamlined ML model development with its new MPLAB Machine Learning Development Suite. The software toolkit works across the company's portfolio of MCUs and MPUs, allowing engineers to add ML inference to a design quickly and efficiently.
"Machine Learning is the new normal for embedded controllers and utilising it at the edge allows a product to be efficient, more secure and use less power than systems that rely on cloud communication for processing," said Rodger Richey, VP of Microchip's Development Systems business unit. "Microchip's unique, integrated solution is designed for embedded engineers and is the first to support not just 32-bit MCUs and MPUs, but also 8- and 16-bit devices to enable efficient product development."
ML employs a set of algorithmic methods to extract patterns from large data sets and so enable decision-making. It is typically faster, more easily updated and more accurate than manual processing. One example of how the company's customers are expected to use the tool is building predictive maintenance solutions that accurately forecast potential issues with equipment used in industrial, manufacturing, consumer and automotive applications.
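To make the pattern-extraction idea concrete, the sketch below trains a tiny nearest-centroid classifier of the kind a small-footprint edge model might embody for predictive maintenance. It is purely illustrative: the vibration features, class names and data are hypothetical, not Microchip's toolchain or data.

```python
# Illustrative sketch only: a minimal nearest-centroid classifier.
# Features and labels are hypothetical examples for predictive maintenance.
import math

def train_centroids(samples):
    """samples: {label: [feature_vector, ...]} -> {label: centroid}."""
    centroids = {}
    for label, vectors in samples.items():
        dim = len(vectors[0])
        centroids[label] = [sum(v[i] for v in vectors) / len(vectors)
                            for i in range(dim)]
    return centroids

def classify(centroids, x):
    """Return the label whose centroid is nearest to feature vector x."""
    return min(centroids, key=lambda label: math.dist(centroids[label], x))

# Hypothetical vibration features: [RMS amplitude, dominant frequency in Hz]
training = {
    "normal": [[0.2, 50.0], [0.3, 52.0], [0.25, 49.0]],
    "bearing_fault": [[1.1, 180.0], [1.3, 175.0], [1.2, 182.0]],
}
model = train_centroids(training)
print(classify(model, [1.15, 178.0]))  # a reading near the fault cluster
```

At inference time only the centroids need to live on the device, which is why this family of models suits memory-constrained MCUs.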
The suite helps engineers build highly efficient, small-footprint ML models. Powered by AutoML, the toolkit removes many repetitive, tedious and time-consuming model-building tasks, including extraction, training, validation and testing, and it applies model optimisations that respect the memory constraints of MCUs and MPUs.
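One common optimisation of this kind is post-training quantization, which stores each 32-bit float weight as an 8-bit integer plus a scale factor, roughly quartering the model's memory footprint. The sketch below shows a simple symmetric int8 scheme as a generic illustration; it is not Microchip's implementation.

```python
# Illustrative sketch only: symmetric int8 post-training quantization,
# a generic memory optimisation (not Microchip's specific scheme).

def quantize_int8(weights):
    """Map float weights to int8 values plus one scale factor."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights on the device at inference time."""
    return [v * scale for v in q]

weights = [0.82, -0.31, 0.05, -0.98, 0.44]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Each weight now occupies 1 byte instead of 4, at a small accuracy cost.
```

The accuracy cost is bounded by half the scale factor per weight, which is why quantized models usually lose little predictive power while fitting comfortably in MCU flash and RAM.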
Combined with the MPLAB X IDE, the new toolkit provides a complete solution that engineers with little or no ML programming knowledge can implement, avoiding the cost of hiring data scientists, yet it remains sophisticated enough to give experienced ML designers fine-grained control.
The company also offers the option of bringing a TensorFlow Lite model into any MPLAB Harmony v3 project. Harmony v3 is a fully integrated embedded software development framework that provides flexible, interoperable software modules to simplify the development of value-added features and shorten a product's time to market. In addition, the VectorBlox Accelerator SDK delivers power-efficient CNN-based AI/ML inference on PolarFire FPGAs.
The suite provides the tools for designing and optimising edge products running ML inference.