In the first two blogs of this series, Wi-Fi Mesh and the Internet of Things and RESTful APIs and Moving Data Across the Internet of Things, we looked at the challenges and solutions involved in communication between the devices that make up the Internet of Things (IoT). Even with forthcoming 5G networks and the capabilities they offer (including faster speeds, a mobility focus, scalability, and the fact that 5G is a software-defined standard allowing incremental upgrades), the IoT is poised to swamp these networks with unprecedented volumes of data. Therefore, part of the solution to a functional IoT is to be judicious about what data is transmitted to the network. If devices can perform more processing on the raw data locally at the endpoint, then less raw data needs to be transmitted. This concept is referred to as edge computing, or, less frequently, fog computing.
At its core, edge computing simply means processing raw sensor data as close as possible to the endpoint that generated it, without going to the cloud to use the heavy computing capability of high-end servers. Artificial Intelligence (AI) algorithms comprise an evolving set of software, with Machine Learning (ML) and neural-network-powered Deep Learning (DL) being leading methods of achieving AI, that will enable much of the innovation needed to do this local Herculean data processing. These algorithms are computationally hungry, and for embedded systems this presents a challenge. Embedded systems have historically prioritized low cost, low power, and a small footprint over the memory and processing horsepower these next-generation algorithms need in order to be effective.
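To make the idea concrete, here is a minimal sketch (in Python, purely for illustration; the function name and z-score threshold are hypothetical, not from any particular IoT platform) of the kind of local processing an edge device might do: instead of transmitting every raw sample, it collapses a window of readings into a small summary plus any statistical outliers.

```python
# Edge-filtering sketch: process a window of raw sensor samples locally
# and transmit only a summary plus anomalous readings, rather than the
# full raw stream. Threshold and names are illustrative assumptions.
from statistics import mean, pstdev

def filter_readings(raw, z_threshold=3.0):
    """Return (summary, anomalies) for a window of raw samples."""
    mu = mean(raw)
    sigma = pstdev(raw)
    summary = {"mean": mu, "stdev": sigma, "count": len(raw)}
    if sigma == 0:
        # All readings identical: nothing anomalous to report.
        return summary, []
    # Flag readings more than z_threshold standard deviations from the mean.
    anomalies = [x for x in raw if abs(x - mu) / sigma > z_threshold]
    return summary, anomalies

# A window of 100 temperature readings collapses to one summary record
# plus a single outlier, so far less data crosses the network.
window = [20.0] * 99 + [80.0]
summary, anomalies = filter_readings(window)
print(summary["count"], anomalies)  # prints: 100 [80.0]
```

The payoff is exactly the trade described above: a hundred raw samples become one summary and one flagged outlier, and only those cross the network.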
Enter the humble Field Programmable Gate Array (FPGA). FPGAs are not new. In fact, they were invented in the 1980s. The founders of Xilinx brought the first commercially viable FPGA, the XC2064, to market in 1985.
What is changing is that price and performance are reaching levels where FPGAs are attractive options even for the low-cost devices powering the endpoints of the IoT. Lower-density FPGAs (i.e., FPGAs with fewer configurable logic blocks) make an attractive option for the IoT for a variety of reasons:
It should be noted that getting data back to cloud services that can accumulate and crunch the collective data does have benefits. Teaching and improving ML and AI algorithms requires access to copious amounts of data. Data the IoT is more than happy to deliver. Training AI algorithms also requires lots of processing horsepower, which big iron can still supply better. But once the algorithms are improved and deployed as an upgrade to IoT endpoints, the embedded devices become smarter, allowing them to make better decisions when exposed to new scenarios in the real world, just as humans continuously learn and improve their mental skill sets.
Michael Parks, P.E. is the owner of Green Shoe Garage, a custom electronics design studio and technology consultancy located in Southern Maryland. He produces the S.T.E.A.M. Power podcast to help raise public awareness of technical and scientific matters. Michael is also a licensed Professional Engineer in the state of Maryland and holds a Master’s degree in systems engineering from Johns Hopkins University.
Copyright ©2022 Mouser Electronics, Inc.