How Will Edge Computing Enhance the Internet of Things (IoT)?
Edge computing reduces delays and conserves network bandwidth, as examples from space exploration and self-driving cars show.
Recently, you might have started to hear a new term: edge computing. And because it’s a trending buzzword, you might think it represents a sea change in the way we approach some aspect of computing. But if anything, it’s a bit of a throwback, like the IT version of disco or flared jeans.
Edge computing, simply put, is an approach to system design where the computing is done close to the source of the data.
Edge computing and IoT
But what’s that actually mean? Well, you need to think about this in the context of the modern Internet of Things, where sensors and other “things” live locally, at the edge of the network. But the data that’s generated at the edge is often packaged up and sent elsewhere for processing. That “elsewhere” can be a centrally managed server, or more often, the cloud.
Whether we’re talking about the IoT or more mundane computing tasks, we are living soundly in the Age of the Cloud, where seemingly everything is uploaded, stored, and often processed remotely on Amazon AWS or other third-party servers. After the data has been processed, the results are sent back downstream to local computers for additional conditioning, decision making, reporting, and action.
Real-world examples
But there are limitations to this approach, often related to bandwidth and latency. Consider an extreme example that’s a little outside the realm of IoT: space exploration. Mars rovers like Curiosity are too far from Earth for engineers at NASA or JPL to inform the rover’s moment-to-moment decision-making. Rather than upload telemetry and observations to Earth (depending on where the planets are in their orbits, that’s anywhere from a 3-to-22-minute delay), wait for processing, and then wait another 3 to 22 minutes for the return path, Curiosity can make some decisions on its own. Curiosity, in a sense, makes use of edge computing.
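You can sanity-check that delay yourself. Here’s a back-of-the-envelope sketch in Python; the Earth-to-Mars distances are approximate figures for the closest and farthest approaches, not mission data.

SPEED_OF_LIGHT_KM_S = 299_792

def one_way_delay_minutes(distance_km):
    """Minutes for a radio signal to cover the given distance."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60

print(round(one_way_delay_minutes(54.6e6)))   # closest approach: ~3 minutes
print(round(one_way_delay_minutes(401e6)))    # farthest apart: ~22 minutes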
And that same philosophy applies to a similar real-world case of IoT here on Earth: self-driving cars. Clearly, when the roads are filled with fleets of autonomous vehicles, there isn’t time for a vehicle to take measurements of its surroundings, upload that data to the cloud, wait for centralized computers to calculate a decision, and receive that decision back. Even at the speed of light, the round-trip data path is unworkably slow for life-and-death decisions at highway speeds. Instead, self-driving cars need to process that data locally. They perform edge computing.
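To see why, run the numbers. This sketch assumes a 200-millisecond cloud round trip (the responsiveness threshold discussed below); the real figure varies with network conditions.

speed_m_per_s = 60 * 1609.34 / 3600    # 60 mph in meters per second (~26.8)
round_trip_s = 0.200                   # assumed cloud round-trip time
print(speed_m_per_s * round_trip_s)    # ~5.4 meters traveled, effectively blind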
Preventing slow responses
Latency isn’t just a concern at 60 mph, either. We know from extensive user testing and research, for example, that when a computer interface responds in 200 milliseconds or less, it “feels” instantaneous – and slower responses are noticeably annoying. This matters more and more as computing is offloaded from local systems and moved to the cloud.
In some cases, it’s unavoidable. Consider smart home devices like Amazon Echo and Google Home. These devices listen for human speech but don’t have the resources to interpret it locally. Instead, the audio is packaged up, uploaded to the cloud, and the device waits to be told what to say or do in response. The lag can be quite pronounced, as you have no doubt experienced if you own one of these devices. Some edge computing already happens on these devices – the Echo detects its wake word locally, without help from the cloud – and over time, Amazon is likely to push more of the work onto the device itself, speeding up its response time.
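That division of labor looks something like the Python sketch below. Every function here is a hypothetical stand-in, not Amazon’s actual software; the point is simply where each step runs.

def wake_word_detected(frame):
    """Edge computing: a cheap, always-on check that runs on the device."""
    return b"alexa" in frame               # toy stand-in for a real acoustic model

def interpret_in_cloud(utterance):
    """Stand-in for the slow, heavyweight cloud round trip."""
    return "Here's what I found."

def listen_loop(frames):
    for frame in frames:
        if wake_word_detected(frame):          # decided locally, instantly
            print(interpret_in_cloud(frame))   # only now do we pay for the network

listen_loop([b"background noise", b"alexa, what's the weather?"])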
Question of bandwidth
And then there’s the question of bandwidth. As the IoT expands and devices become richer and more capable, the networks they run on will face ever greater demands to move data to a central storehouse or the cloud. Consider a network of security cameras: if all of the footage must be sent to the cloud to be analyzed for motion events, that’s potentially an enormous amount of data. If the cameras have the edge computing ability to flag interesting events and send only those moments to the cloud, as in the sketch below, that can save a company a fortune in bandwidth and data-warehousing costs.
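Here’s a minimal sketch of that idea, assuming a crude frame-differencing check; real cameras use far more sophisticated motion detection, and the threshold here is an arbitrary tuning value.

MOTION_THRESHOLD = 30   # assumed tuning value, not from any real product

def frame_changed(prev, curr, threshold=MOTION_THRESHOLD):
    """Crude motion check: total pixel-brightness difference between frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) > threshold

def filter_at_edge(frames):
    """Yield only the frames worth uploading; everything else stays local."""
    prev = frames[0]
    for curr in frames[1:]:
        if frame_changed(prev, curr):
            yield curr
        prev = curr

# Toy "frames" of pixel brightness values; only the big jump gets uploaded.
frames = [[10, 10, 10], [10, 11, 10], [90, 95, 88], [91, 94, 89]]
print(list(filter_at_edge(frames)))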
Edge computing isn’t quite a return to the old days of performing all computing, processing, and analysis on local machines, but it does represent a smart compromise between a completely cloud-based, centralized computing solution and agile, local processing.
Have questions about how edge computing might impact your IoT project? Contact Us today for a free consult ››