Edge computing is often seen in IoT deployments and in applications that depend on real-time data, because it allows decisions to be made close to where the data is created.
What is it?
A form of distributed computing in which data storage and processing shift closer to the device, rather than relying on a central location.
Edge computing is particularly suited to applications where latency has serious performance implications — for example, industrial IoT, medical imaging, or VR/AR. By shifting computation to the edges of the network, you reduce round trips to distant data centers and improve responsiveness.
What’s in it for you?
If you have applications that are highly sensitive to data latency, edge computing can provide a significant performance boost.
Businesses can save money by performing processing locally, minimizing the amount of data that needs to be stored centrally or in the cloud. Because less energy is used to transport the data, edge computing is potentially more sustainable too.
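To illustrate how local processing shrinks what must travel to the cloud, here is a minimal sketch of an edge-gateway pattern: raw sensor readings are aggregated on the device and only a compact summary is forwarded. The function names and payload shapes are illustrative assumptions, not a specific product's API.

```python
import json
import statistics

def summarize_readings(readings):
    """Reduce a window of raw sensor readings to a small summary.

    Hypothetical edge-side aggregation: instead of shipping every
    reading to the cloud, forward only the statistics you need.
    """
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(statistics.mean(readings), 2),
    }

# One minute of temperature readings at 10 Hz from a single sensor
# (synthetic data for the sketch)
raw = [20.0 + (i % 7) * 0.1 for i in range(600)]

raw_payload = json.dumps(raw)               # what you'd send without edge processing
summary_payload = json.dumps(summarize_readings(raw))  # what the gateway sends instead

print(len(raw_payload), "bytes raw vs", len(summary_payload), "bytes summarized")
```

Multiplied across thousands of sensors, this difference in payload size is where the bandwidth and storage savings come from.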
What are the trade-offs?
Edge computing introduces more diverse and complex deployment scenarios for your organization. Consider the management, monitoring, and testing challenges associated with complex, remote architectures.
You’ll need to consider how you deal with data privacy on the edges of your network — this is particularly true if you’re using resource-constrained IoT sensors that may limit your ability to encrypt data.
How is it being used?
Edge computing is often found in Internet of Things (IoT) deployments, where the bandwidth costs of transporting large amounts of data long distances could be prohibitive.
Increasingly, we’re seeing edge computing used for applications that exploit real-time data, such as video processing and analytics, autonomous vehicles, and robotics. This is supported by faster networking technologies such as 5G wireless.