Edge Computing vs Cloud Computing: What, Why & How?

By Mona Mangat

Edge Computing vs Cloud Computing: Key Differences

The term “edge computing” refers to a distributed computing paradigm that brings data storage and compute power closer to the device or data source where it is needed. Instead of routing information through distant data centers for processing in the cloud, the cloud comes to you. This distribution reduces latency and saves bandwidth.

Edge computing is an alternative to the centralized cloud environment, well suited to workloads such as the Internet of Things. It is about processing real-time data near the data source, which is considered the ‘edge’ of the network: applications run as physically close as possible to the site where the data is generated, rather than in a centralized cloud, data center, or other storage location.

For example, if a vehicle’s onboard computer calculates fuel consumption based on data received directly from its sensors, the computer performing that action is called an edge computing device, or simply an ‘edge device.’ Given this shift in how data is sourced and managed, we will compare the two technologies and examine the benefits each has to offer.
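As a minimal sketch of that idea (the sensor fields and function names below are illustrative, not from any real vehicle API), an edge device might compute fuel consumption locally from raw sensor readings and upload only a small summary to the cloud, rather than streaming every reading:

```python
# Hypothetical edge-side processing: compute fuel consumption from raw
# sensor readings on the device, so only an aggregate leaves the vehicle.

def fuel_consumption_l_per_100km(fuel_used_liters, distance_km):
    """Liters per 100 km from two raw sensor values."""
    if distance_km <= 0:
        raise ValueError("distance must be positive")
    return 100.0 * fuel_used_liters / distance_km

# Raw readings as they might arrive directly from the sensors.
readings = [
    {"fuel_used_liters": 1.2, "distance_km": 18.0},
    {"fuel_used_liters": 0.9, "distance_km": 15.5},
]

# Processing happens at the edge; only this summary would be sent upstream.
values = [
    fuel_consumption_l_per_100km(r["fuel_used_liters"], r["distance_km"])
    for r in readings
]
summary = {"avg_l_per_100km": round(sum(values) / len(values), 2)}
print(summary)
```

Sending a few bytes of summary instead of a continuous raw sensor stream is exactly the latency and bandwidth saving described above.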

Edge computing vs. cloud computing is not an either-or debate, nor are they direct competitors. Learn more about both of these solutions.

Originally published on phoenixnap.com December 2, 2019