
Using 5G in tandem with edge computing to unlock maximum value

June 25, 2021
5G is enabling new cutting-edge applications, but organizations must couple this connectivity with edge computing to drive maximum ROI.

5G is poised to be one of the most transformative technologies of all time, and it will provide a foundation for applications and solutions that advance enterprises and unlock value for industries and consumers alike. Data shows that organizations are making large investments in 5G networks. For example, the CBRS auction in 2020 raised $4.6 billion from traditional telecom operators as well as non-traditional bidders, including enterprises. What’s more, the C-Band auction, which wrapped up recently, cleared $80 billion in bids.

So why is all this money being spent? Investments in connectivity (as well as servers and storage) are a means to an end. Organizations make expensive infrastructure investments hoping to collect and analyze data more effectively and uncover insights that improve business outcomes and reduce operational costs. To achieve this, enterprises need reliable, high-speed connectivity to power mission-critical use cases.

Industry researchers note that 5G networks promise significant improvements in connectivity speed, capacity and bandwidth to support IoT and automation deployments such as smart factories and connected cities. To deliver on these advanced use cases, real-time data-processing capabilities are critical. High-capacity, low-latency information transfer is a core benefit of 5G, but optimal value cannot be achieved by 5G alone.

If organizations want to realize the benefits of the 5G era, edge computing is imperative.

5G networks, both public and private, can help connect millions of sensors in a small area. More often than not, however, much of the collected data is transported to the cloud for analysis. Analyzing raw data off-premises in the cloud can be prohibitively expensive because of transport and processing costs. As a result, organizations frequently resort to down-sampled or time-deferred data to balance cost and timeliness, which makes it easy to miss anomalies and difficult to gain accurate insights. As data volumes grow, enterprises need infrastructure at the edge to rapidly analyze large data sets right where that data is generated.
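To illustrate why down-sampling can hide problems, here is a minimal Python sketch. The sensor stream, threshold and every-10th-sample policy are all hypothetical stand-ins, not any specific product or pipeline; the point is simply that a short-lived spike visible at the edge can vanish once the data is thinned for transport.

```python
import random

def sensor_stream(n=1000, anomaly_at=503):
    """Simulated full-rate sensor readings with one brief spike (hypothetical data)."""
    for i in range(n):
        value = random.gauss(50.0, 1.0)
        if i == anomaly_at:
            value += 25.0  # a single out-of-range reading lasting one sample
        yield value

THRESHOLD = 60.0
readings = list(sensor_stream())

# Edge processing: inspect every reading as it is produced.
edge_hits = [i for i, v in enumerate(readings) if v > THRESHOLD]

# Cloud path with down-sampling: keep only every 10th reading to cut transport cost.
downsampled = readings[::10]
cloud_hits = [i * 10 for i, v in enumerate(downsampled) if v > THRESHOLD]

print(f"Anomalies caught at full rate (edge): {len(edge_hits)}")        # 1
print(f"Anomalies caught after down-sampling: {len(cloud_hits)}")       # typically 0
```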

Some of the most common and highly desired 5G use cases also depend on low-latency data processing, which is why data locality is key. These use cases require powerful, compact computing resources deployed onsite, near (or within) the connected devices or sensors, to meet latency targets. Edge computing enables this real-time data analysis and closed-loop control.

Consider a video use case in an industrial setting: a camera mounted above an assembly line at a manufacturing plant monitors each product for potential defects. The streaming video creates enormous quantities of data. The camera itself, or a gateway near the camera, can ingest the stream and run machine learning to identify anomalies. When the camera system spots an out-of-spec product, the conveyor belt must stop immediately so the item can be removed or inspected. This all must happen within milliseconds. If the process takes seconds or longer, the product will continue moving along before the line finally halts, creating significant waste and scrap for the manufacturer.
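To make the timing constraint concrete, here is a minimal Python sketch of such a closed loop running on the camera or a nearby gateway. The interfaces (capture_frame, looks_defective, stop_conveyor) and the 50 ms budget are hypothetical placeholders for a camera SDK, an on-device defect model and a line controller, not any vendor's actual API.

```python
import time

# Stubbed interfaces: in a real deployment these would wrap the camera SDK,
# the on-device inference runtime, and the controller that drives the conveyor.
def capture_frame(i):
    """Return the next frame from the inspection camera (simulated here)."""
    return {"frame_id": i}

def looks_defective(frame) -> bool:
    """Run the local defect model on one frame (simulated: flags frame 42)."""
    return frame["frame_id"] == 42

def stop_conveyor():
    """Signal the line controller to halt the belt."""
    print("belt halted for inspection")

LATENCY_BUDGET_MS = 50  # assumed budget from frame capture to belt stop

for i in range(1000):
    start = time.perf_counter()
    frame = capture_frame(i)
    if looks_defective(frame):
        stop_conveyor()  # decision and actuation stay on-site; no cloud round trip
        break
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms > LATENCY_BUDGET_MS:
        print(f"warning: frame {i} took {elapsed_ms:.1f} ms, over budget")
```

Because the decision is made next to the camera, the loop never waits on a network round trip, which is what keeps the stop signal inside a milliseconds-scale budget.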

The right 5G and edge infrastructure can help businesses achieve their desired outcomes at the lowest possible cost, with the highest-quality data, in real time. 5G is enabling new cutting-edge applications and use cases that weren’t possible with previous generations of network technology, but organizations must couple this connectivity with edge computing to drive maximum ROI from 5G investments.

By Chris Penrose, COO, FogHorn