Equipment and Chip Opportunities Emerge from Edge Computing in 2021

Edge computing (EC) embodies an old and simple idea: siphon off traffic with special characteristics from the main network to a local network.  The primary network then has more capacity for general traffic, while the local network can be designed specifically for the needs of local users.  In the newer concept of edge computing, the “compute,” or analysis, is done as close as possible to the end user, making processing real-time or near real-time.  This meets the needs of a new breed of traffic that CIR believes is about to hit our networks.
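To make the idea concrete, the minimal sketch below is our own illustration of that siphoning step; the node names and the 10 ms threshold are assumptions for the example, not figures from CIR’s report.

```python
# Hypothetical illustration of the basic edge-computing idea: traffic with
# special characteristics (here, a tight latency tolerance) is siphoned off to
# a local edge node, while general traffic continues to the core network.

from dataclasses import dataclass

EDGE_NODE = "edge-node.local"      # nearby compute, assumed single-digit-ms away
CORE_NETWORK = "core.datacenter"   # assumed distant cloud/data-center path

@dataclass
class Flow:
    app: str                 # e.g. "sensor-control", "video-stream"
    max_latency_ms: float    # the latency the application can tolerate

def route(flow: Flow) -> str:
    """Send latency-sensitive flows to the edge; everything else to the core."""
    if flow.max_latency_ms < 10:          # assumed edge-class latency budget
        return EDGE_NODE
    return CORE_NETWORK

if __name__ == "__main__":
    for f in (Flow("sensor-control", 1.0), Flow("video-stream", 150.0)):
        print(f"{f.app:>14} -> {route(f)}")
```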

As CIR sees it, edge computing architecture is currently most associated with high-definition video traffic, but CIR’s recent report on the market potential of edge computing, “The Edge Computing Infrastructure Market: A Technology and Market Forecast 2020 – 2024,” claims that the match between video and edge networks is not a good one.  Nonetheless, CIR believes that edge computing will truly take off in 2021, when it begins to handle entirely new kinds of traffic generated by self-driving cars, artificial intelligence, and the Internet-of-Things (IoT).

According to CIR’s study, revenues from edge computing will amount to $8 billion in 2021, with this money flowing from sales of routers, servers, specialist software, and edge chips, including some fairly conventional gear.  Many of these products are, at present, garden variety, but beyond 2021 we expect them to be increasingly designed and branded specifically for the edge.

A Wake-Up Call to Equipment Makers:  New Boxes Needed

Currently, CIR sees a lot of buzz around edge computing for video (mainly fuelled by work-from-home and online conferencing), but the truth is that video traffic doesn’t really need edge computing.

The latency requirement for video chat and VoIP is 100–150 ms. For large conferences, video lag may create issues, but anything below 100 ms is considered excellent, so edge computing is overkill.

Edge computing is expected to deliver single-digit latency (less than 10 ms).  This matches the latency requirements of the new kinds of traffic that will fill network capacity, much as video has filled it over the past decade.  In place of the video boom, network managers will have to cope with latency-sensitive traffic from the IoT (happening now) and from self-driving cars and AI/machine learning (happening in the near future), where latency requirements may be as low as 1 ms.
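A rough back-of-the-envelope sketch of why a 1 ms budget effectively forces computation to the edge is shown below; the distances and the roughly 5 µs/km fiber propagation figure are our own assumptions, not figures from CIR’s report.

```python
# Back-of-the-envelope latency illustration (assumed numbers, for intuition only).
# Propagation in optical fiber is roughly 5 microseconds per km each way,
# i.e. ~0.01 ms of round-trip delay per km, before any processing or queuing.

FIBER_RTT_MS_PER_KM = 0.01   # assumed round-trip propagation delay per km

def propagation_rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay to a compute site distance_km away."""
    return distance_km * FIBER_RTT_MS_PER_KM

budgets_ms = {"video chat / VoIP": 150.0, "IoT control / self-driving": 1.0}
sites_km = {"regional cloud (~1,000 km)": 1000.0, "edge node (~10 km)": 10.0}

for app, budget in budgets_ms.items():
    for site, dist in sites_km.items():
        rtt = propagation_rtt_ms(dist)
        verdict = "fits" if rtt < budget else "exceeds budget on propagation alone"
        print(f"{app:<28} via {site:<26}: {rtt:5.2f} ms RTT -> {verdict}")
```

On these assumptions, a 150 ms video budget is comfortably met even from a distant data center, while a 1 ms budget is exceeded by propagation delay alone unless the compute sits nearby.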

Very soon, makers of routers, servers, and gateways will have to design their boxes around this kind of performance.  As CIR sees it, not only will the immediate financial rewards to such firms be considerable, but edge computing will prove to be an enabler of new industries such as remote surgery and service robots, sectors that have been looking for edge routing (under various names) for some time.

The Rise of Edge-Specific Processors

The terms “edge router” and “edge server” have been used for some time but have seldom referred to edge-specific products; in particular, such products have used conventional processors.  CIR notes, however, that this is changing, giving rise to true edge computing products.

Edge-specific processors are designed with the new kinds of traffic discussed above in mind; that is, to better handle latency-sensitive traffic.  In addition, CIR sees edge processors as more efficient, consuming less power than the conventional processors that have gone into what passes for edge servers and routers today.  The bottom line: starting in 2021, OEMs and end users alike will be looking for chips that are (1) very high performance and (2) capable of processing in a manner suitable for AI, ML, and DL traffic.

The source of these new chips may be the OEMs themselves and the usual suspects among the fabless manufacturers, but a few of the large established firms may take on this task as well.  In addition, while CIR believes that the actual manufacturing of the chips will be done, as always, by foundries, notably TSMC, the market dynamics will change, with important instances of OEMs acquiring design firms and fabless “manufacturers.”  This, in fact, has already begun to happen.

Meanwhile, edge networks will co-exist with cloud networks, and this hybrid ecosystem will become the norm going forward. Edge infrastructure will process locally generated, mission-critical data and data required for real-time purposes, while the cloud will be used for storage, non-critical data, and other operational data.
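A hypothetical sketch of that hybrid split is shown below; the record fields and handler names are illustrative only and not drawn from CIR’s report.

```python
# Hypothetical sketch of the hybrid edge/cloud split described above:
# mission-critical, real-time data is handled by local edge infrastructure,
# while non-critical and operational data is forwarded to the cloud for storage.

from typing import Any, Dict

def handle_at_edge(record: Dict[str, Any]) -> None:
    # Placeholder for local, low-latency processing (e.g. control decisions).
    print(f"[edge]  processed in real time: {record['kind']}")

def archive_in_cloud(record: Dict[str, Any]) -> None:
    # Placeholder for an upload to cloud storage or analytics pipelines.
    print(f"[cloud] queued for storage:     {record['kind']}")

def dispatch(record: Dict[str, Any]) -> None:
    """Route each record based on whether it is mission-critical or real-time."""
    if record.get("mission_critical") or record.get("real_time"):
        handle_at_edge(record)
    else:
        archive_in_cloud(record)

if __name__ == "__main__":
    records = [
        {"kind": "collision-avoidance event", "mission_critical": True, "real_time": True},
        {"kind": "daily telemetry summary", "mission_critical": False, "real_time": False},
    ]
    for r in records:
        dispatch(r)
```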

 
