By nature, fixed and mobile service providers have
distributed sites, large and small, where various equipment
is placed to provide connectivity across rural and suburban
areas and cities. In cities, more enterprises are connected
to the network, so larger data centers are needed; the more
remote the location, the smaller the capacity and physical
footprint.
Operators have invested heavily in building out central
offices for their network gear, and some of these sites house
older equipment and are not as modern as recently built data
centers. However, the more things that are connected –
people, cars, trucks, and industrial machines – the more
valuable the sites may prove to be. A short distance to the
data center where the actual service is executed is often
a necessity for IoT services with very strict latency
requirements.
In other words, this distributed telco environment is an
advantage both for taking part in the IoT value chain and for
enterprise on-premises solutions. Many nodes are deployed
close to offices or inside factories, warehouses and other
locations where data is generated and consumed.
Micro data centers are the solution: with a small physical
footprint and a modest set of resources, they deliver compute
and storage much closer to the user. Think of the impact on
CDN and video caching services, or on software updates for
autonomous electric cars.
The best way to address small physical installations is to
design integrated systems – integrated in the sense of one
system combining storage, compute, switching and routing –
while also simplifying the applications by moving from virtual
machines (VMs) to containerized applications. To keep
overhead costs down, integrated, automated IP fabric
management with high availability (HA) is a key ingredient
of a micro data center. Day-0, Day-1 and Day-2 network
automation becomes a key feature, making it possible to
deploy and manage thousands of small fabrics at the edge of
the network, as well as to provision and orchestrate tenants,
as sketched in the example below.

Leveraging Integrated Application Hosting (IAH) built into
the network infrastructure adds compute and storage capacity
to micro data centers and provides enough edge compute power
to drive latency-sensitive applications for enterprise
customers or the integration of thousands of IoT devices.
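To make the Day-0/Day-1/Day-2 idea concrete, here is a minimal, illustrative Python sketch of the three automation phases for one micro data center fabric. The device names, templates and the push_config helper are hypothetical stand-ins for whatever zero-touch-provisioning and fabric-management tooling an operator actually uses.

```python
from dataclasses import dataclass, field

@dataclass
class FabricDevice:
    name: str          # hypothetical device name, e.g. "leaf-01"
    role: str          # "leaf" or "spine"
    config: dict = field(default_factory=dict)

def push_config(device: FabricDevice, config: dict) -> None:
    """Placeholder for the operator's real southbound API (NETCONF, gNMI, ...)."""
    device.config.update(config)
    print(f"pushed {list(config)} to {device.name}")

def day0_bootstrap(devices: list[FabricDevice]) -> None:
    """Day-0: zero-touch bootstrap - base settings, management access, underlay."""
    for d in devices:
        push_config(d, {"mgmt": True, "underlay": "ebgp", "role": d.role})

def day1_provision_tenant(devices: list[FabricDevice], tenant: str, vni: int) -> None:
    """Day-1: service turn-up - create the tenant overlay (e.g. a VXLAN VNI) on each leaf."""
    for d in devices:
        if d.role == "leaf":
            push_config(d, {f"tenant:{tenant}": {"vni": vni}})

def day2_health_check(devices: list[FabricDevice]) -> list[str]:
    """Day-2: ongoing operations - verify the intended state is still present."""
    return [d.name for d in devices if "underlay" not in d.config]

if __name__ == "__main__":
    fabric = [FabricDevice("spine-01", "spine"),
              FabricDevice("leaf-01", "leaf"),
              FabricDevice("leaf-02", "leaf")]
    day0_bootstrap(fabric)
    day1_provision_tenant(fabric, tenant="enterprise-a", vni=10010)
    print("devices missing intent:", day2_health_check(fabric))
```

Applied across thousands of sites, the same intent would be pushed by an orchestrator rather than a script, but the phases remain the same.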
Geo-redundancy of equipment is also vital, rather than being
over-ambitious in implementing resilience in the hardware and
software of each site. This means that multiple sites work
together to make sure the service keeps running if one
specific location fails.
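As a simple illustration of that principle, the sketch below (with made-up site names and a simulated health probe) shows a controller steering a service to the first healthy site in a preference list; a real deployment would achieve the same effect with DNS, anycast or an orchestrator.

```python
# Illustrative geo-redundancy sketch: pick the first healthy site from a
# preference-ordered list. Site names and health states are hypothetical.
SITE_PREFERENCE = ["central-office-north", "central-office-south", "regional-dc"]

# A real system would probe a health-check endpoint per site;
# here the result is simulated with a static map.
SITE_HEALTH = {"central-office-north": False,   # primary site has failed
               "central-office-south": True,
               "regional-dc": True}

def select_active_site(preference: list[str], health: dict[str, bool]) -> str:
    """Return the first healthy site, failing over automatically."""
    for site in preference:
        if health.get(site, False):
            return site
    raise RuntimeError("no healthy site available")

if __name__ == "__main__":
    print("service anchored at:", select_active_site(SITE_PREFERENCE, SITE_HEALTH))
```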
Edge computing is an evolution of cloud computing that
brings application hosting from centralized data centers
down to the network edge, closer to the enterprise and the
consumer and to the data generated by applications. Edge
computing is acknowledged as one of the key pillars for
meeting the demanding Key Performance Indicators (KPIs)
of 5G, especially as far as low latency and bandwidth
efficiency are concerned.
However, not only is edge computing in telecommunications
networks a technical enabler for the demanding KPIs, it
also plays an essential role in the transformation of the
telecommunications business, where telecommunications
networks are turning into versatile service platforms
for industry and other specific customer segments like
enterprises. This transformation is supported by edge
computing, as it opens the network edge for applications
and services, including those from third parties.
With the introduction of 5G and edge computing, service
providers can create new offerings for enterprise customers
while processes and functions are automated. Edge
computing places distributed computing and storage
resources closer to the locations where they are needed. It
targets new business opportunities by supporting specific
new application use cases such as Artificial Intelligence (AI)
and Machine Learning (ML).
Customers want reliable application delivery, and service
providers need their IT staff to address issues as quickly
as possible. More compute power at the edge of the network
means containers are being deployed faster than IT staff
can manage them manually. Management tools can automate
deployment, but troubleshooting and service assurance must
be carried out in an automated fashion as well. With
telemetry data spread across multiple server components,
the IT staff needs to process that data quickly and gain
valuable insights from the visible trends.
The solution here is AI, specifically machine learning, which
powers orchestration to deliver predictive and scalable
operations across workloads. The combination of real-time
network monitoring and ML provides an automated solution
for provisioning, instantiating, and configuring physical and
virtual network functions quicker and more accurately than
if a human carried out the task.
This frees up IT staff to spend their time on mission-critical,
higher-value tasks that contribute to the business.
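As a simplified illustration of how ML over streaming telemetry can feed such closed-loop automation, the sketch below flags anomalous interface utilisation with a rolling z-score and calls a placeholder remediation hook. The metric, the threshold and the scale_out function are illustrative assumptions; a production system would use richer models and a real orchestrator API.

```python
import statistics
from collections import deque

WINDOW = 30          # number of recent samples kept for the metric
Z_THRESHOLD = 3.0    # how many standard deviations counts as an anomaly

history: deque[float] = deque(maxlen=WINDOW)

def scale_out(reason: str) -> None:
    """Placeholder for a real orchestrator call (e.g. adding a container replica)."""
    print(f"remediation triggered: {reason}")

def ingest_sample(link_utilisation: float) -> None:
    """Score each new telemetry sample against the recent history."""
    if len(history) >= 10:  # wait for some history before scoring
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1e-9
        z = (link_utilisation - mean) / stdev
        if z > Z_THRESHOLD:
            scale_out(f"utilisation {link_utilisation:.1f}% is {z:.1f} sigma above normal")
    history.append(link_utilisation)

if __name__ == "__main__":
    # Simulated telemetry stream: steady load followed by a sudden spike.
    for sample in [42, 40, 43, 41, 44, 42, 43, 41, 40, 42, 43, 41, 95]:
        ingest_sample(float(sample))
```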
With IoT devices soon expected to produce trillions of
gigabytes of data daily, the Internet of Things (IoT) will be
both the biggest producer and the biggest consumer of data.
Billions of IoT devices will serve a variety of use cases,
including smart cities, smart retail, smart vehicles, smart
homes, and more. 5G in high bands enables critical IoT and
industrial automation as defined in Release 16 of the 5G
standard.
Edge devices are, in theory, IoT devices – and video analytics
and AR/VR will play an important part in the IoT. For example,
a face detection workload may be run for a device in a smart
city setting, for checkout in a smart retail shop, or as part
of AR for a private user. IoT workloads will also generally
include the AI workloads required to process each data point.
One specific IoT-related workload is the IoT gateway. Because
IoT data needs to be processed differently, at different
latencies and for varying purposes, compute capability is
needed at different locations to fulfill those latency
requirements. The edge cloud is therefore an ideal location
for performing such functions.
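A minimal sketch of such a gateway function is shown below: readings are filtered and aggregated locally, and only compact summaries plus urgent alarms are forwarded upstream. The sensor fields, thresholds and the forward_upstream call are hypothetical, chosen only to illustrate the split between low-latency local handling and less time-critical central processing.

```python
import json
import statistics

ALARM_TEMP_C = 85.0   # illustrative threshold for an immediate, low-latency alarm

def forward_upstream(payload: dict) -> None:
    """Placeholder for sending data to a central cloud (e.g. via MQTT or HTTPS)."""
    print("upstream:", json.dumps(payload))

def handle_batch(device_id: str, temperatures: list[float]) -> None:
    """Process one batch of readings at the edge.

    Urgent conditions are flagged immediately; everything else is reduced to a
    small summary so only a fraction of the raw data crosses the core network.
    """
    overheated = [t for t in temperatures if t >= ALARM_TEMP_C]
    if overheated:
        forward_upstream({"device": device_id, "alarm": "overheat", "samples": overheated})
    forward_upstream({
        "device": device_id,
        "count": len(temperatures),
        "mean": round(statistics.mean(temperatures), 1),
        "max": max(temperatures),
    })

if __name__ == "__main__":
    # Simulated readings from one sensor; only two small JSON messages leave the site.
    handle_batch("sensor-042", [71.2, 70.8, 72.5, 86.1, 71.9])
```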
Data organization and processing will be an important
operation at the edge cloud. Fundamentally, data
organization entities range widely in complexity, from simple
key-value stores that are designed for very fast access to
data to complex analytics operations. There will be end-to-end
use cases providing video surveillance to cities,
enterprises or neighborhoods over the network. Edge cloud
micro data centers are used for analyzing video streams from
nearby surveillance IP cameras to conduct targeted searches
in order to detect, recognize, count and track pedestrians,
faces, vehicles, license plates, abnormal events/behaviors
and other types of content in the video. Analysis and
processing happen closer to the point of capture, thereby
conserving video transmission bandwidth and reducing the
amount of data routed through the core network.
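As an illustration of keeping that analysis at the point of capture, the sketch below runs a basic face detector against a camera stream and emits only per-frame counts as metadata, so no raw video has to leave the site. It assumes OpenCV is installed and uses a placeholder RTSP URL; a real deployment would use more capable, likely GPU-accelerated, detection models.

```python
import cv2  # OpenCV, assumed to be available in the edge container

# Placeholder stream address for a nearby surveillance IP camera.
STREAM_URL = "rtsp://camera.example.local/stream1"

# A simple pre-trained face detector shipped with OpenCV (illustrative only).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def analyze_stream(url: str, max_frames: int = 100) -> None:
    """Detect faces frame by frame and report only counts, not video."""
    capture = cv2.VideoCapture(url)
    frame_no = 0
    while frame_no < max_frames:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        # Only this small metadata record would be sent towards the core network.
        print({"frame": frame_no, "faces": len(faces)})
        frame_no += 1
    capture.release()

if __name__ == "__main__":
    analyze_stream(STREAM_URL)
```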
Service providers, and especially 5G mobile operators, are compelled to push more compute and storage power to the edge of their networks. Leveraging their existing, geographically distributed infrastructure is a great asset. Building 5G micro data centers – built with Integrated Application Hosting and IP fabric management – enables and simplifies the management of IP fabric networks that deliver the foundation for emerging applications in the 5G ecosystem.