
Who would have ever thought that we would be able to order a car service and have the car pull up with no driver? Self-driving cars are a real thing! Imagine getting into a car that relies on cloud-only compute for autonomous driving, buckling your seat belt, and rolling toward your destination…then the connection to the car goes out: you hit a cellular dead spot, or the latency spikes. Because the vehicle uses cloud-only compute, it has no local intelligence and cannot make decisions on its own – this scenario is not going to end well! In this self-driving, autonomous-car use case, an intelligent micro data center on wheels can be the difference between life and death.

The use cases for placing intelligent computing at the edge are endless! Remote cellular and radio towers, container ships, remote military installations, self-driving automobiles, farms, remote branch offices (including retail, education, and hospitals), content delivery, product recommendations, city traffic management, manufacturing, remote IoT devices – these are just a few of the most common edge use cases.

As one of my favorite rock bands of all time, Aerosmith, sang in their 1993 release…We’re Livin’ on the Edge!

What is the edge micro data center?
The edge micro data center places compute and application data as close as possible to the source (or consumer) of that data. Simply put, distributed applications, devices, and users experience much lower latency, and applications perform more efficiently. Operations that previously traversed many network hops, often over great distances, now complete close to where they originate. Placing intelligent compute next to the application improves the experience: lower latency, less bandwidth consumed, increased application efficiency. Win, win, win!
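The latency argument above can be sketched with a bit of back-of-the-envelope math. The numbers below are purely illustrative assumptions, not measurements: the point is simply that round-trip time grows with the number of network hops, so serving a request from a nearby edge site rather than a distant cloud region cuts the total dramatically.

```python
# Illustrative sketch (made-up numbers, not a benchmark): round-trip
# latency as a function of network hops and per-hop delay.

def round_trip_ms(hops: int, per_hop_ms: float, processing_ms: float) -> float:
    """Total round-trip time: traverse each hop out and back, plus compute."""
    return 2 * hops * per_hop_ms + processing_ms

# A request that crosses a WAN to a distant cloud region...
cloud = round_trip_ms(hops=14, per_hop_ms=8.0, processing_ms=5.0)
# ...versus the same request served by an edge micro data center nearby.
edge = round_trip_ms(hops=2, per_hop_ms=8.0, processing_ms=5.0)

print(f"cloud: {cloud:.0f} ms, edge: {edge:.0f} ms")  # cloud: 229 ms, edge: 37 ms
```

With these assumed values the edge path is roughly six times faster, and that gap only widens as per-hop delay or hop count grows.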

The proliferation of data and compute at the edge has been on the rise for the last several years. We have seen it, for example, with large hyperscale cloud providers such as AWS with Snowball Edge, Outposts, and now AWS Local Zones. COVID-19 has further accelerated the requirements and use cases for placing data and compute at the edge, since companies have been forced to expedite their digital transformation timelines to remain relevant.

Risks & Challenges at the Edge
There are many benefits achieved by placing data and compute at the edge, but there are also associated risks and challenges which require consideration:

  • Edge compute, depending on the solution and use case, can be costly to deploy. Costs may be capital expenditures or human capital: the time required to manage, maintain, and update the infrastructure. The lack of specialized, trained staff to maintain and manage edge infrastructure has been a challenge for many years. Many solutions have made significant efforts to ease this burden, but the result is often management-tool sprawl. A modern approach must be taken to ease the ‘day n’ (ongoing operations) burden of edge infrastructure.
  • Depending on the amount of compute, data storage capacity, or redundancy required, physical space can be a challenge. In some locations and deployments, physical space is limited to the size of a rugged tote. How do you fit the required compute, data storage, and networking into such a small yet powerful package?
  • What about data sovereignty and geolocation requirements or regulations? In many parts of the world, data cannot legally cross national borders. Not all cloud providers operate data centers in every country, and depending on the use case, this may not be a risk at all but an opportunity for an organization to maintain tighter control over where its data lives.
  • Organizations likely have well-defined data security practices for data residing within the data center. Data at the edge, however, can present a risk, especially if it contains customer information. Is there data classification? How is the data secured? Is it encrypted? Who has access to it? When a breach occurs (not if, when), does the exposed data contain intellectual property or personally identifiable information? The answers to these questions should drive decision-making and how the edge is treated.
  • Latency is a killer! Latency is the round-trip time it takes to complete a transaction, and decreasing it is one of the largest contributing factors to the proliferation of data at the edge. The closer we place data and business logic to the consumer, the more likely we are to improve the end user’s experience. That improvement comes at the cost of the considerations above.
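One practical answer to the data classification and security questions above is to screen records at the edge before they are synced anywhere else. This is a minimal sketch with hypothetical field names, not a prescribed implementation: it simply tags certain fields as PII and masks them so that a record is safe to ship off the edge site.

```python
# Minimal sketch (hypothetical field names) of edge-side data classification:
# mask fields tagged as PII before a record leaves the edge micro data center.

PII_FIELDS = {"name", "email", "ssn", "card_number"}  # assumed classification

def redact(record: dict) -> dict:
    """Return a copy of the record with PII fields masked for upstream sync."""
    return {k: ("<redacted>" if k in PII_FIELDS else v) for k, v in record.items()}

record = {"device_id": "edge-042", "temp_c": 21.5, "email": "user@example.com"}
safe = redact(record)
print(safe)  # {'device_id': 'edge-042', 'temp_c': 21.5, 'email': '<redacted>'}
```

In a real deployment the classification set would come from policy, and masking would be paired with encryption in transit and at rest, but the principle is the same: decide at the edge what data is allowed to travel.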

Users expect access to their data now, and with the always-on nature of the digital world we live in, a few milliseconds here and there can mean the difference between keeping or losing a customer. The edge presents huge opportunities from a business perspective; however, the challenges are immense. These challenges can be overcome with a modern, cloud-managed approach. Over the coming weeks and months, we will be sharing unique use cases and real-life customer stories! In the meantime, tweet me at @ClintWyckoff to share your cool and unique edge application requirements!



Clint Wyckoff

Principal Technical Marketing Engineer

Clint Wyckoff is an avid technologist, cloud enthusiast and IT Pro Community contributor with over 15 years of real-world enterprise data center architecture experience. Clint is an energetic and engaging speaker and places a large emphasis on solving the real-world challenges IT professionals face.