Can You Gain Advantage With Software-defined Edge?
Alan Renouf and Chris Taylor from VMware, during their Edge Field Day presentation, tell us to forget everything we know about VMware because they’ve rethought and redesigned everything for the edge.
VMware VeloCloud is designed to enable the “new enterprise edge” — architected for distributed clouds, workers, and applications.
The edge portfolio includes VMware Edge Cloud Orchestrator, which lets you manage edge compute infrastructure at scale. Notably, its user interface shares the same console as the network overlay, VMware VeloCloud SASE secured by Symantec. The goal is to provide complete visibility into the entire edge infrastructure so that you can optimize your environment.
What is the edge?
VMware says that the edge is distributed digital infrastructure for running workloads across dispersed locations, placed close to where endpoints are producing or consuming data. Alan and Chris actually call this the “software-defined” edge.
Why put compute at the edge?
The three big drivers for moving compute to the edge are:
- Too much data — it can be difficult and expensive to move the data to the core
- Poor connectivity — it can be difficult or impossible to get the necessary reliable data pipes in place
- Regulatory requirements — moving or storing sensitive data such as biometrics may be regulated
Industry success (or failure) at the edge?
Seventy percent of companies that invest in Industry 4.0 — which relies heavily on the edge — fail to move beyond the POC or trial stage.
Why?
Because the edge is different from the core, with its own challenges: scale, limited networks, edge-specific hardware and protocols, limited onsite personnel, and security. For example, in addition to traditional security concerns, the edge raises physical ones: the server might sit in a restaurant kitchen or the store manager’s office, without the physical access controls you’d find in a corporate data center.
Ad hoc implementations increase operating complexity and cost
The edge has grown organically, with new disparate components added on as needed to meet new requirements and new use cases. The resulting “hodge-podge” edge infrastructure is hard to manage, and it’s hard to physically maintain and upgrade hardware. In many instances, upgrading software or firmware requires an operator to travel to the physical site and get hands on the actual hardware.
Push vs Pull
Traditional management of infrastructure in a data center uses the push methodology. A management console pushes updates and configuration changes to each component.
With variety, variability, and limited connectivity, the edge needs to take the opposite approach. Each edge component is responsible for “calling home” to check for updates and configuration changes, and for scheduling when to apply those changes.
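The pull model described above can be sketched as a periodic check-in loop. This is a minimal illustration, not VMware’s actual agent: the function names, the change format, and the maintenance-window policy are all hypothetical.

```python
def check_in(device_id, fetch_pending, apply_change, local_hour):
    """One 'call home' cycle: pull any queued changes from the
    control plane and apply those allowed by the local schedule."""
    applied = []
    for change in fetch_pending(device_id):
        # Apply disruptive changes only during quiet hours (1-4 AM here),
        # e.g. after a store or restaurant has closed for the night.
        if 1 <= local_hour <= 4:
            apply_change(change)
            applied.append(change)
    return applied

# Example: one firmware update queued, site checks in at 2 AM local time.
pending = [{"type": "firmware", "version": "2.1"}]
applied = check_in("edge-042", lambda device: list(pending),
                   lambda change: None, local_hour=2)
assert applied == pending
```

The key inversion is that the site, not the console, decides when a change is safe to apply; the control plane only publishes what is pending.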
How can VMware help?
VMware Edge Compute Stack’s value proposition is that it provides an edge-optimized run-time and orchestration platform for frictionless management of edge apps and infrastructure.
- Supports (and is optimized for) the types of workloads at the edge, including real-time apps
- Simplifies operations to reduce management complexity and costs
- Provides flexibility and scalability for your changing needs
Notably, because they’ve rethought and optimized the infrastructure for the edge, there’s no need to deploy the full VMware Cloud Foundation (VCF) solution.
The architecture comprises:
- Edge-optimized runtime – an optimized ESXi hypervisor plus a Kubernetes runtime.
- VECO (VMware Edge Cloud Orchestrator) – centralized, edge-optimized orchestration with zero-touch provisioning, leveraging the Kubernetes operating model: admins define the desired state of operation, and the orchestration layer takes action to achieve that state.
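The desired-state model behind that orchestration can be sketched as a reconcile step: compare what a site should be running with what it is running, and derive only the actions needed to close the gap. A minimal illustration (the data shapes and component names are invented for this sketch):

```python
def reconcile(desired, current):
    """Compute the actions needed to move `current` toward `desired`.
    Both map component name -> version; the scheme is illustrative."""
    actions = []
    for name, version in desired.items():
        if current.get(name) != version:
            actions.append(("install", name, version))
    for name in current:
        if name not in desired:
            actions.append(("remove", name))
    return actions

# The admin declares what each site should run; the loop derives the steps.
desired = {"pos-app": "2.0", "inventory": "1.3"}
current = {"pos-app": "1.9", "legacy-svc": "0.8"}
print(reconcile(desired, current))
# [('install', 'pos-app', '2.0'), ('install', 'inventory', '1.3'), ('remove', 'legacy-svc')]
```

Because the admin declares outcomes rather than scripting steps, the same declaration works whether a site is slightly out of date or freshly provisioned from scratch.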
The goal is to have an OEM or system integrator ship hardware with pre-installed VMware Edge Compute stack runtime. When plugged in and powered on, the VMware edge agent calls home, authenticates, and fetches its configuration and workloads. The edge configuration would have been previously defined by the admins using the centralized console and orchestration layer.
Leveraging modern DevOps principles and tools
VMware is using modern DevOps principles and tools, including GitOps. The hardware, infrastructure, and workload configurations are defined in files kept in a source-controlled repository (git). An update to any of these configuration “source code” files triggers a pipeline of operations using continuous integration and continuous delivery/deployment (CI/CD) methodology, running a series of tests and approvals to ensure that the changes will result in a good state. Each edge component then pulls the new version from git and “realizes” it, making whatever changes are necessary to move from the current state to the new state.
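The “tests and approvals” stage of such a pipeline is where bad configuration is caught before any site ever pulls it. A sketch of one CI gate, with rules invented purely for illustration (VMware’s actual checks were not described):

```python
def validate_config(config):
    """CI gate: reject configs that could leave a site in a bad state.
    The rules here are illustrative, not VMware's actual checks."""
    errors = []
    for workload in config.get("workloads", []):
        if "name" not in workload:
            errors.append("workload missing name")
        # Require version-pinned images so every site converges on the
        # same artifact, rather than whatever ':latest' means that day.
        if ":" not in workload.get("image", ""):
            errors.append(f"{workload.get('name', '?')}: image must be version-pinned")
    return errors

good = {"workloads": [{"name": "pos", "image": "registry.example.com/pos:2.0"}]}
bad = {"workloads": [{"name": "pos", "image": "pos"}]}
assert validate_config(good) == []
assert validate_config(bad) == ["pos: image must be version-pinned"]
```

A pipeline would run checks like this on every proposed change; only changes that pass (and are approved) land in git for the edge sites to pull.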
Stateful or stateless applications?
VMware’s original vision was for a stateless edge environment, without local storage. However, they’ve since come to the realization that some edge workloads may need local storage, and are in the process of revising the solution to support these use cases.
What else is missing?
What we didn’t hear is how the edge compute stack integrates or interoperates with VMware’s traditional data center stack, or what the benefits of a core-to-edge VMware deployment would be.
There was also no mention of how to migrate an existing edge location to VMware Edge Compute Stack. The entire focus is on day 0 deployment: ship new hardware with the edge stack pre-installed, and then configure new apps running on that new hardware. It’s unclear how you would do a “P2V” (physical to virtual) conversion of existing applications running at the edge.
Gaining advantages from VMware Edge Compute Stack
The real challenge with the edge isn’t day 0 deployment; it’s day 1 management and maintenance, and day 2 upgrades and changes. The traditional edge management model is hands-on management of each individual component, treating each edge location as separate and unique.
With VMware Edge Compute Stack, you can modernize your operations, moving to a DevOps CI/CD model with source-controlled configuration files: you simply define the desired state, and the infrastructure takes whatever actions are necessary to achieve it. This gives you all the benefits we’ve come to expect from DevOps and CI/CD: faster deployment, higher quality, increased efficiency, and greater scalability and flexibility. All of these provide directly measurable benefits.