Edge Computing-Managing Workloads and Costs in Complex Data Center Architectures


Edge computing is a method of optimizing cloud computing systems "by taking control of computing applications, data, and services away from some central nodes (the 'core') to the other logical extreme (the 'edge') of the Internet", where computing makes contact with the physical world. In this architecture, data comes in from the physical world via various sensors, and actions are taken to change physical state via various forms of output and actuators; by performing analytics and knowledge generation at the edge, the communications bandwidth required between the systems under control and the central data center is reduced. Edge computing takes advantage of proximity to the physical items of interest and also exploits the relationships those items may have to each other.

This approach requires leveraging resources that may not be continuously connected to a network, such as autonomous vehicles, implanted medical devices, fields of highly distributed sensors, and mobile devices. Edge computing covers a wide range of technologies, including wireless sensor networks, mobile data acquisition, mobile signature analysis, cooperative distributed peer-to-peer ad hoc networking and processing (also classifiable as local cloud/fog computing and grid/mesh computing), dew computing, mobile edge computing, cloudlets, distributed data storage and retrieval, autonomic self-healing networks, remote cloud services, augmented reality, the Internet of Things (IoT) and more. Edge computing can involve edge nodes directly attached to physical inputs and outputs, or edge clouds that may have such contact but at least exist outside centralized clouds, closer to the edge.

What is edge computing and how it’s changing the network

Edge computing is a way to streamline the flow of traffic from IoT devices and provide real-time local data analysis

Edge computing allows data produced by internet of things (IoT) devices to be processed closer to where it is created instead of sending it across long routes to data centers or clouds.

Doing this computing closer to the edge of the network lets organizations analyze important data in near real-time – a need of organizations across many industries, including manufacturing, health care, telecommunications and finance.

“In most scenarios, the presumption that everything will be in the cloud with a strong and stable fat pipe between the cloud and the edge device – that’s just not realistic,” says Helder Antunes, senior director of corporate strategic innovation at Cisco.

What exactly is edge computing?

Edge computing is a “mesh network of micro data centers that process or store critical data locally and push all received data to a central data center or cloud storage repository, in a footprint of less than 100 square feet,” according to research firm IDC.

Edge computing typically comes up in IoT use cases, where edge devices collect data – sometimes massive amounts of it – and would otherwise send it all to a data center or cloud for processing. Edge computing triages the data at its source, so that some of it is processed locally, reducing the backhaul traffic to the central repository.

Typically, this is done by the IoT devices transferring the data to a local device that includes compute, storage and network connectivity in a small form factor. Data is processed at the edge, and all or a portion of it is sent to the central processing or storage repository in a corporate data center, co-location facility or IaaS cloud.
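As a rough sketch of that triage step, assuming Python on the edge device, a made-up anomaly threshold, and a placeholder send_to_cloud() uplink, the local processing might look like this:

```python
import json
import statistics
from datetime import datetime, timezone

ANOMALY_THRESHOLD = 85.0  # hypothetical temperature limit, degrees C


def send_to_cloud(payload: dict) -> None:
    """Placeholder for the uplink to the central data center or IaaS cloud."""
    print("uplink:", json.dumps(payload))


def triage(readings: list[float]) -> None:
    """Process a batch of sensor readings at the edge.

    Anomalous readings are forwarded in full; normal readings are reduced
    to a small summary so backhaul traffic stays low.
    """
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    summary = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,
    }
    send_to_cloud(summary)


if __name__ == "__main__":
    triage([71.2, 70.8, 88.1, 69.9, 72.4])
```

Only the summary and the flagged readings ever leave the site; the bulk of the raw data stays on the local device.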

Edge vs. Fog computing

As the edge computing market takes shape, there’s an important term related to edge that is catching on: fog computing.

Fog refers to the network connections between edge devices and the cloud. Edge, on the other hand, refers more specifically to the computational processes being done close to the edge devices. So, fog includes edge computing, but fog would also incorporate the network needed to get processed data to its final destination.

The OpenFog Consortium, an organization backed by Cisco, Intel, Microsoft, Dell EMC and academic institutions such as Princeton and Purdue universities, is developing reference architectures for fog and edge computing deployments.

Some have predicted that edge computing could displace the cloud. But Mung Chiang, dean of Purdue University’s School of Engineering and co-chair of the OpenFog Consortium, believes that no single computing domain will dominate; rather, there will be a continuum. Edge and fog computing are useful when real-time analysis of field data is required.

Edge computing terms and definitions

Like most technology areas, edge computing has its own lexicon. Here are brief definitions of some of the more commonly used terms:

  • Edge devices: Any device that produces or collects data, such as sensors, industrial machines or other connected equipment.
  • Edge: What the edge is depends on the use case. In a telecommunications field, perhaps the edge is a cell phone or maybe it’s a cell tower. In an automotive scenario, the edge of the network could be a car. In manufacturing, it could be a machine on a shop floor; in enterprise IT, the edge could be a laptop.
  • Edge gateway: A gateway is the buffer between where edge computing processing is done and the broader fog network. The gateway is the window into the larger environment beyond the edge of the network (a minimal sketch of this buffering role follows this list).
  • Fat client: Software that can do some data processing in edge devices. This is opposed to a thin client, which would merely transfer data.
  • Edge computing equipment: Edge computing uses a range of existing and new equipment. Many devices, sensors and machines can be outfitted to work in an edge computing environment simply by making them Internet-accessible. Cisco and other hardware vendors offer ruggedized network equipment with hardened exteriors meant for field environments. A range of compute servers, converged systems and even storage-based hardware systems such as Amazon Web Services’ Snowball can be used in edge computing deployments.
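To make the gateway's buffering role concrete, here is a minimal sketch; the EdgeGateway class, batch size, and print-based uplink are illustrative assumptions, not any vendor's API:

```python
from collections import deque


class EdgeGateway:
    """Minimal illustration of a gateway buffering edge data before uplink."""

    def __init__(self, batch_size: int = 10):
        self.batch_size = batch_size
        self.buffer = deque()

    def ingest(self, record: dict) -> None:
        """Accept a processed record from an edge device."""
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        """Forward the buffered batch to the wider fog/cloud network."""
        batch = [self.buffer.popleft() for _ in range(len(self.buffer))]
        print(f"forwarding batch of {len(batch)} records upstream")


gateway = EdgeGateway(batch_size=3)
for i in range(7):
    gateway.ingest({"device": f"sensor-{i % 2}", "value": i * 1.5})
gateway.flush()  # push any records still waiting in the buffer
```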

Mobile edge computing:

This refers to the buildout of edge computing systems in telecommunications systems, particularly 5G scenarios.

Edge computing has also been described as “pushing the frontier of computing applications, data, and services away from centralized nodes to the logical extremes of a network. It enables analytics and data gathering to occur at the source of the data. This approach requires leveraging resources that may not be continuously connected to a network such as laptops, smartphones, tablets and sensors.”

To date, the role of edge computing has mostly been to ingest, store, filter, and send data to cloud systems. We are at a point in time, however, where these computing systems are packing more compute, storage, and analytic power to consume and act on the data at the machine location. This capability will be more than valuable to industrial organizations; it will be indispensable.

Edge computing vs. cloud computing

For industrial companies to fully realize the value of the massive amounts of data being generated by machines, edge computing and cloud computing must work together.

When you consider these two technologies, think about the way you use your two hands. You use one or both depending on the action required. Apply that to an IIoT example, where one hand is edge and the other hand is cloud, and you can quickly see how in certain workloads your “edge hand” will play a more prominent role, while in other situations your “cloud hand” will take the lead. And there will be times when both hands are needed in equal measure.

Scenarios in which edge will dominate include those that need low latency (speed is of the essence) or face bandwidth constraints (locations such as a mine or an offshore oil platform, where it is neither practical nor affordable, and in some cases simply impossible, to send all data from machines to the cloud). Edge will also be important when Internet or cellular connections are spotty. Cloud computing will take a more dominant position when actions require significant computing power: managing data volumes from across plants, asset health monitoring, machine learning, and so on.
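One way to picture that division of labor is a simple placement rule. The thresholds and the place_workload() policy below are invented for illustration; a real deployment would tune them against its own latency and bandwidth constraints:

```python
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    max_latency_ms: float   # how quickly a result is needed
    data_volume_mb: float   # how much data would have to move
    link_up: bool           # is the cloud connection currently available?


def place_workload(w: Workload, uplink_mbps: float = 2.0) -> str:
    """Decide where a workload should run: at the edge or in the cloud.

    Hypothetical policy: anything latency-critical, bandwidth-heavy, or
    cut off from the network stays at the edge; the rest goes to the cloud.
    """
    transfer_ms = (w.data_volume_mb * 8 / uplink_mbps) * 1000
    if not w.link_up or w.max_latency_ms < 50 or transfer_ms > w.max_latency_ms:
        return "edge"
    return "cloud"


print(place_workload(Workload("brake-decision", 10, 0.01, True)))          # edge
print(place_workload(Workload("fleet-analytics", 3_600_000, 500, True)))   # cloud
```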

The bottom line is this: industrial operations need both cloud and edge to extract the most value from today’s sophisticated, varied, and voluminous data, processing it across cloud and edge wherever it makes the most sense to achieve the desired outcomes.

Edge computing examples

Here are a couple of examples to help bring this computing concept to life.

Autonomous vehicles

With autonomous automobiles, essentially data centers on wheels, this type of computing plays a dominant role. GE Digital partner Intel estimates that autonomous cars, with hundreds of on-vehicle sensors, will generate 40TB of data for every eight hours of driving. That is a lot of data. It is unsafe, unnecessary, and impractical to send all of that data to the cloud.

It’s unsafe because the sensing, thinking, and acting attributes of edge computing in this use case must be done in real-time with ultra-low latency to ensure safe operation for passengers and the public. An autonomous car sending data to the cloud for analysis and decision-making as it traverses city streets and highways would prove catastrophic. For example, consider a child chasing a ball into the street in front of an oncoming autonomous car. In this scenario, low latency is required for decision and subsequent actuation (the car needs to brake NOW!).

It’s unnecessary to send all that data to the cloud because this particular set of data has only short-term value (a particular ball, a particular child on a collision course with a particular car). Speed of actuation on that data is paramount. And it is simply impractical (not to mention cost-prohibitive) to transport the vast volumes of data generated by machines to the cloud.
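A toy sketch of that split, with entirely hypothetical sensor values and function names: the braking decision happens on the vehicle in the same loop that reads the sensor, and only a compact event summary is queued for later upload to the cloud.

```python
import time
from queue import Queue

cloud_upload_queue: Queue = Queue()  # drained asynchronously when bandwidth allows

BRAKE_DISTANCE_M = 15.0  # hypothetical safety threshold


def apply_brakes() -> None:
    print("braking NOW")


def on_lidar_frame(obstacle_distance_m: float, speed_kmh: float) -> None:
    """Runs on the vehicle: decide locally, summarize for the cloud later."""
    if obstacle_distance_m < BRAKE_DISTANCE_M:
        apply_brakes()  # immediate action, no round trip to the cloud
        cloud_upload_queue.put({
            "event": "emergency_brake",
            "distance_m": obstacle_distance_m,
            "speed_kmh": speed_kmh,
            "ts": time.time(),
        })


on_lidar_frame(obstacle_distance_m=8.2, speed_kmh=42.0)
print(cloud_upload_queue.get())  # the only data that ever leaves the vehicle
```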

However, the cloud is still an important part of the IIoT equation. The simple fact that the car had to respond to such an immediate and specific event might be valuable data when aggregated into a digital twin and compared with the performance of other cars of its class.

Fleet management

In a scenario where a company has a fleet (think of a trucking company, for example), the main goal could be to ingest, aggregate, and send data from multiple operational data points (wheels, brakes, battery, electrical) to the cloud. The cloud performs analytics to monitor the health of key operational components. A fleet manager uses a fleet management solution to proactively service the vehicles, maximizing uptime and lowering cost. The operator can track KPIs such as cost over time by part, and/or the average cost of a given truck model over time. This in turn helps maintain optimal performance at lower cost and higher safety.
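As a rough illustration of the cloud-side analytics, with fictional records and a deliberately simplified schema, the KPI roll-up might look something like this:

```python
from collections import defaultdict

# Fictional maintenance records aggregated in the cloud from many trucks
records = [
    {"truck": "T-101", "model": "X9", "part": "brakes",  "cost": 420.0},
    {"truck": "T-102", "model": "X9", "part": "battery", "cost": 310.0},
    {"truck": "T-103", "model": "Z4", "part": "brakes",  "cost": 390.0},
    {"truck": "T-101", "model": "X9", "part": "wheels",  "cost": 150.0},
]

cost_by_part: dict = defaultdict(float)
cost_by_model: dict = defaultdict(float)
for r in records:
    cost_by_part[r["part"]] += r["cost"]
    cost_by_model[r["model"]] += r["cost"]

x9_trucks = {r["truck"] for r in records if r["model"] == "X9"}
print("cost by part:", dict(cost_by_part))
print("average cost per X9 truck:", cost_by_model["X9"] / len(x9_trucks))
```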

Edge computing security

There are two sides to the edge computing security coin. Some argue that security is theoretically better in an edge computing environment because less data travels over the network and data stays closer to where it was created. The less data held in a corporate data center or cloud environment, the less data there is to be vulnerable if one of those environments is compromised.

The flip side is that some believe edge computing is inherently less secure because the edge devices themselves can be more vulnerable. In designing any edge or fog computing deployment, therefore, security must be paramount. Data encryption, access control and use of virtual private network tunneling are important elements in protecting edge computing systems.
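As a minimal sketch of the data-encryption piece, assuming the widely used cryptography package is available (key management, access control, and VPN tunneling are out of scope here), an edge node could encrypt each payload before it leaves the device:

```python
# pip install cryptography
import json
from cryptography.fernet import Fernet

# In practice the key would come from a secrets manager or hardware module;
# generating it inline here is for illustration only.
key = Fernet.generate_key()
cipher = Fernet(key)

payload = json.dumps({"sensor": "pump-7", "vibration_mm_s": 4.2}).encode()
token = cipher.encrypt(payload)    # what actually crosses the network
restored = cipher.decrypt(token)   # done at the receiving end

print(token[:24], b"...")
print(json.loads(restored))
```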

Advantages of Edge Computing

As edge computing is adopted and goes mainstream, there are many potential advantages for a wide range of industries. Edge computing, in particular, brings five potential advantages to smart manufacturing:

  • Faster response time: Data storage and computation are distributed and local. Eliminating the round trip to the cloud reduces latency and enables faster responses, helping to stop critical machine operations from breaking down and hazardous incidents from occurring.
  • Reliable operations with intermittent connectivity: Monitoring remote assets in regions with unreliable internet connectivity, such as oil wells, farm pumps, solar farms or windmills, can be difficult. Edge devices' ability to locally store and process data ensures no data loss or operational failure in the event of limited internet connectivity.
  • Security and compliance: With edge computing, a lot of data transfer between devices and the cloud can be avoided. Sensitive information can be filtered locally, with only the data needed for model building transmitted to the cloud. This allows users to build a security and compliance framework adequate for enterprise security and audits.
  • Cost-effective solutions: One of the practical concerns around IoT adoption is the upfront cost of network bandwidth, data storage, and computational power. Edge computing can perform much of the data computation locally, allowing businesses to decide which services to run at the edge and which to send to the cloud, reducing the final cost of an overall IoT solution.
  • Interoperability between legacy and modern devices: Edge devices can act as a communication liaison between legacy and modern machines, allowing legacy industrial machines to connect to modern machines or IoT solutions and providing the immediate benefit of capturing insights from both (a minimal sketch of this bridging role follows below).

With so many advantages, edge computing seems to have an "edge" over the cloud. One may wonder whether edge computing will replace the cloud, but that is unlikely. The cloud retains advantages in computation power, storage, and maintenance that edge computing does not have. Instead, edge computing complements the cloud to create an overall IoT solution.
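The interoperability point above can be sketched as a small translation layer. The read_legacy_counters() poller and the field mapping are hypothetical stand-ins for whatever protocol the legacy machine actually speaks:

```python
import json
import random
import time


def read_legacy_counters() -> dict:
    """Hypothetical stand-in for polling a legacy machine (e.g. over a serial
    or fieldbus link); returns raw register-style values."""
    return {"reg_40001": random.randint(0, 100), "reg_40002": random.randint(0, 1)}


def to_modern_payload(raw: dict) -> str:
    """Translate raw legacy registers into a JSON document a modern IoT
    platform can consume. The field mapping is invented for illustration."""
    return json.dumps({
        "machine_id": "press-12",
        "utilization_pct": raw["reg_40001"],
        "fault": bool(raw["reg_40002"]),
        "ts": time.time(),
    })


if __name__ == "__main__":
    for _ in range(3):
        print(to_modern_payload(read_legacy_counters()))
```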



Courtesy:
Jitendra Kumar Purohit, Assistant Professor
Department of Computer Science and Engineering
Faculty of Engineering and Technology
Madhav University