Edge computing as a solution for new demands on data centers

The emergence of data-intensive technologies, such as virtual and augmented reality, autonomous vehicles and generative AI, has created many innovations and opportunities. However, this has also led to increased strain on existing data center capacity.

As a result, IT infrastructure has shifted to a hybrid model that requires sophisticated management.

With the rise of artificial intelligence, however, data processing is no longer limited to core data centers and centralized clouds, says Pierluca Chiodelli, vice president of engineering for edge computing, strategy and implementation at Dell Technologies.

Instead, it occurs closer to the data source, at the edge of the network, enabling real-time decision making and reducing the need to transmit large amounts of data to central locations.

As a result, organizations must adopt a sophisticated and progressive approach to managing workloads and data efficiently, securely and intelligently across their IT landscape, explains Chiodelli.

This is the only way they can realize the full potential of data-intensive technologies while addressing the unique challenges that arise from integrating AI at the edge.

In its new study, How Edge Computing Is Enabling the Future, Schneider Electric surveyed more than 1,000 IT leaders and found that 49% of participants cited managing hybrid IT infrastructure as their biggest IT challenge. They expect edge computing to improve several key factors, such as speed, data security and reliability.

Growing data volumes have also led to increased data processing, putting greater pressure on carbon emissions and operational sustainability, the survey found.

Decision makers believe that edge computing can help promote sustainability and achieve their companies’ ESG (environmental, social, corporate governance) goals.

As the amount of data in organizations continues to increase and IT infrastructure becomes more complex, it is critical for organizations to figure out how to track and measure energy consumption across interfaces, says Carsten Baumann, head of strategic initiatives and solutions architect at Schneider Electric.

Edge computing: Low latency + higher reliability = faster response times

Edge computing allows data to be processed close to the source of the information, meaning faster service and greater reliability. This leads to better response times when companies use applications or programs, says Adonay Cervantes, global field CTO at CloudBlue, a multi-layered e-commerce platform.

And because these applications operate at the edge of the network, they perform better with low latency, he says.

Lee Ziliak, director of technology and managing director of architecture at IT solutions provider SHI International, agrees with this assessment.

“Leveraging data at the edge also allows an organization to perform analysis and predictions on time-series data, improve monitoring capabilities, increase performance, and derive greater value by generating new data points,” he explains.

This saves time and money because only the important data is collected and stored.
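As a rough illustration of that kind of edge-side time-series analysis (the smoothing factor, threshold and sample readings below are assumptions made for the sketch, not details from the article), a lightweight predictor can run directly on the edge device and keep only the readings that deviate from what it expects:

```python
# Minimal sketch of edge-side time-series monitoring (hypothetical values).
# An exponentially weighted moving average (EWMA) serves as a one-step-ahead
# prediction; readings that deviate strongly from it are treated as new,
# "important" data points worth keeping.

ALPHA = 0.2            # EWMA smoothing factor (assumed)
DEVIATION_LIMIT = 5.0  # allowed gap between prediction and reading (assumed)

def monitor(readings):
    """Yield (timestamp, value) pairs that deviate from the local prediction."""
    ewma = None
    for ts, value in readings:
        if ewma is None:
            ewma = value                             # initialize the prediction
        elif abs(value - ewma) > DEVIATION_LIMIT:
            yield ts, value                          # anomalous point: keep it
        ewma = ALPHA * value + (1 - ALPHA) * ewma    # update the prediction

if __name__ == "__main__":
    sample = [(0, 20.1), (1, 20.3), (2, 20.2), (3, 31.7), (4, 20.4)]
    print(list(monitor(sample)))   # -> [(3, 31.7)]
```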

Regardless of the workload, companies are adopting edge computing because some product features cannot be used in the cloud due to practical or regulatory limitations, says David Kinney, principal architect at IT services provider SPR Inc.

He adds that the most common practical limitations driving the adoption of edge computing are that communication between the edge and the cloud introduces too much latency, or that the communication medium is slow or unreliable.

“Latency is of central importance for many systems that control machinery, such as the collision avoidance systems in new cars,” says Kinney. In many of these systems, a delay of just a fraction of a second can have catastrophic consequences, so critical calculations must be performed at the edge.

Regarding regulatory restrictions, he says these most often apply to medical devices. A medical device on which a patient depends for life or health, such as an insulin pump, must continue to function even if it cannot communicate with the cloud.

Addressing the challenges of data-intensive technology

Edge computing also helps reduce costs associated with transmitting and storing data, according to Saurabh Mishra, global director of IoT product management at SAS, an analytics software provider.

“Huge amounts of data are being generated at the edge of the network, and much of it is sensor-based,” he says. “This data may be redundant, and its value is short-lived.”

“Instead of transferring this data to the cloud, storing it there and incurring the associated costs, companies are better off using edge computing to process it locally on site and send only the important events back to the cloud.”
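A minimal sketch of the pattern Mishra describes might look like the following; the temperature threshold and ingest endpoint are placeholders invented for illustration, not details from the article:

```python
# Rough sketch of edge-side filtering: process sensor data locally and
# forward only the events that matter to the cloud. Threshold and endpoint
# are illustrative assumptions.
import json
import urllib.request

TEMP_LIMIT_C = 85.0                            # assumed alert threshold
CLOUD_ENDPOINT = "https://example.com/ingest"  # placeholder URL

def handle_reading(sensor_id: str, temperature_c: float) -> None:
    """Discard routine readings; push only threshold-crossing events upstream."""
    if temperature_c <= TEMP_LIMIT_C:
        return  # redundant, short-lived data stays (and expires) at the edge
    event = json.dumps({"sensor": sensor_id, "temp_c": temperature_c}).encode()
    req = urllib.request.Request(CLOUD_ENDPOINT, data=event,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)  # transmit only the important event

# Example: only the second reading would be sent to the cloud.
# handle_reading("press-01", 72.4)
# handle_reading("press-01", 91.3)
```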

More and more companies are combining edge computing and central data center processing in a hybrid model to address the challenges of data-intensive technologies such as augmented reality, virtual reality, autonomous vehicles and advanced AI applications. These are data-intensive applications that require complex, real-time data analysis to operate successfully, says Bob Brauer, founder and CEO of Interzoid, a data usage consulting firm.

He adds that a cloud-only or fully centralized approach would introduce significant latency into these data-intensive technologies, making them less effective, less reliable and potentially even unsafe, particularly in the case of self-driving vehicles or healthcare applications.

A hybrid solution, however, allows sophisticated data processing, such as the creation of AI models, to be carried out on powerful in-house systems, where the infrastructure is usually cheaper and more scalable than shared cloud environments, says Brauer.

“Once AI models are complete, comprehensive and well tested, they can be rolled out to lighter data nodes at the edge of the network for use and availability geographically closer to the systems, devices and vehicles that rely on them,” he explains.

This allows companies to make instant decisions without relying on communication with central servers physically located anywhere in the world. According to Brauer, this approach drastically reduces the risk of latency without compromising the quality of the core AI models.
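One way to picture the workflow Brauer outlines is sketched below, under the assumption of a Python stack with scikit-learn (the library, file path and toy data are illustrative choices, not tools named in the article): the model is trained and packaged on central infrastructure, then the artifact is shipped to an edge node for local, low-latency inference.

```python
# Sketch of the hybrid pattern: train centrally, then run inference at the edge.
# Library choice (scikit-learn), path and data are illustrative assumptions.
from joblib import dump, load
from sklearn.ensemble import RandomForestClassifier

# --- central / in-house training system ---------------------------------
def train_and_export(features, labels, artifact_path="model.joblib"):
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(features, labels)             # heavy lifting happens centrally
    dump(model, artifact_path)              # package the tested model
    return artifact_path

# --- lightweight edge node -----------------------------------------------
def edge_predict(artifact_path, observation):
    model = load(artifact_path)             # model artifact shipped to the edge
    return model.predict([observation])[0]  # local decision, no cloud round trip

if __name__ == "__main__":
    X = [[0, 0], [0, 1], [1, 0], [1, 1]]    # toy training data
    y = [0, 0, 1, 1]
    path = train_and_export(X, y)
    print(edge_predict(path, [1, 0]))       # -> 1
```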

Damien Boudaliez, vice chairman and global head of data solutions development at FactSet, a financial data and software company, describes how edge computing is helping his company become more efficient.

“The goal of FactSet’s Ticker Plant Cloud project was to minimize latency in the real-time distribution of financial data,” he says. “Using edge computing allows us to place data closer to global customers, optimizing performance, particularly in regions such as Asia where distance to market is challenging.”

In addition, edge computing complements FactSet’s hybrid cloud model by providing choices.

“We can leverage on-premises resources for large-scale, predictable data processing tasks and the cloud for more dynamic, location-dependent needs,” says Boudaliez. “This strategy improves performance for both our external clients and our internal teams. By placing computing resources closer to customers and our global offices, we minimize latency and maximize efficiency.”

Conclusion

As edge computing becomes more widespread across industries, so does the complexity and demands of managing edge operations, Dell’s Chiodelli said.

“The edge environment is inherently decentralized, presenting organizations with the dual challenge of capturing and securing data at the source while contending with limited IT expertise,” he says.

This complexity also extends to the management and security of the various edge deployments across many devices and locations, according to Chiodelli. Enterprises need a streamlined approach to monitoring and securing their widespread ecosystems of edge devices and applications.

While models that employ edge servers offer flexibility and control, this approach is not without key considerations, particularly managing the technology at the edge, says Kelly Malone, senior managing director at Taqtile, an augmented reality software company.

“Devices and servers at the edge need to be updated, synchronized and managed, which can be complicated because, by definition of the edge approach, these devices are not in a central location,” says Malone.

“And as companies continue to dive into metaverse technologies that allow them to collaborate on new levels and give their employees greater efficiency than ever before, they will need to deploy more edge technology to provide the computing power required for low latency and better performance,” says Michael McNerney, vice president of network security at the technology company Supermicro.

“Not only do decisions made at the edge benefit from lower latency, they also consume less bandwidth, allowing enterprises to serve more devices with the same bandwidth,” he notes.
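As a back-of-the-envelope illustration of that bandwidth point (the device count, message size, rate and event fraction are assumed figures, not from the article):

```python
# Back-of-the-envelope bandwidth comparison; all figures are assumptions.
DEVICES = 1_000
READINGS_PER_SEC = 10     # per device
BYTES_PER_READING = 200
EVENT_FRACTION = 0.01     # share of readings worth forwarding after edge filtering

raw_bps = DEVICES * READINGS_PER_SEC * BYTES_PER_READING * 8  # stream everything
edge_bps = raw_bps * EVENT_FRACTION                           # forward events only

print(f"stream everything: {raw_bps / 1e6:.1f} Mbit/s")   # 16.0 Mbit/s
print(f"edge-filtered:     {edge_bps / 1e6:.2f} Mbit/s")  # 0.16 Mbit/s
```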

Without edge technology, devices operating at the network edge would suffer from latency issues, create bottlenecks in enterprise networks and present other processing-related challenges, says Sharad Varshney, CEO of OvalEdge, a data governance consulting firm.

“However, it is important to remember that edge computing is a framework that requires internal cultural changes if it is to work in your organization,” he adds.

“Additionally, edge computing is one of many solutions you should consider if you want to streamline data usage in your organization.”
