The edge of the network continues to be the epicentre of innovation in the data centre space as the calendar turns to 2019, with activity focusing on increased intelligence designed to simplify operations, enable remote management and service, and bridge a widening skills gap. This increasing sophistication of the edge is among the data centre trends to watch in 2019 as identified by Vertiv experts from around the globe.
“Today’s edge plays a critical role in data centre and network operation and in the delivery of important consumer services. This is a dramatic and fundamental change to the way we think about computing and data management. It should come as no surprise that activity in the data centre space in 2019 will be focused squarely on innovation at the edge,” said Vertiv CEO Rob Johnson.
Simplifying the edge
A smarter, simpler, more self-sufficient edge of the network is converging with broader industry and consumer trends, including the Internet of Things (IoT) and the looming rollout of 5G networks, to drive powerful, low-latency computing closer to the end-user. For many businesses, the edge has become the most mission critical part of their digital ecosystem. Intelligent infrastructure systems with machine learning capabilities working in tandem with cloud-based analytics are fundamentally changing the way we think about edge computing and edge services. The result will be a more robust, efficient edge of the network with enhanced visibility and self-healing capabilities requiring limited active management.
Sharing views on the edge trend, Sunil Khanna, President and Managing Director, Vertiv, India, said, “Most industries in India are recognising the limitations of supporting users and emerging technologies through centralised IT infrastructures and are pushing storage and computing closer to users and devices. That shift is becoming necessary because of the increased connectivity of devices and people and the huge volumes of data they generate and consume. We believe this will require profound changes in the compute and storage infrastructure to support the smart and connected future, particularly at the local level.”
Bridging the skills gap
A workforce aging into retirement and training programs lagging behind the data centre and edge evolution are creating staffing challenges for data centres around the globe. This will trigger parallel actions in 2019. First, organisations will begin to change the way they hire data centre personnel, moving away from traditional training programs toward more agile, job-specific instruction with an eye toward the edge. More training will happen in-house. Second, businesses will turn to intelligent systems and machine learning to simplify operations, preserve institutional knowledge, and enable more predictive and efficient service and maintenance.
Smarter, more efficient UPS systems
New battery alternatives will present opportunities for the broad adoption of UPS systems capable of more elegant interactions with the grid. In the short term, this will manifest in load management and peak shaving features. Eventually, we will see organisations using some of the stored energy in their UPS systems to help the utility operate the electric grid. That stored energy, sitting idle most of the time, has long been seen as a revenue generator waiting to happen.
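The peak-shaving idea described above can be sketched in a few lines: when site demand exceeds a contracted peak threshold, the UPS battery supplies the excess so the draw from the grid stays capped. This is a minimal illustration of the concept only; the function name, signature, and numbers are hypothetical and do not represent any Vertiv product or API.

```python
def shave_peak(site_load_kw, peak_threshold_kw, battery_available_kw):
    """Return (grid_draw_kw, battery_discharge_kw) for one interval.

    Hypothetical sketch: the battery discharges only the portion of
    demand above the contracted peak, up to its available power.
    """
    excess = max(0.0, site_load_kw - peak_threshold_kw)
    discharge = min(excess, battery_available_kw)
    return site_load_kw - discharge, discharge

# Example: a 120 kW site load against a 100 kW contracted peak,
# with 30 kW of battery discharge headroom available.
grid, batt = shave_peak(120.0, 100.0, 30.0)
# grid == 100.0 (capped at the threshold), batt == 20.0
```

In practice the decision also weighs battery state of charge, demand-charge tariffs, and forecasted load, but the core logic is this cap-and-discharge rule applied interval by interval.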
Standardisation and normalisation
The data centre, even in the age of modular and prefabricated design, remains far too complex to expect full-fledged standardisation of equipment. However, there is interest on two fronts: standardisation of equipment components and normalisation across data centre builds. The latter is manifesting in the use of consistent architectures and equipment types, with regional differences, to keep systems simple and costs down. In both cases, the goal is to reduce equipment costs, shorten delivery and deployment timelines, and simplify service and maintenance.
High-power processors and advanced cooling
As processor utilisation rates increase to run advanced applications such as facial recognition or advanced data analytics, high-power processors create a need for innovative approaches to thermal management. Direct liquid cooling at the chip – meaning the processor or other components are partially or fully immersed in a liquid for heat dissipation – is becoming a viable solution. Although most commonly used in high-performance computing configurations, the benefits – including better server performance, improved efficiency at high densities, and reduced cooling costs – justify additional consideration. Another area of innovation in thermal management is extreme water-free cooling, which is an increasingly popular alternative to traditional chilled-water systems.