NEWS

Access to Proprietary Protocols and Standard Data Exchange Protocols

Aug 09, 2023

Is this the answer to transforming data center infrastructure management?

In this thought-provoking piece, we delve into pressing issues that inhibit data center operators from gleaning useful management information for critical reporting and more effective site management.


In our view, one of the key issues is access to crucial data from equipment that cannot be read without expensive proprietary tools supplied by the equipment manufacturers. Some manufacturers choose to prevent access to the data from their systems, which means it cannot be used by third-party software. Important data intended to help improve data center operations is often available, but because it cannot be read or used, the true potential of the management information at the industry’s disposal remains locked away.


“People talk about a lack of operational data, but it’s simply not true,” says Mark Acton, Business Strategy and Technology Director.


“There is a huge volume of potentially useful operational data, but the fact is we cannot get access to this vast pool of data or, in some instances, it becomes far too complicated and expensive a task to extract information from data center assets that use proprietary protocols.”


A number of dynamic factors affect the ability of operators to extract useful management information from the raw data, and it is tools like XpedITe that are paving the way to greater collaboration so that this data can finally be put to use.


So, what can we do as an industry?

As software developers, we are trying to do the right things by establishing relationships with manufacturers and third parties so that we can integrate as many proprietary protocols into our platforms as possible. To achieve the ambitious goals of many governments and the data center industry, such as improving energy and operational efficiency, our view is that without unhindered access to all the available data, it is impossible to create a single federated data lake for data center operations and management. This federated approach is the only way to achieve these goals and allow DCIM solutions to reach their unfulfilled potential.


Right now, one key driver is that we will need data from all sources, potentially including proprietary data sets, to fulfill the requirements of increasing government scrutiny of data center activities mandated by the coming Energy Efficiency Directive (EED) and Corporate Sustainability Reporting Directive (CSRD) obligations in Europe.
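
For example, one of the headline metrics the EED asks data center operators to report is Power Usage Effectiveness (PUE). The minimal sketch below, which assumes that total facility and IT energy readings for the same period can actually be collected from the relevant meters, shows how simple the calculation itself is once the underlying data is accessible; the hard part remains getting at the readings in the first place.

```python
# Minimal sketch: computing PUE for an EED-style report.
# Assumes the two energy readings (kWh over the same reporting period)
# can be pulled from facility and IT metering, which is exactly the
# data access problem discussed above. All figures are illustrative.

def compute_pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Example with made-up annual figures:
print(f"PUE: {compute_pue(4_800_000.0, 3_200_000.0):.2f}")  # -> PUE: 1.50
```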


We have identified six potential opportunities for change, opportunities that will enable progress in the accurate reporting of data center infrastructure management:


  1. Harnessing the Potential of AI: AI plays a pivotal role in extracting insights from large data sets, but in the data center there is one major hurdle: the data is either locked away within the manufacturer’s assets behind proprietary protocols, or there is no way of collecting the data at all. As a substantial portion of data center operations relies on mechanical and electrical systems, the absence of a collaborative approach to extracting relevant information from hardware impedes progress for data center infrastructure management (DCIM). To drive meaningful change, it is essential to involve a broader spectrum of stakeholders, including manufacturers, software providers, and management tool providers. Without their collaboration, we risk remaining trapped in a stagnant state, hindering advancements for years to come.

  2. Big Data - Working Towards Aligned Goals: By embracing a shared vision, such as working towards net-zero emissions, organizations can synchronize their efforts and move towards a realistic outcome. We need to establish a sense of collective responsibility and promote cross-functional cooperation. As already noted, this ambitious endeavor ultimately requires the active participation of all stakeholders, from manufacturers to MSPs, operators, software developers, and even advocates of the circular economy, collectively leveraging data to generate useful management information.

  3. Standardized Data Exchange Protocols: A standardized approach raises the question of adopting open-source models to foster collaboration and ensure everyone is working towards a common goal. A standardized approach and shared direction can significantly enhance decision-making and generate better outcomes. It also enables scenario testing and planning, allowing organizations to evaluate different outcomes within their own estates and benchmark them against available industry data (a minimal sketch of what a common reading format could look like follows this list).

  4. Tailored Dashboards for the Different Stakeholders: Every stakeholder involved in the decision-making process should have access to a customized dashboard. These bespoke dashboards can surface the relevant information, patterns, and cycles of interrogation, enabling stakeholders to make informed decisions aligned with their specific roles and responsibilities. Not all stakeholders have the same needs; without tailored dashboards, a tool risks becoming unusable for some and being dismissed as useless by others.

  5. The Role of Data Visualization: Data visualization plays a crucial role in facilitating understanding and empowering the data center collective across the different operational and IT disciplines. Without accurate data sets and clear visualization of that data, we risk creating dangerous misperceptions of reality and missing significant capacity or resource management opportunities. By embracing the concept of a digital twin, data can be presented in a manner that enables users to make faster, well-informed decisions, breaking down the barriers between intricate data sets and decision-makers. By embracing data visualization, we can gain a comprehensive understanding of the bigger picture, unlocking new possibilities for advancement.

  6. Continuous Improvement and Feedback: Regular stock-taking, measurement, review, and feedback are crucial for driving continuous improvement. Using existing global standards and platforms for feedback facilitates the ongoing refinement of data practices and ensures that organizations can adapt to changing needs. Additionally, undertaking a cost-benefit analysis (CBA) helps prioritize proposed actions and determine the most cost-effective measures (a simple worked example follows below). All of this can be supported with one federated system.
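
To make the idea in point 3 concrete, the sketch below shows one possible shape for a vendor-neutral reading format. Everything here is hypothetical: the field names, the NormalizedReading structure, and the two adapter functions are illustrative assumptions rather than an existing standard. The core idea it captures is that per-vendor adapters translate proprietary payloads into one common schema that any DCIM tool, and any federated data lake, can consume.

```python
# Hypothetical sketch of a vendor-neutral "normalized reading" schema.
# The dataclass fields and adapter functions are illustrative
# assumptions, not an existing industry standard.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class NormalizedReading:
    asset_id: str         # stable identifier for the asset
    metric: str           # e.g. "power_kw", "inlet_temp_c"
    value: float
    unit: str
    timestamp: datetime
    source_protocol: str  # provenance: "snmp", "modbus", "vendor_x", ...

def from_snmp(asset_id: str, oid_value: int) -> NormalizedReading:
    """Adapter for an SNMP power reading reported in watts."""
    return NormalizedReading(asset_id, "power_kw", oid_value / 1000.0,
                             "kW", datetime.now(timezone.utc), "snmp")

def from_vendor_x(asset_id: str, payload: dict) -> NormalizedReading:
    """Adapter for a (hypothetical) proprietary JSON payload."""
    return NormalizedReading(asset_id, "power_kw", float(payload["pwr"]),
                             "kW", datetime.now(timezone.utc), "vendor_x")

# Readings from both sources now land in one comparable format:
print(from_snmp("pdu-01", 4200))
print(from_vendor_x("pdu-02", {"pwr": 3.9}))
```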
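
Similarly, the cost-benefit analysis mentioned in point 6 can start as something as simple as ranking candidate actions by their benefit-to-cost ratio. The sketch below uses invented costs and savings purely for illustration.

```python
# Toy cost-benefit ranking for proposed efficiency actions.
# Each tuple: (action, one-off cost, estimated annual saving).
# All figures are invented for illustration only.
actions = [
    ("Raise chilled water setpoint", 5_000.0, 18_000.0),
    ("Install blanking panels",     12_000.0, 15_000.0),
    ("Replace UPS modules",         90_000.0, 40_000.0),
]

# Rank by benefit-cost ratio (annual saving / one-off cost), best first.
for name, cost, saving in sorted(actions, key=lambda a: a[2] / a[1], reverse=True):
    print(f"{name}: ratio {saving / cost:.2f}, payback {cost / saving:.1f} yrs")
```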

