Moore’s Law, sustainability and data centres

Issue 7, 2022

Natalya Makarochkina.

An important principle in the development of IT over the decades has been Moore’s Law. Simply put, it predicted that transistor density in processors would double every two years as development progressed. Despite many predictions of its demise, it has more or less remained a guiding principle. However, what is perhaps less well known is a similarly persistent trend in the data centre space.

Despite a sixfold increase in the data being processed since 2010, data centre energy consumption increased by only 6% between 2010 and 2018 (www.securitysa.com/*se2). How has that been possible, and how does it inform sustainability developments into the future?
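A quick back-of-the-envelope sketch, using only the two figures quoted above, shows what they imply for energy intensity (energy used per unit of data processed):

```python
# Back-of-the-envelope sketch using only the two figures quoted above:
# data processed grew sixfold from 2010 to 2018, while energy use grew by 6%.
data_growth = 6.0        # 6x more data processed
energy_growth = 1.06     # 6% more energy consumed

energy_per_unit_of_data = energy_growth / data_growth
print(f"Energy per unit of data in 2018 vs 2010: {energy_per_unit_of_data:.0%}")
# ~18% of the 2010 level, i.e. roughly an 82% drop in energy intensity
```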

Where does the data come from?

To contextualise this development, we must first understand where the data processing increase has come from. The Apple iPad debuted in 2010, a year that also saw the introduction of Instagram and Microsoft’s Azure cloud service. 2011 introduced us to Minecraft, Snapchat and Uber; 2013 brought the Xbox One and PlayStation 4, with Amazon’s Alexa following in 2014. 2017 brought Fortnite and TikTok.

Social media engagement over the period increased many times over, while global data production grew from an estimated 2 zettabytes in 2010 to 41 zettabytes in 2019. IDC estimates the global data load will rise to a staggering 175 zettabytes by 2025.
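For a sense of the pace those estimates imply, a short sketch of the compound annual growth rates, using only the 2, 41 and 175 zettabyte figures above:

```python
# Compound annual growth rates implied by the zettabyte estimates quoted above.

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

print(f"2010-2019: {cagr(2, 41, 9):.0%} per year")    # roughly 40% annual growth
print(f"2019-2025: {cagr(41, 175, 6):.0%} per year")  # roughly 27% annual growth
```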

The pandemic’s effect has been substantial, with the MENA region seeing a big increase in messaging and social media usage: social media users in MEA and Latin America spend the most time on social networks, averaging over 3.5 hours a day.

More than half of users in MEA (57%) reported (in May 2020) spending even more time on social media as a result of the pandemic. Similarly, in a separate study, 71% of Middle East respondents reported that their use of WhatsApp and other messaging apps had increased since the outbreak of the pandemic.

What impact does all that data have?

To understand the impact of this data explosion, a concept has been developed called data gravity (www.securitysa.com/*se3). Coined by engineer Dave McCrory, the term refers to the tendency of an accumulation of data to attract applications and services toward it, precipitating further accumulation, which can lead to immobilisation of the data as well as underutilisation. Data that grows too big, too fast can become immobile, reducing its value and increasing its opacity. Only low-latency, high-bandwidth services, combined with new data architectures, can combat this growing and largely undocumented phenomenon.
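To make the idea concrete, the sketch below defines a simplified, gravity-style score. It is a hypothetical illustration only, not McCrory’s published formulation; the function name, parameters and numbers are assumptions made for the example.

```python
# Illustrative only: a simplified "data gravity" style score, loosely modelled on
# Newtonian gravity. This is NOT Dave McCrory's published formulation; it is a
# hypothetical sketch of why large datasets pull applications and services toward them.

def gravity_score(data_mass_tb, app_mass, network_latency_ms, bandwidth_gbps):
    """Bigger datasets and app estates mean a stronger pull; high latency
    (distance, in effect) weakens it, while bandwidth strengthens it."""
    attraction = (data_mass_tb * app_mass) / (network_latency_ms ** 2)
    return attraction * bandwidth_gbps

# The same petabyte-scale dataset reached over a slow WAN vs. accessed locally.
print(gravity_score(1_000, 50, 40, 1))    # remote: weak pull, data risks becoming immobile
print(gravity_score(1_000, 50, 1, 100))   # local, low-latency: strong pull, services co-locate
```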

What tech developments have made this possible?

Multiple technological developments account for this data explosion being handled with only minimal increases in energy consumption: improvements in processor design and manufacture, advances in power supply units and storage, and the migration of workloads from on-premises infrastructure to the cloud.

Schneider Electric has been committed to sustainable business for decades. That has meant a renewed focus on efficiency in all aspects of design and operation. Gains have been made in the efficiency of power and cooling, with UPS systems and modular power supplies showing significant improvements with each generation, culminating in the likes of the current Galaxy VL line. This line’s use of lithium-ion batteries has not only increased efficiency; it has also extended operational life, reduced environmental impact by cutting raw-material usage, and enabled ‘energised swapping’, where power modules can be added or replaced with zero downtime and greater protection for operators and service personnel.

Advances in cooling, such as flow control through rack, row and pod containment systems, liquid cooling, and intelligent software control, ensure that the gains in pure data processing are met and matched.

By ensuring that every link in the chain of power from energy grid to rack is as efficient, intelligent and instrumented as possible, we provide the right basis for the rapid development in computing, networking and storage.

Where do software and apps fit in?

Another key element of the technological development that has enabled such relentless efficiency has been the application of better instrumentation, data gathering and analysis, allowing better control and orchestration. Google’s DeepMind AI illustrated this in 2016, when the energy used for cooling at one of its data centres was reduced by some 40%, representing a 15% reduction in overall power usage. This was accomplished using historical data from data centre sensors, such as temperatures, power, pump speeds and setpoints, to improve energy efficiency. The AI system predicted the future temperature and pressure of the data centre over the coming hour and made recommendations to control consumption appropriately.
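A minimal sketch of that general pattern, predicting near-future conditions from historical telemetry and recommending an action, is shown below. It is illustrative only, not Google’s system; the sensor names, the simple linear model and the threshold are all assumptions made for the example.

```python
# A minimal, illustrative sketch of predictive cooling control: learn from historical
# telemetry, predict the next hour's cold-aisle temperature, and recommend an action.
# NOT Google's DeepMind system; sensor names, model and thresholds are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical historical telemetry: IT load (kW), pump speed (%), outside air temp (C)
n = 1_000
it_load = rng.uniform(200, 400, n)
pump_speed = rng.uniform(40, 90, n)
outside_temp = rng.uniform(5, 35, n)

# Synthetic "next hour" cold-aisle temperature driven by those inputs plus noise
next_hour_temp = (18 + 0.02 * it_load - 0.05 * pump_speed
                  + 0.1 * outside_temp + rng.normal(0, 0.3, n))

features = np.column_stack([it_load, pump_speed, outside_temp])
model = LinearRegression().fit(features, next_hour_temp)

# Current readings: predict an hour ahead and check against a hypothetical upper limit
current = np.array([[380, 55, 30]])
predicted = model.predict(current)[0]
print(f"Predicted cold-aisle temperature in one hour: {predicted:.1f} C")
if predicted > 24:  # hypothetical upper bound for the cold aisle
    print("Recommendation: increase pump speed / chiller output before the limit is reached")
```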

The development of data centre infrastructure management (DCIM) systems has continued apace too, allowing the integration of AI to take advantage of all of these hardware and infrastructure developments. What began as experiments are now product features, allowing unprecedented visibility and control. For those designing new developments, software such as ETAP allows power efficiency to be built into the design from the outset, while also accommodating microgrid architectures.

What new data sources will contribute to this?

The data explosion is expected to continue with developments such as industrial IoT and 5G, driven further by increasing automation and autonomous vehicles. The data generated far from centralised data infrastructure must be handled, processed and turned into intelligence quickly, where it is needed.

New data architectures are expected to improve efficiency in how all of that is handled. Edge computing is seen as an important approach to manage more data being generated at the edge.

In one example, genomic research generates terabytes of data, often daily. Sending all that data to a centralised data centre would be slow, demand high bandwidth and be inefficient. The Wellcome Sanger Institute created an edge computing approach (www.securitysa.com/*se4) that allows it to process data close to where it is produced – the genomic sequencers – with only what is necessary sent to the centre. This saves on storage and bandwidth, and shortens the time from data to intelligence. “That is where the edge paradigm has come to us,” said Simon Binley, data centre manager, Sanger Institute.
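The pattern the Sanger example describes, processing locally and forwarding only a compact summary, can be sketched as follows. This is a hypothetical illustration, not the institute’s actual pipeline; the quality cutoff, field names and payload are assumptions.

```python
# A minimal sketch of the edge pattern described above: process raw output close to
# where it is produced and forward only a compact summary to the central data centre.
# Hypothetical illustration only; field names and the quality cutoff are assumptions.
import json

def process_at_edge(raw_reads):
    """Run the heavy, bandwidth-hungry step locally; keep only what is needed centrally."""
    passing = [r for r in raw_reads if r["quality"] >= 30]   # hypothetical quality cutoff
    return {
        "total_reads": len(raw_reads),
        "passing_reads": len(passing),
        "mean_length": sum(r["length"] for r in passing) / max(len(passing), 1),
    }  # kilobytes of summary, instead of terabytes of raw reads

raw_reads = [{"quality": 35, "length": 150}, {"quality": 12, "length": 148},
             {"quality": 40, "length": 151}]
payload = json.dumps(process_at_edge(raw_reads))
print(payload)  # only this small payload crosses the WAN to the central site
```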

Modular data centres, micro data centres and better storage management will all contribute to handling this developing wave efficiently, keeping that data centre energy consumption line flat into the future. In the MENA region, 5G and edge architectures will be balanced by more hyperscale facilities linking major demand centres.

What effects will this have on the whole data ecosystem?

Efficiency must extend not just through the supply chain, but throughout entire lifecycles. Vendors, suppliers and partners must all be engaged to ensure that no part of the ecosystem lags in applying the tools that ensure efficiency. This applies as much to the design of new equipment and applications as it does to working life and decommissioning. Understanding how an entire business ecosystem impacts the environment will be vital to truly achieving net-zero goals.

Agreed standards (www.globalreporting.org/standards/), transparency and measurability are all vital factors to ensure results.

These considerations are taking hold across the region and great efforts are being made to do better. Greater transparency is now accepted and embraced, with more and more organisations reporting their progress.

Tools and processes shared

The data centre sector has much to offer organisations and industries on the journey towards sustainability and greater circularity. With its expertise and experience in efficiency, the tools and intelligence drawn from operations, and deep commitments to tight net-zero targets, the sector can not only handle the data explosion and digital demands of the world, but do so sustainably, while providing others with the tools and insights to do the same in their respective sectors.
