Paradigm shift in the storage environment

1 May 2019 Infrastructure

Tapes, removable disks, hard disks and solid-state drives (SSDs), also known as flash storage - the storage market has seen many technological advances and continues to evolve. As artificial intelligence (AI) develops, the way we store data also promises to become smarter - and more efficient, flexible and cost-effective for businesses.

Eran Brown

Companies in South Africa and across the world that want to reliably store their ever-increasing volumes of data (now in the petabyte range) face a large - and sometimes confusing - choice. So how can AI disrupt the current storage environment?

In principle, IT managers want data stored on high-performing media to ensure rapid access to information at all times. But how sensible is it to keep all data on the most powerful media? Flash, for example, is much faster than Near-Line Serial Attached SCSI (SAS) drives, but it is also much more expensive. Storing all data on flash makes little sense, since most data is accessed infrequently.

In addition, some data must be retained for extended periods due to compliance rules, while other data needs to be accessed more regularly, for example when preparing long-term analyses. Even backup files do not have to sit on flash, as they only come into play when restoring a data set. IT managers must therefore keep a constant eye on their data strategy to determine the optimal storage medium for each application.
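The placement logic described above can be sketched as a simple tier-selection rule. The function name, tier labels and thresholds below are illustrative assumptions, not vendor guidance:

```python
def choose_tier(accesses_per_day: float, backup_or_compliance_only: bool) -> str:
    """Pick the cheapest medium that still fits the access profile.

    Thresholds are illustrative - the point is that placement should follow
    how the data is actually used, not a one-size-fits-all 'fastest media' rule.
    """
    if backup_or_compliance_only:
        return "nearline_sas"   # touched only on restore or audit
    if accesses_per_day >= 10:
        return "flash"          # active working set justifies the cost
    return "nearline_sas"       # cold data belongs on cheaper media
```

Even this toy rule captures the article's point: backup and compliance-only data never needs the most expensive tier.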

Time-consuming manual approaches

Until now, predefined policies have determined exactly which data is stored where. Policies are established at the outset, when the corresponding structures are created, and then remain largely unchanged, apart from minor modifications during day-to-day operations. Yet the amount and, above all, the nature of the data is changing rapidly.
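A static policy of this kind amounts to a fixed lookup table, set once and applied regardless of how access patterns later drift. A minimal sketch (the data classes and tier names are hypothetical):

```python
# A static placement policy: rules fixed at design time,
# applied uniformly even as access patterns change.
PLACEMENT_POLICY = {
    "database":   "flash",
    "backup":     "nearline_sas",
    "archive":    "nearline_sas",
    "home_share": "nearline_sas",
}

def place(data_class: str) -> str:
    # Anything the policy authors did not anticipate falls to a default tier,
    # whether or not that tier suits its actual access pattern.
    return PLACEMENT_POLICY.get(data_class, "flash")
```

The weakness is visible immediately: any workload the policy authors did not foresee is placed by the default rule, and correcting that requires a manual policy change.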

In the past, data was largely standardised because of the limited capacity and capabilities of the IT systems of the time; today things look different. Constantly adjusting policies by hand is becoming more complex and increasingly ties up personnel who can then no longer perform other important tasks in full. More complex data structures require more frequent adjustments, because the wrong choice of storage location can either burden the budget with costly storage for irrelevant data or disrupt operations through slower access to relevant data.

Ground-breaking solution using AI

So, how can this dilemma be solved? One way out is to use AI. With an automated method, adjustments can be made second by second without manual intervention, allowing companies to use more cost-effective storage. Using machine learning, an AI engine can evaluate user behaviour and the nature of access to data, and assign the storage location accordingly. It can also extrapolate access patterns to anticipate future usage behaviour, which in turn supports forecasts of the capacity and performance that will be required - useful input for infrastructure and budget planning. An important goal here is to avoid tying up unnecessary resources.
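One way such an engine could score data is by combining access features (frequency, recency, access type) into a "hotness" estimate and mapping that to a tier. The features, weights and thresholds below are invented for illustration; a real engine would learn them from access telemetry:

```python
import math

def hotness_score(freq_per_day: float, hours_since_access: float,
                  sequential_ratio: float) -> float:
    """Toy linear model squashed through a sigmoid.

    Weights are hand-set placeholders; in a real system they would be
    learned from observed access behaviour.
    """
    w_freq, w_recency, w_seq = 0.6, -0.05, -0.3
    z = w_freq * freq_per_day + w_recency * hours_since_access + w_seq * sequential_ratio
    return 1 / (1 + math.exp(-z))   # estimated probability the data stays "hot"

def assign_tier(score: float) -> str:
    if score > 0.8:
        return "dram"
    if score > 0.4:
        return "flash"
    return "nearline_sas"
```

Because the score is recomputed continuously from fresh telemetry, placement adapts on its own instead of waiting for a policy revision.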

Smart decisions via neural cache

For example, AI can be applied via a neural cache, a technology that delivers lower latencies than flash by leveraging smart software algorithms. Machine learning algorithms scan the data pool and analyse access patterns to find hidden correlations, then decide which data must be available for immediate access by applications or users. Frequently used data is automatically held in Random Access Memory (RAM), which is faster than flash. Next comes 'warm' data, which is stored on flash, while less frequently used data is placed on Near-Line SAS drives, which are much more cost-effective.
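The hot/warm/cold placement described here behaves like frequency-ranked tiering: the most-accessed blocks occupy the fastest media. A minimal sketch, with made-up per-tier slot counts standing in for real capacities:

```python
from collections import Counter

class TieredPlacement:
    """Rank blocks by access count and map the hottest to the fastest media.

    Slot counts are illustrative stand-ins for tier capacities.
    """
    def __init__(self, dram_slots: int = 1, flash_slots: int = 2):
        self.hits = Counter()
        self.dram_slots, self.flash_slots = dram_slots, flash_slots

    def access(self, block: str) -> None:
        self.hits[block] += 1

    def tier_of(self, block: str) -> str:
        ranked = [b for b, _ in self.hits.most_common()]
        pos = ranked.index(block)
        if pos < self.dram_slots:
            return "dram"
        if pos < self.dram_slots + self.flash_slots:
            return "flash"
        return "nearline_sas"
```

As access counts shift, blocks migrate between tiers automatically - the "neural" part in a real product lies in predicting these shifts rather than merely counting them.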

In a storage array that combines Dynamic RAM (DRAM), flash media and near-line SAS drives, the neural cache reduces latency and accelerates read/write access. Most applications are transactional, requiring at least two separate input/output (I/O) operations: one to commit the transaction to the logs, the other for the actual write of the data. Latencies can therefore have an outsized effect on performance, and the response times of the metadata layer limit the maximum performance of the application. Both read and write operations - insertions, changes and deletions in the metadata structure - are processed with the same latency.
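The arithmetic behind this is simple but worth making explicit: because the log commit and the data write are serial, per-operation latency is paid at least twice per transaction. The microsecond figures below are illustrative, not measured values:

```python
def txn_latency_us(media_write_us: float, ops_per_txn: int = 2) -> float:
    # Each transaction commits a log record, then writes the data itself,
    # so the media's write latency is paid serially for every operation.
    return media_write_us * ops_per_txn

flash_txn = txn_latency_us(100)   # external flash round trip ~100 us (assumed)
dram_txn = txn_latency_us(5)      # DRAM-backed write ack ~5 us (assumed)
```

Doubling (or more) of per-operation latency is exactly why shaving microseconds off each write has an outsized effect on transactional throughput.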

These operations are performed directly in the DRAM of the storage server, without pre-processing such as pattern removal, compression or encryption. Meanwhile, a second copy of the write is made in the DRAM of another storage node over low-latency Remote Direct Memory Access (RDMA), and only then is an acknowledgement sent to the host. Writing directly to the DRAM attached to the storage server's central processing unit (CPU) results in lower overall latency than directly accessing an external flash device.
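The write path described above can be sketched as: land the block in local DRAM, mirror it to a peer node, and only then acknowledge the host. The types and names below are hypothetical, and two plain dictionaries stand in for the two nodes' DRAM (the real second copy travels over RDMA):

```python
from dataclasses import dataclass

@dataclass
class Block:
    id: int
    data: bytes

def write(block: Block, local_dram: dict, peer_dram: dict) -> str:
    # Store raw data in local DRAM - no dedupe/compression/encryption yet.
    local_dram[block.id] = block.data
    # Second copy on another node (stand-in for the RDMA transfer).
    peer_dram[block.id] = block.data
    # Only once both copies exist does the host see the write as durable.
    return "ack"
```

The design choice being illustrated: durability comes from two independent DRAM copies, so the host never waits on flash in the write path.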

In addition, the use of a single large memory pool for accepting writes - unlike traditional architectures, where the write cache is divided into smaller sections - means that larger write bursts can be sustained. Data that changes frequently can be overwritten at DRAM latency, and the neural cache intelligently decides which data blocks should be stored on which media. Because data is retained longer in the write cache, the CPU and back end are relieved. The neural cache can also accelerate read operations by holding the most active data in DRAM.
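The back-end relief comes from write coalescing: repeated overwrites of the same block are absorbed in DRAM, and only the final version is flushed. A minimal sketch under that assumption:

```python
class WriteCache:
    """Single large write buffer: overwrites of the same block are absorbed
    in DRAM, so only one backend I/O per dirty block is needed at flush time."""
    def __init__(self):
        self.dirty = {}
        self.backend_writes = 0

    def write(self, block_id: int, data: bytes) -> None:
        # Overwrite in place at DRAM latency; no backend I/O yet.
        self.dirty[block_id] = data

    def flush(self) -> None:
        # One backend write per dirty block, regardless of overwrite count.
        self.backend_writes += len(self.dirty)
        self.dirty.clear()
```

A block rewritten a hundred times before the flush still costs a single backend write - which is precisely how longer retention in the cache relieves the CPU and back end.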

AI builds its experience by analysing large datasets and identifying patterns and features. It helps IT managers reduce their storage spending - already a top line item in their budgets - and frees up money to invest in innovation and transformation.





While every effort has been made to ensure the accuracy of the information contained herein, the publisher and its agents cannot be held responsible for any errors contained, or any loss incurred as a result. Articles published do not necessarily reflect the views of the publishers. The editor reserves the right to alter or cut copy. Articles submitted are deemed to have been cleared for publication. Advertisements and company contact details are published as provided by the advertiser. Technews Publishing (Pty) Ltd cannot be held responsible for the accuracy or veracity of supplied material.




© Technews Publishing (Pty) Ltd. | All Rights Reserved.