Data protection in the cloud

Issue 1 2023 Infrastructure


Rick Vanover

The ‘Global DataSphere’ is exploding in size. IDC predicts that by 2026, the amount of data in the world will have doubled again. While most enterprises have digitised their operations, they continue to add more strategic workloads and create more and more data. So, as the amount of data enterprises have to deal with grows exponentially, moving to the cloud on the basis of a well-planned strategy offers significant benefits like scalability, flexibility and cost-effective storage.

But can this go on forever? Gartner expects total worldwide end-user spending on public cloud services to reach a record $592 billion this year, a 21% increase from 2022. The Cloud Security Alliance (CSA) reported that 96% of companies say they have insufficient security for sensitive cloud data, so across the board we have a long way to go on this journey. Here are three best practices for enterprises to protect their data in the cloud.

1. Know your data

The first step to solving any problem is to know what you are dealing with. Before you can protect anything, you need to know who is storing what, and where. Is everyone in the business using the same accounts? To make sure this is done right, IT teams often need to play detective or go on a journey of discovery across the business. To find these threads, it is often necessary to look through finances and collect invoices for cloud costs across the organisation.

When brought together, the amount of data kept by most enterprises, whether it was migrated over from on-premises or originally stored in the cloud, is vast. Humans are natural hoarders, and the digital world is no exception. While the ‘virtual garage’ of the cloud can store endless boxes of data, locating everything is only half the battle. In order to know what data is mission-critical and sensitive, you will need to classify it.

Automated data classification engines can help you sort and organise, so you are not blindly trying to protect everything to the nth degree. Once you know exactly what you have stored in the cloud (and where), only then can you start looking at how this data is secured.
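Classification engines typically combine pattern rules with machine learning. As a minimal illustration of the rule-based part, a first pass over a data inventory might look like the following sketch (the labels, patterns and sample records are hypothetical, not taken from any real engine):

```python
import re

# Ordered sensitivity rules: first match wins. Patterns and labels are
# illustrative placeholders, not a real classification taxonomy.
RULES = [
    ("restricted", re.compile(r"\b(?:passport|credit.?card|ssn)\b", re.I)),
    ("confidential", re.compile(r"\b(?:salary|invoice|contract)\b", re.I)),
]

def classify(text: str) -> str:
    """Return the first matching sensitivity label, else 'public'."""
    for label, pattern in RULES:
        if pattern.search(text):
            return label
    return "public"

# Classify a small sample inventory so protection effort can be prioritised.
inventory = {
    "hr/salaries-2023.csv": "employee salary bands",
    "marketing/blog-draft.md": "product launch announcement",
    "finance/q3-invoice.pdf": "invoice for cloud services",
}
classified = {path: classify(desc) for path, desc in inventory.items()}
```

Even a crude pass like this separates the data that needs heavyweight protection from the bulk that does not, which is the point of classifying before securing.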

As organisations face a fairly low barrier to entry when moving data to the cloud, teams may not have prioritised the security and network processes that are required; if the migration happened too fast, this can easily be the case. Similarly, because the cloud is a completely different environment to secure, things are often missed. There are many service types that do not always exist on-premises, and many of these need to be protected and recovered in the case of attacks or outages. Examples include code in cloud storage, applications that leverage other cloud services, and APIs provided in the cloud.

2. Know your responsibilities

A key issue is that enterprises often do not realise exactly what they are responsible for when it comes to security and data protection in the cloud. There is a big gap in awareness of the shared responsibility model on which cloud security is built: enterprises assume the provider is responsible for certain security measures when, in reality, the job is theirs. While the specifics depend on the cloud provider, typically the provider is responsible for the security of the infrastructure and the physical facilities that host it. Securing applications, data and access to the environment, however, is the responsibility of the customer.

In practice, this means enterprises need to ensure they have backups of all critical and sensitive data stored in the cloud, in case of breaches or outages. The best practice is to have multiple backups in different locations (e.g. one on-premises as well as a cloud copy) and copies of data across different mediums, with at least one copy kept offsite, offline and immutable – better yet, all three.
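This combination is often summarised as the 3-2-1 rule: at least three copies, on two different media, with one copy offsite. A minimal sketch of an automated compliance check over a backup inventory, with hypothetical field names and sample data:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    medium: str        # e.g. "disk", "object-storage", "tape" (illustrative)
    offsite: bool
    immutable: bool

def meets_3_2_1(copies: list[BackupCopy]) -> bool:
    """True if the copies satisfy: >=3 copies, >=2 media, >=1 offsite."""
    return (
        len(copies) >= 3
        and len({c.medium for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

copies = [
    BackupCopy("disk", offsite=False, immutable=False),          # on-premises
    BackupCopy("object-storage", offsite=True, immutable=True),  # cloud copy
    BackupCopy("tape", offsite=True, immutable=True),            # offline archive
]
compliant = meets_3_2_1(copies)
```

The offline and immutable attributes are the extras worth checking for as well, since a copy an attacker can reach and rewrite is not much of a last line of defence.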

The other core security responsibility that lies with the enterprise is controlling access and privileges. If every user of your cloud has access to ‘God Mode’, any breach is going to be devastating. The same is true if a single account is used for multiple functions, such as protection and provisioning.

The best practice is to ensure multiple accounts are used across the business, with access and identity management applied correctly across accounts and subscriptions, so you can isolate the failure domain in the case of a security breach. At a user level, ensure the principle of least privilege is followed across the cloud environment, so that people only have access to the resources and environments they need.
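A least-privilege audit can start as simply as diffing what each account has been granted against what its role actually needs. A minimal sketch, where the role names and permission strings are hypothetical rather than any provider's real IAM vocabulary:

```python
# Documented minimum permissions per role (illustrative assumptions).
REQUIRED = {
    "backup-service": {"storage:read", "snapshot:create"},
    "developer": {"compute:deploy", "logs:read"},
}

# Permissions actually granted in the cloud environment (sample data).
GRANTED = {
    "backup-service": {"storage:read", "snapshot:create"},
    "developer": {"compute:deploy", "logs:read", "iam:admin"},  # over-privileged
}

def excess_privileges(granted, required):
    """Return, per account, permissions held beyond its documented needs."""
    return {
        account: perms - required.get(account, set())
        for account, perms in granted.items()
        if perms - required.get(account, set())
    }

over = excess_privileges(GRANTED, REQUIRED)
```

Run periodically, a report like this catches the privilege creep that accumulates as people change roles and projects wind down.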

3. Keep it cost-effective

In all likelihood, putting the previous two principles into place will be a significant project for most enterprises. The good news is the initial heavy lift to do so will not be required again on the same scale. However, to keep the cloud environment healthy and cost-effective long term, it is important to have cloud data hygiene processes in place.

Ensure you have a proper data lifecycle process. Without it, the good work done initially will become ineffective and expensive over time, with the business paying to store and protect the wrong data in the wrong ways. Data needs to be on the right storage platform in the cloud – and this will change during its lifecycle. For example, it might move from block storage to object storage to archive storage. The costs associated with these vary, so make sure you are not storing (or backing up) data in inefficient ways.
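An age-based tiering rule is one simple way to express such a lifecycle. The thresholds and tier names below are illustrative assumptions, not any provider's defaults:

```python
# Hot block storage for recent data, object storage for older data,
# archive beyond a year. Thresholds are placeholder policy choices.
def storage_tier(age_days: int) -> str:
    if age_days <= 30:
        return "block"
    if age_days <= 365:
        return "object"
    return "archive"

# Where sample datasets of various ages would land under this policy.
tiers = {age: storage_tier(age) for age in (7, 90, 400)}
```

Most providers can enforce rules like this natively through lifecycle policies, so the real work is deciding the thresholds, not writing the automation.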

This is one small part of avoiding eventual ‘bill shock’ for cloud computing and storage costs. Beyond storage itself, there are API costs, data egress (transfer) charges and more. I always recommend enterprises have an established ‘cloud economic model’ that they follow to prevent costs from piling up and to ensure spending matches expectations. To use a real-life analogy, if you leave a light on or forget to cancel a subscription you no longer use, your monthly bills will be higher than expected. If this happens across an enterprise cloud environment, the total tally can be eye-watering.
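A cloud economic model can begin as nothing more than a back-of-envelope estimator covering the main cost drivers. The unit prices below are placeholders; real rates vary by provider, region and storage tier:

```python
# Illustrative unit prices only -- substitute your provider's actual rates.
PRICES = {
    "storage_gb_month": 0.023,  # per GB stored per month
    "egress_gb": 0.09,          # per GB transferred out
    "api_per_1000": 0.005,      # per 1000 API requests
}

def monthly_cost(storage_gb: float, egress_gb: float, api_calls: int) -> float:
    """Rough monthly spend across storage, egress and API request charges."""
    return round(
        storage_gb * PRICES["storage_gb_month"]
        + egress_gb * PRICES["egress_gb"]
        + (api_calls / 1000) * PRICES["api_per_1000"],
        2,
    )

estimate = monthly_cost(storage_gb=5000, egress_gb=200, api_calls=2_000_000)
```

Comparing an estimate like this against the actual invoice each month is the simplest way to spot the forgotten-subscription equivalents before they compound.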

As the amount of data enterprises (and the world) store continues to grow over the next five years, the cloud is going to be a vital piece of the puzzle in managing it. Enterprises need to look beyond just storing and protecting their data, and find ways to utilise it and unlock value for their business and their customers. Doing this requires re-factoring for greater agility, but it also means the business is prepared for whatever comes next. Cloud computing is nothing if not dynamic and will continue to evolve, with best practice bound to change. If enterprises become data-centric now, both in the cloud and on-premises, they will be ready for whatever the future throws their way.
