The pandemic’s long-term legacy on cloud adoption rates and what comes next – ITProPortal



After a year of hardship and disruption, many UK businesses will have viewed July’s Freedom Day – and its easing of the remaining Covid restrictions – as a welcome return to some sort of normality. The long-awaited return to the office and face-to-face meetings is finally here. However, very few organizations will be planning a reversion to the old paradigms. 

The pandemic accelerated a trend that was already well underway: cloud adoption. During the first weeks of chaos, the need to facilitate a mass move towards remote working, boost resilience and switch to online customer communications meant that the use of cloud services was no longer optional. And, with more than four in five UK businesses already stating plans to implement hybrid strategies moving forward, cloud workloads are only likely to increase further. 

However, cloud environments come with their own unique set of data-related challenges. Whether it’s integration or security, businesses need to act now to get one step ahead. 

The year of the cloud  

Although many organizations have been on the journey to the cloud for several years in some way or another, the pandemic undoubtedly created a new impetus. It forced business leaders across all sectors to reevaluate their priorities and – more often than not – cloud adoption came out on top. 

In fact, Denodo’s 2021 Global Cloud Survey found that adoption of advanced cloud workloads has risen 25 percent year-on-year, with more complex workloads moving to the cloud. It also reported that almost 40 percent of the 150 businesses surveyed are now leveraging hybrid cloud (combining on-premise and cloud services), while some 9 percent have extended their architectures to multi-cloud (more than one cloud service).

There are several reasons why the pandemic accelerated cloud adoption at a mass scale. These include:

– The need to facilitate remote working – This is perhaps the most apparent change brought on by the pandemic. In an effort to follow government guidelines, keep employees safe and limit the spread of the virus, offices closed their doors and remote work took hold. The need to collaborate online instead of face to face meant that cloud technologies were more essential than ever before. Even industries that were typically slow to adopt due to lack of funding or resources – such as the public sector – had no option when the pandemic hit. 

– The need to continue to deliver services to customers in a digital landscape – Traditional retailers and service providers needed to accelerate their transformation from face-to-face, store-based customer interaction to online services. Whether it was putting in place new service models for physical goods – such as home delivery and click-and-collect – or adopting new ways to deliver informational services and advice – such as healthcare Software-as-a-Service (SaaS) applications or e-learning – all industries needed to reassess how they communicated with their customers in a new digital world. This is where the cloud came in.

– The need to increase resilience and agility – The first few months of the pandemic, especially, caused fluctuations in the demand for specific services. The cloud enabled businesses to be more flexible and scale according to this demand. It also helped businesses rapidly adapt to new commercial models and launch innovative products and services to gain a competitive advantage, even during this time of chaos. 

There is no doubt that cloud technologies became a lifeline for many businesses over the last 18 months, enabling them to survive one of the toughest economic periods in recent history. However, whilst the mass adoption of cloud technologies solved some of the immediate problems caused by the pandemic, it has also created a unique set of data-related challenges. If organizations are to continue to reap the benefits of cloud long-term, they need to act now and put the processes in place to thrive in our new hybrid landscape.

New landscape, new challenges 

Businesses typically find themselves struggling during three stages of maturity in the cloud deployment journey:

  • The initial migration – when moving data to the cloud and transforming the existing systems to new cloud models whilst continuing to run existing services. 
  • Accessing and integrating disparate data sources after the initial migration – often from hybrid models with a mix of on-premise and cloud data. 
  • Dealing with more advanced cloud use – this often includes multiple cloud services (public, private, SaaS) where the disparate nature of the data is further complicating the challenges of data integration. 

A common thread throughout these stages is data integration. When moving to the cloud, the huge increase in data volumes stored across multiple sources can make it difficult to keep track. This exacerbates existing challenges around data security, governance and compliance, as well as other regulatory practices such as data sovereignty, lineage and ownership. 

To make matters worse, traditional means of moving and copying data are no longer fit for purpose. In fact, the most common method of data integration, Extract, Transform, Load (ETL) – where data is extracted from an existing source, transformed into a common format and loaded into a new location – has been around since the 1970s. It is no surprise that this technique’s limitations – most prominently around complexity, performance, and security and governance – are becoming increasingly apparent in our data-intensive age. This is where data virtualization comes in.
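To make the ETL pattern described above concrete, the following minimal sketch walks through the three steps. All names, records and the in-memory SQLite "warehouse" are invented for illustration; real pipelines operate on far larger volumes, which is precisely where the bulk-copying approach strains.

```python
import sqlite3

# Extract: pull raw rows from a source system (an in-memory list here,
# standing in for a CSV export or legacy database).
raw_orders = [
    {"id": 1, "amount": "19.99", "currency": "gbp"},
    {"id": 2, "amount": "5.00", "currency": "GBP"},
]

# Transform: normalise each record into a common format
# (numeric amounts, upper-case currency codes).
def transform(row):
    return (row["id"], float(row["amount"]), row["currency"].upper())

# Load: bulk-insert the transformed rows into the target store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, currency TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    (transform(r) for r in raw_orders),
)

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))  # 24.99
```

Note that the data is physically copied into the target before anyone can query it; every change at the source requires the pipeline to run again, which is the core limitation data virtualization avoids.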


Data virtualization complements ETL and other methods of integration such as the Enterprise Service Bus (ESB) and Enterprise Applications Integration (EAI) by removing the need to move and copy data during the journey to the cloud. This gives agility and self-service capabilities to the business, without compromising on security and governance.

As consumers, many of us already use a very similar model at home when consuming entertainment through services such as Netflix and Spotify. We don’t hold the physical content in our homes, on DVDs, Blu-rays or CDs. Instead, we review the information about the film or music (the metadata) on a platform to decide what we want and, once selected, that content is streamed in real time from some unknown location in the cloud. 

Data virtualization works like this, but for enterprises. Using this technology, data is kept at the source and only abstracted and consumed – in a report, dashboard or application – in real time, when it is needed. This is completely different from the bulk moving and copying of data used in ETL and data warehousing models. It enables businesses to gain real-time data insights, vastly reduce the movement of data around the enterprise and provide centralized security and data governance irrespective of the data sources. For example, for a user who wants to run a query for a new analytics report, a data virtualization layer would hold details about all the data they might want to consume. Only at runtime would the system abstract and combine the actual data from the source locations to answer their request. Much like the domestic model for entertainment described above, the content is provided only as and when it is needed.
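The runtime behaviour described above can be sketched as follows. This is a toy illustration, not any vendor's API: the two source functions and the `VirtualLayer` class are hypothetical stand-ins for an on-premise database and a cloud service, and the layer holds only metadata (which sources exist), fetching and combining rows only when a query arrives.

```python
# Two stand-in data sources: nothing is copied out of them ahead of time.
def on_prem_customers():
    yield {"id": 1, "name": "Acme"}
    yield {"id": 2, "name": "Globex"}

def cloud_orders():
    yield {"customer_id": 1, "amount": 100.0}
    yield {"customer_id": 1, "amount": 50.0}
    yield {"customer_id": 2, "amount": 75.0}

class VirtualLayer:
    """Holds metadata about sources; abstracts and joins data at runtime."""

    def __init__(self):
        # Metadata only: which logical views exist and where they live.
        self.sources = {"customers": on_prem_customers, "orders": cloud_orders}

    def query_totals(self):
        # At runtime, pull from both sources and combine for this request.
        customers = {c["id"]: c["name"] for c in self.sources["customers"]()}
        totals = {}
        for order in self.sources["orders"]():
            name = customers[order["customer_id"]]
            totals[name] = totals.get(name, 0.0) + order["amount"]
        return totals

layer = VirtualLayer()
print(layer.query_totals())  # {'Acme': 150.0, 'Globex': 75.0}
```

The key design point is that the join happens per request: the consumer never needs to know which rows came from on-premise systems and which from the cloud, mirroring the location-transparency the article describes.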

With cloud technologies set to play an increasingly important role moving forward, organizations need to act now to ensure that they are able to overcome any data-related challenges and maximize value from their pandemic investment. Modern technologies, such as data virtualization, could bring enormous agility to businesses, removing the complexities from hybrid and multi-cloud architectures as users no longer need to worry about where the data is held or what format or protocol is needed for access. Adopting these technologies could help businesses to thrive, no matter what is around the corner.

Charles Southwood, Regional VP – Northern Europe and MEA, Denodo

Charles Southwood is Regional VP for Denodo Technologies, responsible for the company’s business revenues in Northern Europe and South Africa. Born and raised in Ascot, Berkshire, with a degree in engineering from Imperial College London, Charles has over 20 years in technology sales and sales leadership, where he has been involved in both start-up ventures and the expansion of existing businesses.