Cloud Data Migration (Database Migration to the Cloud)


We’re already in a time when nearly all organizations, from small businesses to the largest enterprises, store the bulk of their data in the cloud. Recent surveys show that 60% of business data now lives in the cloud, and 90% of enterprises leverage multiple cloud environments.

Chances are, every new data store that database, DevOps, and development teams introduce today is a cloud-based platform. That’s great for future storage, collection, and analysis – but everything on legacy on-premises databases is left behind.

Copy + paste, right?

Of course, it’s not as easy as dragging a file from a local desktop over to Google Drive or Dropbox – but the concept is essentially the same. Cloud-based storage environments come with a slew of benefits, most importantly scalability without hardware and maintenance overhead.

In practice, the cloud data migration process is complex and full of risk. It requires careful coordination throughout the pipeline between DBAs, DevOps managers, application development teams, data teams, and global infrastructure teams. It also requires buy-in, support, and investment from the organization’s technology leaders. 

Here’s what you need to know to run a database migration to the cloud the right way, including the tools, strategies, and best practices.

What is a cloud data migration (database migration to the cloud)?

A cloud data migration, or a database migration to the cloud, is the strategic transfer of an organization’s data from its existing on-premises database environment(s) to its new cloud environment(s). The database migration might be part of an org-wide cloud migration – in which every tech platform shifts to the cloud – or it may be just a database-level initiative affecting one, multiple, or all of the organization’s data stores.

Whether it’s a direct “lift and shift” migration or a more complex re-architecting into specialized data stores, a cloud data migration carries data security, compliance, availability, and integrity risks. It requires a coordinated effort between database, application development, data, DevOps, and infrastructure teams, as well as the right tool for automating the migration across millions of data points. 

Cloud-based data stores can minimize and even eliminate the maintenance and overhead concerns of on-premises databases and make it easier to grow – both in the volume of data that is collected, processed, and analyzed and in the diversity of specialized cloud-native data stores.

They can also present major cost efficiencies. 

The benefits of shifting to cloud storage

The main differences between on-premises and cloud-based data storage and management come down to:

  • Scalability
  • Efficiency
  • Speed
  • Maintenance
  • Cost
  • Innovation & options
  • Tooling & automation

Cloud-based systems offer on-demand scalability without the need for physical hardware upgrades, while on-premises systems require significant investment in infrastructure (and additional costs to house that infrastructure, etc.). Much like the SaaS model revolutionized software to keep customers at the cutting edge of technology and support, database-as-a-service (DBaaS) keeps cloud database users on top of the best practices and solutions. Cloud databases also take on the burden of maintenance, freeing up IT teams from backups, updates, and other crucial yet time-intensive tasks. 

Efficiency and speed gains tend to come by way of purpose-built structures and capabilities that streamline and accelerate provisioning, operations, and more. Migrating to the cloud provides access to a wide array of services from providers like Azure, Google Cloud, AWS, and Oracle Cloud. Cloud migration also opens the door to a much wider range of non-relational (NoSQL) databases such as MongoDB, Cassandra, Neo4j, and Amazon DynamoDB. These databases are ideal for handling diverse data types and large volumes of data, making them particularly useful for applications that require flexibility in data storage and retrieval.

Cost is often the first cloud benefit to attract attention, though, especially for rapidly scaling organizations. Without major upfront investment in hardware, infrastructure, and physical storage space, cloud databases lower the barrier to entry for more organizations and for many of the latest technologies. With a subscription-based model that can flex based on current needs, cloud options eliminate many of the concerns around growing and diversifying data stores.

Once the database is migrated, cloud systems typically include more advanced management and automation features, either out-of-the-box or through integrations up and down the pipeline tech stack. Much of what’s done manually – from change management to auditing – can be automated or streamlined through cloud database integrations. 

That advanced automated approach to management needs to come into play before cloud databases are live. Every database migration to the cloud needs a strategy informed by every stakeholder and tools that enable automation, consistency, control, security, and visibility. 

Why every migration needs strategy & tooling

No matter how smooth and straightforward a cloud data migration might seem, it’s always fraught with risks and challenges unique to the business, team, and technology: data security and compliance issues, potential downtime and data loss, compatibility and integration problems, and the need for meticulous coordination among various teams to ensure a smooth and efficient transition.

Proper tooling enables automation and consistency throughout the migration process. It ensures governance, security, and observability, reducing the risk of errors and enhancing overall efficiency. Advanced tools automate crucial tasks such as change management, auditing, and monitoring, making the migration seamless and more reliable. By integrating these tools into your strategy from the outset, you can manage the complexities of cloud data migration effectively and position your organization for long-term success in the cloud.

Cloud database migration models & strategies

An infrastructure-wide cloud migration, including platforms beyond data storage, has a handful of models and approaches to choose from. Focusing on the database migration to the cloud, though, is a bit simpler. Essentially, it comes down to the classic “lift and shift” approach or refactoring to specialized cloud-native databases – or a mix of the two. 

Organizations undergoing a cloud data migration might take any of these approaches – or all three at once across different data stores:

  • Migrate over an entire data store to its cloud counterpart
  • Transform some of its data for migration into specialty cloud databases and retain the rest on-premises
  • Migrate entirely to a cloud database and duplicate/transform certain sets of data into additional cloud-native storage environments

Lift and shift

A common scenario is for an organization to forgo its on-site servers and all the maintenance that comes with them. They want to use essentially the same technology but shift from hardware to virtually hosted environments. This basic migration is also referred to as rehosting – same stuff, new host – and means the same for application migrations, too.

So, they want to migrate their on-premises Oracle Database (for example) – the central repository of all the organization’s data – to Oracle in a Virtual Machine (VM), an Elastic Compute Cloud (EC2) instance, or another hosted environment on a cloud platform such as Azure, AWS, or Google Cloud Platform.

The “lift and shift” database migration approach involves moving the entire Oracle Database (example) from an on-premises environment to a cloud-based environment with minimal changes to the database architecture. This method is relatively straightforward and quick, allowing organizations to leverage cloud benefits without major modifications to their existing database systems. 
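
As a rough illustration, a rehost like this could be baselined with a schema migration tool such as Liquibase. The connection strings below are placeholders, and the generated changelog covers schema objects only – bulk data would move through a separate transfer mechanism:

```bash
# Capture the on-prem Oracle schema as a version-controlled changelog.
# JDBC URLs and credentials are placeholders.
liquibase generate-changelog \
  --changelog-file=oracle-baseline.xml \
  --url="jdbc:oracle:thin:@onprem-db:1521/ORCL" \
  --username="${DB_USER}" --password="${DB_PASS}"

# Replay the same changelog against the cloud-hosted target so the rehosted
# database starts from an identical, tracked schema.
liquibase update \
  --changelog-file=oracle-baseline.xml \
  --url="jdbc:oracle:thin:@clouddb.example.com:1521/ORCLCLOUD" \
  --username="${CLOUD_DB_USER}" --password="${CLOUD_DB_PASS}"
```

From that point on, every schema object lives in a tracked changelog rather than only inside the database itself.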

However, lift and shift migrations still require a careful approach to manage:

  • System availability related to the migrating databases
  • Network configurations
  • Security configurations and compliance requirements
  • Version differences between existing and target databases

With proper management, the lift-and-shift process can be fairly uneventful as long as the on-premises and cloud database versions are relatively close. However, if the on-premises database is on, say, Oracle 12c from 2017, it will need some adjustments to fit the latest structures in Oracle Cloud Infrastructure (OCI).

That small adjustment is just the tip of the iceberg when it comes to more complex migration management. The more the original and cloud databases differ, the more that might need to be done to bring that data and structure over to the virtual environments. 
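
One way to surface those differences before cutover is a schema comparison between the legacy database and the cloud target – for example, with a tool like Liquibase. A minimal sketch, using placeholder connection details:

```bash
# Compare the legacy on-prem database (reference) against the cloud target
# and report the objects that differ. URLs and credentials are placeholders.
liquibase diff \
  --reference-url="jdbc:oracle:thin:@onprem-db:1521/ORCL" \
  --reference-username="${DB_USER}" --reference-password="${DB_PASS}" \
  --url="jdbc:oracle:thin:@oci-host.example.com:1521/ORCLPDB1" \
  --username="${OCI_USER}" --password="${OCI_PASS}"

# Optionally, capture the gap as a deployable changelog instead of a report.
liquibase diff-changelog \
  --changelog-file=migration-gap.xml \
  --reference-url="jdbc:oracle:thin:@onprem-db:1521/ORCL" \
  --reference-username="${DB_USER}" --reference-password="${DB_PASS}" \
  --url="jdbc:oracle:thin:@oci-host.example.com:1521/ORCLPDB1" \
  --username="${OCI_USER}" --password="${OCI_PASS}"
```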

More complex migration scenarios

The lift-and-shift/rehost scenario is the most straightforward of cloud database migrations, but even migrations between the same technology (e.g., Oracle to Oracle Cloud) can require changes for cloud compatibility. This kind of relocation migration will require structural adjustments but can happen quickly with minimal disruption.

From there, cloud migration complexity increases with different and additional environments. Replatforming changes and optimizes the existing database’s schema and data for the new cloud environment without significant changes to its core architecture. Sometimes replatforming is referred to as “lift, tinker, shift” – think of it as a tick more evolved than rehosting, so the database takes on the most efficient structure aligned with the cloud platform’s capabilities.

When going to more advanced relational databases or innovative and specialized NoSQL data stores, the database migration includes refactoring (re-architecting), which becomes a more difficult, risky, and resource-intensive process. These cases often involve significant transformations to fit the schema and capabilities of the new specialized cloud database.

Replatforming and refactoring to leverage purpose-built environments for specific needs might look like, for example:

  • Shifting a database from a legacy on-prem Oracle Database to a cloud-hosted Postgres environment for cost reductions
  • Adopting a cloud data warehouse like AWS Redshift to integrate IoT telemetry data with legacy on-prem datasets
  • Splitting up an on-prem database into specialized data stores like PostgreSQL or Snowflake 

Overall, replatforming and refactoring legacy data stores to adopt innovative specialized database types in the cloud enhances efficiency and flexibility in database management. Many of the processes that used to take hours or days on legacy on-prem systems can be reduced to seconds with the right cloud platform, making the preparation and investment in migration strategy and tooling well worth the resources.
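
For the Oracle-to-Postgres example above, a refactored changeset might look something like the following sketch: a Liquibase SQL-formatted changelog with hypothetical table and column names, deployed to an assumed cloud Postgres endpoint.

```bash
# Hypothetical refactored changeset for the new Postgres target, written as a
# Liquibase SQL-formatted changelog. Table and column names are illustrative.
mkdir -p changelog
cat > changelog/0001-orders-postgres.sql <<'SQL'
--liquibase formatted sql

--changeset data-team:orders-refactor-postgres
-- Oracle NUMBER / VARCHAR2 / DATE columns re-typed for PostgreSQL.
CREATE TABLE orders (
    order_id    BIGSERIAL PRIMARY KEY,
    customer_id BIGINT NOT NULL,
    status      VARCHAR(20) NOT NULL,
    created_at  TIMESTAMPTZ DEFAULT now()
);
--rollback DROP TABLE orders;
SQL

# Deploy it to the cloud-hosted Postgres environment (placeholder URL).
liquibase update \
  --changelog-file=changelog/0001-orders-postgres.sql \
  --url="jdbc:postgresql://cloud-pg.example.com:5432/appdb" \
  --username="${PG_USER}" --password="${PG_PASS}"
```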

Prepare for the challenges of cloud data migrations

Whether the team takes on a straightforward lift-and-shift cloud data migration or a more complex refactoring, the expected challenges can be as mundane as a bit of downtime or as catastrophic as data loss or a security breach. Tread carefully and strategically by planning for known obstacles and leaving wiggle room for unexpected issues. 

Security & compliance

Like any high-value or sensitive asset in transit, databases in flux need protection. Data must remain secure and databases in compliance with regulatory and internal policies at every stage in the migration, and after. How will the team handle audits, encryption, and access control during the shift and once in the cloud environment?

System availability

Availability is also important – will the migration include scheduled downtime? How can that be minimized and strategically scheduled to avoid disruption? 

Large-scale database migrations often entail blue-green deployments so that the on-premises and cloud databases can be updated without disrupting availability. While leveraging blue-green deployments can avoid downtime, it still requires sophisticated tooling to ensure consistent updates and monitor for drift. 
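
As a minimal sketch – assuming both environments share one changelog, with placeholder URLs and credentials supplied through environment variables – each side can be checked for undeployed changesets before traffic shifts:

```bash
# Check both the blue (on-prem) and green (cloud) databases for changesets in
# the shared changelog that have not yet been applied. URLs are placeholders;
# credentials come from LIQUIBASE_COMMAND_USERNAME / LIQUIBASE_COMMAND_PASSWORD.
for TARGET_URL in \
  "jdbc:oracle:thin:@onprem-db:1521/ORCL" \
  "jdbc:oracle:thin:@clouddb.example.com:1521/ORCLCLOUD"
do
  liquibase status --verbose \
    --changelog-file=changelog/root.xml \
    --url="$TARGET_URL"
done
```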

In a similar spirit, a canary deployment strategy can roll out the new cloud database target to a small, controlled subset of users. This allows for real-world testing in a production environment with minimal impact. If the database changes perform as expected without issues, the cloud migration can continue across environments.

No matter the deployment strategy, the migration team needs to ensure minimal downtime and disruption, if not avoid them completely. 

Data loss

They also need to prevent data loss. During the migration period, data may be in flight that is not written to either database or is only written to one system and not to both, depending on how the application is designed. How will those differences be caught and rectified during and after the migration – before they cause downstream problems?

Compatibility

The migration team needs to review compatibility and integrations with the applications and other environments along the pipeline. Connecting new cloud databases to legacy systems or to additional new cloud systems will require its own testing plan to ensure proper and streamlined operations. The first goal is to have everything up and running, avoid disruptions, and have confidence in the new setup. As affiliated systems update and the team finds ways to further optimize the pipeline, these integrations will need regular review and maintenance. 
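
One practical piece of that testing plan – sketched here with a tool like Liquibase, using placeholder changelog and connection details – is previewing the exact SQL that would run against the new cloud target so it can be reviewed before anything is applied:

```bash
# Generate (but do not run) the SQL that would execute against the new cloud
# target, so it can be reviewed for compatibility before anything is applied.
# Changelog path, URL, and credentials are placeholders.
liquibase update-sql \
  --changelog-file=changelog/root.xml \
  --url="jdbc:postgresql://cloud-pg.example.com:5432/appdb" \
  --username="${PG_USER}" --password="${PG_PASS}" \
  > cloud-target-preview.sql
```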

Cost overrun

Finally, there’s the other side of the cost coin. Cloud databases, and the necessary migrations, come with expected costs as well as the risk of costly disruption, overrun, and mismanagement. It’s important to align the chosen cloud database with the team’s specific needs and understand the costs of scaling. Beyond cloud platform subscriptions, the data transfer itself poses a cost, including the internal resources and dedicated tools to manage the process.

Now that it’s much easier to consume resources – because teams can spin up new environments in seconds – there need to be processes in place to clean up resources and control costly database sprawl. Because the cloud database is easier to update, it needs governance that keeps the team moving quickly while staying within company, compliance, and security boundaries.

Database change management tools step in to handle many of the major challenges of the migration. By automating the schema and data changeover from on-prem to cloud database, a platform like Liquibase lets teams configure requirements, policies, and necessary schema changes to empower not only security and governance, but traceability and flexibility.

Protect & streamline with database change management automation

Simply put, Liquibase helps organizations get out of the data center and into the cloud as quickly as possible, with the safety, security, and reliability they need before, during, and after the transition. 

Liquibase automates multiple parts of the migration process, including packaging and deploying schema changes to suit the new cloud environment. By applying version control and capturing detailed metadata about every change, Liquibase enhances traceability and simplifies rollbacks if issues arise. This comprehensive automation reduces manual effort, minimizes errors, and supports seamless, reliable database migrations.
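
As a simplified sketch of that rollback safety net (placeholder changelog, URL, and tag names; exact flag spellings vary across Liquibase versions), the target can be tagged before migration changes are applied and rolled back to that tag if validation fails:

```bash
# Mark the known-good state before deploying the migration changesets.
# Names and URLs are placeholders; credentials come from
# LIQUIBASE_COMMAND_USERNAME / LIQUIBASE_COMMAND_PASSWORD.
liquibase tag --tag=pre-cloud-migration \
  --url="jdbc:postgresql://cloud-pg.example.com:5432/appdb"

# Deploy the migration changesets to the cloud target.
liquibase update \
  --changelog-file=changelog/root.xml \
  --url="jdbc:postgresql://cloud-pg.example.com:5432/appdb"

# If post-migration validation fails, undo everything applied after the tag.
liquibase rollback --tag=pre-cloud-migration \
  --changelog-file=changelog/root.xml \
  --url="jdbc:postgresql://cloud-pg.example.com:5432/appdb"
```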

While in essence this sounds straightforward, given the safeguards in place and the volume of data at hand, it may take some time to set up, test, process, and validate.

While the migration happens, data is still flowing into the pipeline – is it making it to the correct cloud environment? Liquibase continuously monitors for database drift to ensure ongoing data collection remains consistent through the transition. Liquibase also automates the duplication of new schema changes to both the old on-prem and the new cloud database. Consistency is key and becomes even more important if the new cloud database strategy includes multiple instances of a certain type or even various types of databases.
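
A minimal sketch of that duplication, assuming a database-agnostic changelog, placeholder connection strings, and credentials supplied separately per target:

```bash
# Apply each new schema change to every active target – the legacy on-prem
# database and the new cloud database(s) – so none falls behind during the
# transition. Assumes a database-agnostic changelog; URLs are placeholders
# and credentials are supplied separately per target.
ACTIVE_TARGETS=(
  "jdbc:oracle:thin:@onprem-db:1521/ORCL"
  "jdbc:postgresql://cloud-pg.example.com:5432/appdb"
)

for TARGET_URL in "${ACTIVE_TARGETS[@]}"; do
  liquibase update \
    --changelog-file=changelog/root.xml \
    --url="$TARGET_URL"
done
```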

Most migrations have plenty of nuance and complexity in terms of how and why they’re shifting. Beyond simply writing to the same column in a different location, teams might be shifting from one monolithic on-premises system to a more distributed and specialized mix of data stores. Instead of a 1:1 migration, they might migrate to five databases that could be all the same or all various types for different purposes. 

Another common situation is to shift a central database to its cloud counterpart, making for a fairly simple lift and shift, except for a distinct organization or department. Perhaps the Product Development team is shifting to NoSQL, while the rest of the organization remains on traditional relational databases – but everyone’s together in the cloud. 

The duplication and interim migrations can be most fraught with delay, disruption, error, and risk – and most obviously benefitted by a database DevOps platform like Liquibase. Yet Liquibase continues to empower automation, governance, and observability long after the cloud migration is complete. 

Without change management automation from the beginning, the entire migration is a frustrating game of catch-up to get everything in proper order. To deploy one big shift to the cloud and then go back and fix what inevitably goes wrong only leads to a tedious and drawn-out process and a major business problem of lost or corrupted data. 

Organizations often migrate databases to the cloud in preparation for major pipeline growth. So while a manual or home-built solution for migrating the current databases might seem approachable, the post-migration scale might be 10 times greater. One of the benefits of cloud platforms is their ability to instantly spin up new environments. Teams need to plan for a future in which migrations and change happen a lot more often. 

Automating the process with Liquibase takes a much safer, yet quicker and more streamlined, approach to the cloud transition, with consistency, control, and visibility throughout.

With integration across more than 60 database types, teams can handle the migration needs of the moment and be ready for whichever data store innovations are embraced in the pipeline next. 

Cloud migration DataOps

Liquibase enables an automated DataOps approach, empowering collaborative data management that improves the communication, integration, and automation of data flows across an organization. In the context of a cloud database migration, DataOps plays a critical role by ensuring that data is efficiently and reliably moved, integrated, and managed in the new cloud environment. This includes:

  • Streamlining the migration process through automated data pipelines, ensuring consistent and error-free data transfers
  • Enhancing cooperation between data engineers, data scientists, and other stakeholders to ensure smooth migration and integration
  • Continuously monitoring data quality and performance throughout the migration, ensuring that data integrity is maintained
  • Enabling scalable data operations to handle increasing data volumes and diverse data types in the cloud

By implementing DataOps with Liquibase, organizations can achieve more reliable, efficient, and agile data migrations, ultimately leading to better data management and analytics in the cloud environment.

Learn more about data pipeline change management.

Ongoing cloud database change management 

After the new cloud environments go live, the legacy database might stay live until the team is confident in the cloud solutions. Liquibase keeps it structurally maintained until then, ensuring an automatic backup in the early days of the cloud transition. 

Cloud database change management automation, governance, and observability maximize the value, safety, and scalability of cloud environments long after the migration phase. As application and data teams expand and evolve their use cases, Liquibase keeps cloud databases consistent, secure, and updated at the pace of the modern pipeline. 

Discover how Liquibase works and find out how to embrace database DevOps for streamlined migrations and change management in a cloud-native future. 

The DBA’s evolving role

Change management automation can eliminate the manual, toilsome change reviews DBAs spend their days slogging through, giving them space for new responsibilities and innovations. With a complete set of database DevOps capabilities at their disposal, DBAs can focus on the overall quality of changes, efficient scaling, and performance optimizations. 

Find out how DBAs embrace database DevOps automation, governance, and observability to elevate the value of their organization’s databases in the comprehensive guide: The Next-Gen DBA: How Database DevOps Automation Unlocks Efficiency & Innovation.