Don’t let the database derail your containerisation cloud journey

Anil Kumar

Containerisation is potentially the biggest milestone of the ongoing cloud revolution. However, there are a few obstacles still standing in the way. Anil Kumar, the Director of Product Management at Couchbase, discusses the obstacles and explains why revolutions will only benefit the business if the entire organisation is brought along.

The goals of a successful DevOps team are no secret: increase agility and innovation, improve collaboration, and deliver products faster to market – meaning any technology that supports this will be rapidly adopted.

Containerisation is a natural step on this journey: increasing the scalability and dynamism of the cloud to develop and update applications faster, meet the ever-increasing demands placed on DevOps teams, and ultimately deliver better customer experience. However, there are a few obstacles still standing in the way.

Getting left behind

As the next evolution of the cloud, containerisation continues the trend of offering DevOps teams new ways not only to do their core work, but also to deliver the applications they build. Yet it often moves too fast for other technologies to keep up. For instance, while developers have embraced containers to make it easier to create cloud-native applications, the databases powering those applications haven't always followed suit. Legacy databases, while perfectly suited to their original tasks, have traditionally been tied to a single bare-metal or virtual machine instance. They simply weren't designed to support applications in a containerised, highly distributed, instantly scalable cloud.

Businesses that insist on using legacy databases to support containerised infrastructure and development will nearly always encounter three key problems:

  • High operational costs – manually deploying and managing hundreds of database clusters across multiple geographies increases cost, effort, and complexity
  • Vendor lock-in – without standards that let data move freely and safely between cloud providers, it is difficult to switch providers quickly or work with several at once
  • Delayed time to market – teams building applications on a microservices architecture struggle to manage and scale database clusters in siloed systems, extending development times and making it harder to support those applications

SEE ALSO: For security hygiene, scan your containers in build time

Going cloud native

Most organisations are now working out how to move some, if not all, of their core applications to the cloud, and solving these data challenges with containers is critical if those projects are to succeed. As well as the immediate benefits for DevOps teams – such as being able to quickly create test environments to fine-tune new features and applications – cloud-native applications are simply more flexible and better orchestrated. The ability to create applications that can quickly migrate to more cost-effective infrastructure or meet compliance demands, that can be recovered rapidly after a disaster, and that can scale at peak times – major events for travel and ticketing companies, or peak shopping periods for retailers – is critical to the modern business.

An organisation that sticks with legacy technology that cannot support these changes is likely to stagnate and ultimately be left behind by more agile competitors. To survive, most organisations will have to embrace technology that can support cloud-native applications across the entire stack.

Tackling misconceptions

Many organisations have delayed adopting technology that can support the next generation of the cloud for a variety of reasons: their database vendor doesn't support, endorse, or allow it; database administrators don't trust it; the storage is volatile; it's faster and more stable on physical or dedicated virtual nodes; and so on. These are common, and not irrational, objections to containerising databases, and they have led many organisations to stick with their legacy databases for longer than they needed to – after all, why upgrade if the upgrade won't meet your needs either?

Fortunately, there is an answer that can help organisations support the next generation of cloud and allow containers to operate in tandem with a modern database. Container orchestration systems increasingly support containers running distributed databases. For example, DevOps teams can run their database as a fully managed stateful application alongside microservices applications on a single Kubernetes platform. Because Kubernetes simplifies managing and scaling the database, teams can much more quickly put in place the right database capability to support new applications. They can then concentrate on developing applications that take full advantage of the database's capabilities, knowing that this capability will be there when the application is rolled out into full production.
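To make the idea concrete, here is a minimal sketch of what running a database as a stateful application on Kubernetes can look like. All names and the image are placeholders, not any specific vendor's product – real deployments would typically use a vendor-provided operator rather than a hand-written manifest:

```yaml
# Hypothetical example: a StatefulSet gives database pods stable
# network identities and per-replica persistent storage.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: example-db
spec:
  serviceName: example-db        # headless Service providing stable DNS names
  replicas: 3
  selector:
    matchLabels:
      app: example-db
  template:
    metadata:
      labels:
        app: example-db
    spec:
      containers:
      - name: db
        image: example-db:7.0    # placeholder image, not a real product
        ports:
        - containerPort: 5432
        volumeMounts:
        - name: data
          mountPath: /var/lib/db
  volumeClaimTemplates:          # each replica gets its own PersistentVolume
  - metadata:
      name: data
    spec:
      accessModes: ["ReadWriteOnce"]
      resources:
        requests:
          storage: 10Gi
```

The key difference from a stateless Deployment is the `volumeClaimTemplates` section: each database replica keeps its own durable volume, so data survives pod rescheduling.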

SEE ALSO: “We’ll see an increase in enterprises taking advantage of containers in a multi-cloud architecture”

Similarly, by working through the same orchestration system as containers, such databases can be more easily duplicated and distributed to support cloud-native applications, instead of being rooted to the spot. Organisations can also take advantage of pre-programmed responses that can be built into systems such as Kubernetes: for instance, setting environments to scale up and down automatically when certain criteria are met, or a self-healing mechanism to recover from failures.
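Both of those pre-programmed responses can be expressed declaratively in Kubernetes. The following is an illustrative sketch with placeholder names and thresholds: a HorizontalPodAutoscaler for the scaling criteria, and a liveness probe for self-healing:

```yaml
# Hypothetical example: scale a workload between 2 and 10 replicas
# based on average CPU utilisation.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: example-app
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: example-app
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70
---
# Self-healing: a liveness probe makes Kubernetes restart the
# container automatically if its health check keeps failing.
apiVersion: v1
kind: Pod
metadata:
  name: example-app-pod
spec:
  containers:
  - name: app
    image: example-app:1.0       # placeholder image
    livenessProbe:
      httpGet:
        path: /healthz           # assumed health endpoint
        port: 8080
      initialDelaySeconds: 10
      periodSeconds: 5
```

In both cases the operator describes the desired behaviour once, and the orchestrator enforces it continuously – no human needs to react to a traffic spike or a crashed process.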

Weapon of choice

Containerisation is potentially the biggest milestone of the ongoing cloud revolution. At the same time, it lays bare an important point: revolutions will only benefit the business if the entire organisation is brought along. In the world of IT, this means not only rushing to adopt containers and cloud-native applications but ensuring that critical infrastructure such as databases can be brought along for the ride. Organisations that can adapt, stay ahead of the curve, and effectively future-proof their business will be at a huge advantage: their DevOps teams can concentrate on improving the business, instead of firefighting or doing their jobs with one hand tied behind their backs.

Author

Anil Kumar

Anil Kumar is the Director of Product Management at Couchbase. Anil's career spans more than 15 years building software products across various domains including enterprise software, mobile services, and voice and video services. He is a hands-on product leader responsible for Couchbase Server, Couchbase Cloud, and Kubernetes product lines, including evangelising the product strategy and vision with customers, partners, developers, and analysts. Prior to joining Couchbase, Anil spent several years at Microsoft Redmond in the Entertainment division and, most recently, in the Windows and Windows Live division. Anil holds a master's degree in computer science from the University of Toronto (Canada) and a bachelor's in information technology from Visvesvaraya Technological University (India).

