
Containerization: What you need to know

Robert Ross

Running software in containers is catching on, and nearly half of IT leaders plan to deploy containers in production. Containers are extremely useful and can solve many problems associated with multiple environments, but they have their limitations. What do you need to know about containerization and its pros versus cons before you begin?

The pace of technology change tends to leave tried-and-true processes in the dust. And so it goes with software development. Running software in containers is catching on, and a primary reason is that code developed in one environment may not run the same way when it is deployed in another. This leads to errors, which take time to fix. In this era of rapid iteration and release, no one can afford that. So containerization ecosystems like Docker and Kubernetes keep gaining in popularity. The 2018 Container Adoption Benchmark Survey reveals that nearly half (47%) of surveyed IT leaders plan to deploy containers in production, while another 12% say they already have.

The advantages of containers

By changing the way software is delivered, containers are making developers’ lives easier. They hold a great deal of promise, particularly in terms of increasing developer speed and efficiency across hybrid infrastructures. A container is typically a collection of software components and their environment, packaged to run together as a coherent system. Developers build these systems as container images, test them, and verify they behave as expected. They then deploy the images to larger environments, where the container platform instantiates identical replicas from each image, ensuring the same software runs everywhere. That is the core value of containerization: it enables repeatable deployments of identical software.
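As a minimal sketch of this image-based workflow (the base image, file names, and application here are hypothetical, not from the article), a container image is defined by a short recipe such as a Dockerfile:

```dockerfile
# Hypothetical example: everything the application needs is baked into the image.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself.
COPY . .

# The same command runs wherever the image is instantiated.
CMD ["python", "app.py"]
```

Because the image bundles the code, its dependencies, and its runtime configuration, every replica instantiated from it is byte-for-byte the same software.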

SEE ALSO: What is Kubernetes and how does it relate to Docker?

Because they don’t include full operating system images, containers require fewer system resources than bare-metal or traditional virtual machine (VM) environments. With VMs, developers may need to buy more hardware because they reach capacity more quickly. Workloads can certainly run in VMs, but containers scale more gracefully as cloud computing moves from simple architectures to complex, distributed ones.

Containers make software delivery simpler and more predictable, because they provide a consistent deployment environment that can be used at all stages of the delivery pipeline. Applications running in containers can be deployed easily to multiple, different container platforms and cloud providers. Whether you’re building your software, testing your software or deploying software in production, you can use the same environment to host the software. Containers also can help enterprises modernize legacy applications and create new cloud-native applications that are both scalable and agile.
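The “build once, deploy everywhere” pipeline described above can be sketched with the standard Docker CLI (the image name, registry, and test script here are assumptions for illustration):

```shell
# Build the image once, from the tested source tree.
docker build -t registry.example.com/myapp:1.4.2 .

# Run the exact same image locally for testing...
docker run --rm registry.example.com/myapp:1.4.2 ./run-tests.sh

# ...then push it unchanged to the registry that staging and production pull from.
docker push registry.example.com/myapp:1.4.2
```

The key design point is that nothing is rebuilt between stages: build, test, and production all host the identical image.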

Pluses and minuses

Containers are extremely useful, but they have their limitations. For example, they do eliminate some concerns about how differences between your development environment and your production environment will affect your application. But containers aren’t totally immune to the bugs and errors that plague traditional software development. The fact that flaws, outages and security incidents still occur is proof that testing tools don’t catch 100% of issues.

There are at least 30 vulnerabilities in each of the top 10 most popular Docker images, according to a recent report by Snyk. On top of that, if you build a container from an older version of an application, there’s a high likelihood that it will contain vulnerabilities. And that means your organization is still at risk for potential system outages and downtime that can cause significant economic and reputational impact.
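Vulnerabilities like the ones Snyk reports are typically found by scanning images before they ship. As a hedged illustration (the image names are hypothetical), a scan and the usual remediation look like:

```shell
# Scan a local image for known vulnerabilities in its OS packages
# and application dependencies.
snyk container test myorg/myapp:1.4.2

# Rebuilding on a newer, patched base image often removes most findings.
docker pull python:3.11-slim
docker build -t myorg/myapp:1.4.3 .
```

Because every replica is identical, one patched rebuild fixes the whole fleet; equally, one unpatched image exposes it.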

The Ponemon Institute Cost of a Data Breach Study 2018 found that an hour of disruption can cost a small company $8,000, a medium company $74,000, and larger enterprises roughly $700,000. Maintaining consistent service across mixed-and-matched software has long been a challenge in IT, and that is exactly what containers solve. The flip side is that if someone creates an exploit that works against one container, identical software is now running everywhere – and the exploit will work against all those containers.

Avoiding pitfalls

Instead of merely putting applications in containers and never looking back, developers need a new approach to testing to help ward off these potential problems. Quality assurance (QA) teams need to make sure they test containerized apps under all of the circumstances that might be present in production. That’s because containers can behave differently due to variables ranging from system hardware to unexpected network traffic. By testing under production conditions, bugs are detected before they go live, and threats are isolated before they have an impact.

SEE ALSO: Enhance your Docker usage: Launch build containers with Floki

Proceed with caution

Containers provide a tremendous solution to the problems created by the multiple environments developers must deal with. They make software development and testing easier. However, they have disadvantages that need to be addressed, so they must be handled with care. Bugs and errors can still occur, and a vulnerability in one container means a vulnerability in all of them. Testing under production conditions will catch unwanted anomalies before damage is done. Containers will help you release trouble-free software that benefits your customers – but only if you use them wisely.

Author

Robert Ross

Robert Ross is the founder and CTO of Curtail, Inc. Curtail is changing how IT is implemented for service providers, government agencies and enterprise organizations that are developing and launching new software and services, particularly in DevOps environments. Prior to Curtail, Ross served as CTO of Translattice, as a research scientist at McAfee and as a developer at eEye Digital Security. He also developed deception-based systems and high-speed network intrusion detection systems at Recourse Technologies, which was acquired by Symantec. He holds more than 15 patents in computer security, database and distributed systems technologies.