Hi there, fellas!
Welcome to my first blog post about Docker, and the love affair I share with it. But first, a brief introduction to who I am and what I do. My name is Martin, and I’m a Partner and Software Architect at Translucent ApS. We’re located in Denmark, but like many Atlassian Experts, we first and foremost work internationally. We are a certified Atlassian Platinum Solution Partner, and we help our customers optimize and customize their use of the Atlassian ALM suite to take their business to a whole new level - process- and technology-wise.
Personally, I’m spearheading the implementation and adoption of Docker within Translucent. We think Docker shows a lot of promise in helping organizations deploy applications in their IT infrastructure predictably and quickly, and this basically helps us help our customers develop better solutions and healthier cultures across software and operations.
We plan to write a handful of articles about Docker and how we’re integrating this technology into our daily workflows. I will personally focus on describing how I use it in our development and test cycles, whereas some of my colleagues will write about Docker and DevOps in a broader business context.
So What Is Docker?
At first, I explained Docker as lightweight virtual machines, which I still believe is an appropriate brief introduction. I find it adequate because I often see virtual machines provisioned to run some piece of software contained and separate from the rest of the system… which is exactly what Docker containers do well, and with far less overhead than ordinary VMs. Docker containers boot in a matter of seconds, whereas classic virtual machines often measure their boot time in minutes - or worse. A better explanation yet, in my opinion, is that a Docker container is just like a user-mode Linux instance running your software. This represents what Docker does more closely than the virtual machine metaphor. However, introducing terms like user-mode requires some deeper knowledge of Linux and operating systems, which is perhaps not ideal.
My own description is that Docker provides a way of running processes strongly isolated from other processes on the same operating system. And it’s #awesome!
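To make that isolation concrete, here’s a small sketch (assuming you have Docker installed and can pull the public alpine image): a process inside a container only sees its own little world.

```shell
# List the processes visible from inside a fresh container.
# From the container's point of view, the world contains little more
# than the single command we asked it to run.
docker run --rm alpine ps aux

# Compare with `ps aux` on the host, which lists every process on the
# machine. The container gets its own process namespace, network stack,
# and filesystem, yet shares the host kernel - hence the fast startup.
```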
How I Got Started
My first experience with Docker (and what quickly turned it into a love affair) came from the olden times, when administering my own server was sometimes hard, with failed compilations and installations of software as a result (a developer who can’t compile and install software; that doesn’t seem right). And even when a compilation succeeded, the installation process would often leave various artifacts scattered throughout the filesystem, which was really just depressing.
As if that wasn’t enough, removing an installed application would sometimes leave artifacts lying around the filesystem, which felt like nails pressing into my brain. I need it to be clean and nice, you know?
This is when I found out about the containerization technology called Docker. And oh my: a way to temporarily test out new software that would leave your system exactly as it was afterwards (heart). It also gave me the opportunity to purge the software and have the system return to the same state as before the application was installed and started.
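In Docker terms, that try-it-and-throw-it-away workflow looks roughly like this sketch (nginx stands in here for whatever software you want to evaluate):

```shell
# Start a container; --rm tells Docker to delete the container (and its
# writable layer) the moment it stops.
docker run --rm -d --name trial -p 8080:80 nginx

# Poke at it for a while...
curl http://localhost:8080/

# ...then stop it. The container is gone and the host filesystem is untouched.
docker stop trial

# If you also want the downloaded image purged, remove it explicitly:
docker rmi nginx
```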
In my experience, Docker has also made it easy to run multiple instances of a software product on the same system, something that otherwise often requires significant work. For example, running several versions of PostgreSQL - or MySQL, for that matter - side by side.
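Side-by-side PostgreSQL versions, for instance, come down to giving each container its own host port and data volume. A sketch using the official postgres images (version tags and the password are illustrative):

```shell
# One PostgreSQL major version on host port 5432...
docker run -d --name pg13 -e POSTGRES_PASSWORD=secret \
  -p 5432:5432 -v pg13-data:/var/lib/postgresql/data postgres:13

# ...and another on host port 5433, completely independent of the first.
docker run -d --name pg16 -e POSTGRES_PASSWORD=secret \
  -p 5433:5432 -v pg16-data:/var/lib/postgresql/data postgres:16

# Each instance has its own data directory, configuration, and lifecycle;
# no conflicting packages or init scripts on the host.
```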
I was personally very delighted when trying out some new media server software for my server at home with Docker. Docker made it a breeze, since the software already had Docker images waiting for me to download and run. That was how I got to test and evaluate four different media servers in a single afternoon - and at the same time keep a nice filesystem not overflowing with mysterious files and folders.
My First Docker Images
Obviously, not all software was pre-packaged in containers for me to just pull from the heavens and use, and having experienced the bliss of less work and ease of mind, I found that some of the missing software needed my Docker tender love and care.
For example, I needed a place to dump my notes, thoughts, etc. so they wouldn’t all end up being zapped as unnecessary information by my brain. I gave it some thought, and obviously the choice was the excellent Confluence wiki from Atlassian. But alas, there was no Docker image available.
At the same time, I was experimenting with task tracking software to manage my daily, monthly, and quarterly obligations and tasks, to see if it could help me manage my day better - you know, freeing up more time to really geek out, and providing a better overview of the things I needed to do. The best tool for tracking issues, to me, is the brilliant Jira (again from Atlassian). However, no Docker image was available (nor did I look that hard).
So to help others as they have helped me, and to get more experience developing, testing, and integrating with Docker, I set out to make my own Docker images for both Atlassian Confluence and Atlassian Jira. The images have been out there for quite a while, and it looks like I successfully provided value back to the community.
At the time of writing, the Confluence image has more than 283,100 downloads. That’s hundreds of thousands of downloads. The image containing Atlassian Jira isn’t having that bad a time either, with a little more than 63,000 downloads and 110 stars. It would seem people have found my work useful, and perhaps like the products as much as I do.
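If you want to try an image like these yourself, the usual pattern is a couple of one-liners. A sketch - the image name below is a placeholder, not necessarily the actual repository name on Docker Hub, and the port is the Confluence default:

```shell
# Pull and start a Confluence container in the background
# (replace the placeholder with the real repository name from Docker Hub).
docker run -d --name confluence -p 8090:8090 some-user/atlassian-confluence

# Follow the application log while it boots:
docker logs -f confluence
```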
Where’s the Value?
Since we are a company that works extensively with Atlassian products - and our customers run many different versions of the Atlassian software - being able to quickly test something in a specific version of a product is very important to us. But even with the excellent installer packages and the Atlassian SDK, this proved to be a tedious process.
The Atlassian Plugin SDK is great, and it enables a user to run almost any version of the products on demand. But when you’re anything other than a developer and you need to check something in multiple versions of a certain product, Docker wins hands down. The simplicity of starting containers for exactly the application versions you need, and afterwards simply discarding them en masse, really is a huge win. Docker also excels when you’re utilizing, developing, or testing a software system composed of multiple software products (think microservices). From my perspective - the developer’s perspective - Docker makes it very easy to spin up your own development environment containing the software you’re working on, using prepackaged containers, freeing you from having to mess with binary packages, package managers, or compiling from scratch (oh, the horror!).
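The "discard en masse" part is easy to sketch with labels; the label name, container names, and image tags below are all illustrative:

```shell
# Start a small throwaway environment, tagging every container with a
# common label so the group can be managed as one unit:
docker run -d --label proj=demo --name demo-db postgres:16
docker run -d --label proj=demo --name demo-cache redis:7

# ...test whatever you need to test...

# Then throw the whole group away in one go:
docker rm -f $(docker ps -aq --filter label=proj=demo)
```

Filtering on a shared label means the cleanup line keeps working no matter how many containers the environment grows to include.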
How Translucent ApS Can Help You
Translucent is the leading Atlassian Platinum Solution Partner in Denmark, with more than 11 years of experience, a broad array of competencies, and an extensive list of long-term customer relationships in Denmark, Scandinavia, and internationally. We can help you leverage the power of the Atlassian ALM tools and Docker.
We offer services within:
- Strategic business consulting (products, projects, portfolios, processes, ITSM, ITIL, Agile, change management, and more)
- Continuous Delivery, Continuous Integration and DevOps
- Bespoke systems and add-on development (like Infinidex and Crypto)
- System maintenance
- Integration between - and with - the Atlassian ALM toolchain products and your systems and infrastructure