
Containers?!
01/16/18 • 8 min
Today on the show: why containers? Where do they come from, and which problems do they solve?
Episode transcript:
Prologue
Hi, Karl here. Let me tell you a story from a couple of years back.
Imagine a team of quite stressed out developers. This team at Nokia Research Center had been preparing for a Demoday, to showcase their new applications to an excited audience. Luckily the team had already finished building their application -- or so they thought.
During the evening before the Demoday, they started to prepare the application to be showcased in the demo. This meant moving the application into a server that was located on the second floor of the office. Yet, the size of the application was huge, so the file transfer took all night.
In the morning, half an hour before the demo, the project manager asked for a small change to the application: could the developers change the color of one of the buttons from blue to green. This wasn't a hard task: The developer was able to make the change in a minute, and he could show the result on his computer to the project manager.
But how could they make the change apply to the server? They had no other solution than to grab a USB stick and start running...
These types of problems could be solved with a technology called -- containers :)
Introduction
Hi, and welcome to Cloud Gossip. I'm Annie, and I'm a cloud marketing expert and a startup coach. Hey, my name is Teemu. I'm a cloud developer, a DevOps trainer, and an international speaker. And I'm Karl, and I'm a cloud & security consultant for enterprise customers; I also moonlight as an international speaker.
Today on the show: why containers? -- Where do they come from, and which problems do they solve? And by the way, no worries if you didn't understand all of the terms used in the beginning; that is why this podcast exists. Glad to have you with us! This podcast is part of a 4-part series, which you can find on Apple Podcasts, Android podcast apps, or on our website CloudGossip.net.
Terminology
Okay, so in the intro we highlighted some problems of software development. Now -- we will do a rundown of the terminology and the history leading to containers. Real life is more complicated and has more layers to it, but here we have tried to simplify and find the best definitions and examples to get you started and help you grasp the basics.
Let's talk about the application development process, which is essentially the process of how applications are built and made available to users. The process starts with developers building applications on their own computers. And finally, when applications are finished, they are moved to the servers.
We call this deploying to production, which is a fancy name for essentially releasing an application. The biggest difference between the development and production phases is that in the latter, the application runs continuously on the server to serve a lot of people -- not just the developer.
So, what are servers? They are expensive computers that are specially made to serve thousands of users at the same time and are never meant to be powered off, whereas regular computers are made for personal use and are normally turned off after use.
As an example, a regular computer might store your holiday pictures, your favorite games or you might browse Facebook with it.
Servers are the infrastructure that all internet services run on top of, like a house is built on a foundation. Servers typically house software that thousands of users can use at the same time. For example, Facebook itself, or any of Google's sites are housed on servers.
Hey, did you know?! Previously, we had servers so big, that they filled entire rooms. They would also cost a lot of money, in the realm of hundreds of thousands of euros.
Servers have evolved over the years to be smaller and nowadays you can fit them under your desk. This is very much the same process as what happened with mobile phones; evolving from old and clunky phones, into the small smartphones we currently use.
Let's switch gears and talk about operating systems. On both regular computers and servers, we have an operating system, otherwise known as the OS. The OS is a collection of software that sits between the computer's hardware and its applications, and it makes them all work together.
For example, a developer's computer might have the macOS operating system, and the server might have the Windows Server operating system. If an application has been built on top of one operating system and is then placed on a server with a different operating system, things can get a bit messy.
Why does this happen, you might ask? Well, if an application has been built and used on one system, it might not function properly in a new environment – the same way an athlete who trains in a high-altitude environment, ...
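One concrete way this mismatch shows up is in file paths. As a minimal sketch (the application and the paths here are purely hypothetical), an app built on macOS might hard-code a Unix-style path that simply doesn't exist on a Windows server:

```python
import platform

# Hypothetical app built and tested only on macOS:
# it assumes a Unix-style log path that Windows doesn't have.
MAC_LOG_PATH = "/var/log/myapp.log"

def default_log_path() -> str:
    """Pick a log path based on the operating system we're running on."""
    if platform.system() == "Windows":
        # Windows uses drive letters and backslashes instead.
        return r"C:\ProgramData\myapp\myapp.log"
    return MAC_LOG_PATH

print(default_log_path())
```

Containers sidestep this whole class of problem by shipping the application together with the environment it was built for, so the code never has to guess which system it landed on.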
Next Episode

Hyperscale Datacenters
Today on the show - hyperscale datacenters. After this episode, you'll know what they are, what makes them special, and why they're important for the cloud.
Episode transcript:
Prologue
As use of computers grew rapidly in the 1990s, so did the need for servers and datacenters. Back in the day, network connections were slow and expensive. Therefore, the datacenters had to be built close to the companies and users using them. Usually that meant building the datacenters into the office building’s basement.
There was this Nordic company and their business model heavily relied on using a lot of servers. So naturally, they also had to have quite a massive basement. This essentially meant the basement was business-critical for them. If the computers were to be harmed, the company would lose their reputation, business, everything. The office was in an area with low natural disaster risks. For example, there had been no recorded earthquakes in modern history.
However, the basement of this company's office was flooded a few years ago. This wasn’t just an inconvenience for the office workers. The flooding was a serious threat to the future of the company, as the server room was completely flooded. As everyone knows, computers and water don't mix well. The situation seemed dire: the company could lose all their data, and their business could go under. At this darkest of hours, the friendly neighborhood sysadmin jumped in and saved the day by swimming to the servers and rescuing them.
In the end it affected their business, but they avoided a catastrophe. So how could this situation have been avoided? That's what we're discussing in today's episode: hyperscale datacenters.
Introduction
Hi, and welcome to Cloud Gossip. I'm Annie, and I'm a cloud marketing expert and a startup coach. Hey, my name is Teemu. I'm a cloud developer, a DevOps trainer, and an international speaker. And I'm Karl, and I'm a cloud & security consultant for enterprise customers; I also moonlight as an international speaker. Today on the show - hyperscale datacenters. After this episode, you'll know what they are, what makes them special, and why they're important for the cloud. This podcast is part of a 4-part series, which you can find on Apple Podcasts, Android podcast apps, or on our website CloudGossip.net.
History of datacenters
Hi, this is Karl again. So, what is the cloud? The cloud - as we know it - is a network of modern, hyperscale datacenters. These hyperscale datacenters of today are different from the datacenters we've had previously. Let's look at the history of datacenters leading up to the cloud. Before modern hyperscale datacenters, we used a single server at a time.
The first datacenters actually had only a single server, which filled the whole room. As technology progressed, server sizes came down and we started to have datacenters: multiple servers connected to each other.
The idea was that pretty much every company with computing needs would build their own datacenter. A datacenter is a purpose-built space that hosts multiple servers and takes care of all their needs, such as electricity, heating, ventilation, air conditioning, and networking.
As all the companies were building their own datacenters, they had to maintain the physical security. This meant installing locks, keycard readers or any other security measures that the customers required. The physical location had to be carefully picked and deals with energy providers had to be made.
When companies were running their own datacenters, it was a big deal that they were responsible for building, installing, updating, and end-of-lifing all the servers in their use. End-of-lifing means that when a physical server is so old that it's no longer feasible to replace the broken parts, and it's cheaper to buy a new server, the old server is disposed of in a secure way.
The hard drives are wiped clean in a secure way, so that there's no way that somebody could recover our data from them. After that they are physically destroyed.
When servers eventually suffered hardware failures, they would be out of use. This is called an outage. Preparing for outages meant stocking spare parts: the datacenter owner had to purchase enough spares for their own use, or make sure they could get the needed parts quickly.
These tasks of running a datacenter required a lot of personnel. Once up and running, a typical datacenter could have one administrator per two dozen servers. A typical midsize company could easily have 1000 servers in their datacenter. This meant having over 40 people on payroll just to keep the lights on and servers running.
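The staffing figure above is simple arithmetic. As a quick sketch using the rough numbers from the episode (one admin per two dozen servers, a thousand servers):

```python
import math

servers = 1000           # servers in a typical midsize company's datacenter
servers_per_admin = 24   # roughly one administrator per two dozen servers

admins_needed = math.ceil(servers / servers_per_admin)
print(admins_needed)     # 42 -- i.e. "over 40 people on payroll"
```

Rounding up matters here: even a handful of leftover servers still needs someone responsible for them.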
Problems with traditional datacenters
Hi, it’s Annie again. Running their own datacenter caused a lot of headache to the companies. A major problem was outages. When an outage ...