I have worked in software engineering for a long time, and one of the things I learned is that tight coupling is a great way to make quick progress, but not a great way to scale something and make it work at large scale. It also brings other challenges: changing code in more than one place, applying the same change over and over, and losing track of what changed and where.
If you look at other things that have been decoupled, it becomes obvious that decoupling is a recipe for scalability and manageability.
Virtualization - Decoupling the operating system from hardware resources
Virtualization is one of the biggest success stories in IT, and although VMware is the company most of us think of today, the history of virtualization goes back to the 1960s. The common theme is that multiple operating systems share the same hardware resources. This has a couple of great benefits:
- Less expensive - virtualization makes optimal use of hardware resources and leaves fewer of them idle
- Faster - it is hard to imagine that not so long ago we ordered a hardware server for every single operating system - a virtualized server can be set up in seconds
- Easier - no rack mounting, no ordering, no cabling
- Safer - when did you last apply updates without taking a snapshot first? And have you ever ordered a physical machine with too little CPU or RAM?
One of the greatest and most influential success stories is the rise of cloud computing. It would not have been possible without the concept of virtualization.
Software defined networking - decoupling the control plane from the data plane
Something that is in a hype cycle right now and builds on the success of virtualization is software defined networking. It is based on the same powerful concept: decoupling the network from the underlying networking hardware. By separating the control plane from the data plane, it enables network operators to set up their networks regardless of location and connected devices and to create a powerful network that routes packets reliably and quickly.
This helped network operators tremendously:
- They gain much more flexibility
- Changes can be made much faster
- The network becomes easier to modify
- Automation and orchestration become possible
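The separation behind these benefits can be sketched in a few lines of Python. This is a toy model with illustrative names, not any real SDN controller's API: the controller computes and installs forwarding rules centrally, while switches only look them up.

```python
# A minimal sketch of control/data plane separation (illustrative
# names, not a real SDN controller).

class Switch:
    """Data plane: forwards packets using a table it does not compute."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination prefix -> output port

    def forward(self, dst):
        # The switch makes no decisions; it only consults its table.
        return self.flow_table.get(dst, "drop")

class Controller:
    """Control plane: holds the global view and programs every switch."""
    def __init__(self, switches):
        self.switches = switches

    def install_route(self, switch_name, dst, out_port):
        # Policy changes happen here, centrally - not box by box.
        self.switches[switch_name].flow_table[dst] = out_port

switches = {"s1": Switch("s1"), "s2": Switch("s2")}
ctrl = Controller(switches)
ctrl.install_route("s1", "10.0.2.0/24", "port2")

print(switches["s1"].forward("10.0.2.0/24"))  # port2
print(switches["s2"].forward("10.0.2.0/24"))  # drop (no rule installed)
```

Because all forwarding decisions originate in one place, a change to the network is a change to the controller's logic - which is exactly what makes automation and orchestration possible.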
Modern virtualization and software defined networking go hand in hand - one would not be possible without the other.
Containers - decoupling applications from their environment
Containers, another success story following virtualization, are a very convenient way to decouple an application from the environment it runs in. This lets datacenter operators deploy software regardless of the underlying infrastructure: it abstracts the configuration and coupling of application components away from the people deploying and leaves them with a self-contained image that just works.
Once again, the benefits are huge: with a container orchestration stack, software deployment suddenly scales, can be largely automated, and supports on-the-fly updates, all while giving people a consistent environment and isolating concerns like networking, storage and virtualization from them.
Decoupling your security segmentation from your network architecture
We have seen that decoupling is a powerful concept and that abstraction enables us to do more with less. In the IT security space, however, and especially when it comes to segmenting our networks, we still live tightly coupled to the network infrastructure, while virtualization, cloud, software defined networking and other developments make IT more dynamic and complex than ever.
There are a couple of challenges when you build security segmentation on top of your network infrastructure:
- every change in segmentation is a network change and - surprise - can potentially break your network
- we often still write policy with good old IP addresses - there has to be a better way
- things are complicated and only get more complicated - changes take ages, and we still cannot be sure a change is correct (some of this is discussed here: https://www.linkedin.com/pulse/mythical-one-month-firewall-change-alexander-goller/)
- IT is more dynamic than ever - thanks to virtualization and virtual networking
- It is hard to automate
How about breaking segmentation and network architecture apart? What if we could segment without caring about the underlying network? What if policies were human readable, using metadata instead of IP addresses?
What if we could automate security segmentation and make it part of our normal application lifecycle? What if it didn't matter whether a workload lives for 365 days or just 365 seconds?
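The contrast between IP-based rules and metadata-based rules can be sketched in a few lines of Python. This is an illustrative toy model, not Illumio's actual policy engine; all names and labels are made up:

```python
# IP-based rule: breaks as soon as a workload's address changes.
ip_rules = {("10.0.1.15", "10.0.2.7", 5432)}  # (src_ip, dst_ip, port)

def ip_allowed(src_ip, dst_ip, port):
    return (src_ip, dst_ip, port) in ip_rules

# Label-based rule: human-readable metadata, independent of where
# (and how long) a workload happens to run.
label_rules = [
    # (required source labels, required destination labels, port)
    ({"app": "web", "env": "prod"}, {"app": "db", "env": "prod"}, 5432),
]

def label_allowed(src_labels, dst_labels, port):
    # A rule matches if its labels are a subset of the workload's labels.
    return any(
        rule_port == port
        and rule_src.items() <= src_labels.items()
        and rule_dst.items() <= dst_labels.items()
        for rule_src, rule_dst, rule_port in label_rules
    )

# The same policy keeps working when IPs change: only labels matter.
web = {"app": "web", "env": "prod", "ip": "10.0.9.42"}
db = {"app": "db", "env": "prod", "ip": "10.0.3.8"}
print(label_allowed(web, db, 5432))  # True
print(label_allowed({"app": "web", "env": "dev"}, db, 5432))  # False
```

Notice that the label-based policy reads like the intent ("prod web may talk to prod db on 5432") and never needs rewriting when a workload moves or is replaced.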
Illumio does just that - decoupling your security segmentation from your underlying network.