Decomposition is the answer to challenges around the speed, agility, stability, security, and reliability of e-commerce IT systems. DevOps best practices and the way we use the cloud are worth considering if you aim for success. Let’s have a look at how DevOps should be understood, whether the public cloud is safe, and whether migrating the whole system there is the only way to improve.
Decomposing for improvement
It’s impossible to reach the level of flexibility and speed required by today’s standards with a monolithic system. There are too many people working on the same codebase, sharing the same resources and struggling with the same problems.
Organizations need to create a layer of separation that lets their teams work independently, at their own pace and within the frameworks they feel most comfortable with. The solution is to build microservices that are simple, yet fast and reliable.
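As a minimal sketch of what such a single-purpose service can look like, here is a tiny HTTP service built with nothing but the Python standard library. The “pricing” name and the /health endpoint are illustrative assumptions on our part, not something any particular framework prescribes:

```python
# A minimal, single-purpose "microservice" sketch using only the
# Python standard library. The service name and endpoint are
# illustrative placeholders.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PricingHandler(BaseHTTPRequestHandler):
    """One team, one responsibility: a tiny 'pricing' service."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Keep the demo quiet; a real service would log structurally.
        pass

def make_server(port: int = 0) -> HTTPServer:
    """Bind the service locally; port 0 lets the OS pick a free port."""
    return HTTPServer(("127.0.0.1", port), PricingHandler)
```

A service this small is trivial to understand, deploy and replace, which is exactly the point of decomposition; the operational cost of running dozens of such services is the trade-off the next paragraph describes.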
When we decompose a monolithic system, we shift its complexity from software to operations. The software itself becomes simpler, but the infrastructure becomes more complex. How can we tackle this issue? DevOps is the answer.
What exactly is DevOps?
Let me tell you what it definitely is not – a job title. We’ve seen it way too many times: “Looking for DevOps people”, “Our DevOps team”, “The DevOps guys can take care of it”. IT recruiters have twisted the term even further, and these days it has become just a buzzword.
What is DevOps, then? We would say it’s a mindset. It is about taking all the great practices developers have used for years and applying them to Operations. But it is also about developers keeping Operations in mind when building their apps. No more tossing software over the fence and saying: “Works in Dev, it’s Ops’ problem now”. To succeed, Developers and Operations need to work hand in hand and draw on each other’s experience. That, however, requires a specific team structure and a different way of thinking on both sides.
Over the years, engineers who live by DevOps principles have come up with many new tools that help manage and operate complex systems in a way that ensures security and reliability without compromising equally important qualities such as flexibility and speed.
You can spin up a whole Virtual Private Cloud (VPC) in a matter of minutes using tools like Terraform and Helm. What’s more, you can be certain that it works just as you expect it to. However, those tools alone aren’t enough – you need infrastructure to run your VPC on. Don’t be fooled: even serverless apps need servers. We’d like to focus on public clouds, with an emphasis on GCP. Why? Let’s explain briefly.
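To give a feel for what this looks like in practice, here is a minimal Terraform sketch of a GCP VPC with a single subnet. The project ID, resource names and CIDR range are made-up placeholders, and a real setup would add firewall rules, state backends and more – treat this as an illustration, not a production configuration:

```hcl
terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
  }
}

provider "google" {
  project = "my-project-id" # placeholder project ID
  region  = "europe-west1"
}

# A custom-mode VPC network (no auto-created subnets).
resource "google_compute_network" "vpc" {
  name                    = "demo-vpc"
  auto_create_subnetworks = false
}

# One explicitly defined subnet inside that network.
resource "google_compute_subnetwork" "subnet" {
  name          = "demo-subnet"
  ip_cidr_range = "10.0.0.0/24" # placeholder range
  region        = "europe-west1"
  network       = google_compute_network.vpc.id
}
```

Because the whole network is described declaratively, `terraform apply` can recreate it identically every time – which is why you can be certain it works just as you expect it to.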
What is a public cloud?
Nowadays we time-share everything from scooters, cars and offices to apartments. We don’t need an electric scooter every day; we just want to get from A to B quickly. We don’t want to think about maintaining the scooter, or about getting it back to where our journey started.
So, what is a public cloud? Think of it as renting a data center by the minute. A highly advanced, super-efficient, top-of-the-line data center that can accommodate your systems. And it is much more than the underlying infrastructure of servers, switches, load balancers, power supplies and so on.
All major public cloud providers offer a ton of services – from managed database engines to pre-trained AI models. You have to know which ones can help you build better, faster and more resilient systems. And the best part: you can use them right now and for as long as you’d like. You don’t have to invest in your own infrastructure or buy hardware – you pay only for the computing power you actually use.
Is public cloud safe?
How “public” is the public cloud? From a physical point of view, you can’t really get in without a tank (please refer to the video). There are multiple layers of security, access verification and a professional process for disposing of old hardware.
What about your data? That’s a different story. It is as safe as you make it. All major cloud providers are GDPR-compliant and offer a wide range of solutions to protect your systems, such as at-rest data encryption, VPN connections, identity management and so on. It is your responsibility to use these tools wisely. Fortunately, they come pre-configured with best practices in mind, so you would have to deliberately change them to lower their level of safety.
After you decompose your system, you might be hesitant to migrate every part of it to the cloud. Luckily, you don’t have to. You can design your new architecture to run some parts on your on-prem infrastructure, some microservices in Azure and some in GCP. All parts of your complex system can connect safely using VPNs or other secure connections.
Keep in mind, though, that this adds another layer of complexity: you are now managing multiple cloud providers and multiple VPCs. We’re not saying it’s a showstopper.
Using a hybrid cloud can:
- ease your way into the public cloud
- help you use different services from each cloud provider
- save you from vendor lock-in
Benefits of cloud
The benefits of using the cloud are described in two recent publications.
In their 2019 Accelerate State of DevOps Report, Google Cloud and DORA (DevOps Research and Assessment) focus on different aspects of performance, sharing new insights, implementation best practices and the advantages of embracing them:
“In this year’s report, the retail industry saw better performance both in terms of speed and stability. (…) Organizations of all types and sizes, including highly regulated industries such as financial services, government and retail, can achieve high levels of performance by adopting DevOps practices.”
A month after Google announced the above-mentioned publication, Puppet released their 2019 State of DevOps Report. Puppet focuses more on the security side of things:
“The DevOps principles that drive good outcomes for software development — culture, automation, measurement and sharing — are the same principles that drive good security outcomes. Reliability, predictability, measurability and observability in your deployments create not just intrinsically more secure environments, but also, when combined with a strong automation practice, enable speed of response to security issues as they arise.”
Speed, stability and security are examples of qualities which, together with agility and reliability, can be greatly improved by decomposing your current monolithic system, adopting DevOps practices and using public clouds.
The biggest concerns we encounter regarding cloud migration often stem from the idea that you have to migrate your whole business and all its components at the same time. Once you understand that decomposition is the way to go, the whole process becomes far less complicated and easier to plan and execute. So take advantage of public clouds and DevOps practices – and succeed!