The Thoughtworks Technology Radar is now in its 10th year, with the 22nd edition of the Radar coming soon. The Radar tries to highlight and give advice on languages, tools, platforms and techniques that are important in today’s IT industry. The Radar, however, is just a snapshot in time — we include ‘blips’ that are relevant for each publication, and often items that are still useful will fade from the Radar as we make room for more up-to-date information.
As we’re celebrating the first 10 years of the Radar, we surveyed Thoughtworkers to get their thoughts on the most enduring good advice from the Radar, regardless of when that advice actually appeared. Although we asked about all of the Radar quadrants, the “techniques” quadrant featured heavily in those responses. The underlying technologies and platforms might change over time, but the best techniques — the ‘how’ of software development — tend to remain stable over the long term. This article is an examination and explanation of these “enduring techniques” from the Radar.
A decade of the IT industry
It’s important to remember where (in time) the Radar comes from. Looking back at the first edition, we featured items such as Amazon EC2 and Google Wave. One of those survived, of course, and one did not, but the fact that we called out EC2 specifically, rather than grouping all of Amazon’s services under the “AWS” label, gives you an idea of the state of the industry at that time. In the last 10 years of the Radar, a number of storylines have played out, and the techniques featured on the Radar are closely tied to those stories.
The rise of cloud
Probably the most obvious storyline over the life of the Radar has been the emergence and rise to dominance of cloud computing. When the Radar was first created, cloud was still in its infancy. Amazon’s “elastic compute cloud” was used as a way to survive sudden spikes in demand, such as a startup suddenly becoming popular or needing a short-term burst of capacity because its product was featured on the Oprah Winfrey show (an actual use case from a Thoughtworks project). But most IT departments operated on traditional infrastructure. VMware and virtualization had started to make inroads, but most organizations would not even consider moving their applications, and especially their data, away from an in-house data center. Fast-forward 10 years and, while there’s still some FUD around the use of cloud providers, even large financial organizations are entrusting their core data assets to the cloud. The fortunes of major companies including Microsoft now hinge on the success of their cloud offerings. Cloud has come of age, and in many organizations cloud is the default — if you want to do on-premise hosting, you need an exception, not the other way around.
Agile goes mainstream
At the time of the first Radar, Agile software development was considered new, cutting edge, and definitely risky. Waterfall methodologies and big up-front design and planning were prevalent across the industry. Thankfully, the story of IT in the last decade is also the story of the rise of Agile as a mainstream development practice. Organizations have realized that working in shorter iterations, regularly “shipping” code (whether to a test environment or production) and allowing business users to change direction mid-stream leads to better software, happier end users, and more value created. Two key tenets of Agile — that you must shorten feedback loops and break down silos — have spread beyond the development team and started to impact the entire IT department and even the entire business, reshaping companies around how fast they can experiment and get data and feedback about what they’re doing right and wrong.
Automation unleashes cloud’s potential
In the early days of cloud, most organizations used on-premise hosting. Some had begun to move away from “on the metal” hosting through VMware or other virtualization technology, but even if the infrastructure was virtual, the processes surrounding the machines were very traditional. Operations teams could provision servers slightly faster because they clicked a button instead of waiting for physical hardware, but then they manually installed and patched operating systems and software, submitted tickets to the network team to get an IP address, and handed off to the security team to do its work.
None of this worked very well when it came to cloud computing. Amazon’s elastic infrastructure — the capability that allowed you to survive being ‘Slashdotted’ — worked by spinning up new virtual machine instances from disk images, with the Elastic Load Balancer (ELB) spreading traffic across them. Instances needed to auto-start, auto-configure, and join the pool of workers with no human intervention. This was one of the key drivers of the automation and DevOps movement: cloud was only a fraction as effective without strong automation, and there was a ceiling on the sheer number of virtual machines you could manage manually. At scale, companies like Netflix couldn’t afford the traditional ratio of one operator per 10 servers and needed to develop automation so that a single operator could manage hundreds or even thousands of virtual machines.
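The scaling logic described above can be reduced to a small reconciliation loop: compare desired capacity (derived from load) against actual capacity, then launch or terminate instances to close the gap. The sketch below is purely illustrative — the class name, thresholds, and the in-memory worker list standing in for real cloud API calls are all invented, not any provider’s actual interface:

```python
import math


class ToyAutoscaler:
    """Illustrative autoscaler: keep just enough workers so that the
    average load per worker stays at or below a target threshold."""

    def __init__(self, target_load_per_worker=50, min_workers=1, max_workers=100):
        self.target = target_load_per_worker
        self.min_workers = min_workers
        self.max_workers = max_workers
        self.workers = ["worker-0"]  # start with one instance from the base image

    def desired_capacity(self, total_load):
        # Enough workers that each handles at most `target` units of load,
        # clamped to the configured minimum and maximum fleet size.
        needed = math.ceil(total_load / self.target)
        return max(self.min_workers, min(self.max_workers, needed))

    def reconcile(self, total_load):
        """Launch or terminate instances until actual matches desired.
        A real implementation would call the cloud provider's API here;
        this toy version just tracks instance names in a list."""
        desired = self.desired_capacity(total_load)
        while len(self.workers) < desired:  # scale out
            self.workers.append(f"worker-{len(self.workers)}")
        while len(self.workers) > desired:  # scale in
            self.workers.pop()
        return len(self.workers)


scaler = ToyAutoscaler()
print(scaler.reconcile(500))  # heavy load: fleet grows to 10 workers
print(scaler.reconcile(60))   # load drops: fleet shrinks to 2 workers
```

The important property is that no human sits in the loop: the same signal that detects load also converges the fleet, which is exactly what manual, ticket-driven provisioning could never do.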
Continuous delivery and DevOps
With the rise of Agile as a mainstream methodology, automated testing also became mainstream. No longer was it acceptable for a test “script” to be a set of steps in an Excel sheet that a human needed to follow and sign off before software was ready for production. Successful Agile relies on a safety net of automated tests that run end-to-end, testing in the small and the large, simulating user interactions and infrastructure failures, and certifying that the software does what it’s supposed to in a wide array of scenarios. Teams used continuous integration (CI) servers to build and test their software, and it wasn’t long before these techniques were extended to their natural conclusion: continuous delivery (CD), an automated “pipeline” from code change to production deployment that lets teams put software into production at will.