Macro trends in the tech industry | Oct 2020

Being part of the team that puts together the Thoughtworks Technology Radar is a privilege, and now, in our tenth year, I’m proud to say I’ve been part of every Radar to date. Each volume is full of interesting discussion and insight, and there is always more we’d like to say than can fit into the Radar itself. This article is a continuing companion piece in which I take some time (and some words) to expand on the bigger trends happening in the tech industry.

Democratizing programming

One of our themes on this month’s Radar, and probably one of the biggest continuing trends in technology today, is what we call the “democratization” of programming. This is about putting programming into the hands of anyone who wants to do it, and making the ability to program a machine or system more accessible. This isn’t a new idea; COBOL — the “common business-oriented language” — was an attempt in the 1960s to create computer programs using an English-like language that was more accessible to non-programmers. Today we see a huge amount of interest in “low-code” platforms that promise to allow us to create software without needing programmers. There are also platforms like the consumer-focused IFTTT or the enterprise-focused Zapier that allow a less technical audience to wire up a variety of devices, SaaS platforms, endpoints and workflows to do interesting and useful things. And if you’re looking for an integration framework, Apiant, Blendr, Microsoft Flow, Pipedream, Quickwork and Tray.io (to name but a few) could help out. For application creation, Amazon Honeycode has gained some momentum, although it’s described by one of the Radar authors as “Microsoft Access on the cloud.”

We think the ability to do programming, or at least to have some say in the function of the systems we use, is extremely important. In Douglas Rushkoff’s book Program or Be Programmed, he argues that we must choose whether we direct technology, or let ourselves be directed by it and by those who have mastered it. Beyond this philosophical perspective, the plain fact is that the world demands more software, delivered faster, than existing IT teams can create.

Spreadsheets are a common example. Almost every business has some kind of spreadsheet involved in running it, and everyone in the IT industry has seen what can go wrong: giant spreadsheets with critical, untested business logic embedded inside them are fairly common. Recently, and even more worryingly, we’ve seen multiple healthcare services around the world lose or mis-process COVID-19 data due to spreadsheet errors. Spreadsheets are typically used to allow non-programmers to quickly create, store and manipulate data without needing to get into a lengthy development cycle with ‘real’ programmers. Low-code platforms are similar in that they promise to accelerate software development by using pre-baked components and configuration instead of code.

Spreadsheets and low code have a key characteristic in common: they both work well in a certain “sweet spot” in terms of the type of functionality required or complexity of the problem domain, but can fail badly when pushed further. Unfortunately, the reason that such a solution is chosen in the first place — scarcity of technical talent or time — also prevents someone using a spreadsheet or low-code environment from realizing they have pushed the solution outside its sweet spot. For this reason we recommend bounded low-code platforms to manage this risk while still taking advantage of the possible acceleration of a democratized programming platform.

Rust continues to spread

One of our favorite programming languages is Rust, a high-performance, type- and memory-safe language that can be used for systems programming (replacing C or C++) or as a general-purpose language in its own right. Rust’s popularity continues to grow: it has been voted the “most loved” language in Stack Overflow’s developer survey five years in a row.

In this edition of the Radar we noted that Rust is being used for big data and machine learning tasks that traditionally would have entailed Python, and in some cases can offer a large performance benefit. We also noted Materialize, a streaming-oriented, low-latency database written in Rust.

So what makes Rust so popular? Personally, I find its combination of strong expressiveness and compile-time safety unique. Stack Overflow notes that Rust “looks like it has been developed by user experience designers” who have a clear vision of the language and carefully chose what to include. All of this goodwill has created a supportive, accessible community around Rust and an ever-improving ecosystem of libraries and tools.

Visualizing all the things

This edition of the Radar includes some great examples of visualization tools. More than ever, the ability to create a good picture of something — architecture, code complexity, system performance, or distributed traces in a microservices ecosystem — is vital to understanding the complex systems we build today. Examples of the visualization tools we talked about include:
 
  • Streamlit, a framework for building ML and data-science web apps
  • Dash, which is used both for ML-type apps as well as in a business intelligence context for building custom reports and dashboards
  • Kiali, a management console for Istio-based service meshes that provides dashboards and other observability tools
  • OpenTelemetry, an API and SDK for capturing distributed traces and metrics
  • Polyglot Code Explorer, a tool for visually examining codebases and exploring their health and structure

You’re probably already familiar with the visualization capabilities of mainstream BI tools such as Tableau or Power BI, and this space is exploding with offerings. But tools like Dash and Streamlit offer a code-based approach to visualization with all the associated benefits — flexibility, customizability, version control, automated deployment. These are good reasons to consider a framework rather than a fully fledged “data studio” style tool.

Infrastructure as code is popular and maturing

For our Radar themes, we called out the ‘adolescence’ of infrastructure as code, and we’re making deliberate use of that word with both positive and negative connotations. Like a slightly awkward teenager, infrastructure as code is growing up. This is positive because with more maturity we start to see better outcomes and a growing ecosystem around this technique. But there are also growing pains, such as inconsistency in some of the tooling, and competing approaches and philosophies.

So what is infrastructure as code? Briefly, it is the automation of infrastructure and the careful management of that automation. The canonical description is from our colleague Kief Morris’ book, Infrastructure as Code: Managing Servers in the Cloud, of which a second edition is soon to be published. According to Morris, different schools of thought are emerging around infrastructure as code: Pulumi aficionados talk about “infrastructure as software,” Kelsey Hightower talks about “infrastructure as data” and WeaveWorks has spawned “GitOps.” It remains to be seen where these varying philosophies end up, but for now we would characterize them as flavors of infrastructure as code rather than significant departures from it. Tooling in this area has improved in leaps and bounds, with CDK and Pulumi as examples of increasing maturity in the ecosystem.
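To make the idea concrete, here is a minimal sketch of the “infrastructure as software” style using Pulumi’s TypeScript SDK. It assumes an existing Pulumi project with the @pulumi/aws package installed, and the bucket name and tags are purely illustrative.

    import * as aws from "@pulumi/aws";

    // Declare a private S3 bucket. Running `pulumi up` creates or updates the
    // real resource so that it converges on what this code describes.
    const bucket = new aws.s3.Bucket("site-assets", {
        acl: "private",
        tags: { environment: "demo" },
    });

    // Expose the generated bucket name as a stack output.
    export const bucketName = bucket.id;

Because the definition is ordinary code, it can be reviewed, versioned and tested like any other part of the system, which is exactly what the technique is after.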

The browser as an accidental application platform

The venerable internet browser started life as just that — a tool for browsing HTML documents and navigating between them using hyperlinks. As the browser gained popularity, HTML 2.0 added many more tags and the ability to submit “forms” to a server for more interactive web pages. During Netscape’s fight with Microsoft, they realized the need for a scripting language in the browser, and so JavaScript (originally named Mocha) was rushed through a 10-day development cycle and added to Netscape. This period became known as the “browser wars” because companies like Netscape and Microsoft added proprietary extensions to HTML to try to gain a leg up over the competition. You might remember “works best in Internet Explorer” badges on web pages around this time. Flash and embedded Java applets also added interactivity to pages, the Ajax technique was born, and JavaScript was rediscovered by developers and even began to be used as a back-end language (for example within Node.js).

The point here is that the browser started life as a simple document viewer but has been coerced into becoming an application platform. I’m writing this article in Google Docs, listening to music on YouTube, and chatting with my colleagues via Google Chat. The chat is running in what looks like a native application window but is actually just a single pane containing a web page. The majority of apps I’m using right now are delivered via the browser. 
 

Over the years we’ve convinced the browser to do ever more amazing things, and it has become a complex platform and ecosystem. Generally the various browsers offer broad compatibility, with polyfills bridging the gaps and making it easier for developers to target multiple browsers. But the JavaScript ecosystem itself continues to be bewilderingly complex and fast-moving. In this Radar we discussed moving Redux back from ‘Adopt’ to ‘Trial’ because developers are starting to look elsewhere to manage state in their React applications (for example to Recoil). And even today, the ‘right’ way to build web applications continues to evolve: Svelte has been gaining interest, and it challenges one of the concepts established by the popular application frameworks React and Vue.js: the virtual DOM.
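To give a flavor of what “looking elsewhere” can mean in practice, here is a minimal Recoil sketch, assuming a React and TypeScript (.tsx) setup; the atom and component names are invented for illustration.

    import React from "react";
    import { RecoilRoot, atom, useRecoilState } from "recoil";

    // A piece of application state declared outside the component tree.
    const counterState = atom({
      key: "counterState", // globally unique key for this atom
      default: 0,
    });

    function Counter() {
      // Reads and writes the shared atom, much like useState but app-wide.
      const [count, setCount] = useRecoilState(counterState);
      return <button onClick={() => setCount(count + 1)}>Clicked {count} times</button>;
    }

    export default function App() {
      return (
        <RecoilRoot>
          <Counter />
        </RecoilRoot>
      );
    }

Components subscribe only to the atoms they actually read, which is part of Recoil’s appeal compared with a single global store.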

Testing is another area where the browser has more or less been coerced into cooperating, but it still suffers from automation and testing tools being retrofitted rather than designed and supported as first-class concerns. In this edition of the Radar we noted Playwright, an attempt at improving UI testing, and Mock Service Worker, an approach to decoupling tests from backend interactions.
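As a taste of the former, here is a minimal Playwright sketch in TypeScript; the URL and the title check are placeholders, and in a real suite a tool like Mock Service Worker could intercept the page’s network calls so the test doesn’t depend on a live backend.

    import { chromium } from "playwright";

    (async () => {
      // Launch a headless browser, open a page and drive it as a user would.
      const browser = await chromium.launch();
      const page = await browser.newPage();

      await page.goto("https://example.com");
      const title = await page.title();
      if (!title.includes("Example")) {
        throw new Error(`Unexpected page title: ${title}`);
      }

      await browser.close();
    })();

The same script can run against Chromium, Firefox or WebKit simply by swapping the imported browser type.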

We’re also seeing the browser evolve into a ‘native’ code platform, with WebAssembly providing an efficient virtual machine that aims to run code at near-native speed. For example, check out Doom 3 running at 60 FPS in your browser.
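Calling into a WebAssembly module from the page itself is straightforward with the standard WebAssembly JavaScript API; in the sketch below, fib.wasm and its exported fib function are hypothetical examples.

    // Fetch, compile and instantiate a WebAssembly module in the browser.
    async function runWasm(): Promise<void> {
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch("fib.wasm"), // hypothetical module
        {} // imports the module expects; none in this example
      );
      const fib = instance.exports.fib as (n: number) => number;
      console.log(fib(30)); // runs at near-native speed inside the browser sandbox
    }

    runWasm();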

The browser isn’t going anywhere but the fact that it’s mostly an accidental application platform continues to reverberate around the tech community, and every project should dedicate at least some time to staying on top of the latest browser-related developments.

That wraps things up for this edition of macro trends. I’d like to thank my fellow ‘Doppler’ members for their help with the article, as well as Andy Yates, Brandon Byars, Ian Cartwright, Kief Morris and Ned Letcher for their thoughts and suggestions.

Disclaimer: The statements and opinions expressed in this article are those of the author(s) and do not necessarily reflect the positions of Thoughtworks.
