Historically, IT organizations have delivered value more slowly as their technology assets accumulate: the weight of existing systems impedes delivery of new features. But today, the best-performing tech organizations—the Netflixes of the world—have figured out how to buck this trend. As they build assets, they deliver new features and systems faster.
Their secret is something called “platform thinking”—deliberately encapsulating useful capabilities and business functions into self-service, reusable “platform” components. As the technology and business platform gains new features, it allows the organization to accelerate time-to-value.
Developer productivity becomes a priority

A second trend we highlight in the Radar is developer experience as the new differentiator. It’s not entirely new—my colleague Rachel Laycock spoke about “The Empowered Technologist” back in 2015—but as we embrace a future where society, business and technology are inextricably linked, it’s more important than ever to do technology right, with the best people.
Silicon Valley has taught companies to have a relentless focus on products and customer experience. This extends to treating developers as customers too. Increasingly, we’re seeing development tools, APIs and platforms evaluated on their ability to reduce developer friction and increase autonomy and empowerment.
When we put superficial private cloud on “hold”, we did so largely because these clouds weren’t enabling self-service for teams, nor helping those teams get software into production. The industry is beginning to understand that developers are the customers for things like private clouds, third-party services and APIs, and that developer experience is an important differentiator.
If your product—whether it’s internal to an organization or a “real” publicly marketed product—doesn’t appeal to developers, that’s a critical signal that you need to improve it.
If you look at the current state of software architecture, you’ll see the massive impact microservices have had on enterprise systems. Almost universally our clients want to adopt microservices, and the interesting thing is that there are organizational as well as technical implications.
So today, we’re seeing a number of top technology execs coming out of Silicon Valley and into more traditional organizations. They’re subsequently rebuilding their teams into the “Amazon model” of IT.
Amazon pioneered the two-pizza team model: small, autonomous teams accountable for driving a single business metric. What’s important about the model isn’t the size of the teams but their autonomy, their “you build it, you run it” accountability and the enlightened ways in which they interact with other teams.
This enlightenment includes techniques such as API-as-product and treating other parts of the organization as though they are customers, even when they’re internal. The combination of microservices and self-service platforms is all but required to implement these behaviors and interaction styles, so their popularity is on the rise.
When talking gets clever

Based in part on the astonishing accuracy of today’s speech recognition systems, conversational UIs and natural language processing are on the agenda for many organizations.
The foundational example is the Amazon Echo; in designing for a device with no screen, Amazon was forced to carefully consider how a “conversational UI” should be built. Carrying on a conversation with a device or chatbot is no longer reserved for the likes of Alexa, Siri or Cortana—these capabilities are within reach for device makers and enterprise developers, with the help of APIs and third-party services to do the heavy lifting.
Frameworks such as wit.ai and api.ai, which help developers extract intent from speech, and the architectural technique of conversationally aware APIs both make the job of building a conversational UI even easier.
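Services like wit.ai and api.ai use trained natural-language models to turn an utterance into a structured intent that a conversationally aware API can consume. The following toy keyword matcher is not how those services work internally—every intent name and keyword here is made up—but it illustrates the shape of that intent-extraction contract:

```python
# Toy intent extractor. Real NLU services use trained models; this
# keyword lookup only illustrates the utterance -> intent contract.
# All intent names and keywords below are illustrative.
INTENT_KEYWORDS = {
    "set_alarm": {"alarm", "wake"},
    "play_music": {"play", "music", "song"},
    "get_weather": {"weather", "rain", "forecast"},
}

def extract_intent(utterance: str) -> dict:
    """Return the first intent whose keywords appear in the utterance."""
    words = set(utterance.lower().replace("?", "").split())
    for intent, keywords in INTENT_KEYWORDS.items():
        matched = words & keywords
        if matched:
            return {"intent": intent, "matched": sorted(matched)}
    return {"intent": "none", "matched": []}
```

A downstream conversationally aware API would then accept the structured `{"intent": ..., "matched": ...}` result rather than raw text, keeping the language-understanding concern in one place.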
These conversational UIs are powered by machine learning, which we think is taking off because of a nexus of forces.
First, the Big Data hype of the last few years means organizations are sitting on large piles of data, having been taught that it might be valuable and they shouldn’t throw it away. Data, of course, is the lifeblood of useful machine learning.
Second, processing power has increased—especially via the cloud and GPU computing. Apple is designing CPU/GPU chips for its phones that are specifically engineered to speed up machine learning.
Finally, there's been a democratization of the algorithms. Google released TensorFlow as open-source, so now anyone can build an ML model, train it using the cloud or a GPU farm, and execute it at speed on users’ smartphones.
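What frameworks like TensorFlow automate—define a model, define a loss, descend its gradient—can be shown in miniature without any framework at all. This is a deliberately tiny sketch: a logistic-regression “model” trained by gradient descent on synthetic data, with every dataset and hyperparameter invented for illustration:

```python
import numpy as np

# The training loop that ML frameworks automate, in miniature:
# model (logistic regression), loss (cross-entropy), gradient descent.
rng = np.random.default_rng(42)

# Synthetic data: label is 1 when the sum of the two features is positive.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)   # model parameters
bias = 0.0
lr = 0.5          # learning rate

for _ in range(300):
    preds = 1.0 / (1.0 + np.exp(-(X @ w + bias)))  # sigmoid of logits
    grad = preds - y                               # d(loss)/d(logits)
    w -= lr * (X.T @ grad) / len(y)                # gradient step
    bias -= lr * grad.mean()

final_preds = 1.0 / (1.0 + np.exp(-(X @ w + bias)))
accuracy = ((final_preds > 0.5) == (y == 1)).mean()
```

A framework’s contribution is doing exactly this—at scale, with automatic differentiation, on GPUs—so the developer only declares the model and loss.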
This has led to a new de-facto platform offering: intelligence as a service. Most large platform players—Google, IBM and Microsoft, to name a few—now offer smart algorithms as part of their platform, with features ranging from voice processing and natural language understanding, to image recognition and deep learning.
These data-enabled services work best when they can operate on large data volumes from more than one source, so Amazon’s image-recognition service might give better results than a similar offering from a smaller player (image recognition is startlingly good—we can now identify “golden retriever” not just “dog”). The big benefit to intelligence as a service is that developers can build more ambitious solutions because they're standing on the shoulders of giants. Potential drawbacks of course include data security and third-party lock-in.
Of course, making conjectures about the state of ML and AI is hard—in the past year alone, we’ve seen ML systems master games, such as Go and Texas Hold’em poker, that were thought to favor human cunning. The ways in which we build deep learning systems are rapidly improving, and many of the best techniques actually have ML systems building themselves.
Generative adversarial networks pit two deep learning systems against each other in order to improve their overall output. In a stunning example, these systems can create high-resolution images from textual descriptions, effectively running an image recognition algorithm “in reverse.”
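The adversarial training loop can be shown at toy scale. Real GANs pit two deep networks against each other; in this deliberately minimal sketch (all distributions and hyperparameters are invented) the “real” data is drawn from N(4, 1), the generator is a linear function of noise, and the discriminator is logistic regression, so the competing gradient updates can be written by hand:

```python
import numpy as np

# Toy GAN: generator g(z) = a*z + b tries to mimic samples from N(4, 1);
# discriminator D(x) = sigmoid(w*x + c) tries to tell real from fake.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0   # generator parameters
w, c = 0.0, 0.0   # discriminator parameters
lr = 0.05

for _ in range(2000):
    real = rng.normal(4.0, 1.0)   # sample from the target distribution
    z = rng.normal()              # generator input noise
    fake = a * z + b              # generator output

    # Discriminator ascends log D(real) + log(1 - D(fake))
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    w += lr * ((1 - d_real) * real - d_fake * fake)
    c += lr * ((1 - d_real) - d_fake)

    # Generator ascends log D(fake): adjust (a, b) to fool the discriminator
    d_fake = sigmoid(w * fake + c)
    a += lr * (1 - d_fake) * w * z
    b += lr * (1 - d_fake) * w
```

The generator never sees the real data directly—only the discriminator’s gradient—yet its output mean (`b`) is pushed toward the real mean, which is the essence of the adversarial setup.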
An intelligent approach to developing

A natural question arises from the rapid advance of machine learning: how will it affect the job of developers, who have always written explicit instructions for computers, albeit at ever-increasing levels of abstraction?
Personally, I think it's too soon to tell. There have been many attempts at higher order languages, where human developers simply explain their intent and an ML/AI system conjures up implementation code, but I think practical usage of these techniques is a long way off.
What’s interesting, though, is the other technology problems that are difficult but may be improved through ML. One example is architecture review for an existing suite of enterprise systems: there are likely to be highly complex interactions between systems and components, complex data access patterns and relationships, and so on. Instrumenting these systems and then applying ML might yield insights that allow us to do enterprise architecture better—“apply the strangler pattern here”, “retire this flaky old system”, “combine these two apparently related systems” and so on.
I feel compelled to end with an unfortunate truth about the IT industry: sexism persists, and technology is often a hostile place to work if you’re not a white male. Susan J Fowler, a former Uber developer, published a highly detailed account of a year of discrimination, harassment and management cover-ups that went viral in February, and the revelations just keep coming. But it’s not just Uber that’s struggling to provide an inclusive and safe space for women: diversity is a challenge no matter where you go in tech.
ThoughtWorks has spent the last six years focusing on the diversity of our workforce. It’s a long road and diversity is not easy. ThoughtWorks and past winners of the Anita Borg Institute Top Company Award demonstrate that progress is possible and that we can move towards a more inclusive industry. But there’s still a long way to go and a lot more work to be done.