
This week in AI | 8 May

The AI landscape continues to evolve at a breakneck pace, and as always we're discussing it all on This Week in AI.

This week was marked by massive infrastructure investments, like the NVIDIA-Corning optical fiber deal to power new AI factories. As capital expenditure on compute capacity grows, we're also seeing significant algorithmic improvements aimed at speed. Techniques like speculative decoding in Gemma 4 and "abstract chain of thought" are pushing the boundaries of performance: drafting and verifying tokens to cut generation latency, and letting models reason in latent space without the overhead of generating human-legible thinking tokens.

We discussed this week how the market is reacting to the high costs of these capabilities. High-profile shifts in AI pricing, such as Anthropic moving toward API-based usage limits, coincide with cautionary tales like Uber exhausting its annual AI budget in just four months.

In response, we see the formation of specialized enterprise AI services firms, backed by major private equity players like Blackstone and Goldman Sachs, to help organizations navigate deployment.

We also discussed the concept of 'cognitive surrender': when users stop verifying AI outputs, they develop misplaced confidence in potentially incorrect answers. Experts suggest countering this with 'friction by design': maintain smaller units of comprehension (like Google's 100-line PR norm) and treat AI outputs as spikes that must be validated through deterministic testing and formal verification.
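As a minimal sketch of that "treat AI output as a spike" discipline (the function and test cases below are hypothetical examples, not from any real model), deterministic checks replace a reviewer's glance at plausible-looking output:

```python
# Friction by design: gate an AI-generated snippet behind exact, repeatable
# assertions before trusting it, instead of eyeballing its output.

def ai_generated_slug(title: str) -> str:
    """Pretend this came from an AI assistant: turn a title into a URL slug."""
    return "-".join(title.lower().split())

# Deterministic verification: precise expected values, run on every change.
cases = {
    "Hello World": "hello-world",
    "  spaced   out  ": "spaced-out",
    "already-a-slug": "already-a-slug",
}
for title, expected in cases.items():
    got = ai_generated_slug(title)
    assert got == expected, f"{title!r}: {got!r} != {expected!r}"
print("all checks passed")
```

The point is not this particular function but the workflow: the AI's contribution is small enough to read in full, and acceptance is decided by tests, not by trust.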

Watch the video for the full conversation!

Disclaimer: The statements and opinions expressed in this article are those of the author(s) and do not necessarily reflect the positions of Thoughtworks.
