
In evolving interactions, AI reimagines the possibilities

Interactions between humans and machines have moved far beyond text, extending into voice, images, gestures and emotional cues. In 2026 and beyond, they will move further still, as digital products shift from screen-based interfaces to agentic, intent-driven experiences. Rather than requiring users to master interfaces, systems will increasingly interpret goals, act with autonomy and adapt to context — reimagining how customers access value from technology.

 

Experiences will be built around agentic interfaces that can take initiative; adaptive systems that sense emotion and environment; and embodied modalities fluent in voice, gestures, gaze and haptics. The emphasis is shifting from designing interfaces to designing relationships between humans, AI agents and the ambient systems around them — a move consistent with the broader shift toward intent-based interaction. Intelligent systems become collaborators that anticipate, learn and co-create outcomes, reducing cognitive friction and expanding creative capacity.

These advances will change how people experience digital systems. Interactions become more natural, contextual and emotionally resonant. UX evolves into interaction choreography, where designers orchestrate exchanges across multiple intelligences rather than optimizing a single surface. 

 

For enterprises, these shifts redefine the value possible from technology: deeper personalization, richer service ecosystems, more fluid cross-channel experiences and entirely new modes of engagement. They also introduce governance challenges as autonomous systems interpret intent and act independently, heightening the need for transparency and shared control. 

Key trends  

 

  • Real-time translation and cross-cultural interaction. Advances by Apple, Google and OpenAI have made near-instant translation possible in voice, video and augmented reality contexts. This trend is crucial because it dissolves language barriers, enabling global collaboration, inclusion and accessibility in real time across both personal and professional interactions.
 
  • Agentic UX and predictive collaboration. AI systems will increasingly act as collaborators that anticipate user needs, autonomously completing tasks and negotiating actions. This will redefine productivity tools, transforming them from passive utilities into proactive partners that adapt dynamically to context to achieve the desired outcomes.

 

  • Emotionally intelligent systems. With improvements in affective computing and biometric sensing, interactions will become more emotionally aware. Systems capable of recognizing tone, facial expression or physiological signals can respond with empathy to a user’s emotional state, which is vital for building trust in autonomous technologies.

 

  • Adaptive and contextual interfaces. Experiences will seamlessly shift across devices, environments and modalities depending on where and how users choose to engage with systems. This matters because it creates continuity and consistent personalization, making digital experiences feel like coherent extensions of human intention rather than a grab-bag of disjointed tools.
 
  • Collective intelligence and multi-agent collaboration. Enterprises will be able to leverage networks of specialized AI agents that can work alongside human teams to make interactions with customers richer and more responsive, and support faster and more informed decision-making.

 

Signals of this shift include 

 

  • Rising adoption of agentic systems in customer service. Tools like OpenAI’s agent frameworks and Anthropic’s autonomous assistants are extending the role of agents from passive assistants with limited capabilities to adaptable, active collaborators that can learn from past interactions, reason about and plan toward optimal outcomes, and then coordinate to deliver them to the customer.

 

  • The acceleration of contextual intelligence. Emotion- and context-sensing APIs, combined with wearable data, are allowing systems to respond to users with empathy. ‘Emotionally intelligent’ models have advanced to the point where they outperform humans on some psychometric tests.

 

  • Advances in spatial and embodied computing. Though adoption has often been slower than expected, devices that blend physical and virtual spaces, such as the Apple Vision Pro and Meta Quest, are growing more sophisticated, user-friendly and, critically, affordable. The buzz around enterprise applications of Apple’s Vision Air shows the potential of making interactions more immersive and intuitive in the business context.


  • Collective interaction platforms. More dynamic, shared dashboards and multimodal dialogue are providing new foundations for humans and AI agents to collaborate, redefining teamwork and coordination.  

 

The opportunities 

 

By getting ahead of the curve on this lens, organizations can: 

Reimagine the customer experience
One of the main concerns people have about interacting with a system rather than a person is that the exchange will inevitably lack empathy. Affective computing provides a possible means to overcome this barrier and to deliver always-on, responsive and emotionally sensitive customer service at scale.

Further expand the boundaries of accessibility
Advancements in the capacity of systems to recognize and interpret biometric data, gestures and other forms of interaction can be harnessed to level the playing field for collaboration with, and improve the delivery of services to, groups such as the visually or hearing impaired. Similarly, more sophisticated AI interpretation tools have the potential to advance inclusivity for non-native English speakers.

Deliver deeper, and more effective, personalization
Systems capable of recognizing a user’s emotional states can learn how to respond to these states in an appropriate and supportive way, adding to the user’s sense that their needs are being understood and addressed. Combined with generative interfaces, this opens the possibility of entirely adaptive user journeys that are constantly molding themselves around individual needs, contexts and preferences.

Preempt problems and reinforce the management of risks 
Enhanced human-agent collaboration on emerging platforms could have positive implications for practices like cybersecurity, where the always-on monitoring and data analysis delivered by machines should be combined with human oversight. Looking ahead, the integration of VR promises to make human-AI collaboration even more seamless and productive.

Lincoln Unlocked, an augmented-reality mobile app

What we’ve done

Reimagining the visitor experience through immersive technology 

 

When the Abraham Lincoln Presidential Library and Museum sought to deepen visitor engagement, it didn’t just add another layer of digital content — it reimagined how people connect with history. In partnership with Google Public Sector and Thoughtworks, the museum developed Lincoln Unlocked, an augmented-reality mobile app that blends physical exhibits with immersive digital experiences.

 

As visitors move through galleries, AR brings artifacts and historical figures to life, unlocking multimedia narratives, interactive tours and contextual storytelling that appeal to diverse audiences. The app also includes accessibility features such as multilingual content and support for visitors with visual or hearing impairments.

 

By shifting from static displays to dynamic, intent-centered engagement, the museum has expanded its reach, made learning more inclusive and created experiences that resonate across generations. This approach demonstrates how thoughtful use of technology can transform customer interaction, making cultural exploration more vivid, personal and impactful.

Actionable advice 

 

Things to do (Adopt) 

 

  • Think beyond channels to create interaction ecosystems. Sticking to a standard mix of physical and digital experiences delivered with only slight variations for different devices will mean passing on the chance to forge deeper connections with existing and potentially new customer groups. While it won’t be realistic or practical for every enterprise to integrate fully immersive VR or empathy engines into their service offerings, consider where there are clear opportunities to apply evolving technologies to improve the quality of services or enhance accessibility for certain customer segments, and invest accordingly.  

 

  • Design for multimodality. Customer experiences, journey maps and service flow charts will all need to be updated to factor in a wider spectrum of interactions and channels — such as haptic and voice commands — as these increasingly become table stakes, baked into customer expectations.
 
  • Establish governance and ethical boundaries. While customers value interactions with an element of empathy, emotional AI also presents significant risks, from minor misreadings of sentiment to invasions of privacy and out-and-out manipulation. In exploring new means of interaction, enterprises should set clear ground rules for how and where emerging technologies should be used, and when there is a need for human involvement.

 

Things to consider (Analyze) 

 

  • Incorporating AR and VR into the customer experience. After multiple false starts, the declining price points of AR/VR devices and accelerating adoption in industries like manufacturing could represent a tipping point that will warrant taking the plunge.  

 

  • Exploring the potential of adaptive agentic systems. Enterprises are moving beyond smaller-scale deployments of GenAI and agents to adopt agentic systems that are capable of taking over a much broader portion of customer interactions. These have significant potential to reduce the burden on human workers, but to succeed they also require investment in data infrastructure and a redefinition of the role of service teams.

 

Things to watch for (Anticipate)

 

  • The rise of adaptive environments that sense and respond seamlessly to customer needs. AI systems will act as ‘cognitive exoskeletons’ — augmenting human perception, creativity and decision-making. This new frontier will involve designing relationships of trust and negotiated collaboration between humans and intelligent agents. Enterprises that invest now in multimodal design, governance and adaptive AI ecosystems will be best positioned to lead in this next phase of human–machine co-evolution.

 

Read Looking Glass 2026 in full