
Evolving interactions

 

New opportunities to engage and interact

 

The range of methods for human-computer interaction is expanding, and interactions themselves are growing more immersive and seamless. This is creating new opportunities for organizations to reimagine how they engage with, learn from and delight their customers, employees and other stakeholders. 

 

Many of the trends in this area are easy to overlook because, at least on the surface, they don't seem revolutionary. Yet the accuracy and usefulness of these systems has grown immensely and will continue to do so over the next couple of years. For example, users have been able to speak commands into their mobile devices for years, but, until recently, those commands had to be phrased in a way the device would understand.

 

Products and services such as OpenAI's ChatGPT, Google's Bard and Microsoft's Copilot have leveraged advances in generative AI to lead this charge, lowering the bar for individuals to interact with systems or directly with AI models. Tech giants such as Apple are also undertaking major overhauls of their voice-to-text systems, using better predictive AI and context-aware models to render technology touch-free. Beyond voice and text, we expect continued advancement in extended reality (XR) technologies that allow users to interact in virtual worlds, though perhaps not at the pace predicted by early champions of the space.

 

There’s significant potential for these evolving modes of interaction to make technology experiences more inclusive, as demonstrated by systems like Jugalbandi, which enables illiterate people to access information about government programs by speaking to their mobile devices in their native languages. Yet they also present a broader set of risks and accessibility issues. While accessibility approaches for more traditional interactions like mobile are relatively well understood, this is not the case for newer interactions like XR. Making voice- or GenAI-based interactions more available and sophisticated widens the scope for misuse and unintended consequences.

 

For the near term, there is no shortage of promising use cases for XR in emerging areas like training and data visualization that all enterprises should be ready to explore. Consumer adoption is likely to remain primarily limited to areas like collaboration, gaming and entertainment. So far, the advancements in consumer devices haven’t been sufficient to expand outside these areas, but product development by Apple, Meta and others is expected to continue.

 

You often see people’s eyes rolling at terms like metaverse or XR, but at the same time, people are using these immersive technologies in their daily lives without even realizing it: just look at how readily people accept blurring or changing their surroundings on video calls.
Kuldeep Singh
Principal Consultant, Thoughtworks

Signals

 

  • Tech giants unveiling new XR-enabled devices such as Apple’s Vision Pro ‘spatial computer’ and Meta’s Ray-Ban smart glasses. These are designed to be more comfortable and less intrusive than earlier examples of such hardware; the Vision Pro also incorporates sophisticated gesture recognition to reduce the need for physical input devices.

     

  • The development of GenAI-enabled apps that enhance accessibility for people previously facing physical and/or linguistic barriers to working with technology systems. Jugalbandi, for example, enables interactions in multiple Indic languages simply through the user’s voice. It does this using a combination of ChatGPT, language translation models and a fixed data set from which to answer questions.

     

  • Higher education institutions pushing the frontier of data visualization. Researchers at institutions such as Monash University are experimenting with using XR to bring representations of data beyond screens into more immersive hybrid environments. 

     

  • The release of enhanced libraries for gesture recognition designed to help developers connect physical gestures to application features, such as those developed by Google and Apple.  

     

  • Continued advancements in natural language processing, including the development of pre-trained models to address specific tasks like sentiment analysis, and a growing focus on multilingual capabilities. These advancements are being leveraged by forward-thinking organizations to improve how users find and use information. For example, Zalando has created an interactive assistant that goes far beyond what’s possible with parametric search.
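The Jugalbandi signal above describes a pipeline that combines translation models with an AI model answering from a fixed dataset. As a rough illustration of how such a pipeline fits together — all function names, data and logic here are simplified hypothetical stand-ins, not Jugalbandi's actual code — the flow might look like this:

```python
# Hypothetical sketch of a Jugalbandi-style pipeline: translate the
# user's question into English, answer it only from a fixed dataset,
# then translate the answer back into the user's language. The
# translation and retrieval steps are toy stand-ins for real models.

FIXED_DATASET = {
    "pension scheme eligibility": "Citizens over 60 can apply at their local office.",
    "crop insurance claim": "Claims must be filed within 72 hours of crop damage.",
}

def translate(text: str, source: str, target: str) -> str:
    # Stand-in for a translation model (e.g. for Indic languages);
    # here it simply passes the text through unchanged.
    return text

def retrieve_answer(question_en: str) -> str:
    # Stand-in for an LLM grounded on a fixed dataset: pick the entry
    # whose key shares the most words with the question.
    words = set(question_en.lower().split())
    def overlap(key: str) -> int:
        return len(set(key.split()) & words)
    best = max(FIXED_DATASET, key=overlap)
    if overlap(best) == 0:
        return "Sorry, I don't have information on that."
    return FIXED_DATASET[best]

def answer(question: str, user_lang: str) -> str:
    question_en = translate(question, source=user_lang, target="en")
    answer_en = retrieve_answer(question_en)
    return translate(answer_en, source="en", target=user_lang)

print(answer("how do I file a crop insurance claim", user_lang="hi"))
# prints "Claims must be filed within 72 hours of crop damage."
```

Constraining answers to a fixed dataset, as described in the signal, is what keeps a system like this from inventing information about government programs — questions outside the dataset get a fallback response rather than a guess.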

The opportunities

 

By getting ahead of the curve on this lens, organizations can:

 

  • Lower costs by making interactions more efficient. The ability to communicate with a system with the flick of a finger or through voice commands in natural speech, rather than the more laborious process of typing, has the potential to massively accelerate productivity in the workplace and other environments.   

     

  • Boost satisfaction by reducing friction and enhancing availability for customers. Voice-based platforms promise to make it easier for customers, especially those with accessibility challenges, to retrieve product information or get answers to queries in a smooth, hassle-free way. Similarly, advances in natural language processing open up possibilities such as using chatbots to provide a base level of service and support outside working hours.

     

  • Derive better insights from richer interactions. As AR and VR make it possible to bring data to life outside the screen or printed page, the foundations are being laid for what’s known as immersive analytics. Immersive environments can help users experience data aurally and tactilely as well as visually, enhancing understanding, analysis and, eventually, decision-making.

     

  • Test out scenarios to improve responses. XR-enabled simulations can be used by enterprises to run teams through mission-critical situations that could test the business, giving them an accurate sense of their capacity to react and identify areas for improvement.

A hand holding a mobile phone displaying the FC St. Pauli Museum app

What we've done

An immersive fan experience for the FC St. Pauli Museum

 

The Hamburg-based football club museum’s ambition is to create a dynamic and immersive experience that grows from fan engagement and promotes topics such as diversity, sporting events against racism, and wider conversations around social impact. Thoughtworks worked closely with FC St. Pauli Museum on a six-week project to design a new interactive fan experience as part of Tech Lab — a Thoughtworks initiative that takes innovative ideas and uses cutting-edge technology to bring them to life.

 

Actionable advice

 

Things to do (Adopt)

 

  • Actively investigate processes or areas that can be improved or replaced by advances in AI and interaction technologies, such as:   

     

    • Customer interactions, more aspects of which can be supported by chatbots as they grow more sophisticated and attuned to industry-specific use cases.  

       

    • Researching and uncovering market trends. Business analysts at Thoughtworks have been able to use ChatGPT and other tools such as Boba for ideation and scenario creation.

 

 

Things to consider (Analyze)

 

  • Monitor the XR/AR space for possible use cases. The benefits for things like training and crisis management, where realistic physical interactions are particularly important, could be substantial. Beyond enterprise contexts, organizations can derive value from industrial use cases ranging across design, manufacturing and maintenance applications. The cutting-edge work taking place on data visualization is also likely to prove important for how we consume and manipulate data in the future.

     

  • Understand the diversity of AI tools and how this can serve the needs of people who may want to interact with AI in different ways. Some developers, for example, are comfortable with tools like GitHub Copilot where the actual interaction is largely the same as it has been for some time with autocomplete functions. Others would rather have a ‘conversation’ with a ChatGPT-like solution and use the resulting learnings in their regular internal development environment. 

     

  • Learn where your customers are, and go there. Consider your customer base and how likely your average users are to adopt any new interaction platforms you offer. For example, retailers could focus on augmented reality (AR), which enables richer experiences for consumers in physical locations.

 

 

Things to watch for (Anticipate)

 

  • Consider ‘outside the box’ use cases. There’s no question that the expanding range of interactions will create potential applications for every business, but these applications may not always be obvious. It’s therefore important to think through business processes and examine where the ability to present data or engage with customers in an entirely different way might improve the entire experience.

Read Looking Glass 2024 in full