
Evolving Interactions

29 November, 2018 | 28 min 47 sec
Podcast Hosts Alexey Boas and Mike Mason | Podcast Guests Neil Redding and Barbara Wolff Dick

Brief Summary

The way we interact with computers has transformed: from the classic keyboard-and-screen combination, to the touchscreen, and latterly, with the advent of digital personal assistants, to voice. These changing interfaces bring with them new challenges in creating software. In this episode, regular co-hosts Alexey Villas Boas and Mike Mason are joined by Neil Redding and Barbara Wolff Dick to explore the impact of innovation in our computer interfaces.

Podcast Transcript


Alexey Villas Boas:

Hello, and welcome to the ThoughtWorks podcast. My name is Alexey Villas Boas, I'm in Sao Paulo, Brazil, I'm the head of technology for Brazil, and together with Mike, I'll be one of your hosts this time. Hello, Mike.


Mike Mason:

Hello Alexey, it's nice to be with you.


Alexey Villas Boas:

We're also here with two other guests, Barbara and Neil. Hello Barbara, would you mind introducing yourself?


Barbara Wolff Dick:

Sure. Hello, I'm Barbara, I'm a designer from Sao Paulo, Brazil.


Neil Redding:

Hi, this is Neil Redding. I'm director of emerging technology for ThoughtWorks in North America, and I'm based in New York. It's great to be with you all today.


Alexey Villas Boas:

It's wonderful to have you with us, so thanks a lot. Well, we are here to talk about evolving interactions. I guess the first question is: what is evolving interactions? Neil, would you like to take a shot at that?


Neil Redding:

Sure, yeah. I don't know how many of you may have seen content about evolving interactions on our website, but we think of evolving interactions as representing one of a number of trends that we call seismic shifts, which we've seen over the past few years showing up in the market and in our lives, impacting how we create software and how we as humans use software and hardware. So, the simplest way to describe what we mean at ThoughtWorks by evolving interactions is essentially that the way we interact with software and hardware is evolving rapidly; it's accelerating in the way it evolves. We're now 10 years into the smartphone as something that most of us have in our pockets or in our hands most of the time.


Neil Redding:

That builds on many decades of other interaction technology, from the phone all the way through the PC. Now that we're 10 years into the smartphone, we're seeing this proliferation, right? A diversification of new interaction technologies: voice smart speakers like Alexa devices or Google Assistant devices, or Siri on our phones. We're also seeing a lot more sensors, and augmented reality and virtual reality in many cases. So, the point is we wanted a way of thinking about how the ways we interact with software and hardware are evolving.


Mike Mason:

I think people might have heard of other stuff like Beyond the Screen or Beyond the Glass interaction styles that are more than just jabbing our fingers on the phone. So I think there's quite a lot of different names for it that people might have come across.


Neil Redding:

Sure, exactly. I mean, for years we've been having [inaudible 00:02:24] what comes after mobile, right? For many years now we've been familiar with the mobile strategy that every brand or enterprise needs to have, but those of us who are future-looking, or interested in catching the next big wave, have been asking what comes after mobile. And the way we've answered that question in the context of evolving interactions is to say that there is no one single thing, right, that's coming after mobile. I mean, smartphones are not going away anytime soon, and what they're leading to is actually a number of different types of touch points.


Neil Redding:

Some of them are wearables, right? I haven't mentioned those yet, but watches have gotten a fair amount of traction in the Western world, whether it's the Apple Watch or Samsung or even Fitbit, right? But essentially the phone is becoming this local compute hub, to be a little bit geeky for a moment: there's a powerful computer in your pocket, and thanks to Bluetooth, Wi-Fi and other forms of connectivity, it's a hub for devices that we carry with us, right? So, I think we're going to see a lot more diversification of that in the coming five to 10 years as well.


Alexey Villas Boas:

Is it fair then to say that the way we are interacting with technology is changing in that sense?


Neil Redding:

Yeah, absolutely. I don't know how many of you have used voice interfaces, but as Mike was pointing to, this stabbing at touchscreens, or tapping and swiping and looking down at our phones, has become very familiar over the last number of years. Thanks to voice UI — I mean, Siri in a way is off ... but in many countries now we've seen Alexa or Google Assistant really become so effective that we're at a point where many of the interactions we have with technology can be done via voice, and I think this is only going to accelerate over the coming few years, right?


Neil Redding:

But there are lots of use cases, things that we use technology to do, where not only do we not really need to be looking at something visual, but we actually need to be hands-free, whether we're driving a car or we're distracted or just multitasking, doing other things, and so voice can be very useful as it becomes more functionally precise. So I think voice is a big thing. Then the other thing we would say, to your question about how we're interacting with technology, is that we really see mobile augmented reality coming, although again, it's at the very early stages.


Neil Redding:

But just the level of investment and prominence that both Apple and Google, with iOS and Android, have placed on augmented reality makes it clear that over the coming three to five years, and ultimately I think there's a 10-year roadmap for this, we're going to start looking at the world through our phones instead of looking down at our phones. I think we find that a useful way of thinking about the change, right? The change in our moment-to-moment behavior in terms of how we use this technology.


Mike Mason:

The other thing that seems to be changing is the fluidity with which people move between different interaction modes. All of those are options for people, and we don't think, I'm going to use my smartphone now, I'm going to use my smart watch now, I'm going to yell at Siri across the room. We're more fluid than that and we use the thing that's appropriate, and I think as systems get better, they're going to use the most appropriate interface for us. If I get a personal text message, maybe I don't want it read out over the loudspeaker at home; maybe I want that in a more discreet kind of a way.
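That routing idea can be sketched in a few lines. This is a minimal illustration, not any real notification API; all of the names (`Context`, `choose_channel`, the channel strings) are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Context:
    wearing_watch: bool   # does the user have a wearable on?
    others_present: bool  # is anyone else in the room?
    hands_free: bool      # e.g. driving or cooking

def choose_channel(sensitive: bool, ctx: Context) -> str:
    """Pick the most appropriate output channel for a notification.

    Personal messages stay off the shared loudspeaker; hands-free
    contexts prefer voice over a screen.
    """
    if sensitive:
        # Keep private messages discreet: a haptic tap on the watch,
        # or silently on the phone screen if there's no wearable.
        return "watch_haptic" if ctx.wearing_watch else "phone_screen"
    if ctx.hands_free:
        return "smart_speaker"
    return "phone_screen"
```

So a personal text arriving while the user wears a watch would come through as `choose_channel(True, Context(True, True, False))`, a discreet haptic tap rather than a voice announcement.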


Barbara Wolff Dick:

Yeah. I think the main thing about evolving interactions is that we are not talking, and won't be talking for long, about a single device. We're not talking anymore about only apps and how we interact with our phones, but about how things go beyond the phone, and we can go deeper when we talk about designing an experience. We can integrate technology into different devices. It's like we don't need to learn anything new, because technology is starting to adapt to things we already know how to use, as Neil mentioned, like our voice. So we no longer need to learn how to use a small keyboard on the phone, because we are starting to get experiences that let us use our voice with other devices, and they deliver the same value that we would get from learning how to use something new and completely different, as it was 10, 15 years ago.


Neil Redding:

Yeah, absolutely. I think that's a great point, Barbara. To add to it slightly: one of the things at ThoughtWorks that we find really, really useful, in fulfilling our role as we see it, is that part of what we do is help our clients and audiences make sense of what these technology trends imply, what they're doing, and how we, and their businesses, can take advantage of them. So one way of understanding what's happening with evolving interaction technology, and really technology more broadly, as you pointed to, Barbara, is that we're able to interact more naturally, in a more natural, human way, right, with technology.


Neil Redding:

I think in retrospect we're going to see even typing on a QWERTY keyboard, or using a mouse to click an arrow on a screen, as really primitive and unnatural, right? One of the things that I've heard Alex Kipman, who invented the Kinect and the HoloLens at Microsoft, say in his TED talk, which I thought was really powerful, is that before long we're going to look back at these several decades in human history as this really strange aberration. Because for hundreds of thousands of years before these few decades, and forever after, we will have been interacting with everything in three dimensions. But for these few decades, we compressed everything onto two dimensions, onto these 2D screens, right?


Neil Redding:

So that's kind of pointing to holographic or mixed reality and that kind of vision, but it's interesting to think about how, in the early stages of computational technology, we've had to just make do, just live with this 2D, these strange, relatively unnatural ways of interacting, right?


Alexey Villas Boas:

That's a good point, Neil, because the same thing happens if we look back, right? I think it's Alan Kay who has this great quote that says technology is anything that wasn't around when you were born. So I'm sure that young children will see things in a very different way. If we look back, I think we can see the same thing, or an analogous process.


Mike Mason:

Well, I mean, my kids are eight and 10 now. When they were much younger they had used iPads, and they thought the TV was broken because it didn't respond to touch input. Stuff like that is a very real example.


Neil Redding:

Well, I think it also points to the remarkable naturalness, right, that was achieved with the multi-touch interface of the iPhone and the iPad. We've all seen very young children, babies, successfully use that multi-touch UI on an iPhone or iPad, as well as very old people. At least I have: I've seen elderly people who had a really hard time using a PC, or even a Mac, just to get their email or browse the web. Once the iPad showed up, they were just like, "Ah, this is actually usable." And so much simpler, partly because of the operating system, but largely because the touch interface is just so much more natural than the mouse.


Barbara Wolff Dick:

And it's also interesting because you see kids interacting with tablets, but you also see elderly people showing almost the same behavior; it's like the tablet is easier. My nephew uses voice commands to search for YouTube videos, but my grandfather does the same. So it's interesting how much easier it is to get things done when it's something you are already used to, things we do on a regular basis, like talking to people, things that make us human. You stop noticing these micro-interactions that you have every day when you need to use technology, and the devices start to become invisible, or you don't have the same friction. It seems that the more touch points you have with technology, the more invisible it becomes.


Mike Mason:

Somebody pointed out to me, I think it was on Twitter, that all of these digital assistants, I think except Google's, have female names: Alexa, Siri, Cortana. Not only are they female names, but they are branded; Siri is the Apple brand. This person was almost petitioning for us to change that around and start doing the Star Trek version, which is just to call it "computer": computer is a subservient computing device that's going to help me out, whereas Siri is a weirdly personified, branded Apple thing. So maybe there's something to look out for there. What do you folks think?


Neil Redding:

Yeah, I can share on that. What's interesting, so, a few things. One is that my Amazon Echo, which is the Alexa device, allows you to change what they call the wake word, right? So I can actually address it as "computer". They thought to make that an available wake word, and I think it was very much an homage to Star Trek that they put it in there, because I can also say, "Computer, where is Captain Picard?" And Alexa will always say (sadly, there should be more to it, but it will always say), "Captain Picard is not on board the Enterprise." I wish I heard something more interesting about him. But so, there is that. I think, Mike, that what you're pointing to more generally is the complexity, right? Or the dangers, or the nuance, around how we choose to personify this technology, right?
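The Picard easter egg is essentially a canned response wired to a recognized phrase. A toy dispatcher in Python, with entirely hypothetical names (this is not the real Alexa Skills Kit API, which resolves intents through Amazon's NLU service), might look like:

```python
def handle_utterance(utterance: str) -> str:
    """Map a recognized phrase to a canned response, the way a
    simple voice-assistant easter egg might be wired up."""
    # A real assistant would run natural-language understanding to
    # classify the utterance into an intent; here we keyword-match
    # for illustration only.
    canned = {
        "captain picard": "Captain Picard is not on board the Enterprise.",
    }
    text = utterance.lower()
    for phrase, response in canned.items():
        if phrase in text:
            return response
    return "Sorry, I don't know about that."
```

Asking `handle_utterance("Computer, where is Captain Picard?")` returns the canned Enterprise line, while anything unrecognized falls through to the apology, which is roughly the shape of the fixed, scripted responses Neil describes.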


Neil Redding:

I mean, we read so much into voice. When we hear a voice, even if it's disembodied, right, if there's no person to look at, but it sounds female versus male, we automatically layer on or make a lot of assumptions, right? Names are also important, and there's a lot to how these personalities are designed to respond: do they behave and speak in a subservient way, or do they speak in a way that feels more like a peer? I mean, there's so much to this territory, right? From a design perspective, and I'd love to hear your thoughts on this, Barbara, it's so completely different from the design that we have known for even the last 10 years, building web apps and mobile apps (more than 10 years with web apps, certainly). Designing a voice personality and choreographing a conversation is so different, right, from a design perspective than wireframes for a web app, right?


Barbara Wolff Dick:

Yeah, of course. I mean, it gets so much closer to science fiction, because in the end we are designing a personality; we are trying to make code more relatable. So it's completely different from creating a graphic interface. You use different tools, you need different competencies, different capabilities; it's almost as if we need to study all over again. I mean, in design school you learn how to draw and how to design things, how to make artifacts. This is so different; there are so many topics to go into here, but we need to dive into psychology and all this human behavior, which is so much deeper than we are used to. Because it's completely different doing user research for a new app versus doing user research to understand which tone people would feel comfortable talking to, so you can program that into your voice assistant.


Alexey Villas Boas:

How can we move from there to take a look at the pragmatic angles? How do we leverage and combine all of this to really build applications? For those people who build stuff, what's the pragmatic approach, instead of just playing around with new technologies? What should we have in mind when we try to bring this to applications, to systems, and to businesses in general?


Barbara Wolff Dick:

I think in the end we are all trying to solve problems, right? We won't just be developing things because they are fun. I mean, it's fun to do new things, but we should be focusing on solving an issue someone has. So, if we use some emerging technology like VR or something, it should be focused on solving an issue.


Neil Redding:

Yeah, I think ... No, I think that's a great point. Just to add a little bit to that, right? You start from real problems, real pain points or aspirations, whether it's a client of ours at ThoughtWorks or one of their users, and you look carefully at those. But then it's also design thinking, right? It's really looking at the moment-to-moment, day-to-day journey or behavior of someone who's the target user of a system that we're building. What are they trying to do, how are they trying to do it, and what are the physical and emotional and interactive contexts in which they're doing whatever it is they're doing, right? Like we were saying earlier, we've actually seen this at a client, a large pharmacy chain in the US.


Neil Redding:

There was work that we did early last year in which, when we came on board, they were dealing with the fallout, kind of the failure from an interaction or experience design perspective, of creating a system that would let their pharmacists track and support the workflow of assembling prescriptions: the right pills into the right bottles, et cetera, into bags, and then packaging them for customers. They built a mobile app that supported all of this, and then what they found when they finally built it and got it into the hands of users to test (by the way, it was completely built by this time instead of being tested earlier) was that the pharmacists don't have any hands left.


Neil Redding:

I mean, their hands are being used to pull the pills, put them in the bottle, put them in the bag and do all these things. So having to also hold a mobile phone doesn't work for them at all. That kind of scenario is obviously a case study, a lesson in user-centered design and testing a prototype early, but it also points to an opportunity to use voice, right? To look at when a hands-free solution would be much more compelling, and then see what can be done with voice in that context, right? So just to add a little bit to what you were saying, Barbara, I think it's about looking at the scenarios and the context people are actually working in.


Mike Mason:

I think there are some non-obvious things to work on, kind of implying that all that stuff is obvious. I think there are some additional things to think about. We worked with an airport in the UK to design a chatbot interface for travelers walking around an airport and the things that they might want. What I thought was interesting was that our designers were trying to consider the right tone of voice for the chatbot system: not only to allow people to use it well, but also to broadcast a little bit of authority, so that people would feel confident in the answers it was giving them, and also to project the right brand for the airport through that interaction mechanism. I think branding is almost going to be a new frontier in all of these evolving interactions.


Neil Redding:

Yeah, it's fascinating. For a long time, when we build software that has a user interface, we've thought of that user interface as needing to be on brand, right? Needing to reflect the brand, because, to use a little bit of marketing speak, it's a brand touch point, right? If you're working for a big airline or a big pharmacy or a big financial services firm, or really any client, more and more these companies are competing based on the experience of the brand, right? The customer experience. So there needs to be consistency across all the different touch points.


Neil Redding:

Again, when we're talking about voice, it's kind of new territory, because these brands, if they're big successful companies, have figured out what their brand looks like in terms of a visual style guide, in terms of logo and visual treatment and all these things. But there's no existing directive around personality: voice, male or female, how does it speak, when does it speak? The technology is still at an early enough stage that, even imagining what the personality would be for a particular brand, it may not be implementable yet. But clearly it's about asking all these questions at this point.


Mike Mason:

I also think, when it comes to practitioners, we need to be looking at the reality of what these technologies can do: what do they let us do that we couldn't do before? I think one of the good examples of that is using virtual reality for training. Initially, people thought there wasn't a huge enterprise market for VR, because, what are we going to do, hold all of our meetings in VR? It just didn't seem that useful. But VR for training is different: you are creating, at will, a scenario that may not actually occur in the real world very often, and doing that through a VR experience.


Mike Mason:

One of the examples I heard was using VR to simulate the Black Friday rush at a retailer. That is just an incredibly useful use of new technology, because you can put someone in a situation the way that you would put a pilot through a rigorous training process for those once-in-a-lifetime things that go wrong, and make sure that they can actually deal with them. Now you can do that in a much more mainstream way using VR.


Neil Redding:

And I mean, Walmart is actually a company that announced they were doing that, Mike, for Black Friday. Again, at ThoughtWorks, back in early 2017, we created a VR prototype simulation that allows airplane mechanics to be in a hangar with a fully accurate 3D model of an airplane engine; there is a piece on the Insights blog on our site that covers the thinking behind it, and also a couple-of-minute video depicting it. It's interactive: there are movable panels that showcase the maintenance history, the repair history, the maintenance manual content. It also showcases how AI and machine learning, building on the data available to a system like that, can automatically suggest, based on the data and the diagnosis and so on, where to look to solve a particular repair.


Neil Redding:

As you're saying, Mike, over the past year VR has become a very clearly powerful approach to creating simulations that support training, just like flight simulators have existed for pilots for decades, right? These are the same kinds of things for worker training; UPS is using it to train their shippers, and Fidelity Investments even announced they are using VR to allow their analysts to interact with data and visualize it in a more immersive, rich and insightful way. So there's a lot that can be done.


Barbara Wolff Dick:

I was just going to add to this that we're now able to create situations that would actually cost money to recreate. In the case of the airplane engines, it's hard to get an engine that has the exact issue you want people to understand. Now, with VR, you're able to train someone to perform the maintenance you need in maybe half the time; you can do in, I don't know, two months something that would take six months. This possibility is what I was talking about when looking at the issue you need to solve, right? It's not only using the tech because it's nice and cool, but using it to help your business focus on things you actually need to get done.


Alexey Villas Boas:

Yeah, and I think you touched on a key point, Barbara. These are all very good examples of how to move beyond the hype; not using cool stuff just because it's cool, but using it because it's really useful for the business and can generate business results. I guess it also requires other organizational changes, right? A lean, hypothesis-driven mindset, and the technology platform has to support it, and all that. But that might be a different conversation.


Mike Mason:

Well, I mean, I think it depends. For some of this stuff there are proven solutions now, VR for training for example, so that to me is less speculative. Whereas for some of the other stuff you might be doing, you should probably be taking a lean, incremental approach and testing things early. I think that's one of the things we talk about at ThoughtWorks all the time: how can I do as little as possible to test and validate this thing? That even includes things as simple as not actually using the new technology, but just using some very low-fi prototype, like pieces of paper or whatever, just to see whether it will even work in the context you're trying to apply it to.


Neil Redding:

Absolutely. The great thing about the way we work at ThoughtWorks, just to add a little bit to what you were saying, Mike, is that even though these technologies are new, the way we pair designers and technologists, developers and strategists and product people together allows us to very quickly test out hypotheses. Like the airplane mechanic training simulation in VR that we put together: it was a two or three week span, we brought the right people together, we were very focused, and we created it in partnership with our air travel team, experts in the airline business, and we were able to put something together and get it in front of our clients in a matter of several weeks. So you actually can apply this lean, hypothesis-driven prototyping approach to a whole range of new technology as well.


Alexey Villas Boas:

Okay, everyone. So, it's been a great conversation and great to have you with us, but we're coming to the end of the episode. Thank you very much for joining and see you next time. Bye.
