Edition #31 | April 2024

Intelligent services at scale: Data foundations to deliver on AI’s promise

 

Introduction: Too fast, too soon?

 

AI is the technology of the moment, and the foreseeable future – but Thoughtworks experts active in the field are urging caution. Whether they’re anxious to capture opportunities or just responding to the business version of peer pressure, some organizations are rushing to adopt AI without first laying the groundwork needed to put it to productive use, or considering how it will integrate into workflows to generate value for their stakeholders. 

 

“Board members, because they hear so much about the AI hype, may be pushing management to do something so they can announce AI investments to the media and shareholders,” notes Christine Welsch, Market Director, Automotive & Manufacturing, Thoughtworks. “Elsewhere in the business, some teams, especially those in organizations that work with legacy technology, may be bored with the status quo and eager to explore something new. Those are examples of just two forces that support false starts.” 

 

The good news is that adopting AI doesn’t have to be a knee-jerk decision, and that even at this relatively early stage, there are positive examples to follow. Clear paths to AI-driven productivity and achievement have already been travelled by leading enterprises like the BMW Group in Germany, which has worked closely with Thoughtworks teams to enhance its capability to deliver AI-powered solutions embraced by internal and external customers. What lessons does its experience hold for other organizations when it comes to applying AI to create meaningful outcomes, and to navigating the challenges faced along the way?

 

Effective use of AI, and the problems that prevent it, ultimately come down to data. Organizations may struggle to source or manage the data needed to provide a sound foundation for AI, or may not even be sure it exists, given the overwhelming volume and complexity of data the average company must now contend with.

 

Many automakers, like companies in other industries, generate data as a byproduct of their activities – from sales to production through to customer support – yet have rarely seen a need to do anything with it until recently.


“Leveraging data was never a priority in manufacturing companies, but it is something everyone has to learn now,” says Welsch. “AI is a wakeup call, because companies now realize there are business opportunities that they are not tapping.”

 

While it’s positive that data is more widely understood to be an asset, awareness has also raised more questions about whether data is of sufficient quality to serve as a basis for AI-generated insights and predictions. It’s easy to understand why this is a major source of concern, especially in non-technology functions where there may be less certainty around where data comes from, or how it’s been used.

 

A significant proportion of tech and business leaders are skeptical of their organization’s data quality

Source: Salesforce

 

Issues like these point to the necessity of a robust platform where high-quality data is vetted, integrated and made easily accessible as a prerequisite for a successful AI strategy capable of delivering business outcomes over the long term. This is not necessarily as complex or as costly as many organizations expect, but the experience of firms like BMW Group shows the process needs to be mapped out and tackled deliberately. 


Drawing on the views and hands-on experience of both BMW Group and Thoughtworks experts, this special edition of Perspectives will explore how to create a scalable, cost-efficient and future-ready platform for AI-based connected services and products that serve organizational priorities and foster customer loyalty.

 

 

 

i. Putting data assets to use    

 

A relatively early adopter, BMW Group embarked on its Connected Vehicle AI platform initiative in 2021, when it became clear that the demands for intelligent vehicle features and functions, both from customers and within the organization, were surging. 

 

Oliver Gruber, BMW Group’s Head of Connected Vehicle Data & AI Functions, explains that many business units wanted to develop AI-based solutions, but faced a high technical barrier to entry. That underlined the need for privacy-compliant access to data for machine learning, for data engineering tools, and for the ability to operate and monitor machine learning models. An AI platform was seen as the only way to bring down the barrier and get started with AI in earnest. 

 

Another key motivation was to overcome the challenge of running AI at scale. BMW Group had to operate its machine learning and AI models worldwide and make them available for its entire connected car fleet. This was a significant challenge for data science teams in that they not only had to develop and train machine learning models, but also scale and run them in production.

 

This AI platform also had to be ‘self-service,’ in that it enabled teams within the organization to develop, maintain and operate AI use cases relatively independently.

 

Importantly, teams from Thoughtworks and the BMW Group took steps to ensure the ability to deliver value was in focus from the start. 

 

“The very first thing that’s needed before starting any AI project or data product is the use case,” says Welsch. “That means asking ‘What is the value we will get from implementing this?’ Without a use case, it’s like a playground. Any efforts become pure experiments and risk ending up as sandboxes that produce nothing worth serious investment.”


By contrast, “identifying a problem or need that could potentially be solved with a data product or AI is the best starting point, because then there will be value behind it,” she adds.

 

Unfortunately, this is not always as straightforward as it sounds, and AI can become the perennial solution in search of a problem. Especially when it comes to GenAI, identifying the right use cases is cited as a major obstacle by both adopters and non-adopters, with over half of respondents to a recent O’Reilly poll on GenAI implementation naming this the biggest challenge they face.

 

Conducting a discovery workshop to freely brainstorm ideas and record those deemed most relevant can help organizations identify potential use cases, assess the extent of opportunities and risks associated with each, and prioritize accordingly. The key is to focus on the concrete outcomes for customers or end-users, and not get distracted by the urge to create value from the organization’s vast data stores for the sake of it, Welsch points out. 

 

“When the client says ‘I have so much data, what can I do with it?’ that’s the wrong question,” she says. “The question should be ‘I have so many problems, which of them can I solve with data?’ Leaders in the field will be the ones that really understand their market, and how data products will make a difference in the consumer behavior or appetite for the product or service.” 


Armed with these insights, an AI-ready data foundation positions companies to quickly access and repurpose data and models to serve the use cases they’ve prioritized.  

 

BMW Group’s Connected Vehicle AI Platform, for instance, “enables data scientists working on connected vehicle use cases inside the company to interact with any data sent by the vehicles and derive their actions out of this,” notes Fabian Nonnenmacher, Business Analyst, Thoughtworks. 

 

One of the main use cases of the AI platform is proactive customer care – that is, detecting or predicting vehicle maintenance demands even before they arise. This is seen as a massive asset in terms of providing the best customer experience. 

 

This predictive maintenance capability works by tracking several key vehicle performance indicators, such as the health of vehicle parts, and triggers alerts when potential service demands are detected, notes Biplob Biswas, Lead Data & Machine Learning Engineer, Thoughtworks. “Drivers not only get a notification on service demands in their app, but are proactively contacted by their chosen dealer, so that they can head for the service center before the problem escalates.”
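As a simplified illustration of the pattern Biswas describes – tracking health indicators and raising a proactive alert when a service demand is detected – the following Python sketch uses hypothetical indicator names, thresholds and a stand-in notification step; it is not BMW Group’s actual telemetry schema or detection logic.

```python
from dataclasses import dataclass

# Illustrative vehicle health indicators; names and thresholds are hypothetical,
# not BMW Group's actual telemetry schema.
@dataclass
class VehicleHealth:
    vin: str
    brake_pad_wear_pct: float      # 0-100, higher means more worn
    battery_soh_pct: float         # state of health, lower means degraded
    km_since_last_service: int

def detect_service_demands(health: VehicleHealth) -> list[str]:
    """Return a list of service demands that should trigger a proactive alert."""
    demands = []
    if health.brake_pad_wear_pct > 80:
        demands.append("brake pad replacement")
    if health.battery_soh_pct < 70:
        demands.append("battery inspection")
    if health.km_since_last_service > 30_000:
        demands.append("scheduled maintenance")
    return demands

def notify(health: VehicleHealth, demands: list[str]) -> None:
    """Stand-in for pushing an app notification and informing the preferred dealer."""
    for demand in demands:
        print(f"[{health.vin}] proactive alert: {demand}")

if __name__ == "__main__":
    sample = VehicleHealth(vin="TESTVIN0000000001", brake_pad_wear_pct=85.0,
                           battery_soh_pct=92.0, km_since_last_service=12_000)
    notify(sample, detect_service_demands(sample))
```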

 

Another example of proactive care at BMW Group is the capability to accurately identify not just the presence, but the viability of charging stations for electric cars. 

 

“Drivers of electric cars want to find charging stations with a certain level of accuracy, but the information BMW Group gets from charging point operators is not always correct,” says Welsch. “That was the problem BMW Group wanted to solve. They decided to map data from car sensors to information provided by operators to validate that information, and automatically improve data accuracy.”  

 

Using an anonymized set of vehicles to verify both that a charging station is present and that it delivers the power the charging point operator has documented allows BMW Group to map and validate public charging infrastructure faster than relying on operator reports alone, Welsch notes. 
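A rough sketch of that cross-check could look like the following; the identifiers, field names, tolerance and sample counts are invented for illustration and do not reflect BMW Group’s actual data model.

```python
# Illustrative cross-check of operator-declared charging points against anonymized
# charging sessions observed by vehicles.

operator_records = {
    "CP-1042": {"declared_kw": 150},
    "CP-2201": {"declared_kw": 50},
}

# Each observation: (charging_point_id, measured peak power in kW)
vehicle_observations = [
    ("CP-1042", 147.0), ("CP-1042", 149.5),
    ("CP-2201", 11.0),  ("CP-2201", 10.5),   # far below the declared 50 kW
]

def validate(records, observations, tolerance=0.2, min_samples=2):
    """Flag charging points whose observed power deviates from the declared value."""
    by_point = {}
    for point_id, kw in observations:
        by_point.setdefault(point_id, []).append(kw)

    results = {}
    for point_id, record in records.items():
        samples = by_point.get(point_id, [])
        if len(samples) < min_samples:
            results[point_id] = "insufficient data"
            continue
        observed = sum(samples) / len(samples)
        declared = record["declared_kw"]
        ok = abs(observed - declared) <= tolerance * declared
        results[point_id] = "confirmed" if ok else f"mismatch (observed ~{observed:.0f} kW)"
    return results

print(validate(operator_records, vehicle_observations))
```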

 

While perks like these may not be the sole reason a consumer chooses to buy BMW Group cars over other brands, they can definitely be a major consideration. “If a friend who has an EV is annoyed by the number of charging points they need to drive to before finding one that actually works and tells you about that, it might deter you from buying one,” Welsch says.   

 

Another potential feature being developed is the ability to accurately predict the health and performance of electric car batteries by monitoring their charging statistics. 

 

“Battery health can be affected by various factors such as the battery’s age, weather conditions, and the vehicle owner’s driving style,” Biswas explains. “If inadequately monitored, it might perform below expectations.” 

 

“What’s more, as it is currently hard to collect and use real-time battery health information in backend systems, estimates from ML algorithms are used in their place,” Biswas adds. “Any inaccuracies in these estimates can lead to higher error detection rates during vehicle inspections."

 

Heightened monitoring capabilities enable a vehicle owner to easily assess their car battery’s performance – such as how much charge it has, and how much it can retain – in changing real-world circumstances, via the dashboard or app. 
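To make the idea concrete, the sketch below estimates a battery’s state of health from charging sessions using a simple energy-ratio heuristic; the field names, the nominal capacity figure and the heuristic itself are illustrative stand-ins for the production ML estimates described above.

```python
# Illustrative estimate of battery state of health (SoH) from charging sessions.

NOMINAL_CAPACITY_KWH = 80.0  # hypothetical pack capacity when new

charging_sessions = [
    # (state of charge at start %, state of charge at end %, energy delivered in kWh)
    (20, 80, 45.6),
    (10, 90, 60.8),
    (30, 70, 30.4),
]

def estimate_soh(sessions, nominal_kwh=NOMINAL_CAPACITY_KWH):
    """Estimate usable capacity from how much energy each charge actually stored."""
    estimates = []
    for soc_start, soc_end, energy_kwh in sessions:
        soc_delta = (soc_end - soc_start) / 100.0
        if soc_delta > 0:
            estimates.append(energy_kwh / soc_delta)   # implied full-pack capacity
    usable_kwh = sum(estimates) / len(estimates)
    return 100.0 * usable_kwh / nominal_kwh

print(f"Estimated state of health: {estimate_soh(charging_sessions):.1f}%")
```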

 

With a clear idea of the customer or end-user problems they want to solve, organizations can begin to shape a platform with all the required features to execute that vision of success. 

 

In the case of BMW Group’s AI platform, the key metric was “time to market for AI use cases,” Nonnenmacher notes. “Our goal for the platform was to make data scientists’ lives easier, so they can focus on the actual data science work and bring the data models and pipelines into production. This means that BMW Group can roll out new use cases, ideas or product features in the AI domain faster, with less investment.”

 

 

 

ii. Tech elements of an optimal foundation   

 

Once a vision is in place, organizations can use that to guide the construction of an AI-ready data foundation. 

 

More often than not, the most complex task is providing access to consistently high-quality data, which is critical to the success of AI and machine learning models. 

 

“For (AI) use cases, the main goal is to have everything reproducible, so you have a basis for comparisons,” says Nonnenmacher. “This can mean training a model today with a certain data product, and someone else can validate the results by running the same training with the same inputs.”
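In code, that notion of reproducibility comes down to pinning every input to a run. The sketch below is a minimal illustration – the data product name, version and hyperparameters are invented – of how a fixed data snapshot, fixed parameters and a fixed seed let a second person rerun the same training and compare results.

```python
# Minimal sketch of reproducible training: the run is fully determined by a pinned
# data product version, fixed hyperparameters and a fixed random seed.
import hashlib
import json
import random

run_config = {
    "data_product": "charging-sessions",
    "data_version": "2024-03-01",   # pinned snapshot, never "latest"
    "hyperparameters": {"learning_rate": 0.01, "epochs": 5},
    "random_seed": 42,
}

def train(config):
    """Stand-in for model training: deterministic given the same config."""
    random.seed(config["random_seed"])
    # ... load config["data_product"] at config["data_version"], fit the model ...
    return {"accuracy": round(0.90 + random.random() * 0.01, 4)}

# A hash of the config plus the resulting metrics is enough to compare two runs.
fingerprint = hashlib.sha256(json.dumps(run_config, sort_keys=True).encode()).hexdigest()[:12]
print(fingerprint, train(run_config))
print(fingerprint, train(run_config))   # identical output: the run is reproducible
```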


“As new data is created, it is fed to the inference system to predict results, and if they are not ideal, the training system is used to train a new model,” adds Biswas.  

 

“In the connected AI domain, for example, there are a lot of cars out there, and they upload new data every day,” he explains. “More data is collected, which is of course good because you can derive more conclusions out of it, but the challenge lies in converting the data so it’s consistent with the existing data set and use case. Organizations have to make sure that when they create reproducible pipelines or reproducible training, they're always using high-quality data.”  

 

Fundamentally, high-quality data that can serve as a foundation for training machine learning models needs to be prepared and processed in a certain way. According to Nonnenmacher, the minimum setup required is a hub that enables data governance and provides searchability, as well as access to data so that teams can use it even if they didn’t directly create or collect the data themselves. 

 

Following that, an environment suitable for data exploration needs to be established. “Typically applications like Jupyter Notebooks can be used to explore data and prototype ideas,” Nonnenmacher notes. “The next step is fitting everything into a reproducible and comparable state. The platforms we build will then allow a team to define pipelines, where they can put all these steps into code, and rerun experiments based on different parameters.” 
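A minimal, tool-agnostic illustration of putting those steps into code might look like the following, where the same pipeline is rerun with different parameters; the step names and plain-function orchestration are assumptions, as real platforms typically delegate this to a workflow engine.

```python
# Sketch of a parameterized pipeline whose steps are expressed as code and can be
# rerun end to end as separate experiments.

def load_data(data_version: str) -> list[float]:
    # Stand-in for reading a versioned data product.
    return [0.1 * i for i in range(100)]

def build_features(rows: list[float], window: int) -> list[float]:
    # Simple moving average as a placeholder feature-engineering step.
    return [sum(rows[i:i + window]) / window for i in range(len(rows) - window)]

def train_model(features: list[float], learning_rate: float) -> dict:
    # Stand-in for fitting a model; returns fake metrics.
    return {"rmse": round(1.0 / (1 + learning_rate * len(features)), 4)}

def run_pipeline(params: dict) -> dict:
    rows = load_data(params["data_version"])
    features = build_features(rows, params["window"])
    return train_model(features, params["learning_rate"])

# The same pipeline, rerun as two experiments with different parameters.
for experiment in ({"data_version": "2024-03-01", "window": 5, "learning_rate": 0.01},
                   {"data_version": "2024-03-01", "window": 10, "learning_rate": 0.05}):
    print(experiment, "->", run_pipeline(experiment))
```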

 

At BMW Group, teams were able to leverage three existing platforms to get a head start, Biswas notes. One collected data from the vehicle fleet based on customer consent, and another was a data lake connecting all producers and consumers of data. The third was a separate platform handling the majority of networking, access and scalability requirements for the various locations where the carmaker operates, including the European Union and China. Because the Thoughtworks team could work with existing infrastructure, the process of building an AI platform was accelerated, with the first use case onboarded in just four months and going live six months later.  

 

Once the AI-ready data foundation is in service, usability and the developer experience are key determinants of the platform’s success. “The AI area is growing significantly and there are always new tools being made available, so it is important that people enjoy using them and find them valuable,” says Nonnenmacher. “Sometimes in machine learning, things take time, but wherever possible, we embed fast feedback loops to accelerate improvements in the user experience.”  

 

Components of an AI/ML-ready data foundation

Source: Thoughtworks

 

"Adapting agile software development practices, such as continuous delivery, and applying them to data problems can potentially help to solve challenges that arise along the way, such as bringing data and models to production,” Nonnenmacher adds.

 

 


 

iii. Scaling up while controlling costs and maintaining standards 

 

Getting started may not be easy, but scaling up is often the stage at which organizations experience the most roadblocks when creating a data foundation for AI, especially in larger companies where maintaining the flow of communication between teams is always a challenge. 

 

Organizations should aim to construct an automated, “repeatable setup that serves different use cases because otherwise, everyone starts from scratch and is reinventing the wheel all the time,” Biswas says. “Once you have a standard in place, and a repeatable set of tasks for users of data, the platform works better for everyone.”

 

Repeatability and consistent quality, therefore, are fundamental concepts that not only underpin AI platforms but also ensure they can evolve to meet future requirements.   

 

For a firm like BMW Group with an international setup, scalability and availability are particularly important, notes Gruber. BMW Group needs to provide runtimes for models that can scale based on the demand coming from a fleet, which varies depending on time of day. This makes elasticity – that is, the platform’s ability to scale up and down in response to demand – essential.
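The rule of thumb behind such elasticity can be shown with a toy calculation; the per-replica capacity and demand figures below are invented, and a production system would delegate this decision to a container autoscaler rather than hand-rolled code.

```python
# Toy illustration of elasticity: the number of model-serving replicas follows the
# request rate coming from the fleet, which varies over the day.
import math

REQUESTS_PER_REPLICA = 200       # assumed capacity of one model-serving replica
MIN_REPLICAS, MAX_REPLICAS = 2, 50

def desired_replicas(requests_per_second: float) -> int:
    wanted = math.ceil(requests_per_second / REQUESTS_PER_REPLICA)
    return max(MIN_REPLICAS, min(MAX_REPLICAS, wanted))

# Hypothetical fleet demand over a day (requests per second at selected hours).
for hour, rps in [(3, 120), (8, 2400), (12, 1500), (18, 3900), (23, 400)]:
    print(f"{hour:02d}:00  {rps:>5} req/s  ->  {desired_replicas(rps)} replicas")
```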

 

“The whole idea of an AI platform is to make use of synergies to enable multiple use cases – that’s what we mean by scaling,” says Nonnenmacher. 

 

However, this runs up against the reality that “users or data consumers may have very diverse backgrounds and often very different needs,” he adds. “So every team may be using a different set of tools to leverage that data.”

 

Organizations also need to be aware of the “tradeoff between time to market, and actually being relevant and then scaling out to a lot of different users,” Biswas points out.   

 

While turning to managed cloud services can help organizations keep their initial investments down and provide a means to scale without worrying about resource requirements, or the capacity to maintain a growing platform, such arrangements come with constraints. 

 

“One issue with cloud platforms is that their capabilities are very generic in nature,” Biswas notes. “You end up having to customize them to your requirements and those costs need to be factored in.”  

 

So too do the overall costs of increased cloud services usage, which can quickly add up. 

 

For many businesses, “the beauty of any cloud provider is that they provide different abstraction layers, or shared responsibility models, so there is an entire range where you can optimize for cost,” says Biswas. “It can go from a basic model where you spawn virtual machines for specific uses, to the other end of the spectrum with a completely managed machine learning environment where you just have to put the code in, it starts running, and you can deploy the model in one day.”  

 

Using a managed service on the cloud reduces operational effort, but if data processing demands rise, or there’s a need to expand to different regions, the cost aspect increases significantly. In a large organization where a dedicated Platform DevOps team is responsible for introducing new features and keeping up with security or other updates, costs can be controlled and eventually reduced as they onboard more use cases and achieve economies of scale, Biswas notes.   


“The hybrid model we built for BMW Group hits that sweet spot where you can scale to infinity but at the same time, you don't have to manage how the scaling works,” says Biswas. “It enables the organization to start small and quickly with managed services, then run everything natively within the Kubernetes environment, which scales really well.”

 

Other costs that may need to be factored into the upkeep of an AI platform include having service teams on call “to serve users in different time zones and also to meet service level specifications,” Biswas adds. 

 

As organizations mature in their approach to leveraging enterprise data and build custom platforms, certain conditions need to be met to ensure these provide a conducive environment for AI experimentation over the long term.

 

One is the ability to conduct ongoing performance monitoring when models are deployed to production, which is a major consideration for BMW Group. As Gruber points out, model behavior is massively dependent on the data that is used to make predictions, which can change over a model’s lifetime. It is therefore necessary to be able to monitor this and to have an alerting system for data engineers that informs them when a model’s performance drifts and it needs to be retrained.  

 

“You can never fully control a machine learning model because we’re not working with deterministic algorithms, so there’s always a risk of getting false data,” says Nonnenmacher. “Monitoring the models and validating the data are measures that ensure the potential harm from your predictions is minimized. Having an auditing process or documenting what's actually going on also forms a large part of assessing a model.” 
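As a simple illustration of what such monitoring and alerting can look like – using a deliberately basic mean-shift measure rather than the drift metrics a production system would rely on – the sketch below compares live feature values against the training distribution and alerts the data engineers when the difference grows too large.

```python
# Minimal drift-monitoring sketch: compare incoming feature values against the
# training distribution and alert when the shift exceeds a threshold.
from statistics import mean, stdev

def drift_score(training_values: list[float], live_values: list[float]) -> float:
    """How many training standard deviations the live mean has moved."""
    spread = stdev(training_values) or 1e-9
    return abs(mean(live_values) - mean(training_values)) / spread

def check_and_alert(feature: str, training: list[float], live: list[float],
                    threshold: float = 3.0) -> None:
    score = drift_score(training, live)
    if score > threshold:
        print(f"ALERT: '{feature}' drifted (score {score:.1f}); consider retraining")
    else:
        print(f"OK: '{feature}' within expected range (score {score:.1f})")

training_charge_temp = [20.1, 21.5, 19.8, 22.0, 20.7, 21.2]
live_charge_temp = [31.0, 32.4, 30.8, 33.1, 31.6, 32.0]   # warmer season: model may drift
check_and_alert("charging_temperature_c", training_charge_temp, live_charge_temp)
```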

 

The other baseline requirement for confidence in a platform is data transparency. For data scientists that train machine learning models and produce new data based on existing information, data lineage, which details the data’s origin, destination, and changes in between, has to be a given, notes Nonnenmacher. 

 

“Data transparency plays a significant role in that equation as it makes visible which model is trained on which data, so data scientists are aware when they are dealing with personally identifiable information (PII) data,” he says. Furthermore, it enables data scientists to recognize biases captured in the models and comply with ethical AI principles.
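A minimal sketch of the kind of lineage record that makes this visible might look as follows; the record structure and names are illustrative rather than any specific lineage tool’s schema.

```python
# Sketch of lineage metadata: which data products went into a model, whether any of
# them contain PII, and when the model version was produced.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DatasetRef:
    name: str
    version: str
    contains_pii: bool

@dataclass
class LineageRecord:
    model_name: str
    model_version: str
    inputs: list[DatasetRef]
    created_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def requires_pii_handling(self) -> bool:
        return any(ds.contains_pii for ds in self.inputs)

record = LineageRecord(
    model_name="battery-health-estimator",
    model_version="1.4.0",
    inputs=[DatasetRef("charging-sessions", "2024-03-01", contains_pii=False),
            DatasetRef("customer-profiles", "2024-02-15", contains_pii=True)],
)
print(record.requires_pii_handling())   # True: data scientists know PII is involved
```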

 

An overarching data governance framework can equip the organization with an operating model and structure to work towards meeting and maintaining data quality and regulatory standards. Having the right authorization and authentication to ensure data residency regulations are met preempts any compliance challenges, as does ensuring the right data access permissions are in place, Biswas notes. That means organizations need to implement auditing or tracking capabilities to establish clear lines of sight whenever data is used. 

 

Data privacy and security are also key factors that need to be addressed. At BMW Group, customer consent is explicitly sought and drivers must opt in before any data is collected. 

 

Biswas also notes that for BMW Group, personalized data is encrypted by default and additional steps are taken to create more layers of protection where necessary. 

 

“Whenever a customer marks any attribute as particularly sensitive, this data attribute is encrypted and pseudo-anonymized so it retains the necessary patterns required to train machine learning algorithms without revealing the actual data,” Biswas says. “If the actual data is needed, data engineers have to request that these data sets be decrypted. This entire process is completely auditable, so opportunities for data leakage are minimized and there is no ambiguity in terms of how data is accessed or used.”  
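The pattern-preserving part of that approach can be illustrated with a keyed hash that replaces a sensitive value with a stable token: the same customer still maps to the same token, which is what training needs, while the raw value stays hidden. The sketch deliberately simplifies key handling and omits the reversible encryption and audited decryption requests described above.

```python
# Illustration of pseudonymizing a sensitive attribute with a keyed hash.
import hashlib
import hmac

SECRET_KEY = b"demo-only-key"   # in practice, held by a key management service

def pseudonymize(value: str, key: bytes = SECRET_KEY) -> str:
    return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()[:16]

trips = [
    {"driver_id": "customer-123", "avg_consumption_kwh_100km": 17.2},
    {"driver_id": "customer-123", "avg_consumption_kwh_100km": 18.0},
    {"driver_id": "customer-456", "avg_consumption_kwh_100km": 21.5},
]

for trip in trips:
    trip["driver_id"] = pseudonymize(trip["driver_id"])

# Same customer still groups together for training, but the raw identifier is gone.
print(trips)
```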

 

BMW Group is also taking steps to improve their machine learning algorithms to “identify biases in a model, and create a model lineage to find out what data generated which model, what hyperparameters were used to train these models and what code was responsible,” according to Biswas.  

 

The most crucial feature influencing a platform’s continued viability is automation, which allows all these steps – from data access through to deploying models into production – to run in a continuous, iterative and compliant way. 

 

That means implementing the principle of continuous delivery for machine learning (CD4ML) in the platform and creating automatable pipelines to execute all the steps in the process. The benefit is that the platform combines and integrates the full set of tools data scientists need. 

 

 

 

iv. Cultivating buy-in, ongoing education: Platforms as a process  

 

As efficient, experience-rich or compliant as a platform may be, its true value can ultimately be measured only by how it’s utilized. 

 

In the case of BMW Group, different departments have KPIs to track the success of the specific use cases. With the Connected Vehicle AI platform itself, Gruber considers adoption to be the main success indicator, and something they need to monitor very closely.  

 

However, as Welsch points out, organizational issues that may have nothing to do with the platform’s capabilities can get in the way of adoption. “Typically, the broader the use case and the stronger the business case, the more complex the organizational challenges that come with it,” she notes. 


For example, value creation can be impeded if data product owners fail to garner sufficient support across departments. “You have to convince other teams of the value of investing in the data product,” notes Welsch.

 

One preventative measure is to monetize data products internally to incentivize ownership. Thoughtworks experts also point out the importance of gathering insights from continuous discovery and feedback loops, which can highlight different use cases or data consumers’ varying usage requirements. 

 

Building either a highly tailored or widely applicable product comes with tradeoffs. One strategy, according to Welsch, is to build a product so it serves a specific use case – but also retains some flexibility to grow and serve others. 

 

In situations where the platform is just getting started or the value of a data product is in dispute, having a strong champion to steer the course and bring people together, as was the case with BMW Group, makes a huge difference. “In other projects, we have seen sudden changes in strategies when the champion is easily swayed by the opinions of other internal stakeholders and that doesn’t work,” warns Biswas.    

 

Even when the appetite clearly exists, users’ lack of experience can hinder their ability to make use of the platform.

 

In the auto sector example, many executives “have a wealth of experience in machine engineering and automotive manufacturing but because the integration of software only became relevant in the past five years, lack hands-on experience with software development,” says Welsch. “It’s hard for them to go back to that operational level and really understand why things are as they are, why software developers should work with hardware developers and why data from a car is relevant for anybody.” 

 

“Expertise is also needed to bring machine learning use cases into production, and only then can they create value,” notes Nonnenmacher. “Different teams have very different knowledge and backgrounds – and the challenge is to teach these users to use the platform so that they can leverage it to their full advantage.”


In the process of onboarding different use cases for BMW Group, platform teams faced the challenge of “accommodating users with a diverse set of skills and a range of applications, from research-oriented use cases to engineering-oriented ones,” says Biswas.

  

“We needed to make the platform more understandable for everyone, whether it’s someone who doesn't even know how to use GitHub or Python, or people who have been doing continuous integration and continuous delivery every day,” he adds. “This has been an iterative process, where we’ve increasingly made life easier – and we’re seeing the benefits of that right now."  

 

This is why building a platform comes with a certain responsibility to educate and enable teams across the organization, a duty that at companies like BMW Group has far-reaching impact. 

 

One crucial step Gruber’s team took was to introduce a dedicated resource with Thoughtworks to onboard teams on the platform right from the beginning. According to Gruber, this turned out to be a highly effective practice because for many teams, AI was completely new. Many of them had one main question: where to start.

 

Teams are supported from the very beginning when they come up with a use case idea, Gruber notes. They are provided with guidance throughout the process, including help to identify the skills they need, and requirements for the technical setup. In some cases, they collaborate with platform and Thoughtworks experts on proofs of concept. 

 

The effects are maximized through division of labor, adds Nonnenmacher. 

 

“We focus on enabling use cases by onboarding them quickly and providing a good user experience, but also identifying potential feature gaps, and finding solutions,” he says. The platform team, on the other hand, focuses more on the data and strategic development of the platform, such as creating new features, implementing them properly, and making them more stable.   

 

Aside from easing the onboarding process, efforts were also made to address the short-term needs of users by scaling up the capacity, so multiple teams could be onboarded faster, Nonnenmacher notes. “To achieve that, we created sample templates for them to get started and also provided a software development kit that allows easier interaction with the platform.” 

 

The team also fostered adoption by specifically targeting users whose work would be impacted. 

 

“For those who were concerned with the prospect of changing their entire codebase, we took a lift-and-shift approach to migration, so they could start using their existing codebase on the platform with minimal disruption,” Biswas explains. “We then worked together with them on the use cases, to show them how they can follow similar approaches, but enable best practices. For example, continuous delivery and integration so that they can see their changes and pipelines directly on the UI provided by the platform. Once they had this experience, they became proponents of our platform.”  

 

Generally, “users who don't have very strong opinions about what they want to use, or how they want to use it, have been very open and quick to adopt the platform,” Biswas says. “But with others, there was a need to provide more handholding in the onboarding process and multiple check-ins to see if they are facing any issues.”   

 

Weekly sessions were also set up to answer questions ranging from the platform’s architecture and missing features to non-technical matters, Biswas says. These were further supplemented by online channels for guidance, onboarding documentation and videos.  

 

Initiatives like these ensure that after a strong start, BMW Group can keep its eyes firmly on what comes next, in terms of both use cases and the platform’s evolution. 

 

Thinking about how to extend the platform and closely watching market trends are top priorities. GenAI is also one of the strategic topics BMW Group is looking into currently, given its rising adoption.  

 

Much like BMW Group, “organizations should have a vision in place when they begin thinking about something like this, because bringing it to fruition doesn’t happen overnight,” Biswas says. “It’s an iterative process and involves thinking through what they need as a company, what they want to provide to the users, integrating the platform into the existing ecosystem and their scalability model. Once that is in place, adopting a growth mindset will help organizations manage the process and constantly move to the next level.”   


About the contributors

Biplob Biswas

Lead Data & Machine Learning Engineer, Thoughtworks

Fabian Nonnenmacher

Business Analyst, Thoughtworks

Christine Welsch

Market Director, Automotive & Manufacturing, Thoughtworks

Oliver Gruber

Head of Connected Vehicle Data & AI Functions, BMW Group

