
Hostile tech

Confronting challenges in security, ethics and privacy 

 

As we noted in last year’s Looking Glass, technology’s rapid advance has been accompanied by negative impacts, whether deliberate or unintended. ‘Hostile’ technology continues to manifest in multiple ways, including bias in AI, addictive technology and certain synthetic media. This year we are focusing on threats to security and consumer privacy, because we see those threats — and the subsequent response — ramping up in the year ahead. 

 

Balancing evolving regulations, changing expectations, the need to get closer to consumers and simply doing the right thing will be critical to remaining competitive and fostering customer loyalty. The speed at which regulation is changing means it won’t always be realistic to keep up — our data privacy specialists use an automated service to track changes and, operating in 18 countries, receive an email detailing adjustments nearly every day. 

 

What’s more, simply complying with regulation won’t always fix the problem. Standards like the European Union’s (EU) General Data Protection Regulation (GDPR) mean that consumers are warned about the privacy implications of websites or applications, and invited to ‘opt in’ or ‘accept’ as a matter of course — yet few would claim to actually understand what they’re signing up for. Given the way things are developing, rather than relying on reactive measures, enterprises should proactively create ethical frameworks to guide the use of technology and data. These can create a firm baseline of respect and security for their customers and minimize the chances of consumer or societal harm. 

In the short time since we published this edition of Looking Glass, tools like ChatGPT have taken the world by storm. While full of promise, we believe organizations should approach them with caution. Their outputs tend to be trusted or accepted at face value, but are often incorrect or include ‘hallucinated’ answers that seem plausible but are actually made up. For more on our views on balancing this new experiment with responsible technology, see this interview with our CTO, Rebecca Parsons: Society urged to hold companies to account on tech use.

 

Signals include:

 

Debate around the concept of ‘informed consent.’ While virtually all companies now offer consumers the chance to agree to terms and conditions with privacy implications, whether that agreement is genuine is an open question. The New York Times has pointed out that understanding standard privacy policies and terms — which can be up to 20,000 words long — requires far more time than the average consumer is likely to have. The result is that consumers are often left unaware of the extent of the data collected about them, while companies are collecting more data than they really need, or storing it for longer than is reasonable. In some cases, this behavior ventures into the illegal.

 

An extended wave of privacy regulation and enforcement. Companies like Google are facing punitive action and being forced to change their practices with the EU’s ‘cookie law’ coming into effect. In the US, a number of data privacy regulations are emerging at the state level, with the California Consumer Privacy Act (CCPA) and Virginia's Consumer Data Protection Act (CDPA) to be followed next year by the Colorado Privacy Act and Utah Consumer Privacy Act.

 

Privacy concerns changing marketing and business models. Companies like Apple have positioned themselves as staunch defenders of data privacy, enabling consumers to bring the sharing of their information to a screeching halt and taking steps to protect users from targeted spyware. This is making it harder for companies to directly track users — roughly 70–80% of Apple users have opted out of tracking for the apps they use, which has in turn hit revenues from advertising that relied on this information.

 

More companies putting physical marketing back into their strategies, possibly as a result of having less specific data about customers. Amazon, Bass Pro Shops, and others are direct-mailing physical catalogs and advertising to customers, complete with embedded QR codes.

 

The rising discovery of, and trade in, ‘zero-day’ vulnerabilities — previously unknown flaws in software or systems that leave organizations open to data theft or manipulation. The marketplace for zero-days is developing rapidly, with the emergence of brokers who engineer deals between those who identify the flaws and the companies impacted (often the likes of Google or Microsoft), governments, or, less happily, criminal or other bad actors, all of whom are often prepared to pay a high price for this knowledge. On the one hand, it’s encouraging that more of these issues are being flagged; on the other, risks could rise as the entire marketplace becomes more lucrative.

 

Higher risks of cyber attacks with Russia’s war on Ukraine increasing both state-sponsored and volunteer cyber-warfare. The US government has explicitly warned that the victims of these attacks could include ‘regular’ businesses (and by extension their customers) as well as ‘legitimate’ military targets.

 

The opportunities 

 

Enhancing security to protect the bottom line. Target incurred more than $200 million in costs related to a breach of credit card numbers and personal information. The average cost of a data breach continues to mount, now topping $9 million in the US according to IBM, underlining the urgency of developing and investing in a comprehensive security strategy that’s based not just on technology, but culture. Similarly, strong, transparent standards and ethical data practices can help organizations avoid the hefty fines being handed out by regulators, which spiked sevenfold to $1.2 billion in 2021 for the GDPR alone.

 

Beyond the money, solid data protection and security practices help companies avoid lasting brand damage. Consider the online backlash to recent changes to Samsung’s privacy policies. The actual motivations became almost moot, demonstrating that even just a perceived invasion of privacy can have a negative impact on brand reputation. On the other hand, companies like Apple and now Google are benefiting by presenting themselves as privacy and security champions.

 

Healthier and more open customer relationships. There’s emerging evidence that consumers are more inclined to share information with companies seen as good stewards of data governance. Research suggests the number of users opting into app tracking has grown since Apple made it a choice. The organizations that do the most to convince consumers they respect privacy and are taking proactive steps to protect customer data will very likely be entrusted with more of that data to use as a basis for developing insights.

 

A “privacy-first” stance can help brands win business against their laggard competitors. Research shows that consumers want strong privacy protections and as a result, some of the world's biggest brands are taking an ethics-led approach, especially to digital marketing and advertising. Ensuring that your business cares about privacy as much as your customers do can help you stay ahead of this trend.


What we’ve seen

To properly address privacy, we urge organizations to think differently about their data. Specifically, adopting a data mesh paradigm enables stronger governance, because instead of simply slurping data from anyone, anywhere, knowledgeable data owners are appointed for a given domain; they’re able to make decisions about what data is needed, what is not and how it should be ultimately used. When we work with clients to implement data mesh, we find it is particularly valuable in helping us to architect systems that incorporate Privacy by Design.

Trends to watch

 

Adopt

 

Decentralized security. As the nature of cyber threats changes, previous methods to prevent attacks are routinely failing. There is no longer a safe boundary or perimeter. System design needs to allow for risk management and security enforcement throughout the entire architecture, with increasing use of security-in-depth practices that embed protection across multiple layers to make it more holistic. These include the use of encrypted communications, segmented regions and authentication and authorization at a more granular level, as well as more intelligent intrusion detection systems.
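As an illustration of authorization at a more granular level, the sketch below checks every request against an explicit per-resource grant table inside the service itself, rather than trusting a single perimeter check. All names here (Principal, can_access, ROLE_GRANTS) are invented for the example, not a real library API.

```python
# Minimal sketch: per-request, per-resource authorization instead of a
# single perimeter check. Names and the grant table are illustrative.
from dataclasses import dataclass, field

# Role -> set of (resource, action) grants; each service layer consults
# this table itself rather than trusting an upstream gateway.
ROLE_GRANTS = {
    "analyst": {("reports", "read")},
    "admin": {("reports", "read"), ("reports", "write"), ("users", "write")},
}

@dataclass
class Principal:
    name: str
    roles: list = field(default_factory=list)

def can_access(principal: Principal, resource: str, action: str) -> bool:
    """Grant access only if some role explicitly allows (resource, action)."""
    return any((resource, action) in ROLE_GRANTS.get(r, set())
               for r in principal.roles)

alice = Principal("alice", roles=["analyst"])
print(can_access(alice, "reports", "read"))   # True
print(can_access(alice, "reports", "write"))  # False
```

In a real system the same deny-by-default check would be repeated at each segmented layer, alongside encrypted communications between them.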

 

Analyze

 

AI in security. AI capabilities are becoming increasingly important in everyday software applications. Organizations should leverage work in this area to help security professionals identify and react to security threats, and predict attack vectors wherever possible. While we don’t believe automation is a viable replacement for well-trained security professionals, it provides a tool set that can automate some basic defensive processes, and allow people to focus on the most critical threats.
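A toy version of this idea: automatically flag statistically unusual activity so analysts can concentrate on the flagged events. Real systems would use far richer models; the data, threshold and function name below are invented for illustration.

```python
# Illustrative sketch: surface anomalous events with a simple statistical
# baseline, freeing people to focus on the most critical threats.
import statistics

def unusual_events(counts, threshold=2.5):
    """Return indices whose value lies more than `threshold` population
    standard deviations from the mean of the series."""
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:
        return []
    return [i for i, c in enumerate(counts)
            if abs(c - mean) / stdev > threshold]

# Hourly failed-login counts; the spike at index 5 stands out.
logins = [4, 5, 3, 6, 4, 120, 5, 4]
print(unusual_events(logins))  # [5]
```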


Anticipate

 

Increased regulation. While we’ve flagged some of the most recent regulations to emerge in the privacy space, organizations should be prepared for more. Worldwide, there are a significant number of data protection laws already on the books with more to come. Challenges will emerge as compliance potentially grows more complex, especially for firms operating in multiple jurisdictions. When GDPR came into effect, for example, many US-based news sites simply blocked people in Europe from accessing their websites because they were concerned about falling foul of a law they didn’t understand.

Trends to watch: the ones we're seeing now

Adopt
  • AI as a service
  • Automated compliance
  • Connected homes
  • Decentralized security
  • DevSecOps
  • Explainable AI
  • Privacy first
  • Secure software delivery
Analyze
  • AI-generated media
  • AI in security
  • Alternative currencies
  • Blockchain and distributed ledger technologies
  • Code of ethics for software
  • Differential privacy
  • Ethical frameworks
  • Facial recognition
  • Personal information economy
  • Privacy-respecting computation
  • Smart contracts

 

Anticipate
  • Addictive tech
  • Increased regulation
  • Smart cities
  • Technology and sovereign power
  • Technology for environmental and social governance

Advice for adopters 

 

Remember that when it comes to data, what you don’t do is also important. Since the ‘big data’ trend started firms down a path of gluttony, many companies collect data almost by default and store it for long periods, without critically examining how necessary it is to the business. Today’s machine learning algorithms also encourage a degree of data hoarding. But data has to be recognized as a liability as well as an asset. Hackers can’t steal what you don’t collect, and a security snafu can’t leak customer information that’s not in your database. Think selectively about the data you need and the possible fallout if it is stolen or leaked, and remember, the less customer data you’re dealing with, the easier it will be to manage.
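One concrete way to act on “hackers can’t steal what you don’t collect” is to allow-list the fields you actually need and drop everything else before it reaches storage. The field names below are invented for the example.

```python
# Sketch: collect only the fields the business actually needs; anything
# not on the allow-list is discarded before storage.
ALLOWED_FIELDS = {"order_id", "item_sku", "quantity"}

def minimize(record):
    """Keep only allow-listed fields; you can't leak what you don't store."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

raw = {"order_id": 17, "item_sku": "A-42", "quantity": 2,
       "ip_address": "203.0.113.9", "birthdate": "1990-01-01"}
print(minimize(raw))  # {'order_id': 17, 'item_sku': 'A-42', 'quantity': 2}
```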

 

Recognize that personalization is not always necessary, and can be counterproductive. The reality is consumers do not necessarily expect or even want a one-to-one, personalized experience from every brand or product. Privacy-friendly web analytics is an emerging practice that still lets brands take a ‘pulse check’, understanding high-level flow and consumption trends to gauge target audience engagement. Given privacy trends and concerns, our recommendation is to invest heavily in customer research to understand key segments rather than individual behaviors, identifying how these segments interact with your brand online and offline to create flows and content that work for a broader group while keeping individual privacy intact. 
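A minimal sketch of the privacy-friendly analytics idea: aggregate views per page and segment without ever retaining a per-user identifier. The event fields and segment names are invented for the example.

```python
# Illustrative sketch: a 'pulse check' on audience engagement that counts
# views per (page, segment) and never stores who viewed what.
from collections import Counter

def aggregate_views(events):
    """Count views per (page, segment); user identity is never retained."""
    return Counter((e["page"], e["segment"]) for e in events)

events = [
    {"page": "/pricing", "segment": "small-business"},
    {"page": "/pricing", "segment": "small-business"},
    {"page": "/docs", "segment": "developer"},
]
print(aggregate_views(events)[("/pricing", "small-business")])  # 2
```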

 

Work with your legal and marketing teams to create easy-to-understand privacy policies. Not only will these help engage and benefit consumers; internal teams can also use them as an effective reference when designing their solutions, data retention policies and compliance approaches. Berkeley’s Information Security Office and this research paper provide some good advice on how to start.

Enable your teams to focus on creating value by automating security, privacy and compliance testing. When creating security policies, make sure they can serve as effective guard rails for teams, then automate them and treat them as code wherever possible so teams get immediate benefits. One way of doing this is by using Dependabot to ensure dependencies are secure, patched and up to date, or AI tools that help people identify and respond to incidents by zeroing in on unusual patterns.
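To make the “policies as code” point concrete, here is a hedged sketch of an automated guard rail that fails the build when a pinned dependency falls below a minimum secure version. The policy table, package names and requirements format are simplified inventions for illustration, not a real tool’s behavior.

```python
# Hypothetical 'policy as code' guard rail: flag pinned dependencies that
# are below a minimum allowed version (e.g. versions with CVEs patched).

# Minimum allowed versions; in practice this would come from a curated
# policy file or vulnerability feed.
MIN_VERSIONS = {"requests": (2, 31, 0), "urllib3": (2, 0, 7)}

def parse_version(s):
    """Parse 'X.Y.Z' into a comparable tuple of ints."""
    return tuple(int(p) for p in s.split("."))

def check_requirements(lines):
    """Return names of dependencies that violate the minimum-version policy."""
    violations = []
    for line in lines:
        name, sep, version = line.partition("==")
        if not sep:
            continue  # only pinned 'name==version' lines are checked
        minimum = MIN_VERSIONS.get(name.strip())
        if minimum and parse_version(version.strip()) < minimum:
            violations.append(name.strip())
    return violations

reqs = ["requests==2.25.0", "urllib3==2.0.7"]
print(check_requirements(reqs))  # ['requests']
```

Running a check like this in the delivery pipeline gives teams the immediate feedback the text describes, without a manual review step.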

 

Make security education a priority. Hackers only need to find one vulnerability to get in and inflict damage, but defenders need to secure their entire organization — a highly asymmetric reality. Security is much more effective when everyone in the organization does their part.

 

Learning about good security practices is not easy, but creating a layered strategy where experts help improve the overall security stance of the organization, and giving people the tools to make good decisions, will leave the enterprise significantly safer. By extending education to customers, taking steps to bring them up to speed on the importance of data privacy and what can go wrong when data is shared, enterprises can enlist their help in the battle.

 

Build products with robust security and privacy practices. This requires deep commitment and strong leadership; security and privacy are not just technical concerns, but should be seen as an outcome of the culture of the entire organization. Leaders must make it clear that the team should not consider these aspects ‘nice to have,’ something they can delay until later, or somewhere investment can be trimmed to save costs. Products need to embed robust security and respect user privacy from day one.

Privacy must be part of your company culture. Instead of slurping data from anyone, anywhere, and creating a massive data swamp with toxic leaks of unknown provenance, you need to carefully curate data. This is an opportunity to take a pro-privacy stance and build a trusted brand.
Katharine Jarmul
Principal Data Scientist, Thoughtworks Germany
