Trust in Me: Trust, Law and Brand Authenticity in the Digital Marketplace


By Catherine McGregor


When I started university in the early 1990s, the internet was just getting started. I remember an early lecture in which the lecturer predicted that cables would send huge flows of information into our homes, letting us speak, work and choose from a multitude of information sources as a result. What seemed like science fiction then has become our day-to-day reality.

The application of technology to business has transformed huge swathes of traditional industries and caused new ones to spring up. It is also changing how we think about what we do: the advances in data analytics over the last 15 years have accelerated the usefulness of ‘Big Data’ and the ways it can be applied.

Increasing digitisation, and the greater global reach it enables, are great opportunities for businesses. But these opportunities bring with them new risks and new manifestations of old ones. Many of these risks centre on the fundamental issue of trust.

The Trust Contract

A significant way technology is changing business is by altering the fundamental contract of trust at the heart of business transactions, whether big or small. Any contract, whether a complex legal document or a simple handshake, is an expression of shared trust.

As what we do becomes more driven by technology, the nature of trust is changing. Institutional trust is lower than ever and we’re increasingly looking to personal trust as a replacement. Much of this new personal trust is developed online via marketplace sellers, brand spokespeople/influencers and online reviews. But how can we trust what we may not be able to see or touch or build a tangible relationship with?

Whilst brands can see the need to move to digital solutions, they are also faced with a new business dilemma: how to ensure their brands can be trusted in the digital marketplace. How can brands build trust, and how can they keep it?

This leads to a significant issue for the lawyers working in many of these companies: mitigating business risk in the best way possible. How is the need to build digital trust changing what in-house lawyers need to think about, and how can they effectively protect their organisations?

For many lawyers the range of business and legal risk is multiplying as new digital ways of reaching customers take priority over traditional analogue ones. The risks of these digital avenues include cyber-attacks, data breaches and the proliferation of counterfeiting strategies that use digital technology to enhance their effectiveness. These range from counterfeiters selling fake branded goods via social media channels to the outright cloning of websites, say for banks, to harvest customers’ financial details.

Distributed Trust

The move to digital communication and transactions raises the question of trust.

Over the last twenty years there has been a gradual erosion of institutional trust, which is now at a low ebb. The Edelman Trust Barometer charts public trust in institutions, asking around 30,000 people across 28 countries how much they trust NGOs, business, government and the media. In recent years it has recorded institutional trust at some of its lowest levels. In 2020, Edelman reported that despite a strong global economy and good employment statistics, none of the four societal institutions the study measures—government, business, NGOs and media—was trusted. Respondents attributed this lack of trust to fears about the future and their role in it.

But while traditional institutional trust is waning, new forms of trust are being created – most notably distributed trust, which the trust expert Rachel Botsman identifies as symptomatic of our increased reliance on the digital economy.

“Trust and influence now lie more with ‘the people’ – families, friends, classmates, colleagues and even strangers – than with top down elites, experts and authorities. It’s an age where individuals matter more than institutions and where customers are social influencers that define brands.”

(Rachel Botsman, Who Can You Trust?, 2017, p. 5)

This trust is often built through personalisation for brands and the sense of knowing someone, via their online presence, whom you may never have met. Building this new network of distributed trust has been fundamental to the success of many now hugely influential companies such as Uber and Airbnb. They do this by building what Botsman calls the ‘trust stack’: the customer first needs to trust the idea, then the platform, then finally the individual. With Uber, the idea of ride sharing and greater availability of transport was what consumers initially bought into. The digital platform then lets consumers see drivers’ ratings and reviews, creating a further level of trust. Platform and individual trust work symbiotically here: if we have found an individual driver via a recognised platform, we feel we can trust them more than if they had simply advertised on Craigslist. But if Uber were sending nameless drivers without reviews, the trust stack would start to break down.

Navigating this new type of distributed trust is now fundamental to business. Analogue ways of doing business are neither as cost-effective nor as scalable as digital ones. In the banking sector, for example, technology is changing retail banks’ relationships with customers, driven in large part by the rise of challenger banks such as Monzo and Swallow, many of which are entirely virtual. This shift led former Royal Bank of Scotland CEO Ross McEwan to declare a few years ago that RBS was now a tech company that happened to operate as a bank. This change in mindset and operations is being mirrored by the other major retail banking groups in the UK.

Technology brings new ways to relate to customers, as the bank’s General Counsel for Outsourcing, IP and Technology, Kenny Robertson, explains.

“Our mobile app has been instrumental for us, as with every other bank. We need people moving onto tech platforms, and part of the draw for this will be greater personalisation. For example, some of the challenger banks now offer services personalising the banking on their app: ‘You spent £12.50 on coffee – are you sure you want to do that? Here are your pending transactions’, and so on. It’s using technology, and ironically the fact that it creates a different relationship, to build a more personalised service than just traditional banking.”

Even with banking apps, a big part of the push to move customers away from face-to-face banking is to cut costs and expand capacity. This, however, requires creating even stronger levels of trust - the sense of a bank that has a relationship with us, even if only virtually, via an automated algorithm telling us we have direct debits pending.

The mixture of greater personalisation in tandem with greater physical depersonalisation is a defining characteristic of the ‘sharing economy’ and the distributed trust it brings with it. Many of the winners in the sharing economy are brands that depend on individuals interacting with other, unknown individuals. Trust is at the core of this success.

One of the ways this distributed trust relationship is created is by sharing personal details and providing reviews. The risk of giving a stranger money, getting in their car or staying in their house is mitigated by positive reviews and online recommendations. Recent independent research commissioned by Trustpilot and undertaken by Canvas 8, a behavioural insights agency, showed that 89% of customers now research reviews online before making a purchase, and 45% say they use peer reviews more than in the past – a trend that is likely to continue. The public’s growing reliance on peer reviews has also increased the opportunity for fraud relating to these reviews: crimes which exploit this new form of distributed trust.

Looking the Part

In Who Can You Trust?, Rachel Botsman tells an amusing but terrifying story about her parents hiring a Scottish nanny with a Salvation Army background. She seems perfect for the role and looks every inch the part. Unfortunately, she turns out to be a drug dealer whose Mary Poppins-style persona has been carefully crafted. This is an extreme, if real-life, example. But for many businesses navigating the digital marketplace and the distributed trust economy, unmasking such ‘bad actors’ is a significant issue, and the global marketplace gives them increased opportunity for fraud.

Given the effectiveness of objective customer reviews and ratings in establishing the trustworthiness of a product or company, reviews are a key target. Interestingly, the Canvas 8 research on peer reviews and brand trust showed that most people viewed fake reviews far more negatively than politically biased advertising or insults (trolling); respondents felt there should be zero tolerance, mainly because fake reviews would cause them to make purchases and potentially waste money.

For Carolyn Jameson, Chief Legal and Policy Officer at Trustpilot, the online consumer reviews website which hosts over 1 million reviews a month, this is a compelling business and legal problem.

“The problem is huge, and I had not really appreciated the scale of it until I took this job on. As part of my role, I look after the content integrity team, which is the team that deals with complaints about fake reviews. But the truth of the matter is that they were just scratching the surface. The problem that Trustpilot faced is that we were responding reactively, such as when a newspaper looked into a particular company and found something about fake reviews and the media would play it up as ‘how could Trustpilot not know this?’”

Part of the challenge in combating the business risks arising from the new distributed trust economy is the sheer scale and sophistication of the problem. The industry that has grown up around activities such as fake reviews has not been fully comprehended – even, as Carolyn Jameson points out, by a company whose focus it is. Fake reviews are also increasingly sophisticated: they are not necessarily just glowing five-star reviews. What has actually become more common, shares Jameson, is the negative fake review, which the company can then respond to in a hugely positive way, gaining even more brand authenticity than glowing reviews would give it. It’s akin to planting a difficult question in the audience at a conference for which you have your answer prepared!

Technology and Legal Solutions for a Distributed Trust Network

As discussed previously, the distributed trust network works in an interesting way: it often uses the codes of personal knowledge to build a relationship, and therefore trust, where no relationship exists.

Given we have built a different relationship with trust regarding transactions on the internet, do we also need to understand the policing of this trust in a different way?

Countless digital solutions across businesses are still based on quite a small set of road maps – many of which focus on adapting traditional practices for a digital world. What many businesses still lack are ways of thinking, and of applying solutions, that are truly digital in their understanding: not a digital hack of an analogue solution.

Enter Pasabi, a high growth technology company aiming to assist lawyers in solving business problems. Pasabi is one of the few legal technology companies working in this space which could well be the next big thing for the future-facing general counsel to think about.

The company started as a venture to make shopping experiences more personalised, using behavioural insights as its foundation. But after a fortuitous conversation with David Lindsay, Director of Technology at LVMH, the team realised there could be another application for their software: gathering data about brands across the web, then using behavioural insights to track down the bad actors behind posts about those brands.

Social media poses a huge challenge for many brands; it’s a nebulous and multi-faceted arena, which offers both risk and reward. A significant outcome of the Covid-19 crisis has been many premier brands realising that they are too reliant on traditional, physical retail, missing out on the potential of the virtual world and social media channels.

Pasabi has created algorithms that map behavioural insights across social media platforms and look for common traits. Although initially built to track what customers and potential customers were interested in, the team quickly realised the same approach could surface behavioural insights around bad actors. Most bad actors in the counterfeit space post on social media posing as lone individuals. But by analysing language, phone numbers, photo backdrops and so forth, Pasabi’s algorithm was quickly able to map key clusters of activity across different social media platforms and show that many of these ‘individuals’ were actually groups working together. Whilst the traditional focus for spotting counterfeit activity has been the product itself – an obvious approach in any context – looking at actual behaviours across social media strikes at the heart of the distributed trust these bad actors are seeking to build with potential consumers.
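The clustering idea described above can be sketched in miniature. Pasabi’s actual algorithms are proprietary; the toy below simply assumes that any two accounts sharing an identifying attribute (a phone number, an image hash, a distinctive phrase) belong to the same group, and merges them with a union-find structure. The account names and attribute values are invented for illustration.

```python
from collections import defaultdict

def cluster_accounts(accounts):
    """Group accounts that share any identifying attribute.

    `accounts` maps an account ID to a set of attribute values
    (phone numbers, image hashes, distinctive phrases, ...).
    Accounts sharing an attribute are merged via union-find.
    """
    parent = {a: a for a in accounts}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    # Remember the first account seen with each attribute;
    # later accounts with the same attribute are merged into it.
    owners = {}
    for acct, attrs in accounts.items():
        for attr in attrs:
            if attr in owners:
                union(acct, owners[attr])
            else:
                owners[attr] = acct

    clusters = defaultdict(set)
    for acct in accounts:
        clusters[find(acct)].add(acct)
    return list(clusters.values())

accounts = {
    "seller_a": {"phone:+44700", "img:9f3c"},
    "seller_b": {"phone:+44700", "phrase:genuine leather DM me"},
    "seller_c": {"phrase:genuine leather DM me"},
    "seller_d": {"img:0a11"},
}
groups = cluster_accounts(accounts)
# seller_a, seller_b and seller_c collapse into one group; seller_d stands alone
```

Note the transitivity: seller_a and seller_c share no attribute directly, yet they end up in one cluster via seller_b – exactly the effect of ‘joining the links’ that turns apparent lone individuals into a visible group.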

It’s been transformative for Carolyn Jameson at Trustpilot where fake reviews were a problem that wasn’t just a legal issue but one which cut to the very core of their business.

“People think that they can spot fake reviews just by looking at them, but you can’t,” Carolyn told me. “What was so creative about the Pasabi solution was the way the technology could work on this and spot patterns which aren’t obviously visible: language patterns, IP addresses, usernames and then you join the links. Before that we'd really been having to rely on very simple things such as assuming there's a normal pattern that ‘x’ percentage would be five-star reviews, this many would be three and then you'd have someone investigate if anything looked a bit odd.”
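The ‘simple things’ Jameson describes – assuming a normal pattern of star ratings and investigating anything that looks odd – can be sketched as a goodness-of-fit check of one company’s rating distribution against a platform-wide baseline. The baseline proportions, the sample counts and the flagging threshold below are all illustrative assumptions, not Trustpilot’s figures.

```python
def chi_square_stat(observed, expected_props):
    """Pearson chi-square statistic for observed star-rating counts
    against expected proportions per star (1..5)."""
    total = sum(observed)
    stat = 0.0
    for obs, p in zip(observed, expected_props):
        exp = total * p
        stat += (obs - exp) ** 2 / exp
    return stat

# Assumed platform-wide proportions for 1- to 5-star reviews.
baseline = [0.10, 0.05, 0.10, 0.20, 0.55]

THRESHOLD = 9.49  # chi-square critical value, 4 degrees of freedom, p = 0.05

def looks_odd(counts):
    """Flag a company whose rating mix deviates sharply from the baseline."""
    return chi_square_stat(counts, baseline) > THRESHOLD

organic = [9, 6, 11, 19, 55]      # roughly tracks the baseline mix
suspicious = [2, 1, 2, 5, 90]     # five-star reviews dominate
```

A check like this only raises a flag for human investigation; as the article goes on to explain, it cannot catch coordinated fakes whose ratings look normal, which is where the behavioural-pattern analysis comes in.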

What was also transformative about applying technology to what was fundamentally a business problem was the volume of data with usable insights Pasabi’s technology solution generated, as Jameson explains: “The Pasabi team were able to turn huge volumes of data into something that's actually really useful from a legal team perspective. Their user interface is really good and is now something that we're looking to use to undertake cease and desist activities on a massive scale, which we have never been able to do before. Then you've got the whole deterrent effect coming into play because people think they're going to get caught.”

Conclusions

Whilst other solutions have claimed similar efficacy, that’s not actually the case, says Jameson. What sets the Pasabi solution apart are the behavioural insights it uncovers and its ability to link language patterns to individual IP addresses, revealing hidden patterns and hubs of activity which can then become the target of legal action.

It is these insights from analysing digital behaviour that become the focal point: the technology identifies and aggregates them so that legal solutions can be applied to them in the most effective way. Pasabi’s solution also gets to the heart of how people actually behave online and how they seek to create trust there. Trust is a two-way street, as Pasabi has found: whilst its journey started with using these insights to help brands create more authenticity and trust online, the same findings can be applied just as effectively to breaches of digital trust. Using that understanding of how trust has changed in the digital economy, they have crafted a truly 21st-century solution: not just traditional legal tech, but technology plus legal, solving a key business challenge of the digital age.


2020 Catherine McGregor Research. All rights reserved.