Uncovering the Trust Triangle

This is the first of a developing series of pieces and perspectives, showing my workings and ponderings as I try to understand how the tech-human relationship has reached its current ‘state’, how it needs to be (re)calibrated, and what that means practically for individuals, businesses and society.

Having a practical situation is always very helpful. So, with that in mind, I’ll be opening up about how we address such challenges within a real business - People Matter Tech. How do we design, develop and grow a digital product and a data-reliant business in a truly ethical and humane way, within a digital and societal environment that seems ever more fragmented?

Do we look to break free of the constraints of Big Tech? Big Data? The attention economy? Can we? Can we do so in a commercially sustainable way?

At the moment - I don’t know. What I do know is that these ramblings will start off with seemingly disconnected pieces looking at various strands of the challenges we all face through the eyes of a growing business. 

I’ll be wrong often, confused frequently and constantly contradictory. But aren’t we all? 

So, let’s kick off with some thoughts around a Trust Triangle: that relationship between Business, Individuals and Tech. Or, in a People Matter sense, between Employers, Employees and Tech, if you will.

-----------------

Who do you trust? 

Family and very close friends I would hope. (Although I will challenge this in a future piece!)

Brands? Your employer? Your clients? 

Your phone? The software you use? The Alexa in your lounge? The Smart Doorbell? The Smart Lightbulb?

My guess would be that you don’t trust any of those. 

You don’t really trust the businesses and brands you engage with - you likely feel a little nauseous when you see them ‘jumping on the bandwagon’ of news - exploiting social situations (both positive and negative) to sell more ‘stuff’.  

Do you trust your employer? I have no doubt that you would like to. I’ve no doubt that all the advancements in employee support and wellbeing implemented over the last five-plus years have made a huge positive difference to the employer/employee relationship. But there’s still a nagging doubt - that your employer will drop you tomorrow, that they’ll ‘offboard’ you at a time that suits them. There’s a constant nervousness - only exacerbated over the last 18 months by the pandemic. Nervousness is not a strong foundation for trust.

What about the software you use? It’s intertwined with your daily life - everything (or almost everything) you do has been ‘digitally enabled’. Do you trust that it has your best interests at heart - even though it’s efficient, the user experience is designed to within an inch of perfection, and it’s ‘personalised’? Maybe you know deep down that it doesn’t ‘care’ about you - it’s a piece of tech. At best it’s gathering your data to provide feedback, to ‘nudge’ you into doing something; at worst it’s sending that same data to advertisers and data traders.

But pausing for a moment, what is trust?

To me it’s subjective. I know that I trust some people with some information, but I wouldn’t trust them with a secret. I know some people I trust in a professional environment - but in a personal sense - not in a million years. 

It’s nuanced - maybe another description is that it’s analogue? When I hear more and more digital, marketing and business leaders talking about how one can and should build trust in this tech-led age, I wonder if they are trying to make trust a binary ‘decision’ - “You either trust us or you don’t, but please do….”. We’ve seen that with Brexit, with wearing a mask or not - creating an either/or decision that one feels compelled to make.

And is the design of digital solutions therefore moving on from constantly grabbing attention to desperately building trust? Has more tracking, more ‘persuasion’, and therefore more ‘personalisation’ been built into digital products and businesses? As an aside - these digital products, which (in the vast majority) are built/hosted/deployed on technology owned by data-gathering monoliths (Google, Amazon, Microsoft et al.) - are they now after our hearts and our trust?

But is the desire for gaining trust a lost cause? Has the design of tech (based around a data and surveillance economy) broken any chance of trust between us and the vast majority of tech? (For the purposes of my ponderings I’ll not include significant healthcare and support tech used in primary and secondary care).

This uncertainty lies at the heart of what I think is a core challenge we face at People Matter. How can individuals trust our product? And, when you add that nervousness (underpinning a distrust) between individuals (employees) and their employer, how can we - and do please forgive the appalling phrase - square that ‘Trust Triangle’?

Maybe we should give up on trust, stop ‘talking’ about it and rather focus on being trustworthy? 

So, for example, let’s prove that People Matter is trustworthy in showing accurate insight to employers and employees. That we are trustworthy in our anonymisation of data and, further, in ensuring data minimisation when gathering it. We show our workings, we discuss our ethical disagreements and challenges, we define what and who we are and what we do. We don’t design to gain people’s trust - we design for trustworthiness.

It’s not hard to shine a light on the very opposite of what we are trying to achieve. Look at the Ring doorbell. Do people trust it to make a noise when pressed? Yes. Do people find it trustworthy in terms of being open and transparent about what is happening with any ‘data’ it is gathering? Errrrrrrr - no.

Where do we start? How do we develop and start to build trustworthiness? 

The starting point surely lies in our mindset: how we view those humans who use our products (they are not ‘users’). This is where our research into designing for human dignity may offer a strong base. By focusing on aspects of dignity such as:

  • Giving individuals the ‘benefit of the doubt’ (i.e. they know they want to learn more about their mental health).

  • Respecting their independence - although some grouping may be needed in terms of trends, this should be kept to a minimum.

  • Recognising an individual’s uniqueness - note the term uniqueness.

  • Accountability - being clear that the individual is in charge, and is therefore equally accountable for the relationship between People Matter and them.

Maybe we then start to have a foundation from which to build trustworthiness and, who knows, maybe move towards a more trusting relationship? One that an individual may not be getting from their employer - who may (from some of my observations) not understand what true human dignity is.

Start by designing for dignity, build trustworthiness between the People Matter tech and the individual - might this lead to trust?

One to ponder more - and this is just the start…

Dave McRobbie