The Actionable Futurist® Podcast

S5 Episode 21: The Ethics and Implications of Data Privacy in the Digital World with Anton Christodoulou from Imagination

August 19, 2023 · Chief Futurist - The Actionable Futurist® Andrew Grill · Season 5, Episode 21

In the race to dominate AI, we have seen our data privacy, democracy, and even our human rights impacted.

To understand what brands and consumers need to do to fight back, I spoke with Anton Christodoulou, Group Chief Technology Officer at leading experiential design company, Imagination and co-founder of the new Trust 3.0 initiative.

The Trust 3.0 initiative is a data privacy advocacy group convening the brightest minds in privacy, AI, and technology to champion responsible innovation for a safer society.

I’m proud to say that I’m also a part of this timely initiative.

Anton is responsible for overseeing Imagination's global technology strategy and project and service delivery execution, delivering immersive, engaging and measurable experiences to clients including Mastercard, Ford, Major League Baseball, Jaguar Land Rover and Shell.

I started our discussion by asking Anton more about Trust 3.0 and why it has been set up.

This episode looks at the implications for data privacy for consumers and businesses, and offers a deep dive into the necessity of transparency, security, and a fair exchange of value in handling customer data.

We also discuss how businesses can leverage these practices to promote trust and security.

More on Anton
Anton on LinkedIn
Imagination website
Trust 3.0 website


Your Host: Actionable Futurist® & Chief Futurist Andrew Grill
For more on Andrew - what he speaks about and recent talks, please visit ActionableFuturist.com

Andrew's Social Channels
Andrew on LinkedIn
@AndrewGrill on Twitter
@Andrew.Grill on Instagram
Keynote speeches here
Pre-order Andrew's upcoming book - Digitally Curious

Speaker 1:

Welcome to the Actionable Futurist podcast, a show all about the near-term future, with practical and actionable advice from a range of global experts to help you stay ahead of the curve. Every episode answers the question "what's the future of...?", with voices and opinions that need to be heard. Your host is international keynote speaker and Actionable Futurist, Andrew Grill.

Speaker 2:

So in the race to dominate AI, we've seen our data privacy, democracy and even our human rights impacted. Maybe you could tell us a bit more about what Trust 3.0 is and why you've set it up?

Speaker 3:

Trust 3.0 really came out of a gap in the market in terms of how data privacy is being addressed in the current landscape. There are a number of privacy advocacy groups helping consumers protect their data. What we really believe is that, in order for this to be implemented in the way that it needs to be, we need to make it accessible and practical for brands particularly, but for companies generally, that are dealing with any amount of customer data, arguably, but especially huge amounts, and to ensure that they are doing it in a way that is transparent, fair and essentially enables a customer to feel comfortable interacting with the service. And I think the key here is that, although it is absolutely for the end customer, there are real benefits to brands in ensuring that the data is stored, managed and made as transparent as possible, from a value exchange and a marketing perspective, and even from a security perspective, because if the data is handled and managed in the right way, it actually makes it much harder for bad actors to get access to that data as well.

Speaker 2:

You've set this up as a charity organisation, and you've got some really interesting people involved in it. There's already a bunch of regulation around this, so what's the need for this? You say there's a gap in the market: is the regulator not doing enough, are brands not doing enough? How can you possibly, as a third party, influence some of these big brands and government decisions?

Speaker 3:

You've got two challenges. You've got the consumers on one side, who predominantly are potentially scared: they want access to the services, but don't really understand the implications. What we've seen in the last decade or so is that consumers have essentially just willingly given their data over without really understanding what the impact is, but they are becoming much more aware and much more concerned. On the business side, businesses are being hacked, and so there are organisations and certifications, such as ISO 27001, which enable you to deal with the security side.

Speaker 3:

You've got B Corp, which enables you to address sustainability, and again, even 10 years ago there were only a few very large organisations that were really taking sustainability seriously. However, there isn't an organisation that's specifically looking at working with companies or brands to ensure that the data privacy side of things is properly handled. At the moment, it's very much being left to individual brands and companies to decide how they deal with that. Look at someone like Google, who have become much more transparent but essentially live on customer data, versus someone like Apple, who have taken a very stringent data privacy approach but also use that as a marketing tool.

Speaker 2:

So where will you interact, where will you interface with the brand side of the world and the consumer side of the world?

Speaker 3:

The plan really is to primarily work with brands, which is what we're doing now, to help them understand what frameworks and approaches they need to implement. So, although this is ultimately solved through some form of technology infrastructure, the focus really is on identifying the frameworks: what is it that you need to implement in order to ensure that your customer data is stored in the right way, that it's controlled in the right way, and that there is transparency around the way that data is being used? Then, once you've built those frameworks, depending on the particular service they're providing, we look at how you implement a technology framework that enables that to happen, not at a regulatory level, but within the layers of the services that you're using. Obviously there's still a layer of making sure that those services are correctly implemented, but once they are, brands can feel a lot more confident, a lot more comfortable, that they are managing and using customer data in the right way. More importantly, they can then start talking about that to their customers, to say: if you are interacting with us, if we're providing a service to you, we are doing it in a way where we are protecting you, and protecting your data.

Speaker 3:

How far they take that is entirely up to them.

Speaker 3:

They could go for a completely decentralized and anonymized approach where you're able to consume the service without the company that's providing the service having any knowledge of you or what you do.

Speaker 3:

The most extreme example would be something like Signal, which obfuscates literally every step of the process, down to where a call is made, when it's made and for how long, as well as, obviously, all the data that's required and the conversations themselves. Compare that with someone like WhatsApp, which actually stores all of that data except the conversations, or with simply being more open about how you use that data.

Speaker 3:

Will you share it with third parties? We're at that point at the moment: most companies now have some sort of privacy policy, and they're encouraged to make that privacy policy as clear as possible, although most people still don't read it or even understand it, however clear it is. And even if they do, as best as you can tell, the main question is: are they sharing it with a third party, or are they just using it for their own purposes? This takes it a lot further and makes sure that they're only using the data they need, potentially only using the data at the point that they need it, and, as I say, it may even be that the data they're using is completely anonymised, so they're using it purely at that point to provide the service, and then essentially they can continue down that path. What they provide as a value add is the service itself, not the mining and exploiting of that data in order to use it for any other purposes.

Speaker 2:

How would you describe Imagination? Are you a marketing agency?

Speaker 3:

We're an experience design company. We have been designing and building experiences for brands for over 50 years. We started predominantly in the automotive space. We launched the original events at the Millennium Dome, the O2. We did the New Year's Eve celebrations for Sydney for nearly a decade. We've launched products and services for a variety of different companies. We run large physical events as well as physical, digital and virtually integrated events. So, for example, we recently launched the new European-made EV for Ford, which was a purely virtual event using pixel streaming to enable people to drive the new car before it was actually available to the public, and to be able to order it. So whether it's a physical event through to a completely virtual event, we are handling our clients' customers' data, and we're often using that data to create personalised services. Our focus, from an experience design perspective, is always: how can you create the best possible experience for the end customer of that service?

Speaker 2:

Through your close work with brands, you've clearly realised that your own clients need to be privacy and trust assured. Is that why you've gone ahead and created the Trust 3.0 initiative?

Speaker 3:

We have always taken how we handle customer data very seriously.

Speaker 3:

We have applied, and do apply, the various regulations around the world, like GDPR and CCPA; that's a baseline for us that we would meet anyway. What we're looking at now, being aware of where the market's going, is ensuring that when clients come to us, in the same way as in the early days of the sustainability conversation, and say they actually want to build something with a slightly different framework, maybe they want to launch a product or a service that focuses on data privacy, or they want to make it a key tenet of their marketing strategy, as Apple has, we have the ability to provide the expertise, the training and, ultimately, if required, the certification. We may certify ourselves, but we may also work with brands that want to be able to do that themselves, as well as through our own platform, which handles all of the customer data for all of our clients. We want to be prepared for the future and adapt in advance of what's coming, rather than being reactive.

Speaker 2:

You mentioned decentralized systems before. I'm really interested in decentralized identity, also called self-sovereign identity. Where will decentralized identity platforms, where consumers control their own data, fit in? And will platforms like this give consumers back the power of handling and managing their own data when it comes to trust and privacy?

Speaker 3:

One of the reasons that I got involved with Trust 3.0, and became one of its co-founders, is that this is actually, for me, a decades-long passion around decentralization. Decentralization, and the way in which it empowers users, has been something that's been talked about for probably at least 20 years, certainly the last 10 to 15. However, we weren't in a position to do it in a very broad way; the internet itself was still dealing with much simpler challenges, like how do you stream a video to a browser, which obviously we all do and enjoy now, but that took a surprisingly long time to fix. Consumers generally don't necessarily want to own the data themselves. The analogy we used recently is that you might want to rent a car: you don't necessarily want to own the car, but you do want to know that the car is insured and that you can drive it safely. I think a lot of consumers don't necessarily want the hassle of total ownership, although they should definitely be given that option. What they do want is to know that the data is being handled in a secure and transparent way.

Speaker 3:

And going back to your question on decentralized identity, there are real benefits to having completely decentralized identity, and I would go further and say to having an anonymized, decentralized identity, so that you can essentially prove who you are, and that you have the ability, for example, to pay for a service or to consume the service safely, without the service necessarily knowing exactly who you are.
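To make that pattern concrete, here is a minimal sketch in Python: an issuer signs a claim bound to a random pseudonym, and the service verifies the claim without ever learning the user's real identity. The names and the use of the `cryptography` package are illustrative assumptions, not anything Trust 3.0 prescribes.

```python
# Sketch: verify an "able to pay" claim without learning who the user is.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature
import secrets

# Issuer key pair; the issuer's public key is known to services.
issuer_key = Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()

# The user is identified only by a random pseudonym, never a name.
pseudonym = secrets.token_hex(16)

# The issuer attests that this pseudonym can pay for the service.
claim = f"{pseudonym}:can_pay".encode()
credential = issuer_key.sign(claim)

# The service checks the issuer's signature: it learns that *someone*
# trustworthy vouched for "can_pay", but not who the user actually is.
try:
    issuer_public.verify(credential, claim)
    print("Credential valid: serve the customer anonymously")
except InvalidSignature:
    print("Credential rejected")
```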

Speaker 3:

If a bank has a trillion dollars in it, that's a much more attractive target for somebody to come and steal all the money from. If there are actually a trillion banks, each with one dollar in it, which is essentially the principle of decentralization and decentralized data ownership, the attack is much, much harder. It's much less attractive, and therefore brands are less likely to see the kind of data hacks that we still regularly see; I think there was another big data hack just recently. Individual users can be more confident that they've got control of their data, while brands and bigger companies can have more confidence that the data is less likely to be compromised in the first place.
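A minimal sketch of the "trillion banks with one dollar each" idea, assuming Python's `cryptography` package and illustrative names: each user's record is encrypted under its own key, so stealing the store alone yields nothing, and one compromised key exposes exactly one record.

```python
# Sketch: per-user keys shrink the blast radius of any single breach.
from cryptography.fernet import Fernet

users = ["alice", "bob", "carol"]

# One key per user instead of one master key for the whole store.
keys = {user: Fernet(Fernet.generate_key()) for user in users}

# The central store holds only ciphertext.
store = {user: keys[user].encrypt(f"{user}'s data".encode()) for user in users}

# An attacker who exfiltrates `store` alone learns nothing useful;
# each record is a separate "bank" needing its own key.
print(keys["alice"].decrypt(store["alice"]))  # b"alice's data"
```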

Speaker 2:

I'm going to speak with Marie Wallace from Accenture in a few weeks, and this is her focus: she runs decentralized identity for Accenture. How far away are we from having this in a form where consumers can do it very easily and frictionlessly? Because at the moment it's a bit like when you first get involved in cryptocurrency, setting up wallets and authorizing things: there's just so much friction that consumers think it's all too hard. Is the technology out there? Are there processes that are going to make this really easy? And to my first question: how far away are we from this, really?

Speaker 3:

It's a really good question. I think the biggest barrier to this, despite the fact that it's been possible for many years, has been exactly that issue. Going back to the question to a consumer: do you want to own and control all your data? Yes, I do. OK, here's the key; keep the key in your pocket and make sure you don't lose it, but if you do lose it, all of the data that you own is gone. The people that cut the key and created the safe for you will not be able to get into that safe. Then you go back and ask the same question: so you want to own all your data? And it's, well, yeah, but could you just hang on to the key for me, or a copy of the key, just in case I lose it? And the answer is, well, we can do that, but bear in mind that you might then own the box and the key, but if we wanted to, we still have a key. Apple have done quite a nice job of addressing this, and I only refer back to them because they're one of the few big players that have actually implemented this: they give you both options. They store your data within a kind of data store, not necessarily decentralized, though you could argue it is; they lock it and create a key, and then they offer to keep that key in your iCloud account, which you still have to unlock with a username, password and various other things, but in theory there is still an ability for the service provider to unlock that service in some way. Or they give you the key and you have to store it, and they basically get you to agree that if you lose it, then you've lost everything. I actually went for the latter, but even the version where they keep the key in iCloud is actually a very good balance. So the technology is absolutely there.
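The two custody options can be sketched like this, again assuming Python's `cryptography` package. This illustrates the trade-off being described, not Apple's actual implementation.

```python
# Sketch: user-held key versus provider escrow for the same locked box.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
locked_data = Fernet(key).encrypt(b"the user's private data")

# Option 1: only the user holds the key; if they lose it, the data is
#           unrecoverable by design, and the provider never sees the key.
# Option 2: the provider escrows a copy (roughly the iCloud-style
#           arrangement), so access can be restored, but the provider
#           *could* open the box if it chose to.
provider_escrow = key  # set to None to model option 1

# The user loses their copy of the key:
user_copy_of_key = None

if provider_escrow is not None:
    # Option 2: recovery works, at the cost of trusting the provider.
    print(Fernet(provider_escrow).decrypt(locked_data))
else:
    # Option 1: nobody, including the provider, can recover the data.
    print("Key lost: the data is gone")
```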

Speaker 3:

It's harder to implement, obviously. Going back to the original analogy, it's easier to keep a trillion records in one large database, from the point of view of the initial implementation, and it's much easier to have one key for the whole thing rather than a trillion keys. The whole management of those different data stores, and the way you access them, is more complicated. However, the technology does exist, and it's fairly well established. There are a number of different approaches to doing it, and a number of fairly well tried and tested ways of implementing it, through two-factor authentication, device authentication and so on. So there is a way to implement this that gives considerably more control back to the consumer, or to the individual, while also enabling them to continue to consume those services in much the same way as they do now. No one, I think, would argue, unless they've lost access to their Apple account, that on a day-to-day basis they see any difference in the way all of that data is stored; nonetheless, it is actually quite well encrypted. And that's why companies like Accenture, of course, are well positioned to implement this.
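One hedged illustration of those "tried and tested" mechanisms: gating the release of a per-user key behind a TOTP second factor. The sketch assumes Python with the `pyotp` and `cryptography` packages; it is a toy, not a production design.

```python
# Sketch: release a per-user data key only with a valid second factor.
import pyotp
from cryptography.fernet import Fernet

# Per-user encryption key and TOTP secret, created at enrolment;
# the TOTP secret is shared with the user's authenticator app.
user_key = Fernet.generate_key()
totp = pyotp.TOTP(pyotp.random_base32())

ciphertext = Fernet(user_key).encrypt(b"per-user data store contents")

def unlock(code: str) -> bytes:
    """Release the user's data only when the current TOTP code matches."""
    if not totp.verify(code):
        raise PermissionError("second factor failed")
    return Fernet(user_key).decrypt(ciphertext)

print(unlock(totp.now()))  # succeeds with a valid code from the device
```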

Speaker 3:

I think one of the challenges with this is making sure that we do not replace one poor system with another poor system. There is a risk that a large organization gets to that final stage, where they've implemented everything just right, and then somebody goes in and says, well, let's just leave one master key under the mat, just in case. As soon as you do that, you destroy the entire thing; it basically pulls the rug out from under the whole system. So if you implement it, you have to commit to it.

Speaker 3:

You also have to make sure, and this is where things like open source come in, that the system you're implementing is one that you also relinquish some control of. You may use it, you may support it, you may develop it, you may charge for providing services on top of it, but you need to make sure that the system itself is part of the internet: an open framework that everyone can use, contribute to and build on. That truly gives us that control back, without any back doors. And that's my biggest concern about that last piece: the worry of "wait a minute, I'm actually giving up control", which is understandable. As humans, when we have control, we naturally quite like to keep it.

Speaker 2:

Trust 3.0 have just released a report; maybe we could talk about some of that. The first theme was around implicit data. So, first of all, what is implicit data, why is it important, and what should we be doing with it?

Speaker 3:

So implicit data is all of the data that's been collected before, during and after you consume a service, which you won't necessarily be aware of. Even before the current advances in AI, companies could pretty much work out what you want. They could work out that you're going to order a coffee in 20 minutes from the Starbucks on the high street in Sheffield, even though you may not have done that before, because they have so much data on you that they can work it out. Now, obviously, one of the advantages of that is that when you get there, your coffee's already ready, it's nice and piping hot, and there's a table outside because they know that you like to sit outside, and that's a really lovely user experience.

Speaker 3:

So the use of implicit data isn't inherently a bad thing. It's the way in which it's used, the transparency around how it's used, and the length of time for which it's retained. If all of that data was used, and they knew which coffee you liked, and you got the coffee and your favourite seat, but they actually didn't know who you were, they didn't know it was Andrew, they just knew from all of those implicit data points what you wanted, and you were still able to consume that service, that suddenly changes things slightly. But that's the issue with implicit data: people don't realise just how much data is being collected and used on an ongoing basis.
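As a rough sketch of how that anonymised coffee order could work, using only Python's standard library: the implicit data points (usual order, preferred seat) are keyed by a keyed-hash pseudonym rather than a name, so the system can have the order ready without ever knowing it's Andrew. The identifiers and salt are illustrative assumptions.

```python
# Sketch: personalise from implicit data without storing an identity.
import hashlib
import hmac

SERVICE_SALT = b"rotate-me-regularly"  # rotating it unlinks old history

def pseudonym(device_id: str) -> str:
    """Stable pseudonym derived from a device identifier, not a name."""
    return hmac.new(SERVICE_SALT, device_id.encode(), hashlib.sha256).hexdigest()

# Implicit profile stored against the pseudonym only.
profile = {
    pseudonym("device-1234"): {
        "usual_order": "flat white",
        "prefers_outside_table": True,
        "typical_arrival": "08:40",
    }
}

# At service time: look up by pseudonym, personalise, learn no identity.
print(profile[pseudonym("device-1234")]["usual_order"])  # flat white
```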

Speaker 2:

That brings me to another point which was raised in the report: the value trade-off when it comes to consumer privacy. You gave some great examples there of data that could be captured and used. So two things: first, where is that balance? And secondly, if we move to sovereign or decentralised identity, does that remove that level of information, so that brands say, well, we can't give you a personalised experience because we don't know who you are? Where does the pendulum swing between helpful and creepy?

Speaker 3:

This really goes back to the question you asked earlier about Imagination. We are passionate about creating really personalised services for customers, and actually very immersive, joyful experiences that can be made more joyful and cooler as a result of us using certain data points, whether they are based on you as an individual or the way you interact in a space. My belief is that you can create a service that is anonymised to quite a high degree while also being able to use some quite sophisticated data points in the moment; it's the way in which that's implemented. Part of that could be that you give permission for the data to be used at a certain point and then that data is erased, so the service can be provided at that point without the company providing it having any long-term control over it.
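A minimal sketch of that use-then-erase pattern in Python; the class, names and retention window are illustrative assumptions, not a Trust 3.0 specification.

```python
# Sketch: hold consented data just long enough to deliver one moment
# of service, then erase it, so no long-term record accumulates.
import time

class EphemeralConsentStore:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._records: dict[str, tuple[float, dict]] = {}

    def grant(self, session_id: str, data: dict) -> None:
        """User consents: data becomes available for one service moment."""
        self._records[session_id] = (time.monotonic(), data)

    def use_once(self, session_id: str) -> dict | None:
        """Read the data if still within its window, then erase it."""
        entry = self._records.pop(session_id, None)
        if entry is None:
            return None
        granted_at, data = entry
        return data if time.monotonic() - granted_at < self.ttl else None

store = EphemeralConsentStore(ttl_seconds=60)
store.grant("session-42", {"seat": "outside", "order": "flat white"})
print(store.use_once("session-42"))  # served once...
print(store.use_once("session-42"))  # ...then gone: None
```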

Speaker 3:

The flip side of that is that they have all of the data, but they just don't know who you are. It's completely anonymised data, which is still very useful for brands to improve the experience, and even to understand how consumers interact with the brand and what products they buy.

Speaker 3:

But they don't know it's you in particular. You, as Andrew, might go to a show in Berlin, then another one in LA, then another one in London, interact with those and buy certain services and products, but if you asked them who that person was, they wouldn't know. And this goes back to what we were saying earlier about Trust 3.0: we want to start not with the technology, but with the approach and the framework, to understand what you need to be able to provide a really rich service and what kind of relationship you're looking to build with your customers, and then build a framework that enables you to deliver those services in a transparent way, while also protecting your customers from the risk of overly intrusive or creepy technology taking over.

Speaker 2:

So when we consider the ethical ramifications of any technology, that should be baked into strategic planning for launching any new initiative, from vendor profiling to test and learn. Technology on its own seems neutral; it's the use case that supplies the risk. Where do ethics and innovation meet when it comes to privacy and trust?

Speaker 3:

When you're building those approaches and frameworks, you need to build your own ethics framework. Obviously, ethics means different things to different people; I'm not saying that you can simply come up with your own version of an ethical framework and then declare that you're being ethical, because clearly some things aren't. But I think if you come up with your approach, you're open about what that approach is, and then you implement and use those services in a way that honours it, that's where the two meet. Where it falls apart is if you're opaque in your approach, or downright dishonest. The ethical framework needs to be open, and then you use the technology to implement it the way you've said you would.

Speaker 2:

So I'm the Actionable Futurist and I like to look far enough ahead to be useful. Where do you think the trust debate will be in 12 months, and what will Trust 3.0 have been doing to further this debate?

Speaker 3:

We'll be continuing to build out those frameworks, working with some brands to help identify the best approaches and frameworks, holding events and symposiums to have these debates in an open forum, and working towards providing training and support for how you start to implement some of these things, especially around decentralization and some of the identity areas that we've been discussing.

Speaker 3:

And within 12 months, we're really looking to get to a position where we can start to provide a certification framework. We would do the initial survey, or light audit, and then we would create a framework that enables other companies to do the actual, proper auditing and certification, with a Trust 3.0 certification stamp on it. That's where we're looking to get to, and beyond that, we'll just continue to help brands and companies do this better, so that we get to the point where we have a very open, fair and equitable internet. And actually, when we talk about the internet now, we're talking about our day-to-day lives: we're all using the internet, or relying on it, even when we're in a very analogue situation. So it's not just about protecting the internet itself; it's about protecting our society and how we operate going forward.

Speaker 2:

Almost out of time. My favourite part of the show, where we ask our guests a quick-fire round. iPhone or Android?

Speaker 3:

iPhone.

Speaker 2:

Window or aisle?

Speaker 3:

Window.

Speaker 2:

In the room or in the metaverse?

Speaker 3:

In the room.

Speaker 2:

I wish that AI could do all of my...

Speaker 3:

The first thing I thought of was homework.

Speaker 2:

What's the app you use most on your phone?

Speaker 3:

Probably calendar.

Speaker 2:

Best piece of advice you've ever received?

Speaker 3:

Purpose beyond self.

Speaker 2:

What are you reading at the moment?

Speaker 3:

I'm listening to a lot of Lex Fridman podcasts on AI.

Speaker 2:

Who should I invite next onto the podcast?

Speaker 3:

Lex Fridman.

Speaker 2:

So, as this is the Actionable Futurist podcast, what three actionable things should our audience do today when it comes to better understanding the opportunities and threats of a trusted environment?

Speaker 3:

Treat data privacy as being as important as security and sustainability; don't wrap it into those things, it's its own thing. Seek support and advice, not necessarily from Trust 3.0, but from whatever sources, to ensure that you understand what the challenges and opportunities are. And in the same way that many companies and individuals are looking at things like AI, look at data privacy now with the same lens; AI will probably help in that respect. Focus on it now, because it will benefit you in the future, both personally and as an organisation.

Speaker 2:

Anton, a great chat. How can people find out more about you, your work and the work of Trust 3.0?

Speaker 3:

For Trust 3.0, it's trust30.org. For Imagination, it's imagination.com. For me, probably LinkedIn.

Speaker 2:

Thanks so much for your time today. Great to chat about this very important part of technology going forward.

Speaker 3:

Thanks very much, Andrew.

Speaker 1:

Thank you for listening to the Actionable Futurist podcast. You can find all of our previous shows at actionablefuturist.com, and if you like what you've heard on the show, please consider subscribing via your favourite podcast app so you never miss an episode. You can find out more about Andrew and how he helps corporates navigate a disruptive digital world with keynote speeches and C-suite workshops, delivered in person or virtually, at actionablefuturist.com. Until next time, this has been the Actionable Futurist podcast.

Trust and Data Privacy in the Future
Data Ownership and Decentralized Identity Importance
Trust and Privacy in the Digital Age