Shreya Gopal:
Hello and thanks for joining us on this episode of the A to Z of Tech podcast. I am your host Shreya, alongside Louise, and in this episode we are going to be looking at a really pertinent topic: P for Privacy. It's been 12 months of lockdown, which has resulted in changes in our behaviours, whether that be how we work, shop, or socialise, and which also has wider implications for the data we share, who we share it with, and how that data is used.
Louise Taggart:
Hi Shreya, yeah, I absolutely agree, those are really important factors at the minute. They also raise the question of whether our general attitude to privacy has changed, partly because we are now just so used to sharing our information online. However, we know quite a few people might be uneasy about this, thinking more broadly about the price we are paying for that kind of data and the privacy we are giving away. To explore that topic in a bit more detail, we have two guests with us today on the podcast, who can hopefully help us understand some of the technology behind the issues with privacy, both for individuals and businesses, and how we can think about protecting ourselves too.
Shreya:
Indeed, Louise. On that note, we are delighted to welcome John Mitchison, who is the director of policy and compliance at the Data and Marketing Association, and Fedelma Good, who is a director here at PwC, where she co-leads the Data Protection Strategy, Law and Compliance Services practice. Thank you both for joining us today. John and Fedelma have known each other for quite a while, so we are looking forward to hearing your stories.
John Mitchison:
It's nice to be here, thank you.
Fedelma Good:
Thanks, Shreya, looking forward to it.
Shreya:
Fedelma, if I could turn to you first. Could you tell us a little bit about your background and how you've come to work in data protection and privacy?
Fedelma:
What a lovely question to get right at the beginning, Shreya. I am involved in privacy because I am inherently nosy, which sounds like a complete contradiction in terms, but my background is actually in technology. When I started my working life, many years ago, I am not going to tell you how many, I was working in the Bank of Ireland, where we were redeveloping the bank's bookkeeping system. What I could see was that we were creating a separate record for people who had current accounts, a separate record for people who had savings accounts, and a separate record for people who had mortgages. When I asked why it wouldn't make sense to try and understand the relationship the bank had with the individual, I was told to go away, just get on with my coding, and leave those types of things to others. I've always inherently had an interest in the information, the data, that is held about people. I've been so lucky that, through the various career choices I've been able to make, I have ended up here at PwC, basically combining everything that interests me: information about people, using that for maximum benefit, but also keeping right at its heart the people to whom the information relates. As I say, it's an ideal job. If you are inherently nosy, get into privacy; I know it's counterintuitive, but it has worked for me.
Louise:
With that background in mind then, an interesting starter for 10 question might be, why is privacy important, why is it a topic that we are talking about at the minute?
Fedelma:
Louise, at the heart of privacy is the concept of trust. I know everybody listening is probably still suffering from GDPR fatigue, and possibly cookie fatigue at this point, but inherently we know we are moving into a future that is going to be based on data, using it and basically delivering value for everybody from it. If you and I, and anybody else, do not have trust in the organisations that are gathering our data and using it, then the whole thing falls at the first hurdle. For me, privacy embraces the element of trust that is so essential. At its heart, GDPR focuses on two key things: transparency with the people whose data is being gathered, and accountability from the people who are gathering and using it.
Louise:
You're mentioning data a lot there. I, and presumably the audience as well, have heard a lot of comparisons of data to oil, the most valuable commodity in the world, and all of those kinds of comparisons. What actually makes data a valuable asset?
Fedelma:
Well, think about the changes we were describing at the beginning, the changes we've all been through in the last year: the extent to which we are running our lives online, the extent to which we have moved our shopping habits online, even our viewing habits. What is happening in the background is that data is being gathered to give us the best experience it can online, in apps, and in viewing. It also helps organisations: well managed data can help organisations save costs by not sending information about products and services that they know people aren't interested in. Information can help us understand our spending, so that we know where and whether we have an overspend, as I do on coffee. It's all of those types of things that are really beginning to change, and it's hugely down to developments in technology. When I started in computing, the most expensive thing was the computing itself, the storage, the ability to hold the data. Now storage is accessible, we can gather huge volumes of data, and the most expensive commodity is the people who can actually analyse it effectively.
Louise:
You mentioned GDPR, which a lot of people will have heard of in the past couple of years or so. There have also been, for example, a number of high-profile data breaches, and all of these factors serve to raise awareness amongst the general consumer about some of the privacy implications of moving and sharing data. What do you think are some of the stress points, or the main issues, around this, both for businesses and individuals?
Fedelma:
If I take businesses first, the stress points come in fully understanding the requirements of the law, having the right skills to guide them in their compliance, and balancing compliance against innovation. Transparency is absolutely essential, but describing the technology transparently to somebody who is perhaps not familiar with it is a really difficult thing. If I look at the consumer, and those apps and the use of the mobile device that I mentioned earlier, it is so incredibly difficult to describe in accessible ways how cookies and similar technologies are actually operating on the device.
John:
Can I chip in there?
Fedelma:
Sure.
John:
This subject of transparency is right at the top of my agenda. At the DMA, the Data and Marketing Association, we've been running a consumer tracker for a number of years now; we do the study every three years, and the latest one was just a couple of years ago. There was one stat that popped out at me, something that really ought to be highlighted: 88% of people want to see more transparency about the way their data is collected and used. That's a huge majority. The difficulty, and it was something you were just touching on there, is that the technology marketers use, in the form of cookies, or some of the things they might be doing with data when it's being shared, analysed, or used for profiling, is very difficult to explain; it's quite technical stuff. There is definitely a challenge there. The word you used was trust, and trust is essential in all our marketing; again, it comes up a lot in research. But what marketers want to do with data is often quite benign. We want to be able to pick out the people we want to speak to and send them the right message, but in order to do that, we need a lot of data and we need to do some sort of classification or profiling, which gets quite technical. That's where people tend to get a little bit wary, because the word profiling can have a lot of negative connotations, although it's quite benign in the marketing space. The technology marketers have been using has been allowed to run away with itself. It's been doing great things, but now that the privacy regulations have been tightened up, there's a bit of a gap. In some ways we are going to have to close down some of what the technology in marketing did, because it was running away with itself, maybe getting outside the bounds of appropriate privacy.
Fedelma:
John, maybe it's worth taking a moment there just to explain the nuances of the General Data Protection Regulation and this wonderful thing that we refer to as PECR. Shall I kick off?
John:
Yeah, you go ahead with the GDPR.
Fedelma:
The GDPR is basically about putting individuals back in control of their personal data. As I mentioned earlier on, the key target there is transparency, so that John, and myself, and Louise, and Shreya, and everybody else who gives their personal data, have an understanding of what is going to be done with the data they are providing and also how they can exercise the rights that they have. A business has to be able to prove that it is compliant, and that's described under accountability. John, do you want to pick up PECR?
John:
Yeah, okay. PECR, or the PEC Regs, is the Privacy and Electronic Communications Regulations; the first iteration came out in 2002 and it has been amended a few times since then. As the name would suggest, it is about privacy in electronic communications. There is an overlap there with personal data. PECR isn't exclusively about personal data, it covers other things as well, but because of that overlap you do have to think about it in conjunction with the GDPR, the General Data Protection Regulation, when you are dealing with people's personal data.
Shreya:
John, coming back to you. How did you come to be involved in data and marketing, and could you tell us a little bit more about the DMA? I am sure that's an acronym our listeners heard earlier on.
John:
Of course, well I'll start with that, if you like. The DMA is the Data and Marketing Association. We are a trade association for companies engaged in marketing in the UK, but we have connections further afield with European DMAs and global DMAs as well; there's a whole bunch of us around the place. We've got about 1,000 members, split pretty much evenly across brands, agencies, and suppliers, and we represent those companies when we engage in lobbying activities. We have councils filled with experts from our various member companies that produce outputs, and we do webinars and other kinds of events. We also offer a lot of training, run by the world-renowned IDM, or Institute of Data and Marketing, with courses ranging from simple online courses to postgraduate diplomas.
Louise:
Thinking about your experience of working with businesses, and how they are able to secure or safeguard the customer or personal data they have access to, what do some of those best practices look like? This is probably a topic at the top of people's agenda at the minute, because, as we've said, we are sharing much more personal data, particularly, for example, health data, given the current situation. So I'd be interested to hear what some of those best practices might look like.
John:
Yes, it's funny you should bring that up, because again I am just looking at some research that we've done at the DMA. As well as our UK research into consumer attitudes, we actually did a global study, calling on all the DMAs from around the world. One of the major concerns people raised about their data going out to businesses, because it's shared quite widely around business and the internet, was the fear of hacking. Hacking is obviously a form of data breach, and we hear quite a lot about these in the news. There have been major hacks, and that is a concern for people, because it can cause so many problems further down the line. The loss of your data means that somebody else may have it and use it for nefarious purposes, which can affect your credit rating; at the very least you have to change all your passwords, which is a big enough job in itself. So security, looking after people's data effectively, is fundamental; one of the main pillars of the General Data Protection Regulation is that you have to have appropriate security measures.
Fedelma:
I might jump in there, John, because this is one of the areas we talk about with our clients a lot, particularly with our cyber team, who the data protection team work closely hand in hand with. One of the things in the context of technology these days is that you have data that is held, which we describe as data at rest. That's sitting in these large databases, so you need certain types of protection there. For instance, if there is a large database in your organisation, then only people who need to have access to that data, or even to certain parts of it, should be able to get at it. Those are referred to as access controls. Then you may have data which is moving from point A to point B, or from company A to company B, and that's data in transit, where you need to apply some other protections. A general principle of data protection is minimisation: thinking about whether you need to send all of the data, or whether, in fact, you can achieve your purpose with just a part of it. Encryption is a term that people will probably have heard of, and we'll all be used to the concept now that when you're moving data around, it's encrypted and protected.
Louise:
John, if I can pick up on something that you mentioned a couple of minutes ago, around personal email options and preferences. How can consumers ensure that their data isn't being used in ways they're not comfortable with? For example, that might be something like opting out of marketing emails.
John:
Okay, yeah, this is brilliant. Fedelma brought up the subject of transparency, and we've talked about that a lot; it's obviously very important. Imagine that you are faced with a form. Automatically you'd picture something online, but it could just as easily be a piece of paper or a conversation over the telephone. During the process of data collection, you're putting your information into a form, and there needs to be a way for you to understand exactly what's going to happen. When it comes to very obvious things, like whether I want to receive an email from this company on a regular basis, there will probably be a statement on that form that says just that: put your email address in here if you'd like to receive regular updates from us, or tick this box if you're happy for us to send you regular news and updates. We've all seen that kind of thing. Ticking that box is opting in. That's a positive statement, and you're basically telling that company that you don't mind receiving the occasional email from them. There will be more information associated with that, and that's always set out in a privacy policy. At the bottom of a form you will see a link to a privacy policy, or if it is on paper it might be on the back, or they might tell you where to find it online. There will be a lot more information telling you all about the company, how they hold their data, perhaps some of the security methods, what happens to it, whether any other companies get involved; all the kind of information you might need if you're concerned about where your data might end up. The good thing about email is that companies have to ask; they can't just send consumers emails because they feel like it. When I shop online, at the point of checkout there's almost always a question: do you want to get something from us in the future? Sometimes you do and sometimes you don't, and that's a choice, up to you.
When it comes to other methods of communication, traditional marketing, direct mail and telephone, there is also the option of an opt-out rather than an opt-in. That basically means: tick this box if you don't want us to send you something through the post. Sometimes it is a bit confusing, but it should be fairly obvious if people are following the right transparency rules. There are also other ways that you can affect what kind of marketing you get, or how your data is used in any context. A lot of companies now have what's known as a preference centre. So, as well as their privacy policy and maybe some information about cookies, there'll be a link to a preference centre. That will be a list of options you can choose from, relevant just to you as an individual. Yes, you might want to get an email, and you don't mind getting something through the post occasionally. You make the various changes in this preference centre, save them, and then the company knows exactly how they are supposed to deal with you in a way that makes you happy. That's a really good way of personalising things for people and making sure you don't irritate people with too much communication in one space, and all of that kind of thing. But there are a couple of other options as well.
When it comes to traditional marketing, which is the opt-out style of marketing, there are a couple of services available. One is called the Mailing Preference Service and one is called the Telephone Preference Service; MPS and TPS for short. Those are basically lists: you can go to the relevant websites, register your details, and you go onto a list which is then known as a suppression file. That suppression file is used by just about everybody in marketing to remove the names of people who have specifically said that they don't want to receive unsolicited marketing through the post or over the phone.
Fedelma:
John, one of the things I get asked about on TPS is the mobile versus landline scenario. TPS is for everything, isn't it?
John:
Yes, it is. For some reason people tend to assume it's landline only, but we're not fussy; the TPS will take any telephone number, landline or mobile.
Shreya:
John, thank you for that information regarding the TPS, because I know two of the banks have recently announced that a lot of scams are being carried out by text message as well, so mobile definitely comes into play there. Switching gears a little bit, we'd be interested in hearing from you on whether we have reached a generational tipping point, wherein young adults are now so accustomed to sharing their personal data online that this is their new normal, while maybe not all of us are as comfortable. We'd like to get your thoughts on that; we'll go to you first, John.
John:
Yeah, for sure. Through the research that we do at the DMA, we've split consumers into three main groups when it comes to how they feel about sharing data and giving data to companies, particularly in an online situation. We have data pragmatists, who are fairly middle of the road: if it's explained to them properly and they see value in the data exchange, then they will share their data. We have data fundamentalists, who are very careful and don't want to share their data with anybody. Then there are the unconcerned, who will do whatever they're asked to with their data and put it out wherever. Now, strangely enough, over the years that we've been tracking this study, it's the unconcerned group that has been growing fastest, and this is obviously down to younger people joining the community of online users and shoppers. They've grown up in a digital space, playing games online and using social media, and then suddenly they're shopping online; sharing data is just natural to them, and they don't see it as the problem that maybe some older people do. Having said that, in the very last study we did, the largest change in this pragmatist, fundamentalist, unconcerned dynamic was actually among slightly older people: it was the 55 to 64 year olds who made the biggest shift, and they are now happier to share. So as well as the younger people coming in at the bottom of the graph who are happy to share data, the older people at the top, maybe because they're learning how to deal with it and how to control their data use, are becoming much more comfortable sharing data as well. There are two sides to that.
Louise:
Some of those generational points that you've made, and how those trends evolve in the next few years, are definitely going to be really interesting to watch, both with the younger generation and, as you've said, the older generations as they begin to interact with different technologies and become more comfortable sharing data online. For a final wrap-up question to you both, I'll ask you to get your crystal balls out and do some gazing into the future. Fedelma, maybe if I turn to you first: is there future data protection legislation on the horizon around privacy, and do you think this evolving context of how different generations interact online will begin to shape what that looks like?
Fedelma:
The answer to that is yes. John spoke earlier on about PECR, the Privacy and Electronic Communications Regulations. That is implemented in UK law, but it derives from a European law, and Europe is at the point of finalising what the next generation of those e-privacy rules will be. As John said, they date back to 2002, with changes having been applied in the interim, but yes, we are looking at some changes coming. For instance, at the moment the cookie law requires that you get consent if you read anything from, or write anything to, a user's device, whether that's a laptop, a mobile phone, an iPad, anything. That includes getting consent if you're running straightforward analytics. If you're simply counting how many people are visiting your website, not that it's Louise, or Shreya, or John, but simply that there have been three visitors, that at the moment needs active consent. It's hugely difficult to explain in your cookie notice that you just want to count the number of visitors, because people have to take an active action to say, yeah, I'm okay with that. What we hope, or anticipate, certainly in that European law, is a move to a pragmatism that protects privacy but enables businesses to operate effectively in the digital space.
Louise:
Thank you, Fedelma. Any quick thoughts from yourself John on your part of the landscape.
John:
Yeah, absolutely. I personally think we're at a turning point, and I don't quite know how far we're going to turn. Three years ago now, GDPR was implemented, and there was a lot of concern when it first came in; it did make quite a few changes, fundamental changes to the way people thought about data. But over the years we've got to grips with that, and I think pretty much everybody now thinks that GDPR is a good thing: good for consumers, and good for businesses as well, not least because it forces them to understand how they use data, where it goes, and how they're processing it. But now that we've left the EU, and we're about to start coming out of the downturn in the economy due to Coronavirus, the government is very keen to encourage innovation and encourage business to get up and running again as quickly as they can. One of the things that the people at DCMS, the Department for Digital, Culture, Media and Sport, have said is that they don't want to be too restricted by data protection regulation. So, although GDPR is great, the way that it's interpreted, and it's supposed to be interpreted the same all across Europe, although that doesn't seem to have happened so far, can sometimes be a bit strict and a bit restrictive for businesses. We're hoping to see that open up a little, just a slight softening of the interpretation, to allow businesses to get the most out of it without increasing any harm to consumers along the way.
Louise:
Thank you, John. I'd be really interested to see how that develops in the next couple of years. I think, unfortunately, that does bring us to the end of today's episode, but thank you both, John and Fedelma, so much for joining us and sharing some insights from your professional experience and expertise. I was particularly interested to hear that the TPS covers mobile as well as landline numbers, so I think I'll probably be getting in touch about that.
Shreya:
Also, a big thank you to our listeners for joining us on this episode. Make sure to subscribe to our series, so you don't miss our next episode.