Hannah Fry: We're probably all aware of the increasingly complex risks that are facing businesses and individuals. But finding the right way to navigate those risks and to build resilience, well, that's not always so obvious. It's something that calls for expert decision making and understanding combined with the power of technology. But don't just take it from me, because we have got a great panel who are ready to discuss the importance of a human-led, tech-powered approach to risk and resilience. Well, we don't really want to start on a downer, but I think we should probably begin by talking about how significant and complex the risks are that we're facing at the moment.
Sam Samaratunga: I think sometimes it feels like we face one risk after another after another, you know? And many of these are pervasive. So we have risks relating to climate. We have risks relating to politics, geopolitics and social risks. Amongst this company there might be technology risks. So there are many, many different types of risks that we face, but I do think risk also brings opportunity, because the way you respond to these risks and you deal with those risks can be very positive, positive for society but also positive for an organisation. And the important thing is how quickly you can react to it.
Professor Sue Black OBE: I think one of the challenges is that the rate of change seems like it's forever speeding up. And so, you know, we all have to get used to more and more ways of using technology, different products and services, and at the same time we have to try and keep legislation up to speed with what's going on as well.
Hannah Fry: Are there any particular risks that you are worried about?
Julie Iskow: Sure. I think every leader in today's world is concerned about a myriad of risks. I, of course, on the technology side think about cyber security. I think about the risks associated with automation and lack of automation, but I also think in today's world, certainly in Europe, regulations are growing in complexity and scope. And keeping up with those is a huge risk for companies if they don't understand them and put, again, the technology into place, with people understanding what that requires.
Hannah Fry: Well, that point about understanding is an important one, because, I mean, I guess that applies across the board, not just with regulation. I mean, the things you're describing here are quite technical. You know, cyber security, data privacy, I mean, climate even as well. How do companies make sure that they have a really good understanding of these risks?
Julie Iskow: That's why we're here today, right? I mean, it is all about the people understanding. It all starts with the identification of the risks, the understanding of those risks, the impacts, the prioritisation of them and then, of course, understanding how one would go about mitigating those risks and coming up with plans in place.
Hannah Fry: And your thoughts, Sue? From your perspective, do the main risks that you see going forwards in time align with what's been said?
Professor Sue Black OBE: One of the biggest risks, really, is people not really being tech savvy. So that's at an individual level. You know, the more that you understand technology and you're confident with technology, the more that you understand what's going on in the world these days.
Julie Iskow: It also ties into the culture of innovation and a mindset, right? Everyone can be thinking that way and they become more open to the technology, but you need to be continuously thinking about it and being open to speaking about it, bringing it forth, making it visible and then talking about how you might mitigate those risks as well.
Hannah Fry: How much of this is about culture, then? How much of it is about education as in top down, 'Right, these are the risks that we need to understand in order to be able to mitigate them,' and how much of it is about creating a culture where people are continually learning about this stuff and thinking about ways to navigate it?
Sam Samaratunga: You know, the points that you're both making around being more aware and more understanding, those are really valid. And I think big organisations actually have a role to play in the education of people and society as well. But I don't think it's always about mitigating risk. So the point is, if you really understand risk, you can take risks in a more informed way. And frankly, if you don't take risks, an organisation won't move forward, because the best way to avoid risk is to stop doing anything, right? So, you know, that's a sure-fire way of avoiding risk.
Hannah Fry: Which in itself is quite a risky strategy.
Sam Samaratunga: Very risky. Very risky, right? So I think you have to take risks, but you have to do it in a much more-, in a thoughtful way, in an informed way, because that's how you get an edge. The other point I would make is when it comes to mitigation of risk, no matter how much you plan, no matter how you anticipate risk, another risk will come that you didn't really focus on. And that's where resilience comes in. So the organisation needs to bounce back and keep moving forward despite what's happening.
Hannah Fry: From what you're all saying, though, at the heart of this you're starting with people regardless of what happens externally. So how do you make sure that you design a group of people or foster a group of people where they are as resilient as possible?
Professor Sue Black OBE: So aren't we there talking about diversity and inclusion, which we now know is a massive strength? And I think if we want to future proof organisations, then I think diversity and inclusion is at the heart of that. Because if you think about it, if you're working with an organisation where you feel included, you feel safe, you feel supported, you feel that your ideas are valid and valued, then of course you're going to be a much more valuable employee to the company. But also, you know, creating a culture of inclusion across a company is definitely the right way to go, you know. And I think the companies that don't really take this seriously now are possibly not going to be around in five or ten years' time. You know, I think it's absolutely critical to companies' success these days.
Hannah Fry: I noticed you nodding.
Sam Samaratunga: I'm nodding because there's a really interesting point around diversity and inclusion. So it's one thing to have different views represented, and, you know, I think we all subscribe to the fact that if you have different genders, people from different backgrounds including social backgrounds, you get different perspectives. But actually, culturally, you also need to be open to listening. It's one thing having a point of view, but if those points of view are not heard and given the right level of credence, you haven't really succeeded.
Julie Iskow: Absolutely. And it's around diversity of perspectives, diversity of experiences, diversity of skills, all of it.
Hannah Fry: So that's the human side of it in isolation, as it were. But, you know, this series is about human-led, tech-powered, so I want to bring in the technology side of this too. So in what ways can technology help to avoid risks or to help ease the burden of risks?
Julie Iskow: Sure. I mean, I think a lot of organisations today have a lot of manual processes. A lot of it is tribal knowledge and so forth. And Sam, we work with a lot of customers jointly and provide the technology that can help to standardise, that can help in today's environment where you've got disparate data sources everywhere, help to make a process repeatable, defining workflows and processes so that it is just a matter of, you know, following those processes and leveraging that technology to help reduce the risk of error. You need data integrity. You need accuracy. You need efficiency. You need collaboration from all over the world with your teams, and technology can help make that happen and it can mitigate risks along the way.
Professor Sue Black OBE: So I think technology can really help us as people because it connects us all together. So, you know, when people ask me what's my favourite kind of technology, it's just anything that connects us together in all different sorts of ways, because I believe that we've got the answers. And what technology does for me is enable the right people to come together and interact.
Hannah Fry: I think when it comes to discussing risks, Sue, we are in a different world now than we were even ten years ago. In what ways has technology changed how we should approach risk?
Professor Sue Black OBE: Yes, that's a good question. I mean, we're all so much more connected with each other now. You know, we see some of the negative effects of that, but also if we want to solve real world problems, we need diverse teams coming together with all different sorts of backgrounds. And the fact that we're more connected now means that we can find those people and collaborate with them, and they could be anywhere in the world.
Julie Iskow: I also think that from a technology perspective, the technology capabilities available to help manage risk have increased dramatically. And the governance risk and compliance capabilities now that are available to organisations really make a big difference.
Sam Samaratunga: So on the one hand, technology allows us to do things we weren't able to do before much faster at much bigger scales. Organisations are complex, and increasingly so. They face multiple risks, and technology can help to bring that together so you see a panoramic view of risk in one place that you couldn't before. On the one hand, you have to have technology, because it stops mistakes that human beings might make sometimes, but I think, Sue, you made the point that there are also risks with technology. Being connected means our privacy could be sometimes at risk, for example. It means people's behaviour changes. You know, the iPhone was invented, I think, in 2007. Can you walk down the street today and-, how many people do you see who are literally walking like this? So, I mean, that's a very simple but very evident example of how behaviours have changed as a result of technology, sometimes for the better. We have to face it, sometimes for the worse. But the point I'm trying to make also is you have to take risks. And if you don't take risks with technology, it's very hard to move forward.
Hannah Fry: And I guess that there are, you know, future technologies which also present other kinds of risk, quantum computers just as a simple example (laughter).
Julie Iskow: Sure. There's the whole field of AI and ensuring that we're doing that in the right way, in an ethical way, in an unbiased way and so forth. So that's one element of it, for sure.
Hannah Fry: But then, I guess, just to pick up on what you were saying, there is a risk of not automating, of not investing in technology.
Julie Iskow: Absolutely, and I think that's a tremendous risk. And I think we're all saying that we need to have the technology there, absolutely. But it also relies, again, on the human. You need to identify what tools you're bringing in. You have to identify what problems you're solving with the technology. You need to configure the tools and the technology.
Professor Sue Black OBE: And that's another way that I think working together with different organisations and different people who understand different parts of what we're trying to do is really critical, because we can't all know everything about everything.
Julie Iskow: Companies are wondering, 'How am I going to do this?' And you brought up the concept of scale. I mean, to be able to meet those regulatory requirements at scale, a technology platform is essentially table stakes in today's world. But somebody has to think through the strategy, right, and do materiality assessments and select the data sources, ingest that data and map to the frameworks that have been selected, again by humans, with automation of course and technology to assist that, but also to think about the outcomes that you want and the stakeholders you want to report to. And that's a great combination of the human side of things along with the technology platform for speed, for scale and consistency and standardisation, putting it all together with the frameworks and meeting all the raters and rankers and so forth.
Hannah Fry: I really like this idea of seeing risks in a slightly different way as potential opportunities, but there are other situations in which your team does need to be resilient to bad risks. I mean, there's no other real way of describing it. What kind of characteristics do you see in companies who manage that well, who are resilient and have resilient teams?
Sam Samaratunga: I think that starts with the leadership of the organisation. I think being honest about what you are facing is important to be resilient. We don't always have the answers straight away, and I think it's really important to say, 'We know some of the answers, but we're working the other things through.'
Julie Iskow: I was going to say that having an incident response plan in place, practising what happens when an issue arises or something goes in a different direction, having that planned out through your organisation and practising. On the cyber security side, you always do tabletop exercises, and having the communication plan defined and communicated to those who will be within that escalation path, both internally within your organisation, but also externally as well. So planning and preparation certainly help you handle the risks.
Sam Samaratunga: But knowing that sometimes those plans are probably not going to be valid. But the point is you've been through it and you've had the experience of dealing with it.
Professor Sue Black OBE: So I think also there's the point of admitting when you've made a mistake, either as a person or as a company, because we all make mistakes.
Hannah Fry: What kind of skills do we need to get in place now to try and protect ourselves from what's to come going forwards?
Professor Sue Black OBE: Well, I think there's a big question, kind of, around skills and what skills you've got in the company. So I think diversity in terms of technical skills, in terms of leadership skills, but then also helping people within the organisation to feel safe, I guess, you know, when things might not be going perfectly all the time. So, kind of, coming back to that resilience.
Hannah Fry: There are some really key themes that are coming out in all of this. First, the reframing of risks, so that instead of being simply a bad thing they become a buffet cart of potential opportunities, as it were.
Sam Samaratunga: Yes, that's a very good way of looking at it.
Hannah Fry: But then also that people really at the very core of this-, that you have to make sure that there is understanding, that there's the culture, that there's the inclusion of ideas and of perspectives. And then I think also the risk of standing still, the risk of not keeping up to date with the change of technology and embracing what technology can offer.
Julie Iskow: Absolutely.
Hannah Fry: Is that a fair summary?
Sam Samaratunga: A very fair summary. And I would also pick up on the point about bringing different thoughts within an organisation, but actually knowing that working with other organisations can also move you forward.
Professor Sue Black OBE: I think that we're all now encouraged to be authentic, either as individuals or as companies, and I think the more that we do that, we can then see other people, other organisations and what they have to offer, and because technology provides us with all these incredible tools, we can all connect with each other and collaborate to solve the world's problems.
Julie Iskow: Absolutely. When we work together, we bring in other partners, other vendors that bring something to the capabilities that we provide to our customers with our platform.
Hannah Fry: Well, I think that's a very lovely point to end on. So all that remains is for me to thank you all very much for joining me. Really interesting conversation. You can catch the rest of our episodes of, 'Human-led, Tech-powered,' that are online now. Thank you very much.