Q&A with Stephen Pattison: The Public Debate About Technology Needs to Be a Two-Way Exchange

Posted on 30th July 2020
Tell me a bit about yourself, what it is you do and what drove you to join the advisory board?
I work for Arm Ltd, a global high-tech company which is probably best known for designing microprocessors for mobile phones, but we are involved in many other things too. Our designs are increasingly used in a wide variety of situations, from powering high-performance computers to edge devices for the Internet of Things. Energy efficiency has long been one of our main goals. Security is also important to us. We get involved in a variety of public policy issues linked to technology, including the relationship between technology, security and privacy.
The key thing here is trust: new emerging technology can bring huge benefits to us as individuals, and perhaps even more importantly to society as a whole. But that will only happen if people trust it. Those of us who are supporters of the new technology need to make sure that we are alert to people’s concerns, and that in developing our technology, we take account of those concerns from the beginning.
Access Partnership can help with that: we work to get the technologists’ arguments out as well as addressing the concerns the public has. As we look at the debate unfolding in different parts of the world – whether in Europe, in the United States or in a multilateral organisation – we can see contentious and difficult issues coming to the top of the agenda. Technology is becoming part of a geopolitical struggle in which it is seen as the new frontier in a new Cold War. This is sad: the progress made in recent decades has been a global effort. We must try to manage these issues in ways which help technology advance for the benefit of every country and every person. That is where I think we can all play a role.
What are the most pressing issues in tech for the 2020s?
The overriding one is winning public trust in technology. If you look at artificial intelligence, it is still pretty much in its infancy, but over the next few years it will take off. It has the potential to improve the way we do many things, including by spotting correlations in data that we have not so far noticed. But people are cautious about it because it can be used to help or support outcomes that people are uncomfortable with. We need to get this right; it is about balancing risk and reward. If we don’t get it right, we will lose out.
The geopolitical dimension is the next biggest issue. If we’re not careful, we’re going to see technology used as a weapon in a new confrontation between superpowers, and that would be a step in the wrong direction if we want to encourage continued research and development on new technologies.
The third big area is technology and climate change, or sustainability. There are two aspects to this: certain technologies can be used to help us use energy more efficiently. But at the same time, we need to watch the energy footprint of the technology sector as a whole.
With the exponential growth in data-driven technologies such as AI, 5G and IoT that we have seen over the last couple of years, how important is it that organisations and governments make the right decisions when it comes to adopting these technologies?
It is crucially important. Government action could make or break a new technology, or its take-up. Without a sensible approach to data protection and security, the public will have no confidence in it. But if our approach to these things is to take pre-emptive regulatory action, we risk suffocating innovation and experimentation and stifling the technology before it really takes off. So the balance that governments must strike between sufficient regulation and over-regulation is a difficult one. Over the next few years, we will see this debate come into sharp focus. We are already seeing it a bit, with governments looking at regulating for more secure IoT products, for example (a good thing), and at what are acceptable uses of facial recognition technology. Everyone is now beginning to look at the ethical framework for artificial intelligence. In all these areas we will see governments taking action. The right sort of regulation or action can help win public confidence in the technology; the wrong sort will suffocate it.
What should companies do when it comes to adopting these technologies?
Companies need to balance listening and leading: they need to listen to public concerns, but at the same time search for ways to continue to lead in innovation and in addressing those concerns. From the very beginning, when they are designing a product, companies need to ask: how can they make those products more trusted, more secure? How can they consider the ethical concerns during the design phase? How can they make sure those products are not consuming any more energy than is absolutely necessary? So there is a big responsibility on companies, and it is not enough for them to say: “all we do is make technology, it is for someone else to decide whether the technology is acceptable”. Companies themselves must start designing with these concerns in mind.
With regards to regulation – what are some of the key obstacles when finding a good balance between innovation and protecting citizens?
The temptation with regulation is, as we say in English, “to throw the baby out with the bathwater.” The better approach is for regulators to consider what is the smart way to address real concerns. Often those concerns are about specific uses of a technology, so smart regulation will address the uses rather than simply regulating the technology itself. Take facial recognition, for example: it is one thing to use it to confirm that you are the person whose photograph is on your passport. That seems a relatively uncontroversial use of the technology. But the technology has proved more controversial where it is used to scan a lot of people in order to identify some who are on a police wanted list. This use can be more controversial because in those circumstances the technology may not be good at distinguishing certain faces from others. If those are the concerns, let’s regulate based on them rather than simply banning all facial recognition technology.
What is the role that Access Partnership plays when it comes to finding the right ethical and regulatory solutions for these technologies?
Access Partnership has a role to play in these debates, and it can use its expertise, experience and contacts to ensure that the debate is clear, focused and well informed. Sometimes debates about technology are not very well informed, so there is a need to explain carefully how the technology works, what it does, and to identify precisely what the concerns are. I think a company like Access Partnership, which has tremendous expertise, is very well placed to play that role. I sometimes call it demystifying the technology: explaining how technology works and what it does, in ways which a non-specialist can understand.
Finally, how do you like to spend your free time?
My two great hobbies are the arts and cricket. I enjoy almost anything to do with the arts, from music to painting to literature. One of the reasons I like cricket is that it reduces the element of luck more than any other team sport I know. In baseball, for example, which I also enjoy, a single home run can change the outcome of the match, whereas in cricket, whatever happens in one moment is not going to change the outcome of the match. The final result will depend on how people perform over many hours and days as a team, using a variety of athletic skills. That is what is so fascinating about it.
Interview by Ivan Ivanov, Senior Marketing Manager, Access Partnership