AI Governance & Our Shared Future
We invite Robert Whitfield, Chair of the One World Trust, onto the show to speak about artificial intelligence and the urgent need for global governance of this emerging technology.
We invite Robert Whitfield, Chair of the One World Trust, author of AI Global Governance – What Are We Aiming For? and lead author of Effective, Timely and Global – the Urgent Need for AI Global Governance. Just how is AI changing our world?
What are the risks & opportunities from the spread and development of AI? What mitigation is needed to ensure a safer future?
Links to some of Robert Whitfield's content on the subject:
AI transcript - AI Governance & Our Shared Future
Simon Sansbury
00:00:00.0 - 00:00:06.519
Good evening and welcome to the Pompey Politics Podcast. I'm Ian 'Tiny' Morris. And I'm Simon Sansbury.
Ian 'Tiny' Morris:
00:00:13.26 - 00:00:36.08
Good evening, and welcome to the Pompey Politics Podcast. As many of our regular listeners know, we often deal with the complex issues involved in the local political scene, but this week we are trying to get our arms around a very big topic indeed: that of AI, artificial intelligence, and its governance. Now, of course, you could leave this to myself and Simon,
Ian 'Tiny' Morris:
00:00:36.259 - 00:00:47.04
but that would kind of be like giving a cow a musket and a mule a spinning wheel. So rather than us trying to deal with something far beyond our understanding, Simon, we've got a
Simon Sansbury
00:00:47.049 - 00:01:05.819
guest. We have indeed. We're joined this evening by Robert Whitfield, who's the chair of the One World Trust and has written some interesting articles and papers on these issues. I've put some links to those into the chat for everybody. So welcome to the show, Robert.
Robert Whitfield
00:01:06.22 - 00:01:08.29
Thank you very much. Good to be here.
Simon Sansbury
00:01:08.419 - 00:01:27.5
Thank you. And as Ian said, the audience are going to be much better informed by listening to what you've got to say on the subject than to what we have. So, just to help us understand your background a bit more, if that's OK, do you mind just introducing yourself and telling us how you got involved with this subject?
Robert Whitfield
00:01:27.87 - 00:01:41.069
Yes. This is not something I've grown up with. My background was in business, particularly in aerospace at Airbus, where I had various senior management positions.
Robert Whitfield
00:01:41.709 - 00:01:55.839
But after I left Airbus, I started to engage in environmental governance and environmental solutions, and did a master's. And then,
Robert Whitfield
00:01:56.61 - 00:02:07.4
at one point, when I was chair of the One World Trust, I read the book Superintelligence by Nick Bostrom.
Robert Whitfield
00:02:08.1 - 00:02:19.38
I'd had one more book on that topic, called Singularity, and Singularity seemed to be completely science fiction.
Robert Whitfield
00:02:20.429 - 00:02:48.339
But Superintelligence I really engaged with. I totally immersed myself in it; I marked it up page by page. I thought it was an absolutely brilliant book. And that started me thinking, because basically Superintelligence was flagging up a big problem that at the moment we don't have a solution for, and the solution is very, very difficult to define.
Robert Whitfield
00:02:48.779 - 00:02:50.6
And that's what the book is all about.
Robert Whitfield
00:02:51.16 - 00:03:12.16
And that then later got me into a particular international working group, which I chair, with the World Federalist Movement, and got me into leading a programme on AI governance and AI safety for the One World Trust. Which leads me to be
Simon Sansbury
00:03:12.169 - 00:03:13.179
here. Thank you very
Ian 'Tiny' Morris:
00:03:13.19 - 00:03:14.169
much.
Ian 'Tiny' Morris:
00:03:14.44 - 00:03:37.19
Marvellous. So, when we're thinking about artificial intelligence, it's one of those things: it's very much a buzzword, a phrase that's thrown about regularly, and we see it in the media. So what are we talking about when we say AI? Are we talking about things like my faithful digital assistants, which are all around my house? Is it self-driving cars?
Ian 'Tiny' Morris:
00:03:37.419 - 00:03:43.619
Or is it something different? Try and define it for us, please, Robert.
Robert Whitfield
00:03:43.63 - 00:03:52.99
Well, artificial intelligence was initially conceived by Alan Turing back in the early fifties.
Robert Whitfield
00:03:53.699 - 00:04:10.38
And since then there have been multiple definitions of what it is, so there is no one fixed definition. And that's one of the problems with the various regulatory conventions: how to define it.
Robert Whitfield
00:04:10.929 - 00:04:29.859
There's a classic book on artificial intelligence by Russell and Norvig, and there they talk about systems that think like humans, systems that act like humans, systems that think rationally, and systems that act rationally.
Robert Whitfield
00:04:30.839 - 00:04:41.589
But Stuart Russell himself comes up with a definition which I think is good: that an AI is intelligent
Robert Whitfield
00:04:42.119 - 00:05:07.04
in so far as what it does is likely to achieve what it wants, given what it has perceived. And that basic concept, yes, you will find it in digital assistants, in self-driving cars, and in ChatGPT, which may help you to cheat in an exam if that's what you want to do.
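Russell's agent view can be sketched as a tiny decision loop: the agent picks whichever action scores best under its objective, given what it has perceived. Everything below (the thermostat setting, the utility numbers) is invented purely for illustration; it is not from any real system discussed in the episode.

```python
# Hypothetical toy illustration of the "rational agent" definition:
# an agent is intelligent in so far as its chosen actions are likely to
# achieve its objective, given what it has perceived.
def choose_action(percept, actions, expected_utility):
    # Pick the action with the highest expected utility for this percept.
    return max(actions, key=lambda a: expected_utility(percept, a))

# Invented example: a thermostat-like agent that wants the room at 21 C.
TARGET = 21.0

def utility(temp_reading, action):
    effect = {"heat": +1.0, "cool": -1.0, "idle": 0.0}[action]
    # Closer to the target temperature after acting = higher utility.
    return -abs((temp_reading + effect) - TARGET)

print(choose_action(18.5, ["heat", "cool", "idle"], utility))  # -> heat
```

The "intelligence" in this framing lives entirely in how well the utility function and its maximisation track what the agent actually wants.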
Robert Whitfield
00:05:07.589 - 00:05:12.179
But the key point is that the impact of AI on human existence
Robert Whitfield
00:05:12.859 - 00:05:25.339
is really only just beginning. Some of these things, search and recommendation algorithms, chatbots, facial detection, are all here today,
Robert Whitfield
00:05:25.72 - 00:05:30.92
but soon AI will impact every single aspect of our lives.
Ian 'Tiny' Morris:
00:05:31.109 - 00:05:52.959
And that's very interesting, because, again, I rely quite a lot on my digital assistants. As I think I've shared on the podcast before, as somebody with no useful vision, the ability to have all my music and a library of talking books that can pick up wherever I've left off, as I wander around the house,
Ian 'Tiny' Morris:
00:05:53.19 - 00:06:09.529
is enormously beneficial to me. And I think the flip side to that is obviously sometimes the irritation when, on a Facebook feed, you might have searched for something and suddenly it's suggesting other products of a similar nature you might wish to purchase.
Ian 'Tiny' Morris:
00:06:09.769 - 00:06:23.79
I mean, am I framing that as the basics of artificial intelligence? You said that's almost the AI of today rather than the AI of tomorrow.
Robert Whitfield
00:06:23.799 - 00:06:39.149
That's right. What you've described is a good application. The downside of that is that you'll probably find that your personal assistant is
Robert Whitfield
00:06:39.829 - 00:06:56.579
taking in a lot of information about you. And, as you said, you'll soon find, and they've done lots of tests like this, that your virtual assistant
Robert Whitfield
00:06:57.089 - 00:07:13.82
will in practice probably know you better than even your quite close friends. If you asked close friends about aspects of you, and you asked a personal assistant, you'd find that the AI actually knows you
Ian 'Tiny' Morris:
00:07:13.829 - 00:07:24.899
better. Yeah, it is something I've always been cognizant of, because, as I mentioned, I think I've got eight; I've got one in every room of the house.
Ian 'Tiny' Morris:
00:07:25.149 - 00:07:30.1
And so, yeah, they're constantly listening.
Robert Whitfield
00:07:30.329 - 00:07:36.959
Yeah, but you do then raise a point. I mean, maybe if one is happy with that, then fine, and
Ian 'Tiny' Morris:
00:07:37.48 - 00:07:45.0
yeah, we will probe some of those moral dilemmas later in the podcast. You know,
Ian 'Tiny' Morris:
00:07:45.179 - 00:08:02.339
it is the classic, isn't it? Somebody's listening to you. And I guess, as myself and Simon, who regularly do a podcast, we rather hope that at least somebody is listening to us, even if it's only the digital assistant. Simon?
Simon Sansbury
00:08:02.92 - 00:08:26.41
OK, so some of the things in the material that you've produced use terms like AGI and superintelligence. Just to help us contextualise what we're talking about: what do they mean, and what would they represent in practical terms in everyday life?
Robert Whitfield
00:08:26.809 - 00:08:35.669
Well, I think it's worth seeing these terms as a ladder of progression.
Robert Whitfield
00:08:36.21 - 00:08:58.2
At the moment, all the AI that any of us have had contact with is really narrow AI, or simple AI, though it's typically called narrow: it essentially does one thing, and it does that quite well, but it doesn't do lots of radically different things.
Robert Whitfield
00:08:59.349 - 00:09:11.489
What some people are working towards is the next step, which is a big step up: artificial general intelligence.
Robert Whitfield
00:09:12.059 - 00:09:16.57
And that's where an autonomous system
Robert Whitfield
00:09:17.109 - 00:09:24.53
surpasses human capabilities in the majority of economically valuable tasks.
Robert Whitfield
00:09:25.39 - 00:09:52.03
So it is a significant challenge to get there. But if you get to artificial general intelligence, then what that's saying is that you have an entity, a sort of system, that's really comparable or even superior to you in your economic capability. But still, of the three terms you came up with,
Robert Whitfield
00:09:52.83 - 00:09:54.599
that's the middle level.
Robert Whitfield
00:09:55.26 - 00:10:02.409
And then, going beyond that, there's the concept of artificial superintelligence. And there,
Robert Whitfield
00:10:02.919 - 00:10:29.45
really, there is no limit to the level. If you like, you've gone from 0 to 1, with 1 being our level of intelligence; that's what artificial general intelligence is. But artificial superintelligence may start by being twice as intelligent as us; it could be 10 times, it could be 1,000 times, it could be a million times more intelligent than us.
Robert Whitfield
00:10:29.969 - 00:10:58.58
And this links to the title of that first book, Singularity. People think that the rate of increase in intelligence is likely to become very rapid if we do reach that point of superintelligence, because a superintelligence is able to design better versions of itself,
Robert Whitfield
00:10:58.9 - 00:11:15.359
and probably, if you think of software, replicate them very rapidly. What you find in the press recently, and it's really only in the last six months, is the new term 'frontier models'.
Robert Whitfield
00:11:15.989 - 00:11:22.09
And if you think of what I've described as three steps on a ladder,
Robert Whitfield
00:11:23.21 - 00:11:34.729
then the concept of a frontier model is just: where has society got to in scaling that ladder?
Robert Whitfield
00:11:35.76 - 00:11:44.76
There are certain things that are likely to be of significance at that frontier, and that's what the focus is on at the moment.
Robert Whitfield
00:11:45.309 - 00:11:49.289
Sorry, go on. But in terms of, you asked about practical
Robert Whitfield
00:11:49.909 - 00:12:05.33
terms: for an AGI, think of a robot able to do what you can do. In terms of what an artificial superintelligence might be in practical terms,
Robert Whitfield
00:12:05.919 - 00:12:13.25
if you think of an entity that's able to manage a nuclear fusion plant, design and fabricate medicines,
Robert Whitfield
00:12:13.82 - 00:12:29.0
and manage a rocket launch to the moon, all at the same time, whilst rapidly designing and creating still more superintelligent entities, that's the sort of thing that could happen, if we allow it to happen.
Simon Sansbury
00:12:29.01 - 00:12:40.719
It's interesting you use the plural, 'entities', there, because, and forgive me for reaching for science fiction, in a lot of the science fiction
Simon Sansbury
00:12:41.059 - 00:13:05.969
explorations of singularity and artificial intelligence, they seem to focus on there being one artificial intelligence. Is that in reality likely to be the situation? Because it's not being developed in one location, is it? Is it likely to be multiple different iterations or versions, depending on who's developing it and how it's evolved?
Robert Whitfield
00:13:06.25 - 00:13:17.25
There are a range of scenarios, and I don't think we can be prescriptive now as to which one would be most likely. But
Robert Whitfield
00:13:19.309 - 00:13:37.659
if you think of there being several centres developing artificial general intelligence, and then improved intelligence beyond that, they might produce rival superintelligences
Robert Whitfield
00:13:38.02 - 00:13:52.429
at the same time. Some people have suggested there should be one global centre that works on the development of superintelligence, and that would certainly lead to the sort of single superintelligence that you're describing.
Robert Whitfield
00:13:54.349 - 00:14:03.469
Personally, I think we need to think very long and hard before we allow a superintelligence to enter our realm. And
Simon Sansbury
00:14:03.479 - 00:14:23.479
the material you were referring to talks about the importance of, effectively, a pause button, if the intelligence, if you like, is getting too clever. Now, fans of science fiction know how that seems to be a staple in a lot of the stories. It's
Simon Sansbury
00:14:23.88 - 00:14:37.77
a point of contention in a struggle for, for want of a better phrase, the dominant species on the planet. But is that why that pause is really important, before it gets too clever for us to be able to understand or
Robert Whitfield
00:14:37.78 - 00:14:42.619
control? That's right. I mean, the reality is that,
Robert Whitfield
00:14:43.14 - 00:15:11.08
in these frontier models, these large language models, which you've seen with ChatGPT, GPT-4 and so on, the makers, the designers of these models do not understand how they work. And that troubles many people; it troubles many computer scientists. And until one understands how something works,
Robert Whitfield
00:15:11.45 - 00:15:37.26
it seems foolish to be thundering on to develop something that is much, much more powerful, when you don't understand how the current one works. And so that was, back in March this year, the call for a pause. And often,
Robert Whitfield
00:15:37.78 - 00:16:05.03
what the pause was referring to was misstated. Some people talked about it as a pause stopping the development of artificial intelligence at all, which was complete nonsense. All they were saying was that we should stop scaling, which is going to the next level. And each time they've been scaling, they have been
Robert Whitfield
00:16:05.4 - 00:16:10.33
increasing the computational power applied to a system
Robert Whitfield
00:16:10.88 - 00:16:15.219
by a factor of 10. So it's an enormous increase in power.
Robert Whitfield
00:16:18.099 - 00:16:22.25
What was suggested back in March was that
Robert Whitfield
00:16:22.919 - 00:16:37.82
you had, with GPT-4, an extremely powerful, impressive AI system that people didn't understand, and which was much more capable than people were expecting.
Robert Whitfield
00:16:38.719 - 00:16:40.77
And those are
Robert Whitfield
00:16:41.46 - 00:16:59.71
characteristics which would not suggest: right, let's make something 10 times more powerful than this and see what happens. First of all, understand how GPT-4 works, and why it is so good,
Robert Whitfield
00:17:00.07 - 00:17:11.17
and then take a decision, which would be a very openly discussed decision, as to whether to scale, to go forward to a still more powerful version.
Ian 'Tiny' Morris:
00:17:11.38 - 00:17:26.4
So if we think about that, Robert. Because I think Simon's touched on it before, and there is something a touch dystopian about the robots taking over.
Ian 'Tiny' Morris:
00:17:26.599 - 00:17:41.25
But I guess, if we look at it through perhaps a more positive lens: what do you see as the opportunities for the use of AI? Is it simply
Ian 'Tiny' Morris:
00:17:41.5 - 00:17:56.969
there to advance the causes of darkness? Surely there have to be some positive societal, ecological or economic benefits from advancing it. So what is the case for it?
Robert Whitfield
00:17:58.099 - 00:18:01.02
Unquestionably,
Robert Whitfield
00:18:02.339 - 00:18:18.969
AI, and more advanced AI, can offer huge benefits to mankind. And, to be very clear, I totally accept that, and I'm looking forward to it.
Robert Whitfield
00:18:19.599 - 00:18:41.93
One of the recent things is nuclear fusion, where DeepMind has managed to control plasma, at the moment for two seconds, until the system overheated; but that wasn't DeepMind's fault.
Robert Whitfield
00:18:43.13 - 00:18:45.02
And we've seen,
Robert Whitfield
00:18:45.699 - 00:19:03.599
with some of their early systems five years ago playing Go, where early on the machine was really not very capable, and a couple of months later it was able to beat the world champion.
Robert Whitfield
00:19:04.01 - 00:19:22.01
And so, if they're able to control plasma for even a short space of time, there is a real chance that that could continue to develop and improve to the point where we start to have nuclear fusion, which would be a huge step forward.
Robert Whitfield
00:19:22.65 - 00:19:32.739
Similarly, the same DeepMind came up with their protein folding, which has dramatically impacted medicine.
Robert Whitfield
00:19:33.349 - 00:19:46.489
Demis Hassabis, who was one of the co-founders and is the current chief executive, is looking in the near term at quantum chemistry,
Robert Whitfield
00:19:47.13 - 00:19:50.079
and that leading to materials design,
Robert Whitfield
00:19:50.75 - 00:19:53.66
such as the idea of designing new materials
Robert Whitfield
00:19:54.489 - 00:20:09.75
so you could design them on a computer before you test them out laboriously in a laboratory. You could have room-temperature superconductors (at the moment they only work at extremely low temperatures), much better batteries, solar panels.
Robert Whitfield
00:20:10.4 - 00:20:22.979
So there are lots of things that people are currently thinking about and working on which would make a huge difference. And then, if you stand back, I think
Robert Whitfield
00:20:23.56 - 00:20:33.3
you can see society reaching a point where it is able to decide what it wants, where it wants improvements,
Robert Whitfield
00:20:33.92 - 00:20:39.41
and those improvements will be developed. People will be lifted out of poverty.
Robert Whitfield
00:20:40.15 - 00:20:42.5
There'll be huge economic growth.
Robert Whitfield
00:20:43.28 - 00:20:49.829
There can be solutions to climate change, to biodiversity loss and other environmental challenges. So,
Robert Whitfield
00:20:50.93 - 00:21:17.93
other things being equal, the sky is the limit; there's huge benefit to come. But the problem is that other things are not equal: just as the benefits get bigger and bigger as the AI gets more intelligent, so too do the risks. And that list of risks
Robert Whitfield
00:21:18.67 - 00:21:21.65
is very serious, and I can certainly go into that, and
Ian 'Tiny' Morris:
00:21:21.66 - 00:21:47.619
that's exactly where I was going to take you next, Robert. For every yin there is a yang, and I guess the history of technological advances shows us that, whilst there have been many tangible benefits to society, more often than not there is a spin-off, or a dark side, that
Ian 'Tiny' Morris:
00:21:48.42 - 00:22:17.479
you kind of have to balance. I remember having this conversation with somebody about whether the Internet is a good thing, and social media and connectivity; it's one I've mused on. So let's focus in on what those tangible risks are, perhaps the ones that are more near-term and that you can already see developing, rather than the more distant, catastrophic singularity
Ian 'Tiny' Morris:
00:22:17.93 - 00:22:21.609
where we are potentially all controlled by the robots.
Robert Whitfield
00:22:21.619 - 00:22:39.39
OK, well, one area is simply the impact, which we're experiencing today, of social media, which, as currently designed, sort of pulls
Robert Whitfield
00:22:40.609 - 00:23:07.63
billions of people into effective silos; it nudges them constantly into these silos where they can be targeted, where one stream of advertising will fit that whole group of people once you've corralled them into having the same set of views.
Robert Whitfield
00:23:08.209 - 00:23:37.449
So there's a huge issue about social media. Then there's the whole question of bias, which is quite different: the bias comes from the fact that these AI systems are trained on huge amounts of data, and that data just reflects society today and society in the past. And so there are all sorts of things that the AI, in learning,
Robert Whitfield
00:23:37.719 - 00:24:00.165
will come across: misogyny, or racism, or lots of things. Some of the more unattractive characteristics of humanity will appear in novels, will appear all over the place. And so the AI absorbs these, and that can
Robert Whitfield
00:24:00.175 - 00:24:22.18
come out in the form of bias if you use it for decision making. And a lot of societies are interested in using it for decision making, whether that's for employment, or whether you can get a mortgage, or whatever it is. And certainly in the early systems they found
Robert Whitfield
00:24:22.579 - 00:24:41.25
that no women were getting a mortgage, or no people of colour were getting a mortgage, or whatever it was, because, from its data, it had worked out a certain relationship.
Robert Whitfield
00:24:41.89 - 00:25:00.479
Then there's the surveillance which you have in places in China, with the Uyghurs, where you have this sort of big brother system of detecting people, and then facial recognition,
Robert Whitfield
00:25:00.739 - 00:25:16.68
and constant monitoring: you were there, and you've just come from there. Very much a big brother control, which for the Uyghurs has led to
Robert Whitfield
00:25:17.349 - 00:25:21.939
huge numbers of them going into
Robert Whitfield
00:25:23.0 - 00:25:26.65
camps because they've worn a headscarf or something.
Robert Whitfield
00:25:27.28 - 00:25:32.589
And then there are deepfakes:
Robert Whitfield
00:25:33.369 - 00:25:42.109
images, or now increasingly video, or video with
Robert Whitfield
00:25:42.66 - 00:25:58.619
sound, that is totally fabricated. I watched something yesterday where the UN Secretary-General was talking about this, and he was saying that he had seen
Robert Whitfield
00:25:59.31 - 00:26:18.25
a video of himself giving a speech in Chinese. And it was perfect; the lips were totally synced. The only point he made was that he can't speak Chinese. This was totally fabricated. And so if
Robert Whitfield
00:26:18.53 - 00:26:32.9
a speech like that had some antagonistic words, or belligerent words, or whatever, put into his mouth, there's huge scope for
Robert Whitfield
00:26:33.219 - 00:26:36.3
manipulation of people. It's
Ian 'Tiny' Morris:
00:26:36.31 - 00:26:59.53
a very interesting point, because, again, touching back on the world of science fiction: Douglas Adams with his Babel fish, which meant that all languages could be understood by everyone, and I think his observation was that it had led to more world wars than any other invention known to man. And I think there is this piece where,
Ian 'Tiny' Morris:
00:26:59.849 - 00:27:24.16
it's interesting, hearing you speak there, about media bias: if you look into US politics, if you are a watcher of Fox News, it reinforces the bias. But I think what you've described there is that we've always been able to say, 'Well, the camera never lies', or 'I heard it with my own ears'.
Ian 'Tiny' Morris:
00:27:24.79 - 00:27:48.04
And I guess we are, perhaps. I was listening to a podcast in the week, in the run-up to this, where they have newsreaders announce things and then say something completely silly, but in the genuine voice of the newsreader. And I've never quite worked out how they get the newsreader to come in and record the piece of nonsense.
Ian 'Tiny' Morris:
00:27:48.26 - 00:27:56.369
But I think I'm at the point now where they don't have to do that; they just sample the person's voice and build up a construct.
Robert Whitfield
00:27:56.459 - 00:28:17.64
That's right. And I think a lot of these systems use this concept of generative adversarial networks, which is a long phrase but a simple idea: you have one AI which tries to come up with the fake. Let's say it's a picture of a woman,
Robert Whitfield
00:28:17.91 - 00:28:32.04
the woman's face, and the other AI tries to determine whether that is the woman or an AI-generated fake.
Robert Whitfield
00:28:32.329 - 00:29:01.229
And so you have these two. Just as AlphaGo became brilliant and defeated the world champion at Go by playing itself, the same thing happens here: you have one making the fake, and the other detecting whether it is a fake or not. And so they both get better and better in their skills, and the net result is that you're having these remarkably convincing
Robert Whitfield
00:29:01.939 - 00:29:06.89
fakes, whether it's images or sound or whatever it is. So
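The adversarial loop Robert describes can be reduced to a toy sketch: a "generator" with a single parameter tries to produce samples that a logistic-regression "discriminator" cannot tell apart from real data, and each is updated against the other. All the numbers and the simplistic models here are assumptions made for illustration; real GANs use deep networks, not a single scalar.

```python
import numpy as np

# Toy sketch of the generative-adversarial idea (illustrative only):
# the generator's one parameter (mu) is the mean of its fake samples,
# and it learns to match real data drawn from N(3, 0.5) by trying to
# fool a logistic-regression discriminator.
rng = np.random.default_rng(0)
REAL_MEAN, REAL_STD = 3.0, 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

mu = -2.0            # generator parameter: mean of the fake samples
a, b = 0.1, 0.0      # discriminator: D(x) = sigmoid(a*x + b), "probability real"
lr_d, lr_g, batch = 0.1, 0.01, 64

for _ in range(5000):
    real = rng.normal(REAL_MEAN, REAL_STD, batch)
    fake = mu + rng.normal(0.0, REAL_STD, batch)

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real, d_fake = sigmoid(a * real + b), sigmoid(a * fake + b)
    a += lr_d * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    b += lr_d * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: ascend log D(fake), i.e. move mu so fakes fool D.
    d_fake = sigmoid(a * fake + b)
    mu += lr_g * np.mean(1 - d_fake) * a

print(f"generator mean after training: {mu:.2f}")  # ends up near the real mean
```

The two updates pull in opposite directions, which is exactly the "both get better and better" dynamic described above: the generator only improves because the discriminator keeps improving, and vice versa.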
Simon Sansbury
00:29:06.9 - 00:29:21.88
So I guess, if you've got that sort of technology, it can on one hand be used to create a new Beatles single, or it can be used to create a swearing tirade from a political leader, or an inflammatory speech from a political leader.
Simon Sansbury
00:29:22.66 - 00:29:48.339
There's the nice application for it, and then there's the dangerous level of application for it, isn't there? I guess that's the risk. So what are the sorts of groups or organisations, groups of people, that are working on this? Because it's quite clearly not a couple of students working from a workbench in someone's garage.
Simon Sansbury
00:29:49.42 - 00:29:57.239
What sorts of groups are working on that, what are they led by, and what would their goals be?
Robert Whitfield
00:29:57.91 - 00:30:06.979
Yeah, well, at the moment the space is largely dominated by
Robert Whitfield
00:30:07.689 - 00:30:21.439
the early players, who did very well in the field of tech generally. And so you have this small number of trillion-dollar companies;
Robert Whitfield
00:30:21.959 - 00:30:25.9
it's big tech. And
Robert Whitfield
00:30:26.56 - 00:30:48.16
it's now become quite a battle between these companies. One of them, for instance, Google, has become a trillion-dollar company through essentially one tool, which is search; it's been brilliant as a search engine.
Robert Whitfield
00:30:48.88 - 00:30:58.959
But you now have the concept that AI, in the form of GPT-4 or
Robert Whitfield
00:30:59.9 - 00:31:25.689
its future versions, could be applied to support an otherwise relatively weak search engine, to make it comparable to or even superior to Google. And people have been talking about the end of search, because AI is going to come in. So you've got these huge organisations who are playing for very big stakes.
Robert Whitfield
00:31:26.31 - 00:31:32.369
At the same time, you have got a lot of smaller AI firms,
Robert Whitfield
00:31:32.949 - 00:31:37.989
some start-ups with a few bright people.
Robert Whitfield
00:31:38.569 - 00:31:40.0
Now,
Robert Whitfield
00:31:40.599 - 00:32:05.979
it's very expensive, and you require huge resources, to train a GPT-4 or a Bard or whatever it is. But the use of the system requires far fewer resources, so it's much more realistic and feasible
Robert Whitfield
00:32:06.13 - 00:32:15.949
for a start-up to be focusing on the application of one of these foundational models.
Robert Whitfield
00:32:17.329 - 00:32:19.069
Then
Robert Whitfield
00:32:19.939 - 00:32:48.51
you have some developers who are very consciously working towards artificial general intelligence. And, as I say, it's a big question whether we want to reach artificial general intelligence, or whether we are ready for it. However, I read one of the participants just yesterday claiming the race is on,
Robert Whitfield
00:32:48.849 - 00:33:15.145
and he and his team are frantically trying to beat the others to be the first to reach this goal, even though it's a hugely questionable goal for humanity. Then there are the people working in safety; far, far too few of them. I think it's considered that much less than 1%
Robert Whitfield
00:33:15.155 - 00:33:40.729
of the total overall research effort for AI development, less than 1%, goes into safety, into thinking about the safety of the system. And Geoffrey Hinton, the computer scientist who is considered the godfather of deep neural networks,
Robert Whitfield
00:33:40.969 - 00:34:04.652
argues that 50% should be focusing on safety. Not less than 1%, but 50%. So a huge transformation. There are people collaborating, and there's some research done by academics, some by government institutions. But
Robert Whitfield
00:34:04.662 - 00:34:28.345
that's dwarfed on the whole by the commercial R&D budgets. And so what you have is a set of people with goals that are often very different. There are goals to improve business profitability. There are goals to achieve AGI, and go beyond that, for its own sake. And there are some
Robert Whitfield
00:34:28.355 - 00:34:52.05
who are taking a more balanced approach, trying to achieve benefits whilst avoiding the risks. And for me the key question is: what is the rush? People talk a great deal about innovation, that you mustn't slow down or impair innovation, as though that is some God-given thing.
Robert Whitfield
00:34:52.29 - 00:35:07.439
But if it's innovation in order to destroy the human race, it doesn't seem to me there's such a hurry. Surely we should just make sure that the innovative steps we are taking are steps in the right direction.
Simon Sansbury
00:35:08.179 - 00:35:13.679
And your point is, is it the age-old race of could we do a thing, rather than should we do a thing?
Robert Whitfield
00:35:13.689 - 00:35:14.399
absolutely
Ian 'Tiny' Morris:
00:35:14.409 - 00:35:20.37
Interesting, Robert. Your paper points out the need for greater regulation, and again,
Ian 'Tiny' Morris:
00:35:20.54 - 00:35:43.364
some of those very large companies that you mentioned there have perhaps had an uneasy kind of relationship with regulators. We regularly see the likes of Facebook and Apple really trying to stay out of the reach of the regulator.
Ian 'Tiny' Morris:
00:35:43.375 - 00:35:56.125
How can the regulation of these big multinational tech companies, with budgets far beyond what any government could hope for, be addressed?
Robert Whitfield
00:35:56.135 - 00:36:00.6
Well, it's a key question.
Robert Whitfield
00:36:00.79 - 00:36:16.675
And the simple answer is that governments have to stand up and do their job. But obviously that's much easier said than done.
Robert Whitfield
00:36:16.685 - 00:36:32.58
The US and China are the two places where these very powerful technology companies exist. DeepMind is an important major player, but it is part of Google, which is US.
Robert Whitfield
00:36:33.34 - 00:36:54.149
And because so much wealth has been created, politicians in the US have been a bit reluctant to engage in regulation. And on the whole there's less regulation in the States than elsewhere.
Robert Whitfield
00:36:54.35 - 00:37:13.459
But you've seen the European Union take a different approach: they've established the European Union AI Act. It hasn't been finalised, but it's near finalisation, in the last stage of the
Robert Whitfield
00:37:13.469 - 00:37:32.59
process. Now of course they are imposing European Union governance in relation to industries which are dominated by foreign countries, so it's a slightly different issue
Robert Whitfield
00:37:32.739 - 00:37:58.635
compared with the US. But basically the US has to stand up, and my sense is that there are politicians in the States who want to do that. But the balance between Democrats and Republicans in Congress is on a knife edge, and although there's a
Robert Whitfield
00:37:58.645 - 00:38:13.695
desire, I think, from the Biden administration to regulate, they're nervous of losing the little bit of support they might get from a few Republicans in order to get things through.
Ian 'Tiny' Morris:
00:38:13.705 - 00:38:26.86
Do you think this speaks, Robert, to... just hearing you talk there, I'm drawing parallels a little bit with the nuclear arms race of the seventies and eighties.
Ian 'Tiny' Morris:
00:38:27.06 - 00:38:38.929
I wonder whether there is an attitude that basically says: look, if in the US we don't have ours, and it's bigger and better than the Russians' or the Chinese',
Ian 'Tiny' Morris:
00:38:39.639 - 00:38:54.979
their version of this will dominate us, and then it becomes the conversation about the end of freedom. And I wonder whether that pervasive thinking does exist in the minds of governments.
Robert Whitfield
00:38:54.989 - 00:38:58.02
No, I'm sure it does.
Robert Whitfield
00:38:58.62 - 00:39:20.32
So you have these two types of competition. You have the competition between companies, though it may be between two US companies. Or, as you say, there's competition between two states. And in this case the two states that have the vast bulk of capability in AI
Robert Whitfield
00:39:20.53 - 00:39:46.169
are the existing hegemon, the most powerful, dominant country in the world, the United States, and the aspiring hegemon, China, who would like to be the most powerful state in the world. And any point where one might replace the other is a period of
Robert Whitfield
00:39:46.37 - 00:40:02.6
high risk, at any time, if you go back in history. So you're absolutely right that there is company competition, but also international geopolitical competition.
Robert Whitfield
00:40:03.149 - 00:40:27.55
And so for me the key point is therefore to try to get away from that mindset of competition, and to explore as much as possible the mindset of cooperation. And between the US and China, in this area of AI, there are things where
Robert Whitfield
00:40:28.05 - 00:40:45.82
they think they have the same objectives. There will be some where there are different objectives, but there are some where they are the same, and they should be trying to build on that and make a start in addressing things where they have commonality.
Simon Sansbury
00:40:45.899 - 00:40:55.679
Are the regulators or the legislators even remotely ready or prepared for this sort of conversation or negotiation? Because
Simon Sansbury
00:40:56.05 - 00:40:59.34
social media has been around for 20 years, and legislators
Simon Sansbury
00:40:59.899 - 00:41:07.0
still don't seem to understand how that works, and how to protect populations from the worst
Simon Sansbury
00:41:07.53 - 00:41:17.58
elements of it. Is that also part of the challenge: that the technology is moving much, much quicker than the legislators tend to?
Robert Whitfield
00:41:18.76 - 00:41:35.04
Yeah. I mean, the challenge of regulating AI successfully and effectively is huge, and it's certainly very complex. It is very significant,
Robert Whitfield
00:41:35.719 - 00:41:37.669
and it is very fast moving
Robert Whitfield
00:41:38.219 - 00:41:41.8
and it is often unpredictable
Robert Whitfield
00:41:42.709 - 00:42:08.35
and it is tied up, as we were just saying, in geopolitical struggle. So nobody should say it's a simple task. But just because it's a challenge does not mean to say that one should shy away and think, oh well, mañana, we'll have a go next year, or the year after, or the year after that.
Robert Whitfield
00:42:08.919 - 00:42:15.04
And in a sense, I suggest that's partly what's happened with social media.
Simon Sansbury
00:42:15.939 - 00:42:21.28
Yeah. So I guess, without heading straight down the dystopian track, then:
Simon Sansbury
00:42:21.79 - 00:42:32.699
what's at stake here? What are the risks for humanity if, quote unquote, we get this wrong? What's the risk of the bad side?
Robert Whitfield
00:42:33.31 - 00:42:47.169
I mean, if we get the near-term issues wrong, it basically means we're in for a pretty miserable time, with people divided, confused, mistrusting, misled.
Robert Whitfield
00:42:47.909 - 00:43:03.739
But as time progresses and you start to look at some of the bigger issues coming down the track, one of the things we haven't really talked about is the whole issue of employment, of work. Is it
Robert Whitfield
00:43:03.909 - 00:43:33.25
simply a source of income, which can come from government in the form of a universal basic income? Is getting some money the only reason for work? Or do people get a sense of purpose, a sense of identity, from the work that they do? Whether that's a King's Counsel, a barrister, whether that's a postman, whoever it is,
Robert Whitfield
00:43:33.629 - 00:43:43.429
you can still get a huge sense of worth and identity and purpose. And if you lose that, is that progress?
Robert Whitfield
00:43:44.32 - 00:43:54.459
And obviously wealth and riches are nice to have, so maybe there's a trade-off between the two. But
Robert Whitfield
00:43:55.09 - 00:43:59.1
the crucial thing is that we don't want to find ourselves
Robert Whitfield
00:43:59.659 - 00:44:06.479
with no work to do and then start thinking: oh no, now what are we going to do? And is this what I want?
Robert Whitfield
00:44:06.989 - 00:44:32.989
What we need to do is have the debate before that happens, and try to determine the sort of world we do want to have. And then maybe we go down the route that creates all that free time, and that's great. Or maybe we don't, and we handle it a different way. But OK, sorry, that was just one of the things we need to think about if we get it wrong: the whole question of work.
Robert Whitfield
00:44:33.199 - 00:44:53.939
We haven't talked about persuaders, brilliant persuaders. I mentioned this tool, the generative adversarial network, playing one network against the other. People are developing that concept
Robert Whitfield
00:44:54.57 - 00:44:56.75
in terms of persuasion.
Robert Whitfield
00:44:57.56 - 00:45:24.659
And what's coming down the track is persuaders. If you think of AlphaGo, it quickly trained itself up playing against itself, and beat the world champion. So there is a risk of world-champion persuaders being on the phone, at the door, on your mobile, whatever it is, morning, noon and night.
Robert Whitfield
00:45:25.32 - 00:45:40.669
So there's the whole issue of disinformation. If we don't ever find a way of sorting that out, we risk losing trust in anything.
Robert Whitfield
00:45:41.83 - 00:45:54.84
And all this, that's the medium-term stuff. And then there's the bigger stuff, which we've touched on, which is: if we find ourselves
Robert Whitfield
00:45:55.419 - 00:46:02.189
no longer the dominant species on the planet. And
Robert Whitfield
00:46:02.899 - 00:46:04.929
if we think of
Robert Whitfield
00:46:05.57 - 00:46:29.51
Alan Turing, back in the early 1950s, when computing was just starting. Even then it was clear, as he said, that at some point computers would probably exceed the intellectual capacity of their inventors, and that therefore we should have to expect the machines to take control.
Robert Whitfield
00:46:30.429 - 00:46:44.81
So we have the capability to avoid that if we want. But otherwise, that's what he suggested is likely. And similarly Geoffrey Hinton, whom I've mentioned, this godfather of deep learning,
Robert Whitfield
00:46:45.629 - 00:46:53.57
has stated that there is not a good track record of less intelligent things controlling things of greater intelligence.
Ian 'Tiny' Morris:
00:46:54.06 - 00:46:54.479
And it is
Robert Whitfield
00:46:55.78 - 00:47:04.06
And so what we've got to do is make sure we don't go beyond that threshold
Robert Whitfield
00:47:04.639 - 00:47:11.669
unless we understand what we're doing, and it is a positive, conscious decision by society.
Ian 'Tiny' Morris:
00:47:11.76 - 00:47:33.85
It's interesting hearing you talk there; I was reminded of the Shakespeare quote that if all the year were playing holidays, to sport would be as tedious as to work. So as you say, we've got to think about what that future societal construct is, if there is an AI bot doing the heavy lifting, slash heavy thinking, for us.
Ian 'Tiny' Morris:
00:47:34.09 - 00:47:44.139
So I think that beautifully outlines the need to get a global consensus on control.
Ian 'Tiny' Morris:
00:47:44.649 - 00:48:13.679
I was thinking about a couple of things. Human embryology and fertilisation seems to be something where globally we've managed to get a decent kind of regulatory framework. Or perhaps some of the climate crises: there's a conference every year where we agree that in 30 years' time we might get better.
Ian 'Tiny' Morris:
00:48:14.1 - 00:48:21.229
Do you see a route to a framework that could work for controlling AI?
Robert Whitfield
00:48:22.51 - 00:48:47.54
Well, there are various ideas floating around. And what is good, and I totally applaud Rishi Sunak for raising the banner of AI safety back in April or whenever it was, and for having this summit next week, is that it has caused
Robert Whitfield
00:48:47.83 - 00:48:53.389
lots of thinking, lots of proposals to come forward in recent months.
Robert Whitfield
00:48:54.06 - 00:49:21.439
But first of all, in terms of precedents, the best example that people quote is the Montreal Protocol for chlorofluorocarbons, which were creating a huge great hole in the ozone layer. And once that was fully recognised, and the cause was recognised, then
Robert Whitfield
00:49:22.07 - 00:49:24.399
the Montreal Protocol
Robert Whitfield
00:49:24.939 - 00:49:53.83
brought people together, in what's been termed experimentalist governance. They weren't sure exactly how they were going to do it, but they worked together, and they made improvements. So then they took the next step: they took each chemical, one by one, and gradually wound them down until production was down to zero. And that really did work. You've seen the ozone,
Robert Whitfield
00:49:54.29 - 00:49:59.33
the big hole, starting to close up.
Robert Whitfield
00:49:59.959 - 00:50:09.629
That sort of example, of the world cooperating together intelligently, is what we need to pursue.
Robert Whitfield
00:50:13.55 - 00:50:28.929
Thinking more directly about the specific nature of AI, the analogy people often come up with is the IAEA, the International Atomic Energy Agency.
Robert Whitfield
00:50:31.479 - 00:50:49.409
And there, one of the things they've been trying to control is the inappropriate use of plutonium and nuclear materials for warfare.
Robert Whitfield
00:50:50.09 - 00:51:05.6
The disaster in Iraq showed it's not easy. Everyone thought Saddam Hussein did have them, because he was behaving as though he did, but in fact he didn't. But
Robert Whitfield
00:51:06.629 - 00:51:19.56
on the whole the IAEA has been very successful in monitoring what's been happening, and managing that, overall, globally.
Ian 'Tiny' Morris:
00:51:19.679 - 00:51:28.989
One thing that struck me with that, Robert, is, again, is it something that the world
Ian 'Tiny' Morris:
00:51:29.29 - 00:51:53.75
could regulate? Because, again, in very simplistic terms, if you're going to start refining uranium or plutonium, you need a fairly sizeable plant and you need knowledge, and we've got the technology to be able to spot those things potentially happening across the world. I guess my question is, if you are working on generating
Ian 'Tiny' Morris:
00:51:54.35 - 00:52:04.0
advanced AI, is it not simply that you need, and forgive my simplistic language, just a really big room full of lots of really powerful servers?
Robert Whitfield
00:52:04.26 - 00:52:10.699
Yeah, and broadly you're right. And so that's why
Robert Whitfield
00:52:10.979 - 00:52:35.1
controlling AI and the development of AGI and superintelligence is much, much more difficult than the IAEA's task. But you can still draw on the constructs and the models of how the IAEA operates; it will just have to be much more detailed and complex. There are some tools which are
Robert Whitfield
00:52:37.409 - 00:53:03.879
slightly easier to grasp. There's an excellent book by Mustafa Suleyman called The Coming Wave, which came out in the last few weeks. There he lists an action plan of ten things that can be done to gain control over the development of AI, and he raises the idea of choke points.
Robert Whitfield
00:53:04.169 - 00:53:08.479
And in practice, the computer chips
Robert Whitfield
00:53:09.11 - 00:53:24.389
that are required for these generative large language models: there is not a mass of suppliers. There's really an incredibly limited number of suppliers, and you always come down to one, NVIDIA.
Robert Whitfield
00:53:24.979 - 00:53:48.919
If you physically controlled the distribution of those chips, if you stopped production for a period, that would have a dramatic effect on the development of new systems. And so there are some tools like that.
Robert Whitfield
00:53:49.5 - 00:54:09.399
But another one is energy. If a big system is being developed, there may be a lot of energy being consumed, which you may be able to monitor. And that becomes a bit more analogous to the monitoring of nuclear things.
Robert Whitfield
00:54:13.3 - 00:54:20.919
The huge challenge in designing these regulatory systems is that
Robert Whitfield
00:54:21.469 - 00:54:26.5
AI can move, is moving, very fast,
Robert Whitfield
00:54:26.689 - 00:54:45.709
and can be unpredictable, so the regime that you establish has to be agile. And that's not a word that fits easily with the UN's bureaucratic systems of trying to get 196 countries all to agree. I think
Ian 'Tiny' Morris:
00:54:45.929 - 00:54:51.439
with regulatory bodies generally, agile is not a word that fits well with them.
Robert Whitfield
00:54:51.75 - 00:54:53.3
That's right That's right
Simon Sansbury
00:54:55.929 - 00:55:06.85
So after the PM's speech this weekend, and you mentioned earlier the global summit next week at Bletchley Park, what do you feel is the likely outcome?
Robert Whitfield
00:55:08.11 - 00:55:34.969
Well, what you hope the outcome will be, yeah. I, like lots of people, have been following the statements very closely. And one thing that does seem to be emerging, at least it has been spoken about quite a bit, is the idea of an AI equivalent of the Intergovernmental Panel on Climate Change.
Robert Whitfield
00:55:35.55 - 00:56:05.35
When climate change started to be addressed, which goes back to 1992, the UN Framework Convention on Climate Change was first established. And one of the first things established as part of that process was this Intergovernmental Panel on Climate Change, the IPCC. And that produces
Robert Whitfield
00:56:05.659 - 00:56:23.36
a periodic report, every few years, which brings together all the intelligence from around the world: both what has been happening and what could be done to address the problem in the coming years.
Robert Whitfield
00:56:23.939 - 00:56:27.35
And that has been well respected.
Robert Whitfield
00:56:28.169 - 00:56:55.02
And so the idea is that you'd have an AI equivalent. Whether it's focused on AI or on AI safety, there are different views. But clearly, I think that would be an appropriate thing; whatever else happens, that would be an appropriate step to take. But
Robert Whitfield
00:56:55.459 - 00:57:08.189
that's only a very early first step. What you need is for this to really be the start of a high-profile process
Robert Whitfield
00:57:08.719 - 00:57:38.1
towards global AI governance. And so some people have called for this group to meet every six months until further notice, with working groups in between the meetings trying to set up some concrete proposals for the next meeting, which, if it were in May, is when maybe some significant institutions could be established.
Robert Whitfield
00:57:38.62 - 00:58:02.969
But that then triggers the question, which is a bit grey at the moment and which people haven't really talked about, of how this process starting at Bletchley Park relates to the current UN process, which has the Global Digital Compact discussions and a High-Level Advisory Body on AI. And
Robert Whitfield
00:58:03.35 - 00:58:11.639
so that's going along. Are they one single process? Are they two parallel processes? That needs to be clarified.
Simon Sansbury
00:58:12.209 - 00:58:15.81
OK, thank you. So it's the very start of the conversation,
Simon Sansbury
00:58:16.59 - 00:58:17.439
and the framework.
Robert Whitfield
00:58:18.489 - 00:58:21.219
Yeah that's right That's right Yeah
Ian 'Tiny' Morris:
00:58:22.1 - 00:58:42.09
So, Robert, thank you. Our time is drawing to an end. Thank you for leading us through a really difficult and challenging topic. Just in summary, would you like to finish with a few words on how you hope this story, perhaps won't end, but will progress in the next couple of years?
Robert Whitfield
00:58:42.76 - 00:58:44.86
OK.
Robert Whitfield
00:58:45.84 - 00:58:53.139
I mean, the crucial point, what we've been working for over the last few years, is (a)
Robert Whitfield
00:58:53.689 - 00:59:22.889
that people should be thinking about AI governance, and (b) that it should be global as much as possible. And what's really encouraging is that, particularly since March, since that open letter talking about a pause, there has been a discussion. We're having a discussion now, but there's been a discussion around the world about AI, about the risks, about the benefits. So
Robert Whitfield
00:59:23.129 - 00:59:44.78
that has started, and that's excellent. We know that the design of regulation is hugely complex. But if next week, Wednesday and Thursday, is the start of a real process with some sense of urgency,
Robert Whitfield
00:59:45.389 - 00:59:58.28
and we can establish an intergovernmental panel reporting on the situation, and then start to develop some institutions,
Robert Whitfield
00:59:58.919 - 01:00:03.59
then I will feel much more comfortable than I did, let's say, nine months ago.
Ian 'Tiny' Morris:
01:00:03.6 - 01:00:12.03
Brilliant. Robert, thank you ever so much for your time. You've been listening to the Pompey Politics Podcast. I've been Ian 'Tiny' Morris, and our guest
Simon Sansbury
01:00:12.04 - 01:00:13.639
today has been Robert Whitfield
Simon Sansbury
01:00:14.159 - 01:00:28.34
I've been Simon Sansbury. Please do remember to like, follow, subscribe, et cetera: click whatever buttons you need to make sure you get our next bit of content. Thanks again to Robert for joining us this evening, and you can join us next week at 6:27.