Filippa Lentzos on Global Catastrophic Biological Risks
Dr. Filippa Lentzos, Senior Lecturer in Science and International Security at King's College London, joins us to discuss the most pressing issues in biosecurity, big data in biology and life sciences, and governance in biological risk.
Topics discussed in this episode include:
- The most pressing issue in biosecurity
- Stories from when biosafety labs failed to contain dangerous pathogens
- The lethality of pathogens being worked on at biolaboratories
- Lessons from COVID-19
2:35 What are the least understood aspects of biological risk?
8:32 Which groups are interested in biotechnologies that could be used for harm?
16:30 Why countries may pursue the development of dangerous pathogens
18:45 Dr. Lentzos' strands of research
25:41 Stories from when biosafety labs failed to contain dangerous pathogens
28:34 The most pressing issue in biosecurity
31:06 What is gain of function research? What are the risks?
34:57 Examples of gain of function research
36:14 What are the benefits of gain of function research?
37:54 The lethality of pathogens being worked on at biolaboratories
40:25 Benefits and risks of big data in biology and the life sciences
45:03 Creating a bioweather map or using big data for biodefense
48:35 Lessons from COVID-19
53:46 How does governance fit in to biological risk?
55:59 Key takeaways from Dr. Lentzos
Lucas Perry: Welcome to the Future of Life Institute Podcast. I'm Lucas Perry. Today's episode is with Dr. Filippa Lentzos and explores increasing global security concerns from the use of the life sciences. As biotechnology continues to advance, the capacity for use of both the harmful and beneficial aspects of this technology is also increasing. In a world stressed by climate change as well as an increasingly unstable political landscape that is likely to include powerful new biotechnologies capable of killing millions, the challenges of biotech to global security are clearly significant. Dr. Lentzos joins us to explain the state of biotech and life sciences risk in the present day, as well as what's needed for mitigating the risk.
Dr. Filippa Lentzos is a mixed methods social scientist with expertise in biosafety, biosecurity, biorisk assessment and biological arms control. She works at King's College London as a Senior Lecturer in Science and International Security. Dr. Lentzos also serves as the Co-Director of the Centre for Science and Security Studies, is an Associate Senior Researcher at the Stockholm International Peace Research Institute, and is a columnist for the Bulletin of the Atomic Scientists. Her work focuses on transparency, confidence-building and compliance assessment of biodefence programmes and high-risk bioscience. She also focuses on information warfare and deliberate disinformation related to global health security.
And with that, I'm happy to present this interview with Dr. Filippa Lentzos.
To start things off here, we've had COVID pretty much blindside humanity, at least the general public. People who have been interested in pandemics and bio risk have known about this risk coming for a long time now and have tried to raise the alarm bells about it. And it seems like this other very, very significant risk is the continued risk of synthetic bio agents, engineered pandemics, and also the continued risk of natural pandemics. It feels to me extremely significant and also difficult to convey the importance and urgency of this issue, especially when we pretty much didn't do anything about COVID and knew that a natural pandemic was coming.
So, I'm curious if you could explain what you think are the least understood aspects of synthetic and natural biological risk by the general public and by governments around the world and what you would most like them to understand.
Filippa Lentzos: I guess one of the key things to understand is that security concerns around life science research are something that we must take seriously. There's this whole history of using the life sciences to cause harm, of deliberately inflicting disease, of developing biological weapons. But very few people know this history because it's a story that's suffused by secrecy. In the 20th century, biological weapons were researched and developed in several national programs, all of which were top secret, including the US one.
These programs were concealed in labs at military sites that were not listed on ordinary maps. Special code names and exceptionally high classification categories were assigned to biological agents and the projects that were devised to weaponize them. Bioweaponeers were sworn to secrecy and under constant surveillance. So, a lot of that just hasn't become publicly available. Much of the documentation and other evidence of past programs has been destroyed. Even when there were concerted efforts to bring war crimes and human rights abuses to public light, information about biological weapons programs tended to be suppressed.
One example of this is the Truth and Reconciliation Commission hearings in South Africa that followed apartheid. When the commission hearings began to uncover details about South Africa's biological weapons program, which was called Project Coast, they were faced with delays, they were faced with legal challenges, and the hearings were eventually shut down before the investigators could complete their work. Now, it became obvious to the investigators at the time who the head of that program was, but he was never brought to justice. Unbelievably, he remained a practicing medical doctor for many, many years afterwards, possibly even to this day.
What hasn't been concealed or destroyed or silenced from past biological weapons programs often remains highly classified. So, the secrecy surrounding past programs means that they're not well known. But there's also a new, contemporary context that shapes security concerns about life science research that we need to be conscious of, and that relates back to what I think is important to know about synthetic and natural bio risks today. And that is that advances in science and technology may enable biological weapons to emerge that are actually more capable and more accessible, with attacks that can be more precisely targeted and are harder to attribute.
So, synthetic biology, for example, which is one of the currently cutting-edge areas of life science research, that is accelerating our abilities to manipulate genes and biological systems. And that will have all kinds of wonderful and beneficial applications, but if the intent was there, it could also have significant downsides. So, it could, for instance, identify harmful genes and DNA sequences in a much quicker way than we've been able to so far. As a result of that, we could, for instance, see greater potential to make pathogens or disease-causing biological agents even more dangerous.
Or we could see greater potential to convert low-risk pathogens into high-risk pathogens. We could potentially even recreate extinct pathogens like the variola virus that causes smallpox, or, way further out, we could engineer entirely new pathogens. Now, pathogens in and of themselves are not biological weapons. You need to add some kind of delivery mechanism to have a weapon. The possibilities to manipulate genes and biological systems are coming at a time when new delivery mechanisms for transporting pathogens into our bodies, into human bodies or animal bodies, are also being developed.
So, in addition to the bombs and the missiles, the cluster bombs, the sprayers, and all kinds of injection devices of past biological warfare programs, it could now also be possible to use other delivery mechanisms. Things like drones or nanorobots, these incredibly tiny robots that can be inserted into our blood streams for instance, even insects, could be used as vehicles to disperse dangerous pathogens.
So, I guess to get to the bottom of your question, what I'm keen for people to understand, scientists, government officials, the general public, is that current developments in science and technology, or in the life sciences more specifically, are lowering barriers to inadvertent harms as well as to deliberate use and development of biological weapons and that there is this whole history to deliberate attempts to use the life sciences to cause harm.
Lucas Perry: It seems like there's three main groups of people that are interested in such technology. There's something like lone wolfs or isolated individuals who are interested in creating a lot of harm to humanity in the same way that mass shooters are. There are also small groups of people who may be interested in the same sort of thing. Then there's this history of governments pursuing biological weapons. Could you offer some perspective about the risks of these three groups and how you would compare the current technology used for the creating of synthetic pathogens to how strong it was historically?
Filippa Lentzos: Sure. Are we heading towards a future where anyone with a PhD in bioengineering could create a pandemic and kill millions? Is that what you mean? Well, a pathogen, even a bioengineered one, does not on its own constitute a biological weapon. You still face issues like agent stability, dealing with large-scale production and, importantly, dealing with efficient delivery, which is much easier said than done. In fact, what the history of bioterrorism has taught us is that the skills required to undertake even the most basic of bioterrorism attacks are often much greater than assumed.
There are various technical barriers to using biological agents to cause harm even beyond the barriers that are being reduced from advances in science and technology. The data that is available to us from past incidents of biological terrorism indicates that a bioterrorism attack is more likely to be crude, more likely to be amateurish and small scale where you'd have casualty levels in single or double digits and not in their hundreds or thousands and certainly not in their millions. Now, my own concern is actually less about lone actors.
Where I see real potential for sophisticated biological weapons in strategic surprise in the biological field is in one of those other categories that you mentioned, so it's at the state or the state sponsored level. Let me explain. Well, I already told you a little bit about how we've recently seen significant advances in genetic manipulation and delivery mechanisms. These developments are lowering barriers to biological weapons development, but that's really only part of the picture, because in making threat assessments, it's also important to look at the social context in which these technical developments are taking place.
One of the things we're seeing in that social context is a build-up in dual-use capacities. What we're seeing is that high containment labs that are working with the most dangerous pathogens are rapidly being constructed all over the globe. So, there are now more people and more research projects than ever before working with and manipulating very dangerous pathogens, and there are more countries than ever before that have biodefense programs. There are around 30 biodefense programs that are openly declared. The trend we're seeing is that these numbers are increasing.
It's entirely legitimate to have biodefense programs and they do a lot of good, but a side effect of increasing bio-preparedness and biodefense capacities is that capacities for causing harm, should the intent be there, and that's the crucial part, also increase. So, one person may be setting up all this stuff for good, but if somebody else comes in with different intent, with intent to do harm, that same infrastructure, that same material, that same equipment, that same knowledge, can be turned towards causing harm or creating biological weapons.
Now, another thing we're seeing that won't have escaped your notice is the increasingly unstable and uncertain geopolitical landscape. The world that many of us grew up in and know is one in which America was a clear, dominant power. We're now moving away from that, away from this hegemonic or unipolar power structure towards an international system that is increasingly multipolar. The most clearly rising power today is of course China, but there are others too. Russia is still there. There's India, there's Brazil to name a few. Those are things in the social context that we need to pay attention to.
We're also seeing that the nature of conflict and warfare themselves is rapidly evolving. And that's changing the character of the military challenges that are confronting states. Hybrid warfare, for instance, which blends conventional warfare with irregular warfare and cyber warfare, is increasingly likely to complement classical military confrontation. So, states that are increasingly outmatched by conventional weapons may for instance start to view novel biological weapons as offering some kind of advantage, some kind of asymmetric advantage, and a possible way to outweigh strategic imbalances.
So, states in this kind of new form of conflict, new form of warfare, may see biological weapons as somehow providing an edge or a military advantage. We are also seeing the defense programs of some states heavily investing in the biological sciences. Again, that could well be for entirely legitimate purposes, but it does also raise concerns that adversaries may be looking at those kinds of investments, thinking about hedging their bets, and similarly investing in more biological programs. These investments, I think, are also an indication that there are some real concerns that adversaries are harnessing or trying to harness biotechnology for nefarious purposes.
And we've seen some political language to that effect too, but a lot of this is going under the radar. So, all of these things, and there are more. The flagrant breach of the Chemical Weapons Convention, or rather continuous flagrant breaches of the Chemical Weapons Convention, for example the use of chemical weapons in Syria, or the use of very sophisticated chemicals like Novichok on Skripal, the Russian, in the UK, as well as other cases, is one other sort of context that plays in. Or even our recent experiences of natural disease outbreaks. COVID is obviously a key example, but it's not so long ago that we had all kinds of other outbreaks.
Ebola just a few years ago. There's Zika, there's MERS, there's all kinds of other emerging diseases. All of these could serve to focus attention on deliberate outbreaks. And all of these various elements of the social context as well as these technical developments could produce an environment in which a potential military or political utility for biological weapons emerges that alters the balance of incentives and disincentives to comply with the international treaty that prohibits biological weapons.
Lucas Perry: Could you explain the incentives of why a country would be interested in creating a synthetic pathogen when inevitably it would seem like it would come back and harm itself?
Filippa Lentzos: Well, it doesn't have to be an infectious pathogen. What we're seeing today with COVID, for instance, is an infectious pathogen that spreads uncontrollably throughout the world. But states don't have to use that kind of pathogen. Not all dangerous pathogens are infectious in that way. Anthrax, for instance, doesn't spread from person to person through the air. So, there are different kinds of pathogens, and states and non-state actors will have different motivations for using biological weapons or biological agents.
One of those, which I mentioned earlier, is for instance if you feel that you are outmatched by another country's conventional weapons, you may want to start to develop asymmetric weapons. That would be an example where a state might want to explore developing biological weapons. But of course, we should probably mention that there is this thing called the Biological Weapons Convention, this international treaty, which completely prohibits this class of weaponry. Historically, there's really only been two major powers that have developed sophisticated biological weapons programs. That is the United States and the Soviet Union.
Today, there are no publicly available documents or any policy statements suggesting that anyone has an offensive biological weapons program. There are many countries who have defensive programs and that's entirely legitimate. There is no indication that there are states that have offensive programs to date. I think the real concern is about capacities that are building up through biodefense programs, but also through regular bio-preparedness programs, and that's something that's just going to increase in future.
Lucas Perry: I'm curious here if you could also explain and expand upon the particular strands of your research efforts in this space.
Filippa Lentzos: Sure. I mean, it's very much related to the sorts of things we've been talking about. One strand that I focus on relates to transparency, confidence building, and compliance assessment of biodefense programs, where I look at how we can build trust between different countries with biodefense programs to trust that they are complying with the Biological Weapons Convention. I'm also looking at transparency around particular high-risk bioscience, so things or projects or research involving genome editing for example, or potentially pandemic pathogens like influenza or coronaviruses.
Another strand that I'm interested in, or that I'm looking at, focuses on emerging technologies and on governance around these emerging technologies and responsible innovation. And there I look particularly at synthetic biology, also a little bit at artificial intelligence, deep learning and robotics: how these other emerging areas are coming into the life sciences and affecting their development and the direction they're taking, the capacities that are emerging from this kind of convergence between emerging technologies, and how we can govern that better, how we can provide better oversight.
Now, one of the projects that I've been involved in that has got a lot of press recently is a study that I carried out with Greg Koblentz at George Mason University where we mapped high biocontainment laboratories globally. I mentioned earlier that countries around the world are investing in these kinds of labs to study lethal viruses and to prepare against unknown pathogens. Well, that construction boom has to date resulted in dozens of these commonly called BSL-4 labs around the world. Now, significantly more countries are expected to build these kinds of labs in the wake of COVID-19 as part of a renewed emphasis on pandemic preparedness and response.
In addition, gain-of-function research with coronaviruses and other zoonotic pathogens with pandemic potential is also likely to increase as scientists seek to better understand these viruses and to assess the sorts of risks that they pose of jumping from animals to humans or becoming transmissible between humans. Now, of course, clinical work and scientific studies on pathogens are really important for public health and for disease prevention, but some of these activities pose really significant risks. Surges in the number of labs and expansion in the high-risk research that's carried out within them exacerbate safety and security risks.
But there is no authoritative international resource tracking the number of these kinds of labs as they're being built. So, there is no international body that has an authoritative figure on the number of BSL-4 labs that exist in the world or that have been established. Equally, there is no real international oversight of the sort of research that's going on in these labs or the sorts of biosafety and biosecurity measures that they have implemented. So, what our study did was to provide a detailed interactive map of BSL-4 labs worldwide that contains basic information on when they were established and the size of the labs, and some indicators of biorisk management oversight.
That map is publicly available online at globalbiolabs.org. You can go and see for yourself. It's basically a very large Google map where the labs are indicated, and you can scroll over the labs and up pops information about when each was established, how big it is, and what sorts of biorisk management indicators there are: are they members of national biosafety associations? Do they have regulations related to biosafety? Do they have codes of conduct? Et cetera, those kinds of things. That all comes up there, so you can go and see for yourself. That's a resource that we've made publicly available on the basis of our project.
Looking at the data we then collated, this was really the first time this kind of concerted effort was made to identify these various labs and bring all that information together. And some of our key findings from looking at that data were that... Well, the first thing is BSL-4 labs are booming. We can see a really quite steep increase in the number of labs that have been built over the last few years. We found that there are many more public health labs than there are biodefense labs. So, about 60% of the labs are public health labs, not focused on defense, but resourced out of health budgets.
We also found that there are many more smaller labs than larger labs. In newspapers and on TV, we keep seeing photos of the Wuhan Institute of Virology's BSL-4 lab, but most BSL-4 labs are much smaller than that.
In terms of oversight, some of our other findings were that sound biosafety and biosecurity practices do exist, but they're not widely adopted. There's a lot of difference between the kinds of biosafety and biosecurity measures that labs adopt and implement. We also found that assessments to identify life science research that could harm health, safety, or security are lacking in the vast majority of countries that have these BSL-4 labs. So, as I said, that's one of the studies that's got a lot of press recently, and part of that is because of its relationship to the current pandemic and the lack of solid information, solid data, on the sorts of labs that are out there and on the sorts of research that's being done.
Lucas Perry: Do you have a favorite story of a particular time that a BSL lab failed to contain some important pathogen?
Filippa Lentzos: Well, there are all kinds of examples of accidental releases. In the UK, for instance, where I'm based, a very long time ago, work with variola virus, which causes smallpox, was carried out in a sort of high-rise building that had multiple floors, and the variola virus escaped into the floor above and infected somebody there. That was, I think, at the end of the '70s. That was the very last time that someone was infected by smallpox in the UK. More recently in the UK, there's also been the escape of the foot and mouth virus from a lab.
Now, this was not the very large foot and mouth outbreak that we had in the early 2000s, which you know killed millions of animals. I still remember the piles of animal corpses dotted around the country and you could still smell the burning carcasses on the motorway as you drove past, et cetera. That was not caused by a lab leak, but just two, three, four years later, there was a foot and mouth disease virus that escaped from a lab through a leaking pipe that did go on to cause some infections. But by that stage, everyone was very primed to look out for these kinds of infections and to respond to them quickly.
So, that outbreak was contained fairly rapidly. I mean, there are also many examples elsewhere, also in the United States. There's the one example where vials of variola virus were found in a disused storage closet at the NIH after many years, and the virus was still viable. I think that's one of the ones that ranks pretty highly in the biosafety community's memory, and maybe even in your own. It was not that long ago, half a dozen years ago or so.
Lucas Perry: What do you think all these examples illustrate of how humans should deal with natural and synthetic pathogens?
Filippa Lentzos: Well, I think they illustrate that we need better oversight, we need better governance, to ensure that life science research is done safely, securely, and responsibly.
Lucas Perry: Overviewing all these BSL safety labs and all these different research threads that you're exploring, what do you think is the most pressing issue in biosecurity right now, something that you'd really like the government or the public to be aware of and take action on?
Filippa Lentzos: Well, I think there's a really pressing need to shore up international norms and treaties that prohibit biological weapons. I mentioned the Biological Weapons Convention. That is the key international instrument for prohibiting biological weapons, but there are also others. The arms control community is not in great shape at the moment. It needs more high-profile political attention, it needs more resources. And with more and more breaches that we're seeing, not on the biological side, but on other sides, breaches of international treaties, I think we need to make sure there is a renewed effort and commitment to these treaties.
So, I think that's one thing, one issue, that's really pressing in biosecurity right now. Another is raising awareness and increasing sensitivities in scientific communities to potentially accidental or inadvertent or deliberate risks of the life sciences. And we see that very clearly in the data that's coming out of the BSL-4 study that I talked to you about: that's something that's needed. What we looked at there was whether labs have any laws on the books, any guidance on paper, any written codes of conduct or codes of practice. That's really important.
It's really important to have these kinds of instruments in place, but it's equally important to make sure that these are implemented and adopted and that there is this culture of safe, secure, and responsible science. That's something that we didn't cover in that specific project, but it's something that some of my other work has drawn attention to and the work of many others as well. So, we do need to have this regulatory oversight governance framework in place, but we also need to make sure that that is reflected or echoed in the culture of the scientists and the labs that are carrying out life science research.
Lucas Perry: One other significant thing going on in the life sciences in terms of biological risk is gain-of-function research. So, I'm curious if you could explain what gain-of-function research is and how you see the debate around the benefits and risks of it.
Filippa Lentzos: Well, gain-of-function research is a very good example of life science research that could be accidentally, inadvertently or deliberately misused. Gain-of-function means different things to different people. To virologists, it generally just means genetic manipulation that results in some sort of gained function. Most of the time, these manipulations result in loss of function, but sometimes different kinds of functions of pathogens can be gained. Gain-of-function has got a lot of media coverage in relation to the discussion around the origins of the pandemic or of COVID.
And here, gain-of-function is generally taken to mean deliberately making a very dangerous pathogen like influenza or coronavirus even more dangerous. So, what you're trying to do is you're trying to make it spread more easily, for example, or you're trying to change its lethality. I don't think gain-of-function research in and of itself should be banned, but I do think we need better national and international oversight of gain-of-function experiments. I do think that a wider group of stakeholders, beyond just the scientists doing the research themselves and their funders, should be involved in assessing what is safe, what is secure, and what is responsible gain-of-function research.
Lucas Perry: It seems very significant, especially with all these examples that you've illustrated of the fallibility of BSL labs. The gain-of-function research seems incredibly risky relative to the potential payoffs.
Filippa Lentzos: Yeah, I think that's right. I mean, I think it is considered one of the examples of what has been called dual use research of concern or experiments that have a higher potential to be misused. By that, I mean deliberately, but also in terms of inadvertently or even accidentally because the repercussions, the consequences have the potential to be so large. That's also why we saw when some of the early gain-of-function experiments gained media attention back in 2011, 2012, that the scientific community itself reacted and said, "Well, we need to have a moratorium.
We need to have a pause on this kind of research to think about how we govern that, how we provide sufficient oversight over the sorts of work that's being done, so that the risk-benefit assessments are better, essentially." I think there will be many who argue, myself among them, that the discussions that were had around gain-of-function at that time were not extensive enough, they were not inclusive enough, there were not enough voices being heard or part of the decision-making process in terms of the policies that came out of this in the United States. To some extent, I think that's why we're, again, back at the table now with the discussions around the pandemic origins.
Lucas Perry: Do you have any particular examples of gain-of-function research you'd be interested in sharing? It seemed like a really significant example was what was happening in Wisconsin.
Filippa Lentzos: Sure. That was the work in Wisconsin and at the Erasmus University in the Netherlands. What they were trying to do there was they were working with influenza, or avian flu, and they were seeing if they were able to give that virus a new function, so enable it to spread not just among birds, but also from birds to mammals, including humans, including ourselves. So, they were actively trying to make it not just affect birds, but also to affect humans.
And they did so successfully, which made that virus more dangerous, and that was what the media furor was about. In the discussions at the time, many felt that the benefits of that research did not outweigh the very significant risks that it involved.
Lucas Perry: What are the benefits of that sort of gain-of-function research?
Filippa Lentzos: Well, those who carried out that sort of research at the time, but also those behind the sorts of gain-of-function research that's been going on at the Wuhan Institute of Virology, some of which has been funded by American money and some of which has been done in collaboration with American institutes, argue that in order to prepare for pandemics, we need to know what kinds of viruses are going to hit us. New and emerging viruses generally spill over from the animal kingdom into humans, so they actively go and look for viruses in the animal kingdom.
In this case, in the coronavirus case, the Wuhan Institute of Virology, they were actively looking in bat populations to see what sort of viruses exist there and what their potentials are for spilling over into humans. That's their justification for doing that. My own view is that that's incredibly risky research and I'm not sure and I don't feel that that sort of justification really outweighs the very significant risks that it involves. How can you possibly hit upon the right virus in the thousands and thousands of viruses that are out there and know how that will then mutate and get modified as it hits the human population?
Lucas Perry: These are really significant and quite serious viruses. You explained an example earlier about this UK case where the final person to die from smallpox was actually infected through a BSL lab leak. There's also this research in Wisconsin on avian flu. So, could you provide a little bit of perspective on, for example, the infection rate and case fatality rate of these kinds of viruses that they're working on at BSL labs, that they have at BSL labs, that they might be pursuing gain-of-function research on?
Filippa Lentzos: Yeah. I mean, certainly in terms of the coronavirus, clearly many people have died and many people have been infected, but it's not considered a particularly infectious or particularly lethal pathogen when it comes to pandemics. Much more dangerous pathogens that could create pandemics are being worked with in laboratories.
Lucas Perry: Yeah. Because for some of these diseases, it seems, the case fatality rate gets up to between 10 and 30%, right? So, if you're doing gain-of-function research on something that's already that lethal, that has killed hundreds of millions of people over the course of human history, and that is so infectious and spreadable, then given the history of lab leaks, it seems like one of the riskiest things humanity is currently doing on the planet.
Filippa Lentzos: Yes. I mean, one of the things gain-of-function research is doing is looking at lethality and how to increase the lethality of pathogens. It's also doing other things, but that framing takes out a large part of the equation, which is the social context of how viruses spread and mutate. There are, for instance, things we can do to make viruses spread less and be less lethal. There are active measures we can take; equally, there are responses that could increase the effect of viruses and how they spread.
So, lethality is one aspect of a potential pandemic, but it is only one aspect, right? There are many other aspects too. We need to think of ourselves much more as active players; we also have a role to play in how these viruses spread and mutate.
Lucas Perry: One thing the digital revolution has brought is the birth and growth of big data. Big data can be used to detect the beginning of outbreaks, to detect novel diseases, and to come up with cures and treatments for novel and existing diseases. So, I'm curious what your perspective is on the benefits and risks of big data in biology and the life sciences, both for health and societies as well as for privacy and the like.
Filippa Lentzos: Well, you pointed to many of the benefits that big data has. There certainly are benefits, but as with most things, there are also a number of downsides. I do believe that big data, combined with the advances we're seeing in genomic technologies and in other areas of emerging technology such as machine learning or AI, poses a significant threat. It will allow an ever more refined record of our biometrics: our fingerprints, our iris scans, our faces, even CCTV footage that can identify individuals by how they walk, all these kinds of biometrics.
It will also allow a more refined record of our emotions and behaviors to be captured and analyzed. You will have heard of companies that are now using facial recognition on their employees to see what kind of mood they're in and how they engage with clients, et cetera. So, governments are gaining incredible powers here, but increasingly it's private companies that are gaining this sort of power. What I mean is that governments, and increasingly private companies, will be able to sort, categorize, trade, and use biological data far more precisely than they have ever been able to before.
That will create unprecedented possibilities for social and biological control, particularly through individual surveillance, if you like. These game-changing developments will deeply impact how we view health, how we treat disease, how long we live, and, more generally, how we consider our place on the biological continuum. I think they'll also radically transform the nature of biological research, of medicine, of healthcare. In terms of my own field of biosecurity, they will create the possibility of novel biological weapons that target particular groups of people and even individuals.
Now, I don't mean they will target Americans, or Brits, or Protestants, or Jews, or Muslims. That's not how biology works. Genes don't understand the social categories we put onto people. That's how we socially divide people up, but it's not how genetics divides people up. There are, however, genetic groupings that cut across cultures, nations, beliefs, et cetera. So, as we come to have more and more precise biological data on these different groups, the possibility of targeting them for harm will also be realized.
So, in the coming decade, managing the fast and broad technological advances now underway will require new kinds of governance structures, and these new structures need to draw on individuals and groups with cross-sectoral experience, from business, academia, politics, defense, intelligence, and so on, to identify emerging security risks and make recommendations for dealing with them. We need new kinds of governance structures, new kinds of advisory bodies, with different kinds of stakeholders than the ones we have traditionally had.
Lucas Perry: In terms of big data and the international community, with the continued risks of natural pandemics as well as synthetic pandemics or other kinds of biological agents and warfare, it's been proposed, for example, to create something like a bioweather map: a widespread, globally distributed early-warning detection system for biological agents that is based on big data or is itself big data. So, I'm curious whether you have any thoughts on the importance of big data in particular for defending against the modern risks of engineered and natural pandemics.
Filippa Lentzos: Yes, I do think there is a role to play here for big-data analysis tools. We are, I think, already using some tools in this area, where you have, for instance, analysis of social media, of words that pop up in social media posts, or analysis of the sorts of products that people are buying at pharmacies. So, if some kind of disease is spreading, people are getting sick and they're talking about different kinds of symptoms, you are able to start tracking that, you're able to start mapping that.
If all of a sudden all kinds of people in, say, Nebraska are going to the pharmacy to buy cough medicine or something to reduce a temperature, and there's a big spike, you might want to look into that more. That's an indicator, a signal that you might want to look at more closely. Or if you're picking up keywords in internet searches or on social media where people are asking about stomach cramps or more specific kinds of symptoms, that again is another kind of signal you might want to look into.
So, I think some of these tools are definitely already being developed, and some are already in use. I think they will have advantages and benefits in preparing for natural, but also inadvertent, accidental, or deliberate outbreaks of disease.
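The pharmacy-purchase example Dr. Lentzos describes can be sketched as a simple anomaly detector over daily sales counts. This is a hypothetical illustration of the general idea, not the implementation of any actual surveillance system; the function name, window size, and threshold are all illustrative assumptions.

```python
from statistics import mean, stdev

def detect_spikes(daily_counts, window=7, threshold=3.0):
    """Flag days whose count exceeds the trailing-window mean
    by more than `threshold` standard deviations (a z-score test)."""
    spikes = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            sigma = 1e-9  # avoid division by zero on a perfectly flat baseline
        if (daily_counts[i] - mu) / sigma > threshold:
            spikes.append(i)
    return spikes

# Hypothetical daily cough-medicine sales for one county; day 10 spikes.
sales = [20, 22, 19, 21, 20, 23, 21, 22, 20, 21, 85, 21]
print(detect_spikes(sales))  # prints [10]
```

Real syndromic-surveillance systems layer far more on top of this, such as seasonality adjustment and spatial clustering, but the core signal is the same: a count that departs sharply from its recent baseline.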
Lucas Perry: We're hopefully in the final stages of the COVID-19 pandemic. When we reflect back upon it, it seems it can almost be understood as a minimally viable global catastrophe, or a minimally viable pandemic. There have been far worse pandemics in the past, and while it has tragically taken the lives of many, many people, its fatality rate is only a bit higher than the flu's and a lot lower than many of the other pandemics humanity has seen over the past few hundred thousand years.
So, I'm curious what your perspective is on what we can learn in the areas of scientific, social, political, and global life, from our experience with the COVID-19 pandemic to be better prepared for something that's more serious in the future, something that's more infectious, and has a higher case fatality rate.
Filippa Lentzos: Well, I think, as you said, in the past, disease was much more present in our societies. It's really with the rise of antibiotics and of modern healthcare that we've been able to suppress disease to the extent that it's no longer such a pressing feature of our daily lives. I think what the pandemic has done for a whole generation is serve as a shot across the bow; it has really crystallized the incredibly damaging effects that disease can have on society.
It's been this wake-up call, this reality check. I think we've seen that reflected politically as well. International developments like the UN's Biorisk Working Group established by the Secretary-General, or efforts by states to develop a new international treaty on pandemics, are concrete evidence of increasing awareness of the challenges that diseases pose to humankind. But clearly, that's not enough; what we've had in place hasn't been enough. We need to be better prepared. And I guess for me, that's one of the bigger takeaways from the pandemic.
Equally, what the pandemic-origin debate has shown is that, whether or not the pandemic resulted from a lab leak, it could have; it could, ironically or tragically, have been the result of scientific research actually aimed at preventing future pandemics. So, clearly for me, a huge takeaway is that we need better oversight, we need better governance structures to ensure safe, secure, and responsible life-science research. Potentially, we also need to rethink some of our preparedness strategies.
Maybe actively hunting for viruses in the wild and mutating them in the lab to see if a particular virus might be the one that hits us next, the one that spills over, isn't the best strategy for preparing for future pandemics. But COVID has also highlighted a more general problem, one I think is faced by all governments: how can we successfully predict and prepare for the wide range of threats to citizens and to national security? Some threats, like COVID-19, are actually largely anticipated, but they're not adequately planned for, as we've seen.
Other threats are not anticipated at all and, for the most part, are not planned for. On the other side, some threats are planned for but fail to materialize as predicted because of errors and biases in the analytic process. We know that governments have long tried to forecast, or to employ a set of futures approaches, to ensure they are ready for the next crisis. In practice, these are often general, ad hoc, unreliable, methodologically and intellectually weak, and lacking in academic insight. The result is that governments are wary of building on the recommendations of much of this futures work.
They avoid it in policy planning, in real-terms funding, and ultimately in practice and institutionalization. What I and many of my colleagues believe is that we need a new vision of strategic awareness, one that goes beyond simply providing a long-term appreciation of the range of possibilities the future might hold, to one that includes communication with governments about their receptivity to intelligence, how they understand intelligence, how they absorb other kinds of intelligence from private corporations, from academia, et cetera, as well as the manner in which governments act as a result.
So, strategic awareness, to my mind and to that of many others, should be conceptualized in three ways. First, look more seriously and closely at threats. Second, invest in prevention and foresighted action. Third, prepare for mitigation, crisis management, and bounce-back in case a threat can't be fully prevented or deterred. This kind of thinking about strategic awareness will require a paradigm shift in how government practices strategic awareness today, and my view is that the academic community must play an integral part in that.
Lucas Perry: Do you have any particular governance solutions that you're really excited about right now?
Filippa Lentzos: I don't think there's a magic bullet, one magic solution to ensuring that life-science research is safe, secure, and carried out responsibly. In terms of governance, we need to work both from the top down and from the bottom up. We need to have in place national statutory laws and regulations, we need to have in place institutional guidance and best practices. But we also need a lot of commitment and awareness coming from the bottom up.
So, we need individual scientists, and groups of scientists, to think about how their work can best be carried out safely: they can draw up codes of ethics or codes of practice themselves, they can educate others, they can think through who needs to be involved, beyond their own expert community, in risk-assessing the kinds of research they're interested in carrying out. So, we need both this top-down, government- and institutionally-enforced governance as well as grassroots governance. Only by having both of these kinds of governance measures can we really start to address the potential downsides of life-science research.
Lucas Perry: All right. Just to wrap things up, I'm curious whether you have any final words or thoughts for the audience, anything you feel is a crucial takeaway on this issue. I generally find it really difficult to convey the significance, gravitas, and importance of this. So, is there a central key takeaway you'd like listeners to have?
Filippa Lentzos: I think when we're looking at our current century, this will be the century not of chemistry or physics or engineering; that was the last century. This will be the century of biology, of digital information, and of AI.
I think this combination, which we talked about earlier, of biological data with machine learning, with AI, with genomic technologies, yields the potential for incredibly precise information about individuals. I think that is something we are going to struggle with in the years to come, and we need to make sure we're aware of what is happening: that when we buy a phone and use the facial recognition software, which is brilliant, it can also have downsides; and that all these little individual actions, all these technologies we readily accept because they do have upsides in our lives, can also have potential downsides.
I do think we need to make sure we also develop this critical sense, this ability to think critically about what these technologies are doing to us as individuals and as societies. I guess those are the things I would like people to take away from our discussion.
Lucas Perry: All right. Well, thank you so much for coming on the podcast. I really can't think of too many other issues that are as important as this. It's certainly top three for me. Thank you very much for all of your work on this, Dr. Lentzos, and for all of your time here on the podcast.
Filippa Lentzos: Thanks for having me, Lucas.