Host [00:00:58] Welcome back to the Energy Impact Podcast. Today, we're here with Bryan Walsh, an editorial director at Vox Media, where he oversees their tech, climate, and world teams and also serves as editor of Vox's Future Perfect section. Bryan, it's great to have you with us.
Bryan Walsh [00:01:12] It's great to be here.
Host [00:01:14] Awesome. Well, let's start with some very first-person questions. Tell us a little bit about where you grew up, where you went to school, and in a short version, how you ended up in your current role at Vox?
Bryan Walsh [00:01:25] Sure. So, I grew up in suburban Philadelphia, about an hour outside the city in a town called Doylestown. I played a lot of basketball when I was growing up. I liked writing and reading. I didn't know if I was going to be a journalist; I think I had aspirations to be a novelist.
Bryan Walsh [00:01:44] I went to college at Princeton, just over the Delaware River, more or less, in New Jersey. And that's where I began to do journalism a little more seriously, studying English literature. But I spent a lot of my time writing for papers at school. I did some internships in college, including one at Maxim magazine, which was a popular men's magazine at the time. And then when I graduated, I was able to get an internship with Time magazine's Asian edition out in Hong Kong.
Bryan Walsh [00:02:17] I don't think they do this anymore, but back then, Time had these regional editions they would do in Europe, in Australia, actually, and in Asia as well, where they would kind of have a mix of content that they would pull from the US edition, but also original stories that would only appear within that region. And that was an amazing way into journalism. I thought I'd stay there for like three months or six months; I ended up staying for five years. And that's also where I began to focus on the areas that I was most interested in.
Bryan Walsh [00:02:45] So, I did a lot on global health, actually. If you remember, SARS 1.0, that was in Hong Kong, and I got a lot of firsthand experience writing about that. I also began to write a lot about energy and environmental issues, first through the lens of the simple, conventional air pollution problem that was actually becoming quite serious in Hong Kong, chiefly due to just the massive amount of development happening across the border in mainland China. But then, increasingly writing about climate issues, about energy issues.
Bryan Walsh [00:03:17] After five years there, I spent one year in Tokyo where I was the Tokyo Bureau chief at Time, and then actually moved to New York. This was about 2007... Still at Time, where I started to focus really just on climate change. I was the climate change correspondent.
Bryan Walsh [00:03:31] And so for the next five or so years, I spent a lot of time traveling to pretty amazing places. I went to the Arctic, I went to the Amazon, Madagascar. I went to a bunch of UN climate change conferences in places ranging from like Bali to Cancun. They weren't all in, like, international resorts; it just happened that the ones I went to were. And really getting a deep sense of the importance of this issue, the complexity of it, the way it connected to questions around global development, the sort of balancing between the energy we need to develop, really, and also the need to reduce emissions to deal with the really serious effects of climate change.
Bryan Walsh [00:04:11] I did that for a long time; switched over to the editing side. So, at that point I was editing international news. I ended up, after a few years, as the international editor at Time. So, handling all the correspondents all around the world.
Bryan Walsh [00:04:23] Back then, I think a lot of the focus was on some of the things that are still actually quite relevant now. Like, I remember Russia's invasion of Crimea, which turned out to be an early indication of where things would go there. A lot about rising tensions between the US and China, also something we see a lot of now. 2015, the big, original sort of migrant crisis coming out of Syria. Again, something we still see affecting global politics now.
Bryan Walsh [00:04:49] In 2016, I took a buyout there after about 15 years and worked on a book that was published in 2019 called End Times: A Brief Guide to the End of the World. That was all about existential risk. So, basically I broke that book into chapters about all the different ways, both natural and man-made, that the world could conceivably end, ranging from a big asteroid like the one that wiped out the dinosaurs to the supervolcano that actually happens to be sitting under Yellowstone, to, really importantly, man-made technological risks like climate change, like AI, which we're hearing a lot more about these days. Engineered pandemics... Regular pandemics, which again, turned out to be something that came true real fast.
Bryan Walsh [00:05:34] And then, I think it was back around 2020, I moved to Axios, at that point still a new-ish digital media startup, to write their Future newsletter. That was all about things we're going to have in the future, like the changing nature of labor under automation and AI, or emerging technologies like synthetic biology or biotech. Big global trends like where population was going. Stuff like that. I really enjoyed that for two years.
Bryan Walsh [00:06:00] And then, I moved to Vox where, as you noted, I edit the Future Perfect section, which is this sort of really interesting part of Vox that honestly tries to look at the ways we can make the future more perfect. It's kind of right there in the name. And that's looking at big global problems that tend to be undercovered, like, potentially, the end of the world and AI risk, but also global poverty, global health, and all the ways we can actually solve them meaningfully. And then more recently, I've taken on a managerial role around a lot of international coverage, a lot of our climate, environment, and energy coverage as well. And in between, I still find time to write occasionally for Vox as well.
Host [00:06:42] And so, would you characterize emerging threats or megatrends... Is that ultimately the connective tissue between all of those topics? Those are all very eclectic and they are similar in certain respects, but when you explain this to people, what's sort of the thing that brings all of these interests together in your mind?
Bryan Walsh [00:06:59] Well, as you know, it is pretty eclectic, so when I have to explain it to people, it usually takes about that long. But you're absolutely right; I think that's exactly a great way to look at it. In fact, when I was going to write my book, I realized that a lot of what I had done in the previous years in my journalism all kind of added up to that. Thinking about climate change, thinking about the threat of pandemics, both natural and man-made, thinking about new technologies. So, that is sort of the through line as much as there is one in terms of my coverage. And that's one of the reasons I'm so happy to be at Vox, because I'm able to work with other writers, with a company that really takes this seriously, that thinks we should devote a whole section to it.
Bryan Walsh [00:07:39] I feel like people understand this in a better way than they did a few years ago. I think Covid was a big shift for that. Suddenly, we saw how one small virus had these global impacts well beyond just the disease, the impact it's had on politics, how we live, how we parent. Take your pick.
Bryan Walsh [00:07:55] And then looking forward, just over the last year or so with the really rapid advances in artificial intelligence, suddenly things that I was writing about in 2019, when it was all still theoretical, were becoming very real to people. When they were interacting with these chatbots like ChatGPT, it did suddenly become possible to think like, "Well, where is this going? How will this change us? Can we maintain control of it?"
Bryan Walsh [00:08:19] And in some ways, that hearkens back all the way to things like nuclear weapons. In the same way that back in 1945, we were suddenly on the cusp of this powerful and scary new technology that, for the first time, really gave human beings the ability to destroy themselves. Now we have many more tools that can do that. But it's not just about that. It's also about what can we get out of this? How can we transform the world in a better way? And the trick and the key to a lot of this coverage, the core of it really is that dual nature of it. How do we get the most out of it while keeping ourselves safe? And is there any guarantee we can actually do that?
Host [00:08:57] And also, the interplay between some of these particular variables. So, one thing we think about over here as folks who think a lot about nuclear power is, yes, emerging technologies like AI are obviously incredibly significant, potentially destabilizing in their own right, but they have ripple effects in other domains as well. One being that, whether they turn out to be good or bad, all these emerging technology sectors are, for instance, super energy intensive.
Host [00:09:22] And so, questions about how to manage AI responsibly are also questions about sustainability, how do we power the internet, how do we do so in a way that is equitable, that's efficient, that doesn't irreparably harm the environment, and so forth?
Host [00:09:34] I'd be curious to hear you just comment on that. You have all of these emerging megatrends that, in their own right, are really, really consequential for the future of humanity, but are also increasingly interconnected with one another.
Bryan Walsh [00:09:48] That's a really good point. And I think energy is at the core of so many of these things. One of the biggest stories, I think... It's beginning to be covered, but it's really going to start hitting people soon, I think, is the fact that... The US has had, on a per capita basis, pretty stable electricity demand for some time. You know, we get more efficient, we use some of that, but we've been able to count on that. And that's really important to things like where we expect climate models to go.
Bryan Walsh [00:10:16] That is already on a path to change; it's likely to change even more. We are going to need more electricity. And some of the technologies that you mentioned are going to be the ones driving it. AI needs a lot of computing power; that computing power eats up a lot of electricity. And suddenly, we're in a state where we can't just be stable.
Bryan Walsh [00:10:36] And as you noted, we also have this demand and need that is equally essential to make that energy more sustainable, to make it more carbon-free. That's not a situation we really faced before.
Bryan Walsh [00:10:48] And on top of that, we also have a situation where, I think in the United States, it's quite hard to build things. Nuclear plants probably being at the top of that list. But the reality is, even if you're trying to build the transmission lines that you need to connect to the clean energy we're building, that's very hard to do. To build large-scale renewable sources like, say, utility-scale solar, that is hard to do. Our regulatory system is just not built to do this quickly.
Bryan Walsh [00:11:18] And one of the really important paradoxes of energy and climate change that I want people to understand is that, simultaneously, climate change is a result of all the things we've built. It's a result of the industrial civilization that we've created, which is a really good thing by the way. It comes, obviously, with some toxic side effects. Pollution, obviously climate change. But it is what has enabled us to not be poor anymore. And we look around the world in poor and developing countries, a lot of what keeps them that way is a lack of access to electricity. So, you have that.
Bryan Walsh [00:11:50] But at the same time, you need to reduce those emissions for climate change. We have to build really fast. And the problem is, we can put a lot of money into programs like the Inflation Reduction Act. Huge step forward, but that won't materialize unless we can change regulations in a way that makes it much, much easier to actually build. And right now, that's the really lagging part, I feel like.
Bryan Walsh [00:12:16] It's not the money, it's not the identification of the problem. It's the fact that there are so many small obstacles that have to be overcome along the way, and we're doing this all on a clock. And that clock's twofold. One clock is the need to reduce carbon emissions to hit goals around climate change. The other is, suddenly you have this new source in AI; it's going to demand more electricity too. That's a really serious tension we have to deal with.
Host [00:12:41] Curious what you think the interplay is here, potentially, between policymakers and industry. AI, to me, is a great example where policymakers are, at this point, acutely aware of all the things that could go wrong, but are unclear in many countries as to what they can actually do to mitigate those potential harms. The policymakers are aware that these problems are potentially out there, but industry is going forward and innovating, building, investing and so forth.
Host [00:13:10] When you think about issues like, yes, climate change, but also those broader issues like energy security and energy abundance, what is the role that the government can play that is most productive versus what we should expect or hope industry to do on its own?
Bryan Walsh [00:13:30] I think what government ideally can do is an element of prioritization. I listed a whole lot of challenges we face as a country, as a species, really. You can't solve all of them at the same time. And so, you have to sort of think, ideally with leadership, to be like, "Well, what do we need to get to first?"
Bryan Walsh [00:13:50] If we treat climate change as an existential problem, which I think it is, then why aren't we treating the solutions as if the stakes were equally high? If that's the case, well, we should be reversing this. It should be very hard to stop a lot of these projects, not the other way around.
Bryan Walsh [00:14:07] Obviously, there's a huge role here for government research, government programs, incentives, all those policy levers that can be pulled. But what we really have to do... And that's beginning to happen. I think slowly, but it is beginning to happen... Is a recognition that a lot of the laws we have on the books that date back 50 or 60 years are built for a world that doesn't exist anymore, or a world where the problems we face now didn't exist.
Bryan Walsh [00:14:35] The reason why we made it maybe hard to permit certain things or build certain things is because we had a real concern about overdeveloping. We had a concern about the immediate effects of conventional pollution or disruption, what have you, destruction of the natural world. Those are still live issues, but now we suddenly have this other huge one. And if we think it's something that could end civilization or certainly really, really negatively affect civilization, then let's treat it that seriously, which means having to rethink some of those things. That's, I think, what government can do.
Bryan Walsh [00:15:06] And with industry, I think, they can push for that as well. I think they have to recognize how to balance the need for this with the concerns that affected groups on the ground will have.
Bryan Walsh [00:15:19] One of the big challenges with climate change is that carbon emissions are global. It almost doesn't matter where they happen; they affect everyone. But when you're building something, whether it's a transmission line, whether it's a nuclear plant, whether it's big utility-scale solar, there's always going to be specific individuals in specific places that are most affected by that. So, how do we balance that out?
Bryan Walsh [00:15:41] That's always going to be hard. There probably is no one "everyone wins" situation. That's where, ideally, government should come in and say... Look, I mean, in the same way you might do a wartime scenario, we have to make hard choices. Not something politics has been great at, quite honestly. But that is what we need, I think, first and foremost.
Host [00:16:03] Let's shift gears and talk about your book for a moment, End Times. What originally motivated you to write the book? What, in your view, is the thesis of the book? And are there particular things you learned in the process of writing that book that, at the time, you actually found surprising or hadn't thought about or grappled with before?
Bryan Walsh [00:16:27] I think the biggest motivation to write the book was to write a book, honestly. And I give this comparison to a lot of people... Writing a book is a little bit like running a marathon. You're not really necessarily trying to win, you're trying to demonstrate to yourself you can do it and you can finish. But the particular subject matter there, I settled on pretty quickly because I was looking at, as I said before, what have I done over the years that I can sort of weave in a lot of past reporting, a lot of past background knowledge.
Bryan Walsh [00:16:55] And it turned out around the same time, you were seeing this sort of emergence of almost like a new academic concentration, which is really existential risk studies. It didn't really exist 20 years ago. People thought about the end of the world and things like the risk of nuclear weapons, obviously, a few other things. But suddenly, really about 20 years ago, in part, I think, prompted by early stirrings around AI, you saw places like Oxford University and Cambridge University, MIT, and elsewhere, these centers popping up that treated this subject as something to be studied in itself.
Bryan Walsh [00:17:31] And I got wind of that; that became the frame for the book, and that just really engaged me. Because to my mind, this is an incredibly important thing. Like, can you argue that there's anything more important than preventing the end of the world if you can?
Bryan Walsh [00:17:45] I think the thesis of it is that. The thesis is, first and foremost, that we do not pay enough attention to these risks. The amount of money, the amount of policy that gets focused on what I sometimes call "low likelihood but high consequence" risks... Things that aren't likely to happen, but if they do happen, they're really, really, really bad.
Bryan Walsh [00:18:06] Our political system is not great at dealing with this. As human beings, we're not great at dealing with this. We tend to focus on risks that are right in front of us, that are present. We have a bias towards these. By their nature, if something could possibly threaten the world, it hasn't happened before, or at least not happened in our history. So, there has to be sort of a magic leap to get you there.
Bryan Walsh [00:18:26] And so, the thesis, first and foremost, is we are in a new era of existential risk. You take the ones that already existed, like... Look, an asteroid could still come down, a supervolcano could blow up. Obviously, those nuclear weapons are still there. They could still be fired.
Bryan Walsh [00:18:41] And then, you had these new ones that are coming from new technological advances. What will AI do? Or, what will scientists be able to do in a lab around a virus that could potentially get out of control? Or, the fact that we are far more interconnected than ever before, which then, as we saw with Covid, raises the risk of what happens if a truly infectious and dangerous virus just emerges out of nature. And then, of course, climate change as well.
Bryan Walsh [00:19:06] And so, the thesis is, "Here they are. Pay attention to them." And not to just give up. Like, there's an easy sense to sort of feel like, "Well, what can we do?" A fatalism that goes into it. But, in part because we created all these risks, we do have the ability to manage them. And that means thinking about the future in a more serious way than we do now. So, I think that's the ultimate thesis of the book, really.
Bryan Walsh [00:19:28] And in terms of what I learned and maybe what changed my mind... People often ask me how I ranked these risks. And ultimately, I was going back and forth a lot during the book, but what it sort of came down to was this, which is that... The risk I worry about most right now...
Bryan Walsh [00:19:45] First of all, one thing is I'm a lot less worried about what I would call natural risks. When I was a kid in the '90s, there were two or more movies about asteroids hitting the earth and destroying everybody. I'm not sure what was going on at the time, but it was really in the zeitgeist. I don't worry about that as much, in part because we've actually taken steps to prevent that. NASA has really mapped out most of the potential asteroid risks out there, and even more recently, we've done work on how we would move one out of the way if we had to. So, I worry less about those.
Bryan Walsh [00:20:16] But right now, what I worry most about is nuclear war. I mean, it's not new. In fact, it's actually much more dangerous than it used to be, for reasons I think a lot of people understand. I mean, the fact of what happened with the Ukraine invasion, some of the sort of loose talk around that. The fact that arms control treaties are falling apart. I still worry about that. That could still happen today.
Bryan Walsh [00:20:38] When I think about what I worry about most in the near future, and probably the most of all, actually, is what will happen with the ability to manipulate biology. Because we know that natural pandemics are very dangerous, but we also have a pretty good sense that they can't wipe us out. That there's something in nature that tends to balance things out with these viruses. They can be quite bad, but we have means of dealing with them.
Bryan Walsh [00:21:03] But if you add in the ability to engineer a virus... Maybe you're doing it for benign reasons, for research purposes and it gets out, or these tools get in the hands of someone who wants to do it for malign purposes, you could have something that is far, far, far more dangerous than anything we see in nature, and we know that's bad for human beings.
Bryan Walsh [00:21:20] And I worry about that more and more because there's not a lot of regulation around it. That technology improves by the year. It becomes easier for more and more people to do it. In the same way we saw with computer programming. It used to be that was something only someone with specialized skills was able to do. As that got easier and easier, a lot of great things happened.
Bryan Walsh [00:21:39] Another thing that happened during that was that it was a lot easier to write things like malware. And suddenly, we find ourselves in a world where, I think, tens or hundreds of billions of dollars are lost to e-commerce fraud, computer viruses, things like that. Imagine you can do that with human biology; that's pretty scary.
Bryan Walsh [00:21:56] The furthest off one that I still don't have a grip on is the AI question. That's probably the one that people are most familiar with right now, but it's also one that I weirdly feel a little less worried about than I used to, which is counterintuitive because of all the things we've seen in the years since that book came out in 2019. But for whatever reason, based on the people I talk to, I just feel like there are still a lot of serious technological barriers that would have to be crossed before it gets to that stage. Whereas with biology, and obviously with nuclear weapons, that's here right now.
Host [00:22:26] That stage being artificial general intelligence, for instance?
Bryan Walsh [00:22:30] Artificial general intelligence, yeah. Like, the ability to manipulate the real world. There are a lot of things that have to happen for that. Now, it could do a lot of bad things before that. I would really hope that no one hooks one of these up to, like, a nuclear weapons decision program. That would be bad, because it's not fully foolproof. I mean, neither are human beings, but we know how human beings work a lot better than we know how these models work.
Bryan Walsh [00:22:55] You get vastly different opinions when you talk to AI experts about how quickly this might happen or whether it could happen at all. I don't think we have, in ChatGPT or some of these other models, anything we could really call AGI. We can't even come up with a definition of that, actually. But I think you'd have to see something that is truly capable of doing things not just faster than human beings, but much, much better.
Bryan Walsh [00:23:19] Other than in specific categories, that is. Guess what? The best chess player in the world is definitely not a human being. The best player of a lot of games is not a human being. The best predictor of how proteins fold is not a human being anymore. But there's still a lot it can't do. And there are many barriers that are out there. There might be technological barriers in terms of can you feed this thing enough data? Can it get enough compute to get smarter?
Bryan Walsh [00:23:44] And the real thing that makes me feel less worried is the fact that you need some sort of physical interface. Robotics has advanced much, much, much more slowly than AI, which makes sense. When you look at the last 25 or more years, software, computers, smartphones, and AI have all gone... Zoomed up like "this." They've taken advantage of that Moore's Law ability of chips to get faster, cheaper, smaller. Physical things in the real world have not. Robots have not... We still struggle with self-driving cars. So, my feeling is that we'll keep struggling with that.
Bryan Walsh [00:24:20] That tells me this problem is probably harder to deal with. Which is not to say it's never going to happen or that we shouldn't prepare for it, or that there aren't all kinds of near-term risks we do have to worry about, but I worry about that less than I worry about someone engineering a virus badly in a lab, or obviously, quite honestly, Vladimir Putin has a bad day and suddenly, the worst happens.
Host [00:24:41] Yeah, all of these problems become immensely more scary when you just consider the effect of the internet and just the diffusion of human knowledge and know-how and so forth.
Bryan Walsh [00:24:51] Once again, it's a dual thing. We couldn't be doing this without the internet, obviously. All kinds of things we've been enabled to do in the modern world could not exist without the internet. At the same time, knowledge carries risks. And we've always known that. The knowledge of the ability...
Bryan Walsh [00:25:08] If we all watched the Oscars and saw Oppenheimer win Best Picture... Once that was created, once that secret was broken and the bomb was created, it was only a matter of time before it began to spread the way it did. It's very hard to hold onto these secrets, hold onto these discoveries. And that's one where we know what that would do. Whereas with these ones, we have this sort of "could be good, could be bad." I mean, a nuclear bomb is pretty much just that.
Bryan Walsh [00:25:37] But even in that case, it's interesting. In terms of predictions, like I said, "Well, I don't think AGI's likely to happen soon. Put that off." If you go back to the early or mid-'30s, there were a lot of very, very, very smart nuclear physicists who were like, "No, you will never get a fission chain reaction to be controllable. It's too hard. It's not going to happen. It's moonshine." Literally 10 years later, in part under the very intense pressures of a war and the ability of the United States to both bring together all the smartest scientists around this and the sheer industrial power the US had, suddenly in 1945, you have it. Maybe we'll look back and realize we were just as wrong about the speed of AI development as well.
Host [00:26:22] So, nuclear weapons are obviously one thing. What is your general outlook or intuition around civilian nuclear energy on the other hand? Are you particularly bullish, as a lot of people are, about nuclear energy becoming much more prominent, not just in the developed world, but even in the developing world? Or, is this something that you're much more bearish on and are doubtful that nuclear energy is poised to become what a lot of people think it will become?
Bryan Walsh [00:26:54] I'm bullish in the sense that it should be, I suppose. That's one way to put it. When you look at it, "All right, what I really want is more energy." Not everyone says that, but I take that view... Like, when I look both at what we will need as a developed country and what countries that are developing and are still extremely energy poor will need, we're going to need more electricity; we need more energy.
Bryan Walsh [00:27:15] I also want energy, ideally, that is baseload. I still think that's pretty important. It's especially become important with things like data centers that just keep sucking up energy. They can't really wait for the sun to shine, the wind to blow, and so forth. And I want it without carbon emissions.
Bryan Walsh [00:27:33] You put all those things together... I mean, quite honestly, nuclear power is the one that fits that bill. The bearish side of it is it still faces, I think, real questions and barriers or obstacles, at least, around cost, around public acceptance to a certain extent, although that has been changing, I think, which is notable. I think it faces some headwinds around, I believe, engineering talent. Like, have we been training people to run these plants? I don't know, because we haven't built a new one in so long. All those doubts.
Bryan Walsh [00:28:10] I think in different countries, we'll see it in different ways. We know China is moving very fast on it. You had, of course, the backlash after Fukushima. Look what happened in Germany and elsewhere that really set things back. I think it'll move forward. I think it's going to face some challenges.
Bryan Walsh [00:28:30] We need to have a rethink in terms of permitting. I'm not a deep expert in this, but my understanding is that at least some aspect of that very, very high cost we see is the fact that we really put nuclear power through a sort of regulatory wringer that we don't do so much with other energy sources, even though the chances of accident are still very, very low. And we're not really taking into account the fact that because it doesn't produce pollution, other than fairly manageable nuclear waste... It doesn't produce carbon. If you compare it on deaths caused per unit of energy produced, it's extremely low compared to things even like natural gas, which we've seen a huge expansion of.
Bryan Walsh [00:29:16] I'm hoping that changes. I put hope in new ways of doing it, in modular reactors. Obviously, I really hope, like everybody else, that fusion actually happens. And I think that's something we would need to have happen. If you really sort of imagine an AI world, you're going to need that kind of electricity source.
Bryan Walsh [00:29:36] Yeah, I hope so, I guess is what I would say. It's just the headwinds it faces remain pretty strong. We're also in a situation with high interest rates. That makes these kinds of plants tougher to build as well. But yeah, I would call myself someone who counts on it and hopes it comes through.
Host [00:29:57] Yeah, I think you underscored one of the main points, which is that to unlock the potential of nuclear energy, you have to fix the model for nuclear development as well as some of the related variables, like you said, permitting and so forth. And that really is the central key to extracting the benefits for everyone.
Host [00:30:13] Well, obviously there's a lot going on this year in 2024. There are elections in several different countries. Obviously, as you said, there's war ongoing. There's fear of war in other parts of the world. When you think about your work with Vox, what are some of the particular topics or questions you're most focused on exploring and explaining to readers the rest of the year? What particular issues are you hoping to dive most into in the coming months?
Bryan Walsh [00:30:42] We often look for stories that... We are a solution-based site, I think is what I would say, which I think gets lost sometimes. And this is an issue with the media more broadly. It's fair to say that there is a bit of a negativity bias around news. Some of that is just human psychology. We notice when things are out of order more than we notice when things are working fine. We try to both pay attention to trends that are good. I mean, we'll publish a story tomorrow, actually, the day after we speak, about how childhood mortality just hit the lowest level ever, globally. That is a good thing you don't really hear about in the news.
Bryan Walsh [00:31:18] But I think part of what we'll try to do is identify advances or new ideas that allow for technological shortcuts. To give you an example, we still face a major threat from future pandemics. Despite Covid, despite the experience of Covid, we haven't done anywhere near enough to prepare ourselves for the future. A lot of things are in the way of that.
Bryan Walsh [00:31:43] We did a great story a month-and-a-half ago or so about a really smart idea where you could use UV light inside buildings, schools, hospitals, whatever, to kill off a lot of germs. Such that you don't have to rely so much on vaccination. You don't have to wait to detect the virus and then figure out a countermeasure. Like, you could just stop it.
Bryan Walsh [00:32:04] That's the kind of thing I love to see. Anytime you're dealing with something that is politically challenging, that's a serious problem, and you can identify some kind of solution that just shortcuts or jumps over that problem, that's great. After doing this for about a quarter century, I have a little more confidence in technology than I sometimes have in our ability to work as a political body. So, when we can sidestep those issues, that's really important.
Bryan Walsh [00:32:28] With climate change, that's part of the reason I put a lot of hope in new breakthroughs. I think we should spend a lot more money and energy on continuing research for what we can do in the future. Whether it's something like fusion or something else, such that like, "Okay, we don't have to have this endless argument about how we divide up the world's carbon pie and energy needs, but rather we can have more."
Bryan Walsh [00:32:54] I mean, you mentioned energy abundance. That's an idea that I definitely want to write more about. Both because I think it's important... Because I don't think people, really both here in a country like the US, or honestly, even more so, people who are living in a place like sub-Saharan Africa, which is extremely energy poor but is also growing very rapidly... They're not going to put off getting the things that everyone else who has developed has gotten just to avert climate change. We have to find a way to get them those things without having those negative side effects. And so, the idea of some kind of breakthrough...
Bryan Walsh [00:33:27] The idea of energy abundance is really important because it should be seen as something we should shoot for. Not just because it can meet these needs around something like climate change, but also because it can enable all kinds of new things. I mean, quite honestly, lack of energy or lack of inexpensive energy is a huge obstacle to a lot of things we'd like to do.
Bryan Walsh [00:33:46] Like, it would be great if we could create vertical farms that grow food close to cities, reduce transportation costs, allow for more food self-sufficiency. You can't do that with the energy price the way it is now, because you're basically trying to replace this big free energy source called the sun. So, you can't do it unless you can find, really, ways to make it, as the old saying was, "too cheap to meter..." A lot of other things fall into that, like AI.
Bryan Walsh [00:34:10] Again, we want AI to succeed, which I think has a lot of benefits. You're going to need to find ways to make energy a lot cheaper. And like, creating a situation where, "Hey, that kind of abundance is a good thing. Look what it can enable you to do. Look what it can give us." That's not a thought we've had for a long time.
Bryan Walsh [00:34:26] I mean, I think going back 50 or more years now, back to the early '70s, the US has lived with the idea that like, "We want to try to minimize. We're going to try to minimize footprint, we're going to try to minimize the energy we're using." Sometimes we don't do that; sometimes we have huge SUVs. If we can create a system where that becomes abundant, we turn on those spigots. That, to me, just enables more economic growth, which is a good thing. It enables more quality of life, all these things, and does it ideally with less of the side effects we've seen in the past. So, trying to find stories that can help people grasp why that matters, that to me is something that's really important.
Host [00:35:07] Yeah, there's a good point embedded in there, which is that energy is, of course, necessary for productivity. But energy abundance, true energy abundance is a precondition for prosperity and innovation and going beyond the extent of our current material wealth...
Bryan Walsh [00:35:24] Yeah, and it's just not something that I think people grasp, exactly. One thing we like to do at Future Perfect is to... Yes, we look at the future; we also like to look at the past. You can't really understand where you are as a civilization unless you understand where you've come from. The amount of energy the average person in the United States now has at their fingertips... It is thousands and thousands of times what would have been possible 200 years ago. That is the difference more than anything else. And we kind of just take it for granted; it's been that way for a while. But that ability to generate and use energy is what makes all of this possible. And I think we should know that because that would help us value it more.
Bryan Walsh [00:36:09] But also, I think we can extrapolate that further and be like, "Well, if we were able to make similar leaps in the future, what would that enable?" I mean, not to get sci-fi, techno-optimistic here, but this dream of a Star Trek future is basically one where you hit the button and energy just becomes whatever you need. We're not going to do that, probably. But awakening people to the need, that would be a good thing. We should try to see what we can change to make that somewhat more possible.
Bryan Walsh [00:36:38] To me, trying to find stories that could do that, because at the end of the day, we are journalists. And so, we need to use that form. We need to find ways to tell those stories that aren't just data, that sort of capture people's attention. That's what I think we're going to try to do.
Host [00:36:52] Well, it's probably obvious at this point, but tell our listeners where they can find your journalism, your colleagues' journalism, where they can follow you individually online to the extent you're active on social media and so forth. Where can folks find all the stuff you're working on?
Bryan Walsh [00:37:07] Obviously, Vox.com is the site for Vox, and vox.com/futureperfect, but you'll see us on the homepage. We have a twice-a-week newsletter that I think is really good. It's written by a rotating cast of our writers, so it really covers the breadth of the stories we write about. You can sign up for that on the website.
Bryan Walsh [00:37:25] Me personally, I don't do social media that much anymore. I used to do a fair amount of Twitter, but honestly, for various reasons, less so. But I'm @bryanrwalsh on Twitter. You can find me there very occasionally.
Host [00:37:43] Awesome. Well, thank you again, Bryan. We really, really appreciate your time.
Bryan Walsh [00:37:46] Great to be here.