Name:
Opening Keynote: Publishers in the Age of Mistrust
Description:
Opening Keynote: Publishers in the Age of Mistrust
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/b5d52823-3efd-48a0-ada1-6980b248b21d/videoscrubberimages/Scrubber_1.jpg
Duration:
T01H13M05S
Embed URL:
https://stream.cadmore.media/player/b5d52823-3efd-48a0-ada1-6980b248b21d
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/b5d52823-3efd-48a0-ada1-6980b248b21d/opening_keynote__publishers_in_the_age_of_mistrust_2024-05-2.mp4?sv=2019-02-02&sr=c&sig=xU%2BcAFd21Q%2Fpjj8oTJGl1rSSEe1s4Ag%2BiVk5fv4Rp4g%3D&st=2025-04-29T20%3A25%3A15Z&se=2025-04-29T22%3A30%3A15Z&sp=r
Upload Date:
2024-12-03T00:00:00.0000000
Transcript:
Language: EN.
Segment: 0.
Welcome to the SSP 46th annual meeting. So happy to have you all with us in the great city of Boston and virtually. My name is Randy Townsend, SSP president. We are nearly 1,000 strong. And I was watching the numbers every single day and refreshing.
I hope you take advantage of the great sessions, see the posters, and bid on the silent auction. We have so many amazing things to celebrate this week and so many important conversations. I'm saving some of my important conversations for tomorrow, but today I'd like to recognize our generous sponsors and encourage attendees to visit the exhibit hall.
I'd like to thank the program chairs, Tim Lloyd, Erin Foley, and Jessica Slater, and the Annual Meeting Program Committee. Let's give them a round of applause. Have you ever sat in on any of those meetings and seen how much work they put into it? It's incredible. It's overwhelming.
The agenda for the meeting is in the program app (I watch my P's, sorry about that), where you can connect with fellow attendees both in person and virtually. I'd like to give a big shout out to everybody that is joining us virtually. Thank you for spending time with us. The Wi-Fi, which is probably the most important thing, write it down, is SSP underscore annual meeting, or annual underscore meeting.
And the password is SSP 2024. Please, please, please remember to silence your mobile devices. I'm guilty of that. I'll be the one that says it and my thing will ring. SSP is committed to diversity, equity, and providing a safe, inclusive, and productive meeting environment that fosters open dialogue and the free expression of ideas, free of harassment, discrimination, and hostile conduct.
Creating that environment is a shared responsibility for all participants. Please be respectful of others and observe our code of conduct. If you need assistance during the meeting, please stop by the registration desk outside of the Grand Ballroom. Recordings of all sessions will be available in the app within 48 hours. At this time, please join me in welcoming our keynote sponsor, Sami Benchekroun, co-founder and CEO of Morressier.
It's so good to be here. Thank you very much. I'm the CEO of Morressier, Sami Benchekroun. Just to pronounce it again for everyone. I'm always being asked, how do you pronounce your last name? So here we go: Benchekroun. And I'm really excited to be here with all of you.
I mean, it's always wonderful to be together with such a wonderful crowd, and what I would love to talk a little bit about is the theme of this year's SSP meeting. It's about integrity, it's about trust in science, which obviously is a big topic. And this time I've taken some notes, so I won't be as free as I usually am. So bear with me here.
But I'll try to be as free as possible even though I have these notes. So generally speaking, the scale and the speed of communication is increasing and poses a serious threat to the public discourse. We have seen this all too plainly since the 2016 elections, and we're seeing it over and over across the entire world. These same forces, driven by that technology, we see in science and in research as well. Exploiting this new tech, we have a new phenomenon, which is called bad actors.
Now, when I say bad actors, I don't mean people like Keanu Reeves. Now, hold on, hold on, hold on. I'm a fan, I'm a fan, I'm a fan. But what I have discovered is that he's somehow the eighth-worst actor, based on an IMDb list. So, number 8 on the bad actors list. But when talking about bad actors, I don't mean people like Keanu Reeves.
I'm talking about all the cases that we've seen over the last couple of years: researchers, people that are incentivized to publish as much and as quickly as possible, and who sometimes, unfortunately, turn to desperate measures such as paper mills. And I also mean people and organizations that are actively working to undermine trust in our institutions, which obviously Keanu would never do. At Morressier, just to touch a little bit on what we do: we realized several years ago that societies and publishers are simply not yet fully equipped to deal with bad actors at the scale that we're seeing these days, and we understand that our mission is to equip all publishers and societies to respond to that evolving threat every step of the way.
And since then, we've built integrity tools across the workflows. And I'm really excited to announce that we're releasing a new journal and manuscript submission system later this year, built on exactly those principles of integrity that we have learned, which will help publishers and societies to be prepared for everything that we're seeing.
And at the heart of the integrity crisis is trust, as I said in the beginning, which is hard to earn and easy to break. And I'm really excited to hear our keynote speaker later on, Deborah Blum on the topic. And I would love to continue the discussion with all of you throughout the meeting. So this is it from me. And I would love to hand it over to Tim Lloyd, who will give much more context about who Deborah is.
Over to you, Tim. Thank you very much. And yes, the device needs the password. We knew this would happen. I'm going to say it is a lovely thing. Thank you. OK, great. Welcome, SSP community. Welcome, those of you who are here in person. Welcome, those of you who are here online. We really value your participation in our meeting in Boston.
My name is Tim Lloyd. I'm one of your annual meeting co-chairs, and it gives me huge pleasure to welcome you all to our opening keynote, on behalf of my fellow co-chairs, Erin Foley and Jesse Slater (you'll be seeing more of us later during the Q&A), and the rest of the volunteers on SSP's Annual Meeting Program Committee. Those of you who have served on that committee know that there's a lot of work done, and it takes a village.
If you haven't served on that committee yet, go to the volunteer show and sign up. So I'm going to talk a little bit about why we selected Deborah to be our keynote speaker. In that process, we reflected on the significant challenges we face as an industry in dealing with research integrity. We've seen a lot of upheaval across scholarly publishing over the last year as a result of fraudulent research, both large scale and small.
Last year's keynote speaker, Dr. Elisabeth Bik, focused on research misconduct from the perspective of identifying and preventing fraudulent research from being published. This year, we wanted to take a step back and look at the broader impact of research integrity on the ability of scholarly publishing to successfully communicate the value of research. Our speaker is Deborah Blum, the director of the Knight Science Journalism Program at MIT, a Pulitzer Prize-winning science journalist and columnist, and an author of six books, most recently The Poison Squad, a 2018 New York Times notable book and the subject of a 2020 PBS documentary.
She is co-editor of the 2022 book A Tactical Guide to Science Journalism, published by Oxford University Press. She is a former president of the National Association of Science Writers, was a member of the governing board of the World Federation of Science Journalists, and currently serves on the board of the Council for the Advancement of Science Writing. There's a theme here. She's a fellow of the American Association for the Advancement of Science and a lifetime associate of the National Academy of Sciences, in recognition of her work in science communication.
She was also our first choice for speaker. And if you've ever had to curate a program, you know you don't always get your first choice. So thank you. That's all to say that we have a real expert on communicating the value of research here. Now, notably from our perspective, in 2016 Deborah launched the online science magazine Undark, which now numbers a readership in the millions and has won numerous national awards.
And I wanted to read an excerpt from Undark's website, because I felt it captures the relevance to scholarly publishing. The intersection of science and society, the place where science is articulated in our politics and our economics, or where it is made potent and real in our everyday lives, is a fundamental part of our mission at Undark. As journalists, we recognize that science can often be politically, economically, and ethically fraught, even as it captures the imagination and showcases the astonishing scope of human endeavor.
Undark will therefore aim to explore science in both light and shadow, and to bring that exploration to a broad international audience. As someone who thinks deeply about how to communicate the value of research to society and how to combat fraud and mistrust, Deborah is exactly the right person to speak on the topic of publishers in the age of mistrust, and we are incredibly excited to have her.
Please join us. This is me looking for my PowerPoint. Here we go. It's absolutely a pleasure to be here. And I had to make an incredibly long trip here from Cambridge, but the weather is just the same.
So thank you so much for having me here and for giving me a chance to talk about some issues that are important to me and, I think, important in the way we navigate the world of science and communication today. And thank you to the absolutely terrific organizing committee that invited me. I'm deeply grateful. So I want to start, hang on here.
I'm really curious about how this AI translation of the way I talk is going to go, because you may be able to hear that I grew up in the South, and Southern accents tend to really throw it, so I'm not exactly sure what's going to appear on the screen. But anyway, my point with this first slide, which is a cartoon about the Tuskegee experiments, is that the idea of mistrust in science is nothing new. Bad actors are nothing new, and deception in science is nothing new.
And so we have that as a foundation, I think, that we're increasingly aware of and that we have to consider going forward. But as the speakers before me have said, we're in a new and more challenging age for a number of reasons. So let's start here. There are a lot of very beautiful slides in my PowerPoint that I took out of the windows of airplanes.
This is one of them. I like that overview effect. I actually think this is Boston, but when I was looking at it I was going, is that Boston? It could be any overview. But let's start here with a review of where we are in the landscape of mistrust. I talk to my fellow journalists about this quite a bit, both nationally and internationally, about journalists trying to navigate a very mistrustful age.
I usually start exactly like I'm doing here. What is the landscape actually like? What are the challenges that we're facing? So I have a couple of slides to start that have to do with the United States specifically, and then some that are more international, because I think that's important. This is recent trend research from the Pew Research Center looking at trust in science, and it spans several years.
It tracks a downward trend in trust in science. If you drill into it, you can see that it divides in a number of ways. One is you see the impact of the COVID-19 pandemic, which tells you that some of this rising mistrust is driven by mistrust in medical science. And so if you go to those years, you start seeing an abrupt decrease there. You'll also see, and this is no surprise to any of us, that it varies, in the United States anyway, by political party, and that the mistrust is much larger if you are in the Republican right wing.
I can't say that; right-leaning section of the country, versus the left. That is also nothing particularly new. But you do see a downward trend in both parties. And we look at this in journalism as particularly important because, as you do, we think about audience. Who am I trying to persuade to trust me? What is the resistant audience? And so for us, when we're writing about issues in medicine, a lot of times we're saying, well, here is the audience that clearly is going to believe what I say.
How do I get this information across to the audience that isn't? What is the way that I actually try to reach out to the most mistrustful part of this conversation? And that's really important to me as a journalist. I've never been particularly interested in re-educating the choir. I always think of it as people who are gathered around,
I'm going to say, the scholarly or the research campfire. People who are already there don't need me to tell these stories. It's the people who are alienated from it that I'm particularly interested in reaching, because if we don't reach those people, we're never going to get the support to navigate in an intelligent way through the society that we're in today. I have one more slide, which has to do with the US, and again looks at some problems in the way that we see declining trust in science.
And this is again in the US; the rest of my slides are more global. This is actually a picture I took when I went to Vermont to watch the total solar eclipse. In the interest of complete transparency, which is one of the themes of my talk: everyone came up to me afterwards and said, did you notice how the birds quit singing? And I said, I was there with 3,000 people.
There was no birdsong. There was just a lot of yelling by the people around me. So I did not have that transcendental experience. But this is this year's Edelman Trust Barometer, which I've been following for years because I find it so interesting in a nuanced way. And what you see here is a look at: do scientists clearly communicate with me? And interestingly enough, the US is not the worst on this chart.
But you see, across the globe, there is a feeling that people in the research community don't communicate clearly with me. They don't know how to talk to me. And the corollary to that is: so I quit listening to them. This one has to do with, I'm going to walk over here, sorry, just for a minute, because I was supposed to see the slides here, and I'm not seeing them, and I just want to make sure I'm squaring this up.
Is the government politicizing research, right? Does the fact that the government is driving the research politicize the research? Do I feel that the government is completely driving this research? And again, here you'll see that politicization of science, another word I can't say. You'll see China and the US in a rare point of agreement that science is too politicized. But over here is government funding of research.
Driving the research: you see much more concern in China than you do in the United States. So I think, again, when we're looking at this, part of it is for us to say: who's the audience we're trying to reach, and how do we best reach that audience? You see here that there is a whole lot of feeling that innovation is not managed well in almost every country in the world.
And that is a warning to the rest of us that there is a feeling that science is moving things forward way too fast, and not in a way that is particularly well controlled. And I'm going to tell you myself that I don't think this is necessarily a bad thing. What I don't do when I go out and talk to people about science is say: oh, please trust everything about the research community, please believe that everything about it shines with integrity, that every person in this community has your best interests at heart.
Because I know that's not true. And I think one of the mistakes we used to make as journalists, when we would write about research, is we would present it as if it were other than human, as if this were a special part of the human species, sort of above the rest of us, without human emotions or passions, always objective, always seeking the truth, when in fact research, like everything else, is a human enterprise and has all of the complications and egos and brilliance and dishonesty and everything that goes with any human enterprise.
And so we don't do ourselves any favors by going out into the world and saying: oh, here's the one segment of society that can be completely trusted. And again, this is my basic message to you all: you earn trust by being transparent about what's going on. Anyone who's been an investigative journalist like myself knows, and I don't need to tell you this, that when something goes wrong in your business, in your agency, in your house, with your behavior, you're so much better off if you're the person who gets that information out.
Everyone worries about that gotcha moment. And there are plenty of people in the community of journalism who love the gotcha moment, and you completely defeat that when you put the information out yourself. People don't want to do that. There's a wish that no one will ever discover that this has gone wrong. But today, when we're talking about living in the modern age, in the internet age, the odds of you not being discovered get smaller and smaller every day.
So it's in everyone's best interest to be the person who gets the information out and deflates any gotcha moment that might follow. Finally, in case you wondered, nobody trusts journalists. This is an incredibly consistent finding. Trust in the media ebbs along at a really depressing level. I look at the Edelman barometer every year, and occasionally it'll compare journalists to other industry markers.
And we're somewhere beneath; we're sort of hand in hand with government. Nobody trusts the government. Nobody trusts journalists. This is a unique problem for us, probably more than most industries. Some of that's the shoot-the-messenger aspect of being a journalist. If we're any good at what we do, we're always telling people things that make someone angry, things that someone didn't want someone else to hear.
And so the answer, and we certainly saw this with the previous American president, is to call it fake news and to demonize journalists. So some of that is built into what happens when we do our job well. But I've been a journalist for a long time, and I would never defend everything about journalists. And so being more trustworthy is one of the things that I really try to lead as a mission in what we do at KSJ.
I want to briefly run through a few issues in scholarly publishing. These are not going to surprise you in any way, and most of the slides I have here are actually more about image than about the internal workings of what's going wrong, because all of you are familiar with what hasn't gone well, and the tools for making that situation better are in your hands, not mine. But there has been a lot of coverage of this recently.
I'm a good friend of Ivan Oransky, who is a co-founder, with Adam Marcus, of Retraction Watch. So when I decided to do this talk, I had a conversation with him about it and asked him for a few slides that would illustrate some of the points I wanted to make. You may have seen this before. This is just Retraction Watch tracking the rise in retractions, with 2023 going off the charts, largely because of Hindawi.
And it actually mentions the 8,000 retractions related to that. And about this slide, Ivan says that about two years ago he published a piece in Nature that said about 2%, one in every 50 papers, shouldn't have been published. And he said when he published that 2% number he expected to really get creamed by critics.
But people only wrote him to say, you're underestimating the problem. So I actually don't know what the real number is, and I don't think he does either, but obviously he's spent a lot of time on it. This is a piece that ran in Undark last year. It was actually a collaboration with Retraction Watch, and it focused strictly on a problem we're all familiar with, which is the problem with special editions in journals.
This is the Wall Street Journal's cover story from last week, which looked at Wiley shutting down 19 journals and raised the number of retractions to about 11,300. And I'm mentioning this because all of this goes out to the public. And so it's image-creating, right? It's image-creating among a certain part of the public. Not everyone reads the Wall Street Journal.
That's a wealthy, educated, influential audience, so you have to take that into account. But it's not like it's necessarily spreading across the country. Except, and this was the other conversation that Ivan and I had when we were talking about this story, journalists like him and me who cover science and research regularly see this as an area of concern. When I'm doing a story now and I want to be a trustworthy source, I may run a researcher or a journal through Retraction Watch just to make sure. I didn't used to.
But I do now, to make sure that I'm not falling into a problem and that the story sits squarely in the land of integrity. And I'm not wishing ill on the operation. I want it to get better. I want to not have to do that. But this kind of information is weaponized far too often these days, right? So we have to be aware, when these stories come out, that you have both good actors and bad actors, who look at information like this and say, how can I weaponize this in, say, my anti-vax crusade, or some of the other ways that people use this information.
This is not me arguing against these kinds of stories being done. They need to be done. It's just saying that we have to take a realistic look at their impact on our image and acknowledge that this creates challenges: how do we build trust when these kinds of messages are going out? So, having provided you with a whole lot of depressing data, I want to now have another look out the airplane window and take a kind of positive look at things that I think we can do.
I think it's very important to know the landscape of where you are. One of the things that I believe strongly is that you cannot fix a problem you do not see, right? Once you see it, you are shedding light on all these different aspects, and you can say, well, I could fix this. And I'm not going to stand here and say to you that I have all the golden answers for how to fix the problem of trust in science and trust in publishing.
And certainly, if I knew how to fix the problem of trust in journalism, I would have done that a long time ago. But I am saying there are things that we are doing actively, that we can continue to do, and that we can continue to do better, in a communications sense, which is really where I start to try to turn some of this around. So I want to mention a couple of things that we do at the Knight Science Journalism Program in particular in this regard.
These are a couple of projects we've run for years. I'm going to focus mostly on fact-checking, but I want to mention the science editing project. One of the things that we realized early is that in the world of science journalism, everyone trains the writers; no one ever trains the editors. If you looked at the curricula of any science writing program or journalism program in the country, there's nothing there about editing, and there's certainly nothing there about editing science.
And yet across the world, there are editors who are gatekeepers of science and research information, who are easily influenced by bad actors, who are able to take this information and put out the wrong information, who don't know how to judge what good science or bad science is, and who don't know how it works. And one of the things that inspired us to do this project was actually a vaccine story that ran in the Milwaukee paper.
It's been a while, but it was a mom who had totally bought into the anti-vax thing, and she called the city editor, and he assigned a story. The story said vaccines cause autism, without any questioning of that and without anyone talking to the paper's science editors. And the University of Wisconsin afterwards tracked the drop in vaccinations in Milwaukee that followed that story.
It was hugely impactful. And so one of the things we said was: why don't we start training those science editors? And so we did a program. This was funded by the Kavli Foundation, and we went around the country training editors at regional publications. And then eventually we created the science editing handbook, which is a free downloadable handbook on our website and goes through all kinds of things.
How do you handle controversy? How does science work? I think Ivan Oransky wrote one of the chapters in it. The editor of Undark, Tom Zeller, wrote the chapter on dealing with science and controversy. And then we translated it into Spanish, Portuguese, simplified Chinese, traditional Chinese, French, and Farsi. We translated it into Farsi because I'm friends with an Iranian journalist, exiled to Canada, who wanted to have it available for journalists in Iran so that they could start doing a better job of understanding how science works.
I think it's really fundamental that we recognize the limits of how well trained the gatekeepers of science are, in this country and in other countries. And the other thing we did from the beginning at Undark is we said, let's fact-check. Not the Politico kind of liar-liar-pants-on-fire fact check that the Washington Post does after a politician gives a speech; rather, why don't we care about getting it right from the get-go?
And so when we started Undark, and that was in 2016, we started it out with a staff of really detailed fact-checkers. And in case you wonder about the name Undark: I had written a book about the invention of forensic toxicology in the United States. It's called The Poisoner's Handbook. One of the chapters in that book is about radium, because the scientists I was following throughout that book were involved with the watch dial painters, and Undark was the name for radium-based luminous paint.
All these girls, these young immigrant workers, were taught to paint the luminous faces of watches and clocks with this radium-based luminous paint, and they were taught to lip-point their brushes, right? Do a one, lip-point your brush to sharpen it back up, do a two, swallow the radium, swallow the radium. If you know anything about radium, that's the most dangerous way to be exposed to it.
It's an alpha emitter, so it doesn't necessarily penetrate skin very well, but inside your body it's structurally like calcium, and it goes to your bones. And so this is why you had these catastrophic effects. So when we were talking about starting this magazine that was going to illuminate science, we had all these ridiculous names, photon, other light-and-science names. And I thought, wait a minute.
There's this weird name for something that's not really light, it's sort of light in the dark, which is what the paint was. And that's actually the origin of the name of the magazine. I don't have time to tell you the whole story, but a fan of the magazine mailed me a vial of the original Undark paint in the US mail, without labeling it as a radioactive material, and he sent it to my house, leading to a long story involving the nuclear safety people at MIT.
And Geiger counters in my basement. But anyway, that's the origin of the name. We decided from the beginning we would fact-check, and so I wanted to show you an example of that. This is a story that ran in Undark's first year, when we were getting right into interesting issues, about a scientist farmer. And it's by Brooke Borel, who actually wrote The Chicago Guide to Fact-Checking and is an editor at Undark now.
And this is what the fact check for that looked like. This is what we do when we take a science story: we go through it line by line and word by word, and underline facts and color-code them. And so if you look at this example, you see on one side the actual story, and on the other side the answers. The fact-checker has to check everything. I think the first line in there is, yes, she's describing the smell, and the smell checks out, right?
But we go right into the weeds with this, because when the story comes out, we don't want, especially with these kinds of stories, someone coming back and saying, well, you got that wrong, you can't be trusted. We want people to be able to look at it and say, you got it right.
I think there's a parallel to this in peer review, and there's a parallel to this in the way that journals are edited. And the trick with it is to not allow the amount of material coming through to overwhelm your ability to actually check it. And I realize that in the scholarly publishing industry, which is a $30 billion-a-year business where some of the model is a whole lot of papers turning over, that can be a challenging financial model.
And yet, when you get it right from the start, you establish yourself as a trustworthy source. Obviously, we think Undark is that. And I'll acknowledge that most publications can't do what we do. They don't have the financial wherewithal or the bandwidth to do that kind of fact-checking, but it really matters. And so we've been pushing it: there are free guides to fact-checking on our website.
We run fact-checking workshops. We keep saying, we've got to do better at this. And here is my continued argument for being transparent: when you make a mistake, you acknowledge the mistake. We certainly do this, maybe not perfectly, in journalism. But journalism is a profession in which we make all our mistakes publicly, so you will see us correcting if we're any good at what we do.
You'll see us correcting as we go. So be really straight up about it. This is one of my favorite cartoons about trying to obscure the issue: We're destroying the Earth. Could you just say that in a slightly weaselly way? Don't do that. If we're actually destroying the Earth, we need to say we're destroying the Earth.
This is, again, some of the information I got from Ivan, which really deals with the issue of transparency and journals. Do journals get the word out? Not always. In this case, they had done an analysis that showed about 40% of the retracted papers were not identified as retracted. From my perspective as a journalist,
I'd rather see them labeled that way, or asterisked, or whatever you decide to do. Here's another story from Retraction Watch, which is just about the fact that it takes a long time to get a retraction through. Again, that's a transparency issue. I think, as much as anything, if we're communicating that we care about the integrity of the research, that we care about the integrity of the story, then we have to try to respond in an expedient way.
But the other thing we need to do, I think, is, if it's going to be a long, drawn-out process, be much more transparent about the fact that that's just the way things work, and try to give people placeholder information, if nothing else. I wanted to mention, this is my last slide from Ivan, I think, and you're probably relieved to hear it, that people actually are doing some of the things I'm talking about.
There are journals that are hiring people to supervise research integrity. There are independent operations that look at this: the Clear Skies project in the UK, the Problematic Paper Screener project out of France, which tracks paper mills. I wish we had something comparable that I could brag about in the United States, but there are certainly researchers in the United States who have made a point of tracking this independently.
Dorothy Bishop comes to mind for me. It'd be better if this came from the journals. This is my point about getting ahead of it. It'd be better if all of this was coming from the journals themselves. But it's great that there is this kind of concern. I want to briefly go both forward and backward. Let's anticipate problems that are going to happen. We've already mentioned this a little.
Oh, this is my last slide, sorry. So this is a slide that he put together on papers and peer reviews with evidence of ChatGPT use. And I said to him, really? That's all? And I actually went back and checked, because he had done this PowerPoint a while back. I went back to his website in my fact checking way, thinking, surely there are at least 1,000 now. And it was still about 100.
So I said, well, this doesn't make any sense to me at all. And he said, no, those are just the papers in which it's clearly identified that there was use of ChatGPT. The papers where people are not admitting it, or where it's not clearly identified, or where we don't know how to search for its use, are many, many times that. So this is an issue that I just want to say all of us have to anticipate, in journalism and elsewhere.
The Wall Street Journal story that I quoted mentioned it. The concluding part of that story says the worst is yet to come, in its Wall Street Journal kind of way. But I think it's really important to acknowledge that this is going to be an issue for all of us. It's certainly an issue in journalism, and I think it's going to continue to be a trust issue in ways that we're all still figuring out but know are coming our way. Stepping backwards now:
I want to talk a little about the importance of also learning from past mistakes. I used to have a paperweight on my desk that said, a mistake is the thing you recognize the third time you make it. But I think it's often more than that. So one of the things I think is really important, and this is more of a long tail, big picture kind of thing, is that we do a crummy job, both as journalists and as scholarly publishers, of educating the general public about the process of science.
So this slide I have up here is a picture from Science, at the start of the pandemic, when everyone believed that COVID-19 droplets were the greatest risk, and we were all washing our groceries, and China was disinfecting the world. This is a picture of China disinfecting a street or something. And afterwards people went back and said, see, that's proof that scientists didn't know what they were talking about, when in fact the real issue was that science is an evolving state of knowledge, an ongoing collection of data points, and we do a really crummy job of communicating that. And that comes back to bite everyone. So here is my brilliant slide about this. This is also an illustration from Science, of the inundation of papers during the COVID-19 pandemic.
I always say this to journalists, but I think it's worth saying wherever I go: learn to value repetition. The people who sow mistrust, the people who deliberately stir up false information or mistrust in institutions, repeat themselves over and over and over again. And we have a tendency in journalism to say, didn't I already do that story? And we have a tendency, I think, in academia to say, didn't I already publish that?
Didn't I already say that? And I think we have to recognize that in this particular age, we are going to have to find ways of repeating ourselves and being interesting at the same time. How do I say the same thing in a different way? So as an example, this is the 2014 Ebola epidemic in West Africa. And you see this sign: Ebola is real, son. You probably don't remember how many rumors and falsehoods were circulated about Ebola.
It was a government plot. It wasn't a virus at all. Right? And so we still find ourselves repeating the corrections, because there are all kinds of situations in which the misinformation is carried on endlessly at one end and we get tired of correcting it. But in fact, we're at a point, I think, in modern society where we have to be really never tired of correcting things.
Having said that, and speaking of scholarly publishing from long ago, this was true a long time ago. This is actually one of my favorite quotes on the subject, an 18th century quote. And it's very 18th century: do not fear to repeat what has already been said. Men (that's the 18th century part) need the truth dinned into their ears many times and from all sides.
The first rumor makes them prick up their ears, the second registers, and the third enters. I wish there was a magic three, that if I just said something three times, clicked my heels three times as in The Wizard of Oz, everything would be fixed. I think it has to be more than that. But I think we have to learn to love repetition. And finally, this is me on the trail of a serial murderer.
I'm writing a book about serial poisoners and female murderers. And so this was actually me in Durham Cathedral, on the trail of a famous British serial killer, taking a moment to go to the cathedral. This was an art exhibit called the Museum of the Moon, and you can vaguely see me standing in the shadow of it. Finally, let's be honest. We can do all of these things. We can do everything right.
We can be very clear. We can share information. We can repeat ourselves endlessly. And some people are never going to try to hear us. They don't want to hear us. So we have to think of creative ways to get through those barriers. We have to identify these resistant audiences that I'm talking about.
We don't have to worry about the ones that are already essentially on our side. We worry about the ones that aren't. We try to figure out what makes them resistant, and then we try to figure out how to get around that. There are no easy answers. I've gone through some of the ones I have: integrity of story, transparency, being accountable.
Saying: it was me. Repetition. All underlying clear and open communication. But I want to throw this in there, because this is my favorite American poet, Emily Dickinson, and this is from a poem she wrote: Tell all the truth but tell it slant. And I just like the end of it: The truth must dazzle gradually / Or every man be blind. Again, it's the 19th century, so it's male. But nevertheless it makes the point that in our arsenal, aside from all the let's be straightforward, let's be transparent, let's be clear, let's be honorable, let's be people of integrity, there's nothing wrong with being strategic.
And so when I'm writing a story about a murder, I'm strategically teaching people about arsenic or carbon monoxide, as the case may be. I'm just going to tell a story. And one of the reasons I do that, and this is my resistant audience thing and my last point, is this, to use a personal example: my sister-in-law, who lives in California, is very right wing.
And whenever I see her, she goes into a rant about what she calls smart people. She hates smart people. She hates people who make her feel stupid. She's never going to listen to anything I say, right? I can talk up vaccines or anything else, but all I'm doing is showing off how smart I am. And so I have to think about that kind of resistance. We certainly see in these resistant communities this resistance to being lectured. But there are ways to get the information out there.
Ways that are not a lecture. And so narrative writing, narrative storytelling, talking to people. I mean, people come up with all of these different things. But sometimes indirection is a wonderful thing, and I highly recommend it. And with that, thank you. I hope this was helpful.
Hello? Yeah. Oh, there we go. Bingo, great. Before we start the Q&A, a quick public service announcement. For those of you who are in the overflow Marina room, please enter your questions into the Whova app under the session Q&A, because we can't hear you from here. We have a dedicated volunteer who is picking up the chat in there, and now we are ready for questions from the audience.
Go ahead. Thank you. Hi, I'm Michael Fitzpatrick from the Proceedings of the National Academy of Sciences. Thank you so much for starting off this meeting so wonderfully. Your point about trust being gained through transparency really resonated. Something that I think journalists and scholarly publishers share is that there is a very necessary level of confidentiality that needs to be maintained for our processes.
So what do you recommend as tools or tips to find that balance between keeping the confidentiality that we need to keep, while also having as much transparency as possible? So, I have always thought, and most people don't realize it, that journalists are amazing keepers of secrets, right? I mean, I've been a longtime investigative reporter, and there are secrets from my reporting past that I've actually never told anyone, including my husband.
So I absolutely respect what you say about confidentiality. I think it's really worthwhile, in a strategic way, to sit down and say, so how transparent can we be? For instance, if I'm doing something in which there has to be some kind of confidentiality, what I need to be is as transparent as I can about why it's confidential. Why I'm withholding information. Why this is protected.
What this is actually protecting. And make sure that people at least understand how limited the withholding is. And then say, but here is absolutely everything I can tell you beyond that. Then, a journalistic example of that: one of the things that comes up in journalism quite a bit is our use of anonymous sources. Did you make that person up?
I always ask that. What's the evidence that this is a real person, right? So when we're doing a story that involves protecting an identity, then I think we have to explain very clearly why we're protecting that identity. And it's not just, and you'll see this sometimes in political reporting, "so-and-so, in fear of losing their job" or whatever. The one that always stuck out for me was from when I was a reporter.
I used to be a science reporter at the Sacramento Bee in California many years ago. I did not do this story, but the Bee did a story on latchkey kids, and the growing phenomenon at the time of latchkey kids. They changed every single name in that story, for the obvious reason that they weren't trying to give predators the names and addresses of children who were home alone.
And they were very specific about that: here's why we're withholding that information. So in these kinds of cases, I'm not saying this is a cure-all, but I think we really go to the mat in saying, this is why we're not telling you this, and this is what we can tell you. And if we have a sense of when we might be able to give more information, come back to us. So: let people see as much of the actual process as possible. I've thought about this with some of Ivan's slides about the delay in retractions. There are all kinds of complicated and sometimes bureaucratic reasons why you would see those delays, and those delays can be used against you. But the more you're able to say, let me explain to you why this delay exists, the better. Or "we have no excuse for this delay" is an OK thing to say.
If it's actually been six years that you've been thinking about it, we completely blew that, right? And I'm also myself a believer in saying "I screwed up" when we actually do. I don't know why we find that so incredibly painful. Well, I do. But in fact, a lot of times people are more forgiving when you just say, yeah, I made a mistake, and here's what I'm doing to fix it. Does that make sense?
As much transparency as you can, and as much of the process as you can, and honesty about why you can't, is probably, I think, the best we're going to be able to do with some of that. That's a great question. I'm just going to take one from the virtual audience, and then I'll come back to you. We have a question from one of our online attendees: I appreciate your acknowledgment of the power of fact checking.
Could this part of the process be made more transparent to authors, readers, or viewers, to support the value of paying editors to do this work? Yeah, that's a great idea. I mean, one of the things we've actually talked about, and I put up that fact checking slide because most people don't even know what a fact check is, right? And so we've talked in journalistic circles about whether we should put up more of a "here's a guide to how this story was fact checked."
Or here's how we put this story together. Now, what I haven't seen is: this story was completely fact checked, therefore everyone trusted it. But I think the more transparent we are about how we do this, the better off we're actually going to be in terms of getting people to invest in it. And we certainly found that. Our fact checking project was funded for us by the Gordon and Betty Moore Foundation.
So just to give you an example of foundation investment in this kind of thing: they actually came to us and said, what's the state of fact checking in science journalism? We did a report for them on that, and then they came back and said, this is a real problem. What can you do to fix it? And here is a whole lot more money to try to work on that. So I think foundations really see that this is worth investing in.
And I think institutions do too. So the more we advertise it, in ways that I don't think we currently do, as is suggested in your question, the better off we are, right? And again, transparency, the more we attach that to stories. I haven't fully figured that out. What if we took every story and, underneath it... We have a story at Undark right now, and I can't tell you what it's about, but the fact check is taking a solid month.
It's probably going to come out in two weeks. And so at Undark, our fact checks are the most painful thing I have ever lived through myself. But if we gave people more of a sense of what they were, and I think that's a very good point, more people would appreciate the value, and we probably would be able to invest in it more. And this is at a time when editors and copy editors are disappearing from journalism.
So if we could turn that around with some of this, that would be really something. I'm going back to my office now to work on that. That's a really smart idea. Yeah. Hi, I'm Heather Staines. I work for a company called Delta Think, and I really appreciated your talk today. My question is also kind of around fact checking and balancing.
The time required for fact checking, as you were just saying, versus not taking on more information than you can handle. It was interesting to see the Wall Street Journal article referenced, and we know exactly when they did the fact checking for that, because they reached out to us at Delta Think to confirm some of their numbers, and it was down to the wire. I will just say, in journalism, speed may be important, and certainly in our industry, moving things through at a pace that's going to be helpful to the researchers.
But you have to put those checks in. Realistically speaking, how do you balance those? How do you trade those two things off? That is such a good question. And I will say, it's easy for us to do this at Undark in part because we're a magazine, right? We can take the time to do it. That's certainly not true at a daily newspaper, right?
I mean, the fact that the Wall Street Journal runs fact checks is actually pretty impressive, right? I think one of the things that's happening at newspapers, and then let me go back to the time issue, is that as the ranks of the copy editors shrink, they're outsourcing the fact checking more, right? We certainly see that at all kinds of other publications.
And if you actually go to the KSJ fact checking site, we now have a database of fact checkers for hire that you can search by expertise and experience and all of these other things, so that people who don't have in-house fact checkers can just hire a professional fact checker and work with them. And especially in the book publishing industry, we have more and more authors who hire their own fact checkers.
Now, again, a book is a time thing, but famously, book publishers almost never hire fact checkers. There's a book that just came out, or is almost out, by Ferris Jabr, called Becoming Earth, and he had four different people fact check that book. I mean, there's a real awareness in science journalism that we have to get this right. So for your question about time, I think it's almost like the magazine model.
With the things that turn around really fast, we put up a kind of checklist: here's some basic stuff that you absolutely have to be sure you get right. Just go through this checklist. Make sure the names are spelled right, the publication is spelled right, the fundamentals are correct. Most newspapers don't have time to do more than that. And as we all know, newspapers misspell names and misspell publication names.
So we're just trying to get some of those basics right. But then there are the longer, more in-depth pieces, the articles that you think are really going to get some traction, the ones that you expect to attract attention. Or, thinking from the journalistic standpoint: Science and Nature send out these advance lists of what's going to be published to journalists like me, so that you can spend the week ahead putting together the story in honor of the embargo.
And they do that because they're trying to drive attention to those articles. If you're going to have an article that you're really going to push out there for public attention, then I would take the time. I would put those at the top of my priority list. I think it's almost like a triage thing. I'd say: I can't do everything, so these are the minimum standards.
And for the ones that I think are really going to get a lot of attention, we're going to spend the time, we're going to slow things down a little bit, and we're going to do those right. That's not perfect, but those, aside from the large statistical problems of special issues where you get hundreds of papers involved, are the kinds of articles where you really start seeing people say, well, what's right about that, and why is that a problem, and what's going on? And I'll give you one, a COVID example, that I sort of waved my arms about for quite some time. There was a Canadian study by a cardiology group, I want to say they were in Ottawa, and they were looking at the perceived heart effects of the J&J and Novartis vaccines, those forms of vaccines.
And they got the math completely wrong. I don't know if you remember this, but we do. It was off by like a factor of 100 or 1,000. It was just insane. The math was so off that afterwards I'm like, why did anyone even report on this? Because if these numbers were true, we'd be surrounded by bodies on the sidewalk, people. It just never made sense. But the journal immediately sent that out to Canadian journalists. The Canadian journalists immediately reported it without checking it. It got huge traction in the anti-vax community. And it never should have. First, it should have been fact checked.
If they were going to send it out and flag it for journalism interest, they should have fact checked it. And second, it's our failure too. We should have fact checked it. And the only reason it did not get a lot of traction in the United States is the attitude of American science writers toward Canadian research.
We were just like, yeah, some guys from Ottawa, no, right? Which turned out to be a good thing in the United States. But you can't always rely on poor cross-border attitudes toward research. I would acknowledge the limits of time and of money, and I would triage all of this. Does that make sense?
Yeah. Hi, I'm an independent, so thanks for being here. I have been following you since the beginning, so thank you also. My question is, in the interest of integrity, timeliness, and fact checking: when you put up your picture of what an editorial process looks like, with your fact checking on it and noting everything, I was just wondering how you're thinking about AI for the future.
Oh, that's a great question. So we're starting to try to feel our way through that now. Obviously, right now the language capabilities of AI are still imperfect enough that you can pick up a lot, and there are already programs to look for AI kinds of language in things. We're trying to get ahead of that partly by asking for some kind of honesty from the contributors.
And the example I'm going to give you of that is not so much our magazine, but I'm on the committee of an award that the National Academy runs for the Schmidt foundation, Schmidt Futures. It's the Eric and Wendy Schmidt award in outstanding science communication, and it goes to both scientists and journalists. It's a really interesting award. And last year we started saying, people have to let us know if they used any AI or ChatGPT in their story or in their submission, right?
Thank you. Since I'm walking away from them, I'm not sure this is on. On? Yes. You have to let us know if you're using any AI or ChatGPT in your submission. And we've rejected a couple of them. Then we had to make the second decision, which is: what is the use?
We had one person who came in and said, I have this disability, so I'm using AI to do these things. And we went ahead and allowed that. But it's a judgment call. We get 500 entries for that award, and every single one we have to put through that screening process. And in addition to relying on the integrity of the person who's telling you, we have to run it through a sort of check-for-AI scanner.
I think we're all trying to sort that out right now. I just saw something, I think it was on social media, that Vox was going to an increasing AI model, and some of the more popular science publications too. So I think from a journalistic standpoint, we're also trying to figure out what it's going to look like out there. And again, how trustworthy.
It's going to be. I think we're all fumbling our way through getting this right at the moment, truthfully. That's a great question. Hi, I'm Alice with the American Chemical Society. Thank you so much for being here. So I'm just wondering, for anybody who might be early career, or anybody who might be interested in a nontraditional, non-obvious type of job in our field, can you give us any insight on where might be a good place to go to find information on becoming a fact checker?
Sure. Well, us. We do have a website that has a whole lot of information on the basics of fact checking. It has a do-it-yourself fact checking section, it has links to some of our past workshops, and it has other information about fact checking. So we're an easy start to this.
But because there's so much interest in fact checking in journalism in particular: the Chicago Guide to Fact-Checking, which is in its second edition, is a really good book. Brooke Borel, who wrote it, works with Snopes and some of the other fact checking outfits, looking at ways that you can detect falsehood as well as ways that you can just find errors. And just to give you an example, and to me this is a ridiculous example of fact checking:
We did a story about epigenetics. It was an epigenetics story out of Michigan, in which a dad believed that his exposure to a toxic chemical in cattle feed was the reason his daughter had later died. The reporter was walking through some of the epigenetics of that, and part of the story is that the dad goes out to her grave every day.
And there are stuffed animals at her tombstone. So our fact checker actually went on Google Maps to make sure that the distance was right, which the reporter had gotten off a little bit. But the more important thing that she fact checked, and you wouldn't have thought of this in a science story, was that the writer had very carelessly and randomly said, he goes out and leaves another teddy bear by the tombstone. But they weren't teddy bears.
His little girl's favorite stuffed animal was the frog, and he went out and left stuffed frogs by the tombstone. So the fact checker went back and caught that the writer had gotten it wrong, and it was right in the published story. Anyone who read that story who was not from this small town in Michigan would not have gotten how important that was. But anyone who was from that small town would have said, I don't trust a word of this story.
It can't be right, because she couldn't even get the frogs right. And so we really recognize in doing this, and we do this in our training workshops, and you'll find this on our website, and you'll find it in things like the Chicago Guide to Fact-Checking: God is in the details, as they say. This really matters.
And we're really pushing it. I hope it'll make a difference. The more we get even the frogs right, as it were, that will make a difference, with people coming back and saying, well, that was a trustworthy story, or those people really cared about getting it right. And that's the other message with it: it's not like we're ever going to be perfect, but the message when we do this is that we actually care about getting it right.
And that's a really important message. Hi, thank you. Great talk. I'm Christine Lamb, and I recently retired from the New England Journal of Medicine, where I worked in brand marketing and also media relations.
And I'm wondering what you think, and can say, about reporters covering preprints, particularly advance reports on therapeutics and new drugs and things that might not be final. Thank you. Yeah, that's a great question, and it's a real problem, going back to when I was talking about the conflict between the media and science.
Science is a process; we're event driven. And you have to be really careful when you're covering a preprint because of our tendency to announce it as a discovery or an event in science. If we're going to cover preprints, and I think in the modern publishing age we do, then there are two things that I think we don't always get right, but that we should. One is that, whether or not you're convinced that peer review works perfectly in every situation,
Preprints don't have that. And so you have to at least say to yourself, going in: this didn't go through that extra screening. So how do I do, essentially, the fact checking to try to see how well this holds up? It's on the reporter to do the homework, to test the proposition out and put it in the context of the science.
And then the other part is complete transparency. I'm not positive that everyone in the general public actually understands what a preprint is, right? But in the world of repetition, we should say it every single time. It reminds me of when I was writing about climate change, and you would go: do I have to explain the physics of atmospheric gases again? Do I have to do this every single time?
I only have so many inches. We used to do newspaper stories in column inches. Do I have to give another two inches to explaining this again? But we do. We actually have to repeat it every time: this is a preprint, this is what that means, this is what the scientist who did it says about how seriously we should take it,
and this is what other scientists say about how seriously we should take it. So some of that is the way we do our homework. Just to give you a sense: when I'm doing a story, I'll research the scientists. I'll run them through PubMed and Google Scholar, right? I'll look at whether they actually have any expertise in the field. I'll look at who cites them.
I used to always say to the scientist, well, who do you think I should talk to? But quite often I don't do that anymore. Instead, I just look at the citations and talk to people who are not quite so close to the scientist. And I'm the advisor for the student newspaper at MIT, so I've been coaching their science editor, which is actually a lot of fun.
But when I'm teaching this, I'll say: minimum of three. A story that just has the PI saying this is the greatest science in the world is not a story. You need to talk to the PI, or the PI may shuffle you off to a grad student, as so often happens. But then you have to have at least two other people who put this in perspective. And if you can't do those minimum three interviews, then don't run the story.
Now, that's like an ideal situation, but I think we bracket the preprint by doing our homework and explaining over and over again what kind of context it deserves. Does that make sense? I'm afraid we have run out of time, sorry about that. It is actually a quarter past, so please join me in thanking Deborah for her really thoughtful presentation.