Name:
The Role of Trust in Publishing for Sustainable Development
Description:
The Role of Trust in Publishing for Sustainable Development
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/014596b0-677b-4b84-899d-a8f0a81c5426/videoscrubberimages/Scrubber_1.jpg
Duration:
T01H02M54S
Embed URL:
https://stream.cadmore.media/player/014596b0-677b-4b84-899d-a8f0a81c5426
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/014596b0-677b-4b84-899d-a8f0a81c5426/session_6c__the_role_of_trust_in_publishing_for_sustainable_.mp4?sv=2019-02-02&sr=c&sig=OODjf%2FStkQ2zZreOYsp%2FKVCEYvP6sfZgBb%2BbIyeME10%3D&st=2024-11-22T05%3A56%3A38Z&se=2024-11-22T08%3A01%3A38Z&sp=r
Upload Date:
2024-02-23T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
Thank you all for coming. It's Friday afternoon, it's after lunch, it's nearly the last session of the conference, so we really, really appreciate you being here to learn about why it is so important that we can trust published research when it comes to the Sustainable Development Goals and global challenges more generally.
I'm Nicola Jones. I work for Springer Nature, where I'm in charge of the Sustainable Development Goals publishing programme, which means that I work with colleagues around the company, across our books and journals and all of our different subject areas, who publish content that relates to the UN Sustainable Development Goals, with the idea that if we bring this all together, we can help this research to have an impact in the real world.
And I'm joined by a really excellent panel of experts who are going to help us to explore not just why we should care about publishing this research, but why it is so important that we can trust what is published. So I'm delighted to welcome Sarah Gorman, who is the CEO of Critica; Rebecca Kirk, who is the publisher for portfolio development at PLOS; and Jay Patel, the head of sales for the Americas at Cactus Communications.
And I'm going to set the scene a little bit first, and then I'll hand over to our panelists to describe the issues from their perspective and talk about some of the work that's going on. But we also have a couple of polls available on the app, so if anybody would like to go in, tell us how the SDGs relate to your publishing programme and what you see as the main issues when it comes to trust and sustainable development publishing.
And it was really exciting to me in the keynote session with Elisabeth Bik on Wednesday night when she ended by saying that it's kind of up to science to save the world. We need these insights, and we need to be able to trust the science if we are going to save the world. And that sets up this session really nicely.
So for anybody who may not be so familiar with them, or didn't make it to the session yesterday afternoon, the SDGs are a set of 17 goals that were adopted by member states in 2015. They will run until 2030, and they set out, effectively, a blueprint for a sustainable future on Earth, thinking about not just environmental issues but also social and economic issues, global health, good governance, peace, and making sure that organizations and countries work together to help the goals be achieved.
Besides the fact that this is obviously a positive thing for the world, one of the reasons scholarly publishers should particularly care about this is that it's a really rapidly growing area of research. The chart on the right-hand side of the slide shows that articles relating to the SDGs have outgrown articles that don't relate to the SDGs by around 10 percentage points.
And that's true across open access, hybrid publishing and subscription publishing. It's a really rapid area of growth within the market, so it's something that publishers should be thinking about engaging with as much as possible. And there are various initiatives that have sprung up recently in the industry to help publishers engage with the SDGs.
In 2020, the SDG Publishers Compact was launched by the UN and the International Publishers Association. It sets out 10 commitments that publishers are asked to make towards the SDGs; they relate to operational aspects of sustainability, but they also relate to content publishing, which is really what we're here to talk about today. A number of the organizations represented at this meeting are signatories, and the Publishers Compact is supported by a group called the SDG Publishers Compact Fellows, which is a collaboration between research and educational publishers, librarians, researchers and practitioners.
I'm a Fellow, Becks and Jay are both Fellows, and Clarice here at the front, who ran yesterday's session, is also a Fellow. One of the things this group has done is to set out top action tips to help people engage with the Compact and think about how to integrate the SDGs into the work that they're doing.
Something else that we do is run solution summits. I should have put the website on here really, but haven't; if you Google "Publishers Compact Fellows", you'll be taken to our website and be able to access the resources and the upcoming events. Turning back to the issue of trust now. A couple of years ago, Springer Nature worked with the UN Sustainable Development Solutions Network to convene a series of roundtable discussions to look at what we could learn from the global response to COVID-19 and how we could apply some of those lessons to addressing the climate crisis.
We published a white paper that looked at how these interdisciplinary and cross-sectoral conversations came together to come up with some suggestions for a way forward. One of the focuses for one of these roundtables was the idea of misinformation, and the quote that I have on the slide now from Dr. Genevieve Guenther really sums up the issues here. The disinformation that we've seen with both the climate crisis and the COVID-19 pandemic has been coordinated and disseminated in a systematic way.
And this disinformation is infecting all of our understanding and all of our behavior with respect to both the climate crisis and the COVID-19 pandemic. So it's not just that these are really important issues that we have to solve if we're going to have a sustainable future; it's also that there is a disproportionate amount of misinformation and disinformation about these topics.
So I'm very happy to be able to hand over to my panel now to talk about how we can ensure trust in the scholarly literature when it comes to global issues. And first of all, I would like to introduce Sarah Gorman. Thanks, Nicola, and thanks, everyone, for being here today. So, I have no conflicts of interest.
Our work at Critica, which I'll tell you about in a minute, is funded mostly by the Robert Wood Johnson Foundation, and these are my views, not necessarily those of the foundation. So, a little bit about Critica, since you probably haven't heard of us. We're a small nonprofit. It was founded in 2016 when I published my book Denying to the Grave: Why We Ignore the Facts That Will Save Us, which looked at the psychology of science denial and misinformation.
We're a science communication research organization focused on ways to improve public understanding of scientific consensus. Our major area of current funded work is around medical misinformation: we've focused on COVID-19 vaccines, reproductive health and other childhood vaccines. We also provide extensive technical assistance to public health institutions and national health-related organizations.
The Office of the Surgeon General, the American Board of Internal Medicine and the New York City Department of Health are a few of those. My work lately is really focused on trust, so it's an appropriate topic. I've just handed in a new book focused on trust in health care and how that's evolved over the course of the pandemic, which should be out in 2024.
And I will say that it's much easier to talk about what happened to trust, and the problems with it, than it is to talk about what to do about this loss of trust in the health care system. So that's something I think everyone is still trying to work out. So what is the relationship among these things: trust, misinformation and the Sustainable Development Goals? Well, we know that trust is required for several things that are important to the Sustainable Development Goals.
So things like collecting reliable data, people need to have trust in those who are collecting the data to allow their information to be shared. There needs to be trust in order to engage in health promoting behaviors. So we saw this very extensively in the pandemic. People didn't trust expert consensus on things like mask wearing and social distancing.
And this probably led to many more infections and deaths than needed to happen. We also need trust to maintain a general confidence in the government. In public health, most of the work comes via the government and government agencies, and so having confidence in the government in general, which people often think of as sort of one monolithic entity, is really important for health-promoting behaviors to happen and for public health goals like the SDGs to be reached.
Also, we need to nourish cooperative work among different countries and stakeholders, especially for something Nicola mentioned, climate science. The crisis around climate right now affects all of us on the globe together, and it requires countries and nations to work together and trust each other on some of these issues. And when does misinformation flourish? There's kind of a two-way relationship between trust and misinformation; that is what I've really found in my research.
So misinformation can lead to a lack of trust; a high presence of misinformation can lead to that. But a lack of trust can also lead to a high presence of misinformation, and to susceptibility to believing misinformation. And as we saw during the pandemic as well, misinformation flourishes when there's too much unfiltered or non-curated information.
So I'm sure you've all heard the term infodemic, which has a lot to do with not just incorrect information but simply too much information coming into the system, creating a sort of breeding ground for people to see things that aren't true. It's actually harder than you think to define misinformation.
And so I always like to try to do this, but also to point out the limitations of some of our understanding of what misinformation even is. The usual way to look at this is that there are three sort of overlapping concepts: misinformation, disinformation and malinformation. You've probably heard misinformation and disinformation most frequently. And there is an overlap between disinformation and misinformation, in part because the difference between them has a lot to do with the intent of the person who's sharing the information.
So the traditional view is that disinformation is incorrect information that's shared intentionally to deceive people, whereas with misinformation the intent may not be there; it may be a mistake or misperception or something of that nature. But it is hard to really understand, especially in complex online environments, what the intent is of everyone who's sharing different kinds of information.
So we actually have to treat these problems together, and when we work on one, we really have to work on the other as well. So this idea that misinformation carries with it a malicious intent is not necessarily true; I think that's one of the common myths about misinformation. Anything from misperception to misunderstanding to malicious intent in sharing incorrect, or even misleading, information can count as an instance of misinformation.
There's also a perception that misinformation is a wealthy-nation problem, and this is definitely not true. We see misinformation cropping up all over the world; we see things like vaccine hesitancy happening even around diseases that are still very prevalent in certain parts of the world. There's a concern right now that the new malaria vaccine may not see good enough uptake, even though malaria is such a big burden of disease in many countries, because of vaccine hesitancy and misinformation. It's not just about the realities on the ground; it's about the information ecosystem and the levels of trust that people are dealing with.
There's also this idea that misinformation is fixed or constant, and this sometimes makes us not want to act, because we think there's no point, it's not going to change. But people's embrace of misinformation can be very fluid over the course of their lives, or over the course of thinking about a certain topic. People come in and out of believing and not believing misinformation.
So there are opportunities to move people in those realms. And then I just wanted to talk briefly about two of the SDGs that I think are most relevant, especially in this line of work around misinformation related to health. SDG 3 is about health and well-being for all. We know that COVID-19 has significantly affected progress in global health.
Global life expectancy is down, routine childhood immunization is down, and health disparities have worsened, many of these things as a result of COVID. Distrust and misinformation about vaccines is a huge issue now, and we know that vaccination is often the bedrock of creating healthy communities, so this SDG is very much at threat of not being met because of some of those issues. And then, as we mentioned before, climate action is another big one that requires a lot of cooperation, but there's also a lot of misinformation and disinformation, and incentives not to believe that climate change is actually a problem, from corporations like big gas and oil companies and from social media platforms that don't act enough on this issue.
And there's a lot of motivated reasoning, because the issue has become very political, and a lack of self-efficacy: people not believing that they can do anything about the problem. So again, there's a lot of misinformation, there's a lot of motivated reasoning around political ideas around climate change, and there's a lot of actual intentional incorrect information, or disinformation, in this area.
And again, it really threatens our ability to achieve this goal. That's where I wanted to stop today; thanks for your attention, and I'll pass it along to Becks. OK. Thank you, Sarah and Nicola. So, as the publisher on the panel, I thought I would start by reiterating some of the comments that Nicola was talking about earlier, in terms of the Publishers Compact in particular.
And the critical role that publishing as an industry, and that includes all of the folks attending this conference in different ways, has to play in achieving the Sustainable Development Goals by 2030. At the moment, around 300 publishers have signed up to the SDG Publishers Compact since it launched in October 2020, and there are a number of really important cross-publisher collaborations via initiatives like the Publishers Compact Fellows.
There are also some other really important cross-functional industry engagements. Obviously I'm coming from a PLOS perspective here, so I just want to give some sort of overview of where we as a publisher stand on the SDGs. This is using data from Dimensions, a tool that you can use to see how your publishing portfolio is doing in terms of publications that the tool has associated with the SDGs.
There are some limitations to this: there's double counting here, so just to be clear, this isn't one for one. And I think one of the important things to call out, and this is true of most publishing portfolios and, I think, of the whole corpus, is that good health and well-being is massively overrepresented in comparison with the other SDGs. We're looking at something like a 20-fold increase versus the others; it has its own little section over there, because otherwise everything else looks really teeny tiny.
But I think the other thing that's really important and interesting here is that you can look at the mean citations on the right axis: all of these areas are really being built on. The pieces of work that are being published are being taken up. Citations, we know, aren't a perfect measure of this, but we can tell that these pieces of work are being taken up and built upon. And I think particularly calling out No Poverty: it's a tiny number of manuscripts, really, particularly in our portfolio, and generally this is an area that could do with more attention, but those pieces are having an outsized impact; they're getting well reused. And, as Nicola pointed out, this is a really growing area, but it's also an area of huge importance.
But we do also, as publishers, have a responsibility, given the number of manuscripts that are associated with the SDGs, to really try to ensure that what we're publishing is trustworthy. I'm looking at this from the PLOS perspective, but also building on some of the comments that I've been hearing throughout the meeting. One way of building trust in research is providing more of the information that's behind it.
And when we signed the Publishers Compact only a few months ago, our CEO outlined open science as being really a critical accelerator for achieving the SDGs. In terms of what I mean by open science, a lot of people mean different things when they say this, so I'm just going to briefly say how I'm interpreting it. I generally use the UNESCO definition of open science; I find that it's really helpful.
This has equitable participation at its core, and it's a really nice sort of centering to come back to. Looking through that definition, I've picked out four really important points. For me, the goals of open science are to ensure integrity and earn trust; to foster collaboration, allowing people to build further on existing work; to accelerate research and dissemination; but also to enable everyone to participate in the process and benefits of science.
And when I'm talking about science here, I'm not just talking about STEM subjects; the scientific endeavor and way of thinking apply to a lot of associated disciplines as well. So this is something that I really think is very important, and I'm just going to read it out, so please read along with me: being open is really about more than just being able to read or share an article.
It's really about providing the right context to understand it, and I think that piece is so important when it comes to trustworthy research. We need to be making sure that the resources to replicate that research are open and available, and that the tools to collaborate and make science better are available to everybody. And this really is going to build that framework for more equitable participation and distribution of knowledge.
So I really do think trust and transparency can support us to deliver the SDGs. I have a little vignette from today: one of the editors from PLOS Biology got in touch with me. One of our authors, Rebecca Helm, is currently in Paris at the UN negotiations to end plastic pollution. She had a paper published in PLOS Biology in May, and the author came back to us and said, sharing our work with delegates would not be possible without your efforts.
It needed to be open for us to be able to share it in this way, and it's having meaningful impact at a UN delegation a month, or a few weeks, after publication. So I think it's a really nice context setter for how important some of what we do here is. You know, we're having an impact, and we need to keep that responsibility for trustworthy communication in mind. And with that, I'm going to pass on to Jay, who's probably going to terrify us. All right. Thank you. Thank you, Becks. So I'm not here to terrify, but.
Yeah, but hopefully this will be fun. So, misinformation is not new; that's the good news. The bad news is it keeps growing. It's really been around: I believe Samuel Johnson might have been the first one to use the word in writing, in 1756, when he was talking about, I forget who it was.
I think it was some Prussian king or something or another. So it's been around for a while; it was a problem even during the Renaissance 400 years ago, and it's a problem today. When I was doing a little bit of research on this, what I really liked was what Francis Bacon noted: that our minds lend more weight to affirmative or positive results than to negative ones.
And I think we see that in everyday life. We see it even in the publication industry, where not many people publish negative results, because who wants to look at negative results? You know, even online, it's a lot of positive things; you don't really hear people sharing stories of when they failed and fell flat on their face. So we do naturally gravitate towards affirmative or positive things.
And that lends itself very nicely to misinformation, because we want to believe that climate change isn't real, or that your house isn't going to be flooded the next time there's a hurricane blowing through, or that wildfires aren't going to burn your house down. We all want to believe that. But the fact is, it's happening. And there's always going to be a subset of people who are going to believe otherwise, because they can't look at it in a negative light.
They can't put their mind around the fact that, yes, climate change is real; that, yes, AI is here and it's not going away, no matter how much we wish it would; that there are negative things associated with the technology that we've seen over the past 30, 40 years. And this all helps fuel misinformation. I know. Scary, sorry.
Yeah, so the internet definitely makes it easier and faster to spread misinformation. Excuse me. I think when we look at Twitter, fake news travels six times faster compared to just good old real news. And surprisingly, 90% of US adults may not be able to effectively use health information. I mean, it's unbelievable: we have so much information out there, so many ways to help yourself manage chronic disease or avoid chronic disease.
But a lot of people can't access this information because it's not written in a way that they can understand. And I think that's really critical to fighting misinformation. I think open access is really useful; public health and patient advocacy are really important. But I think what is also very important is that we communicate in plain language, that we look at how journal articles are written, how abstracts are written.
Try to put that into a language that non-experts can understand, that policymakers and politicians and the public can understand, that practitioners on the ground, who are actually dealing with patients or dealing with their communities, can understand. I don't come from an academic background. I absolutely cannot read a research paper. I suck at it. I get maybe through the first paragraph and I already want to shut it off.
So what I've been doing for probably the past six, seven years is I summarize it. I use AI to summarize every research paper I read. Yes, I use ChatGPT, I use Bard; I did use AI summarization to go through research papers and summarize them for me, because it just makes it so much easier for me to understand. And then if I really want to spend half an hour reading the full paper, I will do it.
But it is very rare that I'll sit there and read a paper without summarizing it, or trying to get some sort of context pulled from it, using AI. Yeah, so now the fun part. Generative AI is something that I know has been talked about a lot at this conference, at CSCW, at STM, and it's really fun. I don't know how many people in the room have played around with ChatGPT or DALL-E or Midjourney, but I created this image.
I also can't draw; so I can't read research papers and I can't draw. So I asked DALL-E to make a picture of a rocket flying out of a laptop, and it gave me all these variations, and I said, OK, I want to make it look like anime. And it created this: really fun, but also really, really scary, because you can manipulate images, you can do text-to-video and deepfakes. You can also write complete research papers.
And we're getting to the point where you'll be able to write full books using AI. So I think it's really, really important that we as an industry are aware of the capabilities, the limitations and the threats from this sort of technology, which can really add rocket fuel to misinformation, and that we are aware of how to use it for good, but also aware of when people aren't using it for good and how to control for that.
So, some of the strategies to tackle misinformation. Regulation is key, and I know people have been talking about regulation of AI for some time now. There are even developers, experts and CEOs coming out and saying, yes, we should be regulated. And I think that's just bullcrap, because if they're calling for regulation, they know that they can't control this.
They should really be the ones to self-regulate, and then the government should come in and layer on more regulation on top of that. They shouldn't be giving up that responsibility to other people; they should be doing it themselves. The other thing is enforcing copyright. Open access is great. I love open access.
We built an app called R Discovery, and open access is our favorite, favorite thing for it, because we do so many cool things with it, like using AI to summarize articles for people like me who can't really read a research paper, or using AI to translate. So if you happen to have English as a second language, you can translate an article into 25 different languages right in the app and read it. We're using AI to do all sorts of audio translations as well.
But, you know, these are things that we can all do, and these tools are available, and we should be testing them and playing around with them, trying to figure out how we can apply them to our publication processes, and how our authors and readers and members can utilize them for their work and their interactions within the community, because this will help fight misinformation.
I think the other important thing is watermarking for provenance: proving that you actually created it, that you actually own it. That's really critical. There are things like Project Origin and Adobe's Firefly. I don't know if you've seen videos of that, but it's really cool; it does a lot, it makes editing pictures very easy, but they are also layering in Content Credentials.
So you know it was created using AI. And I think the last and most important thing is education and training about fact-checking and verification techniques. It's really important to teach kids, and to teach the public, how to read things that they see on social media, how to read things they see on websites, and how to look for clues that something could be fake news.
And then to be able to go find the correct references that either prove or disprove it, because it's extremely easy just to press like and retweet, but it's much harder to actually sit there and do the evidence checking and disprove it. All right, so that's all the scary stuff. And I always like to, and I think Nicola and Becks know this, because this is my favorite slide in the world now,
but I always try to end on a positive note. The public really does trust scientists. They want to hear from scientists; the pandemic certainly helped increase trust in science and scientists, and they really do want to hear from us. They want to hear from the authors, they want to hear from our members, but they want to hear from us in plain language.
They want to hear from us in a way that they understand. And if we fail to communicate in plain language, that vacuum is going to be filled by people like Joe Rogan, or any of those jokers at Stanford or UCSF who think that just because they have a PhD, all of a sudden they're an epidemiologist or an immunologist, when they don't even know what they're talking about. Also, people are standing up for science; people are saying, no, that's not actually true.
The COVID vaccine isn't loaded with a microchip by Bill Gates. And no, human activity does cause climate change; it's not just happening because we're getting closer to the sun. So, yes, I think it's really, really important that we understand that people trust us, people trust scientists, and that we need to communicate in very simple language to them.
Thank you. Thank you, all of you. That was a really, really fantastic layout of the issues that we're facing here. Sarah, thank you for so clearly explaining what we're dealing with and the work that Critica is doing in this space. Becks, it was really fantastic to get a perspective on the importance of open access here and the impact that our work can have.
And Jay, thank you for ending on a hopeful note rather than a scary one. You know, the scientists that are trusted by the public are our communities, and it's really incumbent on us as scholarly publishers to make sure that we uphold that trust. So I'd like to turn it over to questions from the audience now. I can see we have one already; if you'd like to come to the mic, that would be really helpful.
Hi, thank you very much. It's a very interesting panel. Unfortunately, even though it's late in the conference, this is the dreaded more-of-a-comment-than-a-question. I promise it's not going to degenerate into a short speech, but I was listening for a word that I didn't hear in this, and that word is distrust. Because we talk about trust, but the opposite of trust isn't really no trust or neutrality.
It's distrust. And I think it's important to recognize that, because it's a larger gap to bridge from distrust than from no trust or neutrality. And I think what comes of this as well is an approach, because I think it's very easy to concentrate on a sort of cognitive account of trust, that we can reason ourselves out of this dilemma, but really we're forgetting about the affective dimension of it.
And this was alluded to, about how people don't want to believe that climate change is real, or things like that. But I think focusing much more on the emotional or psychological response, and not just thinking of it as some kind of game theory example, is an approach worth considering. Thank you. Don't you think?
Now it's a question. Thank you. OK, who on the panel would like to tackle that first? Thanks for asking that. I didn't give the title of my new book, but it's actually Vital Signs: Distrust, Democracy, and Public Health in America. So there you go. I think that's absolutely true.
And there is some scholarly debate about what the difference is between distrust and mistrust; there's a lot of literature on this, and a lot of people have thought about what all these phenomena really are psychologically. And a lot of my work on misinformation has focused on psychology, both individual and group psychology, because it is true that the idea that there's a knowledge deficit, and that's what's driving misinformation, is just not really that defensible.
It's true in some cases, but for the most part, this kind of crisis that we're facing with misinformation and distrust has much more to do with psychology and group social dynamics than anything else. So there's not much more to say than that I agree with what you said. And I don't think that the world has really come up with a great solution to how you undo distrust when it actually does occur.
And, you know, sometimes within the misinformation world, we talk about just the movable middle: just focus on the people in the middle, not the people who are extreme, like RFK Jr., who think vaccines are poison. They'll never change their mind. And I don't know if we're going to get to that place of saying that people who have this kind of strong distrust may be beyond, you know, the ability to bring them back over into the world of trust.
But hopefully there is more that we can do. Thanks for your question. Yeah, I think that tees up, you know. It's really hard. As scholarly publishers, what we can do is. Is that not? Is that working now? Oh, it's really alarmed, it's screeching at me.
Hopefully that's now working. I think one of the things I've been trying to focus on when thinking about this is what we can do as publishers within scholarly publishing, and I think what we can do is focus on providing folks with those signals of trust. And 57% of US adults say that they trust scientific research findings more if the scientists make their data publicly available.
It's a really strong signal of trust for those folks who are already engaging with that work. And plain language summaries, as Jay's already spoken to, are another way of making sure that we're communicating with people in language that they can really engage with and that is accessible. So I think those are things that we can do, and I think that's almost the hope there: we do as much as we can, as effectively as we can.
And then we do need to rely on other parts of our broader global community to support us in changing minds and making these messages really clear. And that does come from politics. And that's going to be very different globally in different regions, and it's going to land very differently with different communities, and I really don't have an answer for that.
I don't really have an answer either, but I must say that I think where mistrust and distrust really flourish is that the people who are pushing them tend to do it in a very entertaining way. And they tend to use language that people who may not know how to read the science, or understand it, or find the evidence, can easily understand. So I think that really helps them spread their message and their influence further than a scientist who might know what they're talking about but doesn't communicate it in a very entertaining way, like on a podcast or a TikTok video or a blog post.
So I think that's really where maybe we also need to get to: upskilling the ability of scientists, editors, and authors to communicate in a more accessible and entertaining way. And I don't mean do a dance on TikTok or anything, but go where the audience is. Don't pull them to where you are; go to where they are and communicate to them in a language they understand.
In a way they understand, and I think that may be a way to fight misinformation and also maybe get people from distrust to trust, ultimately. Thank you. Do we have any other questions from the audience? Yep. Um, hi. So this is a pretty focused question. I'll direct it to Jay, but it might be for anybody. And it's about Twitter.
Um, so love it or hate it, I think Twitter has been and continues to be a really, really important tool for disseminating information. And I was really interested in the stats that you showed, Jay, about how quickly truth versus untruth spreads on Twitter. And I'm curious, because I think that stat was from 2019, and given all of the changes, some not for the better, that have happened on the platform recently, do you have any information about how that's been impacted?
I'll have to look into that. But I'm very active on Twitter and I love it. I mean, I love the platform. Um, you know, I try to follow the right people. I try to block the people I don't care about, which are many, you know. But I think it has become a little bit worse. I think a lot of good people have left Twitter that I really enjoyed following and getting updates from.
So yeah, I'm on Mastodon too, and I'm on TikTok, but my feed is full of DIY woodcraft and cute animals and gardening tips. But I do follow some scientists on there, and some education folks. Um, but yeah, I think the problem with social media is that once you show an interest in something, it keeps feeding you that same thing again and again and again, unless you actively decide to change what you're interested in.
Um, and so for people who are already susceptible to distrust or misinformation, once they say, OK, I'm going to follow this person, then other people who fit a familiar pattern start getting recommended to them, and content that has the same pattern gets recommended too. So the algorithms are meant to make you click and like and share and read.
Um, so I think, as a user, you have to make a real conscious decision to follow the right things and to shut down the things that you shouldn't be following, or reading, or retweeting. So it is a bit of a user decision, but it's also just the way the platforms are geared: to get you obsessed with the platform and to sell you more stuff, or to serve more ads to you.
Um, at the end of the day, the platforms themselves don't really care about real news or fake news, misinformation or real information. Um, but having said that, I do think we can do better as an industry to be active on those platforms, and to encourage folks to be active on those platforms and to share. I think I said that in yesterday's panel, but I think we need to share more of our stories.
Um, you know, why are you in publishing? Why do you work for a journal? How do you work for a journal? How do you read a paper? We need to share more of those stories and get people more comfortable with the work that we do because it is really vital. Yeah, right.
Hi, Roy Kaufman, Copyright Clearance Center. Um, so I'm sort of building on some themes that Sarah and Jay discussed, but I think it's more of a question for the publishers. So there's a lot of information out there, and we're only going to get to trustworthy, reliable AI when the training data is good training data. And it has to not only be good but, as Jay was saying, you have to know what it is.
It's great to use good data, but if you can't say, I used good data, and this is the version, and this is the metadata, um, you're not going to get to the point where people trust the outputs, or, frankly, where the outputs are reliable. So I'm struggling with this concept, particularly around, for example, the versioning of journal articles.
So within open science, we've got preprint servers, we've got lots of versions out there. And I think a lot of the higher-end users want to know, hey, I got the version of record from PLOS, I got it from here. But because we make it easy, for all the right reasons, to also have these other versions out there, they're out there under CC BY or an open license, and they're on a preprint server, and it doesn't have all the things that the user wants, and then that's what gets picked up.
Um, it's a sort of tension that I think we're creating for ourselves as an industry. Are you thinking about this, and how are you thinking about it in this context, if at all? If that makes sense at all. I think the outputs within the scholarly literature really are changing. And I do think that one of the things we can think about, and it's not necessarily the full answer, is.
Those signals of trust. So I think, as you were referring to, you've got your preprint. Now, if you want to put something on medRxiv, it actually has to go through quite a lot of checks; it's not just that everything gets chucked up there. So a preprint on medRxiv has gone through a certain number of rigorous checks, you know: has it got the clinical trial listed?
Has it been done ethically, before it can go up there? So within your preprint, you've already got some checks that have been conducted. Then you've got the peer-reviewed manuscript that gets published, and that's maybe linked out to something that's on GitHub, and also linked out to something on, um, GenBank, if it's like a viral genome or something. And all of these repositories have got their signals of trust; they've got their checks, they've got their balances. But I don't think we're communicating that well within the version of record.
You know, we've got, and I think it's a really great thing that we have, and many other publishers have this as well, the open data descriptions, hopefully with really rich metadata that links out. But I don't think we're doing a very good job of actually explaining to anybody, outside of the person who wrote it or who could reuse it, what that really means.
So 57% of US adults say they believe more in the research if it's got all of these signals of trust in it. But are we actually educating anybody to understand how to interpret those signals of trust? A bunch of links in an article shouldn't, on its own, earn any trust. What is it linking out to? How is it described? Where is it?
What checks have those pieces of research output undergone to be put there? And I think some of that trust comes from the fact that it's linked out from a peer-reviewed article, and you hope that your peer reviewers have clicked through to all those links and checked that it's been done appropriately. But we're putting a big ask on our reviewers if we're not also tooling up the people who are reading these manuscripts.
Elizabeth, what was it? Something horrendous, like 600 pieces of supplementary material or something, in that Science paper? It's entirely unrealistic to expect peer reviewers to have checked all of that. So we need to be making sure that the people who are clicking through to all those things have got at least some tools, some understanding, or some descriptors to help them understand it.
And it's quite often complex scientific detail. I mean, Jay, you said you don't understand all of the manuscripts you open, and I'm a biochemist; I don't understand the climate research that I open. You know, you're relying on all of these signals that you've been taught through your career to pick out those ideas of trust, those bits that can translate over from my research career.
But you shouldn't have to have a PhD to believe that climate change is real. And I think we need to get better at explaining. And it's not just plain language summaries; it's also explaining how to interpret these signals of trust, how people can actually go and look at the work that's been vetted, wherever it's been published, and decide, you know, this is something that I can rely on, and now I can read the plain language summary and really believe that what's described in it is coming from a place of solid research and evidence.
I don't think we're doing it brilliantly at the moment, though. I'm also going to add something as a publisher up here, because I think the word that you probably said the most there, Beck, was "link". And I think that is really the key point: to make sure that all of these different facets and aspects are linked together, and that it is clear when you're looking at a preprint.
Did that then go on to be published as a peer-reviewed article, and how did it change? And, you know, are the reviews available? Are they public? Is the data public? We saw in COVID the impact of un-peer-reviewed preprints. You know, we have a very recent demonstration of the consequences of that.
So we need to make sure that the linkages are clear in all directions. And again, as Beck said, we need to do more as an industry to encourage kind of scientific literacy beyond our industry. And I think we are well placed to do that. We're a group of people that really understand the importance of this and are communications professionals.
Before I take another question from the audience, I've been looking at the results of the polls, and there's a word that's come up a couple of times in response to the question about what people see as a key issue when it comes to trust and publishing for sustainable development, and that word is inequity. So a couple of points were made about inequitable access being a particular issue, and about bias in research integrity leading to inequities.
So I would like to ask my panel to comment on this aspect. So I wanted to speak about inequity in terms of something very particular, which is that access to health care actually drives trust. People who have less access to health care tend to have lower trust in the health care system, even in the government safety net surrounding the health care system.
So we do have to understand that very real world inequity is part of the problem here. I would also say that I neglected to mention before that one of the really key things about the information ecosystem is that it's not equal. So people have different kinds of access to different kinds of information, but also people are targeted differentially by disinformation peddlers.
So there are bad actors here who actually target people, especially vulnerable populations who may already have very low levels of trust in health care and government, and really just prey on that and make it seem as though those ideas are correct, that their distrust is really well founded. And so it becomes a little bit of an echo chamber. And it's ultimately very dangerous, because it does lead to real-world changes in health behavior, et cetera. So I think it's very important, especially in the wake of COVID, which we all understand has created even more inequities, to really understand that the information ecosystem and the problem of misinformation are not created equal for all populations. I think a lot about equity in my role.
One of the, you know, one of the key things that PLOS and many others attempted to do more than 20 years ago was to open up the scholarly literature so that anybody could read it. But there have been some really strong unintended consequences around who is then able to disseminate that literature and get their information out there. And I think we really need to think carefully about who is involved in those conversations.
And part of building some of this trust is that people can see that voices like theirs are part of this literature. So recognizing that people within their communities, researchers within their communities, are able to publish this work; it becomes a more real, more tangible piece of information if it's specifically about their community, published by their community.
And they can see, again, all of these elements of trust. And I think, you know, the APC has been a really big problem with that, because it's a big barrier to people being able to publish their work openly. Um, I know there have been conversations here, and there are a lot of conversations ongoing, about how we can rethink our business models to ensure more equitable access to being part of the literature, versus just being able to have access to it.
Um, and I hope that that is going to start to really change the spread of the literature and the range of voices that we're hearing within scholarly publishing, within our industry, and within the content that we publish. I think it's something we all need to grapple with and think about. Yeah, just really quickly.
So I think inequity is really critical, and equity is very important to all the SDGs, no matter what goal you're looking at. And there is some really good work being done on the equity front. I mean, I know Springer Nature, Elsevier, PLOS, and a whole bunch of other publishers have done blanket waivers now for countries that are low- and middle-income under the World Bank classification.
That's a great move. Now you're going to be able to allow researchers from the Global South, from low- and middle-income countries, to publish their research without having to worry about, can I finance this? There's really great work being done by Research4Life when it comes to providing access to what used to be paywalled information, for institutions and members of those institutions in low- and middle-income countries.
Um, and there are apps now. I mean, we have an app, and there are other apps, like the Researcher app, and ResearchGate has an app, where you can deliver your information directly to the reader, where they are, when they want it, versus pulling them to a location or getting them in front of a computer to VPN into an institution, in a country where maybe it's not possible to get to the institution, or maybe the institution doesn't have VPN or IP capabilities. But if they have an Android phone, or maybe an iPhone, but most likely an Android phone, it doesn't have to be something that costs $1,000, just something that can handle installing an app.
Now they're able to find that information. If they have a browser, of course, they can go to Google Scholar. Um, so yeah, I think there are ways to bring about equity, and there are certain things that publishers and initiatives like Research4Life are doing to really help foster that. Thank you. Do we have any other audience questions?
Then I'm going to pose one last question to you, and it's around urgency and timeliness and misinformation and sustainable development. One of the things that was explored in the roundtables that I talked about was that the global response to COVID was mobilized so quickly because it had to be; the virus was spreading at such a rate that we had to keep up. The climate crisis, although it's becoming very urgent, is not operating on the same kind of time scale. Jay also talks about how, you know, a lie can be halfway around the world before the truth has got its shoes on. And these two books have both come out, or will both come out, in a US election year. So there's an issue of timeliness and urgency here, and I just wonder if anybody would like to share some thoughts on that.
Or, to put that as a more succinct question: how do we make sure that we respond to urgent crises in a timely way and in a trustworthy way? Um, so I think it's difficult, because on the one hand, especially with something like misinformation, we struggle with this problem of talking about it so much, to the point where people think, oh, it's the norm to not trust, to take information for granted, to believe things that aren't true.
And that sometimes compounds the problem. The more we talk about how people don't think climate change is real, the more it seems like the vast majority of people think that, when that might not really be true. So we have to be able to create the urgency without making it seem like the negative thing is so pervasive that more people sign on to it because they think everyone else thinks that too.
I will say, I think for something like the response to COVID and misinformation, I am seeing, for example in the US, that a lot of the funding structures are really misaligned. So a lot of individual health departments were getting funding during the pandemic to deal with misinformation, and as soon as the emergency was over, that funding was pulled. So I don't know how you get lawmakers to actually do this, but there obviously needs to be a sustainable funding architecture for dealing with issues like misinformation, which are pervasive public health and public safety problems even outside of a pandemic.
So I think, you know, that kind of structural level is really important. Yeah, I think that structural thing is one of the things that we could use. Um, and the example that you gave at the beginning, of the growth of research on the sustainable development goals, I mean, there's a kind of incentive for us all.
You know, we're interested in publishing the things that people are writing. You know, we need to get ourselves organized so that we can do it fast: so that we've got the editorial boards in place to properly assess those manuscripts, so we're engaging with the early career researchers who've maybe got a little bit more time to review the papers, interacting with specific preprint servers, like EarthArXiv, to disseminate that research as quickly as possible, and then obviously linking in and integrating the version of record and the other research outputs.
Um, I think, you know, as publishers and within the scholarly publishing industry, we can be intentional in our portfolio development, thinking about how we can grow and how we can support the research community to really disseminate this research in a findable way, so it's truly discoverable and properly integrated, so that we can speed this up carefully, with everything connected together.
But, you know, there are a number of partners. I do think that one of the things I've learned really fast this year, after joining the Fellows, is how many cross-industry initiatives are happening to really try to start to get at this. And I think, yeah, it's definitely going to be one of those things where we're going to be stronger together in actually doing this. But, as you say, in a trustworthy way, not just racing at it.
I was told there wouldn't be any tough questions. But no, I mean, I would say, I think COVID was definitely an emergency. It was like, oh my god, I'm going to get it, it's not really good, it's going to cause damage to my body, maybe I might die. So I think a lot of people acted very quickly.
The thing with climate change is, when you look at most of the reports, it's like, oh, by 2050 we have to hit this number, or by 2075 it's going to be two degrees warmer, or by 2100 half of New York is going to be under water. And most people are like, that's too far ahead to think about. Most people don't think that far ahead; most people don't plan that far ahead. So I think it needs to come down to more short-term things that might change, and we need to communicate it in a more short-term way. Like, you know, I would say it's better to communicate that every year we get a little bit warmer, and you're going to spend more money on running your AC. Every year the weather patterns get crazier and crazier: you might have an ice storm or a blizzard in areas where maybe that's really not expected, or you might have floods, or you might have long-term droughts.
So I think it needs to be more about focusing on what's happening now and in the near future, versus talking about what's going to happen in decades, and I think that might lead to more action. But it seems to me, at least when everyone talks about this, everyone is saying, in the future. And I think it kind of pushes off the urgency to act on the part of the public and on the part of the policymakers, because they're always like, we'll fix it when it happens.
You know, we have 10 years, 20 years, 30 years. And I think we just have a tendency to respond to crises when they happen, rather than planning to avoid the crisis. So I think it's really important to maybe focus more on the short-term stuff. And I guess, when we publish research, or when authors publish research, they might be looking far out.
But it might also be an interesting way to say, well, OK, what's going to happen in the short term? What's the worst-case scenario in the short term versus the worst-case scenario in the long term? And then maybe say, how do we avoid the long-term consequences by taking action now? Thank you. I was going to ask you each to leave us with a final thought, but I think we are actually at time now, so I will not do that.
But if anybody is interested in our panelists' final thoughts, please come find us in the break. Thank you all for showing up after lunch on a Friday. Thank you to my panel.