Name:
Double Trouble: Inappropriate Image Duplications in Biomedical Publications
Description:
Double Trouble: Inappropriate Image Duplications in Biomedical Publications
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/9d01cc5c-4116-4a79-baa3-68e7bc31c4cd/videoscrubberimages/Scrubber_741.jpg
Duration:
T01H18M39S
Embed URL:
https://stream.cadmore.media/player/9d01cc5c-4116-4a79-baa3-68e7bc31c4cd
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/9d01cc5c-4116-4a79-baa3-68e7bc31c4cd/double_trouble_inappropriate_image_duplications_in_biomedica.mp4?sv=2019-02-02&sr=c&sig=wOQ9yiauZxGz2JKMK6H4J%2Fb2rynCItM%2B7jSurOCEF7g%3D&st=2024-11-20T01%3A10%3A19Z&se=2024-11-20T03%3A15%3A19Z&sp=r
Upload Date:
2024-02-02T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
Everybody, welcome to Portland. I'm so excited to have you all here. It's just been a wonderful, wonderful experience for me. I've been here already for a few days and had the pleasure of going on a wonderful hike with Alice Meadows and Gabe Harp. So if you want to see our pictures of the waterfalls, we'll be happy to show you.
And don't worry, we'll all tell you different distances on the miles. So it was a lot. Um, so I just want to welcome you to the 45th annual meeting, and I'm happy to have you all here in Portland with us. I want to take a quick minute to recognize our sponsors, and we want to encourage all of our attendees to visit the exhibit hall.
For our virtual attendees, we will be live streaming for the first half hour, so please stay around and get a chance to see some of the faces that are here in person. A super, super, super special thank you to our program chairs Lori Carlin, Tim Lloyd, and Emily Farrell. Um, amazing, amazing job they always do. A special thank you to Lori Carlin, who after four years will be rolling off the committee.
So I know we're in a hurry, we have a lot to cover, but can we give Lori a big thank you? For those of you who attended last year, you know how hot the Whova app is. So if you have not downloaded the app, you are missing out. Please take a minute to go ahead and connect with that, and feel free to post in chat.
And if you're interested in going on one of the runs: I heard some of the runs are getting longer and longer, so if you decide you want to go on one of those long runs, I will meet you downstairs with water when you finish. It's a great opportunity, and you can also connect with the virtual attendees. And to all the virtual attendees, thank you so much for being here. I look forward to interacting with you throughout the conference.
We do have Wi-Fi: the network is SSP 2023, and the password is Aries 2023 with a capital A. Please remember, as always, to be respectful and turn off your mobile devices, keeping in mind that this is a meeting environment that fosters open dialogue and the free expression of ideas, free of harassment, discrimination, and hostile conduct. Creating that environment is a shared responsibility for all participants.
Please be respectful of others and observe our code of conduct. If you need assistance during the meeting, please stop by the registration desk. Recordings of all sessions will be available in the app within 48 hours. So with that, I want to invite Laura Moulton to come to the stage to introduce Street Books, an organization here in Portland that we're supporting through our core values. And I just want to give Laura a minute to share a little bit about her efforts here in Portland. Thanks so much, Miranda, and thank you so much for giving us a few minutes just to say hello and to tell you a little bit about Street Books. My name is Laura Moulton.
I'm the executive director, and Street Books has been around since 2011. We are a bicycle-powered mobile library focusing on people living outside and at the margins in the city of Portland who may not access the mainstream library. We started as a three-month art project. It was just me, and I imagined that it would be neatly curtailed by August.
And in fact, it's grown and thrived, and we've been an official nonprofit since 2013. At this point, we have three of our bike libraries, and you can see one downstairs set up with our table there. We'll be there till seven tonight, and hopefully on and off the rest of the couple of days. But I want to say, one of the things that I did not realize when I set out in 2011, having written a grant to go and do this thing, was how hard it would be to actively look for people who were sitting outside, who had clearly slept outside, and invite them to do this art project.
I had a lot of fear suddenly about whether that would even be useful to folks. And over time, I very quickly realized that it was useful, and people valued that hello, valued being greeted by name, valued the fact that I would look for a book for them and come and bring it to them the next week. And so that's one of the things that we offer. We have nine librarians now.
We run 10 shifts a week, five days a week, in different parts of our city. We take requests from people and then we find them the next week: we show up same time, same place, so that it's easy to find us. And one of the stories that my friend Hodge loves to tell — he's with us right here. He's a man I met in 2011 at Skidmore Fountain; he came up and inspected the bike library and was very suspicious because there was no P.G. Wodehouse.
He was like, what kind of librarian doesn't have Wodehouse? And it turned out Hodge was living outside at the time in Old Town. He worked his way indoors, came to work for Street Books, and began to run a shift in the very area where he used to sleep out in Old Town. And one time he called me after a shift and said, Laura, you're going to love this.
The guy turned down the copy of Nietzsche not because he didn't want to read Nietzsche, but because it was the wrong translation. And so we knew then: all right, we're mattering. Our work is landing. And so I just want to say thank you so much for including us and for helping us spread the word. There are ways to support Street Books.
If you are coming from out of town: we're finishing a spring campaign in May, and we always take financial donations. Many of you have brought books to us that you've carried around all day, and I really appreciate you for that. Someone brought books from Boston, so kudos. If you're bringing books from long distances, we really appreciate it.
If you're local, we have a Monday event, which we can tell you more about down at our booth. We're launching our summer shifts outside the Portland Art Museum, and you're welcome to come and eat cake and celebrate with us. So thank you again so much. I really appreciate it. Have a great conference.
And on a personal note, my husband told me, don't shop, because you can't carry things back. And I said, oh no, I have a suitcase full of books that will be empty when I leave. So I just urge you all to stop by the Street Books table outside the exhibit hall and to donate your books and anything else you find appropriate. They'll also be selling their book, Loaners, which is about the making of Street Books, with the proceeds going to support the program.
So thank you so much, Laura. And with that, I'd like to introduce Stuart Maxwell of Scholarly iQ, our keynote sponsor for today. Hi, I'm Stuart Maxwell from Scholarly iQ. It's great to be here. As a sponsor, I should probably be saying something about 20 years in usage analytics, but I'm not too sure how much people really want to hear about that.
Um, no, I thought not. Um, so as well as the program, events like this are also about meeting up with people. So I've got about two minutes, and what I'd like to do is just give that back to everybody as a warm-up. Just take the opportunity to say hello to somebody next to you, preferably maybe somebody you don't work with, or, you know, just warm up, get used to some of the networking.
Anybody you haven't met beforehand, please just take the opportunity to say hello, introduce yourselves, and get ready for the week. Good, that helps fill in my time. Great, oh, we've got lights as well. Thank you. There we go.
That's a much better start. That's nice, to hear people talking again in a large group, isn't it? OK, I'm going to have to cut that short now, which is really mean of me, I know, but I hope you'll continue and have lots of other time together over the rest of the week.
I've set you all off, haven't I? So the other thing I wanted to share is a thought I've been having over the meetings today, something more general about publishing. It's related to the drive, I think, that we see a lot, to look at publishing as more of a cost service, where we are focused on what the costs are of what we do.
And some people, I won't say who, are trying to say we should only be paying for those services. And I just want to voice that this concerns me. It's something which I think does a lot of disservice to the talent of the people that work in this industry. I think treating publishing as a utility is a mistake. There's a lot more that comes through in quality service, quality control, building reach of audiences and communities, and if we had more of a flat-pack utility service, I think it would be really quite damaging.
And I think it would probably be more dangerous to researchers; I think they would miss out if we didn't have the kind of competitive marketplace that we have in publishing. A drive toward a more generic, paid-for-elements, nuts-and-bolts service would, I think, actually be more dangerous to publishing, and actually to research, than maybe what some people are trying to gear for.
So whilst we're thinking about talking to others, what I'd really like to say is that I would like to try and meet as many of you as I can over the next few days, and I'd like to hear your thoughts and ideas about how we can help make better communications and better messages for those people about the wider values that publishing delivers: some of those intangibles that can't be given a cost element.
But you know about the people within the industry, the talent and the expertise that's within it and those other services that we provide. So whilst I'm meeting you, I hope we get the chance to talk about that. And that's all I really wanted to say and just that if you're interested in that, then I'd love to talk to you and I hope you have a fantastic few days.
And thank you for listening. I'll hand back over and leave you to enjoy it. OK, thank you. I'd like to introduce the annual meeting program committee co-chair Lori Carlin to introduce our keynote speaker.
I feel like I missed the memo that said there needed to be a joke to start out. I don't really have one. I do like the room setup, though; it's much less threatening with these round tables. Good afternoon, everyone, and thank you for joining us here in Portland. And those of you that are joining us virtually, welcome to the SSP 45th annual meeting.
On behalf of myself, my co-chairs, Tim Lloyd and Emily Farrell, and the entire annual meeting program committee, we are thrilled to have you here and welcome you to the meeting. We have a wonderful program in store for you over the next couple of days, all built around our theme of transformation, trust, and transparency. With this theme in mind, I am honored and excited to have the pleasure of introducing our conference keynote speaker, Dr. Elisabeth Bik.
Dr. Bik is a renowned microbiologist turned research integrity expert who has made a significant impact in the scientific community through her work in detecting research misconduct in scientific papers. With a PhD in microbiology from Utrecht University, Dr. Bik has had an illustrious career in the field of scientific research, including 15 years as a researcher at Stanford University, followed by two years in industry, before transitioning into the important role of scientific integrity consultant, where she has gained a reputation as a science detective with a keen eye for identifying fraudulent images and other forms of misconduct in scientific papers.
As a preprint commentator on issues related to research misconduct and scientific integrity, Dr. Bik has become a leading voice in the field. She can often be found discussing science papers on Twitter, curating Microbiome Digest, writing for her blog Science Integrity Digest, or searching the biomedical literature for inappropriately duplicated or manipulated photographic images and plagiarized text.
Her work has been featured in Nature, Science, The New Yorker, The New York Times, The Washington Post, The Guardian, and The Times (UK). She has reported nearly 7,000 papers for issues of image duplication or other concerns, and her work has resulted in over 1,000 retracted papers and over 980 corrected papers.
For her work on exposing threats to research integrity, she received the 2021 John Maddox Prize, which has been awarded annually since 2012 to individuals who have shown courage and integrity in standing up for sound science and evidence. We are privileged to have Dr. Bik with us today to share her insights and experience and to offer her perspectives on the challenges facing our scientific community. So please join me in welcoming Dr. Bik as our keynote speaker to kick off the 45th SSP annual meeting.
So today I'm going to talk about Double Trouble: inappropriate image duplications in biomedical publications. And yeah, as you already heard in the introduction, I am a science integrity detective.
I look for images in scientific papers that might have problems. Before that, I was a microbiologist, so I have worked in science, but currently I'm not employed; I'm sort of self-employed, and my work is crowdfunded. I get a lot of donations for my work from people who want to support it; they might donate $2 or $5 or so per month. And that gives me just enough money to keep on working and doing what I do.
And because I'm not employed, it actually might give me a lot of freedom to say what I want and perhaps, you know, rock some boats and make some people uncomfortable. But I do feel, as a scientist, we need to talk about suspicions of misconduct or even just honest errors. And so the theme of this conference is transformation, trust and transparency. So I thought about these three themes and how I can talk about that related to science integrity.
And first of all, thinking about transformation, we can think about how scientific publishing has changed recently. I included here a graph, in the bottom left of this slide, of how many papers are published each year. What is pretty obvious is that in the past two or three years there has been sort of an explosion of papers, mostly about COVID-19, and the number of papers that are published every day is enormous.
And I think it has gotten to the point where I feel I cannot keep up with the literature, and I think that feeling is shared by a lot of other people. Scientific publishing is just overwhelmingly big for a scientist trying to absorb all of it. So that's one of the transformations. But there's also an increasing pressure to publish: certain countries have regulations about how many papers you should publish in order to get, for example, your M.D. or your PhD.
And that puts on a lot of pressure, and with this pressure to publish comes perhaps a rise in science misconduct. And so we can talk about trust. How much can we trust the papers, the manuscripts that are sent to us as publishers? Or how much can we, as readers, as scientists, trust the papers that come out? Is all the data in there still valid? And then finally, transparency.
How transparent are the authors? How transparent is the data? But also, how transparent are investigations into misconduct? And I hope to touch upon all of these points in the coming slides. So yeah, scientific publishing has transformed a lot, with increased pressure to publish. I already mentioned that there are now over 300,000 papers on COVID-19.
Do we need all those papers? How many of those papers are going to be valuable in the end? I think there are a lot of papers that will never really be of any importance in science. But yeah, they get published and we sort of need to read them. A lot of those are reviews or opinion pieces. Papers also get more complex: in the bottom left I have a paper published in Science with 61 supplemental figures and 93 tables.
Who is really going to read all of that? As a peer reviewer, you know, you cannot really do that justice; it would take like a year to really properly review all of that, and it's just too much. Peer reviewers are harder to find. And with that comes also an increase in allegations and suspicions of fraud.
So that is what part of my talk will be about. For me, publications, the way I see it as a scientist, are the foundation of science. We build on each other's work, and science is about finding the truth. That is, for me, the core of what science should be. And we can have a long discussion about what the truth is. And, you know, finding the truth is never easy.
And science is endlessly complex; we will never really find the truth. But I do feel as a scientist that we have this perhaps noble mission to report on what we find truthfully. Publications are the way that scientists communicate with each other. We build our research on the work of others. Scientists never publish just for themselves; we publish so that others can build on our work.
And as we build on each other's work, I sort of see publications as bricks in the wall of science, where each publication is a brick upon which other papers rest. And so if we build this work on trust, that will work really well. But if a paper contains misconduct, contains fraud, that could mean that part of this wall could come tumbling down.
Other people could try to replicate that work, but if it's fraudulent, they might not be able to. And so science fraud could lead to a lot of wasted money and time and effort. Unfortunately, we tend to all trust each other's work, but there seems to be an increase in the amount of fraud that can be found, and science is not immune to that. I was shocked the first time I heard about science fraud, because I thought, you know, every scientist is this really honest person.
That's at least how I felt. And I was shocked to hear that people would cheat in science. Cheating in science, for me, is actually not science, because science is about finding the truth. If you cheat in science, that goes against everything that science should be. And that is sort of what drives me in finding potential problems in papers.
So just a couple of definitions, and maybe most of you are already familiar with them. There can be many things wrong in a science paper or a science project, which we could call questionable research practices. But there's also outright fraud in science. Science misconduct in the US is defined as one of three things. Plagiarism: copying each other's text or ideas without properly citing them or giving credit.
Falsification: where a person does an experiment but changes the results. If you just make this value a little bit higher, it turns from a negative into a positive, and now my results look much better and, you know, could be publishable; but that would be falsification. Fabrication: where a person completely makes up results. Just typing some numbers into an Excel spreadsheet might create the beautiful graph you always wanted.
But no measurements were actually done. And behind each misconduct case there is a sad story. As I work in science misconduct, a lot of people always ask me, oh, which paper is that from? Or, that person is a fraud and they should be in jail, or something like that. I don't want to do that, because I realize that what drives people to commit misconduct is probably a very sad situation.
Why do people commit fraud? Maybe they work in a lab; maybe they're an early career scientist who has a bully as a professor, a person who's very demanding and who wants the results to look the way he or she thought they would come out. And they might call that very young person all kinds of names. They might insult them; they might threaten to fire them.
And if you are, for example, working in a lab in the US but you're not from the US, you're on a visa. If you get fired, that means you need to leave the country and go back to your home country within, I think, a very short period, let's say weeks. And that is a big threat if you're young and maybe you have a young family and you need to leave the country without a nice publication, without a letter of recommendation. That is putting a lot of pressure on these people.
And if you're young, and perhaps you don't quite know how science is being done, and your bullying professor tells you this is how we all cheat in science, that's the only way to make the experiment work, then you might believe that, and you might tend to Photoshop something or change the results in some other way just to please the professor. And so there are always multiple names on a paper, and it's not quite clear who really is responsible for misconduct.
Yes, perhaps one junior person might have Photoshopped something, but the senior person in a way is even more responsible. They're responsible for mentoring, for creating an atmosphere of trust and reliability in the lab. And so all authors on a retracted paper will be damaged, not just the person who did it; other people in the lab, or at that university, or perhaps even in that country, will be held accountable for a particular retraction.
And so I'm trying to be respectful of the persons who might have committed misconduct. I also realize, and Holden Thorp, the editor-in-chief of Science, wrote an editorial about this, that there are two questions. One is: is there a problem with the paper? And the other is: who has done it? And that second question is a very long question; it can involve a long investigation.
Who has committed misconduct? That involves the institution. It involves emails back and forth between the authors, and pointing fingers, and all kinds of things that will take a long time to resolve. But I hope that publishers also realize that some potential problems with a paper are too big to wait that long to resolve.
So if you, as a publisher or a journal editor, see a big problem, or if one is pointed out to you, I hope a lot of journals will put an expression of concern on the paper. That could be done, I hope, in days or weeks. And as I will show you, sometimes these retractions take up to five years or so, and that is, in my opinion, too long. There needs to be a very fast editorial expression of concern, flagging a paper with a potential problem, pending the investigation that might take much longer to resolve.
So I focus on images in scientific papers, and here are a couple of examples of the images that you might find in biomedical papers, which is my field. You could have line graphs and you could have photos. Line graphs are actually very easy to cheat on; who knows if that experiment was really done? But at least if you have photos of cells or mice or a gel, you can see it looks real.
And we tend to believe these photos. Seeing is believing; pics or it didn't happen. We tend to look at these photos, we see that they are all different, and all these photos are fine; there's no problem with them. But there's a lot of detail, and when I look at these, sometimes I can recognize duplications.
So this is sort of my basic slide of the types of duplications that I have found over all these years. On the left, there's a simple duplication, where you have two images that are exactly the same: two photos. In the middle, you have a repositioned duplication: there are four different panels, but three of them overlap with each other.
And on the right, there are four different photos of Western blots where parts of the photo itself have been duplicated. You see duplicated bands, and that is suggestive of Photoshopping. I also want to point out that the chance that a duplication was done deliberately rather than being an honest error increases from category 1 to category 3.
Category one could be an honest error: you know, you have a lot of photos, you just don't label them very well, and by chance you grab the same one twice to represent two different experiments. An altered photo, a photo with duplicated elements within it — in this case protein bands, but it could also be cells or parts of tissues — is most likely done deliberately.
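The taxonomy above also lends itself to a software screen. Purely as an illustrative sketch with synthetic panel data (the scan described in this talk was done by eye, not with this code), the first two categories could be flagged roughly like this in Python with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def exact_duplicate(a, b):
    """Category 1: two panels that are pixel-for-pixel identical."""
    return a.shape == b.shape and bool(np.array_equal(a, b))

def best_overlap_score(a, b):
    """Category 2 heuristic: highest normalized correlation between b
    and any rotation/mirror of a (whole-frame only, for brevity)."""
    best = -1.0
    for k in range(4):                 # 0/90/180/270 degree rotations
        for flip in (False, True):     # plus their mirror images
            t = np.rot90(a, k)
            if flip:
                t = np.fliplr(t)
            if t.shape != b.shape:
                continue
            ta, tb = t - t.mean(), b - b.mean()
            denom = np.linalg.norm(ta) * np.linalg.norm(tb)
            if denom > 0:
                best = max(best, float((ta * tb).sum() / denom))
    return best

# Synthetic "panels" standing in for microscopy photos:
panel_a = rng.random((32, 32))
panel_b = panel_a.copy()        # simple duplication (category 1)
panel_c = np.fliplr(panel_a)    # mirrored duplication (category 2)
panel_d = rng.random((32, 32))  # an honestly independent panel

assert exact_duplicate(panel_a, panel_b)
assert best_overlap_score(panel_a, panel_c) > 0.99  # mirror is caught
assert best_overlap_score(panel_a, panel_d) < 0.5   # unrelated data scores low
```

Category 3 (duplicated regions inside one photo) would need patch-level template matching rather than whole-frame comparison, which is one reason it is also the hardest to catch by eye.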
And here I want to point out a seminal article by Rossner, present in the room, and Yamada, who realized already in 2004 that digital photography gives rise to all kinds of temptations of image manipulation. When I worked in the lab, you still brought your gels to the photographer, but nowadays digital photography opens up all kinds of wonderful techniques to enhance, perhaps, your results.
And so I want to have some audience participation. I have here an example of a simple duplication. You see seven different panels, and each of them has a different label. But two of these panels are identical, so the exact same photo is being used to represent two different experiments. That's inappropriate.
So just show me — I can hardly see you, but just raise your hand if you spotted the duplication between two panels. I think this is a fairly easy one. I see a lot of hands. Excellent, you're all hired; you can all detect these duplications, no excuse anymore. So I hope you got it right. Here's the solution.
So these two panels are the same. I want to point out again, this could be an honest error. I wrote to the journal in October 2015, but this paper is still out there; it's not corrected. I don't think a retraction would be the obvious answer here, but a correction would be nice. Back then, the paper was only two years old; I think by now the original photos are probably not findable anymore.
And it's of course not the end of the world that this photo was duplicated. But, you know, it would be nice if this was corrected, and it wasn't; the journal didn't really seem to care. So here's a slightly harder example. Here's a bunch of panels, and there are three panels that overlap with each other. But this is much harder to see.
And this is the point where you wish you were sitting in the front. But I hope you can see a duplication; just give it a couple of seconds, and raise your hand if you think you see a duplication. This is much harder, because it's not just that the sample was shifted under the microscope; it also involves a mirroring. Here, I'll give you a hint.
It's on the top row. I see some hands. I see a couple more hands. But as you can see, this is much harder to catch. Most of these examples are found by eye, but it would take me a while to look at this; I don't immediately see it. But here are the duplications: the first panel and the rightmost panel overlap with each other, and panels three and four also overlap with each other, even though they have four different labels.
So again, three panels overlap with each other, but they're representing three different experiments, and that's not good. The authors promised an update, but — I don't know if they contacted the journal — it's still out there. And at the very least this is very sloppy. Even if this was an honest error, I wouldn't trust any of the other results.
If you're that sloppy as a scientist, I don't know, I wouldn't trust anything that comes out of that lab. So here's another example of how these duplications can sometimes become very extreme. In this case, the journal actually retracted the paper, which I think is a good outcome. Again, there are so many overlaps here that you just cannot trust anything else in this figure, or even in the rest of the paper.
Here's a duplication again; let's see if people can spot it. This is a type 3 duplication, so there are duplicated elements within this photo. It's one photo of a Southern blot. I've done many of those, so I know there shouldn't be any repeats. You see a lot of detail in the bands.
You see stripes and smearing. And even if you have no idea what a Southern blot is, that's fine, I'll forgive you, but I hope that you can see some duplications. Um, let's see. Somebody already spotted something. Yeah, I see some hands. Yeah, some hands over there.
All right, I'll show the answer. There are lots of duplicated areas here, and I think this is a case where I cannot think of any natural cause; this appears to have been digitally altered. This paper has not been retracted yet, but I only recently reported it. And it is by a person who threatens to sue everybody who dares to criticize him.
And I'm one of those persons. He has sued a couple of people, so I can imagine that some editors are a little bit afraid to tackle this problem, but I don't know; I hope this would not be any inhibition to correcting science. And sometimes these duplications happen in spectra as well. So this is, technically, obviously not a photo.
And to be honest, I'm not an expert in NMR spectra; I have basically no idea what it is. But it is noise and peaks, and I spotted some duplications in here that are quite unexpected. The author uploaded some samples and raw data, but he uploaded all the raw data except for this particular figure, and that is just suspicious: he uploaded all the figures that were fine, but not the one that I actually was concerned about.
And the journal, Scientific Reports, retracted this paper. So I did all this scanning of scientific papers by eye. I wanted to know: how often do we find these types of duplications if you just scan for them? So I scanned 20,000 papers, back when I was still fully employed. And if you don't really believe that that's humanly possible: I only look at the figures.
I don't actually read the papers, and some papers only have a couple of figures, so you can do one per minute; downloading them actually takes more time. So I scanned this set of papers by eye to see if I could find these duplications. My two co-authors, both editors-in-chief of American Society for Microbiology journals, had to agree with my findings, otherwise I couldn't count them.
And in that set of 20,000 papers, I found 800 papers with duplicated figures, 782 to be exact. So about 4% of papers had these inappropriately duplicated figures. Now, of course, as I said in the beginning, some of these are honest errors. You can only make a guess which paper is an honest error and which is intentionally done, but the altered ones are always intentional.
If we assume that, and if we assume that the simple duplications are always honest errors and the repositioned ones are somewhere in the middle, we made sort of an educated guess that in this set about half of the papers contained inappropriately altered figures, with the intention to mislead. For example, if you rotate it, or mirror it, or stretch it, or duplicate elements within the photo, that is very likely to have been done intentionally.
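The back-of-envelope arithmetic behind that educated guess can be made explicit. This sketch uses the rounded numbers as spoken in the talk (the published study reports the exact counts), so treat the figures as approximate:

```python
papers_screened = 20_000   # papers scanned by eye
papers_flagged = 782       # papers with inappropriately duplicated figures

flagged_pct = 100 * papers_flagged / papers_screened
print(f"flagged: {flagged_pct:.2f}%")   # 3.91%, i.e. about 4%

# Educated guess from the talk: roughly half of the flagged papers were
# deliberately altered rather than honest errors.
deliberate_pct = flagged_pct / 2
print(f"likely deliberate: about {deliberate_pct:.2g}%")   # about 2%
```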
So does that mean that 2% of all papers contain misconduct? Can we extrapolate that from biomedical papers to other fields? That's actually a hard question. And remember, I only looked at photos. So how easy would it be, and that's a scary thought, how easy would it be to fake a line graph? If you type in 10,000 numbers and make up some fake error bars, you can make a convincing-looking graph that nobody would suspect is misconduct.
I look at photos and I find basically the tip of the iceberg. And if you move your sample under the microscope a little bit more to the left or to the right, there would not be an overlap for me to find. So if you're smart, you could actually fake data in a lot of different ways that would not leave any traces in a paper, that would never be findable unless you work next to that person in the lab or you check their lab notebooks.
And most scientists are actually smart. I'm really finding the dumb fraudsters, I sometimes feel, the people who left a trace for me to find. And it's much harder to detect fraud in other types of data that are not photos. So the real percentage of fraud has to be much higher than 2%. How high it is, I don't know, but I think in some fields it could be up to 10%. I'll talk a little bit about paper mills in some of the next slides.
But yeah, I think it's a real threat. It's not just a paper here and there, a rare case; I think the number of fraudulent papers might be much higher than you think. But what I'm very disappointed with, to be honest, is how slow the journals were to respond. So I sent in all these cases, and this was around 2015. I sent in my 800 papers, and this was the first set I did. By now, I've looked at many more papers.
So I reported these 800 papers to the journals in 2015, and then five years later, so around 2020, I looked at how many papers were corrected or retracted. Not all of these problems should lead to a retraction, but you know, at least some of them should be corrected. But unfortunately, as you can see on the pie chart, the blue part is the percentage of papers on which no action was taken after five years.
That is a very long time. And I'm very patient. I send some reminders, but very often nothing happens. The journal didn't even respond to the reminders, or responded something like, yeah, we're still working on it. But what is taking so long? And of course I'm sure there are all kinds of reasons. But for me, as an outsider, as a scientist who might read these papers and base my research on them, it's frustratingly slow.
This should be much faster, or at least put an expression of concern on these papers. And as you can see in this pie chart, it's actually barely visible: the amount of expressions of concern was only 0.3%. You can barely see it there. And so I feel the expression of concern is the tool to use to quickly mark a paper that potentially has a problem, pending the much longer investigation of who has done it.
But yeah, for the reader, it's not really important who has done it. It's important that there's a problem with the paper and that it needs to be flagged. And so that is why I am now starting to use PubPeer. Maybe not everybody in the room is a very big supporter of PubPeer, and I can see that. But for me, this is the best platform that I can use to warn other readers that there's a potential problem with a paper.
And so I report these things there. I put my concern on there, and it's a heavily moderated platform. I cannot just say, you know, that person is a fraud or that paper sucks. No, I need to really tell the readers and the moderators what I think is wrong with the paper and show evidence. And if you then install the PubPeer plugin, you can see, when you do a literature search, whether a paper has a PubPeer comment.
And for journals, there's also what they call the journal dashboard, and I think a journal needs to pay for that; that's sort of their business model, because for me as a user, PubPeer is free. But the way they make money is that a journal can pay for the service to get an email as soon as somebody posts something about a problem in one of their papers. And for me as a user of PubPeer, it would be wonderful if this happened automatically.
And if all journals were signed up for the PubPeer journal dashboard. And I'm saying this as just a user; I'm not getting paid by PubPeer. But for me to find all the email addresses for all of the papers I'm reporting is just really a pain. It's very hard as a user to write to a journal and report my problems, because very often I know who's on the editorial board, but I don't know their email addresses, and so I have to hunt them down and see what their email addresses are, if I can find them.
But there are many journals where I cannot really find any email address to contact the journal, or it's just a contact@journal.com and I never get a reply. And I'm just not sure if these email addresses are even looked at. And so from my point of view, as a person who is a heavy reporter of problematic papers, I would love to have journals automatically be aware that if I post it on PubPeer, they'll automatically know that there's a problem, and I don't have to also email them.
But I don't know what the thoughts are in this room about that; that would make my life so much easier. And I'm actually very behind on reporting these to journals. So some of you might be getting an email with, I don't know, here are 200 papers from your journal with a problem, because I'm so behind. And I hope you would really check PubPeer a lot more often, because everything I have to say about a paper is there.
And then, of course, you might ask the question, at least I get asked, like, oh, do you spot these things by eye? Really? Do you have super vision? Because I don't see it. That's actually the response I got in the beginning, where a lot of people didn't see these duplications or did not want to believe that that was going on in science. And so, yes, of course, these things can be detected by software, but it's actually much harder than you might think. And it took years and years. And now there are a couple of software tools that can detect these problems. Proofig, I think, is one that several journals are using; I'm also beta testing it. And I'm using ImageTwin more and more. I really like that tool; it's very smooth and easy for me to use.
You just drag your PDF in there and it will do its thing. And it also has a database of a lot of papers. So sometimes I'm finding an image that has been used in a completely different paper from very different authors, but it's the exact same image. And so that is always exciting; I would never be able to find that myself. These software tools all have their limitations. I can sometimes use them to find, sorry, just checking the time, because the timer is not working. I have no idea what time it is, I just keep on talking. It would be nice if the timer could be activated, but it's not working. Sorry. So I'm using these tools, but I know the ins and outs and the weaknesses and the strengths of these tools. They're usually not very good on Western blots, but they're very good on microscopy images of tissues, as you can see here on the screen.
Where is it? Oh, there it is. And so it will find, I'm not sure if that's really visible, but it will mark the duplicated areas with little colored boxes. And these tools, of course, can be part of the suite of tools that are being used for incoming manuscripts or accepted manuscripts, because obviously we want these things to be detected before they get published.
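I don't know exactly how these commercial tools work internally, but the core idea of duplication detection can be sketched in a few lines. The following is only my own toy illustration, not any vendor's actual algorithm; the function name and the 3x3 tile size are assumptions. It flags exact repeated patches in a small grayscale image; real tools additionally handle rotation, mirroring, scaling, and compression noise:

```python
# Toy duplication detector: hash every small tile of a grayscale image
# (a list of lists of 0-255 ints) and report positions whose tiles match.
def find_duplicate_tiles(image, tile=3):
    """Return list of ((r1, c1), (r2, c2)) pairs with identical tiles."""
    seen = {}        # tile contents -> first position where it appeared
    duplicates = []
    rows, cols = len(image), len(image[0])
    for r in range(rows - tile + 1):
        for c in range(cols - tile + 1):
            patch = tuple(tuple(image[r + i][c + j] for j in range(tile))
                          for i in range(tile))
            flat = [p for row in patch for p in row]
            if min(flat) == max(flat):
                continue  # skip featureless regions; they match trivially
            if patch in seen:
                duplicates.append((seen[patch], (r, c)))
            else:
                seen[patch] = (r, c)
    return duplicates
```

On a real figure you would run something like this over the rendered image and then merge adjacent hits into the little colored boxes the tools draw on screen.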
I think that would save everybody a lot of work and pain and effort if these duplications were spotted before they get published. So some of these tools are available for publishers; of course, they have to pay for it. But yeah, they will find a lot of these duplications, and they have databases to screen for plagiarism of images. I don't like to use the word plagiarism for images, but sometimes these images get stolen from other journals.
I cannot read that. What is it? I'm just seeing a white screen. How much time do I have? Oh, OK. OK, great. Thank you. And so, yeah, we can detect these things by software.
And some of these software tools are powered by AI. But AI can, of course, also create fake data. And I don't think I have to tell anybody how much of a concern these new developments are. Artificial intelligence has gotten so incredibly good in the past half year or so. This development is going so fast that, as a scientific community, as publishers, as scientists, as writers and journal editors,
I don't think we're ready for this. How do we deal with a robot writing texts that are, you know, more or less believable? And sometimes you can read these texts and you're like, well, you can sort of see it's mechanically written. There's no heart or soul in it. But yeah, technically it's all good. How do we deal with scientific texts being written by these robots or these bots?
And I think, as a person for whom English is not my first language, I would actually be very grateful if there was a technique that could rewrite my paper and make it better English, make it not just grammatically correct but also, you know, engaging. But unfortunately, a lot of these tools are not very good yet, and they might include false references. For example, there was apparently a GPT chatbot that was tested.
It was released by Microsoft, and it was asked the prompt: how many ghosts are there in US hospitals? And the answer was 4.1. The confidence of these tools is just amazing. And I think it even had some references, and it said, yeah, these ghosts are really bad for a hospital; a hospital with a lot of ghosts would prevent the patients from being healed.
And it's just not good for a hospital to have so many ghosts. It's just a wild story, and of course we can laugh about it. But, you know, it's all said with a certain amount of confidence. And some of these texts also include false references that look good but don't actually exist. And so this is something that is going on.
And this is getting better. This is just where we are now, but a year from now, I think these tools will become even better. And there are some scientists who just write paper after paper using ChatGPT. And I think the scientific community agreed that we cannot list these things as an author; we cannot say ChatGPT was an author, but we can acknowledge it.
But how much is then really done by the human who claims authorship on these papers? Again, these are questions I think we're perhaps not ready for. I hope people are discussing this. I certainly don't have the answers. But it's something to look at with a lot of suspicion; I look at that with fear. And as a person working on images, I'm very afraid of what things like DALL-E and Midjourney can do.
These AI tools can make really realistic, photorealistic photos that look real but are completely fake. And we can laugh about, you know, a former president being arrested or a pope wearing a designer coat. But think about how good we are as humans, usually, at recognizing faces. I mean, how many pixels are we looking at? But we know who these persons are. We can recognize their faces.
And I'm actually very bad at facial recognition, but most people are fairly good at that. And so if we can recognize these faces and, you know, these photos look realistic, we might believe that these photos are real. How easy would it be to make a photo of a cell or a Western blot? We would probably not be able to tell a fake Western blot generated by Midjourney apart from a real one. I'm very worried about what this technology could do.
Maybe not today, but tomorrow. And I think as scientific publishing societies, we should be extremely worried about that and think about ways to detect it, because it's lovely to have some tool to detect duplications within photos, which I'm currently using. But how can we detect a unique yet fake image? And this is something I don't have the answer for, but I hope there are people
with technology backgrounds who can look at these images and tell whether they're real or not. Maybe not by just looking at them, but by all kinds of layers or, you know, something at the pixel level. I'm sure that you can maybe tell these things apart, but I'm sure this can be cheated, too. But it's a very worrisome development, and of course it's fantastic to have fun with it.
But in the hands of the wrong person, it could lead to a lot of damage. And so this brings me to paper mills. I assume most people in the room are very familiar with paper mills; they're companies or organizations, networks perhaps, that want to make money. And they make money from the fact that people who work in science need to publish papers. And they're very active in certain countries where there are very strict requirements.
To give an example: if you're a medical doctor finishing medical school in China, you need to publish a scientific paper. Now, if you're a medical doctor, you're not necessarily interested in doing science. You're probably working, I don't know, 80-hour shifts in the hospital and you want to treat patients; that's what you are taking the education for. And so if you then suddenly have the requirement to write a scientific paper, but you don't work in a scientific facility, you don't have access to, I don't know, an animal lab, you don't have equipment, you don't have time to do that.
You are faced with an impossible situation: you need to write a paper, but you have no time and no opportunity. And so these people might see advertisements on social media where they're offered an authorship on a paper. They have to pay money to that organization, and then their name will be on that paper, and then they can check the box and they can continue their career.
And since they're not really scientists or interested in science, they don't really see the damage, perhaps; they just want to continue their career. And so can you really blame those people, or the paper mills, or is it perhaps the strict requirements that are put upon these people that drive these paper mills? They're also active in many other countries, and we might not even know it. They might be loosely associated.
There are all kinds of different types. There are people who are writing real papers and then put 20 other authors on them, people who are all from different countries and who are very unlikely to have ever met each other. And with a bunch of people, and I've included all their names, there are probably many more, we are sort of rogue scientists looking for signs that papers are fake.
And we're trying to find these sets of papers that don't come from an individual lab, but that come from very different labs, yet have things in common. And what they might have in common is, for example, a Western blot. So this is, I think, the most famous example of a set of papers in which all the Western blots had the exact same background. This is, I think, the error that the paper mill made.
The design studio was a little bit lazy and they just used the same background. And if you enhance the photo a little bit, you can sort of bring out the background. And you can see that the shading, what should be random dots in the background, is the same for each and every blot. These blots are exactly the same width, the same height, and the bands don't seem to really attach to the background.
Normally they sort of bleed into the background, but these seem to float above the pixels, and they look very artificial. They don't look like a real Western blot if you are familiar with how they should look. And so this is sort of a primitive way, perhaps a primitive AI form, where images were just generated. And these papers all followed the same template.
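The shared-background signal can be approximated very simply. This sketch is my own illustration of the idea, with an assumed band threshold, not the method the sleuths actually used: mask out the dark bands of two blot images, then measure how much of the remaining background noise is pixel-for-pixel identical. Blots from genuinely unrelated experiments should score near zero:

```python
# Two blots from unrelated papers should NOT share the same random
# background noise. Images are lists of lists of 0-255 grayscale ints.
def background_overlap(img_a, img_b, band_threshold=128):
    """Fraction of shared background pixels that match exactly.
    Pixels darker than band_threshold are treated as bands and ignored."""
    matches = total = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            if pa < band_threshold or pb < band_threshold:
                continue  # skip band pixels in either image
            total += 1
            matches += (pa == pb)
    return matches / total if total else 0.0
```

In practice you would compare every blot against every other blot across a suspect set of papers and flag pairs with implausibly high overlap.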
So you could actually almost recognize them by looking at their titles. And they all came from authors who were associated with Chinese hospitals, so they had something very much in common. You could see the titles, you could see these blots, you could find them based on a couple of words; you could pull out hundreds of them. And so a lot of people worked on this paper mill.
I think we found around 600. And there are now people who do textual analysis who are finding many more papers that probably also belong to the same paper mill, just based on their textual similarities, that we haven't found yet. And some of these papers have been specifically targeted to a particular journal, probably a journal where the editor was maybe a little bit naive and accepted the paper without seeing it was fake.
And it's actually very hard to recognize one individual paper. But then if you see that suddenly this journal is publishing hundreds of these papers that all look very similar, with very similar titles, very similar blots, you have to ask the question whether the editor really did not notice that. And yeah, we just found hundreds of these papers. And so that brings me to one of my last slides.
And I want to get back to the theme of the conference, transformation, trust and transparency, and reflect on some of the examples that I've shown you. And so again, I think science is, I don't know if it's getting out of hand, but it feels like that. As a peer reviewer, you just get so many requests, and papers are so complex. And if you're a reader, there are so many papers on a particular topic.
How can we keep track of that? Should we perhaps slow down science a little bit and, you know, take it a little bit easier and perhaps produce fewer papers that are better? Because I don't think that all these papers that are being published are equally good. But of course, this is a complex question, because we feel this pressure to publish. We need to publish as scientists.
Like I said, there are 300,000 COVID-19 papers and there are huge paper mills. This is a very recent development, recent as in, let's say, seven or eight years ago that this started. And with artificial intelligence, this is a new transformation of science that we might not be completely prepared for, where we can no longer distinguish fake from real.
Bringing me again to the trust part of the theme of this conference, where we would love to know if data is real, but we can no longer distinguish that; authors might not even be real, or peer reviewers might not even be real. There are things like peer review rings. And how can we find the balance between trusting people but also verifying, because maybe we cannot trust everybody? And unfortunately, these types of misconduct cases bring real damage to trust in science, to the trust that the general audience has in science.
And yeah, we have seen that in the past two years. A lot of people have lost their faith in science and are, you know, believing in all kinds of weird conspiracy theories. So one way to address misconduct is perhaps transparency, perhaps open science, where we ask authors to share much more of their data to make sure that the data is real. It's not just a graph; show me the real measurements. You could still fake that, of course, but if you have a little bit stricter requirements for transparency and open science, it would be harder for paper mills to come up with these easy papers that they normally can crank out.
So I also want to point out that for me, in the investigations of the allegations of misconduct that I sometimes share, the transparency is sometimes very hard to see in the way that institutions handle these cases. These cases often take years to resolve. And then the outcome is like, yeah, we found some problems, but it wasn't misconduct. And it's hard to see, if I look at these photoshopped images, how is that not misconduct?
Are institutions perhaps afraid, do they not really dare to investigate these cases? Are people in powerful positions who bring in money maybe being protected by the university? Because very often these outcomes take a very long time and are not very satisfying, and perhaps some younger people are being fired.
But too often the people who are actually the managers of these labs, the professors, those are the people who can stay in place. And there appear to be no consequences for the misconduct in their lab. And that brings me to my last slide. So I have four take-home points. I also have four wishes, which I sort of thought about just before, after I had already shared my slides.
So I do have a wish list, which is not on this slide, so I apologize for that. My wish list for the audience is: please use expressions of concern much more generously than they are being used. Warn the readers. When I write to a journal, I had hoped in the beginning that within a week there would be an expression of concern.
But that's not happening. These cases take years and years to resolve, and in the meantime the audience doesn't know that there's a potential problem with the paper. So my wish is that expressions of concern are put on these papers, at least in the case of photoshopped or suspected photoshopped images.
My second wish is that journals would make more use of the PubPeer dashboard or would make it easier for me to contact them. Or perhaps I could just contact each publisher and not have to deal with writing to individual editors, who all have different ways of responding to my allegations, or not handling these cases, or might be inexperienced. Could I please just have a contact person at every publisher?
So that saves me a lot of time, and I could more easily report these papers. And then finally, what I would love to have is legal access to these journals. I am not paid, I don't work for a university, and I would love to have legal access to papers that are behind the paywall. And I would love to ask the publishers, if there are representatives of publishers here, if I could have that.
Like, I'm not misusing it; I'm only using it to read the papers. And of course I will create a lot of work and trouble perhaps, but I would love to have that access in a legal way, so I don't have to do it in any other way. And that takes me to my four take-home points. So for me, science is about finding the truth, and science misconduct goes against that. We can think about sad stories, but in the end I feel a paper with suspicions of science misconduct should be immediately handled and perhaps retracted, because that's not serving science.
But it takes a village. It takes the role of the reviewers, of journals, of institutions and funders, and too often one or more of these four institutions involved does not seem to act, because there are all these conflicts of interest. We need as a society to think about how we handle misconduct and not look the other way. If we think of the analogy of doping in sports, we need to decide that we don't allow science misconduct.
But it takes all these people, all these different institutions, to play that role. How can we distinguish fake from real? I don't know, but AI is going to, you know, ruin our concept of fake and real, I think. And then finally, the tremendous cost of science misconduct, because there's a huge cost for scientists trying to reproduce those papers, but also for science as a whole.
Because you could walk home from my talk thinking, oh, well, all science is misconduct. Let's not believe in science anymore. Science is fake. And, you know, let's just trust all the misinformation out there. But I do not want that to be the message. I want the message to be that, yes, there is fraud in science, but we still need science to solve the big problems in the world.
And I trust in science as an institution to perhaps one day solve hunger, pollution, climate change, pandemics like we need science to solve these problems. But in order for that, we need to be able to trust in science. And I hope I play a role in that. Thank you.
Do I take questions, or do we just hand it off to the reception? I don't know. Questions? Oh, it's, I'm just seeing lamps, so I have no idea. Hi, Daniel Ucko with the American Physical Society, and I'm actually the head of ethics and research integrity there.
So now you know at least who to contact there if you have any issues. And I wanted to pick up on what you were saying about line graphs. As you correctly pointed out, you can just make up any old numbers that fit the story. So what is a good way to figure out if there's research misconduct, falsification or fabrication going on with those kinds of plots?
Well, in some cases, like I showed you, if you have these plots with lots of details, you can pick up on problems. I also spot problems, for example, in flow cytometry images, which are basically a lot of little dots, and you can see duplications there. In line graphs, let's say just a line graph with some error bars, I've picked up on, you know, unrealistically small error bars, or error bars that are always the same, or always appear to be 10% of the value or something like that.
There was a famous case, not picked up by me, unfortunately, where the error bars were just the letter T, and they used a font with serifs. So it was not just a plain T, but a T with the little feet; it was very clearly a Times Roman T that they used, and it was just hilarious. So those are some clear examples. And I'm also working now on a case where the spacing of the ticks on the line graphs is not the same distance; it just looks like it was hand-drawn.
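Some of these line-graph red flags can even be expressed as simple screening heuristics. This is only a sketch of the kind of check I mean; the function names and tolerance values are my own assumptions, not a published screening method. One check flags error bars that are a suspiciously constant fraction of their means, the other flags axis ticks whose spacing varies as if hand-drawn:

```python
# Heuristic 1: real error bars rarely track the mean as an exact constant
# fraction; flag sets where every error/mean ratio is nearly identical.
def constant_fraction_errorbars(means, errors, tolerance=0.01):
    ratios = [e / m for m, e in zip(means, errors) if m]
    if len(ratios) < 2:
        return False
    return max(ratios) - min(ratios) < tolerance

# Heuristic 2: ticks drawn by plotting software are evenly spaced;
# flag tick positions whose gaps deviate noticeably from the mean gap.
def uneven_ticks(tick_positions, tolerance=0.05):
    gaps = [b - a for a, b in zip(tick_positions, tick_positions[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return any(abs(g - mean_gap) > tolerance * mean_gap for g in gaps)
```

A hit from either heuristic is of course only a reason to look closer, not proof of fabrication.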
So occasionally you can spot problems, but I think it's much, much harder than looking at a photo. Hi, my name is Cassandra. I'm a scholarly communications librarian in Canada. And I'm really interested; of course, I'm working with faculty, but I'm also working with students constantly and talking about how peer review is important.
And it's still a gold standard, but, you know, there are errors, there are issues like this. And I think it was really interesting what you said about having to be able to admit that we are fallible, that we are wrong, putting those expressions of concern on papers. And I'm curious if you have people coming to you, you know, on Twitter or anything like that, who are coming up within science and they love science.
They want to do it, and they're seeing these structures, as you said, that sort of publish-or-perish model that can lead to this issue, whether it's, you know, a mistake or a need to just publish. I'm curious what you say to people who are worried about their futures in science, and what you would say about that kind of model and how to move forward.
Yeah, great question. I get a lot of young people who come to me who are about to leave academia or who have already left academia because they saw things that were wrong in the environment. And they reported it, and they were actually harassed after that. Or little things: their key card suddenly didn't work, or their parking spot was gone, or the app didn't work anymore; very weird things that suddenly made them not feel very welcome.
And very often the young people who are the whistleblowers within a lab or within a university are the ones who need to leave, because the university might not believe them. And in some of these cases, they have shared their emails with me, and they are very long and accusatory. So I always advise people: if you see some misconduct, I know it's very emotional, but you have to write it up like a scientific paper.
Nobody wants to read a 200-page, I don't know, thing with evidence. You need to be able to summarize that and convince the other person, because even I don't want to read all these emails, and they are very emotional. And so the research integrity officer then tends to think it's a hysterical person. And then very often, of course, the senior person who is maybe accused of misconduct is more experienced and also more valuable to the university.
And those are the ones who can stay. And a lot of young people like my work because they feel, oh, now finally somebody is standing up for all these cases. And in many cases of misconduct, if you read Plastic Fantastic, or if you read the story of Diederik Stapel, you can see that over and over again: whistleblowers were trying to bring these cases to attention and they were not believed.
And I hope that can change, because, yes, there is misconduct in science. And there are a lot of parallels to the MeToo movement, where, you know, a lot of people were sexually harassed. And of course, I don't want to completely compare that; that's a much more damaging experience for a person. But the parallels of not being believed, and now perhaps a movement through social media and articles in the New York Times, et cetera,
these stories are now believed much more than they were. And I hope that will happen in science integrity as well. Hi, Lisa. Hi, Mike. Mike Rossner from Image Data Integrity. Thank you for a terrific talk. And I sort of want to say a general thank you for all that you do to promote data integrity, scientific integrity.
And I imagine that comes from everybody in the room. My question is about your notification process. Do you post on pubpeer and write to journal editors at the same time? Or is it one first and then another? And then do you ever contact institutions directly? You know, do you have a criterion that you apply or criteria that you apply? You know, say five years is too long to get a response from a journal.
I'm going to write to the institution. How does that process work for you? Yeah, I follow different schemes, because sometimes you just find one paper and sometimes you find these clusters of papers that are much more complex. I wish I could tell you: I post them on PubPeer and then I immediately write to the editors, because that's always my plan. But then I get another email with a fantastic photo duplication and I'm completely sucked into that.
For me, it's much easier to post on PubPeer than to collect the emails of 50 different journal editors and write an email with all the links to the PubPeer posts, like make a nice spreadsheet with that. It's a lot of work for me, to be honest. So very often I just post on PubPeer, and I will be very honest about that; it would make my life easier if I didn't have to do more, because for me it feels double. I already posted it.
Somebody can pick it up, so why do I still need to write to an editor? But that's just me thinking; I'm sure people might disagree with that, but it would make my life much easier. Um, I cannot handle second questions, because I have totally forgotten what your second question was. Sorry, contact? Yes, contact the institute.
Thank you. Yes, I do. When I write to the journal editors, I also immediately include the institutions. And if it was NIH-funded research, I will include the Office of Research Integrity. ORI has never replied to my emails, which I think is bad. Like, come on.
Like, you fund millions of dollars and I find a problem, and you don't even send me a thank-you-for-your-email reply; nothing. They never replied, so I'm not quite sure. Maybe I have the wrong email address or so. But I do contact institutions, and very often, yeah, they say thank you, we take it very seriously, you know, the standard answer, and then you hear nothing back.
Hi, I'm Heather Staines from Delta Think, and thank you for your talk. We've done a lot of research integrity projects recently, and you noted that the institutions are maybe not caring or not being swayed by your pleas. It became immediately apparent to me after talking to these folks that what the publishers are concerned about and what the universities are concerned about are completely different.
What we're concerned about is everything you've talked about today. The universities are concerned about scandals, being attacked by the media, faculty not coming, or students not enrolling. And I was actually wondering about contacting the funders, because it seems to me the funders are giving millions of dollars to an institution and fraudulent papers are coming out of it.
So given your non-response from one, I don't know, have you had any success with other funders? No, I haven't. Like the diabetes organization or the Heart Association, never; they don't even reply. But again, maybe I'm just searching for their general email address; maybe I just don't know whom I should write to. But by now I feel I'm a legit person.
The things I bring up are, you know, it's not just my hysterical eyes seeing things; I bring up legit concerns. And to have these organizations not even reply to me, I don't understand why that is. I don't know if we have any funders at the meeting, or who would dare to stand up, anybody from ORI? Because we need to talk.
Hello, thank you so much, Dr. Bik, for sharing your story with us. My name is Ann Callahan. I'm with Data Conversion Laboratory. This question comes from one of our virtual attendees. The question is: some journals are now requesting whole gel blots. Can you recommend other ways for the publishers to mitigate image manipulation?
Um, well, I think requesting whole gel blots is a fantastic start. What I've seen occasionally is that the whole gel doesn't match the image that is in the paper, and I'm like, did nobody actually look at these things? Because if they don't match, that's already weird. But I think the question is also about other types of data.
Like I would maybe... Oh, uh, do you have any recommendations for the publishers? That was Marianne's question. I think we need to go towards a model where you need to show that the image really came from, let's say, a microscope or from a gel doc machine. Because if you...
If you can just fake it with AI, then you need some proof that it really came out of a lab. So maybe you can think of requiring some signature that the microscope adds to an image. But this becomes very technical, and I would not really be able to give a good answer to that. Do we have time for one more?
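The device-signature idea mentioned here can be sketched in a few lines. This is a minimal illustration only, assuming a symmetric key shared between the capture device and the verifier; real provenance systems (for example, the C2PA content-credentials approach) instead use asymmetric signatures embedded by the hardware, so that anyone can verify without holding the device's secret. The key and function names below are hypothetical.

```python
import hashlib
import hmac

# Hypothetical device key: in practice this would be a private key held
# inside the microscope or gel-doc hardware, and journals would verify
# with the corresponding public key rather than a shared secret.
DEVICE_KEY = b"example-device-key"

def sign_image(image_bytes: bytes) -> str:
    """Tamper-evident tag the capture device would attach to an image."""
    return hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str) -> bool:
    """Journal-side check: does the submitted image still match its tag?"""
    expected = hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

original = b"...raw pixel data from the microscope..."
tag = sign_image(original)
assert verify_image(original, tag)             # untouched image passes
assert not verify_image(original + b"x", tag)  # any edit breaks the tag
```

The point of the sketch is only that any post-capture edit, including an AI-generated substitution, invalidates the tag, so manipulation becomes detectable at submission time.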
Yep. OK. I'm fine. It's Oleg Kruszewski from Profit Science. First of all, thank you. What you are doing is unimaginable for me as a visually challenged person. How you do it just by eye, that's incredible.
But I think you raised a very important point: who are the stakeholders? No matter how much we preach about publishing less and thinking more, if the funders give money to those who have more publications, if universities promote those who have more publications, we are screwed here. And as you just saw for yourself, there is not a single funder in this audience, and that tells you something.
And so the question is what we can do as publishers, given that the stakeholders are not participating in the same game. And I think the only thing we can do is to enforce what science is about, namely being reproducible. Reproducibility of results should be checked not at the level of peer review but at the level of parallel research; it should become a core necessity of publishing. If today I submit a paper to Nature which reproduces a paper in Nature, I will not be published. They will reject me at the start. They say no, it's secondary research, it's a follow-up. If we change the situation so that every important piece of research is immediately cross-checked, and this is encouraged and initiated by the journal itself,
we can cure all these problems, including those of peer review. And I think that we as a publishing community, not the ones who distribute the money but those who keep the quality assurance alive, have to do it, because otherwise it's an arms race against AI, and I don't think humans are able to win it in the short term. So thank you.
Yeah, no, I agree. So I think we need more journals that would be open to reproducibility. And one of the ideas I had, that I talked about in another talk I gave, is that perhaps a graduate student in their first year of doing research in the lab should just reproduce another lab's paper. I think that would be a great start, you know, for their careers, to learn the craft, and for science, because then a paper is reproduced.
Or maybe we should go towards a model where a paper is only really valid if it has been reproduced. We can think of many different models to make that possible, but also to give credit to a graduate student, or anybody at any stage of their career, for reproducing another person's paper. So instead of just listing, as we do in academia, the papers we have published on our resume or CV, we could also list the papers we wrote that other people have reproduced, and the papers we reproduced from other labs. And so we would have maybe three types of papers we can include on our resume. And I think that would be incredibly valuable for science, to slow everything down, to make science perhaps a little slower, but to recognize how important it is to reproduce.
Because like you said, we're not going to win against the computers if we keep going like this, and this is the only way we know that science is real: if we can reproduce it. So I agree with that. Hello, thank you so, so much to our speaker, Dr. Bik. What a wonderful way to kick off the meeting, really setting the stage for our theme for the conference this year.
And if you didn't walk in with your own expression of concern about the future impact of scientific misconduct, I'm sure that you probably are walking out with that expression of concern. I hope that you all will join us at the opening reception in the exhibit hall, and also check out the posters while you're here. And again, thank our sponsor, Scholarly IQ, for sponsoring our keynote today.
And one last wonderful round of applause for Dr. Bik, who shared a wonderful action plan with us all today. Thank you so much.