Name:
Research Integrity #TRANSPARENCY Stories of Learning from Overcoming Mass Retractions, Systematic Manipulation and Research Misconduct
Description:
Research Integrity #TRANSPARENCY Stories of Learning from Overcoming Mass Retractions, Systematic Manipulation and Research Misconduct
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/8cb86f6c-f5c1-4254-9a83-c3798eebae34/videoscrubberimages/Scrubber_23.jpg
Duration:
T00H58M50S
Embed URL:
https://stream.cadmore.media/player/8cb86f6c-f5c1-4254-9a83-c3798eebae34
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/8cb86f6c-f5c1-4254-9a83-c3798eebae34/session_4b__research_integrity_transparency_- (360p).mp4?sv=2019-02-02&sr=c&sig=bt3aon5NoW8xYsBJ00%2Bxjw%2BtgFObHupHWRWHJY2Jawk%3D&st=2024-11-20T01%3A18%3A11Z&se=2024-11-20T03%3A23%3A11Z&sp=r
Upload Date:
2024-02-23T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
Hi, everyone. Thank you so much for joining our panel today on research integrity. I'm Hannah Smith. I'm the director of strategic account development in Wiley Partner Solutions. And I'm so honored to be joined by these incredible experts who spend all day, every day thinking about and actually solving research integrity challenges.
Yael Fitzpatrick, Luigi Longobardi and Mike Streeter are joining me here on the panel. Today's panel is all about the power of sharing our experiences, even if they are very difficult, and the power of the open, honest conversations that we have with one another. Things go wrong sometimes. We know that. And there is so much learning, though, that comes from talking about it and sharing from those experiences.
But that only happens if we actually talk about it with one another, and we look deeply at what we've experienced and really analyze what went wrong. We advance in solving these big challenges through transparency and collaboration. And that's what this is really all about. Yael, Luigi and Mike are going to share their experiences facing challenges head on and working through the complex and frustrating and disappointing challenges that come with things like fraud and plagiarism and paper mills and mass retractions.
So to get us started, I'm going to ask our panelists to tell us a little bit more about who they are and how they landed in this challenging job. So, Yael, to you. Thank you, Hannah. And it's great to see such a nice turnout here today. And it's funny, because I think that Mike and Luigi and I are all pretty upbeat people, and yet we deal with misconduct all day.
But there you go. So, yeah, I'm Yael Fitzpatrick, and I'm the editorial ethics manager at the Proceedings of the National Academy of Sciences. I've previously also worked for a number of different publishers and journal programs, including at AAAS and at AGU. But for many years, at the beginning of my scholarly publishing career, I was actually primarily in the art and design side of things.
I started as a book designer working for a medical publisher. So shout out to Mosby-Year Book, for anybody here who has been in the biz long enough to remember that old blast from the past. And I worked for a number of years as the art director for the journal Science. And at Science, one of my responsibilities was to help assess cases of suspected image manipulation in research figures.
To help with that, I gained additional specialized training in image forensics, including from Mike Rossner, who is in the room today. And I highly recommend everybody talk to him, because he is a legend. And I spent a lot of time, and still do, using image forensics techniques to determine and assess problems with research integrity from the image side of things.
I have been at PNAS now for a little over two years, doing a lot of work also with image forensics and with image-based potential problems, but also dealing with editorial ethics as a whole. I think although the skill sets are different, the overall themes and definitely the overall goals are very much the same. And I think, for all of us, we just really want to do what we can to get the publication record, the scholarly record, as accurate as possible.
I do want to read one just brief disclaimer, and I'm going to read this word for word. So I just wanted to let everybody know that opinions are my own and don't necessarily reflect official policy or positions of PNAS or of any other publishers I have worked for. And to maintain confidentiality, any stories that I share may be either unique cases or they may be amalgamations of multiple cases.
And I will not identify the publishers or organizations from which any story originated. All right, you've got one. OK, does this one work? Yes. OK. So I am Luigi Longobardi. I'm the director of publishing ethics and conduct at IEEE. And about how I got the job, well, I mean, the short answer is that I applied, but I was told we had to start with a joke.
But, I mean, the long story is that I see this as kind of the natural evolution of my career. I'm a physicist by training. I'm published as a physicist. I actually published also with IEEE. And I was enjoying being a scientist until, I think, I no longer was. I mean, it was, you know, the frustration of getting that one great result once in a while after months and months and months of failure. At some point it got to me, and I decided to look for alternative careers.
And the American Physical Society was hiring editors. So I became an editor for one of their journals, and I think some of my colleagues from APS are here. There you go, waving. Hey, Daniel, how are you doing? And already at APS, it was a much smaller scale of journals, of papers, but a very nice collaborative environment. So many of us with the same background were looking at the peer review process, but then also looking at the failures and looking at how things were going wrong, discussing how we should solve ethical issues, looking at guidelines.
So we were all kind of involved in this process at the time. Then I moved to another organization, also publishing in physics, and at some point there I became the point person for ethical cases. And then from there, you know, there was the opportunity to join IEEE. I really cannot imagine a better place to be doing this, because of my background as a scientist, and because IEEE is a member and volunteer organization. A lot of my work is interacting with our members, interacting with our volunteers. Being able to kind of speak the same science language helps in getting to them the message of what we do in terms of research integrity, how we protect our authors, how we protect our readers and so on.
And coming from that kind of shared background is an advantage. So I'm very fortunate to be doing what I do where I'm doing it. Hi, everyone. I'm Mike Streeter. I'm a director of research integrity and publishing ethics at Wiley. And, thinking about how I got to this role:
A colleague of mine whom I had worked with for many years made the joke, well, you're 19 years at Wiley now, and I was sort of struck by that fact. But it hasn't always been in research integrity. I started my role, in fact, at Blackwell Publishing as an editorial assistant, and have primarily worked in publication of academic and research journals throughout that entire part of my career. I started at a point when, you know, the dominant medium for journal publishing was print, which sounds very ancient,
I think, thinking about sort of where we are right now. And in fact, you know, I started on a program looking after family studies journals and journals that looked at sociology and psychology. And prior to this role, I had a publisher role primarily in social sciences and humanities, looking after political science and anthropology and public administration journals. And like many of us who do integrity and ethics work, I kind of came into integrity and ethics work as a volunteer.
Maybe this is something that will get teased out in the Q&A later on. But having done publishing for so long in that capacity, I was looking for something kind of new and interesting to do that kind of expanded my horizon in academic publishing. And so this opportunity had come up within Wiley to take a role in our integrity and publishing group, which is primarily a group of volunteers that help us manage and resolve cases that come to us for integrity and ethics.
And through that work, I really came to understand the diversity of the disciplines that we work in. So I sort of came into this role primarily from that kind of volunteer experience, but was able to apply a lot of my editorial experience to that role, because a lot of the integrity and ethics work that we do is highly collaborative. Those of you in the audience who are publishing managers and do publishing work know that you have to work with a variety of different stakeholders within the organization, in addition to stakeholders outside of the organization.
And the same was true with integrity and ethics. And, you know, a lot of the approach that we take to case resolution is one of both guidance and sort of articulating what the best practice is in approaching a solution to an integrity and ethics question. And so I've been in this role now for just over a year, and actually less than a year and a half at this point.
And even in that short period of time, the landscape for integrity and ethics has changed, I feel, significantly. It was already a very large priority for Wiley and for other publishers. But I think the need for the expertise in integrity and ethics has changed significantly just in that short time period. So I think there are a lot of great opportunities that are coming to the fore here.
So I'll stop there. And we're going to move into sharing stories now. It's story time. OK, so each of our panelists is going to tell us a little bit about what it looks like to do their job, and the biggest challenges that you face, or a big challenge that you faced, a time when something went really wrong and how you dealt with it.
Just a story of what you do. Yeah, I'm going to go rogue and tell two stories, but I'll keep it brief. So in planning this session, we decided that I would talk from the perspective of the publisher and what a publisher can and cannot do. And I wanted to go kind of a step beyond that and talk about what a publisher should and should not do. And when we're talking about ethics, and life in general, the right thing to do is not always the easy thing to do.
The right thing to do is not always the pleasant thing to do or the convenient thing to do. But that doesn't mean that it's not the right thing to do. And so I wanted to share a couple of stories that definitely have left an impact on me, and maybe will on you as well. OK, so story number one. A manuscript was submitted to a journal. An editor reviewed it and decided that it was high enough quality to send out to three independent reviewers.
This study involved a lot of numerical data. One of the reviewers came back and said, I'm a little bit concerned about the validity of these data. I think that there might have been some intentional manipulation. It was a serious enough concern that the publisher went back to the authors and said, hey, these concerns have been raised. Can you please respond to this? The authors came back with a rebuttal, with an explanation, with one admission of what they said was an honest error that had been corrected, and a whole slew of new data.
Everything went back to the original editor, to the original reviewer who had flagged the problem, and to a new additional reviewer who had really high-quality subject matter expertise in this kind of thing. The new reviewer agreed with the first reviewer's original concerns. And both of the reviewers who were now taking a look at this, not only were they still concerned that there were issues with the validity of the data and that there was evidence of intentional manipulation, but the new materials that the authors had provided actually added to the evidence supporting the concern of intentional manipulation.
So the end result from the reviewers and the editors was that there was a very, very high likelihood that the data had been intentionally manipulated. So the decision was made to reject the manuscript. That was kind of a no-brainer. But then there remained this question of what, if anything, to do about this concern of suspected research misconduct. A proposal was floated to report this to the authors' institution.
There was a thought that there was a high enough likelihood of a high enough level of egregiousness that it warranted reporting to the institution. But there was not unanimous agreement. One line of thinking was that since the journal had decided to reject the manuscript, it was just kind of out of their hands. It wasn't the journal's problem anymore.
Another line of thinking was that with such strong evidence of this deliberate manipulation, the journal had an ethical responsibility to take action. And one other thing that was kind of folded into the mix was that there was this very conscious consideration that even if there had been deliberate misconduct, it was very likely that not all of the authors were involved or probably even aware of what had happened and that these innocent parties could be very negatively affected if the concern was reported.
Ultimately, the journal did decide to report the matter to the lead author's institution. And one of the key takeaways from this, that I think will stick with me to the end of my days, is that although the manuscript was ultimately not accepted for publication, the publisher did decide that there was a very strong ethical responsibility to the community at large to bring the matter to the attention of the authors' institution.
So the right thing to do was not the easy thing to do, but it was still the right thing to do. One other story, a little less intense. There was a referee of a manuscript that was under review at Publisher A, and they contacted Publisher A to say, hey, I've now been asked by Publisher B to review what looks like it might be the same manuscript. And this reviewer knew that duplicate submission is not allowed by most journals' editorial policies.
So the two publishers connected, and they both agreed that assessing this concern superseded their respective journals' policies on editorial confidentiality. So with the understanding and the agreement that confidentiality would be maintained between the two publishers, materials were shared, such as author lists and abstracts, to help determine, first of all, if the two submissions were actually the same, and they were, and to determine the chronology of events.
So it turned out that not only were they the same submission, but Publisher A had received it first. That then meant that what had been submitted to Publisher A was OK, but by the policies on dual submission, what had gone to Publisher B was the ethically problematic one. And so it was decided that Publisher B would then contact the authors to say, hey, can you speak to this?
The authors pleaded ignorance. They said, oh, we didn't know that it's not OK to do this. And they withdrew their submission to Publisher B. Publisher A debated rejecting it due to this ethical breach, but they did ultimately decide to accept and publish the article. And so some of the key takeaways that I got from this case were that, first of all, yay for reviewers.
Reviewers are tremendous. I think we can all agree that we would be lost without them and also that it's just so important for reviewers to be well educated about ethical standards. I think something that we see a lot is researchers who are really good at research, but they don't necessarily know the ins and outs of publishing. And it's always wonderful when they kind of know those details.
And kind of my favorite takeaway from this whole case was that it was just so heartening to see publishers working together to improve the whole enterprise of scholarly publishing without inappropriately compromising confidentiality. All right, that's it for my stories. OK, so I'm next. And I'm also going to go rogue, because you asked for an example of something I've dealt with, and I'm going to be talking about something that happened before I was working at IEEE.
And the reason I'm actually choosing this story is that in the year and a half I've been with IEEE, I've also been participating in forums: I've been participating in the Integrity Hub, in the NISO group on the communication of retractions and corrections and expressions of concern. And in almost every one of these forums I've been in, at some point somebody mentions the thousands of retractions that IEEE did ages ago, and it's going to haunt you forever.
Oh, is it? It is. And that's OK. Actually, I think it's great, because I'm actually proud that we represent 25% of the Retraction Watch database. We should wear it as a badge of honor. And so I'm going to start from the fact that the Retraction Watch post was about 10 years ago; it was probably 2015.
So they broke the story. It was: IEEE has retracted thousands of conference abstracts. Then you look at the article, and it's not really abstracts; it's about two conferences, each with about 1,000 papers that had been retracted. And actually, it's funny, because the real story was not in the Retraction Watch piece. The real story was in the comments, because in the comments some people decided to go look at our digital library, and they found about 170 conferences and 9,700 papers.
Um, so, yeah. How did that happen? Well, how did that happen is a bit complex. I can give you a little bit of the broad strokes. What's really important is how we investigated and how we got to that large number. And so if we look at how did that happen: IEEE, again, is a member organization, and at the time, so 15, 20 years ago, we wanted to expand our global representation. We wanted to represent as many engineers as possible from everywhere in the globe. And getting new members is really not just getting new members. It's not just increasing your membership numbers; it's creating the environment that fosters the discourse between these members.
So what we wanted to do was also to create local chapters. We wanted the local chapters to organize conferences, and then eventually these conferences to get published. And so if you look at this from an objectives-and-key-results point of view, from a KPIs point of view, it was a great success. We opened a lot of chapters. We registered a lot of members, we held a lot of conferences, we published a lot of papers. Until, eventually, complaints started coming in.
And so, you know, there were issues about the quality of the conference proceedings. There were issues about scope: some of these papers were not engineering. There were issues with the language. We got reported issues about the quality of the peer review, and other ethical issues, like: you just published a conference proceeding, but nobody here knows that this conference happened.
And so all this comes in, and we started to investigate. Maybe some bad actor became a volunteer, and some bad actor did something locally that led to this. But luckily, we had a large number of trusted members, trusted volunteers. We could lean on them. They could help us with the investigation.
And so we took a three-pronged approach. We looked at whatever was published already. And when I say what was published already, I really mean almost everything, not just the concerns we had received. We were looking at what was in flight, so what was at that moment under peer review. And then we needed to look at how we tighten our standards for conferences, how we make sure that this doesn't happen again, or doesn't happen at that scale.
And the criteria were: if a conference had a complaint, we would look at the entire conference proceeding. If the complaint was confirmed, we would look five years back on the same conference; or if there was the same organizer, or any shared point of contact with that conference, we would look at that. So in the end, we ended up commissioning a review of 110,000 papers that led to 9,700 retractions.
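To make that escalation rule concrete, here is a minimal sketch of the review-scoping logic in Python. All class, function and field names are hypothetical illustrations, assuming a simple in-memory archive of conference records; this is not IEEE's actual criteria or tooling.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Conference:
    name: str            # conference series name
    year: int
    organizer: str
    contacts: frozenset  # points of contact for the event

def conferences_to_review(flagged, complaint_confirmed, archive):
    """Hypothetical sketch of the escalation rule described above:
    any complaint triggers review of the whole proceeding; a confirmed
    complaint widens the net to five years back of the same conference,
    and to any event sharing an organizer or a point of contact."""
    to_review = {flagged}  # a complaint -> review the entire proceeding
    if complaint_confirmed:
        for conf in archive:
            same_series = (conf.name == flagged.name
                           and flagged.year - 5 <= conf.year < flagged.year)
            shared_people = (conf.organizer == flagged.organizer
                             or bool(conf.contacts & flagged.contacts))
            if same_series or shared_people:
                to_review.add(conf)
    return to_review
```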
So this was a massive effort, and we really broadened the scope. The reason we retracted 9,700 papers is that we broadened the scope. We tried to look at everything that had issues. So what emerged, again, without going into the details: different papers were retracted for different reasons. I kind of gave you an overview of what those reasons might be. We didn't just retract papers. We closed chapters.
We took memberships away. We banned volunteers that had been involved with this. And then we started looking at, OK, what is the next step? The next step is: let's fix the problem, let's fix what happened. Let's understand our vulnerabilities and let's set guardrails in place to minimize the problem. And so now we have a stricter application process for new conferences.
We have oversight of the technical program. We have a technical program integrity committee that provides basically a pre-submission consultation and evaluation of the conference material, both on scope and quality. We have a conference application review committee that is charged with identifying applications with quality issues, evaluating those applications, and making recommendations that may lead to rejecting a conference.
And if all these prevention measures fail, we now have a conference organization integrity committee that reviews post-publication issues. And what I can report is that, yes, it was 9,700 a number of years ago, and there have been about 600 more retractions that have happened ever since.
And yeah, OK, you might look at that and say, oh, 600 more retractions over 10 years, that's 60 a year. I wish; more than 400 of those were last year. And I got my job a year ago, but I'm not going to say that was me who did this. It's just that, you know, it takes some time for issues to bubble up. And so, anyway, we put all these guardrails in place.
And the issue here is this: once trust is broken, regaining it is a slow process. It's not going to happen immediately. So our immediate reaction, once we put all of this in place, was: when in doubt, reject the conference. Now we have relaxed a little bit. Now, when there are issues with the technical program, or issues with the conference application process, we give the organizer an opportunity to respond, to fix it, to see if there is some way to, you know, move forward with that.
We also realized, already back then, that a lot of the problem was education. It was just that people who organized conferences didn't know what the correct peer review process should look like. Sometimes the papers were out of scope, and it was simply ignorance of who they were organizing a conference for. So now there is a lot of education in place, based on the organizer role.
There are education modules on event management, developing the technical program and conducting the peer review. So the point here is: yeah, we're now doing better. I cannot say we are perfect. We organized more than 2,000 conferences last year. We published 200,000 papers as part of conference proceedings. So 100% oversight of all of this? I'm never going to claim that. I'm just saying that, you know, we are going to continue to do better.
We're going to continue to improve. We're going to do our best. And it has to be said that the vulnerabilities were there back then. So conferences, in a way, are a vulnerability. If you look at the landscape today, where paper mills have now become ubiquitous, conferences are still kind of a point of entry for paper mills. Back when all the story I'm telling you happened, maybe paper mills existed, but no one called them that.
And so we were not aware that a bad actor, in bad faith, could do, you know, that level of damage. So what has changed now is the fact that, yes, we are all aware of paper mills. We have learned to recognize certain red flags. We have also realized that we are kind of in an arms race with them. You know, we discover something, we make fixes. Then they learn that we've made the fixes, and they change their processes. But what's really encouraging over the last year and a half has been seeing the amount of collaboration that we are now creating across publishers. We are now at a point where we look at the scale of all of this. And you will be talking about scale, so I'm not going to steal your thunder.
But when we look at the scale of all of this, it's encouraging that we have reached a point where there is more openness, more transparency between publishers, and there is more of a willingness to identify how we can solve and work on this together. Mike? You actually set that up really well, Luigi. We didn't plan this, but I guess I'm kind of focusing on this sort of notion of transparency, which is really critically at the heart of, I think, what we have all really focused on in research integrity, for better or worse.
And, you know, over the course of at least the time I've spent in this role, I mean, the story I want to share is really one that I suspect many in this audience are probably familiar with. And, speaking to the point of transparency, we made a decision in September of last year to retract just over 500 papers from our portfolio, which was the culmination of an investigation that we had initiated that summer, which led us to the conclusion that we had paper mill activity present in our special issues program within the Hindawi portfolio.
And, sort of extending on that point about transparency, we also made a similar announcement in early April of this year. That initial announcement was covered in Retraction Watch, and this follow-up announcement, where we talked about the problem of paper mills and the need to collaborate around investigating and dealing with them, was published in the Scholarly Kitchen.
And we also made the point there that we had intended to retract another 1,200 papers. And those retractions have started to make their way into the record. This was really, and I don't want to say we were the first, we certainly weren't the first to make this decision, to make a proactive announcement about a volume of retractions that we were doing. But it certainly led on from an investigation, and an approach to an investigation, that we took, which was very different from the way that we would have approached, maybe not a similar investigation, but an integrity and ethics investigation in the past, where we really needed to focus on process and scale.
So for that first level of retractions that we did, really the rationale that we had put forward was the identification of peer review manipulation that we were able to uncover in the investigation that we undertook. And the investigations that we're undertaking now are really taking both a component of publishing data, data that has been shared with us by external integrity experts, and also human intervention, looking at groups of papers where we suspect that a paper mill has been active.
But I think, just to the notion of transparency, the importance of sharing those things kind of speaks to the collaboration that we're all very much willing to engage in right now. And that is to, I think, give confidence to publishers, to enable them to take a step into what are, I would say, kind of uncharted waters in how you approach an investigation that might include 100 or potentially thousands of papers that are potentially manipulated.
And I think that, you know, that's critically important if we are going to do what we are responsible for, which is maintaining the integrity of the scholarly record and the accuracy of the scholarly record. And I think it's also to acknowledge that, you know, nobody wins here. Well, there surely is an incentive where paper mills are concerned, but there are a lot of people who lose out as well: the researchers who are thinking that what they're reading and citing is above board and there's nothing wrong with it.
There's the time it takes to do this investigation and to take the action that's required, and the time that takes away from doing other good and better things as a publisher that you want to be able to do. But I think it's certainly a position that we've taken as a publisher that it's especially necessary to take the kind of coordinated action that we've taken, in terms of the investigation and how we approach it, at a very scalable level that we feel will allow us to act, and to act more expeditiously, potentially, than we have been able to previously.
And I think, you know, the words I keep using are collaboration and transparency. I would basically kind of emphasize what Yael and Luigi have said about the coordination that's gone on between publishers. And that's the real bright spot. I shouldn't just say across publishers; it's other stakeholders in the research ecosystem as well.
That includes funders, includes institutions and authors and researchers. But we've seen, very recently, yesterday, the Committee on Publication Ethics published their new guidelines on best practices for managing special issues and best practices for working with guest editors. I only wanted to say, about a month or a month and a half ago, COPE also published best practice guidelines for approaching investigations of systematic manipulation,
where, I think, as a publisher, you may have been sort of trying to make that up as you go without that sort of published guideline in place. And I think, too, many of us, Luigi, you talked about the Integrity Hub. I was speaking to Hylke Koers right before this meeting. You know, one of the things I'm always worried about in having these kinds of conversations in this forum is that we talk about this stuff all the time.
And sometimes you worry that, well, in an audience like this, for people who aren't talking about paper mills all the time or aren't talking about research integrity issues all the time, are we giving enough context for the work that we're doing? I mean, I think the one other point I would just want to make on kind of the collaboration aspect is that, um, you know, research integrity is clearly critical to the value that publishers provide in what they publish.
But I think one of the misconceptions may be that, you know, research integrity is the job of the research integrity team. And certainly you're seeing research integrity colleagues within an organization leading policy and leading best practice. But it really takes kind of a collaborative operational approach to tackle these things at scale. You can't simply publish a bunch of retractions without engaging closely with those who are working in production and knowing the importance and the rationale for why you need to do this work.
In our instance, we certainly engaged with our legal colleagues on how we word certain notices and on the communication that we would be putting out to inform authors of the decisions that we've made. So it's to point out that the kind of operational aspect of integrity and ethics is really critical at a publisher, and having that sort of close stakeholder engagement internally, to make sure that you can carry out the necessary steps in the investigation, and then the necessary steps that you need to act on that investigation, is an important component of it.
Thank you, guys. Before we go on to questions, I just want to say thank you to our panelists for telling your honest, candid stories. When we assembled this panel, I asked all three of you to bare the truth about what's happened in your organizations, or various stories of your time in publishing. But I did wonder if, when we came on stage, you'd all chicken out.
And I really appreciate that we actually got the true, honest stories here. And I think that really speaks to the heart of that transparency and collaboration that we're really looking for. I was going to kick things off with my own question to get things started, but I want to give the room a chance to ask some questions. We have about 20 minutes left in the session.
I can't see anything. Oh, there we go. Oh, good. I also don't know how to get to you. Should I? What do I do? Thank you very much. So I'm Daniel Ucko, a former colleague of Luigi's, working at the American Physical Society as head of ethics and research integrity.
So these are my close colleagues up here on stage. And both Luigi and Michael already alluded to what is the substance of my question. But I was wondering if you could talk a little bit more about the stage where you go from an internal inquiry to an investigation where you're involving an independent investigation, where you're involving third parties. I've tried a number of approaches, which have been sort of situationally dependent, but I was just wondering if you could elaborate on that a little bit, on what your thoughts are about these external investigations.
Actually, Daniel, thank you. That's a great opportunity for me to make one point that I hoped there would be a chance to share today, and that's to clear up a common misconception from a publisher's perspective. And this goes back to the question of what a publisher can and cannot do.
Publishers cannot investigate. Just by definition, publishers are not investigative bodies. The main reason, to think about it, is that to accurately and thoroughly conduct an investigation would require things that a publisher does not have jurisdiction to do, things like, you know, taking possession of lab books. So a publisher can assess things, can review things, and then can work with, potentially, an institution or somebody else to kind of flag that for them.
So I just wanted to thank you for giving me the opportunity to share that little nugget with the room. And I'm going to now pass it on to Mike. Well, I thought I would add to that, too, because in thinking about the answer to that question, you did a great job in sort of articulating a response, but I also want to put some clarity around that as well. And that is that you talked about jurisdiction, right?
And so where there's an issue of research misconduct, that is really the responsibility of the research institution, to investigate that research misconduct issue. Whereas publishers, we have a responsibility too: we do investigate, and we do engage the authors potentially, or we do our investigation on something where we suspect systematic manipulation. We might act based on hallmarks that we have in that case. But I wanted to sort of clarify that distinction, too, in case that was a needed clarification for the audience.
Yeah, and the one thing I was going to add to this is also that, you know, it's not that we are trying to prove somebody guilty beyond any reasonable doubt. What we are trying to do is to assess if there are issues with the paper to the level that we believe the paper is unreliable. That's where we act. We don't necessarily have to prove anything.
We just have to no longer trust that the content of the paper is accurate. And so that's probably one difference between us and the universities. And I'll just kind of tack on to that, that that's a great point, too: it's not a publisher's place to kind of deal with the people behind it. It's a publisher's place to correct the scholarly record as appropriate.
We might be curious who did something or why. Oftentimes we'll just never really know, and that's just part of the gig. Hi, I'm Gwen Weerts. I'm the journals manager at SPIE. I've been reading a lot of articles about mass retractions right now. And something that's come up fairly frequently is to be really cautious about how much information you share publicly about the mechanisms of paper mills and the networks of peer review fraud.
Like, for example, you find out that reviews were done in 10 minutes, right? That's an indicator you've got a problem. But we shouldn't necessarily share that, because then, you know, the bad actors realize that's a flaw in their system and they correct it. So I'm a little curious about your ideas about what should be daylighted, like what is good to expose about these organizations, versus what we should keep quiet so that we can chip away in the background.
I'm just going to give one quick example, which is that I'd read an article about the Facebook groups where you can find all these papers for sale, which led me down this huge rabbit hole, and I found so many thousands of papers for sale on Facebook. Is that something that should be exposed? Do we talk about that? Do we make that public or do we not want to draw attention to it?
I'd love your feedback. I think it depends, right? So, I mean, I would start by saying that there are a couple of different ways to share. One of them, and many of us in this room who are doing this kind of work already engage in this, is an element of kind of closed-forum sharing, where we can all benefit from what we have learned, in a way that doesn't expose the intelligence that we have gained.
I think the other part of this is also, you know, it was mentioned at the plenary this morning that there was the United2Act summit last week that STM organized. And I lost my point. That summit was an opportunity for us to come away with what the best recommendations would be. And one of them was, and this is actually the point I forgot and I'm going to make it now, that awareness and visibility are critically important, not only for the community at large, but for authors, for editors-in-chief who are doing the work of deciding which papers to publish and which papers to reject. And that level of awareness is also needed at the institutional and at the university level.
So that's to say that we do want to share, but we don't want to share the sort of specific intelligence that we aren't confident has already been publicly disclosed. How's that for a kind of generalized non-answer? To your point about the Facebook group and authorship for sale, there is actually now a Twitter feed, and I cannot remember the name of the Twitter handle, that is publishing these posts on a daily basis.
And so you can see these as they come in. I'm not sure if anyone wants to add anything about what to share, when to share and why. Speaking of what to share, I want you to share with me that Twitter handle. I'll share it with you. And that's what I want. But no, I mean, I agree here. And a lot of these forums you were talking about, we are aware of them.
Some of them we are not aware of. And actually, it just so happens that if we end up publishing papers that had been for sale on one of these forums, there are a lot of people who do work similar to what Elizabeth here in the front row does who end up contacting us. Actually, I remember on one of Elizabeth's slides there was an acknowledgment to Anna Abalkina. And these are people that, I would like to say, have us on speed dial, but they know how to email us.
And so, yeah, I mean, if we are not aware, we end up learning about it after, and then we ask the questions: well, where was this posted initially? And, yeah, I mean, then sharing it publicly would not be something we would do. I might have a call with Mike and say, hey, I learned about this. Did you know about it?
But that's where we would keep it. And I'm going to make a comment about paper mills in general, which is my own personal frustration, and it connects with the question before about, you know, what are we responsible to do? And the issue is: we are responsible for the record, we are responsible for the papers we have published. Which means, you know, we may end up retracting papers, and we are retracting papers from authors that paid to have a paper that they didn't author, sometimes. We never go after the people that got the money, because it's not our jurisdiction to do it. But I think that's the higher-level crime. And so that's my frustration there: there are these people that are profiting from it, and it's out of our purview. I hope I'm not treading on territory that's already been walked on.
But reviewers were mentioned early on, and I'm just curious, in keeping a paper trail and a record of how something was uncovered, misconduct, plagiarism or something like that, has the question ever come up of revealing the process, the communications and the identities involved in uncovering this information? Like, I work for a publisher that does single-blind reviews, so we take the identities of our reviewers very seriously.
So I'm just wondering if that has ever come up, or a question has ever come up about that. In other words, who told you that this was plagiarism? Who uncovered it? Um, it has. And the answer to that question is no. So, I mean, when we contact an author asking for an explanation, when we have uncovered something, we say, hey, this is what we discovered.
Uh, what do you have to say about this? We have found ourselves in situations where they say, we want to see your evidence, but no, you're not going to get it. We are giving you an opportunity to respond. We are giving you an opportunity to defend yourself. But how did I uncover it? That is not something I'm going to share. And I don't know if others here have a different view.
Well, I'll just say that every publisher I've ever worked for believes in as much transparency as possible and believes in sharing both ways, believes in sharing author information with reviewers and editors, and likewise sharing reviewer and editor information with authors. That doesn't necessarily mean that you reveal identities. I would say that in terms of revealing editorial and reviewer information with authors, identities are kept confidential, but you can still share just kind of the raw guts of it.
Well, I'll add a nuance to that and just say, someone made the joke a week ago, and I can't credit this person by name, but if you ask a research integrity person a question, the answer that you're going to get is: it depends, right? So there's always a level of nuance in all of these cases that we're dealing with. Some of them have a lot of similar patterns and hallmarks. But in some instances, it might make sense to share, for example, an email address that you can tell is not a real email address, and to use that in a way that benefits others. Because we know that when you shut something down in one place, it's going to end up somewhere else; it's going to travel to another outlet. So I think that's where this question about sharing information can be critically important.
Hi, my name is Elizabeth, and I feel like I got this answer from IEEE, but I wondered if the larger panel could answer: using the examples that you gave, what do your organizations do differently as a result of these cases or these investigations, as you say? OK, so I'll start.
I guess there's a lot of different ways to slice that, because there's a lot of different kinds of categories of cases that you're looking at. I think one big thing, just based on the awareness and the volume of things that come up, is seeing if there are additional resources that can be channeled into something. With image integrity, that has definitely, as no surprise to anybody, become a massive issue over the years.
And so people in the community at large are responding to that. There's a lot of development happening for new tools and technology to try and help address that. I think just the awareness is a big thing, and then kind of seeing what tools and what processes to help address the problems come from that. Yeah, one thing I'll add to that, too, is we've introduced new screening and we've introduced new processes as a result of the investigations that we've done.
We have developed our own guidelines around special issues and around manipulation that may potentially occur. So it's one thing, I think, to produce guidelines. The other is how you share those guidelines in a way that they're actionable. So we have different teams within Wiley that perform different functions. But one example I'll give is we work very closely with our editor engagement team, and they do a lot of educational webinars with decision-making editors who are external to Wiley.
And so we'll hold events with those groups to try and raise awareness of the guidelines and the policies that we've created, to help them do their jobs better. So I was going to say a little bit more about this: also, one of the recent trends that we have all been collaborating on is this incredible shift from reactive to proactive. So, you know, we are all involved, again with STM as an example, in building tools for early detection.
So can we find a problem before it gets to publication? Of course, AI is one of them; it's going to be the future. It's a threat and an opportunity. But the point here is exactly that: let's try to do as much as we can at any point of our workflows before it gets to publication, and see what guidelines, what guardrails we can put in place at any point.
And we're all working together to build tools for this. Hi, this is Roger Schonfeld from Ithaka S+R. Thanks for this terrific panel. I note that at least two of your organizations have substantial preprint services as well as formal publications, of course. And I wonder, do your responsibilities as research integrity officers extend to the preprint work that your organizations are engaged in?
And if so, since you were talking about moving sort of upstream and being more proactive, what's the story there? What's going on, and what does that look like? So, the first question that you asked, the answer is yes. The how? That's a work in progress. So I don't have clear policies at this point on it. I mean, we can take down papers from the preprint servers.
Of course, the issue with preprints is that once they're out, if they have done damage, the damage stays. So I don't have much more of an answer to that. But one thing that I need to mention here is that, again, we are a member and volunteer organization, which means that when a need for policies arises, we research the best practices around the industry.
If there are no best practices, we try to interact in forums like COPE on coming up with common ideas, shared ideas on this. And then, eventually, for IEEE, I bring those ideas back in house. I share them with the members, I share them with the volunteers, and we see if there is a path for a policy that they would support and vote on. So for us, it's a long and complex process.
But having said that, the answer is yes, I'm responsible for that. And how we're going to do it? I'm going to quote Mike: it depends. And it depends really case by case. And so I don't have an overarching policy, but we're also looking into what policies should be set in place at some point, and who should be collaborating on those.
I have one question that came in online, which will be the last question, because I want to give you each like a one-minute speed round of a wrap-up. So, very quickly, have you seen examples of paper mills moving into meeting abstracts? No. Great. But I don't exclude it. I mean, you know, given the opportunity, they'll do anything.
But are they profiting from that? If there is a way to make a profit, they're probably going to do it. So this was the question I was going to ask at the start, but I thought it would be a good place to wrap up, and you'll each have just a couple-minute answer to this. I'm going to reframe it slightly: what do you see as the future of research integrity?
But I'd like you to answer that and then add to it: what makes you most excited about this space and excited about that future? It's a really tough one. Oh, heck. Hannah, what was the first question again? Thanks for going first. Yeah, I'll just jump on the grenade for you guys.
So, the future. People who know me know that I'm annoyingly optimistic and cheerful, which makes no sense, because I deal with misconduct all day. But I'd say that the future is that there's going to be more, you know, more bad stuff to deal with, unfortunately. I wish that wasn't the case. But I also have to say that I think we have to have hope that there will be increased ways of addressing that.
I love that over the last few years there's been a very heightened awareness of the problems. And, like the theme of collaboration that has come up several times, there's been a lot more of that, I think. I think we have a lot of challenges ahead of us, but I do have to have some hope. What I'm most excited about is this just fictional Star Trek future where AI solves everything, instead of making everything terrible, which I sometimes think might be what's going to happen.
No, in all seriousness, I know that there is so much work happening right now to try to develop very, very robust tools to address these issues. It's a Herculean mountain to climb. It's a massive task. People ask, especially with image integrity, well, why hasn't this problem been solved? How hard could it be? It's really hard.
It is really, really hard. But I have to say that I'm excited about the possibilities for there at least being significant improvements in what's possible. I'm going to repeat the point I've already made. I mean, we are making the shift into becoming more proactive. And what really excites me is this climate of collaboration. It's, you know, us being transparent with one another, coming together at the right tables, discussing the right issues, trying to figure out how we can work together, how we can share information more broadly, and what can be done with that information.
Um, and I dread very, very much, you know, all of the ways that the paper mills are going to be morphing and trying to stay in business. Um, the thing that excites me about that is that they're going to keep me employed. I was told I have to close with a joke. But, yeah, I mean, another thing that's really, really daunting to me is the many months, if not years, ahead in which we will be talking about ChatGPT ad nauseam.
That's another thing that really daunts me. Like, I really don't want to have those conversations, but I will have them. And I'll just piggyback on that: not only text-based AI, but image creation AI. It scares the heck out of me. Uh, I'll add to that, I guess, speaking to the point about changing that posture of reaction to one of more prevention. I mean, my answer is more along the lines of, potentially, not only research integrity, but other types of screening as well.
I think that if we are going to have a future where we are going to be able to strike a preventative stance, and I'm talking about Wiley and I'm talking about other publishers as well, injecting a standardization of workflow is going to be critically important, and a standardization of the infrastructure that we use to take manuscripts in and do peer review on them, or do other things with them.
And without that level of standardization, we're not going to be able to put in the types of screening and technological checks that we want to be able to apply across the board. And we're not going to be able to get the kind of data that we need to make future policy decisions, and good decision-making about different types of workflows that might work better.
So I think, for me, speaking from research integrity but also other screening-type initiatives, it really needs to be an infrastructure solution to the standardization question. I'm also excited about collaboration, so I'll just throw that out there. All right, thank you guys so much. Can we have a round of applause for our incredible panelists?
And they'll all be here during the reception to answer all of your tough questions that we didn't get to. So thank you all. And much more conversation to come.