Name:
Prevention of Systematic Manipulation at Scale: Setting a Proactive Strategy
Description:
Prevention of Systematic Manipulation at Scale: Setting a Proactive Strategy
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/bddd56b4-8be0-4ab2-9db2-7220bcafa27e/videoscrubberimages/Scrubber_1.jpg
Duration:
T01H03M18S
Embed URL:
https://stream.cadmore.media/player/bddd56b4-8be0-4ab2-9db2-7220bcafa27e
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/bddd56b4-8be0-4ab2-9db2-7220bcafa27e/SSP2025 5-29 1045 - Session 1A.mp4?sv=2019-02-02&sr=c&sig=8xjpSba3qTZdF4BQJZT1GrxyQaE3GX7czzl3CWDAg98%3D&st=2025-12-05T20%3A55%3A48Z&se=2025-12-05T23%3A00%3A48Z&sp=r
Upload Date:
2025-08-14T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
All right. Hey, hi, everyone. It's about 10:45, so we can get started. My name is Megan McCarty. I lead the consulting team over in Wiley Partner Solutions, and I'm excited to kick things off today talking about research integrity and systematic manipulation at scale.
So, excuse me one second. Threats to research integrity have become increasingly complex over the last two decades. Bad actors are more sophisticated at circumventing established integrity checks, and online networks now easily facilitate authorship-for-sale and citation-for-sale schemes. Indeed, you'll hear from our colleagues involved with United2Act today; that's Mike and Lisa.
Much of what we know about paper mills is anecdotal and learned in retrospect. Recent advances in generative AI also make content generation easier than ever. From a wider perspective, researchers continue to face the challenges of a publish-or-perish system, and current funding cuts will heighten these pressures, creating a greater need for education and support, especially for early career researchers.
So, our session today aims to explore a comprehensive set of strategies from a diverse set of speakers dedicated to ensuring the quality of research and trust in the integrity of the scholarly record. Our focus today will be on prevention rather than how research integrity breaches are addressed post-publication. So Mu, at the end, will speak specifically to the support researchers need in the lab to report what the data say and not give in to pressure to create quote-unquote good data.
Lisa will provide insight from the library and institutional perspectives on other upstream strategies for prevention. Mike will speak from the publisher's point of view, specifically to the challenges of managing research integrity at scale, and Beth will round things out with both a view of what it's like working in a larger society and a smaller society, managing research integrity threats.
So while our panelists are going to bring different points of view, we are aligned on a few basic assumptions. Just one second here. Well, awesome. So breaches in the published record can have profound effects on the careers of researchers, the translation of science into practice, the overall research pipeline, patient health, and so on.
Post hoc investigations are also incredibly time- and resource-intensive, so prevention really is preferable to remediation. Second assumption: no one tool, strategy, or stakeholder will stem this crisis alone. Three: institutions have a role to play, though I think some of the how is TBD, and hopefully we can speak to some of that today. And our fourth assumption is that publish or perish is the reigning paradigm.
And our focus today is to leave you with strategies and solutions, and hopefully we'll spend a little less time getting stuck in that loop of talking about the reigning paradigm. So, great. I wanted to kick it over to the panelists now to introduce yourselves. Beth, do you want to kick things off? Yeah, good morning, everyone.
I'm Beth Cronin. I am the Director for Editorial Operations with the American Physical Society. Good morning. I'm Lisa Janicke Hinchliffe. I'm a professor and the coordinator for research professional development in the University Library at the University of Illinois at Urbana-Champaign. Hi, everyone.
I'm Mike Streeter. I'm the Director of Research Integrity, Strategy and Policy at Wiley. Just to give you a sense of the purview there: that role has responsibility for the peer-reviewed academic journals that we publish, so that's both policy and the investigations that we do at the publisher. Hi, everyone.
I am Mu Yang. I'm a scientist at Columbia University. I'm here as a sleuth; for people who don't know what that word means, it's a science detective, or the nemesis of Mike. So, taking prevention of systematic manipulation and deviations from research and publishing integrity practices and policies into account, what components of a strategy of prevention do you bring to the table in your role?
I think whoever wants to jump in there to talk about prevention. Right back at me. So, I said I'm a career scientist, so I work on the front line of research. I run a core, which means I work with a lot of scientists at Columbia University. And what I'm seeing is that a lot of people are working under a huge amount of pressure these days.
A lot of trainees and students are given implicit pressure that certain types of data are much more desirable and preferable. These can sometimes manifest as encouragement. For example: the cure for Alzheimer's is on the horizon; your work holds the last piece of the puzzle. This may sound like encouragement, but for people who are actually working in the lab, that's a huge amount of pressure that may backfire at some point.
On the other hand, it can be more negative pressure, such as: well, if I don't get this funding, I don't think I can support every one of you in the lab. Well, what do you mean by that? It feels like the Hunger Games in the lab now: three out of six kids can survive, and it just depends on who gets the best data. And what is the best data? It may not be the most honest data, because honesty takes time to prove. The best data is usually the data that is most effective at getting funding, right, and getting papers out, but that data is sometimes vulnerable to misconduct, cherry-picking, and the whole slippery slope of what we classify as misconduct or fraud. And are you a PI, or can you differentiate the role of the PI versus the core director? So, a core is sort of a fee-for-service facility, a service facility within the university.
So my specialty is mouse behavior. Anyone who wants to study the effects of drugs or genetic manipulations on Parkinson's, Alzheimer's, autism, you name it: we do the behavioral assays in the animal models. This is what we call a core facility. PIs are leaders of independent labs. Thank you. We're just kind of talking about roles within universities right now.
So I'll speak to the role that librarians have when it comes to prevention. We have a lot of different places where this comes into play. One is our basic instructional programs around the basics of search: identifying quality sources, helping people figure out where they're going to publish their work, and giving them signals and tips for how to look for quality publishers.
Librarians actually spend quite a bit of time helping people with compliance requirements for data deposit, open access, and that sort of thing, and with expert resources. But I think one of the challenges we have, whether it's Columbia or the University of Illinois, and I'm not going to claim we have no research integrity issues, is that a lot of these at-scale problems are not at scale at our institutions.
They are in the larger ecosystem. So, as was mentioned before, Mike and I were at the National Conference on Research Integrity last week, and I was really struck by something that my Assistant Vice Chancellor for Compliance (what a title) said to me when I was talking about United2Act. She said, oh, but we have to care about this, because we have to care about a healthy ecosystem of scholarship.
So even if our people aren't availing themselves of paper mills, we have to care about the overall ecosystem. And as I was thinking about that, I was also thinking about the way researchers have developed partnerships with other people over time, in other countries, and, for myself as a PI, there's a lot of trust we put in the overall system. It's like: well, you're a faculty member, I'm a faculty member.
We're like, good. We're both faculty. We have the same values. And I think increasingly, with both research security and research integrity, we're going to see a kind of requirement that researchers do a little bit more trust but verify. And I definitely see librarians having a role to play as they go out to try and verify, because this essentially becomes a research question.
Can we figure out whether this is a person with a good track record? Do they have any red flags in their record? And the other thing I'll speak to specifically is that, as a faculty member, I'm also directly teaching future scholars, and so the role of coaching is especially important. One of the things that I've noticed is that, for shorthand reasons, we often tell trainees what to do, but we don't explain why they're doing it, because there's just so much time pressure.
And so really thinking about that mentoring and coaching role, where you peel back the why we're doing it this way, so that they don't make up a reason that's actually different from reality and get themselves into trouble accidentally. And it would be, in a sense, misconduct on the part of somebody who's mentoring future generations to not make sure they really have that full understanding, as a preventative mechanism.
I think the other thing I'll just observe is that, by and large, research integrity violations, particularly when they get to the point of being a retraction, and the negativity of that, really stick to publishers and not to institutions. It's very rare that anyone says: can you believe that university has so many research integrity problems? And so we might talk at some point today about whether publishers are in a position to try and seek greater accountability from institutions. Not to vet the work and peer review it, because that would be a trust problem: if the institution peer reviews the work, why would you trust that peer review? It's a tautology. But are there other things that publishers could ask institutions to do as a compliance mechanism? Verifying employment, or requiring that in order to submit a manuscript you have to identify the research integrity officer at your institution. Just smaller things that could clean up the ecosystem.
And so I'm really leaning heavily into my colleague Patty's metaphor of how we create a healthy ecosystem: it's not only by cleaning up pollution afterwards; it's a lot of these preventative measures. Thank you. Yeah, if I could add to that too, Lisa, because we've had some conversations about how the publisher and the library, and particularly the librarians, are getting more into the space around research integrity and trying to figure out what the role of the librarian indeed is.
I mean, I think there are pre-existing mechanisms for how we work together, maybe through the agreements that we have. But in some of the conversations I have had with librarians, they bring a level of investigative value to the work that we do. We know a lot of paper mill products find their way into open access journals, right? And so there are APCs involved. And we see these cases of APC tourism: we can find an author who is at a university who can get funding for this paper, and we can attach our name to it. So I think librarians can play a role here, because you're actually in a position, in some cases, to approve those funds and get a sense of whether that author is qualifying for authorship and qualifying for that funding. So I think the mission is on us to figure out which existing mechanisms we can effectively reapply in ways that we know are going to benefit one another.
So I want to continue talking about addressing these research integrity breaches upstream. We know that takes time, and that the problems we're talking about right now are immediate and ongoing. So what does a comprehensive effort from the publisher and editorial roles look like now? So, for those of you that know me or don't know me, just to give you a little context on my perspective: I've worked for three different society publishers, one very small, and a much larger one where I came from.
And then I joined APS about five weeks ago. Over that time, we've definitely seen a significant shift in many ways. I don't have to tell this audience what that's looked like in our industry: we went from being really focused on plagiarism, which was the hot topic, into paper mills, and the flavors of integrity issues have really grown.
As for my background, I have a master's degree in counseling, and I've always had this perspective of education, of ensuring that we educate along the way to be proactive and preventative in many aspects. And as we think about that, we really have to evolve our thinking on how we as an industry can really be proactive.
I think of myself as a teacher in many ways. And as we're facing the types of integrity threats we see now, anything from manipulated peer review to AI-generated content, we really have to think more comprehensively and proactively. When I think about this, I think in many ways about strengthening our editorial gatekeeping. In my experience, I do believe that rigorous peer review is still, in many ways, the best defense against manipulation.
But again, some of those things now can't be identified by the human eye as you're reading. I do think that reviewer and editor education is really important, and that's some of the work that I've done throughout my career. And editorial board vetting is really important too. Then there's the idea of starting to educate upstream.
I was in a meeting this morning, in conversation with others in the industry, thinking about: well, how do you educate? Yes, we can put author guidelines out there. Yes, we can ask authors to read them. But again, they don't have time in their day to really go through all of our policies and really understand that piece. So how do you leverage technology upstream to educate authors in the process, as they're going through it, about what these things are?
And I think that's one of the really exciting things happening in our industry right now, and something that I think about. But also, when we think about the collaboration that's happening: again, I was in a conversation this morning with colleagues about the openness that the industry is really taking on, to say that we are in this together and we can help each other.
And we're seeing that through things like the STM Integrity Hub and, obviously, the work that COPE's done over the years. I do see more of that engagement and that collaboration, and I think that's needed, and needs to continue, for prevention in many ways. We all know about screening tools and that piece of it. And again, I think those are a piece of the puzzle, but I don't think they're the be-all and end-all.
But to me, when I think about the proactive, preventative space, it's: how do we leverage a focus on integration? And it's really exciting to see some of this happening. A year ago we saw these tools coming online, and now there are industry integrations happening into our current systems and workflows. And lastly, again: how can we be clear and transparent about things like our retraction processes and correction processes?
I know retractions tend to get a bad rap, but I will also point out that they are in place to support the scientific record and correct it in many ways. And I still think that's a piece of the puzzle in this prevention piece. Yeah, I'll add a bit to that, Beth. I think about part of my role in the policy space as trying to think about how we operationalize integrity policy and best practices within our workflows.
And I think there are advantages to that, and there are disadvantages, one of the disadvantages being that we publish across a variety of disciplines: social sciences, humanities, life, physical, and health sciences, and we don't always capture that discipline-specific or industry expectation piece if we are too focused on standardization. But what we are aiming to do as a publisher is ensure that we have the standard baseline policy enablement within the workflows that we have, at the point of submission, at the point of peer review, and following acceptance. Beth mentioned screening.
And of course, we're looking for red flags; we're looking for problems in manuscripts that were published. But I also see a great deal of value, obviously, further upstream at the point of submission: are the right disclosures in place? Have the authors provided a funding disclosure, if there is one? Are there obvious conflicts of interest that haven't been identified by the editor or the author, where we can intervene at that point?
And when we talk about screening and identification, I'd like to make the point that paper mills get the headlines, but the other takeaway is that we are trying to identify honest errors as well. We don't only retract because of misconduct; we retract because of honest error. And so the more we can identify that honest error and be an intervention point to correct it at the point of submission, the better a place we're in.
And that provides a very strong author service for the authors submitting to the journals that we publish. So I just wanted to throw it back to you real quick, because we've heard about education, mentorship, and coaching from a few of the speakers, and I wanted to hear about what you do in your role in that space, working with students. So, I always tell the students who come to the core to do their experiments that I will be the bad guy.
I'll go to your lab meeting with you and tell your boss that data are data; there's nothing you can do about it. Everything has been done correctly procedurally, all the methodology was correct, and if you don't like this data, I don't know what to tell you. So yeah, that kind of helps a lot. It has also affected a lot of people, obviously.
Lisa, do you see some of the same things? Like, how are we going to advocate for students, to help them make better decisions and put them on the right track? Yeah, I mean, I think implicit in what you just said is that the student needs defending, and it's interesting for us to pause and ask exactly who they need defending from. Ironically, it's the PI, the person who's supposed to be carrying out the really high-quality research.
So there are clearly some things that are not working well within the PI-student relationship that institutions could be looking at, including the pressures and the way certain pressures are communicated. But I think that's a certain kind of upstream prevention. And obviously there's falsified data, and this is another case where, somehow or another, it's often: oh, I as PI didn't have anything to do with this.
It was the student in my lab who did this. To which I'm listening and saying: so what you're saying is you're a horrible manager of your own lab? This seems like a weird admission to make. And yet it works every single time. The intern did it, right? It's the same thing we see on social media: oh, that was the intern.
Oh, that was my postdoc. That was my grad student. Well, you're supposed to be in charge of this lab. And faculty are not going to like this; PIs are not going to like this. They're going to see this as the corporatization of higher education. They're going to see it as running universities like a business.
But ultimately, this is: well, we trusted you, and you're not doing your job, so now we need a different model. And I think there is probably also a role for learned societies here, because if there's any group that could shame peers, it's their actual peers, through their learned societies. But I can imagine all kinds of pressures on learned societies, where you don't want to ostracize certain of your members for bad behaviors.
I mean, this is the same problem we see in every part of the ecosystem. So ultimately, I see this as a place where every organization is making a decision about whether acting or not acting is the greater risk for their organization. And I think right now, what we see is that this has become a huge risk for publishers. And so we hear stories, like Anne just said at the podium this morning: we used to have six people, and now we have 130.
My guess is, if we talked to those 130 people, they'd like another 130. It became too risky for publishers not to face up to this. And until it becomes too risky for universities, for funders, et cetera, it's going to be: it's better not to act than to act. So I would ask us: how do we increase the risk level? How do we increase the problem for these other actors, so that it's in their incentives to act rather than to ignore?
And I suspect, as a sleuth, I kind of have this feeling about the publishers sometimes too, where I know stories of sleuths who send lots and lots of emails and are like: why does this publisher not seem to be acting? I'll let you defend yourself. Sorry, what was that? She said, I'll let you defend yourself. I was going to say, we'd like another 130 people ourselves.
Yeah, well, speaking for Wiley, and I think speaking for some of my counterparts at other publishers, and through some of the United2Act work that we'll talk about, I feel like we've actually done a good job of perhaps lessening some of the tension that was in place between research integrity sleuths and publishers. We're in a much more active discussion, and when we had the planning call for this meeting, you pointed to the fact that you're on this panel, and that serves as some evidence of it.
And I think, increasingly, we obviously have a very similar goal in mind: both cleaning up the record and stopping this stuff in the first place. And a point I know you've made several times, and that Megan made earlier: in terms of just basic resource and time, we are in a much better position to stop this stuff prior to publication than following publication. I mean, I can't speak to the sleuth sentiment, but we do.
I do see the emails that come in and the things that get flagged, and I would say that an enormous amount of credit needs to go to that community, because without it, we may not have that level of visibility into what's in play. Yeah, absolutely. So, on the sleuth sentiment: I cannot speak for everyone, but I think scientific integrity people probably have the hardest job in publishing, because they probably only get hate emails, from either side. From the sleuth side: why is it taking forever? And from the author side: how dare you criticize my distinguished work? So they probably get all the hate email. I do very much appreciate what you guys do. And yeah, we're all good. See, it doesn't have to be antagonistic, but I think traditionally authors and publishers have had a very us-versus-them relationship.
But I think a more collaborative environment will be much, much more beneficial for both parties and for science in general. So we can move on to collaboration, but I do want to return first to Lisa's question about shaming other researchers, and who is in the place to do that: the idea that societies could be doing that.
Beth, I have to throw it to you. What did you think about Lisa's perspective there and her questions? I didn't jump in yet on that, but it was a nice transition, and I made some notes here. In the end, as anyone that works for a society knows, I think we have a responsibility, because not only are we publishers, but we're also conveners of the community and the research.
And so in many ways, it's not so much public shaming, but we have a responsibility to educate, and we have a responsibility to communicate to the community. We're probably the first place they look for a trusted source on the research and the work that they're doing in the community overall. So I do believe that we have a responsibility, based on my experience in this space. But it's tough, because societies want to maintain the community, and they want to maintain that relationship and that engagement.
And it can be tough dealing with research integrity issues, because that could be a breach of the relationship and that sense of community. It's tough. The other thing is that a lot of us in the society space set the standards; we set ethical practice. I know at APS we've recently gone back over our ethical standards and reviewed them, not just in publishing but in the practice of science as well.
So again, there's a relationship between the work that we do and setting that expectation, I think, across the research community. So maybe we can move on then to speaking more to collaboration. Since we have Mike and Lisa here, I wanted to hear more about United2Act. And congratulations to Lisa on her new role. Yeah, so what's being referred to there is United2Act, whose full name is United2Act Against Paper Mills, which was co-chaired by Deborah Kahn and Sabina.
After the first phase, the first two years, it will be transitioning to a new working-group and leadership structure, and Nandita Quaderi, Editor-in-Chief of Web of Science and a vice president at Clarivate, will be my co-chair in this as well. So we'll be taking the products from Phase I and turning them into more actionable trainings, outreach, et cetera.
During Phase II. A great deal of work has been done to create training materials and documentation. Mike can talk about the amazing stakeholder map group that he has helped champion and drive the development of. The aim is to get these tools out to the publishing community, the university community, researchers, and societies, so that we can make more progress against paper mills and catching them. It's coming at a huge cost, obviously, just pragmatically, for publishers to deal with the onslaught of paper mills, because, as I have jokingly said to Mike, he's just a cost center. Yeah, no revenue generating out of research integrity. But I'm a librarian; we're a cost center too. So, I say this with love.
The other piece we're starting to see: we have the World Conference on Research Integrity, which happens every other year, and just last week we had the first US-based National Conference on Research Integrity. I see a lot of people in the audience here today who were part of that. I think the number one message from all of this work right now is that this issue is not going away.
This is a growing issue; it's going in the wrong direction. So we really need to slow its growth, then retard its future development, and hopefully find mechanisms that will decrease the degree to which this is such a pernicious problem. I imagine we're going to see it, and we are seeing it, creep into conference proposals and lots of other places.
We probably will see preprint servers weaponized, and lots of other things. What I'm seeing right now is a lot of activity in this area in what I'll call verticals: publishers are doing a lot; research integrity officers at universities are doing a lot. So our next step, if we're really going to make progress here, is to collaborate across these silos. We have a lot of efforts right now.
And collectively, can we make those individual efforts have greater impact than just the sum of the individual parts? So, Mike, you wanted to talk maybe about some other industry initiatives here too. Yeah, I can do that, and I'm not sure if you want to contribute to that, so I can actually make it quite quick. Beth, you had mentioned the STM Integrity Hub, and that is something that Wiley is very much invested in; it's a way for us to connect the different submission and peer review systems that we use to identify shared and common flags across our portfolios.
It depends on how many librarians are in the room, but I think the other one that we have contributed to very significantly is the CREC initiative. The National Information Standards Organization (NISO) ran a working group on, and I'm always going to get this acronym wrong, the Communication of Retractions, Removals, and Expressions of Concern, to try and bring some consistency to how those post-publication amendments are displayed and how they are published, in part to solve the problem of how retracted articles continue to find their way into the cited literature.
So I guess one thing I'll say, and then I'll pass it to you or Beth: I was at the STM research integrity day in London in December, and I think what was clearly coming across there was that over the last couple of years we were at a point of not necessarily theoretical, but more development-type work. Now this is really the year where we're seeing a lot of these things being implemented, whether some form of screening technology, the thought leadership work that goes into United2Act, or policies coming out of the Committee on Publication Ethics.
I'll be honest: I sometimes get frustrated with the word collaboration, because I don't think it really demonstrates that we're actually doing something. It's one thing to talk; it's another thing to actually put this stuff into practice. So that's really where a lot of the focus of the work this year is, coming out of those initiatives.
So, I mean, what do you see as the biggest challenges to the success of initiatives like United2Act? Is it that you get stuck in that discussion mode and no action comes? Lisa, I'm curious to hear your perspective too. I mean, I think it is the case that a lot of times we get stuck in examining the problem and not identifying pathways forward.
Obviously, there is the challenge that everyone needs to negotiate: making sure it's collaboration, not collusion. But also, the reality is that libraries and publishers have a customer relationship with each other. And the one other thing that I think libraries have put forward, that maybe my community should be considering, would actually feel perhaps a little antagonistic to publishers.
But since I think it's probably going to come your way, maybe it's better to be forewarned: the same tools that you use to proactively screen manuscripts going forward can, many of them, also screen the already-published literature and put up a dashboard of a particular journal and what percentage of its article output has red flags.
Now I realize that not every red flag is an actual problem. Sometimes self-citation at a certain level is actually appropriate. Other times it's absolutely not right. There's all kinds of things here. But the point is, if I've got a publisher in front of me and I've got a sea of red across a subset of their journals that I'm subscribing to and what was sold to me was high quality, peer reviewed scholarly literature, protecting the scholarly record.
It looks like maybe I didn't get that; maybe I didn't actually get that out of this. So I could imagine future negotiations with some publishers where we say: we didn't get what you promised us here historically. So what are you doing to clean this up? What are you doing to make sure this is not what the portfolio looks like in the next two or three years? And it may come down to some financial negotiations as well.
But minimally, I think you can expect to be asked: what are your research integrity practices? What are you doing about this work in your area? Just as we ask what you do about archiving and preservation, this is probably one more thing we're going to be asking about in those negotiations.
And as I said, it could feel a little antagonistic, but at the same time, we are the customer here, so there is power of the purse on our side as well. As you're talking, Lisa, there are two things that come to mind when I think about all of this. You touched on what happens when things get flagged by all these tools we're bringing online, and the reality on the publisher side.
And again, for those in the audience who work in this space: when something is flagged, you then have to deal with it, look at it, manage it. That takes a lot of human capital, and resources are limited, to a degree, no matter what organization you're at. So you have to figure out, from an operations perspective, how to manage that at scale.
And you can't just throw everything out there; you need to be able to manage it through. A lot of these cases that come through from an integrity perspective are situational and specific, and they do take time, effort, and energy. So again, I think this is where the preventative aspect needs to come into play: how do we prevent some of these issues up front?
And when I think about all of this, I do think a lot of what's getting flagged in the system right now is not all bad actors; I think it's a lack of education. So, going back to how we educate our communities around these aspects, technology, in my mind, is one way we can do that on the front end.
But as we're all managing a lot of other things, this is a major area we're handling on top of everything else we deal with as publishers on a day-to-day basis. A former colleague of mine and I always say we're playing a game of whack-a-mole, especially in the research integrity space, trying to manage the new issues that keep bubbling up. Yeah, so from a sleuth's perspective, one of the biggest threats for me, or one of the biggest annoyances, is these cavalier corrections, or stealth corrections, because a lot of times it's very confusing.
Who do you report problematic papers to? Usually we look at the ethics information and send it to the editor-in-chief. Then we're told we're supposed to be contacting the integrity team; many people don't know about this. Sometimes when the EIC gets a report, they won't necessarily forward it to the integrity team. They'll just forward the email or ask the author, and the author says, oh, whoops, that's the wrong image, the wrong data.
Here's the correct one. And then, many times, the EIC will say, OK, we'll publish a correction. These are corrections without investigation. Sometimes we know for sure those papers are fraud, paper mill products, but when they are corrected, I call that fraud laundering: you're laundering fraud into the literature, cementing it into the literature, turning it into forever chemicals.
And that also poses all sorts of risks for us volunteer sleuths, because now we're the ones seen as denigrating people's work. So that is one of the biggest threats. I'm going to consolidate: my other point is that there should be some kind of ticketing system, because right now we're just looking on the internet to see who the EIC is, Googling the EIC to find their email, with no idea who the integrity people behind the scenes are.
So there's just no systematic reporting. This applies to all the publishers I work with: Springer, Wiley, Elsevier, ACS. I cannot find a place to submit reports, so you just email whoever replies to you, and that creates a lot of confusion and conflict in the system: who's dealing with this? Are we all dealing with it? Are you correcting it while I'm investigating it? People end up stepping on each other's toes, the ethics and the integrity teams, and a lot of fraud gets corrected that shouldn't have been. To me, that's one of the biggest threats. Yeah, I think one of the biggest challenges I'm hearing from everyone is that there are so many different journals out there.
Every institution does things differently; every field, every society does things differently. There's a level of heterogeneity that is very, very difficult to overcome in this system, and that's why I hope more comes from United2Act to help with that. Great. We're getting close to Q&A time, so I wanted to shift and do a little horizon scanning. Could I just hear from everyone something you're excited about, or a growing threat you're really keeping on your radar?
We haven't talked about AI much, and maybe someone wants to jump on that one. No? That's fair enough. But I can talk about one thing I'm excited about. We announced, I want to say roughly a month ago, that we have partnered with Leiden University and their Centre for Science and Technology Studies to fund a four-year PhD student who will do research on paper mills, focused specifically on the operations associated with those paper mills, but also looking at some of the perverse incentive systems we have talked about on this stage.
And this is also, not to overstate the purpose of United2Act, very closely aligned with some of the work that has come out of it: one of the main outputs was from the working group focused on research on paper mills, which published a bibliography of the available research but also issued a call for more research on the phenomenon.
And we partnered with Leiden very intentionally, because they have taken this meta-research, research-on-research focus in the work they do. Leiden is in charge of recruiting and of selecting the student; as the funder and sponsor, we're not involved in that piece. But we were pleasantly surprised:
We had 72 applications; they have now narrowed it down to 15, and we're hoping to see this person in place in September. I think that's really exciting, because when I think about what the incentives are for sleuths to participate in this system and keep providing the often unfunded work that we do, being able to provide funding is a really exciting opportunity.
I think I've already talked about this, but I genuinely am excited about how we can teach authors through tools in the submission process, or pre-submission. I'm a practical person; I learn by doing, and I think there are a lot of people out there who subscribe to that philosophy too. So there's real value in teaching people through the process. And I know that in the industry, for a while, we've made decisions because we want to make the author experience frictionless.
We want to make it easy, without recognizing that we don't have enough resource on the back end to then deal with the challenges. We all deal with trying to find peer reviewers; that's always a crux of any journal, that piece of it and the resourcing of it. So I do think we need to think a little differently in the industry. Maybe it's because I didn't come to this industry until about nine years ago.
And there's just a different perspective: maybe, up front, if we make it valuable to authors to get their manuscript correct and meet these checks, it's actually going to make their process faster in the long run, with less resource for us to manage, and things like that. So conceptually, to me, that's one of the things I'm most excited about from an operations framework in the society space.
So I will give one excitement and one threat, since no one else has offered a threat. I also happen to be chair of the board of ORCID, and one thing I'm very excited about is this: from day one, ORCID has always had provenance information for every piece of data we display within the system, recording whether it came from the author, from a publisher, or from an institution.
And this provenance data has now been reformulated in an easier-to-consume way, in a trust marker dashboard, as well as trust marker data you can get via API if you're a member, in order to embed it into your systems. The work ORCID has done from day one, with real prescience, means that for every piece of data we can tell you where we got it.
That can now be used to do some of this author verification, or to get a sense that somebody has a trusted record, including, most recently, our effort around institutional emails: even if the email itself isn't displayed, we can display that we have a verified institutional email for this person. That's the level at which I can talk about this as board chair. I see Chris in the back.
Chris is the executive director of ORCID, so anyone who wants to hear more about that should go talk to him. But I'm really excited about the ways we're able to offer that data out to the community in really consumable ways. Now, here's the threat piece I'm really pondering right now: it turns out anyone can call themselves a scholarly publisher. Literally, we could start up a journal right here and have a website by the time we're done, because websites are easy. And so we are seeing some people create some "scholarly" journals right now.
And really, we have only two actors who do a bit of cleaning of the literature at a meta level, as opposed to the individual publisher level: Clarivate with Web of Science and Elsevier with Scopus, who say, we delist things, we disclaim them. And yet the industry is often very frustrated with them, because this kind of scrutiny might impact your journals as well.
But I think one thing I actually see as a threat to this overall enterprise is this: if things that would not pass muster on research integrity have the same look and feel, and claim to come from a scholarly publisher, and we don't have mechanisms in place, then it's again the healthy-ecosystem problem. It poisons the notion that the scholarly record has integrity, because any given publisher is only protecting their own part of it, and the industry doesn't have really good ways of throwing people out and saying, I don't know what you're doing over there, but that's not scholarly publishing.
So I don't know if it's a trend, but it sits in my gut a lot: how do we police the boundaries of what gets to be called scholarly publishing? Well, do you have anything to add? Otherwise, we'll open it up for questions. Yeah, let's go straight to Q&A. All right, awesome. So we have 10 minutes; I'd love to hear from the audience.
And the mic is here, if you don't mind walking over to it. Thank you, appreciate it. Hi, great presentation; thank you all so much for sitting down and talking to us. I have a quick question about international issues. Most of the tools we use to search for authors or papers, or for the flags we're talking about, usually rely on indexed sources that are either in the English language or otherwise in Latin alphabets.
And obviously, that's not every language in the world, nor every alphabet. So I was wondering if you have any best practices, strategies, or forthcoming things that could help with that. The primary language we publish in at Wiley is English, so this is about discovery of foreign-language articles.
Not in English, but perhaps in Chinese or Arabic, and whether we can enable search and discovery for those? Yeah. Beth, it sounds like you might have an answer here. Yeah, this is like stump the panel. Sorry. The thing I think about in this space is that we do a lot of our work in English, but we do have a lot of global industry partners out there.
And so, for those of us who work with some of those international vendor partners, there are opportunities there you can leverage, and I think a lot of flexibility among some of them; that's where my head goes. It's about thinking through how you can leverage industry partners to help in this space, and what tools or even just what human capital you may need, whether for translation or otherwise. I know a number of publishers have translated even just their site interfaces and authoring services, that kind of thing.
In the integrity space, I don't know that I've personally ever seen an issue we've had to deal with that involved translating something that came through the door, but it's something we'll probably talk about after this panel. And I wonder, are you also thinking about identity verification, and also citation verification?
Yes: this information claims to come from this place, and it's like, well, maybe it does, I don't know. Yeah, I know STM is working on, and I don't want to speak for them since someone else from STM is in the room, new guidelines related to identity verification, I believe third-party identity verification. That's really interesting.
Yeah, I would say the STM work on identity verification is very cool. I don't have all the details personally, but I've sat in a couple of presentations about it, and it goes back to being OK with putting some friction in place to validate authors' identities as they come into our systems and accounts. There are differing opinions, I think, but is it a way to prevent some of these issues long term?
So I would recommend looking up that information and doing some research in that space. Thank you. Thank you. Hi, I'm Shaina Copeland from Cell Press, which is owned by Elsevier, so I hope I'm not biting the hand that feeds me. But I know that one of the biggest challenges for research integrity is volume.
We keep getting more and more papers. They keep launching more and more journals, pushing a bigger and bigger pipeline through us, and so we don't have the time, energy, or resources to do the investigations. Is anybody higher up than the research integrity people listening? Maybe you at Elsevier don't listen; the Elsevier integrity team is very disappointing, let me just put it that way. It sounds like I'm a hit man, because I'm hitting a lot of Elsevier journals, but I don't target by paper or by journal. Take Environmental Research: I reported about 80 papers last November, and they have now retracted over 20. That's over 20% of the papers I reported.
Mind you, I'm a mouse behaviorist; I have no idea what the chemistry or physics in those papers is about. But if whatever I report can cause 25% of those papers to be retracted within a couple of months, that means the journal is in huge trouble. And I don't have that kind of relationship with the corresponding person at Elsevier; he doesn't really reply by email. So if you can pass the message up your sphere, that would be great.
If I could just, sorry. I think there's an interesting thing here. First, if it's real science, it's pretty darn exciting that science is growing, right? I'm not a person who says more articles is necessarily bad. Some people say, oh, we should limit the number of articles; I like the growth of knowledge.
So as long as the growth of output is actually a signal of the growth of knowledge, that's one kind of problem to deal with. There's also the paper mill problem, which is flooding certain journals. And then, as we've seen recently, there's material that isn't even a research integrity issue; it's just the low quality that AI is enabling. So it's all just costing more.
But ultimately I'll come back to the same thing I said before: this is eroding the foundations of the business. I would hope that if your building were literally falling in on people, you'd say, we've got to do something about that roof; and if your building's foundation were washing away, you'd say, we've got to do something about that. So again, I come back to this: it has to become risky enough.
But the other thing here is that open access publishing models do incentivize publishing more, maybe not at any given journal, but overall, and they incentivize keeping the cost of publishing down, because there are no more authors to charge for a given article, unlike with subscriptions, where you could hopefully go out and get more libraries subscribing.
That's an oversimplification, but this is the reality. And 130 people is probably not enough if you're publishing 3,000 journals, because it's a scale question on both sides. So, Mike, you're dealing with thousands of journals too. Yeah, I have colleagues from Wiley in the room, so I have to respond to this carefully.
So to answer your question: yes, our leadership is listening, and in fact they're probably listening a little too closely, just based on the number of emails our team gets on a regular basis. Absolutely, the number of submissions coming to Wiley is growing; I believe we saw a 20% increase in submissions over our last fiscal year.
We measure our fiscal year in a slightly odd way, May to April, so just in case you are comparing that in Dimensions, it might not formally line up. But the more we can do at the point of submission and in those early stages to weed out the submissions that should not move into peer review, the better off we're going to be.
And one of the things, and we were speaking with Megan and another colleague about this yesterday, is that we've always had iThenticate, we've always had some form of screening tool, but that's never been a magic bullet that says, yes, you should definitely reject this paper because there's 30% text overlap here. I think the key is going to be that, as we throw more and more screening at the beginning of the process, we're going to trip a number of different signals.
And so we need to understand, whether you're a decision-making editor, a managing editor, or an editorial assistant, whatever your role is: how do I interpret these six or seven signals that have come in? Do I need to open an investigation? Should I reject this paper outright, or is there some level of false positives in play? I came into this job a number of years ago thinking, oh yeah, if we just screen for it, we can easily identify it.
I think how we operationalize and establish the right escalation points is probably the most important thing, so that editors-in-chief can do the job of editors-in-chief and decide what's going to go to peer review and what's not. Do you get a four-chair turn, like on The Voice? A full panel answer here. I would piggyback off Mike and say you have to be realistic: yes, we're all drinking from a fire hose right now, and we're trying to manage that, especially those of us seeing growth in submissions on top of it.
So it's a capacity issue. I would say that thinking about how you go about operationalizing is really critical, because that's going to tell you how, and how fast, you can scale: piloting something, knowing what you need from an operational perspective once you've launched it, and then thinking about how you execute it.
And to me, that's a really important phase: thinking about the rollout, having a plan, and having the process associated with it. And I would start with: what's your top priority right now? A former coworker of mine said, right now we need an image manipulation tracking tool; that was the priority. We couldn't do everything at once.
But what's your highest priority? Then try to piecemeal it, break it into components, in order to manage it. To me, some of that has helped really tick the box of, hey, we've launched this integrity check, we've done this. But again, at the end of the day, you still have to manage all of the volume. And I hear what Lisa is saying; we're all taking a look at this in many ways.
Trying to do it all at once is just not realistic, so I would break it down and try to operationalize it in a realistic manner. Thank you. So, one last question. Yes, I made it. Luigi Longobardi, Tripoli. My question is inspired by something you said, Lisa.
You were talking about the fact that paper mills are a multi-stakeholder problem and that we need to break silos: funders, research integrity officers, university research integrity teams, and journals all need to come together to work on this one issue. And I wonder whether United2Act has any plan to address this issue: when we encounter these problems, there is a lot of personally identifiable data that we come across, and sharing this data across silos requires governance and a legal framework.
So is there any plan from United2Act, or from anyone out there who wants to take it on, to address the issue of legal frameworks and governance? I just wrote it down as something we'll talk about. Great. I mean, that is a really good point. Research integrity investigations can also be weaponized, as I'm sure many of us are aware: people report things that actually aren't research integrity issues because they're competing with someone, or it's their ex-spouse, all kinds of things come into this too.
So we have to be careful not to assume that every complaint is legitimate; at the same time, we have to take every complaint seriously. And the personally identifiable information is a really interesting one to try and grapple with there. I have a concern about processing the paper mill information, because I guarantee you that information will be very skewed, showing geopolitical biases, because it comes from very specific countries.
I struggle a lot with how not to end up profiling certain countries in a way, because there's no way around it when the two things parallel each other so closely. So I can see this becoming a very, very tricky issue for legal and political reasons. It has really been stump the panel.
So good job, everyone. It doesn't look like there are any more questions, so thank you so much for joining our panel today. And thank you, everyone.