Name:
A Cross-Industry Discussion on Retracted Research: Connecting the Dots for Shared Responsibility
Description:
A Cross-Industry Discussion on Retracted Research: Connecting the Dots for Shared Responsibility
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/b4f332bd-3dc9-4927-9986-7f89cd90f828/thumbnails/b4f332bd-3dc9-4927-9986-7f89cd90f828.png
Duration:
T01H00M50S
Embed URL:
https://stream.cadmore.media/player/b4f332bd-3dc9-4927-9986-7f89cd90f828
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/b4f332bd-3dc9-4927-9986-7f89cd90f828/AM21 Session 5A - A Cross-Industry Discussion on Retracted R.mp4?sv=2019-02-02&sr=c&sig=HV9%2BlE2DwdNnAH5VKJwjypRzXdyDFG6FrmBQzzVi%2FTc%3D&st=2024-11-22T08%3A45%3A17Z&se=2024-11-22T10%3A50%3A17Z&sp=r
Upload Date:
2024-02-02T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
ANNETTE FLANAGAN: OK, Mary Beth, are we live?
MARY BETH BARILLA: You are live. Go ahead.
ANNETTE FLANAGAN: OK. Good afternoon, good evening, and good morning. Welcome. If you are intending to attend Session 5A, this is a cross-industry discussion on retracted research: Connecting the Dots for Shared Responsibility. I'm Annette Flanagan, executive managing editor and vice president for editorial operations for the JAMA Network.
ANNETTE FLANAGAN: And I'm coming to you virtually from the coast of Maine where Winslow Homer painted many paintings. I'm joined by my four esteemed colleagues here who I'll introduce to you briefly, and then go over some housekeeping tips and then we'll just jump right in. First, Jodi Schneider, who is the reason probably we're all here today. Jodi is at the University of Illinois, and she launched a Sloan Foundation funded initiative called Reducing The Inadvertent Spread of Retracted Science.
ANNETTE FLANAGAN: So Jodi, thank you for that and for bringing us all here. Jodi is going to be giving a summary of that important initiative. Jodi will be followed by Deborah Poff, who's chair of the Committee on Publication Ethics, or COPE, and Deborah is going to talk about a working group that she led to come up with a taxonomy of retraction categories, or classifications. Deborah will be followed by Hannah Heckner of Silverchair, who will discuss the role of publishing platforms in managing retractions.
ANNETTE FLANAGAN: And batting cleanup is John Seguin at Third Iron who will be discussing the role of technology to manage the dissemination of retractions. May I have the next slide. Thank you. Just some helpful information for all of you. There's the meeting hashtag, if you're into tweeting.
ANNETTE FLANAGAN: The presentation here that is live will be recorded and available after this broadcast. In the Files tab in your Pathable platform, you will see four files that we encourage you to take a look at, if you wish. The COPE retraction guidelines are there, the Reducing the Inadvertent Spread of Retracted Science recommendations are there, as is a link to the website for that project.
ANNETTE FLANAGAN: And our slides are there. Finally, we will be using the Pathable chat section, so if you're in your Pathable area you go to the Chat tab on the right, and that's where we'll be asking you to type your questions, which we'll be taking and addressing as we move through the program here. So with that, I'm going to start, and turn it over to Jodi.
ANNETTE FLANAGAN: All yours, Jodi.
JODI SCHNEIDER: Thank you so much, Annette. Let's take a look at what retraction is to start with. It's a mechanism for correcting the literature and alerting readers to articles that cannot be relied on because of serious flaws or erroneous content or data. Next slide. In particular, retraction is meant to minimize the number of researchers who cite the erroneous work, act on its findings, or draw incorrect conclusions. However, that doesn't really happen.
JODI SCHNEIDER: Next slide. So for about 20 years, or maybe 30, people have been noticing that retracted papers continue to get cited. Here's an example from the past year. There are two COVID-19 articles that were, in the Surgisphere case, retracted less than a month after they were published. Those two papers each have over 900 citations.
JODI SCHNEIDER: Science Magazine took a look at 200 of the citations that were in articles published after the retraction, and about half of those articles that they looked at inappropriately cited the retracted articles. That is, they didn't take note of the fact that they were retracted. They didn't mention the retraction. They were treating them as regular science. 50% is actually pretty good in this space, and that's really the problem.
JODI SCHNEIDER: Next slide. So when we've looked at large sets of literature, 95% of the post-retraction citations don't show awareness of the retraction. Let's take a look, next slide, at what those sorts of citations look like. So they're just normal citations. A study in rabbits confirmed that this treatment can achieve a faster healing rate, right?
JODI SCHNEIDER: You shouldn't be citing something retracted for that. Or: to date, only one human study has demonstrated such and such. You shouldn't be using retracted citations for that either. What should you use a retracted citation for? The 5% of citations that are reasonable are things like discussing background. People often are talking about how a particular piece of work impacted society or impacted the field, and they know it's retracted.
JODI SCHNEIDER: They mention it's retracted, and they discuss the impact. Or perhaps, like in the work that I've been doing studying retracted papers, sometimes you need to use them as data and describe them. So there are a handful of times when it's reasonable to intentionally cite. Next slide. Unfortunately, what we found is that there are just not differences in how retracted papers are cited.
JODI SCHNEIDER: So this picture on the left in blue, that's papers that were retracted and how they're cited before the retraction. Before the retraction they're just normal papers, right? The red is after, and I would say there's not really a difference between those. So fundamentally-- and this is where in the text they're cited, so in the introduction, of course, lots of things, background, and then towards the end of a paper you're often citing a lot.
JODI SCHNEIDER: But there's just not a difference here. And so there doesn't seem to be awareness among authors, when they're writing and citing these retracted papers, that these things are retracted. So that's led us to this project. Let's take a look at the next slide. So Reducing the Inadvertent Spread of Retracted Science is a project that's funded by the Alfred P. Sloan Foundation, and in that project we have consulted with stakeholders from across scientific publishing and done an environmental scan, looking at what's known about retraction both by talking to people and by looking at the published literature.
JODI SCHNEIDER: In the fall, we had an online workshop where we discussed these issues, particularly what are the problems that retractions pose, what are possible solutions. And then you've just seen some of the results of the citation analysis, and we have an ongoing literature review where we found about 300 papers that contribute to the empirical research about retraction. Next slide.
JODI SCHNEIDER: So we started out with four motivating questions: what's the harm associated with retracted research; what are the intervention points for stopping the spread of retraction, and which gatekeepers can intervene or disseminate the retraction status; what are the classes of retracted papers, and what classes are citable and in what context; and what are the impediments to what we called at that point open access dissemination of retraction statuses and retraction notices.
JODI SCHNEIDER: We'd probably reframe that if we were writing it anew. This is where we started. And next slide, out of that we have come up with draft recommendations. These are out for review. I would invite you to take a look at the website, and to see the document that's there that talks about these in more detail. Fundamentally, we have four recommendations. We need a cross-industry approach to ensure that high quality, consistent, timely information about retractions is available.
JODI SCHNEIDER: You'll hear momentarily about the taxonomy, but we're recommending a taxonomy of retraction classifications and corresponding retraction metadata that all stakeholders could adopt. We need best practices for coordinating the retraction process to enable timely, high quality outcomes, and we also need to educate stakeholders about publication and correction processes, including retraction, but also other sorts of pre- and post-publication stewardship of the scholarly record.
JODI SCHNEIDER: One problem that's come up often in this space is the stigma that's associated with retraction. And we want people to be able to think about post-publication changes to the record on a continuum, where retraction is just one part of that continuum. So next slide, this is a picture of the draft recommendations. There's a document on the science policy research gateway of F1000Research.
JODI SCHNEIDER: You can get to that from the project website. There's a Google Doc you can comment on. Feel free to email me, jodi@illinois.edu, with comments or suggestions to improve this before we finalize. And so finally, the next slide will just show you the folks who are speaking, and I'll pass it off to Deborah.
DEBORAH POFF: Great. Thanks very much, Jodi. So clearly I'm one of the people who represented a stakeholder group in this project, which I found a fascinating project, and a lot of different orientations brought to the table. And we've had days where we met virtually. And part of that process is we met in subgroups to address specific questions from our perspectives. And the group that I ended up in was a group that was to look at the issue of retraction taxonomy.
DEBORAH POFF: So that's what we did. There was another group, a parallel group, that is looking at metadata, and I'm not sure where they're at right now. But we divided the labor, and this is the group reporting the proposed retraction taxonomy. Next slide, please. So the rationale for this proposal. There are excellent examples of proposed taxonomies -- Crossref's is a good one, and CSE's -- and different proposed approaches. Our former chair of COPE, Ginny Barbour, was one of the people who took a different tack on this, an approach that was trying to destigmatize people's concerns about retraction.
DEBORAH POFF: And Daniele Fanelli and his group have a well-cited article that also attempted to identify a clear taxonomy. But as yet there hasn't been a lot of uptake of these taxonomies. And when we thought about it and asked what's the problem here -- because there's a lot of good material out there and excellent people willing to step forward and do this -- what are the problems?
DEBORAH POFF: And part of what we identified as a problem is the redundant language that's used. So because I'm a philosopher, I called it the problem of distinction without difference. There are overlapping and redundant concepts used, without meaningful differences between them. There's a heck of a lot of terms used for correcting the literature, and there's a kind of laissez-faire marketplace adoption of practices based on the individual historical artifacts of different publishers and different practices.
DEBORAH POFF: And there's also, we would suggest, a confounding of the reasons for correction with the correction itself. Next slide. So we came up with five high-level principles, because when you start digging down, that's when you get stuck in the muck. So we thought if we could keep it simple, clean, and uncontroversial, then we'd have a better chance of actually engaging people in thinking that this was a great idea for us all to be using the same five concepts, with the same understanding of what they mean.
DEBORAH POFF: So we suggested correction, expression of concern, retraction with replacement, retraction, and withdrawal. Next slide, please. So corrections. This term is to be applied to changes in published content that address errors, with or without clarification, but that don't require a higher level of concern, retraction, republication, or replacement. These articles have errors that should be corrected, but the content is essentially fine.
DEBORAH POFF: And typically, notices are issued by journals, which describe the errors and the source of those errors in the form of correction notices. Next slide, please. Expressions of concern. This term is to be applied when there's concern about the integrity or reliability of a published article without sufficient information to determine the final disposition of the article.
DEBORAH POFF: And this happens fairly frequently to editors. I've been an editor for a very long time. When we get these concerns and can't determine on our own what the consequences of the problem should be, we usually contact the author first, as the editor, and explain what the problem is to see if they're willing to explain it. And if it's sufficiently troublesome, we also contact the institution and request an investigation.
DEBORAH POFF: And it's usually the institution, although it can be the funder. Editors as we all know don't employ authors. Institutions do. And so they're the employer and it's their problem, their issue to do an appropriate investigation of something, whether it's founded or whether we decide that it's unfounded and that the article can stand. So this is a term that is appropriate when the editor is not able to definitively decide what the consequences of the concerns are.
DEBORAH POFF: With notices of expression of concern, the article is properly labeled or watermarked, with reciprocal linking between the expression of concern and the article in question. Next slide, please. Retractions. The term retraction here is used for published articles determined to include results of scientific misconduct, such as FFP -- fabrication or falsification of data, or plagiarism -- or pervasive errors that irreparably invalidate the key findings of a study. A retraction is typically issued by an editor or a publisher, but it can also be issued by an author or other authorized institutional representatives.
DEBORAH POFF: The article should be properly labeled and watermarked, and there should be reciprocal linking between the retraction notice and the retracted article in question, with the reason for the retraction included. Next slide, please. Retraction with replacement. This term is used for published articles with serious errors. I'm sorry, there's something blocking the language under the pictures of people.
DEBORAH POFF: These are articles that need to be corrected, where the corrected changes to the findings are significant but don't invalidate the underlying science or methods of the study. A notice of retraction with replacement can be issued by an editor, by an author, or by a publisher, and there should be reciprocal linking between the retraction notice and the retracted and replaced article. The reasons for the retraction with replacement should be stated.
DEBORAH POFF: Typically, copies of the original article with the errors highlighted, and with the corrections highlighted, are published as supplements to the retracted and replaced article. Next slide, please. So withdrawal. There are two basic reasons for withdrawals, and we had a bit of a challenge with this one because the term is used differently by two different kinds of entities.
DEBORAH POFF: One, the term is used in the preprint world for articles that have serious or pervasive errors that would otherwise be retracted; with preprint servers, withdrawal is the term used, and it can be different from a retraction at a journal. And two, and this is with respect to a number of the big publishers, it can apply to accepted versions of manuscripts that a publisher has posted with a DOI or other persistent identifier but that haven't gone through final editing, production, and formatting.
DEBORAH POFF: Such a manuscript may have serious and pervasive errors that would otherwise be retracted; however, the serious nature of the reason leads the publisher to determine that this is a cause for withdrawal. And there are some recent examples of withdrawals because of offensive, discriminatory opinion, where it was deemed important to remove sexist or racist opinion. So that's how both the preprint users of withdrawal and the publishers of accepted but not finally edited articles can use the term withdrawal.
DEBORAH POFF: Next slide, please. Now, because of the issues around withdrawal and the reasons for withdrawal, we had a bit of a split in the subcommittee, where some members thought that we should have a sixth term for inflammatory articles, harmful violations of ethical norms, or security risks, for example. And those members said we should use the term removal for those cases.
DEBORAH POFF: And others felt that withdrawal could serve this purpose, and has. So we're just noting that we weren't on the same page with respect to everything. Next slide, please. So lastly, the assumption. This was an assumption that we had from Jodi and her colleagues in the white paper they produced: for this to operate effectively, it needs to be housed and curated by a non-profit organization that has the mission to advise or educate on publication ethics, or by a partnership of two or more of these kinds of organizations.
DEBORAH POFF: People have suggested COPE and NISO as candidates. So that was the last thing we wanted to state because one can't just develop this and think that everybody's going to jump-- read the paper and jump on board. So the assumption is that someone will take ownership of housing, curating, and disseminating the information, and probably some educational activities as well around the taxonomy.
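To make the proposed taxonomy concrete, here is a minimal sketch in Python of how the five notice types and the reciprocal linking Deborah describes might be expressed as metadata. The class and field names are illustrative assumptions, not part of the working group's proposal.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class NoticeType(Enum):
    # The five high-level terms proposed by the subgroup.
    CORRECTION = "correction"
    EXPRESSION_OF_CONCERN = "expression_of_concern"
    RETRACTION_WITH_REPLACEMENT = "retraction_with_replacement"
    RETRACTION = "retraction"
    WITHDRAWAL = "withdrawal"

@dataclass
class PostPublicationNotice:
    # Hypothetical metadata record for a notice; the key idea is that the
    # notice names its type, links to the article it amends, and states a reason.
    notice_doi: str
    article_doi: str
    notice_type: NoticeType
    reason: str                              # the reason should be stated, per the proposal
    issued_by: str                           # e.g. editor, publisher, author, institution
    replacement_doi: Optional[str] = None    # only for retraction with replacement

@dataclass
class ArticleRecord:
    # The article record links back to its notices (the reciprocal link).
    doi: str
    title: str
    notices: List[PostPublicationNotice] = field(default_factory=list)

    def current_status(self) -> Optional[NoticeType]:
        # Simplified: the most recent notice determines the displayed status.
        return self.notices[-1].notice_type if self.notices else None
```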
DEBORAH POFF: So thank you very much. And I will pass this on to Hannah.
HANNAH HECKNER: Wonderful. Thank you so much, Deborah, and thanks also to my fellow panelists and our moderator. And yeah, as Annette said, thanks so much, Jodi, for getting this merry band of misfits together to talk about retractions. I'm Hannah Heckner. I'm the product strategist at Silverchair, which is a publishing platform. We have many publishers on our platform representing many different content types, disciplines, and publishing avenues.
HANNAH HECKNER: So excited to be here today. I thought a really helpful way for me to start my portion of this session today was just to sort of go over some grounding ideas about the job of a platform. So primarily, we're going to be acting as a container for content. The job of the platform is to make sure that we're disseminating our publishers' content with clarity, thoroughness, and a thought to all of the end users.
HANNAH HECKNER: This might be a researcher, but this also might be a crawler, a machine reader. So we need to keep all of that in mind as we load content onto the platform from the publishers. In addition to communicating that research content, the platform should also be sort of what you might call a lowercase p platform. We want to be the home for our publisher.
HANNAH HECKNER: We want to be where they interface with their users, where they build user loyalty, where they build their brand. So this is really all about the important position of the publisher to collect information about the content, collect information about the user, really serve as that face forward to the public, and build that trust with users. So as the main home for the content, the platform should really be a hub of information, a possessor of canonical URLs and versions of record, and also the downstream distributor and source of truth for the content.
HANNAH HECKNER: Let me get over here. So in regard to retractions, I really like the idea of the term palimpsest, which is a historical record that captures what was originally written on a piece of papyrus, and where other information was then written on top of that. So as a record, this artifact shows not just what was originally written and not just what was written subsequently, but also the life of the content, the process behind it.
HANNAH HECKNER: So we think we can really apply this idea to the publishing platform when thinking about how we handle the publication of retractions, corrections, et cetera. We are the source of truth for the original research as well as any retractions or correction notices published subsequently. And, maybe as an aside, slightly apart from this concept, we also want to very clearly communicate this pathway, this life cycle of the research.
HANNAH HECKNER: So yeah, the platform should create and maintain links between the original content and the ensuing updates, and should also serve as a link out to other places as well, those downstream deposit services, as the content develops and changes. So a big part of this is just the display piece. Platforms should be a partner to publishers in applying watermarks to PDFs to communicate retractions.
HANNAH HECKNER: We should also be providing links and notices when something has been updated. So this is connecting a notice of retraction with the original piece of research, and this is also a notice, when a user is on the original piece of research, that it has been retracted and replaced. This final example is the addition of supplemental information on an original piece of research that shows the process of the retraction, so it shows those corrections in action.
HANNAH HECKNER: So in addition to just this display piece, a platform has more than just that to do. We can't just say, OK, we're done, we've displayed information on our sites to say this has been retracted. As the evidence that Jodi talked through earlier points towards, folks are not just going to the site of a publication and finding out all of the information about that. These articles might already be in someone's bookmarked folder as a PDF that they wanted to refer to in their research.
HANNAH HECKNER: So the publishing platform also needs to be connected with third-party groups outside of the platform to communicate information about how that content is being changed. Here are just a handful of those third parties, including abstracting and indexing services and Crossmark, which is affiliated with Crossref, where you can check for updates on the content. And I've also pointed to scite.
HANNAH HECKNER: I believe John will talk more about scite later on. But this third party acts as a widget to show whether references in the text have been confirmed, disputed, or are neutral. So this is just one way the platform can offer information to the metadata universe that supports the original research and any ensuing updates.
HANNAH HECKNER: And it's really keeping in mind the fact that a human reader is going to see one thing, but there's also a lot more in the back end within the XML that needs to be developed to really flesh out the story around the research. With that being said, the platforms and their publishing partners are doing a lot to get this information out about retractions and corrections. There's a lot of room for growth.
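As one concrete illustration of the machine-readable update signals Hannah is describing, the sketch below asks the public Crossref REST API whether a DOI is involved in any update relationship, such as a retraction notice. It is only a sketch: the 'update-to' and 'updated-by' field names reflect my reading of Crossref's works schema and should be verified, and this is not Silverchair's or Crossmark's actual integration.

```python
import requests

CROSSREF_WORKS = "https://api.crossref.org/works/"

def check_for_updates(doi: str) -> list:
    """List update relationships Crossref records for this DOI.

    A retraction notice typically carries an 'update-to' entry pointing at the
    article it retracts; a retracted article may carry 'updated-by'. Treat the
    field names as assumptions and verify them against the current schema.
    """
    resp = requests.get(CROSSREF_WORKS + doi, timeout=10)
    resp.raise_for_status()
    work = resp.json()["message"]

    relations = []
    for entry in work.get("update-to", []):     # this work updates another work
        relations.append(("updates", entry.get("DOI"), entry.get("type")))
    for entry in work.get("updated-by", []):    # this work has been updated
        relations.append(("updated_by", entry.get("DOI"), entry.get("type")))
    return relations

# Usage sketch with a hypothetical DOI:
# for relation, other_doi, kind in check_for_updates("10.1234/example"):
#     print(relation, other_doi, kind)   # e.g. updated_by 10.1234/notice retraction
```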
HANNAH HECKNER: A lot of these items were touched on, and this is certainly not an exhaustive list, but we see a lot of inconsistencies in metadata and display standards between publishers, platforms, et cetera, about maybe when something is retracted, what it's going to look like, et cetera. So there's also going to be different practices around what merits a retraction.
HANNAH HECKNER: So some of the taxonomy work that Deborah was discussing points to this. I think that it is fine if some of those differences exist between disciplines. We certainly can't paint every single publishing organization with a broad brush that everything needs to fit into these tidy boxes, but just finding some consistencies between those practices and standards I think will be really important as we move forward.
HANNAH HECKNER: In addition to needing more developed metadata that follows the standards we might already be following, I think there are also just gaps in the metadata ecosystem. We should maybe have more URL markers and XML meta tags that can point towards more information about the content and what its status is. On top of this, I think this should be seen as a great opportunity: in this current day and age, when so many more members of the general population are reading research, this is really an opportunity to make sure that we are communicating to that audience specifically, and maybe to look at some of the language and display cues that we're using.
HANNAH HECKNER: And even how we think of the research being used, keeping that in mind as we set statements and practices moving forward. I think also -- and this was pointed to earlier by Jodi -- that the stigma that is tied to retractions can really lead to an opacity around the practices here. We can see evidence even in COVID research where content was silently retracted.
HANNAH HECKNER: There's a stigma against publishers and against authors, where they might hesitate before retracting content or before publishing a retraction notice in a more public way. And we really need to fight against this, because transparency and clarity in communication are going to be really important moving forward. On that note, suggestions for moving forward: this panel, I think, is a great example of what can be achieved with cross-industry collaboration.
HANNAH HECKNER: All of these stakeholders within scholarly publishing, we might not agree universally on a lot, but I think we can all get behind the fact that we do not want retracted research to be cited and brought forward as truth, gospel truth. So digging down into this idea of cross-industry collaboration, I think we also should really look at further engaging with and leveraging some initiatives that are already happening.
HANNAH HECKNER: There are some really cool projects that are going on as well as stakeholders within our communications ecosystem that I think are doing a lot of things that could be maybe applied to this opportunity here. I was reading about the STM article sharing framework and how they are able to identify information from a content license in order to very easily surface the sharing capabilities of content. I think that this could be really interesting if we have richer metadata that points towards corrections or the correction status of something, the retraction status of something, and surfacing that in content wherever it might exist.
HANNAH HECKNER: I think that could be incredibly valuable. Also, as a platform provider, we work really closely with Google Scholar, and are constantly updating URL markers and meta tags to indicate the openness of content or of elements within that content. So I think this could be a really interesting place where we can further engage with Google Scholar, as we know how much they touch within our ecosystem.
HANNAH HECKNER: I also think that there could be something with the Crossref Distributed Usage Logging project. I know that it's sort of on pause right now, but their overall mission to connect a lot of metrics across all of our information silos -- I think that could be something really interesting to look into. Also, the taxonomy project that Deborah just spoke about I think will be a huge boon moving forward, just finding more standardized language, maybe even similar to the CRediT taxonomy, that will then be able to be translated into usable XML that could go downstream.
HANNAH HECKNER: I think that could be incredibly valuable. And a few of the members of the panel, or the group that Jodi got together, are also starting to get a NISO working group together that's looking into this issue. Todd Carpenter told me that I can't say too much. So consider this a bit of a teaser and a watch-this-space moment. But I think that we have a lot of really great momentum and opportunities ahead of us, and I'm really excited to be a part of this and to continue this conversation as it moves forward.
HANNAH HECKNER: And yeah, thanks so much for your time. Excited to chat more in our discussion, and I'll hand it over to you, John.
JOHN SEGUIN: Thanks, Hannah. My name is John Seguin. I'm at Third Iron. We're a library technology company, used by about 1,200 libraries around the world. The part that I'm going to be speaking about is the technology pathways. We've already talked about standardizing these retractions, and Hannah talked a lot about where these retractions should be appearing and how this dissemination should occur.
JOHN SEGUIN: We know that there are already places where it's not being picked up or not being observed, as Jodi was showing before. And I think a lot of this has to do with the issue that a retraction status, of course, is not perfectly in sync with paper publication. So a paper comes out at one point. At some point in the future a retraction occurs. And at the point where a user interacts with a particular piece of literature, that retraction notice may be known, and in that case what's on the publishing platform may be all that is needed.
JOHN SEGUIN: A user encounters it, they see it, and they go, oh, this is retracted. Now I know what I need to know. But of course, it doesn't always happen that cleanly in the real world. There are lots of technologies out there for discovering papers, for manipulating things, for organizing your citations, for building your research projects.
JOHN SEGUIN: And those retractions may occur at any time. And so how can we take the knowledge that may be in one bucket -- these things have been retracted -- versus all of these other buckets that exist for discovery, holding onto citations, building your paper, and building your project, and try to marry them together? So that's where different technology pathways come into play.
JOHN SEGUIN: Next slide, please. Now, when we talk about that, I think there are three major areas. The first one is kind of on the front end: essentially, if I'm going to be looking for research, I'm doing my literature reviews, I'm looking for things about the topic, I'm largely in some sort of search and discovery mode.
JOHN SEGUIN: So I could be in different indexes. I could be off in Google Scholar, in PubMed. I could be in my library discovery systems like Primo and Summon and EDS and so on. So I'm doing a bunch of searching in those indexes, and those indexes may have some knowledge of retraction, or they may not have any knowledge at all. They may have partial knowledge. It could be kind of all over the place.
JOHN SEGUIN: But depending on those services and how much knowledge they have, when you link through to full text it may not be apparent to the user that this is something that has been retracted. So that's one area. Another area is you've already found a paper, but maybe you found it eight months ago and you put it in your bibliographic management tool so you could cite it in the work that you're going to work on later that year.
JOHN SEGUIN: And in that intervening time, it has been retracted. So you've already read it, you're like, this is a great one, I'll cite it in my next paper in eight months. And oops, it's since been retracted. So that could be a situation. And then there's another situation where you've pulled everything together, and it's going through the process of submitting that paper for publication.
JOHN SEGUIN: And you submit it, but then somewhere in between, one of those papers got retracted, which might actually greatly influence the work. And so those are three different areas there. So I'm going to be highlighting three different technologies. These are not the only ones that do this, but they kind of represent those three different areas.
JOHN SEGUIN: And I thought they'd be good to show you how they intersect with those areas, and how they can bring this retraction status into these other environments that so many researchers use. Next slide, please. As Hannah mentioned, I was going to touch on scite as well. What she was highlighting was something that publishers can integrate into their publishing platforms which indicates, for the paper you're looking at, whether citations within it have been retracted.
JOHN SEGUIN: But another aspect of scite is actually their integration into some of the world's leading submission systems, like ScholarOne, Editorial Manager, Manuscript Manager. So as a researcher comes in and submits the paper, there is often an optional ability to say, let's have scite look through the citations at the moment of submission to see if there's been retracted work. This can be a very nice thing for everybody involved because it saves everybody from getting the work out there, and then two months later going wait, oh, no we just found out, blah, blah, blah, blah, when it may have been known at that time.
JOHN SEGUIN: So it's kind of a real time check at that moment to have that additional check of anything being retracted that may of course influence that work. Maybe it doesn't influence it. Sometimes you have papers that you're mentioning. As Jodi was saying, it's just a supporting thing, it didn't actually influence the work that you did. You're just making reference to it. You can just delete it.
JOHN SEGUIN: You can pull it out of there. But overall, the end goal is to minimize the dissemination of this retracted work, because, of course, if that gets out there and someone else comes and piles onto your work, then all of a sudden you have a big web of access to that retracted material. Next slide, please. Another technology is the one that my company works on, LibKey and BrowZine.
JOHN SEGUIN: This is a linking technology. It's AI powered. It's used by about 15 million users, and it is really connective tissue between indexes and full text. So this is all those discovery systems. We're also in public discovery systems like Semantic Scholar and so on. And it's all about getting you from a citation to the full text of the material as available to you at your library.
JOHN SEGUIN: So we thought this was actually a perfect opportunity to sort of jump in the middle of that moving train that is often moving so fast to full text that you may not pay attention to anything like retraction notices or where you're linking to doesn't properly label it. Hannah was highlighting a very nice example of the watermarking that platforms do and nice labeling there. Unfortunately, at this time not every publisher and platform is universally doing that.
JOHN SEGUIN: It's kind of inconsistent. That was another one of the aspects that Jodi brought up, something we'd like to see as an industry-wide standard. But in the meantime, the LibKey and BrowZine technology can actually put up this screen -- here's an example screenshot -- that says, yes, this thing has been retracted, and here's a link to the retraction notice, here are the reasons for that retraction, and so on, and really standardize that process.
JOHN SEGUIN: We have another aspect of the technology that actually signposts this availability in other platforms, so your discovery systems and so on. It can actually change the index record in real time by doing that lookup, to say this thing has been retracted. So you know this before you even have to proceed. Next slide, please. Just a couple of other visual cues of that: you can see at the bottom there, that's a discovery record where instead of saying download this article, we put retracted article.
JOHN SEGUIN: So we can actually signpost that in the service itself. On the left there is one of our browser extensions, which also highlights an article retraction on a particular work. We also have a citation management aspect of our service where we do the same sort of thing, which you see in the screenshots in the back: something you may have put on your bookshelf six months ago, and we say it has now been retracted.
JOHN SEGUIN: Next slide. The other one I want to highlight is Zotero. Zotero is a popular citation management service with over three million users of its Chrome browser extension. That's probably a very low estimate of the total number of Zotero users. And they were really the first to do this. This is a very innovative thing that they worked out with the Retraction Watch service.
JOHN SEGUIN: And what they're doing is very much saying, if you've built your bibliographic file there of the different papers that you want to cite later, they will in real time alert the user that things you added previously to your citation management system have now been retracted. So when you come back into that service, ready to build your bibliographic citation list, it will pop up and have this kind of notice for you.
JOHN SEGUIN: So that's very much looking at the after-the-fact case: I read it six months ago, completely forgot about it, I thought it might be important to introduce it into my paper. This service is making good use of that. And of course, this definitely seems like something that all citation management type services would benefit from being able to do, to save everybody a lot of time downstream.
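The common core of the three integrations John describes -- the submission-time check, the link resolver, and the reference manager alert -- is the same simple operation: compare the DOIs a researcher has collected against an index of known retractions. Here is a minimal sketch of that check, assuming you already have such an index keyed by DOI (built, for example, from Crossref or Retraction Watch data); it does not reflect the actual internals of scite, LibKey, or Zotero.

```python
from typing import Dict, Iterable, List, Tuple

def flag_retracted(saved_dois: Iterable[str],
                   retraction_index: Dict[str, dict]) -> List[Tuple[str, dict]]:
    # Return (doi, retraction_record) pairs for saved items that are retracted.
    # `retraction_index` is assumed to map a normalized DOI to a record holding
    # at least the notice DOI and a reason, however the data was sourced.
    flagged = []
    for doi in saved_dois:
        record = retraction_index.get(doi.strip().lower())
        if record is not None:
            flagged.append((doi, record))
    return flagged

# Usage sketch: warn the user when they reopen their reference library.
library = ["10.1234/abc", "10.5678/def"]                      # hypothetical DOIs
index = {"10.5678/def": {"notice_doi": "10.5678/def-notice",
                         "reason": "pervasive data errors"}}  # hypothetical entry
for doi, rec in flag_retracted(library, index):
    print(f"WARNING: {doi} has been retracted ({rec['reason']}); "
          f"see notice {rec['notice_doi']}")
```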
JOHN SEGUIN: So those are three different technologies that intersect the user workflow at different points. I think that we're going to see more folks take up this mantle on the technology side to provide more and more visibility to this data, and all of the standardization projects that we've been talking about here are going to help with that, so we have consistency across these different services as well.
JOHN SEGUIN: Thank you very much for inviting me to this panel. You're muted there.
ANNETTE FLANAGAN: For those of you who cannot read lips, I was thanking our panelists John, Jodi, Hannah, and Deborah. We now have about 15 minutes or so for questions. I'm hoping all of you that I cannot see have access to the Pathable chat. And I'll be watching it. So you can put some questions in there. And while I wait for you all to do that -- and I know you're there, many of you are, and most of you are not shy -- I won't call you out by name unless I don't see any questions.
ANNETTE FLANAGAN: And those of you who know me know who I might call on. So let me actually just start with a follow-up to John's presentation. Thank you, John. Very informative. We use XML editing software, eXtyles by Inera, that in the editing process -- so it's a little bit later in the process, but it's prepublication -- will flag, say, any cited article that is retracted and noted as such in MEDLINE.
ANNETTE FLANAGAN: And a pop-up comes up and tells the manuscript editor, who then tells the author, hey, did you know you've cited a retracted article? But again, that's very dependent on upstream retraction notices, A, being published, and B, being delivered to places like MEDLINE, Crossref, and other places. So my first question is, should anything be done to address the citation of retracted research in subsequently published research?
ANNETTE FLANAGAN: One suggestion has been perhaps a modification of a correction. So John, I might ask you first to address that, and then ask the other panelists if they'd like to follow up.
JOHN SEGUIN: Yeah, I mean, I think a lot of it has to do with how the metadata flows to these different services. There's a lot of pathways that flow through Crossref. So that's obviously a very helpful cross-industry place for all the stuff. So anytime you can make use of Crossmark and make use of those connections that way. Most services have an ability to pick things up that way. That is dependent on updates, of course, coming from publishers that way.
JOHN SEGUIN: I think that's one of the big gaps that the Retraction Watch project has attempted to fill, though. We know that they've been able to track down more retractions than have been voluntarily logged back through Crossmark. And so interweaving that data set with Crossref's, trying to get the largest, most comprehensive one -- that's sort of a mini project unto itself, trying to merge all that together, because once we have the best, most complete set, then it's a matter of just distributing it out.
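To illustrate the merging John mentions, here is a tiny sketch that combines two retraction lists keyed by DOI into one, keeping every non-empty field when both sources know about the same paper. The structures are placeholders, not the actual Crossref or Retraction Watch schemas.

```python
def merge_retraction_sets(preferred: dict, other: dict) -> dict:
    # Merge two {doi: record} maps. Where both have an entry for a DOI,
    # keep all non-empty fields and let `preferred` win on conflicts.
    merged = {}
    for doi in set(preferred) | set(other):
        record = dict(other.get(doi, {}))
        record.update({k: v for k, v in preferred.get(doi, {}).items() if v})
        merged[doi] = record
    return merged

# Usage sketch with hypothetical inputs:
# combined = merge_retraction_sets(crossref_like_records, retraction_watch_like_records)
```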
JOHN SEGUIN: And then it gets much, much easier.
ANNETTE FLANAGAN: Great. John, thank you. Deborah, would you like to answer that?
DEBORAH POFF: Yeah, I guess what I was thinking is -- and it goes toward the end of what John was saying -- I think part of the issue is one of the things I've realized since I've been involved with COPE, which is a fairly lengthy bit of time now: we have about 13,000 members, and they're all editors or publishers. And the difference in the knowledge base, the information, and the training of editors by publishers -- from zero to pretty good -- is so variable. We've done some surveys about this, because we tend to have a STEM membership that is more active than the humanities and social sciences.
DEBORAH POFF: And we've been trying to figure out why. And a lot of them are just not trained well enough to even know why they would participate with COPE. And I think that's a real problem. And they won't access these resources if they don't know they exist and don't know how to use them. So that's a barrier.
ANNETTE FLANAGAN: Thank you. I have other questions, but I don't see any in the chat yet. Hannah or Jodi, do you want to add to that, or shall I move on to another question?
HANNAH HECKNER: I would just underline the fact that I think we should take advantage of certain pathways that have already been laid down, and not try to reinvent the wheel in seeking solutions, here and in other places. As John said, Crossref is already doing an amazing job here. So just further supporting that, further supporting the work of COPE, and trying to get everyone on an even playing field of information -- information that accompanies their research -- I think will be really big in addressing this.
ANNETTE FLANAGAN: Great. Jodi, question for you. Back to the grand project that you have been leading for I think a year and a half now. What do you think will be the greatest obstacle to the success of the project?
JODI SCHNEIDER: This is a real ecosystem problem. It's been around for decades. It's not easy to fix. We need collaboration between multiple different stakeholders. And it's not a technical problem. We can't solve it with technology. It's not just a social problem either. It combines from all of these different perspectives. So I think that's really the biggest challenge.
JODI SCHNEIDER: Lots of collaboration is needed and trying to make sure that different organizations and different individuals in the ecosystem are seeing a place for participating in that collaboration.
ANNETTE FLANAGAN: So collaboration, collaboration, collaboration, right? Anyone want to add to that? I think we're all pretty much in agreement. I think I would add time, money. Resources, resources, resources, which we all seem to be in need of. Let me go to another question, and Deborah, maybe I'll direct this to you.
ANNETTE FLANAGAN: All of the panelists have mentioned how stigma is a problem with retractions, and that doesn't seem to be going away, although we're certainly all trying to remove the stigma. What do you think can be done to incentivize everyone involved -- researchers, editors, publishers -- to remove the stigma so that good science doesn't get continually built on bad science, even unintentionally?
DEBORAH POFF: Yeah, well, it's kind of the answer that Jodi gave. There are so many stakeholders. There is no even playing field in terms of knowledge base. For researchers, COPE is trying to implement a new membership category called university members. And universities, which are the creators and producers of the research that is disseminated in publication, are not as familiar with publication and publication ethics as they are with research ethics.
DEBORAH POFF: And we're hoping that if we have universities as members of COPE we can break that down. So part of the destigmatizing is for people to really understand that what you really need to be doing with retraction is correcting the scholarly record, and that an honest mistake that is damaging because it is significantly flawed may have no bearing on the integrity or the cleverness or the grounded knowledge of the researcher.
DEBORAH POFF: Mistakes happen. And we want it not to be stigmatizing for people to feel that they can come forward and say, I realized that I made a big blooper. It was a statistical error, and it does impact my findings. And if people could do that without being worried about their job, worried about their reputation, worried about their status, a lot of this stuff would be mitigated.
DEBORAH POFF: But that's not what we're at right now, and--
ANNETTE FLANAGAN: I agree with you.
DEBORAH POFF: --this can be very punitive.
ANNETTE FLANAGAN: This the--
DEBORAH POFF: And I say that as somebody who was always an administrator. For about 25 years I was a university administrator. So I know the other end of this, that universities can be extremely harsh to people.
ANNETTE FLANAGAN: Yeah, I'm seeing a couple of questions come in, but just to address that, I think you're right on about getting this information spread to all parties, not only researchers but universities, librarians, and those who are serving as resources for researchers. This is the reason that we at the JAMA Network, and The Lancet that I know of, back in 2014 created this new vehicle, retraction with replacement, which Retraction Watch actually labeled 'doing the right thing.'
ANNETTE FLANAGAN: So these are authors that come to us and say, oh, we didn't mean to but we had a coding error that has affected every number in table one, two, and three, and oh jeez actually our findings are shifted, maybe shifted completely from a positive finding to a negative finding for the primary outcome. What do we do about that? So we developed this process of allowing them to retain their article.
ANNETTE FLANAGAN: We actually-- and this is controversial-- retain the same DOI for the replacement article. Many others do not. And we've been criticized for doing that. But why do we do it? We don't want the authors to be penalized and lose their metrics for their science. We believe the science is valid. We believe the article is valid.
ANNETTE FLANAGAN: It's just that some of the numbers were really, really wrong. Again, we've been criticized for that, and I would like to move on because I see Heather-- Oh, sorry.
HANNAH HECKNER: Before we--
ANNETTE FLANAGAN: Hannah, go ahead.
HANNAH HECKNER: Before we moved on, I just wanted to say I think that a move forward could address this issue and I think so many other issues in the scholarly communication space, which is just the publication of additional research artifacts. I think that the more people are publishing their preprints, their data sets, the more items that are coming into the scholarly ecosystem, the more opportunities for feedback, the more opportunities for maybe seeing something that looks a little bit sketchy from the outside or more eyeballs on this I think would be great.
HANNAH HECKNER: And that would also increase the number of artifacts that folks can point to in saying, these are all the things that I've done, in my resume. So I think that could be another [INAUDIBLE].
ANNETTE FLANAGAN: I think that's a great idea, Hannah. I'm seeing a bunch of questions come through. So Deborah, this one's addressed to you. Is COPE also reaching out to government contractors and industry that produce and publish research?
DEBORAH POFF: Not at this stage. I don't think we're averse to doing that. At this stage we think it's an enormous labor and initiative to take on trying to attract universities to become members and participants in COPE as full members. And we have to work out what our deliverables will be that are worth their participation, and we have to deal with the range and complexity of universities, the scope of universities, from large research-based universities to small humanities, social science, and theologically based research and production.
DEBORAH POFF: So we've got a big task ahead of us, but it's certainly not that we're ignoring industry and government. It's that if we can accomplish this very complex task I think we'll be well situated to do any next step.
ANNETTE FLANAGAN: Yeah, that was from-- question was from Heather [? Kerttula. ?] Thank you, Heather. I think I would add that we have discussions about involving funders as well in this process. Another question here from Will Schweitzer. Has there been any discussion with the various preprint servers about ways to coordinate statements of concern or retractions with journal publishers? He says, I'm thinking about the life sciences where articles frequently start out as preprints.
ANNETTE FLANAGAN: So I guess the question would be, and this might be for John and Hannah, if a journal publishes a retraction notice about an article, is there a link to that preprint, and how does that get connected, either by humans or machines?
HANNAH HECKNER: We will post links to preprints. We also are supporting article versioning, so earlier versions are visible on our platform. You think of downstream delivery -- is that upstream delivery, if you're communicating something to a preprint server? So I would say that, to my knowledge, the literature doesn't have those pathways set up. I also think Will would know, as he's my boss.
HANNAH HECKNER: So I think that this is an interesting area for collaboration, though, and something to certainly look into more, and maybe just collaborate more with the folks at bioRxiv and medRxiv, Cold Spring Harbor, et cetera. John, are you aware of--
JOHN SEGUIN: Not really, no, just because most of the tech services that I've talked to who are working with this, including ourselves, we're usually pretty focused on version of record type situations. Obviously, there's a very important breadcrumb trail there to earlier versions if you're trying to be Jodi and kind of walk through all this and kind of figure out what happened or whatever, do the research on retraction. But for folks who I think are saying, well, here's this piece of literature that I found and that piece has been retracted, that's kind of like the de facto most important thing.
JOHN SEGUIN: Why did it get taken down? Usually there's a notice to say it was because of X, Y, and Z. Presumably that X, Y, and Z would also be visible at the stage of a preprint. It's just that nobody noticed, or it wasn't flagged, or the researcher didn't come forward at the time. There are all kinds of different reasons. But to my knowledge, the dots have not been connected at the metadata stage for this type of thing, except perhaps at the platform level, like Hannah mentioned.
ANNETTE FLANAGAN: Yeah, it's another place for the dots, for sure, whatever your workflow stream direction might be. But you would imagine that a retracted article, if there was a preprint, might result in a withdrawn preprint, possibly. There's a hint, hint here from Neil Blair Christensen about where [INAUDIBLE].
HANNAH HECKNER: Yeah, I think that means he'll fix this for us, right?
ANNETTE FLANAGAN: There we go. Wave the magic wand. Let's see. Another question here from Michelle English. This is directed to me. In the instance you mentioned, does the replacement paper have to go through peer review again? So I'll tell you about our experience. We've published about 15 of these since late 2015, and in several cases we have had to have statistical review again, because the errors really affect the statistical analysis or the methodological approach.
ANNETTE FLANAGAN: Not always is that necessary. Sometimes it's quite clear what happened. So perhaps that's helpful to you. Another comment here from Tony Alvis: a retracted article affects all of the subsequent research articles that cite the retracted article, and then all of the articles that cite those articles are affected. Is looking at this part of the project that you're leading, Jodi?
JODI SCHNEIDER: This is the research that I'm doing, and it's fascinating and it's super challenging. There are examples of, for instance, a systematic review that is in press, and a paper that's included gets retracted. Well, in that sort of case a systematic review is a blend. You put some junk in and you blend it in. Well, now you've got to throw away the smoothie and start over, right? So there are some cases where it's a bit straightforward.
JODI SCHNEIDER: Then there are cases where you're just citing a piece of background literature, and other people would agree with the background; you could pull it from somewhere else. So just citing something that's retracted is not a kiss of death. And then the question -- the sort of picture that we have for this project -- is the Jenga piece, right? When is it the last Jenga piece, and the whole thing falls down?
JODI SCHNEIDER: So my team has been working on this. We proposed this idea called keystone citations: when is it that last Jenga piece, where the whole argument is going to fall down? And I'm really interested in talking to people who have applications for that, to make it more practical.
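To make the last Jenga piece idea a bit more concrete, here is a small sketch of the first step in that kind of analysis: given a reversed citation graph and a set of retracted DOIs, find the papers that cite a retracted paper and the papers that cite those in turn. Whether a particular citation is actually a keystone, a load-bearing piece, still takes human judgment; this is my illustration of the candidate search, not the project's actual method.

```python
from collections import deque
from typing import Dict, List, Set

def downstream_of_retractions(cited_by: Dict[str, List[str]],
                              retracted: Set[str],
                              max_hops: int = 2) -> Dict[str, int]:
    # Breadth-first walk from retracted papers to the papers that cite them.
    # `cited_by` maps a DOI to the DOIs of papers that cite it. Returns each
    # potentially affected DOI with its distance in citation hops. This only
    # finds candidates; judging whether the retracted work was a keystone
    # citation for a given paper requires reading that paper.
    affected: Dict[str, int] = {}
    queue = deque((doi, 0) for doi in retracted)
    while queue:
        doi, hops = queue.popleft()
        if hops >= max_hops:
            continue
        for citer in cited_by.get(doi, []):
            if citer not in affected and citer not in retracted:
                affected[citer] = hops + 1
                queue.append((citer, hops + 1))
    return affected

# Hypothetical example: B cites A (retracted), and C cites B.
graph = {"A": ["B"], "B": ["C"]}
print(downstream_of_retractions(graph, {"A"}))   # {'B': 1, 'C': 2}
```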
ANNETTE FLANAGAN: Yeah, keystone citations. That's a new term for me, Jodi. Thank you. Comment here from Jessica Slater: a similar issue to what we were discussing just before this would be linking notices to repositories with accepted versions of manuscripts that are meeting green OA requirements. So we have a lot of dots to connect, I think, is what I'm hearing here.
ANNETTE FLANAGAN: Jodi, I'm seeing multiple iterations of your project. And we are right at the 3:30 mark. So I'm going to stop us. And thank you all. Just as a reminder, if anyone is interested, take a look at the project's paper that's linked in the Files tab, and you see Jodi's email address there next to her name -- I'm sure she would welcome comments any way that you would like to send them: smoke signals, dots, anything you might like.
ANNETTE FLANAGAN: Again, Mary Beth is telling us from SSP that you can also leave comments here after the session and we'll do our best to follow up with you all. So thanks very much, and I hope you enjoy the rest of your SSP gathering.
HANNAH HECKNER: Thanks everyone.