Name:
What Really Motivates Researchers to Make Real-World Impact?
Description:
What Really Motivates Researchers to Make Real-World Impact?
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/f6325270-b61a-4544-a139-cbb9ae75f2ae/videoscrubberimages/Scrubber_1.jpg
Duration:
T00H57M54S
Embed URL:
https://stream.cadmore.media/player/f6325270-b61a-4544-a139-cbb9ae75f2ae
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/f6325270-b61a-4544-a139-cbb9ae75f2ae/SSP2025 5-29 1045 - Session 1D.mp4?sv=2019-02-02&sr=c&sig=a2NjisfdPuAIq%2B3LwOns9wrSDp%2Btijzmt3oj1sSr%2F2M%3D&st=2025-12-05T20%3A57%3A28Z&se=2025-12-05T23%3A02%3A28Z&sp=r
Upload Date:
2025-08-14T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
OK, hello everyone. Good morning and welcome to the session: What Really Motivates Researchers to Make Real-World Impact? I am Camille Gamboa from Sage. And to my left we have Samantha Green from Silverchair, Taylor Goulding from Overton, and Nikki Agate from Carnegie Mellon (I was just asking her how to pronounce her last name), and they're going to do further introductions of themselves in just a few minutes.
So we are going to do something a little different today. We're going to start off with a little activity. So if you can, pull up your devices or just use your laptops and go to menti.com to join this poll. I'll give you some background and then we'll go ahead and get started. In 2024, Sage sent out a survey to our community of researchers, mostly social and behavioral scientists, asking them how they feel about doing research that makes impact outside of academia: outside of their sphere, their colleagues, et cetera.
We thought that getting their anonymous, real, raw feelings would help us understand how we can support them as they make real-world impact. We got 1,800 responses, and the insights were really enlightening and interesting.
So to kick off our conversation today and give us a little bit of context, I'm going to share those results. But to make it a little more fun and interactive, I'm going to have you guess their thoughts, and then we'll see if you really know your researchers. So go ahead again, pull up your devices. We'll start with this first question: what percentage of researchers believe that the ultimate goal of research is to make a positive impact on society?
So I'll give you a minute. Are you able to see the live results? Yes, you can. OK, good. That's what I wanted. So it looks like we've got big bets on 85%. The options are 43%, 67%, 85%, and 92%. OK, I'm going to go ahead and share the results. If you didn't have a chance to respond to this one, you can respond to the other ones.
So the correct answer was actually D, a whopping 92%. They really think that, in the end, research is about benefiting society. It's a big number, and really inspiring to any of us who care about research impact. As you can see on the screen, however, we asked more questions. We asked them not just how they personally feel about doing research that makes a positive impact on society, but how their peers feel.
So while 92% said, yes, I know this is the ultimate goal of research, only 76% think that their peers feel this way. So they feel pretty highly about themselves, not as highly about their peers, and 68% feel the same about the leadership at their institutions. A really interesting finding there. And then we asked a similar question.
How important is it that research is valued and/or applied outside of academia? And again, we see that researchers feel more strongly about their own personal perspectives than those of their peers and their institutional leadership. OK, moving on to the next question on the Menti poll. Did you see the next question pop up on your devices? Awesome. What percentage of researchers believe that the ultimate goal of research is to build upon existing literature and/or enable future research?
So go ahead and take your guess: 55%, 70%, 86%, or 91%. As a little reminder, it was 92% who felt that the ultimate goal of research was to make a positive impact on society; this switches things up a little bit to the ultimate goal being to build upon existing literature and/or enable future research. I'll give you a few more seconds to put your guesses in and see how well you know our community of researchers.
OK, looks like it's slowing down. Yes, most of you got this right. The correct response was 91%. So when they think for themselves, 91% believe the ultimate goal of research is to build upon the existing literature and/or enable future research. That drops down, not too far, to 87% who think their peers think the same thing, and then all the way down to 71% when it comes to the leadership at their institutions.
So if we look at those two questions together, and an additional question as well, we can get a good comparison. Again, what I've pulled out here is what researchers responded for themselves. So 92% believe personally that the ultimate goal of research is to make a positive societal impact, and 91% think it's about building on existing literature and/or enabling future research. So very, very close there.
And I guess ultimate can mean two things at once, because there's a big overlap between the 92% and the 91%. We also asked about career advancement, and 55% think that the ultimate goal of research is career advancement. OK, let's go back to the Menti poll. Here you can enter your own short responses: how do researchers communicate and apply work outside of academia?
So if 92% think this is really important, what do they actually do to make research applicable outside of academic contexts? What do you think? Go ahead and put in your responses. We'll read through some of them and then compare them to what researchers said. OK: face to face, friend and family groups.
Social media outreach. I would have guessed social media too. Social media, press releases, podcasts, lectures, Twitter, community workshops, lots of social media, conferences and associations. Storytelling, interesting. Social media. Teaching. Blogging.
Translated abstracts that the public can understand; that's one that hadn't come up yet. Yes, the news media and other forms of media. Community engagement. Talking to colleagues, either ad hoc or at conferences. Working with marketing teams to amplify messages, which might be aspirational; I love that. Yep, X.
Working with institutions to write press releases. Tell others not to tell. Sharing their thoughts with others who are not privy to their research. OK, meetings. OK, we'll stop there for now. We've got a lot of social media on there, and I would have thought the same thing, which is really interesting.
So for this question, and this was kind of a false lead-up for you all, I gave it to you as an open-ended question, but we gave it to them as a set of options, and they could check all that apply. So they could check multiple. The most popular response to this question was presentations to non-academic audiences, at 61%. Then books and book chapters, which I thought was interesting, at 53%; I would have thought that was lower.
But we must have a lot of authors responding. Then came working with businesses and nonprofits, and working with policy makers and decision makers. And it's not until we get to 42% that we get to social media, which I thought was really interesting. I thought that's probably pretty easy, low-hanging fruit, but only 42% said social media. Then we get to one-on-one or group outreach.
And then I thought news media and blogs were also pretty low, at 28% and 17%. So maybe that really is pretty aspirational. All right, last question on the Menti poll. This one I think I did as a word cloud, so I think you have one or two words. Good luck.
According to researchers, what are the top ways that institutions reward efforts to apply research outside of academia? What kind of rewards do they get? We touched upon that at the very end of the plenary session. What kind of incentives do they get from their institutions? None: oh, look how big that is. OK, tenure.
Publicity. They don't; I think that's the same as none. Not tenure. Funding. Recognition. Clout. Grants. Title. Career progression.
Sad face. Oh, money. OK, so tenure is a pretty big one. Recognition is a pretty big one, but so is none. Invitations. A plaque. Better class times. Punishment: oh, yeah.
We are optimistic in this room. All right. OK, so what did they say? Once again, we gave them a set of options. We also gave them the ability to comment, but there wasn't very much in the commentary that was repeated.
Yes, so they could check all that applied, and the highest response rate, when they could check all of them, was 37%: tenure and promotion. So even the highest reward wasn't agreed upon by the majority of respondents. After that, we had awards, resources, funding, and then a whopping 30% said not at all. So your perhaps-not-so-optimistic responses in the Menti poll were probably just realistic responses in the end.
So on that happy note, now that you have some data on the mindset of researchers, let's go into a discussion with our esteemed panelists to bring some color to this data and to talk about what we can do about it. I also wanted to mention that we are going to put all of this data together in a white paper coming out later this year, so let me know if you're interested.
Come up and let me know, and I can put you on the mailing list for it. So, a round of intros: I did quick intros before, but we'll do a little more in-depth intros of our panelists, with name, title, institution, and then how you or your institution supports research impact. We'll just go down the line, and I'll start. So again, Camille Gamboa, AVP of corporate communications at Sage.
Sage has for a number of years been looking at different ways that we can support research impact, especially in the social and behavioral sciences, which is the core of our publishing program: looking at how we can make our research more impactful and more applicable to those outside of academia, but then also how to incentivize new research that would make impact outside of academia. We are also DORA signatories, as I'm sure many of you are, and we have a new tool that helps with this, in partnership with Overton, which I'm sure we'll talk about later.
And so let's go down the line. Samantha? Let's see if this is OK. This is on. Hi, everybody. I'm Samantha Green. I'm the director of product marketing at Silverchair. I think that sometimes we think about technology providers as a little bit more disconnected from the research itself, but there are several ways that we do facilitate research impact.
The first is really just what technology does for all of us: make things quicker and easier. Time and resourcing are so often a challenge when it comes to impact. We think about all of the pressures that researchers are under, and technology can make things more streamlined. It can make it easier to do the things that we all need to do, and free up that time to focus on impact.
I think the other way that technology can support research impact is by extending the reach of content: making it easier to find it, to search for it, syndicating it, widening that distribution so that it can reach the people who can apply it and who can make that impact. I don't know if this is on.
Yeah, that one's on as well. Hi, I'm Taylor. I'm the product manager at Overton. For those of you that don't know what Overton is: we have two different products, but our main flagship product is the Overton Index, which is a database of policy documents on a global scale. We help our users understand what their policy impact is. We're mainly used by research institutions, think tanks, NGOs, INGOs, and publishers to see where scholarly work is being cited in policy documents around the world.
We have a second product, Overton Engage, which is a database of policy engagement opportunities. We help researchers find ways to engage with policy makers, at both national and local levels, and help them answer some of the questions that governments are asking, and hopefully keep improving evidence-informed policy making. Hi, I'm Nikki Agate.
I wear a couple of hats, all of which I think might come into play today. I am associate dean for academic engagement at Carnegie Mellon University Libraries. I'm also the editorial director of our press. And I'm a PI on the HuMetricsHSS project, which has been helping departments, individuals, and schools rethink their evaluation and reward processes so that they are measuring what they value rather than valuing what they can measure.
We've been doing that since 2016. At Carnegie Mellon, one of the things we've been doing, particularly over the last year or two, is helping our researchers tell a more textured story about their work. Part of that is helping them do the kinds of things that we should learn in grad school but don't: how to think about audience and personas, how to develop different types of messaging, how to change your message according to your audience, all of those kinds of things which all of you probably know very well, but academics in general do not.
Awesome, thank you. All right. Our first question for the panelists is a reflection on the survey data that you just saw. We found that 92% of researchers think that the ultimate goal of research is to make a positive impact on society.
So I was hoping you could respond to that. How do you reconcile that with what you've witnessed? And if that's the case, where are they getting stuck, and where are they having success? Are we going down the line? I'm happy to start. Who wants to start? Let me start. Whoever feels most inspired.
Am I inspired? I don't know. So this survey is really interesting, because it actually backs up data from a couple of other surveys that have been done over the past few years. The ScholCommLab, headed by Juan Pablo Alperin in Canada, did one on tenure and review processes, and they asked researchers: what is it that you want most when you publish?
And they said: to be read. And then they asked, what do you think your colleagues want most when they publish? And they said: to be in a highly indexed journal, right? So it's all about prestige. The impression people have is that other people, be it their colleagues, be it the university presidents, whatever, value this high-prestige type of research, whereas what they themselves care about is actually getting the message out.
With HuMetricsHSS, in 2020 we interviewed over 200 people across Big Ten institutions about some of this, and we kept hearing two things. We asked: what are the most valuable moments of your scholarly career? What has brought you the most joy, and why did you get into this? And the things they said were either working with students and mentoring, which, by the way, also doesn't get rewarded, or this idea of changing stuff, of having a positive impact on society. But they get particularly stuck at the junior level, because the things that can give them job security, the things that are valued in the tenure and promotion process, are publishing in scholarly journals with a certain journal impact factor, or talking at academic conferences where no single member of the public is going to be, never mind anyone outside their discipline.
And so it feels like there's a vicious cycle here, where people are getting stuck because they don't have the opportunity to do the work that they actually care about. If they want to do that, they have to do double work: the impact work in addition to the stuff that their university seems to care about in the tenure and promotion process. So it becomes an extra burden for all of these scholars.
And so for me, that's the major sticking point. Yeah, I was going to say a similar thing from the specific viewpoint of policy making and having impact on public policy documents. It's a really long process, and a lot of the time the most impactful thing that a researcher can do is have really informal conversations with policy actors. They're not being cited for it, but they're having those conversations.
They're helping those policy actors understand the current landscape of the research going on in an area. A lot of the time, and I'm being quite general here, policy actors aren't experts in the areas they're writing about. They're asked to write policy over a wide range of areas.
A lot of the time they're generalists. They're not academics by background, so they're trying to understand all of this academic literature that's coming out, which they aren't always trained to do. So having those conversations with academics, with researchers, is really helpful for them. But it's really time consuming, and there's nothing to show for it at the end.
So we're asking research institutions to believe in their researchers and say: you're doing something impactful, but you're spending a lot of time on it that's not going to lead to your promotion or tenure. It's a hard process, and it needs to be something that the researchers want to do themselves as well. If you're trying to incentivize an academic to do all of this outside impact work, and they're really there for the pure science, they're not going to want to do it.
They're going to resent you for it at the same time. So it needs to be something that they want to do themselves. Yeah, I think we heard a lot about this and different strategies in yesterday's keynote: all of the decisions that you have to make about managing your time and your resources. Ultimately everybody needs a job, so you have to do the things that are going to allow you to advance your career, the things that your career depends on, and that's ultimately not going to be making a TikTok and using it to communicate your research more widely.
You always have to be making those decisions about how you're going to spend this hour or that hour, whether it's a grant submission or working on a conference presentation, as you said, or even something more tactical than public outreach, like working with policymakers. These are all decisions that every researcher has to face. And ultimately, whether they want to prioritize impact and whether they can prioritize impact are different things.
I'll share a personal anecdote. My mother is a biology professor, and she ultimately wanted to have an impact with her work. But she had to pivot, because she had to prioritize publishing in journals that were cited. She pivoted to research about pedagogy and how to teach biology, because that was a way she could still advance her career while having an impact in a way that was a little more tangible for her, and felt more valuable to what she wanted to achieve. That's a decision she had to make.
She could not do both; there are only a limited number of hours in a day. So it's hard. Yeah, and just to help demonstrate the point that you all have been making, in particular Taylor with the policy making, I'll share a little story from Sage. Under the previous administration, we brought researchers who worked at various federal agencies, HHS, EPA, et cetera, into our DC office for lunch.
We had about 10 of them in the office, and we asked them point blank: what can we realistically do to make our research articles more accessible and more usable for you? And they just looked at us and said, nothing. Stop trying to work on those research articles. These little summaries, maybe they help a little bit, but the bigger thing that needs to happen is the relationships that we need to be building.
We will call someone up if we know them, know that they are an expert on this topic, and can trust them. That's what is most important for us as we build policy and want to use research. And what part of the faculty member's job description is doing that? Not any of it. So it is a real challenge that we need to reconcile if we want our research to be more impactful.
Moving on to the next question, and actually taking a step back to talk about impact itself: I want to ask you all, who owns impact? Who defines it? In the ecosystem of impact, what do the different players look like, and what levers can they pull? Well, I'll start, but I'm certainly not going to have a complete answer to this question, because this is probably one of the greatest challenges when it comes to thinking about impact: it requires such an industry-wide or ecosystem-wide pivot to prioritize impact when it comes to how we incentivize work or how we prioritize what gets published.
Everything would need to pivot in order to really prioritize impact. If it's not a shared goal, it's not going to be successful. The other thing I often think about with research impact is the word impact itself. It's not about the person doing the thing; it's about the recipient of whatever it is, whatever the discovery or piece of research is.
So it really has to be a two-way goal. You can have wonderful intentions, and you can want your science to change the world. But if there's a disconnect with the people you are trying to impact, or the environment you're trying to impact, the climate, whatever your discipline is, then you're ultimately going to fall into this middle ground where you're not having the impact you want, because you are not engaging the community that you are trying to impact.
And I think that's really hard. I also think that defining impact really depends on what type of impact you're talking about: economic impact, policy impact, social impact, societal impact, academic impact. These are all different things, and they require different stakeholders, different ways of working, and different outputs in order to achieve your goals. So it's very nuanced.
It's very complicated. And I think we just have too many different definitions for all the different stakeholders in the space, and that becomes complicated very quickly. That's an interesting sound, isn't it? Doo doo doo doo. We have a ghost in the room.
We do have a ghost in the room. OK, I want to pick up on that, because I don't think we have too many definitions. I just think that we need to be plural and flexible about what impact is. It's such a violent term as well. But right now it is really tightly bound up with this idea of excellence. And so for many institutions, impact and excellence are kind of owned by the rankings.
And so it becomes the fulfillment of a set of proxy measures that doesn't necessarily have anything to do with how stuff is being talked about, used, or turned into real-world activity. I think one of the issues is that we're spending too much time thinking about a one-size-fits-all model, which is what something like a university ranking system, or any kind of numerical quantitative measure, really is.
And if you think about mission-aligned impact, the mission-aligned impact of a land-grant institution or an urban community college is going to be very, very different from that of a Stanford. It's also going to differ by department. And so this question of who owns impact, I think, is a little bit of a Spider-Man meme.
Everyone's pointing at everyone else. The scholars are like, well, our department says what impact is. And they're saying, no, it's the president. The president's like, it's the rankings. So, about levers: I'm just wondering if maybe it's the institutions themselves, by which I mean people.
Institutions are not a thing; people are a thing. The people in the institutions, if they care about thoughtful engagement with the local community, can begin to think about how to incentivize that, and likewise if they care about health outcomes or about affecting policy. But as long as we're all caught up in this prestige game, where the things you need to be excellent, whether as an institution or an individual, are X, Y, and Z, and not these other things that are actually much more mission-aligned, I don't think we're going to get anywhere.
Yeah, I'm going to give a little bit of context on what the UK system looks like, and it's exactly what you were saying: it's all based on rankings and excellence. Apologies if everyone's already aware of this, but in the UK we have the Research Excellence Framework, which dictates a lot of the block grant funding from the UK government to our higher education institutions.
It's essentially a program that UK universities have to go through to show their impact and their research excellence. And it's kind of a facade, because a lot of the time the universities that are able to spend the money on building the teams to show the impact are the ones that are then able to get the most funding. Imagine you're able to fund a team of 10 people within your university whose whole job, for the eight years leading up to this framework, is to build impact case studies around how your research is being impactful in all of these different areas.
Then you're more likely to get the funding, and then you can fund that team again and again. And you get these smaller universities that are having real impact within their local area, but they've only got one full-time employee who is able to show what that impact looks like, and therefore they don't end up with the funding they need to carry on with the research they're doing. So it's kind of a double-edged sword: trying to show that you're having impact, but having to spend money to get money.
It's a weird area. Yeah, well, it sounds like there are lots of different people and groups involved, and it's a complicated ecosystem. But let's zero in on those of us in the room: publishers and technology providers. What can and should we do to improve how we measure and reward impact?
Well, I think before we even get to measuring impact, we have to facilitate it. From a technology perspective, that comes in a lot of different ways. Like I said at the beginning, making things easier. We run the ScholarOne submission and peer review system: how can we make that process quicker and easier, saving time so that researchers can focus on impact, or public outreach, or working with policy people or their local communities?
So there's that aspect: making it quicker, easier, more streamlined. I also think that accessibility is a big one here. There's a lot when it comes to the accessibility of content: making sure it is clear and able to be read and engaged with by communities around the world. That's a big thing that technology can really help facilitate. And the final thing: this is not an AI session.
I don't think we should go down an AI rabbit hole here, but there is probably a lot that AI can do in ways that we haven't even imagined yet. There are things it can do to support discovery, and things around helping to translate the work that gets published for different audiences, again saving time and resources for researchers who want to have an impact, as we discussed with the survey at the beginning.
And helping them do that by taking some things off their plate and using technology to assist what they're trying to achieve. Those were exactly my points. Yeah, at Overton, our second product, Overton Engage, was launched purely to help with this. We were seeing that there is a lot of grunt work involved in trying to get researchers to engage with policy makers, and they struggle to find the opportunities to do so.
Or, if the university is lucky enough to have a policy impact team, then it's on those members of staff to find the ways that researchers can engage. And those opportunities are really hard to find, because government websites are often really hard to use: they hide the consultations and the learning agendas, which are never easy to find and never straightforward.
So we were trying to focus on that, because we found that we can make this easier for people and take out some of the guesswork of finding those policy engagement opportunities. And it is useful: a lot of the time it's cutting down those hours and making it easier for researchers to see where they can go. But in exactly the same way, we also need to make it easier for the people who want to read the journal articles and understand where that research is going.
A lot of the time, policy actors don't have access to journal subscriptions, so if the articles aren't open access, those journals are hidden from them. But even if they do have access, they often can't understand what the articles are saying. There needs to be more knowledge brokerage between those researchers and the policy actors; the research needs to be translated.
But equally, we need the researchers to thread the needle and make sure that they're writing in a way that is accessible to people without ruining the integrity of their research, because we know that's going to be important to them. Wow. Those tools are really, really helpful. But I really feel like, at bottom, this is not a tools and technology issue.
It's a people issue, and it's a process issue. Academic processes are notoriously slow, but also, for changes like this, you have to move at the speed of trust. You have people who for years have tied the way they value their self-worth to how these things are assessed whenever they go up for reappointment, how these things are assessed in their annual review.
And so we have to think really deeply, with them, about how to change that narrative a little bit. There are some really good initiatives trying to do this right now: DORA is one, HELIOS is another, and INORMS in the UK has the More Than Our Rank initiative. But those tend to bring in one or two people per institution as the representatives, and then there's an agreement across a certain band of folks.
But really, some of this work needs to be the really, really slow, person-by-person work in a department: getting folks on board, getting buy-in. I know that's not really what anyone wants to hear, but until you have that sort of buy-in, I don't think researchers are going to go looking for a tool to help them do the thing. Can I ask a follow-up question?
Sure. I'm curious, because change is super slow, especially in academic departments, and one thing that I've witnessed and heard a lot about is how hard it is to make that change from a generational perspective. There's a perspective among senior faculty or senior researchers of: well, I had to do these things, so I don't want to change it for you. You should have to do them as well.
You young faculty member. So how do you get that buy-in and prompt those changes? Having experienced that personally as a PhD student, I will say, yes, that definitely happens. I think it's cohorts. I think it's having people feel like they're not the only one who cares about this, creating safe spaces for people to talk about what matters to them, and then saying, well, hey, wait, if this is actually what matters to all of us, if this is why we got into this business, then what are the levers we can pull?
What are the places we can make change? And helping them realize that behind every academic process there are a bunch of people who made that decision, and usually those people are their peers, or their peers from a couple of generations ago. I just want to say I agree with you, Nikki, on the larger cultural change and cultural shift that has to happen. And it will be slow.
I do think there are some rhetorical changes that we as publishers and technology providers can make in the way we talk about impact: not just using it as a proxy for impact factor and things like that. Those of you who are familiar with DORA and following those principles are doing that already, but I think it's important to keep in mind. I'm going to skip to a question that follows nicely from that one, which is: what new tools and technologies can support publishers in enabling the creation of impactful research, or in talking about impact in new ways?
Looking at you, Taylor, but anyone can respond with their thoughts. Yeah, I think a big thing we're seeing talked about, especially in the UK and the EU as well, is evidence synthesis; it's the big buzzword at the moment. And I know our governments are investing a lot of money into how to do this properly: taking large cohorts of research and helping policymakers, but also the public, understand what's going on in an area.
But I think we're all extremely cautious about doing it, and in a good way. I mean, there's a lot you can do with LLMs, and I don't want to go too much into AI, but there's a lot we can do with it. Evidence synthesis seems to be the biggest thing people are talking about at the moment: how we do it correctly and responsibly, and try not to dilute the message of all the research.
I'll just pick up on evidence synthesis, because this is one of those things that libraries have been doing for some time, and we have librarians who are really, really well trained. We turned an evidence service into an evidence synthesis program at Carnegie Mellon about a year ago, and they're experimenting with LLMs and with various platforms that allow for some of this.
But that deep engagement as part of a research team, involving faculty members, often students, and librarians working together: if there are tools that can facilitate that process, I think it's fantastic. But there is something deeply important about that learning opportunity and that collaboration opportunity that I wouldn't like to see replaced by a tool.
Yeah, well, I'll add quickly that, in order not to go down an AI rabbit hole, I will leave that to the side. But the other thing in terms of tools and technologies for publishers is that I think there's an interesting connection between the ultimate impact and outcome of research and research integrity. There's a lot happening around ensuring that the research that ultimately gets published is trusted and trustworthy, ensuring that the quality is there, and that we are doing what we can during the publication process to weed out things that are problematic, potentially fraudulent, or contain mistakes and errors.
And I think that's another thing that can ultimately improve the impact of research: making sure the quality is there and the trust can continue to be there from an integrity perspective. Hear, hear. I also have to put in a plug for Sage Policy Profiles, which we developed in partnership with Overton, and which we can't sell to publishers, so it's not me trying to make a sale here.
In fact, we don't sell it at all, but it is a tool that you can share with your researcher communities. They can go in for free, create a profile, and then search through all of the amazing data that Overton puts together and adds to constantly, to find where their research is cited by policymakers. And it's got these great visualizations they can bring to their tenure and promotion committees or put on their websites,
et cetera, et cetera, to help celebrate that type of impact. So I wanted to make sure I shared that. We are going to go to audience Q&A soon, but I wanted to ask one last question first, which your comment actually leads us to nicely, and which I think is the elephant in every room we enter at any time in 2025: how does our current political and social environment provide an opportunity for needed change?
I'm going to frame this optimistically. Is there an opportunity for change here, or is it just a distraction, in all reality? So please give us your thoughts. I have thoughts, I have feelings, I have opinions. I do find it sometimes hard to be optimistic, and I'm not an optimistic person by nature. But what I do think is that we tend to operate, as science, as research, as this industry, with an illusion of objectivity.
We like to talk about science as being not political, or apolitical, or objective in nature. And I think what's happening right now is really revealing how wrong that is. There is nothing that is not political today anymore. And I hope we can use this moment as one that can galvanize us, one that can really show us the need to have a very strong public voice advocating for the value of science, the role of science, and what it can do for us today and for future generations.
And so I guess that's my optimistic thing: I hope that it can galvanize us. Yeah, I think I'd agree. It's a really hard question to answer, but yes. If we're taking a positive spin on it, I think it's about making sure that all the impact, all of the influence that researchers are having, is clearly communicated, and hopefully we can understand how to better communicate everything that we're doing and all of the science that we're producing.
I'm just going to keep harping on the incentive structure, because what an opportunity. If there is distrust in science on the one hand, and then this need to rethink impact and excellence beyond citation counts and social media mentions, then we have the opportunity to reward work that engages with the public and communicates, and not just in a "here, I'm telling you what I did" way, but "I am sitting down with you and trying to hear what matters to you, and then thinking about how I communicate that in conversation with you."
And so not at them, but with them. Or research that's focusing on K-12 audiences and thinking about how we distill this down so that our students who are being taught that 2020 was stolen actually have another opportunity to think about other things. So I think it's a really big opportunity, and it's going to be hard, but we have this moment. It would be a real shame if, given everything we're facing, we just said, let's do academia as usual.
Publishing as usual. There's a lot more we can do here. I agree 100% on that. It was mentioned earlier, and I'm sure you've all seen the headlines about RFK's podcast interview; the headline was all about not wanting government HHS employees to publish, or potentially banning HHS employees from publishing, in prestigious research journals.
But I actually went and listened to, suffered through, the entire podcast to get the context. And some of the things he was talking about, I thought, actually, that kind of makes sense. He was talking about the replication crisis, and how he actually wants to put a good deal of money into replication studies at HHS, making sure that the research we publish can actually replicate,
et cetera, et cetera. And he talked about, I think, opening up peer review and a few other things. Anyway, I totally agree with you, Nikki, that this can be an opportunity for us to look back and look for ways to solve some of the problems that we've been ignoring or pushing aside for some time, and really get to the type of research that makes impact, and consistently makes impact.
I appreciated the reference to replication, which I thought was interesting. OK, I am going to hop downstairs and put the microphone in the middle, and I would love for you all to think of questions to share with the panelists. Feel free. Hello, here we go. Feel free to just come on up here if you have a question.
I'll break the ice. Matt Giampaolo with the American Geophysical Union. I was wondering if you could talk a little bit about the need to build trust between researchers and publishers, because I feel like they know they need us.
They rely on us for certain things, but a lot of the time a percentage of them don't trust us, or don't trust that we're fully on their side. So how do we build that trust at a time when I think it's probably really needed? I think it's about building trust, but even taking a step back, helping them see what we do and the value that we bring.
Like, I don't even think that our researchers really think about us much. We're kind of just a step in the process to get their research published in this journal that their peers care about, or that they themselves care about, or have been suggested to publish in. I don't think they actually think about us very much, to be honest. But if we can find new ways to show them: actually, we can make your lives easier, and this goal that you are working towards, that really matters to you and your journey, whatever your goals are, we can come in and be a help to you.
And I know there are a lot of different initiatives going on to do that, but I think we can show that. If we can help them and make their lives better and easier, at least their professional lives, then that will help build that trust. I would just add that I think publishers could listen to researchers more and engage with them in conversation, and that can take lots of different forms.
Maybe it's surveys, maybe it's workshops, maybe it's focus groups, whatever form that takes. Because ultimately, I think a lot of times researchers feel that changes are just enacted upon them. Suddenly there are different open access policies, or this to navigate, or that to navigate, and it's very complicated. So listening to what they really need, what challenges they are facing, what challenges are stopping them from achieving their goals, and just opening those lines of dialogue, can be really valuable.
I'll just add: maybe engage your friendly librarians as brokers in this conversation. We are the ones who are talking to the researchers all the time, particularly about publishing processes and research processes. You could work with us to convene focus groups and things like that, because I think there's maybe a place for us as a more neutral, trusted party that could help with this initiative.
Don't be shy. Good morning. My name is Willa Tavernier. I'm the research impact and open scholarship librarian at Indiana University Bloomington. And I am wondering how much of this is a problem that publishers actually cannot solve, because the administration of universities and the evaluation process has been divorced,
separated, from scholars themselves. And in many cases, the publishing process has also been separated from scholars, as if it's a thing you do after you finish your research, instead of part and parcel of the scientific process, which is to share the knowledge from what you've just researched so that it can be read, reviewed, critiqued, built upon, et cetera. Do you actually think that publishers can solve the problem of motivating researchers to do more things that have real-world impact, when so much of this is caused by the fact that important pieces of the system have been taken out of scholars' hands?
I mean, publishers are already being used as a proxy for measuring excellence, which is really something that scholars should be deciding upon. Yeah, I think you're right that it needs to be systemic and cultural in order for there to be change. I do think there are levers we can pull.
We talked about them before. The rhetoric that we as publishers use, fetishizing metrics that measure only one small thing and calling them something else, is a big one. Also, I think back to the survey data, and Nikki seconded it with similar data that she has seen, where researchers are saying, I really care about this, but my peers and my institutional leadership don't.
I think it's important that we make it known that, look, actually, 92% of you who filled out this survey, all but 8%, said this was important, which means it matters to more people than you think. This matters to almost everyone, in the back of their heads. And I think the same could probably be said of those of us in the room who are members of SSP: this really matters to us too.
So I think we need a groundswell of: we're all kind of on the same page and working towards the same goal. I think making that known can be helpful. That's not necessarily a lever only publishers can pull, but it's something we as a larger community can do. Yeah, any other thoughts on that one? It is very tricky.
And actually, I think it's really important to point out, Willa, that this is part of a bigger systemic change that we need. We have a few more minutes for a couple more questions, so please do come on up. Yeah, thank you. I have more of a practical question for Taylor, for Overton.
I'm Jason Winkler with Springer Nature. Is it engagement with the policymakers themselves, or with the staffers of the policymakers, who are most interested in speaking with researchers? And who are you making that connection with? I would say staffers, but yeah, go ahead. Yeah, I mean, everybody, to be fair. When we're speaking with government actors, a lot of the time it's at all levels of government; you're getting those entry-level policy officers.
And again, I'm talking from a UK context, so our government system is set up slightly differently. But it's everyone up to the top directors. I mean, in the UK we have our Government Office for Science, and they're particularly interested in this; they're trying to embed it throughout the entirety of the civil service. And it's not just the people who are writing the policy, either.
I mean, in the policymaking process, writing the policy is the first step, but then you have to go further. It's all about the delivery. That's the legwork, the hard part of creating those policy changes. So you also need to be speaking with the people working on the delivery side, because sometimes you can take the ideas from a policy document or an initiative and they get completely changed when they go through into delivery.
I just want to add to that: I think we need to expand what we think a policymaker is. Ten years ago, and probably not that long ago either, my brain would have gone to our representatives and members of Congress. But like I said before, there are policymakers working in the White House, or there were, and in federal agencies.
There are people writing policy that impacts public life at the state and local level, and those people are more accessible. And actually, our members of Congress are pretty accessible too: not necessarily the members themselves, but their staffers. I have a niece working on Capitol Hill right now who is telling me how it works.
Anyway, there are ways; it's not as inaccessible as our researchers probably think, but it does take some effort. So, other questions or comments? Yeah, go ahead. Hi, Thomas Wofford from the Journal of Bone and Joint Surgery. Thank you for the panel discussion. I'm sensing a little bit of a shift, because I do think we can get a little drowned in metrics and KPIs and quantitative assessment.
So do you or any of your institutions use any more qualitative measures, which is maybe an oxymoron, to look at impact? Do you have any thoughts on that? Thank you. Here you go. One of my favorite examples of this is not at my institution, but at the University of Minnesota.
Incoming faculty can say that what they want to do is community-engaged scholarship, that that's something they are focused on. And they actually go through a slightly different tenure and promotion process, where some of the letters, which normally come from faculty at other institutions to attest to their brilliance, instead come from members of those communities, who talk about how that research and that engagement has impacted them.
And so it's qualitative. It's a letter, but I think it really speaks to: you said this was what you wanted to do, we hired you because this is what you do, and we're going to make sure we're looking at ways to include that in our assessment of you. Yeah, we also had a group of users at UC Davis recently use some Overton data. It was their Grand Challenges team, and they were trying to communicate their impact in the main key areas they were looking at as a university.
And they found that a lot of what they were looking at was intangible when looking at metrics. So their team mainly focuses on building out case studies around how they're having impact, and it's all really qualitative. Yeah, that's the one. I would just add, not in terms of specific metrics but in terms of measuring impact more generally, that it's complex, it's multi-modal.
There are a lot of different aspects to measure, and it's about how you craft that narrative and align what you are able to measure, and what you are measuring, with what your initial goal was. And I also think that when it comes to impact, especially things like policy or public societal change, those are real lagging metrics. It could take years, it could take a generation, before you truly see that impact.
And that can be a real challenge when you're trying to figure out how to chart your career development and things like that. So I think that's another thing to consider as we're talking about measuring impact generally. And of course, there are also awards that do this, where the whole point is to tell a story. You've probably all heard of the Golden Goose Awards, which honor federally funded researchers who did research on something that might have sounded silly when it started but went on to have real impact on the world.
And so that's all about stories. You might also have heard of narrative CVs; I want to explore this more, but it's a concept of a CV that is more about storytelling, from what I understand. We are out of time, but thank you all so much for joining us and for your questions, and thank you to the panelists. And if you have any questions, just come up and let us know.