Name:
Building a Best-in-Class Workflow: Auditing, Streamlining, Standardizing
Description:
Building a Best-in-Class Workflow: Auditing, Streamlining, Standardizing
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/4fef6a26-a743-4fe7-9e8d-06ba7e92a1e7/videoscrubberimages/Scrubber_1.jpg
Duration:
T00H29M12S
Embed URL:
https://stream.cadmore.media/player/4fef6a26-a743-4fe7-9e8d-06ba7e92a1e7
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/4fef6a26-a743-4fe7-9e8d-06ba7e92a1e7/industry_breakout__wiley_2024-05-29 (1080p).mp4?sv=2019-02-02&sr=c&sig=FBXjW7BO%2Bd0g3z5ZLuhIfhoMpsf4vBBtKF2ZDFlgfAQ%3D&st=2025-04-29T19%3A55%3A31Z&se=2025-04-29T22%3A00%3A31Z&sp=r
Upload Date:
2024-12-03T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
Good afternoon. How is everyone? Good post-lunch energy? Great. If you're looking for Building a Best-in-Class Workflow: Auditing, Streamlining, Standardizing, you've successfully navigated to the correct room.
So congratulations for that. Good steps in the right direction at the beginning of the meeting. I'm Jennifer Workman, business development manager at Wiley, and I'm joined by my colleagues Ray and Brittany, who will introduce themselves in a bit. In my current role at Wiley, I talk with a lot of organizations about workflow challenges and the goals that they want to achieve when making adjustments to their workflows and considering streamlining.
And I also hear a lot about the benefits of streamlining and standardizing from the organizations that I talk to, as well as the challenges that they encounter with that process. And I've found that while the reasons for streamlining workflow can vary, the same common themes continually pop up for why organizations decide to streamline and standardize their workflow.
So we're going to test that out today and see if we can find some common themes. If you have the web app on your phone, we have a poll connected to the session. Open it up and find the poll; there's a question there that asks why organizations consider streamlining their workflow. Go ahead and enter an answer. Once you enter your answer, you'll be able to see everyone else's answers, and we'll look and see if there are some common themes there.
So while you do that, I'm going to talk a little bit about the research that we've done at Wiley and some of the common themes that we've seen. At Wiley, we did some industry research across various disciplines and publication models, asking organizations the same question: why they may be considering streamlining workflow. And I've listed several of the top reasons here on the slide.
So first off, user experience is a big reason, as many organizations want authors to have a positive and consistent experience across their portfolio of journals. That's something that comes up quite a bit. Modernization is another common reason for streamlining: sometimes workflow is outdated, and updates need to be made to incorporate new policies or new technology. And along those same lines, automation is often top of mind for many organizations who want to incorporate new software or create better efficiencies that save time and money.
Portfolio and growth management is something that I hear a lot from organizations. As portfolios grow and new journals are launched and submission numbers increase, it can be really challenging to grow, to scale and really think about how your workflow might be impacted by those changes. And so streamlining and standardizing workflow is often a way to meet those challenges along the way.
And then lastly, managing data across systems is another motivating factor for streamlining workflow. Many organizations want to track data in new ways, and streamlining can help with this as well. So take a look at the answers in the app and see what the feedback tells us. And I'm going to turn things over to Brittany, who's going to talk a little bit about how her team helps organizations streamline and standardize their workflow.
All right. Thanks so much, Jennifer. Everybody good on volume? All right. Perfect. So I'm Brittany Sweat. I'm the senior director of publishing services within Wiley Partner Solutions. And I'm here to talk about one of my favorite things, which is improving the operational efficiency of journals, through an auditing and assessment process, to help drive organizational goals for a publishing portfolio.
One of the first things that organizations really need to focus on when they're thinking about embarking on a workflow audit is thinking about: what's your why? Why are you doing this? What are the business reasons? What are the pain points that you're hearing from your stakeholders? And what are your goals in the next two to three years to drive the growth and success of your publishing portfolio?
To really focus in on those reasons. Because at its heart, a workflow audit is an operational transformation that is going to require buy-in and implementation from a large number of stakeholders and your team. So it's really important for leaders to be able to articulate the business rationale, the reason, the value for embarking on such a project, so that everyone's on board and coming along for the journey. And to really be effective and do all those things, you have to have a change-management mindset.
And one of the first key components of that is articulating the why, so that everyone's on the same page. Thinking about that, though, for those of us in the room, it's probably not that hard to think about why you would do a workflow audit. We all have similar goals. We know we want to solicit the best content that's out there and have it be submitted in the most author-friendly way possible.
We want it to go through peer review as quickly as possible while upholding the strongest ethical standards. We want it to publish quickly for authors. We want it to be on a platform that helps us drive engagement and reach more readers, all with a system of reporting underlying that so we can make sure that we're actually achieving our goals and that we have some benchmarks.
So I think we could all agree on that. But why is that sometimes so elusive? Really, it comes down to this: we have so much pressure on our time that it's difficult to find the time, the resources, and the expertise to drive a project like that forward and have the why clearly articulated and aligned across the organization. So, drilling down a little bit to build off of the themes that Jennifer was talking about, these are the top five reasons and rationales that we hear from our customers about why they choose to embark on a workflow audit.
And it's really important when thinking about these things. The first thing we always do with our partners is to think about the overarching goals. It's really important to ascertain what those goals are at the 10,000-foot view. Oftentimes when we think that part of our workflow is broken, it's because we're hearing really specific complaints from authors, from editors. But in order to contextualize and build a proper audit of a workflow, you have to start at that 10,000-foot view, so that you're making sure that you're aligning the workflow to the overarching goals and not doing an audit to meet the preferences of individuals or to be in line with past preferences.
In addition to that, a lot of the very specific pain points that we hear about are situated in very specific parts of the workflow. But as Jennifer was mentioning, you have to consider the entire workflow: from journal launch all the way through submission, commissioning content, peer review, the production and copyediting process, and all the ways that authors interact with your journals through this experience.
Yes, maybe some of the pain points are situated in the copyediting process. There's a lot of bottlenecks there. We see that pretty often. However, it's often the case that the reasons for some of that could be upstream. So you have to think about the pain points and the solutions from the entire ecosystem of the workflow and think about downstream implications and upstream rationales.
So some of the most common workflow challenges that we see, I'm not going to go into depth on all of these. I think none of this is going to surprise you. You've encountered and seen all of this, but I did want to focus in on a few items in particular. One of those is editors and staff using manual processes outside the system. That is a huge time suck. It's not effective.
It's not transparent. We've all seen it. And thinking about why that's happening: is there insufficient training of editors? Is the peer review system not configured appropriately? Does the workflow on paper not match the system? So there's a lot of reasons to dig into there. Also, thinking about the author experience: the author persona and perspective of the peer review process is so vital to a successful publication workflow, and it's really important to decrease the hurdles to submission.
We all know that we want to drive more submissions. We want to increase submissions at scale. We want to facilitate doing that with the staff that we have, to the best of our ability. And we want to keep authors happy during that process. So how can we automate submission processes as much as possible and make them as easy for authors as possible? Another common theme that we find is simply outdated legacy workflows.
As editor tenures change year to year, we all know editors like to make their mark on their journal and their workflow, or they have an experience with another journal that they bring to their editorship. But over time, that's a lot of baggage that is solely driven by personal preference. So make sure that you're looking at those, again, from that 10,000-foot view. Was this part of the workflow a personal preference of an editor 10 years ago, or is it really in step with the future-facing goals of the organization?
And the last thing I'll mention here is research integrity. I think all of us have probably worked on new policies in the past one to two years, thinking about fraudulent publication, identifying paper mill submissions, reviewer rings. So it's really important, as you're working through a workflow audit: yes, you need to look at the configuration of your peer review systems.
Yes, you need to look at the processes and what your staff are doing, but you also need to understand and evaluate the policies that are the foundation for a lot of these peer review and publication processes. So just to highlight again, and you will hear me say this multiple times before I'm done speaking: go back to the goals. It's so important when you are conducting a workflow audit to start with your goals in mind at that 10,000-foot view, but to engage with a lot of different methods to get the qualitative and quantitative data that's going to create a really robust picture of what's happening in your workflow.
You can interview a staff member, and what they describe as the workflow can be very different from what you find when you get under the hood in the peer review system to see how the site is actually used. So it's really important to have balanced sources for understanding what's happening in the audit, to use that to analyze very specific data, and through that analysis, to make very specific, actionable recommendations.
That implementation is the strategic plan and the transformational aspect of the workflow audit. To go into just a little bit more depth on the methods that we typically use to reach some of those goals we talked about: obviously, we're looking in peer review systems to do really detailed analysis of the configuration and the architecture of the peer review system, and we're using a lot of data to back up our findings. How long are certain parts of the workflow taking?
And one of the key things that we're also looking at is creating consistency across the journal portfolio. We want to make sure, thinking again about the author experience, about growing submissions, about keeping those submissions within your family of journals: how can we help facilitate transfer across journals? It's so important to have harmonized and consistent decision terms, article types, and metadata fields to really facilitate that.
And once you create and build consistency within those data repositories, it really helps you have robust reporting. You can compare turnaround times across all the journals in your portfolio, and you're comparing apples to apples. And touching on one of the things Jennifer mentioned, we hear from customers so much that they want really robust data.
They want to be making data-driven decisions. And the saying "garbage in, garbage out" is so important here: if you have really good data across your portfolio, it can really accelerate your ability to report across the portfolio. I mentioned that we do staff interviews and policy reviews, and we really think about that author perspective, doing a walkthrough of the submission with that persona in mind. What is the communication they get?
How long does it take? And really keeping that front and center. So I want to talk a little bit about standardization, because I think sometimes, myself included, we can get a little bit in our feelings about all the customizations that our journals have, that we've grown to love, that our authors expect. But if we want to grow our journal portfolios at scale with the resources we have, without exponentially increasing resources, it's so important to standardize. And it drives so many other things.
It helps our editors know what their roles are. It creates a consistent author experience. It drives that reporting that we want to be making business decisions with. And if you do it right, you can actually free up time for your staff to help push forward publishing initiatives and work on some of those strategic projects that are going to help continue to meet those organizational goals. But what I did want to say is that standardization does not mean that a journal or portfolio can never have any customizations.
That's not what I'm saying. There are field-specific norms. There are editor preferences. We need to make sure that our editors feel like they can do their very important job. So there's a balance there. But what there should be is a transparent and clear process for evaluating what those customizations should be. Do they align back to our organizational goals, that why? Is everyone in agreement?
Does it apply to more than one journal? Can we push this out consistently, so that it's still supportive of those robust goals of consistency and data reporting across the portfolio? And finally, it's just so important at the end of an audit to have this: ultimately, a strategic plan for your journal workflow, so that the operations drive the goals of the journal and your publishing program.
So we want to make sure that there are really specific recommendations that are grounded in data, but there's also a summary, so that anyone in your organization can understand what you're talking about, because again, you need that buy-in to implement successfully. And thinking about the peer review system, the processes, and the people aspect of your staff, plus policy: those are the three legs of the stool that really make the workflow sing and support those goals. And then thinking about implementation and prioritization.
So these should be the outcomes of a journal workflow audit, so that your team can take that and put it into practice. And with that, I'm going to hand over to my colleague Ray, who's going to talk about Wiley's end-to-end transformation process and how that built into some of our thinking about standardization for workflows. Thanks, Brittany.
Hi, I'm Ray Deguzman, director of editorial solutions. I manage a team of product managers, and we work on developing our Research Exchange platform, with a specific focus on the research integrity screening and peer review applications within our suite of products. So Brittany has taken us through the why of standardizing.
And we have a pretty good idea of what we want to achieve. So I'm hoping that what I'm going to show you today will be a small glimpse into how we started to do this within the wider research publishing end-to-end program. I'll share a glimpse into Wiley Research Publishing's path to transformation, with Research Exchange as the platform that is really a key enabler for this strategy to streamline and standardize over 2,000 titles.
So that's a huge challenge for Wiley Research Publishing. I'll share a little bit about our approach in terms of scoping the complexity of this challenge, and ultimately what we learned: how we approached the problem, but also how it informed our product solution design. Research Exchange is our suite of modular and interoperable software products. It really serves everything end to end in the pipeline, from author collaboration right through to publication-ready scholarly research content.
I'll focus on the three applications: Submission, Screening, and Review. Is that OK? Yeah, thanks. Research Exchange Submission is our author experience, where we really focus on providing the author with a seamless experience when they're submitting their manuscripts, but also on leveraging machine reading in order to make the submission process efficient, and on starting to instill standards in terms of the data that we're collecting during that submission experience.
This then feeds into the Research Exchange Screening application. I won't go into too much of this; I'll shamelessly plug that we are doing some demos throughout the event, so please come and see us at our booth. But this is our dedicated UI, where we provide an interface for editorial staff to manage research integrity checks before peer review.
And then finally, to complete the submission-to-peer-review piece, we have Research Exchange Review, which is really where we handle the peer review process. This is the interface, with some smart peer review management tools, that we provide editors. So that's the final piece of our Research Exchange product suite. We have these three applications. Why modular?
Really, we focused on a lot of the feedback that we gained from the industry research, understanding that transformation is a challenge. Change is difficult; change is hard. Being able to do this in an incremental way is really ideal: it helps to provide a solution that works at your own pace of transformation. But building a modular platform does mean that we need a clear data architecture in order to have scalable solutions.
So just to focus on these scalable workflow solutions and what we mean by that: as a platform, of course, we need to be able to scale in terms of handling volume, but we also want to be able to scale the platform in order to innovate quickly, so we can integrate various different paper mill detection tools and various different services within the application, so that you can operationalize them within your workflow. That is part of the Research Exchange Screening application.
We are also able to explore other areas in the workflow as well. For example, we are currently doing discovery with Ex Ordo to try to bring that unified experience a little bit further upstream, having a centralized submission flow starting with conferences and conference events, but also being able to bring some of that research integrity work in the editorial workflow up into the conference space.
So all of these scalable solutions, of course, require standards. As mentioned, we really need to be able to focus on how we want to standardize. But what does it mean to actually standardize? What is normal anyway? So, I love a stock photo. They look like they have just sent me this. I did purposely choose these stock photos, by the way.
So, yeah. If you have dealt with standards or tried to address the issue of standardizing, you may have come across this very famous cartoon before. So what did we do at Wiley Research Publishing as we started to embark on this journey of looking at over 2,000 titles? How do you even begin to standardize? Is it possible to standardize? Do we want configuration? Do we want customization?
OK, so a little bit about that. The case study here is Wiley's end-to-end project, which is really a mission about getting fit for an open research future, the challenge being complex infrastructure, multiple submission and peer review platforms, and thousands and thousands of stakeholders. So what we needed to do was some workflow analysis. How many workflows do we really have? Is it possible to have one size fits all?
So first, let's define workflow. This is the process to review, support, edit, and decide on manuscripts for publication. And we want to make sure that the scope of the workflow is defined as running from the point of submission to the point of sending to production, and to start with a high-level focus on actions that drive business-value outcomes, not features, where we often get bogged down.
I think, Brittany, you mentioned it as getting in our feelings. This is about taking the same sentiment of the 10,000-foot view to make sure that we're focusing on the core actions that drive business value. Then you want to collate your data. So this is what we did: we had a look at the 2,000 journals. That number varies depending on the way that you report on things.
And of course, that number does change naturally. But we were able to identify at least 1,425 journals with structured data. We were very, very lucky that we could leverage data pipelines into data lakes to be able to look at that data, and we extracted sample data for manuscripts submitted within a specific time frame. The next stage, probably the longest phase of this process, is the cleanup.
We want to reduce those data dimensions. We need to start looking at how we can put that data into categories by focusing on core actions and their actors, and on the notion of hierarchy, trying to decouple actions from actors while keeping it simple, and also making sure that we agree on semantics, so that once we've started processing this data, we all understand exactly what we mean.
So, lots of deduping here, and here I've provided just a picture of how we've tried to reduce those data dimensions. All of this makes a lot of common sense, I think. We're all very familiar with the flow of manuscripts; we know these flows well enough to be able to assert or assume given rules. But if we continue to abstract this data, we can actually start to focus on the relevant points that allow us to identify trends and, in turn, validate our categories.
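[Editor's note: the label cleanup and deduping described here can be sketched in code. This is a minimal, hypothetical illustration, with invented labels and keyword rules rather than the actual categories used in the analysis, of collapsing free-text workflow task labels into a small set of core actions.]

```python
import re

# Keyword rules mapping raw label fragments to canonical core actions.
# These rules and category names are illustrative assumptions.
CATEGORY_RULES = [
    ("make_recommendation", ["recommend"]),
    ("assign_editor", ["assign"]),
    ("peer_review", ["review", "referee"]),
    ("decision", ["decision", "decide"]),
]

def normalize(label: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace."""
    cleaned = re.sub(r"[^a-z0-9 ]", " ", label.lower())
    return re.sub(r"\s+", " ", cleaned).strip()

def categorize(label: str) -> str:
    """Map a raw task label to a core action, or 'other' if no rule matches."""
    text = normalize(label)
    for category, keywords in CATEGORY_RULES:
        if any(k in text for k in keywords):
            return category
    return "other"

# Three differently worded labels collapse into one category.
raw = ["Make a Recommendation", "AE: recommendation due", "Recommend & Comment"]
print({categorize(label) for label in raw})  # {'make_recommendation'}
```

The point of the sketch is the reduction in dimensions: many label variants, one agreed-upon semantic category.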
So what I mean by that is that once you start to abstract all of your data points, you can begin to view patterns that allow you to identify hierarchy. And the notion of a hierarchical structure actually becomes an important distinction when defining a model. So we know already that checks happen, that an editor is assigned, that an editor invites, that peer review takes place.
Those are very, very standard steps in the flow. But how do you define a workflow? It really comes down to this idea of whether it's a one-tier model or a two-tier model. This is how we've begun to break down the different types of models that we're seeing in our data. Single-tier means we really have one editor who handles all of those core actions. Two-tier, one-step focuses on an editor in role one assigning to an editor in role two, who makes the final decision. Two-tier, two-step is a similar structure, except the manuscript goes back to the first tier for the final decision. We also know there's a third tier, but I'm not going to focus on that for now, because I think it's really important for us to start to analyze exactly what kind of impact these workflows are having on our operations and our journal performance.
So what did we learn? Customize or configure? Or have we taken configurations to the point where we're actually customizing? Is there even a difference? Through 60,000 rows of data, just a sample that we took, we actually found 22,000 different task names, that is, different labels for actions in the workflow. There are 130 different ways to say "make a recommendation," 700 different ways to label various peer review tasks, and over 80 different ways to label assign-editor tasks; with actors, there are still 20 different editor role names.
And I think it's important to be able to see this, because it really helps us understand how much value those custom labels are actually bringing to the flow. How much value does a deviation from a standard flow actually bring? But the results of the analysis also show that roughly 80% of the 1,425 journals can actually be mapped to one of four workflow models.
So despite all of those variations, they still belong to four core models: 56% of those are two-tier, and 24% are one-tier. I mentioned there's a fifth model with three tiers, which we're continuing to do some analysis on. And then we had 9% where we need to really look under the hood to see what kind of fun stuff is happening under there.
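[Editor's note: as a rough illustration of the mapping exercise described here, the sketch below classifies each journal's observed action sequence against a few core workflow models and counts coverage. The journals, action names, and model signatures are invented for illustration; they are not the actual dataset or model definitions.]

```python
from collections import Counter

# Hypothetical model signatures: each model is an ordered sequence of core actions.
MODELS = {
    "one_tier": ["checks", "assign_editor", "peer_review", "decision"],
    "two_tier_one_step": ["checks", "assign_editor", "assign_editor",
                          "peer_review", "decision"],
    "two_tier_two_step": ["checks", "assign_editor", "assign_editor",
                          "peer_review", "recommendation", "decision"],
}

def classify(actions: list[str]) -> str:
    """Return the first model whose action sequence matches exactly, else 'unmapped'."""
    for name, signature in MODELS.items():
        if actions == signature:
            return name
    return "unmapped"

# Invented per-journal action sequences standing in for cleaned workflow data.
journals = {
    "journal_a": ["checks", "assign_editor", "peer_review", "decision"],
    "journal_b": ["checks", "assign_editor", "assign_editor",
                  "peer_review", "recommendation", "decision"],
    "journal_c": ["checks", "assign_editor", "peer_review", "peer_review", "decision"],
}

counts = Counter(classify(actions) for actions in journals.values())
coverage = {model: n / len(journals) for model, n in counts.items()}
print(counts)  # Counter({'one_tier': 1, 'two_tier_two_step': 1, 'unmapped': 1})
```

The "unmapped" bucket corresponds to the 9% mentioned above: journals whose flows match no standard model and need a closer look.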
So, as mentioned, these are the common models that we're finding. We've defined them here: one-tier; two-tier with one-step decision with triage; two-tier with two-step decision with triage; and two-tier with two-step decision. Feel free to contact me later if you want to get into a bit more detail about what that means. But what we've been able to do is use this information in our dialogue with editors and partners, to help them understand and see that the real value in your workflows is being able to find the time to do things like research integrity checks, and to streamline.
That way, we don't spend too much time focusing on different labels and changing our workflows, when the majority of our workflows can actually fit within these standards. And with that, we were able to use those concepts to build the workflow building blocks. So take a look at these; I'd love to hear your feedback on whether, for your own workflows, you are able to use these building blocks.
Do they fit? Or do you think that these four standards are actually perfectly fine for your journal? So please connect with us. We are at booth 202 if you want to schedule a meeting with us, and we've got a couple more slots for the Research Exchange Screening demos as well. So thank you.