Name:
The Data Revolution: Unlocking Value Across the Publishing Landscape
Description:
The Data Revolution: Unlocking Value Across the Publishing Landscape
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/71267aad-def1-407f-b911-ee2a0e4cb601/thumbnails/71267aad-def1-407f-b911-ee2a0e4cb601.png
Duration:
T00H57M29S
Embed URL:
https://stream.cadmore.media/player/71267aad-def1-407f-b911-ee2a0e4cb601
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/71267aad-def1-407f-b911-ee2a0e4cb601/PSWS25 May.mp4?sv=2019-02-02&sr=c&sig=gls79M4VRPjia%2FzDsShVDhdwvusJ0QTp%2BNpZqC9%2FbI0%3D&st=2025-05-24T14%3A10%3A17Z&se=2025-05-24T16%3A15%3A17Z&sp=r
Upload Date:
2025-05-15T00:00:00.0000000
Transcript:
Language: EN.
JOSH DAHL: Thank you, everyone, for joining today. I'm the SVP of Product at Silverchair and general manager for ScholarOne. And I'm really excited to be bringing together and moderating this panel today, talking about data: the importance of data, how we manage it, how we use it, how we unlock its full potential in the publishing space. In just a moment, you'll see a poll pop up. Answer it to the best of your ability. We're not looking for exact answers, just your perspective on data maturity within your organization.
JOSH DAHL: And then, what are your organization's main areas of focus with regard to data? So with that, I want to introduce our expert panel. We've got a great panel joining us, bringing a lot of different perspectives on this particular topic. First, we've got Christian Grubak, who is the CEO and founder of ChronosHub. We've got Colleen Scollans, practice lead for Marketing & Customer Experience at Clarke & Esposito.
JOSH DAHL: We've got Beth Windsor, senior business analyst at the American Chemical Society. And we've got Michael Crumsho, VP of Technology and Product Delivery at McGraw Hill Professional. So we're excited to bring everyone here. Thank you all for joining, panelists. And let me check quickly. We'll come back to the poll in a moment.
JOSH DAHL: Let's get started. I'm going to go ahead and start with some questions for the panel. One of the big ones. Data is a loaded question in a lot of ways. And I wanted to get the panel's perspective on, what do we mean when we talk about data? Colleen, give us some perspective on that.
COLLEEN SCOLLANS: Yeah, sure. When I think about data, I think of it on two axes. One is the type of data, and the second is how businesses utilize it. On the type of data, we could be talking about customer and audience data. What do we know about the people that engage with our products and services, read our journals, our authors, et cetera? And that includes information that we know about them, but also behavioral data.
COLLEEN SCOLLANS: We could be talking about content intelligence and how we classify and better understand the content we publish, the products we create. We could be talking about financial data, revenue, P&L, all of that kind of good stuff. We could be talking about campaign performance data, how marketing campaigns perform. And then there's all sorts of operational data-- workflows, processes, time, SLA.
COLLEEN SCOLLANS: It's a whole host of different sources of data. And all of that comes together to serve a few broad use cases. And I will say my panelists should chime in if I'm forgetting any here. But the first one I would say is marketing and digital experience. And what we're talking about here is how are we better understanding our customers and audiences so that we can better serve them marketing messages, content, and have better user experiences for them.
COLLEEN SCOLLANS: This could be personalization, recommendation, all of the things that improve your marketing performance and the user experience of people engaging with your products. Another really big category of data is sales enablement: business development, as sales teams are selling into institutions and organizations. Increasingly, that selling needs to be evidence-based, data-backed.
COLLEEN SCOLLANS: And so organizing that data really well certainly helps the sales team generate revenue. Then there's product and publishing strategy. Where should you publish? Where should you commission articles? Special issues? What products are working? All of that is incredibly important for the publishing pipeline and commissioning, but also for what new products to develop or what products to deprecate.
COLLEEN SCOLLANS: And then monetization. There are advertising and sponsorship models off the back of audience and customer data. There's the ability to license data to different kinds of organizations. And there are all sorts of operational workflow efficiencies that data helps us improve. That could be the intelligence it provides, or it could actually be the trigger for AI and automation.
COLLEEN SCOLLANS: And lastly, and it's kind of a catch-all, but probably the most important use of data-- and I know, Beth, your specialty is business intelligence-- all of the things it tells us as a business about what is working and what is not working. That can be leading, that can be lagging, that can even be predictive. So data is all of those things.
BETH WINDSOR: Yeah, Colleen, that was a really beautiful laundry list of all the things that data can do. I wish I could have been taking notes. I'm going to listen to this recording later and take those notes. And it really speaks to how data gives you that opportunity to get that holistic view of your customers and your entire business. And another wonderful thing data does is that when you unsilo all of the data that gets you to the things Colleen talked about, you naturally unsilo the teams in the organization, because you're all going to be using that shared resource.
BETH WINDSOR: And there is one thing I want to make sure I reiterate throughout this hour: data is no one's part-time job. It really is a dedicated collaboration across multiple teams. And the only way to make this truly successful is for the organization to have a shared vision for what they're doing. Which of those things that Colleen listed are you going to attack, and what data are you going to use to attack it?
BETH WINDSOR: And it's important for leadership to put pressure on those below them to use that data and advance the state of their data programs to address all of those problems. So it offers a lot of opportunities for your business, but it also offers opportunities within your culture to change how you're doing business.
MICHAEL CRUMSHO: I like that comment about it being no one's part-time job, because I think the thing that's interesting is that it's a multifaceted array of data points that are available or that people are expecting. But within each segment of the organization, there are specific interests that don't necessarily carry over to other parts of the organization. So that's been the interesting thing that I've been working on here: understanding the differences between what sales needs and what product needs and what marketing needs and what we need to track from a financial perspective.
MICHAEL CRUMSHO: Data means different things to different people. So as a product person, it's always about starting from the top and understanding what problem you're trying to solve and how data can play a role in that. That's a lot of where my team tends to play overall: just understanding, what is it that we're trying to communicate? What would be the ideal state?
MICHAEL CRUMSHO: And there is a lot of power in a lot of the metrics that we can capture to do that and demonstrate a lot of value. And it is just a shifting landscape that we're trying to work in now. So I think everyone needs to have some level of competency, but there definitely does need to be one decider-in-chief pushing things along overall.
CHRISTIAN GRUBAK: And I think, Michael, adding to what you're saying: we often see that organizations treat different data streams separately, so there isn't really a crossover. Sales data gets treated separately from usage data, separately from behavioral data. And really, the power is in being able to cross them. If you can't traverse all those different data streams, because you designed it so that marketing follows their analytics and sales does theirs, you really end up losing out on some of the insights.
COLLEEN SCOLLANS: Yeah, I agree. Data governance is really important, because everybody can interpret and think of data quite differently. And if your goal is to have that unified data layer, that data foundation for whatever use cases the business needs to advance, you need to have a common data dictionary, data vocabulary, data governance. There can't be 500 different ways to express a role or whatever the field might be. There has to be a universal way of thinking about it.
COLLEEN SCOLLANS: There can be options, but you have to come together if you want that full benefit of data, like you were saying, Beth, bringing silos together.
JOSH DAHL: Yeah, it leads into an interesting question around building that competency, that muscle memory, in an organization to be able to take, integrate, and use the data in a proper way. Michael, when we were talking in some of the planning meetings coming up-- actually, before we do that, let me just quickly-- we've got the poll results here. Interesting. Let's see here. Rate your organization's level of data maturity.
JOSH DAHL: Looks like almost a majority of you are in the second stage, the managed stage: the importance of data in the organization is realized. It looks like about a third of you are in the middle area, where you've got some regulation and guidelines. Not as many of you in the first stage, and not as many in the latter stages. Panel, how does this match up with your organizations? Or, Colleen, when you're working with organizations, how does this match up with your view of where they are in their data maturity?
COLLEEN SCOLLANS: From my vantage point, this spread is about where I expected. What I think might be interesting: we've got people on this panel from organizations, and I wonder, if we asked different people in the same organization, would they have rated it the same? Because we sometimes find that somebody in a particular function thinks the data is really good and the CEO doesn't, or someone in a different kind of function disagrees. So I think there's a little bit of context dependence sometimes to this answer.
COLLEEN SCOLLANS: But I would say most of the clients we work with realize the importance of it. And they're probably early in the journey of certainly optimizing it. They may be optimizing it in pieces.
JOSH DAHL: Yeah, well, that's a great tie-in. Michael, I was going to ask you a question about competency. I know this is one of the things we talked about in the lead-up: how your organization can start to develop that data competency. And you've got some experience at McGraw Hill. Do you want to talk a little bit about the barriers and the approaches that you or the organization have taken to build that?
MICHAEL CRUMSHO: Yeah, I mean, I think there's a couple of not necessarily barriers, I think, but just challenges that we need to work through and figure out a strategy for overall. I don't think McGraw is unique. I know a lot of larger companies have probably experienced this as well. But when I look at the landscape of the products that I'm responsible for, it falls into three buckets. There's a core set of products that we develop with Silverchair, which we view as an external party, but they have very robust analytics that have been developed over a couple of decades.
MICHAEL CRUMSHO: We have a platform that we acquired from a third party that developed their own analytics from scratch. So it was kind of a de novo process for them. And then we have an internal product that we've developed that was leveraging technology for a different use case, higher education use case, which has a different set of metrics than we're used to providing within our organization.
MICHAEL CRUMSHO: So I think that's been challenge number one. When I showed up, there were a lot of people talking about the challenges in trying to present a cohesive view of what our customers were using. So there's that normalization aspect. And I think a lot of companies have experienced this in terms of trying to get products out to market faster or making certain acquisitions.
MICHAEL CRUMSHO: And I think the challenge sometimes is when you make an acquisition or you're launching a new product, there's a thesis that you launch that product under. And it was to capture a certain part of the market or a certain amount of revenue. And sometimes the data that's necessary to truly compete there wasn't built into that thesis, and you find yourself backing into it.
MICHAEL CRUMSHO: And so I think that's been part of the challenge here: understanding that we've demonstrated the validity and utility of the products, and now we need the correct data underpinnings to be able to demonstrate that in a robust, statistical way. And I think people are on board with that. But sometimes getting the buy-in for that can be a little bit challenging, because with data projects, everybody likes the cool AI project that has a nice widget you can put on your site or that looks sexy in a press release. Saying, hey, we've now made sure that a click is a click is a click across our entire ecosystem--
MICHAEL CRUMSHO: --it's not really something that a marketer is going to be super excited about. And Stephanie can correct me if I'm wrong on that one, but it's not something that a marketer is going to be really excited to send a message about to clients, because that's just table stakes; that's expected. But I think a lot of us are in the same position of having to normalize at this point and understand what we should be tracking and how we can work with it in the future.
MICHAEL CRUMSHO: So that's been my perspective here. There's an appetite to do this work, thankfully. But I have been in other organizations where those kinds of back-end projects languish, because it's hard to tie them directly to revenue. Sometimes operationalization isn't the coolest thing to put forward because it's not directly tied to revenue. So I think those are some of the challenges that I've seen.
JOSH DAHL: Yeah, that's a-- yeah, go on, Colleen.
COLLEEN SCOLLANS: I was just going to say, when Michael said the words table stakes-- I think that's a really interesting question. How much of this is just table stakes, versus needing a business case that it's going to drive revenue or operational efficiency? Some of this, in the world we compete in now, may just be table stakes.
BETH WINDSOR: Yeah, another word of Michael's that I latched on to was buy-in. Getting buy-in. For me, data competency has a lot of layers, but I'd like to talk about culture, because I truly believe that is 51% of your problem. And we have found an effective way to shift our culture. We've actually been doing a pretty good job here at ACS of shifting into a data-centric culture.
BETH WINDSOR: And we've done it by taking those data professionals and having them collaborate directly with those business units to understand the problem and understand how those business units are doing the work. We don't make any assumptions that we know how to do their job better than them. We work alongside them and we create those solutions alongside them. This is really not one of those build it and they will come moments.
BETH WINDSOR: It's really about making their job easier, allowing them to have better results with what they do, and then they start coming to you naturally in the early stages when they're kicking off a project or they have a problem to solve. And what that does is it shifts the culture, but it also naturally builds the data competency because they're in it with you and it builds it for both of you.
BETH WINDSOR: Because even for the analyst, your data competency hinges on understanding what they need to do, their role, and their problems to solve. So that's where-- go ahead.
MICHAEL CRUMSHO: I'm sorry. I was just going to add to that. We have set up a process where we are working with the data experts. But in a larger company like ours, I think some of the challenge is actually in educating people on our market and the business model. I remember meeting with our data team, just sitting around, and we launched right into talking about the specific needs in our market.
MICHAEL CRUMSHO: And I realized that people were looking a little confused about certain things. So I just took a step back and said, do you guys know what COUNTER is? And everyone was like, no, we don't. And I was like, so let me back up. We have had to talk about collecting metrics like turnaways or lockouts, which are key for our selling-- a turnaway being a demonstration that someone tried, or had the intent, to use a product that they don't have an entitlement to.
MICHAEL CRUMSHO: But even that conversation, which is fairly simple, had to go through multiple iterations of explaining to engineers that a turnaway does not mean you unsuccessfully logged in, because you could have unsuccessfully logged into something you actually have. It means you are actually authenticated, but you don't have the product. So setting that baseline is also part of the challenge with a lot of colleagues who might not be familiar with how you sell, or what the purchasers actually need to understand, or what you're trying to accomplish there.
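For illustration, a minimal Python sketch of the turnaway logic Michael describes, where a turnaway requires a successful authentication plus a missing entitlement (the function and field names are hypothetical, not a real COUNTER implementation):

    # Sketch: classifying an access event in the COUNTER-style sense described above.
    # A failed login is NOT a turnaway; the user never reached the product.
    def classify_access_event(authenticated: bool, entitled: bool) -> str:
        if not authenticated:
            return "login_failure"   # could not sign in at all
        if entitled:
            return "access_granted"  # normal, entitled usage
        return "turnaway"            # authenticated, but no entitlement: the sales signal

    assert classify_access_event(authenticated=False, entitled=False) == "login_failure"
    assert classify_access_event(authenticated=True, entitled=False) == "turnaway"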
CHRISTIAN GRUBAK: Looking at it from the product side: often we see product strategies, we see business strategies. Data strategies are just beginning to come around; it's not the default strategy for a lot of organizations. I mean, Beth, I do know a little bit about ACS, and it's been a long journey, I guess. There's the whole thing about building a product and then figuring out what data we want.
CHRISTIAN GRUBAK: And then you leave it up to engineers who, as you say, Michael, don't necessarily understand the business reason for doing what we're doing and where the decisions are made. So turning the business toward a data strategy is a useful tool, because it does help inform the other strategies. And you save a lot of frustration down the line when somebody like me comes around and says, well, we need to measure this, because somebody needs to know that-- and by then it could be too late.
CHRISTIAN GRUBAK: And that's why you need to be early, because once the data is gone, you never get it back.
JOSH DAHL: Yeah, this is a really good area of discussion too. I want to go to a question around-- obviously, ACS has invested a lot of time. You've gotten the buy-in. You've built the competency. Out to the panel here: what about organizations that don't have the size and scale of an ACS or a McGraw Hill? What are some things that they can focus on to help them move towards an effective data strategy?
JOSH DAHL: And yeah, just curious, what are some things that you all would advise those organizations to start doing, whether it's prioritizing metrics or other types of things?
COLLEEN SCOLLANS: Yeah, I would say it starts with building an inventory of what you want to do. That could be some KPIs, that could be some use cases. And then being really disciplined and prioritizing what's going to move your business forward. There's always this dance and balance with data. You want to be thinking enterprise wide and have the right governance, but you also want to be agile. And one of my favorite sayings is single customer view is a destination.
COLLEEN SCOLLANS: It's not a use case in terms of data. You don't necessarily need every single piece of customer data to advance a use case. So I would start by being really clear about who owns data governance in my organization. Who is that single point? It could be part of a role, or it could be a full-time role, to ensure that we're not going a million different places.
COLLEEN SCOLLANS: And then I would do an audit of what the opportunities are in terms of my use cases. And then I would do a project to be really clear: what are the KPIs? What are the performance metrics I need to run my business? And then I'd work backwards from that to the data. But it starts with what you want to do as a business, first, always.
JOSH DAHL: Yeah, Beth, you've had some experience starting and building up the competency at the ACS. I wanted to ask you a little bit about that point Colleen brought up: identify the priorities, but then also start to pull the data into a place where you can actually use it and start to get some of those insights. What are some of the things that you or your organization have done to help integrate third-party data sets?
JOSH DAHL: Mixing them with your own internal data-- how have you approached that, being an organization that is pretty far progressed on that scale of maturity?
BETH WINDSOR: Yes, I actually love this question. We talked about it in our preliminary discussions. And really, third-party data sets are a huge opportunity to create insight and opportunity outside of your organization. You're no longer just looking within; you're able to keep tabs on the greater market space. We're using a number of external feeds. I actually started tallying them up in my head the other day, and I think I stopped at 10.
BETH WINDSOR: And they really do support, like I said, our competitive analysis and the business development that we do, whether you're looking to branch out into new countries. They improve our editorial processes. We're using them across the board as a shared resource. I'm going to give a few examples here. We have a number of data sets-- we have more than one, actually-- where we want to understand the flow of money.
BETH WINDSOR: And this money is not only research grants and funding; it's also capital markets. And this supports our business development efforts, whether that's from the acquisitions angle or the sales angle. We also have data sets that help us understand individual affiliation. So now we can see individual activity and roll it up to that larger organization.
BETH WINDSOR: And that helps us with customer growth, with finding new customers, and also with customer retention, by proving the value of ACS content-- saying, this is how we know you are using us. And to talk a little bit outside of what ACS is doing, I've seen a few organizations do some really, really cool things with Digital Science data.
BETH WINDSOR: I don't know if they're still doing this, but years ago, the National Academies of Science had developed an internal tool with an Altmetric feed, and this allowed them to monitor the daily activity around their publications. And if they saw that something was getting traction-- it was influencing a policy, or it was getting attention in the news-- they could quickly promote that publication and keep the ball rolling.
BETH WINDSOR: And I found the tools they built to be pretty interesting. And it was all with these external data sets. Now, I will admit, ACS is advanced, and these data sets can be pretty expensive. So I usually advocate for the cheap fix first. If you find a data product out there that you think has value, try to get a sample of their data, or they might have a web tool available that you can subscribe to, and develop your use cases around that.
BETH WINDSOR: And then if you think it has legs, and you think you can ingest it, you can talk to that company and say, OK, what would it mean to actually purchase this feed? And then talk, of course, with your IT group: can we ingest this reasonably? Can we store it reasonably in a central place where we can all use it? And, most importantly, can we connect it to our data and deliver it in a way that serves its intended purpose?
BETH WINDSOR: But there are ways to step through this to figure out if there's value and develop those use cases. It's very similar to what Colleen was saying-- you're being really agile with it. And then you also have to set up, like she said, those KPIs and say, OK, I have this thing, I'm going to use it. Is it hitting the mark? Is it doing what we think it needs to do?
BETH WINDSOR: And if it doesn't, then you cut it and you move on. But I'll tell you this, I don't recall in the last five years us cutting anything. We have found value in all of the data sets that we have decided to purchase and ingest.
JOSH DAHL: Yeah, sales and marketing are obviously a big use case. Colleen, you were going to say something?
COLLEEN SCOLLANS: Yeah, I was. Well, I just wanted to make a point. It's probably an obvious point, but when you think about data, there are, I think, two important lenses. And this is me putting my marketing hat on. One is the intelligence you get from data, which is super important. But the other, as marketers, is being able to act on data. And so one of the unmet use cases we often find is organizations have the data, but the marketers can't do anything with it.
COLLEEN SCOLLANS: There's no ability to segment or personalize or activate. So again, getting back to the use cases: what is your unmet need? Is it some sales enablement reports for your sales team that may require enrichment data, et cetera? Is it that you have data, but it's not governed and structured, and so you need to put it together? Is it that you don't have the data? Is it that you have the data, but if your marketing team wants to build a segment, they've got to knock on the door of somebody with SQL, and it takes 48 hours?
COLLEEN SCOLLANS: Being really specific about how you want to use the data is really important.
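As a rough illustration of what acting on the data can look like once it is structured and accessible, here is a minimal pandas sketch of a marketer building a segment without waiting on a SQL queue (the table and field names are hypothetical):

    import pandas as pd

    # Hypothetical customer table; fields are illustrative only.
    customers = pd.DataFrame({
        "email": ["a@example.org", "b@example.org", "c@example.org"],
        "role": ["researcher", "librarian", "researcher"],
        "days_since_last_engagement": [12, 200, 45],
    })

    # Segment: researchers who engaged within the last 90 days.
    segment = customers[
        (customers["role"] == "researcher")
        & (customers["days_since_last_engagement"] <= 90)
    ]
    print(segment["email"].tolist())  # ['a@example.org', 'c@example.org']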
JOSH DAHL: Yeah, that comes back to prioritizing what you want to do with it first.
COLLEEN SCOLLANS: Yeah.
JOSH DAHL: Sorry, Christian. Go on.
CHRISTIAN GRUBAK: No, I was going to circle back to one of the comments you made, Beth, about data sources. We often see the attitude that unless it's perfect, we're not going to do it. Now, I think that's one of the biggest misconceptions in data use. There are many, many data sources out there which will not give us 100% precision, but they're good enough for the purpose.
CHRISTIAN GRUBAK: Because if we cross-reference it enough, we're going to find the anomalies. And sometimes just establishing the baseline will help. And I can't help thinking of smaller organizations with limited budgets sitting there thinking, oh my God, this is a mountain. We can't climb this. We don't have oxygen. We don't have all these things. So, I mean, be deliberate about the data strategy first, knowing the question-- to your point, Colleen, what I really like is getting asked, being told, actually, we want to know this.
CHRISTIAN GRUBAK: That's not an engineering task. That's a business task. So let the engineers produce the data, let the data people avail it, and then let the business people understand it. We have to take that approach, and I think a lot of organizations shy away from it simply because it seems too complicated and too expensive.
CHRISTIAN GRUBAK: So I just want to give that angle: it doesn't have to be perfect to be good. And you can obviously always buy the better option later.
COLLEEN SCOLLANS: 100%. I would also add that a potential barrier, or at least one that we see, is legacy systems that hold data, that have been around for a really long time, and that may not be fit for purpose for modern data use cases. Association management systems, for example. A data strategy also requires really understanding how your systems hold data, what you can do with that data, and where their limitations are. The number of people I've talked to who have been trying to get their association management system to do some of this type of stuff--
COLLEEN SCOLLANS: and actually, that's not the best system to do it in. Really thinking carefully about what you need for your data strategy is very important from a technology standpoint too.
CHRISTIAN GRUBAK: I agree. I mean, I'm sitting here thinking about my days in e-commerce, when we were talking about the complexities of cross-device tracking. Now we have cross-system tracking here. And some of these systems have some age to them, so they may not be able to track or report in the same ways. So building strategies for how to mimic that data, how to extract it, is going to be key, because there are very, very big blank spots on the map as it currently stands, and they're not easily accessible.
COLLEEN SCOLLANS: Yeah.
CHRISTIAN GRUBAK: So I really, really hope that some of the efforts going into, I don't know if we call them legacy systems or whatever they are, also cater for the lack of data transparency.
COLLEEN SCOLLANS: Yeah, and there's just the need for these legacy systems to realize: we've got to get data out. You may not be the best system to do what we need to do natively, so you have to have very good APIs so we can get our data out and put it in a system that can do all the things with it. Yeah.
JOSH DAHL: Are you seeing, panel, more standardization of data, and where does that stand right now? I know, Christian, we were talking a little bit about standardization and interoperability-- you talked about that, and Colleen did, about legacy systems. Are there practical approaches you can take to working with some of those vendors or those systems? And are there, as you talked about, good-enough steps that you can use to bypass some of the limitations?
CHRISTIAN GRUBAK: I mean, this is something very close to my heart, because I was reading about an effort to introduce a common language on Earth not too many days ago. And it never happened. It failed. We've been trying to do this for data for the longest time, and the effort hasn't really resulted in a common language.
CHRISTIAN GRUBAK: It's like we're all trying to speak the same dialect, but we do it poorly, rather than just introducing Google Translate. So if I could have it my way-- and I don't know if I can-- standardization should not be the end goal. Making sure that we have interoperability and accessibility of high-fidelity data is much more powerful than a common standard that everybody struggles to keep up with.
CHRISTIAN GRUBAK: I have yet to see a particular standard being followed by the majority of organizations in this industry. And there have been so many technical advancements lately that allow us to care less about the structure of the standard, and more about being able to collect and interpret the data. I think that's actually where there's a lot of opportunity. And that applies both to the bigger organizations, who can build massive data lakes and try to understand them and attach AIs and whatnot to them, and also to the smaller organizations, because storing unstructured data and understanding it is always cheaper than trying to standardize your output.
CHRISTIAN GRUBAK: So I mean, I think we're going to see us moving away from one standard to rule them all. I don't think it works.
JOSH DAHL: Yeah, and maybe that brings us to practical steps-- practical ways some of you have had to work with what you're given. Are there suggestions you have for organizations-- again, organizations of a size where maybe they don't have a ton of money to devote to this, but they want to start building that muscle? Are there ways they can start to connect this data, things that they can be doing now to start thinking about how to pull it together?
MICHAEL CRUMSHO: Yeah, for me it's, what are the jobs to be done for your organization? And what is the data that's necessary to demonstrate the completion of those jobs? I come back to-- I think Colleen was talking about aligning on the use case. Because we are going through a data normalization and centralization process here, which, I know we have talked about, shouldn't be the end goal. And it's not the end goal.
MICHAEL CRUMSHO: It's a necessary first step, I think, to get to the goals that the business has in terms of what we actually want to demonstrate about the utility of our products-- how we want to switch out of this mode that we're in of equating usage, clicks on things, with the actual value of the product, and get to the point where we're measuring outcomes, whatever those may be.
MICHAEL CRUMSHO: It does come down to understanding where you're trying to go because I'm not trying to take the whole world of data that we have available to us at McGraw Hill and just standardize that across all products. I'm trying to take the data pieces that I know are important for accomplishing those goals, and making sure those are uniformly applied across all of the products, where we have to demonstrate that.
MICHAEL CRUMSHO: So really, it's use case. What is the problem you're trying to solve? What are the jobs to be done here? And then figuring out what data components are essential to that.
BETH WINDSOR: Yeah, I would definitely agree with Michael there. And I agree with Michael and Christian-- you never want to over-process anything. I feel like that's just a life motto: don't over-process things. But for our use cases, for our business analytics team, we found that standardization has allowed us to be more interoperable and scalable. And I think that might have been what Michael was just saying.
BETH WINDSOR: Because when we do introduce those third party data sets, it allows us to plug and play very quickly because we've had that layer of standardization and normalization already taken care of. And we can work with multiple groups quickly because we know that our data is clean and joined and our house is clean. So I guess you could argue both sides of it. But we don't build anything within our ecosystem without knowing how it will support the teams and having it structured in a way that will allow us to move quickly with it.
COLLEEN SCOLLANS: I was just going to build on Josh's question about what small teams could do, because we work with clients with very, very large teams and clients with very, very small teams. And this may seem really simple, but it's practical and important: the number of organizations that we encounter that don't even have Google Analytics set up correctly is surprising. So for me, it's look at what you have and what it can do, and make sure you're fully maximizing that.
COLLEEN SCOLLANS: Everybody has an analytics tool-- usually Google Analytics, but obviously there are other choices. And in many cases they're not set up. There's no standard UTM parameter tracking for marketing, just to pick a marketing use case, or there aren't dashboards built, which you can do in Google. So whatever system you have, make sure it's giving you what you need.
COLLEEN SCOLLANS: So that would be my first tip. I'm often surprised at that.
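As a concrete example of the baseline setup Colleen is pointing at, here is a minimal Python sketch of standardized UTM tagging (the campaign values are hypothetical; utm_source, utm_medium, and utm_campaign are the standard parameters Google Analytics reads):

    from urllib.parse import urlencode

    def tag_url(base_url: str, source: str, medium: str, campaign: str) -> str:
        # Attach the standard UTM parameters so every channel is tracked consistently.
        params = {
            "utm_source": source,      # e.g. "newsletter"
            "utm_medium": medium,      # e.g. "email"
            "utm_campaign": campaign,  # e.g. "spring_issue_launch"
        }
        return f"{base_url}?{urlencode(params)}"

    # Hypothetical campaign link:
    print(tag_url("https://example.org/article", "newsletter", "email", "spring_issue_launch"))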
JOSH DAHL: Yeah. Go ahead, Beth. No, please.
BETH WINDSOR: I was just going to mention, on that question of how you get started: I think the alignment of your groups is also very important. You need to make sure that your business units and your IT group and your data professionals are all aligned and working together. And again, that goes back to that first comment I was making about culture, because one of my favorite quotes is, culture eats strategy for breakfast.
BETH WINDSOR: And you really do need those three groups working together to collaborate without ego. And that's very important-- without the ego-- because this needs to be part of their objectives. Everybody needs to come to the table on a level playing field. Nobody should be asking for favors. This needs to be a sanctioned effort to solve these problems.
BETH WINDSOR: And that is where that pressure from the top down comes into play.
JOSH DAHL: Yeah. Are there other practical steps? Because you talked about that at the front of this, Beth-- having the buy-in, the culture, is so important. Having the buy-in from the top. How do you get buy-in across different teams and different departments that have different priorities? Are there practical things that you've done or seen at the ACS that might work for other organizations?
BETH WINDSOR: I mean, that is where you put the people in the room together. You had mentioned earlier that ACS is advanced, but our business analytics team did not come into this situation in an advanced state. We built this from the ground up. And it literally started with two people: one person had a problem and one person had data. They literally used a spreadsheet to solve the problem.
BETH WINDSOR: And it was a pretty significant problem that saw a nice return. And that's what got everyone's attention. So when you can get attention by solving a really significant problem that has a return, that's where you can start to sell this as potential. I don't know if you all have had these same troubles, this same problem. Now, I feel like people are starting to understand the value of data.
BETH WINDSOR: But this was a good seven years ago, when people were still a little-- it was a question mark. We were lucky in the sense that it didn't take a lot of convincing for our vice president to understand the value, even 7, 10 years ago. So we were getting that pressure. But I do think it is those small projects where you get those wins and you can prove the value-- that's where you start to get the buy-in.
BETH WINDSOR: Again, make their jobs easier and make their results better, and that's when people start coming to you. But the alignment is important as well. You have that business unit, you have that IT group, and you have that data professional in the middle. The whole group needs to collaborate, but that data professional in the middle needs to be the translator.
JOSH DAHL: Got it.
CHRISTIAN GRUBAK: But I think start with the small projects first, because what we often see is organizations go from the first level of maturity to the second, and now they want to be able to analyze everything. And the thing about it is that when you invest a lot of resources and money in collecting data, and you don't practice asking the questions, querying the data, analyzing the data, then at some point it goes stale and people turn away from it.
CHRISTIAN GRUBAK: It becomes a very, very expensive product that you're actually not using. So there's that change management process of continuing to look at your analytics-- your Google Analytics data, even though it's not first class, is better than what you had a minute ago. Getting the whole organization used to that decision process is very important, from my perspective, because there are so many products out there sitting idle because they're not being used.
COLLEEN SCOLLANS: Yeah. I would also say-- maybe this is just me, but data is infinitely interesting. And so it is easy to have too much data and want to swim in it. Being disciplined about what data matters, what data is driving business decisions, or what data you can act on is really important. A lot of it is just really super interesting, so there has to be a little bit of discipline as well.
BETH WINDSOR: You're not the only one that feels that way, Colleen. I think it's really interesting. [LAUGHTER]
JOSH DAHL: We got the right panel for today. That's perfect. Yeah. There was a question that came in from the chat that I think is related to this. Nathan Quinn asks, can you speak to the dangers of over-collecting customer data and holding data that isn't being used-- it's being collected and might be useful someday, but you don't really have a direct use?
JOSH DAHL: Can you guys speak to the governance side of it? Yeah, there we go.
MICHAEL CRUMSHO: There too.
COLLEEN SCOLLANS: What are you using the data for? There may be reasons to collect a swath of data if you're doing big-data-analytics kinds of predictive products. But the majority of clients that we work with have really defined use cases for their customer and audience data. I preach minimal viable data: what is the data you need to hold for those use cases? You don't want to be overwhelmed with data.
COLLEEN SCOLLANS: Also, privacy legislation indicates we shouldn't be holding data that we're not using. There has to be a purpose for holding and collecting data. An example I give-- it's a really simple example-- is that I don't need to hold every single link click in every single email I've sent out. Maybe there's some business case for that down the road. But as a marketer, I need to know maybe the last couple of emails you clicked on, or whether you click on emails at all.
COLLEEN SCOLLANS: Or what type of emails do you click on? I often need the higher-level kind of insights, and you can get deluged in data. So to be respectful of customer privacy and to do your job well, you have to have a minimal viable data mindset.
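A minimal sketch of that minimal-viable-data idea: keep only the most recent few email clicks per contact instead of the full click history (the window size and field names are hypothetical):

    from collections import defaultdict, deque

    MAX_CLICKS_KEPT = 3  # hold only the most recent clicks per contact

    # A deque with maxlen silently discards the oldest entry as new ones arrive.
    recent_clicks = defaultdict(lambda: deque(maxlen=MAX_CLICKS_KEPT))

    def record_click(contact_id: str, email_topic: str) -> None:
        recent_clicks[contact_id].append(email_topic)

    for topic in ["open_access", "chemistry", "webinars", "pricing"]:
        record_click("contact-42", topic)

    # Only the last three survive: ['chemistry', 'webinars', 'pricing']
    print(list(recent_clicks["contact-42"]))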
MICHAEL CRUMSHO: Yeah, a lot of times, when I encounter a request for collecting a specific piece of data that isn't attached to any utility, I kick it back right away. We should be collecting this. OK, well, what do you want to do with it? I don't know yet, but maybe someday we'll want to do something with it.
COLLEEN SCOLLANS: Yeah, no, thank you.
MICHAEL CRUMSHO: A lot of that is for cutting down on engineering churn, and also for recognizing the ever-shifting privacy landscape in which we live. So I think it's important to partner with your legal department, or whoever's in charge in that capacity, to understand their perspective on what you're actually capable of collecting and leveraging. Overall, that's a fun one. We have made some decisions--
MICHAEL CRUMSHO: we're standing up some new products right now-- about over-collecting some data as a part of that build. We have some ideas for how we'll use it. But we're really trying to stay out of that. Just because we think we can collect it doesn't mean we should; we need to know what we're going to do with it. And does that mean at some point in the future you might come up with a really great data idea and you won't have a couple of years' worth of data?
MICHAEL CRUMSHO: Potentially. But I think that's more of a case for really establishing the use cases right now and understanding what your trajectory and strategy could end up being, so that you can actually plan accordingly.
COLLEEN SCOLLANS: And prioritizing those use cases. You might have batch one this year, batch two next year.
CHRISTIAN GRUBAK: Yeah, but I think there's also a very, very important piece around understanding the metrics you're looking at, because some of them can actually impact each other in negative ways. Look at something as simple as retention rates: if a retention rate goes up, is that good or bad? It should be good. But if your user count doesn't follow, it can actually mean you're attracting less new business even as your retention rate goes up.
CHRISTIAN GRUBAK: And so, to understand what really drives it, you have to agree on what good looks like. What are we tracking here? Are we tracking the acquisition cost, or are we tracking the lifetime value? Which one? And what's the ratio we're expecting between those two? And if the retention rate drops, is marketing going to go crazy? But you're seeing more users coming on.
CHRISTIAN GRUBAK: Well, that should be good in a potential sales situation. So just being able to avail the data doesn't actually solve the problem. You need to understand what you want to do with it and how the metrics impact each other as well.
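To make the point concrete, a small worked sketch with hypothetical figures: retention can improve while the customer base shrinks if new acquisition stalls, which is why these metrics, and the LTV-to-CAC ratio, have to be read together:

    # Hypothetical year-over-year figures (all numbers invented for illustration).
    last_year = {"start": 1000, "retained": 800, "new": 300}  # 80% retention, ends at 1100
    this_year = {"start": 1100, "retained": 935, "new": 150}  # 85% retention, ends at 1085

    for label, yr in [("last year", last_year), ("this year", this_year)]:
        retention = yr["retained"] / yr["start"]
        ending = yr["retained"] + yr["new"]
        print(f"{label}: retention {retention:.0%}, new {yr['new']}, ending customers {ending}")

    # Retention rose (80% -> 85%) but the base shrank (1100 -> 1085): less new business.

    # The companion check: lifetime value vs. acquisition cost.
    cac, ltv = 120.0, 420.0  # hypothetical dollars
    print(f"LTV:CAC = {ltv / cac:.1f}")  # agree up front what 'good' looks like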
JOSH DAHL: So it's not just about the metrics, but it's also about the context for the metrics and really understanding this is a key metric because it means this for us as a business or for us as an organization.
COLLEEN SCOLLANS: Yeah.
CHRISTIAN GRUBAK: Exactly.
COLLEEN SCOLLANS: Just building on your really good examples, Christian: cost per acquisition may be interesting if I'm comparing different ad channels, because it's a flat, easily comparable metric. But if I really want to understand whether my marketing is working, you've got to get closer to lifetime value. So both metrics are useful, but in very different contexts and for different purposes.
CHRISTIAN GRUBAK: Exactly. And then, I mean, Michael made a great point that there's a lot being done in the data privacy space, which is now making this even harder. We want to measure everything, we want to put everything on dashboards, and at the same time somebody has taken away the IP address and is masking it. So you may be looking at data which isn't actually correct.
CHRISTIAN GRUBAK: So there have to be different strategies. And that's also why establishing that baseline, so you can follow it and benchmark against what it was before, is going to be really important, because it's going to get harder and harder to collect behavioral data. Now, some of the data we're collecting from, say, peer review management systems and other platforms will still be there post-login.
CHRISTIAN GRUBAK: But pre-login is going to be difficult.
COLLEEN SCOLLANS: So that first-party data is always important. And I agree that there are challenges. But the best data you can ever get is zero-party data. I hate how they've numbered it, because it should be higher, given it's more important. But when someone chooses to tell you something about themselves-- I'm interested in X. I want to learn about Y. Building that trusting relationship to get data that customers want to share with you, because they trust it will improve their experience-- that is really the Holy Grail of data.
COLLEEN SCOLLANS: It's hard to get, and you need strategies around it, but it is by far, in my opinion, the most valuable data.
CHRISTIAN GRUBAK: Yeah, but what do I get in return? That's an important question, because I can always sign up. But what do I actually get in return for signing up? Why do you need me? Why do I need you, in that case? And that's an important question to ask yourself.
COLLEEN SCOLLANS: 100%. It's not getting a white paper. It's got to be a much more robust value exchange between the organization and the person they're collecting data from. Completely agree.
JOSH DAHL: Yeah, this is one of the questions that came in as well, and I think you've touched on this: it's metrics-- knowing what you're measuring, but then knowing the context, like what's the outcome you're trying to drive, or what does this metric actually mean for our business. But one of the questions asked for some examples, if anyone can share, around quantifying and proving the value of a data project or a data investment. Are there examples you can share where you've been able to go back and show, we've spent this or invested time and effort into this, and this is what success looks like?
JOSH DAHL: Beth, you'd mentioned one of the examples just broadly. Yeah.
BETH WINDSOR: Yeah, we've actually struggled with that quite a bit. And I hope maybe others have too-- or maybe you've solved it. So, for example, these third-party data sets that you pay for: they become part of a larger process machine, if you will, especially with regard to customer acquisition and business development. So you're not using these data sets in a vacuum.
BETH WINDSOR: And you can't suddenly assign all of the value to one data set, because there is an entire downstream process once you've used it. So we've had conversations about how we assign value. We haven't really come up with a great solution for that yet. But we are still trying, because, again, these data sets are expensive, and you have to be able to prove that they are worth it. And we're at the point now where if, say, a sales rep says this is valuable, we just take them at their word that it is a tool that they need to do their job.
BETH WINDSOR: But I would love to hear other opinions about whether you've successfully measured the value of some of these data products that fit within a larger process. I'm curious if anybody's solved that.
MICHAEL CRUMSHO: So the quantification for us, really, at this point, for where we are, is largely around building the right eventing and reporting into our products to make sure that we have the correct sales enablement data to rationalize the annual recurring revenue that we charge. So a lot of the quantification for us is really in time spent. We are looking across the organization and seeing that we have customer success people who are spending X number of hours massaging reports or figuring out how to get data.
MICHAEL CRUMSHO: There are X number of bespoke requests going to engineers-- who could be working on other things that are more mission-critical-- to get certain pieces of data. There are a certain number of hours that sales reps are spending to massage these and put together the right reporting. So it's really just understanding where the hours are going for the team, because we don't necessarily have the tools in place that we need to operate at scale.
MICHAEL CRUMSHO: That's what we've been able to utilize to help make the case. We have engineer A who spends this amount of time across these sprints doing these things. If we can make this process more automatic or more standardized or uniform, that's going to drop to close to zero, and that person can then take on other projects that are a little more revenue-driving. So for us right now, it really is that time quantification, that time-savings piece.
MICHAEL CRUMSHO: That's how we can establish the baselines that we need. And then we're going to shift into more of, I think, a classic total-addressable-market scenario in terms of the products we're trying to build that will be backed on data, so that we can match the revenue that we think we can produce by having the right data insights built into the products. And that's what we'll be tracking against in the future. So a couple of different levels for us.
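A minimal sketch of the time-savings math Michael describes, turning scattered manual hours into a number leadership can weigh (all hours and rates are hypothetical):

    # Hypothetical weekly hours spent massaging reports and handling bespoke data pulls.
    manual_hours_per_week = {
        "customer_success_reporting": 10,
        "bespoke_engineering_requests": 8,
        "sales_rep_report_massaging": 6,
    }
    blended_hourly_cost = 75.0  # hypothetical loaded cost per hour

    weekly_hours = sum(manual_hours_per_week.values())
    annual_cost = weekly_hours * blended_hourly_cost * 48  # roughly 48 working weeks

    print(f"{weekly_hours} hours/week, roughly ${annual_cost:,.0f}/year reclaimable if automated")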
JOSH DAHL: Sounds like this is--
COLLEEN SCOLLANS: Oh, sorry, Josh. I was going to say the exact same thing, Michael. Time savings, particularly for sales and marketing teams and product teams, is where I've seen the immediate benefits that you can quantify. And then I've seen a lot of really good improvements in marketing performance that you can tie to actual revenue coming in, which is obviously quite measurable as well.
MICHAEL CRUMSHO: Yeah, the best strategy that I've always had for getting a lot of stuff done is to partner with the people who bring in the money and help them, so that they can bring in more money because they have something that's a little bit easier for them to use. Because that's what talks. No one wants to hear that.
BETH WINDSOR: Yeah, we are. Sorry. We are closely partnered-- at the moment, we're embedded within our sales team. So those are literally the people out there pounding the pavement, pulling in the money. And as soon as I said that, I realized that we do have a process to find new opportunities, new customer opportunities. And with that process, we have been able to put a value on those prospective new customers and then understand if that has led to a conversion, a new closed customer.
BETH WINDSOR: But again, that process is never short. It's never like, oh, these analytics have identified a potential customer, and they sign on the next day. Two years later, they convert and become a customer. So it's hard to say how much of that tool, that analytics process, actually contributed to that new customer. Yeah, it's tricky.
JOSH DAHL: I think the takeaway for me from all this is it's really about working with the business stakeholders that are driving this, whatever the department, really understanding what they're trying to drive from this-- whether it's user acquisition or retention or shorter turnaround on finding new leads-- and building metrics around that. A lot of this is just collaboration between the groups to make sure that you're really synced up on what's actually going to show value.
JOSH DAHL: Yeah. So I was going to throw a little hand grenade in here, because we've got five minutes left and there were some AI questions. And one I wanted to talk about, because I've had some personal use of this, is: how does AI change the importance of standardization when you have AI that can work with large unstructured data sets and really streamline them?
JOSH DAHL: Does that lower the barrier for publishers of all sizes to get involved and start building robust data pipelines? Any experience with that in your organizations-- different, newer technologies to simplify getting these unstructured or less standardized data sets in one spot?
MICHAEL CRUMSHO: I think a lot.
COLLEEN SCOLLANS: Oh, go ahead, Michael. You go ahead.
MICHAEL CRUMSHO: No, go for it. You go first.
COLLEEN SCOLLANS: Oh, sorry. I mean, I think there has been a lot of time and effort historically spent on tagging content to taxonomies, and that can certainly be automated with NLP. So I think that's a really good example where AI can give you content intelligence that would have been harder to do before-- or it wouldn't have been updated, wouldn't have been as granular, et cetera. So that's an obvious example.
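A minimal sketch of that kind of automated taxonomy tagging, using a simple supervised text classifier in scikit-learn (the taxonomy terms and training snippets are hypothetical; production systems are far larger and often NLP- or LLM-based):

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical labeled abstracts mapped to taxonomy terms.
    texts = [
        "catalytic synthesis of novel polymer membranes",
        "machine learning models for drug discovery pipelines",
        "electrochemical battery electrode degradation analysis",
        "neural networks applied to protein structure prediction",
    ]
    labels = ["materials", "ai_in_science", "materials", "ai_in_science"]

    # TF-IDF features feeding a logistic regression classifier.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(texts, labels)

    print(model.predict(["deep learning for reaction yield prediction"]))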
MICHAEL CRUMSHO: I mean, where we're focusing on AI within our organization is really on trying to decrease the amount of manual work by 80% in a lot of areas. So it's less about just dropping a mess of unstructured data somewhere and trying to use that to pull out insights. I was actually going to say that for a lot of our data science engineers, making sure that there is some level of structure to the data really helps their jobs and makes things a little bit easier.
MICHAEL CRUMSHO: So I don't think we're in a world yet where we've gotten away from liking to see nicely structured, organized, and ordered CSVs. We do spend a lot of time figuring out how to compensate for the fact that there are wild differences in some of the formatting. But a lot of our focus right now is really just acceleration: trying to cut down, by leveraging AI, the number of things a human being has to do in terms of review and things of that nature.
MICHAEL CRUMSHO: We haven't gotten to the point yet where we're confident enough in the models to have them doing a lot of the analysis work unguided or unchecked.
COLLEEN SCOLLANS: Yeah, and to get to that-- I don't know if you've had this experience, Michael, but marketing is, I think, one of the areas where a lot of the work can be augmented by AI, but it sure as heck takes a lot of work to get that to work right. A lot of structure. A lot of thinking-- thinking about your brand LLM, et cetera. So AI is fantastic and tremendously useful, but it's not a magic bullet.
MICHAEL CRUMSHO: We do joke about that a lot, because-- this is less data-related, but we are doing various projects to try and accelerate content development. And there's this perspective that, well, AI can just write all the content for you. And it's like, AI can write a lot of nonsense that a human being then has to go and correct, which you then feed back into the model to show it what good actually looks like.
MICHAEL CRUMSHO: So I think there is this misconception that we are at a point where it's just push-button and turnkey.
COLLEEN SCOLLANS: Yeah, it is not.
CHRISTIAN GRUBAK: No, but I think there is an opportunity in AI in terms of understanding data, but there's also a limitation. It's an easy way to start-- it can actually do a lot of analytics for you, analyze fairly good data sets and whatnot. But then you get to the point where being able to prompt over data is no longer enough. Now you actually need prompt engineers, you need actual development resources, data scientists.
CHRISTIAN GRUBAK: And there we're still on the maturity curve. I usually think of technology like medical advances: it favors the wealthy first. Whenever we see a new technology coming out, you have to pay a lot for it. In a minute, it becomes a commodity-- like the big models when ChatGPT came out. Everybody rushed towards that.
CHRISTIAN GRUBAK: Now there's a whole forest of them, and the foundational models no longer have the value they did at the time. Now it's what you do with them, what you add on top of them. So I think there is an opportunity, especially for the smaller organizations, to access, to analyze, to fast-track some of those resources. But it's not a replacement for true data talent.
JOSH DAHL: Yeah.
BETH WINDSOR: I would agree with that. We have dabbled in it a little here, and we've developed some internal tools, maybe on the editorial side. It's not replacing anyone's job. It's more just a way to optimize what they can do and maybe help them make better decisions. But yeah, it's in the early stages. We're looking at some gen BI projects, but we haven't gotten much traction yet.
BETH WINDSOR: Maybe if we have a part 2 to this panel--
JOSH DAHL: There we go.
BETH WINDSOR: I might have more to share.
JOSH DAHL: And yeah, by then we'll have given data and analysis over to the computers. Yes. That's right.
BETH WINDSOR: I hope not. I hope it's not.
COLLEEN SCOLLANS: I'm sorry. Go ahead.
JOSH DAHL: Yeah.
BETH WINDSOR: No, I hope it's not totally handed over to the computers, because that doesn't sound like fun. All the fun is gone.
COLLEEN SCOLLANS: I was just going to say we could do a whole other panel on AI.
JOSH DAHL: Yeah.
BETH WINDSOR: I like Colleen's idea.
JOSH DAHL: We left-- there were a couple of AI questions we didn't get to. And yeah, there might be a part 2. We are at time, though, and I want to respect our panelists' and our attendees' time. Beth, Colleen, Michael, Christian, thank you so much for joining us. For all the attendees, we really appreciate it. We'll have the recording available so you can follow up if you want to go back and review, or if you want to send it to someone that wasn't able to attend.
JOSH DAHL: So thanks so much for your time. Thank you.
CHRISTIAN GRUBAK: Thank you.
BETH WINDSOR: Thanks, guys. Thanks, everyone.