Name:
Technology Leader Roundtable
Description:
Technology Leader Roundtable
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/d3df202b-225f-45c6-87f8-8dc0198cec69/thumbnails/d3df202b-225f-45c6-87f8-8dc0198cec69.jpg
Duration:
T01H09M16S
Embed URL:
https://stream.cadmore.media/player/d3df202b-225f-45c6-87f8-8dc0198cec69
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/d3df202b-225f-45c6-87f8-8dc0198cec69/2 - Technology Leader Roundtable.mp4?sv=2019-02-02&sr=c&sig=mseZY4CBgK1DbE2Rf3tbmE%2B5eLo1S3Y7PMzHUoQ7WSc%3D&st=2024-11-21T09%3A06%3A41Z&se=2024-11-21T11%3A11%3A41Z&sp=r
Upload Date:
2020-11-18T00:00:00.0000000
Transcript:
Language: EN.
Segment: 0.
WILL SCHWEITZER: So this is going to be kind of a free ranging conversation. I don't think anybody on the stage feels like they're an expert on any of these topics. So in some cases, you're getting our best guess and kind of our opinions. So let's start. And the topic we wanted to start with is actually something that our industry's been engaged with quite a bit recently.
WILL SCHWEITZER: And that is kind of open source solutions, and how, from everybody's perspective, we evaluate the use of open source versus commercial or proprietary technologies in our technology stacks. Who would like to start?
STUART LEITCH: I'll take this one. This is probably something that I'm more qualified to speak to from the technology side. Most of our stack is actually open source. We're on the Microsoft stack, and Microsoft has just been progressively open-sourcing more and more of their stack. And we actually prefer to use things that are open source when we can. But we also really think about the size of the community behind that.
STUART LEITCH: We also think about what the license is and what the risk is of this actually becoming some sort of freemium model. We've certainly got things within our stack, like the Pentaho stack for analytics or the ELK stack for monitoring. These are both things that really started out as free software, as a genuine kind of open source, but ultimately evolved into freemium models where, despite the licenses allowing you to use the software in any context, the companies that were really funding the development withheld key features.
STUART LEITCH: And so we ended up, for both of those products, actually getting into a traditional kind of enterprise software agreement. So that's a really important thing to think about as you're getting into open source, what's the risk? And we also think about this from the perspective of the potential for abandonment. We've had a number of components that we have dependencies on effectively be abandoned by the community.
STUART LEITCH: And we think about what our ability is to actually maintain that code base ourselves. So those are just a few of my thoughts around that.
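Stuart's checklist for an open source dependency--community size, license risk, abandonment--can be sketched concretely. A minimal sketch, assuming the dependency is hosted on GitHub and using the public GitHub REST API; the example repo and the one-year staleness threshold are illustrative assumptions, not anything the panelists actually run:

```python
# Hedged sketch: pull a few coarse health signals for one GitHub-hosted
# dependency. Repo name and thresholds are invented for illustration.
import datetime
import requests

def dependency_health(owner: str, repo: str) -> dict:
    """Fetch community-size, license, and abandonment signals for a repo."""
    r = requests.get(f"https://api.github.com/repos/{owner}/{repo}", timeout=10)
    r.raise_for_status()
    data = r.json()
    last_push = datetime.datetime.fromisoformat(data["pushed_at"].rstrip("Z"))
    idle_days = (datetime.datetime.utcnow() - last_push).days
    return {
        "stars": data["stargazers_count"],  # rough proxy for community size
        "license": (data.get("license") or {}).get("spdx_id"),  # relicensing/freemium risk starts here
        "archived": data["archived"],  # explicit abandonment flag
        "days_since_last_push": idle_days,  # stale repos hint at abandonment
    }

if __name__ == "__main__":
    # Hypothetical example dependency.
    report = dependency_health("pentaho", "pentaho-kettle")
    # Flag anything archived or idle for more than a year for manual review.
    if report["archived"] or report["days_since_last_push"] > 365:
        print("Review this dependency:", report)
    else:
        print("Looks active:", report)
```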
ANN MICHAEL: I'll chime in next. So first of all, I've been at PLOS for three months, so my answers will reflect that. But second of all, it's probably good to frame this in the sense that my role is a little different. I would not define myself as a technologist. I'm the chief digital officer, and technology personnel make up about 12% to 15% of the people that report into me. So technology, obviously, is important.
ANN MICHAEL: But it's that business perspective and digital transformation that's really our focus. And so as such, when we look at open source, I think we have definitely found modular uses. We use Pentaho as well. And there are other parts of our stack that are open. But when we really look at the bigger issues that we want to deal with, what we find a lot of times is that PLOS One breaks everything.
ANN MICHAEL: Because of its volume, because of its complexity, it just makes software cry. And so I think that one of the things that we've been doing when we look at Coko, when we look at some of the other options out there from an open source perspective, one of my goals is, I actually would like to get our product and some of our development folks involved in these solutions, even though we're not using them and probably couldn't for years.
ANN MICHAEL: But the only way we're ever going to be able to use them, I feel, is to participate in some fashion. So that the requirements of things like a PLOS One are taken into account when these things grow and expand. So I would say our current strategy is modular, where we can use open source. It's in our blood to be open. So of course, we would like to do that, but some places, we just can't.
JOHN SHAW: So I think you both have covered a lot of things that I would cover. I think your point is really good, that with Microsoft, and we're a Microsoft house ourselves, with them opening up their technology, that kind of qualifies them as open source in another way, a well-funded open source, right? I think when we're looking at technology at the company, a couple of different things always have to come into play.
JOHN SHAW: One is, generally, what are the frameworks that we have? How is it being used? Where is it in the organization? What type of technology are we putting in, as an enterprise system or as a platform? Is it a SaaS-based solution or is it a component? So we're constantly evaluating the best technology to solve the problem that we have. But we also really have to think about our technologists in the company, and make sure that we don't stray too far from what we know we can support internally.
JOHN SHAW: And then it gets also back down to what's the community support, because we're going to need to turn to them. We've gone in different directions and had different successes going open source and then going proprietary. So I'd say we pick the best solution for the problem we're trying to solve.
WILL SCHWEITZER: Right, so I think there's a theme in all three of your comments around pragmatism, and then looking at essentially the underlying sustainability of technology. And one of the questions I keep asking, essentially around what we're seeing with, say, Coko and other open source, purpose-built tools for our space, is whether STM is a big enough community to sustain them, and whether they're going to mature to a point that your smaller society is going to be able to deploy them on their own.
WILL SCHWEITZER: Do you have any thoughts on that, Ann?
ANN MICHAEL: So I would say, I honestly could not answer that question. But what I will say, kind of hearkening back to my consulting background, is that you're starting to see organizations band together. GSW, which is on Silverchair--so obviously, that's not a platform other than Silverchair--but I think you're starting to see an interest, even in efforts that are not publicly discussed yet, of societies trying to work together to come up with thought leadership, a core of people that would support a platform.
ANN MICHAEL: And I think that's an interesting development. I don't know how many of them will spring up. And I don't know if any of them will be like the open source alternative that you're talking about, sustained by the industry. But I do think it's interesting that people are looking to band together. And I'm not at liberty to say, but there are at least three different areas where I know societies, and even some with publishers, are trying to do this.
JOHN SHAW: So that's a good point. I think when you talk about open source, maybe you have to go upstream and get back to community first. So let's just focus on the community part. We are seeing a lot of different communities getting together and banding together to try to solve some problems. When I think about this, the question is, how can we neutralize some of the things that we're trying to do?
JOHN SHAW: Which gets into some other questions I think you'll be asking. If we believe that we have to customize everything, each of our organizations is going to continue to go down paths that are not going to allow us to band together as much. I'm not trying to say we should commoditize everything. But I do think we should look at why we're doing things, and make sure we're doing them for the right reasons.
JOHN SHAW: And the more we can neutralize, the better we can work together to solve the problem.
STUART LEITCH: And I would just add on my two cents, that I think the industry is probably just large enough to support quite a few components here. But it all comes down to the politics. You're really thinking about who's providing the bulk of the funding for these various projects. And can they actually get alignment? Or is there division within the community? Because if you get enough division, you basically diffuse your effort, and you no longer have enough momentum to really get that traction.
ANN MICHAEL: Well, to build on something that John said too, that I think is very interesting, there's the whole snowflake concept, that everyone believes they're a snowflake, and it makes it hard for people to work together. The flip side of that is everybody wants the space to innovate. And one of the things that we think about, and again, I think you'll get to this in some of your later questions, is how do we decide what we need to do, and what we can partner to do, in order to maintain a space where we can still innovate in the areas that are strategically important, but don't have to basically reinvent the plumbing.
WILL SCHWEITZER: And the other thing that I've been thinking about a lot is essentially what is the impetus or the driver for these open source solutions. And it seems like on some level it's a measure of ethos. On some level, it's a concern over cost and trying to decrease cost, say, for your publishing stack components. But is there another driver here? Is it that the widely available solutions in the market aren't meeting needs or requirements?
ANN MICHAEL: Actually, I'm going to go first on this one too. Because I just participated in writing something with Lauren that we posted yesterday. And one of the concerns that we hear--and you may or may not buy into it, and that's fine; I'm not judging the concern, but it is a concern--is that a lot of societies, especially self-publishing societies, are very nervous about getting sucked up into commercial publishers, because their partners get purchased.
ANN MICHAEL: And I'm not saying that open source necessarily negates that, but I believe they feel as though that might be something that is less likely. I'm not sure it is. But again, I think that's another driver here, is this fear of, let's just say it, everybody being sucked up by Elsevier. And it is a paranoia, to some extent, but it's still palpable.
WILL SCHWEITZER: Yes.
JOHN SHAW: Yeah. I'm worried, sorry.
ANN MICHAEL: I don't think they're after you.
JOHN SHAW: Yeah, they're after everyone.
ANN MICHAEL: John and I-- the very first time I ever spoke in STM was 2004, at the Palace, SSP, on content management systems, sitting next to him, only he was on this side. And we got in trouble for horsing around.
JOHN SHAW: I haven't started yet. I'm a little tired. Just flew in. Going to get to it. Yeah, I'm not sure how much more I have to add on this one. Again, I'll take it in a couple of different directions. So I think we confuse, and you just hit on that, open source versus banding together to solve problems, whether it's open source or not.
JOHN SHAW: I agree with you, thinking about maybe the genesis of Coko--well, the genesis of Coko goes back, obviously, to you guys. But if you think about what the mission was there, it was really to try to provide something that the community could use, that wouldn't go away, that everyone could band around, that wouldn't be sucked up, exactly what you hit on. And I think that's noble, and I think that can work. You still need to make the right decisions for your organization.
JOHN SHAW: But whether it's open source or not, I think that's a decision that has to be made. But I still go back to the community, and banding together, and working together to solve problems.
STUART LEITCH: I think more generally, I would look at these choices contextually. You're really thinking about, for whatever component you're using, what's the probability that this code base will still be vibrant and actively developed in 10 years. And whether that's a proprietary solution or an open source solution, either way, you've got the same kinds of risks.
STUART LEITCH: They just look a little different. If you're looking at buying into a proprietary ecosystem, you're really looking at, well, what's the health of that company? What's the vision of that company? What's the executive leadership like? What's the talent like within the organization? What are their risks? Are they likely to be acquired?
STUART LEITCH: You're thinking all these kinds of things through. You're just thinking that through in a different way with open source solutions. So I don't think there's actually a simple answer to this. I think you're looking at it contextually each time.
JOHN SHAW: And it doesn't have to be either/or. It can be both. Because chances are you have open source technology being used somewhere within your solution, period.
WILL SCHWEITZER: Right, I think a lot of us are finding all of our systems are becoming more modular these days. Let's shift gears a bit. And for the majority of us in the room, Google is kind of the largest driver of traffic to our products and sites. And when a user comes to us from Google, they may spend, if we're lucky, two and a half minutes on our site.
WILL SCHWEITZER: They may look at a piece of content or two. They're going to download a PDF. And they're going to kind of move along. And that's an interesting paradigm for me, given the amount of time publishers--and I'm guilty of this in my past--spend putting things in the right rail, developing new widgets and customizations for our sites. And I'm kind of left wondering, is all publishing technology really a commodity?
WILL SCHWEITZER: Or can we make these investments? Can we customize our sites? Can we add functionality that differentiates our products in the market?
JOHN SHAW: So many different directions. I'll start there. I think back to when Alice and I worked together at Sage, and when we were first building our platforms, our journals platform. And when thinking about adding features on, we've matured, obviously, and gone different directions. But back then, I think we always thought about it as, there's not one way that we're trying to solve the problem.
JOHN SHAW: There's about five communities that we were looking to build for when adding features into a platform. One of them, of course, was [INAUDIBLE] the researcher, the professor, the student. It went all the way up to the press, and how we thought that the product would be reviewed. Or a librarian was the one who purchased the product, so what did the librarian need in there?
JOHN SHAW: I think Ann will talk about an anecdote that hits right on that. But if you don't put in one of those core features the librarian needs, then chances are that could, at least in the past, have been notched against your product and could have been a critical decision made there. So I just think with the features, to get to your commodity piece, we're not a commodity yet.
JOHN SHAW: I think there are different markets with different needs. And Sage has a lot of different markets. So I think we use analytics. We look at research. We look at what else is going on in the community. We look at how we can differentiate. We look at what we feel the community needs. We look at what Sage needs to put into the product. So there's a lot of different things that go into how to add a feature onto a platform.
JOHN SHAW: And sometimes a feature is only used by 1%, but maybe that 1% is OK.
ANN MICHAEL: And so John and I were talking about this before, because your actual question had stated that only 5% of people click on the right rail. I believe that was the [INAUDIBLE] question.
WILL SCHWEITZER: In my experience, yeah.
ANN MICHAEL: So I did a project for a large medical society 10 years ago, which was not the AMA, just to narrow it down a little, and they were considering getting rid of advanced search on their website. And so we looked at that, and looked into who was using it. We looked through logs. We talked to folks. And what we found out was that even though only 1.5% of people used advanced search, they were the two single most important bodies of people.
ANN MICHAEL: They were the librarians and their own internal editors. And so the reality was, by removing that just by looking at the numbers, they would have upset the most influential people, and the people contributing the most to their actual content. So it's really important to have context for those numbers. But the other thing that I just thought of as John was talking, too, is that we all have to understand that--I don't know if it's a commodity yet, but the bar's rising.
ANN MICHAEL: So if we don't--unfortunately, maybe we do this to each other. Maybe everybody adds things, and you have to keep up with the Joneses. Because there start to be expectations about what's there, regardless of how often it's used, which is insane, but it's true. You get judged on the things that aren't there. It's like living near the beach and never going, even though it's accessible to you.
STUART LEITCH: I think you largely covered the points I was going to make. The last thing I would think about is trying to differentiate the quality of users that are coming in, too. I know, as a Google user myself, my pattern is that I go through a cycle with my searching. I'll pull up a search page, and I'll basically control-click on a whole bunch of links, so they'll all pop up.
STUART LEITCH: And then I'll flip through all those links. And I'll scan them really quickly. And that'll kind of help me narrow down what I'm looking for. But all of those links that I've clicked that I've just looked at just for a brief second, they're all registering somewhere. If I find something that's interesting, and I'll find new sources of information, I might spend more time there.
STUART LEITCH: I'll be looking for related articles there. And that makes me, on those sites, essentially a higher quality user. And that's often hard to tease out of your traffic. But you definitely want to think about it: the overall statistics might be pretty small, but we know, at least at Silverchair, that for some of our clients the numbers are small, but as Ann was saying, these users really matter.
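A minimal sketch of the kind of segmentation Stuart and Ann describe--separating drive-by search visitors from high-engagement users--assuming hypothetical page-view logs; the log format, field names, and thresholds are all invented for illustration:

```python
# Hedged sketch: classify sessions as "drive-by" vs. "engaged" from page-view
# logs. Real analytics data will look different; this only shows the idea.
from collections import defaultdict

# Hypothetical rows: (session_id, seconds_on_page, came_from_google, viewed_related_article)
page_views = [
    ("s1", 4,   True,  False),
    ("s1", 2,   True,  False),
    ("s2", 160, True,  True),
    ("s2", 95,  False, True),
]

sessions = defaultdict(lambda: {"seconds": 0, "views": 0, "related": 0})
for sid, seconds, _from_google, related in page_views:
    s = sessions[sid]
    s["seconds"] += seconds
    s["views"] += 1
    s["related"] += int(related)

for sid, s in sessions.items():
    # Stuart's "control-click and scan" traffic: very short dwell, no follow-on reading.
    engaged = s["seconds"] > 120 or s["related"] > 0
    print(sid, "engaged" if engaged else "drive-by", s)
```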
JOHN SHAW: One other thing I'd add on. We do college textbooks at Sage. And so we're building a platform right now, for the first time, that will differentiate us against McGraw-Hill, Cengage, and others. So we're getting into a highly penetrated market, where they've been for 12 or 15 years, and trying to launch a product into it. So we have the benefit of hindsight, of seeing what they've done, and hearing from the customers what they think needs to be done, and talking to their users about what they're trying to accomplish.
JOHN SHAW: And it's not so much features. We're looking at it as-- well, it is features, in the sense we're able to start from scratch and say, OK, how can we pull back from everything they did, and what is the core use, what is the user actually trying to accomplish there? And so I wouldn't say we're feature rich at all. It's about usability, ultimately.
JOHN SHAW: Usability, and how easy is it for an instructor to set up a course? How easy is it for the author--or for the end user, the student--to be able to use that course? How good are those questions that are coming up on the quizzes? So it is a totally different landscape than what we've tried to do historically in the journals platforms, which really was competing with the Joneses, looking at what everyone else had and saying, well, if we don't have that, then they're not going to buy our product.
JOHN SHAW: So I think it comes down to the market as well.
WILL SCHWEITZER: So something everybody touched on was using analytics to make these product or feature decisions. And in the audience today, we have organizations that are really large, like Sage. We have medium sized organizations, like PLOS. And we have smaller organizations, who, by and large, I think are starting to think a lot more about analytics and how you can resource that function within an organization.
WILL SCHWEITZER: Ann or John, do you have any insights to share about how your companies approach analytics, the tools you use, the type of people that you need, how you bring that data into your decision making?
ANN MICHAEL: So I guess I can start. So analytics is definitely one of the groups that is within my part of the organization. We're actually already expanding it to some extent. The interesting thing about analytics, I think, is that whole idea of corporate-wide literacy, that it's hard to get anywhere with analytics until people understand that it's not an episodic consultation, but a part of how you do what you do, and start to create some of the demand.
ANN MICHAEL: That started naturally at PLOS, long before I got there. PLOS One is very analytically focused. So right now, we have a manager in analytics. She has--it's about to be three; I don't know if you knew that, Allison--about to be three people that we've kind of cultivated from other parts of the organization, that are working with her. And so first, we're focusing on some of the very basic things.
ANN MICHAEL: So I look at the analytics team as the scouts, both internal and external. So internally, what are we doing? How can we be more efficient, as far as process and other things? Externally, what are the opportunities that are out there? And then as we build things, they're also going to be the people that help us validate, did something have an impact or not?
ANN MICHAEL: We do have a data engineer that's kind of helping out, even though he's on the tech team. But we're going to grab him too. Anyway, we have a data scientist in training, a social science graduate with an advanced degree in data and analytics. And we're building it up. We use Sisense as our BI platform, but we're in the process of putting together a proper road map to outline what our data infrastructure is going to be, and how that's going to align with our overall tech infrastructure.
ANN MICHAEL: But that's something that we're probably planning to get to, or to finish in first draft, towards the end of the year. So we have basically pulled together what we have. We're trying to encourage use where it exists. We're trying to democratize use. We have a lot of people that design Sisense dashboards, for example, that are not within the data team. And our hope is that we're then going to move into a leadership position, where we're going to have this centralized group that is more about enterprise-related analytics.
ANN MICHAEL: And the actual operational groups are going to be doing their own. I don't know if that's helpful.
WILL SCHWEITZER: Very helpful.
JOHN SHAW: All that and more. So I think Sage is maybe just a little bit further along, but not much. Because this has really come up in the last couple of years, where it's been a true centralized effort to invest. I think also we had people in the organization that enjoyed analytics, that naturally went there. Maybe you'd have a couple of people in different departments that had something about analytics in their role. But today, I can say that we have an analytics strategy.
JOHN SHAW: We have a strategy within our technology strategy that's focused on data and analytics. We have a technology stream that goes all the way up into the senior management about data. We have a growing team of data analysts within the technology group, and that is domestic and overseas in India. So we have a lot of people that are focused on this, across the organization, so we're not just talking about journals.
JOHN SHAW: We're talking about data in every aspect, whether it's financial data, usage data, people data, every type of data. We're centralizing on the tools that we're using. So being a Microsoft shop, we did go with Power BI. Power BI is friendly enough to where it's getting people excited about the technology. So we have people naturally learning it. We have people that are being trained by Sage to learn it.
JOHN SHAW: We have excitement coming from the ground up. So from the top, but also ground up, in trying to solve problems that we couldn't solve before, or that took 50 people to solve. Now, we can do it very easily by putting the data in, and allowing people to sit there and analyze it.
ANN MICHAEL: I'm just going to add to it. One of the issues that I think most organizations have is that when you start to look at data, I think you have to start to look at it in two different ways. So you have exploratory things you want to do. And then you have specific questions that you want to answer. And you don't always have the data that you need. And so I think that's the other real important part of this, is thinking about what are the key questions for your business.
ANN MICHAEL: Do you have the data elements you need to answer those questions? What can you use as proxies in their absence? And can you actually collect the data that you need without a ridiculous cost? But I think the other thing-- as you know, I just finished a degree in May in data analytics at NYU Stern. And the first class, the first thing that they said was, they quoted George Box, a statistician.
ANN MICHAEL: And he said, all models are wrong, but some are useful. And I think that people need to understand that, when you look at data, you're looking at a representation. And you're doing the best you can do to make that representation reality. But there are always assumptions. There are always biases, even in your interpretation. And you have to keep an open mind about how you interpret data, and allow yourself to consider different interpretations before applying any knowledge you might reap from it.
JOHN SHAW: That's one of the things you hit on there. Doing data analytics and having a team doing that is fine. If you don't have the data, or you can't expose the data, or the quality of your data's so bad that it doesn't really matter what you do with it, then that's a problem in itself. So part of the strategy we've had for the last few years, we've just been trying to collect our data, which is very difficult. And maybe that goes back to the open source piece.
JOHN SHAW: When you have partners and vendors that don't like to share the data, or can't share the data, it makes it very difficult. So we've been trying to track our data down, get quality data back in, discover the data from our own systems, clean the data up, and then expose it back to the business, to allow the business to use it. So as much as we have the focus, and technology, and people that can help, this is not, necessarily, an all-technology-driven effort.
JOHN SHAW: This is a business effort. And technology is trying to expose the data through the systems and provide clean data, data lakes, places for them to go get this. Also, one other important thing: we have data scientists that are in our divisions. So the different divisions actually have data scientists that are working with the technology groups, trying to figure out what type of data they need, and how they need it, and then providing the tools to do that.
ANN MICHAEL: Sorry, one more thing before you--the other thing I was just thinking as you were talking is that you have to understand it's an iterative process too. You work with what you've got. You understand the gaps. You try to get better. And then you repeat. So technology's involved a lot of times--pipelines, and cleaning, and storing in some central place where people can then access that data.
ANN MICHAEL: But you're never done. So there's no done.
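A minimal sketch of the collect-clean-centralize loop John and Ann describe; the file names, column names, and the SQLite stand-in for a data lake are all illustrative assumptions:

```python
# Hedged sketch of "track the data down, clean it up, expose it back".
# Paths, columns, and the SQLite "central store" are invented for illustration.
import sqlite3
import pandas as pd

def clean_usage_report(path: str) -> pd.DataFrame:
    """Normalize one partner usage report before it lands in the shared store."""
    df = pd.read_csv(path)
    # Standardize column names so reports from different partners line up.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    df["month"] = pd.to_datetime(df["month"], errors="coerce")  # bad dates become NaT
    df = df.dropna(subset=["month", "downloads"])  # note the gaps, iterate on them later
    return df

if __name__ == "__main__":
    conn = sqlite3.connect("central_store.db")  # stand-in for a real data lake
    for report in ["partner_a_2019.csv", "partner_b_2019.csv"]:  # hypothetical files
        clean_usage_report(report).to_sql("usage", conn, if_exists="append", index=False)
    conn.close()
```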
STUART LEITCH: So a few thoughts I would have to kind of add onto this. Think about centralized versus decentralized. And they both have a role. Centralized would be where you have some team that's essentially pulling all the data together. And in an ideal world, they're creating some kind of fancy portal, where you can go in, and you can kind of do self-serve reports. And that was really the only way you could do this in classical, large scale BI projects.
STUART LEITCH: But with things like Power BI and Tableau, we've seen those models essentially emerge first in the decentralized kind of self-service model. And with something like Power BI, you can actually kind of mix and match the two of them. And there's certain types of enterprise data that you actually need to have a really centralized effort to kind of clean that data up.
STUART LEITCH: But you can also, with tools like Power BI, essentially have your leaders learn how to pull together models on their own. So for me, one of the key messages for my leaders is that I actually expect data literacy. I want them to roll up their sleeves. These might be people that are further on in their careers. But I do believe everybody is still capable of learning.
STUART LEITCH: And you don't need to master it. But you do need to be able to get in there, such that you can manipulate some of this data and form some basic conclusions. The second thing I would say about this is that it's similar to how we go about writing software. You could just start coding, but because we've got relatively high complexity in the system, we actually have a practice of people first creating a tech design.
STUART LEITCH: Basically, it's a clarification of intent. And that can actually be useful when you're trying to answer some questions about your data. You're trying to really crisp up: specifically, what is my hypothesis? What would I need to see? What data do I need? And how do I need to manipulate it? And that also gives you more of a surface area for others to engage with and upgrade your thinking.
STUART LEITCH: Because it's that old thing of lies, damn lies, and statistics. It's very easy to be sniffing around the data, and you see some outlier, and it's kind of, wow. And then you build a whole case in your head around that. It's really important to hold yourself intellectually honest, to make sure that you're bringing rigor to what the data is actually likely to be saying underneath.
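A minimal sketch of the written "clarification of intent" Stuart describes, expressed as code: state the hypothesis and evidence criterion before touching the data, then test exactly that. The dataset, column names, and 20% threshold are hypothetical:

```python
# Hedged sketch: pre-register the hypothesis, then check only that criterion,
# to avoid building a case around a chance outlier. Data is invented.
import pandas as pd

# --- Tech design / clarification of intent --------------------------------
# Hypothesis: sessions arriving from Google search view fewer pages per visit
#             than sessions arriving from any other referrer.
# Data needed: one row per session with 'referrer' and 'pages_viewed'.
# Evidence required: mean pages_viewed for Google sessions at least 20% lower.
# ---------------------------------------------------------------------------

sessions = pd.DataFrame({
    "referrer":     ["google", "google", "direct", "library", "google"],
    "pages_viewed": [1, 2, 5, 4, 1],
})

google = sessions.loc[sessions["referrer"] == "google", "pages_viewed"].mean()
other = sessions.loc[sessions["referrer"] != "google", "pages_viewed"].mean()

# Conclude only what the pre-registered criterion actually supports.
supported = google <= 0.8 * other
print(f"google={google:.2f}, other={other:.2f}, hypothesis supported: {supported}")
```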
WILL SCHWEITZER: Well, I'm adding data scientist, along with dog walker and spin instructor, to the things I wish my high school guidance counselor had told me to consider. So we've learned that unless you're Facebook or Google, really no publisher is an island unto itself, and that an effective digital strategy requires partnerships to make sure your authors and readers can find your content and services where they need to or where they expect to.
WILL SCHWEITZER: Which takes me to this question-- in your various organizations, how do you all approach partnerships?
ANN MICHAEL: Stuart hasn't started in a while.
JOHN SHAW: I like that one. My answer was really "very carefully."
STUART LEITCH: Well, besides partnerships being more Will's thing than mine, I would say, more generically, that when I'm looking at partnerships, I'm really thinking about who it is that I'm partnering with. My view is that most of what we do is in a very-- it's a very complex world, and it's a world that's constantly changing. Particularly around software, the whole stack is evolving. It's a constantly moving thing.
STUART LEITCH: And the devil's always in the details. And almost every project hits snags. You run up against the unexpected. And it then becomes a question of who you are actually working with. Are these people that are really going to be responsive? Are we going to be pointing fingers at each other, or are we going to be rolling up our sleeves, being pragmatic, and just working through it, and actually engaging with each other, trying to understand each other's intent, such that we can get better outcomes?
JOHN SHAW: So as I was looking at this question, I was writing down single adjectives, and thought, this sounds like a relationship. It involves a lot of the same things, right? But I do think it comes down to what it is the technology is doing: how big is the piece of technology that you're partnering for, and what does it do within your organization? I'm not going to think that Microsoft is going to be my partner.
JOHN SHAW: I'm going to need to use the software, but they're not going to partner with me. It's strategic to our organization, but there's nothing I can do to make them change their mind. So when I think down to things that we can actually kind of control a little bit more, with a good partnership, I do think about partners that are willing to understand what our strategy is, that are willing to talk about strategy.
JOHN SHAW: So an ongoing conversation, not just on that first date, and then it never comes up again. So you want someone that can understand what you're talking about, that's willing to partner, that talks about how you align, how they're looking to align, what you can do together. Communication is key. The minute communication breaks down, your partnership is gone.
JOHN SHAW: That's everything in life, of course. I'm not saying anything that you don't know. But I think that's really important. So how often do you have those communications? Is it an ongoing communication? Or is it once every six months, because that's in the contract? Gartner likes to call me every six months. That's their communication. I don't think that's a partnership.
JOHN SHAW: I think it's an obligation.
ANN MICHAEL: Checkbox.
JOHN SHAW: Yes, exactly. So I think that's another really important one. I think flexibility and fairness come into play on both sides. So being flexible, understanding each other. Forgiveness--everyone's going to make mistakes. There are going to be times when your partner does something wrong. They have a bad release, something doesn't work. So can you forgive one another? Or are you going to hold them so accountable that it turns into a bad relationship?
JOHN SHAW: So I think all those pieces are really important. Evolution is one: seeing that you're both evolving together. Culture: if your cultures are fundamentally different, then that's going to make it hard to have a partnership, if you can't have a conversation because you don't understand one another. I can go on and on. But those are some of the things that I thought about when thinking of a good partnership.
ANN MICHAEL: This is why I wanted to go after you, because I saw your list. So the one thing I think that you both did say, whether directly or indirectly, which I would echo, is really mission alignment too. And when I think of partnership, I'm not only thinking about a technology partner. I think in our position we should be partnering with funders. We should be partnering with other publishers, for different goals, whether they be business goals or something else.
ANN MICHAEL: And sometimes those things will drive technology needs. And then, of course, we'll deal with them in their proper order. But mission alignment is really, really important. We need to be heading in the same direction. One of the things that-- and again, Allison, this is me, this is not necessarily PLOS-- but one of the things that-- one of the reasons I came to PLOS, which really excites me, is that I was viewing the market, and I don't see PLOS as a competitor.
ANN MICHAEL: I was kind of surprised to see that in many cases, publishers view PLOS as a competitor. And I view us as an organization that, if we have the right partnerships and the right alignment, can help pave the way and prove things out that everybody can use. Because our goal is to promote open science, open research, and open access. Maybe APCs, maybe not, through different models, things of that nature.
ANN MICHAEL: And so I look at partnership in that fashion: who's out there, what can we do to help move everything forward, and who can help us.
WILL SCHWEITZER: So we've talked about what makes a successful partnership and alignment, but let's talk about the other side of partnerships, and risks, and downfalls. And when you're making a decision to partner or not, how do you evaluate that risk? What are the types of things you think of?
ANN MICHAEL: Well, just to continue what I was saying, I think that mission alignment--a lot of the stuff that John listed and Stuart listed too--you have to really understand that you have a similar philosophy, you're heading in the same direction. You have similar goals, or at least goals that align or are complementary, and everyone's bringing something to the table. Because sometimes what happens in partnerships is people feel that the contribution isn't equal, or adequate, or as expected.
ANN MICHAEL: And that can be a problem too. Flexibility, as John stated, I think is really, really important. It's a risk if something is too rigid, or someone or some organization is too rigid. Because as we all know, we're in the middle of a lot of different changes. So the idea that any of us are going to set a path for the next three or five years, and we're going to execute in exact alignment with that path is insane.
ANN MICHAEL: It's not going to happen. So that's where I would start.
JOHN SHAW: Yeah, I can go so many directions with this. I think the things that popped to the top of my head are that you do everything that Ann said, you make the best decisions that you can make. You do your due diligence. You look at the history of that potential partner. You try to gaze into your crystal ball and see what the future looks like. You look to see whether they could be acquired. You think about all these things.
JOHN SHAW: But we can't control everything. And so I can put a bazillion [INAUDIBLE] out there, and over my years, I've thought a solution was perfectly safe, and it gets acquired. It's perfectly safe, but the management changes. And the management vision and alignment is totally different. It's all good, and they run out of money. It just goes on, and on, and on. So I can't predict the future.
JOHN SHAW: All I can do is help the organization make the best decision that we can today, and understand what my risk strategy is--so what is the risk? I think, again, that goes back to whether it's an enterprise system or a commodity or a smaller thing. In some ways, I don't worry--this just sounds--don't quote me on this--I don't worry as much about, say, a journals platform.
JOHN SHAW: It's not the hardest thing to migrate. It's dang hard, but when you talk about peer review systems, oh, well, that's real hard. That is not something you want to go wrong. And thinking back to 15 years ago, when we made our decision, I thought to myself, this is the worst decision we're ever going to make in our life, because if something goes wrong, we have to figure out how to extract all of our titles and data out of it.
JOHN SHAW: And it could stop our business. And we could lose our revenue tomorrow. And so those are the types of things that you have to think about. But in that scenario, I didn't have a choice. We didn't have a choice. We still had to make the best decision that was in front of us that day, and move along with it. And try to change our strategy, evolve with the partner, and hope that we can continue on with that technology.
JOHN SHAW: If not, then just know how we're going to get out of it.
ANN MICHAEL: Thinking about your exit strategy upfront as you're forming a partnership is really important too.
JOHN SHAW: I think--so, just one other piece. I think with all of our technology, we have to meet regularly internally at our business, not only with our partners, but internally, to review: what is our strategy? Where do the technologies plot on the strategy? How are we feeling about our partners in the technology? Does it fit with what we're trying to do? Is the alignment there?
JOHN SHAW: Where are we headed, and might that change? So how is the technology going to change? What are we going to need for the partners to do? And make sure you're doing a 360 within your organization about how people in your organization are feeling about that technology partner. Because you, as a technology guy, say, oh, it's great. And then you go to the marketing department, and they say, they're horrible.
JOHN SHAW: So make sure that you understand what's going on in your organization. And then continuously plot out what options do we have and where can we go if we need to change.
STUART LEITCH: Yeah, so to build on this, I think there's a real theme here of actually bringing discipline to fleshing out the risk. You can never get the risk to zero. You're actually going to have to manage the risk. To me, the emphasis would be on how deliberately you are fleshing out that risk. I would approach it more like hiring an employee. You're trying to actually be really thoughtful about the questions you're asking.
STUART LEITCH: You're trying to actually ask around in the community about who they have worked with in the past. And really recognize that we always bring cognitive biases to these kinds of conversations. It's the contrast between, say, hiring an employee, where we have an HR department, and dating, where there's just so much projection, and we kind of want to see things in the other partner.
STUART LEITCH: You really need to check yourself with that, and think about, really, what are my criteria? What matters to me? And what are the questions that help me flesh that out? And what kind of circumstances can I create that might give me a bit of a test run with that? Where you might whiteboard some difficult circumstances, or talk through how they would handle a given situation.
STUART LEITCH: And then check their responses against people that have worked with them in the past.
WILL SCHWEITZER: So, John, you touched on this a bit. For a lot of us in this room, we have partnerships around really critical systems--things like your manuscript submission and peer review system, your content hosting platform, or even your production systems--unless you own those stem to stern within your house. And often--and having been a publisher myself, I can say I'm guilty of this--you wait until you're in crisis mode, the relationship is absolutely horrible, the technology is failing, to say, I really have to make a change here.
WILL SCHWEITZER: So do you think that discipline--keeping in touch with your business partners internally and your external partners--is the essential thing for not ending up in a similar position?
JOHN SHAW: Yeah, I think it goes back to the fact that you can't control risk 100%, at least. So I think if you have the discipline of evaluation--you're constantly evaluating on a regular basis internally and with your partner, trying to alleviate any surprise--you do as best you can within the framework that you can put out there. But stuff happens. And so all of us have had it happen to us.
JOHN SHAW: And when it does, again, you just have to have a plan for what to do. It's going to happen. It's just a matter of what and when. And then when it happens, what can you do about it? And hopefully, it's not one of those critical problems where it fails, it critically fails: Thomas Cook Airlines goes out of business, and you can't get home tomorrow.
JOHN SHAW: Can't control it, again, but don't worry, the government's coming. So in our scenario, you just hope that you don't have to be somewhere the next day, and you have enough of a runway to solve the problem. But that should go also into your risk assessments. If it's possible that something could-- in a solution that's so mission critical, that it could go away on one day's notice, then that gets to some of the other questions you've asked about.
JOHN SHAW: When do you take it internal versus when do you partner? And you have to think about those things.
STUART LEITCH: I think that really brings up the point that not all risks are equal in terms of how they manifest. And we think in terms of all the dependencies we have and the risks that could happen to them: which of these risks do we actually need to plan ahead for, versus which would we deal with the same way whether or not we had a plan? Because you can't actually focus on everything. You need to really concentrate on the things that are really critical.
STUART LEITCH: So it's just asking those questions--if something went wrong here, would we be in a really bad situation if we hadn't done forward planning around it?
ANN MICHAEL: And that's a classic risk register: what's the risk? What's its probability? And what's its impact? And some multiplication of those kind of helps you prioritize which risks you really need to have a plan for, and which are the ones where it probably isn't a good use of your time to be planning for them. We had a situation two weeks ago where a key partner, in one of their releases, released something that caused a lot of downstream problems, and costs for us, and deferred revenue, because of things that couldn't flow through the process.
ANN MICHAEL: And so we just escalated with them and got them on board. And once we got their attention, they were very good about turning that around. And I think you have to be careful that people in the organization don't see an instance like that and right away start to sour on that partner. Now, if that happens all the time, that's another thing.
ANN MICHAEL: But even "all the time," getting back to data, is a perceptual thing. So when I started to ask folks about the frequency of things like this, it went from "this happens constantly" to my being able to get three instances in the last 18 months. Now, that might be too high as well. And they weren't anywhere near the severity of the most recent one. But I think that's what you were saying about constant communication with your partner, and with the people interacting with that partner, to keep everybody level set on what the relationship is really bringing, benefits and liabilities wise.
ANN MICHAEL: Otherwise, people's perceptions can get out of line.
JOHN SHAW: Two other things. The risk register is really important. We have that. We review it weekly across all technology in the technology department. And depending on the teams, I think some of the other operational groups are doing that at Sage as well. Another thing I would say is imperative is to have your critical systems prioritized: where does each sit in the chain?
JOHN SHAW: And then, what do you do in those scenarios? So you understand where the risk is. If some platforms go down, fine, I can live two weeks; technically it will be OK. If this one goes down, we can't run. So you need to know what it is and have that plan in place, so it's business continuity, in a sense. And I think every business should be reviewing that, if you have technology.
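A minimal sketch of the risk-register arithmetic Ann and John describe--score each risk as probability times impact, then rank to decide what gets a plan. The entries, the 1-to-5 scales, and the planning threshold are invented for illustration:

```python
# Hedged sketch of a classic risk register: probability x impact gives a
# score, and the ranking shows where forward planning is worth the time.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: int  # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (annoyance) .. 5 (stops the business)

    @property
    def score(self) -> int:
        return self.probability * self.impact

# Hypothetical register entries.
register = [
    Risk("Peer review system vendor acquired", 2, 5),
    Risk("Hosting platform down for a day", 3, 2),
    Risk("Partner release breaks downstream flow", 4, 3),
]

# Plan ahead only for risks above the threshold; accept and monitor the rest.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    action = "needs a plan" if risk.score >= 10 else "monitor"
    print(f"{risk.score:>2}  {risk.name}: {action}")
```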
WILL SCHWEITZER: Let's shift gears again and talk about two of everybody's favorite topics, open access and open science. And both those things are creating a lot of pressure on our organizations, to move along things on the roadmap, or even to change their technology stacks. What do those two things mean for your organizations, Ann or John?
ANN MICHAEL: So I think that there's this mistaken belief out there that if you are a fully open access publisher, you're fine. Everything is wonderful, don't worry, thank god for Plan S. And I think that is not true. So, for example, we look at things like the APC model, and wonder if that's really the model that should be the primary model going forward. And in starting to think about other potential models, that has technology implications--and not just technology implications, it has process implications everywhere.
ANN MICHAEL: If you think about the idea of any kind of appeal to funders, or an appeal to consortia or libraries, that involves an infrastructure that we don't really have. So open access, and where we're going to go, and just promoting open research, it really does put pressure on us too, to not only be more efficient with what we do, which impacts process, tools, and people, but also to explore new things.
ANN MICHAEL: And that also impacts process, tools, and people.
JOHN SHAW: Yeah, my answer really mimics that. I mean, yes, open access and the changes it brings are causing us to look at the way that we do everything in our organization. We need to do that anyway. I mean, as a publisher of 50-plus years, with antiquated systems, we need to evolve in some way. This is helping us evolve faster. But the workflows and the problems are slightly different.
JOHN SHAW: I'd say we're having to change on every front across our organization. So it's not just this. It's every component, whether it's college publishing, the library market, and selling to libraries, if that's what we do in the future, open access, and the challenges there. So we're changing up systems, processes, people, skills, everything is changing.
JOHN SHAW: It's not a madhouse. It's a controlled chaos--don't quote me. No, it's all fine. But it's evolution. We're all evolving. And it's everywhere across our organization. It just is coming a little bit faster on this front. And we're having to rethink how we do everything for this particular aspect of our business.
JOHN SHAW: But I think that's really healthy. It's helping us become more efficient. It's helping us become more lean in how we do things. And ultimately, I think good things will come from it.
STUART LEITCH: No comment.
WILL SCHWEITZER: OK. So our last question before we turn it over to you all. And we'll discuss data later, actually, after the break, in our last session of the day. But we talked a lot about data-based decision making, getting your analytics tools and mindset in place, and getting your staff trained up, and I guess I'm left wondering, are we going to start seeing more CIOs coming to meetings like this, or meetings, say, like STM?
ANN MICHAEL: So first of all, I'm a CDO. And the reason why is because, thinking about digital, data was just such an integral part of that. And just to echo some of the things that John said, and Stuart said too, there's no choice anymore. The reality is data is foundational to everything. Now, that doesn't mean data pops out a perfect answer for you that you should follow blindly. There's the balance, and having the right staff and the SMEs, Subject Matter Experts, et cetera, to interpret that.
ANN MICHAEL: But, I would not-- I would be shocked if there weren't more people that are more focused on data at meetings like this. Because the platform generates a large body of the data that is consumed.
JOHN SHAW: That's a perfect answer. I don't know if it's the CIO. I don't think everyone probably has a CIO in their organization. But I think there will be more individuals that are focused on data at conferences, yes. I think we see that. There are a lot of conferences on data, period, whether it's in our community or outside. Data people are having to leave the organization and go out, yes.
JOHN SHAW: We will see them.
ANN MICHAEL: So, a plug for a conference. If you're in London the first week of December, after STM Innovations, there is a two-day conference. And it's all on data and AI. It started last year. And it's actually already starting to build a pretty decent following, just because of that. There's a lot of interest in how you actually build this capability and what you do with it.
JOHN SHAW: And to further plug, if you want to hear more about data, and the challenges, and how technology can help, Neil and myself will be at SSP next week, talking about this very subject as well.
ANN MICHAEL: Oh, I get to see you next week?
JOHN SHAW: Yeah, sorry.
ANN MICHAEL: I was excited.
STUART LEITCH: My add-on to this would just be that effectively solving problems with data is really an intersection: it's the skill set around actually working with that data, but it's marrying that up to business problems. And particularly if you're looking to innovate around data, you actually need to be able to think through what are the problems that we have that data can be used to solve, or what data can we get that allows us to monetize something, something that actually generates outsized value.
STUART LEITCH: And I think conferences are a great place to really bounce ideas off people and kind of cross pollinate in your thinking around that.
JOHN SHAW: So you just made me think of something. You know, another reason why they come, is that when you think about your classic data scientists or people in your data group, they don't actually understand what the business is trying to do. And so if they sit in the technology group, they need to know more about how publishing works. And the more they can understand what we're trying to-- what the problems are, all the problems we're trying to solve, the better they can apply their science to it.
JOHN SHAW: So I think we should start seeing a lot more technologists, or people that used to be on the back end, out trying to learn about the problems that we're trying to solve in the community as a whole.
ANN MICHAEL: I completely agree with that. And at PLOS, one of the key drivers regarding data is someone who, if you ask her, is definitely a business person. Now, she had to learn a lot about publishing. She came from another business. And the whole idea of the democratization is that we want to come up with a single source of truth, as far as the data itself, so that people are working with the same things and not coming up with different results because they have different data.
ANN MICHAEL: But really, it's all about business, only about driving business value. And again, that's kind of like our organization mission. The vision for the digital team at PLOS is about driving business and customer value, about having real questions that are going to have real impact.
WILL SCHWEITZER: All right, so we'll turn to you all. We welcome hard questions.
ANN MICHAEL: He welcomes them.
WILL SCHWEITZER: Yes. Hi, Jenny.
AUDIENCE: [INAUDIBLE] Is it really safe to let technologists out of the basement?
JOHN SHAW: Yes.
AUDIENCE: That's kind of a scary thought.
JOHN SHAW: No, it absolutely is. Well, I--
AUDIENCE: My real question is--
JOHN SHAW: Darn, I like that one.
ANN MICHAEL: Well, you can answer that one.
AUDIENCE: What is the relationship--how do people deal with the situation regarding open science and Michael Mann of Penn State? Do you know Michael? He's got a defamation--
STUART LEITCH: Let's repeat the question.
JOHN SHAW: I'm not aware of that either.
AUDIENCE: OK, well, Michael Mann is the scientist who developed the first [INAUDIBLE] on climate change, who will not release his data or his models in this defamation suit. Now how does that factor into initiatives of open science? Is that the cause of open science initiatives, or just a byproduct?
WILL SCHWEITZER: So for those of you who didn't hear the question, there were two. The first one, tongue in cheek, but I think it may deserve an answer, which is should we let technologists out of the basement? And the second part was around open science, the impetus and highlighting a defamation suit involving a climate science researcher. Let's start with the basement.
JOHN SHAW: There's a very simple answer on that. Technologists need to evolve. So if your technology team-- we're looking at hiring totally different people now. If they aren't customer facing, communicative people, that are good at asking questions and solving technology problems, so problem solvers, and curious, then they don't fit our organization anymore. We are going through a lot of transformation. The only way we stay alive is to do that.
JOHN SHAW: And so we're having to really adapt. And yeah, I think we should and will see a lot of them in the future. So that's why I thought actually it was a serious question.
ANN MICHAEL: High five on that one.
JOHN SHAW: Yeah.
ANN MICHAEL: That's good. Good answer.
WILL SCHWEITZER: And then any thoughts on this open science question.
ANN MICHAEL: I have no idea about that.
JOHN SHAW: I don't think I'm qualified.
ANN MICHAEL: I have no background on that.
WILL SCHWEITZER: OK. Questions in the back? We have a microphone now. So coming your way.
AUDIENCE: Thank you. It was very interesting. You all spoke a bit about the evolution of partnerships, and it made me think: you said you were willing to accept more short-term risk in partnering on some things where you have a longer view.
AUDIENCE: Are there any examples or characterizations of the areas where that short-term risk is more acceptable?
ANN MICHAEL: I would say where short-term risk isn't acceptable is anything that's totally business critical. So, I mean, manuscript tracking systems, as John said, things where you basically are just keeping things running. Where short-term risk is definitely acceptable is in the area of experimentation and advancement. So there may be places where things related to AI might be something we want to experiment with, to see if they can enhance user experience or create efficiencies in the operation.
ANN MICHAEL: So yeah, things that are more isolated or that can be isolated until they're integrated more deeply would be an area that I would think is an acceptable place for short-term risk.
JOHN SHAW: I mirror that exact answer. I think in anything that we're innovating on, we're willing to take a lot of risk. And there's a lot of playing around with technology. If it's a core, mission-critical system, like our subscription system, no, not so much.
ANN MICHAEL: We don't have one of those.
JOHN SHAW: Yeah, you can play around with it then. But, yeah, I think it just comes back down to what the system does within the organization. But I'd say even in those cases-- peer review is a good example. So we're experimenting. We built one. And so we experimented with how that would work for us and whether or not it could solve a problem that we had for some components of our journals.
JOHN SHAW: And that's an experiment. And it could be risky.
ANN MICHAEL: Well, I can tell you that it is.
JOHN SHAW: Yeah, she knows. Ours was done in a little different way. So again, we had a particular scenario where it made sense to try something, because in that particular situation, there was no other solution that would work well. Now, we're still playing with it. It's just one of those projects on the side, I'd say. So even there, we just have to understand what our risk tolerance is and what we're willing to accept and lose.
STUART LEITCH: Yeah, absolutely. I think you really have to think about the rate of change of the thing that you're partnering around. An example for us: we have a broad partnership with Microsoft, but we essentially took on the Power BI product very early in its lifecycle. And we were actually willing to take quite a few risks there. And we found that the releases were pretty buggy.
STUART LEITCH: There was a lot of inconsistency there. But we also could see the level of resources that Microsoft was putting behind the product. And we had this hypothesis that they would ultimately price it such that they'd be able to essentially dominate the market. And so we were willing to put up with short-term risk there. And at the end of the day, yeah, it's become one of the really dominant players.
STUART LEITCH: So that worked out fairly well for us.
JOHN SHAW: Excellent. That's a good example.
WILL SCHWEITZER: We actually have a question from the app, Stuart, that I think you can be the first to answer. Which is, can you talk about how we're thinking about costs for technology, especially the differences between cloud, data center, and on-premise costs?
STUART LEITCH: Yeah, so I think that's really interesting. We typically think of the cloud as being a lot cheaper. But the answer to that cost question is actually very contextual. If you look at what it costs to get, say, a really beefy server in the cloud, and you happen to be a hardware engineer with a rack in your basement, and you can just go and buy something off eBay, or even just buy it from HP or from Dell, you'll generally come out ahead with the thing in your basement, if you can supply all that labor yourself.
STUART LEITCH: And likewise, if you've got a data center set up and it's really steady state, you're not going to save cost going to the cloud. But what you do get in the cloud is a whole lot of scalability and flexibility. Particularly if you're developing an application stack from scratch and you can go cloud native, you're typically operating at a higher level of abstraction. Think about all the things you worry about when you're running your own data center, like running the operating system.
STUART LEITCH: In the cloud, the base level is infrastructure as a service, and more abstract than that is platform as a service, where you no longer have control of the operating system. You're working against more abstract APIs, and you've essentially transferred the security risk to a partner, like Microsoft in our case. And that's where you can get real cost benefits.
STUART LEITCH: So that's one dimension of that.
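To make the trade-off Stuart describes concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is an illustrative assumption, not a number cited by the panel.

```python
# A minimal sketch of the break-even math behind the cloud vs. on-premise question.
# All figures are illustrative assumptions, not vendor quotes.

def on_prem_monthly_cost(hardware_price, lifespan_months, admin_hours, hourly_rate):
    """Amortized hardware plus the labor you supply yourself."""
    return hardware_price / lifespan_months + admin_hours * hourly_rate

def cloud_monthly_cost(hourly_instance_rate, hours_per_month=730):
    """A metered, always-on instance; patching and scaling are the provider's job."""
    return hourly_instance_rate * hours_per_month

# A "beefy" server bought outright vs. a comparable always-on cloud instance.
on_prem = on_prem_monthly_cost(hardware_price=8_000, lifespan_months=36,
                               admin_hours=4, hourly_rate=60)
cloud = cloud_monthly_cost(hourly_instance_rate=1.50)

print(f"on-prem ~ ${on_prem:,.0f}/month, cloud ~ ${cloud:,.0f}/month")
# With these numbers the rack in the basement wins on raw cost;
# what the cloud buys you is elasticity, not a lower sticker price.
```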
ANN MICHAEL: I would just add to that, too. We're in the process of moving to AWS. And there are a lot of other services and things that you can then plug in much more seamlessly and much more cost-effectively than where we were, which was just some kind of a rack provider. It's a long story, but we're going to save a lot of money.
JOHN SHAW: I come back as, what's the problem you're trying to solve?
ANN MICHAEL: Right.
JOHN SHAW: What type of product or platform is it? How mature is it? What are you trying to do with the product or service? All of that will help you go in the right direction. It's not always about the cost savings. It's also about scalability, or processing power, or the competency of your own team. A lot of times, we find it's cheaper to just let the legacy servers that are already depreciated do their magic.
JOHN SHAW: We're not going to save by moving. In the future, that's probably not the best strategy in some of those scenarios, and in some it actually is, because of the amount of data that we have to process, how much it would cost in the cloud, and the time it would take to get it to the cloud. So you literally have to weigh what you're trying to solve, then map it over and see if it's going to provide a benefit.
STUART LEITCH: And there is a new set of skills that you need to build into the organization, around understanding that costs look very different in the cloud. You no longer have that kind of fixed depreciation. Everything's metered. And particularly as we're getting into massively parallel computing systems, where you basically just set the dial of parallelism, your costs can explode very, very quickly.
STUART LEITCH: We recently had an example where an engineer was trying to solve a problem, and in just a couple of hours, he burned through $1,000 worth of compute resources. In the grand scheme of things, that's not terrible. But he hadn't really internalized how costs can run away like that. So you need everybody who actually has the ability to spin up resources to really understand that.
STUART LEITCH: And you need really active monitoring. With our monitoring cycle, we caught this pretty quickly, but we didn't catch it within the window when it was actually happening.
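A minimal sketch of the kind of spend alarm Stuart's story argues for, assuming you can pull hourly cost samples from your provider's billing API. The sample values, baseline window, and spike factor are all hypothetical.

```python
# Flag runaway cloud spend by comparing each hour against a trailing baseline.
# Sample costs, window size, and spike factor are illustrative assumptions.

cost_samples = [12, 15, 14, 180, 420, 510]  # hypothetical dollars per hour

BASELINE_WINDOW = 3   # hours of history to average
SPIKE_FACTOR = 5      # alert if spend jumps 5x over the recent baseline

def runaway_spend(samples, window=BASELINE_WINDOW, factor=SPIKE_FACTOR):
    """Return the first hour whose cost exceeds factor x the trailing average."""
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if samples[i] > factor * baseline:
            return i, samples[i], baseline
    return None

hit = runaway_spend(cost_samples)
if hit:
    hour, cost, baseline = hit
    # In a real setup this would page someone, not just print.
    print(f"hour {hour}: ${cost}/h vs ${baseline:.0f}/h baseline -> alert")
```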
JOHN SHAW: So another important thing there is you need the competency internally.
ANN MICHAEL: Absolutely.
JOHN SHAW: I think you mentioned that. But you're moving from one competency of your infrastructure team to a different competency altogether. And you can't just hope that Microsoft is monitoring, metering, and taking care of everything for you. You have to have that specialty in house.
WILL SCHWEITZER: Is there another question in the room?
AUDIENCE: With the rise of analytics becoming more important, how have you approached designing for data utility, and data hygiene or data clean-up versus data collection? How important is that?
JOHN SHAW: That's a good question. I was thinking about that when we were talking earlier. And I think you mentioned that-- if it's your own systems, it comes back down to the architecture of the system. So as you're building what you want out of the system, you're looking at what it is the business needs from the system. So it starts from the get-go. When you're partnering, you're thinking about, are they going to be a good partner?
JOHN SHAW: Can we communicate well? Again, you better be looking at, do they have the type of data that we're looking for? Are they able to extract it out? Are they collecting it cleanly, in the way that we need? You need to be doing your due diligence there. So it's a constant process, up and down. It's not just hoping it comes out at the end, and then figure out how to clean it.
JOHN SHAW: It's kind of going all the way to the start, and saying, OK, what are we trying to accomplish with this data, and then kind of taking it through its lifecycle.
ANN MICHAEL: And I would even add to that, from a product development perspective, that as you develop new products, each product needs to have a data strategy, so that you start from the get-go collecting what you need.
STUART LEITCH: And to add onto that, sometimes you need to go further upstream as well. It's not just a technology problem, it's a process problem. We look at a lot of our internal operational data, in terms of how tickets flow through our system, how our work actually gets done. And we were noticing some anomalies: a higher-than-usual percentage of tickets being rapidly closed out, as if there wasn't actually work done on them.
STUART LEITCH: But we were seeing significant hours piled up against them. And so we started to talk to people. And it's like, oh, well, there's this complicated workflow to go through, and this wasn't that hard; I don't want to go through all those workflow steps, so I'm just going to close it out as if it's something else. And so that was something we had to clarify in the system, to differentiate that ambiguity so that we could report on it.
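The anomaly Stuart describes is straightforward to screen for once the operational data is in hand. Here is a minimal sketch, with hypothetical field names and thresholds, that flags tickets closed almost immediately yet carrying significant logged hours.

```python
# Flag tickets that were closed very quickly but still accumulated real hours,
# the pattern Stuart describes. Field names and thresholds are assumptions.

tickets = [
    {"id": 101, "minutes_open": 4,   "hours_logged": 6.5},
    {"id": 102, "minutes_open": 900, "hours_logged": 7.0},
    {"id": 103, "minutes_open": 2,   "hours_logged": 5.0},
]

FAST_CLOSE_MINUTES = 10  # "rapidly closed out"
REAL_WORK_HOURS = 1.0    # "significant hours piled up"

suspicious = [t for t in tickets
              if t["minutes_open"] <= FAST_CLOSE_MINUTES
              and t["hours_logged"] >= REAL_WORK_HOURS]

for t in suspicious:
    # Per Stuart, likely a workflow shortcut rather than bad data entry:
    # worth a conversation and a clarified workflow state.
    print(f"ticket {t['id']}: closed in {t['minutes_open']} min "
          f"but {t['hours_logged']} h logged")
```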
WILL SCHWEITZER: So we have another question from the app. What's the next big thing in artificial intelligence or machine learning in scholarly publishing?
JOHN SHAW: That's going to be answered in my session next week. Sorry.
WILL SCHWEITZER: This panel is really good at plugging other things. Do you want to give us a preview?
JOHN SHAW: Well, I'm actually only the moderator. So, Neil, do you want to come up here? I'm not sure I can actually foresee that one. I go to the Gartner Symposium about this time every year. And every year, they try to tell me what it is. And I don't think I've ever walked out of there with anything other than a big video from Microsoft and Google about how they're using their technology. And I don't have a good vision or answer for what I think the next thing is.
JOHN SHAW: I think we see it-- I've also been going to CES every year. I think we see it coming out through other mediums, through other industries. And then it kind of comes into ours. And so that's always fascinating to me, is looking at all the other industries, and seeing how they're using it, and how does it apply back into our industry.
JOHN SHAW: I still feel like we're five to 10 years behind.
ANN MICHAEL: I also think that's the difficulty in answering that question-- what's the next big thing? There's a lot of stuff going on, but it's hard to pick one thing and say, that's the next big thing. For example, we have a whole lot of data in manuscript tracking systems. I think there's a lot we could do to understand the experience of authors through interpreting the data in those systems.
ANN MICHAEL: There's something to be said for personalization. There are potentially a lot of applications for personalization, I think, through machine learning and AI. I don't know if I could say this is the next big thing, though.
JOHN SHAW: You can see trends in what we're trying to do. So if I think about the people that we selected for our session next week: with Meta, what are they doing, and how are they using technology to solve researchers' problems? They're trying to solve a bigger problem for the organization that they're part of now. Someone like UNSILO is working both in traditional machine learning and down in production, trying to predict things within a production process, or figure out whether something needs copy editing or proofreading before you do it.
JOHN SHAW: Meta is trying to figure out whether an article will ever be cited. I mean, that's just downright scary, right?
ANN MICHAEL: Right.
JOHN SHAW: Do you even want to publish it? Because maybe it's not going to be cited, and we know that already. So you can kind of see how technology can be used. I'm not sure it's the right way. And then others are using different tools, using artificial intelligence to pull together information that would otherwise have taken 50, or 100, or 200 people two months, and to predict what metadata should be applied to a piece of content.
ANN MICHAEL: And to add to that, though, this is all predicated on good data. And that's another thing where, quite frankly, I think we as an industry are lagging: we have industry standards, but they're not always followed, and not always followed consistently. I mean, for example, there is not an article-level indicator, consistently populated, that tells you whether an article is open access or not.
ANN MICHAEL: There isn't. There is something in JATS for it, but it's not often used, or not used consistently. So I think that while we have ideas and we want to go places, we have to have good data. You know, it all starts with good data hygiene.
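As an illustration of the JATS signal Ann refers to, here is a minimal sketch that looks for a license element marked open access. The sample XML is hypothetical, and the assumption that openness is flagged this way is exactly the point at issue: in practice the markup is often absent or inconsistent.

```python
# A minimal check for an open-access indicator in JATS-style article XML.
# The sample document and the reliance on license-type are illustrative.
import xml.etree.ElementTree as ET

jats = """<article>
  <front><article-meta><permissions>
    <license license-type="open-access">
      <license-p>Distributed under a CC BY license.</license-p>
    </license>
  </permissions></article-meta></front>
</article>"""

root = ET.fromstring(jats)
licenses = root.findall(".//permissions/license")

# In real corpora this attribute is frequently missing or filled
# inconsistently, which is the gap Ann is pointing at.
is_oa = any(l.get("license-type") == "open-access" for l in licenses)
print("open access?", is_oa)
```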
STUART LEITCH: My two cents would be that, more broadly, we might see a collision between artificial intelligence and the legal system, as we use artificial intelligence to create decision support systems, or to make actual decisions themselves, where we're doing pre-screening. And you might have, say, a manuscript being submitted by somebody from some kind of protected class, some minority.
STUART LEITCH: And your algorithms aren't really taking that into account very well, kind of like some of the hubbub about facial recognition, where not all ethnicities are treated equally. I think that gets to a point where people start to engage in lawsuits around this. They get discovery. They get access to your algorithm. They're testing it.
STUART LEITCH: They're finding out, yes, there is discrimination there. And it becomes kind of like handicap parking spaces, where you have a whole industry kind of trying to push the boundaries. And I think that's going to be a really tough one for us to work through. And I don't think we're very far along.
ANN MICHAEL: So that's a really interesting point. One of the last classes we had was on ethics and data. And the problem is-- again, it goes back to data. Our data historically has certain cultural and many other biases in it, because of how we act as a community. And AI doesn't magically erase those. It actually perpetuates them if you don't take precautions. You can prepare data in such a way as to offset imbalances that might have existed because of our practices.
ANN MICHAEL: For example, you can replicate certain groups within the data to have them appear more often. I mean, there are lots of things you can do. But I actually am worried that people will, in a way, turn away from AI, because they say, oh, this is just going to be an ethical problem, rather than trying to fix it. Because the same way that it could perpetuate what was done in the past, it could change what we do in the future.
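As one concrete version of the rebalancing Ann mentions, here is a minimal oversampling sketch. The data, group labels, and the naive duplicate-until-balanced strategy are all assumptions for illustration, with the usual caveats about duplication.

```python
# Replicate under-represented groups so a model sees them more often,
# the technique Ann alludes to. Records and labels are hypothetical.
import random

random.seed(0)
records = [{"group": "A"}] * 90 + [{"group": "B"}] * 10  # skewed history

def oversample(rows, key="group"):
    """Duplicate minority-group rows until every group is equally frequent."""
    by_group = {}
    for r in rows:
        by_group.setdefault(r[key], []).append(r)
    target = max(len(v) for v in by_group.values())
    balanced = []
    for group_rows in by_group.values():
        balanced.extend(group_rows)
        # Draw the shortfall at random from the group's own rows.
        balanced.extend(random.choices(group_rows, k=target - len(group_rows)))
    return balanced

balanced = oversample(records)
print({g: sum(r["group"] == g for r in balanced) for g in ("A", "B")})
# -> {'A': 90, 'B': 90}; naive duplication, to be used with care.
```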
WILL SCHWEITZER: Yes, so on that, we actually have to bring this discussion to an end. So please join me in thanking--
ANN MICHAEL: Thank you, Bill. [APPLAUSE]