Name:
Minding the Gaps: Bibliometric Challenges at the Margins of the Academy Recording
Description:
Minding the Gaps: Bibliometric Challenges at the Margins of the Academy Recording
Thumbnail URL:
https://cadmoremediastorage.blob.core.windows.net/ca26b81b-69c1-41fb-bfa4-e6fa63cdad20/videoscrubberimages/Scrubber_3.jpg
Duration:
T00H44M54S
Embed URL:
https://stream.cadmore.media/player/ca26b81b-69c1-41fb-bfa4-e6fa63cdad20
Content URL:
https://cadmoreoriginalmedia.blob.core.windows.net/ca26b81b-69c1-41fb-bfa4-e6fa63cdad20/Minding the Gaps Bibliometric Challenges at the Margins of t.mp4?sv=2019-02-02&sr=c&sig=mX5vbKJzYegliojkf5JMnxjBhEtyMQD%2FwO83xzPTJgw%3D&st=2024-12-26T11%3A37%3A05Z&se=2024-12-26T13%3A42%3A05Z&sp=r
Upload Date:
2024-03-06T00:00:00.0000000
Transcript:
Language: EN.
Segment:0 .
Welcome, everyone. The topic of our presentation is "Minding the Gaps: Bibliometric Challenges at the Margins of the Academy."
We have four speakers. Shenmeng Xu is the Librarian for Scholarly Communications at Vanderbilt University. Cliff Anderson is the Director of Digital Research at the Center of Theological Inquiry in Princeton, New Jersey. He is also the Chief Digital Strategist at Vanderbilt University Library. Wen-Chi is the Research Support Librarian at the National Taiwan University Library.
I'm Charlotte Lew, the Digital Scholarship and Special Projects Coordinator at Vanderbilt Divinity Library. I also serve as the moderator for our discussion session. I'm presenting with Shenmeng to address the question: how well do existing bibliometric tools capture and measure the scholarly and societal impacts of minor academic disciplines such as theology and religious studies? Cliff will share how open-source initiatives such as Wikidata, WikiCite, and OpenAlex could help mend the gaps between dominant and marginalized academic fields.
Wen-Chi will introduce a network analysis that can demonstrate the scholarly impact of minor academic disciplines within the larger academy. Shenmeng and I started the Divinity bibliometric project three months ago. We utilized the Divinity publication data and analyzed the publication patterns to illuminate how prevailing bibliometric tools have had dire consequences for scholars in the Divinity School.
First, let me introduce how the Divinity publication data has been collected. The Divinity Library worked with the Digital Scholarship and Communications team to embark on an institutional repository project in 2014 to promote open access for faculty publications. This faculty publications archiving project collects a bibliography for each faculty member and populates the resulting data in Zotero.
Thanks to the regularly updated publication data, the Divinity IR project laid the groundwork for this bibliometric project. In addition, I've been working with Steve Baskauf, the Data Science and Data Curation Specialist, to transfer the Zotero publication data to Wikidata in bulk. This Wikidata project allowed me to rectify and enrich the data.
This up-to-date, high-quality data is a game changer for the productivity of this bibliometric project. Currently, more than 3,000 publications have been captured in our database. It comprises works published from 1966 to 2022 by forty-one faculty members: nine retired, three who left Vanderbilt, and one who passed away.
At one point, we debated whether the non-current faculty should be included in this project. After considering their academic contributions at length, we agreed that featuring their publication data could offer a more comprehensive perspective on the biased practices of prevailing research information management that the Divinity faculty have been grappling with. Among the twenty-eight current faculty, sixteen are full professors, eight are associate professors, three are assistant professors, and one is a lecturer.
We generated a list of questions to conduct a diagnostic analysis. In the following section, I'll present the questions, and Shenmeng will share her research and interpretations of them. Publication data collected from faculty CVs laid the foundation of our project, but we also cross-examined the data against Web of Science, Scopus, and the Atla Religion Database to make sure the obtained data were as comprehensive as possible.
Although most of these publications were pulled from faculty CVs, some faculty lost the interest and momentum to keep their CVs updated, given the stage of their careers. I captured around two hundred publications not listed on faculty CVs when checking their works in the Atla Religion Database, Scopus, and Web of Science. Now I'm curious, Shenmeng: how well do these three citation indexes cover our faculty's works?
Thanks Charlotte, for the great introduction and question. These two figures show the comparison of the coverages. It is clear that these publications are underrepresented in the mainstream citation indexes. Atla, as the largest database in the fields of religion and theology, only covers slightly less than half of all publications we collected in Zotero. Web of Science covers about 17% of the publications, while Scopus only covers 10%.
Looking at each individual faculty, we have noticed that Atla has the highest coverage among these three databases for everyone except for three individuals. One of them switched her area of study to social sciences and joined another department on campus a few years ago, and Scopus has the highest coverage of her works. The other two have joint appointments with other humanities departments and Web of Science has the highest coverage for them.
There are three important takeaways. First, although Web of Science and Scopus are known to have low coverage of the humanities and social sciences in general, the coverage of religion and theology studies seems even lower. More nuanced explorations are warranted to examine coverage across different humanities fields and to advocate for their inclusion in the databases.
Second, even though discipline-specific databases like Atla have higher coverage than Web of Science and Scopus, we need to exercise caution when using them as a gold standard for faculty publications. Atla covers less than half of what's listed on our faculty CVs. In particular, discipline-specific databases are even more limited for those in cross-disciplinary and interdisciplinary fields.
Last, as many guidelines have pointed out, for instance, the Leiden Manifesto, in bibliometrics, it is critical to keep data collection and analytical processes open and transparent, and it is crucial to allow scholars to verify these data. Although scholars have different perceptions regarding what should be included in their CVs, at least in our sample here, CV is still the best data source compared to all the citation indexes.
At Vanderbilt, we are currently still in the initial launch phase of a research information management system, a RIM system. With a well-managed RIM system, metadata about research activities could be better recorded, connected, and utilized. Wen-Chi will talk about this later. Divinity faculty produce more publications such as books and book chapters compared to faculty in the STEM disciplines.
This fact takes us to the next question. Does the publication type impact how these publications are captured and measured? Great question. This is a very complicated story. From this chart, we can see that the biggest chunk of publications is journal articles. However, they only comprise slightly less than half of all publications in our database.
About one fourth of these are book sections. Book sections and books together occupy about 37% of the pie. Further, if we include encyclopedia articles and dictionary entries, these book-typed publications consist of 44% of all publications. So what's highlighted by the pink outline here is close to the number of journal publications -- the blue part.
When evaluating how well the publications are captured and measured, there are two big layers that we look at. The first layer is, are they included or indexed? And then the second layer is, are their references and citations indexed? In other words, how well do the indexes' citation networks cover them?
These two big layers have created barriers for books to have the same level of reach and impact as journal articles. Compared to journal articles, books are known to be less digital, less findable, less freely available, and less indexed in citation indexes. This is the first layer: books are underrepresented in indexing in the first place.
The second big layer is how well the citation indexes and networks cover the item. References and citations are not as easy to track in books and thus are significantly underrepresented in these databases. Even in OpenAlex, possibly the largest open citation index, only 2% of books have references. Ideas like crawling references and creating entries by linking to book sites like Open Library, or using Wikidata, which Cliff will talk about later, can help remedy this.
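As a rough illustration of how one might measure that gap from an open index, here is a sketch that computes the share of book records carrying any outgoing references. The record structure loosely mirrors OpenAlex's `referenced_works` field on works, but the sample data is entirely invented:

```python
def share_with_references(works):
    """Fraction of book-typed works that list at least one outgoing reference."""
    books = [w for w in works if w.get("type") == "book"]
    if not books:
        return 0.0
    with_refs = [w for w in books if w.get("referenced_works")]
    return len(with_refs) / len(books)

# Hypothetical sample: four books, one of which has tracked references.
sample = [
    {"type": "book", "referenced_works": []},
    {"type": "book", "referenced_works": ["W123"]},
    {"type": "journal-article", "referenced_works": ["W456"]},
    {"type": "book", "referenced_works": []},
    {"type": "book", "referenced_works": []},
]
print(share_with_references(sample))  # -> 0.25
```

Run over a real dump or API download, the same counting would reproduce the kind of 2% figure cited above.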
Breaking these down by person, we see the diversity in terms of publication types. If these publication profiles are ice creams, then there are a variety of different flavors and toppings. If we consider the above-mentioned book-typed publications plus theses and conference papers as conventional publications, and the other types of publications such as blog posts, web pages, podcasts and so on as unconventional ones, then we can see this growing trend of unconventional publications over time in our database.
Several scholars here have well-received publications that fall into the unconventional item type. They should be recognized and measured more comprehensively and fairly. Using the altmetric lens to look at these publication types, we see the same two layers. The first layer is the diverse range of publications. A lot of them are born digital and born open.
Around these items or artifacts, the second layer consists of much more than citations. There are various types of digital traces, including but not limited to views, downloads, saves, bookmarks, mentions and recommendations in blog articles, on Wikipedia, expert review or Q&A platforms or social media platforms, and reuses on Github, et cetera.
Of course, the second layer applies not only to unconventional publications but also to conventional ones. In working with our Divinity faculty, we have seen, for instance, highly liked homiletics video talks on YouTube, and a faculty member's journal article ranked among the yearly top 20 downloaded works in academic journals. These digital traces capture early reactions and a wider range of reactions and uses, and thus can reflect a wider range of impact compared to citations alone.
Besides the prevalence of book-typed publications and the emerging trend of unconventional publication types, are there other unique factors we should consider in bibliometrics in the religion and theology fields? I would like to talk about two additional issues. This is, again, related to one of the principles from the Leiden Manifesto: bibliometrics should account for variation by field in publication practices. The first unique publication practice I would like to discuss is the long publication window. To better understand it, let's first look at the publication career length of our faculty. As Charlotte mentioned earlier, junior scholars are underrepresented in our sample.
As this figure shows, only six out of 41 scholars have a publication career shorter than 20 years. The longest is 55 years, and the average is 29 years. If we look at the number of publications per year, on the right here, we can see that 25 out of 41 faculty members published two or fewer publications per year. This is very different from STEM fields and should be borne in mind.
Another unique factor is the dominance of single-authored works. Despite the increasing inflated co-authorship in other disciplines nowadays, 89% of our Divinity faculty publications are single authored. Bibliometrics and scientometrics studies have proposed and discussed different approaches to dealing with multi-authorship.
But in practice, the most common approach does not correct or normalize for multiple authors. Many measures at the individual and institutional levels that might affect hiring, promotion, and funding decisions use this approach; for instance, the h-index. In addition to the index's many other limitations, each author claims full credit for each paper and each ensuing citation, which leads STEM researchers to have significantly higher index values than those in religion, theology, and other humanities fields.
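To make the full-credit convention concrete, here is a minimal h-index computation in Python; every paper and citation counts in full for each author, and the two citation profiles below are invented for illustration:

```python
def h_index(citations):
    """Return the largest h such that the author has h papers
    with at least h citations each (full credit per author)."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# A STEM-style profile with many highly cited, co-authored papers
# versus a humanities-style profile of single-authored works:
print(h_index([50, 40, 33, 20, 15, 9, 8]))  # -> 7
print(h_index([10, 4, 3, 2, 1]))            # -> 3
```

Because co-authorship is not discounted, a scholar on many team papers accumulates a high index far faster than a single-author humanist, which is the distortion described above.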
Therefore, it is critical to keep this in mind when interpreting metrics and analytical results. Let's take a look at how many Divinity faculty have adopted ORCID iDs and assigned DOIs to their works. Shenmeng, can you explain how obtaining an ORCID iD and assigning DOIs to publications helps their works get indexed?
Thanks! First, let's look at the presence of Digital Object Identifiers. Breaking down by publication types, we can see that journal articles have the highest proportion of DOIs. A third of them have a DOI. Only a tiny proportion of book sections and books in our sample have DOIs. From the figure down below, we can see a growing trend across the years, but still the presence of DOIs in these works is low.
In terms of the person identifier, the ORCID iD, 12 out of 41 faculty members have one. This adoption rate is low, even considering that 14 are retired professors. Vanderbilt just obtained an ORCID membership this year. Although our ORCID adoption and the RIM system I mentioned earlier are still in their infancy, we will be working hard to make them work together to better promote our research and maximize its impact.
Unique identifiers such as DOIs and ORCID iDs can effectively create links between objects such as authors and scholarly works in information systems and are essential in citation indexes and bibliometric databases. The lack of unique identifiers makes bibliometrics extremely challenging at all levels, especially at large scale. With unique identifiers, stakeholders and objects can be distinguished and linked together to form networks, trace impact, and identify underlying patterns.
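One small, concrete property of these identifiers: an ORCID iD carries a final check character computed with the ISO 7064 MOD 11-2 algorithm, which lets systems catch transcription errors before linking records. A sketch of that validation (the sample iD below is ORCID's own published example):

```python
def orcid_check_char(base15: str) -> str:
    """Compute the ORCID check character (ISO 7064 MOD 11-2)
    from the first 15 digits of the identifier, hyphens removed."""
    total = 0
    for ch in base15:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate a hyphenated ORCID iD such as 0000-0002-1825-0097."""
    digits = orcid.replace("-", "")
    if len(digits) != 16 or not digits[:15].isdigit():
        return False
    return orcid_check_char(digits[:15]) == digits[15]

print(is_valid_orcid("0000-0002-1825-0097"))  # -> True
print(is_valid_orcid("0000-0002-1825-0098"))  # -> False
```

This kind of built-in error detection is part of what makes such identifiers reliable linking keys across citation databases.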
Cliff will talk more about the promising possibilities brought by Wikidata identifiers. Hi, my name is Clifford Anderson, and I'm the Director of Digital Research at the Center of Theological Inquiry in Princeton, New Jersey. The goal of this brief presentation is to ask how research organizations in the field of theology and religious studies can measure, evaluate, and improve their scholarly impact.
In September 2022, I took on a new role at the Center of Theological Inquiry in Princeton, New Jersey. CTI is a research organization that fosters interdisciplinary inquiries in theology and the sciences, according to its mission statement. We convene leading thinkers in an interdisciplinary research environment where theology makes an impact on global concerns, and we share those discoveries to inform the way people think and act.
In its research inquiries, CTI brings domain experts from the natural and social sciences to engage in dialogue with so-called science informed theologians. The goal of these engagements is not to turn theologians into amateur scientists, but to provide them with sufficient disciplinary understanding, technical vocabulary and other skills to identify areas of shared inquiry and research interest.
These inquiries take place in hybrid modalities, combining periods of virtual and residential collaboration. As the inaugural Director of Digital Research, my remit is to assist the Center with integrating digital technologies seamlessly into its operations. Among my hopes is to provide a renewed and improved framework for evaluating CTI's academic and public impact. Again, the question is how to measure the extent to which CTI's research inquiries have advanced the state of the conversation between theology and the sciences.
There are, of course, ready-to-hand measures like the h-index for evaluating the productivity of individual scholars and academic departments at institutions of higher learning. These metrics feed into rankings of colleges and universities at both the national and international scale. But as an independent research institution without a permanent faculty and with a small professional staff, CTI has no straightforward way to apply these metrics.
The same caveats apply to methods for evaluating the performance of so-called think tanks. In a white paper, "Measuring Think Tank Performance: An Index of Public Profile," Julia Clark and David Roodman take stock of existing frameworks for evaluating the effectiveness of think tanks: "We are aware of three main approaches that have been used to assess think tank performance: quantitative metrics, qualitative assessments, and expert rankings." These categories provide a useful starting point for our consideration as well, though with caveats in place.
Crucially, CTI is not a think tank. While CTI does publish white papers on its inquiries and a quarterly magazine, along with regular webinars and podcasts, its purpose as an organization differs from that of a classical think tank: its aim is not to influence public policy but to promote interdisciplinary work in theology and science, the majority of which comes to expression through the publications of its members. CTI has evaluated its research impact up to now using a mixture of quantitative and qualitative methods, along with expert opinion on the qualitative side.
CTI maintains relationships with members and values their feedback. CTI's members serve as informal ambassadors, sharing their experiences as participants in inquiries and encouraging emerging and established scholars to apply for new inquiries. These activities elevate expert opinion about CTI. On the quantitative side, several rough-and-ready measures include counts of downloads of our white papers, web traffic at our website, and discussion of sponsored research on social media.
CTI also monitors the performance of its webinars, podcasts, and public lectures. These measurements provide an index, in particular, of public interest in our research outputs. Given its small staff, measuring the performance of CTI's direct outputs cannot adequately gauge its overall effectiveness. As Clark and Roodman remark, the quantity of an organization's output may indicate something about the tank's relative capacity, but it says little about its impact.
While they offer a way of scaling these measurements, CTI's direct outputs do not capture its primary impact on the field. A primary means of capturing that impact is tracking scholarly publications. Given that participants in CTI's fellowship program generally come from the humanities, scholarly monographs and peer-reviewed articles form key research outputs.
Authors of these publications generally include acknowledgments of CTI's support, making it possible to identify them as research outputs. CTI maintains a collection of such print publications in its library. But beyond identifying and collecting these publications, what else should CTI do to assess its impact? A reasonable suggestion would be to investigate how other academic institutions carry out assessment and benchmarking of publication impact.
In many disciplines, the go-to tools are Elsevier's Scopus and SciVal, or Clarivate's Web of Science and InCites. The challenge of using these tools in theology and religious studies is twofold. On the one hand, these tools focus on peer-reviewed journals and a curated list of monographs; they also do not include many denominational publications, which continue to predominate in the field.
On the other hand, they are costly proprietary services. Given these limitations, these tools have limited usefulness for CTI. Atla, a membership organization of information professionals in religious studies and theology, maintains the largest bibliographic database in the field, the Atla Religion Database. Kevin McDonough, in a recent review, wrote: "In a review of SCImago's top 100 religious studies journals as ranked by their h-index score, Atla contained 87% of them, and the ones excluded were clearly peripheral to the focus of the database." Of course, there is circularity in this evaluation because SCImago's rankings derive from Scopus, which does not include all the relevant disciplinary journals.
Whether it includes the most significant is, to some extent, a matter of opinion. Still, given that the Atla Religion Database includes many more sources from religion and theology than Scopus or Web of Science, should CTI rely on rankings derived from Atla data to evaluate reach and impact? As a research organization aimed at fostering interdisciplinary research, such data would provide only half the picture, so to speak.
So what other alternatives exist? There are open-source bibliographic projects like OpenAlex and WikiCite that promise to overcome the limitations of proprietary tools. These projects are still at early stages of development, but they may remove the current cost and disciplinary barriers. Let's talk a little bit about Wikidata. The efforts of the Vanderbilt University Libraries to improve bibliographic data about theology and religious studies in Wikidata, as exemplified by Charlotte Lew and Shenmeng Xu's presentation, would pay dividends for other research institutions in the field.
Open-access sources of bibliographic data are skewed toward the natural and social sciences because data from those fields are generally easier to come by. Wikidata, as a source of bibliographic data, for instance, likely remains unbalanced and lumpy, so to speak. Of course, as Lew and Xu have discovered, many theological publications still lack items, that is, entries, in Wikidata. However, WikiCite has been expanding rapidly since its inception in 2014, and it now encompasses 42 million publications and 288 million citations. By treating monographs, edited volumes, and periodical publications on an equal footing, and by removing strictures about what counts as peer-reviewed literature, Wikidata proves more hospitable to scholars of theology and religious studies than many commercial alternatives.
If WikiCite continues to grow at its current pace, CTI and other independent research institutes may come to regard it as an acceptable, and perhaps preferred, alternative to commercial bibliographic databases. What does it look like in practice to add bibliographic data to Wikidata? Take, for example, my recently published Digital Humanities and Libraries and Archives in Religious Studies: An Introduction.
The publication exists as an item, Q111248757, in Wikidata. If you look at the history of this item, you will find that a Wikidata user named Lloyd created this entry on March 15, 2022, not long after its date of publication. Along with adding bibliographic details, this editor also linked the item to Q38099106, the item representing me on Wikidata.
In early January of this year, another user, KarlinMac, clarified the copyright status and license of the volume, noting that it is under copyright but carries a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license. But the work of bibliographic description is not yet complete. There are still many properties outstanding, including the language of the work, the place of publication, the number of pages in the volume, et cetera. If I wanted to, I could add these details myself.
Otherwise, I can wait for other Wikidata users to fill them in for me. Over time, information about items on Wikidata grows both more complete and more interconnected with other items, expanding into a richly layered bibliographic data source. Ideally, access to open bibliographic data would be coupled with open access to the content itself. The field of religious studies and theology has historically proved uneven in its approach to the open access movement.
In some cases, authors and institutions have embraced open access as a digital corollary of their religious mission. In other cases, advocates of open access policies have met resistance from authors who worry about the loss of trade publishing contracts. As funders in our discipline begin to adopt open access mandates, sharing and promoting scholarship emerging from CTI's research inquiries will become easier, and we should expect the impact of that scholarship, particularly among the public, to increase.
Looking forward, what ways might we have to assess how effectively CTI and its members have disseminated the fruit of research inquiries? Citation analysis provides a potential method of measuring the degree of interdisciplinary dialogue. At a simple level, CTI could count citations in members' publications that belong to different disciplinary domains. A useful indicator of impact might be to measure the growth in the relative percentage of theological versus non-theological citations in members' publications after participation in a research inquiry.
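The proposed indicator could be sketched as follows, assuming each cited work has already been labeled with a disciplinary domain; the domain labels and citation lists here are hypothetical:

```python
def non_theological_share(cited_domains):
    """Fraction of a publication's citations whose domain lies
    outside theology and religious studies."""
    if not cited_domains:
        return 0.0
    outside = sum(
        1 for d in cited_domains
        if d not in ("theology", "religious studies")
    )
    return outside / len(cited_domains)

# Invented before/after citation profiles for one member:
before = ["theology"] * 18 + ["biology"] * 2
after = ["theology"] * 12 + ["biology", "neuroscience"] * 4
print(non_theological_share(before))  # -> 0.1
print(non_theological_share(after))   # -> 0.4
```

Growth in this share after participation in an inquiry would suggest deepening interdisciplinary engagement, though the domain labeling itself is the hard part in practice.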
Studying patterns of co-authorship using network analysis software would also be valuable, insofar as participation in a research inquiry led to publishing collaborations with scientists. Of course, differences in co-authorship patterns may mask the level of involvement: in certain sciences, such as high-energy physics, articles may have hundreds or even thousands of co-authors, while in theology the norm is still single-author publications. If a scientist co-authors a paper in the field of religious studies, should that indicate deeper interdisciplinary engagement than if a theologian joins with dozens of co-authors on a scientific paper? Huang Wen-Chi's contribution to this panel demonstrates how network analysis complements traditional measures of scholarly reach. As we move forward into a new research cycle at CTI, my goal is to become more intentional about tracking and evaluating the impact of our program on participant scholarship. The current inquiry will follow a hybrid meeting schedule over the next two years, combining virtual and residential periods of research. At the beginning of the inquiry, participants will meet in Princeton in person for an intensive two-week seminar.
In part, we will use this period of residency to familiarize participants with the scholarly communications ecosystem, equipping them with standard identifiers like ORCID iDs and encouraging them to take ownership of their Google Scholar profiles, et cetera. My hope is that we will also begin to implement subtler measures to evaluate the success of our inquiry over the medium and long term.
If you have suggestions about other metrics CTI could use, I hope you will share them during the discussion. Thank you very much for your attention, and I look forward to our conversation. Good evening and good morning, ladies and gentlemen. I'm Wen-Chi Huang from the Research Support Division of the National Taiwan University Library. It's a privilege for me to introduce our research support service: customized domain network analysis.
Before diving into the customized domain analysis service at the NTU Library, I would like to give you a picture of the public A&I (abstracting and indexing) databases of scholarly literature and the current state of citation database development in Taiwan. One is the National Central Library's Taiwan Periodical Literature database, and the other is the Taiwan Citation Index - Humanities and Social Sciences, also known as TCI. For example, we can search journal articles through each database, and by clicking TCI's citation statistics button, we can easily explore the cited-by information for a journal article.
Furthermore, in TCI we can get journal-level bibliometric indicators such as times cited, five-year impact factor, and immediacy index. Now let's take a look at our university. To permanently archive and promote researcher profiles and scholarly works, the NTU Library integrates the services of the NTU Repository with the Academic Hub to form NTU Scholars. We hope to increase the visibility of NTU's research works through this aggregated platform.
Here is an example of a specific journal article page on NTU Scholars. Apart from the citation metrics extracted from Scopus and the Web of Science Core Collection, we provide NTU Scholars page-view and download metrics to demonstrate impact indicators for this NTU research output. You may also conduct an extensive search in Google Scholar and altmetrics platforms by clicking the icons to obtain additional information on the scholarly impact of each NTU research work.
After this short intro, I'll move on to the main topic of my talk. In 2018, my library started a new Research Support Division to explore more possibilities in bibliographic and bibliometric data. In 2019, we launched the domain network analysis service, also known as the DNA service. There are three objectives for this service. The first is to support academic policy-making, such as resource allocation.
The second objective is to support faculty research by helping them explore research fronts. The last, but not least, objective is to raise the visibility of our faculty, our institution, and, of course, our library. We provide DNA reports at the personal, research-unit, and discipline levels. The DNA reports usually include information on research impact, research fronts, and benchmarking analysis, and are mainly used for project applications, tenure and promotion, academic evaluation, or discovery in the research field.
There are two main approaches by which we conduct DNA analysis. One is bibliometric analysis, which is used for statistics on research impact. As you can see, this is a yearly distribution: the bars stand for scholarly output, and the dots stand for the relative citation impact of each year. We also use it for benchmarking analysis to highlight academic performance.
In this case, we compare the Department of Chemistry with two other entities, NTU and the world, on three different bibliometric indicators. We also analyze the topic distribution of a set of research documents. The other main approach is social network analysis, which is the process of investigating social structures through the use of networks and graph theory. It characterizes network structures in terms of nodes and the ties (edges or links) that connect them.
We apply social network analysis methods to analyze the connections of a set of research documents, as you can see in the blue circle. We extract the bibliographic data and track the connections of the documents. For our DNA reports, the most frequently used SNA dimensions of the bibliographic data are the co-word network, in which co-occurrence of author keywords forms the connections; the bibliographic coupling network, in which the more references two documents share, the closer the connection between them; and the co-authorship network, in which we observe the scholarly collaboration of authors, institutions, and countries. Now let's take a look at some of our service examples and find out what network analysis can do.
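The bibliographic coupling idea just described can be sketched in a few lines of Python: each pair of documents is linked by an edge weighted by the number of references they share. The document IDs and reference lists below are invented for illustration:

```python
from itertools import combinations

def coupling_network(refs_by_doc):
    """Build bibliographic-coupling edges: each document pair is
    linked with weight = number of references they share."""
    edges = {}
    for a, b in combinations(sorted(refs_by_doc), 2):
        shared = len(set(refs_by_doc[a]) & set(refs_by_doc[b]))
        if shared:
            edges[(a, b)] = shared
    return edges

docs = {
    "doc1": ["r1", "r2", "r3"],
    "doc2": ["r2", "r3", "r4"],
    "doc3": ["r5"],
}
print(coupling_network(docs))  # -> {('doc1', 'doc2'): 2}
```

Tools like VOSviewer compute essentially this weighting at scale before laying out and clustering the resulting network.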
This is an example of a co-authorship network. We proposed potential collaboration opportunities for the faculty member marked in yellow here by highlighting the possible interpersonal path to expand his research field from the blue to the red cluster. Here is a full DNA report sample at the personal level for a faculty member from the College of Bio-resources and Agriculture.
You can scan the QR code to download the full PDF file; this report is written in Chinese. Here is one of the most important parts of this report, the co-word analysis, which helps faculty discover the domains that interest them and observe the citation impact of each keyword cluster. Here, I'd like to share a very interesting example. The Department of Chemistry wanted to demonstrate their research impact on their official website to attract new students and promote international collaboration.
In this case, we used a co-authorship network of institutions and many visualized diagrams of bibliometric indicators to fulfill the department's needs. This was an unexpected request for us at the time, and it turned out to be a surprisingly successful case for the DNA service. And this is another interesting example.
The department chair asked if our network analysis could help with curriculum design. Accordingly, we used the course reference books on the Open Syllabus website and extracted the subject headings of each book from the OCLC WorldCat database. We then analyzed the co-occurrence of the books' subject headings and formed this network.
By mapping the topic clusters to the department's curriculum, we believe this can help them explore new curriculum topics. There is also a great deal of local research in the humanities and social sciences, so we use TCI data to analyze the co-word networks of local scholarly literature. Here are two examples. The one on the left-hand side is about the topic of the "sport city".
The one on the right-hand side is about the scholarly output of a department. Finally, I'd like to share an example where we used co-word analysis to help a faculty member with the literature review section of his published review paper in the field of analytical chemistry. This co-word network visualizes recent publications related to a specific emerging method called "ambient ionization mass spectrometry".
We are happy to see the library's contribution mentioned in the acknowledgments. To sum up, there are several features that make our DNA service a little different from the bibliometric services of other libraries. The first is going beyond metrics: in addition to academic evaluation, we also help to explore and discover new topics and new research developments.
We also hope to extend this analysis widely to faculty in the humanities and social sciences. The second feature: no template, all customized. We customize the service for each applicant to highlight their academic impact. The third: effective use of visualization tools. Apart from VOSviewer, we are trying out many visualization tools.
For example, Excel, Tableau Public, and Flourish, most of which are easy and free to use. The last feature is embracing multiple applications: we believe our service attracts different viewpoints in our university community, which is why many unexpected applications have been inspired by it. We also believe our service is very helpful for strategic promotion. That is my presentation for today.
Thank you. Thank you, Wen-Chi, Cliff, and Shenmeng, for your presentations. This concludes the recording session. Soon we will move on to the discussion session. Hopefully, by the end of the discussion, we can identify at least one main idea to take forward. In addition, we would like to invite people who are interested in taking that idea forward to draft a proposal for the NISO topic committees to consider.
See you shortly.