edited by Birger Hjørland and Claudio Gnoli
Citation indexing and indexes

by Paula Carina de Araújo, Renata Cristina Gutierres Castanha and Birger Hjørland

1. The idea of a citation database

Scientific and scholarly authors normally cite other publications. They do so by providing bibliographical references to other → documents in the text and elaborating them in a special “list of references” (as in this encyclopedia article) or in footnotes. (In the bibliometric literature such references are also often termed “cited references”.) When reference is made to another document, that document receives a citation. As expressed by Narin (1976, 334; 337): "a citation is the acknowledgement one bibliographic unit receives from another whereas a reference is the acknowledgement one unit gives to another" [1]. While references are made within documents, citations are received by other documents [2]. References contain a set of standardized information about the cited document which allows its identification (as, for example, the references in the present article) [3]. A citation → index is a paper-based or electronic database that provides citation links between documents. It may also be termed a reference index, but this term is seldom used [4], and in the following we use the established term: citation index [5]. It has always been possible to trace the references a given document makes to earlier documents (so-called backward searching). A citation index, however, makes it possible to trace the citations (if any) that a given document receives from later documents (so-called forward searching) [6], depending on which documents have been indexed. Examples:
No doubt, citation indexes are very important tools that have revolutionized the way we can search for information. This article focuses on the function of citation indexes in assisting researchers to identify useful and relevant research. Citation indexes are, however, increasingly used to evaluate research and researchers, and this function may influence how they develop, and thus also their functionality for document searching.

2. The principles and design of citation indexes

In the words of Weinstock (1971, 16): “a citation index is a structured list of all the citations in a given collection of documents. Such lists are usually arranged so that the cited document is followed by the citing documents”. It is the scientist (or scholar) who creates the citations, not the citation indexes, as has been claimed [8]; the role of citation indexes is to make the citations findable. McVeigh (2017, 941) explains that “a true citation index has two aspects [or parts] — a defined source index and a standardized/unified cited reference index”. In Figure 1, on the left, two articles are shown. These articles are represented in the part of the citation index called the source index. For each article a long range of metadata is provided, including author names, title of article, title of journal, and the list of bibliographical references contained in the article. The source index is therefore a comprehensively described set of the indexed materials from which cited references will be compiled.
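To make this two-part structure concrete, the following minimal sketch (in Python, with hypothetical field names; it does not describe the internals of any actual product) shows how a source index and a cited reference index relate, and why forward searching can only find citations from documents that have themselves been indexed as sources.

```python
# Minimal sketch of the two parts of a citation index described above:
# a source index of indexed (citing) documents and a cited reference index
# derived from their reference lists. Hypothetical field names; this is an
# illustration, not the design of any actual product.

from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class SourceRecord:
    """One indexed document with its metadata and its list of references."""
    doc_id: str
    authors: list[str]
    title: str
    journal: str
    year: int
    references: list[str] = field(default_factory=list)  # ids of cited documents


class CitationIndex:
    def __init__(self) -> None:
        self.source_index: dict[str, SourceRecord] = {}
        # cited document id -> ids of the indexed documents citing it
        self.cited_reference_index: defaultdict[str, set[str]] = defaultdict(set)

    def add_source(self, record: SourceRecord) -> None:
        """Index a document and register each of its references as a citation."""
        self.source_index[record.doc_id] = record
        for cited_id in record.references:
            self.cited_reference_index[cited_id].add(record.doc_id)

    def backward_search(self, doc_id: str) -> list[str]:
        """References a given indexed document makes to earlier documents."""
        return self.source_index[doc_id].references

    def forward_search(self, doc_id: str) -> list[str]:
        """Citations a document receives, limited to indexed source documents."""
        return sorted(self.cited_reference_index.get(doc_id, set()))
```

Note that in such a structure forward searching only returns citing documents that have been added as sources, which is why the coverage of the source index determines which citations can be found at all.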
Figure 2 shows an example: Among Anders Ørom’s many publications two have been indexed by Web of Science, an article in Knowledge Organization and another in Journal of Documentation. The figure also shows how many times each article has been cited (as of July 8, 2019); the citing articles are displayed by clicking on the number. However, as shown below, not all citing articles have been captured (for the article in Knowledge Organization only the 31 citations which have been “unified” are included).
Figure 3 shows a corresponding example of forward searching from WoS: 5 references that matched the query “references citing Ørom (2003)”.
It is important to understand in what ways citations are not unified — and that the concepts unification and control in bibliographic databases are relative concepts. Contrary to typical library catalogs, for example, citation indexes do not provide standardized author names [12]. The Science Citation Index, for example, is based on a rather mechanical indexing of both the metadata in the source index and the references in the cited reference index, using the data as given by the source documents themselves. This means that if an author sometimes uses two initials and sometimes only one initial, his writings are not unified (and both author searching and cited reference searching may be difficult). This is especially a problem when authors have common names like A. Smith or when names are spelled in many ways in source documents, as, for example, the names of many Russian authors like Lev Vygotsky. (Compare the concept stray citations in Section 6.3.) McVeigh (2017, 943) emphasizes that a citation index is more than just a bibliographic resource with linked cited references. It is the structured, standardized data in the cited reference index, independently of the source index, that for her defines a citation index.

3. Classifications of existing citation databases

In this article citation indexes are presented in the following order:
Citation indexes may also be classified as follows (see the endnotes for the specific titles in each category):
The most important databases are placed in separately numbered sections. For each database some standardized information is given (such as date of launching) together with a presentation of relevant literature about that database. At the end of each description are links to the homepage of the database and to the list of journals or other sources covered (as bulleted lists).

4. The Science Citation Index and other ISI/Clarivate Analytics citation indexes

The American government stimulated the development of scientific research soon after World War II. Considering the fast-growing volume of scientific literature and its concern about the capacity of the systems for information exchange among scientists, the government sponsored many projects related to the improvement of methods for distributing and managing scientific information. Eugene Garfield was a member of the study team at the Johns Hopkins Welch Medical Library sponsored by the Armed Forces Medical Library. Because of that experience “I [Garfield] became interested in whether and how machines could be used to generate indexing terms that effectively described the contents of a document, without the need for the intellectual judgments of human indexers” [21] (Garfield 1979, 6). Garfield’s experience working on that project, his voluntary abstracting work for Chemical Abstracts, and the fact that he learned of an index to the case literature of the law that used citations (Shepard's Citations, see Section 7.1), led him to create the first modern citation index. He presented his idea of the citation index in Garfield (1955). Garfield's company, the Institute for Scientific Information (ISI), was founded in 1960 in Philadelphia, and in 1964 Garfield published the first Science Citation Index (SCI; see further on SCI in Section 4.1). ISI has shifted ownership and name many times and is today known as Clarivate Analytics [22]. Over the years this company has created a suite of citation indexes, presented below. Because of the many name shifts, they may be referred to by different names, such as ISI’s, Thomson Scientific’s or Clarivate Analytics’ citation indexes. Web of Science (WoS) [23] is a platform created in 1997 consisting of databases designed to support scientific and scholarly research. It contains several databases, which can be searched together (but not all of them are citation indexes). They can be grouped as follows (see endnotes for lists of all databases in each group):
The WoS platform can be considered a modernized version of the SCI. Its citation databases are further described below in the order of their launching.

4.1 Science Citation Index (SCI) / Science Citation Index Expanded (SCIE)

SCI was officially launched in 1964. SCIE is — as the name indicates — a larger version of the SCI [27]. After their launching, the SCI and most other citation indexes have expanded retroactively, so that the year of launching does not tell what years are searchable (at the time of writing, SCI contains references back to 1898). The SCI was founded on some ideas and practical considerations that contrasted it with the major subject bibliographies at the time:
Braun, Glänzel and Schubert (2000) examined how different disciplines, countries and publishers are represented in the SCI by comparing the subset of journals indexed there with the number of journals covered by Ulrich's International Periodicals Directory from 1998. They found that (p. 254), on average, SCI covered 9.83% of the journals in Ulrich's, but some fields were overrepresented (physics 27.4%, chemistry 26.3%, mathematics 25.0%, biology 23.9%, pharmacy and pharmacology 15.0%, medical sciences 14.8%, engineering 14.5% and earth sciences 12.8%). 17 other fields were found to be underrepresented compared to the 9.83% (in decreasing order): environmental studies 9.4%, computers 8.9%, metallurgy 8.6%, energy 6.4%, public health and safety 5.9%, sciences: comprehensive works 5.0%, petroleum and gas 5.0%, agriculture 5.0%, food and food industries 4.7%, forests and forestry 4.4%, psychology 4.0%, aeronautics and space flight 3.6%, technology: comprehensive works 3.1%, geography 2.4%, gardening and horticulture 1.4%, transportation 0.6% and finally building and construction 0.4%. See also:
4.2 Social Sciences Citation Index (SSCI)

SSCI was established in 1973 and, according to Web of Science (2018), indexes 3,300 journals, showing data from 1900 to the present with complete cited references. It is possible to search an entire century of information in one place, across 55 disciplines of the social sciences. While most physical science research papers are universal in their interest and published in international journals in English, much research from the social sciences tends to be of primary interest to readers from the authors’ country, and it is often published in a national language and in journals not processed for the SSCI (cf. Lewison and Roe 2013); these authors examined SSCI’s coverage of journals from different countries. They concluded that their results can only be regarded as rather approximate but that it is apparent that the shortfall in coverage is real and quite large, and biggest for Russia, Poland and Japan; somewhat smaller for Italy, Spain and Belgium; less again for the Scandinavian countries; and least for the Anglophone countries (Australia, Canada, the UK), as would be expected. Klein and Chiang (2004) found evidence of bias of an ideological nature in SSCI coverage of journals. See also:
4.3 Arts & Humanities Citation Index (A&HCI)

The A&HCI was established in 1978. It demonstrated some ways in which the humanities differ from the sciences and social sciences, for example in the use of many implicit citations, which need to be formalized by the indexing staff (Garfield 1980). It indexes today 1,815 journals, showing data from 1975 to the present with full cited references, including implicit citations (citations to works found in the body text of articles and not included in the bibliography, e.g., works of art). Garfield (1977b) suggested — just before A&HCI was started — how this index might benefit the humanities. It has long been felt that adequate coverage is more problematic in A&HCI compared to SCI and SSCI, and this was a major reason for the European Science Foundation to initiate the development of the European Reference Index for the Humanities (ERIH) [28]. Sivertsen and Larsen (2012) considered the lower degree of concentration in the literature of the social sciences and humanities (SSH) and concluded that the concentration is strong enough to make citation indexes feasible in these fields. See also:
4.4 Conference Proceedings Citation Index (CPCI)

The CPCI was established in 2008. It was preceded by some conference proceedings indexes from ISI, which were not citation indexes (Garfield 1970; 1977a; 1978; 1981). CPCI now indexes 197,792 proceedings within two main sub-indexes: Conference Proceedings Citation Index: Science (CPCI-S) and Conference Proceedings Citation Index: Social Science & Humanities (CPCI-SSH) (Web of Science 2018). The proceedings selection process is described by Testa (2012). (There is at present no specific homepage for this database at Clarivate Analytics.)

4.5 Book Citation Index (BKCI)

BKCI was established in 2011. According to Web of Science (2018) it currently indexes 94,066 books from 2005 to the present. Clarivate Analytics (2018) [29] wrote:

For the coverage of Book Citation Index, each book is evaluated on a case by case basis. The focus is on scholarly, research-oriented books for product. Once a book is selected, both the chapters and the book itself will be indexed. The index page is the guide for the book, so if available, the contents of the index page and all the references will be included. If the book is selected, the full book is indexed. There is no selective coverage.

This means that there is no selective indexing of only a few chapters of a selected book into the Book Citation Index — without indexing the whole book in BKCI. Leydesdorff and Felt (2012a; 2012b) are two versions of the same study of BKCI. They found that books contain many citing references but are relatively less cited, which may originate in the slower circulation of books compared with journal articles and in the fact that reading books is time consuming. The introduction of BKCI “has provided a seamless interface to WoS”. Torres-Salinas, Robinson-Garcia and López-Cózar (2012) analyzed different impact indicators relating to the scientific publishers included in the Book Citation Index for the social sciences and humanities fields during 2006-2011. They constructed 'Book Publishers Citation Reports' and presented a total of 19 rankings according to the different disciplines in humanities, arts, social sciences and law, with six indicators for scientific publishers. Gorraiz, Purnell and Glänzel (2013) wrote that BKCI was launched primarily to assist researchers in identifying relevant research that was previously invisible to them because of the lack of significant book content in the WoS. The authors found that BKCI is a first step towards creating a reliable citation index for monographs, but that this is a very challenging issue because of the special requirements of this document type. Among the problems mentioned is that books, in contrast to journal articles, seldom provide address information on authors. Therefore, the authors found that BKCI in its current version (at the time of writing their article) should not be used for bibliometric or evaluative purposes. Torres-Salinas et al. (2013) used the BKCI to conduct analyses that could not have been done without this new index. The authors constructed “heliocentric clockwise maps” for four areas (disciplines): arts & humanities, science, social sciences and engineering & technology. For each area, citation average values for the dominant publishers are calculated and displayed. It was found, for example, that the area of engineering & technology is greatly unbalanced because one publisher, Springer, dominates the area, accumulating approximately 62% of the total share, that is, 28,000 book chapters of the total of 40,000 belong to this publisher.
Other fields may also be unbalanced, but not to such an extent. Torres-Salinas, Robinson-Garcia, Campanario and López-Cózar (2014) provided descriptive information about BKCI and found: Humanities and social sciences comprise 30 per cent of the total share of this database. Most of the disciplines are covered by very few publishers, mainly from the UK and USA (75.05 per cent of the books); in fact, 33 publishers hold 90 per cent of the whole share. Regarding publisher impact, 80.5 per cent of the books and chapters remained uncited. Two serious errors were found in this database: the Book Citation Index does not retrieve all citations for books and chapters; and book citations do not include citations to their chapters. Zuccala et al. (2018) studied the metadata assigned to monographs in BKCI and found that many ISBNs are missing for editions of the same work, in particular "emblematic" (original/first) editions. The authors wrote: The purpose of including all ISBNs is to ensure that every physical manifestation of a monograph is recognized (e.g., print, paperback, hardcopy, e-print) and that each ISBN is indexed as part of the correct edition or expression. This, in turn, ensures that all monograph editions can clearly be identified as being part of the same intellectual contribution, or work. Thus, publication counts and citation counts would be more accurate in the BKCI, and new metric indicators could be calculated more effectively. See also:
4.6 Data Citation Index

The Data Citation Index was established in 2012 by Thomson Reuters as a point of access to quality research data from repositories across disciplines. As data citation practices increase, this citation index based on research data is available through the WoS from Clarivate Analytics. The Data Citation Index is a tool designed to be a source of data discovery for the sciences, social sciences and arts and humanities. The Data Citation Index evaluates and selects repositories considering their content, persistence, stability and searchability. Data is then organized into three document types: repository, data study and data set. In this index, descriptive records are created for data objects and linked to literature articles in the Web of Science. The Data Citation Index emerges at a time in which data sharing is becoming a hot issue. Many researchers find, however, that data sharing is time consuming and too little acknowledged by colleagues and funding bodies, and they are therefore not sure whether the practice of sharing data is worth the effort. Force and Robinson (2014) explain that the Data Citation Index aims to solve four key researcher problems: (1) data access and discovery; (2) data citation; (3) lack of willingness to deposit and cite data; and (4) lack of recognition and credit. Torres-Salinas, Martín-Martín and Fuentes-Gutiérrez (2014, 6) analyzed the coverage of the Data Citation Index considering disciplines, document types and repositories. Their study acknowledges that the Data Citation Index is heavily oriented towards the hard sciences. Furthermore, four repositories represent 75% of the database, even though there are a total of 29 repositories that contain at least 4000 records. The authors believe that the bias towards the hard sciences and the concentration on a few repositories is related to the data sharing practices that are relatively common in, for example, medicine, genetics, biochemistry and molecular biology. A study presented in 2014 demonstrated that data citation practices are uncommon within the scientific community, since 88% of the data analyzed had received no citation. The authors state that “data sharing practices are not common to all areas of scientific knowledge and only certain fields have developed an infrastructure that allows to use and share data” (Torres-Salinas, Martín-Martín and Fuentes-Gutiérrez 2014, 6). The pattern of citation also changes from one domain to another. “While in Science and Engineering & Technology citations are concentrated among datasets, in the Social Sciences and Arts & Humanities, citations are normally referred to data studies” (Torres-Salinas, Martín-Martín and Fuentes-Gutiérrez 2014, 6). The Data Citation Index is an initiative that “continues to build content and develop infrastructure in the interest of improving attribution for non-traditional research output and enabling data discoverability and access” (Force and Robinson 2014, 1048). It is a new tool that can help persuade researchers of the importance of sharing their data in order to be cited. Furthermore, by “encouraging data citation and facilitating connections between datasets and published literature, the resource elevates datasets to the status of citable and standardized research objects” (1048). See also:
4.7 Emerging Source Citation Index (ESCI)

ESCI was established in 2015. It is a database that, according to Clarivate Analytics (2017), indexes 7,280 emerging journals (journals that are not yet considered to fulfill the requirements of SCI, SSCI and AHCI) from 2005 to the present with complete cited references. “Journals in ESCI have passed an initial editorial evaluation and can continue to be considered for inclusion in products such as SCIE, SSCI, and AHCI, which have rigorous evaluation processes and selection criteria”. ESCI is also (in 2019) described as covering “new areas of research in evolving disciplines, as well as relevant interdisciplinary scholarly content across rapidly changing research fields”. ESCI journals do not receive an impact factor but are evaluated regularly, and those that qualify will be transferred to the WoS and hence will receive an impact factor. Testa (2009) wrote:

As the global distribution of Web of Science expands into virtually every region on earth, the importance of regional scholarship to our emerging regional user community also grows. Our approach to regional scholarship effectively extends the scope of the Thomson Reuters Journal Selection Process beyond the collection of the great international journal literature: it now moves into the realm of the regional journal literature. Its renewed purpose is to identify, evaluate, and select those scholarly journals that target a regional rather than an international audience. Bringing the best of these regional titles into the Web of Science will illuminate regional studies that would otherwise not have been visible to the broader international community of researchers.

ESCI thus seems to break with the original idea of SCI to include journals based on their impact factors. Perhaps it can be understood as a response to demands for broader coverage in relation to research evaluation — as well as to the increasing competition from other producers of citation indexes? See also:
5. Citation databases from other database producers
| ADVANTAGES of bibliographic references as SAP | DISADVANTAGES of bibliographic references as SAP |
|---|---|
| References represent a form of “literary warrant” and are thus empirically based in the scholarly literature. | The relation between citations and subject relatedness is indirect and somewhat unclear (related to the difference between the social and the intellectual organization of knowledge). |
| Citations are provided by researchers (highly qualified subject specialists). | Bibliometric maps do not provide a clear logical structure with mutually exclusive and collectively exhaustive classes. |
| The number of references reflects the indexing depth and specificity (the average of scientific papers is about 10 references per article). | Explicit semantic relations are not provided (e.g. genus–species relations and part–whole relations) (but future systems may distinguish between different kinds of citation links/motivations). |
| Citation indexing is a highly dynamic form of subject representation (each new document published and indexed updates the pattern). | Only derived indexing is provided: concepts not represented in the literary sample are not assigned. |
| References are distributed through papers, allowing the utilization of the paper structure in the contextual interpretation of citations. | There is a tendency to mix different theoretical structures due to the merging of literatures in the samples (rather than providing a system based on a pure theoretical basis). |
| Scientific papers form a kind of self-organization system. | Namedropping and other forms of imprecise citation may cause noise. |
| Citation based maps identify groups of researchers working in the same specialties. | |
Hjørland and Nielsen (2001, 276) concluded that “A given subject access point (e.g., descriptors, references) cannot be expected to have a fixed information value regardless of conventions in the knowledge domain and the writing culture”. The relative value of references as SAP therefore must be studied in relation to terminological problems and citation behaviors in different domains.
Hjørland and Nielsen (2001) also found (258) that “ordinary retrieval algorithms and citation practices seem simply to reflect different theories about subject relatedness” (i.e., ordinary retrieval algorithms tend to consider documents subject-related if they “are alike” by containing the same words or concepts, whereas citation practices tend to consider documents related if they are linked by citations). Hjørland (2013, 1321) took this difference a step further by suggesting: “The relations between papers in a certain tradition are used as criteria of subject relatedness rather than just classifying documents on the basis of shared properties.” These different views correspond to different views in biological classification, in which the cladistic, → genealogical approach is confronted by the numerical taxonomic approach. These approaches are again linked to different epistemologies: historicism versus empiricism. Thus (Hjørland and Nielsen 2001, 258) concluded:
our insight from citation indexes has profoundly changed not only the methods of IR but also the concept of subject relatedness itself and the basic aim of retrieving information.
In scholarly communication, referencing previous works is an indispensable part of a document that reports research. As Ziman (1968, 58) wrote: “a scientific paper does not stand alone; it is embedded in the ‘literature’ of the subject”. From the point of view of using the citations for information retrieval, it is important to consider the citation behavior or the citation culture of citing authors. Garfield (1965, 85) listed 15 reasons (citation motives) to cite other documents:
Garfield (1979, 244-6) further challenges three criticisms of citation analysis: negative citations, self-citations, and citations to methodology papers. For him, negative citations are as important as positive citations because they are part of the process of scientific communication. If a work is criticized so much that it becomes highly cited, it is a work that contains ideas that deserve the attention of other researchers; otherwise, it would be ignored by the scientific community. Many theories that are accepted today were initially criticized and, through these criticisms, were improved and became recognized. As for self-citation, Garfield uses a compelling argument: a researcher who aims to increase the number of citations he receives needs to publish in order to make his name appear. However, to generate a large number of publications, it must be assumed that the researcher has much to say; otherwise the quality of the works will be lower, and the author will only be able to publish in peripheral journals, which are not indexed in the citation indexes. For this reason, Garfield believes that this is one of the criticisms that appears more in theory than in reality. The third point, about the high citation counts of some methodological papers, is more difficult, but Garfield (1979, 245) says that such “a conclusion overlooks several important points. The most obvious one is the questionable validity of the judgment that methods are inherently less important than theories”.
Issues such as negative citations, self-citations and methodological citations are often discussed in relation to the use of citations in research evaluation. Here, however, our question is about bibliographical references as SAP. For example, to the degree that it can be documented that scientists tend to cite papers for methodological rather than theoretical reasons, such knowledge is directly useful: citation indexes may be better for identifying methodological papers than for identifying theoretical papers, and other systems must be made for retrieving the latter kind.
Bornmann and Daniel (2008, 48) wrote: “Two competing theories of citing behavior have been developed in past decades, both of them situated within broader social theories of science. One is often denoted as the normative theory of citing behavior and the other as the social constructivist view of citing behavior.” The normative theory, following Robert K. Merton (e.g., Merton 1988) basically states that scientists give credit to colleagues whose work they use by citing that work. From this point of view Small (2004, 71) termed “citations as the symbolic payment of intellectual debt” and furthermore wrote that citations represent “vehicles of peer recognition and constructed symbols for specific original achievements in science”.
The social constructivist view was described this way by Bornmann and Daniel (2008, 49) [53]:
The social constructivist view on citing behavior is grounded in the constructivist sociology of science (see, e.g. Collins, 2004; Knorr-Cetina, 1981; Latour and Woolgar, 1979). This view casts doubt on the assumptions of normative theory and questions the validity of evaluative citation analysis. Constructivists argue that the cognitive content of articles has little influence on how they are received. Scientific knowledge is socially constructed through the manipulation of political and financial resources and the use of rhetorical devices (Knorr-Cetina, 1991). For this reason, citations cannot be satisfactorily described unidimensionally through the intellectual content of the article itself. Scientists have complex citing motives that, depending on the intellectual and practical environment, are variously socially constructed (e.g. to defend their claims against attack, advance their interests, convince others, and gain a dominant position in their scientific community).
Nicolaisen (2007) presented a comprehensive discussion of theories of citation and included empirical tests of the social constructivist theory. His main conclusion (633) was:
This chapter has sought to make clear that, in order to explain such behavior, we must cease taking the individual’s knowledge structures as our starting point. Rather, we should focus our attention on knowledge domains, disciplines, or other collective knowledge structures. Attempts to explain citation behavior should thus refrain from psychologizing the act of citing and instead recognize it as embedded within the sociocultural conventions of collectives.
It seems obvious that theories of citing behavior are important when considering references as SAP. Hjørland (2002) introduced a view that can be understood as representing social epistemology as a third position between Merton and social constructivism: on the one hand, Merton is right that scientists cite what they consider the most valuable documents in relation to their argumentation. On the other hand, different scientific perspectives (traditions or paradigms) may differ with respect to what are considered the most valuable documents. In the literature about schizophrenia, for example, psychoanalytic journals tend to cite other sources than neuroscientific journals (cf. Hjørland 2002, 266). In conclusion: scholars’ citation behavior may at the deepest level be explained by their theoretical and epistemological commitments, and the study of citations should therefore give high priority to traditions and paradigms.
Orduña-Malea and Delgado-López-Cózar (2018) expressed some views which we find serve as a proper conclusion for this article. They mentioned the importance of tools such as telescopes and microscopes for the development of science, and they compared the importance of citation indexes for understanding the ecosystem of scientific information to that of such tools. Here we can specify that these tools function at two levels: (1) they are tools for scholars seeking knowledge and (2) they are tools for scholars studying science (including scientometricians and information scientists). The authors had the following additional views:
While we find it too early to grant Dimensions this position in the development of citation indexing, Orduña-Malea and Delgado-López-Cózar (2018) nonetheless provided a fine description of the importance of citation indexing. To this description may be added that among the most important contributions of citation indexing is the development of search engines like Google; it is worth considering that the major factor behind the success of Google is its utilization of links between documents and the number of in-links to documents. In this way, principles of bibliometrics and citation studies take a prominent role in front-end technologies today.
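The parallel can be made concrete with a minimal sketch (hypothetical document ids; this shows only the in-link-counting idea, not Google's actual ranking, which combines PageRank with many other signals): ranking web documents by in-links mirrors ranking papers by citations received.

```python
# Minimal sketch of the in-link-counting idea: ranking documents by the number
# of incoming links, analogous to ranking papers by citation counts.
# Hypothetical document ids; not Google's actual ranking algorithm.

from collections import Counter

# hypothetical link graph: each document id maps to the documents it links to
links = {
    "doc_a": ["doc_c", "doc_d"],
    "doc_b": ["doc_c"],
    "doc_c": ["doc_d"],
    "doc_d": [],
}

# count in-links (analogous to citations received) for every target document
in_link_counts = Counter(target for targets in links.values() for target in targets)

# rank documents by number of in-links, most-linked first
ranking = sorted(links, key=lambda doc: in_link_counts.get(doc, 0), reverse=True)
print(ranking)  # ['doc_c', 'doc_d', 'doc_a', 'doc_b']
```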
The authors wish to thank three anonymous reviewers for detailed and important feedback on a former version of this article. Ziyoung Park provided up-to-date information on KCI. Initially Birger Hjørland was the editor; however, he joined as co-author in the process, and thereafter Claudio Gnoli has served as the editor of this article.
1. Price (1970, 7) expressed this terminology in the following way: "if Paper R contains a bibliographic footnote using and describing Paper C, then R contains a reference to C, and C has a citation from R". This way “the number of references a paper has is measured by the number of items in its bibliography as endnotes and footnotes, etc., while the number of citations a paper has is found by looking it up in some sort of citation index and seeing how many other papers mention it” (Price 1970, 7).
Sugimoto and Larivière (2018, 67) also explain that the distinction between citations and references is conceptual. “First, while (almost) all research documents contain references, not all documents are cited. The second difference relates to time: references are always made to past literature and are static; that is, the reference list will never grow or change over time. Citations, on the other hand, come from documents written in the future. Therefore, citations are dynamic”.
2. Although this way of distinguishing “reference” and “citation” has gained a certain impact in the field of bibliometrics, it is not common practice. For example, the influential Chicago Manual of Style (17th edition, 2017, p. 743) uses the term source citation for what in this article are called "references", and Nicolaisen (2007, 609) wrote: “Unless stated otherwise, the term citation is used synonymously with the term bibliographic reference”.
3. References contain a set of standardized information. However, there are different standards and authors of scholarly papers have to apply the standard used in a specific journal or by a specific publisher. For example, this encyclopedia (IEKO) uses The Chicago Manual of Style, whereas many other publications in information science use Publication Manual of the American Psychological Association. Although such standards differ, there are certain basic elements they all cover, for example, author names, printing years and name of journal in which an article is published.
4. European Reference Index for the Humanities (ERIH) was a project for developing a citation index initiated by European Science Foundation from 2008. Websites: http://archives.esf.org/hosting-experts/scientific-review-groups/humanities-hum/erih-european-reference-index-for-the-humanities.html and https://dbh.nsd.uib.no/publiseringskanaler/erihplus/. The term reference index seems logically the best term because what is being indexed are the references in the documents indexed by the database.
5. The term citation index is, unfortunately, also ambiguous. For example, the MEDLINE database calls itself “the principal online bibliographic citation database of the NLM [National Library of Medicine]”, https://www.nlm.nih.gov/lstrc/jsel.html. However, it is not a citation database as understood in this article (and in general): MEDLINE just provides references, not citations, for the indexed documents. It is, by the way, strange that MEDLINE (in its PubMed version) has not added an index of cited references, as done, for example, by the PsycInfo database.
6. Backward searching is also termed searching antecedents, while forward searching is searching descendants of a given paper.
7. A short look at the cited references seems not to identify any critical articles in this case. If you are not happy with what you find, you may question (1) the coverage of the database (are certain kinds of research favored?) and (2) the epistemology of the field of intelligence research (are the IQ tests really testing biological issues as claimed, or are they testing how certain people respond due to their status and to socio-cultural issues?). Feminist epistemology and feminist philosophy of science – along with other epistemological positions – have made objections to this way of doing research; this can also be found in the citation indexes, although probably not in the set of references citing specific empirical investigations such as Nyborg (2005). Nyborg’s research has been very controversial and heavily discussed in, among other places, the media; see, for example, https://en.wikipedia.org/wiki/Helmuth_Nyborg. Empirical evidence that contradicts Nyborg’s claim about the superiority of male intelligence can be found, for example, in Flynn and Rossi-Case (2011). The reception of this reference may of course also be traced by citation indexes, e.g. in Google Scholar.
8. As already described, when an author cites a document, he or she also provides a reference to another work. Both citations and references are produced by the very same act, whether it is called "referencing" or "citing". By implication, we do not need two theories: one theory about references and another about citations, as claimed by Wouters (2016, 73-4):
Wouters (1999) concluded that a theory of referencing behavior should be seen as fundamentally distinct from a theory of evaluative bibliometrics. This was based on the statement that there is a fundamental distinction between reference and citation. By analyzing references and citations as different signs, they were essentially positioned as different objects. Their relation is one of descent: the citation emerges in an act of “semiosis” (the creation of a novel sign) from the reference. This has an important implication: it is no longer the scientist who creates the citation. Its source lies in the citation index and the producer of that index is the creator of the sign citation.
9. Remark that the Danish letter Ø and similar non-English letters are not used in Web of Science but are replaced by O. Remark also that in the beginning this citation index did not provide first names, only initials; although today both initials and first names are used (in two different fields: initials in the AU field and first names in the AF (Author Full Name) field), it is nonetheless necessary to use the AU field to retrieve older records and thus all documents authored by a given author such as Anders Ørom. This makes it very difficult to disambiguate common names like A Smith, although WoS has a specific “author search” facility that may help solve the problem.
10. In this case the use of the source index to search backwards is not as good as using the article itself: the original article contained 37 cited references, but in WoS only 31 are listed, and their bibliographical information is not as full or accurate as that in Ørom (2003, 142-3).
11. Atkins (1999) wrote: “The standardized capture and further unification of references has three main benefits: 1) [for the producer of the database]: it allows for speed in data capture since the key needed for matching is quite short; 2) it enables the presentation of more consistent references to customers, regardless of the number of variants presented in the source journals; and 3) it enables internal and external links in ISI products”.
12. In reference lists, the same reference may be referred to in many different ways and these are not standardized. The Russian psychologist Lev Semyonovich Vygotsky, for example, can be spelled in many ways, and it is a complicated task to make a search for documents citing him by collocating the different spellings. This is contrary to the library tradition, in which authority control is applied, indicating a standard name form and a unique name for each person (e.g. by adding birth date when needed). In the world of scholarly communication and citation indexes this problem may in the future be solved in a similar way by ORCID (Open Researcher and Contributor ID) and other services, which provide a persistent digital identifier that distinguishes a researcher from every other researcher and supports automated linkages between the researcher and their professional activities (see https://en.wikipedia.org/wiki/ORCID). There is also another standardization problem, however. When scholars cite a certain document (books in particular), they may cite different editions with different printing years and different translations. This often poses problems for citation searching, and scholars should prefer to cite original editions or standard editions. But many do not, and therefore the problem persists.
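A minimal sketch (in Python) of the kind of matching at stake: a crude, hypothetical unification key built from first-author surname, first initial, year and start page. It is not any vendor's actual matching rule (cf. the short matching key mentioned in note 11), but it shows why trivial variants can be collapsed while different transliterations of a name are not.

```python
# Sketch of a crude, hypothetical cited-reference unification key:
# first author's surname + first initial, year, start page. Trivial variants
# (diacritics, one vs. two initials) collapse to the same key, while genuinely
# different transliterations (Vygotsky/Vygotski/Vygotskij) do not.

import re
import unicodedata


def unification_key(author: str, year: int, start_page: int) -> str:
    """Build a hypothetical matching key: SURNAME INITIAL, year, Ppage."""
    # strip diacritics so that e.g. "Ørom" and "Orom" compare equal
    ascii_name = (
        unicodedata.normalize("NFKD", author.replace("Ø", "O").replace("ø", "o"))
        .encode("ascii", "ignore")
        .decode("ascii")
    )
    parts = re.split(r"[,\s.]+", ascii_name.strip())
    surname = parts[0].upper()
    initial = parts[1][0].upper() if len(parts) > 1 and parts[1] else ""
    return f"{surname} {initial}, {year}, P{start_page}"


# "Ørom, Anders" and "Orom, A." collapse to one key ...
print(unification_key("Ørom, Anders", 2003, 128) == unification_key("Orom, A.", 2003, 128))  # True
# ... while three transliterations of Vygotsky yield three different keys
print(len({unification_key(name, 1978, 79)
           for name in ("Vygotsky, L. S.", "Vygotski, L.", "Vygotskij, L. S.")}))  # 3
```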
13. Web of Science; Scopus Citation Index; Google Scholar; Cite Seer; Korean Journal Database; Scientific Electronic Library Online Citation Index (SciELO); Emerging Source Citation Index (ESCI); Crossref; Microsoft Academic; Dimensions.
14. Science Citation Index (SCI); BIOSIS Citation Index; Chinese Science Citation Database (CSCD); Russian Science Citation Index (RSCI).
15. Social Sciences Citation Index (SSCI); Chinese Social Sciences Citation Index (CSSCI).
16. Arts & Humanities Citation Index (A&HCI).
17. Shepard’s Citations (law); PsycInfo (psychology); CiteSeerX (computer and information science); SciFinder (Chemistry).
18. Book Citation Index (BKCI).
19. Conference Proceedings Citation Index (CPCI).
20. Data Citation Index.
21. The statement “without the need for the intellectual judgments of human indexers” is important. However, as discussed in Section 8, citation indexes introduce a new kind of subjectivity: the choice of references made by the authors.
22. Moed (2005, 11) wrote: “Eugene Garfield Associates was founded in 1954 and launched numerous editions of Current Contents by 1960. In that year, the company name was changed to the Institute for Scientific Information (ISI). In 1964, ISI launched the Science Citation Index (SCI), as a quarterly multidisciplinary index”. In 1992 ISI was bought by The Thomson Corporation and changed its name to Thomson ISI. In 2006 it changed its name to Thomson Scientific, and in 2008 Thomson and Reuters merged under the name Thomson Reuters. In 2016 it was bought by Onex and Baring Asia under the name Clarivate Analytics, as which it is still known, although on May 13, 2019, Clarivate merged with Churchill Capital.
23. Web of Science (WoS) has been a part of Web of Knowledge (WoK) and WoS and WoK have also been used as synonyms. Today, the term WoK is seldom used and seems to be replaced by WoS.
24. The WoS Core Collection is opposed to the Specialist Collection (BIOSIS Citation Index, BIOSIS Previews, Biological Abstracts, Zoological Record, Medline, CAB Global Health, Inspec and FSTA) and the Regional Collection (see Appendix 1). The contents of the Core Collection (depending on subscription agreements) include the following databases:
25. At present the following databases:
27. The “expanded” versions of the SCI (and formerly also of the SSCI) were created in order for subscribers to have a choice between a relatively cheap and a more expensive version. The expanded versions are supersets of the non-expanded versions. There are no other theoretical issues involved in this distinction.
28. ERIH, see: http://archives.esf.org/hosting-experts/scientific-review-groups/humanities-hum/erih-european-reference-index-for-the-humanities.html
29. Clarivate Analytics (2018). Book Citation Index: https://support.clarivate.com/ScientificandAcademicResearch/s/article/Web-of-Science-Core-Collection-Book-Citation-Index---Coverage-is-of-the-full-book-and-not-selective-chapters?
30. Digital Object Identifier (DOI) is a persistent identifier or handle used to identify objects uniquely, standardized by the International Organization for Standardization (ISO).
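As an illustration of how such an identifier is used in practice, a DOI becomes a resolvable link simply by prefixing it with the public doi.org resolver; the DOI shown below is the one given for Atkins (1999) in the reference list of this article.

```python
# Illustration: turning a DOI into a resolvable URL via the public doi.org
# resolver. The example DOI is taken from the Atkins (1999) reference above.

def doi_to_url(doi: str) -> str:
    return f"https://doi.org/{doi}"

print(doi_to_url("10.1045/september99-atkins"))
# -> https://doi.org/10.1045/september99-atkins
```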
31. I4OC should not be confused with OpenCitations (http://opencitations.net/), another initiative, established in 2010.
32. About Scopus: https://www.elsevier.com/solutions/scopus.
33. The concept of altmetrics was first used in a tweet from Jason Priem of the University of North Carolina at Chapel Hill in 2010. A manifesto was presented in the same year (Priem, Taraborelli and Groth 2010). Altmetrics are fast, using public APIs to gather data in days or weeks. They are open: not just the data, but the scripts and algorithms that collect and interpret it. Altmetrics look beyond counting and emphasize semantic content like usernames, timestamps, and tags. Altmetrics are not citations, nor are they webometrics; although these latter approaches are related to altmetrics, they are relatively slow, unstructured, and closed (Priem, Taraborelli and Groth 2010).
34. Journal-level indicators on GS are, however, quite limited.
35. It is not published which publishers are indexed by GS; however, for now, Elsevier publications are not indexed by GS – probably because Elsevier wants users to go to their own portal (ScienceDirect) which is not free.
36. Dimensions Plus adds patents, clinical trials, grants and policy documents and their connections. Additionally, it allows searching for new entities (organizations and financing agents). It also includes advanced analysis tools, such as the comparison between organizations or financing agents, the generation of advanced reports, as well as the possibility of integrating custom implementations.
37. Garfield’s law of concentration (Garfield 1979, 160) stated: “The core literature for all scientific disciplines involves a group of no more than 1,000 journals, and may involve as few as 500”. Garfield added (160): “Though larger collections certainly can be justified in many cases, the single function of providing reasonable cost-effective coverage of the literature most used by research scientists requires no more than 500 to 1,000 journals.”
38. The word “free” seems to be misplaced here, since, whether free or not, Google Scholar and Microsoft Academic maintain their position as the most comprehensive sources for publication and citation data.
39. Clarivate Analytics (undated): Web of Science Fact Book. https://clarivate.com/wp-content/uploads/2017/05/d6b7faae-3cc2-4186-8985-a6ecc8cce1ee_Crv_WoS_Upsell_Factbook_A4_FA_LR_edits.pdf and: “With the Web of Science platform, you can access an unrivalled breadth of world-class research literature linked to a rigorously selected core of journals and uniquely discover new information through meticulously captured metadata and citation connections […] Find out what makes Web of Science Core Collection the most accurate, objective, and complete resource available”, https://clarivate.com/products/web-of-science/.
40. MEDLINE’s journal selection criteria are discussed here: https://www.nlm.nih.gov/lstrc/jsel.html.
41. In libraries the idea of curated and quality-controlled collections is challenged by the principle of patron-driven acquisitions, and in Wikipedia and social media the concept of the wisdom of the crowd has challenged the idea of edited works.
42. A special question is whether prepublications (“in press” publications) are indexed and should be indexed. Some databases (e.g. WoS) do not index preprints but wait until a document is formally published before including it in the database; GS and Scopus, on the other hand, also index prepublications. In this way some statistical figures about GS may be inflated compared to WoS. Arguments can be given, however, in favor of the inclusion of preprints: it can be important to find documents fast, and therefore indexing of prepublications can be fruitful. A counterargument is that it is important to keep a well-defined notion of “a publication” and to avoid possible duplicate representations of the same document.
43. Harzing (2016): "Stray Citations" [blog post]: https://harzing.com/resources/publish-or-perish/tutorial/google-scholar/stray-citations.
44. Archived link to Dialog’s “blue sheets” http://web.archive.org/web/20000816215552/http://library.dialog.com/bluesheets/html/bln.html.
45. Google and GS use the following search keys (illustrative example queries are sketched after note 46):

- AND, OR, NOT
- + : searches stop words
- intitle: restricts the results to documents containing that word in the title (other words in the query will return documents that mention the word anywhere in the document)
- allintitle: restricts the results to those with all the query words in the title
- site: searches for the word in the site/domain name; limits searches to a special domain or site
- inurl: searches for the word in the URL
- allinurl: searches for all the words in the URL
- author: searches for the word in the author’s name
- filetype: limits the file type
- "" : searches the phrase
- * : in phrase searching, * is replaced by any single word
- ... : number range

46. Noruzi (2005, 175) mentions that you may also search by journal title. However, what can be done is to use the journal title as a filter: “Return articles published in”. Among the drawbacks of Google Scholar is that a Google Scholar search only includes the first 1,000 hits. GS does not provide metadata on the document type or the language of the documents it covers.
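To show how the keys listed in note 45 combine in practice, here are a few illustrative query strings (hypothetical examples composed from those operators, not taken from Noruzi 2005 or any other cited source).

```python
# Hypothetical example queries composed from the search keys listed in note 45;
# they illustrate the syntax only and are not drawn from any cited source.

example_queries = [
    'allintitle: citation indexing',                      # all query words must occur in the title
    '"citation index" author:garfield',                   # exact phrase plus author restriction
    'intitle:bibliometrics OR intitle:scientometrics',    # either word in the title
    '"institute for * information" site:clarivate.com',   # * stands for one word in a phrase
    'citation analysis filetype:pdf',                     # limit results to PDF files
]

for query in example_queries:
    print(query)
```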
47. Microsoft wrote: “Microsoft Academic understands the meaning of words, it doesn’t just match keywords to content. For example, when you type “Microsoft,” it knows you mean the institution, and shows you papers authored by researchers affiliated with Microsoft. Similarly, Microsoft Academic knows journal titles, conference names, and many research topics.” https://academic.microsoft.com/home.
However, such functions can only be achieved by a built-in → knowledge organization system (e.g., a thesaurus or an ontology) or another semantic technology. But many systems (also old systems like MEDLINE) have KOSs (or can be combined with KOSs) that allow for kinds of semantic searches; e.g., appending the EXPLODE operator (!) to a valid thesaurus term will result in the retrieval of narrower terms. For example, the statement SELECT DEMENTIA! in MEDLINE® will retrieve narrower (i.e., more specific) terms such as ALZHEIMER DISEASE, CREUTZFELDT-JAKOB SYNDROME, etc.
48. Exact phrase searching is not possible according to Harzing’s Publish or Perish User’s Manual (retrieved 2019-07-24); this is a mistake, however. See https://images.webofknowledge.com/images/help/WOS/hs_search_rules.html.
49. Combinations of advanced searches and cited reference searches were formerly possible for the ISI citation databases on the database host Dialog. It is a great disadvantage that they cannot be performed directly in WoS, but must be done in downloaded sets using other software tools.
50. For a broader introduction to SAP see Hjørland and Nielsen (2001).
51. What Garfield called a restatement may also be called an interpretation based on both the indexer’s subjectivity and the properties of the indexing language, see further in Hjørland (2018).
52. The terms semantic relevance (by terms) and pragmatic relevance (by citations) can be discussed, because a pragmatic view of semantics considers the meaning and application of terms to be determined by pragmatic principles.
53. See also the discussion by Zuckerman (1987).
Acharya, Anurag, Alex Verstak, Helder Suzuki, Sean Henderson, Mikhail Iakhiaev, Cliff Chiung Yu Lin and Namit Shetty. 2014. “Rise of the Rest: The Growing Impact of Non-Elite Journals”. http://arxiv.org/pdf/1410.2217v1.pdf.
Atkins, Helen. 1999. “The ISI® Web of Science—Links and Electronic Journals.” D-Lib Magazine 5, no. 9. DOI: 10.1045/september99-atkins.
Buckland, Francesca. 2021. “The Arabic Citation Index: Transforming Local Research into Global Impact”. Clarivate Analytics Blog. Accessed April 17, 2021. https://clarivate.com/webofsciencegroup/article/the-arabic-citation-index-transforming-local-research-into-global-impact/
Bornmann, Lutz, and Hans-Dieter Daniel. 2008. "What do Citation Counts Measure? A Review of Studies on Citing Behavior." Journal of Documentation 64, no. 1: 45-80. https://doi.org/10.1108/00220410810844150.
Braun, Tibor, Wolfgang Glänzel and András Schubert. 2000. "How Balanced is the Science Citation Index's Journal Coverage? A Preliminary Overview of Macrolevel Statistical Data". In The Web of Knowledge: A Festschrift in Honor of Eugene Garfield, edited by Blaise Cronin and Helen Barsky Atkins. Medford, New Jersey: Information Today, 251-77.
Chen, Kuang-hua. 2004. "The construction of the Taiwan Humanities Citation Index". Online Information Review 28, no. 6: 410-19.
Clarivate Analytics (undated). Web of Science Fact Book. https://clarivate.com/wp-content/uploads/2017/05/d6b7faae-3cc2-4186-8985-a6ecc8cce1ee_Crv_WoS_Upsell_Factbook_A4_FA_LR_edits.pdf
Clarivate Analytics. 2015. Russian Science Citation Index. http://wokinfo.com/products_tools/multidisciplinary/rsci/?utm_source=false&utm_medium=false&utm_campaign=false.
Clarivate Analytics. 2016. Korean Journal Database. http://wokinfo.com/products_tools/multidisciplinary/kci_kjd/?utm_source=false&utm_medium=false&utm_campaign=false.
Clarivate Analytics. 2017. Web of Science Core Collection – Emerging Sources Citation Index. http://wokinfo.com/media/pdf/ESCI_Fact_Sheet.pdf?utm_source=false&utm_medium=false&utm_campaign=false.
Clarivate Analytics. 2018a. Chinese Science Citation Database. http://wokinfo.com/products_tools/multidisciplinary/cscd/?utm_source=false&utm_medium=false&utm_campaign=false.
Clarivate Analytics. 2018b. "Web of Science Core Collection: Book Citation Index - Coverage is of the full book and not selective chapters". https://support.clarivate.com/ScientificandAcademicResearch/s/article/Web-of-Science-Core-Collection-Book-Citation-Index---Coverage-is-of-the-full-book-and-not-selective-chapters?.
Clarivate Analytics. 2021. "Introducing the Arabic Citation Index". Accessed April 17, 2021. https://clarivate.com/webofsciencegroup/wp-content/uploads/sites/2/dlm_uploads/2021/03/WS388953315_ARCI-Brochure_4pp_Web_Eng.pdf
Collins, Harry. 2004. Gravity’s Shadow. The Search for Gravitational Waves. Chicago, IL.: The University of Chicago Press.
Flynn, James R. and Lilia Rossi-Case. 2011. "Modern Women Match Men on Raven's Progressive Matrices". Personality and Individual Differences 50: 799–803. doi:10.1016/j.paid.2010.12.035.
Force, Megan M. and Nigel J. Robinson. 2014. “Encouraging Data Citation and Discovery with the Data Citation Index”. Journal of Computer-Aided Molecular Design 28, no. 10: 1043–48. https://doi.org/10.1007/s10822-014-9768-5.
Garfield, Eugene. 1955. “Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas.” Science 122, no. 3159: 108–11.
Garfield, Eugene. 1964. “Science Citation Index: A New Dimension in Indexing.” Science 144, no. 3619: 649–54. https://doi.org/10.1126/science.144.3619.649.
Garfield, Eugene. 1965. “Can Citation Indexing be Automated?” In Statistical Association Methods for Mechanized Documentation. Symposium Proceedings, edited by Mary Elizabeth Stevens, Vincent E. Giuliano, and Laurence B. Heilprin. Washington: National Bureau of Standards, 189-92. Available at: https://nvlpubs.nist.gov/nistpubs/Legacy/MP/nbsmiscellaneouspub269.pdf. Reprinted in Essays of an Information Scientist 1, no. 1962-1973: 84-90.
Garfield, Eugene. 1970. “A New ISI Program for Dissemination and Retrieval of Conference Papers”. Current Contents/Life Sciences 13, no. 10: 4-5. Retrieved in Reprint from: http://www.garfield.library.upenn.edu/essays/V1p091y1962-73.pdf.
Garfield, Eugene. 1971. “The Mystery of the Transposed Journal Lists. Wherein Bradford's Law of Scattering is Generalized According to Garfield's Law of Concentration”. Current Contents, no. 17: 222-3. Reprinted in his Essays of an Information Scientist. Volume 1. Philadelphia, Pa.: ISI Press, 1977. Available at: http://www.garfield.library.upenn.edu/essays/V1p222y1962-73.pdf.
Garfield, Eugene. 1977a. “ISI's New Index to Scientific and Technical Proceedings lets You Know What Went on at a Conference Even if You Stayed at Home.” Current Contents, no. 40: 5-10. Retrieved in Reprint from: http://garfield.library.upenn.edu/essays/v3p247y1977-78.pdf.
Garfield, Eugene. 1977b. “Will ISI's Arts & Humanities Citation Index Revolutionize Scholarship?” Current Contents, no. 2: 5-9. Reprinted in Essays of an Information Scientist 3: 204-8, 1977-78. http://www.garfield.library.upenn.edu/essays/v3p204y1977-78.pdf.
Garfield, Eugene. 1978. “Introducing Index to Social Sciences and Humanities Proceedings: More Help in Locating and Acquiring Proceedings”. Current Contents, no. 33: 5-9. Reprint retrieved from: http://www.garfield.library.upenn.edu/essays/v3p573y1977-78.pdf.
Garfield, Eugene. 1979. Citation Indexing: Its Theory and Application in Science, Technology, and Humanities. New York, NY: John Wiley & Sons.
Garfield, Eugene. 1980. “Is Information Retrieval in the Arts and Humanities Inherently Different from that in Science? The Effect That ISI®'s Citation Index for the Arts and Humanities Is Expected to Have on Future Scholarship”. Library Quarterly 50, no. 1: 40-57. Available at: http://www.garfield.library.upenn.edu/essays/v6p623y1983.pdf.
Garfield, Eugene. 1981. “Introducing ISI-ISTP and B (Index to Scientific and Technical Proceedings and Books): Online Access to the Conference Literature and Multi-Authored Books”. Current Contents, no. 34: 5-9. http://garfield.library.upenn.edu/essays/v5p213y1981-82.pdf.
Gehanno, Jean-François, Laetitia Rollin and Stefan Darmoni. 2013. “Is the Coverage of Google Scholar Enough to be Used Alone for Systematic Reviews?”. BMC Medical Informatics and Decision Making 13, no. 7. Retrieved from http://www.biomedcentral.com/1472-6947/13/7.
Giri, Rabishankar and Anup Kumar Das. 2011. “Indian Citation Index: A New Web Platform for Measuring Performance of Indian Research Periodicals”. Library Hi Tech News 28, no. 3: 33-35. https://doi.org/10.1108/07419051111145154.
Gorraiz, Juan, Philip J. Purnell and Wolfgang Glänzel. 2013. “Opportunities for and Limitations of the Book Citation Index”. Journal of the American Society for Information Science and Technology 64, no. 7: 1388-98. https://doi.org/10.1002/asi.22875.
Gray, Jerry E., Michelle C. Hamilton, Alexandra Hause, Margaret M. Janz, Justin P. Peters, and Fiona Taggart. 2012. “Scholarish: Google Scholar and Its Value to the Sciences.” Issues in Science and Technology Librarianship 70. https://doi.org/10.5062/F4MK69T9.
Gusenbauer, Michael. 2019. “Google Scholar to Overshadow Them All? Comparing the Sizes of 12 Academic Search Engines and Bibliographic Databases”. Scientometrics 118, no. 1: 177–214. https://doi.org/10.1007/s11192-018-2958-5.
Halevi, Gali, Henk Moed, and Judit Bar-Ilan. 2017. “Suitability of Google Scholar as a Source of Scientific Information and as a Source of Data for Scientific Evaluation - Review of the Literature”. Journal of Informetrics 11, no. 3: 823-4. doi:10.1016/j.joi.2017.06.005.
Harter, Stephen P., Thomas E. Nisonger and Aiwei Weng. 1993. “Semantic Relationships Between Cited and Citing Articles in Library and Information Science Journals.” Journal of the American Society for Information Science 44, no. 9: 543-52.
Harzing, Anne-Wil. 2007. Publish or Perish. https://harzing.com/resources/publish-or-perish. [Compare Harzing 2010 and 2019]. Accessed July 18, 2019.
Harzing, Anne-Wil. 2010. The Publish or Perish Book: Your Guide to Effective and Responsible Citation Analysis. Melbourne, Australia: Tarma Software Research (see also: https://harzing.com/resources/publish-or-perish).
Harzing, Anne-Wil. 2011. The Publish or Perish Book, Part 2: Citation Analysis for Academics and Administrators. Melbourne, Australia: Tarma Software Research.
Harzing, Anne-Wil. 2013. “A Longitudinal Study of Google Scholar Coverage Between 2012 and 2013”. Scientometrics 98, no. 1: 565–75. http://dx.doi.org/10.1007/s11192-013-0975-y.
Harzing, Anne-Wil. 2016. “Microsoft Academic (Search): A Phoenix Arisen from the Ashes?” Scientometrics 108, no. 3: 1637-47.
Harzing, Anne-Wil. 2019. “Two New Kids on the Block: How do Crossref and Dimensions Compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?” Scientometrics 120, no. 1: 341-9. https://doi.org/10.1007/s11192-019-03114-y.
Harzing, Anne-Wil, and Satu Alakangas. 2016. "Google Scholar, Scopus and the Web of Science: A Longitudinal and Cross-Disciplinary Comparison." Scientometrics 106, no. 2: 787-804. https://doi.org/10.1007/s11192-015-1798-9.
Harzing, Anne-Wil and Satu Alakangas. 2017a. “Microsoft Academic: Is the Phoenix Getting Wings?” Scientometrics 110, no. 1: 371-83. https://doi.org/10.1007/s11192-016-2185-x.
Harzing, Anne-Wil and Satu Alakangas. 2017b. “Microsoft Academic is One Year Old: The Phoenix is Ready to Leave the Nest”. Scientometrics 112, no. 3: 1887-94. https://doi.org/10.1007/s11192-017-2454-3.
Hjørland, Birger. 1998. “The Classification of Psychology: A Case Study in the Classification of a Knowledge Field”. Knowledge Organization 25, no. 4: 162-201
Hjørland, Birger. 2002. “Epistemology and the Socio-Cognitive Perspective in Information Science”. Journal of the American Society for Information Science and Technology 53, no. 4: 257-70. DOI: 10.1002/asi.10042.
Hjørland, Birger. 2011. “The Importance of Theories of Knowledge: Indexing and Information Retrieval as an Example”. Journal of the American Society for Information Science and Technology 62, no. 1: 72-77. https://doi.org/10.1002/asi.21451.
Hjørland, Birger. 2013. “Citation Analysis: A Social and Dynamic Approach to Knowledge Organization.” Information Processing & Management 49, no. 6: 1313–25. https://doi.org/10.1016/j.ipm.2013.07.001.
Hjørland, Birger. 2015. “Classical Databases and Knowledge Organization: A Case for Boolean Retrieval and Human Decision-making During Searches”. Journal of the Association for Information Science and Technology 66, no. 8: 1559-1575. DOI: 10.1002/asi.23250
Hjørland, Birger. 2017. “Subject (of documents)”. Knowledge Organization 44, no. 1: 55-64.
Hjørland, Birger. 2018. “Indexing: Concepts and Theory”. Knowledge Organization 45, no. 7: 609-39. Also available in ISKO Encyclopedia of Knowledge Organization, eds. Birger Hjørland and Claudio Gnoli, https://www.isko.org/cyclo/indexing.
Hjørland, Birger. 2019. “The Classification of Psychology: A Case Study in the Classification of a Knowledge Field”, Appendix 8. In ISKO Encyclopedia of Knowledge Organization. https://www.isko.org/cyclo/psychology#app8.
Hjørland, Birger, and Lykke Kyllesbech Nielsen. 2001. “Subject Access Points in Electronic Retrieval”. Annual Review of Information Science and Technology 35: 249–98.
Hopewell, Sally, M. Clarke, A. Lusher, C. Lefebvre and M. Westby. 2002. “A Comparison of Hand Searching Versus MEDLINE Searching to Identify Reports of Randomized Controlled Trials”. Statistics in Medicine 21, no. 11: 1625–34.
Hua, Weina. 2001. “The Development of the Chinese Social Sciences Citation Index.” Indexer 22, no. 3: 128-9. http://search.ebscohost.com/login.aspx?direct=true&db=edsbl&AN=RN095623682&lang=pt-br&site=eds-live.
Hug, Sven E., Michael Ochsner and Martin P. Brändle. 2017. “Citation Analysis with Microsoft Academic.” Scientometrics 111, no. 1: 371-8. https://arxiv.org/ftp/arxiv/papers/1609/1609.05354.pdf.
Hug, Sven E. and Martin P. Brändle. 2017. “The Coverage of Microsoft Academic: Analyzing the Publication Output of a University”. Scientometrics 113, no. 3: 1551-71. https://doi.org/10.1007/s11192-017-2535-3.
Jacso, Peter. 2005. "As We May Search: Comparison of Major Features of the Web of Science, Scopus, and Google Scholar Citation-Based and Citation-Enhanced Databases." Current Science 89, no. 9: 1537-1547. https://www.jstor.org/stable/24110924.
Jefferson, Osmat A., Deniz Koellhofer, Ben Warren, and Richard Jefferson. 2019. "The Lens Metarecord and Lensid: An Open Identifier System for Aggregated Metadata and Versioning of Knowledge Artefacts". LIS Scholarship Archive, November 25. doi:10.31229/osf.io/t56yh.
Khabsa, Madian and C. Lee Giles. 2014. “The Number of Scholarly Documents on the Public Web”. PLoS ONE 9, no. 5: e93949. doi:10.1371/journal.pone.0093949
Klein, Daniel B. and Eric Chiang. 2004. “The Social Science Citation Index: A Black Box – With an Ideological Bias?” Econ Journal Watch 1, no. 1: 134-65. https://econjwatch.org/articles/the-social-science-citation-index-a-black-box-with-an-ideological-bias.
Knorr-Cetina, Karin. 1981. The Manufacture of Knowledge: An Essay on the Constructivist and Contextual Nature of Science. Oxford: Pergamon Press.
Knorr-Cetina, Karin. 1991. “Merton's Sociology of Science: The First and the Last Sociology of Science”. Contemporary Sociology – A Journal of Reviews 20, no. 4: 522-6.
The Knowledge Foundation. 2023. “Indian Citation Index”. https://web.archive.org/web/20221203080352/http://indiancitationindex.com/ici.aspx.
Kousha, Kayvan, and Mike Thelwall. 2007. “Google Scholar Citations and Google Web/URL Citations: A Multi-Discipline Exploratory Analysis.” Journal of the American Society for Information Science and Technology 58, no. 7: 1055–65. https://doi.org/10.1002/asi.20584.
Latour, Bruno and Steve Woolgar. 1979. Laboratory Life: The Social Construction of Scientific Facts. London: Sage.
Lewison, Grant and Philip Roe. 2013. “The Shortfall in Coverage of Countries’ Papers in the Social Sciences Citation Index Compared with the Science Citation Index”. In 14th International Society of Scientometrics and Informetrics Conference (ISSI), Vienna, Austria 15th to 20th July 2013. Proceedings, edited by Juan Gorraiz, Edgar Schiebel, Christian Gumpenberger, Marianne Hörlesberger and Henk Moed. Wien: AIT Austrian Institute of Technology, vol. 2: 1601-12. http://www.issi2013.org/Images/ISSI_Proceedings_Volume_II.pdf.
Leydesdorff, Loet and Ulrike Felt. 2012a. “’Books’ and ‘Book Chapters’, in the Book Citation Index (BKCI) and Science Citation Index (SCI, SoSCI, A&HCI)”. Proceedings of the American Society for Information Science and Technology 49, no. 1: 1-7. DOI: 10.1002/meet.14504901027
Leydesdorff, Loet and Ulrike Felt. 2012b. “Edited Volumes, Monographs and Book Chapters in the Book Citation Index (BKCI) and Science Citation Index (SCI, SoSCI, A&HCI)”. Journal of Scientometric Research 1, no. 1: 28-34.
Li, Jie, Judy F. Burnham, Trey Lemley, and Robert M. Britton. 2010. "Citation Analysis: Comparison of Web of Science®, Scopus™, Scifinder®, and Google Scholar." Journal of Electronic Resources in Medical Libraries 7, no. 3: 196-217.
López-Cózar, Emilio Delgado, Enrique Orduna-Malea and Alberto Martín-Martín. 2019. “Google Scholar as a Data Source for Research Assessment”. In Springer Handbook of Science and Technology Indicators, edited by Wolfgang Glaenzel, Henk Moed, Ulrich Schmoch and Michael Thelwall. (In Press). https://arxiv.org/abs/1806.04435.
Martín-Martín, Alberto, Enrique Orduna-Malea, Mike Thelwall and Emilio Delgado López-Cózar. 2018. “Google Scholar, Web of Science, and Scopus: A Systematic Comparison of Citations in 252 Subject Categories”. Journal of Informetrics 12, no. 4: 1160–77.
Martyn, John. 1965. “An Examination of Citation Indexes.” Aslib Proceedings 17, no. 6: 184–96. https://doi.org/10.1108/eb050021.
McCain, Katherine W. 1989. “Descriptor and Citation Retrieval in the Medical Behavioral Sciences Literature: Retrieval Overlaps and Novelty Distribution.” Journal of the American Society for Information Science 40, no. 2: 110-14. https://doi.org/10.1002/(SICI)1097-4571(198903)40:2<110::AID-ASI5>3.0.CO;2-T.
McVeigh, Marie E. 2017. “Citation Indexes and the Web of Science.” In Encyclopedia of Library and Information Sciences, 4th edition, edited by John D. McDonald and Michael Levine-Clark. Boca Raton, London, New York: CRC Press, vol. 2: 940-50.
Meho, Lokman I. and Kiduk Yang. 2007. “Impact of Data Sources on Citation Counts and Rankings of LIS Faculty: Web of Science Versus Scopus and Google Scholar”. Journal of the American Society for Information Science and Technology 58, no. 13: 2105–25. DOI: 10.1002/asi.20677
Mehrad, J. and M. Naseri. 2012. "The Islamic World Science Citation Center: A New Scientometrics System for Evaluating Research Performance in the OIC Region". International Journal of Information Science and Management 8, no. 2: 1-10.
Merton, Robert K. 1988. “The Matthew Effect in Science, II: Cumulative Advantage and the Symbolism of Intellectual Property.” Isis 79: 606–23. https://doi.org/10.1086/354848.
Microsoft Academic [@MSFTAcademic]. 2017. “Some Facts About the Current Size of our Data.” Stop & meet us at #kdd2017 @MLatMSFT [Tweet]. Retrieved from https://twitter.com/MSFTAcademic/status/897494672200921088.
Microsoft Academic. 2021. "Next Steps for Microsoft Academic: Expanding into New Horizons". Accessed August 22, 2021. https://www.microsoft.com/en-us/research/project/academic/articles/....
Mingers, John and Evangelia A. E. C. G. Lipitakis. 2010. “Counting the Citations: A Comparison of Web of Science and Google Scholar in the Field of Business and Management”. Scientometrics 85, no. 2: 613–25. http://dx.doi.org/10.1007/s11192-010-0270-0.
Moed, Henk F. 2005. Citation Analysis in Research Evaluation. (Information Science and Knowledge Management, vol. 9). Dordrecht: Springer.
Moed, Henk F. 2017. Applied Evaluative Informetrics. Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-60522-7_14.
Moed, Henk F., Judit Bar-Ilan and Gali Halevi. 2016. “A New Methodology for Comparing Google Scholar and Scopus”. Journal of Informetrics 10, no. 2: 533–51. doi: 10.1016/j.joi.2016.04.017.
Moskaleva, Olga, Vladimir Pislyakov, Ivan Sterligov, Mark Akoev and Svetlana Shabanova. 2018. “Russian Index of Science Citation: Overview and Review.” Scientometrics 116, no. 1: 449-62. https://doi.org/10.1007/s11192-018-2758-y.
Narin, Francis. 1976. Evaluative Bibliometrics: The Use of Publication and Citation Analysis in the Evaluation of Scientific Activity. Cherry Hill, NJ: Computer Horizons, Inc.
Negishi, Masamitsu, Yuan Sun and Kunihiro Shigi. 2004. "Citation Database for Japanese Papers: A New Bibliometric Tool for Japanese Academic Society". Scientometrics 60, no. 3: 333-51.
Nicolaisen, Jeppe. 2007. “Citation Analysis”. Annual Review of Information Science and Technology 41: 609-41.
Nordic Institute of Asian Studies (NIAS). 2018. “Korean Citation Index.” AsiaPortal: A Nordic Information Resource Portal for Asian Studies. http://www.asiaportal.info/database/korean-citation-index/.
Noruzi, Alireza. 2005. “Google Scholar: The New Generation of Citation Indexes”. Libri 55, no. 4: 170-80. DOI: 10.1515/LIBR.2005.170
Nyborg, Helmuth. 2005. "Sex-related Differences in General Intelligence g, Brain Size, and Social Status". Personality and Individual Differences 39, no. 3: 497-509. https://doi.org/10.1016/j.paid.2004.12.011.
Öchsner, Andreas. 2013. Introduction to Scientific Publishing: Backgrounds, Concepts, Strategies. Berlin: Springer.
Orduña-Malea, Enrique and Emilio Delgado López-Cózar. 2018. “Dimensions: Re-discovering the Ecosystem of Scientific Information”. El Profesional de la Información 27, no. 2: 420-31. (In English) https://arxiv.org/ftp/arxiv/papers/1804/1804.05365.pdf.
Ørom, Anders. 2003. “Knowledge Organization in the Domain of Art Studies: History, Transition and Conceptual Changes”. Knowledge Organization 30, no. 3-4: 128-43.
Ouahi, Jamal El. 2021. "Early Insights into the Arabic Citation Index". arXiv.org (Jan 28). Accessed April 17, 2021. https://search-proquest.ez22.periodicos.capes.gov.br/working-papers/early-insights-into-arabic-citation-index/docview/2483459927/se-2?accountid=146694
Packer, Abel L., Nicholas Cop, and Solange M. Santos. 2014. “A Rede SciELO Em Perspectiva.” In SciELO – 15 Anos de Acesso Aberto: Um Estudo Analítico Sobre Acesso Aberto e Comunicação Científica, 41–66. Paris: UNESCO. https://doi.org/10.7476/9789237012376.
Pao, Miranda Lee, and Dennis B. Worthen. 1989. “Retrieval Effectiveness by Semantic and Citation Searching.” Journal of the American Society for Information Science 40, no. 4: 226-35. https://doi.org/10.1002/(SICI)1097-4571(198907)40:4<226::AID-ASI2>3.0.CO;2-6.
Price, Derek J. de Solla. 1970. “Citation Measures of Hard Science, Soft Science, Technology, and Nonscience.” In Communication among Scientists and Engineers, edited by Derek J. de Solla Price, Carnot E. Nelson, and Donald K. Pollock. Lexington, Mass.: Heath, 3-22 (+ notes 325).
Priem, Jason, Dario Taraborelli, Paul Groth and Cameron Neylon. 2010. Altmetrics: A Manifesto, 26 October 2010. http://altmetrics.org/manifesto.
Prins, Ad A. M., Rodrigo Costas, Thed N. van Leeuwen and Paul F. Wouters. 2016. “Using Google Scholar in Research Evaluation of Humanities and Social Science Programs: A Comparison with Web of Science Data.” Research Evaluation 25, no. 3: 264–70. http://dx.doi.org/10.1093/reseval/rvv049.
Rousseau, Ronald, Leo Egghe and Raf Guns. 2018. Becoming Metric-Wise: A Bibliometric Guide for Researchers. Oxford, UK: Chandos Publishing.
Shapiro, Fred R. 1992. “Origins of Bibliometrics, Citation Indexing, and Citation Analysis: The Neglected Legal Literature.” Journal of the American Society for Information Science 43, no. 5: 337-39. https://doi.org/10.1002/(SICI)1097-4571(199206)43:5<337::AID-ASI2>3.0.CO;2-T.
Sinha, Arnab, Zhihong Shen, Yang Song, Hao Ma, Darrin Eide, Bo-june (Paul) Hsu and Kuansan Wang. 2015. “An Overview of Microsoft Academic Service (MAS) and Applications”. In Proceedings of the 24th International Conference on World Wide Web, edited by Aldo Gangemi, Stefano Leonardi and Alessandro Panconesi. New York: ACM, 243-46. DOI: 10.1145/2740908.2742839.
Sivertsen, Gunnar and Birger Larsen. 2012. “Comprehensive Bibliographic Coverage of the Social Sciences and Humanities in a Citation Index: An Empirical Analysis of the Potential”. Scientometrics 91, no. 2: 567-75. https://doi.org/10.1007/s11192-011-0615-3.
Small, Henry G. 1978. “Cited Documents as Concept Symbols.” Social Studies of Science 8, no. 3: 327-40. https://www.jstor.org/stable/284908.
Small, Henry. 2004. “On the Shoulders of Robert Merton: Towards a Normative Theory of Citation.” Scientometrics 60, no. 1: 71–79. https://doi.org/10.1023/B:SCIE.0000027310.68393.bc.
Soler Monreal, M. Concha and Isidoro Gil-Leiva. 2011. "Evaluation of Controlled Vocabularies by Inter-Indexer Consistency". Information Research 16, no. 4: paper 502. Available at http://InformationR.net/ir/16-4/paper502.html.
Souza, Iara Vidal Pereira de. 2015. “Altmetria ou Métricas Alternativas: Conceitos e Principais Características.” (Altmetrics or Alternative Metrics: Concepts and Key Features). AtoZ: Novas Práticas em Informação e Conhecimento 4, no. 2: 58-60. https://revistas.ufpr.br/atoz/article/view/44554/27146
Sugimoto, Cassidy R. and Vincent Larivière. 2018. Measuring Research: What Everyone Needs to Know. Oxford: Oxford University Press.
Swanson, Don R. 1986. “Undiscovered Public Knowledge.” The Library Quarterly 56, no. 2: 103-18.
Testa, James. 2009. “Regional Content Expansion in Web of Science®: Opening Borders to Exploration”. GlobalHigherEd, Webblog. (Originally published on an internal Thomson Reuters website). https://globalhighered.wordpress.com/2009/01/15/regional-content-expansion-in-web-of-science/.
Testa, James. 2012. Clarivate Analytics Conference Proceedings Selection Process http://wokinfo.com/products_tools/multidisciplinary/webofscience/cpci/cpciessay/.
Thelwall, Mike. 2017. “Microsoft Academic: A Multidisciplinary Comparison of Citation Counts with Scopus and Mendeley for 29 Journals”. Journal of Informetrics 11, no. 4: 1201-12. https://doi.org/10.1016/j.joi.2017.10.006.
Thelwall, Mike. 2018a. “Microsoft Academic Automatic Document Searches: Accuracy for Journal Articles and Suitability for Citation Analysis”. Journal of Informetrics 12, no. 1: 1-9. https://doi.org/10.1016/j.joi.2017.11.001.
Thelwall, Mike. 2018b. “Dimensions: A Competitor to Scopus and the Web of Science?” Journal of Informetrics 12, no. 2: 430-5. https://doi.org/10.1016/j.joi.2018.03.006.
Torres-Salinas, Daniel, Evaristo Jiménez-Contreras and Nicolas Robinson-García. 2014. “How Many Citations Are There in the Data Citation Index?” In Proceedings, 19:6. Leiden: Universiteit Leiden. http://hdl.handle.net/10481/32931.
Torres-Salinas, Daniel, Alberto Martín-Martín and Enrique Fuente-Gutiérrez. 2014. “Analysis of the Coverage of the Data Citation Index – Thomson Reuters: Disciplines, Document Types and Repositories”. Revista Española de Documentación Científica 37, no. 1: 1-6. https://doi.org/10.3989/redc.2014.1.1114.
Torres-Salinas, Daniel, Nicolas Robinson-Garcia, Juan Miguel Campanario and Emilio Delgado López-Cózar. 2014. “Coverage, Field Specialization and the Impact of Scientific Publishers Indexed in the Book Citation Index”. Online Information Review 38, no. 1: 24–42.
Torres-Salinas, Daniel, Nicolás Robinson-García and Emilio Delgado López-Cózar. 2012. “Towards a ‘Book Publishers Citation Reports’. First Approach Using the ‘Book Citation Index’”. Ec3 Working Papers 7, 29 July 2012. https://arxiv.org/ftp/arxiv/papers/1207/1207.7067.pdf.
Torres-Salinas, Daniel, Rosa Rodríguez-Sánchez, Nicolás Robinson-García, J. Fdez-Valdivia and J. A. García. 2013. “Mapping Citation Patterns of Book Chapters in the Book Citation Index.” Journal of Informetrics 7, no. 2: 412–24. https://doi.org/10.1016/j.joi.2013.01.004.
Türp, Jens C., Jutta-Maria Schulte and Gerd Antes. 2002. “Nearly Half of Dental Randomized Controlled Trials Published in German are not Included in Medline”. European Journal of Oral Sciences 110, no. 6: 405–11.
Wade, Alex D., Kuansan Wang, Yizhou Sun and Antonio Gulli. 2016. “WSDM Cup 2016: Entity Ranking Challenge”. In Proceedings of the Ninth ACM International Conference on Web Search and Data Mining, edited by Paul N. Bennet, Vanja Josifovski, Jennifer Neville and Filip Radlinski. New York: ACM, 593–4.
Web of Science. 2018. Web of Science Core Collection: Descriptive Document. Pennsylvania: Clarivate Analytics. https://clarivate.libguides.com/ld.php?content_id=45175981.
Weinberg, Bella Hass. 1997. “The Earliest Hebrew Citation Indexes.” Journal of the American Society for Information Science 48, no. 4: 318-30. https://doi.org/10.1002/(SICI)1097-4571(199704)48:4<318::AID-ASI5>3.0.CO;2-Z.
Weinberg, Bella Hass. 2004. “Predecessors of Scientific Indexing Structures in the Domain of Religion.” In The History and Heritage of Scientific and Technological Information Systems: Proceedings of the 2002 Conference, 65:126-34. Philadelphia, PA: American Society for Information Science & Technology.
Weinstock, Melvin. 1971. “Citation Indexes.” Encyclopedia of Library and Information Science, edited by Allan Kent. New York, NY: Marcel Dekker, vol. 5: 16-41. Available in reprint at: http://www.garfield.library.upenn.edu/essays/V1p188y1962-73.pdf.
Wikipedia, The Free Encyclopedia. “Shepard's Citations”. Accessed July 10, 2019. https://en.wikipedia.org/wiki/Shepard%27s_Citations.
Wilkinson, David and Mike Thelwall. 2013. “Search Markets and Search Results: The Case of Bing.” Library & Information Science Research 35, no.4: 318–25. https://doi.org/10.1016/j.lisr.2013.04.006.
Wouters, Paul F. 1999. The Citation Culture. PhD thesis. Amsterdam: University of Amsterdam. https://dare.uva.nl/search?identifier=b101b769-100f-43e5-b8d2-cac6c11e5bbf.
Wouters, Paul F. 2016. “Semiotics and Citations”. In Theories of Informetrics and Scholarly Communication: A Festschrift in Honor of Blaise Cronin, edited by Cassidy R. Sugimoto. Berlin: Walter de Gruyter, 72-92.
Zainab, A. N., A. Abrizah and Ram Gopal Raj. 2013. "Adding Value to Scholarly Journals Through a Citation Indexing System". Program 47, no. 3: 239-62.
Ziman, John M. 1968. Public Knowledge: An Essay Concerning the Social Dimension of Science. London: Cambridge University Press.
Zuccala, Alesia, Mads Breum, Kasper Bruun and Bernd T. Wunsch. 2018. “Metric Assessments of Books as Families of Works.” Journal of the Association for Information Science and Technology 69, no. 1: 146-57. https://doi.org/10.1002/asi.23921.
Zuckerman, Harriet. 1987. “Citation Analysis and the Complex Problem of Intellectual Influence”. Scientometrics 12, no. 5-6: 329–38. https://doi.org/10.1007/BF02016675.
CSCD was established in 2000 as a result of a partnership between Clarivate Analytics and the Chinese Academy of Sciences. The database is hosted on the Web of Science, is fully integrated and searchable with WoS, and covers approximately 1,200 top scholarly publications from China, with nearly 2 million records in total (Clarivate Analytics 2018a).
CSSCI was established in 2000 as an interdisciplinary citation index, developed by Nanjing University. It covers about 500 Chinese academic journals of humanities and social sciences. CSSCI has bridged a gap in the field of Chinese social science research and has become a useful database for information retrieval and an important tool for evaluating research work and social science journals (Hua 2001).
The Korean Journal Database (KCI) was established in 2010. It “provides a comprehensive snapshot of the most influential regional content from researchers in South Korea. Using citation connections from the Web of Science™, regional work is framed within the broader context of global research”. It currently indexes about 5,600 journals, with coverage from 1980 to the present (the journal list can be downloaded in MS-Excel format from the website), and results from a collaboration with the National Research Foundation of Korea. Subject coverage includes Arts & Humanities, Life Sciences & Biomedicine, Physical Sciences, Social Sciences, and Technology (Clarivate Analytics 2016). Citation information, statistical data and bibliographic information on domestic journals from Korea, as well as indicators such as the number of citations and the h-index, are available through KCI (Nordic Institute of Asian Studies 2018).
The SciELO Citation Index was established in 2014. It is a multidisciplinary citation index from Latin America and is part of the SciELO Network, which was created in 1998 and whose first collection was SciELO Brazil. Among the roles of the SciELO collections are: indexing journals according to specific criteria; providing usage and citation statistics (access, downloads, citations); publishing full texts in open access; and guaranteeing the interoperability of collections and journals (Packer, Cop and Santos 2014).
RSCI was established in 2016. This database includes papers from selected Russian journals and is based on data from the national citation index, the Russian Index of Science Citation (RISC). RISC was launched in 2005 but is scarcely known to the English-language audience. It is a government-funded project primarily aimed at creating a comprehensive bibliographic/citation database of Russian scholarly publishing for evaluation purposes, built on the Scientific Electronic Library (eLibrary.ru), which started as a full-text database of scholarly literature for grant holders of the Russian Foundation for Basic Research (Moskaleva et al. 2018). In collaboration with the Scientific Electronic Library (eLibrary.ru), the Russian Science Citation Index on Web of Science™ enables discovery of new insights from Russian publications. It covers fields such as engineering, materials science, and ecology and indexes over 600 titles (Clarivate Analytics 2015).
The Taiwan Citation Index — Humanities and Social Sciences (TCI-HSS) (formerly the Taiwan Humanities Citation Index, THCI, and the Taiwan Social Sciences Citation Index, TSSCI) was formally launched on 11 September 2013 (cf. https://enwww.ncl.edu.tw/information_40_3680.html).
THCI is Taiwan's effort to construct a search, research, and evaluation tool for research in the arts and humanities. Chen (2004) describes the design, framework, features, and policies and rules of the THCI. Citation analysis has been regarded as a systematic way to investigate research developments and trends. Since the Arts & Humanities Citation Index (A&HCI) indexes mostly English journals, the THCI could become an auxiliary citation index of the A&HCI for Taiwanese researchers.
Negishi, Sun and Shigi (2004) describe the construction and functions of the Citation Database for Japanese Papers (CJP), developed at the National Institute of Informatics, Japan (NII), and the impact factors of CJP's source journals. They then describe statistical analyses using multidimensional scaling of citation counts among academic society journals to measure the relationships among the societies. The authors also introduce a new citation navigation system, CiNii, which enables users to access various resources provided by NII, such as the NACSIS Electronic Library Service (NACSIS-ELS), to obtain the electronic full text of journal articles through citation links. Recent political developments in Japan towards the enhancement of the scientific information infrastructure are also introduced, along with their implications for research evaluation systems incorporating citation analyses.
The ISC databases are hosted by the Islamic Citation Centre in Shiraz, Iran. The establishment of the Islamic World Science Citation Center (ISC) was approved by the Islamic Conference of the Ministers of Higher Education and Scientific Research (ICMHESR) at a meeting held by ISESCO in 2008 in Baku, the capital of Azerbaijan. Since then, Islamic universities and research institutes have been required to cooperate with ISC.
Based on a ratification by the Development Council of Higher Education in Iran's Ministry of Science, Research and Technology, the establishment of ISC was approved in 2008. Currently, all ISESCO member countries, including those of Southeast Asia, the Arab countries, African non-Arab Islamic countries, Central Asia and the Caucasus region, and other Islamic countries in the Middle East, Europe and South America, are covered by ISC.
Some of the major activities of ISC since its opening are listed on its website (https://isc.gov.ir/en).
According to Mehrad and Naseri (2012), the Islamic World Science Citation Center (ISC) is the first citation system in the Islamic countries. Because these countries needed citation databases with good coverage of their journals and without language bias, the Regional Information Center for Science and Technology began constructing the citation databases for the Islamic countries, called ISC, in 2001 and released its services in January 2004.
ISC databases provide access to current and retrospective bibliographic information and cited references found in approximately 1,317 titles of scientific journals from the Islamic countries, covering engineering, science, agriculture, medicine and humanities disciplines.
The cooperation between ISC and Scopus shows that it is of utmost importance to construct non-English citation databases that reflect the scientific activities of each Islamic country or geographic area and make them visible to the developed countries.
MyCite (Malaysian Citation Index) analyses and provides citation data only for domestic citations. Authors may use MyCite to find out who has cited their papers published in Malaysian scholarly works. Citations to foreign journals cited in a domestic journal are not within the scope of MyCite. From MyCite it is possible to identify which paper cites which, and to gauge the impact of a journal or an article. MyCite also provides users with easy access to Malaysian scholarly works over the Web (http://www.mycite.my/en/about-mycite/background).
The Ministry of Education (MOE) of Malaysia initiated the establishment of the Malaysian Citation Centre (MCC) in 2011. MCC is responsible for collating, monitoring, coordinating and improving the standard of scholarly journal publications in Malaysia, and it maintains the citation system named MyCite, the Malaysian Citation Index.
MyCite provides access to bibliographic as well as full-text contents of scholarly journals published in Malaysia in the fields of Sciences, Technology, Medicine, Social Sciences and the Humanities. Besides this, MyCite provides citation and bibliometric reports on Malaysian researchers, journals and institutions based only on the contents within MyCite. It is estimated that there are over 500 Malaysian journals whose contents need to be made visible globally, so that Malaysian researchers can identify expertise and areas of possible collaboration and so that use and citations are stimulated. At present, MyCite gives access to 65,065 articles in 225 different journals.
According to Zainab, Abrizah and Raj (2013), some countries, especially in Asia, have been developing their own citation indexing systems to solve the problem of poor representation in WoS and Scopus. In Malaysia, the proposal for setting up a Malaysian abstracting and indexing system was first mooted at a publishers' conference in 2006. The Ministry of Higher Education, Malaysia eventually approved the setting up of the Malaysian Citation Centre, which is responsible for the Malaysian citation index system (MyCite).
MyCite is a citation indexing system developed in-house collaboratively by the Malaysian Citation Centre and the Information Technology Centre of the University of Malaya. It is now available to the global community in the hope of increasing the visibility of Malaysian scholarly journals, empowering the scholarly community of users and supporting use at the national level.
The Arabic Citation Index (ARCI) was funded by the Egyptian Knowledge Bank (EKB) in 2020 as part of an initiative of the Egyptian Ministry of Education and of Egypt's Vision 2030 (Ouahi 2021). It is built by Clarivate Analytics and hosted on the Web of Science, and it is the fifth regional citation index on the Web of Science platform (Buckland 2021). It provides access to bibliographic information and citations to scholarly articles from Arabic journals and other Web of Science content. According to Buckland (2021), ARCI has already seen strong submissions from Egypt (28%) and Algeria (24%), with Iraq (12%), Jordan (8%) and Saudi Arabia (7%) also well represented.
The primary aim of the index is "to evaluate the quality and research output of Arabic researchers, universities and research organizations" (Clarivate Analytics 2021). When it was launched, ARCI indexed more than 400 expertly curated Arabic journals, with a language interface in both English and Arabic (Buckland 2021).
According to Clarivate Analytics (2021), ARCI enhances the visibility and quality of Arabic journals and the influence of Arabic science in the global arena, improves its chances of being cited worldwide, offers a language interface in English and Arabic, and supports regional competitiveness and funding opportunities.
The journals covered in ARCI are selected by a newly established editorial board with members from Arab League countries, who provide subject knowledge and regional insights. The selection process for ARCI is based on traditional scientific publishing standards and the research norms of the Arab region (Ouahi 2021).
The Indian Citation Index (ICI) is an abstract and citation database for measuring the performance of Indian research journals. The index was formally launched in October 2010 by The Knowledge Foundation in association with Divan Enterprises. It has two basic functions: general literature search and citation-based evaluation, similar to the international databases.
“ICI is a fully web-based abstract and citation database covering R&D literatures across all disciplines published in journals/serials or in other documents emanating from India” (Giri and Kumar Das 2011, 34). The ICI's scope covers Indian R&D literature across all disciplines (life science, technology, medicine, agriculture, social science and the humanities) published in more than 1,000 journals/serials and other documents emanating from India. It is a multidisciplinary research platform covering about 1,067 scholarly journals from India, of which about 600 are open access journals (Knowledge Foundation 2023).
The objectives of the index are: (a) to ensure access to articles published in local Indian R&D literature at the national and global level; (b) to reflect a true picture of locally published Indian scholarly contributions at the national and global level; and (c) to provide an authentic tool for effective and rigorous evaluation of Indian scholarly works (Knowledge Foundation 2023).
The journal selection policy is guided by the principle of Bradford's Law. Furthermore, many factors, ranging from qualitative to quantitative parameters, are considered when evaluating journals for indexing. Among the evaluation topics are the editorial context, the national or international focus of a journal, and citation analysis. These factors are weighed against each other when deciding whether or not to cover a journal in ICI (Knowledge Foundation 2023).
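Bradford's Law, invoked above as the guiding principle, can be sketched as follows (an illustrative formulation, not taken from the ICI documentation): if the journals relevant to a subject are ranked by the number of pertinent articles they publish and divided into zones each yielding roughly the same number of articles, the numbers of journals in successive zones grow approximately in the ratio 1 : n : n² for some multiplier n > 1. A small core of journals therefore accounts for a disproportionately large share of the relevant literature, which is why a selective index can concentrate its evaluation effort on identifying that core.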
Version 1.0 published 2019-08-05
Version 1.2 published 2019-09-12: Appendix 1.6-9 added
Version 1.3 published 2021-04-22: Appendix 1.10 added
Version 1.4 published 2021-09-06: Information about Microsoft Academic and Lens added
Version 1.5 published 2023-05-23: Appendix 1.11 added
Article category: Knowledge organizing processes (KOP)
This article (version 1.0) is also published in Knowledge Organization. How to cite it:
Araújo, Paula Carina de, Renata Cristina Gutierres Castanha and Birger Hjørland. 2021. “Citation Indexing and Indexes”. Knowledge Organization 48, no. 1: 72-101. Also available in ISKO Encyclopedia of Knowledge Organization, eds. Birger Hjørland and Claudio Gnoli, https://www.isko.org/cyclo/citation
©2019 ISKO. All rights reserved.