Ranking by Relevance and Citation Counts, a Comparative Study: Google Scholar, Microsoft Academic, WoS and Scopus

Article in Future Internet about Academic SEO

Abstract

Search engine optimization (SEO) constitutes the set of methods designed to increase the visibility of, and the number of visits to, a web page by improving its ranking on the search engine results pages. Recently, SEO has also been applied to academic databases and search engines, a trend that continues to grow. This new approach, known as academic SEO (ASEO), has generated a field of study with considerable potential for future growth due to the impact of open science. The study reported here forms part of this new field of analysis. The ranking of results is a key aspect of any information system since it determines the way in which results are presented to the user.

The aim of this study is to analyze and compare the relevance ranking algorithms employed by various academic platforms to identify the importance of citations received in their algorithms. Specifically, we analyze two search engines and two bibliographic databases: Google Scholar and Microsoft Academic, on the one hand, and Web of Science and Scopus, on the other.

A reverse engineering methodology is employed, based on the statistical analysis of Spearman’s correlation coefficients. The results indicate that the ranking algorithms used by Google Scholar and Microsoft Academic are the two most heavily influenced by citations received. Indeed, citation counts are clearly the main SEO factor in these academic search engines. An unexpected finding is that, at certain points in time, Web of Science (WoS) used citations received as a key ranking factor, despite the fact that the WoS support documentation claims this factor is not involved.

Keywords: ASEO; SEO; reverse engineering; citations; Google Scholar; Microsoft Academic; Web of Science; WoS; Scopus; indicators; algorithms; relevance ranking; citation databases; academic search engines




1. Introduction

The ranking of search results is one of the main challenges faced by the field of information retrieval [1,2]. Search results are sorted so that the results best able to satisfy the user’s information need appear at the top of the page [3].

The task, though, is far from straightforward, given that successful relevance ranking depends on the correct analysis and weighting of a document’s properties, as well as on the analysis of the information need and the keywords used [1,2,4].

Relevance ranking has been successfully employed in a number of areas, including web page search engines, academic search engines, academic author rankings and the ranking of opinion leaders on social platforms [5]. Many algorithms have been proposed to automate this relevance ranking, and some of them have been successfully implemented. The criteria applied differ depending on the specific characteristics of the elements to be ordered.

PageRank [6] and Hyperlink-Induced Topic Search (HITS) [7] are the best-known algorithms for ranking web pages. Variants of these algorithms have also been used to rank influencers in social media, including, for example, IP-Influence [8], TunkRank [9], TwitterRank [10] and TURank [11]. For the retrieval of academic documents, various algorithms have been proposed and used, both for the documents themselves and for their authors.

These include Authority-Based Ranking [12], PopRank [13], the Browsing-Based Model [14] and CiteRank [15]. All of them use the number of citations received by an article as a ranking factor, in combination with other elements such as publication date, the author’s reputation and the network of relationships between documents, authors and affiliated institutions.

Many information retrieval systems (search engines, bibliographic databases, citation databases, etc.) use relevance ranking in conjunction with other types of sorting, including chronological order, alphabetical order by author, number of queries and number of citations. In search engines like Google, relevance ranking is the predominant approach and is calculated by considering more than 200 factors [16,17].
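By way of illustration, the sketch below implements a minimal PageRank-style power iteration over a toy citation graph, showing how rank flows towards frequently cited documents. The graph, damping factor and stopping threshold are assumptions made for the example; this is not a reproduction of the algorithms actually used by Google, Google Scholar or Microsoft Academic.

```python
# Minimal PageRank-style power iteration over a toy citation graph.
# Graph, damping factor and tolerance are illustrative assumptions only.

def pagerank(links, damping=0.85, tol=1e-8, max_iter=100):
    """links maps each document to the list of documents it cites."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(max_iter):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, cited in links.items():
            if cited:
                share = damping * rank[node] / len(cited)
                for target in cited:
                    new_rank[target] += share
            else:
                # A document that cites nothing spreads its rank uniformly.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        if sum(abs(new_rank[node] - rank[node]) for node in nodes) < tol:
            return new_rank
        rank = new_rank
    return rank

# Toy graph: B cites A; C cites A and B. A, being the most cited document,
# ends up with the highest score.
print(pagerank({"A": [], "B": ["A"], "C": ["A", "B"]}))
```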

Unfortunately, Google does not release precise details about these factors; it only publishes fairly sketchy, general information. For example, the company says that inbound links and content quality are important [18,19]. Google justifies this lack of transparency as a means of fighting search engine spam [20] and of preventing low-quality documents from being ranked at the top of the results by falsifying their characteristics.

Search engine optimization (SEO) is the discipline responsible for optimizing websites and their content to ensure they are ranked at the top of the search engine results pages (SERPs), in accordance with the relevance ranking algorithm [21]. In recent years, SEO has also been applied to academic search engines, such as Google Scholar and Microsoft Academic. This new application has received the name of “academic SEO” (or ASEO) [22,23,24,25,26].

ASEO helps authors and publishers to improve the visibility of their publications, thus increasing the chances that their work will be read and cited.

However, it should be stressed that the relevance ranking algorithms of academic search engines differ from those of standard search engines. The ranking factors employed by the two types of search engine are not the same; therefore, many of the factors used in SEO are not applicable to ASEO, while others are specific to ASEO (see Table 1).

SEO companies [27,28,29] routinely conduct reverse engineering research to measure the impact of the factors involved in Google’s relevance ranking. Based on the characteristics of the pages that appear at the top of the SERPs, the factors with the greatest influence on the relevance ranking algorithm can be deduced. This is not a straightforward task, since many factors are involved and the algorithm is subject to constant change [30].
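The sketch below illustrates the basic logic of this kind of reverse engineering: the position of each document on a results page is correlated with its citation count using Spearman’s rank correlation coefficient, the statistic this study relies on. The positions, citation values and the use of SciPy are assumptions made purely for the example; they do not reproduce the study’s actual dataset or tooling.

```python
# Toy reverse-engineering check: correlate SERP position with citation count
# using Spearman's rank correlation. The values below are invented for the
# example and do not come from the study's dataset.
from scipy.stats import spearmanr

positions = list(range(1, 11))                                # ranks 1 (top) to 10
citations = [540, 480, 350, 410, 220, 190, 160, 90, 75, 30]   # hypothetical counts

rho, p_value = spearmanr(positions, citations)
print(f"Spearman's rho = {rho:.2f} (p = {p_value:.4f})")

# A rho close to -1 means that documents with more citations tend to occupy
# the top positions, i.e., that citation counts behave as a strong ranking factor.
```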

[Figure: search conducted on WoS with relevance ranking, showing the number of citations received by each result.]

Studies that have applied a reverse engineering methodology to Google Scholar have shown that citation counts are one of the key factors in relevance ranking [31,32,33,34]. Microsoft Academic, on the other hand, has received less attention from the scientific community [35,36,37,38] and there are no specific studies of the quality of its relevance ranking.

Academic search engines, such as Google Scholar and Microsoft Academic, are an alternative to commercial bibliographic databases, such as Web of Science (WoS) and Scopus, for indexing scientific citations; they provide a free service of comparable performance that competes with the business model of the classic services. Unlike the search engines, the bibliographic databases are fully transparent about how they calculate relevance, clearly explaining on their help pages how their algorithms work [39,40].

The primary aim of this study is to verify the importance attached to citations received in the relevance ranking algorithms of two academic search engines and two bibliographic databases. We analyze the two main academic search engines (i.e., Google Scholar and Microsoft Academic) and the two bibliographic citation databases providing the most comprehensive coverage (WoS and Scopus) [41].

We address the following research questions: Is the number of citations received a key factor in Google Scholar relevance rankings? Do the Microsoft Academic, WoS and Scopus relevance algorithms operate in the same way as Google Scholar’s? Do citations received have a similarly strong influence on all these systems? A similar approach to the one adopted here has been taken in previous studies of the factors involved in the ranking of scholarly literature [22,23,31,32,33,34].

The rest of this manuscript is organized as follows. First, we review previous studies of the systems that concern us here, above all those that focus on ranking algorithms. Next, we explain the research methodology and the statistical treatment performed. We then report, analyze and discuss the results obtained before concluding with a consideration of the repercussions of these results and possible new avenues of research.

2. Related Studies

Google Scholar, Microsoft Academic, WoS and Scopus have been analyzed previously in works that have adopted a variety of approaches, including, most significantly:

  • Comparative analyses of the coverage and quality of the academic search engines and bibliographic databases [42,43,44,45,46,47,48,49,50,51]
  • Studies of the impact of authors and the h-index [33,44,52,53,54,55,56,57]
  • Studies of the utility of Google Scholar and Academic Search for bibliometric studies [20,49,55,58,59,60,61]

However, few studies [43,62] have focused their attention on information retrieval and the search efficiency of academic search engines, while even fewer papers [22,23,31,32,33,34] have examined the factors used in ranking algorithms.

The main conclusions to be drawn from existing studies of relevance ranking in the systems studied can be summarized as follows:

  • The number of citations received is a very important factor in Google Scholar relevance rankings, so that documents with a high number of citations received tend to be ranked first [32,33,34].
  • Documents with many citations, once ranked at the top, attract more readers and further citations, thereby consolidating their top position [61].

Surprisingly, the relevance ranking factors of academic search engines and bibliographic databases have attracted little interest from the scientific community, especially if we consider that a better position in their rankings means enhanced possibilities of being found and, hence, of being read. Indeed, the initial items on a SERP have been shown to receive more attention from users than items lower down the page [63].

In the light of these previous reports, it can be concluded that the number of factors involved in the academic search engines is likely to be smaller than the number employed by Google and that, therefore, their algorithms are simpler (see Table 1).

[…]


References

  1. Baeza-Yates, R.; Ribeiro-Neto, B. Modern Information Retrieval; Addison-Wesley Professional: New York, NY, USA, 2010; pp. 3–339. [Google Scholar]
  2. Salton, G.; McGill, M.J. Introduction to Modern Information Retrieval; McGraw Hill: New York, NY, USA, 1987; pp. 1–400. [Google Scholar]
  3. Blair, D.C. Language and Representation in Information Retrieval; Elsevier: Amsterdam, The Netherlands, 1990; pp. 1–350. [Google Scholar]
  4. Maciá-Domene, F. SEO: Técnicas Avanzadas; Anaya: Barcelona, Spain, 2015; pp. 1–408. [Google Scholar]
  5. Chang, Y.; Aung, Z. AuthorRank: A New Scheme for Identifying Field-Specific Key Researchers. In Proceedings of the CONF-IRM 2015, New York, NY, USA, 16 January 2015; pp. 1–13. Available online: http://aisel.aisnet.org/confirm2015/46 (accessed on 1 July 2019). [Google Scholar]
  6. Brin, S.; Page, L. The anatomy of a large-scale hypertextual Web search engine. Comput. Netw. ISDN Syst. 1998, 30, 107–117. [Google Scholar] [CrossRef]
  7. Kleinberg, J.M. Authoritative sources in a hyperlinked environment. JACM 1999, 46, 604–632. [Google Scholar] [CrossRef]
  8. Romero, D.M.; Galuba, W.; Asur, S.; Huberman, B.A. Influence and passivity in social media. Mach. Learn. Knowl. Discov. Databases 2011, 6913, 18–33. [Google Scholar]
  9. Tunkelang, D. A Twitter Analog to PageRank. The Noisy Channel Blog. 2009. Available online: http://thenoisychannel.com/ (accessed on 1 July 2019).
  10. Weng, J.; Lim, E.-P.; Jiang, J.; He, Q. TwitterRank: Finding topic-sensitive influential Twitterers. In Proceedings of the 3rd ACM International Conference on Web Search and Data Mining (WSDM), New York, NY, USA, 3–6 February 2010; pp. 261–270. [Google Scholar]
  11. Yamaguchi, Y.; Takahashi, T.; Amagasa, T.; Kitagawa, H. TURank: Twitter user ranking based on user-tweet graph analysis. In Web Information Systems Engineering—WISE 2010; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6488, pp. 240–253. [Google Scholar]
  12. Hristidis, V.; Hwang, H.; Papakonstantinou, Y. Authority-based keyword search in databases. ACM Trans. Auton. Adapt. Syst. 2008, 33, 11–14. [Google Scholar] [CrossRef]
  13. Nie, Z.; Zhang, Y.; Wen, J.R.; Ma, W.Y. Object-level ranking: Bringing order to web objects. In Proceedings of the 14th International Conference on World Wide Web (ACM), Chiba, Japan, 10–14 May 2005; pp. 567–574. [Google Scholar] [CrossRef]
  14. Chen, L.; Nayak, R. Expertise analysis in a question answer portal for author ranking 2008. In Proceedings of the IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT), Washington, DC, USA, 31 August–3 September 2010; pp. 134–140. [Google Scholar]
  15. Walker, D.; Xie, H.; Yan, K.-K.; Maslov, S. Ranking scientific publications using a model of network traffic. J. Stat. Mech. Theory Exp. 2007, 06, 6–10. [Google Scholar] [CrossRef]
  16. Google. How Google Search Works. Learn How Google Discovers, Crawls, and Serves Web Pages, Search Console Help. Available online: https://support.google.com/webmasters/answer/70897?hl=en (accessed on 1 July 2019).
  17. Ziakis, C.; Vlachopoulou, M.; Kyrkoudis, T.; Karagkiozidou, M. Important Factors for Improving Google Search Rank. Future Internet 2019, 11, 32. [Google Scholar] [CrossRef]
  18. Ratcliff, C. WebPromo’s Q & A with Google’s Andrey Lipattsev, Search Engine Watch. Available online: https://searchenginewatch.com/2016/04/06/webpromos-qa-with-googles-andrey-lipattsev-transcript (accessed on 1 July 2019).
  19. Schwartz, B. Now We Know: Here Are Google’s Top 3 Search Ranking Factors, Search Engine Land. Available online: http://searchengineland.com/now-know-googles-top-three-search-ranking-factors-245882 (accessed on 1 July 2019).
  20. Beel, J.; Gipp, B. Academic search engine spam and Google Scholar’s resilience against it. J. Electron. Publ. 2010, 13, 1–28. [Google Scholar] [CrossRef]
  21. Enge, E.; Spencer, S.; Stricchiola, J. The Art of SEO: Mastering Search Engine Optimization; O’Reilly Media: Sebastopol, CA, USA; Boston, MA, USA, 2015; pp. 1–670. Available online: https://books.google.co.in/books?id=hg5iCgAAQBAJ (accessed on 1 July 2019).
  22. Beel, J.; Gipp, B. Google Scholar’s ranking algorithm: The impact of articles’ age (an empirical study). In Proceedings of the Sixth International Conference on Information Technology: New Generations, ITNG’09, Las Vegas, NV, USA, 27–29 April 2009; pp. 160–164. [Google Scholar]
  23. Beel, J.; Gipp, B.; Wilde, E. Academic search engine optimization (ASEO): Optimizing scholarly literature for Google Scholar & Co. J. Sch. Publ. 2010, 41, 176–190. [Google Scholar] [CrossRef]
  24. Codina, L. SEO Académico: Definición, Componentes y Guía de Herramientas, Codina, Lluís. Available online: https://www.lluiscodina.com/seo-academico-guia/ (accessed on 1 July 2019).
  25. Martín-Martín, A.; Ayllón, J.M.; Orduña-Malea, E.; López-Cózar, E.D. Google Scholar Metrics Released: A Matter of Languages and Something Else; EC3 Working Papers; EC3: Granada, Spain, 2016; pp. 1–14. Available online: https://arxiv.org/abs/1607.06260v1 (accessed on 1 July 2019).
  26. Muñoz-Martín, B. Incrementa el impacto de tus artículos y blogs: De la invisibilidad a la visibilidad. Rev. Soc. Otorrinolaringol. Castilla Leon Cantab. La Rioja 2015, 6, 6–32. Available online: http://hdl.handle.net/10366/126907 (accessed on 1 July 2019). [Google Scholar]
  27. Gielen, M.; Rosen, J. Reverse Engineering the YouTube, Tubefilter.com. Available online: http://www.tubefilter.com/2016/06/23/reverse-engineering-youtube-algorithm/ (accessed on 1 July 2019).
  28. Localseoguide. Local SEO Ranking Factors Study 2016, Localseoguide. Available online: http://www.localseoguide.com/guides/2016-local-seo-ranking-factors/ (accessed on 1 July 2019).
  29. Searchmetrics. Rebooting Ranking Factors. Available online: http://www.searchmetrics.com/knowledge-base/ranking-factors/ (accessed on 1 July 2019).
  30. MOZ. Google Algorithm Update History. Available online: https://moz.com/google-algorithm-change (accessed on 1 July 2019).
  31. Beel, J.; Gipp, B. Google scholar’s ranking algorithm: An introductory overview. In Proceedings of the 12th International Conference on Scientometrics and Informetrics, ISSI’09, Istanbul, Turkey, 14–17 July 2009; pp. 230–241. [Google Scholar]
  32. Beel, J.; Gipp, B. Google scholar’s ranking algorithm: The impact of citation counts (an empirical study). In Proceedings of the Third International Conference on Research Challenges in Information Science, RCIS 2009c, Nice, France, 22–24 April 2009; pp. 439–446. [Google Scholar]
  33. Martín-Martín, A.; Orduña-Malea, E.; Ayllón, J.M.; López-Cózar, E.D. Does Google Scholar Contain all Highly Cited Documents (1950–2013); EC3 Working Papers; EC3: Granada, Spain, 2014; pp. 1–96. Available online: https://arxiv.org/abs/1410.8464 (accessed on 1 July 2019).
  34. Rovira, C.; Guerrero-Solé, F.; Codina, L. Received citations as a main SEO factor of Google Scholar results ranking. El Profesional de la Información 2018, 27, 559–569. [Google Scholar] [CrossRef]
  35. Thelwall, M. Does Microsoft Academic Find Early Citations? Scientometrics 2018, 114, 325–334. Available online: https://wlv.openrepository.com/bitstream/handle/2436/620806/?sequence=1 (accessed on 1 July 2019). [Google Scholar] [CrossRef]
  36. Hug, S.E.; Ochsner, M.; Brändle, M.P. Citation analysis with Microsoft Academic. Scientometrics 2017, 110, 371–378. Available online: https://arxiv.org/pdf/1609.05354.pdf (accessed on 1 July 2019). [Google Scholar] [CrossRef]
  37. Harzing, A.W.; Alakangas, S. Microsoft Academic: Is the phoenix getting wings? Scientometrics 2017, 110, 371–383. Available online: http://eprints.mdx.ac.uk/20937/1/mas2.pdf (accessed on 1 July 2019). [Google Scholar] [CrossRef]
  38. Orduña-Malea, E.; Martín-Martín, A.; Ayllon, J.M.; Delgado-Lopez-Cozar, E. The silent fading of an academic search engine: The case of Microsoft Academic Search. Online Inf. Rev. 2014, 38, 936–953. Available online: https://riunet.upv.es/bitstream/handle/10251/82266/silent-fading-microsoft-academic-search.pdf?sequence=2 (accessed on 1 July 2019). [Google Scholar] [CrossRef]
  39. Clarivate. Colección Principal de Web of Science Ayuda. Available online: https://images-webofknowledge-com.sare.upf.edu/WOKRS532MR24/help/es_LA/WOS/hs_sort_options.html (accessed on 1 July 2019).
  40. Elsevier. Scopus: Access and Use Support Center. What Does “Relevance” Mean in Scopus? Available online: https://service.elsevier.com/app/answers/detail/a_id/14182/supporthub/scopus/ (accessed on 1 July 2019).
  41. AlRyalat, S.A.; Malkawi, L.W.; Momani, S.M. Comparing Bibliometric Analysis Using PubMed, Scopus, and Web of Science Databases. J. Vis. Exp. 2018. Available online: https://www.jove.com/video/58494/comparing-bibliometric-analysis-using-pubmed-scopus-web-science (accessed on 1 July 2019).
  42. Giustini, D.; Boulos, M.N.K. Google Scholar is not enough to be used alone for systematic reviews. Online J. Public Health Inform. 2013, 5, 1–9. [Google Scholar] [CrossRef]
  43. Walters, W.H. Google Scholar search performance: Comparative recall and precision. Portal Libr. Acad. 2008, 9, 5–24. [Google Scholar] [CrossRef]
  44. De-Winter, J.C.F.; Zadpoor, A.A.; Dodou, D. The expansion of Google Scholar versus Web of Science: A longitudinal study. Scientometrics 2014, 98, 1547–1565. [Google Scholar] [CrossRef]
  45. Harzing, A.-W. A preliminary test of Google Scholar as a source for citation data: A longitudinal study of Nobel prize winners. Scientometrics 2013, 94, 1057–1075. [Google Scholar] [CrossRef]
  46. Harzing, A.-W. A longitudinal study of Google Scholar coverage between 2012 and 2013. Scientometrics 2014, 98, 565–575. [Google Scholar] [CrossRef]
  47. De-Groote, S.L.; Raszewski, R. Coverage of Google Scholar, Scopus, and Web of Science: A case study of the h-index in nursing. Nurs. Outlook 2012, 60, 391–400. [Google Scholar] [CrossRef]
  48. Orduña-Malea, E.; Ayllón, J.-M.; Martín-Martín, A.; Delgado-López-Cózar, E. About the Size of Google Scholar: Playing the Numbers; EC3 Working Papers; EC3: Granada, Spain, 2014; Available online: https://arxiv.org/abs/1407.6239 (accessed on 1 July 2019).
  49. Orduña-Malea, E.; Ayllón, J.-M.; Martín-Martín, A.; Delgado-López-Cózar, E. Methods for estimating the size of Google Scholar. Scientometrics 2015, 104, 931–949. [Google Scholar] [CrossRef]
  50. Pedersen, L.A.; Arendt, J. Decrease in free computer science papers found through Google Scholar. Online Inf. Rev. 2014, 38, 348–361. [Google Scholar] [CrossRef]
  51. Jamali, H.R.; Nabavi, M. Open access and sources of full-text articles in Google Scholar in different subject fields. Scientometrics 2015, 105, 1635–1651. [Google Scholar] [CrossRef]
  52. Van-Aalst, J. Using Google Scholar to estimate the impact of journal articles in education. Educ. Res. 2010, 39, 387–400. Available online: https://goo.gl/p1mDBi (accessed on 1 July 2019). [Google Scholar] [CrossRef]
  53. Jacsó, P. Testing the calculation of a realistic h-index in Google Scholar, Scopus, and Web of Science for FW Lancaster. Libr. Trends 2008, 56, 784–815. [Google Scholar] [CrossRef]
  54. Jacsó, P. The pros and cons of computing the h-index using Google Scholar. Online Inf. Rev. 2008, 32, 437–452. [Google Scholar] [CrossRef]
  55. Jacsó, P. Calculating the h-index and other bibliometric and scientometric indicators from Google Scholar with the Publish or Perish software. Online Inf. Rev. 2009, 33, 1189–1200. [Google Scholar] [CrossRef]
  56. Jacsó, P. Using Google Scholar for journal impact factors and the h-index in nationwide publishing assessments in academia—Siren songs and air-raid sirens. Online Inf. Rev. 2012, 36, 462–478. [Google Scholar] [CrossRef]
  57. Martín-Martín, A.; Orduña-Malea, E.; Harzing, A.-W.; Delgado-López-Cózar, E. Can we use Google Scholar to identify highly-cited documents? J. Informetr. 2017, 11, 152–163. [Google Scholar] [CrossRef]
  58. Aguillo, I.F. Is Google Scholar useful for bibliometrics? A webometric analysis. Scientometrics 2012, 91, 343–351. Available online: https://goo.gl/nYBmZb (accessed on 1 July 2019). [Google Scholar] [CrossRef]
  59. Delgado-López-Cózar, E.; Robinson-García, N.; Torres-Salinas, D. Manipular Google Scholar Citations y Google Scholar Metrics: Simple, Sencillo y Tentador; EC3 working papers; Universidad De Granada: Granada, Spain, 2012; Available online: http://hdl.handle.net/10481/20469 (accessed on 1 July 2019).
  60. Delgado-López-Cózar, E.; Robinson-García, N.; Torres-Salinas, D. The Google Scholar experiment: How to index false papers and manipulate bibliometric indicators. J. Assoc. Inf. Sci. Technol. 2014, 65, 446–454. [Google Scholar] [CrossRef]
  61. Martín-Martín, A.; Orduña-Malea, E.; Ayllón, J.-M.; Delgado-López-Cózar, E. Back to the past: On the shoulders of an academic search engine giant. Scientometrics 2016, 107, 1477–1487. [Google Scholar] [CrossRef]
  62. Jamali, H.R.; Asadi, S. Google and the scholar: The role of Google in scientists’ information-seeking behaviour. Online Inf. Rev. 2010, 34, 282–294. [Google Scholar] [CrossRef]
  63. Marcos, M.-C.; González-Caro, C. Comportamiento de los usuarios en la página de resultados de los buscadores. Un estudio basado en eye tracking. El Profesional de la Información 2010, 19, 348–358. [Google Scholar] [CrossRef]
  64. Torres-Salinas, D.; Ruiz-Pérez, R.; Delgado-López-Cózar, E. Google Scholar como herramienta para la evaluación científica. El Profesional de la Información 2009, 18, 501–510. [Google Scholar] [CrossRef]
  65. Ortega, J.L. Academic Search Engines: A Quantitative Outlook; Elsevier: Amsterdam, The Netherlands, 2014. [Google Scholar]
  66. Hug, S.E.; Brändle, M.P. The coverage of Microsoft Academic: Analyzing the publication output of a university. Scientometrics 2017, 113, 1551–1571. [Google Scholar] [CrossRef]
  67. MOZ. Search Engine Ranking Factors 2015. Available online: https://moz.com/search-ranking-factors/correlations (accessed on 1 July 2019).
  68. Van der Graaf, P. Reverse Engineering Search Engine Algorithms is Getting Harder, Searchenginewatch. Available online: https://searchenginewatch.com/sew/how-to/2182553/reverse-engineering-search-engine-algorithms-getting-harder (accessed on 1 July 2019).
  69. Dave, D. 11 Things You Must Know About Google’s 200 Ranking Factors. Search Engine Journal. Available online: https://www.searchenginejournal.com/google-200-ranking-factors-facts/265085/ (accessed on 10 September 2019).
  70. Chariton, R. Google Algorithm—What Are the 200 Variables? Available online: https://www.webmasterworld.com/google/4030020.htm (accessed on 10 September 2019).
  71. Google. About Google Scholar. Available online: http://scholar.google.com/intl/en/scholar/about.html (accessed on 1 July 2019).
  72. Mayr, P.; Walter, A.-K. An exploratory study of Google Scholar. Online Inf. Rev. 2007, 31, 814–830. [Google Scholar] [CrossRef]
  73. Sinha, A.; Shen, Z.; Song, Y.; Ma, H.; Eide, D.; Hsu, B.; Wang, K. An Overview of Microsoft Academic Service (MAS) and Applications. In Proceedings of the 24th International Conference on World Wide Web (WWW 2015 Companion), ACM, New York, NY, USA, 18–22 May 2015; pp. 243–246. [Google Scholar] [CrossRef]
  74. Herrmannova, H.; Knoth, P. An Analysis of the Microsoft Academic Graph. D-Lib Mag. 2016, 22, 9–10. [Google Scholar] [CrossRef]
  75. Ortega, J.L. Microsoft Academic Search: The multi-object engine. Academic Search Engines: A Quantitative Outlook; Ortega, J.L., Ed.; Elsevier: Oxford, UK, 2014; pp. 71–108. [Google Scholar]
  76. Microsoft Academic. How Is MA Different from other Academic Search Engines? 2019. Available online: https://academic.microsoft.com/faq (accessed on 1 July 2019).
  77. Eli, P. Academic Word List words (Coxhead, 2000). Available online: https://www.vocabulary.com/lists/218701 (accessed on 1 July 2019).
  78. Wiktionary: Academic word list. Available online: https://simple.wiktionary.org/wiki/Wiktionary:Academic_word_list (accessed on 18 September 2019).
  79. Coxhead, A. A new academic word list. TESOL Q. 2012, 34, 213–238. [Google Scholar] [CrossRef]
  80. Harzing, A.-W. Publish or Perish. Available online: https://harzing.com/resources/publish-or-perish (accessed on 1 July 2019).
  81. Harzing, A.W. The Publish or Perish Book: Your Guide to Effective and Responsible Citation Analysis; Tarma Software Research Pty Ltd: Melbourne, Australia, 2011; pp. 339–342. Available online: https://EconPapers.repec.org/RePEc:spr:scient:v:88:y:2011:i:1:d:10.1007_s11192-011-0388-8 (accessed on 1 July 2019).
  82. R Core Team. R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing. Available online: https://www.R-project.org (accessed on 1 July 2019).
  83. Revelle, W. Psych: Procedures for Personality and Psychological Research, Northwestern University. Available online: https://CRAN.R-project.org/package=psych (accessed on 1 July 2019).
  84. Lemon, J. Plotrix: A package in the red light district of R. R News 2006, 6, 8–12. [Google Scholar]
  85. Farhadi, H.; Salehi, H.; Yunus, M.M.; Aghaei Chadegani, A.; Farhadi, M.; Fooladi, M.; Ale Ebrahim, N. Does it matter which citation tool is used to compare the h-index of a group of highly cited researchers? Aust. J. Basic Appl. Sci. 2013, 7, 198–202. Available online: https://ssrn.com/abstract=2259614 (accessed on 1 July 2019). [Google Scholar]