World ranking of universities on the Web

The World Ranking of Universities on the Web is an initiative whose objectives include promoting electronic access to scientific publications and to all academic materials.

In its beginnings its objective was to promote Web publishing and to support “Open Access” initiatives, as well as to promote electronic access to scientific publications and other academic materials. Today, however, Web data is also very useful for ranking universities, because the indicators are not based on the number of visits or on page design, but rather take into account the quality and impact of the universities.

Summary


  • 1 Presentation of the world ranking
    • 1.1 Background of the project
    • 1.2 Intentions and objectives of the rankings
    • 1.3 Design and weight of the indicators
    • 1.4 Collection and processing of data
    • 1.5 Presentation of the Ranking results
  • 2 Usefulness of the web ranking
  • 3 Sources

Presentation of the world ranking

The World Ranking of Universities on the Web adheres formally and explicitly to all the proposals set forth in the document Berlin Principles on Ranking of Higher Education Institutions [2]. The ultimate objective is the continuous improvement and refinement of the methodologies used, according to a group of previously agreed principles of good practice.

Over the past year several of the signatories to the Code of Good Practice known as the Berlin Principles on Ranking of Higher Education Institutions have become private for-profit companies, and the biases of some of these rankings are now more and more evident. Although the World Ranking of Universities on the Web still formally and explicitly adheres to the Berlin Principles, it also complies with others, such as:

  • A World Ranking is a ranking: publishing a completely different set of rankings with exactly the same data is useless as well as confusing.
  • A World University Ranking is a ranking of universities from all over the world, covering thousands of them, not just a few hundred institutions in the developed world.
  • A Ranking endorsed by a private for-profit company operating the business related to the rankings must be carefully checked.
  • The unexpected presence of certain universities in very high positions is a good indicator of the (lack of) quality of the Ranking, regardless of how supposedly good the methodologies used in its preparation are.
  • Rankings that favor the stability of results between editions, and that do not explicitly publish individual changes, and the reasoning behind them (correcting errors, adding or deleting entries, changing indicators) are violating the code of good practice.
  • The rankings based solely on Research (bibliometric) are biased to the detriment of the technological, computer, social and human sciences, disciplines that normally account for more than half of the academics of a standard university.
  • The Rankings should include indicators, even indirect, that cover the teaching mission and the so-called third mission, considering not only the scientific impact of the university’s activities but also the economic, social, cultural, and also political impacts.
  • World-class universities are not small, highly specialized institutions.
  • A survey is not an adequate tool for building a World Ranking, since no single individual has deep knowledge (several semesters per institution), multi-institutional experience (several dozen institutions) and multidisciplinary expertise (sciences, biomedicine, social sciences, technology) across a representative sample (different continents) of universities around the world.
  • Link analysis is a much more powerful tool for quality assessment than citation analysis, which only takes into account formal peer recognition: links not only include bibliographic citations but also add the contribution of third parties to university activities.

Background of the project

The “World Ranking of Universities on the Web” is an initiative of the Cybermetrics Laboratory, which belongs to the Center for Human and Social Sciences (CCHS), part of the largest national research body in Spain, the CSIC. The Cybermetrics Laboratory is dedicated to the quantitative analysis of the Internet and of the contents of the Web, especially those related to the process of academic generation and communication of scientific knowledge. This is a new and emerging discipline that has been called cybermetrics, also known as webometrics. The laboratory has published the free electronic journal Cybermetrics since 1997.

With this ranking we aim to provide extra motivation to researchers around the world to publish more and better scientific content on the Web, thus making it available to colleagues and to people in general wherever they are.

The “World Ranking of Universities on the Web” was officially launched in 2004 and is updated every 6 months (the data is collected during the months of January and June and published a month later). The web indicators used are based on, and correlate with, traditional bibliometric and scientometric indicators. The objective of the project is to convince the academic and political communities of the importance of web publishing, not only for the dissemination of academic knowledge but also as a way to measure scientific activity, performance and impact.

Intentions and objectives of the rankings

  • Documentation of higher education institutions (processes and results) on the Web. Rankings based on Web data can be combined with other, non-web indicators; in fact, comparative analyses following a similar initiative are already being published. But the current objective of the Ranking is to promote Web publishing by the universities, evaluating their commitment to electronic distribution, and to fight a very worrying problem in the academic environment: the emergence of a digital divide that is evident even among universities in developed countries. The Ranking is not intended to assess the performance of institutions based solely on their Web output.
  • Ranking purpose and target groups. The Ranking measures the volume, visibility and impact of the web pages published by universities, with special emphasis on scientific output (peer-reviewed articles, conference contributions, preprints, monographs, doctoral theses, reports, etc.), but also taking into account other materials such as courseware, documentation from seminars or working groups, digital libraries, databases, multimedia, personal pages, etc., as well as general information about the institution, its departments, research groups and support services, and the people working there or attending courses. The direct target group of the Ranking is that of the university authorities. If the performance of an institution’s website is below what is expected according to its academic excellence, the institution’s web policy should be reconsidered, promoting a substantial increase in the volume and quality of its electronic content. The members of the institution are an indirect target group, since we hope that in the not too distant future web information may be as important as other bibliometric and scientometric indicators for the evaluation of the scientific performance of academics and their research groups.

Finally, those students who are looking for a university should not use these data as the only guide, although a high position will always indicate that the institution maintains a policy that promotes the use of new technologies and has resources for their adoption.

  • Diversity of institutions: Missions and objectives of the institutions. Quality measures for research-oriented institutions are, for example, quite different from those that are appropriate for more general institutions. The institutions participating in the ranking and the experts who carry it out should be consulted often.
  • Sources of information and interpretation of data. Access to information on the Web happens primarily through search engines. These intermediaries are free, universal, and very powerful, even considering their limitations and shortcomings (limited coverage and subjectivity, lack of transparency, business strategies and secrets, irregular behavior). Search engines are key to measuring the visibility and impact of university websites. There are a limited number of sources useful for “webometric” purposes: 7 general search engines (Google*, Yahoo Search*, Live (MSN) Search*, Exalead*, Ask (Teoma), Gigablast and Alexa) and 2 specialized scientific databases (Google Scholar* and Live Academic). All of them have very large (gigantic) independent databases.
  • Linguistic, cultural, economic, and historical contexts. The project aims to have a truly global coverage, not limiting the analysis to only a few hundred institutions (the world-renowned universities) but including as many organizations as possible. The only requirement in our international ranking is to have an autonomous web presence with an independent domain. This approach allows a large number of institutions to monitor their current ranking and the evolution of their position after adequately modifying their policies and implementing specific initiatives. Universities in developed countries have the opportunity to know precisely the limit of the indicators that distinguish the elite.

Design and weight of the indicators

Methodology used to create the rankings. The unit used for the analysis is the institutional domain, so only universities and research centers with an independent domain are considered. If an institution has more than one primary domain, 2 or more entries are used with the different addresses. Between 5 and 10% of institutions do not have an independent web presence; most of them are in developing countries. Our catalog of institutions includes not only universities but also other higher education institutions, as recommended by UNESCO. The names and addresses have been obtained from both national and international sources, including among others:

  • Universities Worldwide. Available at: [3]
  • All Universities around the World. Available at: [4]
  • Braintrack University Index. Available at: [5]
  • Canadian Universities. Available at: [6]
  • UK Universities. Available at: [7]
  • US Universities. Available at: [8]

University activity is multi-dimensional, and this is reflected in its web presence. So the best way to build the ranking is to combine a group of indicators that measure all these different aspects. Almind & Ingwersen proposed the first web indicator, the Web Impact Factor (WIF), based on a link analysis that combines the number of links from external pages to the website and the number of pages of the website: a 1:1 ratio between visibility and size. This relationship is used for the ranking, but two new indicators are added to the size component: the number of documents, measured as the number of rich files in the web domain, and the number of publications collected in the Google Scholar database.

  • Size (S). Number of pages obtained from 4 search engines: Google, Yahoo, Live Search and Exalead. For each engine, the results are normalized logarithmically to 1 for the highest value. Then, for each domain, the maximum and minimum results are excluded and each institution is assigned a rank according to the combined sum of the remaining values obtained.
  • Visibility (V). The total number of external links received (inlinks) by a site can only be reliably obtained from Yahoo Search, Live Search and Exalead. For each engine, the results are normalized logarithmically to 1 for the highest value and then combined to generate the rank.
  • Rich files (R). The following file types were selected after assessing their relevance to the academic and editorial environment, and their volume of use compared with other formats: Adobe Acrobat (.pdf), Adobe PostScript (.ps), Microsoft Word (.doc) and Microsoft PowerPoint (.ppt). These data were extracted using Google, merging the values obtained for each file type after logarithmic normalization as described above.
  • Scholar (Sc). Google Scholar provides the number of articles and citations for each academic domain. The results obtained from the Google Scholar database include articles, reports and other related material.

The 4 ranks were combined according to the following formula, in which each is assigned a different weight:

  • Visibility (external links) 50%
  • Website size 20%
  • Amount of rich files 15%
  • Indexing in Google Scholar 15%
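A minimal sketch of how these weights might combine the four per-indicator ranks (hypothetical code under the published weights; the actual pipeline is more elaborate, since each indicator is itself built from several engines):

```python
# Weights as published for the four indicators
WEIGHTS = {"visibility": 0.50, "size": 0.20, "rich_files": 0.15, "scholar": 0.15}

def combined_score(ranks):
    """Weighted sum of the four indicator ranks (1 = best position).

    ranks: dict with the same keys as WEIGHTS, values are rank positions.
    A lower combined score corresponds to a better overall position.
    """
    return sum(WEIGHTS[k] * ranks[k] for k in WEIGHTS)

# Example: an institution ranked 10th in visibility, 25th in size,
# 40th in rich files and 15th in Google Scholar
example = {"visibility": 10, "size": 25, "rich_files": 40, "scholar": 15}
score = combined_score(example)  # 0.5*10 + 0.2*25 + 0.15*40 + 0.15*15 = 18.25
```

Because visibility carries half of the weight, a poor inlink rank is hard to offset with good results on the other three indicators.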

Relevance and validity of the indicators. The choice of indicators was made according to several criteria (see note): some try to capture academic and institutional quality and strength, while others aim to promote web publishing and the implementation of “Open Access” initiatives. The inclusion of the total number of pages is based on the recognition of a new global market for academic information, since the Web is the appropriate platform for the internationalization of institutions. A strong, detailed web presence that provides accurate descriptions of the university’s structure and activities can attract new students and scholars from around the world. The number of external links received (inlinks) by a domain is a measure of the visibility and impact of the published material; although the motivations for linking are very diverse, a significant fraction of that activity works in a similar way to bibliographic citation. The success of self-archiving and other repository initiatives is reflected in the rich file and Google Scholar data. The high values obtained for the pdf and doc formats show that not only administrative and bureaucratic reports are involved: academic output is very significant. PostScript and PowerPoint files are clearly related to academic activity.

Measure results in preference to resources. Resource data is relevant in that it reflects the general condition of a given institution and is generally more accessible. Measuring results provides a more accurate assessment of the capacity and/or quality of institutions and their programs. We hope to offer a better balance in the future, but currently we want to draw attention to incomplete strategies, inappropriate policies and bad practices in web publishing before trying to show a more complete scenario.

Balancing the different indicators: current and future evolution. The current rules for the rank indicators, including the weighting model described above, have been tested and published in scientific articles (see note). Research on this topic continues, but the final objective is to develop a model that includes additional quantitative data, especially bibliometric and scientometric indicators.

Collection and processing of data

Ethical standards. We have identified some relevant flaws in the data obtained from search engines, including the under-representation of some countries and languages. Since behavior differs depending on the engine used, a good practice is to combine the results obtained from various sources. Any other error is unintentional and should not affect the credibility of the ranking. Please contact us if you think the ranking is biased or not objective in any respect.
Verified and audited data. The only source of data used to build the Ranking is a small set of globally available, free-access search engines. All results can be replicated following the methodology described, taking into account the explosive growth of web content, its volatility and the erratic behavior of commercial engines.
Collection of data . Data is collected during the same week, in two consecutive rounds for each strategy, selecting the highest value. Each website that is under the same institutional domain is scanned, but no attempt is made to combine content or links from different domains.
Quality of ranking processes. After automatic data collection, positions are manually checked and compared with previous editions. Some of the processes are duplicated, and further checks are added from a variety of sources. Pages that link to the Ranking are explored, and comments from blogs and other forums are taken into account. Finally, the many requests and suggestions received at our email address are acknowledged individually.
Organizational measures to increase credibility . The results of the ranking and the methodologies used are discussed in scientific journals and presented at international conferences. We hope that international bodies of advisers and even supervisors will take part in the future development of the ranking.

Presentation of the Ranking results

Sample of the data and factors involved. The published tables show all the web indicators used, in a very synthetic and visual way. A main ranking grouping the top 4,000 institutions worldwide (Top 4000) is provided, and other regional rankings are also shown for comparative purposes.
Updating and reducing errors . The listings are offered from dynamic pages that connect to various databases where errors can be easily corrected when detected.

Usefulness of the web ranking

If an institution’s web performance is below what is expected according to its academic excellence, university leaders should reconsider their Web policy, promoting a substantial increase in the volume and quality of their electronic publications. On the other hand, if a university appears in the Directory of 20,000 institutions but not in the Ranking of the top 12,000, it should strive to increase the number of quality international academic pages on its website.

 
