The other day, Vladimir Sklyar (@Vladimir_Sklyar) published two articles about the academic segment of the Internet: one and two. I started writing a comment... and got carried away. The result is this very detailed response.
First of all, I want to thank Vladimir for the interesting articles and for raising the topic. To someone taking their first steps in the academic world, it is very interesting and seems important (although I understand this topic is not the most significant one for Habr as a whole).
For all the pleasure of reading the articles, their wonderful style and apt generalizations (I especially liked the section "What is the reason for such inattention to this important component of scientific work?"), I was left with the feeling that a great deal remained unsaid. In my opinion, Vladimir touched only the tip of the iceberg. The rest of this comment is divided into additions and clarifications.
ResearchGate has a very mixed reputation in the academic community, earned largely by a policy of aggressively imposing itself on the world (for example). Besides ResearchGate, there is the more "academic" Academia.edu. But let us not start a flame war; the point is simply that there are far more options than RG. In my superficial judgment, the RG score is very often misleading, not to mention the endorsement system. The Q&A on RG is, for the most part, a collection of questions that many students are embarrassed to ask (cf. the author's words in the second article: "There are functions for locating professional questions of network participants and answers to them.").
Beyond RG and Academia, there are plenty of options for posting the full texts of your articles in good repositories. To start with, the famous arXiv. It appeared in the early 1990s and immediately set the bar that many imitators still aim for. For many, the snag is that this portal specializes narrowly in the technical sciences. Among notable contemporaries, one should probably mention the lightning-fast rise of bioRxiv and the very "omnivorous" Figshare. A common feature of all these archives is that uploaded files cannot be deleted; that is, think about whether a draft is ready to be shown before you litter the web with it. Narrower systems for the social sciences are RePEc and SSRN. Both are highly respected but, in my opinion, a little stuck in the past.
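Incidentally, most of these repositories are machine-readable as well as browsable. arXiv, for instance, exposes a public Atom API at export.arxiv.org. A minimal sketch of building a query URL for it (the endpoint and parameter names are arXiv's real ones; the search term below is just an illustrative placeholder):

```python
from urllib.parse import urlencode

# arXiv's public API endpoint; fetching the URL returns an Atom XML feed
ARXIV_API = "http://export.arxiv.org/api/query"

def arxiv_query_url(search: str, start: int = 0, max_results: int = 10) -> str:
    """Build a URL for arXiv's API query interface."""
    params = urlencode({
        "search_query": search,
        "start": start,
        "max_results": max_results,
    })
    return f"{ARXIV_API}?{params}"

# e.g. the first five records matching a full-text search term
print(arxiv_query_url("all:preprint", max_results=5))
```

Fetching the resulting URL with any HTTP client returns a standard Atom feed that can be parsed with an ordinary XML library.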
What else should be mentioned? Publisher and journal policies on posting preprints and postprints. There are countless variations. Fortunately, there is a project that aggregates all this information in a convenient form: RoMEO.
And of course, such a review cannot ignore the topic of personal websites. Everyone should have their "own harbor", where the availability and format of the information depend only on you. Today, building your own page is technically feasible for any user, and there are many options. For example, my site. Or here is a fairly old researcher's site, also built on Google Sites: very simple, but everything important is in place. And there are plenty of excellent sites on more modern platforms. A few examples: a WordPress site; one more; and here is a great site built with GitHub Pages and Jekyll. There is no limit to perfection. In the foreseeable future, I plan to learn how to build a static site on GitHub Pages using R. In general, GitHub is a separate world and a separate topic for discussion: not only programmers but also many researchers can use it to organize their projects, present them beautifully, and make them reproducible (!).
By the way, one cannot fail to mention an excellent domestic development: ISTINA from Moscow State University. It is open to all researchers, not only those from MSU. Minimalism and a bunch of beautifully implemented features. For authors who publish in languages other than English, it is a very good option.
And it should probably be noted that one does not live by the h-index alone. There is a curious project, Altmetric, which measures an article's impact in the media sphere (incidentally, a growing number of publishers are integrating these metrics into their sites, for example T&F and Wiley).
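Altmetric data can also be pulled programmatically: the project offers a free public API keyed by DOI. A minimal sketch (the endpoint is Altmetric's real one; the DOI and the `score` field lookup are illustrative, based on the JSON that API version returns):

```python
# Altmetric's free public API: GET this URL to receive a JSON object
# describing an article's attention in news, blogs, Twitter, etc.
ALTMETRIC_API = "https://api.altmetric.com/v1"

def altmetric_url(doi: str) -> str:
    """URL of the Altmetric details record for a given DOI."""
    return f"{ALTMETRIC_API}/doi/{doi}"

def attention_score(details: dict) -> float:
    """Pull the headline attention score out of a parsed API response."""
    return float(details.get("score", 0.0))

# illustrative placeholder DOI
print(altmetric_url("10.1234/example.doi"))
```

Fetching the URL with any HTTP client and passing the parsed JSON to `attention_score` gives the single headline number that publishers embed as the Altmetric "donut".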
"Unfortunately, the national journals of the Russian segment do not strive to enter Web of Science, Scopus, or even Google Scholar."
There is no process for "getting into" Google Scholar: GS indexes everything it finds in PDF format. Also, in the section on GS in the second article, the author praises the service and in particular writes, "Obviously, this service is evolving at the moment." In my opinion, this is far from the truth. If Google had invested even 10% of the effort spent on the stillborn Google+ into Scholar, all other services (as well as discussions and reviews like this one) really could have become unnecessary. Today, GS indexes heaps of garbage and makes no distinction by quality of material or even reliability of the source. Web of Science and Scopus at least try to filter out the slag. They do not always succeed, but in general the chance of encountering slag in their indexes is small, especially if you focus on the top journals as measured by their own metrics, for example SJR. Sometimes GS makes outright academic frauds look quite impressive. For example, I stumbled upon a character who has more than 1,000 citations in GS and only 8 (eight!) in Scopus. Here is a typical list of citations of one of his articles in GS. In my opinion, comments are superfluous.
Moreover, the quoted phrase itself seems to me monstrously far from reality. Our journals are clawing their way into Web of Science and Scopus with all their might. Few succeed. And it is not always the best ones that get in. But that is a separate conversation.
The Russian Science Citation Index from Web of Science also deserves a mention. Thomson Reuters decided to select the top 1,000 Russian journals and include them not in the Core Collection but in a separate database. In the end, about 650 were selected. And just think: the VAK list contains more than 2,000 titles. That says something about the relevance of VAK as a measure of journal quality.
A little more about Google Scholar. In the second article, describing the GS functionality, Vladimir writes: "The Alerts, in my opinion, are not so important." I sharply disagree. In my opinion, this is the most adequate option currently available on the Internet for following the publications of specific authors. Indispensable.
Also from the second article, about GS: "... and some probability of connecting the Russian-speaking segment in the near future." The Russian-speaking segment has existed for a long time. See, for example, this profile.
"My publications in Web of Science turned out to be fewer than in Scopus, although, strictly speaking, this is a question of promotion strategy."
This is not at all surprising: Scopus indexes considerably more broadly, and the WoS criteria are tougher. As a rule, everything that is in WoS is also in Scopus, but not vice versa.
ORCID. Mentioned at the very end of the second article as a plan for the future. This is an absolute must of the modern academic world. Some journals, as well as, for example, bioRxiv, strongly recommend that all authors obtain an identifier at the earliest opportunity.
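An ORCID iD, by the way, is not an arbitrary 16-character string: its last character is an ISO 7064 MOD 11-2 check digit, so typos can be caught locally without calling any service. A sketch of that check (the sample iD is the well-known example from ORCID's own documentation):

```python
def valid_orcid(orcid: str) -> bool:
    """Verify the ISO 7064 MOD 11-2 check digit of an ORCID iD
    written as 0000-0002-1825-0097 (hyphens optional)."""
    chars = orcid.replace("-", "")
    if len(chars) != 16:
        return False
    total = 0
    for ch in chars[:-1]:          # first 15 characters must be digits
        if not ch.isdigit():
            return False
        total = (total + int(ch)) * 2
    remainder = total % 11
    check = (12 - remainder) % 11  # 10 is written as the letter X
    expected = "X" if check == 10 else str(check)
    return chars[-1] == expected

print(valid_orcid("0000-0002-1825-0097"))  # sample iD from ORCID's docs
```

A journal submission form (or your own reference manager) can run exactly this check before accepting an author's iD.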
I really hope the MIT PubPub project takes off. It is a potential revolution in the world of academic publishing. The open review system alone is worth a lot! It is funny that after reading the explosive article about Sci-Hub and another one about the consequences of the distribution system, I was already thinking about how to reform the system of publishing academic work. PubPub is all my dreams, only cooler. Really, why is peer review, the institution on which science in its modern form rests, the most thankless and least rewarded part of academic work? Believe in PubPub!
Bibliometrics is important, and there is no getting around it. On the whole, that is probably still a good thing. But excessive focus on quantitative parameters also leads away from the truth. In Russia (judging by HSE), there is a ratings boom typical of developing countries (the same happened in South Korea and China). Given the numerous imperfections of existing systems and institutions, it is worth refraining from radical judgments based on bibliometrics. In particular, the passage from the first article ("I, for example, met one person at a conference who had as much as 50 h-index in Scopus. 50 publications, each of which was quoted at least 50 times, wow!") seems to me an illustration of this potentially dangerous quantitative approach. Not all truly great researchers produced even 50 significant texts in their lives. And not all authors of scientific articles make a significant contribution to writing them (here is a funny link). So be careful.
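For reference, the h-index from the quote is straightforward to compute from a list of per-paper citation counts: it is the largest h such that the author has h papers cited at least h times each. A minimal sketch:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# the conference acquaintance from the quote: 50 papers cited 60 times each,
# plus two barely-cited ones -> h-index of 50
print(h_index([60] * 50 + [3, 1]))
```

Note how the metric says nothing about whether any single paper was important, or how large the author lists were, which is precisely the limitation discussed above.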
Let me give an example of the kind of trap that bibliometric fever can lead into (from my own field, demography, of course). If one strictly follows the logic of bibliometrics, then among domestic demographers one cannot fail to notice A. Korotayev. Meanwhile, I would describe his and his team's activity in the demographic field as "inflating bibliometric bubbles." Why this is so can be seen in a short article by the remarkable Russian demographer Evgeny Andreev. Promoting unreliable results in the media sphere can only harm science, because it distracts public attention from genuinely worthwhile work and elevates incompetent authors to the rank of "experts." So not everything that floats to the surface deserves a close look.
I want to emphasize that in this example I am speaking only about the demographic work, whose level I can assess to some extent. I am not competent to judge Korotayev's anthropological, historical, and political science work. However, I am sure that there is such a thing as a researcher's overall level of integrity and academic honesty. The scientific world is built on trust, and I would be wary of relying on the opinions of those who undermine it.
And finally, in conclusion: I sincerely recommend that everyone read the appeal to the scientific community from the bibliometricians of Leiden University (who, by the way, are considered the best in the world). The experts urge us not to fixate on the numbers!
Source: https://habr.com/ru/post/306790/