40,95 € incl. VAT
Available immediately via download
  • Format: PDF
  • Devices: PC
  • No copy protection
  • Size: 10.35 MB
Product description
This book presents an introduction to the field of applied evaluative informetrics, dealing with the use of bibliometric or informetric indicators in research assessment. It sketches the field's history, recent achievements, and its potential and limits. The book dedicates special attention to the application context of quantitative research assessment. It describes research assessment as an evaluation science, and distinguishes various assessment models, in which the domain of informetrics and the policy sphere are disentangled analytically. It illustrates how external, non-informetric factors influence indicator development, and how the policy context impacts the setup of an assessment process. It also clarifies common misunderstandings in the interpretation of some often used statistics.

Addressing the way forward, the book expresses the author's critical views on a series of fundamental problems in the current use of research performance indicators in research assessment. Highlighting the potential of informetric techniques, it proposes a series of new features that could be implemented in future assessment processes. It sketches a perspective on altmetrics and proposes new lines of longer-term, strategic indicator research.

It is written for interested scholars from all domains of science and scholarship, and especially for all those subject to research assessment: research students at advanced master's and PhD level, research managers, funders and science policy officials, and practitioners and students in the field.


For legal reasons, this download can only be delivered to a billing address in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.

About the author
Henk F. Moed is a former senior staff member and full professor of research assessment methodologies at the Centre for Science and Technology Studies (CWTS) at Leiden University. He obtained a Ph.D. in Science Studies at the University of Leiden in 1989. He has been active in numerous research areas, including: the creation of bibliometric databases from raw data from Thomson Scientific's Web of Science and Elsevier's Scopus; analysis of inaccuracies in citation matching; assessment of the potential and pitfalls of journal impact factors; the development and application of science indicators for measuring research performance in the basic natural and life sciences; the use of bibliometric indicators as a tool to assess peer review procedures; the development and application of performance indicators in the social sciences and humanities; studies of the effects of Open Access on research impact and of usage (downloading) patterns among users of electronic scientific publication warehouses; and studies of the effects of the use of bibliometric indicators on scientific authors and journal publishers.
He has published numerous research articles and is an editor of several journals in his field. He won the Derek de Solla Price Award in 1999. He edited, jointly with W. Glanzel and U. Schmoch, the Handbook on Quantitative Science and Technology Research (Kluwer, 2004), and published Citation Analysis in Research Evaluation (Springer, 2005), one of the very few textbooks in the field.
He developed a new indicator of journal impact, SNIP (Source Normalized Impact per Paper), a so-called "rolling year" journal metric. He is a member of the Board of the International Society for Scientometrics and Informetrics (ISSI). He was a Senior Scientific Advisor at Elsevier for four years, a founder of the Elsevier Bibliometric Research Program (EBRP, which ran until Aug. 2013) and of the Elsevier Metrics Development Program (from 2014), and Director of the Informetric Research Group (2012-2014).
Reviews
"In Applied Evaluative Informetrics the author adopts a didactic approach and a pragmatic perspective, drawing on his extensive knowledge and experience. He has written the book for a general audience of non-experts and only secondarily for the scientometric or informetric community." (David A. Pendlebury, Scientometrics, Vol. 119, 2019)