Tuesday, January 18, 2011

On quality evaluation of scientific publications

Yesterday, the entire Etzion tribe visited the Technion to attend the ceremony in which my daughter Anat received a plaque for being on the president's list.    Since I don't yet have pictures from that ceremony, I am posting pictures of the Technion president, Professor Peretz Lavi, and of Anat doing some non-academic tasks while renovating her apartment.

Talking about academia: in a recent posting on the complexevents forum, Rainer posted a reference to a European project that aims to take a new approach to scientific publications.   In the relatively early days of the Internet (sometime in the 1990s) there was a panel at one of the conferences (I think it was VLDB in Zurich) about the future of scientific publications in the Internet era.   The observation was that a person can now simply post papers on the web and make them available to everybody, instead of publishing them in scientific journals (or conference proceedings, a computer science anomaly - most disciplines don't view those as real publications).   However, the peer-reviewed publication process also serves as "quality control"; moreover, in the academic world, promotion decisions largely depend on publications (how many? where? how many citations?), which leads to metrics like the H-index and the G-index.   Since our world is increasingly metrics-oriented and everything is measured, satisfying the metrics becomes the goal, and the actual work becomes the means (instead of vice versa); this is, by the way, true for other areas as well - not just scientific publications - but here we are discussing scientific publications.   The original proposal at that panel, as far as I remember, was that everybody would freely publish papers on the web but obtain certification from peer-review institutes, whose business model would be that the author pays for the review and the institute pays the reviewers; authors could apply for several levels of certification (grade A certification, grade B certification, etc.).    This did not actually happen; the scientific journals are still alive, although most access today is through digital libraries rather than hard copies.
I used to have many subscriptions to hard-copy journals, and when the IBM Haifa Research Lab moved to its current building 10 years ago, I thought it was a good time to throw away all the journals (keeping only a few issues in which my own papers were published) and move to digital library subscriptions.
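For readers who have not run into it, the H-index mentioned above has a simple definition: a researcher has an h-index of h if h of their papers have at least h citations each. A minimal sketch in Python (the citation counts in the example are made up purely for illustration):

```python
def h_index(citations):
    """Compute the h-index: the largest h such that at least
    h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank   # this paper still supports a larger h
        else:
            break      # sorted descending, so no later paper can
    return h

# Made-up citation counts for five papers:
print(h_index([25, 8, 5, 3, 3]))  # 3 papers have at least 3 citations each
```

Note that the computation uses only the raw counts, which is exactly the weakness discussed below: it cannot distinguish an approving citation from a dismissive one.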

Now, the European project mentioned above tries to reinvent the publication model, claiming that the peer-review system has failed: there is low correlation between the grade a paper receives from the three researchers who happen to be selected as reviewers and the paper's actual impact. The project proposes to replace peer review with the "wisdom of the crowd" - everybody can comment on and endorse a paper.

As a side comment, I never liked the metric of counting citations (it always reminds me of the old metric of counting lines of code). It is certainly a metric of visibility, but not necessarily of quality. Let's assume that somebody cites my paper saying, "this paper attempted to approach the same issue, but its solution was based on faulty assumptions" - this counts as one citation, exactly the same as another quote saying, "we are basing our approach upon the seminal work done in that paper". Also, when citation counts became popular in evaluation processes, they encouraged the formation of cliques that cite each other. None of this is reflected in the metric.

Since I work in an industrial research institute and not in a university, my interest is in evaluating the impact of a scientific paper on the world, where citations are only one factor (and maybe, in some cases, a minor one).   Codd's paper on relational databases has been cited by 1844 other papers (according to the Microsoft Academic site), but it is obvious that this is a minor consideration in its impact on the world.
I think that one of the challenges is to measure this impact, which often can be done only in retrospect.

More thoughts about it - later.
