{"id":3990,"date":"2017-07-14T05:25:32","date_gmt":"2017-07-14T10:25:32","guid":{"rendered":"http:\/\/blog.law.cornell.edu\/voxpop\/?p=3990"},"modified":"2017-07-14T05:25:32","modified_gmt":"2017-07-14T10:25:32","slug":"the-debate-on-research-quality-criteria-for-legal-scholarships-assessment-some-key-questions","status":"publish","type":"post","link":"https:\/\/blog.law.cornell.edu\/voxpop\/2017\/07\/14\/the-debate-on-research-quality-criteria-for-legal-scholarships-assessment-some-key-questions\/","title":{"rendered":"The debate on research quality criteria for legal scholarship\u2019s assessment: some key questions"},"content":{"rendered":"

\"\"<\/a>Ginevra Peruginelli (Institute of Theory and Techniques of Legal Information of the National Research Council of Italy) <\/span><\/i><\/p>\n

[Ed. note: This instalment of our 25-for-25 looks, at first, like a bit of a departure for us — it talks about different methods of evaluating legal scholarship. But with a little reading-between-the-lines, it’s not hard to see how well it ties in with questions that are very present for American legal experts. The problem of evaluating the quality of legal expertise expressed, consumed, and commented upon in different online environments — blogs such as this one, online commentary, and nontraditional channels of all kinds — is a stubborn one that is gaining increased attention. How do you measure the quality of scholarship, or its impact? Other disciplines have struggled with this, as reliance on particular publication vehicles becomes obsolete in the face of new methods of dissemination, community discussion, and response. It is high time that we looked at legal scholarship as well. Of late, law librarians interested in so-called “alt-metrics” have begun to.]

The evaluation of the quality of legal publications is now at the center of the debate in legal academia in Europe (among others, Flückiger and Tanquerel 2015). Nowadays, in principle, peer review remains the preferred method for assessing the quality of legal scholarship; this is partly due to the failure of a purely metrics-based system in this area. In the legal sciences, where research output is usually produced in long written texts, research performance is hard to assess using quantitative indicators: bibliometric methods are not sufficiently capable of measuring research performance in legal scholarship and are not considered trustworthy by the legal community.

In 1992 Edward L. Rubin, professor of law at Vanderbilt University Law School, argued that there is no theory of evaluation for the legal sciences (Rubin 1992). He stated that what actually leads legal academics to assess a work is an undefined notion of quality judgment. This creates a number of conceptual and practical difficulties that produce confusion and unease in the field. It is a matter of fact that many of the most heated discussions on legal scholarship concern the evaluation process, and a significant number of these are repetitive and unproductive because of the total absence of an evaluation theory. Rubin directly tackles the question of what the foundation for evaluation should be and recommends an epistemological approach to formulating an evaluation theory. His writings raise several interesting issues: the need for criteria such as clarity, persuasiveness and significance, and the need for evaluators to take their own uncertainty into account, especially when a topic lies somewhat outside their own discipline.

A strong debate is still going on over criteria, and even over the possibility of objective, reliable evaluation in the law domain; major critical issues remain and no innovative solutions have yet been brought forward.

According to one part of the literature (van Gestel and Vranken 2011; van Gestel 2015; Gutwirth 2009; Epstein and King 2002; Siems 2008), it is possible to identify some critical issues at the core of the debate on legal research assessment at the European level. These are reported below in the form of questions and comments based on the current debate.

(a) Following the research assessment exercises of various European countries, content-based criteria such as originality, significance and societal impact are adopted. Is there general consensus on the value and interpretation of such criteria?

Depending on the type of research, the literary genre and the area of law, the application of the above content-based quality criteria can differ critically. Legal scholarship dedicated to interpreting recent case law or a legal provision has more difficulty fulfilling the standard of originality than theoretical research on general concepts, problems and principles of the law (Siems 2008). Similar difficulties arise in evaluating criteria such as internationalization and societal impact, particularly in fields of law that are not part of the international arena in terms of relevance, competitiveness and approval by the scientific community, including explicit collaboration with researchers and research teams from other countries.

(b) Is it possible to assess legal research on the basis of bibliometric evaluation techniques that are more or less widely accepted in other scientific disciplines?

Of course such alternatives should be thoroughly analyzed, taking into account their methodological justification in legal research. Although peer review remains the best way to assess legal research and scientific publications, its time-consuming process, the scarce availability of reviewers with expertise in this domain and the increasing demand for evaluation of research outputs limit the peer review method in legal science. Moreover, background figures that can be used to support the allocation of funds are being requested more and more by governments and policymakers (Gutwirth 2009). This situation has created a need for quantitative measurements of scientific output as support tools for peer review. Performance indicators used in the assessment of the exact sciences are now a strong part of the debate on how to evaluate non-bibliometric areas such as law. However, simply adopting the criteria, evaluation processes and methods used in other sciences is not a good solution. It would be more appropriate to create transnational standards for legal research quality assessment, taking into account the actual internationalization of research in this area, the increasing mobility of students and the development of international law schools. The establishment of harmonized standards or of generally accepted quality indicators is a challenge to be met, despite the differences between national assessment methods, publishing cultures and academic traditions.
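To make concrete the kind of performance indicator at stake, here is a minimal sketch (purely illustrative, not drawn from the literature cited above; the citation counts are hypothetical) of the h-index, one of the quantitative measures routinely used in the exact sciences and whose transfer to legal scholarship is precisely what is being debated.

```python
# Illustrative sketch only: the h-index, a bibliometric indicator common in the
# exact sciences. The citation counts below are invented for demonstration.

def h_index(citations):
    """Return the largest h such that h publications each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one researcher's publications.
print(h_index([12, 9, 7, 4, 2, 1, 0]))  # -> 4
```

A researcher who mainly produces long written texts in a national language, as is common in law, would score poorly on such an indicator regardless of the quality of the work, which illustrates one reason the legal community distrusts such measures.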

(c) How reliable is peer review?

Finding highly qualified peer reviewers is a difficult task when a pre-selection has to be performed, and it is not always clear how reviewers are recruited and selected (Lamont 2009). Besides that, subjectivity, unconscious biases and prejudices are impossible to eliminate. Honesty, accountability, openness and integrity are vital qualities for all reviewers, who should be able to pursue their work in an atmosphere free from prejudice. In addition, if we look at the problem from the point of view of legal journals and their publishing practices, it is important to reach clarity and consensus within editorial boards about the way criteria are used and decisions are taken. Editorial boards should follow a well-documented procedure and make it clear to their audience (van Gestel and Vranken 2011). It is also up to the editorial boards to check that submitted papers include a clear explanation of the research question and the research design. Importantly, submissions dealing with comparative law issues should explain which jurisdictions are taken into account and which methods of analysis are employed. In several European countries there is no common policy framework for articles submitted to national law journals: every journal or publisher follows its own practice in assessing the quality of legal research outputs.

(d) What are the advantages and disadvantages of law journal rankings?

Over the past few years, legal academics and their institutions have become obsessive about the star ratings of the journals in which they publish. On the one hand, the ranking of journals gives university management a convenient method of assessing research performance; on the other hand, research evidence suggests that journal ranking is not a good proxy for the value and impact of an article. Moreover, when journal rankings are based on journal citation scores, the number of citations that a journal receives in other periodicals is a very indirect indicator of the quality of an article in that same periodical.
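A small numeric sketch helps show why such journal-level citation scores are so indirect (the figures are invented for illustration and do not come from the article): a single heavily cited article can dominate a journal’s average, while the typical article in the same journal receives very few citations.

```python
# Illustrative sketch only: an impact-factor-style journal score computed as the
# mean number of citations per article, using invented citation counts.

from statistics import mean, median

# Hypothetical citation counts for ten articles published in one journal.
article_citations = [120, 15, 6, 3, 2, 1, 1, 0, 0, 0]

journal_score = mean(article_citations)      # the journal-level average
typical_article = median(article_citations)  # what a typical article actually receives

print(f"journal-level score: {journal_score:.1f}")   # 14.8
print(f"median article:      {typical_article:.1f}") # 1.5
```

The journal-level figure says almost nothing about any individual article published there, which is the point made above about citation-based journal rankings.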

In particular, the law journal ranking system encourages a situation in which academics become more interested in publishing in specific high-impact journals than in doing research that is of real value. Moreover, highly qualified researchers are pushed to publish in high-impact journals abroad, and it is no surprise that national periodicals suffer from a lack of top-level submissions. Over a longer period, this could threaten the very existence of local scientific legal periodicals.

The idea of a European ranking of law journals represents a great challenge because it would require a cross-border classification of journals. A multilingual law journal database would be an important achievement, reflecting the differences between legal cultures and jurisdictions (van Gestel 2015).

(e) Is the relation between legal science and legal practice important for research assessment?

Nowadays a close connection exists between legal science and legal practice, given that both rely on similar instruments of analysis, practical argumentation and reasoning. Legal science is both the science of law and one of the authoritative and influential sources of that law. This is why legal science and legal practice are so closely correlated. As a result, legal science has to pass two “exams”: a quality test within legal academia, which first evaluates its robustness as scientific research, and second an assessment of its pertinence and relevance to legal practice. These overlapping dimensions produce legally relevant knowledge, and both should be considered in the process of evaluating legal science (Gutwirth 2009).

(f) Is the harmonization of legal research assessment exercises at the European level desirable in the years to come?

Legal research could take advantage of the delay it has experienced, compared with the evaluation procedures already developed and carried out for the other social sciences, by initiating a scientific debate on the benefits and disadvantages of the various quality evaluation systems. The goal would be eventually to promote uniformity in the definition of indicators and standards (van Gestel and Vranken 2011).

These are some of the key questions that are most likely to form a framework for future debate, not only because they can promote lively discussion, but also because they are capable of involving countries that have only recently addressed the question of legal research assessment. Legal scholars within each country are the main actors in this discussion. In particular, quality indicators should not be imposed upon legal scholars from a top-down perspective, and transparency as well as accountability are to be valued in the legal evaluation process so as to build a strong evaluation culture.

References:

Epstein L. and King G. (2002). The Rules of Inference, 69 Chicago Law Review 1–209.

Flückiger A. and Tanquerel T. (2015). L’évaluation de la recherche en droit / Assessing Research in Law: Enjeux et méthodes / Stakes and Methods. Bruxelles, Bruylant.

van Gestel R. (2015). Sense and Non-sense of a European Ranking of Law Schools and Law Journals. Legal Studies, 35: 165–185. doi: 10.1111/lest.12050.

van Gestel R. and Vranken J. (2011). Assessing Legal Research: Sense and Nonsense of Peer Review versus Bibliometrics and the Need for a European Approach, German Law Journal, Vol. 12, no. 3, pp. 901–929.

Gutwirth S. (2009). The evaluation of legal science. The Vl.I.R.-model for integral quality assessment of research in law: what next? In It Takes Two to Do Science: The Puzzling Interactions between Science and Society, Brussels. Available at: http://works.bepress.com/serge_gutwirth/16/

Lamont M. (2009). How Professors Think: Inside the Curious World of Academic Judgment, Harvard University Press, 336 pp.

Rubin E.L. (1992). On Beyond Truth: A Theory for Evaluating Legal Scholarship, 80 California Law Review 889–963. (Reprinted in Readings in Race and Law: A Guide to Critical Race Theory, Alex Johnson, ed., West, 2002.)

Siems M.M. (2008). Legal Originality, 28 Oxford Journal of Legal Studies 174.


Ginevra Peruginelli is a researcher at ITTIG-CNR. She has a degree in Law and a Ph.D. in Telematics and Information Society from the University of Florence, and received her Master’s degree in Computer Science from the University of Northumbria, Newcastle. Since 2003 she has been entitled to practice as a lawyer.
She has been involved in several projects at the European and national level, such as the NiR (Norme in Rete – Legislation on the Net) portal, MINERVA (Ministerial Network for Valorising Activities in Digitisation), DALOS (Drafting Legislation with Ontology-based Support), CARE (Citizens Consular Assistance Regulation in Europe) and e-Codex (e-Justice Communication via Online Data Exchange). She has also worked in a research project promoted by the Publications Office of the EU concerning interoperability between the Eurovoc thesaurus and other European thesauri. In 2004 and 2006 she won two CNR research fellowships as a visiting scientist at the Institute of Advanced Legal Studies of the University of London and the Centre de recherche en droit public at the Faculty of Law of the University of Montréal.
Ginevra is the editor-in-chief of the Journal of Open Access to Law, a joint effort of ITTIG, the Autonomous University of Barcelona’s Center for Law and Technology, and the Legal Information Institute.
