We are in the thick of the spring semester, and you have papers and projects underway.  You may find that an hour or two in a workshop would help you gain research skills or database finesse for sources beyond the strictly legal.  Libraries all across the Cornell campus offer workshops, on topics ranging from managing data sets to business research, from statistical databases to PowerPoint.

Click here to register and to see details on the workshops offered.

You can also request a consultation with a research attorney for specialized legal research help by filling out this online form.

Google has been busily digitizing the world’s books since 2004.  As of December 2010, some 15 million books have been digitized.  A couple of months ago, Google Labs announced a new tool called Ngram Viewer that allows the user to analyze and graph word usage over time from a 500-billion-word subset of those 15 million books.  Google has divided the 500-billion-word subset into a number of “corpora,” which allow you to track usage of words and phrases in English, American English, British English, and a number of foreign languages including Spanish, French, and German.  “English Fiction” is a particularly intriguing corpus.  The most accurate data are for English-language materials published between 1800 and 2000.

Ngram Viewer makes it possible to track the early appearances of a word or phrase (like “laptop”) in published books, but it’s even more interesting to compare the ascending and descending usage of two or more words or phrases (like “laptop” and “mainframe”) on the same graph.  The graph below compares the usage of the words “Nazism,” “fascism,” and “communism” in English-language works published between 1920 and 2008.  Not unexpectedly, usage of “Nazism” and “fascism” peaks in the 1940s, while usage of “communism” reaches an apex around the time of the Cuban Missile Crisis (1962).  Beneath the graph is a series of year ranges corresponding to each search term entered into Ngram Viewer; clicking on a range runs a search in Google Books for publications within that range of years that include the search term(s) in question.

Ngram Viewer is fun and easy to use.  Once you start experimenting with it, it’s hard to stop!  More detailed information about Ngram Viewer is available at http://ngrams.googlelabs.com/info.  To see an interesting collection of Ngrams submitted by users, go to http://ngrams.tumblr.com/.

CISER, the Cornell Institute for Social and Economic Research, offers free workshops in statistical research methods each semester.  The available workshops include separate classes on statistical software like Stata and SPSS, as well as a session on accessing restricted social science data at Cornell.  View the schedule and sign up here.

Between 1950 and 2008, about one out of every 23 opinions of the U.S. circuit courts of appeals cited at least one article from a law review or law journal.

That is one of the findings of a new article posted on SSRN entitled The Use of Legal Scholarship By the Federal Courts of Appeals: An Empirical Study by David L. Schwartz, a professor at Chicago-Kent College of Law, and Lee Petherbridge, a professor at Loyola Law School Los Angeles.

Surprised by that number?  What’s more, the rate at which U.S. appellate courts cite law journals has been increasing over the past 59 years, from 3.4% of opinions during 1950-1970 to 6.21% of opinions during 1999-2008.  This finding challenges the conventional wisdom that courts have been paying less attention to legal scholarship lately (challenges–but doesn’t disprove).

Is the conventional wisdom just plain wrong?  Was it shaped by several earlier empirical studies that found a decrease in citations (studies done on a much smaller scale)?  The authors of this new study found that about 14% of judges are responsible for about 50% of the citations, but they do not break this statistic out over time.  Could it be that the percentage of judges citing legal scholarship is decreasing, so that fewer judges now cite legal scholarship but those who do cite it more often?  Does the conventional view stem from negative statements about legal scholarship made by Justice Roberts, Judge Posner, and other prominent jurists?  Is it the result of some combination of these, or is something else going on?

The article makes several suggestions for future research, such as: how do judges really use legal scholarship?  As the authors point out, the methodology of the study isn’t adequate for making fine-grained observations.  Then there is the even more difficult question: how should legal scholarship be used by the judicial system?  Knowing the answers to these questions will help lawyers and academics be more effective in 1) citing legal scholarship in pleadings submitted to the court; and 2) producing legal scholarship.

As a side note, I was really impressed with the Westlaw search query the authors used to find opinions that cite law review and journal articles:

da(YYYY) & ((“l.j.” “l. rev.” “l.rev.” “j.l.” “law review”) /10 (20** 19** 18**)) % ((j.l. /4 v.) ti((j. /2 l.) lj jl j.l. l.j.) (at(lj jl l.j. j.l.)) (“nat! l.j.” “national law journal”))

The first part of the query looks for opinions published during a given year that cite law reviews or journals.  The second part, after the % (BUT NOT) operator, prevents the query from retrieving cases where “L” and “J” appear as someone’s initials, as well as citations to the National Law Journal (not an academic publication).  The query is not perfect, but it is about as close as you can reasonably get.
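For readers curious how that kind of filtering works outside Westlaw, here is a rough, purely illustrative sketch of the same include-then-exclude logic in Python regular expressions.  This is an assumption-laden approximation, not the authors’ method: Westlaw’s proximity operators (like /10) and field restrictions (ti(), at()) have no exact regex equivalent, and the function and pattern names below are invented for this example.

```python
import re

# Include rule: a law review/journal abbreviation within roughly ten words
# of a four-digit publication year (loosely mirroring the "/10" proximity).
CITE = re.compile(
    r'(l\.\s?j\.|l\.\s?rev\.|j\.\s?l\.|law review)'   # journal abbreviation
    r'(?:\W+\w+){0,10}?\W+'                           # up to ~10 intervening words
    r'(?:18|19|20)\d{2}',                             # a year: 18xx, 19xx, or 20xx
    re.IGNORECASE,
)

# Exclude rule, loosely mirroring the "% (BUT NOT)" clause: initials such as
# "J. L. Smith, v. 2" and references to the National Law Journal.
EXCLUDE = re.compile(
    r"(j\.\s?l\.\s+\w+,?\s+v\.|nat(?:'l|ional)\s+l(?:aw)?\.?\s?j(?:ournal)?\.?)",
    re.IGNORECASE,
)

def cites_law_review(opinion_text: str) -> bool:
    """Return True if the opinion text appears to cite a law review article."""
    return bool(CITE.search(opinion_text)) and not EXCLUDE.search(opinion_text)
```

As in the Westlaw query, the heavy lifting is in the exclusions: the broad include pattern would otherwise sweep in personal initials and trade publications alongside genuine academic citations.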

Hat tip to Legal Informatics Blog for the SSRN posting.
