In a previous article on LLRX, I raised a question that many legal research instructors have perhaps asked themselves in brief, silent moments of doubt and despair: can legal research actually be taught? The question arose from three notions: (1) legal research involves finding relevant information; (2) to determine the relevance of a particular document, the researcher must already know something about the substance of that area of law (loosely termed the relevance paradox); and (3) because legal research instructors cannot teach all the substantive law needed to research every possible issue, we must figure out how to teach students to teach themselves the subject-matter knowledge they need to determine relevance for a particular issue. That article ended with a plea for legal research instructors to devote some thought to the idea of relevance so that we can cover this topic more effectively with our students.
A good first step in further developing our thinking on relevance is to be clear on what kind of relevance is important (or relevant!) for our purposes. Relevance is a concept that has received a fair amount of attention in the field of information science, generally, and information retrieval, in particular. In 1975, Tefko Saracevic reviewed the extant literature on relevance across several disciplines and developed a framework of five different views of relevance. His work, and that of Birger Hjorland, informs much of what follows.
Consider any sort of information retrieval system – a database, catalog, or index. The question of relevance relates to how the retrieval mechanism identifies relevant documents, whether through an algorithm, a controlled vocabulary, a classification scheme, or some combination of these. For the instructor, this view of relevance may be worth discussing with students. Knowing a bit about how the system determines relevance (Saracevic’s systems relevance) helps a researcher understand the system, but it does not tell you whether a given document is actually relevant to a given problem – only that the system identifies it as relevant to your query.
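To make the systems view concrete, here is a minimal sketch of how a generic retrieval system might score documents against a query by term overlap (TF-IDF weighting plus cosine similarity). The documents and query are invented for illustration, and this is a textbook technique, not a description of how Westlaw, Lexis, or any other service actually ranks results.

```python
# Toy illustration of "systems relevance": a retrieval system scores documents
# against a query by term overlap (TF-IDF weighting + cosine similarity).
# Generic textbook technique with invented documents -- not any vendor's algorithm.
import math
from collections import Counter

DOCUMENTS = {
    "doc1": "adverse possession requires open and notorious use of land",
    "doc2": "a statute of limitations bars stale claims against land owners",
    "doc3": "the court held the possession was neither open nor continuous",
}

def tf_idf_vectors(docs):
    """Build a TF-IDF vector (term -> weight) for each document."""
    doc_terms = {name: Counter(text.lower().split()) for name, text in docs.items()}
    n_docs = len(docs)
    df = Counter()                      # document frequency of each term
    for terms in doc_terms.values():
        df.update(terms.keys())
    vectors = {}
    for name, terms in doc_terms.items():
        total = sum(terms.values())
        vectors[name] = {
            term: (count / total) * math.log(n_docs / df[term])
            for term, count in terms.items()
        }
    return vectors

def cosine(vec_a, vec_b):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * vec_b.get(t, 0.0) for t, w in vec_a.items())
    norm_a = math.sqrt(sum(w * w for w in vec_a.values()))
    norm_b = math.sqrt(sum(w * w for w in vec_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def rank(query, docs):
    """Rank documents by similarity to the query -- the system's notion of relevance."""
    vectors = tf_idf_vectors(docs)
    q_counts = Counter(query.lower().split())
    q_total = sum(q_counts.values())
    q_vec = {t: c / q_total for t, c in q_counts.items()}
    return sorted(((cosine(q_vec, v), name) for name, v in vectors.items()), reverse=True)

if __name__ == "__main__":
    for score, name in rank("open and notorious possession", DOCUMENTS):
        print(f"{name}: {score:.3f}")
```

The point of the sketch is simply that the system’s “relevance” is a score computed over the text it indexes, which is why a top-ranked result can still be useless for the researcher’s actual problem.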
Obviously, the person using the system must judge the relevance of the documents that appear in the search results. This happens on an individual basis, as the user applies her knowledge, reasoning, and intuition to decide whether something is relevant. Students may find it interesting to hear about the factors the instructor weighs when determining relevance (Saracevic calls this destination relevance; Hjorland’s user relevance is essentially the same concept), but this does not necessarily translate into a set of universal practices that students can apply in every situation. Students need to be able to draw upon more than just their individual instructor’s knowledge.
The expert user’s relevance determination may draw upon his knowledge of the existing documents in a subject area. That knowledge can be represented through citation analysis and other bibliometric measures – a document that is heavily cited by other well-cited documents may signal its relevance to a particular topic (Saracevic’s subject literature relevance). This view of relevance can be folded into the systems view (e.g., Ravel’s visualizations of case searches). But however effective such quantitative measures are at showing that a document is relevant, they cannot teach a student why it is relevant.
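As an illustration of the citation-analysis idea, the sketch below scores cases in a made-up citation graph, weighting citations from well-cited cases more heavily (a simplified, PageRank-style iteration). The case names and graph are hypothetical, and this is not how Ravel or any commercial tool actually computes its rankings or visualizations.

```python
# Toy sketch of "subject literature relevance": score each case by the citations
# it receives, weighting citations from well-cited cases more heavily
# (a simplified, PageRank-style iteration over an invented citation graph).

# citing case -> cases it cites (hypothetical data for illustration only)
CITATIONS = {
    "Case A": ["Case C"],
    "Case B": ["Case A", "Case C"],
    "Case D": ["Case B", "Case C"],
    "Case C": [],
}

def citation_scores(citations, damping=0.85, iterations=50):
    """Iteratively propagate authority through the citation graph."""
    cases = list(citations)
    score = {c: 1.0 / len(cases) for c in cases}
    for _ in range(iterations):
        new = {c: (1 - damping) / len(cases) for c in cases}
        for citing, cited_list in citations.items():
            if cited_list:
                share = damping * score[citing] / len(cited_list)
                for cited in cited_list:
                    new[cited] += share
        score = new
    return score

if __name__ == "__main__":
    for case, s in sorted(citation_scores(CITATIONS).items(), key=lambda kv: -kv[1]):
        print(f"{case}: {s:.3f}")
```

Such scores can show that a document matters within the literature; they say nothing about the legal reasoning that makes it matter.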
To understand the “why” of a document’s relevance, the student must understand why a given document occupies the role it does within the subject literature. Ultimately, this is a question of how knowledge in a particular discipline is created, disseminated, and organized (subject knowledge relevance). Knowledge of the content of a discipline is, of course, helpful in determining the relevance of a particular document, but an effective relevance determination rests on a theory of what counts as knowledge or, in legal practice, what counts as legally valid. Understanding that lets you make the connections between what you know, what the system is showing you, and what the subject literature says.
A lawyer cannot adequately determine the relevance of a document without knowing the difference between a case and a statute, where each is published, and how to make sure each is still good law. These are all questions of legal bibliography – things legal research instructors should already be teaching – but they form a necessary substrate for judging the relevance of a case or other source. More than this, however, a lawyer’s relevance determination draws on her understanding of legal argumentation and authority and on her ability to read a case or statute and to interpret, analyze, and synthesize the law. These latter questions are generally of the sort treated in a legal writing or legal analysis course.
However, just because those ideas are discussed in a different course does not mean they have no place in legal research instruction. On the contrary, legal research instructors should discuss the role that legal analysis and argument play in determining the relevance of legal documents. Likewise, to the extent that legal research is taught as part of an integrated legal research and writing class, the instructor must make clear the connections among legal bibliography, legal analysis, and legal research.
Legal research can be taught, but only to the extent that it is connected to deeper questions about what the law is, how it is created, and how it develops. Cultivation of that level of perspective promotes the understanding necessary to find the relevant information.
References
Birger Hjorland, The Foundation of the Concept of Relevance, 61 J. Am. Soc’y Info. Sci. & Tech. 217, 225 (2010).
Tefko Saracevic, Relevance: A Review of and a Framework for the Thinking on the Notion in Information Science, 26 J. Am. Soc’y Info. Sci. 321 (1975).
[Editor’s note: republished with permission of the author – first published on RIPS Librarian Blog.]