"Power Tools" in Bar Preparation
I have an atypical commute to work. Each week, I drive at least 600 miles. Sometimes, I drive twice that. As a result, I find myself in the car listening to audiobooks—lots of audiobooks.
I recently discovered Powerful Teaching: Unleash the Science of Learning, written by cognitive scientist Pooja K. Agarwal, Ph.D. and veteran K-12 teacher Patrice M. Bain, Ed.S. In Powerful Teaching, Agarwal and Bain distill cognitive science research and illustrate ways to apply the science of learning in classroom settings. The book is filled with evidence-based strategies designed to improve student achievement and enhance students’ higher order learning and transfer of knowledge beyond the classroom and into, for example, the courtroom. For someone who focuses on student learning, like me, this book quickly made its way onto my Audible list.
As I started to listen to the book’s narrated introduction, I was immediately excited to hear that I am already implementing all four “Power Tools” described in Agarwal and Bain’s book—that is, (i) retrieval practice, (ii) spacing, (iii) interleaving, and (iv) feedback-driven metacognition—in the bar preparation course I teach at Lincoln Memorial University Duncan School of Law, called Multistate Bar Exam Skills, as well as in other courses I’ve taught at LMU Law and at previous institutions where I’ve worked.
Below, I describe how I have incorporated, and continue to use, each of the four techniques that Agarwal and Bain have dubbed “Power Tools”—even before I knew they were, in fact, “Power Tools” grounded in the science of learning.
Retrieval Practice
Multistate Bar Exam Skills—or “MBE Skills” for short—is a four-credit bar preparation course that soon-to-be graduates take in the spring semester of their final year at LMU Law. The course is designed to improve legal analysis and study skills in preparation for taking the bar examination. The focus of the course is on subjects covered on the multiple-choice Multistate Bar Examination, but much of the substantive review of the material is done by students outside of class. In a way, MBE Skills is a flipped classroom: students are required to review most of the substance outside of class so that application of that learning can happen within the classroom and with instructor guidance.
Retrieval practice is the act of trying to recall information from long-term memory, rather than repeatedly feeding it back into the brain through rereading or review.
In MBE Skills, after students review the substantive material outside of class—generally by viewing prerecorded video lectures and reading substantive outlines—students must complete an initial quiz on the reviewed materials and then a subsequent retention-building quiz (largely consisting of similarly tested material found in the initial quiz) after reviewing their performance on the initial quiz. These quizzes are low-stakes assessments with low point values because the goal of these retrieval practice exercises is to reinforce the learning—not to measure or grade student work.
Spacing
Spacing is the incorporation of time gaps between learning and practice, which leads to improved performance over time compared to learning and practice sessions that occur close together. In other words, asking students to retrieve information a few weeks or even months after they learned it improves learning precisely because that information is harder to recall. And because that effortful recall strengthens memory, spacing makes the resulting learning that much more durable.
In MBE Skills, typically after the midterm exam—which, at LMU Law, usually takes place about a third of the way into the semester—I incorporate spacing techniques into the second (retention-building) quizzes after each unit of instruction. For example, students may complete a 17-question initial quiz on hearsay principles after reviewing the corresponding Evidence video lecture. After reviewing their performance on the initial quiz during class and again outside of class, students will complete another 17-question quiz—the retention-building quiz—that will not only include questions covering the same hearsay principles tested in the initial quiz but will also include previously learned material from earlier quizzes, like Torts or Criminal Law. So while students are being forced to retrieve recent knowledge to answer the Evidence questions, they must also recall information learned weeks earlier.
Interleaving
Interleaving is the mixing and varying of content while learning. Whereas blocking involves studying one topic at a time before moving to the next (for example, “topic A” before “topic B” and so on, forming the pattern “AAABBBCCC”), interleaving mixes several related topics together (forming, for example, the pattern “ABCABCABC”).
In MBE Skills, my students experience interleaving on both a micro and macro level. For example, on the micro level, for the initial quiz on intentional torts, I don’t simply test battery in the first five questions, then assault in questions six through eight, then false imprisonment in questions nine through twelve, then, say, defenses to intentional torts in the remaining five questions. Rather, I mix—or interleave—the causes of action and defenses among the 17 questions, so that Question 1 might test the defense of consent, Question 2 might test trespass to land, Question 3 might test battery, and so on. While all of the questions are based on intentional torts, the subtopics within intentional torts aren’t asked in any logical or progressive order.
On the macro level, students see the most drastic interleaving on the midterm exam. By the time the midterm rolls around, students will have covered two major areas of Torts—intentional torts and negligence—and several major areas of Criminal Law—inchoate crimes, crimes against habitation, and crimes against property, to name just a few. So on the midterm exam, all these areas are tested in no particular order: Question 1 might test criminal battery, Question 2 might test comparative negligence, Question 3 might test intentional infliction of emotional distress, Question 4 might test burglary, Question 5 might test civil battery, and so on.
The interleaving of questions on lower-stakes assessments—the MBE Skills midterm exam is worth only 10% of the course grade—can help students learn to choose the correct strategy to solve a problem. It can also help students see the connections, similarities, and differences between legal concepts that might not be recognizable in blocked practice.
Feedback-driven Metacognition
Metacognition is the awareness and understanding of one’s own thought processes. Feedback-driven metacognition uses feedback to help students learn to discriminate between what they know and what they don’t.
In MBE Skills, I tag every multiple-choice question administered in every assessment to a category predefined by the National Conference of Bar Examiners, the group that drafts the MBE questions for the bar exam.
After every assessment, I release to students their results, which also include their areas of strength and weakness based on the tested categories. This can easily be done with ExamSoft’s Examplify testing platform. So not only does the student know that he or she scored 75% on a Torts assessment, but that student also receives more granular feedback: for example, that he or she scored 90% on intentional torts questions and 60% on negligence questions. This information is broken down even further, so that within the negligence category the student will know that, as an example, he or she scored 80% on questions testing duty and standard of care, 90% on breach questions, 40% on causation questions, and 50% on damages questions. By knowing the specific areas that they are getting right or wrong, students can take better ownership of their own learning and begin to study smarter—not harder.
Additionally, after the midterm exam, I require students to review each question that they missed and compare their selected responses with the explanatory answers provided to them. After reviewing the question, their selected response, and the explanatory answer, students must identify the reason they answered the question incorrectly. Was it because they simply didn’t know the rule? Was it because they knew the rule but incorrectly applied it to the facts, or did not appreciate the significance of a certain fact? Did they misread the call of the question or the fact pattern? By understanding why questions are missed, students are in a better position to understand how to improve their assessment performance.
Conclusion
As teachers, we all probably incorporate the four “Power Tools” described above in our courses—whether or not we actually know them by their cognitive science names. Still, every once in a while, it’s good to reflect on our own teaching and course design to make sure we’re doing the best we can to improve student learning. After all, while those of us in academic success and bar preparation always warn our students about the importance of proper study techniques and skills, we, as teachers, also need to be more deliberate—and even transparent—in using proven learning strategies when we teach those students.