
About Scott MacClintic

Scott MacClintic is an independent school educator with 31 years of classroom experience teaching all levels of biology and chemistry. He currently serves as the Director of the Henry R. Kravis ’63 Center for Excellence in Teaching at The Loomis Chaffee School in Windsor, CT, and he chairs the Connecticut Association of Independent Schools Commission on Professional Development.

Debate: E-Readers and Reading Comprehension
Scott MacClintic


[Editor’s note: Scott’s post is in response to this earlier article.]

Most of the time, when I get asked about the e-reader debate, the question is not a sincere one; the person asking already holds a strong opinion on the matter. In these moments I am reminded of the expression “when you find yourself in a hole, stop digging!”

No matter how many studies I mention or which side of the issue I argue, as soon as I pause for breath, I am confronted with “yeah, but…” and the person proceeds to tell me why his or her long-held belief is the final word on the subject.

As for where I come down on the issue, I tend to defer to people who are way smarter than me on the subject, such as Daniel Willingham.

As Willingham concludes in his review of some of the literature on the subject, if the choice is between reading on a device and reading on paper, paper still holds a slight lead in straight-up comprehension. The problem I have is that this shift to digital is really only a lateral move, a substitution, and perhaps not a wise one if you want improved student comprehension!

As a teacher, I choose to incorporate technology in the design of my lessons if I believe it will result in noticeable and definable modification or redefinition of the learning tasks and outcomes (the SAMR model). The question I ask is “what will the use of this technology allow me or my students to do that previously could not have been accomplished?” If the answer is “not much,” then I do not bother with the technology. The technology itself should not be the focus of the lesson; student learning must be front and center.

So…”to e-reader or not to e-reader” is actually not the question that we should be asking; rather, we should be asking “does this technology add transformative value to the learning experience for my students?” If we want to go even further, we should ask “How might I measure this value and know that my students are benefiting?”

“Without data, you’re just another person with an opinion.”― W. Edwards Deming
Scott MacClintic


Data-Informed Instruction

Early Steps

There are a few key steps to effectively incorporating MBE (Mind, Brain & Education) ideas and concepts into one’s daily teaching routine. The first is the low-hanging fruit: educating oneself on the research about learning and the brain and on what that research suggests are effective pedagogies. If you are reading this blog, you are probably already familiar with one fantastic resource for such information (shameless plug warning!) – www.learningandthebrain.com

There are certainly plenty of resources out there, and I strongly encourage you to seek them out. This first step is critical, and it has become easier in the last few years as more of the actual research has become available online and more has been written with teachers as the target audience.

The second key step involves actually trying something new in your classroom, whether it is using more retrieval practice exercises [1], incorporating movement [2] or perhaps shifting to a more student-centered model for class discussions [3].

Quantum Leaps

But wait, your work is not done! Trying something new based on the conclusions of a research paper you read is certainly a big step, but how will you know that the change you made was effective? What is your evidence that the change you incorporated actually improved student learning? THIS is the difficult part.

Analyzing the impact or effect of a new pedagogy is quite complex and requires the collection and analysis of data. While you may not be able to perform a double-blind controlled experiment–the gold standard in scientific research–you CAN analyze the impact of your intervention and use data to inform your teaching practice going forward.
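How might that analysis look at its simplest? Here is a minimal sketch in Python, assuming nothing more than two sets of quiz scores from comparable units, one taught the old way and one taught with the new strategy. The numbers are invented for illustration, and a real comparison would need comparable assessments and attention to confounding variables:

```python
# A minimal sketch, not a rigorous study design: compare quiz scores
# from a unit taught the old way with scores from a comparable unit
# taught with the new strategy. All numbers here are made up.
from statistics import mean, stdev

before = [68, 72, 75, 70, 66, 74, 71, 69]  # hypothetical scores, old approach
after = [74, 78, 73, 80, 71, 77, 76, 75]   # hypothetical scores, new approach

print(f"Old approach: mean {mean(before):.1f}, sd {stdev(before):.1f}")
print(f"New approach: mean {mean(after):.1f}, sd {stdev(after):.1f}")
print(f"Difference in means: {mean(after) - mean(before):.1f} points")
```

A difference in means on its own proves nothing, of course, but even this crude comparison gives you something more than a hunch to bring to a conversation with colleagues.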

So how do you collect data that can help you improve your teaching practice?

I have found that one of the most useful tools for collecting data is also one of the easiest to set up and use, yet frequently one of the least likely to be used by teachers – videotaping your class.

Watching a videotape of your class and objectively analyzing it for evidence of improved learning can be extremely illuminating and humbling.

  • Did I really only give Hermione 2 seconds of wait time before I moved on to Draco?
  • Were the students really trying to take notes, listen to me deliver content and participate in the conversation simultaneously?
  • Did I really shake my head in disapproval as Luke responded to my question with an answer that was way off base?

I have yet to find a teacher who enjoys watching himself/herself on video, but I have found that most teachers who actually go through with it find the experience incredibly informative. Watching your video with a trusted colleague or Critical Friends group can be even more thought-provoking and lead to fruitful conversations about teaching and learning.

Data 2.0

I have been playing around with an exciting new tool for data collection lately that has the potential to make the time-consuming analysis of videotape seem like a thing of the past. The app does a deep dive into an audio recording from class and provides me with nearly immediate data to analyze.

Here’s how it works: at the beginning of class I start an audio recording of the class on my phone, and I hit stop when the class is over. In the current iteration of the app, I upload the audio file to be analyzed, and within an hour or so I receive a report back on the class. Right now, the report includes data in five-minute increments on the following (a rough sketch of this kind of analysis appears after the list):

  • My talking speed (words per minute)
  • How many questions I ask
  • The types of questions I ask – How? vs. Why? vs. What?
  • The percentage of class time that I was talking vs. the students talking
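The app’s internals are not public (and I am not privy to them), so the sketch below is only my guess at the kind of analysis involved, written in Python against an invented transcript format with speaker labels. A real pipeline would need speech-to-text and speaker identification upstream of anything like this:

```python
import re

def analyze_transcript(turns, class_minutes):
    """Rough sketch: `turns` is a list of (speaker, text) pairs,
    a format invented here for illustration."""
    teacher_words = sum(len(text.split()) for spk, text in turns if spk == "teacher")
    student_words = sum(len(text.split()) for spk, text in turns if spk != "teacher")
    questions = [text for spk, text in turns if spk == "teacher" and "?" in text]

    # Naive tally of question openers: "what" vs. "why" vs. "how"
    openers = {"what": 0, "why": 0, "how": 0}
    for q in questions:
        first = re.sub(r"[^a-z]", "", q.split()[0].lower())
        if first in openers:
            openers[first] += 1

    total = teacher_words + student_words
    return {
        "teacher_wpm": teacher_words / class_minutes,
        "questions_asked": len(questions),
        "question_types": openers,
        "teacher_talk_pct": 100 * teacher_words / total if total else 0.0,
    }

# An invented three-turn exchange, just to exercise the function
turns = [
    ("teacher", "What is the product of photosynthesis?"),
    ("student", "Glucose and oxygen."),
    ("teacher", "Why does the rate level off at high light intensity?"),
]
print(analyze_transcript(turns, class_minutes=45))
```

Even this toy version surfaces the same questions the real report does: how much am I talking, and are my questions mostly “what” questions?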

Questions that I have been able to think more critically about with this data include:

  • Was my student-centered class discussion really as student-centered as I thought?
  • Am I asking questions that require surface level knowledge (“what”) or ones that will lead to more critical thinking on the part of my students (“why,” “how,” “if”)?
  • Am I speaking too fast when giving instructions as I set up an activity?

The app is still in its development phase and there are bugs to be worked out before it will be available to a wider audience; but if you are interested in participating in the pilot, you can sign up here. Of all the data collection tools out there, I think this app has the potential to be an incredibly valuable tool for teachers as they attempt to evaluate the impact of changes in their practice.

For all of its potential uses, I do realize that there are potential dangers with the collection of this type of data. Who initiates the collection of the data? What if an evaluator or administrator wants to use the data? What are the privacy concerns about collecting this type of data? Who has access to the files and data?

All of these questions are important ones that need to be fleshed out, to be sure; however, I believe that, if properly used, this app has the potential to be a powerful tool for teachers who want to use data to inform their teaching as they incorporate new strategies and pedagogies.

  1. http://www.retrievalpractice.org/
  2. Donna Wilson, Move Your Body, Grow Your Brain, March 12, 2014. [link]
  3. Goldschmidt, M., Scharfenberg, F. J., & Bogner, F. X. (2016). Instructional efficiency of different discussion approaches in an outreach laboratory: Teacher-guided versus student-centered. The Journal of Educational Research, 109(1), 27-36. [link]

 

Click Here: The Technology of Retrieval Practice in the Classroom
Scott MacClintic


Back in the dark ages, when I was just cutting my teaching teeth, we teachers might have told our students to review for an upcoming test by rereading the chapter and their notes from class. With the benefit of psychology research, we now know that another strategy is more effective.

Rather than have students reread the chapter or their notes, we might instead encourage them to outline the content from memory. This approach–called “the testing effect,” “active recall,” or even “blank page review”–leads to substantial increases in long-term memory formation. (That’s psychologist-speak for “learning.”)

The efficacy of this form of retrieval practice is supported by a wealth of research and has been shown to be a powerful strategy for long-term learning.(1) The benefits have been shown in a variety of environments, over a wide range of student ages, and across many disciplines.(2) (3) (4)

One of the nice things about the testing effect is that it can easily be integrated into a study routine or a class lesson plan. Students can employ the strategy on their own without anything more sophisticated than a blank piece of paper. Teachers can incorporate blank page review or frequent low-stakes or no-stakes quizzes into their courses as a way to leverage the power of retrieval practice.

Are there other effective ways in which we can incorporate retrieval practice into the classroom using current technology, not only to enhance long-term learning but also to provide formative assessment data for both the student and the teacher? The simple answer is YES!

“High-tech” version

Student response systems–commonly known as “clickers”–are a fantastic way to engage students in the process of retrieval practice; they also provide both teacher and student with valuable formative assessment data. Several strategies for effective use of clickers will enhance students’ learning.

  1. Make sure that the questions are not too easy.
  2. Be sure to include the most common wrong answers as options.
  3. After the initial polling is complete, take advantage of the different student answers to generate discussion and debate about the topic. Insist that students make a convincing argument as to why their choice is the best answer.

After initial polling on a question, I often project the results for my students to see. Depending on the spread of answers, I follow up with one of these questions:

“Can somebody make a case for why their answer is the best choice?”

“Can somebody make a case for why their answer is a better choice than the one that was just proposed?”

“What do you need to know/remember in order to answer this particular question?”

When there is no clear consensus and a wide range of answers is selected, I usually go in a different direction.

“Take a minute at your table (typically, 3-4 students) or with the person next to you to discuss your initial answer and come to a consensus. In a minute, we will re-poll on the same question.”

After a brief period of discussion and re-polling, fewer potential answers tend to be chosen. I can then solicit an argument for one answer or another.

Don’t be afraid to include some vague wording, or to have more than one answer be correct depending on how the question is interpreted. A little intentional confusion and healthy debate/discussion can be a powerful way to incorporate an additional desirable difficulty into the mix.(5)

The feedback that occurs during the post-polling discussion and analysis is not only beneficial for correcting erroneous answers; it also helps with long-term retention of correct answers in which the students were not initially confident.(6) Both of these factors lead to greater long-term retention, as well as strengthened metacognitive skills for the students. A win-win!

If you do not have clickers at your disposal, you have several web-based alternatives for collecting student responses. Poll Everywhere, Socrative, Google Forms, and Kahoot! are just a few of the options available to teachers.
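As one concrete example, the response spreadsheet that a tool like Google Forms exports can be tallied in a few lines of Python. The question text and the 60% consensus threshold below are my own inventions for illustration:

```python
import csv, io
from collections import Counter

# Hypothetical export from a poll form; real exports have a column per question.
sample = """Timestamp,Which organelle produces most of the cell's ATP?
1,Mitochondrion
2,Chloroplast
3,Mitochondrion
4,Ribosome
5,Mitochondrion
"""

counts = Counter(
    row["Which organelle produces most of the cell's ATP?"]
    for row in csv.DictReader(io.StringIO(sample))
)
total = sum(counts.values())
for answer, n in counts.most_common():
    print(f"{answer}: {n}/{total} ({100 * n / total:.0f}%)")

# If the top answer holds less than, say, 60% of the vote, that spread is the
# cue for table-group discussion and a re-poll (threshold is a rule of thumb).
if counts.most_common(1)[0][1] / total < 0.6:
    print("No clear consensus -- discuss and re-poll.")
```

Projecting a tally like this is often all it takes to kick off the “make a case for your answer” discussion described above.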

As a word of warning, there are some potential downsides and caveats that you need to consider when using student response systems. First and foremost is that no matter how much you plan ahead, you can count on the technology not working flawlessly every time. Who among us has not experienced the joy of having the projector bulb blow out just as you are about to project something on the board?

Another factor to consider is the time required. It takes longer to cover the same ground using this retrieval practice strategy. I would argue that the time is well worth it for the students, but the reality is that it will take more of your valuable class time.

“Low-tech” version

If you do not have a set of clickers or enough electronic devices in your classroom, you can still take advantage of this technique. Personal whiteboards, paddles, or even different colored note cards let individual students or groups of students vote for various possible answers. Any method that allows you to canvass the different student responses and then generate discussion and debate about those answers will work just as well.

Regardless of the technique used, the power of retrieval practice and feedback for long-term learning is undeniable and should be an arrow in your pedagogical quiver.

References:

  1. Roediger, H. L., & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1(3), 181-210. Link
  2. Karpicke, J. D., & Blunt, J. R. (2011). Retrieval practice produces more learning than elaborative studying with concept mapping. Science, 331(6018), 772-775. Link
  3. Karpicke, J. D., & Roediger, H. L. (2008). The critical importance of retrieval for learning. Science, 319(5865), 966-968. Link
  4. Agarwal, P. K., Bain, P. M., & Chamberlain, R. W. (2012). The value of applied research: Retrieval practice improves classroom learning and recommendations from a teacher, a principal, and a scientist. Educational Psychology Review, 24(3), 437-448. Link
  5. Overoye, A. L., & Storm, B. C. (2015). Harnessing the power of uncertainty to enhance learning. Translational Issues in Psychological Science, 1(2), 140-148. Link
  6. Butler, A. C., Karpicke, J. D., & Roediger, H. L., III (2008). Correcting a metacognitive error: Feedback increases retention of low-confidence correct responses. Journal of Experimental Psychology: Learning, Memory, and Cognition, 34(4), 918-928. Link