I was also fortunate this year that two colleagues (Dr. Pete Van Zandt and Dr. Melanie Styers) and I were awarded a grant from the Associated Colleges of the South to assess the impact of blended learning and flipped teaching on our students' ability to think critically. For this study we used the Critical Thinking Assessment Test (CAT), developed and validated by Tennessee Tech, in a pre/post-test format in three of our classes that use flipped teaching to some degree. We also administered pre/post SALG surveys to assess students' perceived gains, hoping to find correlations between the categories in which students believed they gained and what the CAT measures. While we are still waiting on the CAT results, my student evaluations, the technology survey I administer each year, and the SALG responses all indicate that we are headed in the right direction.
Technology Survey:
My other goal in the course this term was to redesign the in-class exams so that they better reflect the POGIL activities, which is why we see a significant decrease in appreciation for the exams relative to the previous year: they were dramatically different in structure, with short-answer questions replacing fill-in-the-blank and multiple choice. Still, the highest-rated tool for "promoting critical thinking" again this year was the course management page on Moodle... which again makes me question whether or not students truly understood the question "Which tools did you find were beneficial in facilitating critical thinking?" But I was happy to see that the POGIL activities and case studies came in close behind.
Student SALG Survey Results
Student Evaluations
Comments that made me smile:
"Pretty well organized for how much stuff was needed. Lots of thinking by the students that was then reinforced by the teacher" (2015) vs. "I think the class should be more lecture based. While the flipped idea is fun, I think that for a class with this much information, we need a lecture." (2014)
"She has made students think critically every class period. She created a new spin to the science department at BSC" (2015) vs. "A better focus on making sure students are learning rather than memorizing metabolic pathways" (2014)
"I liked how she supplement the videos with some in-class explanations. The activities were pretty solid too; very helpful. The objectives were AWESOME" (2015) vs. "POGIL activities - some concepts, actually most, were too complicated for the score of this course" (2014)
"Forced us to reason through problems rather than simply memorizing facts" (2015) vs. "More teaching in class" (2014)
Comments that demonstrate challenges still exist (besides the "I learn better with straight lecture" comments):
"Narrow the learning objectives to better match the exams, make the exams have stuff on them that we learned in class before taking it, link the activities in class with the material more"
"I also never felt prepared for test despite strenuously studying"
"What was on the test always took me by surprise"
So while the above comments lead me to believe that the students do realize the flipped model is improving their ability to reason and think critically, I think they are still very unsure of themselves when it comes to the exams and believe they should still rely on rote memorization. In my own defense, in re-creating the exams I pulled questions directly (and sometimes literally) from their in-class POGIL activities. And from the first exam, where the average grade was a D, to the third exam and even the cumulative final, where grades averaged around a high B, I'd have to say that the students improved DRAMATICALLY on what are very challenging exams!
Changes for next year
1. In the first week of class, better model how the groups should work together and the pattern of the activities. I think if we walk through the first two activities step by step as a larger group, it may help alleviate some of the stress and give them a rhythm to work with for the rest of the term.
2. Consistently remind them that the videos are there to introduce or explain course content, while the in-class meetings are designed to help them see how that content is applied. (And they need not rely on just the videos; they have a textbook and internet resources at their disposal as well.)
3. Instead of requiring students to complete the video quizzes for credit, I will have them write out their muddiest points for each video lecture/topic and submit them online (through either Facebook or Moodle) for a "muddiest point" mini-lecture at the start of each class. While I consistently tell my students that I will address questions in these mini-lectures every class meeting, I rarely actually receive questions; then, at the end of the term, my students always ask that I do this more. By requiring the questions for credit, I hope to increase engagement and better address their need for clarity.
In terms of the exams, it is my hope that with last year's redesigned exams now freely available for students to analyze and study, some of the frustrations the students voiced will decrease. I have also implemented an objective-alignment activity between the video learning objectives and the in-class learning objectives, to help students see how the two build on and grow with each other to lead them to higher-order thinking, and then how those higher-order critical thinking skills are what I test for on the exams.
Overall, I am really pleased with how the class has progressed and am excited to see how it goes next year!! Also, stay tuned for the CAT results, coming soon!