Over the two years I've been teaching our second-year neuroscience course, I've been gathering feedback from students on what they would like included in the distributed course materials, and the message has been very clear: expectations for what the course materials contain and which readings are required have changed over the last few years.
Let me take you back to the mid-nineties, when I did my medical school coursework (and to my college experience in the early 90's). Let's call this syllabus 2.0. I received copies of all the slides presented, as long as they were in PowerPoint; we still had some lecturers who used slide carousels and handed out minimal printed notes (call that syllabus 1.0). In class we took notes. If you missed or ditched class, you could look back over what the lecturer talked about by subscribing to a note-taking service run by the students. Readings were from the required textbooks. Test questions covered anything in the printed syllabus, anything said verbally in lecture (even seemingly off-hand remarks), and anything covered in the textbook.
In this model, the material is presented, but at an intentional (or unintentional) fire-hose level of information delivery. It was up to the student to wrestle with this large volume of information, distill it down to essential concepts, and organize it well enough to pass the test. It was expected that some test questions would not have been covered explicitly in class; their purpose was to differentiate the top of a group of very highly motivated students. The upside of this model is that it forces the student to analyze large volumes of information, not all of it core, and independently synthesize the important concepts. That skill is very much part of the skill set required of a doctor in a clinic. The downside is that there is room for an individual student to miss important concepts which aren't explicitly identified as core. This model can also increase student anxiety during test preparation, since you don't know until you take the exam whether you've missed the boat.
Let's move to 2013. Our course syllabus was inherited from the above paradigm, and we have been modifying multiple lectures, so our lectures don't have well-developed outlines or faculty notes to accompany the PowerPoint presentations. Students have on several occasions pointed me towards courses at our institution and others where the course materials include extensive annotation by the faculty in addition to the slides. Over the last two years, students have said things like (paraphrased):
"What I want is to have everything I need to know about this lecture written down so I can go learn it."
"I don't want to have links to a whole bunch of useful information about a topic, I want a single link to a very succinct, applicable resource."
"Even if the syllabus for a class is 450 pages, if it is all I need to look at, that's what I'd prefer."
Another way to state this is that the students would like a curated information repository which is finite, organized, and focused on the learning objectives. This sounds to me like it mirrors discussions about moving from web 2.0 to web 3.0: there is a desire to block out noise and focus on what is important to the individual. All information in the course materials is honed to efficiently deliver what is necessary to perform well in the course.
The upside of this is that it is very clear what the student is expected to learn. From a pedagogical standpoint, this is an ideal situation for an educational model based on measuring competence: it is clear what measure to obtain, and all learners can potentially reach that bar. The downside is that this perhaps does not help in the long run, because it is not how real-life medical decision making occurs. There is no finite set of combinations of signs and symptoms, so there is often a need to process a cacophony of noise and distill out the important ideas. There is always more to read and more detail, and part of being a doctor is gaining skill in being your own curator.
Which is better? My view is that we should aim for the grey in the middle. It should be clear what is necessary to pass the exam, and mastering all material covered in class, small groups, and presentations should be enough to do so. I agree that if there is a picture slide, it is reasonable for the lecturer to include some text to give the slide context. I also think it is reasonable to have some way of assessing whether a student can surpass these minimal competency levels. On an exam, that means asking questions which may not come specifically from the readings; a question may introduce a novel topic and apply the concepts learned in class in a new way. What are your thoughts on how much detail course materials should contain?