Educational reform always provokes controversy and argument. I started engaging in healthy discussions (read: arguments) about educational reform in my first education theory class in college. Most education reform arguments come down to whether the current paradigm is really broken, and whether the new paradigm is enough of an improvement to be worth the trouble of replacing the old one. At best, either side of these arguments has fuzzy data to show for its position.
In my experience, these arguments tend to ride heavily on the past educational experiences of those involved. The problem with this approach is that it assumes the two (or three or four) arguers are all equivalent learners. It assumes that all learners will thrive in the environment in which the arguer thrived. It assumes that all learners acquire and store information in a similar manner. It assumes that all learners are motivated by the same things that drove the arguer. No wonder these arguments are almost never resolved with one party spontaneously saying, "Wow, you're right, and I was wrong all along. Thank you!"
Why is this? I think it is because we often assume that all learners are equivalent in every aspect of acquiring, storing, retrieving, and applying knowledge. That assumption makes it easier to generate what little data we have on the effectiveness of educational models. But a p-value is not very useful if the entire cohort you are studying is an ill-defined mass of goo. Unfortunately, that is exactly what we have as our substrate: an ill-defined mass of goo.
What do I mean by this? Take neuroscience education in medical school as an example. First, there are obvious background differences: people with advanced degrees in neuroscience mingle in the class with those who have no idea what the frontal lobe is all about. Second, people learn in different ways. When I was a resident, I liked to see a few patients and then take time right then to look up a bunch of things about those patients. I had friends who would rather be slammed with as many patients in a shift as they could find, because they felt they learned better in the doing. Some people like learning the large concepts first and then going into the details; others like learning the details first and piecing them together later into a larger whole. Some people like to focus on one system or organ at a time, and some like to have multiple courses running concurrently so there is more time to absorb the information from each one. Some people love concept maps. Personally, I've never been able to get my head around why they are so great; I'm more of an outline guy. With all these differences, we are trying to measure and argue over a substrate that is an ill-defined mass of goo.
I'm not saying there are no basic learning theory principles that can be universal. I am saying the application of those principles is sometimes more wibbly wobbly than the ed-heads like to let on in their arguments. It could be that the question of whether education reform is needed is not really a multiple choice test. It's an essay test, with multiple right answers as long as you can justify yours. And everybody hated those tests...
Friday, September 11, 2015
Friday, July 31, 2015
Is there an upside to the noise of seemingly irrelevant content in medical education?
As I have worked on planning medical school curriculum, either as a course/clerkship director or on various school-level committees, a common question keeps coming up: how much is too much information? Or, conversely, how little do you need to know about basic science to competently practice medicine?
The age of the Google search, and of tools like Epocrates and PubMed that can answer a seemingly endless stream of factoids instantly, makes this question more difficult. How much do I need to know, versus how much do I just need to know how to look up?
I don't think there is a perfect answer to this question, but let's use GI histology as an example. As a neurologist, if you asked me how much I have used my medical school knowledge of GI histology in the past six months, I would probably laugh. Most neurologists would laugh too, because what comes to mind immediately is the day in the med school microscopy lab when we looked at slides of intestine and identified the villi. I don't do that anymore - like ever.
However, in the last six months I have taken care of Parkinson's patients with gut motility problems and constipation, and I've also taken care of people for whom we were considering gluten sensitivity in the differential diagnosis. We also know that carbidopa/levodopa competes with dietary protein for absorption in the small intestine. How much of my ability to understand these basic problems, which occur daily in my clinic, is founded in part on my original knowledge of GI histology? What I think is critical, when deciding what level of detail of GI histology matters for physician training, is to consider what implicit knowledge allows me to solve problems. This means looking beyond the typical response to any given topic where a practicing physician says, "I never use that." (Biochemistry, anyone?) It means spending time unpacking the implicit framework of knowledge on which much more complex concepts are built. On the flip side, there are some things I learned in med school that I really don't seem to use now, no matter how much I rack my brain to figure out whether I do.
I'm not sure of the best way to puzzle this question out. I'm a little worried about running the grand experiment of simply no longer teaching med students all the tiny details we taught in the past, without first pausing to understand the repercussions. There is unlikely to be a firm line in the sand where a given topic becomes relevant or irrelevant; it will look more like a large sandy smudge. However, every teacher of med students has to draw their line somewhere, and it would be good to have some alignment within a med school system.
Friday, June 26, 2015
How might a pure competency-based curriculum change residency interview season?
OHSU is one of several schools that recently received an AMA-funded grant to push medical education innovation. Our new curriculum, YourMD (yes, it has a cool marketable name), is in many ways a test lab for this grant. To be clear, most of what I discuss here is beyond the scope of the current version being developed for the YourMD curriculum, and I'm outlining my personal view of what the model may look like in the future. One of the primary themes of OHSU's work for this grant is to create a workable competency-based (not time-based) model of medical education.
As you can imagine, there have been many questions about the logistical problems with such a system. One issue raised at our institution, as this concept has been discussed at various faculty meetings, is the perceived trouble students in such a system will have in finding a residency program. After all, the student will have a transcript that looks remarkably different from most current school transcripts. It will have a bunch of competencies and EPAs. It may not have any mention of honors. How is a residency director supposed to choose the best candidate for their program?
I've thought about this a bit, and have a few ideas. First, if the school is truly competency-based, just the fact that the student has been able to graduate should indicate that:
a) The student understands and applies the knowledge necessary to start as an intern,
b) The student clearly demonstrates the skills necessary to start as an intern, and
c) The student clearly demonstrates the professionalism necessary to start as an intern.
To my mind (assuming the system will work as advertised), this is revolutionary. This means you don't have to guess as a residency director what you are getting. You don't have to read between the lines for the secret codes hidden in the letters of recommendation. This person is ready for residency. End of line.
So, then what do you look for now? Now, as a program director, you can begin to look at what other experiences and skills this particular individual has that would help them thrive at your particular institution. Instead of trying to confirm that the person earned 'honors' in internal medicine, the medicine program director can sort applicants in all manner of ways. They could decide their program wants people with above-average skill in quality improvement, or that they want residents who are particularly interested in medical education. They can rank based on how well applicants operate in a team environment. They can look for students who have had particular experiences that would benefit them in their setting - say, a lot of rural practice experience, or many rotations in an underserved inner city. Each program director can choose what they'd like to highlight, and I don't see a problem with letting students know what a program is looking for in applicants. This makes the interview sessions even less about figuring out whether this person can operate on the ward successfully, and more about whether this person fits well with our system and our culture.
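To make the idea of "sorting applicants in all manner of ways" concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the field names, weights, and data are invented for illustration and do not reflect any actual transcript or ERAS format.

```python
# Hypothetical sketch: ranking applicants on competency-transcript attributes
# rather than on honors. All field names and weights are invented.
from dataclasses import dataclass, field

@dataclass
class Applicant:
    name: str
    competencies_met: bool                            # graduated from a competency-based program
    experiences: set = field(default_factory=set)     # e.g. {"rural", "QI", "med-ed"}
    team_rating: float = 0.0                          # score from team-based assessments

def score(applicant: Applicant, wanted: dict) -> float:
    """Weight the experiences a given program cares about."""
    if not applicant.competencies_met:
        return 0.0  # baseline readiness is already guaranteed by graduation
    experience_score = sum(w for exp, w in wanted.items() if exp in applicant.experiences)
    return experience_score + applicant.team_rating

# A program that values quality improvement and rural experience:
wanted = {"QI": 2.0, "rural": 1.5}
applicants = [
    Applicant("A", True, {"QI", "med-ed"}, 4.2),
    Applicant("B", True, {"rural"}, 4.8),
]
ranked = sorted(applicants, key=lambda a: score(a, wanted), reverse=True)
print([a.name for a in ranked])
```

The point of the sketch is only that, once baseline readiness is a given, each program can plug in its own weights rather than relying on a single honors-based ordering.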
If competency-based education works, this is something residency program directors will need to think about. We're all well on the way to competency-based education. So, program directors, prepare yourselves. I think it'll actually make interview season more fun.
Multnomah County Hospital residents and interns, circa 1925
Friday, March 13, 2015
Robert's Rules and the digital age (or my one day as chair of the curriculum committee)
Oregon Medical School Admission and Advanced Standing Committee, 1950s
I was impressed that Paul knew there was a Robert's Rule pertaining to the absence of the chair, and as the newly appointed curriculum committee chair-of-the-day, I thought I should review Robert's Rules myself. The OHSU School of Medicine website on committees led me to this document. On the one hand, it's a very tedious set of rules. On the other hand, it represents a time-tested means for a group to come to a decision on matters important to an organization. However, I noticed a few areas where this document may need some updating.
First, it specifically mentions paper documents in a number of places; here are three examples:
This language should probably be cleaned up to say "document" at the least. Also, what's the deal with not being able to write on the papers? Should we extend this rule to PDF markup apps like Notability? It seems a titch silly to me to spell this out, but perhaps it is for maintaining the integrity of the first draft.
Next, there is this bit about everybody needing to sign the document physically:
Can we amend this to say we can 'sign' by some form of electronic signature? In many cases now, a reply from a personal email account with a signature block is all that is required.
The last bit I found that needs updating pertains to remotely logging into a meeting (I found this in an FAQ about parliamentary procedure):
I'm thinking that as time goes on, remote login to meetings via web or phone will become the norm rather than the exception. I haven't been to a curriculum committee meeting in over a year where someone wasn't logging in remotely. I think this rule should be changed so that, by default, anyone logging in remotely in real time, by any means, is considered present and able to vote. Being absent and voting only by email is more along the lines of a mail-in vote and probably should be prohibited. Since the rules don't mention video or VoIP logins at all, this section needs updating as well.
As I'm not a parliamentary expert, I'm not sure whether these issues have already been addressed; I'm just going by the rules my school of medicine references as the ones we abide by. It's a good system overall; it just needs a nudge into the twenty-first century. Please leave your thoughts, or links to any updated parliamentary rules you are aware of, below.