Looking for the "historiography" of Learning and Development
Thinking and learning about learning and thinking....critically...
“The historian must be an analyst, not a judge; he must seek the true causes that explain human actions.” ~ French historian Marc Bloch, The Historian’s Craft (French: Apologie pour l’histoire ou Métier d’historien), 1949.
I think I disappointed my dad when, instead of going for an MBA, I decided to take my bachelor’s degree in business and go to grad school for… history and anthropology. Hope he’s over that by now. Anyway, I’ll never forget walking into my first grad seminar in history. I was the only one of about 20 grad students without an undergrad degree in history. I had been passionate about it and studied it (mainly military history) as a hobby, but I had no degree. I was self-conscious about this lack of formal domain knowledge, and even more so because I didn’t quite grok the subject of the seminar. It was something called “historiography.” Now, after completing a master’s in history, working as a contract historian, and going ABD (All But Dissertation) in a history PhD program, that topic remains one of my favorites, and that specific class is the main reason for that.
Now, not to be recursive, but historiography is the study of how history is made: not by the actors, but how it has been constructed by various historians. It is the study of the methods, sources, assumptions, biases, and quality of conclusions in various works. In that first class I mentioned, taught by Prof. Darold Wax, I learned a few things. One was that all those undergrad historians around the table, while they had deeper domain knowledge than me to start, had not been called on to make as many arguments as an undergrad in B school is. I might have had less to draw on, but I was more accustomed to dissecting assumptions, examining motives, and essentially pulling the curtain back. Each week (it was a M, W, F class), we had lectures on Monday and Wednesday, and then on Friday we were expected to have selected a book on the topic for that week, gutted it, and then, on both sides of a 5x8 index card, taken that particular work to task on the quality of its arguments, examined its assumptions, rated the quality of its sources, and made an overall argument as to how rigorous and balanced a piece of work that particular book was. On those Fridays, we then made oral arguments to the class and got peppered with questions from the Prof and other students (since while we each had our own book, we were all on the same topic). I loved it. It taught me invaluable lessons about not taking things at face value, digging for the reasoning behind conclusions, and looking for qualified sources.
Flash forward, and the first conference in the “Learning and Development” field I ever attended was an ISPI conference in Boston. I had left history at this point and was working in the Office of Readiness and Training in the Office of the Secretary of Defense in the Pentagon. I don’t remember much about that conference, but I remember this one panel. The panel was full of luminaries from the field talking about the myths and fads that plagued our field (yes, even then). The last speaker was Sivasailam Thiagarajan, or just Thiagi. I remember his accent, I remember his humor, and I remember his first joke. Looking up and down the table, in that voice of his that always carries a hint of a smile, he said, “It is clear that I am the only Indian at a table full of chiefs.” Thiagi then proceeded, over the next 10-15 minutes, to use humor and penetrating insights to essentially take an entire industry to task for, as he put it, “trying to design a systematic approach to something inherently as messy as human learning.” Everyone was laughing, but I heard echoes of historiography in his talk. I was also heartened by the panel - that people of such stature were taking on myths in the industry impressed me. Then I stayed in the field.
I saw the battle of myths continue. I saw people like Jay Cross, Will Thalheimer, Clark Quinn, Bob Mosher, and lots of others continue to fight for more rigor in the field. On the other hand, I saw other professionals confidently using the slide that talked about remembering 10% of what we read and 20% of what we hear and so on, and accepting that with little to no inspection. I saw people propagating the myth of learning styles. I saw people using Myers-Briggs as if it were based on actual science. I saw people treat the ADDIE model, which is really a production or product design model, as if it were some Rosetta Stone for designing effective learning. I saw people using, studying, and getting jobs in Instructional Design without understanding its origins and what those origins meant in terms of the limitations of the field. I remember going to a Game Developers Conference, finishing a session on human cognition and memory, walking into another one on behavioral economics, and wondering why I never saw those sessions at any of the L&D conferences I went to.
I remember being absolutely thrilled when I first read Donald Clark’s take on Learning Theorists (up to 280 now from the 100 I first saw - maybe it was even 50 in 50 days) - because it showed classic historiographical signs. Maybe that also comes from Donald being educated in Philosophy, another questioning discipline. I also took heart from the amazing work of Will Thalheimer, who has dedicated much of his professional life to “trueing up” evaluation, but also to debunking some of the myths that persist in L&D. These folks, and some others, have really been carrying a ton of weight on their shoulders, and while it is an immensely laudable effort, it is not sustainable or scalable, and we need that more than ever.
We are about to enter a period of profound change in L&D. Even if AI only disrupts the production side of this industry, the impacts could be massive. If it inches into the products we build, the impacts will be even larger, AND if (here’s the real big one) our users’/learners’/clients’/customers’ expectations and needs can be fully met by some other product or service, the impact could devastate the industry. That means that more than ever we need a historiography for L&D - we need skeptics - we need a massive infusion of critical thinking - we need to make the critical evaluation of new technologies and services aimed at adult learning and corporate training an honored practice - a required course, if you will.
One thing that will mean is that we need to move away from business models tied to a particular theory or methodology. If you build a whole business around, let’s say, Myers-Briggs - and teaching that and leading workshops in that is how you generate income - I’m going out on a limb to say that you’d be fairly resistant to people saying it is the equivalent of corporate astrology and has no basis in science. What we need are businesses focused on the value they deliver, not the theory they most believe in. That means that when something is disproven or a better methodology is created, businesses will have the flexibility to move to it and not stay with an outdated model. It will be acceptable to think critically.
"I would rather have questions that can't be answered than answers that can't be questioned." - Richard Feynman, The Pleasure of Finding Things Out: The Best Short Works of Richard P. Feynman (1999).
"Critical thinking is thinking about your thinking while you're thinking in order to make your thinking better." - Richard W. Paul, Critical Thinking: What Every Person Needs to Survive in a Rapidly Changing World (1993).