Through gap analysis, we watch how exemplary subject matter experts (SMEs) do their work better than their average counterparts, to capture in a training curriculum the crème de la crème of how-to. But today's knowledge workers are required to demonstrate higher-order cognitive and interpersonal competencies.
This challenges instructional designers to raise the instructional systems design (ISD) bar much higher: underlying theories and principles must be translated into concrete displays of performance shown through replicated research studies to be predictably effective, not merely presumed or suspected to be. Research is our only means of better informing our curricula, raising a curriculum's IQ, and heightening the accountability of all its stakeholders.
What is it?
In contemporary terms, research has a fresher, more encompassing name: evidence-based practice (EBP). EBP describes any profession that makes the most current and credible research evidence of effectiveness central to its work. Such evidence not only informs the profession as a whole; it guides everyday conduct and accountability.
For any profession that adopts EBP, its ability to deliver is only as potent as the volume, currency, and breadth of research available to its practitioners--in our case, instructional designers. EBP also imposes an individual responsibility to develop several new skill sets:
* filtering relevant research
* assessing its rigor and findings
* translating findings and their implications into practical business applications that justify training.
On the upside, for those who apply EBP to courseware targeting complex cognitive or interpersonal performance, two frequently ignored ISD gaps are closed:
1. repeatedly upgrading to the most current "what works" evidence on instructional design and delivery
2. conducting a literature review of the courseware's core content in terms of what works, when it works, and how it works best.
This yields a more potent, robust curriculum design. Another key point: the stronger the evidence found for cause and effect, the higher the probability of dramatically raising the curriculum's effectiveness.
Suppose you've created a curriculum that translates what-works evidence into a training intervention. Target populations that rely heavily on higher-order cognitive or interpersonal competencies do so based on an exquisite interplay of theories, principles, and constructs.
We can't watch a theory in action (much less reach consensus on what was observed). Instead, theory must be objectively translated into concrete performance. This is research's forte. Should we leave that translation to SME guesswork, opinion, or preference? How often have you witnessed SMEs disagree? Or is it more prudent to rely on translation from rigorously conducted research studies? This is the raison d'être for EBP's taking hold in so many disciplines.
"Why it works"
EBP is the R&D of all human services professions: counseling, criminal justice, education, healthcare, nursing, psychiatry, social services, and training. Whereas manufacturing R&D endeavors to develop the most effective products possible, EBP endeavors to produce the most effective outcomes possible, and under far more complex and difficult-to-control conditions.
Benjamin Ruark is an instructional and performance transfer designer; firstname.lastname@example.org.
Clark, RE "Six principles of effective e-learning: What works and why" Learning Solutions e-Magazine, September 10, 2002
EBP by Profession/Industry:
Mihalic, S, Fagan, A, Irwin, K, Ballard, D, Elliott, D Blueprints for Violence Prevention Washington, DC U. …