Why Personalization in Education Misses the Point
Personalization. It’s the big thing in education right now. Everyone is talking about it, selling it, and investing heavily in it.
The Mark Zuckerberg-backed Chan Zuckerberg Initiative (CZI) has recently committed to the goal of “bringing personalized learning to every child,” and the Bush Foundation is funding schools designed to enable “mass customization” of instruction.
The interest in personalization is but one example of a broader shift across edtech toward the prioritization of services, algorithms, recommendations, analytics, platforms, etc., and away from content. Instructional content is increasingly perceived as commoditized and no longer a meaningful differentiator in the education marketplace.
So are educational companies now in a race to see who can best adapt, personalize, bundle, customize, filter, analyze, sequence, deliver, manage, [insert preferred adjective here] educational content (wherever that content may happen to come from)?
Is content as a differentiator dead?
Why So Similar?
One wouldn’t be foolish to think so.
Sure, modern digital textbooks include attractive animations, integrated videos, interleaved check-your-understanding questions, pre-highlighting of key concepts, and analytics on learner usage. Some digital textbooks even customize practice/study questions to student performance and adapt instruction based on student ‘mastery’. But behind all these changes is a learning experience that has changed very little at the instructional core.
A quick glance at the digital texts of leading publishers reveals a startling level of homogeneity. If you were an instructor or educational institution, it’s hard to see instructional content really mattering that much when choosing among publishers.
Now if you believe current instructional content is already optimally designed, or perhaps you think the core is fine and it just needs to incorporate a little more learning science (e.g., a sprinkle of retrieval practice here…a dash of segmenting there…a smidge less cognitive load everywhere), this would go a long way toward explaining the current instructional indistinguishability. It would also explain why so many edtech companies believe the logical next step toward revolutionizing education is tailoring and personalizing existing content to each individual student’s needs.
But what if the real problem is actually the instructional content itself? And what if all this effort around personalization and adaptation is merely rearranging deck chairs on the Titanic: technological wizardry unwittingly deployed to optimize an inferior learning experience? In other words:
What if we’re personalizing the wrong thing in education?
I’m just going to lay my cards on the table: I believe the instructional approach at the core of virtually all available learning products is largely inadequate for engendering the type of deep, transferable, and complex learning we want to impart to students. It simply doesn’t support the acquisition of the integrated set of cognitive strategies, affective dispositions, and foundational skills necessary to be a successful and flexible problem-solver, self-directed learner, and critical thinker.
And as recent notable setbacks have illustrated, it doesn’t matter if this inferior instructional method is deployed in a fancy MOOC, or in a technologically sophisticated “personalized” classroom, or deeply integrated into adaptive learning software.
What’s So Wrong With Instructional Content Today?
At this point you may be saying to yourself, “You’re talking crazy, Jay.”
But hear me out.
Rather than starting with the identification of real-world, authentic tasks that we want students to master and perform, education is often best described as the fragmented and compartmentalized teaching of decontextualized learning outcomes. Current instructional content is highly balkanized and modularized, divided into small disconnected learning components and ‘objects’ that deny learners the opportunity to grasp the interconnected, meaningful, and holistic knowledge underlying complex learning.
We teach to objectives instead of tasks. This is a problem because:
When it comes to learning, the whole is more than the sum of its parts.
We’re so focused on ensuring students can successfully recall every component needed to construct a house that we neglect to ask learners to actually pick up a hammer and build one. And not just once, but over and over and over again.
As a result, we deny students the opportunity to construct the cognitive schemas, induce the problem-solving strategies, and grok the larger relevance of what we’re trying to teach them.
Now I get why we teach this way: it is so much easier. We need only identify and explicate discrete objectives and procedures, then have students repeat them back to us, perhaps incorporating frequent feedback, multimedia learning theory, and growth mindset encouragements because, you know, that’s learning science!
It’s the three Rs of modern (evidence-informed) instruction: Receive, Retrieve, spaced-Repeat.
And while this is a great strategy for improving recall of lists of words and isolated facts, it lacks key qualities needed for robust learning; there is no integration, no meaning, no real-life tasks, no collaboration, no immersion, and no valid performance-based assessments.
Researchers have long known this ‘knowledge-telling’ approach is ineffective for imparting transferable and flexible skills, knowledge, and attitudes (Goldman & Pellegrino, 2015; Lim, Reiser, & Olina, 2009; Marzano et al., 2001). The same goes for all the important 21st Century Skills that people in education are so adamant about students needing to acquire. And it is surely a major factor when it comes to the low levels of student motivation and engagement in school. As David Merrill observes,
Learning is the greatest motivator when learners can see that they have acquired a new skill. Merely remembering concepts, terminology, principles, and facts for recall on a multiple choice test in a testing center is not motivating. However, being able to do something that they could not do before is very motivating. (2009)
When They Go OER, We Go Higher-Order Learning Tasks
Once we recognize that the biggest challenge in education is with the pedagogy, it should be clear that current efforts to personalize fragmented and compartmentalized learning objectives aren’t the answer. So what is?
We really need to think about how to design better instruction and the content to go with it.
Consider the following:
Imagine the backbone of a digital learning product was a rich collection of real-world, authentic, and varied learning tasks. These tasks would reflect the variety and richness of genuine activities encountered by real-life professionals and actors. Now we can’t expect students to be able to think or behave like experts immediately (see Willingham, 2009), so these tasks need to be carefully designed and sequenced to provide appropriate scaffolding, fidelity, and complexity, and to be linked to well-designed performance criteria. This approach would go a long way toward achieving the outcomes that proponents of discovery learning, problem-based learning, and inquiry learning laudably value, while avoiding the problematic nature of these less structured instructional methods (see Kirschner, Sweller, & Clark, 2006).
The key point is that the foundation of instruction should be a collection of rich learning tasks that reflect the activities we want students to carry out post-instruction. This isn’t a novel idea (see Van Merriënboer & Kirschner, 2017; Merrill, 2012). Instruction should be problem-centered, combining real-life problems with supporting direct instruction (Merrill, 2007).
Designing instruction to achieve this goal, admittedly, is easier said than done. There is a good reason we’ve preferred fragmented, topic-driven instruction for so long in education.
For example, consider the (non-exhaustive) obstacles:
- The challenge of creating a large collection of meaningful learning tasks that reflect the rich variety of problems and activities a student may encounter, which requires high levels of subject matter experience, creativity, and storytelling prowess
- The need to build validated skill hierarchies through cognitive task analyses and talk-out-loud protocols rather than relying on the unstructured opinions of subject matter experts (see Velmahos et al., 2004; Feldon, 2006)
- Creating a rich database of relevant learning tasks categorized along instructionally meaningful dimensions (e.g., scaffolding type, physical fidelity, performance conditions, task complexity, tool availability, etc.)
- Technological expertise to create interactive and high fidelity learning tasks potentially involving simulations, role play, interactive multimedia, and mobile assessments
- Expertise in designing and collecting performance assessment data that is seamlessly linked to real-world learning tasks
- The ability to carefully align and adapt the sequence of learning tasks in response to student performance
- Convincing educators and policy-makers of the value in moving from a topic-driven instructional approach to one designed around whole learning tasks that are capable of supporting complex learning and acquisition of 21st century skills (the biggest challenge of all?!)
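To make the task-database obstacle above a little more concrete, here is a minimal sketch of how a single learning task might be represented along instructionally meaningful dimensions. Every name and dimension here (the `LearningTask` fields, the scaffolding levels, the example tasks) is hypothetical, invented purely for illustration; no existing product is being described.

```python
from dataclasses import dataclass, field
from enum import Enum


class Scaffolding(Enum):
    """Hypothetical scaffolding levels, from fully worked examples to none."""
    WORKED_EXAMPLE = 0
    COMPLETION_PROBLEM = 1
    UNSUPPORTED = 2


@dataclass
class LearningTask:
    """One authentic learning task, tagged along the kinds of dimensions
    listed above (scaffolding, fidelity, complexity, tools, objectives)."""
    task_id: str
    description: str
    complexity: int                  # e.g., 1 (simple) .. 5 (ambiguous, multi-step)
    fidelity: float                  # 0.0 (abstract) .. 1.0 (full real-world fidelity)
    scaffolding: Scaffolding
    tools_available: list = field(default_factory=list)
    objectives: list = field(default_factory=list)  # integrated, not isolated


# A tiny "database" is then just a queryable collection of tasks:
tasks = [
    LearningTask("t1", "Summarize a small survey dataset", 1, 0.4,
                 Scaffolding.WORKED_EXAMPLE, ["spreadsheet"],
                 ["descriptive statistics"]),
    LearningTask("t2", "Design a sampling plan for a client", 3, 0.8,
                 Scaffolding.COMPLETION_PROBLEM, ["spreadsheet", "R"],
                 ["sampling", "bias", "communication"]),
]

# Query: all low-complexity tasks with worked-example scaffolding,
# i.e., candidate starting points for a novice.
starters = [t for t in tasks
            if t.complexity <= 2 and t.scaffolding is Scaffolding.WORKED_EXAMPLE]
print([t.task_id for t in starters])  # -> ['t1']
```

The real work, of course, is not the data structure but populating it with validated tasks and dimensions derived from cognitive task analysis.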
As this list makes clear, what we are talking about here is the need to design learning experiences that are qualitatively different.
Furthermore, I believe this approach reflects a more inclusive and less dogmatic application of learning and education science. Current applications of learning research in edtech typically fixate on supporting the acquisition and retention of low-level procedural knowledge, facts, concepts, and relationships through strategies identified in cognitive and psychological research. And while these strategies are undeniably useful for foundational learning tasks, their application is less clear when it comes to acquiring the strategies, attitudes, and abilities underlying more complex learning (see Koedinger et al., 2012).
So if the type of complex and rich learning we really care about requires instruction and content built around real-life, authentic, learning tasks, those of us designing future learning products definitely have some work ahead. What might these products look like?
Just for fun let’s imagine a new suite of learning products — let’s call it Apprentice™ — designed to meet the goals I’ve outlined above. Imagine a student logging into their statistics course and being treated to an experience analogous to entering into an apprenticeship with a famous statistician.
The student would likely begin with some low-complexity authentic tasks, starting at her current level of knowledge and reflecting the simplest activities a statistician or researcher might encounter. Eventually, however, the student would take on more challenging real-life tasks as her skills and knowledge demonstrably grew, progressing to complex, ambiguous, and collaborative problems. The student’s projects would be carefully sequenced and adapted to ensure appropriate levels of support, feedback, and guidance. And rather than being characterized by a sequence of discrete summative assessments that encourage a performance mindset and disincentivize feedback usage, each student’s instructional pathway would focus on incremental learning improvements and require that students apply feedback they receive to subsequent tasks (see Shute, 2007).
And as she takes on new tasks, the student would receive insight into how experts think through the problem-solving process and the mental models they employ when tackling similar tasks. Students would be exposed to detailed worked examples of novel challenges, followed by a gradual fading of support. The student should also be treated as an active collaborator, with agency to influence the type of tasks she undertakes and the liberty to choose the strategies she employs — even if they are likely to lead to failure. Furthermore, the authentic tasks would be representative of the variability and multidisciplinary nature of a domain to ensure deep and flexible understanding. And because these tasks would reflect realistic problems, they would also lend meaning to the requisite component skills and facilitate greater learner engagement. In fact, maybe the tasks would even have genuine real-life relevance.
Finally, both instructors and students would have access to a portfolio of the problems and tasks a student has attempted. And each task would be characterized by a rich description that includes the performance conditions (e.g., setting, time constraints, workload), complexity, completion standards, realism, available supports, guidance, givens, and integrated set of learning objectives. Using this information an instructor and/or student can make informed decisions about what tasks to take on next, what types of tasks require additional practice, and specific areas that may need additional study. Furthermore, a clear picture of what actual tasks a student is able to accomplish emerges, supporting greater self-regulated learning and more valid judgments about a student’s generalizable knowledge, affective states, and abilities (see Shute et al., 2016).
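The sequencing-and-adaptation idea above can be sketched as a simple policy: step complexity up only after repeated demonstrated success, step back down after repeated struggle, and otherwise keep practicing at the current level. This is a deliberately naive illustration under assumptions I am inventing here (scores normalized to [0, 1], a two-attempt mastery window); a real product would draw on far richer portfolio data.

```python
def next_task(tasks, history, current_complexity):
    """Pick the next task from a pool based on recent performance.

    tasks: list of (task_id, complexity) pairs still unattempted.
    history: list of recent scores in [0, 1] at the current complexity level.
    Returns the chosen task_id, stepping complexity up only after
    repeated success, and back down after repeated struggle.
    """
    if len(history) >= 2 and min(history[-2:]) >= 0.8:
        target = current_complexity + 1          # demonstrated mastery: step up
    elif len(history) >= 2 and max(history[-2:]) < 0.5:
        target = max(1, current_complexity - 1)  # struggling: step back down
    else:
        target = current_complexity              # keep practicing at this level

    # Prefer a task at the target complexity; fall back to the closest match.
    candidates = sorted(tasks, key=lambda t: abs(t[1] - target))
    return candidates[0][0] if candidates else None


pool = [("t3", 1), ("t4", 2), ("t5", 3)]
print(next_task(pool, history=[0.85, 0.9], current_complexity=2))  # -> 't5'
```

Note that the instructor and the student themselves could override such a policy — the point of the rich task descriptions is precisely to support those informed human decisions, not to automate them away.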
The resulting educational product would be centered around a collection of real-life tasks that integrate theoretical knowledge, domain-general skills (e.g., creativity, self-regulated learning, collaboration, information literacy, etc.), and affective dispositions. It would embrace the integrated and holistic nature of complex knowledge, equipping students with the capacity to successfully carry out realistic tasks in unfamiliar situations. And if I had my way, the tasks would all be tied together, perhaps with a compelling narrative, because stories are good for learning!
Sure, the expository information and recurrent practice found in traditional digital textbooks and study algorithms will still be needed, but as supportive and strengthening elements rather than being the backbone of instruction. And traditional assessment techniques would be replaced as much as possible by performance and artifact-based evaluations.
So the question I leave you with is how can we break out of the current educational fixation on personalization and instead focus on revolutionizing teaching through the creation of problem-centered learning experiences? How can we encourage companies to support educators by designing products driven by sequences of real-life learning tasks that support deep, meaningful, and complex learning? Because education desperately needs tools that leverage the potential of technology to empower students to pull up a chair next to a master practitioner and learn how humans have always learned best: through carefully guided immersion, deliberate practice, and induction.
Yes, personalization is important in education; but more important is ensuring we’re personalizing the right thing.
Feldon, D. F. (2006). The Implications of Research on Expertise for Curriculum and Pedagogy. Educational Psychology Review, 19(2), 91–110.
Goldman, S. R., & Pellegrino, J. W. (2015). Research on Learning and Instruction. Policy Insights from the Behavioral and Brain Sciences, 2(1), 33–41.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching. Educational Psychologist, 41(2), 75–86.
Koedinger, K. R., Corbett, A. T., & Perfetti, C. (2012). The Knowledge-Learning-Instruction Framework: Bridging the Science-Practice Chasm to Enhance Robust Student Learning. Cognitive Science, 36(5), 757–798.
Lim, J., Reiser, R. A., & Olina, Z. (2009). The effects of part-task and whole-task instructional approaches on acquisition and transfer of a complex cognitive skill. Educational Technology Research and Development, 57(1), 61–77.
Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2001). Classroom Instruction that Works: Research-based Strategies for Increasing Student Achievement. Alexandria, VA: Association for Supervision and Curriculum Development.
Merrill, M. D. (2012). First principles of instruction. John Wiley & Sons.
Merrill, M. D. (2009). First Principles of Instruction. In C. M. Reigeluth & A. Carr (Eds.), Instructional Design Theories and Models: Building a Common Knowledge Base (Vol. III). New York: Routledge Publishers.
Shute, V. J. (2007). Focus on formative feedback (ETS Research Report No. RR-07-11). Princeton, NJ: Educational Testing Service.
Shute, V. J., Leighton, J. P., Jang, E. E., & Chu, M. (2016). Advances in the Science of Assessment. Educational Assessment, 21(1), 34–59.
Willingham, D. T. (2009). Why don’t students like school?: A cognitive scientist answers questions about how the mind works and what it means for the classroom. John Wiley & Sons.
Van Merriënboer, J. J., & Kirschner, P. A. (2017). Ten steps to complex learning: A systematic approach to four-component instructional design. Routledge.
Velmahos, G., Toutouzas, K., Sillin, L., Chan, L., Clark, R. E., Theodorou, D., & Maupin, F. (2004). Cognitive task analysis for teaching technical skills in an animate surgical skills laboratory. The American Journal of Surgery, 187, 114–119.
Hi Jay, I find I’m having two opposing thoughts when reading your article. Part of me is thinking: “Yes! Finally, someone who gets it.” But then, there’s another part of me thinking: “Oh, no. Even though this article is clear and well-written, it’s going to be widely misunderstood and misapplied. While I strongly agree with this article, I’m going to end up disagreeing with almost everyone else who strongly agrees with it.” For example, a lot of people are going to read your article and believe that something like High Tech High is aligned with your vision and moving in the right direction. But I couldn’t disagree more. Even if High Tech High is an improvement over a traditional high school, I feel it’s moving toward a local maximum which is far short of the ideal you envision in this article.
Recently, I’ve had a number of conversations with Howard Johnson about apprenticeships, project-based learning, the state of educational research, and pragmatism. In general, I agree with Howard that we need to move to an instructional model grounded in apprenticeships and real-world projects. I also believe that, like you said, instructional content matters. I particularly love this statement:
I believe the instructional approach at the core of virtually all available learning products is largely inadequate for engendering the type of deep, transferable, and complex learning we want to impart to students. It simply doesn’t support the acquisition of the integrated set of cognitive strategies, affective dispositions, and foundational skills necessary to be a successful and flexible problem-solver, self-directed learner, and critical thinker.
However, I also believe context matters. The state of educational research is such that we’re not in a position to generalize yet. There are simply too many unaccounted variables in play. So my apprenticeships and projects may look very different from someone else’s apprenticeships and projects, and not all of them will be equally effective. Just describing apprenticeships and projects at a high level is too imprecise. We need details grounded in real-world context.
I’m arm-waving (over-generalizing) at this point, so let me share a concrete example of what I mean. Several years ago, I worked with a team of expert roboticists to develop a robotics course for high school students who weren’t typically interested in robotics or computer programming. These roboticists were domain experts and highly passionate about exposing more students to the wonders of robotics. And, since we were working without constraints, we were also free to dream big.
We quickly agreed to do a project-based course. But as we brainstormed instructional content together, the roboticists kept steering us down the same well-trodden paths—paths that everyone else was taking, but that led to misconceptions and dead ends rather than true understanding and insights into robotics. Basically, instead of enabling students to engage in real robotics, as these expert roboticists experience it every day, they were steering students toward a kind of empty faux-robotics, an oversimplified version of robotics that only exists in popular culture. I’m not talking about useful scaffolding here. I’m talking about experiences which would generate misconceptions that would only have to be torn down later if any of these students were to go on to become actual roboticists. It took months of unlearning and relearning on our part before we could develop authentic instructional content.
A few hundred years ago, through hard experience, we knew how to prepare kids to take over the family farm or the local blacksmith business. But today, we know next to nothing about preparing kids to be engineers, journalists, or doctors. We might think we know how to do it, but we really don’t.
Can we learn how to do it through iteration? Maybe the projects at High Tech High will improve over time as we learn from our mistakes? Perhaps, but I’m more than a little skeptical. I don’t see robotics projects getting any better over time. To come up with our authentic robotics project, we had to make a radical shift in thinking and spend a lot of time growing ourselves first. I don’t see others doing that because they’re not aware that any of that is necessary. Most people are thrilled with the High Tech High model.
One way to create more authentic projects is by shifting how we think about projects and apprenticeships. Projects can’t just be fun ways to make learning active and to integrate skills, knowledge, and cognitive strategies. Apprenticeships can’t just be a way to personalize learning and create deeper student-teacher connections. Projects and apprenticeships also need to be useful, both in the real world and in the here-and-now. The roboticists working with me would have developed authentic instructional content much more easily if (1) the project was something that the roboticists needed doing, rather than an exercise invented purely for the sake of student learning, and (2) they were trying to make their students as useful as possible because they needed those students to take over actual real-world work now, rather than simply trying to engage students and expose them to some robotics concepts to build on later. In other words, projects and apprenticeships need to be useful (pragmatic) for both the students and the teacher.
If you’re interested I can share an example of an inauthentic project which could be made authentic with a few changes. I’m working on an article about it now, but happy to share a quick preview.
Thank you for the thoughtful reply David.
I found myself agreeing with almost everything you wrote. Honestly, I think you are being too generous when you write, “The state of educational research is such that we’re not in a position to generalize yet.” The state of educational research right now, particularly when it comes to the type of complex learning you and I are discussing, is virtually non-existent. Yes, some foundational empirical pieces are in place, but we are flying blind to a degree that few in education would like to admit.
I believe the field of education needs to become much more open and intentional about measuring and reflecting on the effectiveness of current instructional practices. Unfortunately, there seems to be a persistent resistance to meaningfully quantifying and incrementally improving student learning — defined however one prefers. This, I would argue, reflects a serious ethical failing on our part as a society.
You are also spot on when you describe how experts are often poorly equipped to teach or design instructional content. It is the curse of expertise, often rendering one unable to understand the learning needs of novices unfamiliar with the to-be-learned content. This is precisely why I mentioned the importance of designing instruction informed by cognitive task analysis and other protocols rather than relying on expert judgments alone. Your description of designing the robotics course is a great example!
I must admit that I too share your ambivalence regarding my article — I know some people will read it and think I’m encouraging discovery learning or suggesting that every instructional experience must be ‘project-based’. Like you intimated, learning is far too complicated for such simple answers, and our knowledge is too nascent and fragmented to provide clear guidance in every situation. We shouldn’t let this uncertainty dissuade us from moving in the direction we believe to be most effective, however, and we have a critical obligation to measure and evaluate the impact of our actions on student learning with the goal of continuous improvement.