Students First, Not Stuff

Dec 14, 2014

by Will Richardson

Putting technology first—simply adding a layer of expensive tools on top of the traditional curriculum—does nothing to address the new needs of modern learners. Technological change is not additive; it is ecological, which means it changes everything.
Neil Postman

For us teachers and education leaders, this moment of rapid and radical technological change is not what we signed up for, is it? A trillion web pages; a billion smartphones; movies, TV shows, newspapers, and novels on demand, wherever we are, whenever we desire; near-ubiquitous courses and coursework, with teachers, tutors, and technologies that let learners of any age learn whatever they want, whenever and wherever they choose. “Always on” access has created an abundance of learning potentials that scarcely existed even a decade ago.

No, this is not the picture most of us painted for ourselves when we went into education. Most of us went into teaching understanding that school was pretty much the only education game in town, the place where kids came to get information and where, at the end of the day, we were responsible for disseminating the knowledge, assessing whether our students got it, and stamping it “an education.” For the vast majority of kids (and for us, too) who attended a brick-and-mortar school, that’s been the unbending, monolithic vision of schooling for 150 years.

So what do we do when that vision begins, finally, to be undermined? What do we do when the fast-maturing technology—the web—which has upended just about every other traditional institution we’ve grown up with, sets its sights squarely on schooling? And what do we do as schools become just one of many places in both the real and virtual world where our students can get an education?

Welcome to what promises to be the messiest, most upheaval-filled 10 years in education that any of us has ever seen. Resistance, as they say, is futile.

So Many Ways to Learn

The news that change is now seriously afoot, although daunting, is good news on many levels. In fact, it’s hard not to look at it as great news for kids, who will see the growing availability of computers and access as a means to learn deeply and passionately in ways the current system of schooling was never built for. I can’t help but look at my own two kids and feel a twinge of envy. For the learner, these are exciting times.

But while the world of learning expands, many education policymakers seem to want to limit it to a discrete set of standards and outcomes. While our children pursue a growing array of interests and passions using technology outside school, we’re narrowing opportunities for learning inside school as we raise the stakes on standardized assessments to evaluate not just students but also teachers, schools, districts, and states.

Worse, in the debate over what comes next, few people seem to understand that the web and the technologies that drive it are more than just vehicles for delivering the traditional curriculum more effectively as measured by those assessments. Instead, as Neil Postman suggests, those technologies change everything. Right now, we should be asking ourselves not just how to do school better, but how to do it decidedly differently.

What changes, exactly? Suddenly having access to an abundance of content, knowledge, and teachers—and being able to tap into all of that at a moment’s notice—is just the start. Here’s the real sea change with the web: Learning is now truly participatory in real-world contexts. The transformation occurs in that participation, that connection with other learners outside school walls with whom we can converse, create, and publish authentic, meaningful, beautiful work. That’s when technological change becomes ecological, when the classroom walls are obliterated, when students truly drive their own learning, and when people whom we will never meet in person become some of our best teachers.

Putting Learning First

This moment of technological explosion raises a host of important questions for education leaders that speak directly to the way we think about the potentials of technology in school. If we see technology simply as additive, our questions will be about the technology: Should we get iPads or laptops? Does every classroom need an interactive whiteboard? Which apps best engage students? And so on.

As Larry Cuban and others have pointed out, we’ve spent billions of dollars on technology that by almost every measure has had little or no widespread effect. No doubt, we’ve spent millions of dollars on iPads and interactive whiteboards in schools that do little more than deliver digitized worksheets or teacher-directed content to students.

But it’s not about the tools. It’s not about layering expensive technology on top of the traditional curriculum. Instead, it’s about addressing the new needs of modern learners in entirely new ways. And once we understand that it’s about learning, our questions reframe themselves in terms of the ecological shifts we need to make: What do we mean by learning? What does it mean to be literate in a networked, connected world? What does it mean to be educated? What do students need to know and be able to do to be successful in their futures? Educators must lead inclusive conversations in their communities around such questions to better inform decisions about technology and change.

What Do We Mean by Learning?

Almost a decade ago, Seymour Sarason asked the most basic of questions in the title of his book And What Do YOU Mean by Learning? If we define learning as “higher student achievement” or “improved student performance” as measured by test scores (as many of us do), then we’ll add technology primarily to deliver the current curriculum more effectively, to make those scores “better.” When I hear school leaders talk about spending thousands of dollars on devices to make students more engaged or to “personalize” instruction, I know they mean to change little, if anything, of what learning looks like in classrooms. Old wine, new bottles.

Sarason (2004) writes that “productive learning is the learning process which engenders and reinforces wanting to learn more” (p. x). Never has that been more possible than at this moment of abundant access to information, knowledge, and people via the web. But “wanting to learn more” suggests a transfer of power over learning from teacher to student—it implies that students discover the curriculum rather than have it delivered to them. It suggests that real learning that sticks—as opposed to learning that disappears once the test is over—is about allowing students to pursue their interests in the context of the curriculum. And it suggests that learning should have an authentic place in the world, that it should be shared with the world. I think John Dewey and Maria Montessori, both of whom saw school as a place for students to do real-life learning around the things that interested them, would be thrilled at the potentials that today’s technologies bring to that vision.

That shift—from teacher to student control, from contrived to authentic creation and sharing of outcomes, from covering the curriculum to discovering it—represents ecological change. That’s not to say that teachers haven’t used these approaches before or that these approaches rely exclusively on technology. But it is to say that now—because of the web, because of how new technologies create new ways to connect, create, and communicate—those changes must become the rule in our classrooms, not the exception.

And with those changes comes a change in the role of the teacher. Teachers must be colearners with kids, expert at asking great, open-ended questions and modeling the learning process required to answer those questions. Teachers should be master learners in the classroom.

What Does It Mean to Be Literate?

Reframing learning in classrooms requires us to reframe literacy as well. As the National Council of Teachers of English suggests, literacy is much more than simply reading and writing texts. The organization’s position statement (n.d.) now defines 21st century literacies as including “proficiency with the tools of technology,” an ability to “manage, analyze, and synthesize multiple streams of simultaneous information,” an ability to “design and share information for global communities to meet a variety of purposes,” and more.

Some, like Stanford professor Howard Rheingold, believe that technology now requires an attention literacy—the ability to exert some degree of mental control over our use of technology rather than simply being distracted by it—for users to be productive. Henry Jenkins, formerly of the Massachusetts Institute of Technology (MIT) and now at the University of Southern California, advocates for transmedia literacy, which includes networking and performance skills that take advantage of this connected, audience-rich moment.

But these literacies don’t just apply to our students; they apply to us as well. If we cannot use these new literacies in our own lives, if we’re not “proficient with the tools of technology,” how can we make sound decisions about the technologies that will support this kind of literacy development in our students? It’s not surprising that we’ve wasted millions of dollars buying technologies that few, if any, of the adults in the room were familiar enough with to guide the decision-making process.

What Does It Mean to Be Educated?

Connective technologies like the web now require us to rethink what “being educated” means. As we gain more access to more information, content, knowledge, and teachers, it’s becoming increasingly possible for learners to educate themselves outside the traditional structures of school. I’m not just talking about taking online courses, which by and large are simply additive in that they change little more than the delivery model. With access, and with a full set of skills and literacies to use this access well, we now have the power to create our own education in any number of ways.

Take massive open online courses—or MOOCs. Although MOOCs have exploded in the last year, the original idea, created in 2008 by Canadian researcher-educators George Siemens and Stephen Downes, was to organize a learning experience around a topic that online students could then explore in a way that best suited them. That might mean writing and responding to blog posts, participating in webinars, contributing to discussions, or even meeting with other participants in Second Life. In Siemens and Downes’s original iteration, the MOOC enabled more than 2,000 passionate, self-directed learners from all parts of the globe to educate themselves, teaching one another and modeling the powerful potential to learn deeply with others online.

Now, almost five years later, a host of respected universities, such as Stanford, Princeton, and MIT, are offering more structured versions of MOOCs led by experts and professors in the field. These versions are more additive in that they offer shared online learning spaces that feature readings, homework, and assessments mostly selected and driven by the instructors. But they also represent ecological change because they’re free. Millions have signed on, and tens of thousands have received certificates of completion.

But here’s the newest twist. Some universities, like Colorado State University, are now offering full college credit for completion of free MOOCs (Lewin, 2012). That changes things a bit, don’t you think?

I’m not suggesting that a four-year college degree is no longer worth striving for. What I am saying is this: As tuition costs continue to rise and as opportunities to earn credits, certificates, or badges for a variety of online learning experiences continue to grow, our youngest students will have many opportunities to create their own paths to an education when they leave the K–12 system. Instead of helping our students become “college ready,” we might be better off making them “learning ready,” prepared for any opportunity that might present itself down the road. That’s an ecological shift in thinking.

What Do Students Need to Know?

Now that we have so much information and knowledge at our fingertips, what’s important for students to know? And how can we make sure that the knowledge they acquire in the classroom is relevant enough that they can put it to good use in their lives outside school?

The current “knowledge-based” curriculum that most schools use was created at a time when access to information was scarce; we wanted to make sure our students had a working memory of the facts, figures, and skills we believed they would need in their lives. But the world has changed.

The reality is that I no longer need to send my children to a school to learn algebra, U.S. history, or French. For many students around the world who have no access to schools but who have a self-directed disposition to learn, MIT OpenCourseWare or courses offered through Khan Academy will provide all the knowledge they need to pass a typical test on the subject.

What if we focused on developing kids who are “learners” instead of trying to make sure they’re “learned”? What if, instead of delivering the same, common education to every student, we focused on developing the skills and dispositions necessary for them to learn whatever they need to learn whenever they need to learn it? That means rethinking classrooms to focus on individual passions, inquiry, creation, sharing, patient problem solving, and innovation.

That doesn’t mean that we throw all information and knowledge out of the curriculum. No question, all kids need to be able to read and write effectively, understand enough math to function in their daily lives, and have a basic understanding of science, history, and more. But we must be willing to consider that in a world full of access to knowledge and information, it may be more important to develop students who can take advantage of that knowledge when they need it than to develop students who memorize a slice of information that schools offer in case they might need it someday.

It’s About the Students

Asking and answering questions like these before we consider adding technology in school would do much to ensure that the decisions we make and the money we spend are in the service of learning, not of an impulse to keep up with the school down the road or to show off the next shiny, expensive object marketed as a learning device. Our students certainly need connected, networked devices and a fluency in how to use them so they can compete with people who have both of those things. But giving students devices and access is only a small part of the equation.

Right now, the web requires us to reconsider the ecology of schools, not just the technologies we use in them. We must start long-term, broad, inclusive conversations about what teaching, learning, and being educated mean in light of the new technologies we now have available to us. Just like business, politics, journalism, music, and a host of other long-standing institutions that the web has rocked at their foundations, education will be and is being changed. To understand the implications fully, we need to start with the questions that focus on our students—and not just on the stuff.

References

Lewin, T. (2012, September 6). Colorado State to offer credits for online class. The New York Times. Retrieved from www.nytimes.com/2012/09/07/education/colorado-state-to-offer-credits-for-online-class.html

National Council of Teachers of English. (n.d.). The NCTE definition of 21st century literacies. Retrieved from www.ncte.org/positions/statements/21stcentdefinition

Sarason, S. (2004). And what do YOU mean by learning? Portsmouth, NH: Heinemann.

Will Richardson is a parent, educator, speaker, and author, most recently of Why School? How Education Must Change When Learning and Information Are Everywhere (TED Conferences, 2012). He blogs at http://willrichardson.com.