Questioning 1-1

[Image: a multidirectional signpost written in Welsh]

Way back in August, before I took an unplanned five-week blog rest, I wrote a post about attending a community presentation by the overly-large school district that once employed me. The assistant super and his associates wanted to explain their plans for an upcoming 1-1 program.

I ended that post by saying that I have a lot of questions about the project. Let’s start with one of the most basic queries: Why?

Ok, I suppose that’s rather broad. The project page on the district website tries to lay out a rationale, so we’ll start there.

Students’ lives have changed considerably in how they live, communicate, work, and interact within a globally connected world. Students need both content and skills outlined in the Portrait of a Graduate in order to be successful in the workforce of the future. FCPSOn can support the development of both content knowledge and Portrait of a Graduate skills. 

FCPSOn increases equitable access to technology and to instructional practices that lead to personalized, meaningful learning experiences. This allows students opportunities for deeper understanding of content and the skills needed by the Portrait of a Graduate. 

FCPSOn will support teachers as they create learner-centered environments that help students learn concepts in meaningful experiences. The technology not only facilitates learning, it also frees time to focus attention in places that makes teaching and learning more rewarding.1

That’s it. Hard to argue with anything in that collection of eduspeak, but it really doesn’t answer my original question. Allow me to expand on it.

Why will giving every student a Windows laptop2 improve their learning?

If every student is carrying a device, does that really lead teachers to create “learner-centered environments” and “meaningful experiences”?

Where is the evidence that continuous access to powerful computing and communication technology will result in students leaving high school with those Portrait skills that are the centerpiece of the superintendent’s goals?

The rest of the page doesn’t really address any of those questions. But whatever committees3 wrote it worked hard to cover as many different educlichés as possible. Anytime/anywhere? Yep. Collaboration? You bet. Digital citizenship? Of course. Too much time on screens? Of course not.

Of course, the simple answer to all these interconnected questions is that technology will not improve learning, make it more meaningful, or improve student skills. Not unless we also make substantial changes to the rest of the learning process.

Missing from this page is the substantial issue of how the curriculum will be rewritten to make best use of these “digital tools”. Access to huge amounts of data and information should allow a shift from students memorizing lots of facts and processes to understanding how to organize, validate, and synthesize that information. But that doesn’t happen automatically.

The page also claims that, as a result of this project, students will work on “authentic projects and real world problems”. So will that change the primary means of assessing student learning? Most instruction is still firmly locked to the state standardized testing program, not to mention the curriculum pacing guides and the district’s expensive, home-grown “electronic” assessment system.

Then there are a few oddities on the page that make me go “huh?”. For example, how will students work on those projects when district policies prevent them from directly connecting with the outside world?

And I certainly don’t understand how adding computers to the classroom relates to this supposed effect of the project: “Supporting planning and reflection of student-created goals and teacher-directed learning outcomes”. How are goals “student-created” if “learning outcomes” are directed by the teacher?

As I said in the previous post, since I’m no longer in the middle of all this, my ranting here is almost entirely based on the small amounts of information provided to the community, like this project page. I could be completely wrong. District leaders may have already addressed all of my questions and have major changes in the works.

I look forward to being shown the errors in my ranting.

But until that happens, I will have more questions…


Image credit: Photo of a multidirectional sign post in Welsh by Dave Clubb was downloaded from Unsplash and is used with permission.

1. For those who are not part of the district, some explanation of terms: FCPSOn is the branding name given to the upcoming 1-1 project. Portrait of a Graduate is a collection of skills, divided into five categories, that a student should have when they leave school. It’s actually not a bad statement. Too bad it doesn’t really connect with what is actually happening in most classrooms.

2. I know it will be a device running Windows. The IT department works very hard to stop any other option.

3. I’m very sure the FCPSOn webpage was wordsmithed by several different offices in the district, probably including the lawyers.

Spotlight Mismatch

Description of a Spotlight report on Personalized Learning from an EdWeek newsletter:

See how schools are using algorithm-driven playlists to customize lessons for students, consider red flags to look for when purchasing products, and learn how personalization can make learning more social.

I’m almost curious enough to give them my personal information, just so I can understand how “algorithm-driven playlists” and customized lessons can make learning “more social”. Seems like a big mismatch to me.

The next item in the newsletter describes their Spotlight report on Maker Education:

Learn how schools are embracing student-driven learning, ensuring equity in maker education, and providing students with opportunities to develop real-world skills.

Is it possible to have “algorithm-driven playlists” and “student-driven learning” in the same classroom? Or do these reports describe two completely different groups of students? And if that’s the case, how do we decide which students get “personalized” and who gets the “opportunities to develop real-world skills”?

Lots of questions. Not many good answers.

Digital Conversion

In the last few years, many districts in this area have been promoting a “digital transformation” in their schools, including Fairfax, the system that employed me for many years. It’s a nice phrase and one that is often linked to 1-1 programs. But what does the phrase really mean? What exactly is going to be transformed?

Dig into the plans – posted on websites, presented at conferences, explained in conversations – and you hear a lot of elements not related to learning. The discussion is about technology and support issues: What device should we buy? Do we have enough bandwidth? We need more power outlets. How do we pay for all this? What happens if a student does something wrong with the machine we’re handing them?

Almost completely missing is an explanation of the major changes that will come in curriculum, pedagogy, assessment, or pretty much anything else instructional, as a result of buying all the equipment, software, and infrastructure.

Ok, I know transformations like this take time, especially in a tradition-bound institution like American education. And I’m also sure this kind of external communication doesn’t cover all the pieces districts are considering in their planning.

So, at the risk of covering issues already being addressed, I have a few questions for districts and schools undergoing a digital transformation.

How are you planning to change the curriculum teachers and students will be working with?

Shouldn’t the concept of learning change when information is no longer scarce? When the process of “teaching” is no longer one way from teacher to student? Asking students to recreate the same research papers their parents wrote makes no sense. Plodding through sheets of problems that their phones could solve in seconds, and which add nothing to their understanding of mathematics, wastes everyone’s time.

Are you providing enough support and time for teachers to learn the pedagogy to accompany all the digital?

Managing computers in the classroom is important. Knowing how to work Google Classroom or Office 365 is certainly part of the mix. But using Google is not necessarily transformative. Shifting the standard assignments from paper to digital is not at all transformative. And it’s going to take a lot of time for teachers (measured in years, not semesters) to make the major alterations to their practice that take complete advantage of the new opportunities available in their classroom.

How will evaluation change to match the transformed expectations for learning?

Certainly there are basic knowledge and fundamental skills that we should expect any educated person in our society to have. Beyond that, digital tools allow for exploring the personal interests and talents that all students bring to school. So how do we assess their learning of both the essential materials and their individual goals? It’s not through standardized tests, and we need to figure it out if this transformation is ever going to happen.

And finally, where are the students in your transformational planning?

Educators talk all the time about how the kids are the most important part of school. However, we rarely include them in any of these discussions. Not with surveys. Not by asking their opinion about school rules. Not with a few focus groups once most of the plans are in place. Students need to be at the table when we are finding the answers to all of the questions above. It’s their education. They will benefit most from their work in school (or possibly benefit very little). They need to have an equal voice.

This is just a start. There are many, many other questions that need to be asked, all part of the process of creating real change.

Because if you are using technology to digitize the same old learning process, what you get is a digital conversion, not a transformation.

Conference Overload

Did you ever think that there might be too many education-related conferences? Especially edtech-related?

You probably don’t know the half of it.

Twice a year, a consultant from Toronto assembles a list of “selected events that primarily focus on the use of technology in educational settings and on teaching, learning, and educational administration”, to be held in the next six months, all over the world.

And it is a very long list. I didn’t bother to count the number of items in the current edition, but the information is distributed in a 102-page Word document, with each entry given two or three lines of 10pt type.

Some, like the International Workshop on Content-Based Multimedia Indexing (next month in Bucharest, Romania) and WorldFuture (presented by the World Future Society in DC in July), don’t exactly strike me as education conferences. And the list likely misses many state and local conferences.

However, my overall feeling as I scroll (and scroll, and scroll,…) through this list is: Are all these meetings really necessary? They all cost someone money and time to assemble; are they worth the costs involved? Do participants at these events really learn something that improves their practice, and, more importantly, positively impacts their students?

As someone who attends and presents at a few conferences a year, I always leave them asking those same questions. I’ll be in Denver for ISTE next month (attending, not presenting) and I know I’ll learn from the people I meet, as well as have a good time. But that doesn’t mean I won’t question the value of both the conference and my participation.

Anyway, just something to think about. If you’re interested in scrolling through the conference list yourself, the 35th edition, covering mid-May through December 2016, is now available.

Observing the Future

Betteridge’s law of headlines states that “Any headline that ends in a question mark can be answered by the word no.”

Take, for example, a recent article on the Ars Technica website: “Is your smartphone making you dumb?”

And despite the provocative question, the authors of the study being referenced don’t actually arrive at that conclusion.

“the results are purely correlational,” emphasize Golonka and Wilson. There’s no way to tell whether an over-reliance on smartphones decreases analytical thinking or whether lower analytical thinking ability results in a heavier reliance on smartphones, they explain.

Of course, this is one instance in a long line of “research” and “analysis” provocatively asking if Google, the internet, social networking, or technology in general is impacting human intellectual development in some way. For good or bad. Maybe both.

Did societal observers have the same questions in the aftermath of the printing press, telegraph, telephone, radio, or any other major change in the way people communicated information? I suspect they did. Did humans get dumber? Smarter? Weirder?

I’m pretty sure the honest answer to the question of what the use of smartphones/instant search/social networking/<insert your tech fear here> is doing to our brains is “we don’t know”. All of these digital tools that some say we are addicted to (another of those headline concerns) are very, very recent developments in human history. It takes more than a decade or two to sort through all the data.

Which is all the more reason to do our own research. Be introspective about ourselves and observant of others. Pay attention and we’ll watch the future of the human species develop.

I’m pretty optimistic about it.