A screen isn’t human. Neither is a book.

The Chromebook-as-baby-sitter problem and the edtech abstinence movement
A movement has taken over social media that some have called “the abstinence movement in education” — no, not that abstinence movement, but one centered around banning screens. I will admit that I’ve played a part in this movement. I supported, and still support, the complete removal of personal devices from schools — that is, students’ cell phones.

The argument to ban all other screens goes like this: Screen time is associated with negative outcomes. Kids are losing either their capacity or their desire to focus, and given the choice, most will play mindless games and wander into porn sites rather than do hard intellectual work. And perhaps most importantly to me, as an educator: screens tend to create bad teaching. How? They become a sort of baby-sitter in classrooms, and because most educational technology is beholden to the views of progressive education, what gets put in front of students on these screens is ineffective pedagogy. Students need a rigorous diet of systematic, explicit instruction that involves modeling, practice, and feedback — not the exploratory trash that gets passed off as educational.

But you can immediately see the flaws in all of this, can’t you? Screen time is associated with negative outcomes. Associations are correlational, and the causal research on this topic is thin. If you go look at what people are citing, the case is built almost entirely on cross-sectional surveys where kids (or their parents) self-report how many hours they stared at a screen, and then somebody runs a correlation against test scores or grades.

And what gets measured is its own problem. “Screen time” in most of this literature is a single composite measure — TV, video games, social media, homework on a laptop, an adaptive math app, all collapsed into one number.
So when a study finds a small negative correlation between “screen time” and grades, we have no way of knowing whether the kids were watching cartoons, doomscrolling, or doing fluency practice on a literacy app.

Now, about that small negative correlation. I’m not the most ardent supporter of the meta-analysis methodology, but when I read the largest one (Adelantado-Renau et al., 2019, pooling 58 studies and over 100,000 kids), the headline finding was that overall screen media use is not significantly associated with academic performance at all (ES = −0.29, 95% CI −0.65 to 0.08; the confidence interval crosses zero). The negative associations they did find were specific to television viewing (ES ≈ −0.19) and video game playing (ES ≈ −0.15) — not “screens” as a category. The authors themselves noted that lumping everything together misses “the specific device used, the purpose of the task, the content, and the context.” A more recent regression analysis of 17,150 Chinese middle schoolers found small but significant negative associations (β = −0.02 to −0.11) — but again, it was cross-sectional, with the authors saying causation cannot be inferred (Feng et al., 2025).

And when researchers disaggregate further, the picture falls apart even more. Academic screen use correlates positively with achievement (Kim et al., 2017; Lukram et al., 2026). It is recreational use that correlates negatively. In other words, what predicts outcomes probably isn’t the screen but what’s on it.

I can already hear people saying, “we don’t have causal evidence on smoking either, but we know it’s bad.” That’s true, and it’s also the wrong analogy. Smoking research is correlational because you cannot ethically assign children to smoke.
Edtech research isn’t this way — we have dozens of meta-analyses pooling hundreds of experimental and quasi-experimental studies, going back four decades, comparing kids who used educational software against matched or control groups and measuring what they learned. Educational technology has shown positive effects on learning in these meta-analyses — effects that are small, uneven, and contested. Cheung and Slavin’s (2013) meta-analysis of K–12 math tech landed on ES = +0.15. Ma et al.’s (2014) meta-analysis of intelligent tutoring systems found g = 0.43, though Steenbergen-Hu and Cooper (2013), looking specifically at K–12 math ITS, found near-zero effects (g = 0.01 to 0.09). Cheung and Slavin’s effects also declined over time, from +0.23 in the 1980s to +0.12 after 2000, with the largest RCT in their review (Dynarski and Campuzano) coming in near zero. But a recent meta-analysis of adaptive learning systems in primary and secondary schools found a medium-positive effect across cognitive, affective, and behavioral outcomes (Wang et al., 2026).

Mixed, modest, and exactly what you would expect if my hypothesis is right. The studies mushed into these meta-analyses are so wide-ranging in quality and design — discovery games, “engagement” over instruction, no mastery criterion, no error correction, no carefully sequenced Engelmann-esque examples — that we cannot conclude technology-enabled instruction doesn’t work, any more than we could mush all the teachers on the planet together and conclude that teaching doesn’t work.

The Chromebook-as-baby-sitter problem is worth our consideration, but I suspect it is the consequence of a deeper one: a view of teaching as activity-based, and of planning as time-filling. When instruction is organized around activities rather than objectives, edtech becomes one more activity to slot in — alongside the station rotations and the worksheet packets and the Pinterest projects.
Of course a teacher who lazily treats edtech as a quieting activity will not see massive gains compared to simply teaching the kids directly. As someone who observes lessons for a living, I see this daily: there is no purpose for the use of the technology, there is no systematic program of skills that the kids are mastering, there is no sophisticated “Theory of Instruction” within the apps, and there is no rhyme or reason for this shitty tech. It is just there to make the day go by a little faster for the teacher, and maybe for the students too.

Indeed, the “teachers use it poorly” argument cuts both ways. Teachers use a lot of things poorly. Teachers use read-alouds poorly. Teachers use exit tickets poorly. Teachers use seating charts poorly. Even a book off the shelf can be used to babysit — or worse, to teach something false. Direct, explicit instruction itself is a weapon that can be used to indoctrinate in Holocaust denial just as easily as it can be used to eradicate bigotry and inspire kids to make a difference in this world.

The loudest argument against these devices is that screens are not human. Children need human connection. Learning involves relationships. A machine cannot love a child. All of that is true. And none of it is an argument against educational technology, because a book is not a human either.

A book does not love the child reading it. A book does not kneel down to a child’s eye level. A book does not dry shoes that have been rained on, nor notice when a kid hasn’t eaten. A book does not remember that this particular child’s dad just deployed. A book is a static, mass-produced artifact, manufactured by strangers and shipped to a school where it sits on a shelf until a teacher pulls it down. By every standard the abstinence movement applies to a screen, a book should also be banned. But of course nobody wants to ban books, because we all understand that a book can reveal the wonders of the universe.
A book does not replace the relationship between humans — it extends it. A teacher with great books does more than a teacher without them. A teacher with a great reading curriculum does more than a teacher with a pile of unrelated predictable readers. The book is not competing with humans; humans use books to learn.

This is why the abstinence position is intellectually lazy. It skips the real questions: what are we putting in front of students, what are they thinking about, and are they learning? The ban-it-all crowd does not want to have that conversation, because having it requires them to know something about instruction, about mastery, about curriculum, about cognitive science, and to confront the fact that many kids are getting extremely subpar paper-pencil lessons, day after day, week after week. It is much easier to sign an oath to the cult of abstinence than it is to think in systems, engineering, and design.

All this is to say, I have sympathy for folks who want to advocate that THEIR child not be subjected to technology during school, for the many reasons I’ve mentioned above and more. Maybe you don’t trust that the school is capable of preventing your child from getting onto horrible and damaging sites. Maybe you suspect teachers will use it to pacify and babysit your kid — as I have observed firsthand. Maybe you just want your kid to get much better at penmanship, and to spend their precious education in front of paper books, with screen time reserved for special moments, or not at all.

But have you thought about the high schooler who is years behind and needs a suitable app to get them their GED in an alternative school? What about the parent whose child already knows all the grade-level content and could see their enjoyment in math soar if they had access to an app like Math Academy?
What about the parent — new to this country, watching their kid struggle — who realizes early on that the only way to get a baseline of facts taught properly, through direct, explicit instruction, is to hire a tutor they cannot afford? Meanwhile, the public school they’ve been assigned to serves up a smorgasbord of romantic social-emotional mumbo jumbo, telling kids that knowledge isn’t important and that all you need to do is, ironically, “look it up” in the age of AI. What about them?

If you want less tech for your kid — I’m all for it! Just say, “I don’t want my kid in such a school,” and find one that fits your values. I work for many of them, and I don’t want them changing a single thing. We are in an unprecedented era of school choice. If this matters to you — and it should — go advocate for your kid. But to pretend that you have the moral high ground, and that the world is evil if it doesn’t adopt your preferences, is not the way to go.

It would be a shame if public schools left all this potential on the table — scalable instructional systems that can teach far more children powerful, enabling knowledge than humankind has ever thought possible — all because of an overcorrection.

References

Adelantado-Renau, M., Moliner-Urdiales, D., Cavero-Redondo, I., Beltran-Valls, M. R., Martínez-Vizcaíno, V., & Álvarez-Bueno, C. (2019). Association between screen media use and academic performance among children and adolescents: A systematic review and meta-analysis. JAMA Pediatrics, 173(11), 1058–1067. https://jamanetwork.com/journals/jamapediatrics/fullarticle/2751330

Cheung, A. C. K., & Slavin, R. E. (2013). The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educational Research Review, 9, 88–113. https://www.sciencedirect.com/science/article/abs/pii/S1747938X13000031

Feng, X., Ren, S., & Shi, P. (2025). The relationship and mechanism of screen time and academic performance among adolescents: An empirical study based on CEPS. Frontiers in Public Health, 13, 1533327. https://doi.org/10.3389/fpubh.2025.1533327

Kim, S. Y., Kim, M.-S., Park, B., Kim, J.-H., & Choi, H. G. (2017). The associations between internet use time and school performance among Korean adolescents differ according to the purpose of internet use. PLOS ONE, 12(4), e0174878. https://doi.org/10.1371/journal.pone.0174878

Lukram, D., et al. (2026). Screen time patterns and their impact on academic performance: A prospective study among Phase 1 MBBS students. Cureus, 18(1), e101440. https://doi.org/10.7759/cureus.101440

Ma, W., Adesope, O. O., Nesbit, J. C., & Liu, Q. (2014). Intelligent tutoring systems and learning outcomes: A meta-analysis. Journal of Educational Psychology, 106(4), 901–918. https://www.apa.org/pubs/journals/features/edu-a0037123.pdf

Steenbergen-Hu, S., & Cooper, H. (2013). A meta-analysis of the effectiveness of intelligent tutoring systems on K–12 students’ mathematical learning. Journal of Educational Psychology, 105(4), 970–987. https://eric.ed.gov/?id=EJ1054449

Wang, J., Tigelaar, D. E. H., Ye, T., & Admiraal, W. (2026). A meta-analysis of moderators of the effects of technology-enhanced adaptive learning on primary and secondary students’ learning outcomes. Journal of Computer Assisted Learning, 42(1), e70168. https://doi.org/10.1002/jcal.70168
Wednesday, 29 April 2026