During Lent, I abstained from social media. During that time, I read many books, including Cal Newport’s Digital Minimalism: Choosing a Focused Life in a Noisy World. Newport convinced me: my technology habits needed to change permanently—beyond the pre-Easter season. I won’t attempt to summarize his book; you should read it for yourself (seriously). Instead, I want to tell you about a singular moment I experienced after my cautious, limited, intentional reentry into the world of social media.
Recently, I published a blog post; writing more has been a major, unforced byproduct of cutting social media out of my life, and I was excited to have written something personal for public consumption—something I used to do regularly but have neglected for quite some time. I hopped on Facebook during the short window when it was accessible to me that day (thanks, Freedom), just to share the link to my new blog post. I posted, exited, went on with my evening, and then went to sleep. No big deal.
Except . . . the next day, I found myself longing to know if people had engaged with my post. Comments? Likes? Loves? I won’t look . . . Or, maybe just one quick look. But, wait—I missed the window, and now Facebook is blocked on my laptop and phone browser for 12ish hours. Should I quickly download the Facebook app onto my phone, check, and then delete the app? Yeah, I should do that.
I repeated that process—download, check, delete—three times within 24 hours. The post received a small amount of engagement. I have a theory as to why: I haven’t been on Facebook for about two months, not posting or commenting or liking, so the algorithms are mad at me. Or maybe very few folks care about my writing. Or both. Regardless: I was craving the whatever-neurotransmitter hit that comes from comments and likes.
In his masterwork Mindstorms, published in 1980 (!), Seymour Papert wrote:
In many schools today, the phrase “computer-aided instruction” means making the computer teach the child. One might say the computer is being used to program the child (emphasis his).
When I taught technology at a K-6 institution, Papert’s philosophy—namely, that the child should program the computer, and not the other way around—was a huge component of my pedagogy. Papert’s work appealed to me on a theoretical level, but it compelled me most because I saw, every day, the effects of letting technology program our children.
So many of my preteen and teen students are on social media. They are vying for likes and comments the same way we adults are. Instagram is programming their brains to respond positively to engagement and negatively to a lack thereof. Their physiological human experience is being radically reprogrammed by iPhone apps.
(This is an issue in many school technology curricula, too, but I won’t get into that here.)
Of course, as a person whose entire professional life is devoted to the wellbeing of children, this bothers me. Bad. Like . . . it keeps me up at night. I have an entire folder on my desktop of “essays” I’ve written about the topic. But that’s not what this blog post is about—not primarily. Primarily, it is, as promised, about a singular moment:
I woke up this morning wanting to check my post. Should I engage in the download-check-delete cycle again? Writing in my journal, I told myself that, no, I would not look again. I would never look again—or at least not until the post popped up in my “memories” a year from now. I giggled thinking about the “challenge”: I’d post links to my blog posts on Facebook, but I would explicitly disallow myself from looking to see whether there had been comments or likes. This seemed—and still seems—so funny to me. Passive subversion—against Facebook as a corporation, against our social-media-saturated culture, against egocentrism. Against my own reprogramming, in which I have been unwittingly complicit for many years now.
We’ll see if I’m successful in this aim. It makes me laugh, which compels me to think I’ll succeed, but maybe technology has already programmed me so deeply that sustaining it will be nearly impossible. Who knows. Might as well try.