Human Work That Makes Learning Real
In college, I took an introductory course in art history that was a real challenge for me, both in terms of the volume of material to process and the precision of the analysis demanded by the professor. To prepare for the final exam, two friends and I spent a full 24 hours reviewing images and artists, analyzing motifs and movements, and memorizing influences and dates. It was a head-swimming exercise, and we all wandered to the exam room bleary-eyed—but also decidedly more confident and competent in our understanding.
Were I doing this final exam review today, I have no doubt that I would be employing AI to help me in my studying, and to profoundly good effect; indeed, it’s not hard to imagine how artificial intelligence could promote adaptive reviewing, help in outlining and refining analytical arguments, or propose thematic connections between works across cultures or eras. This would surely have proved useful in my attempt to grasp a survey of art history.
At the same time, I do not want to dismiss what happened between me and my friends on that spring day 30 years ago. While I am confident that AI would have been more coherent in helping me to recall, say, the key elements in the transition between Baroque and Neoclassical painting in France, I wonder what it would have sacrificed in terms of healthy life skills.
A study session between friends demands desirable human qualities of its participants. It may need someone to approach and organize group members. It may ask someone to confide in others that they are struggling. It may require participants to rely upon one another to deepen their learning. It may call for patience as imperfect understandings are offered, challenged, refined, and offered again. This process surely does not have the efficiency, instantaneity, or even efficacy of a chatbot, but the imperfect peer study group may promote other practices and dispositions—initiative, vulnerability, collaboration, perseverance—that boys becoming both accomplished students and purposeful men would do well to explore and adopt.
AI is not inherently bad, nor to be avoided at all costs. With proper guidance and scaffolding, artificial intelligence applications have a very real capacity to expand exploratory horizons, increase information access, and amplify creativity among students. But as media scholar Neil Postman famously claimed, "[t]echnological change is not additive; it is ecological." When a new technology arrives, it does not merely add something new—it changes everything that already exists. AI has tremendous potential to enhance certain educational practices, but we must recognize that it also carries the possibility of reshaping learning cultures and interactions in many ways.
There are rightful concerns about personal privacy and data mining, about the potential to circumvent academic honesty and integrity, and about the ubiquitous temptation to outsource all human thinking and creation to a technological tutor. (Even as I write this blog post, the AI of my operating system is continuously offering to "refine" my prose.) But we should also bear in mind that fundamental social and emotional skills will go underdeveloped or neglected if relying on AI becomes our default move—or our only move—each and every time we need help, want to deepen our learning, or seek feedback.
Schools like Browning should certainly not be in the business of demonizing AI. We should instead continue the hard work of thinking through what kind of human experiences—collaborative, relational, connective experiences—we need to structure for our boys to ensure that their learning experiences are not just efficient and polished, but also soulful, textured, meaningful, and enduring.