ChatGPT Concerns? De-Humanizing Education is Nothing New …

It was an interesting year, 1924, 99 years ago. Astronomer Edwin Hubble demonstrated that Andromeda was a galaxy far beyond our own, the sarcophagus of King Tut was opened in Egypt, and the first automated teaching machine was invented by Sidney Pressey, a young psychologist at Ohio State University.

These three events share a thread linking to notions of human existence, but Pressey’s teaching machine serves as a stepping stone on a path leading to today’s pile-up over AI and its continued development. Bear with me and I’ll get to why I think the concerns behind today’s apocalyptic warnings about AI should have been addressed decades, if not centuries, ago.

The historical steps preceding Pressey’s invention can be traced to Galileo, whose hypothesizing about a mechanical universe, ca. 1600, kick-started a revolution in how humans thought about knowledge. Galileo, and a legion of others following him, including Descartes and Newton, prioritized objective thinking as the basis of scientific knowledge. It was predictable and controllable, parsed by mathematics, the lingua franca of scientific thinking. Subjective thinking, influenced by emotionality, was almost the very opposite: messy and unpredictable, and so dismissed as untrustworthy.

Artist’s rendering of Galileo explaining celestial mechanics, ca. 1610. In the heavens, Galileo and other astronomers saw mechanical forces governing all events. This became the basis of scientific theorizing and the rise of objectivist thinking.

The Industrial Revolution spun out of the scientific revolution and, in the late 1800s, scientists seeking to explain human behaviour went all-in with theories rooted in mathematics and the engineering sciences undergirding industrialization. Behaviour and human learning, said the first psychologists, could be manipulated and controlled, as crude experiments demonstrated.

Industries relied on factories that relied on workers completing menial tasks. The workers required a little literacy, but not too much. The captains of industry determined that increasing factory efficiencies was a prime objective and so adopted the precepts of Frederick Taylor, an engineer anointed as the guru of ‘scientific management.’ School administrators were urged to study ‘Taylorism’ and mimic the factory model, and soon did just this, aided by another invention of the time, Intelligence Quotient or ‘IQ’ testing, which helped filter students for the best fit to industry’s needs.

The ‘Scientific Management’ principles of Frederick Winslow Taylor (d. 1915), widely adopted wherever industries and factories flourished, emphasized efficiencies to increase productivity and profits, often at the expense of workers’ health and well-being. ‘Taylorism’ was also taken up by school administrators.

IQ testing, like scientific management, relies on numbers telling stories via equations, bar graphs, charts and analyses. In this way, humans can be rendered the same as car parts and cans of beans rolling off factory assembly lines. 

Soon after, Pressey’s machine fit into the narrative of the day and helped launch a new mode of educating masses of students with the help of steel, some programming, and motivational candy pellets (automatically dispensed for correct answers). Teachers were eliminated, but through the crude programming guiding the machine, students were vanquished as individuals, too. In effect, vanquishing the human was the narrative bolstering schooling efficiency. Measure what you can count, discredit and discard what you can’t, and keep on pretending the mechanical universe rules the heavens and its earthly subjects.

Sidney Pressey’s ‘teaching machine’ (ca. 1925), designed to teach by having students answer a series of short multiple-choice questions, rewarding correct replies by dispensing a candy pellet. (Smithsonian image)

There is a major problem with this, however. Humans aren’t car parts or cans of beans.

We are flesh and blood, tissues and organs, molecules and neurotransmitters. Our behaviours reflect genetic predispositions forged millions of years ago and events from five minutes ago. Our mental and emotional lives are extremely complex and experienced idiosyncratically, just like our learning. 

In fact, the real nature of human learning lies in its subjectivity, the ways each of us uniquely processes sensory information and judges it important or not. Reflecting myriad contextual factors, one person catalogues an experience as joyful, another perceives the same experience fearfully, another as ‘meh.’

One thing human learning is not is innately mechanical or robotic, though this can be an outcome of training. These insights are confirmed by the most recent neuroscience, but autobiography, the vast catalogue of human remembrances, also documents this, one life-story after another.

So, too, is this perceived and understood by most parents and educators: anyone of any age, learning something of importance to them, does so in their own way. And we keep on learning like this, lifelong. 

Sometimes, technologies and learning ‘devices’ assist learning, but they don’t create it. Only humans do that, in uniquely complex ways. It was an error on the part of Galileo, and Descartes following him, to insist that mechanical, objectivist thinking (aided by God) superseded subjectivist human thinking. For starters, it overlooked the fact that their own ideas were formed through human thinking processes and not those of a mechanical universe.

Unfortunately, psychologists and academics bought the story and ran with it, hatching theories of learning calibrated in mathematical certainty but drained of real human subjectivity. Adding to this, they wrote up their findings in journals tuned to ‘objectivist-speak’ and invoked mass-education measures characterized by standardized curricula and testing, and by generations of teaching machines, from Pressey’s candy dispenser through the ‘Scantron’ era to today’s corporate-designed ‘individualized’ programming templates. At the end of term, students are rewarded with an automated evaluation: “XY needs to apply himself more.” “MN has successfully met course requirements and is promoted to the next grade.”

This is the story of mass schooling: the human student has been rendered invisible, non-existent, value-less. Not by everyone in every situation, as many people, myself included, can attest. But western-colonial education systems, worldwide, have mainly been rigged to support and enforce this de-humanizing system.

I rebelled against this system as a student and was punished and marginalized, as were many friends. I also rebelled against it as an educator, and faced similar marginalization. 

But I continue to advocate for humane learning and to object to de-humanized schooling. My PhD dissertation expands on this theme, advocating for re-cognition of a ‘nature of learning’ that reflects new insights as well as timeworn understandings about who we really, really are. Most Indigenous cultures have beautiful and eloquent insights into this.

This leads me to puzzle over the fretting about the introduction of new-generation AI technologies into education. My cynical side wonders what the fuss is about; I mean, what was the ‘standardization’ movement all about, starting with Taylor or earlier, if not an all-in attempt by administrators and academics to vanquish the ‘person’ in mass schooling? Throw in the ‘teacherly voice’, or the utterly banal voicing baked into most academic texts and journals, and you seal the deal. That was AI 1.0, IMO.

Fear about student misuse of ChatGPT is overplayed, IMO. Mass schooling vanquished the human in education a very long time ago; at least students now have a chance to even the score!

But now the academic-administrative cabal is getting apoplectic and apocalyptic because students are going to sign into ChatGPT and copy out automated responses, essentially dishing back what has been imposed on generations of students! 

That’s rich. Nerdy, too. 

I hope there is more willingness to think this through and really converse about the intersectionalities arising in this situation. There are things to question, even fear, in the latest AI developments. But this is hardly new territory in the western obsession with the mechanical labyrinth that Galileo first hallucinated. 

Reflecting on the real nature of human learning can help lead people out of the mechanical labyrinth and should be part of the “pause” so many tech innovators are calling for right now.  

  • end
