Thursday, January 29, 2026

The Shoemaker (1930)


When I look at The Shoemaker by Tadeusz Makowski, what draws me in is not the hardship of the work, but the way time seems to gather around the figure. The shoemaker bends inward toward his task. His hands hold the shoe with a care that suggests not urgency, but attention. The walls are crowded with finished shoes. They are the evidence of repetition, of days accumulated into years, and yet the figure has not vanished into the work. He remains present. The work has not consumed him; it has shaped him.

This is why I see my great-grandfather here. He died long before I was born, but his life reaches me through photographs and stories. He was a shoemaker, among many other forms of labor. Others in my line were miners, factory workers, wire pullers, secretaries, water-plant employees, teachers, housekeepers, and housewives. People whose identities were not articulated so much as enacted. Their lives were structured by work that demanded time, patience, and consequence. Meaning did not arrive through reflection first; it emerged through how their days were spent.

As I move through another application season, this time for administrative roles, I find myself returning to this image not out of anxiety, but contemplation. I am less concerned with advancement than with formation. The question that lingers is not whether I am capable of this work, but what kind of self it will continue to form. This is where religious language becomes useful in helping me articulate my thoughts, not as doctrine but as a vocabulary for attention.

Religion, at its most elemental, has always been concerned with time. Not how efficiently it is used, but how it is ordered. It asks what a life is shaped by through repetition. It asks what practices quietly form character long before beliefs are articulated. In this sense, religion does not begin with transcendence, but with discipline: with how ordinary days are given shape and meaning.

The Buddhist notion of Right Livelihood (sammā ājīva), articulated within the Noble Eightfold Path, names this concern with particular clarity. Right Livelihood does not evaluate work by status or reward. It asks whether the way one earns a living allows intention, action, and consequence to remain aligned. Because work occupies so much of one’s time, it becomes one of the most powerful forces in identity formation. A livelihood forms the self whether one intends it to or not.

This contemplative concern echoes the early humanism of Karl Marx, who understood labor as a primary site of self-recognition. For Marx, work is not merely a means of survival; it is a way human beings come to know themselves. When labor is whole and intelligible, identity coheres. When labor is abstracted, reduced to output, metrics, or symbolic value, the worker becomes alienated. Time is consumed without forming the self. “The worker puts his life into the object,” Marx writes, “but now his life no longer belongs to him, but to the object.”

This sense of time emptied of formation surfaced recently in a conversation with a friend who works in construction. Their work pays well, far more than my teacher’s salary ever has or will, but as they spoke about building coffee shops and dentist offices, spaces designed for consumption and efficiency, I felt a quiet hollowness. Not judgment toward them, but a sense that something was missing. The work produces value, but it feels thin. It does not seem to ask much of the self beyond speed and compliance.

What lingers beneath that conversation is a cultural theology that equates prosperity with meaning. The assumption, often unspoken, is that if work pays well, it must therefore be good; if it produces visible success, it must therefore be worth one’s time. Religion, at its most reflective, resists this collapse. It insists that time itself is formative, and that how we spend it shapes who we become, regardless of what it earns.

This is why I return to Martin Luther King Jr. and his insistence on dignity as a matter of presence rather than position:

“If a man is called to be a street sweeper, he should sweep streets even as Michelangelo painted… and all the hosts of heaven and earth will have to pause and say, here lived a great street sweeper who did his job well.”

King’s claim is not romantic. It is contemplative. Greatness here is not about visibility or reward, but about inhabitation. Greatness here is about giving oneself fully to the work at hand. Time becomes meaningful when it is lived attentively, not when it is merely filled.

Makowski’s shoemaker embodies this inhabitation. The work is repetitive, enclosed, and modest, yet the figure remains inside it rather than estranged from it. This is what Right Livelihood names. It is not purity or prestige, but coherence. The shoemaker’s time has not been emptied of self. He is still being formed by what he does.

This is the posture I find myself in now. Not a crisis, not a refusal, but a pause. A moment to reflect on what kind of work continues to shape me in ways I can recognize and live with. Marx helps me name the risk of alienation when work becomes abstract. Buddhism helps me name the risk of dispersion when attention is scattered. Religion, more broadly, gives me permission to slow down and consider how my time is forming me.

Remembering my lineage in this moment is not an argument against moving forward. It is a way of orienting myself before doing so. The people I come from did not theorize their work, but they lived its demands. Their identities were formed because their time required presence. This moment of contemplation is simply an attempt to honor that inheritance and to ask, carefully, whether the next way I give my time will still allow me to remain whole, attentive, and recognizable to myself.

Catch Me If You Can

The Rockwells' Complete Book of Roses

 


MoPAL Scholar Meeting

Yanis Coffee Zone

Sunday, January 25, 2026

Clair Obscur Expedition 33 Poster

Boy Carrying a Sword (1861)

I keep returning to Boy Carrying a Sword when my work feels heavy in a way that is more than fatigue. This past week, there were two major conflicts among the young adults in my care: one physical, one emotional. Fights, plainly. They were disruptive, frightening, and consequential. But what lingers for me is not the spectacle of conflict; it is the recognition that follows it. I call them young adults because that is the language of the institution, the language of forms and policies. In practice, few of them are even close to being adults at all. They are unfinished. They remind me of Manet’s boy: holding something powerful, dangerous, and irreversible before he understands what it asks of him.

Trauma arrests development. I have come to believe this not as an abstract claim but as an observation repeated often enough to become unavoidable. Trauma does not merely injure; it interrupts the sequence by which emotional regulation, impulse control, and reflective thought are learned. Many of the young people I work with mastered survival early. They learned vigilance, speed, and defense. What they did not have the opportunity to learn, because the conditions did not permit it, were the slower skills of negotiation, self-soothing, and perspective-taking. When conflict arises, the body responds before the mind can intervene. As Bessel van der Kolk reminds us, trauma lives in the body. Behavior becomes a form of speech when language and regulation have not been adequately developed.

What complicates this further is poverty. Over time, I have come to rely on a distinction that feels increasingly precise: not every at-risk student lives in poverty, but nearly every student living in poverty is at risk. Poverty functions as a constant stressor rather than a static condition. It narrows temporal horizons, intensifies threat perception, and erodes trust in systems that promise delayed reward. Living under chronic scarcity trains the nervous system to privilege immediacy over reflection. In such contexts, escalation often feels safer than restraint, because restraint presumes a future that has rarely proven reliable.

This is why I suggested that our team read The Poverty Problem: How Education Can Promote Resilience and Counter Poverty’s Impact on Brain Development and Functioning by Horacio Sanchez. What I value in Sanchez’s work is its refusal to moralize. Poverty is not framed as a failure of character or values, nor is it reduced to cultural difference. Instead, Sanchez situates poverty squarely in the realm of neurodevelopment. Chronic stress alters brain functioning. Executive skills weaken under prolonged threat. Emotional regulation becomes fragile. In this framing, what schools often label as misbehavior appears instead as adaptation: effective in one context, destructive in another.

Years ago, I encountered related ideas through the work of Ruby Payne. Her articulation of the “hidden rules” of class helped me, at the time, to recognize how easily educators misinterpret behavior through their own assumptions. That framework had value. But with distance, I have grown wary of how readily it can slide into essentialism. When poverty becomes a typology, students are reduced to traits rather than understood as people shaped by conditions. Sanchez’s contribution feels corrective. Poverty is not who students are; it is what has happened to them over time.

Schools, however, are poorly designed to hold this complexity. As systems, they are built for compliance. Behavior is treated as an isolated variable to be managed rather than as evidence to be interpreted. Order and compliance become synonymous with success. When disruption occurs, the response is procedural: identify the infraction, assign the consequence, restore calm. The system functions efficiently, but shallowly. The deeper causes, trauma, poverty, and developmental interruption, are rarely addressed because they resist standardization and slow the machinery down.

This is where Manet’s painting becomes something close to diagnosis. The boy is dressed for authority. He holds a sword that signifies adulthood, power, and consequence. Yet his posture is tentative, his body misaligned with the role he is meant to inhabit. Schools do this constantly. They place adult consequences in the hands of children who were never adequately prepared to carry them. Discipline records, suspensions, legal accountability: these are swords. Once handed over, they cannot be easily set aside. When harm occurs, the weapon eclipses the child who is still learning how to stand.

This is why I believe so strongly in restorative practices. I am careful with that language. I do not mean restorative discipline, which too often functions as a softer form of punishment within the same compliance-based logic. Restorative practices represent a fundamentally different orientation. They ask different questions: What happened? Who was harmed? What is needed to repair the damage and prevent it from happening again? These questions assume that behavior is relational and developmental, not merely willful.

Restorative practices are not permissive. If anything, they are exacting. They require individuals to confront the impact of their actions rather than passively absorb punishment. They require systems to tolerate ambiguity and discomfort rather than default to exclusion. Most importantly, they recognize that regulation must come before cognition. When the nervous system is in a state of threat, moral reasoning is inaccessible. Punitive responses escalate that threat. Restorative practices slow the moment down, creating enough safety for reflection to become possible.

What continues to draw me to restorative practices is their refusal to collapse identity into behavior. Harm is named clearly, but the individual is not reduced to the act. Responsibility is held within relationship, where learning can still occur. Belonging is not treated as a reward for good behavior; it is understood as the condition that makes growth possible at all. This directly challenges systems shaped by scarcity and control, systems that rely on removal rather than repair.

Manet leaves his boy unresolved, suspended between authority and unreadiness. Restorative practices refuse that stasis. They treat conflict not as evidence of failure, but as a moment of instruction. As a rupture that can either harden into identity or open into growth. I do not romanticize this work. It is slow, uneven, and often incomplete. Repair does not guarantee transformation. But it is developmentally honest in a way compliance never is.

Until schools are willing to prioritize understanding over order and formation over obedience, they will continue to reproduce the very behaviors they seek to eliminate. We will keep asking young people to carry swords they did not forge and punishing them when the weight proves too much. Restorative practices, at the very least, attempt something more humane: they create the possibility that the sword can be set down, and that what replaces it might be skill, language, and relationship rather than fear.

Saturday, January 24, 2026

Battlefield 6

Helldivers 2

Seated Man with Cat on His Lap (1919)

January has a way of quieting everything. Not just sound, but intention. Today the cold has settled in deeply, the kind that presses against the house and asks nothing more of it than endurance. Snow continues to fall, the temperature continues to drop, and the world outside seems content to remain unfinished. Inside, the house is still. There is no movement that matters.

My cats move in and out of the room, unhurried and unconcerned. They pause on my lap for warmth, stay just long enough to be felt, then drift away. Their presence does not interrupt the silence; it completes it. Sitting here with them, I am struck by how closely this day mirrors my internal state. I feel listless, unsure what to do with myself. I read for a while. I play with the cats. I watch something random or lose an hour to a game. None of it feels urgent. None of it feels wrong.

I keep returning to Seated Man with Cat on His Lap (1919) by M. C. Escher. What draws me in is not the subject so much as the lines. They are dense, repetitive, and restraining. The man is held together and held in place by them. There is no sense of motion or narrative progression. Time does not advance in this image; it lingers. That is exactly how today feels. Not restful in a celebratory way, but suspended. Quiet without being peaceful.

Often, movement substitutes for relationships in my life. Work, travel, projects. Forward motion has long been easier than intimacy. I do not have many close friendships, and while I share so much of myself in writing, I do so at a controlled distance. What remains private is not information, but exposure. Writing allows me to be seen without having to wait for a response. Dating and hiring committees do not offer that protection.

Being home today with my cats feels like a salve I did not know I needed. Work, lately, has been a reminder of how easily worth can feel conditional. Despite strong credentials and experience, interviews remain elusive. It echoes an earlier moment in my life, after being let go from a middle school position in 2010, when I applied to dozens of jobs and received only one interview. Then, the fear was economic. Now, it is existential. The sense of being unwanted has returned, stripped of urgency but heavy with implication.

My lack of success in dating and my stalled professional advancement feel intertwined, forming an unkind narrative about how I am perceived in middle age. Undesirable. Arrogant. Tactless. Difficult. I tell myself that at least some of it must be true. Evidence, after all, accumulates. And once that protective shell cracks, it is tempting to believe the worst, to believe that what is revealed beneath is spoiled beyond repair.

Yet today resists that conclusion. The snow does not judge its own stillness. The cats do not require me to perform, impress, or justify myself. They respond only to warmth and presence. In Escher’s image, the man does not look fulfilled, but he remains in relation to something living, breathing, and real. There is dignity in that, even if it is quiet and unacknowledged.

This day exists on a different register than my normal life. It does not demand productivity or progress. It does not offer answers. It simply allows me to sit, to feel the weight of the cold and the silence, and to acknowledge the ache without rushing to resolve it. Perhaps that is enough for January. Perhaps meaning does not always arrive through movement. Sometimes it settles briefly, like a cat on my lap, and leaves behind the reminder that being held, even imperfectly, still counts.

The Guns Of August

Frozen

Thursday, January 22, 2026

Hands (2017)

I have always been attentive to hands. Not merely their function, but their form. The way they vary from person to person, slender or thick, long-fingered or compact, youthful or worn. I notice how skin changes over time, how it thins and creases, how scars interrupt its surface, how folds and lines accumulate like annotations written slowly by living. Hands, more than faces, seem to tell the truth. They record what a life has asked of a body. In this way, hands become biographical documents, bearing witness to time, labor, affection, and loss.

This is what draws me so powerfully to Hands (2017) by He Lihuai. The painting offers almost nothing in the conventional sense: no face, no setting, no narrative arc. And yet it offers everything. In the interlaced hands, rendered with careful restraint, I encounter a life without being told its story. The work trusts the viewer to recognize meaning without instruction. It is an ethics of attention rather than explanation.

My own memories confirm this trust. I remember sitting with someone over coffee, our hands brushing by accident, then touching again deliberately. The sensation was electric. When our hands finally held one another, the moment felt complete in itself. When they looked at me and said they'd always wondered how my hands might feel, it was an instance of pure presence. Touch preceded interpretation. Long before words could clarify intention, the hands had already spoken.

I remember walking the halls of high school hand in hand with my sweetheart. It was a public gesture, unmistakably so. To hold hands in that space was to claim connection openly, to allow oneself to be seen. In retrospect, I understand it not simply as affection, but as courage. Hands, when joined, declare relationship in a way that language often hesitates to do.

I also remember the opposite. I remember being pushed away by hands that no longer wanted connection. The rejection arrived through the body before it was articulated through speech. In that moment, the withdrawal of touch clarified what words would later confirm. Hands can end relationships as decisively as they begin them.

These memories resonate with what psychology has long suggested: touch is foundational to human experience. Developmental research demonstrates that physical contact precedes language as the primary medium through which safety, belonging, and regulation are learned. Donald Winnicott’s concept of “holding” captures this power precisely. It is not merely the act of being physically held, but the broader psychological containment that allows a person to exist securely in relation to others. That early bodily knowledge never fully disappears. Even in adulthood, we continue to seek reassurance, recognition, and connection through the hands.

I remember the hands that taught me. My mother’s hands turning the pages of a book as she read aloud, steady and unhurried. The hands of teachers, fingertips whitened with chalk dust, shaping ideas line by line on blackboards. These hands did not simply convey information; they modeled patience, care, and presence. Learning, I realize now, was always embodied before it was intellectual.

I remember holding my grandmother’s hand as she slipped away from this life one breath at a time. Her hands were cold. So thin. No longer the hands that had rolled out hundreds of cookies, hands that once radiated the warmth of hearth and home to family and strangers alike. In that final moment, her hands no longer produced or served, yet they retained their meaning. Holding them became an act of witness, an acknowledgment that relationship persists even as function fails.

I remember my father’s hands, bloodied from work, and his impatient insistence that “the job’s not done until I’m bleeding.” It was not a philosophy so much as a habit, but it revealed something essential about him. His hands bore the cost of effort, the friction of a life lived in contact with resistance. They told a story of endurance without sentimentality.

And then there are my own hands. They carry scars: a cut across my palm, faded lines where my knife would open small wounds too big for my heart to hold, even smaller marks whose origins I no longer remember. They are unremarkable, yet entirely specific. They have held others and been pushed away. They have worked, taught, restrained, comforted, and failed. In them, I see the gradual accumulation of my own life.

Philosophically, this attention to hands aligns with Maurice Merleau-Ponty’s insistence that the body is not an object we possess, but the primary means through which we exist in the world. My hands are not instruments separate from my self; they are how my intentions enter the world. Through them, I enact care or harm, openness or withdrawal. They reveal my values before I articulate them.

This is why the hands in He Lihuai’s painting feel so honest. They are neither open in invitation nor clenched in rage. They rest in suspension. They are capable of both tenderness and violence, yet choosing neither in this moment. That ambiguity feels deeply human. Most of life is lived there, between reaching out and pulling back, between connection and self-protection.

Hands, I am coming to understand, do much of the moral work of being human. An open hand extended in peace signals trust and restraint. A clenched fist announces threat and withdrawal. The difference between the two is minimal in motion but immense in meaning. In this sense, ethical life is often enacted not through grand decisions, but through small physical gestures. Enacted through how, when, and whether we choose to touch.

Hands shows so little, yet it activates so much. It reminds me that our lives are written not only in words or beliefs, but in skin, scars, and gestures. To attend to hands, both others’ and my own, is to attend to the quiet, cumulative ways we relate to the world and to one another. In the end, being human is not an abstract condition. It is something we do, again and again, through the simple, consequential act of reaching out or pushing away. Of holding tight or letting go.


The Soldier in Our Civil War: A Pictorial History of the Conflict, Vol. II



Elegy For Jane (1950)

Elegy for Jane
(My student, thrown by a horse)

By Theodore Roethke

I remember the neckcurls, limp and damp as tendrils;
And her quick look, a sidelong pickerel smile;
And how, once startled into talk, the light syllables leaped for her,
And she balanced in the delight of her thought,
A wren, happy, tail into the wind,
Her song trembling the twigs and small branches.
The shade sang with her;
The leaves, their whispers turned to kissing,
And the mould sang in the bleached valleys under the rose.

Oh, when she was sad, she cast herself down into such a pure depth,
Even a father could not find her:
Scraping her cheek against straw,
Stirring the clearest water.
My sparrow, you are not here,
Waiting like a fern, making a spiney shadow.
The sides of wet stones cannot console me,
Nor the moss, wound with the last light.

If only I could nudge you from this sleep,
My maimed darling, my skittery pigeon.
Over this damp grave I speak the words of my love:
I, with no rights in this matter,
Neither father nor lover.


Sounds and Silences

 


Tuesday, January 20, 2026

The Tower of Babel (1928)

The story of the Tower of Babel occupies a complicated place in my intellectual and emotional life. I first encountered it within a literalist religious framework, where Genesis 11 was presented as straightforward history: an actual tower, a unified human language, and a divine act that shattered human coherence. When I eventually came to understand that this account could not be sustained as literal history, the realization was deeply destabilizing. What fell apart was not only a particular story, but the epistemological structure that had supported it. For a time, Babel became synonymous with loss. A loss of certainty. A loss of trust. A loss of the sense that meaning was guaranteed rather than negotiated.

With distance and maturity, I now return to Babel as allegory rather than reportage, and in doing so, I find that the story has gained rather than lost depth. Contemporary biblical scholarship overwhelmingly situates the Tower of Babel within the genre of mythic or etiological narrative. That is, stories that explain fundamental features of human existence rather than recount empirical events. The text itself signals this intention. Genesis 11:1 begins, “Now the whole earth had one language and the same words." This is not a claim meant to withstand linguistic scrutiny; it is a narrative setup, a deliberate exaggeration that frames the theological problem the story seeks to explore.

Modern scholars emphasize that Babel is less concerned with divine jealousy or punishment than with human ambition and the politics of unity. Walter Brueggemann argues that the tower represents “the human yearning to create a self-sufficient world, secure against all threats, including the threat of God." In this reading, Babel is not about curiosity or creativity gone awry, but about the consolidation of power. The builders seek to “make a name for ourselves” (Gen. 11:4), a phrase that signals not mere pride but the desire for permanence, control, and centralized identity.

Robert Alter similarly notes that the language of the passage echoes imperial projects familiar to the ancient Near East. Bricks, kilns, and monumental architecture are not neutral details; they evoke Mesopotamian ziggurats and the political theology of empire. Alter observes that Babel dramatizes “the hubris of civilization itself, its impulse to monumentalize power and erase difference.” From this perspective, the confusion of languages is not arbitrary cruelty but resistance. It is a divine refusal to allow totalizing uniformity to harden into domination.

This scholarly framing fundamentally reshapes how I engage the story. As someone who now understands himself as religious but not spiritual, I no longer seek transcendence or metaphysical certainty in the text. Instead, I approach it as moral literature, attentive to its critique of systems and structures. Babel becomes a story about how easily coordination slides into coercion, how unity becomes suspect when it leaves no room for plurality.

Escher’s The Tower of Babel offers a striking visual analogue to this scholarly consensus. His tower is not grotesque or chaotic; it is orderly, elegant, and impressively engineered. The workers are disciplined and purposeful. Nothing about the scene suggests moral failure at the level of individual intent. This aligns with Brueggemann’s insistence that the danger of Babel lies not in wickedness but in “the absolute confidence that human systems can finally secure human destiny.”

What unsettles me most in Escher’s image is the absence of dissent. Every figure is absorbed into the task. Each occupies a clearly defined role within a larger, unquestioned project. This resonates powerfully with my own experience in education, where institutional systems are often sustained by people of goodwill who rarely have the vantage point, or the permission, to question the whole. Like Babel’s builders, educators frequently labor within structures that promise coherence, efficiency, and progress, even as those same structures risk flattening human complexity.

The biblical text itself hints at this problem of scale and distance. “The Lord came down to see the city and the tower” (Gen. 11:5). The irony is unmistakable: the tower meant to reach heaven is so small, from the divine perspective, that God must descend to inspect it. Scholars have long noted the subtle satire embedded here. The builders’ ambition is undercut not by thunder or fire, but by narrative irony. The tower’s grandeur exists only from within the system that produced it.

This irony speaks directly to my own journey away from literalism. What once felt like betrayal now feels like discernment. To read Babel allegorically is not to diminish its authority, but to take it seriously as wisdom literature. The story does not explain why people speak different languages; it interrogates why humans so often mistake sameness for unity and scale for significance.

Escher’s decision to depict the tower mid-construction rather than in ruins is crucial. The story, as the image renders it, is unfinished. The catastrophe has not yet occurred. The builders still believe in what they are making. This suspended moment mirrors my present stance toward inherited religious and institutional narratives. I am no longer inside them uncritically, but neither have I abandoned them entirely. I engage them cautiously, historically, and ethically.

In this light, Babel no longer represents the fragility of belief for me. It represents the necessity of humility. The modern scholarly view affirms what my own experience has taught me: that stories endure not because they are literally true, but because they remain diagnostically accurate. Babel continues to matter because it names a recurring human problem: the temptation to build systems so comprehensive that they silence the very diversity they claim to unite. Escher’s tower stands as a visual reminder that the most dangerous structures are often the most convincing ones, and that discernment, not certainty, is the truest form of faith I now know how to practice.

Monday, January 19, 2026

Gleaners, End of Day (1891)

A gleaner is not a harvester. In agrarian societies, gleaners entered the field only after the harvest was complete, collecting what remained. Collecting what machinery missed, what labor deemed inefficient, what fell outside the logic of yield. Gleaning assumed scarcity rather than abundance and responsibility rather than reward. It was work defined not by beginnings, but by endings. Work defined by attention to what would otherwise be abandoned.

This distinction has become increasingly central to how I understand my work in education. Much of my career has unfolded after the system had already passed through. I've spent years working with students whose trajectories were nonlinear, whose needs did not fit neatly within institutional timelines, and whose success depended less on acceleration than on sustained presence. In Léon Augustin Lhermitte’s Gleaners, End of Day, I recognize not only the posture of that work, but its moral location. The figures are bent beneath a fading light, gathering what remains once efficiency has done its work. Teaching, for me, has often felt like labor undertaken at precisely this hour.

It is from this position that the research of Christina Maslach has given me language equal to my experience. Maslach defines burnout as “a psychological syndrome emerging as a prolonged response to chronic interpersonal stressors on the job." Importantly, she insists that burnout is not a failure of character or resilience. “Burnout is not a problem of people,” she writes, “but of the social environment in which people work." That framing matters. It allows experience to be examined without collapsing into blame.

Maslach identifies three core dimensions of burnout: emotional exhaustion, depersonalization or cynicism, and a reduced sense of personal accomplishment. In later work, she and her colleagues extended this model by identifying six domains of job–person fit (workload, control, reward, community, fairness, and values), arguing that burnout arises predictably when misalignment across these domains becomes chronic. What has struck me most forcefully is how precisely these domains map onto my own professional history: not as isolated crises, but as cumulative conditions shaped by circumstance.

Workload was the most visible site of misalignment. When I began my career, the work I now do was distributed across multiple people: teachers and paraprofessionals serving Flex, Missouri Option, and night school. Over time, those positions were reassigned or eliminated through attrition. The work itself did not diminish. It was passed to me. Expectations for outcomes remained unchanged or increased. Because I continued to meet those expectations, the redistribution became normalized. Maslach notes that “emotional exhaustion represents the basic individual stress dimension of burnout," but in my experience, exhaustion did not present as collapse. It presented as functioning too well for too long, depletion masked by competence.

This strain was compounded by diminished control. Being physically off site meant that students were often assigned to me without notice and without the information necessary to support them effectively. Requests for information were ignored or denied. I was excluded from conversations about placement, discipline, and intervention. On paper, my supervisor had little authority over my students and no consistent presence in my workspace. Maslach and Leiter argue that “a lack of control over one’s work is strongly associated with burnout,” because it converts expertise into liability. I adapted by improvising. By building informal networks, asking individual teachers for help, and working around systems that did not include me. This was not resistance. It was survival under constraint.

The domain of reward revealed a subtler misalignment. One year, I was a finalist for a regional teaching award. It was the closest I had come to formal recognition, and I allowed myself to hope. When the award went to a teacher from an elite private school, the disappointment lingered. Over time, reflection helped me release jealousy and genuinely celebrate others’ success. It also clarified a structural reality Maslach describes succinctly: “When rewards are insufficient—whether financial, institutional, or social—people feel devalued." Recognition tends to follow visibility and prestige rather than labor that occurs at the margins. The absence of reward was not personal; it was patterned.

Community proved fragile over time. As the administrators who hired me moved on, being off site meant repeatedly explaining why my program existed at all. Twice, that vulnerability crystallized into prolonged uncertainty. Once, I went an entire semester without new students being assigned, reliving an earlier experience of being phased out during the Great Recession. Another time, a building transition left me professionally isolated, nominally assigned to a staff that had moved elsewhere, supervised by someone who never came to the building. Maslach emphasizes that burnout accelerates when social connections erode: “People thrive in communities and suffer when those connections are broken.” My experience confirms this. What mitigated burnout was not policy, but reconnection. It was being seen again as part of a professional whole.

The domain of fairness was most starkly tested when a student of mine died in a tragic accident. An administrator called to ask if I was okay. I said I was coping. No counselors came. No administrators checked on my students. I attended the visitation as the sole representative of the school, standing with students and family in silence. I do not interpret this moment as neglect born of indifference. It was ambiguity of responsibility under strain. Yet Maslach is clear that fairness is not only about outcomes, but about process and presence. “Perceived unfairness,” she notes, “is a powerful source of cynicism and disengagement.” In that moment, institutional care was absent, and the burden fell, again, on the person still in the field.

Throughout all of this, my values remained aligned with students, even as institutional priorities narrowed to graduation metrics. I accept the necessity of those metrics. I also know that each student I have helped cross the stage did so against significant odds. Often, I have been the only keeper of their stories—their setbacks, their resilience, their quiet persistence. Maslach argues that burnout intensifies when “there is a mismatch between the values of the individual and those of the organization.” I reconciled that mismatch by honoring the stories anyway, even when the system lacked language to recognize them.

Maslach’s work has helped me understand that my experience of burnout did not signal disengagement. I did not stop caring. Emotional exhaustion manifested as sustained depletion, not collapse. Depersonalization emerged as selective distance, not indifference. It was a necessary boundary to remain functional. Reduced personal accomplishment reflected mismeasurement rather than futility. Engagement persisted, but it took the form of discipline rather than enthusiasm. As Maslach notes, engagement is not simply the absence of burnout; it is “characterized by energy, involvement, and efficacy." In my case, that energy was quiet, that involvement ethical, that efficacy often invisible.

The opposite of burnout, then, is not optimism or individual self-care alone. It is alignment. It is stewardship. It is the intentional design of conditions so that necessary work does not rely indefinitely on individual endurance. Maslach is explicit on this point: “The responsibility for burnout lies with the organization, not the individual.” That insight has been liberating, not accusatory.

This is why my sense of vocation is shifting now. Over the past year, several mentors who shaped me deeply have died. They were men who appeared at critical moments in my life and taught me how to be both a teacher and a man. Each of them served as an administrator. Their influence was not rooted in authority, but in presence. It was rooted in standing between people and systems, in noticing who was bending too long, in redistributing strain before it became damage.

At this stage of my career, I feel called to honor that legacy. My desire to move into administration is not an escape from teaching, but its extension. It is a movement from mentoring students to mentoring teachers; from being a gleaner in the field to helping reshape the conditions under which others labor there. Like Lhermitte’s figures at the end of the day, I have stayed. Now, I feel both the responsibility and the readiness to step into a role where staying is no longer solitary, and where endurance is no longer mistaken for inexhaustibility.

In that sense, Gleaners, End of Day offers not a conclusion, but an accounting. It is the moment when the light fades enough to reveal what truly remains and to decide, deliberately, what must change before the next day begins.

Best Way to Spend a Day

A Knight of the Seven Kingdoms

Saturday, January 17, 2026

Dr. No

The Death of Cato the Younger (of Utica) (1863)

I was explaining democratic participation to my students when I felt the familiar unease return. We were doing everything correctly: defining terms, tracing institutions, rehearsing rights. Yet something essential felt absent. The language of democracy I was offering them felt expansive but thin, confident but unburdened. It occurred to me, again, that while we teach government through Athens, the generation that founded the United States imagined its future through Rome. As we approach the 250th anniversary of the American Revolution, that distinction feels less academic and more diagnostic.

Rome was never attractive to the founders because it was glorious. It was attractive because it failed. Greek philosophy offered ideals of reason, discourse, and participation; Rome offered a long, unsparing education in power, corruption, restraint, and decline. The founders did not read Roman history to admire it. They read it to understand what happens when virtue erodes more slowly than institutions. Rome was not an origin myth. It was a warning.

That warning reached them less through abstract theory than through lives. Plutarch’s Parallel Lives shaped generations of readers by insisting that political outcomes are inseparable from moral character. His method is quietly severe. He does not ask which system is most just in theory, but which kind of person power requires in practice. “The most noble deeds,” he observes, “do not always bring the most brilliant fame.” History, in his hands, becomes a record of character under pressure rather than success rewarded.

Within this moral framework, Cato the Younger stands as a limit case. Paired with Phocion, Cato is not offered as effective, adaptable, or victorious. He is incorruptible and therefore increasingly isolated. Both men resist popularity, distrust the crowd, and refuse expedient compromise. Both are destroyed not by vice but by fidelity to principle in political cultures that have moved beyond them. Athens executes Phocion. Rome survives Cato, but only as empire. Plutarch’s judgment is implicit and devastating: virtue may survive corruption, but it rarely survives once moral consensus dissolves.

Cato’s importance lies not in triumph, but in refusal. When Caesar prevails and the republic collapses into autocracy, Cato chooses death rather than accommodation. Plutarch records his reasoning without sentimentality: Cato believed it “unworthy to live by the favor of a tyrant.” Liberty, in this Roman-Stoic imagination, is not primarily participation or voice. It is moral sovereignty. It is the refusal to live incoherently. When political liberty collapses, moral liberty becomes the final jurisdiction of the self.

This austere vision haunted the revolutionary generation and found its historical counterpart in The History of the Decline and Fall of the Roman Empire, published in the same year as American independence. Gibbon’s Rome does not fall in spectacle or catastrophe. It erodes. “The decline of Rome,” he writes, “was the natural and inevitable effect of immoderate greatness.” Institutions persist long after seriousness thins. Civic life becomes administrative. Liberty survives as memory rather than practice. The founders read Gibbon not as prophecy but as diagnosis, and resolved to design a republic that would never require heroic virtue simply to endure.

It is precisely this tension that gives The Death of Cato the Younger (of Utica) its quiet, devastating power. Laurens does not paint resistance in action; he paints what follows its failure. The composition is enclosed and compressed. The architecture offers no civic grandeur, no public horizon. Cato lies horizontally, drained of vitality, his body already yielding to gravity. The few witnesses are pushed to the margins, unable to enter the moral space he occupies. There are no standards, no symbols of the republic, no validating crowd. The state has vanished. What remains is a single human being and the coherence he refuses to surrender.

What I most appreciate in this work is that Cato is alone.

Not theatrically abandoned, not dramatically martyred, but simply solitary. Laurens strips Roman virtue of its last consolation: recognition. At the end of republican life, virtue no longer gathers others; it isolates. Cato’s final act is not performed for Rome or even against Caesar. It is performed in the absence of any shared moral language that could still make sense of it. Liberty collapses inward. It survives only as fidelity to the self.

That Laurens painted this scene in 1863, amid the American Civil War, deepens its resonance. Like Plutarch, Laurens is drawn to moments when political structures fracture and conscience becomes private rather than communal. His Cato is not a model to emulate but a warning made visible. He represents the moment when institutions can no longer carry ethical weight and moral seriousness is borne alone.

This is an uncomfortable inheritance for a modern democratic age. We have turned, understandably, toward Athens. We emphasize participation, inclusion, voice, and adaptability. We distrust moral absolutism and recoil from civic virtue that demands sacrifice. Rights are foregrounded; obligations are negotiated. Much of this reflects genuine moral progress. Yet it also raises the question the founders would have recognized immediately: can a republic endure when liberty is understood only as expression, and not as restraint?

Rome, Gibbon reminds us, did not fall because its citizens loved freedom too little, but because they no longer agreed on what freedom required. Liberty became detached from obligation. Power outpaced character. Institutions endured longer than meaning.

Which brings us, uncomfortably, to the present. Two hundred and fifty years after independence, it is tempting to ask whether we are living through our own decline and fall. But that question risks melodrama. Rome did not know it was Rome until after. Decline rarely announces itself as crisis; it appears as normalization. The more honest inquiry is formative rather than prophetic: are we still cultivating citizens capable of restraint, responsibility, and moral seriousness? Do we share a civic language that makes sacrifice intelligible, or has conscience itself become isolating?

Cato should not be revived as a model. A republic that requires Catos to function has already failed. But his solitude remains instructive. It marks the point at which shared civic meaning thins so completely that principle can no longer be spoken in common. Cato is alone because the language he inhabits no longer has an audience.

At 250, the question is not whether we are Rome. It is whether we still understand why Rome frightened those who founded us. They turned to Rome not because they despised democracy, but because they feared what democracy might become without moral seriousness. We turn to Athens because we fear exclusion and rigidity more than decay. Both fears are justified. The tension between them is permanent.

As a teacher, standing between these inheritances, I find no comfort in prediction. What remains is responsibility. To teach government is not merely to explain how democracy works, but to ask how republics fail when character is outsourced to procedure. Rome still matters because it refuses reassurance. It reminds us that the end of civic life is not always collapse, but solitude, and that a republic worthy of endurance should never require its citizens to stand alone in order to remain free, even as it quietly depends on the possibility that someone still would.

Friday, January 16, 2026

Airplane Ears

The Son of Man (1964)

Opening: The Conversation

The conversation itself was unremarkable in its setting and familiar in its contours. We spoke, as so many do, about the cost of higher education: about tuition figures that now feel abstract in their enormity, about student debt normalized as a rite of passage, about institutions that increasingly resemble financial instruments rather than places of learning. At some point, a claim surfaced that was meant to steady the discussion, perhaps even to resolve it: education exists for the sake of education.

I did not disagree. In fact, I found myself nodding in assent. And yet, almost immediately, the phrase revealed its insufficiency. It was true in a way that no longer felt adequate. Education may exist for its own sake in theory, but in practice it exists within systems that demand justification, profitability, and demonstrable utility. The deeper question, the one that refused to let the conversation rest for me after it was over, was not whether education has intrinsic value, but whether our institutions still behave as though that were true.

What lingered for me was the sense that we were talking past something fundamental. The problem is not merely that education has become expensive, though it certainly has. The problem is that the meaning of education itself has been quietly redefined. Degrees still signal achievement, but the nature of that achievement has shifted. Completion has replaced formation. Credentialing has displaced cultivation. What we now celebrate is not the disciplined development of thought, but successful navigation of a system designed to sustain itself.

It is tempting to narrate this as a betrayal, to imagine a moment when education “sold out” to the business community. But that framing is too simple, and too comforting. What occurred was slower and far more subtle. Universities adapted: first to industrialization, then to mass access, then to economic pressure. In doing so, they gradually absorbed the logic of the market. Education did not cease to matter; it learned how to justify itself in economic terms. Over time, that justification became its identity.

What troubles me is not the presence of vocational training within higher education, nor the legitimate desire of students to secure meaningful employment. It is the inversion that followed. Education no longer prepares individuals to think rigorously and work meaningfully; it increasingly prepares them to work first, and to think only insofar as thinking serves productivity. The question shifts from What kind of person is this education forming? to What role will this graduate fill?

This is where the metaphor of René Magritte’s The Son of Man began to impose itself on my thinking. The figure stands upright and respectable, clothed in the uniform of modern professionalism. He is not lacking dignity. He is not incoherent. And yet, his face, the site of identity, perception, and moral agency, is obscured. Something essential is present but hidden, replaced by a symbol that gestures toward meaning without granting access to it.

That, increasingly, is how higher education appears to me. The structure remains. The language of excellence persists. But the animating purpose, the formation of independent thinkers capable of judgment, critique, and imagination, has slipped behind something else. What we are left with is not an absence of education, but a version of it that has learned to survive by becoming something adjacent to what it once was.

This essay grows out of that unease. It is an attempt to slow the conversation down, to refuse the comfort of easy explanations, and to ask what has been obscured in the process of adaptation. If education still matters, and I believe it does, then the question is not simply how much it costs, but what we believe it is for. 

The Obscured Face

René Magritte’s The Son of Man presents itself with deceptive simplicity. A man stands before us, frontal and composed, dressed in a dark overcoat and a bowler hat. Behind him stretches a low stone wall and a restless sea beneath a clouded sky. The figure is unmistakably modern. He is neither heroic nor marginal, but respectable, anonymous, and orderly. And yet, at the precise point where recognition should occur, where face meets gaze, an apple hovers, blocking access.

Magritte insisted that this was not concealment for the sake of mystery, but a statement about perception itself. What we see, he argued, always hides something else. Visibility does not guarantee understanding. In The Son of Man, the man is not absent; he is withheld. His identity is implied but deferred, suggesting that the surface we are offered is never the thing itself.

This distinction matters. The apple does not erase the face; it replaces it with a symbol. We are given an object rich with cultural meaning—knowledge, temptation, nourishment, even transgression—but we are denied the human specificity behind it. The substitution is clean, elegant, and acceptable. It asks us to be satisfied with representation rather than encounter.

The business suit, too, is not incidental. Magritte’s recurring bowler-hatted figures populate his work as emblems of modern conformity. As men shaped by external norms rather than interior vocation. They are interchangeable, defined less by who they are than by how well they fit. In this sense, The Son of Man does not depict an individual so much as a type: the modern subject rendered legible through dress and posture rather than through thought or expression.

It is difficult not to see the contemporary university reflected here. The institution stands upright, respectable, and rationalized. It wears the uniform of legitimacy: accreditation, rankings, outcomes, and strategic plans. Its face, however, is obscured. Where one might expect to find a clear articulation of educational purpose, something like formation, wisdom, and ethical reasoning, we encounter instead an apple: employability, market relevance, workforce readiness. These are not meaningless concepts, but they function as substitutes. They signal value without revealing substance.

The apple reassures. It is familiar. It promises nourishment. And yet, it prevents us from asking who stands behind it.

What makes Magritte’s image particularly resonant is that the concealment is voluntary. Nothing forces the apple into place. The figure does not struggle against it. There is a quiet acceptance in the pose, a willingness to be known through symbol rather than self. This, too, parallels the modern university’s posture. Institutions rarely claim to have abandoned deeper educational ideals; they simply stop foregrounding them. Over time, the language of formation recedes, replaced by the language of function.

The sea behind the figure complicates the scene. It suggests instability, motion, and depth—forces that cannot be fully contained or managed. In contrast, the man stands rigidly, aligned with the wall, contained within the frame of respectability. The tension between these elements mirrors the tension within education itself. Human inquiry is inherently unsettled, resistant to closure. Institutional structures, by contrast, seek predictability and control. The apple becomes a means of stabilizing what is otherwise unruly.

In reading The Son of Man as an educational metaphor, the question is not whether the apple belongs there, but what its presence costs us. When education presents itself primarily through the language of utility, it risks obscuring the human face of learning, the slow cultivation of judgment, the capacity to question inherited assumptions, the courage to think independently. We are shown outcomes instead of minds, credentials instead of thinkers.

Magritte does not resolve this tension. He leaves us suspended between curiosity and frustration. We know there is more to see, but we are not permitted access. That refusal is instructive. It forces us to confront our own willingness to accept substitutes for substance, symbols for realities, surfaces for depths.

If the university now wears the business suit, the problem is not the suit itself. It is the quiet agreement that the suit is sufficient. It is that the suit tells us everything we need to know about what education is and whom it serves. The Son of Man insists otherwise. It reminds us that whenever something essential is hidden, the task is not to look away, but to ask why we have grown comfortable not seeing.

When Education Changed (and Why That Matters)

It is tempting to locate the transformation of higher education in a single moment: to link it to a policy decision, an economic downturn, a cultural betrayal, and to name it as a fall from grace. That narrative is emotionally satisfying, but historically false. What changed was not education’s existence, but its orientation, and that change unfolded slowly, almost imperceptibly, as universities responded to shifting cultural demands. The danger lies precisely in that gradualism. When change occurs incrementally, it rarely announces what it costs.

The earliest universities in Europe, institutions such as the University of Bologna and the University of Paris, understood themselves as communities ordered toward inquiry rather than output. Their purpose was not to serve an economy in the modern sense, but to preserve, interpret, and extend bodies of knowledge deemed essential to human understanding. Even when these institutions trained clergy, jurists, or physicians, they did so through a framework that emphasized intellectual formation prior to professional application. Knowledge preceded function.

This began to shift as universities became increasingly entangled with the needs of the modern state. During the early modern period, education was called upon to produce administrators, legal thinkers, and professionals capable of sustaining bureaucratic governance. The humanities remained central, but their justification subtly changed. They were no longer merely avenues toward wisdom; they became instruments of social order. Still, the scholar’s identity remained intact. Education served power, but it had not yet been subsumed by it.

The decisive transformation occurred with industrialization. As economies reorganized around manufacturing, efficiency, and specialization, knowledge itself was reclassified. Applied sciences, engineering, and technical expertise rose in prominence, not because they were inferior forms of knowing, but because they aligned cleanly with emerging economic needs. Universities adapted, as institutions must in order to survive. In the United States, this adaptation was formalized through the Morrill Acts of the nineteenth century, which established land-grant universities explicitly tasked with advancing agriculture, industry, and mechanical arts.

This expansion was, in many respects, a democratic achievement. Higher education ceased to be the preserve of a narrow elite and became accessible to broader segments of the population. But accessibility came with new expectations. Universities were no longer judged primarily by the quality of their intellectual culture, but by their utility to the nation’s economic development. Education acquired a public justification rooted in productivity.

The twentieth century accelerated this trend. Following World War II, universities expanded dramatically, fueled by government investment, corporate partnerships, and mass enrollment. During this period, higher education increasingly adopted the language and structures of business. Departments were evaluated by measurable outputs. Research was valued for its grant-generating potential. Students were reframed as investments whose success could be quantified through placement rates and earnings data.

At no point did universities announce that they were abandoning the cultivation of thought. They did something more subtle: they learned to speak about education in terms that could be defended economically. Over time, that defensive posture became internalized. Institutions began to understand themselves less as guardians of intellectual tradition and more as competitive enterprises operating within a marketplace of credentials.

This is where something fundamental was lost. What was lost was not rigor, not ambition, but patience. The kind of thinking fostered by the humanities requires time that cannot always justify itself in quarterly reports or strategic plans. It values uncertainty, ambiguity, and critique. It values qualities that resist efficient measurement. As universities became self-sustaining businesses, these forms of thinking grew increasingly difficult to defend, not because they lacked value, but because their value was not easily monetized.

The change, then, was not malicious. It was adaptive. Universities responded to real pressures with rational strategies. But adaptation is never neutral. Each accommodation subtly reshapes identity. When survival depends on alignment with market needs, institutions begin to internalize those needs as purposes. Education shifts from asking what kind of human being it hopes to form to asking what kind of graduate the economy requires.

Understanding this slow transformation matters because it prevents both nostalgia and resignation. The problem is not that universities evolved, nor that they engaged the world beyond their walls. The problem is that in learning how to sustain themselves, they quietly redefined what education was for. What was lost was not a romantic past, but a clear sense of orientation. What was lost was an understanding of education as a formative good rather than a transactional one.

This historical context sharpens the distinction that now presses forward: the difference between a scholar and an employable graduate. The erosion did not happen all at once, but its consequences are now unmistakable. To recover what matters about education, one must first recognize how slowly, and how reasonably, it drifted from its original center.

The Scholar’s Robe

Before the scholar became a professional, before knowledge became a commodity, the scholar occupied a distinct social and moral space. The robe was not merely ceremonial attire; it was a visible declaration that the scholar stood in a different relationship to time, value, and obligation. To wear the robe was to signal withdrawal from immediate utility and immersion in study. It marked a life oriented toward contemplation rather than production.

This separation mattered. The medieval and early modern scholar was not understood primarily as a worker, but as a steward of inherited knowledge and a participant in its ongoing refinement. The work of learning was slow, recursive, and dialogical. Authority was earned through sustained engagement with texts, arguments, and traditions, not through efficiency or output. The robe signified patience. The robe signified an acceptance that understanding unfolds over time and often resists closure.

At the heart of this scholarly formation stood the trivium: grammar, logic, and rhetoric. These were not subjects in the modern sense, but disciplines of mind. Grammar trained attention to language, cultivating precision, nuance, and care in expression. Logic disciplined reasoning, teaching the scholar to distinguish coherence from persuasion, validity from assertion. Rhetoric, far from mere ornamentation, addressed the ethical responsibility of speech within communal life. Together, these disciplines formed thinkers capable of judgment rather than mere performance.

What is striking, in retrospect, is how little the trivium concerned itself with application. It assumed that if one learned how to think clearly, speak responsibly, and reason well, usefulness would follow, but indirectly, and on terms shaped by judgment rather than demand. The scholar was not trained to fill a role; the scholar was formed to discern which roles were worth filling.

The robe thus represented more than intellectual status. It marked a refusal to collapse learning into labor. Scholars were not above society, but they were not reducible to its immediate needs. Their value lay precisely in their capacity to stand at a critical distance. Their value lay in their ability to question assumptions, to preserve inconvenient truths, and to engage questions that had no obvious payoff.

This distance is what the modern university has found increasingly difficult to justify. As institutions became embedded within economic systems, the scholar’s posture of separation came to appear indulgent, even irresponsible. The robe, once a symbol of dedication, began to look like an anachronism. In its place emerged the language of professionalism: productivity, relevance, deliverables. The scholar was recast as an expert, then as a knowledge worker, and finally as an employee within a complex organizational hierarchy.

The business suit did not replace the robe because it was more practical; it replaced it because it communicated alignment. It signaled that the scholar now operated within the same temporal and economic logic as everyone else. Research was expected to produce outcomes. Teaching was evaluated through metrics. Thought itself became accountable.

Something essential was lost in this transition. Not because scholars should be exempt from responsibility, but because the conditions necessary for genuine intellectual formation were eroded. When every activity must justify itself through measurable impact, the kinds of questions that define scholarship become harder to pursue. Why ask what justice is when one can measure compliance? Why linger with ambiguity when clarity is rewarded? Why risk dissent when alignment ensures stability?

The scholar’s robe once protected a space where such questions could be asked without apology. Its disappearance marks not simply a change in fashion, but a change in expectation. The scholar is no longer permitted to be unproductive in the short term, no longer granted the freedom to think beyond immediate usefulness. The world the robe signified, a world in which education was trusted to matter even when its benefits were not immediately visible, has grown fragile.

This does not mean that we must return to a cloistered academy or deny the realities of contemporary life. But it does require acknowledging that something irreplaceable was carried in that older identity. The scholar was not merely someone who knew things, but someone formed by the discipline of thinking itself.

The loss of the robe, then, is not about nostalgia. It is about the disappearance of a cultural permission. The permission to think slowly, deeply, and critically without first proving one’s economic worth. To understand what education has become, we must first understand what it no longer protects.

From Formation to Function

If the scholar was once formed, the contemporary student is increasingly prepared. The distinction may appear semantic, but it marks a profound shift in educational identity. Formation implies an open-ended process concerned with who a person becomes. Preparation implies readiness for a predefined role. The modern university, shaped by economic pressure and cultural expectation, has largely embraced the latter.

This shift did not occur because universities suddenly rejected intellectual depth. It occurred because function became easier to justify than formation. As higher education expanded in the twentieth century, particularly in the postwar era, institutions were asked to explain their value to governments, families, and students themselves. The answer increasingly took the form of employability. A degree was defended not as an initiation into a life of the mind, but as a credential that translated learning into economic opportunity.

Over time, this translation became the primary language of legitimacy. Programs were evaluated by placement rates. Majors were ranked by earning potential. Students were encouraged, often implicitly and sometimes explicitly, to think of education as an investment and themselves as future returns. In this framework, learning becomes instrumental. Its worth lies not in what it cultivates internally, but in what it produces externally.

The figure that emerges from this logic is the employable graduate. This graduate is adaptable, credentialed, and aligned with market needs. They possess competencies rather than convictions, skills rather than questions. Again, this is not a caricature; such graduates are often capable, hardworking, and earnest. The issue is not their character, but the narrowness of the educational horizon offered to them.

When universities center employability as their defining purpose, they implicitly teach students how to relate to knowledge. Knowledge becomes something to acquire efficiently, apply predictably, and monetize successfully. Learning is framed as accumulation rather than transformation. The question shifts from “What does this idea mean?” to “How will I use this?” Meaning is subordinated to utility.

This orientation subtly reshapes pedagogy. Courses are designed around outcomes rather than inquiry. Assessment privileges demonstration over exploration. Risk—intellectual, moral, or imaginative—is discouraged in favor of clarity and correctness. Students learn quickly what is rewarded: alignment with expectations, mastery of frameworks, and the reproduction of accepted answers. Independent thought becomes optional; compliance becomes strategic.

The employable graduate, then, is not trained to ask whether the structures they enter are just, humane, or sustainable. They are trained to function within them. Education becomes a mechanism of social reproduction rather than social critique. It produces workers capable of sustaining systems, but less capable of imagining alternatives.

This is where the distinction becomes urgent. A scholar is formed to interrogate the world; an employable graduate is prepared to inhabit it. A scholar understands that knowledge is provisional and contested; an employable graduate is taught to deliver solutions within existing parameters. One is oriented toward thought as a moral responsibility; the other toward productivity as a social expectation.

The tragedy is not that universities prepare students for work. It is that work has become the dominant metaphor through which learning is understood. When function replaces formation, education risks losing its capacity to surprise, to unsettle, and to provoke. It becomes efficient, legible, and safe.

And yet, the challenges facing contemporary society—ethical crises, political polarization, technological acceleration, environmental collapse—are not problems of insufficient skill. They are problems of judgment, imagination, and moral reasoning. They require thinkers capable of questioning assumptions, not merely executing plans.

The rise of the employable graduate thus reveals a deeper tension at the heart of modern education. We have learned how to prepare individuals to enter systems efficiently, but we have grown less confident in forming individuals capable of asking whether those systems deserve their allegiance.

Scholar versus Employee

The distinction between a scholar and an employee is not a matter of prestige, nor is it an argument for elitism. It is a distinction of orientation toward knowledge, toward authority, and toward the self. When education collapses these identities into one, it does not merely change outcomes; it reshapes the moral posture of learning itself.

A scholar is defined less by what they know than by how they stand in relation to knowledge. Scholarship is marked by attentiveness, skepticism, and humility. The scholar understands that ideas are provisional, that truth emerges through contestation, and that certainty is often a warning sign rather than an achievement. To be a scholar is to accept intellectual responsibility. To be a scholar is to recognize that thinking carries ethical weight and that inherited frameworks must be examined rather than merely applied.

An employee, by contrast, is defined by function. Their value lies in reliability, efficiency, and alignment with organizational goals. This is not a moral failing; societies require coordinated labor in order to function. But the employee’s relationship to knowledge is instrumental. Knowledge is something one uses to accomplish tasks, solve problems, and meet expectations. Questions are welcome insofar as they improve performance; doubt is tolerated only when it leads to resolution.

The problem arises when education treats these orientations as interchangeable.

When universities imagine the ideal graduate primarily as an employee-in-waiting, they tacitly teach students how to inhabit authority. Authority becomes something external and given, located in systems, metrics, and managerial frameworks, rather than something to be examined, challenged, or negotiated. The student learns that success depends on correctly interpreting expectations and delivering acceptable outputs. Thinking becomes adaptive rather than critical.

The scholar, by contrast, is formed to be uneasy with authority. Not hostile, but alert. The scholar asks who benefits, whose voice is missing, and what assumptions remain unspoken. This posture is inherently inefficient. It slows decision-making, complicates narratives, and resists closure. For precisely these reasons, it is often inconvenient. But it is also indispensable.

This distinction matters because societies advance not through obedience, but through friction. The ideas that reshape cultures rarely emerge from smooth compliance. They arise from individuals educated to see beyond what is immediately given, to imagine alternatives, to recognize contradictions, and to speak when silence would be easier. Such capacities are not accidental byproducts of education; they must be cultivated deliberately.

When higher education prioritizes employability above all else, it risks producing graduates who are highly capable but intellectually deferential. They know how to operate within systems, but not how to question them. They follow procedures, but struggle to articulate why those procedures exist or whether they should. In this way, education can unintentionally train people to mistake order for wisdom and efficiency for truth.

A system optimized for employability tends to reward conformity masked as professionalism. Original thought becomes risky. Moral dissent becomes inconvenient. The goal is not to form minds capable of resistance, but personalities capable of integration.

The scholar resists this reduction. To be educated as a scholar is to learn that thinking is not merely a tool, but a responsibility. It is to understand that one’s highest loyalty is not to productivity, but to truth-seeking even when that truth unsettles existing arrangements. This does not make scholars unemployable; it makes them dangerous in the most necessary sense of the word.

If education abandons this distinction, it forfeits one of its most important social functions. It ceases to be a space where alternative futures can be imagined and becomes instead a mechanism for reproducing the present. The question, then, is not whether universities should prepare students for work (they must), but whether they will also insist on forming thinkers capable of asking what kind of work, and what kind of world, is worth sustaining.

Teaching How to Think versus Training What to Do

At the center of this distinction lies a deceptively simple claim: education should teach people how to think, not what to think. It is a phrase often repeated and rarely examined. In practice, teaching how to think is far more demanding, more disruptive, and more difficult to institutionalize than training students to perform tasks correctly. It resists standardization. It cannot be rushed. And it does not always produce outcomes that can be predicted in advance.

To teach how to think is to invite uncertainty into the classroom. It requires confronting students with questions that do not resolve neatly, arguments that unsettle prior assumptions, and texts that refuse easy moral alignment. This kind of education privileges process over product. It values the quality of engagement more than the correctness of conclusion. Above all, it insists that thinking is an active, ethical practice rather than a technical skill.

Training, by contrast, is oriented toward execution. It asks what must be done and how efficiently it can be accomplished. Training rewards clarity, repetition, and compliance. It assumes that the goals are already known and that the task of education is to prepare individuals to meet them. Within this model, thinking becomes subordinate to doing. Reflection is permitted only insofar as it improves performance.

The humanities historically occupied the space where thinking itself was the subject. Philosophy trained students to recognize flawed reasoning and to sit with ambiguity. History taught them that institutions are contingent, not inevitable. Literature cultivated empathy and moral imagination by placing readers inside lives not their own. These disciplines did not aim to produce consensus; they aimed to produce discernment.

What makes such education uncomfortable, particularly for institutions under pressure to demonstrate value, is that it cannot guarantee agreement. A student who has learned how to think may arrive at conclusions that are unpopular, inconvenient, or disruptive. They may question authority, challenge inherited norms, or refuse the premises upon which systems operate. From the perspective of efficiency, this looks like failure. From the perspective of democracy and cultural vitality, it is essential.

When education shifts its emphasis toward training, it subtly reshapes student behavior. Students learn to anticipate what instructors want rather than to pursue questions that matter to them. They optimize for grades, credentials, and approval. Risk-taking becomes irrational. Intellectual obedience masquerades as professionalism. Over time, this produces graduates who are adept at navigating systems but hesitant to interrogate them.

The fear that education is increasingly shaping individuals to follow instructions and generate products rather than to cultivate ideas is not a speculative anxiety; it is structural. It is woven into curricular architectures, reinforced through assessment regimes, and echoed in institutional language that defines success as alignment with predetermined outcomes. Within such a framework, intellectual divergence comes to appear not as curiosity, but as risk. Creativity is welcomed only insofar as it can be repackaged as innovation, and thought that resists immediate economic justification is quietly edged toward the margins.

This is not merely an academic concern; it is a civic one. Societies depend on individuals capable of critical judgment, moral reasoning, and imaginative foresight. The most pressing challenges of our time, such as technological power without ethical clarity, political polarization without dialogue, and economic growth without meaning, are not problems that can be solved through training alone. They require people educated to think beyond instruction.

Teaching how to think does not guarantee wisdom. But it creates the conditions under which wisdom remains possible. It prepares individuals not merely to function within the world as it is, but to question whether that world ought to remain unchanged. In doing so, it honors education’s deeper responsibility: not to reproduce society efficiently, but to renew it thoughtfully.

When universities replace this mission with the language of readiness and relevance, they risk mistaking immediacy for importance. The question is not whether graduates can do what is asked of them. The question is whether they have been taught to ask whether what is being asked is worth doing at all.

The Degree as Commodity

When a degree becomes a commodity, its meaning shifts in ways that are subtle yet profound. Commodities are valued not for what they are, but for what they can be exchanged for. Their worth is determined by market conditions, scarcity, and perceived return on investment. Once education enters this logic fully, it does not cease to be education, but it is compelled to speak a language that steadily erodes its deeper claims.

The contemporary degree functions increasingly as a transactional object. Students purchase access, complete requirements, and receive a credential that promises future economic benefit. This promise is not incidental; it is often the primary justification offered to prospective students and their families. Marketing materials emphasize career outcomes, salary projections, and employability statistics. Education is framed less as a formative journey and more as a product with deliverables.

In this framework, achievement is redefined. The degree no longer signifies sustained engagement with ideas, traditions, and modes of thought; it signifies successful compliance with institutional procedures. Credit hours replace mastery. Benchmarks replace judgment. Completion becomes the dominant metric, regardless of what has taken shape internally. The credential certifies endurance and conformity as much as understanding.

This is not to deny the real labor students undertake, nor the genuine learning that still occurs. Rather, it is to note that the meaning attached to the degree has been thinned. When education must justify its cost primarily through economic payoff, it invites a narrowing of purpose. Students learn, often implicitly, that the value of their education lies not in who they become, but in what the credential allows them to access.

Universities, operating as self-sustaining businesses, are not immune to this pressure. Rising operational costs, reduced public funding, and competitive markets force institutions to frame education in terms that attract enrollment and investment. The result is a feedback loop: students demand economic justification because institutions market it; institutions market it because students demand it. In the process, education becomes both expensive and fragile, priced highly while defended narrowly.

This commodification also alters the student’s relationship to learning. When education is purchased, dissatisfaction is framed as consumer grievance rather than intellectual struggle. Difficulty becomes a service failure. Ambiguity becomes inefficiency. The slow, often uncomfortable work of thinking is reinterpreted as poor design. Education, once understood as transformative, is now expected to be smooth.

What is lost here is not merely romance, but integrity. A commodity must satisfy the consumer; education, at its best, unsettles the learner. A commodity confirms preference; education challenges assumption. When these logics collide, education is pressured to behave like something it is not. The degree becomes a proxy for value rather than a testament to formation.

This dynamic helps clarify the paradox at the center of contemporary higher education: education becomes increasingly expensive even as its meaning is quietly diminished. Rising costs reflect expanding infrastructure, administrative complexity, and market competition, while the substance of education is narrowed by the pressure to promise certainty and measurable return. The result is a steady inflation of credentials accompanied by a corresponding deflation of intellectual ambition.

Seen through the lens of The Son of Man, the degree functions much like the apple. It is tangible, recognizable, and reassuring. It signals legitimacy. But it also obscures the face behind it. It obscures the actual work of thinking, questioning, and becoming. We accept the symbol because it is easier to exchange than the reality it replaces.

If education is reduced entirely to commodity, it risks hollowing itself out. Not because markets are inherently corrosive, but because learning cannot be fully captured by transactional logic. The value of education lies precisely in what exceeds immediate exchange: the capacity to think critically, to imagine alternatives, and to act with moral awareness.

When the degree becomes the goal rather than the byproduct of formation, education begins to mistake its own symbol for its purpose. And once that confusion takes hold, reclaiming the deeper meaning of achievement becomes not only difficult, but urgent.

The Civic Consequences of a Narrow Education

The consequences of an education oriented primarily toward employability do not remain confined to campuses or career centers. They radiate outward, shaping the intellectual and moral texture of civic life itself. When universities narrow their purpose to the production of efficient workers, society inherits not merely a workforce, but a citizenry trained to function rather than to judge.

Democratic societies depend on more than technical competence. They require citizens capable of interpreting complex information, recognizing manipulation, and engaging disagreement without retreating into slogans or certainty. These capacities are not natural; they are learned. Historically, they were cultivated through exposure to philosophy, history, literature, and rhetoric: disciplines that trained individuals to weigh arguments, recognize contingency, and tolerate ambiguity. When such formation weakens, public life becomes brittle.

A narrow education produces a particular civic posture: deference to systems that present themselves as inevitable. Graduates trained primarily to operate within existing structures are less likely to question the assumptions those structures rest upon. They may be skilled at implementation, but they are often hesitant when confronted with foundational questions: Who benefits? Who is excluded? What alternatives exist? In the absence of these questions, efficiency becomes a substitute for justice.

This helps explain a paradox of contemporary life. Never before have societies possessed such technical capacity, and yet public discourse feels increasingly shallow, polarized, and reactive. The problem is not a lack of information, but a lack of interpretive skill. Citizens are flooded with data but underprepared to evaluate it. Opinion replaces argument. Certainty replaces inquiry. These are not failures of intelligence; they are failures of education.

When education emphasizes compliance and performance, it habituates individuals to external authority. Metrics replace judgment. Expertise is deferred to rather than examined. Over time, this produces a civic culture in which people are either obedient or cynical, but rarely engaged in sustained, reflective participation. Thought becomes outsourced to algorithms, to institutions, to experts rather than exercised as a shared responsibility.

The humanities once served as a counterweight to this tendency. They did not provide answers so much as they cultivated habits of questioning. They taught that history could have unfolded differently, that language shapes reality, and that moral clarity is often hard-won. Without these habits, societies struggle to imagine futures that differ meaningfully from the present. Innovation becomes technological rather than ethical. Progress is measured by speed rather than wisdom.

The concern that education increasingly produces followers rather than thinkers finds its civic consequence here. A society shaped primarily by followers may appear orderly, but it remains fundamentally vulnerable: susceptible to demagoguery, to the seductions of simplification, and to the gradual erosion of democratic norms. Without citizens formed to think critically and independently, public life shifts from deliberation to performance, from shared reasoning to managed spectacle.

This is not an argument for abandoning practical education, nor for retreating into an ivory tower. It is an argument for balance. Work matters. Skills matter. But when education forgets its responsibility to cultivate judgment, society pays a quiet but enduring price. The ability to earn a living does not guarantee the ability to live well together.

In this sense, the loss being traced is not merely academic, but civic and moral. An education that once sought to form individuals capable of critique now risks shaping individuals optimized for compliance. The former sustains democratic life by cultivating judgment and dissent; the latter stabilizes existing systems while leaving their legitimacy largely unquestioned.

If the university has become a business, then the question is not whether it can remain profitable, but whether it can still serve the public good in its deepest sense. The answer depends on whether education can reclaim its role as a space where thinking is valued not for its efficiency, but for its capacity to orient society toward justice, meaning, and shared responsibility.

Returning to the Painting: What Remains Hidden

At this point, it becomes necessary to return to The Son of Man, not as an illustration of the argument, but as its quiet adjudicator. René Magritte’s painting has waited patiently in the background, offering no solutions and demanding no conclusions. Its power lies precisely in its refusal to resolve the tension it creates.

The man remains standing. The suit remains impeccable. The apple remains suspended. Nothing has been corrected.

This is important. Magritte does not remove the obstruction. He does not reward our analysis with revelation. The face, the symbol of subjectivity, agency, and moral presence, remains hidden, not because it has been destroyed, but because it has been replaced by something easier to accept. The apple is not violent. It is ordinary, even comforting. And yet, it blocks the encounter that matters most.

In this way, the painting mirrors the condition of contemporary education with unsettling accuracy. The scholar has not disappeared. The intellectual life has not been eradicated. It has been covered. It has been covered by credentials, outcomes, employability statistics, and marketable assurances. These are not false goods. But they function as substitutes. They promise legitimacy while diverting attention from formation.

What remains hidden is not knowledge, but thinking. Not information, but judgment. Not productivity, but purpose.

The business suit in the painting is crucial here. It signals respectability and belonging. The figure is not marginal or rebellious; he is compliant with social expectation. This is not the image of an enemy of learning, but of its domestication. Education, too, now stands respectably clothed, fluent in the language of markets and metrics. It appears successful. It appears rational. And yet, the face behind it, the question of what education is ultimately for, remains obscured.

The sea behind the figure continues to trouble the composition. It suggests depth, instability, and forces that exceed containment. It is the realm of uncertainty, possibility, and risk. Education, at its best, belongs partly to this sea. It invites students into intellectual waters that cannot be fully mapped in advance. It exposes them to questions that do not resolve neatly into outcomes. But institutions, seeking stability, prefer the wall in front of the sea. They prefer a boundary, a safeguard, a manageable horizon.

The apple hovers precisely at the intersection of these forces. It keeps the subject legible while protecting the system from the unpredictability of genuine thought.

What this painting ultimately reveals is not deception, but accommodation. The man has accepted the apple. He does not push it away. Likewise, universities have largely accepted the substitution of formation with function, not because they reject thinking, but because they have learned to survive without foregrounding it. Over time, survival becomes justification. Justification becomes identity.

And yet, Magritte reminds us that what is hidden still exists. The face is there. The scholar remains. The capacity for education to form thinkers rather than merely produce graduates has not vanished. It has simply been covered by symbols that promise clarity without depth.

It is here that this essay must resist the temptation of a tidy resolution. There is no simple restoration, no return to robes and cloisters, no wholesale rejection of work or economic life. The task is more demanding than nostalgia allows. It requires learning to see beyond the apple. It requires us to recognize the moment when symbols of value begin to stand in for value itself.

To return to education, then, is not to abandon employability, but to refuse its dominance. It is to insist that thinking is not a luxury, but a responsibility. It is to remember that societies do not move forward merely by producing more efficient workers, but by cultivating minds capable of imagining alternatives, resisting injustice, and asking uncomfortable questions.

Magritte offers no prescription, only a demand for attention. He reminds us that whenever something essential is hidden, the ethical task is not to accept the substitution, but to ask what we have agreed not to see.

Conclusion: Reclaiming the Face

To reclaim the face in The Son of Man is not to remove the apple, but to recognize what it conceals. René Magritte does not invite us to destroy the symbol; he invites us to become aware of our willingness to accept it. That awareness, applied to education, is where any honest recovery must begin.

The transformation of higher education into a self-sustaining business did not occur because institutions abandoned their ideals, but because they learned to survive in a culture that increasingly demands justification in economic terms. Over time, survival logic hardened into identity. The degree became a product. The graduate became an outcome. Thinking became valuable insofar as it could be operationalized. What was lost was not learning itself, but confidence in learning as a human good that exceeds immediate utility.

Reclaiming the face of education, then, does not mean rejecting employability or denying the realities of work. It means refusing to allow employment to exhaust education’s purpose. A society that educates only for function risks becoming efficient but hollow. It risks an education capable of producing goods without producing meaning, systems without wisdom, and compliance without judgment.

The scholar matters because the scholar embodies a different relationship to the world. The scholar is not opposed to work, but oriented toward understanding. Not resistant to structure, but alert to its limits. Not indifferent to outcomes, but unwilling to let outcomes replace inquiry. This posture, slow, critical, and reflective, is increasingly difficult to defend in institutions shaped by speed, scale, and market logic. Yet it is precisely this posture that enables societies to correct themselves.

To teach people how to think rather than what to think is to accept risk. It is to form individuals who may challenge authority, disrupt consensus, and resist inherited assumptions. Such individuals are inconvenient. They do not always integrate smoothly. But they are necessary. Without them, societies stagnate. They repeat themselves efficiently until they collapse under the weight of unexamined belief.

The tragedy being named is not merely educational, but moral. When education trains individuals primarily to follow instructions and deliver outcomes, it relinquishes its responsibility to cultivate conscience and imagination. Obedience is privileged over judgment, productivity over purpose, and in that exchange the very face that gives education its meaning is quietly obscured.

And yet, the face remains.

Behind the credentials, behind the metrics, behind the language of readiness and relevance, there persists a quieter truth: education matters because thinking matters. Not thinking as technique, but thinking as an orientation toward truth, toward justice, toward the possibility that the world might be otherwise than it is.

The task before education is not restoration, but remembrance. To remember that a degree is not an end, but a beginning. That employability is not a telos, but a consequence. That the deepest work of learning cannot always justify itself in advance, because its value lies in what it makes possible rather than what it produces.

Magritte leaves us standing before a figure we cannot fully see. Education stands before us in much the same way: recognizable, respectable, and partially obscured. Whether we choose to accept the apple or to ask what it hides will determine not only what our universities become, but what kind of society they sustain.

To reclaim the face is to insist that education is not merely preparation for life, but participation in its meaning. It is to defend thinking not as a luxury, but as a civic and moral necessity. And it is to affirm, quietly but firmly, that an educated person is not simply someone who can do what is asked, but someone who knows how and when to ask otherwise.