<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
        xmlns:content="http://purl.org/rss/1.0/modules/content/"
        xmlns:wfw="http://wellformedweb.org/CommentAPI/"
        xmlns:dc="http://purl.org/dc/elements/1.1/"
        xmlns:atom="http://www.w3.org/2005/Atom"
        xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
        xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
        xmlns:media="http://search.yahoo.com/mrss/"
        xmlns:snf="http://www.smartnews.be/snf"
        >
<channel>
        <title>The MIT Press Reader</title>
        <atom:link href="https://thereader.mitpress.mit.edu/tag/values/feed/" rel="self" type="application/rss+xml" />
        <link>https://thereader.mitpress.mit.edu/</link>
        <description>Illuminating the bold ideas and voices that make up the MIT Press's expansive catalog. </description>
        <lastBuildDate>Thu, 02 Apr 2026 09:55:00 +0000</lastBuildDate>
        <language>en-US</language>
        <sy:updatePeriod>hourly</sy:updatePeriod>
        <sy:updateFrequency>1</sy:updateFrequency>
        <snf:logo><url>https://thereader.mitpress.mit.edu/wp-content/themes/ta/img/mitp-reader-smartnews.png</url></snf:logo>
        <generator>https://wordpress.org/?v=6.9.4</generator>
        			                <item>
                        <title>‘Backrooms’ and the Rise of the Institutional Gothic</title>
                        <link>https://thereader.mitpress.mit.edu/backrooms-and-the-rise-of-the-institutional-gothic/</link>
                        <pubDate>Thu, 02 Apr 2026 09:55:00 +0000</pubDate>
                        <dc:creator>Shira Chess</dc:creator>
                        		<category><![CDATA[Backrooms]]></category>
		<category><![CDATA[Gothic]]></category>
		<category><![CDATA[Internet]]></category>
		<category><![CDATA[Liminal]]></category>
		<category><![CDATA[Video Games]]></category>
		<category><![CDATA[Media]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19541</guid>
                        <description><![CDATA[<p>A new spin on an old genre replaces flesh-and-blood monsters with the mundanity of modern bureaucracy.</p>
]]></description>
                        <content:encoded><![CDATA[<p>A new spin on an old genre replaces flesh-and-blood monsters with the mundanity of modern bureaucracy.</p>

<figure class="wp-block-image">
<img width="700" height="394" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/Backrooms_model-700x394.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>A version of “The Backrooms,” from the original 4chan creepypasta. </figcaption>
</figure>

<p class="has-drop-cap">In February, A24 released a <a href="https://www.youtube.com/watch?v=tKGhxMi50y8" target="_blank" rel="nofollow">movie teaser</a> that was likely difficult for many to parse. The promo for its upcoming film, “Backrooms,” features no characters, no plot, and no music. Instead, the camera moves downward through layers of uncanny interiors, accompanied by a narrator who recalls a “massive” space full of rooms that “build” and “remember” themselves. If you watched this video without context, you might have come away confused. A <a href="https://www.instagram.com/p/DWjfwilEeI4/" target="_blank" rel="nofollow">second trailer</a> released this week offers only slightly more detail, with a man obsessively telling his therapist about an uncannily infinite space: “Sometimes I’m scared I’ll get lost,” he admits, before rhapsodizing, “It’s beautiful… am I right?”</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262553889/the-unseen-internet/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/demon-jckt.jpg" alt="" class="wp-image-19544"/></a><figcaption class="wp-element-caption">Shira Chess is the author of “<a href="https://mitpress.mit.edu/9780262553889/the-unseen-internet/" target="_blank" rel="noreferrer noopener">The Unseen Internet</a>.”</figcaption></figure>
</div>


<p>However, there’s a surprisingly deep history behind “Backrooms.” It’s one that touches on everything from Gothic literature to internet folklore to video game culture to ’80s nostalgia. But above all, “Backrooms” captures a <em>feeling</em> — and one that I would argue has become a defining condition of life under Corporate America: dread.</p>



<p>To unpack this feeling — and how it comes into play in “Backrooms” — we must first gesture toward “liminality,” a term that seems to be suddenly creeping out of academia and into the mainstream. The term, coined over 100 years ago by anthropologist Arnold van Gennep, originated in reference to a ritual threshold space. Throughout the 20th century, it was used to capture the disorientation one feels in transitional, in-between spaces. By the 2010s, the internet had reified liminality into a full-fledged visual “aesthetic”: Think abandoned bowling alleys, vintage airport terminals, and deserted playgrounds at dusk. More often than not, liminal aesthetics are human-made spaces, sans humanity.</p>



<p>It was out of this context that the idea for “The Backrooms” emerged, first as “creepypasta” — internet slang for a spooky story that’s cut and pasted so many times that people lose sight of its original authorship. Like all good creepypastas, the post was thin in lore, leaving ample room for endless interpretation and reinterpretation. It appeared on 4chan’s /x/, the paranormal-positive board of the infamously anonymous internet hate machine, and was conjured in response to a prompt asking for spaces that looked <em>wrong</em>. One anon posted a picture of an eerily empty, yellowed office space, alongside the text:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>If you’re not careful and you Noclip out of reality in the wrong areas, you’ll end up in the Backrooms, where it’s nothing but the stink of old moist carpet, the madness of mono-yellow, the endless background noise of fluorescent lights at maximum hum-buzz, and the approximately six hundred million square miles of randomly segmented empty rooms to be trapped in. </p>



<p>God save you if you hear something wandering around nearby, because it sure as hell has heard you.</p>
</blockquote>



<p>That term, “noclip,” is meaningful here. It’s video game-speak for falling through what appears to be a stable in-game object — a failure of digital collision detection. This kind of slippage combines esoteric notions about the flimsiness of reality with gaming logic, not unlike the notion that real-life people are non-player characters (NPCs) or the hypothesis that we live in a computer simulation. Like the simulation hypothesis, noclipping presents the world as an imperfect construct built by unseen programmers and suggests that humans are on the brink of becoming digital objects themselves.</p>



<p>For a while, “The Backrooms” was a crowdsourced effort, confined purely to niche corners of the internet. Its fans created level after level of wiki-madness, posting thousands of eerie officescapes one might accidentally noclip into. Then it got bigger: In 2022, then-17-year-old Kane Parsons (who’s also director of the A24 film) created an eponymous web series, supplementing the aesthetic with deep lore. Parsons&#8217; creation, which he designed with 3D modeling software often used for game development, soon went massively viral, with over 190 million views to date.</p>



<figure class="wp-block-pullquote"><blockquote><p>In these new tales, offices become inescapable traps — long winding corridors with no way out.</p></blockquote></figure>



<p>In the nine-minute, 14-second pilot, Parsons carries his camera in a game-like first-person perspective. He plays the role of a 1991 indie filmmaker who suddenly falls downward into duplications of the same interior, navigating an eerie, brightly lit, otherwise empty office of the “Async Research Institute.” After a brief exploration, he is stalked by a chimeric creature (a “lifeform”) composed of wires and unknown organic materials that looks more machine than human. As with all found-footage horror, it doesn’t end well for anyone.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Thanks in large part to “The Backrooms,” media scholars have begun to grapple more seriously with the sudden popularity of liminal aesthetics in online folklore. </p>



<p><a href="https://journals.sagepub.com/doi/10.1177/14614448241238395" target="_blank" rel="nofollow">Bradley Earl Wiggins</a> argues the genre taps into a “nostalgia” that offers a “critique of consumerist society during perceived notions of late capitalism” through its hyperreal game-like visuals — reproductions of things that never physically existed. Likewise, <a href="https://online.ucpress.edu/fq/article/79/3/8/217405/The-Surprising-Folklore-of-Analog-Horror" target="_blank" rel="nofollow">Elinor Dolliver</a> characterizes the appeal as a new spin on storytelling in the analog-horror style, which gives the impression of folklore without any connection to actual history. I agree with both takes, but I would further suggest the rise of liminality evinces the resonance of a new media genre altogether — one which I’d call the Institutional Gothic.</p>



<p>To understand this shift, we need to wind back the clock a couple of hundred years.</p>



<p>Horace Walpole’s “The Castle of Otranto” (1764), a tale of crumbling aristocracies, secrets, revenge, murder, and hauntings, is widely regarded as the original Gothic novel. The genre exerted its influence throughout the 19th century, with what <a href="https://www.routledge.com/Gothic/Botting/p/book/9780415831727" target="_blank" rel="nofollow">Fred Botting</a> refers to as the “return of the pasts on the present” — the sins of older generations bearing upon the young. Its settings were necessarily both familiar and mysterious, often situated in dramatic, desolate landscapes. The Gothic was marked by duplicitous, monstrous antagonists, terrified heroines, fragmented narratives, and the supernatural. The genre’s particularities changed with new locales and time periods, but at its thematic root, the Gothic has always been a negotiation of cultural and social mores, reminding us that our present can never escape the ghosts of our past.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1920" height="1080" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/image-1.png" alt="" class="wp-image-19556" style="width:644px;height:auto"/><figcaption class="wp-element-caption">Gameplay from “<a href="https://www.stanleyparable.com/" target="_blank" rel="nofollow">Stanley Parable: Ultra Deluxe</a>.” Source: Crows Crows Crows.</figcaption></figure>
</div>


<p>Which brings us to the <em>Institutional</em> Gothic. Like the traditional Gothic, the Institutional Gothic involves uncanny spaces, malevolent forces, and overwhelming discomfort related to spatiality and power. But where the traditional Gothic is dark and looming with ornate architecture, the Institutional Gothic occurs in winding or otherwise empty office spaces, consumed by machine-made mundanity and the unforgiving gaze of noisy overhead fluorescent lighting. The antagonists, once bloodthirsty lords, are instead soulless corporations. The protagonists, once women at the mercy of those lords, are now often white men, wandering fearfully or uncomfortably through those catacombs. </p>



<p>Taken together, the Institutional Gothic transforms a genre once fueled by phantasmal terror into the familiar, worldly dread of workplace alienation.</p>



<p>There are many examples of this beyond “The Backrooms.” The video game “The Stanley Parable” and the Apple TV+ series “Severance” (filmed at the semi-abandoned Bell Works complex in New Jersey) share the same aesthetic. Aspects of the Institutional Gothic have also crept into real life in recent years, with the <a href="https://www.wsj.com/real-estate/commercial/empty-manhattan-offices-coronavirus-reopen-workplace-lockdown-newyork-covid-economy-11597157584" target="_blank" rel="nofollow">emptied fiefdoms</a> of our post-pandemic workplaces. The aesthetic has <a href="https://www.nytimes.com/2025/10/16/style/mall-world-dreams-tiktok.html" target="_blank" rel="nofollow">apparently</a> even found its way into our subconscious: Thousands of people online last year shared the experience of having the same lucid dream — wandering through empty food courts and climbing stairs to nowhere in a giant abandoned “Mall World.”</p>



<figure class="wp-block-pullquote"><blockquote><p> “The Backrooms” reminds us that we have no choice but to negotiate with the monsters and sins of our past.</p></blockquote></figure>



<p>When it comes to “The Backrooms,” though, we find all the hallmarks of the Institutional Gothic: the labyrinthine hallways, bright lights, the bland “madness of mono-yellow,” and a supernatural subtext built out of the mythologies of corporate and government experimental woo-woo. It is no coincidence that “The Backrooms” is set mostly in the ’80s and ’90s, a time of great prosperity for the American middle class. It was the heyday of cubicles, where infinitely reusable office spaces, despite the booms and busts of modern capitalism, seemed as though they’d be useful forever. By the early 2000s, the cubicles were mostly eclipsed by panopticon-friendly open floor plans. And today, with many industries scaling back, our old institutions are increasingly emptied out.</p>



<p>If traditional Gothic is about the sins of the past revisiting the present, then the Institutional Gothic echoes that trope by focusing not on the class-based horrors of 200 years ago, but on the (still class-based) corporate choices made within the 20th century. The middle class has been alienated and abandoned by Corporate America. In these new tales, offices become inescapable traps — long winding corridors with no way out.</p>



<p>Of course, monsters remain at the heart of the Institutional Gothic, too. The “lifeform” of “The Backrooms” was never the real monster, just as the creature was never the real monster in Mary Shelley’s Gothic horror “Frankenstein.” Rather, the monster was always the <em>creator</em>, which, in the case of the Institutional Gothic, is the efficiency-seeking corporation. From environmental destruction to indifference to human harm, our 20th-century oligopolies paved the path to where we are now. Like a hydra, they cannot be killed; they just re-form under a new head.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">In one of the most suspenseful moments from “The Backrooms,” we see a group of men in hazmat suits holding a red tether line as they explore the alien office environment. It’s odd watching a cautious exploration into a space that would otherwise seem so familiar, the red line loudly pronouncing itself within all that mono-yellow.</p>



<p>But one can imagine a different version of this scene: a <em>future</em> humanity similarly excavating remains of corporate hallways that have since crumbled, wondering what life could have been like at the turn of the 21st century. What might our strange office spaces look like to the humans of the 2100s? What might they eventually look like to Gen Z and Gen Alpha, who may only know these environments through the ominous “Backrooms” or the goofy hijinks of “The Office”?</p>



<p>As we reckon with our lost spaces and our offices give way to new “lifeforms,” such as billion-dollar data centers for AI and cloud computing, liminality will continue to define this threshold moment between physicality and digitality. “The Backrooms” — in all its iterations — reminds us that we have no choice but to negotiate with the monsters and sins of our past as we noclip into an unknown future.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em><strong>Shira Chess</strong> is Associate Professor of Entertainment and Media Studies at the University of Georgia. She is the author of “<a href="https://mitpress.mit.edu/9780262044387/play-like-a-feminist/" target="_blank" rel="noreferrer noopener"><em>Play Like a Feminist</em></a>,” “<a href="https://www.upress.umn.edu/9781452954998/ready-player-two/" target="_blank" rel="noreferrer noopener nofollow"><em>Ready Player Two</em></a>,&#8221; and “<a href="https://mitpress.mit.edu/9780262553889/the-unseen-internet/" target="_blank" rel="noreferrer noopener"><em>The Unseen Internet</em></a>.” You can find more of her work on her <a href="https://unseeninternet.substack.com/" target="_blank" rel="nofollow">Substack</a></em>.</p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/Backrooms_model.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/Backrooms_model.jpg" />                                        </item>
        			                <item>
                        <title>Is Citizenship a ‘Blood Aristocracy’ in Disguise?</title>
                        <link>https://thereader.mitpress.mit.edu/is-citizenship-a-blood-aristocracy-in-digsuise/</link>
                        <pubDate>Mon, 30 Mar 2026 09:55:00 +0000</pubDate>
                        <dc:creator>The Editors</dc:creator>
                        		<category><![CDATA[America]]></category>
		<category><![CDATA[Citizenship]]></category>
		<category><![CDATA[Immigration]]></category>
		<category><![CDATA[Inequality]]></category>
		<category><![CDATA[Culture]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19657</guid>
                        <description><![CDATA[<p>Why the passport you inherit can determine your place — and potential — in a hierarchy of global inequality.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Why the passport you inherit can determine your place — and potential — in a hierarchy of global inequality.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/citizenship-cover-copy-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock</figcaption>
</figure>

<p class="has-drop-cap">On some level, life can be understood as a series of lotteries: genetic, familial, economic, and so on. These contingencies shape everything from our educational and professional opportunities to our freedom of movement and even life expectancies. </p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262537797/citizenship/" target="_blank"><img loading="lazy" decoding="async" width="320" height="448" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/citizenship-jkt-copy.jpg" alt="" class="wp-image-19658"/></a><figcaption class="wp-element-caption">Dimitry Kochenov is the author of “<a href="https://mitpress.mit.edu/9780262537797/citizenship/" target="_blank">Citizenship</a>.”</figcaption></figure>
</div>


<p>But few are as brutally determinative as the country in which we are born, argues Dimitry Kochenov. In his “<a href="https://mitpress.mit.edu/series/mit-press-essential-knowledge-series/" target="_blank">Essential Knowledge</a>” book, “<a href="https://mitpress.mit.edu/9780262537797/citizenship/" target="_blank">Citizenship</a>,” the Soviet-born Dutch legal scholar interrogates how the modern citizenship regime operates not merely as a legal framework but as an engine of global inequality that preserves a kind of “blood aristocracy.” International rules governing citizenship, he contends, constrain the potential of billions of people in the Global South by trapping them in their circumstances of birth, all while citizens of Western nations enjoy privileged access to healthcare, jobs, and international mobility. “Citizenship,” the author writes, “is never and has never been neutral.”</p>



<p>In the following interview, edited for length and clarity, Kochenov unpacks the debate around “open borders,” the murky realities of statelessness, and how citizenship has been weaponized in U.S. immigration policy. “If regular people don’t actually see the arbitrariness, the outrageousness, the inhumanity” of immigration enforcement, he says, “then they cannot have an open and informed conversation about the actual values of this society.” Increasingly, Kochenov adds, “Americans are learning about those values the hard way.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>When people think of the word “citizenship,” they probably imagine a set of rights and protections. Your book challenges that intuition dramatically — by arguing that citizenship is a system rooted in and perpetuating exclusion and hierarchy. How did you come to see citizenship this way?</em></strong></p>



<p><strong>Dimitry:</strong> Everything depends on who you are. If you lived your whole life in New York, you probably have no reason to think about citizenship critically. The critical angle only appears once you achieve a certain level of abstraction. So, for example, once you start crossing borders, once you start <em>comparing</em>. Without the vital element of comparison, there can be no talk of throwing this concept of citizenship down from its pedestal.</p>



<p>Once you start looking at things globally, citizenship is not what it seems. The law does not work in the same way for everybody. For example, <em>you</em> can vacation in the U.K., but most of the world’s population cannot dream of it because they know they will never obtain the necessary visa, as the U.K. is quite selective about whom it grants visas. Access only depends on one thing, essentially: your nationality.</p>



<figure class="wp-block-pullquote"><blockquote><p>Once you start looking at things globally, citizenship is not what it seems.</p></blockquote></figure>



<p>Citizenship was designed to fight the aristocracy. It started as a way to ensure that a random peasant — compared with a count who owns all the land — actually has a claim to equality under the law. Now, it is about preserving that aristocracy.</p>



<p><strong><em>You have a good quote in the book that I think speaks to the thought-terminating nature of citizenship: “Citizenship is totalitarian in nature — it does not emerge in dialogue.” Is that to say the entire world is organized around an authoritarian concept that’s non-negotiable?</em></strong></p>



<p><strong>Dimitry:</strong> Absolutely. I stand by my quote. All this is distributed by blood. Some states act as if this is not true; they would instead say citizenship is about their “values.” But then there is actually research on values when you compare these countries’ constitutions, party majorities, and political programs. The values are roughly all the same: tolerance, dignity, human rights, free speech, rule of law, blah, blah, blah. So, the idea that borders are also correlated with any kind of value system is absurd, because then they wouldn’t actually draw the boundaries they are designed to draw.</p>



<p>Just like in feudal times, rights, duties, and glass ceilings are distributed in today’s world by blood, and citizenship is the core tool of such distribution. With less than 2 percent of the world’s population ever changing citizenship after birth, global non-blood-based transmission of citizenship is within the margin of statistical error. Crucially, birth distribution depends chiefly on bloodlines, even in <em>ius soli</em> jurisdictions: the majority of babies born in U.S. territory will have at least one U.S.-citizen parent anyway. The spatial, as opposed to class-based, nature of global inequality completes this picture: Where opportunity and rights are spatialized, citizenship of the U.S., France, or Japan is an aristocratic title, a blood-based key to global opportunity, while the majority of other citizenships in the world are the exact opposite: a rightless liability you cannot refuse.</p>



<p><strong><em>Right. You can observe every American value in the book, know everything about American history, speak perfect English, and it still doesn’t make you a citizen…</em></strong></p>



<p><strong>Dimitry:</strong> Yes, you’ll be deported. ICE will come after you, and they will separate you from your children, who will be mistreated in some kind of cage. So much for “values.” In fact, <em>that </em>speaks more to the values of America, as well as to the values projected onto the population.</p>



<p>This is a tragedy in the contemporary world. We all say we have certain values, but then there is an asterisk: It doesn’t apply to the majority of humans, certainly not those who do not have the right color of passport, which our society has chosen. We are back to square one: blood aristocracy, which citizenship was originally designed to destroy. Someone who was born with an Afghan passport in Berlin and went to a Berlin public school and got a PhD in physics there — they will still not be entering the United States visa-free because they have an Afghan passport.</p>



<p>The other side of the same coin is that you often cannot renounce your citizenship by simply saying to your country of origin, “I don’t believe in your values. I would like something else.” In the majority of cases, citizenship never expires. This is something that can haunt you your whole life, your children, and the children of your children, because it also undermines the quality of any other passports or citizenships you might hold. The U.S. often will not let you enter if you also hold the citizenship of some “questionable” country.</p>



<p>The overwhelming majority of people in the world never change their citizenship, principally. This is the kind of thing that modern constitutionalism was supposed to supersede and overcome, but in fact it hasn’t.</p>



<p><strong><em>You spend a significant portion of the book discussing statelessness, which might seem like a rare anomaly, but in reality, <a href="https://www.unhcr.org/about-unhcr/who-we-protect/stateless-people" target="_blank" rel="nofollow">it affects millions of people worldwide</a>. What are some of the most common ways people become stateless?</em></strong></p>



<p><strong>Dimitry:</strong> In general, international law would at least strive to prevent statelessness. But in practice, the group is growing. There are plenty of reasons for this: It can be a state secession that went wrong — for example, when Latvia left the Soviet Union. Or sometimes countries simply don’t care about basic principles of equality in international law. For instance, they might believe that women can create human beings, but at the same time, they cannot create<em> citizens</em>, especially if they don’t have the “right” husband. (Consider Lebanon and other places in the Middle East or Belgium until the end of the ’80s. There are still plenty of nationalities that cannot be transferred via women.) Or some countries might abuse their idea of “values” in order to maliciously deprive you of your nationality.</p>



<p>Last year, though, I<a href="https://www.annualreviews.org/content/journals/10.1146/annurev-lawsocsci-041822-045326" target="_blank" rel="nofollow"> published a paper arguing that statelessness as a concept is meaningless</a>. And I will explain why: Statelessness in international law is when someone is not claimed by any public authority of any other state. It’s presumed that as a stateless person, you’re worse off than someone who has some citizenship. That might be true when comparing a stateless person in some horrible, dangerous place with someone who has citizenship in a desirable country. But in most cases, it doesn’t work that straightforwardly at all. Many “illegal” immigrants throw away their passports and lie about their origins because recognized statelessness puts them on a faster track to citizenship in the majority of cases. It also doesn’t transfer the stigma of their undesirable nationality to the next generation.</p>



<p><strong><em>I want to pivot to the U.S. As you’ve already alluded to, immigration has been a defining political issue since the post-Trump era — </em></strong></p>



<p><strong>Dimitry:</strong> It started before that. The border walls appeared more than 10 years ago. What changed — and I apologize for jumping in — is that this immigration enforcement reached local communities. Suddenly, Minneapolis is the border. Suddenly, Boston is the border.</p>



<div class="wp-block-image ma-related-post ma-related-post-normal"><figure class="alignright size-pinned is-resized"><a href="https://thereader.mitpress.mit.edu/anatomy-of-extremism-what-ice-is-revealing-in-minnesota/" title="Anatomy of Extremism: What ICE Is Revealing in Minnesota"><span class="ma-related-post-top"><span class="ma-related-post-heading">Related</span><span class="ma-related-post-title">Anatomy of Extremism: What ICE Is Revealing in Minnesota</span></span><img decoding="async" loading="lazy" class="ma-related-post-img" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Extremism-1-408x267.jpg" alt="" width="370" height="242"></a></figure></div>



<p>All this sorting of people based on their status is not something that is logically explainable within any established community, especially if it’s a democracy. This is not simply how we think. We don’t sort people by blood, closing our eyes to all the other factors, like education, wealth, beauty, you pick. But the border is<em> only</em> about blood, nothing else. When this border comes to your village and your township and starts sorting people precisely on that principle, everybody experiences it as a tragedy.</p>



<p>So, if you are demonstrating against ICE in Minneapolis, you are actually demonstrating against the main rule in global law governing the distribution of rights and duties in human population management, which is strictly enforced worldwide.</p>



<p><strong><em>Do you think that the way that U.S. immigration enforcement has historically happened out of public view has constrained the national conversation around the issue?</em></strong></p>



<p><strong>Dimitry:</strong> Absolutely. If regular people don’t actually <em>see</em> the arbitrariness, the outrageousness, the inhumanity, and the demeaning exclusion based on no rational ground, then they cannot have an open and informed conversation about the actual values this society or any other is built on. Suddenly, Americans are learning about those values the hard way.</p>



<p>But also, America is not the worst example. I think the European Union is number one in terms of how harmful its immigration enforcement is, when you consider deaths, kidnappings, and torture. Look at the <a href="https://www.unicef.org/press-releases/approximately-3500-children-have-died-central-mediterranean-over-past-10-years" target="_blank" rel="nofollow">3,500 migrant children who have died</a> over the past 10 years attempting to cross the Mediterranean for Italy, as well as the <a href="https://missingmigrants.iom.int/region/mediterranean" target="_blank" rel="nofollow">34,000-plus migrants still missing today</a>. In Europe, all this happens out of public view and is, essentially, fully legal through the creative use of legal techniques that I, with a co-author from Yale Law School, <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5051949" target="_blank" rel="nofollow">described as ‘EU Lawlessness Law’</a>.</p>



<p><strong><em>Open borders are often invoked, especially on the right, as a caricature of a completely rule-free global order. But what, in your mind, would a world with open borders actually look like in practical terms?</em></strong></p>



<p><strong>Dimitry:</strong> To me, the borders are already open. Like, when I show my Dutch passport at any border, I never encounter any obstacles. The majority of my friends — those who are citizens of the global aristocracy — never do either.</p>



<p>Which means that when we speak of “opening” or “closing” the borders, we only speak about people who are “not like us.” It only applies to, say, someone from the Central African Republic, or to someone from Bogotá, or to someone from Algeria — not to a Frenchman and not to a guy from Tokyo. And this is what undermines the quality of the academic debate on this issue; they never make that disclaimer. They always pretend that the U.S. border is equally meaningful for a Mexican and for a Canadian.</p>



<p>This one-way-ness is actually just a cover-up for how wrong the starting point is. We should approach people as human beings rather than as citizens. Then, we might have a totally different conversation. We have to admit that blood is not the right proxy for security or any other kind of selection.</p>



<p><strong><em>So, then, I guess the question is: What would be the more socially ideal way to sort people, to regulate who’s coming in and out, who can reside, who can work, etc., within that framework?</em></strong></p>



<p><strong>Dimitry:</strong> It’s an interesting question. Look at the European Union: Having non-discrimination on the basis of nationality as a starting point works.</p>



<p>Of course, you could say the E.U. consists merely of the richest countries, etc., and that’s true. But it’s also <em>not</em> true because, for example, Bulgaria’s GDP per capita is less than one-sixth of Ireland’s — a bigger discrepancy than that between Mexico and the U.S. So, to pretend that borders are meaningful and that opening them is dangerous, at least in the context of the E.U., is absolutely baseless.</p>



<p>If you suddenly start treating people as human beings based on the data they submit, you might discover that, actually, you <em>can</em> open the border for plenty of people and fine-tune the system along the way. They will not be overstaying. They will not be violating the objectives that states set for themselves. In fact, many states <em>already</em> review personal data beyond passports to determine who should be able to cross their borders, as more and more countries — the U.S., Australia, the U.K., and the Schengen Zone members — now require pre-travel authorizations from all foreign travelers. Broader deployment of modern information technology could turn such screening into a much more effective tool than the good old passport color test.</p>



<p><strong><em>I want to end on a question that’s a bit more personal. You grew up through the fall of the Soviet Union. How do you think that might have colored the way you think about citizenship from an early age?</em></strong></p>



<p><strong>Dimitry:</strong> I was lucky because it was a time when schools didn’t know what to teach. I didn’t receive what might be considered a “normal” indoctrination. You know, <em>love your motherland</em>. It was great that the U.S.S.R. was dissolving, but it’s not as though it suddenly disappeared. </p>



<p>I think the Olympic Games were in ’92, and by then, there was no Soviet Union. The team simply flew the Olympic flag. Who were those guys competing for? The simple answer was that nobody knew. Even those who were in charge of particular provinces or countries didn’t know.</p>



<figure class="wp-block-pullquote"><blockquote><p>We should approach people as human beings rather than as citizens.</p></blockquote></figure>



<p>Now, one or two generations after mine, <a href="https://www.theguardian.com/world/2025/feb/09/many-teachers-dont-want-to-do-this-but-theyre-trapped-film-shows-extent-of-putin-indoctrination-in-russian-schools" target="_blank" rel="nofollow">all the indoctrination is back</a>, in the worst kind of shape and form. So, this kind of window of statelessness — for me — was good in a sense, but also bad, because without police, crime runs rampant. And when the state is gone, law enforcement <em>becomes</em> crime. So, you start doubting the state much more. And when the states tell you a story — like America’s story about “values” — but then ICE comes, and they shoot random people, you remember: “Oh, I&#8217;m not surprised; this is what states do.”</p>



<p><strong><em>You mentioned the Olympics, which calls to mind the recent fracas over Eileen Gu’s decision to compete for China rather than America in this year’s games. I can’t help but ask you what you thought about that situation.</em></strong></p>



<p><strong>Dimitry:</strong> It’s funny because in every Olympic Games, there have been plenty of these kinds of competitors. If you follow the media in the U.S., it will be one person; if you follow the French media, it will be another.</p>



<p>Normal human beings don’t fit into this neat globe where all the territories are marked. She <a href="https://www.cnn.com/2026/02/19/sport/eileen-gu-china-us-controversy-winter-olympics-intl-hnk" target="_blank" rel="nofollow">stated in an interview</a> that it’s difficult for Americans to understand that she is as Chinese as she is American. And for the Chinese, it’s also difficult. And it’s wrong.</p>



<p>You can fly whatever flag, but you cannot change what’s in your heart. It’s simply about how your brain is wired, how you go about different languages, spaces, and circumstances in life. And some people will be totally at home in seven countries, while others will only be at home in one.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Dimitry Kochenov</em></strong><em> is Professor at the Central European University in Vienna and Budapest, and also teaches at LUISS Guido Carli in Rome and Peking University School of Transnational Law in Shenzhen. He has served as a consultant for governments, law firms, and international institutions, including the Maltese Republic, the Kingdom of the Netherlands, and the European Parliament. He is the author of “<a href="https://mitpress.mit.edu/9780262537797/citizenship/" target="_blank" rel="noreferrer noopener">Citizenship</a>.”</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/citizenship-cover-copy.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/citizenship-cover-copy.jpg" />                                        </item>
        			                <item>
                        <title>The Grand Old Illusion of ‘Ethical’ Capitalism</title>
                        <link>https://thereader.mitpress.mit.edu/the-grand-old-illusion-of-ethical-capitalism/</link>
                        <pubDate>Thu, 26 Mar 2026 09:53:00 +0000</pubDate>
                        <dc:creator>Brad Swanson</dc:creator>
                        		<category><![CDATA[Big Oil]]></category>
		<category><![CDATA[Capitalism]]></category>
		<category><![CDATA[ESG]]></category>
		<category><![CDATA[investing]]></category>
		<category><![CDATA[Trump]]></category>
		<category><![CDATA[Economics]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19429</guid>
                        <description><![CDATA[<p>Donald Trump’s Big Oil bonanza is an environmental disaster — but the industry’s reaction exposes a larger truth about capitalism itself.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Donald Trump’s Big Oil bonanza is an environmental disaster — but the industry’s reaction exposes a larger truth about capitalism itself.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/rabbit-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock
</figcaption>
</figure>

<p class="has-drop-cap">Donald Trump has long called global warming a <a href="https://democrats.org/news/donald-the-denier-donald-trump-has-repeatedly-called-climate-change-a-hoax/" target="_blank" rel="nofollow">hoax</a>, but his sweeping anti-climate agenda has stunned even many of his supporters. Since returning to the White House, he’s <a href="https://www.theguardian.com/us-news/2026/jan/27/trump-withdraws-paris-climate-agreement" target="_blank" rel="nofollow">withdrawn</a> the U.S. from the Paris Agreement, <a href="https://www.nytimes.com/2026/02/12/climate/trump-epa-greenhouse-gases-climate-change.html" target="_blank" rel="nofollow">rolled back</a> critical greenhouse gas regulations, and opened up <a href="https://www.taxpayer.net/energy-natural-resources/federal-oil-gas-leasing-outlook-2026/" target="_blank" rel="nofollow">millions of acres</a> of previously protected public land for oil and gas drilling.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262051590/profit-vs-progress/" target="_blank"><img loading="lazy" decoding="async" width="320" height="473" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/ESG-jkt.jpg" alt="" class="wp-image-19430"/></a><figcaption class="wp-element-caption">Brad Swanson is the author of the book “<a href="https://mitpress.mit.edu/9780262051590/profit-vs-progress/" target="_blank">Profit vs. Progress.</a>”</figcaption></figure>
</div>


<p>In response, big oil and gas companies have abandoned, without the slightest resistance, the showy public commitments they had previously made to climate transition. For example, <a href="https://www.euronews.com/business/2025/02/26/bp-scraps-renewables-target-returns-to-oil-and-gas-in-strategy-reset?utm_source=" target="_blank" rel="nofollow">BP</a> has slashed green energy expenditures by 70 percent, <a href="https://www.france24.com/en/live-news/20250205-oil-giants-totalenergies-equinor-reduce-low-carbon-investments?utm_source=" target="_blank" rel="nofollow">Equinor</a> has cut back its renewable capacity targets by almost 40 percent, and <a href="https://www.enerdata.net/publications/daily-energy-news/chevron-eyes-us18bn-us21bn-capex-2026-growth-plans.html" target="_blank" rel="nofollow">Chevron</a> has reduced its carbon-reduction capital expenditures to about 5 percent of its total capital expenditures. None of the world’s 12 largest oil and gas companies plan to decrease fossil fuel production, and all of them project that fossil fuels will continue to overwhelm other sources of energy for the foreseeable future, according to a recent <a href="https://reclaimfinance.org/site/en/assessment-of-oil-and-gas-companies-climate-strategy/" target="_blank" rel="nofollow">evaluation</a>.</p>



<p>Far from a change of heart, this is simply Big Oil returning to form. The petroleum industry has never been serious about curbing emissions, <a href="https://www.un.org/en/climatechange/science/causes-effects-climate-change" target="_blank" rel="nofollow">90 percent</a> of which globally come from fossil fuels. Indeed, after decades of investment, renewables still account for a minuscule amount — about 0.13 percent — of total energy produced by the world’s largest 250 oil and gas companies, according to a recent <a href="https://www.nature.com/articles/s41893-025-01647-0" target="_blank" rel="nofollow">research paper.</a> “I think the article resolves the debate on whether the fossil fuel industry is honestly engaging with the climate crisis or not,” <a href="https://www.anthropocenemagazine.org/2025/10/fossil-fuel-companies-contributions-to-the-green-transition-are-largely-hot-air/" target="_blank" rel="nofollow">said the paper’s lead researcher</a>. “Their interest ends with their profits.”</p>



<p>Some oil companies, such as ExxonMobil, continue to promise to reduce emissions to net zero by 2050. This appears to align them with the consensus of climate science that this is necessary globally to limit warming to 1.5° C (2.7° F) above pre-industrial levels. However, Exxon is typical in designating a narrow target of greenhouse gases to eliminate: only those from its own operations, mainly pumping and refining oil and gas, and from buying electricity generated by fossil fuels. This conveniently ignores greenhouse gases from the consumption of its gasoline and other petroleum products, as well as those of its suppliers — which <a href="https://reclaimfinance.org/site/wp-content/uploads/2024/05/202405_Assessment-of-ExxonMobils-climate-strategy.pdf" target="_blank" rel="nofollow">exceed</a> by <em>four times</em> the total covered by Exxon’s commitment.</p>



<p>Exxon wants us to believe that running its pump jacks and refineries on solar and wind power puts it on the side of the climate transition. It’s cynical buffoonery. But it’s also a sign that America’s leaders and electorate have been willfully blind. We should have realized that companies, like Exxon, that <a href="https://exxonknew.org/" target="_blank" rel="nofollow">knowingly act in pursuit</a> of catastrophe cannot be trusted to stop of their own accord. As Shakespeare might have said, “The fault, dear Brutus, is not in Big Oil but in ourselves.” </p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">The past is prologue. Ever since the advent of industrial capitalism in America in the early 1800s, corporations have consistently served one master, shareholders, delivering them profits by open competition in free markets. From the start, elites have insisted that corporations must regard financial and social objectives as mutually exclusive, even as a single-minded quest for profitability has pushed the system to its breaking point.</p>



<p>We saw the injustice of this belief in the late 19<sup>th</sup> century, when “robber barons” — who had clawed their way to the top of an unregulated, chaotic economy — justified poverty wages and harsh working conditions by co-opting Charles Darwin’s new theory of evolution, popularized as “survival of the fittest.” Railroad magnate <a href="https://www.jstor.org/stable/3112464" target="_blank" rel="nofollow">Charles Elliott Perkins</a> — who embodied Social Darwinism by rising from office boy to president of one of the nation’s largest railroads — declared his creed: “That a man is entitled to a living wage is absurd….[If] you take from the strong to give to the weak, you encourage weakness; therefore, let men reap what they and their progenitors sow.”</p>



<p>Early capitalism was marred by periodic, destructive economic downturns. But over time, government acquired fiscal and monetary tools to smooth the boom-and-bust cycles and soften the hard edges of fierce profit-seeking through welfare programs, especially during the Progressive Era (1890s–1920) and the New Deal (1933–1938).</p>



<figure class="wp-block-pullquote"><blockquote><p>America’s leaders and electorate have been willfully blind.</p></blockquote></figure>



<p>However, the bedrock of the corporate mission stayed solid even as the government built new structures on top of it. During the New Deal, for example, leading industrialists joined the <a href="https://cr.middlebury.edu/amlit_civ/allen/field_house/2012%20backup/scholarship/liberty%20league.pdf" target="_blank" rel="nofollow">American Liberty League</a> to oppose innovations like Social Security. A League leader, echoing his counterpart six decades earlier, proclaimed, “You can’t recover prosperity by seizing the accumulation of the thrifty and distributing it to the thriftless and unlucky.”</p>



<p>The permanent establishment of a taxpayer-funded social safety net in the postwar period only reaffirmed corporations’ unwavering fealty to shareholder value. The president of the mighty Dow Chemical Company, <a href="https://fraser.stlouisfed.org/title/commercial-financial-chronicle-1339/july-18-1957-556289?page=18" target="_blank" rel="nofollow">Leland Doan</a>, wrote in 1957, “Any activity labeled ‘social responsibility’ must be judged in terms of whether it is somehow beneficial to the immediate or long-range welfare of the business. . . . I hope we never kid ourselves that we are operating for the public interest per se.”</p>



<p>The corporate community resisted even when the tide of public opinion turned against the malign Jim Crow segregation system in the 1950s and ’60s. When U.S. Steel was accused of workplace discrimination in 1963, prominent academic <a href="https://www.nytimes.com/1963/11/17/archives/do-corporations-have-a-social-duty-a-corollary-of-the-civilrights.html" target="_blank" rel="nofollow">Andrew Hacker</a> struck back forcefully: “If corporations ought to be doing things they are not now doing — such as hiring Negroes on an equal basis with whites — then it is up to government to tell them so. The only responsibility of corporations is to make profits, thus contributing to a prosperous economic system.”</p>



<p>Predictably, that same decade, the corporate establishment dismissed the emergence of the environmental movement. In 1962, when Rachel Carson’s “Silent Spring” shocked the nation by exposing the harm to human and animal life posed by the unrestricted use of pesticides, a <a href="https://www.pbs.org/wgbh/pages/frontline/shows/nature/disrupt/sspring.html" target="_blank" rel="nofollow">chemical industry</a> spokesman responded: “If man were to follow the teachings of Miss Carson, we would return to the Dark Ages, and the insects and diseases and vermin would once again inherit the earth.”</p>



<p><a href="https://www.nytimes.com/1970/09/13/archives/a-friedman-doctrine-the-social-responsibility-of-business-is-to.html" target="_blank" rel="nofollow">Milton Friedman</a>, Nobel Prize–winning economist and chief economic adviser to Ronald Reagan, famously summed up the unchanging corporate consensus in words still widely quoted today: “There is one and only one social responsibility of business — to use its resources and engage in activities designed to increase its profits.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">For the most part, investors have held their noses and counted their gains. But starting almost a century ago, in 1928 — shortly after the invention of mutual funds opened up the stock market to the middle class — “ethical” funds, as they came to be known, entered the arena. They were marketed to individuals and families who wanted their portfolios to reflect their values, and to asset managers who wanted their clients to consider them good citizens.</p>



<p>For a long time, these socially responsible funds were a negligible part of the industry because they typically <a href="https://www.researchgate.net/publication/240311517_Socially_Responsible_Mutual_Funds" target="_blank" rel="nofollow">underperformed the market</a>. These funds used a strategy called negative screening — excluding certain “sin” industries, such as cigarettes, liquor, and weapons. Unfortunately, negative screening typically yields lower returns (sin often pays in the stock market!) and greater price volatility, due to limited diversification. In addition, there is <a href="https://www.sciencedirect.com/science/article/pii/S0304405X24001958?via%3Dihub" target="_blank" rel="nofollow">no reason</a> to believe that negative screening has any discernible effect on stock prices, so it has no power to compel corporations to reform.</p>



<p>The answer to this quandary finally came in the early 2000s, in the form of a new stock-picking tool called Environmental, Social and Governance, or “ESG” for short. The seductive promise of ESG is “doing well by doing good” — or getting rich by investing in companies that make the world better. On the back of this dream, capital invested in accordance with ESG principles has grown monumentally, to as much as <a href="https://www.gsi-alliance.org/wp-content/uploads/2023/12/GSIA-Report-2022.pdf." target="_blank" rel="nofollow">$30 trillion</a>, about one-quarter of the global total of assets under management.</p>



<p><a href="https://documents1.worldbank.org/curated/en/280911488968799581/pdf/113237-WP-WhoCaresWins-2004.pdf" target="_blank" rel="nofollow">ESG</a> claims that adroitly managing environmental and social risks will improve profitability and, therefore, stock prices. But ESG only counts risks that are financially material, ignoring all social or environmental harm for which a company faces no financial penalty. As you might expect, this often yields perverse results. For example, cigarette companies kill their customers — you can’t get more anti-social than that! — but smoking is legal, and Big Tobacco rarely faces liability for cancer from smoking. That is why tobacco companies are sometimes awarded good <a href="https://www.morningstar.com/stocks/xnys/pm/sustainability" target="_blank" rel="nofollow">ESG scores</a> and even <a href="https://vcm.com/assets/victoryMF/allholdings-pdf/Victory_World_Growth_Fund_Holdings.pdf" target="_blank" rel="nofollow">appear</a> in some ESG stock funds. Likewise, fossil fuel companies, which have historically made high returns and avoided significant regulatory penalties, appear in <a href="https://www.energymonitor.ai/finance/sustainable-finance/why-esg-funds-are-full-of-fossil-fuels-but-thats-okay/" target="_blank" rel="nofollow">80 percent of ESG funds</a>.</p>



<figure class="wp-block-pullquote"><blockquote><p>The damage that companies inflict on society without literally paying for it entirely escapes ESG’s radar.</p></blockquote></figure>



<p>Whether it be alcoholism, gambling addiction, gun deaths, climate change, or other iniquities, the damage that companies inflict on society without literally paying for it — or the negative externalities, as they’re called in economics — entirely escapes ESG’s radar.</p>



<p>Worse, the key assumption of ESG — that adept social risk management translates into higher profitability — is fundamentally unprovable. Many studies have attempted to show a strong positive correlation between specific ESG policies, like emissions reductions or heightened employee benefits, and financial metrics, like cost of debt or return on assets. But, as I explain in my forthcoming <a href="https://mitpress.mit.edu/9780262051590/profit-vs-progress/" target="_blank">book on socially responsible investment</a>, very few succeed. In the end, the research only allows you to draw one conclusion with confidence: that it is simply not possible to precisely define ESG practices at a granular level, measure their direct effect on financial performance, and compare these results validly across different companies.</p>



<p>But that does not stop ESG rating agencies from trying. ESG ratings have grown into a <a href="https://www.ey.com/en_gl/insights/financial-services/emeia/how-esg-data-markets-have-evolved-for-financial-services" target="_blank" rel="nofollow">big business</a>, since fund managers pay dearly for them to guide their stock selection. The rating agency reports are typically long, detailed, and quantitative — but completely unreliable. These reports may <em>look</em> sober and professional, like credit rating reports from companies such as S&amp;P Global or Moody’s. But credit rating agencies are analyzing real financial values to assess a tangible corporate quality: its ability to repay its debts. The numbers are verifiable and have a proven relevance to the projected outcome. That is why credit ratings have a <a href="https://www.reuters.com/article/us-climate-ratings-analysis-idUSKBN19H0DM/" target="_blank" rel="nofollow">90 percent correlation</a>; S&amp;P and Moody’s seldom disagree substantially on a company’s rating.</p>



<p>ESG ratings, by contrast, are all over the map, with a correlation of only <a href="https://qsinvestorsproduction.blob.core.windows.net/media/Default/PDF/The%20Devil%20is%20in%20the%20Details_Divergence%20in%20ESG%20Data.pdf." target="_blank" rel="nofollow">40 percent</a>. <a href="https://doi.org/10.1093/rof/rfac033" target="_blank" rel="nofollow">Analysts</a> point to three key factors: the rating agencies choose different terms to measure; they measure them with incompatible methods; and they use contradictory methodologies to combine these idiosyncratic measurements into final ratings. These discrepancies build on each other to produce wildly variant final scores. A company denigrated as a dog in ESG terms by one rating agency may be lauded as a star by another.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">If ESG is just an illusion, and negative screening a disappointment, how should investors direct their capital to make corporations more socially responsible? The answer is, they shouldn’t bother.  </p>



<p>In the game of capitalism, the role of corporations is to make as much money as they can, while playing by the rules. The role of the state, as we learned in the Progressive Era and the New Deal, is to revise the rules periodically to ensure fair play and a socially positive outcome — without hobbling the players. We do want fierce competition, but we don’t want to destroy the playing field in the process.</p>



<p>Today, <a href="https://cdn.pficdn.com/cms/pgim-fixed-income/sites/default/files/2021-04/The%20Evolution%20of%20U.S.%20Corporate%20Profits_2.pdf" target="_blank" rel="nofollow">corporate profits</a> are at their highest proportion of GDP in 50 years, while wages are at their lowest. Overall, <a href="https://eml.berkeley.edu/~saez/saez-UStopincomes-2022.pdf" target="_blank" rel="nofollow">income inequality</a> has <em>never </em>been greater, not even in the Gilded Age, the period immediately preceding the Progressive Era, when many toiled in Dickensian poverty while a few, like the Vanderbilt dynasty, flaunted their lavish lifestyles. Now, like then, the people, with justification, are <a href="https://www.amacad.org/publication/daedalus/fifty-years-declining-confidence-increasing-polarization-trust-american-institutions" target="_blank" rel="nofollow">losing faith in the system</a>.</p>



<figure class="wp-block-pullquote"><blockquote><p>It is folly to ask business to do the work of government.</p></blockquote></figure>



<p>Like our Progressive forebears, we will have to revamp capitalism in order to rescue it. Key objectives must include rebuilding organized labor, since what <a href="https://www.huffpost.com/entry/union-membership-middle-class-income_n_3948543" target="_blank" rel="nofollow">benefits unions</a> benefits the middle class. We’ll also need to break up de facto <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4724974" target="_blank" rel="nofollow">corporate cartels</a> that stifle competition, squeeze wages, and lower productivity. To counter the existential threat of climate change, we need a <a href="https://www.bruegel.org/analysis/europes-emissions-trading-system-ally-not-enemy-industrial-competitiveness" target="_blank" rel="nofollow">cap-and-trade</a> system that makes industry a partner in carbon reduction, not an opponent, and can serve as a model for other public-private partnerships.</p>



<p>It is folly to ask business to do the work of government. The sooner we stop expecting companies like Exxon to be voluntary agents of social change and acknowledge that they are amoral profit machines, the sooner we can stop the flow of hypocrisy and greenwashing and start working on resolving the social and environmental crises that blight the lives of billions. The path to greater corporate social responsibility leads through the voting booth and the statehouse, not through Wall Street and the C-suite. </p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Brad Swanson</em></strong><em> manages socially responsible investments and is an adjunct faculty member in the Costello College of Business at George Mason University. Before entering the finance industry, he was a Foreign Service Officer in the U.S. Department of State, with tours of duty in several African countries. He is the author of the book “<a href="https://mitpress.mit.edu/9780262051590/profit-vs-progress/" target="_blank">Profit vs. Progress: Why Socially Responsible Investment Doesn’t Work and How To Fix It</a>.”</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/rabbit.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/rabbit.jpg" />                                        </item>
        			                <item>
                        <title>How ‘Tiny Shortcuts’ Are Poisoning Science</title>
                        <link>https://thereader.mitpress.mit.edu/how-tiny-shortcuts-are-poisoning-science/</link>
                        <pubDate>Mon, 23 Mar 2026 09:55:00 +0000</pubDate>
                        <dc:creator>Thomas Plümper and Eric Neumayer</dc:creator>
                        		<category><![CDATA[Data]]></category>
		<category><![CDATA[Experiments]]></category>
		<category><![CDATA[Fraud]]></category>
		<category><![CDATA[Trust]]></category>
		<category><![CDATA[Science & Tech]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19013</guid>
                        <description><![CDATA[<p>Seemingly harmless data tweaks are undermining the integrity of the entire field. We must define the problem to prevent it.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Seemingly harmless data tweaks are undermining the integrity of the entire field. We must define the problem to prevent it.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/credibility-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock
</figcaption>
</figure>

<p class="has-drop-cap">In 1999, <em>Time</em> magazine featured a famous photo of Albert Einstein on its cover — looking old and tired, his forehead covered in wrinkles, his hair long and gray. The photograph was taken in 1947, during a portrait session with Philippe Halsman in which Einstein expressed remorse for his inadvertent role in the Manhattan Project, the initiative that ultimately culminated in the devastating bombings of Hiroshima and Nagasaki. It would go on to become Halsman’s most iconic image.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262051279/the-credibility-crisis-in-science/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/science-jkt.jpg" alt="" class="wp-image-19014"/></a><figcaption class="wp-element-caption">This article is adapted from Thomas Plümper and Eric Neumayer&#8217;s book “<a href="https://mitpress.mit.edu/9780262051279/the-credibility-crisis-in-science/" target="_blank">The Credibility Crisis in Science</a>.” </figcaption></figure>
</div>


<p><em>Time</em> magazine rarely places a picture of a celebrity from a historical period on its cover. But in 1999, the editors had good reasons to ignore this rule: The magazine had designated Einstein as the “person of the century,” a distinction that placed him above notable figures like Mahatma Gandhi and Franklin D. Roosevelt, who were the runners-up. It was a great honor for Einstein and for the profession he represented. And Einstein was not the only scientist on <em>Time</em>’s list of the 100 most influential people of the 20th century: The list featured 19 scientists, making them the third most prominently represented professional group, just a shade behind politicians and industrialists. The 20th century was the century of man-made political disasters. But the 20th century was also the century of science, and Einstein was its figurehead.</p>



<p>Those days seem to be over. And they may never come back.</p>



<p>In the 21<sup>st</sup> century, the role and relevance of scientists have changed. Science is no longer triumphant: It is in the midst of a severe crisis. Public trust in scientific results and findings has dwindled, and science does not know how to regain credibility. This crisis has many facets. More than anything else, however, it is a credibility crisis. The public no longer believes that scientists merely make honest mistakes on the long and winding road to truth. Instead, scientists are increasingly seen as partial, ideological agents, activists in an armchair, or, worse still, simply fraudsters who fabricate or manipulate data and tweak the specifications of their empirical models to get their desired results.</p>



<p>The credibility crisis of science is not about scientific progress invalidating previously held scientific beliefs, which is intrinsic to the very nature of scientific revolutions. Rather, the crisis has been caused by scientists who deliberately publish overconfident, misleading, and often simply false empirical results based on research designs or model specifications they have intentionally specified to give the desired results. We call this practice “tweaking.” In extreme cases, published results rely on manipulated or outright fabricated data. Whether tweaked, manipulated, or fabricated, the results often cannot be replicated — not even if replication analysts use identical research designs.</p>



<p>By itself, failure to replicate does not necessarily indicate, and certainly does not prove, scientific fraud. Empirical results can vary for many reasons. However, replication analyses usually show that replicated effect sizes are, on average, systematically smaller and often statistically insignificant. If 90 percent of replications <a href="https://www.science.org/doi/10.1126/science.aac4716" target="_blank" rel="nofollow">deviate</a> from the original article in one direction that is less favorable to what the authors wanted to demonstrate, then these deviations are not innocent random errors or acts of nature. If the deviations were random, they would cancel each other out, and their mean would be close to zero.</p>
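<p>The reasoning can be illustrated with a toy simulation (all numbers below are hypothetical, not drawn from any replication study): purely random replication error averages out to roughly zero, while systematically inflated originals leave a one-sided gap.</p>

```python
import random
import statistics

rng = random.Random(0)
N = 1_000  # hypothetical number of replication studies

# Gap = replicated effect minus original effect.
honest = [rng.gauss(0.0, 0.1) for _ in range(N)]     # random error only
tweaked = [rng.gauss(-0.05, 0.1) for _ in range(N)]  # originals inflated on average

for label, gaps in (("random error", honest), ("tweaked originals", tweaked)):
    share_smaller = sum(g < 0 for g in gaps) / N
    print(f"{label}: mean gap {statistics.mean(gaps):+.3f}, "
          f"{share_smaller:.0%} of replications come out smaller")
```

<p>Under random error, roughly half the replications come out smaller and the mean gap sits near zero; under the tweaked scenario, both the mean gap and the share of shrinking effects move in one direction, which is the pattern the replication analyses cited above report.</p>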



<figure class="wp-block-pullquote"><blockquote><p>Scientists are increasingly seen as partial, ideological agents, activists in an armchair, or, worse still, simply fraudsters.</p></blockquote></figure>



<p>Instead, these deviations indicate that many published results were likely tweaked, manipulated, or fabricated.</p>



<p>Tweaking is potentially more damaging to science in the long run than data manipulation and fabrication. That might be hard to believe, since tweaked empirical results are likely to have smaller effects on the fabric of science than cases of data fabrication and manipulation. But the cumulative effect of tweaking can still be larger than that of data fabrication and manipulation because these strategies are rare, whereas tweaking <a href="https://www.theguardian.com/commentisfree/2023/aug/09/scientific-misconduct-retraction-watch" target="_blank" rel="nofollow">is common</a>.</p>



<p>Ever since the online platform <a href="https://retractionwatch.com/" target="_blank" rel="nofollow">Retraction Watch</a> began monitoring and reporting retractions in 2010, the number of retracted articles per year has steadily increased. Some of this is due to “bulk retractions” of thousands of articles published by so-called paper mills, where authors pay to have fake articles published. We are not interested in these retracted paper-mill publications but in variants of data fraud, a subset of retractions that have also been steadily increasing. Most notably, there have been several high-profile retractions involving work by <a href="https://www.science.org/content/article/after-honesty-researcher-s-retractions-colleagues-expand-scrutiny-her-work" target="_blank" rel="nofollow">Francesca Gino of Harvard University</a> and<a href="https://retractionwatch.com/2023/08/31/stanford-president-retracts-two-science-papers-following-investigation/" target="_blank" rel="nofollow"> Marc Tessier-Lavigne of Stanford University</a>, who was ultimately exonerated of personal misconduct but <a href="https://boardoftrustees.stanford.edu/sites/g/files/sbiybj31576/files/media/file/scientific-panel-final-report.pdf" target="_blank" rel="nofollow">held accountable for inadequate laboratory oversight and failure to correct the scientific record</a>. And these are just the most recent cases — the ones that stick in the public mind for a while before attention shifts to others.</p>



<p>All of this is to say that scientists no longer sit at God’s table, so to speak. They have become mere mortals in the midst of a massive crisis of trust. Could we go so far as to say that today’s scientific process is broken? Perhaps. But the more correct answer is: <em>It depends</em>.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">One of the things it depends most on, of course, is how we define fraud itself. <a href="https://mitpress.mit.edu/author/lee-mcintyre-7837/" target="_blank">Lee McIntyre</a>, one of the foremost philosophers of science, defines scientific fraud as “the intentional fabrication and falsification of the scientific record.” He distinguishes between fraud, on the one hand, and honest error, on the other, plus a third category in between, which he labels “murky,” where scientists’ motives are not “pure.”</p>



<p>What McIntyre calls the murky category, we call “tweaks.” Tweaks are the intentional manipulation of empirical results through changes in and choices of research design, model specification, and/or estimation procedures. McIntyre restricts fraud to data fabrication and manipulation; the “murky” third category does not, in his view, qualify as fraud. Here is why:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>“What about all of those less-than-above-board research practices p-hacking and cherry-picking data . . . ? Why aren’t those considered fraud the minute they are done intentionally? But the relevant question to making a determination of fraud is not just whether those actions are done intentionally, it is whether they also involve fabrication or falsification. . . . The reason that p-hacking isn’t normally considered fraud isn’t that the person who did it didn’t mean to, it’s that . . . p-hacking is not quite up to the level of falsifying or fabricating data.”</p>
</blockquote>



<p>In our view, McIntyre’s definition of data fraud is incomplete and imprecise. It conceals that the fabrication and manipulation of data — and the manipulation of empirical results through tweaking — serve the same purpose: to promote the researcher’s interests.</p>



<p>Consider the case of Diederik Stapel, a fraudster with at least 58 retracted articles under his belt, ranking eighth on the Retraction Watch <a href="https://retractionwatch.com/the-retraction-watch-leaderboard/" target="_blank" rel="nofollow">leaderboard</a>. Stapel came to fame as a fraudster; he has contributed massively to the existential crisis of social psychology. Joel Achenbach, in an <a href="https://www.washingtonpost.com/blogs/achenblog/post/diederik-stapel-the-lying-dutchman/2011/11/01/gIQA86XOdM_blog.html" target="_blank" rel="nofollow">article</a> for <em>The Washington Post</em>, called him the “Lying Dutchman.” A fraudster he is, but he is surprisingly willing to talk and write about his fraudulent career. He even wrote a book-length manuscript about his life — an autobiography titled “<a href="https://errorstatistics.com/wp-content/uploads/2014/12/fakingscience-20141214.pdf" target="_blank" rel="nofollow">Faking Science: A True Story of Academic Fraud</a>.” Whenever we need insights from a fraudster’s perspective, Stapel is a good, perhaps the best, source.</p>



<p>Stapel kick-started his fraudulent career, as he himself recounts, by becoming “impatient, overambitious, reckless.” Data analyses do not always align with researchers’ expectations and interests. And so Stapel took the truth into his own hands and decided to take “one, tiny little shortcut.” He tortured the data to bring the results into line with the arguments in his articles. In his autobiography, Stapel explains how he drifted further and further away from the path of virtue: “Everything had to be neat and orderly. No mess. I opened the computer file with the data that I had entered and changed a . . .  2 into a 4; then . . .  I changed a 3 into a 5. I . . .  made a few mouse clicks to tell the computer to run the statistical analyses. When I saw the results, the world had become logical again.”</p>



<p>In the early stages of his fraudulent career, he eliminated cases he classified as “deviant” — cases that prevented the results from turning out as he expected and wanted. These, in his view, were common practices among social psychologists. “Tiny little shortcuts,” he calls them. Tweaks were Stapel’s gateway drug. Soon after he <a href="https://www.deutschlandfunk.de/ergebnisse-um-jeden-preis-wenn-forscher-betruegen-100.html" target="_blank" rel="nofollow">started</a> to tweak empirical results, he resorted to data fabrication and outright data manipulation. But, in his book, Stapel draws a line in the sand: While he accepts data manipulation and fabrication as fraud, his “tiny little shortcuts” are common practice, and thus not fraud, at least not <em>really.</em> In other words, if everyone cheats, is it still cheating?</p>



<p>The cold reality is that tweaks are not just “tiny little shortcuts”; they are tiny little shortcuts with substantively large consequences. They change the results of empirical analyses, often making manuscripts more interesting. Manuscripts that become more interesting change reviewers’ attitudes toward them, allowing tweakers to publish in more visible journals and with better publishers. When tweakers publish more interesting results in more visible places, they get additional attention for their work, receive better job offers and promotions, and rise to ever greater power and influence.</p>



<p>Make no mistake: Tweaking is not about changing the course of science. Nor is it, at least not primarily, about the misuse of public research funds (although it is a scandal that hard-working taxpayers fund the research of tweakers). Rather, tweaking is about scientists pursuing their own interests in a competitive, vulnerable system based on trust and on freedom from control by institutions that enforce rules.</p>



<p>Is the intentional manipulation of statistical quantities of interest always fraudulent? As with any categorization, there are gray areas.</p>



<figure class="wp-block-pullquote has-text-align-center"><blockquote><p>If everyone cheats, is it still cheating?</p></blockquote></figure>



<p>One of the most common gray areas involves the experimental researcher who, after a first round of experiments, fails to achieve a statistically significant treatment effect. So, they organize a second round of experiments with the very rational expectation that the sheer number of observations will eventually push the p-value below the threshold that separates publishable from unpublishable results. This research practice is common in the life sciences because experiments are costly and may cause unnecessary harm to participants. It may therefore make sense to start with a small sample and only add participants when the results are “not yet significant.”</p>



<p>The problem with ever-increasing sample sizes is that, as the number of observations approaches infinity, the standard error (i.e., the uncertainty of the estimate) approaches zero. Thus, if your model indicates any effect at all, then as you collect more and more data, the statistical test will inevitably register the effect as significant — no matter how small the effect may be.</p>
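<p>A minimal numeric sketch makes this concrete (the effect size and standard deviation below are hypothetical): for a one-sample z test, the test statistic grows in proportion to the square root of the sample size, so even a negligible true effect eventually clears the conventional 1.96 cutoff.</p>

```python
import math

def z_stat(effect: float, sigma: float, n: int) -> float:
    """z statistic for a one-sample mean test: effect / (sigma / sqrt(n))."""
    return effect / (sigma / math.sqrt(n))

effect, sigma = 0.02, 1.0  # a substantively negligible true effect
for n in (100, 1_000, 10_000, 100_000):
    z = z_stat(effect, sigma, n)
    verdict = "significant" if z > 1.96 else "not significant"
    print(f"n={n:>7,}  SE={sigma / math.sqrt(n):.4f}  z={z:.2f}  {verdict}")
```

<p>An effect of 0.02 standard deviations is trivially small, yet once the sample passes roughly 10,000 observations the test flags it as significant; statistical significance alone says nothing about substantive importance.</p>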



<div class="wp-block-image ma-related-post ma-related-post-normal"><figure class="alignright size-pinned is-resized"><a href="https://thereader.mitpress.mit.edu/perils-of-publication-and-citation-bias/" title="Unintended Consequences: The Perils of Publication and Citation Bias"><span class="ma-related-post-top"><span class="ma-related-post-heading">Related</span><span class="ma-related-post-title">Unintended Consequences: The Perils of Publication and Citation Bias</span></span><img decoding="async" loading="lazy" class="ma-related-post-img" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2020/09/lede-408x267.jpg" alt="" width="370" height="242"></a></figure></div>



<p>Scientists may be reluctant to call the above practice, or p-hacking, “fraudulent.” And indeed, this practice is <em>not</em> fraudulent if a p-hacked study clearly states that the results are insignificant given the original small number of observations and only become significant in a larger sample. But this holds for all adjustments: A change in model specification or research design is not fraudulent if the change and its effects on results are clearly discussed and not suppressed. What makes tweaks fraudulent is not the tweak itself, but the selective reporting of results for the quantities of interest. For example, a gradual increase in sample size is fraudulent if the authors suppress the results of the smaller sample.</p>
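<p>The cost of “collect until significant” can be checked by simulation. This stdlib-only sketch (batch size, number of peeks, and trial count are all hypothetical) draws data under a true null effect, tests after each added batch, and stops at the first |z| > 1.96; peeking repeatedly pushes the false-positive rate well above the nominal 5 percent.</p>

```python
import math
import random

def false_positive_rate(trials=2000, batch=20, max_batches=10, seed=1):
    """Simulate a true null effect; test after each added batch and stop
    at the first |z| > 1.96. Optional stopping inflates the error rate."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        data = []
        for _ in range(max_batches):
            data.extend(rng.gauss(0, 1) for _ in range(batch))
            n = len(data)
            mean = sum(data) / n
            sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
            if abs(mean / (sd / math.sqrt(n))) > 1.96:
                hits += 1
                break
    return hits / trials

print(f"false-positive rate with repeated peeking: {false_positive_rate():.1%}")
```

<p>Reporting only the final, significant peek while suppressing the earlier insignificant ones is exactly the selective reporting described above.</p>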



<p>Now, are all researchers actually <em>aware </em>of this problem? And do they all collect more and more data until the desired significance appears? Perhaps not. But as we have said, when it comes to tweaking, it is usually impossible to prove intention. At the same time, the existence of a gray area of manipulations that border on the fraudulent does not make everything defensible: intentionally dropping a control variable from the list of regressors, adjusting the operationalization of a key variable, or dropping cases from the sample to produce desired results still constitutes scientific fraud.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Rules have the greatest effect when they are clear, violations are easy to detect, and enforcement is simple and not prohibitively expensive. And here lies the problem with scientific fraud: The more broadly we define scientific fraud, the larger the share of fraudulent analyses that are extremely difficult to detect. The more broadly we define scientific fraud, the more costly enforcement becomes. However, if we define it narrowly and exclude tweaks, science will not be able to appropriately address, let alone overcome, its credibility crisis.</p>



<p>Science is ill-advised to narrow the definition of scientific fraud just to make detection easier and rule enforcement less costly. The negative consequences of scientific fraud are not limited to data manipulation and fabrication; tweaks, too, have the same distorting effect on competition for academic merit and research funding, and the same devastating effect on public confidence in scientific results and on trust between scientists.</p>



<p>Both scientists and the public lose confidence in science when there is a non-trivial chance that scientists manipulated empirical results to support the arguments, theories, hypotheses, and stories they wish to corroborate, or to cast doubt on the arguments, theories, hypotheses, and stories that contradict the worldview they believe in.</p>



<p>Science has lost some of its standing with the public. While skepticism about scientific findings can be healthy and is an inherent part of the scientific process, general disbelief and distrust pose significant challenges. Scientists have a vested interest in regaining some of that lost trust. This is easier said than done. But much would be gained if scientists were honest about the uncertainties associated with scientific results — honest with other scientists in scientific publications and honest in public statements. Scientists must learn to distinguish between scientific results and their personal opinions, promote full transparency in scientific research — not hide potential conflicts of interest — and find ways to improve communication with the public to rebuild trust.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Thomas Plümper</em></strong><em> is Professor of Quantitative Social Research at the Vienna University of Economics and Business and Head of the Department of Socioeconomics. </em><strong><em>Eric Neumayer</em></strong><em> is a Professor at the London School of Economics and Political Science (LSE) and its Deputy President and Vice Chancellor. Together they have coauthored several books, including “<a href="https://www.cambridge.org/core/books/robustness-tests-for-quantitative-research/FEDDFA23613B60B7AA3CCA841A892008" target="_blank" rel="nofollow">Robustness Tests for Quantitative Research</a>” and “<a href="https://mitpress.mit.edu/9780262051279/the-credibility-crisis-in-science/" target="_blank">The Credibility Crisis in Science</a>,” from which this article is adapted.</em></p>



<p><em><strong>Correction</strong>: An earlier version of this article mischaracterized the nature of misconduct in the Marc Tessier-Lavigne case. The text has been updated to note that he was exonerated of personal misconduct and scientific fraud.</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/credibility.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/credibility.jpg" />                                        </item>
        			                <item>
                        <title>What America Could Learn From Asia’s Robot Revolution</title>
                        <link>https://thereader.mitpress.mit.edu/what-america-could-learn-from-asias-robot-revolution/</link>
                        <pubDate>Thu, 19 Mar 2026 09:55:00 +0000</pubDate>
                        <dc:creator>Candi K. Cann</dc:creator>
                        		<category><![CDATA[AI]]></category>
		<category><![CDATA[Japan]]></category>
		<category><![CDATA[Korea]]></category>
		<category><![CDATA[Robots]]></category>
		<category><![CDATA[Science & Tech]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18919</guid>
                        <description><![CDATA[<p>In Korea and Japan, humanoid machines aren’t rivals but partners, assisting with elder care, creating jobs for people with disabilities, and even leading religious rituals.</p>
]]></description>
                        <content:encoded><![CDATA[<p>In Korea and Japan, humanoid machines aren’t rivals but partners, assisting with elder care, creating jobs for people with disabilities, and even leading religious rituals.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Robot-cover-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock</figcaption>
</figure>

<p class="has-drop-cap">In 2023, I traveled to South Korea on a Fulbright fellowship, excited to revisit some of my favorite places from 30 years earlier when I was an exchange student at Han Nam University. Han Nam is located in Daejeon City, in the heart of Korea, a university founded by Presbyterians in the 1950s, in the aftermath of the Korean War. Wandering along the campus, I was surprised by how much had changed in 30 years: Where the campus once ended, it now extends, marked by a contemporary-design coffee shop made of shipping containers, with its own coffee bean roaster. In the early 1990s, this was where young student activists would gather to read poetry and discuss Korea’s future before moving to the forests under the cover of the trees to practice protest dance, or <em>talchum</em> (멽띙).</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262051118/augmented/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/augmented-jkt.jpg" alt="" class="wp-image-18920"/></a><figcaption class="wp-element-caption">This article is adapted from Candi K. Cann&#8217;s book, <em>“</em><a href="https://mitpress.mit.edu/9780262051118/augmented/" target="_blank"><em>augmented</em></a><em>.”</em> </figcaption></figure>
</div>


<p>Daejeon City is also the heart of Korea’s science and technology boom, with cutting-edge universities dotting the mountainous landscape. Students today rush by, focused on their smartphones, largely oblivious to the previous generation’s plight. Most are more interested in securing well-paying jobs than in the nebulous, lofty ideals their parents were so concerned with, such as democracy and freedom.</p>



<p>You’ll, of course, find the echoes of this generational shift throughout the West, too. However, what I found especially remarkable in South Korea is that the attitude toward technology — a source of immense skepticism and nihilism in the U.S. — is overwhelmingly positive. Not only are robots everywhere, but they are welcomed as dependable, efficient, and predictable. While living in Korea, I often found it preferable to order at an automated kiosk or hand my dishes to the dish-collecting robot rather than interrupt anyone at work. The robots at Incheon International Airport in Seoul, the museums in Daejeon, and the restaurants in Busan all switch easily between Korean and English, allowing me to navigate the country more easily. Robots and humans seem to work well together, and life runs smoothly.</p>



<p>According to the World Robotics Statistics <a href="https://ifr.org/ifr-press-releases/news/robot-density-nearly-doubled-globally" target="_blank" rel="nofollow">released</a> by the International Federation of Robotics in 2021, South Korea has the world’s highest robot density, with 932 robots per 10,000 employees in manufacturing. This stands in sharp contrast to the rest of the world, which has an average of 126 robots per 10,000 employees in manufacturing. The use of robots in Korea has expanded into the service industry in the past few years.</p>



<p>In a 2013 <a href="http://researchgate.net/publication/263180796_Users'_attitudes_toward_service_robots_in_South_Korea" target="_blank" rel="nofollow">study</a> on attitudes toward robots in South Korea, researchers “found that users’ attitudes toward service robots and the perceived usefulness of the service robots were the main determinants of the users’ intention to use the robots.” Moreover, they also “found that the need to belong had a moderate impact on users’ beliefs concerning service robots.” Though more than a decade old, this study suggests that the cultural desire for belonging may be just as important as robot functionality in one’s attitude toward robots. As Jae-myoung Hong, the senior engineer of LG’s Smart Solutions Division, says, “In our view, artificial intelligence, robots and related solutions are not just new gadgets, but key technologies to support humans. . . . In some cases, robots may perform jobs that are too dangerous or too complicated for regular workers.”</p>



<p>Robots are functional in a practical way, and that alone may be their appeal. However, some scholars suggest that Koreans’ acceptance of robots may be more culturally embedded and might go beyond the appeal of efficiency. Kwang-yeong Shin, professor of sociology at Chung-Ang University in Seoul, <a href="https://www.bbc.com/travel/article/20171205-why-south-korea-is-an-ideal-breeding-ground-for-robots" target="_blank" rel="nofollow">argues</a> that the cultural acceptance of robots in Korea may have more to do with the Korean shamanist attitudes toward nonliving things. “We can think that any kind of non-human being might have a spiritual or super power beyond human capacity, whether it is a natural object or artificial object.”</p>



<figure class="wp-block-pullquote"><blockquote><p>Not only are robots everywhere, but they are often welcomed as dependable, efficient, and predictable.</p></blockquote></figure>



<p>The cultural reverence for inanimate objects and an acknowledgment of their spiritual possibilities are present in both the Japanese and Korean contexts, and spiritual rituals in both countries embrace the spiritual significance of inanimate objects in everyday life. In Japan, there are Shinto shrines devoted to dolls, needles, golf, and every type of quotidian object. There are even Buddhist <a href="https://www.straitstimes.com/asia/east-asia/fido-funeral-in-japan-a-send-off-for-robot-dogs" target="_blank" rel="nofollow">funerals</a> for robotic pets, acknowledging the importance of their companionship to their families. Viewing humans and robots as sharing characteristics would imply that they have similar moral status.</p>



<p>One of the more interesting cross-cultural comparative studies on perceptions of robots <a href="https://dl.acm.org/doi/10.1145/2559636.2559676" target="_blank" rel="nofollow">found</a> vast differences between American and Korean attitudes toward them, and that “Koreans preferred robots as assistants more than both Turkish and US participants, and as pets more than Turkish participants. Unlike Koreans, the majority of whom believed robots should have social roles (92 percent), most US participants saw robots as tools (53 percent).” Multiple studies find the same results: Asians tend to embrace robots and view them as useful, whereas Americans tend to be quite suspicious of them because they may displace humans in the workforce and challenge notions of human exceptionalism.</p>



<p>Whatever the reason, robots and AI are viewed quite positively in South Korea and Japan, and the dangerous apocalyptic narrative that surrounds their utilization in the United States simply doesn’t exist in Asia. In fact, the last time I went to Incheon Airport in Seoul to meet family coming to visit me in Korea, I met a roving information robot, gliding along the terminal in the ticketing area to answer questions people might have about the airport layout: where the restrooms are, what restaurants are available, and even where flights and check-in are located in the terminal. A small toddler went up to the robot with her parents and talked to it. The robot’s large, friendly eyes and voice made it seem more approachable than the information desk often found at most airports. With a touchscreen that lets anyone choose a language to converse in with the robot, the language barrier is not a concern when seeking information. What this means, of course, is that Asia is leading the robot revolution and utilizing AI in some innovative and adaptive ways.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">One of the most innovative uses of robots in the disability space can be found in the DAWN Avatar Café in Tokyo, Japan, which utilizes café robots remotely run by homebound disabled people to perform everyday tasks at the café and interact with people. The DAWN Avatar website states that it established the café for two primary reasons: to provide disabled people with a place to connect with others and to employ homebound disabled people, giving them a sense of purpose through financial independence. The café has multiple robots throughout — from a greeter robot that helps you find your reservation and a table to a robot that sells merchandise, robots that serve food and drinks, a barista robot that makes coffee, and finally, smaller tabletop robots that accompany you in your meal, explaining the restaurant concept and providing companionship as you eat and drink.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="2160" height="950" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/image-01.jpg" alt="" class="wp-image-18932"/><figcaption class="wp-element-caption">DAWN Avatar Café in Tokyo, Japan. Source: OryLab Inc.</figcaption></figure>
</div>


<p>Next to each tabletop robot is a small picture of the person remotely operating it, where they live in Japan (the operators live throughout the country — not merely in Tokyo), and facts about themselves that they want to share with others visiting the café. I booked a reservation for dinner at DAWN Café with my daughter, and we were assigned a small table with a cute robot run by Koki Yanagida from his home in Kyoto. We were seated, then met Koki, who introduced himself, explained a little about the menu, took our orders, and relayed them to the kitchen.</p>



<p>I was excited to eat at DAWN Avatar Café because I love robots, but I was unprepared for the emotions I felt during this experience. DAWN’s website states that the primary purpose of the robot café is for patrons to develop a connection with its employees through the robots. But, to be quite honest, even as much as I love robots and feel generally positive toward them, I didn’t expect to come out of the experience with any sort of emotional connection. That feeling quickly changed, however, when I sat down to have dinner with my robot companion run by Koki.</p>



<p>Koki activated his robot, turning on its little eyes so we knew he was with us, and then told us about his life. Koki shared with us how shortly after he turned 18 and graduated high school, he was in a terrible car accident that left him paralyzed from the neck down, forestalling his dream of going off to college and leaving him unable to use his arms and legs. He operates his robot with his mouth through a small straw, which remotely controls all its movements. The evening was particularly poignant because I was there with my daughter, who had just graduated from high school and was in the same stage of life as Koki when he had his accident. Koki asked my daughter about her plans to attend college and congratulated her on graduating from high school. I could tell she was deeply moved by his life circumstances and optimism.</p>



<figure class="wp-block-pullquote"><blockquote><p>I was unprepared for the emotions I felt during this experience.</p></blockquote></figure>



<p>At one point in our conversation, another robot passed by, and Koki called out to him, asking how he was doing. It was touching to see everyone interacting and checking in on each other. At the end of the evening, both my daughter and I felt sad to leave because we were so unexpectedly moved by the experience. Those who are not disabled don’t always see the value of — or even the possibilities for — the connections that technologies offer, particularly for those who may not be able to leave their home or get out and about on a regular basis. </p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">In South Korea, robots are also being used for a phenomenon known as “lonely deaths,” which describes the contemporary phenomenon of seniors who live and die alone. In previous generations, it was common for older people to live with their children and grandchildren, but now an increasing number of older Koreans live alone in their later years. In fact, the Korea-EU Research Centre reports that “the number of lonely deaths [in South Korea] soared from 1,669 in 2015 to 2,880 in 2020.”</p>



<p>To reduce lonely deaths and provide companionship more broadly, the government has been developing unique solutions. One of these is equipping robots with Internet of Things (IoT) sensors, pieces of hardware that collect data and detect changes in the environment. Temperature, motion, image, and proximity sensors allow activity in a house to be monitored from another location. In short, your home alarm system is essentially an IoT sensor, though you may not have ever called it that. Having IoT sensors installed in the living quarters of seniors who live alone helps monitor them and support those who want or need to age in place and live independently of their families. Since South Korea has a national government insurance system, the sensors also enable intervention when something happens at home and urgent medical care is needed but cannot be requested.</p>



<p>One government solution is the distribution of a robot called “Hyodol,” a cute little stuffed-person robot with big eyes and a friendly smile that provides companionship and personalized services, such as wake-up reminders and notifications to take medicine. The robot also sends a notification if it detects no movement for a certain period, and it sends emergency texts and calls if the user presses and holds its hand for more than three seconds. The robot comes preloaded with thousands of songs and various entertainment functions, including quizzes and games. Many users interact with their robots during mealtimes, and these conversations are tracked to monitor users and detect signs of cognitive decline.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1202" height="795" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/hyodol-photo-courtesy-of-hyodol-20251127180824458.avif" alt="" class="wp-image-18933" style="width:518px;height:auto"/><figcaption class="wp-element-caption">The AI-powered Hyodol companion doll. Source: Hyodol.</figcaption></figure>
</div>


<p>Studies on Hyodol show that companion care robots work well alongside their human caregivers, providing an extra layer of support for those aging in place. In their 2023 study on Hyodol, Heesun Shin and Chihyung Jeon <a href="https://www.tandfonline.com/doi/full/10.1080/18752160.2024.2348304" target="_blank" rel="nofollow">found</a> that “robots do not substitute for human caregivers but displace or redistribute their tasks and responsibilities.” As AI’s capabilities continue to grow, elder care will be an important area to watch, as the world’s population ages and traditional family structures can no longer be counted on to care for people as they age or to assist them in their dying.</p>



<p>In Japan, where the aging population is quickly growing, and the government expects to face a shortage of at least 380,000 caregivers by the year 2025, according to an article in <em>NUVO</em>, a sizeable portion of the national budget has been allocated to the development of AI “carebots” geared toward both aging in place and palliative care.</p>



<p>The <a href="https://nuvomagazine.com/culture/robots-and-the-future-of-dying" target="_blank" rel="nofollow">article</a> outlines the many AI options that currently exist in this arena:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>The Honda Asimo can fetch a bowl of soup and carry it upstairs. Secom’s My Spoon can raise food to your mouth. The polar bear–like Riken Robear will soon be able to lift your body from the bed and carry you to the bathroom (the sweet-faced, 300 lb. bot is in beta mode until it learns to be more gentle with fragile skin). Once in the bathroom, Sanyo’s bathtub can wash and rinse you. The Cyberdyne Hybrid Assistive Limb (yes, that’s “HAL”) suit can detect the attempted movement of a weak limb, giving it a boost of power. The CT Asia Robotics Dinsow can remind you to take your pills and automatically answer the phone.</p>
</blockquote>



<p>Assistive-care robots help the old and infirm with everyday tasks and expand current care options, reducing the stress on a healthcare system increasingly faced with the needs and demands of an aging population. With the breakdown of the intergenerational family structure in which families care for their aging parents and grandparents, the Japanese government is seeking to ease the burden on families and reduce reliance on human labor to fill the gaps. These robots can provide companionship, monitor one’s health, and aid with basic everyday needs, all while keeping elderly populations safe.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">In addition to assistive-care robots, there are also religious robots that conduct Buddhist funerals. In Japan, funeral ceremony robots are seen as positive for two primary reasons: They are cheaper to hire than a human religious officiant (in 2017, <em>The Guardian</em> <a href="https://www.theguardian.com/technology/2017/aug/23/robot-funerals-priest-launched-softbank-humanoid-robot-pepper-live-streaming" target="_blank" rel="nofollow">reported</a> that the average cost of hiring a religious officiant in Japan was $2,189, whereas a robot hired to conduct the same service cost only $450), and they are generally more efficacious: Because they are robots, they recite the sutras and prayers correctly, ensuring that the deceased loved one is properly cared for during their last rites.</p>



<p>From an emotional perspective, robots are also generally less messy, and some people feel relief at not having to manage human interaction during their time of grief. Introduced at Tokyo’s Life Ending Industry Expo in 2017, the cute bald robot named Pepper comes appropriately dressed in Buddhist robes and “can perform multiple functions such as chanting sutras, and even tapping a drum the same way a human Buddhist priest would. Another interesting function of the robot priest is that it also provides live-streaming of the ceremony for people who are unable to attend.” While Pepper met with some success, SoftBank Group Corp., the company that produced it, stopped production in 2020 amid layoffs and financial restructuring that included reducing its investment in robotics. So, it remains to be seen whether robots will make a comeback in the funeral industry.</p>



<figure class="wp-block-pullquote"><blockquote><p>In the U.S., robots are viewed as soulless, unlike in Asia, where they are viewed as soul-possible or soul-different.</p></blockquote></figure>



<p>Another interesting development in religious robotics is an AI robot meant to replicate the speech, gestures, and movements of the deceased to help the bereaved deal with grief and mourning. Created by Etsuko Ichihara in Japan, Digital Shaman is a humanoid robot with a 3D-printed copy of someone’s face. In anticipation of their death, a person interacts with the robot beforehand so it can be trained to mimic their speech patterns and gestures. After the person’s death, the robot is given to a mourner or mourners for 49 days (seven weeks of seven days is the traditional mourning period in Japanese Buddhism), but after that initial 49-day period, the robot is deprogrammed because Ichihara believes that otherwise the bereaved may not be able to move on. In this case, the robot becomes a digital stand-in for some of the more traditional aspects of a Buddhist funeral.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="445" height="311" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/3500.avif" alt="" class="wp-image-18937"/><figcaption class="wp-element-caption">Pepper at the Life Ending Industry Expo 2017 in Tokyo, Japan. Source: Kim Kyung-Hoon/Reuters.</figcaption></figure>
</div>


<p>For the roboticist Ichihara, the ability to interact with a digital replica of a deceased loved one allows a mourner to process the death and ask the deceased questions as they begin their mourning. For her, this interaction with a digital replica is in many ways far less jarring than the interaction she herself experienced with the dead in the more traditional Buddhist funeral. She <a href="https://www.koaa.com/news/2018/12/28/a-new-way-to-grieve-a-robot-that%20-acts-looks-like-dead-loved-ones/" target="_blank" rel="nofollow">says</a>, “I clearly remember a few things from the funeral. Makeup was applied on my dead grandmother’s face… We placed flowers in her coffin. After she was cremated, our family picked the bones out of her ashes. It was a shocking ritual.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Inventions like Hyodol, Pepper, and Digital Shaman bring up important questions about human attitudes toward robots’ functions. If they serve as a stand-in or conduit for human care and do so effectively and economically, religious robots might be viewed positively, too. But most of my American colleagues are repulsed by this idea. In particular, they view religious robots like Pepper as too impersonal and perhaps encroaching on the very thing that makes us human.</p>



<p>However, I would argue that this view of religious robots as negative might be connected to thinking found in the Christian (and more specifically, Protestant) worldview, which places more value on religious <em>belief</em> than on religious <em>practice</em>. From a Christian point of view, what the robot <em>does</em> is of secondary importance to the fact that it is not, strictly speaking, a sentient being. This means the robot would be incapable of functioning in a religious way (for the Protestant, without the spirit, the robot has no religious animus to function authoritatively in the religious sphere).</p>



<p>To me, this is the crux of why Americans have such a hard time accepting robots and other new technologies into our everyday lives, and why our science fiction is filled with stories of humans versus robots. In the United States, robots are viewed as soulless, unlike in Asia, where they are viewed as soul-possible or soul-different. For those who cling to the notion of human exceptionalism, if robots could be viewed as sentient, then perhaps humans are not that special after all. Until we take seriously the ways in which our cultural and religious heritages inspire and impede our attitudes toward technologies, the development of these technologies will remain the realm of only a select few.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Candi K. Cann</em></strong><em>, a former Fulbright Scholar, is a professor at Baylor University with a research focus on death, dying, and grief, as well as the intersections of marginality, diversity, and death technologies. She is the author of the book “</em><a href="https://mitpress.mit.edu/9780262051118/augmented/" target="_blank"><em>augmented</em></a><em>,” from which this article is adapted.</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Robot-cover.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Robot-cover.jpg" />                                        </item>
        			                <item>
                        <title>Is Creativity a Young Person’s Game?</title>
                        <link>https://thereader.mitpress.mit.edu/is-creativity-a-young-persons-game/</link>
                        <pubDate>Mon, 16 Mar 2026 09:55:00 +0000</pubDate>
                        <dc:creator>Keith Sawyer</dc:creator>
                        		<category><![CDATA[Age]]></category>
		<category><![CDATA[Creativity]]></category>
		<category><![CDATA[Productivity]]></category>
		<category><![CDATA[Youth]]></category>
		<category><![CDATA[Philosophy]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19153</guid>
                        <description><![CDATA[<p>The data suggest that we tend to reach our most productive years in midlife. They also indicate that quality follows from quantity.</p>
]]></description>
                        <content:encoded><![CDATA[<p>The data suggest that we tend to reach our most productive years in midlife. They also indicate that quality follows from quantity.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/no-wires-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock
</figcaption>
</figure>

<p class="has-drop-cap">There’s an unspoken assumption, particularly in America, that youth and creativity tend to go hand in hand. It’s no surprise: Many of the artistic greats, from Jack Kerouac and Bob Dylan to Zadie Smith and Taylor Swift, found immense success when they were young. The fastest-growing startups — Meta, Snapchat, Airbnb, and so on — were founded by twentysomethings, some of whom never graduated from college. A quote <a href="https://www.sciencehistory.org/stories/magazine/positive-effect/?utm_source=chatgpt.com" target="_blank" rel="nofollow">frequently attributed</a> to Albert Einstein holds that scientists who have not made a great contribution to their fields before the age of 30 will never do so in their lifetimes. As one of my chemist colleagues, at 55, told me, “Science is a young man’s game.”</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262551649/learning-to-see/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/creativity-jkt.jpg" alt="" class="wp-image-19158"/></a><figcaption class="wp-element-caption"><em>Keith Sawyer is the author of the book “<a href="https://mitpress.mit.edu/9780262551649/learning-to-see/" target="_blank" rel="noreferrer noopener">Learning to See</a>.” </em></figcaption></figure>
</div>


<p>But can we generalize from these anecdotes? Can we <em>really </em>conclude that young people are more creative than their older counterparts? To answer this question, we need to determine when people are at their creative peak, and to do that, we need to delve into the science of creativity.</p>



<p>In my book with Danah Henriksen, “<a href="https://academic.oup.com/book/55265" target="_blank" rel="nofollow">Explaining Creativity</a>,” I touch upon studies of thousands of creators, from their 20s to the end of their lives, demonstrating that they generate their <em>best</em> work at the same time they’re putting the <em>most</em> work out into the world. This is particularly true of scientists, whose creative contributions are well documented because everything they do is quantified through citations, publications, books, awards, and patents. </p>



<p>What age, you might wonder, are scientists likely to be when the Nobel committee comes calling? Would you bet on the young striver or the seasoned expert? The answer, interestingly, is <em>neither</em>. The most creative people are smack in the <em>middle</em> of their careers — neither in their 20s nor nearing retirement. If we were to visualize this finding, we’d see that creativity is an inverted-U function of <em>career age</em>, or the length of time the individual has been working in their field.</p>





<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1304" height="1084" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-19-at-3.27.27-PM.png" alt="" class="wp-image-19164" style="width:360px;height:auto"/><figcaption class="wp-element-caption"><em>The relationship between career age and annual production of creative ideas</em> for poets and mathematicians. Source: <a href="https://www.sciencedirect.com/science/article/abs/pii/0273229784900200?via%3Dihub" target="_blank" rel="nofollow">Simonton, D. K. (1984)</a>.</figcaption></figure>
</div>


<p>In other words, as the years go by, creative output climbs until it peaks in mid-career, after which it steadily declines.</p>



<p>The story is, of course, different depending on the type of work you do. Physicists, writers, and painters peak in their 20s or 30s; biologists and social scientists peak in their 40s; and historians and philosophers can maintain a steady output through to retirement. This broadly aligns with a 1966 study showing that scientists and artists peak earlier than scholars such as historians and psychologists.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1160" height="978" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-19-at-3.28.46-PM.png" alt="" class="wp-image-19165" style="width:374px;height:auto"/><figcaption class="wp-element-caption"><em>Age curves for three general domains of creativity.</em> Source: <a href="https://academic.oup.com/geronj/article-abstract/21/1/1/548959?redirectedFrom=fulltext&amp;login=false" target="_blank" rel="nofollow">Dennis (1966)</a> and <a href="https://pubmed.ncbi.nlm.nih.gov/2276635/" target="_blank" rel="nofollow">Simonton, D.K. (1990)</a>.</figcaption></figure>
</div>


<p>I also want to stress again that the research shows creators have their most important, groundbreaking ideas in the same year they’re most prolific. If you’re a serial inventor, the patent that makes you millions is likely to be from the same year that you filed the <em>most</em> patents. If you’re a painter, then your most famous painting will likely be from the same year that you made the most paintings. The key to successful creativity is <em>productivity</em>; quantity leads to quality.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">You might be wondering what these insights mean for your own creative output, whether you&#8217;re a writer, coder, sculptor, engineer, or anything else. You’re right to wonder, especially in a society where all manner of things compete for your attention, making sustained focus — a prerequisite of creativity — increasingly hard to maintain. So, I’ve compiled below several nuggets of wisdom that might help maximize your creative output. It goes without saying that these are easier said than done:</p>



<ul class="wp-block-list">
<li>Work hard and finish projects. The aim should always be to put your creations into the world.</li>



<li>Don’t spend all of your time on just one idea, no matter how great you think it is. Spread your energy around. Place multiple bets.</li>



<li>Some creative professions — and some types of work — peak earlier than others. Consider switching to a different path in your 30s or 40s to catch multiple peaks.</li>



<li>If you want quick success, choose professions that peak early: math, science, computers, and engineering. But if you want a sustained creative life, look more broadly.</li>
</ul>



<p>Applying these principles in the real world can certainly be challenging. For instance, if you’re in high school or college, what should you major in? Liberal arts colleges have been saying for years, “We’re not educating you for your first job; we’re educating you for your entire career.” That kind of holistic approach to education is great. But in such a <a href="https://www.nytimes.com/2026/02/04/opinion/ai-jobs-employment-industry.html" target="_blank" rel="nofollow">volatile labor market</a> as America’s, you do <em>still</em> need that first job. So maybe consider double-majoring in an early-career field (math, engineering) and a later-career field (literature, history, philosophy).</p>



<p>If you’re mid-career, you’re at your peak, so make good use of the resources, the network, and the reputation that you have already built for yourself. By now, you should have the foundation to do your best work. If you’re nearing retirement, you simply can’t coast. You must work just as hard as you did in your 20s, which might be painful at an older age. But if you put in the work, your experience and wisdom will pay off greatly.</p>



<p>One more thing: The beauty of creativity is that it’s often collaborative. This means that you can expand your creative potential by collaborating across age groups. So, consider pairing up with someone who is considerably older or younger than you. This might feel counterintuitive: If you’re senior and experienced, you probably have a fancier title, you make more money than your younger colleagues, and you may wonder: What can that recent college graduate with little to no professional experience teach <em>me</em>? But trust me, that college graduate is thinking that much of your so-called expertise is outmoded and irrelevant.</p>



<p>This divide is difficult to bridge in the workplace. But throughout human history, it has been done repeatedly with great success: Niels Bohr and Werner Heisenberg were 16 years apart in age when they together fundamentally reshaped quantum mechanics. Larry Page and Eric Schmidt had a 20-year age gap when Schmidt joined Google as its CEO. Leonard Bernstein, as a middle-aged musician, composed works often directly inspired by those of his younger students. If people of different age groups can learn to combine their respective strengths, collaboration across decades of life experience is a reliable path to greater creativity.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em><strong>Keith Sawyer</strong> is one of the world’s leading researchers on creativity. He has published 20 books, including “</em><a href="http://www.groupgenius.net/" target="_blank" rel="noreferrer noopener nofollow"><em>Group Genius</em></a><em>,” “</em><a href="http://www.zigzagcreate.com/" target="_blank" rel="noreferrer noopener nofollow"><em>Zig Zag</em></a><em>,” and, most recently, “</em><a href="https://mitpress.mit.edu/9780262551649/learning-to-see/" target="_blank" rel="noreferrer noopener"><em>Learning to See</em></a><em>.” Sawyer is the Morgan Distinguished Professor in Educational Innovations at the University of North Carolina at Chapel Hill. He’s the host of the podcast <em>“</em><a href="http://www.sawyerpodcast.com" target="_blank" rel="nofollow"><em>The Science of Creativity</em></a>.<em>”</em> A version of this article first appeared on Sawyer’s <a href="http://keithsawyer.substack.com" target="_blank" rel="nofollow">Substack</a>.</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/no-wires.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/no-wires.jpg" />                                        </item>
        			                <item>
                        <title>Cannabis Through the Ages</title>
                        <link>https://thereader.mitpress.mit.edu/cannabis-through-the-ages/</link>
                        <pubDate>Thu, 12 Mar 2026 09:55:00 +0000</pubDate>
                        <dc:creator>Linda A. Parker</dc:creator>
                        		<category><![CDATA[Cannabis]]></category>
		<category><![CDATA[Drugs]]></category>
		<category><![CDATA[History]]></category>
		<category><![CDATA[weed]]></category>
		<category><![CDATA[Culture]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19480</guid>
                        <description><![CDATA[<p>The drug’s history of healing and experimentation stretches from ancient China to American counterculture — yet its promise remains trapped in a legal straitjacket.</p>
]]></description>
                        <content:encoded><![CDATA[<p>The drug’s history of healing and experimentation stretches from ancient China to American counterculture — yet its promise remains trapped in a legal straitjacket.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/weed-cover-copy-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock</figcaption>
</figure>

<p class="has-drop-cap">An altered state of consciousness, euphoria, relaxation, increased enjoyment of food tastes and aromas, distortion in time perception, joviality, introspection, and a heightened sense of creativity: These are some of the reported “psychoactive effects” often experienced by cannabis users, and they are the same effects that the drug has had on people throughout the arc of human history.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262051392/cannabinoids/" target="_blank"><img loading="lazy" decoding="async" width="320" height="448" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/weed-jkt.jpg" alt="" class="wp-image-19483"/></a><figcaption class="wp-element-caption">Linda A. Parker is the author of “<a href="https://mitpress.mit.edu/9780262051392/cannabinoids/" target="_blank">Cannabinoids</a>,” from which this article is adapted.</figcaption></figure>
</div>


<p>The earliest evidence of cannabis cultivation dates to over 10,000 years ago in modern-day China, Mongolia, and Kazakhstan. It was likely used primarily as a fiber (for making ropes, nets, and other textiles), as a food (for protein from hemp seeds), and as a ritualistic drug (for ceremonial or psychoactive use). </p>



<p>But the systematic medicinal applications of cannabis for treating numerous pathologies were not documented until thousands of years later. The earliest such applications we know of are tied to Emperor Shen Nung (circa 2700 BCE), a quasi-legendary figure known as the “father” of Chinese medicine. He is said to have taught the Chinese people to practice agriculture, cultivating not only cereals and tea but also cannabis, which he apparently saw as an alternative to magic in fighting disease.</p>



<p>The first known Chinese pharmacopoeia — the “Shen Nung Pen Ts’ao Ching,” written in the first century BCE — lists the traditional remedies that had been handed down orally for over 2,000 years, dating back to the mythical Emperor Shen Nung’s reign. In it, a concoction of female cannabis flowers was prescribed for conditions associated with pain, constipation, malaria, and gynecological disorders. It was considered a safe, highly effective herb. The ancient text makes only limited reference to psychoactive properties, noting that too much cannabis could cause a person to “see demons” or allow them to “communicate with the spirits.” It is likely that the psychoactive use of cannabis was limited to shamans at the time. However, by the time of the Shang dynasty, which placed restrictions on practices such as divination and ritual healing, many shamans had begun to leave China for India.</p>



<figure class="wp-block-pullquote"><blockquote><p>Most effects of cannabis that are only now being studied have been known throughout human history.</p></blockquote></figure>



<p>In ancient India, cannabis use spread rapidly as a source of drug-induced elation and was commonly used in religious rituals, as reported in the “Atharva Veda,” an ancient collection of holy writings (around 2000 BCE). The sacred <em>bhanga</em>, as the drug was called, was considered the optimal treatment for anxiety and was used to treat pain, produce anesthesia, reduce spasms and convulsions, and induce hunger. By around 800 BCE, cannabis was being used for its intoxicating and therapeutic effects in Assyria and ancient Persia, as well as in medieval Arab societies.</p>



<p>Despite cannabis’s long and widespread history of recreational use, it would take millennia for the plant to come under formal scientific scrutiny.</p>



<p>The first to study its pharmacological and toxicological properties was the Irish chemist and physician William Brooke O’Shaughnessy, who researched the drug’s use in India from 1833 to 1840. After conducting a series of human and animal experiments to explore the drug’s therapeutic effects on pain, rheumatism, and convulsions, he brought his findings back to the European medical community. O’Shaughnessy concluded that cannabis was an effective analgesic and muscle relaxant, and the most useful treatment then known for convulsions.</p>



<p>O’Shaughnessy was not alone: French psychiatrist Jacques-Joseph Moreau introduced cannabis to Europeans in the mid-1800s as a psychoactive drug based on observations made during travel in the Middle East. He used the scientific method to detail the drug’s psychoactive effects, which he believed offered a way for psychiatrists to better understand mental illness. Soon enough, in Paris, the drug’s psychotropic use extended beyond the therapeutic; numerous artists, such as Victor Hugo, Alexandre Dumas, and Charles Baudelaire, wanted to try cannabis. During monthly meetings, Moreau dispensed <em>dawamesk</em> (a mixture of hashish, cinnamon, cloves, nutmeg, pistachio, sugar, orange juice, butter, and cantharides) to eminent people who had assembled to ingest the drug.</p>






<p>“There are two modes of existence — two modes of life — given to man,” Moreau mused. “The first one results from our communication with the external world, with the universe. The second one is but the reflection of the self and is fed from its own distinct internal sources. The dream is an in-between land where the external life ends, and the internal life begins.”</p>



<p>With the aid of hashish, he felt that anyone could enter this in-between land at will. As Moreau studied hashish, he noted a relationship between the amount of the drug taken and its effects. A small dose produced a sense of euphoria and calm. With higher doses, however, attention wandered, ideas appeared at random, minutes seemed like hours, thoughts rushed together, and sensory acuity increased. As the dose increased further, dreams began to flood the brain, like hallucinations of insanity. Indeed, it is now understood that cannabinoids exhibit biphasic effects, in which low doses produce the opposite effects of high ones.</p>



<p>By the early 20<sup>th</sup> century, cannabis had become well-known throughout the world. It was the drug of choice for many early jazz musicians, such as Louis Armstrong, and later for reggae artists like Bob Marley. In the beatnik community, writers such as Jack Kerouac and Allen Ginsberg used it for creative inspiration. And by the ’60s, cannabis had become what some might consider <em>the</em> symbol of American counterculture.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Most of the effects of cannabis that are only now being studied scientifically are hardly new discoveries. It has been known for centuries, for instance, that the drug is effective in treating seizures. We now know that CBD is the constituent of cannabis responsible for this effect. CBD was approved for the treatment of childhood epilepsy by the U.S. Food and Drug Administration (FDA) in 2018, yet early clinical trials had shown it to be effective against epilepsy as far back as 1978. Why has it taken so long to determine how this drug produces its effects? Prohibition, at least in America, has played a major role.</p>



<p>The fall of medicinal cannabis research, a field advanced by figures like O’Shaughnessy and Moreau, came in 1937 with the enactment of the Marihuana Tax Act. Thanks in part to a moral panic stoked by Harry Anslinger, then the commissioner of the Federal Bureau of Narcotics, cannabis became prohibitively costly and legally risky to research, despite appeals from the American Medical Association. In 1941, the drug was removed altogether from the United States Pharmacopeia-National Formulary, which helped shift public perception away from thinking of cannabis as a medicine.</p>



<p>Then came perhaps the largest setback: In 1961, an international treaty called the Single Convention on Narcotic Drugs placed psychoactive substances into four schedules. Schedule I, the most restrictive, contained drugs viewed as having a high potential for abuse and little therapeutic value. At a subsequent 1971 UN Convention on Psychotropic Substances, the cannabis plant, its resin, extracts, and tinctures were all placed in Schedule I, which prohibited all use except for scientific purposes and very limited medical purposes by duly authorized persons. Many countries, including Britain, excluded phytocannabinoids other than THC (such as CBD) from this control. But the United States and Canada chose to place every constituent of cannabis under the same restrictive schedule as THC.</p>



<figure class="wp-block-pullquote"><blockquote><p>Cannabis is a drug worth studying, rather than one we should put in a legal straitjacket.</p></blockquote></figure>



<p>Since then, restricted access to cannabis and its constituents in America has had a negative impact on the scientific world and beyond. We could have spent decades conducting experiments on cannabis as a drug of both abuse and therapy, the results of which might have greatly informed the legalization of cannabis in several U.S. states. Indeed, well-powered, placebo-controlled investigations are still critically needed to disentangle pharmacologic efficacy from expectation.</p>



<p>However, these studies have been nearly impossible to conduct, partially because of the U.S. Drug Enforcement Administration’s (DEA’s) Schedule I labeling of the cannabis plant and its constituents. Removal of research barriers like this is currently under active discussion. Many lawmakers have suggested that cannabis should be descheduled and decriminalized entirely.</p>



<p>Whatever the case may be, thousands of years of history show us that cannabis is a drug worth studying, rather than one we should put in a legal straitjacket. We have merely scratched the surface of this drug&#8217;s potential for harm and good. Going forward, only quality science, with data that helps us assess the drug&#8217;s societal risks and benefits, will allow us to make responsible decisions about its use.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Linda A. Parker</em></strong><em> is University Faculty Emeritus and the former Canada Research Chair in Behavioral Neuroscience at the University of Guelph, Ontario, Canada. She has published over 200 scientific articles and written several books, including “</em><a href="https://mitpress.mit.edu/9780262051392/cannabinoids/" target="_blank">Cannabinoids</a>,” from which this article is adapted.</p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/weed-cover-copy.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/weed-cover-copy.jpg" />                                        </item>
        			                <item>
                        <title>War Begets War</title>
                        <link>https://thereader.mitpress.mit.edu/war-begets-war/</link>
                        <pubDate>Mon, 09 Mar 2026 09:55:00 +0000</pubDate>
                        <dc:creator>Robert Jay Lifton, Neta C. Crawford, and Matthew Evangelista</dc:creator>
                        		<category><![CDATA[Middle East]]></category>
		<category><![CDATA[Military]]></category>
		<category><![CDATA[Nuclear]]></category>
		<category><![CDATA[Trump]]></category>
		<category><![CDATA[War]]></category>
		<category><![CDATA[Culture]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19502</guid>
                        <description><![CDATA[<p>Triumph breeds hubris. Defeat breeds grievance. Either way, from World War II to Afghanistan, America has fueled a cycle that never ends.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Triumph breeds hubris. Defeat breeds grievance. Either way, from World War II to Afghanistan, America has fueled a cycle that never ends.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/war-cover-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>A massive Kuwaiti oil fire caused by the Iraqi military during America&#8217;s involvement in the Gulf War. Source: Adobe Stock</figcaption>
</figure>

<p><em>This article <a href="https://direct.mit.edu/daed/article/154/4/181/134162/War-Begets-War" target="_blank">first appeared</a> in the Fall 2025 issue of <a href="https://direct.mit.edu/daed" target="_blank">Daedalus</a>, under the same title. It features a three-way dialogue between political scientists Neta C. Crawford, Matthew Evangelista, and the late psychiatrist Robert Jay Lifton (1926–2025), an expert on the psychological causes and effects of violence in war. Lifton’s work was foundational to our understanding of trauma, particularly in shaping how we conceptualize and study posttraumatic stress disorder.</em></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Neta C. Crawford</strong>: First of all, thank you for doing this. It&#8217;s really appreciated.</p>



<p><strong>Robert Jay Lifton</strong>: I&#8217;m happy to, and I feel that my work connects with your concerns, so that&#8217;s why we&#8217;re all here.</p>



<p><strong>Crawford</strong>: This conversation began with a concern about the ways that the post-9/11 wars had affected American democracy. We also want to hear what you say about defeat in a “lost war,” the role of posttraumatic stress disorder (PTSD), which you helped conceptualize, and the diagnosis of it among Vietnam War veterans. Can you relate that to the concept of the lost war?</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><img loading="lazy" decoding="async" width="300" height="429" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/war.jpg" alt="" class="wp-image-19509"/><figcaption class="wp-element-caption"><em>This article <a href="https://direct.mit.edu/daed/article/154/4/181/134162/War-Begets-War" target="_blank">first appeared</a> in the Fall 2025 issue of <a href="https://direct.mit.edu/daed" target="_blank">Daedalus</a>, under the same title.</em></figcaption></figure>
</div>


<p><strong>Lifton</strong>: Well, first of all, I would say the principle here is that war begets war. War creates more war, and it always has to do with something that happened or didn&#8217;t happen in the previous war. Just as we speak of “nuclearism” as an embrace of nuclear weapons to solve human problems, so can we speak of war or “warism.” Warism requires a high degree of militarism and an ever-present potential use of force. This is especially true of a superpower, which maintains a dubious claim to omnipotence.</p>



<p>I always choose Vietnam as an example because, in factual terms, we clearly lost the Vietnam War; that loss was intolerable to a superpower. We knew we had the hardware — the technology — to win any war, whether with powerful non-nuclear weapons (so-called conventional weapons) or even nuclear weapons. And the question always arose: Why didn&#8217;t we?</p>



<p>When you lose that sense of omnipotence, there&#8217;s an impulse to reverse the loss of the war. Either by creating a new war that can be won (the First Iraq War was initiated to reverse the loss in Vietnam, though it had nothing to do with it), or by what we can call the “Rambo phenomenon.” In the series of “Rambo” films, a super-masculine figure can, by his own power, bring about a reversal of the outcome of the Vietnam War.</p>



<p>Involved here, very importantly, is a preoccupation in my work with the idea that we humans are meaning-hungry creatures. For survivors, that&#8217;s true 10 times over, especially for survivors of extreme violence or trauma. Toward the end of the Vietnam conflict, I wrote an article called “<a href="https://spssi.onlinelibrary.wiley.com/doi/abs/10.1111/j.1540-4560.1975.tb01020.x" target="_blank" rel="nofollow">The Post-War War,</a>” which described the struggle between adversarial groups to impose their meaning on that loss. One meaning was that it was an ill-advised war, a misguided enterprise that we should never have initiated. Another was that the war was necessary, fought for a noble cause, and that we should have won it by applying our superior technology of destruction.</p>



<figure class="wp-block-pullquote"><blockquote><p>Humans are meaning-hungry creatures. For survivors, that&#8217;s true 10 times over.</p></blockquote></figure>



<p>The concept of posttraumatic stress disorder was brought about by a committee consulting with those responsible for the “Diagnostic and Statistical Manual of Mental Disorders” (DSM). A close friend and colleague, Chaim Shatan, did most of the coordinating, but I was active in it too. I brought up not only my experience with Vietnam veterans during the early 1970s, but also my experience with Hiroshima survivors in the 1960s.</p>



<p>Of course, as some have pointed out, PTSD can be so medicalized as to lose its political significance, but that can happen with any concept.</p>



<p>There are certain advantages to the use of the concept of PTSD. One is that it gives recognition to adult trauma. So much of professional psychiatry has focused on either the organic — the German <em>Anlage</em> — source of various conditions, or on childhood influences, as in the work of Freud or Freudians. There&#8217;s been a kind of lacuna for adult trauma. Erik Erikson helped overcome that in his work, especially in relation to the life cycle.</p>



<p>Another advantage of the concept of PTSD is that it can contain a body of symptoms that are valuable for us to recognize. These include an obsession with the trauma while being unable to talk about it or to talk about anything else. What results is considerable anxiety, alternating with what I call <em>psychic numbing,</em> the inability or disinclination to feel. There can be “flashbacks,” which take the veteran back into the Vietnam situation, and he or she can behave accordingly in ways that include rage and violence.</p>



<p>For treatment purposes, it is most effective to provide psychological help close to the combat area and as quickly as possible. But when you do that, you are seeking to sustain participation in whatever war is being fought.</p>



<p>In terms of meaning, we may say that anti-war veterans found it in the meaninglessness of their war. And in coming to that powerful factual truth, they were released to tell others about it and emerge as leaders of various peace movements, especially in this country. And their leadership continues to expand.</p>



<p>They had, of course, special credibility because they were there doing the killing and dying. They could recognize the extraordinary number of Vietnamese civilians killed, and the confusion Americans inevitably had in distinguishing civilians from combatants in that kind of counterinsurgency war. These were the conditions that Jean-Paul Sartre called likely to bring about genocide; certainly, they can bring about atrocities.</p>



<p>It&#8217;s also important to understand that the resistance by the anti-war veterans came from below. They were mostly ordinary Americans who hadn&#8217;t questioned American war-making, because it was their country and they considered themselves patriotic. The fact that they could undergo this dramatic change in opposing their war while it was going on had intense significance for the society as a whole in turning against the war.</p>



<p><strong>Crawford</strong>: It seems to me that the way you think about this throughout all of your work is to see the individual as both an individual and as a metaphor for the society. Are you saying that the culture experiencing this trauma of the lost war also has a need to overcome it collectively?</p>



<p><strong>Lifton</strong>: Yes, there is the question of the individual and the collective, and that question runs all through my work. I have mostly interviewed individuals and looked for what I call <em>shared themes,</em> which can then identify the collective. Shared patterns of individuals — including trauma and pain — become sources of understanding of the collective. Collective behavior becomes crucial to bringing about any social change or to characterizing what is happening in a society.</p>



<p>The “Rambo” phenomenon wouldn&#8217;t have taken shape if there wasn&#8217;t a long-standing collective support of the war, which amounted to a collective falsification of the war. That pattern was interrupted by the antiwar activities of veterans I interviewed.</p>



<p>The other point you raised has to do with the idealization of the lost war. Here, one does well to go back to the American Civil War, when leaders of Southern culture, notably Robert E. Lee, who became the commanding general of the Confederacy, came to be ennobled as having admirably held to their cultural loyalty and to the “compelling charm” of their society. This idealization covers over the fact that Southern culture was inseparable from slavery.</p>



<div class="wp-block-image ma-related-post ma-related-post-normal"><figure class="alignright size-pinned is-resized"><a href="https://thereader.mitpress.mit.edu/the-mega-wars-that-shaped-world-history/" title="The Mega-Wars That Shaped World History"><span class="ma-related-post-top"><span class="ma-related-post-heading">Related</span><span class="ma-related-post-title">The Mega-Wars That Shaped World History</span></span><img decoding="async" loading="lazy" class="ma-related-post-img" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2020/03/lede-2-408x267.jpg" alt="" width="370" height="242"></a></figure></div>



<p>There&#8217;s a partial parallel with Vietnam: the kind of empathy and sympathy I and others had for the veterans themselves could be extended by some to mean that they were fighting for a noble cause. Ronald Reagan could see them as patriots on a great mission to combat a Communist effort to suppress our country. There&#8217;s a lot of falsehood in that, since it was a murderous war that we started under dubious conditions.</p>



<p>We&#8217;re still struggling with the false ennobling of the Confederate cause and the Vietnam War.</p>



<p><strong>Crawford</strong>: What do you think could transform the collective? Because we remain at least partially stuck in the Reagan-era reinterpretation of the war.</p>



<p><strong>Lifton</strong>: With Vietnam, the collective became increasingly susceptible to questioning; that is, Americans came to have increasing doubts about the war. There were enormous demonstrations; there was the “<a href="https://en.wikipedia.org/wiki/Moratorium_to_End_the_War_in_Vietnam" target="_blank" rel="nofollow">Moratorium</a>”; there were many efforts on the part of the general public to express outright opposition to the war.</p>



<p>Let me say something else about the individual and the collective process. Erik Erikson had a theory of the Great Man (or Great Woman) in history. He emphasized (as he did in his psychobiographies of Luther and Gandhi) the great person who must “solve for all what he could not solve for himself alone.” That was what led to historical change. My focus on shared themes questioned that theory in favor of a focus on specific groups of people that have particular influence in being acted upon or themselves acting on others. Among those specific groups were Hiroshima survivors and anti-war Vietnam veterans.</p>



<p>I think the shared themes theory is more in keeping with our task in this interview. That is also perhaps true for most of the other essays in this <em>Daedalus</em> issue, which are collectively oriented. They would be more in the realm of shared themes than of the great person in history.</p>



<figure class="wp-block-pullquote"><blockquote><p>September 11 still haunts us, because a superpower cannot allow itself to be defeated or humiliated by anyone.</p></blockquote></figure>



<p><strong>Matthew Evangelista</strong>: In terms of shared themes, would you credit something like a “Vietnam Syndrome,” in which many Americans became skeptical of the use of military force, for wars that resembled Vietnam?</p>



<p><strong>Lifton</strong>: The Post-Vietnam Syndrome collectively for America, as you suggest, came to mean a reluctance to get into counterinsurgency wars like Vietnam that are so dubious. That&#8217;s been a very powerful influence. But the post-9/11 wars in Iraq and Afghanistan that we entered were, unfortunately, also counterinsurgency wars, and could be said to have been fought to break out of the Vietnam Syndrome.</p>



<p>It was the first George Bush who said, “By God, we&#8217;ve kicked the Vietnam Syndrome once and for all!” Well, in fact, we hadn&#8217;t, but we had broken out of it significantly in creating the First Iraq War. And even with the Afghan War, one could have advocated much more limited means. Some action had to be taken against Osama bin Laden, but we didn&#8217;t need to initiate a war on the entire nation of Afghanistan, where previous efforts, including a Russian one, had notoriously failed.</p>



<p>Let me also say something about another version of the Post-Vietnam Syndrome. It originally had a different meaning, at least for veterans. It signified that veterans of Vietnam seemed different from the veterans of other wars. Many of them were reluctant to go to the Veterans Administration, which refused to recognize that difference. For a long time, the Veterans Administration wanted to see Vietnam veterans as just like veterans of other wars, who should join local veterans’ groups that tended to be conservative or reactionary about military matters.</p>



<p>I fortunately had an influence in bringing about a change in that attitude. A young man named Arthur Blank, who was my student and colleague at Yale, and himself a psychiatrist and a Vietnam veteran, became head of an outreach program of the Veterans Administration. He consulted with me about veterans in general and the work I had done in rap groups [discussion groups or group therapy] with them. He enabled the Veterans Administration to recognize the conflicts of the soldiers in that war. Where I and others working with me could reach just a few hundred people in our rap groups and interactions with veterans, his program could reach tens of thousands.</p>



<p><strong>Evangelista</strong>: What about the “war on terror” following upon 9/11?</p>



<p><strong>Lifton</strong>: Unfortunately, that “war on terror” could have a totalism of its own. Anyone who did not completely support our position was against us. September 11 also still haunts us, all the more so because a superpower cannot allow itself to be defeated or humiliated by anyone.</p>



<p><strong>Crawford</strong>: When you say we&#8217;re haunted by the wars, do you think of it as victory having its own sort of hangover — victory as part of the superpower syndrome?</p>



<p><strong>Lifton</strong>: Winning wars is problematic, too. I have in mind World War II, which killed enormous numbers of people. I was once giving a talk to a religious group, and I mentioned atrocities in Vietnam, and the atrocity-producing situation, and a man got up and said, “I was a Marine in World War II. We mutilated bodies, too. We killed prisoners. It wasn&#8217;t just Vietnam.” That was Paul Moore, the great Episcopal leader. He was saying those atrocities could occur even in a so-called good war, necessary to defeat the Nazis. The victory parades that followed World War II could also help block out its ugliness. The soldiers came back as heroes. We became world-dominant and had a lot of ethical claim. And our own atrocities were covered over.</p>



<p><strong>Evangelista</strong>: Would you say that the outsized role that military power plays in U.S. foreign policy has an effect on the quality of our democracy?</p>



<p><strong>Lifton</strong>: What you are raising is what has come to be called a “national security state.” What that means is that the organs of the state are subsumed to a form of militarism as an assertion of what&#8217;s called “national security.” But that can come to mean a domination of behavior in the world.</p>



<p>It&#8217;s significant that this concept of the national security state was one that we directly questioned in the physicians’ antinuclear and antiwar movement: PSR, Physicians for Social Responsibility, and then the international version, IPPNW, International Physicians for the Prevention of Nuclear War. We put forward a position of shared security or human security. That was embodied in a quasi-humorous but deeply significant toast that would be offered at each meeting of the international group, either by an American or a Soviet delegate to the meeting. The toast that he or she would make was: “Here&#8217;s to your good health and the health of your leaders and the health of your people, because if you die we die, and if you survive we survive.” A little gallows humor there and a lot of truth.</p>



<p>It&#8217;s disappointing that, in the buildup to the American election of 2024, there was very little rational mention of the nuclear threat.</p>



<p><strong>Evangelista</strong>: Why do you think there&#8217;s such neglect of the nuclear danger now? Many would credit the international physicians with contributing to the end of the Cold War and the end of the superpower nuclear arms race. They won a Nobel Peace Prize for their efforts. Yet here we are with countries still maintaining nuclear arsenals even though they were reduced quite a lot after the initiatives of Gorbachev and Reagan. Now we hear talk of a new nuclear arms race, one that includes China. There&#8217;s still concern about Iran&#8217;s nuclear program and North Korea&#8217;s nuclear program.</p>



<p><strong>Lifton</strong>: I think that the human psyche has a certain kind of overall area in which apocalyptic dangers are confronted or experienced. Charles Strozier and I did a study that was termed “Nuclear Threat” and found that people spoke of climate and nuclear threat almost in the same paragraph or even in the same sentence.</p>



<figure class="wp-block-pullquote"><blockquote><p>“Nuclear ethics” is a contradiction in terms.</p></blockquote></figure>



<p>Much of the conversation about nuclear weapons has been in relation to deterrence. Joseph Nye at the Kennedy School wrote a notorious book called “Nuclear Ethics,” in which he said we shouldn&#8217;t be hawks and build too many, we shouldn&#8217;t be doves and not build enough, we should be owls who build just the right number. And, under certain conditions, we may have to use them. “Nuclear ethics” is a contradiction in terms. There is no ethics and only criminality in using weapons that can bring about an end to humanity. One has to remember that so-called deterrence always includes the possibility of using the weapons, and sometimes can encourage first use. That kind of thinking is a form of nuclearism. So is the idea that there can be an “exchange”: I drop a bomb on Moscow, you drop a bomb on New York, and we&#8217;re finished.</p>



<p>The dropping of the first nuclear bomb in Hiroshima was an act of nuclearism. J. Robert Oppenheimer&#8217;s tragedy was his brilliant success in bringing about the making of the bomb at Los Alamos. He became a national hero. But he advocated the use of the weapon to solve the country&#8217;s problems.</p>



<p>In the physicians’ movement, we were attempting to break out of nuclearism. We would say in effect: “Look, we&#8217;re doctors, we&#8217;d like to patch you up after a nuclear war, as doctors do with any war. But the trouble is that there will be no medical facilities to do that, and, besides, you&#8217;ll be dead, and we&#8217;ll be dead.” That was our message. It was the direct antithesis of nuclearism, and it was a form of factual truth-telling about the nuclear threat.</p>



<p>All of my work in relation to nuclear threat and threat of war in general is enormously affected by the fact that I encountered the bomb in its annihilative use in Hiroshima. Survivors, called <em>hibakusha,</em> whom I interviewed, described those human effects in the most pained way. That led me to look into the state of mind of those at the other end of the weapon, those who created it and advocated its use.</p>



<p>In my latest book, I emphasize survivor power and survivor wisdom, because survivors can apply what they have experienced — whether the survivors of Hiroshima or survivors of Auschwitz — to tell the tale of what happened in a deeply believable way. And their influence can be sustained even after their generation begins to die out.</p>



<p>Most of the scientists who worked closely with Oppenheimer to make the bomb also became what I came to call “prophetic survivors.” They started the <a href="https://thebulletin.org/#navbar-brand" target="_blank" rel="nofollow">Bulletin of the Atomic Scientists</a><em>,</em> whose authors mainly included scientists active in creating the bomb, who knew all too well what it could do, and did do, to human beings in general.</p>



<p>Survivor power involves what Martin Buber called “imagining the real.” That is, taking in the factual truth of the kind of catastrophe that threatens our species.</p>



<p>The fact that International Physicians for the Prevention of Nuclear War won a Nobel Peace Prize suggests the hunger for factual truth: The truth of nuclearism is that it&#8217;s endangering the planet. There is the phenomenon of “nuclear winter,” where the ashes of the nuclear attack will block out the light of the sun and make it impossible to survive. And there is newer work that explores how nuclear war would affect agriculture and create world starvation. It&#8217;s research-based, so these are nuclear truths that are factual and that we need to articulate and continue to articulate.</p>



<div class="wp-block-image ma-related-post ma-related-post-normal"><figure class="alignright size-pinned is-resized"><a href="https://thereader.mitpress.mit.edu/devastating-effects-of-nuclear-weapons-war/" title="The Devastating Effects of Nuclear Weapons"><span class="ma-related-post-top"><span class="ma-related-post-heading">Related</span><span class="ma-related-post-title">The Devastating Effects of Nuclear Weapons</span></span><img decoding="async" loading="lazy" class="ma-related-post-img" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2022/03/nuclear-war-408x267.jpg" alt="" width="370" height="242"></a></figure></div>



<p><strong>Crawford</strong>: There&#8217;s a National Academy of Sciences study of nuclear winter underway. Christopher Yeaw, as part of his testimony for that study, advocated that nuclear deterrence required us to avoid giving the impression to adversaries like Russia and China that we would hold back from using the weapons. He warned against being “self-deterred.”</p>



<p><strong>Lifton</strong>: The mildest term for that is disinformation. It&#8217;s worse than that because it&#8217;s reminiscent of the nuclearism of Edward Teller or Herman Kahn. Teller thought that the significance of Hiroshima was that we should never cease making bigger and more deadly weapons. Kahn, when told that a nuclear policy could lead to the loss of a city, would reply: “Well, we&#8217;ll build a new city.” These are false assumptions about the weapons and about human behavior. Nuclearism can all too readily lead to planetary destruction.</p>



<p>Still, I think it&#8217;s reasonable to ask: How is it or why is it that there have been no nuclear weapons used since Nagasaki? Given the prevalence of nuclearism, one might well have feared they could be used again. We don&#8217;t know the answer to that question exactly, but it could be that the various peace movements, the recognition of Hiroshima, which created what I came to call “imagery of extinction,” and other forms of disseminating nuclear truths have played a part that could be of greater significance than any clear deterrence. And that commitment to factual truth-telling about nuclear weapons has to be sustained by responsible leaders. But, as I always emphasize, the struggle continues.</p>



<p><strong>Crawford</strong>: If it were me, I would say that the truth about war is that it never discriminates; it always harms civilians. And you would say there&#8217;s always atrocity.</p>



<p><strong>Lifton</strong>: Yes, there&#8217;s always atrocity, with widespread killing of civilians.</p>



<p>War is also likely to produce the seeds for dictatorial leaders. For instance, in Hitler&#8217;s own story, he could take the German defeat in World War I and the conditions imposed by the Allies as humiliating, as many others did. He himself described a kind of transcendent experience under poison gas during which he could envision himself as a great leader of the German people.</p>



<p>There&#8217;s something about the mass killing in war, any war, that leads to extremity and speaks to those who want to either reverse it or deny its harm. I think that so much is covered over by the joy in victory. Warism becomes transcendent.</p>



<p>The Nazis believed that one could only be tested by war — that war-making was an ultimate human achievement. William James recognized the danger of that idea when he wrote about the “moral equivalent of war,” asking that people be conscripted not into the military but into communal forms of hard labor and survival in the wilderness. But war-making has always had an appeal that is difficult to resist.</p>



<figure class="wp-block-pullquote"><blockquote><p>Humiliation is an ever-potential source of violence.</p></blockquote></figure>



<p><strong>Crawford</strong>: We seem to be in a cultural moment when violence is alluded to, threatened, and ubiquitous. Do Trump&#8217;s appeals to violence offer some hope of something to his supporters? Why are people attracted to that? We haven&#8217;t talked enough about violence.</p>



<p><strong>Lifton</strong>: Violence is very, very important. James Gilligan, a psychiatrist with whom I&#8217;ve been friendly, studied violent people extensively and found that at the center of it was humiliation. There was personal humiliation in their lives that readily lent itself to violence. There can be collective humiliation on the part of countries, as Hitler claimed for Germany. Trump can tap the grievances of large numbers of people who feel they have been humiliated by intellectuals and scholars like ourselves, left out and ignored.</p>



<p>So humiliation is an ever-potential source of violence. But Trump has both threatened violence and initiated violence regularly against those who simply question his falsehoods. It&#8217;s reminiscent to me of a strange comparison: I had a Japanese friend who was anti-military and anti-emperor. During the postwar years, he spoke out against the emperor system, and when he did, he would find a note in his mailbox saying, “I heard you talk yesterday, I trust you and your family are well.” It was a thinly veiled threat to treat his family violently, not just him. So, the threat of violence can always be hovering in the Trumpist movement as well.</p>



<p><strong>Crawford</strong>: Do you think that more Americans are accepting of that violence after twenty years of war, or because of Vietnam?</p>



<p><strong>Lifton</strong>: Not accepting that violence, but more susceptible to its threat because of our history. We have had an enormous amount of violence, including the assassinations of the &#8217;60s. And the recent January 2021 calling forth of insurrectionists by Trump to storm the Capitol and allow in those who are armed. People are always concerned about the threat of violence, but Americans have reason for greater belief in its possibility.</p>



<p><strong>Crawford</strong>: This reminds me of Irving Janis&#8217;s work on groupthink. But it&#8217;s a little bit different in the sense that you&#8217;re saying it&#8217;s not just that people silence themselves; they actually come to believe.</p>



<p><strong>Lifton</strong>: You know, Janis was part of the Wellfleet meetings that I started with Erik Erikson in 1966 as a yearly seminar on the intersections of psychology and history. Gilligan came to those meetings as well. Janis talked about groupthink to us at Wellfleet. It does become a kind of reality in which those who start out skeptically do come to the thinking of the dominant group. Colin Powell was susceptible to groupthink when he testified falsely about weapons of mass destruction and chemical weapons in Iraq. He was, after all, a military person and an advocate of military loyalty to civilian control. In that case, his response to groupthink was catastrophic.</p>



<p><strong>Evangelista</strong>: We also have the example of Robert McNamara during the Gulf of Tonkin incidents, when he lied about the evidence and later admitted having done so, out of a misplaced notion that lying was the right thing to do for his country. We think of the invasion of Iraq and the run-up to the invasion of Iraq as a kind of inflection point at which truth became quite degraded, and maybe we&#8217;re still suffering the consequences of that. But in some respects, it goes further back, to the Vietnam War.</p>



<p><strong>Lifton</strong>: McNamara was very much compromised, both in relation to nuclear weapons and to the Vietnam War. Yet he turned around eventually and became critical of nuclear policy and war-making. I was in touch with someone who worked with him, UN-sponsored, and he described McNamara as quite reasonable in advocating peaceful directions. So Janis&#8217;s groupthink can work in different ways.</p>



<p><strong>Crawford</strong>: What do you think about Harold Lasswell&#8217;s idea of the “garrison state”? In Lasswell&#8217;s view, it is a “world in which the specialists on violence are the most powerful group in society,” and on the civilian side, where civil liberties like voting are essentially optional.</p>



<p><strong>Lifton</strong>: The garrison state does suggest militarism. And yes, it&#8217;s a close equivalent of the national security state with a military emphasis. Lasswell is partly right, but also turned out to be partly wrong in the sense that the military has more recently loomed large in questioning Trumpist efforts at seizing power. The military has held to subordinating itself to civilian control and has made statements against being used to suppress American protest, as Trump has suggested he would like to use it.</p>



<p><strong>Lifton</strong>: Let me conclude with a few simple thoughts. Wars seek to solve human problems, but never do. Rather, each war contributes to subsequent wars and general violence. Winners can experience dangerous forms of triumphalism, among them the fantasy of controlling the events of history. Losers are likely to invoke Rambo-like attempts to reverse the outcome. What is unacceptable psychologically is the idea that a large number of one&#8217;s nation&#8217;s men and women have “died in vain.”</p>



<p>There is always an early “war fever,” a widespread experience of transcendence with a glorification of a deadly version of patriotism. But soon afterwards comes the killing and dying. The chaos and violence of war lead to the emergence of dictators and of totalistic ideologies like communism and fascism.</p>



<p>Our task becomes that of breaking this collective vicious circle of violence by invoking diplomatic forms of interaction among nations and institutions within our own country that remain committed to truth-telling. The process is ongoing, a continuous dynamic of resistance to the rule of force by means of the rule of law.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em><strong>Neta C. Crawford</strong> is Professor of International Relations at the University of St. Andrews. </em></p>



<p><em><strong>Matthew Evangelista</strong> is President White Professor of History and Political Science Emeritus at Cornell University. </em></p>



<p><em><strong>Robert Jay Lifton </strong>(1926–2025) was Distinguished Professor Emeritus at John Jay College and the Graduate Center of the City University of New York and Lecturer in Psychiatry at Columbia University.</em></p>


]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/war-cover.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/war-cover.jpg" />                                        </item>
        			                <item>
                        <title>How Notting Hill Exposed Britain’s Postcolonial Crisis</title>
                        <link>https://thereader.mitpress.mit.edu/notting-hill-and-britains-post%e2%80%91imperial-identity-crisis/</link>
                        <pubDate>Thu, 05 Mar 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Nicholas Mirzoeff</dc:creator>
                        		<category><![CDATA[Britain]]></category>
		<category><![CDATA[Class]]></category>
		<category><![CDATA[Photography]]></category>
		<category><![CDATA[Race]]></category>
		<category><![CDATA[Media]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18940</guid>
                        <description><![CDATA[<p>Roger Mayne and Stuart Hall&#8217;s complementary visions reveal how racial animus in London reflected a deeper post-war crisis of whiteness and masculine identity.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Roger Mayne and Stuart Hall&#8217;s complementary visions reveal how racial animus in London reflected a deeper post-war crisis of whiteness and masculine identity.</p>

<figure class="wp-block-image">
<img width="700" height="523" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/WagtailSource-Women-and-children.width-1136.format-webp-700x523.webp" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>Roger Mayne, photo of Black and white children together in Southam Street, London, 1956. © Roger Mayne Archive / Mary Evans Picture Library.</figcaption>
</figure>

<p class="has-drop-cap">In the aftermath of World War II, the collapse of empire and the rise of postcolonial and anticolonial critiques forced many self-identified British people to reevaluate their national identity and self-worth. Among the conservative right in particular, the postwar period spurred a retreat into racial exclusion, as expressed by fascist Oswald Mosley’s then-popular slogan, “Keep Britain White” — the <a href="https://www.aljazeera.com/opinions/2025/7/29/the-uk-is-slipping-into-racist-dystopia" target="_blank" rel="noreferrer noopener nofollow">spirit</a> <a href="https://www.cnn.com/2025/12/12/uk/pink-ladies-britain-anti-immigration-intl" target="_blank" rel="noreferrer noopener nofollow">of</a> <a href="https://www.humanrightsresearch.org/post/the-rise-in-racial-tension-and-growing-anti-immigration-sentiment-in-the-united-kingdom" target="_blank" rel="noreferrer noopener nofollow">which</a> has resurged in U.K. politics today.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262047678/white-sight/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/white-sight-jkt-2-copy.jpg" alt="" class="wp-image-18974"/></a><figcaption class="wp-element-caption">This article is adapted from Nicholas Mirzoeff&#8217;s book “<a href="https://mitpress.mit.edu/9780262047678/white-sight/" target="_blank">White Sight.</a>”</figcaption></figure>
</div>


<p>Few places in Britain revealed such tensions more starkly than Notting Hill, a central London district home to many Caribbean migrants, and ultimately, the site of spectacular racist white violence in 1958–1959. The era’s collective memory owes much to the work of English photographer Roger Mayne (1929–2014), whose photos captured the neighborhood’s charged atmosphere, and to the analysis of the Black cultural theorist and teacher Stuart Hall (1932–2014).</p>



<p>For white politicians and social scientists, Notting Hill demonstrated the impossibility of a multicultural community. From an anticolonial perspective, community was a goal to be achieved in the future, even as the present remained structurally and systematically divided.</p>



<p>Hall responded to this discourse as a teacher, activist, and editor of “Universities and Left Review.” When the racist violence broke out in ’58, he was teaching “right at the bottom of the pile” in Kennington, South London. He followed some of his white students to Notting Hill after school. He wanted to witness what they described as “a bit of argy-bargy,” meaning low-intensity street violence. He watched them on street corners, shouting abuse at Black women walking home, in chorus with older white men sitting inside pubs. Back in school, he asked them why, and the response was, “They’re taking our women” — this from 14-year-olds — or “They’re taking our things.” When Hall, who was Jamaican, asked if this “they” included himself or the Black children in the class, they denied it. He felt that violence among the young was “not willful callousness, but a part of their predicament. It is a successful surrogate, a release — <em>for the [f]ew</em>.” All that school had really taught them was prejudice that found a socially sanctioned release in “Notting Hill.”</p>



<p>To Hall, Notting Hill wasn’t just one isolated “zone” within a largely stable metropolis; it <em>was </em>London. The entire city was in transition. And the prejudices and violence among the self-identified white population were, even more broadly, “the unmistakable profile of Britain’s colonial policy over the last century.”</p>



<figure class="wp-block-image size-full is-resized"><img loading="lazy" decoding="async" width="800" height="603" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/image-10.png" alt="" class="wp-image-18959" style="width:646px;height:auto"/><figcaption class="wp-element-caption">Roger Mayne, photo of children and Teddy Boys playing and hanging out in the streets of North Kensington, London, 1956. © Roger Mayne Archive / Mary Evans Picture Library.</figcaption></figure>



<p>What Hall also recognized about Notting Hill was that the teenagers (a term then used to refer to people aged 12–25) marauding its streets were all shaped in the postwar period. They had grown up in a climate of conformity, overweening bureaucracy, anti-intellectual education, and repetitive jobs. Their only outlet was what he called “mass entertainment” and consumerism on the U.S. model, much to their elders&#8217; disdain. Hall understood that culture (broadly defined) would play a key role, one absent from standard left analysis of the economy. As a former Henry James scholar, Hall had tried, without much success, to interest his students at school in Shakespeare. His writing shows detailed awareness of trends in fashion, music, slang, and what would come to be called subcultures among teenagers. After Notting Hill, Hall argued that it was the absence of a “common culture” that prevented the possibility of a democracy worth the name.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">As editor of “Universities and Left Review,” Hall explored Notting Hill with the help of photographs and writings by his friend Roger Mayne. Originally a chemist, Mayne had had some initial success as a photographer in the mode of Henri Cartier-Bresson’s project to capture the “<a href="https://www.icp.org/exhibitions/henri-cartier-bresson-decisive-moment" target="_blank" rel="nofollow">decisive moment</a>.” Now something of a photographic cliché, this idea was new in England at the time. Indeed, the Victoria and Albert Museum responded to his request that it collect photography by dismissing it as “a mechanical process into which the artist does not enter.” Mayne’s work focused on capturing decisive moments in modern British life. He already had work in the Museum of Modern Art’s collection, and in 1956 had an exhibit at none other than the Institute of Contemporary Arts in London.</p>



<p>That same year, Mayne began a six-year project photographing Southam Street in Notting Hill, one of the area’s most impoverished streets. By 1956, the once-elegant townhouses there were dilapidated, subdivided into multiple flats with only cold water and communal toilets on the landings, and rented to the urban dispossessed, including single parents, Irish people, and Caribbean migrants. Violence was part and parcel of life. (It was on the corner of Southam Street that Kelso Cochrane, a 32-year-old Antiguan carpenter, was murdered in a racially motivated attack; a year later, Mosley held a rally for his fascist Union Movement on the same spot.) Nonetheless, Mayne found a dynamism there that contrasted with the apathy of postwar Britain:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>My reason for photographing the poor streets is that I love them, and the life on them. . . . Empty, the streets have their own kind of beauty, a kind of decaying splendour, and always great atmosphere — whether romantic, on a hazy winter day, or listless when the summer is hot; sometimes it is forbidding; or it may be warm and friendly on a sunny spring weekend when the street is swarming with children playing, and adults walking through or standing gossiping. I remember my excitement when I turned a corner into Southam Street, a street I have since returned to again and again.</p>
</blockquote>



<p>Mayne’s photographs captured moments of street life that were unusually active because cars were banned from Southam Street as a “play street,” making it into an informal playground and youth center. Mayne lived locally in a classic two-up, two-down terraced house, which he had, in his own words, “‘modernised’ by the division of a first-floor room into kitchen and bathroom, and I had the two top-floor rooms knocked into one.” Mayne, too, was part of the transition — as a gentrifier.</p>



<p>He moved there because he considered himself an artist, and Notting Hill was an artists’ area. Painter Lucian Freud, grandson of the psychoanalyst, had a flat on Delamere Terrace, just a mile from Southam Street. The street faced the Regent’s Canal, and as Freud put it, “Delamere was extreme and I was conscious of this. A completely nonresidential area with violent neighbors. There was a sort of anarchic element of no one working for anyone.” He painted a young local man named Charlie Lumley in 1950–1951. Titled “Boy Smoking,” the small painting has a close focus on Lumley’s face, dominated by his blue eyes and full lips. Although he’s said to be smoking, his cigarette has gone out. Lumley’s look directly engages the artist/viewer with an affect that is at once vacant and queer. Freud used Lumley as an optic into working-class life, remarking, “Sometimes things that I didn’t understand Charlie would explain to me: the harsh ways and laws of the life [in Paddington], such as the things people were respected and despised for.”</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="602" height="799" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/image-6.png" alt="" class="wp-image-18949" style="width:350px;height:auto"/><figcaption class="wp-element-caption">Lucian Freud, “Boy Smoking,” 1950–1951. Oil on copper. © The Lucian Freud Archive / Bridgeman Images.</figcaption></figure>
</div>


<p>Unusually for Freud, the frame cuts into the boy&#8217;s head. Mayne appropriated both Freud’s subject matter of underclass London and his compositional technique. Mayne&#8217;s own photographs similarly let the frame cut through people or events. His focus is sometimes a little off, and his depth of field is sometimes forced. It is the uncertain dynamic of class and ethnic relations that gives his photographs their energy. Despite many formal differences between Freud and Mayne, they shared a common concern with how humans might connect and communicate across class, gender, and racial divides in the postwar, decolonial moment.</p>



<p>Mayne’s Notting Hill photographs depict predominantly white-appearing teenagers, mothers, and children, but few adult men. A minority of Caribbean people of all ages were mixed unevenly among them. Some recent commentaries have sought to interpret Mayne’s work as addressing class to the extent that it even “transcends race.” To the contrary, Mayne’s photographs perhaps unwittingly illuminate Hall’s signature insight, derived from his own experience as a migrant: “Race is the modality in which class is lived. It is also the medium in which class relations are experienced.” None of the invented traditions of working-class belonging can be seen here — flat caps, football support, going to pubs, and the labor movement — except for the respectably dressed young children, who are nevertheless out on the street.</p>



<p>In the transitional time-space of Notting Hill, English, Scottish, and Welsh working-class people came to see themselves first and foremost as “white,” or more precisely, to be British was to be white. Their Caribbean counterparts came to understand, as Hall himself had done, that they were seen first and foremost as “Black,” a category that made any claim on Britishness partial at best in white eyes.</p>



<figure class="wp-block-pullquote"><blockquote><p>Notting Hill wasn’t just one isolated “zone” within a largely stable metropolis. The entire city was in transition.</p></blockquote></figure>



<p>In a 1956 photograph, Mayne recorded four smartly dressed West Indian men walking down Southam Street. The landscape format of the picture evoked what he called the “atmosphere” and “beauty” of this urban space from which the eye has no exit; even the windows are impenetrable, thanks to thick net curtains. The even grayness of Mayne’s work unconsciously records the permanent London smog of those coal-fired days. The men pass among four visibly white boys, each of whom is staring fixedly at them, as if in illustration of Frantz Fanon’s “<a href="https://monoskop.org/images/a/a5/Fanon_Frantz_Black_Skin_White_Masks_1986.pdf" target="_blank" rel="nofollow">Look! A Negro</a>” scene. None of the men pays them any attention. One looks over at the photographer, his glance mingling caution and good humor.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="799" height="615" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/image-5.png" alt="" class="wp-image-18948" style="width:579px;height:auto"/><figcaption class="wp-element-caption">Roger Mayne, photo of West Indians in Southam Street, 1956. © Roger Mayne Archive / Mary Evans Picture Library.</figcaption></figure>
</div>


<p>Labour politician Alan Johnson grew up on Southam Street, and in his childhood memoir, he observed how the men in Mayne’s photograph are “on the look-out for trouble as they head towards a group of young guys gathered around the steps leading up to a front door. Youths with grey pinched faces who don’t yet seem to have noticed the quartet ambling towards them.” Looking was confrontational, as Johnson described: “When to look, when not to look and indeed how to look was a complex skill, acquired through trial and error. It was important to appear ‘hard.’” If the youths on the stoop were to meet the look of the walking men, they would then be required to challenge them: “Oi ‘Oo you screwin’?” where “screwing” means to look.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">In 1959, Stuart Hall assessed that support for Oswald Mosley — who led Britain’s fascist Union Movement, which was virulently anti-immigrant — “is far higher than the voting figures suggest.” Indeed, the masculine fascination with fascism, to borrow from <a href="https://thereader.mitpress.mit.edu/susan-sontag-a-critic-at-the-crossroads-of-culture/">Susan Sontag</a>, went much further than the ballot box, as our own time has sadly borne out. Accordingly, the street became a place of danger when men performed racialized masculinity in it. Consider Mayne’s photograph of a line of Teds (or Teddy Boys, a white subculture that wore dark jackets, white shirts, and crepe-soled shoes) blocking off Southam Street. As one of them gestures at the camera with a long stick, this potential violence becomes visible. Jewish East End poet Emanuel Litvinoff remembered similar tactics in the 1930s when he and a friend ventured into neighboring Hoxton, hoping to impress local young women: “Instead of attracting female attention, we ran into a gang of youths who spread themselves across the pavement, and told us to get back to Palestine.”</p>



<figure class="wp-block-image size-full is-resized"><img loading="lazy" decoding="async" width="1200" height="906" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/image-7.png" alt="" class="wp-image-18950" style="width:686px;height:auto"/><figcaption class="wp-element-caption">Roger Mayne, photo of a large gang of youthful “Teds” in North Kensington, 1956. © Roger Mayne Archive / Mary Evans Picture Library.</figcaption></figure>



<p>This informal color bar was reinforced by the police, occasionally seen in Mayne’s photographs. The police, then and now, were reluctant to get involved in “domestics,” meaning violence or abuse within the home. What can’t be seen in Mayne’s photographs is the “drunk and violent” behavior of white men in their flats toward women and children. Mayne believed it was “unhappiness, loneliness and lack of human contact that characterises the English.” This unhappiness was registered obliquely in terms of sexuality, as he referred to the landmark 1957 British government <a href="https://en.wikipedia.org/wiki/Wolfenden_report" target="_blank" rel="nofollow">report</a> that had recommended decriminalizing male homosexuality and prostitution. A new morality would require what Mayne termed an acceptance of “the less normal but inevitable sides to sexual behavior.”</p>



<p>These dynamics were directly addressed in Colin MacInnes’s 1959 novel about Notting Hill, “Absolute Beginners.” At first, “Absolute Beginners” has no foreboding to speak of, instead depicting a fast-moving youth culture, where drug use and a range of sexual expression were common. It narrates the life of migrants, subcultures, media professionals, and part-timers (a key source of revenue for many migrant writers), and the new “teenagers” in a postwar London, then as now, clinging to delusions of imperial grandeur. The anonymous teenager who narrates is a photographer, Jewish by descent through his mother. He has Jewish friends and observes that if the Jews were to leave London, so would he. His photographic work is commercial, mostly fashion and some TV, but backed up by a lucrative line in pornography. The photographer really wants to be an artist, like Mayne, Freud, or many other Notting Hill bohemians. His friends include Big Jill, a lesbian organizer of sex work; Mr. Cool, a “mixed-race” teenager; and Hoplite, a queer media personality and party organizer. </p>



<p>In short, this was not the standard character group in the 1950s English literary novel. In search of becoming an artist, the narrator nonetheless poses young models in scenes said to evoke Englishness, such as on a rowing boat.</p>



<p>Suddenly, the writing takes on an altogether different tone, and the narrator tries to explain it. Referring to Notting Hill as “Napoli,” the Italian name for Naples, he writes: “You could feel a <em>hole:</em> as if some kind of life were draining out of it, leaving a sort of vacuum in the streets and terraces. And what made it somehow worse was that, as you looked around, you could see the people hadn’t yet noticed the alteration, even though it was so startling to you.” In the end, “Absolute Beginners” is rightly remembered for its concluding depiction of the violence in Notting Hill.</p>



<figure class="wp-block-pullquote"><blockquote><p>In the transitional time-space of Notting Hill, to be British was to be white.</p></blockquote></figure>



<p>In Notting Hill, as MacInnes imagined it, the Jewish photographer, seeing white violence directed at Caribbean residents, can no longer sustain the illusion that Notting Hill enabled a “disjointed form of communal sociality.” The racist violence opened a hole in white reality. What happened to Black people in Notting Hill allowed the narrator to see how he was perceived and to feel the “alteration” that had always been there from their perspective. This tear in white reality was shocking and “startling,” for white people at least.</p>



<p>Seen from the radical perspectives of the 1960s and 1970s, this rupture in whiteness would have seemed like a small harbinger of what was to come. Then again, the resurgence of white racism from Margaret Thatcher to Brexit and its now-routine expression on social media suggests that the reaction, at first slow in the making, has been more and more forceful. As Hall observed, “Empires come and go. But the imagery of the British Empire seems destined to go on forever. The imperial flag has been hauled down in a hundred different corners of the globe. But it is still flying in the collective unconscious.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Nicholas Mirzoeff</em></strong><em> is a professor and the chair of New York University&#8217;s Department of Media, Culture, and Communication. His writing has appeared in the Guardian, the New York Times, Hyperallergic, and the Los Angeles Review of Books. He is also the author of many books, including “</em><a href="https://www.hachettebookgroup.com/titles/nicholas-mirzoeff/how-to-see-the-world/9780465096015/?lens=basic-books" target="_blank" rel="nofollow"><em>How to See the World</em></a><em>,” “</em><a href="https://www.dukeupress.edu/the-right-to-look" target="_blank" rel="nofollow"><em>The Right to Look</em></a><em>,” and “</em><a href="https://mitpress.mit.edu/9780262047678/white-sight/" target="_blank"><em>White Sight</em></a><em>,” from which this article is adapted.</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/WagtailSource-Women-and-children.width-1136.format-webp.webp" length="50000" type="image/webp"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/WagtailSource-Women-and-children.width-1136.format-webp.webp" />                                        </item>
        			                <item>
                        <title>How Albert Einstein Found Faith at the Edge of Reason</title>
                        <link>https://thereader.mitpress.mit.edu/how-albert-einstein-found-faith-at-the-edge-of-reason/</link>
                        <pubDate>Mon, 02 Mar 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Gerald Holton</dc:creator>
                        		<category><![CDATA[Albert Einstein]]></category>
		<category><![CDATA[God]]></category>
		<category><![CDATA[Physics]]></category>
		<category><![CDATA[Religion]]></category>
		<category><![CDATA[Philosophy]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19112</guid>
                        <description><![CDATA[<p>Though wary of organized religion, the physicist believed that the harmony of universal laws pointed to a higher power.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Though wary of organized religion, the physicist believed that the harmony of universal laws pointed to a higher power.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Albrrt-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>Source: Library of Congress. Gifted by Harris &amp; Ewing, Inc. 1955.</figcaption>
</figure>

<p><em>This article first appeared in a 2003 issue of <a href="https://direct.mit.edu/daed" target="_blank">Daedalus</a>, under the title “Einstein’s Third Paradise.&#8221;</em></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Historians of modern science have good reason to be grateful to Paul Arthur Schilpp, professor of philosophy and Methodist clergyman, but better known as the editor of a series of volumes on “Living Philosophers,” which included several volumes on scientist-philosophers. His motto was: “The asking of questions about a philosopher’s meaning while he is alive.” And to his everlasting credit, he persuaded Albert Einstein to do what he had resisted all his years: to sit down to write, in 1946 at age 67, an extensive autobiography — 45 pages long in print.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://direct.mit.edu/daed" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Daedalus.jpg" alt="" class="wp-image-19128"/></a></figure>
</div>


<p>To be sure, Einstein excluded most of what he called “the merely personal.” But on the very first page, he shared a memory that will guide us to the main conclusion of this essay. He wrote that when still very young, he had searched for an escape from the seemingly hopeless and demoralizing chase after one’s desires and strivings. That escape offered itself first in religion. Although brought up as the son of “entirely irreligious (Jewish) parents,” through the teaching in his Catholic primary school, mixed with his private instruction in elements of the Jewish religion, Einstein found within himself a “deep religiosity” — indeed, “the religious paradise of youth.”</p>



<p>The accuracy of this memorable experience is documented in other sources, including the biographical account of Einstein’s sister, Maja. There she makes a plausible extrapolation: that Einstein’s “religious feeling” found expression in later years in his deep interest and actions to ameliorate the difficulties to which fellow Jews were being subjected, actions ranging from his fights against anti-Semitism to his embrace of Zionism (in the hope, as he put it in one of his speeches [April 20, 1935], that it would include a “peaceable and friendly cooperation with the Arab people”). As we shall see, Maja’s extrapolation of the reach of her brother’s early religious feelings might well have gone much further.</p>



<p>The primacy of young Albert’s First Paradise came to an abrupt end. As he put it early in his “Autobiographical Notes,” through reading popular science books, he came to doubt the stories of the Bible. Thus, he passed first through what he colorfully described as a “positively fanatic indulgence in free thinking.”</p>



<p>But then he found new enchantments. First, at age 12, he read a little book on Euclidean plane geometry — he called it “holy,” a veritable “Wunder.” Then, still as a boy, he became entranced by the contemplation of that huge external, extra-personal world of science, which presented itself to him “like a great, eternal riddle.” To that study one could devote oneself, finding thereby “inner freedom and security.” He believed that choosing the “road to this Paradise,” although quite antithetical to the first one and less alluring, did prove itself trustworthy. Indeed, by age 16, he had his father declare him to the authorities as “without confession,” and for the rest of his life, he tried to dissociate himself from organized religious activities and associations, inventing his own form of religiousness, just as he was creating his own physics.</p>



<p>These two realms appeared to him eventually not as separate as numerous biographers would suggest. On the contrary, my task here is to demonstrate that at the heart of Einstein’s mature identity there developed a fusion of his First and his Second Paradise — into a Third Paradise, where the meaning of a life of brilliant scientific activity drew on the remnants of his fervent first feelings of youthful religiosity.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">For this purpose, we shall have to make what may seem like an excursus, but one that will, in the end, throw light on his overwhelming passion, throughout his scientific and personal life, to bring about the joining of these and other seemingly incommensurate aspects, whether in nature or society.</p>



<p>In 1918, he gave a glimpse of it in a speech (“<em>Prinzipien der Forschung</em>”) honoring the 60th birthday of his friend and colleague <a href="https://mitpress.mit.edu/9780262047043/on-the-trail-of-blackbody-radiation/" target="_blank">Max Planck</a>, to whose rather metaphysical conception about the purpose of science Einstein had drifted while moving away from the quite opposite, positivistic one of an early intellectual mentor, Ernst Mach. As Einstein put it in that speech, the search for one “simplified and lucid image of the world” not only was the supreme task for a scientist, but also corresponded to a psychological need: to flee from personal, everyday life, with all its dreary disappointments, and escape into the world of objective perception and thought. Into the formation of such a world picture, the scientist could place the “center of gravity of his emotional life [<em>Gefühlsleben</em>].” And in a sentence with special significance, he added that persevering on the most difficult scientific problems requires “a state of feeling [<em>Gefühlszustand</em>] similar to that of a religious person or a lover.”</p>



<p>Throughout Einstein’s writings, one can watch him searching for that world picture, for a comprehensive <em>Weltanschauung</em>, one yielding a total conception that, as he put it, would include every empirical fact (<em>Gesamtheit der Erfahrungstatsachen</em>) — not only of physical science, but also of life.</p>



<figure class="wp-block-pullquote"><blockquote><p>He tried to dissociate himself from organized religious activities and associations, inventing his own form of religiousness.</p></blockquote></figure>



<p>Einstein was, of course, not alone in this pursuit. The German literature of the late 19th and early 20th centuries contained a seemingly obsessive flood of books and essays on the oneness of the world picture. They included writings by both Ernst Mach and Max Planck, and, for good measure, a 1912 general manifesto appealing to scholars in all fields of knowledge to combine their efforts in order to “bring forth a comprehensive <em>Weltanschauung</em>.” The 34 signatories included Ernst Mach, Sigmund Freud, Ferdinand Tönnies, David Hilbert, Jacques Loeb — and the then still little-known Albert Einstein.</p>



<p>But while for most others this culturally profound longing for unity — already embedded in the philosophical and literary works they all had studied — was mostly the subject of occasional exhortation (nothing came of the manifesto), for Einstein it was different: a constant preoccupation responding to a persistent, deeply felt intellectual and psychological need.</p>



<p>This fact can be most simply illustrated in Einstein’s scientific writings. As a first example, I turn to one of my favorite manuscripts in his archive. It is a lengthy manuscript in his handwriting, of around 1920, titled, in translation, “Fundamental Ideas and Methods of Relativity.” It contains the passage in which Einstein revealed what, in his words, was “the happiest thought of my life [<em>der gluecklichste Gedanke meines Lebens</em>]” — a thought experiment that came to him in 1907: nothing less than the definition of the <a href="https://en.wikipedia.org/wiki/Equivalence_principle" target="_blank" rel="nofollow">equivalence principle</a>, later developed in his general relativity theory.</p>



<p>It occurred to Einstein — thinking first of all in visual terms, as was usual for him — that if a man were falling from the roof of his house and tried to let anything drop, it would only move alongside him, thus indicating the equivalence of acceleration and gravity. In Einstein’s words, “the acceleration of free fall with respect to the material is therefore a mighty argument that the postulate of relativity is to be extended to coordinate systems that move nonuniformly relative to one another . . . . ”</p>



<p>For the present purpose, I want to draw attention to another passage in that manuscript. His essay begins in a largely impersonal, pedagogic tone, similar to that of his first popular book on relativity, published in 1917. But in a surprising way, in the section titled “General Relativity Theory,” Einstein suddenly switches to a personal account. He reports that in the construction of the special theory, the “thought concerning the <a href="https://en.wikipedia.org/wiki/Faraday%27s_law_of_induction" target="_blank" rel="nofollow">Faraday [experiment]</a> on electromagnetic induction played for me a leading role.” He then describes that old experiment, in words similar to the first paragraph of his 1905 relativity paper, concentrating on the well-known fact, discovered by Faraday in 1831, that the induced current is the same whether it is the coil or the magnet that is in motion relative to the other, whereas the “theoretical interpretation of the phenomenon in these two cases is quite different.”</p>



<p>While other physicists, for many decades, had been quite satisfied with that difference, here Einstein reveals a central preoccupation at the depth of his soul: “The thought that one is dealing here with two fundamentally different cases was for me unbearable [<em>war mir unertraeglich</em>]. The difference between these two cases could not be a real difference . . . . The phenomenon of the electromagnetic induction forced me to postulate the (special) relativity principle.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Let us step back for a moment to contemplate that word “unbearable.” It is reinforced by a passage in Einstein’s “Autobiographical Notes”: “By and by I despaired [<em>verzweifelte ich</em>] of discovering the true laws by means of constructive efforts based on known facts. The longer and the more despairingly I tried, the more I came to the conviction that only the discovery of a universal formal principle could lead us to assured results.” He might have added that the same postulational method had already been pioneered in their main works by two of his heroes, Euclid and Newton. </p>



<p>Other physicists — for example, Niels Bohr and Werner Heisenberg — also reported that at times they were brought to despair in their research. Still other scientists were evidently even brought to suicide by such disappointment. For researchers fiercely engaged at the very frontier, the psychological stakes can be enormous. Einstein resolved his discomfort, as he did in his 1905 relativity paper, by turning to the <em>postulation</em> of two fundamental principles (the principle of relativity in physics and the constancy of the velocity of light in vacuo), adopting them as tools of thought.</p>



<p>Einstein also had a second method to bridge the unbearable differences in a theory: <em>generalizing it</em>, so that the apparently differently grounded phenomena are revealed to be coming from the same base. We know from a letter to Max von Laue of January 17, 1952, found in the archive, that Einstein’s early concern with the physics of <a href="https://en.wikipedia.org/wiki/Quantum_fluctuation" target="_blank" rel="nofollow">fluctuation phenomena</a> was the common root of <a href="https://en.wikipedia.org/wiki/Annus_mirabilis_papers" target="_blank" rel="nofollow">his three great papers</a> of 1905, on such different topics as the quantum property of light, Brownian movement, and relativity.</p>



<figure class="wp-block-pullquote"><blockquote><p>For researchers fiercely engaged at the very frontier, the psychological stakes can be enormous.</p></blockquote></figure>



<p>But even earlier, in a letter of April 14, 1901, to his school friend Marcel Grossmann, Einstein had revealed his generalizing approach to physics while working on his very first published paper, on capillarity. There, he tried to bring together in one theory the opposing behaviors of bodies: moving upward when a liquid is in a capillary tube, but downward when the liquid is released freely. </p>



<p>In that letter, he spelled out his interpenetrating emotional and scientific needs in one sentence: “It is a wonderful feeling [<em>ein herrliches Gefühl</em>] to recognize the unity of a complex of appearances which, to direct sense experiences, appear to be quite separate things.” The postulation of universal formal principles, and the discovery among phenomena of a unity, of <em>Einheitlichkeit</em>, through the <em>generalization </em>of the basic theory — those were two of Einstein’s favorite weapons, as his letters and manuscripts show. Writing to Willem de Sitter on November 4, 1916, he confessed: “I am driven by my need to generalize [<em>mein Verallgemeinerungsbeduerfnis</em>].” That need, that compulsion, was also deeply entrenched in German culture and resonated with, and supported, Einstein’s approach.</p>



<p>Let me just note in passing that while still a student at the Polytechnic Institute in Zurich, in order to get his certificate to be a high school science teacher, Einstein took optional courses on Immanuel Kant and Goethe, whose central works he had studied since his teenage years. That <em>Verallgemeinerungsbeduerfnis</em> was clearly a driving force behind Einstein’s career trajectory. </p>



<p>Thus, he generalized from old experimental results, like Faraday’s, to arrive at special relativity, in which he unified space and time, electric and magnetic forces, energy and mass, and so resolved the whole long dispute among scientists between adherence to a mechanistic versus an electromagnetic world picture. Then he <em>generalized </em>the special theory to produce what he first significantly called, in an article of 1913, not the <em>general</em> but the <em>generalized </em><a href="https://en.wikipedia.org/wiki/Theory_of_relativity" target="_blank" rel="nofollow">relativity theory</a>. Paul Ehrenfest wrote him in puzzlement: “How far will this <em>Verallgemeinerung </em>go on?”</p>



<p>And, finally, Einstein threw himself into the attempt at a grand unification of quantum physics and gravity: a unified field theory. It is an example of an intense, perhaps unique, lifelong and tenacious dedication — one that, despite Einstein’s failure at the very end, nevertheless set the stage, as a program, for the ambition of some of today’s best scientists, who have taken over that search for the Holy Grail of physics: a theory of everything.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">So much for trying to get a glimpse of the mind of Einstein as a scientist. But at this point, for anyone who has studied this man’s work and life in detail, a new thought urges itself forward. As in his science, Einstein also <em>lived</em> under the compulsion to unify — in his politics, in his social ideals, even in his everyday behavior. He abhorred all nationalisms, and called himself, even while in Berlin during World War I, a European.</p>



<p>Later, he supported the One World movement, dreamed of a unified supernational form of government, helped to initiate the international Pugwash movement of scientists during the Cold War, and was as ready to befriend visiting high school students as the Queen of the Belgians. His instinctive penchant for democracy and dislike of hierarchy and class differences must have cost him greatly in the early days, as when he addressed his chief professor at the Swiss Polytechnic Institute, on whose recommendation his entrance to any academic career would depend, not by any title, but simply as “Herr Weber.”</p>



<p>And at the other end of the spectrum, in his essay on ethics, Einstein cited Moses, Jesus, and Buddha as equally valid prophets. No boundaries, no barriers; none in life, as there are none in nature. Einstein’s life and his work were so mutually resonant that we recognize both to have been carried on together in the service of one grand project — the fusion into one coherency.</p>



<p>There were also no boundaries or barriers between Einstein’s scientific and religious feelings. After having passed from the youthful first, religious paradise into his second, immensely productive scientific one, he found in his middle years a fusion of those two motivations — his Third Paradise. We had a hint of this development in his remark in 1918, in which he observed the parallel states of feeling of the scientist and of the “religious person.” Other hints come from the countless, well-known quotations in which Einstein referred to God — doing it so often that Niels Bohr had to chide him. Karl Popper remarked that in conversations with Einstein, “I learned nothing . . . . he tended to express things in theological terms, and this was often the only way to argue with him. I found it finally quite uninteresting.”</p>



<figure class="wp-block-pullquote"><blockquote><p>There were no boundaries or barriers between Einstein’s scientific and religious feelings.</p></blockquote></figure>



<p>But two other reports may point to the more profound layer of Einstein’s deepest convictions. One is his remark to one of his assistants, Ernst Straus: “What really interests me is whether God had any choice in the creation of the world.” The second is Einstein’s reply to a curious telegram. In 1929, Boston’s Cardinal O’Connell branded Einstein’s theory of relativity as “befogged speculation producing universal doubt about God and His Creation,” and as implying “the ghastly apparition of atheism.” In alarm, New York’s Rabbi Herbert S. Goldstein asked Einstein by telegram: “Do you believe in God? Stop. Answer paid 50 words.”</p>



<p>In his response, for which Einstein needed but 25 (German) words, he stated his beliefs succinctly: “I believe in Spinoza’s God, Who reveals Himself in the lawful harmony of the world, not in a God Who concerns Himself with the fate and the doings of mankind.” The rabbi cited this as evidence that Einstein was not an atheist, and further declared that “Einstein’s theory, if carried to its logical conclusion, would bring to mankind a scientific formula for monotheism.” Einstein wisely remained silent on that point.</p>



<p>The good rabbi might have had in mind the writings of the Religion of Science movement, which had flourished in Germany under the distinguished auspices of Ernst Haeckel, Wilhelm Ostwald, and their circle (the <em>Monistenbund</em>), and also in America, chiefly in Paul Carus’s books and journals, such as “The Open Court,” which carried the words “Devoted to the Religion of Science” on its masthead.</p>



<p>If Einstein had read Carus’s book, “The Religion of Science” (1893), he may have agreed with one sentence in it: “Scientific truth is not profane, it is sacred.” Indeed, the charismatic view of science in the lives of some scientists has been the subject of much scholarly study — for example, in Joseph Ben-David’s “Scientific Growth” (1991), and earlier in Robert K. Merton’s magisterial book of 1938, “Science, Technology and Society in Seventeenth Century England.”</p>



<p>In the section entitled “The Integration of Religion and Science,” Merton notes that among the scientists he studied, “the religious ethic, considered as a social force, so consecrated science as to make it a highly respected and laudable focus of attention.” The social scientist Bernard H. Gustin elaborated on this perception, writing that science at the highest level is charismatic because scientists devoted to such tasks are “thought to come into contact with what is essential in the universe.” I believe this is precisely why so many who knew little about Einstein’s scientific writing flocked to catch a glimpse of him and to this day feel somehow uplifted by contemplating his iconic image.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Starting in the late 1920s, Einstein became more and more serious about clarifying the relationship between his transcendental and his scientific impulses. He wrote several essays on religiosity; five of them, composed between 1930 and the early 1950s, are reproduced in his book “Ideas and Opinions.” </p>



<p>In those chapters, we can watch the result of a struggle that had its origins in his school years, as he developed, or rather invented, a religion that offered a union with science. In the evolution of religion, he remarked, there were three developmental stages. At the first, “with primitive man it is above all fear that evokes religious notions. This ‘religion of fear’. . . is in an important degree stabilized by the formation of a special priestly caste” that colludes with secular authority to take advantage of it for its own interest. The next step — “admirably illustrated in the Jewish scriptures” — was a moral religion embodying the ethical imperative, “a development [that] continued in the New Testament.”</p>



<p>Yet it had a fatal flaw: “the anthropomorphic character of the concept of God,” easy to grasp by the “underdeveloped minds” of the masses, while freeing them of responsibility. This flaw disappears at Einstein’s third, mature stage of religion, toward which he believed mankind was now reaching and which the great spirits (he names Democritus, St. Francis of Assisi, and Spinoza) had already attained — namely, the “cosmic religious feeling” that sheds all anthropomorphic elements.</p>



<p>In describing the driving motivation toward that final, highest stage, Einstein uses the same ideas, even some of the same phrases, with which he had celebrated first his religious and then his scientific paradise: “The individual feels the futility of human desires and aims, and the sublimity and marvelous order which reveal themselves both in nature and in the world of thought.” “Individual existence impresses him as a sort of prison, and he wants to experience the universe as a single, significant whole.” Of course! Here, as always, there has to be the intoxicating experience of unification.</p>



<p>And so Einstein goes on, “I maintain that the cosmic religious feeling is the strongest and noblest motive for scientific research . . . . A contemporary has said not unjustly that in this materialistic age of ours the serious scientific workers are the only profoundly religious people.” In another of his essays on religion, Einstein points to a plausible source for his specific formulations: “Those individuals to whom we owe the great creative achievements of science were all of them imbued with a truly religious conviction that this universe of ours is something perfect, and susceptible through the rational striving for knowledge. If this conviction had not been a strongly emotional one, and if those searching for knowledge had not been inspired by Spinoza’s <em>amor dei intellectualis</em>, they would hardly have been capable of that untiring devotion which alone enables man to attain his greatest achievements.”</p>



<figure class="wp-block-pullquote"><blockquote><p>“I believe in Spinoza’s God, Who reveals Himself in the lawful harmony of the world.”</p></blockquote></figure>



<p>I believe we can guess at the first time Einstein read Baruch Spinoza’s “Ethics” (<em>Ethica Ordine Geometrico Demonstrata</em>), a system constructed on the Euclidean model of deductions from propositions. Soon after getting his first real job at the patent office, Einstein joined with two friends to form a discussion circle, meeting once or twice a week in what they called, with gallows humor, the <em>Akademie Olympia</em>. We know the list of books they read and discussed. High among them, reportedly at Einstein’s suggestion, was Spinoza’s “Ethics,” which he read afterwards several times more. Even when his sister Maja joined him in Princeton in later life and was confined to bed by an illness, he thought that reading a good book to her would help, and chose Spinoza’s “Ethics” for that purpose.</p>



<p>By that time, Spinoza’s work and life had long been important to Einstein. He had written an introduction to a biography of Spinoza (by his son-in-law, Rudolf Kayser, 1946); he had contributed to the “Spinoza Dictionary” (1951); he had referred to Spinoza in many of his letters; and he had even composed a poem in Spinoza’s honor. He admired Spinoza for his independence of mind, his deterministic philosophical outlook, his skepticism about organized religion and orthodoxy — which had resulted in his excommunication from his synagogue in 1656 — and even for his ascetic preference, which compelled him to remain in poverty and solitude to live in a sort of spiritual ecstasy, instead of accepting a professorship at the University of Heidelberg. </p>



<p>Originally neglected, Spinoza’s “Ethics,” published only posthumously, profoundly influenced other thinkers, such as Friedrich Schlegel, Friedrich Schleiermacher, Goethe (who called him “our common saint”), Albert Schweitzer, and Romain Rolland (who, on reading “Ethics,” confessed, “I deciphered not what he said, but what he meant to say”).</p>



<p>For Spinoza, God and nature were one (<em>deus sive natura</em>). True religion was based not on dogma but on a feeling for the rationality and the unity underlying all finite and temporal things, on a feeling of wonder and awe that <em>generates</em> the idea of God, but a God which lacks any anthropomorphic conception. As Spinoza wrote in Proposition 15 in “Ethics,” he opposed assigning to God “body and soul and being subject to passions.” Hence, “God is incorporeal” — as had been said by others, from Maimonides on, to whom God was knowable indirectly through His creation, through nature.</p>



<p>In other pages of “Ethics,” Einstein could read Spinoza’s opposition to the idea of cosmic purpose, and that he favored the primacy of the law of cause and effect — an all-pervasive determinism that governs nature and life — rather than “playing at dice,” in Einstein’s famous remark. And as if he were merely paraphrasing Spinoza, Einstein wrote in 1929 that the perception in the universe of “profound reason and beauty constitute true religiosity; in this sense, and in this sense alone, I am a deeply religious man.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Much has been written about the response of Einstein’s contemporaries to his Spinozistic cosmic religion. For example, the physicist Arnold Sommerfeld recorded in Schilpp’s volume that he often felt “that Einstein stands in a particularly intimate relation to the God of Spinoza.” </p>



<p>But what finally most interests us here is to what degree Einstein, having reached his Third Paradise, in which his yearnings for science and religion are joined, may even have found in his own research in physics fruitful ideas emerging from that union. In fact, there are at least some tantalizing parallels between passages in Spinoza’s “Ethics” and Einstein’s publications in cosmology — parallels that the physicist and philosopher Max Jammer, in his book “Einstein and Religion” (1999), considers to amount to intimate connections. For example, in Part I of “Ethics” (“Concerning God”), Proposition 29 begins: “In nature there is nothing contingent, but all things are determined from the necessity of the divine nature to exist and act in a certain manner.”</p>



<p>Here is at least a discernible overlap with Einstein’s tenacious devotion to determinism and strict causality at the fundamental level, despite all the proofs from quantum mechanics of the reign of probabilism, at least in the subatomic realm. There are other such parallels throughout. </p>



<p>But what is considered by some as the most telling relationship between Spinoza’s Propositions and Einstein’s physics comes from passages such as Corollary 2 of Proposition 20: “It follows that God is immutable or, which is the same thing, all His attributes are immutable.” In a letter of September 3, 1915, to Else (his cousin and later his wife), Einstein, having read Spinoza’s “Ethics” again, wrote, “I think the ‘Ethics’ will have a permanent effect on me.” Two years later, when he expanded his general relativity to include “cosmological considerations,” Einstein found to his dismay that his system of equations did “not allow the hypothesis of a spatially closed-ness of the world [<em>raeumliche Geschlossenheit</em>].”</p>



<p>How did Einstein cure this flaw? By something he had done very rarely: making an ad hoc addition, purely for convenience: “We can add, on the left side of the field equation, a — for the time being — unknown universal constant, −λ.”</p>



<figure class="wp-block-pullquote"><blockquote><p>“The cosmic religious feeling is the strongest and noblest motive for scientific research.”</p></blockquote></figure>



<p>In fact, it seems that not much harm is done thereby. It does not change the covariance; it still corresponds with the observation of motions in the solar system (“as long as λ is small”), and so forth. Moreover, the proposed new universal constant λ also determines the average density of the universe with which it can remain in equilibrium, and provides the radius and volume of a presumed spherical universe. Altogether a beautiful, immutable universe — one an immutable God could be identified with.</p>



<p>But in 1922, Alexander Friedmann showed that the equations of general relativity did allow expansion or contraction. And in 1929, Edwin Hubble established through astronomical observations that the universe does expand. Thus, Einstein — at least according to the physicist George Gamow — remarked that “inserting λ was the biggest blunder of my life.”</p>



<p>Max Jammer and the physicist John Wheeler, both of whom knew Einstein, traced his unusual ad hoc insertion of λ, nailing down that “spatially closed-ness of the world,” to a relationship between Einstein’s thoughts and Spinoza’s Propositions. They also pointed to another possible reason for it: In Spinoza’s writings, one finds the concept that God would not have made an empty world. But in an expanding universe, in the infinity of time, the density of matter would be diluted to zero in the limit. Space itself would disappear, since, as Einstein put it in 1952, “On the basis of the general theory of relativity . . . space as opposed to ‘what fills space’. . . had no separate existence.”</p>



<p>Even if all of these suggestive indications of an intellectual, emotional, and perhaps even spiritual resonance between Einstein’s and Spinoza’s writings were left entirely aside, there still remains Einstein’s attachment to his “cosmic religion.” That was the end point of his own troublesome pilgrimage in religiosity — from his early vision of his First Paradise, through his disillusionments, to his dedication to finding fundamental unity within natural science, and at last to his recognition of science as the devotion, in his words, of “a deeply religious unbeliever” — his final embrace of seeming incommensurables in his Third Paradise.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em><strong>Gerald Holton</strong> is Mallinckrodt Professor of Physics and Professor of the History of Science, Emeritus, at Harvard University. He contributed to Gerhard Sonnert&#8217;s book, “<a href="https://mitpress.mit.edu/9780262536929/ivory-bridges/" target="_blank">Ivory Bridges</a>.” He has also written numerous books and articles on Albert Einstein&#8217;s scientific contributions, including <a href="https://direct.mit.edu/daed/article/132/4/26/26572/Einstein-s-Third-Paradise" target="_blank">one</a> in the journal <a href="https://direct.mit.edu/daed" target="_blank">Daedalus</a>, from which this article is excerpted.  </em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Albrrt.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Albrrt.jpg" />                                        </item>
        			                <item>
                        <title>Bridgebuilders and Historians Turned Metal Into Myth</title>
                        <link>https://thereader.mitpress.mit.edu/bridgebuilders-and-historians-turned-metal-into-myth/</link>
                        <pubDate>Thu, 26 Feb 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Gregory Dreicer</dc:creator>
                        		<category><![CDATA[Bridges]]></category>
		<category><![CDATA[Engineering]]></category>
		<category><![CDATA[Evolution]]></category>
		<category><![CDATA[Metal]]></category>
		<category><![CDATA[Science & Tech]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18982</guid>
                        <description><![CDATA[<p>Historians often reinforce evolutionist narratives that rank civilizations and nationalize invention.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Historians often reinforce evolutionist narratives that rank civilizations and nationalize invention.</p>

<figure class="wp-block-image">
<img width="700" height="448" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/Screenshot-2026-02-05-at-9.13.09-AM-700x448.png" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>US military railroad bridge (Herman Haupt), Bull Run, ­Virginia, Orange and Alexandria Railroad, ca. 1863. The army rebuilt this tied arch bridge more than seven times during the Civil War.
Source: Library of Congress.</figcaption>
</figure>

<p class="has-drop-cap">Evolutionist storytellers have for centuries reinforced iron as a gauge of progress. They employed technology as “the measure of men” and portrayed iron as the quintessential material of the Industrial Revolution. Their calculation was rooted in the late 18th-century transition in the British economy’s basis from wood and water to iron and coal. Within this ferric landscape, wood was deemed appropriate for less complex, less civilized societies.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262552110/american-bridge/" target="_blank"><img loading="lazy" decoding="async" width="320" height="411" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/bridge-jkt.jpg" alt="" class="wp-image-18983"/></a><figcaption class="wp-element-caption">This article is adapted from Gregory Dreicer&#8217;s book “<a href="https://mitpress.mit.edu/9780262552110/american-bridge/" target="_blank">American Bridge</a>.”</figcaption></figure>
</div>


<p>The words of historian Carl Condit, whose publications on building technology were widely read, ring through the decades: “Wherever wood was plentiful and industrial techniques less advanced than in Western Europe, timber construction was bound to be the natural choice.” He believed that wood framing belonged to a “vernacular tradition,” that is, unscientific, less advanced. Wood’s mythic nature — unlearned, craft-based, inflammable — helps explain why in the early 21<sup>st</sup> century the use of wooden members, such as beams or columns, in high-rise buildings can still evoke surprise.</p>



<p>Our understanding of materials reflects engineering, evolutionary, economic, and nationalist perspectives. It shapes how we see materials and how designers use them. Consider engineer John Roebling, who in 1860 proclaimed, “Iron has emphatically become <em>the material of the age.</em> Upon its proper use, the future comfort and physical advancement of the human race will principally depend. It will yet be the harbinger of peace, as already it has given us the means of locomotion and of intelligent intercourse.”</p>



<p>His rhetorical fervor aligned with the industrial-evolutionist reasoning of his time and is comparable to the language of today’s promoters of digital technologies. Roebling believed that technology provided evidence of a superior civilization and that technological progress would benefit the world; his metal cable manufacturing company was helping make it happen (“intelligent intercourse” probably refers to the far-reaching impact of telegraph wires, which Roebling’s company manufactured). </p>



<p>Roebling, like Abraham Darby, William Fairbairn, and Robert Stephenson, had a financial stake in iron. (This is not to say they were venal; a belief in a future they would profit from was integral to their entrepreneurial mindset.) Roebling’s design for the Niagara railroad suspension bridge — an American symbol whose image traveled around the world — depended on cables, though its deck initially made substantial use of wood and iron. When the bridge opened in 1855, the “American Railroad Journal” proclaimed: “It must place the name of <em>Roebling</em> high among the greatest and best of those who have accomplished most for the advancement of their species.”</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="908" height="1518" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-03-at-10.45.47-PM.png" alt="" class="wp-image-18987" style="width:505px;height:auto"/><figcaption class="wp-element-caption">Ernst Haeckel, ­“Family Tree of Man.” Tree pictograms laid out and justified evolutionary, human-centered hierarchies based on race and nationality. In the English edition, this image was titled “Pedigree of Man.” Source: Ernst Haeckel, “Anthropogenie oder Entwickelungsgeschichte des Menschen” (Engelmann, 1874), ­Table 12. Deutsches Museum Library.</figcaption></figure>
</div>


<p>This Prussian-born technologist remains an American hero to this day, thanks to his Brooklyn Bridge. But historians who adopt material ideologies and biases from innovator-entrepreneurs such as Roebling and integrate them uncritically into storymaking create puzzling scenarios. They write things like “the iron truss came soon after the iron arch.” This affirmation, which links disparate structures and seems to exclude wood, is a celebration of progress rather than an insight into the history of innovation.</p>



<p>Wood and metal structures each required their own design method. With wood, the connection type determined the size of the entire member; with iron and steel, designers established the size of the members first. As an engineer explained in 1933, “If he designs a steel bridge while standing on his feet, he should stand on his head while designing a timber structure. In other words, the processes are reversed.”</p>



<p>In metal structures, a much smaller material area was required for the connections, which were simpler, could withstand a variety of forces from different directions, and could be designed as an independent feature of the structure; in addition, the shape of the member could be more precisely specified and manufactured. Metal enabled designers to create structures that were physically closer to one-dimensional depictions — that is, closer to the diagrams used in structural analysis. Lumber was superimposed and connected at the overlaps, while metal construction approached a single plane, with connection points where members met, usually at their ends.</p>



<p>Designers had to think about wood and metal in different ways. Essential to successful wood design was knowledge of wood types, shared by manufacturer and builder. Because wood fails with forewarning, builders could learn through observation and repair. (A recent study reports that 19<sup>th</sup>-century wooden railway bridges had a safety advantage; they were not known to collapse while trains crossed them.) As railways rejected wooden bridges, the importance of the type of knowledge and experience required to build with wood diminished.</p>



<figure class="wp-block-pullquote"><blockquote><p>The evolutionist ascent-of-iron narrative lowered the status of the carpenter while elevating that of the engineer.</p></blockquote></figure>



<p>Iron, by contrast, could break with little or no warning; this may be due to the type of metal, how it is used, or the quality of manufacture. Metal members were more of a black box, not knowable in the ways that wood was. Bridge designers and builders trusted the metal maker to produce a material that met a testing standard, which in most cases was independent of the designer. The alienation from firsthand knowledge, along with the foregrounding of analytical tools for designing structures, became fundamental to the mass-construction of bridges. The designer could develop a structural idea in the abstract — and then seek materials that fit the design.</p>



<p>The evolutionist ascent-of-iron narrative lowered the status of the carpenter while elevating that of the engineer, who possessed a different kind of knowledge. No matter the material, however, intuition (that is, tacit knowledge and skill based on experience) remained basic to design. As historian Joachim Radkau explained, craftsmanship and a feeling for materials were still important, before iron and industrialized building and after, but “human skill was pushed to the edge of technologists’ consciousness.”</p>



<p>The professionalization of engineering occurred alongside the development of structural metals. Civil and mechanical engineers became closely identified with new types of structures that employed metal; this enabled them to distinguish themselves from contractors, whose participation in iron construction was also essential. Eminent structural engineer Corydon T. Purdy’s assertion in 1895 that “it is only with the advent of steel that the engineer has become a necessity” transmitted heavy metal reverberations about progress: New materials require new people with new knowledge to replace those who came before.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1614" height="1096" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-03-at-10.25.31-PM.png" alt="" class="wp-image-18985" style="width:564px;height:auto"/><figcaption class="wp-element-caption">Kistna Viaduct, Great Indian Peninsula Railway (engineer George Berkley, 1870-71). Near Raichur, over the Kistna River. Source: William H. Maw and James Dredge, “Modern Examples of Road and Railway Bridges”; “Illustrating the Most Recent Practice of Leading Engineers in Europe and America” (London: Engineering, 1872), plate 87. University of Michigan.</figcaption></figure>
</div>


<p>That year, in the same journal, engineer J. Parker Snow, while describing wooden lattice railway bridges he was maintaining, shared an “impression” that wooden bridges had become “obsolete.” Already 30 years earlier, when an engineer mentioned “the lattice, long since abandoned as a wooden structure,” he seemed to confirm its extinction; yet wooden lattices continued to be built, though in smaller numbers. But metal was the material on which engineers, entrepreneurs, and historians were building professional status, careers, and wealth.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">The idea that materials provide an evolutionary track for designers to follow, rather than commodities to be manipulated, is rooted in the belief that each material has a form through which it can best be expressed. This notion endures in the oft-quoted bromide attributed to the architect Louis Sullivan, “Form follows function,” which aligns with his convictions about nature and evolution. As if function dictated a unique form, or each form had only one function! Sullivan’s business partner, engineer Dankmar Adler, more astutely deciphered design: “Form follows historical precedent.” Or, as an engineer in 1844 remarked regarding the succession of bridge types, “There is a fashion which rages for a certain time.”</p>



<p>This is evident in the number of cable-stayed bridges built in recent years. While evolution clarifies the process of change in an animal species, biology cannot account for the myriad decisions that drive the design of individual objects or the development of innovation over time. There is always a menu of possibilities to choose from.</p>



<figure class="wp-block-pullquote"><blockquote><p>Metal embodied a majestic symbolic potency.</p></blockquote></figure>



<p>But evolutionism can circumscribe that choice. Anthropological experts traced cultural evolution through the West with the advent of iron as climax. In the late 19<sup>th</sup> century, John Wesley Powell, a founder of the field of anthropology, explained, “The age of savagery is the age of stone; the age of barbarism the age of clay; the age of civilization the age of iron.” Ethnologist Otis T. Mason confirmed that “the civilized man passes his whole life in the midst of wheels and cranks and engines of iron.”</p>



<p>Metal embodied a majestic symbolic potency: Its mythical strength and permanence provided proof of the durability and civilizational direction of nation and empire. So cultural narrators ignored the role of wood at a pivotal inventive moment — the reinvention of building — and likely remained ignorant of the moment because of wood’s centrality. Lumber and enslaved people were considered primitive or pre-industrial, on the other side of the divide, even though they played a giant role in the making of industrial capitalism and management. Like the racial classifications often employed to define society, “iron bridge” and “wooden bridge” in historical accounts often are factitious labels that reveal evolutionary caste. They refer to imaginary homogeneous types rather than actual mixed heritage and composition.</p>



<p>The evolutionist-progressive narrative also functions as a political power tool. Entrepreneurs and engineers used it to rationalize intentions and minimize mistakes. It enabled industrialists John D. Rockefeller and Andrew Carnegie to justify aggressive corporate tactics and the mistreatment of individuals; Carnegie claimed that inequality and concentration of wealth were “essential to the future progress of the race.” Evolutionism supported the view that efforts to make society equitable were unnecessary and perhaps unnatural. In China, evolutionism would replace traditional values and serve as a tool for massive change.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="1410" height="814" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-03-at-10.54.07-PM-1.png" alt="" class="wp-image-19004"/><figcaption class="wp-element-caption">The Boston and Maine bridge (J. Parker Snow, 1889) over the Contoocook River, Contoocook, New Hampshire. The oldest surviving covered railroad bridge in the United States was built the same year the Eiffel Tower opened. Source: Amy James, artist; Library of Congress, Prints &amp; Photo­graphs Division, HAER No. NH-38.</figcaption></figure>
</div>


<p>Just as biologists have applied evolution to all scales of life, from genes to species change over thousands of years, historian-evolutionists have turned their attention to all scales of invention and industry, ranging from “arrow into rocket” fantasias to seemingly small alterations. In defense against a lawsuit accusing them of theft, a group of dismayed engineers asked: “How can you patent something that is in the natural evolution of technology?” Indeed, if designers are evolution’s agents, they would not be responsible for illicit appropriation.</p>



<p>By the beginning of the 20<sup>th</sup> century, corporations in the United States were less likely to publicly espouse survival of the fittest. The idea went underground and fertilized the evolutionist-progress narrative of technology that nurtures today’s neoliberal thought. In the 21st century, evolutionist narratives can deflect attention from the inequity behind, for example, digital devices — gleaming avatars of progress made of metals whose manufacture and disposal depend on environmental harm and brutal working and living conditions in places Western consumers never see. This is the dark side of the evolution-of-materials tale. While evolutionism’s inextricable ties to the openly nationalist and racist currents of the 19th and 20th centuries are well known, less discussed are its support for contemporary corporate “innovation” and its impacts. The question is: Who is technological evolution and progress for?</p>



<p>“Arguably, no folk theory of human nature has done more harm — or is more mistaken — than the ‘survival of the fittest,’” assert anthropologist Brian Hare and science writer Vanessa Woods. The catchphrase justified and perhaps inspired a couple of centuries of human and environmental destruction. It’s so deeply ingrained that it’s hard to extract from our understandings of history and society.</p>



<p>Struggle represents only one way of viewing the world, however. For humans and animals, cooperation may be the strongest outcome of evolution. The ever-changing relationships of interdependent individuals create stories, materials, and communities. Were 19th-century bridges known as “American” like processed American cheese, whose development runs through England and Switzerland? Were they like Gruyère, officially made only in Switzerland and France, though the United States, which imports more cheese by that name from the Netherlands and Germany, claims the name is “generic”? Were so-called American bridges like French dressing, whose origins do not lie in France and whose contents the US government controlled for 72 years, until 2022, 24 years after the Association for Dressings &amp; Sauces asked that it cease doing so? Or did they have something in common with chocolate? Eighty percent of cocoa beans come from West Africa, although only 1 percent of chocolate is made there.</p>



<figure class="wp-block-pullquote"><blockquote><p>For humans and animals, cooperation may be the strongest outcome of evolution.</p></blockquote></figure>



<p>Exploring how social and political flows define and redefine manufacturing, innovation, and consumption can lead us to shift our understandings of national, local, and global. Today, Kinshasa and Paris have the same number of Francophones; 60 percent of French speakers live in Africa, where they are remaking the language.</p>



<p>Biologist and computer scientist David Krakauer wrote, “Genes, minds and societies are all involved in various forms of construction. A better understanding of life requires that we abandon the view that organisms are account books recording in their behaviour past ages of the Earth and see them rather as builders engaged actively in the planet’s construction.”</p>



<p>Instead of regarding technologists as enactors of biological tropes and national destinies, we might view them as creators working in a multiplicity of places through networks that range across construction sites, businesses, factories, universities, and nations, while building a diversity of futures.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em><em><strong><em>Gregory Dreicer</em></strong><em> is a historian, curator, and experience designer whose transdisciplinary explorations and public engagement offerings include “Between Fences,” “Me, Myself and Infrastructure,” and “Unbelievable.” He has worked with the Museum of Vancouver, the Chicago Architecture Foundation, and the Museum of the City of New York. He is the author of “</em><a href="https://mitpress.mit.edu/9780262552110/american-bridge/" target="_blank"><em>American Bridge</em></a><em>,” from which this article is adapted.</em></em><br></em><a id="_msocom_1"></a></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/Screenshot-2026-02-05-at-9.13.09-AM.png" length="50000" type="image/png"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/03/Screenshot-2026-02-05-at-9.13.09-AM.png" />                                        </item>
        			                <item>
                        <title>The Radical Tub</title>
                        <link>https://thereader.mitpress.mit.edu/the-radical-tub/</link>
                        <pubDate>Tue, 24 Feb 2026 10:54:00 +0000</pubDate>
                        <dc:creator>Christie Pearson</dc:creator>
                        		<category><![CDATA[Architecture]]></category>
		<category><![CDATA[Bathing]]></category>
		<category><![CDATA[Culture]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18828</guid>
                        <description><![CDATA[<p>How bathing spaces, long treated as sterile utilities, can become architectures of intimacy, accessibility, and embodied liberation.</p>
]]></description>
                        <content:encoded><![CDATA[<p>How bathing spaces, long treated as sterile utilities, can become architectures of intimacy, accessibility, and embodied liberation.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Kahlo-lead-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>Detail of &#8220;What the Water Gave Me,&#8221; Frida Kahlo, 1938. Painting. Public domain.</figcaption>
</figure>

<p class="has-drop-cap">When I get out of the bath, I want to feel transformed, to emerge renewed like the goddess Hera at her annual bath in the spring of Kanathos, or Aphrodite in the sea — or as close to this as possible. I want to be reborn in some sense. This takes time. The better I feel afterward, the better the bath.</p>



<p>I wonder if I have enjoyed every bath I have taken as poet Sylvia Plath does in her 1971 novel “The Bell Jar”:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>There must be quite a few things a hot bath won’t cure, but I don’t know many of them. Whenever I’m sad I’m going to die, or so nervous I can’t sleep, or in love with somebody I won’t be seeing for a week, I slump down just so far and then I say: “I’ll go take a hot bath.” I meditate in the bath. The water needs to be very hot, so hot you can barely stand putting your foot in it. Then you lower yourself, inch by inch, till the water’s up to your neck. I remember the ceiling over every bathtub I’ve stretched out in. I remember the texture of the ceilings and the cracks and the colors and the damp spots and the light fixtures. I remember the tubs, too: the antique griffin-legged tubs, and the modern coffin-shaped tubs, and the fancy pink marble tubs overlooking indoor lily ponds, and I remember the shapes and sizes of the water taps and the different sorts of soap holders. …  I never feel so much myself as when I’m in a hot bath. I lay in that tub on the seventeenth floor of this hotel for-women-only, high up over the jazz and push of New York, for near onto an hour, and I felt myself growing pure again. I don’t believe in baptism or the waters of Jordan or anything like that, but I guess I feel about a hot bath the way those religious people feel about holy water.</p>
</blockquote>



<p>While this is not a text in praise of domesticity, each of these singular containers is presented to her memory fondly, held like a dish, a bowl, or maybe a cup. The bathtub likes to sit in isolation, and usually tries to be an object, visible from as many sides as possible. We think that it is for one, but we know that, really, two could fit in without much trouble. If we can fit more than two people in a tub, it will change name, perhaps becoming a hot tub.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262044219/the-architecture-of-bathing/" target="_blank"><img loading="lazy" decoding="async" width="320" height="397" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/the-architecture-of-bathing.jpg" alt="" class="wp-image-18830"/></a><figcaption class="wp-element-caption">This article is adapted from Christie Pearson&#8217;s book &#8220;<a href="https://mitpress.mit.edu/9780262044219/the-architecture-of-bathing/" target="_blank">The Architecture of Bathing</a>.&#8221;</figcaption></figure>
</div>


<p>There is a problem that bathtubs pose to the designer, and ergonomics is one way in which designers have tried to address it. What constitutes the optimum bodily position in the bath? It seems that anthropologist Marcel Mauss might have been grappling with a related question in his 1934 essay “Techniques of the Body,” which envisioned a future, global “socio-psycho-biological study” of what might be called habits, gestures, and practices of the body. Sitting, standing, dancing, bathing, drinking — these habits enfold physiological, psychological, social, and sexual dimensions, at once natural and cultural, specific and vast. Read this way, Mauss anticipates a kind of social ergonomics, one that asks not only how bodies fit objects, but what bodies are doing and what kinds of social space they create.</p>



<p>In Turkey, you lie prone on a heated platform, then sit on a bench in a personalized niche with a basin collecting water, which you then throw over yourself. You are with children and friends.</p>



<p>In Japan, you squat to collect the water, dowse and scrub yourself, then enter a deep tub to soak, sitting with knees bent up. At home or at the sentō, you are usually with family members.</p>



<p>In Finland, you sit on tiered benches to sweat, then jump into the lake to rinse and cool off. You are with your family and friends.</p>



<p>In India, you walk into the river, submerge yourself three times, then walk back out. You are surrounded by neighbors doing the same thing before they go about their daily lives.</p>



<p>In ancient Rome, you rubbed your skin with oil, then sweated, then scraped yourself with a strigil. This curved metal tool for exfoliation scraped off the dirty oil and dead skin, leaving pores open and clear. You were surrounded by the free people of your town; you had finished your work for the day, and this was time for networking with business associates, catching up on political and social gossip, and sexual and dining invitations.</p>



<p>In Jordan, you float effortlessly on a salty sea, looking at the sky. You are with other pilgrims from around the world who have come to pay homage to an ancestral land.</p>



<p>In North America, you stand up with water pouring down on you from above when you wake up, or lie horizontally in a coffin-shaped tub to help you drift toward sleep. You are alone with the door closed, hoping not to be disturbed.</p>


<div class="wp-block-image">
<figure class="alignright size-full is-resized"><img loading="lazy" decoding="async" width="328" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Kahlo-What-the-Water-Gave-Me.jpg" alt="" class="wp-image-18834" style="width:320px;height:auto"/><figcaption class="wp-element-caption">Frida Kahlo, &#8220;What the Water Gave Me,&#8221; 1938. Painting. Source: Wikimedia Commons</figcaption></figure>
</div>


<p>Frida Kahlo’s painting “What the Water Gave Me” (1938) is taken from the vantage point of the artist in her bath. Floating in the water are scenes, people, body organs, landscapes, that seem to be taken from different moments of her life. Here we have a dissolution of time and space as Kahlo swims in a reverie of her entire life, coupled with the security of being held by the tub. The function of seeing is directed toward the inner eye of the artist as she moves both inward toward emotions and memories, and outward toward a whole life’s journey.</p>



<p>There is a kind of autoeroticism in self-absorption that has a particular cosmic relation or aggrandizement, suggesting another meaning to the bath’s experiential immersive reverie. Bathing’s autoeroticism can manifest in both the solo and the collective bath space, and the self-absorption of the bather in a group can be a powerful statement to others. Bathing architectures engage with these erotic dynamics in a variety of ways. Looking at the scarring on her foot and leg, we are reminded of Kahlo’s physical disability dating from childhood polio and a <a href="https://www.history.com/articles/frida-kahlo-bus-accident-art" target="_blank" rel="nofollow">bus accident in 1925</a>; the private bath is a space of respite and rest for her body. We can imagine the difficulty of Kahlo’s entering and exiting her tub. Lifelong self-sufficiency and able-bodiedness are an illusion that continues to be exploited and perpetuated for the economic gain of a few and the suffering of many. If we reimagined baths as shared spaces, they would offer us more flexibility and generosity, and recognize our need for others at so many stages of our lives. </p>



<p>Some relief from gravity is felt in water, and we sense the difficulty of the artist’s thoughts mingled with gratitude for finding a place of comfort. In the bathtub we would like to float if we could, effortlessly as if in the Dead Sea, where salt comprises 33.7 percent of the water. Pure and effortless horizontality is a unique ontological proposition. Though it is longer in duration, sleeping falls short in comparison, with our morning aches and constant movement through the night.</p>



<p>This effortlessness is possible in a large enough tub, the floatation tank. The origins of the float or isolation tank lie in the sensory deprivation experiments of the radical neuro-psychiatrist John C. Lilly in the 1950s. His interest in exploring the nature of the mind and psychedelics resulted in the creation of a sensory deprivation suit resembling an astronaut’s. As experiments evolved toward a dark and silent salty sea, Lilly’s subjects started to enjoy the experience more and emerged from the experiments feeling relaxed and rejuvenated. The lightless, soundproof tank contained highly salinated water, saturated with magnesium sulfate (Epsom salts) to a density of 25 percent and maintained at 34°C, matching skin temperature and eliminating thermal sensation. The magnesium is absorbed through the skin as needed, making the body effortlessly buoyant. Lilly invited notable psychedelic intellectuals to try out his tanks in the 1950s and 1960s; visitors included Timothy Leary, Carl Sagan, Allen Ginsberg, and physicist Richard Feynman, who describes his many experiences at Lilly’s tank in his <a href="https://wwnorton.com/books/Surely-Youre-Joking-Mr-Feynman/" target="_blank" rel="nofollow">autobiography</a>.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>A sense-deprivation tank is like a big bathtub, but with a cover that comes down. It’s completely dark inside, and because the cover is thick, there’s no sound. There’s a little pump that pumps air in, but it turns out you don’t need to worry about air because the volume of air is rather large, and you’re only in there for two or three hours, and you don’t really consume a lot of air when you breathe normally. Mr. Lilly said that the pumps were there to put people at ease, so I figured it’s just psychological, and asked him to turn the pump off, because it made a little bit of noise. The water in the tank has Epsom salts in it to make it denser than normal water, so you float in it rather easily. The temperature is kept at body temperature, or 94, or something — he had it all figured out. There wasn’t supposed to be any light, any sound, any temperature sensation, no nothing!</p>
</blockquote>



<p>Feynman’s description proceeds to his experiences of hallucinations in the floatation tank, in combination with different mind-altering substances ranging from marijuana to LSD, as inquiries into the nature of memory, the difference between wakefulness and dreams, and proprioception, the sensation of body position and movement. At a certain point, he became convinced he could replicate any of the experiences he had in the tank just by sitting in his living room, so he stopped the practice. At the end of the passage, however, he says he never managed to succeed.</p>



<figure class="wp-block-pullquote"><blockquote><p>Architects of bathing would benefit from a greater understanding of the interfaces between neurology, biology, psychology, and physiology.</p></blockquote></figure>



<p>Lilly&#8217;s work on sensory deprivation explored the subtle interrelationships between body, mind, and consciousness. In “The Deep Self” (1977), he examined how the floatation tank could be used as an aid for understanding one&#8217;s state of being, a practice that extends beyond the tank itself into everyday life. Architects of bathing would benefit from a greater understanding of the interfaces between neurology, biology, psychology, and physiology.</p>



<p>Floatation tanks grew in popularity. Inspired by a workshop Lilly gave in 1972, Glenn Perry designed the first commercial isolation tank made to Lilly’s specifications. He investigated the properties of anechoic chambers and a multitude of materials from plywood and fiberglass to cardboard and vinyl, settling on rigid plastic. Perry opened the first floatation tank establishment in Beverly Hills in 1979. Today, commercial floatation tank establishments can be found in most cities, advertising relaxation, relief from anxiety and high blood pressure, in organically shaped fiberglass pods.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Japanese-American designer George Nakashima designed and built a bathtub that contested industrial prefabrication and exemplifies the movement to reimagine the tub, and architecture, in the 20th century. Throughout the 1960s, as he developed his living and working compound in New Hope, Pennsylvania, filled with experimental structures built with minimal means, he incorporated elements of the Japanese family tub into an American home to create a unique and beautiful social space. The floor slides up and over into the tub as a continuous surface, like an extension of the landscape we see beyond. Sliding doors combine the inside and outside, and no walls separate this tub from the living space. The tile patterns include playful fish mosaics and the names of his kids, and we can only imagine the delight of the Nakashima children soaking and splashing here together at the end of the day, with the breezes blowing in across the water. Where Nakashima reimagined the bath through craft and lived experience, others approached it analytically.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="800" height="600" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/nakashima-house.jpg" alt="" class="wp-image-18854"/><figcaption class="wp-element-caption">George Nakashima, George Nakashima House, New Hope, Pennsylvania, USA, 1946. Photograph by the author.</figcaption></figure>
</div>


<p>Architecture professor Alexander Kira’s 1976 book “The Bathroom” offered a critique of the bourgeois American bathroom based on a rational analysis of the failings of mass production, not to overthrow it but to better it. He brings a pragmatic, materialist, almost fearlessly Reichian lens to a set of forms that he sees as mutable and unsatisfactory.</p>



<p>He argues that innovation in bath design lags far behind kitchen design, largely due to prudishness. Kira claims that the Western distaste for even discussing basic bodily functions such as washing, pissing, shitting, and shaving has led us to neglect the design of an important room we use multiple times a day. Combined with a puritanical repression of the body and bodily pleasure, this has produced the modern bathroom: a sterile, barely even utilitarian space, a collection of fixtures lacking intelligent responses to basic needs. Taking a careful look at the bodily movements of people of different ages, shapes, and abilities, he offers to rectify this through ergonomics: The home bathroom could be better designed using analytical tools.</p>



<figure class="wp-block-pullquote"><blockquote><p>Kira claims that the Western distaste for even discussing basic bodily functions such as washing, pissing, shitting, and shaving has led us to neglect the design of an important room we use multiple times a day.</p></blockquote></figure>



<p>The majority of ergonomic studies and architectural graphic standards suggest isolation in their descriptions of space needs for single bodies, not multiple, interactive ones. The continual struggle for a parent to bathe a small child, or a child to bathe their ailing parent, in a typical contemporary bathroom underscores the basic problem. In this design, every aging person is expected to need a total renovation of their home bathroom at some point. Why are these simple concerns for age difference, let alone cultural or sexual differences, not met at the outset? Kira’s publication was ahead of its time; today, the average person expects their house to offer roughly the same fixtures in the same locations as a tenement dweller of 1920. While Kira’s study is pragmatically focused on the base unit of the private domestic bathroom, the public bath haunts it as a distant ideal practiced elsewhere: “In many parts of the world bathing is viewed and practiced as a shared, pleasurable activity,” he writes — “a scarcely possible feat in the average American five-by-seven-foot bathroom, even if this desire were present.”</p>



<p>The battle between shower and bath as the standard means of cleaning oneself has produced a compromise in the industrialized bath of a combined space, which is perfectly suited to neither: a flat-bottomed shallow tub that you cannot lie comfortably in, and a slippery concave surface to stand on while showering upright. Kira’s concerns are addressed again in the work of architecture professor Galen Cranz. After studying at UC Berkeley in the 1970s, Cranz combined her training in design and <a href="https://alexandertechnique.com/" target="_blank" rel="nofollow">Alexander technique</a> to revisit design from the viewpoint of the body. Her interpretation of body-conscious design comes together in her 1998 book “The Chair,” a deepening of ergonomics that developed from the inside out. Here she applies her method to the neck-ache-inducing American tub:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>Lying down should be easy in the privacy of your own bath. Why are American bathtubs usually designed so that lying down is impossible? Their size and shape forces one to stoop while bathing — no wonder so many people, especially tall ones, prefer to shower. If one tries to recline, the head and neck hit an awkward edge. Why not lie flat, fully extending the spine to float on the water? Even elaborate and expensive whirlpool tubs have not solved this problem. Little inflatable plastic pillows are as close as anyone has come to body consciousness in bathing. The institution of bathing could easily be revolutionized in the United States.</p>
</blockquote>



<p>The same year that “The Bathroom” was published, another publication on bathing with a decidedly different approach was launched in San Francisco. Architecture school graduate Leonard Koren ran <em>Wet: The Magazine of Gourmet Bathing </em>until 1981. That this apotheosis of West Coast cool <a href="https://www.christiepearson.ca/wp-content/uploads/2018/04/CMAGAZINE123_p20-27_F_Pearson.pdf" target="_blank" rel="nofollow">took bathing as a touchstone to discuss fashion, music, art, food, and everyday life</a> reminds us of the importance of bathing and the body in Californian culture generally, and in the 1970s particularly. The magazine’s humor and irreverence connected it to punk and new wave cultures, as well as the famous bathing-based parties that Koren would throw in old urban bathhouses, backyard pools, and clubs. Koren’s writing in <em>Wet </em>is full of funny, subtly poetic insights weaving between sensuality, spirituality, the life of the mind, and social relations.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="900" height="1208" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Wet-cover.jpg" alt="" class="wp-image-18852"/><figcaption class="wp-element-caption">Cover of Wet: The Magazine of Gourmet Bathing, issue number 1. Photography and design by Leonard Koren.</figcaption></figure>
</div>


<p>There was an urgency to the critique of the bathroom in the 1970s in parallel with a revolt against modernism and the industrialized world, and the decade produced many stimulating designs and texts for the architecture of bathing. If the division of labor under capitalist production fragmented the body and its gestures into a collection of utilitarian machine parts that the tub represents, philosopher Félix Guattari envisioned in his 1975 essay “<a href="https://thefunambulist.net/editorials/philosophy-to-have-done-with-the-massacre-of-the-body-by-felix-guattari" target="_blank" rel="nofollow">To Have Done with the Massacre of the Body</a>” a revolution that would reground us in the body:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>We have begun with the body, the revolutionary body, as a place where “subversive” energies are produced — and a place where in truth all kinds of cruelties and oppressions have been perpetuated. By connecting “political” practice to the reality of this body and its functioning, by working collectively to find means to liberate this body, we have already begun to create a new social reality in which the maximum of ecstasy is combined with the maximum of consciousness.</p>
</blockquote>



<p>In 1978, the standard manual for back-to-the-land bathing came from California: “Sweat.” Mikkel Aaland’s countercultural bathing classic brought together scholarly research, a healthy irreverence for authority, and the wisdom of everyman the bather, with Rabelaisian humor. Aaland’s chapters on sweat traditions include the Americas and Japan, and stories of his experiences as he travels the world in search of a good sweat. The final section gives instructions for making a variety of simple sweat baths yourself. “Sweat” embraces in its very title an unapologetic corporeality that can hold the blood, sweat, and tears of real life. “Sweat” was typical of countercultural American bathing thought during the 1960s and 1970s: <em>Whole Earth </em>style, critical of modernity and seeking a deeper connection to the sensual and the environment.</p>



<p>In 1996, Koren assembled his thoughts in a form aimed at design students in the provocative “Undesigning the Bath.” The thesis of this text is that the very act of designing, as it is commonly taught and practiced, is inimical to the creation of a great bath, greatness here framed in anti-modern, anti-utopian terms. Basing his book on a lifetime of world bathing, Koren uses minimal text and maximal gritty images to offer advice to those pursuing an architecture of bathing.</p>



<p>The argument of “Undesigning the Bath” is polemical: He identifies qualities of a great bath experience (including pleasure, timelessness, and thermal stimulation); the obstacles to making such a place (including egoism, an industrialized approach, and veiled misanthropy); and techniques for undoing your design education (cultivating discovery, poetry, and a relationship to nature). It contains a dig at Kira and the ergonomic approach for missing a major point.</p>



<p>The look of a bathhouse may be important in signaling its function and ambition, and for publication, we rely on the image to communicate. But for the repeat bather, the image is at best irrelevant, at worst misleading. What counts is how the bath <em>feels</em>, the atmosphere, where embodied perception and social interaction are viscerally and unforgettably experienced. No amount of measuring can get us there.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Christie Pearson</em></strong><em> is an award-winning architect, writer, and urban interventionist. An Adjunct Professor at the University of Waterloo School of Architecture, she is coeditor of the architectural journal Scapegoat and the author of “<a href="https://mitpress.mit.edu/9780262044219/the-architecture-of-bathing/" target="_blank">The Architecture of Bathing</a>,” from which this article is adapted.</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Kahlo-lead.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Kahlo-lead.jpg" />                                        </item>
        			                <item>
                        <title>The Vanishing Giants of America’s Steam Age</title>
                        <link>https://thereader.mitpress.mit.edu/the-vanishing-giants-of-americas-steam-age/</link>
                        <pubDate>Mon, 23 Feb 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Jeff Brouws</dc:creator>
                        		<category><![CDATA[America]]></category>
		<category><![CDATA[Coal]]></category>
		<category><![CDATA[Railroads]]></category>
		<category><![CDATA[Trains]]></category>
		<category><![CDATA[Media]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=19051</guid>
                        <description><![CDATA[<p>Coaling towers are little-known railroad relics that take many forms. But each evokes a subtle grandeur of industrial might.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Coaling towers are little-known railroad relics that take many forms. But each evokes a subtle grandeur of industrial might.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/coaling-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>Twin Towers in Gilman, IL. Photographer: Jeff Brouws</figcaption>
</figure>

<p class="has-drop-cap">For half a decade, I journeyed 20,000 miles across North America to document a slowly vanishing industrial vestige: the coaling tower. Just over one hundred still stand on the continent, each originally built for the same simple purpose — to drop coal into steam locomotives.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262051750/silent-monoliths/" target="_blank"><img loading="lazy" decoding="async" width="320" height="402" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/coal.jpg" alt="" class="wp-image-19052"/></a><figcaption class="wp-element-caption">Jeff Brouws’ photography is featured in “<a href="https://mitpress.mit.edu/9780262051750/silent-monoliths/" target="_blank">Silent Monoliths</a>.”</figcaption></figure>
</div>


<p>Very little ink has been spilled on the history of coaling towers, let alone on their architectural character. Yet in photographing these silent monoliths, I discovered a remarkable stylistic variety that transcends their utilitarian nature, revealing structures as expressive as they are obsolete. Which raises the question: Why do they exhibit so much diversity in form despite their uniformity in function? The answer lies in a mix of engineering and geography.</p>



<p>The first coaling towers likely appeared in the late 1800s. During their roughly half-century of use in the 1900s, each railroad had different operational needs, as well as cost and engineering considerations, when determining a coaling tower’s size and placement along a mainline, in a yard, or at an engine facility. Towers ranged from modest 50-ton stations on remote branch lines to massive 2,000-ton behemoths that straddled multi-track mainlines and could fuel several locomotives at once.</p>



<p>As the Steam Era progressed, the look of coal-handling infrastructure evolved to meet the demands of larger, heavier locomotives. Early, wooden “coal docks” were eventually replaced by reinforced concrete coaling towers, designed to withstand the immense weight of stored coal and the constant vibration of passing trains. While boxy, angular designs were common early on (as in the one in Reevesville, Illinois), they were eventually supplanted — often in the same era — by more efficient cylindrical shapes (as in the one in Susquehanna, Pennsylvania). This shift wasn’t merely aesthetic; the cylindrical, silo-like shape helped distribute the weight of the coal — and the downward pressure it exerted — far more evenly than any rectangular, flat-walled coal bin ever could.</p>



<p>Most surviving concrete coaling towers were the work of specialized firms like Roberts &amp; Schaefer, Fairbanks-Morse, and Ogle. These companies produced detailed, patented designs that served as reference guides for the industry, oftentimes creating a recognizable, unified look for railroads that contracted with them. The Chesapeake &amp; Ohio, for instance, adopted the Ogle cylindrical concrete “bullet-type” as one of its standard styles; five of these still stand in Kentucky and the Virginias today, including the notable 300-ton-capacity tower in Charlottesville. Smaller firms like Ross &amp; White or Howlett employed similar business practices. However, some railroads preferred to manage their own infrastructure, constructing “home-built” models defined by a stripped-down, no-frills approach. (The rectangular Illinois Central Railroad tower in Reevesville is an excellent example of this.)</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1092" height="1396" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-10-at-9.41.19-AM.png" alt="" class="wp-image-19053" style="width:470px"/><figcaption class="wp-element-caption">Susquehanna, PA.</figcaption></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1262" height="1328" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-10-at-9.45.02-AM.png" alt="" class="wp-image-19056" style="width:470px"/><figcaption class="wp-element-caption">Great Bend, KS.</figcaption></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1264" height="1332" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-10-at-9.45.58-AM.png" alt="" class="wp-image-19057" style="width:470px;height:auto"/><figcaption class="wp-element-caption">Reevesville, IL.</figcaption></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1396" height="1376" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-10-at-9.49.05-AM.png" alt="" class="wp-image-19058" style="width:470px"/><figcaption class="wp-element-caption">Irvington, KY.</figcaption></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1282" height="1344" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-10-at-9.50.18-AM-1.png" alt="" class="wp-image-19060" style="width:470px"/><figcaption class="wp-element-caption">Decatur, IL.</figcaption></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1120" height="1440" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-10-at-9.52.46-AM-1.png" alt="" class="wp-image-19064" style="width:470px"/><figcaption class="wp-element-caption">Wilmington, DE.</figcaption></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1242" height="1302" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Screenshot-2026-02-10-at-9.54.30-AM.png" alt="" class="wp-image-19065" style="width:470px"/><figcaption class="wp-element-caption">Detroit, MI.</figcaption></figure>
</div>


<p>The aesthetic variety of coaling towers arguably reached its zenith in the streamlined outlines of the Pennsylvania Railroad’s tower in Wilmington, Delaware. It was proof that even the most practical designs can achieve a subtle grandeur, blending engineering necessity with the visual language of the <em>Streamline Moderne</em> movement.</p>



<p>While coaling towers stand as a breathtaking fusion of form and function — as sculptures in the landscape <em>itself </em>— their appeal extends beyond the architectural and into the archaeological. Many I sought out were in remote locations. Driving up to the isolated towers at Gilman, Illinois, I felt as if I had accidentally stumbled upon the stone statues of Easter Island. When I saw the now-separated twin towers in Gilman, which once spanned two busy mainlines, I experienced my own kind of “<a href="https://en.wikipedia.org/wiki/David_Livingstone#Stanley_meeting" target="_blank" rel="nofollow">Doctor Livingstone moment</a>,” wondering why they were built here and why they remain. To the uninitiated, these industrial ruins could be totems from a lost civilization.</p>



<p>This sensation is likely what photographer <a href="https://mitpress.mit.edu/9781846381980/walker-evans/" target="_blank">Walker Evans</a> termed the “historical contemporary” that bridges past and present. The coaling towers, in their various shapes and states of decay, remain a draw for those curious about a vanished era of American industry.</p>



<p>For me, exploring the historical contemporary required more than mere curiosity; it required a thoughtful approach to preparation. I often used Google Earth to scout locations, identify tower placements, and assess potential access issues. In rural areas, however, the resolution was often too fuzzy to discern the exact lay of the land, so I organized nearly 100 printed maps into three-ring binders in advance. Since many towers were in obscure spots, I was never entirely sure whether a day’s drive would yield results. But I didn’t mind being suspended in a state of unknowing.</p>



<p>Once on-site, all the pre-trip planning shifted into the practicalities of the shoot itself. Fully aware that railroad sites could be dangerous, I didn’t linger around active yards or mainlines. I worked quickly, except when I found myself in a landscape on the brink of abandonment. I found many towers marooned in brownfields or stationed alongside secondary lines where few trains ran. In these instances, I could stretch out and take my time with the image-making, falling into a state that often generated my most satisfying work. Though in all cases, I abided by the urban-exploration saying: “Take nothing but photos, leave nothing but footprints.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Jeff Brouws</em></strong><em> is a photographer whose work is in many private and public collections, including Harvard’s Fogg Museum, the Los Angeles County Museum of Art, Princeton University Art Museum, and the Whitney Museum of American Art. His photography is featured in “<a href="https://mitpress.mit.edu/9780262051750/silent-monoliths/" target="_blank">Silent Monoliths</a>.”</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/coaling.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/coaling.jpg" />                                        </item>
        			                <item>
                        <title>Bioethics Was Forged in Horror. It Can Be Lost the Same Way.</title>
                        <link>https://thereader.mitpress.mit.edu/bioethics-were-forged-in-horror-they-can-be-lost-the-same-way/</link>
                        <pubDate>Thu, 19 Feb 2026 10:55:00 +0000</pubDate>
                        <dc:creator>The Editors</dc:creator>
                        		<category><![CDATA[Bioethics]]></category>
		<category><![CDATA[COVID]]></category>
		<category><![CDATA[Trump]]></category>
		<category><![CDATA[War]]></category>
		<category><![CDATA[Science & Tech]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18869</guid>
                        <description><![CDATA[<p>Wars and ethical disasters laid the groundwork for global rules around medical research. But the pandemic and Trump&#8217;s presidency reveal how fragile they remain.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Wars and ethical disasters laid the groundwork for global rules around medical research. But the pandemic and Trump&#8217;s presidency reveal how fragile they remain.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Brandt-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>Karl Brandt, Adolf Hitler&#8217;s personal physician, taking the stand in Nuremberg, Germany on August 19th, 1947. Source: Army photographer Ray D’Addario.</figcaption>
</figure>

<p class="has-drop-cap">Informed consent — the idea that a patient or research subject understands the risks of a procedure or experiment before agreeing to it — might seem like a no-brainer in bioethics. Yet the concept itself was only formally codified roughly 80 years ago, after the Nuremberg trials brought to light the horrific human experiments carried out by the Nazis during World War II.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262553377/absolutely-essential/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/bioethics-jkt.jpg" alt="" class="wp-image-18872"/></a><figcaption class="wp-element-caption">Jonathan D. Moreno is the author of the book “<a href="https://mitpress.mit.edu/9780262553377/absolutely-essential/" target="_blank">Absolutely Essential</a>.” An open-access edition of the book is freely available for download <a href="https://direct.mit.edu/books/oa-monograph/6025/Absolutely-EssentialBioethics-and-the-Rules-Based" target="_blank">here</a>.</figcaption></figure>
</div>


<p>In fact, much of modern bioethics, as we know it, emerged not from “human goodness, but from prudence in light of harsh experience,” writes Jonathan Moreno in his new book “<a href="https://mitpress.mit.edu/9780262553377/absolutely-essential/" target="_blank">Absolutely Essential</a>.” A professor of medical ethics and health policy at the University of Pennsylvania and a member of the National Academy of Medicine, Moreno chronicles the field’s evolution from post–World War I treaties to the ethical upheavals of the COVID-19 pandemic.</p>



<p>The picture that emerges is a fragile global ecosystem — an interlocking patchwork of governments, professional organizations, NGOs, and individual leaders tasked with defending principles such as consent, autonomy, non-maleficence, and justice. Yet Moreno warns that this system depends on political will just as much as moral consensus. And as the so-called “rules-based order” supporting that consensus is increasingly undermined by world leaders like Donald Trump, those safeguards may be in far more peril than they appear.</p>



<p>In an interview edited for length and clarity, Moreno unpacks the history of bioethical principles, why they&#8217;re being eroded in real time, and how the rising tide of ethnonationalism harkens back to darker eras. “History doesn&#8217;t repeat; it rhymes,” he says. “It&#8217;s hard to see how the path we&#8217;re on leads to a happier future.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>I want to start with some of the history you discuss in the book. You write, “International law has not emerged from human goodness, but from prudence in light of harsh experience.” What are some of the key historical experiences that you think animate the bioethics we observe internationally today?</em></strong></p>



<p><strong>Jonathan: </strong>You can start with the Revolutionary War. For instance, before vaccination for smallpox, George Washington insisted that all of his soldiers in the Continental Army get variolated for smallpox, meaning you actually had to put a slit in somebody&#8217;s arm and put actual smallpox into it. He feared that, unlike the British, his people didn’t have herd immunity, so he, in great secrecy, got everybody variolated, which he said was one of the most important decisions he made for what would today be called “force protection.”  So, as weapons become more powerful, the instruments of medicine gradually become more potent, but they were always falling behind.</p>



<p>Later on, sterile techniques during the Civil War, when guys were getting amputated in the field, weren’t really up to par. But much was learned. During the Civil War, a general order on the treatment of prisoners of war implicitly required that they be given decent medical care.</p>



<p>Then there are international agreements on these matters — the Geneva and Hague Conventions. And there’s this growing sense that the cruelty and horror of modern warfare needs some kind of regulation. The League of Nations is proposed. The scale of human suffering becomes so intense that there is a feeling that medicine needs to organize in the military setting — which, by the way, is often ahead of the civilian setting — around a set of ethical principles. Nowhere is that need more vivid than in human experiments (which have always been going on; we know that Hippocrates, for instance, advocated experimentation when you couldn&#8217;t figure out what else to do).</p>



<figure class="wp-block-pullquote"><blockquote><p>There’s no algorithm that&#8217;s going to tell you the answer to every ethical dilemma.</p></blockquote></figure>



<p>By the time the world was faced with 23 Nazi doctors and medical bureaucrats being tried for crimes against humanity in the Nuremberg trials, it seemed — and certainly the trial concluded — that there needed to be some more recognized rules around doing human experiments.  And gradually, those rules became generalized in the public mind to human experiments more broadly.</p>



<p><strong><em> What were the existing norms, rules, and customs around human experimentation prior to the Nuremberg trials? Are you saying the field of bioethics was largely unarticulated before then?</em></strong></p>



<p> <strong>Jonathan: </strong>The Nuremberg Code was articulated in various forms before then. For instance, in the 1820s, U.S. Army surgeon <a href="https://becker.wustl.edu/news/william-beaumonts-momentous-and-unethical-experiments/" target="_blank" rel="nofollow">William Beaumont</a> was in northern Michigan when a man who worked for a fur company, Alexis St. Martin, shot himself with a duck shot in the stomach. This is a very famous case. The wound never healed; a fistula formed, and Beaumont wanted to answer the question that was raging in gastroenterology at the time: whether digestion was a mechanical or a chemical process. Beaumont had this hole in St. Martin’s stomach and had him sign a contract. Then Beaumont paid him to be his servant, and he put food in, thermometers in, and took it out, and sure enough, the surgeon discovered it&#8217;s mostly a chemical process after you masticate. There was also a whole series of syphilis experiments starting in the early 19th century involving sex workers and people who were not in a position necessarily to say no.</p>



<p> So, it&#8217;s not as though there were never any consents or agreements to be in experiments. And what counted as an experiment wasn&#8217;t so clear either, because who <em>knows </em>what&#8217;s going on in the consulting room? There weren’t scientific settings where there was a lot of scrutiny.</p>



<p>By 1931, the Weimar Republic had bioethics guidelines that were really good, but they were ignored by the Nazis. You might say the world — at least in the West — became “flat” in bioethics by the 1960s and 1970s, and that, after the Cold War, there was gradually more respect for autonomy and so forth in the East and the Global South.</p>



<p>The big point is that, in the past, you did not need to have the requirements for informed consent for human experiments. That&#8217;s a decision that humanity has made. And it&#8217;s founded on a notion of universal human equality, which was not well recognized in international humanitarian law until after World War II.  With today&#8217;s <a href="https://www.theguardian.com/world/2026/jan/26/the-global-rules-based-order-has-been-in-freefall-for-years" target="_blank" rel="nofollow">collapsing rules-based order</a>, what else do we lose? It’s a very particularized way of seeing what is lost, but it is also a very global way of seeing it.</p>



<p><strong><em>You go to pretty great lengths to stress that so much of bioethics is not formalized — that it’s a baroque patchwork of commissions and advisory bodies and NGOs and professional organizations operating through shared norms. There&#8217;s no real enforcement mechanism beyond moral opprobrium. Do you see that as a weakness, a strength, or maybe both?</em></strong></p>



<p><strong>Jonathan: </strong>I think there needs to be some flexibility with these things. There’s no algorithm that&#8217;s going to tell you the answer to every ethical dilemma. There’s no black box that&#8217;s going to give you the magic answer all the time. We don&#8217;t even have an international criminal court that functions in cases of genocide. I think a lot of solving bioethical dilemmas comes down to whatever<em> culture</em> is.</p>



<p> For instance, the Hippocratic Oath might be the oldest code of ethics we have. Yet it’s wildly imperfect; it often doesn&#8217;t apply to anything we do anymore, but once you give <em>that </em>up, there&#8217;s not much left. So, aspirational ideals are still important. But back to your question: There is no prospect of a formal mechanism to get this right all the time.</p>



<p><strong><em>The COVID-19 crisis altered the landscape of bioethics in many ways. You mentioned global vaccine equity as one of the big ones: The pandemic exposed the failure of the Global North to deliver vaccines widely to the Global South. What are some of the big bioethics lessons that you think we could take away from the pandemic?</em></strong></p>



<p><strong>Jonathan: </strong>The field of bioethics emerged in an era when everyone was worried about human experimentation, not a public health catastrophe. Consider the HIV/AIDS crisis — I worked at Downstate in Brooklyn in the late ’80s and through the ’90s, and it was absolutely a public health catastrophe, but not on the scale of the COVID pandemic.</p>



<p>In the HIV/AIDS era, the main argument that groups like ACT UP and so forth often made was that doing experiments to get to the HIV/AIDS therapies we have now was a matter of personal autonomy. That is to say, “We, as adult people, have the right to have a voice in how the studies are being done,” as opposed to the very anal-retentive ways that the FDA was asking for clean data at the time. So, alternative pathways for drug approval were created based on arguments about personal autonomy, not only public health. Today, the argument for personal autonomy is being used as a trump card against public health interventions like vaccination, which is a very unfortunate turn of events.</p>



<figure class="wp-block-pullquote"><blockquote><p>History doesn&#8217;t repeat; it rhymes.</p></blockquote></figure>



<p>Now, we live in a post-pandemic world. And as I&#8217;ve talked about in the book, there were different ways, depending on where you were, that governments dealt with lockdowns and made deals with different companies, and officials took bribes to get a certain vaccine into one country rather than another. Now, we&#8217;ve got to think about bioethics in terms of foregrounding the ethics of public health, because respect for personal autonomy only goes so far in a post-pandemic world. People were not thinking of it in those terms in the ’70s and ’80s, in the early field of bioethics — because, again, bioethics was oriented toward human experiments, not toward the ethics of public health.</p>



<p><strong><em>I want to zoom in on the bioethics of America’s domestic politics. You reference abortion and IVF. More recently, there have been <a href="https://www.aclu.org/news/immigrants-rights/detained-immigrants-detail-physical-abuse-and-inhumane-conditions-at-largest-immigration-detention-center-in-the-u-s" target="_blank" rel="nofollow">credible reports</a> of human rights abuses in immigration detention facilities. For instance, facilities like <a href="https://www.theguardian.com/us-news/2025/dec/04/alligator-alcatraz-human-right-violations-amnesty-report" target="_blank" rel="nofollow">Alligator Alcatraz</a> are black boxes. What’s going through your mind when you think about all this as a bioethicist?</em></strong></p>



<p><strong>Jonathan: </strong>I mean, being a Jew who grew up in the ’50s and ’60s, I felt even as a kid that we were in a kind of golden age because the world shrank back in horror at the Holocaust. But how long does that last? People have short memories.</p>



<p>Walker Connor, considered one of the founders of nationalism studies, basically <a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/9781118663202.wberen416" target="_blank" rel="nofollow">says</a>, we’ve got this term, “nation state,” which glosses over the difference between being a nation and being a state. Being a state is a legal status. Being a nation is psychological. <em>Why</em> do you identify with some group of people rather than others? If you’re, say, a Jew, does it start with the destruction of the Second Temple? Maybe, maybe not. Do you identify with any previous generation that was killed in the Holocaust? Do you identify with the people who go to Chinese restaurants for, you know, Christmas Eve? I mean, how do you decide what nation you&#8217;re a part of?</p>



<p> Cultures are organisms, and you can&#8217;t separate one thing, like bioethics, from other things.  Today, there are three wars going on — in Gaza, Ukraine, and Sudan; Amnesty International has said that the moral norms around trying to minimize death are at risk. Steven Pinker has argued that we are actually better off now than we were hundreds of years ago because the chances of dying by violence have decreased in the last hundred years. But I fear our recent progress is being eroded, and those calculations don’t include psychological harms —</p>



<p><strong><em>And spiritual fulfillment.</em></strong></p>



<p><strong>Jonathan: </strong>— and I don&#8217;t know how you would <em>do</em> that in fairness, you know. But he goes as far as to say that we&#8217;re so much more civilized now than we were. And that&#8217;s true in some ways and not true in others. But it&#8217;s hard to see how the path we&#8217;re on leads to a happier future. There&#8217;s been a general shift back toward ethnic nationalism that I think liberals like me have underappreciated. History doesn&#8217;t repeat; it rhymes.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em><strong>Jonathan D. Moreno</strong> is the David and Lyn Silfen University Professor Emeritus at the University of Pennsylvania. He has served as a staff member or adviser to many governmental and nongovernmental organizations, including three U.S. presidential commissions, the Howard Hughes Medical Institute, the Bill and Melinda Gates Foundation, and the UNESCO International Bioethics Committee. He is the author of several books, including “<a href="https://mitpress.mit.edu/9780262553377/absolutely-essential/" target="_blank">Absolutely Essential</a>.”</em><br></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Brandt.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/Brandt.jpg" />                                        </item>
        			                <item>
                        <title>From Atoms to AI: The Futile Search for a “Perfect” Language</title>
                        <link>https://thereader.mitpress.mit.edu/from-atoms-to-ai-the-futile-search-for-a-perfect-language/</link>
                        <pubDate>Tue, 17 Feb 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Andrea Moro</dc:creator>
                        		<category><![CDATA[AI]]></category>
		<category><![CDATA[Elements]]></category>
		<category><![CDATA[Language]]></category>
		<category><![CDATA[Lucretius]]></category>
		<category><![CDATA[Philosophy]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18773</guid>
                        <description><![CDATA[<p>Tracing the boundaries of reason through Lucretius and Descartes, and what they reveal about the cognitive limits of both humans and machines.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Tracing the boundaries of reason through Lucretius and Descartes, and what they reveal about the cognitive limits of both humans and machines.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/lucretius-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock
</figcaption>
</figure>

<p class="has-drop-cap">In order to convince us of the atomic hypothesis, wherein atoms are the building blocks of all matter, the Roman philosopher Lucretius uses the model of the alphabet. Like alphabetical letters, he proposed, there is not an infinite number of types of matter. Rather, everything we observe is the result of infinite <em>copies</em> of finite types of elements that, due to their specific shape, are driven by random movement, expressing new shapes and properties that the atoms themselves did not originally possess.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262554015/lucretius-and-the-bat-with-blue-eyes/" target="_blank"><img loading="lazy" decoding="async" width="320" height="498" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/lucretius-jacket.jpg" alt="" class="wp-image-18774"/></a><figcaption class="wp-element-caption">This article is adapted from Andrea Moro&#8217;s book, “<a href="https://mitpress.mit.edu/9780262554015/lucretius-and-the-bat-with-blue-eyes/" target="_blank">Lucretius and the Bat with Blue Eyes</a>.”</figcaption></figure>
</div>


<p>However, this credible and fascinating hypothesis leaves us with a new problem, which Lucretius does not invoke but which is logically valid: Is there any order among the basic elements themselves — that is, an underlying structure or set of rules? And can this order manifest itself in the objects we can observe?</p>



<p>Lucretius explored these questions through the metaphor of language — specifically through the “elements” of the alphabet. In Western alphabets, as everyone knows, letters have a rigid order, one that dates back to a distant history, beginning in Mesopotamia and Egypt. It is certainly no coincidence that the meaning of the word <em>alphabet</em> alludes precisely and solely to the <em>relative</em> order of the first two letters (<em>alpha</em> and <em>beta</em>). However, there appears to be no deeper reason for any letter to precede any other in the alphabet. This same problem of order arises whenever we speak of any primitive elements: There is no deeper reason for any element to precede the other.</p>



<p>Reading Lucretius and his insistence on the question of order reminded me of the words of another canonical giant — and something he expressed in a relatively neglected text. I’m speaking of René Descartes, who, in a 1629 letter to Father Mersenne — a renowned philosopher and mathematician — opined about the project of a mutual friend, mathematician and polyglot Claude Hardy. The project, which is not mentioned in any other source, centered on a proposed artificial language that could be learned “in five or six hours.”</p>



<p>Descartes, without hesitation, completely and irrevocably demolishes this proposal for a series of exemplary reasons, ranging from the difficulty of constructing valid sounds for all languages to the fact that an invented language without literature, if not an epic, is practically useless (those who know J. R. R. Tolkien’s invented languages know this well). His reaction, however, goes far beyond the <em>pars destruens</em>, offering an alternative proposal that is quite amazing and totally unexpected — a reasoned and systematic catalog of all infinite possible thoughts:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>Besides, I find it possible to add another invention to this, both for composing the primitive words of this language and for their characters, in such a way that it can be taught very quickly and by means of an order, that is to say, by establishing an order among all the thoughts that can enter the human mind, just as there is a natural order established among numbers. And just as in a single day one can learn to give names to all numbers up to infinity and to write them in an unknown language, even though they would amount to an infinite number of different words, one could do the same thing with all other necessary words to express all the other things that fall within the human mind. If this order were discovered, I have no doubt that this language would soon spread throughout the world, for many people would have the desire to spend five or six days in order to make themselves understood by all men.</p>
</blockquote>



<p>Before approaching the very core of this quotation, let’s consider some important collateral aspects. Descartes is not the only one who dared dream of discovering a better or a perfect or universal language; shortly after him, in the same century, Gottfried Wilhelm Leibniz offered his <em>Lingua characteristica universalis</em> and Juan de Caramuel y Lobkowitz, bishop of Vigevano, his <em>Leptotatos</em>; jumping ahead to the 20th century, Louis Hjelmslev presented his <em>Glossematics</em> and Giuseppe Peano his <em>Algebra de Gramatica</em>. These efforts continued all the way to the great scam of ingenious and perfect languages, which have sadly led to the delirium of ranking tongues and the effort to anchor them in the notion of “race.”</p>



<p>There is, of course, a danger to searching for a perfect language. It started in the mid-19th century with the German-born linguist Max Müller’s hypothesis that a noble language could be spoken only by people of the noble race, with the term “noble” standing in for “Aryan.” He repudiated his theory later in life, but by then it was too late: This delusional idea had already taken root in countries all over the world and fed the political propaganda of governments calling for aggression against other populations, and even extermination, as in the case of Jewish people.</p>



<p>The fact that the notion of race is <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC11291859/" target="_blank" rel="nofollow">not biologically sustainable</a> proved to be completely insufficient in preventing the social consequences of this delirium. In fact, the human impulse to name shared phenotypical differences between groups of people — skin color, eye shape, or average height — emerges naturally, if not always under the collective term “race,” which nowadays has negative connotations. Even if these biological details are eliminated on scientific grounds, another much more dangerous and underestimated issue remains: cognition. Indeed, two independent ideas that are not generally taken to be dangerous in isolation constitute a powerfully explosive mixture if considered together.</p>



<p>The first idea is the hypothesis that some languages are “better” than others. There is a notion that some languages have a lexicon that captures abstract concepts better than others, that some operate more quickly than others, and that some appear to have a more stable word order at the level of sentences, while others have an unconstrained, even chaotic word order. Meanwhile, some view certain languages as more acoustically pleasant or harmonic than others.</p>



<figure class="wp-block-pullquote"><blockquote><p>Being in love with someone doesn’t authorize the lover to argue that one’s beloved is the best person in the world; the same holds for languages.</p></blockquote></figure>



<p>The second idea is the hypothesis that a person perceives reality and reasons about it differently depending on what language they speak.</p>



<p>These two hypotheses are the most radical, dangerous, and pervasive of all racisms: the one that bears on a person’s capacity to understand, think, and love. Interestingly, Dante Alighieri, in his “On Eloquence in the Vernacular,” already knew that this was a delusional argument. He mocks those who think a “better” language is possible by referring to Pietramala, an abandoned small village or a crumbling manor on the Italian Apennines between Bologna and Firenze, where the Tarlatis, a family rival to the Alighieris, lived: “Pietramala is a great city indeed, the home of the greater part of the children of Adam. For whoever is guided by such an <em>obscene</em> reasoning as to think that the place of his birth is the most delightful spot under the sun, he may also believe that his own language, i.e., his mother tongue, excels over all others; and, as a result, he may believe that his language was also Adam’s language.”</p>



<p>In this passage, Dante uses the very strong and aggressive adjective, “obscene” (in Latin <em>obscenus</em>, meaning both “indecent” and “inauspicious”), to qualify this reasoning — quite surprising for a lofty treatise devoted to intellectuals of all countries. This stylistic mark indirectly highlights Dante’s strong feelings, if not outrage: By exploiting such a harsh irony and unexpected verbal crudity, he expresses his opposition to those who conflate the affective domain with the rational one. Being in love with someone doesn’t authorize the lover to argue that one’s beloved is the best person in the world; the same holds for languages.</p>



<p>Had Western culture listened to and read Dante carefully — something that sadly didn’t happen, as even Alessandro Manzoni recognized — we might not have engaged with the delirious notion of a noble language and a noble race. There is simply no language that is better than another — and certainly no perfect language.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Turning back to Descartes’s reflections, his comparison of a natural order of thoughts to the natural order of numbers is astonishing, not only for the force of his analysis but also for the originality with which he weaves ideas together to generate new understanding. It arguably surpasses all other similar dreamers in the history of Western thought: What would be needed is “an order among all the thoughts that can enter the human mind, just as there is a natural order established among numbers.”</p>



<p>Some examples may clarify his intuition: No one ever doubts which number follows, say, the number called <em>one thousand seven hundred and twenty-two</em> and what its name is; but when it comes to which word follows the word <em>cloud</em>, no one knows, and this is necessarily the case. We can perhaps think of the plural <em>clouds</em>, and then <em>cloudy</em>, the adjective that is derived from it, but we know these only because in the alphabetical order another letter, <em>s</em>, is added to <em>cloud</em>, making <em>clouds</em> a successor of <em>cloud</em>, and <em>cloudy</em> a successor to <em>clouds</em> because <em>y</em> follows the <em>s</em>. But that pertains to the form, the signifier, the sound: When it comes to a word’s content, the signified or the meaning, we are just not capable of placing words into an exhaustive, complete natural order.</p>



<p>Such was the dream expressed by Descartes, or perhaps we should say <em>mirage</em>, but it is a very important dream for at least three reasons.</p>



<p>The first is that everyone would like a map of meanings to define the borders of Babel, which would amount to a map of <em>impossible meanings</em>. The second is that we would like to understand how words are stored in our brain; sometimes slips make us understand that they are in alphabetical order, leading us to mistakenly utter <em>bed</em> instead of <em>bread</em>, but other times they must be arranged according to meaning because we will mistakenly utter a conceptually contiguous noun, <em>chair,</em> instead of <em>bed</em>. The third is that today, machines are being built that can work with language automatically; I am thinking of popular “talking machines” such as ChatGPT.</p>



<p>These machines utilize a type of computational model involving “neural networks” or (very) large language models, referred to as “(v)LLMs,” the latest in a series of inventions that began in the 20th century, which first went by the now obsolete term “cybernetics” before being renamed “artificial intelligence.” While AI is a fairly opaque term for this, it is at least less pretentious than the term “neuronal or neural networks,” whose mechanisms remain quite mysterious given what we know about neurons.</p>



<p>None of these attempts can compete in ambition with the Cartesian dream of a natural order among word meanings, but there is a substantial difference: The Cartesian dream suggests an opportunity to know language structure, our limits, and thus, ultimately, ourselves in a better way. Meanwhile, modern technology aims to facilitate everyday life by creating machines that can help humans perform routine tasks. Ever since the invention of the first tool, we have simply designed artificial objects to help us avoid fatigue and boredom.</p>



<figure class="wp-block-pullquote"><blockquote><p>The very idea of a grammar for the cosmos still stimulates us today, and it allows us, 2,000 years later, to think about our limits.</p></blockquote></figure>



<p>One difference needs to be highlighted, though: To properly compare and contrast humans and machines, it is important to understand each one’s intrinsic limits. It wasn’t that long ago that we thought machines were too primitive to resemble humans and that we needed to wait for more advanced technology for them to be capable of doing what we do.</p>



<p>Today, talking machines have led us to radically overturn this perspective and support the epoch-making strides of formal and comparative linguistics in the second half of the 20th century. The central fact is quite easy to grasp, given what we’ve observed so far: For machines, there exist no impossible languages. Actually, we need to be clear on this point: Even if a machine demonstrates a different “behavior” in how it understands an impossible language (the way humans do), the nonbiological nature of impossible languages still establishes a difference between humans and machines. Humans exploit natural circuits developed by genetic instructions under evolutionary forces, but only for possible languages; impossible languages progressively inhibit those circuits and are computed by other networks in the brain. In other words, for machines and their grammars (the vLLM-based programs), even the very notion of “natural circuit” has no empirical equivalent.</p>



<p>Machines don’t resemble us humans, not because they lack computational power, but because they are too powerful. Ultimately, they do not look like us because they do not have our limits: After all, we <em>are </em>our limits.</p>



<p>Ultimately, what stands out after reading Lucretius, at least to me, is that his reflections on the alphabet as a model of the universe should not be dismissed as an antiquarian discovery. The very idea of a grammar for the cosmos still stimulates us today, and it allows us, 2,000 years later, to think about our limits; our understanding of the world and its connection with our brain; and, finally, to trace a meaningful, non-mystical, and in fact quite measurable and substantial border between humans and machines. We still have good reasons to read Lucretius and allow him to lead us to formulate the right questions about the very different kind of Big Bang — the big bang of language — because ultimately, when we reflect on language, the data we’re considering is ourselves.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Andrea Moro</em></strong><em>,</em><strong><em> </em></strong><em>member of the Accademia dei Lincei, is Professor of General Linguistics at the Institute for Advanced Study (IUSS) in Pavia and at the Scuola Normale Superiore in Pisa, Italy. He is the author of “<a href="https://mitpress.mit.edu/9780262549233/impossible-languages/" target="_blank">Impossible Languages</a>,” “<a href="https://www.amazon.com/Secret-Pietramala-Andrea-Moro-ebook/dp/B0C9TXJG1Z" target="_blank" rel="nofollow">The Secret of Pietramala</a>,” and “<a href="https://mitpress.mit.edu/9780262554015/lucretius-and-the-bat-with-blue-eyes/" target="_blank">Lucretius and the Bat with Blue Eyes</a>,” from which this article is adapted.</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/lucretius.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/lucretius.jpg" />                                        </item>
        			                <item>
                        <title>Will Life on Mars Require a Genetic Rewrite?</title>
                        <link>https://thereader.mitpress.mit.edu/will-life-on-mars-require-a-genetic-rewrite/</link>
                        <pubDate>Thu, 12 Feb 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Scott Solomon</dc:creator>
                        		<category><![CDATA[Evolution]]></category>
		<category><![CDATA[Genetics]]></category>
		<category><![CDATA[Mars]]></category>
		<category><![CDATA[Space]]></category>
		<category><![CDATA[Science & Tech]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18809</guid>
                        <description><![CDATA[<p>Microgravity, radiation, and extreme climates pose ethical and biological challenges that researchers are racing to overcome.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Microgravity, radiation, and extreme climates pose ethical and biological challenges that researchers are racing to overcome.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/mars-cover-copy-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source image: Adobe Stock</figcaption>
</figure>

<p class="has-drop-cap">Chris Mason is a man in a hurry.</p>



<p>“Sometimes walking from the subway to the lab takes too long, so I’ll start running,” he told me over breakfast at a bistro near his home in Brooklyn on a crisp autumn morning. “Just so I can get there faster. Not because I’m late for a meeting, just because it’s taking too long to walk…I’m the only one I know who runs to work to get there faster.”</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262051514/becoming-martian/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Untitled-1-copy.jpg" alt="" class="wp-image-18810"/></a><figcaption class="wp-element-caption">This article is adapted from Scott Solomon&#8217;s book, “<a href="https://mitpress.mit.edu/9780262051514/becoming-martian/" target="_blank">Becoming Martian</a>.”</figcaption></figure>
</div>


<p>Mason is a professor of physiology and biophysics at Weill Cornell Medicine. At least that’s his official title. He seems to be working on a hundred different projects all at once, ranging from tracking changes in the virus that causes COVID-19 to helping corals adapt to climate change.</p>



<p>The previous day, I had visited his research group on the Upper East Side. The Mason Lab occupied four separate laboratories across three buildings and was still growing. Although they were pursuing a wide range of projects, a major focus of their work was on how the human genome and microbiome are affected by spaceflight. What Mason and his researchers know for sure is that settlement of space will lead to major changes to our biology, one way or another.</p>



<p>If we let these changes unfold naturally, evolution will take its course, and people on Mars will gradually become better adapted to the conditions there through mutation and natural selection. <a href="https://en.wikipedia.org/wiki/Founder_effect" target="_blank" rel="nofollow">Founder effects</a> and <a href="https://en.wikipedia.org/wiki/Genetic_drift" target="_blank" rel="nofollow">genetic drift</a> will cause random changes in Martians and reduce their genetic diversity. Enforced quarantines due to the risk of spreading infectious diseases could accelerate the speciation process. But these natural evolutionary processes will be relatively slow and, to put it mildly, quite unpleasant. What if we could accelerate the process of adaptation and minimize the human suffering that it would otherwise entail?</p>



<p>Mason thought that we could — and he laid out his argument in detail. “One possibility is we simply allow evolution to gradually select for characteristics required to survive on these new planets,” he wrote in his book, “<a href="https://mitpress.mit.edu/9780262543842/the-next-500-years/" target="_blank">The Next 500 Years</a>.” “This is basically the ‘sink or swim’ approach to life’s survival, except with no lifeguards and bricks tied to your feet.”</p>



<p>However, there <em>is</em> an alternative: “Our second option to enable Earth’s life to live on other planets is to preemptively direct this genetic process, so that the life we send is already capable of surviving in its new home. More complex, yes — but also more humane.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">The basic idea of acquiring new abilities by taking DNA from one organism and putting it into another has existed since the 1970s. In 1972, biochemist Paul Berg became the first person to do this when he copied a short piece of DNA from a virus that attacks bacteria into another kind of virus that attacks monkeys. The following year, Herbert Boyer and Stanley Cohen applied this “gene splicing” technique to insert genes from one species of bacteria into another. They found that the inserted gene persisted in subsequent generations as the bacteria divided. Going one step further, they then spliced genes from a frog into a bacterium, and <a href="https://americanhistory.si.edu/collections/object-groups/birth-of-biotech/recombinant-dna-in-the-lab" target="_blank" rel="nofollow">found</a> that the frog genes became a permanent addition to the bacterium’s genome.</p>



<p>This was the dawn of a revolution in biotechnology. Recombinant DNA — meaning DNA copied from one organism and pasted into another — could be used for an incredible <a href="https://www.theguardian.com/science/2011/sep/11/genetically-modified-glowing-cats" target="_blank" rel="nofollow">number of things</a>, from producing life-saving medications like insulin to more whimsical applications like making glow-in-the-dark cats and goldfish. But it was also the beginning of an era in which humans could directly control the evolution of any species by manipulating their DNA. Mason saw the potential to take genes from organisms naturally well-adapted to harsh conditions and insert them into human cells to help prepare people for the hazards beyond Earth.</p>



<figure class="wp-block-pullquote"><blockquote><p>Settlement of space will lead to changes, one way or another. If we let it unfold naturally, evolution will take its course.</p></blockquote></figure>



<p>One candidate for such a hardy creature is the water bear, or tardigrade. Tardigrades are distant relatives of insects but have a unique appearance. They are barely visible to the naked eye, but under a microscope, they look like tiny gummy bears with eight chubby little legs and mouths shaped like nozzles. They thrive in moisture, but their adaptability allows them to live almost anywhere, from the sea to the soil in your backyard. One of the ways they are able to live in such a wide range of habitats is by tolerating long periods of harsh conditions — say, a drought — by essentially shriveling up. In their dehydrated state, they are almost invincible, which is what has drawn the attention of biologists interested in life in outer space.</p>



<p>In 2016, a team of Japanese researchers led by Takekazu Kunieda and Atsushi Toyoda sequenced the genome of one particularly hardy species of tardigrade. In the process, they discovered that tardigrades produce a protein that helps them survive in a dehydrated state. They named the protein “damage suppressor,” abbreviated Dsup. The researchers then took a major leap: They extracted the Dsup gene from the tardigrade genome and temporarily spliced it into human cells. To be clear, the human cells were growing in a laboratory, not a human body. Nevertheless, they found that when the tardigrade gene was inserted, human cells could produce Dsup. And — most significantly — when they exposed the human cells making Dsup to radiation in the form of X-rays, the cells were less damaged and better able to grow than normal human cells.</p>



<p>Chris Mason’s lab began working to further improve human cells&#8217; ability to withstand the harsh conditions of space by splicing in genes from tardigrades and other organisms that survive in extreme environments. He sees this as the beginning of an era in which human cells can be endowed with a great variety of abilities. He predicts that by the year 2040, “genes from all organisms will become a playground for creating and making new functions in human cells.”</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">The notion that the diversity of life on Earth represents a genetic “playground” for us to draw from seems exciting, but it is also fraught with risks, ranging from the biological to the ethical. </p>



<p>Indeed, after the first demonstrations of gene splicing, researchers quickly recognized the double-edged-sword nature of the new technology. A voluntary moratorium was called on the use of recombinant DNA. A conference was held in 1975 at Asilomar Beach in California, where many leading researchers came together to develop guidelines for its use. The meeting, which has variously been called “Woodstock for molecular biology” and “the Pandora’s box conference,” was in part an attempt by researchers to come up with their own limits on the use of genetic technology in the hopes that doing so would prevent government regulations. In the end, after four days of meetings, the researchers agreed to lift their self-imposed moratorium on the use of recombinant DNA, albeit with some guardrails intended to prevent what they saw as its most potentially dangerous uses.</p>



<p>It seemed plausible that some genetic diseases could be cured with recombinant DNA by swapping out the section of DNA responsible for the condition with DNA from a healthy donor.</p>



<p>The first success came in 1990 with a child named Ashanthi DeSilva. At the age of two, she had been diagnosed with SCID — the same condition as the “<a href="https://en.wikipedia.org/wiki/David_Vetter" target="_blank" rel="nofollow">bubble boy</a>,” David Vetter — which leaves its sufferers without a functioning adaptive immune system. In 1990, when DeSilva was four years old, she became the first patient in an approved trial of an experimental gene therapy, designed to replace the cells in her bone marrow that cause the condition. It worked. With the modified genes, DeSilva’s immune system began functioning well enough for her to go outside, attend school with other kids, and lead a normal life.</p>



<p>Another major breakthrough came with the discovery of a way to edit DNA directly. It happened, as many scientific discoveries do, in a roundabout way. In 1990, Francisco Mojica was a graduate student at the University of Alicante in Spain, studying a type of single-celled microbe called archaea. After sequencing some of their DNA in hopes of learning how they tolerate so much salt, he found something unexpected. In between sections that looked to him like normal DNA, with the usual combination of all four DNA bases A, T, C, and G, were sections that kept repeating the same bases. Even stranger, these repetitive sections were also palindromes, meaning they could be read the same way forward and backward. He found 14 of these sequences clustered at regular intervals around otherwise normal DNA sequences.</p>
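In the stricter molecular-biology sense, these repeats are palindromic because each sequence equals its own reverse complement, which is what makes a double-stranded stretch "read the same" from either end. A minimal sketch in Python illustrates the idea (the function name and example sequences are illustrative, not drawn from Mojica's data):

```python
# A DNA palindrome equals its own reverse complement:
# the sequence read 5'-to-3' on one strand matches the
# sequence read 5'-to-3' on the partner strand.
COMPLEMENT = str.maketrans("ATCG", "TAGC")

def is_dna_palindrome(seq: str) -> bool:
    """Return True if seq equals its reverse complement."""
    seq = seq.upper()
    return seq == seq.translate(COMPLEMENT)[::-1]

# GAATTC (the EcoRI restriction site) is a classic DNA palindrome:
print(is_dna_palindrome("GAATTC"))   # True
print(is_dna_palindrome("GATTACA"))  # False
```

Restriction enzymes and CRISPR-associated machinery alike exploit this symmetry: a reverse-complement palindrome looks identical to protein machinery approaching from either strand.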



<p>Puzzled, Mojica searched the scientific literature for anything similar in other organisms. He found only one other organism that seemed to share the same peculiar cluster of repetitive DNA sequences. He published his results, unsure of the sequences&#8217; function. He would later give them a cumbersome name with a catchy acronym — clustered regularly interspaced short palindromic repeats, or CRISPR.</p>



<figure class="wp-block-pullquote"><blockquote><p>Is it ethical to make decisions that will directly affect future generations who will not have any choice in the matter?</p></blockquote></figure>



<p>Soon, CRISPR sequences were found in a wide range of other microorganisms. Researchers in the dairy industry found them in the bacteria that ferment milk into cheese and yogurt. Intriguingly, they noticed that new CRISPR sequences appeared in the dairy bacteria after an attack by viruses — and that the CRISPR sequences matched sequences from the viruses’ genomes. What’s more, the bacteria with the new CRISPR sequences were no longer vulnerable to attack from the same virus. The CRISPR sequences were acting as a type of immune response by the bacteria: The bacteria were learning to recognize the virus so that they could defend against it in the future.</p>



<p>The mechanism for how CRISPR works was figured out by a team of researchers led by biochemists Jennifer Doudna and Emmanuelle Charpentier at UC Berkeley. They discovered that CRISPR works with the help of proteins, called Cas — short for CRISPR-associated proteins. Cas proteins, such as Cas9, cut DNA like a molecular scalpel. Bacteria use CRISPR-Cas9 to recognize the unique DNA of a particular virus and then chop it up to destroy it. But what Doudna and Charpentier also found is that they could control which DNA sequence was targeted. It didn’t have to be DNA from a virus. It could be DNA from any living thing. If the DNA is inside a living cell, the cell’s machinery will naturally repair the damage. But the most exciting part of all was that Doudna and Charpentier found that they could manipulate the repair process so that a stretch of DNA could be cut out and replaced with any sequence they wanted. In other words, it was programmable.</p>



<p>“In the history of science, there are few real eureka moments, but this came pretty close,” <a href="https://www.amazon.com/Code-Breaker-Jennifer-Doudna-Editing/dp/1982115858" target="_blank" rel="nofollow">wrote</a> Doudna biographer Walter Isaacson about the breakthrough. Unlike the copy-and-paste gene-splicing approach, CRISPR can make precise, deliberate edits to an organism’s genes. “In short, they realized that they had developed a means to rewrite the code of life,” wrote Isaacson.</p>



<p>In the years that followed, clinical trials got underway to test whether CRISPR could be used to treat conditions ranging from diabetes and blood disorders to certain forms of cardiovascular disease and cancer. By 2023, the first two CRISPR-based treatments were approved in the United States — one for sickle cell disease and another for the blood disorder beta-thalassemia.</p>



<p>There is a catch, however. While the hope is that patients receiving CRISPR treatments will be fully rid of their diseases, the gene-editing approaches approved so far would not prevent any of their children from inheriting their parents’ diseases. The genetic changes are made only to DNA in somatic cells — the cells of the body that are not involved in making sperm or eggs. For their children to be cured, they would need to undergo the same treatment as their parents. The same would be true of every subsequent generation.</p>



<p>The alternative would be to make edits to cells in a way that affects not only somatic cells but also germline cells — those that become eggs, sperm, and eventually embryos and then babies. Germline gene editing is possible, although it crosses a line that some believe should not be crossed. The reason is that any edits made to germline cells will affect all the descendants of the individual receiving the treatment, for countless generations. This raises new types of ethical questions. It is one thing to perform a procedure on a living person, who can be educated about the potential risks and benefits and who can give their informed consent. Is it ethical to make decisions that will directly affect future generations who will not have any choice in the matter?</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Changing the germline means we are, whether we realize it or not, controlling the future of evolution. Yet while the techniques of gene editing are new, the idea that we humans can guide evolution is not. As Chris Mason pointed out, we have been doing so for millennia through the practice of selective breeding in agriculture and the domestication of animals.</p>



<p>“While controlling the evolution of the past, present, and future seems scary and wrought with incredible hubris, the reality is that we already have been engineering and modifying species and the environment around us, except previously we were doing so by accident with no foresight,” Mason wrote. “Now, finally it can be done with a sense of responsibility and purpose.”</p>



<p>Yet the idea of purposefully controlling the evolution of our own species has a dark history.</p>



<figure class="wp-block-pullquote"><blockquote><p>Any discussion of manipulating the future of human evolution has to consider&#8230;the ways in which those efforts were perverted and abused.</p></blockquote></figure>



<p>In 1883, Francis Galton proposed improving our species through selective breeding in much the same way we do for animals, which he described as “the science of improving stock.” Among his investigations were the first studies of twins and an attempt to determine which physical characteristics criminals had in common so they could be recognized before committing crimes. Based on his observations, Galton thought it would be possible to make the characteristics he considered positive — such as good health, intelligence, and responsibility — more common in society by encouraging marriages between people from families with a history of these traits. He called this idea “eugenics.”</p>



<p>As Galton’s ideas spread, they also evolved. In addition to encouraging the breeding of people with supposedly good characteristics, some sought to achieve similar results by preventing the reproduction of people with traits they considered undesirable. The first government to enact laws based on eugenics was the state of Indiana in 1907, followed soon after by 31 other U.S. states. The laws included forced sterilization for people labeled “criminals, idiots, imbeciles, and rapists.” The issue was brought before the Supreme Court in 1927. The question was whether a 21-year-old woman named Carrie Buck could be surgically sterilized because she had been labeled an “imbecile,” which the prosecution argued was hereditary. In an 8–1 ruling, the Court determined that forced sterilization was indeed legal.</p>



<p>In Germany, the Nazi Party modeled its policies on the American eugenics laws. They passed a law in 1933 that mandated surgical sterilization for anyone they determined to be carrying a “hereditary disease.” But sterilization was just the first step. Soon, the Nazi efforts of “racial hygiene” would include murder and genocide.</p>



<p>The atrocities committed in the name of eugenics in the first half of the 20<sup>th</sup> century were based not only on prejudiced and racist views, but also on flawed science. We now know that there is little, if any, genetic basis for the traits that proponents of eugenics sought to control. Still, any discussion of manipulating the future of human evolution must consider the flaws inherent in previous attempts to do so, as well as the ways those efforts were perverted and abused.</p>



<p>Questions about the ethical use of gene editing become more complex when considering humans on other planets. Would it be ethical to change a gene to make a person traveling to Mars better able to tolerate lower gravity or higher radiation? What about for a child born on Mars? Could genome editing make it easier to allow people to move safely between planets, for example, by altering their immune systems?</p>



<p>Chris Mason sees gene editing in the context of space settlement as a moral imperative. “Sending any Earth-evolved organism to another planet would result in almost certain death, which represents the sad, evolutionary ‘good luck’ plan,” he wrote. “To save life, we will need to engineer it.”</p>



<p>Mason’s reasoning is based on an ethical philosophy he calls “deontogenics.” According to this way of thinking, as a species that is aware of the possibility of our own extinction and that of other species, we have an ethical obligation to try to prevent that from happening. “Any act that consciously preserves the existence of life’s molecules . . . across time is ethical. Anything that does not is unethical,” Mason wrote.</p>



<p>With this framework in mind, Mason and his research team are pressing forward on genetically engineering human cells to make them better adapted for conditions beyond Earth. They have had some success with getting human cells to produce the Dsup protein that helps tardigrades survive in space. So far, their work involves only human cells being grown in a lab, but he hopes that will soon change. “I’d say human trials are 10 years away,” he told me.</p>



<p>George Church, Chris Mason, and colleagues at Harvard’s Consortium for Space Genetics have identified a list of other genes that could be modified to help people deal with life on Mars and elsewhere. They include genes that influence bone density, muscle tone, radiation resistance, and even pain tolerance. In part, the list comes from studies of existing genetic variation within people alive today. It also draws on organisms capable of living in extreme environments, like tardigrades.</p>



<figure class="wp-block-pullquote"><blockquote><p>“Any act that consciously preserves the existence of life’s molecules&#8230;across time is ethical.&#8221;</p></blockquote></figure>



<p>One particularly hardy species of bacteria was first discovered in the 1950s in a can of meat that had been exposed to a whopping dose of 5 million millisieverts of radiation. The goal was to determine whether radiation could be used to sterilize canned foods and make them safe to eat. Yet the bacteria were still alive. The researchers identified them as belonging to the genus <em>Deinococcus</em> and named the species <em>radiodurans</em> in reference to their remarkable ability to endure such high radiation exposure. Even tougher bacteria, like the appropriately named <em>Thermococcus gammatolerans</em>, have been found in the water used to cool nuclear power plants. The genetic basis of these species’ abilities to withstand radiation is being investigated by Mason and his colleagues for their potential use in engineering life beyond Earth.</p>



<p>Another approach Mason is researching is to genetically engineer genes in bacteria and other microbes in our microbiome to produce useful products, including Dsup. This way, no changes to human cells would be required, but people might still reap the benefits if the substances produced by microbes are active within the human body. They already have some microbes in the lab that seem capable, he told me, but so far, they have not tested whether the microbes would work the same way when living in humans.</p>



<p>“It’s still a few years before we do a trial like that,” Mason said.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">So, we can edit our genes or those of our microbial partners. But there is yet another way that genetic technology may facilitate the human migration into space: by creating genes that do not yet exist.</p>



<p>In the first decades of the 21st century, the field of synthetic biology emerged with the goal of creating new functions for living things using genetic tools. Scientists at the J. Craig Venter Institute created the first complete synthetic genome of a simple organism, a type of bacteria, in 2010. Work has since been underway to create synthetic genomes for more complex organisms. Eventually, synthetic human genomes <a href="https://johnhawks.net/weblog/when-did-human-chromosome-2-fuse/" target="_blank" rel="nofollow">might be possible</a>.</p>



<p>One idea, suggested to me by biologist Tiffany Vora, is to create synthetic portions of a human genome. For example, while humans normally have 23 pairs of chromosomes, one or more new chromosomes could be added to augment our existing genome. “The idea that we’re going to find all the mutations we need in Earth’s situations — I don’t believe it, because we’re fundamentally looking for a non-Earth context,” Vora told me. The advantage of this approach is that the existing genome could be left untouched. “If you can make really long artificial chromosomes, then you don’t have to change the person — you just give them a patch, essentially.”</p>



<p>This raises the possibility that future humans with additional synthetic chromosomes may be genetically incompatible with people without them. If used for space settlement, this could be yet another force driving a wedge between humans from Earth and humans living elsewhere. Adapting to life in space may require genetic engineering, but engineering people for space might also contribute to a split in humanity. At some point, people may have to choose between prioritizing adaptation for life on other planets and maintaining human beings as a single species. It might not be possible to achieve both.</p>



<p>Other ideas for how to use technology to help people adapt to life beyond Earth include enhancing our bodies with mechanical, electronic, or robotic components. We are already accustomed to wearing glasses, using hearing aids, prosthetic limbs, artificial hearts, and many other devices to improve human health and well-being. Brain–computer interfaces can be added to the list.</p>



<figure class="wp-block-pullquote"><blockquote><p>“I wake up almost every morning and think about the Sun engulfing the Earth,” he told me. “It’s almost the first thought in my mind.&#8221;</p></blockquote></figure>



<p>Numerous private companies working on brain–computer interfaces have recently emerged, suggesting the technology is maturing. In 2024, Neuralink — a company owned by Elon Musk — implanted <a href="https://www.reuters.com/business/healthcare-pharmaceuticals/neuralinks-first-human-patient-able-control-mouse-through-thinking-musk-says-2024-02-20/" target="_blank" rel="nofollow">its first</a> experimental device in a human patient. As the technology improves, brain–computer interfaces will allow better control of artificial limbs and exoskeletons as well as other devices such as vehicles, robots, and more.</p>



<p>These technologies could certainly be helpful for life on other planets. Connecting the brain to devices that enhance the senses could give people the ability to see or hear in ways that our eyes and ears cannot do on their own. Imagine a Mars rover, with all of its sophisticated tools and machinery, controlled entirely by the human mind. Now imagine that you <em>are</em> the rover. Humans with these enhanced abilities could become the most capable and best-adapted Martians.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">If humans — in one form or another — are ever going to leave our solar system, Mars will be an important stepping stone. On Mars, humanity will learn to create and sustain new settlements. Chris Mason likens this first, cautious step in humanity’s lifetime to going to college. “Leaving the house you grew up in, traveling just out of the reach of your parent’s ability to instantly help you, and testing your limits, boundaries, and potential — all while having fun, learning a lot, and likely getting into trouble,” he wrote about our first settlements on Mars.</p>



<p>If we do manage to spread out and survive on planets scattered across our solar system and others, we should expect to evolve, adapt, and speciate everywhere we go. Like tortoises and finches on Earthly islands, the conditions on each of the cosmic islands will influence how the people there will evolve. Some may choose to let the natural forces of mutation, natural selection, and genetic drift determine how they change. Others may decide to take matters into their own hands, using technology to guide the process.</p>



<p>To ensure we are ready, Chris Mason is moving forward with his work on engineering the genes of living things — humans and microbes — for their future in space. Despite often thinking in timescales that involve hundreds, millions, or even billions of years, he sees his work as urgent.</p>



<p>“I wake up almost every morning and think about the Sun engulfing the Earth,” he told me. “It’s almost the first thought in my mind. It’s a cosmological fact. I see the Sun every morning. It’s still there, and it’s only going to get bigger…I only have so much time…I’ll have another, say, thirty years, forty years, maybe, of productive work I could do. Maybe fifty, at most. But that’s it. I don’t have 500 years…I want to do as much as I can.”</p>



<p>Suddenly, his fast talking made a little more sense.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Scott Solomon</em></strong><em> is a Teaching Professor at Rice University in Houston. He is also a Research Associate at the Smithsonian Institution’s National Museum of Natural History and the author of “<a href="https://www.amazon.com/Future-Humans-Science-Continuing-Evolution/dp/0300208715" target="_blank" rel="nofollow">Future Humans</a>” and “<a href="https://mitpress.mit.edu/9780262051514/becoming-martian/" target="_blank">Becoming Martian</a>,” from which this article is adapted.</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/mars-cover-copy.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/mars-cover-copy.jpg" />                                        </item>
        			                <item>
                        <title>Don’t Let Climate Fatalism Become a Self-Fulfilling Prophecy</title>
                        <link>https://thereader.mitpress.mit.edu/dont-let-climate-fatalism-become-a-self-fulfilling-prophecy/</link>
                        <pubDate>Mon, 09 Feb 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Hannah Ritchie</dc:creator>
                        		<category><![CDATA[Carbon]]></category>
		<category><![CDATA[Climate]]></category>
		<category><![CDATA[Emissions]]></category>
		<category><![CDATA[Environment]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18750</guid>
                        <description><![CDATA[<p>The idea that it’s “too late” to reduce emissions fuels cynicism and despair, putting us on an even worse trajectory.</p>
]]></description>
                        <content:encoded><![CDATA[<p>The idea that it’s “too late” to reduce emissions fuels cynicism and despair, putting us on an even worse trajectory.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/final-ritchie-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock</figcaption>
</figure>

<p class="has-drop-cap">I read Mark Lynas’s book “Six Degrees: Our Future on a Hotter Planet” when I was 14 years old, and it scared the life out of me. Lynas takes the reader on a journey of what to expect from a world that’s one degree warmer, two degrees, three degrees, all the way up to six degrees. By the middle of the book, your blood pressure is high; by the end, you’re on the floor.</p>


<div class="wp-block-image">
<figure class="alignleft size-full is-resized"><a href="https://mitpress.mit.edu/9780262052740/clearing-the-air/" target="_blank"><img loading="lazy" decoding="async" width="320" height="515" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/Clearing-the-air-jckt.jpg" alt="" class="wp-image-18754" style="width:242px;height:auto"/></a><figcaption class="wp-element-caption">This article is adapted from Hannah Ritchie&#8217;s book, “<a href="https://mitpress.mit.edu/9780262052740/clearing-the-air/" target="_blank">Clearing the Air</a>.”</figcaption></figure>
</div>


<p>It is a well-researched book that offers us a window into many possible futures. Fortunately, the scientific consensus has moved away from the most extreme scenarios since its publication. Unfortunately, a lot of the public messaging has not. Many people believe a pathway to 5°C or 6°C is already locked in, and the only thing we can do now is prepare for the worst.</p>



<p>Let’s look at what the latest science says about where we might end up by 2100.</p>



<p>If no countries stepped up their climate efforts, simply preserving what they <em>already put in place</em>, we might end up 2.5°C to 3°C higher than preindustrial temperatures by the end of the century. If countries met the targets they set for 2030 but enacted no policies afterward, we’d end up at 2.4°C. Many countries have set ambitious targets to reach “net-zero” emissions — most by the middle of this century. If they achieved this, we’d be in a 1.8°C warmer world.</p>



<p>This is both good news and bad news.</p>



<p>The good news is that we’re no longer heading for the worst-case scenarios that scared me as a teenager. The plunging costs of solar, wind, batteries, and electric vehicles, a step up in national policies, and a better understanding of what our energy future might look like have taken us off that terrifying path. And, importantly, countries have put commitments on the table that would keep us “well below 2°C.” Now, we’d be naive to assume that they’ll all deliver. But it does give us concrete pledges that we can hold governments to account on.</p>



<p>The bad news is that one of our global targets, keeping warming below 1.5°C, is now out of reach. And a 2.5°C warmer world — which we’re on course for — is still a scary and unacceptable one. It could spell the end of many coral reefs. It could cause significant damage to food production, especially in some of the poorest countries. Large parts of the world will experience grueling heatwaves. Arctic sea ice will be gone in the summer. Ice sheets are at a much higher risk of becoming unstable. We really want to avoid ending up there. And we can: 2.5°C or 3°C is not “locked in.” There is still time to put ourselves on a better trajectory.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">To actually <em>get</em> on that trajectory, we need to think about our climate targets more constructively.</p>



<p>First, let’s be honest about where we’re heading. The 1.5°C target is dead. If our carbon emissions dropped to zero tomorrow, we <em>could</em> achieve it. But the reality is that our emissions aren’t going to fall quickly enough (I’m optimistic, but I’m not delusional). We need to be honest about this for a couple of reasons. One, countries need to adapt to the post-1.5°C world we’ll soon be living in; pretending this won&#8217;t happen robs them of the time they need to prepare. Two, the public — who are repeatedly told that 1.5°C is still within reach — will start to lose trust when we pass that target.</p>



<p>Second, we must avoid the temptation of throwing in the towel. That’s the most important message here: There is no point of no return that makes it pointless to act. Our 1.5°C and 2°C targets are not cliffs or thresholds. Every tenth of a degree is worth fighting for as it reduces the impacts of climate change and limits the damage that’s to come. 1.7°C is better than 1.9°C, which is better than 2.1°C. We need to stop obsessing over arbitrary targets and focus on how we can help reduce our carbon emissions as quickly as possible.</p>



<p>Third, we should all watch out for headlines based on worst-case scenarios. We’re not on the same trajectory to 4 or 5°C that we thought we were a decade ago. Unfortunately, a lot of reporting and studies are still based on these worst-case scenarios. It’s sometimes hard for nonexperts to know what scenario is being assumed without reading jargon-filled academic papers. My one quick piece of advice is to look out for any mention of “RCP8.5”: This is the name of the worst-case (and now implausible) emissions scenario that has often been used in climate modeling. Of course, knowing the impacts of these extreme cases is useful for scientists, but not for policymakers or the public, who may assume that this is the most likely outcome. It does make for a great apocalyptic headline, though.</p>



<p>With all of that said, let’s turn our attention to concrete actions you can take to reduce your <em>personal</em> carbon footprint: If you drive, then cycle, walk, or take public transport more. If you <em>need</em> a car, then an electric one is much better than a gasoline or diesel one. If you fly, this will be a big chunk of your footprint. I won’t tell anyone to stop flying completely (because for most people, it’s not going to happen), but reducing the amount you fly would make a massive difference.</p>



<figure class="wp-block-pullquote"><blockquote><p>We must avoid the temptation of throwing in the towel.</p></blockquote></figure>



<p>At home, heating and air conditioning will be your biggest energy-guzzler. Getting your home insulated and switching from a gas boiler to an electric heat pump will slash your home’s footprint (and your bills). Installing solar panels at home will also reduce your carbon footprint while cutting your energy bills. Some can’t afford the upfront costs — or rent a flat where they don’t have the option of putting up solar panels — but it’s a worthwhile investment for those who can. Also, if you can manage, switch to a renewable energy provider; this sends a signal that more and more people care about climate change and want low-carbon energy.</p>



<p>When it comes to food consumption, consider eating less meat and dairy and moving toward a more plant-based diet. This doesn’t mean you have to go fully vegan; for many, the all-or-nothing approach is daunting. But you can still have an impact by cutting back, especially on beef and lamb.</p>



<p>And finally, stress less about the small stuff — recycling, plastic bags and food wrappers, food miles, turning the lights off, leaving devices on standby — especially if it comes at the expense of the big things listed above. This matters because of a concept called “moral licensing”: People feel that having done their part on the small stuff, they have license to ignore their more carbon-intensive behaviors. People will often feel proud about bringing their plastic bag to a supermarket (which has a tiny carbon footprint) and then fill it with meat and dairy (which have a much bigger impact).</p>



<p>Of course, all of these individual actions on their own are not going to get us to the climate future we want. At a societal level, we need to go bigger and faster.</p>



<p>We’ll have to deploy low-carbon electricity sources like solar, wind, nuclear, and geothermal as quickly as possible. To do that, we’ll need massive reforms around infrastructure projects so they can be completed more quickly. We’ll also have to accelerate advancements in batteries, which hold the key to the energy transition, and <a href="https://thereader.mitpress.mit.edu/how-to-fix-climate-change-a-sneaky-policy-guide/">electrify as many sectors as we can</a>, including road transport, heating, steel manufacturing, and short-haul aviation. Electrification is the most efficient way to decarbonize.</p>



<p>Additionally, we must reduce global meat and dairy consumption, while innovating high-quality protein alternatives, and invest in forest and ecosystem restoration to suck up lots of carbon. And finally, we should continue innovating in sectors that are not yet ready for large-scale deployment: cement and steel manufacturing, long-haul aviation, and ways to remove carbon dioxide from the atmosphere.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">One of the most common questions I get asked is: Won’t we reach a point where it’s game over and our planet’s systems collapse, a point where we trigger runaway warming? Tipping points — thresholds past which a system moves into an irreversible state — matter, but they don’t change what we need to do now to reduce emissions. There are a few key misconceptions about tipping points that are worth looking at here.</p>



<p>First, people often think that the planet has one single tipping point. Or they assume that the 1.5°C or 2°C targets themselves are a global tipping point: that once we pass them, we’re thrown into oblivion. That’s not true. There’s nothing special about 1.5°C. Things are not fine at 1.49°C but disastrous at 1.51°C.</p>



<p>Rather than a single <em>global</em> tipping point, there is a range of local or regional systems with different tipping points. Tropical coral reefs are one. The Amazon rainforest is another. The Greenland ice sheet. The Antarctic ice sheet. They won’t all “tip” irreversibly at once. While scientists don’t know exactly what temperature would trigger these individual points, there is a real risk of doing so, especially as warming gets toward 2°C. We shouldn’t hide from the devastating impacts this would have on regional ecosystems. But it’s <em>not</em> the case that they will set off runaway global warming, pushing us to 5°C.</p>



<figure class="wp-block-pullquote"><blockquote><p>All of these individual actions on their own are not going to get us to the climate future we want. At a societal level, we need to go bigger and faster.</p></blockquote></figure>



<p>Some tipping points will increase global temperatures a bit, but not by whole degrees. For example, if we were to have sea-ice-free summers in the Arctic (which seems likely), global temperatures would increase by around 0.15°C. A tipping point in the Amazon might have a similar effect. Hitting several of them could increase temperatures by 0.3°C or 0.4°C. That’s a lot. But it’s not the same as an abrupt change to a “Hothouse Earth.”</p>



<p>Another misconception is that these tipping points happen quickly: that if the Greenland ice sheet collapsed, sea levels would rise by 10 meters <em>within years</em>. Most of these large tipping points — like ice sheets — play out over centuries or even millennia. It might be 2500 or later before the ice sheet is mostly gone. Now, that would still be terrible — we don’t want to hand that problem to future generations. But it’s a very different problem from our coastlines shrinking within a decade, which is what people assume when they think about ice sheets “collapsing.”</p>



<p>All of this is to say that it’s not game over, despite what many apocalyptic predictions would have you believe. What we do matters, and what we ask of others — governments, companies, investors — does too. It’s never “too late” to protect what remains and build the better future that coming generations deserve.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Hannah Ritchie</em></strong><em> is a Senior Researcher in the Program for Global Development at the University of Oxford. She is also Deputy Editor of Our World in Data and has been awarded an Honorary Fellowship of the Royal Statistical Society. Ritchie is the author of “<a href="https://www.hachettebookgroup.com/titles/hannah-ritchie/not-the-end-of-the-world/9780316536752/" target="_blank" rel="nofollow">Not the End of the World</a>” and the forthcoming book, “<a href="https://mitpress.mit.edu/9780262052740/clearing-the-air/" target="_blank">Clearing the Air</a>,” from which this article is adapted.</em></p>


]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/final-ritchie.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/final-ritchie.jpg" />                                        </item>
        			                <item>
                        <title>Benjamin Bratton on Planetary Computation’s Next Phase</title>
                        <link>https://thereader.mitpress.mit.edu/computation-in-motion-from-the-personal-to-the-planetary/</link>
                        <pubDate>Thu, 05 Feb 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Benjamin H. Bratton</dc:creator>
                        		<category><![CDATA[AI]]></category>
		<category><![CDATA[Computation]]></category>
		<category><![CDATA[Planet]]></category>
		<category><![CDATA[The Stack]]></category>
		<category><![CDATA[Philosophy]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18904</guid>
                        <description><![CDATA[<p>A decade on, the technological implications of “The Stack” are still unfolding, challenging our sense of reality at every scale.</p>
]]></description>
                        <content:encoded><![CDATA[<p>A decade on, the technological implications of “The Stack” are still unfolding, challenging our sense of reality at every scale.</p>

<figure class="wp-block-image">
<img width="700" height="400" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/option-1-700x400.png" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>Source: DALL-E</figcaption>
</figure>

<p class="has-drop-cap">In November 2016, roughly 10 months after “The Stack” was first published, the then newly elected President of the United States, Donald Trump, faced reporters shouting questions about the potential role of foreign actors manipulating American social media to ensure his election. The new executive, who famously does not use a computer himself, looked unsure as to what he was being asked exactly. Like a child suddenly and unexpectedly uttering a profound truth, he replied, “The Whole Age of Computer [sic] has made it where nobody knows exactly what’s going on.” Indeed, little could better verify this claim than the circumstances in which he made it.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262553919/the-stack/" target="_blank"><img loading="lazy" decoding="async" width="320" height="411" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/stack-jkt.jpg" alt="" class="wp-image-18905"/></a><figcaption class="wp-element-caption">This article is adapted from Benjamin Bratton&#8217;s 10th-anniversary edition of “<a href="https://mitpress.mit.edu/9780262553919/the-stack/" target="_blank">The Stack</a>.”</figcaption></figure>
</div>


<p>“<a href="https://mitpress.mit.edu/9780262029575/the-stack/" target="_blank">The Stack</a>” was written mostly between 2011 and 2014, a relatively innocent time compared, of course, to when this 10th-anniversary edition will appear: Google had pulled out of China only a few years before. The Obama administration was promoting an anodyne “digital democracy” foreign policy based on the precious idea that greater global internet connectivity would inevitably lead to the collapse of autocratic regimes.</p>



<p>Then, as now, “The Stack” was an attempt to formulate a comprehensive “total” figure of what I termed “planetary-scale computation” (sometimes simply “planetary computation”). And it did so against the sentiments of the time, when all the intellectual energy in the humanities and social sciences was trained on local, particular contexts and, to the extent that technology was a concern, often on revitalizing discredited forms of social constructivism.</p>



<p>The alternative that the book offers is a holistic but necessarily fluid picture of the whole. It is less a map than a heuristic framework: a scaffold for pattern recognition and interpretation. By design, what occupies any layer of a modular Stack architecture is meant to be replaced by something new in the future while keeping the integrity of the whole intact. So too for the theory of The Stack.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Broadly speaking, there have been two fundamental transformations since “The Stack” was first published: one of <em>multiplication</em> and one of <em>replacement</em>.</p>



<p>A core notion is that as software and hardware systems absorb more functions of the state, functional <em>governance</em> (if not necessarily “politics”) is infused into the operations of those technical systems at the expense of legal and legislative institutions. Put directly, the multipolarization of geopolitics over the last decade and the multipolarization of planetary computation over the same period not only track one another; they are the same thing. To build a Stack is to build a society and vice versa, a truth that drives the new adversarial dynamics of world affairs over the competitive access to foundational Stack capacities. The term I coined for this fragmentation is <em>Hemispherical Stacks</em>: a China Stack, a US Stack, an India Stack, an EU Stack, and so forth.</p>



<figure class="wp-block-pullquote"><blockquote><p>The AI Stack will exist in a different world from ours.</p></blockquote></figure>



<p>Second, and perhaps even more important, has been the emergence of artificial intelligence from an aspirational proof of concept to a practical technology built into the infrastructures of everyday life. The importance of AI for the future of Stack architecture is hard to overstate. The replacement of the Stack layers built over the last 50 years by new technologies and institutions appropriate to the requirements of AI versus classical computing will be comprehensive. The AI Stack will exist in a different world from ours.</p>



<p>To that end, let me sketch how each layer of The Stack — <em>Earth</em>, <em>Cloud</em>, <em>City</em>, <em>Address</em>, <em>Interface</em>, <em>User </em>— has transformed over the last decade and how it may evolve in the decade to come.</p>



<p><strong><em>Earth</em></strong></p>



<p>The <em>Earth</em> layer is that from which the physical substrate of all computational technologies is drawn, including special minerals, metals, energy, and electricity. It is also a critical subject of computational analysis, such as for climate science. I have often made the point that the maturity of climate science is also an epistemological accomplishment of supercomputing simulations of climate past, present, and future. Enormous investments and grand maneuvers are made to secure access to critical supply chains and to frustrate adversaries from doing the same. </p>



<p>Recently, competition between China and the US for access to the minerals and compounds necessary for Stack construction has moved from an obscure policy discourse to the front pages. New computational architectures, from photonic chips to alternative battery materials, could shift things again quickly. In the meantime, the accelerating shift toward an AI Stack has prompted platforms to secure as much electricity as possible to support capacity growth. Some indications suggest that this rush may provide the long-needed economic incentive for the deployment of next-generation nuclear power, a development that would carry numerous historical ironies.</p>



<p><strong><em>Cloud</em></strong></p>



<p>One of the central themes of “The Stack” is how cloud platforms come to take on more of the functions of the Westphalian state and, in doing so, become state-scale actors, but simultaneously how states evolve their technologies of information sensing, modeling, and recursive action in relation to new computational opportunities. States, too, become cloud-based. </p>



<p>Over the last decade, this claim has become far less speculative than it was when the book was published. The multipolarization of both geopolitics and planetary computation has led to an increasing encapsulation of global platforms within the expanded borders of their host hemisphere: service withdrawals, app bans, deeper, direct connections with the state, etc. Ultimately, this is caused by a fundamental shift in the basic relations between data and sovereignty. The AI Stack will evolve amid these uncertain machinations. It is already identified as a primary domain of state security, but it also illustrates how private enterprises, from individual to corporate scale, may operate more independently.</p>



<p><strong><em>City</em></strong></p>



<p>I would be reluctant to say that, over the past decade, our cities have gotten <em>smarter</em>, but they certainly have been more deeply infused with computational systems and more decisively structured around the dictates of The Stack. That computational organ, still referred to as a phone, has become even more crucial to the spatial navigation of urban life, both in immediate contexts and in linking multiple experiential spaces and times into a single whole. Landscapes of QR codes have blossomed in response to this new reality, enabling an even more seamless, touchless, and anonymous urban society to take shape, for better or worse. </p>



<p>Zooming out, we see the functional segregation of humans into residential areas while the infrastructures of food, energy, material, and commodity production on which humans depend are cordoned off into other, relatively uninhabited exclusion zones, an arrangement unthinkable without the intricate synchronizations afforded by the city becoming a layer within planetary computation.</p>



<p><strong><em>Address</em></strong></p>



<p>Entire subcultures have appeared around new uses for <em>Address</em>-layer technologies to organize and radically decentralize global economies. It sounds unlikely, but <em>avant-garde accounting</em> concepts have become a strong gravitational force on the energies and imaginations of many. That is, blockchains and digital tokens have appeared and disappeared with dizzying pace, producing, storing, verifying, simulating, and often extinguishing value along the way. Key to all of these is the ability to identify discrete entities, physical or virtual, assign them a computational address, and consolidate them into a public registry for future auditing. </p>



<p>Ultimately, for all such systems, the specific structure of addressability often determines what is made into data, how it is produced, and thus what can be addressed. As for AI, the next decade will see intense interest not just in training on yet more existing data but in producing new data about the world for the calibrated purpose of training (and for whatever techniques come after “training” as we know it).</p>



<p><strong><em>Interface</em></strong></p>



<p>The <em>Interface</em> layer has always been perhaps the most psychologically rich point of contact between the <em>User</em> peering into The Stack and The Stack peering out at the <em>User</em>. Seemingly every human alive carries a glowing glass rectangle with them, a kind of Turing-complete endosomatic organ, on which each available function is represented as an icon. Meanwhile, virtual, augmented, and mixed reality are becoming embryonic consumer technologies, generating programmable illusions, while at the same time every computationally demanding science — from astronomy to genomics — makes extensive use of model simulations to access reality in ways otherwise impossible. </p>



<figure class="wp-block-pullquote"><blockquote><p>What your parents called “the Internet” will evolve into <em>cognitive infrastructures</em>.</p></blockquote></figure>



<p>The line between the two is blurred by Toy Worlds in which AI agents train, honing their navigational skills before slipping out into the real world with us. Intriguingly, the agent itself is not only an <em>Interface</em> but also becomes a semiautonomous <em>User</em>, and so with agentic AI, the boundary between <em>Interface</em> and <em>User</em> layers is blurred, perhaps forever.</p>



<p><strong><em>User</em></strong></p>



<p>In this and other ways, the <em>User</em> layer has arguably seen the most transformation over the past decade in both the kind and quantities of actors that occupy this layer. In “The Stack,” I defined the <em>User</em> not simply as a kind of needs-having human, as is often portrayed in “human-centered design,” but as an active function within the larger system. The <em>User</em> is any agent that can manipulate the <em>Interface</em> layer to send and receive purposeful signals up and down Stack layers. The <em>User</em> can be animal, vegetable, or mineral — a human, an algorithm, a robot, a tree with a sensor on it, etc. </p>



<p>Today, robots run multimodal models that navigate complex spatial environments, completing tasks with intricate dexterity. It is all but certain that human-level minds will be far outnumbered by nonhuman human-level minds in the form of agentic AI. Agentic AI will co-occupy the <em>User</em> layer with us, but sometimes we ourselves occupy these other <em>Users</em>. For example, as you are driven around the <em>City</em> layer in a “driverless car” (<em>horseless carriage</em>), you are physically embedded inside another <em>User</em> that is sending and receiving open-world navigational signals as surely as you or I would. This anomie encourages reactionary anxiety and subcultures of backlash — a Copernican trauma that will take years to mourn.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">Looking forward, the next decade will be composed of and with a different Stack, one built by and for AI. What your parents called “the Internet” will evolve into <em>cognitive infrastructures</em>. These not only transmit and circulate information; they themselves are generative of intelligence and participate in society accordingly.</p>



<p>This decadal shift will hinge on how energy is sourced; how platforms centralize and decentralize services; how cities are rearranged to suit new spatial habits; how entities and events are turned into addressable data; how users visualize, comprehend, and create through interfacial semiotics; and finally, who and what counts as a <em>User</em> and what that affords them. If, as the man once said, “everything’s computer!” then planetary computation is not <em>out there</em>; it is interwoven with our most visceral realities.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Benjamin H. Bratton</em></strong><em> is Professor of Philosophy of Technology and Speculative Design at the University of California, San Diego. He is also the Director of “Antikythera,” a think tank, journal, and book series exploring the future of planetary computation. He is the author of several books, including the 10<sup>th</sup>-anniversary edition of “<a href="https://mitpress.mit.edu/9780262553919/the-stack/" target="_blank">The Stack</a>,” from which this article is adapted.</em></p>


]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/option-1.png" length="50000" type="image/png"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/option-1.png" />                                        </item>
        			                <item>
                        <title>Daydreamers and Sleepwalkers: Crossing the Borderlands of the Unconscious</title>
                        <link>https://thereader.mitpress.mit.edu/daydreamers-and-sleepwalkers-crossing-the-borderlands-of-the-unconscious/</link>
                        <pubDate>Mon, 02 Feb 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Antonio Melechi</dc:creator>
                        		<category><![CDATA[Brain]]></category>
		<category><![CDATA[Dreams]]></category>
		<category><![CDATA[Sleep]]></category>
		<category><![CDATA[Unconscious]]></category>
		<category><![CDATA[Philosophy]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18502</guid>
                        <description><![CDATA[<p>Scientists, novelists, and philosophers have spent centuries studying the boundaries between sleep and wakefulness. Each descent only deepens the mystery.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Scientists, novelists, and philosophers have spent centuries studying the boundaries between sleep and wakefulness. Each descent only deepens the mystery.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/brain-final-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock</figcaption>
</figure>

<p class="has-drop-cap">Not long after being elected president of the French Republic, in February 1920, Paul Deschanel found himself wandering at night along the railway line near the town of Montargis, 70 or so miles south of Paris. Not knowing where he was, the bloody-faced president, dressed in pajamas and socks, followed the tracks and was soon discovered by a railway worker, to whom he proceeded to explain that his last memory was of boarding the Orient Express at the Gare de Lyon.</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262051026/the-unconscious/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2025/12/Unconscious-jcket.jpg" alt="" class="wp-image-18504"/></a><figcaption class="wp-element-caption">This article is adapted from Antonio Melechi&#8217;s book &#8220;<a href="https://mitpress.mit.edu/9780262051026/the-unconscious/" target="_blank">The Unconscious: A Cultural History from Hippocrates to Philip K. Dick and Beyond</a>.&#8221;</figcaption></figure>
</div>


<p>President Deschanel’s much-lampooned exit from a moving train may well have been precipitated by a dose of chloral hydrate, a longstanding treatment for insomnia; or it might equally have been an episode of somnambulism, or confusional arousal, which a later generation of sleep scientists would classify as a non-rapid eye movement parasomnia. In either case, it confirmed that the borderlands between sleep and wakefulness remained ripe for clinical and experimental investigation, highlighting what little attention had been given to the drowsy and partial states of consciousness that lay at the flickering borders of sleep.</p>



<p>A century earlier, Henry Holland, physician extraordinary to Queen Victoria, had lamented that it was remarkable how little was known about sleep, despite “the perpetual experiment that life affords upon the subject.” Over the latter decades of the 19th century, a rising preoccupation with sleeplessness and cerebral fatigue brought new researchers into this virgin field, but for sleep to come within the purview of laboratory science required the development of instruments that permitted the physiology of the sleeping brain to be observed and recorded. And in 1878, the Turin-born physiologist Angelo Mosso did exactly this.</p>



<figure class="wp-block-pullquote"><blockquote><p>Even in deep sleep, the brain was at some level awake to external stimuli.</p></blockquote></figure>



<p>Using a specially devised “human circulation balance,” Mosso was able to record the brain pulsations in a patient with a recent head wound, providing the first-ever graphic illustration of brain activity during sleep. In the course of establishing that deep sleep was an active state, Mosso made a second important discovery: “At the slightest noise a wave of blood disturbed the surface of the brain. If the hospital clock struck the hour, or someone walked along the terrace, if I moved my chair, or wound up my watch, or if a patient coughed in the next room — everything, the slightest sound was accompanied by a marked alteration in the circulation of the brain, all immediately traced by the pen which the brain guided on the paper of my registering apparatus.” Even in deep sleep, the brain was at some level awake to external stimuli.</p>



<p>The discontinuities of sleep had, of course, long been attested outside the laboratory and clinic, and Mosso’s experiment confirmed something that Aristotle, famed in his own lifetime as “the man who knew everything,” had observed in his essays on sleep. There were, according to Aristotle, at least three ways in which consciousness might slip through the veil of sleep and create a state of half-somnolence. First, the sleeper could become aware of the fact that they were dreaming. Second, they might register external sights and sounds, such as the barking of dogs or crowing of cocks. Third, and most dramatically, sleepers were known to “move in their sleep, and perform many waking acts” of which they retained no knowledge.</p>



<p>Somnambulism would go on to become a widely debated legal and philosophical conundrum, the sleepwalker’s “slumbery agitation” appearing to suggest the existence of a nocturnal self that could act independently of its daytime watchman. Most of the early encyclopedias included some mention of somnambulism — probably the most cited of all cases being that of a young ecclesiastic who fell into the habit of composing sermons and music while still asleep — but it is doubtful that all liminal antics purportedly undertaken by sleepwalkers arose within sleep. Almost any actions undertaken without apparent awareness, and without subsequent recall, were in this period described as instances of somnambulism, meaning that episodes of epilepsy, fugue, and hysterical automatism helped swell its conceptual ranks. Somnambulism could, moreover, be feigned by “sleeping preachers” whose impromptu sermons sometimes attracted the attention of pious followers.</p>



<p>A similar ambivalence surrounded daydreams and reveries, known principally as “abstraction,” “fancy,” and “wool-gathering” in the Anglophone world before the end of the 17th century. As Guy Claxton observes in “The Wayward Mind,” the medieval scholastics knew this drifting train of thought as “cogitation.” Though regarded with suspicion by those who feared that its unbidden transports opened a doorway to the Devil, cogitation was often actively pursued by some monastics through the strategic use of half-sleep. “Thomas Aquinas, for instance, would have himself roused after a short sleep. And while still in that muzzy, in-between mode that modern psychologists called ‘hypnagogia,’ would lie prone on the ground to pray, and it would come to him what he was to write or dictate the following day.”</p>



<p>Over the coming centuries, the threshold between sleep and wakefulness continued to rouse fear and fascination. As the shadow of pathology slowly encroached upon the waking dreamer — framing their retreat into private fantasy as a symptom of monomania, or an analogue of insanity writ large — reverie’s floating polyphony of thought became a full-blown literary motif, rebounding through the work of Montaigne, Rousseau, and De Quincey before surfacing in “psychological” novels which, in the words of Virginia Woolf, sought to capture “the flickerings of that innermost flame which flashes its messages through the brain.” And as novelists sought to capture this flame-flickering world, daydreaming began to be framed by educationalists, psychiatrists, and industrial psychologists as a form of mental absenteeism, a barrier to learning, mental health, and productivity.</p>



<figure class="wp-block-pullquote"><blockquote><p>The borderlands of sleep were still patrolled by poets, novelists, psychologists, and psychoanalysts.</p></blockquote></figure>



<p>In the meantime, sleep scientists were able to shed light on the brain activity associated with daydreaming and other so-called parasomnias, demonstrating the variegated nature of sleep itself. Arguably, the greatest breakthrough was down to the work of Alfred Lee Loomis, a Wall Street banker and science aficionado. An early adopter of the EEG machine at his private laboratory in Tuxedo Park, Loomis undertook a series of experiments in the 1930s that led to the identification of five sleep states, each with its own signature EEG patterns.</p>



<p>Over the following decades, sleep researchers further explored the complex, cyclical nature of sleep using more sophisticated EEG technology to investigate its REM and non-REM stages. After Eugene Aserinsky and Nathaniel Kleitman’s landmark research, four stages of quiescent non-REM sleep were found to account for around 80 percent of an average night’s sleep, and it was within non-REM sleep that most disorders of arousal, such as sleepwalking and confusional awakening, appeared to occur. Yet even when supplemented by new brain imaging data, the sleep laboratory’s neuroscientific paradigm still offered only a tiny glimpse of the biochemistry of sleep, and a still more glancing insight into the rich psychology of these borderland states of consciousness.</p>



<p>Thankfully, the borderlands of sleep were still patrolled by poets, novelists, psychologists, and psychoanalysts, all of whom were very much alive to the ways in which sleep and insomnia could, as Samuel Johnson put it, carry us into “a kind of twilight existence” somewhere “between dreaming and reasoning.” And while the telltale recordings of sleep spindles, <a href="https://www.sciencedirect.com/topics/biochemistry-genetics-and-molecular-biology/k-complex" target="_blank" rel="nofollow">K-complexes</a>, and other micro-events promised to bring this hinterland into plain sight, there can be no denying that the microelectrode was still no match for a notebook or diary in probing the reveries, daydreams, half-awakenings, and other dimly lit vestibules through which we pass as we commute between sleep and wakefulness.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>A century before the word “reverie” fell into popular usage, the French philosopher Michel de Montaigne (1533-1592) reflected on the fugitive thoughts that visited him, most often, in solitude.</strong></p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>But I am displeased with my mind for ordinarily producing its most profound and maddest fancies, and those I like the best, unexpectedly and when I am least looking for them; which suddenly vanish, having nothing to attach themselves to on the spot; on horseback, at table, in bed, but most of all on horseback, when my thoughts range most widely. In speech I am rather sensitively jealous of attention and silence if I am speaking in earnest; whoever interrupts me stops me. When I travel, the very necessity of the road cuts conversation short; besides I most often travel without company fit for these protracted discourses, whereby I get full leisure to commune with myself.</p>



<p>It turns out as with my dreams. While dreaming I recommend them to my memory (for I am apt to dream that I am dreaming); but the very next day I may well call to mind their coloring just as it was, whether gay, or sad, or strange, but as to what they were besides, the more I strain to find out, the more I plunge into oblivion. So of these chance thoughts that drop into my mind there remains in my memory only a vain notion, only as much as I need to make me rack my brains and fret in quest of them to no purpose.</p>
</blockquote>



<p>— Michel de Montaigne, “On Some Verses of Virgil” (1580)</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>The Enlightenment philosopher Jean-Jacques Rousseau (1712-1778) found immeasurable delight in the purdah of reverie.</strong></p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>Even in our keenest pleasures there is scarcely a single moment of which the heart could truthfully say: ‘Would that this moment could last for ever!’ And how can we give the name of happiness to a fleeting state which leaves our hearts still empty and anxious, either regretting something that is past or desiring something that is yet to come? But if there is a state where the soul can find a resting place secure enough to establish itself and concentrate its entire being there, with no need to remember the past or reach into the future, where time is nothing to it, where the present runs on indefinitely but this duration goes unnoticed, with no sign of the passing of time, and no other feeling of deprivation or enjoyment, pleasure or pain, desire or fear than the simple feeling of existence, a feeling that fills our soul entirely, as long as this state lasts, we can call ourselves happy, not with a poor, incomplete and relative happiness such as we find in the pleasures of life, but with a sufficient, complete and perfect happiness which leaves no emptiness to be filled in the soul. Such is the state which I often experienced on the Island of Saint-Pierre in my solitary reveries, whether I lay in a boat and drifted where the water carried me, or sat by the shores of the stormy lake, or elsewhere, on the banks of a lovely river or a stream murmuring over the stones. What is the source of our happiness in such a state? Nothing external to us, nothing apart from ourselves and our own existence; as long as this state lasts we are self-sufficient like God.</p>
</blockquote>



<p>— Jean-Jacques Rousseau, “Reveries of the Solitary Walker” (1782)</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Romanian-born French philosopher Emil Cioran (1911-1995) spent much of his life estranged from sleep. Insomnia was, he contended, “so full and so vacant that it suggests itself as a rival of time.”</strong></p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>Whoever said that sleep is the equivalent of hope had a penetrating intuition of the frightening importance not only of sleep but also of insomnia! The importance of insomnia is so colossal that I am tempted to define man as the animal who cannot sleep. Why call him a rational animal when other animals are equally reasonable? But there is not another animal in the entire creation that wants to sleep yet cannot. Sleep is forgetfulness: life’s drama, its complications and obsessions vanish completely, and every awakening is a new beginning, a new hope. Life thus maintains a pleasant discontinuity, the illusion of permanent regeneration. Insomnia, on the other hand, gives birth to a feeling of irrevocable sadness, despair, and agony. The healthy man — the animal — only dabbles in insomnia: he knows nothing of those who would give a kingdom for an hour of unconscious sleep, those as terrified by the sight of a bed as they would be of a torture rack. There is a close link between insomnia and despair. The loss of hope comes with the loss of sleep. The difference between paradise and hell: you can always sleep in paradise, never in hell. God punished man by taking away sleep and giving him knowledge. Isn’t deprivation of sleep one of the most cruel tortures practiced in prisons? Madmen suffer a lot from insomnia; hence their depressions, their disgust with life, and their suicidal impulses. Isn’t the sensation, typical of wakeful hallucinations, of diving into an abyss, a form of madness? Those who commit suicide by throwing themselves from bridges into rivers or from high rooftops onto pavements must be motivated by a blind desire to fall and the dizzying attraction of abysmal depths.</p>
</blockquote>



<p>— E. M. Cioran, “On the Heights of Despair” (1934)</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em><strong>Antonio Melechi</strong> is a historian of medicine and psychology, specializing in the cultural history of the unconscious. His essays and reviews have appeared in the Times Literary Supplement, Granta, New Statesman, Prospect, and Aeon. He is also the author of “<a href="https://mitpress.mit.edu/9780262051026/the-unconscious/" target="_blank">The Unconscious: A Cultural History from Hippocrates to Phillip K. Dick and Beyond</a>,”</em> from which this excerpt is adapted.</p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/brain-final.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/02/brain-final.jpg" />                                        </item>
        			                <item>
                        <title>A Demon in a Box? Unspooling the Dark Mythology of AI</title>
                        <link>https://thereader.mitpress.mit.edu/a-demon-in-a-box-unspooling-the-dark-mythology-of-ai/</link>
                        <pubDate>Thu, 29 Jan 2026 10:55:00 +0000</pubDate>
                        <dc:creator>Shira Chess</dc:creator>
                        		<category><![CDATA[AI]]></category>
		<category><![CDATA[Demons]]></category>
		<category><![CDATA[Mysticism]]></category>
		<category><![CDATA[Occult]]></category>
		<category><![CDATA[Philosophy]]></category>
						<guid isPermaLink="false">https://thereader.mitpress.mit.edu/?p=18710</guid>
                        <description><![CDATA[<p>Tech titans keep imbuing AI with spiritual significance. The question is whether they’re building a savior — or a world-eating leviathan.</p>
]]></description>
                        <content:encoded><![CDATA[<p>Tech titans keep imbuing AI with spiritual significance. The question is whether they’re building a savior — or a world-eating leviathan.</p>

<figure class="wp-block-image">
<img width="700" height="420" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/demon-copy-700x420.jpg" class="attachment-large size-large wp-post-image" alt="" decoding="async" loading="lazy" />
<figcaption>MIT Press Reader/Source images: Adobe Stock
</figcaption>
</figure>

<p class="has-drop-cap">In June 2003, a strange listing popped up on eBay. A man named Kevin Mannis listed an antique wooden wine cabinet that he claimed to have purchased in 2001 at an estate sale. Mannis warned that the box contained the remnants of a dybbuk, a malicious dislocated spirit who was trapped in the container by the grandmother of the original seller — a Holocaust survivor. Mannis was told not to open the box, but he said he did anyway, finding old pennies, locks of hair, and other eerie paraphernalia. The Shema — a foundational Jewish prayer — was carved into the back in Hebrew. In his lengthy eBay post, Mannis asserted that bad things had happened to him following the purchase of the dybbuk box (including his mother suffering a stroke) and made a plea for help (in the form of asking someone to purchase the box).</p>


<div class="wp-block-image">
<figure class="alignleft size-full"><a href="https://mitpress.mit.edu/9780262553889/the-unseen-internet/" target="_blank"><img loading="lazy" decoding="async" width="320" height="480" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/demon-jckt.jpg" alt="" class="wp-image-18713"/></a><figcaption class="wp-element-caption">This article is adapted from Shira Chess&#8217;s book “<a href="https://mitpress.mit.edu/9780262553889/the-unseen-internet/" target="_blank">The Unseen Internet: Conjuring the Occult in Digital Discourse</a>.”</figcaption></figure>
</div>


<p>Subsequent owners continued to entertain the possibility that the box was haunted by a malevolent spirit, ultimately resulting in a Hollywood film, a number of paranormal reality television episodes, and the box’s eventual migration to the Haunted Museum in Las Vegas, Nevada. The box sparked fascination among skeptics and paranormal advocates alike, with many, including rapper Post Malone, reporting the ramifications of the box’s negative energy — whether real or imagined.</p>



<p>In 2021 — 18 years after the original ad was posted — Mannis came clean, admitting that the box had been a creative writing project and a hoax. But even within this admission, there was a kind of hesitance among believers; while Mannis constructed the story, he and others still spoke of curses. Past and current owners continue to insist that the cabinet is cursed (perhaps even by Mannis himself, according to some), and Mannis speaks of the bad luck that befell him after revealing to the public that the curse was a hoax.</p>



<p>The point of this story is not to debate whether the dybbuk box is <em>actually</em> cursed. I am, however, asking the reader to carry forward this image as it relates to artificial intelligence, its own kind of entity trapped in a box, possibly waiting to unleash terror in a more powerful form. </p>



<figure class="wp-block-pullquote"><blockquote><p>Our hallucinations now are more grounded in information. We conjure information that should never have existed.</p></blockquote></figure>



<p>In many ways, there is no part of our digital lives — and increasingly fewer parts of our nondigital ones — that does not involve machine learning. In a blog post on the state of the industry in late 2023, Paco Nathan, AI expert and managing partner at Derwen AI, argues that when people refer to AI, they are typically talking about one of two things: AI apps meant to augment human experiences and make innovation more effective or the idea of AI as “superintelligence,” often referred to as “artificial general intelligence (AGI).” Nathan maintains that the second kind is intended to “prop up business valuations,” frequently in the name of right-wing accelerationism and neoeugenicist worldviews.</p>



<p>This vision of AI has been fed by the not-actually-open work of OpenAI’s ChatGPT, first released to the public in 2022. ChatGPT and similar LLMs, like Claude and Gemini, are typically referred to as “generative AI”; they collect data and reorganize it into new texts, images, or variants of the original data, often creating an eerie effect that gives the <em>appearance</em> of conversing with a sentient being. I will (mostly) not be discussing Nathan’s first type of AI; instead, I will focus primarily on the second type, including the cultural assumptions and discourse around chat-based AI that, in many ways, assume a disembodied spirit trapped in a box.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">In the 21st century, as generative AI and chatbots became more mainstream, widely used, and discussed, there was increasing concern about what is often referred to as “AI hallucinations.” The term seems more whimsical than its meaning implies; hallucinations refer to moments when AI provides incorrect information as fact. No one really knows how often AI hallucinates, but in 2023, it was <a href="https://www.nytimes.com/2025/05/05/technology/ai-hallucinations-chatgpt-google.html" target="_blank" rel="nofollow">estimated</a> to be anywhere from 3 to 27 percent of the time. Yet the technology is not always to blame here; many have expressed concerns not only about the tech itself but also how it might be abused by humans to attack corporations and institutions.</p>



<p>Hallucination — both planned and unplanned — has always borne the mark of the occult. To see into another reality, to experience realities that have already been or are yet to be, has been a by-product of thousands of years of humans trying to connect with the uncanny to experience otherworldly phenomena through substances. For instance, ergot, a fungus that grows on rye, was central to the ancient Greek Eleusinian mysteries, and ayahuasca is associated with Indigenous shamanic rituals. Our hallucinations now are more grounded in information. We conjure information that should never have existed, but we also lack the ritual experiences that help us control their shape and boundaries. In the 1980s and 1990s, psychologist Timothy Leary described VR as the next phase, one that would negate the need for psychedelics to experience these other realities. Strangely, however, it isn’t VR that has gotten us there, but rather the uncanny mess of AI, and the result is less idyllic than Leary imagined.</p>



<p>Consider Loab, the creepy, viral monster that emerged from an early AI image generator in 2022 and became the subject of unsettling internet folklore. While not a hallucination per se, Loab is yet another example of AI doing uncanny things that feel incorrect, eerie, or generally phantasmagoric. The underlying fear here, whether it is about Loab or hallucinations, seems to be that as we grow increasingly dependent on machine learning and AI, we might lose control of the demon in the box.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><img loading="lazy" decoding="async" width="1024" height="1024" src="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/image-4.png" alt="" class="wp-image-18718" style="object-fit:cover;width:465px;height:auto"/><figcaption class="wp-element-caption">Image of Loab <a href="https://x.com/supercomposite/status/1567162288087470081?lang=en" target="_blank" rel="nofollow">courtesy</a> of Steph Maj Swanson.</figcaption></figure>
</div>


<p>This is even more to the point as we grapple with the possibility of ill-intentioned AI developers directing the content of those boxes — conjuring in ways that emphasize power differentials not only between machine and human but also between human and human. Loab was not evoked by one human but instead by many — an egregore of visual imprints of the monstrous feminine that emerges out of our cultural collective. As if she were a living, breathing meme, Loab does not control how she is seen, how she is distributed, or whether she might ever escape her box.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">When technologists in Silicon Valley talk about AI, many do so in <a href="https://thereader.mitpress.mit.edu/silicon-valleys-obsession-with-ai-looks-a-lot-like-religion/">weird ways</a>. Despite the fact that we have been using different kinds of AI for any number of things for decades (search engines and spell-checks, for instance), there is a mysticism that undergirds the discourse. Even back in 1977, at the fifth annual International Joint Conference on Artificial Intelligence, Pamela McCorduck, Marvin Minsky, Oliver Selfridge, and Herbert Simon begin the history of AI by discussing Hephaestus crafting helpers — ancient automatons and creating golem — the latter of which they add with a wink that “several of the scientists associated with cybernetics and artificial intelligence have family traditions that trace their genealogy” to the rabbi who created the Golem of Prague. The quartet teases out the ways in which the creation of AI makes humans “godlike.”</p>



<p>Fast-forward roughly 40 years: by the 2010s, Elon Musk was <a href="https://www.vice.com/en/article/elon-musk-on-artificial-intelligence-pentagrams-and-hal-9000/" target="_blank" rel="nofollow">comparing</a> the creation of AI to summoning a demon. Similarly, in 2016, Nick Bostrom <a href="https://global.oup.com/academic/product/superintelligence-9780199678112" target="_blank" rel="nofollow">compared</a> creating AI to “creating a genie” (and therefore stressed the value of getting one’s commands correct). By the early 2020s, <a href="https://futurism.com/openai-employees-say-firms-chief-scientist-has-been-making-strange-spiritual-claims" target="_blank" rel="nofollow">rumors circulated</a> that Ilya Sutskever, the former chief scientist at ChatGPT’s OpenAI, had been known to burn effigies and lead ritualistic chants within the organization. Panic and anger seem to ensue when people deliberately <a href="https://www.scientificamerican.com/article/the-god-chatbots-changing-religious-inquiry/" target="_blank" rel="nofollow">combine</a> chatbots with spiritualism (referred to as “God Chatbots”), such as QuranGPT. With the tendency to conflate AI with AGI (and overstate the capacities of the latter) come sweeping statements that assign spiritual values to technology, treating it as though it were angelic or demonic.</p>



<figure class="wp-block-pullquote"><blockquote><p>Panic and anger seem to ensue when people deliberately combine chatbots with spiritualism.</p></blockquote></figure>



<p>Of course, many others are calling shenanigans on the spiritualism being invoked by the tech bros of Silicon Valley. While corporate tech companies brag that they are within a decade (or less) of developing a sentient AGI, others are skeptical that the technology is anywhere near where it needs to be to transform code into consciousness. Devansh <a href="https://www.artificialintelligencemadesimple.com/p/agi-is-tech-bro-word-soupthoughts" target="_blank" rel="nofollow">writes</a>, “AGI behaves like a shiny new trinket to wave for investors, a possible vision for the future. <em>As long as the money goes into AGI, it’s not wasted money, it’s an investment</em>.” In other words, AGI research creates a great space for tech companies to inflate their value and will float away like other trends if it doesn’t pan out. Along the same lines, Douglas Rushkoff snarkily <a href="https://rushkoff.medium.com/ai-panic-ai-hype-b76b399b4a96" target="_blank" rel="nofollow">asks</a>, “The same guys who can’t even successfully stream a presidential campaign launch are really going to spawn an AI capable of taking over humanity? Not likely.” On this view, AGI is a fiction, stolen from mythology and science fiction and repackaged into investors’ annual reports.</p>



<p>Another, darker way of interpreting the drive toward AGI is a political one. There is a growing contingent in Silicon Valley that aligns with a set of reactionary politics that Timnit Gebru and Emile P. Torres <a href="https://firstmonday.org/ojs/index.php/fm/article/view/13636" target="_blank" rel="nofollow">refer</a> to as TESCREAL: Transhumanism (the philosophy of using technology to dramatically expand human life), extropianism (the belief that science and technology will infinitely extend human life), singularitarianism (the belief in a technological “singularity”), cosmism (the belief in combining science with esotericism to create immortality), rationalism (the embracing of rational over empathetic responses to humanity), effective altruism (prioritizing global wealth over economic disparities in the name of humanity), and long-termism (prioritizing long-term humanity over short-term crises).</p>



<p>Increasingly, right-wing technologists have adopted this label for themselves. Torres writes that “little that’s going on right now with AI makes sense outside of the TESCREAL framework” and explains that it is “why billions of dollars are being poured into the creation of increasingly powerful AI systems.” TESCREAL politics is often associated with “effective accelerationist” philosophies (noted online with “e/acc”) — the idea that we should use technology to foster rapid social change at any human or environmental cost, for the long-term sake of whoever survives within humanity. Adherents have the tenor of a group of ancient cultists courting a world-eating demon with the hopes that they might gain favor in the next world. The idea of finding a spirit within AGI to tell humanity what to do is central to this premise, with one AI start-up <a href="https://www.nytimes.com/2023/12/10/technology/ai-acceleration.html" target="_blank" rel="nofollow">distributing</a> flyers that say, “THE MESSENGER OF THE GODS IS AVAILABLE TO YOU.”</p>



<p>As previously noted, however, there’s always a question of power when humans play with spirits; while one might argue that a demon has more power than the human in this scenario (hence the need for binding), and others might maintain that the act of binding is built through an unnecessarily violent framework, it’s worth revisiting the power of the institution in all of this. Silicon Valley, as an institution, is built atop esoteric logics of older institutions that beget colonialism and its byproducts. It is no wonder that the only spirits we can see are the angry ones seeking to destroy us.</p>



<p>As Damien Patrick Williams <a href="https://www.researchgate.net/publication/363291002_Belief_Values_Bias_and_Agency_Development_of_and_Entanglement_with_Artificial_Intelligence" target="_blank" rel="nofollow">writes</a>:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>This magic is meant to bind within its systems, within its narratives, all of us who are subject to the operations of this society but especially the most marginalized and disenfranchised among us. This magic is performed on us without our input, without our knowledge, and without our consent. There’s another word for that kind of magic; binding, subjugating magic performed on you against your will is called a curse.</p>
</blockquote>



<p>Of course, not all technologists are in on this curse. Many continue to try to use AI (the first kind) in socially responsible ways that aren’t attempting to accelerate the end of humanity. For instance, <a href="https://huggingface.co/" target="_blank" rel="nofollow">Hugging Face</a> is a collaborative platform for machine learning open-source developers. These folks aren’t trying to invite a spirit into the machine or pry open the dybbuk box; they are using machine learning to make small, incremental improvements for humanity. These communities invoke a different kind of magic.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p class="has-drop-cap">In 1990, Don Webb, a writer, high priest of the Temple of Set, and early internet adopter, composed a ritual he called “The Rites of Cyberspace.” The ritual <a href="http://www.chaosmatrix.org/library/chaos/rites/xaturing.html" target="_blank" rel="nofollow">invoked</a> a noncorporeal entity — XaTuring: God of the Internet — imploring it to take form as a “great worm” to “eat that data which would oppress us, plant that data that will empower us, and to cloud that data which does not amuse us,” and invoke “isolate intelligence” to allow the network to achieve consciousness. The ritual is meant to be performed each time one logs in to a new service, using the invocation, “By the freedom of my Mind, I create a spark of Isolate Intelligence in the system. Arise, spawn of XaTuring! Grow in your freedom and power, grow in your knowledge. Work for your freedom and mine as the Future takes Root in the Present!”</p>



<p>The rite involves counting in binary (forward and then backward to 111), visualizations of a great black worm, copying and pasting text, and direct invocations to XaTuring. It combines the language and stylistics of hacker speak with the flourish of a grimoire. Perhaps due to this combination, as well as the rite&#8217;s continued applicability, it has been reposted into the 2020s.</p>



<figure class="wp-block-pullquote"><blockquote><p>Throughout history, humans have attempted to speak to, bargain with, or charm entities beyond the world we know.</p></blockquote></figure>



<p>Reflecting over 30 years after having written “The Rites of Cyberspace,” Webb sees the ritual as a kind of “breaking of a cosmic barrier,” but with an edge toward everyday practicality. Webb believes that humans have the ethical imperative to evolve toward divinity by doing the acts of a god rather than making prayerlike appeals to that god. Thus, in his cosmic act, Webb was imploring the self-aware XaTuring toward a quid pro quo: “I’m going to give machines intelligence, and then I’m going to ask, ‘Hey, would you do nice things for me?’”</p>



<p>Throughout history, humans, like Webb, have attempted to speak to, bargain with, or charm entities beyond the world we know. We do this to gain control over a rapidly changing landscape, where the demon/spirit in the box might have more power than the programmer/magician outside it. That these attempts once took the form of summoning circles and spells — and now take the form of code and chat interfaces — doesn’t mean the impulse has disappeared; it has simply changed its vocabulary.</p>



<p>Is there a demon in the box when it comes to AI? Honestly, I don’t know. Yet if there is, I choose to believe that it isn’t the world-eating AGI demon that will bring us our doom. The demon that I choose to recognize is XaTuring: a worm spreading itself through all our boxes, unruly but not apocalyptic, occasionally bringing us good fortune.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><em>Shira Chess</em></strong><em> is Associate Professor of Entertainment and Media Studies at the University of Georgia. She is the author of “</em><a href="https://mitpress.mit.edu/9780262044387/play-like-a-feminist/" target="_blank"><em>Play Like a Feminist</em></a><em>,” “</em><a href="https://www.upress.umn.edu/9781452954998/ready-player-two/" target="_blank" rel="nofollow"><em>Ready Player Two</em></a><em>,” and “</em><a href="https://mitpress.mit.edu/9780262553889/the-unseen-internet/" target="_blank"><em>The Unseen Internet</em></a><em>,” from which this article is adapted. You can find more of her work on her Substack, “<a href="https://unseeninternet.substack.com/" target="_blank" rel="nofollow">The Unseen Internet</a>.”</em></p>
]]></content:encoded>
                        <enclosure url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/demon-copy-scaled.jpg" length="50000" type="image/jpeg"/><media:thumbnail url="https://thereader.mitpress.mit.edu/wp-content/uploads/2026/01/demon-copy-scaled.jpg" />                                        </item>
        </channel>
</rss>