I remember, anyway, church suppers and outings, and, later, after I left the church, rent and waistline parties where rage and sorrow sat in the darkness and did not stir, and we ate and drank and talked and laughed and danced and forgot all about “the man.” We had the liquor, the chicken, the music, and each other, and had no need to pretend to be what we were not. This is the freedom that one hears in some gospel songs, for example, and in jazz. In all jazz, and especially in the blues, there is something tart and ironic, authoritative and double-edged. White Americans seem to feel that happy songs are happy and sad songs are sad, and that, God help us, is exactly the way most white Americans sing them—sounding, in both cases, so helplessly, defenselessly fatuous that one dare not speculate on the temperature of the deep freeze from which issue their brave and sexless little voices. Only people who have been “down the line,” as the song puts it, know what this music is about. I think it was Big Bill Broonzy who used to sing “I Feel So Good,” a really joyful song about a man who is on his way to the railroad station to meet his girl. She’s coming home. It is the singer’s incredibly moving exuberance that makes one realize how leaden the time must have been while she was gone. There is no guarantee that she will stay this time, either, as the singer clearly knows, and, in fact, she has not yet actually arrived. Tonight, or tomorrow, or within the next five minutes, he may very well be singing “Lonesome in My Bedroom,” or insisting, “Ain’t we, ain’t we, going to make it all right? Well, if we don’t today, we will tomorrow night.” White Americans do not understand the depths out of which such an ironic tenacity comes, but they suspect that the force is sensual, and they are terrified of sensuality and do not any longer understand it. The word “sensual” is not intended to bring to mind quivering dusky maidens or priapic black studs. 
I am referring to something much simpler and much less fanciful. To be sensual, I think, is to respect and rejoice in the force of life, of life itself, and to be present in all that one does, from the effort of loving to the breaking of bread. It will be a great day for America, incidentally, when we begin to eat bread again, instead of the blasphemous and tasteless foam rubber that we have substituted for it. And I am not being frivolous now, either. Something very sinister happens to the people of a country when they begin to distrust their own reactions as deeply as they do here, and become as joyless as they have become. It is this individual uncertainty on the part of white American men and women, this inability to renew themselves at the fountain of their own lives, that makes the discussion, let alone elucidation, of any conundrum—that is, any reality—so supremely difficult. The person who distrusts himself has no touchstone for reality—for this touchstone can be only oneself. Such a person interposes between himself and reality nothing less than a labyrinth of attitudes. And these attitudes, furthermore, though the person is usually unaware of it (is unaware of so much!), are historical and public attitudes.
2019
Elizabeth Hart, a specialist in early literature, writes that in medieval or classical texts, “people are constantly planning, remembering, loving, fearing, but they somehow manage to do this without the author drawing attention to these mental states.” This changed dramatically between 1500 and 1700, when it became common for characters to pause in the middle of the action, launching into monologues as they struggled with conflicting desires, contemplated the motives of others, or lost themselves in fantasy—as is familiar to anyone who’s studied the psychologically rich soliloquies of Shakespeare’s plays. Hart suggests that these innovations were spurred by the advent of print, and with it, an explosion in literacy across classes and genders. People could now read in private and at their own pace, re-reading and thinking about reading, deepening a new set of cognitive skills and an appetite for more complex and ambiguous texts.
The emergence of the novel in the 18th and 19th centuries introduced omniscient narrators who could penetrate their characters’ psyches, at times probing motives that were opaque to the characters themselves. And by the 20th century, many authors labored not just to describe, but to simulate the psychological experience of characters. In her literary manifesto “Modern Fiction,” Virginia Woolf wrote, “Let us record the atoms as they fall upon the mind in the order in which they fall, let us trace the pattern, however disconnected and incoherent in appearance, which each sight or incident scores upon the consciousness.”
Writing a book is like moving into an imaginary house. The author, the sole inhabitant, wanders from room to room, choosing the furnishings, correcting imperfections, adding new wings. Often, this space feels like a sanctuary. But sometimes it is a ramshackle fixer-upper that consumes time rather than cash, or a claustrophobic haunted mansion whose intractable problems nearly drive its creator mad. No one else can truly enter this house until the book is launched into the world, and once the work is completed the author becomes a kind of exile: the experience of living there can only be remembered.
https://www.newyorker.com/magazine/2008/07/07/the-back-of-the-world On Chesterton: his praise of the everyday, his aphorisms, and his unjustified anti-Semitism. “the deeper ones are genuine Catholic koans, pregnant and profound: “Blasphemy depends on belief, and is fading with it. If anyone doubts this, let him sit down seriously and try to think blasphemous thoughts about Thor.” Or: “The function of the imagination is not to make strange things settled, so much as to make settled things strange.” Or: “A key has no logic to its shape. Its logic is: it turns the lock.”
The two central insights of his work are here. First, the quarrel between storytelling, fiction, and reality is misdrawn as a series of illusions that we outgrow, or myths that we deny, when it is a sequence of stories that we inhabit. The second is not that small is beautiful but that the beautiful is always small, that we cannot have a clear picture in white light of abstractions, but only of a row of houses at a certain time of day, and that we go wrong when we extend our loyalties to things much larger than a puppet theatre. (And this, in turn, is fine, because the puppet theatre contains the world.)// Discussing the “mystery” of his Fleet Street success, he wrote, “I have a notion that the real advice I could give to a young journalist, now that I am myself an old journalist, is simply this: to write an article for the Sporting Times and another for the Church Times, and put them into the wrong envelopes.”
https://www.newyorker.com/magazine/2021/03/08/kazuo-ishiguro-uses-artificial-intelligence-to-reveal-the-limits-of-our-own Kazuo Ishiguro: narrated by a robot. When Pascal wrote that “an image of men’s condition” was “a number of men in chains, all condemned to death, some of whom are slaughtered daily within view of the others, so that those who are left see their own condition in that of their fellows, and, regarding one another with sorrow and without hope, wait their turn,” the vision was saved from darkest tragedy by God’s certain presence and salvation. Ishiguro offers no such promise. We learn, late in the book, that Artificial Friends are all subject to what is called a “slow fade,” as their batteries expire. Of course, we, too, are subject to a slow fade; it might be the definition of a life.
Klara wants to save Josie from early death, but she can do this only within her understanding and her means, which is where the novel’s title becomes movingly significant. Because the AFs are solar-powered, they lose energy and vitality without the sun’s rays; so, quite logically, the sun is a life-giving pagan god to them.
The Plot Twist: This literary invention is now so well-known that we often learn to identify it as children. But it thrilled Aristotle when he first discovered it, and for two reasons. First, it supported his hunch that literature’s inventions were constructed from story. And second, it confirmed that literary inventions could have potent psychological effects. Who hasn’t felt a burst of wonder—or as Aristotle called it, thaumazein—when a story pivots unexpectedly?
Hurt delay: Recorded by Aristotle in Poetics, section 1449b, this invention’s blueprint is a plot that discloses to the audience that a character is going to get hurt—prior to the hurt actually arriving. The classic example is Sophocles’ Oedipus Tyrannus, where we learn before Oedipus that he’s about to undergo the horror of discovering that he’s killed his father and married his mother.
The Tale Told From Our Future: This invention was created simultaneously by many different global authors, among them the 13th-century West African griot poet who composed the Epic of Sundiata. Basically, a narrator uses a future-tense voice to address us in our present. As it goes in the Epic: “Listen to my words, you who want to know; by my mouth you will learn the history of Mali. By my mouth you will get to know the story. . . .”
The Secret Discloser: The earliest-known beginnings of this invention—a narrative revelation of an intimate character detail—lie in the ancient lyrics of Sappho and an unknown Shijing poetess. And it exists throughout modern poetry in moments such as this 1952 love song by e. e. cummings: “here is the deepest secret nobody knows / I carry your heart (i carry it in my heart)”. Outside of poetry, variants can be found in the novels of Charlotte Brontë, the memoirs of Maya Angelou, and the many film or television camera close-ups that reveal an emotion buried in a character’s heart.
Serenity elevator: This element of storytelling is a turning around of satire’s tools (including insinuation, parody and irony) so that instead of laughing at someone else, you smile at yourself. It was developed by the Greek sage Socrates in the 5th century B.C. as a means of promoting tranquility—even in the face of excruciating physical pain. And such was its power that Socrates’ student Plato would claim that it allowed Socrates to peacefully endure the terrible agony of swallowing hemlock.
The Empathy Generator: In this narrative technique, a narrator conveys us inside a character’s mind to see the character’s remorse. That remorse can be for a genuine error, like when Jo March regrets accidentally burning her sister Meg’s hair in Louisa May Alcott’s Little Women. Or it can be for an imagined error, like the many times that literary characters rue their physical appearance, personality quirks or other perceived imperfections.
The Almighty Heart: This invention is an anthropomorphic omniscient narrator—or, to be more colloquial, a story told by someone with a human heart and a god’s all-seeing eye. It was first devised by the ancient Greek poet Homer in The Iliad, but you can find it throughout more recent fiction, for example, in the opening sentence of Charles Dickens’s A Tale of Two Cities.
The Anarchy Rhymer: This innovation is the slipperiest of the eight to spot. That’s because it doesn’t follow rules; its blueprint is a rule-breaking element inside a larger formal structure. The larger structure was originally a musical one, as in this 18th-century Mother Goose’s Melody nursery rhyme:
https://www.wsj.com/articles/alien-languages-may-not-be-entirely-alien-to-us-11616817660 On language. As a first step, let us consider why we think that this essay is language but birdsong isn’t. Some birds sing incredibly complex and varied songs. The mockingbird, for instance, combines up to 100 different song types into long sequences that rarely repeat themselves.//Yet despite the complexity of birdsong and whale song, animals don’t seem to have that much to say to each other. “Stay away from my territory,” “Beware of the leopard” and “Come mate with me” sum up most of the messages we expect from animals. They could combine their sounds in almost infinitely varied ways, but they use just the tiniest fraction of these possibilities.
The pseudonym was part of that effort, but Porter also avoided being photographed, rarely gave interviews, and steered clear of situations where someone might pry into his past. He was not a recluse, but he did not like to be the center of attention. People found him affable, unpretentious, and somewhat inscrutable.
As a writer, Porter was identified with New York City, where more than a hundred of his stories are set, but he was born in the Confederacy, in Greensboro, North Carolina, in 1862, and he retained, as you can see in some of his stories, the racial prejudices of a white Southerner of his time.
In New York, he began producing at an astonishing rate. He contracted to write a story a week for the Sunday World, and he continued to write for magazines. In 1904 alone, he published sixty-six stories. He began bringing out collections, notably, in 1906, “The Four Million,” which contains some of his most famous work: “The Gift of the Magi,” “The Cop and the Anthem,” “An Unfinished Story,” and “The Furnished Room.”
The “common man” spirit of the stories may explain their appeal to readers of the popular press in the period during which Porter was writing, a time of mass immigration to cities like New York. It may also account for the fact that he was a favorite writer of both William James, the pragmatist philosopher who hated corporate bigness, and John Reed, the American journalist who joined the Bolshevik Revolution. It surely accounts for his popularity in the Kremlin. O’Connor says that, between 1920 and 1945, 1.4 million copies of the writer’s books were published in the Soviet Union. Even in 1953, the final year of Stalin’s dictatorship, the Soviets printed almost a quarter of a million O. Henry books. The thing that doubtless even Russian readers really enjoyed in an O. Henry story, though, was not the proletarian heroes but the punch line, the twist, the reveal—what became known as the “O. Henry ending.”
Porter distinguished between the story and the plot. He got his stories mainly from people he met—out West, on Broadway and the Bowery, even in prison. But he invented his plots. He took probable situations and gave them improbable outcomes.
The twist, usually a neat pirouette at the very end, annoyed critics like Mencken, who complained about O. Henry’s “variety show smartness.” And there is something gimmicky about the endings. But Porter, although he pretended to regard himself as a hack, was well read, and a self-conscious writer. He understood the literary form he was working in.
Porter was writing in a golden age for the short story, one that starts with Edgar Allan Poe and includes Anton Chekhov, Guy de Maupassant, and Charles Chesnutt. He was a contemporary of two wildly popular story writers, Arthur Conan Doyle (1859-1930) and Rudyard Kipling (1865-1936), and his own work can be classed with the subgenres they worked in: the detective story and the ghost story, both of which are gimmicky, in the sense that they are deliberately crafted to startle and surprise. You know what you’re getting when you read a Sherlock Holmes story.
The near-contemporary whose work most resembles Porter’s is the Scottish writer H. H. Munro (1870-1916), also universally known by a pen name, Saki. Munro’s characters are drawn from the upper classes, and his prose is droll in the British way—wry and epigrammatic. He is a much defter comic writer than Porter. But he also specialized in short stories—some, like the classic “The Open Window,” very short—with surprise endings.
If you think about the experience of reading a short story, you can feel, even in the case of stories by “literary” writers like Chekhov or Hemingway, that the ending is the money note of the form, the high C of the composition. And the pleasure it gives us is, in some way, sensory. It produces a brief thrill, a frisson—sometimes (as with many Kipling stories) a sense of mystery (“What really happened?”), sometimes (as with ghost stories) a little shiver of horror, sometimes (as with detective stories) a satisfying “Aha!”
Edgar Allan Poe, who wrote both detective stories and ghost stories, called this sensation the “effect,” and he thought that producing it was the purpose of all short-form writing, including poetry. “A skillful literary artist has constructed a tale,” he wrote in 1842. “If wise, he has not fashioned his thoughts to accommodate his incidents; but having conceived, with deliberate care, a certain unique or single effect to be wrought out, he then invents such incidents . . . as may best aid him in establishing this preconceived effect.”
Short stories are more like poems than like novels. Novelists put stuff in, because they are trying to represent a world. Story writers, as Poe implied, leave stuff out. They are not trying to represent a world. They are trying to express a single, intangible thing. The story writer begins with an idea about what readers will feel when they finish reading, just as a lyric poet starts with a nonverbal state of mind and then constructs a verbal artifact that evokes it. The endings of modern short stories tend to be oblique, but they, too, are structured for an effect, frequently of pathos.
https://www.newyorker.com/magazine/2021/09/20/reading-dantes-purgatory-while-the-world-hangs-in-the-balance Dante’s conception of Purgatory is remarkably like a wilderness boot camp. Its terrain is forbidding—more like an alp than like a Tuscan hillside. Each of the rugged terraces is a setting for group therapy, where supernatural counsellors dispense tough love. Their charges are sinners, yet not incorrigibles: they all embraced Jesus as their savior. But, before dying, they harmed others and themselves, so their spirits need reëducation. They will graduate to the Earthly Paradise, and eventually to Heaven, after however much time it takes them to transcend their mortal failings by owning them.
For many students of Dante, Purgatory is the Divine Comedy’s central canticle poetically, philosophically, and psychologically. It is, as one of its best translators, the poet W. S. Merwin, noted, the only one that “happens on the earth, as our lives do. . . .”
By 1295, Dante had finished “Vita Nuova,” a stylized autobiography. Its author is a self-absorbed youth with the leisure to moon after an aloof woman. He knows he’s a genius and can’t help showing off. Passages of prose alternate with sonnets and canzoni on the theme of love, but the author doesn’t trust us to understand them. His didactic self-commentary has been hailed as the birth of metatextuality, though it also seems to mark the advent of mansplaining.
Beatrice: That night, he dreams of her asleep, “naked except for a crimson cloth,” in the arms of a “lordly man.” The man wakes her, holding a blazing heart—Dante’s—and compels her to eat it, which she does “unsurely.”
In 1301, the White Guelfs sent Dante to Rome on a mission to secure the Pope’s support for their cause. But while he was away from Florence the Black Guelfs seized power. They banished Dante in absentia and confiscated his property; he would burn at the stake should he ever return. He never did, even in 1315, when the city offered to commute his sentence if he repented publicly. Exile was preferable to abasement for a man of his temperament, which was reported to be vain and contentious. After leaving Purgatory’s terrace of pride, he worries that he’ll be remanded there after death.
Dante spent the last nineteen years of his working life as an itinerant diplomat and secretary for the lords of northern Italy. The poem that he called, simply, the “Comedy” (a Venetian edition of 1555 added the adjective “Divine,” and it stuck) is the work of an embittered asylum seeker. Its profoundest lesson may be that love’s wellspring is forgiveness. Yet Dante never forgave Florence.
The Comedy is both an epic road trip indebted to Homer and a medieval pilgrimage, though it is also a landmark in Western literature: one of its first masterpieces in a Romance vernacular.
They are even being published by the same university press, Princeton. Montás’s is called “Rescuing Socrates: How the Great Books Changed My Life and Why They Matter for a New Generation”; Weinstein’s is “The Lives of Literature: Reading, Teaching, Knowing.” / Both men teach what are called—unfortunately but inescapably—“great books” courses./ As they see it, they are doing God’s work. Their humanities colleagues are careerists who have lost sight of what education is about, and their institutions are in service to Mammon and Big Tech./It will probably not improve their spirits to point out that professors have been making the same complaints ever since the American research university came into being, in the late nineteenth century. / The idea of the great books emerged at the same time as the modern university. It was promoted by works like Noah Porter’s “Books and Reading: Or What Books Shall I Read and How Shall I Read Them?” (1877) and projects like Charles William Eliot’s fifty-volume Harvard Classics (1909-10). (Porter was president of Yale; Eliot was president of Harvard.) British counterparts included Sir John Lubbock’s “One Hundred Best Books” (1895) and Frederic Farrar’s “Great Books” (1898). None of these was intended for students or scholars. They were for adults who wanted to know what to read for edification and enlightenment, or who wanted to acquire some cultural capital./ In a great-books course of the kind that Montás and Weinstein teach, undergraduates read primary texts, then meet in a classroom to share their responses with their peers. Discussion is led by an instructor, but the instructor’s job is not to give the students a more informed understanding of the texts, or to train them in methods of interpretation, which is what would happen in a typical literature- or philosophy-department course. The instructor’s job is to help the students relate the texts to their own lives. 
/ Why should an English professor who got his degree with a dissertation on the American Transcendentalists (as Montás did), and who doesn’t read Italian or know anything about medieval Christianity, teach Dante (in a week!), when you have a whole department of Italian-literature scholars on your faculty? What qualifies a man like Arnold Weinstein, who has spent his entire adult life in the literature departments of Ivy League universities, to guide eighteen-year-olds in ruminations on the state of their souls and the nature of the good life? / Many students who take a great-books-type course enjoy encountering famous texts and seeing that the questions they raise are often relevant to their other coursework. And some students experience a kind of intellectual awakening, which can be inspiring and even transformational. For students who are motivated—and motivation is half of learning—these courses really work. They are happy to read Dante in translation and without a scholarly apparatus, because they want to get a sense of what Dante is all about, and they know that if they don’t get it in college they are unlikely to get it anywhere else. / The quarrel between generalist and specialist—or, as it is sometimes framed down in the trenches, between dilettante and pedant—is more than a hundred years old and it would seem that this is not a quarrel that one side has to win. Montás and Weinstein, however, think that the conflict is existential, and that the future of the academic humanities is at stake. Are they right? / Between 2012 and 2019, the number of bachelor’s degrees awarded annually in English fell by twenty-six per cent, in philosophy and religious studies by twenty-five per cent, and in foreign languages and literature by twenty-four per cent. In English, according to the Association of Departments of English, which tracked the numbers through 2016, research universities, like Brown and Columbia, took the biggest hits. 
More than half reported a drop in degrees of forty per cent or more in just four years. / What humanists should be teaching, Montás and Weinstein believe, is self-knowledge. To “know thyself” is the proper goal. Art and literature, as Weinstein puts it, “are intended for personal use, not in the self-help sense but as mirrors, as entryways into who we ourselves are or might be.” Montás says, “A teacher in the humanities can give students no greater gift than the revelation of the self as a primary object of lifelong investigation.” You don’t need research to learn this. Research is irrelevant. You just need some great books and a charismatic instructor. / For the advocates of liberal culture a century ago, the false god of literature departments was philology. Today, the false god is “theory.” Montás complains that contemporary theory—he calls it “postmodernism”—subverts the college’s educational mission by calling into question terms like “truth” and “virtue.” A postmodernist, in his definition, is a person who believes that there is no capital-T truth, that “true” is just the compliment those with power pay to their own beliefs. “This unmooring of human reason from the possibility of ultimate truth in effect undermines all of Western metaphysics,” he tells us, “including ethics.” (He blames this all on Friedrich Nietzsche, whom he calls “Satan’s most acute theologian,” which is an amazing thing to say. Nietzsche wanted to free people to embrace life, not to send them to Hell. He didn’t believe in Hell. Or theology.) / And if, as these authors insist, education is about self-knowledge and the nature of the good, what are those things supposed to look like? How do we know them when we get there? What does it mean to be human? What exactly is the good life? / It all sounds a lot like “Trust us. We can’t explain it, but we know what we’re doing.” / In the creation of the modern university, science was the big winner. The big loser was not literature. It was religion.
The university is a secular institution, and scientific research—more broadly, the production of new knowledge—is what it was designed for. All the academic disciplines were organized with this end in view. Philology prevailed in literature departments because philology was scientific. It represented a research agenda that could produce replicable results. Weinstein is not wrong to think that critical theory has played the same role. It does aim to add rigor to literary analysis. / Weinstein won’t even call what students learn in science courses “knowledge.” He calls it “information,” which he thinks has nothing to do with how one ought to live. “Life is more than reason or data,” he tells us, “and literature schools us in a different set of affairs, the affairs of heart and soul that have little truck with information as such.” / “Today, the heirs to Descartes’s project are perhaps most visible in Silicon Valley,” Montás says, “but the ethic that informs his approach is pervasive in the broader culture, including the culture of the university.” / What did Descartes write that set us on the road to Facebook? He wrote that scientific knowledge can lead to medical discoveries that improve health and prolong life. / Montás calls this proposition “Faustian.” He says that it implies that there is “no higher value than the subsistence and satisfaction of the self,” and that this is what college students are being taught today. / Humanists cannot win a war against science. They should not be fighting a war against science. They should be defending their role in the knowledge business, not standing aloof in the name of unspecified and unspecifiable higher things. They need to connect with disciplines outside the humanities, to get out of their silos. / Art and literature have cognitive value. They are records of the ways human beings have made sense of experience. They tell us something about the world. But they are not privileged records. 
A class in social psychology can be as revelatory and inspiring as a class on the novel. The idea that students develop a greater capacity for empathy by reading books in literature classes about people who never existed than they can by taking classes in fields that study actual human behavior does not make a lot of sense. / Knowledge is a tool, not a state of being. Universities are in this world, and education is about empowering people to deal with things as they are. Students at places like Brown and Columbia want to make the world a better place, and they can see, as Descartes saw, that science can provide tools to do this. If some of those students make a lot of money, who cares? / Isn’t it a little arrogant for humanists like the authors of these books to presume that economics professors and life-science professors and computer-science professors don’t care about their students’ personal development? The humanities do not have a monopoly on moral insight. Reading Weinstein and Montás, you might conclude that English professors, having spent their entire lives reading and discussing works of literature, must be the wisest and most humane people on earth. Take my word for it, we are not. We are not better or worse than anyone else. I have read and taught hundreds of books, including most of the books in the Columbia Core. I teach a great-books course now. I like my job, and I think I understand many things that are important to me much better than I did when I was seventeen. But I don’t think I’m a better person.
[Is this learning? Or is it more like therapy?]
ny3-10
[Is it a cheap device for giving a character depth? A history of the idea of trauma]
The Case Against the Trauma Plot
Fiction writers love it. Filmmakers can’t resist it. But does this trope deepen characters, or flatten them into a set of symptoms?
Trauma has become synonymous with backstory; the present must give way to the past, where all mysteries can be solved.
It was on a train journey, from Richmond to Waterloo, that Virginia Woolf encountered the weeping woman. A pinched little thing, with her silent tears, she had no way of knowing that she was about to be enlisted into an argument about the fate of fiction. Woolf summoned her in the 1924 essay “Mr. Bennett and Mrs. Brown,” writing that “all novels begin with an old lady in the corner opposite”—a character who awakens the imagination.
Those details: the sea urchins, that saucer, that slant of personality. To conjure them, Woolf said, a writer draws from her temperament, her time, her country. An English novelist would portray the woman as an eccentric, warty and beribboned. A Russian would turn her into an untethered soul wandering the street, “asking of life some tremendous question.”
Dress this story up or down: on the page and on the screen, one plot—the trauma plot—has arrived to rule them all. Unlike the marriage plot, the trauma plot does not direct our curiosity toward the future (Will they or won’t they?) but back into the past (What happened to her?). “For the eyeing of my scars, there is a charge,” Sylvia Plath wrote in “Lady Lazarus.” “A very large charge.” Now such exposure comes cheap. Frame it within a bad romance between two characters and their discordant baggage. Nest it in an epic of diaspora; reënvision the Western, or the novel of passing. Fill it with ghosts. Tell it in a modernist sensory rush with the punctuation falling away. Set it among nine perfect strangers. In fiction, our protagonist will often go unnamed; on television, the character may be known as Ted Lasso, Wanda Maximoff, Claire Underwood, Fleabag. Classics are retrofitted according to the model. Two modern adaptations of Henry James’s “The Turn of the Screw” add a rape to the governess’s past. In “Anne with an E,” the Netflix reboot of “Anne of Green Gables,” the title character is given a history of violent abuse, which she relives in jittery flashbacks. In Hogarth Press’s novelized updates of Shakespeare’s plays, Jo Nesbø, Howard Jacobson, Jeanette Winterson, and others accessorize Macbeth and company with the requisite devastating backstories.
I hear grumbling. Isn’t it unfair to blame trauma narratives for portraying what trauma does: annihilate the self, freeze the imagination, force stasis and repetition? It’s true that our experiences and our cultural scripts can’t be neatly divided; we will interpret one through the other. And yet survivor narratives and research suggest greater diversity than our script allows. Even as the definition of what constitutes P.T.S.D. has grown more jumbled—“the junk drawer of disconnected symptoms,” David J. Morris calls it in “The Evil Hours: A Biography of Post-Traumatic Stress Disorder” (2015)—the notion of what it entails, the sentence it imposes, appears to have grown narrower and more unyielding. The afterword to a recent manual, “Stories Are What Save Us: A Survivor’s Guide to Writing About Trauma,” advises, “Don’t bother trying to rid yourself of trauma altogether. Forget about happy endings. You will lose. Escaping trauma isn’t unlike trying to swim out of a riptide.”
The prevalence of the trauma plot cannot come as a surprise at a time when the notion of trauma has proved all-engulfing. Its customary clinical incarnation, P.T.S.D., is the fourth most commonly diagnosed psychiatric disorder in America, and one with a vast remit. Defined by the DSM-III, in 1980, as an event “outside the range of usual human experience,” trauma now encompasses “anything the body perceives as too much, too fast, or too soon,” the psychotherapist Resmaa Menakem tells us in “My Grandmother’s Hands: Racialized Trauma and the Pathway to Mending Our Hearts and Bodies” (2017). The expanded definition has allowed many more people to receive care but has also stretched the concept so far that some 636,120 possible symptom combinations can be attributed to P.T.S.D., meaning that 636,120 people could conceivably have a unique set of symptoms and the same diagnosis.
It was not war or sexual violence that brought the idea of traumatic memory to light but the English railways, some six decades before Woolf chugged along from Richmond to Waterloo. In the eighteen-sixties, the physician John Eric Erichsen identified a group of symptoms in some victims of railway accidents—though apparently uninjured, they later reported confusion, hearing voices, and paralysis. He termed it “railway spine.” Sigmund Freud and Pierre Janet went on to argue that the mind itself could be wounded. In the trenches of the Great War, railway spine was reborn as shell shock, incarnated in the figure of the suicidal veteran Septimus Smith, in Woolf’s “Mrs. Dalloway.” What remained unaltered was the scorn that accompanied diagnosis; shell-shocked soldiers were sometimes labelled “moral invalids” and court-martialled. In the decades that followed, the study of trauma slipped into “periods of oblivion,” as the psychiatrist Judith Herman has written. It wasn’t until the Vietnam War that the aftershocks of combat trauma were “rediscovered.” P.T.S.D. was identified, and, with the political organizing of women’s groups, the diagnosis was extended to victims of rape and sexual abuse. In the nineteen-nineties, trauma theory as a cultural field of inquiry—pioneered by the literary critic Cathy Caruth—described an experience that overwhelms the mind, fragments the memory, and elicits repetitive behaviors and hallucinations. In the popular realm, such ideas were given a scientific imprimatur by Bessel van der Kolk’s “The Body Keeps the Score” (2014), which argues that traumatic memories are physiologically distinctive and inscribe themselves on an older, more primal part of the brain.
“If Greeks invented tragedy, the Romans the epistle and the Renaissance the sonnet,” Elie Wiesel wrote, “our generation invented a new literature, that of testimony.” The enshrinement of testimony in all its guises—in memoirs, confessional poetry, survivor narratives, talk shows—elevated trauma from a sign of moral defect to a source of moral authority, even a kind of expertise. In the past couple of decades, a fresh wave of writing about the subject has emerged, with best-selling novels and memoirs of every disposition: the caustic (Edward St. Aubyn’s Patrick Melrose novels), the sentimental (Jonathan Safran Foer’s “Extremely Loud & Incredibly Close”), the enraptured (Leslie Jamison’s essay collection “The Empathy Exams”), the breathtakingly candid (the anonymously written memoir “Incest Diary”), or all of the above (Karl Ove Knausgaard’s six-volume “My Struggle”). Internet writing mills offered a hundred and fifty dollars a confession. “It was 2015, and everyone was a pop-culture critic, writing from the seat of experience,” Larissa Pham recalls in a recent essay collection, “Pop Song.” “The dominant mode by which a young, hungry writer could enter the conversation was by deciding which of her traumas she could monetize . . . be it anorexia, depression, casual racism, or perhaps a sadness like mine, which blended all three.” “The Body Keeps the Score” has remained planted on the Times best-seller list for nearly three years.
To question the role of trauma, we are warned, is to oppress: it is “often nothing but a resistance to movements for social justice,” Melissa Febos writes in her forthcoming book, “Body Work: The Radical Power of Personal Narrative.” Those who look askance at trauma memoirs, she says, are replicating the “classic role of perpetrator: to deny, discredit and dismiss victims in order to avoid being implicated or losing power.” Trauma survivors and researchers who have testified about experiences or presented evidence that clashes with the preferred narrative often find their own stories denied and dismissed. In the nineties, the psychologist Susan A. Clancy conducted a study of adults who had been sexually abused as children. They described the grievous long-term suffering and harm of P.T.S.D., but, to her surprise, many said that the actual incidents of abuse were not themselves traumatic, characterized by force or fear—if only because so many subjects were too young to fully understand what was happening and because the abuse was disguised as affection, as a game. The anguish came later, with the realization of what had occurred. Merely for presenting these findings, Clancy was labelled an ally of pedophilia, a trauma denialist.
In a recent Harper’s essay, the novelist Will Self suggests that the biggest beneficiaries of the trauma model are trauma theorists themselves, who are granted a kind of tenure, entrusted with a lifetime’s work of “witnessing” and interpreting. George A. Bonanno, the director of Columbia’s Loss, Trauma, and Emotion Lab and the author of “The End of Trauma,” has a blunter assessment: “People don’t seem to want to let go of the idea that everybody’s traumatized.”
The experience of uncertainty and partial knowledge is one of the great, unheralded pleasures of fiction. Why does Hedda Gabler haunt us? Who does Jean Brodie think she is? What does Sula Peace want? Sula’s early life is thick with incidents, any one of which could plausibly provide the wound around which personality, as understood by the trauma plot, might scab—witnessing a small boy drown, witnessing her mother burn to death. But she is not their sum; from her first proper appearance in the novel, with an act of sudden, spectacular violence of her own, she has an open destiny.
The trauma plot flattens, distorts, reduces character to symptom, and, in turn, instructs and insists upon its moral authority. The solace of its simplicity comes at no little cost. It disregards what we know and asks that we forget it, too—forget about the pleasures of not knowing, about the unscripted dimensions of suffering, about the odd angularities of personality, and, above all, about the allure and necessity of a well-placed sea urchin.
Stanislaw Lem, and the influence on his work of his experience as a Jew under Nazism
https://www.newyorker.com/magazine/2022/01/17/how-the-chinese-language-got-modernized
The late, great sinologist Simon Leys once pointed out a peculiar paradox. China is the world’s oldest surviving civilization, and yet very little material of its past remains—far less than in Europe or India. Through the centuries, waves of revolutionary iconoclasts have tried to smash everything old; the Red Guards, in the nineteen-sixties, were following an ancient tradition. The Chinese seldom built anything for eternity, anyway, nothing like the cathedrals of Europe. And what survived from the past was often treated with neglect.

To become an official in imperial China, one had to compose precise scholarly essays on Confucian philosophy, an arduous task that very few could complete. Even Chairman Mao, who incited his followers to destroy every vestige of tradition, proudly displayed his prowess as a calligrapher, establishing himself as the bearer of Chinese civilization.

Leys was right about the continuity of the Chinese written word. But zealots, intent on erasing old incarnations of Chinese civilization in order to make way for new ones, have often targeted the written language, too. One of Mao’s models was the first Qin emperor (259-210 B.C.), a much reviled despot who ordered the construction of the Great Wall and was perhaps the first major book burner in history. He wanted to destroy all the Confucian classics, and supposedly buried Confucian scholars alive. Mao’s only criticism of his hated predecessor was that he had not been radical enough. It was under the Qin emperor that the Chinese script was standardized.
So what accounts for the longevity of Chinese civilization? Leys believed it was the written word, the richness of a language employing characters, partly ideographic, that have hardly changed over two thousand years.
Chinese certainly presents unique difficulties. To be literate in the language, a person must be able to read and write at least three thousand characters. To enjoy a serious book, a reader must know several thousand more. Learning to write is a feat of memory and graphic skill: a Chinese character is composed of strokes, to be made in a particular sequence, following the movements of a brush, and quite a few characters involve eighteen or more strokes.
Tsu begins her story in the late nineteenth century, when China was deep in crisis. After bloody uprisings, humiliating defeats in the Opium Wars, and forced concessions—predatory foreign powers were grabbing what spoils they could from a poor, exhausted, divided country—the last imperial dynasty was falling apart. Chinese intellectuals, influenced by then fashionable social-Darwinist ideas, saw China’s crisis in existential terms. Could the Chinese language, with its difficult writing system, survive? Would Chinese civilization itself survive? The two questions were, of course, inextricably linked.
In this cultural panic, many intellectuals were ashamed of the poverty and the illiteracy of the rural population, and of the weakness of a decadent and hidebound imperial élite. They hoped for a complete overhaul of Chinese tradition. Qing-dynasty rule was brought to an end in 1911, but reformers sought to cleanse imperial culture itself. The authority of a tradition based on various schools of Confucian philosophy had to be smashed before China could rise in the modern world. The classical style of the language, elliptical and complex, was practiced by only a small number of highly educated people, for whom it functioned rather like Latin in the Catholic Church, as a pathway to high office. Reformers saw it as an impediment both to mass literacy and to political progress. Before long, classical Chinese was supplanted by a more vernacular prose in official discourse, books, and newspapers. In fact, a more vernacular form of written Chinese, called baihua, had already been introduced, during the Ming dynasty (1368-1644). So there was a precedent for making written Chinese more accessible.

More radical modernizers hoped to do away with characters altogether and replace them with a phonetic script, either in Roman letters or in a character-derived adaptation, as had been the practice for many centuries in Japanese and Korean. A linguist, Qian Xuantong, famously argued that Confucian thought could be abolished only if Chinese characters were eradicated. “And if we wish to get rid of the average person’s childish, naive, and barbaric ways of thinking,” he went on, “the need to abolish characters becomes even greater.” Lu Xun, the most admired Chinese essayist and short-story writer of the twentieth century, offered a blunter prognosis in 1936: “If the Chinese script is not abolished, China will certainly perish!”

Many attempts have been made to transliterate Chinese in the Latin alphabet.
These range from a system invented by two nineteenth-century British diplomats, Thomas Wade and Herbert Giles, to the “Pinyin” system, developed by linguists in the People’s Republic of China, which is different again from various forms of Romanization used in Taiwan.

Difficulties confront all such systems. The time-honored character-based writing system can readily accommodate different modes of pronunciation, even mutually unintelligible dialects. Chinese has a great many homonyms, which transliterations are bound to conflate. And Chinese, unlike Korean or Japanese, is a tonal language; some way of conveying tones is necessary. (Wade-Giles uses superscript numerals; a system developed by the linguist and inventor Lin Yutang uses spelling conventions; Pinyin uses diacritical marks.) The different efforts at Romanization, accordingly, yield very different results. The word for strength, say, is ch’iang2 in Wade-Giles, chyang in Lin’s script, and qiáng in Pinyin.

Most of the people whom Tsu writes about looked to the United States. Many of them studied at American universities in the nineteen-tens, subsidized by money that the United States received from China as an indemnity after the anti-Western Boxer Rebellion was defeated. Zhou Houkun, who invented a Chinese typewriting machine, studied at M.I.T. Hu Shi, a scholar and a diplomat who helped elevate the vernacular into the national language, went to Cornell. Lin Yutang, who devised a Chinese typewriter, studied at Harvard. Wang Jingchun, who smoothed the way for Chinese telegraphy, said, with more ardor than accuracy, “Our government is American; our constitution is American; many of us feel like Americans.”
It’s true that Japan’s industrial, military, and educational reforms since the Meiji Restoration of 1868 were themselves based on Western models, as were Japanese artistic movements such as Impressionism and Surrealism. But these ideas were transmitted to China by Chinese students, revolutionaries, and intellectuals in Japan, and had a direct and lasting impact on written and spoken Chinese. Many scientific and political terms in Chinese—such as “philosophy,” “democracy,” “electricity,” “telephone,” “socialism,” “capitalism,” and “communism”—were coined in Japanese by combining Chinese characters.
In the Soviet Union, the Roman alphabet had been used in order to impose political uniformity on many different peoples, including Muslims who were used to Arabic script. The Soviets supported and subsidized Chinese efforts to follow their example. For the Communists, as Tsu notes, the goal was simple: “If the Chinese could read easily, they could be radicalized and converted to communism with the new script.”
Mao, in the decade that followed, ushered in two linguistic revolutions: Pinyin, the Romanized transcription that became the standard all over China (and now pretty much everywhere else), and so-called simplified Chinese.
The Committee on Script Reform, created in 1952, started by releasing some eight hundred recast characters. More were released, and some were revised, in the ensuing decades. The new characters, made with many fewer strokes, were “true to the egalitarian principles of socialism,” Tsu says. The Communist cadres rejoiced in the fact that “the people’s voices were finally being heard.” Among the beneficiaries were “China’s workers and peasants.” After all, “Mao said that the masses were the true heroes and their opinions must be trusted.”

In 1956, Tao-Tai Hsia, then a professor at Yale, wrote that strengthening Communist propaganda was “the chief motivation” of language reform: “The thought of getting rid of parts of China’s cultural past which the Communists deem undesirable through the language process is ever present in the minds of the Communist cultural workers.”
Zhi Bingyi worked on his ideas about a Chinese computer language in a squalid prison cell during the Cultural Revolution, writing his calculations on a teacup after his guards took away even his toilet paper. Wang Xuan, a pioneer of laser typesetting systems, was so hungry during Mao’s disastrous Great Leap Forward campaign, in 1960, that “his body swelled under the fatigue, but he continued to work relentlessly.” Such anecdotes add welcome color to the technical explanations of phonetic scripts, typewriters, telegraphy, card-catalogue systems, and computers. Sentences like “Finally, through a reverse process of decompression, Wang converted the vector images to bitmaps of dots for digital output” can become wearying.
Today, in the era of standardized word processors and Chinese social-media apps like WeChat, Pinyin and characters are seamlessly connected. Users typically type Pinyin on their keyboards while the screen displays the simplified characters, offering an array of options to resolve homonyms. (Older users may draw the characters on their smartphones.) China will, as Tsu says, “at last have a shot at communicating with the world digitally.” The old struggles over written forms might seem redundant. But the politics of language persists, particularly in the way the government communicates with its citizens.
Demands for radical reform came to a head in 1919, with a student protest in Beijing, first against provisions in the Treaty of Versailles which allowed Japan to take possession of German territories in China, and then against the classical Confucian traditions that were believed to stand in the way of progress. A gamut of political orientations combined in the so-called New Culture movement, ranging from the John Dewey-inspired pragmatism of Hu Shi to early converts to socialism. Where New Culture protesters could agree, as Tsu notes, was on the critical importance of mass literacy.
I still shudder at the memory of reading, as a student in the early nineteen-seventies, Maoist publications in Chinese, with their deadwood language, heavy Soviet sarcasm, and endless sentences that sounded like literal translations from Marxist German—the exact opposite of the compressed poeticism of the classical style. But in Mao’s China mastery of this style was as important as writing Confucian essays had been in imperial times. When, back in the seventies, the official Chinese news agency, Xinhua, urged the government to speed up computer technology, its stated aim was to spread the Communist Party’s doctrines more efficiently.
These days, China’s geopolitical and technological status means that its political “narratives” have become global. China is advancing an alternative model to Western-style democracy. Soft power is being used to change the way China is perceived abroad, and the way business with China is to be conducted. Tsu says that China wants to have the ability to promote its “narrative as the master or universal narrative for the world to abide by.” This sounds ominous. Still, it isn’t always clear from her book whether she is talking about China as a civilization, as the Chinese-speaking peoples, or as the Chinese Communist Party. She writes that “the China story no doubt aims for a triumphant narrative.” But which China story? Does it include Taiwan, where citizens enjoy even more advanced information technology than their counterparts in the People’s Republic? Or is it vaguer than that, an entity that binds all Chinese cultures?
To Xi Jinping, of course, there is no distinction. At a Party meeting in November, something called Xi Jinping Thought was defined as “the essence of Chinese culture and China’s spirit.” The question is whether the Chinese Communist government will succeed in using its soft power to make its “narrative” universally triumphant. It already has its hands full imposing official dogma on its own people. China has enough gifted scientists, artists, writers, and thinkers to have a great influence on the world, but that influence will be limited if they cannot express themselves freely. These days, many written Chinese words cannot appear at all, in printed or digital form. In the aftermath of the Peng Shuai affair, even the word “tennis” has now become suspect in Chinese cyberspace.
In the last sentence of her book, Tsu writes, “Still unfolding, history will overtake China’s story.” I’m not sure what that means. But the story of the Chinese language under Communism is mostly one of repression and distortion, which only heroes and fools have defied. In an account of language, narratives, characters, and codes, the meaning of words still matters the most. Overemphasize the medium, and that message may get lost.
Another commentator numbered Mann among those “literary monoliths who have outlived their proper time.”
In Germany, that verdict did not hold. Circa 1950, Mann was a divisive figure in his homeland, widely criticized for his belief that Nazism had deep roots in the national psyche. Having gone into exile in 1933, he refused to move back, dying in Switzerland in 1955. Over time, his sweeping analysis of German responsibility, from which he did not exclude himself, ceased to be controversial.
It is impossible to talk seriously about the fate of Germany in the twentieth century without reference to Thomas Mann.
In America, however, one can coast through a liberal-arts education without having to deal with Mann. General readers are understandably hesitant to plunge into the Hanseatic decadence of “Buddenbrooks” or the sanatorium symposia of “The Magic Mountain,” never mind the musicological diabolism of “Doctor Faustus” or the Biblical mythography of “Joseph and His Brothers.”
Because I have been almost unhealthily obsessed with Mann’s writing since the age of eighteen, I may be ill-equipped to win over skeptics, but I know why I return to it year after year. Mann is, first, a supremely gifted storyteller, adept at the slow windup and the rapid turn of the screw. He is a solemn trickster who is never altogether earnest about anything, especially his own grand Goethean persona. At the heart of his labyrinth are scenes of emotional chaos, episodes of philosophical delirium, intimations of inhuman coldness. His politics traverse the twentieth-century spectrum, ricochetting from right to left. His sexuality is an exhibitionistic enigma. In life and work alike, his contradictions are pressed together like layers in metamorphic rock.
At first glance, Tóibín’s undertaking seems superfluous, since there are already a number of great novels about Thomas Mann, and they have the advantage of being by Thomas Mann. Few writers of fiction have so relentlessly incorporated their own experiences into their work. Hanno Buddenbrook, the proud, hurt boy who improvises Wagnerian fantasies on the piano; Tonio Kröger, the proud, hurt young writer who sacrifices his life for his art; Prince Klaus Heinrich, the hero of “Royal Highness,” who rigidly performs his duties; Gustav von Aschenbach, the hidebound literary celebrity who loses his mind to a boy on a Venice beach; Mut-em-enet, Potiphar’s wife, who falls desperately in love with the handsome Israelite Joseph; the confidence man Felix Krull, who fools people into thinking he is more impressive than he is; the Faustian composer Adrian Leverkühn, who is compared to “an abyss into which the feelings others expressed for him vanished soundlessly without a trace”—all are avatars of the author, sometimes channelling his letters and diaries. Mann liked to say that he found material rather than invented it—a play on the verbs finden and erfinden.
Mann’s most dizzying self-dramatization can be found in the novel “Lotte in Weimar,” from 1939. It tells of a strained reunion between the aging Goethe and his old love Charlotte Buff, who, decades earlier, had inspired the character of Lotte in “The Sorrows of Young Werther.” Goethe is endowed with Mannian traits, flatteringly and otherwise. He is a man who feeds on the lives of others and appropriates his disciples’ work, stamping all of it with his parasitic genius. Mann, too, left countless literary victims in his wake, including members of his family. One of them is still with us: his grandson Frido, who loved his Opa’s company and then discovered that a fictional version of himself had been killed off in “Doctor Faustus.”
“The Magician,” deft and diligent as it is, ultimately diminishes the imperial strangeness of Mann’s nature. He comes across as a familiar, somewhat pitiable creature—a closeted man who occasionally gives in to his desires. The real Mann never gave in to his desires, but he also never really hid them. Gay themes surfaced in his writing almost from the start, and he made clear that his stories were autobiographical. When, in 1931, he received a newspaper questionnaire asking about his “first love,” he replied, in essence, “Read ‘Tonio Kröger.’ ” Likewise, of “Death in Venice” he wrote, “Nothing is invented.” Gay men saw the author as one of their own. When the composer Ned Rorem was young, he took a front-row seat at a Mann lecture, hoping to distract the eminence on the dais with his hotness. “He never looked,” Rorem reported.
To the end of his life, Mann kept insisting that any attempt to separate the artistic from the political was a catastrophic delusion. His most succinct formulation came in a letter to Hermann Hesse, in 1945: “I believe that nothing living can avoid the political today. The refusal is also politics; one thereby advances the politics of the evil cause.” If artists lose themselves in fantasies of independence, they become the tool of malefactors, who prefer to keep art apart from politics so that the work of oppression can continue undisturbed. So Mann wrote in an afterword to a 1937 book about the Spanish Civil War, adding that the poet who forswears politics is a “spiritually lost man.” The same conviction is inscribed into the later fiction. The primary theme of “Doctor Faustus” is the insanity of the old Romantic ethos.
In speeches of the period, Mann called for “social self-discipline under the ideal of freedom”—a political philosophy that doubles as a personal one. He also said, “Let me tell you the whole truth: if ever Fascism should come to America, it will come in the name of ‘freedom.’ ” He left the United States in 1952, fearing that McCarthyism had made him a marked man once again.
In the years before the First World War, Mann labored to come up with a second masterpiece. He contemplated a novel about Frederick the Great and other weighty schemes. When none of them panned out, he busied himself with seemingly trivial subjects: a story about a charming confidence man; a tale involving tuberculosis patients at a Swiss clinic; a novella based on a beach vacation in Venice. The last, published in 1912, proved to be the breakthrough to Mann’s mature manner. But it took the form of a fabulously intricate self-satire, in which the Frederick the Great novel and other unrealized plans were attributed to an older, sadder version of himself. It was a bonfire of his vanities, a kind of artistic suicide. Mann struggled with suicidal impulses in his early years, and he found cathartic satisfaction in killing off his alter egos.
“Reflections,” in the course of its meanderings, addresses perceived misunderstandings of “Death in Venice.” Readers saw the novella as an exercise in attaining a “master style”; for Mann, it is a parody of his own quest for mastery. “Death in Venice” is secretly a comedy, in a very dark register. The narrator’s grandiloquence overshoots the mark and becomes ludicrous: “What he craved, though, was to work in Tadzio’s presence, to take the boy’s physique as the model for his writing, to let his style follow the contours of this body which seemed to him divine, to carry its beauty into the realm of the intellect, as the eagle once carried the Trojan shepherd into the ether.” The real point of collapse comes when we are assured that the outer world will enjoy Aschenbach’s miraculous prose without knowing its tawdry origins. The boundary between art and life is obliterated as soon as it is drawn.
Mann’s new style is modernism in a high-bourgeois mode, as byzantine in its layering as anything in Joyce. The seventh chapter of “Lotte in Weimar,” in which Goethe delivers an interior monologue, creates an astonishingly dense mosaic of Goethean utterances intermingled with Mann’s own thoughts; at the same time, it is a radical demythologizing of a cultural demigod. (You might not notice from reading Helen Lowe-Porter’s stilted translation, but Goethe wakes up with a hard-on.) “Doctor Faustus” restages the life of Nietzsche, borrows fragments from Mann’s old diaries, and absorbs chunks of the musical philosophy of Arnold Schoenberg and Theodor W. Adorno.
The film in question is, of course, the 1942 Walt Disney classic “Bambi.” Perhaps more than any other movie made for children, it is remembered chiefly for its moments of terror: not only the killing of the hero’s mother but the forest fire that threatens all the main characters with annihilation. Stephen King called “Bambi” the first horror movie he ever saw, and Pauline Kael, the longtime film critic for this magazine, claimed that she had never known children to be as frightened by supposedly scary grownup movies as they were by “Bambi.”
It was adapted from “Bambi: A Life in the Woods,” a 1922 novel by the Austro-Hungarian writer and critic Felix Salten.

Felix Salten was an unlikely figure to write “Bambi,” since he was an ardent hunter who, by his own estimate, shot and killed more than two hundred deer. He was also an unlikely figure to write a parable about Jewish persecution, since, even after the book burnings, he promoted a policy of appeasement toward Nazi Germany. And he was an unlikely figure to write one of the most famous children’s stories of the twentieth century, since he wrote one of its most infamous works of child pornography.

The production that brought Salten the most infamy, however, did not bear his name: “Josefine Mutzenbacher; or, The Story of a Viennese Whore, as Told by Herself.”
If you haven’t seen the Disney version of “Bambi” since you were eight, here is a quick refresher: The title character is born one spring to an unnamed mother and a distant but magnificently antlered father. He befriends an enthusiastic young rabbit, Thumper; a sweet-tempered skunk, Flower; and a female fawn named Faline. After the death of his mother the following spring, he and Faline fall in love, but their relationship is tested by a rival deer, by a pack of hunting dogs, and, finally, by the forest fire. Having triumphed over all three, Bambi sires a pair of fawns; as the film concludes, the hero, like his father before him, is watching over his family from a faraway crag.
That vision is of an Eden marred only by the incursion of humankind. There is no native danger in Bambi’s forest; with the exception of his brief clash with another male deer in mating season, and maybe that hardscrabble winter, the wilderness he inhabits is all natural beauty and interspecies amity. The truly grave threats he faces are always from hunters, who cause both the forest fire and the death of his mother, yet the movie seems less anti-hunting than simply anti-human. The implicit moral is not so much that killing animals is wicked as that people are wicked and wild animals are innocent. Unsurprisingly, “Bambi” has long been unpopular among hunters, one of whom sent a telegram to Walt Disney on the eve of the film’s release to inform him that it is illegal to shoot deer in the spring. Nor is the film a favorite among professional wilderness managers, who now routinely contend with what they call “the Bambi complex”: a dangerous desire to regard nature as benign and wild animals as adorable and tame, coupled with a corresponding resistance to crucial forest-management tools such as culling and controlled burns.
But perhaps the most vociferous if also the smallest group of critics consists of devotees of Salten, who recognize how drastically Disney distorted his source material. Although the animals in the novel do converse and in some cases befriend one another across species, their over-all relations are far from benign. In the course of just two pages, a fox tears apart a widely beloved pheasant, a ferret fatally wounds a squirrel, and a flock of crows attacks the young son of Friend Hare—the gentle, anxious figure who becomes Thumper in the movie—leaving him to die in excruciating pain. Later, Bambi himself nearly batters to death a rival who is begging for mercy, while Faline looks on, laughing. Far from being gratuitous, such scenes are, in the author’s telling, the whole point of the novel. Salten insisted that he wrote “Bambi” to educate naïve readers about nature as it really is: a place where life is always contingent on death, where starvation, competition, and predation are the norm.
—————–
On the contrary, the book is at its best when it revels in rather than pretends to resolve the mystery of existence. At one point, Bambi passes by some midges who are discussing a June bug. “How long will he live?” the young ones ask. “Forever, almost,” their elders answer. “They see the sun thirty or forty times.” Elsewhere, a brief chapter records the final conversation of a pair of oak leaves clinging to a branch at the end of autumn. They gripe about the wind and the cold, mourn their fallen peers, and try to understand what is about to happen to them. “Why must we fall?” one asks. The other doesn’t know, but has questions of its own: “Do we feel anything, do we know anything about ourselves when we’re down there?” The conversation tacks back and forth from the intimate to the existential. The two leaves worry about which of them will fall first; one of them, gone “yellow and ugly,” reassures the other that it has barely changed at all. The response, just before the inevitable end, is startlingly moving: “You’ve always been so kind to me. I’m just beginning to understand how kind you are.” That is the opposite of a paean to individualism: a belated but tender recognition of how much we mean to one another.

What makes it such a startling source for a beloved children’s classic is ultimately not its violence or its sadness but its bleakness. Perhaps the most telling exchange in the book occurs, during that difficult winter, between Bambi’s mother and his aunt. “It’s hard to believe that it will ever be better,” his mother says. His aunt responds, “It’s hard to believe that it was ever any better.”
“Goodnight Moon,” “Little Fur Family,” “The Little Island” (1946). In 1950, she published “The Dream Book.” A few dozen yards away from Brown’s house in Vinalhaven, Rockefeller erected a headstone for her. The inscription was composed by Brown herself: “MARGARET WISE BROWN / Writer of Songs and Nonsense.”
a library of texts that will not be published until a hundred years from now
https://www.newyorker.com/magazine/2022/09/19/the-mysterious-case-of-inspector-maigret
Four iconic generations of literary detectives passed through crime fiction during those decades, from the early thirties to the early seventies, when Simenon was writing his books. There was the Sherlock Holmes type, still dominant in the thirties, with all those eccentric, brainy, slightly comic puzzle solvers: Hercule Poirot, Nero Wolfe, Peter Wimsey, and so on. (A French variant was Arsène Lupin, a gentleman thief, whose creator actually borrowed the character of Holmes on occasion, violating copyright law as he did.) Then came the hardboiled kind, with Dashiell Hammett’s Sam Spade establishing it in the nineteen-thirties and Raymond Chandler’s Philip Marlowe giving it poetry in the forties. In the fifties and sixties, Ross Macdonald and John D. MacDonald introduced the philosophical, brooding, and discursive “therapeutic” detective, with Lew Archer in Los Angeles and Travis McGee in Florida. Finally, there’s the police-procedural detective: Ed McBain’s Eighty-seventh Precinct (McBain was a pen name of Evan Hunter) is more memorable as a collective institution than is any one detective within it.
There is little doubt that, of these two first-time readers, the erudite and the uninformed, Eliot would lean toward the second. “Genuine poetry can communicate before it is understood,” he wrote, in an essay on Dante. “It is better to be spurred to acquire scholarship because you enjoy the poetry, than to suppose that you enjoy the poetry because you have acquired the scholarship.” What he sought, as both a writer and a reader, was “some direct shock of poetic intensity.” True to that quest, “The Waste Land” is a symphony of shocks, and, like other masterworks of early modernism, it refuses to die down.
One of the first people to hear the poem was Virginia Woolf, and her judicious response, as outlined in a journal entry of June, 1922, has lost none of its honesty:
Eliot dined last Sunday & read his poem. He sang it & chanted it & rhythmed it. It has great beauty & force of phrase: symmetry; & tensity. What connects it together, I’m not so sure.
Woolf added, “One was left, however, with some strong emotion.” Indeed.
By the time Samuel Johnson came to write his “Lives of the Poets,” in 1779-81, tastes had changed. In a neoclassical era, ideas still had a place in poetry, but they were supposed to be familiar ones, dignified by harmonious verse—“What oft was thought, but ne’er so well express’d,” in the words of Alexander Pope, the master of the rhyming couplet. By this standard, Donne’s ideas looked weird. Johnson found them “abstruse.” He bestowed on Donne and his contemporaries the label “the metaphysical poets,” not intending it as a compliment. Their trouble, he wrote, was that they were “men of learning, and to show their learning was their whole endeavour; but, unluckily resolving to show it in rhyme, instead of writing poetry they only wrote verses.”
This judgment prevailed into the nineteenth century. The most popular poetry anthology in Victorian England, Francis Turner Palgrave’s “The Golden Treasury,” included not a single poem by Donne.
In contrast, the fifth edition of “The Norton Anthology of Poetry,” published in 2004, includes thirty-one—more than those by Wordsworth or Keats, almost as many as those by Shakespeare. What made the difference was the revolution of modernism, and particularly the influence of T. S. Eliot. In his 1921 essay “The Metaphysical Poets,” Eliot argued that it was exactly Donne’s difficulty and strangeness that made him great. “A thought to Donne was an experience; it modified his sensibility,” Eliot wrote, and modernist poets wanted to recover that union between intellect and feeling. If the poetry that resulted was obscure, that was not a defect but a proof of authenticity. “Poets in our civilization, as it exists at present, must be difficult,” he declared.
Three hundred years earlier, Donne had felt the same way. In “An Anatomy of the World,” he turned an elegy for a fourteen-year-old girl into a diagnosis of spiritual chaos in a world that “Is crumbled out again to his atomies. / ’Tis all in pieces, all coherence gone.” And he worked this incoherence into the very texture of his poetry. In “A Valediction: Of Weeping,” parting lovers cry coins and globes; in “The Comparison,” the sweat of a rival’s mistress is the “spermatic issue of ripe menstruous boils.” In “A Nocturnal Upon St. Lucy’s Day,” Donne annihilates himself: “I am rebegot / Of absence, darkness, death; things which are not.”
Katherine Rundell titles her new biography of Donne “Super-Infinite” (Farrar, Straus & Giroux).
Donne was most widely known in his lifetime as a priest. As the dean of St. Paul’s Cathedral from 1621 until his death, he was one of the capital’s most prominent clergymen, a celebrated preacher whose performances drew thousands.
But “Devotions Upon Emergent Occasions,” a series of vivid and searching reflections on mortality, remains just as powerful as when Donne wrote it, in 1623, during a serious illness. Lying in bed, he heard church bells toll for the dying and wondered if they were being rung for him. Perhaps “they who are about me, and see my state, may have caused it to toll for me, and I know not that,” he writes. The thought led to Donne’s most famous lines, though probably few who quote them know who wrote them and why: “No man is an island, entire of itself; every man is a piece of the continent, a part of the main . . . any man’s death diminishes me, because I am involved in mankind, and therefore never send to know for whom the bell tolls; it tolls for thee.”
Rundell observes that Donne was born within sight of the cathedral where he would later preside—the old St. Paul’s, which burned down in 1666 and was replaced by Christopher Wren’s dome. But he was hardly destined to rise in the Church of England. The Donnes were a Catholic family, who kept the old faith at a time when Queen Elizabeth I was determined to make England a Protestant realm once and for all. Through his mother, the poet was related to Thomas More, the author of “Utopia,” who died as a martyr in 1535 for resisting Henry VIII’s break with Rome. Half a century later, being a Catholic was still a matter of life and death. In 1593, when Donne was twenty-one, his younger brother Henry was arrested for hiding a Jesuit priest in his rooms in London and died in jail of plague. (The priest was hanged, drawn, and quartered.)
Donne’s Catholic background meant that certain doors were closed to him. He attended Oxford as a teen-ager but didn’t take a degree, since doing so required swearing an oath of allegiance to the Church of England. As a young man, however, he converted to Anglicanism—whether out of sincere belief, the desire to get ahead, or (most likely) a combination of both. Donne was set on a career at court, and the right faith was a prerequisite, along with intelligence, boldness, and the ability to flatter.
Donne’s poems were written to be passed hand to hand. Manuscript copies from his lifetime are still being discovered. This intimacy helps to explain one of their most recognizable features: the casually forceful first lines that seem to reach out and shake you by the shoulder. “For God’s sake hold your tongue and let me love,” Donne demands in “The Canonization”; “Busy old fool, unruly Sun,” he chides in “The Sun Rising.” He’s no more polite toward himself. “I am two fools, I know / For loving, and for saying so / In whining poetry,” begins “The Triple Fool.”
“The Ecstasy” begins by likening the reclining poet and his lover to a pillow on a bed, then to a violet drooping on a riverbank. Their clasped hands are cemented together by a balm; their eyes are threaded together on a string. These inanimate comparisons are undeniably weird—the kind of thing Samuel Johnson had in mind when he complained about images “yoked by violence together.”
The uncanniness is deliberate. Donne turns the lovers’ bodies into objects to emphasize that their souls have escaped and are now merging in the air to create a new, joint soul. (“Ecstasy,” he counts on the reader to know, comes from the Greek word ekstasis, which literally means “standing outside oneself.”) Above all, however, it is the poetic equivalent of a gymnast’s floor routine: a demonstration of literary agility, as Donne leaps from idea to image and back without ever putting a foot wrong. Shakespeare, Donne’s contemporary, amazes us by making great verse seem so easy to write, as if it simply spoke itself. Donne amazes us by making it look almost impossibly hard.
[He marries in secret, is imprisoned, and lives for some years in poverty until he obtains a post at St. Paul’s.]
When his secret marriage was discovered and ruin loomed, the poet wrote to his bride, “John Donne, Anne Donne, Un-done”—a bit of wordplay that became part of his legend. Because his poems are mostly undated, it’s impossible to know how many years passed before he returned to the same pun in the refrain of his solemn poem “A Hymn to God the Father”:
Wilt thou forgive that sin, through which I run,
And do run still, though still I do deplore?
When thou hast done, thou hast not done,
For I have more.
There was plenty of support for that idea in a society like Renaissance England, where so many fundamental beliefs were being rewritten. For centuries, being a good Christian had meant obeying the Pope; now it meant hating him. For even longer, the stars in the night sky had revolved around the Earth in harmonious spheres. Now, thanks to the discoveries of Copernicus and Kepler, “The sun is lost, and th’earth, and no man’s wit / Can well direct him where to look for it,” Donne wrote in “The Anatomy of the World.”
This mental vertigo works itself into Donne’s poems in ways large and small. One of his “Holy Sonnets” begins in arresting fashion: “At the round earth’s imagined corners, blow / Your trumpets, angels.” The image is taken from the Book of Revelation, where, on Judgment Day, angels stand at “the four corners of the earth.” The poem acknowledges that, since we know the Earth is a sphere, its corners can only be a figure of speech; even Scripture can’t be taken at face value. But, if so, who’s to say that the angels, too, aren’t “imagined,” along with the redemption they herald? Donne the priest would never have doubted the existence of angels and Judgment Day, but Donne the poet couldn’t stop himself from raising the question. As the modernists would find centuries later, once poets start thinking in language, there’s no telling where they might end up.
At bottom, it’s not about length but about whether it’s O.K. for the novelist, having dealt with his story from one angle, to wander off and then come back to it from a different angle. In the mind of your typical nineteenth-century historical novelist, this is obviously O.K. He’s a great writer, so why should anyone object if he interrupts his story to give us a lesson on the whiteness of the whale or the succession wars in northern Italy in the seventeenth century? He’ll come back to the main story. What’s the problem?
According to James, the problem was that this was not art. It was basically a picture without “composition,” by which he meant selection, focus. “A picture without composition slights its most precious chance for beauty,” James wrote.
“The Betrothed” emerges in the new translation as a work that anyone who cares about nineteenth-century fiction should want to read. It has the great events—war, famine, plague—and the record of their impact on humble people. It has the sentimentality: demure maidens and brave lads and black-hearted villains. It has passages of lyrical description and passages where the specificity of detail verges on the sociological. It has the prolixity, annoying to some, comforting to others. In other words, it is an exemplary historical novel.
“The Betrothed” is set not in the nineteenth century but, rather, in the seventeenth, a terrible time, the period of the Thirty Years’ War and of resurgent bubonic plague. This permitted Manzoni to make his book more sensational and exotic. (The men wear those floppy-cuffed seventeenth-century boots, like Puss in Boots.) It also, by relieving him of the temptation to allude to people in power in his time, kept him out of jail.
These two people have been through a lot. They both seem older than they were at the start. I cried.
Part of the pleasure of reading “The Betrothed” comes simply from its romanticism, its sweep and danger and excitement: great, gloomy castles jutting over perilous abysses, pious maidens being abducted by unrepentant villains, murderous nuns.
Manzoni was a philologist of sorts—he wrote essays on language—and he deplored the ragbag nature of his native tongue. Because, in his time, Italians mostly stayed close to home and were ruled by foreigners, they barely had a native tongue; the peninsula was a patchwork of mutually unintelligible dialects. Manzoni said that his own writing was an “undigested mixture of sentences that are a little Lombard, a little Tuscan, a little French, and even a little Latin; and also of sentences that do not belong to any of these categories.” In the first edition of “The Betrothed,” published in three volumes from 1825 to 1827, he tried hard, with the help of dictionaries and learned friends, to write a purer Italian—which to him meant the Tuscan dialect, the language of Dante. This edition was an immediate success, but Manzoni wasn’t satisfied with it. He was ashamed of the Milanese and other Lombard usages still defacing his text, as he saw it, so he sat down and for the next thirteen years painstakingly revised the novel, effectively translating his own book—even moving to Florence for a while, to be able to command the cadences of Florentine Tuscan. This revision, which then appeared in ninety-six installments between 1840 and 1842, is what Italians read today and what Michael F. Moore has translated for the Modern Library.
But “The Betrothed” is not just a novel. Its weakest component is its plot, or the plot’s organization. A lot of its psychology isn’t too strong, either. Under the influence of early-twentieth-century commentators such as Henry James and E. M. Forster, we, too, may believe that those things are the most important elements of a novel. “The Betrothed,” however true to its time, is closer to an opera, crammed with solos, duets, choruses, and lyric passages that, from what we can tell, are there more for art’s sake than for the sake of anything else.
we see time in a way similar to how we read: left to right, or top to bottom in the case of Chinese.
KAFKA
Aside from these forays into fiction, the diaries’ most arresting writing is clinically visual. Kafka’s many meticulous descriptions of acquaintances, strangers, and urban tableaux are as cruelly observant as a portrait by Lucian Freud. “Artless transition from the taut skin of my boss’s bald head to the delicate wrinkles of his forehead,” one reads.
LITERARY CRITICISM
To be the kind of person who could translate the Iliad in 1880, or do a close reading of a poem in 1950, or “queer” a work in 2010, was to be manifestly the product of a university, and to reap economic and social rewards because of it. Any claim about what should be taught had to be seen in light of the academy’s institutional role. Whether one spoke of the Western canon (as Bloom did), the feminist canon (as Sandra Gilbert and Susan Gubar did), or the African American canon (as Henry Louis Gates did), the idea of a literary canon was a form of cultural capital.
“How far beyond the classroom, or beyond the professional society of the teachers and scholars, does this effort reach?” he asks, knowing that the answer is: not far at all.
As a result, literary study has contracted. State legislatures have slashed funding for the arts and humanities; administrators have merged or shut down departments; and the number of tenure-track jobs for graduate students has dwindled. Since the nineteen-sixties, the proportion of students pursuing degrees in English has dropped by more than half.
Whatever the case may be, the hard truth is that no reader needs literary works interpreted for her, certainly not in the professionalized language of the literary scholar. Soon, Guillory writes, the knowledge and pleasure transmitted by literary criticism in the university may become “a luxury that can no longer be afforded.”
The hundred years on either side of “The Critic” marked, for Virginia Woolf, the ascendancy of “the great critic—the Dryden, the Johnson, the Coleridge, the Arnold.”
This led Woolf to look around and lament the sudden absence of greatness. “Reviewers we have but no critic; a million competent and incorruptible policemen but no judge. Men of taste and learning and ability are for ever lecturing the young,” she wrote. “But the too frequent result of their able and industrious pens is a desiccation of the living tissues of literature into a network of little bones.” Hovering just outside the frame of these damning sentences is the institution of the academy, the place where lectures and dissections were undertaken, and where the social order—and criticism along with it—was transformed by the rise of the profession. [Before, there were critics with a worldview, with views on the beautiful and the moral; later, they merely applied the formulas of the school they belonged to.]
Establishing a formal method of critical inquiry was in part an attempt to put literary studies on a par with the sciences, which were the chief models for the development of the professions in the university. Close reading branched out into many methods of reading—rhetorical reading for the deconstructionists, symptomatic reading for the Marxists, reparative reading for the queer theorists—culminating in what has been called the “method wars.” But the method wars, Guillory argues, really represented a willingness to settle for “no method.” None of these practices were replicable in a scientific sense; no literary scholar could attempt to corroborate the results of, say, a feminist critique of “Jane Eyre.” Furthermore, criticism became more interested in its own protocols than in what Guillory calls “the verbal work of art.” Discussions of how a novel or a poem worked were less valuable than whatever historical or political occurrences it manifested. The aims of criticism and of scholarship diverged.
The final phase of criticism’s arc began with the rise of a figure that Roger Kimball memorably described as the “tenured radical,” and which we might think of as the Scholar-Activist. For her, the proper task of criticism was to participate in social transformations occurring outside the university. The battle against exploitation, she claimed, could be waged by writing about racism, sexism, homophobia, and colonialism, using an increasingly refined language of historical context, identity, and power.
Today, in academe, one looks around with dismay at what a century of professionalization has wrought—the mastery, yes, but also the bureaucratic pettiness, the clumsily concealed resentment, the quickness to take offense, and the piety, oh, the piety! The contemporary literary scholar, Guillory tells us, is marked by an inflated sense of the urgency and importance of his work. This professional narcissism is the flip side of an insecurity about his work’s social value, an anxiety that scholarly work, no matter how thoughtful, stylish, or genuinely interesting, has no discernible effect on the political problems that preoccupy him.
Scholars, instead of chasing relevance via a politics of surrogacy, might gain from embracing the marginality of literary study. Doing so could free criticism’s practitioners to play to their hidden strengths: their ability to pronounce with intensity and determination on the beauties and defects of writing; their capacity to think about language with absorption and intelligence; their mingled love of art, craft, erudition, connection, and sensuousness. Who knows what consequences this might have on the attractiveness of the discipline to undecided undergraduates or interested lay readers?
[What should a critic do? First, supply the context that most readers may lack. Then identify what the author was trying to do and assess how far they succeed. Finally, judge whether what the author intended was a sound choice in the first place.]
Italo Calvino was, word for word, the most charming writer to put pen to paper in the twentieth century. He was born a hundred years ago in Cuba, the eldest son of a wandering Italian botanist and her agronomist husband.
https://www.nytimes.com/2023/06/26/books/goodreads-review-bombing.html?utm_source=pocket_mylist — review bombing in literature
https://getpocket.com/explore/item/50-great-classic-novels-under-200-pages?utm_source=pocket_mylist
https://getpocket.com/explore/item/13-books-that-will-actually-make-you-laugh-out-loud?utm_source=pocket_mylist
2024
https://www.newyorker.com/magazine/2019/10/14/how-to-read-gilgamesh?utm_social-type=owned&mbid=social_facebook&utm_brand=tny
Gilgamesh
Scott Frank rewrites other people’s scripts for 300m a week, focusing on what makes the characters interesting.
ny20240101