
In Our Feeds

Decoding Macbeth, Celebrating Obituaries, and Opting In to Optimism: Five Things That Made Us Smarter This Week

From the computer scientist who quit to the historian who refused to give up, we learned a lot over the last seven days

We’re living in a world awash with content—from must-read articles and binge-worthy shows to epic tweetstorms and viral TikToks and all sorts of clickbait in between. The Elective is here to help cut through the noise. Each week, members of the Elective team share the books, articles, documentaries, podcasts, and experiences that not only made them smarter but also changed how they see the world around them and, often, how they see themselves.

Class of young women in head coverings attending school

Paula Bronstein /Getty Images

Afghan girls read during class at the Markaz high school, October 13, 2010, in Bamiyan, Afghanistan. In the peaceful province of Bamiyan, girls were able to attend school without fear, unlike many in violent, Taliban-infested areas.

Dreams Denied

As a general rule, societies that don’t value the education of young women—and actively prohibit it—are not good places to be; they are some of the poorest and most repressive places on Earth. So among all the other reasons to look with horror at events in Afghanistan over the last week, there’s this 2017 report from Human Rights Watch with the heartbreaking title ‘I Won’t Be a Doctor, and One Day You’ll Be Sick’: Girls’ Access to Education in Afghanistan. “Among the Taliban’s most systematic and destructive abuses against women was the denial of education,” the report reminds us. “During their five-year rule, the Taliban prohibited almost all education for girls and women.” And no country can prosper, economically or culturally, by stifling the talent of half its people.

Over the last 20 years, despite sporadically intense bouts of fighting between Taliban militants and American and NATO troops, Afghan girls made huge strides in education. The Human Rights Watch report draws on interviews across the country, sharing stories of girls who endured terrible risks to go to school; families who sent older children to work so they could afford the fees and books for younger siblings; kids who walked hours each day to access a classroom. The desire to learn, even under the harshest conditions, is so intrinsic to humanity that it takes extraordinary effort to prevent people from trying to access schooling. “Both the Taliban and ISIS have opposed girls’ access to education, and have been responsible for many attacks on education, particularly girls’ education,” Human Rights Watch reports. “Women and girls living in areas controlled by groups affiliated with ISIS described restrictions so severe that they could not leave their homes, let alone go to school.” I live in a university town, and thousands of students from all over the country just arrived for the start of the fall semester. They’re free to walk where they please, wear what they please, read anything they please, and argue with their professors and classmates about anything they please. The freedom to learn is an extraordinary privilege. I wish it for everyone. —Eric Johnson

Animated gif of a monkey pushing a laptop off a table

We've all wanted to just dramatically ditch the machine and walk away from it all. Philip Agre did it. Jealous.

The Big Opt-Out

To read the first few paragraphs of Reed Albergotti’s Washington Post profile of Philip Agre, you’d be forgiven for thinking the man was dead. His colleagues and friends talk about him in the past tense, lamenting the inability of society, the academy, and the tech industry to listen to and heed the warnings of this pioneering computer scientist, professor, and scholar who, in 1994, correctly predicted the big-data-ingestion-by-a-million-vacuums state of digital life; then, in 1997, predicted, again correctly, the ethical failures of artificial intelligence development; then, in 2001, predicted, yep, the surveillance implications of facial recognition software in the exquisitely titled paper “Your Face Is Not a Bar Code: Arguments Against Automatic Face Recognition in Public Places.” He foresaw the topology of our contemporary experience with technology. Charlotte Lee, who studied under Agre as a graduate student at UCLA and is now a professor of human-centered design and engineering at the University of Washington, told Albergotti she wishes Agre were around to help her understand it even better. Because he’s no longer among the living, right? Wrong. “In 2009, he simply dropped off the face of the earth, abandoning his position at UCLA,” Albergotti writes. “When friends reported Agre missing, police located him and confirmed that he was okay, but Agre never returned to the public debate.”

The world of computer science is littered with pioneers (often women and people of color) scrubbed from the record, but it’s not every day we meet someone who ripped himself out of the conversation. Yet that’s what Agre did, apparently without a second thought. “Agre resurfaces from time to time, according to a former colleague, but has not been seen in years,” Albergotti writes. Why? The Washington Post couldn’t track Agre down, nor would his family talk with the paper. His friends, former students, and colleagues, even as they extolled his place in the firmament of our digital lives, were similarly mum, citing his privacy. It’s a wild, fantastic story that not only expands how we understand the moral tailspin technology finds itself in, and our place in the gyre, but also does something exceedingly rare in our Very Online times: it presents an essentially unsolvable mystery, then demands we be happy with the crumb trail, knowing full well it leads to a narrative dead end. And you know what? That’s great. This is the kind of piece that sets the mind racing—not only about who Philip Agre is, but about what insights are buried in his work. More of this, please, Internet. —Dante A. Ciampaglia

Young Black woman smiling as the sun forms a bright corona behind her

Lilly Roadstones/Getty Images

Let's retire "If it bleeds, it leads" and replace it with "If it's bright, it's right." (Eh, we'll keep workshopping it...)

Up With Optimism

Optimism is the new counterculture. Given the sense of permanent crisis in the world—much of it real, but also at times overhyped—any analysis of what’s going right seems like an act of intellectual rebellion. But as Yuval Levin argues in this great piece for Commentary, “you can’t learn much if you aren’t willing to acknowledge successes alongside failures.” And in patient, clear-headed detail, Levin lays out the overlooked ways that America’s pandemic response actually met the scale of the challenge. Our stuttering regime of COVID-19 testing eventually got up to speed and surpassed that of most other countries. Congress, an institution not typically known for swift and effective action, managed to keep the economy afloat and people from destitution during the worst disruption since the Great Depression, providing a level of targeted financial support unprecedented in American history. And most critically, decades of investment in basic science—much of it conducted in major research universities—made it possible to develop vaccines with truly astonishing speed. “The genome of the virus was first made public in China on January 11, 2020,” Levin writes. “The American pharmaceutical company Moderna, working with federal researchers from the National Institutes of Health, produced the first doses of an mRNA vaccine to protect against the virus two days later, on January 13. By late February, the NIH was launching a Phase 1 clinical study; the first study participant received a shot in his arm on March 16, 2020” (emphasis mine). I remember when most health officials were predicting a somewhat effective vaccine, maybe, within two years. Instead, we had stunningly good vaccines authorized in less than 12 months.

What I like most about Levin’s take is that he zeroes in on the way our national character shapes our successes and failures. We’re bad at collective discipline; we’re good at creative, energetic mobilization. “Americans don’t mobilize into order—we mobilize into action, and our modes of mobilized action are often very disorderly,” Levin writes. That can be frustrating when you’re trying to get compliance with public health measures that demand restraint, and we’d certainly benefit from more civic cohesion. But raw energy has benefits, too. “The legislative response to the virus was not a set of rules for Americans to follow, but a set of resources for Americans to deploy. The health-system response did not set strict criteria for triage; it built respirators by the thousands and put enormous field hospitals in parks and football stadiums. The vaccine deployment began with a futile attempt to prioritize recipients, but it ultimately succeeded as a vast, chaotic dissemination of doses to every pharmacy and supermarket in the country.” Being honest about where we’ve failed is important, but so too is recognizing where we’ve done pretty well. —Stefanie Sanford

Orson Welles, wearing a crown and robes, looks into the distance while being watched by men in Viking uniforms in a scene from the film 'Macbeth'

Republic Pictures/Getty Images

The only thing that can make "Macbeth" creepier than repeating a certain word? Orson Welles' lewk from his 1948 big-screen adaptation. *hides*

Macbeth’s Lexis Nexus

One of the best experiences I've had living in New York City was seeing—pre-pandemic, obviously—a production of Macbeth in a shipping container during a sweltering late-summer night in 2017, then drinking whiskey around an oil drum fire with the troupe. It's a memory I return to often, especially now. (Boy, was that fun.) But it popped into my mind this week after reading a story about an excellent deployment of data crunching to analyze why Macbeth is such a "creepy play," as Clive Thompson describes it in a piece for OneZero. Academics Jonathan Hope and Michael Witmore, in a paper published in 2014, cracked it. And it's not just because the play is soaked with the supernatural and geysers of blood. "It turns out that Macbeth's uncanny flavor springs from the unusual way that Shakespeare deploys one particular word, over and over again." What's that word? Read the story! (Hint: It's in this blurb a few times, and in Thompson's article a lot.) The story digs into Hope and Witmore's methodology and findings, which become a portal into the field of digital humanities and how this often-misunderstood discipline can add to our understanding of work, like Macbeth, that we think we know inside and out. And, as Thompson reiterates, data analysis such as this is only the starting point. "It is a very fun discovery about Macbeth," he writes. "When you go back and reread the play, you now have a new type of x-ray vision, and you notice Shakespeare’s fascinating overuse of “[REDACTED]” everywhere. It’s obviously not the only reason Macbeth is an unsettling play. Like all good art, it’s super complex and can’t be reduced to any single literary effect." Counterargument: witches. —Dante A. Ciampaglia

Man in a leather jacket, black shirt, and tie speaks while standing at a lectern

University of California Television/YouTube

Leon Litwack speaks at a Legacy of Slavery event at UC Berkeley in January 2004.

Turn Every Page

Reading obituaries sounds macabre, but I love it. Major newspapers, like the New York Times and Washington Post, tend to write obits about major figures in niche fields, and it’s fascinating to get concise stories about people who had big impacts in the world of, say, sudoku or labor organizing or urban photography. You learn that the path to influence is rarely linear and pretty much always involves a lot of work.

This week, I relearned an important lesson from the Times’ obituary for Leon Litwack, “a leather-jacket-wearing, blues-loving historian” at the University of California, Berkeley. Litwack, who won the Pulitzer Prize in 1980, made a major contribution to the study of Black history by doing something very few scholars had done before: going back to the source material. “Beginning in the early 1960s, a time when many historians still treated enslaved and freed Black people as passive actors in their own narratives, he cut a different path, immersing himself in the archives to discover Black voices and their stories and show how they thought about, and struggled against, oppression,” the Times’ Clay Risen writes. Immersing yourself in the archives, relying on primary source documents instead of hand-me-down interpretations, is how to reshape a scholarly field. It’s the reason classes like AP US History emphasize document-based questions and a reliance on firsthand reading of historical artifacts. “Professor Litwack’s most well-known book, Been in the Storm So Long: The Aftermath of Slavery, dispensed with telling a linear history about the years following emancipation and instead, drawing on years of research in obscure archives, presented thematic stories focused on the way Black Americans experienced their freedom and shaped it,” Risen writes. “The book was revolutionary; many historians had assumed that such documents didn’t exist.” But they did, and Leon Litwack found them. And thanks to his obituary, more people have a chance to find him, and his work. —Eric Johnson