Pile of floppy disks in various colors

In Our Feeds

Fame, Fortune, and Files: Five Things That Made Us Smarter This Week

From making sense of incomprehensible numbers to the formation of planets orbiting multiple suns, we learned a lot over the last seven days

We’re living in a world awash with content—from must-read articles and binge-worthy shows to epic tweetstorms and viral TikToks and all sorts of clickbait in between. The Elective is here to help cut through the noise. Each week, members of the Elective team share the books, articles, documentaries, podcasts, and experiences that not only made them smarter but also changed how they see the world around them and, often, how they see themselves.

Animated gif showing three small suns orbiting in the middle of a cloud of galactic dust

ESO/Exeter/Kraus et al./L. Calçada

Animation based on a computer model of the inner region of GW Orionis, showing an artist's rendition of the orbits of the three stars at the system's center.

My Three Suns
 

As the possibility of civilian space flight becomes a reality for the super rich, I want to go on the record that I'm not interested—at least not yet. That reluctance didn't stop my space-related excitement almost two years ago, when I read that NASA had discovered a "Tatooine-like" planet that orbits two stars. That sense of wonder skyrocketed (ahem) when I learned of an even more recent discovery. Scientists have been studying the young GW Ori system, 1,300 light years away, which appears to be forming a Saturn- or Jupiter-like planet. They believe this new gas giant will orbit not one, not two, but three different stars. If confirmed, it will be the first circumtriple planet ever discovered. And it could support another hypothesis: that planet formation is relatively common. “What we’ve learned is any time planets can form, they do,” astronomer Sean Raymond told the New York Times. That means planets orbiting upwards of six stars could exist. While one astronomer speculates about the incredible sunsets on the planet in the GW Ori system, my thoughts immediately turn to the possibility of alien life. As we uncover the incredible magnitude and variety of planets, will we eventually learn whether or not we're alone in the galaxy? The Fermi Paradox hinges on the likelihood that we're not, so until proven otherwise I'll keep my eyes peeled for UFOs. —Hannah Van Drie

Andy Warhol in a tuxedo and blue plaid shirt on the left with a microphone under his chin, Candy Darling in a pink dress on the right

Michael Ochs Archives/Getty Images

"This is not what I had in mind when I said everyone would be famous for 15 minutes," Andy Warhol—seen here with Candy Darling at a Hollywood premiere in 1971—probably after encountering influencers.

Where Things Are Hollow
 

Poets and philosophers have always warned about the intoxicating effects of fame. “Let fame, that all hunt after in their lives / Live registered upon our brazen tombs,” Shakespeare wrote in the opening lines of Love’s Labor’s Lost. In Shakespeare’s day many may have wanted fame, but only a few could actually achieve it. Today, the experience of being known by people who don’t know you is familiar to anyone with a Facebook account—and that has had some very weird effects on our collective psychology. “The Era of Mass Fame is upon us,” writes Chris Hayes in a wandering New Yorker essay that bounces between Hegel, fox ears, and Kevin Durant’s Twitter habits in a way that makes perfect sense. “The Western intellectual tradition spent millennia maintaining a conceptual boundary between public and private—embedding it in law and politics, norms and etiquette, theorizing and reinscribing it,” Hayes writes. “With the help of a few tech firms, we basically tore it down in about a decade.”

What I like about Hayes’ argument is that he nails precisely why the experience of social media stardom, at whatever fleeting or modest level most people will ever attain it, feels so unsatisfying. It comes down to the very nature of human relationships: they have to be reciprocal in order to be meaningful. “The Star seeks recognition from the Fan, but the Fan is a stranger, who cannot be known by the Star,” Hayes writes, drawing on the philosophical insights of Hegel by way of the French thinker Alexandre Kojève. “Because the Star cannot recognize the Fan, the Fan’s recognition of the Star doesn’t satisfy the core existential desire. There is no way to bridge the inherent asymmetry of the relationship, short of actual friendship and correspondence, but that, of course, cannot be undertaken at the same scale. And so the Star seeks recognition and gets, instead, attention.” To be recognized is to be truly seen, respected, attended to in a sincere and sustained way. To gain attention is, well, something much less. That distinction accounts for so much of what makes online life a less-healthy version of IRL existence, at least as we’ve organized it under massive, networked tech platforms. Social media is all about scale. You can deliver attention at scale, but relationships, recognition, genuine human connection—there’s no hack or 10x shortcut for that. And fame is no substitute for friendship. —Eric Johnson

Animated gif of Dr. Evil from Austin Powers demanding 100 billion dollars

New Line Cinema

Remember when everyone laughed at Dr. Evil for demanding one hundred. billion. dollars. in one of his hare-brained schemes and told him that kind of money doesn't exist? That was cute.

What’s a Few Hundred Centuries Between Frenemies?
 

Humans struggle with scale. Statistics about population, references to geologic time, estimates of how fast galaxies are speeding away from one another—our brains just don't easily grasp numbers like that. "It feels like the difference between a million and a billion is closer to a factor of three than a factor of 1,000," wrote K.C. Cole in a 2015 issue of Nautilus. "That’s because our brain naturally works using something like a logarithmic scale, so that it can condense information like vast ranges in loudness and brightness efficiently." That's a challenge when we have to think clearly about scientific concepts that involve exponential scales, and an even bigger one when we have to think about public policy. In a country of almost 330 million people, the scale of the decisions we have to make can be genuinely mind-boggling.

I remember the days when a billion dollars sounded like a lot of money, but right now Congress is haggling over how many trillions will be in the next spending bill. And having that argument with any real comprehension of the stakes is nearly impossible. One of my favorite political newsletters tried to give some context to the $2 trillion gap between competing proposals from Senators Joe Manchin and Bernie Sanders. "To provide some context as to how far apart $2 trillion dollars is, think about this analogy. The next million seconds is about 11 days; the next billion seconds is about 31 years; the next trillion seconds is about 315 centuries," writes The Winston Group. "At a rate of spending a dollar per second, a $2 trillion dollar difference is the equivalent of about 630 centuries. So from a spending perspective, Manchin and Sanders are 630 centuries apart." Even that number sends the mind reeling. So here’s some context: 63,000 years is more than ten times the length of recorded human history. Even under the most generous negotiation terms, meeting in the middle in this case is the equivalent of more than 30,000 years. On that scale, “What planet are you living on?!” might not simply be an incredulous rhetorical response to a heated budget debate. —Stefanie Sanford
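If you want to check the newsletter's arithmetic yourself, here is a minimal Python sketch—our own back-of-the-envelope version, not The Winston Group's—that converts dollar amounts into elapsed time at a rate of one dollar per second; the small differences from the figures quoted above come down to rounding and the length of a year.

```python
# A back-of-the-envelope check of the dollars-to-time analogy above:
# at a spending rate of one dollar per second, how long does it take
# to burn through a million, a billion, a trillion, and the $2 trillion gap?
# (Our own arithmetic, not The Winston Group's.)

SECONDS_PER_DAY = 60 * 60 * 24            # 86,400 seconds
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25

def dollars_to_time(dollars: float) -> str:
    """Express a dollar amount as elapsed time at $1 per second."""
    years = dollars / SECONDS_PER_YEAR
    if years < 1:
        return f"about {dollars / SECONDS_PER_DAY:,.0f} days"
    if years < 1_000:
        return f"about {years:,.0f} years"
    return f"about {years / 100:,.0f} centuries"

for amount in (1e6, 1e9, 1e12, 2e12):
    print(f"${amount:,.0f} -> {dollars_to_time(amount)}")

# Prints roughly:
# $1,000,000 -> about 12 days
# $1,000,000,000 -> about 32 years
# $1,000,000,000,000 -> about 317 centuries
# $2,000,000,000,000 -> about 634 centuries
```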

Man in a dark blazer, tie, and shorts walking on a sidewalk

Drew Angerer/Getty Images

A man walks past the building that houses the Appleby law firm offices, November 8, 2017, in Hamilton, Bermuda. #ShortsLife

Short Pants Shortsightedness
 

I live in North Carolina, where temperatures from July to September linger in the mid-90s and the humidity makes it feel like you’re swimming down the sidewalk. I also work on a college campus, where attending a meeting or stepping out for a cup of coffee can mean a mile-long walk in heat often described as “oppressive” and “stifling.” Wearing pants in that scenario—let alone a full-blown suit, jacket, tie, etc.—has always seemed like an act of derangement. So for years I have been the only person sitting around university conference tables wearing a light dress shirt and a pair of sensible shorts. “For most of the history of men’s dress codes, the idea of wearing shorts at work was pretty much heresy—at least in a white-collar office environment, even after the rise of casual Fridays,” writes New York Times fashion critic Vanessa Friedman. “And the question of whether this is an outdated prejudice is the subject—still—of vociferous debate.”

For the life of me, I don’t understand why. Wearing clothes that match the environment has been one of the great innovations of the human species, allowing us to live in places as diverse as Finland and Maui without freezing or baking to death. We have spent the last century experimenting with synthetic fabrics and high-tech forms of textile production that gave us stain-resistant pants, quick-drying jackets, and antimicrobial socks. Yet I’m supposed to slog through swamp-monster conditions all summer long because King Charles II was into vests? Hard pass. “‘Shorts’ is … well, short, for ‘short pants,’ which are what little boys traditionally wore to denote their juvenile status,” Friedman points out. “On the other hand, the expression that someone ‘wears the pants,’ is intended to denote the person’s powerful, boss-like aura.” Duly noted. But I think my boss-like aura would be even more compromised if I arrived at every meeting sweat-soaked and gasping for breath. And good luck to anyone who tells LeBron James he looks juvenile. —Eric Johnson

Animated gif of a character in the movie Zoolander throwing a computer on the ground

nickyonge/reddit

...where'd all the files go?

The Files Are *Inside* the Computer!
 

Something like a decade ago, when I was running Scholastic’s Kid Reporter program, one of my kid journalists was in the office and stumbled on a 3.5-inch floppy disk. She treated it like a piece of long-rotting fruit. “What’s this?!” she asked. I explained how we used them to save files and how computers once had disk drives and… “Wait,” I said, “look at this.” I unearthed a 5 1/4-inch disk from a drawer. “What do you think *this* is?” I’m pretty sure her brain fritzed out. It was a humbling experience, but I got it—kids stored stuff differently than we old fogies did. Reading Monica Chin’s recent piece for The Verge, though, made me feel like an ancient castoff from an alternate dimension. College professors, it seems, are having a hard time explaining rudimentary computing concepts that people not much older than their students take for granted. “The concept of file folders and directories, essential to previous generations’ understanding of computers, is gibberish to many modern students,” Chin writes.

The story is wild and fascinating. I absolutely identify with professors like Nicolás Guarín-Zapata, an applied physicist and lecturer at Colombia’s Universidad EAFIT, who thinks of his hard drives like filing cabinets. (My stack of external drives attests to how Team File Cabinet I am.) I also see how someone like Texas A&M journalism student Aubrey Vogel, who came of age in the search era, can see digital files as vestigial tech tails. (I might love external drives, but I will never ever ever have a physical file cabinet in my home.) But as Chin reports, this generation gap has real consequences for computer education. “Vogel recalls saving to file folders in a first-grade computer class, but says she was never directly taught what folders were—those sorts of lessons have taken a backseat amid a growing emphasis on ‘21st-century skills’ in the educational space,” Chin writes. This story might seem like a funny “kids these days!” diversion—that’s what caught my eye, anyway—but it raises real issues and challenges. If coding and AI and robotics are important, can students work with that technology if they don’t understand something as basic as files? It’s hardly an abstract query—and not one easily answered by searching Google or asking Siri. —Dante A. Ciampaglia