Blue dots on an orange background arranged to create the optical illusion that the image is swirling counterclockwise

In Our Feeds

Phone Homes, Data Illusions, and Digital Canyons: Five Things That Made Us Smarter This Week

From perception limits to human snails, we learned a lot over the last seven days

We’re living in a world awash with content—from must-read articles and binge-worthy shows to epic tweetstorms and viral TikToks and all sorts of clickbait in between. The Elective is here to help cut through the noise. Each week, members of the Elective team share the books, articles, documentaries, podcasts, and experiences that not only made them smarter but also changed how they see the world around them and, often, how they see themselves.

Series of green, white, blue, and black arcs made into an optical illusion to create the sensation that the image is swirling

Vectordivider/Getty Images

Yes, your eyes are deceiving you: this isn't a gif. Or *is it*??

Who You Gonna Believe, the Internet or Your Own Eyes?

What you see is what you get. Except, sometimes not. In this fun article for SyfyWire, Phil Plait uses a series of classic optical illusions to draw a distinction between data—the raw inputs making their way into our brains—and information, the more nuanced and contextual conclusions we draw from that data. Tricking our brains is a great way to learn how that information processing works, and why we have to be careful about believing our own eyes. (It’s also a great way to start endless arguments on the internet. Anybody remember the famous blue-gold dress of viral photo fame, circa 2015? What innocent days those were in the online world….)

Cultivating a little skepticism for what’s on your screen is even more important in the dawning age of deepfakes, where powerful editing software can bend digital reality to create convincing videos of events that never happened and remarks that were never said. It’s already freaking out the national security world and creating controversy for politicians, celebrities, and the Queen of England. But it’s not all doom-and-gloom. The same technology that lets troublemakers plant words in people’s mouths also lets artists do some amazing things. Like hit the slopes with ostriches, I guess? In conclusion: the internet is doing weird stuff to our brains. [And only we know if there’s really an optical illusion embedded deep in this text. Ed.] —Stefanie Sanford

Map of the United States with counties shaded in blue and gray denoting where the percentage of people using the internet at speeds of 25Mbps or above is below 15% (blue) or at or greater than 15% (gray).

The Verge

Map created with Microsoft data of the nation's digital divide problem. Absolutely visit The Verge (link in blurb) to see the full, interactive version.

Tired and Extremely Not Wired

We spend a good amount of time at The Elective talking about the digital divide—with policymakers and leaders, with colleagues, and among ourselves. The accessibility gap is an intractable problem that keeps tens of millions of Americans from participating fully in the 21st century. Job opportunities are lost, remote school becomes impossible, access to news and information is cut off. The issue is undeniable, but its scale has become the focus of intense scrutiny during and because of the covid-19 pandemic. How many people really don't have access to reliable high-speed internet? Unsurprisingly, the answer depends on who you ask. The most recent data released by the FCC, in June 2020, says 14.5 million Americans lack broadband access—an improvement over the 21.3 million number it cited in 2019. But in a piece published by Vice this week, the consumer-focused research site BroadbandNow dug into the FCC data and found that the digital divide is closer to a yawning chasm. "The actual number of Americans without broadband remains stuck around 42 million, or roughly three times higher than the U.S. government claims," Vice reports. The piece goes into how BroadbandNow came to this number, which involves analyzing information in a reporting document, Form 477, that internet service providers file with the FCC. It's worth reading the piece and tumbling down the rabbit hole. But the salient point is that BroadbandNow says Form 477 has "a widely acknowledged flaw" relating to how ISPs can claim coverage, which radically overstates how many people are actually able to access high-speed internet. The FCC data also doesn't account for "broadband monopolies," where Americans are forced into high-cost plans because of a lack of competition. And since these costs can be prohibitive for low-income households, this perpetuates and widens the digital divide.

But don't just take BroadbandNow's word for it. Also this week, The Verge published a map created using Microsoft data that similarly takes the FCC's ISP-friendly data to task. "The disparity...can be shocking," The Verge writes. "In Lincoln County, Washington, an area west of Spokane with a population just a hair over 10,000, the FCC lists 100 percent broadband availability. But according to Microsoft’s data, only 5 percent of households are actually connecting at broadband speeds." (Shocking isn't exactly the word I'd use. #NSFW) The Microsoft data points to yet another problem with the FCC data: the flattening of the term "usage." Just because you can go to a McDonald's or library parking lot and pick up some leaky Wi-Fi signal doesn't mean you have reliable internet access. Nor does owning a device mean you're always able to connect. Smartphones and tablets are expensive, but they're one-time costs if you don't get a mobile data plan and instead rely on publicly available Wi-Fi. Mobile and home internet plans are recurring costs, often exorbitant, which leads many to pick up cheaper devices and find public access points for calls, emails, class, and other important modes of 21st century communication. The device-access gap is another component of the digital divide, but building out reliable, inexpensive high-speed internet is an infrastructure issue that can be solved—and should be solved, like, yesterday. The federal government wired the nation for electricity with New Deal programs like the Tennessee Valley Authority; it's well past time for a similar solution for broadband connectivity. —Dante A. Ciampaglia

Male student, seen from behind, stares at a long blackboard with math equations written on it.

Hill Street Studios/Getty Images

This college student is actually looking at an equation to help him bend the laws of physics and space-time to prevent himself from graduating while the economy's in a recession.

Worst. Commencement Speech. Ever.

“Don’t graduate into a recession.” That’s the straightforward—if highly impractical—advice from the economists at the Richmond Federal Reserve, summarizing a research paper on the long-term effects of finishing college during an economic downturn. Labor economists have known for a long time that tossing your newly minted resume into a depressed job market leads to lower earnings and less career growth in the early years, but it was widely assumed those impacts would level off over time. Not so, say the latest findings: “New research suggests that the picture is worse, with longer-term consequences not only for workers' earnings, but also for their health and family outcomes.” Not only do recessionary graduates suffer lower earnings, but they’re also more likely to die from “deaths of despair” in mid-life, leading to a loss of six to nine months of life expectancy. “The long-term effects of graduating into a recession are costlier than previously believed,” they write, with econometric understatement.

All of this data is based on the 1982 recession in the United States, so it’s not at all clear the same findings will carry forward to modern economic slumps. It’s an especially hard question for the unlucky class of 2020, who went very suddenly from a booming economy to a pandemic-induced nosedive in hiring, followed now by the sharpest growth in decades. Economic whiplash is bound to have some strange effects over time, too. But ultimately, there’s not a lot you can do about the state of the world at commencement time. “Don’t graduate into a recession” is great guidance. If only there were some way to follow it. —Eric Johnson

Photo of three friends having a good time while eating pizza at an indoor party in an apartment

The Good Brigade/Getty Images

Show of hands: Who has 150 friends? ... Really? That many of you? Can you, you know, share some?

The Dunbar Popularity Humblebrag

How many friends can you really have? Social butterflies might have more, introverts fewer, but when you average it all out the max is about 150. At least according to Robin Dunbar, the Oxford professor who popularized the theory and lent his name to it. “Dunbar’s number applies to quality relationships, not to acquaintances–which account for the more casual outer layers of our social networks, beyond our 150 meaningful friendships,” Dunbar writes in a recent essay. “We see it in telephone calling networks, Facebook groups, Christmas card lists, military fighting units, and online gaming environments. The number holds for church congregations, Anglo-Saxon villages as listed in the Domesday Book, and Bronze Age communities associated with stone circles.”

There has been plenty of debate in recent years about whether Dunbar’s 150-person limit holds in the era of social media and instant communication, where it’s theoretically possible to keep in regular contact with more people. But he’s adamant that genuine friendship—defined as people you could greet warmly and converse with easily if you met them out in public—has an upper bound. I’m inclined to agree with him. Time is finite, and solid friendships take some real temporal investment. Passively following someone’s social media doesn’t translate into a cheerful hello in the real world. Even though Swedish researchers claim there’s no limit for a person who wants to put in the effort, my life experience puts me in the Dunbar camp. —Eric Johnson

Gif of a woman saying "Internet is exciting."

Ah, the heady days of Web 1.0 when everyone hyped the internet as if it were a sentient being that arrived from a distant galaxy to become humanity's benevolent overlord. What weirdos! (That'll fool 'em. Does this please you, Internet....?)

Do They Still Count As Friends If They’re Snails?

Almost 30 years ago, theorist and author Michael Hauben coined an enduring term of the Internet Age: netizen. The portmanteau of "net citizen" was meant to describe the person, in the 21st century, who exists "as a citizen of the world thanks to the global connectivity that the Net gives you," Hauben wrote in his 1993 article "Common Sense: The Impact the Net Has on People's Lives." "You physically live in one country but you are in contact with much of the world via the global computer network. Virtually you live next door to every other single netizen in the world. Geographical separation is replaced by existence in the same virtual space." It wasn't long before "netizen" was ruined by people trying to sound clever as the Dot Com Bubble inflated, but in its original usage there was a not-so-subtle utopianism to the idea that even if we existed separately as corporeal beings, our digital avatars could be neighbors. Thanks to recently released research on smartphone habits by a team at University College London, I got to thinking about netizenship for the first time in years. And once again it's far removed from the original utopian promise of the internet.

“The smartphone is no longer just a device that we use, it’s become the place where we live,” Professor Daniel Miller, who led the UCL study, told The Guardian. “The flip side of that for human relationships is that at any point, whether over a meal, a meeting or other shared activity, a person we’re with can just disappear, having ‘gone home’ to their smartphone.” On the one hand this is hardly news. We've all been in situations where we've witnessed this vanishing act, be it over a meal, at a gathering, or now on Zoom calls. Still, the findings are disconcerting. Who wants to fit the researchers' description of smartphone users: “human snails carrying our homes in our pockets”? There are piles of research that show we're addicted to our devices, and the pandemic hasn't helped matters. When you're not able to interact with the real world, what else is there to do but retreat into the virtual one? The tech-first world wrought by the last 14 months has also returned the concept of "digital citizenship" to the conversation. But really what we're talking about is netizenship—or, maybe, netizenship 2.0. The devices might be our homes, but too often the first (and only) rooms we use are our bunkers and panic rooms. After more than a year of living separately together, Hauben's original netizen idea sounds wonderful. Let's all invite each other over to explore these fancy digs we all carry around in our pockets. You won’t even need a vaccine passport! —Dante A. Ciampaglia