Last Tuesday, shortly after the Centers for Disease Control and Prevention issued relaxed guidelines for wearing masks in public during the COVID-19 pandemic, President Joe Biden gave a speech on the North Lawn of the White House. The setting was so verdant—bright sunlight, tall trees framing a lectern, shrubbery in full bloom—that it might have been a virtual Zoom background. Biden wore a black mask to the lectern, then took it off to speak. “If you’re in a crowd, like in a stadium or at a concert, you still need to wear a mask, even if you’re outside,” he said. “But, beginning today, gathering with a group of friends in the park, going for a picnic, as long as you are vaccinated and outdoors, you can do it without a mask.” He described the chance to avoid masking up outdoors as a reason to get a vaccine, and cited it as a giant step for the nation in its drive to gain “independence from the virus” by the Fourth of July. After the speech, he put on dark aviator sunglasses and went back into the White House, leaving the mask behind.
The United States, as a society, is far from leaving masks behind. Most of public life takes place indoors (on subways and buses, in offices and schools, in big-box stores and houses of worship), and the outdoor gatherings most dramatically suspended because of the pandemic—concerts and sporting events—are those where mask wearing is still strongly recommended. The coronavirus may surge back, owing to the persistence of variant strains or a sharp uptick in travel this summer. Even if we reach herd immunity, with seventy per cent of the population vaccinated—something that currently seems unlikely—many of us will keep a mask handy, like a pair of sunglasses, and wear it as the situation demands.
Nevertheless, the relaxing of the mask guidelines marks the end of a yearlong cultural moment in which we protected ourselves like surgeons and hid our faces like bandits. It also makes clear what has not ended. The public-health protocols that led us to don masks also produced the opposite effect. In the same year that we shrouded our faces outdoors, we put them on sustained and exaggerated display indoors, via Zoom, Teams, and other video-conferencing apps. And the facing-the-camera practice is here to stay, made ubiquitous by technologies whose development advanced while we were masked.
Not so long ago, to cover your face in public was to be seen as an outlaw—as suggested last week, when Ted Wheeler, the mayor of Portland, Oregon, vowed to “unmask” violent protesters in the city. Not so long ago, to “show your face” was to be physically present in a place. Doctors’ appointments, cross-examinations, job interviews, first dates—all were conducted in person, out of the conviction that the best way to read people is by their faces. And the importance given to the human face as the center of character and emotion made full-face video the final frontier in communications technology. The writers of the early-nineteen-sixties TV cartoon “The Jetsons” recognized this: the show’s videophone (along with its jet pack and robotic vacuum cleaner) was the future-auguring device to beat all. So did the founders of Skype, which had more than six hundred million users by 2010; so did Apple, which introduced FaceTime on its phones the same year.
Yet, in “Infinite Jest,” David Foster Wallace made the allure of video-calling a parable about the drawbacks of transformative technology. The novel, published in 1996, is set in 2009. The narrator fondly recalls the era of “the retrograde old low-tech Bell-era voice-only telephonic interface,” when you could presume that you had the total attention of the person on the other end of the line, while you yourself “could look around the room, doodle, fine-groom, peel tiny bits of dead skin away from your cuticles, compose phone-pad haiku, stir things on the stove.” Then came the videophone. But it turned out that people didn’t like seeing another person’s face on a screen while they talked, and that giving the other party their full attention was hard, all of which led to “videophonic stress.” In the end, though, videophony faltered because people loathed the way that they looked onscreen, and came to suffer from a malady known as video-physiognomic dysphoria. Soon, entrepreneurs devised solutions: a tech trick that improved the images of faces; then a line of masks that could be worn for different moods or calls; and then full-body cutouts (masks for the whole body) and transmittable tableaux (electronic human shapes shown in place of the caller). All this was to insure that callers couldn’t see one another—just as it used to be in the low-tech Bell days.
During the pandemic, video calls have finally become an everyday reality—and so have many of the drawbacks that Wallace foresaw. All that face time has made us intensely self-conscious about the image we project. As a result, there has been a “Zoom boom” in facelifts, prompted by Zoom dysmorphia—people’s obsession with imperfections that they’ve noticed onscreen. Of a hundred and thirty-four dermatologists surveyed by a trade journal, eighty-six per cent said that new patients cited the way they looked while videoconferencing as a reason for getting work done. A recent Stanford University study, with more than ten thousand participants, concluded that long periods of videoconferencing caused women, in particular, to experience “mirror anxiety,” and recommended that organizations space out Zoom meetings and hold some meetings without video. Last month, concern about Zoom burnout led the chief executive of Citigroup, Jane Fraser, to institute Zoom-free Fridays at the company.
Meanwhile, full-featured videoconferencing rigs have become standard equipment for those with budgets to pay for them. During the Golden Globe Awards, in February, Aaron Sorkin drew attention for his setup: a wide-angle view of a living room and kitchen where family members and colleagues tried to look casual, as he accepted the award for Best Screenplay. The New York City mayoral candidate Ray McGuire, a former Citigroup executive, joins campaign video events from a book-lined corner of his apartment, facing a video setup centered on a thirty-nine-hundred-dollar camera that’s mounted (along with an L.E.D. ring light) on a tripod. During one event, McGuire mimicked an aide directing him: “You need to sit back, you need to lean forward, you need to turn this way, you need to turn that way. You’ve got a great camera, but, you know, smile a bit!”
After a year, we are also more aware than we were of the face itself as a mask, a site of performance and scrutiny. Watching video shot by an adolescent bystander, jurors in the murder trial of Derek Chauvin, the former Minneapolis police officer, could see the look on George Floyd’s face as Chauvin’s knee pressed on his neck—a mix of pain, terror, and astonishment—as well as the utter indifference on Chauvin’s face. That video was vital to the prosecution’s case. Close-up, on-the-spot video may make law-enforcement officers accountable as never before.
At the same time, monitoring the human face through technology has become the default mode of public encounter in less edifying ways. After years of public resistance to telemedicine, doctors are now regularly seeing patients by videoconference, and in-person consultation with a physician for non-urgent health matters may go the way of the house call. It’s so commonplace now, when you enter an arena or an office, for someone to point a device at your forehead to take your temperature that you don’t even register it. That’s a long way from the sinister strategies found in dystopian fiction; but the application of face-recognition technology depends on our getting so accustomed to having our faces scrutinized that we no longer think twice about it.
Last June, a group of Senate Democrats, led by Ed Markey, of Massachusetts, and Jeff Merkley, of Oregon, introduced a bill calling for a ban on the use of face-recognition technology by law enforcement. Markey, in a statement, cited concerns that the practice “poses a serious threat to our privacy and civil liberties, and it also disproportionately endangers Black and Brown Americans.” (The technology is said to be less accurate in identifying nonwhite faces than white ones.) The bill faltered, and, six months later, private citizens and local law-enforcement officials employing face-recognition technology sent federal investigators tips that helped them identify people who had taken part in the January 6th riot at the Capitol. Markey decried the practice, urging law enforcement to “keep the public safe and hold criminals accountable without relying on invasive tools that are proven to have serious accuracy and bias issues.” He plans to reintroduce the bill later this year.
Face recognition, in fact, is now the leading edge of public surveillance. The Chinese Communist Party has spent a decade embedding such technology in urban life, and now habitually uses it to monitor citizens. We may assume that such monitoring won’t happen here, but, just as the street-surveillance cameras that civil-liberties groups contested twenty years ago are now taken for granted, public resistance to face recognition is likely to soften. Tech companies seem to have recognized this. In 2017, Apple introduced Face ID, which “lets you securely unlock your iPhone or iPad, authenticate purchases, sign in to apps, and more—all with just a glance.”