
Seeing past the cool: Facebook’s new smart glasses


Facebook’s announcement of its Ray-Ban Stories glasses, which capture audio and video, is a technological triumph that adds yet another facet to the debate over the protection of personal privacy. While the product is new, the issues it raises are over a century old.

Google Glass was the first effort at commonly available, image-capturing eyewear. I recall when, in 2014, the doors of the elevator I was riding opened and in stepped a young man wearing Google Glass. The unique-looking glasses were a giveaway. Fellow elevator passengers reacted with a mix of wonder and wariness. I asked, “Are you shooting us now?” With a bit of an attitude, the reply came back, “You got a problem with that?”

Though an innovative idea at the time, Google Glass soon faded into little more than an interesting experiment. It continued in some enterprise applications, but it was not ready for prime time as a consumer product. Facebook’s new Ray-Ban Stories glasses appear to have overcome Google’s earlier problems: the devices are priced at about one-fifth of what Google Glass cost and look like the Ray-Bans Tom Cruise made famous in the movies. Back then on the elevator, we knew something atypical was afoot because of the geeky appearance of Google Glass. Tom Cruise cool, however, is another matter.

Without a doubt, the Facebook Ray-Ban technology is impressive: two 5-megapixel cameras, three microphones, and four gigabytes of storage, contained not in a bulky, Star Trek-looking device but in a simple pair of otherwise indistinguishable Ray-Bans. Simply tap the shades and the audio or video is captured. A small LED on the Ray-Ban Stories frame is supposed to let people know they’re being recorded. The product’s website reassuringly proclaims, “Designed for privacy, controlled by you.” The privacy issue, however, is not just about “you,” the picture-taker, but also “them,” the people being snapped and recorded. That the secret snapping is associated with Facebook, a company not known for its respect of personal privacy, is not reassuring.

The Evolution of Privacy

This is not the first time the issue of personal privacy and covert photography has arisen, and that history illustrates the evolving interpretation of what constitutes privacy. In the late 19th century, George Eastman developed the Kodak mass-market portable camera. It set off howls of concern. Suddenly, anyone could hold in their hands the ability to capture another person’s image and actions without that person’s knowledge or permission.

No less a legal scholar than Louis Brandeis, who would go on to become one of the great Supreme Court justices, responded to the new Kodak with a law review article entitled The Right to Privacy. “Recent inventions and business methods call attention to the next step which must be taken for the protection of the person, and for securing for the individual…the right ‘to be left alone,’” Brandeis and Samuel Warren wrote for the Harvard Law Review. The article went on to warn that, “numerous mechanical devices threaten to make good the prediction that ‘what is whispered in the closet shall be proclaimed from the house-tops.’”

Brandeis’ analysis seems quaint today, when smartphones generate over a billion photographs daily. But replace his “mechanical devices” with digital devices, and the whisper in the closet with an elevator conversation that can be photographed, recorded, stored, and manipulated without permission, and the result is a brave new world.

The Digital Conundrum

Ray-Ban Stories illustrates the conundrum of the digital era: how the exponential growth in technological capabilities stretches the linear thinking of humans and their institutions—a topic explored in a new book, The Exponential Age, by Azeem Azhar. Becoming a modern-day Luddite is clearly not the answer. But neither is rolling over without considering the consequences of these new developments. The issue is not repealing the digital revolution but establishing public interest guardrails for its behavioral results.

The amazing technology of the new glasses is a classic example of how digital entrepreneurs think. The question “can we build it?” supersedes asking, “what are the consequences if we build it, and how can they be mitigated?” Facebook appears to have anticipated the need to address privacy as part of its product rollout, but the questions raised call for more than public relations.

To Facebook’s credit, its Responsible Innovation Principles profess a commitment to “building inclusive, privacy-centric products.” The principles, however, push much of that responsibility onto users. Principle 1 is “Never surprise people”; while being “transparent about how our products work and the data they collect” is a worthwhile goal, it does not prevent an unwanted surprise in the elevator. Principle 2 is “Provide controls that matter” that “put people in charge of their experience,” which in this case appears to mean the LED warning light. Principle 3 is “Consider everyone,” specifically, “we also need to consider people who aren’t using our products.” Principle 4 is “Put people first,” defined as being “responsible stewards of people’s data.”

Just what do the words in the last principle mean? “[W]e treat it [personal data] with the sensitivity it deserves…[and] take precautions with particularly personal types of data” is the explanation. Beyond the lofty words, however, what will be done with the data that can be secretly collected?

We have all experienced how Facebook and others harnessed the Web to siphon away our personal information, both virtual and physical, even when we are not using their products. Now comes the ability for wandering collectors of data to feed information into Facebook’s servers. To its credit, the company has said it will not use the data created by the glasses for its more traditional aggregation and targeting activities. The digital images and audio captured by the glasses are stored on the device rather than automatically uploaded to Facebook. The wearer makes the offload decision, but once made, the data still ends up on Facebook’s servers when the user uploads it through an app tied to that user’s Facebook account.

“We take your privacy and security seriously,” the glasses’ website promises. Well, at least they’re talking about it. But this, of course, is the same company that in 2014 promised regulators it would run WhatsApp as a separate company, segregating and protecting consumer data, then—acting unilaterally in 2021—did just the opposite. This is a company whose business model is built on the use of personal information.

Beyond Brandeis

In the 20th century we moved beyond Brandeis’ fears about cameras. The 21st century, however, opens a whole new set of issues around the use of personal data. Facebook makes clear that the glasses, which it describes as “first-generation smart glasses,” are an entry gateway. We know what is coming: augmented and virtual reality.

Rather than repeating what we experienced with the Web—awakening after the horse has left the barn—we should be establishing policies today for the use of the data created by these even newer technologies. This does not mean some kind of heavy-handed regulation, but it does mean more than allowing the technologists to unilaterally make the rules.

In its early iterations, the internet’s ability to spy on users caught everyone off guard. Facebook’s Ray-Ban Stories glasses now provide an opportunity to get ahead of the issue for next-generation wearable technology. It is commendable that companies such as Facebook say they will be sensitive to personal privacy, but that is not sufficient. We have seen how companies’ “privacy policies” are less about protecting a user’s privacy and more about invading it.

Let’s take Facebook (and others) at their word about protecting individual privacy and develop federally enforceable behavioral standards for the use of the data created as we use these amazing new products.

Facebook and Google are general, unrestricted donors to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the author and not influenced by any donation.