Release Date: November 16, 2015
BUFFALO, N.Y. – The world’s premier festival for art and technology has a stand pitching a mobile phone service? This is Ars Electronica in Linz, Austria. It’s supposed to be the highroad of new media art, not a shopping mall crowding its concourse with catchpenny kiosks. The arts may require original fundraising models, but this seems intrusive, an exhibitionist among the exhibits.
But this isn’t another pop-up retailer at all.
It is a corporate fiction and its convincing appearance is part of a fiercely original performance and workshop designed to sensitize visitors to privacy and security concerns related to mobile phone infrastructure.
This is False Positive, a breakthrough work created by Mark Shepard, associate professor of architecture and media study at the University at Buffalo, with Julian Oliver, a critical engineer and artist based in Berlin, Germany, and Moritz Stefaner, who works at the crossroads of data visualization, information aesthetics and user interface design.
False Positive premiered in September at Ars Electronica, followed by performances in Belgium and Germany.
False Positive’s next performance will be Nov. 20-21 in Madrid, Spain, at Medialab-Prado, a center for media art and technology that explores forms of experimentation and collaborative learning emerging from digital networks.
“The inspiration comes out of an increasing concern for our privacy, both online and offline, emerging in the wake of the recent revelations of Edward Snowden. We wanted to find ways to help people understand and come to terms with the different ways that we trade bits of personal information for access to online services,” says Shepard, “and to demonstrate how this information about us can be gathered without our knowledge, whether it’s by mobile service providers, the NSA or law enforcement agencies.”
False Positive is a participatory work that builds unique data portraits of its participants from the digital traces left by their online activities. Each portrait is a personal mosaic of a participant’s “data body”: images of friends, interests, work history, hometown, age and gender.
“The focus is on education and learning,” says Shepard. “Among the best ways for people to learn about how their personal information can be exploited is to confront them with the range of information about themselves that they didn’t know was accessible, and to show what inferences can be made from this information.”
It is a work of art as varied as the images of people passing a mirror. But while reflections last only as long as someone stands before silvered glass, False Positive’s portraits are potentially more enduring.
The experience starts with a text message announcing the company name and its mantra: “Welcome to Candygram. Let’s get personal.”
Participants respond with their email address. This initiates a conversation via text messaging that directs them to the False Positive kiosk for a personal data consultation.
“People are incredibly interested in finding out what others know about them. Having a chance to sit down with someone one-on-one about this material is intriguing,” says Shepard. “But it also produces a bit of anxiety about what will come next. What information is out there?”
And what comes next is not always accurate.
Shepard says the accuracy of data bodies generated by predictive analytics and data mining is measured in terms of likelihood, making them prone to error. He adds that this may be harmless when optimizing a marketing campaign, but the stakes are considerably higher when dealing with predictive policing, for example.
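Shepard's point about likelihood and error can be made concrete with a standard base-rate calculation. The sketch below uses invented numbers, not data from the project: it shows that even a highly accurate predictive model, when applied to a rare trait across a large population, will flag mostly people who do not actually have that trait.

```python
def false_positive_share(prevalence, sensitivity, specificity):
    """Fraction of positive predictions that are wrong (1 - precision).

    prevalence:  how common the predicted trait actually is
    sensitivity: probability the model flags someone who has the trait
    specificity: probability the model clears someone who does not
    """
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return false_pos / (true_pos + false_pos)

# Hypothetical example: a model that is right 99% of the time,
# applied to a trait only 1 in 1,000 people actually has.
share = false_positive_share(prevalence=0.001, sensitivity=0.99, specificity=0.99)
print(f"{share:.1%} of flagged individuals are false positives")
# Roughly 91% of the people the model flags are flagged in error.
```

The arithmetic, not any flaw in the model, drives the result: with 999 unaffected people for every affected one, even a 1% error rate on the unaffected majority swamps the handful of correct hits.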
“The false positives that arise in the course of the consultation spark conversations about how these errors occur and why,” says Shepard. “Responses vary from participant to participant, but generally these are moments where the more abstract notion of a false positive is rendered palpable through specific examples people can relate to.”
The data’s fate is determined by the participant. From the start it’s understood that material collected during the process is restricted to the participant and the data consultant. Visitors can choose to have the assembled portrait deleted or retained for presentation as part of the project.
At the end of the consultation, guests receive a brochure that outlines best practices for online digital security, along with references to resources people can use to protect their information.
“We want to sensitize people to the mobile infrastructures on which we increasingly depend. What are the loopholes in privacy regulations? What information are we voluntarily and involuntarily disclosing, and what are the implications of doing so? We want to increase understanding, not just raise awareness, of how these processes work and how they relate to you personally,” he says.