Recently, Van Alen Institute and Columbia GSAPP’s Cloud Lab hosted a tech demo and dialogue that asked, “How Does the Brain Respond to the City?” That the event was posed as a question hinted at its initial answer: We’re only just beginning to understand.
“An era of crowd-sourced science is coming at us 1,000 miles an hour,” said Cloud Lab co-director Mark Collins to the event’s packed house at ISSUE Project Room, referencing a 2011 white paper and subsequent research that predicts 50 billion internet-connected devices will be in use by 2020. Using brain-computer interfaces (BCI), that “device deluge,” as Collins calls it, will include scores of new wearable technology to help us collect biofeedback. The data collected, he hopes, will help us better understand our neural responses to the urban environment.
Two BCI projects—MindRider and OpenBCI—were featured at the Van Alen event, part of its spring Elsewhere series investigating urban well-being and the effects of the city on the mind and body. Context for those projects came from the expertise of Nancy Wells, an environmental psychologist; Dave Jangraw, a neuroscientist; and Collins and his co-director at Cloud Lab, Toru Hasegawa (both are trained as architects). After presentations from all researchers, it became clear that environment-reading BCI technology is in a nascent state, still at the mercy of “noisy” data and the complexity of urban conditions, among other challenges.
However, OpenBCI, an open-source, low-cost hardware platform that reads EEG-based signals, promises an answer to the electrical noise caused by movement: co-founders Joel Murphy and Conor Russomanno showed off Spiderclaw, a customizable, flexible headgear that adjusts to a user’s scalp.
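A better-fitting headset is the hardware half of the noise problem; the software half is typically handled by filtering the raw signal before analysis. As a toy illustration only (real EEG pipelines use bandpass filters and artifact rejection, none of which is described by the presenters here), a simple moving average shows the basic idea of smoothing jittery readings:

```python
# Toy sketch of smoothing a noisy 1-D signal, the kind of cleanup an
# EEG pipeline might do before interpreting readings. Purely
# illustrative; not OpenBCI's actual processing chain.
def moving_average(signal: list[float], window: int = 3) -> list[float]:
    """Smooth a signal with a trailing sliding window of up to `window` samples."""
    smoothed = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        chunk = signal[lo:i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

raw = [1.0, 9.0, 2.0, 8.0, 3.0]   # jittery readings
print(moving_average(raw))         # the spikes flatten out
```

The trailing window keeps the function simple at the cost of a slight lag; a real filter would also be tuned to the frequency bands of interest.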
MindRider takes this one step further by covering the whole head. A BCI bicycle helmet, it uses GPS and EEG sensors to track (and sometimes broadcast) a cyclist’s mental state. With the noise reduced, it’s easier to see how a MindRider cyclist’s brain reacts to different parts of her journey. DuKode Studio co-founder Arlene Ducao and designer Josue Diaz hope that the data collected through the helmet might one day be used to aid transportation activists or others advocating for calmer streets.
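The core of the MindRider idea, pairing timestamped EEG readings with GPS fixes so that stressful stretches of a route can be located on a map, can be sketched in a few lines. The data structure and field names below are hypothetical, not drawn from the actual MindRider hardware or software:

```python
# Hypothetical sketch of pairing mental-state readings with GPS fixes,
# in the spirit of MindRider. Field names and threshold are invented
# for illustration.
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp: float   # seconds since ride start
    lat: float
    lon: float
    attention: float   # normalized: 0.0 (calm) to 1.0 (stressed)

def stressful_segments(ride: list[Sample], threshold: float = 0.7) -> list[Sample]:
    """Return the GPS-tagged samples where the signal exceeded the threshold."""
    return [s for s in ride if s.attention > threshold]

ride = [
    Sample(0.0, 40.7033, -73.9881, 0.20),   # quiet side street
    Sample(30.0, 40.7025, -73.9903, 0.85),  # busy intersection
    Sample(60.0, 40.7018, -73.9920, 0.40),
]
hotspots = stressful_segments(ride)
print([(s.lat, s.lon) for s in hotspots])
```

Aggregated across many riders, a list of such hotspots is the kind of quantitative evidence Ducao and Diaz imagine handing to transportation advocates.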
“We know a lot of things qualitatively,” said Ducao. “Wouldn’t it be great if we could add quantitative data?”
Still, so much happens in an urban moment—even momentary changes in the weather—that the first steps toward quantification are just being taken.
“The real world is way more complicated,” neuroscientist Dave Jangraw said, responding to a question about his controlled, if isolated, lab research, in which he uses a BCI to track how interesting people find certain objects in virtual, “naturalistic” environments. He recommended introducing urban factors slowly, in the hope of better understanding causality.
Cloud Lab remains ambitious about making connections in real-world situations, and, prior to this event, it partnered with Van Alen on a data-collecting tour of Brooklyn’s DUMBO neighborhood. Fifty participants walked around the area fitted with EEG sensors that collected their electrical reactions to their environment. Collins presented a visualization of that data, explaining that the lab aims to re-associate what happens in the visual field with what happens in the brains of the participants—projecting a person’s neural activity back out to the architecture they experience.
Toward the end of the night, presentations gave way to a lively Q&A on the as-yet unknown universe of BCI, and a crowd buzzed around during demo hour, eagerly tinkering with the experimental hardware.