A Missing Link in Making Meaning from Air Monitoring?
Gwen Ottinger, Drexel University
16 April 2016
What’s in the air in the neighborhoods closest to oil refineries? In general, we don’t know. Ambient air monitoring in “fenceline communities” has historically been neglected by regulators, who are charged with collecting data representative of an area’s overall air quality and who rarely measure the air of its most polluted neighborhoods.
But advocacy for better monitoring is slowly expanding the data available to fenceline communities. There are two refineries in the San Francisco Bay area where continuous monitoring for chemicals like benzene, sulfur dioxide, and hydrogen sulfide is taking place, and community members can access the data on a website.
However, “available” does not necessarily mean “accessible” when it comes to data. Taking a reading every five minutes, the monitors create datasets that break the capacities of spreadsheet programs like Excel. As a result, remarkably little is happening with the data—even though it has the potential to show regulatory violations and/or harmful levels of exposure to toxins.
In our Meaning from Monitoring project, we set out to learn how these data can become a more important part of the public conversation about air quality in fenceline communities. Framing the problem in this way got us to a broader question: what does it mean to reshape data cultures to enable community participation?
We knew that the data cultures in air quality policy circles revolve around particular forms of calculation. Air quality standards compare averages to a threshold; when ambient concentrations of a chemical, averaged over a certain time period, exceed the threshold, the standard is violated. For years, community activists have objected to this approach. As one explained, “If you take the average temperature of water running over your hand, but for 30 seconds it is boiling hot, you still get burned!”
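The “boiling water” objection can be put in concrete terms with a quick calculation. (All numbers below are invented for illustration; they are not actual monitoring data or a real standard.)

```python
# Hypothetical example: a one-hour average can hide a short spike.
# Twelve invented 5-minute benzene readings, in parts per billion (ppb).
readings = [2, 2, 3, 2, 2, 2, 40, 2, 2, 3, 2, 2]  # one 5-minute spike to 40 ppb

hourly_average = sum(readings) / len(readings)
threshold = 9  # invented hourly standard, ppb

print(f"hourly average: {hourly_average:.1f} ppb")    # 5.3 ppb
print("violation?", hourly_average > threshold)       # False: the spike vanishes
print("peak 5-minute reading:", max(readings), "ppb") # 40 ppb
```

The averaged value sits comfortably under the threshold even though one reading was more than four times the threshold, which is exactly the activist’s point about the 30 seconds of boiling water.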
One advantage of having continuous, real-time monitoring is that relying on averages is no longer necessary. However, how to do it differently, and take advantage of the new-found data granularity, is far less clear.
To work through this question, we organized a one-day workshop on April 2nd, 2016 in Richmond, California, together with activists working on refinery issues. We did a bit of digging into the data ourselves ahead of time, and started to think about the data in terms of “incidents”—moments when the water suddenly went scalding, to borrow the metaphor—rather than thresholds and averages. We imagined that incidents could emerge from the monitoring data or be assembled by community members using multiple data sources, including their own experiences.
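One minimal way to sketch the incident framing is to flag contiguous runs of readings above a level of concern, rather than comparing an average to a threshold. The readings and the level below are invented for illustration and assume nothing about the actual monitoring feed.

```python
# Sketch of the "incident" idea: flag contiguous runs of 5-minute readings
# that sit above a level of concern (all values here are hypothetical).
def find_incidents(readings, level):
    """Return (start_index, end_index) pairs for consecutive readings above level."""
    incidents, start = [], None
    for i, value in enumerate(readings):
        if value > level and start is None:
            start = i  # an incident begins
        elif value <= level and start is not None:
            incidents.append((start, i - 1))  # the incident ends
            start = None
    if start is not None:  # incident still in progress at the end of the data
        incidents.append((start, len(readings) - 1))
    return incidents

readings = [2, 2, 3, 2, 15, 18, 12, 2, 2, 25, 2, 2]  # invented 5-minute values
print(find_incidents(readings, 10))  # [(4, 6), (9, 9)]
```

Even this toy version surfaces what averaging hides: a sustained 15-minute episode and a brief one-reading spike each show up as a discrete event that community members could then connect to other sources, such as their own observations.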
When we discussed our idea in the workshop, it resonated with community members—not a surprise, given that it was modeled on the ways that community activists already talk about the refineries’ effects. What did surprise us was that an entirely different approach to giving the data meaning seemed far more urgent.
During the course of the workshop, we had offered community members examples of what changing the approach to data would look like. Among them was the “Shenango Channel,” a website which our collaborator Randy Sargent, an engineer at Carnegie Mellon University’s CREATE Lab, helped to create. The Shenango Channel shows a video feed of a smokestack alongside time series data of the pollutants believed to be coming from it; the user can fast-forward through the video and see the changes in air quality, or click on points of interest on the graph and see an image of what the stack looked like at the time. We thought the site was powerful, and activists participating in the workshop agreed.
So when it came time to identify next steps, we listed creating a video feed—which would entail getting a camera up and running—separately from putting the monitoring data onto an interactive graph. For community members, though, the distinction made no sense. The juxtaposition of the sensor data and the visual data gave both meaning: peaks on the graphs showed up as a dark plume in the picture, and smoke emanating from the stack became, through the graph, particulate matter being breathed into human lungs. In effect, the video connected the data-heavy language of policymaking to breath and bodies.
Of course, adding a video to animate the sensor data doesn’t guarantee the data will become any more useful to communities. (It’s also not yet clear whether a low-cost camera can show most refinery emissions.) One experienced activist argued against prioritizing cameras, pointing out that they would proliferate data in a situation where the problem is not too little data but too much. Given the group’s enthusiasm, though, we’ve committed to working with them to get cameras up and running. Time will tell whether they are indeed a missing link, or just another aspect of information overload.
Gwen Ottinger is an Assistant Professor in the Department of Politics and the Center for Science, Technology, and Society at Drexel University. Her ethnography of a Louisiana refinery community, Refining Expertise: How Responsible Engineers Subvert Environmental Justice Challenges, was awarded the 2015 Rachel Carson Prize.