Tech tools help track smoky air pollution
Over the fall, as wildfire smoke shrouded the Bay Area and people rushed to buy face masks, a San Francisco company called Aclima seized the chance to track the pollution. It sent two cars bristling with air-monitoring equipment northward toward the fire zone. One made a notable discovery: In the East Bay hills above El Cerrito, smoke on that day was far stronger than in the flatlands.
“It was just like, you passed a boundary, and the concentration was definitely higher,” said Melissa Lunden, the company’s chief scientist, who followed the results remotely.
With wildfire danger expected to rise further over the years as global warming intensifies, the race is on to produce more detailed maps of smoke pollution. Regional authorities have stationed 16 permanent monitors for fine particles — an especially harmful ingredient of wildfire smoke — around the Bay Area. But low-cost sensors, satellites and other tools are rapidly evolving. In future fires, people in San Leandro may be able to look at their phones and know that the air is healthy enough to exercise even if Berkeley’s air remains polluted.
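The number such a phone app would display is typically the U.S. EPA’s Air Quality Index, a piecewise-linear rescaling of the measured fine-particle (PM2.5) concentration. A minimal sketch using the EPA’s published PM2.5 breakpoints (the function name and sample values are ours):

```python
# U.S. EPA Air Quality Index for PM2.5, from the published breakpoint table:
# (C_lo, C_hi, I_lo, I_hi, category).
PM25_BREAKPOINTS = [
    (0.0,    12.0,    0,  50, "Good"),
    (12.1,   35.4,   51, 100, "Moderate"),
    (35.5,   55.4,  101, 150, "Unhealthy for Sensitive Groups"),
    (55.5,  150.4,  151, 200, "Unhealthy"),
    (150.5, 250.4,  201, 300, "Very Unhealthy"),
    (250.5, 350.4,  301, 400, "Hazardous"),
    (350.5, 500.4,  401, 500, "Hazardous"),
]

def pm25_aqi(concentration):
    """Map a PM2.5 concentration (micrograms per cubic meter) to (AQI, category)."""
    c = round(concentration, 1)
    for c_lo, c_hi, i_lo, i_hi, label in PM25_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            # Linear interpolation within the bracketing breakpoint band.
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo), label
    return 500, "Hazardous"  # off the top of the scale

print(pm25_aqi(8.0))    # clean enough to exercise: (33, 'Good')
print(pm25_aqi(180.0))  # heavy wildfire smoke: (230, 'Very Unhealthy')
```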
“It’s kind of a little bit the Wild West out there,” with lots of new products and technology, said Frank Freedman, a research scientist with the Center for Applied Atmospheric Research and Education at San Jose State University. He is working with NASA to help build neighborhood-scale pollution maps with satellite data.
Unlike Los Angeles and the Central Valley, the Bay Area generally meets annual federal standards for fine particle pollution, which can bury itself in people’s lungs and contribute to asthma and other respiratory problems. But wildfires deviate from the usual pattern: They can send plumes of smoke across cities that are hundreds of miles away.
Besides the bad air in October, a string of eight consecutive Spare the Air days in the Bay Area in early December occurred, in small part, because of smoke drifting northward from the Southern California wildfires. Wildfires, however, were not a factor in the current run of bad air.
Government pollution monitors are generally highly accurate. The problem is that they are stationary, so they cannot capture how pollution varies from neighborhood to neighborhood. And there aren’t many of them, because they are expensive: $200,000 to $250,000 apiece, estimates Eric Stevenson, director of meteorology and measurement at the Bay Area Air Quality Management District, which handles regional air monitoring. That figure includes not only fine-particle monitoring, but also equipment for measuring other pollutants such as ozone and nitrogen oxides.
Basic sensors can cost just $20 or $30, according to John Volckens, a professor of mechanical engineering at Colorado State University, though companies often sell them for a few hundred dollars after adding battery power and Wi-Fi or cellular connections.
San Francisco has only one permanent government fine-particle monitor. Santa Rosa, the largest city in fire-devastated Wine Country, has not had one since the air district lost its lease on a monitoring site there in late 2013. After failing to quickly find another Santa Rosa location, the district sent the monitor 10 miles down the road to Sebastopol. When the fires struck, authorities installed temporary fine-particle monitors at sites in Wine Country, including some at schools such as John B. Riebli Elementary in Santa Rosa, to make sure that harmful dust or ash did not spike at the sites during the cleanup. So far it has not, according to Stevenson.
Government monitors have other problems. One in Napa got knocked out of service for about a day during the fires because of the power failure. Another quirk: There is a natural lag in the flow of information from the monitors, as the heavy-duty equipment finalizes the scientific readings, so information conveyed to the public generally lags real-time readings by an hour or more. More peculiarly, during daylight-saving time — which includes October, when the fires occurred — the readings provided to the public were generally at least two hours late. That is problematic because a change in the wind direction could shift wildfire smoke with remarkable speed.
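The daylight-saving quirk is easy to reproduce. In this hypothetical sketch (invented timestamps, an assumed one-hour processing lag), a monitor stamps its readings in local standard time year-round, so during daylight time the stamp reads an hour behind the wall clock and the lag looks doubled:

```python
from datetime import datetime, timedelta, timezone

PST = timezone(timedelta(hours=-8))  # Pacific Standard Time, used year-round by the monitor
PDT = timezone(timedelta(hours=-7))  # Pacific Daylight Time, the October wall clock

# A reading taken at 2:00 p.m. and finalized after a one-hour processing lag.
sample_time = datetime(2017, 10, 9, 14, 0, tzinfo=PST)
published_at = sample_time + timedelta(hours=1)

# A reader naturally interprets the "2:00 p.m." stamp as wall-clock (PDT) time,
# so by publication the reading already appears two hours old.
apparent_delay = published_at.astimezone(PDT) - sample_time.replace(tzinfo=PDT)
print(apparent_delay)  # 2:00:00
```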
Sensors, by contrast, can provide real-time information across a vast geographic area. The challenge, Volckens said, is that they’re generally “just not good enough yet to provide high-fidelity data.”
However, with technology advancing and getting dramatically cheaper, “It’s likely that a decade from now, the low-cost devices will be performing as well as” a type of monitor governments use, he said.
Information from satellites is improving too. Two new satellites that can monitor smoke have gone into orbit in the past 14 months. Another, which will be geostationary over the western United States, is scheduled to launch in March.
“Essentially we have three new satellites that are bringing unprecedented resolution and quality of data,” said Susan O’Neill, a research scientist with the U.S. Forest Service. Satellites traditionally have trouble distinguishing pollution at a high level from that near the ground, but scientists’ ability to overcome that problem is improving, she said.
On the ground, one company that scientists are watching closely is Aclima, founded in 2007. It operates four Google Street View cars — the companies have a partnership — from a cavernous garage on an Embarcadero pier. Inlets on top of the cars draw in fine particles and other pollutants (when the vehicles go electric, it will be even easier to avoid pulling in air pollution from their own exhaust, Lunden noted). The cars’ pollution-monitoring equipment includes both lab-grade measuring capabilities and low-cost sensors, which are checked against the former for accuracy.
Google is its primary customer for outdoor monitoring, according to Aclima, which monitors the indoor air quality at several Google buildings. The company did not detail how much funding it has, saying that “founder bootstrapping” and customer revenue helped the company at an early stage.
Other companies are piling into the pollution monitoring market too. The South Coast Air Quality Management District, which includes most of Los Angeles, is testing dozens of sensors to see how their readings correlate with those of standard, government-grade monitors.
Particle sensors tend to perform relatively well, said Andrea Polidori, the air district’s atmospheric measurements manager, as do ozone sensors.
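Co-location tests like South Coast’s generally come down to regressing the low-cost sensor’s readings against the reference monitor’s and checking the correlation. A minimal sketch with invented hourly PM2.5 values (an ordinary least-squares fit, not any district’s actual protocol):

```python
# Invented co-located hourly PM2.5 readings (micrograms per cubic meter).
sensor    = [8.0, 15.0, 40.0, 120.0, 65.0, 22.0]   # low-cost sensor
reference = [6.5, 12.0, 33.0, 100.0, 54.0, 18.0]   # government-grade monitor

n = len(sensor)
mx, my = sum(sensor) / n, sum(reference) / n
sxx = sum((x - mx) ** 2 for x in sensor)
syy = sum((y - my) ** 2 for y in reference)
sxy = sum((x - mx) * (y - my) for x, y in zip(sensor, reference))

slope = sxy / sxx               # calibration gain
intercept = my - slope * mx     # calibration offset
r = sxy / (sxx * syy) ** 0.5    # Pearson correlation with the reference

# Apply the fitted calibration to correct future sensor readings.
corrected = [slope * x + intercept for x in sensor]
print(f"gain={slope:.2f} offset={intercept:.2f} r={r:.3f}")
```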
Stevenson said that sometime in 2018, the Bay Area air district hopes to have guidance and technical expertise available to local communities that want to test out low-cost air monitors.
One particle-measuring technology that has impressed regional air district officials is PurpleAir, a laser-based sensor system made by a Utah company that costs in the neighborhood of $200. It uploads data to the cloud and displays it on a world map.
Lunden, of Aclima, has one at home. During the October fires, she noticed that one day it reported the indoor concentration of particles as high, at a time when the outdoor pollution was diminishing. So she knew to open the windows to air out the house.
“I really enjoyed playing around with it,” she said, noting that she — like many in the Bay Area — could “really feel” the smoke.
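The judgment Lunden made by eye, ventilate when the outdoor air is meaningfully cleaner than the indoor air, can be sketched as a one-line comparison. The function and margin below are our own illustration, not a feature of PurpleAir:

```python
def should_ventilate(indoor_pm25, outdoor_pm25, margin=5.0):
    """Suggest opening windows when outdoor PM2.5 (micrograms per cubic
    meter) is lower than indoor PM2.5 by more than a small margin."""
    return outdoor_pm25 + margin < indoor_pm25

# Indoor smoke lingering while the outdoor plume moves on:
print(should_ventilate(indoor_pm25=48.0, outdoor_pm25=20.0))   # True

# Smoke still heavy outside; keep the windows shut:
print(should_ventilate(indoor_pm25=15.0, outdoor_pm25=160.0))  # False
```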
At Aclima, the company has consolidated a set of sensors, for fine particles and other pollutants, into a container the size of two shoeboxes. It has enough confidence in its measurements that this year, it will send out the sensors without the lab-grade equipment it currently uses to verify their readings.
Meanwhile, the company is gearing up to watch for wintertime smoke, as its cars continue to roam Bay Area streets.
“Everyone around Christmas and New Year starts to burn wood in their fireplaces,” Lunden said. “You can definitely see that signal.”