When it comes to reporting on air quality, the mission should be to get that reporting right the first time, every time. In reality, though, that doesn’t always happen. Reference “Why reporting accurately air quality information is crucial” and below.
Today the discussion continues.
Nearly nine months ago, I wrote: “In a nutshell: The good and bad of Valley air-pollution reduction.”
Toward the end of that piece I wrote:
“An 80 percent reduction in [San Joaquin] Valley air pollution levels overall sounds impressive. And, not a single exceedance of the federal one-hour standard of 125 parts per billion of ozone anywhere in the Valley this year to boot? Awesome! But the more stringent federal and state [California] – eight-hour ozone standards – ones that are more protective of human health – have yet to be met and meeting these standards seems years away at best, decades away at worst.
“Add to this that Valley fine particulate matter pollution also persists.
“And what this means is the Valley is not out of the air-pollution woods just yet and what that sounds like is in getting that additional 20 percent air-pollution reduction is not going to be easy.”
Complicating matters is when numbers from one year to the next don’t jibe.
What I am getting at, exactly, is that in comparing the separate 2012-13 and 2013-14 editions of the San Joaquin Valley Air Pollution Control District’s “Report to the Community,” certain figures appear to disagree.
For instance, in the 2012-13 Report’s “County Days over Federal 8-hr Ozone Standards” graph, the numbers for year 2012 are roughly 105 days over the 1997 standard of 84 parts per billion (ppb) (blue plot) and roughly 320 days over the 2008 standard of 75 ppb (red plot). Meanwhile, in the like graph in the 2013-14 edition, the year-2012 numbers are right around 150 (blue plot) and 375 (red plot). It’d be one thing if this were the only conflict, but it’s not.
In the same two reports and graphs, such is the case again. In the 2012-13 Report, the number of “County Days over Federal 8-hr Ozone Standards” exceeding the 2008 standard of 75 ppb (red plot) is 320, and that number (320) is the same for both years 2011 and 2012. Not so in the 2013-14 Report regarding those same two years: the number for year 2011 is still 320, but the number for year 2012 is 375. A similar situation exists in the two reports’ blue plots, also for years 2011 and 2012. So, why the above discrepancies? I’m not even going to speculate.
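The report-to-report comparison above can be sketched in a few lines of code. This is only an illustrative check, using the approximate values read off the graphs as cited in this article, not official District data:

```python
# Approximate "County Days over Federal 8-hr Ozone Standard" readings from the
# graphs in the two District report editions, keyed by (year, standard).
# Values are the rough figures cited in this article, not official data.
report_2012_13 = {
    (2012, "1997 std, 84 ppb"): 105,
    (2012, "2008 std, 75 ppb"): 320,
    (2011, "2008 std, 75 ppb"): 320,
}
report_2013_14 = {
    (2012, "1997 std, 84 ppb"): 150,
    (2012, "2008 std, 75 ppb"): 375,
    (2011, "2008 std, 75 ppb"): 320,
}

def find_discrepancies(a, b):
    """Return entries present in both reports whose values disagree."""
    return {k: (a[k], b[k]) for k in a.keys() & b.keys() if a[k] != b[k]}

diffs = find_discrepancies(report_2012_13, report_2013_14)
for (year, standard), (old, new) in sorted(diffs.items()):
    print(f"{year} ({standard}): 2012-13 report ~{old}, 2013-14 report ~{new}")
```

Run against these readings, the check flags both year-2012 entries as disagreeing while the year-2011 entry matches, which is exactly the pattern described above.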
Meanwhile, another curiosity and a seeming inconsistency, this time related to “PM2.5 Trends”: in the 2012-13 Report, both the “24-hour PM2.5 Design Value Trend” and “Annual PM2.5 Design Value Trend” graphs show the last year as 2011, but the like graphs in the 2013-14 Report show the last year as 2013. Again, I do not care to speculate as to why the difference. It seems to me, though, that if the latter Report’s graphs end at year 2013, logic would have it that the former Report’s graphs should have ended at year 2012.
The bottom line is that when there are inconsistencies, it is difficult to know just what information is accurate and what isn’t, leading one to question the accuracy of all of it.
Remember: These being annually released “Report to the Community” reports, every attempt at accurate reporting and corresponding information dissemination should be made. If that is not striven for, why report at all? All of which supports the whole premise of this dialog: it is crucial to report air quality information accurately, regardless of location and of who is doing the reporting.