An analysis of water, sediment and seafood samples taken in 2010 during and after the oil spill in the Gulf of Mexico has found higher contamination levels in some cases than previous studies by federal agencies did, casting doubt on some of the earlier sampling methods. The lead author, Paul W. Sammarco of the Louisiana Universities Marine Consortium, said that the greater contamination called into question the timing of decisions by the National Oceanic and Atmospheric Administration to reopen gulf fisheries after the spill, and that “it might be time to review the techniques that are used to determine” such reopenings. Some areas were reopened before the well was capped three months after the blowout. (http://www.nytimes.com/2013/08/20/science/earth/new-analysis-of-gulf-oil-spill.html?ref=us)
But the study found higher levels of many oil-related compounds than earlier studies by NOAA scientists and others, particularly in seawater and sediment. The compounds studied included polycyclic aromatic hydrocarbons, some of which are classified as probably carcinogenic, and volatile organic compounds, which can affect the immune and nervous systems.
“When the numbers first started coming in, I thought these looked awfully high,” Dr. Sammarco said, referring to the data he analyzed, which came from samples that he and other researchers had collected. Then he looked at the NOAA data. “Their numbers were very low,” he said. “I thought, what is going on here? It didn’t make sense.”
Because of the widespread use of dispersants during the spill — which raised separate concerns about toxicity — the oil, broken into droplets, may have remained in patches in the water rather than dispersing uniformly.
“Sampling a patchy environment, you may not necessarily hit the patches,” he said.
“To see NOAA doing this, that’s inexcusable,” said Riki Ott, an independent marine toxicologist. “It has been known since the Exxon Valdez spill that this spotty sampling does not work.”