Saturday, June 23, 2012

Comments on John Mashey's Pal Review and Skeptical Science (Part 3)

Lessons in Moderation / Censorship and Closed Minds on Skeptical Science 

These two topics are quite closely linked, so we'll cover both in one post.

Background

I have participated in probably a dozen or more on-line forums, and I've never once come across one that had so many editorial rules.  In fact, I can't recall having a single post blocked or edited in any forum, so this was a new experience for me.  I do realize that moderators have a purpose.  They can help keep the discussion on track, and they certainly should edit out profanity, random posts, advertisements, and other forms of spam.  

Rufus9 did not start off too well with the forum when he called Mashey's paper "3,000 wasted words" without providing a basis for that statement.  And, in retrospect, the words were not wasted.  As I have now more fully explained in the first post, I just think the words aren't all that valuable.  So, at first, the Moderator did his job correctly, and I think the censorship of my first post was justified.  Later posts seemed to have more and more arbitrary rules, somewhat designed to keep "outside ideas" from entering the forum, or so it seemed to me.

Reasonable Debate?

One area where the moderation of this forum seems to block reasonable debate is in the exclusion of information from other areas of science.  At one point, I received this warning from the moderator:
"offer links to the supportive (climate science) literature to support your points, discuss the material raised by others to counter your previous assertions (which you have avoided doing) or concede them."
In response, I wrote something along the lines of:
"My views of science are broader than climate science and many of my references have been deleted..."
And the Moderator snipped that part of my response.  Wow.  In part, that was why I asked for a fuller deletion of one of my posts, since the context of a statement is very important when stating an argument.

Don't Use Analogies or Studies from Other Fields of Science

The Moderator and members of the forum were especially sensitive to commentary from other fields of science, such as critiques of the peer review process, or references to the field of medicine as an analogy.
"As noted by others earlier, the field of medicine is not a good comp to that of climate science, for a variety of reasons."
For example, part of my argument about Mashey's paper is that it is analogous to an observational study in medicine - capable of producing a hypothesis, but not capable of showing causality.  Others in the debate then raised the case that sometimes these studies are helpful.  I agreed that they could be, and as support, I offered this quote from an article by Gary Taubes.
"Moreover, this meat-eating association with disease is a tiny association. Tiny. It’s not the 20-fold increased risk of lung cancer that pack-a-day smokers have compared to non-smokers. It’s a 0.2-fold increased risk — 1/100th the size. So with lung cancer we could buy as a society the observation that cigarettes cause lung cancer because it was and remains virtually impossible to imagine what other factor could explain an association so huge and dramatic. Experiments didn’t need to be done to test the hypothesis because, well, the signal was just so big that the epidemiologists of the time could safely believe it was real."
As we can see, if you have a massive signal like that of the impact of smoking on health, then we might agree that we have causation.  What the Moderator left in was my following statement, with the context mostly destroyed:

"So, while some may see AGW as the 20-fold risk of smoking, there are credible sources who do not see it that way."
In my view, Mashey's paper does not even come close to producing that sort of 20-fold signal, so it is dangerous to imply causality from it.

They also clipped the following links that ARE within the bounds of climate science.

16 Scientists - No Need to Worry

Bob Murphy's Critique of Nordhaus


Don't Be Afraid of "Other Science" or PUBMED

One reason to read about other areas of science is to see whether concepts or ideas in one field of science may somehow solve a problem in another field.  Imagine you are a scientist working on some intractable problem.  Should you A) send the problem to other scientists in your field of study, or B) post it on the Internet for random individuals to see?   If you choose A, you might be very interested in listening to this EconTalk podcast with Jonah Lehrer on creativity and where our imagination comes from.

http://www.econtalk.org/archives/2012/06/jonah_lehrer_on.html

This was a fascinating discussion that showed that problems in science are rarely solved by people directly in the field.  You can skip ahead to the 41:56 mark to hear this discussion of Eli Lilly and how they and other companies are solving "impossible" problems.
"... a bunch of Fortune 500 firms like Procter and Gamble, Lilly, Pfizer, Kraft, General Electric--companies with huge R&D budgets, in the billions of dollars. And they post a bunch of hard-to-solve problems online. And what's quite astonishing as an incentive is their success rate, anywhere between 40-60% of the problems are solved within 6 months. And they are solved by these strangers working in their spare time. The most interesting incentive, though, in the return to this outsider thinking, is that the problems, we look at who they are solved by. They are almost never solved by someone inside that same field."
Another part of the argument was the discussion of the peer review process and flaws in general with published science.  So, what might PUBMED have to say about this?  And is it relevant to climate science?  Yes and Yes, as we shall see:

First, we have this paper - it is a very interesting piece that covers reasons why research might be false, even if we exclude bias from the equation.

Why Most Published Research Findings Are False

Next, one suggestion I made to counteract the flaws of peer review is something similar to the Cochrane Collaboration [link] for the review of scientific papers.  Cochrane conducts meta-studies and seeks to improve the accuracy of science.

Oh, but wait, are these studies themselves potentially flawed?   Why yes they are.  And, has anyone thought of a way to address some of the flaws?  Why yes, they have.

Detecting Bias in Meta-Analysis

Seems like reading papers from outside one's home field of science might be a good thing.

The Spreadsheet Error

Finally, let's discuss the simple error in Mashey's spreadsheet.  He was attempting to calculate the average of a column of numbers to see whether there might be some "express review" process that contributed to the publishing of the papers de Freitas reviewed.  The cell's formula calculates that average as 235.   (The formula averages cells M53:M715, when it should be M5:M715 - this has not been corrected as of 23 June.)

Since I have an audit background and know that many spreadsheets have errors, it was probably two minutes after I opened the spreadsheet that I noticed that the range of cells covered by the formula did not include all of the data.  When I corrected the formula, I got a different answer (233 days vs. 235 days).
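This kind of range check is easy to automate.  Here is a minimal sketch in Python of the idea - the function name and the hard-coded row numbers are mine, for illustration only; a real audit would pull the formulas and data extents out of the workbook with a library such as openpyxl rather than hard-coding them:

```python
import re

def check_average_range(formula, data_first_row, data_last_row):
    """Hypothetical audit check: does an AVERAGE formula cover the
    full data column?  Returns a list of warnings (empty if it does)."""
    m = re.match(r"=AVERAGE\(([A-Z]+)(\d+):([A-Z]+)(\d+)\)$", formula)
    if not m:
        return ["unrecognized formula: " + formula]
    col_start, row_start = m.group(1), int(m.group(2))
    col_end, row_end = m.group(3), int(m.group(4))
    warnings = []
    if col_start != col_end:
        warnings.append("range spans multiple columns")
    if row_start != data_first_row or row_end != data_last_row:
        warnings.append(
            f"formula covers rows {row_start}-{row_end}, "
            f"but the data occupy rows {data_first_row}-{data_last_row}")
    return warnings

# Mashey's cell: the data run from row 5 to row 715,
# but the formula only averages rows 53-715.
print(check_average_range("=AVERAGE(M53:M715)", 5, 715))  # flags the mismatch
print(check_average_range("=AVERAGE(M5:M715)", 5, 715))   # no warnings
```

Even a simple script like this, run over every formula cell in a workbook, would have flagged the truncated range in seconds.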

The point of this is not that this was a fatal error, but that there is a good reason to read work outside of climate science to see if we can come up with a protocol that avoids errors like this.  I wonder if anyone has thought about this problem?  Why yes, the Dartmouth Tuck Spreadsheet Engineering Research Project (SERP) has been thinking about this. [link]

They have written this paper about the problem [link]

and they even developed an 11-step auditing program that would likely have caught the error.  (Specifically, Step 8: "Review all formulae not already reviewed for errors.")

While this error was relatively inconsequential, sometimes they are not, such as in this case in medical research where an entire column of data was mishandled.

Cardiologists retract... columns of data misaligned

So, hopefully any scientist whose daily job involves handling data in spreadsheets will be open-minded enough to take a look at this auditing protocol and implement it.  Maybe they could even collaborate with colleagues in their MBA or business programs, since those students need practical experience conducting audits.

Conclusions?
I'm going to close this piece with a general discussion of how we often paint ourselves into a corner and then fail to realize that there is a way out.  I received several "suggestions" from the Skeptical Science gang about "how to be a true skeptic," or how to think, what to read, and whom to trust.  There was virtually no discussion of points where our arguments agreed, or any appreciation for ideas from outside climate science that might prove useful.  They might lodge the same complaint about me, so without debating that, here is a technique for when you find yourself painted into that corner and feeling defensive.

Byron Katie has the answers [here] and [here]

The first part is what she calls 4 Questions and a Turnaround
  1. Is it true?
  2. Can you absolutely know that it's true?
  3. How do you react, what happens, when you believe that thought? 
  4. Who would you be without the thought?
Turn the thought around. Then find at least three specific, genuine examples of how each turnaround is true for you in this situation.

So let's say you believe that the polar ice caps are melting and we are all doomed.   
1?   Yes, they are melting... Yes, I think we are doomed.
2?   Absolutely know?  Well, I might have to admit that we might not be doomed.  I guess there is a shadow of a doubt.  People in Denver might be okay.
3?   React?  I get very worried.  I want to pass carbon taxes.  I want to recycle everything.  I want big oil companies to....  and I really want to tell those global warming skeptics to....
4?   Who would I be if I was not worried?   I'll let you answer that for yourself.
Turnaround:  What would you do differently if you did not think we were doomed?  Anything?  Are you less stressed?  Good.

And, be careful about judging me on the basis of this post.  Byron Katie is onto the judgmental stuff, too.




