BAD Science or BAD Science Journalism? – A Response to Daniel Lakens

10 02 2013

Two weeks ago there was a hot debate among Dutch tweeps on “bad science, bad science journalism and bad science communication”. This debate was started and fueled by several Dutch blog posts on the topic.[1,4-6]

A controversial post, with both fierce proponents and fierce opponents, was the one by Daniel Lakens [1], an assistant professor in Applied Cognitive Psychology.

I was among the opponents. Not because I don’t like a fresh point of view, but because of flawed reasoning and because Daniel continuously compares apples and oranges.

Since Twitter debates can’t go in-depth and lack structure, and since I cannot comment on his Google Sites blog, I pursue the discussion here.

The title of Daniel’s post is (freely translated, like the rest of his post):

“Is this what one calls good science?”

In his post he criticizes a Dutch science journalist, Hans van Maanen, and specifically his recent column [2], where Hans discusses a paper published in Pediatrics [3].

This longitudinal study tested the Music Marker theory among 309 Dutch kids. The researchers gathered information about the kids’ favorite types of music and tracked incidents of “minor delinquency”, such as shoplifting or vandalism, from the time they were 12 until they reached age 16 [4]. The researchers conclude that a liking for music that goes against the mainstream (rock, heavy metal, gothic, punk, African American music, and electronic dance music) at age 12 is a strong predictor of minor delinquency at 16, in contrast to a liking for chart pop, classical music, or jazz.

The university press office sent out a press release [5], which was picked up by news media [4,6], and one of the Dutch authors of the study, Loes Keijsers, tweeted enthusiastically: “Want to know whether a 16-year-old will suffer from delinquency? Then look at his music taste at age 12!”

According to Hans, Loes could easily have broadcast (more) balanced tweets, like “Music preference doesn’t predict shoplifting” or “12-year-olds who like Bach keep quiet about shoplifting at 16.” But even then, Hans argues, the tweets wouldn’t have been scientifically underpinned either.

In his column, Hans explains why he thinks the study isn’t methodologically strong: no absolute numbers are given; 7 out of 11 (!) music styles are positively associated with delinquency, but the correlations are not impressive: the strongest predictor (a gothic music preference) explains no more than 9% of the variance in delinquent behaviour, which can include anything from shoplifting, vandalism, fighting and graffiti spraying to switching price tags. Furthermore, the risks of later “delinquent” behavior are small: on a scale from 1 (never) to 4 (4 times or more) the mean risk was 1.12. Hans also wonders whether it is a good idea to monitor kids with a certain music taste.
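Hans’ 9% figure follows from basic statistics: the proportion of variance a predictor explains is simply the square of its correlation with the outcome. A minimal sketch (the r ≈ 0.30 value is back-calculated here from the 9% figure for illustration, not taken from the paper):

```python
def variance_explained(r: float) -> float:
    """Fraction of variance in an outcome accounted for by a Pearson correlation r."""
    return r ** 2

# A "strong" predictor with r = 0.30 explains only 9% of the variance,
# leaving 91% of the differences in delinquent behaviour unexplained:
print(round(variance_explained(0.30), 4))      # -> 0.09
print(round(1 - variance_explained(0.30), 4))  # -> 0.91
```

This is why a correlation that sounds impressive in a press release can still say very little about any individual 12-year-old.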

Thus Hans concludes “this study isn’t good science”. Daniel, however, concludes that Hans’ writing is not good science journalism.

First Daniel recalls that he and other PhD students took a course on how to peer review scientific papers. On the basis of their peer review of a (published) article, 90% of the students decided to reject it. The two main lessons Daniel learned were:

  • It is easy to criticize a scientific paper and grind it down. No single contribution to science (no single article) is perfect.
  • New scientific insights, although imperfect, are worth sharing, because they help to evolve science. *¹

According to Daniel, science journalists often make the same mistakes as the peer-reviewing PhD students: criticizing individual studies without a “meta-view” of science.

Peer review and journalism however are different things (apples and oranges if you like).

Peer review (with all its imperfections) serves to filter, check and improve the quality of individual scientific papers, (usually) before they are published [10]. My papers that passed peer review were generally accepted. Of course there were the negative reviewers, often the ignorant ones, and the naggers, but many reviewers had critiques that helped to improve my paper, sometimes substantially. As a peer reviewer myself I only try to separate the wheat from the chaff and to enhance the quality of the papers that pass.

Science journalism also has a filter function: it filters already peer-reviewed scientific papers* for its readership, “the public”, by selecting novel, relevant science and translating the scientific, jargon-laden language into language readers can understand and appreciate. Of course science journalists should put the publication into perspective (call it “meta”).

Surely the PhD students’ finger exercise resembles the normal peer review process about as much as peer review resembles science journalism.

I understand that pure nitpicking seldom serves a goal, but this rarely occurs in science journalism. The opposite, however, is commonplace.

Daniel disapproves of Hans van Maanen’s criticism, because Hans isn’t “meta” enough. Daniel: “Arguing whether an effect size is small or mediocre is nonsense, because no individual study gives a good estimate of the effect size. You need to do more research and combine the results in a meta-analysis.”

Apples and oranges again.

Being “meta” has little to do with meta-analysis. Being meta is … uh … pretty meta. You could think of it as seeing beyond (meta) the findings of one single study*.

A meta-analysis, however, is a statistical technique for combining the findings of independent but comparable (homogeneous) studies in order to estimate the true effect size more precisely. This is an important but difficult methodological task for a scientist, not a journalist. If a meta-analysis on the topic exists, journalists should take it into account, of course (and so should the researchers). If not, they should put the single study in a broader perspective (what does the study add to existing knowledge?) and show why this single study is or is not well done.
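To make the distinction concrete: a meta-analysis pools effect sizes across comparable studies, weighting each by its precision. A minimal fixed-effect (inverse-variance) sketch, with entirely made-up numbers purely to show the mechanics:

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted pooled effect and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical effect sizes (e.g. correlations transformed to Fisher's z)
# from three comparable studies, with their sampling variances:
effects = [0.30, 0.10, 0.20]
variances = [0.01, 0.02, 0.015]
pooled, se = fixed_effect_meta(effects, variances)
print(f"pooled effect = {pooled:.3f} +/- {1.96 * se:.3f}")
```

More precise studies (smaller variance) pull the pooled estimate harder, which is exactly why no single study, however loudly reported, settles the effect size on its own.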

Daniel takes this further by stating that “one study is no study” and that journalists who simply echo the press release of a study, and journalists who amply criticize a single publication (like Hans), are clueless about science.

Apples and oranges! How can one lump science communicators (“media releases”), echoing journalists (“the media”) and critical journalists together?

I see more value in a critical analysis than in blind rejoicing over hot air, as long as the criticism guides the reader in appreciating the study.

And if there is just one single novel study, that seems important enough to get media attention, shouldn’t we judge the research on its own merits?

Then Daniel asks himself: “If I do criticize those journalists, shouldn’t I criticize those scientists who published just a single study and wrote a press release about it? “

His conclusion? “No”.

Daniel explains: science never provides absolute certainty, at the most the evidence is strong enough to state what is likely true. This can only be achieved by a lot of research by different investigators. 

Therefore you should believe in your ideas and encourage other scientists to pursue your findings. It doesn’t help when you say that music preference doesn’t predict shoplifting. It does help when you use the media to draw attention to your research. Many researchers are now aware of the “Music Marker Theory”. Thus the press release had its desired effect. By expressing a firm belief in their conclusions, they encourage other scientists to spend their sparse time on this topic. These scientists will try to repeat and falsify the study, an essential step in Cumulative Science. At a time when science is under pressure, scientists shouldn’t stop writing enthusiastic press releases or tweets. 

The latter paragraph is sheer nonsense!

Critical analysis of a single study by a journalist isn’t what undermines public confidence in science. Rather it’s the media circus that blows the implications of scientific findings out of proportion.

As exemplified by the hilarious PhD Comic below, research results are propagated by PR (science communication), picked up by the media, broadcast, and spread via the internet. At the end of the cycle, conclusions are reached that are not backed up by (sufficient) evidence.

PhD Comics – The news Cycle

Daniel is right about some things. First one study is indeed no study, in the sense that concepts are continuously tested and corrected: falsification is a central property of science (Popper). He is also right that science doesn’t offer absolute certainty (an aspect that is often not understood by the public). And yes, researchers should believe in their findings and encourage other scientists to check and repeat their experiments.

Though not primarily via the media, but via the normal scientific route. Good scientists will keep track of new findings in their field anyway. Suppose that only findings trumpeted in the media were pursued by other scientists?

[Screenshot: media & science]

And authors shouldn’t make overstatements. They shouldn’t raise expectations to a level which cannot be met. The Dutch study only shows weak associations. It simply isn’t true that the Dutch study allows us to “predict” at an individual level if a 12 year old will “act out” at 16.

This doesn’t help lay-people to understand the findings and to appreciate science.

The idea that media should just serve to spotlight a paper, seems objectionable to me.

Going back to the meta-level: what about the role of science communicators, media, science journalists and researchers?

According to Maarten Keulemans, journalist, we should just get rid of all science communicators as a layer between scientists and journalists [7]. But Michel van Baal [9] and Roy Meijer [8] have a point when they say that journalists do a lot of PR too, and they should do better than rehash news releases.*²

Now what about Daniel’s criticism of van Maanen? In my opinion, van Maanen is one of those rare critical journalists who serve as an antidote to uncritical media diarrhea (see the figure above). He is comparable to another lone voice in the media: Ben Goldacre. It didn’t surprise me that Daniel didn’t approve of him (and his book Bad Science) either [11].

Does this mean I find Hans van Maanen a terrific science journalist? No, not really, although I often agree with him (e.g. see this post [12]). He is one of those rare journalists with real expertise in research methodology. However, his columns don’t seem to be written for a large audience: they seem too complex for most lay people. One thing I learned during a science journalism course is that one should explain all jargon to one’s audience.

Personally I find this critical Dutch blog post [13] about the Music Marker theory far more balanced. After a clear description of the study, Linda Duits concludes that the results are pretty obvious, and that the mini-hype surrounding this research was caused by the positive tone of the press release. She stresses that prediction is not predetermination and that the musical genres themselves are not important: hip-hop doesn’t lead to criminal activity, nor metal to vandalism.

And this critical piece in Jezebel [14] reaches far more people by talking in plain, colourful language, hilarious at times.

It also has a swell title: “Delinquents Have the Best Taste in Music”. Now that is an apt conclusion!

———————-

*¹ Since Daniel refers neither to open (trial) data access nor to the fact that peer review may fail, I ignore these aspects for the sake of the discussion.

*² Coincidence? Keulemans covered the music marker study quite uncritically (positively).

Photo Credits

http://www.phdcomics.com/comics/archive.php?comicid=1174

References

  1. Daniel Lakens: Is dit nou goede Wetenschap? – Jan 24, 2013 (sites.google.com/site/lakens2/blog)
  2. Hans van Maanen: De smaak van boefjes in de dop, De Volkskrant, Jan 12, 2013 (vanmaanen.org/hans/columns/)
  3. ter Bogt, T., Keijsers, L., & Meeus, W. (2013). Early Adolescent Music Preferences and Minor Delinquency PEDIATRICS DOI: 10.1542/peds.2012-0708
  4. Lindsay Abrams: Kids Who Like ‘Unconventional Music’ More Likely to Become Delinquent, the Atlantic, Jan 18, 2013
  5. Muziekvoorkeur belangrijke voorspeller voor kleine criminaliteit. Jan 8, 2013 (pers.uu.nl)
  6. Maarten Keulemans: Muziek is goede graadmeter voor puberaal wangedrag – De Volkskrant, 12 januari 2013  (volkskrant.nl)
  7. Maarten Keulemans: Als we nou eens alle wetenschapscommunicatie afschaffen? – Jan 23, 2013 (denieuwereporter.nl)
  8. Roy Meijer: Wetenschapscommunicatie afschaffen, en dan? – Jan 24, 2013 (denieuwereporter.nl)
  9. Michel van Baal: Wetenschapsjournalisten doen ook aan PR – Jan 25, 2013 (denieuwereporter.nl)
  10. What peer review means for science (guardian.co.uk)
  11. Daniel Lakens. Waarom raadde Maarten Keulemans me Bad Science van Goldacre aan? Oct 25, 2012
  12. Why Publishing in the NEJM is not the Best Guarantee that Something is True: a Response to Katan – Sept 27, 2012 (laikaspoetnik.wordpress.com)
  13. Linda Duits: Debunk: worden pubers crimineel van muziek? (dieponderzoek.nl)
  14. Lindy West: Science: “Delinquents Have the Best Taste in Music” (jezebel.com)




Health and Science Twitter & Blog Top 50 and 100 Lists. How to Separate the Wheat from the Chaff.

24 04 2012

Recently a Top 100 scientists-on-Twitter list went viral on Twitter. It was published at accreditedonlinecolleges.com/blog.*

Most people just tweeted “Top 100 Scientists on Twitter”, others were excited to be on the list, and a few mentioned the lack of scientist X or discipline Y in the top 100.

Two scientists noticed something peculiar about the list: @seanmcarroll noticed two fake (!) accounts under “physics” (as later explained, these were @NIMAARKANIHAMED and @Prof_S_Hawking), and @nutsci (having read two posts of mine about spammy top 50 or 100 lists [1,2]) recognized this Twitter list as spam:

It is surprising how easy it (still) is for such spammy Top 50 or 100 lists to go viral, whereas they have only been published to generate more traffic to the website and/or to earn revenue through click-throughs.

It makes me wonder why well-educated people like scientists and doctors swallow the bait. Don’t they recognize the spam? Do they feel flattered to be on the list, or do they take offence when they (or another person who “deserves” it) aren’t chosen? Or perhaps they just find the list useful and want to share it, without taking a close look?

To help you to recognize and avoid such spammy lists, here are some tips to separate the wheat from the chaff:

  1. Check WHO made the list. Is it from an expert in the field, someone you trust? (and/or someone you like to follow?)
  2. If you don’t know the author in person, check the site which publishes the list (often a “blog”):
    1. Beware if there is no (or little info in the) ABOUT-section.
    2. Beware if the site mainly (or only) has these kinds of lists or short, very general blog posts (like “10 ways to…”), except when the author is somebody like Darren Rowse aka @ProBlogger [3].
    3. Beware if it is a very general site producing a diversity of very specialised lists (who can be an expert in all fields?).
    4. Beware if the website has any of the following (not mutually exclusive) characteristics:
      1. Web addresses like accreditedonlinecolleges.com, onlinecolleges.com, onlinecollegesusa.org,  onlinedegrees.com (watch out com sites anyway)
      2. Websites with a Quick-degree, nursing degree, technician school etc finder
      3. Prominent links at the homepage to Kaplan University, University of Phoenix, Grand Canyon University etc
    5. Reputable sites are less likely to produce nonsense lists. See for instance this “Women in science blogging” list published in the Guardian [4].
  3. When the site itself seems OK, check whether the names on the list seem trustworthy and worth a follow. Clearly, lists with fake accounts (other than lists of “top 50 fake accounts” ;)) aren’t worth the bother: apparently the creator didn’t make the effort to verify the accounts and/or doesn’t have the capacity to understand the tweets/topic.
  4. Ideally the list should have added value. Meaning that it should be more than a summary of names and copy pasting of the bio or “about” section.
    For instance, I was recently put on a list by onlinecollegesusa.org [b], but the author had just copied the subtitle of my blog: “… a medical librarian and her blog explores the web 2.0 world as it relates to library science and beyond.”
    However, sometimes, the added value may just be that the author is a highly recognized expert or opinion leader. For instance this Top Health & Medical Bloggers (& Their Twitter Names) List [5] by the well known health blogger Dean Giustini.
  5. In what way do these lists represent *top* blogs or Twitter accounts? Are their blogs worth reading and/or their Twitter accounts worth following? A Nobel Prize winner may be a top scientist, but may not necessarily be a good blogger and/or may not have interesting tweets. (Personally I know various examples of uninteresting accounts of *celebrities* in health, science and politics.)
  6. Beware if you are actively approached and kindly requested to spread the list to your audience (for this is what they want). It goes like this (note the impersonal tone):

    Your Blog is being featured!

    Hi There,

    I recently compiled a list of the best librarian blogs, and I wanted to let you know that you made the list! You can find your site linked here: [...]

    If you have any feedback please let me know, or if you think your audience would find any of this information useful, please feel free to share the link. We always appreciate a Facebook Like, a Google +1, a Stumble Upon or even a regular old link back, as we’re trying to increase our readership.

    Thanks again, and have a great day!

While some of the lists may be worthwhile in themselves, it is best NOT TO LINK TO DOUBTFUL LISTS: don’t mention them on Twitter, don’t retweet them and don’t blog about them. For this is all they want to achieve.

But what if you really find this list interesting?

Here are some tips for finding alternatives to these spammy lists (often the opposite of the above-mentioned words of caution):

  1. Find posts/lists produced by experts in the field and/or people you trust or like to follow. Their choice of blogs or twitter-accounts (albeit subjective and incomplete) will probably suit you the best. For isn’t this what it is all about?
  2. Especially useful are posts that give you more information about the people on the list. Like this top-10 librarian list by Phil Bradley [6] and the excellent “100+ women healthcare academics” compiled by @amcunningham and @trishgreenhalgh [7].
    Strikingly, the reason to create the latter list was that a spammy list not recognized as such (“50 Medical School Professors You Should Be Following On Twitter” [c]) seemed short on women…
  3. In case of Twitter-accounts:
    1. Check existing Twitter lists of people you find interesting to follow. You can follow the entire lists or just those people you find most interesting.
      Examples: I created a list with people from the EBM-cochrane people & sceptics [8]. Nutritional science grad student @Nutsci has a nutrition-health-science list [9]. The more followers, the more popular the list.
    2. Check interesting conversation partners of people you follow.
    3. Check accounts of people who are often retweeted in the field.
    4. Keep an eye on #FF (#FollowFriday) mentions, where people worth following are highlighted
    5. Check a topic on Listorious. For instance, @hrana made a list of Twitter doctors [10]. There are also scientists lists (then again, check who made the list and who is on it: some health/nutrition lists are really bad if you’re interested in science and not junk).
    6. Worth mentioning are shared lists that are open for edit (so there are many contributors besides the curator). Lists [4] and [7] are examples of crowd-sourced lists. Other examples are truly open-to-edit lists using public spreadsheets, like the Top Twitter Doctors [11], created by Dr Ves, and lists of science and bio(medical) journals [12], created by me.
  4. Finally, if you find the spam top 100 list truly helpful, and don’t know too many people in the field, just check out some of the names without linking to the list or spreading the word.

*For obvious reasons I will not hyperlink to these sites, but if you would like to check them, these are the links

[a] accreditedonlinecolleges.com/blog/2012/top-100-scientists-on-twitter

[b] onlinecollegesusa.org/librarian-resources-online

[c] thedegree360.onlinedegrees.com/50-must-follow-medical-school-professors-on-twitter

  1. Beware of Top 50 “Great Tools to Double Check your Doctor” or whatever Lists. (laikaspoetnik.wordpress.com)
  2. Vanity is the Quicksand of Reasoning: Beware of Top 100 and 50 lists! (laikaspoetnik.wordpress.com)
  3. Google+ Tactics of the Blogging Pros (problogger.net)
  4. “Women in science blogging”-list (guardian.co.uk/science)
  5. Top Health & Medical Bloggers (& Their Twitter Names) List (blog.openmedicine.ca)
  6. Top-10 librarian list by Phil Bradley (www.blogs.com/topten)
  7. 100+ women healthcare academics by Annemarie Cunningham/ Trisha Greenhalgh (wishfulthinkinginmedicaleducation.blogspot.com)
  8. EBM-cochrane people & sceptics (Twitter list by @laikas)
  9. Nutrition-health-science (Twitter list by @nutsci)
  10. Twitter-doctors by @hrana (listorious.com)
  11. Open for edit: Top Twitter Doctors arranged by specialty in alphabetical order (Google Spreadsheet by @drves)
  12. TWITTER BIOMEDICAL AND OTHER SCIENTIFIC JOURNALS & MAGAZINES (Google Spreadsheet by @laikas)






Silly Sunday #50: Molecular Designs & Synthetic DNA

23 04 2012

As a teenager I found it hard to picture the 3D structure of DNA, proteins and other molecules. Remember, we had no computers then, no videos, no 3D pictures or 3D models.

I tried to fill the gap by making DNA molecules out of (used) matches and colored clay, based on descriptions in dry (and dull, 2D) textbooks, but you can imagine that these creative 3D clay figures bore little resemblance to the real molecular structures.

But luckily things have changed over the last 40 years. Not only do we have computers and videos, there are also ready-made molecular models, specially designed for education.

Oh, how I wish my chemistry teachers had had those DNA (starter) kits.

Hat tip: Joanne Manaster @sciencegoddess on Twitter:

Curious? Here is the Products Catalog of http://3dmoleculardesigns.com/news2.php

Of course, such “synthesis” (copying) of existing molecules - though very useful for educational purposes - is overshadowed by the recent “creation of molecules other than DNA and RNA [xeno-nucleic acids (XNAs)] that can be used to store and propagate information and have the capacity for Darwinian evolution”.

But that is quite a different story.






Friday Foolery #42 So You Think You Can Dance Your PhD Thesis?

5 11 2011

It’s hard to explain your research to non-scientists. My PhD defense was preceded by a slide show (yes, that was once upon a time, when we didn’t use PowerPoint). It was the only part the public could follow a bit, but it was too long, static and detailed.

That cannot be said of these videos, in which PhDs from all over the world interpret their graduate research in dance form.

The videos below are the winners of the 2011 edition of the Dance your PhD contest. For the 4th year, this contest is organized by Gonzolabs & Science. See http://gonzolabs.org/dance/

There are four categories: chemistry, physics, biology, and social sciences.

The overall winner of 2011 was Joel Miller (physics category), a biomedical engineer at the University of Western Australia in Perth. Miller apparently compensated for his poor dancing skills and the lack of video by applying stop-motion animation (stringing together about 2,200 photos to make it look as though his “actors” were dancing). His video shows the creation of titanium alloys that are both strong and flexible enough for long-lasting hip replacements.
I love the song, by the way. It fits the dance scene perfectly.

You can see all winning videos here and all of this year’s (2011) PhD videos here. You can also check out the 2010 and 2009 PhD dances.

The other winners of 2011 were FoSheng Hsu (chemistry), who guides viewers through the entire sequence of steps required for X-ray crystallography; Emma Ware (social science), who studies the traditional ‘stimulus-release’ model of social interaction using pigeon courtship (a beautiful pas de deux); and Edric Kai Wei Tan (biology), with a funny dance about smell-mediated response to relatedness of potential mates, simply put: “fruit fly sex”.

Being Dutch, I would like to close with the Dutch winner of the biology category in 2010, Maartje Cathelijne de Jong who dances her PhD, “The influence of previous experiences on visual awareness.”





Frantic Friday #37. The Aftermaths of the Japanese Earthquake & Tsunami. With Emphasis on (Mis)information

19 03 2011

The Frantic Friday belongs to the same series as the Silly Saturday, Funny Friday etc. posts. These are not directly related to science or library matters. Often these posts are about humorous things, but not in this case; therefore the name of the series was adapted. It took me a week to write this down, so it reflects what happened over the entire period (and insights did change).

 

Aerial of Sendai, Japan, following earthquake.

Last week was overshadowed by the terrible earthquake in northeast Japan, and the subsequent tsunami which swept away many villages in this part of the country. Some people see this as a sign of the world coming to an end, especially since the dates of the Twin Tower attacks (9-11-01) and the date of the tsunami in Japan (3-10-11) add up to 12/21/12, the predicted date of the end of the world. Whether you believe in this omen or not (I don’t), the pictures and videos of this event sure do show the unprecedented power of nature, which is devastating beyond imagination. The Al Jazeera video below was shown on Dutch TV the entire morning: people, cars and boats have no time to escape as a large tsunami engulfs various cities, devouring everything in its path.

Another impressive video shows how a small stream grows into a wild, turbulent flood and sweeps away cars and even houses. Sadly, many commenters on this video see the disaster as a punishment for “those that have turned there backs on HIM” [sic] etc. Videos like these can now be found anywhere, like at BBC News Asia.

Here are photos from before and after the tsunami, and here are some photos showing not only the violent streams but also the consequences. I was especially moved by this photo of what appear to be a mother and child. For after all, this natural disaster is mainly a human tragedy. Let’s hope many loved ones (human and animal) have found, or will find, each other in good health again, like this reunion of a dog owner and his dog.

As if that wasn’t enough, there was also a volcano eruption last Sunday, and the initially small problems with the nuclear plants near the tsunami area now seem to be getting out of hand (see below).

Indirectly, there are some library, web 2.0/social media and science aspects to this natural disaster. I will concentrate on (medical and scientific) information.

Google immediately reacted to the Japanese tsunami with a Person Finder tool (Engadget). As in the Haiti earthquake (see earlier post), Cochrane has made its Evidence Aid resources available.

Immediately after the earthquake we could learn some scientific facts about earthquakes and tsunamis. One thing I learned is that the shallower the earthquake, the more devastating the effects in the surrounding area. I also learned that a tsunami can reach a speed of 800 km per hour, i.e. it “flies” at the speed of an airplane, and that a wave can be 1 km long and have incredible force. Science writers further explain why Japan’s tsunami triggered an enormous whirlpool.

These are facts, but with the nuclear effects we are unsure as to what is happening and “how bad it will be”. I’m a scientist, but surely no expert in this field, and I find the information confusing, contradictory and sometimes misleading.

Let’s start with the misleading information. Of course there are people who see the hand of God in all this, but that is so obviously without any foundation (“uit de lucht gegrepen”, as the Dutch say) that I won’t discuss it further.

First, this nuclear fallout map. (It is a lie!)

I saw it on Facebook and took it seriously. Others received it by mail, with an explanation that 550-750 rads means “nausea within a couple of hours and no survivors.” Clearly that is nonsense (fallout killing all people on the US East Coast). Also disturbingly, the makers of this map “borrowed” the logo of the Australian Radiation Services (ARS). (See Snopes.com; thanks to David Bradley of Sciencebase.com, who mentioned it on Facebook.)

But the pro-nuclear people come with equal misinformation. There is a strange link on Facebook leading to a post: “MIT scientist says no problems”. The post was blogged by an Australian teacher in Japan, who wrote up the words of a friend, family member and MIT scientist, Josef Oehmen (@josefoehmen on Twitter)… But the post really seems to be a repost from something called The Energy Collective, and written by Brooks, a strong proponent of nuclear power. That site is powered by Siemens AG, which recently became an “industry partner” of MIT/LAI (and the circle is complete). Read about this and more at Genius Now in “The Strange Case of Josef Oehmen” (access the cache if the site can’t be reached). The German translation of the official piece is here. The comments (permitted) are revealing…

Another misleading claim is that of attorney Ann Coulter, in a column and on the O’Reilly show:

“With the terrible earthquake and resulting tsunami that have devastated Japan, the only good news is that anyone exposed to excess radiation from the nuclear power plants is now probably much less likely to get cancer. We shouldn’t worry about the damaged Japanese reactors because they’ll make the locals healthier.”

She refers to hormesis: the effect whereby, under certain conditions, low doses of certain chemicals/drugs can have the opposite effects to high doses in certain experimental models. See PZ Myers at Pharyngula for an excellent dissection of this nonsense.

And - help! - here is a post by a CAM doctor who advises people in the US to immediately take the following (because the “Japanese Nuclear Radiation Plume Has Reached the United States”):

Ample amounts of necessary minerals such as magnesium, iodine, selenium, zinc, and others, Saunas, both infrared and far-infrared, Raising core energy levels with botanical formulas, Supporting and improving individual capacities to mobilize and eliminate toxins, Therapeutic Clays to remove positively charged particles, Solum uliginosum products from Uriel Pharmacy – also available directly from us etcetera.

Thus, various examples of misinformation by seemingly well-informed scientists, experts and doctors.

Perhaps this is the downside of social media. Twitter and Facebook are very well suited to spreading news fast, but they can also easily spread false information or hoaxes fast, via “your friends”. It is important to check where the news actually comes from (which can be hard if someone misuses logos and institutions’ names) and whether the writer has any bias for or against nuclear power. Another disadvantage of social media is that we hurry through it by speed-reading.

Besides real lies there is also something called bias.

I have to admit that I have a bias against nuclear power. I was a teenager when I learned of the Club of Rome; I was in my twenties when the Dutch held large peace marches with “Ban the bomb” placards; I was in my thirties when Dutch cattle had to be kept in stables and we couldn’t eat spinach because of the Chernobyl fallout. At university, my physics professor spent one or two lectures talking about the dangers of nuclear power and the connection with poverty and the arms race, instead of teaching the regular stuff. During environmental studies I learned about the pitfalls of other energy sources as well. My conclusion was that we had to use our energy sources well, and I decided to use my feet instead of driving a car (a decision I sometimes regret).

The opinion piece by David Ropeik, "Beware the fear of nuclear….FEAR!", in Scientific American seems a little biased in the opposite direction. This guest post, written soon after the trouble at the Fukushima Daiichi plant began, mainly stresses that:

“… the world is facing the risk of getting the risk of nuclear power wrong, and raising the overall risk to public and environmental health far more in the process.”

As if nuke-o-noia is what is most worrying at the moment. He also stresses that, in addition to being actually physically hazardous, nuclear power has some psychological characteristics (it is odorless, man-made) that make it particularly frightening: it is all in the mind, isn't it?

I do get his point, though, and agree about the quiet danger of fossil fuels and the risk of being too dependent on other countries for energy. But, as a commenter said, two wrongs don't make a right. And aren't there also renewable resources and energy saving?

Furthermore, the nuclear problems in Japan do show what happens when a country is reliant on nuclear power. The lack of electricity causes great problems with infrastructure. This not only affects Tokyo commuters; the lack of fuel, electricity and food, together with the cold weather, also hampers the aid efforts. There might also be insufficient fuel to evacuate refugees from the exclusion area, a problem that will grow if the government has to widen the evacuation zone around the plant again (Nature News). Less importantly, the Japanese quake will also likely affect our supply of gadgets and other industries, such as the auto industry.

So we now have polarized discussions between pro- and anti-nuclear movements, and it has become an irrational political issue. China has suspended approval for all new nuclear power stations, Germany's government has announced a safety review of its 17 nuclear power plants and is temporarily shutting down the seven oldest, and the Dutch government will take the Japanese experience into account when deciding on the Dutch nuclear power program.

It is surprising that minds have changed overnight: all (potential) risks of nuclear plants have long been known.

Regarding misinformation: TEPCO, the utility that runs the Fukushima Daiichi nuclear power plant and supplies power to Tokyo, has a long history of disinformation. There were many incidents (around 200) that were covered up (Dutch: see NRC Handelsblad, Thursday 2011-03-17; non-official forum here).

There are also signals that the Japanese government, and even the IAEA (according to a Russian nuclear accident specialist), aren't or weren't as transparent as one would like them to be. The government seems to downplay the risks and is late with information. Its actions are not consistent with what is said: everything was said to be under control while people were being evacuated, etc. The American analysis of the severity of the nuclear accident was also much graver than that of the Japanese government: while the Japanese advised keeping a distance of 30 km, the United States and the British Foreign and Commonwealth Office recommended that citizens stay at least 80 km from the nuclear plant (discussed in the NY Times and The Great Beyond (Nature)).
In recent days the Japanese government has become more open. The Japanese science ministry, MEXT, now publishes the radiation levels throughout the region and gives more background information about the health risks (source: The Great Beyond). Today, it also raised the warning level from 4 to 5 on the 7-level international scale. Outside experts have said for days that this disaster was worse than that at Three Mile Island, which was rated a 5 but released far less radiation outside the plant than Fukushima Daiichi already has. Level 4 means only "local effects".
The Prime Minister’s Office of Japan now also has an official English account on Twitter: @JPN_PMO.

But where can we get reliable information? What about the health risks? Again, I'm no expert in this field, but the following information at least helped me to get an idea of the situation and the actual danger.

  1. It looks like the situation at the Fukushima Daiichi nuclear power plant is getting out of control (Nature News, March 16th).
  2. It is possible that the trouble will not be confined to leaks of radioactivity and explosions, and that a nuclear meltdown may occur.
  3. A nuclear meltdown or nuclear reactor explosion is a grave event, but is NOT a nuclear explosion. As explained at Sciencebase:  “There is a risk of particles of radioactive material entering the atmosphere or the ocean, but this does not amount to the impact of an actual nuclear explosion.” Thus even in a worst-case scenario the effects are not as severe as a nuclear explosion.
  4. One major difference with Chernobyl is that radioactivity at Fukushima remains largely contained within the reactor and that we know the problems from the start (not surprised by fall-out).
  5. Still, radioactive fumes leak from the power plant. On March 16th there was "an alarmingly high dose rate" of 0.08 millisieverts (mSv) per hour, 25 kilometres away from the plant (Nature News). On March 17th it was 17 mSv/hr, 30 kilometres northwest of the reactor. There are also reports of 0.012 mSv/hr in Fukushima City, 60 km away from the plant (The Great Beyond). Sanjay Gupta measured that radiation levels had quadrupled, even in Tokyo (see CNN video).
  6. The time of exposure is as important as the dose. Thus exposure to radiation 4 to 10 times higher than normal for a couple of days poses little extra health risk, but receiving 4 to 10 times more radiation than usual for months or years could pose a health risk (cumulative effect). On the other hand, peak doses recorded at Fukushima of 400 mSv per hour are enough to induce radiation sickness in about two hours' time (The Great Beyond).
  7. Radiation sickness is a (more or less) acute effect of irradiation. It can occur in the immediate surroundings of the radioactive leak. A single dose of 1000 mSv causes radiation sickness and nausea, but not death; 6000 mSv (received by some Chernobyl workers) kills people within a month (see the picture in The Guardian).
  8. Over the long term, exposure to radiation may increase the risk of developing cancer. An exposure rate of 100 mSv/yr is considered the threshold at which cancer rates begin to increase.
  9. To put this into perspective: we are all exposed to 2 mSv natural irradiation per year, one full body CT-scan gives 10 mSv and a flight from New York – Tokyo polar route gives 9 mSv.
  10. The most worrisome of the reported releases of radioactive material in Japan are cesium-137 (a gamma emitter: high-energy radiation that penetrates deeply) and iodine-131 (a beta emitter: easily shielded, but dangerous when ingested or inhaled).
  11. Iodine-131 has a short half-life of 8 days, but is dangerous when it is absorbed, e.g. through contaminated food and milk. It accumulates in the thyroid and can cause (mostly non-lethal) thyroid cancer. An easy form of protection is potassium iodide (KI), but this should only be taken by people in the emergency zone, because it can cause serious adverse effects and should not be taken unnecessarily. (For more info see the CDC.)
  12. Over the long term, the big threat to human health is cesium-137, which has a half-life of 30 years. It is cesium-137 that still contaminates much land in Ukraine around the Chernobyl reactor. Again it can enter the body via food, notably milk.
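The dose arithmetic in the list above can be sketched as a quick back-of-the-envelope calculation. This is a rough illustration only, using the figures quoted above (2 mSv/yr background, the 0.08 mSv/hr rate measured 25 km from the plant, the 400 mSv/hr peak, and the 1000 mSv radiation-sickness dose); it is not medical guidance.

```python
# Back-of-the-envelope dose arithmetic using the figures quoted above.

BACKGROUND_MSV_PER_YEAR = 2.0    # average natural background dose
SICKNESS_THRESHOLD_MSV = 1000.0  # single dose causing radiation sickness


def cumulative_dose(rate_msv_per_hr, hours):
    """Total dose from a constant dose rate over a period of time."""
    return rate_msv_per_hr * hours


# Three days at 0.08 mSv/hr (the rate measured 25 km from the plant)
# amounts to a few years' worth of natural background radiation:
three_days = cumulative_dose(0.08, 72)  # 5.76 mSv
print(f"3 days at 0.08 mSv/hr: {three_days:.2f} mSv "
      f"(~{three_days / BACKGROUND_MSV_PER_YEAR:.1f} years of background)")

# ... whereas the 400 mSv/hr peak recorded at the plant reaches the
# radiation-sickness threshold within hours:
hours_to_sickness = SICKNESS_THRESHOLD_MSV / 400.0
print(f"Time to radiation sickness at 400 mSv/hr: {hours_to_sickness:.1f} h")
```

This illustrates point 6: the same dose rate is nearly harmless over days but matters over years, while a very high rate is acutely dangerous within hours.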

Note: this is a short summary of what I’ve read. Please go to official sites to get more detailed scientific and medical information.


Stories [9]: A Healthy Volunteer

20 09 2010

The host of Next Grand Rounds (Pallimed) asked participants to submit a recent blog post from another blogger in addition to their own post.
I chose "Orthostatics – one more time" from DB's Medical Rants, and a post commenting on it from Musings of a Dinosaur.

Bob Center’s (@medrants) posts was about the value of orthostatic vital sign measurements (I won’t go into any details here), and about who should be doing them, nurses or doctors. In his post, Bob Center also mentioned briefly that students were seeing this as scut work similar as drawing your own bloods and carrying them to the lab.

That reminded me of something that happened when I was working in the lab as a PhD student, 20 years ago.

I was working on a chromosomal translocation between chromosome 14 and 18. (see Fig)

The t(14;18) is THE hallmark of follicular lymphoma (lymphoma is a B cell cancer of the lymph nodes).

This chromosomal translocation is caused by a faulty coupling of an immunoglobulin chain to the BCL-2 proto-oncogene during the normal rearrangement process of the immunoglobulins in the pre-B-cells.

This t(14;18) translocation can be detected by genetic techniques, such as PCR.

Using PCR, we found that the t(14;18) translocation was present not only in follicular lymphoma, but also in benign hyperplasia of tonsils and lymph nodes in otherwise healthy persons. Only about 1 in 100,000 cells was positive. When I finally succeeded in sequencing the PCR-amplified breakpoints, we could show that each breakpoint was unique and not due to contamination with our positive control (read my posts on XMRV to see why this is important).

So we had a paper. Together with experiments in transgenic mice, our results hinted that the t(14;18) translocation is necessary but not sufficient for follicular lymphoma. Enhanced expression of BCL-2 might make the cells with the translocation "immortal".

All fine, but hyperplastic tonsils might still form an exception, since they are not completely normal. We reasoned that if the t(14;18) was an accidental mistake in pre B cells it might sometimes be found in normal B cells in the blood too.

But then we needed normal blood from healthy individuals.

At the blood bank we could only get pooled blood at that time. But that wasn't suitable: if a translocation was present in one individual, it would be diluted by the blood of the others.
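The dilution problem with pooled blood is simple arithmetic. A minimal sketch, with hypothetical numbers in the range mentioned in this post (the function name, donor count and frequencies are illustrative, not from the original study):

```python
# Why pooled blood was unsuitable: pooling dilutes one positive donor's
# translocation-carrying cells toward or below the PCR detection limit.
# (Hypothetical numbers, in the range mentioned in the text.)

DETECTION_LIMIT = 1 / 100_000  # roughly the frequency seen in tonsils


def pooled_frequency(freq_in_positive_donor, n_donors):
    """Frequency of t(14;18)-positive cells after pooling one positive
    donor's blood with (n_donors - 1) negative donors, equal volumes."""
    return freq_in_positive_donor / n_donors


# One donor with 1 in 10,000 positive cells, pooled with 99 others:
pooled = pooled_frequency(1 / 10_000, 100)
print(f"Pooled frequency: 1 in {round(1 / pooled):,}")
print("Above detection limit?", pooled >= DETECTION_LIMIT)
```

Even a donor with an unusually high frequency of positive cells ends up at 1 in 1,000,000 after pooling with 99 others, well below what the assay could detect, hence the need for individual donors.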

So, as was quite common then, we asked our colleagues to donate some blood.

The entire procedure was cumbersome: a technician first had to enrich for T and  B cells, we had to separate the cells by FACS and I would then PCR and sequence them.

The PCR and sequencing techniques had to be adapted, because the frequency of positive cells was lower than in the tonsils and approached the detection limit… at least, in most people. But not in all: one of our colleagues had relatively prominent bands, and several breakpoints.

It was explained to him that this really meant nothing, because we found similar translocations in every healthy person.

But still, I wouldn't feel 100% sure if so many of my blood cells (1 in 1,000 or 10,000) contained t(14;18) translocations.

He was one of the first volunteers we tested, but from then on it was decided to test only anonymous persons.


Silly Sunday [33] Science, Journalists & Reporting

12 09 2010

On Friday I read a post by David Bradley at Sciscoop Science on six reasons why scientists should talk to reporters, based on an article in this week's The Scientist magazine by Edyta Zielinska (registration required).

The main reasons why scientists should talk to reporters:

  • It’s your duty
  • It raises your profile with journal editors and funders
  • Your bosses will love it
  • You may pick up grant-writing tips
  • It gets the public excited about science
  • It’s better you than someone else

But the strongest part of the Zielinska article is the practical tips, which fall into three categories:

  • the medium matters (i.e. tv versus print)
  • getting the most out of a press call (KISS, significance metaphors)
  • Common press pitfalls, and how to avoid them (avoid oversimplification, errors, jargon, misquotes, sensational stories)

The article concludes with a useful glossary. Read more: Why Trust A Reporter? – The Scientist

Alan Dangour has experienced what may happen when you report scientific evidence which is then covered by the news.

He and his group published systematic reviews that found no evidence of any important differences in the nutritional composition of foodstuffs grown using conventional and organic farming methods. There was also no evidence of nutrition-related health benefits from consuming organically produced foods.

The press quickly picked up on the story. The Times ran a front-page headline: “Organic food ‘has no extra health benefits’ ”, the Daily Express added “Official” while, in a wonderfully nuanced piece, the Daily Mail ran: “A cancerous conspiracy to poison your faith in organic food”.

Initially it was "tremendously exciting and flattering", but their findings were contrary to beliefs held by many, and soon the hate mail started flooding in. That's why he concludes, "Come on scientists, stand up and fight!", when not only the scientific evidence is called into question, but also your scientific skills and your personal and professional integrity. Quite appropriately, a Lancet editorial put it like this: "Eat the emotion but question the evidence".

Journalists can also be the target of hate mail or aggressive comments. In the whole XMRV–CFS torrent, patients seem to almost "adore" positive journalists (e.g. Amy Dockser Marcus of the WSJ Health Blog), while harassing those who are a bit more critical, like @elmarveerman of Noorderlicht, author of "tiring viruses". It caused another journalist (who wrote about the same topic) to stop, because people hurled curses at her. A good discussion is fine, but unfounded criticism is not, she reasoned.

Last week, two other articles emphasized the need for science journalism to change.

One article, by Matthew Nisbet at Big Think, elaborated on what Alice Bell calls "upstream science journalism". Her blog post is based on her talk at Science Online London, as part of a plenary panel with David Dobbs, Martin Robbins and Ed Yong on "rebooting" (aka the future of) science journalism (video, of bad quality, included).

Upstream, we have the early stages of communication about some area of science: meetings, literature reviews or general lab gossip. Gradually these ideas are worked through, and the communicative output flows downstream towards the peer-reviewed and published journal article and perhaps, via a press release and maybe even a press conference, some mass media reporting.

This is still pretty vague to me. I think fewer pushed press releases copied by each and every news source, and more background stories giving insight into how science comes about and what it represents, would be welcome. As long as it isn't too much like glorification of certain personalities. (More) gossip is also not what we're waiting for.

Her examples and the interesting discussion that follows clarify that she thinks more of blogs and twitter as tools propelling upstream science journalism.

One main objection (or rather limitation) is that: “most science journalists/writers cover whatever they find interesting and what they believe their readers will find interesting (Ian Sample in comments).”

David Ropeik (commenting):

Wonderful goal, to have journalism serve society in this, or any way, but, forgive me, it’s a naive hope, common among those who observe journalism but haven’t done it.(…..)
Even those of us who feel journalism is a calling and serves an important civic role do not see ourselves principally as teachers or civil servants working in the name of some higher social cause, to educate the public about stuff we thought they should know. We want the lead story. We want our work to get attention. We want to have impact, sure, hopefully positive. But we don’t come into work everyday asking “what should the public know about?”

That’s reality. John Fleck (journalist) agrees that the need to “get a lot of attention” is a driving force in newsroom culture and decision-making, but stresses that the newspapers he worked for have always devoted a portion of their resources to things managers felt were important even if not attention-getting.

So is the truth somewhere in the middle?

Another blog post, at Jay Rosen's Public Notebook, gives advice to journalists "formerly known as the media". Apart from advice such as "you need to be blogging" and "you need to 'get' mobile", he wants the next generation of journalists to understand:

  1. Replace readers, viewers, listeners and consumers with the term “users.”
  2. Remember: the users know more than you do.
  3. There’s been a power shift; the mutualization of journalism is here. We bring important things to the table, and so do the users. Therefore we include them. “Seeing people as a public” means that.
  4. Describe the world in a way that helps people participate in it.  When people participate, they seek out information.
  5. Anyone can doesn’t mean everyone will. (…) It’s an emerging rule of thumb that suggests that if you get a group of 100 people online then one will create content, 10 will ‘interact’ with it (commenting or offering improvements) and the other 89 will just view it… So what’s the conclusion? Only that you shouldn’t expect too much online.
  6. The journalist is just a heightened case of an informed citizen, not a special class.
  7. Your authority starts with, “I’m there, you’re not, let me tell you about it.”
  8. Somehow, you need to listen to demand and give people what they have no way to demand (…) because they don’t know about it yet
  9. In your bid to be trusted, don’t take the View From Nowhere; instead, tell people where you’re coming from.
  10. Breathe deeply of what DeTocqueville said: “Newspapers make associations and associations make newspapers.”

I think those are useful and practical tips, some of which fit in with the idea of more upstream journalism.

O.k. that’s enough for now. We have been pretty serious on the topic. But it is a Friday Fun/ Silly Sunday post. So bring in the comics.

These are self-explanatory, aren’t they?

(HT: David Bradley and a commenter on Facebook. I can't find it anymore; Facebook is hard to search.)

From SMBC comics: http://www.smbc-comics.com/index.php?db=comics&id=1623

Come on scientists, stand up and fight! From where I’m sitting it looks as if we are under attack from those who not only want to question the importance of scientific evidence but also to cast doubt on our scientific skills, and our personal and professional integrity. In the year of the 350th anniversary of the Royal Society we must defend the importance of scientific evidence and stand up for science.

I’m quite lucky. My research is just about interesting enough to discuss at dinner. It helps that I’m a public health nutritionist and, at least at dinner, my friends are generally happy to talk about food and sometimes even health. I work on projects including nutritional and physical activity interventions designed to maintain health and function in later life and the impact our love affair with animal foods has on both the environment and public health. Dressed up, and with a light touch of spin, these are all possible dinner party conversations.

My first brush with an audience outside the narrow circles of academia came soon after completing my PhD on the growth of the legs of Amerindian children (the things you used to be able to get funding for!). It turns out that leg length is a sensitive marker of diet and health in early childhood. Later work in England showed that the legs of English boys and girls are now longer than they were 20 years ago, probably because of improved diet and environmental conditions. The great British press loved this story. Lots of photos of long-legged women adorned the newspapers and one national paper even ran a competition to find Britain’s longest legs! This was a good story — easy to understand, straightforward to report and not challenging any pre-existing beliefs.

However, I have recently had a different experience of what can happen when you report scientific evidence. Last year, a team of us from the London School of Hygiene & Tropical Medicine released two systematic reviews on the nutritional quality and nutrition-related health benefits of organically produced foods. The research had been commissioned by the Food Standards Agency and had taken more than a year to complete.

We were not the first people to ask whether there were any differences in nutritional composition or health benefits of foods produced under different production regimens but it became clear that no one had addressed the question systematically. Systematic reviews are an important tool for scientists; unlike ordinary reviews, they are seen as original research and help to provide clarity in areas of uncertainty. The basic underpinning of a systematic review is that the process of conducting the review is pre-specified and that the review itself is as comprehensive as possible within these pre-specified limits. Reviews that are not systematic are much more prone to bias, especially with regards to the selection of papers included for review.

Our systematic reviews found that there was no evidence of any important differences in the nutritional composition of foodstuffs grown using conventional and organic farming methods. There was also no evidence of nutrition-related health benefits from consuming organically produced foods.

The press quickly picked up on the story. The Times ran a front-page headline: “Organic food ‘has no extra health benefits’ ”, the Daily Express added “Official” while, in a wonderfully nuanced piece, the Daily Mail ran: “A cancerous conspiracy to poison your faith in organic food”.

This was initially a tremendously exciting and unprecedented period in my academic career. My ego was certainly flattered! However, the tide of emotion quickly started to turn sour. I became increasingly dismayed at the way in which our data were being used and distorted, especially by those who would benefit from the return of uncertainty to the argument. I was also frustrated that we were being criticised for not including other aspects of organic farming (use of pesticides etc) in our review.

With correspondents only a click away, it will not be surprising to learn that we also received many hundreds of e-mails (it would be very interesting to know what proportion of these correspondents had actually read our reports). My favourite e-mail came from a physician in the US who complained that his wife had “been wasting money for years on organic food” and that at last our “scientific review may finally bring her to her senses”.

Other correspondents were less polite and we received many angry, even vicious e-mails questioning the integrity, independence and ability of the team. These are essential ingredients for a good research team and it is fair to ask these questions but the ferocity of the attack suggested that, by questioning the scientific evidence on the nutrient content of organic food, we had actually questioned something bigger. For the first time, we had drawn into sharp focus the strength of the evidence supporting the widespread belief that organic food is “better” — and many people did not like what they saw. As a Lancet editorial put it: “Eat the emotion but question the evidence”.

Beliefs are important, but so is science and standing up for scientific evidence is crucial. We should not be afraid to report our findings publicly, whether they are merely of academic interest or of a controversial nature. This is our job as scientists.

I expected our reviews to be read with interest but I’m not sure that I fully realised how far I was putting my head above the parapet. I think I’ve passed through the toughest hours and have emerged stronger and better able to fight for the importance of science in modern life.

Returning to the dinner party theme, I have also learnt the — at times painful — consequences of telling women that "based on current scientific evidence" their legs are slightly shorter than would be expected for their height. There's a time and a place for everything.

Alan Dangour is a senior lecturer at the London School of Hygiene & Tropical Medicine
