BAD Science or BAD Science Journalism? – A Response to Daniel Lakens

10 02 2013

Two weeks ago there was a hot debate among Dutch Tweeps on “bad science, bad science journalism and bad science communication”. This debate was started and fueled by several Dutch blog posts on the topic.[1,4-6]

One controversial post, with fierce proponents as well as fierce opponents, was that of Daniel Lakens [1], an assistant professor of Applied Cognitive Psychology.

I was among the opponents. Not because I dislike a fresh point of view, but because of Daniel's flawed reasoning and because he continuously compares apples and oranges.

Since Twitter debates lack depth and structure, and since I cannot comment on his Google Sites blog, I continue the discussion here.

The title of Daniel's post is (freely translated, like the other quotes from his post):

“Is this what one calls good science?”

In his post he criticizes a Dutch science journalist, Hans van Maanen, and specifically his recent column [2], where Hans discusses a paper published in Pediatrics [3].

This longitudinal study tested the Music Marker theory among 309 Dutch kids. The researchers gathered information about the kids’ favorite types of music and tracked incidents of “minor delinquency”, such as shoplifting or vandalism, from the time they were 12 until they reached age 16 [4]. The researchers conclude that liking music that goes against the mainstream (rock, heavy metal, gothic, punk, African American music, and electronic dance music) at age 12 is a strong predictor of future minor delinquency at 16, in contrast to a preference for chart pop, classical music or jazz.

The university press office sent out a press release [5], which was picked up by news media [4,6], and one of the Dutch authors of the study, Loes Keijsers, tweeted enthusiastically: “Want to know whether a 16-year-old will engage in delinquency? Then look at his music taste at age 12!”

According to Hans, Loes could just as easily have broadcast (more) balanced tweets, like “Music preference doesn’t predict shoplifting” or “12-year-olds who like Bach keep quiet about shoplifting when 16.” But even then, Hans argues, the tweets wouldn’t have been scientifically underpinned.

In column style, Hans explains why he thinks the study isn’t methodologically strong: no absolute numbers are given; 7 out of 11 (!) music styles are positively associated with delinquency, but these correlations are not impressive: the strongest predictor (a preference for Gothic music) explains no more than 9% of the variance in delinquent behaviour, which can include anything from shoplifting and vandalism to fighting, spraying graffiti and switching price tags. Furthermore, the risks of later “delinquent” behavior are small: on a scale from 1 (never) to 4 (4 times or more) the mean was 1.12. Hans also wonders whether it is a good idea to monitor kids with a certain music taste.
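To put that 9% into perspective, a back-of-the-envelope note of my own (assuming the figure refers to explained variance, R², as Hans’ column suggests): the variance a single predictor explains is the square of its correlation with the outcome, so

R² ≈ 0.09 ⇒ r ≈ √0.09 = 0.30, leaving 1 − R² ≈ 91% of the variance in “delinquent” behaviour unexplained.

Even the strongest music marker, in other words, leaves the lion’s share of the differences between kids unaccounted for.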

Thus Hans concludes that “this study isn’t good science”. Daniel, however, concludes that Hans’ writing is not good science journalism.

First Daniel recalls that he and other PhD students took a course on how to peer review scientific papers. On the basis of their peer review of a (published) article, 90% of the students decided to reject it. The two main lessons Daniel learned were:

  • It is easy to criticize a scientific paper and grind it down. No single contribution to science (no single article) is perfect.
  • New scientific insights, although imperfect, are worth sharing, because they help to evolve science. *¹

According to Daniel, science journalists often make the same mistakes as the peer-reviewing PhD students: criticizing individual studies without a “meta-view” on science.

Peer review and journalism, however, are different things (apples and oranges, if you like).

Peer review (with all its imperfections) serves to filter, check and improve the quality of individual scientific papers, (usually) before they are published [10]. My own papers that passed peer review were generally accepted. Of course there were the negative reviewers, often the ignorant ones, and the naggers, but many reviewers offered criticism that helped to improve my paper, sometimes substantially. As a peer reviewer myself, I simply try to separate the wheat from the chaff and to enhance the quality of the papers that pass.

Science journalism also has a filter function: it filters already peer-reviewed scientific papers* for its readership, “the public”, by selecting novel, relevant science and translating the scientific, jargon-laden language into language readers can understand and appreciate. Of course science journalists should also put the publication into perspective (call it “meta”).

Surely the PhD students’ finger exercise resembles the normal peer review process about as much as peer review resembles science journalism.

I understand that pure nitpicking seldom serves a purpose, but nitpicking rarely occurs in science journalism. The opposite, however, is commonplace.

Daniel disapproves of Hans van Maanen’s criticism, because Hans isn’t “meta” enough. Daniel: “Arguing about whether an effect size is small or mediocre is nonsense, because no individual study gives a good estimate of the effect size. You need to do more research and combine the results in a meta-analysis.”

Apples and oranges again.

Being “meta” has little to do with meta-analysis. Being meta is … uh … pretty meta. You could think of it as seeing beyond (meta) the findings of one single study*.

A meta-analysis, however, is a statistical technique for combining the findings of independent but comparable (homogeneous) studies in order to estimate the true effect size more precisely. This is an important but difficult methodological task for a scientist, not for a journalist. If a meta-analysis on the topic exists, journalists should of course take it into account (and so should the researchers). If not, they should put the single study in a broader perspective (what does it add to existing knowledge?) and show why this single study is or is not well done.
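For the statistically inclined, here is a minimal sketch of the simplest variant, fixed-effect inverse-variance weighting (a generic textbook illustration, not a description of any particular analysis of this literature): each study i contributes an effect estimate θ̂ᵢ with variance vᵢ, and more precise studies get more weight:

θ̂_pooled = Σ(wᵢ · θ̂ᵢ) / Σ wᵢ, with wᵢ = 1/vᵢ and SE(θ̂_pooled) = 1/√(Σ wᵢ).

Because the pooled standard error shrinks as independent studies accumulate, the combined estimate becomes more precise than any single study’s; that is what “needing more research” buys you, and it is something no amount of re-reading one paper can deliver.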

Daniel takes this further by stating that “one study is no study” and that journalists who simply echo the press release of a study and journalists who merely criticize a single publication (like Hans) are both clueless about science.

Apples and oranges! How can one lump science communicators (“media releases”), echoing journalists (“the media”) and critical journalists together?

I see more value in a critical analysis than in blind cheering over hot air, as long as the criticism guides the reader in appraising the study.

And if there is just one single novel study, that seems important enough to get media attention, shouldn’t we judge the research on its own merits?

Then Daniel asks himself: “If I do criticize those journalists, shouldn’t I criticize those scientists who published just a single study and wrote a press release about it?”

His conclusion? “No”.

Daniel explains: science never provides absolute certainty; at most the evidence is strong enough to state what is likely true. This can only be achieved by a lot of research by different investigators.

Therefore you should believe in your ideas and encourage other scientists to pursue your findings. It doesn’t help when you say that music preference doesn’t predict shoplifting. It does help when you use the media to draw attention to your research. Many researchers are now aware of the “Music Marker Theory”, so the press release had its desired effect. By expressing firm belief in their conclusions, the authors encourage other scientists to spend their scarce time on this topic. These scientists will try to repeat and falsify the study, an essential step in cumulative science. At a time when science is under pressure, scientists shouldn’t stop writing enthusiastic press releases or tweets.

The latter paragraph is sheer nonsense!

Critical analysis of one study by a journalist isn’t what undermines public confidence in science. Rather, it’s the media circus that blows the implications of scientific findings out of proportion.

As exemplified by the hilarious PhD Comic below, research results are propagated by PR (science communication), picked up by the media, broadcast and spread via the internet. At the end of the cycle, conclusions are reached that are not backed up by (sufficient) evidence.

PhD Comics – The News Cycle

Daniel is right about some things. First, one study is indeed no study, in the sense that concepts are continuously tested and corrected: falsification is a central property of science (Popper). He is also right that science doesn’t offer absolute certainty (an aspect that is often not understood by the public). And yes, researchers should believe in their findings and encourage other scientists to check and repeat their experiments.

But not primarily via the media; rather via the normal scientific route. Good scientists keep track of new findings in their field anyway. Imagine if only the findings trumpeted in the media were pursued by other scientists.

[Image: media & science]

And authors shouldn’t overstate their findings. They shouldn’t raise expectations to a level that cannot be met. The Dutch study only shows weak associations. It simply isn’t true that the study allows us to “predict” at an individual level whether a 12-year-old will “act out” at 16.

This doesn’t help laypeople understand the findings or appreciate science.

The idea that the media should just serve to spotlight a paper seems objectionable to me.

Going back to the meta-level: what about the role of science communicators, media, science journalists and researchers?

According to the journalist Maarten Keulemans, we should simply get rid of all science communicators as a layer between scientists and journalists [7]. But Michel van Baal [9] and Roy Meijer [8] have a point when they say that journalists do a lot of PR too, and that they should do better than rehash news releases.*²

Now what about Daniel’s criticism of van Maanen? In my opinion, van Maanen is one of those rare critical journalists who serve as an antidote to uncritical media diarrhea (see the figure above). He is comparable to another lone critical voice in the media, Ben Goldacre. It didn’t surprise me that Daniel didn’t approve of him (and his book Bad Science) either [11].

Does this mean that I find Hans van Maanen a terrific science journalist? No, not really. I do often agree with him (see, for example, this post [12]), and he is one of those rare journalists with real expertise in research methodology. However, his columns don’t seem to be written for a large audience: they seem too complex for most lay people. One thing I learned during a science journalism course is that one should explain all jargon to one’s audience.

Personally I find this critical Dutch blog post [13] about the Music Marker Theory far more balanced. After a clear description of the study, Linda Duits concludes that its results are pretty obvious and that the mini-hype surrounding the research is caused by the positive tone of the press release. She stresses that prediction is not predetermination and that the specific musical genres are not what matters: hip-hop doesn’t lead to criminal activity, nor metal to vandalism.

And this critical piece in Jezebel [14] reaches far more people by talking in plain, colourful language that is hilarious at times.

It also has a swell title: “Delinquents Have the Best Taste in Music”. Now that is an apt conclusion!

———————-

*¹ Since Daniel refers neither to open (trial) data access nor to the fact that peer review itself may fail, I ignore these aspects for the sake of the discussion.

*² Coincidence? Keulemans covered the music marker study quite uncritically (positively).

Photo Credits

http://www.phdcomics.com/comics/archive.php?comicid=1174

References

  1. Daniel Lakens: Is dit nou goede Wetenschap? – Jan 24, 2013 (sites.google.com/site/lakens2/blog)
  2. Hans van Maanen: De smaak van boefjes in de dop, De Volkskrant, Jan 12, 2013 (vanmaanen.org/hans/columns/)
  3. ter Bogt, T., Keijsers, L., & Meeus, W. (2013). Early Adolescent Music Preferences and Minor Delinquency PEDIATRICS DOI: 10.1542/peds.2012-0708
  4. Lindsay Abrams: Kids Who Like ‘Unconventional Music’ More Likely to Become Delinquent, the Atlantic, Jan 18, 2013
  5. Muziekvoorkeur belangrijke voorspeller voor kleine criminaliteit. Jan 8, 2013 (pers.uu.nl)
  6. Maarten Keulemans: Muziek is goede graadmeter voor puberaal wangedrag – De Volkskrant, Jan 12, 2013 (volkskrant.nl)
  7. Maarten Keulemans: Als we nou eens alle wetenschapscommunicatie afschaffen? – Jan 23, 2013 (denieuwereporter.nl)
  8. Roy Meijer: Wetenschapscommunicatie afschaffen, en dan? – Jan 24, 2013 (denieuwereporter.nl)
  9. Michel van Baal: Wetenschapsjournalisten doen ook aan PR – Jan 25, 2013 (denieuwereporter.nl)
  10. What peer review means for science (guardian.co.uk)
  11. Daniel Lakens. Waarom raadde Maarten Keulemans me Bad Science van Goldacre aan? Oct 25, 2012
  12. Why Publishing in the NEJM is not the Best Guarantee that Something is True: a Response to Katan – Sept 27, 2012 (laikaspoetnik.wordpress.com)
  13. Linda Duits: Debunk: worden pubers crimineel van muziek? (dieponderzoek.nl)
  14. Lindy West: Science: “Delinquents Have the Best Taste in Music” (jezebel.com)




Health and Science Twitter & Blog Top 50 and 100 Lists. How to Separate the Wheat from the Chaff.

24 04 2012

Recently a “Top 100 Scientists on Twitter” list went viral on Twitter. It was published at accreditedonlinecolleges.com/blog.*

Most people just tweeted “Top 100 Scientists on Twitter”, others were excited to be on the list, and a few mentioned the absence of scientist X or discipline Y from the top 100.

Two scientists noticed something peculiar about the list: @seanmcarroll spotted two fake (!) accounts under “physics” (as later explained, these were @NIMAARKANIHAMED and @Prof_S_Hawking), and @nutsci (having read two posts of mine about spammy top 50 or 100 lists [1,2]) recognized this Twitter list as spam.

It is surprising how easily such spammy Top 50 or 100 lists (still) go viral, even though they are published only to generate more traffic to the website and/or to earn revenue through click-throughs.

It makes me wonder why well-educated people like scientists and doctors take the bait. Don’t they recognize the spam? Do they feel flattered to be on the list, or do they take offence when they (or someone else who “deserves” it) aren’t included? Or perhaps they just find the list useful and want to share it, without taking a close look?

To help you to recognize and avoid such spammy lists, here are some tips to separate the wheat from the chaff:

  1. Check WHO made the list. Is it from an expert in the field, someone you trust? (and/or someone you like to follow?)
  2. If you don’t know the author in person, check the site which publishes the list (often a “blog”):
    1. Beware if there is no ABOUT section (or little info in it).
    2. Beware if the site mainly (or only) carries these kinds of lists or short, very general blog posts (like “10 ways to…”), unless the author is somebody like Darren Rowse aka @ProBlogger [3].
    3. Beware if it is a very general site producing a diversity of very specialised lists (who can be an expert in all fields?).
    4. Beware if the website has any of the following (not mutually exclusive) characteristics:
      1. Web addresses like accreditedonlinecolleges.com, onlinecolleges.com, onlinecollegesusa.org, onlinedegrees.com (be wary of .com sites in general)
      2. Websites with a “quick degree”, nursing degree, technician school, etc. finder
      3. Prominent links on the homepage to Kaplan University, University of Phoenix, Grand Canyon University, etc.
    5. Reputable sites are less likely to produce nonsense lists. See for instance this “Women in science blogging” list published in the Guardian [4].
  3. When the site itself seems OK, check whether the names on the list seem trustworthy and worth a follow. Clearly, lists with fake accounts (other than lists of “top 50 fake accounts” ;)) aren’t worth the bother: apparently the creator didn’t make the effort to verify the accounts and/or lacks the capacity to understand the tweets or the topic.
  4. Ideally the list should have added value, meaning it should be more than a summary of names with copy-pasted bios or “about” sections.
    For instance, I was recently put on a list at onlinecollegesusa.org [b], but the author had just copied the subtitle of my blog: …. a medical librarian and her blog explores the web 2.0 world as it relates to library science and beyond.
    Sometimes, however, the added value may simply be that the author is a highly recognized expert or opinion leader, as with this Top Health & Medical Bloggers (& Their Twitter Names) List [5] by the well-known health blogger Dean Giustini.
  5. In what way do these lists represent *top* blogs or Twitter accounts? Are their blogs worth reading and/or their Twitter accounts worth following? A Nobel Prize winner may be a top scientist, but not necessarily a good blogger, and may not have interesting tweets. (Personally I know various examples of uninteresting accounts of *celebrities* in health, science and politics.)
  6. Beware if you are actively approached and kindly requested to spread the list to your audience (for this is exactly what they want). It goes like this (note the impersonal tone):

    Your Blog is being featured!

    Hi There,

    I recently compiled a list of the best librarian blogs, and I wanted to let you know that you made the list! You can find your site linked here: [...]

    If you have any feedback please let me know, or if you think your audience would find any of this information useful, please feel free to share the link. We always appreciate a Facebook Like, a Google +1, a Stumble Upon or even a regular old link back, as we’re trying to increase our readership.

    Thanks again, and have a great day!

While parts of such a list may be worthwhile in themselves, it is best NOT TO LINK TO DOUBTFUL LISTS: don’t mention them on Twitter, don’t retweet them and don’t blog about them. For such links are exactly what their creators are after.

But what if you really find this list interesting?

Here are some tips for finding alternatives to these spammy lists (often the opposite of the above-mentioned words of caution):

  1. Find posts/lists produced by experts in the field and/or people you trust or like to follow. Their choice of blogs or twitter-accounts (albeit subjective and incomplete) will probably suit you the best. For isn’t this what it is all about?
  2. Especially useful are posts that give you more information about the people on the list, like this top-10 librarian list by Phil Bradley [6] and the excellent “100+ women healthcare academics” list compiled by @amcunningham and @trishgreenhalgh [7].
    Strikingly, the reason for creating the latter list was that a spammy list that had not been recognized as such (“50 Medical School Professors You Should Be Following On Twitter” [c]) seemed short on women.
  3. In the case of Twitter accounts:
    1. Check existing Twitter lists of people you find interesting to follow. You can follow the entire lists or just those people you find most interesting.
      Examples: I created a list of EBM/Cochrane people & sceptics [8]. Nutritional science grad student @Nutsci has a nutrition-health-science list [9]. The more followers, the more popular the list.
    2. Check interesting conversation partners of people you follow.
    3. Check accounts of people who are often retweeted in the field.
      4. Keep an eye on #FF (#FollowFriday) mentions, where people worth following are highlighted.
      5. Check a topic on Listorious. For instance, @hrana made a list of Twitter doctors [10]. There are also scientist lists (then again, check who made the list and who is on it; some health/nutrition lists are really bad if you’re interested in science rather than junk).
      6. Worth mentioning are shared lists that are open for edit (so there are many contributors besides the curator). Lists [4] and [7] are examples of crowd-sourced lists. Other examples are truly open-to-edit lists using public spreadsheets, like the Top Twitter Doctors [11], created by Dr Ves, and the lists of science and bio(medical) journals [12], created by me.
  4. Finally, if you find the spam top 100 list truly helpful, and don’t know too many people in the field, just check out some of the names without linking to the list or spreading the word.

*For obvious reasons I will not hyperlink to these sites, but if you would like to check them, these are the links:

[a] accreditedonlinecolleges.com/blog/2012/top-100-scientists-on-twitter

[b] onlinecollegesusa.org/librarian-resources-online

[c] thedegree360.onlinedegrees.com/50-must-follow-medical-school-professors-on-twitter

  1. Beware of Top 50 “Great Tools to Double Check your Doctor” or whatever Lists. (laikaspoetnik.wordpress.com)
  2. Vanity is the Quicksand of Reasoning: Beware of Top 100 and 50 lists! (laikaspoetnik.wordpress.com)
  3. Google+ Tactics of the Blogging Pros (problogger.net)
  4. “Women in science blogging” list (guardian.co.uk/science)
  5. Top Health & Medical Bloggers (& Their Twitter Names) List (blog.openmedicine.ca)
  6. Top-10 librarian list by Phil Bradley (www.blogs.com/topten)
  7. 100+ women healthcare academics by Annemarie Cunningham/ Trisha Greenhalgh (wishfulthinkinginmedicaleducation.blogspot.com)
  8. EBM-cochrane people & sceptics (Twitter list by @laikas)
  9. Nutrition-health-science (Twitter list by @nutsci)
  10. Twitter-doctors by @hrana (listorious.com)
  11. Open for edit: Top Twitter Doctors arranged by specialty in alphabetical order (Google Spreadsheet by @drves)
  12. Twitter Biomedical and Other Scientific Journals & Magazines (Google Spreadsheet by @laikas)






Silly Sunday #50: Molecular Designs & Synthetic DNA

23 04 2012

As a teenager I found it hard to picture the 3D structure of DNA, proteins and other molecules. Remember, we had no computers back then, no videos, no 3D pictures or 3D models.

I tried to fill the gap by making DNA molecules out of (used) matches and colored clay, based on descriptions in dry (and dull, 2D) textbooks, but you can imagine that these creative clay figures bore little resemblance to the real molecular structures.

But luckily things have changed over the last 40 years. Not only do we have computers and videos, there are also ready-made molecular models, specially designed for education.

Oh, how I wish my chemistry teachers had had those DNA (starter) kits.

Hat tip: Joanne Manaster (@sciencegoddess) on Twitter.

Curious? Here is the Products Catalog of http://3dmoleculardesigns.com/news2.php

Of course, such “synthesis” (copying) of existing molecules, though very useful for educational purposes, is overshadowed by the recent creation of molecules other than DNA and RNA (xeno-nucleic acids, XNAs) that can be used to store and propagate information and have the capacity for Darwinian evolution.

But that is quite a different story.






Friday Foolery #49: The Shortest Abstract Ever! [2]

30 03 2012

In a previous Friday Foolery post I mentioned what I thought was the shortest abstract ever.

 “Probably not”.

But a reader (“Trollface”) pointed out in a comment that there was an even shorter (and much older) abstract, of a paper in the Bulletin of the Seismological Society of America published in 1974.

The abstract simply says: “Yes.”

It could only be beaten by an abstract saying: “No”, “!”, “?” or a blank one.





Friday Foolery #44. The Shortest Abstract Ever?

2 12 2011

This is the shortest abstract I’ve ever seen:

“probably not”

With many thanks to Michelynn McKnight, PhD, AHIP, Associate Professor, School of Library and Information Science, Louisiana State University, who put it on the MEDLIB-L listserv, saying: “Not exactly structured…. but a great laugh!”

According to Zemanta (articles related to this post) Future Twit also blogged about it.






Friday Foolery #42 So You Think You Can Dance Your PhD Thesis?

5 11 2011

It’s hard to explain your research to non-scientists. My own PhD defense was preceded by a slide show (yes, that was once upon a time, before we used PowerPoint). It was the only part the public could follow a bit, but it was too long, too static and too detailed.

That cannot be said of these videos, in which PhDs from all over the world interpret their graduate research in dance form.

The videos below are the winners of the 2011 edition of the Dance Your PhD contest, organized for the 4th year by Gonzolabs & Science. See http://gonzolabs.org/dance/

There are four categories: chemistry, physics, biology, and social sciences.

The overall winner of 2011 was Joel Miller (physics category), a biomedical engineer at the University of Western Australia in Perth. Miller apparently compensated for his poor dancing skills and the lack of video footage by using stop-motion animation (stringing together about 2,200 photos to make it look as though his “actors” were dancing). His video shows the creation of titanium alloys that are both strong and flexible enough for long-lasting hip replacements.
By the way, I love the song; it fits the dance scenes perfectly.

You can see all winning videos here and all 2011 (this year’s) PhD videos here. You can also check out the 2010 and 2009 PhD dances.

The other 2011 winners were FoSheng Hsu (chemistry category), who guides viewers through the entire sequence of steps required for X-ray crystallography; Emma Ware (social sciences), who studies the traditional ‘stimulus-release’ model of social interaction using pigeon courtship (a beautiful pas de deux); and Edric Kai Wei Tan (biology), with a funny dance about smell-mediated responses to the relatedness of potential mates, simply put: “fruit fly sex”.

Being Dutch, I would like to close with the Dutch winner of the 2010 biology category, Maartje Cathelijne de Jong, who dances her PhD, “The influence of previous experiences on visual awareness.”





An Introduction to the Library for Graduate Students

6 11 2010

Below is a presentation I gave at the “World of Science” course. This is a 3-day course for graduate students that aims to provide them with the fundamental knowledge and skills needed for scientific research and to prepare them for their thesis work at our hospital, the AMC.

The 3-day program comprises a series of presentations on aspects of medical and biomedical research. These include the position of the pharmaceutical industry, the role of scientific journals, the ethical and legal framework of medical research, and the organization and funding of scientific research in the Netherlands. There is also an introduction to the scientific strategy of the AMC, presented by the Dean of the Faculty of Medicine.

Furthermore, there are group discussions, workshops and individual assignments.

The course is held outside the AMC. It provides a unique opportunity for a closer and more personal meeting with each other and with leading AMC scientists, to discuss such matters as the choices they made in their careers.

I had 15 minutes (20, actually) to say something about the library. (It used to be 30 minutes, but that wasn’t received so well.) That is too short to teach them searching. Besides, searching is dealt with in our courses, so why give it all away?

I chose to show them how the library could serve them, in an interactive and informal way.

First I asked them how they saw the library. Many, if not all, used our website. Phew, that was a relief!

I spent most of the time talking about searching, showing examples of searches that failed, which is the best way to show them that they might need some extra education in this respect.

The atmosphere was good and informal, there were many questions, and it was at times quite hilarious, not only because of the presentation itself, but also because I almost managed to ruin the screen (I fell against it) and because I walked away with the microphone.

I also had the opportunity to listen to the next speaker, a young scientist who had recently finished his thesis. His talk was great to listen to. He talked about his own experience (which was not really representative, imho, because it was quite a success story) and gave the would-be PhDs 10 handy tips, all in a very entertaining way.

But for now, here is my presentation.







