Reference Management Software, Shut Down of 5 Google Apps and a Plane that Crashed.

18 01 2009

Reference management software, the shutdown of 5 Google apps and a plane that crashed. What do they have in common? Nothing, except that these three unrelated subjects all reached me via Twitter last Thursday evening.

[1] When I checked my Tweetdeck (a Twitter client) I saw a huge number of tweets (Twitter messages) about the crash of a plane in the Hudson River. It now appears that Twitter and Flickr broke the news 15 minutes before the mainstream media. Below is the first crash picture, posted to Twitter from an iPhone by Janis Krums, who took it from a ferry. Earlier (in "Twitter as a modern tam-tam") I gave some other examples of Twitter as a breaking-news platform.

[Image: Janis Krums' photo of the plane in the Hudson, posted on Twitter]

[2] Twitter is also a useful tool for staying up to date and exchanging thoughts. For instance, some tweeple (people on Twitter) had been asking about free reference management software. I retweeted (RT, resent) the message, and on Thursday evening DrShock (of Dr Shock MD, PhD) tweeted a very useful link to a Wikipedia article comparing reference management software, which was in turn retweeted to the Twitter community.

The Wikipedia article gives a comprehensive overview of the following software: 2collab, Aigaion, BibDesk, Biblioscape, BibSonomy, Bibus, Bookends, CiteULike, Connotea, EndNote, JabRef, Papers, ProCite, Pybliographer, refbase, RefDB, Referencer, Reference Manager, RefWorks, Scholar’s Aid, Sente, Wikindx, WizFolio, Zotero.

The following tables are included: operating system support, export and import file formats, citation styles, reference list file formats, word processor integration, database connectivity, password “protection” and network versions.

Very useful (although not always accurate). See: http://en.wikipedia.org/wiki/Comparison_of_reference_management_software.

[Image: the Wikipedia comparison of reference management software]

[3] @Symtym (of the blog Symtym) had just taught me how to use Google Notebook to clip and collect information as you surf the web, organize the notes in notebooks and publish the public notes automatically to Twitter via twitterfeed. I found it really handy and gathered some material to write a post about it.

But then came the news, brought to me by @Dymphie (of Deetjes (Dutch)), that Google has decided to close many services, including Notebook as well as Google Video, Catalog, Jaiku and Dodgeball, or as ReadWriteWeb put it: “Google Giveth, and Taketh Away” (see the announcement on the Google Operating System blog).

[Image: announcement of the Google services being discontinued]

Although Google Notebook itself will remain available, active development will be stopped. Of course this was shocking news for many faithful users, including me, Dr. Shock and many others (see comments here).

[Image: tweet by Dr Shock reacting to Google Notebook going down]

What are the alternatives? Soon @DrCris, author of several blogs including Applequack, tweeted about a solution soon to come: “Evernote is working on a Google Notebook importer“. I have heard great things about Evernote, and many doctors seem to use it, so I might as well give it a try.

[Image: tweet announcing that Evernote is working on a Google Notebook importer]

Diigo is also planning to make a GN importer (see here). Presumably other tools will follow soon.

Note added:

Two articles on Lifehacker give tips on [1] where to go when Google Notebook goes down and [2] how to import the entirety of your Google Notebook into UberNote (thanks, Dr. Shock).

——————-

“Reference management software, the shutdown of 5 Google apps and a plane that crashed”. What do these have to do with each other? Nothing really, except that I was informed of all of them via Twitter on Thursday evening.

[1] Earlier I already gave examples of how Twitter works as a modern tam-tam and often has a scoop. Thursday that was again the case: the first reports of a plane crashing into the Hudson River came in via Twitter.

[2] Twitter is also useful for sharing information. This week people asked about free reference management software. I passed that on (RT, or retweet), and on Thursday @DrShock (of Dr Shock MD, PhD) came up with a very useful link to an article in Wikipedia. By retweeting it, a large number of followers were then informed.

The article compares the following software: 2collab, Aigaion, BibDesk, Biblioscape, BibSonomy, Bibus, Bookends, CiteULike, Connotea, EndNote, JabRef, Papers, ProCite, Pybliographer, refbase, RefDB, Referencer, Reference Manager, RefWorks, Scholar’s Aid, Sente, Wikindx, WizFolio, Zotero, with respect to the following points: operating system support, export and import file formats, citation styles, reference list file formats, word processor integration, database connectivity, password “protection” and network versions.

Very useful and well organized (in table form, with colors). See: http://en.wikipedia.org/wiki/Comparison_of_reference_management_software.

[3] From @Symtym (blog: symtym) I had just learned how to use Google Notebook to clip, save and collect text in notebooks while surfing the web, and then publish it to Twitter via twitterfeed (messages automatically shortened to 140 characters). I found it extremely handy: an ideal way to quickly organize information to look at later, to write a piece about and/or to share directly with others.

But then, like a bolt from the blue, the news reached me via @Dymphie (of Deetjes) that the plug would be pulled on several Google applications, including Google Notebook, and in addition Google Video, Catalog, Jaiku and Dodgeball.

Google Notebook itself will remain for a while, but its development will be halted. Of course this is quite a shock for faithful users: first people are enthused about using a new tool, and then it is taken away from them again.

Fortunately @DrCris, author of Applequack among other blogs, tweeted almost immediately that Evernote is working on a Google Notebook import function. I have heard very good things about Evernote and many doctors use it, so I will give it a try as well. Diigo is also developing a Google Notebook import function (see here), and other tools will probably follow. Still, it remains annoying to keep having to switch tools. But perhaps that is the price you pay for free applications.

Added afterwards

Two articles on Lifehacker address this last point: [1] where to go when Google Notebook goes down and [2] how to import the entirety of your Google Notebook into UberNote (thanks to Dr. Shock).





Google spreadsheet as a wiki.

12 12 2008

Google has developed so many new applications in a short time that it is difficult to keep abreast of the latest developments.

One useful application is Google Docs, a free, web-based word processor, spreadsheet, presentation and form application offered by Google. It allows users to create and edit documents online while collaborating in real time with other users.

During the Spoetnik Library 2.0 course we used Google Docs to write documents together, which we published on our blogs.

You can also choose to keep documents private. The advantage compared to MS Office is that you can access your docs from anywhere on the web. All you need is to log into your Google account.

Ves Dimov of the Clinical Cases and Images Blog drew my attention to an option in Google Spreadsheets (try-out here), whereby you can allow people to edit a spreadsheet as if it were a wiki.

This option was described at the Google Operating System Blog (a blog with unofficial news and tips about Google) as follows:

“Google Spreadsheets added an option in the sharing dialog that allows anyone to view or edit the spreadsheet just by knowing the URL. Until now, you had to send an invitation URL that contained a secret code and the people you invited had to login using a Google account. If you click on the Share tab and enable “Let people edit without signing in*“, your spreadsheet becomes a wiki that can be edited by anyone.”

[Image: the sharing dialog ("Share with the world"), with numbered steps]

Not only has Ves described this possibility in a blog post, he has also set up a spreadsheet that lists “The best medical podcasts”.** Anybody can edit the list (see the original spreadsheet here), and you are all invited to do so.

According to Ves (and Google) you can easily embed the Medical Podcasts spreadsheet by just copying its HTML code into your own website. Alas, WordPress.com blogs appear to be a notable exception (again, grrr!).

Thus, to see how the spreadsheet evolves you have to go to the URL or to Ves’s blog post here, or embed the spreadsheet yourself.

To give you an impression I will show a figure of the (provisional) embedded spreadsheet instead:

[Image: the embedded spreadsheet of medical podcasts]

* original text: Anyone can edit this document WITHOUT LOGGING IN

** A closer look at the date revealed that the blog post already dates from May 2008.

————

Google Spreadsheets (try-out here) is a free, Excel-like file within Google Docs that you can work on online from any PC with an internet connection, without having to use USB sticks or e-mail (see the Spoetnik course, week 8). You can work on a document alone or together with others.

Ves Dimov of the Clinical Cases and Images Blog pointed me to an option within Google Spreadsheets that lets people work on your spreadsheet without having to be logged in. The spreadsheet then functions as a kind of wiki.

This possibility was already described in May of this year on the Google Operating System Blog. Once you have created a Google spreadsheet, you can indicate in the dialog that you want to share it (“Share” tab) and that people may edit it without signing in (“Let people edit without signing in”*). You can then send the URL (marked 4 in the figure above) to other people, who can subsequently edit it. It is important to save and close the spreadsheet afterwards.

Ves immediately put this into practice and added a list of the best medical podcasts, which anyone can edit. You can find the original spreadsheet by clicking here.

I did not manage to embed this spreadsheet in WordPress. So to see the list of best medical podcasts in real time, you can go to Ves’s blog post, view the URL and/or embed it yourself.





Friday Foolery #31 Waving goodbye… (or not?)

13 08 2010

WHEN THE SHIP SANK…

(it was August 4th; I remember I was at home multitasking: twittering, blogging, mailing, scratching my back, playing patience, humming a tune and looking out of the window)

WHEN..


THE REASON BEHIND IT


WHAT’s NEXT?

HERE AT WORDSTREAM THEY THINK THE SAME.

THE GOOGLE FLOPS & FAILURE GRAVEYARD IS EXTENDING

(HT: @drves)

Google Flops & Failures – The Failed Google Graveyard


I still miss Google Notebook. And Google Wave sure had great potential.

To think that a year ago I told people in a workshop that Google Wave could make their lives easier 😉


Google Wave had potential, especially as a collaboration tool….

See this post at Tip of the Iceberg (how appropriate) describing how Google Wave was used to collaborate with students.

Since much of the code is open source, ambitious developers may pick up where Google left off.

But some people hope Google Wave may be saved. It might for instance be worth saving for health systems.

Want to “Save the Wave”? Then click on the following image and express your support.

click to sign the petition






Packrati.us = Twitter + Delicious = Useful + Simple

18 03 2010

To me, Twitter is an essential source of information. It is an easy way to keep up to date in my field, it is fast, and it is an ideal networking site for building relationships. Without it I wouldn’t have ‘met’ so many excellent and interesting people. In fact, those people are my living filter for the Twitter noise (see previous post): I only follow people with whom I share the same interests (at least in some respects). Twitter is also one of my sources of inspiration for blogging, and vice versa, it is an outlet for my blog posts.

Unfortunately, Twitter has one shortcoming: tweets are volatile. Twitter is designed to catch conversations in real time, so it is not easy to “keep” tweets or read them later. Usually your tweets get lost after 7 to 10 days and cease to be found by Twitter Search. Some tweets can still be Googled, but that is not a secure way of keeping them.

At least I safeguard my favorited tweets by taking an RSS feed of my favorites (yellow-starred in the figure).

But this is just a way to keep your favorite tweets for a (more) prolonged time.

What you would also like is to “archive” the URLs of the actual pages that seem interesting (the red http links in the tweets).

I used Google Notebook for that. It was near perfect: the free online Google application allowed saving and organizing clips of information (via a Firefox add-on) while online (see Wikipedia). The information was saved to “notebooks” that could be made “public” and automatically fed into Twitter to share with others. It was easy to trace articles back by searching or browsing.

But that is no more. Google decided to drop the development of Google Notebook. In addition, several of my notebooks were flagged as violating Program Policies?!

I tried Evernote as an alternative, but it could never win my heart. Too time-consuming, for one thing.

I may not have tried hard enough, but testing tools is not my job. I’m just looking for tools and ways that make my life in the web 2.0 world easy. The tools must be easy to understand and easy to use.

A new tool, Packrati.us (http://packrati.us/), seems to meet all my needs in this respect. A week ago I read about it in a TechCrunch article entitled “Packrati.us: A Dead Simple Way To Make Delicious Bookmark The Links You Tweet”. Dead simple: that was what I needed!

Packrati.us is a simple bookmarking service. Once you register, they follow your Twitter feed, and whenever one of your tweets contains URLs, they are added to your Delicious.com bookmarks.
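Conceptually, what such a service does is not complicated: watch a Twitter stream, pull the links out of each tweet, expand the shortened URLs and store them as bookmarks. The snippet below is a hypothetical sketch of that idea in Python (my own illustration, not Packrati.us code; the actual posting to Delicious is omitted, and the example tweet and tags are made up):

  import re
  import requests  # assumed to be available; only used here to expand shortened URLs

  URL_PATTERN = re.compile(r"https?://\S+")

  def bookmarks_from_tweet(tweet_text, tweet_author):
      """Extract links from one tweet and turn them into bookmark records."""
      bookmarks = []
      for short_url in URL_PATTERN.findall(tweet_text):
          # Follow redirects to expand URLs shortened by bit.ly and the like.
          expanded = requests.head(short_url, allow_redirects=True).url
          bookmarks.append({
              "url": expanded,
              "notes": tweet_text,                    # the tweet itself ends up in the notes
              "tags": ["packrati.us", tweet_author],  # plus any tags you add manually later
          })
      return bookmarks  # these records would then be sent to Delicious (API call omitted)

  print(bookmarks_from_tweet("RT @amcunningham worth reading http://bit.ly/example", "amcunningham"))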

So, for instance, I retweeted @amcunningham and @jrbtrip, who linked to an interesting article regarding bias in the dissemination and publication of research. The link is a shortened URL.

When I visit My Delicious (http://delicious.com/) via an add-on in Firefox, I see that the link is automatically saved in Delicious.

The bookmark shows

  1. the link to the URL (title),
  2. the number of people bookmarking the link,
  3. the actual tweet, included in the notes (more notes can be added),
  4. the expanded URL,
  5. an automatic tag (packrati.us), chosen to indicate that the bookmark was automatically imported from Twitter, plus other tags that I added manually to facilitate retrieval.

When you click on the link you go to the actual article. I can always find the bookmark when I search for tags like “bias”.

The following links can be automatically loaded into Delicious:

  • Links in your tweets and retweets (tweets you resend)
  • Links in tweets directed to you (sent by others)
  • Links in your favorited tweets (!) (quite new)

You can choose to:

  • Expand URLs that have been shortened with a URL shortening service
  • Replace existing bookmarks (no duplication; old tags are kept)
  • Not convert hashtags from tweets into tags for the bookmarks (default = hashtags become tags)
  • Exclude tweets with specific tags (new)
  • Exclude tweets from a selection of sources
  • Add the sender of the tweet (when other than yourself)

Packrati.us is under continuous development; some features have just been added. I love the new feature that favorited tweets are bookmarked too (alas, it doesn’t work retrospectively, so the favorites shown above are not included).

In practice you can get a lot of bookmarks if you tweet and favorite a lot. It is good to exclude some tweets beforehand, and in my opinion necessary to prune the bookmarks afterwards and add tags. Otherwise it becomes a disorderly mess.

Although Packrati.us links only Twitter and Delicious, you can use each platform separately. I also use Delicious to manually add bookmarks of websites I like. Yes, thanks to Packrati.us I learned to love Delicious again.





Evidence Based Point of Care Summaries [1] No “Best” Among the Bests?

13 10 2011

For many of today’s busy practicing clinicians, keeping up with the enormous and ever-growing amount of medical information poses substantial challenges [6]. It’s impractical to do a PubMed search to answer each clinical question and then synthesize and appraise the evidence, simply because busy health care providers have limited time and many questions per day.

As repeatedly mentioned on this blog ([6,7]), it is far more efficient to try to find aggregate (or pre-filtered or pre-appraised) evidence first.

[Figure: Haynes’ “5S” levels of evidence (adapted by [1])]

There are several forms of aggregate evidence, often represented as the higher layers of an evidence pyramid (because they aggregate individual studies, represented by the lowest layer). There are confusingly many pyramids, however [8], with different kinds of hierarchies, based on different principles.

According to the “5S” paradigm [9] (now evolved into a 6S model [10]), the peak of the pyramid consists of the ideal, but not yet realized, computer decision support systems that link individual patient characteristics to the current best evidence. According to the 5S model, the next best sources are evidence-based textbooks.
(Note: EBM and textbooks almost seem a contradiction in terms to me; personally I would not put many of the POCs near the top. Also see my post: How Evidence Based is UpToDate really?)

Whatever their exact place in the EBM pyramid, these POCs are helpful to many clinicians. There are many different POCs (see HLWIKI Canada for a comprehensive overview [12]) with a wide range of costs, varying from free with ads (eMedicine) to very expensive site licenses (UpToDate). Because of the costs, hospital libraries have to choose among them.

Choices are often based on user preferences and satisfaction, balanced against costs, scope of coverage, etc. Such choices are often subjective, and people tend to stick to the databases they know.

Initial literature about POCs concentrated on user preferences and satisfaction. A New Zealand study [3] among 84 GPs showed no significant difference in preference for, or usage levels of, DynaMed, MD Consult (including First Consult) and UpToDate. The proportion of questions adequately answered by POCs differed per study (see the introduction of [4] for an overview), varying from 20% to 70%.
McKibbon and Fridsma ([5], cited in [4]) found that the information resources chosen by primary care physicians were seldom helpful in providing the correct answers, leading them to conclude that:

“…the evidence base of the resources must be strong and current…We need to evaluate them well to determine how best to harness the resources to support good clinical decision making.”

Recent studies have tried to objectively compare online point-of-care summaries with respect to their breadth, content development, editorial policy, the speed of updating and the type of evidence cited. I will discuss 3 of these recent papers, but will review each paper separately. (My posts tend to be pretty long and in-depth. So in an effort to keep them readable I try to cut down where possible.)

Two of the three papers are published by Rita Banzi and colleagues from the Italian Cochrane Centre.

In the first paper, reviewed here, Banzi et al [1] first identified English Web-based POCs using Medline, Google, librarian association websites, and information conference proceedings from January to December 2008. In order to be eligible, a product had to be an online-delivered summary that is regularly updated, claims to provide evidence-based information and is to be used at the bedside.

They found 30 potentially eligible POCs, of which the following 18 databases met the criteria: 5-Minute Clinical Consult, ACP-Pier, BestBETs, CKS (NHS), Clinical Evidence, DynaMed, eMedicine, eTG complete, EBM Guidelines, First Consult, GP Notebook, Harrison’s Practice, Health Gate, Map Of Medicine, Micromedex, Pepid, UpToDate, ZynxEvidence.

They assessed and ranked these 18 point-of-care products according to: (1) coverage (volume) of medical conditions, (2) editorial quality, and (3) evidence-based methodology. (For operational definitions see appendix 1)

From a quantitative perspective, DynaMed, eMedicine and First Consult were the most comprehensive (88%) and eTG complete the least (45%).

The best editorial quality was delivered by Clinical Evidence (15), UpToDate (15), eMedicine (13), DynaMed (11) and eTG complete (10) (scores are shown in brackets).

Finally, BestBETs, Clinical Evidence, EBM Guidelines and UpToDate obtained the maximum score (15 points each) for evidence-based methodology, followed by DynaMed and Map Of Medicine (12 points each).
As expected, eMedicine, eTG complete, First Consult, GP Notebook and Harrison’s Practice had a very low EBM score (1 point each). Personally I would not even have considered these online sources “evidence based”.

The calculations seem very “exact”, but the assumptions upon which these figures are based are open to question, in my view. Furthermore, all items have the same weight. Isn’t evidence-based methodology far more important than “comprehensiveness” and editorial quality?

Certainly since “volume” is “just” estimated by analyzing the extent to which 4 random chapters of the ICD-10 classification are covered by the POCs. Some sources, like Clinical Evidence and BestBETs (which score low on this item), don’t aim to be comprehensive but only “answer” a limited number of questions: they are not textbooks.

Editorial quality is determined by scoring specific indicators of transparency: authorship, peer review procedure, updating, disclosure of authors’ conflicts of interest, and commercial support of content development.

For the EB methodology, Banzi et al scored the following indicators:

  1. Is a systematic literature search or surveillance the basis of content development?
  2. Is the critical appraisal method fully described?
  3. Are systematic reviews preferred over other types of publication?
  4. Is there a system for grading the quality of evidence?
  5. When expert opinion is included, is it easily recognizable as such, next to studies’ data and results?

The score for each of these indicators is 3 for “yes”, 1 for “unclear”, and 0 for “no” (if judged “not adequate” or “not reported”).
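Concretely, each product’s methodology score is just the sum of five small numbers, with a maximum of 15. The snippet below is my own illustrative sketch of that arithmetic (not code from the paper); the example judgements are hypothetical:

  # Illustrative sketch (not from the paper): five EBM-methodology indicators,
  # each judged "yes" (3 points), "unclear" (1 point) or "no" (0 points).
  POINTS = {"yes": 3, "unclear": 1, "no": 0}

  def ebm_score(judgements):
      """Sum the points over the five indicators (maximum 5 x 3 = 15)."""
      return sum(POINTS[j] for j in judgements)

  print(ebm_score(["yes"] * 5))                          # 15: the maximum score
  print(ebm_score(["yes", "yes", "no", "yes", "yes"]))   # 12: four "yes" and one "no"

A single “no”, as in the second example, costs a full three points, which is exactly how DynaMed ended up with 12 instead of 15 (see below).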

This leaves little room for qualitative differences and mainly relies upon adequate reporting. As discussed earlier in a post where I questioned the evidence-based-ness of UpToDate, there is a difference between tailored searches and checking a limited list of sources (indicator 1). It also matters whether the search is mentioned or not (transparency), whether it is of good quality, and whether it is extensive or not. For lists, it matters how many sources are “surveyed”. It also matters whether one or both methods are used. These important differences are not reflected by the scores.

Furthermore, some points may be more important than others. Personally, I find indicator 1 the most important: what good is appraising and grading if it isn’t applied to the most relevant evidence? It is “easy” to do a grading, or to copy it from other sources (yes, I wouldn’t be surprised if some POCs are doing this).

On the other hand, a zero for a single indicator can weigh too heavily on the total score.

DynaMed got 12 instead of the maximum 15 points because its editorial policy page didn’t explicitly describe its absolute prioritization of systematic reviews, although it really adheres to that in practice (see the comment by editor-in-chief Brian Alper [2]). Had DynaMed received the deserved points for this indicator, it would have had the highest score overall.

The authors further conclude that none of the dimensions turned out to be significantly associated with the other dimensions. For example, BestBETs scored among the worst on volume (comprehensiveness), with an intermediate score for editorial quality and the highest score for evidence-based methodology. Overall, DynaMed, EBM Guidelines and UpToDate scored in the top quartile for 2 out of 3 variables and in the 2nd quartile for the 3rd variable (but, as explained above, DynaMed really scored in the top quartile for all 3 variables).

On the basis of their findings, Banzi et al conclude that only a few POCs satisfied the criteria, with none excelling in all.

The finding that Pepid, eMedicine, eTG complete, First Consult, GP Notebook, Harrison’s Practice and 5-Minute Clinical Consult only obtained 1 or 2 of the maximum 15 points for EBM methodology confirms my “intuitive grasp” that these sources really don’t deserve the label “evidence based”. Perhaps we should make a stricter distinction between “point of care” databases, defined as a point where patients and practitioners interact, particularly referring to the context of the provider-patient dyad (definition by Banzi et al), and truly evidence-based summaries. Only a few of the tested databases would fit the latter definition.

In summary, Banzi et al reviewed 18 Online Evidence-based Practice Point-of-Care Information Summary Providers. They comprehensively evaluated and summarized these resources with respect to coverage (volume) of medical conditions, editorial quality, and evidence-based methodology.

Limitations of the study, also according to the authors, were the lack of a clear definition of these products, the arbitrariness of the scoring system and the emphasis on the quality of reporting. Furthermore, the study didn’t really assess the products qualitatively (i.e. with respect to performance), nor did it take into account that products may have different aims. Clinical Evidence, for instance, only summarizes evidence on the effectiveness of treatments for a limited number of diseases; therefore it scores badly on volume while excelling on the other items.

Nevertheless it is helpful that POCs are objectively compared, and the comparison may serve as a starting point for decisions about acquisition.

References (not in chronological order)

  1. Banzi, R., Liberati, A., Moschetti, I., Tagliabue, L., & Moja, L. (2010). A Review of Online Evidence-based Practice Point-of-Care Information Summary Providers. Journal of Medical Internet Research, 12(3). DOI: 10.2196/jmir.1288
  2. Alper, B. (2010). Review of Online Evidence-based Practice Point-of-Care Information Summary Providers: Response by the Publisher of DynaMed. Journal of Medical Internet Research, 12(3). DOI: 10.2196/jmir.1622
  3. Goodyear-Smith, F., Kerse, N., Warren, J., & Arroll, B. (2008). Evaluation of e-textbooks: DynaMed, MD Consult and UpToDate. Australian Family Physician, 37(10), 878-82. PMID: 19002313
  4. Ketchum, A., Saleh, A., & Jeong, K. (2011). Type of Evidence Behind Point-of-Care Clinical Information Products: A Bibliometric Analysis. Journal of Medical Internet Research, 13(1). DOI: 10.2196/jmir.1539
  5. McKibbon, K., & Fridsma, D. (2006). Effectiveness of Clinician-selected Electronic Information Resources for Answering Primary Care Physicians’ Information Needs. Journal of the American Medical Informatics Association, 13(6), 653-659. DOI: 10.1197/jamia.M2087
  6. How will we ever keep up with 75 Trials and 11 Systematic Reviews a Day? (laikaspoetnik.wordpress.com)
  7. 10 + 1 PubMed Tips for Residents (and their Instructors) (laikaspoetnik.wordpress.com)
  8. Time to weed the (EBM-)pyramids?! (laikaspoetnik.wordpress.com)
  9. Haynes RB. Of studies, syntheses, synopses, summaries, and systems: the “5S” evolution of information services for evidence-based healthcare decisions. Evid Based Med 2006 Dec;11(6):162-164. [PubMed]
  10. DiCenso A, Bayley L, Haynes RB. ACP Journal Club. Editorial: Accessing preappraised evidence: fine-tuning the 5S model into a 6S model. Ann Intern Med 2009 Sep 15;151(6):JC3-2, JC3-3. PMID: 19755349 [free full text]
  11. How Evidence Based is UpToDate really? (laikaspoetnik.wordpress.com)
  12. Point of care decision-making tools - Overview (hlwiki.slais.ubc.ca)
  13. UpToDate or Dynamed? (Shamsha Damani at laikaspoetnik.wordpress.com)






Internet Sources & Blog Posts in a Reference List? Yes or No?

13 02 2011

A Dutch librarian asked me to join a blog carnival of Dutch Librarians. This carnival differs from medical blog carnivals (like the Grand Rounds and “Medical Information Matters“) in its approach. There is one specific topic which is discussed at individual blogs and summarized by the host in his carnival post.

The current topic is: “Can you use an internet source?”

The motive of the archivist Christian van der Ven for starting this discussion was the response to a post at his blog De Digitale Archivaris. In that post he wondered whether blog posts could be used by students writing a paper. It struck him that students rarely use internet sources and that most teachers didn’t encourage or allow their use.

Since I work as a medical information specialist I will adapt the question as follows:

“Can you refer to an internet source in a biomedical scientific article, paper, thesis or survey”?

I explicitly use “refer to” instead of “use”, because I would prefer to avoid discussing “plagiarism” and “copyright”. Obviously I would object to any form of uncritical copying of a large piece of text without checking its reliability and copyright issues (see below).


Previously I have blogged about the trouble with Wikipedia as a source of information. In short, as Wikipedians say, Wikipedia is the best source to start with in your research, but it should never be the last one (quote from @berci in a Twitter interview). In reality, most students and doctors do consult Wikipedia and Dr. Google (see here and here). However, they may not (and mostly should not) use it as such in their writings. As I have indicated in the earlier post, it is not (yet) a trustworthy source for scientific purposes.

But the internet is more than Wikipedia and random Googling. As a matter of fact, most biomedical information is now in digital form. The speed at which biomedical knowledge is advancing is tremendous, and books are soon out of date. Thus most library users confine themselves to articles in peer-reviewed scientific journals or to datasets (geneticists). Generally my patrons search the largest freely available database, PubMed, to access citations in mostly peer-reviewed (and digital) journals. These are generally considered (reliable) internet sources, but they do not essentially differ from their printed equivalents.

However, there are other internet sources that provide reliable or useful information. What about publications by the National Health Council, an evidence-based guideline by NICE, and/or published evidence tables? What about synopses (critical appraisals) such as those published by DARE, like this one? What about evidence summaries by Clinical Evidence, like this one? All are excellent, evidence-based, commendable online resources. Without doubt these can be used as references in a paper. Thus there is no clear-cut answer to the above question. Whether an internet source should be used as a reference in a paper depends on the following:

  1. Is the source relevant?
  2. Is the source reliable?
  3. What is the purpose of the paper and the topic?

Furthermore it depends on the function of the reference (not mutually exclusive):

  1. To give credit
  2. To add credibility
  3. For transparency and reproducibility
  4. To help readers find further information
  5. For illustration (as an example)

Let’s illustrate this with a few examples.

  • Students who write an overview of a medical topic can use any relevant reference, including narrative reviews, UpToDate and other internet sites, if appropriate.
  • Interns who have to prepare a CAT (critically appraised topic) should refer to 2-3 papers providing the highest evidence (i.e. a systematic review and/or randomized controlled trial).
  • Authors writing systematic reviews only include high-quality primary studies (except for the introduction perhaps). In addition they should (ideally) check congress abstracts, clinical trial registers (like clinicaltrials.gov), or actual raw data (e.g. produced by a pharmaceutical company).
  • Authors of narrative reviews may include all kinds of sources. That is also true for editorials, primary studies or theses. Reference lists should be as accurate and complete as possible (within the limits posed by, for instance, the journal).

Blogs, wikis, podcasts and tweets.
Papers can also refer to blog posts, wikis or even tweets (there is APA guidance on how to cite these). Such sources may be referred to merely because they serve as an example (articles about social media in medicine, for instance, like this recent paper in Am Pharm Assoc that analyzes pharmacy-centric blogs).

Blog posts are usually seen as lacking in factual reliability. However, there are many blogs, run by scientists, that are (or can be) a trustworthy source. As a matter of fact, it would be inappropriate not to cite these sources if the information was valuable, useful and actually used in the paper.
Some examples of excellent biomedical web 2.0 sources:

  • The Clinical Cases and Images Blog of Ves Dimov, MD (drVes on Twitter), a rich source of clinical cases. My colleague once found the only valuable information (a rare patient case) at Dr Ves’ blog, not in PubMed or other regular sources. Why not cite this blog post, if this patient case were to be published?
  • Researchblogging.org is an aggregator of expert blog posts about peer-reviewed research. There are many other high-quality scientific blogging platforms, like Scientopia, the PLOS blogs, etc. These kinds of blogs critically analyse peer-reviewed papers. For instance, this blog post by Marya Zilberberg reveals how an RCT stopped early for efficacy can still be severely flawed, yet lead to a level-one recommendation. Very useful information that you cannot find in the actual published study nor in the evidence-based guideline.
  • An example of an excellent and up-to-date wiki is the open HLWIKI (maintained by Dean Giustini, @giustini on Twitter), with entries about health librarianship, social media and current information technology topics, having over 565 pages of content since 2006! It has very rich content with extensive reference lists and could thus easily be used in papers on library topics.
  • Another concept is usefulchem.wikispaces.com (an initiative of Jean-Claude Bradley, discussed in a previous post). This is not only a wiki but also an open notebook, where actual primary scientific data can be found. Very impressive.
  • There is also WikiProteins (part of a concept wiki), an open, collaborative wiki focusing on proteins and their role in biology and medicine.

I would like to end my post with two thoughts.

First, the world is not static. In the future, scientific claims could be represented as formal RDF statements (triples) instead of, or next to, the journal publications as we know them (see my post on nanopublications). Such “statements” (already realized with regard to proteins and genes) are more easily linked and retrieved. After all, peer review doesn’t prevent fraud, misrepresentation or overstatements.
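To make the idea of a claim-as-triple a bit more tangible, here is a minimal sketch (my own illustration, not taken from the nanopublication literature) using Python’s rdflib library and a made-up example statement:

  from rdflib import Graph, Literal, Namespace  # third-party library, assumed to be installed

  EX = Namespace("http://example.org/")  # made-up namespace for this illustration
  g = Graph()

  # One hypothetical claim expressed as a subject-predicate-object triple:
  g.add((EX.GeneX, EX.isAssociatedWith, EX.DiseaseY))
  g.add((EX.GeneX, EX.label, Literal("gene X (illustrative only)")))

  for subject, predicate, obj in g:
      print(subject, predicate, obj)

Because each claim is a small, machine-readable statement like this, it can be linked and retrieved far more easily than a claim buried in the prose of a journal article.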

Another side of the coin in this “blogs as an internet source” discussion is whether the citation is always appropriate and/or accurate.

Today a web page (cardio.nl/ACS/StudiesRichtlijnenProtocollen.html), evidently meant for the education of residents, linked to one of my posts. Almost the entire post was copied, including a figure, but the only link used was one of my tags, EBM (hidden in the text). Even worse, blog posts are sometimes cited to lend credit to disputable content. I’ve mentioned the tactics of Organized Wisdom before. More recently, a site called deathbyvaccination.com linked out of context to one of my blog posts. Given the recent revelation of fraudulent anti-vaccine papers, I’m not very happy with that kind of “attribution”.






#EAHIL2009 Web 2.0 and Health Information – Chris Mavergames

4 06 2009

[Image: EAHIL 2009]

I’m in Dublin to attend the EAHIL workshop 2009.
The EAHIL is the European Association for Health Information and Libraries.

The EAHIL workshop 2009 really started Wednesday afternoon. Tuesday morning, as a foretaste of the official program, I attended a Continuing Education Course: the Web 2.0 and Health Information course by Chris Mavergames.

Chris Mavergames is currently the Web Operations Manager/Information Architect for the Cochrane Collaboration. Before that, he worked in the field of information and library science.

So Chris and I are really colleagues, but we didn’t realize it until we “met” on Twitter.

On this hot day in June I was pleased that the workshop was held in the cool Berkeley Library of Trinity College.
They have chosen really good locations for this EAHIL workshop. Most presentations are in Dublin Castle, another place at the heart of Irish history.

The workshop took approximately 3 hours and consisted of two presentations, followed by short Q&A’s and an open forum afterwards.

The presentations:

  • “Web 2.0 and Health Information”
  • A case study of the experiences of implementing and using these technologies in a large, non-profit organization (the Cochrane Collaboration).

Eighteen people could attend. Each of us had a computer, which raised the expectation that they would be needed during the workshop. They were not, but they were handy anyway for looking things up and drafting a post. And... I could post this message on Twitter before Chris uploaded a photo of his class to TwitPic. LOL.

[Image: Chris taking a photo of the class]

[Image: Chris Mavergames’ photo of the class]

Web 2.0 versus web 1.0
Chris began by asking the audience how many people had used, or at least heard of, Facebook, LinkedIn or any other social networking service. He then asked which tools were being used. Afterwards he admitted he had already checked everyone’s presence on various social bookmarking sites. Hilarious.

To my surprise, quite a number of people were familiar with most of the web 2.0 services and sources. Indeed, weren’t librarians the first to embrace web 2.0?

I got the impression Twitter was the least well-known and least appreciated tool. Most people were either on Facebook or LinkedIn, not on both. This presumably has to do with the separation of professional and personal matters.

Chris first explained the difference between Web 1.0 and Web 2.0. Web 1.0 is one-way and static. Web 2.0 is “more finding or receiving, less searching”. It has a dynamic aspect: there is more interaction and the possibility to comment, subscribe, post, add and share, or as Chris puts it: “Web 2.0 allows you to have information ‘pushed’ at you vs. you having to ‘pull’.”

Another characteristic of web 2.0 is that technology has become easier. It is now more about content.

As an example he showed the Cochrane website from 2004 (web 1.0) and the current website. The first was just a plain website where you could search, browse and email; the second has social bookmarking tools and is more dynamic and interactive: you can add comments, post to websites, etc. In addition, the Cochrane Collaboration is now on Twitter and Facebook and produces podcasts of a selection of its systematic reviews.

Other examples of web 2.0 interfaces are My NCBI in PubMed (for saving your searches) and iGoogle.

Social Networking services
These services allow you to create an online profile so that you can interact with others, share and integrate.

Examples are Facebook, LinkedIn and 2collab. What is used most differs around the world. LinkedIn is more of a professional site, an “online resume”, and Facebook is for more general stuff (“Your mother is on Facebook too, so...”). Most young people don’t realize what others can read. However, Facebook offers the possibility to select precisely who can see exactly what.

Twitter
Twitter is a microblogging system that allows 140-character messages (tweets). At first, Chris wasn’t very interested. He only knew Twitter through the automatic updates on Facebook, but “wasn’t really interested in a friend in New York eating a scrambled egg.”

It is as easy to subscribe to someone’s updates as it is to unsubscribe. Chris uses Tweetdeck to filter for keywords that are of interest. But as he showed me later, he uses his iPhone to easily catch what the people he follows are tweeting.

Although Twitter was created as a social tool, it is now much more than that. It creates a so-called “ambient awareness”, and as such it is a perfect example of “push” technology: you won’t see every tweet, but you will be ambiently aware of the conversation (of your “friends” or the subject you follow). Twitter is also very useful for getting a really fast answer to your question. This is how Chris learned the value of Twitter: he had a question at a meeting, and someone said: just put it on Twitter with the hashtag of the congress (an agreed-upon keyword with # in front, like #EAHIL2009). He did, and within 3 minutes he got an answer. Twitter is also very useful for sharing and finding links.
There are many “Twitter apps” around. Just search Google for them.

For professional use within a company, the Twitter look-alike Yammer can be a useful alternative, because only people in the company are able to follow the updates.

My personal experience is also very positive. Twitter and other web 2.0 tools can work synergistically, depending on your Twitter community and how you use it.

Social bookmarking:
Although librarians aren’t always very happy with user-generated tagging, social bookmarking tools are an easy way of allowing users to share collections of links.
Links used (directly or indirectly) for his presentation are available at del.icio.us/mavergames under the tag EAHIL.

Blogs, wikis
A blog can give a good summary of interesting articles in a particular field. Chris began a blog two months ago (http://mavergames.net) about a very specific subject he is involved in: Drupal. For him it is just an open notebook: a platform to share ideas with others.
It is possible to receive updates via RSS (push).

Wikis are a very powerful knowledge-gathering tool, a way to collaboratively create a resource, based on the principle of crowdsourcing (the wisdom of crowds).

Examples of the two are:

  1. https://laikaspoetnik.wordpress.com/ (this blog)
  2. http://scienceroll.com/ (the blog of the Hungarian medical student Bertalan Mesko)
  3. http://www.medpedia.com/ (a not yet fully developed medical wiki)
  4. http://twictionary.pbwiki.com/ (a fun wiki with the Twitter Vocabulary)
  5. cochrane.org/ideas
  6. http://mavergames.net (Chris’s blog on Drupal)

Subscription services: RSS
Via RSS (Really Simple Syndication) you can have information from a variety of sources pushed to you:

  • Podcasts, for instance cochrane.org/podcasts
  • Saved searches, for instance in PubMed
  • News feeds, for instance cochrane.org/news
  • Updates to sites
  • Updates to collections of bookmarks
  • Updates to Flickr photos
  • Etcetera

Platforms vary from Google Reader and Yahoo to Bloglines, but you can also use iGoogle or a specialized medical page where you can find links to all kinds of sources, like blogs, podcasts and journals. PeRSSonalized Medicine (http://www.webicina.com/rss_feeds/) is especially recommended.

Somebody from the audience added that Medworm is a good (and free) medical RSS feed provider as well. For an overview of several such platforms, including Medworm, iGoogle and www.webicina.com, see an earlier post on this blog: PeRSSonalized Medicine and its alternatives (2009-02-27).
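For readers who like to see what “subscribing” boils down to technically, here is a minimal sketch (my own example, assuming Python with the third-party feedparser library; the feed URL is purely illustrative) of reading the newest items from any RSS feed, such as a saved PubMed search or a Cochrane feed:

  import feedparser  # third-party library, assumed to be installed

  # Any RSS/Atom feed URL works here: a PubMed saved search, a blog, a podcast feed...
  FEED_URL = "http://www.cochrane.org/podcasts/feed"  # illustrative URL, not verified

  feed = feedparser.parse(FEED_URL)
  print(feed.feed.get("title", "(no title)"))
  for entry in feed.entries[:5]:   # the five newest items
      print(entry.title, "->", entry.link)

An RSS reader like Google Reader or Bloglines simply does this for all your feeds, on a schedule, and presents the results in one place: the information is pushed to you instead of you having to go and look for it.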

A typical Web 2.0 scenario:

  1. Chris visits Laika’s MedLibLog and reads that Cochrane podcasts are available.
  2. He finds it interesting, goes to the Cochrane website and subscribes to the Cochrane podcasts via RSS.
  3. He wants to share this finding with others, so he decides to tweet that Cochrane podcasts are available.
  4. He gets a response: “Hey, do you know the Cochrane Collaboration is on Facebook?”, so he visits Facebook, joins, and posts the news there as well. And so on.

Not only did Chris give a nice overview of Web 2.0 tools, but there was ample opportunity for discussions and remarks.

The two most common questions were: [1] when can you find time for this? and [2] what can you do when IT departments don’t allow access to web 2.0 tools like YouTube, Facebook and RSS? This really seemed to be the main barrier to the use of web 2.0 for librarians from many countries. Nevertheless, Chris’s engaging presentation seemed to encourage many people to try the tools that were new to them at home. Afterwards I heard only positive comments about this workshop.

The slidecasts of the two presentations are now online on http://www.slideshare.net/mavergames.

The slidecast I’ve reviewed is below.