My colleague Rob Pearce has a thought-provoking question about the safety of medical information on Wikipedia:
I’m not having a go at Wikipedia at all – I’m a big fan – but I had a thought about the recent McMaster University and Wikimedia Canada initiative for health care content creation in the creative commons (holding workshops on Oct 4th, 2011 at McMaster, introducing both professors and students to the idea).
This has come up before when I was working on Open Educational Resources: what is to stop nutters or malcontents – subtly or otherwise – from altering medical information in a way that leads to somebody putting their health in jeopardy, pushing one procedure over another, promoting one drug over another, etc.? I don’t quite know how to counter this argument yet.
If you search Google for “shark cartilage”, “laetrile therapy” or “copper bracelet”, you can see the nutters and profiteers already have their own web sites, which of course you and I are unable to edit.
For that reason I’m very glad that there’s a site which is prominent in search results which has policies on neutrality and reliable sourcing, and on which there are people watching the articles, undoing damage and blocking persistent vandals or spammers. If I see misleading info on cancercures4u.org, I can’t correct it then and there, but if I see it on Wikipedia, I can.
That said, there aren’t as many people watching and contributing to Wikipedia articles as there ideally should be. This is why I’m enthusiastic about Cancer Research UK getting involved in editing cancer articles, this McMaster initiative, or the meeting yesterday at the Wellcome Institute which talked about using Wikipedia for public engagement. There are also projects to automatically sync information in Wikipedia (about RNA fragments, for example) with external databases.
Clearly we all only have a limited amount of time, but academics and support staff put a lot of time and effort into publications or sites that are supposedly aimed at informing the general public, but are only read by a small number of people. Maybe some of that effort could go into a put-it-where-people-will-look-for-it approach, for much greater impact? The interest in Wikipedia training from funding bodies and scholarly societies indicates many of them are starting to think this way.
It’s a general point about open, remixable content that anyone could potentially create their own versions with false information. That’s a consequence of open content being “free” as in “free speech” – free speech allows people to say stupid things, or to give others stupid advice. As with any speech, readers need to consider the source.