T.V. Padma shows you how to separate real progress from hype and challenge poor practice when you're reporting on the conduct and outcomes of a clinical trial.
If you're a science journalist, you'll almost certainly have found yourself writing about a "breakthrough" in disease treatment, a "promising" drug trial, or a new vaccine "on the horizon".
Clinical trials provide a near-constant stream of stories and fulfil many news values. People always want to know about new, easier, cheaper, better disease treatments. A single drug can provide a lot of copy if you track its progress through each stage of clinical testing. And unsuccessful trials have their story too: why didn't a product live up to its promise?
Sometimes it's not that a drug failed tests, but that the trial didn't abide by the rules and regulations set up to protect participants and the general public. Journalists delight in chasing up something fishy, and the public are thirsty for scandal. Acting as watchdog on the pharmaceutical industry is an important journalistic role.
So science journalists need to be able to distinguish a well-performed trial from a shoddy one, and to know when results are genuinely significant. And as international pharmaceutical companies increasingly carry out clinical trials in developing countries, with the attendant dangers of exploitation, reporting on how those trials are conducted is ever more important.
Before people are sold a new drug or treatment, it must go through three 'phases' of clinical trials to ensure it is safe and effective.
The first tests the intervention in a small group of volunteers, typically 20–80, for safety and side-effects. If it seems safe, a second phase tests how well the drug works (its efficacy) in a group of several hundred people, and further evaluates its safety.
The third phase examines efficacy in large groups of people (from several hundred to several thousand) in several locations, and compares the effects of the intervention to that of other comparable drugs or treatments. It also monitors for adverse effects.
Once a drug or vaccine has given consistently good results in all three phases, researchers submit the data to a country's regulatory body to clear it for marketing. After its release, a fourth phase should monitor the intervention's widespread use.
Before you report on a clinical trial, check the basics.
Is the product relevant enough to your audience to make a good story?
Is the story really newsworthy? If a treatment has passed only phase I trials, it may be too early to shout about. Bear in mind that most drugs entering the clinical trials pipeline don't make it out the other end.
A trial's size is important too. Small trials have less 'power' than larger trials because their results are more likely to be affected by chance.
If you report an early or small trial, make sure your audience know how far the product is from the market — and that it might not make it at all.
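A quick simulation makes the point about trial size concrete. The sketch below assumes purely illustrative figures (a drug that truly cures 60 per cent of patients against 50 per cent for the comparator) and counts how often the drug merely *looks* better in simulated trials of different sizes; it is not a formal power calculation, just a demonstration that small trials are far more at the mercy of chance.

```python
import random

random.seed(0)

def fraction_drug_looks_better(n_per_arm, rate_drug=0.6, rate_control=0.5,
                               n_sims=5000):
    """Simulate many two-arm trials and return the fraction in which the
    drug arm shows a higher observed cure rate than the control arm.
    All rates here are illustrative assumptions, not real trial data."""
    wins = 0
    for _ in range(n_sims):
        cured_drug = sum(random.random() < rate_drug for _ in range(n_per_arm))
        cured_control = sum(random.random() < rate_control for _ in range(n_per_arm))
        if cured_drug > cured_control:
            wins += 1
    return wins / n_sims

small = fraction_drug_looks_better(20)    # a small, phase-I-sized arm
large = fraction_drug_looks_better(500)   # a large, phase-III-sized arm
print(f"drug looks better in {small:.0%} of small trials")
print(f"drug looks better in {large:.0%} of large trials")
```

With only 20 patients per arm, a genuinely better drug fails to show an advantage in a large share of simulated trials (and, by the same token, a useless drug will sometimes look impressive); with 500 per arm, chance plays a much smaller role.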
Make sure that the trial is registered with the WHO International Clinical Trials Registry Platform, set up in response to concern about poor transparency in clinical trials.
The registry specifies 20 items of minimum information that must be provided. Registration indicates that a trial is above board, and the entry itself is a mine of information, detailing the trial's sponsors, funding bodies, and contacts for public and scientific queries.
Similarly, www.clinicaltrials.gov, run by the U.S. National Institutes of Health, offers a publicly available, searchable web-based registry. Many countries also maintain their own national registries, though trials are generally registered internationally first. Treat trials not registered with an international registry with caution.
Too good to be true?
If the research passes these tests, and you've decided to write a story, then it's time to closely examine how a trial's results have been presented.
Has this presentation been influenced by the sponsors of the trial? An August 2010 report in the Annals of Internal Medicine highlights how trials funded by industry are more likely to report positive outcomes than those funded by other sources.
The report surveyed almost 550 drug trials conducted between 2000 and 2006 and found that industry-funded trials were also less likely to publish their results within two years of completion, making independent scrutiny harder. Half of the trials supporting drugs approved by the US Food and Drug Administration remained unpublished five years after approval.
The report also found that some researchers wait for initial results to emerge, and then register or report the trial as looking for those outcomes. To challenge this, editors of reputable medical journals now publish results only from properly-registered trials.
So compare the registration with the full version of the published paper. If there is no paper — perhaps only a press release — ask to see a copy of the paper. Be extra-vigilant in your reporting if you are not offered or provided with a copy.
Remember that preliminary trial results presented at a conference will probably not have been peer-reviewed. If so, mention this in your story and explain the implications.
Even trials that pass all the above tests may present their results in an overly-dramatic way, particularly in press materials.
HealthNewsReview.org illustrates this point well, using the example of a new drug that reduces the risk of blindness associated with diabetes.
If two out of 100 diabetes patients using a conventional drug develop blindness over a five-year treatment period, and only one in 100 go blind on the new drug, the absolute risk of going blind has been reduced, but only from a low two per cent to an even lower one per cent.

The relative risk, however, is the ratio of the two: a half. So the drug company's PR team may prefer to say the new drug halves the risk of blindness, without mentioning the low initial risk. Check for statements like this. For more on handling statistics, see our practical guide Communicating statistics and risk.
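The arithmetic behind the blindness example can be checked in a few lines. This uses the illustrative figures above; the "number needed to treat" is a standard extra measure that often puts a dramatic-sounding relative risk back into perspective.

```python
# Two in 100 patients go blind on the conventional drug, one in 100 on the new one
# (illustrative figures from the example, not real trial data).
risk_old = 2 / 100
risk_new = 1 / 100

absolute_risk_reduction = risk_old - risk_new   # one percentage point
relative_risk = risk_new / risk_old             # 0.5, hence "halves the risk"
relative_risk_reduction = 1 - relative_risk     # "50% lower risk" in a press release

# Number needed to treat: how many patients must take the new drug
# for one of them to avoid blindness.
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"absolute risk reduction: {absolute_risk_reduction:.1%}")
print(f"relative risk reduction: {relative_risk_reduction:.0%}")
print(f"number needed to treat:  {number_needed_to_treat:.0f}")
```

The same drug can thus be described as "halving the risk of blindness" or as "sparing one patient in a hundred"; both are arithmetically true, but only the second conveys the low starting risk.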
Don't assume that a new drug is better. Check how it compares with alternative treatments for efficacy, safety and cost, particularly in developing countries. A drug might be cheaper, but is it really better, for example, if it needs to be dissolved in clean water to be taken?
And watch sharply for data that has been 'cherry picked', i.e. favourable results published and unfavourable results omitted. This is difficult to spot if you're not trained in statistical analysis (and there's the obvious difficulty in finding something that isn't there).
Getting an expert opinion can really help. It's also worth noting that, without expert opinion, your story about a paper's data is unlikely to have much standing in scientific circles.
If you are suspicious about a paper, first try to find the raw data through the clinical trials registry. Next, ask an independent expert in the field to analyse the raw data and the publication. Their dissenting view could carry weight.
If a trial doesn't come up to standard, then that could be a story in itself. Digging deeper into clinical trials is much more difficult than simply reporting the results — but you could uncover a major injustice, and a great story.
Developing countries are increasingly attracting multinational firms wanting to carry out clinical trials. Is that because regulations governing such trials are weaker, or perhaps because it is far cheaper and often far easier to recruit participants from largely illiterate or semi-literate populations? And might those participants not ask as many uncomfortable questions as their Western counterparts?
If you are concerned about practices in a trial, check whether it had a robust ethical review before it started — as is mandatory in developed countries. Are such ethical reviews of prospective trials mandatory in your country? Was the trial you are interested in approved?
Ask critical questions of researchers and, if possible, trial participants. Were participants' decisions to take part in the trial swayed by the offer of money? Were participants well cared for when side-effects emerged? Were they fully informed about the intentions of the trial and its potential outcomes: did they know, for example, that the end-product might not directly benefit them?
Trials and informed consent
The issue of 'informed consent' — that is, a signed agreement to participate voluntarily in a trial after being fully and accurately briefed about its risks and benefits — is a murky one in developing countries.
In poor, illiterate populations, 'informed consent' could in practice be a thumbprint from someone who understood only part of what the doctors told them, or who was too intimidated or shy to ask for clarification.
But be cautious about identifying participants, and ask their permission first. Some may wish to remain anonymous, particularly HIV/AIDS patients.
Ask yourself whether a trial is appropriate. For example, is it appropriate to test a heart disease drug that local people are unlikely to be able to afford?
In April 2010, four young girls died in the southern Indian state of Andhra Pradesh. Activists said the girls developed severe complications after a trial of an injectable cervical cancer vaccine, and that proper informed consent had not been obtained.
SciDev.Net talked to vaccine policy and public health experts — and found that the government had authorised the trial without first checking whether the vaccine protected against strains of human papilloma virus present in the country (see India halts HPV programme). The Indian government temporarily halted its own programme.
Rigour brings rewards
Listen to activists, nongovernmental organisations, 'whistle blowers' concerned over unethical trial protocols, drug regulators who spot deficiencies, or company staff who might, if you are lucky, be willing to talk about unethical practices. Try all sources, but remember they will have their own agendas: find evidence to support their claims.
And you must put your concerns to the researchers and/or company involved, as well as to independent commentators.
It is vital that you do your research. Don't accuse a company, government or institute of ethical violations on flimsy evidence. Check your country's libel laws and abide by them. And remember that if your report is published on the Web, it could be vulnerable to the libel laws of any country it is viewed in.
Rigorous reporting can weed out weaker trials so you cover only the best and most relevant. And you can expose trials that are run badly, and sometimes illegally.
But remember, fair and transparent trials often do result in a genuinely beneficial vaccine or drug, in both developed and developing countries.
These are the products that people need to know about. Spread their message far and wide.
T.V. Padma is SciDev.Net's South Asia regional coordinator and contributes news to Nature Medicine.