Our scoring method

On this page you can read about how we collected our data on each company’s policy commitments, and how we converted those data into a score for the website. You can read more detail about the background, methods and results of this project in our academic paper published in the BMJ.

Search strategy

We set out to include 50 companies: the top 25 pharmaceutical companies by global sales (29), and a further 25 smaller companies selected arbitrarily for an exploratory analysis of smaller firms’ policies. Baxter was excluded as it no longer makes pharmaceutical products; Teva was excluded as it is principally a generics company; and during the audit period six of the smaller companies ceased to exist, largely through merger, leaving 42 companies. We searched Google for company policies and statements on clinical trial transparency using the key terms <company name>, <clinical trial transparency> and <company name> <clinical trials>, and navigated through company websites looking for formal standalone transparency policies. We also searched for policies on clinical trial transparency from EFPIA and PhRMA. We saved archive copies of all company website pages containing transparency commitments, downloaded archive copies of all standalone documents such as PDFs, and downloaded policy documents from EFPIA and PhRMA, as they stood at April 17, 2016.

Defining the reference standard for transparency commitments

In line with standard best practice for audits, a reference standard for a transparency policy was established, and a data structure developed to reflect this standard. Our reference standard for transparency was that all trials should be registered as per ICMJE requirements (30,31), WHO guidance (32,33), and legislative requirements (34); that their methods and results should be reported in summary form within 12 months of trial completion through online results reporting or other publication, as required under WHO guidance (1), EU legislation (35,36), and the FDA Amendment Act 2007; that CSRs should be made publicly available if they have been created, as per current EU legislation (36) and various calls from civic society and academia (23,37); and that IPD should be available on request in some form to researchers (38,39). These broad commitments were then operationalised into structured questions across the four domains of registration, methods and results sharing, CSRs, and IPD, assessing the policy commitments in each domain in detail (full text in Appendix 1). Prospective commitments (for trials from now on) and retrospective commitments (for trials in the past) were coded separately. Because some companies’ retrospective commitments applied only to very recent trials, while others went back several decades, start dates for retrospective commitments were also extracted into our coding sheet. We also assessed whether certain categories of trial, such as phase 4 trials conducted after approval of a new drug, were included under each policy.
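To illustrate (the field names below are our own, not the actual columns of our coding sheet), one coded record per company and domain might look something like this:

```python
from dataclasses import dataclass
from typing import Literal, Optional

Answer = Literal["yes", "no", "unclear"]

@dataclass
class PolicyCoding:
    """One company's coded commitments for a single domain (illustrative sketch only)."""
    company: str
    domain: Literal["registration", "results", "csrs", "ipd"]
    prospective: dict[str, Answer]            # e.g. {"registration_1": "yes", ...}
    retrospective: dict[str, Answer]
    retrospective_start_year: Optional[int]   # earliest year the policy reaches back to, if stated
    includes_phase_4: Answer                  # categories of trial covered by the policy
    includes_off_label: Answer
    includes_unlicensed: Answer
```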

Data extraction

Five experienced researchers (BG, CH, KM, IO, SL) with a background in clinical trials, transparency and research integrity independently extracted the data from retrieved documentation and websites into a data extraction sheet reflecting the questions in Appendix 1. Data extraction was done independently by at least three individuals, who met to agree the final coding by consensus. There were circumstances when it proved impossible to code answers as “yes” or “no”: these were coded as “unclear”; we attempted to minimise use of this code and achieve consensus where possible. Additionally, examples of ambiguous, contradictory, or problematic commitments were collected during coding, and grouped by theme.

Engagement with companies

In 2015, before commencing the audit, we wrote to every company inviting them to meet us for an hour. This was to explain our project, allow companies to provide feedback, ensure they knew we would be collecting data on their policies, and ensure they knew we would be publishing our findings. In June 2016 all companies were sent the full set of data extracted on their policy, and invited to respond setting out any disagreement with any element, by reference to the text of their publicly accessible policy as it stood at April 17, 2016. The data sent did not indicate how the company compared to other companies.

Emails were sent to the CEO, the Medical Director, and other email accounts for individuals in relevant roles who had previously responded to us on related queries regarding transparency. All responses were read, themes and disagreements were extracted and reviewed by at least two members of the team (BG, SL), and changes were made where appropriate. In total, 7 out of 1806 scored policy elements were revised in light of feedback from companies. If any company has concerns about our summary of its public policy after publication of the audit, we are also happy to review additional feedback.

Scoring system

We developed a scoring system, described in detail below, to make the large volume of data on companies more intelligible, and to permit comparison between companies’ policies. The scoring system was drafted by two researchers (BG, SL) and refined by consensus in discussion among the team. Points were assigned to each positive activity, such as “registering trials”, “posting summary results online”, or “posting CSRs online”. Discounts were applied to a company’s score on any element where some trials were excluded from the transparency policy (such as “trials on off-label uses” or “phase 4 trials”). A commitment to post summary results on clinicaltrials.gov within 12 months scored more highly than a commitment to submit results to a journal: this was because journal publication adds additional delays, and there is evidence that journal publications of trial results are less complete than results submissions to clinicaltrials.gov with respect to reporting of prespecified outcomes and serious adverse events (20).
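As a rough sketch of that arithmetic, assuming a simple points-and-discounts structure (the function and question identifiers below are illustrative; the actual coefficients are in the appendix):

```python
def domain_subscore(elements: dict[str, tuple[int, str]], discounts: list[float]) -> float:
    """Sum the points for each element answered "yes", then apply additive
    percentage discounts (e.g. 0.25 where phase 4 trials are excluded).
    `elements` maps a question id to (points, answer); ids are illustrative."""
    total = sum(points for points, answer in elements.values() if answer == "yes")
    return max(total * (1 - sum(discounts)), 0.0)

# Example: prospective registration with a commitment to register all new trials
# (200 points) and to audit compliance (20 points), but excluding phase 4 trials (-25%):
# domain_subscore({"registration_1": (200, "yes"), "registration_2": (20, "yes")}, [0.25]) == 165.0
```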

Prospective and retrospective commitments were scored separately. A discount was applied to the retrospective score for every year that transparency commitments did not apply retrospectively. 1991 (25 years before the audit) was selected as the furthest back that transparency would be scored, on the following grounds: the first quantitative data demonstrating failure to disclose trial results were published in 1980 (29); the first prominent academic paper calling for trial registration appeared in 1986 (30); 1990 saw the publication of a landmark paper describing non-disclosure as “research misconduct” (31), and the beginning of more extensive industry and civic discourse on the topic; and no company’s current policy commits to disclosure of clinical trials conducted before 1990. In addition, standards for conducting and reporting trials in CSRs are widely held to have improved following the update of ICH-GCP in 1995 (22). For the purposes of scoring, an “unclear” commitment scored the same as “no”, since our aim was to compare companies’ publicly accessible policy commitments, on which they could be held accountable.
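A minimal sketch of the year-based discount, assuming the retrospective start year has been extracted as described above (the function name and the 2016 reference year are our assumptions):

```python
from typing import Optional

def retrospective_factor(policy_start_year: Optional[int], reference_year: int = 2016) -> float:
    """Fraction of the 25-year window (1991 to the reference year) covered by a
    retrospective policy; policies reaching back to 1991 or earlier keep full marks."""
    if policy_start_year is None:             # no retrospective commitment at all
        return 0.0
    years_covered = min(reference_year - policy_start_year, 25)
    return max(years_covered, 0) / 25

# Example: a policy covering trials back to 2005 keeps 11/25 (44%) of its
# retrospective score; one going back to 1991 or earlier keeps 100%.
```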

We scored all domains for all companies according to this system, scoring prospective and retrospective commitments for the four sub-domains (registration, results, CSRs, IPD) separately, and created a final total score out of the maximum available 2000 points. All companies were then ranked on this total.
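In other words, the final total is the sum of the eight sub-scores (prospective and retrospective for each of the four domains, each out of a maximum of 250), and companies are ranked on that sum. A brief sketch, continuing the assumed helpers above:

```python
def total_score(sub_scores: dict[str, float]) -> float:
    """Sum the eight sub-domain scores (4 domains x prospective/retrospective),
    each out of 250, giving a total out of 2000."""
    return sum(sub_scores.values())

def rank_companies(totals: dict[str, float]) -> list[tuple[str, float]]:
    """Order companies from highest to lowest total score."""
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)
```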

We do not think our scoring system is the only way to compare companies. That is why we have shared all our raw underlying data, which you can download from this site. You can check all our working and, if you wish, generate an alternative scoring system that you think better reflects the overall sweep of all companies’ commitments. To help you understand the appendix below, we have also written a walk-through of the process of ranking one arbitrarily chosen company, which you can read here.


Appendix. Full scoring system and coefficients

Note: labels such as registration_1 are question identifiers. These are referenced in the Measures section of each company’s page, and in the downloadable data.

Registration (max 500 points total)

Prospectively (max 250 points)

  • Do you have a policy to register all new trials? (registration_1, 200 points)
    • Do you audit compliance? (registration_2, 20 points)
      • Do you share summary results of the audit? (registration_3, 20 points)
      • Do you share line-by-line results of the audit? (registration_4, 10 points)

Discounts (apply to overall score above):

  • Does the policy include phase 4 trials? (registration_5, if “no,” -25%)

Retrospectively (max 250 points)

  • Does your current policy cover previous trials (i.e. from the start date of the current policy, or from a specified date that it reaches back to)? (registration_6, scored as per Prospectively above)

Discount for number of years back (25 years to 1991):

  • Divide the retrospective score by 25, then multiply by the number of years to which the retrospective policy applies (registration_7); a worked example follows below
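For example, for a hypothetical company, using the coefficients above:

```python
# Hypothetical worked example for the registration domain:
# the company commits to register all new trials (200) and to audit compliance (20),
# but excludes phase 4 trials (-25%); its current policy also covers past trials
# back to 2010, i.e. 6 of the 25 scored years.
prospective = (200 + 20) * (1 - 0.25)              # 165.0 points
retrospective = prospective * (2016 - 2010) / 25   # 165 * 6/25 = 39.6 points
```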

Summary Results (max 500 points total)

Prospectively (max 250 points)

  • Do they have a policy to share results at all? (results_1, 150 points)
  • Do they make a timeline commitment, i.e. within 12 months of trial completion? (max 100 points, best of):
    • Post summary results on all pre-specified primary and secondary outcomes of all trials to clinicaltrials.gov? (results_2, 100 points)
    • OR Submit all trials to a journal? (results_4, 75 points)
    • OR Post summary results on all trials only to their own website? (results_3, 50 points)

Discounts:

  • Does the policy include unlicensed treatments? (results_5, if “no,” -25%)
  • Does the policy include off-label uses? (results_6, if “no,” -25%)
  • Does the policy include phase 4 trials? (results_7, if “no,” -25%)

[Note: these discounts are additive, so a company whose policy excludes all three categories loses 75% of its score; a worked example follows below.]
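A worked example for a hypothetical company, showing the “best of” timeline points and the additive discounts (figures are illustrative only):

```python
# Hypothetical company: commits to sharing results (results_1, 150 points); its best
# timeline commitment is journal submission within 12 months (75 points, rather than
# 100 for posting to clinicaltrials.gov); the policy excludes off-label uses and
# phase 4 trials, so two additive -25% discounts apply.
base = 150 + 75                               # results_1 + best-of timeline = 225
discount = 0.25 + 0.25                        # results_6 + results_7
prospective_results = base * (1 - discount)   # 225 * 0.5 = 112.5 points
```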

Retrospectively (max 250 points)

  • Do they have a policy to share past results at all? (results_8, 250 points)

Discounts:

  • Does the policy include unlicensed treatments? (results_12, if “no,” -25%)
  • Does the policy include off-label uses? (results_13, if “no,” -25%)
  • Does the policy include phase 4 trials? (results_14, if “no,” -25%)

[Note: these discounts are additive, so a company whose policy excludes all three categories loses 75% of its score.]

Discount for number of years back:

  • Divide the retrospective score by 25, then multiply by the number of years to which the retrospective policy applies (results_15)

Clinical Study Reports (max 500 points total)

Prospectively (max 250 points)

  • Do they commit to share all Clinical Study Reports publicly? (csrs_2, 250 points)
  • Is this commitment on request only, after a long review process, similar to that for IPD? (csrs_3, 200 points, instead of 250 points above)
  • Do they share only synopses of Clinical Study Reports? (csrs_7, 50 points, instead of 250 points above)

Discounts:

  • Does the policy include unlicensed treatments? (csrs_5, if “no,” -25%)
  • Does the policy include off-label uses? (csrs_6, if “no,” -25%)

Retrospectively (max 250 points)

  • Do they commit to share all Clinical Study Reports publicly? (csrs_8, 250 points)
  • Is this commitment on request only, after a long review process, similar to that for IPD? (csrs_9, 200 points, instead of 250 points above)
  • Do they share only synopses of Clinical Study Reports? (csrs_12, 50 points, instead of 250 points above)

Discounts:

  • Does the policy include unlicensed treatments? (csrs_10, if “no,” -25%)
  • Does the policy include off-label uses? (csrs_11, if “no,” -25%)

Discount for number of years back:

  • Divide the retrospective score by 25, then multiply by the number of years to which the retrospective policy applies (csrs_13)

Individual Patient Data (max 500 points total)

Prospectively (max 250 points)

  • Do they have a policy to make IPD available on request? (ipd_1, 250 points)

Discounts:

  • Does the policy include unlicensed treatments? (ipd_3, if “no,” -25%)
  • Does the policy include off-label uses? (ipd_4, if “no,” -25%)
  • Does the policy include phase 4 trials? (ipd_5, if “no,” -25%)

Retrospectively (max 250 points)

  • Do they have a policy to make IPD available on request? (ipd_2, 250 points)

Discounts:

  • Does the policy include unlicensed treatments? (ipd_3, if “no,” -25%)
  • Does the policy include off-label uses? (ipd_4, if “no,” -25%)
  • Does the policy include phase 4 trials? (ipd_5, if “no,” -25%)

Discount for number of years back:

  • Divide the retrospective score by 25, then multiply by the number of years to which the retrospective policy applies (ipd_2)