White Paper

Clinical Study Leader Or Laggard?

Time to Assess Your Level of Maturity

Self-Assessment Helps Stakeholders Reflect on Study Startup Status

Quality management and competitive edge are vital to the clinical trials sector1,2 and are fueling widespread use of purpose-built technology. Cloud-based solutions, such as clinical trial management systems (CTMS) and the electronic trial master file (eTMF), are broadly accepted, but these tools fail to address study startup (SSU), a complex set of processes that contribute heavily to lengthy timelines, often lasting seven years.3 With greater globalization of clinical trials and factors ranging from site selection to regulatory document submission to enrolling the first patient, assessing SSU status is critical. It pinpoints bottlenecks and areas of risk that could impact data quality and derail study budgets and timelines.

"With greater globalization of clinical trials and factors ranging from site selection to regulatory document submission to enrolling the first patient, assessing SSU status is critical."

Risk is playing a more visible role, given the growing emphasis on risk-based management by regulators1,4,5 and industry’s steady push to speed therapies to patients, suggesting that it is time for stakeholders to determine whether they are SSU leaders or laggards. To make this assessment, sponsors and contract research organizations (CROs) need metrics and better processes to evaluate startup, considering it takes eight months, on average, to move from the pre-visit phase to site initiation.6 For site selection, three months is a typical timeframe for completing the site identification process for Phase II and Phase III studies.7 This step needs further scrutiny, as identifying the right sites is at the root of the bleak statistics on patient enrollment. Nearly half of investigative sites under-enroll, 11% of sites do not enroll any patients, and a meager 13% exceed the enrollment target.8 With all of these factors in play, SSU is a major cause of long cycle times, which have not changed in more than two decades.9

This white paper describes why stakeholders should take the opportunity to conduct an SSU self-assessment, with a focus on maturity level of a sponsor’s or CRO’s internal processes. And with growing reliance on CROs for many aspects of study execution, this assessment can help CROs evaluate their own status, as they attempt to differentiate their services.10 The self-assessment can be conducted using data from post-mortems, which evaluate the success or failure of a team to meet timeline and budget benchmarks, as well as site selection and patient recruitment targets.

To encourage laggards to join the ranks of leaders, and to spur continuous improvement among all players, goBalto’s Select, Activate, and Analyze are presented as valuable options. They form an end-to-end solution that fills the gap in the clinical stack by guiding stakeholders through the process of site identification, feasibility assessment, and site activation. These tools—workflow-driven Select and Activate, along with Analyze, a data analytics platform—optimize the many SSU steps and provide real-time insight into study status, a major change from the Excel spreadsheet, which cannot offer this functionality.

Why Determine Level of Maturity?
It has long been asserted that the clinical trials sector is behind the curve when it comes to technology adoption.11 Continued reliance on older methods lacking in visibility is commonplace, as are dated and inefficient systems and processes.12 But with signals that the industry is trending in a more positive direction, this is the right time for stakeholders to determine their level of SSU maturity. The market for eClinical software is expanding at a compound annual growth rate of 13.8%, increasing from $3 billion in 2014 to $6.8 billion by 2020.13

"It has long been asserted that the clinical trials sector is behind the curve when it comes to technology adoption. Continued reliance on older methods lacking in visibility are commonplace, as are dated and inefficient systems and processes."

There is also greater awareness that the real benefit of technology is its ability to overhaul deeply entrenched processes through integrated tools and real-time communication14—a significant change from point solutions that update just a single step in the development continuum. Kim et al. report on methods currently employed in clinical trials and those expected to be implemented in the near future.15 Their research indicates that in the next two years, the biggest area of investment is likely to be in technologies tied to risk-based strategies, an approach that aligns with regulatory encouragement from the Food and Drug Administration and the European Medicines Agency, as well as with the new Good Clinical Practice guideline from the International Conference on Harmonisation, known as ICH GCP E6(R2).1 All of these regulatory guidances were put forth in recognition of the need for greater use of technology in clinical trials to improve quality management and oversight.

Applying this risk-based thinking to SSU starts with determining if a sponsor or CRO is an SSU leader or laggard. Making this assessment will help stakeholders gauge their maturity level based on the effectiveness of their site selection and study activation methods. Do they have processes that are implemented consistently internally and among partners? Are there substantial real-time data to help stakeholders select the best sites for a particular study? Are systems integrated to provide access to information and enable proactive response to changing conditions and requirements? These questions will provide insight into an organization’s ability to mitigate risk. An immature organization may be more likely to solve problems retroactively rather than focusing proactively on prevention—a leadership characteristic.16

"An immature organization may be more likely to solve problems retroactively rather than focusing proactively on prevention—a leadership characteristic."

Using a data-driven approach aligns with the lean Six Sigma method, which offers a highly structured tool for assessing an organization’s maturity ahead of Six Sigma deployment.17 A simplified description of this comprehensive system comes down to three basic steps: Assess, Analyze, Address. Each step has multiple parameters, starting with “Assess”, with definitions that help users assign a ranking of 1 to 5 to each parameter (Chart 1).

Once this step is completed, the “Analyze” exercise evaluates the scores of the individual parameters, and compares them to the Maturity Index—the average of those scores—to identify which areas need the most attention. Finally, “Address” involves collaboration amongst leaders of various departments to begin addressing the weaknesses and working toward continuous improvement.
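
To make the arithmetic concrete, the short sketch below works through the “Analyze” step under the assumptions described above: each parameter is scored from 1 to 5, the Maturity Index is the average of those scores, and any parameter scoring below the index is flagged as an area needing attention. The parameter names are illustrative only, not a prescribed checklist.

```python
# Illustrative sketch of the "Analyze" step of a lean Six Sigma-style
# maturity assessment: each parameter is scored 1-5, the Maturity Index
# is the average of those scores, and parameters scoring below the index
# are flagged as the areas needing the most attention.
# Parameter names are examples only.

scores = {
    "Leadership commitment": 4,
    "Process documentation": 2,
    "Data availability": 3,
    "Training and skills": 2,
    "Continuous improvement culture": 4,
}

maturity_index = sum(scores.values()) / len(scores)

print(f"Maturity Index: {maturity_index:.2f}")
for parameter, score in sorted(scores.items(), key=lambda item: item[1]):
    flag = "  <-- needs attention" if score < maturity_index else ""
    print(f"  {parameter}: {score}{flag}")
```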

Applying this maturity model to an SSU leader-laggard determination involves creating a grid in the lean Six Sigma style. Chart 2 details SSU Process, People, Data, and Systems, and as part of the self-assessment, teams can assign a value to each response. In particular, they should conduct the assessment from the perspective of site selection through to site activation.

As a best practice, stakeholders can perform an initial self-assessment to establish a maturity baseline, followed by conducting post-mortems on each study. That way, teams will begin gathering metrics to evaluate SSU status over time. This exercise should prove valuable because clinical teams tend to be comfortable with the processes and technology they have in place, and often think everything is working well. Unfortunately, they may lack the metrics or standardized practices to back up these sometimes steadfast beliefs. Using site selection as an example, research from the Tufts Center for the Study of Drug Development (CSDD) hints at why many stakeholders are laggards when it comes to site selection practices, resulting in continued inefficiency and ineffectiveness.7,18 Results fell into three categories:

  • Planning and site identification phase
  • SSU and site initiation phase
  • Ongoing execution and completion phase

For the planning and site identification phase, Tufts found that most organizations perform very limited upfront planning. Companies often collect incomplete selection criteria, and of the criteria collected, many used to identify and select sites are not directly associated with site performance. There is also continued reliance on relatively unsophisticated approaches to site identification, namely word of mouth, literature searches, internet searches, chart review, and conferences.

"Excel does not offer project management or oversight capabilities, has inefficient workflows, lacks realtime reporting, and provides limited business intelligence."

Similar work by goBalto noted that these older approaches lead to significant gaps in the ability of the clinical trials sector to manage document workflows associated with efficient SSU.19 In a 2015 survey, respondents reported that Excel is the mainstay for SSU procedures. For study activation, more than 80% of sponsors and CROs still rely on Excel for tracking of SSU processes. Over two-thirds of sponsors and CROs use Excel for site selection and evaluation, with the majority of sponsors (93%) and CROs (80%) choosing Excel for site feasibility. These practices are problematic as Excel was not designed for this type of work, and lacks critical features needed for operational excellence, creating bottlenecks.20 Specifically, Excel does not offer project management or oversight capabilities, has inefficient workflows, lacks real-time reporting, and provides limited business intelligence.

Better Tools Make Better Assessments
New cloud-based solutions have been delivered to the market that move far beyond Excel, allowing stakeholders’ processes to mature as they move from laggard to leader status. Workflow-driven Select automates workflows for intelligent site profiling, and Activate expedites the document completion and management processes. Both solutions drive collaboration through direct notifications and role-based process workflows. Analyze, a data analytics tool, allows study team members to aggregate data and customize graphs, dashboards, and other data visualizations of study status.

Importantly, with the use of an application programming interface (API), these tools integrate with other cloud-based solutions in the clinical stack, such as EDC, CTMS, and eTMF, to optimize the flow of data across the clinical trial continuum (Figure 1). Another advantage is access to these solutions using a single sign-on. This is a big improvement over other systems that require multiple passwords and credentials, a factor that limits adoption of new technology.21
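
As a purely illustrative example of this kind of integration, the sketch below shows how site-activation status might be pulled over a REST API and filtered before being handed to another system in the clinical stack, such as a CTMS. The endpoint, token, and field names are assumptions made for the sake of the sketch, not goBalto’s actual API.

```python
# Hypothetical illustration of pulling study startup status through a REST
# API so it can be passed to another system in the clinical stack (e.g., a
# CTMS). The endpoint URL, token, and field names are assumptions for this
# sketch only.
import json
import urllib.request

API_BASE = "https://api.example-ssu-platform.com/v1"   # placeholder endpoint
API_TOKEN = "replace-with-a-real-token"                 # placeholder credential

def fetch_site_activation_status(study_id: str) -> list[dict]:
    """Retrieve per-site activation status for one study as a list of records."""
    request = urllib.request.Request(
        f"{API_BASE}/studies/{study_id}/sites",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

def sites_behind_schedule(sites: list[dict], threshold_days: int = 90) -> list[dict]:
    """Flag sites whose activation cycle time exceeds a threshold (assumed field name)."""
    return [s for s in sites if s.get("activation_cycle_days", 0) > threshold_days]
```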

To better understand how these solutions can improve SSU processes, here is a brief overview of Select, Activate, and Analyze.

• Select

"Select creates a target site profile using numerous data sources, such as investigator databases, CTMS, feasibility surveys, and key opinion leader (KOL) data sources, combined to create a complete view of site performance."

Select is designed to help stakeholders avoid choosing non-active and non-enrolling sites, a problem that wastes valuable SSU time and ultimately drives up the cost of a trial, possibly by as much as 20%.22 To improve site selection, Select creates a target site profile using numerous data sources, such as investigator databases, CTMS, feasibility surveys, and key opinion leader (KOL) data sources, combined to create a complete view of site performance. The internal and external data sources are combined into a single meta-database, and using data-driven algorithms, stakeholders are led through the site-selection process (Figure 2).

Data used by Select to create a reusable master site profile of site characteristics and performance include:

  • Enrollment data
  • Cycle time performance
  • Experience
  • Profile information
  • Data quality

With Select, stakeholders steer clear of manual site selection methods, which often lack institutional memory and require collecting data from disparate systems on site capabilities, past performance, and investigator background. This scattered approach aligns with one of the top findings from the Tufts CSDD research on SSU, namely that sponsors and CROs who use older methods tend to re-invent the wheel every time they recruit sites, collecting and re-collecting selection criteria when they already have sufficient knowledge about how specific sites perform.7
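
The scoring logic inside Select is not detailed here; the sketch below is only a simplified illustration of the general idea of combining master-profile fields such as enrollment history, activation cycle time, experience, and data quality into a single comparable score for ranking candidate sites. The field names, weights, and scaling are assumptions.

```python
# Illustrative only: a simple weighted scoring of candidate sites using the
# kinds of profile fields listed above (enrollment, cycle time, experience,
# data quality). Weights, field names, and normalization are assumptions,
# not Select's actual algorithm.
from dataclasses import dataclass

@dataclass
class SiteProfile:
    name: str
    enrollment_rate: float      # historical patients enrolled per month
    activation_cycle_days: int  # past average from pre-visit to initiation
    similar_trials_run: int     # prior experience in the indication
    query_rate: float           # data-quality proxy: queries per 100 data points

WEIGHTS = {"enrollment": 0.4, "speed": 0.3, "experience": 0.2, "quality": 0.1}

def score(site: SiteProfile) -> float:
    """Higher is better; each component is roughly scaled to the 0-1 range."""
    enrollment = min(site.enrollment_rate / 5.0, 1.0)
    speed = max(0.0, 1.0 - site.activation_cycle_days / 240.0)
    experience = min(site.similar_trials_run / 10.0, 1.0)
    quality = max(0.0, 1.0 - site.query_rate / 20.0)
    return (WEIGHTS["enrollment"] * enrollment + WEIGHTS["speed"] * speed
            + WEIGHTS["experience"] * experience + WEIGHTS["quality"] * quality)

candidates = [
    SiteProfile("Site A", 3.2, 150, 8, 4.0),
    SiteProfile("Site B", 1.1, 220, 2, 12.0),
]
for site in sorted(candidates, key=score, reverse=True):
    print(f"{site.name}: {score(site):.2f}")
```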

• Activate

"Activate becomes the repository for in-progress documents, and supports communication, reporting, tracking, oversight, and data management, providing a single source of truth."

Activate is a business process facilitation tool designed to guide the user through SSU using smart workflows that offer better visibility into which activities are next. As part of an integrated clinical stack, data housed in the various solutions, such as individual site performance, country performance and submission activities, are compiled, so the study team can view status in real time. With this information, Activate becomes the repository for in-progress documents, and supports communication, reporting, tracking, oversight, and data management, providing a single source of truth. Information contained in Activate only needs to be entered once, and documents from the principal investigator’s database and the investigator portal are accessed through a dashboard via a single sign-on.

Key capabilities include:

  • More than 60 standardized country workflows to choose from, including tracking site activation, protocol amendments, quality reviews, and expiring documents (Figure 3)
  • Configurable to track any activity, document, submission, and ad-hoc documents
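
To make the idea of a workflow-driven, role-based activation checklist concrete, the simplified sketch below models a country workflow as an ordered list of tasks with owners and statuses, and reports which activity comes next. It is an illustration only, not Activate’s actual data model; the task names and roles are assumptions.

```python
# Simplified illustration of a country-specific activation workflow as an
# ordered checklist of tasks with owners and statuses. Task names and roles
# are assumptions; the point is workflow-driven visibility into "what's next".
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    owner_role: str
    status: str = "pending"   # pending | in_progress | done

germany_site_activation = [
    Task("Collect CVs and financial disclosures", "Site"),
    Task("Prepare ethics committee submission", "CRA", status="in_progress"),
    Task("Ethics committee approval", "Ethics committee"),
    Task("Execute clinical trial agreement", "Contracts"),
    Task("Site initiation visit", "CRA"),
]

def next_task(workflow: list[Task]) -> Task | None:
    """Return the first task that is not yet complete."""
    return next((t for t in workflow if t.status != "done"), None)

current = next_task(germany_site_activation)
if current:
    print(f"Next activity: {current.name} "
          f"(owner: {current.owner_role}, status: {current.status})")
```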

• Analyze

Analyze is a reporting tool that uses data analytics to give stakeholders a view of status across multiple studies. This functionality helps identify bottlenecks across protocols by evaluating completion of documents on the critical path, such as site contracts or an informed consent form, and by tracking cycle times across individual sites as well as countries. If a report signals a trend toward longer completion times for contracts, for example, the clinical team can act quickly to steer lagging sites back on track, or consider adding new sites (a simple illustration of this kind of cycle-time flagging follows the list below). Standard reports are supplemented with customized reports that can be shared with team members in a single click; entering a team member’s e-mail address authorizes them to see a dashboard of the reports. The customized reports can provide visualizations as they drill down into the details of:

  • Document and submissions status
  • Milestone status
  • Task duration
  • Country
  • Site
  • Team member
  • Volume of work scheduled for the next quarter
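
As referenced above, the sketch below illustrates the kind of cycle-time flagging such reports enable: each site’s contract-execution cycle time is compared against the study median, and outliers are flagged for follow-up. The data values and the 1.5x threshold are assumptions for the example.

```python
# Illustrative sketch of cycle-time bottleneck flagging: compare each site's
# contract-execution cycle time against the study median and flag outliers.
# Data values and the 1.5x threshold are assumptions.
from statistics import median

contract_cycle_days = {   # site -> days from contract sent to fully executed
    "Site A (US)": 45,
    "Site B (US)": 52,
    "Site C (DE)": 110,
    "Site D (UK)": 60,
    "Site E (DE)": 95,
}

typical = median(contract_cycle_days.values())
threshold = 1.5 * typical

lagging = {site: days for site, days in contract_cycle_days.items() if days > threshold}

print(f"Median contract cycle time: {typical} days")
for site, days in lagging.items():
    print(f"Lagging: {site} at {days} days (> {threshold:.0f}-day threshold)")
```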

Time to Reflect, Time to Become a Leader
Sponsors, CROs, investigators, and regulators are demanding greater visibility into clinical trial data, expecting answers in real time. This is driving the trend toward process improvement, encouraging stakeholders to reflect on their current SSU status, one of the most challenging and time-consuming aspects of clinical development. By going through a well-crafted assessment, whereby clinical trial teams evaluate their process, people, data, and systems, they can draw conclusions as to whether they are functioning as a leader or a laggard when it comes to areas of study startup, such as intelligent site profiling and better document management. This exercise in maturity will pinpoint where stakeholders are working well, and where they need to focus attention.

Making the necessary changes is facilitated by the availability of out-of-the-box, industry-proven, end-to-end solutions with automated workflows that direct team members through the right steps, improving communication, mitigating risk, and helping them become stronger competitors as they develop much-needed new therapies.

  1. Integrated addendum to ICH E6(R1): guideline for good clinical practice E6(R2). International Conference on Harmonisation harmonised guideline. June 11, 2015. Available at: http://www.ich.org/fileadmin/Public_Web_Site/ICH_Products/Guidelines/Efficacy/E6/E6_R2__Addendum_Step2.pdf. Accessed October 20, 2016.
  2. Miseta E. TransCelerate seeks to improve clinical trial quality. Clinical Leader. October 12, 2016. Available at: http://www.clinicalleader.com/doc/transcelerate-seeks-to-improve-clinical-trial-quality-0001. Accessed October 24, 2016.
  3. Getz K. Assessing and addressing site identification and activation inefficiencies. Tufts Center for the Study of Drug Development. March 2016.
  4. Guidance for Industry: Oversight of clinical investigations — A risk-based approach to monitoring. Food and Drug Administration. August 2013. Available at: http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM269919.pdf. Accessed October 24, 2016.
  5. European Medicines Agency. Reflection Paper on risk based quality management in clinical trials. November 2013. Available at: http://www.ema.europa.eu/docs/en_GB/document_library/Scientific_guideline/2013/11/WC500155491.pdf. Accessed October 24, 2016.
  6. START Study Tufts CSDD-goBalto, 2012. Ken Getz’s presentation: Uncovering the drivers of R&D costs.
  7. Tufts Center for the Study of Drug Development– goBalto START Studies I and II. 2015.
  8. Getz K, Lamberti MJ. 89% of trials meet enrollment, but timelines slip, half of sites under-enroll. Impact Report. Tufts Center for the Study of Drug Development. January/February 2013, 15(1).
  9. Getz K. Assessing and addressing site identification and activation inefficiencies. Tufts Center for the Study of Drug Development. March 2016.
  10. Research and Markets, Research and Markets: The new 2015 trends of global clinical development outsourcing market, Business Wire. January 30, 2015. Available at: http://www.businesswire.com/news/home/20150130005621/en/Research-Markets-2015-Trends-Global-Clinical-Development#.VW3x01xViko. Accessed October 27, 2016.
  11. Morrison R. Technology’s role in clinical trials. Applied Clinical Trials. March 5, 2015. Available at: http://www.appliedclinicaltrialsonline.com/technology-s-role-clinical-trials. Accessed October 22, 2016.
  12. Neuer A. At the source. International Clinical Trials. November 2015. Available at: http://www.clinicalink.com/assets/ICTNOV2015.pdf. Accessed October 21, 2016.
  13. E-Clinical Solution Software Market - Global Industry Analysis, Size, Share, Growth, Trends and Forecast 2014 – 2020. Transparency Market Research. July 2014. Available at: http://www.transparencymarketresearch.com/e-clinical-solution-software-market.html. Accessed October 19, 2016.
  14. Warnock N. The benefits of using integrated technology with clinical trials. Life Science Leader. July 11, 2011. Available at: http://www.lifescienceleader.com/doc/the-benefits-of-using-integrated-technology-0001. Accessed October 20, 2016.
  15. Kim J, Kasher J, Azzi N. Data & technology in clinical trials 2015. The Pharma Review. November – December 2015. Available at: http://www.slideshare.net/NassimAzziMBA/data-technology-56371029. Accessed October 21, 2016.
  16. Maasouman MA. Development of lean maturity model for operational level planning. Concordia University. 2014. Available at: http://spectrum.library.concordia.ca/979560/1/Maasouman_MASc_S2015.pdf. Accessed October 24, 2016.
  17. Choudhury A. Are you ready? How to conduct a maturity assessment. iSixSigma. Available at: https://www.isixsigma.com/new-tosix-sigma/getting-started/are-you-ready-how-conduct-maturity-assessment/. Accessed October 24, 2016.
  18. Morgan C. Making site selection precise and accurate. Clinical Leader. May 25, 2016. Available at: https://www.gobalto.com/hubfs/docs/ArtReprint_08JUL_CL.pdf?t=1477369119836. Accessed October 25, 2016.
  19. goBalto. Global study startup survey reveals majority of life sciences companies lack automated processes. November 12, 2015. Available at: https://www.gobalto.com/news/global-study-startup-survey-reveals-majority-of-life-science-companies-lackautomated-processes. Accessed October 25, 2016.
  20. Lopienski K. 5 problems with using spreadsheets to collect clinical data. Forte Research Systems. January 2014. Available at: http://forteresearch.com/news/5-problems-using-spreadsheets-collect-clinical-data/. Accessed October 22, 2016.
  21. Merge eClinical. Integrating electronic systems in the clinical trial process. Clinical Leader. November 2014. Available at: http://www.clinicalleader.com/doc/integrating-electronic-systems-in-the-clinical-trial-process-0001. Accessed October 22, 2016.
  22. Poor clinical trial site choices inflate costs by 20%. Manufacturing Chemist Pharma. January 28, 2011. Available at: http://www.manufacturingchemist.com/news/article_page/Poor_clinical_trial_site_choices_inflate_costs_by_20/58943. Accessed October 21, 2016.