CDM’s High Standards
Use of standards has become increasingly widespread within clinical data management. Although there are few regulatory mandates for using any particular standard, using standards in all areas of data collection and handling can greatly increase an organization's efficiency by shortening study setup time, incorporating effective and validated practices, and, most importantly, speeding a medical treatment's path to market. Most of the established standards in use are designed to be independent of any platform.
Although multiple standards exist for similar concepts, the ultimate goal is for researchers everywhere to use the same standards and naming conventions for their studies, and the clinical research industry is trending in this direction. The US Food and Drug Administration (FDA) has strongly encouraged (though not yet mandated) the use of the Study Data Tabulation Model (SDTM) for data submissions; see also the SDTM Controlled Terminology Guide, the Data Sets for Study Guide, and the SDTM QA Guide. Data submissions in a standardized format allow the FDA and other regulatory bodies to expend fewer resources on their review of study data and to compare more easily across different studies.
The International Organization for Standardization (ISO) was created in 1947, after delegates from 25 countries met to discuss forming an international body to create and maintain international standards for industry. ISO has created generic standards for product quality and management systems that are applicable to any endeavor, and there are multiple ISO standards specific to various processes involved in clinical research.
The International Conference on Harmonisation (ICH) of Technical Requirements for Registration of Pharmaceuticals for Human Use began in 1990 as an effort to standardize pharmaceutical regulatory requirements in Europe, Japan, and the US. The ultimate objectives of ICH are to 1) maintain safety and quality while increasing efficiency in the use of human, animal, and material resources, and 2) help eliminate unnecessary delays in bringing new medical treatments to market.
The Clinical Data Interchange Standards Consortium (CDISC) creates standards for clinical research data. The CDISC mission is to develop and support global, platform-independent data standards that enable information system interoperability. Also see the CDISC Clinical Research Glossary and its acronyms, abbreviations, and initials resource.
- Clinical Data Acquisition Standards Harmonization (CDASH), released in October 2008 by CDISC, is intended to streamline and standardize data collection at clinical investigative sites. The published CDASH standard consists of a basic set of data collection fields (variable name, definition, metadata) that apply to the majority of case report forms (CRFs), regardless of therapeutic area or phase of development. Sponsors are expected to add therapeutic-area-specific data collection fields, as well as other data collection fields needed for regulatory requirements. The CDASH standard also includes best practice guidelines, regulatory references, and information about the development of the standard. To ensure harmonization between standards, recommendations are provided for mapping CDASH data collection fields (or variables) into the SDTM submission structure. CDASH data collection fields are divided into the following sixteen domains, identified by code: AE, CO, CM, DM, DS, DA, EG, EX, IE, LB, MH, PE, DV, SC, SU, VS.
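The sixteen domain codes above can be captured in a simple lookup table for programmatic use. This is an illustrative sketch: the full domain names below follow common CDISC usage but should be checked against the published CDASH standard.

```python
# Sketch: lookup table for the sixteen CDASH domains listed above.
# Full domain names follow common CDISC usage; treat them as illustrative.
CDASH_DOMAINS = {
    "AE": "Adverse Events",
    "CO": "Comments",
    "CM": "Prior and Concomitant Medications",
    "DM": "Demographics",
    "DS": "Disposition",
    "DA": "Drug Accountability",
    "EG": "ECG Test Results",
    "EX": "Exposure",
    "IE": "Inclusion/Exclusion Criteria Not Met",
    "LB": "Laboratory Test Results",
    "MH": "Medical History",
    "PE": "Physical Examination",
    "DV": "Protocol Deviations",
    "SC": "Subject Characteristics",
    "SU": "Substance Use",
    "VS": "Vital Signs",
}

def domain_name(code: str) -> str:
    """Return the domain name for a two-letter CDASH domain code."""
    return CDASH_DOMAINS[code.upper()]

print(len(CDASH_DOMAINS))   # 16
print(domain_name("vs"))    # Vital Signs
```

A table like this is a common building block for CRF annotation tools, which need to map two-letter codes back to human-readable domain names.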
- Laboratory Model (LAB): The CDISC LAB standard, initially released in 2002, was designed as a standard for the transfer of laboratory data applicable to clinical research. The model organizes data fields into twelve categories, including Good Transmission Practice, Study, Site, Subject, Visit, Accession, Record Extension, Base Specimen, Base Battery, Base Test, and Base Results. Extensions to the LAB base model include microbiology, pharmacogenomics, ECG, specimen handling, and edit/data query.
- Operational Data Model (ODM), released in 2002 by CDISC, addresses the structure (as opposed to the content) of data, providing a standard format for transport so that automation of CRFs is possible within electronic data capture (EDC) systems. ODM uses the Extensible Markup Language (XML) to create a file with the following four primary elements: study name and metadata; administrative information, such as users, sites, and authorizations for the study; reference data (e.g., normal ranges); and clinical data from the study. Supported data formats include integers, decimals, text strings, Boolean terms, hex binary, base 64 binary, dates and times, partial dates and times, intervals, durations, and more.
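The four primary elements can be sketched as a minimal XML document. The element names below (a Study, AdminData, ReferenceData, and ClinicalData under an ODM root) follow the ODM specification, but the attributes and overall structure are simplified assumptions for illustration, not a schema-conformant ODM file.

```python
import xml.etree.ElementTree as ET

# Sketch of the four primary ODM elements; attribute names and values
# are simplified placeholders, not a schema-conformant ODM file.
odm = ET.Element("ODM", FileOID="example-001", ODMVersion="1.3")

study = ET.SubElement(odm, "Study", OID="ST.001")   # study name and metadata
ET.SubElement(study, "GlobalVariables")

admin = ET.SubElement(odm, "AdminData")             # users, sites, authorizations
ET.SubElement(admin, "User", OID="USR.001")

ref = ET.SubElement(odm, "ReferenceData")           # e.g., lab normal ranges

clinical = ET.SubElement(odm, "ClinicalData", StudyOID="ST.001")  # collected data
ET.SubElement(clinical, "SubjectData", SubjectKey="SUBJ-0001")

xml_text = ET.tostring(odm, encoding="unicode")
print(xml_text)
```

Because ODM is plain XML, any standard XML toolkit can produce or consume it, which is what makes the format platform-independent in practice.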
- Study Data Tabulation Model (SDTM) was released by CDISC in 2004 and was developed to provide a standard for the organization, structure, and format of tabulation data to be submitted to regulatory agencies. Tabulation datasets contain collected data from a clinical study and should not be handled in the same manner as the other three types of data submitted to regulatory agencies (analysis datasets, patient profiles, and listings). Under the SDTM variable classification scheme, each variable, which normally corresponds to a column in a dataset, is classified according to its Role. A Role determines the type of information conveyed by the variable in describing an observation. Variables fall into five major roles: Identifier variables, Topic variables, Timing variables, Qualifier variables, and Rule variables; Qualifier variables are further divided into five subclasses: grouping qualifiers, result qualifiers, synonym qualifiers, record qualifiers, and variable qualifiers. Standard domains fall into the following six categories, with their respective codes: Special Purpose Domains (DM, CO, SE, SV), Interventions (CM, EX, SU), Events (AE, DS, MH, DV, CE), Findings (EG, IE, LB, PE, QS, SC, VS, DA, MB, MS, PC, PP, FA), Trial Design Domains (TA, TE, TV, TI, TS), and Special Purpose Relationship Datasets (SUPPQUAL, RELREC). In 2004, CDISC also released an SDTM implementation guide (SDTMIG) with a summary of annotations. This implementation guide is intended to guide the format, organization, and structure of tabulation datasets; any organization using SDTM should also use it.
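The role classification can be illustrated with one record from a hypothetical AE (Adverse Events) dataset, tagging each variable with its role. The variable names (STUDYID, USUBJID, AETERM, AESTDTC, AESEV) are standard SDTM examples, but the record values and the tagging helper are assumptions made for this sketch.

```python
# Sketch: classify the variables of one hypothetical AE record by SDTM role.
ROLE_BY_VARIABLE = {
    "STUDYID": "Identifier",   # identifies the study
    "USUBJID": "Identifier",   # unique subject identifier
    "AETERM":  "Topic",        # the focus of the observation
    "AESTDTC": "Timing",       # start date/time of the event
    "AESEV":   "Qualifier",    # record qualifier: severity of the event
}

record = {
    "STUDYID": "XYZ-123",
    "USUBJID": "XYZ-123-0042",
    "AETERM":  "HEADACHE",
    "AESTDTC": "2024-03-01",
    "AESEV":   "MILD",
}

def variables_with_role(rec, role):
    """Return the variable names in `rec` that play the given SDTM role."""
    return [var for var in rec if ROLE_BY_VARIABLE.get(var) == role]

print(variables_with_role(record, "Identifier"))  # ['STUDYID', 'USUBJID']
print(variables_with_role(record, "Topic"))       # ['AETERM']
```

Grouping columns by role in this way is exactly what a reviewer does when reading a tabulation dataset: identifiers locate the record, the topic states what was observed, and timing and qualifier variables describe when and how.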
- Analysis Dataset Model (ADaM) was initially released by CDISC in 2004 as a standard model for creating analysis datasets for submission to regulatory bodies, and can be thought of as an extension to the SDTM standard. The ADaM describes the proposed content, structure, and metadata of analysis datasets, including analysis dataset metadata, analysis variable metadata, and analysis results metadata, and includes examples of datasets created using the model. Four key principles apply to analysis datasets: they should facilitate clear and unambiguous communication; be usable by currently available software applications; be linked to machine-readable metadata; and be analysis-ready. CDISC has released a draft ADaM Implementation Guide (ADaMIG) to augment the ADaM standard, intended to guide the format, organization, and structure of analysis datasets. ADaM also provides a data structure for adverse event analysis and a set of validation checks.
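The "analysis-ready" principle can be illustrated by deriving an analysis dataset from tabulation-style records: derived variables are computed once, up front, so the statistical analysis needs no further data manipulation. The record values, the AGEGR1 variable, and the age-group cutoff below are hypothetical choices for this sketch, not part of the published standard.

```python
# Sketch: derive an ADSL-style analysis dataset from DM-style tabulation
# records. The age-group cutoff of 65 is a hypothetical derivation rule.
dm_records = [
    {"USUBJID": "XYZ-123-0001", "AGE": 34},
    {"USUBJID": "XYZ-123-0002", "AGE": 71},
]

def derive_adsl(records):
    """Add the derived variable AGEGR1 so the dataset is analysis-ready."""
    adsl = []
    for rec in records:
        out = dict(rec)  # keep the source variables unchanged
        out["AGEGR1"] = "<65" if rec["AGE"] < 65 else ">=65"
        adsl.append(out)
    return adsl

adsl = derive_adsl(dm_records)
print([row["AGEGR1"] for row in adsl])  # ['<65', '>=65']
```

Because the grouping variable is pre-derived, a downstream summary (e.g., counts by age group) can be produced directly from the analysis dataset, which is the practical meaning of "analysis-ready."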
Electronic Common Technical Document (eCTD) standard was developed by the ICH to provide a standardized format for submitting files from pharmaceutical studies to regulatory bodies. Unlike some standards used in clinical research, eCTD focuses more on data and file structures than naming conventions. The eCTD relies heavily on the Document Type Definition (DTD) specification of the XML markup language. These DTDs are used to create a detailed hierarchical folder structure for each eCTD and to support high-level functional requirements.
- eCTD Modules: The eCTD consists of five modules, four of which are common to all countries and regions; the first module may vary between ICH regions. 1. Regional Administrative Information and Prescribing Information: Module One contains administrative information and forms that may vary between countries and regions. 2. Common Technical Document Summaries: Module Two contains summaries of the information contained in Modules Three, Four, and Five. 3. Quality: Module Three provides detailed information about the treatment being studied and details of the product's development and manufacturing processes. 4. Nonclinical Study Reports: Module Four provides detailed pharmacological, pharmacokinetic, and toxicological information. 5. Clinical Study Reports: Module Five contains the results of the study, including data related to background and development rationale, efficacy, safety, and benefits and risks, as submitted electronically to regulatory authorities.
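The hierarchical folder structure implied by the DTDs can be sketched by laying out the five top-level module folders. The `m1` through `m5` folder names follow eCTD convention; the "0000" sequence folder and the temporary root are placeholders for this sketch, not a complete or validated submission layout.

```python
import os
import tempfile

# Sketch: top-level eCTD folder layout. The m1-m5 names follow eCTD
# convention; the "0000" sequence folder and root are placeholders.
MODULES = {
    "m1": "Regional Administrative Information and Prescribing Information",
    "m2": "Common Technical Document Summaries",
    "m3": "Quality",
    "m4": "Nonclinical Study Reports",
    "m5": "Clinical Study Reports",
}

def build_ectd_skeleton(root):
    """Create an empty eCTD-style folder tree under `root`; return the paths."""
    paths = []
    for module in MODULES:
        path = os.path.join(root, "0000", module)
        os.makedirs(path, exist_ok=True)
        paths.append(path)
    return paths

with tempfile.TemporaryDirectory() as root:
    created = build_ectd_skeleton(root)
    print([os.path.basename(p) for p in created])  # ['m1', 'm2', 'm3', 'm4', 'm5']
```

In a real submission, the XML backbone generated from the DTDs indexes the files placed inside this folder tree, which is how the structure supports the high-level functional requirements mentioned above.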
Health Level Seven (HL7) was founded in 1987, initially to produce standards for hospital information systems. The following HL7 standards relate to clinical data management and support increased use of electronic health records within hospitals.
- Reference Information Model (RIM)—provides structure, naming, and coding conventions.
- Clinical Context Object Workgroup (CCOW)—designed to enable different computer applications to communicate with each other effectively.
- Clinical Document Architecture (CDA)—based on the RIM, and uses the XML markup language to specify the coding, structure, and semantics of clinical documents to be exchanged.
eSubmitter (FDA): a standardized tool that is part of an electronic submissions program that originated in the Center for Devices and Radiological Health (CDRH). The eSubmitter program evolved from two very successful pilot programs (eLaser and Turbo 510(k)) at CDRH. FDA eSubmitter is an improved and expanded package for a variety of submission types and is now available for voluntary use by sponsors and manufacturers in certain device, radiological health, and blood regulated industries.
NCI's cancer Biomedical Informatics Grid (caBIG®) is intended to simplify collaboration by leveraging shared expertise and large multidisciplinary data collections to speed many of the processes of cancer research. The four key principles of caBIG®—open access, open development, open source, and federation—have guided the development of interoperable software tools, data standards, and a computing infrastructure conceived to advance basic and clinical research.