TECHNICAL PROPOSAL COVER SHEET

 

                                                              IN RESPONSE TO RFP NIH BAA-RM-04-23

 

TITLE: Re-Engineering the Clinical Research Enterprise: Feasibility of Integrating and Expanding Clinical Research Networks

 

OFFEROR (Primary organization/institution):

 

                Name and Address:                             Columbia University                                                                                          

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

 

                PRINCIPAL INVESTIGATOR:  Stephen B. Johnson, Ph.D.                                                                                        

 

                OFFEROR PERSONNEL Name (Last, First, Initial) and Degree(s)*

 

                Bigger, Thomas,  M.D.                                                                                                                                                      

                Kukafka, Rita,  Dr.PH                                                                                                                                                       

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

 

SUBCONTRACTOR ORGANIZATIONS (If more than one, list each organization and its personnel separately):

                Name and Address:                                                                                                                                                            

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

 

                PRINCIPAL INVESTIGATOR:                                                                                                                                          

               

                SUBCONTRACTOR PERSONNEL Name (Last, First, Initial) and Degree(s)*

 

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

 

COLLABORATORS or CONSULTANTS:

                Name (Last, First, Initial) and Degree(s)                                                                 Organization

 

                Weir, James H.  MD                                                                                                                Independent consultant

                Florenz, Meir,                                                                                                           Independent software consultant

                                                                                                                                                                                                               

                                                                                                                                                                                                               

                                                                                                                                                                                                               

 

* List all co-investigators/key personnel

(USE MORE PAGES, IF NECESSARY, TO LIST PERSONNEL AND ORGANIZATIONS)


BAA-RM-04-23                                 Columbia University              Stephen B. Johnson, Ph.D.

 

ABSTRACT

 

Clinical research is an extremely complex process involving large numbers of stakeholders over extended time periods. Information is vital to all research activities, from conception of the protocol, through execution, to dissemination of results. Poor information flow contributes directly to low research quality and high cost, owing to slow manual processes, the introduction of errors, and the inability to combine information from fragmentary or isolated sources. Unfortunately, information technology has had little penetration into the clinical research enterprise, and what effort there has been has concentrated on trials in academic medical centers. The needs of investigators in community settings differ in striking ways and are not adequately served by the software provided by industry sponsors.

We have developed a successful working clinical trials network of 39 community practice research sites with centralized administration located at an academic medical center. Clinical research networks offer certain economies of scale by providing access to sufficiently large subject populations, standardizing best practices and centralizing administrative, financial, regulatory, and educational activities. However, the efficiency and expansion of the network are limited by the lack of information technology resources. The broad, long-term objectives of this proposal are to address these limitations by improving the information flow among investigators, administrators and participants. The specific aims are to:

1.      Establish a flexible information infrastructure for clinical research, designed to enhance information flow, promote data sharing and reduce redundant effort

2.      Develop a behavioral model of information technology use in clinical research, based on empirical study of information needs of users, barriers to technology use, and strategies for improving use

3.      Promote research quality and encourage best practices through training, information technology for clinical trials, and collaborative technology to improve communication and cohesion

 

The flexible information infrastructure will consist of software that manages the basic “transactions” of clinical research (e.g., scheduling a visit or reporting physical exam values), connecting front-end user applications with back-end databases. Services provided by this infrastructure include automatic translation between different data coding schemes, pooling of common data to encourage reuse and reduce redundant effort, automated monitoring of protocol accruals, and automated matching of potential subjects to trials. This architecture allows multiple commercial and institutional applications to interoperate and provides researchers with real-time access to research data via the creation of a virtual collaboratory.

Our approach is based on the observation that information technology in itself is insufficient to improve research processes; it is necessary to understand the behaviors that are required of all stakeholders, and the various factors that drive these. The research design will compare a number of software tools in community and academic settings, and systematically contrast various educational interventions designed to promote usage. Software will be phased in over the three years, starting with simple, administrative tools in a few sites and expanding to more complex, clinical tools at a larger number of sites.

 

MILESTONES

 

Specific Aim: Information Infrastructure

Year 1: Standardize data elements that define protocols; add tables to the shared database that store protocols and enrollment; deploy an application to collect data on protocols and enrollment; use pooled data to automatically monitor accruals.

Year 2: Standardize data elements that define patient characteristics; add tables to the shared database that store patient characteristics; deploy an application to collect data about individual patients; use pooled data to automatically screen patients for trials.

Year 3: Standardize data elements that define visits and procedures; add tables to the shared database that store visits and procedures; deploy an application to collect data on visits and procedures; use pooled data to automatically determine site payments.

Specific Aim: Behavioral Modeling

Year 1: Use focus groups to identify the values and concerns of key stakeholders regarding existing systems and procedures; analyze focus group data to identify needs that can be managed by an information system; identify the stakeholders and behaviors required to use the application that collects data on protocols and enrollment; identify the factors that influence the required behavioral changes; develop and implement proactive approaches specifically targeted at influencing the behavioral factors favorably.

Year 2: Identify the stakeholders and behaviors required to use the application that collects data about individual patients; identify the factors that influence the required behavioral changes; develop and implement approaches specifically targeted at influencing the behavioral factors favorably.

Year 3: Identify the stakeholders and behaviors required to use the application that collects data on visits and procedures; identify the factors that influence the required behavioral changes; develop and implement approaches specifically targeted at influencing the behavioral factors favorably.

Specific Aim: Promoting Best Practices

Year 1: Install collaboration software, make protocol documents available to the network, and train users; update and reconfigure GCP training materials, test them in live courses, and evaluate them in focus groups; set up the system to conduct the metrics/benchmark experiment.

Year 2: Use the collaboration software to deploy online training materials and to hold focus groups; test and evaluate GCP training materials; conduct the metrics/benchmark experiment.

Year 3: Use the collaboration software to conduct live training; adjust and publish GCP training materials; analyze metrics data and make recommendations for enhancements.

PROGRAM PLAN

 

I.                  Detailed Technical Plan and Proposed Statement of Work

A.     Rationale

Multiple barriers prevent the effective translation of basic science discoveries into clinical and community practice. Our collective inability to break through these barriers has a significant impact on the ability of health professionals to provide safe, effective patient care and to reduce the cost of healthcare delivery.[1] Significant steps must be taken to improve the inherent clinical research capacity and translational ability of the enterprise, including development of the information management facilities of the healthcare system.[2]

In the current healthcare environment, the use of information technology has increased due to the complexity, volume, scope, and nature of the information needed to deliver patient care, with documented contributions to efficiency, quality, and productivity.[3] The demand for such tools is just as urgent in clinical research. Numerous computer applications have been developed for the clinical research domain, yielding a variety of improvements in the research process: decreasing the time and effort required to develop protocol documents,[4] increasing the efficiency of subject recruitment,[5] increasing the quality of data collected during the course of protocols,[6] [7] decreasing protocol deviations,[8] automating data analysis,[9] and providing the facility to execute novel study designs.[10] [11] Unfortunately, despite these proven and potential benefits, physician investigators and research staff and their community counterparts have not harnessed such applications, owing to a combination of organizational, technical, and social factors.[12]

Clinical research is an extremely complex process involving large numbers of stakeholders over extended time periods. Clinical research networks offer certain economies of scale by providing access to sufficiently large subject populations, standardizing best practices and centralizing administrative, financial, regulatory, and educational activities. Information is vital to all research activities, from conception of the protocol, through execution, to dissemination of results.  Poor information flow contributes directly to low research quality and high cost, owing to slow manual processes, the introduction of errors and the inability to combine information from fragmentary or isolated sources. Unfortunately, information technology has had little penetration into the clinical research enterprise, limiting the effectiveness of networks.

Information technology in itself is insufficient to improve research processes.[13] [14] Our approach is based on the observation that it is necessary to understand the behaviors that are required of all stakeholders, and the various factors that drive these.[15] The research design will compare a number of software tools in community and academic settings, and systematically contrast various educational interventions designed to promote usage. Software will be phased in over the three years, starting with simple, administrative tools in a few sites and expanding to more complex, clinical tools at a larger number of sites.

This proposal seeks to enhance the efficiency of our tri-state community-based clinical research network by improving the information flow among investigators, clinical research coordinators, administrators, and participants. We will adopt a multidisciplinary approach, combining information technology, behavioral science and educational methods to effect change in the network.

 

B.     Specific Technical Steps to Be Taken

1.      Develop a flexible information infrastructure for clinical research

There has been little penetration of information technology into clinical research. Software tools have been created to assist with various stages of clinical research, such as design, recruitment, data collection, and analysis. However, these tools are poorly integrated, due to lack of standards and other factors.[16] [17] The focus of this proposal is the flow of information between clinical research applications and other health care information systems (e.g. electronic medical records). Our approach allows different sites within community and academic settings to acquire and use different applications, as dictated by local information needs, site capabilities and availability of funds. Some software will be purchased from vendors while other modules will be developed internally. We seek to develop a data-sharing infrastructure that integrates these applications.

Since the range of applications in clinical research is very large, we will investigate examples of data sharing in three areas: administrative, subject-specific and protocol-specific. Administrative applications are concerned with the general management of protocols. This information includes the institution (site) at which research occurs, personnel, dates of activity, and possibly therapeutic target, entry criteria, body sites, etc. Subject-specific applications gather information about individual patients such as demographics, history, medications, and labs. Protocol-specific applications include scheduling of treatments and visits and collection of data that are unique to a particular study. This tripartite division roughly reflects increasing complexity of the software, and we will deploy applications in this order over the three years (see Milestones section above).  We hypothesize that complexity is a major factor in ease of deployment and user acceptance.  This question and related behavioral issues are discussed in the following section.
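
To make the tripartite division concrete, the following minimal sketch shows one record type for each category of application. The field names are illustrative assumptions for exposition only; they are not the schemas of the Clinical Trials Manager or Cancer Center systems described below.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class ProtocolRecord:            # administrative: general management of a protocol
        protocol_id: str
        site: str
        principal_investigator: str
        irb_number: str
        start_date: date

    @dataclass
    class SubjectRecord:             # subject-specific: data about an individual patient
        subject_id: str
        birth_date: date
        medications: list = field(default_factory=list)

    @dataclass
    class VisitRecord:               # protocol-specific: visits and procedures for one study
        protocol_id: str
        subject_id: str
        visit_date: date
        procedures: list = field(default_factory=list)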

Administrative applications are concerned with the general management of protocols. Within the Clinical Trials Network, we have experience in tracking some of this information with the Clinical Trials Manager (Appendix D). We also have substantial experience on the academic side with the Cancer Center information system (Appendix F). We propose to combine the best features of these two systems to create a generic system for clinical research. The merging will not be overly difficult, since the systems are both Web-based and share a common database platform (Sybase).

Although developed for a single disease area, the Cancer Center system has many features required for the network, e.g., maintaining information about investigators and their funding (source, title, major goals, start and end dates, cost information, percent effort of personnel, etc.). The system also tracks individual studies, tying them to their IRB protocol number. This information includes principal investigator, title, contact person and various important dates for approvals and renewals. The system has several features that will be particularly valuable when adapted to community research, such as informing administrators when protocols need to be renewed, tracking the number of patients active in a trial, and identifying which protocols are due for interim or final analysis.

We will build on this preliminary work to develop software to manage recruitment in clinical trials.  Meeting recruitment goals is an almost universal problem for clinical trials.  Most trials have to add clinical recruiting sites, adjust strategies, or extend the recruitment period to meet their recruitment goals.  Informal studies in our Network have shown that active management of recruitment increases recruitment by 50% to 100% compared with a laissez-faire policy.  Simple, inexpensive procedures are effective for monitoring recruitment.  First, pre-trial estimates of available, eligible patients need to be reasonably accurate.  Second, each clinical site must always know whether it is ahead or behind in recruitment, and by how much.  During the pilot phase of the Clinical Trials Network, we have monitored recruitment manually.  But with more sites and studies, manual monitoring will take too much personnel effort.  Therefore, we plan to develop software that will utilize centrally pooled data to monitor recruitment performance.  Early detection of recruitment deficits will prompt discussions between the central Network personnel and clinical site personnel to identify feasible new recruitment strategies.  It is important to detect a growing recruitment deficit early because corrective action usually takes months to reverse recruitment shortfalls.
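
In outline, the accrual monitor compares each site's actual enrollment, drawn from the pooled database, against a pro-rated recruitment target. The Python sketch below is a minimal illustration under assumed record layouts, field names, and thresholds; it is not the planned production software.

    from datetime import date

    def accrual_report(enrollments, targets, trial_start, trial_days, as_of, tolerance=0.9):
        """Compare actual vs. expected accrual for each (site, protocol) pair.

        enrollments : list of dicts with keys 'site', 'protocol', 'enrolled_on' (date)
        targets     : dict mapping (site, protocol) -> total subjects expected over trial_days
        """
        elapsed = (as_of - trial_start).days
        fraction = min(max(elapsed / trial_days, 0.0), 1.0)   # portion of recruitment period elapsed
        actual = {}
        for e in enrollments:
            if e['enrolled_on'] <= as_of:
                key = (e['site'], e['protocol'])
                actual[key] = actual.get(key, 0) + 1
        report = []
        for key, goal in targets.items():
            expected = goal * fraction
            count = actual.get(key, 0)
            status = 'on track' if count >= tolerance * expected else 'deficit'
            report.append((key[0], key[1], count, round(expected, 1), status))
        return report

    # Illustrative data only: one site lags its pro-rated goal, which would prompt a
    # discussion between central Network personnel and the site.
    rows = accrual_report(
        enrollments=[{'site': 'Site-A', 'protocol': 'P-001', 'enrolled_on': date(2005, 3, 1)}],
        targets={('Site-A', 'P-001'): 40, ('Site-B', 'P-001'): 40},
        trial_start=date(2005, 1, 1), trial_days=365, as_of=date(2005, 7, 1))
    for row in rows:
        print(row)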

The next stage of deployment is a subject-specific application that gathers information about individual patients.  Here again we can build on the Cancer Center information system, which collects data on demographics, medical history, medications, labs, and much more. The primary technical challenge we will face is that this system maintains information on individual patients by hospital number. New techniques for identifying patients must be devised that still allow clinical information to be gathered from multiple sources, e.g., the medical records systems of participating hospitals.

We will build on this preliminary work to develop a registry for patients who want to volunteer for clinical trials. Such a registry can help address the numerous barriers for potential trial participants, primary physicians, and investigators introduced by the Health Insurance Portability and Accountability Act (HIPAA).  This information system will serve two purposes: (1) to ensure that individuals who want to consider clinical trials are made aware of opportunities and (2) to help investigators meet their clinical trials recruitment goals by screening willing participants.  Washington University in St. Louis has had such a registry in operation for several years, and their system will serve as our model.[18]

Once data become available on individual patients, it is possible to develop automatic screening to identify likely clinical trial candidates. The key concept of this approach is that the clinical office staffs will enter most of the screening data as part of the routine conduct of the practices’ business. Our central database will permit electronic screening for patients who may be eligible to participate in clinical trials, and provide valuable support for decisions about participating in clinical trial opportunities.  A common mistake community physicians make is to accept trial opportunities when they lack enough eligible patients to be productive.  Making this error has dire consequences for personnel and financial management because so many of the expenses of participating in a trial, e.g., reviewing the protocol, contract and budget negotiation, completing regulatory documents, and obtaining Institutional Review Board approval are incurred before any appreciable income is realized.  If recruitment is poor, these expenses are never recovered. 

During the conduct of the trial, the information system can produce lists of potentially eligible patients who have appointments during the following week and the research coordinator can screen their charts when the patient charts are pulled for office visits. Having routine lists for chart pre-screening promotes efficient and continuous screening despite other distracting events. 

We have developed a prototype screening module that automatically evaluates admissions to relevant hospital floors of New York-Presbyterian Hospital each day and compares all admissions with the inclusion and exclusion criteria for the ACCORD trial.  Research coordinators are notified of potentially eligible patients within 24 hours of admission so that the patients’ physicians can be contacted, the charts screened, and the patients contacted while they are still in the hospital (even with the short hospitalizations of contemporary America).  Our preliminary experience is that the number of charts to review is dramatically reduced, but only rarely is an eligible patient excluded from further screening.  These automated screening procedures help the coordinators screen effectively without overloading them.  If the community practices are affiliated with a hospital that is willing to share its clinical data, the research team can screen all of the patients in their affiliated hospitals, not just their own patients.  Over time, we believe that we can capture data for patients at most of the New York-Presbyterian Healthcare System hospitals.  Casting a wider net will increase the probability of reaching recruitment goals within the allotted time, an important metric of successful trials.
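
Conceptually, the screening module reduces a protocol's inclusion and exclusion criteria to simple predicates evaluated against the structured data available for each admission; only patients passing the automated filter are forwarded to coordinators for chart review. The sketch below is a simplified illustration under assumed field names and invented criteria, not the actual ACCORD screening logic.

    # Hypothetical eligibility filter: inclusion criteria must all hold,
    # exclusion criteria must all fail, for a patient to reach the coordinator worklist.
    def make_screener(inclusion, exclusion):
        def screen(admission):
            return all(rule(admission) for rule in inclusion) and \
                   not any(rule(admission) for rule in exclusion)
        return screen

    # Illustrative criteria (assumed; not the protocol's actual rules or cutoffs)
    inclusion = [
        lambda a: a['age'] >= 40,
        lambda a: 'type 2 diabetes' in a['diagnoses'],
    ]
    exclusion = [
        lambda a: a['creatinine'] > 1.5,          # exclude renal insufficiency (illustrative cutoff)
        lambda a: a['on_insulin_pump'],
    ]

    screen = make_screener(inclusion, exclusion)
    admissions = [
        {'mrn': '0001', 'age': 63, 'diagnoses': {'type 2 diabetes', 'hypertension'},
         'creatinine': 1.1, 'on_insulin_pump': False},
        {'mrn': '0002', 'age': 35, 'diagnoses': {'asthma'}, 'creatinine': 0.9,
         'on_insulin_pump': False},
    ]
    worklist = [a['mrn'] for a in admissions if screen(a)]
    print(worklist)   # coordinators review only the charts on this list -> ['0001']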

Protocol-specific applications include scheduling of treatments and visits and collection of data that are unique to a particular study. We have experience with this type of application in the Cancer Center (Appendix F). That system gathers information on visits as well as procedures such as biopsies, cytologies, x-rays, scoping procedures, and exploratory surgery. The system also tracks types of therapy used to treat the cancer patient, such as surgery, radiation, chemotherapy, and immunotherapy. The system collects follow-up information on the patient after discharge from the hospital to ascertain any additional therapy the patient may have received, as well as the patient’s survival status.

Building on this work, we will pursue software to facilitate payment for the services conducted in clinical trials. As a result of almost five years of experience running a community-based Clinical Trials Network, we are aware that the barriers in this setting are substantially different from those in academic medical centers.  We are developing operating procedures and information technology solutions to remove these barriers. One striking finding in the operation of the Clinical Trials Network was that community medical practices have small cash reserves and relatively large expenses.  Accordingly, these community practices cannot tolerate the sluggish financial systems found in most academic medical centers. For community practices to conduct clinical trials as part of academic networks, especially government-sponsored trials, they will require better information systems and standard operating procedures that enable timely payment. The tasks for the payment information systems include tools for enumerating the payable items for the schedule of visits and procedures in a clinical trials protocol, Web-based tools for logging billable services, and links between the database of billable services and software for generating invoices for the sponsor and requesting checks from accounts payable.  We have conducted focus groups in our Network and have assessed requirements for a payment information system.  We have developed software and tested it with NIH-sponsored trials deployed in the community practice Clinical Trials Network, but have yet to test it with industry-sponsored trials (see Appendix E).  If we can harmonize payment software so that it works with both government- and industry-sponsored trials, we will have removed one of the important barriers to public-private cooperative ventures and partnerships, speeding advances for human health in both venues.
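
The payment tools can be thought of as three linked steps: enumerate the payable items defined by the protocol's schedule of visits and procedures, log billable services as they are performed, and roll logged services up into sponsor invoices. The sketch below illustrates that flow with hypothetical item codes and rates; it is not the software described in Appendix E.

    from collections import defaultdict

    # Payable items enumerated from a protocol's schedule of visits and procedures
    # (codes and amounts are illustrative assumptions).
    fee_schedule = {
        'screening_visit': 250.00,
        'ecg': 75.00,
        'blood_draw': 40.00,
    }

    # Billable services logged by sites as work is completed.
    service_log = [
        {'site': 'Site-A', 'protocol': 'P-001', 'item': 'screening_visit'},
        {'site': 'Site-A', 'protocol': 'P-001', 'item': 'ecg'},
        {'site': 'Site-B', 'protocol': 'P-001', 'item': 'screening_visit'},
    ]

    def invoice_totals(log, schedule):
        """Aggregate logged services into per-site, per-protocol invoice amounts."""
        totals = defaultdict(float)
        for entry in log:
            totals[(entry['site'], entry['protocol'])] += schedule[entry['item']]
        return dict(totals)

    print(invoice_totals(service_log, fee_schedule))
    # {('Site-A', 'P-001'): 325.0, ('Site-B', 'P-001'): 250.0}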

We propose to develop an information infrastructure that connects these applications through a central hub (Figure 1). As information is captured locally by user applications, it will be transmitted to the hub, where it can be stored for later use and propagated to other sites when it is needed. We are interested in intercepting the basic “transactions” of clinical research, such as describing a research site, identifying an investigator, beginning a new study, entering demographics for a patient or scheduling a visit. The purpose is to reduce redundant entry of data. For example, if information about an investigator has been entered into a computer application at one site in the network, it should not be necessary to enter this information again at another site if he or she is about to conduct a study there. Similarly, information about patients who participate in multiple trials should be captured once. As new information about a patient becomes available, it should be shared with sites that require these data.  Policies for data security and authorization are discussed in Appendix G.

Figure 1. Clinical Research Support in Community Settings
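
The hub's role in eliminating redundant entry can be illustrated with a single transaction type, registering an investigator: the first transaction creates the shared record, and later transactions from other sites simply reuse it. The sketch below uses an in-memory store and invented field names purely for illustration; the production hub would sit on the shared database and interface engine described below.

    # Minimal illustration of pooling a "register investigator" transaction at the hub
    # (in-memory store and field names are assumptions for illustration only).
    class Hub:
        def __init__(self):
            self.investigators = {}          # keyed by a network-wide identifier

        def register_investigator(self, txn):
            key = txn['investigator_id']
            if key in self.investigators:
                # Already entered at another site: reuse the pooled record.
                self.investigators[key].setdefault('sites', set()).add(txn['site'])
                return self.investigators[key]
            record = {'name': txn['name'], 'degrees': txn['degrees'], 'sites': {txn['site']}}
            self.investigators[key] = record
            return record

    hub = Hub()
    hub.register_investigator({'investigator_id': 'INV-17', 'name': 'Smith, J.',
                               'degrees': 'MD', 'site': 'Site-A'})
    # A second site starting a study with the same investigator does not re-enter the data.
    rec = hub.register_investigator({'investigator_id': 'INV-17', 'name': 'Smith, J.',
                                     'degrees': 'MD', 'site': 'Site-B'})
    print(sorted(rec['sites']))   # ['Site-A', 'Site-B']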

 

This infrastructure presents significant challenges for the present generation of clinical research software. Very few research systems are designed with data sharing in mind. Therefore, it is necessary to modify these systems to transmit key clinical research transactions as they occur. These modifications can be carried out with relative ease with software that we have been developing. However, modifications to vendor applications can present significant obstacles, requiring negotiation and additional cost. This will be a major factor in selecting vendor applications for deployment in the network: “closed” applications that cannot be integrated with reasonable effort and expense will be considered significantly less desirable.

Additional technical barriers to sharing research data include the fact that data are captured and stored in radically different database structures, and that the data items are coded using different schemes, many of them proprietary. Therefore, one of the primary functions of the central hub is to translate among the various “languages” employed by clinical research applications. We will make use of resources being developed by the Clinical Data Interchange Standards Consortium (CDISC), which is defining industry standards to support the electronic acquisition, exchange, submission and archiving of clinical trials data and metadata. The CDISC data models will ultimately support the end-to-end data flow of clinical trials, from sources into operational databases, through analysis to regulatory submission. Sources of data include patient records and clinical laboratory data. Operational data covers data acquisition during clinical trials, interchange, and archiving. Submission data covers the flow from operational databases to regulatory bodies. CDISC is based on Extensible Markup Language (XML), which is an important industry standard, and also incorporates many aspects of Health Level 7 (HL7), a key standard in patient care applications.

The essential component of the central hub is an interface engine, which enables clinical research applications to communicate. This technology helps to propagate transactions from one computer system to another, translate the structure of the data into a standard form, and translate the data elements into standard codes. We have substantial experience within our academic medical center with e*Gate Integrator 5.0 (developed by SeeBeyond), which enables our patient care applications to communicate across two campuses (Presbyterian Hospital and New York Hospital). We believe that this tool can be adapted to support clinical research in both  community and academic settings.

We have developed a special piece of technology to assist with the translation of data elements from one coding system into another. The Medical Entities Dictionary (MED)[19] acts as a kind of Rosetta stone, storing all the different ways a medical concept can be expressed in different systems. The MED integrates ICD, CPT, UMLS, NDC, SNOMED and many other local coding systems that feed into our electronic medical record. We have successfully integrated the MED with e*Gate. The MED provides an extremely flexible way to define data elements in a single, centralized resource. Our challenge for clinical research is to map clinical research data elements (e.g. those defined in CDISC) to existing concepts in the MED, and to add any new data elements that are not yet present (e.g. Common Data Elements defined by NCI). We also anticipate discovering many new data elements that have not yet been identified by any standards body. A key advantage of the MED is the ability to include these local codes as well as national standards.
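
Conceptually, the translation service keys every local or standard code to a single MED concept and can then re-express that concept in any target coding system for which a mapping exists. The dictionary-based sketch below illustrates the lookup; the concept identifiers and code mappings are invented for illustration and are not actual MED content.

    # Toy "Rosetta stone": each concept lists the codes that express it in
    # different coding systems (identifiers and mappings are illustrative only).
    concepts = {
        'MED-0001': {'label': 'Serum glucose measurement',
                     'codes': {'LOINC': '2345-7', 'CPT': '82947', 'LOCAL-LAB': 'GLU'}},
    }
    # Reverse index: (system, code) -> concept id
    index = {(system, code): cid
             for cid, c in concepts.items()
             for system, code in c['codes'].items()}

    def translate(code, source, target):
        """Map a code from one coding system to another via the shared concept."""
        cid = index.get((source, code))
        if cid is None:
            return None                      # unmapped local code: a candidate for addition to the dictionary
        return concepts[cid]['codes'].get(target)

    print(translate('GLU', 'LOCAL-LAB', 'LOINC'))   # 2345-7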

The chief advantage of sending clinical research transactions through a central hub is that it becomes possible to pool data centrally for sharing. This enables applications to share data about investigators, subjects, protocols, sponsors, suppliers, costs, etc. In this manner, data collected by an application can be reused by another at a later time, reducing redundant entry. Another benefit is the ability to perform quality assurance (consistency checks), to determine whether new data are consistent internally or historically (e.g., checking that a subject has the same birth date now as when first registered in a trial). Central pooling also opens the possibility of using data in new ways, augmenting what can be accomplished at a single site.
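
As one example of the consistency checks made possible by central pooling, the sketch below compares a newly submitted birth date against the value stored when the subject was first registered and flags the discrepancy for review. The in-memory pool and field names are assumptions for illustration only.

    pooled_subjects = {'SUBJ-042': {'birth_date': '1948-02-11', 'sex': 'F'}}

    def check_consistency(subject_id, incoming):
        """Return the fields whose incoming values disagree with the pooled record."""
        stored = pooled_subjects.get(subject_id)
        if stored is None:
            return []                                   # new subject: nothing to compare yet
        return [field for field, value in incoming.items()
                if field in stored and stored[field] != value]

    # A later transaction reports a different birth date; the hub flags it for review.
    print(check_consistency('SUBJ-042', {'birth_date': '1948-12-11', 'sex': 'F'}))
    # ['birth_date']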

2.      Develop a behavioral model of information technology use in clinical research

We will conduct an empirical study of the information needs of users, barriers to technology use, and strategies for improving use. The technical steps of this approach are based on an IT implementation framework developed by Kukafka et al.,[20] which is grounded in more than two decades of behavioral research and rests on two propositions.  The first proposition is that IT use is complex, multi-dimensional and influenced by a variety of factors at the individual, group and organization levels.  Based on behavioral theory, we maintain that it is more effective to target multiple levels simultaneously than a single level. For example, it is more effective to attend to environmental factors such as technology infrastructure in conjunction with individual factors, such as skills and unfavorable attitudes towards IT and change. The second proposition is that success in achieving change is enhanced by the active participation of members from the target user groups.  Accordingly, we engage end users in the first phase of the IT implementation framework, when we identify the goals and needs of the organization as perceived by each user group.  Each phase then progressively builds on this assessment, pinpointing key causes and factors that contribute to problems or needs.  This process of assessment enables an IT implementation plan to be developed whose objectives and strategies are linked at each phase to what is learned about the needs of the organization.  In short, the framework promotes participatory design through a linkage system of critical assessment phases to ensure that the IT implementation planners have a structure in place to engage system end users effectively from the start.

The phases of the implementation framework are summarized in Table 2. In Phase 1, we will engage all stakeholders (PIs, study coordinators, office managers, etc.) as active partners in “diagnosing” the problems they perceive to impact their existing procedures for conducting clinical research.  This will enable our developers to expand their knowledge by identifying the values and subjective concerns of these key stakeholders with existing systems and procedures.

The research team will use focus groups to perform this initial needs assessment.  The focus group approach assumes that a homogeneous group will provide the participants with the freedom to express their thoughts, views, and behaviors.  Consequently, separate groups will be conducted for each stakeholder group (e.g., PIs, study coordinators). Consistent with published assumptions and methods in the literature, each focus group will consist of 5-10 members and will be held in a setting that is permissive and non-threatening.[21] [22]  Two research team members will co-lead the focus groups. One individual will moderate the focus group while a second individual will serve as an observer and note things such as body language and changes to model diagrams that will not be picked up by the tape recorder.

As part of each focus group, members will participate in the evaluation and refinement of use cases. A use case is a type of scenario that describes a typical interaction between a user and a computer system as a series of sequential steps.[23] [24]  The research team will develop the initial use cases and will work with intended users during focus groups to evaluate and refine use cases for the integrated system. The use cases will be represented using Unified Modeling Language.[25] The modular and integrated system level functions will be evaluated against the use case.  The recordings of each focus group will be transcribed verbatim.

After the data from each focus group have been entered into NVivo software,[26] the comments of the participants will be analyzed by two members of the research team.  These data will then be examined by the developers to identify the components of the expressed needs that can be managed through an IT software solution (Phase 2 of the IT implementation framework).

Based on the Phase 1 analysis, user needs will be mapped to the existing software available to manage the various stages of clinical research; when software does not exist to meet user needs, a determination will be reached on the functionality of additional modules required for internal development.  It is here that each stakeholder’s needs for the integrated system will be matched with the integrated system’s functions to ensure that the system is both flexible and comprehensive in meeting end-user requirements.

With these phases complete, end users and developers should be harmonized with respect to the specifications and functionality identified for the integrated system.  This understanding will be formalized with a written document to each participating site to clarify who will benefit, by how much, and what outcome will be achieved by what time. 

The methods in Phases 1 and 2 described thus far will contend with major pre-implementation organizational change factors that have been empirically associated with the successful uptake of new information technology.  For example, Guimaraes[27] (1981) found that incomplete and inaccurate evaluation of user needs results in implementation failure.  Second, end users are more likely to support a project that they believe is in their own best interests.  Romano[28] and Lorenzi et al.[29] found that end users will be less likely to resist if they see advantages of the proposed change.  Demonstrated benefits and valued consequences have been found to have a positive impact on implementation.  The use case scenarios developed in Phase 1 will enable end users to map their own roles and specific tasks to the functions of the integrated system, thereby making transparent the utility of the system with respect to their own job performance.  Third, the project team should understand end user needs and ensure that end users understand how the project will meet those needs.[30] [31] [32] Lee and Steinberg[33] found that successful implementation was positively correlated with clear objectives and explicitly identified, tangible and measurable tasks.  The formalized document written at the end of Phase 2 will capture mutual agreement based on a shared understanding and analysis of users’ needs.

Phases 3 and 4 look more closely at the factors that determine the behaviors of end users necessary for successful IT effectiveness.  It is here that the IT framework focuses on the individual-level behaviors required for users to incorporate the software into their daily routines (e.g., entering data, reviewing reports), and it is also here that a distinction is made between IT adoption at the organizational level and IT effectiveness.  IT effectiveness, or the return the organization realizes from adopting a new technology, can be seen as a function of (1) how well and consistently the “human users” use the technology and (2) the capabilities of the technology itself.  Thus, IT effectiveness is in essence the “human connection” between an organization’s decision to adopt a new technology and the organization realizing a return on its investment in the technology.  IT implementation effectiveness is the consistent and skillful use of the IT by targeted users, and IT implementation research seeks to determine what motivates targeted users in the aggregate to accept and use the technology.  Accordingly, consistent and skillful use is a necessary, but not sufficient, condition for the system to demonstrate effectiveness.  In the context of the current proposal, the use of the integrated system to manage clinical trials is believed to offer clear advantages over the current processes.  Thus, while the consistency and quality of each stakeholder’s use of this technology would demonstrate IT implementation effectiveness, the effectiveness of the integrated system itself will be demonstrated by changes in the quality metrics, as described in the following section on evaluating outcomes.


 
Implementation is defined as the process of getting a targeted group to use a system in an appropriate and committed manner.  IT implementation effectiveness results when the targeted users employ the system in a skillful and consistent way.  In this proposal, we will use quantitative survey methods and on-site visits to examine several determinants of IT use.  Grouping these determinants into three categories enables a more comprehensive analysis at various levels of influence.  Predisposing factors are those antecedents to behavior that provide the rationale or motivation for the behavior.  Enabling factors are the antecedents to behavior that allow a motivation to be realized.  Reinforcing factors are factors subsequent to a behavior that provide the continued reward or incentive for the behavior and contribute to its persistence and maintenance.  A search through the relevant literature reveals that typical predisposing factors include knowledge, beliefs, values, attitudes and confidence.  Enabling factors include skills and training, and the accessibility of the resources that make it possible for a motivated person to take action (e.g., availability of computers and Internet connections at locations consistent with workflow processes, availability of technical support).  Reinforcing factors are the attitudes and the climate of support that facilitate behavior change, including management, co-workers, etc.  Table 1 summarizes the determinants that have been empirically identified as most important in influencing the behaviors required for users to incorporate the software into their daily routines.

Predisposing Factors

    Perceived Usefulness: Degree to which a person believes that using the system would enhance his or her job performance

    Relative Advantage: Degree to which using the system is perceived as being better than using its precursor

    Outcome Expectations: Expectations that the system can deal with job-related outcomes

    Ease of Use: Degree to which using the system is perceived as being free of difficulty

    Job-fit: Extent to which an individual believes that using the technology can enhance the performance of his or her job

    Self-efficacy: Judgment of one’s ability to use a technology to accomplish a particular job or task

Reinforcing Factors

    Social Norm: The person’s perception that most people who are important to him or her think he or she should or should not perform the behavior in question (e.g., enter data)

    Image: Degree to which use of the system is perceived to enhance one’s status in the organization; the individual’s internalization of the reference group’s subjective culture

    Results Demonstrability: Tangibility of the results of using the system (e.g., less time to accomplish the same task; improved outcomes such as meeting patient recruitment goals)

Enabling Factors

    Availability of training and technical support

    Objective factors in the environment that observers agree make an act easy to accomplish (e.g., location of the computers)

    Facilitating conditions (e.g., compatibility with other systems in use)

Table 1.  Key Determinants of User Behavior

A search through the relevant literature has also yielded validated surveys for assessing the predisposing and reinforcing determinants.  For example, the following are items in one such survey to measure the determinant relative advantage,[34] defined as the degree to which using the system is perceived as being better than using its precursor: “using the system will enable me to accomplish tasks more quickly”, “using the system will improve the quality of the work I do”.  The following is another example of items to measure outcome expectations,[35] a construct that relates to the consequences of the behavior: “If I use the system… I will increase my effectiveness, I will spend less time on tasks, I will increase the quality of output, I will increase the quantity of output for the same amount of effort.”  We will adapt an existing validated survey to assess predisposing and reinforcing factors rather than develop a survey instrument from scratch.  We will pretest the survey instrument with a sample of end users prior to implementation.  We will make every effort to keep the survey instrument short (no more than two pages, taking fifteen minutes or less to complete).  We will administer the survey online, with follow-up by phone, until we reach a response rate of at least 70%.  Data will be analyzed using SPSS statistical software.[36]

The third category of factors, enabling factors, will be assessed during an onsite visit with each participating agency.   The major tasks in the assessment of the enabling factors will be accomplished using procedures that were developed at our institution.  One such procedure utilizes a participatory design process that enables users to understand how the use of a new IT system can be best integrated into existing workflow processes.  A second procedure is workflow analysis.[37] Instruments and processes for this assessment are provided in Appendix H.

In phase 5 (the final phase) of the IT implementation framework, we will develop and implement approaches that are proactive and specifically targeted to influencing favorably the predisposing, enabling, and reinforcing factors identified in Phase 4. 

The overall thrust of the processes in Phases 3 and 4 is the need to identify potential barriers and the factors that can facilitate the desired change. The goal of Phase 5 is to develop an action plan that mitigates barriers and effectively utilizes those factors that can facilitate the desired change.  The ultimate purpose of understanding the determinants of user adoption behavior is to develop educational interventions that increase the likelihood of its occurrence. Our approaches will incorporate skills training, a marketing mix of carefully constructed communication messages and channels to affect attitudes and beliefs that are seen as barriers to IT system use, and other methods best suited to address the barriers and disincentives we identify.  The strategies we employ will be largely derived from behavioral theory.  In particular, we will rely on intention-based theories such as the theory of reasoned action,[38] the theory of planned behavior,[39] social cognitive theory[40] and the technology acceptance model.[41] [42]

Persuasion refers to “an active attempt to influence people’s action or belief by an overt appeal to reason or emotion”[43] or “communication intended to influence choice.”[44]  Since beliefs are the ultimate determinants of behavior, influencing an intention or the corresponding behavior requires changing the underlying beliefs.  Persuasion has been shown to be one of the most important strategies for influencing beliefs and behavior.[45]  Based on behavioral and persuasion theories, we will examine the effects of argument quality, training, and direct-use experience on the formation of, and changes over time in, individual beliefs and adoption decisions regarding the IT system.  In addition, we will rely on social marketing theory, which adapts techniques from commercial marketing to inform strategies for planning and implementing programs designed to bring about behavior change.[46] [47] [48]

The phases as discussed represent the logical progression of the framework, but are not intended as a literal sequence to be carried out. The milestones and timelines for deployment, as described, consist of several parallel efforts that will unfold in incremental phases. A major characteristic of our approach is that we will not attempt to address all user needs simultaneously. Instead, we will introduce applications in a progressive manner: administrative, subject-specific, and protocol-specific (see Milestones).

 

 

Phase 1

    Task: Identify the values and subjective concerns key stakeholders have with existing systems and procedures.
    Participants: PIs, study coordinators, office managers and other potential stakeholder groups
    Methods: Focus groups; use cases

Phase 2

    Task: Identify the components of the needs expressed that can be managed by an information system.
    Participants: System developers
    Methods: Analysis of data from Phase 1

    Task: Based on this assessment, define initial system objectives, specifications, and functionality and map them to the perceived needs of end users.
    Participants: System developers
    Methods: Analysis of data from Phases 1 and 2

    Task: Formalize a shared understanding of who will benefit, by how much, and what outcome will be achieved by what time.
    Participants: System developers and end users
    Methods: Milestones and timeline for phased implementation

Phase 3

    Task: Identify for each stakeholder the behaviors that need to be performed to get the system used:
    ·         Reflecting on the context of current actor responsibilities and roles, clearly identify what behavior changes in routine clinical and office management practices are needed.
    ·         Reflecting on the context of current practice, clearly identify what environmental-level changes are needed.
    Participants: End users, project research team
    Methods: On-site visits; participatory design

Phase 4

    Task: Identify the factors that influence the required behavioral changes.
    Participants: Project team, end users
    Methods: Survey

Phase 5

    Task: Develop and implement approaches that are proactive and specifically targeted to influencing favorably the predisposing, enabling, and reinforcing factors identified in Phase 4.
    Participants: Project team, end users
    Methods: Macromedia Breeze software; training; persuasive, theoretically grounded communication via email and a newsletter

Table 2.  Tasks, Participants and Methods

As each application is progressively deployed (administrative, subject-specific, and protocol-specific), we will conduct the Phase 3 behavioral and Phase 4 determinant assessments to target the stakeholders, behaviors and behavioral determinants that are linked specifically with that application.  (Phases 1 and 2 will be conducted only once, at the start of the first year.) For example, we anticipate that PIs will not interact with the administrative application on a primary basis, and therefore their behaviors and behavioral determinants will not be primary targets. However, their influence does affect the behavior of the primary stakeholders as a reinforcing determinant, and this influence will therefore be incorporated into the use-promoting intervention accordingly. Because we will be assessing application-specific behaviors and behavioral determinants, we can take a highly tailored approach to developing and implementing strategies that favorably influence the behavioral factors associated with each application.

Figure 2 shows the longitudinal data collection schedule that will be carried out in each of the three years as a new application is deployed.  System training will be specific to the application being deployed.  Months 5 and 8 represent measures of IT effectiveness, which, as noted previously, include the quality and consistency of IT use.  Measurements will consist of usage log analysis supplemented with a questionnaire administered online, and a feedback discussion forum using the Macromedia Breeze software to provide more open-ended, qualitative data.  In month 11 of each of the three years, we will also examine relevant quality metrics.  These metrics, used to evaluate site performance, will quantify efficiency in starting a trial, recruitment rate, and data quality.  Section D, part 3 describes the clinical trials metrics and benchmarks.
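
Usage log analysis for the month 5, 8, and 11 measurement points can be reduced to simple per-user metrics such as the number of active days and transactions in the measurement window. The sketch below shows one such summary over an assumed log format; it is illustrative only, not the planned analysis code.

    from collections import defaultdict
    from datetime import date

    usage_log = [   # assumed log format: one entry per recorded transaction
        {'user': 'coordinator-1', 'when': date(2005, 5, 2), 'action': 'enter_enrollment'},
        {'user': 'coordinator-1', 'when': date(2005, 5, 9), 'action': 'enter_enrollment'},
        {'user': 'pi-3', 'when': date(2005, 5, 9), 'action': 'view_report'},
    ]

    def usage_summary(log, start, end):
        """Per-user consistency (distinct active days) and volume (transactions) in a window."""
        days, counts = defaultdict(set), defaultdict(int)
        for entry in log:
            if start <= entry['when'] <= end:
                days[entry['user']].add(entry['when'])
                counts[entry['user']] += 1
        return {user: {'active_days': len(days[user]), 'transactions': counts[user]}
                for user in counts}

    print(usage_summary(usage_log, date(2005, 5, 1), date(2005, 5, 31)))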

We will stagger educational and use-inducing strategies targeting each specific application throughout each year.  To determine the effectiveness of different educational and use-inducing strategies, we will vary our educational approaches to determine which strategies the targeted groups rate most useful and which have the greatest impact on promoting IT usage.  The data we obtain by systematically contrasting various educational interventions will contribute to Aim 2, which is to develop a behavioral model of information technology use in clinical research, based on empirical study of the information needs of users, barriers to technology use, and strategies for improving use.

Figure 2.  Longitudinal Data Collection Schedule

 

Months 1-2: Behavioral assessment (measurement)
Month 3: System training (activity)
Ongoing: System use (activity)
Month 5: User reactions; usage measurement (measurement)
Ongoing: System use (activity)
Month 8: User reactions; usage measurement (measurement)
Ongoing: System use (activity)
Month 11: Usage measurement (measurement)


3.      Promoting Best Practices

A key objective of this proposal is to improve the efficiency and quality of clinical research by training investigators and coordinators. The centerpiece of this initiative is a course in good clinical practices (GCP), which introduces investigators and coordinators to human subjects’ protections and good clinical practices and prepares them for a certifying examination given by Columbia University. We require that all investigators and coordinators who participate in the Clinical Trials Network pass the GCP examination. 

The Office of Clinical Trials has developed several different approaches for disseminating this material into community practices. James H. Weir, MD, began the course “Introduction to Good Clinical Practices” in 1996, which consisted of two days of lectures and discussion groups given at the Columbia University Health Sciences campus.  In 1998, a self-study workbook and two volumes of videotaped lectures were introduced into the course. At that time, the live lectures and panel discussions were shortened to one day. Another course developed and run by Dr. Weir consisted of eleven 2-hour seminars given at lunchtime, which provided detailed discussions of the practical aspects of conducting clinical trials.  These seminars were offered to about 30 clinical research coordinators (a capacity crowd) in 1999-2000 and 2001-2002.

Most investigators preferred the all-day lecture and panel discussions, although some prepared for the GCP examination solely by independent study, using the GCP workbook and videotaped lectures.  However, the campus course proved to have disadvantages for clinical trials personnel working in the Network (New Jersey-Rockland County New York, Brooklyn-Queens-Long Island, and Westchester County New York-Connecticut).  For busy physicians, the trip to and from the Columbia Health Sciences campus added about three hours to the duration of the course. Losing an entire day of practice was perceived as a hardship and a logistical barrier.  For coordinators attending the 2-hour luncheon seminars held on campus, the trip to and from their offices took longer than the seminar; they preferred half-day or all-day seminars.  In response, GCP lecture slides were made available online, but this proved inadequate in the absence of an audio track and high-quality images; slides that were satisfactory in the context of a live lecture were inadequate as a stand-alone Web-based teaching tool.

We propose to develop training modules that better match the needs of a network of community practice investigators and research coordinators.  Live courses will be given on a regional basis at locations closer to the research sites.  Because live courses can be given only a few times a year and not on demand, we will investigate various methods to provide training materials through the Internet. Initially, these resources can be used to deliver supplemental information or the entire training course to meet various demands for training between courses.

We are currently comparing a variety of tools to support online training, meetings and collaboration.  Macromedia Breeze[49] helps to personalize presentations with narration and to deliver them through a standard Web browser. The tool can be used to build a collection of PowerPoint presentations that includes surveys, tracking, analysis, course administration, and content management. Another interesting feature supports online meetings. Alternatively, we are considering Documentum eRoom[50], which provides a "digital workplace" that allows people to work together on content, projects, and processes across the enterprise. The tool also offers collaborative features to manage content and workflow.


C.     Technical Plan with Clearly Defined Milestones of Progress and Key Decision Points

The specific aims described above will be carried out as three parallel yet interlocking programs: establishing an information infrastructure for clinical research, developing a model of the behavior of the users of this system, and promoting best practices through training. The table in the Milestones section summarizes the activities to be carried out for each aim, in each of the three years of the project. Each of the aims unfolds in a different manner across the three years.

 

Aim 1: Information Infrastructure

 

The progression of activities is determined by the type of information being collected in the network: the first year focuses on the administrative aspects of protocols; the second on the characteristics of individual patients; and the third on specific visits and procedures carried out during the conduct of a trial. 

 

Year 1: The principal operational task for this year is to deploy an application into the clinical research network that enables sites to collect basic administrative data about their protocols and to track the number of patients enrolled in each study. To facilitate data sharing, all data elements that define protocols will be standardized and stored in our central data dictionary. The transactions that create protocols and enroll subjects will also be standardized. This will allow information about protocols and enrollment to be stored in a central pool for the network as a whole. The goal for the year is to begin to automatically monitor accruals in each trial across the network.
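
To make the intended mechanism concrete, the following minimal sketch (written in Python, with entirely hypothetical element names, transaction types, and values that are not our actual data dictionary or message formats) illustrates how standardized protocol-creation and enrollment transactions might be validated against a central dictionary, stored in a shared pool, and summarized to monitor accruals across the network.

# Illustrative sketch only: hypothetical element names and transaction types,
# not the actual Columbia data dictionary or message formats.
from collections import defaultdict

# Central data dictionary: required elements for each standardized transaction.
DATA_DICTIONARY = {
    "create_protocol": {"protocol_id", "title", "sponsor", "target_accrual"},
    "enroll_subject":  {"protocol_id", "site_id", "subject_id", "enroll_date"},
}

central_pool = []  # pooled transactions for the network as a whole


def submit(transaction_type, payload):
    """Validate a transaction against the dictionary and add it to the pool."""
    required = DATA_DICTIONARY[transaction_type]
    missing = required - payload.keys()
    if missing:
        raise ValueError(f"{transaction_type} missing elements: {sorted(missing)}")
    central_pool.append({"type": transaction_type, **payload})


def accrual_report():
    """Count enrollments per protocol across all sites (the Year 1 goal)."""
    accrued = defaultdict(int)
    targets = {}
    for t in central_pool:
        if t["type"] == "create_protocol":
            targets[t["protocol_id"]] = t["target_accrual"]
        elif t["type"] == "enroll_subject":
            accrued[t["protocol_id"]] += 1
    return {p: (accrued[p], targets[p]) for p in targets}


if __name__ == "__main__":
    submit("create_protocol",
           {"protocol_id": "P-001", "title": "Example trial",
            "sponsor": "NIH", "target_accrual": 200})
    submit("enroll_subject",
           {"protocol_id": "P-001", "site_id": "S-07",
            "subject_id": "X-42", "enroll_date": "2004-06-01"})
    print(accrual_report())  # {'P-001': (1, 200)}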

 

Year 2: The principal operational task for this year is to deploy an application into the clinical research network that enables sites to collect data about individual patients. To facilitate data sharing, all data elements that define patient characteristics will be standardized and stored in our central data dictionary. The transactions that register new patients and add important clinical characteristics will also be standardized. This will allow information about patients in the network to be stored in a central pool. The goal for the year is to begin to automatically screen patients for trials.
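
As a rough illustration of the screening goal, the sketch below (again with hypothetical patient elements and a simplified criteria format, not the eligibility representation we will ultimately adopt) shows how standardized patient characteristics pooled from the network might be matched against basic trial criteria.

# Illustrative sketch: a simplified eligibility check over standardized
# patient characteristics. Element names and criteria format are hypothetical.

def eligible(patient, criteria):
    """Return True if every criterion (allowed set, min, or max) is met."""
    for element, rule in criteria.items():
        value = patient.get(element)
        if value is None:
            return False
        if "allowed" in rule and value not in rule["allowed"]:
            return False
        if "min" in rule and value < rule["min"]:
            return False
        if "max" in rule and value > rule["max"]:
            return False
    return True


def screen(patients, trials):
    """The Year 2 goal: list candidate trials for each patient in the pool."""
    return {p["patient_id"]: [t["protocol_id"] for t in trials
                              if eligible(p, t["criteria"])]
            for p in patients}


if __name__ == "__main__":
    patients = [{"patient_id": "X-42", "age": 63, "diagnosis": "heart failure",
                 "ejection_fraction": 0.30}]
    trials = [{"protocol_id": "P-001",
               "criteria": {"age": {"min": 18},
                            "diagnosis": {"allowed": {"heart failure"}},
                            "ejection_fraction": {"max": 0.35}}}]
    print(screen(patients, trials))  # {'X-42': ['P-001']}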

 

Year 3: The principal operational task for this year is to deploy an application into the clinical research network that enables sites to collect data on the visits and procedures occurring in specific studies. To facilitate data sharing, all data elements that define visits and procedures will be standardized and stored in our central data dictionary. The transactions that schedule new visits and track the administration of procedures will also be standardized. This will allow information about visits and procedures being conducted in the network to be stored in a central pool. The goal for the year is to begin to automatically determine how much payment to make to each site, according to the specific activities.
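
The payment goal can be illustrated in the same spirit. In the sketch below the fee schedule, transaction fields, and procedure names are hypothetical placeholders, intended only to show how pooled visit and procedure transactions could drive site payment calculations.

# Illustrative sketch: computing site payments from standardized procedure
# transactions. Fee schedule and element names are hypothetical.
from collections import defaultdict

FEE_SCHEDULE = {"screening_visit": 150.00, "ecg": 40.00, "blood_draw": 25.00}


def site_payments(transactions):
    """Sum payments owed to each site for the procedures it has performed."""
    totals = defaultdict(float)
    for t in transactions:
        if t["type"] == "perform_procedure":
            totals[t["site_id"]] += FEE_SCHEDULE.get(t["procedure"], 0.0)
    return dict(totals)


if __name__ == "__main__":
    pool = [
        {"type": "perform_procedure", "site_id": "S-07",
         "protocol_id": "P-001", "procedure": "screening_visit"},
        {"type": "perform_procedure", "site_id": "S-07",
         "protocol_id": "P-001", "procedure": "ecg"},
    ]
    print(site_payments(pool))  # {'S-07': 190.0}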

 

 

 

Aim 2: Behavioral Modeling

 

The progression of activities is determined by the implementation framework described above. The first year focuses on what users need and how the three applications meet some of those needs.  Also in year 1, we examine what behaviors are actually required of users as they enter data on protocols and enrollment, and identify the factors that influence the required behavioral changes.  We develop, implement and evaluate approaches that are proactive and specifically targeted to influencing favorably the behavioral factors for the application to collect data on protocols and enrollment.  In year 2, we examine what behaviors are required of users as they collect data about individual patients, and identify the factors that influence the required behavioral changes.  We develop, implement and evaluate approaches that are proactive and specifically targeted to influencing favorably the behavioral factors for the application to collect data about individual patients.  In year 3, we repeat these same steps for the application to collect data on visits and procedures.

 

Year 1: Identify the values and subjective concerns key stakeholders have with existing systems and procedures. Identify the components of the needs expressed that can be managed by an information system.  Identify the behaviors that need to be performed to use the application to collect data on protocols and enrollment, and identify the factors that influence the required behavioral changes. Develop, implement and evaluate approaches that are proactive and specifically targeted to influencing favorably the behavioral factors.

 

Year 2: Identify the behaviors that need to be performed to use the application to collect data about individual patients, and identify the factors that influence the required behavioral changes.  Develop, implement and evaluate approaches that are proactive and specifically targeted to influencing favorably the behavioral factors.

 

Year 3: Identify the behaviors that need to be performed to use the application to collect data on visits and procedures, and identify the factors that influence the required behavioral changes.  Develop, implement and evaluate approaches that are proactive and specifically targeted to influencing favorably the behavioral factors. 

 

Aim 3: Promoting Best Practices

 

Year 1: Install collaboration software; make documents available to network; train users. Update and reconfigure GCP training materials, test in live courses, evaluate in focus groups. Set up the system to conduct the metrics/benchmark experiment.

 

Year 2: Use collaboration software to deploy online training materials; Use collaboration software to hold focus groups. Test and evaluate GCP training materials. Conduct the metrics/benchmark experiment.

 

Year 3: Use collaboration software to conduct live training. Adjust and publish GCP training materials. Analyze metrics data, and make recommendations for enhancements.

 

 

D.    Participating Networks and Areas of Expansion

The Clinical Trials Network conducts research in 5 therapeutic groups (cardiovascular, internal medicine/diabetes, gastroenterology, neurology, and oncology). Trials are supported by both government (NIH) and industry sponsors. There are currently 39 participating sites and 77 investigators, with 15% minority investigators and 30% minority participants. In the first four years, 1999–2002, 34 trials were conducted, 26 with industry sponsors and 8 with government sponsors. Key policies under which the network operates, and other relevant information, are detailed in the site agreement (see Appendix B) and in the history of the Columbia-Cornell-NY Presbyterian Clinical Trials Network (Appendix A).

1.      Integration with Cancer Center Trials

The Herbert Irving Comprehensive Cancer Center (HICCC) was established in 1972 as a formal organizational component of Columbia University and the New York Presbyterian Hospital. In the area of clinical research, a comprehensive cancer center is expected to initiate and conduct early phase, innovative clinical trials and to participate in the NCI’s cooperative group system by providing leadership and accruing patients to trials. The Cancer Center Information System (Appendix F) provides a variety of computer applications to support these activities. The most basic functions provide central management and oversight for coordinating, facilitating and reporting on the cancer clinical trials of the institutions that define the center, whatever the origin (local, industrial, cooperative group, or other). The Protocol Review and Monitoring System (PRMS) provides a range of management and quality control functions, including a central location for cancer protocols, a centralized database of protocol-specific data, an updated list of currently active protocols for use by center investigators, and status reports of protocols. Quality control functions include centralized education and training services for data managers and nurses, data auditing, and oversight of data and safety monitoring to comply with federal requirements. In April of 1995, the Cancer Center Protocol Office assumed responsibility for central registration of patients on cancer clinical trials. A system was developed to house demographic and study-specific information for each patient. This system assists with tracking of patient visits and study requirements, as well as with collection of long-term follow-up data.

 

As described above, the proposed information infrastructure seeks to generalize the information system used in the Cancer Center, and combine it with tools developed within the Clinical Trials Network. The goal is to produce a generic tool for clinical research that may be deployed effectively in both academic and community settings.

2.      Development of Device Trial Programs in the New York-Presbyterian Healthcare System

Recent developments in implantable diagnostic or therapeutic devices have increased substantially the contribution of devices to human health.  As the efficacy and safety of devices have been proven for several common chronic diseases, the rate of device development has accelerated markedly and the number and size of clinical trials have shown a corresponding growth.  For example, cardiac resynchronization therapy (biventricular pacing) has been shown to improve quality of life, physical activity, and survival in a large subset of patients with severe heart failure, i.e., those with cardiac desynchronization.  There are about 500,000 new cases of heart failure each year and the number is growing.  Also, insulin pump use has increased and may surge forward if ongoing trials show that early and intensive glycemic control with insulin prevents or reduces mortality and disabling morbidity in diabetic patients.  As with heart failure, the number of patients with diabetes is growing rapidly (currently estimated to be about 18 million in the US) as the average weight and age of the US population increase.  Also, new stenting devices are being developed at a rapid rate to treat occlusive atherosclerosis of the aorta and the coronary, carotid, and peripheral arteries.

There are many device companies, which are characteristically much smaller than pharmaceutical companies.  Because of fiscal limits, device companies tend to have product development and financing styles that are quite different from those of the drug industry.  Also, the US Food and Drug Administration Center for Devices and Radiological Health (CDRH) has regulations and traditions for the conduct of clinical trials that are quite different from those of the Center for Drug Evaluation and Research (CDER).  Most devices are expensive and are implanted in the hospital.  Device trial financing and contracting tend to be confusing to hospital finance groups and often are perceived as conflicting with hospital fiscal control policies.  Accordingly, contracting and budgeting issues can cause very substantial delays in starting device trials. 

We plan to develop an Office for Device Research in the central administration of the New York-Presbyterian Hospital with liaison to the Office of Clinical Trials in order to concentrate expertise in contracting, financing, and conducting device trials throughout the New York-Presbyterian Healthcare System and, thus, better support these trials.  We anticipate that our experience will be generalizable and will help to remove some of the obstacles to efficiency in the conduct of device trials.

3.      Clinical Trials Metrics and Benchmarks

We propose to establish a set of universal data metrics to evaluate site performance in clinical trials (Appendix C). We will participate in a prospective study designed by the Data Metrics Task Force of the Pharmaceutical Research and Manufacturers of America (PhRMA) – Association of American Medical Colleges (AAMC) Forum on Clinical Trials (Dr. Bigger was a member of the planning group).  The clinical trials metrics will quantify factors such as efficiency in starting a trial, recruitment rate, and data quality.  These metrics will permit clinical sites or networks to compare their performance with benchmarks to identify components of their clinical trials activity that need improvement.  Also, the performance measures can be repeated to determine whether adjustments do, in fact, improve performance and by how much.  Such metrics will be important for comparing and improving performance among sites or among networks.  We envision that clinical trials metrics will prove to be an important resource for developing, evaluating, and maintaining the national clinical trials network to be developed by the NIH Roadmap for re-engineering the clinical research enterprise.
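
As an illustration of how such metrics might be computed and compared, the sketch below uses two simplified measures (start-up time and recruitment rate) with entirely hypothetical numbers and benchmark values; the actual metric definitions and benchmarks will come from the Task Force study, not from this example.

# Illustrative sketch: two simplified site-performance metrics compared
# against hypothetical network benchmarks (not the PhRMA-AAMC definitions).
from datetime import date


def startup_days(protocol_received, first_subject_enrolled):
    """Trial start-up efficiency: days from protocol receipt to first enrollment."""
    return (first_subject_enrolled - protocol_received).days


def recruitment_rate(subjects_enrolled, months_open):
    """Recruitment rate: subjects enrolled per month the study has been open."""
    return subjects_enrolled / months_open


if __name__ == "__main__":
    site = {
        "startup_days": startup_days(date(2004, 1, 15), date(2004, 5, 1)),
        "recruitment_rate": recruitment_rate(subjects_enrolled=18, months_open=6),
    }
    benchmarks = {"startup_days": 90, "recruitment_rate": 4.0}  # hypothetical
    for metric, value in site.items():
        below_benchmark = (
            (metric == "startup_days" and value > benchmarks[metric]) or
            (metric == "recruitment_rate" and value < benchmarks[metric])
        )
        status = "needs improvement" if below_benchmark else "meets benchmark"
        print(f"{metric}: {value:.1f} ({status})")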

 

II.               Offeror's Qualifications, Qualifications of Network Participants and Areas of Expansion, and Management Plan

The team assembled for this proposal brings expertise from several key disciplines. Dr. Johnson will lead the effort, and brings both theoretical and practical experience from the field of biomedical informatics; Dr. Bigger has substantial experience with clinical research in both academic and community settings; Dr. Kukafka brings theoretical and practical experience from behavioral science, providing a bridge between the technological and clinical worlds. These three individuals will carry out activities to meet the three specific aims; these are, respectively: establishing an informatics infrastructure, promoting best practices, and developing a behavioral model of technology use in clinical research. The organizational chart below details the roles and responsibilities of the individuals in these three areas. Time commitments of investigators and key personnel can be found in Summary of Related and Proposed Activities (Appendix I).

 

Figure 3. Organizational Chart

Each of the three areas of the project will hold weekly meetings to manage the tasks described in the Milestones above, inviting leaders of the other areas as needed to coordinate their activities. Each of the groups will have extensive contact with clinical research personnel in the Network. The informatics group will interact with users in order to understand information needs, assist with technology training, and manage the daily operations of the computer applications once these are deployed in the field. The behavioral group will conduct focus groups, surveys, and training sessions, and send communications via newsletters, email and the collaboration software. The practice improvement group will conduct training sessions face to face and via the collaboration software.

Additional participants in the project bring talents in data warehousing and applications development. The scientific and technical expertise of key personnel, task responsibilities, and amount of effort are described in further detail below.

Stephen B. Johnson, Ph.D. (Principal Investigator, 25% effort) is an Associate Professor of Biomedical Informatics at Columbia University. He holds a doctorate in Computer Science from New York University. Dr. Johnson has spent the last 16 years developing patient care systems at the Columbia University Medical Center. He brings substantial experience in developing large-scale, distributed clinical information systems. Of primary relevance to this proposal is research on information infrastructure that allows diverse computer applications (typically created by different vendors) to communicate and share information. [51] [52]  A key element of this infrastructure is the use of messaging standards to connect clinical applications.[53] 

Dr. Johnson is an expert in clinical database design and data modeling. The centerpiece of the Columbia system is a unique database design that enables multiple clinical applications to share data in a highly efficient and flexible manner.[54] [55] [56]  Dr. Johnson is trained in linguistics, and has done extensive research in natural language processing and on the semantics of clinical language and medical terminology. This work has contributed to portions of the Columbia information infrastructure, facilitating the translation among the various coding schemes for clinical data used by different computer systems.[57] [58]

More recently, Dr. Johnson has conducted research exploring how to adapt the information infrastructure from patient care to the challenges of clinical research.[59]  This work led to participation in the Clinical Research Roundtable (CRR) at the Institute of Medicine. The purpose of the CRR is to convene interested individuals to discuss the challenges facing clinical research, and the approaches that might be followed to create a more supportive environment for the conduct of a broad agenda of high quality clinical research. The CRR explores the ethical underpinnings, workforce problems and infrastructure-related issues that span the full spectrum of clinical research. Discussions in that group have stressed the importance of using information technology in breaking the translational barriers in the clinical research enterprise.[60]

J. Thomas Bigger, M.D. (Co-Investigator, 25% effort) Professor of Medicine and of Pharmacology at Columbia University’s College of Physicians and Surgeons, is a cardiologist with extensive government and industry sponsored clinical trials experience.  He was instrumental in the founding of the Office of Clinical Trials in 1992 and, in 1998, along with Mr. Michael Leahey and Dr. David Bickers conceived of a network of academic and community medical practices to perform clinical trials.  The Clinical Trials Network (CTN) began as a partnership among Columbia University Health Sciences, Weill College of Medicine of Cornell University, and New York-Presbyterian Hospital.  The CTN proposed to conduct a pilot study consisting of five therapeutic groups (cardiovascular, internal medicine/diabetes, gastroenterology, neurology, and oncology) each with five participating community practices, and five years of activity.  Dr. Bigger was the academic leader for the cardiovascular therapeutic group when the CTN launched in mid 1999.  In 2000, Dr. Bigger became the Director of the entire CTN.  During the first four years, the CTN conducted 34 clinical trials, 8 sponsored by the NIH and 26 by industry.  Over the past five years, Dr. Bigger has formulated policy and led the medical operations group that coordinates administrative and medical support to the network.  For six years, Dr. Bigger lectured in the Introduction to Good Clinical Practices course and his staff arranged courses and certification for clinical research personnel at clinical sites in the community Network.

During Network meetings and visits to participating practices, Dr. Bigger has become aware of many information technology infrastructure needs to facilitate and support CTN operations, e.g., recruitment, retention, data management, and payment.  He has been particularly interested in tailoring recruitment strategies to specific protocols and recruitment settings.  He will work with Biomedical Informatics to further define IT needs, to specify products, and to introduce and evaluate IT developments to the Network.  Dr. Bigger and his staff will arrange focus groups comprised of community-based investigators to address IT software development plans, pilot projects, and evaluations.  Input from the focus groups will be a vital part of effective IT product development.

For several years, Dr. Bigger has been working with the Pharmaceutical Research and Manufacturers of America (PhRMA) – Association of American Medical Colleges (AAMC) Forum on Clinical Trials and served on its Data Metrics Task Force.  This task force, comprised equally of industry and academic representatives, met quarterly and conducted two annual national workshops for clinical trials offices to select a set of metrics that measure performance of clinical sites participating in clinical trials. The task force evaluated these metrics in a retrospective study of industry-sponsored trials.  The metrics were revised based on results of the retrospective pilot study.  The Metrics Task Force is now planning a prospective study that will include both government (NIH) and industry-sponsored trials. 

Dr. Bigger will lead network expansion activity, selecting sites, and therapeutic areas for expansion.  His staff will support new sites and therapeutic groups through the now familiar steps of site development to reach productive clinical trials activity.  He will serve as one of the liaisons from the CTN to other networks that participate in the national program developed under BAA-RM-04-23. Dr. Bigger will collaborate with Dr. John Ennever to develop the device trials network among the hospitals of the New York-Presbyterian Healthcare System.  He plans to develop a new therapeutic group to conduct clinical trials in asthma, an effort that we anticipate will take three years.  The CTN will add other therapeutic groups as academic or community thought leaders are identified.  The steps to develop a new therapeutic group are clearly defined and initial discussions permit an accurate timetable for developing one.

Rita Kukafka, Dr.P.H. (Co-Investigator, 25% effort) is an Assistant Professor in the Department of Biomedical Informatics at Columbia University, with a joint appointment in the Department of Sociomedical Sciences.  The focus of Dr. Kukafka’s dual appointment is to develop a program of research and training in Public Health Informatics and Health Communication.  She holds a doctorate from the School of Public Health at Columbia University and two Master’s degrees: one in health education, and the second in Medical Informatics from Columbia University, where she also completed a 3-year National Library of Medicine funded Postdoctoral Fellowship in Medical Informatics. Dr. Kukafka entered the field of medical informatics with more than fifteen years of experience focusing on the design and evaluation of behavior and education interventions for patients, providers, and populations. 

Prior to her academic appointment at Columbia, she led a study in which she and her team of co-investigators developed, implemented and evaluated an initiative to reduce cardiovascular disease in a population of 210,000.  As part of this trial, she implemented, in more than 50 community practices, behavioral strategies designed to motivate physicians to incorporate the National Cholesterol Guidelines into their patient encounters and office procedures.  Successful grant outcomes led to the establishment of the Preventive Medicine Institute, where she served as founding director for 2 years. 

Her current research is on representing patient perceptions and beliefs for purposes of creating tailored information, and on evaluating computer-mediated communications designed to influence changes in health behaviors and provider practices.  Dr. Kukafka co-led MI-HEART, an educational dissemination project funded by the National Library of Medicine aimed at improving response efficacy to symptoms of myocardial infarction.[61]  The project used patient-specific information from an electronic medical record to produce educational materials for patients at risk for myocardial infarction. Using a cognitive model, factors that influence decision-making were measured.[62] [63]  This led to representation of tailored education material using Enhanced Decision Tables, notably using the “disease process” as an anchor to tie together multiple atomic educational elements tailored for different populations.  The patient ultimately experienced the intervention as a continuous and well-structured flow of education material without any perception of the underlying atomic structure of the elements.[64]  The application of behavioral models to design IT use-inducing strategies for the proposed intervention will be similar to that used in MI-HEART. More recently, Dr. Kukafka is co-principal investigator and primary developer of HIV TIPS, a web-based decision support system for clinicians and patients that includes tailored education aimed at improving HIV medication prescribing and adherence to prescribed regimens.[65] [66] The intervention is currently being implemented in 42 agencies in three states.  An agency and provider IT needs assessment was developed by Dr. Kukafka and collaborators to assess institutional barriers and physicians’ readiness to use such IT.  Results of the needs assessment are used to tailor, for the staff of each agency, IT use-promoting strategies designed to motivate use of the software. 

Her IT implementation framework and experience in diffusing new technology into healthcare organizations have led to another ongoing project in collaboration with the New York City Health Department.  In this new project, Dr. Kukafka is assessing the readiness and technology infrastructure in 20 federally funded community practice sites to transmit timely, high-quality data centrally for purposes of syndromic surveillance in New York City.  Dr. Kukafka will hold responsibility for the behavioral model aspect of the project. She will provide overall leadership and direction for each phase of the IT implementation framework, conduct focus groups, develop the survey instruments, and analyze the data.  She will provide guidance and review of training and use-inducing strategies, and evaluate their effectiveness in promoting IT usage and mitigating barriers. 

John F. Ennever, M.D. (5% effort), Associate Clinical Professor of Pediatrics at the College of Physicians and Surgeons of Columbia University, is a Pediatric Gastroenterologist with experience in both basic and clinical research in bilirubin photochemistry as it applies to phototherapy for neonatal jaundice.  He demonstrated how the kinetics of bilirubin photoproduct elimination in human infants differ from that seen in the rat animal model, and that biliary excretion of an irreversibly formed structural isomer of bilirubin is the principal route of pigment elimination in neonates treated with phototherapy.  Dr. Ennever was involved in the design and initial clinical trials of a fiber-optic phototherapy device for the treatment of neonatal jaundice.

For the past four years, Dr. Ennever has served as the Associate Medical Director of the Clinical Trials Network (CTN).  He has worked closely with Dr. J. Thomas Bigger in the formulation of policy and procedures for this nascent, community-based network of clinical investigators.  He has developed an expertise in regulatory issues surrounding clinical trials, and has helped to assure regulatory compliance in the conduct of all CTN trials.  He serves as a member of the Columbia University Medical Center Institutional Review Board.

Dr. Ennever has held a number of hospital administrative positions at the New York-Presbyterian Hospital.  As the Hospital’s representative to the Executive Committee of the Clinical Trials Advisory Committee, he was instrumental in gaining support from the Hospital for the establishment of the Clinical Trials Network.  Prior to joining the faculty in the Department of Pediatrics, he served as the Medical Director for the New York-Presbyterian Healthcare System.  His knowledge and understanding of the community hospitals that are members of this healthcare system will be invaluable in the establishment of a new system-based device network.  Dr. Bigger, who implanted devices for almost 20 years, will assist Dr. Ennever with interactions that involve company personnel or implanting physicians.

David Wajngurt, M.D. (5% effort), Assistant Clinical Professor in the Department of Biomedical Informatics, is the director of the Clinical Data Warehouse at Columbia University Medical Center. The Warehouse is a collection of information about patients for use in clinical research, hospital administration, and patient care. The project is sponsored by the Clinical Trials Office, and seeks to provide a comprehensive collection of clinical information to facilitate clinical research, clinical trials, administration and patient care. Dr. Wajngurt’s experience will help with exploring ways to integrate clinical data from office practices into clinical data warehouses. A key issue within this task is management of metadata (data about the data). These efforts will support our goal of improving the yield of screening in clinical trial recruitment. Dr. Wajngurt will also explore policies and methods to effectively anonymize data in the Warehouse to encourage sharing of clinical data by office practices.

Janie Weiss (15% effort), Manager for Information Technology and Operations. Ms. Weiss currently manages the Columbia University Comprehensive Cancer Center Oncoinformatics Core, a group of 5 employees providing an exceptional level of service to over 200 Cancer Center members and their staff. Ms. Weiss has overall responsibility for capacity planning, systems installation and setup, monitoring, troubleshooting, data storage and backup, and disaster recovery, and provides information technology consulting for new hardware and software purchases for Cancer Center members. Ms. Weiss has over 20 years of experience in systems management, database design and programming. In the proposed project, she will direct the development efforts of the programmers, and coordinate the testing and integration phases of the software implementation. She will also assist with training of users, and manage the daily operations of the clinical research systems that we deploy.

Susan Kistler (50% effort), Senior Systems Analyst/Programmer, is a database expert with an extensive background in information management, with emphasis on the storage and retrieval of patient data. She has been responsible for the development, implementation and management of the Comprehensive Cancer Center Information System (CCCIS) since 1987 (Appendix F). Ms. Kistler has a close working relationship with Biomedical Informatics, and has collaborated on multiple projects to explore new tools and methods to improve the quantity of clinical data available to researchers and the techniques for retrieving it. She has also participated in information standards activities as a member of the Informatics Task Force of the Association of American Cancer Institutes (AACI).

III.           Facilities and Resources

Columbia University provides a unique environment for computing, with remarkable advantages to offer research. The medical school is one of the few in the country to have a true Department of Biomedical Informatics, which was formed in 1987. The department is one of the largest in the world, with over 40 faculty members, and conducts a wide spectrum of informatics research spanning molecular, clinical and population research.  Biomedical Informatics works closely with the Cancer Center and with the Genome Center, which was established in 1995 and recently formed the Center for Computational Biology and Bioinformatics, funded under the Biomedical Information Science and Technology Initiative (BISTI). This effort brings together 10 bioinformatics faculty members from departments ranging from Biochemistry to Electrical Engineering. Columbia is also notable for having a School of Public Health, in which there is an emerging effort in informatics research.

Investigators on the campus have access to a variety of services through these departments, centers and schools. CUBHIS (Columbia University Biomedical and Health Information Services) provides networking, access to servers, desktop support, and Web site development. Clinicians access patient care information through the Web-based Clinical Information System (WebCIS). These data are also available for research through a clinical data warehouse.

A.     Oncoinformatics

Oncoinformatics provides information services to the Herbert Irving Comprehensive Cancer Center. The group serves more than 120 research groups, representing over 500 computer users, including faculty members, research scientists, and students engaged in basic and clinical research. The Oncoinformatics staff brings many years of experience to this proposal. Areas of expertise include data modeling, analysis and mining, acquisition of software tools, and user training.  In addition, Oncoinformatics can contribute hardware resources to the proposed project, for development purposes or to hold large datasets. The group supports multiple servers, which are maintained in a secure, raised-floor machine room.

The primary contribution of this group to the project is the Cancer Center information system (described in greater detail in Appendix F), which supports multiple functions related to clinical research. While the system was developed to support cancer research, its design is quite general, and the system can be adapted with minor modifications to a multitude of medical specialties. Administrative functions keep track of Cancer Center members (investigators) and grant funding information, and track the usage of research facilities to manage billing information. Clinical functions include storing detailed information about patients (subjects), tracking outpatient visits, and maintaining information about multi-institutional clinical trials.

This system has several advantages to offer the current project. It is deployed through the Web, giving users access to information from many different kinds of computer platforms. The database design is generic, and therefore can easily be adapted to other disease areas. The design uses relational tables managed by the Sybase database management system. The team also has experience using software development tools for building cross-platform client/server applications (in this case, Panther from Prolifics).

B.     Web-based Clinical Information System

The Web-based Clinical Information System (WebCIS) is the electronic patient record at the Columbia University Medical Center. This system provides clinicians with information on over two million patients. Clinicians can look up demographics, registration data, insurance information, and examine the current census of admitted patients. The system allows physicians to maintain lists of their patients. They can also obtain information from the laboratory, radiology, pathology, and many other ancillary departments, as well as read discharge summaries. A popular new function is the ability to write admission notes and progress notes.

WebCIS contributes to the current proposal in several ways. The infrastructure of WebCIS is extremely general, and allows many different computer applications to interoperate (whether created within our institution or by different vendors). Data are transported between computer systems using the Health Level 7 standard, and data are translated into standardized formats using our Medical Entities Dictionary. WebCIS is thus the inspiration for much of what we propose for the information infrastructure to support clinical research. The system can also serve as a source of real-time data on patients who are seen at the Columbia campus. Finally, the system provides links to the Cancer Center information system, which enables physicians to learn about trials that may be suitable for their patients. This facility will be augmented, broadening our ability to recruit patients for clinical trials within the network.
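
To give a flavor of this infrastructure, the following sketch parses a simplified HL7 v2-style observation (OBX) segment and maps its local code through a small lookup table standing in for the Medical Entities Dictionary. The segment content, codes, and mappings are hypothetical and greatly simplified; the real dictionary and interfaces are far richer than this.

# Illustrative sketch: parsing a simplified HL7 v2-style OBX segment and
# mapping its local code through a toy dictionary. Codes and mappings are
# hypothetical; this is not the actual Medical Entities Dictionary content.

# Hypothetical mapping from a local laboratory code to a dictionary concept.
MED_LOOKUP = {"K+": {"concept_id": 1234, "name": "Serum Potassium"}}


def parse_obx(segment):
    """Split an OBX segment on '|' and pull out code, value, and units."""
    fields = segment.split("|")
    local_code = fields[3].split("^")[0]  # observation identifier (OBX-3)
    return {"local_code": local_code,
            "value": float(fields[5]),    # observation value (OBX-5)
            "units": fields[6]}           # units (OBX-6)


def translate(observation):
    """Attach the standardized dictionary concept to the local observation."""
    concept = MED_LOOKUP[observation["local_code"]]
    return {**observation, **concept}


if __name__ == "__main__":
    obx = "OBX|1|NM|K+^Potassium||4.1|mmol/L"
    print(translate(parse_obx(obx)))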

C.     Clinical Data Warehouse

The Clinical Data Warehouse at Columbia University Medical Center is a collection of information about patients for use in clinical research, hospital administration, and patient care. The project is sponsored by the Clinical Trials Office, and the departments of Surgery, Epidemiology, Biostatistics, and other departments within the institution. Its mission is to provide a comprehensive collection of clinical information to facilitate clinical research, clinical trials, administration and patient care. A data warehouse is a database containing consolidated data from distinct operational databases, spanning long time periods, and augmented with summary information. In this way, financial and clinical data from many disparate sources are brought together into a single location. (WebCIS is one of several sources of data for this resource.) The key advantage of a data warehouse is that data can be aggregated across millions of patient records quickly, and without impacting patient care information systems. These data can be used to support all levels of organizational decision-making and research.
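
As a simple illustration of the kind of aggregate question a warehouse answers quickly, the sketch below runs a summary query over a toy table. The table, column names, and values are hypothetical, and SQLite stands in only to keep the example self-contained; the production Warehouse uses its own schema and database platform.

# Illustrative sketch: an aggregate query of the sort a clinical data
# warehouse makes practical. Table, columns, and values are hypothetical;
# SQLite is used only to keep the example self-contained and runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE lab_results
                (patient_id TEXT, test_code TEXT, value REAL, year INTEGER)""")
conn.executemany(
    "INSERT INTO lab_results VALUES (?, ?, ?, ?)",
    [("X-42", "HBA1C", 8.2, 2003), ("X-42", "HBA1C", 7.1, 2004),
     ("Y-17", "HBA1C", 9.4, 2004)])

# Aggregate across all patients without touching the operational systems.
rows = conn.execute("""SELECT year, COUNT(DISTINCT patient_id), AVG(value)
                       FROM lab_results WHERE test_code = 'HBA1C'
                       GROUP BY year""").fetchall()
for year, n_patients, mean_value in rows:
    print(f"{year}: {n_patients} patients, mean HbA1c {mean_value:.1f}")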

This resource will contribute in several important ways to the proposed project. Because the database (and the server on which it is housed) is already supported by the Clinical Trials Office, it is a logical place in which to store the data to be shared by the Clinical Trials Network. The Warehouse staff has substantial experience with extracting and providing clinical data for research projects, and with coordinating these activities with IRB approvals to ensure patient confidentiality. The group also has experience with techniques and policies for anonymizing data, which can be extended to the network to encourage sharing of clinical data by office practices.  Finally, the staff has already begun creating a tool to automatically screen subjects for clinical trials. This tool can be extended to work with data from the community practices, greatly expanding its power.

D.    Columbia University Biomedical and Health Information Services

The mission of Columbia University Biomedical and Health Information Services (CUBHIS) is to provide high-quality, reliable, and effective information resources and technologies to support (a) optimal education, research, patient care, and administration, (b) an effective interface with other University components and NewYork Presbyterian Hospital, and (c) the institution's public presence on the Internet. CUBHIS also coordinates with affiliated clinical organizations and investigative laboratories in order to assure strong information-management support both for clinical care and for collaborative scholarly and educational activities of our faculty, staff, students, and affiliates.

The Desktop Support Group of CUBHIS and the Information Technology Service Group (ITS) provide information technology consulting and support services for the faculty and staff of Columbia University Medical Center. They are committed to providing quick and comprehensive computer support for all our users. They also actively explore the latest innovations in networking and information technologies to assist the various communities of the Health Sciences campus. They provide users with e-mail, connections to a variety of local administrative and clinical systems, medical research databases, and the World Wide Web.

E.     Core Resources

Core Resources is a joint department between the New York Presbyterian Hospital and Columbia University, and provides electronic data communications for daily operations. They are responsible for the campus data network as well as an extensive wide area network, and are committed to providing ongoing, pro-active service to ensure the reliability and efficiency of data communications.

The network consists primarily of fast Ethernet local area networks (LANs) in the various buildings around the campus. These connect via gigabit Ethernet to a central core that interconnects other building LANs, legacy networks, wide area networks and the Internet. The group maintains extensive connections to nearby as well as remote facilities and institutions, which include Frame Relay, point-to-point T1, T3, fractional services, ATM, microwave and WiFi. Other services include design, implementation and support of Domain Name Server, Internet Protocol management, and Dynamic Host Configuration Protocol, as well as remote connectivity via Virtual Private Network. The department also runs a Mail eXchange gateway for many of the University and Hospital departments.

F.      Minority Participation

The Columbia University Medical Center, located at the northern end of Manhattan, serves as the main health care facility for a culturally diverse community.  Both patient care and research activities at the Columbia University Medical Center place a major emphasis on the inclusion of minorities and women in our study population, and on addressing aspects of the medical problems that are particularly relevant in minorities and women. 

The Inwood/Washington Heights neighborhood includes 400,000 individuals who largely obtain their health care from CUMC. The demographic composition of the Hispanic patient population served by Columbia-Presbyterian has several major implications for the conduct of clinical research. Washington Heights is home to the largest community of Dominicans outside the Dominican Republic itself.  Columbia-Presbyterian or its affiliates, Harlem Hospital and St. Luke’s/Roosevelt, provide outpatient and inpatient care to the majority of these patients.  A large portion of the Dominican population of Washington Heights consists of recent immigrants (many of whom are undocumented).  Information from U.S. census studies of 1970 and 1980 showed that of the eighteen numerically most important immigrant groups to enter the U.S., the highest percentage of non-English speaking immigrants was among Dominicans. 

The Office of Clinical Trials has also created a Hispanic Recruitment and Research Center. Another developing source of local Dominican immigrant patients is a liaison between the Columbia University Cancer Center and members of the Dominican Medical Association, a professional society composed of physicians from the Dominican Republic practicing in the United States, predominantly in Upper Manhattan and the Bronx.

IV.            Dissemination and Use of Results

The proposed project will generate several resources that will contribute to the goal of re-engineering clinical research as part of the NIH Roadmap. These contributions fall in the three areas of our specific aims: establishing an information infrastructure, developing a model of behavior, and promoting best practices in clinical research.  The information infrastructure will develop user requirements for collecting information in three major areas: basic information about studies and enrollment, demographic and clinical data on individual patients, and the visits and procedures of specific trials. These data will give rise to three applications of enormous utility in clinical research, respectively: monitoring recruitment levels in trials, screening patients for studies, and determining payments to community sites participating in clinical research. The standardization of data elements and clinical “transactions” for these three tasks can contribute directly to informatics efforts in the National Electronic Clinical Trials and Research Network (NECTAR). We will work with other institutions participating in this NIH Roadmap initiative to devise an overall technology transfer policy. Through this mechanism, we can package the Roadmap software and documentation for use in other institutions or community settings.  One likely possibility is to make it available for distribution by NIH through appropriate licensing agreements.

The model of user behavior in adopting technology to conduct clinical research will be another important contribution. This model will be the first to clearly describe user needs in community settings, and to uncover major barriers to technology use that arise in daily practice. We also hope to identify some strategies that can affect these behaviors and improve usage. These findings will largely be communicated through publications and presentations at conferences.

The training materials developed with NIH Roadmap support to promote Good Clinical Practice (GCP) will also be made available for use by other clinical trials networks. Our experience with distributing these materials through new electronic means (e.g., collaboration software) may facilitate this task, making it easier for other sites to communicate with their research personnel.

We have also proposed to study metrics and benchmarks for clinical trials performance in our Clinical Trials Network, in conjunction with the Data Metrics Task Force of the Pharmaceutical Research and Manufacturers of America (PhRMA) and the Association of American Medical Colleges (AAMC).  Other institutions participating in the NIH Roadmap effort may also wish to participate in the proposed study of the metrics, which will include both government (NIH) and industry-sponsored trials.


 



[1] McGlynn, E.A., et al., The Quality of Health Care Delivered To Adults In The United States. N Engl J Med, 2003. 348(26):2635-45.

[2] Sung, N.S., et al., Central challenges facing the national clinical research enterprise. JAMA, 2003. 289(10):1278-87.

[3] Kian LA, Stewart MW Bagby C, Robertson J. Justifying the cost of a computer-based patient record. Health Financ Manage 1995;49:58-60.

[4] Rubin, D.L., J. Gennari, and M.A. Musen, Knowledge representation and tool support for critiquing clinical trial protocols. Proc AMIA Symp, 2000:724-8.

[5] Marks, L. and E. Power, Using technology to address recruitment issues in the clinical trial process. Trends Biotechnol, 2002. 20(3):105-9.

[6] Pogash, R.M., et al., Data management procedures in the Asthma Clinical Research Network. Control Clin Trials, 2001. 22(6 Suppl):168S-80S.

[7] King, D.W. and R. Lashley, A quantifiable alternative to double data entry. Control Clin Trials, 2000. 21(2):94-102.

[8] Claxton, K. and K.M. Thompson, A dynamic programming approach to the efficient design of clinical trials. J Health Econ, 2001. 20(5):797-822.

[9] Shakespeare, T.P., et al., Improving interpretation of clinical studies by use of confidence levels, clinical significance curves, and risk-benefit contours. Lancet, 2001. 357(9265):1349-53.

[10] Padkin, A., K. Rowan, and N. Black, Using high quality clinical databases to complement the results of randomised controlled trials: the case of recombinant human activated protein C. BMJ, 2001. 323(7318):923-6.

[11] Gottlieb, S., Study explores Internet as a tool for care of diabetic patients. BMJ, 2000. 320(7239):892.

[12] Payne PR, Johnson SB, Tilson HH, Dowdy D. Breaking The Translational Barriers: The Value Of Integrating Biomedical Informatics and Translational Research. JAMA 2004 (submitted).

[13] Davis SA, Bostrom RP. Training end users: an experimental investigation into the roles of the computer interface and training methods. MIS Q 1993;17:61-85.

[14] Gery G. Granting three wishes through performance centered design. Commun ACM 1997;40(7):54-9.

[15] Kukafka R, Johnson SB, Linfante A, Allegrante JP. Grounding a new information technology framework in behavioral science: a systematic analysis of the literature on IT use. J Biomed Informatics 2003;36(3):218-227.

[16] Sung NS, Crowley WF, Genel M, Salber P, Sandy L, Sherwood LM, Johnson SB, et al. Central Challenges Facing the National Clinical Research Enterprise. JAMA 2003;289(10):1278-87.

[17] Payne PR, Johnson SB, Tilson HH, Dowdy D. Breaking The Translational Barriers: The Value Of Integrating Biomedical Informatics and Translational Research. JAMA 2004 (submitted).

[18] Schuster DP, McGill J. The right infrastructure can help you attract clinical trials to an AMC. Applied Clinical Trials 2001;10(11):44-52.

[19] Cimino JJ, Hripcsak G, Johnson SB, Clayton PD.  Designing an Introspective, Multi-Purpose Controlled Medical Vocabulary.  Proceedings of the Thirteenth Annual Symposium on Computer Applications in Medical Care.  Washington, D.C.; 1989:513-518.

[20] Kukafka R, Johnson SB, Linfante A, Allegrante JP. Grounding a new information technology implementation framework in behavioral science: a systematic analysis of the literature on IT use. J Biomed Inform.  2003 Jun;36(3):218-27

[21] Kitzinger, J. Qualitative research: Introducing focus groups. BMJ 1995;311(7000):299-302.

[22] Greenbaum, Thomas L. 1998. The Handbook for Focus Group Research, second ed. Thousand Oaks, Calif.: Sage Publications.

[23] Cockburn A. Writing Effective Use Cases. 2000, Boston, MA: Addison-Wesley Publishing Co., Inc.

[24] Bittner, K., I. Spence, and I. Jacobson, Use Case Modeling. 2002, Boston, MA: Addison-Wesley Professional.

[25] Fowler, M., UML distilled: Applying the standard object modeling language. 1997, Reading, MA: Addison-Wesley

[26] QSR International. https://www.qsr.com.au/

[27] Guimaraes T. Understanding Implementation Failure. Journal of systems Management. 1981; 32(3):12-27.

[28] Romano CA. Predictors of nurse adoption of a computerized information system as an innovation. In MEDINFO 95 Proceedings, RA Greenes, Pp 1335-9, International Medical Informat.

[29] Lorenzi NM, Riley RT. Managing change: an overview. Journal of the American Medical Informatics Association. 2000;7(2):116-124.

[30] Smith MJ, Carayon P. New technology, automation, and work organization: Stress problems and improved technology implementation strategies.  International Journal of Human Factors in Manufacturing. 1995; 5(1):99-116

[31] Delbecq A, Mills P. Managerial practices that enhance innovation. Organizational Dynamics 1985;14(1):24-34.

[32] Hage J, Aiken M. Social Change in Complex Organizations. New York: Random House. 1970.

[33] Lee WB, Steinberg E. Making implementation a success or failure. Journal of Systems Management 1980;3(4):19-25.

[34] Moore GC, Benbasat I. Development of an instrument to measure the perceptions of adopting an information technology innovation, Information Systems Research (2:3), 1991, 192-222.

[35] Compeau, DR, Higgins CA. Computer self-efficacy: development of a measure and initial test. MIS Quarterly. 19:2, 1995B, 189-211.

[36] SPSS, Inc SPSS for Windows. Release 12.

[37] Starren J, Chan S, Tahil F, White T. When Seconds are Counted: Tools for Mobile, High-Resolution Time-Motion Studies. Proceedings of the American Medical Informatics Association Annual Fall Symposium; 2000:833-7.

[38] Fishbein, M., and Ajzen, I. Belief, Attitude, Intention and Behavior: An Introduction to Theory and Research, Reading, MA:Addison-Wesley, 1975.

[39] Ajzen, I. “The Theory of Planned Behavior,” Organizational Behavior and Human Decision Processes (50:2), 1991, pp. 179-211.

[40] Bandura, A. Social Foundations of Thought and Action: A Social Cognitive Theory, Englewood Cliffs, NJ: Prentice Hall, 1986.

[41] Davis, F. D. “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology,” MIS Quarterly(13:3), 1989, pp. 319-339.

[42] Davis, F. D., Bagozzi, R. P., and Warshaw, P. R. “User Acceptance of Computer Technology: A Comparison of Two Theoretical Models,” Management Science (35:8), 1989, pp. 982-1003.

[43] Wright, J. S., and Warner, D. S. Advertising, New York: McGraw-Hill, 1962.

[44] Brembeck, W. L., and Howell, W. S. Persuasion, A Means of Social Influence, Englewood Cliffs, NJ: Prentice-Hall, 1976.

[45] Fishbein, M., Ajzen, I., and McArdle, J. “Changing the Behavior of Alcoholics: Effects of Persuasive Communication,” in Understanding Attitudes and Predicting Social Behavior, I. Ajzen and M. Fishbein (eds.), Englewood Cliffs, NJ: Prentice-Hall, 1980.

[46] Kotler, P. and Zaltman, G. (1971) Social marketing: An approach to planned social change. Journal of Marketing. 35, 3-12.

[47] Pearcey, P., & Draper, P. (1996). Using the diffusion of innovation model to influence practice: a case study. Journal of Advanced Nursing, 23(4), 714‑721.

[48] Lefebvre, R.C., and Rochlin, L. Social marketing. In: Glanz, K.; Lewis, F.M.; and Rimer, B.K.; eds. Health Behavior and Health Education: Theory, Research, and Practice. 2nd ed. San Francisco, CA: Jossey-Bass Publishers, 1997, 384-401.

[49] Macromedia Breeze. https://www.macromedia.com/software/breeze/

[50] Documentum eRoom. https://www.documentum.com/solutions/collaboration/index.htm

[51] Johnson SB, Forman BH, Sengupta S, Sideli R, Cimino JJ, Clayton PD. The Electronic Medical Record: Architecture and Standards.  Toward An Electronic Patient Record, 1995 March 15-18; Orlando (FL). Boston (MA): Medical Records Institute, 1995(2):14-18.

[52] Johnson SB, Forman B, Cimino JJ, Hripcsak G, Sengupta S, Sideli R, Clayton PD.  A Technological Perspective on the Computer-Based Patient Record.  In: Steen EB (ed). Proceedings of the First Annual Nicolas E. Davies CPR Recognition Symposium, 1995 April 4-6; Washington (DC).  Washington (DC): Computer-based Patient Record Institute, 1995: 35-51.

[53] Sideli R, Johnson SB, Weschler M, Clark A, Chen J, Simpson R, Chen C.   Adopting HL7 as a standard for the exchange of clinical text reports. In: Miller R, editor. Proceedings of the 14th Annual Symposium on Computer Applications in Medical Care; 1990 Nov 4-7; Washington, D.C.; 1990: 226-229.

[54] Johnson SB, Friedman C, Cimino JJ, Clark T, Hripcsak G, Clayton PD.  Conceptual Data Model for a Central Patient Database.  Proceedings of the Fifteenth Symposium on Computer Applications in Medical Care.  Washington, D.C.; 1991: 381-385.

[55] Johnson SB.  Generic Data Modeling for Clinical Repositories. JAMIA, 1996:3(5).

[56] Johnson SB, Hripcsak G, Chen J, Clayton PD. Accessing the Columbia Clinical Repository. Proceedings of the Eighteenth Annual Symposium on Computer Applications in Medical Care, 1994 November 5-9; Washington (DC). New York: McGraw Hill, 1994.

[57] Cimino JJ, Hripcsak G, Johnson SB, Clayton PD.  Designing an Introspective, Multi-Purpose Controlled Medical Vocabulary.  Proceedings of the Thirteenth Annual Symposium on Computer Applications in Medical Care.  Washington, D.C.; 1989:513-518.

[58] Forman BH, Cimino JJ, Johnson SB, Sengupta S, Sideli R, Clayton P.  Applying a Controlled Medical Terminology to a Distributed, Production Clinical Information System.  Proceedings of the Nineteenth Annual Symposium on Computer Applications in Medical Care, 1995 October 28 - November 1; New Orleans (LA). Philadelphia: Hanley and Belfus, 1995:421-5.

[59] Johnson SB, Clayton PD.  Databases in Clinical Research.  In: Meyer RE (ed). For the Health of the Public: Ensuring the Future of Clinical Research, vol. II.  Association of American Medical Colleges, 2000.

[60] Sung NS, Crowley WF, Genel M, Salber P, Sandy L, Sherwood LM, Johnson SB, et al. Central Challenges Facing the National Clinical Research Enterprise. JAMA 2003;289(10):1278-87.

[61] Kukafka R, et al. Modeling patient response to acute myocardial infarction: implications for a tailored technology-based program to reduce patient delay. Proc AMIA Symp. 1999:570-574.

[62] Kukafka R, Lussier YA, Patel VL, Cimino JJ. Developing tailored theory-based educational content for Web applications: illustrations from the MI-HEART project. Medinfo. 2001;10(Pt 2):1474-8. 

[63] Kukafka R, Lussier YA, Eng P, Patel VL, Cimino JJ. Web-based tailoring and its effect on self-efficacy: results from the MI-HEART randomized controlled trial. Proc AMIA Symp.  2002;:410-4. 

[64] Lussier YA, Kukafka R, Patel VL, Cimino JJ. Formal combinations of guidelines: a requirement for self-administered personalized health education. Proc AMIA Symp.  2000;:522-6.

[65] Merrill J, Kukafka R, Bakken S, Ferat R, Agopian E, Messeri P. Tailored Health Communication: Crafting the Patient Message for HIV TIPS. Proc AMIA Symp.  2003;:932.

[66] Kukafka R, et al. The HIV TIPS Project: Delivering treatment advice and tailored patient education to providers in underserved communities. MedInfo 2004 (in press).