Abstract
Health information technology (IT) capabilities in some health care sectors, such as nursing homes, are not well understood because measures of IT uptake have not been fully developed, tested, validated, or measured consistently. This paper reports the development and testing of a new instrument designed to measure nursing home IT maturity and stage of maturity. Methods incorporated a four-round Delphi panel composed of 31 nursing home experts from across the nation who reported some of the highest levels of IT sophistication in a separate national survey. Experts recommended 183 content items across 27 content areas specifying the measure of IT maturity. Additionally, experts ranked each of the 183 content items using a seven-stage (Stage 0–6) IT maturity model. The largest share of content items (40% (n=74)) was associated with IT maturity Stage 4, corresponding to facilities with external connectivity capability. Over 11% of the content items were at the highest maturity stages (Stages 5 and 6). Content areas with content items at the highest stages of maturity reflect nursing homes that make technology available to residents or their representatives and use it extensively in resident care.
INTRODUCTION
The Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009 authorized up to $40 billion to stimulate health care providers' "meaningful use" of electronic health record (EHR) systems[1]. HITECH established new regulations to create programs that would develop health information exchange, strategic HIT research projects, standards, and EHR certification criteria[2]. Since the development of these programs, "meaningful use" of health information technology (HIT) has been trended routinely in some sectors of health care, such as hospitals and ambulatory care, which were the primary targets of the legislation[3]. However, the IT capabilities of other health care sectors, such as nursing homes (NHs), are less well understood because measures of HIT uptake have not been fully developed, rigorously tested, or validated in these settings. Until measures of IT capabilities are fully developed for all settings, including NHs, it will be difficult to examine the true impacts of enhanced IT capabilities on quality and efficiency in these settings. The purpose of this paper is to report on the development and testing of an instrument designed to measure NH IT maturity and stage of IT maturity.
The concept of IT maturity grew out of Nolan's stage hypothesis theory, which drew upon multiple disciplines including engineering, psychology, sociology, and organizational behavior. The formulation of this theory depended on two key IT variables: "1) identification of elements and 2) the conception of their growth through time" (p. 400)[4]. In preliminary development work, the primary author (GLA) developed a precursor measure of IT maturity, called IT sophistication. This measure was used in a three-year annual survey conducted in 815 United States (U.S.) NHs[5]. IT sophistication was defined along three IT dimensions (IT capabilities, extent of IT use, and degree of internal/external IT integration) within three health care domains (resident care; clinical support (laboratory, pharmacy, radiology); and administrative activities). Using this model, IT sophistication was tabulated into nine (3×3) dimensional units, plus a total IT sophistication score. IT sophistication scores proved reliable for assessing changing trends in IT dimensions over time and for assessing associated trends in quality and safety measures[6]. However, certain aspects of the instrument (administrative activities) were highly technical and difficult to navigate for people completing the instrument. Another limitation was the lack of an overall staging measure similar to those used in acute care settings, such as the Health Information Management Systems Society (HIMSS) 8-stage adoption model used for assessing IT maturity and IT maturity stage[7]. This research was conducted to update and revise the existing NH IT sophistication instrument and associated IT content areas to reflect current states of NH IT maturity nationally. Additionally, congruent IT maturity staging measures are needed to help draw comparisons between NH systems and other health care settings where IT maturity is measured routinely.
METHODS
In prior work, investigators developed an IT maturity staging model using a two-phase process comprising an extensive literature review (Phase 1) and a three-round Delphi method (Phase 2)[8]. Phase 1 resulted in an initial 5-stage model of IT maturity, which developed into a 7-stage model after the three Delphi rounds. The IT maturity staging model was a new measurement tool developed to evaluate stages of IT maturity ranging from Stage 0 (nonexistent IT solutions or EMR) to Stage 6 (use of data by residents and/or resident representatives to generate clinical data and drive self-management). The new 7-stage IT maturity model was designed to rank content items within the new IT maturity instrument being developed. This report provides details of the development of the IT maturity instrument in association with the recommended IT maturity model staging criteria.
Delphi Sample
The research team recruited 31 U.S. experts in NH administration and health IT systems to participate in this Delphi panel. Inclusion criteria for administrators were: 1) 120 hours of experience, 2) a Bachelor's degree, and 3) formal training as a NH administrator. Administrators were recruited from NHs participating in prior research whose facilities scored above the 75th percentile in IT sophistication (198 facilities met this criterion). An internal team of experts at the university, with expertise in NH and IT systems, recommended NHs based on ownership, bed size, and location.
Instruments
Two instruments were used in this research. The first was the completed 7-stage IT maturity staging model formulated in prior research, discussed previously. The second was Version 1 of an IT maturity instrument, adapted from the IT sophistication instrument[9] used in prior research. The internal team recommended content for Version 1 of the IT maturity instrument within the same nine dimensions/domains of IT sophistication used in prior studies. Within these nine dimensions/domains, 27 IT maturity content areas were identified, and within each content area a number of content items were recommended that describe the dimensions/domains present. The total numbers of content areas and content items, scoring format, response alternatives, and scoring parameters per domain/dimension are illustrated in Table 1.
Table 1:
IT Maturity Healthcare Domains/Dimensions (N=9) | Content Areas (N=27) | Content Items (N=183) | Scoring Format | Response Alternatives* | Scoring Min: 0 Max: 620 |
---|---|---|---|---|---|
Domain: Resident Care | |||||
IT Maturity Dimensions | |||||
IT Capabilities | 5 | 48 | Binary | (0,1) | 0–48 |
IT Extent of Use | 4 | 30 | Likert (0–7) | *NA=0, Extensively used=7 | 0–210 |
IT Degree of Integration | 3 | 15 | Likert (0–6) | Not at all=0, Very Much=6 | 0–90 |
Domain: Clinical Support (Lab, Pharmacy, Radiology) | |||||
IT Maturity Dimensions | |||||
IT Capabilities | 3 | 27 | Binary | (0,1) | 0–27 |
IT Extent of Use | 3 | 15 | Likert (0–7) | NA=0, Extensively used=7 | 0–105 |
IT Degree of Integration | 3 | 6 | Likert (0–6) | Not at all=0, Very Much=6 | 0–36 |
Domain: Administrative Activities | |||||
IT Maturity Dimensions | |||||
IT Capabilities | 2 | 13 | Binary | (0,1) | 0–13 |
IT Extent of Use | 2 | 20 | Binary; Likert | (0–1); (NA=0, Extensively used=7) | 0–80 |
IT Degree of Integration | 2 | 9 | Binary; Itemized; Likert | (0,1); (0, 1–5, 6–10, >10); Not at all=0, Very Much=6 | 0–11 |
Key:
NA=Not Available
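The scoring parameters in Table 1 can be sketched in code. This is an illustrative sketch only (the dictionary keys and function names are hypothetical, not part of the instrument): each dimension score is a sum of item responses, and the per-dimension maxima from Table 1 combine to the instrument's stated maximum of 620.

```python
# Illustrative scoring sketch based on Table 1.
# Names (score_dimension, MAX_SCORES) are hypothetical, for exposition only.

def score_dimension(responses, max_per_item):
    """Sum item responses for one dimension; each response ranges 0..max_per_item."""
    assert all(0 <= r <= max_per_item for r in responses)
    return sum(responses)

# Maximum possible score per domain/dimension, per Table 1:
MAX_SCORES = {
    ("Resident Care", "IT Capabilities"): 48,        # 48 binary items
    ("Resident Care", "IT Extent of Use"): 210,      # 30 items x Likert 0-7
    ("Resident Care", "IT Integration"): 90,         # 15 items x Likert 0-6
    ("Clinical Support", "IT Capabilities"): 27,     # 27 binary items
    ("Clinical Support", "IT Extent of Use"): 105,   # 15 items x Likert 0-7
    ("Clinical Support", "IT Integration"): 36,      # 6 items x Likert 0-6
    ("Administrative", "IT Capabilities"): 13,       # 13 binary items
    ("Administrative", "IT Extent of Use"): 80,      # 20 mixed binary/Likert items
    ("Administrative", "IT Integration"): 11,        # 9 mixed-format items
}

total_max = sum(MAX_SCORES.values())  # 620, matching the Table 1 maximum
```

For example, a facility answering "yes" to 3 of the 48 resident-care capability items would score `score_dimension([1, 1, 1] + [0] * 45, 1)`, i.e. 3 of a possible 48 for that dimension.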
Delphi Procedures
We conducted four iterative Delphi rounds with an expert panel from December 2016 to January 2017. Because panel members were located in various U.S. states, we incorporated all instructions, instruments, and rating scales into Qualtrics, an online survey platform, and sent individual links to each panel member during each round. Specific procedures for each Delphi round are illustrated in Table 2. Summary statistics (quartiles) were obtained per content item at the end of each round. Following the third round, the internal team staged each content item to be included in the fourth round, in which Delphi members agreed or disagreed with the proposed stage. The criterion for staging an item was 80% agreement among the Delphi panel members. If the 80% threshold was not reached by Delphi members, internal team members were required to meet a more stringent 87.5% (7/8 internal members) agreement on a recommended stage. Following the fourth round, the internal team discussed the 91 content items that did not meet either agreement threshold until consensus was reached.
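The two-tier consensus rule above (80% Delphi agreement, falling back to 7/8 internal team agreement) can be expressed as a short decision function. This is a sketch of the rule as described, not the study's actual analysis code, and the function name is hypothetical.

```python
# Sketch of the study's two-tier staging consensus rule (hypothetical names).

def stage_consensus(delphi_votes, internal_votes, proposed_stage):
    """Return True if the proposed stage is accepted:
    >= 80% of Delphi panel votes match it, or, failing that,
    >= 87.5% (7/8) of internal team votes match it."""
    delphi_agree = sum(v == proposed_stage for v in delphi_votes) / len(delphi_votes)
    if delphi_agree >= 0.80:
        return True
    internal_agree = sum(v == proposed_stage for v in internal_votes) / len(internal_votes)
    return internal_agree >= 7 / 8
```

With a 31-member panel, 25 matching votes (about 81%) clears the first threshold outright; 20 matching votes (about 65%) does not, and the item then needs at least 7 of 8 internal team members to agree.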
Table 2:
Round 1: Expert review of content areas and content items incorporating the recommended IT maturity staging model. | Round 2: Expert review of revised content areas and content items obtained from Round 1 Delphi experts. | Round 3: Expert review of revised content areas and content items obtained from Round 2 Delphi experts. | Round 4: Expert review of revised content areas and content items obtained from Round 3 Delphi experts. |
---|---|---|---|
Key: IT-Information Technology; EMR-Electronic Medical Record
RESULTS
Delphi panel members were recruited from five regions across the U.S.: Midwest (10), Northeast (7), South (5), South/Southwest (5), and Upper Midwest (4). Rural-Urban Commuting Area codes identified 60% as being from metropolitan areas, 23% from micropolitan areas, 10% from small towns, and nearly 7% from rural locations. Participating Delphi panel members were split nearly evenly between for-profit and not-for-profit facilities. Just over 45% of administrators were from medium-sized facilities with 60–120 beds. One hundred percent of Delphi panel members completed round 1, 87% round 2, and 90% rounds 3 and 4. Panel members had been long term care administrators for an average of 21.6 years (range 4–46 years) and had been in their current administrative positions from 2 weeks to 37 years.
Table 3 illustrates the number of content items by content area that were retained, added, or deleted from the original IT sophistication instrument at the end of the four rounds. The original IT sophistication instrument had 207 content items (retained + deleted). After four Delphi rounds the instrument was reduced to 183 content items (retained + new) among 27 content areas.
Table 3:
CA | CA Description | Retain | New | Delete |
---|---|---|---|---|
1 | Resident management processes that are computerized | 2 | 4 | 5 |
2 | Documents in resident care that are computerized | 6 | 3 | 2 |
3 | Clinical Processes or documents that are computerized | 13 | 4 | 2 |
4 | Physical/Occupational Therapy processes that are computerized | 8 | 3 | 2 |
5 | Technology that is available for residents or their representatives | 4 | 1 | 1 |
6 | Processes that are computerized in the laboratory systems | 6 | 3 | 3 |
7 | Processes that are computerized in radiology systems | 3 | 1 | 2 |
8 | Processes that are computerized in pharmacy systems | 13 | 1 | 2 |
9 | Processes for managing IT issues | 3 | 2 | 2 |
10 | Connectivity technologies used in the nursing home | 8 | 2 | 4 |
11 | Internet based applications used in the nursing home | 7 | 1 | 2 |
12 | IT activities which are currently outsourced to external providers | 8 | 0 | 4 |
13 | Extent of use of technologies in resident care | 12 | 3 | 3 |
14 | Extent of use of technology in nursing care | 6 | 0 | 1 |
15 | Extent of use of technology in Physical/Occupational Therapy | 5 | 0 | 1 |
16 | Extent of use of resident or their representatives technology | 4 | 0 | 1 |
17 | Extent of use of laboratory technology | 4 | 0 | 1 |
18 | Extent of use of radiology technology | 7 | 0 | 1 |
19 | Extent of use of technology for pharmacy management | 4 | 0 | 1 |
20 | Extent of use of office automation applications in the nursing home | 9 | 1 | 1 |
21 | Resident care systems are integrated (electronic/automatic transfer of information) with other nursing home systems | 8 | 1 | 3 |
22 | Nursing information systems are integrated (electronic/automatic transfer of information) with other information systems | 4 | 0 | 2 |
23 | Physical/occupational therapy systems are integrated (electronic/automatic transfer of information) with other information systems | 1 | 1 | 1 |
24 | Laboratory systems are integrated with other information systems | 2 | 0 | 1 |
25 | Radiology systems are integrated with other information systems | 2 | 0 | 1 |
26 | Pharmacy systems are integrated with other information systems | 2 | 0 | 1 |
27 | Total number of IT personnel in the nursing home (or corporate staff) excluding long term care consultants or subcontractors | 0 | 1 | 6 |
Totals | 151 | 32 | 56 |
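The item-count bookkeeping across Table 3 can be checked arithmetically: the original instrument's 207 items comprise those retained plus those deleted, and the revised instrument's 183 items comprise those retained plus those newly added. A minimal check (variable names are illustrative):

```python
# Bookkeeping check against the Table 3 column totals.
retained, new, deleted = 151, 32, 56

original_items = retained + deleted   # items in the original IT sophistication instrument
final_items = retained + new          # items in the revised IT maturity instrument

print(original_items, final_items)    # 207 and 183, as reported in the text
```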
Table 4 illustrates the frequency of content items per content area by IT maturity stage[8] after the four Delphi rounds. Fourteen of the content areas were heterogeneous, meaning content items within a content area had more than one level of IT maturity. The widest ranges of IT maturity occurred in two content areas: content area 10 (Stages 1–4), representing 10 content items associated with connectivity technologies used in the NH, and content area 13 (Stages 3–6), representing 15 content items associated with extent of use of technologies in resident care.
Table 4:
IT Maturity Stages* | |||||||
---|---|---|---|---|---|---|---|
Content Area | Stage 0 | Stage 1 | Stage 2 | Stage 3 | Stage 4 | Stage 5 | Stage 6 |
1. Resident management processes that are computerized | 2 | 4 | |||||
2. Documents in resident care that are computerized | 4 | 2 | 3 | ||||
3. Clinical Processes or documents that are computerized | 13 | 3 | 1 | ||||
4. Physical/Occupational Therapy processes that are computerized | 8 | 3 | |||||
5. Technology that is available for residents or their representatives | 5 | ||||||
6. Processes that are computerized in the laboratory systems | 1 | 6 | 2 | ||||
7. Processes that are computerized in radiology systems | 1 | 3 | |||||
8. Processes that are computerized in pharmacy systems | 11 | 3 | |||||
9. Processes for managing IT issues | 5 | ||||||
10. Connectivity technologies used in the nursing home | 1 | 1 | 2 | 6 | |||
11. Internet based applications used in the nursing home | 2 | 4 | 2 | ||||
12. IT activities which are currently outsourced to external providers | 8 | ||||||
13. Extent of use of technologies in resident care | 7 | 6 | 1 | 1 | |||
14. Extent of use of technology in nursing care | 6 | ||||||
15. Extent of use of technology in Physical/Occupational Therapy | 5 | ||||||
16. Extent of use of resident or their representatives technology | 4 | ||||||
17. Extent of use of laboratory technology | 4 | ||||||
18. Extent of use of radiology technology | 7 | ||||||
19. Extent of use of technology for pharmacy management | 4 | ||||||
20. Extent of use of office automation applications in the nursing home | 7 | 2 | 1 | ||||
21. Resident care systems are integrated (electronic/automatic transfer of information) with other nursing home systems | 6 | 3 | |||||
22. Nursing information systems are integrated (electronic/automatic transfer of information) with other information systems | 2 | 2 | |||||
23. Physical/occupational therapy systems are integrated (electronic/automatic transfer of information) with other information systems | 2 | ||||||
24. Laboratory systems are integrated with other information systems | 2 | ||||||
25. Radiology systems are integrated with other information systems | 2 | ||||||
26. Pharmacy systems are integrated with other information systems | 2 | ||||||
27. Total number of IT personnel in the nursing home (or corporate staff) excluding long term care consultants or subcontractors | 1 | 3 | |||||
Totals# | 9 | 19 | 63 | 74 | 11 | 10 |
Key:
Stage Definitions
Stage 0 = Nonexistent IT solutions or Electronic Medical Record (EMR)
Stage 1 = Incomplete or disparate fragmented IT solutions
Stage 2 = Established IT leadership that governs and coordinates structures, procedures, processes, and policies
Stage 3 = Automated internal connectivity and reporting
Stage 4 = Automated external connectivity and reporting
Stage 5 = Clinical risk intervention and predictive analytics
Stage 6 = Use of data by resident and/or resident representative to generate clinical data and drive self-management
Content items may have more than one stage within a content area; therefore, total numbers of content items are not consistent across the table
Lower IT maturity stages resulted from a few content items in content area 10 related to connectivity technologies that promote greater fragmentation, including use of fax machines, and from firewalls and perimeter security (defined as computer systems designed to block unauthorized access while permitting outward communication). Additionally, office automation applications in content area 20 (word processing, spreadsheets, databases (e.g., Microsoft Access), desktop publishing, and project management software) were recommended to be at a lower stage of IT maturity (Stage 1), whereas customer relations platforms (software to manage customer relationships) were recommended to be at a higher level of IT maturity (Stage 5). Panel members also recommended that higher levels of IT staffing should reflect lower IT maturity (Stage 2).
The largest share of content items (40% (n=74)) is associated with IT maturity Stage 4, which corresponds to facilities that have external connectivity. Within Stage 4, 15% (11/74) of the content items are associated with content area 8, covering computerized processes in pharmacy systems. Slightly fewer content items (34% (n=63)) are associated with IT maturity Stage 3, corresponding to facilities with only internal connectivity. The majority of these are within content area 3, which measures clinical processes or documents that are computerized, such as care planning/care area assessments, nursing flow sheets, and incident reporting mechanisms.
Over 11% of the content items are at the highest IT maturity stages (Stages 5 and 6). Content areas with content items at the highest stages of IT maturity reflect NHs that have mature technologies, for example, technology available to residents or their representatives and used extensively in resident care (content areas 5, 13, and 16). Some of the highest-staged items (Stage 5) in content area 8 identify pharmacy systems that check for duplicate orders and drug interactions and provide look-alike/sound-alike medication alerts. Specific technologies that lead to higher levels of IT maturity include use of electronic health records by residents or their representatives, use of health information exchange, and availability of personal health records. For example, an expert system used by residents to enter their personal medical history by answering a set of questions was staged at Stage 6 (a content item under content area 13). Another example is a clinical decision support system (e.g., lab testing required for appropriate imaging obtainable from the resident record), recommended to be Stage 5, also a content item under content area 13.
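The percentages reported above can be reproduced from the Table 4 totals row. The sketch below assumes the six totals correspond to Stages 1–6 (no items were staged at Stage 0); because some items carry more than one stage, the counts sum to slightly more than the 183 instrument items.

```python
# Stage distribution from the Table 4 totals row (assumed mapping: Stages 1-6).
stage_counts = {1: 9, 2: 19, 3: 63, 4: 74, 5: 11, 6: 10}
n_items = 183  # items may be counted at more than one stage, so totals exceed this

pct_stage4 = stage_counts[4] / n_items * 100                    # ~40.4%, the largest share
pct_stage3 = stage_counts[3] / n_items * 100                    # ~34.4%
pct_top = (stage_counts[5] + stage_counts[6]) / n_items * 100   # ~11.5%, "over 11%"
```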
DISCUSSION
The purpose of this research was to improve upon an existing instrument measuring IT sophistication in NHs. We did this by developing an improved instrument measuring NH IT maturity in association with a previously developed IT maturity staging model. An important guideline for any instrument developer is to add specificity and clarity to the underlying items within an instrument by paying close attention to the underlying content domains, setting, and population of interest[10]. In this context, we worked with a national sample of experienced NH administrators who reported some of the highest rates of IT sophistication among a national sample of 815 NHs[5]. These experts, over four iterative rounds, identified a pool of 183 content items in 27 content areas reflecting six stages of NH IT maturity. Reviewers were asked to evaluate the clarity and conciseness of content items. Furthermore, panel members were asked to critically evaluate items within content areas and to point out ways of measuring the phenomena that were not introduced in a content area.
Through the Delphi process we have strategically identified content areas and associated content items that will enable the team to measure levels of NH IT maturity and IT maturity stage. There is variability across all stages of IT maturity among the content areas, which will allow users of this instrument to narrowly define the specific IT maturity stage of any particular NH completing the instrument. This type of assessment will be of interest to a range of people. For example, health system leaders in acute care (i.e., care coordinators, leaders of hospital post-acute care committees) may want to know whether NHs are capable of sharing data about patients being transitioned, a capability included in Stage 4 and higher. NH board members who want to maintain a competitive edge in the marketplace may want to know how their strategic investments in health IT measure up against other NHs in the same market; such comparisons could be reflected in any stage. An IT maturity assessment should be a required component of all NH surveys and inspections, which occur at least annually. The assessment would inform surveyors about a NH's IT maturity stage, enabling them to identify its specific IT capabilities and providing a mechanism to strategically plan the survey they are conducting. Finally, NH residents and their representatives would be interested in using this IT maturity assessment as part of the selection process prior to entering a NH. Residents and their representatives may want to prioritize their selection by IT maturity stage, for example by choosing only homes that have the capability to connect externally (Stage 4 or higher).
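The selection use case above amounts to filtering candidate facilities by their staged maturity score. A minimal illustration (the facility data and names are entirely hypothetical):

```python
# Hypothetical example: a resident representative shortlisting NHs
# that can connect externally, i.e. IT maturity Stage 4 or higher.
homes = [
    {"name": "Maple Grove", "stage": 3},   # internal connectivity only
    {"name": "Oak Ridge", "stage": 4},     # external connectivity
    {"name": "Cedar Court", "stage": 6},   # resident-facing data use
]

shortlist = [h["name"] for h in homes if h["stage"] >= 4]
print(shortlist)  # ['Oak Ridge', 'Cedar Court']
```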
Currently, assessments of IT maturity and IT maturity stage are not conducted on a routine basis either locally or nationally. This information gap exists because, before now, there were no instruments linking NH IT maturity and IT maturity stage, like there is in acute care settings. Drawing on the expertise of NH leaders we have created an instrument to measure IT maturity and IT maturity stage that can be used to fill information gaps about NH information systems.
LIMITATIONS
The research team incorporated reliable methods based on prior evidence to address possible limitations. For example, to increase generalizability and lessen selection bias, we recruited administrators from facilities with high levels of reported IT sophistication from various sectors of the country. Selected facilities also had variable characteristics, ranging from small rural facilities to larger urban facilities, with bed sizes from small to large and differing for-profit/not-for-profit status. As part of our evaluation mechanics, we attempted to standardize the working definition of our construct (IT maturity) by controlling the context in which it was evaluated. For instance, we asked evaluators to consider the ideal NH facility when rating content items and staging. We believe this encouraged experts to give more insightful comments about the ambiguity or relevance of certain content items[10]. The internal team was careful to examine all critiques within the context of the IT instrument as a whole. For example, internal members were wary of deleting too many related items within the constructs of the instrument simply because they were perceived as redundant. In the final decisions, the internal team kept in mind that some content items provided useful redundancy, so they were retained; redundancy is an important contributor to the internal consistency and reliability of an instrument[10].
CONCLUSIONS
Developing valid instruments that help us better understand the health care delivery system as a whole benefits everyone, including patients, caregivers, administrators, regulators, and policy experts. In the case of NH IT maturity, little is known about current advances at a granular level, such as the IT maturity stage a particular NH occupies. The development of this IT maturity instrument and associated IT maturity staging model, using an expert Delphi panel of NH administrators from across the United States, is a first attempt at filling this information gap.
Acknowledgment:
The authors would like to thank Keely Wise, Project Coordinator, who helped keep us organized, moving forward through this project, and on target to accomplish our goals. Further, this study would not have been possible without the contributions of many nursing home leaders across the country who were consistent partners in our research.
Funding Statement: This project was supported by grant number R01HS022497 from the Agency for Healthcare Research and Quality. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research and Quality.
Footnotes
Disclosure of potential conflicts of interest: Dr. Alexander is Founder and Owner of TechNHOlytics, LLC, a company that provides feedback to nursing homes about information technology.
Research involving human participants: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent: All methods were approved by the university's Institutional Review Board (IRB) under IRB #2009109 HS.
Contributor Information
Gregory L Alexander, S415 Sinclair School of Nursing, University of Missouri Columbia, Columbia MO 65211-6000.
Chelsea Deroche, Office of Medical Research, University of Missouri Columbia, Columbia MO 65211-6000.
Kimberly Powell, Sinclair School of Nursing, University of Missouri Columbia, Columbia MO 65211-6000.
Abu Saleh Mohammad Mosa, School of Medicine, University of Missouri Columbia, Columbia MO 65211-6000.
Lori Popejoy, Sinclair School of Nursing, University of Missouri Columbia, Columbia MO 65211-6000.
Richelle Koopman, Family and Community Medicine, University of Missouri, Columbia MO 65211.
References
- 1. Ederhof M, Ginsburg PB. "Meaningful Use" of cost-measurement systems - incentives for health care providers. New England Journal of Medicine 2019;381(1):4–6.
- 2. Blumenthal D. Launching HITECH. New England Journal of Medicine 2010;362:382–85.
- 3. Yuan N, Dudley RA, Boscardin WJ, Lin GA. Electronic health records systems and hospital clinical performance: A study of nationwide hospital data. Journal of the American Medical Informatics Association 2019;26(10):999–1009. doi:10.1093/jamia/ocz092.
- 4. Nolan RL. Managing the computer resource: A stage hypothesis. Communications of the ACM 1973;16(7):399–405.
- 5. Alexander GL, Madsen R, Deroche CB, Alexander RL, Miller E. Ternary trends in nursing home information technology and quality measures in the United States. Journal of Applied Gerontology 2019. doi:10.1177/0733464819862928.
- 6. Alexander GL, Madsen D. A report of information technology and health deficiencies in U.S. nursing homes. Journal for Patient Safety 2017.
- 7. HIMSS. Health Information Management Systems Society Analytics: Electronic Medical Record Adoption Model. 2017. http://www.himssanalytics.org/ (accessed 09/16/2019).
- 8. Alexander GL, Powell K, Deroche CB, et al. Building consensus toward a National Nursing Home Information Technology Maturity Model. Journal of the American Medical Informatics Association 2019 (in press).
- 9. Alexander GL, Wakefield DS. IT sophistication in nursing homes. Journal of the American Medical Directors Association 2009;10(6):398–407.
- 10. DeVellis RF. Scale Development: Theory and Applications. 4th ed. Los Angeles: Sage, 2017.