ABSTRACT

Title of Dissertation: AN AUTOETHNOGRAPHIC ACCOUNT OF INNOVATION AT THE US DEPARTMENT OF VETERANS AFFAIRS

Andrew E. Casertano, Doctor of Philosophy, 2020

Dissertation directed by: Professor Richard Marciano, Ph.D., College of Information Studies

The history of the U.S. Department of Veterans Affairs (VA) health information technology (HIT) has been characterized by both enormous successes and catastrophic failures. While the VA was once hailed as the way to the future of twenty-first-century health care, many programs have been mismanaged, delayed, or flawed, resulting in the waste of hundreds of millions of taxpayer dollars. Since 2015 the U.S. Government Accountability Office (GAO) has designated HIT at the VA as susceptible to waste, fraud, and mismanagement. The timely central research question I ask in this study is: can healthcare IT at the VA be healed? To address this question, I investigate a HIT case study at the VA Center of Innovation (VACI), originally designed to be the flagship initiative of the open government transformation at the VA. The Open Source Electronic Health Record Alliance (OSEHRA) was designed to promote an open innovation ecosystem through a public-private-academic partnership. Based on my fifteen years of experience at the VA, I use an autoethnographic methodology to make a significant value-added contribution to understanding and modeling the VA's approach to innovation. I use several theoretical information system framework models, including People, Process, and Technology (PPT); Technology, Organization, and Environment (TOE); and the Technology Acceptance Model (TAM), and propose a new adaptive theory to understand the inability of VA HIT to innovate. From the perspective of people and culture, I study retaliation against whistleblowers, organizational behavioral integrity, and lack of transparency in communications. I examine VA processes, including the different software development methodologies used and the development and operations (DevOps) process of an open-source application developed at VACI: the Radiology Protocol Tool Recorder (RAPTOR), a Veterans Health Information Systems and Technology Architecture (VistA) radiology workflow module. I find that the VA has chosen to migrate away from in-house application software and buy commercial software. The impact of these People, Process, and Technology findings is representative of larger systemic failings, and they are appropriate examples to illustrate systemic issues associated with IT innovation at the VA. This autoethnographic account builds on first-hand project experience and literature-based insights.

AN AUTOETHNOGRAPHIC ACCOUNT OF INNOVATION AT THE US DEPARTMENT OF VETERANS AFFAIRS

by

Andrew Emil Casertano

Dissertation submitted to the Faculty of the Graduate School of the University of Maryland, College Park, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, 2020

Advisory Committee:
Professor Richard Marciano, Chair
Denny Gulick, PhD
Margaret Chmiel, PhD
Kari Kraus, PhD
Ping Wang, PhD

© Copyright by Andrew E. Casertano 2020

Preface

Figure 1 Map of the Dissertation

Figure 1 shows how this dissertation is organized. Each of the chapters that follow will use a map to outline and organize my research. In the introduction, I offer my original contribution to the information science and software engineering field. I also explain why this subject matter is critical and timely. Then I address the research questions and their hypotheses.
Having done that, I describe the autoethnographic methodology in more detail. The literature review which follows makes the case for the research. I look at my problem and place it among the most important current theories, findings, and concepts. I identify where my original research contribution fills the gaps that I have identified. My original findings are presented, and I then reflect upon them in light of the applicable theory. I conclude my research with a unique interpretation and establish several innovative conclusions. A detailed, categorized bibliography is included as a valuable reference.

Foreword

Abraham Lincoln: "To care for him who shall have borne the battle and for his widow, and his orphan."

Martin Luther King Jr.: "Our lives begin to end the day we become silent about things that matter."

William Faulkner: "The past is never dead. It's not even past."

Dedication

This work is dedicated to my family: my Dad, Lawrence, a WWII veteran who passed away recently, and my Mom, Louise. Their sacrifices are appreciated and honored. To my four sons, John, Drew, Nick, and Matthew: I hope that my example in completing my degree inspires you to follow your dreams.

Acknowledgments

I am grateful for all the support, encouragement, and friendship of the faculty, friends, family, and colleagues throughout this Ph.D. journey.

Table of Contents

CHAPTER 1: INTRODUCTION
CHAPTER 2: LITERATURE REVIEW
CHAPTER 3: METHODOLOGY
CHAPTER 4: FINDINGS
CHAPTER 5: CONCLUSIONS
BIBLIOGRAPHY

Table 1 Abbreviations

BI - Behavioral Integrity
BRD - Business Requirements Document
CAS - Computational Archival Science
CASCI - Center for Advanced Study of Communities and Information
CHIDS - Center for Health Information & Decision Systems
CIO - Chief Information Officer
CMS - Content Management System
COTS - Commercial-off-the-shelf
CPRS - Computerized Patient Record System
CT - Computerized Tomography
CTO - Chief Technology Officer
DevOps - Development Operations
DevSecOps - Development Security Operations
DOI - Diffusion of Innovation
DOD - Department of Defense
eHMP - enterprise Health Management Platform
EHR - Electronic Health Record
FOIA - US Freedom of Information Act
GAO - Government Accountability Office
GWOT - Global War on Terrorism
HCD - Human-Centered Design
HCI - Human-Computer Interaction
HIS - Health Information System
IaaS - Infrastructure as a Service
IP - Intellectual Property
IR - Interventional Radiology
IRB - Institutional Review Board
IT - Information Technology
JCAHO - Joint Commission on Accreditation of Healthcare Organizations
KM - Knowledge Management
LAS - Laboratory for Analytic Sciences
MAS - Medical Appointment Scheduling
MHS - Military Health Systems
MIM - Master of Information Management
MR - Magnetic Resonance
MUMPS - Massachusetts General Hospital Utility Multi-Programming System
NSA - National Security Agency
NSR - New Service Request
OAWP - Office of Accountability and Whistleblowers Protection
OI&T - VA Office of Information & Technology
OIG - VA Office of Inspector General
OSEHRA - Open Source Electronic Health Record Alliance
OSS - Open Source Software
PaaS - Platform as a Service
PACS - Picture Archive and Communications System
PCS - VA Patient Care Services
PDF - Portable Document Format
PMAS - Project Management Accountability System
PPT - People, Process and Technology Improvement Model
RAPTOR - Radiology Protocol Tool and Recorder
RBJ - Radiology Business Journal
RadLex - Radiology Lexicon
RFP - Request for Proposal
RIS - Radiology Information System
RSNA - Radiological Society of North America
RT - Representation Theory
SaaS - Software as a Service
SIIM - Society for Imaging Informatics in Medicine
SME - Subject Matter Expert
SWOT - Strengths, Weaknesses, Opportunities, Threats
TAM - Technology Acceptance Model
TOE - Technology, Organization, and Environment
TOE TAG - Technology, Organization, and Environment plus Technology Acceptance Models Failure Groups
TOE TAM - Technology, Organization, and Environment plus Technology Acceptance Models
UAT - User Acceptance Testing
UMD - University of Maryland
VA - US Department of Veterans Affairs
VACI - Veterans Affairs Center of Innovation
VAI2 - Veterans Affairs Innovation Initiative
VAOIG - VA Office of Inspector General
VBA - Veterans Benefits Administration
VHA - Veterans Health Administration
VHPI - Veterans Healthcare Policy Institute
VIC - VA Innovation Center
VistA - Veterans Health Information Systems and Technology Architecture
VIP - VistA Intake Program
VSE - VistA Scheduling Enhancement

Chapter 1: Introduction

The figure below illustrates the organization of the introduction chapter. I initially introduce the who, what, where, and when of the problem and the relationships among them. I then introduce the three research questions and discuss the orienting concepts that are used to direct and inform the study. I introduce myself and the autoethnographic methodology; in doing so, I describe the original contribution I make to the conceptual and theoretical field.

Figure 1 Map of Introduction

Problem Statement

The US Government Accountability Office (GAO) has designated the US Department of Veterans Affairs (VA) Veterans Health Administration (VHA) as high-risk in terms of its susceptibility to waste, fraud, and mismanagement; information technology (IT) challenges are a major contributing factor (GAO, 2015). A White House investigation found a "corrosive culture" (Politico, 2014) and recommended the VA be restructured and reformed. The history of VA IT has been characterized by both enormous successes and catastrophic failures. Some programs were mismanaged, delayed, or internally flawed, to the extent that they could not be saved, resulting in the waste of hundreds of millions of dollars (Independent Budget, 2016). Over the past ten years, the number of Veterans has decreased rapidly, while per-patient spending has skyrocketed. VA policy expert Dr. Colin D. Moore asks (2015), "Why does the VA continue to expand despite a decades-old reputation for scandal and mismanagement?" To address these IT issues, the White House has made the overhaul of VA medical records a centerpiece of its broader government reform efforts (Politico, 2018). Access to care and patient safety depend upon a modern health IT platform, especially an electronic health record (EHR) system, which directly impacts the quality and delivery of care to Veterans. White House opponents suggest that this approach is wrong (Newsweek, 2019) and likened it to "rip[ping] the battery out, saying the whole car doesn't work, so they can sell the parts". In contrast to significant department-level IT failures, the VHA has, for more than 30 years, successfully developed, tested, and implemented a world-class, comprehensive, integrated EHR system.
The current version of this system, which is based on the Veterans Health Administration's self-developed Veterans Health Information Systems and Technology Architecture (VistA) public domain software, sets the standard for EHR systems in the United States and has been publicly praised by Presidents Clinton, Bush, and Obama and many independent observers (Independent Budget, 2016; JCAHO, 2008). VistA was awarded an Innovations in American Government Award by the Ash Institute for Democratic Governance and Innovation at Harvard University's John F. Kennedy School of Government in 2006. Unfortunately, the VA has not maintained or modernized VistA. One of the IT challenges with VistA is that it was originally designed and developed in the 1970s, a lifetime ago in IT terms. VistA supports daily healthcare operations and patient care and has been essential to the department's ability to deliver health care to veterans (GAO, 2018). While several former Secretaries of the VA stated that VistA would be modernized, the VA in 2018 signed a no-bid $16 billion contract to scrap VistA and go with a proprietary commercial solution. The shelving of VistA is a knee-jerk reaction that wasted billions of dollars (Politico, 2017). Experts warn that the VA built the most important medical computer system in history and is now about to spend billions of dollars discarding it (Open Health News, 2017). One critic referred to it as "a mix of sad and silly folly" (Shannon, webpage 2018). A VA doctor quoted by Politico claims that the Trump administration's IT actions have "taken a broke system and broken it completely" (2017). The failure of VistA reflects a systemic dysfunction: an unsustainable culture of innovation within the VA. Considerable efforts were made to modernize VistA, but those too were shelved, despite being sufficiently well-developed and tested to be suitable for launch. The example I focus on in this study is the Radiology Protocol Tool Recorder (RAPTOR), a Veterans Health Information Systems and Technology Architecture (VistA) radiology workflow module that was part of the "flagship initiative" for the transformation of the VA (Levin, 2010). The software was explicitly designated as part of the VA VistA Evolution Product Roadmap and was included in future budgeting as the highest priority for radiology (VA, 2014). The figure below shows the bidirectional interface between VistA and RAPTOR. From the user's perspective, RAPTOR acts as a radiology web dashboard into VistA; it is essentially a radiologist's version of VistA. RAPTOR helps manage radiology workflow and serves up medical images in a web viewer.

Figure 2 RAPTOR to VistA integration
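To make the bidirectional interface of Figure 2 concrete, the sketch below shows the general shape of such a dashboard-to-VistA integration. It is a minimal illustration only: the VistaClient wrapper, the RPC names, and the record fields are hypothetical stand-ins of my own, not the actual RAPTOR or VistA Imaging interfaces.

```python
# Hypothetical sketch of a RAPTOR-style web dashboard reading a radiology
# worklist out of VistA and writing a protocol decision back.
# VistaClient, the RPC names, and the record fields are illustrative
# assumptions, not the real RAPTOR or VistA interfaces.
from dataclasses import dataclass


@dataclass
class RadiologyOrder:
    order_id: str
    patient: str
    procedure: str      # e.g., "CT CHEST W/CONTRAST"
    urgency: str        # e.g., "STAT" or "ROUTINE"
    protocol: str = ""  # filled in by the radiologist via the dashboard


class VistaClient:
    """Stand-in for a connection to a VistA instance (hypothetical API)."""

    def call(self, rpc_name: str, *args) -> list:
        raise NotImplementedError("connect to a real VistA instance here")


def fetch_worklist(vista: VistaClient) -> list[RadiologyOrder]:
    # Read direction: pull pending imaging orders into the web dashboard.
    rows = vista.call("RAPTOR WORKLIST GET")  # hypothetical RPC name
    return [RadiologyOrder(*row) for row in rows]


def submit_protocol(vista: VistaClient, order: RadiologyOrder, protocol: str) -> None:
    # Write direction: record the protocol decision back into VistA,
    # replacing the paper-based protocoling step described above.
    order.protocol = protocol
    vista.call("RAPTOR PROTOCOL SET", order.order_id, protocol)  # hypothetical
```

The point of the sketch is the two arrows in Figure 2: the worklist flows out of VistA into the web layer, and protocol decisions flow back in.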
RAPTOR was designed to integrate into existing VA medical imaging department VistA workflows and, in so doing, simultaneously improve safety, quality, efficiency, and compliance (OSEHRA, 2016). RAPTOR's potential benefits include replacing the existing paper-based process with a tailored electronic workflow (Medverd, 2012). RAPTOR is a means to modernize VistA, and its proposed benefits are described in detail in both the literature review and the findings section of this research. For the RAPTOR case study, I propose that the VACI was not able to overcome institutional obstacles from OI&T and PCS. VACI initially supported the design and development of the software. In the operations phase, VACI had limited resources and wanted to turn over the software to OI&T and PCS for deployment. Unfortunately, OI&T did not provide any resources to support the design, development, and testing of RAPTOR. This lack of communication, integration, and coordination resources resulted in a delayed transition schedule. The design and development of RAPTOR cost over $2 million (from prototype inception through user acceptance testing) (VA Contract VA118-11-RP-0173), and the software was delivered on time and on budget. After successfully completing the VA intake process, RAPTOR could have been introduced nationally (Bulson, 2014). User acceptance testing and enterprise security testing were successfully completed at four pilot sites, and the project was voted one of the Top 5 Health IT projects of the year by the Society for Imaging Informatics in Medicine (SIIM) and Radiology Business Journal (RBJ) (Proval, 2012). It also successfully passed the Open Source Electronic Health Record Alliance (OSEHRA) software quality certification (OSEHRA Technical Journal, 2016). However, although the application addresses an ongoing need to improve advanced imaging safety, quality, and compliance, RAPTOR has never been implemented within a live clinical setting for everyday use. This research seeks to understand why. It does so by focusing on DevOps software processes using an autoethnographic methodology. This research comprehensively investigates many attributes that impacted the project and the organization. I am studying the decade-long development process underpinning the RAPTOR project in order to gain insight into innovation at the VA more broadly. The VA OIG told Congress (2017) of the VA's struggles to design, procure, and implement functional information technology (IT) systems. The VA has a high number of legacy systems needing redesign, improvement, or replacement, including VistA. Redesigning and replacing systems have been a major challenge across the government and are not unique to the VA (VA OIG, 2017). Unfortunately for the VA, the kind of dysfunction we see with RAPTOR is a symptom of a broader innovation problem and is not unique to the VistA case. It occurs regardless of whether the software is developed in-house or externally. Hence the broader emphasis on IT healthcare innovation. The original contribution that this study makes is twofold. First, I focus on the autoethnographic approach applied to existing data. This paper seeks to advance an understanding of my case study as it is applied to the VACI software project RAPTOR. I also offer initial findings structured using the Toulmin Method of linking findings and supporting data. This dissertation contributes to the academic literature, which has yet to investigate innovation in the VACI, which in the literature review I categorize as a semiformal organization. The current academic research focuses on clinical innovation in the VA (as detailed in the radiology information systems, organizational innovation and knowledge management, data visualization, and human-computer interaction sections of the bibliography), but current news reports that the VA is replacing VistA reveal a "buy-first" strategy (Shulkin, 2017; Blackburn, 2018) under which internal IT software development innovation is not a priority. The literature does not address a cross-disciplinary, systematic approach to DevOps innovation in understanding innovation at the VACI. There are therefore gaps in the current literature that my research seeks to fill using an autoethnographic account of open-source information technology projects within the VACI.
This study will, therefore, be the first of its kind. Second, there is a more general contribution. It seeks to advance knowledge in software engineering DevOps more broadly, particularly when it comes to challenges to innovation in large organizations, and to help in the development of mitigation strategies to respond to these problems. With that in mind, the autoethnographic account I present here will investigate the information studies aspects of RAPTOR, including the innovation and the human-computer interaction (HCI) underpinning it. Ultimately, I am interested in why it was canceled. A limitation may be that I am not able to truly answer "why", but an autoethnography can help document the processes and decisions that led up to the cancelation. My case study examines the people, process, and technology (PPT) improvement model (Prodan, 2015; IBM, 2011) as it is applied to the VACI software project RAPTOR. I use the PPT model as a guideline to research and answer the broad question "why are HIT innovations successfully developed and then never introduced clinically, despite their benefits?" and the secondary question "can VA IT be healed?" These are the central research questions driving this research. I will examine the VA based on its desired goals. VA CTO Peter Levin quotes (Fedscoop video, 2010) Secretary Shinseki on the "VA's emphasis on transforming the people, process, and technology". Levin lists on his fingers, "cultural change, business process reengineering, and technology renovation". He said that the Secretary was in the process of transforming this 310,000-person agency, which was moribund, paper-bound, and mired in a stovepipe culture. As my evaluation focuses on matters related to VA stakeholders, I examine the interactions between people, processes, and technology as understood and judged by those inside the program or activity (Greenwood, 2006). Thus, to answer these questions, I draw upon several different theories that were introduced to me at the UMD Center for the Advanced Studies of Communities and Information (CASCI). I will answer my research questions using the adaptive theory methodology, which Layder (1998) defined as a combination of pre-existing theory and theory generated from data analysis in the formulation and conduct of empirical research, in order to connect the academic theories and the VA data I have collected. My case study examines the VACI software project using the people, process, and technology (PPT) improvement model (Prodan, 2015; IBM, 2011). I am using the PPT as a central organizing concept because it is the model that the VA used to transform itself. The PPT has been used specifically by the VA in response to the Obama Open Government Transformation (Levin Fedscoop video, 2010). In the figure below, I place the VACI within the PPT improvement model. PPT is a holistic model that has been used in IT (IBM, 2008) and across industries (Prodan, 2015) for more than thirty years. I will examine each of the RAPTOR case study's three dimensions: technology (what types of HIT software and tools are used); organization (how HIT is managed); and process (how HIT software procedures are followed). In the literature review, I first examine the VA organization and then propose several technology innovation models, including the PPT. The PPT will serve as a set of guidelines that direct data collection efforts. This research uses concepts from PPT to guide theory generation efforts.
Figure 3 VACI People, Process, and Technology innovation model

Using the PPT model, I propose three subsidiary research questions to supplement the central research question of why RAPTOR was never introduced:

- People: How does the VA organization, culture, and communication influence innovation?
- Process: In what ways does the breakdown of VA software, clinical, and management processes impact innovation?
- Technology: Is information technology the cause of the rejection of VA VistA and RAPTOR?

Figure 4 Three PPT Research Questions

Overview of Methods

This section is an overview of the mixed methods approach I will use to assess the VA barriers to innovation, the VACI organization, and the fate of the RAPTOR project. Through a narrative literature review, I have collected the technical information my team used to design and develop RAPTOR over the past eight years. To understand the VACI organization, I am using an adaptive theory system methodology that I first became aware of through my participation in the UMD iSchool Center for Advanced Study of Communities and Information (CASCI). In addition, I will collect, analyze, and interpret the data I have collected to test these models and develop a new theory based on the data. A case study is my approach to organizing and presenting the data. I focus on a single RAPTOR case study and gain insights by comparing it to my experiences with VA OI&T (as an internal software developer for VistA), PCS (integrating a teleradiology COTS), and a counter-example from innovation at the NSA. One lesson from going back to school is that RAPTOR is not the only focus of my study; I have also learned how to approach researching information systems in general.

Autoethnography Methodology

In this thesis, I use the autoethnographic methodology to describe my work with the VA since 2002. Autoethnography is a research approach that systematically describes a personal approach to understanding cultural experience (Ellis, 2016). Although autoethnography is not a common research method in IT, it has been shown to be an effective qualitative method (Anderson, 2006; Atkinson, 2006; Costello, 2016; Ghita, 2016) for understanding not only stakeholders' viewpoints but also the broad context of IT. Rowe (2012) lauds autoethnography for information systems research "since experience is the main source of learning, immersion into the lifeworld of those who live what we want to study is the best way to go." Rowe (2012) calls it "privileged research" in that it is rare that the opportunity is granted for the amount of time and resources needed to "describe situations that are rarely observed," such as paradoxical or insider conditions. My expertise is based on thirty years' experience in software engineering, especially twelve years of information technology experience at the VA and three additional years at the Department of Defense (DOD) Military Health Systems (MHS). During these years, I worked as a practitioner (Stringer, 2015) on many successful and unsuccessful projects at the VA and the DOD, including their Electronic Health Records (EHRs). My unique observations originate from diverse technology roles in each of the three parts of the VA organization, as shown in the timeline table at the end of this section. At the VA Office of Information & Technology (OI&T), I was a VistA software developer. At VA Patient Care Services (PCS), I was an enterprise architect supporting the VA Chief Radiologist.
On the RAPTOR project, I served as a "jack-of-all-trades" on a small team, in roles including innovator, designer, developer, and manager. My years working with VACI on RAPTOR will be the focus of this research. Thus, the methodology is an investigation into my own situation. The advantages and challenges of the autoethnographic method are described in detail in Chapter 3. Although I am not a veteran myself, my family has many current and former military personnel. My father credits the VA for saving his life after World War II, and it is at a VA hospital where he met my mother, a young nurse.

Approach

My research approach can be summarized in the table below. Coming from a non-academic, practical, business-based STEM background and profession, one area I had to learn about was research methods. Since I have been back to school, I have learned that these ideas have labels that come out of philosophy. I define epistemology as what we come to believe as seen through our experiences, culture, and surroundings. An example of this is how one could define HIT failure. The VACI could define RAPTOR as a success because we contractually delivered the software on time and on budget and successfully passed UAT. The VA radiology community could define the project as a failure because the VACI was unable to provide the application to them. My reasoning emphasis is highly inductive. With inductive reasoning, the researcher is free to change the approach based on emerging considerations. I surveyed the available methods and settled on autoethnography based on the conditions of the data available. My plan is to use my research to build theory and conclusions rather than prove existing theory. Inductive reasoning is also less structured than deductive reasoning, as there is no guiding theory. My contribution will be to develop theory. Content analysis is a widely used qualitative research technique.

Table 2 Summary of my research approach

- Ontology (my beliefs about my situation): There are multiple levels of reality
- Epistemology (how I come to know the world): Meaning is culturally defined
- Methodology: Qualitative
- Design: Autoethnography
- Emphasis: Inductive
- Methods (techniques for collecting data): Document analysis, observation, and optionally non-formal interviews
- Analysis: Qualitative descriptive analysis; how the data will be processed in order to answer my research questions

Timeline

Below is a timeline of relevant events that impacted my research, including both personal and organizational events. My personal timeline is important because it shows my experience with VA software, and autoethnography is a research method that uses personal experiences ("auto") to describe and interpret ("graphy") cultural texts, experiences, beliefs, and practices ("ethno").

Table 3 VA Milestones and My Personal Timeline

1980s
VA milestone: The VistA OSS. In the early 1980s, VA made its software available without restriction in the public domain to other government and private sector organizations, in compliance with the Freedom of Information Act (FOIA). VA recognized this opportunity to support widespread EHR adoption and offered the use of VistA as the standard-bearer for EHR implementation around the world.
My personal timeline: I graduate from UMD (BS, Electrical Engineering) and begin my career in software engineering. Most (almost all) of my experience is with government organizations (various DOD government labs).
2002-2006
VA milestone: For the development of VistA, the VA was named a recipient of the prestigious Innovations in American Government Award, presented by the Ash Institute of the John F. Kennedy School of Government at Harvard University in 2006. The VistA electronic medical records system is estimated to improve efficiency by 6% per year, and the monthly cost of the EHR is offset by eliminating the cost of even a few unnecessary tests or admissions.
My personal timeline: After graduating with a Master of Science degree, I begin working in healthcare information technology. As part of my job as a DOD EHR system engineer, I am directed to study the VA EHR VistA. I work to promote data sharing between DOD and VA. I am lead author on a DOD-VA data sharing research report.

2005-2009
VA milestone: The VA receives an influx of new patients due to the Global War on Terrorism (GWOT).
My personal timeline: At the VA's request, I directly support the VA while working for my own engineering consulting firm. In my first VA OI&T contract, I support the VistA development team, providing technical design, development, and testing expertise for existing and reengineered systems across multiple VistA Imaging software releases, such as DICOM (Digital Imaging and Communications in Medicine) Query and Retrieve, Remote Image View TeleReader and Tele-Ophthalmology, HL7 Transmission to Commercial PACS, and Import Reconciliation Workflow for Portable Media such as CDs.

2009
VA milestone: President Obama's Open Government directive results in the VA Innovation Initiative (I2).
My personal timeline: I am awarded a VA Certificate of Achievement at both the VistA eHealth University (VeHU) Conference and the Information Technology Conference (ITC). I am an active participant in the Federal Health Architecture (FHA) Consolidated Healthcare Informatics (CHI) Workgroup and the Joint DoD/VA Interagency Imaging Sharing Integrated Project Team. I served as a VA VistA Imaging representative to the Integrating the Healthcare Enterprise (IHE) Radiology and Eye Care Technical Committees, DICOM Committees, the VA/DoD Joint Imaging Team, and the Office of the National Coordinator for Health Information Technology.

2010
VA milestone: The VA Modernization Report highlights the need to innovate VistA. The report recommends moving forward with open-source software.
My personal timeline: I move from supporting OI&T to the Chief Radiology Consultant for PCS. I become aware of the VA Innovation Initiative (VAI2).

2011
VA milestone: VA establishes the Open Source Electronic Health Record Alliance (OSEHRA) as the central governing body that oversees the community of EHR users, developers, and service providers.
My personal timeline: My company wins the first RAPTOR VA Innovation Initiative (VAI2) development contract from VACI for the RAPTOR prototype.

2012
VA milestone: The VA Innovation Initiative (I2) changes scope and leadership direction to become the VA Center of Innovation (VACI).
My personal timeline: The RAPTOR proof of concept is built and studied.

2014
VA milestone: The VistA Evolution Program, an effort to modernize VistA, is launched.
My personal timeline: RAPTOR agile web development integrates with VistA.

2015
VA milestone: A new OSS policy is initiated to evaluate open source solutions (along with larger enterprise solutions) when acquiring or developing new software. This policy requires that the use of open source development practices be considered when VA or a VA support contractor develops software.
My personal timeline: RAPTOR User Acceptance Testing (UAT) is completed at four sites. I present at several open-source conferences (including Drupal Government Developer Days and OSEHRA).

2016
VA milestone: RAPTOR's JavaScript software library is reused by the Daily Plan application. This is an example of code reuse in open source software applications.
My personal timeline: I am a UMD iSchool Ph.D. student with the intention of studying RAPTOR utilization using mixed methods.
2017
VA milestone: RAPTOR is approved by the OI&T VistA Intake Process.
My personal timeline: I complete UMD classes and start preparing the iSchool integration paper on RAPTOR.

2018
VA milestone: VA plans the VistA sunset. VA does not introduce RAPTOR clinically. VA signs a sole-source $10 billion contract with a proprietary EHR vendor.
My personal timeline: I present my iSchool Integration paper on RAPTOR and am approved as a Ph.D. candidate. My poster is accepted to iConference 2019.

2019
VA milestone: VistA has remained the top-rated EHR, despite neglect and attrition in the programmer ranks.
My personal timeline: I prepare a proposal (namely Chapters 1, 2, and 3) to study the RAPTOR story and research the VA, VACI, and DevOps process.

2020
VA milestone: It is estimated that the EHR replacement contract will be closer to $16 billion and will not be completed for 10 years.
My personal timeline: I plan to prepare the results, conclusions, and lessons learned from my research (Chapters 4 and 5).

Chapter 2: Literature Review

This literature review will survey and unify several diverse strands of literature, first at the VA enterprise level and then at the RAPTOR project level. It uses theory and concepts from many diverse frameworks, including the academic, organizational, management, behavioral, information science, computer science, and investigative journalism domains. Even before I started my Ph.D. program, I collected articles on the literature review topics. I have been using an adaptive methodology approach: utilizing the concepts derived from foundational theories as orienting concepts, going into the data, refining the findings, going back to the data, and then ultimately constructing a theory of IT healthcare innovation. An interdisciplinary literature review strengthens the foundation of the research and indicates where my research fits within the larger information science community. I will integrate the theoretical frameworks and concepts that I learned at the UMD iSchool to assess the VA barriers to innovation, the VACI organization, and the fate of the RAPTOR project. To understand the VACI, I will use technology system models. I first became aware of these through my participation in the UMD iSchool Center for Advanced Study of Communities and Information (CASCI). In addition, I have selected several distinct VA "insider" assessments that I believe have influenced VA leadership: the VistA Modernization Strategy (2010), the VistA Evolution Roadmap (2014), the RAND Corporation assessment for Choice (2015), and the Harvard Business Review (2016). Through a narrative literature review, I examine the information science domains my team used to design and develop RAPTOR. For a public institution, much of the VA is hidden in plain sight. Moore notes that "relatively little has been written on the politics or history of the VA's origins and expansion. There is, for example, no book-length monograph on the history of VA health care" (2015, p. 338). Therefore, I will show below how the VACI is structured as a semiformal innovation organization and how, because of several organizational challenges, including changes in leadership and scope, the VA limited the VACI's resources. I will show that without the support of other parts of the organization, particularly OI&T and PCS, innovative software applications, including RAPTOR, were shelved.
In this literature review, I will elaborate on innovation technology frameworks to assist with my theory building, including the People, Process, Technology (PPT) model; the Technology, Organization, and Environment (TOE) model; Davis's Technology Acceptance Model (TAM); the Strengths, Weaknesses, Opportunities, Threats (SWOT) diamond; semiformal managerial organizations; and DevOps (IBM, 2008; Davis, 1989; Burton-Jones, 2013). The figure below shows the case for this research. It organizes the most important concepts at the VA department level, the VACI semiformal organization level, and the RAPTOR software application level.

Figure 5 Literature Review Map

Framework for Researching Innovation

The literature review first broadly examines theoretical models of DevOps innovation and maps the foundational and "state-of-the-art" concepts to the VA and VACI. I will study VACI software engineering using the overlapping domains conceptual framework based on Lincoln & Korpman's pioneering examination of computers, technology, information science, and informatics (Lincoln & Korpman, 1980, p. 259). I initially look at VACI, a semiformal organization, and research innovation models. I examine the conceptual framework of the RAPTOR project and the process involved in innovating the software. For the development of RAPTOR, the overlapping domains under investigation are computational archives, human-computer interaction, and knowledge management, as well as data visualization to explore, measure, and verify the efficiency and effectiveness of the VA's radiology protocol workflow. Theory from Computational Archival Science (CAS), Organizational Innovation and Knowledge Management (KM), Human-Computer Interaction (HCI) in radiology information systems (RIS), and Electronic Health Record (EHR) information systems and data visualizations will inform the analytical framework for RAPTOR application development.

A Short History of the VA

The VA was created in 1921 by President Warren Harding and Congress to care for neglected and disabled veterans of World War I. From the start, the VA has been plagued by scandals and complaints of inefficiency. The VA's first director, Charles Forbes, was convicted of embezzlement and kickbacks. Forbes's fall in Washington illuminates President Harding's efforts to bring business efficiency to the government (Stevens, 2017). Harding's Presidency is known today for his cabinet's corruption and his extra-marital affairs. In 1946, Winston Churchill was quoted as saying, "those who fail to learn from history are condemned to repeat it." Today's political headlines suggest that we have learned little from the days of Harding's Presidency. My father, a World War II (WWII) veteran, spent several years after the war recuperating at the VA. He served in the US Army Signal Corps in the China-Burma-India war theater. At war's end, he was down to about half his normal body weight due to diseases brought on by harsh jungle conditions, such as malaria and dysentery. While recovering at a VA hospital he met my mother, a young nurse trainee. While my father and uncle had a positive care experience at the VA after WWII, many deserving veterans failed to receive services. Newspaper headlines of the era are noted (Longman, p. 15) for capturing the lack of quality care at the VA: "Veterans Hospitals Called Backwaters of Medicine" and "Third-Rate Medicine for First-Rate Men".
In 1947, a government commission uncovered enormous waste, duplication, and inadequate care in the VA system, and major reforms were enacted. Longman notes (p. 14) that it was not just scandals, but also blundering attempts to avoid scandals, that have marred the entire history of the VA. In my "baby boomer" generation, we have seen vast improvements in combat medicine that had the consequence of a lifetime of post-battlefield care that previous generations had not experienced. Baby boomer veterans also experienced an ungrateful nation that was not able to separate its feelings about a misguided war from the people who served. The book and movie "Born on the Fourth of July" was my introduction to the history of VA scandals. In the late 1980s, the veteran population from the WWII, Korea, and Vietnam wars totaled about 28 million and the VA's budget was around $26 billion. The VA's medical expenses have increased over the past 10 years, with increases in both total costs and individual patient expenditures, despite a drop in the veteran population. Currently, the veteran population is down to 20 million, but the budget has increased to over $200 billion; in rough terms, annual spending has grown from under $1,000 per veteran in the late 1980s to about $10,000 per veteran today. The VA is forecasting that the veteran population will be less than 14 million in 20 years (VACI, 2019). The number of veterans is falling rapidly, while per-patient spending growth has skyrocketed. The problem is not a lack of money; instead, the VA is plagued by long-running difficulties. Longman (2010) describes the VA as "a gigantic, unionized bureaucracy, micromanaged by Congress and political appointees, and beset by an uncertain budget, an aging infrastructure, and a legacy of scandal." According to VACI figures on social media (Akinyele, 2019), approximately 63 million people are potentially eligible for VA benefits and services. Roughly 9 million are enrolled, and more than 5.3 million received care in 2017. There were roughly 600,000 patient admissions and nearly 57.5 million outpatient visits. The VA operates more than 1,300 care facilities, including 875 ambulatory care and community-based outpatient clinics and 154 medical centers, with at least one in each state. There are 136 nursing homes, 43 residential rehabilitation treatment programs, 206 Veterans Centers, and 88 comprehensive home-care programs. There are several external stakeholders who impact innovation at the VACI, including the media, Congress, OI&T, Patient Care Services, and clinical users.

VA Environment

The VA environment is highly regulated and non-competitive. The VACI noted that "despite significant increases in appropriations and a decline in Veteran populations, care to veterans remains problematic" (Akinyele, M., 2019). Recent VA problems of suicides and addiction have received media attention as full-blown crises. The Military Times reports that the VA's own accountability reporting notes everything from the agency's over-prescription of opioids to its continued struggles to get veterans in front of doctors in a timely fashion (Military Times, 2018). VA advocates note that the press has a responsibility to cover the VHA, as it does other healthcare systems, reporting not only on problems but also on innovations, research, and patient care (Gordon & Craven, 2018). Former VA Secretary David Shulkin said (VHPI report, 2018) he was "frustrated with the VA's environment" during his tenure. Shulkin also claimed a major challenge of the job was contending with unbalanced coverage of the agency.
"Of course, there are a few bad actors in the agency; there are 370,000 people in it," Shulkin said (VHPI report, 2018). "But the organization is unfairly labeled as a failure, which casts a shadow over the agency even though people are getting extraordinary care. We didn't get the type of balanced reporting that would have helped us accelerate the culture and morale improvements that are underway," Shulkin concluded. "Bad news gets more attention than good news." These statements show the importance of perception and communication to the VA culture at the Secretary level. This grassroots effort also required a change in culture. VA Secretary Kizer (1994-1999) advocated "taking down the barriers that keep people from doing the right thing" and said that "people tried to do the right thing in spite of the rules" (Gordon, p. 32, 2018). It is fair to say that these roadblocks and cultural complacency are still an issue within the VA. OI&T has an antagonistic relationship with the rest of the VHA. According to the Senior Enterprise Architect, Richard Pham, "you will have a challenging, I outright say an antagonistic, relationship with the IT department" (Pham, 2015). One of the VACI governing principles (Brown, VeHU presentation, 2010) was that applications would "be piloted in a safe harbor environment". This was realized by the sandbox environment described in the technology section. It is important to differentiate between the struggles that the VA is having in information technology innovation and the amazing advances it has made in clinical research and effective cost control (Oliver, 2007). As noted by Gordon (2018), the VA is a research powerhouse and has made advances in medical care, equipment, and pharmaceuticals that now benefit the entire world, not just veterans.

Figure 6 The VACI semiformal organization mission and name changes

VA Innovation Program

The VA's Innovation Program was formed in 2010 as part of the VA Secretary's transformation of the agency into a 21st Century organization. In fact, the VA Innovation Initiative (VAI2) was highlighted as the "flagship initiative" of the VA Open Government Plan of 2010. The US Open Government Plan candidly noted that the "VA has not always been the model of government performance or service delivery." It (VA Open Government Plan of 2010, p. 3) listed the attributes of "strong leadership, good governance, and a new commitment to creating a culture that is open, transparent, participatory, and collaborative." Over the past 10 years, the name of the VA's national innovation program has changed at least four times, from VAi2 to the VA Center of Innovation (VACI and VCI) to the VA Innovators Network; it is currently called the VA Innovation Center (VIC) (see the figure of logos above). To avoid confusion, I will use VACI throughout this text. I presented at the VA eHealth University Conference in August 2010 in Tampa, Florida, where I attended an "Introduction to Innovation" session presented by Chuck Brown, the Director of the Innovation Program. The Innovation Portfolio (Brown, 2010) stated, "The VACI enables a steady flux of high-value innovations into the VA, moving them from concept to operational implementation." The VACI is "taking a lean startup approach and applying methods like the user-centered design to achieve results quickly". This "grassroots" organization is how the VistA EHR was formed in the 1970s (Longman, 2010; Gordon, 2018).
The VA's internal innovation group was known as the "Hardhats" and the "Underground Railway", given that they shared information and collaborated to serve enterprise needs. They avoided being shut down by executive management by going to the national media and convincing Congress to align with their goals. In recent years, the VA work environment has been deemed by the government to be at high risk. A VA internal leadership task force noted that it displayed obstructionist attitudes and clearly lacked integrity (GAO, Wagner, 2015). Since 2005, the VA Office of Inspector General (OIG) has completed 80 criminal investigations involving wait times and issued 18 reports identifying deficiencies, in some cases concluding that wait times had been detrimental to patients' health (VA OIG, 2014). Rubenstein (2018) reported that patients complain about the length of time it takes to get appointments and the amount of bureaucracy involved in becoming eligible for treatment, as well as falsified records and even preventable deaths.

Oversight

Every two years, the GAO reports to agencies and the public on the areas that are at high risk due to their vulnerabilities to fraud, waste, abuse, and mismanagement. The GAO highlights those programs that are most in need of transformation (GAO At-Risk List, 2015, 2017). The current OI&T organization is seen by GAO as having "inadequate oversight and accountability" and "information technology challenges", and as being "at risk to fraud, waste, abuse, and mismanagement and in need of transformation." A 2019 update to the GAO high-risk series noted that "leadership commitment has regressed" (GAO-19-157SP). Another report (GAO-19-476T, 2019) notes that "over many years, VA has experienced challenges in managing its IT projects and programs" and, specifically of OI&T, questions "its ability to deliver". An accounting of all the oversight and investigations of the VA includes the VA OIG, the FBI, the White House, Congress, corporations, and the press. The OI&T transformation report (p. 9) noted with slightly veiled frustration, "Several high-profile media reports over recent years also highlighted how the build-up of bureaucracy over time had impacted care and services for our Veterans. We were the subject of study upon study, assessment upon assessment. Hundreds of findings, hearings, and interviews indicated everything that was wrong with OI&T…" Representative quotes from frustrated oversight experts in a March 30 GAO report include "Our Hands Were Tied At Every Decision Point" and "Instead of Our Expectation To Work With A Leadership Team That Genuinely Desired Positive Change, We Were Met With A Leadership Team That Displayed Obstructionist Attitudes, and Clearly Lacked Integrity." These quotes reveal organizational behavioral integrity issues with VA oversight. One of the most significant and notorious project failures was replacing the legacy scheduling system (GAO-10-579), which is about 35 years old (it was first used in 1985). What is noteworthy is the variety of software solutions that the VA attempted and failed at over 35 years: a homegrown VA system, a "bake-off" contest, and a commercial system. The scheduling replacement project first started in 2000 and ended in failure in 2009. In 2013, the VA launched a Medical Appointment Scheduling System (MASS) contest under the America COMPETES Act that served as a "proof-of-concept" prototype.
The winner of the contest, MedRed, stopped hearing from the VA after the contest ended (Politico, 2014). The VA then awarded a scheduling contract to Epic that it subsequently cancelled during its pilot. "We knew that scheduling was a serious problem," said Peter Levin, the VA's Chief Technology Officer (CTO), quoted in Politico (2014); "We didn't know it was an acute problem". This quote is a remarkable and accurate summation of what I experienced at the VA whenever scheduling was involved.

Theoretical Models for Innovative Organizations and Context

It is noted above that the VA listed strong leadership and good governance as requirements for achieving innovation. Biancani (2014) noted several different innovation organizational structures; I examine the VACI in terms of the semiformal organization model. Kettl (1993, 2002a, 2002b, 2015) notes that much of the academic research on organizational innovation is in the commercial, manufacturing, international, or private domain. The federal government and public bodies have different organizations and cultural norms for people, technology, and processes. Innovation creation and adoption in organizations is a highly complex process. It can be subjective, therefore illogical, and hard to research. Public-Private Partnerships (PPP) are used as a tool to improve the bleak success rate of federal innovation (Kettl, 1993, 2002a, 2002b, 2015). The systematic study of non-adoption and resistance to adoption is as crucial as the study of adoption, but it is a largely unexplored field.

People, Process, Technology

On his first day in office, President Obama required the federal government, including the VA, to improve transparency, collaboration, and participation (Obama, 2008). VA Secretary Shinseki fulfilled this executive order in the VA Open Government Plan (Version 1.3, 2010) by using the People, Process, Technology (PPT) Model. The PPT model is well-used in both the academic and popular organizational literature (Prodan, 2015; IBM, 2011). Leavitt (1976) proposed PPT as a way of explaining the critical success factors for organizational change. I now propose examining each of the three PPT elements of VACI capabilities. The People breakout figure below illustrates the significant areas (by my observation) where I want to examine the People element of the PPT model. Under people, I examine the organizational structure and culture, the private-public-academic partnership, and public communications. Groups of people create an organizational culture through shared values and behaviors. At the VA, the mission of serving Veterans is a very strong shared value. Secretary Shinseki's Open Government Plan (V.1.3, 2010, p. 7) promised a changed culture that would be open, transparent, participatory, and collaborative. The plan detailed that "creating an atmosphere of openness at VA, the second-largest Federal agency, will require not only leadership from the top of the organization, but also significant efforts to integrate these values into our business processes" (V.1.3, 2010, p. 16). In the VA Blueprint of Excellence, one of the strategies (#4, p. 20) includes "engaging and inspiring employees to their highest possible level of performance and conduct". This culture was realized by the VA Innovation Initiative, which required employee participation in the ideas, selection voting, and implementation. In the findings section, I will investigate the organizational behavior issues that inhibited the changes promised.
Figure 7 The People details within the PPT model

People: How does the VA organization, culture, and communication influence innovation?

Behavioral integrity (BI) is defined as the alignment of an organization's words and actions (Dineen, Lewicki, & Tomlinson, 2006). BI has been shown to be associated with a broad range of effective transformational leadership behaviors (Simons, 2011). Simons (1999, 2002, 2008) has confirmed that BI is influenced by personal characteristics (specifically leadership behaviors) and contextual characteristics (specifically organizational and environmental factors). What role do BI, culture, and environment play in preventing innovation from thriving? It is the VA's goal to develop a culture of safety by reducing and preventing inadvertent harm to patients (VA National Center for Public Safety, 2018). RAPTOR was an easy fit to address the VA's core values regarding Veteran-centric health care. I argue that failing to introduce RAPTOR represents a discrepancy between VA values and actions, and that the VA's BI is misaligned. RAPTOR can improve the quality of care, patient safety, and regulatory compliance (VACI Good News Story, 2012). Tudor (2018) found that an automation tool to assist in the radiology order entry protocol selection of advanced imaging studies is a prime target to alleviate labor-intensive tasks. RAPTOR was identified in the VistA product roadmap as a "best practices workflow tool" and certified for quality (OSEHRA Technical Journal, 2016). RAPTOR ordering functionality automates the correction of erroneous orders (RAPTOR Requirements, 2014). It was noted by VA employees that this correction feature could potentially be perversely abused by giving users the ability to unethically improve their imaging department's ordering and scheduling metrics. The perceived ease of use in the Technology Acceptance Model (TAM) is an important theoretical consideration that I will use to explain why RAPTOR was not released (Davis, 1989). Cleaning and migrating erroneous legacy data revealed by introducing new systems is a common quality issue in computational archive systems (Rahm, 2000; Hasan, 2007). Erroneous legacy orders are an integrity problem. Holding up the implementation of a timesaving application while refusing to clean up legacy data is an ethical issue. Going into production will bring to light many lingering erroneous radiology orders. These occur when the order is not entered correctly and must be modified by the technologist. Paradoxically, this order housecleaning has not occurred, due to ethical considerations. What are the ethical ramifications of the huge backlog of erroneous radiology orders? The Washington Post (Joe Davidson, Politics, Perspectives, June 27, 2019) notes that several employees are testifying to Congress about VA exam ordering data and the broken process to either ignore or delete the data. This ongoing, sensitive issue of order data was first made clear to me when one of the acceptance sites noted it after erroneous data was exposed by turning on RAPTOR:

"Erroneous orders. Going into production here in Portland has brought to light many lingering erroneous orders in our system. These occur when the order is not entered correctly and must be modified by the technologist, e.g., the clinician orders chest CT, then also orders abdomen/pelvis CT rather than one order for chest, abdomen, and pelvis. The tech attaches the abdomen and pelvis exam to the chest order, leaving the order for the abdomen and pelvis an orphan which must be deleted. Those cases are not being consistently deleted and therefore when one opens the Raptor worklist, the first several pages consist of STAT exams ordered several days earlier. This is not a Raptor problem but a housekeeping problem we need to solve locally. The problem is that a radiologist trying to protocol cannot tell which orders are legitimate and which are chaff."
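The housekeeping problem the site describes can be restated as a simple data-quality check. The sketch below is a hypothetical illustration of how such orphaned orders might be flagged; the field names and the staleness rule are my own assumptions drawn from the description above, not actual VistA logic.

```python
# Hypothetical data-quality check for the "orphaned order" pattern described
# above: an order left pending after its exam was attached to another order,
# which then lingers on the protocoling worklist as "chaff".
# Field names and the staleness threshold are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Order:
    order_id: str
    procedure: str
    status: str       # e.g., "PENDING", "COMPLETE", "DISCONTINUED"
    exam_count: int   # exams actually performed under this order
    age_days: int     # days since the order was placed


def find_orphaned_orders(orders: list[Order], max_age_days: int = 7) -> list[Order]:
    """Flag stale pending orders with no exam ever performed against them."""
    return [o for o in orders
            if o.status == "PENDING"
            and o.exam_count == 0
            and o.age_days > max_age_days]


worklist = [
    Order("A1", "CT CHEST", "PENDING", exam_count=2, age_days=9),       # real work
    Order("B2", "CT ABD/PELVIS", "PENDING", exam_count=0, age_days=9),  # orphan
]
print([o.order_id for o in find_orphaned_orders(worklist)])  # ['B2']
```

A check like this only identifies the chaff; deciding whether to delete it is the local housekeeping issue, and, as the next paragraph shows, a whistleblower-sensitive one.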
This issue is not unique to a particular site: the Washington Post (Davidson, 2019) notes that Iowa City has tens of thousands of such radiology orders, and whistleblower Jeffery Dettbarn testified about the retaliation he received for reporting on the process of canceling these orders. If VA HIT is to help solve veterans' health care and help improve patient safety, HIT projects must succeed. Yet HIT projects fail at a rate of up to 70%. Failure in this context is defined by Leviss (2010, p. xvi) as "a HIT project in which an unintended negative consequence occurred, such as a project delay, a substantial cost overrun, a failure to meet an intended goal, or complete abandonment of the project."

Process: In what ways does the breakdown of VA software, clinical, and management processes impact innovation?

The Process breakout figure below shows where I want to examine the Process element of the PPT model. Under process, I examine development-operations coordination, the software engineering process, user acceptance testing (UAT), the project management processes, and the role of patient scheduling in radiology workflow.

Figure 8 The Process details within the PPT model

The software engineering process model I propose to use to examine the VACI is DevOps, a blend of the two terms development and operations. DevOps originates from modern software development techniques, including Agile, Scrum, and Extreme Programming (Huttermann, 2012). DevOps is constantly evolving, and a clear definition is elusive. I propose a conceptual DevOps model (see the figure below). The model illustrates the overlapping domains of information science (on the left, represented in blue) and operational tasks (on the right, represented in orange). The figure sheds more light on the types of tasks performed in RAPTOR in the information science domain and by the VACI in the VistA sandbox to support users and administrators. It is important to note the large integration tasking (bidirectional green arrow) in the figure. This integration represents a key DevOps handoff from the VACI to OI&T.

Figure 9 RAPTOR DevOps Process

DevOps is a software development methodology that combines software development (Dev) with information technology operations (Ops). Walls (2013) defines DevOps as a software culture that, when combined with several software development practices, enables rapid development. I am choosing the DevOps process model to understand the development of RAPTOR (Dev) in VACI (Ops). DevOps is a natural evolution of the Agile software development process and spans from the planning to the implementation stage. The evolution of DevOps was made possible by the spread of cloud-based virtual technologies. The adoption of DevOps is, however, more complex than the adoption of Agile, since changes at the organizational level are required. DevOps also requires new skills, coordination, and communication.
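To make the Dev-to-Ops handoff concrete, the sketch below mimics a minimal pipeline in which the development-side and operations-side stages are owned by different parties. The stage names and the ownership split are my illustrative assumptions, loosely based on the RAPTOR experience, not an actual VA pipeline.

```python
# Minimal illustration of a DevOps pipeline and its ownership split.
# The stage names and owners are illustrative assumptions: in the RAPTOR
# case the development team owned the early stages and depended on
# VACI/OI&T for deployment and monitoring.
from typing import Callable

Stage = tuple[str, str, Callable[[], bool]]  # (name, owner, action)


def run_pipeline(stages: list[Stage]) -> bool:
    """Run stages in order; a failed stage stalls everything after it."""
    for name, owner, action in stages:
        if not action():
            print(f"{name} (owner: {owner}) FAILED -- handoff stalls here")
            return False
        print(f"{name} (owner: {owner}) ok")
    return True


pipeline: list[Stage] = [
    ("build",   "dev team",  lambda: True),   # code, servers, VistA data load
    ("test",    "dev team",  lambda: True),   # UAT at four pilot sites
    ("deploy",  "VACI/OI&T", lambda: False),  # national rollout never resourced
    ("monitor", "VACI/OI&T", lambda: True),   # unreachable without deployment
]

run_pipeline(pipeline)  # Dev stages pass; the pipeline stalls at the Ops handoff
```

The design point is that no amount of success in the left-hand stages can compensate for an unowned or unresourced right-hand stage; that coordination gap is exactly what the DevOps literature aims to eliminate.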
The figure below, from Gartner (2015), illustrates many of the possible DevOps tasks within the PPT framework. It is a key illustration of the RAPTOR DevOps tasks and is more concise than the figure above. My development team performed the design, the coding, the building of the servers, and acceptance testing. Additionally, we were required to load VistA data into our application and to build the servers for UAT. After UAT, the software developers were dependent on the VACI for operational deployment and for monitoring the application.

Figure 10 Gartner (2015) places DevOps within the PPT model

To explore this issue, I will need to communicate with key stakeholders from various parts of the organization. As an established insider, I have familiar working relationships with many key stakeholders and can draw on my established professional network. I will interview stakeholders familiar with and influenced by VA organizational factors. This semi-structured process has been approved by the UMD IRB.

Technology: Is information technology the cause of the rejection of VA VistA and RAPTOR?

Figure 11 The Technology details within the PPT model

Despite a considerable body of literature (listed extensively in the bibliography) on the factors involved in the Open Source Software (OSS) adoption process, there is little academic research into proprietary vs. OSS concerns among US public sector organizations. In fact, during the RAPTOR project, the head of the VACI requested that I write a paper justifying the project's intent to use OSS (RAPTOR Options Analysis Report, 2011). Thus, it is important to reassess the factors inhibiting OSS adoption and enhancement within the VA. This study will explore OSS communities and attributes through the lens of the Technology Acceptance Model (TAM) (Davis, 1989; Burton-Jones, 2013). I will use HCI in radiology, imaging informatics, information science, information technology, and workflow analysis to show the technological innovations of RAPTOR.

The VA is shutting down award-winning OSS applications in favor of proprietary commercial applications (Politico, Healthcare IT News, 2018). It may be the case that the marginalization of OSS at the VACI has little to do with software development issues or user acceptance. Although official VA policy encourages open source applications (VA Memorandum, 2014), the VA has not nurtured such open-source behavior (Open Health News, 2018). The annual Medscape Electronic Health Record (EHR) Report consistently ranks VistA the number one EHR (Medscape, 2014-2017). Although OSS applications are championed by users and have passed all software quality certifications, they will be retired by the VA (Healthcare IT News, 2018).

VACI as a Semiformal Organization

As a reaction against the "rigidness" of the VA, the VACI was organized to increase the rate of innovation at the VA. The modernization goals of the VistA Modernization Strategy include "maintain clinician end-user involvement in requirements identification, application design and user acceptance and agile development for more collaboration and adaption to changing business needs" (p. 34, Table 5). Agile development is very different from what I experienced with multi-year development lifecycles as a VistA developer under OI&T. Biancani and McFarland (2014) noted that organizational divisions make innovation difficult.
They coined the term semiformal organization to refer to intra-organizational groups promoting new collaborations. Semiformal organizations are both structured and chaotic and mobilize around new ideas. Employee participation in administrative decisions is encouraged, and semiformal roles are occupied voluntarily. This type of project team increases informal communication. Characteristics of semiformal organizations include process flexibility, collaboration, and cross-functional teams. These semiformal structures support the exchange of knowledge and encourage networking among potential innovators. The extracurricular metaphor fits well with this type of organizational structure. "Toucan" (aka "two-can") is a slang expression in the software industry for a two-person team: one member is highly technical, while the other understands the needs of the users and is a user. This two-person team is what we used in RAPTOR. The radiologists and technologists serve as Subject Matter Experts (SMEs) on the application development, but this work takes a lower priority than "real clinical work"; they often put in extra time, meaning that they spend some of their time engaging in volunteer or unpaid labor.

I used Biancani and McFarland (2014) to understand that the VACI has the characteristics of a semiformal organization. Radiologists retain their local formal memberships and have an additional semiformal membership in the application development team. Participation in the VACI was encouraged through employee voting and contests. VACI was created because of a recognition that large government organizations can create barriers to innovation (Brown, 2010). It is a semiformal organization (Biancani and McFarland, 2014), unlike the other VA divisions, Patient Care Services (PCS) and the Office of Information Technology & Operations (OI&T). Having established that VACI is semiformal, but not an incubator or "skunk works", I want to research which IT innovation models work well in the context of semiformal organizations.

The VACI was organized as a participatory "grassroots" or "bottom-up" program (Brown, 2010). The initial idea was the creation of employee-driven initiatives that the VACI would support, from concept to operational implementation. The VACI website described this approach as "government experts ... teamed with private-sector doers taking a lean startup approach and applying user-centered design to achieve results in months, not years" (VACI, 2014). The VA Open Government Plan (pp. 27-28, Version 1.3, June 25, 2010) said, "VAi2 is our structured and sustainable vehicle for spurring innovation and introducing the best ideas into day-to-day operations within VA. Going forward, we will continue to conduct both employee-driven and industry-driven events in both bottoms-up (programs that encourage a broad range of ideas) and top-down (directed programs focused on major challenges) fashion."

The process by which VACI selected new innovations, along with its mission and focus, lasted only a few years. There have been several VACI rebranding transitions over the past few years. In 2014-2015, the VACI was reorganized to feature "Shark Tank" competitions. This rebranding was an attempt to avoid large projects and to spread more resources to spark innovation (VA Innovators Network, 2015). This project initiation rebranding is known as "Spark-Seed-Spread". The current VA innovation organization is rebranded as iNET, or the Innovation Network, and the current project initiation process is known as "Go Fish".
Currently, the VACI is being reorganized under the Office of Enterprise Integration. This organization chart is shown in the figure below. It is important to note that the VACI is not aligned with either technology (OI&T) or clinicians (PCS). This "orphan" structure results in a lack of ownership and the ability not to follow through on projects.

Figure 12 The VACI is organized within the VA Enterprise Integration

What message is communicated by broken websites? The VA's culture of poor transparency and communication is illustrated by broken links and outdated information. As shown in the figures below, the VACI website has been under maintenance since the Trump administration implemented the Mission Act in 2018. The website has displayed the message that "a new and improved website is coming soon" for over a year (VACI, 2018, 2019). I also note that VAI2, the flagship initiative of open government, has had a broken website for several years.

Figure 13 VACI website is under repairs (accessed in 2019)

Figure 14 VAI2 website is broken (accessed in 2019)

Adaptive Theory Process

My case study needs a theoretical framework, and I found that the VA Center of Innovation is a semiformal organization, a dynamic, evolving entity whose characteristics fit my observations. I will further investigate several organizational innovation models: Effective Use Representation Theory (RT), the Technology Acceptance Model (TAM), the Technology, Organization, Environment framework (TOE), Technology Innovation Systems (TIS), and Organizational Innovation Systems (OIS). These models help identify the determinant factors for the adoption or non-adoption of the RAPTOR technology. It is worthwhile to investigate which of these individual technology innovation models, or which combination of them, would fit a semiformal organization.

I propose to use Adaptive Theory, following the process below. Adaptive Theory is based on a hybrid approach between the data and the research (Layder, Chapter 6). My initial suspicion is that understanding the VACI requires innovation theory using orienting concepts for a semiformal organization. At the heart of adaptive theory is a set of concepts. They are like lenses that help me filter the data and make sense of it. I have chosen an adaptive approach because I have not found an existing model that fits my collected data; I need to adapt and propose a new model. I take the pre-existing concepts, and they will be adapted once I dive into my data. I will let the data help shape and, if necessary, discard or reform the concepts. My findings are an organized group of concepts and a systematic way of understanding this case study, which I can then weave together into a theory. As I investigate the organizational innovation models below, I will propose the steps of my adaptive methodology.

Organizational Innovation Models

Oliveira & Martins (2011, p. 110) categorize all IT innovation adoption models as either TOE or TAM types. In their literature survey, Oliveira & Martins claim that "most studies on IT adoption at the firm level are derived from these two theories" (2011, p. 110).
The following section was presented at the UMD CASCI on November 6, 2018, to gain insight, gather feedback from the iSchool community, and socialize the proposed integration of the two foundational models, TOE and TAM. I am going to integrate TOE and TAM to make my own hybrid model. I will use this new model to orient my data and concepts. The resulting data will be the input to how I formulate my conclusions on the failure of innovation within the VA and will point to a remediation process for organizational resilience.

The Technology Acceptance Model (TAM)

Since RAPTOR is a radiology module of the VA VistA EHR, I feel that I am well equipped to find a model to explain technology adoption behavior at VACI. One takeaway from the Oliveira & Martins reading and the UMD CASCI November 2018 discussion is how these models relate to other theories and models. My focus is on RAPTOR technical acceptance over performance. Having established that VACI is a semiformal organization and that VACI is using the DevOps process, I am interested in the various knowledge-sharing frameworks and psychological theories that can be found in the existing literature (Anwar, 2017). I now begin to model TAM on the open-source policy and telehealth implementations of VA OI&T, as shown in the table below.

Table 4 VA Innovation vs. TAM Model

VA Innovation | TAM Model (perceived ease of use and perceived usefulness)
RAPTOR | Perceived ease of use: radiologists perceive that RAPTOR replaces a manual, error-prone paper process. Perceived usefulness: radiologists receive multiple perceived benefits, including patient safety (see the table of RAPTOR benefits).
Open Source Policy | Perceived ease of use: improved performance. Perceived usefulness: moves VistA innovation forward much more quickly through the open source route.
Telehealth | Perceived ease of use: TeleReader improves a manual, error-prone process. Perceived usefulness: replaces transportation resources (the image moves instead of the patient).

The link between culture, organizational processes, and technology usage has been clearly established in the literature (cf. Alavi et al., 2006). Research on organizational culture (Pope and Butler, 2012, Section 4.1) found that "self-efficacy", "outcome expectancy", and "organizational climate" positively influenced an individual's intentions to share knowledge. Pope and Butler's research (2012, Section 2.2) showed that "attitude", "perceived behavioral control", "subjective norms", and "organizational support" have positive effects on technology adoption intention, which in turn affects knowledge sharing and communication behavior.

Figure 15 Technology Acceptance is an organizational innovation decision

Organizational Impact on Innovation Decisions

In my experience with the VA, organizational behavioral integrity had an impact on innovation decisions. In the TAM model, I define commitment to be organizational competence and consistency. Management support is needed to help overcome barriers, and training and education are needed to increase knowledge of the innovation. When employees want to adopt new technology to increase the standard of care, it creates a professional advantage over others. Operational technology support and access require resources if the system is to be properly maintained. In addition to technology and the environment, there is another component: the organization itself. Note that the model's constructs (usefulness, the user's perception) are often beyond the remit of software innovation.
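Before examining TAM's limitations, the toy sketch below makes its two constructs concrete by scoring adoption intention as a weighted blend of perceived usefulness and perceived ease of use. The weights and ratings are invented for illustration; in practice, TAM is estimated empirically from survey instruments, not coded this way.

```python
# A toy illustration of TAM's two core constructs (Davis, 1989):
# behavioral intention rises with perceived usefulness (PU) and
# perceived ease of use (PEOU). Weights and scores are invented.

def adoption_intention(pu: float, peou: float,
                       w_pu: float = 0.6, w_peou: float = 0.4) -> float:
    """Linear stand-in for TAM: intention as a weighted blend of PU and PEOU (0-1 scales)."""
    return w_pu * pu + w_peou * peou

# Hypothetical ratings suggested by Table 4: RAPTOR scores high on both
# constructs (replaces an error-prone paper process; clear safety benefits).
print(f"{adoption_intention(pu=0.9, peou=0.8):.2f}")  # 0.86 -> strong intention
```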
Although the TAM model is effective, it does not focus on the organization itself. With that in mind, I will elaborate upon it and develop it further in the following section.

Technology, Organization, Environment (TOE) Framework

Figure 16 TOE Framework (from Tornatzky and Fleischer, 1990)

The figure above is a classic enterprise business model that shows the relationship between Technology, Organization, Environment (TOE) and innovation (Tornatzky and Fleischer, 1990). TOE provides a holistic picture of user adoption of technology: its context, implementation, potential challenges, its impact on value chain activities, post-adoption diffusion, the factors influencing business innovation-adoption decisions, and the creation of better organizational capabilities and technology innovation. As shown in the TOE model above, the cross-functional area in the middle of the TOE is the Technological Innovation Decision (TID). We can deconstruct the TID, where attitude and perception impact user behavior, acceptance, and adoption. In the model, TID attitude and perception impact the decision to innovate.

Integrated TOE TAM Model

I have introduced TOE and TAM individually, and they serve as building blocks for combining the two models. I will now argue that TOE and TAM can be integrated into a single organizational innovation model, as shown in the figure below. Oliveira & Martins (2011, p. 120) conclude, "In terms of further research, we think that for more complex new technology adoption it is important to combine more than one theoretical model to achieve a better understanding of the IT adoption phenomenon". I will now examine whether an integrated TOE TAM can be used to describe semiformal organizations. The TOE framework needs to be strengthened by integrating it with innovation models that have clear constructs. Therefore, researchers have advocated the integration of TAM and TOE so that the predictive power of the resulting model can be improved and some of their individual limitations can be overcome (Abdelhadi, 2018; Awa, 2018). The TOE TAM overlay describes the semiformal organization.

Figure 17 TOE TAM can be integrated into a single model

TIS Applied at the VACI

At the project level, TIS can be useful for the knowledge field or product (RAPTOR), breadth (VHA Innovation Portfolio), depth (project lifecycle), and domain (radiology IT, open source software). I propose to investigate TIS theory in the semiformal organization, where the functions are determinant factors that can be dissected to understand the events that shaped the project's outcome.

Is there an Organizational Innovation System?

TIS at the project level seeks to understand the emergence, growth, and performance of new technological fields. The nature of actors and markets may obstruct TIS formation. Starting with Representing Effective Use, I will show that TOE and TAM can be integrated and are useful for finding determinant factors that lead to the adoption of technology. Functional dynamics and relationships, including behavior, are part of new organizational research in healthcare. TIS is defined (Bergek et al., 2008, p. 407) as "socio-technical systems focused on the development, diffusion and use of a particular technology (in terms of knowledge, a product or both)".

Figure 18 Bergek (et al., 2008) breaks down Technology Innovation Systems

Figure 19 Integrating TOE-TAM-TIS-OIS in the context of the semiformal organization

In shaping my adaptive theory, I developed the following process.
In Step 1, I first reviewed the existing innovation models through the process described above. In Step 2, I take these models, propose a modified integrated version, and then use that as a launching pad for the rest of the dissertation. I am looking at OIS failure groups within a modified TOE TAM TIS model, in particular the OIS failure groups adapted from Van Lancker et al. (2016, Table 1).

Table 5 Organizational Innovation System Failure Groups

OIS failure group | Explanation
Dimensional blindness | Overlooking operations or not focusing on failure integration soon enough.
Iteration failure | An improper balance between too much and too little iteration in the feedback loops.
Resource failure | Too few operational resources available within the VACI to successfully generate, develop, and diffuse the innovation.
Representativeness failure | Improper radiology and OI&T stakeholder group representativeness, a non-representative organization or individual for the group, or a non-representative individual for the organization.
Cooperation failure | Too few strong OI&T ties in the innovation network, leading to, for example, trust issues and difficulties in cooperation.
Lock-in failure | Too many strong ties, leading to, for example, "group think", resulting in myopia and inertia within the innovation network. This is true with open source development.
Hard institutional failure | The lack or underdevelopment of formal arrangements, e.g., collaboration contracts, IP arrangements, and non-disclosure agreements. Low radiology priority.
Soft institutional failure | The lack or non-alignment of informal arrangements, e.g., shared vision, social values, culture and norms, mutual trust, goals of the different partners, and business models. A severe problem with VACI.
Capacity failure | The lack of certain capacities of VACI to maximally benefit from innovations, e.g., absorptive capacity or network management capacity.

I then considered and added Kaptein's ethical factors (adapted from Kaptein, 2013).

Table 6 Ethical Factors (adapted from Kaptein, 2013)

Ethical factor | Explanation
Clarity | Clarity for contractors and employees as to what constitutes desirable and undesirable behavior: the clearer the expectations, the better people know what they must do and the more likely they are to do it.
Role-modeling | Role-modeling among administrators, management, or immediate supervisors: the better the government examples, the better people behave, while the worse the example, the worse the behavior.
Achievability | Achievability of goals, tasks, and responsibilities set: the better equipped VA employees and contractors are, the better they can do what is expected of them.
Commitment | Commitment on the part of contractors and employees in the organization: the more the organization treats its people with respect and involves them in the organization, the more these people will try to serve the interests of the VA.
Transparency | Transparency of behavior: the better people observe their own and others' behavior, and its effects, the more they take this into account and the better they can control and adjust their behavior to the expectations of others.
Openness | Openness to discussion of viewpoints, emotions, dilemmas, and transgressions: the more room people within the VA have to talk about moral issues, the more they do this, and the more they learn from one another.
Enforcement | Enforcement of behavior, such as appreciation or even reward for desirable behavior, sanctioning of undesirable behavior, and the extent to which people learn from mistakes, near misses, incidents, and accidents: the better the enforcement, the more people tend toward what will be rewarded and avoid what will be punished.

In the Literature Review theory section, I call my proposed refined model TOE TAG, where G stands for Groups, as in the failure groups. I present my pre-data "guess" of the proposed refined PPT model. TOE TAG combines TOE, TAM, and PPT with a hybrid of the OIS failure groups and the ethical factors shown in the tables above. In the findings section, my Adaptive Theory process continues with the following steps. In Findings Step 1, using PPT, I present findings using the models discussed in Literature Review Step 3, employing those models as a framework from which to begin the discussion and structure my writing. The emphasis in the findings chapter is on presenting the facts: "this is what's going on". Then, in the summary, I reflect on how well the models apply and suggest some minor amendments (if necessary). In Findings Step 2, I use PPT, the ethical factors, and the OIS failure groups to map the RAPTOR data collection, keeping my emphasis on presenting the facts and using my models from Literature Review Step 3 as a hook.

RAPTOR Level Introduction

The Department of Veterans Affairs (VA) is the largest healthcare system in the United States. A current headline (Politico, 2017) warns that the VA built the most important medical computer system in history and is now about to spend billions throwing it away. The White House has made the overhaul of the VA's medical records a centerpiece of its government reform efforts (Politico, 2018). This research studies and proposes to redesign a key part of the VA's VistA (Veterans Information Systems and Technology Architecture) Electronic Health Record (EHR) system. This research could save the government billions of dollars by avoiding throwing VistA away, while improving care for the nation's veterans. My research proposes applying information systems methodology to the VA radiology workflow. It serves as a case model that could help shape the national discussion on VistA and the modernization of legacy government health information systems.

CTO Peter Levin defines VA innovation as invention plus implementation (Fedscoop, 2010). Levin highlighted seven attributes of implementation: open architecture, modular, scalable, standards-based, extensible, reliable, and maintainable. This definition is important to note now in the literature review, as implementation is a critical issue that I will revisit in detail in the findings and conclusions sections.

Conceptual Framework

This study will use the overlapping-domains conceptual framework. The interacting domains are computational archives, human-computer interaction, knowledge management, and data visualization, each of which is used to explore, measure, and verify the efficiency and effectiveness of the VA's radiology protocol workflow. Lincoln & Korpman (1980, p. 259) used overlapping domains to illustrate the introduction of computers into the medical clinic. This framework assists the researcher by raising "issues that are difficult to resolve by the methods of information science or medical science applied in isolation.
The melding of these two disciplines, together with the contributions of other disciplines, has created a new field of study called medical information science". Healthcare Information Technology (HIT) does not concentrate on a single domain but on the areas where domains overlap, taking an integrated approach to broader problems in HIT.

Figure 20 Overlapping Domains Conceptual Framework

My motivation for returning to UMD was to design research that can be used to redesign and automate a key radiology component of VistA, considered to be the most important medical computer system in history. The overlapping information science domains steered my first two years at the University of Maryland iSchool. Initially, my action research focus was on studying the VistA radiology paper workflow and releasing an automated tool that works with VistA. This research would have been used to understand the process involved in measuring the design, development, and potential introduction of a new automated tool into the VA's radiology department.

Radiology Information System Context Mapping

At one time, VistA was generally recognized as the most integrated and best Electronic Health Record system in the world (Longman, 2007). VistA currently provides each veteran with a digital medical record; this has improved the quality of care, patient safety, and patient and provider satisfaction, and brought about lower costs. As open source software (OSS), it has value for the global healthcare community (VistA Modernization Report, 2010). However, as one of the US government's oldest legacy information technology systems, VistA must be updated and modernized for the VA to continue to meet the needs of the veteran community.

The overlapping domains shown in the figure above illustrate the major radiology and technology influences that need to be addressed by this research. Radiologists review clinician orders for advanced diagnostic imaging exams and assign specific protocol instructions that direct how each exam must be acquired. Performance of this department function can impact patient safety, quality of care, and productivity, yet its importance is often undervalued and not automated. This protocoling is predominantly a paper-based manual process at VHA facilities nationwide.

Paper processes have inherent shortcomings. Lost and duplicated exam requests negatively impact efficiency. Information necessary for optimized protocol selection can be missing from paper processes, and if it is not available on paper, this information can be cumbersome to obtain when the data is stored in disparate health information repositories. This lack of information availability negatively impacts the quality of care.

Figure 21 RAPTOR Context Map

Cousins and Robey (2005) examine how patterns of technology use are shaped by the context of use, and how these patterns affect individuals. Recordable electronic transactions assure responsibility and authentication of documentation. The provision of secure provider communication protects patient privacy. Electronic emulators of paper processes are at risk of providing non-optimized functionality and falling short of efficiency and quality targets if insufficient system interoperability is achieved. Diverse health system requirements, including consent for contrast agents, application of conscious sedation protocols, and documentation of order changes, can be automated within an optimized electronic dashboard solution.
Utilization of open standards, open source architecture, and tools could result in improved reliability and functionality, facilitate maintainable extensions, and minimize ownership costs and development time.

Open Source Software Strategy

What is an appropriate strategy for modernizing VistA and transitioning it to a more current and innovative architecture? "When you look at the big trends in the IT industry, open source is used everywhere. In fact, some of the most successful mega IT systems have a significant open source component," said Dr. Seong Mun, CEO of OSEHRA (Healthcare IT News, 2017). I have experience with several diverse software strategies at the VA, but when developing RAPTOR, I chose the open-source software strategy. As shown on my timeline (in the introduction), my initial OI&T experience with the VA was as a legacy VistA developer. I was directed by VA employees (both technical and clinical) to develop new functionality in a waterfall methodology to meet the VA's specific requirements. My next experience was as a PCS software architect assisting with the integration of COTS software for teleradiology and PACS. COTS software was evaluated and certified based on performance and interface standards (Henderson, Dayhoff, & Casertano, 2010). However, other software methodologies, including agile and open source, offer a way forward to modernize VistA.

Around ten years ago, in 2010, the VA Modernization Report provided a vision of VA open source development. This vision (VistA Modernization Report, p. 7) foretold "a state of the art, open source medical application development environment with a comprehensive suite of extensible components and functional applications provided by VA, entrepreneurs, university researchers, commercial medical and non-medical products companies, national health services, etc. with a superset of the functionality in today's VistA system".

VA Assistant Secretary Roger Baker described the reasoning behind the VA's open-source software policy in 2010: "I just think we're going to move VistA innovation forward much more quickly if we go the open-source route" (Fierce Government website Q&A quoting VA Asst. Sec. Roger Baker, 2010). He added, "... how do we then get back to moving the innovation forward in VistA, and that's really what the whole open source campaign is all about. Medical records systems have moved forward a tremendous amount in the United States since the time that VistA was started. And the private sector is doing a lot of stuff that we need to be able to incorporate into VistA. So, our thought is that by being part of an open-source community based around VistA (OSEHRA), the VA can encourage private sector folks to either directly contribute the open-source ... you know, make improvements. Or integrate their products with the open source, so we can very easily buy a working product, instead of having to go down the government route."

Assistant Secretary Baker's reasoning is that ancillary system integration would be cheaper with open source systems. Baker said, "I believe we've got to go the open-source route ... we have two important projects to integrate private-sector packages into VistA going on inside the government right now ... one is for laboratory and one is for pharmacy. Both of those projects are going on five years, to integrate the private sector product into VistA because we are doing it the government way. That is far too long.
We need to be able to go out and say, 'I'm interested in a pharmacy package, in six months I'm going to buy one that I prefer, from all the ones integrated with the open-source' ... let us go. And when an organization like VA says it is going to buy, that could be 200 or 300 million dollars. So, you know, generating the private-sector interest in it" (ibid.).

This section shows that the software industry in general, the VA, and the RAPTOR project are targeting an open-source software strategy to maximize resources and maintain technical currency. In the private-public-academic partnership section, I will expand more on the specific benefits of open source software, but next, I will discuss several challenges with this strategy.

Open Source Challenges

As RAPTOR was designed initially for the VA, it is important that the software satisfies the VA's unique requirements to ensure interoperability with other VA systems and data provenance. This requires many diverse and atypical domains to be mastered. The application development requires a deep understanding of VistA's patient data. It also requires an expert understanding of advanced imaging protocols to display and input premedication regimens, such as contrast administration and radiation dose. This domain is reflected in the contraindication rules engine. Additional expert knowledge of the data utilized by radiology departments and Radiology Information Systems (RIS) is essential. Insight into workflow optimization, efficiency measures, and quality feedback loops to users is required.

The two figures below show the current VistA radiology protocol workflow. The applications are shown at the top, the users are labeled in the arrows, and the steps are listed at the bottom of each figure.

Figure 22 The Physical Representation of VA Radiology Protocol Workflow: Part One

Figure 23 The Physical Representation of VA Radiology Protocol Workflow: Part Two

Open source agile application development requires self-contained infrastructure and tools and participation in the testing certification conducted by the Open Source Electronic Health Record Alliance (OSEHRA) (Ito, 2016). RAPTOR has been certified at OSEHRA Level 2 (which is more rigorous than legacy code). OSEHRA certification criteria comprise eight categories, including code review, documentation, and testing (OSEHRA Certification Standards document, 2016). As an example of anti-disciplinary thinking, IT design, development, and testing require a deep understanding of software design, content management systems, knowledge algorithms, web services, data visualization, and security. This project requires an understanding of the customer and the customer's environment. Expertise with the VA, including national and local executives, administrators, and radiologists, and especially with its mission to support our nation's veterans, is critical to getting the project completed.

RAPTOR Literature Review

This literature review will survey and unify several diverse strands of research. It serves to verify and validate that the approaches I have taken to evaluate the root causes of innovation failure are aligned with evidence-based practices and theory within the field of information systems. Interdisciplinary research strengthens the project's foundation and shows where it fits within the larger scientific community. I have grouped the literature and bibliography according to the categories within iSchool studies.
Thus, theory from Computational Archival Science (CAS), Organizational Innovation and Knowledge Management (KM), Human-Computer Interaction (HCI) in Radiology Information Systems (RIS), Electronic Health Records (EHR) information systems, and the data visualization literature will inform the analytical framework.

Computational Archival Science Literature Review

UMD's Digital Curation Innovation Center (DCIC) Computational Archival Science (CAS) workshop was my first introduction to this unique interdisciplinary approach to information studies (Marciano, 2016). The gateway seminars highlighted a diverse combination of CAS concepts and new approaches to research. They clarified the social justice record archiving and management work being researched in the DCIC and sparked ideas within my own areas of interest, while simultaneously helping me learn about the array of topics that can be explored in information studies. Lemieux (2016) describes practical examples of telling a story with documents. I acknowledge that there is a significant responsibility in telling my story, as my research determines whose stories are told and how their importance is weighted. Marciano discussed combining data from multiple sources to gather insights across diverse information sets. Often, archives are not designed to cultivate this cross-domain thinking, so Marciano advocates restructuring them into what he refers to as a data observatory, in which one can borrow patterns of thought from computation to organize them into levels.

In the gateway seminars, I researched and presented a medical imaging data curation case study (Kuzmak, 2013). This Department of Defense (DOD) VA case study looked at data curation issues, including public-private partnerships, chains of custody, trust, information retrieval and access, archive retention strategy, provenance, and standardization. The case study investigated streamlining the importation of DOD Digital Imaging and Communications in Medicine (DICOM) studies into the VA VistA Imaging PACS (Picture Archiving and Communications System) (Kuzmak, 2012). Understanding users in their environment is inherently interdisciplinary. Marciano (CAS Symposium, 2016) notes that CAS is an interdisciplinary research field, where the importance of humanity must be addressed in large data sets and in digital curation and interface design. My systems engineering background provides a foundational understanding of the semantic interoperability of Electronic Health Records (EHRs). EHRs contain millions of records and patients' longitudinal histories.

Organizational Innovation and Knowledge Management Literature Review

Knowledge Management (KM) is defined as a systematic process for gathering, organizing, and communicating both tacit and explicit organizational knowledge that can be used by stakeholders (Schultze & Leidner, 2002; Alavi, 2005; Massey & Montoya-Weiss, 2006). To implement and make full use of knowledge, an organization must have a clear understanding of how knowledge is formed, disseminated, and applied (Ipe, 2003; Hooff & Huysman, 2009). A systems-based approach to understanding the VA radiology workflow includes understanding the radiology community of practice, protocoling, VA management, and organizational and environmental issues. I applied ethnographic research to VA organization and structure, individual accountabilities, and key collaborations. I researched how grounded the VACI is in the realities of its stakeholders.
Figure 24 SWOT Analysis
Strengths: repeatable; well-understood workflow.
Weaknesses: manual; paper-based; wasted resources.
Opportunities: automation; improved workflow; avoided duplication.
Threats: patient safety; no commercial product available.

The KM tool shown in the figure above is the application of SWOT analysis. Srikantaiah (2008, p. 19) recommends SWOT analysis as an excellent tool for organizational planning. SWOT provides an analysis of strengths, weaknesses, opportunities, and threats, and it is an effective way of examining those four areas to strengthen the organization. SWOT analysis is used to provide decision support for a project by evaluating the probability and level of both reward and risk of failure. It provides a blueprint pointing out where the organization is strong and where opportunities exist to capitalize on those strengths. The analysis also reveals which areas in the organization are weak and need to be addressed to improve the existing condition. Finally, the analysis cautions the organization about the threats to watch out for, pointing out what needs to be done in order to sustain itself.

The VA radiology workflow business process discussed above reveals significant problems with the protocol library and collaboration. A lack of collaboration is often a liability associated with specialization (Biancani, 2014). VA radiologists frequently do not receive enough information on exam requisitions to optimize the quality and safety of their protocol decisions. Efforts to augment the clinical detail provided by the ordering provider can be cumbersome and negatively impact radiologist productivity and department efficiency. Issues associated with collaboration include:

a) inefficient paper-based processes;
b) lost paperwork;
c) duplication of paperwork (and effort);
d) potential for vague documentation of responsibility;
e) difficulty finding and engaging a subject matter expert (SME); and
f) no ability to track real-time patient information.

Biancani (2014) noted that accessing a digital library is influenced by selection efficiency. For RAPTOR, this knowledge is coded into a document that is currently paper-based. Information diffuses as VA radiologists develop techniques to overcome the barriers, which are influenced by local contexts.

There are two types of data, patient and protocol, and each requires a different KM strategy (Hansen, 1999). The protocol library requires a codification strategy based on explicit knowledge in available repositories (Hansen, 2005). Protocol knowledge is carefully codified and stored in databases, where it can be accessed and used easily by radiologists. While this data is all already available in one form or another to most radiologists, it is not enough for it to simply be available. Data must be prepared in a logical and consistent manner to allow for its orderly assessment. Many hospital systems are plagued by multiple independent computer systems that barely interconnect. For a busy practitioner, this could result in an incomplete review of the data before decision-making occurs. This is not necessarily due to the information being unavailable or to information overload, but rather that the data is not in the right place at the time a decision is made. If a practitioner must open a new application, log in, enter a patient identifier, select a subject, select a test, and wait for each of the accompanying windows, usage may be inconsistent at best (Lin, 2005).
Anatomic and modality knowledge is closely tied to the person who is an SME on an acquisition modality, and this is shared mainly through direct person-to-person contact. The collaboration requires a personalization strategy, as knowledge is shared among radiologists. Based on the problems I have already discussed associated with digital libraries and collaboration, I will define concrete knowledge management actions that I recommend the VA adopt. I will explain my choice of strategic action and the likely value that its implementation will create for the organization.

Each VA hospital has variable procedures contained in a protocol. These are documented in a protocol notebook, in which radiologists maintain a set of official protocols for their site on paper. The protocol documents are explicit and describe the best practices of the imaging department for various combinations of patient and imaging factors. There can be any number of protocol documents, and they vary in content and standards from site to site. The KM digital library application captures the protocol notebook content for users and the system to access as needed. There are several aspects to the protocol library content. When considering bases of explicit knowledge saved in electronic format, the taxonomy of modality, protocol, and template values is used and directly connected with the body of metadata used to define, identify, point to, describe, and characterize the contents of that knowledge base. This taxonomy is shown in the table below.

Table 7 Aspects of Protocol Library Content

Aspect | Value Type | Description
Raw Protocol Document Image | Scanned document | A PDF containing a scanned image of the original paper-based protocol document.
Protocol Matching Values | Programmatically accessible field data | Key information about the protocol stored in a format that the program can index and use to match operations, e.g., modality (such as CT, MR, etc.) and weighted keywords (such as "Head", "Neck", etc.).
Protocol Administrative Metadata | Administrative information | Information to tell the system whether the protocol is still active and when it was introduced into the library.
Protocol Template Values | Programmatically accessible candidate values | When the radiologist selects a protocol, the system knows to propose the input values associated with the selected protocol. These are values that have already been identified for the pre-population of fields when this protocol is selected (e.g., 50cc H2O).

One simple way to improve collaboration is to ask subject matter experts to self-identify their specialties. Radiologists can identify their modality and anatomical expertise. When a difficult case is presented, a radiologist can request collaboration with an expert. This is shown in the RAPTOR request collaboration screenshot in the figure below.

Figure 25 Request Collaboration

HCI in Radiology, Imaging Informatics, Information Science, Information Technology, and Workflow Analysis Literature Review

My research requires an understanding of human-computer interaction (HCI). HCI is a subset of Human-Centered Design (HCD) that is specific to electronics, computers, and digital media. On Facebook and Twitter, the VACI wrote (Facebook, July 19, 2019) that "human-centered design (HCD) is the bedrock principle of the Innovators Network". The study of HCI is important, not only to understand the possibilities of computer automation but also to understand how humans behave and understand technology.
Current HCI research (Barab, 2004; Fry, 2007; Munzner, 2014) explores the design and development problems of systems built with no understanding of human factors considerations. Through my literature review, I found a rich history of healthcare information technology at the UMD HCI lab. I am aware of a specific example of UMD HCI research that eventually became a commercial radiology workflow product. Wongsuphasawat et al. (2011) discuss a UMD HCIL tool called LifeFlow, which is used to visualize an overview of event sequences at hospitals. It summarizes all possible sequences and highlights the temporal spacing of the events within sequences of patient events. I am interested in the EHR domain, as I have worked in this area for more than fifteen years and have worked with some of the world's largest healthcare organizations, including the US DOD and the VA. Some of the LifeFlow technology was acquired by the Microsoft Health Solutions Group (MHG), where it evolved and was rebranded, first as Azyxii and later as Amalga. I met with Eric Weaver of the MHG in 2011 to discuss, at a high level, how RAPTOR and Amalga could be integrated with one another. The story of this HCI technology takes an opposing journey to that of RAPTOR: Patternfinder evolved into proprietary Microsoft software, while RAPTOR remains open source. Amalga was marketed as a universal PACS.

Siegel (1998, 2004) documented some of the earliest radiology surveys that measured productivity changes from introducing healthcare IT, including PACS and hospital/radiology information systems (HIS/RIS). The 2004 survey found an initial 10.8% drop in productivity during the first year of PACS implementation, followed by a 27.8% increase in productivity beyond year one. This suggests there is a "learning curve" phenomenon that should be considered when institutions are planning for automation implementation. This is an important point: a new technology introduction may produce mixed results. My qualitative research will investigate users' perception that new technology may not be welcomed due to the fear of a learning curve and loss of productivity.

Burton-Jones and Grange (2013) claim that state representation is the idealized model of all IT systems. Representation theory (RT) states that an information system is made up of several structures that serve to represent some part of the world that a user and other stakeholders must understand. I propose applying representation theory to this study. Using RT, RAPTOR developed a radiology protocol workflow state diagram, shown below. These states (active, approved, and complete) are familiar VistA radiology terminology (VistA Radiology, 2013). RT describes the protocoling model that is coded inside RAPTOR.

Figure 26 Radiology workflow state diagram
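As a minimal sketch of the state diagram in Figure 26, the code below encodes the three VistA radiology states named above and rejects illegal transitions. The transition rules are my simplified reading of the protocol workflow; RAPTOR's internal representation may differ.

```python
# A minimal sketch of the protocol workflow state diagram (Figure 26),
# using the VistA radiology states named above. Transition rules are a
# simplified assumption, not RAPTOR's actual implementation.

VALID_TRANSITIONS = {
    "active":   {"approved"},  # radiologist assigns a protocol
    "approved": {"complete"},  # exam is performed and finalized
    "complete": set(),         # terminal state
}

class ProtocolOrder:
    def __init__(self):
        self.state = "active"  # every new order starts as an active request

    def advance(self, new_state: str) -> None:
        """Move to new_state only if the state diagram allows it."""
        if new_state not in VALID_TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition: {self.state} -> {new_state}")
        self.state = new_state

order = ProtocolOrder()
order.advance("approved")
order.advance("complete")
print(order.state)  # complete
```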
Gassert et al. (2014) studied a specific case at the University of Colorado Hospital, where Interventional Radiology (IR) was recently introduced. This article investigates the IR workflow process. The authors looked at the paper process and then created a web form. I became aware of this article through a Google Scholar search, as it references the RAPTOR Journal of Digital Imaging (JDI) article (Medverd, Cross, Font, & Casertano, 2013), and Gassert writes that RAPTOR validates their work: "In the meantime, an electronic protocol workflow for cross-sectional imaging was designed and implemented in diagnostic radiology, an effort that has been undertaken elsewhere, as well (Medverd, 2013)".

I used this article to compare the University of Colorado protocol templates from Epic and RAPTOR. Both designs have a grid-based data form that is based on the physical paper form. As each is based upon an existing paper form, there is consistency and affordance. The user is familiar with the placement of the data and knows that the header data is constrained. In RAPTOR, data in yellow alerts the user to risk, as yellow signifies caution. Tabs and links invite the user to view additional information, mapping to the existing radiology workflow actions. The main idea behind grid-based designs is that they create solid visual and structural balance in web applications. Sophisticated layout structures offer more flexibility and enhance the visual experience of visitors. In fact, users can more easily follow the consistency of the page, while developers can update the layout in a well-thought-out, consistent way.

Morgan et al. (2009) investigated the development of a radiology clinical dashboard, evaluating its effects on report turnaround time and reporting users' impressions of their workflow. UPMC aims to be efficient by reducing the inefficiencies associated with the current paper-based, manual processes and supplanting the use of fax and scanning technology. Automation can prevent avoidable duplicate radiology studies, improve the traceability of records within radiology, improve cost savings related to improved regulatory compliance, and improve QA/QC feedback and training. Report turnaround time (RTAT) monitoring will result in prioritization alerts that enable timely responses to clinical alerts and prevent avoidable clinical errors. It will also result in a customizable workflow and improved safety checks.

Morgan (1998) proposes using mixed methods research, in which the investigator collects and analyzes the data and integrates both qualitative and quantitative findings. Combining both quantitative and qualitative data will assist me in calibrating the findings of both approaches. Results in both areas focus on different aspects but are nonetheless complementary and lead to a more complete picture. This mixed methods research is characterized by the collection and analysis of quantitative data followed by a collection and analysis of qualitative data. The purpose of this research method is to use qualitative results to assist in explaining and interpreting the findings of a quantitative study.

The authors' theory on the impact of the radiology dashboard on efficiency and effectiveness has been supported by the data. This research captured the current manual IR workflow and transitioned it into an electronic process. The results now estimate that automation has improved department efficiency by 24%. The average turnaround time initially increased from 22.5 hours to 24.3 hours. After additional modifications were made, the time was reduced to 17.7 hours. The authors have integrated a reporting system with a Radiology Information System (RIS). As a result, radiologists can learn the outcomes of their patients with much less effort. The authors intend that this tool be used to aid radiologists and to increase the efficiency of both teaching and research. This study includes data compiled by radiologists who hope to develop the system into a platform for the systematic, continuous, quantitative monitoring of performance in radiology.
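A quick arithmetic check of the turnaround-time figures reported above is shown below; note that the study's 24% department-efficiency estimate is a separate measure and is not derived from these three numbers.

```python
# A quick check of the turnaround-time figures reported above (hours).
# The 24% efficiency figure is a separate estimate from the study.

baseline, after_rollout, after_fixes = 22.5, 24.3, 17.7

initial_rise = (after_rollout - baseline) / baseline * 100
net_drop = (baseline - after_fixes) / baseline * 100

print(f"initial increase: {initial_rise:.1f}%")  # ~8.0% worse at first
print(f"net reduction:    {net_drop:.1f}%")      # ~21.3% better after modifications
```

The pattern matches Siegel's "learning curve" observation discussed earlier: turnaround first worsened after rollout and improved only after additional modifications.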
Morgan et al. (2008, p. 57) discuss a distinction between HIT and other HCI implementations. One difference is that "an inaccurate dashboard is worse than no dashboard". A clinical decision support system has no value if users cannot trust the information. Moreover, if software errors are encountered, it may be difficult to overcome these first impressions. Some may say (Morgan et al., 2006, 2008, 2011) that these findings contradict the agile process, where system improvement is part of the process. The contradiction noted in Morgan (2006, 2008, 2011) between "perfect on arrival" and "constant refining" resulted in conflict between the agile process, user acceptance testing, and the additional round of development noted in the 2008 article. This is the difference between the perspectives of a software developer and a radiologist. This question is worthy of further investigation.

Alkasab (2013) created a web-based application that allows radiologists to create and maintain personal databases of cases of interest. This tool integrates with existing information systems to minimize manual input, such that radiologists can quickly flag cases for further follow-up without interrupting their clinical work. This research integrated the case-tracking system with an electronic medical record aggregation and search tool. As a result, radiologists can learn the outcomes of their patients with much less effort. The aim of this tool was to aid radiologists in their own personal quality improvement and to increase the efficiency of both teaching and research. The study includes data compiled by radiologists who hope to develop the system into a platform for systematic, continuous, quantitative performance monitoring. It also highlights the Quality Assurance/Quality Control (QA/QC) aspect of radiology workflow. The researchers created a follow-up tool to track outcomes. RAPTOR expanded on this concept by including a QA mode, which is presented to users once the exam has been completed.

Data Visualization Literature Review

Plaisant (2004), of the UMD Human-Computer Interaction Lab, surveyed the data visualization literature to uncover challenges to information visualization evaluation. Usability testing and controlled experiments remain the backbone of evaluation. She found four thematic areas of evaluation:

1. Controlled experiments comparing design elements.
2. Usability evaluation of a tool. These studies matched tools with users, tasks, and real-world problems.
3. Controlled experiments comparing two or more tools. This is a common type of study; such studies usually try to compare a novel technique with the state of the art.
4. Case studies of tools in realistic settings. The advantage of case studies is that they report on users in their natural environment doing real tasks, demonstrating feasibility and in-context usefulness. The disadvantage is that they are time-consuming to conduct, and results may not be replicable or generalizable.

Plaisant (2004) then discusses three possible first steps to improve information visualization evaluation and facilitate adoption: the development of data and task repositories; the gathering of case studies and success stories; and the strengthening of the role of toolkits. This article bridges the gap between data visualization and HCI. The HCI concepts of case studies and usability testing made sense for tool evaluation. While Plaisant noted the challenges of these evaluation methods, she also noted their value.
When it comes to visual memory and attention, people have two different memory categories, short-term and long-term. Consider the human cognition of RAPTOR. The UI design has been described as "nice and clean" by the Chief Radiologist of the VA (personal correspondence with Dr. Charles Anderson), with minimal clutter and clear navigation. The grid-based data has balance, as it is based on a VA standard paper form. This form is shown in the figure below.

Figure 27 Example of VA Form 519a

The VA 519a form illustrates the limitations of the current paper-based processes. It is manually printed and then passed by hand through an extensive radiographic workflow. The diagnostic order is entered in the Computerized Patient Record System (CPRS) by the provider. An administrator prints out the order. This order is then assigned to a radiologist, typically randomly and infrequently based on a specialty. The radiologist may have a resident who will act on their behalf; the resident or the radiologist will review the order, review the patient's history and current condition, and assign a protocol. This is often documented on the VA 519a in handwritten, often illegible notes. The figure below shows a comparison between the form and the initial RAPTOR conception.

Figure 28 Automating the VA Paper Process

As it is based upon an existing well-known form, there is consistency and affordance. The user is familiar with the placement of the data. The user has a long-term visual memory, the designers can predict where viewers will focus their attention, and each data element is consistent. For short-term attention, eyes beat memory. The data highlighted in the yellow block alerts the user to risk, as yellow signifies caution. This risk captures the user's immediate attention, and the details of the risk are available below, on demand. RAPTOR provides both data reduction and navigation. The UI design is based on the grid conceptual model. The main idea behind grid-based designs (Rogers, Sharp, & Preece, Section 2.3.2) is that they create solid visual and structural balance in web applications. Sophisticated layout structures offer more flexibility and enhance the visual experience of visitors. In fact, users can more easily understand the page, while developers can update the layout in a well-thought-out, consistent way.

Wideman & Gallet (2006, p. 29) concluded that it is possible to quantitatively study radiological workflow across multiple sites. This study assumed normal distributions for the times associated with each workflow activity and included a table of specific activities, times, and numbers of events. The study performed a statistical analysis of the workflow. In the discussion, the authors focused on a single site due to variability issues. They also did not include technologist data, although they had originally planned to cover multiple sites and roles. The study noted that the actual times of specific activities would be longer than measured, as they include patient-generated activities. The use of a straight-line fit to the data points is effective for visualizing exam-time improvement, but visually limiting. In the initial Ph.D. action research phase of my work, I identified Wideman & Gallet's quantitative study as a benchmark to measure the RAPTOR project against, and overcoming the limitations of this study is a worthwhile research opportunity. My initial hope was to include multiple variables in addition to the average exam time.
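The sketch below illustrates Wideman & Gallet's modeling assumption that each workflow activity's duration is normally distributed and that total exam time is their sum. The activity names, means, and standard deviations are invented placeholders, not values from the study.

```python
# A small sketch of the modeling assumption in Wideman & Gallet (2006):
# activity times drawn from normal distributions, summed per exam.
# All numbers below are hypothetical placeholders.

import random

ACTIVITIES = {                 # minutes: (mean, standard deviation)
    "order review": (5.0, 1.5),
    "protocoling": (8.0, 3.0),
    "acquisition": (25.0, 6.0),
    "QA/QC": (4.0, 1.0),
}

def simulate_exam() -> float:
    """One simulated exam: sum of normally distributed activity times."""
    return sum(max(0.0, random.gauss(mu, sigma)) for mu, sigma in ACTIVITIES.values())

times = [simulate_exam() for _ in range(1000)]
print(f"mean exam time: {sum(times) / len(times):.1f} minutes")
```

Extending such a simulation to multiple variables and sites, rather than a single average exam time, is exactly the kind of opportunity noted above.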
When visualizing large amounts of data, analysts may have difficulty finding interesting data points. If the analyst does not have a good feel for the data and its distribution, many queries may be needed to find interesting data sets. The result of most queries will contain either less or more data than expected; a query may even return nothing at all. Keim (1993) discusses the desired ability to control the process of query specification. Improvements to graphical processing, user interfaces, and data visualization software in the more than 25 years since that article was written have allowed analysts to improve query specification using visual feedback. When planning for RAPTOR implementation, I collected workflow data from multiple sites. This data would have improved the software designers' understanding of the efficiency and effectiveness issues that would have arisen had RAPTOR been clinically introduced and used.

Chapter 3: Methodology

Overview of Methods

In this methods chapter, I give an overview of my narrative research approach and a justification for its use. I use the autoethnographic method to answer the three research questions. The figure below provides an overview of the methods chapter.

Figure 29 Mapping of Methods Chapter

Research approach

My research approach relies on narrative and an analytical investigation into the events that occurred around me and my feelings about the outcome. My beliefs about my work and the project outcome are entirely my own. The exact definitions of innovation, success, and failure can be debated and are situational, as I believe there are multiple views of reality. My research relies on my views, those of other stakeholders, and the accepted theory that I have found most useful in understanding the work of the VA. Studying VACI in detail requires a long timeline, as it has changed direction at least four times in ten years; I therefore research the organization since its creation. I draw upon my thirty years of direct experience in the software industry and my direct experience with RAPTOR. In this research, I focus on DevOps innovation at the VACI from 2010 to 2018. This period covers my complete experience with the RAPTOR project, from planning through the prototype stage, through User Acceptance Testing (UAT), and right up to release. This direct and immersive experience in the end-to-end DevOps lifecycle, gained over fifteen years of working with the VA and seven years developing RAPTOR, lends relevance and credibility to this research. A timeline of activities important to this research is included at the end of this chapter.

For this research, convictions and personal views count. This epistemology is open to different interpretations depending on opinions, internal organizational beliefs, and gut feeling. As a scholar, I describe the insider issues and the culture of an organization, both of which are not well known. The figure below lists my research approach and the ontological, epistemological, and methodological assumptions that underpin the research. It also outlines the inductive reasoning and broader research design. I characterize my epistemology as what I believe as seen through my experiences, within the culture and surroundings of my work. In my research I have taken a relativist perspective: I acknowledge that multiple people have multiple realities and take away multiple meanings from the same event.
An example of this is how one could define HIT failure: VACI could have defined RAPTOR as a success, as the software was delivered on time and on budget and successfully passed UAT. However, the VA radiology community could define the project as a failure, as the VACI was unable to put it into use.

I surveyed the available methods and eventually settled on autoethnography because of the requirements of the research. I use a bottom-up approach with a long timeline of data collection, and I research the project in great depth. My approach is highly inductive. In induction, the researcher is free to change an approach based on emerging considerations. I use the findings to build theory and conclusions rather than to prove existing theory. Inductive reasoning is less structured than deductive reasoning, as there is no guiding theory.

Table 8 Summary of my research approach

Approach: Theoretical stance
Ontology (my beliefs about my situation): There are multiple levels of reality (relativist)
Epistemology (how I come to know the world): Meaning is culturally defined
Methodology: Qualitative
Design: Autoethnography
Reasoning emphasis: Inductive
Methods (techniques for collecting data): Document analysis, interviews, and observation to get multiple perspectives
Analysis: How the data are processed and contextualized in order to answer my research questions

Autoethnography Methodology

I define my methodology as an investigation into my own situation. Stringer (2015) uses the word "practitioners" to describe researchers, implying that those engaged in an activity are well situated for an investigation. As my evaluation focuses on the things that matter to VA stakeholders, I have conducted my research by examining the interactions between people, processes, and technology (PPT) as understood and judged (Greenwood, 2006) from inside the innovation program or software development activity.

Autoethnography is a research approach that systematically describes a personal experience in order to understand the cultural experience (Ellis, 2016). It is a qualitative method to systematically analyze (graphy) personal experience (auto) in order to understand a cultural experience (ethnos). Autoethnography is a context-conscious, qualitative research methodology that incorporates deep descriptions of evidence and personal reflection (Reed-Danahay, 2009). Jackson and Mazzei (2008) describe the autoethnographic process as a way of truth-telling and obtaining closure through research and writing. Ellis describes scholars who use the autoethnographic method as wanting to better understand the world we live in and change it for the better. LeCornu (2005) notes the importance of "focusing on the relationship between reflection and learning and highlighting the dimension of personal growth through the concept of internalization."

Figure 30 The autoethnographic research process

As shown in the self-created figure above, based on both the Kolb learning cycle (1984) and the pastoral cycle (Lartey, 2000), the autoethnographic research process can be modeled as a feedback loop. I created this process model when assessing this methodology. I found that, with discipline, one dynamic activity leads to another. The figure illustrates how the storytelling autoethnographic method places the narrator at the center of the narrative. Data collection, management, analysis, and interpretation are a dynamic process.
For example, I have been going through past correspondence and other documents and recollecting past experiences in a structured data collection harvest. Some important data is used, depending on the story I am telling, while other data, perhaps less important to my narrative, is not. Evaluating certain agency and project activities based on my narrative is an analytical and interpretative activity (Chang, 2008). As my story takes shape, I use adaptive theory to continuously examine the validity of my data collection criteria and to shape my findings, analysis, and interpretation. I use the organizational innovation theory detailed in the literature review to help shape my data collection.

I chose autoethnography to describe my work examining the VA since 2002. I use my thirty-two years of technology experience as the basis for my autoethnographic approach. This includes my fifteen years at the VA and three additional years at the Department of Defense (DOD) Military Health Systems (MHS). I have worked on many successful and unsuccessful projects at the VA and the DOD, including their Electronic Health Records (EHRs). My unique observations originate from my diverse roles in each of the three parts of the VA, as shown in the timeline table in the introduction. At the VA Office of Information & Technology (OI&T), I was a VistA solutions architect for the imaging software. At VA Patient Care Services (PCS), I was an enterprise architect supporting the Chief Radiologist. With VACI, I was the RAPTOR project innovator, designer, developer, and manager. My years working with VACI on RAPTOR are the focus of this research, although I bring in other software and innovation experiences as counterexamples.

At the outset of the research, I opted for a social constructivist approach and action research, but over time I transitioned to, and settled on, an autoethnographic approach. This was because of the advantages it brought, particularly the greater level of insight, accessibility, and academic rigor. However, it does bring challenges, including how to avoid bias, an important consideration given my familiarity with the data and the fact that the data is, in effect, collected, interpreted, and analyzed by just one person. This called for rigorous self-analysis and corroborative data collection.

Initially, I was prepared to evaluate the effectiveness and efficiency of the VA radiology workflow. My original approach was to first measure the current manual, error-prone paper workflow and then measure it again using RAPTOR (Casertano, 2018). However, as RAPTOR is designed to work with the now-canceled VistA, the cancelation forced me to change my initial scope, from understanding the use of the tool to researching the root causes of innovation success and failure at the VACI. As RAPTOR is part of the VistA ecosystem, when VistA goes, so too does RAPTOR.

Advantages and challenges of the autoethnographic method

Drawing on the work of Costello (2016), I adapted the following three tables to help visualize the advantages and challenges of the autoethnographic method. The advantages of the autoethnographic method were an important consideration in the evolution of my methods from action research to autoethnography.
Table 9 The Advantages of Autoethnography

Offers a new perspective: I have found that the current academic literature does not address the organization, innovation, and technology issues that I experienced in my project. Some scholars are raising doubts about the value of IT theory to explain actual practice. Autoethnography is suitable for new organizational forms of research, inquiry, and practice. (Grover & Lyytinen, 2015; Ellis et al., 2016; Anderson, 2006)

Generates depth of insight: My personal introspective account provides rich insights into various technical and human elements of the VA organization and its cultural environment. (Klein & Rowe, 2008)

Accessible: When action research was made impracticable by the canceling of VistA, autoethnography provided a research method to make sense of what happened. (O'Riordan, 2014; Chang, 2016)

Analytic rigor: An increase in the use of autoethnography has increased analytic rigor and opened new approaches. (O'Riordan, 2014; Anderson, 2006)

Sense of self: Autoethnography is suitable for expressing a sense of self. The construct of identity has been used in information systems; this research is an extension of that expression of one's role, group, and personal identities. (Carter et al., 2017)

Based on Costello (2016) and the references listed, the challenges of the autoethnographic method are shown in the table below. These challenges include ethical issues, which are important when the same person is involved in primary data collection and analysis. Only with reflective self-analysis and corroborative data collection can rigor be achieved. The table shows that, in my research, the autoethnographic method has advantages and disadvantages that fit my situation. I will be alert to the additional challenges shown below.

Table 10 Autoethnography Disadvantages

Issues with data collection: It can be difficult to get information from inside the organization. I can overcome this by using my insider position and years of experience to collect significant data. Internal VA documents are accessible with FOIA requests. (Anderson, 2006)

Ethical challenges: Autoethnography risks intruding on the lives of others. I can overcome this challenge by communicating my findings with key stakeholders for confirmation, and by preventing or minimizing bias, a self-indulgent lens, with analytic rigor. (Ellis, 2007; Delamont, 2007; Shilton, 2018)

Difficult to evaluate: It can be difficult to acquire and contextualize useful data. I can overcome this limitation by focusing on the RAPTOR case, since I am immersed in its details. (O'Riordan, 2014)

Justifying my choice of research method, the table below offers a comparison between action research and autoethnography.
Table 11 Comparing Autoethnography to Action Research

Value proposition. Action research: the method develops the research, project, and organization. Autoethnography: the method is the story of self.
Perspective of time. Action research: observation and interpretation of the present condition. Autoethnography: self-observation and interpretation of the past and present conditions.
Relationship to research. Action research: active. Autoethnography: meaningful experience.
Basis for examination. Action research: purposeful artifacts. Autoethnography: self-experience with the organization, culture, and project position.
Epistemological aims. Action research: taking action that produces the desired outcome and seeking the consensus of the community. Autoethnography: experience in the organization, culture, and project that produces an outcome, seeking a truth-lens perspective.
Strategy for growth of knowledge. Action research: evaluating whether actions produced intended outcomes. Autoethnography: reflective analysis that builds on theory and models.

Methodological Evolution

I chose autoethnography as my methodology after several years of reflection as a Ph.D. student. From my initial understanding of methods at UMD, I attempted to find the methodology that would be the best fit. While I always knew that the focal point would be the RAPTOR software application, it took time for me to settle on autoethnography. I started out with a focus on social construction, then action research, and finally autoethnography. Autoethnography is a non-traditional methodology that I had to seek out, based on my circumstances and experience. It was not taught in the three methods classes I completed for my Ph.D.; in fact, I am not sure I was exposed to it at all until after I presented my integrated paper and became a Ph.D. candidate.

Social Constructivist Research Epistemology

My epistemological position (Crotty, 1998) is social constructionist, researching best practices in different communities. My epistemology has been to research social construction communities. My theoretical perspective is phenomenology (Layder, p. 67). I use the theoretical concepts of workflow process to reduce real-world complexities, to render them understandable and predictable, and to enable a reciprocity of perspectives. My real-world (Gray, p. 24) research is an exploration of prevailing expertise via the agile experience. Value is attributed not only to the qualitative improvement of efficiency but also to the design of the process and application.

In my experience and in my research, I have used the power of different communities or ecosystems. In the private-public-academic partnership figure below, I illustrate several important communities that I have drawn on in my open source software development and then researched and discussed in my dissertation. I discuss this partnership throughout my research and in my findings. I have collaborated with a diverse group of stakeholders and used my own years of design experience and research. I have collaborated with subject matter experts in the radiology, VA, and software design communities to understand the current paper-based VA radiology process, to use concepts and code from "best of class" VA software applications, and to apply lessons learned from previous software projects. After reviewing diverse viewpoints in a relativistic orientation, I constructed the concept and design framework. I propose using multiple methods to establish different views. I initially intended to use action research through the agile process and include the opinions and interpretations of participants.
These participants consist of VA radiologists, medical technologists, and VACI executive managers. I rely on qualitative analysis of data to understand the user community's perspective.

Figure 31 Private Public Academic Partnership

As shown in the figures below, I had the opportunity to mentor six different Master of Information Management (MIM) UMD graduate students. This partnership was a "win-win" chance to innovate: it provided a challenging opportunity to learn modern tools, to contribute to an important, award-winning project, to be a contributor in the open-source community, and to engage with and be mentored by the alumni community.

Figure 32 UMD iSchool as part of the Public-Private-Academic Partnership

Figure 33 MIM iSchool poster on automating RAPTOR testing

Figure 34 MIM iSchool poster on RAPTOR data visualization

As noted above, I was initially prepared to evaluate the effectiveness and efficiency of the VA radiology workflow by first measuring the current manual, error-prone paper workflow and then measuring it again while using RAPTOR (Casertano, 2018). The figure below shows my original step in the dissertation journey. It highlights the tool project goals that could have been achieved based on tool utilization.

Figure 35 Original RAPTOR Poster

The poster (Casertano, 2017) I presented at the iSchool showcase in 2017 is shown in the figure above. This poster illustrates my original intent: to deploy RAPTOR and measure the efficiency and effectiveness of the radiology workflow. However, the inability to collect more clinical data forced me to change my initial scope and begin researching the root causes of innovation success and failure at the VACI. Given that RAPTOR is part of the VistA ecosystem, when VistA went under, so did RAPTOR.

Action Research

My second methodology was action research. Action research is defined by Blum (1955) as researching a social problem with a view toward improving the problem. As a VA VistA designer and researcher, I was interested in researching real-world problems with VistA using action research. As an expert, I am intimately familiar with problems in the VA environment. The ethical considerations and validity threats of this insider situation are examined in this research.

The key characteristics of action research are decentralization and cooperation. Action research moves away from generalizable truths to an emphasis on the local context; one-size-fits-all solutions do not work. It also moves away from the conventional rules of research, as objectivity and generalizability must be redefined from tradition. There is no functional distinction between the researcher and the research subjects. People impacted by the situation have a voice and are empowered to help create and carry out the solutions to problems. In the RAPTOR software, radiologists became the designers through agile development. Action research is designed to solve "real world" problems. Formal scientific research does not translate well to the social and behavioral sciences. The social world is dynamic and always changing, so objective generalizable knowledge is often irrelevant to the actual problems research practitioners face. Community-based action research is a democratic, humanizing, and empowering approach to inquiry that aims to solve problems and make a difference in people's lives in a specific way, rather than just producing a report or a paper.
Ito (Weblog, 2016) says that when "the problems are massively complex... it is nearly impossible for us to divide them into existing disciplines". An antidisciplinary project is described by Ito as being more than the sum of its parts. Ito (Weblog, 2014) writes that "interdisciplinary work is when people from different disciplines work together. But antidisciplinary is something very different; it is about working in spaces that simply do not fit into any existing academic discipline, a specific field of study with its own words, frameworks, and methods."

My antidisciplinary viewpoint includes atypical combinations of interdisciplinary domains. This antidisciplinary behavior is exhibited throughout my research for the RAPTOR project. In the initial project review, the director of the VA Center of Innovation (VACI) requested a report (RAPTOR Options Analysis Report, November 2011) back to the Innovation Center on the innovative open source technology we proposed. I have defined the VACI as a semiformal organization (Biancani, 2014, p. 1306). Unlike counterexamples such as the NSA LAS and the MIT Media Lab (Ito, 2014), the VACI is not a physical space or location but a virtual set of projects and contracts.

Stringer (2014) notes the value of phenomenologically focusing on people's actual experiences. Creswell (1994) and Maxwell (2012) highlight the value of communities for their diverse expertise. Community-based action research is a democratic, humanizing, and empowering approach to inquiry. RAPTOR was designed with the input of several communities and is an open-source application. These communities have demonstrated that the open-source model can lead to reliable, predictable, safe, and robust applications, including software such as Apache, Drupal, MySQL, and Linux. OSS development methodologies typically result in high-quality software delivered in less time and at a lower cost than alternative development methods. OSS is in wide use in many businesses and government agencies. The VACI was initially cautious about the proposed open-source research approach. With the VistA Modernization Strategy, the VA proposed that an open-source approach to software development, launched with VA's VistA EHR, provides the best framework to accelerate the rate of innovation, enable more efficient component integration, and create an ecosystem that taps the best of the health care community to create high-quality EHR systems.

New users exhibit technology defamiliarization as they adjust from the paper process to an automated tool. The same protocols that were initially developed by a community of radiologists are now to be loaded as application templates.

Use of Theory to Generalize from Case Studies

As noted in the literature review, the PPT, TAM, TOE, and other theoretical propositions that went into the initial design of this case study form the groundwork for an analytical generalization (Yin, 2014, p. 40) driven by an adaptive approach. I propose that a new generalization (TOE TAG) will emerge from the case study findings. My analytical generalization is based on corroborating and modifying the organizational innovation models discussed in the literature review. The theory generalization will be at a conceptually higher level (Yin, 2014, p. 41) than the case study findings.
My role has evolved from inside VA innovator to outside UMD researcher. Because I have deep insider knowledge and many years of experience with this case study, much of this information is unavailable to anyone but me.

Use of Validity in Research Design

I think about my research design as a "blueprint" (Yin, 2014, p. 45) for my research. The design addresses the research questions I am studying, what data I am collecting, how I present my findings, and how I analyze my results. Yin (2014, p. 45) notes that a research design can be judged for quality according to validity tests. My construct validity rests on having defined my research as innovation at VACI by focusing on the people, process, and technology that affected the RAPTOR case study, and on reviewing the draft research with key participants. Internal validity concerns how I make inferences: my autoethnographic method will "infer" that a project result was dependent on an event, and my use of theory and checking with others will assist with validity. As RAPTOR was a module of VistA, and VACI was the VA's signature approach to open participation, I generalize on how the failure of the approach (open participation) led to the canceling of the VistA software. I have looked at the appropriate models (people, process, and technology, and the two main types of technology adoption theory, TOE and TAM) and have based my design and collection on this theoretical framework. The reliability of my research is based on the history of the VA, its successes and, mainly, its management failures. This history of failure has repeated throughout VA history. Reliability suggests that I could have performed this research on other VA software innovations with similar results. In particular, the VA CIO announced that the agency is pursuing a buy-first strategy, hence all innovation programs will be affected.

Ethical considerations, validity threats, limitations

Twining et al. (2016) point out that an important element of the design of a study relates to ethics. This is particularly critical within autoethnographic research, where data are often personal and are collected from a small number of individual respondents. There are several ethical considerations and validity threats to the study. As a software developer and tester, I am aware of the implications of human-based research with software. Both the Freedom of Information Act (FOIA) review by the VA and the UMD Institutional Review Board (IRB) review provided an independent validation perspective and approval of the data collection. This verification has highlighted the sensitivity of the research topic.

Action research can involve insider observations, and this introduces the potential for bias. Herr and Anderson (2014) noted that insider positions have implications for a study's trustworthiness and may have ethical implications. I chose a research design that controls threats to the validity of the project and improves the project's credibility. Below are the case study tactics of my research designed to mitigate validity threats to my findings and conclusions:

- Deep insider research: I have more than 15 years' experience working with the VA in three different areas of the organization. Prolonged engagement aids validity.
I can distinguish between the innovative open-source software development at VACI with RAPTOR, the internal VistA development I experienced in OI&T, and the commercial development I experienced in PCS.

- Trust: As a Ph.D. candidate, I have discussed my self-interpretation methodologies with stakeholders from the VA who worked with me. I have a deep history with them, have gained their trust, and continue to communicate with them on the details of my research.

- Counterexamples: I have compared my work across three different areas of the organization (VACI, OI&T, and PCS) and summarized their software process differences. I have also compared and contrasted VACI with the NSA, based on collaboration with Dr. Kathleen Vogel. Applying the same research design validates transferability. The counterexample of the NSA innovation lab, a physical (non-virtual) space, removed off-site and with dedicated funding, is a good example of a government-led public-private innovation program.

- Triangulation: The credibility of the study is enhanced by adding multiple sources of information that agree. I have a very large reference list of academic and government publications, as well as news sites and videos.

- Member checking: The plan is to publish and socialize the research. Project participants have fact-checked my research. Several past and current radiologists and software developers have provided excellent feedback.

- Checking for rigor: The rigor of my case study research design, its methods, and the design decisions shapes my approach to theory.

- Rich data content analysis: I have deep experience with the VA organization and understand its history of culture and failure. I know where to find deep insider content on which to base my findings and conclusions. I have been filtering through all the documents and deciding how much to disclose in my writing.

A review of the behavioral integrity (BI) considerations when dealing with HIT failure is a reminder of the sensitivity of the human element of research. There are several ethical effects on the study. As a software developer and tester, I am aware of the implications of minimizing patient risk with software. As noted above, both the FOIA review by the VA and the UMD IRB review provided independent validation and approval of the data collection.

Content Analysis

Content analysis is a widely used qualitative research technique. I have collected data on the VA VistA EHR since 2002. With inductive analysis, the researcher is free to change the approach based on ongoing considerations. I surveyed the available methods and evolved toward autoethnography based on the conditions of the available data. I am using a bottom-up, project-based inductive synthesis approach, looking at a long timeframe, from 2002 to 2018. This timeframe spans several different roles across the VistA and RAPTOR software lifecycles and involves researching the VA in great depth. To reduce the breadth of my large data collection, my synthesis approach is to combine my content analysis and integrate it using the PPT conceptual framework. I will adapt organizational information system theory and build conclusions rather than prove existing theory. Inductive reasoning is also less structured than deductive reasoning, as there is no guiding theory. My contribution is to develop a new model using a conceptual innovation (Strike & Posner, 1983).
I synthesize loosely related information systems phenomena with ethics management in a highly conceptual, intuitive intellectual activity. This synthesis is at the center of the new concept I introduce in the findings.

Data Collection

Collecting autoethnographic data was straightforward. I started working on VA-DOD collaboration in 2002, so my data access starts then. From 2002 to 2012, I was a key contributor to the DoD/VA Image Sharing team and the Joint Services Imaging Working Group. I led a team of DOD engineers that studied VistA and made recommendations for collaboration. I have reviewed most of the data I collected and, based on my research design, have chosen only the data most applicable to my research questions. I have a huge collection of design information on VistA, since I was a member of the software development team. I have had no issues collecting data, but I have had to choose the most representative data to tell my story. I have collected data from several databases associated with the VA, the VACI organization, and RAPTOR over the past ten years. The RAPTOR development data collected include (a brief sketch at the end of this data collection section illustrates how such project records can be summarized):

- The RAPTOR project team maintained Dia project tracking software throughout the project. The Dia output was included in a monthly progress report that was contractually required from the developers to VACI.
- The developers maintained an internal project database in MantisBT, an open-source bug tracking tool written in PHP. This was used internally for project management, to coordinate between developers and to track bugs.
- The development team wrote the functional requirements document.
- The development team wrote the test cases and results.
- The User Acceptance Test Report discussed the User Acceptance Testing process for each iteration of agile functional testing. This includes the three phases of testing:
  1. Viewing the VACI sandbox through a remote VMware view account
  2. Editing access to the initial cloud data center boxes that displayed test data
  3. Editing access to "clinical" UAT cloud data center boxes that displayed "clinical" data
- The developers maintained an external test database in MantisBT to support User Acceptance Testing (UAT).
- VACI maintained a sandbox tracking database at help.vacloud.us.
- The open-source options analysis report written by the developers
- The RAPTOR Quality Assurance Surveillance Plan
- The VistA document library (va.gov/vdl/)
- The OSEHRA document library
- VistA Evolution Enterprise Health Management Platform (eHMP) documents, including the VistA Evolution Plan

I will start my research with the Open Source Software (OSS) directive memorandum and the VACI and OSEHRA correspondence and project documents specific to technology and HCI software design issues. Key VA technology documents include the VistA Modernization Strategy (2010) and the VistA Evolution Roadmap (2014). I have over 15 years of organizational correspondence, meeting notes, strategy roadmaps, and presentations. To enhance the credibility of this research, I need to ensure that the perspectives of multiple stakeholders are included. I have requested additional documentation through the Freedom of Information Act (FOIA). I will recruit and communicate with key stakeholders from various parts of the VA, OSEHRA, and various VA technology consultants. I will develop a context in which the individuals I have worked with and the groups I have joined can help me formulate an integrated account of innovation at the VA.
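As flagged in the list above, project databases such as the MantisBT issue trackers lend themselves to simple summarization during content analysis. The Python sketch below is a minimal illustration only: the CSV file name and column names are hypothetical assumptions about an issue export, not the actual RAPTOR project data, and a real analysis would work from whatever export format the tracker was configured to produce.

    import csv
    from collections import Counter

    # Hypothetical CSV export of MantisBT issues; the path and column
    # names are assumptions for illustration, not the real project data.
    ISSUES_CSV = "raptor_issues_export.csv"

    status_counts: Counter = Counter()
    phase_counts: Counter = Counter()

    with open(ISSUES_CSV, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            status_counts[row["status"]] += 1               # e.g. new, resolved, closed
            phase_counts[row.get("uat_phase", "n/a")] += 1  # assumed custom field

    print("Issues by status:", dict(status_counts))
    print("Issues by UAT phase:", dict(phase_counts))

Tallies of this kind are one way to reduce a large project record to representative evidence before the interpretive work begins.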
I interviewed stakeholders familiar with VA OSS communities and technological factors.

Chapter 4: Findings

I am comfortable honestly telling my story of the RAPTOR project, and the validity of the data and findings rests on my 17 years' experience with the VA. I collected the VA organizational data from 2002 to today. In the previous chapters I established how I collected the data and what it is intended to represent. This chapter presents the findings using the PPT models, process methodology, and information theories established earlier. I have organized the findings using the PPT model as shown below, and I conclude with a summary of findings that interprets the data.

Figure 36 Map of Findings

Introduction to Findings

The RAPTOR case study findings are a computational archival science (CAS) review of the diverse data I have collected and analyzed over many years. With CAS thinking, I acknowledge that there is a significant responsibility in telling my story and outlining my research. I have combined data from multiple sources to gather insights across diverse information sets, and I use a diverse combination of information science theory to determine how their importance is weighted and visualized into levels.

I have organized a mass of information collected since I first started analyzing VistA in 2002 (Casertano et al., 2002). I have accompanied each finding with a significance rating of highly relevant, relevant, or less relevant, as shown in the overall findings matrix below. A hybrid set of ten failure groups with associated ethical issues was identified from a thematic analysis of the findings. The relevancy is based on the combination of two organizational criteria: the OIS failure groups (Van Lancker et al., 2016) and the Ethics Management Framework (Kaptein, 2013). As shown in the table below, I found that the failure groups and the ethical shortcomings map very well onto each other. I name this hybrid TOE TAG, as an adaptation of the popular TOE and TAM research. The more a finding fit a failure group and an unethical behavior, the higher its relevancy score: highly relevant findings earned a score of 10 to 8, relevant findings 7 to 4, and less relevant findings 3 to 1 (a minimal sketch of this scoring rule appears just before the detailed people findings).

Table 12 Summary of organizational innovation system failure groups (Relevancy Criteria)

Dimensional blindness failure + Lack of Clarity ethics: Overlooking one or more dimensions, or not focusing on one or more dimensions and the success criteria soon enough.

Representativeness failure + Role-modeling ethics: Improper stakeholder group representativeness, a non-representative organization or individual for the group, or a non-representative individual for the organization; plus, the worse the example, the worse the behavior.

Resource failure + Achievability ethics: Too few financial or human resources within the OIS to successfully generate, develop, and diffuse the innovation; plus, the better equipped people in an organization are, the better they can do what is expected of them.

Lock-in failure + Commitment ethics: Too many strong ties, leading to, for example, "groupthink", resulting in myopia and inertia within the innovation network; plus misplaced commitment of leadership.

Soft institutional failure + Commitment ethics: The lack or non-alignment of informal arrangements, e.g. shared vision, social values, culture and norms, mutual trust, goals of the different partners, and business models; plus leadership commitment to innovation.

Cooperation failure + Transparency ethics: Too few strong ties in the innovation network, leading to, for example, trust issues and difficulties in cooperation.

Openness failure + Openness ethics: Improper balance between consulting and participating with too many stakeholders; plus a lack of open discussion of viewpoints, emotions, dilemmas, and transgressions.

Hard institutional failure + Enforcement ethics: The lack or underdevelopment of formal arrangements, e.g. collaboration contracts, IP arrangements, and non-disclosure agreements; plus the sanctioning of undesirable behavior and the extent to which people learn from mistakes, near misses, incidents, and accidents.

Iteration failure: Improper balance between too much needless iteration and too little feedback; plus a lack of coordination of responsibilities within the organization.

Capacity failure: The lack of certain capacities of the innovation organization to maximally profit from the OIS, e.g. absorptive capacity or network management capacity.

Each findings section (people, process, and technology) begins with a specific table. After presenting a finding, I explain it and point to the data source, before presenting and explaining the next result, and so on. I use induction, drawing on my experiences with the RAPTOR and VistA projects and with VHA, OI&T, PCS, and VACI, to reach conclusions about the VA. This inductive reasoning was discussed in the methodology chapter. One common way to test the adequacy of a generalization is to test it against counterexamples. I use several different counterexamples to validate my case study and to strengthen my findings.

Findings on People: The VA organization, culture, and communication influence innovation

I researched and answered my question on people, organization, and culture: How do the VA organization, culture, and communication influence innovation? My findings on culture, organization, and communication show the underlying people issues that currently plague the VA. As a semiformal organization, the VACI was designed as a bottom-up organization to foster a culture of innovation and transparency. VACI remains a rules-based rather than a principles- or mission-based culture, thereby hampering innovation. The government has often failed to sustain and maintain innovation over time. Organizational communications can shape and omit the truth to emphasize a positive message and influence both internal and external opinion. The VA created a culture of groupthink, with retaliation against whistleblowers. VACI uses social media in external communications to highlight successes and offset negative reporting. Internal VACI communication prematurely hailed RAPTOR as VACI innovation "good news", as management shapes and controls communications. VACI is recommending that the VA forget what made it successful in the past. The result is a lack of organizational behavioral integrity. The figure below shows the details of the people findings on culture, organization, and communications.

Figure 37 People Findings

The table below organizes the people findings into culture, communication, and organization, labels them based on their relevance to my research question, and maps to the detailed findings that follow.

Table 13 Table of People Findings

Very relevant (TOE TAG score 10 to 8):
- Culture #1 (very relevant): Mission v. rules culture.
- Culture #2: Retaliation against whistleblowers is ingrained in the VA culture.
- Culture #3: The scheduling scandal at Phoenix had an impact on the VA's culture and the perception of the project's software functionality.
- Communication #1: The lead VHA Radiologist and Radiography Technologist personally advocated for RAPTOR for more than four years. They and many other potential users perceived RAPTOR's usefulness.
- Communication #2: After RAPTOR was awarded one of the top five Medical Imaging IT Projects of the Year (2012), VACI funded the prototype buildout.
- Organization #1: Governments have often failed to sustain and maintain innovation over time.
- Organization #2: The failure to adopt organizational policy is a failure of ethics and policy adoption.

Relevant (TOE TAG score 7 to 4):
- Culture #4: The NSA LAS is an organizational counterexample to the VACI.
- Communication #3: External radiology media prematurely publicized RAPTOR.
- Organization #3: VACI is a semiformal organization.

Less relevant (TOE TAG score 3 to 1):
- Culture #5: VACI is recommending that the VA forget what made the VA successful in the past.
- Communication #4: VACI uses social media in external communications to highlight successes and offset negative reporting.
- Organization #4: VACI changed its name, leadership, and mission four times in ten years.

On his first day in office, President Obama issued a presidential Open Government Directive (Obama, 2009) to all cabinet-level agencies to be transparent, collaborative, and participatory. Peter Levin, VA CTO, said (Levin Fedscoop video, 2010) that "this is the culture that he (Obama) is asking his administration to lead." Levin said that VACI is the flagship initiative of the Open Government Plan. According to Linda Fischetti, the VA Chief Health Informatics Officer (presentation, June 10, 2010), VACI was formed to create, connect, and empower innovators. In other words, VACI is the VA's innovation organization.

The VA Open Government Plan (2010, p. 3) noted the impact that forming VACI (originally known as VAi2) would have on the agency's culture: "We have developed a very exciting flagship program, the VA Innovation Initiative, or VAi2. This initiative will transform our business processes, provide transparency to our work, and create a collaborative effort between our Agency, the Veterans we serve, and the private sector. Specifically, VA is tapping the talent and expertise of individuals from both inside and outside government to contribute new ideas that will ultimately produce new, innovative solutions at VA."

The VA Open Government Plan (2010, p. 8) stated, "Section 3. Changing the culture from top to bottom: Creating an atmosphere of openness at VA, the second largest federal agency, will require not only leadership from the top of the organization, but also significant efforts to integrate these values into our business processes."

The VA Open Government Plan (2010, p. 9) also stated, "4. How we measure success: VA will know that we have been successful in our open government endeavors when the tidal wave of questions regarding the status of a claim recedes and Veterans receive the benefits and services they have earned, more quickly and more reliably. VA will use informal surveys on websites such as Facebook to monitor how we are doing. In the next calendar year, we will develop a short, formal, and web-based survey to determine whether stakeholders and the public have heard about our open government plan and whether it has been effective."

I have followed the VACI Facebook group since 2011 (Facebook, 2011), through its many changes, to understand the direction of VACI.
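The TOE TAG relevancy scoring rule introduced in the findings introduction can be stated compactly; the following minimal Python sketch illustrates it. The tier thresholds and the three example scores are taken from the findings in this chapter, while the function and dictionary are hypothetical conveniences for illustration, not an instrument used in the research.

    def relevancy_tier(score: int) -> str:
        """Map a TOE TAG score to the tiers used in the findings tables:
        10-8 highly relevant, 7-4 relevant, 3-1 less relevant."""
        if 8 <= score <= 10:
            return "highly relevant"
        if 4 <= score <= 7:
            return "relevant"
        if 1 <= score <= 3:
            return "less relevant"
        raise ValueError("TOE TAG scores range from 1 to 10")

    # Example scores reported in the people findings below.
    examples = {
        "Mission v. rules culture": 10,
        "NSA LAS counterexample": 7,
        "Forget the past": 3,
    }
    for finding, score in examples.items():
        print(f"{finding}: TOE TAG {score} -> {relevancy_tier(score)}")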
I have attempted for several years to obtain additional data via FOIA requests. However, I was given the runaround. Initially, I was met with enthusiasm when I met with my old VA comrades and was promised whatever I needed. When months passed and nothing was forthcoming, I heard that I was asking for too much and that I needed to itemize every document I wanted. I kept after my VACI contacts from 2015 to 2020, but despite repeated requests I have not received a single document from VACI through FOIA.

Mission v. Rules Culture

People Finding - Culture #1 (very relevant, TOE TAG Score = 10)

There was once a unique culture of innovation at the VHA, and the VACI was designed to replicate it. I first became aware of the VistA culture of innovation, known as the "Hardhats", while studying VistA for a DOD research paper (Casertano, 2002). The VA's culture of innovation transformed the VA (Longman, 2010, p. 42) into the nation's best-performing healthcare system. Longman exclaims that the VA succeeded in (2010, p. 23) "creating a wonder of bottom-up engineering that many experts say points the way to the future of 21st century healthcare." In my research I found that much of the effectiveness and durability of VistA can be attributed to the collaboration between technologists and clinicians that defined the development process. This paired programming is known as "Toucan" or "Two Can" (VistA Modernization Strategy, 2010, p. 25). Ogrysko (2017) notes that there is a cultural legacy of partnership between clinicians and developers. Clinicians appreciate the ability to work with developers to implement modifications to their instances of VistA. Several clinicians expressed satisfaction with this capability and fear losing it with an enterprise COTS system.

While developing VistA at OI&T, the team had a user representative and meetings with users, but there was a perception that our sequential "waterfall" approach to software development resulted in cumbersome processes and lengthy delivery times, and prevented us from gathering meaningful requirements and iterative feedback from our business partners, the end users. My RAPTOR work with VACI attempted to overcome this, as the development team had a close relationship with the RAPTOR radiology innovator.

In the OI&T Transformation presentation, there is a quote that exemplifies this stagnant culture (2016, p. 8): "Employees expressed frustration at the perception that there was a field-based OI&T and Washington OI&T. Requirements in the field were not always passed on to decision makers at headquarters. OI&T developed IT solutions that meet the needs of a few without knowing what our business partners and field staff really needed from that technology, who relied on our IT solutions to address critical Veteran needs."

Some providers feel that significant trust between clinicians and IT has been lost over time with respect to partnership in VistA and CPRS development. Part of this is related to the fact that EHR improvements are hampered by budget and approval processes; additionally, a disconnect exists between VA facilities and IT with regard to business planning. The Grant Thornton report (2017) asked about the importance of VA clinicians having a say in their clinical practices, workflows, tools, and processes, pairing clinicians with developers. This software development process pairing is discussed further in the counterexample process finding.

Whistleblower Retaliation
People Finding - Culture #2 (very relevant, TOE TAG Score = 9)

A factor in unethical practices at the VA was an organizational climate that actively discouraged the reporting of problems within the system and allowed retaliation against whistleblowers in violation of federal law (Molina, 2018). This retaliation increases the risk of "bystander inertia" in the workplace. The larger and more complex the organization, the easier it is to shift or shirk responsibilities (Kaptein, 2013). Retaliation against whistleblowers is ingrained in the VA culture. "For years, a culture of fear has developed for whistleblowers at the VA," Senator Tom Coburn wrote in Friendly Fire. The fear of reprisal often deters whistleblowers from coming forward. The retaliation became so severe that a new law formed a new agency, the Office of Accountability and Whistleblowers Protection (OAWP). My findings describe a toxic culture at the VA and its impact on IT. The finding of retaliation against whistleblowers is an example of the lack of transparency at the VA.

The inspector general of the Department of Veterans Affairs has issued a scathing report (VA OIG, 18-04968-249, October 24, 2019) finding that the OAWP has failed in its core mission of protecting whistleblowers and has instead doubled down on the retaliation that is widespread in the agency. The VA OIG investigated adherence to whistleblower protections and found significant failures in compliance. President Donald Trump heralded the new office as a way to clean up a long-standing culture of retaliation against whistleblowers in the VA. Instead, it has been used to retaliate against the whistleblowers it was created to protect and to stifle their claims. The VA OIG found that the office's first executive director, Peter O'Rourke, "leveraged his power as head of the whistleblower office to end investigations into allies and failed to provide basic reports to Congress on the office's operations." The VA OIG found that a "hostile work environment is OAWP's most common complaint" (p. 17).

When only a few people feel empowered to speak up, it is a sign that the VA is not particularly innovative, democratic, or bottom-up, and that management does not want to know the truth. The VA is missing out on an accurate measure of how things are if employees do not feel they can be honest. The VA OIG found that the FOIA office has retaliation and backlog issues (p. 89).

The Washington Post (Davidson, J., Politics Perspective, 2019) quotes Tom Devine, legal director of the advocacy group Government Accountability Project: the VA "remains a free-speech Death Valley for government witnesses. Retaliation is ingrained in the culture." Iowa City VA Technologist Jeffrey Dettbarn, in sworn testimony to Congress (Dettbarn, June 25, 2019), noted that "there is a culture of fear and retaliation that the VA uses as the weapon to silence the whistleblower." Dettbarn's case is germane to RAPTOR's functionality, as it has to do with mass cancelations of radiology orders. Both RAPTOR's functionality and the radiology workflow process are discussed further in the upcoming process finding.

Scheduling Scandal

People Finding - Culture #3 (very relevant, TOE TAG Score = 8)

The radiology scheduling scandal at Phoenix had a negative impact on the VA culture. The VA has a goal of giving its patients an appointment within 14 days of their first seeking care.
Unfortunately, delays and irregularities in recording patient waiting times have been documented for years in numerous reports from government (GAO, 2012) and outside organizations, and have been well known to VA officials, members of Congress, and veteran service organizations. The scandal in 2014 stemmed from allegations that employees were keeping a secret waiting list at the Phoenix VA hospital and that up to 40 patients may have died while awaiting care (Federal News Network, 2014). A preliminary VA inspector general probe into the allegations found systemic falsification of appointment records at Phoenix and other locations.

Software is not the only problem behind the vast VA scandal. VA policies were unworkable, managers had unreasonable expectations, and results were faked when employees could not meet the goals. An audit showed 57,000 veterans had been waiting at least three months for a first appointment. People died while waiting. These fatalities, as well as the wait list scandal, forced VA Secretary Eric Shinseki to resign (Politico, 2014).

Five years after the scandal (July 2019), there are signs that the scheduling scandal still impacts the VA culture. In her testimony to Congress, Debra Draper, director of GAO's healthcare team, noted (Ogrysko, 2019), "At this time, we continue to be concerned that VA has not sufficiently addressed the reliability of its wait time data." Another sign that the crisis continues is that schedulers are among the top ten highest-turnover positions within the VA.

During my time with the VistA and RAPTOR development teams, the VACI had a lack of resources. The VA OIG found that 96% of Veterans Health Administration facilities maintain at least one "severe occupational staffing shortage," with a lack of qualified applicants, non-competitive salaries, recruitment challenges, private sector competition, and high staff turnover being the main reasons (VA OIG, 19-00346-241, September 24, 2019). Ultimately, when staff are unhappy, they vote with their feet. As the VA is seeing turnover as a chronic situation across the agency, it is time to capture learnings from those leaving, engage with employees, and take a close look at the organizational culture. Instead, the VA is retaliating against whistleblowers.

I found that the scheduling scandal was the start of the move to privatize care. In response to the scandal, Congress took action to allow veterans who faced long waiting times for care, or who had to travel a long distance to receive care at a VA facility, to seek private care. The VA may now close more than 1,100 facilities to privatize more medical care. In his memoir, Secretary David Shulkin (2019) wrote that President Donald Trump wanted to close large parts of the VA down: "I am convinced that the path now chosen, if allowed to continue, will leave veterans with fewer options, a severely weakened VA, and a private healthcare system not designed to meet the complex requirements of high-need veterans."

Partnership counterexample

People Finding - Culture #4 (relevant, TOE TAG Score = 7)

In biology, an ecosystem is a community of interacting organisms and their physical environment. Innovation communities likewise form metaphorical ecosystems, where the "organisms" are organizations and individual researchers. To assist with gaining validity, I look at counterexamples of public-private-academic innovation partnership ecosystems between government, academia, and industry. The NSA LAS is a counterexample to the VACI.
The Laboratory for Analytic Sciences (LAS) at North Carolina State University, funded by the National Security Agency (NSA), is a collaborative, long-term research enterprise focused on improving innovation at that agency. The lab serves as a counterexample to the VA innovation process, as it is showing more promise in encouraging innovation. I discussed LAS with UMD associate professor Dr. Kathleen Vogel, who presented "Big Data, Privacy, and the U.S. Intelligence Workforce" at CASCI 2019. Vogel's research (2017, p. 172) notes that the LAS innovation center is designed to "explore ideas and alternative perspectives, gain new insights, generate new knowledge, or obtain new information." This mission is very similar to the origins of VACI.

The table below compares the program characteristics of the VACI and LAS (based on Vogel & Taylor, 2019). I believe that the LAS public-private-academic partnership is a counterexample in that it avoided the pitfalls of VACI. The key people differences include that LAS was formally structured and had stronger communications within the ecosystem partnership. VACI had many organizational and leadership changes over the project lifecycle, and its leadership lacked the follow-through to provide the support needed for innovation in the operations and support processes. The interrelationship between the different organizations is another failure group, due to a lack of cooperation and transparency.

Table 14 Comparison between VACI and LAS (based on Vogel & Taylor, 2019)

Started by. VACI: Obama's Open Government directive (2008). LAS: Intelligence Vision 2015.
Emphasis. VACI: participation. LAS: transparency.
Dedicated facility. VACI: no, virtual. LAS: yes, NC State.
Participant entry. VACI: selection through voting. LAS: rotational.
Open source. VACI: software (was), interfaces (is). LAS: information, data.
Improve the mission by changing the culture. VACI: innovation. LAS: collaboration.
Number of changes in leadership/programs. VACI: unstable, many changes. LAS: unstable, many changes.
Sustainability. VACI: many challenges. LAS: many challenges.
Organization type. VACI: semiformal. LAS: skunk works.
Interdisciplinary teams. VACI: agile paired (clinician/programmer). LAS: inside/outside.
Operational resource issue in the DevOps process. VACI: yes, lack of an operations intake process. LAS: no.

Forget the past

People Finding - Culture #5 (less relevant, TOE TAG Score = 3)

VACI is recommending that the VA forget what made it successful in the past. In its Medium social media posting, "The VHA Innovators Network Adopts the Three Box Solution Framework" (VHA Innovation on Medium, July 10, 2019), VACI freely acknowledges that it wants to forget what made the VA successful in the past. The Three Box Solution is an oversimplified management "airport book" similar to "The One Minute Manager". It suggests an organization can put its innovation into three boxes: Box 1 is the current business, Box 2 is the past, and Box 3 is the future. Vijay Govindarajan writes (Chapter 5) that "the hardest question that businesses never ask themselves is what should we stop doing". Govindarajan recommends that organizations forget what made the business successful in the past. VACI is pushing the unlearning of VistA to come to terms with the loss of in-house software development. This finding is expanded in the VistA modernization finding section.

This "forget the past" culture is also exhibited in the lack of forthcoming information from VACI, which has stonewalled my FOIA requests. When I initially talked to the VACI director, he said that he would provide any documentation needed to complete my dissertation. However, no additional project information was forthcoming. The VACI director told me that information was disposed of with the administration change. I redirected my inquiries to the FOIA office and got into an endless loop between FOIA and VACI that has resulted in no assistance with this research. This behavior is consistent with the toxic culture that punishes whistleblowers.
When I initially talked to the VACI director, he said that he would provide any documentation needed to complete my dissertation. However, no additional project information was forthcoming. The VACI director told me that the information had been disposed of with the change of administration. I redirected my inquiries to the FOIA office and got into an endless loop between FOIA and VACI that has resulted in no assistance with this research. This behavior is consistent with the toxic culture that punishes whistleblowers.

William Faulkner is credited with the quote, "The past is never dead. It's not even past." Ignoring the past, the VA is stumbling forward. The logical conclusion is that by forgetting the past, the VA is doomed to repeat it. An example of making the same mistake over again is the four failed attempts to modernize VistA over the past twenty years.

Radiologist approval
People Finding - Communication #1 (very relevant, TOE TAG Score = 9)

RAPTOR has user acceptance throughout the VA radiology organization. Table 15 below shows the executive leadership stakeholders who endorsed RAPTOR. Users at many levels, including the lead VHA Radiologist and Radiography Technologist, approved of RAPTOR, and it passed user acceptance testing (UAT).

Table 15 RAPTOR Stakeholders

Type of Stakeholder | Description | Responsibilities
Requester | Michael Cortright, Portfolio Manager, VHA Innovation Program (VACI) | Submits new service request (NSR). Submits business requirements. Monitors progress of request.
Endorser | Dr. Charles M. Anderson, Chief Radiologist Consultant, Diagnostic Services | Endorses this request. Provides strategic direction to the program. Elicits executive support and funding. Monitors the progress and timelines.
Business Owner(s)/Program Office(s) | Dr. Charles M. Anderson, Chief Radiologist Consultant, Diagnostic Services | Provides final approval of BRD with sign-off authority. Provides strategic direction to the program. Elicits executive support and funding.
Business Subject Matter Expert(s) (SME) | Dr. Jonathan Medverd, Radiologist (Original Innovator) | Provides background on current system and processes. Describes features of current systems and proposed enhancements.

In correspondence, the Lead VHA Radiography Technologist and Radiology Quality Officer said that every one of the more than 130 sites should enter a National Service Request (NSR) for RAPTOR. This shows his impatience with VACI's delay in releasing the software. In this correspondence he also states his perceived usefulness of RAPTOR and notes that he has been personally advocating for RAPTOR for over four years. He suggests overwhelming the New Service Request (NSR) process.

The rank-and-file radiologists also perceived the usefulness of RAPTOR. I have several videos from RSNA conferences where I had the opportunity to introduce RAPTOR to VA radiologists from across the country. In the video testimonials shown below, they discuss the usefulness and the ease of use of RAPTOR. This is an explicit case of a communication breakdown between the radiology community and VACI. While the message from radiologists who want to use RAPTOR is unanimously positive, VACI did not provide the operations resources required to introduce RAPTOR clinically as a radiology workflow module of VistA.

Figure 38 Video Testimonials of RAPTOR's usefulness by VA Radiologists

RAPTOR awarded one of the top five Medical Imaging IT Projects of the Year
People Finding - Communication #2 (very relevant, TOE TAG Score = 8)

Before RAPTOR was named one of the top five Medical Imaging IT Projects of the Year in 2012, the prototype had been completed, and the innovator and developers were waiting for VACI to move forward with building it out. After the award, VACI initially hesitated but eventually funded the prototype buildout. The industry award helped end the delay in funding RAPTOR.

Appendix I is an excerpt of an article published in the Radiology Business Journal on the top five Medical Imaging IT Projects of 2012. The article announced the exclusivity of being one of the top five. Communications with several VACI managers stated that this award and the subsequent publicity were responsible for the funding of the prototype buildout. This is another explicit case of a communication breakdown between the radiology community and VACI.

As I reflect on this award now, I am very happy to have received the designation. RAPTOR received the award for the prototype concept and the potential efficiency and effectiveness benefits it could have provided the VA. The patient safety benefits are detailed in the process findings.

Premature publicity from press
People Finding - Communication #3 (relevant, TOE TAG Score = 5)

The external radiology press prematurely publicized RAPTOR. Two articles about RAPTOR were published on AuntMinnie.com, the largest and most comprehensive online community for medical imaging professionals worldwide. The first included the abstract in the 2012 edition of AuntMinnie.com's annual RSNA preview, "Road to RSNA." The second was a post-RSNA interview with Dr. Medverd; it is included in its entirety in Appendix III of this research. In my discussion with the lead radiologist innovator during the findings phase, he felt that the coverage was important at the time to help RAPTOR evolve from UAT into a national release, but that the news was premature because the project did not, in fact, take off.

There is a clear connection between the findings and the premature publicity. The prototype was an important first step, but RAPTOR was not ready for takeoff. In fact, when we were awarded the contract to build out the application, we used the prototype for inspiration but restarted the development. My process findings show that even after development and user acceptance, RAPTOR was still not ready, because operations delayed the project rollout and ultimately failed to operate the software.

Social media to offset negative reporting
People Finding - Communication #4 (less relevant, TOE TAG Score = 2)

Since the VA Open Government Plan of 2010, the VA has used social media to highlight its successes and to offset negative reporting. Current VACI director Dr. Ryan Vega noted in an interview that the VA's "longstanding view is shaped by the negative press" and that the "VA is leading the way in innovation in IT" (Vega, August 8, 2019). The VA Open Government Plan states (2010, p. 7): "Celebrate Open Government Successes; In addition to sharing our successes in creating a more open VA within the Agency, we must also communicate our efforts to those outside VA. That is why we will continue our existing social media efforts through tools such as Facebook and Twitter and expand to other new media as well."

Over the course of performing this research, I noted that the VACI website became obsolete. Several of the project references have been wiped from the internet, possibly due to several reorganizations.
For example, the official website (innovation.va.gov) notes that "With the enactment of the VA Maintaining Internal Systems and Strengthening Integrated Outside Networks (MISSION) Act of 2018 (Pub. L. 115-182, hereinafter MISSION Act), Sec. 152., we are shifting our focus to innovation initiatives enabled by the new law and exploring opportunities to maximize VA assets. Accordingly, VA enterprise innovation leadership transitioned operational control of programs supporting individual administrations, such as the Veterans Health Administration (VHA), back to their respective organization."

The above is a counterexample that supports my communications findings. This reorganization quote is government "non-speak." Rather than saying that IT innovation is in chaos, the social media post talks about exploring an opportunity to maximize VA assets. These findings show how VACI uses communications to thwart innovation.

Governments have often failed to sustain and maintain innovation over time
People Finding - Organization #1 (very relevant, TOE TAG Score = 9)

Several studies (Lee, 2014; Vogel & Taylor, 2019) found that the US government has failed to sustain innovation over time due to a variety of elements, including communication channels and the organization itself. A striking example of this is shown in slide 2 of the VA OI&T Comprehensive Plan (VA OI&T, 2017), where 234 of 299 projects were being migrated or stopped. RAPTOR was caught in the 78% of projects that were canceled or migrated.

Table 16 VA OI&T 2017 Project Count

Total Projects | Total Continued | Total Migrated/Stopped
299 | 65 | 234

The VA has undergone significant leadership changes. The GAO found (2019) that the VA has experienced leadership instability over the past two years in several senior positions. Reinvention and a reshifting of support occur every time the leadership changes. All the RAPTOR stakeholders named in the Radiologist Approval finding table above have left the VA. Generalizing by induction from the specific RAPTOR case, high staff turnover held true all over the VA. The VA OIG reports (VA OIG, 2019) that there are 49,000 vacant positions, and ninety-six percent of VA facilities reported at least one severe occupational shortage. Staffing shortages and turnover are the root cause of the failure to maintain innovation.

A rare direct quote in which the VA owns its shortcomings shows how the VA's IT organization inhibited innovation, including what I personally experienced with RAPTOR. In the VA OI&T Transformation (2016 year in review) presentation, OI&T admits: "Prior to transformation, our relationships with OI&T and other VA employees suffered because we had no dedicated, coordinated methods of receiving employee feedback, and no mechanism for bi-directional communication between staff and leadership. This led to general employee dissatisfaction, a high rate of employee burnout, and a lack of trust in OI&T." This organizational quote directly relates to many of the people findings, including communication. The OI&T organization's lack of operational resources is a finding on operations failing to maintain innovation; it is discussed specifically in the DevOps process finding.

The failure to adopt organizational policy
People Finding - Organization #2 (very relevant, TOE TAG Score = 8)

The failure to adopt organizational policy is a failure of ethics and policy adoption. Earlier, in the literature review, I modeled TAM on the open source policy and telehealth implementations of VA OI&T.
Based on my findings on the diffusion of innovation adoption, I complete the TOE TAM model. Policy diffusion scholars have studied diffusion by concentrating on the stage of policy initiation (Damanpour & Schneider, 2006; Graham, Shipan, & Volden, 2012). Borrowing the TAM notions of the innovation initiation stage suggested by Zaltman et al. (1973), the stage of policy initiation can be subdivided as follows: knowledge awareness, formation of attitudes toward the innovation, and the adoption decision.

The VA CIO office noted the organizational roadblocks in implementing the 2014 Open Source Policy memo. The bias against open source software that RAPTOR experienced was not unique. The VA CIO noted in his presentation at the 2015 OSEHRA Conference that the effort to promote OSS at the VA "kept running into roadblocks." He noted VA stakeholders' lack of education about and support for OSS. The CIO lamented that the VA needed to get the OSS message out and that the VA Open Source Policy effort had no momentum or leadership support for change.

For counterexamples to verify my findings, I look to my VistA experience from before RAPTOR. This counterexample shows the gap between a successful adoption of innovation in telehealth and the failures I experienced in VACI with RAPTOR and open source.

One of my first VA OI&T software development experiences with VistA Imaging was Patch 46, the TeleReader. The TeleReader was an enhancement to VistA Imaging that (Darkins, p. 762) "gave VHA a robust IT infrastructure to complement its fledgling telehealth expansion. Close collaboration between clinicians and the VHA IT community created a multimedia health record." The TeleReader project, as part of the VA TeleHealth program, is a positive example of policy diffusion in adoption over a twenty-year period (1994 to 2014). As shown in the figure below, the TeleReader was a VistA Imaging software module that supported the remote reading of consult imaging examinations at the reading center.

Figure 39 The VistA Imaging TeleReader as an example of VA diffusion in innovation

Darkins notes that the VA's challenges in creating large telehealth networks mirror the experience of other organizations (nationally and internationally) in implementing and sustaining their programs, with associated challenges that include clinical buy-in, credentialing and privileging, staff training, technology standardization and interoperability, securing revenue streams, clinical risk management, relationships with IT/biomedical engineering, and ensuring the quality of care. This TeleReader example shows a successful VA innovation being supported with post-development resources, unlike what occurred with RAPTOR.

Table 17 VA Innovation vs. TAM Model Sustained Diffusion Adoption

VA Innovation | TAM Model Sustained Diffusion Adoption
RAPTOR | Adoption - too few operational resources available within the VACI to successfully generate, develop, and diffuse the innovation
Open Source Policy | Adoption - culture and communication inhibit technology acceptance
Telehealth | Adoption - successful diffusion of innovation

VACI is a semiformal organization
People Finding - Organization #3 (relevant, TOE TAG Score = 7)

A semiformal organization is believed to foster innovativeness (Robbins & Judge, 2009; Walker, 2007). These structures adapt to unstable conditions and change. They are characterized by individuals performing their tasks outside of a clearly defined hierarchy or structure.
A semiformal organization can operate flexibly and adapt quickly to a rapidly changing environment (Jones, 2004). VACI is the VA's attempt to create a semiformal innovation organization. The VistA Modernization Report observed (May 2010, p. 36): "We believe one of the biggest challenges the VA will have around the VistA project will be a culture change in the overall way they procure software, incentivize the open source communities to participate, and speed development." Shortly after this report, at the VistA eHealth University Conference in August 2010, I attended the introduction of the VACI. VACI was formed to organize scattered VA innovation initiatives. The modernization report notes (p. 38): "Are there significant cultural barriers? Any time changes are made in an organization, there are impacts to agency culture. At the point that these changes become a barrier, the momentum moving forward with strategic change may be slowed."

The VA Open Government Plan (June 25, 2010, p. 3) noted: "Candidly, VA has not always been the model for government performance or service delivery. However, with strong leadership, good governance, and a new commitment to creating a culture that is open, transparent, participatory, and collaborative, we will achieve our objective and create a high performing VA of which our citizens, our nation, and most importantly, our Veterans and their families can be proud." Peter Levin noted (Fedscoop video, 2010) how overwhelmed the leadership was by the number and quality of engaged employees who participated in bettering their environment through VACI.

Figure 40 Secretary Bob McDonald speaking at OSEHRA Conference (photo by Casertano, July 30, 2015)

On July 30, 2015, VA Secretary Robert McDonald spoke at the OSEHRA Conference. Impressively, he spoke without a script, as shown in the photograph above. McDonald stressed that the direction of his leadership was to move the discussion away from problems within VA management and toward putting VA customers first. McDonald joked that when he joined the VA, the General Counsel (the lawyers) ran the department. The joke is illuminating about the VA oversight culture. In Buell (2016), McDonald said, "a rules-based organization is a safe place to work because if you follow the rules, you're never going to be criticized. You go to General Counsel for each opinion, so you never have to take any personal risk." McDonald has described the VA's rules-based culture as a culture of learned helplessness.

This was my experience with the VA: everyone can blame external circumstances for his or her inability to act. In Learned Helplessness in Organizations, Ashkenas (2012) describes concentric circles of excuses that absolve managers from accountability for change or improvement. Rather than finding creative ways to deal with regulations or budget cuts, they accept the status quo and blame external conditions for the problems that exist.

I experienced the helplessness phenomenon often at the VA. One example: the development team had completed its work seven months earlier (October 2015) and was waiting to hear about next steps. "It has been seven months since we completed development of the EWD version of RAPTOR. This week, VA Innovations reached out to us to meet to help them with their build script. We quickly assisted VA Innovations in troubleshooting its build. SAN diagnosed the VA Innovations script and found it missing several Cache database routines.
Please note that SAN's code, testing, and documentation of the EWD version of RAPTOR are not the reason for the seven-month delay."

This learned-helplessness culture has the power to permeate an organization. Like a spreading infection, managers pass on learned helplessness from group to group and level to level. Eventually the standard response to any initiative is some variation of "We'd love to do that, but we really can't."

Many name, mission, and leadership changes
People Finding - Organization #4 (less relevant, TOE TAG Score = 3)

One of the subtle but interesting findings was how VA Innovation has changed its name and mission over the past ten years. VACI changed its leadership, name, and mission four times in ten years. What started as the VA Innovators Initiative (VAI2) became VACI, and it is now called either iNET, the Ecosystem, or VAI. I have found several trends in these changes, in open communications, and in technology. VACI was formed to promote transparency and openness, and this has shifted toward less transparency. Another trend is that in the beginning, VACI supported in-house information systems projects with funding; now, as the VA moves away from in-house software development, VACI's mission has shrunk. Earlier, in the literature review (Figures 13 and 14), I captured screenshots of the broken VACI websites. This is an unfortunate sign of a broken organization and culture. My finding is that the foreshadowing of the broken websites matches the broken expectations from the cancelation of VistA and RAPTOR and the broken promises of failing to promote open source software.

Process Findings: The breakdown of VA processes inhibiting innovation

My findings on DevOps, radiology workflow, and project management show the underlying process issues currently inhibiting innovation at the VA. A lack of coordination between Development and Operations during a critical transition time caused the DevOps process to break. In the operations phase, VACI had limited resources and wanted to turn the software over to OI&T and PCS for deployment. Unfortunately, OI&T did not provide any resources to support the design, development, and testing of RAPTOR. This lack of communication, integration, and coordination of resources resulted in a delayed transition schedule. Although sandbox development was not maturely implemented and VACI management and contracting misaligned RAPTOR project resources, the RAPTOR software was delivered on time and on budget, successfully passed UAT, and was certified for software quality.

The radiology scheduling scandal at Phoenix impacted the VA culture, and the VA technology management failures of the Enterprise Scheduling system impacted the perception of the project's software functionality and the radiology ordering process. The radiology support assistants (clerks) had no access to radiology appointments; RAPTOR functionality would have improved this and offered important potential patient safety benefits in alignment with its safety mission. The RAPTOR order cancelation functionality was designed not to permit unauthorized mass cancelations. The RAPTOR business case performed by the VHA Innovation Selection Board estimates substantial tangible and intangible cost/resource savings due to efficiency in workflow. RAPTOR's perceived efficiency and effectiveness benefits would have alleviated the scheduling and ordering processes that resulted in the current VA crisis.
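Throughout these findings I grade each item with a TOE TAG score and group the scores into relevance bands. As a compact restatement of that banding (my own summary of the scheme used in the tables below, written as a minimal Python sketch, not code from the VA):

def toe_tag_band(score: int) -> str:
    # Map a TOE TAG score (1 to 10) onto the relevance bands used in the
    # findings tables: 10 to 8 very relevant, 7 to 4 relevant, 3 to 1 less relevant.
    if not 1 <= score <= 10:
        raise ValueError("TOE TAG scores range from 1 to 10")
    if score >= 8:
        return "very relevant"
    if score >= 4:
        return "relevant"
    return "less relevant"

# For example, the DevOps #1 finding (score 9) lands in the top band.
assert toe_tag_band(9) == "very relevant"
assert toe_tag_band(5) == "relevant"
assert toe_tag_band(2) == "less relevant"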
The figure below illustrates the detailed findings of the three examined VACI processes: DevOps, radiology workflow, and project management. The table below shows the relevance of the three processes that impacted my research.

Figure 41 Process Findings

Table 18 Table of Process Findings

Very relevant (TOE TAG 10 to 8)
- DevOps #1. There was no coordination between Development & Operations during the critical transition time.
- DevOps #2. VA OI&T and PCS are counterexamples of software development methodology from the VACI.
- Radiology Workflow #1. The VA technology management failures of the Enterprise Scheduling System impacted the radiology ordering process.
- Radiology Workflow #2. RAPTOR had very important potential patient safety benefits.

Relevant (TOE TAG 7 to 4)
- DevOps #3. RAPTOR successfully passed UAT and was successfully certified as OSEHRA Level 2.
- DevOps #4. Sandbox development was not maturely implemented.
- Radiology Workflow #3. Radiology support assistants (clerks) had no access to radiology appointments. RAPTOR functionality would have improved this situation.
- Radiology Workflow #4. RAPTOR's redesigned order cancelation workflow would reduce illegal employee waitlist manipulations.
- Project Management #1. VACI management and contracting misaligned RAPTOR project resources.
- Project Management #2. VACI portfolio management.

Less relevant (TOE TAG 3 to 1)
- Project Management #3. The RAPTOR business case estimates substantial savings due to efficiency in workflow.
- Project Management #4. The process of selecting an innovation has changed over time.

No coordination between Development & Operations
Process Finding - DevOps #1 (very relevant, TOE TAG Score = 9)

In the literature review, I defined the DevOps process and noted that the adoption of DevOps requires changes at the organizational level. My finding is that while development was agile, the coordination and communication between development and operations were non-existent.

As noted in the May 2015 RAPTOR bimonthly report risk log from the development team to VACI management, shown below, there was no coordination between the developers and the regional OI&T during the critical integration period leading to operational support.

Risk #: VA Requests Delay in Delivering Production Servers
Date Initially Logged: September 23, 2014
Type: Schedule
Issue: VACI requests developers delay in delivering production servers. There has been no coordination between developers and Regional OI&T on server configuration for over a year and counting.
Mitigation: Resume discussions between developers and Regional OI&T. Re-baseline plan.
Impact: HIGH impact for production schedule

In fact, this delay in operations had very relevant consequences, causing RAPTOR not to be introduced clinically. VACI "ran out the clock," never successfully standing up RAPTOR. Ultimately, VistA modernization was canceled, and RAPTOR was swept away along with many internal development projects.

Software development methodology counterexamples
Process Finding - DevOps #2 (very relevant, TOE TAG Score = 8)

The purpose of counterexamples is to test the adequacy of a generalization. I created the table below to show my personal experiences with three different types of software development at the VA. I worked in three different organizations and used three different software methodologies. Their differing characteristics serve as counterexamples across the three software development methodologies.
My three software experiences were developing multiple VistA Imaging patches for OI&T, integrating commercial teleradiology software with VistA while at PCS, and open source RAPTOR development at VACI.

Table 19 Different VA Software Development Characteristics

Characteristic | Office of Information and Technology (OI&T) | Patient Care Services (PCS) | VACI
Process nickname | "Cope and Hope" | "Buy COTS" | "Two-can"
Software development process | Waterfall | Ensuring COTS interfacing | Agile, paired programming
Types of software releases | Patching VistA (adding new functionality to VistA) | Commercially available COTS interfacing with VistA | Open source interfacing with VistA
My software experience | VistA Imaging and Radiology patches | Teleradiology integration | RAPTOR open source
Biggest challenge | VA was unhappy with the resources used (mainly time to deliver) | VA's requirements were unique and no COTS products were available | Orphan organization did not obtain proper resources for operations
Organizational IT management | Highly structured, led by policies | Highly structured, led by clinicians | Semiformal, led by technologists

With my years at the VA, I was fortunate to spend my initial years as an OI&T VistA developer. The VistA Imaging development team was a mix of developers who practiced GOTS waterfall software development. My role was to ensure that internally developed VistA would successfully interface with my new patches and with COTS acquisition modalities and PACS. I was then selected as a software architect for PCS, a role that involved overseeing the interfacing of COTS products in teleradiology and PACS. Finally, as described throughout this dissertation, my RAPTOR idea was selected to be a VACI innovation.

This finding uses what I observed in three different areas of the VA (OI&T, PCS, and VACI) as the basis for asserting that I encountered three different software methodologies, challenge sets, and organizational management styles. Of the three, I infer that the agile software process is the best methodology. Unfortunately, with the outsourcing of software development, the VA is now contractually tied to interfacing with a commercial proprietary EHR. While each approach has its own unique set of challenges, the VA has shown with these examples that people, process, and technology challenges matter more than the development methodology. Put another way: if the VA is at risk and in chaos, it will not be successful no matter what methodology developers use. RAPTOR was delivered on time and on budget and was still not successfully made operational because of a lack of operational resources.

RAPTOR successfully passed UAT and is OSEHRA certified
Process Finding - DevOps #3 (relevant, TOE TAG Score = 5)

Below are screenshots from the OSEHRA Technical Journal (2016) showing that RAPTOR was successfully certified at Level 2, which speaks to the quality of the software. OSEHRA has created certification standards (OSEHRA, 2019) under which open community members inspect and certify code for compliance with good software engineering practices. The two screenshots below show that RAPTOR can accommodate specific VA interoperability needs and serve the needs of the open source community with Apache licensing and documentation. For example, Level 2 documentation requires that a basic set of documentation be provided covering the intended purpose and requirements of the codebase, the installation instructions for RAPTOR, and a description of how to test the code.
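Those Level 2 documentation requirements map naturally onto a handful of repository artifacts. One plausible minimal layout (my illustration of the requirements just listed, not OSEHRA's published checklist):

README - intended purpose and requirements of the codebase
INSTALL - step-by-step installation instructions for RAPTOR
TESTING - how to run the tests and interpret the results
LICENSE - Apache license text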
This finding shows that not only was RAPTOR accepted by the radiologist users, it was also approved by OSEHRA, the independent open source certification body that the VA set up, confirming the quality of the software. The testing was done under no contractual obligation; I was not paid to support it. I supported the certification because I am proud of the software and wanted to stand behind the quality of my team's work. I also thought that the certification would push VACI to do the right thing and operationally support the software.

Figure 42 RAPTOR completed OSEHRA compliance checklist

Figure 43 RAPTOR successfully completed OSEHRA open source certification

Sandbox development was not maturely implemented
Process Finding - DevOps #4 (relevant, TOE TAG Score = 5)

The idea of a VistA test environment was long overdue. As described in the VistA Modernization Strategy (2010), the lack of a virtual test environment was a barrier to innovation. At the August 2010 Introduction to Innovations Program kick-off for VA employees in Tampa, FL, I first heard the concept of the sandbox development environment. Accessing the development environment took at least four different attempts: a single laptop, multiple laptops, a single virtual laptop account (that had to be shared), and a sandbox that migrated across several platforms. This environmental immaturity caused the developers much rework in rehosting our development and impacted our already tight schedule.

As noted in the May 2015 RAPTOR bimonthly report risk log from the development team to VACI management, shown below, the lack of maturity between development and operations wasted project resources.

Risk Log Issue #1
Issue: The previous sandbox (cloud1) is non-optimal and required a transition to a new environment, cloud2. Cloud2 has some performance questions that might be addressed by migrating to a new cluster environment (see risk #18).
Mitigation: SAN has spent many resources creating and populating advanced imaging test data (including CPRS textual patient order data and imaging data) and installed the data loader into the sandbox. Monitor user comments during UAT. SAN requested VA Donation Data as per conversation with BC at OSEHRA (ticket #3344).

In my DevOps model, I identified four operational domains that VACI was required to provide for the RAPTOR project. The first was a VistA development environment known as the "sandbox." The figure below shows the network topography of the sandbox within the VA and outside the firewall. The platform of the sandbox evolved over the project, from a standalone encrypted laptop to a virtual machine and then to an Amazon Web Services (AWS) cloud-based platform.

Figure 44 VACI "Sandbox"

The second proposed operations task was to "seed" VistA data into that environment. The third was to automate the build processes to enable enterprise development. The fourth was to maintain the environment through security patches. The VA performed poorly on all four operations tasks.

As shown in the DevOps model above, a transition took place from development to operations. However, when the application was turned over to OI&T, no support was given to deploy and sustain the software, meaning it was not maintained. Many radiologists throughout the country requested to pilot the software (RSNA testimonials, 2011 to 2017).
However, because of a number of organizational factors, such as the leadership void that resulted from the retirement of the VA Chief Radiology Consultant and an incomplete DevOps process in which VACI and OI&T failed to prioritize operations, PCS did not devote any resources to deploying the software. The DevOps process failed because of these poorly performed operational tasks.

The failures of the Enterprise Scheduling system
Process Finding - Radiology Workflow #1 (very relevant, TOE TAG Score = 9)

The failures of the Enterprise Scheduling system impacted the perception of the project's software functionality and the radiology ordering process. The VA has been trying since 2000 to update its VistA-based scheduling system, which was developed in the 1980s. As of 2019, despite several failed attempts and millions of dollars spent, the current system (screenshot shown below) is almost 40 years old, with an MS-DOS look. It is easy to contrast this ancient system with the more modern UI grids and boxes of Microsoft Outlook and Google Calendar.

Figure 45 Current VistA Scheduling and Appointment System

Scheduling systems have had a shameful legacy of failure (GAO, 2010). Scheduling system failures have extended throughout VA HIT and have impacted both software development and radiology scheduling processes. My finding is that this legacy affected radiology workflow software modules such as RAPTOR as well as the effort to replace VistA. RAPTOR and VistA got caught up in the fire and haze that scheduling brought to VA IT. The GAO (2010, 2012, 2019) blames a broad range of VA managerial weaknesses that have plagued a series of failed projects.

In 2012, there was a community-sourced submission for an open source scheduling system (OSEHRA Scheduling Contest, 2012). This contested project was known as the VistA Scheduling Enhancement (VSE) (Fedscoop, 2019). After years of delay and several million dollars of overrun, the VA OIG pointed to (Fedscoop, 2019) failures in the "management of requirements, meeting user needs and continuity of leadership." In 2014, the VA released an RFP for a new Medical Appointment Scheduling (MAS) system, and in 2015 the VA signed a contract with Epic to bring its scheduling software to the VA. The Epic scheduling project started before the decision was made to go with the Cerner EHR. When that decision was made, it raised a question that would eventually need to be answered: does the VA deploy Epic scheduling with the Cerner Millennium EHR, or does it scrap the Epic scheduling project and go with Cerner's scheduling capability? The VA canceled the Epic contract; Epic was paid $25 million of a $625 million contract (Politico, 2018).

Because users perceive scheduling as a key part of the radiology workflow, the RAPTOR project was caught up in this plague of scheduling IT failures. As shown in the radiology protocol workflow figures, scheduling is an important component of protocol workflow. In 2013, the VistA Evolution Radiology Package GUI Business Requirements Document (BRD) required an integrated scheduling solution. The following excerpts from the BRD illustrate that to the end user (here an imaging technologist), scheduling patients and equipment is part of the RIS.

User Story 2.2 - As a clerk or technologist, I want to schedule patients from the pending study (orders to be fulfilled) list to the MAS scheduling package using a GUI interface so patients can receive an appointment time and the radiology department can fulfill orders at a scheduled time.
User Story 2.2.2 - As a clerk or technologist, I want all clinics in the MAS package that refer to a single piece of medical equipment to be aggregated in one calendar so I can view the schedule for a piece of equipment without viewing multiple calendars.

Potential RAPTOR patient safety benefits
Process Finding - Radiology Workflow #2 (very relevant, TOE TAG Score = 8)

Although it was never released clinically, RAPTOR had several important potential safety benefits, highlighted in the table below. RAPTOR was designed to improve adherence to federal regulations and standards of care and to increase regulatory compliance. It was designed in compliance with the Joint Commission Revised Requirements for Diagnostic Imaging Services, effective July 1, 2014, including the Joint Commission safety checklist and the administration of renal protective measures prior to imaging contrast agents. Compliance with the informed consent mandate is suboptimal within VHA facilities.

Table 20 Potential RAPTOR Benefits

Provider-to-Provider Care Coordination
- Radiology "dashboard" provides real-time insight into patient condition
- Current patient history based on VistA electronic record of radiology care coordination
- Facilitates communication between scheduler, resident, radiologist, and technologist

Clinical Decision Support and Patient Safety
- Standardization of evidence-based radiology protocols
- Customizable workflow and prioritization alerts to enable timely responses to provider consult requests
- Centralized Radiology Protocol Repository
- Effective clinical alerts to prevent avoidable clinical errors
- Improved QA/QC feedback and training

Improved Radiologist and Department Operational Efficiency
- Prevents avoidable duplicate radiology studies
- Reduction in paper-based processes
- Reduction in the use of fax and scanning technology
- Improved compliance records for management review
- Application of radiology best practices
- Improved cost savings related to improved compliance
- Reducing waits and delays for radiology engagement
- Elimination of ambiguous responsibility
- Medical appropriateness

Technology Optimization
- Reusable technology
- Low maintenance costs
- Open source/non-proprietary technology
- Auditable system technology
- Rapid traceability within the application
- Expanded interoperability across in-place VA health information systems and electronic medical records

By not implementing RAPTOR, the VA created a critical gap in data integrity, provider-to-provider care coordination, clinical decision support, patient safety, radiologist and department operational efficiency, and technology optimization. These benefits are key best practices for workflow optimization in healthcare systems.

RAPTOR's order cancelation workflow
Process Finding - Radiology Workflow #3 (relevant, TOE TAG Score = 6)

RAPTOR's redesigned order cancelation workflow would reduce illegal employee waitlist manipulations. According to the Washington Post (Davidson, 2019), fabricated waitlist assertions have bedeviled the VA since it was consumed by scandal in 2014. In order to shorten the waitlist, VA employees were ordered to remove patients' names illegally, and this corrupt process was supported by the information technology then in place. The RAPTOR order cancelation functionality was designed not to permit unauthorized mass cancelations, and RAPTOR also linked new orders to previously canceled orders. A minimal sketch of this rule appears below, and the screenshots that follow capture the order cancelation functionality as built.
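To make the design concrete, here is a minimal Python sketch of the two safeguards just described: privileged-only cancelation and an audit link from each replacement order back to the order it replaces. This is a hypothetical illustration of the rule, not RAPTOR's production code (the real implementation lived in RAPTOR's Drupal front end against VistA).

from dataclasses import dataclass
from typing import Optional

@dataclass
class RadiologyOrder:
    order_id: str
    status: str = "active"
    replaces: Optional[str] = None  # audit-trail link to a canceled order

def cancel_order(order: RadiologyOrder, user_is_privileged: bool, reason: str) -> None:
    # Only a privileged VistA user (typically the physician or a designated
    # signatory) may cancel; unauthorized or mass cancelations are rejected
    # instead of silently removing patients from the list.
    if not user_is_privileged:
        raise PermissionError("cancelation requires a privileged VistA user")
    order.status = f"canceled ({reason})"

def replace_order(old: RadiologyOrder, new_order_id: str) -> RadiologyOrder:
    # A replacement order records which order it supersedes, so a canceled
    # study cannot simply vanish from the record; the link is also surfaced
    # in CPRS, as shown in the figures below.
    return RadiologyOrder(order_id=new_order_id, replaces=old.order_id)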
The first figure shows that order cancelation in RAPTOR was permitted only for a privileged VistA user; typically, only the physician or a designated signatory is allowed to cancel an order. The second figure shows the replaced order in RAPTOR, with an audit trail note linking the original order to the new order. The final figure in the sequence shows the link displayed in CPRS. Therefore, I claim that RAPTOR's redesigned order cancelation workflow would have reduced illegal employee waitlist manipulations.

Figure 46 Cancel Order from RAPTOR

Figure 47 RAPTOR displaying Replaced Order

Figure 48 CPRS displaying Replaced Order

Radiology clerks have no access to radiology appointments
Process Finding - Radiology Workflow #4 (relevant, TOE TAG Score = 6)

The VA OIG (2015) found that radiology support assistants (clerks) have no access to radiology appointments. RAPTOR functionality would have improved this situation. As shown in the figures below, RAPTOR would have provided insight into radiology appointments and could have alleviated some of the pressure on this choke point. In fact, I had several conversations with VACI leadership about enhancing this functionality: users were requesting it, and it made sense to provide it as part of the radiology workflow. The current command-line appointment system is shown below.

The implications of this finding are easily seen in the screenshots. There is a huge HCI gap between a clean, modern web interface and the old (1980s) mainframe command-line emulator. It is easy to see the HCI impact and why radiology appointments have been such a recurrent issue at the VA.

Figure 49 RAPTOR Pass Box functionality supports appointment management

Figure 50 Radiology Scheduling Management from the RAPTOR Scheduler

Figure 51 VA Command Line Scheduling System

VACI management misaligned RAPTOR project resources
Process Finding - Project Management #1 (relevant, TOE TAG Score = 7)

I found that, at the RAPTOR project level, management and contracting misaligned resources. As noted earlier, a shortage of operational resources led to the failure of the DevOps process. Additionally, during the development phase there were at least four Price Waterhouse Coopers (PwC) resources on our weekly calls, yet the only deliverable the development team received from PwC was a preliminary Section 508 testing report. Before releasing VA software, the VA requires a certificate of compliance with Section 508 of the Rehabilitation Act. In all previous software projects, a final 508 report was delivered at or near the end of coding. When I asked the VACI project lead about receiving a preliminary (but not final) report, he said that the innovations coordinator thought they were a good idea, although the entire innovation development team considered them a waste of resources. An additional summary finding, therefore, is that VACI project managers are not empowered to manage project resources or priorities. There are two individual findings here: too many resources for unneeded early 508 compliance oversight, and a severe shortage of operational resources.

VACI contracting shortchanged the development team in several important areas, including VistA data loading, test automation, and providing users with a comment tracking system. VistA data loading and RAPTOR test automation were not included in the development contract.
These should have been budgeted, as they were critical to a successful development phase. Test automation was reviewed during the OSEHRA certification. VACI did not have a practical handle on the operational resource issues; despite multiple inquiries to get radiology data, the development team was told "it is what it is." The implication of this was a misalignment of resources, with application development resources spent doing VACI's tasking. A final VACI mismanagement was moving the RAPTOR application out of the development environment: after development, there was a year-long delay in operations while VACI tried to identify the correct resource. This is discussed in more detail in the technology findings.

VACI portfolio management
Process Finding - Project Management #2 (relevant, TOE TAG Score = 4)

In my large cache of VA documentation, I found several examples of VACI management portfolio announcements that were tailored for diverse stakeholders. Having access to this content is a good example of being an insider to the VACI culture. Appendix II contains the VACI "good news story" that was published on an internal website. This "marketing" type of positive organizational communication is prevalent in VACI project management and continues to this day.

This VACI marketing message was a project management communication to radiologists. An excerpt from the notification that RAPTOR had been selected as a VA Innovation reads: "VA Innovation Competition has been intense. Over 45,000 users voted on 6,500 ideas that were originally submitted on the VHA Employee Innovation Competition website. Of those, 125 were invited to submit a proposal, and 101 proposals were received by the deadline. You are one of 32 that have made it to the final stage. This is a truly terrific accomplishment. Again, congratulations! You are about to embark on an exciting journey, and we are eager to assist and guide you through the process. Signed VHA Innovation Program & VA Innovation Initiative (VAi2)."

The (Fedscoop, 2010) YouTube video hosted by VA CTO Peter Levin shows that RAPTOR was one of the selected innovations. This is an external communication channel that the CTO used to tout the VACI portfolio. The screen shows that RAPTOR was a key project in the VACI portfolio and the range of projects in the VACI innovation pipeline.

Figure 52 Online Radiology Protocoling Tool Integrated with CPRS/VistA listed in the VHA & OIT Innovation Initiative (still from "Dr. Peter Levin discusses innovation at the Dept. of Veterans Affairs" YouTube video)

Business case cost benefit justification
Process Finding - Project Management #3 (less relevant, TOE TAG Score = 3)

The RAPTOR business case performed by the VHA Innovation Selection Board estimates substantial tangible and intangible cost/resource savings due to efficiency in workflow. The following is an excerpt from the VHA Innovation Selection Board business case for national action on the RAPTOR prototype (the RAPTOR business case justification). It shows the value of the RAPTOR application to the business of radiology. The finding also shows the waste in developing RAPTOR for several million dollars and then not going forward with its use once it was ready for clinical deployment: the VA incurred all of the expenses but realized none of the benefits.

"Implementation of RAPTOR will lead to substantial tangible and intangible cost/resource savings in addition to patient care improvements and compliance gains detailed elsewhere in this document.
Conservative estimates of tangible economic benefits of RAPTOR implementation include:

- Avoid an estimated $23 million per year of costs from preventable complications of intravenous contrast administration adverse events.
- Reduce radiology technologist and other support personnel manual labor, liberating an estimated $5.5 million worth of time per year. This conserved effort can be applied to other productive tasks, increasing overall department efficiency.
- Reduce radiologist and nuclear medicine physician labor, liberating an estimated $3.7 million plus worth of radiologist time per year. This conserved effort can be applied to other productive tasks, increasing efficiency.
- Costs incurred by typical IT projects include acquisition, contracting, and custom integration. All such substantial costs can typically inflate to 80% over the total lifecycle. These costs are negated by RAPTOR.
- Transition from paper to electronic workflow promises substantial workflow efficiencies and quality and safety gains in addition to economic and ecological benefits of paper and printing avoidance. Paper and printing savings alone are conservatively estimated at $0.25 million.

Intangible (and difficult to assign value) benefits of RAPTOR implementation likely value at magnitudes of scale greater than the selected economic benefits listed above. Two such examples include:

- Avoid as much as 12% wasted effort expended on assigning protocols to requisitions that ultimately do not advance to exam completion (e.g., duplicate orders, canceled orders, unauthorized orders).
- Workplace quality: Current paper processes for protocol assignment represent a chore. RAPTOR optimization and streamlining of workflow will improve employee attitude and satisfaction surrounding this necessary department function."

Summing the itemized tangible estimates ($23 million + $5.5 million + $3.7 million + $0.25 million) puts the conservative tangible savings at roughly $32.5 million per year, before any of the intangible benefits are counted.

Project Initiation changes
Process Finding - Project Management #4 (less relevant, TOE TAG Score = 1)

The process of selecting an innovation has changed over time, from employee vote, to shark tank selection, to spark-seed-spread. This change shows the evolution away from information technology projects. The appendix has examples of each of the three selection processes. The implication reveals several project initiation trends. One is that VACI is moving away from inclusion: voting has the greatest employee inclusion; the shark tank has a public theatrical aspect but less employee inclusion; and spark-seed-spread has only experts deciding.

Another initiation trend is smaller projects, in both resources and size. The initial RAPTOR prototype contract was for $500K; spark-seed-spread funding started at $50K. The result of this reduction in funding is that software prototypes are much less developed, or that the innovation is not IT-based at all. This fits the VACI pattern of moving away from internal software development.

Findings on Technology: IT shortcomings are not the reason behind the rejection of VA VistA and RAPTOR

My findings on VistA modernization, the Office of Information & Technology (OI&T), and open source show that the underlying technology issues that currently plague the VA are not the reason behind the rejection of VA VistA and RAPTOR. The use of the open source framework and tools had no adverse impact on the project development schedule and budget, and RAPTOR's open source code library was reused by at least one other project. The decision not to utilize the open source policy is a failure of policy diffusion in the technology adoption model.
Although security is a priority at the VA, the operational phase did not support RAPTOR security maintenance. After UAT, Portland attempted to advance RAPTOR to national OI&T implementation; it was held up by OI&T operations' hesitation in moving from a Class III (local) innovation to Class I (national OI&T support). Every RAPTOR component was on the VA OI&T listing of approved tools known as the Technical Reference Model (TRM), and no additional functional enhancements were required for RAPTOR to pass the VistA Intake Program. The VistA Evolution Roadmap was the main innovation pipeline from 2014 to 2017, with RAPTOR as its highest radiology priority. Upon Dr. Shulkin's June 22, 2016 testimony to Congress, the VistA Evolution plan was scrapped. The VHA is phasing out in-house software development.

Peter Levin, VA CTO, defines VA innovation as invention plus implementation (Fedscoop, 2010). Levin highlighted seven attributes of implementation: open architecture, modular, scalable, standards-based, extensible, reliable, and maintainable. In his YouTube talk, he tells a story of a technology challenge regarding modular software: on his first day in office, the VBA Secretary attempted to change a single digit, from 60 to 30 days, on a letter. This one-digit change took 11 months to implement. By contrast, thanks to RAPTOR's modularity, the development team swapped out the entire middleware section in one month. As championed by CTO Levin, RAPTOR was an open architecture (we swapped out the middleware). The front-end RAPTOR software is developed in Drupal, a highly scalable content management system (CMS); large websites such as Weather.com and Time.com use Drupal. RAPTOR introduced the RSNA standards-based radiology lexicon (RadLex) codes and worked with all VistA standardized codes and business processes. Therefore, I could not find a single technological reason for the rejection of RAPTOR. This is consistent with the false technical narrative for replacing VistA: there are no technical disqualifications against these applications.

The figure below illustrates the detailed findings of the three examined VACI technologies: open source, Office of Information & Technology (OI&T), and VistA modernization.

Figure 53 Technology Findings

The table below shows the relevance of the three technologies that impacted my research. It organizes the technology findings into open source, OI&T, and VistA modernization, labels them based on their relevance to my research question, and maps them to the detailed findings that follow.

Table 21 Table of Technology Findings

Very relevant (TOE TAG 10 to 8)
- Open Source #1. Open source software misconceptions through the DevOps process.
- Open Source #2. Use of open source has no adverse impact on the project schedule and budget.
- OI&T #1. Security maintenance.
- OI&T #2. VA is phasing out in-house software development.
- VistA Modernization #1. RAPTOR is the highest radiology priority in the VistA Evolution Roadmap, the main innovation pipeline.
- VistA Modernization #2. Shulkin scrapped VistA modernization.

Relevant (TOE TAG 7 to 4)
- Open Source #3. RAPTOR's open source code was reused.
- OI&T #3. Approved architecture and components.
- OI&T #4. OI&T and VACI delays.
- VistA Modernization #3. Faulty cost information.

Less relevant (TOE TAG 3 to 1)
- Open Source #4. OSEHRA closing.
- VistA Modernization #4. The VistA Intake Program.

Open source development not well-known at VACI
Technology Finding - Open Source #1 (very relevant, TOE TAG Score = 9)

Within the DevOps process, open source development was new to operations.
Many operational misconceptions existed throughout the process. While presenting RAPTOR at the Drupal Government Days 2012 conference (Drupal, 2012), I learned that many federal agencies are realizing significant IT savings by using open source application development, which reduces lifecycle costs relative to average federal software development. Additional costs incurred by typical IT projects include acquisition, contracting, and custom integration; these substantial costs can typically inflate a project by 80% over the total lifecycle. Such IT costs are negated by open source RAPTOR.

Another open source misconception is Dr. Shulkin's assertion that the VistA electronic medical record system needs to be replaced because the VA cannot retain VistA developers. I believe his assertion rests on false assumptions. Open source development attracts better talent than proprietary software, and many experts suggest open source is where the industry is heading. Many developers enjoy creating their own projects and having the ability to interact with other developers to discuss innovative solutions. Giving developers freedom and flexibility is an important way to attract and nurture top development talent.

During the RAPTOR development, VACI forced us to switch from the Linux open source development environment to Microsoft Windows Server and then back to open source Linux. VACI wanted us to use an unsupported version (Windows Server 2008) that was no longer available for purchase, so we had to buy Windows Server 2012 and then downgrade to 2008. This needless shuttling between open and proprietary platforms shows how many misconceptions about open source persist at the VA.

Use of open source has no adverse impact on schedule and budget
Technology Finding - Open Source #2 (very relevant, TOE TAG Score = 8)

Use of the open source framework and tools had no adverse impact on the project development schedule and budget. RAPTOR's innovative application architecture is based on reusing open source tools and principles and is fully consistent with the VistA modernization strategy. By designing the layered application as discrete open components, RAPTOR offers the VA a wide range of interoperability potential and short development cycles. RAPTOR's presentation and radiology process logic were built on a robust and secure Drupal open source CMS, and RAPTOR reuses MDWS web services to pull clinical data from CPRS using data objects. The design and development of RAPTOR required the project team to consider and analyze potential services as discrete, standardized building blocks. The RAPTOR team analyzed its programmatic options and made several design choices. During the proof-of-concept phase, VACI required me to develop an open source plan showing the benefits of using Linux, Apache, MySQL, and PHP (LAMP). I justified all of the open tools used to manage the RAPTOR project and saved the VA tens of thousands of dollars in Microsoft licenses.

Open source is designed as community-based software development and education. The source code is available to the general public for use or modification from the original design. It is not just using another vendor's code, but a true collaboration in which organizations take code, improve upon it, and release those enhancements back into the community. For example, RAPTOR developers took open Drupal code modules to get started and delivered value to the VA very quickly, on time and on budget. Development software support options include no direct costs.
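RAPTOR's layered, open source architecture described above reduces to a simple pattern: pull clinical data out of VistA through a web-service layer, aggregate it, and render it as web content. A minimal Python sketch of that pattern follows; it is my illustration only, with a hypothetical endpoint and field names (RAPTOR's actual front end was Drupal/PHP over the MDWS web services).

import json
from urllib.request import urlopen

def pull_patient_orders(service_url: str, patient_id: str) -> dict:
    # Query a VistA-facing web service for the clinical data objects
    # relevant to protocoling (MDWS played this role for RAPTOR).
    with urlopen(f"{service_url}/patient/{patient_id}/orders") as resp:
        return json.load(resp)

def render_dashboard(data: dict) -> str:
    # Aggregate the pulled data into web content. RAPTOR did this inside
    # the Drupal CMS; the Daily Plan project, discussed below, reused the
    # same data-pull idea.
    items = "".join(f"<li>{order['study']} ({order['status']})</li>"
                    for order in data.get("orders", []))
    return f"<ul>{items}</ul>"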
Should VistA migrate its ecosystem to open source Linux, the cost of hardware and software will decrease. On Linux, VistA also has the flexibility to run on two different implementations of MUMPS (Massachusetts General Hospital Utility Multi-Programming System): one used extensively by the open source community (FIS's GT.M) and one used by the VA (InterSystems' Cache). This is a plus, as it helps keep downward pressure on software costs when there is no vendor lock-in.

RAPTOR's open source code was reused
Technology Finding - Open Source #3 (relevant, TOE TAG Score = 7)

RAPTOR's open source code was reused by at least one other project. Code reuse is one of the strongest arguments for OSS. Open source is reusable when a module supports the same business process, or addresses a software problem that is the same regardless of business process, because OSS allows developers to freely adopt the existing code and adapt it as needed. In a June 29, 2016 conversation with a presidential scholar, I noted that "PwC are reusing RAPTOR's EWD.js Java Library for the Daily Plan interface with VistA." The Daily Plan (Patient Safety, 2009) provides patients with an itinerary for each day in the hospital. These patient-specific reports, one or two pages in length, reflect current orders such as allergies, medications, procedures, and diet. The presidential scholar forwarded this reuse information to a VACI director, who responded that "VA is going to replace VistA with Cerner which changes the game for everyone." RAPTOR and the Daily Plan both pull data out of VistA and then aggregate the data as web content; this data-pull business process is a key component of VistA modernization.

OSEHRA closing
Technology Finding - Open Source #4 (less relevant, TOE TAG Score = 3)

After about ten years in existence, OSEHRA is closing due to a lack of funding from the VA. In 2010, the VA recognized that VistA's rate of innovation and improvement had slowed substantially and that the codebase was unnecessarily isolated from private-sector components, technology, and outcome-improving impact. To address those issues, the VA established OSEHRA as the mechanism to open the aperture to broadly based public- and private-sector contributions.

OSEHRA was founded in 2010 as the open source health record custodian and works with the VA in two ways. First, OSEHRA maintains and provides the VistA source code and a variety of supporting resources. Formed to support the open source VistA community, OSEHRA was the bi-directional gatekeeper of VistA systems: it helped to identify, analyze, prioritize, and certify open source software candidates, such as RAPTOR, for VA intake. For example, RAPTOR was verified and certified by OSEHRA as a part of the VistA intake process; this certification was discussed in the process findings (DevOps #3). Second, OSEHRA fosters an open ecosystem (shown in Figure 31) in which many organizations, including the VA, can equally participate. This private-public-academic partnership ecosystem is discussed as a people finding. The participating organizations include private companies, academic institutions, state government agencies, and federal government agencies.

In an October 2019 email to community members, OSEHRA announced that unless it received additional funding from the VA, it would be forced to close. The nonprofit has struggled to sustain its operations as the VA has turned away from open source software.
OSEHRA closing

Technology Finding - Open Source #4 (less relevant, TOE TAG Score = 3)

After about ten years in existence, OSEHRA is closing due to lack of funding from the VA. In 2010, the VA recognized that VistA's rate of innovation and improvement had slowed substantially, and that the codebase was unnecessarily isolated from private sector components, technology, and outcome-improving impact. To address those issues, the VA established OSEHRA as the mechanism to open the aperture to broadly based public and private sector contributions. OSEHRA was founded in 2010 as the open source health record custodian, formed to support the open source VistA community. OSEHRA worked with the VA in two ways. First, as the bi-directional gatekeeper of VistA systems, OSEHRA maintained and provided the VistA source code and a variety of supporting resources. OSEHRA helped to identify, analyze, prioritize, and certify open source software candidates, such as RAPTOR, for VA intake. For example, RAPTOR was verified and certified by OSEHRA as part of the VistA intake process; this certification was discussed in the process findings (DevOps #3). Second, OSEHRA fostered an open ecosystem (shown in Figure 31) in which many organizations, including the VA, could equally participate. This private-public-academic partnership ecosystem is discussed as a people finding. These organizations include private companies, academic institutions, state government agencies, and federal government agencies. In an October 2019 email to community members, OSEHRA announced that unless it received additional funding from the VA, it would be forced to close. The nonprofit had struggled to sustain its operations as the VA turned away from open source software. No additional funding was available, and OSEHRA has closed for operations, as shown in the notice in the figure below.

Figure 54 OSEHRA closing notice

Lack of security maintenance

Technology Finding - OI&T #1 (very relevant, TOE TAG Score = 10)

Although security is a very important concern at the VA, the operational phase did not support RAPTOR security maintenance. This paradox leads back to the previous technology finding that many misconceptions about open source software persisted through the RAPTOR DevOps process. For example, the development team handed a UAT-approved web application server over to operations, but the lack of dedicated support experience in patching left the application inaccessible: after the first Microsoft software patch, VA Operations could not restart RAPTOR. This shows a lack of support ownership in Operations and a breakdown of the DevOps process. The failure was not the fault of the open source software and tools, which provided solid information security in VistA. Notably, DevOps is now commonly referred to in the IT industry as DevSecOps, underscoring the importance of security in the DevOps process and how critical the failure of VA OI&T operations was in not maintaining RAPTOR security.

The VHA is phasing out in-house software development

Technology Finding - OI&T #2 (very relevant, TOE TAG Score = 9)

In-house software development was once notably more common in the federal government, including the VA, than it is now. The phasing out of in-house software has been going on for the last ten years; in my opinion, 2009 was the apex of in-house software development. Looking at the current 2019 VistA Imaging webpage shown below, one may conclude that VistA's peak was around ten years ago, based on the last listed event, a record number of images stored (July 2009).

Figure 55 Current VistA Imaging webpage

When I was a VistA software developer, I saw the sunset of VistARAD in 2009. Then in 2010, when I was at PCS, I assisted with the commercial PACS replacement initiative for VistARAD. Presently, I can see that VistA software re-engineering beyond basic maintenance is being phased out. Later in this chapter, my technology finding is that the VA is stopping in-house software development. My current finding is that VACI is using the simplified "three boxes" management approach to justify sunsetting the in-house software development undertaking.

Standardization of approved architecture and components

Technology Finding - OI&T #3 (relevant, TOE TAG Score = 4)

My finding is that standardization has been an ongoing issue at the VA across technology, architecture, processes, and costing. This finding is specific to technology and business models. These models are frameworks that help guide standardization to minimize cost, complexity, and risk. While the VA has been undergoing upheaval, technology, process, and cost standardization models are a buffer against disorganization in patient care. Every RAPTOR component was on the VA OI&T listing of approved tools, known as the Technical Reference Model (TRM), located at www.va.gov/trm/. For example, MySQL version 5.6.x had been approved by VA OI&T Architecture for CY2015 and CY2016. This is relevant because every RAPTOR component, including the open source tools that RAPTOR introduced for clinical use, met all architectural, performance, and security requirements.
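As a minimal illustration of what checking a deployed component against a TRM-style approval constraint could look like, the sketch below verifies a MySQL server against the approved 5.6.x line cited above. The connection parameters are illustrative assumptions.

```php
<?php
// Minimal sketch: verify a deployed MySQL server against a TRM-style
// approved-version constraint (the 5.6.x line cited above). The connection
// parameters are illustrative assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=raptor', 'raptor_user', 'secret');

$version = $pdo->query('SELECT VERSION()')->fetchColumn(); // e.g. "5.6.51"

if (version_compare($version, '5.6.0', '>=') && version_compare($version, '5.7.0', '<')) {
    echo "MySQL $version is within the approved 5.6.x line\n";
} else {
    echo "MySQL $version falls outside the approved range\n";
}
```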
The TRM approval demonstrates the technical acceptance of RAPTOR's components, revealing that RAPTOR had no technical architecture issues. The lack of standardization in costs is discussed further in Faulty VistA cost model (Technology Finding - VistA Modernization #3); there was no standard cost model until the Technology Business Management framework was introduced after VistA modernization was canceled.

OI&T and VACI delays

Technology Finding - OI&T #4 (relevant, TOE TAG Score = 4)

After UAT, Portland attempted to advance RAPTOR to national OI&T implementation, but it was held up by OI&T operations and VACI. The radiologists at the test sites had several options, including passing on RAPTOR altogether. Instead, Portland approved RAPTOR for clinical use and tried to move it forward for VA operations support, because the UAT site liked RAPTOR and wanted it to improve radiology workflow. The following correspondence originated from a UAT site manager to OI&T and is followed by OI&T's response. It shows that, despite a successful UAT, OI&T was against supporting innovation projects.

UAT Manager: I'd like to take a moment to introduce the proposed integration of RAPTOR: The Radiology Protocol Tool and Recorder (RAPTOR) software is designed to render the labor-intensive process of protocol assignment and the often paper-based workflow in imaging departments obsolete, according to its developers. It's programmed to search for information from a patient's medical record that is important for protocol decisions and display it in a dashboard synchronized with an electronic protocoling tool. Funding from the VA Innovation Initiative was used for the project. VA Innovation is facilitating the evaluation and advancement of RAPTOR. The Innovator anticipates that it will eventually be rolled out for use in every Imaging Department of the VA Health Care Network. I thank you for your support and consideration in assisting with this platform integration.

OI&T Response: I have concerns with moving class iii Innovations project into a production account.

This was just the beginning of several years of delays and frustration in getting an approved and certified innovation project into the radiology clinic. OI&T delayed even after RAPTOR was certified for software quality by OSEHRA. They attempted to create a new installation script to reduce reliance on developers; this simple task was never successfully completed. Thus, while RAPTOR development finished on time, operations had no timeline to maintain what was transitioned to them. Although the development team stayed involved through several years of meetings, VACI was unwilling to pay for any post-UAT development. This lack of a critical resource caused a very long and unnecessary delay, and the delay turned into a cancelation once VistA modernization was canceled. This finding is discussed further in VistA modernization canceled (Technology Finding - VistA Modernization #2).

The VistA Evolution Roadmap and RAPTOR

Technology Finding - VistA Modernization #1 (very relevant, TOE TAG Score = 10)

Released in 2014, the VistA Evolution Roadmap was the main innovation pipeline, and RAPTOR was its highest radiology priority. The figure below is an excerpt from the VistA Evolution Roadmap, dated March 24, 2014.
It shows that RAPTOR was the highest radiology priority in the VistA Evolution Roadmap and that the next step was to deploy RAPTOR across the enterprise. The full page that follows shows that RAPTOR's capabilities were among the key functionalities planned as benefits.

Figure 56 RAPTOR in the VistA Evolution Roadmap (March 24, 2014)

An email from OSEHRA dated December 1, 2014, titled "VA Design Patterns Briefing and VistA Evolution Update: Questions and Answers," included the following exchange:

Question: What effect will the VistA Evolution work have on the future innovation projects from VHA?

Answer: The VistA Evolution Program will oversee the transformation of VistA so that it adheres to a service-oriented architecture design pattern. This results in a vendor-agnostic technology platform that is highly responsive to changing clinical needs: new functionality can be added as new services, old functionality can be changed by modifying or replacing existing services.

Radiology. VistA 4 will update the radiology application to transition radiology operations from a paper-based to a paper-light practice. These enhancements will address the current practice demand with emphasis on increased efficiency, improved documentation, and enhanced patient safety. VistA 4 radiology and imaging enhancements will leverage some of the innovative work undertaken by community VistA users for a new radiology user interface. This GUI may be used as a model user interface for the following radiology functions: enter order, schedule study, register patient, case edit study, protocol study; display status of patients who are in the department; display key management parameters: unscheduled orders, incomplete studies, un-dictated studies. Key functionalities targeted for the radiology interface include scheduling exams from a list of orders. This user interface will enhance the functionality of the scheduling application to allow auto-populating of the scheduled appointment time in the radiology application, eliminating the need for duplicate entry. Additional new capabilities will consist of:

• Ability to assign orders for imaging studies to radiologists so they can be protocoled.
• Select acquisition protocols for ordered and scheduled imaging studies, with rationale for selection.
• Communicate imaging instructions to technologists.
• Communicate patient communications from clerk to radiologist and technologist, and
• Enter radiation dosage.

As listed above, VistA 4 radiology will include best-practices functionality such as support for electronic protocols and a dashboard display of the patient's status, which will facilitate communication between radiologists and technologists. Incorporating protocols within radiology procedures will ensure that important safety information, such as allergies and renal function, is clearly communicated. Radiology CDS capabilities will improve ordering guidelines to follow appropriateness criteria as defined by the American College of Radiologists. [Sidebar: Radiology and Imaging Clinical Decision Support. Imaging best-practice protocols combined with clinical decision support (CDS) at the time of order entry help remind providers of evidence-based and local guidelines, reduce unnecessary testing, and provide patient safety checks throughout the procedure.] VistA 4 Imaging will build upon current image management capabilities to support enterprise image distribution and viewing.
Such enhancements include the ability to import studies from external entities, improved image viewing functions, support for structured DICOM reports, and integration and tracking of radiation dose metrics. These features will also enable imaging interoperability with our partners, including the DoD. The VistA 4 Radiology and Imaging System enhancements will improve efficiency, quality of care, and Veteran safety through efficient workflows, timely processing of orders, improved communications among staff, more complete documentation, and support for optimal scanning protocols. These enhancements will benefit clinicians by allowing simultaneous availability of patient images and data while planning and providing care, less time spent locating images, and improved communication among radiology clinicians and specialists.

VistA modernization canceled

Technology Finding - VistA Modernization #2 (very relevant, TOE TAG Score = 10)

Upon Dr. Shulkin's June 22, 2016 testimony to Congress, the VistA Evolution plan was scrapped. On June 26, 2016, I received word that the VistA Evolution program was scrapped in favor of a COTS replacement of VistA. This resulted in the canceling of RAPTOR, and the announcement reverberates to the present day as in-house software development is unseated. After the decision to move to a commercial EHR, Dr. Shulkin was asked what the response was from those involved. Paraphrasing his response, which grouped stakeholders according to their knowledge of VistA: those who do not use VistA, including politicians, are generally pleased, while those who use VistA, including VA employees who use it daily, are not enthusiastic. One critic in the hardhats community wrote that "this is a declaration of victory without an actual implementation. Silencing the staff is a common effort that has been used in the past and the reality is that the staff is bullied into accepting a lesser system" (Hardhats forum topic LSNW4NYZBp8). In June 2018, two years after announcing that VistA modernization was dead, the VA established the Office of Electronic Health Record Modernization. The two-year gap between announcements is illustrative of how unprepared VA management was for this dramatic change.

Faulty VistA cost model

Technology Finding - VistA Modernization #3 (relevant, TOE TAG Score = 4)

There was no standard cost model until the Technology Business Management framework was introduced after VistA modernization was canceled. The GAO report "Electronic Health Records: VA needs to identify and report system costs" (July 2019) found that "VA's total does not accurately reflect the development and sustainment costs for VistA." The GAO advised that VA's failure to keep track of its spending on VistA means "the department, legislators, and the public do not have the comprehensive, reliable information needed to understand how much it actually cost to develop and maintain the system." I found many examples of the faulty VistA cost model impacting business decisions. The VA report that evaluated alternatives against open source VistA did not distinguish among cloud service models and therefore could not produce accurate cost information. Secretary Shulkin announced in January 2017 that he would make a decision regarding the future of VA's EHR platform in July 2017. The Grant Thornton report (May 1, 2017) addressed four strategic options for modernizing the VA EHR. On June 22, 2017, Secretary Shulkin announced that he was canceling VistA modernization.
This timing shows the importance of this report in the decision to cancel VistA in favor of a COTS EHR. It is not a coincidence that the $16 billion amount is consistent with the no-bid contract awarded to Cerner for its COTS EHR. The following passage is from the Grant Thornton report. It describes each new EHR option and its total cost over the next 15 years.

"The four strategic options are as follows:

• Option 1 - Commercial off-the-shelf (COTS) EHR: VA selects and implements a COTS EHR product and uses it for clinical and revenue cycle functionality. Although not all needs may be met by a single vendor, VA has the option to purchase additional COTS functionality and incorporate/integrate it with the primary COTS solution. The COTS EHR product will be hosted within a VA-purchased and operated, federally certified, secure cloud environment. Total Cost = $16.2B

• Option 2 - COTS EHR combined with the Joint Legacy Viewer (JLV) and enterprise Health Management Platform (eHMP): This option is similar to Option 1, but VA retains the JLV and eHMP, both VistA packages, to develop and implement additional capabilities to fill gaps in COTS EHR capabilities. The COTS EHR product will be hosted within a VA-purchased, federally certified, secure cloud environment. Total Cost = $18.7B

• Option 3 - VistA commercialization: VA transfers VistA to a third-party vendor, and after modernization by the vendor, VA purchases licenses to use VistA as Software as a Service (SaaS). VA will receive considerations for pricing, such as reduced licensing and implementation costs, in exchange for VistA intellectual property rights. VA may also negotiate other terms, such as directed development of new functionality to meet VA's specific requirements. In the SaaS arrangement, the vendor provides the software on a subscription basis and is responsible for hosting the software in a federally certified, secure cloud environment. Total Cost = $11.9B

• Option 4 - COTS EHR provided as SaaS: This option is similar to Option 1; however, in this option, the COTS EHR product is hosted and fully supported and managed by the vendor. In the SaaS arrangement, the vendor provides the software on a subscription basis and is responsible for hosting the software in a federally certified, secure cloud environment. Total Cost = $16.0B"

These options are essentially similar Software as a Service (SaaS) variations of cloud computing. An accurate cost estimate would have priced different service variations, including SaaS, Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). SaaS can be defined (Sulaiman et al., 2019) as a software distribution model in which a third-party provider hosts an application and makes it available to customers over the Internet; the service provider installs all the applications and software required, ready for use by the user. PaaS provides a platform for computer users through the provision of hardware, networking, and operating systems; users design and develop their own applications in this model, which sits between SaaS and IaaS. IaaS is a form of cloud computing that provides virtualized computing resources; it provides storage space and basic computing to users so they can develop applications in their own environment.

The VISTA Intake Program

Technology Finding - VistA Modernization #4 (less relevant, TOE TAG Score = 1)

The VISTA Intake Program (VIP) was launched in January 2015.
Paul Tibbits, MD, VA CTO, noted in his presentation at the 2015 World Open VistA Forum that OSS candidates for VIP are jointly vetted and approved by both the VA and OSEHRA. To be nominated, code must be intact, with no remaining enhancements necessary to be functionally aligned with VistA. No additional functional enhancements were required for RAPTOR to pass the VISTA Intake Program. This milestone shows that RAPTOR was technologically ready to be clinically introduced to radiologists and could be supported by OI&T, which makes the reality of the delays and cancelation much harder to justify. RAPTOR was essentially on the cusp of clinical introduction; it had passed every hurdle and prerequisite.

Summary of Findings

In analyzing autoethnographic data, my intent is to gain a cultural understanding of innovation at the VA. With about twenty years of software development experience in a variety of roles in VA and DOD HIT, I am intimately connected to the people, processes, and technology in a cultural context. Therefore, my autoethnographic analysis and interpretation involved shifting my research between myself, RAPTOR, VistA, and other VACI stakeholders. To organize and prioritize my findings, I introduce an intuitive synthesis of organizational information systems and ethics management and categorize it using the PPT concept framework. This new Technology, Organization and Environment plus Technology Acceptance failure Groups (TOE TAG) concept incorporates several diverse models, including TOE, TAM, OIS, and ethics management. This adaptive concept includes ethics as a lens for understanding communication breakdowns, process non-compliance, and technology failures. This inductive synthesis was applied to generalized findings that I connected from my specific data and experiences. The following table is an overall summary of my findings. They represent a blend of different software projects, organizational context, and personal observations that were validated by data. The total number of findings (24) and their uniform distribution across people, process, and technology are consistent with the total ethical issues of cloud computing (Sulaiman, 2019).

Table 22 Matrix of Findings (Importance vs. PPT Significance)

People
  Very Relevant (TOE TAG 10 to 8):
    Culture #1. Mission vs. rules culture.
    Culture #2. Whistleblower retaliation ingrained in culture.
    Culture #3. Impact of the scheduling scandal.
    Communication #1. Radiologist approval of RAPTOR.
    Communication #2. RAPTOR was awarded one of the top 5 Medical Imaging IT Projects of the Year.
    Organization #1. Governments have often failed to sustain and maintain innovation over time.
    Organization #2. The failure to adapt to organizational policy.
  Relevant (TOE TAG 7 to 4):
    Culture #4. Partnership counterexample.
    Communication #3. Premature publicity from the radiology press.
    Communication #4. VACI uses social media in external communications to highlight successes and offset negative reporting.
    Organization #3. VACI is a semiformal organization.
    Organization #4. VACI changed its name, leadership, and mission four times in ten years.
  Less Relevant (TOE TAG 3 to 1):
    Culture #5. Forget the past.

Process
  Very Relevant (TOE TAG 10 to 8):
    DevOps #1. There was no coordination between Development & Operations during the critical transition time.
    DevOps #2. VA OI&T and PCS are counterexamples of the software development methodology from VACI.
    Radiology Workflow #1. The VA waitlist management failures of the Enterprise Scheduling System impacted the radiology ordering process.
    Radiology Workflow #2. RAPTOR had very important potential patient safety benefits.
    Project Management #1. VACI contracting shortchanged the development team in several important areas, including VistA data loading, test automation, and providing users with a comment tracking system.
    Project Management #2. VACI portfolio management.
  Relevant (TOE TAG 7 to 4):
    DevOps #3. VA OI&T and PCS are counterexamples of the software development methodology from VACI.
    DevOps #4. Sandbox development was not maturely implemented.
    Radiology Workflow #3. RAPTOR's order cancelation workflow would reduce illegal employee manipulations.
    Radiology Workflow #4. Radiology clerks had no access to radiology appointments; RAPTOR functionality would have improved this situation.
    Project Management #4. The process of selecting an innovation has changed over time.
  Less Relevant (TOE TAG 3 to 1):
    Project Management #3. The RAPTOR business case performed by the VHA Innovation Selection Board estimates substantial tangible and intangible cost/resource savings due to efficiency in workflow.

Technology
  Very Relevant (TOE TAG 10 to 8):
    Open Source #1. Open source software misconceptions persisted through the DevOps process.
    Open Source #2. Use of open source has no adverse impact on the project schedule and budget.
    OI&T #1. Security maintenance.
    OI&T #2. VA is phasing out in-house software development.
    VistA Modernization #1. RAPTOR is the highest radiology priority in the VistA Evolution Roadmap, the main innovation pipeline.
    VistA Modernization #2. Shulkin scrapped VistA modernization.
  Relevant (TOE TAG 7 to 4):
    Open Source #3. RAPTOR's open source code was reused.
    OI&T #3. Approved architecture and components.
    OI&T #4. OI&T and VACI delays.
    VistA Modernization #3. Faulty cost information.
  Less Relevant (TOE TAG 3 to 1):
    VistA Modernization #4. The VISTA Intake Program.

Table 21 Specific RAPTOR failure findings

Dimensional blindness failure
  Explanation: Overlooking one or more dimensions, or not focusing on one or more dimensions soon enough.
  RAPTOR finding: DevOps - no focus on OI&T operations implementation after development.

Iteration failure
  Explanation: Improper balance between too much iteration and too few feedback loops.
  RAPTOR finding: Lack of schedule functionality, although all stakeholders wanted it.

Resource failure
  Explanation: Too few financial or human resources within the OIS to successfully generate, develop, and diffuse the innovation.
  RAPTOR finding: DevOps - no focus on implementation after development; 96% of all facilities have a severe staffing shortage.

Representativeness failure
  Explanation: Improper stakeholder group representativeness, a non-representative organization or individual for the group, or a non-representative individual for the organization.
  RAPTOR finding: Change of Chief Radiologist sponsor; OI&T non-representation during DevOps.

Openness failure
  Explanation: Improper balance between consulting and participating with too many stakeholders.
  RAPTOR finding: DevOps - non-representative OI&T.

Cooperation failure
  Explanation: Too few strong ties in the innovation network, leading to, for example, trust issues and difficulties in cooperation.
  RAPTOR finding: Difficulties in cooperation between Dev & Ops, VACI, and OI&T.

Lock-in failure
  Explanation: Too many strong ties, leading to, for example, "groupthink," resulting in myopia and inertia within the innovation network.
  RAPTOR finding: Strong groupthink culture; whistleblower retaliation; anti-OSS sentiment.

Hard institutional failure
  Explanation: The lack or underdevelopment of formal arrangements, e.g., collaboration contracts, IP arrangements, and non-disclosure agreements.
  RAPTOR finding: The semiformal VACI organization tried to work between developers and OI&T and failed; unnecessary delays; lack of executive leadership stability.

Soft institutional failure
  Explanation: The lack or non-alignment of informal arrangements, e.g., shared vision, social values, culture and norms, mutual trust, goals of the different partners, and business models.
  RAPTOR finding: The semiformal VACI could not change the rules-based culture.

Capacity failure
  Explanation: The lack of certain capacities of the innovation organization to maximally profit from the OIS, e.g., absorptive capacity or network management capacity.
  RAPTOR finding: OI&T controls network management; the Ops transition was not completed by VACI.
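As a minimal illustration of the TOE TAG relevancy bucketing used in Table 22, the sketch below maps scores to the three columns (8 to 10 very relevant, 4 to 7 relevant, 1 to 3 less relevant). The findings shown are a small sample from the matrix; the function and field names are my own, not part of the framework.

```php
<?php
// Minimal sketch of the TOE TAG relevancy bucketing used in Table 22:
// scores of 8-10 are very relevant, 4-7 relevant, and 1-3 less relevant.
// Function and field names are illustrative, not part of the framework.
function relevancyBucket(int $score): string
{
    if ($score >= 8) {
        return 'Very Relevant';
    }
    return $score >= 4 ? 'Relevant' : 'Less Relevant';
}

// A small sample of findings from Table 22.
$findings = [
    ['dimension' => 'Technology', 'finding' => 'Open source misconceptions', 'score' => 8],
    ['dimension' => 'Technology', 'finding' => 'RAPTOR code reuse',          'score' => 7],
    ['dimension' => 'Technology', 'finding' => 'VISTA Intake Program',       'score' => 1],
];

foreach ($findings as $f) {
    printf("%-11s | %-27s | %s\n", $f['dimension'], $f['finding'], relevancyBucket($f['score']));
}
```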
Chapter 5: Conclusions

Although the VA has made four attempts to fully modernize VistA, it has been unable to do so over the past twenty years. On April 4, 2019, the week I gave my dissertation proposal, Carol Harris, director of IT management at the GAO (GAO, 2019), testified to Congress: "From 2001 through 2018, VA pursued three efforts to modernize its health information system, the Veterans Health Information Systems and Technology Architecture (VistA). However, these efforts experienced high costs, challenges to ensuring interoperability of health data, and ultimately did not result in a modernized VistA. Regarding the department's fourth and most recent effort, the Electronic Health Record Modernization, GAO recently reported (GAO, 2019) that the governance plan for this program was not yet defined." This testimony articulates the recent history of the inability of VA HIT to innovate.

The central question I ask is, "Can healthcare IT at the VA be healed?" My findings show that the sources of failure within the VA lie in people, process, and technology. My findings are scored for relevancy based on my twenty years of experience in health and engineering science organizations that have exhibited behaviors indicative of those found in failure groups. I show that there are many factors currently keeping the VA in chaos and away from a stable and healthy environment. I have found individual answers to my research questions in the categories of people, process, and technology.

Behavioral Integrity: Walking the Talk - Authenticity

I found that human behavior in corporate and government settings has a huge impact on how an organization's culture and methods of communication negatively influence innovation. My findings show that talking about changes or culture is easy within the VA, but putting them into practice is more difficult. When the VA says all the right things but does not act on them, employees pick up on this and become disengaged. It is important to practice what you preach; otherwise, confusion and resentment build up, leading to the possibility of a toxic environment. This has been the situation at the VA over the past few years.

Mixed external communications

The GAO (2015) has identified several issues at the VA that result in a lack of clarity, poor management, and weak oversight. These gaps include ambiguous policies and inconsistent processes, inadequate oversight and accountability, information technology challenges, inadequate training, and unclear resource needs and allocation priorities. Understanding why you are doing something and what you are working towards is key to being engaged in your work; knowing how one's tasks contribute to an overall goal really helps employees stay motivated. I found that VACI uses social media communications to publicize the positive aspects of transformation, but this propaganda distorts reality and serves as marketing against the negative press and scandals being reported.
The content used in social media does not accurately reflect true progress within the VA, which misleads the public and demoralizes those who experience roadblocks to innovative progress inside the agency. This highlights a lack of the transparency and openness that was espoused in the initial VACI rollout. The semiformal nature of the VACI organization has caused it to change its name, mission, and leadership every few years, resulting in a lack of follow-through on innovation projects and a lack of resources for them. The semiformal nature has also resulted in a lack of responsibility, making it easy to forget the past and turn away from VistA's historic successes. Regarding the VA's controlling culture, I use the theory of mission and rules cultures to show that the VA is unable to innovate within its guiding principles. Secretary McDonald joked that the VA was run by lawyers; he defined a rules-based organization as a safe place that never takes risks. VACI remains rules-based, thereby hampering innovation. In my findings, I contrast other innovative organizations as counterexamples. Retaliation against whistleblowers, groupthink, and forgetting the past are all signs of a culture under siege, which I have experienced firsthand. I also show by counterexamples that the VA's private-public-academic partnership failed to sustain innovation when compared with other examples. At the VA, I was part of three different divisions, three different project types, and three different software methodologies.

Insider Knowledge

In a sense, autoethnography is self-disclosure. I struggled with self-disclosure, particularly with deciding what and how to disclose about my research on project cancelation. Therefore, based on my understanding of autoethnography, I primarily relied on self-reports of the experience; my reports stemmed from insider knowledge that I have lived through. My research utilizes my academic tools and training: my knowledge of communication, ethnography, and observation, and of relationships, self-disclosure processes, and stigma management. I have observed how my autoethnography landed with a variety of audiences; I am the person, the researcher, who lived through and observed the experience. Thus, another joy of autoethnography: I can provide valuable insider insight not possible with other research techniques (e.g., surveys, others' self-reports). I can use autoethnography to provide an account of what happened during and after my speech act presentations, which is why I included specific interactions in this research.

So, what have I found, and why does it matter?

The headlines are startling. The VA is the largest integrated US healthcare system, and its budget and vacancies keep growing. Many HIT programs have been mismanaged, delayed, or flawed, resulting in the waste of hundreds of millions of taxpayer dollars. The GAO concluded that the VA is "susceptible to waste, fraud, and mismanagement." My research findings show the underlying issues that currently affect the VA. When I started the research, my plan was to study only the effectiveness and efficiency of the RAPTOR application. What I did not know when I started was the profound effect that people, process, and technology would have on the result of the project and the VACI program.
Although I have been studying the VA since 2002, I was looking narrowly at the technical literature to learn the VistA application and at the clinical literature to learn the radiology workflow. When I entered the PhD program and began my research journey, I quickly had to broaden my literature review to understand the human side of information technology: the culture, communications, and organization that impact the software. I had participated in the DevOps process, although it was never called that; I realized while writing my proposal that the DevOps label fit what I had experienced.

What do I know now that I did not know before?

I am backwards when compared to many typical iSchool students. I have much experience, but my academic background is in electrical engineering. My timeline of 30 years in IT is rich with worldly HIT experiences. I have been working on VistA since 2002 and began designing RAPTOR in 2010. One of my strengths is being aware of what I don't know, and one of my weaknesses is being unable to fake what I don't know. The biggest lesson was learning research methods for information studies. I learned about research methodology as my dissertation journey carried me through many different iterations: I attempted quantitative and qualitative studies, mixed methods, action research, and finally autoethnography. I had never heard of action research or autoethnography until I read about them a year into my PhD program. The coursework I took in HCI design and data visualization was perfect for summarizing what I achieved in the RAPTOR design; unfortunately, it came too late to inform the design of RAPTOR and was not applicable to writing this dissertation. Much of my integrated paper's initial literature collection focused on HCI and data visualization. After committing to studying the organization, I had to come up to speed in ethical management and organizational culture and policies. I have been called a decent writer for an engineer, but academic writing at this scale and level has been a challenge. Having the time to work through my research was a luxury not often afforded me in industry. As an engineer, I needed every minute to see through all the ramifications of committing to a design; in this case, a research design instead of software. It has been said that one of the hardest things to write about is yourself, let alone your own failures. Having spent many years of my professional life on a project that was canceled is a series of events that many people would rather repress than relive over 57,000 words.

Who should care?

One of the lessons I learned at the iSchool conference was how innovative it is to use autoethnography in information studies. There are few autoethnographic research papers in HIT, and I found it to be a worthwhile endeavor. Why are over 70% of HIT projects failures? I think that my systematic approach to understanding the lessons learned through failure is a good requiem for the future. Based on the VA's lack of openness and its forgetting and suppressing of past and current missteps, this research will not be welcomed officially. Unofficially, there are many constructive interpretations of the many failures at the VA. Adding my research to the record may not be welcome in some quarters, but bad decisions continue to be made. Whether through ignorance or through ignoring fair critique of these events, the VA continues to spend massive amounts of money to justify its poor decisions.
Limitations

This research is a personal exploration built from my memories of my work and a larger sociological understanding of the VA. I was not the only developer who worked on the VistA or RAPTOR software, so I solicited and received feedback from several individuals who were with me, relying on their constant review throughout my many drafts.

Methodological Limitations

After my integrated paper, I made the decision to continue this research as autoethnography. Not being able to use data-driven, quantitative findings, which is the norm for longitudinal studies, has an advantage in validating the hypothesis agnostically. Autoethnographic experiences may differ from one person to another depending on their role and their cultural perspective on the event or series of events. This evolution from action research had several impacts as well as limitations. This method assures readers that I am at the center of this and that the story is mine. Several key findings went through multiple feedback iterations. A good example of different perspectives making a more rounded finding is RAPTOR's premature publicity (People Finding, Communication #3). In my mind, the publicity from a well-respected industry media outlet (AuntMinnie.com) was a positive, in that it "pushed" the VA to move forward with building out the prototype. After several conversations with the lead radiologist innovator, he felt that, in hindsight, the story was premature and less positive than I had initially portrayed it. My amended finding includes both perspectives on the event. His radiologist perspective was that, since he was quoted in the article (included in Appendix III), he received many unwanted questions about the status of the software after it was "poised for take-off." It was poised but never flew. I spent much of this writing considering autoethnographic ethics. There were many examples and voices I could have included in my findings, but out of respect for the people involved, not all were included.

Relative Uniqueness of the RAPTOR test case

I have used the RAPTOR project as a case study for the entire 40-year VistA program, which comprises many diverse code module projects, each with its own story. There are many differences between the entire VistA multi-domain environment, scope, and history and RAPTOR, a modernized radiology workflow module. To improve the narrative and strengthen my findings, I have grouped VistA with RAPTOR. My finding (Technology Finding, VistA Modernization #2) places the cancelation of RAPTOR as a direct result of the cancelation of VistA modernization. It is much more shocking that VistA EHR modernization has been canceled than that its innovative radiology web module was. As a result, I feel that there are many commonalities in my intentions for writing this. There is a feeling of injustice about the cancelations. I point to the unrealized benefits of RAPTOR and the industry-wide recognition of VistA as a pioneering health information technology platform, as well as to helping others and hopefully bringing about change. The scale of injustice increases when considering the fraud, waste, and abuse of taxpayer resources and the limiting of our veterans' timely access to care, compliance with patient safety guidelines, and cost avoidance of unnecessary procedures. In psychological terms, I am feeling a grief similar to that of many VA stakeholders. The VistA hardhats are feeling their loss of identity; a person who loses their primary identity mourns a lost sense of self.
I am fortunate to have this research to understand my story and to create a new narrative. However, I see the hardhats on the Google Groups community forum writing that VistA will come back when the VA finally comes to its senses; the hardhats discuss every negative VA headline as a reason that VistA will return. Our VistA identity has been lost, and the grief is compounded by the lack of control we had over the decision. With every new finding, our grief and lost sense of self are mourned. Another loss being felt by the VistA community is a deep sense of disorientation due to unfulfilled expectations. We share a deep sense of unfairness due to the unexpected political shift behind that cancelation decision. Writing this autoethnography helped me deal with a lost sense of stability, and researching my findings helped me understand how the VA worked and the new reality of what could not be controlled.

Lack of previous studies in the research area

As shown in my bibliography, I found much research on the history of the VA and VistA. However, I found no academic research and no adjudicated information on VACI. My initial reasoning for this lack of information was that the subject is current, and contemporary organizations will not yet have academic research. However, over the journey of my research I found that the VA is uncooperative in sharing less-than-flattering data, tries to control the narrative, and forgets the past. This made researching and writing this dissertation more critical, to preserve the record for future research as a foundation to be built on. The information I have used to support my findings comes from diverse sources, collected over the past ten-plus years. A small sampling of the data collected and used includes correspondence, videos, websites, Google Groups, and PowerPoint presentations. The scope of my collection is detailed in my methodology. This series of events within the VA that impacted both RAPTOR and VistA is a common phenomenon in the private and public health IT sectors, and my goal is that, by conducting the research in this manner from a humanistic perspective, it will help other organizations identify root causes of unsustainable innovation environments earlier in their product innovation efforts.

Scope of discussion

The scope of my dissertation became wider as my research journey continued. As I documented, my research widened beyond my original RAPTOR project to the VACI organization within the VA. As I focused on people, process, and technology, I touched on many areas, including organizational behavior and public policy, in which I am not formally trained. In telling my story, I made the ethical choice not to include anyone whom I did not explicitly inform about my research. While I included correspondence, I removed all identifiable information and edited for brevity and clarity.

Contributions to Knowledge

Systems Thinking

Based on my 35 years of electronic systems engineering, I offer the lens of systems thinking as a framework for looking at challenges and failures. My systematic look at VACI includes the interdependencies between people, process, and technology, and I have examined specific parts, such as public-private-academic partnerships, the DevOps process, and open source technology. I have taken an in-depth look at the RAPTOR project as a bottom-up view over an extended period of the software lifecycle.
Some of the systems engineering questions that I ask include: What additional insight into the VACI community of practice, its processes, and its considerations can be realized by this project? How effective are the VACI innovation processes? Can the VA use this research to gain insight into its VistA modernization or commercialization strategy? Can the VA use this research to modify its decision to retire VistA? Can I assess the broader impact of the research in the health care marketplace? Can this be a use case demonstrating open source, open standards development to achieve a customized, license-free, stable enterprise solution economically?

Autoethnography Process

In the methods section, I introduce the Autoethnographic Research Process (Figure 30). I often think visually and systematically, and I was surprised that I could not find a process diagram illustrating the autoethnographic research process. Once I understood and committed to using autoethnography, I started by putting myself into the reflective tasks. I then came across the Kolb Learning Cycle and how it is used in education, and I thought it could be adapted for autoethnography. The reflection tasks I performed fit well with writing my narrative.

Relevancy Criteria

A toe tag is a historical artifact signifying the identification of death. As shown in the figure below, it typically carries descriptors of the deceased and the cause of death (if known).

Figure 57 TOE TAG

In my literature review, I attempted to find the theory that best fit my understanding of my situation. Because VACI is a semiformal organization, I had difficulty matching information system theory to my observations. The theory journey led me to TOE TAM, a hybrid of two distinct enterprise adoption models. Once I came upon failure groups in organizational innovation systems, I used inductive synthesis to gain insight into my observations, a synthesis that can also be used in other situations. The key insight was that information system theory alone was not enough. Once I lined up ethical management with OIS, it was clear how well failures of ethics lined up with breakdowns of systems. Table 13 presents the findings relevancy criteria, named TOE TAG. TOE TAG is a summary of organizational innovation system failure groups blended with ethical management. These criteria can be used in ethical management audits; they are richer than either the OIS groups or the ethical criteria alone.

Recommendations for Future Research

When I was planning to use action research to measure the effectiveness of the RAPTOR tool, I initially wrote many pages on the HCI and data visualization design and development decisions. I had access to several sites' sets of data on the current manual process. Radiology workflow improvement is an area of future research that I, or the VA, may explore soon. At my Doctoral Consortium presentation for the Conference on Health IT and Analytics (CHITA), this was a recommendation of Dr. Agarwal, Director of the Center for Health Information & Decision Systems (CHIDS). Dr. Agarwal suggested that a systemic overview of radiology workflow was needed in light of new technologies. A future research suggestion made by the lead radiologist innovator would be to include information from other VACI portfolio projects, for example the Daily Plan. I did not pursue this due to several limitations. I requested the directory of projects through my FOIA requests which, as I noted, have not been granted.
Another limitation was that refocusing away from RAPTOR would have meant changing the autoethnographic case study. However, this makes perfect sense as a follow-up, to validate my findings or to provide counterexamples to my experiences, especially in the DevOps process. If I can gain access to many different projects, this could enhance the scope of discussion; other VACI projects would improve the credibility of my findings. An obvious avenue for future research would be to apply the TOE TAG criteria to another case study. As discussed by my committee, examples include other VA agencies, large IT shops, and other large bureaucratic organizations. Other case studies would improve the credibility of my relevancy criteria. At the iConference, I discussed with Dr. Irene Lopatovoska of the Pratt Institute my model of autoethnography as a feedback system. She noted that this model does not include emotion or non-linear recall of events, and she suggested a potential collaboration on future research to refine and test the process model.

Self-Reflection

This autoethnographic research was a challenge of passion and patience. Twenty-five years after completing my master's, I returned to school and felt that RAPTOR would be the perfect vehicle for research. After years of design and development gestation, it was on the cusp of being used clinically. Four years later, I can look back at a series of setbacks through which I had to persevere. When the cancelation of RAPTOR and VistA modernization made my initial research and methodology doubtful, I continued investigating until I found an ethnographic approach with which I could continue my doctoral journey. The autoethnographic methodology allowed me to come to terms with my fate. The analysis and interpretation of my findings required my memory and insight into several different approaches and theories. With few guideposts along the way, I had to scavenge bits of information and shape existing literature into a narratively meaningful autoethnography. I was fortunate to be able to collect the unstructured fragments of my experience and creatively weave a narrative. My personal data interpretation was built with a systems theory framework in mind, and in the end, I had to adapt theory out of ethics and failure.

Appendix I: The Top Five Medical-imaging IT Projects of 2012

The following is an excerpt of the article "The Top Five Medical-Imaging IT Projects of 2012," originally edited by C. Proval. The original article ran in the Radiology Business Journal (RBJ). It is edited here to include only RAPTOR and not the other four projects. For the related People finding and its implication, see Communication #2.

"The Top Five Medical-imaging IT Projects of 2012

One hallmark unites the winning entries in the top five medical-imaging IT projects of 2012, cosponsored by Radiology Business Journal and the Society for Imaging Informatics in Medicine (SIIM): Each project represents a view beyond the traditional acquisition, archiving, and communication of radiological images. All of the winning entries take a global view of medical imaging: mining the data in the DICOM headers and dose sheets to produce a relevant number for patients' exposure to radiation; solving the technical and operational problems of including non-DICOM images in PACS; creating a nonlinear, flexible workflow layer that can tell the radiologist whether a brain tumor has grown before he or she looks at the image, as well as creating a worklist for a geographically disparate organization; solving the interoperability issues inherent in the movement of pathology images to create a digital consultation portal for pathology; and scouring the electronic medical record (EMR) for the data required to create a safe protocol for a study.
exposure to radiation; solving the technical and operational problems of including non-DICOM images in PACS; creating a nonlinear, flexible workflow layer that can tell the radiologist whether a brain tumor has grown before he or she looks at the image, as well as creating a worklist for a geographically disparate organization; solving the interoperability issues inherent in the movement of pathology images to create a digital consultation portal for pathology; and scouring the electronic medical record (EMR) for the data required to create a safe protocol for a study. 214 The entries were judged on their innovation/ingenuity, on whether they met critical/urgent/unmet needs, on whether they improved quality, on the product/tool/idea validation or evaluation, and on the universality of the application. All six judges are members of the SIIM board: Donald K. Dennison is an imaging-vendor executive; J. Raymond Geis, MD, is a radiologist with Advanced Medical Imaging Consultants, PC (Fort Collins, Colorado); David S Hirschorn, MD, is director of radiology informatics at Staten Island University Hospital in New York; Elizabeth A. Krupinski, PhD, FSIIM, is a research professor in the departments of radiology and psychology at the University of Arizona; Wyatt M. Tellis, PhD, is an informaticist in the radiology and biomedical imaging department at the University of California?San Francisco; and James T. Whitfill, MD, is CMIO of Southwest Diagnostic Imaging, Ltd (Scottsdale, Arizona). ? ?The Radiology Protocol Tool and Recorder (RAPTOR) System Medverd, a radiologist on staff at Washington?s VA Puget Sound Health Care System, Seattle Division, also holds a faculty appointment at the University of Washington. He has long held the belief that making protocols for advanced imaging exams is undervalued in private, public, and university settings. ?The process is not optimized,? he says. ?You get a piece of paper with one or two lines on it providing the clinical provider?s problem and questions to be answered; then, when one wants more information, it?s often time consuming and cumbersome. If you talk to any radiologist who has protocol responsibility for cross- sectional imaging, he or she will tell you there?s a constant battle between efficiency and effectiveness for that task.? Using funding from the VA Innovations Initiative and leveraging VA IT resources, Medverd designed a prototype for filling those gaps with 215 information from the EMR, with an extensible design that could be rolled out nationally. Because the VA has a legacy health IT architecture with a vast repository of health information, Medverd approached the project with the intention of designing the application in layers. He planned to use Web services, for example, to virtualize the electronic health record, so that a Web application (as opposed to software that needs to be installed on every user?s computer) could be used. ?With Web services, all we need to do is build a sort of data-adapter layer into the content-management system for the presentation of the data,? he explains. A happy discovery was the availability of the VA?s Medical Domain Web Services (MDWS), which Medverd and his team used to virtualize the health records. ?Frankly, I was not aware of it when I first submitted the idea, and I thought we?d have to build it ourselves,? he says. ?The discovery of MDWS was great because somebody else had already done the work, and that?s the advantage of working in layers. 
That MDWS layer provides the interactivity with the legacy archives that we would have had to build, if it weren't there." For a Veterans Integrated Service Network (VISN) to implement RAPTOR, the VISN's protocol library would be uploaded to the RAPTOR server. Through the uploading process, the VISN would also cross-link the protocol library with commonly accepted naming conventions in the RSNA's RadLex. Medverd's goal of improving efficiency and patient safety throughout the VA system appears within reach. "Given the amount of enthusiasm folks have had, I'm very optimistic that we're going to move forward," he says.

Problem/Objective: The paper-based workflow predominantly used to create protocols for advanced medical imaging at Veterans Health Administration (VHA) facilities is subject to numerous process errors. The RAPTOR system leverages the VHA's EMR and open-source content-management frameworks to provide an efficient Web environment, with decision support for contrast risk assessment and protocol assignment.

Solution: The RAPTOR system extracts relevant information for each patient from the EMR and displays it next to the imaging requisition. The Web interface provides access from a variety of systems and includes features to sort the worklist, flag relevant allergy history and renal-function tests, suggest relevant department-approved imaging protocols, suggest standardized pre- and post-exam hydration, and suggest premedication for those with a history of contrast reactions. This offers a significant advantage over the prior system by ensuring legibility, standardization, prioritization, multiuser access, and improved patient safety. Additional features of RAPTOR will include secure messaging, restricted ordering access for specialized studies, recognition of order duplication, and logging of physicians' and staff members' input into the protocol decision-making process. While this solution will initially be deployed as a pilot at selected VHA facilities, the goal will be deployment across the entire VHA enterprise.

Results: A review of the current paper-based protocol workflow at one VHA facility evaluated 341 MRI orders over the course of a month, of which 61% were for neuroradiology, 12% were for musculoskeletal imaging, and 6% were for body imaging. The average paper protocol required an elapsed time of 11 days from the time that the study was ordered to the day that the patient was successfully contacted to schedule the exam. It was found that approximately 15% of exams for which protocols had been completed were never performed; for 1%, orders were duplicated but both had protocols prepared; and for 2.5%, protocols were unsigned. Rare (but observed) clerical errors, such as mismatched patient information, further corrupted this system. RAPTOR prototype testing suggests significant process improvement due to real-time data-query capabilities. Unproductive and redundant protocol-making efforts are minimized, the speed of the protocol process is increased due to prioritization and distribution of work within a multiuser-accessible electronic work list, fulfillment of enterprise quality and safety goals is improved due to automated identification and flagging of patients at risk for harm from the performance of advanced medical imaging, and ambiguity in medical-decision responsibility is eliminated through the capture of documentation logs and electronic signatures."

Cheryl Proval is editor, Radiology Business Journal.
Kris Kyes, technical editor, and Thanh Le, editorial coordinator, Radiology Business Journal, contributed to this article.

Appendix II: VACI "good news" story on RAPTOR

The following is an excerpt of the contents of an internal VACI website. I assisted in writing this content. For the related Process finding and its implication, see Project Management #2.

""Wow." "This is excellent." "I want to use this now." These are some of the enthusiastic and complimentary comments that have been received from the VHA National Radiology Chief and radiologists in VHA facilities across the nation in response to demonstrations of the recently completed Radiology Protocol Tool Recorder (RAPTOR) prototype software funded through the VHA Innovation Program. RAPTOR was even named one of the "Top 5 Medical Imaging IT Projects of 2012" by the Society for Imaging Informatics in Medicine (SIIM) and Radiology Business Journal (RBJ). RAPTOR is tailored to optimize advanced medical imaging protocoling and performance at VHA facilities. VA radiologists review all clinician orders for advanced diagnostic imaging (Computerized Tomography, Magnetic Resonance Imaging, and Nuclear Medicine tests) and assign specific protocol instructions directing how each examination must be performed so that the clinical questions are answered. This is an error-prone, highly manual process that is paper based and can take weeks to complete. VA radiologists frequently do not receive enough information on exam requisitions to optimize the quality and safety of their protocol decisions. Efforts to augment the clinical detail provided by the ordering provider can be cumbersome and negatively impact radiologist productivity and imaging department efficiency. Similarly, paper-based systems have inherent inefficiencies compared to electronic solutions. By leveraging open source tools and standards, RAPTOR has capitalized on opportunities for interactivity between VHA information systems to maximize radiologist protocoling effectiveness while preserving productivity and simultaneously assuring safety through automated identification of risks and contraindications of some imaging studies for some patients. RAPTOR leverages VHA's Class I Medical Domain Web Services (MDWS) to interact with and extract information from the Veterans Health Information Systems and Technology Architecture (VistA). MDWS is a suite equipped with the capacity to virtualize any legacy VistA Remote Procedure Call (RPC) as a web service. Beyond its promised patient care and department efficiency benefits, RAPTOR represents an early use case of how the VA (and the government in general) can exploit mature open source, open standards application development to modernize its information systems in a relatively short time, with zero licensing costs, low administrative burden, and in accordance with the American Council for Technology - Industry Advisory Council VistA Modernization Report. To date, the RAPTOR project has been invited to present at both the upcoming Drupal Government Days national meeting and the SIIM international meeting."

Appendix III: RAPTOR VA protocol software poised for takeoff

The following unedited article was written by C. Keen for AuntMinnie.com and published on the web on March 29, 2013; only scanning artifacts have been corrected. For the related People finding and its implication, see Communication #3.

RAPTOR VA protocol software poised for takeoff

By Cynthia E. Keen,
Appendix III: RAPTOR VA protocol software poised for takeoff

The following unedited article was written by C. Keen in AuntMinnie.com and published on the web on March 29, 2013. For the related People finding and its implication, see Communication #3. A short sketch illustrating the time-stamped audit trail the article describes follows the excerpt.

RAPTOR VA protocol software poised for takeoff
By Cynthia E. Keen, AuntMinnie.com staff writer

March 29, 2013 - After more than a year of laboratory development and testing, radiologists at the Veterans Affairs' Puget Sound Health Care System in Seattle are hopeful that software called RAPTOR, designed to help manage medical imaging protocol selection and workflow, will be authorized to move into a real-world pilot program.

The Radiology Protocol Tool and Recorder (RAPTOR) software is designed to render the labor-intensive process of protocol assignment and the often paper-based workflow in imaging departments obsolete, according to its developers. It is programmed to search for information from a patient's medical record that is important for protocol decisions and display it in a dashboard synchronized with an electronic protocoling tool.

Based on a radiologist's individualized parameters or a radiology department's requirements, RAPTOR will automatically extract and prioritize orders and relevant information from the VA's VistA electronic medical record (EMR) using Medical Domain Web Services (MDWS). Information required for protocol decisions, such as patient allergies, renal function, clinician contact information, key clinical notes, specific lab values, and radiology reports, is automatically populated into the dashboard and can be easily accessed by authorized users.

"RAPTOR is designed to provide radiologists with seamless, just-in-time patient information, not only to assign protocols but also to track exam acquisition and study interpretation phases of the workflow," explained lead developer Dr. Jonathan Medverd, who is also an assistant professor in the department of radiology at the University of Washington.

"Data moves instantly to the next staff member responsible and is distributed within a multiuser-accessible work list," he said. "Priorities can be assigned on the fly. Everything is date-stamped and electronically signed. Unlike paper systems, nothing gets misplaced."

Protocol assignments

Although protocol assignment critically affects quality and safety within a radiology department, its importance can be overlooked, Medverd said. In hospitals with paper-based records, radiologists may not have access to search for data they need for protocol assignments. And while they may have access to EMRs, this usually means logging into another computer and then searching through electronic data. Patient information is often available, but quick access is rare.

"No matter how well-intentioned radiologists may be to select the best and safest protocol for a patient, they may not have time to be as thorough as they would like to be," he said. "Protocoling is a step that you try to do well, but you try to do it as fast as possible because you have to get on to the business of interpreting images. Because few imaging departments measure the quality of protocol assignment, they do not know how well they are doing. You cannot assess performance - and patient safety - when it is not measured. Nor can you assess department efficiency and productivity with respect to protocol assignment."

RAPTOR not only provides information and keeps a detailed record of it and decisions made, it facilitates communication with the ordering physician and other radiologists by secure messaging, Medverd said. It maintains a library of the department's standardized imaging protocols, both for acquisition of imaging and for application of pre- and post-test hydration or medications.
In fact, RAPTOR suggests the use of pre- and post-exam hydration or medication for patients, when appropriate, and recommends the best standardized protocols. The software can screen for history of contrast reaction and report it immediately. It will also identify when written informed consent is needed for administration of intravenous contrast agents. It can even automatically recognize unauthorized orders and recommend refusal, according to Medverd.

Eliminating inefficiencies

RAPTOR is designed to eliminate the inefficiencies of paper-based workflow and workflow environments where patient information is contained in siloed health IT systems. It will also provide a comprehensive, time-stamped permanent record of all activities related to protocol assignment - a record that is seldom acquired and retained by a RIS, he said.

RAPTOR is currently in a process to receive approval to transition from the "laboratory sandbox" to a pilot program at the Seattle VA's radiology department and three other West Coast VA facilities. Its accuracy and performance will be measured against baseline data, such as time expended from an order to protocol assignment.

Funding from the VA Innovation Initiative was used for the project, according to Medverd. He said that the office is facilitating the evaluation and advancement of RAPTOR. If the software works as planned, Medverd anticipates that it will eventually be rolled out for use in every imaging department of the VA healthcare system.

"We have a record of innovation," he concluded in his interview with AuntMinnie.com. "The VA developed one of the earliest EMRs for its medical centers as well as deployed one of the first filmless radiology departments in the world."

Information on RAPTOR was also presented at RSNA 2012 and in an article published online January 4 in the Journal of Digital Imaging.

-- Copyright © 2013 AuntMinnie.com
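A recurring technical point in this article is that every protocoling action leaves a date-stamped, electronically signed, permanent record. The sketch below illustrates that append-only audit-trail idea in Python; the record fields and the hash chaining are my own illustrative assumptions, not a description of RAPTOR's implementation.

```python
# Illustrative append-only audit trail: each entry is timestamped and chained
# to the previous entry's hash, so tampering or loss is detectable.
# The fields below are assumptions for illustration, not RAPTOR's schema.
import hashlib
import json
from datetime import datetime, timezone


class ProtocolAuditLog:
    def __init__(self):
        self.entries = []

    def record(self, order_id: str, actor: str, action: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "order_id": order_id,
            "actor": actor,                  # who electronically "signed" the step
            "action": action,                # e.g., "protocol assigned"
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)
        return entry


log = ProtocolAuditLog()
log.record("MRI-12345", "Dr. Example", "protocol assigned: brain MRI with contrast")
```

Unlike a paper folder, such a structure answers "who decided what, and when" for every order, which is exactly the accountability gap the article says a conventional RIS seldom closes.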
Bibliography

Methods and Methodology

Anderson, L. (2006). Analytic Autoethnography. Journal of Contemporary Ethnography, 35(4), 373-395.
Andrew, S. (2017). Searching for an Autoethnographic Ethic. Routledge. New York and London.
Atkinson, P. (2006). Rescuing Autoethnography. Journal of Contemporary Ethnography, 35(4), 400-404.
Ballard, S. D. (2015). Action Research: A Personal Epiphany and Journey with Evidence-Based Practice. Knowledge Quest, 43(3), 44-48.
Bahk-Halberg, J. H. (2018). What's Your Story? Using Interviews and Narrative in Academic Research. International Journal on Studies in English Language and Literature (IJSELL), 6(9).
Bannan-Ritland, B., & Baek, J. Y. (2008). Investigating the Act of Design in Design Research: The Road Taken. Handbook of Design Research Methods in Education: Innovations in Science, Technology, Engineering, and Mathematics Learning and Teaching, 299-319.
Barab, S., & Squire, K. (2004). Design-Based Research: Putting a Stake in the Ground. The Journal of the Learning Sciences, 13(1), 1-14.
Barnet, S., Bedau, H. A., & O'Hara, J. (2005). From Critical Thinking to Argument: A Portable Guide. Bedford/St. Martin's.
Blum, F. H. (1955). Action Research--A Scientific Approach. Philosophy of Science, 22(1), 1-7.
Bruni, N. (2002). The crisis of visibility: Ethical dilemmas in autoethnographic research. Qualitative Research Journal, 2(1), 24-33.
Burns, D. (2014). Action Research. Institute of Developmental Studies at the University of Sussex, 12(1), Brighton, UK, Sage Journals, 1-16. Accessed on June 24, 2019 at https://journals.sagepub.com/.
Carter, M., Compeau, D., Kennedy, M. I. L., & Schmalz, M. (2017). The content and context of identity in a digital society. 3245.
Casertano, A. (2018). Integrating the US Department of Veterans Affairs Radiology Protocol Workflow with Information Science. UMD Integrative Paper.
Chang, H. (2016). Autoethnography as Method (Vol. 1). Routledge. New York and London.
Cornu, A. (2006). Theological reflection and Christian formation. Journal of Adult Theological Education, 3(1), 11-36.
Costello, J., Feller, J., & Sammon, D. (2016). On the Road to Trusted Data: An Autoethnography of Community Governance and Decision-Making. Journal of Decision Systems, 25(Sup1), 182-197.
Cousins, K. C., & Robey, D. (2005). Human agency in a wireless world: Patterns of technology use in nomadic computing environments. Information and Organization, 15(2), 151-180.
Creswell, J. W. (1994). Research Design: Qualitative & Quantitative Approaches. Sage Publications, Inc. Beverly Hills, CA, USA.
Crotty, M. (1998). The Foundations of Social Research: Meaning and Perspective in the Research Process. Sage Publications, Inc. Beverly Hills, CA, USA.
Davis, F. D. (1985). A Technology Acceptance Model for Empirically Testing New End-User Information Systems: Theory and Results (Doctoral dissertation, Massachusetts Institute of Technology).
Delamont, S. (2007, September). Arguments against auto-ethnography. Paper presented at the British Educational Research Association Annual Conference (Vol. 5, p. 8).
Denzin, N. K. (2013). Interpretive Autoethnography (Vol. 17). Sage Publications. Beverly Hills, CA, USA.
Dusick, D. (2014). Writing the Theoretical Framework. BOLD Educational Software.
Ellis, C. (2016). Revision: Autoethnographic Reflections on Life and Work. Routledge. London and New York.
Fergusson, L., Van Der Laan, L., & Baker, S. (2019). Reflective practice and work-based research: a description of micro- and macro-reflective cycles. Reflective Practice, 20(2), 289-303.
Frost, P. J., & Stablein, R. E. (Eds.). (1992). Doing Exemplary Research. Sage Publications. Beverly Hills, CA, USA.
Ghita, C. R. (2019). In Defence of Subjectivity: Autoethnography and Studying Technology Non-Use.
Gorman, G. E., Clayton, P. R., Shep, S. J., & Clayton, A. (2005). Qualitative Research for the Information Professional: A Practical Handbook. Facet Publishing. London.
Gray, D. E. (2013). Doing Research in the Real World. Sage Publications. Beverly Hills, CA, USA.
Greenwood, D. J. (2006). Introduction to Action Research: Social Research for Social Change (2nd Ed.). Thousand Oaks, CA: SAGE Publications, Inc.
Grover, V., & Lyytinen, K. (2015). New State of Play in Information Systems Research: The Push to the Edges. MIS Quarterly, 39(2), 271-296.
Herr, K., & Anderson, G. L. (2014). The Action Research Dissertation: A Guide for Students and Faculty. Sage Publications. Beverly Hills, CA, USA.
Herr, K., & Anderson, L. (2014). Designing the Plane While Flying It: Proposing and Doing the Dissertation. The Action Research Dissertation: A Guide for Students and Faculty, 2nd Ed. (pp. 83-110). Sage Publications. Beverly Hills, CA, USA.
Husserl, E. (1970). The Crisis of European Sciences and Transcendental Phenomenology: An Introduction to Phenomenological Philosophy. Northwestern University Press. Evanston, IL.
Jackson, A. Y., & Mazzei, L. A. (2008). Experience and "I" in Autoethnography: A Deconstruction. International Review of Qualitative Research, 1(3), 299-318.
Jacobsen, K. H. (2011). Introduction to Health Research Methods: A Practical Guide. Sudbury, MA: Jones & Bartlett Learning.
Jansen, B. J. (2006). Search Log Analysis: What It Is, What's Been Done, How to Do It. Library & Information Science Research, 28(3), 407-432.
Jones, S. H., Adams, T. E., & Ellis, C. (Eds.). (2016). Handbook of Autoethnography. Routledge. London and New York.
Kelly, A. E., Lesh, R. A., & Baek, J. Y. (Eds.). (2014). Handbook of Design Research Methods in Education: Innovations in Science, Technology, Engineering, and Mathematics Learning and Teaching. Routledge. London and New York.
Kolb, D. A. (1984). Experiential learning: Experience as a source of learning and development. New Jersey: Prentice Hall.
Kumar, R. (2019). Research methodology: A step-by-step guide for beginners. Sage Publications Limited.
Lapadat, J. C. (2017). Ethics in Autoethnography and Collaborative Autoethnography. Qualitative Inquiry, 23(8), 589-603.
Lartey, E. (2000). Practical theology as a theological form. The Blackwell Reader in Pastoral and Practical Theology, 128-134.
Layder, D. (1998). Sociological practice: Linking theory and social research. Sage Publications. Beverly Hills, CA, USA.
Lee, J. (2014). Understanding States' Failure in Sustained Innovation from the Diffusion Perspective: The Empirical Study of the Diffusion of EFOIA in the US States (Doctoral dissertation, Arizona State University).
Machi, L. A., & McEvoy, B. T. (2016). The Literature Review: Six Steps to Success. Corwin Press. Sage Publications. Beverly Hills, CA, USA.
Maxwell, J. A. (2012). Qualitative Research Design: An Interactive Approach (Vol. 41). Sage Publications. Beverly Hills, CA, USA.
Morgan, D. L. (1998). Practical Strategies for Combining Qualitative and Quantitative Methods: Applications to Health Research. Qualitative Health Research, 8(3), 362-376.
Motulsky, H. (2014). Intuitive Biostatistics: A Nonmathematical Guide to Statistical Thinking. Oxford University Press, New York, USA.
Patton, M. Q. (2008). Utilization-Focused Evaluation. Sage Publications. Beverly Hills, CA, USA.
Pickard, A. J. (2013). Research Methods in Information. Facet Publishing. London, UK.
Reason, P., & Bradbury, H. (Eds.). (2001). Handbook of Action Research: Participative Inquiry and Practice. Sage Publications. Beverly Hills, CA, USA.
Reed-Danahay, D. (1997). Auto/Ethnography. New York: Berg.
Reed-Danahay, D. (2009). Anthropologists, Education, and Autoethnography. Reviews in Anthropology, 38(1), 28-47.
Rowe, F. (2012). Toward a richer diversity of genres in information systems research: new categorization and guidelines. European Journal of Information Systems, 21, 469-478.
Strike, K., & Posner, G. (1983). Types of Synthesis and Their Criteria. In S. A. Ward & L. J. Reed (Eds.), Knowledge Structure and Use: Implications for Synthesis and Interpretation (pp. 344-361). Philadelphia: Temple University Press.
Stringer, E. T. (2013). Action Research. Sage Publications. Beverly Hills, CA, USA.
Taksa, I., Spink, A., & Jansen, B. J. (Eds.). (2009). Web Log Analysis: Diversity of Research Methodologies. In Handbook of Research on Web Log Analysis (pp. 506-522). IGI Global.
Te'eni, D. (2001). A cognitive-affective model of organizational communication for designing IT. MIS Quarterly, 25(2), 251-312.
Tolich, M. (2010). A Critique of Current Practice: Ten Foundational Guidelines for Autoethnographers. Qualitative Health Research, 20(12), 1599-1610.
Twining, P., Heller, R. S., Nussbaum, M., & Tsai, C. C. (2017). Some guidance on conducting and reporting qualitative studies.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User Acceptance of Information Technology: Toward a Unified View. MIS Quarterly, 27(3), 425-478.
Whetten, D. (2002). Modeling-as-theorizing: A systematic methodology for theory development. Essential Skills for Management Research, 45.
Wilson, C. (2013). Interview Techniques for UX Practitioners: A User-Centered Design Method. Morgan Kaufmann, Waltham, MA.
Wired (2016). MIT Media Lab's Journal of Design and Science Is a Radical New Kind of Publication. Accessed online at https://www.wired.com/2016/03/mit-media-labs-journal-design-science-radical-new-kind-publication/ on January 7, 2019.
Yin, R. K. (2017). Case Study Research and Applications: Design and Methods. Sage Publications. Beverly Hills, CA, USA.

Computational Archival Science and Policy Literature

Andes, N. (1988). The Commodification of Government Information: A Summary and Analysis of the Reagan Administration's Restrictions on Federal Information. Government Publications Review, 15(5), 451-461.
Apache Software Foundation, http://apache.org/ Accessed June 10, 2019.
Drupal, http://drupal.org/ Accessed June 10, 2019.
Drupal Gov Days has been changed to https://www.drupalgovcon.org/ Accessed June 10, 2019.
Eisenbeis, K. (1988). An NTIS Case Study: A Skirmish in the Privatization Wars. Government Publications Review, 15(4), 355-369.
Esteva, M., & Marciano, R. (2018). Computational Archival Science (CAS): From Research to Practice. (Online). Available: http://dcicblog.umd.edu/cas/#workshops. Accessed January 7, 2019.
Fiorito, R., & Kollintzas, T. (2004). Public Goods, Merit Goods, and the Relation Between Private and Government Consumption. European Economic Review, 48(6), 1367-1398.
Holeywell, R. (2013, February). Why Isn't the U.S. Better at Public-Private Partnerships? Governing (Online). Available: http://www.governing.com/topics/finance/gov-public-private-partnerships-in-america.html Accessed January 7, 2019.
Hasan, R., Sion, R., & Winslett, M. (2007). Introducing Secure Provenance: Problems and Challenges. In Proceedings of the 2007 ACM Workshop on Storage Security and Survivability (pp. 13-18). ACM.
Ito, J. (2016). Design and Science. MIT Journal of Design and Science weblog, http://jods.mitpress.mit.edu/pub/designandscience. Accessed January 7, 2019.
Ito, J. (2014). Antidisciplinary. Weblog, accessed at https://joi.ito.com/weblog/2014/10/02/antidisciplinar.html, https://doi.org/10.31859/20141002.1939. Accessed January 7, 2019.
Jansen, G., Marciano, R., Padhy, S., & McHenry, K. (2016). "Designing Scalable Cyberinfrastructure for Metadata Extraction in Billion-Record Archives." 13th International Conference on Digital Preservation, iPRES 2016, Bern, Switzerland, Oct. 4, 2016.
Kettl, D. F. (1993). Sharing Power: Public Governance and Private Markets. Washington, D.C.: The Brookings Institution.
Kettl, D. F. (2002). Accountability Challenges of Third-Party Government. In P. L. Posner & O. V. Elliott (Eds.), The Tools of Government: A Guide to the New Governance (pp. 523-551). Oxford: Oxford University Press.
Kettl, D. F. (2015). The Job of Government: Interweaving Public Functions and Private Hands. Public Administration Review, 75(2), 219-229.
Kent, C. A. (1989). The Privatizing of Government Information: Economic Considerations. Government Publications Review, 16(2), 113-132.
Khajeh-Hosseini, A., Sommerville, I., & Sriram, I. (2010). Research Challenges for Enterprise Cloud Computing. arXiv preprint arXiv:1001.3257.
Lee, M., Zhang, Y., Chen, S., Spencer, E., Cruz, J. D., Hong, H., & Marciano, R. (2017, December). Heuristics for Assessing Computational Archival Science (CAS) Research: The Case of the Human Face of Big Data Project. In Big Data (Big Data), 2017 IEEE International Conference on (pp. 2262-2270). IEEE. Boston, MA, USA.
Lee, M., Chen, S., Zhang, Y., Spencer, E., & Marciano, R. (2018, March). Toward Identifying Values and Tensions in Designing a Historically Sensitive Data Platform: A Case Study on Urban Renewal. In International Conference on Information (pp. 632-637). Springer, Cham.
Lemieux, V. L. (Ed.). (2016). Building Trust in Information: Perspectives on the Frontiers of Provenance. Springer. New York, USA.
Lemieux, V. (2016). "One Step Forward, Two Steps Backward? Does E-Government Make Governments in Developing Countries More Transparent and Accountable?" (Online). Available at: http://pubdocs.worldbank.org/en/287051452529902818/WDR16-BP-One-Step-Forward-Lemieux.pdf Accessed January 7, 2019.
Linux, https://www.linux.org/ Accessed June 10, 2019.
Marciano, R., Lemieux, V., Hedges, M., Esteva, M., Underwood, W., Kurtz, M., & Conrad, M. (2017). Archival Records and Training in the Age of Big Data. Re-Envisioning the MLS: Perspectives on the Future of Library and Information Science Education. Emerald Group Publishing Limited. Bingley, West Yorkshire, England. Accessed January 7, 2019.
Marciano, R. (2016). Computational Archival Science Symposium. (Online). Available: http://dcicblog.umd.edu/cas/attendees/ Accessed January 7, 2019.
Marciano, R., & Esteva, M. (2018). The Scope of Computational Archival Science (CAS): Methods, Resources, and Interdisciplinary Approaches. (Online). Available: http://dcicblog.umd.edu/cas/#workshops Accessed January 7, 2019.
McMullen, S. (2000). US Government Information: Selected Current Issues in Public Access vs. Private Competition. Journal of Government Information, 27(5), 581-593.
MySQL, https://www.mysql.com/about/ Accessed June 10, 2019.
Osborne, D., & Gaebler, T. (1992). Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector. Reading, Mass.: Addison-Wesley Pub. Co.
Rahm, E., & Do, H. H. (2000). Data Cleaning: Problems and Current Approaches. IEEE Data Eng. Bull., 23(4), 3-13.
Wood, F. B. (1988). Proposals for Privatization of the National Technical Information Service: A Viewpoint. Government Publications Review, 15(5), 403-409.

Radiology Information Systems Literature

Abraham, L., Harris, M., & Zalis, M. (2010). "Initial Observations of Electronic Medical Record Usage During CT and MRI Interpretation: Frequency of Use and Impact on Workflow." American Journal of Roentgenology, 195(1), 188-193.
Achenbach, S. (2002). Intranet and Radiology: A Critical Appraisal of Radiological Applications of Intranet Technology. European Radiology, 12(2), 485-490.
Alkasab, T. K., Harris, M. A., Zalis, M. E., Dreyer, K. J., & Rosenthal, D. I. (2010). A Case Tracking System with Electronic Medical Record Integration to Automate Outcome Tracking for Radiologists. Journal of Digital Imaging, 23(6), 658-665.
American College of Radiology. (2015). ACR Practice Parameter for Communication of Diagnostic Imaging Findings. American College of Radiology (ACR); 2014 (Resolution 11).
Bassignani, M. J., Dierolf, D. A., Roberts, D. L., & Lee, S. (2010). Paperless Protocoling of CT and MRI Requests at an Outpatient Imaging Center. Journal of Digital Imaging, 23(2), 203-210.
Boland, G. W., Guimaraes, A. S., & Mueller, P. R. (2009). The Radiologist's Conundrum: Benefits and Costs of Increasing CT Capacity and Utilization. European Radiology, 19(1), 9-11.
Brenner, D. J., & Hall, E. J. (2007). Computed Tomography - An Increasing Source of Radiation Exposure. New England Journal of Medicine, 357(22), 2277-2284.
Clark, K. W., Gierada, D. S., Marquez, G., Moore, S. M., Maffitt, D. R., Moulton, J. D., ... & Prior, F. W. (2009). Collecting 48,000 CT Exams for the Lung Screening Study of the National Lung Screening Trial. Journal of Digital Imaging, 22(6), 667.
Clunie, D. A., Dennison, D. K., Cram, D., Persons, K. R., & Bronkalla, M. D. (2016). Technical Challenges of Enterprise Imaging: HIMSS-SIIM Collaborative White Paper. Journal of Digital Imaging, 29(5), 583-614.
Crowe, B., & Sim, L. (2009). Workflow and Data Flow in Radiology. Int J CARS, 4(1), S160-S167.
Dilsizian, S. E., & Siegel, E. L. (2014). Artificial Intelligence in Medicine and Cardiac Imaging: Harnessing Big Data and Advanced Computing to Provide Personalized Medical Diagnosis and Treatment. Current Cardiology Reports, 16(1), 441.
Douglas, P. S., Hendel, R. C., Cummings, J. E., Dent, J. M., Hodgson, J. M., Hoffmann, U., ... & Masoudi, F. A. (2009). ACCF/ACR/AHA/ASE/ASNC/HRS/NASCI/RSNA/SAIP/SCAI/SCCT/SCMR 2008 Health Policy Statement on Structured Reporting in Cardiovascular Imaging. Circulation, 119(1), 187-200.
Erickson, B. J., Meenan, C., & Langer, S. (2013). Standards for Business Analytics and Departmental Workflow. Journal of Digital Imaging, 26(1), 53-57.
Gangopadhyay, A., Yesha, R., & Siegel, E. (2016). Knowledge Discovery in Clinical Data. In Machine Learning for Health Informatics (pp. 337-356). Springer, Cham.
Gassert, G., Durham, J., Cain, M., & Sachs, P. B. (2014). Interventional Radiology Workflow Management in the Electronic Medical Record. Journal of Digital Imaging, 27(3), 314-320.
Greco, G., Patel, A. S., Lewis, S. C., Shi, W., Rasul, R., Torosyan, M., ... & Siegel, E. L. (2016). Patient-Directed Internet-Based Medical Image Exchange: Experience from an Initial Multicenter Implementation. Academic Radiology, 23(2), 237-244.
Health Level Seven Standard, Version 2.5.
Hoffer, J. A. (2012). Modern Systems Analysis and Design, 6/E. Pearson Education India.
Hostetter, J. M., Morrison, J. J., Morris, M., Jeudy, J., Wang, K. C., & Siegel, E. (2017). Personalizing Lung Cancer Risk Prediction and Imaging Follow-Up Recommendations Using the National Lung Screening Trial Dataset. Journal of the American Medical Informatics Association, 24(6), 1046-1051.
Hurlen, P., Østbye, T., Borthne, A., Dahl, F. A., & Gulbrandsen, P. (2009). Do Clinicians Read Our Reports? Integrating the Radiology Information System with the Electronic Patient Record: Experiences from the First 2 Years. European Radiology, 19(1), 31-36.
Keen, C. E. (October 2012). Road to RSNA Preview (Online). Available: AuntMinnie.com. Accessed January 7, 2019.
Keen, C. E. (March 28, 2013). RAPTOR VA Protocol Software Poised for Takeoff (Online). Available: AuntMinnie.com. Accessed January 7, 2019.
Kohli, M. D., Warnock, M., Daly, M., Toland, C., Meenan, C., & Nagy, P. G. (2014). Building Blocks for a Clinical Imaging Informatics Environment. Journal of Digital Imaging, 27(2), 174-181.
Krupinski, E. A., Reiner, B., & Siegel, E. (2014, March). How Does Radiology Report Format Impact Reading Time, Comprehension and Visual Scanning? Medical Imaging 2014: Image Perception, Observer Performance, and Technology Assessment (Vol. 9037, p. 90370C). International Society for Optics and Photonics.
Leviss, J., Gugerty, B., & Kaplan, B. (2010). HIT or Miss: Lessons Learned from Health Information Technology Implementations. Chicago, IL: American Health Information Management Association.
Lincoln, T., & Korpman, R. (1980). Computers, Health Care, and Medical Information Science. Science, 210(4467), 257-263. Retrieved from http://www.jstor.org/stable/1684861
Martino, S., Reid, J., & Odle, T. G. (2008). Computed Tomography in the 21st Century: Changing Practice for Medical Imaging and Radiation Therapy Professionals. American Society of Radiologic Technologists. http://www.asrt.org/docs/default-source/whitepapers/asrt_ct_consensus.pdf. Accessed January 7, 2019.
Medverd, J. R., Cross, N. M., Font, F., & Casertano, A. (2013). Advanced Medical Imaging Protocol Workflow - A Flexible Electronic Solution to Optimize Process Efficiency, Care Quality and Patient Safety in the National VA Enterprise. Journal of Digital Imaging, 26(4), 643-650.
Medverd, Fuller III (2013). Computed Tomography Imaging Protocol Contrast Risk Assessment. VHA Puget Sound Health Care System Diagnostic Imaging Service Quality Improvement Project.
Morgan, M. B., Branstetter, B. F., Lionetti, D. M., Richardson, J. S., & Chang, P. J. (2008). The Radiology Digital Dashboard: Effects on Report Turnaround Time. Journal of Digital Imaging, 21(1), 50-58.
Morgan, M. B., Branstetter, B. F., Clark, C., House, J., Baker, D., & Harnsberger, H. R. (2011). Just-in-Time Radiologist Decision Support: The Importance of PACS-Integrated Workflow. Journal of the American College of Radiology, 8(7), 497-500.
Morgan, M., Mates, J., & Chang, P. (2006). Toward a User-Driven Approach to Radiology Software Solutions: Putting the Wag Back in the Dog. Journal of Digital Imaging, 19(3), 197-201.
Morris, M. A., Saboury, B., Burkett, B., Gao, J., & Siegel, E. L. (2018). Reinventing Radiology: Big Data and the Future of Medical Imaging. Journal of Thoracic Imaging, 33(1), 4-16.
Patti, J. A. (2011). The National Radiology Data Registry: A Necessary Component of Quality Health Care. Journal of the American College of Radiology, 8(7), 453.
Reiner, B. I., Siegel, E. L., Carrino, J. A., & Goldburgh, M. M. (2002). SCAR Radiologic Technologist Survey: Analysis of the Impact of Digital Technologies on Productivity. Journal of Digital Imaging, 15(3), 132-140.
Rosenfeld, L., & Morville, P. (2002). Information Architecture for the World Wide Web. O'Reilly Media, Inc.
Rubin, D. L. (2011). Informatics in Radiology: Measuring and Improving Quality in Radiology: Meeting the Challenge with Informatics. Radiographics, 31(6), 1511-1527.
Schueler, B. A. (2008). Incorporating Radiation Dose Assessments into the ACR Appropriateness Criteria®. Journal of the American College of Radiology, 5(6), 775-776.
Schauer, D. A., & Linton, O. W. (2009). NCRP Report No. 160, Ionizing Radiation Exposure of the Population of the United States, Medical Exposure - Are We Doing Less with More, and Is There a Role for Health Physicists? Health Physics, 97(1), 1-5.
Schneider, E., Franz, W., Spitznagel, R., Bascom, D. A., & Obuchowski, N. A. (2011). Effect of Computerized Physician Order Entry on Radiologic Examination Order Indication Quality. Archives of Internal Medicine, 171(11), 1036-1038.
Siegel, E. L. (1998). Economic and Clinical Impact of Filmless Operation in a Multifacility Environment. Journal of Digital Imaging, 11(2), 42-47.
Sharma, A., Hostetter, J., Morrison, J., Wang, K., & Siegel, E. (2016). Focused Decision Support: A Data Mining Tool to Query the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial Dataset and Guide Screening Management for the Individual Patient. Journal of Digital Imaging, 29(2), 160-164.
Tudor, J., Klochko, C., Patel, M., & Siegal, D. SIIM 2017 Scientific Session Posters & Demonstrations: Exploratory Data Analysis of Order Entry Protocol Practices.
Tudor, J., Klochko, C., Patel, M., & Siegal, D. (2018). Order Entry Protocols Are an Amenable Target for Workflow Automation. Journal of the American College of Radiology, 15(6), 854-858.
Vano, E., Padovani, R., Neofotistou, V., Tsapaki, V., Kottou, S., Ten, J. I., ... & Faulkner, K. (2010, January). Improving Patient Dose Management Using DICOM Header Information: The European SENTINEL Experience. In Proceedings of the International Special Topic Conference on Information Technology in Biomedicine (Vol. 22). Available at http://medlab.cs.uoi.gr/itab2006/proceedings/Medical%20Imaging/149.pdf, Accessed June 3, 2019.
Wang, K. C., Patel, J. B., Vyas, B., Toland, M., Collins, B., Vreeman, D. J., ... & Langlotz, C. P. (2017). Use of Radiology Procedure Codes in Health Care: The Need for Standardization and Structure. Radiographics, 37(4), 1099-1110.
Wideman, C., & Gallet, J. (2006). Analog to Digital Workflow Improvement: A Quantitative Study. Journal of Digital Imaging, 19(1), 29-34.

Organizational Innovation and Knowledge Management Literature

Abdekhoda, M., Dehnad, A., & Zarei, J. (2018). Determinant factors in applying electronic medical records in healthcare.
Abou-Zeid, E.-S. (2002). A Knowledge Management Reference Model. Journal of Knowledge Management, 6(5), 486-499.
Act, A. (1996). Health Insurance Portability and Accountability Act of 1996. Public Law, 104, 191.
Agencies See Advantages to Open Source Content Management Systems. http://www.nextgov.com/nextgov/ng_20111014_4116.php?oref=topnews, Accessed June 3, 2018.
Ahmed, S., Fiaz, M., & Shoaib, M. (2015). Impact of Knowledge Management Practices on Organizational Performance: An Empirical Study of Banking Sector in Pakistan. FWU Journal of Social Sciences, 9(2), 147-167.
Ajmal, M., & Koskinen, K. (2008). Knowledge Transfer in Project-Based Organizations: An Organizational Culture Perspective. Project Management Journal, 29(1), 7-15.
Alavi, M., Kayworth, T., & Leidner, D. (2005). An Empirical Examination of the Influence of Organizational Culture on Knowledge Management Practices. Journal of Management Information Systems, 22(3), 191-224.
Alwis, R. S.-D., & Hartmann, E. (2008). The Use of Tacit Knowledge Within Innovative Companies: Knowledge Management in Innovative Enterprises. Journal of Knowledge Management, 12(1), 133-147.
Anderson, P., & Tushman, M. L. (1991). Managing Through Cycles of Technological Change. Research-Technology Management, 34(3), 26-31.
Ansari, M., Youshanlouei, H. R., & Mood, M. M. (2012). A Conceptual Model for Success in Implementing Knowledge Management: A Case Study in Tehran Municipality. Journal of Service Science and Management, 5(1), 212-222.
Ashkenas, R. (2012). Learned Helplessness in Organizations. Harvard Business Review online, https://hbr.org/2012/06/learned-helplessness-in-organi, accessed in May 2019.
Baker, J. (2012). The technology-organization-environment framework. In Information Systems Theory (pp. 231-245). Springer, New York, NY.
Biancani, S., McFarland, D. A., & Dahlander, L. (2014). The Semiformal Organization. Organization Science, 25(5), 1306-1324.
Bobrov, E., Bucchiarone, A., Capozucca, A., Guelfi, N., Mazzara, M., & Masyagin, S. (2019). Teaching DevOps in Academia and Industry: Reflections and Vision. arXiv preprint arXiv:1903.07468.
Burton-Jones, A., & Grange, C. (2012). From use to effective use: a representation theory perspective. Information Systems Research, 24(3), 632-658.
Burton-Jones, A., & Volkoff, O. (2017). How Can We Develop Contextualized Theories of Effective Use? A Demonstration in the Context of Community-Care Electronic Health Records. Information Systems Research, 28(3), 468-489.
Burton-Jones, A., Recker, J., Indulska, M., Green, P. F., & Weber, R. (2017). Assessing Representation Theory with a Framework for Pursuing Success and Failure. MIS Quarterly, 41(4), 1307-1333.
Carneiro, A. (2000). How Does Knowledge Management Influence Innovation and Competitiveness? Journal of Knowledge Management, 4(2), 87-98.
Carrion, G. C., Navarro, J. G. C., & Jiminez, D. J. (2012). The Effect of Absorptive Capacity on Innovativeness: Context and Information Systems Capability as Catalysts. British Journal of Management, 23(1), 110-129.
Chang, C. L.-H., & Lin, T.-C. (2015). The Role of Organizational Culture in the Knowledge Management Process. Journal of Knowledge Management, 19(3), 433-455.
Chesbrough, H. W. (2012). "Open Innovation: Where We've Been and Where We're Going." Research Technology Management, 55(4), pp. 20-27.
Dahiya, D., Gupta, M., & Jaine, P. (2012). Enterprise Knowledge Management System: A MultiAgent Perspective. Information Systems, Technology and Management, 285(4), 271-281.
Darrin, M. A. G., & Krill, J. A. (Eds.). (2016). Infusing Innovation into Organizations: A Systems Engineering Approach. CRC Press.
Davenport, T., Harris, J. G., De Long, D. W., & Jacobson, A. L. (2001). Data to Knowledge to Results: Building an Analytical Capability. California Management Review, 43(2), 117-138.
Dineen, B. R., Lewicki, R. J., & Tomlinson, E. C. (2006). Supervisory guidance and behavioral integrity: Relationships with employee citizenship and deviant behavior. Journal of Applied Psychology, 91(3), 622.
Eardley, A. (Ed.). (2010). Innovative Knowledge Management: Concepts for Organizational Creativity and Collaborative Design. IGI Global.
Edureka Blog (2019). CI CD Pipeline - Learn How to Set Up a CI CD Pipeline from Scratch (Online). Available: https://www.edureka.co/blog/ci-cd-pipeline/ Accessed June 3, 2018.
Eze, U. C., Goh, G. G. G., Goh, C. Y., & Tan, T. L. (2013). Perspectives of SMEs on Knowledge Sharing. The Journal of Information and Knowledge Management Systems, 43(2), 210-236.
Fletcher, R. D., Dayhoff, R. E., Wu, C. M., Graves, A., & Jones, R. E. (2001). Computerized Medical Records in the Department of Veterans Affairs. Cancer, 91(S8), 1603-1606.
Gangwar, H., Date, H., & Ramaswamy, R. (2015). Understanding determinants of cloud computing adoption using an integrated TAM-TOE model. Journal of Enterprise Information Management, 28(1), 107-130. Accessed June 3, 2018.
Gartner DevOps Model (Online). Available: https://www.cloudtp.com/doppler/adopting-cloud-and-devops-across-2000-developers-at-vanguard/ Accessed June 3, 2018.
Ghani, I. (Ed.). (2016). Emerging Innovations in Agile Software Development. IGI Global.
Gladwell, M. (2011). Creation Myth: Xerox PARC, Apple, and the Truth About Innovation. New Yorker, 87(13).
Gold, A., Malhotra, A., & Segars, A. (2001). Knowledge Management: An Organizational Capabilities Perspective. Journal of Management Information Systems, 18(1), 185-214.
Govindarajan, V. (2016). The Three-Box Solution: A Strategy for Leading Innovation. Harvard Business Review Press.
Greenhalgh, T., Stramer, K., Bratan, T., Byrne, E., Mohammad, Y., & Russell, J. (2008). Introduction of Shared Electronic Records: Multi-Site Case Study Using Diffusion of Innovation Theory. BMJ, 337, A1786.
Greiner, M., Bohmann, T., & Krcmar, H. (2007). A Strategy for Knowledge Management. Journal of Knowledge Management, 11(6), 3-15.
Gurusamy, K., & Campbell, J. (2011, July). A Case Study of Open Source Software Adoption in Australian Public Sector Organizations. In PACIS (p. 70).
Hall, R., & Andriani, P. (2002). Managing Knowledge for Innovation. Long Range Planning, 35(1), 29-48.
Ha, S.-T., Lo, M.-C., & Wang, Y.-C. (2015). Relationship Between Knowledge Management and Organizational Performance: A Test on SMEs in Malaysia, Kuching, Sarawak. Elsevier.
Hansen, M. T., Nohria, N., and Tierney, T. (1999). "What's Your Strategy for Managing Knowledge?" The Knowledge Management Yearbook 2000-2001: 1-10.
Hansen, B., and Nørbjerg, J. (2005). "Codification or Personalization - A Simple Choice." Proceedings of the 28th Information Systems Research Seminar in Scandinavia, Kristiansand, Norway.
Hargadon, A. B., & Douglas, Y. (2001). When Innovations Meet Institutions: Edison and the Design of the Electric Light. Administrative Science Quarterly, 46(3), 476-501.
Heath, C., & Heath, D. (2007). Made to Stick: Why Some Ideas Survive, and Others Die. Random House.
Hopkins, Michael S. (2010). The Four Ways IT Is Revolutionizing Innovation. MIT Sloan Management Review, 51(3), 51-56.
Hüttermann, M. (2012). DevOps for Developers, 1st Ed. Dordrecht: Springer.
IBM Institute of Industrial and Systems Engineering. IBM Global Business Services, People Process Technology - The Three Elements for a Successful Organizational Transformation. (2011). http://www.iise.org/Details.aspx?id=24456, Accessed August 23rd, 2018.
Ipe, M. (2003). Knowledge Sharing in Organizations: A Conceptual Framework. Human Resource Development Review, 2(4), 337-359.
Jennex, M. E. (Ed.). (2008). Current Issues in Knowledge Management. IGI Global.
Kankanhalli, A., Tan, B., & Wei, K.-K. (2005). Contributing Knowledge to Electronic Knowledge Repositories: An Empirical Investigation. MIS Quarterly, 29(1), 113-143.
Kaplan, R. S., & Norton, D. P. (2001). The Strategy-Focused Organization. Strategy and Leadership, 29(3), 41-42.
Larsen, K. R., Allen, G., Vance, A., & Eargle, D. (2019). Theories used in IS research wiki. Accessed at: https://is.theorizeit.org/wiki/Main_Page Retrieved August 4, 2019.
Leidner, D., Alavi, M., & Kayworth, T. (2006). The Role of Culture in Knowledge Management: A Case Study of Two Global Firms. International Journal of E-Collaboration, 2(1), 17-40.
Leonard-Barton, D. (1995). Wellsprings of Knowledge: Building and Sustaining the Sources of Innovation.
Lepore, J. (2014). The Disruption Machine. The New Yorker, 23, 30-6.
Leavitt, H. J. (1976). Applied Organization Change in Industry: Structural, Technical, and Human Approaches. Reader in Operations Research for Libraries, 50-60.
Liao, S.-H., & Wu, C.-C. (2009). The Relationship Among Knowledge Management, Organizational Learning, and Organizational Performance. International Journal of Business and Management, 4(4), 64-76.
Leroy, H., Palanski, M. E., & Simons, T. (2012). Authentic leadership and behavioral integrity as drivers of follower commitment and performance. Journal of Business Ethics, 107(3), 255-264.
Ling, T. N., San, L. Y., & Hock, N. T. (2009). Trust: Facilitator of Knowledge-Sharing Culture. Journal of Communications of the IBIMA (CIBIMA), 7(15), 137-142.
Loukis, E., Arvanitis, S., & Kyriakou, N. (2017). An empirical investigation of the effects of firm characteristics on the propensity to adopt cloud computing. Information Systems and e-Business Management, 15(4), 963-988.
Manhart, M., & Thalmann, S. (2015). Protecting Organizational Knowledge: A Structured Literature Review. Journal of Knowledge Management, 19(2), 190-211.
Marks, E. A., & Bell, M. (2008). Service Oriented Architecture (SOA): A Planning and Implementation Guide for Business and Technology. John Wiley & Sons.
Markus, L., Majchrzak, A., & Gasser, L. (2002). A Design Theory for Systems That Support Emergent Knowledge Processes. MIS Quarterly, 26(3), 179-212.
McAfee, A., Brynjolfsson, E., Davenport, T. H., Patil, D. J., & Barton, D. (2012). Big Data: The Management Revolution. Harvard Business Review, 90(10), 60-68.
McDermott, R. (1999). Why Information Technology Inspired but Cannot Deliver Knowledge Management. California Management Review, 41(4), 103-117.
Mooradian, N. (2006). Tacit Knowledge: Philosophic Roots and Role in KM. Journal of Knowledge Management, 9(6), 104-133.
Murphy, S., & Cox, S. (2016, May). Classifying Organizational Adoption of Open Source Software: A Proposal. In IFIP International Conference on Open Source Systems (pp. 123-133). Springer, Cham.
Mustonen-Ollila, E., & Lyytinen, K. (2003). Why Organizations Adopt Information System Process Innovations: A Longitudinal Study Using Diffusion of Innovation Theory. Information Systems Journal, 13(3), 275-297.
Nascimento, L. M. A., & Travassos, G. H. (2017, September). Software Knowledge Registration Practices at Software Innovation Startups: Results of an Exploratory Study. In Proceedings of the 31st Brazilian Symposium on Software Engineering (pp. 234-243). ACM.
Nicolas, R. (2004). Knowledge Management Impacts on Decision Making Process. Journal of Knowledge Management, 8(1), 20-31.
Nonaka, I. (1994). A Dynamic Theory of Organizational Knowledge Creation. Organization Science, 5(1), 14-37.
O'Brien, E. (Ed.). (2010). Knowledge Management for Process, Organizational and Marketing Innovation: Tools and Methods. IGI Global.
Pang, M. S., Lee, G., & DeLone, W. H. (2014). IT Resources, Organizational Capabilities, and Value Creation in Public-Sector Organizations: A Public-Value Management Perspective. Journal of Information Technology, 29(3), 187-205.
Pope, A., & Butler, T. (2012). Unpacking the People, Process and Technology Dimensions of Organisational KMS. In ECIS (p. 183).
Prahalad, C. K., & Ramaswamy, V. (2003). The New Frontier of Experience Innovation. Sloan Management Review (Summer 2003), 12-18.
Prodan, M., Prodan, A., & Purcarea, A. A. (2015). Three new dimensions to people, process, technology improvement model. In New Contributions in Information Systems and Technologies (pp. 481-490). Springer, Cham.
Quinn, L., & Gardner-Madras, H. (2010). Comparing Open Source Content Management Systems: WordPress, Joomla, Drupal and Plone. Idealware.
Rahimli, A. (2012). Knowledge Management and Competitive Advantage. Information and Knowledge Management, 2(7), 37-43.
Ramsaroop, P., and Oldham, B. W. (2004). Securing Business Intelligence: Knowledge and Cybersecurity in the Post-9/11 World. Falls Church, VA: Evolvent Press.
Rasula, J., Vuksic, V. B., & Stemberger, M. I. (2012). The Impact of Knowledge Management on Organizational Performance. Economic and Business Review, 14(2), 147-168.
Rogers, E. M. (2003). Diffusion of Innovations. Free Press. New York.
Sanchez, J. H., Sanchez, Y. H., Ruiz, D. C., & Tarrasona, D. C. (2012). Knowledge Creating and Sharing Corporate Culture Framework. Crete, Greece, Elsevier.
Rose, J., and Furneaux, B. (2016). Innovation Drivers and Outputs for Software Firms: Literature Review and Concept Development. Advances in Software Engineering, 2016.
Safa, N. S., and von Solms, R. (2016). "An Information Security Knowledge Sharing Model in Organizations." Computers in Human Behavior, 57, 442-451.
Sanson-Fisher, R. W. (2004). Diffusion of Innovation Theory for Clinical Change. Medical Journal of Australia, 180(6 Suppl), S55.
Simons, T. (2002). Behavioral integrity: The perceived alignment between managers' words and deeds as a research focus. Organization Science, 13(1), 18-35.
Simons, T. L. (1999). Behavioral integrity as a critical ingredient for transformational leadership. Journal of Organizational Change Management, 12(2), 89-104.
Singh, H., Spitzmueller, C., Petersen, N. J., Sawhney, M. K., Smith, M. W., Murphy, D. R., ... & Sittig, D. F. (2012). Primary Care Practitioners' Views on Test Result Management in EHR-Enabled Health Systems: A National Survey. Journal of the American Medical Informatics Association, 20(4), 727-735.
Smith, E. A. (2001). The Role of Tacit and Explicit Knowledge in the Workplace. Journal of Knowledge Management, 5(4), 311-321.
Srikantaiah, T., & Koenig, M. E. (Eds.). (2008). Knowledge Management in Practice: Connections and Context. Information Today, Inc.
Su, N. M., Wilensky, H. N., & Redmiles, D. F. (2012). Doing Business with Theory: Communities of Practice in Knowledge Management. Computer Supported Cooperative Work (CSCW), 21(2-3), 111-162.
Sulaiman, M. A., & Bakar, N. A. A. (2019). Systematic Review on Ethical Issues in Cloud Computing. Open International Journal of Informatics (OIJI), 7(2), 65-74.
Swanson, E. B. (2012). The Manager's Guide to IT Innovation Waves. MIT Sloan Management Review, 53(2), 75.
Swanson, E. B., & Ramiller, N. C. (2004). Innovating Mindfully with Information Technology. MIS Quarterly, 553-583.
Szmodics, P. (2015). Knowledge-Based Process Management. Gyor, Hungary, IEEE.
Tiwana, A. (2000). The Knowledge Management Toolkit: Practical Techniques for Building a Knowledge Management System. Prentice Hall PTR.
Van Den Hooff, B., & Huysman, M. (2009). Managing Knowledge Sharing: Emergent and Engineering Approaches. Information & Management, 46(1), 1-8.
Von Hippel, E. (2001). Innovation by User Communities: Learning from Open-Source Software. Sloan Management Review, Summer, 82-86.
Walls, M. (2013). Building a DevOps Culture. O'Reilly Media, Inc.
Wang, P., & Ramiller, N. C. (2009). Community Learning in Information Technology Innovation. MIS Quarterly, 709-734.
Wang, P. (2010). The Surprising Impact of Fashions in Information Technology. Sloan Management Review, 51(4), 14-17.
Yates, D., & Paquette, S. (2011). Emergency Knowledge Management and Social Media Technologies: A Case Study of the 2010 Haitian Earthquake. International Journal of Information Management, 31, 6-13.
Zack, M. H. (2003). Rethinking the Knowledge-Based Organization. Sloan Management Review, Summer, 67-71.
Zaied, A. N., Hussein, G. S., & Hassan, M. (2012). The Role of Knowledge Management in Enhancing Organizational Performance. International Journal of Information Engineering and Electronic Business, 5(1), 27-35.
Human-Computer Interaction (HCI) Literature

Bannan-Ritland, B., & Baek, J. Y. (2008). Investigating the Act of Design in Design Research: The Road Taken. Handbook of Design Research Methods in Education: Innovations in Science, Technology, Engineering, and Mathematics Learning and Teaching, 299-319.
Borgatti, S. P., Mehra, A., Brass, D. J., & Labianca, G. (2009). Network Analysis in the Social Sciences. Science, 323(5916), 892-895.
Burnett, M., Peters, A., Hill, C., & Elarief, N. (2016, May). Finding Gender-Inclusiveness Software Issues with GenderMag: A Field Investigation. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 2586-2598). ACM.
Elmqvist, N., Dragicevic, P., & Fekete, J. D. (2008). Rolling the Dice: Multidimensional Visual Exploration Using Scatterplot Matrix Navigation. IEEE Transactions on Visualization and Computer Graphics, 14(6), 1539-1148.
Hamon, R., & Hix, D. (1989). Towards Empirically Developed Methodologies. Human-Computer Interface Development. Academic Press.
Healey, C., & Enns, J. (2012). Attention and Visual Memory in Visualization and Computer Graphics. IEEE Transactions on Visualization and Computer Graphics, 18(7), 1170-1188.
Hunsucker, A. J., & Siegel, M. A. (2015). Once Upon a Time: Storytelling in the Design Process. In Proceedings of the 3rd International Conference for Design Education Researchers.
Keim, D. A., Kriegel, H. P., & Seidl, T. (1993, October). Visual Feedback in Querying Large Databases. In Proceedings of the 4th Conference on Visualization. IEEE Computer Society.
Lucero, A., Desjardins, A., Neustaedter, C., Höök, K., Hassenzahl, M., & Cecchinato, M. (2019). A Sample of One: First-Person Research Methods in HCI.
Norman, D. (2013). The Design of Everyday Things: Revised and Expanded Edition. Basic Books (AZ).
Plaisant, C. (2004, May). The Challenge of Information Visualization Evaluation. In Proceedings of the Working Conference on Advanced Visual Interfaces. ACM.
Puerta, A. R. (1997). A Model-Based Interface Development Environment. IEEE Software, 14(4), 40-47.
Quinn, L., & Gardner-Madras, H. (2010). Comparing Open Source Content Management Systems: WordPress, Joomla, Drupal and Plone. Idealware. December.
Rogers, Y., Sharp, H., & Preece, J. (2014). Interaction Design: Beyond Human-Computer Interaction. John Wiley & Sons.
Van Den Hooff, B., & Huysman, M. (2009). Managing Knowledge Sharing: Emergent and Engineering Approaches. Information & Management, 46(1), 1-8.
Shneiderman, B. (2007). Creativity Support Tools. Communications of the ACM, 50(12), 20-32.
VA Center of Innovation (October 2015, Vol. 1). Designing for Veterans: A Toolkit for Human-Centered Design. Presentation available at https://www.va.gov/playbook/downloads/vaci-project-toolkit.pdf. Accessed on July 1, 2019.
Wennergren, D. M. (2009). Clarifying Guidance Regarding Open Source Software (OSS). Department of Defense Chief Information Officer. Email attachment from VACI. Accessed on June 3, 2019.
Wongsuphasawat, K., Guerra Gómez, J. A., Plaisant, C., Wang, T. D., Taieb-Maimon, M., & Shneiderman, B. (2011, May). LifeFlow: Visualizing an Overview of Event Sequences. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM.

Data Visualization Literature

Barab, S., & Squire, K. (2004). Design-Based Research: Putting a Stake in the Ground. The Journal of the Learning Sciences, 13(1), 1-14.
Bertone, A., Lammarsch, T., Turic, T., Aigner, W., & Miksch, S. (2010). Does Jason Bourne Need Visual Analytics to Catch the Jackal? In Proc. First International Symposium on Visual Analytics Science and Technology Held in Europe (EuroVAST 2010). Eurographics (pp. 61-67).
Elmqvist, N., Dragicevic, P., & Fekete, J. D. (2008). Rolling the Dice: Multidimensional Visual Exploration Using Scatterplot Matrix Navigation. IEEE Transactions on Visualization and Computer Graphics, 14(6), 1539-1148.
Fry, B. (2007). Visualizing Data: Exploring and Explaining Data with the Processing Environment. O'Reilly Media, Inc.
Healey, C., & Enns, J. (2012). Attention and Visual Memory in Visualization and Computer Graphics. IEEE Transactions on Visualization and Computer Graphics, 18(7), 1170-1188.
Keim, D. A., Kriegel, H. P., & Seidl, T. (1993, October). Visual Feedback in Querying Large Databases. In Proceedings of the 4th Conference on Visualization '93 (pp. 158-165). IEEE Computer Society.
Munzner, T. (2014). Visualization Analysis and Design. CRC Press.
Plaisant, C. (2004). The Challenge of Information Visualization Evaluation. In Proceedings of the Working Conference on Advanced Visual Interfaces (pp. 109-116). ACM.
Schutt, R., & O'Neil, C. (2013). Doing Data Science: Straight Talk from the Frontline. O'Reilly Media, Inc.
Simsion, G., & Witt, G. (2004). Data Modeling Essentials. Elsevier.
Singh, H., Spitzmueller, C., Sawhney, M., Espadas, D., Modi, V., & Sittig, D. F. (2011). Perceptions of Alert Fatigue by PCPs Using an Integrated Electronic Health Record. Journal of General Internal Medicine, 26, S178.
Wongsuphasawat, K., Guerra Gómez, J. A., Plaisant, C., Wang, T. D., Taieb-Maimon, M., & Shneiderman, B. (2011, May). LifeFlow: Visualizing an Overview of Event Sequences. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1747-1756). ACM.

VA Organizational Literature

Aday, L., & Cornelius, L. (2006). Designing and Conducting Health Surveys: A Comprehensive Guide (3rd Ed.). San Francisco, CA: Jossey-Bass.
Adler, J. L. (2017). Burdens of War: Creating the United States Veterans Health System. JHU Press.
Allen, A. (2018). "We Took a Broken System and Just Broke It Completely." Politico (Online). Available: https://www.politico.com/story/2018/03/08/veterans-military-health-system-trump-386232, Accessed on June 3, 2019.
Allen, A. (2017). "A 40-Year 'Conspiracy' at the VA." Politico (Online). Available: https://www.politico.com/agenda/story/2017/03/vista-computer-history-va-conspiracy-000367 Accessed July 19, 2018.
American Council for Technology-Industry Advisory Council VistA Modernization Report (May 4, 2010). Available at: https://www.osehra.org/sites/default/files/Vista_Modernization_Report_-_Legacy_To_Leadership_May_4_2010-1.pdf, Accessed June 5, 2018.
Anderson, Charles (2011). Online meeting at VACO, Washington, D.C., between the RAPTOR team and VHA Chief Radiologist Charles Anderson, May 2011, at Veterans Affairs, Vermont Ave., Washington, D.C.
Anthracite, N. (2016). The VA Should Keep VistA. http://opensourcevista.net/nancysvistaserver/The VA Should Keep Vista.doc Accessed on June 3, 2019.
Anthracite, N. (2019). Considerations for the VA Adoption of Cerner. http://opensourcevista.net/nancysvistaserver/considerationsforthevaadoptionofcerner-3-13-2019.doc Accessed on June 3, 2019.
Asch, S. M., McGlynn, E. A., Hogan, M. M., Hayward, R. A., Shekelle, P., Rubenstein, L., & Kerr, E. A. (2004). Comparison of Quality of Care for Patients in the Veterans Health Administration and Patients in a National Sample. Annals of Internal Medicine, 141(12), 938-945.
Comparison of Quality of Care for Patients In The Veterans Health Administration And Patients In A National Sample. Annals of Internal Medicine, 141(12), 938-945. Akinyele, M. (2019) VA Innovation Center. Accessed online on June 24, 2019, at https://www.slideshare.net/slideshow/embed_code/key/eI5bexi6Zqu4xq Bhatnagar, S., and John D?Adamo (2016) Combining Innovation and Improvement at the VA to Serve Veterans. Accessed online on June 24, 2019, at https://medium.com/vainnovation/combining-innovation-and-improvement-at-the-va-to- serve-veterans-623f2e38237. Berdou, E. (2010). Organization in Open Source Communities: At the Crossroads of The Gift And Market Economies. Routledge. London and New York. Bogardus, Andrew, "Exposed by Phoenix: Veterans Health Care in the Age of Operations Enduring Freedom and Iraqi Freedom" (2016). Master of Arts in Liberal Studies (MALS) Student Scholarship. 115. Available at https://creativematter.skidmore.edu/mals_stu_schol/115 264 Blackburn, Scott (2010). IT Modernization Summit 2018 ? VA's Scott Blackburn, Fedscoop Youtube video, accessed on May 31, 2019, https://www.youtube.com/watch?v=LDfJmkqP9Zc Bonner, L., Simons, C., Parker, L., Yano, E., & Kirchner, J. (2011). ?To Take Care of The Patients? Qualitative Analysis of Veterans Health Administration Personnel Experiences with A Clinical Informatics System. Implementation Science, 5(63). Borowsky, S. J., & Cowper, D. C. (1999). Dual Use of VA and Non?VA Primary Care. Journal of General Internal Medicine, 14(5), 274-280. Bronstein, S., Griffin, D., & Black, N. (2014, June 24). VA Deaths Covered Up to Make Statistics Look Better, Whistle-Blower Says. CNN. (Online). Available At: Https://Www.Cnn.Com/. Accessed August 23rd, 2018. Brennan, D (2019) ?ALEXANDRIA OCASIO-CORTEZ ON VA: GOP WANTS TO RIP THE BATTERY OUT, SAY THE WHOLE CAR DOESN?T WORK, AND SELL IT FOR PARTS? Newsweek online , https://www.newsweek.com/alexandria-ocasio- cortez-veterans-gop-privatization-va-funding-healthcare-1405375. Accessed on May 19, 2019. Brown, (2010). Innovative Thinkers: The VA Innovation Program, Powerpoint Presented at VA Ehealth University (Vehu). Buell, R. W. (2016). A Transformation Is Underway at US Veterans Affairs. Harvard Business Review. 265 Casertano, A., Radiology Protocol Tool and Reporter (RAPTOR) Options Analysis Report, Initiative #292, Task #: VA118-11-RP-0173, November 2011. Casertano, A.et. al., Feasibility Analysis for DOD-VA Collaboration on an Imaging System, tasked by COL A. Smith, December 16, 2002. Chokshi, D. (2014). Improving Health Care for Veterans--A Watershed Moment for The VA. New England Journal of Medicine. 371(4). Davis, J. (2017, May 17). Everybody Hates Vista? Not Its Users. Retrieved From: Https://Www.Healthcareitnews.Com/News/Everybody-Hates-Vista-Not-Its- Users#Gs.Fpnxgfi. Accessed on June 3, 2018. Colburn, Tom (2014). Friendly Fire: Death, Delay and Dismay at the VA Retrieved From: https://www.hsdl.org/?abstract&did=755200 Accessed on June 3, 2018. Darkins, A. (2014). The growth of telehealth services in the Veterans Health Administration between 1994 and 2014: a study in the diffusion of innovation. Telemedicine and e-Health, 20(9), 761-768. Davis, J. (2017, April 17). Open source experts to VA: Keep VistA, it can be fixed. https://www.healthcareitnews.com/news/open-source-experts-va-keep-vista-it-can-be- fixed. Accessed on June 3, 2018. 266 Davis, J. (2018, April 17). The Results Are In: Readers Say VA Officials Should Keep Vista EHR, Not Sign with Cerner. 
Https://Www.Healthcareitnews.Com/News/the- results-are-in-readers-say-va-officials-should-keep-vista. Accessed On June 3, 2018. Department of Veterans Affairs (VA) Electronic Health Record, Open Source Custodial Agent (CA) Request for Information (RFI), VA118-11-RI-0194 Elnitsky, C., Andresen, E., Clark, M., Mcgarity, S., Hall, C., & Kerns, R. (2013) Access to The US Department of Veterans Affairs Health System: Self-Reported Barriers to Care Among Returnees Of Operation Enduring Freedom And Iraqi Freedom. BMC Health Services Research 13,498. Facebook group VA Innovation (@VHAInnovation). Available at: (https://www.facebook.com/VHAInnovation/?epa=SEARCH_BOX). Accessed on June 26, 2019. Fahrenthold, D. (2014, May 30). How the VA developed its culture of coverups. The Washington Post. Retrieved from http://www. washingtonpost.com/sf/national/2014/05/30/how-the-vadeveloped-its-culture-of- coverups/?utm_term=.126b1bf13f0c Federal News Network Staff (2014) Shinseki resigns amid VA health care issues. Federal News Network. Available at: https://federalnewsnetwork.com/defense/2014/05/shinseki- resigns-amid-va-health-care-issues/. Accessed on June 26, 2019. 267 Fierce Government (201) Q&A: Roger Baker on The Future of Vista and VLER (Online). Available: Http://Www.Fiercegovernmentit.Com/Story/Q-Roger-Baker- Futurevista-And-Vler/2010-10-25. Accessed on June 3, 2019. Fortney, J., Koboli, P., & Eisen, S. (2011). Improving Access to VA Care. Journal of General Internal Medicine, 26, 621-622. Fischetti, L. and Tastrom, J (June 10, 2010). Presentation of the Department of Veterans Affairs: Veterans Health Administration Innovation Program to the 21st WorldVistA Community Meeting, June 10, 2010. Available: Http://worldvista.org/Conferences/past-events/21st-vista-community-meeting . Accessed on June 3, 2019. French, L. (2014) ?W.H. report: VA's 'corrosive culture' ?Politico. Online: Available: https://www.politico.com/story/2014/06/white-house-report-overhaul-va-corrosive- culture-108403 Freier-Heckler, L. (2017). Reducing veteran patient wait time (Doctoral dissertation, Capella University). Ginter, P.M., Duncan, W.J., & Swayne, L.E. (2013). Communicating the Strategy and Developing Action Plans. Strategic Management of Healthcare Organizations (7th Ed.). San Francisco, CA: Jossey-Bass. Giroir, B.P., & Wilensky, G.R. (2015). Reforming the Veterans Health Administration Beyond Palliation of Symptoms New England Journal of Medicine, 373(18), 1693-1695. 268 Gold, A. (2014) ?VA Fiasco: A Tale of 2 Softwares? Politico. Online: Available: Https://Www.Politico.Com/Agenda/Story/2014/06/va-schedule-software-problems- 107839 Accessed In July 19, 2018 Gordon, S. (2017). The Battle for Veterans? Healthcare: Dispatches from the Front Lines of Policy Making and Patient Care. Cornell University Press. Gordon, S. (2018). Wounds of War: How the VA Delivers Health, Healing, And Hope to The Nation's Veterans. Cornell University Press. Gordon & Craven (2018) Unreliable Sources, How Corporate Funders Influenced Mass Media Coverage of Veterans? Healthcare. The Veterans Healthcare Policy Institute (VHPI Online). Available: Veteranspolicy.Org, Accessed on June 3, 2018. Hardhats community, available on Google Groups, https://groups.google.com/forum/#!forum/hardhats accessed on January 16, 2020 Harvard Kennedy School Ash Center for Democratic Governance and Innovation 2006 Innovations in American Government Award. Available at Http://Www.Innovations.Harvard.Edu/Awards.Html?Id039711. Accessed on June 3, 2018. Henderson, M. 
Henderson, M. L., Dayhoff, R. E., Titton, C. P., & Casertano, A. (2006). Using IHE and HL7 Conformance to Specify Consistent PACS Interoperability for a Large Multi-Center Enterprise. Journal of Healthcare Information Management, 20(3), 47.
Jha, A. K., Perlin, J. B., Kizer, K. W., & Dudley, R. A. (2003). Effect of the Transformation of the Veterans Affairs Health Care System on the Quality of Care. New England Journal of Medicine, 348(22), 2218-2227.
Joint Commission on Accreditation of Healthcare Organizations (JCAHO) (2008). Guiding Principles for the Development of the Hospital of the Future (Online). Available at http://www.jointcommission.org/assets/1/18/hosptal_future.pdf. Accessed on June 3, 2018.
Kizer, K. W., & Jha, A. K. (2014). Restoring Trust in VA Health Care. New England Journal of Medicine, 371, 295-297.
Koven, S. G. (2019). Discretion and National Policy: National Law Enforcement and Veterans Affairs Abuses. In The Case Against Bureaucratic Discretion (pp. 135-166). Palgrave Macmillan, Cham.
Kuzmak, Dayhoff, Gavrilov, Cebelinski, Shovestul, & Casertano (2012). Streamlining Importation of Outside Prior DICOM Studies into an Imaging System. Journal of Digital Imaging, 25(1), 70-77.
Levin, P. (2010). Dr. Peter Levin Discusses Innovation and Culture Change at the Dept. of Veterans Affairs. FedScoop YouTube video. Accessed on May 31, 2019, at https://www.youtube.com/watch?v=LMoqixKDIf4&fbclid=IwAR1eVklXcwRujMqgJ9PLgV7ygaea0po_0ZaynRStJujss1x4_rtKFBDqSSM
Levin, P. (2011). Meeting at VACO, Washington, D.C., between the RAPTOR team, VACI's Chuck Brown, and VHA CTO Peter Levin, May 2011, Veterans Affairs, Vermont Ave., Washington, D.C.
Lichtenwald, I. (2018, April 18). Why the VA Should Stick with VistA and Not Waste $16 Billion on an Attempt to Replace It. Open Health News (Online). Available at http://www.openhealthnews.com/story/2018-04-18/why-va-should-stick-vista-and-not-waste-16-billion-attempt-replace-it. Accessed April 23, 2018.
Lichtenwald, I. (2018, December 20). HIT Think: How to create EHRs that doctors do not hate. Health Data Management (Online). Available at https://www.healthdatamanagement.com/opinion/how-to-create-ehrs-that-doctors-dont-hate. Accessed August 13, 2019.
Longman, P. (2010). Best Care Anywhere. Berrett-Koehler Publishers.
Mansell, R., & Tremblay, G. (2013). Renewing the Knowledge Societies Vision for Peace and Sustainable Development. UNESCO.
MDWS Developer's Guide. Available at the VistA Document Library, http://www.va.gov/vdl/application.asp?appid=192. Accessed on July 19, 2018.
Medical Domain Web Services (MDWS) Version 2.0, C3-C1 Conversion Project Developer's Guide and Systems Management Guide (MWVS*2). Available at http://www.va.gov/vdl/application.asp?appid=192. Accessed on July 19, 2018.
Medverd, J., Cross, N., Font, F., & Casertano, A. (2012). Open Source Radiology Dashboard Improves Clinical Workflow and Protocol Decision Support. Presentation at RSNA.
Mitchell, B. (2019). VA scheduling tool enhancement 'almost complete,' just in time to be replaced. FedScoop. Accessed at https://www.fedscoop.com/va-vista-scheduling-enhancement-ig-report/ on August 21, 2019.
Molina, A. D. (2018). A systems approach to managing organizational integrity risks: Lessons from the 2014 Veterans Affairs waitlist scandal. The American Review of Public Administration, 48(8), 872-885.
Moore, C. D. (2015). Innovation Without Reputation: How Bureaucrats Saved the Veterans' Health Care System. Perspectives on Politics, 13(2), 327-344.
Obama Open Government Directive, January 21, 2009. Available at https://obamawhitehouse.archives.gov/the-press-office/transparency-and-open-government. Accessed on July 19, 2018.
Oliver, A. (2007). The Veterans Health Administration: An American Success Story? The Milbank Quarterly, 85(1), 5-35.
Ogrysko, N. (2019). New accountability office has not made a dent in VA's 'culture of retaliation,' whistleblowers say. Federal News Network. Available at https://federalnewsnetwork.com/veterans-affairs/2019/07/new-accountability-office-hasnt-made-a-dent-in-vas-culture-of-retaliation-whistleblowers-say/. Published June 26, 2019.
Ogrysko, N. (2019). Five years after Phoenix scandal, VA wait times still under scrutiny. Federal News Network, July 25, 2019. Available at https://federalnewsnetwork.com/veterans-affairs/2019/07/five-years-after-phoenix-scandal-va-wait-times-still-under-scrutiny/. Accessed on July 29, 2019.
Ogrysko, N. (2019). In abandoning VistA, VA faces culture change that's 'orders of magnitude bigger' than expected. Federal News Network, January 25, 2019. Available at https://federalnewsnetwork.com/veterans-affairs/2017/06/in-abandoning-vista-va-faces-culture-change-thats-orders-of-magnitude-bigger-than-expected. Accessed on July 29, 2019.
OI&T Transformation Presentation. https://www.oit.va.gov/reports/year-in-review/2016/. Accessed August 23, 2018.
Oppel, R. (2014). VA official acknowledges link between delays and patient deaths. The New York Times, A17.
OSEHRA, The Open Source Electronic Health Record Agent. http://www.osehra.org/
OSEHRA (2016). Open Source Software Strengths Weaknesses Opportunities Threats (SWOT) Analysis (Online). Available at https://www.osehra.org/sites/default/files/SLIN%200002AB%20-%20SWOT%20Q2%20Updated.pdf. Accessed on July 19, 2018.
OSEHRA Technical Journal (2016) (Online). Available at https://code.osehra.org/journal/journal/view/789. Accessed on July 19, 2018.
OSEHRA Technical Journal RAPTOR review (2016) (Online). Available at https://code.osehra.org/journal/reviewosehra/submit?review_id=31. Accessed on July 19, 2018.
OSEHRA Scheduling Contest (2016) (Online). Available at https://www.osehra.org/post/corporate-support/community-sourced-submission-osehrava-scheduling-contest-webinar. Accessed on July 19, 2018.
Percy, A. (2009). Quality Initiatives Undertaken by the Veterans Health Administration. DIANE Publishing.
Pham, R. (2017). Research Data Overview. Presentation at the Science Training Enhancement Program (Online). Available at nciphub.org/bdstep. Accessed August 23, 2018.
Proval, C. (2012). The Top Five Medical-Imaging IT Projects of 2012. Radiology Business Journal (RBJ) (Online). Available at http://www.radiologybusiness.com/topics/business/top-five-medical-imaging-it-projects-2012?nopaging=1. Accessed August 23, 2018.
Rubenstein, D. (2018). Industry Watch: Healing the VA Through Modernization (Online). Available at https://sdtimes.com/digx/industry-watch-healing-the-va-through-modernization/. Accessed August 23, 2018.
Shane, L. (2018, June 11). Report: Vets Still Face Long Waits with VA Choice Program (Online). Available at https://www.militarytimes.com/veterans/2018/06/04/report-vets-still-face-long-waits-with-va-choice-program/. Accessed August 23, 2018.
Shannon, T. (2018, June 11). Been There, Done That, Doesn't Work: Veterans Health Administration IT Goes Back in Time (Online). Available at http://www.openhealthnews.com/story/2018-06-11/been-there-done-doesn%E2%80%99t-work-veterans-health-administration-it-goes-back-time. Accessed August 23, 2018.
Shulkin, D. J. (2016). Beyond the VA Crisis--Becoming a High-Performance Network. New England Journal of Medicine, 374(11), 1003-1005.
Shulkin, D. J. (2019). It Shouldn't Be This Hard to Serve Your Country: Our Broken Government and the Plight of Veterans. PublicAffairs. ISBN 1541762657.
Skoufalos, M. (2012). Announcing the Top 5 Imaging IT Projects of 2012. Radiology Business Journal (RBJ) (Online). Available at https://www.radiologybusiness.com/topics/business-intelligence/announcing-top-5-imaging-it-projects-2012. Accessed August 13, 2019.
Slack, D. (2018). 'I knew something was not right': Mass cancellations of diagnostic test orders at VA hospitals draw scrutiny. USA Today, October 1, 2018. Available at https://www.usatoday.com/story/news/politics/2018/10/01/va-hospitals-cancellations-diagnostic-exam-orders-draw-scrutiny/1424298002/. Accessed October 14, 2018.
Stevens, R. (2017). A Time of Scandal: Charles R. Forbes, Warren G. Harding, and the Making of the Veterans Bureau. Baltimore: Johns Hopkins University Press.
The Independent Budget, Veterans Policy Agenda for Congress and the Administration. http://www.independentbudget.org/archive.html. Accessed August 23, 2018.
University of Michigan Medical School Information Services (2014). VistA Data Loader Version 2.2 (Online). Available at http://opensourcevista.net/nancysvistaserver/Vista Data Loader Version 2.2.doc. Accessed August 23, 2018.
U.S. Government Accountability Office. GAO Report (December 2012). Reliability of Reported Outpatient Medical Appointment Wait Times and Scheduling Oversight Need Improvement. https://www.gao.gov/assets/660/651076.pdf. Accessed on July 29, 2019.
U.S. Government Accountability Office. GAO Report (2015). Managing risks and improving VA health care, in High Risk Series: An Update. Retrieved from http://www.gao.gov/highrisk/overview. Accessed on July 29, 2019.
U.S. Government Accountability Office. GAO Report (April 2, 2019). "Veterans Affairs: Addressing IT Management Challenges Is Essential to Effectively Supporting the Department's Mission." Available at https://www.gao.gov/assets/700/698164.pdf. Accessed on July 29, 2019.
U.S. Government Accountability Office. GAO Report (July 25, 2019). "Electronic Health Records: VA Needs to Identify and Report System Costs." Available at https://www.gao.gov/assets/660/700478.pdf. Accessed on July 29, 2019.
VA Center of Innovation. https://www.innovation.va.gov/about.html. Accessed August 23, 2018.
VA National Center for Patient Safety, The Daily Plan (2009). Available at https://www.patientsafety.va.gov/professionals/onthejob/dailyplan.asp. Accessed on July 15, 2019.
VA Open Source Policy Memorandum (VAIQ# 7532631), dated November 4, 2014. Available at https://www.va.gov/vapubs/viewpublication.asp?Pub_ID=804&Ftype=2. Accessed August 23, 2018.
VA National Data Center Program (NDCP) Intake Management Request Form, provided by VACI in 2014.
VA National Data Center Program (NDCP) Intake Management Request Form Template, provided by VACI in 2014.
VA National Data Center Program (NDCP) Intake Process Questions, provided by VACI in 2014.
VA Office of Inspector General Report (2011). Informed Consent and Prevention of Disease Progression in Vets with Chronic Kidney Disease (Online).
Available at https://www.va.gov/oig/pubs/VAOIG-10-03399-51.pdf. Accessed on June 3, 2018.
VA Office of Inspector General Report (2014). Administrative Investigation: Failure to Properly Supervise, Misuse of Official Time and Resources, and Prohibited Personnel Practice, VA Center for Innovation, VA Central Office (Online). Available at https://www.va.gov/oig/pubs/VAOIG-13-01488-86r.pdf. Accessed on June 3, 2018.
VA Office of Inspector General Report (2019). VA's Implementation of the Veterans Information Systems and Technology Architecture Scheduling Enhancement Project Near Completion (Online). Available at https://www.oversight.gov/sites/default/files/oig-reports/VAOIG-16-03597-171.pdf. Accessed on August 23, 2019.
VA Office of Inspector General Report (2019). OIG Determination of Veterans Health Administration's Occupational Staffing Shortages (Online). Available at https://www.va.gov/oig/pubs/VAOIG-19-00346-241.pdf. Accessed on October 23, 2019.
VA Office of Inspector General Report (2019). Failures Implementing Aspects of the VA Accountability and Whistleblower Protection Act of 2017 (Online). Available at https://www.va.gov/oig/pubs/VAOIG-18-04968-249.pdf. Accessed on October 23, 2019.
VA 2019 Budget in Brief (Online). Available at https://www.va.gov/budget/docs/summary/fy2019vabudgetinbrief.pdf. Accessed on June 3, 2018.
VACI postings on Twitter (@VHAInnovation) (Online). Available at https://twitter.com/vainnovation?lang=en. Contact information: allison.amrhein@va.gov. Accessed on June 3, 2018.
VA Open Government Plan, version 1.3 (June 25, 2010). https://www.va.gov/OPEN/docs/open_govt_plan.pdf. Accessed on June 3, 2018.
VA Puget Sound Health Care System News: Radiologist Receives IT Innovation Award. https://www.pugetsound.va.gov/publicaffairs/vaitinnovation.asp. Accessed on June 3, 2018.
VA Information Resource Center (VIReC) Clinical Informatics Cyberseminar presentation led by Dr. J. Medverd, Modernizing VA Legacy Healthcare IT: RAPTOR Project (Radiology Protocol Tool Recorder) (Online). Available at http://www.hsrd.research.va.gov/for_researchers/cyber_seminars/archives/856-notes.pdf. Accessed on June 3, 2018.
VA Information Resource Center (VIReC) Quality Enhancement Research Initiative (QUERI) led by Dr. D. Hynes, Health Information Technology Approaches in QUERI Implementation Research: Case Study Evaluation (Online). Available at http://www.hsrd.research.va.gov/for_researchers/cyber_seminars/archives/856-notes.pdf. Accessed on June 3, 2018.
VA OI&T (Office of Information & Technology) Comprehensive Information Technology Plan, Sharing with Business Stakeholders, September 20, 2017. Available at https://docs.google.com/viewer?url=https%3A%2F%2Fwww.oit.va.gov%2Flibrary%2Fstrategy%2FComprehensiveITPlan.pptx%3Futm_source%3Dreportspage%26utm_medium%3Dlink%26utm_campaign%3Dcitp%26fbclid%3DIwAR2myPGNX-NvPqIsH-JQPR4XVt8HswlL1GYCYTLgBtKjyLOnTA9fq49j_UU. Accessed online on August 10, 2019.
Vega, R. (July 26, 2019). Interview on Connecting Vets Radio. https://connectingvets.radio.com/articles/va-innovation-advancements-medcine?fbclid=IwAR17RijcuRhnUOx5GIVGgeRQF3rgbqIavwMZ1XQKiO7HWajlkgal5RgMJjs. Accessed online on August 10, 2019.
VHA Innovation Ecosystem (July 10, 2019). https://medium.com/vainnovation/vha-innovators-network-adopts-the-three-box-solution-framework-3accea6880b1. Accessed online on July 10, 2019.
VHA Innovation Program Boot Camp, held at the Technology Acquisition Center (TAC), Eatontown, NJ, between Andrew Casertano, innovator, and the VHA Innovations Review Group, June 14, 2010.
VHA Innovation Program (August 12, 2010). Innovative Thinkers presentation, VA 2010 VistA eHealth University (VeHU) Conference, held in Tampa, FL.
VHA Solicitation/Contract VA118-11-RP-0173, issued by the VA Office of Acquisition and Logistics, Technology Acquisition Center, 2011. Contract Officer Sarah Basilotto.
VistA Document Library. Available at https://www.va.gov/vdl/. Accessed online on June 3, 2018.
VistA Modernization Report (May 4, 2010). Available at https://www.actgov.org/. Accessed online in May 2018.
VistA 4 Product Roadmap (2014). Available at https://docs.google.com/viewer?url=https%3A%2F%2Fwww.osehra.org%2Fsites%2Fdefault%2Ffiles%2Fvista_4_product_roadmap_3-24-14.pdf. Accessed on June 3, 2018.
Wagner, D. (2015). The VA Scandal Coverage. Available at https://www.azcentral.com/investigations/vahealthsystem/. Accessed on June 3, 2018.
Wennergren, D. M. (2009). Clarifying Guidance Regarding Open Source Software (OSS). Department of Defense Chief Information Officer.
Wennergren, D. M. (2009). Consideration of Open Source Software Memorandum. Department of Defense Chief Information Officer.
West, J., & O'Mahony, S. (2005, January). Contrasting Community Building in Sponsored and Community Founded Open Source Projects. In Proceedings of the 38th Annual Hawaii International Conference on System Sciences (HICSS'05) (pp. 196c-196c). IEEE.
Theoretical Organizational Models Literature
Abdekhoda, M., Dehnad, A., & Zarei, J. (2018). Determinant Factors in Applying Electronic Medical Records in Healthcare. World Health Organization Eastern Mediterranean Health Journal. Accessed online June 24, 2019, at http://applications.emro.who.int/emhj/v25/01/EMHJ_2019_25_01_24_33.pdf
Abdekhoda, M., Gholami, Z., & Zarea, V. (2018). Determinant Factors in Adopting Mobile Technology-Based Services by Academic Librarians. DESIDOC Journal of Library & Information Technology, 38(4), 271-277.
Awa, H. O., Ojiabo, O. U., & Emecheta, B. C. (2015). Integrating TAM, TPB and TOE Frameworks and Expanding Their Characteristic Constructs for E-Commerce Adoption by SMEs. Journal of Science & Technology Policy Management, 6(1), 76-94.
Bergek, A., Jacobsson, S., Carlsson, B., Lindmark, S., & Rickne, A. (2008). Analyzing the Functional Dynamics of Technological Innovation Systems: A Scheme of Analysis. Research Policy, 37(3), 407-429.
Biancani, S., McFarland, D. A., & Dahlander, L. (2014). The Semiformal Organization. Organization Science, 25(5), 1306-1324.
Bozeman, B., & Youtie, J. (2017). Socio-economic impacts and public value of government-funded research: Lessons from four US National Science Foundation initiatives. Research Policy. DOI: https://doi.org/10.1016/j.respol.2017.06.003
Burton-Jones, A., & Grange, C. (2013). From Use to Effective Use: A Representation Theory Perspective. Information Systems Research, 24(3), 632-658.
Burton-Jones, A., & Volkoff, O. (2017). How Can We Develop Contextualized Theories of Effective Use? A Demonstration in the Context of Community-Care Electronic Health Records. Information Systems Research, 28(3), 468-489.
Chan, J. (2018). Making Literature Reviewing Less Painful and More Commonplace (Online). Available at https://www.youtube.com/watch?v=P41b8txru1q. Accessed November 23, 2018.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 319-340.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User Acceptance of Computer Technology: A Comparison of Two Theoretical Models.
Management Science, 35(8), 982-1003.
Feak, C., & Swales, J. M. (2009). Telling a Research Story: Writing a Literature Review (Michigan Series in English for Academic Professional Purposes). Ann Arbor, MI: University of Michigan Press.
Gangwar, H., Date, H., & Ramaswamy, R. (2015). Understanding Determinants of Cloud Computing Adoption Using an Integrated TAM-TOE Model. Journal of Enterprise Information Management, 28(1), 107-130.
Govindarajan, V. (2016). The Three-Box Solution: A Strategy for Leading Innovation. Harvard Business Review Press.
Gray, P. H., & Cooper, W. H. (2010). Pursuing failure. Organizational Research Methods, 13(4), 620-643.
Greenhalgh, T., Robert, G., Bate, P., Macfarlane, F., & Kyriakidou, O. (2008). Diffusion of Innovations in Health Service Organisations: A Systematic Literature Review. John Wiley & Sons.
Jones, G. R. (2004). Organizational theory, design, and change: Text and cases. Upper Saddle River, NJ: Pearson/Prentice Hall.
Kaptein, M. (2013). Workplace morality: Behavioral ethics in organizations. Bingley, UK: Emerald Group.
Machi, L. A., & McEvoy, B. T. (2016). The Literature Review: Six Steps to Success. Corwin Press.
Oliveira, T., & Martins, M. F. (2011). Literature Review of Information Technology Adoption Models at Firm Level. Electronic Journal of Information Systems Evaluation, 14(1), 110.
Ridley, D. (2012). The Literature Review: A Step-by-Step Guide for Students. Sage Publications, Beverly Hills, CA, USA.
Robbins, S. P., & Judge, T. A. (2009). Organizational behavior (13th ed.). Upper Saddle River, NJ: Prentice Hall.
Tornatzky, L. G., Fleischer, M., & Chakrabarti, A. K. (1990). The Processes of Technological Innovation. Issues in Organization and Management Series. Lexington Books.
Van Lancker, J., Mondelaers, K., Wauters, E., & Van Huylenbroeck, G. (2016). The Organizational Innovation System: A Systemic Framework for Radical Innovation at the Organizational Level. Technovation, 52, 40-50.
Vogel, K. M., Jameson, J. K., Tyler, B. B., Joines, S., Evans, B. M., & Rendon, H. (2017). The Importance of Organizational Innovation and Adaptation in Building Academic-Industry-Intelligence Collaboration: Observations from the Laboratory for Analytic Sciences. The International Journal of Intelligence, Security, and Public Affairs, 19(3), 171-196.
Vogel, K. M. (2019). "Big Data, Privacy, and the U.S. Intelligence Workforce." Presentation at UMD CASCI, April 30, 2019.
Vogel, K. M. (2019). "Study: Intelligence Community Benefits from Collaborations, and Can Do Better." https://news.ncsu.edu/2019/06/intelligence-community-collaborations/. Accessed June 19, 2019.
Vogel, K. M., & Tyler, B. B. (2019). Interdisciplinary, cross-sector collaboration in the US intelligence community: lessons learned from past and present efforts. Intelligence and National Security, 1-30.
Vogel, S. (2013, February 7). Vets See Promise in Hagel and His Short VA Tenure. The Washington Post (Online). Available at https://www.washingtonpost.com/politics/vets-see-promise-in-hagel-and-his-short-va-tenure/2013/02/07/f4001f12-6fd6-11e2-ac36-3d8d9dcaa2e2_story.html?utm_term=.700e1ccfe729. Accessed August 23, 2018.
Walker, R. M. (2007). An empirical evaluation of innovation types and organizational and environmental characteristics: Towards a configuration framework. Journal of Public Administration Research and Theory, 18(4), 591-615.
Whittinghill, C. (2011). An evaluation of the perceived organizational culture and innovative climate of a Department of Defense community of organizations.
The University of Alabama in Huntsville.
Whittinghill, C., Berkowitz, D., & Farrington, P. A. (2015). Does your culture encourage innovation? Defense Acquisition University, Fort Belvoir, VA.