How secure is your imaging AI implementation? 7 aspects to consider

Radiology operations taking on AI projects would do well to put cybersecurity at the top of their list of aims and concerns. A new paper offers guidance.

Citing the appeal of augmentative AI in light of the growth of imaging data worldwide over the past several years, the authors point to lessons learned outside of medicine.

“Embarking on a radiology artificial intelligence (AI) project is complex and not without risk, especially as cybersecurity threats have become more abundant in the healthcare industry,” they write. “Fortunately, healthcare providers and device manufacturers have the advantage of being able to take inspiration from other industry sectors that are leading the way in the field.”

The paper is lead-authored by radiologist Brendan Kelly, MBBS, of St. Vincent’s University Hospital in Dublin, Ireland. Senior author is computer scientist James Burrell, PhD, of the University of Hawaii. European Radiology published the work July 7 [1].

In a section on data management, the authors cover seven aspects key to AI adoption. Some are common across all industries, others are specific to healthcare and a few are unique to medical imaging. Examples:

1. Data ethics. Acknowledging that the rights and wrongs of using patients’ data to train and test AI are “complex and a matter for debate,” Kelly and co-authors cite prior scholarly commentary arguing that the best basis for such deliberations—regardless of data ownership—is considering data “a resource that can benefit society.” More:

“[A]lmost every healthcare institution has had a third party request to purchase their data. While the ethics of buying data are still a subject for debate, it is clear that security and confidentiality updates are inextricably linked with good ethical standards.”

2. Data access. Kelly and colleagues note that an important security concern is the sheer number of IT systems and clinical services interacting with electronic health records inside the provider organization.

“The ability for multiple users to interact with an EHR is a cornerstone of its value. Multiple users, however, pose risks of their own. Those with permission to access the internet or download online content have the potential to be exploited.”
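For illustration only, the sketch below shows the kind of least-privilege permission check that limits what an AI integration account can do inside an EHR-connected system. The roles, actions and account names are hypothetical and not drawn from the paper.

```python
# Minimal least-privilege sketch: each role is granted only the actions it
# explicitly needs. Roles and actions here are hypothetical placeholders.
ROLE_PERMISSIONS = {
    "radiologist": {"view_study", "annotate_study"},
    "ai_service": {"view_study"},            # read-only; no export, no downloads
    "it_admin": {"manage_accounts"},
}

def is_allowed(role: str, action: str) -> bool:
    """Permit an action only if the role explicitly grants it."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("ai_service", "view_study"))    # True
print(is_allowed("ai_service", "export_study"))  # False: not in the grant set
```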

3. Data queries. The authors point to Health Level Seven (HL7) Version 3 as the established gold standard for exchanging data across technologies from different vendors. But use caution, they warn:

“HL7 Version 3, which is based in XML, involves the transfer of text data without encryption. To facilitate transfer between different technologies, HL7 assumes that encryption will take place at a lower level and provides no protocol-level encryption.”
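In practice, that "lower level" encryption typically means securing the transport itself. The Python sketch below wraps a plain TCP exchange in TLS before an HL7 v3 XML payload is sent; the host, port and message are hypothetical placeholders, and many real deployments carry HL7 v3 over HTTPS-based web services instead.

```python
import socket
import ssl

# Sketch of supplying the transport-layer encryption that HL7 Version 3 assumes
# but does not provide itself. Host, port and payload are hypothetical.
HOST, PORT = "hl7.partner.example", 6661
HL7_V3_XML = b"<PRPA_IN201301UV02 xmlns='urn:hl7-org:v3'>...</PRPA_IN201301UV02>"

context = ssl.create_default_context()  # verifies the server certificate by default

with socket.create_connection((HOST, PORT)) as raw_sock:
    # Wrap the TCP socket in TLS so the XML never crosses the network unencrypted.
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        tls_sock.sendall(HL7_V3_XML)
        reply = tls_sock.recv(4096)
```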

4. Data de-identification. Kelly and colleagues name several challenges involved in this process. For one, most medical images contain metadata that wily malefactors can unpack to identify patients.  

“Moreover, the medical images themselves often contain identifiable information such as data that can be used to reconstruct a facial image of the patient. This emphasizes some of the additional complexities encountered by radiology specific use cases, the concept of disproportionate effort notwithstanding.”
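A header-level de-identification pass is usually the first line of defense. The sketch below uses the open-source pydicom library to blank a handful of identifying tags; the tag list and file paths are illustrative rather than a complete de-identification profile, and, as the authors note, it does nothing about pixel data from which a face could be reconstructed.

```python
import pydicom

# Sketch of header de-identification with pydicom. The tag list is illustrative,
# not a complete de-identification profile; paths are hypothetical.
IDENTIFYING_TAGS = [
    "PatientName", "PatientID", "PatientBirthDate",
    "PatientAddress", "ReferringPhysicianName", "InstitutionName",
]

ds = pydicom.dcmread("study/slice_001.dcm")

for tag in IDENTIFYING_TAGS:
    if tag in ds:
        ds.data_element(tag).value = ""   # blank rather than delete, keeping the header valid

ds.remove_private_tags()                  # vendor-specific elements often carry identifiers too
ds.save_as("deidentified/slice_001.dcm")

# Note: this addresses metadata only; pixel data that allows facial
# reconstruction requires separate defacing or skull-stripping steps.
```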

5. Data storage. Back in 2019, researchers demonstrated the relative ease with which malware could alter the DICOM file format. Kelly and co-authors allow that no attack of this type has been reported, at least not yet. But the very possibility shows that medical imaging faces threats found in no other industry.

“This reality further emphasizes the need to examine healthcare technologies and industry standards to identify potential security vulnerabilities.”
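One pragmatic countermeasure is routine integrity monitoring of the archive, so a tampered file is flagged before it is read or fed to a model. The sketch below records SHA-256 hashes at ingest and re-checks them later; the directory layout is hypothetical.

```python
import hashlib
from pathlib import Path

# Sketch of file-integrity monitoring for a DICOM archive: record a hash at
# ingest, then re-check it before the file is read or used for AI training.

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Recorded once, at ingest time (hypothetical archive directory).
baseline = {p: sha256_of(p) for p in Path("archive").rglob("*.dcm")}

def verify(path: Path) -> bool:
    """Return False if the stored file no longer matches its ingest-time hash."""
    return sha256_of(path) == baseline.get(path)
```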

6. Data transfer. Healthcare institutions commonly lack the computational resources needed to transfer mountains of data quickly yet securely to industry partners that could perform the requisite analysis, the authors point out. This shortfall "increases the cyberattack surface," they remark.

“Whether data is stored locally or remotely, security and privacy are important considerations for hardware and software applications, access control and integrity of the systems governing these processes.”
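Client-side encryption before the data ever leaves the institution is one way to narrow that attack surface. The sketch below uses the widely available cryptography package to encrypt an export archive; the file names are hypothetical, and key management, which is the harder problem, is out of scope here.

```python
from cryptography.fernet import Fernet  # third-party package: cryptography

# Sketch: encrypt an export archive before it is handed to an analysis partner,
# so the data is unreadable without the key. File names are hypothetical.

key = Fernet.generate_key()   # in practice, generated and held in a key-management system
cipher = Fernet(key)

with open("export/imaging_batch.tar", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

with open("export/imaging_batch.tar.enc", "wb") as f:
    f.write(ciphertext)
# The .enc file can now be transferred; only holders of `key` can recover the contents.
```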

7. Data labeling. This step involves the assignment of one or more descriptors that provide context to data. It represents “one of the most interesting challenges both in Big Data and AI for digital health,” the authors comment.

“Data labels could become compromised during a cyberattack. An AI algorithm may then be subsequently trained on this data which would result in inaccurate results or recommendations. This could have a harmful impact on patient care and may only become apparent during the evaluation or testing of the AI algorithm.”
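A simple safeguard is to authenticate label files when they are created and refuse to train if the check fails. The sketch below does this with an HMAC; the secret key, file names and JSON format are hypothetical placeholders.

```python
import hashlib
import hmac
import json

# Sketch: authenticate a label file with an HMAC recorded when the labels were
# created, and refuse to train if the check fails. Key and paths are hypothetical.

SECRET_KEY = b"replace-with-key-from-a-secrets-manager"

def label_mac(label_bytes: bytes) -> str:
    return hmac.new(SECRET_KEY, label_bytes, hashlib.sha256).hexdigest()

with open("labels/train_labels.json", "rb") as f:
    raw = f.read()

with open("labels/train_labels.json.mac") as f:
    recorded_mac = f.read().strip()

if not hmac.compare_digest(label_mac(raw), recorded_mac):
    raise RuntimeError("Label file fails integrity check; do not train on it.")

labels = json.loads(raw)  # safe to use only after the check passes
```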

Kelly et al. conclude: “While the potential for AI to revolutionize the practice of radiology is clear, it is important to realize the potential impact of increased connectivity and adoption of technology on the confidentiality, integrity and availability of healthcare data.”

The paper is available in full for free.

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
