Artificial Intelligence & Machine Learning , Events , Fraud Management & Cybercrime

Emerging Security Concerns About Generative AI in Healthcare

Lee Kim of HIMSS Discusses the Promise and Risk of Evolving AI Tools
Generative AI tools such as ChatGPT will undoubtedly change the way clinicians and healthcare cybersecurity professionals work, but the use of these technologies comes with security, privacy and legal concerns, said attorney Lee Kim of the Healthcare Information Management and Systems Society.

"I would recommend that everyone in healthcare should have some passing literacy because it is going to change what we do, whether we are in cyber or something else," she said in an interview with Information Security Media Group during the 2023 Healthcare Information Management and Systems Society Global Health Conference and Exhibition in Chicago.

Her concern with the tools is that "some of these technologies are a bit of a black box in terms of what goes into training these algorithms - whether, if we input something [sensitive], that will be incorporated," she said.

"Beware if you're inputting your intellectual property or secrets you don't want anyone but your closest friends or organization to know," Kim warned.

In the interview (see audio link below photo), Kim also discusses:

Kim, an attorney, is the senior principal of cybersecurity and privacy at HIMSS. She also has served as a team leader of the U.S. Department of Homeland Security's analytic exchange program and as a member of the National Cybersecurity Training and Education Center National Visiting Committee. Before joining HIMSS, Kim practiced law in the areas of IT, healthcare technology, intellectual property and privacy and security. She also previously worked in the healthcare technology field.
