Anna Delaney: Hello, and thanks for joining us for the ISMG Editors' Panel. I'm Anna Delaney, and today we're covering everything from RSA 2024 to the latest guidance on web trackers from federal regulators, alongside developments in quantum computing. Our merry team today includes Tom Field, senior vice president of editorial; Marianne Kolbasuk McGee, executive editor for HealthcareInfoSecurity; and Michael Novinson, managing editor for ISMG business. Great to see you.

Tom Field: Thanks for having us over.

Marianne McGee: Thanks.

Anna Delaney: Tom, you have some green visions behind you. Where are you?

Tom Field: Well, Norway, technically. It's the Norway part of the Disney Epcot theme park. And behind me you see Elsa and Anna from Frozen. But I thought, with all the exotic European landscapes you've been sharing recently, I had to find something, even if it was on home turf.

Anna Delaney: Very good. I've yet to go to Norway, so you've got me there. Marianne, you're working?

Marianne McGee: Yeah, this is a photo looking down into the main convention center outside the press room at HIMSS last week.

Tom Field: Clearly before I got dizzy.

Marianne McGee: Yeah, well, it was actually in the middle of the day, so you'd be surprised. A lot of people, I think, go into the exhibits. This is just for the boring people here.

Anna Delaney: Boring people - was it well attended?

Marianne McGee: Yeah. You know, I'm always a very bad judge of these things, because when you're there, you're kind of cooped up, based on what you're covering. When I went to the exhibit booths and walked around, I went over to the Cyber Command Center, and it looked busy. They had a couple of theaters going on there. It depends where you are - some sessions are more filled than others. You know how it is.

Anna Delaney: For sure. And Michael, where are you?

Michael Novinson: I am in my hometown of Seekonk, Massachusetts. Spring has sprung, and that means pick-your-own blueberry season is around the corner.
If you look in the bottom left corner of my image, the sign says "pick your own blueberries." So come June or July, that's a full four-minute drive from my house. I think we'll have to hit it up. I certainly think it'd be a big hit with my daughter.

Anna Delaney: So blueberry pies at your place then.

Michael Novinson: Sounds good. You've got to make your way over here.

Anna Delaney: Well, this is a glimpse of Littlehampton, a seaside town in West Sussex, England, where I was last weekend. It's just a refreshing change of scenery - a bit of salty sea air and even some sun rays. So that was very nice.

Tom Field: You have to explain where you stayed.

Anna Delaney: Well, friends rather got quite lucky in an auction, actually, and bought what was originally Roger Moore's penthouse. So, 180-degree views of the sea. It's a 1970s kind of ... it's actually a very ugly building - the views are better from the inside out - but it's quite a fun story, isn't it?

Tom Field: Exactly, Anna. You were the only Bond girl.

Anna Delaney: Yeah, Miss Moneypenny, maybe, as well. So Tom, I can't quite believe I'm saying this, but RSA 2024 is right around the corner.

Tom Field: Just over a month away? Can you believe it? It just ... in preparation for you, I have just received a list of folks that we're going to be going after to bring into our studios. So we'll be talking about this with you very soon, in quite some detail. RSA is coming. This is going to be - I say it every year - probably the biggest, most focused team we send to the event. And our mission is to go there and talk with the movers and shakers and make sure we're talking with people about the topics that matter most. Now, we're not there to cover every session - nobody is. But if we can get some of the most important people from around the world who are plugged into the cybersecurity/technology space and bring them into our studios - studios, plural - to sit down, have some discussions, engage in some panels and create some new content, well, I think we're going to have a grand time, and I very much look forward to it. So how about you, Anna?

Anna Delaney: Very much so. I've been receiving a few emails wanting to set up interviews, and I'm saying, hold fire for now - I haven't seen the schedule yet. But it's exciting. It's exciting then also to bring the team together again - always the best meals and in-between moments that we can grab to build a team - but also to speak to a whole range of people, from academics to anybody in the legal profession to, of course, security leaders and people in government. So I'm really looking forward to it.

Tom Field: That's why I wanted to talk about it today too - to let our audience know that this is upcoming. We're starting to fill our studios now, and we're interested in a couple of things: your recommendations - who do you think should be visiting with us at the RSA studios - or, if you're going to be there, whether you'd like to come and join one of our discussions. We post this panel on our sites, we post it on social media, so there's plenty of opportunity to respond and comment. Let us know who you'd like to see in our studios and whether you'd like to visit as well. And we'll show you a teaser, because we had the opportunity to sit down with so many thought leaders over the course of the three or four days. I think we did what, Michael, north of 150 interviews last year?

Michael Novinson: That sounds correct to me.

Tom Field: And guess what - we're going to top that this year. So it's luminaries: we talked with CISOs and, as Anna said, with academics, researchers, government officials. I want to share just a teaser of one of the discussions I had last year with White House advisor Anne Neuberger. Now, the setup for this is that we were talking about accomplishments so far in the Biden administration. I asked her what she felt the administration had done in the first two years of his tenure when it comes to cybersecurity. So I'm going to share here her response to me.

Anne Neuberger: It's a great question. I think, to your point, the executive order really said two core messages. One, we will practice what we preach, and we set aggressive guidelines for improving cybersecurity across federal government networks.
116 00:06:11.220 --> 00:06:14.010 That was in the aftermath of SolarWinds that compromised 117 00:06:14.610 --> 00:06:18.300 quite a few sensitive federal government networks. The second 118 00:06:18.300 --> 00:06:21.930 piece was we said, we in the U.S. government buy large 119 00:06:21.930 --> 00:06:25.110 amounts of technology and we buy the same tech. Americans are 120 00:06:25.110 --> 00:06:27.600 buying, American companies are buying, let's use the power of 121 00:06:27.600 --> 00:06:31.170 the purse to say we will only buy software that meets these 122 00:06:31.170 --> 00:06:33.870 critical security standards. Let's establish that standard. 123 00:06:34.140 --> 00:06:37.860 And buy our own purchases, lift that up. There were many 124 00:06:37.860 --> 00:06:42.000 elements of the executive order. Those were two key ones that we 125 00:06:42.000 --> 00:06:45.510 focused on. When we look at the National Cybersecurity Strategy. 126 00:06:45.660 --> 00:06:48.570 You have, of course, that first piece where it captures the work 127 00:06:48.570 --> 00:06:51.330 done to improve the security of critical infrastructure I 128 00:06:51.330 --> 00:06:54.870 mentioned a moment ago, it focuses on our international 129 00:06:54.870 --> 00:06:58.230 partnerships. And it focuses as well to say there's a shared 130 00:06:58.230 --> 00:07:01.080 partnership between the companies who build tech, and 131 00:07:01.080 --> 00:07:05.070 the companies who use tech. And as tech is a bigger part of our 132 00:07:05.070 --> 00:07:07.440 economy, it is a bigger part of our critical infrastructure, 133 00:07:07.740 --> 00:07:11.340 that companies who build tech really need to recognize their 134 00:07:11.340 --> 00:07:14.130 role in building tech that's as secure as possible. 135 00:07:14.900 --> 00:07:17.442 Tom Field: And boy, that last topic hasn't become any more 136 00:07:17.496 --> 00:07:20.579 critical over the past year, there's going to be a lot to 137 00:07:20.633 --> 00:07:24.094 talk about. Think about it. Ever since we were at RSA last year, 138 00:07:24.149 --> 00:07:27.286 AI has exploded more than had even then we have seen China 139 00:07:27.340 --> 00:07:30.423 exert his muscles in terms of incursions into us critical 140 00:07:30.477 --> 00:07:33.830 infrastructure. And we've seen an explosion of ransomware that 141 00:07:33.884 --> 00:07:37.076 Marianne has talked about, specifically in her sector every 142 00:07:37.130 --> 00:07:40.375 week recently. Lots to talk about with our constituents this 143 00:07:40.429 --> 00:07:40.700 year. 144 00:07:41.780 --> 00:07:43.250 Anna Delaney: it's the best gauge of what's happening in the 145 00:07:43.250 --> 00:07:46.700 industry. I mean, it's the event of the year. So as you say, lots 146 00:07:46.700 --> 00:07:48.950 to talk about really looking forward to see you there in May 147 00:07:48.950 --> 00:07:54.110 then. Well, Marianne federal regulators have issued some 148 00:07:54.200 --> 00:07:57.860 updated guidance regarding the use of web trackers on patient 149 00:07:57.860 --> 00:08:01.010 portals and health related websites. And I know this topic 150 00:08:01.010 --> 00:08:04.070 has been on your radar for a while now. So I'd love to hear 151 00:08:04.070 --> 00:08:07.310 your insights and maybe initial impressions of the updated 152 00:08:07.000 --> 00:12:29.920 Anna Delaney: Lots of acronyms to get through that, well-done. 153 00:08:07.310 --> 00:08:07.910 guidance. 
Marianne McGee: Sure. Well, first of all, online trackers such as the Meta Pixel are widely used on many websites, in and outside the healthcare sector. But in the healthcare sector, they're very common on hospital websites, patient portals and other health-related websites. Those online trackers pose privacy concerns because they may share sensitive information about individuals from the hospital websites and portals with third parties such as the tracking vendors, and that sharing might involve unauthorized disclosure of protected health information under HIPAA.

Now, this week, the Department of Health and Human Services updated controversial guidance that it first issued in December of 2022. That earlier guidance warned HIPAA-regulated entities that using online trackers on their websites and patient portals to collect and transmit protected health information, including the IP addresses of users' devices, could constitute HIPAA violations subject to enforcement actions such as civil monetary fines. On Monday, HHS OCR walked that 2022 guidance back a bit by providing some new scenarios illustrating when the use of web trackers to collect and transmit certain user information may or may not constitute a HIPAA violation. For instance, depending on the scenario, HHS OCR now says that not every IP address is considered protected health information; an IP address may be PHI only in certain circumstances - when an individual is visiting a website in relation to their past, present or future healthcare. HHS OCR clarified that the intention of the website visitor also matters when making that determination. So if an individual is accessing a website or using an app for information regarding their own healthcare needs, while using their own device, the collection of that person's IP address is still considered PHI. However, if the user is visiting a hospital's website and looking at job postings, or maybe just the visiting hours, that user's IP address would not necessarily be considered PHI under HIPAA.

Now, this all gets pretty complicated, but a lot of it matters a great deal to certain entities, such as hospitals, that are frequent users of these tracking tools. After HHS OCR issued its earlier guidance in December 2022, broadly warning about the use of tracking tools, the American Hospital Association filed a lawsuit against HHS demanding that the agency rescind or amend its guidance. The AHA contends, among other allegations, that HHS OCR's broad December 2022 guidance exceeded the agency's authority under HIPAA and the First Amendment. The AHA said that the earlier guidance upended hospitals' and health systems' ability to share healthcare information with their communities, to analyze their own websites to enhance accessibility and to improve public health.

Now, the AHA told me this week that HHS OCR modifying its earlier guidance, in response to the group's lawsuit last year, concedes that the original guidance was flawed as a matter of law and policy. But even so, the AHA complained that the updated guidance still suffers from the same basic defects as the original and that the agency cannot rely on these cosmetic changes to evade judicial review. So the AHA says that HHS' modified guidance will continue to chill hospitals' use of commonplace technologies, such as web tracking, that they really do need in order to effectively reach their patients.

The AHA's lawsuit against HHS, as of yesterday, was still playing out in federal court. In the meantime, several U.S. healthcare organizations are also facing proposed class action lawsuits by patients involving privacy concerns over their current or previous use of online trackers on their websites and patient portals. Many of those lawsuits were filed after HHS OCR issued its original HIPAA guidance about web trackers in 2022. Also, Facebook parent company Meta faces a proposed consolidated class action lawsuit in California alleging that it violated privacy law by collecting patient information through its Pixel tracker on hospital websites. That class action came into play even before HHS issued its original broad web tracking guidance.

So now we have to see whether HHS OCR takes any future enforcement actions against regulated entities that continue to use these web trackers. The agency has been warning about possible enforcement actions in these cases for more than a year now, but so far it has not issued any - maybe because the AHA lawsuit kind of chilled everything for a while.

Anna Delaney: Lots of acronyms to get through there - well done. But so, Marianne, what do you think the broader effects of this guidance will be on the healthcare industry or patient privacy rights? Do you have thoughts of your own on that?

Marianne McGee: Well, soon after HHS OCR issued that initial guidance - and even before that - there was an investigative report by, I think it was The Markup and another nonprofit organization, that basically did an analysis and found that thousands of hospital websites use these trackers. And many patients just don't know that. They don't know that if they're looking up, say, symptoms of cancer on a portal, certain information could then be sent to Facebook, and all of a sudden they get these weird ads, and whatever else happens. So it's kind of spooky, I think, to some people. But the hospitals that use these trackers say, well, we need these in order to see what our patients are concerned about. And when that investigative report came out, that's when you started seeing some of these hospitals actually report HIPAA breaches involving their own use of these online trackers - just to kind of cover themselves: hey, maybe this is a breach, we're not sure, we're using trackers - and they reported these large breaches. So I think it's hard to tell what's going to go on, because these trackers are pervasive on all websites. You go online and look for sneakers, and all of a sudden you get 20 ads sending you recommendations for pink sneakers. These things are not unusual. But when it comes to collecting sensitive information about patients, and then maybe being able to link who these patients are, that's where it gets kind of creepy. And I don't know really what you can do about it, other than to kind of threaten these organizations not to use these trackers. Because they're so embedded now, I don't know what it would take to pull them out. And yet some of the damage has already been done.
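To make Marianne's point concrete: one rough way to check whether a public-facing page references third-party trackers is to fetch its HTML and look for well-known tracker hostnames. The Python sketch below is a minimal illustration using only the standard library; the hostname list and the example URL are placeholder assumptions rather than a complete catalog, and a real audit would also have to account for scripts that tag managers inject at runtime.

```python
import urllib.request

# Illustrative, not exhaustive: hostnames commonly associated with web trackers.
TRACKER_HOSTS = [
    "connect.facebook.net",      # Meta Pixel loader script
    "www.googletagmanager.com",  # Google Tag Manager
    "www.google-analytics.com",  # Google Analytics
]

def find_trackers(url: str) -> list[str]:
    """Fetch a page and report which known tracker hosts appear in its static HTML."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return [host for host in TRACKER_HOSTS if host in html]

if __name__ == "__main__":
    # Placeholder URL - substitute a page you are authorized to test.
    page = "https://www.example.org/"
    hits = find_trackers(page)
    print(f"{page}: {', '.join(hits) or 'no known tracker hosts found in static HTML'}")
```

A static scan like this only catches trackers referenced directly in the page source, which is part of why the exposure is hard for hospitals, and for patients, to see.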
Anna Delaney: It's a complex story. Well, thank you so much, Marianne, for that. Michael, we're talking quantum computing, inspired by a feature you've written this week. The industry has obviously long discussed the emergence of quantum computing, but the question remains: just how imminent is the threat it poses to current encryption standards and digital security?

Michael Novinson: It's a good question, and thank you for asking it. As you said, this has been on the radar as a theoretical question really dating back to the 1990s - that being the question of, once we have computers that are powerful enough to break RSA, what does that mean for our ability to secure internet traffic, to secure network traffic? But what was once a hypothetical scenario is increasingly becoming real: experts today believe that within the next decade we will see advances in quantum computing to the point where RSA, or the modern cryptography standards, can be broken. So what does that mean? It means a couple of things for offense and a couple of things for defense. On the offensive side, we're seeing a lot of these steal now, decrypt later schemes, where adversaries are just collecting data - data they can't view today - in the hope that once RSA is able to be broken, they can decrypt it and enjoy all the goodies then. So part of this is figuring out what types of data to target, where this is relevant. Essentially, you're looking for pieces of equipment that have a long lifespan, so what's stolen at least has a chance of still being in use once RSA is broken. Think pieces of medical equipment, think automobiles - stuff that's used often for decades rather than mere years. So that's what we're seeing adversaries go after.
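One widely cited way to formalize the "steal now, decrypt later" risk Michael describes is Mosca's inequality: if the number of years your data must stay confidential plus the years a migration will take exceeds the years until a cryptographically relevant quantum computer arrives, data harvested today is already exposed. The Python sketch below is a back-of-the-envelope illustration; the figures are hypothetical assumptions, not estimates from the panel.

```python
def quantum_exposure_years(shelf_life_years: float,
                           migration_years: float,
                           years_until_crq_computer: float) -> float:
    """Mosca's inequality: exposure exists when shelf life plus migration time
    exceeds the time until a cryptographically relevant quantum computer."""
    return (shelf_life_years + migration_years) - years_until_crq_computer

# Hypothetical figures for data tied to a long-lived device such as medical equipment.
exposure = quantum_exposure_years(shelf_life_years=15,   # how long records must stay confidential
                                  migration_years=5,     # time to re-encrypt or replace systems
                                  years_until_crq_computer=10)
if exposure > 0:
    print(f"At risk: data harvested today could be readable ~{exposure:.0f} years too early.")
else:
    print("Within tolerance under these assumptions.")
```

Long-lived assets such as medical devices and vehicles score badly on exactly this test, which is why Michael flags them as the likeliest targets.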
Michael Novinson: What can defenders do now? There's certainly a bit of a waiting game here: in the United States, at least, we're still waiting for the adoption of the new cryptography standards that private industry would adhere to and that would be considered quantum proof. They're working on them. Folks I spoke to for this story said that there are four algorithms currently under consideration, with a decision expected soon, but nothing today. Some of private industry has gone ahead - look around at press releases and you can see that folks are already advertising that they're building hardware, they're building technology, with quantum-resistant algorithms. You've seen Apple advertising this with some of their most modern phones, and Cloudflare advertising this with some of the technology they have. Obviously, today that's not backed by any government agency - they haven't had to get certified by anyone - but that will change soon. So from the standpoint of an individual contributor or a chief information security officer, what should you do now? Obviously, get your house in order to prevent any bleeding if offense beats the defense: segmentation, isolation, air gapping - do all that type of stuff - so that if there is some type of compromise, you can minimize the extent of the damage. But it's also really about doing a ton of asset inventory: figure out where within your organization you're using cryptography, which isn't really something folks have thought about too much to date. And figure out not only where you're using cryptography, but what secrets are being guarded by that cryptography, and which of those secrets are a) most sensitive and b) most likely to still be relevant several years down the road, so that you can have a prioritization list in effect once there are more robust quantum-resistant defenses you can adopt. A lot of this is also just trying to future-proof, so that once there is quantum-resistant encryption you can put in place, it's a seamless transition - you can remove your current cryptography, replace it with the more advanced cryptography, and do that with minimal disruption to your organization.
And then it's also about thinking about your workforce, not just in terms of their corporate life but also their personal life. If you're thinking about PII and HIPAA, this isn't just an enterprise consideration. If you're in an HR department, or you're in payroll: do you have credit card information, do you have personal health records, medical records for your employees? What are you doing to keep their personal data safe, not just corporate data and data for your customers? So there are multiple different spheres. It's really about trying, today, to get those ducks in a row so that once more modern algorithms are available, they can be implemented quickly, and in the most important areas first.
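A concrete starting point for the cryptographic asset inventory Michael recommends is simply recording what your externally reachable endpoints negotiate today, so you know what has to change once quantum-resistant standards land. The Python sketch below, a minimal illustration built on the standard library, snapshots the TLS protocol version and cipher suite for a list of hosts; the hostnames are placeholders, and a full inventory would also need to cover internal services, data at rest, code signing and hardware roots of trust.

```python
import socket
import ssl

def tls_snapshot(host: str, port: int = 443) -> dict:
    """Record the TLS version and cipher suite a server negotiates with us."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            name, protocol, bits = tls.cipher()
            return {"host": host, "protocol": tls.version(),
                    "cipher": name, "secret_bits": bits}

if __name__ == "__main__":
    # Placeholder hosts - replace with endpoints you own or are authorized to scan.
    for host in ["www.example.org", "portal.example.org"]:
        try:
            print(tls_snapshot(host))
        except OSError as err:
            print(f"{host}: unreachable ({err})")
```

Paired with a note of what data sits behind each endpoint and how long it must stay confidential, a listing like this is enough to start the prioritization list he describes.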
Anna Delaney: Lots of sound advice there. Thanks, Michael. How do we see different regions approaching readiness for quantum computing? And how does it affect cybersecurity?

Michael Novinson: Good question. And I do think we're seeing some variation in terms of compliance-driven approaches versus risk-driven approaches. Some of the folks I was speaking to said that NIST - the U.S. National Institute of Standards and Technology - is really focused on a compliance-driven approach to quantum-safe cryptography, and that other regions, Europe and Asia, have been focused more on risk minimization rather than a compliance framework. So that's certainly a piece of it. And then, from a cybersecurity standpoint - I know cryptography originally was really its own world; RSA was a cryptography conference before it morphed into a cyber conference - I think it's really a back-to-basics approach, making sure that you're doing the bread and butter. And in particular, your financial services and your healthcare companies and your retail - a lot of them have more robust protections in place right now. But if you're in a less regulated sector, particularly something like enterprise software, where there's not a regulatory party forcing you to do certain things from a cryptographic standpoint, you should voluntarily up your game, so that you're not too far behind the curve once this quantum Armageddon hits.

Anna Delaney: Michael, you've done a great job tackling a very meaty topic in less than five minutes, so thank you. Finally, and just for fun: what would the soundtrack for a day in the life of a cybersecurity professional sound like? Name three tracks that would feature.

Tom Field: I think it'd be Guardians of the Galaxy. We tap right into the classic rock.

Michael Novinson: Classic rock.

Tom Field: Start the day with the Beatles' Helter Skelter - I think that sets the scene very well for what someone faces at the start of a day. By midday, we're going to call it Billy Joel's Pressure. And by day's end, I'm hoping that we've done just enough that we can all be talking about - wait for it - Good Vibrations.

Anna Delaney: Can you sing it?

Tom Field: I'm picking up good vibrations ...

Anna Delaney: Very good. Yeah, that's really very good - well thought through. Marianne, go for it.

Marianne McGee: Well, believe it or not, I picked Pressure for one. As you're going through an attack, Tie a Yellow Ribbon by Tony Orlando, in the hope that you'll get your data and systems back. And hopefully, by the end of that episode, you can sing I Will Survive by Gloria Gaynor.

Anna Delaney: Michael?

Michael Novinson: This is a fun one. I did have to put on my thinking cap here. Some of mine are a touch more current - no offense to Tom and Marianne here. I was thinking about maybe a bit more of a morose state, so I wanted to start with Billie Eilish's Bad Guy, in terms of fighting the adversaries. Then, assuming an adversary gets in, I was thinking of the old rock band Silversun Pickups and their song Panic Switch - got to flip that panic switch. And then, if the cyber adversaries are aware that you're going after them, they may just slash and burn everything and shut all your systems down, which made me think of the Beastie Boys and Sabotage.
So my day does not have a happy ending, but there's lots of good music along the way.

Anna Delaney: Michael, I'm going to have to pick up the mood here, but I love this - you know, pressure and panic and sabotage and then survival mode. I haven't chosen any backgrounds, but I've chosen The Matrix main theme for the morning, just to build up some suspense for the day. The midday crisis will be The Chain by Fleetwood Mac, so we'll be there in the race against time to mitigate the potential threats. But also, I'm going to say we avert disaster, because I want to end on a positive note, and we're ending on Always Look on the Bright Side of Life, just because we've got to keep that positive outlook.

Tom Field: There's an RSA connection, because Eric Idle performed that at RSA 2023.

Anna Delaney: There you go! Well, thank you so much for your insight. It's been informative and entertaining as always - really loved it.

Tom Field: Always look on the bright side of life ...

Anna Delaney: Absolutely. And thanks so much for watching. Until next time.