TechXY Turbo - Episode 2
AI, Cybersecurity, and the Future of Digital Privacy with Jessica Copeland
Jessica Copeland is chair of the cybersecurity and data privacy law practice as well as the AI law practice at the law firm Bond, Schoeneck & King, where she focuses on all aspects of technology law, including data privacy compliance, AI governance, and intellectual property protection and litigation. She’s on the front line for clients who have experienced data breaches and other incidents, often defending them in class action litigations or helping them navigate ransomware attacks, business email compromises, and inadvertent disclosures.
Jessica also protects patents and trademarks in federal and appellate courts and before the International Trade Commission for clients in industries such as medical and mechanical devices, computer software and hardware, pharmaceuticals, telecommunications and e-commerce technologies. Jessica earned her B.A. in Mathematics from New York University and her J.D. from St. John’s University School of Law. Jessica is a Certified Information Privacy Professional/United States (CIPP/US) through the ANSI-accredited International Association of Privacy Professionals, has been ranked for IP litigation by New York Super Lawyers® for the past 10 years and was named part of the “Legal Elite of Western New York” by Buffalo Business First.
Jessica joins TechXY Turbo to dive deep into the intersections of artificial intelligence, cybersecurity, and the legal industry. We explore Jessica's insights on how AI is reshaping data protection and compliance, what legal professionals should know about the future of tech, and how businesses can prepare for what's ahead. Listeners will gain valuable perspectives on tech innovation, privacy, and legal adaptation in the AI age.
Please enjoy and listen to the episode on Apple Podcasts, Spotify, YouTube, Amazon, Pandora, Podcast Addict, Deezer, JioSaavn, or below.
Transcript
Frank Gullo: Welcome to another edition of TechXY Turbo. My name is Frank Gullo, and I'm your host. Today we're joined by Jessica Copeland. Jessica is a renowned expert in data privacy, cybersecurity, and artificial intelligence governance. As the chair of her firm's cybersecurity and data privacy practice, Jessica provides strategic counsel on data protection, compliance, and emerging technologies with deep expertise in navigating the legal implications of AI.
She helps businesses and legal professionals understand and adapt to evolving tech landscapes. In this episode, she shares her views on AI's role in security, legal transformation, and the digital future.
Jessica, it's really great to have you here on TechXY Turbo. To start off, can you tell our listeners a little about yourself and the work you're doing today?
Jessica Copeland: Sure. Thank you for having me. I'm excited to have a good conversation with you about AI. I'm Jessica Copeland, a partner at the law firm Bond, Schoeneck & King in Buffalo, New York. Born and raised, actually, on Long Island. I don't have the Long Island accent, but I did once. And it wasn't AI that got rid of it.
Jessica Copeland: Surprisingly, in my area of law, I like to say that I'm a technology lawyer. I actually started practicing law in the patent space, handling patents in the IT and communications sector. And that was a natural segue into what I do today, which is data privacy and cybersecurity compliance work, incident response work, and now artificial intelligence governance. And we'll talk a lot about what that means as the conversation goes forward.
Frank Gullo: Right. So your work spans tech, legal compliance, and a number of other things. How would you describe your own relationship to tech?
Jessica Copeland: Well, personally, I've always been a proponent of staying up with advanced technologies. That's why I enjoyed going into patent law, to be honest, because I was always working in cutting-edge technology. When a company invests money in a new innovation, they file for patent protection. And so it kind of put me right at the beginning phase of many launched technological solutions.
Frank Gullo: So part of the purpose of this podcast is to inform listeners about tools we use. To start, what are some of your go-to tools that you're using today? Particularly anything new that you've been trying over the last few years.
Jessica Copeland: In terms of hardware technology or software technology?
Frank Gullo: Tools, AI tools, applications, apps.
Jessica Copeland: Well, since we're talking about AI, I will say that as much as there are concerns about using freely available AI solutions such as OpenAI's ChatGPT and Google Gemini, I have been indulging in those applications for personal use in a way that does not convey confidential personal information. And the reason that's important, obviously, is because anything that you enter into those little chatbots can be used to train their model and will also not be protected from a security standpoint.
Jessica Copeland: But I have been using them, you know, to develop sort of a calendar of events or organize certain items in my personal life or professional life, though not containing confidential or private information.
Frank Gullo: With your frontline experience, what are the most significant cybersecurity threats that businesses should be most concerned about right now?
Jessica Copeland: Well, it seems that the uptick in ransomware attacks is actually tied mostly to exploited phishing campaigns. And so what that means is, you receive a mock or fake email purporting to be from somebody asking for an urgent transfer of funds, or asking you to check out a document. It brings you to a Box account, and you enter your credentials, unknowingly giving your credentials to a threat actor.
Jessica Copeland: That is considered an exploited phishing attack. I think there has been a study recently indicating that phishing attacks and business email compromises were the top source of cyber attacks in the last year. And I believe what we're starting to see is how artificial intelligence adds just another layer of advanced technology that can facilitate cyber criminal activity.
Jessica Copeland: And what does that mean? I'm sure you've heard of deepfakes, and I'm going to segue a bit from phishing emails to deepfakes because they are somewhat related. What we're now seeing is, you receive a phishing email that says your computer needs to be updated: can you click on this link? And then you'll get a phone call purportedly from your IT department.
Jessica Copeland: And they've been able to mask their phone number to make it sound like it's from your IT department. And they ask you to enter your credentials or actually accept a remote desktop application right onto your computer. And now they're on your computer, navigating and dropping malware. So we could talk for hours about different attack vectors, but I do think that's the primary risk that companies are facing at this point in time.
Frank Gullo: Great information. And for those who are unfamiliar, deepfakes are fake videos. Just this week, Jessica, there was an interesting report from the Ponemon Institute, a privacy and security research organization, about how deepfakes are increasingly targeting executives. So a great point. And, you know, that kind of segues into a broader question about breaches, and beyond reacting to breaches. When you help clients prepare their incident response and governance plans, what would you say are two to three really crucial proactive measures, both technical and policy, that they can take to help protect themselves from all these threats that you mentioned?
Jessica Copeland: In terms of the incident response plan, that plan should certainly have your IR team, the incident response team internally that you're going to put together. And it should include your in-house counsel, if you have one, or outside cyber counsel, and that is intended to protect any investigation as attorney-client communication. And then also, if you have cyber insurance, which I highly recommend, include the information to contact the hotline that's available.
Jessica Copeland: I even include the hotline number in the incident response plan, because you want to make sure, especially if you're faced with a ransomware attack and your systems are encrypted, that you get forensics that are approved under your cyber insurance coverage on the front line, because they will be able to get into your system sooner than trying to find a vendor when you're in the midst of an attack.
Jessica Copeland: Certainly, if you have an external MSP, make sure their contact information is available so that they know what's going on with your systems in case they haven't picked up on it. Now, in terms of protecting against these attacks: one, I always say multi-factor authentication is a baseline. If you don't have it, it's irresponsible at this point.
Jessica Copeland: But really more important is having a robust XDR or EDR solution and a SIEM solution, so that it can, through various channels, look at your network and environment to understand anomalies. Frankly, I'm seeing the use of artificial intelligence in those cybersecurity measures really advancing. And so, you know, it used to be that you would just get some antivirus, but antivirus is like 30 years ago at this point.
Jessica Copeland: And right now this is what we're seeing: because artificial intelligence learns so quickly, it can identify anomalies faster than any other solution. It's embedded in most of the XDR solutions that you can engage a vendor with.
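[Editor's note: as a rough illustration of the anomaly detection Jessica describes, here is a minimal z-score sketch over login-event counts. The data and threshold are hypothetical; real XDR/SIEM products use far more sophisticated models across many signals.]

```python
from statistics import mean, stdev

def is_anomalous(history, latest, threshold=3.0):
    """Flag `latest` if it sits more than `threshold` standard
    deviations above the mean of the historical samples."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return latest != mu
    return (latest - mu) / sigma > threshold

# Hourly failed-login counts for one account (hypothetical data).
baseline = [3, 5, 4, 6, 2, 5, 4, 3, 5, 4]
print(is_anomalous(baseline, 4))    # normal activity
print(is_anomalous(baseline, 250))  # a spike worth investigating
```

The value of an AI-driven tool over a fixed rule like this is that the baseline is learned continuously, per user and per system, rather than hand-tuned.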
Frank Gullo: Great stuff. This will all be in the show notes, thank you. And speaking of AI, that was my next topic. As we consider AI tools to help with both mitigation and support, what would you say are the potential data privacy pitfalls and other IP concerns organizations should consider when adopting AI, whether in security tools or in general?
Jessica Copeland: In terms of data privacy concerns with bringing AI tools into an organization, it's critical for your IT department or your CSO to investigate the security stack of what that solution offers: where the data being consumed by the AI would be stored, how it's being protected, and, very significantly, making sure that there is clear language in the contract with these vendors that their model is not going to be trained off of your data.
Jessica Copeland: Those are the top two things to look at in terms of protecting data privacy. And I'm sorry, Frank, there is a second part to your question. What is it?
Frank Gullo: Intellectual property.
Jessica Copeland: Certainly if you are using the freely available AI tools, then do not submit any R&D, confidential or proprietary information about your company. This is something that engineers within tech companies and manufacturing companies need to be trained on because, you know, they might be looking for a solution and they put too much of that confidential, potentially trade secret information into a chatbot.
Jessica Copeland: And then the information is no longer protectable. It is public domain. The other IP angle with AI is who owns what copyright, and where. And that is honestly an issue that is not going to be resolved anytime soon, because there are so many pending litigations across the country about whether it's fair use of copyrighted material to train AI models, or whether that is not a defense.
Jessica Copeland: And by uploading copyrighted material to an AI tool, you have an AI that is going to use that information and generate an output that integrates that copyrighted material, which is an infringement. So those are sort of two very different tracks of IP that are related to the AI tools that we should all be mindful of.
Frank Gullo: Good stuff, thank you. So you gave us some helpful information about incident response. In the first 24 to 48 hours, is there anything you would highlight, particularly common mistakes that firms may make under pressure when they think there's an event that may or may not be an incident? Those first 24 to 48 hours?
Jessica Copeland: Well, first of all, if you're the recipient of a ransom note, don't communicate with the threat actor. Leave that to the experts you'll hire, who do it on a daily basis and probably even might know the crime family, if you will, where the threat actor is coming from. And within the 24-hour period, it is critical to make sure you have the most up-to-date backups available to you, and hopefully they are segregated from your network.
Jessica Copeland: This way there wouldn't be a risk of lateral movement by the threat actor. Identifying where the intrusion occurred is not going to happen within 24 to 48 hours, so honestly, just get that out of your minds, because I find that a lot of clients expect to know right away who let this person in and how they got there, and it can be a distraction from what the investigation should be. You should definitely take your systems offline and change passwords right away. If you are identifying a potential business email compromise, meaning someone is within your email environment sending mass phishing emails under a legitimate user to try to get someone's attention, the best way to try to fix that is to change everyone's password in the environment.
Jessica Copeland: And make sure in particular that anyone with administrative access to Active Directory changes their password as well, because that's where we're seeing these intrusions navigate: to get credentials from Active Directory and then to manipulate recipients of emails from legitimate users.
Frank Gullo: Do we have a sense, based on recent statistics, how many of these attacks are from insiders versus crimes of opportunity or kind of mass campaigns from outside of the U.S.?
Jessica Copeland: I don't know what the study breakdown is, but I can speak anecdotally: I've rarely, rarely been engaged on a cyber attack that has been an inside job. It's typically outside threat actors. I mean, there is the occasional rogue employee who sent an email with tons of HR data, but that's honestly rare in what I've been seeing in the last couple of years.
Frank Gullo: Jessica, we're in New York state, though regulations apply throughout the country. But in terms of New York, with regulations like GDPR, CCPA, the New York SHIELD Act, and others like HIPAA and the New York DFS regulations, how do you advise clients to create a comprehensive compliance strategy that addresses these multiple and sometimes overlapping requirements?
Jessica Copeland: The best way is to understand what the organization's main source of regulation is. Certainly in the financial industry and the health care space, it's pretty clearly delineated. And so if you are adhering to the more enhanced compliance obligations in the heavily regulated industries, you're likely complying with the Massachusetts obligation to have a written information security plan, which is called a WISP, or with the New York SHIELD Act's cybersecurity mandate.
Jessica Copeland: And in fact, you're probably not going to have to separately certify that you're compliant with the New York SHIELD Act if you are a New York state-chartered bank, because you're compliant with DFS. So that helps a bit. And I always say you work from the most rigid compliance model that you need to work within, and obviously then you would be meeting the requirements that filter lower.
Frank Gullo: Let's pivot a bit to technology agreements and contracts. I know from your work that you review tech software contracts. What would you say are some red flags, or conversely, essential clauses that you look for when you're reviewing data privacy, compliance, and protections within agreements?
Jessica Copeland: First of all, if there's no specification that the vendor needs cyber insurance, that's a red flag. Another is a complete disclaimer of any type of warranty against malware or other types of worms. There's this whole phrase that I always make sure is dropped into our agreements, because if you're going to be providing client software, then you should be white-box checking it before you deploy it into systems. Being able to provide a warranty that it's not going to deploy any malware should be a position that a reputable software company would take.
Frank Gullo: You mentioned cyber liability. Moving to insurance, particularly for small and medium-sized businesses, what should businesses understand about the current cyber insurance market and how volatile it's been year over year? What due diligence would you say businesses should have in place before they engage for cyber liability coverage?
Jessica Copeland: A couple of things. The cyber insurance industry has been navigating, in my view... I don't write insurance, I don't sell insurance, but it would seem like it's a bit of a topsy-turvy environment, if you will, because it is becoming more upside down in that there are more claims than the industry may have anticipated five to seven years ago.
Jessica Copeland: And so they're trying to correct that by adding more exclusions into policies, some exclusions that you might not necessarily even understand would allow them to avoid paying you if there were a cyber attack. So I would recommend making sure that you work with your IT department and potentially legal to understand the language of certain exclusions, as well as at the point of pricing it out.
Jessica Copeland: There will typically be a questionnaire in which you have to declare that certain security measures are in place within your environment. Make sure the person filling that out understands what your technology resources and your network solutions actually are, because there could be an incorrect statement. For example, a really easy one is: do you have MFA on all endpoint users?
Jessica Copeland: And if you say yes, but it turns out in the throes of an investigation of a data breach that MFA wasn't applied, that's a basis to deny coverage. So it's critical that you have someone with knowledge filling that form out. If you have an external IT resource or an MSP, you should have them working through that with you so that you're accurately depicting what your security posture is. Then the insurance that would work for your company would be issued, and there would be little risk of a denial in the event of a breach.
Frank Gullo: Thank you. Relatedly, as a professional who defends clients in data breach class actions, what are you seeing now in litigation? Are the courts focusing on specific types of failures? Are there lessons that we can learn from what you're seeing?
Jessica Copeland: Probably less so through litigation and more so through enforcement actions, I would say, because I'd like to pivot to what, in particular in New York, the attorney general has been doing to enforce the SHIELD Act. What we're seeing is that not only are there financial penalties, but there are pretty prescriptive recommendations and requirements in terms of updating your cybersecurity posture: do you rotate passwords? Do you have a SIEM? Do you have XDR? I mean, these are all requirements that are not required in a granular way by the SHIELD Act's cybersecurity mandate, but by interpretation are required based on what we're seeing from the New York state attorney general.
Frank Gullo: AI is in the news, but so are robots. And I think there's a certain, you know, fascination with the future of robots. And robots are in place in many businesses, like many manufacturing sites. Any caveats or special considerations around cyber for robots that may have software programming and IoT, whether they're connected to the Internet or not?
Jessica Copeland: In terms of?
Frank Gullo: Things to keep in mind when you are maintaining robots as part...
Jessica Copeland: Well, I mean, I think you have to think about it the same way that you think about just having your iPhone available to you: you know, there are certain applications that are listening to our conversation right now. And so if your company is purchasing robots to work in a manufacturing site, is there anything in that contractual relationship that protects the information learned by the robot?
Jessica Copeland: I think it's a valuable clause to put into an agreement: that no information acquired through the work of the robot can be used by the robotics company, and that it is solely owned by the XYZ Corp that has engaged the robotics company.
Frank Gullo: Looking a little ahead, say two years: what emerging tech or evolving cyber threats, such as quantum computing, deepfakes (which we mentioned), or IoT vulnerabilities, do you think may pose the next major challenge for data privacy and cybersecurity professionals, and also for businesses and the public at large?
Jessica Copeland: Man, that's a big question, Frank. You know, I think we are moving to a place where AI and robotics will be leveraged in a way that poses threats not necessarily to cybersecurity, but to the workforce. I think there are threats to the labor industry entirely.
Jessica Copeland: I mean, if you can replace manufacturing sites with robots, if you can replace fast food cashiers with robots, where are certain jobs going to be? And we can get into socioeconomic questions and concerns, I think, more easily than into what the cyber threat would be, because the cyber threats, although evolving, are still the same.
Jessica Copeland: I mean, you have cyber criminals who are always going to be ahead of the curve in terms of manipulating AI models or using them to generate malware, and guarding against that is a similar solution. But what do we do at the point in time when we have a workforce that can't get jobs because it's cheaper and more efficient to have AI tools integrated within your company?
Frank Gullo: Right. We're recording, and just today I was reading statistics from the FBI Internet Crime Complaint Center, which said it tallied over 850,000 reports last year, for a combined $16.6 billion in financial losses, up 33% from the previous year. So before we think of the future: have we peaked? Have we hit rock bottom? Will the problem get worse before it gets better?
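[Editor's note: as a quick sanity check on those numbers, and assuming the 33% increase applies to the dollar losses, the prior-year figure works out to roughly $12.5 billion:]

```python
losses_latest = 16.6e9  # reported losses, in dollars
increase = 0.33         # 33% year-over-year increase

# Back out the prior year's losses from the growth rate.
losses_prior = losses_latest / (1 + increase)
print(round(losses_prior / 1e9, 1))  # prints 12.5 (billions of dollars)
```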
Jessica Copeland: Oh, it will get worse before it gets better. I don't think that we've had enough time to get our arms around how to protect against the advanced intrusion vectors. To be quite honest, I think we're just starting to see that uptick, and it'll probably be maybe another two to three years before we start to see some leveling off.
Frank Gullo: Jessica, I've really enjoyed this discussion. We could talk about all these topics more, but any final thoughts for our listeners? Any advice you want to leave them with?
Jessica Copeland: The best advice I have is to remain vigilant about your own personal banking matters and your company’s confidential information, and to protect them in every way that you can. Meaning, you know, don't use the freely available AI tools that seem really interesting and fun with any confidential or private information. It seems pretty rudimentary, Frank, but you'd be surprised.
Jessica Copeland: And the other piece is: don't fear the technology. Embrace it, because it's not going anywhere. My take on technology has always been that I want to learn it because I want to use it to make myself better, and to make my counsel to my clients better, more effective, and more efficient.
Frank Gullo: Where can people find you online if they want to reach out? If they want to work with you and your firm, what's the best way? Is it LinkedIn, or anywhere else?
Jessica Copeland: My work email is the best way to get in touch with me, or LinkedIn. My work email is jcopeland@bsk.com, and my LinkedIn page is under Jessica Copeland. You can just find me on LinkedIn and message me there.
Frank Gullo: Thank you, Jessica. This was a really great conversation, and I'm looking forward to more.
Jessica Copeland: Thanks, Frank.