AI Risk Mitigation and Legal Strategies Series No. 3: Practical Guide to AI Audits

If your company is not on the target list of the Biden administration's upcoming AI regulations, can you sit back and relax? If your company is on the target list, will you wait until the other shoe drops? The answer to both questions is a resounding "NO."

With the widespread adoption of AI technology, your company is very likely an AI developer, an AI user, or both, and is thus inevitably exposed to a wide range of legal risks under existing laws. It is therefore critical that a company's legal counsel audit every aspect of its business, detect issues, and address them proactively. In this article, I will explore audits of AI's impact on IP and data privacy based on my experience and real-life lawsuits. Additionally, I will share a list of audit questions and strategic action steps derived from audit results. Remember that each company is unique in its situation and subject to different regulations, so this list is not exhaustive. Please contact me for customized solutions that address your company's needs.

a. IP Infringement and Loss of Proprietary IP

1. Hypothetical Scenario: Your IT department or vendor utilized AI tools and models to develop and launch AI-powered products and services.

2. Legal and Business Risks: AI tools and models are trained on large sets of data, some of which consist of publicly available materials. Some of those public materials are subject to third parties' intellectual property rights or open-source license requirements. As a result, by utilizing AI models or tools, a company may unintentionally infringe third parties' IP rights and even risk losing its own proprietary IP rights.

3. Real-life Lawsuit: In January 2023, stock photo provider Getty Images sued AI company Stability AI, accusing it of misusing more than 12 million Getty photos to train its Stable Diffusion AI image-generation system and demanding damages of up to $150,000 for each infringed work, which could add up to billions of dollars in damages. The case is currently pending.

4. To effectively protect a company's intellectual property rights and minimize the risk of infringement, here is a list of questions to consider when conducting an AI audit on IP issues:

  • Does the company have an IP policy in place? Has it been updated to address the AI issue?

  • Do the AI tools and models incorporate any components subject to open-source licenses?

    • If so, does the company have an open-source policy in place?

    • Has it been updated to address the open-source AI issue?

  • What data has been used to "train" the AI applications?

    • Even if your business team characterizes it as the company's data, it may be derived from clients' data; you will need to conduct a detailed legal analysis to determine who owns the data and whether you have a license to use it for AI training purposes.

    • Will using such data affect your company's ownership of the AI products?

  • Are there any third-party materials involved?

    • If so, has your company obtained an IP or data license or written consent from third parties?

    • Does the third-party license cover such use?

    • Will using such data affect your company's ownership of the AI products?

  • Will your company be notified if your vendor incorporates its newly developed AI technology into the products and services provided to you?

    • Does your company have a vendor policy and RFT process in place? Have you updated them to address the AI issues?

    • Have you revised the vendor due diligence questionnaire to address the AI issues?

    • Have you revised the vendor contract template to address the AI liability issue?

Depending on your answers, you may need to prepare or revise policies and procedures, negotiate a license with or obtain written consent from clients or third parties, renegotiate vendor contracts, or stop developers from using those materials to train AI models. Other strategies may apply, depending on your situation. Training employees on this topic will also help prevent these issues from arising.

b. Violation of Data Privacy Laws

1. Hypothetical Scenario: Your company or vendor scrapes publicly available personal data and uses such data in AI technology products or services.

2. Legal and Business Risks: AI systems rely on a vast amount of data, including sensitive personal data, to learn and make decisions. If personal data used by AI systems is not properly collected and processed in compliance with applicable data privacy laws, companies may face consumer lawsuits and government penalties. More importantly, companies that violate data privacy laws subject themselves to negative publicity and the immeasurable damage of lost consumer trust and confidence.

3. Real-life Lawsuit: In early 2020, consumers, the American Civil Liberties Union, other nonprofits, and the Vermont Attorney General sued facial recognition company Clearview AI. Clearview reportedly scraped billions of photos from Twitter, Facebook, and other social media platforms and created and sold a faceprint database to police departments and private companies like Macy's. Clearview agreed to settle the class-action privacy lawsuit in September 2023.

4. To ensure compliance with data privacy laws, here is a list of questions to consider when conducting an AI audit on data privacy issues:

  • Has the company established a data governance program, including an internal data privacy policy?

    • Have you revised the data governance program to address the use of AI?

    • Has the policy been revised to address the AI issues?

    • Has the policy been updated to comply with the state data privacy laws that took effect this year, including CPRA, CPA, CTDPA, UCPA, and VCDPA?

  • Does the company have an external privacy notice in place?

    • Has the privacy notice been revised to address the AI issues?

    • Has the privacy notice been updated to comply with CPRA, CPA, CTDPA, UCPA, and VCDPA?

  • Do the company's AI products or services incorporate any personal data?

    • Has the company obtained consent from consumers regarding this particular use?

    • Is there any sensitive personal data involved? Has the company performed data privacy impact assessments?

  • Will you be notified if your vendor scrapes publicly available personal data and uses such data in AI technology products or services?

    • Have you asked vendors to complete a data privacy and security questionnaire before selecting them?

    • Do you require vendors to maintain cybersecurity insurance?

    • Have you signed a data processing addendum (“DPA”) with vendors? If so, you may need to consider an amendment to the DPA to address the AI issue.

    • Have you updated your DPA template to address the AI issue?

    • Does your contract have the necessary provisions to protect your company if your vendor's AI products or services violate the data privacy laws?

Depending on your answers, you may need to establish a data governance program, prepare or revise data privacy policies and notices, conduct data privacy impact assessments, amend the DPA, or stop developers from using personal data to train AI models. Depending on the specifics of your situation, other strategies may be applicable. Training employees on this topic is a proactive measure that helps prevent such incidents from occurring.

I will address AI's impacts on other areas of law in the next article.

Please contact me at lkempe@lklawfirm.net if you would like me to set up a data governance program for you, advise on IP or data privacy and security issues arising from AI products and services, prepare policies and contract templates, or provide training to your employees.

Click here for the Author’s Profile.

Click here for other articles in the AI series.
