March 4, 2024
When we think of the ‘Roaring 20s’, we think of lavish Gatsby-style parties, speakeasies and dancing the Charleston. Of course, we cannot know what people will think of in 100 years’ time, but in the 2020s we’ve certainly been celebrating the biggest steps in AI innovation.
A critical milestone: OpenAI brought us ChatGPT and DALL-E. Now Microsoft has released Copilot, a plug-in that lets users bring OpenAI’s capabilities into their Microsoft Office experience.
Wait … Who's flying the plane?
At Law 365, we’ve always embraced innovation, especially when it improves productivity and efficiency or challenges the way we work. Our clients share that attitude: they have sprung into action and are increasingly incorporating Copilot into the offerings they provide to their customers. So we have matched the pace of tech development and considered how best to advise our clients so they can continue to grow with less risk.
What are the risks for the end users?
We already know that ChatGPT hallucinates, that DALL-E creates the most bizarre hands, and that AI cannot (yet?) be fully relied upon. So what might you need to consider when adding Copilot to your offering? The team at Law 365 have identified three initial key areas:
- Confidentiality
Microsoft assures users that anything put into (or generated by) Copilot is kept confidential and will not be used for training purposes or shared with other customers.
But Copilot works across a user’s workspace (emails, documents, internet), so it’s crucial to review and understand the terms of service regarding data privacy and confidentiality, and to adjust user settings in Security Copilot as necessary to protect sensitive information.
For example, you wouldn’t want Copilot pulling one client’s data only for it to be used by, or inadvertently disclosed to, another client or to Microsoft!
- IP Indemnity
There is a risk that an AI tool creates an image that looks suspiciously like the logo of another company (IP infringement alert!).
Fortunately, Microsoft has expanded its intellectual property (IP) indemnity to cover Copilot, providing a degree of assurance against potential legal risks related to third-party IP infringement.
However, this indemnity may have limitations and requirements that users must adhere to, including the use of content filters and safety systems built into Copilot.
- Accuracy and Accountability
As mentioned before, "hallucinations" or inaccuracies may still occur, underscoring the importance of verifying the accuracy of outputs generated by Copilot.
Users should exercise caution and diligence in reviewing and validating any content produced by the AI.
What should IT cloud solution providers do?
To mitigate risk and protect your business’ interests, Law 365 recommends taking proactive measures, including:
- Consider liability for third-party services
It’s vital to exclude your liability for your customers’ and the end users’ use of Copilot. Existing contracts and templates should be reviewed carefully to make sure that you are not on the hook for the use of Copilot.
You should also check that you haven’t given any warranties for third-party services.
- Integrate relevant third-party documents
Ensure that all applicable engagements explicitly reference and incorporate the relevant product terms and end user licence agreements.
This helps establish clear contractual obligations and limitations of liability for both your company as supplier and your customer as end user.
- Clarify the "as is" basis for services
It's important to communicate clearly that Copilot services are provided on an "as is" basis, meaning that the supplier (your company) is not responsible for the specific outcomes or results obtained by the client from using the services.
This disclaimer helps manage expectations and reduces the risk of potential disputes if the end user is looking for someone to blame!
- Create and implement a Copilot (or more general generative AI) usage policy
If you intend to allow your workforce to use Copilot, it is good practice to put in place a workplace policy to set out the guidelines and limitations around its use.
It may also cover your right as the employer to monitor usage, along with issues relating to technical support and training.
Need help? Get in touch with Law 365
At Law 365, we specialise in providing comprehensive legal support tailored to the unique needs and challenges of tech companies.
Our experienced team can assist you in reviewing contracts, including product terms, EULAs and statements of work, minimising risk and maximising benefits.