Infosys uses Meta's Llama to build in-house Conversational AI legal assistant
Here's an interesting case study of how Meta's Llama is being adopted by organisations to help power their internal conversational AI infrastructure.
I'm documenting this one for all of my colleagues (some of whom I know are reading) who are constantly having difficult discussions with their information security, risk and compliance colleagues about the merits of using the likes of Meta's Llama.
The TL;DR is: Infosys is doing it.
When you've got a team of engineers capable of building your applications and services, it can make a lot of sense to standardise on a platform such as Llama and build out from there. You don't necessarily have to boil the ocean and make your own LLMs when you've got one of the best on the planet available to download (free!) to your own infrastructure.
Given the open-source nature of Llama, it's no surprise that companies are embracing it. I remember having all sorts of dialogues about whether MySQL or Red Hat was 'any good' given the fact they were open-source!
That's also been a limiting factor for some, though, who have (for many valid reasons) preferred the idea of a direct service-level agreement from a commercial vendor. It's all about your priorities and preferences.
Here's the first paragraph from the news item from Meta:
For the team at Infosys, a global leader in next-generation digital services and consulting, open source is the future. Using Llama 3.1, they’ve built a variety of agent-based AI models and retrieval augmented generation (RAG) systems for processing documents, videos, audio, and more. The company provides the services as part of Infosys Topaz, an AI-first offering used to accelerate business value for global enterprises.
I have direct knowledge of quite a few financial players who are doing the same.
I thought this paragraph was particularly instructive:
Infosys also leverages Llama 3 as part of its in-house legal assistant that uses RAG. When someone asks a question, they’re provided with information and cited sources, which boosts the trust of the people who use it. Keeping security a priority, the legal assistant is deployed on a GPU cluster within Infosys firewalls and helps provide cutting-edge gen AI capabilities to Infosys users securely and confidentially.
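The pattern described here, retrieval plus an answer that cites its sources, is worth sketching. The following is a minimal illustrative sketch, not Infosys's implementation: the toy document store, the naive keyword-overlap retrieval, and the stubbed generation step are all my own assumptions, and a real deployment would use a vector index and send the assembled prompt to a locally hosted Llama model.

```python
# Minimal sketch of a RAG pipeline that returns cited sources.
# Everything here (documents, scoring, the stubbed answer) is
# illustrative; a production system would use embeddings and an LLM.

from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str

# Toy in-house document store; in practice, a vector index.
DOCS = [
    Document("policy-001", "Employees must not share client data externally."),
    Document("policy-002", "All contracts require legal review before signing."),
    Document("policy-003", "Data retention period for client records is seven years."),
]

def retrieve(query: str, k: int = 2) -> list[Document]:
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        DOCS,
        key=lambda d: len(q_terms & set(d.text.lower().split())),
        reverse=True,
    )
    return scored[:k]

def answer(query: str) -> dict:
    """Return an answer string plus the cited source IDs."""
    hits = retrieve(query)
    context = "\n".join(f"[{d.doc_id}] {d.text}" for d in hits)
    # In a real deployment this context would be sent, with the query,
    # to a Llama model running inside the firewall. Here we simply
    # echo the top passage as the 'answer'.
    return {
        "answer": hits[0].text if hits else "No relevant documents found.",
        "sources": [d.doc_id for d in hits],
        "prompt_context": context,
    }

result = answer("retention period for client records")
print(result["answer"])   # top-ranked passage
print(result["sources"])  # cited document IDs
```

Returning the source IDs alongside the answer is the part that "boosts the trust of the people who use it": a lawyer can click through to the underlying document rather than taking the model's word for it.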
Put your hand up if you can name at least one company developing live production services on top of Meta's Llama. Yup. Me too.
I think we'll see a lot more of these kinds of announcements going forward.
The interesting question for me is at what point your in-house legal assistant built on Llama becomes too annoying and expensive to maintain yourself, when you could buy the same capabilities from a vendor that iterates faster and offers better reliability. Yes, in-house makes everyone feel better.
But is it really in-house? What's your definition? Fair enough if it's your own 'tin' – your own servers, running your own CPUs, sitting in your own data centre.
But when you're sitting on a Microsoft or Amazon cloud – and your 'vital, super confidential' legal documents are already being saved there... you can probably use a vendor operating on your own tenancy.
I do find it somewhat crazy, sometimes, when companies (I have a lot of experience with banks) obsess over the physical locations of these kinds of systems and services. It's laudable. I understand it. But it's next to pointless obsessing over these aspects when the legal team are writing the most sensitive information into Microsoft Word and emailing it via Google Mail to the partner who's opening it on their Amazon WorkSpace virtual desktop, then saving it to their physical laptop to 'work on' at the summer house, because the internet connection there isn't very good.
So I still think there will be a lot of room for vendors offering all sorts of services. Very exciting.
Anyway, it's great to see all these use cases popping up.
Find out more about Llama here: https://ai.meta.com/.