Hi, everyone. James with IntelliChat™. I wanted to discuss something called zero-shot learning. I think there's a big misunderstanding about ChatGPT: it's currently being used as a toy, and the real opportunity is to start using ChatGPT, and really other large language models as well, because there are several out there that are very good, as tools.
One of the biggest things companies are struggling with is that they want to be able to ask a question about their own data and have the large language model answer it in a natural, human way, and in multiple languages as well.
So how can we do that? Well, if you just go and ask ChatGPT a question specific to your company, you're going to get something that may or may not be relevant: something the model may have found on the internet prior to June 2021. Whatever it is, it's probably not going to work for you. And especially if you have any proprietary data, a general-purpose language model simply has no knowledge of it.
What we have to do is something called zero-shot learning. Zero-shot learning means the large language model is able to answer a question about something it wasn't trained on, and the only way to do that is to provide the context with which to answer the question. We use vector databases, which are essentially high-speed databases that let us ask a question and get a few pieces of relevant content back very quickly, in a matter of a second or two. Then we take that context and the question, send them over to a large language model like ChatGPT, and say: hey, answer this question with these pieces of context. The model can do that, and it can do it in multiple languages.
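The flow described above can be sketched in a few lines of Python. This is a minimal, self-contained illustration, not a production setup: the "embedding" here is a toy bag-of-words vector, and the document list stands in for a real vector database (Acme Corp and its documents are made-up examples). A real system would use a proper embedding model and a vector database for the nearest-neighbor search.

```python
import math

def embed(text):
    """Toy embedding: a word-count dictionary (stand-in for a real embedding model)."""
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(count * b.get(word, 0) for word, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def top_k(question, documents, k=2):
    """Retrieve the k documents most similar to the question (the vector-database step)."""
    q = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(question, context_chunks):
    """Assemble the prompt sent to the LLM: retrieved context first, then the question."""
    context = "\n".join(f"- {c}" for c in context_chunks)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}"
    )

# Hypothetical company documents standing in for proprietary data.
docs = [
    "Acme Corp's refund policy allows returns within 30 days.",
    "Acme Corp was founded in 1999 in Denver.",
    "The cafeteria serves lunch from noon to 2pm.",
]
question = "What is the refund policy at Acme Corp?"
chunks = top_k(question, docs)
prompt = build_prompt(question, chunks)
```

The resulting `prompt` string is what you would send to the model; because the answer is grounded in the supplied context rather than the model's training data, the same prompt pattern works for proprietary content and for answers in other languages.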
And that is essentially zero-shot learning.