From the course: Introduction to Prompt Engineering for Generative AI

Solution: Zero-shot, few-shot learning

(upbeat music) - [Instructor] So the goal of this challenge was to take a paragraph and turn it into a Q&A engine that rhymes. So basically it answers questions in rhymes for educational purposes, and the input was this paragraph about Jupiter. Hopefully you had fun with this one, and I'll show you how I went about doing it.

So the first example I have here uses the zero-shot approach. Instead of the instruction we had on top telling it to summarize this for a second grader, I added this instruction right here: I told it to answer in rhymes, as Jupiter, in the first person, and then I added a question and room for an answer. I tried to avoid trailing white space here, so there's no space after the colon; the model doesn't seem to appreciate that. So I'm going to go ahead and submit. The question was, what's the biggest planet in the solar system? And it's answering in rhyme.

Now, to turn this into a few-shot example, I basically took a few answers and made sure they are high quality and rhyme well. I can delete parts, make them shorter, whatever makes each one a high-quality answer. Let me show you what I did. So here I am in the few-shot example, and to tell the model it's few-shot, you're going to want to add a stop sequence. That is as simple as typing the pound sign twice (##) and hitting Enter. Now you'll see a few examples separated by these stop sequences. The first one is an answer I got from the model that I improved upon, and I always made sure the facts are tied to the paragraph. Then another one, and another one. For the third one, I did something funny: I asked it whether a tomato is a fruit or a veggie, purposely picking something that is not in this paragraph. Then I made up this little rhyme saying, sorry, could not help you out, Jupiter's the only topic I answer about. And I did that again. I said, what are computers made of?
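The prompts described above can be sketched as plain strings. This is a minimal sketch in Python; the exact wording of the example rhymes (other than the fallback rhyme quoted in the walkthrough) is illustrative, not taken from the course:

```python
# Zero-shot: a single instruction, then a question and room for an answer.
# Note there is no trailing space after "A:" -- the model handles that better.
zero_shot = (
    "Answer in rhymes, as Jupiter, in the first person.\n"
    "Q: What's the biggest planet in the solar system?\n"
    "A:"
)

# Few-shot: curated Q&A examples separated by a "##" stop sequence.
# Two examples deliberately ask off-topic questions so the model
# learns to refuse anything outside the Jupiter paragraph.
FALLBACK = ("Sorry, could not help you out, "
            "Jupiter's the only topic I answer about.")

examples = [
    ("What's the biggest planet in the solar system?",
     "I'm Jupiter, big and bright, no planet matches my size or might."),
    ("Is a tomato a fruit or a veggie?", FALLBACK),
    ("What are computers made of?", FALLBACK),
]

# Join the examples with the stop sequence, then append the new question.
few_shot = "\n##\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
few_shot += "\n##\nQ: What's the tallest tower in the world?\nA:"
```

Passing `##` as the stop sequence tells the model where each example ends, so its own completion stops at the same boundary instead of inventing further Q&A pairs.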
Once again: sorry, could not help you out, Jupiter is the only topic I answer about. From these examples, the model can infer that when you ask about something outside this scope, it should come up with this rhyme right here.

So let's try this out. I'm going to ask it a question relating to Jupiter: is Jupiter larger than the sun? And I'll submit. It tells me that Jupiter's mass is way smaller than the sun's, which is a fact it pulled from the paragraph, which is great. I'm going to delete that and ask a completely unrelated question: what's the tallest tower in the world? Let's see what answer it comes up with. And, just as I wanted, it doesn't answer questions about things that are not in the paragraph: sorry, could not help you out, Jupiter is the only topic I answer about.

This is how powerful few-shot learning is. It gives you the ability to refine what you want the model to do, and it lets you take these large language models and make them better at this one thing with relatively little work. So I hope this gave you a sense of how you can leverage zero-shot and few-shot learning to get the most out of these large language models.
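Under the hood, the stop sequence simply tells the API to cut generation off when "##" appears; the same trimming can be done client-side on a raw completion. A hypothetical helper (the function name and behavior are illustrative, not part of any playground API):

```python
def trim_at_stop(completion: str, stop: str = "##") -> str:
    """Return the model's answer up to the first stop sequence.

    Completion APIs typically accept a `stop` parameter and do this
    cutoff server-side; this is a client-side equivalent for raw text
    that may run past the boundary.
    """
    # Split once on the stop sequence and keep only the leading answer,
    # stripping the surrounding newlines.
    return completion.split(stop, 1)[0].strip()


# Example: a raw completion that continued past the stop sequence.
raw = "Sorry, could not help you out,\nJupiter's the only topic I answer about.\n##\nQ: next question"
answer = trim_at_stop(raw)
```

Here `answer` keeps just the refusal rhyme and discards everything after the first `##`.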
