An Introduction to Prompt Engineering
Hello all! Continuing along with our recent theme of Generative AI, I thought we'd focus this week on prompt engineering. If you aren't familiar with what this emerging practice is, it's essentially a means to squeeze the most value out of these new Generative AI tools. As with any predictive model, it's either garbage in, garbage out or good stuff in, good stuff out! See you this Friday (4/28) at our usual 2:00pm CST time! #promptengineering #generativeai #chatgpt
Introduction
The video starts with music playing in the background.
Prompt Engineering
In this section, the speaker talks about prompt engineering and its importance in generative AI.
What is Prompt Engineering?
- A prompt is the input you provide to a generative AI model.
- Prompt engineering is writing that prompt in a way that gets you the best results for whatever you're hoping to retrieve.
Why is Prompt Engineering Important?
- It helps to get better results from generative AI models.
- Writing prompts effectively can help retrieve more accurate information from generative AI models.
How to Use Prompt Engineering
- Write prompts that are specific and clear.
- Experiment with different prompts to see which ones work best for your needs.
Conclusion
The speaker concludes by summarizing the importance of prompt engineering in generative AI and how it can be used effectively.
Key Takeaways
- Prompt engineering is important for getting better results from generative AI models.
- A prompt is an input into one of these generative AI models.
- Writing prompts effectively can help retrieve more accurate information from generative AI models.
- Experimenting with different prompts can help determine which ones work best for your needs.
Introduction to Transfer Learning and Prompt Engineering
In this section, the speaker introduces the concept of transfer learning and prompt engineering in neural networks. The speaker explains how pre-trained models can be used for image classification and how transfer learning can be applied to retrain these models on new data. The speaker also discusses prompt engineering as an alternative approach to transfer learning.
Transfer Learning
- Transfer learning is the idea of using pre-trained models for image classification.
- Pre-trained models were created by large organizations like Google (e.g., the models shipped with TensorFlow) for general-purpose image classification.
- Transfer learning allows users to leverage the training structure of a neural network without having to train their own model from scratch.
- Users can strip off the last layer or two of a pre-trained model and retrain it on their own data.
Prompt Engineering
- Prompt engineering is an alternative approach to transfer learning that has been found to produce good results.
- Prompt engineering involves creating prompts or input sequences that are tailored to specific tasks rather than retraining entire neural networks.
- Prompt engineering is easier and quicker than transfer learning because it does not require the technical expertise needed to retrain neural networks.
Zero-Shot Prompts and One-Shot Prompts
- Zero-shot prompts give the model no examples; it must complete the task from the instruction alone.
- One-shot prompts include a single example of the desired input and output within the prompt itself.
Prompt Engineering Tips
In this section, the speaker discusses prompt engineering and how it can be used to get more precise results from large language models. The speaker also provides an example of prompt engineering using a Python script.
Setting a Role for System User
- Prompt engineering involves providing contextual information to help get more precise results from large language models.
- One-shot (or few-shot) prompting involves providing one or more pieces of context to help you get a more precise result.
- You can set the role of a system user prior to interacting with the chat flow, either programmatically or directly in the ChatGPT UI.
- The speaker demonstrates this by setting the system user as Jar Jar Binks in a Python script that emulates the ChatGPT interface.
Example of Prompt Engineering Using Python Script
- The speaker shares an example of prompt engineering using a Python script that emulates the ChatGPT interface.
- When initiated through the command line, the script acts as if it is Jar Jar Binks and responds to prompts accordingly.
- This is done programmatically, so there is no indication on the UI that Jar Jar Binks is giving responses.
- This simple prompting technique can be used with any other large language model by setting any role you want to fill in.
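The role-setting technique above can be sketched in a few lines. This is a minimal sketch, not the speaker's actual script: the persona text, model name, and helper function are illustrative, and the commented-out call assumes the pre-1.0 openai Python library with an API key configured.

```python
# Sketch: setting a system role before the chat flow begins.
# The persona text, model name, and build_chat_flow helper are
# illustrative assumptions, not taken from the stream's script.

def build_chat_flow(persona: str) -> list[dict]:
    """Start a chat flow with a system message that sets the role."""
    return [{"role": "system", "content": f"You are {persona}. Stay in character."}]

chat_flow = build_chat_flow("Jar Jar Binks")
chat_flow.append({"role": "user", "content": "How do I get to Naboo?"})

# The actual invocation (requires network access and an API key),
# using the pre-1.0 openai library's interface:
# import openai
# response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=chat_flow)
# print(response.choices[0].message.content)
```

Because the system message is set programmatically, nothing in the UI reveals the persona; the same pattern works with any role and any chat-capable model.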
Non-Technical Approach to Prompt Engineering
- Prompt engineering does not necessarily require technical expertise; it is mostly experimenting with plain language until you have something that works well.
- It is subjective and requires continuous experimentation until you find something that works well for your use case.
Prompt Engineering Tips
In this section, the speaker discusses prompt engineering and how it can be used to automate tasks without needing to know coding.
Acting as a Certain Role
- Acting as a certain role can help prompt engineering.
- You don't need to be a software engineer or machine learning engineer to do these kinds of things.
Exporting Data as a Table
- Data can be exported as a table.
- Use plain English language to describe what you want.
- Results will come back in markdown table string format.
Zero-Shot Prompting
In this section, the speaker explains zero shot prompting and how it works.
Top Five Most Populous Cities in the United States
- Ask for the top five most populous cities in the United States.
- The results will come back in bulleted form.
- Results can also be displayed in table format using markdown.
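A zero-shot request like this is just a plain-English question with a format instruction appended. The wording below is illustrative, not a transcript of the speaker's exact prompt:

```python
# Sketch: a zero-shot prompt that asks for results as a markdown table.
# No examples are included in the prompt; the model works from the
# instruction alone. The phrasing is an illustrative assumption.

def table_prompt(question: str) -> str:
    return f"{question} Return the results as a markdown table."

prompt = table_prompt(
    "What are the top five most populous cities in the United States?"
)
# Without the format instruction, the answer typically comes back as a
# bulleted list; with it, the response is a markdown table string that
# can be rendered or parsed downstream.
```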
Introduction to Prompt Engineering
In this section, the speaker introduces prompt engineering and expresses skepticism about the need for companies to hire prompt engineers.
Prompt Engineering
- The speaker does not consider themselves an expert in prompt engineering.
- Prompt engineering can be a skill set that can be added to any existing role.
- Companies may not need to hire prompt engineers specifically as they may be doing similar activities as machine learning or software engineers.
Chain of Thought Prompting
In this section, the speaker discusses chain of thought prompting and demonstrates how it works using a Monty Python reference.
Zero Shot Prompt
- The speaker asks a question about the air speed velocity of an unladen swallow.
- The response is basic and recognizes the question's context in Monty Python.
Chain of Thought Prompt
- The speaker asks the same question but adds "let's think through this step by step."
- The response is more detailed and includes information on velocity formula.
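The difference between the two prompts is a single appended instruction. A minimal sketch of that transformation (the example question is from the stream; the helper function is an assumption):

```python
# Sketch: turning a zero-shot prompt into a chain-of-thought prompt by
# appending the "step by step" instruction the speaker demonstrates.

def chain_of_thought(prompt: str) -> str:
    return f"{prompt} Let's think through this step by step."

zero_shot = "What is the airspeed velocity of an unladen swallow?"
cot = chain_of_thought(zero_shot)
# The zero-shot version tends to get a short reply that just notes the
# Monty Python reference; the chain-of-thought version elicits a longer,
# reasoned answer that works through the relevant quantities.
```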
Using GPT-3 to Solve Math Problems
In this section, the speaker demonstrates how to use GPT-3 to solve math problems.
Basic Math Problem
- The speaker provides a simple math problem: "What is 3^4 / 3^2?"
- He explains that the answer is 9 (3^4 / 3^2 = 3^(4-2) = 3^2) and demonstrates how GPT-3 can provide a basic output for such problems.
Complex Math Problem
- The speaker presents a more complex problem: "What does x equal in the equation 8x + 2 = 110?"
- He shows that GPT-3 can provide the straight answer of x = 13.5 for this problem.
- However, he notes that adding the prompt "Walk me through this step by step" can help build intuition around why a problem is solved in a certain way.
- He walks through the steps of balancing both sides of the equation and isolating x, demonstrating how GPT-3 can be used as a tool for learning concepts in algebra.
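The step-by-step solution the speaker elicits can be checked directly. A short sketch mirroring that walkthrough:

```python
# Sketch: the same step-by-step solution to 8x + 2 = 110 that the
# "walk me through this step by step" prompt produces.

# Start: 8x + 2 = 110
# Step 1: subtract 2 from both sides -> 8x = 108
rhs = 110 - 2
# Step 2: divide both sides by 8 -> x = 13.5
x = rhs / 8
print(x)  # 13.5

# Sanity check: substitute back into the original equation.
assert 8 * x + 2 == 110
```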
Usefulness of GPT-3 for Learning Concepts
- The speaker emphasizes that using GPT-3 to walk through complex problems step by step can be especially useful for learning concepts in subjects like calculus or trigonometry.
- He acknowledges that his examples were not ideal due to lack of preparation time but hopes they illustrate the potential usefulness of GPT-3 as an educational tool.
Understanding Grounding and Priming
In this section, the speaker talks about grounding and priming in relation to zero-shot prompting and few-shot prompting. The speaker explains how providing the right context for an LLM can produce more precise results.
Grounding and Priming
- Grounding or priming is the idea of providing contextual information to an LLM to produce more precise results.
- Providing contextual information involves taking several concepts such as zero-shot prompting, few-shot prompting, and putting them together.
- The speaker provides an example of sending an email to a customer who has purchased a smartphone with a free six-month trial of a streaming service. The email informs the customer that if they do not cancel their subscription at the end of the trial period, they will be charged $9.99 per month.
- Salesforce recently announced Einstein GPT, which uses OpenAI APIs behind the scenes to automate tasks such as sending automated emails to customers.
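Grounding in practice means interpolating the customer-specific facts into the prompt before asking for the email. A minimal sketch using the details from the stream's example (the variable names and prompt wording are assumptions):

```python
# Sketch: grounding/priming a prompt with customer-specific context
# before asking for an email. The customer details mirror the example
# above; the variable names and wording are illustrative.

customer_name = "Alex"           # assumed placeholder name
product = "smartphone"
trial = "free six-month trial of a streaming service"
monthly_price = "$9.99"

grounded_prompt = (
    f"Write an email to {customer_name}, who just purchased a {product} "
    f"that includes a {trial}. Remind them that if they do not cancel "
    f"before the trial ends, they will be charged {monthly_price} per month."
)
# Sent through the chat API, this yields an email that already contains
# the right product, trial terms, and price -- the same idea behind
# tools like Salesforce's Einstein GPT.
```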
Conclusion
Grounding or priming is essential in producing precise results from an LLM. By providing contextual information, users can get more accurate responses from their models.
Writing Professional Emails
In this section, the speaker discusses how to write professional emails and set the tone of the email.
Setting the Tone of an Email
- Adding qualifiers such as "very cheerful" can change the tone of an email.
- Using more exclamation points and qualifying adjectives like "amazing" or "incredible" can make an email more positive and upbeat.
- Changing the closing from "best regards" to "warm regards" can also make an email feel more welcoming.
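Tone steering is just another layer of instruction on the same request. A sketch contrasting a neutral prompt with a cheerful one (the exact qualifiers are examples from the discussion above; the base request is an assumption):

```python
# Sketch: steering the tone of a generated email with qualifiers.
# The base request is illustrative; the qualifiers and closing
# come from the tips above.

base_request = "Write an email thanking a customer for their purchase."

neutral_prompt = base_request
cheerful_prompt = (
    f"{base_request} Make the tone very cheerful, use upbeat adjectives "
    "like 'amazing' and 'incredible', and close with 'Warm regards' "
    "instead of 'Best regards'."
)
# The same underlying request with different tone instructions produces
# noticeably different drafts from the model.
```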
Programming with OpenAI API
In this section, the speaker demonstrates how to use OpenAI API to summarize a blog post programmatically.
Summarizing a Blog Post Programmatically
- The speaker loads up a blog post text from a file using Python.
- The loaded text is then broken up with new line characters.
- The OpenAI API is used to programmatically summarize the blog post into a single paragraph.
- This could be useful for leaders who want a quick summary of a lengthy blog post.
Setting up the Chat Flow
In this section, the speaker discusses how to set up a chat flow and append it appropriately.
Creating an Empty Chat Flow
- To set up a chat flow, create an empty chat flow.
- The empty chat flow will be used to append prompts.
Appending Prompts Appropriately
- Append prompts by using chat_flow.append.
- Use an f-string to pass in the blog post text as content.
- Programmatically summarize any text into a single paragraph using the blog_post_text variable.
Example Use Case
- As an example use case, imagine being a book publisher who receives a prospective author's book without a summary.
- You can pipe that text into the full book text and have it summarized by ChatGPT before sending it over to the primary editor.
Invoking the API
In this section, the speaker discusses how to invoke the OpenAI API.
Invoking OpenAI API
- Invoke the OpenAI API using its chat completion method (openai.ChatCompletion.create in the pre-1.0 Python library).
- Pass in your chat flow of prompts as an argument.
- Receive your response from the OpenAI API.
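The whole flow described in the last two sections can be sketched end to end. This is not the speaker's actual script: the file name, prompt wording, and model name are assumptions, and the commented-out call uses the pre-1.0 openai Python library.

```python
# Sketch of the summarization flow described above: load a blog post,
# build an (initially empty) chat flow, append the prompt with an
# f-string, and invoke the API. File name, wording, and model name
# are illustrative assumptions.

def build_summary_flow(blog_post_text: str) -> list[dict]:
    chat_flow = []  # start with an empty chat flow
    chat_flow.append({
        "role": "user",
        "content": f"Summarize the following text in a single paragraph:\n\n{blog_post_text}",
    })
    return chat_flow

# Loading the blog post text from a file:
# with open("blog_post.txt") as f:
#     blog_post_text = f.read()
blog_post_text = "Example blog post text..."  # placeholder for the loaded file
chat_flow = build_summary_flow(blog_post_text)

# The actual invocation (requires network access and an API key):
# import openai
# response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=chat_flow)
# print(response.choices[0].message.content)
```

The same pattern handles the book-publisher use case: swap the blog post text for the full manuscript text and the prompt asks for the same single-paragraph summary.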
Introduction to Prompt Engineering
In this section, the speaker introduces prompt engineering and explains how it works.
What is Prompt Engineering?
- Prompt engineering involves crafting the input to a generative model so that it produces the text you want.
- The speaker demonstrates an example of prompt engineering by showing how a program can summarize a blog post.
- The program sends a request to an API and receives a response back with the summarized content.
How Does Prompt Engineering Work?
- To use prompt engineering, you need to define the skill sets associated with the task at hand.
- For example, for machine learning engineering, these skills include machine learning, deep learning, software engineering, and technical architecture.
- By combining these skill sets with prompt engineering techniques, you can create systems that automatically generate text without manual input.
Advantages of Prompt Engineering
In this section, the speaker discusses the advantages of using prompt engineering in various fields.
Benefits of Combining Skill Sets
- When you combine different skill sets with prompt engineering techniques, you can create powerful systems that automate tasks such as summarizing text.
- This approach saves time and effort compared to manually copying and pasting information into UI forms repeatedly.
Conclusion
In this section, the speaker concludes their discussion on prompt engineering and hints at future topics they plan to cover in upcoming videos.
Final Thoughts on Prompt Engineering
- While not necessarily complex, prompt engineering can be useful when combined with other skill sets such as software or machine learning engineering.
- The speaker does not recommend hiring a dedicated prompt engineer but rather suggests incorporating these techniques into existing roles.
Future Topics
- The speaker plans to cover LangChain and BabyAGI or AutoGPT in future videos.
- They will not be available for a month after the next video due to the release of Zelda: Tears of the Kingdom.
Wrapping up for the Weekend
In this section, the speaker concludes their stream and announces their excitement for an upcoming game release.
Conclusion of Stream
- The speaker has been playing Breath of the Wild for close to 400 hours.
- They announce that they are excited for Tears of the Kingdom.
- The speaker plans to get another stream in before the release of Tears of the Kingdom.
- They thank viewers for watching and wish them a good weekend.
Closing Remarks
- The speaker hopes to see viewers again next week.
- They say goodbye and sign off.