Mastering AI Conversations: The Art of Prompt Engineering

Leone Perdigão
8 min read · Jun 3, 2023



In the ever-evolving field of artificial intelligence, a new discipline has emerged: prompt engineering. This burgeoning field focuses on the development and optimization of prompts, a critical component in leveraging the power of language models (LMs) for a diverse array of applications and research topics. With a keen understanding of the capabilities and limitations of large language models (LLMs), those skilled in prompt engineering are able to harness these AI tools more effectively.

Researchers are increasingly turning to prompt engineering to enhance the proficiency of LLMs in tackling both conventional and intricate tasks, such as question answering and arithmetic reasoning. Meanwhile, developers are employing the same skills to devise robust and efficient prompting techniques, facilitating better interfaces with LLMs and other technological tools.

What is Prompt Engineering?

Prompt engineering refers to the practice of designing and refining prompts, the inputs given to an AI model, controlling their specificity and context so that the model generates the desired output.

Prompt engineering, however, goes beyond mere design and development of prompts. It incorporates a wide variety of skills and methodologies that prove invaluable when working with LLMs. It’s a crucial skill set for those who wish to effectively interface with, construct using, and understand the capabilities of LLMs. Moreover, prompt engineering can be employed to bolster the safety of LLMs, and even to innovate new capabilities such as enhancing LLMs with domain knowledge and integrating them with external tools.

The effectiveness of the AI’s output can be greatly influenced by the quality of the prompt. A well-constructed prompt is clear, concise, and specific, providing the AI model with all the necessary information to generate a high-quality response. Conversely, a poorly constructed prompt is often vague, ambiguous, or incomplete, potentially leading the AI model to generate irrelevant or poor-quality responses​.

Here are some examples of well-constructed prompts:

  • Write a poem about a lost love.
  • Translate this sentence from English to Spanish.
  • Write code that prints the numbers from 1 to 10.
  • Summarise the text between the quotes, limiting your response to 300 words.

Conversely, a poorly constructed AI prompt is vague, ambiguous, or incomplete. It may not provide the model with enough information to generate a response, or it could lead the model to produce an irrelevant response [1].

Here are some examples:

  • Write something.
  • Tell me a story.
  • Translate this.
  • Write a program.

The Importance of Precision in Prompts

The more descriptive the prompt, the better the results, but avoid unnecessary detail that dilutes the instruction. Experimentation and iteration are key to optimising prompts.

There are several best practices for AI prompt engineering, including:

  1. Make your AI Prompts Specific: Instructions should be detailed and specific about the task you want the model to perform. The more descriptive the prompt, the better the results. For instance, if you want the AI to extract only the animals from a text, your prompt could be: “Extract the names of animals from the following text.”
  2. Make your AI Prompts Precise: Being too clever or vague can lead to imprecise responses. Be direct and clear in your communication with AI. If you want a concise explanation of artificial general intelligence (AGI), it’s better to be clearer about how many sentences and in what style you want the response.
  3. Avoid Prompts That Ask What Not to Do: Rather than instructing the model what not to do, specify what it should do. This encourages specificity and leads to more accurate responses [1].
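As a minimal illustration, the three practices above can be folded into a small prompt-building helper. The `build_prompt` function and its parameters are invented for this sketch and are not part of any particular API; the model call itself is left out:

```python
def build_prompt(task: str, text: str = "", output_format: str = "") -> str:
    """Assemble a specific, affirmatively phrased prompt."""
    parts = [task.strip()]  # state the task explicitly (practice 1)
    if output_format:
        # be precise about the shape of the answer (practice 2)
        parts.append(f"Respond with {output_format.strip()}.")
    if text:
        # delimit the input so the model cannot mistake it for instructions
        parts.append(f'Text: """{text.strip()}"""')
    return "\n".join(parts)

prompt = build_prompt(
    task="Extract the names of animals from the following text.",
    text="The quick brown fox jumps over the lazy dog.",
    output_format="a comma-separated list",
)
```

Note that every part of the prompt is stated affirmatively (practice 3): the helper says what to do with the text, never what to avoid.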

Prompting Techniques

While the basic examples above were simple, this section covers more advanced prompt engineering techniques that make more complex and interesting tasks possible.

It is important to note that these are just general descriptions of some of the existing techniques. The specific implementation and effectiveness of each technique may vary depending on the task and the specific language model being used.

Zero-shot Prompting

Zero-shot prompting refers to the technique where a model is given a task without any specific examples or training for that particular task. The model uses its pre-training knowledge to generate the best response it can. This technique is useful for one-off tasks or when specific training data is not available.

Prompt:

Classify the text into neutral, negative or positive.

Text: I think the vacation is okay.
Sentiment:

Output:

Neutral

Few-shot Prompting

In few-shot prompting, the model is given a few examples of the desired task before being asked to perform it. These examples serve as a guide, helping the model to understand the task and generate an appropriate response. Few-shot prompting can be used to improve the performance of the model on specific tasks.

Prompt:

A "whatpu" is a small, furry animal native to Tanzania. An example of a sentence that uses the word whatpu is:
We were traveling in Africa and we saw these very cute whatpus.
To do a "farduddle" means to jump up and down really fast. An example of a sentence that uses the word farduddle is:

Output:

When we won the game, we all started to farduddle in celebration.
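The structure of a few-shot prompt, worked pairs followed by an unfinished query, can be sketched as simple string assembly. The `few_shot_prompt` helper is made up for illustration; no model call is shown:

```python
def few_shot_prompt(examples: list[tuple[str, str]], query: str) -> str:
    """Concatenate worked input/output pairs, then the unfinished query."""
    lines = []
    for setup, completion in examples:
        lines.append(setup)       # the example's input
        lines.append(completion)  # the demonstrated output
    lines.append(query)           # left incomplete for the model to finish
    return "\n".join(lines)

prompt = few_shot_prompt(
    examples=[(
        'A "whatpu" is a small, furry animal native to Tanzania. '
        "An example of a sentence that uses the word whatpu is:",
        "We were traveling in Africa and we saw these very cute whatpus.",
    )],
    query='To do a "farduddle" means to jump up and down really fast. '
          "An example of a sentence that uses the word farduddle is:",
)
```

Because the prompt ends exactly where an answer should begin, the model's most natural continuation is a sentence in the demonstrated format.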

Chain-of-Thought Prompting

Chain-of-thought prompting elicits intermediate reasoning steps from the model, typically by including one or more exemplars whose answers write out the reasoning before the final question is asked. Reasoning step by step, instead of jumping straight to an answer, markedly improves performance on tasks that require multi-step reasoning or problem solving.

Source: Wei et al. (2022)
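Concretely, a chain-of-thought prompt embeds at least one exemplar whose answer writes out its intermediate steps; the tennis-ball exemplar below is the well-known illustration from Wei et al. (2022), while the helper function is invented for this sketch:

```python
COT_EXEMPLAR = (
    "Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. "
    "Each can has 3 tennis balls. How many tennis balls does he have now?\n"
    "A: Roger started with 5 balls. 2 cans of 3 tennis balls each is 6 "
    "tennis balls. 5 + 6 = 11. The answer is 11.\n"
)

def cot_prompt(question: str) -> str:
    # The worked exemplar nudges the model to reason step by step
    # before committing to a final answer.
    return COT_EXEMPLAR + f"Q: {question}\nA:"

prompt = cot_prompt(
    "The cafeteria had 23 apples. If they used 20 to make lunch and "
    "bought 6 more, how many apples do they have?"
)
```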

Self-Consistency

The self-consistency technique replaces the single, greedy completion used in chain-of-thought prompting with several sampled reasoning paths for the same prompt; the final answer that appears most often across those paths is selected. This helps on arithmetic and commonsense reasoning, where any one sampled chain can go wrong.

Prompt:

When I was 6 my sister was half my age. Now I’m 70 how old is my sister?

Output (single completion, a common failure):

35

Sampling several reasoning chains and taking the majority answer instead tends to produce the correct 67: at age 6 the sister was 3, so she is 3 years younger.
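In practice, self-consistency is often implemented as a majority vote over the final answers of several sampled chains. A minimal sketch, with the sampling itself stubbed out as a fixed list:

```python
from collections import Counter

def majority_answer(answers: list[str]) -> str:
    """Select the most consistent final answer across sampled chains."""
    return Counter(answers).most_common(1)[0][0]

# Pretend we sampled five chain-of-thought completions for the age
# question: four chains reach 67, one slips to 35.
sampled = ["67", "67", "35", "67", "67"]
answer = majority_answer(sampled)
```

A real pipeline would obtain the samples by calling the model several times with a nonzero temperature, then extract the final answer from each completion before voting.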

Generate Knowledge Prompting

Generate knowledge prompting proceeds in two stages: the model is first asked to generate relevant facts about a question, drawing on its pre-training knowledge, and those generated statements are then included in the prompt when producing the final answer. This can help on tasks, such as commonsense reasoning, where useful background knowledge is not stated in the question itself.

Source: Liu et al. 2022
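The two-stage shape of the technique can be sketched as follows. The generated fact is hard-coded here where a real pipeline would request it from the model in a first call, and the helper name is invented for illustration:

```python
def knowledge_prompt(question: str, knowledge: list[str]) -> str:
    """Stage 2: fold the generated statements into the answering prompt."""
    facts = "\n".join(f"Knowledge: {k}" for k in knowledge)
    return f"{facts}\nQuestion: {question}\nAnswer:"

# Stage 1 (stubbed): the model would first be asked to generate facts
# relevant to the question before being asked to answer it.
generated = ["The goal of golf is to finish with the lowest score."]
prompt = knowledge_prompt(
    "Is the aim of golf to get a higher point total than others?",
    generated,
)
```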

Automatic Prompt Engineering

Automatic prompt engineering uses the model itself, or another automatic process, to generate and optimise prompts: candidate instructions are generated, each is scored against a small set of examples, and the best-scoring instruction is kept. This can be a time-saving technique for large-scale or complex tasks.

Source: Zhou et al., (2022)
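The generate-then-score loop can be sketched with both the candidate generator and the scorer stubbed out. In the framework of Zhou et al., candidates come from an LLM and the score reflects task accuracy on held-out examples; the numbers below are invented:

```python
def select_instruction(candidates: list[str], score) -> str:
    """Keep the candidate instruction that scores best on an eval set."""
    return max(candidates, key=score)

# Candidates would normally be proposed by a model; scores would come
# from running each candidate as a prompt over example input/output pairs.
candidates = [
    "Translate the word into French.",
    "Write the antonym of the word.",
    "Repeat the word.",
]
stub_scores = {candidates[0]: 0.21, candidates[1]: 0.94, candidates[2]: 0.05}
best = select_instruction(candidates, score=stub_scores.get)
```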

Active-Prompt

Active-Prompt adapts chain-of-thought exemplars to the task at hand. The model is queried several times on a pool of training questions; the questions whose sampled answers disagree the most, i.e. where the model is most uncertain, are selected for human annotation with reasoning chains, and those newly annotated exemplars are then used for prompting.

Source: Diao et al., (2023)
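The selection step can be sketched with a simple disagreement metric over stubbed answer samples. Diao et al. evaluate several uncertainty metrics; this sketch shows only one, and the sample data is invented:

```python
def disagreement(answers: list[str]) -> float:
    """Uncertainty as the share of distinct answers among k samples."""
    return len(set(answers)) / len(answers)

def most_uncertain(samples: dict[str, list[str]], top_n: int = 1) -> list[str]:
    """Questions where the model disagrees with itself the most are the
    ones worth annotating with human-written reasoning chains."""
    ranked = sorted(samples, key=lambda q: disagreement(samples[q]), reverse=True)
    return ranked[:top_n]

samples = {
    "Q1": ["4", "4", "4"],    # consistent -> low uncertainty
    "Q2": ["7", "9", "12"],   # inconsistent -> select for annotation
}
selected = most_uncertain(samples)
```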

Directional Stimulus Prompting

Directional stimulus prompting guides the model’s output in a specific direction through a stimulus, such as hint keywords, included in the prompt. In Li et al.’s framework, a small, tunable policy model generates these stimuli for each input, steering the larger, frozen model toward specific stylistic or content goals.

Source: Li et al., (2023)
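The prompt-side mechanics are simple to sketch. The hint keywords are hard-coded here, whereas in Li et al.’s framework a small tuned policy model would generate them per input; the helper and the article snippet are invented for illustration:

```python
def prompt_with_stimulus(instruction: str, article: str, hints: list[str]) -> str:
    """Append keyword hints as a directional stimulus for the frozen LLM."""
    return (
        f"{instruction}\n"
        f"Article: {article}\n"
        f"Hint: {'; '.join(hints)}\n"  # the stimulus steering the output
        f"Summary:"
    )

prompt = prompt_with_stimulus(
    "Summarise the article in one sentence, guided by the hint keywords.",
    "Bob Barker returned to host The Price Is Right for one episode...",
    hints=["Bob Barker", "returned", "The Price Is Right"],
)
```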

ReAct

ReAct (short for Reason + Act) interleaves reasoning traces with actions: the model alternates between verbalising a thought, issuing an action such as a query to an external tool (e.g. a search API), and reading the resulting observation back into its context. This lets the model ground its reasoning in external, up-to-date information rather than relying on its parameters alone.

Source: Yao et al., 2022
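A minimal sketch of the reason-act loop, with both the model and a single tool stubbed out. The `Action: tool[argument]` line format mirrors the paper’s transcripts; everything else here, including the stub model and fake search tool, is invented for illustration:

```python
def react_loop(model_step, tools: dict, max_steps: int = 5) -> str:
    """Alternate model steps with tool observations until a final answer."""
    transcript = ""
    for _ in range(max_steps):
        step = model_step(transcript)
        transcript += step + "\n"
        if step.startswith("Finish:"):
            return step.removeprefix("Finish:").strip()
        if step.startswith("Action:"):
            name, _, arg = step.removeprefix("Action:").strip().partition("[")
            observation = tools[name](arg.rstrip("]"))
            transcript += f"Observation: {observation}\n"  # fed back to the model
    return transcript

# Stubs: a "model" that searches once and then answers, and a fake search tool.
def fake_model(transcript: str) -> str:
    if "Observation:" in transcript:
        return "Finish: 1872"
    return "Action: search[Yellowstone founding year]"

answer = react_loop(fake_model, {"search": lambda q: "Yellowstone was established in 1872."})
```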

Multimodal CoT

Multimodal chain-of-thought (CoT) extends chain-of-thought prompting to inputs that combine modalities, such as text and images: the model first generates a rationale from the multimodal input, then infers the answer from that rationale. This can enhance the model’s understanding and performance on complex multimodal tasks.

Source: Zhang et al. (2023)

Graph Prompting

Graph prompting applies prompting ideas to graph data: relationships between concepts or pieces of information are represented as a graph, and prompts connect what the model learned during pre-training with downstream graph tasks. Liu et al. (2023) introduce GraphPrompt, a prompting framework that unifies pre-training and downstream tasks on graphs under a common template to improve downstream performance.

Applications of Prompt Engineering

Prompt engineering can be used for various applications, including:

  • Generating creative content, such as poems, stories, and scripts
  • Translating languages
  • Answering questions
  • Writing different kinds of content, such as blog posts, articles, and reports
  • Generating code
  • Solving math problems
  • Creating art or images
  • Generating new ideas [1]

Learning Resources for Prompt Engineering

For developers interested in diving deeper into AI and prompt engineering, Stanford’s selection of online AI programs is highly recommended. Additionally, a free course on prompt engineering is available from DeepLearning.AI, founded by Stanford professor Andrew Ng [1].

The course, “ChatGPT Prompt Engineering for Developers”, created in collaboration with OpenAI, helps developers build applications for text processing, robotic process automation, coaching, and more using ChatGPT’s API. The course covers best practices for prompting, with use cases such as summarising, inferring, transforming, and expanding texts, and even building custom chatbots [2].

I also found this guide quite interesting [3].

Concluding Thoughts

Andrew Ng puts it well:

The key to being an effective prompt engineer isn’t so much about knowing the perfect prompt, it’s about having a good process to develop prompts that are effective for your application. — Andrew Ng

As AI continues to develop and become more integrated into our lives, effective prompt engineering will become an increasingly valuable skill.

The information provided here is current as of June 2023. For more recent developments in prompt engineering, I recommend checking the DeepLearning.AI website and other reputable AI education and research sources.

As an emerging field, new techniques and use cases are being discovered every day. Personally, I’m still exploring and following information from many sources to get a more complete picture of the current state of prompt engineering.

Thank you very much for reading! If you liked this article, please give it some claps and leave a comment below.

References

  1. Hackr.io. (2023). “Understanding AI Prompt Engineering: A Complete Guide.” Available from https://hackr.io/blog/prompt-engineering.
  2. DeepLearning.AI. (2023). “New Course — ChatGPT Prompt Engineering for Developers.” Available from https://www.deeplearning.ai/short-courses/chatgpt-prompt-engineering-for-developers/.
  3. Prompt Engineering Guide. Available from https://www.promptingguide.ai/.
