Meta Prompting for AI Systems

Tsinghua University, Shanghai AI Lab, Shanghai Qizhi Institute

Abstract

In this work, we present a comprehensive study of Meta Prompting (MP), an innovative technique that reshapes how large language models (LLMs) and AI systems are used for problem-solving and data interaction. Grounded in type theory and category theory, Meta Prompting emphasizes the structure and syntax of information over traditional content-centric methods. The paper gives formal definitions of Meta Prompting, distinguishes it from few-shot prompting, and underlines its effectiveness across a range of AI applications. A key focus is applying Meta Prompting to complex reasoning tasks, showing how it deconstructs intricate problems into simpler sub-problems, improves token efficiency, and enables fairer comparisons between problem-solving methods, especially against few-shot prompting. Additionally, the paper introduces Meta Prompting for prompting tasks, allowing LLMs to self-generate new prompts in a recursive, metaprogramming-like manner. Empirical experiments demonstrate the technique's efficacy: a meta-prompted Qwen-72B base language model, without any instruction tuning, solves MATH problems with 46.3% accuracy, surpassing its supervised fine-tuned counterpart trained on extensive mathematical QA instruction pairs and even the initial version of GPT-4; the same zero-shot meta-prompted model reaches 83.5% accuracy on GSM8K; and GPT-4 solves the Game of 24 tasks with a 100% success rate. These results showcase Meta Prompting's transformative impact on AI problem-solving, achieving both high accuracy and efficiency.

Meta Prompting

Meta Prompting is an advanced prompting technique that focuses on the structural and syntactical aspects of problems, prioritizing their general format and pattern over specific content details. It offers a more abstract, structured way of interacting with large language models (LLMs), and is particularly effective in contexts where the underlying pattern or framework of a problem is crucial for understanding or solving it.

Definition. A Meta Prompt is an example-agnostic structured prompt designed to capture the reasoning structure of a specific category of tasks. It provides a scaffold that outlines the general approach to a problem, enabling LLMs to fill in specific details as needed. This approach allows for more efficient and targeted use of LLM capabilities by focusing on the "how" of problem-solving rather than the "what".
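
To make the definition concrete, here is a minimal sketch of what such an example-agnostic scaffold might look like for one category of tasks (algebra word problems). The template wording and the helper function are illustrative assumptions, not the exact prompt used in the paper; the point is that the prompt fixes the reasoning structure while leaving the content slot open.

# A minimal, illustrative meta prompt for the category "algebra word problems".
# The wording is an assumption for demonstration, not the paper's template.
# Note that it contains no concrete worked examples: it only fixes the
# structure of the expected reasoning and the format of the final answer.
META_PROMPT = """Problem Statement:
- Problem: [question to be answered]

Solution Structure:
1. Begin the response with "Let's think step by step."
2. Identify the variables and the relationships between them.
3. Translate those relationships into equations.
4. Solve the equations, showing each algebraic step.
5. End with: "The final answer is \\boxed{[answer]}."
"""

def build_prompt(problem: str) -> str:
    """Instantiate the example-agnostic scaffold with a concrete problem."""
    return META_PROMPT.replace("[question to be answered]", problem)

print(build_prompt("If 3x + 5 = 20, what is x?"))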


Characteristics of Meta Prompting

Syntax-Oriented: Meta Prompting prioritizes the form and structure over the content, using syntax as a guiding template for the expected response or solution.

Abstract-Example-Based: It employs abstracted examples as frameworks, illustrating the structure of problems and solutions without focusing on specific content.

Type Theory Inspiration: Drawing from type theory, Meta Prompting emphasizes the categorization of components in a prompt, such as problem statements, solution steps, or conclusions. It focuses on their logical arrangement and interrelationships, ensuring a coherent and structured approach to problem-solving (a minimal sketch of this typed view follows the list below).

Adaptability: Meta Prompting is versatile, applicable across various domains, and capable of providing structured responses to a wide range of problems.

Guidance for Detailed Exploration: It provides a clear roadmap for problem-solving, focusing on structural patterns to aid in navigating complex topics.
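
As a minimal illustration of these characteristics, especially the type-theoretic view, the sketch below models a meta prompt as a composition of typed components whose logical arrangement stays fixed while the content slots change across domains. The class names and fields are assumptions made for exposition, not an interface defined by the paper.

from dataclasses import dataclass
from typing import List

# Typed prompt components: each part of a meta prompt has its own "type",
# and a complete prompt is a structured composition of those parts.

@dataclass
class ProblemStatement:
    text: str

@dataclass
class SolutionStep:
    instruction: str  # structural guidance, not content-specific detail

@dataclass
class Conclusion:
    format_hint: str  # how the final answer must be presented

@dataclass
class MetaPrompt:
    statement: ProblemStatement
    steps: List[SolutionStep]
    conclusion: Conclusion

    def render(self) -> str:
        lines = [f"Problem: {self.statement.text}", "", "Solution:"]
        lines += [f"{i}. {s.instruction}" for i, s in enumerate(self.steps, 1)]
        lines += ["", f"Conclusion: {self.conclusion.format_hint}"]
        return "\n".join(lines)

# The same scaffold adapts across domains by swapping only the content slots;
# the arrangement of components (statement -> steps -> conclusion) is fixed.
prompt = MetaPrompt(
    statement=ProblemStatement("Use the numbers 4, 7, 8, 8 to obtain 24."),
    steps=[
        SolutionStep("List the arithmetic operations that may be applied."),
        SolutionStep("Search over orderings of the numbers and operations."),
        SolutionStep("Verify that the chosen expression evaluates to the target."),
    ],
    conclusion=Conclusion('State the result as "Answer: <expression> = 24".'),
)
print(prompt.render())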

Citation

Please cite the paper and star this repo if you use Meta Prompting (MP) and find it interesting or useful. Thanks!

@article{zhang2023meta,
  title={Meta Prompting for AGI Systems},
  author={Zhang, Yifan and Yuan, Yang and Yao, Andrew Chi-Chih},
  journal={arXiv preprint arXiv:2311.11482},
  year={2023}
}