promptrefiner: Using GPT-4 to Create a Perfect System Prompt for Your Local LLM


Image created by DALL·E 3

In this tutorial, we'll explore promptrefiner: a tiny Python tool I created to craft perfect system prompts for your local LLM, with the assistance of the GPT-4 model.

The Python code for this article is available here:

https://github.com/amirarsalan90/promptrefiner.git

Crafting an effective and detailed system prompt for your application can be a challenging process that often requires multiple trials and errors, particularly when working with smaller LLMs, such as a 7B language model. While larger models can generally interpret and follow less detailed prompts, a smaller large language model like Mistral 7B can be more sensitive to your system prompt.

Let's imagine a scenario where you're working with a text that mentions a number of individuals and discusses their contributions or roles. Now, you want your local language model, say Mistral 7B, to distill this information into a list of Python strings, each pairing a name with its associated details in the text. Take the following paragraph as a case:

Screenshot of the input text. Image created by the author

For this example, I would like a prompt that results in the LLM giving me a string like the following:

"""
["Elon Musk: Colonization of Mars", "Stephen Hawking: Warnings about AI", "Greta Thunberg: Environmentalism", "Digital revolution: Technological advancement and existential risks", "Modern dilemma: Balancing ambition with environmental responsibility"]
"""
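Since the local model returns this list as plain text, a minimal sketch of how you might recover it as an actual Python list is `ast.literal_eval`, which safely parses Python literals without executing arbitrary code:

```python
import ast

# Raw text returned by the local LLM (the target output from above)
raw_output = (
    '["Elon Musk: Colonization of Mars", "Stephen Hawking: Warnings about AI", '
    '"Greta Thunberg: Environmentalism", '
    '"Digital revolution: Technological advancement and existential risks", '
    '"Modern dilemma: Balancing ambition with environmental responsibility"]'
)

# Safely parse the string into a Python list of strings
pairs = ast.literal_eval(raw_output)

print(len(pairs))   # → 5
print(pairs[0])     # → Elon Musk: Colonization of Mars
```

In practice you would wrap the parse in a try/except, since a model that doesn't follow the prompt exactly may emit text that isn't a valid Python literal.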

When we use an instruction fine-tuned language model (a language model fine-tuned for interactive conversations), the prompt often consists of two parts: 1) the system prompt, and 2) the user prompt. For this example, consider the following system and user prompt:

Screenshot of the system + user prompt. Image created by the author

You can see that the first part of this prompt is my system prompt, which tells the LLM how to generate the answer, and the second part is my user prompt, which is…
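To make the two-part structure concrete, here is a minimal sketch of how a system prompt and a user prompt are typically combined into the instruct template that Mistral 7B Instruct expects. The prompt text here is hypothetical (the exact wording from the screenshot is not reproduced), and the helper function is an illustration, not part of promptrefiner itself:

```python
# Hypothetical system and user prompts for the extraction task
system_prompt = (
    "You are a helpful assistant. Extract every person mentioned in the text "
    "and return a Python list of strings, each formatted as 'Name: contribution'. "
    "Return only the list, nothing else."
)
user_prompt = "Elon Musk champions the colonization of Mars, while Stephen Hawking warned about AI..."

def build_mistral_prompt(system: str, user: str) -> str:
    """Combine a system and a user prompt into Mistral Instruct's [INST] template.

    Mistral Instruct has no dedicated system role, so the system prompt is
    commonly prepended inside the first [INST] block.
    """
    return f"<s>[INST] {system}\n\n{user} [/INST]"

prompt = build_mistral_prompt(system_prompt, user_prompt)
```

The resulting string is what you would send to the local model (for example, via llama.cpp's completion endpoint); chat-style APIs do this templating for you behind the scenes.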
