Train a custom ChatGPT to answer questions about your resume
Introduction
Prerequisites
Create an OpenAI API Key
Install libraries
Set up the data and code
Train the AI chatbot with resume data
Conclusion

Given ChatGPT’s tremendous capabilities, similar customized AI chatbots are likely to grow in popularity at both the personal and enterprise level.

In fact, ChatGPT needs no introduction. Interestingly though, and admittedly thankfully, OpenAI also offers a GPT-3 API (application programming interface) that can be applied to virtually any task involving understanding or generating natural language, code, or images. In this article, we will leverage the power of this API to train an AI chatbot on personalized data, i.e., a personal resume. The same implementation can also be followed for other applications, such as summarizing a book, a report, or a financial plan.

A few notes before following along:

  • You already have Python 3 installed (if not, check out this installation guide).
  • You already have an OpenAI account. If you have ever used ChatGPT, then you have an account. If not, go to platform.openai.com/signup and create one.
  • To get the best results, the dataset should preferably be in English. However, according to OpenAI, it will also work with popular international languages like French, Spanish, German, etc. So go ahead and give it a try in your own language.
  • In this article we will feed a fairly small PDF document (just 4 pages) to our model. If you want to use a larger dataset, make sure your computer has a powerful CPU and GPU.
  1. Log in to OpenAI.
  2. Go to Personal > View API keys.

3. Select Create new secret key.

4. Copy the API key. Note that you won’t be able to view or copy the key afterward, so make sure you copy-paste it into your notebook or working file; otherwise, you’ll have to create a new key.

5. Now, visit the usage page to verify that you have adequate remaining credit. By default, upon creation of a new account, OpenAI offers $18 of free credit to use within a certain period. If your credit has expired or run out, you can purchase more via the usage page. Alternatively, you can set up a new OpenAI account with a different phone number to receive additional free credit. This will help you avoid Error 429 (You exceeded your current quota, please check your plan and billing details) while running the code.
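If you still hit rate limits occasionally, a small retry helper with exponential backoff can smooth things over. This is a generic sketch, not part of the article’s script; with the openai library of that era you would pass `retryable=(openai.error.RateLimitError,)`:

```python
import time


def with_retries(fn, max_attempts=5, base_delay=1.0, retryable=(Exception,)):
    """Call fn(), retrying with exponential backoff on retryable errors.

    Doubles the wait after each failed attempt; re-raises after the
    final attempt so the caller still sees the underlying error.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

You would then wrap any API call as `with_retries(lambda: index.query(text))`.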

6. As the word “secret” suggests, the API key is strictly personal. So don’t share it with others or expose it in the browser or other client-side code. To protect the security of your account, OpenAI may automatically rotate any API key found to have leaked publicly.
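One common way to keep the key out of your source code entirely (a general practice, not specific to this article) is to read it from an environment variable:

```python
import os


def get_openai_key():
    """Read the OpenAI API key from the environment instead of hardcoding it."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("Set the OPENAI_API_KEY environment variable first.")
    return key
```

Before running the script, export the variable once in your Terminal session, e.g. `export OPENAI_API_KEY="sk-..."`.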

Let’s install our libraries. Open the Terminal and type the following commands, waiting for each installation to complete before starting the next one.

We will use the OpenAI library to train our LLM-based AI chatbot. Later on, we will also import the LangChain framework.

pip install openai

Next, we install gpt_index, which will allow us to connect to the external data that will be used to train our chatbot. (We pin version 0.4.24 so that the code below matches that version of the library’s API.)

pip install gpt_index==0.4.24

In this application we will work with a PDF document, so we’ll install PyPDF2 and PyCryptodome to parse PDF files without errors.

pip install PyPDF2
pip install PyCryptodome

Lastly, we’ll use Gradio to build a simple interactive UI for our AI chatbot.

pip install gradio

We’ll use the summary of my LinkedIn profile as training data. It can be downloaded by selecting More > Save to PDF on the main profile page.

The code used to train the chatbot (custom_gpt.py) is shown below:
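The original listing was published as an image, so the sketch below reconstructs it from the description that follows, against the gpt_index 0.4.24 API pinned earlier. The function names (create_index, chat), the 'Your API key' placeholder on line 8, the parameter set, and response_mode="tree_summarize" come from the text; the exact parameter values and the text-davinci-003 model string are assumptions:

```python
from gpt_index import SimpleDirectoryReader, GPTSimpleVectorIndex
from gpt_index import LLMPredictor, PromptHelper
from langchain import OpenAI
import gradio as gr
import os

# Line 8: replace 'Your API key' with the secret key created earlier.
os.environ["OPENAI_API_KEY"] = 'Your API key'


def create_index(directory_path):
    # Parameters that configure how documents are chunked and prompted
    max_input_size = 4096
    num_outputs = 512
    max_chunk_overlap = 20
    chunk_size_limit = 600

    prompt_helper = PromptHelper(max_input_size, num_outputs,
                                 max_chunk_overlap,
                                 chunk_size_limit=chunk_size_limit)
    # LLMPredictor wraps the GPT-3 "davinci" family model used for responses
    llm_predictor = LLMPredictor(llm=OpenAI(temperature=0.7,
                                            model_name="text-davinci-003",
                                            max_tokens=num_outputs))

    # Read every file in the directory and build the index
    documents = SimpleDirectoryReader(directory_path).load_data()
    index = GPTSimpleVectorIndex(documents,
                                 llm_predictor=llm_predictor,
                                 prompt_helper=prompt_helper)
    index.save_to_disk('index.json')  # persist the index as JSON
    return index


def chat(input_text):
    index = GPTSimpleVectorIndex.load_from_disk('index.json')
    # tree_summarize empirically gives better summarization results
    response = index.query(input_text, response_mode="tree_summarize")
    return response.response


iface = gr.Interface(fn=chat,
                     inputs=gr.inputs.Textbox(lines=7,
                                              label="Enter your question"),
                     outputs="text",
                     title="Custom-trained resume chatbot")

index = create_index("docs")
iface.launch(share=True)
```

Running this requires a valid API key and the pinned gpt_index version; newer releases of the library (renamed llama_index) changed these class names and signatures.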

The chatbot is based on the GPT-3 language model and more specifically uses “davinci”, the most capable GPT-3 model, which can do any task the other models can do, often with higher quality. The code also uses the gpt_index package for building and managing the index of possible responses.

In line 8, ‘Your API key’ must be replaced with the personal API key created previously.

The create_index function uses the imported modules to build an index over the documents in a specified directory path, which the chatbot uses to generate responses. The function sets several parameters, such as max_input_size, num_outputs, max_chunk_overlap, and chunk_size_limit, to configure the index. It also uses a PromptHelper to assist in constructing prompts and an LLMPredictor that wraps the GPT-3 model used to generate responses.

Once the index is created, it is saved to disk in JSON format using the save_to_disk method. The chat function loads the saved index from disk and queries it with the user’s input text. Empirically, setting response_mode="tree_summarize" leads to better summarization results. The function then returns the chatbot’s response.

Finally, the script creates a Gradio interface for the chatbot, which allows users to enter text and receive responses. The interface starts with the launch method, and the share parameter is set to True to allow others to access the interface via a public link.

Now, we’ll create the following simple structure in any directory we prefer. The easiest option is to save the files on the Desktop. The “docs” folder contains the PDF of the LinkedIn profile summary, and “custom_gpt.py” is our code.
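The layout looks like this (the PDF filename is just an example):

```
Desktop/
├── custom_gpt.py
└── docs/
    └── linkedin_profile_summary.pdf
```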

We’ll execute the script from the command line.

  1. Open the Terminal.
  2. Using the cd command, navigate to the directory where the “docs” folder and the source code are saved.
  3. Run the following command:
python custom_gpt.py

4. Copy-paste the local URL into your web browser. It will load the interface of our chatbot.

5. The interface will look like this and is ready to answer any questions relevant to the training data:

Let’s try it out!

Pretty good for such a simple, yet quite sophisticated chatbot. Considering the low volume of data it was trained on, it can adequately summarize some key areas of the resume. If the resumes of several people were added to the “docs” folder, the chatbot would be able to distinguish between them and generate the appropriate response. The current implementation can easily work with PDF and text files: simply add them to the “docs” folder and rerun the script via the Terminal.

To stop the custom-trained AI chatbot, press “Ctrl + C” in the Terminal window. If it doesn’t work, press “Ctrl + C” again. Also, visit the usage page again to view the usage incurred and keep track of your tokens.

In this article we explored how a customized AI chatbot can be trained on one’s own data and then answer relevant questions. The same approach can be used to summarize books, articles, or anything else in PDF or text format. The possibilities are limitless, though limitations still exist.

Pavlos
