How to Integrate ChatGPT to Review Pull Requests on GitHub Using GitHub Actions

I decided to add ChatGPT to my open source project as a pull request reviewer, so that it can immediately point out typos and minor inaccuracies in the code. In this article I will share how to set this up without buying foreign phone numbers, bank cards, or VPN subscriptions. For this we will use the ProxyAPI service and write one short YAML file for GitHub Actions.

Getting started

The first step is to add the file .github/workflows/cr.yml to the repository.

The contents of the file itself are as follows:

name: Code Review

permissions:
  contents: read
  pull-requests: write

on:
  pull_request:
    types: [opened, reopened, synchronize]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: anc95/ChatGPT-CodeReview@v1.0.11
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          LANGUAGE: Russian
          OPENAI_API_ENDPOINT: https://api.proxyapi.ru/openai/v1
          MODEL: gpt-3.5-turbo
          PROMPT: "You are an experienced Kotlin/Java developer and your job is to review pull requests. Please review the following code for any misunderstandings or violations. Don't spend time commenting on what is already working perfectly. I'm looking for constructive criticism and suggestions for improving the code, only useful and thorough notes."

I won’t go into detail explaining every line. I’ll only tell you about the important ones.

We will use the ready-made action anc95/ChatGPT-CodeReview@v1.0.11 (link to the repository).

Now let’s go through the settings of this action:

  • GITHUB_TOKEN is required by most actions. GitHub creates this token automatically for every workflow run, so you do not need to add it yourself (see the documentation on GITHUB_TOKEN).

  • OPENAI_API_KEY is the OpenAI API key. To make requests to OpenAI from the Russian Federation, you can create a key in the ProxyAPI service: after registering, go to your personal account and create a key.

Next, create a new secret named OPENAI_API_KEY (you can choose your own name) in the repository settings (Settings → Secrets and variables → Actions).

  • LANGUAGE sets the language in which ChatGPT will write its comments on the pull request. Here is the action's code where the prompt for ChatGPT is assembled:

private generatePrompt = (patch: string) => {
    const answerLanguage = process.env.LANGUAGE
      ? `Answer me in ${process.env.LANGUAGE},`
      : '';

    const prompt =
      process.env.PROMPT ||
        'Below is a code patch, please help me do a brief code review on it. Any bug risks and/or improvement suggestions are welcome:';

    return `${prompt}, ${answerLanguage}:
    ${patch}
    `;
  };

That is, the phrase Answer me in Russian is appended after the prompt taken from the PROMPT variable (more about it below).
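To make the assembly concrete, here is a small self-contained sketch of the same logic. The function name buildPrompt and the sample values are my own for illustration; the logic mirrors the generatePrompt snippet above, but takes the environment values as parameters so it can be run standalone:

```typescript
// Sketch of the prompt assembly: combine PROMPT, LANGUAGE and the diff patch.
const buildPrompt = (patch: string, promptEnv?: string, languageEnv?: string): string => {
  // The language clause is only added when LANGUAGE is set
  const answerLanguage = languageEnv ? `Answer me in ${languageEnv},` : '';

  // Fall back to the action's default prompt when PROMPT is not set
  const prompt =
    promptEnv ||
    'Below is a code patch, please help me do a brief code review on it. Any bug risks and/or improvement suggestions are welcome:';

  return `${prompt}, ${answerLanguage}:
    ${patch}
    `;
};

// With PROMPT and LANGUAGE=Russian set, the request begins with the custom
// prompt followed by "Answer me in Russian":
const text = buildPrompt('@@ -1 +1 @@\n-val x=1\n+val x = 1', 'Review this diff.', 'Russian');
```

So if LANGUAGE is left unset, the request is just the prompt followed by the patch, with no language clause.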

  • OPENAI_API_ENDPOINT determines where requests are sent. Since we use a proxy service, change the value to https://api.proxyapi.ru/openai/v1.

  • MODEL selects the model that will respond to us. The default is gpt-3.5-turbo, but it can be changed to any model the service supports; the list of supported models can be found in the service's documentation.

  • PROMPT contains the request itself. Everything is simple here: it is an ordinary question to the model, just like in a regular chat, so we ask the model to review the code. Note that a separate request is made for each changed file in the diff.
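As an illustration of this per-file behaviour, here is a hypothetical sketch (splitDiffByFile is my own helper, not part of the action) that splits a combined diff into per-file patches, each of which would become its own review request:

```typescript
// Split a combined unified diff on "diff --git" headers: one patch per file.
const splitDiffByFile = (diff: string): string[] =>
  diff
    .split(/^diff --git /m)
    .filter((chunk) => chunk.trim().length > 0)
    .map((chunk) => 'diff --git ' + chunk);

const combined = [
  'diff --git a/A.kt b/A.kt',
  '+val a = 1',
  'diff --git a/B.kt b/B.kt',
  '+val b = 2',
].join('\n');

// Two changed files -> two separate requests to the model, two review comments.
const patches = splitDiffByFile(combined);
```

In the real action the per-file patches already arrive separately via the GitHub API, so no manual splitting is needed; the sketch only illustrates that each file is reviewed in its own request.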

Conclusion

Now, whenever a pull request is created, the workflow will be triggered.

If everything is set up correctly, then once the workflow completes, ChatGPT will leave comments on each changed file in your pull request.
