MessengerGPT
This project is a Facebook Messenger chatbot that uses the OpenAI GPT-3 model to generate responses to user messages. It leverages Flask as a web framework to handle incoming messages from Messenger and makes requests to the OpenAI API to generate responses.
Install / Use
Facebook Messenger Chatbot with OpenAI GPT-3
This guide will walk you through the process of setting up a chatbot that uses the OpenAI GPT-3 model to respond to messages on Facebook Messenger.
Install Using this Script
If you prefer to use a script instead of running the setup commands manually, you can download the run_app.sh script from this repository and run it with ./run_app.sh:
wget https://raw.githubusercontent.com/kentemman/MessengerGPT/main/run_app.sh
chmod +x run_app.sh
./run_app.sh
Prerequisites
Before getting started, you'll need:
- A Facebook Developer account
- A Facebook page for your chatbot
- An OpenAI API key
- Python 3 installed on your computer
- Flask and requests Python packages installed
Step 1: Create a Facebook app and page
- Go to the Facebook Developer portal and create a new app.
- Follow the steps to set up your app, including adding a Messenger product and linking it to your Facebook page.
- Generate a Page Access Token and keep it handy; you'll need it later.
Step 2: Get an OpenAI API key
- If you don't already have one, sign up for an account on the OpenAI website.
- Generate an API key for the GPT-3 model, and keep it handy.
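For orientation, the request your server will send to OpenAI looks roughly like this. This is a sketch, not the project's exact code: the model name and parameters are assumptions, and build_completion_request is a hypothetical helper.

```python
# Sketch of the HTTP request app.py makes to the OpenAI Completions API.
# Model name and parameters are assumptions; adjust to match your app.py.

def build_completion_request(prompt, api_key):
    """Build the URL, headers, and JSON body for a GPT-3 completion call."""
    url = "https://api.openai.com/v1/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": "text-davinci-003",  # assumed GPT-3 model
        "prompt": prompt,
        "max_tokens": 150,
    }
    return url, headers, body
```

You would then POST `body` to `url` with the `requests` package and read the generated text out of the JSON response.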
Step 3: Set up the Flask server
- Create a new directory for your project and navigate to it in the terminal.
- Create a new Python file and call it app.py.
- Paste the code from the original post into this file.
- Replace the OpenAI API key and Facebook Page Access Token with your own tokens.
- Install the Flask and requests Python packages by running pip install flask requests in the terminal.
- Start the Flask server by running python app.py in the terminal.
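Before the webhook will work, Facebook verifies it with a GET request carrying hub.mode, hub.verify_token, and hub.challenge; the server must echo the challenge back when the token matches. A minimal sketch of that check (check_verification is a hypothetical helper, and the VERIFY_TOKEN value is whatever you choose in app.py):

```python
# Facebook's webhook verification handshake: echo hub.challenge back
# only when hub.verify_token matches the token you configured.

VERIFY_TOKEN = "my-secret-token"  # placeholder; pick your own value

def check_verification(args):
    """Return the challenge string if the handshake is valid, else None."""
    if (args.get("hub.mode") == "subscribe"
            and args.get("hub.verify_token") == VERIFY_TOKEN):
        return args.get("hub.challenge")
    return None
```

In the Flask app this logic runs inside the GET handler for your webhook route, with `args` coming from `request.args`.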
Step 4: Set up the Facebook webhook with ngrok
- Open a new terminal tab or window and navigate to the directory where you installed ngrok.
- Start ngrok by running the command:
./ngrok http 5000
- Note the "Forwarding" URL displayed in the ngrok console. This is the URL you will use as your callback URL in the Facebook Developer portal.
- Go back to your Facebook Developer portal and navigate to your app's Messenger settings.
- Under the Webhooks section, click on the "Setup Webhooks" button.
- Enter the ngrok Forwarding URL as the Callback URL, enter the verify token you set in app.py, and subscribe your page to message events.
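Once the webhook is live, Messenger delivers message events to your server as JSON POST bodies. A sketch of pulling the sender ID and message text out of one (the field layout follows the Messenger Platform webhook event format; extract_messages is a hypothetical helper):

```python
# Walk a Messenger webhook event body and collect (sender_id, text) pairs.
# Events without a text message (e.g. attachments) are skipped.

def extract_messages(event):
    """Extract (sender_id, text) pairs from a Messenger webhook POST body."""
    results = []
    for entry in event.get("entry", []):
        for messaging in entry.get("messaging", []):
            sender_id = messaging.get("sender", {}).get("id")
            text = messaging.get("message", {}).get("text")
            if sender_id and text:
                results.append((sender_id, text))
    return results

sample = {
    "object": "page",
    "entry": [{"messaging": [{"sender": {"id": "123"},
                              "message": {"text": "hi"}}]}],
}
```

In app.py the equivalent loop runs inside the POST handler, with the body coming from `request.get_json()`.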
Step 5: Test the chatbot
- Go to your Facebook page and send a message to your chatbot.
- Check the console of your Flask server to see the input message and the response from OpenAI GPT-3.
- Check the Facebook Messenger conversation to see the chatbot's response.
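The reply you see in Messenger is delivered through the Send API: a POST to the Graph API messages endpoint, authenticated with your Page Access Token. A sketch of that request (build_send_request is a hypothetical helper, and the Graph API version shown is an assumption):

```python
# Sketch of the Messenger Send API call app.py makes to reply to a user.
# The Graph API version in the URL is an assumption; match your app.py.

def build_send_request(recipient_id, text, page_token):
    """Build the URL, query params, and JSON body for a Send API call."""
    url = "https://graph.facebook.com/v2.6/me/messages"  # assumed version
    params = {"access_token": page_token}
    body = {
        "recipient": {"id": recipient_id},
        "message": {"text": text},
    }
    return url, params, body
```

POSTing `body` to `url` with these `params` (again via the `requests` package) pushes the generated text back into the conversation.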
Step 6: Deploy the chatbot
- Once you're happy with the chatbot's functionality, you can deploy it to a server so that it can run 24/7.
- There are many ways to deploy a Flask server, including using services like Heroku, AWS Elastic Beanstalk, or Google Cloud Run.
- Follow the instructions for your chosen deployment method to upload your Flask app and run it on a server.
Modify app.py (e.g. with nano app.py) if the API is not working:
- Change OPEN_AI_API to your API key
- Change PAGE_TOKEN to your Page Access Token
That's it! You now have a Facebook Messenger chatbot that uses OpenAI GPT-3 to generate responses. You can customize the chatbot's behavior by modifying the code in app.py.