Kejsan/AI-Powered-Prompt-Optimizer
AI Prompt Optimizer

An AI-powered web application that helps users improve their prompts for large language models (LLMs). This tool analyzes a user's prompt and offers suggestions for improvement tailored to different expertise levels.

Features

  • AI-Powered Analysis: Uses the Google Gemini API to provide intelligent feedback on your prompts.
  • Multiple Expertise Levels: Get suggestions tailored to your skill level:
    • Novice: Simple, high-impact tips for beginners.
    • Expert: Advanced techniques for experienced prompters.
    • Do it for Me: Let the AI rewrite your prompt for you.
  • Context-Aware Suggestions: Add notes to give the AI context about your goals, and get more relevant advice.
  • Rate Limiting: Built-in per-IP request limits to prevent abuse and ensure service availability.
  • Secure: Your API keys are kept secure on the backend and are never exposed to the client.

How It Works

This project is a single-page application (SPA) with a serverless backend, designed for deployment on Netlify.

  • Frontend: A static index.html file that uses Tailwind CSS for styling and vanilla JavaScript to handle user interactions.
  • Backend: A Netlify Serverless Function (getpromptsuggestions.js) acts as a secure proxy to the Google Gemini API. It constructs a detailed system prompt based on the user's input and sends it to the API.
  • Edge Logic: A Netlify Edge Function (rate-limiter.js) intercepts requests to the backend function to enforce rate limiting based on the user's IP address. It uses Netlify Blobs to track request timestamps.
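The actual handler lives in getpromptsuggestions.js; the sketch below only illustrates the proxy pattern described above. The system-prompt wording, persona text, and the Gemini endpoint and model name are illustrative assumptions, not the repository's real values:

```javascript
// A minimal sketch of the serverless proxy pattern, assuming a Gemini REST
// endpoint and persona wording -- not the repository's actual code.

// Builds the system prompt from the user's input, expertise level, and notes.
function buildSystemPrompt(userPrompt, level, notes) {
  const personas = {
    novice: "Give simple, high-impact tips for a beginner.",
    expert: "Suggest advanced prompting techniques.",
    rewrite: "Rewrite the prompt directly.",
  };
  const context = notes ? `\nUser context: ${notes}` : "";
  return `You are a prompt-engineering assistant. ${personas[level]}${context}\n\nPrompt to review:\n${userPrompt}`;
}

// In the real function file this would be exported via `exports.handler`.
// It receives the user's prompt, calls Gemini with the server-side key
// (so the key never reaches the client), and returns the suggestions.
async function handler(event) {
  const { prompt, level, notes } = JSON.parse(event.body || "{}");
  const res = await fetch(
    `https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=${process.env.GEMINI_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        contents: [{ parts: [{ text: buildSystemPrompt(prompt, level, notes) }] }],
      }),
    }
  );
  const data = await res.json();
  return { statusCode: 200, body: JSON.stringify(data) };
}
```

Because the key is read from `process.env` inside the function, the browser only ever talks to the function's own endpoint.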
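The sliding-window check the edge function performs can be sketched as follows. The window size, request limit, and Blobs store name here are assumptions for illustration; the real logic lives in rate-limiter.js:

```javascript
// An illustrative sketch of per-IP sliding-window rate limiting backed by
// Netlify Blobs; the 10-requests-per-minute limit and the "rate-limits"
// store name are assumptions, not the repository's actual values.
const WINDOW_MS = 60_000; // 1-minute window
const LIMIT = 10;         // max requests per window per IP

// Pure helper: drop timestamps outside the window, then decide.
function checkWindow(timestamps, now) {
  const recent = timestamps.filter((t) => now - t < WINDOW_MS);
  return { allowed: recent.length < LIMIT, recent };
}

// In the real edge function this would be the default export. It reads the
// caller's IP, loads its request history from Netlify Blobs, and either
// forwards the request to the serverless function or returns 429.
async function rateLimiter(request, context) {
  const { getStore } = await import("@netlify/blobs");
  const store = getStore("rate-limits");
  const history = (await store.get(context.ip, { type: "json" })) || [];
  const { allowed, recent } = checkWindow(history, Date.now());
  if (!allowed) {
    return new Response("Too Many Requests", { status: 429 });
  }
  await store.setJSON(context.ip, [...recent, Date.now()]);
  return context.next(); // forward to the serverless function
}
```

Pruning expired timestamps before counting keeps the stored history bounded to at most one window per IP.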

Getting Started

Follow these instructions to set up the project for local development.

Prerequisites

  • Node.js
  • Netlify CLI: You'll need to install and authenticate the Netlify CLI to run the project locally.
    npm install -g netlify-cli

Setup

  1. Clone the repository:

    git clone <repository-url>
    cd <repository-directory>
  2. Set up environment variables: This project requires a Google Gemini API key. Create a file named .env in the root of the project with the following content:

    GEMINI_API_KEY=your_gemini_api_key_here
    

    Important: The netlify dev command will automatically load environment variables from a .env file. Do not commit this file to your repository. Add .env to your .gitignore file.

Local Development

To run the application locally, use the dev script defined in package.json.

npm run dev

This command starts the Netlify local development server, which will serve the index.html file and run the serverless and edge functions. The application will be available at http://localhost:8888.

Deployment

This project is configured for seamless deployment to Netlify.

  1. Connect your repository to a Netlify site.
  2. Configure Environment Variables: In your Netlify site's settings, go to Build & deploy > Environment and add your GEMINI_API_KEY. This will make the key available to your serverless function in the production environment.
  3. Trigger a deploy: Netlify will automatically build and deploy your site when you push changes to your connected Git branch.

The netlify.toml file in the repository contains the necessary configuration for redirects, edge functions, and build settings.
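A netlify.toml wiring these pieces together might look like the sketch below; the publish directory, functions directory, and route path are illustrative assumptions, so the repository's actual file is authoritative:

```toml
# Illustrative sketch only -- consult the repository's real netlify.toml.

[build]
  publish = "."            # serve the static index.html from the repo root

[functions]
  directory = "netlify/functions"   # location of getpromptsuggestions.js (assumed)

# Run the rate limiter in front of the serverless function's route.
[[edge_functions]]
  function = "rate-limiter"
  path = "/.netlify/functions/getpromptsuggestions"
```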

About

The tool analyzes a user's prompt and, based on the selected expertise level ("Novice" or "Expert"), provides actionable suggestions grounded in the principles from "The Prompt Report." A "Do it for Me" option leverages the Gemini API to rewrite the prompt automatically.
