AIPACK - Run, Build, and Share AI Packs

Check out the site: https://aipack.ai for more information and links.

Open-source Agentic Runtime to run, build, and share AI Packs.

  • Supports all major AI providers and models.
  • Efficient and small (< 20MB), with zero dependencies.
  • Built in Rust using Lua for embedded scripting (small and efficient).
  • Runs locally, completely IDE-agnostic.
  • Or in the cloud, on servers or serverless.

Quick Start

Install

For now, installation requires building directly from source via Rust. Works on all major OSes.

NOTE: Ironically, while the binary is relatively small (<20MB with batteries included), the build process can take up quite a bit of space. However, Cargo should clean it up afterward. Binaries and installation instructions will be available at https://aipack.ai

DISCLAIMER: For now, v0.6.x, AIPACK works on Linux & Mac, and requires WSL on Windows.

IMPORTANT: Proper Windows support is coming sometime in v0.6.x and definitely by v0.7.x (around mid-to-late March).

Run

# In the terminal, go to your project
cd /path/to/my/project/

# Initialize workspace .aipack/ and ~/.aipack-base
aip init

# Make sure to export the desired API key
export OPENAI_API_KEY="sk...."
export ANTHROPIC_API_KEY="...."
export GEMINI_API_KEY="..."
# For more keys, see below

# To proofread your README.md (namespace: demo, pack_name: proof)
aip run demo@proof -f ./README.md

# You can just use @pack_name if there is no other pack with this name
aip run @proof -f ./README.md

# To do some code crafting (will create `_craft-code.md`)
aip run demo@craft/code

# Or run your .aip file (you can omit the .aip extension)
aip run path/to/file.aip

# This is a good agent to run to ask questions about aipack
# It can even generate aipack code
aip run core@ask-aipack
# The prompt file will be at `.aipack/.prompt/core@ask-aipack/ask-prompt.md`

pro@coder

  • Install it with aip install pro@coder.
  • Then run it with aip run pro@coder, or aip run @coder if you don't have another @coder pack in a different namespace.

This is the agent I use every day for my production coding.

IMPORTANT 1: Make sure everything is committed before use (at least while you are learning about aipack).

IMPORTANT 2: Make sure you have your API key set as an environment variable (on Mac, there is experimental keychain support):

OPENAI_API_KEY
ANTHROPIC_API_KEY
GEMINI_API_KEY
XAI_API_KEY
DEEPSEEK_API_KEY
GROQ_API_KEY
COHERE_API_KEY
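A note on shell syntax: `export NAME=value` must have no spaces around the `=`, otherwise the shell treats the name and value as separate command arguments instead of an assignment. A quick sanity check (the key value here is just a placeholder):

```shell
# Correct: no spaces around =
export OPENAI_API_KEY="sk-test"
echo "$OPENAI_API_KEY"   # prints sk-test
```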

Info

  • Website: https://aipack.ai

  • AIPACK Overview Video

  • Preview 'devai' intro video for v0.5

  • Built on top of the Rust genai library, which supports many top AI providers and models (OpenAI, Anthropic, Gemini, DeepSeek, Groq, Ollama, xAI, and Cohere).

  • Top new features: (see full CHANGELOG)

    • 2025-03-02 (v0.6.7) - Fixes and tune-up. Pack install test and other refactoring
    • 2025-03-02 (v0.6.4) - Fixes, and now supports the first repo pack aip install jc@coder
    • 2025-02-28 (v0.6.3) - aip pack .., aip install local..., ai_response.price_usd, and more
    • 2025-02-26 (v0.6.0) - BIG UPDATE - to AIPACK, now with pack support (e.g., aip run demo@craft/code)
    • 2025-02-22 (v0.5.11) - Huge update with parametric agents and coder (more info soon)
    • 2025-01-27 (v0.5.9) - DeepSeek distill models support for Groq and Ollama (local)
    • 2025-01-23 (v0.5.7) - aipack run craft/text or aipack run craft/code (example of new agent module support)
    • 2025-01-06 (v0.5.4) - DeepSeek deepseek-chat support
    • 2024-12-08 (v0.5.1) - Added support for xAI
  • WINDOWS DISCLAIMER:

    • This CLI uses a path scheme similar to Mac/Unix-like systems, which might not function correctly in the Windows cmd.exe (Command Prompt) or traditional batch files (.bat).
    • Full Windows local path support is in development.
    • RECOMMENDATION: Use PowerShell or WSL on Windows. Please log issues if minor changes can improve support for Windows PowerShell/WSL.

How it works

  • One Agent == One Markdown
    • An .aip agent file is just a Markdown file with sections for each stage of the agent's processing.
    • See below for all the possible stages.
  • aip run demo@proof -f "./*.md"
    • Runs the installed agent file main.aip of the pack proof in the namespace demo.
    • Full path: ~/.aipack-base/pack/installed/demo/proof/main.aip
    • You can pass input to your agent using:
      • -f "path/with/optional/**/glob.*" -f "README.md" (the Lua code will receive a {path = .., name =..} FileMeta-like structure as input)
      • -i "some string" -i "another input" (the Lua code will receive these strings as input)
      • Each input triggers one run of the agent.
  • aip run some/path/to/agent
    • If the path ends with .aip, it's treated as a direct file run.
    • If there is no .aip extension, then:
      • .../agent.aip will be executed if it exists.
      • or .../agent/main.aip will be executed if it exists.
  • aipack agents are simple .aip files that can be placed anywhere on disk.
    • e.g., aip run ./my-path/to/my-agent.aip ...
  • Multi AI Provider / Models - aipack uses genai and therefore supports providers like OpenAI, Anthropic, Gemini, Groq, Ollama, Cohere, and more.
  • Lua is used for all scripting (thanks to the great mlua crate).
  • Handlebars is used for all prompt templating (thanks to the great Rust native handlebars crate).

Multi Stage

A single aipack file may comprise any of the following stages.

| Stage           | Language   | Description |
|-----------------|------------|-------------|
| `# Before All`  | Lua        | Reshape/generate inputs and add global data to the command scope (the "map" part of map/reduce). |
| `# Data`        | Lua        | Gather additional data per input and return it for the next stages. |
| `# System`      | Handlebars | Customize the system prompt using data from `# Before All` and `# Data`. |
| `# Instruction` | Handlebars | Customize the instruction prompt using data from `# Before All` and `# Data`. |
| `# Assistant`   | Handlebars | Optional, for special customizations such as the "Jedi Mind Trick". |
| `# Output`      | Lua        | Process the `ai_response` from the LLM. If not defined, `ai_response.content` is output to the terminal. |
| `# After All`   | Lua        | Called with inputs and outputs for post-processing after all inputs are processed (the "reduce" part of map/reduce). |
  • # Before All and # After All act like the map and reduce steps, running before and after the main input processing loop, respectively.
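Putting the stages together, a minimal agent file might look like the sketch below. The section names come from the table above, but the exact variable names (`input`, `data` in the Handlebars templates, `ai_response` in `# Output`) are assumptions based on this README; see the core documentation (core/doc/README.md) for the authoritative API.

````md
# Data

```lua
-- `input` is the FileMeta-like table ({path = .., name = ..}) passed via -f
return { file_name = input.name }
```

# Instruction

Proofread the Markdown file named {{data.file_name}} and list any typos you find.

# Output

```lua
-- `ai_response.content` holds the LLM reply; without this section it would
-- simply be printed to the terminal
return ai_response.content
```
````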

More info on stages

See the aipack documentation at core/doc/README.md (with the Lua modules doc).

You can also run the ask-aipack agent.

# IMPORTANT: Make sure you have the `OPENAI_API_KEY` (or the key for your desired model) set in your environment
aip run core@ask-aipack
# The prompt file will be at `.aipack/.prompt/core@ask-aipack/ask-prompt.md`

License

Dual-licensed under Apache-2.0 (LICENSE-APACHE) and MIT (LICENSE-MIT).
