Releases: m-marinucci/LANCompute

LANCompute v0.1.0 — LM Studio/Ollama integration, CLI helper, tests, and docs

16 Sep 22:02

Summary

  • Adds a minimal OpenAI-compatible workflow to exercise a local LLM service (LM Studio API server or Ollama).
  • Introduces a tiny CLI for quick prompts and model discovery.
  • Includes an integration test to verify /v1/models and /v1/chat/completions.
  • Adds environment templates, updates docs, and improves developer ergonomics.
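
The workflow targets the two standard OpenAI-compatible endpoints named above. As a minimal sketch (the base URL, model id, and payload field values here are illustrative, not fixed by this release):

```python
# Sketch of the two OpenAI-compatible endpoints exercised by this release.
# Base URL and model id below are illustrative defaults, not fixed values.

BASE_URL = "http://127.0.0.1:1234"  # LM Studio's default local server port

# GET {BASE_URL}/v1/models returns a list payload shaped roughly like:
models_response = {
    "object": "list",
    "data": [{"id": "mistral:latest", "object": "model"}],
}

# POST {BASE_URL}/v1/chat/completions takes a request body shaped like:
chat_request = {
    "model": "mistral:latest",
    "messages": [{"role": "user", "content": "Give me one fun fact."}],
}

# Model discovery is just extracting the ids from the list payload:
model_ids = [m["id"] for m in models_response["data"]]
print(model_ids)  # ['mistral:latest']
```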

Highlights

  • CLI helper: scripts/lmstudio_chat.py
    • Lists models and sends quick chat prompts to an OpenAI-compatible endpoint.
    • Reads .env automatically and respects LM_STUDIO_BASE_URL.
    • Works out-of-the-box against http://127.0.0.1:1234 (override via env or flag).
  • Makefile targets
    • make models lists models via the CLI helper.
    • make chat MODEL=<id> PROMPT="<text>" sends a quick prompt.
    • make setup upgrades pip and ensures requests is present.
  • Integration test: tests/test_lmstudio_integration.py
    • Discovers available models, prefers lightweight IDs, and runs a short chat completion.
    • Environment overrides: LM_STUDIO_BASE_URL, LM_STUDIO_TEST_MODEL, LM_STUDIO_TEST_PROMPT.
  • Configuration and docs
    • .env.example for public-safe defaults.
    • .envrc auto-activates .venv and loads .env (with direnv).
    • .gitignore ignores .env and .direnv/.
    • README adds integration guide, CLI usage, env setup, and test instructions.
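
The base-URL resolution the CLI helper performs could plausibly look like the following (the function name `resolve_base_url` is hypothetical; the env variable, flag override, and default come from the bullets above):

```python
import os

DEFAULT_BASE_URL = "http://127.0.0.1:1234"  # documented default above

def resolve_base_url(flag_value=None):
    """Resolve the API base URL: CLI flag wins, then env var, then default.

    Hypothetical sketch: the precedence mirrors the behavior described
    above (a flag or LM_STUDIO_BASE_URL overrides the default).
    """
    if flag_value:
        return flag_value.rstrip("/")
    return os.environ.get("LM_STUDIO_BASE_URL", DEFAULT_BASE_URL).rstrip("/")

# With no flag and no env var set, the default applies:
os.environ.pop("LM_STUDIO_BASE_URL", None)
print(resolve_base_url())  # http://127.0.0.1:1234
```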

Getting Started

python -m venv .venv
source .venv/bin/activate
pip install -U pip requests pytest
# optional: direnv allow
cp .env.example .env
# set LM_STUDIO_BASE_URL (default: http://127.0.0.1:1234)

Usage

python scripts/lmstudio_chat.py --list-models
make models
python scripts/lmstudio_chat.py --model mistral:latest --prompt "Give me one fun fact."
make chat MODEL=mistral:latest PROMPT="Give me one fun fact."
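
Equivalent to the CLI calls above, a direct request against the chat endpoint can be sketched with `requests` (the base URL and model id are illustrative; the actual network call is left commented out since it requires a running server):

```python
import json

BASE_URL = "http://127.0.0.1:1234"  # assumed LM Studio default; override as needed
url = f"{BASE_URL}/v1/chat/completions"
payload = {
    "model": "mistral:latest",
    "messages": [{"role": "user", "content": "Give me one fun fact."}],
}

# To actually send it (requires a running LM Studio/Ollama server):
# import requests
# resp = requests.post(url, json=payload, timeout=60)
# print(resp.json()["choices"][0]["message"]["content"])

print(url)
print(json.dumps(payload, indent=2))
```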

Integration Test

pytest -q tests/test_lmstudio_integration.py

Environment overrides:

  • LM_STUDIO_BASE_URL — API base URL
  • LM_STUDIO_TEST_MODEL — preferred model id (optional)
  • LM_STUDIO_TEST_PROMPT — short test prompt (optional)
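
The test's "prefers lightweight IDs" step could be approximated like this (the keyword list and function name are assumptions for illustration, not the test's actual heuristic):

```python
# Hypothetical sketch of preferring a lightweight model among discovered IDs.
LIGHT_HINTS = ("mini", "small", "tiny", "1b", "3b", "7b")  # assumed keywords

def pick_test_model(model_ids, preferred=None):
    """Pick a model: explicit override first, then a 'lightweight' ID, else the first."""
    if preferred and preferred in model_ids:
        return preferred
    for mid in model_ids:
        if any(hint in mid.lower() for hint in LIGHT_HINTS):
            return mid
    return model_ids[0]

ids = ["llama3:70b", "phi3:mini", "mistral:latest"]
print(pick_test_model(ids))  # phi3:mini
```

An LM_STUDIO_TEST_MODEL override would map to the `preferred` argument, keeping the heuristic as a fallback.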

Security Notes

  • Expose your LLM service only to trusted networks.
  • Prefer binding locally (127.0.0.1); use an SSH tunnel for remote access during development.
  • If you bind to 0.0.0.0, ensure access is controlled (firewall, VLAN, or reverse proxy).

Changelog

  • Add scripts/lmstudio_chat.py CLI for models and chat
  • Add Makefile targets: models, chat, setup
  • Add tests/test_lmstudio_integration.py with env overrides
  • Add .env.example; load .env in CLI and tests
  • Update .envrc to source .env
  • Update .gitignore to exclude .env and .direnv/
  • Expand README with integration guide and usage