

Repositories

Showing 3 of 3 repositories
  • exllamav3 (Public)

    An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs

    Python · 405 stars · MIT license · 30 forks · 21 open issues · 1 open PR · Updated Jun 19, 2025
  • exllamav2 (Public)

    A fast inference library for running LLMs locally on modern consumer-class GPUs

    Python · 4,214 stars · MIT license · 318 forks · 127 open issues · 19 open PRs · Updated Jun 5, 2025
  • exui (Public)

    Web UI for ExLlamaV2

    JavaScript · 501 stars · MIT license · 47 forks · 33 open issues · 3 open PRs · Updated Feb 6, 2025
