
Transformer


Generative Pre-trained Transformer

Warnings

  • This module uses a pre-trained transformer to generate predictive responses (a sketch follows this list).
  • Because machine-learning models are large, this feature is disabled in limited mode.
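
The page does not name the underlying library, so the following is only a minimal sketch of the idea, assuming the Hugging Face transformers package and a small placeholder model (gpt2) chosen purely for illustration, not the module's actual implementation:

```python
# A minimal sketch, NOT the module's actual implementation: the backend is
# not specified on this page, so this assumes the Hugging Face `transformers`
# package and the small "gpt2" model purely for illustration.
from transformers import pipeline

# Load a pre-trained text-generation model once at startup.
generator = pipeline("text-generation", model="gpt2")

def predictive_response(prompt: str) -> str:
    """Generate a short predictive response for the given prompt."""
    result = generator(prompt, max_new_tokens=50, num_return_sequences=1)
    return result[0]["generated_text"]

if __name__ == "__main__":
    print(predictive_response("What is the weather like on Mars?"))
```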

Requirements

Minimum RAM availability (a runtime check is sketched after this list)

  • 8 GB to run the 7B models
  • 16 GB to run the 13B models
  • 32 GB to run the 33B models
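
Since the feature is disabled when memory is scarce, one natural implementation is to compare available RAM against these thresholds before loading a model. A minimal sketch, assuming the psutil package; the function name and selection logic are hypothetical:

```python
# A minimal sketch of gating the feature on available memory, assuming the
# `psutil` package. This is an illustration, not the module's actual logic.
import psutil

# Minimum available RAM (in GB) per model size, from the list above.
MIN_RAM_GB = {"7B": 8, "13B": 16, "33B": 32}

def largest_supported_model() -> str | None:
    """Return the largest model size the current machine can run, if any."""
    available_gb = psutil.virtual_memory().available / (1024 ** 3)
    for size in ("33B", "13B", "7B"):
        if available_gb >= MIN_RAM_GB[size]:
            return size
    return None  # limited mode: feature stays disabled

if __name__ == "__main__":
    model = largest_supported_model()
    print(f"Selected model: {model}" if model else "Insufficient RAM; feature disabled.")
```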
