Stefan Lionar
Jiabin Liang
Gim Hee Lee
Sea AI Lab, Garena · National University of Singapore

This is the official repository of 🌲 TreeMeshGPT: Artistic Mesh Generation with Autoregressive Tree Sequencing.
TreeMeshGPT is an autoregressive Transformer designed to generate high-quality artistic meshes from input point clouds. Unlike conventional autoregressive models that rely on next-token prediction, TreeMeshGPT retrieves the next token from a dynamically growing tree structure, enabling localized mesh extensions and enhanced generation quality. Our novel Autoregressive Tree Sequencing method introduces an efficient face tokenization strategy, achieving a 22% compression rate compared to naive tokenization. This approach reduces training difficulty while allowing for the generation of meshes with finer details and consistent normal orientation. With 7-bit discretization, TreeMeshGPT supports meshes with up to 5,500 faces, while the 9-bit model extends this capability to 11,000 faces.
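The face-count limits follow from the coordinate discretization: a b-bit model quantizes each normalized vertex coordinate into 2^b levels (128 for 7-bit, 512 for 9-bit). Below is a minimal sketch of this kind of uniform quantization; the normalization range `[-0.5, 0.5]` and the function names are illustrative assumptions, not taken from the repository:

```python
def quantize(coords, bits=7):
    # Map normalized coordinates in [-0.5, 0.5] to integer bins in [0, 2**bits - 1].
    n = (1 << bits) - 1
    return [round((c + 0.5) * n) for c in coords]

def dequantize(bins, bits=7):
    # Inverse mapping; reconstruction error is at most half a bin width.
    n = (1 << bits) - 1
    return [b / n - 0.5 for b in bins]
```

Doubling the bit depth per axis (7 → 9 bits) gives a 4x finer grid per coordinate pair, which is why the 9-bit model can represent meshes with roughly twice as many faces before exhausting its vertex vocabulary.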
Table of Contents
- Google Colab demo – Run TreeMeshGPT in your browser.
- Inference – Generate an artistic mesh conditioned on a point cloud sampled from a dense mesh (`inference.py`)
- Tokenizer – Create input-output pairs for Autoregressive Tree Sequencing (`tokenizer.py`)
- Training script
- Dataset
To set up TreeMeshGPT, follow the steps below:
```shell
git clone https://github.com/sail-sg/TreeMeshGPT.git
cd TreeMeshGPT
conda create -n tmgpt python=3.11
conda activate tmgpt
pip install torch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 --index-url https://download.pytorch.org/whl/cu124
```

PyTorch ≥ 2.5.0 is required because we use FlexAttention for the point cloud condition during training. Older versions should work for inference.

```shell
pip install -r requirements.txt
```
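If you are unsure whether your installed PyTorch build ships FlexAttention, a quick import probe like the following can help. This is a small sketch (the helper name is ours); it checks the module path where FlexAttention has lived since PyTorch 2.5:

```python
import importlib.util

def flex_attention_available() -> bool:
    """Return True if the installed PyTorch exposes FlexAttention (PyTorch >= 2.5)."""
    if importlib.util.find_spec("torch") is None:
        return False  # PyTorch is not installed at all
    try:
        # FlexAttention ships under this path since PyTorch 2.5
        from torch.nn.attention.flex_attention import flex_attention  # noqa: F401
        return True
    except ImportError:
        return False
```

If this returns `False` with PyTorch installed, inference should still work, but training will not.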
Download and arrange the pre-trained models using the following commands:
```shell
mkdir checkpoints
# 7-bit model
gdown 1IERM76szBQq9oAMoFw1Sbgp3kx97_Kib
mv treemeshgpt_7bit.pt checkpoints/
# 9-bit model
gdown 19Auv48x7kgoODRS7dij8QijWzDqRjq97
mv treemeshgpt_9bit.pt checkpoints/
```
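After downloading, you can verify that both checkpoints are in place with a small stdlib-only check. The paths match the commands above; the function name is our own illustrative helper:

```python
from pathlib import Path

# Expected checkpoint locations, relative to the repository root
EXPECTED = {
    "7bit": Path("checkpoints/treemeshgpt_7bit.pt"),
    "9bit": Path("checkpoints/treemeshgpt_9bit.pt"),
}

def missing_checkpoints(root="."):
    """Return the sorted list of model versions whose checkpoint file is absent."""
    return sorted(name for name, rel in EXPECTED.items()
                  if not (Path(root) / rel).exists())
```

An empty list means both models are ready for `inference.py`.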
Alternatively, you can download them manually from the following links and place them inside the `checkpoints` folder:
We provide a demo of artistic mesh generation conditioned on a point cloud sampled from a dense mesh. The dense meshes are generated using the text-to-3D model from Luma AI. Our demo can also be run on Google Colab.
Use the following command to generate a mesh:
```shell
python inference.py
```

The output will be saved to the `generation` folder.
| Argument | Type | Default | Description |
|---|---|---|---|
| `--version` | `str` | `7bit` | Select model version: `7bit` or `9bit`. |
| `--ckpt_path` | `str` | `./checkpoints/treemeshgpt_7bit.pt` (7-bit) / `./checkpoints/treemeshgpt_9bit.pt` (9-bit) | Path to the model checkpoint. |
| `--mesh_path` | `str` | `demo/luma_cat.glb` | Path to the input mesh file. |
| `--decimation` | `bool` | `True` | Enable or disable mesh decimation. Recommendation: `True` if the input is a dense mesh and `False` if it is an Objaverse mesh. |
| `--decimation_target_nfaces` | `int` | `6000` | Target number of faces after decimation. Use a smaller number if the generated mesh contains too many small triangles. |
| `--decimation_boundary_deletion` | `bool` | `True` (7-bit) / `False` (9-bit) | Allow deletion of boundary vertices of the decimated mesh. Set to `True` if the generated mesh contains too many small triangles. |
| `--sampling` | `str` | `uniform` (7-bit) / `fps` (9-bit) | Point sampling method: `uniform` for 7-bit and `fps` for 9-bit. |
- Run with the 9-bit model and a mesh specified in `--mesh_path`:

  ```shell
  python inference.py --version 9bit --mesh_path demo/luma_bunny.glb
  ```

- Set `--decimation_boundary_deletion` to `True` and optionally use a lower `--decimation_target_nfaces` if the default configuration results in meshes with too many small triangles:

  ```shell
  python inference.py --version 9bit --mesh_path demo/luma_box.glb --decimation_boundary_deletion True --decimation_target_nfaces 2000
  ```

- Run without decimation (e.g., for Objaverse evaluation):

  ```shell
  python inference.py --decimation False --mesh_path demo/objaverse_pig.obj
  ```
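To sanity-check a generated mesh without extra dependencies, you can count its vertices and faces directly. This minimal sketch assumes the output is a Wavefront OBJ file, which is an assumption on our part; adapt it if the generated format differs:

```python
def obj_stats(path):
    """Count vertex ('v') and face ('f') lines in a Wavefront OBJ file."""
    nv = nf = 0
    with open(path) as f:
        for line in f:
            if line.startswith("v "):
                nv += 1
            elif line.startswith("f "):
                nf += 1
    return nv, nf
```

The face count lets you quickly confirm the output stays within the model's capacity (about 5,500 faces for 7-bit, 11,000 for 9-bit).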
Our code is built on top of the PivotMesh codebase. Our work is also inspired by these projects: