
[bug]: Cannot determine base type #7703

Open
1 task done
tvortsa opened this issue Feb 27, 2025 · 2 comments
Labels
bug Something isn't working

Comments

tvortsa commented Feb 27, 2025

Is there an existing issue for this problem?

  • I have searched the existing issues

Operating system

Windows

GPU vendor

Nvidia (CUDA)

GPU model

3060

GPU VRAM

6 GB

Version number

5.6.2

Browser

Chrome

Python dependencies

Again and again, the same trouble: "Cannot determine base type".

Maybe give me a way to set the "base type" myself?

"base type" is SD 3.5 !

What happened

I tried to install a model from a file.

What you expected to happen

The new model gets installed.

How to reproduce the problem

Try to install this model from a file.

Additional context

Why is it so hard to handle these few "base types"?

Discord username

No response

tvortsa added the bug (Something isn't working) label on Feb 27, 2025
JamesClarke7283 commented Feb 28, 2025

Hello,
Could you please give us a link to the model weights on HF (or wherever you got them) to make this easier to troubleshoot? Ideally also include the log output showing the exact model file names mentioned under the error. Often it's an issue with the model author's config, especially if it's a finetune. Additionally, installing from a local file might be the issue: safetensors and GGUF files don't always carry base-model info in them.
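If you want to check for yourself whether the file carries any base-model hint, the rough sketch below (plain Python, not InvokeAI's actual probe; the file name is just a placeholder) dumps the safetensors header, which is where that kind of metadata would live if the author included it:

```python
# Minimal sketch: read the JSON header of a .safetensors file without
# loading any tensors. The file name is hypothetical.
import json
import struct

def read_safetensors_header(path: str) -> dict:
    """Return the parsed JSON header of a .safetensors file."""
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]  # first 8 bytes = header length (little-endian u64)
        return json.loads(f.read(header_len))

header = read_safetensors_header("my_sd35_finetune.safetensors")  # hypothetical file name
print(header.get("__metadata__", {}))                         # free-form metadata block, often empty
print(sorted(k for k in header if k != "__metadata__")[:10])  # first few tensor names
```

If `__metadata__` comes back empty, the installer presumably has to guess the base purely from the tensor names and shapes, which is where detection can fall over for newer architectures like SD 3.5.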

Also, your InvokeAI version is old; the latest stable is 5.7.1. You can update through the InvokeAI launcher, I think; that's what I do on my Linux system.

@Asherathe

I frequently have this problem with VAEs, for instance kl-f8-anime2, anything, and blessed_2. Every version of these VAEs I've tried, from any source (CivitAI or different repos on HF), has the same issue. It hasn't happened to me with any other type of file.
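For what it's worth, listing the tensor names in one of these VAE files (rough sketch below; hypothetical path, and it assumes the safetensors package is installed) shows only generic encoder/decoder keys, with nothing that names a base model, so there may simply be nothing in the file for the installer to go on:

```python
# Quick sketch: list tensor names in a standalone VAE checkpoint.
# The path is hypothetical; any .safetensors VAE will do.
from safetensors import safe_open

with safe_open("kl-f8-anime2.safetensors", framework="pt") as f:
    keys = list(f.keys())

print(len(keys))
print([k for k in keys if k.startswith("decoder.")][:5])
print([k for k in keys if k.startswith("encoder.")][:5])
```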
