How to load a PEFT adapter with per-layer rank variation via from_pretrained? #2578
-
Hi all, I'm working on a PEFT variant where each adapter layer has its own rank, instead of using a globally uniform rank (as in standard LoRA).
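For example, the kind of per-layer rank assignment I have in mind looks roughly like this (illustrative only; the module names and ranks are made up):

```python
# Illustrative only: the per-layer ranks I'd like the adapter to use.
# Module names are hypothetical and depend on the base model architecture.
per_layer_rank = {
    "model.layers.0.self_attn.q_proj": 4,
    "model.layers.0.self_attn.v_proj": 8,
    "model.layers.12.self_attn.q_proj": 16,
}
# For a Linear(in_features, out_features) target with rank r, the saved LoRA
# weights have shapes lora_A: (r, in_features) and lora_B: (out_features, r),
# so the shapes in my checkpoint vary from layer to layer.
```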
However, when I call `from_pretrained`, the LoRA layers are allocated with the uniform rank from the config rather than the per-layer ranks, so the adapter weights cannot be loaded.

What I'm looking for: is there a recommended way to delay or override adapter allocation inside `from_pretrained`? Or is there a hook or extension point that lets me customize how the LoRA layers are created during loading? Any guidance would be appreciated! Thanks!
-
When creating the `LoraConfig`, you can pass `rank_pattern` with specific `r` values for specific layers.
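For example, here is a rough (untested) sketch; the base model and the `rank_pattern` keys are placeholders to adapt to your own architecture:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, PeftModel, get_peft_model

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

config = LoraConfig(
    r=8,                                  # default rank for targeted layers
    target_modules=["q_proj", "v_proj"],
    rank_pattern={
        # per-layer overrides of the default `r`
        "model.decoder.layers.0.self_attn.q_proj": 4,
        "model.decoder.layers.11.self_attn.v_proj": 16,
    },
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, config)
model.save_pretrained("opt-350m-variable-rank-lora")

# rank_pattern is serialized into adapter_config.json, so from_pretrained
# re-creates each LoRA layer with its per-layer rank before loading weights.
base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")
reloaded = PeftModel.from_pretrained(base, "opt-350m-variable-rank-lora")
```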