Add validation for lora_dropout #316


Merged: 6 commits into main from egor/lora-dropout on Jun 9, 2025

Conversation

@timofeev1995 (Contributor):

Have you read the Contributing Guidelines?

Issue: LoRA dropout can be set to values outside the 0-1 range

Describe your changes

Add range validation to the lora_dropout parameter.
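
The merged diff below shows only the first line of the new check, so here is a minimal sketch of what this kind of range validation typically looks like. The inclusive 0-1 bounds and the error message are assumptions based on the issue description, not the merged code:

# Hypothetical sketch only; the actual check lives in create_finetune_request
# and only its first line is visible in the diff below.
if lora_dropout is not None:
    if not 0.0 <= lora_dropout <= 1.0:  # assumed bounds, per "0-1 range"
        raise ValueError(
            f"lora_dropout must be within the 0-1 range, got {lora_dropout}"
        )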

@timofeev1995 requested review from mryab and artek0chumak and removed the request for mryab on Jun 9, 2025 at 10:30.
@mryab requested a review from sbassam on Jun 9, 2025 at 10:31.
@artek0chumak (Contributor):
Don't forget to bump up the version before merging!

@@ -101,6 +101,11 @@ def create_finetune_request(
        raise ValueError(
            f"LoRA adapters are not supported for the selected model ({model_or_checkpoint})."
        )

    if lora_dropout is not None:
Contributor (review comment on the added line): this can be simplified to `if lora_dropout`
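
A side note on this suggestion: `if lora_dropout` and `if lora_dropout is not None` are not strictly equivalent, because 0.0 is falsy. The difference is harmless here, since 0.0 is a valid dropout value that needs no range check. A standalone illustration (not project code):

# `if x:` is False for x == 0.0, so the suggested form would skip the
# range check for a dropout of exactly 0.0; `if x is not None:` runs it.
for x in (None, 0.0, 0.1):
    print(repr(x), x is not None, bool(x))
# None  False  False
# 0.0   True   False
# 0.1   True   True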

@sbassam (Contributor) left a comment:

Just left a few comments but nothing major. LGTM otherwise

Comment on lines +105 to +111
create_finetune_request(
    model_limits=_MODEL_LIMITS,
    model=_MODEL_NAME,
    training_file=_TRAINING_FILE,
    lora=True,
    lora_dropout=lora_dropout,
)
Contributor (review comment on the lines above): I'm not sure why we need this, but if it successfully detects out-of-range dropout values, that's fine with me.
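
For context, the snippet under discussion is presumably the body of a parametrized test asserting that out-of-range values raise. A sketch under that assumption; the test name, parameter values, and pytest.raises wrapper are guesses, and only the create_finetune_request call is taken from the diff:

import pytest

@pytest.mark.parametrize("lora_dropout", [-0.5, 1.5])  # assumed values
def test_lora_dropout_out_of_range(lora_dropout):
    # The new validation should reject values outside the 0-1 range.
    with pytest.raises(ValueError):
        create_finetune_request(
            model_limits=_MODEL_LIMITS,
            model=_MODEL_NAME,
            training_file=_TRAINING_FILE,
            lora=True,
            lora_dropout=lora_dropout,
        )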

@timofeev1995 merged commit c6353ae into main on Jun 9, 2025 (10 checks passed).
@timofeev1995 deleted the egor/lora-dropout branch on Jun 9, 2025 at 17:25.