
Conversation

@alexahuxn
Contributor

Added L2-norm atom #131

@alexahuxn changed the title from Alexa/norm to Added L2-norm atom #131 on Dec 5, 2025
@pratikrathore8 linked an issue on Dec 5, 2025 that may be closed by this pull request
@pratikrathore8 self-requested a review on December 5, 2025 at 21:31
@pratikrathore8
Collaborator

pratikrathore8 commented Dec 5, 2025

Thanks for making this PR @alexahuxn. Could you get your tests to pass the CI? Then I will take a look at the code.

@pratikrathore8 changed the title from Added L2-norm atom #131 to Added L2-norm atom on Dec 5, 2025
@alexahuxn
Contributor Author

> Thanks for making this PR @alexahuxn. Could you get your tests to pass the CI? Then I will take a look at the code.

Thank you. I fixed the issue, and the CI now passes.

@pratikrathore8 (Collaborator) left a comment

Looks good for the most part.

The L2Norm atom needs to be added to the docs. It also needs to be added to the __init__.py file in rlaopt.atoms.

Also, please merge the contents of the main branch into your branch!
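A minimal sketch of that re-export, assuming the atom lives in a module file named l2_norm.py (the file name and __all__ entry are guesses, not the PR's actual layout):

# rlaopt/atoms/__init__.py -- hypothetical sketch; the real package layout may differ
from .l2_norm import L2Norm  # assumed module file name for the new atom

__all__ = ["L2Norm"]  # lets users write `from rlaopt.atoms import L2Norm`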

lam = scaling * prox_scaling

def prox_l2(x: torch.Tensor) -> torch.Tensor:
    norm = torch.linalg.norm(x, ord=2)
@pratikrathore8 (Collaborator) commented on this code:
Checking for equality with 0 could be unsafe. Could you tweak your implementation to use the compact form at the bottom of p. 143 in this book: https://www.jacobaguirre.com/First%20Order%20Methods%20in%20Optimization.pdf?
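For reference, the compact form on that page evaluates prox_{lam * ||.||_2}(x) = (1 - lam / max(||x||_2, lam)) * x, which returns exactly 0 whenever ||x||_2 <= lam without an explicit zero check. A minimal sketch of that form (the standalone function signature is illustrative, not the PR's actual code):

import torch


def prox_l2(x: torch.Tensor, lam: float) -> torch.Tensor:
    # Compact form of prox_{lam * ||.||_2} (Beck, p. 143):
    # (1 - lam / max(||x||_2, lam)) * x.
    # If ||x||_2 <= lam, the clamp makes the factor exactly 0,
    # so no norm == 0 equality check is needed (assumes lam > 0).
    norm = torch.linalg.norm(x, ord=2)
    return (1.0 - lam / torch.clamp(norm, min=lam)) * x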


@pratikrathore8 (Collaborator) left a comment

Looks good to me!

@alexahuxn merged commit 5728f2e into main on Dec 11, 2025
10 checks passed
@alexahuxn deleted the alexa/norm branch on December 11, 2025 at 02:19

Development

Successfully merging this pull request may close these issues.

Add L2 norm atom

3 participants