
Fine-tune the Masked Particle Modeling foundation model for tau lepton reconstruction #77

Open
Laurits7 opened this issue Sep 8, 2024 · 0 comments


Laurits7 commented Sep 8, 2024

See the Masked Particle Modeling (MPM) paper: https://arxiv.org/pdf/2401.13537

Following the example of OmniParT (https://github.com/HEP-KBFI/ml-tau-en-reg/blob/main/enreg/tools/models/OmniParT.py), define a model where the pretrained MPM encodings are used as input to the binary classification, multiclass classification, and regression heads.
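A minimal PyTorch sketch of such a model, assuming the setup resembles OmniParT. The encoder class, its checkpoint interface, the embedding dimension, and the number of decay-mode classes are all placeholders, not the actual MPM implementation:

```python
import torch
import torch.nn as nn

class DummyMPMEncoder(nn.Module):
    """Stand-in for the pretrained MPM backbone (hypothetical interface).

    The real encoder would be loaded from a pretrained checkpoint; here a
    single linear projection with mean pooling over particles is used so
    the head wiring can be shown end to end.
    """
    def __init__(self, in_dim=16, embed_dim=64):
        super().__init__()
        self.proj = nn.Linear(in_dim, embed_dim)

    def forward(self, x):
        # x: (batch, n_particles, in_dim) -> per-jet embedding (batch, embed_dim)
        return self.proj(x).mean(dim=1)

class TauTaggerWithMPM(nn.Module):
    """MPM encodings feeding binary, multiclass, and regression heads."""
    def __init__(self, encoder, embed_dim=64, n_classes=5):
        super().__init__()
        self.encoder = encoder
        self.binary_head = nn.Linear(embed_dim, 1)          # tau vs. background
        self.multiclass_head = nn.Linear(embed_dim, n_classes)  # e.g. decay mode
        self.regression_head = nn.Linear(embed_dim, 1)      # e.g. visible energy

    def forward(self, x):
        z = self.encoder(x)
        return self.binary_head(z), self.multiclass_head(z), self.regression_head(z)

model = TauTaggerWithMPM(DummyMPMEncoder())
out_bin, out_cls, out_reg = model(torch.randn(8, 32, 16))
```

Each head sees the same shared embedding, so swapping the dummy encoder for the pretrained MPM backbone would leave the head code unchanged.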

Compare three variants: a pretrained model with frozen weights, a pretrained model with fine-tuned weights, and a model trained from scratch.
Compare the physics metrics (classification performance, regression resolution) as a function of the training statistics.
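The three comparison points could be set up as below. This is a sketch only; the tiny model and the `configure` helper are illustrative, and the real encoder would come from the MPM checkpoint:

```python
import copy
import torch.nn as nn

class TinyModel(nn.Module):
    """Minimal stand-in with an .encoder and a .head (hypothetical)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(4, 8)
        self.head = nn.Linear(8, 1)

def configure(model, mode):
    """Return a copy of the model prepared for one comparison point.

    mode: "frozen"   - pretrained encoder, weights fixed, only heads train
          "finetune" - pretrained encoder, all weights trainable
          "scratch"  - encoder weights re-initialized, all weights trainable
    """
    model = copy.deepcopy(model)
    if mode == "frozen":
        for p in model.encoder.parameters():
            p.requires_grad = False
    elif mode == "scratch":
        for m in model.encoder.modules():
            if hasattr(m, "reset_parameters"):
                m.reset_parameters()
    return model
```

With this, the training-statistics scan is a double loop over `mode` and dataset size, with the optimizer built from `filter(lambda p: p.requires_grad, model.parameters())` so frozen encoder weights are never updated.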

@jpata jpata changed the title Include "Masked Particle Modeling" in comparison Fine-tune the Masked Particle Modeling foundation model for tau lepton reconstruction Jan 21, 2025