
Annual fAPAR_max and LAI_max #403

Open — wants to merge 30 commits from issue348_annual_fapar_and_lai into develop

Conversation

MarionBWeinzierl (Collaborator) commented Jan 28, 2025

Description

Implements the computation of fAPAR_max and LAI_max, and provides example usages of the API (including test data).

Fixes #348.

Type of change

  • New feature (non-breaking change which adds functionality)
  • Optimization (back-end change that speeds up the code)
  • Bug fix (non-breaking change which fixes an issue)

Key checklist

  • Make sure you've run the pre-commit checks: $ pre-commit run -a
  • All tests pass: $ poetry run pytest

Further checks

  • Code is commented, particularly in hard-to-understand areas
  • Tests added that prove fix is effective or that feature works

@MarionBWeinzierl MarionBWeinzierl added the enhancement New feature or request label Jan 28, 2025
@MarionBWeinzierl MarionBWeinzierl self-assigned this Jan 28, 2025
@MarionBWeinzierl MarionBWeinzierl marked this pull request as draft January 28, 2025 16:30
codecov-commenter commented Jan 29, 2025

Codecov Report

Attention: Patch coverage is 51.78571% with 54 lines in your changes missing coverage. Please review.

Project coverage is 94.75%. Comparing base (56ccf57) to head (e8d30f1).
Report is 67 commits behind head on develop.

Files with missing lines                | Patch % | Lines
pyrealm/phenology/fapar_limitation.py   | 46.53%  | 54 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop     #403      +/-   ##
===========================================
- Coverage    96.27%   94.75%   -1.53%     
===========================================
  Files           34       38       +4     
  Lines         2659     2860     +201     
===========================================
+ Hits          2560     2710     +150     
- Misses          99      150      +51     


MarionBWeinzierl (Collaborator, Author) commented:

@davidorme , could you have a look and sense-check what's there? As you suggested in #407 (I hope I understood that right) I used Scaler directly, rather than extracting any functionality.

davidorme (Collaborator) left a comment:

Some optimisations and a couple of issues. I think we're on the right track here. I did wonder about making the class __init__ accept daily/subdaily observations and moving the resampling in the from_Pmodel into there, but on balance I think it is right to have the straightforward usage as __init__ and then use factory methods to handle other core inputs.

Comment on lines +188 to +254

annual_total_potential_gpp = get_annual(
pmodel.gpp, datetimes, growing_season, "total"
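For context, the grouped annual total computed by the `get_annual(..., "total")` call above can be sketched in plain NumPy. Note that `annual_total` below is a hypothetical stand-in written for illustration, not pyrealm's actual implementation, which may differ in API and behaviour:

```python
import numpy as np


def annual_total(values, datetimes, in_growing_season):
    """Sum observations falling in the growing season for each calendar year.

    Hypothetical sketch of a get_annual(..., "total")-style aggregation,
    assuming one observation per day.
    """
    # Truncating to year precision gives each observation's calendar year.
    years = datetimes.astype("datetime64[Y]")
    totals = []
    for year in np.unique(years):
        mask = (years == year) & in_growing_season
        totals.append(values[mask].sum())
    return np.array(totals)


# Two whole years of daily data: 2000 (leap, 366 days) and 2001 (365 days).
datetimes = np.arange("2000-01-01", "2002-01-01", dtype="datetime64[D]")
gpp = np.ones(datetimes.size)
growing = np.ones(datetimes.size, dtype=bool)
annual_total(gpp, datetimes, growing)  # → array([366., 365.])
```

This also illustrates why the whole-years check matters: a trailing partial year would silently produce a deflated annual total.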
davidorme (Collaborator):
I think this needs to check that we are dealing with whole years in the input?

MarionBWeinzierl (Collaborator, Author):
Wondering whether this should be something similar to the __init__ of the scaler, or something much simpler (like checking that growing_season can be divided by 365 or 366).

davidorme (Collaborator):
Yeah - this was tying me in knots as well. I think it needs to be something similar to the scaler init. We want to:

  • Check that the timing is regular - although that is a pain when we want to support e.g. weekly or monthly inputs
  • Check that there are the same number of observations per year - although that is a pain for leap years at daily precision.
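The two checks listed above could be sketched roughly as follows for the daily case. This is only an illustration of the idea under discussion, assuming daily inputs; `check_whole_years` is a hypothetical helper, not pyrealm's scaler init:

```python
import numpy as np


def check_whole_years(datetimes):
    """Validate that a daily datetime axis is regular and spans whole years.

    Hypothetical sketch: checks even spacing, then sidesteps the leap-year
    counting problem by requiring the series to start on 1 January and end
    on 31 December rather than counting observations per year.
    """
    deltas = np.diff(datetimes)
    if not np.all(deltas == deltas[0]):
        raise ValueError("Datetimes are not evenly spaced")
    start, end = datetimes[0], datetimes[-1]
    # Start must round-trip through year precision, i.e. be 1 January.
    if start != start.astype("datetime64[Y]").astype(start.dtype):
        raise ValueError("Datetimes do not start on 1 January")
    # End must be the day before 1 January of the following year.
    next_jan1 = (end.astype("datetime64[Y]") + 1).astype(end.dtype)
    if end != next_jan1 - np.timedelta64(1, "D"):
        raise ValueError("Datetimes do not end on 31 December")


# Passes: two complete calendar years of daily observations.
check_whole_years(np.arange("2000-01-01", "2002-01-01", dtype="datetime64[D]"))
```

Requiring 1 January / 31 December endpoints avoids counting observations per year, so leap years need no special case; weekly or monthly inputs would still need a different spacing rule.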

MarionBWeinzierl (Collaborator, Author):
OK, I implemented a check for the datetimes. I am sure I have missed something...

Probably a good time to implement tests.

MarionBWeinzierl (Collaborator, Author):
It currently assumes daily or subdaily inputs

davidorme (Collaborator) left a comment:

More thoughts - not fully formed!


@MarionBWeinzierl MarionBWeinzierl force-pushed the issue348_annual_fapar_and_lai branch from 1f2087a to f51db91 Compare February 17, 2025 10:52
@MarionBWeinzierl MarionBWeinzierl marked this pull request as ready for review February 18, 2025 14:22
MarionBWeinzierl (Collaborator, Author) commented:

I have marked this as ready for review now, although there is still no integration test - I am trying to figure out whether there is something easier than the whole LAI example.

What do you think about merging in what is there and creating a separate issue for the regression test, @davidorme?

MarionBWeinzierl (Collaborator, Author) commented:

I created issue #436 to implement regression tests.

davidorme (Collaborator) left a comment:

@MarionBWeinzierl Some minor doc stuff and simple syntax changes. Then I think this is great as something to build regression tests and further development around.

davidorme (Collaborator) commented:

@MarionBWeinzierl I'm just wondering about what we want to do about 2.0.0 and this feature. My gut feeling is that we maintain this as a separate feature branch for now, merging #436 down into here for example, until after the 2.0.0 meta branch is merged in #383.

We don't want too many branches hanging around but it feels like it is worth keeping this partial implementation out of the 2.0.0 release candidates?

MarionBWeinzierl (Collaborator, Author) commented:

> @MarionBWeinzierl I'm just wondering about what we want to do about 2.0.0 and this feature. My gut feeling is that we maintain this as a separate feature branch for now, merging #436 down into here for example, until after the 2.0.0 meta branch is merged in #383.
>
> We don't want too many branches hanging around but it feels like it is worth keeping this partial implementation out of the 2.0.0 release candidates?

Hm, I am not exactly enthusiastic about the idea, as it will mean more effort to make sure it does not diverge even more from the main implementation. We have merged in other not-quite-complete implementations, building things up incrementally. How is this part different?

davidorme (Collaborator) commented:

I agree it is a change in how we've handled feature branches. I want to get all of the stuff in #383 down into develop soon and that will contain breaking changes for this feature.

I guess I've also treated new features very much as a firehose of things that are immediately public, which is one reason we've ended up with a big refactor for 2.0.0. I guess I'd like 2.0.0 to be a checkpoint of what we already have to then build on?

Labels: enhancement (New feature or request)
Projects: Status: Review

Successfully merging this pull request may close this issue:
Calculation of annual fAPAR and LAI values from a fitted PModel and precipitation data

3 participants