Use set_data in forecast method #451
Conversation
@@ -2060,6 +2065,7 @@ def forecast(
        with pm.Model(coords=temp_coords) as forecast_model:
            (_, _, *matrices), grouped_outputs = self._kalman_filter_outputs_from_dummy_graph(
                scenario=scenario,
Hey @jessegrabowski, sorry about the constant back and forth, but I think this addition here fixes the bug. The tests still fail, but the error message is different from the expected dimension mismatch; it is an assertion error:
FAILED tests/statespace/test_statespace.py::test_foreacast_valid_index - UserWarning: Skipping `CheckAndRaise` Op (assertion: The first dimension of a time varying matrix (the time dimension) must be equal to the first dimension of the data (the time dimension).) as JAX tracing would remove it.
Running the test test_foreacast_valid_index in a notebook works without any problems.
It's just a warning, so we need to add a @pytest.mark.filterwarnings decorator to the test.
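A minimal sketch of that fix; the filter string and the test body are assumptions based on the warning message above, not the repo's actual code:

```python
import pytest


# Match the start of the JAX warning message so pytest ignores it
@pytest.mark.filterwarnings("ignore:Skipping `CheckAndRaise` Op")
def test_foreacast_valid_index():
    ...  # existing test body unchanged
```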
I made this change then undid it because I thought it wasn't doing the right thing. I need to double-check, but the forecasting logic basically goes like this:
1. The original user graph -- using the training data -- is reconstructed in its entirety.
2. Using the original graph, we compute the value of the hidden states at the requested x0 for the forecasts. We do not want new scenario data here, otherwise the value of the requested x0 will be wrong.
3. Starting from this x0, we construct a new graph that iterates the statespace equations forward for the requested number of time steps. If there are exogenous regressors, this is where we do want them.
I think I had the impression that this change put the scenario data in step (2), but I looked at it a few weeks ago now and I forget.
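For reference, a minimal NumPy sketch of step (3) -- iterating the statespace equations forward from x0 -- with all matrices and values made up for illustration, not taken from the actual statespace module:

```python
import numpy as np

rng = np.random.default_rng(0)

# Steps (1) and (2): in the real code, the original graph is rebuilt and the
# Kalman filter is run on the training data to get the state at the forecast
# origin. Here we simply assume that state is already known.
x0 = np.array([1.0, 0.1])        # hidden state at the forecast origin (assumed)

T = np.array([[1.0, 1.0],        # transition matrix (local linear trend, assumed)
              [0.0, 1.0]])
R = np.eye(2)                    # selection matrix (assumed)
Q = 0.01 * np.eye(2)             # innovation covariance (assumed)

# Step (3): iterate x[t+1] = T @ x[t] + R @ eps[t] for the requested horizon
n_steps = 10
states = [x0]
for _ in range(n_steps):
    eps = rng.multivariate_normal(np.zeros(2), Q)
    states.append(T @ states[-1] + R @ eps)

forecast = np.stack(states[1:])  # shape (n_steps, 2)
```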
Closes #424
There was some really complicated logic for clone-replacing data variables in the forecast method for absolutely no reason. For certain models, this meant that the data was not being correctly updated, and coordinates didn't match after sampling. Just doing a basic pm.set_data does the trick just fine.
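For illustration, the generic pm.set_data pattern this switches to (standard PyMC usage; the model and variable names below are made up, not the statespace module's internals):

```python
import numpy as np
import pymc as pm

x_train = np.arange(10, dtype="float64")
y_train = 2.0 * x_train + 1.0

with pm.Model() as model:
    x = pm.Data("x", x_train)
    beta = pm.Normal("beta")
    pm.Normal("y", mu=beta * x, sigma=1.0, observed=y_train, shape=x.shape)
    idata = pm.sample(100, tune=100, chains=1, progressbar=False)

# Swap in new data without any graph cloning or rebuilding
with model:
    pm.set_data({"x": np.arange(10, 20, dtype="float64")})
    pm.sample_posterior_predictive(idata, extend_inferencedata=True)
```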