In #169 I outlined a use case for Domains. In writing up that use case, I realized that I could also work with a "Bundle" TimeSeries. Namely, I often work with a number of boolean-valued time series whose values are constant for large periods of time. However, combining the time series multiplies the number of points required to represent them (particularly if combined with a real-valued signal).
My current solution has been to create a dictionary of TimeSeries objects, but this ran into problems when the time series are defined on different domains. It would be nice to have an object that "bundles" multiple TimeSeries objects but asserts that they all have the same domain.
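A minimal sketch of what such a bundle could look like, independent of any existing API (`StepSeries` and `Bundle` are hypothetical names, not part of this library): each member series carries an explicit domain, and the bundle refuses construction unless all domains match.

```python
class StepSeries:
    """Piecewise-constant series: sorted (time, value) breakpoints,
    valid over a half-open domain [start, end)."""

    def __init__(self, points, domain):
        self.points = sorted(points)
        self.domain = domain  # (start, end) tuple

    def __getitem__(self, t):
        start, end = self.domain
        if not (start <= t < end):
            raise KeyError(f"{t} outside domain {self.domain}")
        value = None
        for time, v in self.points:
            if time <= t:
                value = v
            else:
                break
        return value


class Bundle:
    """Dict of StepSeries guaranteed to share a single domain."""

    def __init__(self, series):
        domains = {s.domain for s in series.values()}
        if len(domains) != 1:
            raise ValueError(f"mismatched domains: {domains}")
        self.domain = domains.pop()
        self.series = dict(series)

    def __getitem__(self, t):
        # One lookup returns the value of every member at time t.
        return {name: s[t] for name, s in self.series.items()}


bundle = Bundle({
    "pump_on":  StepSeries([(0, False), (5, True)], domain=(0, 10)),
    "valve_ok": StepSeries([(0, True), (8, False)], domain=(0, 10)),
})
bundle[6]  # {'pump_on': True, 'valve_ok': True}
```

Because each member stays a separate step function, sparse boolean series keep their few breakpoints instead of being densified onto a combined index; only the domain is shared and enforced.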
I was just thinking about this today. I've used the dictionary-of-time-series approach too. Along with the mismatched domains, it's also inefficient if you want to apply a function (moving_average in my case) to all of them.
A thought would be to use a TimeSeries with a sort of collection as the value; that way they all share the same index.
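The collection-valued idea above could be sketched like this (names and structure are illustrative, not an existing API): a single list of `(time, dict)` points, so every channel shares one index, and one pass applies a function to all members at once.

```python
# Hypothetical collection-valued series: one shared index,
# each value is a dict of channel -> value.
points = [
    (0, {"pump_on": False, "temp": 20.1}),
    (5, {"pump_on": True,  "temp": 20.4}),
    (8, {"pump_on": True,  "temp": 21.0}),
]


def map_channels(points, fn):
    """Apply fn(name, value) to every channel at every point;
    a single traversal covers all bundled members."""
    return [
        (t, {name: fn(name, v) for name, v in record.items()})
        for t, record in points
    ]


# e.g. coerce the boolean channels to 0/1 across the whole bundle at once
numeric = map_channels(
    points, lambda name, v: int(v) if isinstance(v, bool) else v
)
```

The trade-off versus separate series is densification: every channel gets a point at every shared time, which is exactly the blow-up described in the issue when sparse booleans are combined with a real-valued signal.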