Replies: 1 comment
I would say: it's different. It is definitely different from transfer learning: the RNN estimates parameters for the same data set on which it also forecasts. You could of course take the trained RNN and apply it to another data set, but that isn't really part of the original idea. For multi-task learning, my first instinct is that it's also unrelated. However, some people argue that if you have a panel of time series, forecasting each series in that panel is its own task. In that sense, any global forecasting model (one that learns parameters over the entire panel rather than over a single series) performs a form of multi-task learning, and so does the Deep State model family.
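To make the architecture concrete, here is a minimal sketch of the idea being discussed: one shared RNN (the "global" part, with weights common to the whole panel) emits, at each time step, the noise parameters of a simple local-level state-space model, which a Kalman filter then uses to forecast each individual series. This is not the actual DeepState/GluonTS implementation; all names, sizes, and the use of random (untrained) weights are illustrative assumptions.

```python
import numpy as np

def softplus(x):
    # Maps unconstrained RNN outputs to positive noise scales.
    return np.log1p(np.exp(x))

rng = np.random.default_rng(0)

# Toy panel: 3 series, 20 steps each. A global model sees all of them.
panel = rng.normal(size=(3, 20)).cumsum(axis=1)

# Shared RNN weights. In DeepState these would be learned once over the
# whole panel; here they are random, since training is out of scope.
H = 8
Wx = rng.normal(scale=0.3, size=(H, 1))
Wh = rng.normal(scale=0.3, size=(H, H))
Wo = rng.normal(scale=0.3, size=(2, H))  # -> (obs noise, state noise)

def filter_series(y):
    """Run the shared RNN over one series; at each step it emits the
    noise scales of a local-level state-space model, and a Kalman
    filter produces one-step-ahead forecasts from those scales."""
    h = np.zeros(H)
    level, var = y[0], 1.0
    one_step = []
    for t in range(1, len(y)):
        h = np.tanh(Wx @ np.array([y[t - 1]]) + Wh @ h)
        sigma_obs, sigma_state = softplus(Wo @ h)
        var_pred = var + sigma_state**2
        one_step.append(level)  # forecast made before seeing y[t]
        # Kalman correction step with the new observation y[t].
        k = var_pred / (var_pred + sigma_obs**2)
        level = level + k * (y[t] - level)
        var = (1 - k) * var_pred
    return np.array(one_step)

# The same weights (Wx, Wh, Wo) are reused for every series in the panel,
# which is the sense in which a global model is "multi-task".
forecasts = [filter_series(y) for y in panel]
print(forecasts[0].shape)
```

The per-series state (`level`, `var`) stays local, while only the RNN weights are shared, which mirrors the division of labor the reply describes: the RNN supplies parameters, the state-space model does the forecasting.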
Since DeepState learns parameters with an RNN and passes them to state-space models, is it similar to multi-task learning or transfer learning? Or is it different from both?