Today we have access to computational power hundreds of times greater than what was available only a couple of decades ago. Researchers can finally test the fundamental approaches and ideas accumulated over the last century. Following this trend, the author compares simple but effective shrinkage methods with the computationally demanding model-combination approach, represented here by Complete Subset Regression (CSR) and eXtreme Gradient Boosting (XGBoost). Combinations of models often provide much more stable output, but do they also perform well during periods of shock? While tree-based models rarely show decent results when forecasting economic data, the recent success of XGBoost in numerous competitions has drawn attention to a possible game-changer from this category. This work yields interesting results, giving reason to believe that adding a cross-validation step to the already best-performing CSR may further improve the approach without increasing its computational cost. On the other hand, further confirmation on Monte Carlo simulations and on other types of data is required before the validity of the method can be established in general.
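For concreteness, the sketch below illustrates the CSR idea referred to above: ordinary-least-squares forecasts are averaged over all predictor subsets of a fixed size k, and a hold-out step is used to pick k. The function names, the hold-out selection of k, and the synthetic data are illustrative assumptions for this sketch only, not the author's implementation.

```python
import numpy as np
from itertools import combinations

def csr_forecast(X_train, y_train, x_new, k):
    """Complete Subset Regression idea: average OLS forecasts over all size-k subsets."""
    forecasts = []
    for subset in combinations(range(X_train.shape[1]), k):
        cols = list(subset)
        # OLS with an intercept on the chosen subset of predictors
        Z = np.column_stack([np.ones(len(X_train)), X_train[:, cols]])
        beta, *_ = np.linalg.lstsq(Z, y_train, rcond=None)
        forecasts.append(np.r_[1.0, x_new[cols]] @ beta)
    return float(np.mean(forecasts))

def csr_forecast_cv(X, y, x_new, candidate_ks, holdout=20):
    """Hypothetical extension: choose the subset size k on a hold-out window
    before producing the final CSR forecast (not the author's exact procedure)."""
    X_fit, y_fit = X[:-holdout], y[:-holdout]
    errors = {}
    for k in candidate_ks:
        preds = [csr_forecast(X_fit, y_fit, X[-holdout + i], k)
                 for i in range(holdout)]
        errors[k] = np.mean((np.array(preds) - y[-holdout:]) ** 2)
    best_k = min(errors, key=errors.get)
    return csr_forecast(X, y, x_new, best_k), best_k

# Toy usage on synthetic data
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 6))
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.standard_normal(120)
print(csr_forecast_cv(X, y, rng.standard_normal(6), candidate_ks=[1, 2, 3]))
```

Because the subset regressions are fitted in any case, selecting k on a hold-out window adds little work relative to the full CSR loop, which is the intuition behind the claim that the cross-validated variant need not increase computational cost.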