ISSN: 2394-3661

International Journal of Engineering and Applied Sciences

(An ISO 9001:2008 Certified Online and Print Journal)

GARCH Parameter Estimation by Machine Learning

(Volume 5, Issue 8, August 2018) | Open Access

Tetsuya Takaishi


Abstract:
Estimating the volatility of asset returns is central to risk management in empirical finance, and the GARCH model is widely used for this purpose. To apply the GARCH model, its parameters must be estimated so that the model matches the underlying return time series; this is usually done by the maximum likelihood or the Bayesian method. In this study we instead apply a machine learning technique: we minimize a loss function defined from the likelihood function of the GARCH model, performing the minimization with the Adam optimizer of TensorFlow. We find that this approach estimates the model parameters correctly. We also investigate the convergence properties of the Adam optimizer and show that the convergence rate increases with the learning rate up to a certain maximum learning rate; beyond that maximum, the minimization fails.
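The procedure the abstract describes, minimizing the GARCH likelihood-based loss with Adam, can be sketched in a self-contained way. The paper uses TensorFlow; the sketch below instead uses NumPy with a hand-rolled Adam update and numerical gradients, applied to a simulated GARCH(1,1) series sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2 with Gaussian innovations. All function names, initial values, and hyperparameters here are illustrative assumptions, not the paper's actual code or settings.

```python
import numpy as np

def garch_neg_loglik(params, r):
    """Negative log-likelihood of a GARCH(1,1) model with Gaussian innovations
    (up to an additive constant): 0.5 * sum(log sigma_t^2 + r_t^2 / sigma_t^2)."""
    omega, alpha, beta = params
    n = len(r)
    sigma2 = np.empty(n)
    sigma2[0] = np.var(r)  # a common initialization choice for the variance recursion
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(sigma2) + r ** 2 / sigma2)

def adam_fit(r, lr=0.02, steps=600):
    """Minimize the GARCH loss with a hand-rolled Adam optimizer
    (central-difference gradients; parameters clipped to stay positive)."""
    theta = np.array([0.1, 0.1, 0.5])       # initial guess for (omega, alpha, beta)
    m, v = np.zeros(3), np.zeros(3)
    b1, b2, eps = 0.9, 0.999, 1e-8          # standard Adam hyperparameters
    h = 1e-6                                 # finite-difference step
    for step in range(1, steps + 1):
        g = np.zeros(3)
        for i in range(3):
            dp = np.zeros(3); dp[i] = h
            g[i] = (garch_neg_loglik(theta + dp, r)
                    - garch_neg_loglik(theta - dp, r)) / (2 * h)
        m = b1 * m + (1 - b1) * g            # first-moment estimate
        v = b2 * v + (1 - b2) * g ** 2       # second-moment estimate
        mhat = m / (1 - b1 ** step)          # bias correction
        vhat = v / (1 - b2 ** step)
        theta = theta - lr * mhat / (np.sqrt(vhat) + eps)
        theta = np.clip(theta, 1e-6, None)   # keep omega, alpha, beta positive
    return theta

# Simulate a GARCH(1,1) series with known parameters, then refit them.
rng = np.random.default_rng(0)
omega_t, alpha_t, beta_t = 0.1, 0.1, 0.8
n = 1500
r = np.zeros(n)
s2 = omega_t / (1 - alpha_t - beta_t)        # stationary variance as starting point
for t in range(1, n):
    s2 = omega_t + alpha_t * r[t - 1] ** 2 + beta_t * s2
    r[t] = np.sqrt(s2) * rng.standard_normal()

est = adam_fit(r)
print(np.round(est, 3))  # estimated (omega, alpha, beta)
```

As the abstract notes, the learning rate matters: too small and convergence is slow, too large and the Adam iterates diverge; in this unconstrained sketch a diverging step can also push the variance recursion toward invalid (non-positive) values, which the clipping only partially guards against.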



Paper Statistics:

Page No: 16-19
