P-V-L Deep: A Big Data Analytics Solution for Now-casting in Monetary Policy

Document Type: Research Paper

Authors

1 Ph.D. Candidate, Department of Information Technology Management, Science and Research Branch, Islamic Azad University, Tehran, Iran.

2 Associate Prof., Department of Industrial Management, Science and Research Branch, Islamic Azad University, Tehran, Iran.

3 Associate Prof., Department of Information Technology Management, Science and Research Branch, Islamic Azad University, Tehran, Iran.

4 Prof., Department of Management, Tarbiat Modares University, Tehran, Iran.

5 Associate Prof., Department of Management, Sharif University of Technology, Tehran, Iran.

Abstract

The development of new technologies has confronted science and industry with the challenges of scaling big data and integrating it into forecasting analytics throughout its life cycle. In predictive analytics, forecasting the near future and the recent past, in other words now-casting, is the continuous study of real-time events in which estimates are updated as new information arrives. It is therefore necessary to adopt highly data-driven technologies and new analytical methods, such as machine learning and visualization tools, that can interact with and connect to heterogeneous big data sources, with the aim of reducing the risk of policy-making institutions' IT investments. The main scientific contribution of this article is a new policy-making approach to now-casting economic indicators that improves forecasting performance by combining deep networks and deep learning methods for data and feature representation. To this end, a network named P-V-L Deep (Predictive Variational Autoencoder - Long Short-Term Memory Deep Neural Network) was designed, in which a variational autoencoder architecture is used for unsupervised learning, data representation, and data reconstruction, and a long short-term memory network is used to evaluate the now-casting performance of deep networks on macroeconomic time series. The data represented and reconstructed by the generative network of the variational autoencoder were compared with the original data when assessing the forecasting performance of the long short-term memory network on the economic indicators. The findings indicate that, compared with the original data, the reconstructed data derived from the variational autoencoder yield shorter training times and better prediction performance in the long short-term memory network.
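
To make the pipeline described above concrete, the sketch below shows one way such a VAE-then-LSTM now-casting setup could be wired together. It is a minimal illustration and not the authors' implementation: it assumes TensorFlow 2.x with tf.keras, substitutes randomly generated placeholder data for the macroeconomic dataset, and the layer widths, latent dimension (latent_dim), and window length (window) are arbitrary illustrative choices rather than values reported in the paper.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

# Illustrative sizes only; not values reported in the paper.
n_features, latent_dim, window = 16, 4, 12
series = np.random.randn(500, n_features).astype("float32")  # placeholder for standardized macro indicators

# Variational autoencoder: unsupervised representation and reconstruction.
class Sampling(layers.Layer):
    """Reparameterization trick; also adds the KL term to the model loss."""
    def call(self, inputs):
        z_mean, z_log_var = inputs
        eps = tf.random.normal(tf.shape(z_mean))
        kl = -0.5 * tf.reduce_mean(1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var))
        self.add_loss(kl)
        return z_mean + tf.exp(0.5 * z_log_var) * eps

enc_in = layers.Input(shape=(n_features,))
h = layers.Dense(32, activation="relu")(enc_in)
z_mean = layers.Dense(latent_dim)(h)
z_log_var = layers.Dense(latent_dim)(h)
z = Sampling()([z_mean, z_log_var])
dec_h = layers.Dense(32, activation="relu")(z)
recon = layers.Dense(n_features)(dec_h)

vae = Model(enc_in, recon)
vae.compile(optimizer="adam", loss="mse")  # reconstruction term; KL term added by the Sampling layer
vae.fit(series, series, epochs=10, batch_size=32, verbose=0)

# Reconstructed series, used as the LSTM input in place of the original data.
reconstructed = vae.predict(series, verbose=0)

# LSTM now-caster: predict the next observation from a sliding window.
def make_windows(x, w):
    X = np.stack([x[i:i + w] for i in range(len(x) - w)])
    return X, x[w:]

X, y = make_windows(reconstructed, window)
lstm = tf.keras.Sequential([
    layers.Input(shape=(window, n_features)),
    layers.LSTM(32),
    layers.Dense(n_features),
])
lstm.compile(optimizer="adam", loss="mse")
lstm.fit(X, y, epochs=10, batch_size=32, verbose=0)

Feeding the VAE-reconstructed series into the LSTM, rather than the raw series, mirrors the comparison reported in the abstract; swapping "reconstructed" for "series" in make_windows reproduces the baseline on the original data.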

Keywords

