FAANG Stock Forecasting: Using Ridge Regression, Linear Regression, and Neural Networks.
DOI: https://doi.org/10.47611/jsrhs.v14i1.8642
Keywords: artificial intelligence, machine learning, linear regression, ridge regression, stock forecasting, stock projection
Abstract
Blue-chip stocks serve as a cornerstone of the stock market, especially for long-term investors, since they offer high stability, high liquidity, and long-term growth. However, predicting their future prices remains a challenge because of increased market volatility. This research compares and evaluates the efficacy of three machine learning models (ridge regression, linear regression, and an LSTM recurrent neural network) in forecasting top-performing tech stock prices and trends over the long term. Using historical data (opening/closing prices, high/low prices) from Meta (formerly Facebook), Amazon, Apple, Netflix, and Alphabet (formerly Google), each of the three model types was developed for each stock. The models were separately trained and tested, assessed for predictive accuracy using several success metrics (MSE, R2, RMSE), and compared with one another using MAE as the common success metric. The neural networks achieved the lowest MAE, while ridge regression had the highest. Based on this comparison, the research concludes that the LSTM recurrent neural networks produced the most accurate forecasts with minimal error, linear regression performed second best, and ridge regression performed worst.
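The ridge/linear comparison described above can be sketched in a few lines. The snippet below is a minimal illustration, not the study's code: it fits closed-form ridge regression (which reduces to ordinary least squares when the penalty is zero) on a hypothetical toy dataset of open/high/low features predicting a close price, then compares held-out MAE. The feature ranges, weights, and penalty value are all assumptions for the example.

```python
import numpy as np

def ridge_fit(X, y, lam=0.0):
    """Closed-form ridge regression: w = (X^T X + lam*I)^(-1) X^T y.
    With lam=0.0 this is ordinary least squares (linear regression)."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

def mae(y_true, y_pred):
    """Mean absolute error, the common metric used to compare the models."""
    return float(np.mean(np.abs(y_true - y_pred)))

# Hypothetical "stock" features [open, high, low] predicting the close price.
rng = np.random.default_rng(0)
X = rng.uniform(100, 200, size=(60, 3))
true_w = np.array([0.5, 0.3, 0.2])           # assumed weights for the toy data
y = X @ true_w + rng.normal(0, 0.5, 60)      # noisy synthetic close prices

# Simple train/test split (80/20), mirroring a separate train and test phase.
X_train, X_test = X[:48], X[48:]
y_train, y_test = y[:48], y[48:]

w_ols = ridge_fit(X_train, y_train, lam=0.0)     # linear regression
w_ridge = ridge_fit(X_train, y_train, lam=10.0)  # ridge regression

mae_ols = mae(y_test, X_test @ w_ols)
mae_ridge = mae(y_test, X_test @ w_ridge)
```

On real price series the study additionally trains an LSTM network per stock; that model has no closed form and is typically fit by gradient descent in a deep-learning framework, so it is omitted from this sketch.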
Copyright (c) 2025 Zacharius Song

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright holder(s) granted JSR a perpetual, non-exclusive license to distribute and display this article.


