Time Series Forecasting Meets Large Language Models

School of Science | Bachelor's thesis
An electronic archive copy is available locally at the Harald Herlin Learning Centre. Aalto University staff can access electronic bachelor's theses by logging into Aaltodoc with their personal Aalto user ID.

Date

2024-09-06

Major/Subject

Data Science

Mcode

SCI3095

Degree programme

Aalto Bachelor’s Programme in Science and Technology

Language

en

Pages

20 + 3

Abstract

Time series forecasting involves predicting future values by analysing past data and its trends in a time-dependent manner. Because time series data are prevalent across many domains, time series forecasting is an important research area, and new methods that improve the accuracy of these predictions are continually emerging. These methods exploit the patterns and trends present in the data to forecast the future behaviour of the time series. Early forecasting approaches used mathematical models relying on the statistical properties of the data. More recently, the emergence of deep learning has led to a variety of new models for time series forecasting based on recurrent neural networks and transformers. The use of large language models (LLMs), a class of transformers, to forecast time series data is an emerging area of research. This thesis is a literature review that seeks to determine the current state of research on the application of LLMs to time series forecasting. Six models are presented that employ a variety of techniques to make forecasts using LLMs. The thesis also identifies key open challenges in the use of LLMs for forecasting and presents directions for future research.
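As a minimal illustration of the forecasting task described above (not code from the thesis), the sketch below fits a classical autoregressive baseline of the kind LLM-based forecasters are typically compared against. It uses only NumPy; the lag order p and the synthetic sine-wave series are arbitrary choices made for the example.

```python
# Illustrative sketch, assuming nothing from the thesis itself:
# fit an AR(p) model by least squares and roll it forward.
import numpy as np

def fit_ar(series: np.ndarray, p: int) -> np.ndarray:
    """Fit AR(p) coefficients by ordinary least squares."""
    # Row j holds the p values preceding target series[p + j],
    # ordered oldest to newest.
    X = np.column_stack([series[i : len(series) - p + i] for i in range(p)])
    y = series[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast(series: np.ndarray, coeffs: np.ndarray, steps: int) -> np.ndarray:
    """Forecast iteratively, feeding each prediction back in as input."""
    p = len(coeffs)
    history = list(series[-p:])
    out = []
    for _ in range(steps):
        nxt = float(np.dot(history[-p:], coeffs))
        out.append(nxt)
        history.append(nxt)
    return np.array(out)

# Noisy sine wave as a stand-in for real time series data.
t = np.arange(200)
series = np.sin(0.1 * t) + 0.05 * np.random.default_rng(0).normal(size=200)
coeffs = fit_ar(series, p=5)
print(forecast(series, coeffs, steps=3))
```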

Supervisor

Korpi-Lagg, Maarit

Thesis advisor

Ciaperoni, Martino

Keywords

time series, time series forecasting, large language models, transformers, deep learning
