Transfer learning (TL) is a paradigm that improves the performance of statistical tasks on a target dataset by leveraging information from informative source datasets, a phenomenon often referred to as positive transfer in the literature. However, most existing methods are tailored to independent, light-tailed data, which excludes critical applications in statistical modelling involving temporal dependence or heavy-tailed covariates, such as stock price forecasting.
In this talk, I will first introduce a TL coefficient estimator for high-dimensional linear models and its rate of convergence under temporally dependent and potentially heavy-tailed covariate or error processes. Second, I will describe our proposed self-normalized (SN) statistic, used to screen out negative transfers, i.e. source datasets that deviate significantly from the target data. The asymptotic properties of the SN statistic are also established. The validity of our methods will be demonstrated through simulations and a real-world application: forecasting the log prices of Dow Jones Index constituents. I will also briefly outline potential future work at the end of this presentation.
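The general flavour of a two-step TL coefficient estimator for a high-dimensional linear model can be sketched as below. This is a minimal illustration of a pool-then-debias scheme (in the spirit of Trans-Lasso-type methods), using ridge penalties and i.i.d. Gaussian data for simplicity; it is not the talk's actual estimator, which accommodates temporal dependence and heavy tails, nor does it include the SN screening statistic. All parameter choices here are illustrative assumptions.

```python
import numpy as np

def ridge(X, y, lam):
    # Closed-form ridge solution: (X'X + lam*I)^{-1} X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

rng = np.random.default_rng(0)
p = 50
beta = np.zeros(p)
beta[:5] = 1.0                        # sparse target coefficients

# Informative source: same model up to a small contrast, large sample
n_src = 500
X_src = rng.standard_normal((n_src, p))
y_src = X_src @ (beta + 0.01 * rng.standard_normal(p)) + rng.standard_normal(n_src)

# Target: small sample (n comparable to p, so target-only fitting is hard)
n_tgt = 60
X_tgt = rng.standard_normal((n_tgt, p))
y_tgt = X_tgt @ beta + rng.standard_normal(n_tgt)

# Step 1: pool source and target data to obtain a pilot estimate
X_pool = np.vstack([X_src, X_tgt])
y_pool = np.concatenate([y_src, y_tgt])
w_hat = ridge(X_pool, y_pool, lam=1.0)

# Step 2: debias on the target residuals to correct source-induced bias
delta_hat = ridge(X_tgt, y_tgt - X_tgt @ w_hat, lam=10.0)
beta_tl = w_hat + delta_hat

# Baseline for comparison: target-only fit
beta_only = ridge(X_tgt, y_tgt, lam=1.0)

err_tl = np.linalg.norm(beta_tl - beta)
err_only = np.linalg.norm(beta_only - beta)
print(f"TL error: {err_tl:.3f}, target-only error: {err_only:.3f}")
```

With an informative source (small contrast, large sample), the pooled pilot estimate dominates the target-only fit; a negative transfer would correspond to a source whose contrast is large, which is exactly what the SN-type screening statistic in the talk is designed to detect.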
Keywords: Transfer learning, time series analysis, high-dimensional data analysis,
functional dependence, self-normalized test, financial data application