no code implementations • 14 May 2024 • Qinshuo Liu, Yanwen Fang, PengTao Jiang, Guodong Li
Multivariate time series forecasting tasks are usually conducted in a channel-dependent (CD) manner, since this allows more variable-relevant information to be incorporated.
no code implementations • 6 Jun 2023 • Yanwen Fang, Jintai Chen, Peng-Tao Jiang, Chao Li, Yifeng Geng, Eddy K. F. LAM, Guodong Li
Multi-person motion prediction is a challenging task, especially for real-world scenarios with highly interactive persons.
1 code implementation • 8 Feb 2023 • Yanwen Fang, Yuxi Cai, Jintai Chen, Jingyu Zhao, Guangjian Tian, Guodong Li
Motivated by this, we devise a cross-layer attention mechanism, called multi-head recurrent layer attention (MRLA), that sends a query representation of the current layer to all previous layers to retrieve query-related information from different levels of receptive fields.
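The mechanism described above can be illustrated with a minimal single-head sketch: the current layer's feature produces a query, which attends over keys and values derived from all earlier layers. All function and weight names here are illustrative assumptions, not the paper's implementation, and the multi-head and recurrent aspects are omitted for brevity.

```python
import numpy as np

def recurrent_layer_attention(layer_feats, Wq, Wk, Wv):
    """Single-head sketch of cross-layer attention (MRLA-style).

    layer_feats: list of per-layer feature vectors, ordered from the
    first layer to the current one. The last entry supplies the query;
    every entry supplies a key and a value.
    """
    cur = layer_feats[-1]
    q = cur @ Wq                                   # query from the current layer
    ks = np.stack([f @ Wk for f in layer_feats])   # keys from all layers
    vs = np.stack([f @ Wv for f in layer_feats])   # values from all layers
    scores = ks @ q / np.sqrt(q.shape[-1])         # scaled dot-product scores
    w = np.exp(scores - scores.max())              # numerically stable softmax
    w /= w.sum()
    return w @ vs                                  # attention-weighted mix of layer values
```

In a full model this aggregated feature would be fused back into the current layer's output, letting each layer retrieve information from the different receptive fields of its predecessors.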
no code implementations • 14 Apr 2022 • Huiling Yuan, Guodong Li, Junhui Wang
This paper introduces one new multivariate volatility model that can accommodate an appropriately defined network structure based on low-frequency and high-frequency data.
no code implementations • 9 Dec 2021 • Feiqing Huang, Yuefeng Si, Yao Zheng, Guodong Li
While recently many designs have been proposed to improve the model efficiency of convolutional neural networks (CNNs) on a fixed resource budget, theoretical understanding of these designs is still conspicuously lacking.
1 code implementation • NeurIPS 2021 • Jingyu Zhao, Yanwen Fang, Guodong Li
This paper introduces a concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer.
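As a toy illustration of the layer-aggregation idea (the function name and the choice of averaging are assumptions for exposition, not the paper's formulation), previous layers' features can be summarized into a single representation that the current layer consumes alongside its own input; DenseNet-style concatenation is another instance of the same idea.

```python
import numpy as np

def aggregate_layers(prev_feats):
    """Toy layer-aggregation step: reuse features from all previous
    layers by averaging them into one summary vector for the current
    layer to consume together with its direct input."""
    return np.mean(np.stack(prev_feats), axis=0)
```

The current layer would then operate on, e.g., the concatenation of its direct input with this summary, rather than on the direct input alone.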
no code implementations • 1 Jan 2021 • Feiqing Huang, Yuefeng Si, Guodong Li
Many designs have recently been proposed to improve the model efficiency of convolutional neural networks (CNNs) at a fixed resource budget, while there is a lack of theoretical analysis to justify them.
1 code implementation • ICML 2020 • Jingyu Zhao, Feiqing Huang, Jia Lv, Yanjie Duan, Zhen Qin, Guodong Li, Guangjian Tian
The LSTM network was proposed to overcome the difficulty of learning long-term dependencies, and has achieved significant advances in applications.
no code implementations • 6 Sep 2019 • Di Wang, Feiqing Huang, Jingyu Zhao, Guodong Li, Guangjian Tian
Autoregressive networks can achieve promising performance in many sequence modeling tasks with short-range dependence.