no code implementations • 4 Jan 2023 • Peiwang Tang, Xianchao Zhang
The Transformer architecture yields state-of-the-art results in many tasks in natural language processing (NLP) and computer vision (CV), owing to its ability to efficiently capture precise long-range dependencies between elements of the input sequence.
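The long-range coupling the abstract refers to comes from self-attention, where every position directly attends to every other position. A minimal NumPy sketch of scaled dot-product attention (identity projections stand in for the learned query/key/value matrices, which is an assumption for brevity):

```python
import numpy as np

def scaled_dot_product_attention(x, d_k):
    """Toy self-attention; q, k, v use identity projections for illustration."""
    q, k, v = x, x, x
    scores = q @ k.T / np.sqrt(d_k)                  # (T, T) pairwise coupling
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over positions
    return weights @ v, weights

# A toy sequence of 6 timesteps with 4 features each.
x = np.random.default_rng(0).normal(size=(6, 4))
out, w = scaled_dot_product_attention(x, d_k=4)

# The coupling between step 0 and step 5 is a single weight w[0, 5],
# regardless of the distance between them in the sequence.
```

Because each output is a weighted sum over all timesteps, the path length between any two positions is constant, unlike in recurrent models where it grows with the distance.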
no code implementations • 4 Oct 2022 • Peiwang Tang, Xianchao Zhang
Large-scale self-supervised pre-trained Transformer architectures have significantly boosted performance on various tasks in natural language processing (NLP) and computer vision (CV).
Multivariate Time Series Forecasting • Self-Supervised Learning • +1
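A common self-supervised objective in this setting is masked reconstruction: hide part of the input and train the model to recover it from the visible context. A minimal sketch for a multivariate series, where a mean predictor stands in for the actual Transformer encoder (that substitution is an assumption made purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy multivariate series: 24 timesteps, 3 channels.
series = rng.normal(size=(24, 3))

# Hide 10 randomly chosen timesteps from the "encoder".
mask = np.zeros(24, dtype=bool)
mask[rng.choice(24, size=10, replace=False)] = True

visible = series.copy()
visible[mask] = 0.0   # simple zero-fill placeholder for masked positions

# Trivial stand-in "model": predict each masked timestep as the mean of
# the visible timesteps (a real method would use a Transformer here).
prediction = np.tile(visible[~mask].mean(axis=0), (mask.sum(), 1))

# Self-supervised loss: MSE computed on the masked positions only.
mse = np.mean((prediction - series[mask]) ** 2)
```

No labels are needed: the series itself supplies the reconstruction targets, which is what makes the objective self-supervised.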
no code implementations • 5 Sep 2022 • Peiwang Tang, Xianchao Zhang
First, complex features are extracted according to the irregular patterns of different events.