1 code implementation • 2 May 2024 • Shengsheng Lin, Weiwei Lin, Wentai Wu, Haojun Chen, Junjie Yang
This paper introduces SparseTSF, a novel, extremely lightweight model for Long-term Time Series Forecasting (LTSF), designed to address the challenges of modeling complex temporal dependencies over extended horizons with minimal computational resources.
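The core idea behind SparseTSF is cross-period sparse forecasting: downsample the input window by an assumed dominant period so that one lightweight linear map, shared across all phases of the period, produces the forecast. The sketch below is a minimal illustration of that downsample–predict–upsample pattern, not the authors' released implementation; the function name and the weight matrix `W` are hypothetical.

```python
import numpy as np

def sparse_forecast(x, period, horizon, W):
    """Illustrative cross-period sparse forecast (hypothetical helper).

    x       : (L,) input window, with L divisible by `period`
    period  : assumed dominant periodicity w
    horizon : forecast length H, divisible by `period`
    W       : shared weight matrix of shape (L//period, H//period)
    """
    L = x.shape[0]
    # Downsample: group points by phase of the period -> (period, L//period)
    sub = x.reshape(L // period, period).T
    # One linear map is shared across all `period` phase subsequences
    out = sub @ W                       # (period, H//period)
    # Upsample: interleave the phases back into a length-H forecast
    return out.T.reshape(horizon)

# Toy usage: L=8, period=2, so each phase contributes 4 history points
x = np.arange(8, dtype=float)
W = np.eye(4, 2)                        # maps 4 history points -> 2 future points
y = sparse_forecast(x, period=2, horizon=4, W=W)
```

Because the only learnable parameters live in the single `(L/w, H/w)` matrix, the parameter count is smaller than a full `L -> H` linear layer by roughly a factor of the period squared, which is where the "extremely lightweight" claim comes from.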
no code implementations • 1 Mar 2024 • Weiwei Lin, Chenhang He, Man-Wai Mak, Jiachen Lian, Kong Aik Lee
This forces the model to learn a speaker distribution disentangled from the semantic content.
no code implementations • 8 Sep 2023 • Chong-Xin Gan, Man-Wai Mak, Weiwei Lin, Jen-Tzung Chien
Contrastive self-supervised learning (CSL) for speaker verification (SV) has drawn increasing interest recently due to its ability to exploit unlabeled data.
2 code implementations • 22 Aug 2023 • Shengsheng Lin, Weiwei Lin, Wentai Wu, Feiyu Zhao, Ruichao Mo, Haotong Zhang
To address these issues, we propose two novel strategies to reduce the number of iterations in RNNs for LTSF tasks: Segment-wise Iterations and Parallel Multi-step Forecasting (PMF).
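The two strategies can be pictured as follows: Segment-wise Iterations feeds the RNN one segment at a time (so a length-L window costs L/s recurrent steps instead of L), and PMF decodes all future segments from the final hidden state at once instead of autoregressively. The toy below sketches that control flow under simplifying assumptions (a plain tanh recurrence instead of a GRU, and a hypothetical positional embedding `pos_emb` to distinguish the parallel outputs); it is not the paper's architecture.

```python
import numpy as np

def segment_rnn_forecast(x, seg_len, horizon, Wx, Wh, Wo, pos_emb):
    """Toy sketch of segment-wise iteration + parallel multi-step decoding.

    x       : (L,) input window, L divisible by seg_len
    Wx      : (hidden, seg_len) input projection
    Wh      : (hidden, hidden) recurrent weights
    Wo      : (seg_len, hidden) output projection
    pos_emb : (horizon//seg_len, hidden) per-output-segment embeddings
    """
    segs = x.reshape(-1, seg_len)                 # (L//seg_len, seg_len)
    h = np.zeros(Wh.shape[0])
    # Segment-wise Iterations: only L//seg_len recurrent steps, not L
    for s in segs:
        h = np.tanh(Wx @ s + Wh @ h)
    # Parallel Multi-step Forecasting: every future segment is decoded
    # from the same final state, so no sequential error accumulation
    n_out = horizon // seg_len
    outs = [Wo @ np.tanh(h + pos_emb[i]) for i in range(n_out)]
    return np.concatenate(outs)                   # (horizon,)

rng = np.random.default_rng(0)
Wx = rng.normal(size=(4, 2))
Wh = rng.normal(size=(4, 4))
Wo = rng.normal(size=(2, 4))
pos = rng.normal(size=(2, 4))
yhat = segment_rnn_forecast(np.arange(8.0), seg_len=2, horizon=4,
                            Wx=Wx, Wh=Wh, Wo=Wo, pos_emb=pos)
```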
Ranked #1 on Time Series Forecasting on Weather (720)
no code implementations • 9 Aug 2023 • Shengsheng Lin, Weiwei Lin, Wentai Wu, SongBo Wang, Yongxiang Wang
The superiority of the Transformer for long-term time series forecasting (LTSF) has recently been challenged, with work showing that simple models can outperform numerous Transformer-based approaches.
no code implementations • 14 May 2023 • Weiwei Lin, Chenhang He, Man-Wai Mak, Youzhi Tu
Self-supervised learning (SSL) speech models such as wav2vec and HuBERT have demonstrated state-of-the-art performance on automatic speech recognition (ASR) and have proved extremely useful in settings with scarce labeled data.
Automatic Speech Recognition (ASR) +2
1 code implementation • 21 Feb 2023 • Tiansheng Huang, Li Shen, Yan Sun, Weiwei Lin, DaCheng Tao
Personalized federated learning, as a variant of federated learning, trains customized models for clients using their heterogeneously distributed data.
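One common way personalization is realized on top of federated averaging is to fine-tune the aggregated global model locally on each client's own data. The sketch below shows that generic recipe on a toy 1-D linear regression problem; it is one standard PFL baseline, not necessarily this paper's method, and all names (`fedavg_personalized`, the hyperparameters) are illustrative.

```python
import numpy as np

def fedavg_personalized(client_data, rounds=3, lr=0.5, ft_steps=10):
    """FedAvg with per-client local fine-tuning (a common PFL baseline).

    Each client fits y = w*x on its own (x, y) data by local SGD; the
    server averages the clients' weights each round; finally, each
    client fine-tunes the shared weight to get a personalized model.
    """
    def local_sgd(w, x, y):
        for _ in range(ft_steps):
            grad = np.mean(2.0 * x * (w * x - y))   # d/dw of MSE
            w -= lr * grad
        return w

    w_global = 0.0
    for _ in range(rounds):
        local = [local_sgd(w_global, x, y) for x, y in client_data]
        w_global = float(np.mean(local))             # FedAvg aggregation
    # Personalization step: adapt the shared model to each client
    personalized = [local_sgd(w_global, x, y) for x, y in client_data]
    return w_global, personalized

# Two clients with heterogeneous data: true slopes 1 and 3
x = np.linspace(0.0, 1.0, 20)
clients = [(x, 1.0 * x), (x, 3.0 * x)]
w_global, w_pers = fedavg_personalized(clients)
```

On this toy problem the global weight settles near the average slope (2), while the personalized weights recover each client's own slope, which is exactly the gap PFL aims to close under heterogeneous data.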
no code implementations • 27 Jan 2022 • Tiansheng Huang, Shiwei Liu, Li Shen, Fengxiang He, Weiwei Lin, DaCheng Tao
To counter this issue, personalized FL (PFL) was proposed to produce dedicated local models for each individual user.
no code implementations • 29 Sep 2021 • Tiansheng Huang, Shiwei Liu, Li Shen, Fengxiang He, Weiwei Lin, DaCheng Tao
Federated learning (FL) is particularly vulnerable to heterogeneously distributed data, since a common global model in FL may not adapt to the heterogeneous data distribution of each user.
no code implementations • 14 Sep 2021 • Wentai Wu, Ligang He, Weiwei Lin
Both classification and regression tasks are susceptible to the biased distribution of training data.
no code implementations • 10 Feb 2021 • Tiansheng Huang, Weiwei Lin, Xiaobin Hong, Xiumin Wang, Qingbo Wu, Rui Li, Ching-Hsien Hsu, Albert Y. Zomaya
With astonishing speed, bandwidth, and scale, Mobile Edge Computing (MEC) has played an increasingly important role in the next generation of connectivity and service delivery.
1 code implementation • 2 Feb 2021 • Wentai Wu, Ligang He, Weiwei Lin, Carsten Maple
The results show that the selective behaviour of our algorithm leads to a significant reduction in the number of communication rounds and the time (up to 2.4x speedup) for the global model to converge, while also providing an accuracy gain.
no code implementations • 17 Nov 2020 • Tiansheng Huang, Weiwei Lin, Li Shen, Keqin Li, Albert Y. Zomaya
Federated Learning (FL), arising as a privacy-preserving machine learning paradigm, has received notable attention from the public.
no code implementations • 3 Nov 2020 • Tiansheng Huang, Weiwei Lin, Wentai Wu, Ligang He, Keqin Li, Albert Y. Zomaya
The client selection policy is critical to an FL process in terms of training efficiency, the quality of the final model, and fairness.
no code implementations • 28 Jul 2020 • Wentai Wu, Ligang He, Weiwei Lin, Rui Mao
In this paper, a multi-layer federated learning protocol called HybridFL is designed for the MEC architecture.
no code implementations • 3 Oct 2019 • Wentai Wu, Ligang He, Weiwei Lin, Rui Mao, Carsten Maple, Stephen Jarvis
Federated learning (FL) has attracted increasing attention as a promising approach to driving a vast number of end devices with artificial intelligence.
no code implementations • 3 Aug 2019 • Wentai Wu, Ligang He, Weiwei Lin, Yi Su, Yuhua Cui, Carsten Maple, Stephen Jarvis
In light of this, we have developed a prediction-driven, unsupervised anomaly detection scheme, whose backbone model combines decomposition and inference of time series data.
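The decompose-then-infer pattern can be illustrated with a deliberately simplified stand-in: estimate the trend component, treat the residual as the prediction error, and flag points whose residual is implausibly large under a robust spread estimate. This is a generic sketch of the idea, not the authors' backbone model, and the function name and thresholds are hypothetical.

```python
import numpy as np

def detect_anomalies(series, window=5, k=3.0):
    """Prediction-driven anomaly flags via decomposition + residual test.

    Decomposition: a centered moving average estimates the trend.
    Inference: points whose residual exceeds k robust standard
    deviations (MAD-based) of the residuals are flagged.
    """
    s = np.asarray(series, dtype=float)
    kernel = np.ones(window) / window
    trend = np.convolve(s, kernel, mode="same")   # trend estimate
    resid = s - trend                              # "prediction error"
    # Median absolute deviation, scaled to be comparable to a std dev
    sigma = np.median(np.abs(resid - np.median(resid))) * 1.4826 + 1e-9
    return np.abs(resid) > k * sigma

# Usage: a clean sinusoid with one injected spike
sig = np.sin(np.linspace(0, 6 * np.pi, 200))
sig[120] += 5.0
flags = detect_anomalies(sig)
```

The MAD-based threshold keeps the spread estimate itself unaffected by the anomalies being hunted, which is why this kind of scheme can run unsupervised.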