Joint-Label Learning by Dual Augmentation for Time Series Classification
DOI:
https://doi.org/10.1609/aaai.v35i10.17071
Keywords:
Time-Series/Data Streams
Abstract
Recently, deep neural networks (DNNs) have achieved excellent performance on time series classification. However, DNNs require large amounts of labeled data for supervised training. Although data augmentation can alleviate this problem, the standard approach assigns the same label to all augmented samples from the same source. This expands the data distribution, which can make the classification boundaries even harder to determine. In this paper, we propose Joint-label learning by Dual Augmentation (JobDA), which enriches the training samples without expanding the distribution of the original data. Instead, we apply simple transformations to the time series and give the transformed series new labels, so that the model must distinguish them from the original data while also separating the original classes. This approach sharpens the boundaries around the original time series and results in superior classification performance. We use Time Series Warping for our transformations: we shrink and stretch different regions of the original time series, like a fun-house mirror. Experiments conducted on a wide range of time-series datasets show that JobDA improves model performance on small datasets. Moreover, we verify that JobDA generalizes better than conventional data augmentation, and visualization analysis further demonstrates that JobDA learns more compact clusters.
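The sketch below illustrates the two ingredients the abstract describes: a warping transformation that shrinks and stretches random regions of a series, and a joint-label scheme in which warped copies receive new class labels. It is a minimal illustration only; the helper names (time_warp, build_joint_label_set), the segment/scale parameters, and the specific "2K labels" scheme (a warped copy of class c labeled c + K) are assumptions for illustration, not the paper's confirmed formulation.

```python
import numpy as np

def time_warp(x, n_segments=4, scale_range=(0.5, 2.0), rng=None):
    """Warp a 1-D series by stretching/shrinking random segments,
    then resample back to the original length (fun-house mirror effect)."""
    rng = np.random.default_rng() if rng is None else rng
    T = len(x)
    # Split the series into contiguous segments, each with a random speed factor.
    bounds = np.linspace(0, T, n_segments + 1).astype(int)
    scales = rng.uniform(scale_range[0], scale_range[1], size=n_segments)
    warped_parts = []
    for (lo, hi), s in zip(zip(bounds[:-1], bounds[1:]), scales):
        seg = x[lo:hi]
        new_len = max(2, int(round(len(seg) * s)))
        # Linear interpolation stretches (s > 1) or shrinks (s < 1) the segment.
        warped_parts.append(np.interp(np.linspace(0, len(seg) - 1, new_len),
                                      np.arange(len(seg)), seg))
    warped = np.concatenate(warped_parts)
    # Resample the warped series back to the original length T.
    return np.interp(np.linspace(0, len(warped) - 1, T),
                     np.arange(len(warped)), warped)

def build_joint_label_set(X, y, num_classes, rng=None):
    """Return original samples plus warped copies. A warped sample of class c
    gets the hypothetical joint label c + num_classes, so a classifier trained
    on the result must separate the original classes AND original-vs-warped."""
    X_warp = np.stack([time_warp(x, rng=rng) for x in X])
    X_joint = np.concatenate([X, X_warp])
    y_joint = np.concatenate([y, y + num_classes])  # 2 * num_classes labels in total
    return X_joint, y_joint
```

Under this assumed scheme, the classifier head would output 2K logits during training; at test time one plausible choice is to keep only the first K logits (or map predictions back with modulo K), since only the original classes matter for evaluation.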
Published
2021-05-18
How to Cite
Ma, Q., Zheng, Z., Zheng, J., Li, S., Zhuang, W., & Cottrell, G. W. (2021). Joint-Label Learning by Dual Augmentation for Time Series Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 8847-8855. https://doi.org/10.1609/aaai.v35i10.17071
Issue
Vol. 35 No. 10 (2021)
Section
AAAI Technical Track on Machine Learning III