In this paper, we develop TenDeC++, a new C++ library for tensor decompositions. TenDeC++ supports popular tensor decompositions, including Canonical Polyadic, Tucker, tensor-train, and t-SVD, helping C++ programmers shorten the development cycle of deep learning applications. Compared with resource-intensive Python and MATLAB, C++ has natural advantages in running speed and compatibility. To further exploit the potential of C++, we propose a novel underlying technique, PointerDeformer, which leverages the unique pointer features of C++. Since transformations between a tensor and matrices of specific sizes are indispensable in tensor decompositions, PointerDeformer performs such transformations virtually by controlling how pointers move over memory addresses. As a result, the conventional transformation steps can be skipped to accelerate decomposition, and no extra memory is needed to store the intermediate results of tensor transformation. In our experiments, TenDeC++ reduces decomposition time and supports larger tensors compared with the classic TensorLy in Python and TensorLab in MATLAB, respectively.
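The abstract does not show code, but the core idea behind the pointer-based transformation can be illustrated with a minimal sketch. The following C++ example is hypothetical and is not the TenDeC++ API; all names (Tensor3, MatrixView, unfoldMode1) are invented for illustration. It shows how a contiguous tensor buffer can be reinterpreted as a matrix purely through pointer and shape bookkeeping, with no element copies and no intermediate storage, which is the kind of zero-copy unfolding the abstract describes.

    #include <cstddef>
    #include <iostream>
    #include <vector>

    // Illustrative sketch only (not the TenDeC++ API): view the same memory
    // as a matrix by changing only the pointer and the interpreted shape.
    struct MatrixView {
        double* data;           // points into the tensor's buffer; no copy is made
        std::size_t rows, cols;
        double& operator()(std::size_t r, std::size_t c) { return data[r * cols + c]; }
    };

    struct Tensor3 {
        std::vector<double> buf;  // contiguous storage, index order (i, j, k)
        std::size_t I, J, K;
        Tensor3(std::size_t I, std::size_t J, std::size_t K)
            : buf(I * J * K), I(I), J(J), K(K) {}

        // Mode-1-style unfolding as an I x (J*K) view of the row-major buffer:
        // only the pointer and shape metadata change; the data never moves.
        MatrixView unfoldMode1() { return MatrixView{buf.data(), I, J * K}; }
    };

    int main() {
        Tensor3 t(2, 3, 4);
        for (std::size_t n = 0; n < t.buf.size(); ++n) t.buf[n] = double(n);

        MatrixView m = t.unfoldMode1();   // no allocation, no element copy
        std::cout << m(1, 5) << "\n";     // same memory as tensor element (1, 1, 1)
        return 0;
    }

In a decomposition routine, such a view can be handed directly to a matrix factorization step, so the explicit matricization and its temporary buffer are skipped entirely, which is consistent with the time and memory savings claimed in the abstract.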
[1] Maja Pantic et al., "TensorLy: Tensor Learning in Python," J. Mach. Learn. Res., 2016.
[2] Tamara G. Kolda et al., "Tensor Decompositions and Applications," SIAM Rev., 2009.
[3] Nikos D. Sidiropoulos et al., "Tensor Decomposition for Signal Processing and Machine Learning," IEEE Transactions on Signal Processing, 2016.
[4] Endong Wang et al., "Intel Math Kernel Library," 2014.
[5] Shoaib Kamil et al., "The Tensor Algebra Compiler," Proc. ACM Program. Lang., 2017.
[6] Steven G. Johnson et al., "The Design and Implementation of FFTW3," Proceedings of the IEEE, 2005.