Journal of Nanjing University (Natural Science) ›› 2015, Vol. 51 ›› Issue (2): 390–404.


Tensor tree learning method

Lu Mei1,2, Li Fanzhang1

  • Online: 2015-03-06  Published: 2015-03-06
  • About author: 1. College of Computer Science and Technology, Soochow University, Suzhou, 215006, China; 2. College of Computer Science and Technology, Jiangsu Normal University, Xuzhou, 221116, China
  • Supported by:
    National Natural Science Foundation of China (61033013, 61272297, 61402207)

Abstract: Based on tensor geometry theory and the one-, two-, and three-dimensional cognitive modes of human visual perception, this paper proposes the Tensor Tree Learning (TTL) method. Its main contributions are fourfold: the basic concepts of tensor tree learning, the TTL algorithm, a tensor-tree-based Tucker decomposition learning algorithm, and a tensor-tree-based CP decomposition learning algorithm. The minimum height of an n-order tensor tree is also derived. The proposed method is evaluated on the COIL100, COIL20, and ORL databases as well as a database created in our lab; the results demonstrate the effectiveness and rationality of the proposed TTL method.
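The abstract mentions tensor trees, their height, and Tucker/CP decompositions without giving algorithmic detail, so the two sketches below are illustrative only. The first arranges the mode indices of an n-order tensor in a balanced binary "dimension tree"; under this assumed construction (not necessarily the paper's) the tree height works out to ⌈log2 n⌉. The class TensorTreeNode and the function build_dimension_tree are hypothetical names introduced here for illustration.

```python
# A minimal sketch of a balanced "dimension tree" over the modes of an n-order
# tensor. This is an assumed illustrative construction, not the tensor tree
# defined in the paper.
import math

class TensorTreeNode:
    """Hypothetical node type: records the tensor modes covered by this node."""
    def __init__(self, modes, left=None, right=None):
        self.modes = tuple(modes)   # mode indices covered by this node
        self.left = left            # subtree over the first half of the modes
        self.right = right          # subtree over the second half of the modes

def build_dimension_tree(modes):
    """Recursively halve the mode list until every leaf holds a single mode."""
    if len(modes) <= 1:
        return TensorTreeNode(modes)
    mid = len(modes) // 2
    return TensorTreeNode(modes,
                          left=build_dimension_tree(modes[:mid]),
                          right=build_dimension_tree(modes[mid:]))

def height(node):
    """Tree height, counting a lone leaf as height 0."""
    if node.left is None and node.right is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

if __name__ == "__main__":
    n = 8  # e.g. an 8th-order tensor
    root = build_dimension_tree(list(range(n)))
    print(height(root), math.ceil(math.log2(n)))  # prints: 3 3
```

The second sketch shows the standard CP (PARAFAC) model referred to in the abstract, fitted by plain alternating least squares on a third-order tensor. It is a generic textbook routine using NumPy and SciPy, not the tensor-tree-based CP learning algorithm proposed in the paper, and the function name cp_als is chosen here only for illustration.

```python
# Generic CP (PARAFAC) decomposition of a 3rd-order tensor by alternating least
# squares: x[i,j,k] ~= sum_r A[i,r] * B[j,r] * C[k,r]. Illustration only; this
# is not the paper's tensor-tree-based variant.
import numpy as np
from scipy.linalg import khatri_rao

def cp_als(X, rank, n_iter=200):
    I, J, K = X.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode unfoldings consistent with C-order reshapes.
    X1 = X.reshape(I, J * K)                     # X1[i, j*K + k] = X[i, j, k]
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)  # X2[j, i*K + k] = X[i, j, k]
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)  # X3[k, i*J + j] = X[i, j, k]
    for _ in range(n_iter):
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

if __name__ == "__main__":
    # Fit a random exact rank-3 tensor; the relative error is typically tiny.
    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.standard_normal((d, 3)) for d in (10, 12, 14))
    X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
    A, B, C = cp_als(X, rank=3)
    X_hat = np.einsum("ir,jr,kr->ijk", A, B, C)
    print("relative error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```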
