Time: Monday, June 27, 2016, 10:00 am–12:00 noon
Speaker: Prof. Sun-Yuan Kung
In the Internet era, we experience a phenomenon of "digital everything". Because of its quantitative (volume and velocity) and qualitative (variety) challenges, it is imperative to address the computational aspects of big data. In particular, the curse of high feature dimensionality raises grave concerns about computational complexity and over-training. In this talk, we shall explore projection methods for dimension reduction, a prelude to the visualization of vectorial and non-vectorial data. A popular visualization tool for unsupervised learning is Principal Component Analysis (PCA), which aims at the best recoverability of the original data in the Euclidean Vector Space (EVS). We shall propose a supervised counterpart of PCA, called Discriminant Component Analysis (DCA), formulated in a Canonical Vector Space (CVS). Simulations confirm that DCA far outperforms PCA, both numerically and visually.
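The contrast between unsupervised and supervised projection can be illustrated with a short sketch. The code below computes a standard PCA projection and, as a classical supervised stand-in, a Fisher LDA projection; DCA itself is the speaker's method, defined in a canonical vector space, and is not reproduced here. All function names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def pca_project(X, k):
    """Unsupervised: project onto the k directions of maximum variance
    (best recoverability of the original data)."""
    Xc = X - X.mean(axis=0)                      # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                         # top-k right singular vectors

def lda_project(X, y, k):
    """Supervised: Fisher discriminant directions that separate the classes.
    (A classical comparison point only, not the DCA of the talk.)"""
    Xc = X - X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))                        # within-class scatter
    Sb = np.zeros((d, d))                        # between-class scatter
    for c in np.unique(y):
        Xk = X[y == c]
        dk = Xk.mean(axis=0) - X.mean(axis=0)
        Sw += (Xk - Xk.mean(axis=0)).T @ (Xk - Xk.mean(axis=0))
        Sb += len(Xk) * np.outer(dk, dk)
    # Generalized eigenproblem Sb w = lambda * Sw w; small ridge for stability.
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(vals.real)[::-1]
    return Xc @ vecs[:, order[:k]].real

# Two Gaussian classes in 5 dimensions (synthetic illustration).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(2, 1, (50, 5))])
y = np.array([0] * 50 + [1] * 50)
print(pca_project(X, 2).shape, lda_project(X, y, 1).shape)
```

PCA picks directions without consulting the labels, whereas the supervised projection exploits class membership; this is the gap that a supervised component analysis such as DCA aims to close.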
In addition, "big data" often carries a connotation of "big brother": the data collected on consumers like us is growing exponentially, and attacks on our privacy are becoming a real threat. New technologies are needed to better protect our privacy when we upload personal data to the cloud. To this end, we shall explore joint optimization over three design spaces: (a) Feature Space, (b) Classification Space, and (c) Privacy Space. This prompts a new paradigm, called Compressive Privacy (CP), for information systems that simultaneously perform:
■ Utility Space Maximization: deliver intended data mining, classification, and learning tasks.
■ Privacy Space Minimization: safeguard personal/private information.
DCA is promising for privacy protection when personal data are shared in the cloud in collaborative learning environments. Furthermore, DCA may be extended to DUCA, whose derivation draws on several theoretical foundations and their associated privacy-relevant applications, including estimation and classification, information theory, statistical analysis, and subspace optimization.
S.Y. Kung, a Life Fellow of IEEE, is a Professor in the Department of Electrical Engineering at Princeton University. His research areas include machine learning, data mining and analysis, statistical estimation, system identification, wireless communication, VLSI array processors, genomic signal processing, and multimedia information processing. He was a founding member of several Technical Committees (TC) of the IEEE Signal Processing Society, and was appointed the first Associate Editor in the VLSI Area (1984) and later the first Associate Editor in Neural Networks (1991) for the IEEE Transactions on Signal Processing. He served on the Board of Governors of the IEEE Signal Processing Society (1989-1991). Since 1990, he has been the Editor-in-Chief of the Journal of VLSI Signal Processing Systems. He was the recipient of the IEEE Signal Processing Society's Technical Achievement Award for his contributions to "parallel processing and neural network algorithms for signal processing" (1992); a Distinguished Lecturer of the IEEE Signal Processing Society (1994); a recipient of the IEEE Signal Processing Society's Best Paper Award for his publication on principal component neural networks (1996); and a recipient of the IEEE Third Millennium Medal (2000). He has authored and co-authored more than 500 technical publications and numerous textbooks, including "VLSI Array Processors", Prentice-Hall (1988); "Digital Neural Networks", Prentice-Hall (1993); "Principal Component Neural Networks", John Wiley (1996); "Biometric Authentication: A Machine Learning Approach", Prentice-Hall (2004); and "Kernel Methods and Machine Learning", Cambridge University Press (2014).