Award Abstract #1352259

CAREER: Theory and Methods for Simultaneous Variable Selection and Rank Reduction

NSF Org: DMS (Division of Mathematical Sciences)
Initial Amendment Date: February 11, 2014
Latest Amendment Date: March 9, 2016
Award Number: 1352259
Award Instrument: Continuing grant
Program Manager: Gabor J. Szekely, DMS Division of Mathematical Sciences, MPS Directorate for Mathematical & Physical Sciences
Start Date: June 1, 2014
End Date: May 31, 2019 (Estimated)
Awarded Amount to Date: $275,341.00
Investigator(s): Yiyuan She, yshe@stat.fsu.edu (Principal Investigator)
Sponsor: Florida State University, 874 Traditions Way, 3rd Floor, Tallahassee, FL 32306-4166, (850) 644-5260
NSF Program(s): STATISTICS, Division Co-Funding: CAREER
Program Reference Code(s): 1045
Program Element Code(s): 1269, 8048

ABSTRACT

The data explosion in all fields of science creates an urgent need for methodologies for analyzing high-dimensional multivariate data. The project deepens and broadens existing sparsity and low-rank statistical theory and methods through the following major scientific contributions: (a) an innovative selectable reduced-rank methodology based on simultaneous variable selection and projection, with a theoretically guaranteed error rate lower than existing variable-selection and rank-reduction rates, paving the way to new frontiers in high-dimensional statistics and information theory; (b) fast yet simple-to-implement algorithms that handle all popular penalty functions (possibly nonconvex) with guaranteed global convergence and local optimality, ensuring the practicality of the proposed approaches in big-data applications; (c) a generic extension to non-Gaussian models that accounts for the correlation between multivariate responses, with a universal algorithm design based on manifold optimization; (d) a unified robustification scheme that can both identify and accommodate the gross outliers occurring frequently in real data, overcoming the non-robustness of many conventional multivariate tools; and (e) general-purpose model selection methods, serving variable selection and/or rank reduction, that achieve the finite-sample optimal prediction error rate with theoretical guarantees.
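
The core of contributions (a) and (b) is fitting a multivariate linear model Y ≈ XB with a coefficient matrix B that is simultaneously row-sparse (variable selection) and low-rank (rank reduction). The sketch below is a minimal, hypothetical illustration of that idea, not the award's published algorithm: it alternates a gradient step with row-wise soft-thresholding (a convex stand-in for the possibly nonconvex penalties mentioned above) and a truncated SVD. All function and variable names (e.g., sparse_reduced_rank_fit, lam) are invented for this example.

```python
import numpy as np

def sparse_reduced_rank_fit(X, Y, rank=2, lam=0.1, step=None, n_iter=200):
    """Illustrative sketch: fit Y ~ X B with B row-sparse and rank-limited.

    Alternates a gradient step on the squared loss with (i) row-wise
    soft-thresholding (group sparsity across predictors) and (ii) truncation
    of B to the leading `rank` singular values. Hypothetical example only.
    """
    n, p = X.shape
    q = Y.shape[1]
    B = np.zeros((p, q))
    if step is None:
        # Step size from the largest squared singular value of X (Lipschitz bound).
        step = 1.0 / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        # Gradient step on 0.5 * ||Y - X B||_F^2.
        G = B - step * (X.T @ (X @ B - Y))
        # Row-wise soft-thresholding: shrink entire rows of the coefficient matrix.
        row_norms = np.linalg.norm(G, axis=1, keepdims=True)
        scale = np.maximum(0.0, 1.0 - step * lam / np.maximum(row_norms, 1e-12))
        G = scale * G
        # Rank reduction: keep only the top `rank` singular values.
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        s[rank:] = 0.0
        B = (U * s) @ Vt
    return B

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, q, r = 100, 20, 8, 2
    # Synthetic truth: only 5 active predictor rows, coefficient matrix of rank r.
    B_true = np.zeros((p, q))
    B_true[:5] = rng.normal(size=(5, r)) @ rng.normal(size=(r, q))
    X = rng.normal(size=(n, p))
    Y = X @ B_true + 0.1 * rng.normal(size=(n, q))
    B_hat = sparse_reduced_rank_fit(X, Y, rank=r, lam=0.5)
    print("selected rows:", np.where(np.linalg.norm(B_hat, axis=1) > 1e-8)[0])
```

The row-wise threshold selects predictors jointly across all responses, while the truncated SVD enforces the reduced rank; in the actual project these two operations are analyzed together rather than as independent heuristics.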

The need to recover low-dimensional signals from high-dimensional, noisy multivariate data permeates all fields of science and engineering. Hence a project of this nature, designed to develop transformative theory and methods for simultaneous variable selection and rank reduction, finds applications in a wide range of disciplines, including machine learning, signal processing, and biostatistics. By cross-fertilizing ideas from statistics, mathematics, engineering, and computer science, the integrated research and education program helps students develop critical thinking through cross-disciplinary training and assists them in becoming life-long learners. The investigator uses the rich topics in this project to inspire interest in learning and discovery among the public and students of all ages. The educational plan consists of course development, student mentoring, outreach, and the recruitment of underrepresented students.


PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH



Yiyuan She, Yuejia He, and Dapeng Wu. "Learning Topology and Dynamics of Large Recurrent Neural Networks," IEEE Transactions on Signal Processing, v.62, 2014, p. 5881-5891.

Yiyuan She, Yuejia He, Shijie Li, and Dapeng Wu. "Joint Association Graph Screening and Decomposition for Large-scale Linear Dynamical Systems," IEEE Transactions on Signal Processing, v.63, 2015, p. 389-401.

Pratik Brahma, Dapeng Wu, and Yiyuan She. "Why Deep Learning Works: A Manifold Disentanglement Perspective," IEEE Transactions on Neural Networks and Learning Systems, 2015, p. 1-12.

Xin Shi, Chao Zhang, Fangyun Wei, Hongyang Zhang, Yong Luo, and Yiyuan She. "Manifold-Regularized Selectable Factor Extraction for Semi-supervised Image Classification," Proceedings of the British Machine Vision Conference, Swansea, UK, 2015.

 
