Learning in high dimensions with projected linear discriminants
[Abstract] The enormous power of modern computers has made possible the statistical modelling of data with dimensionality that would have made this task inconceivable only decades ago. However, experience in such modelling has made researchers aware of many issues associated with working in high-dimensional domains, collectively known as 'the curse of dimensionality', which can confound practitioners' desires to build good models of the world from these data. When the dimensionality is very large, low-dimensional methods and geometric intuition both break down in these high-dimensional spaces. To mitigate the dimensionality curse we can use low-dimensional representations of the original data that capture most of the information they contained. However, little is currently known about the effect of such dimensionality reduction on classifier performance. In this thesis we develop theory quantifying the effect of random projection (a recent, very promising, non-adaptive dimensionality reduction technique) on the classification performance of Fisher's Linear Discriminant (FLD), a successful and widely-used linear classifier. We tackle the issues associated with small sample size and high dimensionality by using randomly projected FLD ensembles, and we develop theory explaining why our new approach performs well. Finally, we quantify the generalization error of Kernel FLD, a related non-linear projected classifier.
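To make the idea concrete, the following is a minimal sketch of a randomly projected FLD ensemble of the kind the abstract describes: each ensemble member trains Fisher's Linear Discriminant on data passed through an independent random Gaussian projection, and the members' decision scores are averaged. The class name, parameters (n_members, proj_dim), and the Gaussian projection scaling are illustrative assumptions, not details taken from the thesis.

```python
# Illustrative sketch (not the thesis's implementation) of a randomly
# projected FLD ensemble: fit FLD on several independent random
# projections of the data and average the decision scores.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


class RPFLDEnsemble:
    """Ensemble of FLDs, each trained on a random Gaussian projection."""

    def __init__(self, n_members=10, proj_dim=20, random_state=0):
        self.n_members = n_members      # number of ensemble members (assumed)
        self.proj_dim = proj_dim        # projected dimensionality (assumed)
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        d = X.shape[1]
        self.members_ = []
        for _ in range(self.n_members):
            # Random projection matrix with i.i.d. Gaussian entries,
            # scaled so projected lengths are roughly preserved.
            R = self.rng.normal(size=(d, self.proj_dim)) / np.sqrt(self.proj_dim)
            clf = LinearDiscriminantAnalysis()
            clf.fit(X @ R, y)
            self.members_.append((R, clf))
        return self

    def predict(self, X):
        # Average the members' decision scores, then threshold at zero
        # (binary classification with labels {0, 1} assumed).
        scores = np.mean(
            [clf.decision_function(X @ R) for R, clf in self.members_], axis=0
        )
        return (scores > 0).astype(int)
```

Averaging the members' scores rather than their hard votes is one reasonable aggregation choice here; the thesis's own analysis of why such ensembles perform well should be consulted for the exact construction studied.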
[Publication Date]  [Institution] University: University of Birmingham; Department: School of Computer Science
[Keywords] Q Science; QA Mathematics; QA75 Electronic computers. Computer science