Analyzing Effect on Residual Learning by Gradual Narrowing Fully-Connected Layer Width and Implementing Inception Block in Convolution Layer
[Abstract] Research on advancing CNN architectures for computer vision problems focuses on strategically choosing and modifying convolution hyperparameters (kernel size, pooling, etc.). However, these works do not exploit the advantage of employing multiple fully-connected layers after the core schema to obtain further performance improvements, which is identified as the first research gap. Studies have also addressed the challenge of vanishing gradients in deep networks by employing residual learning via skip-connections, and have lowered model training cost by using parallel rather than sequential convolution operations through inception blocks. These studies, however, do not discuss in detail the impact of sparse features on feature learning, which is identified as the second research gap. Diagnosis of infectious patterns in chest X-rays using residual learning is chosen as the problem statement for this study. Results show that the ResNet50 architecture improves accuracy by 0.6218% and reduces the error rate by 2.6326% when gradually narrowing FC layers are employed between the core residual learning schema and the output layer. Independently, implementing inception blocks (GoogLeNet v2) before the skip-connections in the ResNet50 architecture boosts accuracy by 0.961% and lowers the error rate by 4.2438%. These performance improvements were achieved without regularization and thus encourage future work in this direction.
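To make the first modification concrete, the sketch below is a minimal PyTorch illustration (an assumption for this summary, not the authors' released code) of appending gradually narrowing fully-connected layers between a ResNet50 backbone and the output layer. The widths 1024 -> 512 -> 256 and the two-class output are illustrative choices, not values taken from the paper.

# Minimal sketch: ResNet50 core followed by gradually narrowing FC layers.
# Layer widths and num_classes are assumptions for illustration only.
import torch
import torch.nn as nn
from torchvision.models import resnet50

class ResNet50NarrowingFC(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        backbone = resnet50(weights=None)      # core residual learning schema
        in_features = backbone.fc.in_features  # 2048 for ResNet50
        backbone.fc = nn.Identity()            # drop the default single FC head
        self.backbone = backbone
        # Gradually narrowing FC stack: each layer roughly halves the width.
        self.head = nn.Sequential(
            nn.Linear(in_features, 1024), nn.ReLU(inplace=True),
            nn.Linear(1024, 512), nn.ReLU(inplace=True),
            nn.Linear(512, 256), nn.ReLU(inplace=True),
            nn.Linear(256, num_classes),       # output layer
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(x))

if __name__ == "__main__":
    model = ResNet50NarrowingFC(num_classes=2)
    dummy = torch.randn(1, 3, 224, 224)        # e.g. a chest X-ray resized to 224x224
    print(model(dummy).shape)                  # torch.Size([1, 2])

For the second modification, the following sketch (again an assumption, not the paper's implementation) shows an Inception-style block with BatchNorm, in the spirit of GoogLeNet v2, whose parallel branches are concatenated, projected back to the input width, and then added to the input through a skip-connection. Branch widths are illustrative.

# Minimal sketch: Inception-style parallel convolutions placed before a
# skip-connection. Channel counts are assumptions for illustration only.
import torch
import torch.nn as nn

def conv_bn_relu(in_ch, out_ch, kernel_size, padding=0):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class InceptionResidualBlock(nn.Module):
    def __init__(self, channels: int = 256):
        super().__init__()
        branch_ch = channels // 4
        self.branch1 = conv_bn_relu(channels, branch_ch, 1)        # 1x1
        self.branch2 = nn.Sequential(                               # 1x1 -> 3x3
            conv_bn_relu(channels, branch_ch, 1),
            conv_bn_relu(branch_ch, branch_ch, 3, padding=1),
        )
        self.branch3 = nn.Sequential(                               # 1x1 -> two 3x3 (v2 factorization of 5x5)
            conv_bn_relu(channels, branch_ch, 1),
            conv_bn_relu(branch_ch, branch_ch, 3, padding=1),
            conv_bn_relu(branch_ch, branch_ch, 3, padding=1),
        )
        self.branch4 = nn.Sequential(                               # pool -> 1x1
            nn.MaxPool2d(3, stride=1, padding=1),
            conv_bn_relu(channels, branch_ch, 1),
        )
        self.project = nn.Conv2d(branch_ch * 4, channels, 1, bias=False)  # restore width for the shortcut

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = torch.cat([self.branch1(x), self.branch2(x),
                         self.branch3(x), self.branch4(x)], dim=1)
        return torch.relu(self.project(out) + x)                    # skip-connection adds the identity

if __name__ == "__main__":
    block = InceptionResidualBlock(channels=256)
    print(block(torch.randn(1, 256, 56, 56)).shape)                 # torch.Size([1, 256, 56, 56])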
[Subject Classification] Computer Science (General)
[Keywords] Fully-Connected Layer; Neuron Layer Width; ResNet50; Residual Network; Skip-Connections; Inception Blocks