prajunaa/CNNMoreVsLess

CNNMoreVsLess

This is a personal research project I have completed. I am currently writing up the work as a research paper, which I hope to eventually publish in a journal.

Abstract: Researchers commonly attribute the performance degradation of Convolutional Neural Networks to insufficient sample size. Data augmentation lessens this decline by transforming existing images to reinforce relevant visual patterns, yet its effectiveness is usually measured only through final accuracy, leaving its impact on a model's internal learning process poorly understood. Building on this, we examined how augmentations affect a model's interclass separation and intraclass consistency and compared these results to those of models trained on larger, unaugmented datasets in order to isolate the effect of each property. We hypothesized that augmentations and larger sample sizes would both increase interclass separation, with sample size having the stronger effect, while augmentations would reduce the variance of intraclass consistency more than larger training sets would. Using pre-trained ResNet18 models, we visualized global average pooling (avgpool) layer activations with PCA plots. Compared with models trained on larger data samples, models trained with augmentations and less data showed small, inconsistent changes in interclass separation but consistent and significantly larger reductions in intraclass variance. These findings emphasize the role of representation stability in low-data learning and demonstrate the value of internal analysis for understanding the impact of data properties.
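The analysis pipeline described in the abstract (avgpool activations → PCA projection → interclass/intraclass measurements) can be sketched as below. This is a minimal NumPy illustration, not the project's actual code: the specific metric definitions here (mean pairwise distance between class centroids for interclass separation, mean distance of samples to their own class centroid for intraclass consistency) are assumptions chosen for simplicity, and the `features` matrix stands in for avgpool activations extracted from a ResNet18.

```python
import numpy as np

def class_metrics(features, labels):
    """Simple proxies for interclass separation and intraclass consistency.

    features: (N, D) array, e.g. flattened avgpool activations.
    labels:   (N,) array of integer class labels.
    Returns (interclass_separation, intraclass_spread); a lower spread
    means more consistent within-class representations.
    """
    classes = np.unique(labels)
    centroids = np.stack([features[labels == c].mean(axis=0) for c in classes])

    # Interclass separation: mean pairwise distance between class centroids.
    diffs = centroids[:, None, :] - centroids[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    inter = dists[np.triu_indices(len(classes), k=1)].mean()

    # Intraclass spread: mean distance of each sample to its class centroid.
    intra = np.mean([
        np.linalg.norm(features[labels == c] - centroids[i], axis=1).mean()
        for i, c in enumerate(classes)
    ])
    return inter, intra

def pca_2d(features):
    """Project features onto their first two principal components via SVD."""
    centered = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T
```

Given two well-separated synthetic classes, `class_metrics` returns a large interclass value and a small intraclass value, and `pca_2d` yields the 2D coordinates that would be plotted in the PCA graphs.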
