Accession Number:

AD1098053

Title:

Training Convolutional Neural Network Classifiers Using Simultaneous Scaled Supercomputing

Descriptive Note:

Technical Report, 28 Feb 2020 to 28 Feb 2020

Corporate Author:

University of Dayton, Dayton, United States

Personal Author(s):

Report Date:

2020-05-07

Pagination or Media Count:

64

Abstract:

Convolutional neural networks (CNNs) are revolutionizing and improving today's technological landscape at a remarkable rate. Yet even in their success, creating optimally trained networks depends on expensive empirical processing to generate the best results. They require powerful processors, expansive datasets, days of training time, and hundreds of training instances across a range of hyperparameters to identify optimal results. These requirements can be difficult to access for the typical CNN technologist and are ultimately wasteful of resources, since only the most optimal model will be utilized. To overcome these challenges and create a foundation for the next generation of CNN technologists, a three-stage solution is proposed: (1) to cultivate a new dataset containing millions of domain-specific, annotated aerial images; (2) to design a flexible experiment-generator framework which is easy to use, can operate on the fastest supercomputers in the world, and can simultaneously train hundreds of unique CNN networks; and (3) to establish benchmarks of accuracies and optimal training hyperparameters. An aerial imagery database is presented which contains 260 newly cultivated datasets, features tens of millions of annotated image chips, and provides several distinct vehicular classes. Accompanying the database, a CNN-training framework is presented which can generate hundreds of CNN experiments with extensively customizable input parameters.
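
As a rough illustration of the experiment-generator idea described in the abstract (the report does not specify its implementation; the hyperparameter names, values, and function below are hypothetical assumptions), a sweep over a small hyperparameter grid could be expanded into independent training configurations, each of which would then be dispatched as a separate training job:

# Hypothetical sketch of a hyperparameter experiment generator; the actual
# framework, parameter names, and ranges used in the report may differ.
from itertools import product

# Assumed example hyperparameter ranges (not taken from the report).
grid = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "batch_size": [32, 64, 128],
    "optimizer": ["sgd", "adam"],
}

def generate_experiments(grid):
    """Expand a hyperparameter grid into one configuration dict per experiment."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

if __name__ == "__main__":
    experiments = list(generate_experiments(grid))
    # Each configuration could be launched as an independent CNN training run,
    # e.g. one per compute node on an HPC system, allowing hundreds of unique
    # networks to train simultaneously.
    for i, cfg in enumerate(experiments):
        print(f"experiment {i:03d}: {cfg}")

This grid-expansion pattern is only one plausible way such a framework might enumerate its experiments; the report's actual parameterization is described in the full document.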

Subject Categories:

  • Computer Systems

Distribution Statement:

APPROVED FOR PUBLIC RELEASE