Rewriting CNN
Tags: control, think, and vision
Personhours: 12
Task: Begin rewriting the Convolutional Neural Network using Java and DL4J
While we have been using Python and TensorFlow to train our convolutional neural network, the code for our robot is written entirely in Java, so before we can use the network on the robot it must be rewritten in Java. We therefore began porting it.
We also decided to try DL4J, a competing library to TensorFlow, to determine whether one was easier to write a neural network with than the other. We found the two libraries similarly easy to use: each has a different style, but code written in either is equally easy to read and maintain. The DL4J code we have written so far is shown below.
```java
//Download dataset
DataDownloader downloader = new DataDownloader();
File rootDir = downloader.downloadFilesFromGit("https://github.com/arjvik/RoverRuckusTrainingData.git",
        "data/RoverRuckusTrainingData", "TrainingData");

//Read in dataset
DataSetIterator iterator = new CustomDataSetIterator(rootDir, 1);

//Normalization
DataNormalization scaler = new ImagePreProcessingScaler(0, 1);
scaler.fit(iterator);
iterator.setPreProcessor(scaler);

//Read in test dataset
DataSetIterator testIterator = new CustomDataSetIterator(new File(rootDir, "Test"), 1);

//Test Normalization
DataNormalization testScaler = new ImagePreProcessingScaler(0, 1);
testScaler.fit(testIterator);
testIterator.setPreProcessor(testScaler);

//Layer Configuration
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .seed(SEED)
        .l2(0.005)
        .weightInit(WeightInit.XAVIER)
        .list()
        .layer(0, new ConvolutionLayer.Builder()
                .nIn(1)
                .kernelSize(3, 3)
                .stride(1, 1)
                .activation(Activation.RELU)
                .build())
        .layer(1, new ConvolutionLayer.Builder()
                .nIn(1)
                .kernelSize(3, 3)
                .stride(1, 1)
                .activation(Activation.RELU)
                .build())
        /* ...more layer code... */
        .build();
```
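For context, here is a minimal sketch of how we expect to build and train the network from this configuration. It assumes the elided layers include an output layer with a loss function; the epoch count and listener frequency are placeholders, not our final values.

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;

//Build the network from the configuration above
MultiLayerNetwork model = new MultiLayerNetwork(conf);
model.init();
model.setListeners(new ScoreIterationListener(10)); //log the training score every 10 iterations

//Train for a fixed number of epochs (placeholder count)
int numEpochs = 5;
for (int epoch = 0; epoch < numEpochs; epoch++) {
    iterator.reset();    //rewind our custom iterator before each pass over the data
    model.fit(iterator); //one epoch of training
}
```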
Next Steps
We still need to fix the inaccuracy in our neural network's predictions.
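One way to quantify that inaccuracy is DL4J's built-in Evaluation class, which reports per-class precision, recall, and a confusion matrix. A sketch, assuming the trained model and the testIterator from above:

```java
import org.deeplearning4j.eval.Evaluation;

//Evaluate the trained model on the test set to see which classes are misclassified
Evaluation eval = model.evaluate(testIterator);
System.out.println(eval.stats());             //accuracy, precision, recall, and F1 per class
System.out.println(eval.confusionToString()); //confusion matrix showing where predictions go wrong
```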
Date: October 20, 2018