Implement a simple black-box attack in Keras to attack a pretrained simple neural network. For the substitute model we use a neural network with two hidden layers of 100 nodes each. Our goal is to generate adversarial examples that deceive a simple single-hidden-layer neural network with 20 hidden nodes into misclassifying data from a test set that we provide. This test set consists of examples from classes 0 and 1 of CIFAR10. The target model is pretrained on CIFAR10 class 0 vs. class 1 and achieves 89% test accuracy.

The CIFAR10 data is available in NumPy format from https://web.njit.edu/~usman/courses/cs677_summer20/CIFAR10/. You can easily extract classes 0 and 1 with:

    testData = np.load(testFile)
    testLabels = np.load(testLabelsFile)
    testData = testData[np.logical_or(testLabels == 0, testLabels == 1)]
    testLabels = testLabels[np.logical_or(testLabels == 0, testLabels == 1)]
    testLabels = keras.utils.to_categorical(testLabels, 2)

We normalize each image by subtracting the mean:

    testDataMean = np.mean(testData, axis=0)
    testData = testData - testDataMean

A successful attack should bring the target model's classification accuracy on the test set down to at most 10%.

Submit your assignment as two files, train.py and test.py. Make train.py take three inputs: the test data, the target model to attack (in our case the network with 20 hidden nodes), and a file name to save the substitute model to:

    python train.py <test data> <target model> <substitute model file>

As its first step, train.py should output the accuracy of the target model on the test data without adversaries. This verifies that the target model has high accuracy on clean test data; if it had low test accuracy it would be harder to attack. While running, train.py should also output the accuracy of the target model on the adversaries generated from the test data after each epoch.

Make test.py take three inputs: the test set, the target model, and the substitute model. Its output should be the accuracy of the target model on adversarial examples generated with epsilon = 0.0625.
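As a concrete starting point, the substitute architecture described above (two hidden layers of 100 nodes each, softmax output over the two classes) might be built as follows. This is a sketch, not prescribed code: the ReLU activations, Adam optimizer, and the flattened 32x32x3 CIFAR10 input shape are my assumptions, and in a black-box attack the substitute is fit to the target model's predicted labels rather than the true labels.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_substitute(input_shape=(32, 32, 3)):
    # Two hidden layers of 100 nodes each, as the assignment specifies;
    # activations and optimizer below are assumptions, not requirements.
    model = keras.Sequential([
        keras.Input(shape=input_shape),
        layers.Flatten(),
        layers.Dense(100, activation='relu'),
        layers.Dense(100, activation='relu'),
        layers.Dense(2, activation='softmax'),  # classes 0 and 1
    ])
    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model

# Black-box training idea: query the target for labels, then fit the
# substitute on those labels (target is the pretrained 20-node model):
#   target_preds = target.predict(testData)
#   labels = keras.utils.to_categorical(np.argmax(target_preds, axis=1), 2)
#   substitute.fit(testData, labels, epochs=..., batch_size=...)
```

Training on the target's outputs rather than the ground-truth labels is what lets the substitute mimic the target's decision boundary, so that adversaries crafted against the substitute transfer to the target.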
A successful submission will have accuracy below 10% on the adversarial examples:

    python test.py <test set> <target model> <substitute model>

Copy both your programs and your model file to your AFS course folder /afs/cad/courses/ccs/S20/cs/677/850/. The assignment is due 11:30am, July 25th, 2020.
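The assignment does not name the attack method, but the fixed epsilon suggests a fast-gradient-sign-method (FGSM) style perturbation computed on the substitute model, relying on transferability to fool the target. A minimal sketch under that assumption:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

def fgsm(model, x, y, epsilon=0.0625):
    """Perturb x by epsilon * sign(gradient of the loss w.r.t. x).

    model: the substitute model (white-box access assumed only here);
    x: input batch, y: one-hot labels; epsilon as set by the assignment.
    """
    x_t = tf.convert_to_tensor(x, dtype=tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(x_t)
        loss = keras.losses.categorical_crossentropy(y, model(x_t))
    grad = tape.gradient(loss, x_t)
    # Step in the direction that increases the loss, bounded by epsilon.
    return (x_t + epsilon * tf.sign(grad)).numpy()

# test.py idea: craft adversaries on the substitute, then report the
# target model's accuracy on them (should fall below 10%):
#   x_adv = fgsm(substitute, testData, testLabels, epsilon=0.0625)
#   _, acc = target.evaluate(x_adv, testLabels)
```

Note that epsilon = 0.0625 is measured in the normalized (mean-subtracted) input space used above, so the perturbation of each pixel is bounded by that amount.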