
Float

Test using 32-bit precision with a radius of 1


Project maintained by SimiaCryptus. Java, CuDNN, and CUDA are trademarks of their respective owners. No endorsement is implied.
  1. Serialization
    1. Raw Json
  2. Example Input/Output Pair
  3. Batch Execution
  4. Differential Validation
    1. Feedback Validation
    2. Learning Validation

Target Description: This is the general convolution layer, allowing any number of input and output bands. During execution it delegates processing to a dynamically created subnet built from SimpleConvolutionLayer and ImgConcatLayer to implement the more general layer contract.

Report Description: Test using 32-bit precision with a radius of 1

Serialization

This run demonstrates the layer’s JSON serialization and verifies deserialization integrity.

Raw Json

Code from SerializationTest.java:84 executed in 0.00 seconds:

    final JsonObject json = layer.getJson();
    final NNLayer echo = NNLayer.fromJson(json);
    if (echo == null) throw new AssertionError("Failed to deserialize");
    if (layer == echo) throw new AssertionError("Serialization did not copy");
    if (!layer.equals(echo)) throw new AssertionError("Serialization not equal");
    return new GsonBuilder().setPrettyPrinting().create().toJson(json);

Returns:

    {
      "class": "com.simiacryptus.mindseye.layers.cudnn.ConvolutionLayer",
      "id": "df4c080a-f22d-4b4b-bb09-bbe9f4cf3ca1",
      "isFrozen": false,
      "name": "ConvolutionLayer/df4c080a-f22d-4b4b-bb09-bbe9f4cf3ca1",
      "filter": [
        [
          [
            -1.252
          ]
        ],
        [
          [
            -1.904
          ]
        ],
        [
          [
            1.236
          ]
        ],
        [
          [
            -1.18
          ]
        ]
      ],
      "strideX": 1,
      "strideY": 1,
      "precision": "Float",
      "inputBands": 2,
      "outputBands": 2
    }

Wrote Model to ConvolutionLayer_Float.json; 495 characters
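The round-trip check above asserts two invariants: deserialization must produce a distinct object (not the same reference), yet one that compares equal to the original. A minimal self-contained sketch of the same invariants, using a hypothetical `LayerState` stand-in rather than the actual NNLayer/Gson machinery:

```java
import java.util.Objects;

public class RoundTrip {
    // Hypothetical stand-in for a layer's serializable state.
    static final class LayerState {
        final int inputBands, outputBands;
        LayerState(int in, int out) { inputBands = in; outputBands = out; }
        // "Serialize" to a simple string and parse it back, mirroring
        // the JSON round-trip done with getJson()/fromJson().
        String toJsonish() { return inputBands + "," + outputBands; }
        static LayerState fromJsonish(String s) {
            String[] p = s.split(",");
            return new LayerState(Integer.parseInt(p[0]), Integer.parseInt(p[1]));
        }
        @Override public boolean equals(Object o) {
            return o instanceof LayerState
                && ((LayerState) o).inputBands == inputBands
                && ((LayerState) o).outputBands == outputBands;
        }
        @Override public int hashCode() { return Objects.hash(inputBands, outputBands); }
    }

    public static void main(String[] args) {
        LayerState layer = new LayerState(2, 2);
        LayerState echo = LayerState.fromJsonish(layer.toJsonish());
        // Same invariants the test asserts: a distinct but equal copy.
        assert echo != layer : "Serialization did not copy";
        assert layer.equals(echo) : "Serialization not equal";
        System.out.println("round-trip ok");
    }
}
```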

Example Input/Output Pair

Display input/output pairs from random executions:

Code from ReferenceIO.java:69 executed in 0.01 seconds:

    final SimpleEval eval = SimpleEval.run(layer, inputPrototype);
    return String.format("--------------------\nInput: \n[%s]\n--------------------\nOutput: \n%s\n--------------------\nDerivative: \n%s",
                         Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get(),
                         eval.getOutput().prettyPrint(),
                         Arrays.stream(eval.getDerivative()).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get());

Returns:

    --------------------
    Input: 
    [[
    	[ [ 0.756, 1.104 ], [ -1.712, -0.928 ], [ -0.884, -0.024 ], [ -0.26, 0.884 ], [ -0.968, -0.352 ] ],
    	[ [ -1.852, 0.964 ], [ -1.996, 1.484 ], [ 1.02, -0.064 ], [ 1.42, 1.236 ], [ -1.652, -0.468 ] ],
    	[ [ 0.636, -1.236 ], [ -0.872, -1.504 ], [ 0.976, 1.928 ], [ 1.74, 0.488 ], [ 0.1, -0.176 ] ],
    	[ [ -1.7, 1.732 ], [ -1.244, -1.34 ], [ 1.556, -0.544 ], [ -1.976, -0.776 ], [ 1.712, -0.28 ] ],
    	[ [ -0.628, -1.624 ], [ -0.948, -0.668 ], [ 0.152, -1.124 ], [ 1.472, -1.864 ], [ -1.924, -0.616 ] ]
    ]]
    --------------------
    Output: 
    [
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ]
    ]
    --------------------
    Derivative: 
    [
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ]
    ]


Batch Execution

Most layers, including this one, should produce identical results regardless of how the items are partitioned into batches. We verify this:
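The property being checked can be sketched independently of the test harness: apply an operation to a whole batch, then to the same items split into sub-batches, and require identical results. The `apply` function below is a hypothetical stateless per-item operation, not the actual layer:

```java
import java.util.Arrays;

public class BatchSplitCheck {
    // Hypothetical stand-in for a stateless per-item layer operation;
    // any such layer should be invariant to batch partitioning.
    static double[] apply(double[] batch) {
        return Arrays.stream(batch).map(x -> x * 2.0).toArray();
    }

    static double[] concat(double[] a, double[] b) {
        double[] out = Arrays.copyOf(a, a.length + b.length);
        System.arraycopy(b, 0, out, a.length, b.length);
        return out;
    }

    public static void main(String[] args) {
        double[] batch = {0.5, -1.2, 3.3, 0.0};
        // Evaluate the whole batch at once.
        double[] whole = apply(batch);
        // Evaluate as two half-batches and recombine.
        double[] split = concat(
            apply(Arrays.copyOfRange(batch, 0, 2)),
            apply(Arrays.copyOfRange(batch, 2, 4)));
        assert Arrays.equals(whole, split) : "batch split changed results";
        System.out.println("batch invariance ok");
    }
}
```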

Code from BatchingTester.java:113 executed in 0.03 seconds:

    return test(reference, inputPrototype);

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (1000#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}

Differential Validation

Code from SingleDerivativeTester.java:292 executed in 0.00 seconds:

    log.info(String.format("Inputs: %s", Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Inputs Statistics: %s", Arrays.stream(inputPrototype).map(x -> new ScalarStatistics().add(x.getData()).toString()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Output: %s", outputPrototype.prettyPrint()));
    log.info(String.format("Outputs Statistics: %s", new ScalarStatistics().add(outputPrototype.getData())));

Logging:

    Inputs: [
    	[ [ -1.504, 1.78 ], [ 1.78, 0.708 ], [ -1.644, 1.472 ], [ -1.768, 1.212 ], [ -1.072, -0.724 ] ],
    	[ [ -1.008, 1.856 ], [ -1.82, 0.656 ], [ -1.428, -1.804 ], [ 0.828, 0.06 ], [ 0.76, -0.72 ] ],
    	[ [ -1.5, 1.076 ], [ 1.868, -0.484 ], [ -1.852, 1.932 ], [ -1.288, -0.112 ], [ 0.392, 0.652 ] ],
    	[ [ -1.292, 0.704 ], [ 1.316, -1.68 ], [ -1.584, -0.452 ], [ -1.428, 0.06 ], [ -0.636, -0.7 ] ],
    	[ [ -0.52, -1.128 ], [ 0.1, 0.696 ], [ 0.004, 1.692 ], [ 1.756, -1.504 ], [ 1.56, 0.6 ] ]
    ]
    Inputs Statistics: {meanExponent=-0.09628489365300583, negative=25, min=0.6, max=0.6, mean=-0.08264000000000003, count=50, positive=25, stdDev=1.242029206741935, zeros=0}
    Output: [
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ], [ 0.0, 0.0 ] ]
    ]
    Outputs Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=50, positive=0, stdDev=0.0, zeros=50}
    

Feedback Validation

We validate the agreement between the implemented derivative with respect to the inputs and finite-difference estimates:
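The finite-difference side of this comparison is typically a central-difference estimate, (f(x+h) − f(x−h)) / 2h, evaluated per input coordinate. A minimal sketch of the scheme (the method name and tolerance here are illustrative, not the tester's actual API):

```java
import java.util.function.DoubleUnaryOperator;

public class FiniteDifference {
    // Central-difference estimate of df/dx at x; error shrinks as O(h^2).
    static double estimate(DoubleUnaryOperator f, double x, double h) {
        return (f.applyAsDouble(x + h) - f.applyAsDouble(x - h)) / (2 * h);
    }

    public static void main(String[] args) {
        // d/dx (x^2) at x = 3 is exactly 6; the estimate should agree closely.
        double measured = estimate(v -> v * v, 3.0, 1e-4);
        double implemented = 2 * 3.0;
        assert Math.abs(measured - implemented) < 1e-6 : "derivative mismatch";
        System.out.println("measured = " + measured);
    }
}
```

The tester aggregates exactly this kind of per-coordinate difference into the absolute/relative tolerance statistics reported below.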

Code from SingleDerivativeTester.java:303 executed in 0.24 seconds:

    return testFeedback(statistics, component, inputPrototype, outputPrototype);

Logging:

    Feedback for input 0
    Inputs Values: [
    	[ [ -1.504, 1.78 ], [ 1.78, 0.708 ], [ -1.644, 1.472 ], [ -1.768, 1.212 ], [ -1.072, -0.724 ] ],
    	[ [ -1.008, 1.856 ], [ -1.82, 0.656 ], [ -1.428, -1.804 ], [ 0.828, 0.06 ], [ 0.76, -0.72 ] ],
    	[ [ -1.5, 1.076 ], [ 1.868, -0.484 ], [ -1.852, 1.932 ], [ -1.288, -0.112 ], [ 0.392, 0.652 ] ],
    	[ [ -1.292, 0.704 ], [ 1.316, -1.68 ], [ -1.584, -0.452 ], [ -1.428, 0.06 ], [ -0.636, -0.7 ] ],
    	[ [ -0.52, -1.128 ], [ 0.1, 0.696 ], [ 0.004, 1.692 ], [ 1.756, -1.504 ], [ 1.56, 0.6 ] ]
    ]
    Value Statistics: {meanExponent=-0.09628489365300583, negative=25, min=0.6, max=0.6, mean=-0.08264000000000003, count=50, positive=25, stdDev=1.242029206741935, zeros=0}
    Implemented Feedback: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Implemented Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=2500, positive=0, stdDev=0.0, zeros=2500}
    Measured Feedback: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Measured Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=2500, positive=0, stdDev=0.0, zeros=2500}
    Feedback Error: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Error Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=2500, positive=0, stdDev=0.0, zeros=2500}
    

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (2500#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}

Learning Validation

We validate the agreement between the implemented derivative with respect to the internal weights and finite-difference estimates:

Code from SingleDerivativeTester.java:311 executed in 0.00 seconds:

    return testLearning(statistics, component, inputPrototype, outputPrototype);

Logging:

    Learning Gradient for weight setByCoord 0
    Implemented Gradient: [ [ -1.503999948501587, -1.0080000162124634, -1.5, -1.2920000553131104, -0.5199999809265137, 1.7799999713897705, -1.8200000524520874, 1.8680000305175781, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 1.7799999713897705, 1.8559999465942383, 1.0759999752044678, 0.7039999961853027, -1.128000020980835, 0.7080000042915344, 0.656000018119812, -0.48399999737739563, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ] ]
    Implemented Statistics: {meanExponent=-0.09628489402528256, negative=50, min=0.6000000238418579, max=0.6000000238418579, mean=-0.04131999992299825, count=200, positive=50, stdDev=0.8792187535200029, zeros=100}
    Measured Gradient: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ] ]
    Measured Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=200, positive=0, stdDev=0.0, zeros=200}
    Gradient Error: [ [ 1.503999948501587, 1.0080000162124634, 1.5, 1.2920000553131104, 0.5199999809265137, -1.7799999713897705, 1.8200000524520874, -1.8680000305175781, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ -1.7799999713897705, -1.8559999465942383, -1.0759999752044678, -0.7039999961853027, 1.128000020980835, -0.7080000042915344, -0.656000018119812, 0.48399999737739563, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ] ]
    Error Statistics: {meanExponent=-0.09628489402528256, negative=50, min=-0.6000000238418579, max=-0.6000000238418579, mean=0.04131999992299825, count=200, positive=50, stdDev=0.8792187535200029, zeros=100}
    

Returns:

    java.lang.AssertionError: ToleranceStatistics{absoluteTol=5.5172e-01 +- 6.8581e-01 [0.0000e+00 - 1.9320e+00] (200#), relativeTol=1.0000e+00 +- 0.0000e+00 [1.0000e+00 - 1.0000e+00] (100#)}
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.lambda$testLearning$23(SingleDerivativeTester.java:353)
    	at java.util.stream.IntPipeline$4$1.accept(IntPipeline.java:250)
    	at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110)
    	at java.util.Spliterator$OfInt.forEachRemaining(Spliterator.java:693)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:479)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.testLearning(SingleDerivativeTester.java:386)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.lambda$test$18(SingleDerivativeTester.java:312)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$null$1(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:59)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$code$2(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.test.SysOutInterceptor.withOutput(SysOutInterceptor.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.code(MarkdownNotebookOutput.java:203)
    	at com.simiacryptus.util.io.NotebookOutput.code(NotebookOutput.java:82)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.test(SingleDerivativeTester.java:311)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.test(SingleDerivativeTester.java:42)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.lambda$run$5(StandardLayerTests.java:257)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
    	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
    	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.run(StandardLayerTests.java:256)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.lambda$run$0(NotebookReportBase.java:105)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:76)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.run(NotebookReportBase.java:103)
    	at com.simiacryptus.mindseye.layers.LayerTestBase.test(LayerTestBase.java:37)
    	at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runners.Suite.runChild(Suite.java:128)
    	at org.junit.runners.Suite.runChild(Suite.java:27)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
    	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
    	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
    	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)