
FloatConvolutionNetwork

Expands an example low-level network implementing general convolutions. (32-bit)


Project maintained by SimiaCryptus. Java, CuDNN, and CUDA are others' trademarks; no endorsement is implied.
  1. Network Diagram
  2. Serialization
    1. Raw Json
  3. Example Input/Output Pair
  4. Batch Execution
  5. Differential Validation
    1. Feedback Validation
    2. Learning Validation

Target Description: A simple network architecture based on the assumption of a linear sequence of components. Each component added becomes the new head node, and a default add method appends a new node on the existing head.
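The head-node append pattern described above can be sketched with a minimal, hypothetical stand-in (not the actual PipelineNetwork API): each call to `add` appends a component after the current head, and evaluation walks the chain in order.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.DoubleUnaryOperator;

// Minimal sketch of the "linear sequence of components" pattern: each added
// component becomes the new head, and add() appends after the existing head.
class MiniPipeline {
    private final List<DoubleUnaryOperator> layers = new ArrayList<>();

    // Appends a component after the current head and returns this pipeline,
    // mirroring the default add method described above.
    MiniPipeline add(DoubleUnaryOperator layer) {
        layers.add(layer);
        return this;
    }

    // Evaluates the chain from the input node through to the head.
    double eval(double input) {
        double x = input;
        for (DoubleUnaryOperator layer : layers) {
            x = layer.applyAsDouble(x);
        }
        return x;
    }
}
```

For example, `new MiniPipeline().add(x -> x + 1).add(x -> x * 2).eval(3.0)` evaluates the components in insertion order.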

Report Description: Expands an example low-level network implementing general convolutions. (32-bit)

Network Diagram

This is a network with the following layout:

Code from StandardLayerTests.java:251 executed in 0.17 seconds:

    return Graphviz.fromGraph(TestUtil.toGraph((DAGNetwork) layer))
                   .height(400).width(600).render(Format.PNG).toImage();

Returns:

Result

Serialization

This run demonstrates the layer’s JSON serialization and verifies deserialization integrity.

Raw Json

Code from SerializationTest.java:84 executed in 0.00 seconds:

    final JsonObject json = layer.getJson();
    final NNLayer echo = NNLayer.fromJson(json);
    if (echo == null) throw new AssertionError("Failed to deserialize");
    if (layer == echo) throw new AssertionError("Serialization did not copy");
    if (!layer.equals(echo)) throw new AssertionError("Serialization not equal");
    return new GsonBuilder().setPrettyPrinting().create().toJson(json);

Returns:

    {
      "class": "com.simiacryptus.mindseye.network.PipelineNetwork",
      "id": "f4af4fbf-c78b-4446-b6d0-cee91d65d2c0",
      "isFrozen": false,
      "name": "PipelineNetwork/f4af4fbf-c78b-4446-b6d0-cee91d65d2c0",
      "inputs": [
        "a8bb34b1-92b6-49b5-ba6a-6fe07ba08e29"
      ],
      "nodes": {
        "c4f7aed8-ef24-4800-9a00-57965918cd7b": "f8943590-51e6-4d97-bc3d-da4d02b4b04e",
        "705625d8-56d1-498d-8727-9bbaee648e1d": "9ce2ddfe-fda4-4bb2-83d7-c45cda1127d2",
        "312fe3e3-e155-4cc3-b45f-a03925b9e836": "113b5661-98e0-4b1d-8c03-f3d92000b0fc"
      },
      "layers": {
        "f8943590-51e6-4d97-bc3d-da4d02b4b04e": {
          "class": "com.simiacryptus.mindseye.layers.cudnn.ImgConcatLayer",
          "id": "f8943590-51e6-4d97-bc3d-da4d02b4b04e",
          "isFrozen": false,
          "name": "ImgConcatLayer/f8943590-51e6-4d97-bc3d-da4d02b4b04e",
          "maxBands": -1,
          "precision": "Float"
        },
        "9ce2ddfe-fda4-4bb2-83d7-c45cda1127d2": {
          "class": "com.simiacryptus.mindseye.layers.cudnn.ImgBandBiasLayer",
          "id": "9ce2ddfe-fda4-4bb2-83d7-c45cda1127d2",
          "isFrozen": false,
          "name": "ImgBandBiasLayer/9ce2ddfe-fda4-4bb2-83d7-c45cda1127d2",
          "bias": [
            1.668,
            -0.512,
            -1.46
          ],
          "precision": "Float"
        },
        "113b5661-98e0-4b1d-8c03-f3d92000b0fc": {
          "class": "com.simiacryptus.mindseye.layers.cudnn.ActivationLayer",
          "id": "113b5661-98e0-4b1d-8c03-f3d92000b0fc",
          "isFrozen": false,
          "name": "ActivationLayer/113b5661-98e0-4b1d-8c03-f3d92000b0fc",
          "mode": 1,
          "precision": "Float"
        }
      },
      "links": {
        "c4f7aed8-ef24-4800-9a00-57965918cd7b": [
          "a8bb34b1-92b6-49b5-ba6a-6fe07ba08e29"
        ],
        "705625d8-56d1-498d-8727-9bbaee648e1d": [
          "c4f7aed8-ef24-4800-9a00-57965918cd7b"
        ],
        "312fe3e3-e155-4cc3-b45f-a03925b9e836": [
          "705625d8-56d1-498d-8727-9bbaee648e1d"
        ]
      },
      "labels": {},
      "head": "312fe3e3-e155-4cc3-b45f-a03925b9e836"
    }

Wrote Model to PipelineNetwork_FloatConvolutionNetwork.json; 1936 characters

Example Input/Output Pair

Display input/output pairs from random executions:

Code from ReferenceIO.java:69 executed in 0.00 seconds:

    final SimpleEval eval = SimpleEval.run(layer, inputPrototype);
    return String.format("--------------------\nInput: \n[%s]\n--------------------\nOutput: \n%s\n--------------------\nDerivative: \n%s",
                         Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get(),
                         eval.getOutput().prettyPrint(),
                         Arrays.stream(eval.getDerivative()).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get());

Returns:

    --------------------
    Input: 
    [[
    	[ [ -1.8, -1.736, 1.052 ], [ -1.676, -1.712, 0.792 ], [ -0.94, -1.788, -0.064 ], [ -1.34, 0.832, -0.272 ] ],
    	[ [ 0.344, 0.168, -1.588 ], [ 0.984, -1.768, -1.88 ], [ 0.116, -1.408, -0.344 ], [ 1.708, 0.988, 1.752 ] ],
    	[ [ 1.344, 1.08, 0.904 ], [ 1.412, 1.616, 0.28 ], [ -0.284, -1.38, -0.852 ], [ 1.584, -0.228, 0.984 ] ],
    	[ [ 1.772, -1.052, -0.144 ], [ 0.168, -1.416, -0.544 ], [ -1.944, -0.152, 1.988 ], [ -0.444, 0.996, 1.46 ] ]
    ]]
    --------------------
    Output: 
    [
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ]
    ]
    --------------------
    Derivative: 
    [
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ]
    ]

GPU Log

Batch Execution

Most layers, including this one, should behave the same no matter how the items are split between batches. We verify this:

Code from BatchingTester.java:113 executed in 0.02 seconds:

    return test(reference, inputPrototype);

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (960#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}
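The batch-invariance property being verified can be sketched with a hypothetical stand-in layer (elementwise ReLU here, not the layer under test): applying the layer to a whole batch must produce the same result as applying it to each item separately.

```java
import java.util.Arrays;

// Sketch of the batch-invariance check: a stateless layer's output for each
// item must not depend on how items are grouped into batches.
class BatchCheck {
    // Stand-in "layer": elementwise ReLU over a batch of scalars.
    static double[] forward(double[] batch) {
        return Arrays.stream(batch).map(v -> Math.max(0.0, v)).toArray();
    }

    // Compares full-batch evaluation against one-item-at-a-time evaluation.
    static boolean batchInvariant(double[] items) {
        double[] wholeBatch = forward(items);
        for (int i = 0; i < items.length; i++) {
            double single = forward(new double[]{items[i]})[0];
            if (Math.abs(single - wholeBatch[i]) > 1e-12) return false;
        }
        return true;
    }
}
```

The ToleranceStatistics above report the absolute and relative discrepancies aggregated over all such per-item comparisons.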

Differential Validation

Code from SingleDerivativeTester.java:292 executed in 0.00 seconds:

    log.info(String.format("Inputs: %s", Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Inputs Statistics: %s", Arrays.stream(inputPrototype).map(x -> new ScalarStatistics().add(x.getData()).toString()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Output: %s", outputPrototype.prettyPrint()));
    log.info(String.format("Outputs Statistics: %s", new ScalarStatistics().add(outputPrototype.getData())));

Logging:

    Inputs: [
    	[ [ -1.016, -1.888, 0.284 ], [ -1.752, 1.504, 0.416 ], [ 1.408, 0.312, 1.996 ], [ 0.9, 0.704, -0.68 ] ],
    	[ [ -0.52, 0.092, -1.38 ], [ 1.672, -1.8, 0.772 ], [ -0.36, -1.888, -0.696 ], [ 0.012, 0.92, -1.488 ] ],
    	[ [ -0.92, -1.932, 0.108 ], [ 1.124, 1.324, 0.124 ], [ -1.068, 1.628, 1.556 ], [ -0.2, -0.848, 1.044 ] ],
    	[ [ 0.716, -0.484, -0.496 ], [ -1.4, 1.6, 1.748 ], [ 1.088, -1.508, -0.684 ], [ -0.452, 0.764, 0.584 ] ]
    ]
    Inputs Statistics: {meanExponent=-0.1282910414648056, negative=22, min=0.584, max=0.584, mean=0.01958333333333333, count=48, positive=26, stdDev=1.148878943313389, zeros=0}
    Output: [
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ]
    ]
    Outputs Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=48, positive=0, stdDev=0.0, zeros=48}
    

Feedback Validation

We validate agreement between the implemented derivative with respect to the inputs and finite-difference estimates:

Code from SingleDerivativeTester.java:303 executed in 0.14 seconds:

    return testFeedback(statistics, component, inputPrototype, outputPrototype);

Logging:

    Feedback for input 0
    Inputs Values: [
    	[ [ -1.016, -1.888, 0.284 ], [ -1.752, 1.504, 0.416 ], [ 1.408, 0.312, 1.996 ], [ 0.9, 0.704, -0.68 ] ],
    	[ [ -0.52, 0.092, -1.38 ], [ 1.672, -1.8, 0.772 ], [ -0.36, -1.888, -0.696 ], [ 0.012, 0.92, -1.488 ] ],
    	[ [ -0.92, -1.932, 0.108 ], [ 1.124, 1.324, 0.124 ], [ -1.068, 1.628, 1.556 ], [ -0.2, -0.848, 1.044 ] ],
    	[ [ 0.716, -0.484, -0.496 ], [ -1.4, 1.6, 1.748 ], [ 1.088, -1.508, -0.684 ], [ -0.452, 0.764, 0.584 ] ]
    ]
    Value Statistics: {meanExponent=-0.1282910414648056, negative=22, min=0.584, max=0.584, mean=0.01958333333333333, count=48, positive=26, stdDev=1.148878943313389, zeros=0}
    Implemented Feedback: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Implemented Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=2304, positive=0, stdDev=0.0, zeros=2304}
    Measured Feedback: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Measured Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=2304, positive=0, stdDev=0.0, zeros=2304}
    Feedback Error: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Error Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=2304, positive=0, stdDev=0.0, zeros=2304}
    

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (2304#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}
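The finite-difference comparison above can be sketched in one dimension as follows; this is a hypothetical illustration of the technique, not the tester's actual implementation, which perturbs each input coordinate of the tensor in turn.

```java
import java.util.function.DoubleUnaryOperator;

// Sketch of derivative validation: compare an analytic derivative df against
// a central finite-difference estimate of f at the same point.
class DerivativeCheck {
    // Central-difference estimate: (f(x+h) - f(x-h)) / (2h).
    static double centralDifference(DoubleUnaryOperator f, double x, double h) {
        return (f.applyAsDouble(x + h) - f.applyAsDouble(x - h)) / (2.0 * h);
    }

    // Absolute error between the implemented derivative and the estimate.
    static double absoluteError(DoubleUnaryOperator f, DoubleUnaryOperator df, double x) {
        return Math.abs(df.applyAsDouble(x) - centralDifference(f, x, 1e-5));
    }
}
```

For f(x) = x² with df(x) = 2x, the error is near machine precision; a large error, as in the Learning Validation failure below, indicates the implemented derivative disagrees with the measured one.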

Learning Validation

We validate agreement between the implemented derivative with respect to the internal weights and finite-difference estimates:

Code from SingleDerivativeTester.java:311 executed in 0.00 seconds:

    return testLearning(statistics, component, inputPrototype, outputPrototype);

Logging:

    Learning Gradient for weight setByCoord 0
    Implemented Gradient: [ [ 1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ] ]
    Implemented Statistics: {meanExponent=0.0, negative=0, min=0.0, max=0.0, mean=0.1736111111111111, count=144, positive=25, stdDev=0.378774726202629, zeros=119}
    Measured Gradient: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ] ]
    Measured Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=144, positive=0, stdDev=0.0, zeros=144}
    Gradient Error: [ [ -1.0, -1.0, -1.0, -1.0, 0.0, -1.0, -1.0, -1.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ] ]
    Error Statistics: {meanExponent=0.0, negative=25, min=0.0, max=0.0, mean=-0.1736111111111111, count=144, positive=0, stdDev=0.378774726202629, zeros=119}
    

Returns:

    java.lang.AssertionError: ToleranceStatistics{absoluteTol=1.7361e-01 +- 3.7877e-01 [0.0000e+00 - 1.0000e+00] (144#), relativeTol=1.0000e+00 +- 0.0000e+00 [1.0000e+00 - 1.0000e+00] (25#)}
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.lambda$testLearning$23(SingleDerivativeTester.java:353)
    	at java.util.stream.IntPipeline$4$1.accept(IntPipeline.java:250)
    	at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110)
    	at java.util.Spliterator$OfInt.forEachRemaining(Spliterator.java:693)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:479)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.testLearning(SingleDerivativeTester.java:386)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.lambda$test$18(SingleDerivativeTester.java:312)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$null$1(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:59)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$code$2(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.test.SysOutInterceptor.withOutput(SysOutInterceptor.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.code(MarkdownNotebookOutput.java:203)
    	at com.simiacryptus.util.io.NotebookOutput.code(NotebookOutput.java:82)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.test(SingleDerivativeTester.java:311)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.test(SingleDerivativeTester.java:42)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.lambda$run$5(StandardLayerTests.java:257)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
    	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
    	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.run(StandardLayerTests.java:256)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.lambda$run$0(NotebookReportBase.java:105)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:76)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.run(NotebookReportBase.java:103)
    	at com.simiacryptus.mindseye.layers.LayerTestBase.test(LayerTestBase.java:37)
    	at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runners.Suite.runChild(Suite.java:128)
    	at org.junit.runners.Suite.runChild(Suite.java:27)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
    	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
    	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
    	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)