
Basic Test


Project maintained by SimiaCryptus. Java, CuDNN, and CUDA are trademarks of their respective owners; no endorsement is implied.
  1. Serialization
    1. Raw Json
  2. Example Input/Output Pair
  3. Batch Execution
  4. Differential Validation
    1. Feedback Validation
    2. Learning Validation

Target Description: A dense matrix operator using vector-matrix multiplication. Represents a fully connected layer of synapses, where every input is connected to every output via a separate coefficient.

Report Description: Basic Test
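The vector-matrix product this layer performs can be illustrated with a plain-Java sketch (hypothetical; this is not the CuDNN-backed implementation). The weights and input are copied from the serialized layer and the example input/output pair reported below:

```java
import java.util.Arrays;

// Hypothetical sketch of the dense forward pass:
// output[j] = sum_i input[i] * weights[i][j]
public class DenseForward {
    static double[] forward(double[] input, double[][] weights) {
        double[] output = new double[weights[0].length];
        for (int i = 0; i < input.length; i++)
            for (int j = 0; j < output.length; j++)
                output[j] += input[i] * weights[i][j];
        return output;
    }

    public static void main(String[] args) {
        // Weights from the serialized layer; input from the example pair in this report
        double[][] w = {
            {0.14847271790166663, 0.7279213652052879, 0.8725472960249377},
            {0.36366328442850826, -0.21566901421803286, -0.43144316254350895},
            {0.08421335576441168, 0.11464568375650427, 0.9222014164362317}
        };
        double[] x = {-0.512, -1.3, -1.572};
        System.out.println(Arrays.toString(forward(x, w)));
    }
}
```

Running this reproduces the "Output" row of the example pair below to floating-point precision.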

Serialization

This run demonstrates the layer's JSON serialization and verifies deserialization integrity.

Raw Json

Code from SerializationTest.java:84 executed in 0.00 seconds:

    final JsonObject json = layer.getJson();
    final NNLayer echo = NNLayer.fromJson(json);
    if (echo == null) throw new AssertionError("Failed to deserialize");
    if (layer == echo) throw new AssertionError("Serialization did not copy");
    if (!layer.equals(echo)) throw new AssertionError("Serialization not equal");
    return new GsonBuilder().setPrettyPrinting().create().toJson(json);

Returns:

    {
      "class": "com.simiacryptus.mindseye.layers.cudnn.FullyConnectedLayer",
      "id": "eb3da678-ae10-4f53-9aab-5b60bacc7611",
      "isFrozen": false,
      "name": "FullyConnectedLayer/eb3da678-ae10-4f53-9aab-5b60bacc7611",
      "outputDims": [
        3
      ],
      "inputDims": [
        3
      ],
      "weights": [
        [
          0.14847271790166663,
          0.7279213652052879,
          0.8725472960249377
        ],
        [
          0.36366328442850826,
          -0.21566901421803286,
          -0.43144316254350895
        ],
        [
          0.08421335576441168,
          0.11464568375650427,
          0.9222014164362317
        ]
      ],
      "precision": "Double"
    }

Wrote Model to FullyConnectedLayer_Basic.json; 593 characters

Example Input/Output Pair

We display input/output pairs from randomized executions:

Code from ReferenceIO.java:69 executed in 0.02 seconds:

    final SimpleEval eval = SimpleEval.run(layer, inputPrototype);
    return String.format("--------------------\nInput: \n[%s]\n--------------------\nOutput: \n%s\n--------------------\nDerivative: \n%s",
                         Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get(),
                         eval.getOutput().prettyPrint(),
                         Arrays.stream(eval.getDerivative()).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get());

Returns:

    --------------------
    Input: 
    [[ -0.512, -1.3, -1.572 ]]
    --------------------
    Output: 
    [ -0.6811636965843693, -0.27254903536688946, -1.3355687308959625 ]
    --------------------
    Derivative: 
    [ 1.7489413791318924, -0.28344889233303355, 1.1210604559571475 ]
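The "Derivative" row above is consistent with back-propagating an all-ones output gradient through the layer: for y[j] = Σ_i x[i]·w[i][j], the input derivative is d/dx[i] = Σ_j w[i][j], i.e. the row sums of the weight matrix. A hypothetical sketch (not the layer's implementation):

```java
import java.util.Arrays;

// Hypothetical sketch: with an all-ones output gradient, the input
// derivative of y[j] = sum_i x[i]*w[i][j] reduces to the row sums of w.
public class RowSumDerivative {
    static double[] inputDerivative(double[][] w) {
        double[] d = new double[w.length];
        for (int i = 0; i < w.length; i++)
            for (int j = 0; j < w[i].length; j++)
                d[i] += w[i][j];
        return d;
    }

    public static void main(String[] args) {
        // Weights from the serialized layer above
        double[][] w = {
            {0.14847271790166663, 0.7279213652052879, 0.8725472960249377},
            {0.36366328442850826, -0.21566901421803286, -0.43144316254350895},
            {0.08421335576441168, 0.11464568375650427, 0.9222014164362317}
        };
        System.out.println(Arrays.toString(inputDerivative(w)));
    }
}
```

This reproduces the reported Derivative values to floating-point precision.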

Batch Execution

Most layers, including this one, should produce identical results regardless of how the input items are split between batches. We verify this:

Code from BatchingTester.java:113 executed in 0.01 seconds:

    return test(reference, inputPrototype);

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (60#), relativeTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (60#)}
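The zero-tolerance result above reflects the invariance being checked. It can be illustrated with a plain-Java sketch (hypothetical; the real BatchingTester compares the CuDNN layer's batched executions): a deterministic per-item forward pass yields bit-identical results however the batch is split.

```java
import java.util.Arrays;

// Hypothetical sketch: a deterministic per-item forward pass is
// batch-split invariant, matching the zero-tolerance result above.
public class BatchInvariance {
    static double[] forward(double[] input, double[][] weights) {
        double[] out = new double[weights[0].length];
        for (int i = 0; i < input.length; i++)
            for (int j = 0; j < out.length; j++)
                out[j] += input[i] * weights[i][j];
        return out;
    }

    // Evaluate a batch of inputs item by item
    static double[][] evalBatch(double[][] batch, double[][] weights) {
        double[][] results = new double[batch.length][];
        for (int n = 0; n < batch.length; n++) results[n] = forward(batch[n], weights);
        return results;
    }

    public static void main(String[] args) {
        // Illustrative weights and batch (not taken from the report)
        double[][] w = {{0.1, 0.7, 0.8}, {0.3, -0.2, -0.4}, {0.08, 0.11, 0.9}};
        double[][] batch = {{-0.5, -1.3, -1.5}, {1.0, 0.0, 2.0}, {0.25, -0.75, 0.5}, {2.0, 1.0, -1.0}};
        double[][] whole = evalBatch(batch, w);
        // Split the same items into two half-batches and concatenate the results
        double[][] split = new double[batch.length][];
        System.arraycopy(evalBatch(Arrays.copyOfRange(batch, 0, 2), w), 0, split, 0, 2);
        System.arraycopy(evalBatch(Arrays.copyOfRange(batch, 2, 4), w), 0, split, 2, 2);
        System.out.println(Arrays.deepEquals(whole, split)); // prints "true"
    }
}
```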

Differential Validation

Code from SingleDerivativeTester.java:292 executed in 0.00 seconds:

    log.info(String.format("Inputs: %s", Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Inputs Statistics: %s", Arrays.stream(inputPrototype).map(x -> new ScalarStatistics().add(x.getData()).toString()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Output: %s", outputPrototype.prettyPrint()));
    log.info(String.format("Outputs Statistics: %s", new ScalarStatistics().add(outputPrototype.getData())));

Logging:

    Inputs: [ -1.072, 0.844, -1.052 ]
    Inputs Statistics: {meanExponent=-0.007149009399957812, negative=2, min=-1.052, max=-1.052, mean=-0.42666666666666675, count=3, positive=1, stdDev=0.898534114853496, zeros=0}
    Output: [ 0.05917660820291322, -1.082963610811931, -2.2696646206163704 ]
    Outputs Statistics: {meanExponent=-0.2790914596089582, negative=2, min=-2.2696646206163704, max=-2.2696646206163704, mean=-1.097817207741796, count=3, positive=1, stdDev=0.9508034634870461, zeros=0}
    

Feedback Validation

We validate agreement between the implemented derivative with respect to the inputs and finite-difference estimates:

Code from SingleDerivativeTester.java:303 executed in 0.00 seconds:

    return testFeedback(statistics, component, inputPrototype, outputPrototype);

Logging:

    Not Implemented: com.simiacryptus.mindseye.layers.cudnn.FullyConnectedLayer
    Not Implemented: com.simiacryptus.mindseye.layers.cudnn.FullyConnectedLayer
    Not Implemented: com.simiacryptus.mindseye.layers.cudnn.FullyConnectedLayer
    Feedback for input 0
    Inputs Values: [ -1.072, 0.844, -1.052 ]
    Value Statistics: {meanExponent=-0.007149009399957812, negative=2, min=-1.052, max=-1.052, mean=-0.42666666666666675, count=3, positive=1, stdDev=0.898534114853496, zeros=0}
    Implemented Feedback: [ [ 0.14847271790166663, 0.7279213652052879, 0.8725472960249377 ], [ 0.36366328442850826, -0.21566901421803286, -0.43144316254350895 ], [ 0.08421335576441168, 0.11464568375650427, 0.9222014164362317 ] ]
    Implemented Statistics: {meanExponent=-0.5051671810744692, negative=2, min=0.9222014164362317, max=0.9222014164362317, mean=0.28739477141733405, count=9, positive=7, stdDev=0.4478949454147104, zeros=0}
    Measured Feedback: [ [ 0.14847271790185967, 0.727921365204498, 0.872547296024706 ], [ 0.3636632844283838, -0.21566901421854467, -0.4314431625473958 ], [ 0.08421335576436206, 0.11464568375529893, 0.9222014164311787 ] ]
    Measured Statistics: {meanExponent=-0.5051671810747391, negative=2, min=0.9222014164311787, max=0.9222014164311787, mean=0.28739477141603853, count=9, positive=7, stdDev=0.4478949454145969, zeros=0}
    Feedback Error: [ [ 1.930400284066991E-13, -7.899236820207989E-13, -2.3170354523927017E-13 ], [ -1.2445600106048005E-13, -5.118128143521972E-13, -3.886835298061442E-12 ], [ -4.961309141293668E-14, -1.2053413822599168E-12, -5.052958051976475E-12 ] ]
    Error Statistics: {meanExponent=-12.286428847707365, negative=8, min=-5.052958051976475E-12, max=-5.052958051976475E-12, mean=-1.2955115375529796E-12, count=9, positive=1, stdDev=1.7632798915190275E-12, zeros=0}
    

Returns:

    ToleranceStatistics{absoluteTol=1.3384e-12 +- 1.7309e-12 [4.9613e-14 - 5.0530e-12] (9#), relativeTol=1.7198e-12 +- 1.8579e-12 [1.3277e-13 - 5.2568e-12] (9#)}
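The agreement reported above can be sketched as a central finite-difference check against the analytic input Jacobian, which for this layer is just the weight matrix: ∂y[j]/∂x[i] = w[i][j]. A hypothetical plain-Java sketch (not the tester's implementation), using the weights and input from the log above:

```java
// Hypothetical sketch of the feedback check: compare the analytic input
// Jacobian d(y[j])/d(x[i]) = w[i][j] against the central finite difference
// (f(x + eps*e_i)[j] - f(x - eps*e_i)[j]) / (2*eps).
public class FeedbackCheck {
    static double[] forward(double[] x, double[][] w) {
        double[] y = new double[w[0].length];
        for (int i = 0; i < x.length; i++)
            for (int j = 0; j < y.length; j++)
                y[j] += x[i] * w[i][j];
        return y;
    }

    static double maxJacobianError(double[] x, double[][] w, double eps) {
        double maxErr = 0;
        for (int i = 0; i < x.length; i++) {
            double[] xp = x.clone(), xm = x.clone();
            xp[i] += eps;
            xm[i] -= eps;
            double[] yp = forward(xp, w), ym = forward(xm, w);
            for (int j = 0; j < yp.length; j++) {
                double measured = (yp[j] - ym[j]) / (2 * eps);
                maxErr = Math.max(maxErr, Math.abs(measured - w[i][j]));
            }
        }
        return maxErr;
    }

    public static void main(String[] args) {
        double[][] w = {
            {0.14847271790166663, 0.7279213652052879, 0.8725472960249377},
            {0.36366328442850826, -0.21566901421803286, -0.43144316254350895},
            {0.08421335576441168, 0.11464568375650427, 0.9222014164362317}
        };
        double[] x = {-1.072, 0.844, -1.052};
        // For a linear layer the central difference is exact up to rounding,
        // so the error stays near 1e-12, as in the statistics above.
        System.out.println("max |measured - implemented| = " + maxJacobianError(x, w, 1e-6));
    }
}
```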

Learning Validation

We validate agreement between the implemented derivative with respect to the internal weights and finite-difference estimates:

Code from SingleDerivativeTester.java:311 executed in 0.00 seconds:

    return testLearning(statistics, component, inputPrototype, outputPrototype);

Logging:

    Learning Gradient for weight setByCoord 0
    Implemented Gradient: [ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ] ]
    Implemented Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=27, positive=0, stdDev=0.0, zeros=27}
    Measured Gradient: [ [ -1.0719999999997398, 0.0, 0.0 ], [ 0.0, -1.0719999999997398, 0.0 ], [ 0.0, 0.0, -1.0720000000041807 ], [ 0.8439999999998449, 0.0, 0.0 ], [ 0.0, 0.8439999999998449, 0.0 ], [ 0.0, 0.0, 0.843999999995404 ], [ -1.0519999999999974, 0.0, 0.0 ], [ 0.0, -1.0520000000013852, 0.0 ], [ 0.0, 0.0, -1.0520000000013852 ] ]
    Measured Statistics: {meanExponent=-0.007149009399946614, negative=6, min=-1.0520000000013852, max=-1.0520000000013852, mean=-0.14222222222264203, count=27, positive=3, stdDev=0.5563950989835548, zeros=18}
    Gradient Error: [ [ -1.0719999999997398, 0.0, 0.0 ], [ 0.0, -1.0719999999997398, 0.0 ], [ 0.0, 0.0, -1.0720000000041807 ], [ 0.8439999999998449, 0.0, 0.0 ], [ 0.0, 0.8439999999998449, 0.0 ], [ 0.0, 0.0, 0.843999999995404 ], [ -1.0519999999999974, 0.0, 0.0 ], [ 0.0, -1.0520000000013852, 0.0 ], [ 0.0, 0.0, -1.0520000000013852 ] ]
    Error Statistics: {meanExponent=-0.007149009399946614, negative=6, min=-1.0520000000013852, max=-1.0520000000013852, mean=-0.14222222222264203, count=27, positive=3, stdDev=0.5563950989835548, zeros=18}
    

Returns:

    java.lang.AssertionError: ToleranceStatistics{absoluteTol=3.2978e-01 +- 4.7016e-01 [0.0000e+00 - 1.0720e+00] (27#), relativeTol=1.0000e+00 +- 0.0000e+00 [1.0000e+00 - 1.0000e+00] (9#)}
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.lambda$testLearning$23(SingleDerivativeTester.java:353)
    	at java.util.stream.IntPipeline$4$1.accept(IntPipeline.java:250)
    	at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110)
    	at java.util.Spliterator$OfInt.forEachRemaining(Spliterator.java:693)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:479)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.testLearning(SingleDerivativeTester.java:386)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.lambda$test$18(SingleDerivativeTester.java:312)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$null$1(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:59)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$code$2(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.test.SysOutInterceptor.withOutput(SysOutInterceptor.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.code(MarkdownNotebookOutput.java:203)
    	at com.simiacryptus.util.io.NotebookOutput.code(NotebookOutput.java:82)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.test(SingleDerivativeTester.java:311)
    	at com.simiacryptus.mindseye.test.unit.SingleDerivativeTester.test(SingleDerivativeTester.java:42)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.lambda$run$5(StandardLayerTests.java:257)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
    	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
    	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.run(StandardLayerTests.java:256)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.lambda$run$0(NotebookReportBase.java:105)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:76)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.run(NotebookReportBase.java:103)
    	at com.simiacryptus.mindseye.layers.LayerTestBase.test(LayerTestBase.java:37)
    	at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runners.Suite.runChild(Suite.java:128)
    	at org.junit.runners.Suite.runChild(Suite.java:27)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
    	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
    	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
    	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
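The failure above is consistent with the layer reporting an all-zero weight gradient while the finite-difference measurement recovers the input components: for y[j] = Σ_i x[i]·w[i][j], the exact gradient is ∂y[j]/∂w[i][j] = x[i] (and 0 for other outputs), which matches the -1.072 / 0.844 / -1.052 pattern in the "Measured Gradient" log. A hypothetical sketch of the relation the tester measures:

```java
// Hypothetical sketch: for y[j] = sum_i x[i] * w[i][j], the exact weight
// gradient d(y[j])/d(w[i][j]) = x[i], which a central finite difference
// over the perturbed weight recovers.
public class LearningCheck {
    static double[] forward(double[] x, double[][] w) {
        double[] y = new double[w[0].length];
        for (int i = 0; i < x.length; i++)
            for (int j = 0; j < y.length; j++)
                y[j] += x[i] * w[i][j];
        return y;
    }

    static double[][] deepCopy(double[][] m) {
        double[][] c = new double[m.length][];
        for (int i = 0; i < m.length; i++) c[i] = m[i].clone();
        return c;
    }

    // Central finite-difference estimate of d(y[j])/d(w[i][j])
    static double measuredWeightGradient(double[] x, double[][] w, int i, int j, double eps) {
        double[][] wp = deepCopy(w), wm = deepCopy(w);
        wp[i][j] += eps;
        wm[i][j] -= eps;
        return (forward(x, wp)[j] - forward(x, wm)[j]) / (2 * eps);
    }

    public static void main(String[] args) {
        // Illustrative weights; input from the differential-validation log above
        double[][] w = {{0.1, 0.7, 0.8}, {0.3, -0.2, -0.4}, {0.08, 0.11, 0.9}};
        double[] x = {-1.072, 0.844, -1.052};
        for (int i = 0; i < 3; i++)
            System.out.println("d(y[0])/d(w[" + i + "][0]) ~ "
                + measuredWeightGradient(x, w, i, 0, 1e-6)
                + "  (expect x[" + i + "] = " + x[i] + ")");
    }
}
```

Since the measured values are nonzero wherever x[i] is nonzero, an implemented gradient of all zeros yields a relative error of 1.0 at every such coordinate, exactly as the ToleranceStatistics in the AssertionError report.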