
Sigmoid_Float

Configured with float (32-bit) precision using the Sigmoid function


  1. Serialization
    1. Raw Json
  2. Example Input/Output Pair
  3. Batch Execution
  4. Differential Validation
    1. Feedback Validation
    2. Learning Validation
    3. Total Accuracy
    4. Frozen and Alive Status
  5. Reference Implementation

Target Description: The generic Activation layer, exposing the activation types provided by CuDNN. This layer is stateless and is determined by a univariate function, e.g. ReLU or Sigmoid.

Report Description: Configured with float (32-bit) precision using the Sigmoid function

Serialization

This run demonstrates the layer’s JSON serialization and verifies deserialization integrity.

Raw Json

Code from SerializationTest.java:84 executed in 0.00 seconds:

    final JsonObject json = layer.getJson();
    final NNLayer echo = NNLayer.fromJson(json);
    if (echo == null) throw new AssertionError("Failed to deserialize");
    if (layer == echo) throw new AssertionError("Serialization did not copy");
    if (!layer.equals(echo)) throw new AssertionError("Serialization not equal");
    return new GsonBuilder().setPrettyPrinting().create().toJson(json);

Returns:

    {
      "class": "com.simiacryptus.mindseye.layers.cudnn.ActivationLayer",
      "id": "82dc6c7c-b229-40e5-948b-37bce086c25a",
      "isFrozen": false,
      "name": "ActivationLayer/82dc6c7c-b229-40e5-948b-37bce086c25a",
      "mode": 0,
      "precision": "Float"
    }

Wrote Model to ActivationLayer_Sigmoid_Float.json; 243 characters
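
The `mode` field presumably maps to cuDNN's `cudnnActivationMode_t`, in which `CUDNN_ACTIVATION_SIGMOID` is 0. Below is a minimal sketch of reloading the written model, assuming the `NNLayer.fromJson(JsonObject)` factory used in the test above and standard Gson parsing; the `com.simiacryptus.mindseye.lang.NNLayer` import is an assumption, and the file name comes from the report line above:

    import com.google.gson.JsonObject;
    import com.google.gson.JsonParser;
    import com.simiacryptus.mindseye.lang.NNLayer; // assumed package for NNLayer
    import java.io.FileReader;
    import java.io.IOException;
    import java.io.Reader;
    
    public class ModelLoader {
      // Hypothetical helper: reload the layer written by this report.
      public static NNLayer load(final String path) throws IOException {
        try (Reader reader = new FileReader(path)) {
          // Parse the file back into a JsonObject...
          final JsonObject json = new JsonParser().parse(reader).getAsJsonObject();
          // ...and rebuild the layer via the same factory used in the round-trip test.
          return NNLayer.fromJson(json);
        }
      }
    
      public static void main(final String[] args) throws IOException {
        System.out.println(load("ActivationLayer_Sigmoid_Float.json"));
      }
    }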

Example Input/Output Pair

We display an example input/output pair from a random execution:

Code from ReferenceIO.java:69 executed in 0.00 seconds:

    final SimpleEval eval = SimpleEval.run(layer, inputPrototype);
    return String.format("--------------------\nInput: \n[%s]\n--------------------\nOutput: \n%s\n--------------------\nDerivative: \n%s",
                         Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get(),
                         eval.getOutput().prettyPrint(),
                         Arrays.stream(eval.getDerivative()).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get());

Returns:

    --------------------
    Input: 
    [[
    	[ [ -0.084 ], [ -0.648 ], [ -1.5 ], [ -1.084 ], [ -1.624 ], [ -1.848 ], [ 1.816 ], [ -0.8 ] ],
    	[ [ -1.32 ], [ 1.892 ], [ 1.544 ], [ 0.92 ], [ 1.7 ], [ -1.108 ], [ 0.36 ], [ -1.032 ] ],
    	[ [ 1.32 ], [ 1.724 ], [ -0.4 ], [ -1.992 ], [ 0.548 ], [ -0.248 ], [ 0.82 ], [ 1.696 ] ],
    	[ [ -0.056 ], [ -1.46 ], [ -1.28 ], [ -0.192 ], [ 0.112 ], [ -1.664 ], [ 1.752 ], [ 1.704 ] ],
    	[ [ -1.896 ], [ -1.796 ], [ 0.136 ], [ 0.1 ], [ 0.152 ], [ -0.936 ], [ 1.212 ], [ -0.448 ] ],
    	[ [ -1.144 ], [ 0.752 ], [ -1.472 ], [ -0.608 ], [ 1.124 ], [ -0.672 ], [ -0.032 ], [ -0.032 ] ],
    	[ [ 0.472 ], [ 1.308 ], [ 1.616 ], [ 1.008 ], [ -1.896 ], [ -0.216 ], [ 0.776 ], [ 1.996 ] ],
    	[ [ -1.16 ], [ 1.956 ], [ -0.248 ], [ 1.636 ], [ -1.228 ], [ -1.12 ], [ 0.436 ], [ 1.368 ] ]
    ]]
    --------------------
    Output: 
    [
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ]
    ]
    --------------------
    Derivative: 
    [
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ]
    ]
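
For reference, the sigmoid function is sigmoid(x) = 1 / (1 + exp(-x)) with derivative sigmoid(x) * (1 - sigmoid(x)), so every output should lie strictly between 0 and 1 for any finite input; the all-zero output above is anomalous, and the equivalency test at the end of this report fails for exactly this reason. A minimal standalone sketch of the expected values (plain double math; all names here are illustrative, not library API):

    public class SigmoidCheck {
      // Sigmoid: sigmoid(x) = 1 / (1 + e^(-x))
      static double sigmoid(final double x) {
        return 1.0 / (1.0 + Math.exp(-x));
      }
    
      // Derivative: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
      static double sigmoidDerivative(final double x) {
        final double s = sigmoid(x);
        return s * (1.0 - s);
      }
    
      public static void main(final String[] args) {
        final double x = -0.084; // first input cell from the example above
        System.out.printf("sigmoid(%.3f) = %.6f%n", x, sigmoid(x));            // ~0.479012
        System.out.printf("sigmoid'(%.3f) = %.6f%n", x, sigmoidDerivative(x)); // ~0.249560
      }
    }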

Batch Execution

Most layers, including this one, should produce the same results no matter how the items are split between batches. We verify this (a sketch of such a check follows the result):

Code from BatchingTester.java:113 executed in 0.01 seconds:

    return test(reference, inputPrototype);

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (1280#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}
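
A minimal sketch of a batching-invariance check, treating the layer as a plain batch function (all names are illustrative; the actual BatchingTester API is not shown in this report):

    import java.util.Arrays;
    import java.util.function.Function;
    
    public class BatchInvarianceCheck {
      // Verifies that evaluating items one at a time matches evaluating
      // them as a single batch, within an absolute tolerance.
      static boolean batchInvariant(final Function<double[][], double[][]> layer,
                                    final double[][] batch, final double tol) {
        final double[][] together = layer.apply(batch);
        for (int i = 0; i < batch.length; i++) {
          // Evaluate each item as its own batch of size 1.
          final double[] alone = layer.apply(new double[][]{batch[i]})[0];
          for (int j = 0; j < alone.length; j++) {
            if (Math.abs(alone[j] - together[i][j]) > tol) return false;
          }
        }
        return true;
      }
    
      public static void main(final String[] args) {
        // An elementwise sigmoid is trivially batch-invariant.
        final Function<double[][], double[][]> sigmoid = batch ->
            Arrays.stream(batch)
                  .map(row -> Arrays.stream(row).map(x -> 1.0 / (1.0 + Math.exp(-x))).toArray())
                  .toArray(double[][]::new);
        final double[][] data = {{-0.084, 1.816}, {0.36, -1.032}};
        System.out.println(batchInvariant(sigmoid, data, 1e-12)); // true
      }
    }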

Differential Validation

Code from SingleDerivativeTester.java:292 executed in 0.00 seconds:

    log.info(String.format("Inputs: %s", Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Inputs Statistics: %s", Arrays.stream(inputPrototype).map(x -> new ScalarStatistics().add(x.getData()).toString()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Output: %s", outputPrototype.prettyPrint()));
    log.info(String.format("Outputs Statistics: %s", new ScalarStatistics().add(outputPrototype.getData())));

Logging:

    Inputs: [
    	[ [ 1.396 ], [ -0.336 ], [ 1.028 ], [ -1.104 ], [ 1.696 ], [ -1.52 ], [ 0.064 ], [ -1.428 ] ],
    	[ [ 0.916 ], [ -0.08 ], [ 1.256 ], [ -1.424 ], [ 0.54 ], [ 0.584 ], [ -0.928 ], [ -1.376 ] ],
    	[ [ -0.048 ], [ 1.856 ], [ 1.396 ], [ 0.844 ], [ 1.276 ], [ -0.144 ], [ 1.276 ], [ 0.736 ] ],
    	[ [ 1.444 ], [ -1.812 ], [ 1.38 ], [ -0.58 ], [ 0.004 ], [ 0.776 ], [ 0.276 ], [ 1.308 ] ],
    	[ [ -1.904 ], [ -1.832 ], [ -0.488 ], [ 0.304 ], [ 0.296 ], [ -0.004 ], [ 1.524 ], [ -1.276 ] ],
    	[ [ -0.476 ], [ 0.08 ], [ -1.564 ], [ 0.78 ], [ 0.608 ], [ -1.568 ], [ 0.688 ], [ -0.604 ] ],
    	[ [ 0.492 ], [ -0.592 ], [ 0.4 ], [ 0.54 ], [ -0.836 ], [ -1.828 ], [ 0.872 ], [ -0.784 ] ],
    	[ [ -1.532 ], [ 0.928 ], [ -0.256 ], [ -0.708 ], [ -1.972 ], [ 1.256 ], [ -1.412 ], [ -0.4 ] ]
    ]
    Inputs Statistics: {meanExponent=-0.20054866265592047, negative=31, min=-0.4, max=-0.4, mean=-0.031187500000000007, count=64, positive=33, stdDev=1.0856136927304068, zeros=0}
    Output: [
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ]
    ]
    Outputs Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=64, positive=0, stdDev=0.0, zeros=64}
    

Feedback Validation

We validate that the implemented derivative with respect to the inputs agrees with finite-difference estimates (a sketch of the method follows the result):

Code from SingleDerivativeTester.java:303 executed in 0.08 seconds:

    return testFeedback(statistics, component, inputPrototype, outputPrototype);

Logging:

    Feedback for input 0
    Inputs Values: [
    	[ [ 1.396 ], [ -0.336 ], [ 1.028 ], [ -1.104 ], [ 1.696 ], [ -1.52 ], [ 0.064 ], [ -1.428 ] ],
    	[ [ 0.916 ], [ -0.08 ], [ 1.256 ], [ -1.424 ], [ 0.54 ], [ 0.584 ], [ -0.928 ], [ -1.376 ] ],
    	[ [ -0.048 ], [ 1.856 ], [ 1.396 ], [ 0.844 ], [ 1.276 ], [ -0.144 ], [ 1.276 ], [ 0.736 ] ],
    	[ [ 1.444 ], [ -1.812 ], [ 1.38 ], [ -0.58 ], [ 0.004 ], [ 0.776 ], [ 0.276 ], [ 1.308 ] ],
    	[ [ -1.904 ], [ -1.832 ], [ -0.488 ], [ 0.304 ], [ 0.296 ], [ -0.004 ], [ 1.524 ], [ -1.276 ] ],
    	[ [ -0.476 ], [ 0.08 ], [ -1.564 ], [ 0.78 ], [ 0.608 ], [ -1.568 ], [ 0.688 ], [ -0.604 ] ],
    	[ [ 0.492 ], [ -0.592 ], [ 0.4 ], [ 0.54 ], [ -0.836 ], [ -1.828 ], [ 0.872 ], [ -0.784 ] ],
    	[ [ -1.532 ], [ 0.928 ], [ -0.256 ], [ -0.708 ], [ -1.972 ], [ 1.256 ], [ -1.412 ], [ -0.4 ] ]
    ]
    Value Statistics: {meanExponent=-0.20054866265592047, negative=31, min=-0.4, max=-0.4, mean=-0.031187500000000007, count=64, positive=33, stdDev=1.0856136927304068, zeros=0}
    Implemented Feedback: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Implemented Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=4096, positive=0, stdDev=0.0, zeros=4096}
    Measured Feedback: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Measured Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=4096, positive=0, stdDev=0.0, zeros=4096}
    Feedback Error: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Error Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=4096, positive=0, stdDev=0.0, zeros=4096}
    

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (4096#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}
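
A minimal sketch of the finite-difference method being applied, using a central difference (f(x+h) - f(x-h)) / 2h on a single coordinate (illustrative names, not the SingleDerivativeTester API):

    import java.util.function.DoubleUnaryOperator;
    
    public class FiniteDifferenceCheck {
      // Central-difference estimate of f'(x): (f(x+h) - f(x-h)) / (2h).
      static double numericDerivative(final DoubleUnaryOperator f, final double x, final double h) {
        return (f.applyAsDouble(x + h) - f.applyAsDouble(x - h)) / (2.0 * h);
      }
    
      public static void main(final String[] args) {
        final DoubleUnaryOperator sigmoid = x -> 1.0 / (1.0 + Math.exp(-x));
        // Analytic derivative of sigmoid: s * (1 - s)
        final DoubleUnaryOperator analytic = x -> {
          final double s = sigmoid.applyAsDouble(x);
          return s * (1.0 - s);
        };
        final double x = 1.396; // first input cell from the log above
        final double measured = numericDerivative(sigmoid, x, 1e-6);
        final double implemented = analytic.applyAsDouble(x);
        // Agreement to roughly 1e-9 absolute error is expected for a smooth function.
        System.out.printf("measured=%.9f implemented=%.9f error=%.3e%n",
            measured, implemented, measured - implemented);
      }
    }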

Learning Validation

We validate that the implemented derivative with respect to the internal weights agrees with finite-difference estimates (a sketch for a layer that owns weights follows the result):

Code from SingleDerivativeTester.java:311 executed in 0.00 seconds:

    return testLearning(statistics, component, inputPrototype, outputPrototype);

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (4096#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}
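
For a layer that does own weights, the analogous check perturbs a weight rather than an input. A minimal sketch under that assumption, using a toy layer y = w * x so that dy/dw = x (illustrative names):

    public class WeightDerivativeCheck {
      public static void main(final String[] args) {
        final double x = 0.916; // fixed input
        final double w = 1.25;  // the weight under test
        final double h = 1e-6;
    
        // Implemented derivative of y = w * x with respect to w is simply x.
        final double implemented = x;
    
        // Central finite difference over the weight.
        final double measured = ((w + h) * x - (w - h) * x) / (2.0 * h);
    
        System.out.printf("implemented=%.9f measured=%.9f error=%.3e%n",
            implemented, measured, measured - implemented);
      }
    }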

Total Accuracy

The overall agreement between the implemented derivative and the finite-difference estimates:

Code from SingleDerivativeTester.java:319 executed in 0.00 seconds:

    //log.info(String.format("Component: %s\nInputs: %s\noutput=%s", component, Arrays.toString(inputPrototype), outputPrototype));
    log.info(String.format("Finite-Difference Derivative Accuracy:"));
    log.info(String.format("absoluteTol: %s", statistics.absoluteTol));
    log.info(String.format("relativeTol: %s", statistics.relativeTol));

Logging:

    Finite-Difference Derivative Accuracy:
    absoluteTol: 0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (4096#)
    relativeTol: 0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)
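
Each statistics line reads as mean +- standard deviation, followed by the [min - max] range and the sample count (#); the [Infinity - -Infinity] (0#) relative-tolerance entries above reflect an empty sample set whose min/max sentinels were never updated. A minimal sketch of producing that format (illustrative, not the library's ScalarStatistics):

    public class StatsFormat {
      // Accumulates mean / stddev / min / max over a set of samples.
      static String summarize(final double[] values) {
        double sum = 0, sumSq = 0;
        double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY;
        for (final double v : values) {
          sum += v;
          sumSq += v * v;
          min = Math.min(min, v);
          max = Math.max(max, v);
        }
        final int n = values.length;
        final double mean = sum / n;
        final double stdDev = Math.sqrt(Math.max(0, sumSq / n - mean * mean));
        // Same shape as the report's "mean +- stddev [min - max] (count#)" lines.
        return String.format("%.4e +- %.4e [%.4e - %.4e] (%d#)", mean, stdDev, min, max, n);
      }
    
      public static void main(final String[] args) {
        System.out.println(summarize(new double[]{0.0, 0.0, 0.0, 0.0}));
        // 0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (4#)
      }
    }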
    

Frozen and Alive Status

Code from SingleDerivativeTester.java:327 executed in 0.00 seconds:

    testFrozen(component, inputPrototype);
    testUnFrozen(component, inputPrototype);
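
The intent of this test is that a frozen layer must not accumulate weight gradients during backpropagation, while an alive (unfrozen) layer must. A minimal toy sketch of that contract (illustrative only, not the MindsEye API):

    public class FrozenCheck {
      // Toy layer with one weight; gradient accumulation is gated by `frozen`.
      static class ScaleLayer {
        double weight = 2.0;
        double weightGradient = 0.0;
        boolean frozen = false;
    
        double forward(final double x) { return weight * x; }
    
        void backward(final double x, final double outputGradient) {
          // Input gradients always flow; the weight gradient only when alive.
          if (!frozen) weightGradient += x * outputGradient;
        }
      }
    
      public static void main(final String[] args) {
        final ScaleLayer layer = new ScaleLayer();
        System.out.println(layer.forward(1.5)); // 3.0
    
        layer.frozen = true;
        layer.backward(1.5, 1.0);
        System.out.println(layer.weightGradient); // 0.0: frozen layers accumulate nothing
    
        layer.frozen = false;
        layer.backward(1.5, 1.0);
        System.out.println(layer.weightGradient); // 1.5: alive layers do
      }
    }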

Reference Implementation

This layer is an alternate implementation that is expected to behave identically to the following reference layer:

Code from EquivalencyTester.java:102 executed in 0.00 seconds:

    log.info(new GsonBuilder().setPrettyPrinting().create().toJson(reference.getJson()));

Logging:

    {
      "class": "com.simiacryptus.mindseye.layers.java.SigmoidActivationLayer",
      "id": "a719f901-6dcf-4fac-bf4f-b9fdc1078dd5",
      "isFrozen": true,
      "name": "SigmoidActivationLayer/a719f901-6dcf-4fac-bf4f-b9fdc1078dd5",
      "balanced": false
    }
    

We measure the agreement between the two layers in a random execution:

Code from EquivalencyTester.java:106 executed in 0.00 seconds:

    return test(subject, inputPrototype);

Logging:

    Inputs: Optional[[
    	[ [ 0.352 ], [ -0.728 ], [ 0.952 ], [ -0.612 ], [ -1.76 ], [ 1.356 ], [ -0.696 ], [ -0.04 ] ],
    	[ [ 0.512 ], [ 1.172 ], [ 0.12 ], [ 1.256 ], [ 1.588 ], [ -0.372 ], [ -0.44 ], [ 1.076 ] ],
    	[ [ -0.184 ], [ -1.012 ], [ -0.064 ], [ -1.488 ], [ 0.548 ], [ 1.996 ], [ 1.832 ], [ 1.148 ] ],
    	[ [ 1.36 ], [ 0.4 ], [ 0.504 ], [ -1.968 ], [ -1.328 ], [ -0.028 ], [ 1.58 ], [ 1.064 ] ],
    	[ [ -1.76 ], [ -1.468 ], [ -1.88 ], [ 0.672 ], [ -1.08 ], [ 1.18 ], [ 1.096 ], [ -1.732 ] ],
    	[ [ 1.444 ], [ 0.916 ], [ -1.112 ], [ 1.156 ], [ 1.08 ], [ 0.292 ], [ -0.056 ], [ -0.8 ] ],
    	[ [ -1.768 ], [ 1.996 ], [ -0.884 ], [ 1.68 ], [ 1.556 ], [ 0.776 ], [ 0.904 ], [ 0.892 ] ],
    	[ [ 0.028 ], [ 1.004 ], [ 0.572 ], [ 0.144 ], [ 0.744 ], [ 1.584 ], [ 1.144 ], [ -1.176 ] ]
    ]]
    Subject Output: [
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ]
    ]
    Reference Output: [
    	[ [ 0.5871024895421696 ], [ 0.3256337669975617 ], [ 0.7215172162514372 ], [ 0.3516031059824349 ], [ 0.1467903398013824 ], [ 0.7951088237697858 ], [ 0.3326996751729084 ], [ 0.4900013331200346 ] ],
    	[ [ 0.6252752039990873 ], [ 0.7635063348491598 ], [ 0.5299640517645717 ], [ 0.7783367615918962 ], [ 0.8303345309905271 ], [ 0.40805783960041064 ], [ 0.3917409692534856 ], [ 0.7457362744004505 ] ],
    	[ [ 0.4541293434458062 ], [ 0.26658862999609406 ], [ 0.48400545909729803 ], [ 0.18422210625703123 ], [ 0.6336714513637738 ], [ 0.8803764635220647 ], [ 0.861999811394382 ], [ 0.759145419201664 ] ],
    	[ [ 0.7957596977159083 ], [ 0.598687660112452 ], [ 0.6233988845696153 ], [ 0.12260386880496975 ], [ 0.20949038109877877 ], [ 0.49300045729748126 ], [ 0.8292045179776256 ], [ 0.7434542081463438 ] ],
    	[ [ 0.1467903398013824 ], [ 0.18724679446465686 ], [ 0.1323888735420654 ], [ 0.6619508479065779 ], [ 0.2535060166623378 ], [ 0.7649478037637647 ], [ 0.7495098760671497 ], [ 0.15033193655972282 ] ],
    	[ [ 0.8090733102129716 ], [ 0.7142263775539306 ], [ 0.24749821478614017 ], [ 0.7606051344021186 ], [ 0.7464939833376621 ], [ 0.5724856953888094 ], [ 0.4860036575196728 ], [ 0.31002551887238755 ] ],
    	[ [ 0.14579122481895845 ], [ 0.8803764635220647 ], [ 0.29234956690584907 ], [ 0.8429045311145473 ], [ 0.8257786295396119 ], [ 0.684817382739301 ], [ 0.7117708097419858 ], [ 0.709302729861882 ] ],
    	[ [ 0.5069995427025188 ], [ 0.7318442991255001 ], [ 0.6392245365175633 ], [ 0.5359379207244088 ], [ 0.6778699261136105 ], [ 0.8297702697724747 ], [ 0.7584132866558312 ], [ 0.23577216894214378 ] ]
    ]
    Error: [
    	[ [ -0.5871024895421696 ], [ -0.3256337669975617 ], [ -0.7215172162514372 ], [ -0.3516031059824349 ], [ -0.1467903398013824 ], [ -0.7951088237697858 ], [ -0.3326996751729084 ], [ -0.4900013331200346 ] ],
    	[ [ -0.6252752039990873 ], [ -0.7635063348491598 ], [ -0.5299640517645717 ], [ -0.7783367615918962 ], [ -0.8303345309905271 ], [ -0.40805783960041064 ], [ -0.3917409692534856 ], [ -0.7457362744004505 ] ],
    	[ [ -0.4541293434458062 ], [ -0.26658862999609406 ], [ -0.48400545909729803 ], [ -0.18422210625703123 ], [ -0.6336714513637738 ], [ -0.8803764635220647 ], [ -0.861999811394382 ], [ -0.759145419201664 ] ],
    	[ [ -0.7957596977159083 ], [ -0.598687660112452 ], [ -0.6233988845696153 ], [ -0.12260386880496975 ], [ -0.20949038109877877 ], [ -0.49300045729748126 ], [ -0.8292045179776256 ], [ -0.7434542081463438 ] ],
    	[ [ -0.1467903398013824 ], [ -0.18724679446465686 ], [ -0.1323888735420654 ], [ -0.6619508479065779 ], [ -0.2535060166623378 ], [ -0.7649478037637647 ], [ -0.7495098760671497 ], [ -0.15033193655972282 ] ],
    	[ [ -0.8090733102129716 ], [ -0.7142263775539306 ], [ -0.24749821478614017 ], [ -0.7606051344021186 ], [ -0.7464939833376621 ], [ -0.5724856953888094 ], [ -0.4860036575196728 ], [ -0.31002551887238755 ] ],
    	[ [ -0.14579122481895845 ], [ -0.8803764635220647 ], [ -0.29234956690584907 ], [ -0.8429045311145473 ], [ -0.8257786295396119 ], [ -0.684817382739301 ], [ -0.7117708097419858 ], [ -0.709302729861882 ] ],
    	[ [ -0.5069995427025188 ], [ -0.7318442991255001 ], [ -0.6392245365175633 ], [ -0.5359379207244088 ], [ -0.6778699261136105 ], [ -0.8297702697724747 ], [ -0.7584132866558312 ], [ -0.23577216894214378 ] ]
    ]
    

Returns:

    java.lang.AssertionError: ToleranceStatistics{absoluteTol=5.5414e-01 +- 2.3799e-01 [1.2260e-01 - 8.8038e-01] (64#), relativeTol=1.0000e+00 +- 0.0000e+00 [1.0000e+00 - 1.0000e+00] (64#)}
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:71)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.lambda$test$7(EquivalencyTester.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$null$1(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:59)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$code$2(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.test.SysOutInterceptor.withOutput(SysOutInterceptor.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.code(MarkdownNotebookOutput.java:203)
    	at com.simiacryptus.util.io.NotebookOutput.code(NotebookOutput.java:82)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:106)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:37)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.lambda$run$5(StandardLayerTests.java:257)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
    	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
    	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.run(StandardLayerTests.java:256)
    	at com.simiacryptus.mindseye.layers.cudnn.ActivationLayerTest.run(ActivationLayerTest.java:81)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.lambda$run$0(NotebookReportBase.java:105)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:76)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.run(NotebookReportBase.java:103)
    	at com.simiacryptus.mindseye.layers.LayerTestBase.test(LayerTestBase.java:37)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runners.Suite.runChild(Suite.java:128)
    	at org.junit.runners.Suite.runChild(Suite.java:27)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
    	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
    	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
    	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
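
The failure is consistent with the outputs logged above: the subject's output is identically zero while the reference produces sigmoid values in (0, 1), so the elementwise error is exactly the negated reference output and every one of the 64 cells violates the tolerance. A minimal sketch of the comparison (illustrative names; the EquivalencyTester internals are not shown in this report):

    public class EquivalencyCheck {
      // Maximum absolute elementwise difference between two outputs.
      static double maxAbsError(final double[] subject, final double[] reference) {
        double worst = 0.0;
        for (int i = 0; i < subject.length; i++) {
          worst = Math.max(worst, Math.abs(subject[i] - reference[i]));
        }
        return worst;
      }
    
      public static void main(final String[] args) {
        final double[] input = {0.352, -0.728, 0.952}; // first cells from the log above
        final double[] subject = {0.0, 0.0, 0.0};      // the failing CuDNN layer's output
        final double[] reference = new double[input.length];
        for (int i = 0; i < input.length; i++) {
          // SigmoidActivationLayer behavior: matches the reference output logged above.
          reference[i] = 1.0 / (1.0 + Math.exp(-input[i]));
        }
        // Error magnitudes equal the reference values themselves.
        System.out.printf("max |error| = %.4f%n", maxAbsError(subject, reference)); // 0.7215
      }
    }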