
ReLu_Float

Configured with float (32-bit) precision; y = x < 0 ? 0 : x


Project maintained by SimiaCryptus. Java, CuDNN, and CUDA are trademarks of their respective owners. No endorsement is implied.
  1. Serialization
    1. Raw JSON
  2. Example Input/Output Pair
  3. Batch Execution
  4. Differential Validation
    1. Feedback Validation
    2. Learning Validation
    3. Total Accuracy
    4. Frozen and Alive Status
  5. Reference Implementation

Target Description: The generic Activation layer, exposing the activation types provided by CuDNN. This layer is stateless and is defined by a univariate function, e.g. ReLU or Sigmoid.

Report Description: Configured with float (32-bit) precision; y = x < 0 ? 0 : x
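
The univariate function named in the description can be sketched in plain Java. This is a minimal illustration of the elementwise rule, not the CuDNN-backed implementation under test:

```java
// Minimal sketch of y = x < 0 ? 0 : x, applied per element in float precision.
public class ReluSketch {
    static float relu(float x) {
        return x < 0f ? 0f : x;
    }

    public static void main(String[] args) {
        float[] input = {1.104f, -1.016f, 0.78f};
        float[] output = new float[input.length];
        for (int i = 0; i < input.length; i++) output[i] = relu(input[i]);
        // Negative entries clamp to zero; positive entries pass through unchanged
        System.out.println(java.util.Arrays.toString(output));
    }
}
```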

Serialization

This run demonstrates the layer's JSON serialization and verifies deserialization integrity.

Raw JSON

Code from SerializationTest.java:84 executed in 0.00 seconds:

    final JsonObject json = layer.getJson();
    final NNLayer echo = NNLayer.fromJson(json);
    if (echo == null) throw new AssertionError("Failed to deserialize");
    if (layer == echo) throw new AssertionError("Serialization did not copy");
    if (!layer.equals(echo)) throw new AssertionError("Serialization not equal");
    return new GsonBuilder().setPrettyPrinting().create().toJson(json);

Returns:

    {
      "class": "com.simiacryptus.mindseye.layers.cudnn.ActivationLayer",
      "id": "7b087bbb-0741-4cd0-a2a9-b3b080de9593",
      "isFrozen": false,
      "name": "ActivationLayer/7b087bbb-0741-4cd0-a2a9-b3b080de9593",
      "mode": 1,
      "precision": "Float"
    }

Wrote Model to ActivationLayer_ReLu_Float.json; 243 characters

Example Input/Output Pair

We display input/output pairs from random executions:

Code from ReferenceIO.java:69 executed in 0.17 seconds:

    final SimpleEval eval = SimpleEval.run(layer, inputPrototype);
    return String.format("--------------------\nInput: \n[%s]\n--------------------\nOutput: \n%s\n--------------------\nDerivative: \n%s",
                         Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get(),
                         eval.getOutput().prettyPrint(),
                         Arrays.stream(eval.getDerivative()).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get());

Returns:

    --------------------
    Input: 
    [[
    	[ [ 1.104 ], [ -1.016 ], [ 0.78 ], [ 0.068 ], [ 0.364 ], [ -1.6 ], [ -0.444 ], [ -1.048 ] ],
    	[ [ -1.54 ], [ -1.0 ], [ 1.98 ], [ 1.152 ], [ -1.296 ], [ 0.472 ], [ 1.8 ], [ 1.896 ] ],
    	[ [ -0.756 ], [ -1.592 ], [ 0.12 ], [ 1.372 ], [ 0.136 ], [ -0.196 ], [ 1.388 ], [ -1.884 ] ],
    	[ [ 1.044 ], [ -1.06 ], [ -0.216 ], [ -0.084 ], [ 0.572 ], [ 1.8 ], [ -1.032 ], [ -0.48 ] ],
    	[ [ 0.316 ], [ 1.56 ], [ 1.856 ], [ 0.748 ], [ -1.128 ], [ 1.532 ], [ -1.232 ], [ 0.096 ] ],
    	[ [ -0.528 ], [ -0.828 ], [ -0.096 ], [ 1.024 ], [ 1.92 ], [ 0.232 ], [ -1.152 ], [ -0.544 ] ],
    	[ [ 0.82 ], [ 1.056 ], [ 1.988 ], [ -1.344 ], [ 1.672 ], [ -0.952 ], [ -1.628 ], [ -0.228 ] ],
    	[ [ 1.016 ], [ -1.896 ], [ 0.332 ], [ 1.236 ], [ 1.74 ], [ -1.356 ], [ 1.828 ], [ -1.652 ] ]
    ]]
    --------------------
    Output: 
    [
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ]
    ]
    --------------------
    Derivative: 
    [
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ]
    ]

GPU Log

Batch Execution

Most layers, including this one, should produce identical results regardless of how items are split between batches. We verify this:

Code from BatchingTester.java:113 executed in 0.01 seconds:

    return test(reference, inputPrototype);

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (1280#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}
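
The invariance being asserted can be sketched in plain Java; the `relu` helper stands in for the layer under test, and the names here are illustrative, not the tester's API:

```java
// Sketch of batch invariance: evaluating items one at a time must match
// evaluating them all in a single batch.
public class BatchInvarianceSketch {
    static float relu(float x) { return x < 0f ? 0f : x; }

    static float[] evalBatch(float[] batch) {
        float[] out = new float[batch.length];
        for (int i = 0; i < batch.length; i++) out[i] = relu(batch[i]);
        return out;
    }

    public static void main(String[] args) {
        float[] items = {1.5f, -0.5f, 0f, 2f};
        float[] batched = evalBatch(items);
        // Re-evaluate each item as its own batch of one and compare
        for (int i = 0; i < items.length; i++) {
            float single = evalBatch(new float[]{items[i]})[0];
            if (single != batched[i]) throw new AssertionError("Batch mismatch at " + i);
        }
        System.out.println("Batch invariance holds");
    }
}
```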

Differential Validation

Code from SingleDerivativeTester.java:292 executed in 0.00 seconds:

    log.info(String.format("Inputs: %s", Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Inputs Statistics: %s", Arrays.stream(inputPrototype).map(x -> new ScalarStatistics().add(x.getData()).toString()).reduce((a, b) -> a + ",\n" + b).get()));
    log.info(String.format("Output: %s", outputPrototype.prettyPrint()));
    log.info(String.format("Outputs Statistics: %s", new ScalarStatistics().add(outputPrototype.getData())));

Logging:

    Inputs: [
    	[ [ 0.76 ], [ 1.752 ], [ 0.74 ], [ -1.508 ], [ -0.772 ], [ 1.06 ], [ 1.36 ], [ 0.836 ] ],
    	[ [ -1.648 ], [ 0.328 ], [ -0.996 ], [ 1.66 ], [ 1.184 ], [ 1.38 ], [ -0.24 ], [ 0.588 ] ],
    	[ [ 0.216 ], [ 0.58 ], [ -0.508 ], [ -1.532 ], [ -1.548 ], [ -0.668 ], [ -1.7 ], [ 0.496 ] ],
    	[ [ 1.352 ], [ -1.164 ], [ -0.968 ], [ 1.996 ], [ 0.924 ], [ -1.988 ], [ -0.724 ], [ 1.876 ] ],
    	[ [ -0.64 ], [ -1.18 ], [ -1.784 ], [ 1.88 ], [ 0.072 ], [ -0.728 ], [ -0.896 ], [ -0.272 ] ],
    	[ [ 1.404 ], [ -0.296 ], [ 1.54 ], [ 0.36 ], [ 1.68 ], [ -0.092 ], [ -0.196 ], [ -1.4 ] ],
    	[ [ 1.868 ], [ 1.048 ], [ 1.832 ], [ 1.46 ], [ -0.9 ], [ -1.868 ], [ -1.66 ], [ 1.952 ] ],
    	[ [ -0.92 ], [ 1.804 ], [ -1.74 ], [ -0.856 ], [ -0.048 ], [ -1.888 ], [ -0.48 ], [ 0.976 ] ]
    ]
    Inputs Statistics: {meanExponent=-0.05874533371193459, negative=33, min=0.976, max=0.976, mean=0.04931250000000008, count=64, positive=31, stdDev=1.2478363383648312, zeros=0}
    Output: [
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ]
    ]
    Outputs Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=64, positive=0, stdDev=0.0, zeros=64}
    

Feedback Validation

We validate agreement between the implemented derivative with respect to the inputs and finite-difference estimates:

Code from SingleDerivativeTester.java:303 executed in 0.12 seconds:

    return testFeedback(statistics, component, inputPrototype, outputPrototype);

Logging:

    Feedback for input 0
    Inputs Values: [
    	[ [ 0.76 ], [ 1.752 ], [ 0.74 ], [ -1.508 ], [ -0.772 ], [ 1.06 ], [ 1.36 ], [ 0.836 ] ],
    	[ [ -1.648 ], [ 0.328 ], [ -0.996 ], [ 1.66 ], [ 1.184 ], [ 1.38 ], [ -0.24 ], [ 0.588 ] ],
    	[ [ 0.216 ], [ 0.58 ], [ -0.508 ], [ -1.532 ], [ -1.548 ], [ -0.668 ], [ -1.7 ], [ 0.496 ] ],
    	[ [ 1.352 ], [ -1.164 ], [ -0.968 ], [ 1.996 ], [ 0.924 ], [ -1.988 ], [ -0.724 ], [ 1.876 ] ],
    	[ [ -0.64 ], [ -1.18 ], [ -1.784 ], [ 1.88 ], [ 0.072 ], [ -0.728 ], [ -0.896 ], [ -0.272 ] ],
    	[ [ 1.404 ], [ -0.296 ], [ 1.54 ], [ 0.36 ], [ 1.68 ], [ -0.092 ], [ -0.196 ], [ -1.4 ] ],
    	[ [ 1.868 ], [ 1.048 ], [ 1.832 ], [ 1.46 ], [ -0.9 ], [ -1.868 ], [ -1.66 ], [ 1.952 ] ],
    	[ [ -0.92 ], [ 1.804 ], [ -1.74 ], [ -0.856 ], [ -0.048 ], [ -1.888 ], [ -0.48 ], [ 0.976 ] ]
    ]
    Value Statistics: {meanExponent=-0.05874533371193459, negative=33, min=0.976, max=0.976, mean=0.04931250000000008, count=64, positive=31, stdDev=1.2478363383648312, zeros=0}
    Implemented Feedback: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Implemented Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=4096, positive=0, stdDev=0.0, zeros=4096}
    Measured Feedback: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Measured Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=4096, positive=0, stdDev=0.0, zeros=4096}
    Feedback Error: [ [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, ... ], ... ]
    Error Statistics: {meanExponent=NaN, negative=0, min=0.0, max=0.0, mean=0.0, count=4096, positive=0, stdDev=0.0, zeros=4096}
    

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (4096#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}
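
The check above can be sketched in plain Java: compare the analytic ReLU subgradient against a central-difference estimate. The step size, tolerance, and probe points here are illustrative, not the tester's actual settings:

```java
// Sketch of finite-difference derivative validation for ReLU.
public class FiniteDifferenceSketch {
    static double relu(double x) { return x < 0 ? 0 : x; }

    // Analytic (implemented) derivative: 0 for x < 0, 1 for x >= 0
    static double analyticDerivative(double x) { return x < 0 ? 0 : 1; }

    // Central-difference (measured) estimate
    static double measuredDerivative(double x, double h) {
        return (relu(x + h) - relu(x - h)) / (2 * h);
    }

    public static void main(String[] args) {
        double h = 1e-6;
        // Probe points away from the non-differentiable kink at 0
        for (double x : new double[]{-1.5, -0.25, 0.75, 2.0}) {
            double implemented = analyticDerivative(x);
            double measured = measuredDerivative(x, h);
            if (Math.abs(implemented - measured) > 1e-4)
                throw new AssertionError("Derivative mismatch at x=" + x);
        }
        System.out.println("Implemented and measured derivatives agree");
    }
}
```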

Learning Validation

We validate agreement between the implemented derivative with respect to the internal weights and finite-difference estimates:

Code from SingleDerivativeTester.java:311 executed in 0.00 seconds:

    return testLearning(statistics, component, inputPrototype, outputPrototype);

Returns:

    ToleranceStatistics{absoluteTol=0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (4096#), relativeTol=0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)}

Total Accuracy

The overall agreement between the implemented derivative and the finite-difference estimates:

Code from SingleDerivativeTester.java:319 executed in 0.00 seconds:

    //log.info(String.format("Component: %s\nInputs: %s\noutput=%s", component, Arrays.toString(inputPrototype), outputPrototype));
    log.info(String.format("Finite-Difference Derivative Accuracy:"));
    log.info(String.format("absoluteTol: %s", statistics.absoluteTol));
    log.info(String.format("relativeTol: %s", statistics.relativeTol));

Logging:

    Finite-Difference Derivative Accuracy:
    absoluteTol: 0.0000e+00 +- 0.0000e+00 [0.0000e+00 - 0.0000e+00] (4096#)
    relativeTol: 0.0000e+00 +- 0.0000e+00 [Infinity - -Infinity] (0#)
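
As a reading aid, the two statistics can be sketched as follows; the method names are illustrative, not the tester's API. Note that relative error is undefined when both values are zero, which is why an all-zero comparison reports (0#) samples:

```java
// Sketch of the two tolerance metrics reported above.
public class ToleranceSketch {
    // absoluteTol aggregates plain differences
    static double absoluteError(double a, double b) {
        return Math.abs(a - b);
    }

    // relativeTol scales the difference by the magnitudes involved;
    // it is undefined (skipped) when both values are zero
    static double relativeError(double a, double b) {
        double scale = Math.abs(a) + Math.abs(b);
        return scale == 0 ? Double.NaN : Math.abs(a - b) / scale;
    }

    public static void main(String[] args) {
        System.out.println(absoluteError(1.98, 0.0));
        System.out.println(relativeError(1.98, 0.0));
        System.out.println(relativeError(0.0, 0.0)); // NaN: excluded from the sample count
    }
}
```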
    

Frozen and Alive Status

Code from SingleDerivativeTester.java:327 executed in 0.00 seconds:

    testFrozen(component, inputPrototype);
    testUnFrozen(component, inputPrototype);

Reference Implementation

This layer is an alternate implementation that is expected to behave identically to the following reference layer:

Code from EquivalencyTester.java:102 executed in 0.00 seconds:

    log.info(new GsonBuilder().setPrettyPrinting().create().toJson(reference.getJson()));

Logging:

    {
      "class": "com.simiacryptus.mindseye.layers.java.ReLuActivationLayer",
      "id": "643ac1a5-d48e-4167-8336-9100ca510340",
      "isFrozen": true,
      "name": "ReLuActivationLayer/643ac1a5-d48e-4167-8336-9100ca510340",
      "weights": [
        1.0
      ]
    }
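
The serialized form above suggests the reference behavior; here is a hedged plain-Java sketch, assuming the single entry in "weights" scales the positive branch (an inference from the JSON, not confirmed from the layer's source):

```java
// Sketch of the reference layer's implied behavior: ReLU with a scale weight.
public class ReferenceReluSketch {
    // Assumption: y = x < 0 ? 0 : weight * x, with weight from the JSON
    static double apply(double x, double weight) {
        return x < 0 ? 0 : weight * x;
    }

    public static void main(String[] args) {
        double weight = 1.0; // from the serialized "weights" array
        System.out.println(apply(1.36, weight)); // positive input passes through
        System.out.println(apply(-0.5, weight)); // negative input clamps to 0
    }
}
```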
    

We measure the agreement between the two layers in a random execution:

Code from EquivalencyTester.java:106 executed in 0.00 seconds:

    return test(subject, inputPrototype);

Logging:

    Inputs: Optional[[
    	[ [ 1.36 ], [ 1.684 ], [ 1.276 ], [ -0.5 ], [ 0.476 ], [ -1.564 ], [ 1.304 ], [ 1.132 ] ],
    	[ [ 1.784 ], [ 0.7 ], [ -0.912 ], [ -1.732 ], [ 0.14 ], [ -1.76 ], [ -1.892 ], [ 1.524 ] ],
    	[ [ 1.912 ], [ 0.312 ], [ -0.932 ], [ -0.332 ], [ -0.072 ], [ -1.344 ], [ 1.792 ], [ 0.932 ] ],
    	[ [ 0.744 ], [ 0.752 ], [ 1.768 ], [ -0.08 ], [ 1.62 ], [ 0.308 ], [ -0.14 ], [ 1.58 ] ],
    	[ [ -1.092 ], [ -1.94 ], [ -1.232 ], [ -0.192 ], [ -0.04 ], [ 1.796 ], [ 1.188 ], [ -0.428 ] ],
    	[ [ 0.284 ], [ 1.98 ], [ 1.228 ], [ 0.568 ], [ 1.112 ], [ 1.872 ], [ 0.28 ], [ -1.912 ] ],
    	[ [ 1.6 ], [ -0.796 ], [ -0.78 ], [ 0.568 ], [ -0.62 ], [ 1.696 ], [ -0.504 ], [ 1.636 ] ],
    	[ [ 0.512 ], [ 0.484 ], [ 1.536 ], [ 1.548 ], [ 0.716 ], [ -1.728 ], [ 1.264 ], [ -0.868 ] ]
    ]]
    Subject Output: [
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ] ]
    ]
    Reference Output: [
    	[ [ 1.36 ], [ 1.684 ], [ 1.276 ], [ 0.0 ], [ 0.476 ], [ 0.0 ], [ 1.304 ], [ 1.132 ] ],
    	[ [ 1.784 ], [ 0.7 ], [ 0.0 ], [ 0.0 ], [ 0.14 ], [ 0.0 ], [ 0.0 ], [ 1.524 ] ],
    	[ [ 1.912 ], [ 0.312 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 1.792 ], [ 0.932 ] ],
    	[ [ 0.744 ], [ 0.752 ], [ 1.768 ], [ 0.0 ], [ 1.62 ], [ 0.308 ], [ 0.0 ], [ 1.58 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 1.796 ], [ 1.188 ], [ 0.0 ] ],
    	[ [ 0.284 ], [ 1.98 ], [ 1.228 ], [ 0.568 ], [ 1.112 ], [ 1.872 ], [ 0.28 ], [ 0.0 ] ],
    	[ [ 1.6 ], [ 0.0 ], [ 0.0 ], [ 0.568 ], [ 0.0 ], [ 1.696 ], [ 0.0 ], [ 1.636 ] ],
    	[ [ 0.512 ], [ 0.484 ], [ 1.536 ], [ 1.548 ], [ 0.716 ], [ 0.0 ], [ 1.264 ], [ 0.0 ] ]
    ]
    Error: [
    	[ [ -1.36 ], [ -1.684 ], [ -1.276 ], [ 0.0 ], [ -0.476 ], [ 0.0 ], [ -1.304 ], [ -1.132 ] ],
    	[ [ -1.784 ], [ -0.7 ], [ 0.0 ], [ 0.0 ], [ -0.14 ], [ 0.0 ], [ 0.0 ], [ -1.524 ] ],
    	[ [ -1.912 ], [ -0.312 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ -1.792 ], [ -0.932 ] ],
    	[ [ -0.744 ], [ -0.752 ], [ -1.768 ], [ 0.0 ], [ -1.62 ], [ -0.308 ], [ 0.0 ], [ -1.58 ] ],
    	[ [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ 0.0 ], [ -1.796 ], [ -1.188 ], [ 0.0 ] ],
    	[ [ -0.284 ], [ -1.98 ], [ -1.228 ], [ -0.568 ], [ -1.112 ], [ -1.872 ], [ -0.28 ], [ 0.0 ] ],
    	[ [ -1.6 ], [ 0.0 ], [ 0.0 ], [ -0.568 ], [ 0.0 ], [ -1.696 ], [ 0.0 ], [ -1.636 ] ],
    	[ [ -0.512 ], [ -0.484 ], [ -1.536 ], [ -1.548 ], [ -0.716 ], [ 0.0 ], [ -1.264 ], [ 0.0 ] ]
    ]
    

Returns:

    java.lang.AssertionError: ToleranceStatistics{absoluteTol=7.0263e-01 +- 7.1064e-01 [0.0000e+00 - 1.9800e+00] (64#), relativeTol=1.0000e+00 +- 0.0000e+00 [1.0000e+00 - 1.0000e+00] (39#)}
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:71)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.lambda$test$7(EquivalencyTester.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$null$1(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:59)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$code$2(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.test.SysOutInterceptor.withOutput(SysOutInterceptor.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.code(MarkdownNotebookOutput.java:203)
    	at com.simiacryptus.util.io.NotebookOutput.code(NotebookOutput.java:82)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:106)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:37)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.lambda$run$5(StandardLayerTests.java:257)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
    	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
    	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.run(StandardLayerTests.java:256)
    	at com.simiacryptus.mindseye.layers.cudnn.ActivationLayerTest.run(ActivationLayerTest.java:81)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.lambda$run$0(NotebookReportBase.java:105)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:76)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.run(NotebookReportBase.java:103)
    	at com.simiacryptus.mindseye.layers.LayerTestBase.test(LayerTestBase.java:37)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runners.Suite.runChild(Suite.java:128)
    	at org.junit.runners.Suite.runChild(Suite.java:27)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
    	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
    	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
    	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)