
Chained

Java 8 Neural Networks with CuDNN and Aparapi


Project maintained by SimiaCryptus. Java, CuDNN, and CUDA are trademarks of their respective owners; no endorsement is implied.
  1. Network Diagram
  2. Serialization
    1. Raw Json
  3. Performance
  4. Example Input/Output Pair
  5. Reference Implementation

Target Description: A simple network architecture built on the assumption of a linear sequence of components. Each added component becomes the new head node; the default add method appends a new node onto the existing head.
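The chained-add behavior described above can be sketched as follows. `SimpleChain` and its fields are hypothetical illustrations of the idea, not the actual PipelineNetwork API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a chained network: each added component becomes
// the new head node, wired to consume the previous head.
class SimpleChain {
    static class Node {
        final String layer;
        final Node previous; // the prior head this node consumes
        Node(String layer, Node previous) { this.layer = layer; this.previous = previous; }
    }

    private Node head; // null until the first component is added

    // Default add: append a node on the existing head and make it the new head.
    Node add(String layer) {
        head = new Node(layer, head);
        return head;
    }

    // Walk back from the head to list the chain, newest first.
    List<String> layersHeadFirst() {
        List<String> out = new ArrayList<>();
        for (Node n = head; n != null; n = n.previous) out.add(n.layer);
        return out;
    }
}
```

Each call to `add` both creates the node and rewires the head, so a linear pipeline needs no explicit link bookkeeping by the caller.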


Network Diagram

This is a network with the following layout:

Code from StandardLayerTests.java:273 executed in 2.24 seconds:

    return Graphviz.fromGraph(TestUtil.toGraph((DAGNetwork) layer))
                   .height(400).width(600).render(Format.PNG).toImage();

Logging:

    Could not initialize guru.nidi.graphviz.engine.GraphvizCmdLineEngine
    guru.nidi.graphviz.engine.GraphvizException: dot.exe command not found
    

Returns:

(Rendered network diagram image)

Serialization

This run demonstrates the layer’s JSON serialization and verifies deserialization integrity.

Raw Json

Code from SerializationTest.java:84 executed in 0.04 seconds:

    final JsonObject json = layer.getJson();
    final NNLayer echo = NNLayer.fromJson(json);
    if (echo == null) throw new AssertionError("Failed to deserialize");
    if (layer == echo) throw new AssertionError("Serialization did not copy");
    if (!layer.equals(echo)) throw new AssertionError("Serialization not equal");
    return new GsonBuilder().setPrettyPrinting().create().toJson(json);

Returns:

    {
      "class": "com.simiacryptus.mindseye.network.PipelineNetwork",
      "id": "c71e5799-83d2-4e5c-8b44-bca4c4507997",
      "isFrozen": false,
      "name": "PipelineNetwork/c71e5799-83d2-4e5c-8b44-bca4c4507997",
      "inputs": [
        "16ae1651-785d-43de-9ddb-e91f0e4da219"
      ],
      "nodes": {
        "fc637ba7-c6e1-42d9-a2dc-2958cfed1e29": "8781e5a6-24b9-4ce5-8d86-ba14b4d5e794"
      },
      "layers": {
        "8781e5a6-24b9-4ce5-8d86-ba14b4d5e794": {
          "class": "com.simiacryptus.mindseye.layers.cudnn.ImgCropLayer",
          "id": "8781e5a6-24b9-4ce5-8d86-ba14b4d5e794",
          "isFrozen": false,
          "name": "ImgCropLayer/8781e5a6-24b9-4ce5-8d86-ba14b4d5e794",
          "sizeY": 200,
          "sizeX": 300,
          "precision": "Double"
        }
      },
      "links": {
        "fc637ba7-c6e1-42d9-a2dc-2958cfed1e29": [
          "16ae1651-785d-43de-9ddb-e91f0e4da219"
        ]
      },
      "labels": {},
      "head": "fc637ba7-c6e1-42d9-a2dc-2958cfed1e29"
    }

Wrote Model to PipelineNetwork_Chained.json; 898 characters

Performance

Now we execute larger-scale runs to benchmark performance:

Code from PerformanceTester.java:183 executed in 9.22 seconds:

    test(component, inputPrototype);

Logging:

    10 batches
    Input Dimensions:
    Found 2 devices
    Device 0 - GeForce GTX 1080 Ti
    Device 1 - GeForce GTX 1060 6GB
    Found 2 devices; using devices [0, 1]
    	[400, 400, 3]
    Performance:
    	Evaluation performance: 0.024239s +- 0.000708s [0.022919s - 0.026154s]
    	Learning performance: 0.021302s +- 0.017371s [0.014132s - 0.113496s]
    

Per-layer Performance Metrics:

Code from TestUtil.java:216 executed in 0.00 seconds:

    final Map<NNLayer, MonitoringWrapperLayer> metrics = new HashMap<>();
    network.visitNodes(node -> {
      if (node.getLayer() instanceof MonitoringWrapperLayer) {
        final MonitoringWrapperLayer layer = (MonitoringWrapperLayer) node.getLayer();
        metrics.put(layer.getInner(), layer);
      }
    });
    TestUtil.log.info("Performance: \n\t" + metrics.entrySet().stream().map(e -> {
      final PercentileStatistics performanceF = e.getValue().getForwardPerformance();
      final PercentileStatistics performanceB = e.getValue().getBackwardPerformance();
      return String.format("%s -> %.6fs +- %.6fs (%d)", e.getKey(), performanceF.getMean(), performanceF.getStdDev(), performanceF.getCount()) +
        (performanceB.getCount() == 0 ? "" : String.format("%n\tBack: %.6fs +- %.6fs (%s)", performanceB.getMean(), performanceB.getStdDev(), performanceB.getCount()));
    }).reduce((a, b) -> a + "\n\t" + b).get());

Logging:

    Performance: 
    	ImgCropLayer/9168e190-9e97-4b67-a63b-175cea09419e -> 0.023253s +- 0.002091s (201)
    	Back: 0.000102s +- 0.000575s (101)
    

GPU Log

Example Input/Output Pair

Display input/output pairs from random executions:

Code from ReferenceIO.java:69 executed in 0.31 seconds:

    final SimpleEval eval = SimpleEval.run(layer, inputPrototype);
    return String.format("--------------------\nInput: \n[%s]\n--------------------\nOutput: \n%s\n--------------------\nDerivative: \n%s",
                         Arrays.stream(inputPrototype).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get(),
                         eval.getOutput().prettyPrint(),
                         Arrays.stream(eval.getDerivative()).map(t -> t.prettyPrint()).reduce((a, b) -> a + ",\n" + b).get());

Returns:

    --------------------
    Input: 
    [[
    	[ [ 1.1, 0.16, -1.24 ], [ -1.912, -1.084, 1.656 ], [ -1.948, 0.56, -0.844 ], [ 0.668, -0.252, -0.548 ], [ 0.572, -0.968, 0.464 ], [ -0.344, -1.128, -1.516 ], [ -1.656, -0.3, 0.612 ], [ -0.224, -1.9, -1.08 ], ... ],
    	[ [ -0.988, 1.392, -1.252 ], [ -1.704, -0.644, 0.208 ], [ 0.412, 1.012, -0.976 ], [ 1.54, -0.384, -0.488 ], [ 0.092, -0.416, -0.512 ], [ -0.288, 0.112, -1.716 ], [ -1.616, -0.996, 0.164 ], [ 1.584, -1.284, -0.856 ], ... ],
    	[ [ -1.36, 0.12, 1.156 ], [ -0.84, 1.544, 1.892 ], [ 1.204, -1.776, 0.968 ], [ 0.872, -0.5, -1.812 ], [ 1.388, 0.148, 1.98 ], [ -0.644, 1.236, -0.008 ], [ 1.228, 0.308, 1.704 ], [ -0.012, 0.9, -0.94 ], ... ],
    	[ [ -0.724, 1.848, 1.86 ], [ -0.772, -0.24, 0.62 ], [ -0.864, 0.972, -1.256 ], [ -1.716, 1.428, -1.612 ], [ 1.796, -1.208, -0.908 ], [ -1.216, 0.66, -1.548 ], [ -0.22, 0.304, -1.564 ], [ -0.492, 1.784, -0.676 ], ... ],
    	[ [ 1.052, -1.904, 1.56 ], [ 0.704, 1.5, -0.648 ], [ 1.952, 1.712, -1.96 ], [ 0.248, -0.736, -1.352 ], [ 1.744, 0.432, 0.712 ], [ 0.452, 0.128, 1.36 ], [ 0.832, -0.7, -1.368 ], [ -0.184, -1.012, 1.092 ], ... ],
    	[ [ 1.976, -0.74, 0.968 ], [ -1.204, -0.096, 1.68 ], [ 1.264, -0.112, -1.872 ], [ 0.952, -1.912, -1.336 ], [ -1.272, 0.112, 1.332 ], [ 1.14, -1.444, 0.984 ], [ 1.364, -0.008, 0.236 ], [ -1.26, 0.632, -1.624 ], ... ],
    	[ [ 1.028, -1.516, 1.916 ], [ -1.52, -0.8, -0.772 ], [ 1.688, 0.0, 1.412 ], [ -0.66, -1.504, -0.024 ], [ 1.176, 1.224, -1.404 ], [ 1.424, -1.828, -0.336 ], [ -1.112, 1.136, -0.612 ], [ -0.22, -0.092, 1.384 ], ... ],
    	[ [ -0.872, 0.592, 1.092 ], [ -1.376, 1.32, -0.24 ], [ -1.232, -0.664, -1.716 ], [ 1.088, -1.212, -0.832 ], [ 0.22, 0.632, 0.52 ], [ 0.236, 0.844, -0.168 ], [ 0.732, -1.416, -0.056 ], [ -1.188, 0.284, -0.896 ], ... ],
    	...
    ]]
    --------------------
    Output: 
    [
    	[ [ 0.512, 1.732, 1.672 ], [ -0.776, -0.844, -0.16 ], [ 1.12, 0.004, 0.8 ], [ 1.34, -0.312, 1.184 ], [ 1.54, 1.308, 1.54 ], [ 0.284, 0.012, 1.072 ], [ -1.212, -0.672, -0.86 ], [ -0.04, 0.1, 1.42 ], ... ],
    	[ [ -1.064, 0.304, -1.328 ], [ -1.912, 0.192, -1.632 ], [ 0.8, -1.072, 1.16 ], [ 0.256, -0.104, 0.896 ], [ -0.888, 1.624, 1.8 ], [ 0.172, 0.208, -0.5 ], [ -1.484, -1.24, 1.892 ], [ 1.044, -1.172, -0.168 ], ... ],
    	[ [ 0.944, -1.532, -1.34 ], [ 0.008, -0.216, -1.688 ], [ 1.324, 0.852, 0.932 ], [ 1.236, 1.584, -0.028 ], [ -1.984, -1.408, 0.68 ], [ 1.012, -1.348, 0.816 ], [ 0.416, -0.62, 1.3 ], [ -0.456, 1.408, 0.836 ], ... ],
    	[ [ 0.68, 1.468, -0.152 ], [ 0.096, -1.184, -1.868 ], [ -1.892, 1.188, 0.696 ], [ -0.272, -1.132, -0.148 ], [ 0.924, -1.408, -0.988 ], [ -1.952, 1.164, 1.556 ], [ -1.992, -0.576, 0.76 ], [ -0.912, -1.424, 0.796 ], ... ],
    	[ [ -1.44, 1.772, -1.336 ], [ -1.524, -1.088, -1.236 ], [ 1.0, -1.592, -0.98 ], [ -1.364, 1.508, 1.048 ], [ 0.168, -1.728, -0.328 ], [ 0.248, 1.024, 1.984 ], [ -1.2, 0.272, -0.444 ], [ 0.248, 1.76, -0.084 ], ... ],
    	[ [ -0.108, -1.668, 0.372 ], [ -1.096, 1.988, -0.872 ], [ -1.944, 1.364, 0.724 ], [ -1.972, -1.184, 1.928 ], [ -1.152, 0.512, 1.316 ], [ -1.976, -1.556, -1.404 ], [ 0.604, -2.0, 0.268 ], [ -0.78, -1.568, 1.6 ], ... ],
    	[ [ 0.72, 0.868, 0.56 ], [ 0.116, -1.88, -1.02 ], [ -1.716, -0.396, 0.284 ], [ 0.612, -1.592, -0.992 ], [ 0.004, 1.18, -0.96 ], [ -1.0, -1.016, 1.924 ], [ -0.992, -0.14, -0.42 ], [ -1.38, -1.9, 1.524 ], ... ],
    	[ [ -0.048, 0.48, 0.42 ], [ 0.78, -1.124, -1.268 ], [ -1.3, -1.064, -0.54 ], [ 0.388, -1.548, 0.008 ], [ 0.096, 1.284, 0.244 ], [ -1.112, -0.584, 1.472 ], [ -0.956, 1.844, 0.98 ], [ -0.312, -0.136, 0.164 ], ... ],
    	...
    ]
    --------------------
    Derivative: 
    [
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], ... ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], ... ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], ... ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], ... ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], ... ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], ... ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], ... ],
    	[ [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0 ], ... ],
    	...
    ]

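The all-zero derivative entries shown above are expected for a crop: input pixels outside the cropped region do not affect the output, so their gradient is zero. A minimal sketch of that masking, using a hypothetical helper rather than the actual ImgCropLayer code:

```java
// Sketch: backpropagating through a center crop scatters the output
// gradient into an input-sized buffer; cropped-away pixels stay 0.0.
// Dimensions and helper name are illustrative only.
class CropGradientSketch {
    static double[][] backward(double[][] outputGrad, int inH, int inW) {
        int outH = outputGrad.length, outW = outputGrad[0].length;
        int offY = (inH - outH) / 2, offX = (inW - outW) / 2; // centered crop offsets
        double[][] inputGrad = new double[inH][inW]; // zero-initialized by Java
        for (int y = 0; y < outH; y++)
            for (int x = 0; x < outW; x++)
                inputGrad[offY + y][offX + x] = outputGrad[y][x];
        return inputGrad;
    }
}
```

Only the interior window receives gradient; the border rows printed above are exactly such zeroed regions.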
GPU Log

Reference Implementation

This layer is an alternate implementation that is expected to behave identically to the following layer:

Code from EquivalencyTester.java:102 executed in 0.00 seconds:

    log.info(new GsonBuilder().setPrettyPrinting().create().toJson(reference.getJson()));

Logging:

    {
      "class": "com.simiacryptus.mindseye.network.PipelineNetwork",
      "id": "af1327b8-a1e9-4ccb-9889-d3edc3edbd69",
      "isFrozen": false,
      "name": "PipelineNetwork/af1327b8-a1e9-4ccb-9889-d3edc3edbd69",
      "inputs": [
        "f4bdc5b2-61b2-4997-8ce4-0dc4c764a6f7"
      ],
      "nodes": {
        "daf600ea-71ee-4e18-8a93-9f3155d24020": "6829b4cd-3bb6-4347-a856-02308551cb64"
      },
      "layers": {
        "6829b4cd-3bb6-4347-a856-02308551cb64": {
          "class": "com.simiacryptus.mindseye.layers.java.ImgCropLayer",
          "id": "6829b4cd-3bb6-4347-a856-02308551cb64",
          "isFrozen": false,
          "name": "ImgCropLayer/6829b4cd-3bb6-4347-a856-02308551cb64",
          "sizeX": 300,
          "sizeY": 300
        }
      },
      "links": {
        "daf600ea-71ee-4e18-8a93-9f3155d24020": [
          "f4bdc5b2-61b2-4997-8ce4-0dc4c764a6f7"
        ]
      },
      "labels": {},
      "head": "daf600ea-71ee-4e18-8a93-9f3155d24020"
    }
    

We measure the agreement between the two layers in a random execution:

Code from EquivalencyTester.java:106 executed in 0.00 seconds:

    return test(subject, inputPrototype);

Logging:

    Inputs: Optional[[
    	[ [ -0.136, 1.652, -1.64 ], [ -0.184, 0.916, -1.812 ], [ -0.7, -0.228, 1.036 ], [ -1.848, -0.448, 0.684 ], [ -1.904, -1.832, 1.392 ], [ 0.388, -1.036, -0.096 ], [ 1.208, -0.536, -1.312 ], [ -1.164, 1.972, 0.492 ], ... ],
    	[ [ -0.872, 1.088, 0.24 ], [ 1.284, 1.26, 0.372 ], [ -1.72, -0.792, -0.304 ], [ 0.716, -1.336, -1.984 ], [ 1.592, 1.836, -0.728 ], [ -0.32, -0.292, -1.744 ], [ 0.316, -1.108, -1.796 ], [ -0.456, -1.04, 1.292 ], ... ],
    	[ [ -1.408, 1.624, -0.6 ], [ -1.324, 1.256, -0.824 ], [ -0.44, -1.124, 1.668 ], [ -0.88, -1.58, 0.32 ], [ 1.68, 1.04, 0.692 ], [ 1.344, -1.372, -1.148 ], [ -0.68, 1.532, -0.76 ], [ 1.308, -1.64, -1.9 ], ... ],
    	[ [ -0.384, 0.944, -1.472 ], [ -1.788, -1.504, 1.096 ], [ -1.752, -0.344, 1.772 ], [ 0.904, -0.028, 0.884 ], [ -1.66, 0.124, -0.988 ], [ -0.508, -0.724, 0.28 ], [ -0.524, 1.196, 0.784 ], [ -1.336, -0.872, 0.156 ], ... ],
    	[ [ 0.36, -0.588, -0.608 ], [ 0.448, -0.852, -0.412 ], [ -1.416, -1.588, -0.18 ], [ 1.576, 1.304, -0.12 ], [ 1.188, 1.544, -0.744 ], [ 1.7, -0.488, 1.4 ], [ 1.72, -0.332, 1.784 ], [ 0.26, -1.624, 1.696 ], ... ],
    	[ [ -1.756, -0.824, -0.92 ], [ -0.576, -0.052, 1.972 ], [ 0.128, -0.068, 0.412 ], [ 1.52, -0.404, 0.164 ], [ 0.5, -1.844, -0.584 ], [ -0.368, 1.976, 1.392 ], [ -0.988, 1.996, -1.576 ], [ -0.012, -1.816, 1.368 ], ... ],
    	[ [ 1.94, -0.508, 0.024 ], [ -1.984, -0.416, 1.832 ], [ -1.424, -1.864, 0.756 ], [ -1.672, -1.792, -0.68 ], [ 1.684, 1.984, -1.148 ], [ 1.532, -1.416, -0.308 ], [ 0.592, -1.492, 1.432 ], [ 1.204, 0.704, 1.16 ], ... ],
    	[ [ 1.672, 1.164, 1.968 ], [ 1.56, 0.124, -0.54 ], [ 1.292, 1.74, -0.776 ], [ -1.424, -1.46, 1.22 ], [ 0.392, -0.74, -0.96 ], [ -0.628, 0.88, -0.812 ], [ -1.628, 1.864, -1.972 ], [ 1.16, -1.26, -1.648 ], ... ],
    	...
    ]]
    Subject Output: [
    	[ [ 0.204, 0.148, 1.552 ], [ -0.936, -0.964, 1.84 ], [ -0.016, -1.564, 0.712 ], [ -1.436, -1.156, 0.432 ], [ 1.216, 0.396, -0.396 ], [ -0.928, -1.548, -1.456 ], [ 0.76, 1.52, -1.02 ], [ -1.9, -0.876, 0.048 ], ... ],
    	[ [ -0.188, 0.236, 0.172 ], [ -0.752, -1.584, -1.212 ], [ -0.368, -1.604, 1.564 ], [ 0.024, 1.96, 1.076 ], [ 0.888, -0.944, -0.668 ], [ 1.132, -1.676, 1.344 ], [ -1.292, -1.9, 1.148 ], [ 0.676, 0.268, -1.724 ], ... ],
    	[ [ 0.928, 1.94, -0.276 ], [ 1.76, 1.612, 1.176 ], [ -1.872, 0.424, -1.772 ], [ 0.212, 1.66, -0.496 ], [ 1.66, 1.948, 1.348 ], [ 1.048, -1.52, 1.292 ], [ 0.5, -0.192, -0.832 ], [ -0.544, -1.256, -0.272 ], ... ],
    	[ [ -1.144, 0.036, -1.236 ], [ -1.112, 1.66, 1.564 ], [ -0.092, 0.496, 0.168 ], [ -1.208, -1.324, 0.128 ], [ 0.348, 1.72, -0.028 ], [ -0.74, -0.276, -0.084 ], [ 1.84, 0.244, -1.7 ], [ 1.512, -0.316, 1.692 ], ... ],
    	[ [ -1.028, -0.788, -1.6 ], [ 1.752, 0.752, -0.016 ], [ 0.068, 1.592, -0.964 ], [ 0.136, -1.092, -1.716 ], [ -1.324, -1.272, -1.104 ], [ 0.76, -1.552, 1.776 ], [ 1.028, -1.844, 1.672 ], [ -1.688, 1.132, 0.116 ], ... ],
    	[ [ -1.428, 0.484, 1.748 ], [ -0.116, 0.972, 0.556 ], [ 1.608, 1.688, 1.08 ], [ 0.76, 1.612, 0.508 ], [ 1.728, 1.496, -0.24 ], [ 1.168, -0.404, -0.74 ], [ 0.932, -1.632, 1.672 ], [ -1.004, 0.668, -0.428 ], ... ],
    	[ [ 1.692, -1.66, -0.076 ], [ 1.148, 0.312, 1.172 ], [ 0.164, -1.312, 0.092 ], [ 1.792, -0.932, -0.244 ], [ 0.088, 0.864, 0.428 ], [ -0.408, 1.272, -1.204 ], [ 0.452, 1.292, 1.296 ], [ -0.816, -0.108, 1.488 ], ... ],
    	[ [ -1.336, -1.036, -0.84 ], [ 0.28, 1.888, 1.408 ], [ -1.316, 0.196, -1.404 ], [ -0.364, -0.24, -1.892 ], [ 1.324, -1.576, -1.264 ], [ 0.736, -1.284, -0.128 ], [ 0.3, -1.252, 0.988 ], [ 0.8, 0.736, 1.252 ], ... ],
    	...
    ]
    Reference Output: [
    	[ [ -1.204, -1.372, 1.5 ], [ -1.544, -0.544, 1.52 ], [ 0.508, -1.304, -0.768 ], [ 0.772, -0.708, 0.596 ], [ -0.248, 1.268, 0.528 ], [ -1.344, 0.808, -1.344 ], [ 0.452, 1.564, 1.108 ], [ 0.648, 0.276, -1.7 ], ... ],
    	[ [ 1.768, -1.768, 0.252 ], [ 1.416, 0.536, -0.612 ], [ 1.456, 1.772, -0.684 ], [ -0.496, -0.204, -1.696 ], [ -0.076, -0.772, 1.22 ], [ -1.72, -1.448, -0.748 ], [ 0.38, 1.08, 1.248 ], [ -1.752, -0.14, -1.264 ], ... ],
    	[ [ 1.168, 0.628, -1.964 ], [ -1.316, 1.448, -0.424 ], [ 1.832, -1.184, -1.564 ], [ -0.328, -0.132, -1.464 ], [ -0.88, -0.54, -1.572 ], [ -1.416, -1.26, -1.688 ], [ -1.136, 0.212, 1.268 ], [ 1.3, 1.372, -1.224 ], ... ],
    	[ [ 0.94, -0.096, 0.512 ], [ -0.244, 0.196, 0.192 ], [ 2.0, 1.248, 1.348 ], [ -0.844, 1.072, 1.444 ], [ 1.74, -0.644, -0.852 ], [ 0.264, 1.236, 1.22 ], [ -0.876, -0.748, -0.716 ], [ -0.744, -0.576, -1.488 ], ... ],
    	[ [ 0.884, 0.32, 0.776 ], [ 0.048, -1.252, 1.148 ], [ 1.276, 1.9, 0.944 ], [ 1.544, 1.008, 1.552 ], [ -1.008, 1.204, -0.212 ], [ 1.192, -0.844, 1.14 ], [ -1.068, 1.408, -0.488 ], [ -0.148, 1.508, 1.508 ], ... ],
    	[ [ -1.224, -0.652, -0.428 ], [ 1.936, -1.752, -0.456 ], [ -0.068, -1.884, -1.532 ], [ 0.236, 0.896, -0.44 ], [ -1.788, 1.404, -1.5 ], [ -1.08, -1.404, -0.192 ], [ -1.928, -0.76, 1.304 ], [ -0.352, 0.004, 0.736 ], ... ],
    	[ [ -0.848, 1.556, -1.328 ], [ 1.596, 0.688, 0.136 ], [ -0.124, -1.264, -1.132 ], [ -0.444, -0.488, 1.972 ], [ -1.112, -1.628, 1.18 ], [ 1.232, -1.104, -0.144 ], [ 1.244, -1.448, -1.212 ], [ 1.228, 1.4, -1.26 ], ... ],
    	[ [ 0.072, -0.328, 1.056 ], [ 0.812, 1.048, 1.056 ], [ -0.692, -1.692, -1.624 ], [ -1.86, 0.916, -0.596 ], [ 0.232, -0.012, -0.824 ], [ 0.244, -1.656, 1.956 ], [ 1.396, -0.812, 0.22 ], [ -1.2, -0.54, -1.992 ], ... ],
    	...
    ]
    Error: [
    	[ [ 1.408, 1.52, 0.052000000000000046 ], [ 0.608, -0.41999999999999993, 0.32000000000000006 ], [ -0.524, -0.26, 1.48 ], [ -2.208, -0.44799999999999995, -0.16399999999999998 ], [ 1.464, -0.872, -0.924 ], [ 0.41600000000000004, -2.356, -0.11199999999999988 ], [ 0.308, -0.04400000000000004, -2.128 ], [ -2.548, -1.1520000000000001, 1.748 ], ... ],
    	[ [ -1.956, 2.004, -0.08000000000000002 ], [ -2.168, -2.12, -0.6 ], [ -1.8239999999999998, -3.3760000000000003, 2.248 ], [ 0.52, 2.164, 2.7720000000000002 ], [ 0.964, -0.17199999999999993, -1.888 ], [ 2.852, -0.22799999999999998, 2.092 ], [ -1.6720000000000002, -2.98, -0.10000000000000009 ], [ 2.428, 0.40800000000000003, -0.45999999999999996 ], ... ],
    	[ [ -0.23999999999999988, 1.3119999999999998, 1.688 ], [ 3.076, 0.16400000000000015, 1.5999999999999999 ], [ -3.704, 1.6079999999999999, -0.20799999999999996 ], [ 0.54, 1.7919999999999998, 0.968 ], [ 2.54, 2.488, 2.92 ], [ 2.464, -0.26, 2.98 ], [ 1.636, -0.404, -2.1 ], [ -1.844, -2.628, 0.952 ], ... ],
    	[ [ -2.0839999999999996, 0.132, -1.748 ], [ -0.8680000000000001, 1.464, 1.372 ], [ -2.092, -0.752, -1.1800000000000002 ], [ -0.364, -2.396, -1.3159999999999998 ], [ -1.392, 2.364, 0.824 ], [ -1.004, -1.512, -1.304 ], [ 2.716, 0.992, -0.984 ], [ 2.2560000000000002, 0.25999999999999995, 3.1799999999999997 ], ... ],
    	[ [ -1.912, -1.108, -2.3760000000000003 ], [ 1.704, 2.004, -1.164 ], [ -1.208, -0.30799999999999983, -1.908 ], [ -1.408, -2.1, -3.268 ], [ -0.31600000000000006, -2.476, -0.8920000000000001 ], [ -0.43199999999999994, -0.7080000000000001, 0.6360000000000001 ], [ 2.096, -3.252, 2.16 ], [ -1.54, -0.3760000000000001, -1.392 ], ... ],
    	[ [ -0.20399999999999996, 1.1360000000000001, 2.176 ], [ -2.052, 2.724, 1.012 ], [ 1.6760000000000002, 3.572, 2.612 ], [ 0.524, 0.7160000000000001, 0.948 ], [ 3.516, 0.09200000000000008, 1.26 ], [ 2.248, 0.9999999999999999, -0.548 ], [ 2.86, -0.8719999999999999, 0.3679999999999999 ], [ -0.652, 0.664, -1.164 ], ... ],
    	[ [ 2.54, -3.216, 1.252 ], [ -0.4480000000000002, -0.37599999999999995, 1.036 ], [ 0.28800000000000003, -0.04800000000000004, 1.224 ], [ 2.236, -0.44400000000000006, -2.216 ], [ 1.2000000000000002, 2.492, -0.752 ], [ -1.64, 2.3760000000000003, -1.06 ], [ -0.792, 2.74, 2.508 ], [ -2.044, -1.508, 2.748 ], ... ],
    	[ [ -1.4080000000000001, -0.708, -1.896 ], [ -0.532, 0.8399999999999999, 0.35199999999999987 ], [ -0.6240000000000001, 1.888, 0.2200000000000002 ], [ 1.496, -1.1560000000000001, -1.2959999999999998 ], [ 1.092, -1.564, -0.44000000000000006 ], [ 0.492, 0.3719999999999999, -2.084 ], [ -1.0959999999999999, -0.43999999999999995, 0.768 ], [ 2.0, 1.276, 3.2439999999999998 ], ... ],
    	...
    ]
    

Returns:

    java.lang.AssertionError: ToleranceStatistics{absoluteTol=1.3303e+00 +- 9.4327e-01 [0.0000e+00 - 3.9960e+00] (180000#), relativeTol=6.9197e-01 +- 3.6566e-01 [0.0000e+00 - 1.0000e+00] (179819#)}
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:71)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.lambda$test$7(EquivalencyTester.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$null$1(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:59)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.lambda$code$2(MarkdownNotebookOutput.java:205)
    	at com.simiacryptus.util.test.SysOutInterceptor.withOutput(SysOutInterceptor.java:107)
    	at com.simiacryptus.util.io.MarkdownNotebookOutput.code(MarkdownNotebookOutput.java:203)
    	at com.simiacryptus.util.io.NotebookOutput.code(NotebookOutput.java:82)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:106)
    	at com.simiacryptus.mindseye.test.unit.EquivalencyTester.test(EquivalencyTester.java:37)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.lambda$run$8(StandardLayerTests.java:283)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
    	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
    	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
    	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
    	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
    	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
    	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
    	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
    	at com.simiacryptus.mindseye.test.unit.StandardLayerTests.run(StandardLayerTests.java:282)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.lambda$run$0(NotebookReportBase.java:105)
    	at com.simiacryptus.util.lang.TimedResult.time(TimedResult.java:76)
    	at com.simiacryptus.mindseye.test.NotebookReportBase.run(NotebookReportBase.java:103)
    	at com.simiacryptus.mindseye.layers.LayerTestBase.test(LayerTestBase.java:37)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
    	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
    	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
    	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
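The ToleranceStatistics summary in the assertion above aggregates elementwise absolute and relative error over all compared elements. A minimal sketch of those two per-element quantities, as a hypothetical helper rather than the test framework's implementation (note the relative form used here is bounded by 1.0, matching the reported maximum):

```java
// Sketch: per-element error metrics of the kind a ToleranceStatistics-style
// summary would aggregate (mean, stddev, min, max, count).
class ToleranceSketch {
    static double absoluteError(double a, double b) {
        return Math.abs(a - b);
    }

    // Relative error normalized by the combined magnitude of both values;
    // defined as 0 when both values are exactly zero.
    static double relativeError(double a, double b) {
        double denom = Math.abs(a) + Math.abs(b);
        return denom == 0.0 ? 0.0 : Math.abs(a - b) / denom;
    }
}
```

With values drawn from roughly [-2, 2], a mean absolute error near 1.33 and a relative error near its 1.0 ceiling, as reported, indicate the two layers' outputs are essentially uncorrelated rather than merely imprecise.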