Additional deep learning features

Machine Learning in Java: Deeplearning4j Version 1.0.0-beta4 is here

Sarah Schlothauer

Deeplearning4j is a deep learning library for Java and the JVM; in 2017 it joined the Eclipse Foundation and open sourced its libraries. The latest update, version 1.0.0-beta4, adds full multi-datatype support, performance optimizations, bug fixes, and a few new features.

Deeplearning4j, the deep learning library for Java and the JVM, has a seemingly limitless number of use cases. It has helped forecast oceanic temperatures and even powered a Go-playing bot. Its creator, Skymind, brought the project to the Eclipse Foundation in 2017, a sign of the project's maturity and stability.

Now DL4J has released a new beta version: v1.0.0-beta4. What changes does the new version bring?

Full multi-datatype support

One of the major highlights of v1.0.0-beta4 is full multi-datatype support for ND4J and DL4J. From the release notes:

In past releases, all N-Dimensional arrays in ND4J were limited to a single datatype (float or double), set globally. Now, arrays of all datatypes may be used simultaneously. The following datatypes are supported:

  • DOUBLE: double precision floating point, 64-bit (8 byte)
  • FLOAT: single precision floating point, 32-bit (4 byte)
  • HALF: half precision floating point, 16-bit (2 byte), “FP16”
  • LONG: long signed integer, 64 bit (8 byte)
  • INT: signed integer, 32 bit (4 byte)
  • SHORT: signed short integer, 16 bit (2 byte)
  • UBYTE: unsigned byte, 8 bit (1 byte), 0 to 255
  • BYTE: signed byte, 8 bit (1 byte), -128 to 127
  • BOOL: boolean type, (0/1, true/false). Uses ubyte storage for easier op parallelization
  • UTF8: String array type, UTF8 format
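With mixed datatypes in one process, arrays can be created with an explicit type and cast between types. Below is a minimal sketch assuming the beta4 ND4J API (`Nd4j.ones(DataType, ...)`, `castTo`); the class name `DataTypeDemo` is just for illustration:

```java
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class DataTypeDemo {
    public static void main(String[] args) {
        // Arrays of different datatypes can now coexist in the same process
        INDArray floats = Nd4j.ones(DataType.FLOAT, 2, 3);  // 32-bit float array
        INDArray longs  = Nd4j.zeros(DataType.LONG, 2, 3);  // 64-bit integer array

        // Comparison ops now produce BOOL arrays
        INDArray mask = floats.gt(0.5);

        // Explicit cast between floating-point precisions
        INDArray halves = floats.castTo(DataType.HALF);

        System.out.println(floats.dataType()); // FLOAT
        System.out.println(mask.dataType());   // BOOL
        System.out.println(halves.dataType()); // HALF
    }
}
```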

SEE ALSO: Keep your ML data on the down low with TensorFlow Privacy

Support added for:

  • ND4J/DL4J: CUDA 10.1. Support for CUDA 9.0 and for CUDA on Mac/OSX has been dropped; Linux and Windows CUDA support remains.
  • DL4J/ND4J: MKL-DNN support added. Deeplearning4j now supports MKL-DNN by default when running on CPU/native backend.
    • Support for the following layer types:
      • ConvolutionLayer and Convolution1DLayer (and Conv2D/Conv2DDerivative ND4J ops)
      • SubsamplingLayer and Subsampling1DLayer (and MaxPooling2D/AvgPooling2D/Pooling2DDerivative ND4J ops)
      • BatchNormalization layer (and BatchNorm ND4J op)
      • LocalResponseNormalization layer (and LocalResponseNormalization ND4J op)
      • Convolution3D layer (and Conv3D/Conv3DDerivative ND4J ops)
    • Support for more layer types will come in a future update
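A network built from the layer types above should run through MKL-DNN automatically on the CPU backend. Here is a minimal sketch of such a configuration, assuming the standard DL4J builder API; the class name `MklDnnCnn` and the layer sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.BatchNormalization;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class MklDnnCnn {
    public static void main(String[] args) {
        // All layers below are in the MKL-DNN-supported set listed above
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new ConvolutionLayer.Builder(3, 3)
                        .nOut(16).activation(Activation.RELU).build())
                .layer(new BatchNormalization.Builder().build())
                .layer(new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                        .kernelSize(2, 2).stride(2, 2).build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nOut(10).activation(Activation.SOFTMAX).build())
                .setInputType(InputType.convolutional(28, 28, 1))
                .build();
        System.out.println(conf.toJson());
    }
}
```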

Improvements

ND4J, the scientific computing library, gains performance improvements.

The new update makes changes to memory management and to how garbage collection works. Periodic garbage collection is now disabled by default; GC now runs only when required, reducing overhead and improving performance.
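Workloads that relied on the old periodic behavior can opt back in via ND4J's memory manager. A minimal sketch, assuming the `MemoryManager` API (`togglePeriodicGc`, `setAutoGcWindow`); the class name `GcControl` is illustrative:

```java
import org.nd4j.linalg.factory.Nd4j;

public class GcControl {
    public static void main(String[] args) {
        // Periodic GC is now off by default; re-enable it if your workload needs it
        Nd4j.getMemoryManager().togglePeriodicGc(true);

        // Trigger the JVM GC at most once per 10-second window (value in ms)
        Nd4j.getMemoryManager().setAutoGcWindow(10000);
    }
}
```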

SEE ALSO: Machine learning gone wrong: Why should de-biasing be a priority?

From the release notes, here are some of the highlighted improvements and enhancements made to DL4J:

  • L1/L2 regularization made into a class, weight decay added
  • Added BertIterator for BERT training
  • UI JavaScript dependencies and fonts imported to WebJar
  • Improved Keras importing
  • EmbeddingLayer and EmbeddingSequenceLayer builders now have .weightInit(INDArray) and .weightInit(Word2Vec) methods for initializing parameters from pretrained word vectors
  • Better Kotlin support
  • MultiLayerNetwork/ComputationGraph can be converted between (floating point) datatypes FP16/32/64 for the parameters and activations using the MultiLayerNetwork/ComputationGraph.convertDataType(DataType) methods
  • Added GELU Activation function
  • Added ComputationGraph.output(List<String> layers, boolean train, INDArray[] features, INDArray[] featureMasks) method to get the activations for a specific set of layers/vertices only (without redundant calculations)
  • Report garbage collection information with PerformanceListener
  • Apache Lucene/Solr upgraded to 7.7.1
  • New dot product attention layers
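Two of the items above, the new weight decay setting and the GELU activation, show up directly in network configuration. A minimal sketch assuming the beta4 builder API (`weightDecay(double)`, `Activation.GELU`); the class name `RegularizationDemo` and the layer sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class RegularizationDemo {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .updater(new Adam(1e-3))
                .weightDecay(1e-4)                   // weight decay, new in beta4
                .list()
                .layer(new DenseLayer.Builder().nIn(784).nOut(128)
                        .activation(Activation.GELU) // new GELU activation function
                        .build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(128).nOut(10).activation(Activation.SOFTMAX).build())
                .build();
        System.out.println(conf.toJson());
    }
}
```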

For a complete rundown of the latest release, see the full release notes.

Beginning Deeplearning4j

New to DL4J? Follow their quickstart guide and find the monorepo of Deeplearning4j on GitHub.

You will need JDK 1.7 or higher, Apache Maven, IntelliJ IDEA or Eclipse, and Git. For additional resources, refer to the machine learning tutorial and the DL4J wiki.
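With Maven in place, a project pulls in DL4J and the native CPU backend via its pom.xml. A minimal dependency sketch, assuming the standard artifact coordinates for this release:

```xml
<dependencies>
    <!-- Core Deeplearning4j library -->
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-core</artifactId>
        <version>1.0.0-beta4</version>
    </dependency>
    <!-- ND4J native (CPU) backend -->
    <dependency>
        <groupId>org.nd4j</groupId>
        <artifactId>nd4j-native-platform</artifactId>
        <version>1.0.0-beta4</version>
    </dependency>
</dependencies>
```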

More tutorials and guides are available on the DL4J website.

Author
Sarah Schlothauer

Sarah Schlothauer is an assistant editor for JAXenter.com. She received her Bachelor's degree from Monmouth University and is currently enrolled at Goethe University in Frankfurt, Germany where she is working on her Masters. She lives in Frankfurt with her husband and cat.
