neon v1.1.0 released!
Oct 31, 2015
Highlights from this new release include:
* Sentiment analysis support (LSTM and LookupTable based), plus a new IMDB example network
* Support for merge and branch layer stacks via the introduction of LayerContainers
* Support for freezing layer stacks
* Adagrad-based optimizer
* New GPU kernels for fast compounding batch norm, conv and pooling engine updates, and a new kernel build system and flags
* Modifications for Caffe support. Note that this may break backwards compatibility with previously serialized strided conv net models; see http://neon.nervanasys.com/docs/latest/faq.html for details
* Default training cost display during progress bar is now calculated on a rolling window basis rather than from the beginning of each epoch
* Separate layer configuration and initialization steps
* Callback enhancements and updates. Note that validation_frequency has been renamed to evaluation_frequency
* Miscellaneous bug fixes and documentation updates throughout.
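For readers unfamiliar with Adagrad, the idea behind the new optimizer is to give each parameter its own effective learning rate, scaled down by the square root of that parameter's accumulated squared gradients. The sketch below is a minimal standalone illustration of the update rule, not neon's actual implementation (the function name and arguments are hypothetical):

```python
import numpy as np

def adagrad_update(param, grad, cache, lr=0.01, eps=1e-8):
    """One Adagrad step (illustrative sketch, not neon's code):
    accumulate squared gradients per parameter, then scale the
    learning rate by the inverse square root of that accumulator."""
    cache = cache + grad ** 2
    param = param - lr * grad / (np.sqrt(cache) + eps)
    return param, cache

# A parameter that keeps receiving the same large gradient sees its
# effective step size shrink as the accumulator grows.
w = np.array([1.0])
g = np.array([2.0])
cache = np.zeros_like(w)
history = []
for _ in range(3):
    w, cache = adagrad_update(w, g, cache)
    history.append(float(w[0]))
```

Because the accumulator only grows, step sizes decay monotonically for frequently updated parameters, which is what makes Adagrad attractive for sparse features such as the embedding rows in the new LookupTable-based sentiment models.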