Example code for weight normalization research paper
This repository provides example implementations of Weight Normalization, a technique to accelerate deep neural network training. It targets researchers and practitioners working with deep learning frameworks like Theano (via Lasagne), TensorFlow, and Keras. The primary benefit is faster convergence and improved training stability.
How It Works
Weight Normalization reparameterizes network weights by separating the direction and magnitude of each weight vector: w = g * v / ||v||, where the vector v determines the direction and the scalar g determines the magnitude. This decoupling lets the optimizer adjust direction and scale independently, leading to faster convergence than the standard weight parameterization.
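As a minimal NumPy sketch of the reparameterization (the function name and values are illustrative, not from the repository):

```python
import numpy as np

def weight_norm(v, g):
    """Reparameterize a weight vector as w = g * v / ||v||.

    v controls only the direction of w; the scalar g controls
    only its magnitude, so ||w|| == g by construction.
    """
    return g * v / np.linalg.norm(v)

# Hypothetical example: a 3-dimensional weight vector.
v = np.array([3.0, 4.0, 0.0])  # direction parameter (||v|| = 5)
g = 2.0                        # magnitude parameter
w = weight_norm(v, g)
print(w)                       # direction of v, rescaled
print(np.linalg.norm(w))       # equals g: 2.0
```

In a training loop, gradients are taken with respect to v and g rather than w directly, which is what yields the improved conditioning described in the paper.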
Quick Start & Requirements
Highlighted Details
Maintenance & Community
Status: Archive (code is provided as-is, no updates expected).
Licensing & Compatibility
The repository does not explicitly state a license.
Limitations & Caveats
The code is archived and no longer maintained or updated. Compatibility with current versions of the underlying frameworks (Theano, TensorFlow, Keras) is not guaranteed.