
github.com/torch/nn.git: commit history of the batch-normalization modules
2016-02-25  Add VolumetricBatchNormalization  (Sam Gross)
The BatchNormalization modules now all extend nn.BatchNormalization and use the same THNN/THCUNN implementation.
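A minimal sketch of what the unified hierarchy looks like from the Lua side; the three constructors are the real torch/nn modules, and the feature count 16 is an arbitrary example:

    require 'nn'

    -- all three variants normalize over the feature (channel) dimension and,
    -- per this commit, share the bookkeeping of the common nn.BatchNormalization base
    local bn1d = nn.BatchNormalization(16)            -- input: batch x 16
    local bn2d = nn.SpatialBatchNormalization(16)     -- input: batch x 16 x H x W
    local bn3d = nn.VolumetricBatchNormalization(16)  -- input: batch x 16 x T x H x W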
2016-02-18  Add THNN conversion for Spatial* modules  (Francisco Massa)
Add THNN conversion of SpatialBatchNormalization, SpatialFractionalMaxPooling and SpatialSubSampling.
Add THNN conversion of SpatialConvolutionLocal, SpatialFullConvolution and SpatialUpSamplingNearest.
Add THNN conversion of SpatialMaxUnpooling.
Remove unfold from generic.
Add functional conversion of SpatialCrossMapLRN.
Plus a fix in init.c.
2016-02-12  clearState save_mean and save_std in BN  (Sergey Zagoruyko)
2016-02-09  nn.clearState  (Sergey Zagoruyko)
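For context, clearState releases the intermediate buffers a module accumulates during forward/backward, typically before serializing a model. A rough usage sketch; the filename 'net.t7' is a hypothetical example:

    require 'nn'

    local net = nn.Sequential():add(nn.SpatialBatchNormalization(16))
    net:forward(torch.randn(4, 16, 8, 8))

    -- drop output/gradInput and per-module temporaries (for BN, the
    -- save_mean/save_std buffers from the entry above) before saving
    net:clearState()
    torch.save('net.t7', net)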
2016-02-08  Update SpatialBatchNormalization.lua  (Timothy Emerick)
Fix typo and copy-paste mistake
2016-02-04  Fix loading of old SpatialBatchNormalization modules  (Sam Gross)
The old-style running_std is actually E[1/sqrt(var + eps)]. I forgot to subtract out the 'eps' when converting to running_var.
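In other words, with running_std holding E[1/sqrt(var + eps)], the corrected conversion recovers the variance as 1/running_std^2 - eps. A hedged sketch of that elementwise transform; convertRunningStd is an illustrative name, not the actual loader code:

    -- running_std ~ 1/sqrt(var + eps)  =>  var = 1/running_std^2 - eps
    local function convertRunningStd(running_std, eps)
       return running_std:clone():pow(-2):add(-eps)
    end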
2016-01-22  fix batchnorm reset  (Sergey Zagoruyko)
2016-01-05  Add C implementation of SpatialBatchNormalization  (Sam Gross)
This is primarily to support the fast, memory-efficient CUDA implementation. Some other changes include making weight and bias each individually optional and averaging the variances instead of the inverse standard deviation.
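On the Lua side, the optional weight and bias surface through the affine flag (the underlying THNN functions accept each tensor individually). A small sketch assuming the standard constructor signature (nFeature, eps, momentum, affine):

    require 'nn'

    -- affine = false drops the learnable weight and bias entirely
    local bn = nn.SpatialBatchNormalization(16, 1e-5, 0.1, false)
    assert(bn.weight == nil and bn.bias == nil)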
2015-10-19  fix batchnorm reset  (Sergey Zagoruyko)
2015-09-11  30% memory saving  (Dimitrios Korkinof)
Not initialising those variables saved 30% of GPU memory when not training.
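The variables in question are training-only temporaries, so in inference mode they can stay unallocated; a brief sketch, assuming the saving applies whenever the module runs in evaluate mode:

    require 'nn'

    local net = nn.Sequential():add(nn.SpatialBatchNormalization(64))
    net:evaluate()  -- inference mode: training-only buffers are never filled
    local out = net:forward(torch.randn(1, 64, 32, 32))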
2015-06-03  batchnorm is clonable by adding the running estimates to constructor  (soumith)
fixing batchnorm tests
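Clone round-trips a module through serialization, so every tensor field, including the running estimates, has to exist from construction onward; a brief sketch:

    require 'nn'

    local bn = nn.BatchNormalization(16)
    bn:training()
    bn:forward(torch.randn(8, 16))   -- updates the running estimates

    -- works once running_mean/std are created in the constructor,
    -- because clone() serializes and deserializes the whole module
    local bn2 = bn:clone()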
2015-04-21  making running_mean/std shareable in batch normalization  (soumith)
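A sketch of what shareable running estimates enable, using nn.Module:share with the buffer names of this era (running_std was later replaced by running_var, per the 2016-01-05 entry above):

    require 'nn'

    local bn1 = nn.BatchNormalization(16)
    local bn2 = nn.BatchNormalization(16)

    -- point bn2's statistics (and parameters) at bn1's tensors by reference,
    -- so both modules track the same running estimates
    bn2:share(bn1, 'weight', 'bias', 'running_mean', 'running_std')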
2015-03-21  adding batch normalization  (soumith)
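The module as introduced follows the batch-normalization scheme of Ioffe & Szegedy (2015): normalize with batch statistics during training while accumulating running estimates for inference. A minimal usage sketch:

    require 'nn'

    local bn = nn.BatchNormalization(16)  -- y = weight * (x - mean) / sqrt(var + eps) + bias

    bn:training()                         -- batch statistics; running estimates get updated
    local y = bn:forward(torch.randn(32, 16))

    bn:evaluate()                         -- accumulated running estimates are used instead
    local z = bn:forward(torch.randn(1, 16))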