Welcome to mirror list, hosted at ThFree Co, Russian Federation.
index: github.com/soumith/cudnn.torch.git
Branches/tags: R1, R2, R3, R4, R5, R6, R7, functional-findex, half, master, revert-231-algo, volfullconv
path: root/test
Age         Commit message (Author)
2017-04-06  remove global variable from test (Trevor Killeen)
2017-04-04  move back to cudnntest (Trevor Killeen)
2017-04-04  implement for accGradParameters (Trevor Killeen)
2017-04-04  parity for updateGradInput (Trevor Killeen)
2017-04-04  test for hidden output as well (Trevor Killeen)
2017-04-04  outputs for forward pass working + test (Trevor Killeen)
2017-04-04  implement pack/pad utility functions (Trevor Killeen)
2017-02-16  inplace tests for Sigmoid and Tanh, typo (Natalia Gimelshein)
2016-11-10  Improved existing 16->32 fallback. Added performance-based fallback. (Boris Fomitchev)
2016-10-21  debug diagnostic fixes. true fp16 disabled for now (Boris Fomitchev)
2016-10-19  Fixing refactored methods (Boris Fomitchev)
            Conflicts: RNN.lua init.lua test/test.lua
2016-10-19  Merge remote-tracking branch 'upstream/master' into fp16 (Boris Fomitchev)
            Conflicts: SpatialFullConvolution.lua init.lua
2016-10-19  Added new refactoring for convolution and filter descriptors (Boris Fomitchev)
2016-10-19  Code review changes (Boris Fomitchev)
2016-10-11  Merge remote-tracking branch 'upstream/master' into fp16 (Boris Fomitchev)
2016-10-08  make VolumetricFullConvolution use find (Natalia Gimelshein)
2016-10-06  functional tests pass (Natalia Gimelshein)
2016-10-05  merging master (Natalia Gimelshein)
2016-10-04  functional tests for double with fixes (Sergey Zagoruyko)
2016-09-30  Change detection of FP16 math to follow cutorch setting (Boris Fomitchev)
2016-09-30  adding VolumetricFullConvolution [volfullconv] (soumith)
2016-09-29  functional softmax (Sergey Zagoruyko)
2016-09-29  add activations to functional (Sergey Zagoruyko)
2016-09-26  Fixed sticky algo modes (Boris Fomitchev)
2016-09-23  Revamped workspace handling in find.lua (Boris Fomitchev)
            Retired functional.lua: impossible to maintain consistently with Find. Simplified FindEx state machine: replaced with warmup iterations concept, controllable by user. FindEx still needs some work. Improved cache handling and debug print
2016-09-19  Restoring test (Boris Fomitchev)
2016-09-18  Adjusting half precision (Boris Fomitchev)
2016-09-15  Merge remote-tracking branch 'upstream/master' into find_ex (Boris Fomitchev)
2016-09-02  Refactoring for clarity and less allocations (Boris Fomitchev)
2016-08-31  Debugging fallback, cleanup (Boris Fomitchev)
2016-08-30  Addressed code review comments (Boris Fomitchev)
2016-08-26  Fixed fallback - fp16 is fully working now (Boris Fomitchev)
2016-08-26  FP16 to 32 fallback implemented (Boris Fomitchev)
2016-08-23  Stream awareness restored. Better WS encapsulation (Boris Fomitchev)
2016-08-21  FindEx implementation + refactoring, take 3 (Boris Fomitchev)
2016-08-12  Add tests for VolumetricLogSoftMax and VolumetricCrossEntropyCriterion (Sasank Chilamkurthy)
2016-08-06  working double precision (soumith)
2016-08-06  refactoring tests, phase 1 (soumith)
2016-08-06  Revert "Refactoring CUDNN Find" [revert-231-algo] (Soumith Chintala)
2016-08-04  Completing cudnnFind refactoring; addressing code review notes (Boris Fomitchev)
2016-08-03  Merge remote-tracking branch 'upstream/master' into algo (Boris Fomitchev)
2016-08-03  Refactoring cudnnFind (Boris Fomitchev)
2016-08-02  Refactoring CUDNN Find (Boris Fomitchev)
2016-07-29  Adjusting test tolerance, disabling double test (Boris Fomitchev)
2016-07-29  Merge pull request #206 from szagoruyko/fp16 (Soumith Chintala)
            half and double with tests
2016-07-07  Added bias and weight functions for RNN (SeanNaren)
2016-06-23  deal with fp16 batchnorm (Sergey Zagoruyko)
2016-06-23  half, double with tests (Sergey Zagoruyko)
2016-06-11  Modified tests and params, updated docs for clipped ReLU (SeanNaren)
2016-06-11  Added clipped ReLU (SeanNaren)