author     Jonathan Tompson <tompson@cims.nyu.edu>    2013-10-19 21:24:52 +0400
committer  Jonathan Tompson <tompson@cims.nyu.edu>    2013-10-19 21:24:52 +0400
commit     e1fbc0cccab633fd0615dc64ba9fd52f64072622 (patch)
tree       be1ed45abcee2e00105f0018a627fdb32991608b
parent     4be845ee0d7550daa15a13d8b7c0c14065aa8242 (diff)
It seems like torch.abs() doesn't have a CUDA implementation, so the previous commit would fail when PairwiseDistance():cuda() was called.
-rw-r--r--    PairwiseDistance.lua    2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/PairwiseDistance.lua b/PairwiseDistance.lua
index a11c864..4941210 100644
--- a/PairwiseDistance.lua
+++ b/PairwiseDistance.lua
@@ -48,7 +48,7 @@ function PairwiseDistance:updateGradInput(input, gradOutput)
    -- See here for derivative of p-norm:
    -- d/dx_k(||x||_p) = (x_k * abs(x_k)^(p-2)) / (||x||_p)^(p-1)
    -- http://en.wikipedia.org/wiki/Norm_(mathematics)
-   self.gradInput[1]:cmul(torch.abs(self.gradInput[1]):pow(self.norm-2))
+   self.gradInput[1]:cmul(self.gradInput[1]:clone():abs():pow(self.norm-2))
    if input[1]:dim() == 1 then -- Avoid the expand for dimension 1
       self.gradInput[1]:mul(math.pow(self.output[1],-(self.norm-1)))
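For context, a minimal sketch (not part of the commit) contrasting the two call patterns: torch.abs(t) goes through the free function, which at the time had no CUDA backend, while t:clone():abs() copies the tensor and applies the in-place :abs() method, which CUDA tensors do support, so updateGradInput keeps working after :cuda().

-- A minimal sketch, assuming Torch7; the CUDA case additionally needs cutorch/cunn.
require 'torch'

local t = torch.Tensor({-1, 2, -3})

-- Old pattern: the free function torch.abs() returns a new tensor, but it
-- had no CUDA implementation at the time, so it errored on CudaTensors.
local a = torch.abs(t)

-- New pattern from the fix: copy the tensor, then take the absolute value
-- in place on the copy. t is left untouched, and the :abs() method is
-- also available on CUDA tensors.
local b = t:clone():abs()

print(a)  -- elementwise |t|
print(b)  -- same values, computed without torch.abs()

The :clone() also matters inside updateGradInput: calling :abs() directly on self.gradInput[1] would overwrite its sign before the subsequent :cmul(), so the copy keeps the original gradient values intact.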