Tags from pytorch
Feed: https://github.com/knottb/pytorch/releases (updated 2019-04-30T23:22:19Z)

v1.1.0 (2019-04-30, zou3519)
  Fix version handler in 1.1.0 docs (pytorch#19977: https://github.com/pytorch/pytorch/pull/19977)
  Update the find & replace to be less restrictive. Will port this change to master to avoid problems in the future.

v1.0.1 (2019-02-07, ezyang)
  Remove unnecessary typing dependency (pytorch#16776: https://github.com/pytorch/pytorch/pull/16776)
  Signed-off-by: Edward Z. Yang <ezyang@fb.com>

v1.0.0 (2018-12-06, soumith)

v1.0rc1 (2018-10-02, bddppq)
  Back out "Revert D10123245: Back out "codemod cuda_gpu_id to device_id"" (pytorch#12232: https://github.com/pytorch/pytorch/pull/12232)
  Original commit changeset: fca91fea58b7
  This adds the proper modifications to the DeviceType <-> DeviceOption conversion code added in D10033396.
  Reviewed by: jerryzh168. Differential Revision: D10132473.
  fbshipit-source-id: 801ef777e2950982cb47b48051b1471a0a91e64b

v1.0rc0 (2018-10-02, bddppq)
  Same commit and release notes as v1.0rc1 above.

v0.4.1 (2018-07-26, soumith)

v0.4.0 (2018-05-30, soumith)

v0.3.1 (2018-02-09, lantiga)
  Scopes 0.3.1 backport (pytorch#5153: https://github.com/pytorch/pytorch/pull/5153)
  * Introduce scopes during tracing (pytorch#3016)
  * Fix segfault during ONNX export
  * Further fix to tracing scope (pytorch#4558)
  * Set missing temporary scope in callPySymbolicMethod
  * Use expected traces in all scope tests
  * Fix tracking of tracing scopes during ONNX pass (pytorch#4524)
  * Use ResourceGuard to manage setting a temporary current scope in Graph
  * Add tests for ONNX pass scopes
  * Remove unused num_classes argument
  * Expose node scopeName to Python (pytorch#4200)
  * Inherit JIT scopes when cloning only when it is correct: it is correct only when the new graph owns the same scope tree as the original one; we can end up with dangling pointers otherwise.
  * Fixes after cherry-picking, still one test to go
  * Fix for last failing test after scope cherry-pick
  * Fix linting issue

v0.3.0 (2017-12-04, dzhulgakov)
  Backport transposes optimization to v0.3.0 (pytorch#3994: https://github.com/pytorch/pytorch/pull/3994)
  * Optimizer: optimize transposes in a variety of circumstances (pytorch#3509):
    - no-op transposes
    - consecutive transposes (fuse them)
    - transposes into Gemm (fuse them into the transA/transB parameters)
  * Touch up out-of-date comment
  * Backporting optimizer changes

v0.2.0 (2017-08-28, soumith)
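The v0.3.0 backport describes an optimizer pass that removes no-op transposes and fuses consecutive ones. The core trick is that applying one axis permutation after another is itself a single permutation, and a transpose whose permutation is the identity can be dropped. As a minimal illustrative sketch only (the names and graph representation here are hypothetical, not the actual JIT/ONNX optimizer code), assuming ops are a flat list of `("transpose", perm)` and opaque `("op", name)` entries:

```python
def compose(p, q):
    # Transposing by p and then by q is one transpose by r, where
    # r[i] = p[q[i]] (axis i of the result reads axis q[i] of the
    # intermediate, which reads axis p[q[i]] of the original input).
    return [p[i] for i in q]

def is_identity(perm):
    return all(i == axis for i, axis in enumerate(perm))

def optimize(ops):
    """Return an equivalent op list with consecutive transposes fused
    and no-op (identity) transposes removed."""
    out = []
    for op in ops:
        if op[0] == "transpose" and out and out[-1][0] == "transpose":
            fused = compose(out.pop()[1], op[1])
            if not is_identity(fused):
                out.append(("transpose", fused))
        elif op[0] == "transpose" and is_identity(op[1]):
            continue  # no-op transpose: drop it
        else:
            out.append(op)
    return out
```

For example, two back-to-back 2-D transposes `[1, 0]` compose to the identity and both disappear. The real pass additionally folds a remaining transpose feeding a Gemm into its transA/transB attributes instead of emitting a separate op.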