Overview
Request 778030 accepted
- updated to stable release 1.4.0, with the following highlights:
  * Distributed Model Parallel Training
  * Pruning functionality has been added to PyTorch
    (see the pruning sketch after this changelog entry)
- New Features:
  * torch.optim.lr_scheduler now supports "chaining" of schedulers
    (see the sketch after this changelog entry)
  * torch.distributed.rpc is a newly introduced package
    (see the RPC sketch after this changelog entry)
- full changelog listed in the releases file or at
  https://github.com/pytorch/pytorch/releases
- added files:
  * skip-third-party-check.patch, a patch that skips the check
    for disabled dependencies
  * QNNPACK-7d2a4e9931a82adc3814275b6219a03e24e36b4c.tar.gz,
    which is part of PyTorch but developed in a separate repository
- removed patch files:
* fix-build-options.patch
* honor-PSIMD-env.patch
* removed-some-tests.patch
- Requires python-PeachPy on x86_64 only, as it is optional and
  only available on that architecture
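
The pruning sketch referenced above, a minimal example using the
torch.nn.utils.prune module shipped with 1.4.0; the layer and the
30% pruning amount are only illustrative:

  import torch
  import torch.nn.utils.prune as prune

  layer = torch.nn.Linear(10, 5)
  # zero out the 30% of weights with the smallest L1 magnitude;
  # prune registers layer.weight_orig and layer.weight_mask and
  # recomputes layer.weight from them
  prune.l1_unstructured(layer, name="weight", amount=0.3)
  print(layer.weight)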
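
A minimal sketch of scheduler "chaining": two schedulers attached to
the same optimizer are stepped together, so their learning-rate
adjustments compose. The model, optimizer and scheduler choices here
are only illustrative:

  import torch

  model = torch.nn.Linear(10, 2)
  optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
  # both schedulers act on the same optimizer, so their
  # learning-rate adjustments are applied one after the other
  decay = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
  drops = torch.optim.lr_scheduler.MultiStepLR(optimizer,
                                               milestones=[3, 6],
                                               gamma=0.1)
  for epoch in range(10):
      optimizer.step()
      decay.step()
      drops.step()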
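
The RPC sketch referenced above, a minimal example of the new
torch.distributed.rpc package. It needs at least two processes; the
worker names, ranks and rendezvous settings (MASTER_ADDR/MASTER_PORT)
are illustrative assumptions, not part of the packaging change:

  import os
  import torch
  import torch.distributed.rpc as rpc

  # rank 0 of a two-process job; a second process would call
  # rpc.init_rpc("worker1", rank=1, world_size=2)
  os.environ.setdefault("MASTER_ADDR", "localhost")
  os.environ.setdefault("MASTER_PORT", "29500")
  rpc.init_rpc("worker0", rank=0, world_size=2)

  # run torch.add remotely on worker1 and wait for the result
  result = rpc.rpc_sync("worker1", torch.add, args=(torch.ones(2), 3))
  print(result)
  rpc.shutdown()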
Request History
mslacken created request
mslacken accepted request
OK