
Switch to pytorch mixed precision #88

Open
senwu wants to merge 4 commits into main from fp16

Conversation

senwu (Owner) commented Dec 31, 2020

Description of the proposed changes

Switch to PyTorch native mixed precision training.
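
For context, PyTorch's native mixed precision API (`torch.cuda.amp`) wraps the forward pass in `autocast` and scales the loss with `GradScaler` to avoid fp16 gradient underflow. The sketch below is illustrative only (the model, optimizer, and data are made up, not taken from this PR's diff), and it degrades to plain fp32 on CPU-only machines:

```python
# Minimal sketch of torch.cuda.amp mixed precision training.
# The Linear model, SGD optimizer, and random data are placeholders,
# not code from this PR.
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# autocast/GradScaler only take effect on CUDA; disabled they are no-ops,
# so the same loop runs unchanged on CPU.
use_amp = torch.cuda.is_available()
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

for _ in range(3):
    inputs = torch.randn(8, 4)
    targets = torch.randn(8, 2)
    optimizer.zero_grad()
    with torch.cuda.amp.autocast(enabled=use_amp):
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
    scaler.scale(loss).backward()  # scale loss before backward (underflow guard)
    scaler.step(optimizer)         # unscales grads, then optimizer.step()
    scaler.update()                # adjust the scale factor for the next step
```

This replaces the previous apex-style workflow with an API that ships in PyTorch itself, so no extra dependency is needed.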

Checklist

  • I have updated the documentation accordingly.
  • I have added tests to cover my changes.
  • All new and existing tests passed.
  • I have updated the CHANGELOG.rst accordingly.

codecov bot commented Dec 31, 2020

Codecov Report

Merging #88 (4deefba) into master (1a35ab3) will increase coverage by 0.24%.
The diff coverage is 66.66%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master      #88      +/-   ##
==========================================
+ Coverage   87.73%   87.97%   +0.24%     
==========================================
  Files          38       38              
  Lines        1745     1747       +2     
  Branches      375      375              
==========================================
+ Hits         1531     1537       +6     
+ Misses        114      110       -4     
  Partials      100      100              
Flag        Coverage Δ
unittests   87.97% <66.66%> (+0.24%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                       Coverage Δ
setup.py                             0.00% <ø> (ø)
src/emmental/utils/parse_args.py     99.25% <ø> (-0.01%) ⬇️
src/emmental/learner.py              78.16% <66.66%> (+1.80%) ⬆️


1 participant