Fixing test flakiness #84
The test `MultiHeadSelfAttentionTest::test_multi_head_self_attention_respects_masking` in `pymagnitude/third_party/allennlp/tests/modules/seq2seq_encoders/multi_head_self_attention_test.py` fails intermittently with an assertion error.

This fix addresses the problem. I looked at the differences in the values being compared across several samples, and changing the decimal places from 7 (the default) to 6 eliminates the failure and reduces the flakiness of the test.
Please let me know if this looks good or if you have any other suggestions for the fix.