
Conversation

@psy2848048
Contributor

Background

  • Asking for an opinion on a data loss that occurs when parsing Float64 data (a potentially breaking change)

Summary

When total_producer_vote_weight is parsed, the float64 value comes out as 4.403240198383598e+20, so precision is lost during parsing:

- (string) (len=26) "total_producer_vote_weight": (string) (len=28) "440324019838359830528.000000"
+ (string) (len=26) "total_producer_vote_weight": (string) (len=21) "440324019838359800000"

However, I think this loss in the lower digits is not really significant. What do you think?
If you agree, I'll change the expected data so the test passes.
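
For reference, here is a minimal standalone Go sketch (not the code changed in this PR) showing why the lower digits disappear: a float64 mantissa only carries about 15-17 significant decimal digits, so a 21-digit integer part cannot survive a round trip through strconv.ParseFloat.

```go
package main

import (
	"fmt"
	"strconv"
)

func main() {
	// Value as it appears in the API response (21 integer digits).
	original := "440324019838359830528.000000"

	// A float64 has a 53-bit mantissa (~15-17 decimal digits), so the
	// lower digits are rounded away as soon as the string is parsed.
	f, err := strconv.ParseFloat(original, 64)
	if err != nil {
		panic(err)
	}

	fmt.Println(f)                                   // 4.403240198383598e+20
	fmt.Println(strconv.FormatFloat(f, 'f', -1, 64)) // 440324019838359800000
}
```
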

Note

Checklist

  • Backward compatible?
  • Tested enough in your local environment?
  • Added related test cases?

@maoueh
Contributor

maoueh commented Jan 4, 2023

@psy2848048 I'm not sure I understand your comment. I looked at the diff; the only thing I see changing outside of tests is the switch from JSONTime to BlockTimestamp, and I don't see how that relates to your comment. So I'm unsure here.

