I've had an app working perfectly for a year now (thank you so much for the amazing product).
But for the past week I've had a bug with one of the columns, which appeared out of the blue.
The app is for time tracking: when a user fills in the time form, they enter the number of hours spent on a task. That is done with a Decimal column type, using an increase/decrease step of 0.25 and, accordingly, two decimal digits.
So when the user hits the + sign, the value starts from 0 and goes up through:
- 0.25 - 0.5 - 0.75 - 1 - 1.25 and so on.
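Just to make the expected behavior concrete, here is a tiny sketch (plain TypeScript, purely illustrative; the app itself uses the column's built-in step setting, not code I wrote) of what stepping by 0.25 should do:

```ts
// Illustration only, not the app's actual logic: stepping hours by 0.25.
// Counting whole quarter-hours as integers keeps every value exact.
function stepHours(hours: number, direction: 1 | -1): number {
  const quarters = Math.round(hours * 4) + direction; // whole quarter-hours
  return Math.max(0, quarters) / 4;                   // back to hours
}

let h = 0;
for (let i = 0; i < 5; i++) {
  h = stepHours(h, 1);
  console.log(h.toFixed(2)); // 0.25, 0.50, 0.75, 1.00, 1.25
}
```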
This worked perfectly until a week ago. Now, as you hit the + button, it goes through:
- 0.25 - 0.5 - 0.8 - 1.00 - 1 - 1.25 - 1.5 - 1.8 - 2.00 - 2 and so on.
And when hitting the - button from there, going back down, it goes through:
- 2.00 - 2 - 1.75 - 1.5 - 1.3 - 1.25 - 1.00 - 1 - 0.75 - 0.5 - 0.3 - 0.25 - 0.00 - 0
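Two things stand out in those sequences: whole numbers show up twice (once as "1.00" and once as "1"), and the .25/.75 values pass through a one-decimal rounding (0.75 shows as 0.8, 1.25 as 1.3). Purely as an illustration of that second pattern (again plain TypeScript, not anything from my app, since the column is configured through settings rather than code), rounding the quarter-hour values to one decimal place reproduces exactly the odd entries above:

```ts
// Hypothetical illustration: formatting the quarter-hour values with ONE
// decimal place instead of two produces the strange entries I'm seeing.
const quarters: number[] = [0.25, 0.5, 0.75, 1, 1.25, 1.5, 1.75, 2];
for (const v of quarters) {
  // two decimals (what I configured) vs. one decimal (what the odd values match)
  console.log(`${v.toFixed(2)} -> ${v.toFixed(1)}`); // e.g. "0.75 -> 0.8", "1.25 -> 1.3"
}
```

So it looks like the displayed precision is flipping between two decimals and one decimal somewhere, even though the column is set to two.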
No idea why this is happening. I can't think of anything that could have changed on my side, since I haven't worked on the app for a while, and the column setup still looks correct; in practice, though, the bug is there.