I have a field defined in an MS/SQL 7.0 table as "REAL".
When I specify this field as a parameter, the following values appear as choices:
.03, .05, .08, .95, .98, .99, 1.00
If I select .03 up to .08, I get the records I ask for, but if I select .95 up to 1.00, I ALWAYS get 1.00 for that field, even if the value in the table is really .95 up to .99. SQL Query Analyzer displays these values at full precision, like 9.9999999, but CR does NOT. I read this article at CR support:
Is there any way around this so that the .95 through .99 records show up correctly?
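For reference, REAL is a 4-byte single-precision type, so a value like .95 has no exact binary representation and is actually stored as something like 0.94999999. Here is a minimal sketch of what I mean, runnable in Query Analyzer (the table and column names are made up, and the cast-based workaround at the end is just something I have been trying, not a confirmed fix):

-- Demonstrate the single-precision imprecision of REAL
DECLARE @r REAL
SET @r = 0.95
SELECT CAST(@r AS FLOAT) AS full_precision   -- shows ~0.949999988, not 0.95

-- Hypothetical workaround: round the REAL column to 2 decimals
-- before comparing, so a stored .95 matches the literal 0.95
SELECT *
FROM MyTable                                 -- hypothetical table name
WHERE CAST(RealField AS DECIMAL(10,2))       -- hypothetical column name
      BETWEEN 0.95 AND 0.99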