I'm sorry if this sounds totally amateurish (and I guess it's easy to answer), but can't '0' be considered a valid float or double? I'm parsing input lines to make sure all tokens are valid floats (e.g. 0.00001e+002). I do this by checking the value returned by 'atof' for each scanned token string: if atof returns anything but 0, the token is valid. However, in some cases the values really are zero (i.e. 0.00000e+000), and atof returns '0' for those too. Is there no way of distinguishing a genuine '0' from gibberish (e.g. 0.00????e+000)?
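In case it helps, here is a minimal illustration of the ambiguity I mean, using the two example tokens from above; both calls print the same thing:

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* The return value alone cannot tell a genuine zero
       from an unparsable token: both print 0.000000. */
    printf("%f\n", atof("0.00000e+000"));  /* valid zero */
    printf("%f\n", atof("0.00????e+000")); /* gibberish  */
    return 0;
}
```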
This isn't critical, as I can always skip this input-checking procedure and later set to '0' any token string that atof doesn't recognize as a float ... but I'd like to know all the same. The other way would be to mimic something like atof that would accept '0' as valid (a rough sketch of what I mean follows below).
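Something along these lines, perhaps, using strtod, which reports how much of the string it consumed (the name is_valid_float is just made up for the example, and I haven't tested this beyond the obvious cases):

```c
#include <stdlib.h>
#include <ctype.h>

/* Returns 1 if the whole token parses as a float (including zero),
   0 otherwise. */
static int is_valid_float(const char *token)
{
    char *end;
    strtod(token, &end);
    if (end == token)          /* nothing was consumed */
        return 0;
    while (isspace((unsigned char)*end))  /* allow trailing whitespace */
        end++;
    return *end == '\0';       /* reject leftover junk after the number */
}
```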
Thanks in advance