I have two sample applications that use the same library; the main difference between them is that one uses Qt and the other is a console application.
In the common library, I have this test code:
double test = 0.1;
double test2 = atof("2.13134");
double test3 = atof("1,12345");
The values I get with the non-Qt application are:
test = 0.10000000000001
test2 = 2.1323399999999999998
test3 = 1 // This is the expected result, since ',' is not the decimal separator here and parsing stops at it
But with the Qt application:
test = 0.10000000000001
test2 = 2 // This is not expected!!!
test3 = 1.1234500000000000001
Is there any case where the behaviour of atof changes because of Qt?
std::atof depends on the currently set locale to tell it which character is the decimal point. In the default case (the "C" locale), that is the period character '.'.
It's likely that Qt is setting the locale to something else. You can revert that using the standard C/C++ mechanism:
std::setlocale(LC_ALL, "C");
The problem you are noticing is most likely caused by Qt's notion of locale. You can use:
QLocale::setDefault(QLocale::C);
to make it work like atof.
Update
It seems QLocale::setDefault does not change the global C library locale that atof uses. It merely sets the default locale that you get when you construct a QLocale. See Changing locale in Qt and the accepted answer for more info.
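The distinction is easy to see in a small sketch built against QtCore (QLocale::setDefault, QLocale::c(), QLocale::toDouble and std::setlocale are the real APIs; the surrounding program is illustrative only):

#include <QLocale>
#include <clocale>   // std::setlocale
#include <cstdlib>   // std::atof
#include <iostream>

int main()
{
    // This only changes what a default-constructed QLocale is...
    QLocale::setDefault(QLocale::c());
    QLocale loc;                                        // now the "C" locale
    bool ok = false;
    std::cout << loc.toDouble("2.13134", &ok) << '\n';  // 2.13134, ok == true

    // ...it does not touch the C library locale that std::atof consults.
    // Whatever std::setlocale (or a framework calling it) last set is still active.
    std::cout << std::setlocale(LC_ALL, nullptr) << '\n';
    std::cout << std::atof("2.13134") << '\n';
    return 0;
}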