The following (toy) program produces different output when linked against libstdc++ and libc++. Is this a bug in libc++, or do I misunderstand how istream eof() works? I have tried running it with g++ on Linux and Mac OS X, and with clang on Mac OS X, with and without -std=c++0x.

It was my impression that eof() does not return true until an attempt to read (by get() or something else) actually fails. This is how libstdc++ behaves, but not how libc++ behaves.
#include <iostream>
#include <sstream>

int main() {
    std::stringstream s;
    s << "a";
    std::cout << "EOF? " << (s.eof() ? "T" : "F") << std::endl;
    std::cout << "get: " << s.get() << std::endl;
    std::cout << "EOF? " << (s.eof() ? "T" : "F") << std::endl;
    return 0;
}
Thor:~$ g++ test.cpp
Thor:~$ ./a.out
EOF? F
get: 97
EOF? F
Thor:~$ clang++ -std=c++0x -stdlib=libstdc++ test.cpp
Thor:~$ ./a.out
EOF? F
get: 97
EOF? F
Thor:~$ clang++ -std=c++0x -stdlib=libc++ test.cpp
Thor:~$ ./a.out
EOF? F
get: 97
EOF? T
Thor:~$ clang++ -stdlib=libc++ test.cpp
Thor:~$ ./a.out
EOF? F
get: 97
EOF? T
EDIT: This was due to the way older versions of libc++ interpreted the C++ standard. The interpretation was discussed in LWG issue 2036; it was ruled incorrect, and libc++ was changed.
Current libc++ gives the same results on your test as libstdc++.
old answer:
Your understanding is correct.
istream::get() does the following:

1. Constructs a sentry object, which checks good() and sets failbit if it returns false (this adds failbit to a stream that already had some other bit set) (§27.7.2.1.2[istream::sentry]/2).
2. If good() is false at this point, returns eof and does nothing else.
3. Extracts a character by calling rdbuf()->sbumpc() or rdbuf()->sgetc() (§27.7.2.1[istream]/2).
4. If sbumpc() or sgetc() returned eof, sets eofbit (§27.7.2.1[istream]/3) and failbit (§27.7.2.2.3[istream.unformatted]/4).
5. Catches any exception thrown, sets badbit (§27.7.2.2.3[istream.unformatted]/1), and rethrows if allowed.

(Chapters quoted from C++11, but C++03 has all the same rules, under §27.6.*.)
Now let's take a look at the implementations:
libc++ (current svn version) defines the relevant part of get() as
sentry __s(*this, true);
if (__s)
{
    __r = this->rdbuf()->sbumpc();
    if (traits_type::eq_int_type(__r, traits_type::eof()))
        this->setstate(ios_base::failbit | ios_base::eofbit);
    else
        __gc_ = 1;
}
libstdc++ (as shipped with gcc 4.6.2) defines the same part as
sentry __cerb(*this, true);
if (__cerb)
{
    __try
    {
        __c = this->rdbuf()->sbumpc();
        // 27.6.1.1 paragraph 3
        if (!traits_type::eq_int_type(__c, __eof))
            _M_gcount = 1;
        else
            __err |= ios_base::eofbit;
    }
    [...]
    if (!_M_gcount)
        __err |= ios_base::failbit;
As you can see, both libraries call sbumpc() and set eofbit if and only if sbumpc() returned eof.

Your testcase produces the same output for me using recent versions of both libraries.
This was a libc++ bug and has been fixed, as Cubbi noted. My bad. Details are here:
http://lwg.github.io/issues/lwg-closed.html#2036
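As a practical takeaway, a read loop that tests the value returned by the read itself, rather than polling eof() beforehand, behaves the same under either interpretation. A minimal sketch (my own, not from the discussion above):

#include <iostream>
#include <sstream>

int main() {
    std::stringstream s;
    s << "abc";
    // Check the result of get() directly; eofbit is only guaranteed
    // to be set once a read actually fails, so testing eof() up front
    // is fragile regardless of library.
    int c;
    while ((c = s.get()) != std::istream::traits_type::eof())
        std::cout << static_cast<char>(c) << std::endl;
    return 0;
}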