In the program below, I intend to read each line of a file into a string, break the string down, and display the individual words. The problem I am facing is that the program now outputs only the first line of the file, and I do not understand why this is happening.
#include<iostream>
#include<string>
#include<fstream>
#include<cstring>
using namespace std;
int main()
{
    ifstream InputFile("hello.txt");
    string store;
    char * token;
    while(getline(InputFile,store))
    {
        cout<<store<<endl;
        token = strtok(&store[0]," ");
        cout<<token;
        while(token!=NULL)
        {
            token = strtok(NULL," ");
            cout<<token<<" ";
        }
    }
}
I'm new to C++, but I think an alternative approach could be:
while(getline(InputFile, store))
{
    stringstream line(store); // include <sstream>
    string token;
    while (line >> token)
    {
        cout << "Token: " << token << endl;
    }
}
This will parse your file line by line and tokenise each line on whitespace, which covers more than just spaces (tabs and newlines as well).
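If it helps, here is a minimal, self-contained sketch of that idea; it assumes the file name hello.txt from your question, everything else is standard C++:

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
using namespace std;

int main()
{
    ifstream InputFile("hello.txt");
    string store;
    while (getline(InputFile, store))
    {
        stringstream line(store);   // wrap the current line in a stream
        string token;
        while (line >> token)       // operator>> skips whitespace between tokens
        {
            cout << "Token: " << token << endl;
        }
    }
}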
Well, there is a problem here. strtok() takes a null-terminated string, and the contents of a std::string are not necessarily null-terminated. You can get a null-terminated string from a std::string by calling c_str() on it, but this returns a const char* (i.e., the string is not modifiable), whereas strtok() takes a char* and modifies the string when it is called.
If you really want to use strtok(), then in my opinion the cleanest option is to copy the characters from the std::string into a std::vector and then null-terminate the vector:
std::string s("hello, world");
std::vector<char> v(s.begin(), s.end());
v.push_back('\0');
You can now use the contents of the vector as a null-terminated string (using &v[0]) and pass that to strtok().
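For example, a sketch of the full tokenising loop might look like this (it repeats the vector setup from above and uses "hello, world" as an assumed sample input):

#include <cstring>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::string s("hello, world");
    std::vector<char> v(s.begin(), s.end());
    v.push_back('\0');                      // null-terminate the buffer

    // strtok() may freely modify the vector's buffer
    char* token = std::strtok(&v[0], " ");
    while (token != NULL)
    {
        std::cout << token << '\n';
        token = std::strtok(NULL, " ");     // subsequent calls pass NULL
    }
}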
If you can use Boost, I'd recommend using Boost Tokenizer. It provides a very clean interface for tokenizing a string.
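For illustration, a minimal Boost Tokenizer sketch might look like the following (the sample string and separator characters are just assumptions for the example):

#include <boost/tokenizer.hpp>
#include <iostream>
#include <string>

int main()
{
    std::string line("hello world foo");
    boost::char_separator<char> sep(" \t");  // split on spaces and tabs
    boost::tokenizer<boost::char_separator<char>> tokens(line, sep);
    for (const std::string& token : tokens)
        std::cout << token << '\n';
}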