I am using curl to communicate with a server.
When I make a request for data I receive the HTTP headers followed by JPEG data, separated by a boundary, like so:
I need to parse out
I have copied the incoming data to a char array like so:
static size_t OnReceiveData ( void * pvData, size_t tSize, size_t tCount, void * pvUser )
{
    const size_t tBytes = tSize * tCount;       // total size of this chunk in bytes
    printf("%.*s", (int)tBytes, (const char*)pvData);
    char* _data = nullptr;
    if (pvData != nullptr && 0 != tBytes)
    {
        _data = new char[tBytes];               // copy all bytes, not just tCount
        memcpy(_data, pvData, tBytes);
    }
    return tBytes;                              // curl expects the number of bytes handled
}
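For context, libcurl may invoke the write callback many times during a single transfer, with one chunk per call, so a single `memcpy` only captures one fragment. A common pattern (a sketch, assuming a `std::string` buffer is passed in as the user pointer via `CURLOPT_WRITEDATA`) is to append each chunk to a growing buffer:

```cpp
#include <cstddef>
#include <string>

// Sketch: append each chunk libcurl delivers to a caller-owned buffer.
// `pvUser` is the std::string registered with CURLOPT_WRITEDATA.
static size_t AppendChunk(void* pvData, size_t tSize, size_t tCount, void* pvUser)
{
    const size_t nBytes = tSize * tCount;                   // bytes in this chunk
    auto* pBuffer = static_cast<std::string*>(pvUser);
    pBuffer->append(static_cast<const char*>(pvData), nBytes);
    return nBytes;                                          // tell curl everything was consumed
}
```

Once the transfer completes, the buffer holds the full response (headers plus image bytes) and can be parsed in one pass.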
How can I best do this in C++? How do I actually inspect and parse the _data array for the information that I want? Are there any Boost libraries that I can use, for example?
You could parse the headers on the fly, or put them into a map and post-process them later.
Use the find and substr methods of std::string.
Also look at the Boost String Algorithms Library; it contains lots of useful algorithms, e.g. trim.
For example, to place the headers into a std::map and print them (rough cuts):
#include <cstdlib>
#include <iostream>
#include <sstream>
#include <string>
#include <map>
#include <boost/algorithm/string.hpp>

int main(int argc, char* argv[]) {
    const char* s = "HTTP/1.1 200 OK\r\n"
                    "Content-Type: image/jpeg; charset=utf-8\r\n"
                    "Content-Length: 19912\r\n\r\n";

    std::map<std::string, std::string> m;
    std::istringstream resp(s);
    std::string header;
    std::string::size_type index;

    // std::getline strips the '\n'; a lone "\r" is the blank line
    // that terminates the header block.
    while (std::getline(resp, header) && header != "\r") {
        index = header.find(':', 0);
        if (index != std::string::npos) {
            m.insert(std::make_pair(
                boost::algorithm::trim_copy(header.substr(0, index)),
                boost::algorithm::trim_copy(header.substr(index + 1))
            ));
        }
    }

    for (auto& kv : m) {
        std::cout << "KEY: `" << kv.first << "`, VALUE: `" << kv.second << '`' << std::endl;
    }
    return EXIT_SUCCESS;
}
You will get the output:
KEY: `Content-Length`, VALUE: `19912`
KEY: `Content-Type`, VALUE: `image/jpeg; charset=utf-8`
Having the headers, you could extract the required ones for post-processing.
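To go one step further and pull the image bytes themselves out of a buffered response, one sketch (assuming the whole reply, headers plus body, sits in a single std::string, and that the header block ends with the standard "\r\n\r\n" separator) is to locate that blank line and then take Content-Length bytes after it:

```cpp
#include <string>

// Sketch: split a buffered HTTP reply into header block and body.
// Returns the body, clamped to `contentLength` bytes, or an empty
// string if the "\r\n\r\n" header terminator is missing.
std::string ExtractBody(const std::string& response,
                        std::string::size_type contentLength)
{
    const std::string::size_type pos = response.find("\r\n\r\n");
    if (pos == std::string::npos)
        return {};                          // malformed: no header terminator
    // Body starts right after the blank line.
    return response.substr(pos + 4, contentLength);
}
```

The contentLength argument would come from the Content-Length entry in the map built above (converted with e.g. std::stoul); the returned bytes are the raw JPEG data, which should not be treated as text.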