This is not a duplicate of what-are-differences-between-xmlhttprequest-and-httprequest. For the record, I tried this lib without success, because it copies the structure of XMLHttpRequest but doesn't actually behave like it.
I wonder what the true network difference is between HttpRequest from Node and XMLHttpRequest from a browser.
If I just watch the XMLHttpRequest inside Chrome's devtools, I can't see any X-Requested-With header in the request.
Besides, there's an online service behind CloudFlare's WAF with custom rules. If I make the request with XMLHttpRequest, it just works, but if I do it with https.request it fails, blocked by CF's firewall.
I need to do it with HttpRequest so I can configure a proxy.
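For reference, this is roughly how I intend to plug the proxy in - a sketch assuming the third-party https-proxy-agent package and a placeholder proxy address (something the browser's XMLHttpRequest won't let me do):

const https = require('https');
// On older versions of https-proxy-agent the module itself is the constructor.
const { HttpsProxyAgent } = require('https-proxy-agent');

// Placeholder proxy address - replace with your own.
const agent = new HttpsProxyAgent('http://127.0.0.1:3128');

const req = https.request({
    method: 'POST',
    hostname: 'haapi.ankama.com',
    path: '/json/Ankama/v2/Api/CreateApiKey',
    agent, // tunnels the request through the proxy via CONNECT
    headers: { 'Accept': 'application/json', 'Content-Type': 'text/plain;charset=UTF-8' }
}, (res) => console.log(res.statusCode));
req.end('login=abc&password=def');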
What is the network difference between the two, and how could I simulate an XMLHttpRequest from an HttpRequest? Is that even possible? I looked through the Chromium source here but couldn't find anything interesting.
Maybe the difference lies in the I/O layers? The TCP handshake?
Advice required. Thanks.
Here is the XMLHttpRequest (working)
let req = new XMLHttpRequest();
req.open("post", "https://haapi.ankama.com/json/Ankama/v2/Api/CreateApiKey", true);
req.withCredentials = true;
req.setRequestHeader('Accept', 'application/json');
req.setRequestHeader('Content-Type', 'text/plain;charset=UTF-8');
req.setRequestHeader('Accept-Encoding', 'gzip, deflate, br');
req.onload = function() {
    console.log(req.response);
};
req.send("login=smallladybug949&password=Tl9HDKWjusopMWy&long_life_token=true");
The same request as cURL (not passing CF's firewall)
curl 'URL' \
-H 'origin: null' \
-H 'accept-encoding: gzip, deflate, br' \
-H 'user-agent: Mozilla/5.0 (Linux; Android 6.0.1; Z988 Build/MMB29M) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/69.0.3497.100 Mobile Safari/537.36' \
-H 'content-type: text/plain;charset=UTF-8' \
-H 'accept: application/json' \
-H 'authority: URL.com' \
--data-binary 'login=123&password=def' \
--compressed
Here is the HttpRequest (not passing CF's firewall)
const https = require('https');
const url = require('url');

let opts = url.parse(URL); // URL is the endpoint string, e.g. the CreateApiKey URL above
opts.method = 'POST';
opts.headers = {
    'Accept': 'application/json',
    'Content-Type': 'text/plain;charset=UTF-8',
    'Accept-Encoding': 'gzip, deflate, br',
    'User-Agent': 'Mozilla/5.0 (Linux; Android 8.0.0; SM-G960F Build/R16NW) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.137 Mobile Safari/537.36'
};
let req = https.request(opts, function (res) {
    res.setEncoding('utf8');
    res.body = "";
    res.on('data', (chunk) => {
        res.body += chunk;
    });
    res.on('end', () => {
        try {
            res.body = JSON.parse(res.body);
        } catch (e) {
            // error: HTTP 403 / Cloudflare 1020 error (custom FW rule);
            // `reject` comes from a surrounding Promise wrapper that is not shown here
            return reject(res.body);
        }
        console.log(res.body); // we'll not reach this
    });
});
req.on('error', e => {
    console.error('error', e);
});
req.write("login=abc&password=def");
req.end();
After several tests: the curl command works and the XHR works too, but with Postman or HttpRequest it fails. Here is a video of Postman vs curl: https://streamable.com/81s57. The curl command in the video is this one:
curl -X POST \
https://haapi.ankama.com/json/Ankama/v2/Api/CreateApiKey \
-H 'accept: application/json' \
-H 'accept-encoding: gzip, deflate, br' \
-H 'accept-language: fr' \
-H 'authority: haapi.ankama.com' \
-H 'content-type: text/plain;charset=UTF-8' \
-H 'origin: null' \
-H 'user-agent: Mozilla/5.0 (Linux; Android 8.0.0; SM-G960F Build/R16NW) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.137 Mobile Safari/537.36' \
-d 'login=smallladybug949&password=Tl9HDKWjusopMWy&long_life_token=true'
(This is a test account, so I don't need it and you can run tests with it.) You can either add the --compressed flag to the curl request to decompress the response, or pipe it to gunzip.
I found out that it was due to how the TLS protocol is used, which CF objects to. By downgrading to a curl build that uses OpenSSL/1.1.0f, the calls just work; since OpenSSL/1.1.0g they don't. You can read more in the OpenSSL changelogs here.
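To check which OpenSSL build is in play: curl --version prints it for curl, and Node exposes its bundled OpenSSL version at runtime:

// Print the Node release and the OpenSSL version it was built against,
// to compare with the 1.1.0f / 1.1.0g boundary mentioned above.
console.log('node:', process.version);
console.log('bundled openssl:', process.versions.openssl);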
As I discussed in the comments, I can reproduce this: the XMLHttpRequest example from the first "Edit" works (HTTP status 200), while the "Copy as cURL" version of it returns 403 from Cloudflare.
Adding --cert-status to curl makes it work for me, so it seems that Cloudflare analyzes TLS-level communication when deciding whether to deny a request.
Your curl command from the first Edit has a few other differences from the version I get when using "Copy as cURL":

- curl 'URL' instead of https://haapi.ankama.com/json/Ankama/v2/Api/CreateApiKey obviously fails; please don't make it harder to reproduce your results.
- -H 'origin: null' vs -H 'Origin: https://localhost:4443' -H 'Referer: https://localhost:4443/test_http.html' - this doesn't make a difference.
- -H 'DNT: 1' -H 'Connection: keep-alive' -H 'Cookie: __cfduid=dcf1b80eef19562054c9b64f79139509e1566138746' - these don't make a difference either.
- the -H 'user-agent: header doesn't affect Cloudflare either.
- -H 'authority: URL.com' (with a placeholder in place of the real domain) - this doesn't make a difference either.
- --data-binary 'login=123&password=def' only affects the API results; it doesn't affect the 403.
- the missing -H 'Accept-Language: header causes the 403 from Cloudflare.

So you could try adding the missing Accept-Language header to the Node version to see if it helps.
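For example, the headers object in your Node snippet could become the following (whether this alone satisfies the Cloudflare rule is untested; 'fr' simply mirrors your working curl command):

opts.headers = {
    'Accept': 'application/json',
    'Accept-Language': 'fr', // the header that was missing from the Node version
    'Content-Type': 'text/plain;charset=UTF-8',
    'Accept-Encoding': 'gzip, deflate, br',
    'User-Agent': 'Mozilla/5.0 (Linux; Android 8.0.0; SM-G960F Build/R16NW) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.137 Mobile Safari/537.36'
};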
My version of Node doesn't send Extension: status_request in the TLS Client Hello (which seems to be the difference between curl invocations with and without --cert-status), and I don't see how you would enable it. At this point I'd try contacting support if possible, or fall back to calling curl from Node.
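A rough sketch of that fallback - shelling out to curl with --cert-status from Node, using the test endpoint and credentials from your question:

const { execFile } = require('child_process');

const args = [
    '--silent',
    '--compressed',
    '--cert-status', // request and verify stapled OCSP, i.e. the status_request extension
    '-H', 'accept: application/json',
    '-H', 'accept-language: fr',
    '-H', 'content-type: text/plain;charset=UTF-8',
    '--data-binary', 'login=smallladybug949&password=Tl9HDKWjusopMWy&long_life_token=true',
    'https://haapi.ankama.com/json/Ankama/v2/Api/CreateApiKey'
];

execFile('curl', args, (err, stdout, stderr) => {
    if (err) return console.error('curl failed:', err.message, stderr);
    try {
        console.log(JSON.parse(stdout));
    } catch (e) {
        console.log(stdout); // non-JSON body, e.g. a Cloudflare error page
    }
});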
P.S. While debugging this I tried to compare the Wireshark captures of curl vs the browser (Node doesn't support SSLKEYLOGFILE, forcing you to jump through hoops, so I didn't even try checking how its capture looks). There are so many minor differences that trying to reverse-engineer the rules Cloudflare uses would be very time-consuming; --cert-status was a lucky guess.
The SSL Client Hellos sent by Firefox, curl and Node 11 are very different: [screenshots of the captured Client Hello for Firefox, curl and node11]
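(On newer Node releases, 12.3 and later, TLS sockets emit a 'keylog' event that can stand in for SSLKEYLOGFILE if you want to decrypt a Node capture in Wireshark. A rough sketch, with sslkeys.log as an arbitrary output path:)

const fs = require('fs');
const tls = require('tls');

const keylog = fs.createWriteStream('sslkeys.log', { flags: 'a' });

const socket = tls.connect(443, 'haapi.ankama.com', { servername: 'haapi.ankama.com' }, () => {
    // Handshake complete; key material has been appended to sslkeys.log by the
    // 'keylog' listener below, in NSS SSLKEYLOGFILE format that Wireshark understands.
    socket.end();
});

// Each 'keylog' line arrives as a Buffer already formatted for the keylog file.
socket.on('keylog', (line) => keylog.write(line));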