I was on almaconnect.com; on the home page there is a textbox that auto-suggests university results as you type (the content is loaded via an AJAX call). I made a curl request replicating that same AJAX call, but the request produced what looked like encrypted lines in the terminal:
curl 'https://www.almaconnect.com/suggestions/portaled_institute?q=am' -H 'Host: www.almaconnect.com' -H 'User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:44.0) Gecko/20100101 Firefox/44.0' -H 'Accept: application/json, text/javascript, */*; q=0.01' -H 'Accept-Language: en-US,en;q=0.5' -H 'Accept-Encoding: gzip, deflate, br' -H 'X-Requested-With: XMLHttpRequest' -H 'Referer: https://www.almaconnect.com/' -H 'Cookie: Almaconnect=; _ga=GA1.2.315358219.1489989532; __utma=117457241.315358219.1489989532.1490871434.1492414070.3; __utmz=117457241.1490871434.2.2.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=(not%20provided); _gat=1; __utmb=117457241.1.10.1492414070; __utmc=117457241; __utmt=1'
I want exactly the same behaviour for my website, so that if any user tries to fetch my website's data, he would not be able to.
Whatever binary data you see in the terminal when you make the curl call is not encrypted content; it is just compressed content. You can verify this by running
curl $params > output
You can check if the file matches any known file formats by running
file output
You will see that the result is something similar to
output: gzip compressed data, from Unix
Running gzip -d -c output
will decompress and print the plaintext content to the terminal screen.
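For example, putting these steps together with the URL from the question (requesting only gzip so the commands below apply; the exact output depends on what encoding the server actually sends back):
curl 'https://www.almaconnect.com/suggestions/portaled_institute?q=am' -H 'Accept-Encoding: gzip' -o output
file output
gzip -d -c output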
This happens because you send the Accept-Encoding header with the curl call; unlike the browser, curl does not decompress the result automatically, which is what causes the confusion.
-H 'Accept-Encoding: gzip, deflate, br'
Removing this particular header from the curl call will get you the response as uncompressed plaintext directly. You can try the following command for that.
curl 'https://www.almaconnect.com/suggestions/portaled_institute?q=am' -H 'Host: www.almaconnect.com' -H 'User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:44.0) Gecko/20100101 Firefox/44.0' -H 'Accept: application/json, text/javascript, */*; q=0.01' -H 'Accept-Language: en-US,en;q=0.5' -H 'X-Requested-With: XMLHttpRequest' -H 'Referer: https://www.almaconnect.com/' -H 'Cookie: Almaconnect=; _ga=GA1.2.315358219.1489989532; __utma=117457241.315358219.1489989532.1490871434.1492414070.3; __utmz=117457241.1490871434.2.2.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=(not%20provided); _gat=1; __utmb=117457241.1.10.1492414070; __utmc=117457241; __utmt=1'
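Alternatively, curl's --compressed option sends the Accept-Encoding header for you (with whatever encodings your curl build supports) and automatically decompresses the response, so you keep the bandwidth savings and still see plaintext:
curl --compressed 'https://www.almaconnect.com/suggestions/portaled_institute?q=am'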
almaconnect.com does not really take any extra steps to obfuscate its AJAX responses, and it is generally a bad idea to do so. Whatever method you employ to obfuscate your responses (like checking the HTTP Referer header), people can always come up with counter-measures to defeat it.
It is simply not worth the effort to build a mechanism that will eventually be broken by a determined attacker.
It is not possible.
The answer from gtux explains well why you are seeing the binary characters of compressed content, not of encrypted content.
Note that this very simple version works:
curl 'https://www.almaconnect.com/suggestions/portaled_institute?q=am'
The answer from gaganshera may show you a way to obfuscate content, but that does not really protect it; it just makes it a little harder for people to see, since the decryption code is in the public pages.
Your site can either be protected by security (a login that sets a cookie) or be public. If it is protected, the server-side code checks the cookie header. If it is public, there are only ways to obfuscate content, not to protect it.
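As a minimal sketch of the protected case, assuming a hypothetical endpoint https://example.com/api/data that requires a session cookie set at login:
curl -i 'https://example.com/api/data'
would be rejected (for example with a 401 or a redirect to the login page), while
curl -i 'https://example.com/api/data' -H 'Cookie: session=<value-set-after-login>'
would return the data, because the server validates the cookie before responding.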
https://stackoverflow.com/a/14570971/1536382
https://www.quora.com/How-can-we-hide-JSON-data-from-tools-like-Chrome-development-tools-and-Firebug-etc-as-a-security-beyond-https