 

"Users" sending GET requests to a website, when POST is expected

I keep seeing weird behaviour in our logs, where URLs that should only ever be accessed via a POST request are being called via GET. The URLs include some that are only ever constructed via JavaScript, so you wouldn't expect a regular spider to come across them. If I search our logs for one of the IPs these requests come from, it looks like that user has only ever sent us GET requests.
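For what it's worth, this is roughly how I'm pulling that per-IP history out of the access logs (a minimal sketch: the log path and client IP are placeholders, and it assumes the usual combined log format):

    #!/usr/bin/env python3
    # Sketch: list every HTTP method + path one client IP has used,
    # assuming an Apache/nginx combined-format access log.
    import re
    from collections import Counter

    LOG_PATH = "/var/log/nginx/access.log"   # placeholder path
    CLIENT_IP = "203.0.113.42"               # placeholder suspicious IP

    # combined format: ip - - [date] "METHOD /path HTTP/x.y" status ...
    line_re = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)')

    methods = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            m = line_re.match(line)
            if m and m.group(1) == CLIENT_IP:
                methods[m.group(2)] += 1
                print(m.group(2), m.group(3))

    print(methods)   # e.g. Counter({'GET': 118}) and not a single POST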

It doesn't seem like typical bot behaviour - the requests are spread out over time, rather than hitting our server with a burst of requests in a short timeframe. The user agents are all regular browsers. However - and this is slightly speculative - it doesn't really look like a human browsing the site either, since the requests jump all over the place rather than following one link to the next.

Does anyone else see this sort of behaviour on their site? Any suggestions what causes it?

asked Apr 20 '11 by Jonathan del Strother


3 Answers

It may be somebody fishing for exploits in your site. They would analyse your forms and then craft their own URLs, looking for weaknesses or unconventional ways to use the service. If it's usually the same IP address, you can probably assume that's the case.

One example: if you're a streaming media provider, somebody may be trying to piece together the source URLs for a video-downloader script. Often, though, it's simply spammers looking to relay through your contact forms.
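To give a sense of how cheap this probing is, replaying one of your forms as a GET takes a few lines (a sketch only; the /contact URL and field names are invented, not taken from any real site):

    # Sketch: replaying a POST form as a crafted GET, stdlib only.
    # The URL and field names here are hypothetical.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    fields = {"name": "test", "message": "probe"}  # scraped from the form
    # Rather than POSTing the form, tack the fields onto the query string;
    # plenty of handlers read parameters without ever checking the method.
    url = "https://example.com/contact?" + urlencode(fields)
    with urlopen(url) as resp:                     # goes out as a plain GET
        print(resp.status)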

Don't assume too much from IP addresses and user agents. The former can be proxied (through networks like Tor) and the latter can be changed at will, so the fact that the IPs and user agents vary doesn't mean it isn't the same user generating the requests.

answered by SpliFF


I often scrape websites for information, and when I'm being really lazy I submit everything as a GET instead of using POST; many times a CGI that requires a POST will accept a GET anyway. I set my script up to use a random User-Agent from a list: Safari on iPad, Firefox on XP, or Internet Exploder on Vista.
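The whole pattern fits in a dozen lines of Python (a sketch with a made-up target URL; the user-agent strings are representative, not exact copies of real browser strings):

    # Sketch of the lazy-scraper pattern: everything as a GET, with a
    # User-Agent picked at random per request. The target URL is made up.
    import random
    from urllib.request import Request, urlopen

    USER_AGENTS = [
        # Safari on iPad (representative string)
        "Mozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) AppleWebKit/534.46",
        # Firefox on Windows XP (representative string)
        "Mozilla/5.0 (Windows NT 5.1; rv:2.0) Gecko/20100101 Firefox/4.0",
        # Internet Explorer on Vista (representative string)
        "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0)",
    ]

    req = Request(
        "https://example.com/search?q=widgets",  # a form that expects POST
        headers={"User-Agent": random.choice(USER_AGENTS)},
    )
    with urlopen(req) as resp:                   # GET, never POST
        print(len(resp.read()))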

Who knows, it may be me scraping your site, and getting the points for the answer ;-).

answered by Mike Pennington


Just a wild guess:

  • There is something called a "web accelerator": a browser plugin that pre-fetches links, so that by the time you decide to click one it is already cached. It shouldn't prefetch anything that looks like a query, but maybe one of them judges your URLs suitable for prefetching. Since it runs in the browser, it will at the very least see every URL that JavaScript adds to the document (by means of document.write or DOM access).

  • A "web accelerator" can also be implemented as part of web proxy. It seems less likely, because it would have to interpret the javascript, but if the URLs appear in the javascript in full, it might simply be grepping any text for anything that looks like a URL and might find them.

This would explain why the requests are spread out (such a thing would fire a couple of requests each time a real user visits the site), why the user-agent strings correspond to actual browsers (a browser plugin reports the browser's own user-agent string), and why they jump all over the place (it tries to prefetch several links at once, and the heuristic that picks which links to prefetch probably doesn't work well with your site).

answered by Jan Hudec