I'm trying to use the Real-time Photo Updates API to get all pictures with a specific tag as they come in. Since updates from this API really only tell you that new content exists (but not what it is), I am querying for recent media with my tag whenever I get the notification that something has changed.
The problem I am having is that I'm constantly getting duplicate media returned by that query. The documentation says this API does pagination, but I can't get it to work at all.
The documentation says to use min_id and max_id in your query string to control pagination; however, it describes them like this:
MIN_ID - Return media before this min_id.
MAX_ID - Return media after this max_id.
This seems backwards (normally min should return items above the minimum and max should return items below the maximum so that when you specify both you get a bounded range).
The actual JSON I get back from my queries includes a pagination object like this:
"pagination": {
"next_max_tag_id": "1387272337517",
"deprecation_warning": "next_max_id and min_id are deprecated for this endpoint; use min_tag_id and max_tag_id instead",
"next_max_id": "1387272337517",
"next_min_id": "1387272345517",
"min_tag_id": "1387272345517",
"next_url": "https://api.instagram.com/v1/tags/cats/media/recent?access_token=xxx&max_tag_id=1387272337517"
}
The parameter specified in the next_url property is max_tag_id, not max_id like the documentation says.
There is also a deprecation warning stating that next_max_id and min_id are deprecated, but since there are properties in the pagination object with names like that, I don't know whether it's the query parameters or the object properties that are deprecated.
I would think it means the properties, because the query string never used a parameter named next_max_id; but then the deprecation message says to use min_tag_id and max_tag_id, and there is no max_tag_id property on the pagination object (just a next_max_tag_id).
Regardless of all of this conflicting documentation, it doesn't seem to matter what I pass in my query string - I continue to get repeat media in subsequent queries. Can someone help me make sense of this API? All I really want is to get tagged media that is new since my last query.
To get the newest set of grams for a particular tag, use this:
https://api.instagram.com/v1/tags/latergram/media/recent?access_token=TOKEN
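A minimal sketch of that first call in Python (using the requests library; the token is a placeholder and the variable names are mine):

import requests

TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder; substitute a real access token
TAG = "latergram"

# First call: the newest grams for the tag, no pagination parameters yet.
resp = requests.get(
    f"https://api.instagram.com/v1/tags/{TAG}/media/recent",
    params={"access_token": TOKEN},
)
body = resp.json()
grams = body["data"]                     # media items, newest first
pagination = body.get("pagination", {})  # cursors for the follow-up calls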
From that response, you can get newer grams for the same tag by taking the min_tag_id from the response (under pagination) and building a URL like so:
https://api.instagram.com/v1/tags/latergram/media/recent?access_token=TOKEN&min_tag_id=1387332980547
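Continuing the sketch above (the min_tag_id value comes straight from the previous response's pagination object):

# Ask only for grams tagged since the last call.
resp = requests.get(
    f"https://api.instagram.com/v1/tags/{TAG}/media/recent",
    params={"access_token": TOKEN, "min_tag_id": pagination["min_tag_id"]},
)
new_grams = resp.json()["data"]  # empty if nothing new was tagged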
Or you can get the next (older) set of grams by using the next_url parameter from the original response (also under pagination), which looks like:
https://api.instagram.com/v1/tags/latergram/media/recent?access_token=TOKEN&max_tag_id=1387332905573
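Since next_url already embeds the access token and max_tag_id, you can follow it verbatim (again continuing the sketch):

# Page backwards through older grams by following next_url as-is.
next_url = pagination.get("next_url")
if next_url:
    older_grams = requests.get(next_url).json()["data"]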
Make sure your subsequent queries (for new grams of a particular tag) use the min_tag_id returned by the latest response. I did a few tests and didn't see duplicates; however, I was using #latergram, and that one has a high volume of posts.
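To make the "always use the latest min_tag_id" rule concrete, here is a sketch of one polling step (poll_new_grams is a made-up helper name; error handling is omitted):

def poll_new_grams(tag, token, min_tag_id=None):
    """One poll: return (new grams, cursor to pass on the next poll)."""
    params = {"access_token": token}
    if min_tag_id is not None:
        params["min_tag_id"] = min_tag_id
    body = requests.get(
        f"https://api.instagram.com/v1/tags/{tag}/media/recent",
        params=params,
    ).json()
    # Persist this cursor; passing it next time skips everything already seen.
    next_cursor = body.get("pagination", {}).get("min_tag_id", min_tag_id)
    return body["data"], next_cursor

Call it once per real-time notification, always feeding back the cursor it returned the time before.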
@zachallia has answered spot on, but I figure it can't hurt with a sketch:
As the Instagram API says:
MIN_TAG_ID Return media before this min_tag_id.
MAX_TAG_ID Return media after this max_tag_id.
This is counterintuitive, with a slightly nutty flavor. But still, it is possible to make sense of it.
The /tags/MYTAG/media/recent endpoint will give you grams, ordered by how newly they were tagged with MYTAG. You'll not get all grams, of course, just up to the limit set by Instagram:
|yesteryear ------------------ <---- LIMIT ----> now|
If you use min_tag_id like so /tags/MYTAG/media/recent?min_tag_id=X you'll get grams from X and after (aka newer):

|yesteryear ------- min ------- <---- LIMIT ---> now|
If you use max_tag_id like so /tags/MYTAG/media/recent?max_tag_id=Y you'll get grams from Y and before (aka older):

|yesteryear ------- <---- LIMIT ---> max ------- now|
That's how "max" gets to signify "newer" and "min" gets to signify "older".