I am not using cURL; I am only using jQuery, AJAX and plain JS to get information from the GitHub API. I am using a URL like this to get information about issues: https://api.github.com/repos/jquery/jquery/issues
But the result comes in multiple pages, since the GitHub API paginates its results. With cURL you can inspect the response headers, which include the number of result pages, but I am not using cURL; I am requesting data from the above API URL directly with jQuery and AJAX, so I am unable to get the header information. I want to count the open and closed issues and the open and closed PRs for the jquery/jquery repository (and some other repositories as well) using the above URL, but since some repositories have a lot of issues, I am getting the results in multiple pages.
I know about the "page" and "per_page" GET parameters that can be passed through the URL to request a particular result page and to set the number of results per page (e.g. 100), like this: https://api.github.com/repos/jquery/jquery/issues?page=5&per_page=100
I don't want to check the number of result pages manually. I want my script to determine the number of result pages automatically, so that I can create a loop and iterate through all the pages to get information about all the issues.
For example, if I know that the number of result pages is 8, I can create a loop like this to get information about all the issues from all the result pages:
var number_of_pages = 8;
var issues_information = [];
for (var nof = 1; nof <= number_of_pages; nof++) {
    var URL = 'https://api.github.com/repos/jquery/jquery/issues?page=' + nof + '&per_page=100';
    $.getJSON(URL, function (json) {
        issues_information = issues_information.concat(json); // collect each page's results
    });
}
Where "issues_information" will get JSON data that is fetched from Github API. But I am unable to get the count of result pages for a particular API call.
Can anybody tell me how to get number of result pages from Github API for a request? Please give an example code, URL format etc.
One workaround is to create a GitHub Pages repository, put the JSON data you need in there as a file, and your custom URL will serve all of that data. From there you make an API call to your GitHub Pages URL rather than to the API server.
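For example, if the data had been exported to a file named issues.json in such a repository (the user name, repository name and file name below are all hypothetical), a single request fetches everything at once:

$.getJSON('https://your-username.github.io/your-repo/issues.json', function (issues) {
    console.log('total issues cached: ' + issues.length); // no pagination needed
});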
So, to list all public repos from a user, send a GET request to https://api.github.com/users/<USER-NAME>/repos, replacing <USER-NAME> with the actual user whose repositories you want to retrieve.
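With jQuery that is a single call (using "jquery" as a sample user name; note that this endpoint is paginated too, 30 items per page by default):

$.getJSON('https://api.github.com/users/jquery/repos', function (repos) {
    repos.forEach(function (repo) {
        console.log(repo.full_name); // e.g. "jquery/jquery"
    });
});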
From the docs:
Information about pagination is provided in the Link header of an API call. For example, let's make a curl request to the search API, to find out how many times Mozilla projects use the phrase addClass:
curl -I "https://api.github.com/search/code?q=addClass+user:mozilla"
The -I parameter indicates that we only care about the headers, not the actual content. In examining the result, you'll notice some information in the Link header that looks like this:
Link: <https://api.github.com/search/code?q=addClass+user%3Amozilla&page=2>; rel="next", <https://api.github.com/search/code?q=addClass+user%3Amozilla&page=34>; rel="last"
Let's break that down. rel="next" says that the next page is page=2. This makes sense, since by default, all paginated queries start at page 1. rel="last" provides some more information, stating that the last page of results is on page 34. Thus, we have 33 more pages of information about addClass that we can consume. Nice!
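Since the question uses jQuery, note that you can read that same Link header from the jqXHR object passed to the success callback and take the page count from the rel="last" entry. A minimal sketch (the regular expression is just one simple way to pull the number out):

$.ajax({
    url: 'https://api.github.com/repos/jquery/jquery/issues?per_page=100',
    success: function (data, status, xhr) {
        var link = xhr.getResponseHeader('Link'); // null if everything fits on one page
        var number_of_pages = 1;
        if (link) {
            // e.g. <...&page=8>; rel="last"  ->  capture the 8
            var match = link.match(/[?&]page=(\d+)>;\s*rel="last"/);
            if (match) {
                number_of_pages = parseInt(match[1], 10);
            }
        }
        console.log('number of result pages: ' + number_of_pages);
    }
});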
So to iterate over all the pages, just keep requesting pages until there is no "next" in the Link header.
Here is some Python code showing the logic:
import json
import requests

GH_API_URL = 'https://api.github.com/'

# org, username and password are assumed to be defined elsewhere
results = []
params = {'page': 1, 'per_page': 100}
another_page = True
api = GH_API_URL + 'orgs/' + org['login'] + '/teams'
while another_page:  # the list of teams is paginated
    r = requests.get(api, params=params, auth=(username, password))
    json_response = json.loads(r.text)
    results.append(json_response)
    if 'next' in r.links:  # check if there is another page of teams
        api = r.links['next']['url']
    else:
        another_page = False
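And since the question is about jQuery rather than Python, here is the same follow-the-"next"-link logic sketched with $.ajax. This is only a sketch: it assumes the browser is allowed to read the response's Link header, and the nextPageUrl helper is my own, not part of any library. The state=all parameter asks for both open and closed issues, and since this endpoint also returns pull requests (they carry a pull_request key), you can count issues and PRs separately:

var all_items = [];

// Hypothetical helper: extract the URL tagged rel="next" from a Link header.
function nextPageUrl(linkHeader) {
    if (!linkHeader) return null;
    var match = linkHeader.match(/<([^>]+)>;\s*rel="next"/);
    return match ? match[1] : null;
}

function fetchPage(url) {
    $.ajax({
        url: url,
        success: function (items, status, xhr) {
            all_items = all_items.concat(items);
            var next = nextPageUrl(xhr.getResponseHeader('Link'));
            if (next) {
                fetchPage(next); // keep going until there is no rel="next"
            } else {
                var prs = all_items.filter(function (i) { return i.pull_request; }).length;
                console.log(all_items.length + ' items total, of which ' + prs + ' are PRs');
            }
        }
    });
}

fetchPage('https://api.github.com/repos/jquery/jquery/issues?per_page=100&state=all');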