How do I extract the data from that URL? I only want to print out the "networkdiff" value: 58954.60268219. This is how far I've gotten:
from urllib import urlopen  # Python 2; in Python 3 urlopen lives in urllib.request
url = urlopen('http://21.luckyminers.com/index.php?page=api&action=getpoolstatus&api_key=8dba7050f9fea1e6a554bbcf4c3de5096795b253b45525c53562b72938771c41').read()  # read() returns the response body as a string
print url
This is what the API returns when I print url:
{
    "getpoolstatus": {
        "version": "1.0.0",
        "runtime": 16.618967056274,
        "data": {
            "pool_name": "21 Coin Pool @ Luckyminers.com",
            "hashrate": 485426748,
            "efficiency": 98.1,
            "workers": 14,
            "currentnetworkblock": 12025,
            "nextnetworkblock": 12026,
            "lastblock": 12023,
            "networkdiff": 58954.60268219,
            "esttime": 521.61956775542,
            "estshares": 241478052.58625,
            "timesincelast": 427,
            "nethashrate": 485426748
        }
    }
}
You can use the json module to parse the response into a Python dictionary and get straight to the value, like so:
import json
result = json.loads(url)  # result is now a dict
print '"networkdiff":', result['getpoolstatus']['data']['networkdiff']
To do this multiple times (to answer your question in the comments section):
import json
import urllib
urls = {'joe': 'url1', 'jack': 'url2', 'jane': 'url3'}
for who, link in urls.items():
    # urlopen() returns a file-like object, so read() the body before parsing it
    result = json.loads(urllib.urlopen(link).read())  # result is now a dict
    print 'For %s: "networkdiff":' % who, result['getpoolstatus']['data']['networkdiff']
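If you want to keep the values around instead of just printing them, a small variation of the loop above collects each pool's difficulty into a dict (the names and the 'url1'/'url2'/'url3' placeholders are illustrative, as above; diffs is just a name chosen for this sketch):

import json
import urllib
urls = {'joe': 'url1', 'jack': 'url2', 'jane': 'url3'}
diffs = {}
for who, link in urls.items():
    data = json.loads(urllib.urlopen(link).read())
    diffs[who] = data['getpoolstatus']['data']['networkdiff']
print diffs  # e.g. {'joe': ..., 'jack': ..., 'jane': ...}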
Convert the response to JSON and then read the value out of it:
from urllib import urlopen
import simplejson as json
raw = urlopen('http://21.luckyminers.com/index.php?page=api&action=getpoolstatus&api_key=8dba7050f9fea1e6a554bbcf4c3de5096795b253b45525c53562b72938771c41').read()  # raw JSON string
data = json.loads(raw)  # parse into a dict
print data.get('getpoolstatus').get('data').get('networkdiff')
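One caveat with chaining .get() like that: if an intermediate key is missing, .get() returns None and the next .get() raises AttributeError. A slightly safer sketch, passing empty dicts as fallbacks (the stdlib json module works just as well as simplejson here):

from urllib import urlopen
import json

raw = urlopen('http://21.luckyminers.com/index.php?page=api&action=getpoolstatus&api_key=8dba7050f9fea1e6a554bbcf4c3de5096795b253b45525c53562b72938771c41').read()
data = json.loads(raw)
# Fall back to an empty dict at each level, so a missing key yields None instead of a crash
print data.get('getpoolstatus', {}).get('data', {}).get('networkdiff')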