 

ERROR "Extra data: line 2 column 1" when using pycurl with gzip stream

Thanks for reading.

Background: I am trying to read a streaming API feed that returns data in JSON format and store it in a pymongo collection. The streaming API requires an "Accept-Encoding: gzip" header.

What's happening: The code fails on json.loads with - Extra data: line 2 column 1 - line 4 column 1 (char 1891 - 5597) (see the error log below).

This does NOT happen for every JSON object - it happens at random.

My guess is I am encountering some weird JSON object after every "x" proper JSON objects.

I referenced how to use pycurl if requested data is sometimes gzipped, sometimes not? and Encoding error while deserializing a json object from Google, but so far have been unsuccessful at resolving this error.

Could someone please help me out here?

Error Log: Note: The raw dump of the JSON object below was produced with repr(), which prints the raw representation of the string without resolving the CRLF/LF escapes.


'{"id":"tag:search.twitter.com,2005:207958320747782146","objectType":"activity","actor":{"objectType":"person","id":"id:twitter.com:493653150","link":"http://www.twitter.com/Deathnews_7_24","displayName":"Death News 7/24","postedTime":"2012-02-16T01:30:12.000Z","image":"http://a0.twimg.com/profile_images/1834408513/deathnewstwittersquare_normal.jpg","summary":"Crashes, Murders, Suicides, Accidents, Crime and Naturals Death News From All Around World","links":[{"href":"http://www.facebook.com/DeathNews724","rel":"me"}],"friendsCount":56,"followersCount":14,"listedCount":1,"statusesCount":1029,"twitterTimeZone":null,"utcOffset":null,"preferredUsername":"Deathnews_7_24","languages":["tr"]},"verb":"post","postedTime":"2012-05-30T22:15:02.000Z","generator":{"displayName":"web","link":"http://twitter.com"},"provider":{"objectType":"service","displayName":"Twitter","link":"http://www.twitter.com"},"link":"http://twitter.com/Deathnews_7_24/statuses/207958320747782146","body":"Kathi Kamen Goldmark, Writers\xe2\x80\x99 Catalyst, Dies at 63 http://t.co/WBsNlNtA","object":{"objectType":"note","id":"object:search.twitter.com,2005:207958320747782146","summary":"Kathi Kamen Goldmark, Writers\xe2\x80\x99 Catalyst, Dies at 63 http://t.co/WBsNlNtA","link":"http://twitter.com/Deathnews_7_24/statuses/207958320747782146","postedTime":"2012-05-30T22:15:02.000Z"},"twitter_entities":{"urls":[{"display_url":"nytimes.com/2012/05/30/boo\xe2\x80\xa6","indices":[52,72],"expanded_url":"http://www.nytimes.com/2012/05/30/books/kathi-kamen-goldmark-writers-catalyst-dies-at-63.html","url":"http://t.co/WBsNlNtA"}],"hashtags":[],"user_mentions":[]},"gnip":{"language":{"value":"en"},"matching_rules":[{"value":"url_contains: 
nytimes.com","tag":null}],"klout_score":11,"urls":[{"url":"http://t.co/WBsNlNtA","expanded_url":"http://www.nytimes.com/2012/05/30/books/kathi-kamen-goldmark-writers-catalyst-dies-at-63.html?_r=1"}]}}\r\n{"id":"tag:search.twitter.com,2005:207958321003638785","objectType":"activity","actor":{"objectType":"person","id":"id:twitter.com:178760897","link":"http://www.twitter.com/Mobanu","displayName":"Donald Ochs","postedTime":"2010-08-15T16:33:56.000Z","image":"http://a0.twimg.com/profile_images/1493224811/small_mobany_Logo_normal.jpg","summary":"","links":[{"href":"http://www.mobanuweightloss.com","rel":"me"}],"friendsCount":10272,"followersCount":9698,"listedCount":30,"statusesCount":725,"twitterTimeZone":"Mountain Time (US & Canada)","utcOffset":"-25200","preferredUsername":"Mobanu","languages":["en"],"location":{"objectType":"place","displayName":"Crested Butte, Colorado"}},"verb":"post","postedTime":"2012-05-30T22:15:02.000Z","generator":{"displayName":"twitterfeed","link":"http://twitterfeed.com"},"provider":{"objectType":"service","displayName":"Twitter","link":"http://www.twitter.com"},"link":"http://twitter.com/Mobanu/statuses/207958321003638785","body":"Mobanu: Can Exercise Be Bad for You?: Researchers have found evidence that some people who exercise do worse on ... http://t.co/mTsQlNQO","object":{"objectType":"note","id":"object:search.twitter.com,2005:207958321003638785","summary":"Mobanu: Can Exercise Be Bad for You?: Researchers have found evidence that some people who exercise do worse on ... 
http://t.co/mTsQlNQO","link":"http://twitter.com/Mobanu/statuses/207958321003638785","postedTime":"2012-05-30T22:15:02.000Z"},"twitter_entities":{"urls":[{"display_url":"nyti.ms/KUmmMa","indices":[116,136],"expanded_url":"http://nyti.ms/KUmmMa","url":"http://t.co/mTsQlNQO"}],"hashtags":[],"user_mentions":[]},"gnip":{"language":{"value":"en"},"matching_rules":[{"value":"url_contains: nytimes.com","tag":null}],"klout_score":12,"urls":[{"url":"http://t.co/mTsQlNQO","expanded_url":"http://well.blogs.nytimes.com/2012/05/30/can-exercise-be-bad-for-you/?utm_medium=twitter&utm_source=twitterfeed"}]}}\r\n'
json exception: Extra data: line 2 column 1 - line 4 column 1 (char 1891 - 5597)
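The exception itself is easy to reproduce in isolation: json.loads parses exactly one top-level JSON value, so a string holding two \r\n-separated objects (as in the dump above) raises "Extra data" at the start of the second object. A minimal sketch:

```python
import json

# Two complete JSON objects separated by \r\n, as in the stream dump above
payload = '{"id": 1}\r\n{"id": 2}\r\n'

try:
    json.loads(payload)
except ValueError as e:
    # json.loads accepts only one top-level value; the second object is "Extra data"
    print(e)
```

The reported position (line 2 column 1) points at the first character after the \r\n separator, which is exactly where the second object in the stream begins.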

Header Output:


HTTP/1.1 200 OK

Content-Type: application/json; charset=UTF-8

Vary: Accept-Encoding

Date: Wed, 30 May 2012 22:14:48 UTC

Connection: close

Transfer-Encoding: chunked

Content-Encoding: gzip

get_stream.py:


#!/usr/bin/env python
import sys
import pycurl
import json
import pymongo

STREAM_URL = "https://stream.test.com:443/accounts/publishers/twitter/streams/track/Dev.json"
AUTH = "userid:passwd"

DB_HOST = "127.0.0.1"
DB_NAME = "stream_test"

class StreamReader:
    def __init__(self):
        try:
            self.count = 0
            self.buff = ""
            self.mongo = pymongo.Connection(DB_HOST)
            self.db = self.mongo[DB_NAME]
            self.raw_tweets = self.db["raw_tweets_gnip"]
            self.conn = pycurl.Curl()
            self.conn.setopt(pycurl.ENCODING, 'gzip')
            self.conn.setopt(pycurl.URL, STREAM_URL)
            self.conn.setopt(pycurl.USERPWD, AUTH)
            self.conn.setopt(pycurl.WRITEFUNCTION, self.on_receive)
            self.conn.setopt(pycurl.HEADERFUNCTION, self.header_rcvd)
            while True:
                self.conn.perform()
        except Exception as ex:
            print "error occurred: %s" % str(ex)

    def header_rcvd(self, header_data):
        print header_data

    def on_receive(self, data):
        temp_data = data
        self.buff += data
        if data.endswith("\r\n") and self.buff.strip():
            try:
                tweet = json.loads(self.buff, encoding = 'UTF-8')
                self.buff = ""
                if tweet:
                    try:
                        self.raw_tweets.insert(tweet)
                    except Exception as insert_ex:
                        print "Error inserting tweet: %s" % str(insert_ex)
                    self.count += 1

                if self.count % 10 == 0:
                    print "inserted "+str(self.count)+" tweets"
            except Exception as json_ex:
                print "json exception: %s" % str(json_ex)
                print repr(temp_data)



stream = StreamReader()

Fixed Code:


def on_receive(self, data):
    self.buff += data
    if data.endswith("\r\n") and self.buff.strip():
        # NEW: Split the buff at \r\n to get a list of JSON objects and iterate over them
        json_objs = self.buff.split("\r\n")
        self.buff = ""
        for obj in json_objs:
            if obj.strip():
                try:
                    tweet = json.loads(obj, encoding = 'UTF-8')
                except Exception as json_ex:
                    print "JSON Exception occurred: %s" % str(json_ex)
                    continue
                self.raw_tweets.insert(tweet)
                self.count += 1
Asked May 30 '12 by Sagar Hatekar

1 Answer

Try pasting your dumped string into a JSON beautifier such as jsbeautifier.

You'll see that it's actually two JSON objects, not one, which json.loads can't deal with.

They are separated by \r\n, so it should be easy to split them.

The problem is that the data argument passed to on_receive doesn't necessarily end with \r\n even when it contains a newline. As your dump shows, the separator can also appear in the middle of a chunk, so looking only at the end of the data chunk isn't enough.
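That buffering logic can be sketched as a small helper that keeps any trailing partial object for the next chunk instead of checking how the chunk ends (a minimal sketch; the LineBuffer name and feed method are illustrative, not part of the original code):

```python
import json

class LineBuffer(object):
    """Accumulate raw chunks and yield each complete CRLF-delimited JSON object."""

    def __init__(self):
        self.buff = ""

    def feed(self, data):
        self.buff += data
        parts = self.buff.split("\r\n")
        # The last element is "" if the chunk ended exactly on \r\n,
        # otherwise it is a partial object; either way, keep it for the next chunk.
        self.buff = parts.pop()
        for part in parts:
            if part.strip():
                yield json.loads(part)
```

With this, a separator arriving mid-chunk is handled the same as one at the end: feeding '{"id": 1}\r\n{"id"' followed by ': 2}\r\n' yields both objects, with the split second object reassembled across chunks.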

Answered by mata