A twitter bot
August 21, 2009
One of the first experiments I did with Python and Twitter was a bot. I was interested in testing how an interactive application would work over Twitter, and a reasonable model for me was one that answers back when a user sends it a message, after performing a given action.
To test it, I built this bot, and it is kinda running at http://twitter.com/interweb. Send it a message and it answers back with the “feeling lucky” result of a Google query. Depending on your privacy settings, you may need to follow it to see the answer. Try it with @interweb test.
The code is built around the latest version of the python-twitter library, and it is pretty straightforward to use and adapt for other bots. It runs from the crontab, polling the API for new messages, and saves the answered ones in a pickled db file.
For a production-grade bot, you should use a queue and run it as a daemon, with a longer interval between API checks for new statuses. Don’t forget to ask Twitter for a higher rate limit, so your application doesn’t get blacklisted. You could also take advantage of geocoding data to narrow the answers, along with other truly web 3.14 ideas!
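As a rough illustration of that queue-plus-daemon idea, here is a minimal sketch in modern Python: a worker thread consumes mentions from a queue while the main loop polls and enqueues. The fake_get_replies function is a stand-in of my own invention for the real api.GetReplies() call, just so the shape of the loop is visible; a real bot would also sleep POLL_INTERVAL seconds between polling passes.

```python
import queue
import threading

POLL_INTERVAL = 60  # seconds between API checks; keep it high to respect rate limits

mentions = queue.Queue()
answered = []

def fake_get_replies(batch):
    # Stand-in for api.GetReplies(); a real bot would hit the Twitter API here.
    return batch

def worker():
    # Consume mentions from the queue and "answer" them one by one.
    while True:
        status = mentions.get()
        if status is None:  # sentinel: shut the worker down
            break
        answered.append('@%s done' % status['user'])
        mentions.task_done()

def poll_once(batch):
    # One polling pass: push every new mention onto the queue.
    for status in fake_get_replies(batch):
        mentions.put(status)

t = threading.Thread(target=worker)
t.start()
poll_once([{'user': 'alice'}, {'user': 'bob'}])
mentions.join()     # wait until the worker has processed everything
mentions.put(None)  # stop the worker
t.join()
print(answered)     # ['@alice done', '@bob done']
```

Decoupling fetching from answering this way means a slow Google query never delays the next poll, and the daemon can be throttled independently of the answer rate.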
As usual, here goes the code. Cheers !
import os
import pickle
import urllib2
from urllib import urlencode

import simplejson
import twitter

GOOGLE_API_URL = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s'

my_user_name = ''  # here goes your twitter username
my_pass = ''       # here goes the passwd

filename = "statuses.db"

# Load the ids of already-answered statuses, if the db file exists.
if os.path.exists(filename):
    last_statuses = pickle.load(file(filename, 'r+b'))
else:
    last_statuses = {}

api = twitter.Api(username=my_user_name, password=my_pass)


def query(wot):
    params = {'q': wot, 'rsz': 'small'}
    url = GOOGLE_API_URL % urlencode(params)
    try:
        # Query Google
        resp = urllib2.urlopen(url)
    except urllib2.URLError, e:
        print e.reason
        return "Ouch, got an error trying to search what you asked me."
    print resp.geturl()
    response = resp.read()
    resp.close()
    # Parse response
    try:
        data = simplejson.loads(response)
        results = data['responseData']['results']
        if results:
            return 'Feeling lucky: %s (+ %s results from Google)' % (
                results[0]['unescapedUrl'],
                data['responseData']['cursor'].get('estimatedResultCount'))
    except (ValueError, KeyError), e:
        print "Couldn't parse Google response due to %r: %s" % (e, response)
        return "Ouch, got some unknown error."
    return "Google returned 0 results for: %s" % wot


reps = api.GetReplies()
updated = False
for s in reps:
    # Skip our own tweets and anything we have already answered.
    if s.user.screen_name == my_user_name:
        continue
    if s.id in last_statuses:
        continue
    print "%s (status id: %s): %s" % (s.user.screen_name, s.id, s.text)
    # Strip the @mention, search for the rest, and reply with the result.
    q = s.text.replace("@%s" % s.in_reply_to_screen_name, '')
    r = query(q)
    api.PostUpdate('@%s %s' % (s.user.screen_name, r))
    last_statuses[s.id] = True
    updated = True

print last_statuses
if updated:
    pickle.dump(last_statuses, file(filename, 'w+b'))
September 11, 2009 at 8:28 pm
Nice work!
By any chance do you have a similar one in php?
September 12, 2009 at 1:03 pm
No, I never did that in PHP, although I think it may not be difficult. Good luck.