Can someone help me customize an existing code sample?
I can see from the following article how to connect to Gmail and download content, but I can't figure out how to search for a specific sender's email and download only the timestamp and body.
ARTICLE: How can I download all emails with attachments from Gmail?
I specifically want to grab the emails from "[email protected]" for the last 5 days and download the send time and body of the emails. I'll then parse this to determine which emails I need to use.
I'm self-taught and am having a hard time customizing the script above to do this.
Any help is much appreciated. Thanks.
JD
print 'Proceeding'
import email
import getpass
import imaplib
import os
import sys

userName = '[email protected]'
passwd = 'yourpassword'
directory = '/full/path/to/the/directory'
detach_dir = '.'
if 'DataFiles' not in os.listdir(detach_dir):
    os.mkdir('DataFiles')
try:
    imapSession = imaplib.
I suggest using IMAPClient, as it papers over many of the more esoteric aspects of IMAP. The following snippet pulls messages matching your criteria, parses the raw message data into email.message.Message instances, and prints the Date and From headers.
from datetime import datetime, timedelta
import email

from imapclient import IMAPClient

HOST = 'imap.gmail.com'
USERNAME = 'username'
PASSWORD = 'password'

today = datetime.today()
cutoff = today - timedelta(days=5)

## Connect, login and select the INBOX
server = IMAPClient(HOST, use_uid=True, ssl=True)
server.login(USERNAME, PASSWORD)
select_info = server.select_folder('INBOX')

## Search for relevant messages
## see http://tools.ietf.org/html/rfc3501#section-6.4.5
messages = server.search(['FROM', '[email protected]',
                          'SINCE', cutoff.date()])

response = server.fetch(messages, ['RFC822'])
for msgid, data in response.items():
    msg = email.message_from_bytes(data[b'RFC822'])
    print('ID %d: From: %s Date: %s' % (msgid, msg['From'], msg['Date']))
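Since you also want the body and a usable timestamp, here is a sketch of how you might extract them from each parsed message. The helper name get_plain_body is mine, not part of any library; inside the fetch loop above you would call it on each msg. It is demonstrated offline with a hand-built message so it can be run without a Gmail connection.

```python
import email
from email.utils import parsedate_to_datetime

def get_plain_body(msg):
    """Return the first text/plain payload of a (possibly multipart) message."""
    if msg.is_multipart():
        for part in msg.walk():
            if part.get_content_type() == 'text/plain':
                charset = part.get_content_charset() or 'utf-8'
                return part.get_payload(decode=True).decode(charset, 'replace')
        return ''
    charset = msg.get_content_charset() or 'utf-8'
    return msg.get_payload(decode=True).decode(charset, 'replace')

# Offline demonstration: a hand-built raw message standing in for data[b'RFC822']
raw = (b"From: [email protected]\r\n"
       b"Date: Mon, 06 Aug 2012 10:15:00 +0000\r\n"
       b"Content-Type: text/plain; charset=utf-8\r\n"
       b"\r\n"
       b"Hello from the server\r\n")

msg = email.message_from_bytes(raw)
sent_at = parsedate_to_datetime(msg['Date'])  # a datetime you can compare/sort
body = get_plain_body(msg)
```

The Date header is free-form text, so email.utils.parsedate_to_datetime is the standard-library way to turn it into a datetime for your later filtering.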