I'm trying to manipulate IMAP flags to mark a single message as unread, after searching for all "UNSEEN" messages and then iterating over them and fetching each one.
I'm not sure how to mark messages unread/unseen on a per-message basis. All I have is the message number, and I don't know how to obtain and store the UID properly so that only that single message is affected.
The answer in a similar question didn't appear to work; it sets the wrong messages as 'unread'. How do I go about setting a single mail message that I've fetched back to 'unread'?
I was asked to give more information. With the 'secret' specifics stripped out, this is the existing code I've tried. It processes each message according to some rules, stores the message number and subject in a pickle file, and then attempts to set the message back to "UNREAD", since anything fetched during a run is automatically marked as 'read' on the server:
import email
import imaplib
import re
import cPickle


def main():
    conn = imaplib.IMAP4('SERVER')
    conn.login('username', 'passphrase')
    conn.select('inbox')
    (status, nums) = conn.search(None, '(UNSEEN)')
    msgnums = map(int, nums[0].split())
    for i in msgnums:
        try:
            raw_msg = conn.fetch(i, '(RFC822)')
            msg = email.message_from_string(raw_msg[1][0][1])
            body = "Date: %s\r\nSender: %s\r\nSubject: %s\r\n\r\n" % (msg['Date'], msg['From'], msg['Subject'])
            msg_date = re.sub('/', '-', msg['Date']).replace(":", ".")
            fdate = re.sub(r'\s+', '_', msg_date).replace(",", "")
            print "Checking message: %s" % msg['Subject']
            if not msg['Subject']:
                continue  # fname = "unknown_msg%d_%s" % (i, fdate)
            elif msg['Subject'].lower().rfind('foobar') != -1:
                print "Subject match 'foobar', processing: %s" % msg['Subject']
                # We should have from the pickle an "observed" set of data, both subjects and message numbers.
                if msg['Subject'] in PICKLED_MESSAGES['observed']['subjects']:
                    print "Already handled this message, moving on to next item."
                    # Since this was already observed, remove it so things don't rerun it later.
                    # noinspection PyBroadException
                    try:
                        PICKLED_MESSAGES['observed']['subjects'].remove(msg['Subject'])
                        PICKLED_MESSAGES['observed']['msgnums'].remove(i)
                    except:
                        pass
                    continue
            else:
                continue
            # Do stuff with the message to store it in a special way on the filesystem.
            # Note that we've now looked at the message, so next run we can see
            # what was handled on the last run.
            PICKLED_MESSAGES['observed']['msgnums'].append(i)
            PICKLED_MESSAGES['observed']['subjects'].append(msg['Subject'])
            print "PICKLED:\n%s" % PICKLED_MESSAGES['observed']
            conn.uid('STORE', str(i), '-FLAGS', '(\Seen)')
        except Exception:
            conn.uid('STORE', str(i), '-FLAGS', '(\Seen)')
            PICKLED_MESSAGES['observed']['msgnums'].remove(i)
            PICKLED_MESSAGES['observed']['subjects'].remove(msg['Subject'])
            print "PICKLED:\n%s\n" % PICKLED_MESSAGES
        finally:
            # Store the pickle file so we can use it next run.
            cPickle.dump(PICKLED_MESSAGES, open('observed_msgs.pkl', 'wb'))


if __name__ == "__main__":
    # Pre-runtime checks (is IMAP up, etc.) run first, then this:
    # Initialize the PICKLED_MESSAGES data with pickle data, or an empty
    # structure for the pickle.
    # noinspection PyBroadException
    try:
        PICKLED_MESSAGES = cPickle.load(open('observed_msgs.pkl', 'rb'))
    except Exception:
        PICKLED_MESSAGES = {
            'observed': {
                'msgnums': [],
                'subjects': [],
            },
        }
    # If all checks are satisfied, continue and process the main() directives.
    try:
        main()
    except Exception as e:
        print("CRITICAL An unhandled error has occurred: %s" % str(e))
        exit()
if __name__ == "__main__":
# pre-runtime checks - is IMAP up, etc. run first, then this:
# Initialize the PICKLED_MESSAGES data with pickle data or an empty
# structure for the pickle.
# noinspection PyBroadException
try:
PICKLED_MESSAGES = cPickle.load(open('observed_msgs.pkl', 'rb'))
except Exception as e:
PICKLED_MESSAGES = {
'observed': {
'msgnums': [],
'subjects': [],
},
}
# If all checks satisfied, continue and process the main() directives.
try:
main()
except Exception as e:
print("CRITICAL An unhandled error has occurred: %s" % str(e))
exit()
However, it's not setting the correct message as 'unread' when using the methods I've seen suggested. So I'm not entirely sure whether I'm not getting the proper UID of the message, or whether there's something else I'm missing here.
Well, I feel stupid today.
Apparently the message (sequence) number being iterated over and the UID that conn.uid(...) expects are NOT necessarily the same number. It turns out you have to fetch the UID separately and do a bit of post-fetch processing to extract just the UID to pass along.
The Original Approach
I was able to get the UID with the following, inside the for loop above:
for i in msgnums:
    # ...
    msg_uid = conn.fetch(i, 'UID')[1][0].split()[2].strip('()')
    # ...
This gave me the UID of the message, which is what conn.uid was expecting, rather than the plain message (sequence) number. I feel kind of stupid for not realizing this, but it seems to have fixed the issue.
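For context, here's a minimal sketch of how that extracted UID then feeds the STORE call to clear the \Seen flag (the response format shown in the comment is the typical imaplib shape, e.g. '1 (UID 4827)'; the UID value is illustrative):

for i in msgnums:
    # The FETCH UID response data is a list like ['1 (UID 4827)'],
    # so split()[2] is '4827)' and strip('()') leaves just the UID.
    msg_uid = conn.fetch(i, 'UID')[1][0].split()[2].strip('()')
    # ... fetch and process the message by sequence number as before ...
    # Clear the \Seen flag on exactly this one message, addressed by UID.
    conn.uid('STORE', msg_uid, '-FLAGS', '(\Seen)')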
Updated Approach #1 (thanks to @Max in comments)
I replaced all the search/fetch/store commands with their UID equivalents:
conn.search(None, '(UNSEEN)') becomes conn.uid('SEARCH', None, '(UNSEEN)')
conn.fetch(i, '(RFC822)') becomes conn.uid('FETCH', i, '(RFC822)')
conn.store(i, '-FLAGS', '(\Seen)') becomes conn.uid('STORE', i, '-FLAGS', '(\Seen)')
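Put together, a trimmed-down sketch of the loop with only UID commands looks like this (since uid('SEARCH', ...) already returns UIDs, those numbers can be passed straight to the UID FETCH and STORE calls):

import email
import imaplib

conn = imaplib.IMAP4('SERVER')
conn.login('username', 'passphrase')
conn.select('inbox')
# UID SEARCH: the numbers returned are UIDs, not sequence numbers.
(status, uids) = conn.uid('SEARCH', None, '(UNSEEN)')
for uid in uids[0].split():
    # UID FETCH: same response shape as conn.fetch().
    raw_msg = conn.uid('FETCH', uid, '(RFC822)')
    msg = email.message_from_string(raw_msg[1][0][1])
    # ... process the message ...
    # UID STORE: clear \Seen on exactly this message.
    conn.uid('STORE', uid, '-FLAGS', '(\Seen)')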
Updated Approach #2 (inspired by #1, but going a step further)
I basically got tired of writing out the UID commands, and I also need the same UID-based functionality in another program that uses similar IMAP interfaces and commands. So I wrote an imaplib_extension.py module that extends imaplib's IMAP4 and IMAP4_SSL classes, overriding the "search", "fetch", and "store" methods so that they keep their familiar signatures but internally issue the corresponding UID commands and return UID-based result sets.
This is what is in my imaplib_extension.py file. I just import IMAP4 or IMAP4_SSL from this module instead of from imaplib directly, and replace any imaplib.IMAP4 and imaplib.IMAP4_SSL calls with plain IMAP4 or IMAP4_SSL calls. Therefore, there is no need to import imaplib in the calling code; just use from imaplib_extension import IMAP4 (or IMAP4_SSL, accordingly):
import imaplib


class IMAP4(imaplib.IMAP4):
    def search(self, charset, *criteria):
        # conn.uid('SEARCH', charset, criteria)
        return self.uid('SEARCH', charset, " ".join(criteria))

    def fetch(self, message_set, message_parts):
        # conn.uid('FETCH', msgset, parts)
        return self.uid('FETCH', message_set, message_parts)

    def store(self, message_set, command, flags):
        # conn.uid('STORE', msg_uid, '-FLAGS', '(\Seen)')
        return self.uid('STORE', message_set, command, flags)


# noinspection PyPep8Naming
class IMAP4_SSL(imaplib.IMAP4_SSL):
    def search(self, charset, *criteria):
        # conn.uid('SEARCH', charset, criteria)
        return self.uid('SEARCH', charset, " ".join(criteria))

    def fetch(self, message_set, message_parts):
        # conn.uid('FETCH', msgset, parts)
        return self.uid('FETCH', message_set, message_parts)

    def store(self, message_set, command, flags):
        # conn.uid('STORE', msg_uid, '-FLAGS', '(\Seen)')
        return self.uid('STORE', message_set, command, flags)
I much prefer using this extension of imaplib, because the command structure remains identical to the existing commands, but it properly works with UIDs instead of 'message numbers' that might not be UIDs.
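Usage then looks exactly like the original code; only the import changes (server and credentials here are placeholders, as above):

from imaplib_extension import IMAP4

conn = IMAP4('SERVER')
conn.login('username', 'passphrase')
conn.select('inbox')
# These calls now issue UID SEARCH / UID FETCH / UID STORE under the hood,
# so every number handled below is a UID.
(status, uids) = conn.search(None, '(UNSEEN)')
for uid in uids[0].split():
    raw_msg = conn.fetch(uid, '(RFC822)')
    # ... process raw_msg[1][0][1] as before ...
    conn.store(uid, '-FLAGS', '(\Seen)')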
Updated Approach #3
After realizing that I've needed this in other Python applications, I got off my butt and published imaplibext on PyPI, which is basically an improved and fleshed-out version of approach #2 above. It has far better error handling, plus the ability to specify a timeout for the IMAP connection socket, which you can't do directly with imaplib.IMAP4 or imaplib.IMAP4_SSL, and it is essentially a drop-in replacement for imaplib (though at its core it still uses imaplib).
The code for this also exists on GitHub for general use, improvement suggestions, and issue reports.
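Since imaplibext is meant as a drop-in replacement, I'd expect usage along these lines; this is only a sketch that assumes it exposes the same IMAP4/IMAP4_SSL class names as the extension module above, so check the package's README for the exact interface:

# pip install imaplibext
# Sketch only -- assumes imaplibext provides drop-in IMAP4/IMAP4_SSL classes
# like the imaplib_extension.py module shown earlier.
from imaplibext import IMAP4_SSL

conn = IMAP4_SSL('SERVER')
conn.login('username', 'passphrase')
conn.select('inbox')
(status, uids) = conn.search(None, '(UNSEEN)')
for uid in uids[0].split():
    # Mark each matching message back as unread, addressed by UID.
    conn.store(uid, '-FLAGS', '(\Seen)')
conn.logout()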