I've been experimenting a little with color output in the terminal under Python 2.7.3. The ANSI color codes are always rendered flawlessly in the terminal, with this one little exception which I couldn't pinpoint any more precisely than to this particular dictionary definition.
Here's what causes confusion:
color = {
    'white': "\033[1,37m",
    'yellow': "\033[1,33m",
    'green': "\033[1,32m",
    'blue': "\033[1,34m",
    'cyan': "\033[1,36m",
    'red': "\033[1,31m",
    'magenta': "\033[1,35m",
    'black': "\033[1,30m",
    'darkwhite': "\033[0,37m",
    'darkyellow': "\033[0,33m",
    'darkgreen': "\033[0,32m",
    'darkblue': "\033[0,34m",
    'darkcyan': "\033[0,36m",
    'darkred': "\033[0,31m",
    'darkmagenta': "\033[0,35m",
    'darkblack': "\033[0,30m",
    'off': "\033[0,0m"
}
yellow = "\033[1;33m"
off = "\033[0;0m"
print color['yellow'] + "string to render" + color['off'] # fails to render properly
print "%(yellow)sstring to render%(off)s" % color # ditto
print "%sstring to render%s" % (color['yellow'], color['off'])# ditto
print yellow + "string to render" + off # as intended
import pprint
pp = pprint.PrettyPrinter(indent=6)
pp.pprint(color)
The output for PrettyPrinter:
{ 'black': '\x1b[1,30m',
'blue': '\x1b[1,34m',
'cyan': '\x1b[1,36m',
'darkblack': '\x1b[0,30m',
'darkblue': '\x1b[0,34m',
'darkcyan': '\x1b[0,36m',
'darkgreen': '\x1b[0,32m',
'darkmagenta': '\x1b[0,35m',
'darkred': '\x1b[0,31m',
'darkwhite': '\x1b[0,37m',
'darkyellow': '\x1b[0,33m',
'green': '\x1b[1,32m',
'magenta': '\x1b[1,35m',
'off': '\x1b[0,0m',
'red': '\x1b[1,31m',
'white': '\x1b[1,37m',
'yellow': '\x1b[1,33m'}
This seems to me to be the proper translation into the hexadecimal escape form (\x1b is simply \033 written in hex). Despite that, the dictionary values are not rendered properly when passed to the print statement. Neither the raw nor the Unicode (out of despair) string literal prefix changes anything. I must be missing something rather obvious. On terminals without UTF-8 support, the Unicode character is simply omitted.
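One quick sanity check is to compare the dictionary value against the literal that does work; repr() shows the stored bytes exactly:
print repr(color['yellow'])         # '\x1b[1,33m'
print repr(yellow)                  # '\x1b[1;33m'
print color['yellow'] == yellow     # False: the two strings are not identical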
I've seen the implementations for termcolor:
if os.getenv('ANSI_COLORS_DISABLED') is None:
    fmt_str = '\033[%dm%s'
    if color is not None:
        text = fmt_str % (COLORS[color], text)
    if on_color is not None:
        text = fmt_str % (HIGHLIGHTS[on_color], text)
    if attrs is not None:
        for attr in attrs:
            text = fmt_str % (ATTRIBUTES[attr], text)
    text += RESET
return text
colorama:
CSI = '\033['

def code_to_chars(code):
    return CSI + str(code) + 'm'

class AnsiCodes(object):
    def __init__(self, codes):
        for name in dir(codes):
            if not name.startswith('_'):
                value = getattr(codes, name)
                setattr(self, name, code_to_chars(value))
And a couple of others. Analytically, they all avoid defining the entire sequence in the dictionary. I do agree that such an approach is lexically reasonable. Yet the fact remains that the escape character from a dictionary value is not interpreted correctly, unlike, say, in a Perl hash, a C++ map<string, string>, or a C struct's (if used like a map) char *string.
And this leads to the question: is there a particular reason, per the standard if possible, why dictionary (let's dub it) interpolation would deviate from a plain string?
Here's the fixed color code dict (tab-indented when editing; SO seemingly strips the tab characters for display):
color = {
    'white': "\033[1;37m",
    'yellow': "\033[1;33m",
    'green': "\033[1;32m",
    'blue': "\033[1;34m",
    'cyan': "\033[1;36m",
    'red': "\033[1;31m",
    'magenta': "\033[1;35m",
    'black': "\033[1;30m",
    'darkwhite': "\033[0;37m",
    'darkyellow': "\033[0;33m",
    'darkgreen': "\033[0;32m",
    'darkblue': "\033[0;34m",
    'darkcyan': "\033[0;36m",
    'darkred': "\033[0;31m",
    'darkmagenta': "\033[0;35m",
    'darkblack': "\033[0;30m",
    'off': "\033[0;0m"
}
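For completeness, with the corrected values the dictionary lookup, the %(name)s interpolation, and plain concatenation all produce byte-identical strings, so all three print statements from above render the same way (a quick check, nothing more):
s1 = color['yellow'] + "string to render" + color['off']
s2 = "%(yellow)sstring to render%(off)s" % color
print s1 == s2      # True: interpolation does not touch the escape character
print s1            # renders in yellow, same as the literal-based version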
I went through your source code and I think the problem is with the color definitions inside the dictionary.
If you observe carefully, your dictionary value for white is \033[1,37m. However, it should be \033[1;37m. Note that you are using a ',' (comma) instead of a ';' (semicolon). As a test, I created a subset of the color dictionary and ran these tests.
>>> color = {'white' :'\033[1;37m', 'yellow':'\033[1;33m', 'off' : '\033[0;0m'}
>>> print color['white'] + 'string' + color['off']
string #this string is white in color
>>> print color['yellow'] + 'string' + color['off']
string #this string is yellow in color
>>> color['yellow'] = '\033[1,33m' #incorrect color code - using a , character instead of ;
>>> print color['yellow'] + 'string' + color['off']
string #prints the string in console default color i.e. not in yellow color
>>>
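As a side note, one way to rule out this kind of typo altogether is to build the table instead of typing each sequence by hand, much like the colorama snippet quoted in the question does. A rough sketch (the helper name make_colors is mine, not from any library):
def make_colors():
    # SGR foreground codes run from 30 (black) to 37 (white); the first
    # parameter (1 or 0) selects bright or normal, joined by a semicolon.
    names = ['black', 'red', 'green', 'yellow', 'blue', 'magenta', 'cyan', 'white']
    table = {}
    for i, name in enumerate(names):
        table[name] = '\033[1;%dm' % (30 + i)
        table['dark' + name] = '\033[0;%dm' % (30 + i)
    table['off'] = '\033[0;0m'
    return table

color = make_colors()
print color['yellow'] + 'string to render' + color['off']   # renders in bright yellow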
Hope this helps.