I'm running uwsgi around a Python Flask webapp with these options (among others) to get JSON-encoded log records on stdout:
fmt=$'{"timestamp": "${strftime:%FT%TZ}", "level": "DEBUG", "name": "uwsgi", "message": "${msg}"}\n'
uwsgi --env="TZ=UTC" --log-encoder="json ${fmt}" --logformat="%(status) [%(msecs)ms] %(method) %(uri)"
This nicely encodes the stdout from uwsgi, but unfortunately it also encodes the logging from my application, which is already in JSON format, so I get things like:
{"timestamp": "2017-10-02T22:48:11Z", "level": "DEBUG", "name": "uwsgi", "message": "spawned uWSGI http 1 (pid: 75298)"}
{"timestamp": "2017-10-02T22:48:15Z", "level": "DEBUG", "name": "uwsgi", "message": "{\"timestamp\": \"2017-10-02T22:48:15.200Z\", \"message\": \"Descriptor requested\", \"request\": \"c6b08680-a7c3-11e7-9495-186590cba8eb\", \"name\": \"myapp.main\", \"level\": \"INFO\"}"}
{"timestamp": "2017-10-02T22:48:15Z", "level": "DEBUG", "name": "uwsgi", "message": "200 [11ms] GET /descriptor.json"}
The middle record has been wrapped in the same JSON encoding as other output from uwsgi.
How do I avoid the output from my Flask application being encoded but keep encoding of other output from uwsgi itself?
I've tried various combinations of --log-encoder and --log-req-encoder, but the latter seems to encode nothing. The docs are not very clear on the distinction between the two options.
In the end I had to remove --log-encoder and pass stdout (and stderr) through a process that wraps output in JSON unless it is already encoded as JSON:
function log_json() { python -u -c "
import sys
import json
import datetime

log = dict(name='uwsgi', level='$1')
line = sys.stdin.readline()
while line:
    line = line.strip()
    if line.startswith('{') and line.endswith('}'):
        print(line)
    elif line:
        log['timestamp'] = datetime.datetime.utcnow().isoformat() + 'Z'
        log['message'] = line
        print(json.dumps(log))
    line = sys.stdin.readline()
"; }
# uwsgi's stderr goes straight to the first log_json (tagged ERROR); its stdout is routed via fd 3 to the second log_json (tagged INFO)
{ uwsgi ... 2>&1 1>&3 3>&- | log_json ERROR; } 3>&1 1>&2 | log_json INFO
If you just want to encode request logging, add the --logger-req=stdio option to get --log-req-encoder to encode request logging properly; see the sketch below.
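For example, an untested sketch mirroring the command at the top of the question, with only request logging encoded (fmt is the same format string defined there):
uwsgi --env="TZ=UTC" --logger-req=stdio --log-req-encoder="json ${fmt}" --logformat="%(status) [%(msecs)ms] %(method) %(uri)"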
Not my answer, but I thought I'd contribute as I struggled with this for a couple of hours as well; see here.
The only issue I've found is that the startup messages are still not correctly JSON encoded, I think because there's no default log-encoder. What I get is:
*** Operational MODE: preforking+threaded ***
WSGI app 0 (mountpoint='') ready in 0 seconds on interpreter 0x55ac64f873c0 pid: 115 (default app)
*** uWSGI is running in multiple interpreter mode ***
spawned uWSGI master process (pid: 115)
spawned uWSGI worker 1 (pid: 120, cores: 2)
spawned uWSGI worker 2 (pid: 121, cores: 2)
spawned uWSGI worker 3 (pid: 123, cores: 2)
spawned uWSGI worker 4 (pid: 124, cores: 2)
spawned uWSGI http 1 (pid: 126)
{"timestamp":30-04-2019 04:20:56, "source": "request", "level":"INFO", "message":"[pid: 120|app: 0|req: 1/1] 10.0.75.1 () {48 vars in 3213 bytes} [Tue Apr 30 04:20:56 2019] GET /api/holidays =>generated 273 bytes in 4 msecs (HTTP/1.1 401) 2 headers in 82 bytes (1 switches on core 0)"}
{"timestamp":30-04-2019 04:21:00, "source": "app", "level":"INFO", "message":"/login - START"}
{"timestamp":30-04-2019 04:21:00, "source": "app", "level":"INFO", "message":"Trying to log in..."}
{"timestamp":30-04-2019 04:21:00, "source": "app", "level":"ERROR", "message":"Exception handling call 'login':
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/urllib3/connection.py", line 159, in _new_conn
(self._dns_host, self.port), self.timeout, **extra_kw)
File "/usr/local/lib/python3.6/site-packages/urllib3/util/connection.py", line 80, in create_connection
raise err
File "/usr/local/lib/python3.6/site-packages/urllib3/util/connection.py", line 70, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused"}
My config:
logger = uwsgilog stdio
logger = application stdio
log-route = application {
log-route = uwsgilog ^((?!{).)*$
log-encoder = json:uwsgilog {"timestamp":${strftime:%%d-%%m-%%Y %%H:%%M:%%S}, "source": "uWSGI", "level":"INFO", "message":"${msg}"}
log-encoder = format:application ${msg}
logger-req = stdio
log-req-encoder = json {"timestamp":${strftime:%%d-%%m-%%Y %%H:%%M:%%S}, "source": "request", "level":"INFO", "message":"${msg}"}
log-req-encoder = nl
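For the routing above to work, the application has to emit each log record as a single line containing a { so that the log-route = application rule matches it and passes it through untouched. Below is a minimal, hypothetical sketch of the application side using the standard logging module; the JsonFormatter class and handler setup are illustrative, not part of uwsgi (the 'myapp.main' logger name comes from the question):

import datetime
import json
import logging
import sys

class JsonFormatter(logging.Formatter):
    # render each record as one JSON line so the '{' log-route sends it to the pass-through 'application' logger
    def format(self, record):
        return json.dumps({
            'timestamp': datetime.datetime.utcnow().isoformat() + 'Z',
            'level': record.levelname,
            'name': record.name,
            'message': record.getMessage(),
        })

handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JsonFormatter())
logging.getLogger('myapp.main').addHandler(handler)
logging.getLogger('myapp.main').setLevel(logging.INFO)

Note that multi-line records such as tracebacks are still written as several physical lines, which is why the exception in the output above ends up split across lines.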
Another way of doing it would be to wrap the call to uwsgi as follows (the answer borrows heavily from this one: Python read from subprocess stdout and stderr separately while preserving order). Here, I assume that you have a logger configured to output messages in the correct format. If not, you can construct the JSON object yourself instead of calling logger.info().
import json
import logging
import os
import subprocess as sp
import sys
from threading import Thread
import time
def reader(process, source):
    assert source in {'stdout', 'stderr'}
    if source == 'stdout':
        pipe = process.stdout
    else:
        pipe = process.stderr
    with pipe:
        for line in pipe:
            if line == '' and process.poll() is not None:
                return
            # check whether the message is already formatted because it's emitted by the app
            if 'myapp.main' in line:
                continue
            if source == 'stdout':
                logger.info(line.strip())
            else:
                logger.error(line.strip())

if __name__ == '__main__':
    config_file_name = sys.argv[1]
    # configure_logging(...)
    logger = logging.getLogger('uwsgi')
    cmd = ["pipenv", "run", "--", "uwsgi",
           "--http", ":80",
           "--master",
           "--wsgi-file", "app.py",
           "--callable", "app",
           "--threads", "10",
           "--pyargv", config_file_name]
    process = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE, encoding='utf8')
    Thread(target=reader, args=[process], kwargs={'source': 'stdout'}).start()
    Thread(target=reader, args=[process], kwargs={'source': 'stderr'}).start()
    return_code = process.wait()
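If you would rather construct the JSON object yourself (as mentioned above) instead of relying on a pre-configured logger, a hypothetical helper like this could replace the logger.info()/logger.error() calls inside reader():

import datetime
import json

def emit_json(message, level):
    # build the JSON record directly and write it to stdout
    record = {'timestamp': datetime.datetime.utcnow().isoformat() + 'Z',
              'level': level,
              'name': 'uwsgi',
              'message': message}
    print(json.dumps(record), flush=True)

e.g. emit_json(line.strip(), 'INFO' if source == 'stdout' else 'ERROR').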