The Dask client spams warnings in my Jupyter Notebook output. Is there a way to switch off Dask warnings?
The warning text looks like this: "distributed.worker - WARNING - Memory use is high but worker has no data to store to disk. Perhaps some other process is leaking memory? Process memory: 3.16 GB -- Worker memory limit: 4.20 GB"
The problem appears after running this code:
import pandas as pd
from sqlalchemy import create_engine, MetaData
from sqlalchemy import select, insert, func
import dask.dataframe as dd
from dask.distributed import Client

# Local client: 4 in-process workers with 4 threads each
client = Client(n_workers=4, threads_per_worker=4, processes=False)

# Reflect the table definition from the database (uri is the connection string)
engine = create_engine(uri)
meta_core = MetaData()
meta_core.reflect(bind=engine)
table = meta_core.tables['table']

# Read the table into a Dask DataFrame, partitioned on the 'id' column
dd_main = dd.read_sql_table(
    table=table,
    uri=uri,
    index_col='id'
)
dd_main.head()
After executing the chunk above, I get a lot of these warnings in every Jupyter cell, so I can't even find my actual output.
You can pass a logging level to the Client constructor like the following:
client = Client(..., silence_logs='error')
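Applied to the constructor from the question, that could look like the following (a sketch; whether the plain string is accepted may depend on your distributed version, as the next answer notes):

from dask.distributed import Client

# Same Client as in the question, with scheduler/worker log output
# below the 'error' level suppressed
client = Client(
    n_workers=4,
    threads_per_worker=4,
    processes=False,
    silence_logs='error',
)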
I had to use a variation on MRocklin's answer:
import logging
client = Client(..., silence_logs=logging.ERROR)
The string "error" is not recognized by Python's logging module, which uses integer constants. An alternative is to pass the numerical value of the constant directly instead (here logging.ERROR == 40). See the logging levels documentation.
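A minimal sketch of both equivalent forms (logging.ERROR is just the integer 40):

import logging
from dask.distributed import Client

# These two calls configure the same level, since logging.ERROR == 40
client = Client(n_workers=4, threads_per_worker=4, processes=False,
                silence_logs=logging.ERROR)
# ...or pass the numeric value directly:
# client = Client(n_workers=4, threads_per_worker=4, processes=False,
#                 silence_logs=40)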