One of the kwargs for building a random forest in sklearn is "verbose". The documentation says that it
Controls the verbosity of the tree building process
After searching online I am still not sure what this means.
Many scikit-learn functions have a verbose argument that, according to their documentation, "[c]ontrols the verbosity: the higher, the more messages" (e.g., GridSearchCV).
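For illustration (a small sketch; the dataset and parameter grid here are arbitrary), a non-zero verbose makes GridSearchCV print progress messages about each fit, and higher values add more detail:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# verbose=0 is silent; verbose=1 reports overall progress;
# higher values also report each fold/candidate with its timing.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [10, 50]},
    cv=3,
    verbose=2,
)
search.fit(X, y)
```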
8.5. Using the verbose Parameter. The -verbose parameter tells you what has been done. If you are doing something risky, the verbose parameter does not protect you against ill-advised actions the way the -whatif or -confirm parameters do; it only reports what the command did, which is of limited use if you haven't worked out the precise effect of the command beforehand.
Verbosity level relates to logging. In unit tests, for example, it controls the amount of information that gets logged. Note: it is more Pythonic to use the level constants (logging.INFO, logging.DEBUG) rather than raw numbers. These levels decide how much information you will get.
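For example, a minimal sketch using the named level constants with the standard logging module (the handler configuration and messages here are arbitrary):

```python
import logging

# Lower levels let more messages through: DEBUG < INFO < WARNING < ERROR.
logging.basicConfig(level=logging.DEBUG)  # most verbose setting
log = logging.getLogger(__name__)

log.debug("emitted only when the level is DEBUG")
log.info("emitted at INFO level and below")
log.warning("emitted even at the default WARNING level")
```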
Verbosity in keyword arguments usually means showing more 'wordy' information for the task. In this case, for machine learning, by setting verbose to a higher number (2 vs 1), you may see more information about the tree-building process.
Seeing the verbosity settings for another machine learning application may help to understand the principle.
In the case of verbose=1 you get only the red progress logs with the elapsed time (they appear fairly rarely for big models), but changing verbose to 2 also prints information about building each tree, as in the sketch below.
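A minimal sketch, assuming a scikit-learn random forest (the exact wording of the log lines varies with the scikit-learn and joblib versions):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)

# verbose=1 only shows joblib's progress/elapsed-time messages;
# verbose=2 additionally prints a line per tree as the forest is built.
clf = RandomForestClassifier(n_estimators=10, verbose=2, random_state=0)
clf.fit(X, y)
```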