I have a Python file like:
def test_constructor_for_legacy_json():
    """Test if constructor works for a legacy JSON in an old database"""
    a = A(**{
        'field1': 'BIG TEXT WITH MORE THAN 500 CHARACTERS....(...)',
        'field2': 'BIG TEXT WITH MORE THAN 500 CHARACTERS....(...)',
        'field3': 'BIG TEXT WITH MORE THAN 500 CHARACTERS....(...)',
        # (...)
        'field1000': 'BIG TEXT WITH MORE THAN 500 CHARACTERS....(...)',
    })
    assert type(a) == A
When I run flake8 with the hacking plugin, I get an error because these lines are too long.
If I put # flake8: noqa at the beginning of the file, the whole file is excluded from the linter, but I only want to exclude the block where a is declared.
I still want to lint the rest of the file, and I can't append # noqa: E501 to the end of every fieldx line.
Does anyone know how I can solve this? Thanks
There isn't a way in flake8 to ignore a block of code.
Your options are:
ignore each line that produces an error by putting # noqa: E501 on it
ignore the entire file (but this turns off all other errors as well) with a # flake8: noqa on a line by itself
ignore E501 in the entire file by using per-file-ignores:
[flake8]
per-file-ignores =
    path/to/file.py: E501
Generally I'd prefer the third one, maybe even sequestering your long strings into their own file to be ignored.
disclaimer: I'm the current flake8 maintainer
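To make the third option concrete, here is a minimal sketch of sequestering the long strings. The module name fixtures_legacy_json.py, the LEGACY_JSON name, and the class A are all hypothetical; the idea is that only the fixture module gets E501 ignored via per-file-ignores, while the test file stays fully linted:

```python
# Assumed flake8 config (in setup.cfg, tox.ini, or .flake8):
#
#   [flake8]
#   per-file-ignores =
#       fixtures_legacy_json.py: E501

# --- fixtures_legacy_json.py (only this file has E501 ignored) ---
LEGACY_JSON = {
    'field1': 'BIG TEXT WITH MORE THAN 500 CHARACTERS....(...)',
    'field2': 'BIG TEXT WITH MORE THAN 500 CHARACTERS....(...)',
}

# --- test file (fully linted; it imports the fixture instead of inlining it) ---
class A:
    """Stand-in for the real class; stores every legacy field as an attribute."""
    def __init__(self, **kwargs):
        for name, value in kwargs.items():
            setattr(self, name, value)


def test_constructor_for_legacy_json():
    """Test if constructor works for a legacy JSON in an old database"""
    a = A(**LEGACY_JSON)
    assert type(a) == A


test_constructor_for_legacy_json()
```

In a real project the fixture would live in its own file and the test would do from fixtures_legacy_json import LEGACY_JSON; both are shown in one block here only so the sketch is self-contained.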