I noticed pylint doesn't handle well the case of:
@property
def foo(self):
    return self._bar.foo

@foo.setter
def foo(self, foo_val):
    self._bar.foo = foo_val
Though it's perfectly valid syntax since Python 2.6.
It says I defined foo twice and doesn't understand the ".setter" syntax (it gives E1101 & E0102).
Is there a workaround for that without having to change the code? I don't want to disable the errors as they are important for other places.
Is there any other tool I can use that handles it better? I already checked pyflakes and it behaves the same way. PyDev's code analysis seems to handle this specific case better, but it doesn't check for conventions, refactoring, and other cool features pylint does, and I can't run it from an external script (or can I??)
Thanks!
If you don't want to disable the errors globally, you can disable them for these specific lines, for example:
def foo(self, foo_val):  # pylint: disable-msg=E0102
(In newer pylint versions the option is spelled disable rather than disable-msg.)
This is ticket http://www.logilab.org/ticket/51222 on the pylint project. Monitor its status.
Huh. Annoying. And all the major tools I could find (pyflakes, pylint, pychecker) exhibit this problem. It looks like the problem starts in the byte code, but I can't get dis to give me any byte code for object properties.
It looks like you would be better off if you used this syntax:
# Changed to longer member names to reduce pylint grousing
class HughClass(object):
    def __init__(self, init_value):
        self._hugh = init_value

    def hugh_getter(self):
        return self._hugh * 2

    def hugh_setter(self, value):
        self._hugh = value / 2

    hugh = property(hugh_getter, hugh_setter)
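To make the explicit property() pattern concrete, here is a minimal self-contained sketch (the Celsius class and its member names are hypothetical, not from the question):

```python
class Celsius(object):
    """Minimal sketch of the explicit property() pattern."""
    def __init__(self, temperature=0):
        self._temperature = temperature

    def get_temperature(self):
        return self._temperature

    def set_temperature(self, value):
        self._temperature = value

    # property(fget, fset): getter first, setter second
    temperature = property(get_temperature, set_temperature)

c = Celsius(20)
c.temperature = 25        # routed through set_temperature
assert c.temperature == 25  # routed through get_temperature
```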
Here's a nice blog article on it. LOL-quote:
Getters and setters belong to the sad
world of Java and C++.
This was reported as a bug in pyflakes, and it appears to be fixed in current trunk. So I guess the answer (now) is: pyflakes!
I wrote the code, but I get the following message in PyCharm (2019.1):
"Parameterized generics cannot be used with class or instance checks"
def data_is_valid(data):
    keys_and_types = {
        'comment': (str, type(None)),
        'from_budget': (bool, type(None)),
        'to_member': (int, type(None)),
        'survey_request': (int, type(None)),
    }

    def type_is_valid(test_key, test_value):
        return isinstance(test_value, keys_and_types[test_key])

    type_is_valid('comment', 3)
I really do not understand this message well. Did I do something wrong or is it a bug in pycharm?
The error disappears if I explicitly typecast to tuple.
def type_is_valid(test_key, test_value):
    return isinstance(test_value, tuple(keys_and_types[test_key]))
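For what it's worth, the original code runs fine at runtime, since isinstance accepts a plain tuple of classes as its second argument. A minimal runnable sketch of the question's pattern:

```python
keys_and_types = {
    'comment': (str, type(None)),
    'from_budget': (bool, type(None)),
}

def type_is_valid(test_key, test_value):
    # isinstance accepts a tuple of classes directly
    return isinstance(test_value, keys_and_types[test_key])

assert type_is_valid('comment', 'hello') is True
assert type_is_valid('comment', None) is True
assert type_is_valid('comment', 3) is False
```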
That looks like a bug in pycharm where it's a bit overeager in assuming that you're using the typing module in an unintended way. See this example here where that assumption would have been correct:
The classes in the typing module are only useful in a type annotation context, not to inspect or compare to actual classes, which is what isinstance tries to do. Since pycharm sees a simple object with square brackets that do not contain a literal, it jumps to the wrong conclusion you are seeing.
Your code is fine, you can use it exactly as it is.
I will not repeat after others that this is a PyCharm bug. If you are a perfectionist and the "error" hurts your eyes, just add the comment
# noqa
to the line where the "error" is.
This was a known bug in PyCharm 2018, reported here.
There are some related bugs still in more recent PyCharm versions, e.g. PyCharm 2021.2.2, here.
In general, when you find that some PyCharm warning is incorrect, I would first isolate a simple test case that makes clear what exactly PyCharm is wrong about. Once it is clear the warning is wrong, you should file a bug report (or search for existing bug reports first). Here this is clear because PyCharm says you cannot do something that in fact you can, so something is wrong.
Since it's agreed it's a bug, you can suppress it in Pycharm by the line:
# noinspection PyTypeHints
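In PyCharm, the noinspection comment goes on the line immediately above the flagged statement (or above the enclosing function). A minimal sketch, reusing the question's dictionary:

```python
keys_and_types = {'comment': (str, type(None))}

# noinspection PyTypeHints
def type_is_valid(test_key, test_value):
    return isinstance(test_value, keys_and_types[test_key])

assert type_is_valid('comment', None) is True
```

The comment is inert at runtime; it only tells PyCharm's inspector to skip that check.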
How would I handle long path name like below for pep8 compliance? Is 79 characters per line a must even if it becomes somewhat unreadable?
def setUp(self):
    self.patcher1 = patch('projectname.common.credential.CredentialCache.mymethodname')
There are multiple ways to do this:
Use a variable to store this
def setUp(self):
    path = 'projectname.common.credential.CredentialCache.mymethodname'
    self.patcher1 = patch(path)
String concatenation:
An assignment like v = ("a" "b" "c") gets converted into v = "abc":
def setUp(self):
    self.patcher1 = patch(
        "projectname.common.credential."
        "CredentialCache.mymethodname")
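The implicit concatenation happens at compile time, so the split costs nothing at runtime:

```python
# Adjacent string literals are joined into one string by the compiler
path = ("projectname.common.credential."
        "CredentialCache.mymethodname")
assert path == "projectname.common.credential.CredentialCache.mymethodname"
```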
Tell pep8 that we don't use 80-column terminals anymore with --max-line-length=100 (or some sufficiently reasonable value). (Hat Tip #chepner below :) )
The 79-character limit in PEP 8 is based more on historical beliefs than on actual readability. All of PEP 8 is a guideline, but this one is more frequently ignored than most of its recommendations. The pep8 tool even has a specific option for changing what is considered "too long".
pep8 --max-line-length 100 myscript.py
I frequently just disable the test altogether:
pep8 --ignore E501 myscript.py
The 80-column guideline is not only for the sake of people coding on 1980s Berkeley Unix terminals; it also guarantees some uniformity across projects. Thanks to it you can set up your IDE's GUI to your liking and be sure it will work well for all your different projects.
Alas, sometimes the best solution is to violate it. It is an extremely rare case, but it does happen. And just for that reason you can tag the line with the comment # noinspection PyPep8, so you could turn your code into:
def setUp(self):
    # noinspection PyPep8
    self.patcher1 = patch('projectname.common.credential.CredentialCache.mymethodname')
Which will allow you to follow the guidelines of pep8 all around, including the line limitations, and not have to worry about this false report. Sadly this directive isn't supported with all the checkers, but it is slowly getting there.
I prefer variant with concatenation.
def setUp(self):
    self.patcher1 = patch(
        "projectname.common.credential."
        "CredentialCache.mymethodname")
Also, the parentheses around the concatenated strings are not required when they already sit inside a function call.
I wrote some Python code which works but Pylint doesn't like the star. It keeps telling me:
Used * or ** magic (star-args)
Is it possible to write my code without the star? Some info: I'm using lxml; self.xml is an objectified XML file.
@property
def version_string(self):
    '''Return the version as a string.'''
    try:
        version_format = self.xml.version.get("format")
    except AttributeError:
        return None
    version_values = (v.text for v in self.xml.version.v)
    return version_format.format(*version_values)
There's nothing wrong with the splat operator. Note that version_format here is a format string, not a function; the splat simply unpacks the generator into the positional arguments of str.format. You could build the call some other way, but frankly there's no reason to.
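A minimal sketch of the same pattern, with literal stand-ins for what the lxml objects would supply (the values are invented for illustration):

```python
version_format = "{}.{}.{}"                    # stands in for self.xml.version.get("format")
version_values = (v for v in ("1", "4", "2"))  # stands in for (v.text for v in self.xml.version.v)

# The splat unpacks the generator into positional format() arguments
assert version_format.format(*version_values) == "1.4.2"
```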
If you don't like that pylint warning, disable it. It was originally introduced because having lots of
def some_function(*args, **kwargs):
    pass
lowers the readability / maintainability of the code.
star-args (W0142) is no longer present in pylint (at least since version 1.4.3). It appears to have been removed fairly recently.
When it comes to constructors, and assignments, and method calls, the PyCharm IDE is pretty good at analyzing my source code and figuring out what type each variable should be. I like it when it's right, because it gives me good code-completion and parameter info, and it gives me warnings if I try to access an attribute that doesn't exist.
But when it comes to parameters, it knows nothing. The code-completion dropdowns can't show anything, because they don't know what type the parameter will be. The code analysis can't look for warnings.
class Person:
    def __init__(self, name, age):
        self.name = name
        self.age = age

peasant = Person("Dennis", 37)
# PyCharm knows that the "peasant" variable is of type Person
peasant.dig_filth()  # shows warning -- Person doesn't have a dig_filth method

class King:
    def repress(self, peasant):
        # PyCharm has no idea what type the "peasant" parameter should be
        peasant.knock_over()  # no warning even though knock_over doesn't exist

King().repress(peasant)
# Even if I call the method once with a Person instance, PyCharm doesn't
# consider that to mean that the "peasant" parameter should always be a Person
This makes a certain amount of sense. Other call sites could pass anything for that parameter. But if my method expects a parameter to be of type, say, pygame.Surface, I'd like to be able to indicate that to PyCharm somehow, so it can show me all of Surface's attributes in its code-completion dropdown, and highlight warnings if I call the wrong method, and so on.
Is there a way I can give PyCharm a hint, and say "psst, this parameter is supposed to be of type X"? (Or perhaps, in the spirit of dynamic languages, "this parameter is supposed to quack like an X"? I'd be fine with that.)
EDIT: CrazyCoder's answer, below, does the trick. For any newcomers like me who want the quick summary, here it is:
class King:
    def repress(self, peasant):
        """
        Exploit the workers by hanging on to outdated imperialist dogma which
        perpetuates the economic and social differences in our society.

        @type peasant: Person
        @param peasant: Person to repress.
        """
        peasant.knock_over()  # Shows a warning. And there was much rejoicing.
The relevant part is the @type peasant: Person line of the docstring.
If you also go to File > Settings > Python Integrated Tools and set "Docstring format" to "Epytext", then PyCharm's View > Quick Documentation Lookup will pretty-print the parameter information instead of just printing all the @-lines as-is.
Yes, you can use special documentation format for methods and their parameters so that PyCharm can know the type. Recent PyCharm version supports most common doc formats.
For example, PyCharm extracts types from @param-style comments.
See also reStructuredText and docstring conventions (PEP 257).
Another option is Python 3 annotations.
Please refer to the PyCharm documentation section for more details and samples.
If you are using Python 3.0 or later, you can also use annotations on functions and parameters. PyCharm will interpret these as the type the arguments or return values are expected to have:
class King:
    def repress(self, peasant: Person) -> bool:
        peasant.knock_over()  # Shows a warning. And there was much rejoicing.
        return peasant.badly_hurt()  # Let's say it's not known from here that this method always returns a bool
Sometimes this is useful for non-public methods that do not need a docstring. As an added benefit, those annotations can be accessed by code:
>>> King.repress.__annotations__
{'peasant': <class '__main__.Person'>, 'return': <class 'bool'>}
Update: As of PEP 484, which has been accepted for Python 3.5, it is also the official convention to specify argument and return types using annotations.
PyCharm extracts types from a @type pydoc string. See PyCharm docs here and here, and Epydoc docs. It's in the 'legacy' section of PyCharm, so perhaps it lacks some functionality.
class King:
    def repress(self, peasant):
        """
        Exploit the workers by hanging on to outdated imperialist dogma which
        perpetuates the economic and social differences in our society.

        @type peasant: Person
        @param peasant: Person to repress.
        """
        peasant.knock_over()  # Shows a warning. And there was much rejoicing.
The relevant part is the @type peasant: Person line of the docstring.
My intention is not to steal points from CrazyCoder or the original questioner, by all means give them their points. I just thought the simple answer should be in an 'answer' slot.
I'm using PyCharm Professional 2016.1 writing py2.6-2.7 code, and I found that using reStructuredText I can express types in a more succinct way:
class Replicant(object):
    pass

class Hunter(object):
    def retire(self, replicant):
        """ Retire the rogue or non-functional replicant.

        :param Replicant replicant: the replicant to retire.
        """
        replicant.knock_over()  # Shows a warning.
See: https://www.jetbrains.com/help/pycharm/2016.1/type-hinting-in-pycharm.html#legacy
You can also assert for a type and PyCharm will infer it:
def my_function(an_int):
    assert isinstance(an_int, int)
    # PyCharm now knows that an_int is of type int
    pass
In general I want to disable as little code as possible, and I want it to be explicit: I don't want the code being tested to decide whether it's a test or not, I want the test to tell that code "hey, BTW, I'm running a unit test, can you please not make your call to solr, instead can you please stick what you would send to solr in this spot so I can check it". I have my ideas but I don't like any of them, I am hoping that there's a good pythonic way to do this.
You can use Mock objects to intercept the method calls that you do not want to execute.
E.g. You have some class A, where you don't want method no() to be called during a test.
class A:
    def do(self):
        print('do')

    def no(self):
        print('no')
A mock object could inherit from A and override no() to do nothing.
class MockA(A):
    def no(self):
        pass
You would then create MockA objects instead of As in your test code. Another way to do mocking would be to have A and MockA implement a common interface, say InterfaceA.
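A hedged sketch of how such a test might look; run_workflow is a hypothetical function under test that calls both methods (not from the question):

```python
import unittest

class A(object):
    def do(self):
        return 'do'

    def no(self):
        # stands in for an external call (e.g. to solr) we must not hit in tests
        raise RuntimeError("external call -- should not run in tests")

class MockA(A):
    def no(self):
        pass  # overridden to do nothing

def run_workflow(a):
    # hypothetical code under test: exercises both methods
    a.no()
    return a.do()

class WorkflowTest(unittest.TestCase):
    def test_no_is_suppressed(self):
        # passing MockA instead of A silences the external call
        self.assertEqual(run_workflow(MockA()), 'do')
```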
There are tons of mocking frameworks available. See StackOverflow: Python mocking frameworks.
In particular see: Google's Python mocking framework.
Use Michael Foord's Mock
In your unit test, do this:
from mock import Mock

class Person(object):
    def __init__(self, name):
        super(Person, self).__init__()
        self.name = name

    def say(self, str):
        print "%s says \"%s\"" % (self.name, str)

...

# In your unit test....
# create the class as normal
person = Person("Bob")
# now mock all of person's methods/attributes
person = Mock(spec=person)

# talkTo is some function you are testing
talkTo(person)

# make sure the Person class's say method was called
self.assertTrue(person.say.called, "Person wasn't asked to talk")

# make sure the person said "Hello"
args = ("Hello",)
keywargs = {}
self.assertEquals(person.say.call_args, (args, keywargs), "Person did not say hello")
The big problem that I was having was with the mechanics of the dependency injection. I have now figured that part out.
I need to import the module in the exact same way in both places to successfully inject the new code. For example, if I have the following code that I want to disable:
from foo_service.foo import solr
solr.add(spam)
I can't seem to do this in my test runner:
from foo import solr
solr = mock_object
The python interpreter must be treating the modules foo_service.foo and foo as different entries. I changed from foo import solr to the more explicit from foo_service.foo import solr and my mock object was successfully injected.
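The "patch by the same fully qualified path" point can be demonstrated with unittest.mock.patch. The sketch below fabricates a tiny foo_service.foo module in sys.modules so it is self-contained; the module and attribute names mirror the question:

```python
import sys
import types
from unittest.mock import patch

# Fabricate foo_service.foo so the example runs without the real package
foo = types.ModuleType("foo_service.foo")
foo.solr = "real solr client"
foo_service = types.ModuleType("foo_service")
foo_service.foo = foo
sys.modules["foo_service"] = foo_service
sys.modules["foo_service.foo"] = foo

# Patch by the same fully qualified name the production code imports from
with patch("foo_service.foo.solr", "mock object"):
    import foo_service.foo
    assert foo_service.foo.solr == "mock object"

# patch restores the original attribute on exit
assert foo_service.foo.solr == "real solr client"
```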
Typically when something like this arises you use Monkey Patching (also called Duck Punching) to achieve the desired results. Check out this link to learn more about Monkey Patching.
In this case, for example, you would overwrite solr to just print the output you are looking for.
You have two ways to do this with no (or, in the case of DI, minimal) modifications to your source code:
Dependency injection
Monkey patching
The cleanest way is using dependency injection: I don't really like extensive monkeypatching, and some things that are impossible or difficult with monkeypatching are easy with dependency injection.
I know it's the typical use case for mock objects, but that's also an old argument... are mock objects necessary at all, or are they evil?
I'm on the side of those who believe mocks are evil and would try to avoid changing tested code at all. I even believe such a need to modify tested code is a code smell...
If you wish to change or intercept an internal function call for testing purposes, you could also make this function an explicit external dependency, set at instantiation time, that would be provided both by your production code and your test code. If you do that, the problem disappears and you end up with a cleaner interface.
Note that, done this way, there is no need to change the tested code at all, neither internally nor from the test being performed.
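The dependency-set-at-instantiation idea can be sketched as follows; Indexer and its send parameter are hypothetical names for illustration:

```python
class Indexer(object):
    """Hypothetical tested class: the outbound call is injected, not hard-coded."""
    def __init__(self, send):
        self._send = send  # production code would pass something like solr.add

    def index(self, doc):
        self._send(doc)

# In a test, inject a recorder instead of the real call
sent = []
indexer = Indexer(sent.append)
indexer.index("spam")
assert sent == ["spam"]  # we can now check what would have gone to solr
```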