How to use PyCharm for GIMP plugin development?

I need to develop a plugin for GIMP and would like to stay with PyCharm for Python editing, etc.
FYI, I'm on Windows.
After directing PyCharm to use the Python interpreter included with GIMP, I also added a path to gimpfu.py to get rid of the error on from gimpfu import *.
This fixes the error on the import, even when the directory is set to Excluded.
I experimented with setting this directory to Sources, Resources and Excluded, and I still get errors for constants such as RGBA_IMAGE, TRANSPARENT_FILL, NORMAL_MODE, etc.
Any idea on how to contort PyCharm into playing nice for GIMP plugin development?
I'm not really running any code from PyCharm; it's just being used as a nice code editor, to facilitate revision control, etc.

As you found, these variables are part of .pyd files (DLL files for Python). PyCharm can't get signatures for the contents of these files.
For Python builtins (like abs, all, any, etc.) PyCharm has its own .py files that it uses only for signatures and docs. You can see this if you click on one of these functions and go to its declaration:
PyCharm will open the builtins.py file in its folder, with the following content:
def abs(*args, **kwargs): # real signature unknown
    """ Return the absolute value of the argument. """
    pass

def all(*args, **kwargs): # real signature unknown
    """
    Return True if bool(x) is True for all values x in the iterable.

    If the iterable is empty, return True.
    """
    pass

def any(*args, **kwargs): # real signature unknown
    """
    Return True if bool(x) is True for any x in the iterable.

    If the iterable is empty, return False.
    """
    pass
As you can see, the functions are declared and documented, but have no implementation, because their implementation is written in C and lives somewhere in a binary file.
PyCharm can't provide such a wrapper for every library. Usually the people who create .pyd files provide their own .py wrappers (for example, the PyQt module: no native Python implementation, just signatures).
It looks like GIMP doesn't have such a wrapper for some of these variables. The only way I see is to create some sort of wrapper manually. For example, create gimpfu_signatures.py with the following content:
# NOTE: these values are placeholders for the editor only; if this stub could
# ever be imported inside GIMP, they would have to match the real gimpfu values.
RGBA_IMAGE = 1
TRANSPARENT_FILL = 2
NORMAL_MODE = 3
And import it while you're writing the plugin:
from gimpfu import *
from gimpfu_signatures import * # comment on release
Not elegant, but better than nothing.
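If you don't want to remember to comment that import out on release, one possible refinement (an untested sketch of the same idea) is to fall back to the stub only when the real names are missing:

from gimpfu import *

try:
    RGBA_IMAGE  # defined when running inside GIMP
except NameError:
    # Editor-only fallback; never reached inside GIMP.
    from gimpfu_signatures import *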
...
One more note about gimpfu.py's path. If I understand correctly, you just added this path to the project. That may work, but the correct way is to add it to the project's PYTHONPATH (in the project preferences). See this link for a detailed manual.


Calling multiple Python scripts from Python with a predefined environment

Probably related to: globals and locals in python exec(), Python 2 How to debug code injected by the exec block, and How to get local variables updated, when using the `exec` call?
I am trying to develop a test framework for our desktop applications which uses click-bot-like functions.
My goal was to enable non-programmers to write small scripts which could work as a test script. So my idea is to structure the test scripts by files like:
tests-folder
| -> 01-first-test.py
| -> 02-second-test.py
| ... etc
| -> fixture.py
And then just execute these scripts in alphabetical order. However, I also wanted to have fixtures which would define functions, classes, variables and make them available to the different scripts without having the scripts to import that fixture explicitly. If that works, I could also have that approach for 2 or more directory levels.
I could get it working-ish with some hacking around, but I am not entirely convinced. I have a test_sequence.py which looks like this:
from pathlib import Path
from copy import deepcopy

from my_module.test import Failure


def run_test_sequence(test_dir: str):
    error_occurred = False
    fixture = {
        'some_pre_defined_variable': 'this is available in all scripts and fixtures',
        'directory_name': test_dir,
    }

    # Check if fixture.py exists and load that first
    fixture_file = Path(test_dir) / 'fixture.py'
    if fixture_file.exists():
        with open(fixture_file.absolute(), 'r') as code:
            exec(code.read(), fixture, fixture)

    # Go over all files in the test sequence directory and execute them
    for test_file in sorted(Path(test_dir).iterdir()):
        if test_file.name == 'fixture.py':
            continue

        # Make a deepcopy, so scripts cannot influence one another
        fixture_copy = deepcopy(fixture)
        fixture_copy.update({
            'some_other_variable': 'this is available in all scripts but not in fixture',
        })

        try:
            with open(test_file.absolute(), 'r') as code:
                exec(code.read(), fixture_copy, fixture_copy)
        except Failure:
            error_occurred = True

    return error_occurred
This iterates over all files in the directory tests-folder and executes them in order (with fixture.py first). It also makes the local variables, functions and classes from fixture.py available to each test-script.
A test script could then just be arbitrary code that will be executed and if it raises my custom Failure exception, this will be noted as a failed test.
The whole sequence is started with a script that does
from my_module.test_sequence import run_test_sequence

if __name__ == '__main__':
    exit(run_test_sequence('tests-folder'))
This mostly works.
What it cannot do, and what leaves me unsatisfied with this approach:
I cannot debug the scripts themselves. Since the code is loaded as a string and then interpreted, breakpoints inside the test scripts are not recognized.
Calling fixture functions behaves weird. When I define a function in fixture.py like:
from my_hello_module import print_hello

def printer():
    print_hello()
I will receive a message during execution that print_hello is undefined because the variables/modules/etc. in the scope surrounding printer are lost.
Stack traces are useless. On failure it shows the stack trace, but of course it only shows my line which says exec(...) and the insides of that function, and none of the code that has been loaded.
I am sure there are other drawbacks, that I have not found yet, but these are the most annoying ones.
I also tried to find a solution through __import__ but I couldn't get it to inject my custom locals or globals into the imported script.
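One avenue I have only skimmed so far: compile() accepts a filename, and runpy.run_path() can inject globals while keeping the real file association, so something like the following sketch (reusing test_file and fixture_copy from my snippet above) might at least fix the traceback problem, though I have not verified it:

import runpy

# Compile with the real filename so tracebacks point at the test file
# (and debuggers that match breakpoints by filename have a chance):
code_obj = compile(test_file.read_text(), str(test_file), 'exec')
exec(code_obj, fixture_copy, fixture_copy)

# Or let runpy execute the file with injected globals:
result_globals = runpy.run_path(str(test_file), init_globals=fixture_copy)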
Is there a solution that I am too inexperienced to find, or another built-in Python function that actually does what I am trying to do? Or is there no way to achieve this, and should I rather have each test script import the fixture and file/directory names itself? I want those scripts to have as few dependencies and as little Python-y code as possible. Ideally they are just:
from my_module.test import *

click(x, y, LEFT)
write('admin')
press('tab')
write('password')
press('enter')

if text_on_screen('Login successful'):
    succeed('Test successful')
else:
    fail('Could not login')
Additional note: I think I had the debugger working when I still used execfile, but that is not available in Python 3 environments.

[Python][Flask] PyLint too many instance attributes, is this resource correctly written? [duplicate]

I am writing an application in Python using Flask. While creating a Resource class for an endpoint, I get a Pylint 'too-many-instance-attributes' warning, and now I no longer know whether what I am doing is even a 'correct' way of writing Resources.
I inject dependencies into the resource like this:
api.add_resource(TicketsQuery, '/tickets/query',
                 '/ticket/query/<int:ticketID>',
                 resource_class_kwargs={'cleaner': Cleaner(StrategyResponseTickets()),
                                        'machine_cleaner': Cleaner(StrategyResponseMachines()),
                                        'db': soap_caller,
                                        'cache': cache,
                                        'field_map': app.config['FIELD_FILTER_MAP']['TICKETS'],
                                        'endpoint_permission': TicketsQueryPermission
                                        })
This then shows up in the Resource as a kwargs argument. I also decorate the functions inside __init__, since I need a variable from the class (to do the decoration itself).
class TicketsQuery(Resource):
    def __init__(self, **kwargs):
        # Dependencies
        self.cleaner = kwargs['cleaner']
        self.machine_cleaner = kwargs['machine_cleaner']
        self.db = kwargs['db']
        self.cache = kwargs['cache']
        self.field_map = kwargs['field_map']
        self.endpoint_permission = kwargs['endpoint_permission']

        # Permissions of endpoint method calls, implemented using a wrapper
        self.get = authorization_required(self.endpoint_permission, UserType.GENERIC_EMPLOYEE)(self.get)
        self.post = authorization_required(self.endpoint_permission, UserType.GENERIC_EMPLOYEE)(self.post)

    def get(self, permission_set: TicketsPermissionSet, ticketID=-1):
        ...
Is this a correct way of writing Resources in Flask? Or is there a better structure to adhere to? Any insight or tips are appreciated!
Don't be discouraged by a linter warning ~ it's there to remind you to stick to a specific style. If you don't yet have a .pylintrc in the project's root folder then the linter will be using the default rules; this is simply a style warning against "too many variables in instance" ~ i.e. too many self.*'s - it is 100% a pylint thing, and nothing to do with Flask.
You can suppress the warning on this but keep it active for any others by adding a comment # pylint: disable=too-many-instance-attributes above the class's def __init__(...):.
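For example, a minimal sketch of the placement, reusing the class from the question:

class TicketsQuery(Resource):
    # pylint: disable=too-many-instance-attributes
    def __init__(self, **kwargs):
        ...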
If you'd like a more pylintesque way of solving it, run pylint --generate-rcfile (assuming you've pip install pylint'd and these messages aren't just coming from the IDE) in your project's root folder to generate a .pylintrc.
From there you can add the error's code R0902 to the disable= list ~ i.e. disable=R0902, or disable=A*,B*,R0902 if you're disabling multiple warnings (the list accepts globbing, so you could just disable ALL R* messages, but it's best to only do that if you're sure you know what warnings you're turning off). OR, you can find the line max-attributes= under [DESIGN] and bump that number up to a higher value you'd consider more reasonable.
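For instance, the relevant part of the generated .pylintrc could end up looking like this (12 is just an arbitrary example value):

[DESIGN]
# Maximum number of instance attributes (pylint's default is 7).
max-attributes=12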
For reference, the resources to find this information if you'd like to explore further ~ here's a list of the warnings you can get from pylint which you can search either by the name of the error too-many-instance-attributes or by its code R0902, and here's a sample pylintrc, which you can mostly browse by the code i.e. R0902 to find the property in it that affects that warning.
Lastly, here's another SO article on the same pylint error message, if you want more examples ~ How to deal with Pylint's "too-many-instance-attributes" message?.

Creating an extension for SPSS with Python

So I have discovered the Python extension mechanism for SPSS, and everything works fine: I have created some scripts, put them in the extensions folder, and they work. However, now I have created a couple of scripts that require arguments; I thought I could just follow the same method, but I guess not.
def Run(args):
    import spss

    def testing_p(variables):
        all_variables = [spss.GetVariableName(i) for i in range(spss.GetVariableCount())]
        variable_nr = [all_variables.index(i) for i in variables]
        print all_variables
        print variable_nr
With the following .xml-file:
<Command xmlns="http://xml.spss.com/extension" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" Name="testing_p" Language="Python">
</Command>
However, this keeps throwing the following error when calling testing_p(['my_var', 'my_var2']):
Warnings
This command should specify a valid subcommand at the beginning.
Execution of this command stops.
I cannot wrap my head around this, because everything works fine when it is not put in the extensions folder and I only do:
BEGIN PROGRAM.
import spss
def testing_p(variables):
    all_variables = [spss.GetVariableName(i) for i in range(spss.GetVariableCount())]
    variable_nr = [all_variables.index(i) for i in variables]
    print all_variables
    print variable_nr
END PROGRAM.
For an extension, which can be written in Python, R, or Java, you need to create a syntax specification containing the command name, any subcommands, and the arguments and argument types you want. Here is a picture of the start of one (SPSSINC_TURF, which is installed with Statistics).
This will guide the Statistics parser in checking the user input. It also then calls the Run function with a complicated structure containing the user input. You can use the functions in the extensions module to map that to your Python variables and do further validation. Here is a picture of the start of the Run function for SPSSINC TURF.
Finally, if the syntax is valid, your Run function calls the worker function to do something useful, mapping all the parameters to the specified arguments by calling
processcmd(oobj, args, superturf, vardict=spssaux.VariableDict())
which was imported from extensions.py.
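As a rough sketch of what that flow could look like for the question's command (the keyword name and ktype here are illustrative and must match what the .xml file declares):

from extension import Template, Syntax, processcmd
import spss

def testing_p(variables):
    all_variables = [spss.GetVariableName(i) for i in range(spss.GetVariableCount())]
    variable_nr = [all_variables.index(i) for i in variables]
    print all_variables
    print variable_nr

def Run(args):
    # Describe the expected user input, unwrap the structure Statistics
    # passes in, and map it onto testing_p's arguments.
    oobj = Syntax([
        Template("VARIABLES", subc="", ktype="existingvarlist",
                 var="variables", islist=True)])
    args = args[args.keys()[0]]
    processcmd(oobj, args, testing_p)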
Look at the doc for extensions in the help system, and look at some of the extensions installed with Statistics for examples.
Finally, here is a slide from one of my presentations summarizing the flow from user input to results.

Pythonic way to set module-wide settings from external file

Some background (not mandatory, but might be nice to know): I am writing a Python command-line module which is a wrapper around latexdiff. It basically replaces all \cite{ref1, ref2, ...} commands in LaTeX files with written-out and properly formatted references before passing the files to latexdiff, so that latexdiff will properly mark changes to references in the text (otherwise, it treats the whole \cite{...} command as a single "word"). All the code is currently in a single file which can be run with python -m latexdiff-cite, and I have not yet decided how to package or distribute it. To make the script useful for anybody else, the citation formatting needs to be configurable. I have implemented an optional command-line argument -c CONFIGFILE to allow the user to point to their own JSON config file (a default file resides in the module folder and is loaded if the argument is not used).
Current implementation: My single-file command-line Python module currently parses command-line arguments in if __name__ == '__main__', and loads the config file (specified by the user in -c CONFIGFILE) here before running the main function of the program. The config variable is thus available in the entire module and all is well. However, I'm considering publishing to PyPI by following this guide which seems to require me to put the command-line parsing in a main() function, which means the config variable will not be available to the other functions unless passed down as arguments to where it's needed. This "passing down by arguments" method seems a little cluttered to me.
Question: Is there a more pythonic way to set some configuration globals in a module or otherwise accomplish what I'm trying to? (I don't want to rely on 3rd party modules.) Am I perhaps completely off the tracks in some fundamental way?
One way to do it is to have the configurations defined in a class or a simple dict:
import json

class Config(object):
    setting1 = "default_value"
    setting2 = "default_value"

    @staticmethod
    def load_config(json_file):
        """ load settings from config file """
        with open(json_file) as f:
            config = json.load(f)
        for k, v in config.items():
            setattr(Config, k, v)
Then your application can access the settings via this class: Config.setting1 ...
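Usage would then look something like this (the file name is illustrative):

Config.load_config('my_settings.json')  # e.g. the path from the -c CONFIGFILE argument
print(Config.setting1)                  # any function in the module can read Config.*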

Eclipse Pydev: Suppress no-self errors in Python wrappers generated with SWIG

When generating Python wrappers with SWIG, the wrapper classes in the generated Python file do not have an explicit self parameter; for example, see below:
class PySwigIterator(_object):
    def value(*args): return _spatiotemporalnmf.PySwigIterator_value(*args)
    def incr(*args): return _spatiotemporalnmf.PySwigIterator_incr(*args)
    def decr(*args): return _spatiotemporalnmf.PySwigIterator_decr(*args)
    def distance(*args): return _spatiotemporalnmf.PySwigIterator_distance(*args)
I am developing with the Eclipse plugin PyDev. PyDev always shows an error when it detects a method without an explicit self parameter. I am aware of two methods to get rid of the errors: First, disable error checking for the whole project in the PyDev preferences. Second, add a #@NoSelf to every line with an error. I don't want to use the first one, because I still want to get error warnings for my non-SWIG-generated files. Obviously the second one is also not very good, because I would have to do it by hand, and every time I regenerate the file, all the #@NoSelf comments will be gone.
My Question now is, is there a better way to achieve this?
Thanks
As per the documentation, any file with the comment
#@PydevCodeAnalysisIgnore
inside it will not be analyzed.
Therefore, you just need to add it to all SWIG-generated files, and you should be OK. It is just one place to change, and you could even write a very small processor that will add it automatically.
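For example, a tiny post-processing script along these lines (the wrapper filename is hypothetical) could be run after each SWIG invocation:

from pathlib import Path

MARKER = '#@PydevCodeAnalysisIgnore\n'

wrapper = Path('spatiotemporalnmf.py')  # the SWIG-generated module
text = wrapper.read_text()
if not text.startswith(MARKER):
    wrapper.write_text(MARKER + text)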
