I'm developing a GUI application in Python that stores its documents in an XML-based format. The application is a mathematical model with several pre-defined components which can be dragged and dropped. I'd also like the user to be able to create custom components by writing a Python function inside an editor provided within the application. My issue is with storing these functions in the XML.
A function might look something like this:
def func(node, timestamp):
    return node.weight * timestamp.day + 4
These functions are wrapped in an object which provides a standard way of calling them (consistent with the pre-defined components). If I were to create one from Python directly, it would look like this:
parameter = ParameterFunction(func)
The function is then called by the model like this:
parameter.value(node=node, timestamp=timestamp)
The ParameterFunction object has to_xml and from_xml methods which need to serialise/deserialise the object to/from an XML representation.
My question is: how do I store the Python functions in an XML document?
One solution I have thought of so far is to store the function definition as a string, eval() or exec() it for use but keep the string around, then store the string in a CDATA block in the XML. Are there any issues with this that I'm not seeing?
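For what it's worth, a rough sketch of what I mean (the element name and the use of lxml for the CDATA wrapping are assumptions, not taken from the project):

import lxml.etree as ET

# Sketch only: keep the user's source as a string, exec() it to get the
# function object, and write the original string back out when serialising.
# lxml is assumed here because the stdlib ElementTree has no CDATA support.
source = """\
def func(node, timestamp):
    return node.weight * timestamp.day + 4
"""

namespace = {}
exec(source, namespace)          # defines func in `namespace`
func = namespace["func"]         # this is what ParameterFunction would wrap

element = ET.Element("parameter", type="function")
element.text = ET.CDATA(source)  # stored verbatim for to_xml/from_xml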
An alternative would be to store all of the Python code in a separate file and have the XML reference just the function names. This could be nice as it could be edited easily in an external editor. In that case, what is the best way to import the code? I am envisaging fighting with the Python import path...
I'm aware there will be security concerns with running untrusted code, but I'm willing to make this tradeoff for the freedom it gives users.
The specific application I'm referring to is on GitHub. I'm happy to provide more information if it's needed, but I've tried to keep it fairly generic here. https://github.com/snorfalorpagus/pywr/blob/120928eaacb9206701ceb9bc91a5d73740db1953/pywr/core.py#L396-L402
Nope, you have the easiest and best solution that I can think of. Just keep them as strings, as long as you're not worried about running the untrusted code.
The way I'd deal with external Python scripts containing tiny snippets like yours would be to treat them as plain text files and read them in as strings. This avoids all the problems with importing them. Just read them in and call exec() on them, and the functions will exist in scope.
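For example (a minimal sketch; the filename and the convention of collecting every public callable are just assumptions):

def load_user_functions(path):
    # Read the snippet file as plain text and exec it into a fresh namespace.
    with open(path) as f:
        source = f.read()
    namespace = {}
    exec(source, namespace)
    # Collect whatever callables the user defined.
    return {name: obj for name, obj in namespace.items()
            if callable(obj) and not name.startswith("_")}

functions = load_user_functions("custom_components.py")
parameter = ParameterFunction(functions["func"])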
EDIT: I was going to add something on sandboxing Python code, but after a bit of research it seems this will not be an easy task; it would be easier to sandbox the entire program. Another, longer and harder, way to restrict the untrusted code would be to create your own tiny interpreter that only performs safe operations (i.e. mathematical operations, calling existing functions, etc.).
Related
I have an application that dynamically generates a lot of Python modules with class factories, to eliminate a lot of redundant boilerplate that makes the code hard to debug across similar implementations. It works well, except that the dynamic generation of the classes across the modules (hundreds of them) takes more time to load than simply importing from a file. So I would like to find a way to save the modules to a file after generation (unless reset) and then load from those files to cut down on bootstrap time for the platform.
Does anyone know how I can save/export auto-generated Python modules to a file for re-import later? I already know that pickling or exporting as a JSON object won't work, because they make use of thread locks and other dynamic state variables, and the classes must be defined before they can be pickled. I need to save the actual class definitions, not instances. The classes are defined with the type() function.
If you have ideas or knowledge on how to do this I would really appreciate your input.
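For context, a toy illustration of the kind of class factory I mean (the real generated modules are much more involved):

# Toy example only - the real factories generate many classes per module.
def make_model_class(name, fields):
    attrs = {field: None for field in fields}
    attrs["fields"] = tuple(fields)
    return type(name, (object,), attrs)

Sensor = make_model_class("Sensor", ["id", "value"])
print(Sensor.fields)   # ('id', 'value')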
You’re basically asking how to write a compiler whose input is a module object and whose output is a .pyc file. (One plausible strategy is of course to generate a .py and then byte-compile that in the usual fashion; the following could even be adapted to do so.) It’s fairly easy to do this for simple cases: the .pyc format is very simple (but note the comments there), and the marshal module does all of the heavy lifting for it. One point of warning that might be obvious: if you’ve already evaluated, say, os.getcwd() when you generate the code, that’s not at all the same as evaluating it when loading it in a new process.
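For the generate-a-.py-then-byte-compile route, the standard library already does the heavy lifting (a sketch; generated_module.py stands in for whatever your generator emits):

import py_compile

# Write out the generated source, then byte-compile it in the usual fashion.
with open("generated_module.py", "w") as f:
    f.write("class Widget:\n    kind = 'auto-generated'\n")

py_compile.compile("generated_module.py", cfile="generated_module.pyc")
# A normal `import generated_module` (or loading the .pyc directly) now works.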
The “only” other task is constructing the code objects for the module and each class: this requires concatenating a large number of boring values from the dis module, and will fail if any object encountered is non-trivial. These might be global/static variables/constants or default argument values: if you can alter your generator to produce modules directly, you can probably wrap all of these (along with anything else you want to defer) in function calls by compiling something like
my_global=(lambda: open(os.devnull,'w'))()
so that you actually emit the function and then a call to it. If you can’t so alter it, you’ll have to have rules to recognize values that need to be constructed in this fashion so that you can replace them with such calls.
Another detail that may be important is closures: if your generator uses local functions/classes, you’ll need to create the cell objects, perhaps via “fake” closures of your own:
def cell(x): return (lambda: x).__closure__[0]
I've considered storing the high scores for my game as variables in the code itself rather than in a text file, as I've done so far, because it means fewer additional files are required to run it and attributing 999999 points becomes harder.
However, this would then require me to run self-modifying code to overwrite the global variables representing the scores permanently. I looked into that, and considering that all I really want to do is change global variables, all the stuff I found was too advanced.
I'd appreciate if someone could give me an explanation on how to write self-modifying Python code to do just that, preferably with an example too as it aids understanding.
My first inclination is to say "don't do that". Self-modifying Python (really any language) makes it extremely difficult to maintain a versioned library.
You make a bug fix and need to redistribute - how do you merge data you stored via self-modification?
It's very hard to authenticate packaging using a hash - once the local version is modified, it's hard to tell which version it originated from because the SHAs won't match.
It's unsafe - you could just save and load a Python class that's not stored with your package; however, if it's user-writable, a foreign process could add arbitrary Python code to that file to be evaluated. Kind of like SQL injection, but Python-style.
Python makes it so trivial to load and dump JSON files that, for simple things, I wouldn't think of anything else. Even CSV files are trivial and can be bound to maps, but they can be more easily manipulated as data using your favorite spreadsheet editor.
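For example, a high-score file is a one-liner in each direction (a sketch; the filename is arbitrary):

import json

def save_scores(scores, path="scores.json"):
    with open(path, "w") as f:
        json.dump(scores, f)

def load_scores(path="scores.json"):
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return []       # no scores saved yet

save_scores([9001, 120, 42])
print(load_scores())    # [9001, 120, 42]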
My suggestion - don't use self-modifying Python unless you just want to experiment; it's not a practical solution in the real world, unless you're working in an embedded environment where disk and memory are at a premium.
After reading the Software Carpentry essay on Handling Configuration Files I'm interested in their Method #5: put parameters in a dynamically-loaded code module. Basically I want the power to do calculations within my input files to create my variables.
Based on this SO answer on how to import a string as a module, I've written the following function to import a string, open file object or StringIO as a module. I can then access variables using the . operator:
import imp

def make_module_from_text(reader):
    """Make a module from a file, StringIO, text, etc.

    Parameters
    ----------
    reader : str or file-like object
        Object to get the source text from.

    Returns
    -------
    m : module
        The text as a module.
    """
    # for making a module out of strings/files see https://stackoverflow.com/a/7548190/2530083
    mymodule = imp.new_module('mymodule')  # may need to randomise the name; not sure
    source = reader.read() if hasattr(reader, 'read') else reader
    exec(source, mymodule.__dict__)
    return mymodule
then
import textwrap
reader = textwrap.dedent("""\
import numpy as np
a = np.array([0,4,6,7], dtype=float)
a_normalise = a/a[-1]
""")
mymod = make_module_from_text(reader)
print(mymod.a_normalise)
gives
[ 0. 0.57142857 0.85714286 1. ]
All well and good so far, but having looked around it seems using Python's eval and exec introduces security holes if I don't trust the input. A common response is "Never use eval or exec; they are evil", but I really like the power and flexibility of executing the code. Using {'__builtins__': None} I don't think will work for me, as I will want to import other modules (e.g. import numpy as np in my above code). A number of people (e.g. here) suggest using the ast module, but I am not at all clear on how to use it (can ast be used with exec?). Are there simple ways to whitelist/allow specific functionality (e.g. here)? Are there simple ways to blacklist/disallow specific functionality? Is there a magic way to say "execute this, but don't do anything nasty"?
Basically what are the options for making sure exec doesn't run any nasty malicious code?
EDIT:
My example above of normalising an array within my input/configuration file is perhaps a bit simplistic as to what computations I would want to perform within my input/configuration file (I could easily write a method/function in my program to do that). But say my program calculates a property at various times. The user needs to specify the times in some way. Should I only accept a list of explicit time values, so the user has to do some calculations before preparing the input file? (Note: even using a list as a configuration variable is not trivial; see here.) I think that is very limiting. Should I allow start-end-step values and then use numpy.linspace within my program? I think that is limiting too; what if I want to use numpy.logspace instead? What if I have some function that can accept a list of important time values and then nicely fills in other times to get well spaced time values? Wouldn't it be good for the user to be able to import that function and use it? What if I want to input a list of user-defined objects?
The thing is, I don't want to code for all these specific cases when the functionality of Python is already there for me and my user to use. Once I accept that I do indeed want the power and functionality of executing code in my input/configuration file, I wonder if there is actually any difference, security-wise, between using exec vs importlib vs imp.load_source and so on. To me there is the limited standard configparser or the all-powerful, all-dangerous exec. I just wish there were some middle ground with which I could say 'execute this... without stuffing up my computer'.
"Never use eval or exec; they are evil". This is the only answer that works here, I think. There is no fully safe way to use exec/eval on an untrusted string string or file.
The best you can do is to come up with your own language, and either interpret it yourself or turn it into safe Python code before handing it to exec. Be careful to proceed from the ground up --- if you allow the whole Python language minus specific things you thought of as dangerous, it will never be really safe.
For example, you can use the ast module if you want Python-like syntax; and then write a small custom ast interpreter that only recognizes a small subset of all possible nodes. That's the safest solution.
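To give a flavour of that approach, here is a minimal sketch of an ast-based evaluator that only accepts numbers, whitelisted names and basic arithmetic; anything else is rejected. It is illustrative, not a hardened sandbox:

import ast
import operator

# Only these binary operators are recognised; everything else raises.
_ALLOWED_BINOPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
}

def safe_eval(expression, names=None):
    names = names or {}

    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.Name) and node.id in names:
            return names[node.id]
        if isinstance(node, ast.BinOp) and type(node.op) in _ALLOWED_BINOPS:
            return _ALLOWED_BINOPS[type(node.op)](_eval(node.left), _eval(node.right))
        raise ValueError("disallowed expression: %s" % ast.dump(node))

    return _eval(ast.parse(expression, mode="eval"))

print(safe_eval("a * 2 + 1", names={"a": 21}))   # 43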
If you are willing to use PyPy, then its sandboxing feature is specifically designed for running untrusted code, so it may be useful in your case. Note that there are some issues with CPython interoperability mentioned that you may need to check.
Additionally, there is a link on this page to an abandoned project called pysandbox, explaining the problems with sandboxing directly within python.
I want to modify QuickTime files, so I'm working with quicktime.py, but it only parses information. It doesn't know how to write/modify things.
In C, struct records are actually very powerful - you get 4 things for the cost of 1 readable definition:
1. Define names for each variable.
2. Define serializable types and order for each variable (let's ignore machine-specific shenanigans for this discussion).
3. Pack (write).
4. Unpack ('parse').
In python, the struct module can do numbers 2-4 but you need to do extra work to make python define names for both packing and unpacking based on 1 definition (DRY).
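For example, a namedtuple plus a single format string gets most of the way there for flat records (a sketch; the names and format are illustrative):

import struct
from collections import namedtuple

# One definition supplies the names; one format string supplies types/order,
# pack and unpack.
AtomHeader = namedtuple("AtomHeader", "size type")
_HEADER_FMT = ">I4s"     # big-endian 32-bit size, then a 4-byte type code

def unpack_header(buf):
    return AtomHeader._make(struct.unpack(_HEADER_FMT, buf))

def pack_header(header):
    return struct.pack(_HEADER_FMT, *header)

header = unpack_header(b"\x00\x00\x00\x10ftyp")
print(header)                                 # AtomHeader(size=16, type=b'ftyp')
print(pack_header(header._replace(size=32)))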
OTOH ctypes is able to do 1-4 (3-4 aren't exactly in the stdlib but they're easier to patch in using this) and ctypes supports nesting.
I understand that if more complex parsing/serializing is needed, specific code will be written. But it still seems to me that I should be able to explain to Python what a file looks like and it could do the rest (pack/unpack). The problem is that ctypes is advertised as a "foreign function library", so it's not "supposed" to do this. Another issue is that ctypes probably won't work well with a HUGE file where you just want to seek to wherever and change a few bits, though I haven't tested this.
Here's the question: what's the DRY way to read and modify binary formats in python?
Maybe try Hachoir?
Try Construct, it does exactly what you want.
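A rough sketch of Construct's declarative style, assuming a reasonably recent release (the field names here are illustrative):

from construct import Struct, Int32ub, Bytes

# One declaration gives you named fields plus parse (unpack) and build (pack).
atom_header = Struct(
    "size" / Int32ub,
    "type" / Bytes(4),
)

parsed = atom_header.parse(b"\x00\x00\x00\x10ftyp")
print(parsed.size, parsed.type)                       # 16 b'ftyp'
rebuilt = atom_header.build(dict(size=32, type=b"ftyp"))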
I cannot understand it. Very simple and obvious functionality:
You have code in any programming language and you run it. In this code you generate variables, then you save them (the values, names, namely everything) to a file with one command. When it's saved, you can open such a file in your code, also with a simple command.
It works perfectly in Matlab (save workspace, load workspace) - in Python there's some weird "pickle" protocol, which produces errors all the time, while all I want to do is save a variable and load it again in another session (?????)
e.g. you cannot save a class with variables (in Matlab there's no problem)
You cannot load arrays in cPickle (but you can save them (?????))
Why not make it easier?
Is there a way to save the current variables with values, and then load them?
What you are describing is a Matlab environment feature, not a programming language feature.
What you need is a way to store the serialized state of some object, which can easily be done in almost any programming language. In the Python world, pickle is the easiest way to achieve it, and if you could provide more details about the errors it produces for you, people would probably be able to give you more details on that.
In general, for object-oriented languages (including Python) it is always a good approach to encapsulate your state in a single object that can be serialized and de-serialized, and then store/load an instance of that class. Pickling and unpickling of such objects works perfectly for many developers, so this must be something specific to your implementation.
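For example (a minimal sketch; here the "workspace" is just a dict of whatever you want to keep between sessions):

import pickle

state = {"scores": [10, 42, 7], "name": "session-1"}

# Save in one session...
with open("workspace.pkl", "wb") as f:
    pickle.dump(state, f)

# ...and load in another.
with open("workspace.pkl", "rb") as f:
    restored = pickle.load(f)

print(restored["scores"])   # [10, 42, 7]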
Since you're talking about Matlab, you probably want to try out IPython, which is a shell for Python offering much more functionality than the standard interpreter shell you get when executing Python.
Among this functionality is the ability to load/save workspace sessions, create macros out of session input etc., which is probably more like what you are used to in Matlab (I actually use both and find IPython to be much more elegant, but YMMV):
http://ipython.scipy.org
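For instance, IPython's %store magic (from the bundled storemagic extension; it may need to be enabled in your configuration) persists variables between sessions:

In [1]: scores = [10, 42, 7]
In [2]: %store scores        # saved to IPython's database
# ...later, in a new session:
In [1]: %store -r scores     # restore it
In [2]: scores
Out[2]: [10, 42, 7]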
PiCloud has implemented a fancier pickle, but I can't find the code. I saw a poster session on it.
Generally, in Python, instantiated objects don't have any one way to recreate them, and in some cases it's particularly difficult (like an open file) as it takes several steps to recreate them.
I take issue with the statement that the saving of variables in Matlab is an environment feature. The "save" statement in Matlab is a function and part of the Matlab language, not just a command. It is a very useful function, as you don't have to worry about the trivial minutiae of file I/O, and it handles all sorts of variables, from scalars and matrices to objects and structures.
It's an old thread, but I thought I should throw it out there anyway - Spyder, the Scientific Python development environment, allows you to do just this through the Variable explorer. There's a Save data button there that packs your whole workspace up into a .spydata file that you can later reload. Works like a charm when you're switching between projects!