I have a Python file, for example named blah.py, that I would like to have compiled and then placed into another folder using CMake. Right now, I can do this with the following code in my CMake file:
ADD_CUSTOM_TARGET(output ALL /usr/bin/python -m py_compile src/blah.py
COMMAND /bin/mv src/blah.pyc build VERBATIM)
This is on Ubuntu 12.04. The code works as intended; the only problem is that the Python file is compiled in the source directory and then moved to the build directory.
However, I can't assume that the src directory will have both read and write permissions, so I need to combine these two commands into one: compile the Python file and place the compiled file directly into my build directory, instead of compiling it in the src directory and then moving it.
I'm sure there must be some way to specify where the compiled code should be placed, but I can't find one. Help would be greatly appreciated! :)
EDIT: This link may have a solution..not sure:
Can compiled bytecode files (.pyc) get generated in different directory?
I was typing out this answer and then looked at your edited link. The same answer is given in one of the unaccepted answers there: https://stackoverflow.com/a/611995/496445
import py_compile
py_compile.compile('/path/to/source/code.py', cfile='/path/to/build/code.pyc')
To call this via a basic shell command you can format it like this:
python -c "import py_compile; py_compile.compile('/path/to/source/code.py', cfile='/path/to/build/code.pyc')"
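A minimal, self-contained sketch of the same idea (the temporary directories stand in for the real src/ and build/ trees): passing cfile= makes py_compile write the byte-code straight into the build directory, so nothing is ever written next to the source file.

```python
import os
import py_compile
import tempfile

# Stand-ins for the real src/ and build/ directories.
src_dir = tempfile.mkdtemp()
build_dir = tempfile.mkdtemp()

src = os.path.join(src_dir, "blah.py")
with open(src, "w") as f:
    f.write("print('hello')\n")

# cfile= places the byte-code directly in the build directory,
# so the source directory only needs to be readable.
out = py_compile.compile(src, cfile=os.path.join(build_dir, "blah.pyc"))
print(out)
```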
I am new to compiling Python, and I am trying to get compiled scripts working with Python 3.9 (with Miniconda installed).
Here is my main.py in the root folder of my project.
from say_hello import just_say_hello
print('Hi')
just_say_hello()
Here is say_hello.py
def just_say_hello():
    print('Hello')
The files are in the same folder. In that folder, I run the following command-line statement to compile:
python -m compileall .
Afterwards, I run the following command-line statement to run it:
cd __pycache__
python .\main.cpython-39.pyc
This results in the following error:
ModuleNotFoundError: No module named 'say_hello'
Why can this module not be found when running the script from the __pycache__ folder? Both compiled script files are in there, so it should not be a big problem. When running the normal original script, everything works fine; when using the compiled files, things suddenly break. However, I really need to compile.
How can I compile this Python script in such a way that I can run it without these issues?
I tried several things. For example, I renamed the .pyc files to normal Python file names. After running
python main.py
I received the following error:
ValueError: source code string cannot contain null bytes
So I really need a solution to run my compiled multi-file script.
Python is an interpreted language, which means you don't have to compile your scripts before running them (even though they are technically still compiled). When you run a Python file, it is compiled to byte code (instructions for Python's own virtual machine, not native machine code) and then run. When other Python files are imported by the file you are running, their compiled byte code is cached so they don't have to be recompiled the next time they are imported; that is what the .pyc files are. I would not recommend deleting them. I hope this helps.
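To see that the compiled form is byte-code for Python's virtual machine rather than native machine code, you can disassemble a function with the standard dis module; the instruction listing below is exactly what gets cached in the .pyc files:

```python
import dis
import io

def just_say_hello():
    print('Hello')

# dis prints the byte-code instructions the interpreter executes;
# capture them in a buffer so we can inspect the listing.
buf = io.StringIO()
dis.dis(just_say_hello, file=buf)
listing = buf.getvalue()
print(listing)
```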
Module import is one of many things that are wrong with Python, and just like significant whitespace, 'Python' seems proud of it.
Having said that, I usually solve my import problems by adding everything to the PYTHONPATH: either in the environment variable (the details depend on the OS) or in the code itself with sys.path.append('..').
Expect new problems when you reboot or move your application to a different machine or even a different directory. Or when you want to make your application cross-platform.
Test carefully for these scenarios.
PS. There is also a somewhat older answer that explains different ways of importing here: How can I import other Python files?
The byte-code files in __pycache__ have interpreter-tagged names (e.g. say_hello.cpython-39.pyc), and the import system only uses them as a cache alongside the corresponding .py sources; it will not find say_hello in a bare __pycache__ directory.
To make the compiled files work on their own, copy them out of __pycache__ and give them plain names (main.pyc, say_hello.pyc) in the same directory, or simply move them next to the uncompiled versions.
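One way to make compiled files runnable without their sources, sketched with throwaway copies of the two files from the question (all paths are temporary stand-ins): compile with compileall, then copy the .pyc files out of __pycache__ under plain names so the import system can find say_hello.

```python
import compileall
import os
import shutil
import subprocess
import sys
import tempfile

workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "say_hello.py"), "w") as f:
    f.write("def just_say_hello():\n    print('Hello')\n")
with open(os.path.join(workdir, "main.py"), "w") as f:
    f.write("from say_hello import just_say_hello\njust_say_hello()\n")

# compileall writes tagged files like say_hello.cpython-39.pyc
# into __pycache__.
compileall.compile_dir(workdir, quiet=1)

# Copy them out under plain names (say_hello.pyc) so they are
# importable without the .py sources.
dist = tempfile.mkdtemp()
cache = os.path.join(workdir, "__pycache__")
for name in os.listdir(cache):
    plain = name.split(".")[0] + ".pyc"
    shutil.copy(os.path.join(cache, name), os.path.join(dist, plain))

result = subprocess.run([sys.executable, os.path.join(dist, "main.pyc")],
                        capture_output=True, text=True)
print(result.stdout)
```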
What's the difference between running a Python file as a module versus running it directly as a script? In particular, I'm wondering what the difference is between running
python -m filename vs
python filename.py
I'm reading the documentation here: https://docs.python.org/3.6/using/cmdline.html but it's not entirely clear to me.
In particular, I notice that when I'm running a file I wrote that imports other modules I've written, it works when I run python -m filename but when I run python filename.py it says it can't find the module I've written. Why is this? Is this something to do with the path?
I am not a Python guy, but I read something in the link you provided that may offer some explanation.
If this option is given, the first element of sys.argv will be the
full path to the module file (while the module file is being located,
the first element will be set to "-m"). As with the -c option, the
current directory will be added to the start of sys.path.
I guess what that means is that the directory where you run python -m filename is added to the system path. The sys.path (or system path) is basically a list of paths (folders) that Python will search for the file you are trying to import. I am assuming the files you import are in the same folder where you run python -m filename. Running python without -m instead prepends the script's own directory to sys.path, not the current directory.
You can read more on this here https://docs.python.org/3.6/library/sys.html#sys.path
Hope this is what you want.
When I call
python3.5 my_script.py
where does Python look for my_script.py? Just in the current working directory or is there a similar mechanism as with PYTHONPATH for import or PATH if I use the script with a shebang line?
(Before you consider this as a duplicate question: So far, the questions and answers on StackOverflow do not consider the mentioned case!)
Calling the Python executable directly with python <something>, without any other command line arguments, will make the Python executable attempt to run the file <something>. So it will interpret that thing as a path and try to locate it.
So if you just write python my_script.py, it will look in the current directory. If you write python ../my_script.py it will look in the parent directory. If you write python /home/foo/my_script.py it will look in foo’s home directory.
As with any command, it’s a good idea to look at its manpage to get an idea of how it works:
when called with a file name argument or with a file as standard input, it reads and executes a script from that file
The argument supplied to python3.5 is a path. Here, it does not start with /, so it is a relative path, equivalent to ./my_script.py.
Python will not try to resolve the path to other locations for two reasons:
it is not common to do so
it avoids errors and security issues
Python will look in the directory from which you launched the command. If the file isn't there, Python will raise an error.
Yes, I read some of those .pyc questions, but still nothing.
I need to save a project to a CD and preferably I'd like to be able to run it right from there. Should I put the .pyc files in there or not?
No:
byte-code is not compatible across Python versions
Yes:
the project is expected to run under the same Python version
the .py files won't be changed any more in that release
it might load faster
if something doesn't fit, Python will (need to) create new .pyc files somewhere anyway
The latter one: Python will handle that and use a temp directory, won't it?
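The version-compatibility point can be checked directly: every .pyc file starts with a 4-byte magic number tied to the interpreter version, and a mismatch makes the file unusable as-is. A small sketch (the module content is a throwaway example):

```python
import importlib.util
import os
import py_compile
import tempfile

src = os.path.join(tempfile.mkdtemp(), "mod.py")
with open(src, "w") as f:
    f.write("x = 1\n")

pyc = py_compile.compile(src)

# The first four bytes of a .pyc are the interpreter's magic number;
# a different Python version writes a different value, which is why
# byte-code is not portable across versions.
with open(pyc, "rb") as f:
    magic = f.read(4)
print(magic == importlib.util.MAGIC_NUMBER)
```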
Answer: no need to include them.
If you want to compile and run:
When you compile your source code on the other machine, the .pyc files will be created there, so it is not necessary to include them on your CD. If you do include them, that's no problem either: compiling the source on the other machine will simply replace them with newly created .pyc files.
If you want to only run:
If you want to run the program without compiling it, you should convert it into an executable, for Windows or for Linux.
For Linux, to create an executable file, use the "CDE" package.
Link: http://www.pgbovine.net/cde.html
For Windows, to create an executable file, use the "py2exe" package.
Link: http://www.py2exe.org/index.cgi/WorkingWithVariousPackagesAndModules
I hope this is the answer you want.
A .pyc file is Python byte-code; it saves the compilation step, so the original code starts faster. You can omit it, since your code will still run without it.
However, if you need to run your code on different machines, possibly without any Python distribution, take a look at:
http://www.py2exe.org/ which is for Windows only
http://www.pyinstaller.org/ which works on most systems
I have personally worked with py2exe, and it is simple to use, although it produces a fairly large package, since it adds all the Python packages required to run the original script.
Hope this is helpful for you.
I have a simple Python script which imports various other modules I've written (and so on). Due to my environment, my PYTHONPATH is quite long. I'm also using Python 2.4.
What I need to do is somehow package up my script and all the dependencies that aren't part of the standard Python, so that I can email a single file to another system where I want to execute it. I know the target version of Python is the same, but it's on Linux whereas I'm on Windows. Otherwise I'd just use py2exe.
Ideally I'd like to send a .py file that somehow embeds all the required modules, but I'd settle for automatically building a zip I can just unzip, with the required modules all in a single directory.
I've had a look at various packaging solutions, but I can't seem to find a suitable way of doing this. Have I missed something?
[edit] I appear to be quite unclear in what I'm after. I'm basically looking for something like py2exe that will produce a single file (or 2 files) from a given python script, automatically including all the imported modules.
For example, if I have the following two files:
[\foo\module.py]
def example():
    print "Hello"
[\bar\program.py]
import module
module.example()
And I run:
cd \bar
set PYTHONPATH=\foo
program.py
Then it will work. What I want is to be able to say:
magic program.py
and end up with a single file, or possibly a file and a zip, that I can then copy to linux and run. I don't want to be installing my modules on the target linux system.
I found this useful:
http://blog.ablepear.com/2012/10/bundling-python-files-into-stand-alone.html
In short, you can .zip your modules and include a __main__.py file inside, which will enable you to run it like so:
python3 app.zip
Since my app is small I made a link from my main script to __main__.py.
Addendum:
You can also make the zip self-executable on UNIX-like systems by adding a single line at the top of the file. This may be important for scripts using Python3.
echo '#!/usr/bin/env python3' | cat - app.zip > app
chmod a+x app
Which can now be executed without specifying python
./app
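The manual zip-plus-shebang steps above are also automated by the standard-library zipapp module (available since Python 3.5); here is a minimal sketch using a throwaway app directory:

```python
import os
import subprocess
import sys
import tempfile
import zipapp

appdir = tempfile.mkdtemp()
with open(os.path.join(appdir, "__main__.py"), "w") as f:
    f.write("print('hello from the archive')\n")

target = os.path.join(tempfile.mkdtemp(), "app.pyz")
# create_archive zips the directory and prepends the shebang line
# for you; the result runs with 'python app.pyz' or directly.
zipapp.create_archive(appdir, target, interpreter="/usr/bin/env python3")

result = subprocess.run([sys.executable, target],
                        capture_output=True, text=True)
print(result.stdout)
```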
Use the stickytape module:
stickytape scripts/blah --add-python-path . > /tmp/blah-standalone
This will result with a functioning script, but not necessarily human-readable.
You can try converting the script into an executable file.
First, use:
pip install pyinstaller
After installation, type (be sure you are in the directory of the file of interest):
pyinstaller --onefile --windowed filename.py
This will create an executable version of your script containing all the necessary modules. You can then transfer (copy and paste) this executable to the PC or machine you want to run your script.
I hope this helps.
You should create an egg file. This is an archive of python files.
See this question for guidance: How to create Python egg file
Update: Consider wheels in 2019
The only way to send a single .py file is if the code from all of the various modules were moved into the single script, and then you'd have to redo everything to reference the new locations.
A better way of doing it would be to move the modules in question into subdirectories under the same directory as your command. You can then make sure that the subdirectory containing each module has an __init__.py that imports the primary module file. At that point you can reference things through it.
For example:
App Directory: /test
Module Directory: /test/hello
/test/hello/__init__.py contents:
import sayhello
/test/hello/sayhello.py contents:
def print_hello():
    print 'hello!'
/test/test.py contents:
#!/usr/bin/python2.7
import hello
hello.sayhello.print_hello()
If you run /test/test.py you will see that it runs the print_hello function from the module directory under the existing directory, no changes to your PYTHONPATH required.
If you want to package your script with all its dependencies into a single file (it won't be a .py file) you should look into virtualenv. This is a tool that lets you build a sandbox environment to install Python packages into, and manages all the PATH, PYTHONPATH, and LD_LIBRARY_PATH issues to make sure that the sandbox is completely self-contained.
If you start with a virgin Python with no additional libraries installed, then easy_install your dependencies into the virtual environment, you will end up with a built project in the virtualenv that requires only Python to run.
The sandbox is a directory tree, not a single file, but for distribution you can tar/zip it. I have never tried distributing the env so there may be path dependencies, I'm not sure.
You may need to, instead, distribute a build script that builds out a virtual environment on the target machine. zc.buildout is a tool that helps automate that process, sort of like a "make install" that is tightly integrated with the Python package system and PyPI.
I've come up with a solution involving modulefinder, the compiler, and the zip function that works well. Unfortunately I can't paste a working program here as it's intermingled with other irrelevant code, but here are some snippets:
from modulefinder import ModuleFinder
from zipfile import ZipFile, ZIP_DEFLATED
import os
import subprocess
import sys

zipfile = ZipFile(os.path.join(dest_dir, zip_name), 'w', ZIP_DEFLATED)
sys.path.insert(0, '.')
finder = ModuleFinder()
finder.run_script(source_name)
for name, mod in finder.modules.iteritems():
    filename = mod.__file__
    if filename is None:
        continue
    if "python" in filename.lower():
        continue
    subprocess.call('"%s" -OO -m py_compile "%s"' % (python_exe, filename))
    zipfile.write(filename, dest_path)
Have you taken into consideration the automatic script creation feature of distribute, the official packaging solution?
What you do is create a setup.py for your program and provide entry points that will be turned into executables that you will be able to run. This way you don't have to change your source layout, while still having the possibility to easily distribute and run your program.
You will find an example of this system used in a real app in gunicorn's setup.py.
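A minimal sketch of such a setup.py (the project, package, and function names are made up for illustration; modern setuptools supports the same entry_points mechanism that distribute did):

```python
from setuptools import setup, find_packages

setup(
    name="myprogram",              # hypothetical project name
    version="0.1",
    packages=find_packages(),
    entry_points={
        # On install, 'myprogram' becomes an executable script that
        # calls the main() function in myprogram/cli.py.
        "console_scripts": [
            "myprogram = myprogram.cli:main",
        ],
    },
)
```

After running pip install . against this file, a myprogram command appears on the PATH without any changes to the source layout.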