I have the following directory structure:
TestFolder:
    test.py
CommonFolder:
    common.py
In test.py, I need to import common.py. To do that, in test.py I add the path of CommonFolder to sys.path.
Here is what I started off with:
sys.path.append(os.path.join(os.path.dirname(os.path.dirname(__file__)), 'CommonFolder'))
Then I figured that / is a valid separator in pretty much every OS, so I changed it to:
sys.path.append(os.path.dirname(os.path.dirname(__file__)) + '/CommonFolder')
Then I figured that .. is also a valid syntax in pretty much every OS, so I changed it to:
sys.path.append(os.path.dirname(__file__) + '/../CommonFolder')
My questions:
Are my assumptions above correct, and will the code run correctly on every OS?
In my last change, I essentially add a slightly longer path to the system paths. More precisely: FullPath/TestFolder/../CommonFolder instead of FullPath/CommonFolder. Is there any runtime impact to this? I suppose that every import statement might execute slightly slower, but even if so, that would be minor. Is there any good reason not to do it this way?
If you're writing code to span multiple Operating Systems it's best not to try to construct the paths yourself. Between Linux and Windows you immediately run into the forward vs backwards slash issue, just as an example.
I'd recommend looking into the Python pathlib library. It handles generating paths for different operating systems.
https://docs.python.org/3/library/pathlib.html
This is a great blog about this subject and how to use the library:
https://medium.com/@ageitgey/python-3-quick-tip-the-easy-way-to-deal-with-file-paths-on-windows-mac-and-linux-11a072b58d5f
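For example, a minimal sketch of the same path built with pathlib (assuming the TestFolder/CommonFolder layout from the question):

import sys
from pathlib import Path

# Resolve test.py's location, go up to the shared parent, then into CommonFolder
common_dir = Path(__file__).resolve().parent.parent / 'CommonFolder'
sys.path.append(str(common_dir))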
UPDATE:
Updating this with a more specific answer.
Regarding the directory paths: as long as you're not building the path strings yourself (and instead use a utility such as pathlib), the paths you've created should be fine. Linux, Mac, and Windows all support relative paths (both Mac and Linux are Unix-based, of course).
As for whether it's efficient: unless you're frequently loading or reloading your source files dynamically (which is not common), modules are loaded into memory before the code runs, so setting up the file paths this way has no meaningful performance impact.
Use os.path.join() for an OS-independent path separator instead of hard-coding /.
Example: os.path.join(os.path.dirname(__file__), "..", "CommonFolder")
Or instead, you can make CommonFolder a Python package by placing an empty file named __init__.py inside CommonFolder. After that you can simply import common in test.py as:
from CommonFolder import common
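Note that for this import to work, the directory containing CommonFolder must itself be on sys.path. A minimal sketch of what test.py would need (assuming the layout from the question, where TestFolder and CommonFolder share a parent):

import os
import sys

# Put the shared parent of TestFolder and CommonFolder on sys.path
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from CommonFolder import common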
I am trying to make an adventure game in Python, and I need to know how to load an image with pygame when the image is in the same folder as the code. How do I do it? I have tried
Character = pygame.image.load('Resources/MainCharFront.png')
but I'm getting an error:
pygame.error: Couldn't open Resources/MainCharFront.png
I really need it to be in the same folder, because I am often switching devices and my file system is always different.
If you have structured your code as a Python package (which you should), you can use the pkg_resources module to access resource files like images, etc, that are part of your project.
For example, if I have the following layout:
./mypackage/__init__.py
./mypackage/main.py
./mypackage/images/character.jpg
I can write in mypackage/main.py:
import pygame
import pkg_resources
Character = pygame.image.load(
    pkg_resources.resource_filename('mypackage', 'images/character.jpg'))
You can see this in action below:
>>> import mypackage.main
pygame 1.9.6
Hello from the pygame community. https://www.pygame.org/contribute.html
>>> mypackage.main.Character
<Surface(359x359x24 SW)>
>>>
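As an aside, on Python 3.7+ the standard library's importlib.resources offers a similar facility without depending on setuptools. A rough equivalent of the above (a sketch, assuming mypackage/images is itself a package with an __init__.py):

import pygame
from importlib import resources

# resources.path yields a real filesystem path for the bundled resource
with resources.path('mypackage.images', 'character.jpg') as img_path:
    Character = pygame.image.load(str(img_path))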
In your comment you say that the image is in the same directory as your code; however, the path you are showing implies that you are trying to load it from a sub-directory called Resources:
Character = pygame.image.load('Resources/MainCharFront.png')
So you can likely fix your problem by removing that from the path and just using:
Character = pygame.image.load('MainCharFront.png')
However, that is not the approach that I would recommend. You are better off keeping the resources in a separate sub-directory like Resources to keep things organized. You said that you want to use a flat structure with everything in one folder because you move the game around between different systems with different file systems. I will assume from that that you are having issues with the path separator on these different systems. That is fairly easy to handle, though.
@larsks has suggested one way that is a good approach. You do not have to go quite that far, though, to still be able to keep structure in your resources.
The easy way to deal with different path separators on different file systems is to use os.path.join() to link your path components with the file system appropriate separator, like this:
Character = pygame.image.load(os.path.join('Resources', 'MainCharFront.png'))
This will allow you to move between Windows, Linux, etc. without having to flatten your structure. os.path.join() can take multiple path components as arguments, not just 2, so you can have as much hierarchy as you need. Just break up the path string into separate strings where the slashes would be, like this:
os.path.join('Resources', 'images', 'MainCharFront.png')
You can find the docs for os.path.join() here: https://docs.python.org/3/library/os.path.html#os.path.join
Just to be overly clear, the os.path.join() method is not the same as the standard string join() method (which joins strings using the separator you tell it to). The os.path.join() method determines the separator for you based on the system it is being run on.
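To make the difference concrete, a quick sketch (results shown for illustration):

import os

# os.path.join picks the separator for the current OS:
os.path.join('Resources', 'images', 'MainCharFront.png')
# -> 'Resources/images/MainCharFront.png' on Linux and macOS
# -> 'Resources\\images\\MainCharFront.png' on Windows

# str.join uses exactly the separator you give it, on every OS:
'/'.join(['Resources', 'images', 'MainCharFront.png'])
# -> 'Resources/images/MainCharFront.png' everywhere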
I have a Python file in "mainDirectory/subdirectory/myCode.py", and in my code I want to refer to a .csv file in "mainDirectory/some_data.csv". I don't want to use an absolute path, since I run the code on different operating systems and an absolute path may cause trouble. In short, is there a way to refer to the upper directory of the current directory using a relative path in Python?
I know how to refer to subdirectories of the current directory using a relative path, but I'm looking for a way to address upper-level directories with a relative path. I don't want to use an absolute path, since the file is going to be run in different folders on different operating systems.
Update:
I found one method here (it is not based on the relative path but it does the job):
The key is using "." for importing from enclosing directories.
For example, one dot refers to the same-level package, and two dots go one level higher. In the above case, to access "subdirectory" one can put this line in "myCode.py":
from .subdirectory import something
and to access "mainDirectory":
from ..mainDirectory import something
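Note that relative imports like these only work when the files are part of a package that is run as one (for example with python -m); each directory involved needs an __init__.py. A sketch of the layout this assumes:

mainDirectory/
    __init__.py
    some_data.csv
    subdirectory/
        __init__.py
        myCode.py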
If you look at the Python documentation, you will find that the language is not designed to be used like that. Instead, you'll want to refactor your directories to put myCode.py in a parent-level directory like mainDirectory.
mainDirectory
│   myCode.py
│
└───subdirectory
        some_data.csv
You are facing an issue with the nature of the PYTHONPATH. There is a great article on dealing with PYTHONPATH here:
https://chrisyeh96.github.io/2017/08/08/definitive-guide-python-imports.html
It can be challenging if you're new to the language (or if you've been using it for years!) to navigate Python import statements. Reading up on PEP8 conventions for importing, and making sure your project conforms to standard language conventions can save you a lot of time, and a lot of messy git commits when you need to refactor your directory tree down the line.
You can always derive the absolute path at run time using the API provided by the os package.
import os
dirname = os.path.dirname(__file__)
filename = os.path.join(dirname, 'relative/path/to/file/you/want')
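For the layout in the question, that would look something like this (a sketch, assuming myCode.py lives in mainDirectory/subdirectory):

import os

# Directory containing myCode.py: mainDirectory/subdirectory
dirname = os.path.dirname(os.path.abspath(__file__))

# Go one level up to mainDirectory and point at the csv file
csv_path = os.path.join(dirname, '..', 'some_data.csv')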
A complete answer was always a click away: Relative paths in Python
There are several tools in os.path that you can use:
from os.path import dirname
upper = dirname(dirname(__file__))
Here is a link to a more comprehensive answer on accessing upper directories:
How do I get the parent directory in Python?
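For completeness, pathlib gives you the same thing via .parent (a sketch):

from pathlib import Path

# Two levels up from this file
upper = Path(__file__).resolve().parent.parent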
The reason I want to do this is that I want to use the tool pyobfuscate to obfuscate my Python code. But pyobfuscate can only obfuscate one file.
I've answered your direct question separately, but let me offer a different solution to what I suspect you're actually trying to do:
Instead of shipping obfuscated source, just ship bytecode files. These are the .pyc files that get created, cached, and used automatically, but you can also create them manually by just using the compileall module in the standard library.
A .pyc file with its .py file missing can be imported just fine. It's not human-readable as-is. It can of course be decompiled into Python source, but the result is… basically the same result you get from running an obfuscator on the original source. So, it's slightly better than what you're trying to do, and a whole lot easier.
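For example, a sketch of compiling everything with compileall (the src directory name is illustrative):

import compileall

# legacy=True writes foo.pyc next to foo.py instead of under __pycache__,
# which is what a sourceless (.py-less) distribution needs
compileall.compile_dir('src', force=True, legacy=True)

After that, delete the .py files from the copy you ship and keep the .pyc files.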
You can't compile your top-level script this way, but that's easy to work around. Just write a one-liner wrapper script that does nothing but import the real top-level script. If you have if __name__ == '__main__': code in there, you'll also need to move that to a function, and the wrapper becomes a two-liner that imports the module and calls the function… but that's as hard as it gets. Alternatively, you could run pyobfuscate on just the top-level script, but really, there's no reason to do that.
In fact, many of the packager tools can optionally do all of this work for you automatically, except for writing the trivial top-level wrapper. For example, a default py2app build will stick compiled versions of your own modules, along with stdlib and site-packages modules you depend on, into a pythonXY.zip file in the app bundle, and set up the embedded interpreter to use that zipfile as its stdlib.
There are definitely ways to turn a tree of modules into a single module. But it's not going to be trivial. The simplest thing I can think of is this:
First, you need a list of modules. This is easy to gather with the find command or a simple Python script that does an os.walk.
Then you need to use grep or Python re to get all of the import statements in each file, and use that to topologically sort the modules. If you only do absolute flat import foo statements at the top level, this is a trivial regex. If you also do absolute package imports, or from foo import bar (or from foo import *), or import at other levels, it's not much trickier. Relative package imports are a bit harder, but not that big of a deal. Of course if you do any dynamic importing, use the imp module, install import hooks, etc., you're out of luck here, but hopefully you don't.
Next you need to replace the actual import statements. With the same assumptions as above, this can be done with a simple sed or re.sub, something like replacing import\s+(\w+) with \1 = sys.modules['\1'].
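A rough sketch of that rewrite in Python (assuming only plain import foo statements; the filename is illustrative):

import re

source = open('mymodule.py').read()

# Rewrite "import foo" into "foo = sys.modules['foo']"
rewritten = re.sub(r'^import\s+(\w+)\s*$',
                   r"\1 = sys.modules['\1']",
                   source, flags=re.MULTILINE)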
Now, for the hard part: you need to transform each module into something that creates an equivalent module object dynamically. This is the hard part. I think what you want to do is escape the entire module code so that it can be put into a triple-quoted string, then do this:
import sys
import types

mod_globals = {}
exec('''
# escaped version of original module source goes here
''', mod_globals)
mod = types.ModuleType(module_name)  # module_name: the original module's name
mod.__dict__.update(mod_globals)
sys.modules[module_name] = mod
Now just concatenate all of those transformed modules together. The result will be almost equivalent to your original code, except that it's doing the equivalent of import foo; del foo for all of your modules (in dependency order) right at the start, so the startup time could be a little slower.
You can make a tool that:
Reads through your source files and puts all identifiers in a set.
Subtracts from that set all identifiers found by recursively searching standard and third-party modules (modules, classes, functions, attributes, parameters).
Also subtracts some explicitly excluded identifiers, since they may be used in getattr/setattr/exec/eval.
Replaces the remaining identifiers with gibberish.
Or you can use this tool I wrote that does exactly that.
To obfuscate multiple files, use it as follows:
For safety, backup your source code and valuable data to an off-line medium.
Put a copy of opy_config.txt in the top directory of your project.
Adapt it to your needs according to the remarks in opy_config.txt.
This file only contains plain Python and is exec’ed, so you can do anything clever in it.
Open a command window, go to the top directory of your project and run opy.py from there.
If the top directory of your project is e.g. ../work/project1 then the obfuscation result will be in ../work/project1_opy.
Further adapt opy_config.txt until you’re satisfied with the result.
Type ‘opy ?’ or ‘python opy.py ?’ (without the quotes) on the command line to display a help text.
I think you can try using the find command with the -exec option.
You can execute all Python scripts in a directory with the following command:
find . -name "*.py" -exec python {} ';'
Hope this helps.
EDIT:
Oh sorry, I overlooked that if you obfuscate the files separately they may not run properly, because functions are renamed to different names in different files.
I'm a bit puzzled by the way paths are handled in Python. Using common constructs like "~" or "." or "..", I often run into cases where a path is not recognized as valid or existing, especially if I pass the path on as an argument to a shell command; but all of my problems go away if I always do something like:
some_path = os.path.abspath(os.path.expanduser(some_path))
Is this a common — or perhaps even required — idiom, or am I just reinventing the wheel? Should I really expect that wherever I have some_path, I should have the above code before passing it to any (or at least most) functions that do anything with it?
Yes, most things you can call will expect a path that has been run through that idiom. When you use paths like that in the shell (e.g., when you do something like cat ~raxacoricofallapatorius/foo.txt), it is the shell itself, rather than cat or any other program you might run, that does the path normalisation.
You can verify this trivially - eg,
lvc@tiamat:~/Projects$ echo ~
/home/lvc
So this does mean that if you expect to get a path with those kinds of variables as input, you will need to do the preprocessing yourself. The alternative is to run the commands through a shell, and be ready to deal with all the problems that brings.
However, at least on unix-like systems (Windows may or may not behave the same way), you don't need to do this for . and .., since these are understood by the system calls, and the shell does not transform them. So, e.g.:
lvc@tiamat:~/Projects$ file ..
..: directory
lvc@tiamat:~/Projects$ file ~
/home/lvc: directory
Notice that file sees .. unchanged, but sees the expanded form of ~.
This means that if all you want is paths that will work directly in external programs, passing them through expanduser and possibly expandvars is sufficient. You will want to call abspath primarily if the program you are calling out to will run in a different working directory than yours.
Yes, if you need an absolute path with $HOME resolved, you'll have to do that.
It should be easy enough to write a short helper function, if you require this functionality regularly. There are also path helper libraries available, like these:
https://github.com/jaraco/path.py
https://github.com/xando/python-shelltools
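A minimal sketch of such a helper (the name is illustrative):

import os

def normalize(path):
    # Expand ~user and $VARS, then anchor the result to the current working directory
    return os.path.abspath(os.path.expandvars(os.path.expanduser(path)))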
It's generally a good idea. os.path.abspath will resolve relative pieces like . and .. (it is os.path.expanduser that handles ~). If you want your code to be portable across OSes, you should be using os.path instead of defining your own path handling, if you can; os.path always points to the correct path module for the OS you are on. If you try to define your own path functions, you lose the built-in cross-platform behavior of os.path.
Is there any way to create a virtual import path in Python?
My directory structure is like this:
/
    native/
        scripts/
            some.py
            another.py
    [Other unrelated dirs]
The root is the directory from which the program is executed. At the moment I add native/scripts/ to the search path so I can do import some, another instead of from native.scripts import some, another, but I'd like to be able to do it like this:
from native import some
import native.another
Is there any way to achieve this?
Related questions:
Making a virtual package available via sys.modules
Why not move some.py and another.py out into the native directory so that everything Just Works and so that people returning to the source code later won't be confused about why things are and aren't importable? :)
Update:
Thanks for your comments; they have usefully clarified the problem! In your case, I generally put functions and classes that I might want to import inside, say, native.some, where I can easily get to them. Then I take the script code, and only the script code (the thin shim that interprets arguments and starts everything running by passing them to a main() or go() function), and put that inside the scripts directory. That keeps external-interface code cleanly separate from code that you might want to import, and means you don't have to try to fool Python into having modules in several places at once.
In /native/__init__.py, include:
from .scripts import some, another
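The leading dot makes this an explicitly relative import, which Python 3 requires inside a package. With that in place, the attribute-style import from the question works (a sketch):

from native import some      # re-exported via native/__init__.py
from native import another

Note that import native.another will still fail, though: the re-export binds an attribute on the package, but it does not register native.another as a real submodule.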