I have a multiplatform project in which I need to ship a few third-party executables and data files that are not part of Python. In the source tree I keep them under a data directory, and the main script calls an executable using this line:
trd_prt_exe = os.path.join("tools", "syslinux", "bin", "executable_name")
It works perfectly fine while testing and developing from source. The problem comes when I distribute it using setup.py. After installing the application using setup.py, I get this error:
for path, subdirs, files in os.walk(os.path.join("tools"))
File "/usr/lib/python2.7/os.py", line 276, in walk
names = listdir(top)
TypeError: coercing to Unicode: need string or buffer, NoneType found
Clearly, Python could not find my executables under the data directory.
How can I access these executables and data files both during development and after distribution?
Update I
I could have included this earlier but simply forgot. Here is my complete project structure:
[sundar#arch multibootusb-7.0.0]$ tree
.
├── data
│ ├── multibootusb.desktop
│ └── multibootusb.png
├── LICENSE.txt
├── multibootusb
├── PKG-INFO
├── README.txt
├── scripts
│ ├── admin.py
│ ├── detect_iso.py
│ ├── __init__.py
│ ├── install_distro.py
│ ├── install_syslinux.py
│ ├── isodump.py
│ ├── multibootusb_ui.py
│ ├── qemu.py
│ ├── uninstall_distro.py
│ ├── update_cfg.py
│ └── var.py
├── setup.py
└── tools
├── checking.gif
├── mbr.bin
├── multibootusb
│ ├── chain.c32
│ ├── extlinux.cfg
│ ├── grub.exe
│ ├── memdisk
│ ├── menu.c32
│ ├── menu.lst
│ ├── syslinux.cfg
│ └── vesamenu.c32
├── multibootusb.png
├── syslinux
│ └── bin
│ ├── syslinux3
│ ├── syslinux4
│ ├── syslinux5
│ └── syslinux6
└── version.txt
Here is what I have in setup.py:
from distutils.core import setup
import os
mbusb_version = open(os.path.join("tools", "version.txt"), 'r').read().strip()
setup(
name='multibootusb',
version=mbusb_version,
packages=['scripts'],
scripts = ['multibootusb'],
platforms = ['Linux'],
url='http://multibootusb.org/',
license='General Public License (GPL)',
author='Sundar',
author_email='feedback.multibootusb@gmail.com',
description='Create multi boot Live linux on a USB disk...',
long_description = 'The multibootusb is an advanced cross-platform application for installing/uninstalling Linux operating systems on to USB flash drives.',
data_files = [("/usr/share/applications",["data/multibootusb.desktop"]),
('/usr/share/pixmaps',["data/multibootusb.png"]),
('multibootusb/tools',["tools/checking.gif"]),
('multibootusb/tools',["tools/mbr.bin"]),
('multibootusb/tools',["tools/version.txt"]),
('multibootusb/tools/multibootusb',["tools/multibootusb/chain.c32"]),
('multibootusb/tools/multibootusb',["tools/multibootusb/extlinux.cfg"]),
('multibootusb/tools/multibootusb',["tools/multibootusb/grub.exe"]),
('multibootusb/tools/multibootusb',["tools/multibootusb/memdisk"]),
('multibootusb/tools/multibootusb',["tools/multibootusb/menu.c32"]),
('multibootusb/tools/multibootusb',["tools/multibootusb/menu.lst"]),
('multibootusb/tools/multibootusb',["tools/multibootusb/syslinux.cfg"]),
('multibootusb/tools/multibootusb',["tools/multibootusb/vesamenu.c32"]),
('multibootusb/tools/syslinux/bin',["tools/syslinux/bin/syslinux3"]),
('multibootusb/tools/syslinux/bin',["tools/syslinux/bin/syslinux4"]),
('multibootusb/tools/syslinux/bin',["tools/syslinux/bin/syslinux5"]),
('multibootusb/tools/syslinux/bin',["tools/syslinux/bin/syslinux6"])]
#('multibootusb/tools',["tools/multibootusb.png"])]
)
The problem, as I found, is that the main executable script multibootusb is installed to /usr/bin/multibootusb, but the data and third-party executables end up under /usr/multibootusb/, and the other modules/scripts required by the main program are under /usr/lib/python2.7/site-packages/scripts. Therefore, the main program is unable to locate the third-party data and executables.
How can I overcome this issue? Where am I going wrong?
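One way to make the script tolerant of both layouts is to search a short list of candidate roots instead of hard-coding a relative path. A minimal sketch, assuming the installed prefix /usr/multibootusb from the layout described above (the helper name is made up for illustration):

```python
import os


def find_resource(relative, search_roots):
    """Return the first existing copy of `relative` under the given roots."""
    for root in search_roots:
        candidate = os.path.join(root, relative)
        if os.path.exists(candidate):
            return candidate
    raise IOError("resource not found: %s" % relative)


# Typical call site, replacing the hard-coded relative path:
# roots = [os.path.dirname(os.path.abspath(__file__)),  # running from source
#          "/usr/multibootusb"]                         # installed layout
# syslinux = find_resource(
#     os.path.join("tools", "syslinux", "bin", "syslinux4"), roots)
```

This keeps the source-tree behaviour intact while letting the installed program fall back to whatever target directories setup.py's data_files chose.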
Related
I have several related projects that I think will be a good fit for Python's namespace packages. I'm currently running Python 3.8 and have created the following directory structure for testing.
├── namespace-package-test.package1
│ ├── LICENSE.txt
│ ├── README.md
│ ├── setup.cfg
│ ├── setup.py
│ ├── src
│ │ └── pkg1
│ │ ├── cli
│ │ │ ├── __init__.py
│ │ │ └── pkg1_cli.py
│ │ └── __init__.py
│ └── tests
├── namespace-package-test.package2
│ ├── AUTHORS.rst
│ ├── CHANGELOG.rst
│ ├── LICENSE.txt
│ ├── README.md
│ ├── setup.cfg
│ ├── setup.py
│ ├── src
│ │ └── pkg2
│ │ ├── cli
│ │ │ ├── __init__.py
│ │ │ └── pkg2_cli.py
│ │ └── __init__.py
│ └── tests
The entire project is on a private Bitbucket (cloud) server at:
git@bitbucket.org:<my-company>/namespace-package-test.git
I would like to install, locally, only package 1. I've tried every iteration I can imagine of the following, but nothing seems to get me there. I either get a repository not found error or a setup.py not found error.
pip install git+ssh://git@bitbucket.org:<my-company>/namespace-package-test.package1.git
Is this possible?
Is my project structure correct for what I am doing?
What should the pip install command look like?
Bonus: what if I only want to install a specific spec using pipx?
pipx install "namespace-package-test.package1[cli] @ git+ssh://git@bitbucket.org:<my-company>/namespace-package-test.package1.git"
I think I figured it out... for posterity's sake:
Pip install (into virtual environment)
pip install git+ssh://git@bitbucket.org/<company name>/namespace-package-test.git/#subdirectory=namespace-package-test.package1
pipx install - with spec
pipx install "namespace-package-test.package1[cli] @ git+ssh://git@bitbucket.org/<company name>/namespace-package-test.git/#subdirectory=namespace-package-test.package1"
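For reference, the `#subdirectory=` trick only works if each sub-project's directory contains its own build metadata. A sketch of what the per-project setup.cfg might look like in this src layout (metadata values are placeholders; the `find:` directive is standard setuptools):

```ini
; namespace-package-test.package1/setup.cfg (sketch)
[metadata]
name = namespace-package-test.package1

[options]
package_dir =
    = src
packages = find:

[options.packages.find]
where = src
```

With that in place, pip builds only the sub-project named in `#subdirectory=` and ignores its siblings.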
I'm trying to convert a game that I wrote in Python, with pictures, to an exe file. If I convert the file without pictures it works, but with pictures it does not. I used PyInstaller. Is there another process for handling pictures?
The main problem is that your .exe file does not see your images.
First of all, make sure you're using relative paths to your images.
If you did, you should put the images beside your executable file, at the same relative level as with your main.py file. For example, if you have a project structure like this:
.
├── assets
│ ├── fonts-folder
│ │ ├── OFL.txt
│ │ └── Rubik-Regular.ttf
│ ├── imag1.jpg
│ └── icon.png
└── src
└── main.py
After generating file.exe, you should put it into an executable directory, so your project structure will look like this:
.
├── executable
│ └── file.exe
├── assets
│ ├── fonts-folder
│ │ ├── OFL.txt
│ │ └── Rubik-Regular.ttf
│ ├── imag1.jpg
│ └── icon.png
└── src
└── main.py
OR (NOT RECOMMENDED)
you can just put your .exe file beside your main.py file, so your project structure may look like this:
.
├── assets
│ ├── fonts-folder
│ │ ├── OFL.txt
│ │ └── Rubik-Regular.ttf
│ ├── imag1.jpg
│ └── icon.png
└── src
├── main.py
└── file.exe
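An alternative to shipping the assets folder next to the exe is to bundle the images into the executable with PyInstaller's --add-data option and resolve them at runtime. A minimal sketch (the helper name is made up; sys._MEIPASS is the temporary folder PyInstaller's one-file bootloader unpacks bundled data into):

```python
import os
import sys


def resource_path(relative):
    """Resolve a data file both in development and in a PyInstaller bundle.

    In a one-file build, PyInstaller unpacks bundled data to a temporary
    folder recorded in sys._MEIPASS; during normal development we fall
    back to the directory of the running script.
    """
    base = getattr(sys, "_MEIPASS",
                   os.path.dirname(os.path.abspath(sys.argv[0])))
    return os.path.join(base, relative)


# Build with (the separator in --add-data is ';' on Windows, ':' elsewhere):
#   pyinstaller --onefile --add-data "assets:assets" src/main.py
# then load images via e.g.:
#   resource_path(os.path.join("assets", "icon.png"))
```

This way the exe is self-contained and can be moved anywhere without dragging the assets folder along.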
I use two libraries built on top of caffe: crf-rnn (https://github.com/torrvision/crfasrnn/tree/master/python-scripts) and hed (https://github.com/s9xie/hed/blob/master/examples/hed/), the former for semantic image segmentation, the latter for contour detection. Finally, I realized how to get them to work together for object tracking, but now I face an embarrassing problem: as both are built on top of caffe, they import the same package, but each with very different content, i.e. crf-rnn uses caffe.Segmenter, which hed doesn't have, and hed uses caffe.TEST, which crf-rnn doesn't have.
Python doesn't allow importing two packages with the same name. I've tried finding a workaround by putting hed in a separate Python file and importing it in the main script, and using as to import caffe as cf for one of the packages, but so far nothing has worked.
Any suggestions?
EDIT: this is a file called Aux.py
def import_hed_caffe():
    import sys, os
    caffe_dir = '/home/alex/Downloads/hed/python'
    sys.path.insert(0, caffe_dir)
    hed_model = 'deploy.prototxt'
    hed_pretrained = 'hed_pretrained_bsds.caffemodel'
    import caffe as cf
    net = cf.Net(hed_model, hed_pretrained, cf.TEST)
    return net
This is the main script:
caffe_root = '../caffe-crfrnn/'
sys.path.insert(0, caffe_root + 'python')
import caffe as espresso
import AuxScript
net = espresso.Segmenter(MODEL_FILE, PRETRAINED, gpu=False)
a=AuxScript.import_hed_caffe()
and I get
AttributeError: 'module' object has no attribute 'TEST'
Needless to say, everything works fine separately, so it's just the import.
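For what it's worth, the AttributeError is consistent with module caching: after `import caffe as espresso`, the entry sys.modules['caffe'] already exists, so the later `import caffe as cf` inside the helper returns the crf-rnn build no matter what sys.path says. A generic sketch of evicting the cached package before re-importing (the function name is made up, and whether the two compiled _caffe extensions can actually coexist in one process is a separate question):

```python
import importlib
import sys


def load_package_variant(name, python_dir):
    """Import the copy of package `name` that lives under `python_dir`.

    Python caches imports in sys.modules by name, so a second
    `import caffe` silently returns whichever build was imported first,
    regardless of sys.path. Evicting the cached entries first forces a
    fresh import from `python_dir`.
    """
    # Drop the cached package and all of its submodules.
    for cached in list(sys.modules):
        if cached == name or cached.startswith(name + "."):
            del sys.modules[cached]
    sys.path.insert(0, python_dir)
    try:
        return importlib.import_module(name)
    finally:
        sys.path.remove(python_dir)


# e.g. hed_caffe = load_package_variant("caffe",
#                                       "/home/alex/Downloads/hed/python")
```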
EDIT 2:
./CMakeFiles
./CMakeFiles/pycaffe.dir
./CMakeFiles/pycaffe.dir/caffe
./caffe
./caffe/imagenet
./caffe/proto
./caffe/test
EDIT 3:
├── caffe
│ ├── _caffe.cpp
│ ├── _caffe.so -> /home/alex/Downloads/hed/lib/_caffe.so
│ ├── classifier.py
│ ├── classifier.pyc
│ ├── detector.py
│ ├── detector.pyc
│ ├── draw.py
│ ├── imagenet
│ │ └── ilsvrc_2012_mean.npy
│ ├── __init__.py
│ ├── __init__.pyc
│ ├── io.py
│ ├── io.pyc
│ ├── net_spec.py
│ ├── net_spec.pyc
│ ├── proto
│ │ ├── caffe_pb2.py
│ │ └── __init__.py
│ ├── pycaffe.py
│ ├── pycaffe.pyc
│ └── test
│ ├── test_layer_type_list.py
│ ├── test_net.py
│ ├── test_net_spec.py
│ ├── test_python_layer.py
│ ├── test_python_layer_with_param_str.py
│ └── test_solver.py
├── classify.py
├── CMakeFiles
│ ├── CMakeDirectoryInformation.cmake
│ ├── progress.marks
│ └── pycaffe.dir
│ ├── build.make
│ ├── caffe
│ │ └── _caffe.cpp.o
│ ├── cmake_clean.cmake
│ ├── CXX.includecache
│ ├── DependInfo.cmake
│ ├── depend.internal
│ ├── depend.make
│ ├── flags.make
│ ├── link.txt
│ └── progress.make
├── cmake_install.cmake
├── CMakeLists.txt
├── detect.py
├── draw_net.py
├── Makefile
├── requirements.txt
I have seen your last edit, and I must say that changing/tampering with Python's sys.path is necessary in your context but not sufficient here: you have to rename one of the caffe packages.
Ex: if the caffe package is a directory called caffe containing an __init__.py file, rename caffe to espresso and in your code simply:
import espresso
(if it's just a caffe.py file, rename it to espresso.py, although this may be more problematic if there are other modules in the same directory; still well worth a try)
BTW: when importing a module, say xxx, you can find out which file it was loaded from by typing:
print(xxx.__file__)
(useful when you have a doubt)
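On Python 3 there is also a middle ground that avoids renaming directories on disk: load the package's __init__.py under an alias with importlib. A sketch (the function name is made up, and this only helps if the underlying C extensions tolerate being loaded twice in one process):

```python
import importlib.util
import os
import sys


def import_package_as(alias, package_dir):
    """Import the package located at `package_dir` under the name `alias`."""
    init_py = os.path.join(package_dir, "__init__.py")
    spec = importlib.util.spec_from_file_location(
        alias, init_py,
        submodule_search_locations=[package_dir])
    module = importlib.util.module_from_spec(spec)
    sys.modules[alias] = module  # register before exec so submodules resolve
    spec.loader.exec_module(module)
    return module


# e.g. espresso = import_package_as("espresso",
#                                   "../caffe-crfrnn/python/caffe")
```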
OK, so I found the least sophisticated solution possible: I wrote two scripts. The first, for crf-rnn, produces the blobs; I ran it on the full dataset just once and stored the output.
Then I wrote the second script, with the hed edge detector, which I use every time I detect and track objects.
I want to create a PyBuilder project with unit tests and packages. As an example, I modified the simple Python app example, with "helloworld" moved to the package "hello".
My first instinct was to match the package structure with "main" and "unittest" sources:
+---src
+---main
| \---python
| \---hello
| helloworld.py
| __init__.py
|
\---unittest
\---python
\---hello
helloworld_tests.py
__init__.py
This does not work because of the conflicting "hello" package.
BUILD FAILED - 'module' object has no attribute 'helloworld_tests'
I see PyBuilder itself just skips the top-level pybuilder package in unit tests, but that won't work if there are multiple top-level packages.
My second guess would be to create an extra top-level package for unit tests.
\---unittest
\---python
\---tests
| __init__.py
\---hello
helloworld_tests.py
__init__.py
Is there a better solution or an established convention for how to organize Python tests in packages?
Probably nothing really new for the OP, but I just wanted to collect all options that I could come up with in one place:
1) Just append _tests to names of top-level packages
The easiest way to mirror the structure of src/main/python in the src/test/python almost 1:1 would be by simply appending _tests to the names of the top-level packages. For example, if I have only one top-level package rootPkg, then I can add the corresponding rootPkg_tests to the test/ subdirectory:
src
├── main
│ ├── python
│ │ └── rootPkg
│ │ ├── __init__.py
│ │ ├── pkgA
│ │ │ ├── __init__.py
│ │ │ └── modA.py
│ │ └── pkgB
│ │ ├── __init__.py
│ │ └── modB.py
│ └── scripts
│ └── entryPointScript.py
└── test
└── python
└── rootPkg_tests
├── __init__.py
├── pkgA
│ ├── __init__.py
│ └── modA_tests.py
└── pkgB
├── __init__.py
└── modB_tests.py
This seems to work nicely with PyBuilder 0.11.15 and the unittest plugin (notice that I've deviated from PyBuilder's convention and put tests in test instead of unittest; you probably shouldn't do this if you intend to use multiple testing frameworks).
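To make PyBuilder pick the tests up from test/ instead of its default unittest/ directory, the build descriptor can override the relevant source property. A sketch of the corresponding part of build.py (property names are as I recall them from PyBuilder 0.11.x, so treat them as assumptions):

```python
# build.py (fragment; assumes the unittest plugin is active)
from pybuilder.core import use_plugin, init

use_plugin("python.core")
use_plugin("python.unittest")


@init
def set_properties(project):
    # Look for tests under src/test/python instead of src/unittest/python.
    project.set_property("dir_source_unittest_python", "src/test/python")
    # Test modules are the ones ending in _tests (the naming used above).
    project.set_property("unittest_module_glob", "*_tests")
```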
2) If there is only one package: do it like PyBuilder
PyBuilder is itself built with PyBuilder. This is what its source directory looks like (drastically reduced, unnecessary details omitted):
src
├── main
│ ├── python
│ │ └── pybuilder
│ │ ├── __init__.py
│ │ ├── cli.py
│ │ ├── core.py
│ │ └── plugins
│ │ ├── __init__.py
│ │ ├── core_plugin.py
│ │ └── exec_plugin.py
│ └── scripts
│ └── pyb
└── unittest
└── python
├── cli_tests.py
├── core_tests.py
├── plugins
│ ├── exec_plugin_tests.py
│ ├── __init__.py
│ ├── python
│ │ ├── core_plugin_tests.py
│ │ ├── __init__.py
If I understand it correctly, the tree in unittest mirrors the tree in src, but the directory for the top-level package pybuilder is omitted. That's what you described in your question as the first workaround. The drawback is that it doesn't really work if there are multiple top-level packages.
3) Add one additional tests top-level package
That's what you proposed as a workaround: mirror the tree in main, but wrap everything in an additional tests package. This works with many top-level packages in /src/main/python and prevents any package name collisions.
I'm not aware of any convention. Upvote one of the comments below if you have an opinion on that matter.
I have the following file structure:
ihe/
├── dcmt
│ ├── actions
│ ├── calendar_observer
│ ├── cms
│ ├── consumption
│ ├── data_mining
│ ├── dcmt
│ ├── dcmt_db
│ ├── dcmt_db.bak.bak
│ ├── dcmt_db.sqlite
│ ├── devices
│ ├── d.py
│ ├── gadgets
│ ├── history
│ ├── houses
│ ├── hwc_settings
│ ├── __init__.py
│ ├── __init__.pyc
│ ├── manage.py
│ ├── notifications
│ ├── profitable
│ ├── rules
│ └── schedule
├── hwc
│ ├── configuration
│ ├── daemons
│ ├── database
│ ├── __init__.py
│ ├── __init__.pyc
│ ├── utils
│ └── wrapper
├── __init__.py
├── __init__.pyc
dcmt is a Django project; hwc is pure Python. However, in hwc/daemons, for instance, there is a runme.py script, and in that script I want to be able to import the models from the Django project. As I understand it, I have to have the correct Python path and then somehow set the Django settings. My question is: how do I best do this so that I only have to do it once for all of the hwc modules?
Your project structure seems a bit confused.
It's probably not a good idea to have a Django project inside another package hierarchy. A lot of the import paths assume your project is in a top-level package, and the only reason you're probably not running into issues already is that Python 2.x still supports implicit relative imports (which have been removed in 3.x). This makes references to packages very ambiguous and can cause weird bugs.
From what I can see your settings package is actually called (fully-qualified) ihe.dcmt.hwc_settings. If ihe is in your Python path (check the value of sys.path in the script you're trying to run), that (i.e. the fully-qualified path) is probably what DJANGO_SETTINGS_MODULE should point at.
If you want to hook into Django's functionality in your scripts, you might want to look into the documentation for writing manage.py commands. This would let you write Django-related scripts more consistently and save you the worry about referencing and initialising Django's settings correctly yourself.
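To do the setup exactly once for all hwc scripts, a small bootstrap helper can live in a shared module and be called at the top of each entry point. A sketch (the helper name and paths are guesses based on the tree in the question):

```python
import os
import sys


def bootstrap_django(project_parent, settings_module):
    """Make the Django project importable and point Django at its settings.

    `project_parent` is the directory that contains the `ihe` package;
    `settings_module` is the fully-qualified settings path, e.g.
    "ihe.dcmt.hwc_settings" as discussed above.
    """
    if project_parent not in sys.path:
        sys.path.insert(0, project_parent)
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", settings_module)
    # On Django >= 1.7 you would additionally call:
    #     import django; django.setup()


# e.g. at the top of hwc/daemons/runme.py:
# bootstrap_django("/path/to/parent/of/ihe", "ihe.dcmt.hwc_settings")
# from ihe.dcmt.devices.models import SomeModel  # hypothetical model import
```

A custom manage.py command, as suggested above, remains the cleaner option when the script logically belongs to the Django project itself.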