PyInstaller packaging snmpsim/snmprec with mandatory arguments

I'm trying to package snmprec as a standalone exe, run with a set of mandatory arguments; without them, the required data never resolves.
It's snmprec (from the snmpsim package), and I need to pass in the following arguments (or something like them) for all the dependencies to resolve:
--agent-udpv4-endpoint=localhost:161 --output-file=sql.snmprec --variation-module=sql --variation-module-options=dbtype:sqlite3,database:snmpsim.db,dbtable:snmprec --community=public
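One common reason PyInstaller builds of snmprec fail to resolve everything is that the variation modules (sql here) are imported dynamically by name, so PyInstaller's static analysis never sees them and they must be declared as hidden imports. A hedged spec-file sketch, not a tested build recipe; the hiddenimports paths are assumptions, so verify the actual module locations in your snmpsim installation:

```python
# snmprec.spec -- minimal sketch; adjust script path and module names.
a = Analysis(
    ['snmprec.py'],
    hiddenimports=[
        'snmpsim.variation.sql',  # loaded dynamically via --variation-module=sql
        'sqlite3',                # pulled in at runtime by dbtype:sqlite3
    ],
)
pyz = PYZ(a.pure)
exe = EXE(pyz, a.scripts, a.binaries, a.zipfiles, a.datas, name='snmprec')
```

Build with pyinstaller snmprec.spec, then test the resulting exe with the full argument set above.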


How to use pydeps to produce a diagram showing functions dependencies?

I have a small project in which I want to depict the dependencies between the functions.
I tried using pydeps for this, but I get a very nice diagram of the modules in the project, not of how the functions interact.
This is how I invoke pydeps ...
pydeps --include-missing --show-deps --max-bacon 4 ./test/
Inside of ./test I have a set of python modules including an __init__.py.
I'm not even 100% sure that functional dependency depiction is what pydeps is meant to do ... the documentation only shows links between modules.
Can pydeps do this? If not, can something else?
Things I've tried
I tried pycallgraph. Although I installed it in a pipenv environment, I couldn't follow the directions without installing it via apt-get. When I then try to use it, it complains that it can't find 'pandas', which is installed within the pipenv environment but clearly not visible to it. I don't want to install pandas globally (which I assume would resolve this issue).
I tried pyan3 and got the message __init__() got multiple values for argument 'root', which I wasn't sure how to interpret; it is referenced at https://github.com/Technologicat/pyan/issues/64 without any useful resolution.
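pydeps itself stays at module granularity. If only direct, same-module call edges are needed, a rough static pass over the AST gets partway there — a minimal sketch (it misses method calls, attribute access, and anything dynamic):

```python
# Walk a module's AST and record which function names each
# function body calls directly. Static analysis only.
import ast

def function_calls(source: str) -> dict:
    tree = ast.parse(source)
    graph = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            calls = {
                n.func.id
                for n in ast.walk(node)
                if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)
            }
            graph[node.name] = sorted(calls)
    return graph

src = """
def helper():
    pass

def main():
    helper()
    print("hi")
"""
print(function_calls(src))  # {'helper': [], 'main': ['helper', 'print']}
```

Feeding the result into graphviz would give a function-level counterpart to the module diagram pydeps draws.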

Can I zip PySpark dependencies containing some setuptools.Extension?

I am attempting to include the dateparser package in a PySpark (v2.4.3) shell session via a short zip build process: pip install -r requirements.txt -t some_target && cd some_target && zip -r ../deps.zip . && cd .., after which I would run, for example, pyspark --py-files deps.zip. When importing dateparser, though, I get an indirect ModuleNotFoundError from the regex library, whining that "No module named 'regex._regex'" (the stack trace says this is referenced in /mnt/tmp/spark-some/long/path/deps.zip/regex/_regex_core.py line 21, which is of course referenced much farther up the stack by dateparser).
I tried adding a flag to the dateparser line in requirements.txt, dateparser --no-binary=regex, but the error persisted. A normal Python shell is able to import it without issue, and other packages in this zip seem importable in the PySpark shell without issue. This led me down a number of rabbit holes, but I think/hope I have finally found the culprit: namely, that regex._regex is not a normal .py file but a .so. My knowledge of the Python build process is limited, but it seems the regex library's setup.py uses the setuptools.Extension class to compile some C files into this shared object.
I have seen suggestions to modify the LD_LIBRARY_PATH environment variable to make those shared objects discoverable to Python, but a number of comments also suggested this is dangerous and not a viable long-term solution. The fact that a normal interactive Python session has no issue with the import also makes me skeptical, since LD_LIBRARY_PATH doesn't even exist in os.environ in that shell. I'm left wondering whether --py-files is insufficient for including packages that compile these Extension objects (which seems unlikely, since plenty of people do crazier things than my simple use case), or whether this actually stems from neglecting some other setting.
Merci mille fois (a thousand thanks) for any and all help :)
The error appears to stem from the import system not being able to load binary (.so) files from within a zip archive, i.e., the deps.zip that I pass with the --py-files parameter. I first tried pulling the regex dependency out and building a .whl to include in --py-files, only to discover that my version of PySpark (v2.4.3) predates wheel support. I was, however, able to build an .egg from the source code, then set the PYTHON_EGG_CACHE and PYTHON_EGG_DIR environment variables for spark.executorEnv and spark.driverEnv. I'm not sure the last step is necessary for others; it seems to have stemmed from odd permissions issues that may just apply to my user/group/use case.
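The underlying limitation can be demonstrated directly: zipimport executes pure-Python modules straight from an archive, but it cannot load compiled extension modules (.so/.pyd), which must exist as real files for the OS dynamic loader. A small demo of the working half (filenames are illustrative):

```python
# Build a zip containing a pure-Python module and import from it.
import os
import sys
import tempfile
import zipfile

tmp = tempfile.mkdtemp()
archive = os.path.join(tmp, "deps.zip")
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("puremod.py", "VALUE = 42\n")

sys.path.insert(0, archive)  # same mechanism --py-files uses
import puremod
print(puremod.VALUE)  # 42

# A compiled file like regex/_regex.so inside the same archive would
# fail to import, which is why extracting to a real directory (the
# .egg + egg-cache route above) works where the plain zip does not.
```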

What does ${python3:Depends} mean in a debian source-package control file?

I'm trying to build a .deb from a Python package. In order to do so, I have to configure a control file. The control file contains a line where you can define dependencies for your package, e.g.:
Depends: python-appindicator, python3-yaml (>=3.11), ${misc:Depends}, ${python3:Depends}
The dependency definition for python3-yaml is easy to understand, but what do ${misc:Depends} and ${python3:Depends} stand for?
This means that during the build process the variable ${python3:Depends} is replaced with the guessed Python 3 dependencies of the package; dh_python3 does this. It tries to guess the dependencies of a package containing such an entry by looking for a requires.txt file within the build directory, for example at debian/python-foo/usr/lib/python3.4/dist-packages/foo-0.0.1.egg-info/requires.txt, and then translating it into Debian-style dependencies. ${misc:Depends} covers dependencies introduced by debhelper itself (by some of the dh_* utilities).
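As a concrete illustration, the substitution turns a generic source entry into package-specific dependencies at build time (the expanded package names below are invented for the example):

```
# debian/control (source package)
Depends: ${misc:Depends}, ${python3:Depends}

# What the built binary package might end up declaring,
# after dh_python3 and dpkg-gencontrol run:
Depends: python3:any (>= 3.4~), python3-yaml (>= 3.11), python3-requests
```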

packaging python application for linux

I have made a GUI application using Python and PyQt5. I want to package this app, but there doesn't seem to be a straightforward way to do it. Moreover, what I have found are answers on packaging a Python module, not an application. I have read various articles and the official docs but still don't have a proper answer. There are several workarounds through which I could achieve the same result; I just want to know the standard way.
This is my directory structure:
Moodly/
    Moodly/
        __init__.py
        controller.py
        logic.py
        models.py
        view.py
        resoure.py
        style.py
        sounds/
            notify.wav
            message.wav
    setup.py
    MANIFEST.in
    setup.cfg
    run.py
    moodly.png
    Moodly.desktop
What I want to achieve: the user is given a tar file of Moodly, extracts it, and runs the command
python setup.py install
in the terminal; the setup places all the files in the proper places and creates a Moodly.desktop file, probably in /usr/local/share/applications, clicking on which the user can run the app.
My way of achieving this:
setup.py
from setuptools import setup

setup(
    name="Moodly",
    version="1.0",
    author="Akshay Agarwal",
    author_email="agarwal.akshay.akshay8@gmail.com",
    packages=["Moodly"],
    include_package_data=True,
    url="http://github.com/AkshayAgarwal007/Moodly",
    entry_points={
        'gui_scripts': [
            'moodly = Moodly.controller:main',
        ],
    },
    # license="LICENSE.txt",
    description="Student Intimation system",
    # long_description=open("README.txt").read(),
    # Dependent packages (distributions)
)
MANIFEST.in
include Moodly/sounds/notify.wav
include Moodly/sounds/message.wav
Now with no setup.cfg I run the command:
python setup.py install
This successfully installs Moodly to /usr/lib/python-3.4/site-packages, along with the sounds directory. Now, when I type moodly in the terminal (as specified in entry_points in setup.py), my GUI application launches successfully.
Next I need the setup to place Moodly.desktop, along with moodly.png, in /usr/local/share/applications, which I am trying to achieve through this:
setup.cfg
[install]
install_data=/usr/local/share/applications
Adding this to setup.py
data_files=[("Moodly", ["moodly.png", "Moodly.desktop"])],
But this somehow copies the files into python-3.4/site-packages/Moodly rather than the specified destination, although it used to work well with distutils.
This guy also seems to have faced the same issue
Some other links I have used:
python-packaging
starting with distutils
So, of the way I am trying to do it, how much is correct, and what is the standard way? How can I place Moodly.desktop in the right location, or what would be a better way to do the entire process?
Moreover, would using PyInstaller be a better idea? PyInstaller would bundle the app with PyQt5, requests, and beautifulsoup4 (external modules I have used), which I don't want. I want to use the install_requires option provided by setuptools and not unnecessarily make users download modules they might already have.
The .desktop file isn't supposed to be installed using Distutils. Distutils is only concerned with installing Python packages.
To install .desktop files, icons and other files incidental to distribution level packaging, you should look at build automation systems, such as CMake.
The first step in this process is to get CMake to build a Python project. You should take a look here for how to do that: https://bloerg.net/2012/11/10/cmake-and-distutils.html
Beyond that, installing .desktop files is easy. Assuming you've written a .desktop file and put it somewhere, installing it is a matter of doing:
install(PROGRAMS com.akshay.moodly.desktop DESTINATION ${XDG_APPS_INSTALL_DIR})
in your CMakeLists.txt file.
Note that you install the .desktop file to ${XDG_APPS_INSTALL_DIR} (that's a CMake variable), not a hardcoded path like /usr/local/share/applications or something. The user (and pretty much every automated distro package builder) will always install your package to a temporary path and then copy files over into their packages. Never assume that your app will live in /usr/bin or /usr/local/bin or whatever. The user could install things into /opt/Moodly or even $HOME/Moodly.
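Putting the answer together, a minimal CMakeLists.txt sketch (illustrative only; XDG_APPS_INSTALL_DIR is not a CMake built-in — it comes from KDE's install-dirs modules, so define it yourself if you are not using those):

```cmake
cmake_minimum_required(VERSION 3.5)
project(Moodly NONE)

# Not built into CMake: define the .desktop install dir relative to the
# prefix so packagers can redirect it via CMAKE_INSTALL_PREFIX.
set(XDG_APPS_INSTALL_DIR "${CMAKE_INSTALL_PREFIX}/share/applications"
    CACHE PATH "Install directory for .desktop files")

install(PROGRAMS Moodly.desktop DESTINATION "${XDG_APPS_INSTALL_DIR}")
install(FILES moodly.png
        DESTINATION "${CMAKE_INSTALL_PREFIX}/share/icons/hicolor/256x256/apps")
```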

Why doesn't ./configure work in python setup.py?

When using subprocess.call(["./configure"]) and then subprocess.call(["make"]) in a python setup.py file, why might autotools look for the wrong version of automake? We are calling:
$ python setup.py install
....
WARNING: 'automake-1.13' is missing on your system.
You should only need it if you modified 'Makefile.am' or
'configure.ac' or m4 files included by 'configure.ac'.
The 'automake' program is part of the GNU Automake package:
<http://www.gnu.org/software/automake>
It also requires GNU Autoconf, GNU m4 and Perl in order to run:
<http://www.gnu.org/software/autoconf>
<http://www.gnu.org/software/m4/>
<http://www.perl.org/>
Short answer: turn AM_MAINTAINER_MODE off with --disable-maintainer-mode.
Long answer: Despite the version difference, it should not error out, since it works fine on the command line. Something in the Python packaging process is interfering.
When you do
$ python setup.py sdist
the setuptools module creates hard links, makes a tar archive from them, then deletes the hard links. Along the way, the timestamps on the files get modified and no longer match the original modification times, creating the illusion that some of the source files have been modified.
When the Makefile is run, it notices the timestamp difference. If AM_MAINTAINER_MODE is enabled, it runs the missing script. This script then detects the difference in versions of aclocal, causing make to error out.
Passing the --disable-maintainer-mode option to the configure script should suppress the invocation of the missing script and allow the build to succeed:
# check_call raises CalledProcessError instead of silently continuing
subprocess.check_call(["./configure", "--disable-maintainer-mode"])
subprocess.check_call(["make"])
(See here for more information about automake's maintainer mode. Apparently the timestamp business is also a problem with users of CVS.)
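The trigger is nothing more than an mtime comparison between a generated file and its prerequisite. A small sketch of the check make effectively performs (filenames are illustrative):

```python
# Simulate the post-sdist state: configure.ac stamped newer than the
# generated configure, which is what trips maintainer-mode rules.
import os
import tempfile
import time

d = tempfile.mkdtemp()
configure = os.path.join(d, "configure")        # generated output
configure_ac = os.path.join(d, "configure.ac")  # prerequisite
for path in (configure, configure_ac):
    open(path, "w").close()

now = time.time()
os.utime(configure, (now - 100, now - 100))
os.utime(configure_ac, (now, now))

# make's rule: rebuild if any prerequisite is newer than its target.
needs_regen = os.path.getmtime(configure_ac) > os.path.getmtime(configure)
print(needs_regen)  # True -> maintainer mode would invoke 'missing'
```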
