I'm trying to add a menu entry to the file manager (Nautilus) that calls my own Python/Qt program, which does some work on the selected file/folder.
I found this:
import nautilus
class ExampleMenuProvider(nautilus.MenuProvider):
    def __init__(self):
        pass

    def get_file_items(self, window, files):
        submenu = nautilus.Menu()
        item = nautilus.MenuItem('Nautilus::sbi', 'Nau-T', 'image')
        item.set_submenu(submenu)
        item_two = nautilus.MenuItem('Nautilus::s', 'www', 'image')
        submenu.append_item(item_two)
        return item,
But I couldn't find the nautilus module anywhere to install.
I also read somewhere that I should create a Nautilus extension by installing the nautilus-extension package. But if I install this package on my dev machine, how can I guarantee that it will be installed on the client machine?
Thank you for your help.
but if I install this package on my dev machine how can I guarantee that it will be installed on the client machine
There are at least two options:
Document in your README how to get the dependencies installed.
Ship that module alongside your application.
I would personally prefer the former because that would allow users to benefit from the system-wide installation on their Linux machine.
For instance, if a bug is fixed in that module, they could update it on their system without you shipping a new version or them having to meddle with the installation directory.
It ain't that bad in my opinion, and after all, you have the same situation with the Python dependency and the Qt library. Presumably, you also let your users know where they can obtain them if they have difficulties.
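For reference, here is a minimal sketch of what such an extension could look like with the GObject-introspection bindings (the python-nautilus package). The class name, the menu labels, and the myqtapp command standing in for your Python/Qt program are placeholders, and the exact get_file_items signature can differ between Nautilus releases:
# Sketch only: drop into ~/.local/share/nautilus-python/extensions/ (the path may vary per distro).
import subprocess

import gi
gi.require_version('Nautilus', '3.0')  # version string depends on the installed Nautilus
from gi.repository import Nautilus, GObject

class MyQtAppMenuProvider(GObject.GObject, Nautilus.MenuProvider):
    def get_file_items(self, window, files):
        item = Nautilus.MenuItem(
            name='MyQtAppMenuProvider::open_with_myqtapp',
            label='Open with MyQtApp',
            tip='Run MyQtApp on the selected files/folders',
        )
        item.connect('activate', self.on_activate, files)
        return [item]

    def on_activate(self, menu_item, files):
        # Hand the local paths of the selection to the external program ('myqtapp' is hypothetical).
        paths = [f.get_location().get_path() for f in files]
        subprocess.Popen(['myqtapp'] + paths)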
I'm creating a GTK4 application with Python bindings on Linux, and I'm distributing it with Flatpak.
I'm building an extension/plugin system where the user declares the main module to call in a specific file, and later on I load it. Everything works, but when the user's plugin imports external libraries like NumPy or pandas, Python looks for them inside the Flatpak sandbox, which is expected. I'm wondering how I can tell the Python interpreter to use the system's modules for the imported plugins instead of looking in the app's modules.
The user's code and requirements should be independent of the app's requirements.
This is how I'm loading the modules:
from importlib.machinery import SourceFileLoader

extension_module = SourceFileLoader(file_name, module_path).load_module()
extension_class = getattr(extension_module, class_name)
obj = extension_class()
This is an example of a loaded class; the absolute path of this module is /home/user/.extensions/ext1/module.py:
import numpy as np

class Module1:
    def __init__(self):
        self.data = np.array([1, 2, 3, 4, 5, 6])

    def get_data(self):
        return self.data
I tried using
sys.path.append('/usr/lib64/python3.10/site-packages')
The path gets added, but it refers to the sandbox environment, not the host system.
I thought about resolving user imports manually: when a user imports pandas, I would look for the installed Python package on the system and use importlib or SourceFileLoader to load it, but I don't think that's a good way to do it.
So, finally, after a day of reading the Flatpak docs, I found a way to do it.
I had to add the argument --filesystem=home, which gives the app access to the user's home directory. When you use pip to install packages, they end up in ~/.local/lib/python3.10/site-packages/. To let the Python interpreter search for packages in that folder, you can add it to the path like this:
import os
import sys

# expanduser() is needed because entries in sys.path are not tilde-expanded.
sys.path.append(os.path.expanduser('~/.local/lib/python3.10/site-packages/'))
Note 1: In my case this is enough, because the app is just for learning and not a serious project, so there are no real security concerns.
I am using openSUSE Tumbleweed, so I have Python 3.10.11, while the Flatpak runtime ships Python 3.10.6. Users on an older distro, or on a distro like Ubuntu or Debian, may not have the latest Python version, so you may run into compatibility issues.
A better solution is to create a dedicated folder in the user's local directory, e.g. ~/.cache/myapp/packages, and add it to the Flatpak manifest with --filesystem=~/.cache/myapp:create. This maps the folder so it can be accessed from the sandbox, and the :create option creates it if it doesn't exist. Then, in your Python script, install the required packages into that folder based on the imports used in the external scripts, and add the folder path to the path with sys.path.append(os.path.expanduser('~/.cache/myapp/packages/')), as sketched below.
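A minimal sketch of that installation step, assuming the mapped folder is ~/.cache/myapp/packages and that pip is available inside the runtime; the numpy package is just an example:
import importlib
import os
import subprocess
import sys

# Hypothetical folder mapped into the sandbox via --filesystem=~/.cache/myapp:create
PKG_DIR = os.path.expanduser('~/.cache/myapp/packages')
os.makedirs(PKG_DIR, exist_ok=True)
if PKG_DIR not in sys.path:
    sys.path.append(PKG_DIR)

def ensure_package(name):
    # Install `name` into PKG_DIR with pip if the plugin's import is not satisfiable yet.
    try:
        __import__(name)
    except ImportError:
        subprocess.check_call([sys.executable, '-m', 'pip', 'install', '--target', PKG_DIR, name])
        importlib.invalidate_caches()

ensure_package('numpy')  # e.g. before loading a plugin that imports numpy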
Note 2: It's not safe to import external scripts directly into your code. When you build a plugin system, it's better to run these scripts in a subprocess, so that your code is isolated from the external code. You can use an IPC protocol or other techniques to exchange data between the main process and the subprocess, for example as sketched below.
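A small sketch of that isolation idea, assuming the plugin is started with the same interpreter and prints its result as JSON on stdout; the runner.py path is hypothetical:
import json
import subprocess
import sys

# Run the plugin in its own process instead of importing it into the app.
result = subprocess.run(
    [sys.executable, '/home/user/.extensions/ext1/runner.py'],
    capture_output=True, text=True, timeout=30,
)
if result.returncode == 0:
    data = json.loads(result.stdout)  # the plugin is expected to print JSON, e.g. {"data": [1, 2, 3]}
else:
    print('plugin failed:', result.stderr, file=sys.stderr)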
I've just written my first MSIX app in Python 3. I use PyInstaller to generate an EXE, then WiX Toolset to generate an MSI, and then the MSIX Packaging Tool to create the MSIX. There's probably an easier way to go from code to MSIX, but that's what I've gotten to work so far.
Ideally, I'd like to capture an onuninstall event and throw a GUI prompt asking the user why they are uninstalling. I can do this in the MSI. However, my understanding is that MSIX offers no onuninstall event. Please let me know if you know differently!
Since I apparently can't catch the MSIX uninstall event, my next preference is to offer the user a way to uninstall the app from the tray icon. The user selects an uninstall tray menu button from my app's icon, which pops up a window where the app then asks them why they are uninstalling. They type in an answer, and then click the submit button. Then, the app should completely uninstall itself. This also works well in the MSI. However, I can't get it to work in the MSIX.
Here's what works in python with the MSI installed:
subprocess.call('msiexec.exe /x {' + myguid + '}', shell=True)
However, the MSIX, which is built from the MSI, throws this popup error message when that line runs, and never actually uninstalls the app:
This action is only valid for products that are currently installed.
I've tried using the GUID from my WXS file's <Product> entry, hard-coded, just to see if it would work. That one worked for uninstalling the MSI, but not the MSIX. I also tried getting the GUID dynamically, but that didn't work for either MSI or MSIX, both producing the same error as above. Here is how I got the GUID dynamically:
from System.Runtime.InteropServices import Marshal
from System import Reflection
myguid = str(Marshal.GetTypeLibGuidForAssembly(
    Reflection.Assembly.GetExecutingAssembly()
)).upper()
While running the MSI (where I have far better logging than in the MSIX), it appears as though GetExecutingAssembly() gets an assembly with a FullName of Python.Runtime, which is certainly something I don't want to uninstall. GetCallingAssembly() produces the same result. GetEntryAssembly() produces a null value.
I looped through AppDomain.CurrentDomain.GetAssemblies() to see what was listed and my app was not listed, though I saw many libraries that it uses.
So, any thoughts on how I can get the app to programmatically uninstall? Maybe a suggestion on how I can get the correct GUID for the MSIX app, if that's the problem? DotNet code should be fine. I can probably figure out how to translate it to python.
Or better yet, any idea how I can catch the MSIX uninstall event and run some custom code?
Thanks in advance!
From what I know at this moment, catching the uninstall event is not possible. I don't recommend implementing the approach you suggested (the tray icon), but to answer your question, you can use the MSIX PowerShell cmdlets to install and uninstall MSIX packages programmatically.
Also, I noticed you are really torturing yourself to create the MSIX package.
The MSIX Packaging Tool was created by Microsoft for IT professionals who don't have access to the source code. Developers can either use the Windows Application Packaging Project template (if they use Visual Studio) or other third-party tools, like Advanced Installer or WiX (from what I know, there is a WiX extension that can be used to build MSIX packages).
Here is a quick tutorial on how to create an MSIX from scratch, easier with Advanced Installer:
https://www.advancedinstaller.com/create-msi-python-executable.html
Disclaimer: I work on the team building Advanced Installer.
I used Bogdan's PowerShell suggestion and came up with the following code, which seems to work well:
import subprocess

powershellLocation = "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe"
appStrings = ""  # stays empty if the PowerShell call fails

try:
    # Use the name of the app (which I found manually using Get-AppPackage)
    # to get the full name of the app, which seems to have some random numbers
    # on the end of it. myAppPackageName is defined elsewhere in my app.
    powershell_tuple = subprocess.Popen([
        powershellLocation,
        "Get-AppPackage",
        "-name",
        '"' + myAppPackageName + '"'
    ], shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE).communicate()
    appStrings = powershell_tuple[0].decode('utf-8').strip()
except Exception as e:
    pass

for appStr in appStrings.splitlines():
    # Find the full name of the app
    if appStr.startswith('PackageFullName'):
        colonIndex = appStr.index(':') + 1
        fullName = appStr[colonIndex:].strip()
        if fullName:
            # The MSIX package was found, and I now have its full name
            subprocess.call(powershellLocation + ' Remove-AppPackage -Package "' + fullName + '"', shell=True)
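For what it's worth, the text parsing can probably be avoided by asking PowerShell for the property directly; a sketch under the same assumptions (powershellLocation and myAppPackageName defined as above, Get-AppPackage being an alias of Get-AppxPackage):
import subprocess

# Print only the PackageFullName property instead of parsing the whole Get-AppPackage output.
cmd = "(Get-AppxPackage -Name '" + myAppPackageName + "').PackageFullName"
fullName = subprocess.run(
    [powershellLocation, "-NoProfile", "-Command", cmd],
    capture_output=True, text=True,
).stdout.strip()
if fullName:
    subprocess.run([powershellLocation, "-NoProfile", "-Command",
                    "Remove-AppxPackage -Package '" + fullName + "'"])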
I'm making a Buildroot image for my Raspberry Pi 3 for a school project.
I've built Buildroot with everything from Python included because I want to use WebIOPi. The build has finished and the image has been written to the SD card.
Now when I want to install WebIOPi on the Buildroot device, it asks for python-dev, which is not included by Buildroot. With further research I've only found this. That's a python-dev 0.4.0, but I think there's a much more recent version on my virtual Ubuntu 16 OS. (My main OS is Windows 10, so I need the image to use win32diskimager.)
But I don't know how to implement this in the Python Buildroot packages. I've already read the Buildroot manuals; they're kind of confusing for me...
I've already tried to make a directory named 'python-dev' in the buildroot/package directory (on the Ubuntu OS), but with no success.
This is what I've got so far:
buildroot/package/python-dev:
config.in
python-dev.mk
in the config.in file:
config BR2_PACKAGE_PYTHON_DEV
    bool "python-dev"
    help
in the python-dev.mk file (copied from libconfig):
################################################################################
#
# python-dev
#
################################################################################
PYTHON_DEV_VERSION = 0.4.0
PYTHON_DEV_SOURCE = dev-0.4.0.tar.gz
PYTHON_DEV_SITE = https://pypi.python.org/packages/53/34/e0d4da6c3e9ea8fdcc4657699f2ca62d5c4ac18763a897feb690c2fb0574/dev-0.4.0.tar.gz
PYTHON_DEV_LICENSE = Python software foundation license v2, others
PYTHON_DEV_LICENSE_FILES = README
PYTHON_DEV_SETUP_TYPE = setuptools
PYTHON_DEV_DEPENDENCIES = libconfig boost
$(eval $(python-package))
When I run make menuconfig and search for python-dev, it's not there...
I hope someone can help me with this.
If there's an easier way, it's more than welcome.
Thank you in advance.
The python-dev package that the WebIOPi setup script is checking for has nothing to do with the dev Python package that you found at https://pypi.python.org/pypi/dev.
The python-dev package is a package on Linux distributions that contains the development files for the Python library that is installed together with the Python interpreter. It installs the files necessary to allow C/C++ programs to link against libpython.
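To illustrate what those development files are, on a desktop distribution with python-dev/python3-dev installed you can ask Python itself where they live (a quick check, not something WebIOPi needs at runtime):
import sysconfig

# Directory that contains Python.h, provided by python-dev/python3-dev on most distros.
print(sysconfig.get_paths()['include'])
# Directory that contains the libpython library to link against.
print(sysconfig.get_config_var('LIBDIR'))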
Buildroot has already installed what you need in STAGING_DIR. However, you are probably trying to install WebIOPi directly on the target, which is not how Buildroot is intended to be used. Buildroot does not allow you to do development on the target: it provides neither a compiler on the target nor the files necessary for development.
Buildroot is intended to be used as a cross-compilation environment. So what you should do instead is create a Buildroot package for WebIOPi, and have it cross-compiled (from your host machine), so that it gets installed, ready to use, in your target filesystem.
The large corporation that I work for uses a custom version of Setuptools. This private fork of Setuptools is intended to deal with certain networking and security difficulties that are unique to our organization. The bottom line is that neither the standard Setuptools nor Distribute would work as expected in our environment.
I'd like to start using Ian Bicking's excellent virtualenv tool on our systems, particularly our test systems, where we need to be able to set up a large number of sandboxed areas for test code, e.g. in our continuous integration environment.
Unfortunately any time I try to build a new virtual environment the virtualenv tool tries to obtain and install the latest official version of Setuptools. This would fail for the reason stated above, and also because the corporate firewall would block the action.
Instead of installing the official version:
setuptools-0.6c11-py2.4.egg
I'd like to install our customized version which might be called something like:
setuptools-foo-0.6c11-py2.4.egg
This egg can always be guaranteed to be found in the system's global site-packages. I can also guarantee that it's present in all of our corporate egg servers.
Can you help me make my virtualenv use my customized setuptools instead of the regular version of setuptools?
The name is hardcoded in virtualenv.py. You have to either patch virtualenv.py or name your patched setuptools egg 'setuptools-0.6c11-py2.4.egg'.
I've taken to writing my own wrapper scripts which import virtualenv. The main reason is that I use dpkgs to install most of my dependencies, including Distribute, so I like to avoid downloading additional copies when I create a new environment; this has the added bonus that it runs much faster.
Here is a baseline wrapper you can use to start with. I've added a comment where you could insert some code to symlink/copy your custom setuptools code into the virtualenv:
import os, subprocess, sys, virtualenv

# virtualenv changed its internal api slightly after 1.5.
NEW_API = (1, 5)

def get_version(version):
    return tuple([int(v) for v in version.split('.')])

def main():
    # set the logging level here
    level = virtualenv.Logger.level_for_integer(0)
    logger = virtualenv.Logger([(level, sys.stdout)])
    virtualenv.logger = logger
    # insert your command-line parsing code here, if needed
    root = sys.argv[1]
    home, lib, inc, bin = virtualenv.path_locations(root)
    result = virtualenv.install_python(home, lib, inc, bin,
                                       site_packages=True, clear=False)
    pyexec = os.path.abspath(result)
    version = get_version(virtualenv.virtualenv_version)
    if version < NEW_API:
        virtualenv.install_distutils(lib, home)
    else:
        virtualenv.install_distutils(home)
    virtualenv.install_activate(home, bin)
    # insert whatever post-virtualenv-setup code you need here

if __name__ == '__main__':
    main()
Usage:
% python wrapper.py [path]
There's the --extra-search-dir option, which lets you define a local directory containing the desired version of setuptools. This is explained in the docs.
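For example, assuming your patched egg sits in /opt/corp/eggs (a placeholder path), the invocation could look like:
% virtualenv --extra-search-dir=/opt/corp/eggs /path/to/new/env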
I need to keep a large number of Windows XP machines running the same version of Python, with an assortment of modules, one of which is python-win32. I thought about installing Python on a network drive that is mounted by all the client machines and just adjusting the path on the clients. Python starts up fine from the network, but when importing win32com I get a pop-up error saying:
The procedure entry point ?PyWinObject_AsHANDLE@@YAHPAU_object@@PAPAXH@Z could not be located in the dynamic link library pywintypes24.dll
after dismissing the message dialog I get in the console:
ImportError: DLL load failed: The specified procedure could not be found.
I searched the python directory for the pywintypes24.dll and it is present in "Lib\site-packages\pywin32_system32" .
What am I missing and is there another way in which I can install Python + Python-Win32 + additional module once and have them running on many machines? I don't have access to the Microsoft systems management tools, so I need to be a bit more low-tech than that.
On every machine you basically have to run pywin32_postinstall.py -install once. Assuming your Python installation on the network is N:\Python26, run the following command on every client:
N:\Python26\python.exe N:\Python26\Scripts\pywin32_postinstall.py -install
Another important thing is: good luck! The reason is that you might need to do this as admin. In my case this setup worked for all but one computer; I still have not figured out why.
Python (or, more precisely, the OS) searches for the DLLs using os.environ["PATH"], not by searching sys.path.
So you could start Python using a simple .cmd file instead, which adds \\server\share\python26 to the PATH (given that the installer, or you, copied the DLLs from \\server\share\python26\Lib\site-packages\pywin32_system32 to \\server\share\python26).
Or, you can add the following code to your scripts before they try to import win32api etc:
import os
import sys

# Add the Python installation directory to the path,
# because on Windows 7 the pywin32 installer fails to copy
# the required DLLs to the %WINDIR%\System32 directory and
# copies them to the Python installation directory instead.
# Fortunately, in Python it is possible to modify the PATH
# before loading the DLLs.
os.environ["PATH"] = sys.prefix + ";" + os.environ.get("PATH")

import win32gui
import win32con
You could use batch files running at boot to
Mount the network share (net use \\server\share)
Copy the Python and packages installers from the network share to a local folder
Check version of the msi installer against the installed version
If different, uninstall Python and all version dependent packages
Reinstall all packages
This would pretty much be a roll-your-own central management system for that software.