Hello, I'm trying to add a custom step to my Python 3 package. What I want is to execute a make command in the root directory of my Python package. However, when I install my package with pip3 install ., the CWD (current working directory) changes to /tmp/pip-req-build-<something>.
Here is what I have:
from setuptools import setup, find_packages
from setuptools.command.develop import develop
from setuptools.command.install import install
import subprocess, os
class CustomInstall(install):
    def run(self):
        # subprocess.run(['make', '-C', 'c_library/'])  # this doesn't work as c_library/ doesn't exist in the changed path
        subprocess.run('./getcwd.sh')
        install.run(self)

setup(
    cmdclass={
        'install': CustomInstall
    },
    name='my-py-package',
    version='0.0.1',
    ....
)
Now, what's interesting to me is that getcwd.sh does get executed; here is what's inside it. It also reports the /tmp location.
#!/bin/bash
SCRIPT=`realpath $0`
SCRIPTPATH=`dirname $SCRIPT`
echo $SCRIPTPATH > ~/Desktop/my.log
Is there a way to get the path from where the pip install . was run?
Python 3.8.5, Ubuntu 20.04, pip 20.0.2 from /usr/lib/python3/dist-packages/pip (python 3.8)
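For what it's worth, the same trick getcwd.sh relies on (resolving its own path) should also be available inside setup.py through __file__. That gives the directory containing the copied setup.py rather than the directory pip install . was invoked from, but assuming c_library/ gets copied into the build tree alongside setup.py, a sketch like the following would at least make the make call independent of the CWD (untested):

from setuptools import setup
from setuptools.command.install import install
import os, subprocess

# Directory that contains this setup.py (the pip build copy, not the original checkout)
HERE = os.path.dirname(os.path.abspath(__file__))

class CustomInstall(install):
    def run(self):
        # Assumes c_library/ was copied next to setup.py into the build directory
        subprocess.run(['make', '-C', os.path.join(HERE, 'c_library')], check=True)
        install.run(self)

setup(
    name='my-py-package',
    version='0.0.1',
    cmdclass={'install': CustomInstall},
)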
Related
I am developing a package for internal use, and said package has a setup.py file (see below). I am a bit baffled because when I install my package (inside an environment & in editable mode so changes get reflected as I develop it), the import works if I am in the development directory, but says "package not found" from any other directory.
Moreover, pip list shows the package.
To install, I do the following:
~ > cd path/to/package
package > conda activate env
package (env) > pip install -e .
The package installs. Now, the problem is
package (env) > python -c "import mypackage" # works!
package (env) > cd
~ (env) > python -c "import mypackage" # error!
~ (env) > pip list | grep "mypackage"
mypackage 0.1.0
What's happening? The documentation says:
-e, --editable <path/url>
Install a project in editable mode (i.e. setuptools “develop mode”) from a local project path or a VCS url.
Which doesn't really tell me that it should only work in said local project path...
setup.py
import os
import sys
from setuptools import find_packages, setup

def read(fname):
    return open(os.path.join(os.path.dirname(__file__), fname)).read()

LICENSE = "MIT"

setup(
    name="name",
    version="0.1.0",
    author="organization",
    description="...",
    long_description=read('README.md'),
    packages=find_packages(),
    license=LICENSE,
    classifiers=[
        ...
    ],
)
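A hedged diagnostic sketch (my addition, not from the original post): running the snippet below from both the package directory and the home directory shows whether the two shells are using the same interpreter and whether the package resolves at all. mypackage is just the placeholder name used above.

# Hypothetical check: compare interpreter and import resolution between directories.
import importlib.util
import sys

print("interpreter:", sys.executable)

spec = importlib.util.find_spec("mypackage")   # placeholder for the real package name
print("mypackage ->", spec.origin if spec else "not importable from here")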
You need to add the directory that contains your module to the system path:
import sys
sys.path.append(yourPathHere)
import yourModule
Edit - for clarity - the above goes into the .py that calls your module, not the setup.py that you use for compiling it.
Huh - I see your environment does see it with the grep. Prob disregard the above then!!
I am a newbie in Python programming, and I want to write a Python script that I can import at the top of the first executable file of my program, and which will in turn download all the required dependencies (pip install ).
Please help me with it.
Try this using the pip module:
import pip
packages = ['numpy', 'pandas'] # etc (your packages)
for pckg in packages:
    if hasattr(pip, 'main'):
        pip.main(['install', pckg])
    else:
        pip._internal.main(['install', pckg])
An alternative would be:
import subprocess
import sys
packages = ['numpy', 'pandas'] # etc
for pckg in packages:
    subprocess.call([sys.executable, "-m", "pip", "install", pckg])
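One possible refinement of either approach (my addition, not part of the answer above): skip anything that is already importable. Note that a pip distribution name and its import name are not always identical, so treating them as the same below is an assumption.

import importlib.util
import subprocess
import sys

packages = ['numpy', 'pandas']  # pip names, assumed here to match the import names

for pckg in packages:
    if importlib.util.find_spec(pckg) is None:   # only install what isn't importable yet
        subprocess.call([sys.executable, "-m", "pip", "install", pckg])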
I want to install and import Python 3 modules at runtime.
I'm using the following function to install modules at runtime using pip:
def installModules(modules):
    for module in modules:
        print("Installing module {}...".format(module))
        subprocess.call([sys.executable, "-m", "pip", "install", "--user", module])
The module is installed successfully, but I'm not able to import it at runtime, after the installation finishes. So if I do:
modules = [ "wget", "zipfile2" ]
installModules(modules)
import wget
I get a ModuleNotFoundError. If, after that, I start another Python 3 session, I am able to use the modules e.g. wget, which means that the modules have been installed, but they are not available for this current Python 3 session.
Is it possible in Python 3 to install and then import the installed modules in the same Python 3 session i.e. right after installation?
Thank you!
EDIT:
On a fresh Ubuntu 19.04 install inside VirtualBox, after a sudo apt-get install python3-pip, running the following script:
import os, sys
import subprocess

def installModules(modules):
    for module in modules:
        print("Installing module {}...".format(module))
        subprocess.call([sys.executable, "-m", "pip", "install", "--user", module])

def process():
    modulesToInstall = [ "wget", "zipfile2" ]
    installModules(modulesToInstall)

process()

import wget

def main():
    wget.download("http://192.168.2.234/test/configure.py")

if __name__ == "__main__":
    main()
I get:
user@user-VirtualBox:~$ python3 script.py
Installing module wget...
Collecting wget
Installing collected packages: wget
Successfully installed wget-3.2
Installing module zipfile2...
Collecting zipfile2
Using cached https://files.pythonhosted.org/packages/60/ad/d6bc08f235b66c11bbb76df41b973ce93544a907cc0e23c726ea374eee79/zipfile2-0.0.12-py2.py3-none-any.whl
Installing collected packages: zipfile2
Successfully installed zipfile2-0.0.12
Traceback (most recent call last):
File "script.py", line 17, in <module>
import wget
ModuleNotFoundError: No module named 'wget'
The Python 3 version is:
user@user-VirtualBox:~$ python3 --version
Python 3.7.3
The pip3 version is:
user@user-VirtualBox:~$ pip3 --version
pip 18.1 from /usr/lib/python3/dist-packages/pip (python 3.7)
Other info:
user@user-VirtualBox:~$ whereis python3
python3: /usr/bin/python3.7m /usr/bin/python3.7-config /usr/bin/python3.7 /usr/bin/python3 /usr/bin/python3.7m-config /usr/lib/python3.7 /usr/lib/python3.8 /usr/lib/python3 /etc/python3.7 /etc/python3 /usr/local/lib/python3.7 /usr/include/python3.7m /usr/include/python3.7 /usr/share/python3 /usr/share/man/man1/python3.1.gz
Any ideas?
By default, at startup Python adds the user site-packages dir (I'm going to refer to it as USPD) to the module search paths. But this only happens if the directory exists on the file system (disk). I didn't find any official documentation to support this statement [1], so I spent some time debugging and wondering why things seem to be so weird.
The above behavior has a major impact on this particular scenario (pip install --user). Considering the state (at startup) of the Python process that will install modules:
1. USPD exists: things are straightforward, everything works OK
2. USPD doesn't exist:
   - Module installation will create it
   - But, since it's not in the module search paths, all the modules installed there won't be available for (simple) import statements
   - When another Python process is started, it will fall under #1.
To fix things, USPD should be manually added to the module search paths. Here's what the (beginning of the) script should look like:
import os
import site
import subprocess
import sys
user_site = site.getusersitepackages()
if user_site not in sys.path:
    sys.path.append(user_site)
# ...
Update #0
[1] I just came across [Python]: PEP 370 - Per user site-packages directory - Implementation (emphasis is mine):
The site module gets a new method adduserpackage() which adds the appropriate directory to the search path. The directory is not added if it doesn't exist when Python is started.
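Putting it together, a minimal sketch of install-then-import in one session, assuming the installModules() helper from the question; importlib.invalidate_caches() is added so the import machinery notices files created after the interpreter started:

import importlib
import site
import sys

installModules(["wget"])                 # helper from the question, installs with --user

user_site = site.getusersitepackages()
if user_site not in sys.path:            # USPD may not have been on sys.path at startup
    sys.path.append(user_site)

importlib.invalidate_caches()            # let finders see the newly created files
wget = importlib.import_module("wget")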
I'm trying to prepare a setup.py that will install all necessary dependencies, including google-cloud-pubsub. However, python setup.py install fails with
pkg_resources.UnknownExtra: googleapis-common-protos 1.6.0b6 has no such extra feature 'grpc'
The weird thing is that I can install those dependencies through pip install in my virtualenv.
How can I fix it or get around it? I use Python 2.7.15.
Here's minimal configuration to reproduce the problem:
setup.py
from setuptools import setup
setup(
    name='example',
    install_requires=['google-cloud-pubsub']
)
In your setup.py use the following:
from setuptools import setup
setup(
    name='example',
    install_requires=['google-cloud-pubsub', 'googleapis-common-protos==1.5.3']
)
That seems to get around it.
I am trying to add a runnable script for my project with setup.py. I added it to the scripts= argument of setup. The script works fine when I run it from the project, ./solver. I install it with sudo python setup.py install, and try to run it with solver, but I get ImportError: No module named 'model'. How do I correctly install and run my script with setuptools?
SOLVER/
    solver/
        model/
            __init__.py
        view/
            __init__.py
        controller/
            __init__.py
        __init__.py
        main.py
        solver          <-- starts the app
    setup.py
    README.md
    LICENCE
setup.py:
#!/usr/bin/env python3
import os
from setuptools import setup, find_packages

setup(
    name='SOLVER',
    version='1.0.0',
    description='SOLVER app test',
    author=['me'],
    license='BSD',
    classifiers=['Programming Language :: Python :: 3 :: Only'],
    packages=['solver'],
    # packages=find_packages(exclude=["doc", "tests"]),
    install_requires=['numpy>=1.10.4'],
    scripts=['solver/solver'],
)
solver:
#!/usr/bin/env python3
from solver import main
main.gui_mode()
You need to list all the packages, including the sub-packages, in the packages argument. You can use find_packages to generate that list for you. Currently, you're just installing the Python files in the solver/ directory.
from setuptools import setup, find_packages
setup(
    ...
    packages=find_packages(),
    ...
)
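For the SOLVER/ layout in the question, a quick sanity check run from the project root should show the sub-packages being picked up; the expected output below is roughly what find_packages() ought to report, not verified output.

# Run from the SOLVER/ project root to see what find_packages() discovers.
from setuptools import find_packages

print(find_packages(exclude=["doc", "tests"]))
# Expected, roughly: ['solver', 'solver.model', 'solver.view', 'solver.controller']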
You should also use entry_points rather than scripts, especially when all your script does is import and call one function. Setuptools will build scripts from the entry points that use the correct Python binary for the env they were installed in.
setup(
    ...
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            'solver=solver.main:gui_mode',
        ],
    },
    ...
)
You can install your package in development mode to get your script, rather than writing it yourself.
pip install -e .
You should use pip to install to the system as well. It keeps track of what was installed so you can uninstall it later.
pip install .
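For example, with the name from the setup.py above, the package can later be removed with (my addition, not from the answer):

pip uninstall SOLVER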