I am trying to install a Django app on Heroku. My app needs pyke3. The recommended way to install pyke3 is to download pyke3-1.1.1.zip from https://sourceforge.net/projects/pyke/files/pyke/1.1.1/ and then install it (into a virtualenv if desired) using the instructions at http://pyke.sourceforge.net/about_pyke/installing_pyke.html. How do I install pyke3 on Heroku? Is there a way to add this to requirements.txt, and how will Heroku know where to get the pyke3 zip file?
From pip's docs:
pip supports installing from PyPI, version control, local projects, and directly from distribution files.
So pip supports installing packages directly from links; all you have to do is put the link to the required package in your requirements file.
To install pyke3-1.1.1.zip, add this link to your requirements file:
https://sourceforge.net/projects/pyke/files/pyke/1.1.1/pyke3-1.1.1.zip/download
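For example, a requirements.txt for the app might look like this (the Django pin is purely illustrative; only the pyke3 link comes from above):

```text
Django==1.4
https://sourceforge.net/projects/pyke/files/pyke/1.1.1/pyke3-1.1.1.zip/download
```

On deploy, Heroku runs pip install -r requirements.txt, so pip downloads the zip from that URL and installs it like any other package.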
I created a repository on Artifactory which includes a zip containing two folders.
https://artifactory.healthcareit.net:443/artifactory/pi-generic-local/paymentintegrity-airflow-lib-plugins/paymentintegrity-airflow-libs-plugins-202211031330.zip
Is there a way to download that zip and extract the directories during a pip install? Basically, the developer just runs a pip install and gets the directories where needed.
I'm not looking for something in the [scripts] section, since installing these directories would be the only thing currently needed in the Pipfile (it's a weird project). Is this possible?
You can abuse the install command in setup.py to call arbitrary code, such as code that unzips the assets and installs them in the right place. I have seen this done to package C++ binaries via PyPI at a place I worked. See Post-install script with Python setuptools for hints on how to override the install command.
From what I can see, you have created a generic repo and are trying to fetch the packages using pip install. I would recommend creating a PyPI repository in Artifactory and then fetching the packages from it. This will help in creating the metadata that tracks all the package versions in the repository.
If you have the packages locally, push them to the local PyPI repo; when you resolve them from Artifactory, they will be downloaded automatically during pip install based on your requirements.
If your requirement is to zip up multiple packages, push the archive to Artifactory, and have Artifactory unzip it and provide the dependencies during pip install, then this is not possible on the Artifactory side; you need a post-install script with Python setuptools, as mentioned in the other answer.
I am confused as to what exactly pip install (package) does. In my Django project, I wanted to install a package and thought that I only needed to include it in INSTALLED_APPS in settings.py. However, I also needed to run pip install (package).
Why is this the case? I thought that pip install only installed packages locally? The package also seems to work for another user through my remote repository, which is why I am confused.
pip is a package manager. When you pip install (package), it searches PyPI (the Python Package Index) for a package with the name (and potentially, the version) that you have provided. It then downloads the package and installs it.
After the package has been installed locally, you can reference it in your INSTALLED_APPS in your Django settings file.
Please read more details here: https://realpython.com/what-is-pip/
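A concrete illustration (django-debug-toolbar is a real package, used here purely as an example; any pip-installable Django app works the same way): first run pip install django-debug-toolbar, then reference the app in settings.

```python
# settings.py (fragment) -- "debug_toolbar" can only be listed here because
# pip already placed the package in site-packages; INSTALLED_APPS does not
# install anything itself, it just tells Django which installed apps to load.
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "debug_toolbar",
]
```

Other users of your repository still need to run pip install themselves (typically from a shared requirements.txt); committing settings.py alone does not ship the package to them.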
I use a free gear on openshift.com. I need to install SQLAlchemy 1.0+, but OpenShift uses a local easy_install mirror for downloading packages, and its latest version of SQLAlchemy is 0.7.9. I tried using pip, but it also uses the local mirror.
Then I found this solution, but the compiling process crashes:
--index-url https://pypi.python.org/simple/
in requirements.txt, and I had problems with permissions (to the cache folder and to other folders used by pip).
How can I get fresh packages from PyPI?
Try:
1) SSH into your app
2) Activate your venv:
source python/virtenv/venv/bin/activate
3) Manually install the package:
easy_install sqlalchemy==1.0
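If the crash was only about pip's cache and permissions, another option is to keep the --index-url override from the question in requirements.txt (note that https://pypi.python.org/simple/ now redirects to https://pypi.org/simple/), for example:

```text
--index-url https://pypi.org/simple/
SQLAlchemy>=1.0
```

pip reads the --index-url line from the requirements file and resolves everything after it against the public PyPI instead of the local mirror.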
Is there any way to tell Django to install some dependencies from external repositories? For example, I'd rather not keep twitter-bootstrap code in my repository; I'd like to define a GitHub link and fetch it automatically through a shell command. Something similar to collectstatic. I know I can write my own, but maybe there's something built-in or already implemented?
Python modules can be installed directly from git. For example: pip install -e git+git://github.com/jschrewe/django-genericadmin.git
For frontend modules you can use tools like bower. To install Twitter Bootstrap: bower install bootstrap
Both tools have config files, which can be used to track dependencies.
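As a sketch, the two config files might look like this ("myproject" and the Bootstrap version pin are placeholders):

```text
# requirements.txt -- pip installs straight from the GitHub repo
-e git+git://github.com/jschrewe/django-genericadmin.git#egg=django-genericadmin

# bower.json -- bower tracks the frontend dependency
{
  "name": "myproject",
  "dependencies": {
    "bootstrap": "~3.3.0"
  }
}
```

With both files committed, a fresh checkout only needs pip install -r requirements.txt and bower install, rather than vendored copies of the code.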
I'm trying to use SVN to manage my Python project.
I installed many external libs (in a path like "C:\Python27\Lib\site-packages") on Computer A, then uploaded the project to the SVN server.
Then, on Computer B, which just has Python (v2.7) installed, I checked the project out from the SVN server.
Here comes the problem: there are no external libs on Computer B. Is there any solution to this? I don't want to install the external libs on Computer B all over again!
Thanks in advance!
The normal Python way to deal with this is to use pip and requirements files. virtualenv, which lets you have multiple sets of installed packages, is also commonly used.
For example, if you have a project which depends on any version of itsdangerous and any version of Werkzeug over 0.9, you could have this requirements file:
Werkzeug>=0.9
itsdangerous
You would usually store that in a file named requirements.txt. You would then install the packages like this:
pip install -r requirements.txt
pip will find any required packages that are not already installed and install them.
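For the SVN scenario above, the requirements file itself can be generated on Computer A from what is already installed (a sketch; run it inside the same Python environment the project uses):

```shell
# On Computer A: record the exact versions of every installed package
pip freeze > requirements.txt

# Commit requirements.txt to SVN along with the project source.
# On Computer B, after checkout, recreate the environment:
pip install -r requirements.txt
```

The install step on Computer B downloads from PyPI rather than copying files from Computer A, which keeps compiled packages correct for each machine.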
You could actually copy the package source code from site-packages into your project folder; your project folder normally has a higher priority than site-packages.
Then you just need to check the libraries into your SVN.