I have a GitHub Action that builds Windows wheels. At the end of the build it installs the wheels to make sure everything's OK, but right now the version is hard-coded in the filename. I saw this question that deals with releases, but I'd like to run this on every push to master to check that things are OK.
Right now there's a line in my action that looks like this:
pip install "fugashi-0.1.9rc1-cp${{ matrix.py-short }}-cp${{ matrix.py-short2 }}-win_amd64.whl"
I don't want to have to update the action every time the version changes, so I would like the line to look like this:
pip install "fugashi-$VERSION-cp${{ matrix.py-short }}-cp${{ matrix.py-short2 }}-win_amd64.whl"
But I do not know how to get the version into the environment of the GitHub Action.
Is there some way I can get the version number from setup.py into an environment variable for the job?
This ended up being much simpler than I thought. You can just get the version from setup.py itself and use that.
VERSION=$(python setup.py --version)
pip install "dist/fugashi-$VERSION-cp${{ matrix.py-short }}-cp${{ matrix.py-short2 }}-win_amd64.whl"
Trying to change the GitHub Actions environment was a distraction.
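That said, if you do want the version available to later steps in the same job, a step can append to the special $GITHUB_ENV file (a minimal sketch; exactly where you put the step is up to you):

# inside a run: step of the workflow
echo "VERSION=$(python setup.py --version)" >> "$GITHUB_ENV"

Later steps in the job can then read it as $VERSION in shell, or as ${{ env.VERSION }} in expressions.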
I am aware that pip freeze > requirements.txt exists, but that prints out all my system packages, only a few of which my project actually needs.
I am not using a virtualenv, so I'm pretty sure I can't list just the local packages that way.
I also know that pipdeptree exists, but I don't see how that solves my problem.
I believe tools like the following could help:
pipreqs
pigar
As far as I can tell, these tools read the code in the directory and try to figure out the required dependencies based on the import statements they find in the code.
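For example, a minimal sketch of how pipreqs is typically used (the project path is a placeholder):

pip install pipreqs
pipreqs /path/to/your/project    # writes a requirements.txt based on the imports it finds there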
Related:
https://stackoverflow.com/a/61202584
https://stackoverflow.com/a/61540466
https://stackoverflow.com/questions/61143402/how-to-generate-requirements-txt-for-given-py-sources-folder-or-specific-py-file
https://stackoverflow.com/a/31684470
I am trying to install the Home Assistant home automation software (https://home-assistant.io) on my Synology. I've installed Python via the Synology package system and done the basic setup (https://home-assistant.io/docs/installation/synology/), but when I try to run the daemon I see this in the console:
homeassistant requires Python '>=3.5.3' but the running Python is 3.5.1
Is there any way to update Python to the required version on Synology? Can you help me, please?
Synology offers only Python 3.5.1 at the moment.
You need to install an older version of HA, as mentioned in the blue box on the installation page.
./python3 -m pip install homeassistant==0.64.3
If you would like to install the latest HA, you would need to use Docker on your Synology, if your model supports it.
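As a rough sketch of the Docker route (the image name, volume path, and options are assumptions to adapt to your NAS):

# run Home Assistant in a container, keeping its config on the NAS volume
docker run -d --name home-assistant --restart=always \
  --net=host \
  -v /volume1/docker/homeassistant:/config \
  homeassistant/home-assistant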
By the way, since the Python3 update (3.5.1-108) on Synology, once you install HA you need to edit two files:
/volume1/#appstore/py3k/usr/local/bin/hass
and
/volume1/#appstore/py3k/usr/local/lib/python3.5/runpy.py
and add "import pip" to the beginning of the file where the import statements are. Otherwise HA will not start.
I've provided an updated python3-3.5.6 SPK for Synology in SynoCommunity; python3-3.6.8 is in the pipeline. Consider adding SynoCommunity ("spksrc") to your NAS to seamlessly install updates.
Additionally, be aware that home-assistant-0.82 is in beta at the same site.
Raw SPKs can also be manually downloaded from my beta site if you want to check there, but I would suggest you look into SynoCommunity, keep updated from there, and contribute comments and code there.
I'm having a problem with this package that I installed in Python 3.5. After installing it, I try to run requestProxy.py but it won't import any of its own packages. Here's what I did, and what's happening.
I cloned it and created a private repo using these instructions.
I installed it in an activated virtualenv, created without using sudo, using:
pip3 install -e HTTP_Proxy_Randomizer
Terminal said it installed ok.
I can find the egg link in my virtualenv's site-packages folder, but when I try to run the main file, it says:
from project.http.requests.parsers.freeproxyParser import freeproxyParser
ImportError: No module named project.http.requests.parsers.freeproxyParser
I had to write a setup.py for the package, which didn't seem to come with its own. I came up with:
from setuptools import setup

setup(name='HTTP_Request_Randomizer',
      version='1.0',
      description='HTTP Proxy Request Randomizer',
      package_dir={'project': 'project', 'http': 'project/http',
                   'requests': 'project/http/requests', 'errors': 'project/http/requests/errors',
                   'parsers': 'project/http/requests/parsers', 'proxy': 'project/http/requests/proxy'},
      packages=['project', 'http', 'requests', 'errors', 'parsers', 'proxy'])
Here's the package structure:
pip3 freeze
gives me:
Complete output from command git config --get-regexp remote\..*\.url:
fatal: bad config file line 4 in /home/danny/.gitconfig
----------------------------------------
Error when trying to get requirement for VCS system Command "git config --get-regexp remote\..*\.url" failed with error code 128 in /home/danny/Documents/HTTP_Request_Randomizer, falling back to uneditable format
Could not determine repository location of /home/danny/Documents/HTTP_Request_Randomizer
Django==1.9.7
## !! Could not determine repository location
HTTP-Request-Randomizer==1.0
mysqlclient==1.3.7
So I want requestProxy.py to import the other necessary packages and not fail at line 1. I'm sure this is a problem with my implementation and not the original author's code. I was experimenting with this package a couple of weeks ago, before I was aware of virtualenvs or pip install -e, and just copied it manually to site-packages. It worked then. Now I understand the concepts to do it more cleanly, but I can't get them to work.
It feels as though I have done something wrong with my git config or with my package_dir structure in setup.py, perhaps?
I've been pythoning for maybe a month and have a lot to learn. I normally find what I need on Stack Overflow without having to bother anyone, but after trying everything with this, I really need some help. Any advice much appreciated.
I figured it out. I was using Ninja IDE, and even though I entered the virtualenv for the project and restarted, it still wasn't recognizing it. I was able to run it from the terminal, and also in PyCharm and LiClipse.
I have recently tried to use pylearn2, a deep machine learning package for Python developed at the University of Montreal.
I've just installed it and tried to run a simple example, but it did not work.
I have been using a PC with Ubuntu 13.10, on which I found IPython installed.
I have installed Theano and later pylearn2, by following this webpage instructions:
http://deeplearning.net/software/pylearn2/
I have also modified the .bashrc file, as suggested.
I thought that everything went well, and then I tried this Quick start example:
http://deeplearning.net/software/pylearn2/tutorial/index.html
I stopped at the first command:
python make_dataset.py
My terminal states:
Traceback (most recent call last):
  File "make_dataset.py", line 14, in <module>
Do you have any ideas on why it is not working?
Do you know why these errors occur?
Thanks a lot
EDIT: line 14 is the first non-commented line of the file. It states:
from pylearn2.utils import serial
Without more information, I can only guess, but my first guess is…
You haven't actually installed pylearn2, because if you follow the linked docs to grab the git repo and add a PYLEARN2_DATA_PATH variable, nothing gets installed into site-packages (or dist-packages or anywhere else on sys.path).
This means that pylearn2 will only work when you start Python from within the top-level directory of the pylearn2 repo.
So, if you run a script like this:
$ cd /path/to/pylearn2
$ cd scripts/tutorials/grbm_smd/
$ python make_dataset.py
… it won't actually work.
It looks like there is a setup.py file in the repository. Does it work? I have no idea. Even though the docs don't mention using it, you might want to try. Either this:
$ pip install .
… or, if you don't have pip or it doesn't work on this package:
$ python setup.py install
Either way, of course, you may need sudo or a flag to install to your user site-packages instead of system, etc., as with any other Python package.
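For example, either of these would target your user site-packages instead of the system install (a sketch of the user-level variant mentioned above):

$ pip install --user .
$ python setup.py install --user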
If that doesn't work, you might be able to just add /path/to/pylearn2 to your sys.path in some way. The most obvious way is by doing an export PYTHONPATH=/path/to/pylearn2:$PYTHONPATH in your ~/.bashrc.
Also, you will need to either source ~/.bashrc or create a new shell to get any effects of modifying the file.
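Putting that together, the relevant lines in ~/.bashrc would look something like this (both paths are placeholders for wherever you keep the repo and the data):

# in ~/.bashrc
export PYTHONPATH=/path/to/pylearn2:$PYTHONPATH
export PYLEARN2_DATA_PATH=/path/to/pylearn2_data

and then reload it in the current shell:

$ source ~/.bashrc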
If you're wondering why the instructions and the tutorial together don't give you enough information to make this work without a lot of hassle, I think that's covered in the very top of the documentation:
Pylearn2 is still undergoing rapid development. Don’t expect a clean road without bumps!
And the very fact that there is no PyPI download yet implies that this really is not ready for novices to use. If you don't know enough about using Python packages (and bash basics) to muddle through on your own, there's a good chance you won't be able to use this package.
When I'm working on a module fork, I'll often add the in-progress fork to the virtualenv of a full project for integration testing, using:
python setup.py develop
(which updates easy-install.pth to point to the local copy)
When I'm done, the only way I've figured out to cleanly get rid of this is to remove the entry from easy-install.pth or edit it to point to the already-installed version.
I also can't easy_install --upgrade, because it realizes the development version is the latest.
I think pip could force the upgrade, but then it tries to reinstall every single dependency.
Does anyone have a good technique/strategy for managing this sort of thing? I know I'm missing something obvious here.
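For what it's worth, a rough sketch of the cleanup described above (the package name is a placeholder):

# undo the develop install (setuptools' reverse of develop), run from the fork's directory
$ python setup.py develop --uninstall
# then reinstall the released version without reinstalling every dependency
$ pip install --upgrade --no-deps some-package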