I apologize ahead of time if these questions are very basic; I'm pretty new to running Python code from GitHub. The repository I am attempting to use belongs to a study: https://github.com/malllabiisc/ConfGCN. So far, I have downloaded the code as a zip and, following the instructions on GitHub, installed Ubuntu to run the shell file setup.sh. However, after running sudo bash setup.sh in Ubuntu, I get this error:
Install python dependencies
setup.sh: line 11: pip: command not found
I have checked the files this references. setup.sh calls:
echo "Install python dependencies"
pip install -r requirements.txt
Inside the requirements.txt file there is a variety of Python packages, all of which I have already installed inside a venv in PyCharm. It specifically calls for:
numpy==1.16.0
tensorflow==1.12.1
scipy==1.2.0
networkx==2.2
The previous lines in setup.sh run perfectly fine in terms of updating the files included in the folder. Another question I have is about how to set up a Python package in general. I am currently using PyCharm CE 2020 and I've attempted creating a Python package inside my workspace. I noticed that it auto-generates an __init__.py file. How can I integrate my downloads from GitHub into my PyCharm project?
There is no reason to run setup.sh as root, because it is only supposed to install some packages, which does not require sudo access. You can simply create a virtual environment and run setup.sh inside it. To set up the environment, just run:
$ virtualenv -p /usr/bin/python3.6 myenv # Create an environment
$ source myenv/bin/activate # Load the environment
(myenv) $ ./setup.sh
Once the environment is ready, you should be able to run the code. You can make Pycharm use that environment for executing the code.
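As a quick sanity check, you can confirm that pip now resolves inside the environment, which is exactly what the failing line in setup.sh needs. This sketch uses python3 -m venv, the standard-library equivalent of the virtualenv command above (myenv is just the example name):

```shell
# Create and load an environment with the built-in venv module,
# then verify that 'pip' is the environment's own copy.
python3 -m venv myenv
. myenv/bin/activate
command -v pip    # should now point inside myenv/bin, so setup.sh's pip line works
```

If `command -v pip` still points at a system path (or prints nothing), the environment is not active in the current shell.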
I'm new to Python, and I was wondering if you could help me run a Python script. I'm trying to run a script called PunchBox from GitHub: https://github.com/psav/punchbox. So far, I have Python 3.9.5 and Git Bash.
In the GitHub page, it says:
To install, clone the repo, cd into it and then execute the following:
virtualenv -p python2 .pb2
source .pb2/bin/activate
pip install -U pip
pip install .
What does this mean exactly? Where do I run this code?
So far, I tried downloading the zip file from GitHub, installing Python 3.5.9, using cmd, finding the directory with cd, and running that code; but got an error:
Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name punchbox was given, but was not able to be found.
error in punchbox setup command: Error parsing C:\Users\Mi\Downloads\punchbox-master\punchbox-master\setup.cfg: Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name punchbox was given, but was not able to be found.
There's also a requirements.txt that lists additional scripts needed:
pre-commit
click
mido
pbr
PyYAML
svgwrite
Do these install automatically upon running the script for the first time?
I'm a little confused why I'm getting an error. Do you know what I'm doing wrong?
Thank you so much!
Giovanni
I assume you are new to programming. You have to write these lines in a terminal.
On Windows, that is Command Prompt or PowerShell (the latter preferred). On macOS, it is Terminal.
Copy all these lines at once and paste them into your preferred terminal; it will run them one after another.
FYI: venv is a Python module for creating virtual environments; the preceding commands set one up. The error you saw comes from pbr, which derives the package version from the project's git metadata - a zip download from GitHub has none, which is why pip install . fails. Either clone the repository with git instead of downloading the zip, or install just the dependencies by using this command in place of the last one (pip install .):
pip install -r requirements.txt
Based on your comment, it looks like you don't have virtualenv installed on your system. You can install it with the command pip install virtualenv.
Now, as you are using a Windows machine, you may open a Command Prompt or Windows PowerShell window and navigate to the directory where your cloned project resides.
Now, execute the following commands.
virtualenv -p python2 .pb2
.pb2\Scripts\activate.bat
pip install -U pip
pip install -r requirements.txt
Once you are done working in your virtual environment (which is named .pb2), you may close it by executing deactivate command.
@Giovanni T.
Since you have already installed Python and downloaded the GitHub repository as a zip file, just run this command:
pip install -r requirements.txt
Make sure you run it from the folder where the requirements.txt file is stored.
I'm aware there are many similar questions but I have been through them all to no avail.
On Ubuntu 18.04, I have Python 2 and Python 3.6. I create a venv using the command below and attempt to install a package using pip. However, it attempts to install on the global system and not in the venv.
python3 -m venv v1
When I run 'which python', it correctly picks the python within the venv. I have checked the v1/bin folder and pip is installed there. The path in the pip script correctly points to the python in the venv.
I have tried reinstalling python3 and venv, destroying and recreating the virtual environment, and many other things. I am wondering whether there is some rational way to understand and solve this.
The problem in my case was that the mounted drive I was working on was not mounted as executable. So pip couldn't be executed from within the venv on the mount.
This was confirmed because I was able to get a pip install working using 'python -m pip install numpy', but when importing libraries, e.g. 'import numpy', I was then faced with the further error:
multiarray_umath.cpython-36m-x86_64-linux-gnu.so: failed to map segment from shared object
which led back to the permissions issue, as per the GitHub issue below. The fix by dvdabelle in the comments there resolves both that error and the original issue.
https://github.com/numpy/numpy/issues/15102
In his case, he could just switch drive. I have to use this drive. So the fix was to unmount my /data disk where I was working and remount it with exec option!
sudo umount /data
sudo mount -o exec /dev/sda4 /data
'which pip' now points to the pip in the venv correctly
Note: to make it permanent add the exec switch to the line for the drive in fstab as per https://download.tuxfamily.org/linuxvillage/Informatique/Fstab/fstab.html (make exec the last parameter in the options or user will override it) E.g.
UUID=1332d6c6-da31-4b0a-ac48-a87a39af7fec /data auto rw,user,auto,exec 0 0
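Before remounting, you can check whether a mount point actually carries the noexec flag by reading /proc/mounts. This is a sketch: it defaults to / so it runs anywhere, but you would pass a mount point like /data from the answer above:

```shell
# Report exec/noexec status for a mount point (default: /; pass e.g. /data instead).
# /proc/mounts fields: device, mount point, fstype, options, dump, pass.
mnt="${1:-/}"
opts=$(awk -v m="$mnt" '$2 == m { print $4 }' /proc/mounts | head -n 1)
case ",$opts," in
  *,noexec,*) echo "$mnt is mounted noexec - scripts there cannot be executed" ;;
  *)          echo "$mnt allows exec" ;;
esac
```

If this reports noexec for the drive holding your venv, that is the same failure mode as in the answer, and remounting with the exec option is the fix.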
I would like to easily export a Python project from one PC to another. When I created the project, I used a virtual environment in order to avoid problems with different package versions.
What I did was just copy the project folder and paste it into the destination PC. Once I opened the project with PyCharm, I activated the virtual environment with project_path/venv/Scripts/activate, but when I tried to execute any script, it said it didn't find the modules.
Which workflow should I follow in order to create projects and be able to run them on multiple PCs without needing to reinstall all the dependencies?
Since you did not specify your Python version I will provide a solution working for both Python 2.x and 3.x.
My suggestion is to create a requirements.txt file containing all your requirements.
This file can be easily prepared using the output from the command:
pip freeze
Then you can paste the output into your requirements.txt file, and when you are going to install your Python code on another PC you can simply run:
pip install -r requirements.txt
to install your requirements again.
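Put together, the round trip looks like this. The first command runs on the source PC inside the project's environment; the second runs on the destination PC (ideally inside a fresh virtual environment there) after copying requirements.txt along with the project:

```shell
# On the source PC: pin the exact versions of everything installed in the env
pip freeze > requirements.txt

# On the destination PC (inside a fresh venv): restore those versions
pip install -r requirements.txt
```

Committing requirements.txt to version control alongside the code means any machine can rebuild the same environment from scratch.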
Depending on your project it could also be possible, for example, to create a single EXE file (if you are targeting Windows machines), but more details would be needed if that is the case.
In case you are using Python 3 the method that is at the moment arguably more popular in the Python community is Pipenv.
Here's its relevant documentation.
And here you can read a simple example of a workflow.
If you are using Python 3, use pipenv. It will automatically create a Pipfile and Pipfile.lock, which ensures that reinstalling the dependencies on a different machine installs the same package versions.
Basic and helpful commands:
pipenv shell            # activate the virtualenv
pipenv install          # install the dependencies listed in Pipfile
pipenv install requests # install the requests lib and auto-update Pipfile and Pipfile.lock
I am using virtualenv to learn Python. The author of the book I am reading wants no system-wide Python available during learning, so we created a virtual environment via virtualenv (the pip-installed virtualenv package, not Python 3's built-in venv functionality). My issue is that I cannot figure out how to run a script while inside the virtualenv. Virtualenv's documentation says that activation (or full path naming) isn't required when running from within the virtual environment's directory, but although I have moved my file both there and into the Scripts directory, I cannot run it while inside the environment. Any help? I am using Python 3.6.1. The code I'm trying to run is:
def local():
    m = 7
    print(m)

m = 5
print(m)
I realize it's not even training wheel code, but what I'm trying to ultimately do is be able to run code from within the virtual environment to follow as the book suggests. I'm also using a fully updated Windows 10 OS.
What happens when I run the script is this:
(.virtualenv) c:\users\aiii> cd c:\users\aiii\desktop\learning.python\.virtualenv
(.virtualenv) c:\users\aiii\desktop\learning.python\.lpvenv>scopes1.py
'scopes1.py' is not recognized as an internal or external command, operable program or batch file.
(.virtualenv) c:\users\aiii\desktop\learning.python\.lpvenv>python scopes1.py
python: can't open file 'scopes1.py': [Errno 2] No such file or directory.
(.virtualenv) c:\users\aiii\desktop\learning.python\.lpvenv>
I have placed the script both directly in the learning.python folder where the environments are contained (c:\users\aiii\desktop\learning.python\.lpvenv) and inside the .lpvenv\Scripts folder, since that is where other scripts installed in the virtualenv live (c:\users\aiii\Desktop\learning.python\.lpvenv\Scripts\).
First, install Virtualenv:
sudo apt-get install python-virtualenv
Then Create Virtualenv:
virtualenv venv  # venv is the environment's name
To activate the virtualenv, first move to the folder in which you want to enable it and run this command (on Windows, run venv\Scripts\activate instead):
source venv/bin/activate
Once your work is done, deactivate the virtualenv:
deactivate
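For completeness, here is the scope demo from the question made runnable once the environment is active (python scopes1.py). The indentation and the final call to local() are my additions, following what the book's example appears to intend:

```python
# scopes1.py - the m inside local() shadows the module-level m
def local():
    m = 7        # local variable, only visible inside the function
    print(m)

m = 5            # module-level variable
print(m)         # prints 5
local()          # prints 7
print(m)         # prints 5 again - the function never touched the global m
```

Because the function's m is a separate local binding, the module-level m is unchanged after the call.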
Is there any easy way to export the libs my script needs so that I can put all of the files into a git repo and run the script from Jenkins without the need of installing anything?
context:
remote Jenkins without some python libs (RO - no access to terminal)
need to run my script that needs external libs such as paramiko, requests, etc
I have tried freeze.py, but it fails at the make stage.
I have found some articles here about freeze.py, py2exe, and py2app, but none of them helped me.
You can use a virtual environment to install your required python dependencies in the workspace. In short, this sets up a local version of python and pip for which you can install packages without affecting the system installation. Using virtual environments is also a great way to ensure dependencies from one job do not impact other jobs. This solution does require pip and virtualenv to be installed on the build machine.
Your build step should do something like:
virtualenv venv
. venv/bin/activate
pip install -r requirements.txt
# ... perform build, tests ...
If you separate your build into several steps, the environment variables set in the activate script will not be available in subsequent steps. You will need to either source the activate script in each step, or adjust the PATH (e.g. via EnvInject) so that the virtualenv python is run.
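For those later build steps, instead of re-sourcing the activate script you can prepend the venv's bin directory to PATH yourself. This is a sketch: WORKSPACE is the variable Jenkins sets for a job's checkout directory, defaulted here so the snippet also runs standalone:

```shell
# Make the venv's python/pip win PATH lookup in a follow-up build step,
# without sourcing venv/bin/activate again.
WORKSPACE="${WORKSPACE:-$PWD}"
export PATH="$WORKSPACE/venv/bin:$PATH"
command -v python    # resolves to $WORKSPACE/venv/bin/python once the venv exists
```

This is essentially all the activate script does for command resolution, so it is a reasonable substitute when each build step starts a fresh shell.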