I am competing in the HWO2014, yet I cannot run my bot. Here is the build file I was provided with, alongside the run file:
build:
#!/bin/bash
virtualenv env --no-site-packages --distribute
source env/bin/activate
run:
#!/bin/bash
source env/bin/activate
python main.py "$@"
However, when I run ./build on the MinGW terminal, the following error is reported:
./build: line 3: virtualenv: command not found
./build: line 5: env/bin/activate: No such file or directory
What does this error mean? How do I prevent it?
You need to install virtualenv on your machine. The run file is a bash script that activates the newly created virtual environment before running the Python script. More info can be found in the official virtualenv docs.
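Concretely (assuming Python and pip are already on the PATH in the MinGW shell), the missing tool can be installed with pip before re-running the provided scripts:

```shell
# Install virtualenv so the build script can find it on PATH
python -m pip install virtualenv

# Now the provided scripts should work: build creates env/bin/activate,
# and run activates that environment before starting the bot
./build
./run
```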
Hi, I have installed the virtualenv Python library to manage my environments and done all the setup, but when I use the following command to open my environment, this happens:
francesco@AirdiFrancesco ~ % workon
zsh: command not found: workon
But when I run this code before workon, it doesn't give me problems:
export WORKON_HOME=~/Envs
VIRTUALENVWRAPPER_PYTHON=$(which python3)
source /opt/homebrew/bin/virtualenvwrapper.sh
Now I have tried adding these three lines to my ~/.zshrc file, but it doesn't work. Where do I need to put this code in my shell configuration?
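For what it's worth, those three lines would normally go at the end of ~/.zshrc (assuming zsh is the login shell and virtualenvwrapper was installed via Homebrew, as the /opt/homebrew path suggests), so that every new interactive shell gets workon defined:

```shell
# ~/.zshrc — virtualenvwrapper setup (Homebrew paths assumed)
export WORKON_HOME=~/Envs
export VIRTUALENVWRAPPER_PYTHON="$(which python3)"
source /opt/homebrew/bin/virtualenvwrapper.sh
```

After editing, open a new terminal or run source ~/.zshrc for the change to take effect.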
I'm trying to run a file from a GitHub repo using the Command Prompt on Windows. I started with these commands:
python -m pip install virtualenv
python -m virtualenv ocopus_venv
.\ocopus_venv\Scripts\activate.bat
curl -O https://github.com/zuphilip/ocropy-models/raw/master/en-default.pyrnn.gz
move en-default.pyrnn.gz models
No errors so far, but when I run:
./ocropus-nlbin tests/ersch.png -o book
I get this error: '.' is not recognized
How can I make this command run properly?
On Windows cmd this doesn't work. Try it without the ./ at the beginning. Since ocropy supports Windows, you can also run it with Python 2.7:
python ocropus-nlbin tests/ersch.png -o book
Don't forget that you have to run python setup.py install first.
I would like my script to use the same python3 executable from crontab as it does when launched manually (in an SSH session).
bash script
#!/bin/bash
PYTHONPATH="$(which python3)"
echo $PYTHONPATH
python3 test.py
result from ssh command line, launched manually
/usr/local/bin/python3
result in log file from crontab -e
/usr/bin/python3
I would like the script launched by crontab to use the /usr/local/bin/python3 executable instead of /usr/bin/python3,
OR
if that's not possible, to make my code's dependencies available to /usr/bin/python3.
How can I achieve this? Thank you very much for your help.
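A common workaround (my sketch, not part of the answer below): cron runs jobs with a minimal PATH, so either set PATH at the top of the crontab or reference the interpreter by absolute path in the job line. The schedule and script path here are assumptions for illustration:

```shell
# crontab -e — cron's default PATH usually lacks /usr/local/bin,
# so either extend it for all jobs:
PATH=/usr/local/bin:/usr/bin:/bin
# or reference the interpreter explicitly in the job line
# (hourly schedule and script path are hypothetical):
0 * * * * /usr/local/bin/python3 /path/to/test.py
```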
The Python inside the Docker container will not necessarily have the same path. If you want all the modules installed on your VM's python3, create a requirements.txt file using pip freeze > requirements.txt, COPY this file as part of your Dockerfile, and install it while building the image with pip install -r requirements.txt.
I am writing a bash script in linux that creates and activates a Python venv and then installs from a requirements.txt. Like this
python3 -m venv ~/myvenv/env
source ~/myvenv/env/bin/activate
cp requirements.txt ~/myvenv/env/requirements.txt
pip3 install -r ~/myvenv/env/requirements.txt
This doesn't work for me. It seems to create the myvenv directory, but then doesn't switch into it or install from the requirements.txt file.
Is there a different way to activate it with source from within a bash script?
When you run the script, your shell spawns a new process, which activates the environment and then dies.
That's why when you get back to your shell you see an unactivated one.
You can run your script using the source command instead; source loads it into your active shell:
source script.sh
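Alternatively (a sketch on my part, not from the answer above), a script can skip activation entirely by calling the venv's own pip by path, since the executables inside a venv always operate on that venv:

```shell
#!/bin/bash
# Create the venv and install into it without ever activating it;
# ~/myvenv/env/bin/pip is hard-wired to this environment.
python3 -m venv ~/myvenv/env
~/myvenv/env/bin/pip install -r requirements.txt
```

This works no matter how the script is invoked, because nothing depends on the parent shell's state.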
I'm trying to automate the deployment of my Python-Flask app on Ubuntu 18.04 using Bash: preparing all the necessary files/directories, cloning the source code from GitHub, creating the virtual environment, installing the prerequisite modules, and so on.
Because I have to execute my Bash script using sudo, the entire script runs as root except where I specify otherwise using sudo -u myuser. When it comes to activating my virtual environment, I get the following output: sudo: source: command not found, and my subsequent pip installs all land outside of the virtual environment. Excerpts of my code below:
#!/bin/bash
...
sudo -u "$user" python3 -m venv .env
sudo -u $SUDO_USER source /srv/www/www.mydomain.com/.env/bin/activate
sudo -u "$user" pip install wheel
sudo -u "$user" pip install uwsgi
sudo -u "$user" pip install -r requirements.txt
...
For the life of me, I can't figure out how to activate the virtual environment as the right user, if that makes any sense.
I've scoured the web, and most of the questions/answers I found revolve around how to activate a virtual environment in a Bash script, but not how to activate it as a separate user within a Bash script that was executed with sudo.
That's because source is not an executable file, but a built-in bash command. It won't work with sudo, since the latter accepts a program name (i.e. executable file) as argument.
P.S. It's not clear why you have to execute the whole script as root. If you need to execute only a number of commands as root (e.g. for starting/stopping a service) and run a remaining majority as a regular user, you can use sudo only for these commands. E.g. the following script
#!/bin/bash
# The `whoami` command outputs the current username. Unlike `source`, this is
# a full-fledged executable file, not a built-in command
whoami
sudo whoami
sudo -u postgres whoami
on my machine outputs
trolley813
root
postgres
P.P.S. You probably don't need to activate an environment as root.
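Following that logic, one way to sidestep source altogether (a sketch; the paths and the $user variable are taken from the question) is to call the venv's executables by absolute path under sudo -u, which needs no activation step at all:

```shell
#!/bin/bash
# The venv's pip always installs into its own environment,
# so there is nothing to "activate" here.
VENV=/srv/www/www.mydomain.com/.env
sudo -u "$user" python3 -m venv "$VENV"
sudo -u "$user" "$VENV/bin/pip" install wheel uwsgi
sudo -u "$user" "$VENV/bin/pip" install -r requirements.txt
```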