How to run a bash script through PyCharm? - python

I have the following set of bash commands
docker pull mcr.microsoft.com/mssql/server:2017-latest
docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=my_password' \
--name 'sql1' -p 1401:1433 \
-v "my_space":/opt/project \
-d mcr.microsoft.com/mssql/server:2017-latest
winpty docker exec -it sql1 bash
mkdir -p /var/opt/mssql/backup
cp my_db /var/opt/mssql/backup/
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "my_password" -i /opt/project/scripts/database_import/sql_script.sql
apt-get update -y
apt-get install python3-pip -y
python3 -m pip install pymssql
python3 -m pip install pandas==0.19.2
python3 -m pip install time
python3 -m pip install sqlalchemy
python3 -m pip install sqlalchemy_utils
cd /opt/project/
python3 scripts/database_import/import_database.py
Essentially, this set of commands pulls the MSSQL Server image, restores a database, installs some Python packages, and runs a Python script inside the MSSQL Docker container.
Is there a way to run this bash script from PyCharm?

Sure thing. If you are using the latest version, then the Shell Script plugin should be available: https://www.jetbrains.com/help/idea/shell-scripts.html
So, for example, if I have a test.sh file, I can just click the green run button in the gutter and PyCharm will run it, or create a Run Configuration for it (see the link above).
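For the set of commands in the question, a minimal sketch would be to save them into a script at the project root (the file name run_mssql.sh below is just an example) and run that file from PyCharm:
#!/usr/bin/env bash
# Example wrapper for the commands from the question; stop at the first failure.
set -euo pipefail
docker pull mcr.microsoft.com/mssql/server:2017-latest
docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=my_password' \
    --name 'sql1' -p 1401:1433 \
    -v "my_space":/opt/project \
    -d mcr.microsoft.com/mssql/server:2017-latest
# ...the remaining commands from the question follow here...
With the Shell Script plugin enabled, right-clicking the file and choosing Run (or using the gutter icon) creates a Run Configuration that can be re-run from the toolbar. Note that the commands after winpty docker exec -it sql1 bash are typed inside the container, so they would need to be passed to docker exec rather than run from the host script.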

Related

Run file in conda inside docker

I have Python code that I am attempting to wrap in a Docker image:
FROM continuumio/miniconda3
# Python 3.9.7 , Debian (use apt-get)
ENV TARGET=dev
RUN apt-get update
RUN apt-get install -y gcc
RUN apt-get install dos2unix
RUN apt-get install -y awscli
RUN conda install -y -c anaconda python=3.7
WORKDIR /app
COPY . .
RUN conda env create -f conda_env.yml
RUN echo "conda activate tensorflow_p36" >> ~/.bashrc
RUN pip install -r prod_requirements.txt
RUN pip install -r ./architectures/mask_rcnn/requirements.txt
RUN chmod +x aws_pipeline/set_env_vars.sh
RUN chmod +x aws_pipeline/start_gpu_aws.sh
RUN dos2unix aws_pipeline/set_env_vars.sh
RUN dos2unix aws_pipeline/start_gpu_aws.sh
RUN aws_pipeline/set_env_vars.sh $TARGET
Building the image works fine, and running it with the following command also works fine:
docker run --rm --name d4 -dit pd_v2 sh
My OS is Windows 11. When I use the Docker Desktop "CLI" button to enter the container, all I need to do is type "bash"; the conda environment "tensorflow_p36" is activated and I can run my code.
When I try docker exec in the following manner:
docker exec d4 bash && <path_to_sh_file>
I get an error that the file doesn't exist.
What is missing here? Thanks
Won't bash && <path_to_sh_file> enter a bash shell, exit it successfully, and then try to run your sh file in a new shell on the host, outside the container (which is why the file isn't found)? I think it would be better to put #!/usr/bin/bash as the top line of your sh file, and to make sure the sh file has executable permissions (chmod a+x <path_to_sh_file>).
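A minimal sketch of the corrected invocation, assuming the script ends up under /app (the WORKDIR in the Dockerfile) and has been made executable by the chmod steps above:
# Run the file with bash inside the container instead of chaining with &&,
# which would execute the second command on the host after bash exits:
docker exec d4 bash /app/aws_pipeline/start_gpu_aws.sh
# If the script relies on the conda environment activated in ~/.bashrc,
# an interactive shell will source that file first:
docker exec -it d4 bash -ic "/app/aws_pipeline/start_gpu_aws.sh"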

How to install python 3.9 on Amazon Linux 2 with cloud-init and CDK

I'm trying to install Python 3.9 on an EC2 instance that uses Amazon Linux 2. I tried following this guide: https://computingforgeeks.com/install-latest-python-on-centos-linux/, and I was able to install Python 3.9 manually on the EC2 instance by SSH'ing in and running the commands. I'm now trying to set up the EC2 instance with a UserData script that calls some CloudFormationInit scripts to install dependencies, including Python 3.9, and my script is failing.
Here's part of the script that I'm using to install Python 3.9:
const installPythonString = `
#!/bin/bash
sudo amazon-linux-extras install -y epel
sudo yum -y update
sudo yum groupinstall "Development Tools" -y
sudo yum install openssl-devel libffi-devel bzip2-devel -y
gcc --version
sudo yum install wget -y
sudo mkdir -p /opt/python3.9/
sudo chown -R $USER:$USER /opt/python3.9/
wget https://www.python.org/ftp/python/3.9.9/Python-3.9.9.tgz -P /opt/python3.9
cd /opt/python3.9/
tar xvf Python-3.9.9.tgz
whoami
sudo chown -R $USER:$USER Python-3.9.9
cd Python-3.9.9/
ls -al
pwd
./configure --enable-optimizations
sudo make altinstall
python3.9 --version
pip3.9 --version
`;
init.addConfig('install_python39', new ec2.InitConfig([
  ec2.InitFile.fromString('/opt/install_python39.sh', installPythonString, {
    mode: '000755',
    owner: 'root',
    group: 'root',
  }),
  ec2.InitCommand.shellCommand('sudo sh install_python39.sh', {
    cwd: '/opt',
    key: 'install_python39',
  }),
]))
I'm getting the following errors when trying to start up the EC2 instance:
Python build finished successfully!
...
WARNING: The script pip3.9 is installed in '/usr/local/bin' which is not on PATH.
install_python39.sh: line 21: python3.9: command not found
install_python39.sh: line 22: pip3.9: command not found
Is there an easier way to install Python 3.9 on Amazon Linux 2 using CloudFormationInit?
It looks like Python was installed to /usr/local/bin, which is not in $PATH, so the python3.9 command is not found.
Run the following commands in order:
export PATH="/usr/local/bin:$PATH", or echo 'export PATH="/usr/local/bin:$PATH"' >> ~/.bashrc (relaunch the SSH session if you do this) to save it to .bashrc so you don't have to run the export every time you log in.
python3.9 --version
Additionally, if you keep having issues, follow this to install Python 3.9, which is what I used, and everything went flawlessly.
If you have Python packages that need to be installed, I would recommend creating a requirements.txt and using pip3.9 install -r requirements.txt to install them.
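Inside the install script itself, a minimal sketch of the same fix (the warning in the question already says the binaries landed in /usr/local/bin) would be:
# Either extend PATH for the rest of the script...
export PATH="/usr/local/bin:$PATH"
python3.9 --version
pip3.9 --version
# ...or call the binaries by absolute path, which also works in the
# non-interactive shell used by cfn-init, where ~/.bashrc is not read:
/usr/local/bin/python3.9 --version
/usr/local/bin/pip3.9 --version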

Dockerized Python (Streamlit) app uses wrong folder for python libraries

I am trying to dockerize a Streamlit app. Creating the image works, but when I try to start the app, Python seems to look in the wrong place for my packages.
The app should run on OpenShift with Python 3.6.
FROM registry.access.redhat.com/redhat-openjdk-18/openjdk18-openshift
USER root
ADD content /
RUN yum -y update \
&& yum -y --enablerepo "*" install bzip2 \
python36-pip \
python36 \
python36-devel \
openssl \
&& yum clean all -y
RUN mkdir -p /usr/local/lib/python3.6/site-packages \
&& python3 -m ensurepip
ENV PIP_CONFIG_FILE=/opt/pip/pip.conf
RUN python3 -m pip install --upgrade pip
RUN python3 -m pip install -r /opt/pip/requirements.txt
ENV LC_ALL=en_US.utf-8
ENV LANG=en_US.utf-8
RUN useradd -rm -d /home/usdlmod -s /bin/bash -g root -u 1001 usdlmod -p "$(openssl passwd -1 usdlmod)"
RUN chgrp root /etc/passwd && chmod ug+rw /etc/passwd
USER usdlmod
CMD ["python", "-m", "streamlit.cli", "run", "main.py", "--server.port=8080"]
EXPOSE 8080
On OpenShift I get the following error: /usr/bin/python: No module named streamlit
How can I solve this error?
It might be that you have both Python 2 and Python 3 installed. Changing the CMD to python3 should solve it.
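A minimal sketch of that change, keeping the rest of the Dockerfile as-is and assuming python3 resolves to the interpreter installed by the python36 package:
CMD ["python3", "-m", "streamlit.cli", "run", "main.py", "--server.port=8080"]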

How to run a python script by using cron in a docker container

I have a Python project that contains a Dockerfile like this:
FROM python:3.8.3 AS base
RUN apt-get update
RUN apt-get -y install software-properties-common cron vim
RUN apt-get update
RUN apt-get -y install python3-pip
RUN pip3 install pandas
RUN pip3 install sklearn
RUN pip3 install SQLAlchemy
RUN pip3 install ConfigParser
RUN pip3 install psycopg2
RUN pip3 install numpy
RUN pip3 install xgboost
RUN pip3 install xlrd
RUN pip3 install matplotlib
FROM base AS publish
WORKDIR /app
COPY . /app
RUN touch /var/log/cron.log
RUN (crontab -l ; echo "* * * * * python /app/main.py >> /var/log/cron.log") | crontab
CMD cron && tail -f /var/log/cron.log
I want to execute my main.py Python script every minute, but main.py is not running when I build and run the Docker container. I have found some solutions on this site, but they did not work for me. What should I write to execute my main.py script?
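For reference, a commonly used pattern (a sketch only, not a verified fix for this exact setup) is to call the interpreter by absolute path, since cron's default PATH usually does not include /usr/local/bin in the official python image, and to redirect the job's output inside the cron entry itself:
# Replace the last three lines of the Dockerfile with something like:
RUN touch /var/log/cron.log
RUN (crontab -l 2>/dev/null; echo "* * * * * /usr/local/bin/python /app/main.py >> /var/log/cron.log 2>&1") | crontab -
CMD cron && tail -f /var/log/cron.log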

How do you setup python3.2 on ubuntu 14.04 for testing with tox?

Trying to use tox to run tests before pushing, but I keep running into errors like:
ERROR: py26: InterpreterNotFound: python2.6
ERROR: py32: InterpreterNotFound: python3.2
ERROR: py34: InterpreterNotFound: python3.4
apt-cache search isn't offering any packages that look like they will help. How do you get all these interpreter versions on Ubuntu 14.04?
Obviously, Ubuntu doesn't ship all historic versions of Python. But you can use the deadsnakes PPA, which has everything from 2.3 to 3.4.
For one project that I used the drone.io CI service with, I had the following tox section, which I ran before the actual test envs.
[testenv:setupdrone]
whitelist_externals = /bin/bash
commands =
bash -c "echo 'debconf debconf/frontend select noninteractive' | sudo debconf-set-selections"
bash -c "sudo add-apt-repository ppa:fkrull/deadsnakes &> /dev/null"
bash -c "sudo apt-get update &> /dev/null"
bash -c "sudo apt-get -y install python2.6 python3.4 &> /dev/null"
