I am trying to follow the instructions on this page:
"How do I create a Lambda layer using a simulated Lambda environment with Docker?"
in a Windows 7 environment.
I followed all of these steps:
installed Docker Toolbox
created a local folder 'C:/Users/Myname/mylayer' containing requirements.txt and the python 3.8 folder structure
ran the following commands in Docker Toolbox:
cd c:/users/myname/mylayer
docker run -v "$PWD":/var/task "lambci/lambda:build-python3.8" /bin/sh -c "pip install -r requirements.txt -t python/lib/python3.8/site-packages/; exit"
It returns the following error:
ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'
I don't understand what I am doing wrong. It's probably something obvious (I'm a beginner), but I spent the whole day trying to figure it out and it's getting quite frustrating. I'd appreciate the help!
I ran the following in Windows 10 PowerShell and it worked:
docker run -v ${pwd}:/var/task "amazon/aws-sam-cli-build-image-python3.8" /bin/sh -c "pip install -r requirements.txt -t python/lib/python3.8/site-packages; exit"
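For reference, the command expects a specific layout in the mounted folder: requirements.txt at the root that gets bound to /var/task, with pip's target tree created under python/. A minimal sketch (the package pin is a hypothetical example):

```shell
# Recreate the host folder layout the docker run command expects.
mkdir -p mylayer/python/lib/python3.8/site-packages
printf 'requests==2.25.1\n' > mylayer/requirements.txt   # hypothetical pin
ls mylayer
```

If pip still reports "No such file or directory", the bind mount itself is likely coming up empty; note that Docker Toolbox only shares paths under C:\Users with its VirtualBox VM by default.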
I am trying to build https://github.com/apache/cassandra-website
Python3 is installed, I started the Docker daemon, did a git pull, and then ran ./run.sh website preview, but it yields the following permissions issue even though I am running as root.
[root@localhost cassandra-website]# ./run.sh website preview
Server Docker Engine version: 1.13.1
Executing docker command:
docker run --rm --name website_content -p 5151:5151/tcp -v /root/cassandra-website:/home/build/cassandra-website -v /root/cassandra-website/site-ui/build/ui-bundle.zip:/home/build/ui-bundle.zip -e ANTORA_CONTENT_SOURCES_CASSANDRA_WEBSITE_URL=/home/build/cassandra-website -e ANTORA_UI_BUNDLE_URL=/home/build/ui-bundle.zip apache/cassandra-website:latest preview
container: INFO: Entering preview mode!
container: INFO: Building site.yaml
python3: can't open file './bin/site_yaml_generator.py': [Errno 13] Permission denied
[SOLVED]
Update to the latest Docker from the official Docker repo
Update python3 and install it: yum install -y python36*
Make sure Apache Ant is installed (which explains the YAML issue)
Run the docs build target
Run the preview build target
Access 127.0.0.1:port/path/to/docs and voilà
I have tried zipping and structuring the zip as python/lib/python3.7/site-packages/{matplotlib here}, but it says it couldn't import ft2font, which is imported in matplotlib/__init__.py.
Interestingly, there are no other files with that name in my package. I tried pip install and pip3 install on different OSes, but still no luck.
I have created a matplotlib layer for AWS lambda using docker following these steps:
sudo docker pull amazonlinux
sudo docker run -it amazonlinux:latest /bin/bash
# inside container
yum -y install python37 zip
python3 -m venv python
source python/bin/activate
pip3 install matplotlib
deactivate
rm -rf python/{bin,include,lib64,pyvenv.cfg} python/lib/python3.7/site-packages/{__pycache__,easy_install.py,numpy*,pip*,pkg_resources,setuptools*}
zip -r aws_lambda_python37_layer_matplotlib.zip python/
# in other terminal, copy file from container to host
CONTAINER_ID=$(sudo docker ps|grep amazonlinux|cut -d " " -f1)
sudo docker cp ${CONTAINER_ID}:/aws_lambda_python37_layer_matplotlib.zip .
# exit container
exit
Note that this script deletes numpy from the layer, which is necessary due to the size restrictions of a single Lambda layer. You will therefore need to enable the official numpy/scipy layer from AWS in your Lambda function as well.
You can also do this without Docker on a small EC2 instance. You can find the resulting zip here: https://github.com/ttor/aws_lambda_python37_layer_matplotlib
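Before wiring the zip into a function, it can help to confirm that python/ sits at the root of the archive, since Lambda extracts layers into /opt. A quick local sketch using Python's built-in zipfile CLI, with placeholder files standing in for the real matplotlib install:

```shell
# Recreate the expected layout with a placeholder package, then zip it.
mkdir -p python/lib/python3.7/site-packages/matplotlib
touch python/lib/python3.7/site-packages/matplotlib/__init__.py
python3 -m zipfile -c aws_lambda_python37_layer_matplotlib.zip python/
# Entries should start with python/, not an extra top-level folder:
python3 -m zipfile -l aws_lambda_python37_layer_matplotlib.zip
```

If the listing shows some other directory at the top level, the layer will unpack to the wrong place under /opt and imports will fail.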
I have created a docker container for my pure python program and have set python main.py to be executed when the container is run. Running the container works as expected on my local machine. However, I want to run the container on my institution's high-performance cluster. The cluster machines use Singularity, which I am using to pull my docker image hosted on Dockerhub (the repo is darshank11/ga_paci_final). However, when I try to run the Singularity container, I get the following error: python3: can't open file 'main.py': [Errno 2] No such file or directory.
I've tried to change the base image in the Dockerfile, for example from FROM python:latest to FROM ubuntu:latest. I've made sure the docker container worked on my local machine, and then got one of my co-workers to pull the container from Dockerhub and run it too. Everything works fine until I get to Singularity.
Here is my docker file:
FROM ubuntu:16.04
RUN apt-get update -y && \
apt-get install -y python3-pip python3-dev
RUN mkdir src
WORKDIR /src
COPY . /src
RUN pip3 install --upgrade pip
RUN pip3 install -r requirements.txt
CMD ["python3", "-u", "main.py"]
You're getting that error because the execution context is not what you're expecting. The run path in Singularity is the current directory on the host OS (e.g., ~/ga_paci_final), which has been mounted into the Singularity image.
As mentioned in the comments, one solution is to give the full path to the python file in the Docker CMD statement, e.g. CMD ["python3", "-u", "/src/main.py"]. Another option is to modify the %runscript block of the Singularity definition file to something like:
%runscript
cd /src
python3 -u main.py
That way you ensure the run environment is identical between Docker and Singularity.
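The underlying failure is easy to reproduce without Singularity at all: a relative path in CMD only resolves if the process happens to start in /src, while an absolute path works from anywhere. A minimal sketch, using /tmp/src as a stand-in for the image's /src:

```shell
# Stand-in for the image's /src/main.py.
mkdir -p /tmp/src
printf 'print("ok")\n' > /tmp/src/main.py

cd /    # simulate starting in an unrelated working directory
python3 main.py 2>/dev/null || echo "relative path: not found"
python3 /tmp/src/main.py    # absolute path works from any directory
```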
I need to use docker-compose from Python (for the docker_service functionality in Ansible), but it's not possible to install it using pip because of a company network policy (the VM has no network access, only access to an RPM repository). I can, however, use a yum repository that contains docker-compose.
What I tried is to install docker-compose (version 1.18.0) using yum. However, Python does not recognize docker-compose and suggests using pip: "Unable to load docker-compose. Try pip install docker-compose".
Since in most cases I can solve this kind of issue with yum install python-<package>, I searched the web for a package called python-docker-compose, but found no result :(
minimalistic ansible script for test:
- name: Run using a project directory
  hosts: localhost
  gather_facts: no
  tasks:
    - docker_service:
        project_src: flask
        state: absent
Hope someone can help.
SOLUTION:
After some digging around I solved the issue by doing a local download on a machine that has internet:
pip download -d /tmp/docker-compose-local docker-compose
then archiving all the packages that were downloaded into that folder:
cd /tmp
tar -czvf docker-compose-python.tgz ./docker-compose-local
since the total size of the package is slightly bigger than 1MB I added the file to the ansible docker role.
In the docker role a local install is done:
cd /tmp
tar -xzvf docker-compose-python.tgz
pip install --no-index --find-links file:/tmp/docker-compose-local/ docker_compose
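For reference, the packaging round-trip can be sketched offline; the wheel filename below is a placeholder standing in for pip download's real output:

```shell
# Simulate the pip download output folder with a placeholder wheel.
mkdir -p /tmp/docker-compose-local
touch /tmp/docker-compose-local/docker_compose-1.18.0-py2.py3-none-any.whl

# Archive for transfer to the offline machine...
tar -C /tmp -czf /tmp/docker-compose-python.tgz docker-compose-local

# ...then on the target, extract and point pip at the local folder:
tar -C /tmp -xzf /tmp/docker-compose-python.tgz
# pip install --no-index --find-links file:/tmp/docker-compose-local/ docker-compose
```

The pip line is commented out here because the placeholder wheel is not actually installable; with the real download folder it installs entirely from local files.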
Use a virtual environment!
Unless you cannot do that either: it depends whether the company policy is not to write to everyone's Python installation (then you are fine) or whether you cannot use pip at all (even in your own environment).
If you CAN do that then:
virtualenv docker_compose -p python3
source docker_compose/bin/activate
pip install docker-compose
You get all this junk:
Collecting docker-compose
Downloading https://files.pythonhosted.org/packages/67/03
Collecting docker-pycreds>=0.3.0 (from docker<4.0,>=3.4.1->docker-compose)
Downloading https://files.pythonhosted.org/packages/ea/bf/7e70aeebc40407fbdb96fa9f79fc8e4722ea889a99378303e3bcc73f4ab5/docker_pycreds-0.3.0-py2.py3-none-any.whl
Building wheels for collected packages: PyYAML, docopt, texttable, dockerpty
Running setup.py bdist_wheel for PyYAML ... done
Stored in directory: /home/eserrasanz/.cache/pip/wheels/ad/da/0c/74eb680767247273e2cf2723482cb9c924fe70af57c334513f
Running setup.py bdist_wheel for docopt ... done
Stored in directory: /home/eserrasanz/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e
Running setup.py bdist_wheel for texttable ... done
Stored in directory: /home/eserrasanz/.cache/pip/wheels/99/1e/2b/8452d3a48dad98632787556a0f2f90d56703b39cdf7d142dd1
Running setup.py bdist_wheel for dockerpty ... done
Stored in directory: /home/eserrasanz/.cache/pip/wheels/e5/1e/86/bd0a97a0907c6c654af654d5875d1d4383dd1f575f77cee4aa
Successfully installed PyYAML-3.13 cached-property-1.5.1 certifi-2018.8.24 chardet-3.0.4 docker-3.5.0 docker-compose-1.22.0 docker-pycreds-0.3.0 dockerpty-0.4.1 docopt-0.6.2 idna-2.6 jsonschema-2.6.0 requests-2.18.4 six-1.11.0 texttable-0.9.1 urllib3-1.22 websocket-client-0.53.0
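A note on the virtualenv step above: on Python 3.3+ the stdlib venv module gives the same isolation without installing virtualenv first. A minimal sketch:

```shell
python3 -m venv /tmp/dc_venv        # stdlib equivalent of virtualenv
. /tmp/dc_venv/bin/activate
python -c 'import sys; print(sys.prefix)'   # now points inside /tmp/dc_venv
# pip install docker-compose        # would install into the venv only
deactivate
```

Anything installed while the venv is active stays under /tmp/dc_venv, so the system Python is untouched.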
I think you should be able to install it using curl from GitHub, assuming this is not blocked by your network policy. Link: https://docs.docker.com/compose/install/#install-compose.
sudo curl -L "https://github.com/docker/compose/releases/download/1.22.0/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
docker-compose --version
# Outputs: docker-compose version 1.22.0, build 1719ceb
Hope this helps.
To complete E.Serra's answer, you can use Ansible to create the virtualenv, use pip and install docker-compose in one task this way:
- pip:
    name: docker-compose
    virtualenv: /my_app/venv
My objective is to scrape the web with Selenium driven by Python from a docker container.
I've looked around for and not found a docker image with all of the following installed:
Python 3
ChromeDriver
Chrome
Selenium
Is anyone able to link me to a docker image with all of these installed and working together?
Perhaps building my own isn't as difficult as I think, but it's eluded me thus far.
Any and all advice appreciated.
Try https://github.com/SeleniumHQ/docker-selenium.
It has python installed:
$ docker run selenium/standalone-chrome python3 --version
Python 3.5.2
The instructions indicate you start it with
docker run -d -p 4444:4444 --shm-size=2g selenium/standalone-chrome
Edit:
To drive Selenium from Python, it appears you need to install the Python packages inside the image. Create this Dockerfile:
FROM selenium/standalone-chrome
USER root
RUN wget https://bootstrap.pypa.io/get-pip.py
RUN python3 get-pip.py
RUN python3 -m pip install selenium
Then you could run it with
docker build . -t selenium-chrome && \
docker run -it selenium-chrome python3
The advantage compared to the plain python docker image is that you won't need to install the chromedriver itself since it comes from selenium/standalone-chrome.
I like Harald's solution.
However, as of 2021, my environment needed some modifications.
Docker version 20.10.5, build 55c4c88
I changed the Dockerfile as follows.
FROM selenium/standalone-chrome
USER root
RUN apt-get update && apt-get install python3-distutils -y
RUN wget https://bootstrap.pypa.io/get-pip.py
RUN python3 get-pip.py
RUN python3 -m pip install selenium
https://hub.docker.com/r/joyzoursky/python-chromedriver/
It uses python3 as the base image and installs chromedriver, Chrome and Selenium (as a pip package) to build. I used the alpine-based python3 version myself, as the image size is smaller.
$ cd [your working directory]
$ docker run -it -v $(pwd):/usr/workspace joyzoursky/python-chromedriver:3.6-alpine3.7-selenium sh
/ # cd /usr/workspace
See if the images suit your case: you could pip install selenium together with other packages via a requirements.txt file to build your own image, or take reference from this project's Dockerfiles.
If you want to pip install more packages apart from selenium, you could build your own image as this example:
First, in your working directory, you may have a requirements.txt storing the package versions you want to install:
selenium==3.8.0
requests==2.18.4
urllib3==1.22
... (your list of packages)
Then create the Dockerfile in the same directory like this:
FROM joyzoursky/python-chromedriver:3.6-alpine3.7
RUN mkdir packages
ADD requirements.txt packages
RUN pip install -r packages/requirements.txt
Then build the image:
docker build -t yourimage .
This differs from the official Selenium one in that Selenium is installed as a pip package on a Python base image. However, it is maintained by an individual, so there is a higher risk of it ceasing to be maintained.