This is the command I used to try to create an environment with conda, using a .yml file:
conda env create -f python3.6-environment.yml
After running that command, I get stuck at this output:
Fetching package metadata ...............
Solving package specifications: .
However, I noticed that I can copy the contents of the .yml file into a .py file.
conda env create -f python3.6-environment.py
Can I then run this command instead and have all the same packages installed as with the .yml?
The contents of the .yml file are from GitHub:
https://github.com/enigmampc/catalyst/blob/master/etc/python3.6-environment.yml
I think this works conceptually, but for cleaner factoring I would suggest writing the activation as a bash script that also launches your Python code. Just a thought.
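For instance, the wrapper suggested above might look something like the following sketch; the environment name "catalyst" and the entry point "main.py" are assumptions, not taken from the linked repo:

```shell
#!/bin/bash
# create the env from the yml, then activate it and run the app;
# "catalyst" and "main.py" are placeholders for your env name and script
conda env create -f python3.6-environment.yml
# "conda activate" needs the shell hook when run from a script
eval "$(conda shell.bash hook)"
conda activate catalyst
python main.py
```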
I have a .yml file, called icesattest.yml, that I would like to create a Python environment from. It is in my Downloads folder. When I run the following command in the Anaconda Prompt:
conda env create -f icesattest.yml
I get the following error:
EnvironmentFileNotFound: 'C:\Users\scox4\icesattest.yml' file not found
How do I tell the Anaconda Prompt to look for the .yml at C:\Users\scox4\downloads\icesattest.yml? Or, how do I move the .yml file to the path where it is being read? I tried to drag and drop icesattest.yml onto the This PC sidebar in File Explorer, but it wouldn't let me move it there. I know this is a really silly mistake, but I am a complete novice, so any simple advice would help!
I searched and found the command used to create environments from .yml files, so I copy-pasted it, but it failed. I changed the command to match the icesattest.yml filename and it still did not work.
The way I typically handle this is via the env update command. I've found it to be a bit more robust when it comes to working with environment files.
conda env update -n <name of virtual environment> -f environment.yml
This command works both with existing environments and with new environments.
I should mention that I typically work with Linux and not Windows, so I can only guarantee that this will work well with unix systems. I'd recommend you check out something like WSL if you can, as working with Linux is typically a lot easier than Windows once you get the hang of it.
If you want to read more about conda environments: https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#updating-an-environment
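To address the original relative-path problem specifically: conda resolves a relative -f path against the prompt's current directory (C:\Users\scox4 in the error above), not against Downloads. Either of these should work in the Anaconda Prompt (paths taken from the error message in the question):

```shell
:: either cd into the folder that actually contains the file...
cd C:\Users\scox4\downloads
conda env create -f icesattest.yml

:: ...or pass the full path to the file directly
conda env create -f C:\Users\scox4\downloads\icesattest.yml
```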
My goal is to create a .zip file that includes my Python code and a batch file that runs that code without installing anything else on the user's system, using only files within the .zip. This means the batch file can't run python main.py, since that would first require the user to install Python, which for my purposes is inconvenient and bad.
My first instinct was to put my code in a folder. Then, create a virtualenv with all dependencies installed. Create a batch file with the following code:
".venv/Scripts/python.exe" code_path/main.py
Package the code, virtualenv, and batch file in a zip file then pass that around to the users.
The problem I have with this is that it feels dumb to add the virtualenv to the zip file. EDIT: Even if I were to add it, the .venv apparently wouldn't even work on other systems.
I tried other solutions, like building a .exe with PyInstaller instead, but it kept triggering false-positive detections from security tools like Chrome and Windows Defender, so I strayed away from that path and tried using .zip files instead.
EDIT: question currently limited to solutions for Windows 10/11.
Docker is a nice solution for the problem you face.
You could build the environment and app in a container based on the official Python Docker image, then pack and distribute your Python app image.
The following Dockerfile is an example:
Prepare your app's .py file in /usr/src/app and list the requirements in requirements.txt:
FROM python:3
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD [ "python", "./your-daemon-or-script.py" ]
Visit https://hub.docker.com/_/python for more detail.
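To actually produce and hand off the image, the usual commands look something like this (the image and file names here are placeholders):

```shell
# build the image from the Dockerfile above
docker build -t my-python-app .
# run it locally to test
docker run --rm my-python-app
# export the image so users can load it without a registry
docker save my-python-app -o my-python-app.tar
# users then run: docker load -i my-python-app.tar && docker run --rm my-python-app
```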
I'm trying to build a dashboard app with Plotly Dash.
Therefore, I'm transitioning from Jupyter to the Atom IDE.
It seems I have managed to set up a virtual environment and activate it with no errors; in that virtual environment (from dash_reqlibs.yml) I'm trying to install the Plotly library, which also installs with no errors according to the terminal:
After the installation, I try to run my simple test code and it tells me it doesn't see the module plotly:
Now, please note that, according to my research, I'm not even supposed to do that, since in my dash_reqlibs.yml I have specified everything needed - it doesn't work either way though:
UPDATE:
Removed the bad venv
Created a new one successfully
Changed to the app.py directory and activated the venv
Cannot select the venv as it doesn't show up
Try creating the environment from the beginning again, using these steps:
Create a “dash_reqlibs.yml” file with the required libraries
Save it inside your Environments folder (or whatever you call that folder)
Open Windows Command Prompt (or the Atom terminal) and ‘cd’ into your Environments folder
Create a new virtual environment by typing: conda env create -f dash_reqlibs.yml
Activate your new environment:
a. On Windows, type: conda activate env_dash (the name value inside the .yml file)
Let me know how it goes.
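For reference, a minimal dash_reqlibs.yml along these lines might look like the following; the env_dash name matches step 6a above, but the exact package list and versions are assumptions:

```yaml
name: env_dash
channels:
  - conda-forge
dependencies:
  - python=3.8
  - dash
  - plotly
  - pandas
```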
I have a python project and I am using pipenv to handle deps.
I need to create a zip file that includes the source code and all the dependencies code as well. I need this zip file for uploading it to AWS Lambda.
When working with pipenv, it installs the dependency libraries somewhere on the computer, but for packaging/distribution of the project I need all the necessary code to be contained in one place (a zip file).
Is there a way to run pipenv and set it to install dependencies at a specific path? If not, does someone know where those dependencies are located on my machine?
Thanks
This has worked for me:
#!/bin/bash
# pipenv stores the virtual env in a separate directory,
# so we need to ask it for the path
SITE_PACKAGES=$(pipenv --venv)/lib/python3.6/site-packages
echo "Library Location: $SITE_PACKAGES"
DIR=$(pwd)

# Make sure pipenv is good to go
echo "Do fresh install to make sure everything is there"
pipenv install

# zip the dependencies, then append the source file
cd "$SITE_PACKAGES"
zip -r9 "$DIR/package.zip" *
cd "$DIR"
zip -g package.zip posts.py
I've specifically tried it with numpy and it works correctly. It includes the .so files as well which is great because everything is self contained.
I currently have an executable file that is running Python code inside a zipfile following this: https://blogs.gnome.org/jamesh/2012/05/21/python-zip-files/
The nice thing about this is that I release a single file containing the app. The problems arise with the dependencies: I have attempted to install packages with pip into custom locations, and when I embed them in the zip I always hit import issues, or issues that end up depending on host packages.
I then started looking into virtual environments as a way to ensure package dependencies. However, it seems that the typical workflow on the target machine is to source the activation script and run the code within the virtualenv. What I would like to do is have a single file containing a Python script and all its dependencies and for the user to just execute the file. Is this possible given that the Python interpreter is actually packaged with the virtualenv? Is it possible to invoke the Python interpreter from within the zip file? What is the recommended approach for this from a Python point of view?
You can create a bash script that creates the virtual env and runs the Python script as well.
#!/bin/bash
virtualenv .venv
.venv/bin/pip install <python packages>
.venv/bin/python script