I created an application using py2app and have completed it. Running it on my machine has no problem, but I am concerned it might have problems such as missing module or other errors when run on someone else's with no programs installed. Is there a way to test this?
(sorry, I'm sure this is on the Internet but I'm not sure how to search for it)
You really can address your concern about missing modules and similar errors on your current system by testing the app in an isolated virtual environment.
Create a virtual environment, install only the necessary packages into it, and see if the app works.
To do so:
I think the fastest and most convenient way is to use miniconda to create a backup of your current environment (or a fresh working virtual environment) and re-create it on another (or the same) machine.
Miniconda is an environment-management tool, a super-lightweight version of conda (only ~60 MB).
Just install miniconda following the instructions here.
Then create a new environment as below:
conda create --name newtestenvironment python=3.9
This command creates an environment containing only Python. You can then test your app, see which "No module named ..." errors you get, and install the missing packages.
Whenever you reach a point where everything is working, you can export the necessary packages with:
conda env export > environment.yml
You can also create a requirements.txt file, so the user can run pip install -r requirements.txt before running the app:
conda list -e > requirements.txt
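If you want to automate the "which modules are missing" check instead of launching the app repeatedly in the fresh environment, a small Python sketch can probe for importable modules up front (the dependency list here is a made-up example; substitute your app's actual imports):

```python
import importlib.util

# Hypothetical dependency list for the app; replace with your own modules.
required = ["json", "sqlite3", "some_missing_package_xyz"]

# find_spec returns None when a module cannot be located in this environment.
missing = [name for name in required if importlib.util.find_spec(name) is None]
print("missing:", missing)
```

Run this inside the fresh environment and it lists every dependency that still needs to be installed, without having to trigger each ImportError one at a time.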
Related
I created an environment using virtualenvwrapper while my machine was running Python 3.8.3. Naturally, this made the environment with Python 3.8.3 as well.
Later, I updated Python on my main machine using home brew to 3.10.
Now I have 3.10-specific code in the virtual env project, running 3.8.3.
The first entry in that project's $PATH is set to the virtual env itself, which uses the old Python. The second entry is Python 3.10. I believe this is standard; the virtual env is added to the front of $PATH by virtualenvwrapper upon activation.
Short of manually manipulating the .zprofile or virtualenvwrapper's postactivate script, I am wondering if there is a more sweeping and automatic way of updating a virtual environment using virtualenvwrapper.
I am no expert on this stuff, so my concern is that manually manipulating files, or applying one-time fixes will just be asking for trouble down the line.
Thanks for your help.
EDIT: I wanted to add that I am also learning git and have a repo set up in this project. Ideally, I would like to preserve that information through the "upgrade," which it sounds like involves creating a new env. The virtualenvs are stored in my home directory under .virtualenvs; the git repo is in the project directory.
You don't want to do this. As you said, even if you pulled it off you're sure to have hidden issues that'll be a major headache down the line. Fortunately, it's very easy to recreate the virtual env with exactly the same installed packages you had before but with a new Python version.
What you want is to compile a list of the packages installed in your old virtualenv, make a new venv with the desired Python version, then reinstall all the packages:
workon oldenv
pip freeze > env_requirements.txt
deactivate
mkvirtualenv newenv -p `which python3.10`
pip install -r env_requirements.txt
If you're happy with the result, you can then delete the old venv:
rmvirtualenv oldenv
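As a quick sanity check that the freeze dump really captured pinned versions before you delete anything, you can parse it in Python. This is just an illustrative sketch; the sample text and package versions are made up:

```python
def parse_pinned(text):
    """Parse 'name==version' lines from a pip freeze dump into a dict."""
    pinned = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip blanks, comments, and anything not pinned with ==
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, version = line.split("==", 1)
        pinned[name.lower()] = version
    return pinned

sample = "Django==4.1.7\nrequests==2.28.2\n"
print(parse_pinned(sample))  # {'django': '4.1.7', 'requests': '2.28.2'}
```

Comparing the parsed dicts from the old and new environments tells you at a glance whether anything failed to reinstall.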
As to your concern with git, this should have absolutely no impact whatsoever on your git repo.
Hope you're doing well,
I was trying to work on this Django project from GitHub, but I could not install all the packages in a virtual environment. It says:
(venv) C:\Users\me\Downloads\movie_recommender-master>pip install -r requirements.txt
Unable to create process using 'C:\Users\me\AppData\Local\Programs\Python\Python310\python.exe "C:\Users\me\Downloads\movie_recommender-master\venv\Scripts\pip.exe" install -r requirements.txt'
I have read through tons of questions on Stack Overflow but nothing seems to work. I would be very grateful if you could help me here.
From all the answers in the comments I would say it is quite clear: you are not activating your virtual environment, or it is not set up correctly.
As I understand from your text and comments, your working directory is:
C:\Users\Joydeep Paul\Downloads\movie_recommender-master
and your virtual environment is at:
C:\Users\Joydeep Paul\Downloads\movie_recommender-master\venv
When you issue these commands:
where pip
where python
they need to point to your virtual environment (after you activate it with the activate command), but instead they point to the system-wide versions:
C:\Users\me\AppData\Local\Programs\Python\Python310\python.exe
So to answer your question, one of the following has happened:
your system-wide Python installation is bad and does not correctly set up the virtual environment in your project
you haven't activated your virtual environment (although you say that you did)
there is some major issue with your Windows installation or privileges
Maybe you can try deactivating the virtual environment, issuing a where python3 statement, then activating it again and trying the where commands once more?
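Another quick, shell-independent way to confirm whether a virtual environment is really active is to ask Python itself: inside a venv, sys.prefix points into the environment while sys.base_prefix points at the base installation. A small sketch:

```python
import sys

# Inside an activated virtual environment, sys.prefix differs from
# sys.base_prefix; outside of one, the two are equal.
in_venv = sys.prefix != sys.base_prefix
print("interpreter:", sys.executable)
print("virtual environment active:", in_venv)
```

If this prints the system-wide interpreter path (like the Python310 one above) and False, the environment was never activated in that shell.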
I'm new to python, so please be gentle.
In learning python and writing my first few scripts, I quickly glossed over any tutorial sections on virtualenv, figuring it wouldn't provide me any benefit in my nascent stage.
I proceeded to hack away, installing packages as I went with pip3 install package
Now I've built something that is potentially useful to my organization, and I'd like to share it. In this case, I want to distribute it as a windows executable.
Before building this distribution, I figure it's now time to take the next leap from individual scripts to proper python projects. It seems like virtualenv should be part of that.
Given that I've installed a number of packages into my "base" Python environment: in order to do development in a "clean" virtual environment, do I need to somehow "revert" my base environment (i.e. uninstall all non-standard packages), or will virtualenv shield a project within a virtual environment from the non-standard packages installed in my "base" environment?
If you are using the venv module, the created environment is isolated from your base environment's site-packages by default, so you don't need to uninstall anything. There is a --system-site-packages flag that does the opposite, granting the created virtual environment access to the system-wide site-packages directory:
--system-site-packages
Give the virtual environment access to the system
site-packages dir.
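You can see this choice being recorded if you create an environment programmatically with the venv module; the environment's pyvenv.cfg file stores the include-system-site-packages setting. A throwaway sketch (the directory name is arbitrary):

```python
import tempfile
import venv
from pathlib import Path

# Create a throwaway environment without system site-packages access;
# with_pip=False keeps this fast since we only want to inspect the config.
target = Path(tempfile.mkdtemp()) / "demo-env"
venv.create(target, system_site_packages=False, with_pip=False)

# pyvenv.cfg records whether the env can see system-wide site-packages.
cfg = (target / "pyvenv.cfg").read_text()
print("include-system-site-packages = false" in cfg.lower())
```

Passing system_site_packages=True instead flips that flag, which is exactly what the --system-site-packages command-line option does.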
Install virtualenvwrapper first. After that, create a new virtualenv, activate it, and run pip freeze. You should see nothing in there, because nothing is installed. Deactivate the env to go back to your "base" environment and run pip freeze again; you will see all the packages you have installed.
A best practice is to create a requirements.txt file and version control it so everyone can use the same versions of the same packages. If you don't want to do this, simply activate your new virtual env and pip install everything you want.
You can also specify the required libraries separately, check whether they are installed, and install them automatically if they are not.
Have a look at:
https://packaging.python.org/discussions/install-requires-vs-requirements/
In my company I have a setup where I have an original canopy distribution installed. Through some batch process a virtual environment is then created of that which contains additional python packages.
The virtual environment works fine from pycharm, however, I have the following problems:
When starting pip or python from the command line, the original Canopy installation seems to start. Am I right in thinking that 'activating' the virtual environment simply means adjusting the PATH variable to point at the virtual environment's folders? How is this best done automatically? Does Canopy or Python provide a good script for it? I want pip to install packages into the virtual environment, which it currently doesn't.
What is the best way to create a new virtual environment based on the virtual environment I already have?
I know that with anaconda this would all be easier, but my solution needs to be based on pure python or canopy.
Not sure about your specific environment, but for Python projects I usually get by with
pip freeze > requirements.txt
to save the list of packages installed in a virtual environment to a file
and
pip install -r requirements.txt
to restore the packages on a new virtual environment.
I've used requirements.txt as the filename here, but you can use pretty much any file name you want for this.
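If you script these steps, it's safest to invoke pip as python -m pip, which guarantees you get the pip belonging to the interpreter (and hence the virtual environment) you think you're using, rather than a stale pip shim left over from a moved or recreated env. A minimal sketch that only builds the command without running it:

```python
import sys

# "python -m pip" always resolves to the pip matching sys.executable's
# environment, avoiding a stale pip launcher on PATH.
cmd = [sys.executable, "-m", "pip", "install", "-r", "requirements.txt"]
print(" ".join(cmd))
```

Passing that list to subprocess.run() would perform the install against the currently active environment.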
I'm new to virtualenv, but I'm writing a Django app and will eventually have to deploy it somehow.
So let's assume I have my app working in my local virtualenv, where I installed all the required libraries. What I want to do now is run some kind of script that takes my virtualenv, checks what's installed inside, and produces a script that will install all these libraries into a fresh virtualenv on another machine. How can this be done? Please help.
You don't copy-paste your virtualenv. Instead, you export the list of all the installed packages:
pip freeze > requirements.txt
Then push the requirements.txt file to wherever you want to deploy the code, and do what you did on the dev machine:
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with the exact same versions.
You can also look into Fabric to automate this task, with a function like this:
from fabric.api import cd, env, prefix, run

def pip_install():
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning
Copying virtual environments is not well supported. Each virtualenv
has path information hard-coded into it, and there may be cases where
the copy code does not know it needs to update a particular file. Use
with caution.
If it is going to live at the same path, you can tar the environment and extract it on another machine. If all the same dependencies, libraries, etc. are available on the target machine, it will work.