I am on macOS and developing for Google Cloud Platform.
I have created a virtualenv: virtualenv xyz
I activated it using: source xyz/bin/activate
Then I installed the package I needed: pip install python-dateutil
When I do pip list, I see that python-dateutil is there.
But when I run my service using dev_appserver.py . and try to make a POST request, I get: ImportError: No module named dateutil.parser
Questions: In my appengine_config.py I have vendor.add('lib'), but the packages are installed under my_project -> xyz -> lib -> python2.7 -> site-packages -> dateutil. How does my app know where to look for packages?
Second question: when I am ready to deploy to production, how do I deploy the packages? Is pip freeze > requirements.txt enough for the production server to know what packages to use? Do I need a lib folder under my_project? I am confused about how packages are resolved in a virtualenv and in production.
You're mixing the instructions for installing dependencies for the standard environment with those for the flexible environment. Related: How to tell if a Google App Engine documentation page applies to the standard or the flexible environment
You're using dev_appserver.py, so I assume your app is a standard environment one, in which case you need to install the library into your app itself (note the -t lib argument), not into the system/venv Python. Assuming you execute from your app's directory:
pip install python-dateutil -t lib
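The dev server then picks these vendored packages up through appengine_config.py in the app root. A minimal sketch of that file (the 'lib' name must match the -t target above):

from google.appengine.ext import vendor

# Add any libraries installed in the "lib" folder to the import path.
vendor.add('lib')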
Related
I wanted to install a pip module on the server, but I can't install anything on the server as I do not have root access to it.
Support suggested:
You cannot (and we will not) use "pip" to install them into the system Python directories. You will need to manually build and install it into your build directories in your development server and then package it with your code.
Instructions for separating the python "build" step from the "install" step can be obtained from wherever you got the module.
Here are some instructions on the Python website that may be useful:
https://docs.python.org/2.7/install/index.html#alternate-installation-the-user-scheme
You could just create a virtual environment and install your packages there:
https://docs.python.org/3/library/venv.html
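For example, a minimal sketch using only the standard library's venv module (Python 3; the path below is a placeholder):

import venv

# Create an isolated environment under your home directory; no root access
# is needed, and with_pip=True bootstraps pip inside the new environment.
venv.create("/home/me/myenv", with_pip=True)

After activating it with source /home/me/myenv/bin/activate, pip install puts packages into that environment instead of the system Python directories.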
I am attempting to deploy a Django app to a Linux server via an Azure App Service. During the deployment via Azure DevOps Pipelines, all requirements are installed from my requirements.txt file in the root directory of my project.
I have used the Kudu console to confirm the dependencies are installed to /antenv/lib/python3.7/site-packages on the server, however, the app crashes due to an error:
ModuleNotFoundError: No module named 'django'
I am beginning to think the virtual environment may be failing to actually start, but I do not know how to check this, or how to start it if that is the case.
Has anyone had a similar issue during their deployment? If so, how did you resolve it? Any advice is much appreciated. Thank you!
Change the target path to --target="./.python_packages/lib/site-packages", as in this pipeline step:
- bash: |
    python3.8 -m venv worker_venv
    source worker_venv/bin/activate
    pip3.8 install setuptools
    pip3.8 install --target="./.python_packages/lib/site-packages" -r requirements.txt
  displayName: 'Install Application Dependencies'
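Whether the runtime picks that folder up automatically can depend on the hosting setup; as a fallback (an assumption, not something the pipeline itself guarantees), you can extend sys.path at startup so Django can import the vendored packages. A sketch, with the path matching the pip --target above:

# Sketch: place near the top of wsgi.py so vendored packages are importable.
# The relative path must match the pip --target used in the pipeline step.
import os
import sys

PACKAGES = os.path.join(
    os.path.dirname(os.path.abspath(__file__)),
    ".python_packages", "lib", "site-packages",
)
if PACKAGES not in sys.path:
    sys.path.insert(0, PACKAGES)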
You need to install a new Python runtime at the path D:\home via Kudu site extensions (Windows).
The problem is that Azure App Service uses a virtualenv by default, so the requirements.txt packages are automatically installed into the virtualenv's Python... so I just edited deploy.cmd to install the requirements into the (extension) Python.
For more details, you can refer to Peter Pan's answer in the post below.
Why is the azure app service django deploy keep failing?
I am developing a Python library and I need to make it available from a GCP Notebook.
Is it possible? How?
Details:
I use Pipenv to manage my library's dependencies. Currently my library's source code exists locally and in a private git repository, so it is not on PyPI.
My code has multiple module files in nested directories.
My library's dependencies are available on PyPI.
Using Pipenv, those dependencies are declared in the Pipfile.
This is the type of my Jupyter VM instance : https://cloud.google.com/deep-learning-vm
And this is some interesting structure I could find using SSH from Google console :
$ ls /opt/deeplearning/
bin binaries deps jupyter metadata proxy-agent-config.json restriction src workspace
I plan to install my library (using pip or something else) so that I can import its modules from the notebooks.
I need all of my library's dependencies to be installed along with the library itself.
Since the Python Package Index is public, I don't want to publish my proprietary library there.
Thank you.
What I understood from your question is: you are writing your own Python module, which depends on many third-party Python packages (installable with pip).
In this situation, I would do a pip freeze in the environment where the module already loads everything correctly:
pip freeze > requirements.txt (this creates a requirements.txt file listing all the dependency modules/libraries)
Now, in the Jupyter notebook, you can use the following command to install all those requirements first.
(Run the following in a notebook code cell.)
# Install a pip package in the current Jupyter kernel
import sys
!{sys.executable} -m pip install -r requirements.txt
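That covers the third-party dependencies. To make the library itself importable, you can pip-install it into the same kernel from a local checkout or straight from the private git repository, assuming the project has a setup.py or pyproject.toml that pip can build; a sketch, where the path and repository URL are placeholders:

# Install the library itself into the current kernel (path/URL are placeholders).
import sys
!{sys.executable} -m pip install /home/jupyter/my_library
# Or directly from the private repository, if the VM has access to it:
!{sys.executable} -m pip install git+https://example.com/me/my_library.git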
I've been trying to use MySQL with the vm: true flag set in my app.yaml,
but this error is thrown: appcfg.py: error: Error parsing src/app.yaml: The "libraries:" directive has been deprecated for Managed VMs. Please delete this section from your app.yaml, use pip (https://pip.pypa.io/) to install
your dependencies, and save them to a requirements.txt. For more information, please visit http://cloud.google.com/python.
I didn't find anything specific about this error. Where should I put this requirements.txt? Has anyone had this issue?
Thanks!
If you're working with the flexible environment (previously called Managed Virtual Machines), then you can't use the "libraries" directive in your app.yaml to activate third-party libraries. Instead, you should use pip to install your dependencies. From the official docs:
Requirements.txt and the Python package manager pip are used to
declare and install application dependencies.
The documentation is also explicit when the author writes:
Requirements.txt defines the libraries that will be installed both locally and when
deploying to App Engine.
You should put the requirements.txt file in the root directory of your project. Here you can see an example where the Flask library is imported.
In your development environment, you can run the following command to install your declared libraries:
pip install -r requirements.txt
Pip is the default way to install libraries in Python environments. Here you can find very nice documentation.
Update:
You should use the following command to deploy:
gcloud preview app deploy
Take a look here for more details.
Also, here there is an official example of your use case. I can see one small difference: the author is using PyMySQL==0.7.3 instead of MySQL-python.
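For reference, a requirements.txt for this use case can be as small as the following (the PyMySQL pin is the one used in that example; Flask stands in for whatever web framework the app uses):

Flask
PyMySQL==0.7.3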
The official documentation for Google App Engine with Python recommends using a virtualenv and installing third-party libs into a subdirectory of the project root:
$ source /path/to/my/virtualenv/bin/activate
$ cd my/project/root
$ mkdir lib
$ pip install -t lib sqlalchemy
The docs then say to make an appengine_config.py file in the project root with the following content:
from google.appengine.ext import vendor
# Add any libraries installed in the "lib" folder.
vendor.add('lib')
This all works in the sense that the dev server can find sqlalchemy at run time. However, my virtualenv itself cannot. If I do this:
$ python
>>> import sqlalchemy
I get an import error.
This makes testing things apart from the dev server awkward or impossible.
Is there some pip trick or something similar I can use to make the libs available both inside and outside the dev server?
I follow a variation of the same steps, but with:
$ ln -s {virtualenv}/lib/python2.7/site-packages lib
This way, a pip install in the virtualenv automatically goes into the lib directory as well.
Every pip install is then available both to the virtualenv's Python and to dev_appserver, without supplying the target folder, which makes testing bearable. E.g.:
$ pip install sqlalchemy
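A quick sanity check afterwards, from the activated virtualenv (package name taken from the example above):

# With the symlink in place, this should work both in the plain venv
# python and under dev_appserver.
import sqlalchemy
print(sqlalchemy.__version__)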