I installed some Python packages on my GCP instance; python_speech_features was one of them. When I run pip list, it shows that the package is installed, but when I try to import it in my code, I get "No module named python_speech_features". I have attached screenshots of the error and of the installed packages.
I tried to replicate your scenario, and I was able to use the 'python_speech_features' library with this example on a GCE instance without issues. This is the procedure I followed:
Create a GCE instance and connect to it.
Create a virtual environment with this command: virtualenv -p python3 env
Activate the virtualenv with this command: source env/bin/activate
Install the following libraries:
numpy==1.18.5
python-speech-features==0.5
scipy==1.4.1
Download the example file: 'example.wav'
Run the code: 'python example.py'
I recommend trying this procedure to ensure that the library import works correctly.
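For reference, here is a minimal example.py sketch along the lines of the python_speech_features documentation (the file name matches the step above; the feature parameters are the library defaults):
# example.py - extract MFCC and log filterbank features from example.wav
from python_speech_features import mfcc, logfbank
import scipy.io.wavfile as wav

(rate, sig) = wav.read("example.wav")  # sample rate and raw signal
mfcc_feat = mfcc(sig, rate)            # MFCC features
fbank_feat = logfbank(sig, rate)       # log Mel-filterbank energies

print(fbank_feat[1:3, :])
If this runs without an import error, the library is correctly installed in the active environment.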
I am relatively new to GCP and am trying to schedule a notebook on GCP to run every day. This notebook has dependencies on libraries and other Python modules/scripts. When I schedule it with Cloud Scheduler (as shown in the attached image), the logs show errors at the import statements for those libraries and modules.
I also created a requirements.txt file, but the scheduler doesn't seem to be reading it.
Am I doing something wrong?
Can anyone help or guide me toward some possible solutions? I've been stuck on this for a few days; any help would be highly appreciated.
PS: Cloud Functions would be my last option in case I'm not able to run it this way.
The problem is that there are two different environments:
The Notebook document itself
The Docker container that the Notebook Executor uses when you click Execute: a container is passed to the Executor backend (Notebooks API + Vertex AI Custom Job), and since you are installing the dependencies in the Notebook itself (the Managed Notebook's underlying infrastructure), they are not included in that container, hence the failure. You need to pass a container that includes Selenium.
If you need to build a custom container, I would do the following:
Create a custom container
# Dockerfile.example
FROM gcr.io/deeplearning-platform-release/tf-gpu:latest
RUN pip install selenium
Then you’ll need to build and push it somewhere accessible.
PROJECT="my-gcp-project"
docker build . -f Dockerfile.example -t "gcr.io/${PROJECT}/tf-custom:latest"
gcloud auth configure-docker
docker push "gcr.io/${PROJECT}/tf-custom:latest"
Specify the container under "Custom Container" when launching the Execution.
The error means that you are missing the selenium module; you need to install it. You can use one of the following commands to install it:
python -m pip install -U selenium (you need pip installed)
pip install selenium
or depending on your permissions:
sudo pip install selenium
For python3:
sudo pip3 install selenium
Edit 1:
If you have selenium installed, check where Python is located and where it looks for libraries/packages, including the ones installed using pip. Sometimes Python runs from one location but looks for libraries in a different one. Make sure Python is looking for the libraries in the right directory.
Here is an answer that you can use to check if Python is configured correctly.
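As a quick sanity check, you can run a short snippet like this with the same interpreter that runs your code (a minimal sketch):
# Show which interpreter is running and where it searches for packages
import sys
print(sys.executable)  # path to the Python binary currently running
print(sys.prefix)      # root of the active environment
print(sys.path)        # directories searched when importing modules
If sys.executable is not the Python you installed selenium into, that explains the ModuleNotFoundError.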
I am trying to use the Microsoft Azure Custom Vision service on a Mac from Jupyter in VS Code.
I have Python 3.8.3 installed.
I have done pip install azure.cognitiveservices.vision.customvision and confirmed it is there using pip show.
When I execute the command
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
I get the error:
ModuleNotFoundError: No module named 'azure.cognitiveservices.vision.customvision'
I have tried adding the location where the package is installed to $PATH but that does not fix the problem.
Any thoughts gratefully received! Thanks.
It is recommended that you always create and activate a Python virtual environment to work with Jupyter notebooks, such as an Anaconda environment, or any other environment in which you've installed the Jupyter package.
To select an environment, use the Python: Select Interpreter command from the Command Palette (Ctrl+Shift+P). Once the appropriate environment is activated, you can create and open a Jupyter Notebook and connect to a remote Jupyter server for running code cells. Check Working with Jupyter Notebooks in Visual Studio Code for more info.
This is true for application development in Python in general as well.
A virtual environment is a folder within a project that isolates a copy of a specific Python interpreter. Once you activate that environment (which Visual Studio Code does automatically), running pip install installs a library into that environment only.
When you then run your Python code, it runs in the environment's exact context with specific versions of every library. You can create a requirements.txt file for the libraries you need, then use pip install -r requirements.txt.
Here is a snippet from a sample requirements.txt file:
azure-mgmt-core==1.2.0
azure-mgmt-network==16.0.0
azure-mgmt-resource==10.2.0
If you don't use a virtual environment, then Python runs in its global environment that is shared by any number of projects.
Refer to the Azure SDK for Python Developer docs for more information on configuring your local Python dev environment for Azure.
Whenever you get a ModuleNotFoundError, the simple solution is to install the missing module using
pip install (module name)
For example, in your case try running the following line (the azure meta-package is deprecated, so install the specific package that provides the module):
!pip install azure-cognitiveservices-vision-customvision
The ! is to run a command in a notebook.
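Once the package is installed, the import from the question should resolve. A minimal usage sketch follows; the endpoint and key are placeholders you must replace with your own Custom Vision values:
# Verify the import and construct a prediction client (placeholder credentials)
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

credentials = ApiKeyCredentials(in_headers={"Prediction-key": "<your-prediction-key>"})
predictor = CustomVisionPredictionClient("<your-endpoint>", credentials)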
If I install a virtualenv on my local machine, activate it and try to run python3 then it works fine (with the imported modules). However, after I send it to the live server (using scp and filezilla) it gives the error:
-bash: /<path>/venv4/bin/python3: cannot execute binary file: Exec format error
This also happens with python and python3.8 in the same package.
I have tried reinstalling virtualenv and pipx, recreating the virtualenv and reuploading a few times.
It also seems that the venv's interpreter isn't being picked up: when I activate the virtualenv on the live server and type "which python3", it shows me the system python3:
/usr/bin/python3
It also does not work if I try to execute the venv's python3 directly, using the full path.
The reason I'm doing this is that the old virtualenv I was using stopped working; it can't seem to find the installed modules anymore, and I'm not sure why.
Any help would be much appreciated.
I believe some pip packages contain more than just Python code and must be compiled. If your host OS or architecture differs from your server's, or you have different libraries installed, the host-compiled code will not be compatible with your server. (A virtualenv also contains a copy of the Python interpreter itself, a compiled binary, which is why you see the Exec format error.)
Common practice is to create a file with a list of required packages, using something like
pip freeze > requirements.txt
and rebuild the environment on the server, using something like
pip install -r requirements.txt
I am developing a Python library and I need to make it available from GCP Notebook.
Is it possible? How?
Details:
I use Pipenv to manage my library's dependencies. Currently my library's source code exists locally and in a private git repository, so it is not on PyPI.
My code has multiple module files in nested directories.
My library's dependencies exist in PyPI.
Using Pipenv, the dependencies are described in the Pipfile.
This is the type of my Jupyter VM instance: https://cloud.google.com/deep-learning-vm
And this is some interesting structure I found using SSH from the Google console:
$ ls /opt/deeplearning/
bin binaries deps jupyter metadata proxy-agent-config.json restriction src workspace
I plan to install my library (using pip or something else) so that I can import its modules from the notebooks.
I need all of my library's dependencies to be installed when the library is installed.
Since the Python Package Index is public, I don't want to publish my proprietary library there.
Thank you.
What I understood from your question is: you are writing your own Python module, which depends on many third-party Python packages (installable with pip).
In this situation, I would probably do a pip freeze in the environment where the module already loads everything perfectly.
pip freeze > requirements.txt (It will create a requirements.txt file with all the dependency modules/libraries)
Now, once in the jupyter notebook, you can use the following command to first install all the requirements.
(Run the following in the notebook code cell)
# Install a pip package in the current Jupyter kernel
import sys
!{sys.executable} -m pip install -r requirements.txt
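Since your library itself is not on PyPI, you can then point pip at its git repository in the same way (the URL below is a hypothetical placeholder, and the notebook needs access credentials for the repository):
# Install the private library straight from its git repository
!{sys.executable} -m pip install git+https://gitlab.com/your-org/your-library.git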
I'm used to using pip to install Python packages into my Django projects' virtual environments.
When I am working with a Divio Docker project locally, this does not work.
There are two things you need to be aware of when installing Python packages into a Docker project:
the package must be installed in the correct environment
if you want to use the installed package in the future, it needs to be installed in a more permanent way
The details below describe using a Divio project, but the principle will be similar for other Docker installations.
Installation in the correct environment
To use pip on the command line to install a Python package into a Dockerised project, you need to be using pip inside the Docker environment, that is, inside the container.
It's not enough to be in the directory where you have access to the project's files. In this respect, it's similar to using a virtual environment: you need to have the virtualenv activated. (Otherwise, your package will be installed not in the virtual environment, but in your own host environment.)
To activate a virtual environment, you'd run something like source bin/activate inside it.
To install a package within a Divio web container:
# start a bash prompt inside the project
docker-compose run --rm web bash
# install the package in the usual way
pip install rsa
rsa is now installed and available to use.
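As a quick smoke test inside that container's Python (a minimal sketch using rsa's documented API):
# Confirm the package imports and round-trips a message
import rsa
(pubkey, privkey) = rsa.newkeys(512)            # generate a small test keypair
token = rsa.encrypt(b"hello", pubkey)           # encrypt with the public key
assert rsa.decrypt(token, privkey) == b"hello"  # decrypt with the private key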
More permanent installation
So far, however, the package is only installed and available in that particular container. As soon as you exit the bash shell, the container will disappear. The next time you launch a web container, you will not find the rsa package there. That's because the container is launched each time from its image.
In order to have the package remain installed, you will need to include it in the image.
A Divio project includes a requirements.in file, listing Python packages that will be included in the image.
Add a new line containing rsa to the end of that file. Then run:
docker-compose build web
This will rebuild the Docker image. Next time you launch a container with (for example) docker-compose run --rm web bash, it will include that Python package.
(The Divio Developer Handbook has some additional guidance on using pip.)
Note: I am a member of the Divio team. This question is one that we see quite regularly via our support channels.