I have been forced to develop Python scripts on Windows 10, which I have never done before.
I have installed Python 3.9 using the Windows installer package into the C:\Program Files\Python directory.
This directory is write-protected for regular users and I don't want to elevate to admin, so when using pip globally I use the --user switch; pip then installs modules to C:\Users\<user>\AppData\Roaming\Python\Python39\site-packages and scripts to C:\Users\<user>\AppData\Roaming\Python\Python39\Scripts.
I don't know how pip picks this odd path, but at least it is working. I have added this path to the %Path% variable for my user.
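For reference, a quick way to see where --user installs actually go is to ask Python itself (standard site module commands; nothing assumed beyond a normal per-user setup):
python -m site --user-base
python -m site --user-site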
Problems start when I try to use a virtual environment and upgrade pip:
I have created a new project on the local machine in C:\Users\<user>\Projects\<project> and opened that path in a terminal.
python -m venv venv
venv\Scripts\activate
pip install --upgrade pip
But then I get this error:
ERROR: Could not install packages due to an EnvironmentError: [WinError 5] Access denied: 'C:\Users\\AppData\Local\Temp\pip-uninstall-7jcd65xy\pip.exe'
Consider using the --user option or check the permissions.
So when I try to use the --user flag, I get:
ERROR: Can not perform a '--user' install. User site-packages are not visible in this virtualenv.
So my questions are:
why is it not trying to install everything inside the virtual environment (venv\Scripts\pip.exe)?
how can I get "access denied" when this folder is supposed to be owned by my user?
When using the deprecated easy_install --upgrade pip, everything works fine.
I recently had the same issue with some other modules. My solution was simply to downgrade from Python 3.9 to 3.7, or to make a virtual environment for 3.7, use that, and see how it works.
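A minimal sketch of that workaround, assuming Python 3.7 is also installed and reachable through the py launcher (venv37 is just an example name); note that upgrading pip via python -m also avoids pip trying to replace its own locked pip.exe, which is what produces the WinError 5 above:
py -3.7 -m venv venv37
venv37\Scripts\activate
python -m pip install --upgrade pip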
Related
The only line in my code that references blob storage is a simple import statement, but when I uncomment it the entire function crashes. I can't seem to figure out how to get azure-storage-blob to be accessible in the venv where the function is running.
I've already installed the requisite packages in terminal with Rosetta and am running VS Code with Rosetta as well. The Azure Functions core tools work, but the blob storage tools don't.
I've tried re-installing the package directly in the venv, incorporating the answer from this post to address the "normal site-packages is not writeable" error, but as shown I still get the same result:
(.venv) jonahrotholz@Jonahs-MacBook-Pro Azure w: Rosetta % python3 -m pip install azure-storage-blob
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: azure-storage-blob in /Users/jonahrotholz/Library/Python/3.9/lib/python/site-packages (12.14.1)
Any help would be greatly appreciated!
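One diagnostic worth running first is to confirm which interpreter and pip the activated .venv actually resolves to; a quick check for macOS/zsh as in the prompt above (output paths will vary):
which python3
python3 -m pip --version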
I too faced this issue while working with Python Azure Functions on Windows.
There are many reasons for this issue to occur with respect to user installation of Python packages; here are a few of them with their resolutions.
In my previous workarounds, I always used to say to install Python packages after activating the virtual environment, but that does not seem to be working in your case.
From the reference SO #65808972, and the articles by the user #MingJie-MSFT and the author #BorislavHadzhiev, try the steps below to fix your issue; run any of the following commands after activating the virtual environment:
python -m pip install <Your_required_python_Package_Name> - for Python version 2
Replace python with python3, or with a specific version such as python3.x.
Append the --user parameter to the package installation command.
If none of the above works, elevate permissions by replacing the installer commands like python/python3.x with sudo pip, sudo pip3, sudo python, or sudo python3, or by running the Visual Studio Code IDE as Administrator.
Check whether the Python interpreter is installed for all users or for the current user with the required permissions, and check that the Python install path has been added to the system's environment variables (a concrete sketch follows below).
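As a concrete sketch of the first option, assuming the function app's virtual environment is named .venv (adjust the path if yours differs), calling the venv's own interpreter guarantees the package lands inside it rather than in the user site-packages:
.venv/bin/python -m pip install azure-storage-blob (macOS/Linux)
.venv\Scripts\python -m pip install azure-storage-blob (Windows)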
I received the following error message when I installed some Python packages on a Debian instance:
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
There are packages installed under /usr/local/lib/python3.7, /usr/local/lib/python2.7, and /home/oliver/.local/lib/python3.7.
The packages under /usr/local/lib are owned by root, and the packages under /home/oliver/.local/lib are owned by oliver.
Some version information:
Debian: 10
python3 -V : 3.7.3
python -V : 2.7.16
pip3 -V (and pip -V): 22.0.3
The primary question I have at this point is: where should the Python 3 packages ideally reside (and with what permissions) so that they are accessible to all users?
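A quick way to see which locations a given interpreter actually searches is to ask Python itself (purely diagnostic; the output lists sys.path plus USER_BASE and USER_SITE and will differ per machine):
python3 -m site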
The packages are correctly installed. The reason it asks for root permission in order to use pip is that there are two main ways to install/uninstall Python packages: the first is through the package manager of the Linux distro, and the other is through pip.
When we want to code or experiment with Python and need some new Python package, it's recommended to use Virtualenv, which creates an isolated Python installation that doesn't interfere with your Linux installation. You can do whatever you want with it, and if something goes wrong you just delete it and there is no problem.
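A minimal sketch of that, using the standard-library venv module (the directory name and the package are just examples; on Debian you may first need the python3-venv system package):
python3 -m venv ~/venvs/demo
source ~/venvs/demo/bin/activate
pip install requests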
As you can see, this is just a WARNING from pip. If you insist, you can log in as root and use pip to install without a virtual environment.
I need to use the autogui module to do something in Python. When I run the script, it says that it can't find the autogui module so I installed it with
pip install autogui.
But when I run the script again, it still tells me the module doesn't exist.
Method 1:
You're probably having trouble selecting the correct Python interpreter and working within it; try the following in VSCode.
Ctrl + Shift + p
And enter the following in the field.
python: select interpreter
Select the desired environment and reinstall PyAutoGUI, as sketched below.
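Once the right interpreter is selected, reinstalling from VSCode's integrated terminal with that interpreter is one way to do it (a sketch; the launcher may be python or py on your system):
python -m pip install pyautogui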
Method 2:
Create a virtual environment for your project where all your packages will be installed. It will be isolated from other environments, and you will have no import errors, since it's an environment created specifically for the project you're working on.
I assume you use Windows, so open the command line in your working directory, or open your working directory in VSCode and enter the following in the command-line tool that is provided within VSCode.
The Python installers for Windows include pip. You should be able to access pip using:
py -m pip --version
You can make sure that pip is up-to-date by running the following
py -m pip install --upgrade pip
Installing virtual environment
py -m pip install --user virtualenv
Creating a virtual environment
py -m venv env
The second argument is the location to create the virtual environment. Generally, you can just create this in your project and call it env.
venv will create a virtual Python installation in the env folder.
Finally, to activate the environment, run the following command:
.\env\Scripts\activate
That will activate your environment. Then install the package:
pip install pyautogui
Make sure to change your interpreter to the one that you just created in the env\Scripts folder and run your code, or you could just enter the path to the python executable located in the env\Scripts folder.
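To confirm the environment actually has the package, a quick import check against the env's own interpreter can help (a sketch, assuming the env folder from the steps above):
.\env\Scripts\python -c "import pyautogui; print(pyautogui.__file__)"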
Try pyautogui - I had the same problem. Instead of autogui, write pyautogui. Or, if you are running Python 3 or higher, try:
pip3 install pyautogui.
How do you use a Python package such as TensorFlow or Keras if you cannot install the package on the drive where pip always saves packages?
I'm a student at a university and we don't have permission to write to the C drive, which is where pip works out of (I get a "you don't have write permission" error when installing packages through pip or conda).
I do have memory space available on my user drive, which is separate from the C drive (where the OS is installed).
So, is there any way I can use these Python libraries without installing them on the C drive?
Maybe I can install the package on my user drive and ask the interpreter to access it from there? I'm just guessing here; I have no knowledge of how this works.
install conda
create a new environment (conda create --name foobar python=3.x <list of packages>)
use anaconda to activate foobar (activate foobar)
check the pip location by typing where pip in cmd, to be sure you use the pip from the Python within the foobar environment and not the default Python installed on your system outside of your conda environment,
and then use the pip from that location to install the requested library into your environment, as sketched below.
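Putting those steps together, a sketch might look like this (foobar, the Python version, and the packages are only examples; older conda installs use activate foobar instead of conda activate foobar):
conda create --name foobar python=3.9 numpy
conda activate foobar
where pip
pip install tensorflow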
P.S. You may want to consider installing Cygwin on your Windows machine to get used to working with a Linux-like environment.
I've installed python-virtualenv and python-virtualenvwrapper, and created a virtual environment by using mkvirtualenv NAME, and then activated it through workon NAME. By looking in ~/.virtualenvs/NAME/bin I see that pip is installed there.
However, when I try to install anything through pip, I'm told pip-python: command not found.
I have not installed pip system-wide, and was under the impression that I did not need to, given that it was already installed inside the virtual environment. All this leads me to believe that something is not being set correctly with my $PATH, but what could that be? Once I'm inside the virtual environment, as in (NAME)[user@host]$, shouldn't my path already be modified to use the pip installation inside that environment? What do I need to do to make this so?
You must install pip on your system to make it accessible in the virtualenv.
pip-python is the name of the executable in some Linux distributions. It is on my Fedora machine.
When pip is installed in a virtualenv, the name of the executable is simply pip, not pip-python. So you need to execute it with ~/.virtualenvs/NAME/bin/pip, not ~/.virtualenvs/NAME/bin/pip-python.
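A quick way to confirm this from inside the activated environment (assuming workon NAME has already been run):
which pip          # should print ~/.virtualenvs/NAME/bin/pip
pip install <package-name>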