I'm using Google Drive to keep a copy of my code projects in case my computer dies (I'm also using GitHub, though not for some private projects).
However, when I try to create a virtual environment using virtualenv, I get the following error:
PS C:\users\fchatter\google drive> virtualenv env
New python executable in C:\users\fchatter\google drive\env\Scripts\python.exe
ERROR: The executable "C:\users\fchatter\google drive\env\Scripts\python.exe" could not be run: [Error 5] Access is denied
Things I've tried:
- I thought it was because the path to the venv included blank spaces, but the command works in other paths with blank spaces.
- Installing the win32api library, as recommended in the virtualenv docs, but it didn't work.
- Running PowerShell as an administrator.
Any ideas on how to solve this? My workaround at the moment is to create the venv outside of the Google Drive, which works but is inconvenient.
After running into the same issue and playing with it for several hours, I don't think it's possible. It has nothing to do with spaces in file/folder names; I've tested that. It seems that Google Drive Stream performs some action on the file/folder after a period of time that makes Python lose its path to the files. For instance, you can clone a Python module into a Google Drive Stream folder, do a "pip install -e ./", and it will work in the virtualenv for a few minutes, e.g. you can import it into a Python shell. But after a few minutes you can no longer import the module. I suspect that Google Drive Stream is simply not fully compatible with all filesystem system calls, one of which Python relies on.
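As a rough illustration of the failure mode (the package name and URL are hypothetical; commands run from inside a Google Drive Stream folder):

git clone https://github.com/example/somepkg.git  # hypothetical repo
cd somepkg
pip install -e ./
python -c "import somepkg"  # works right after installing
# ...a few minutes later, after Drive Stream has synced:
python -c "import somepkg"  # fails: the module can no longer be found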
Don't set up a virtual environment in a cloud-synced folder, and don't run a Python script from one. It's a bad idea: these folders are not meant for version control. Write access (modifying files) is limited because Google Drive periodically syncs the folder, which almost always prevents exclusive write access.
TL;DR: you can't reliably modify files while they are being synced.
I suggest you stick to git for version control.
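If the project folder itself has to live in a synced location, at least keep the virtualenv out of version control. A minimal .gitignore sketch (directory names assumed; adjust to your layout):

.venv/
env/
__pycache__/
*.pyc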
Related
This is the message that PyCharm gives me almost once a day, and I have to restart it. As I have multiple projects open, it gives this error for each virtualenv repeatedly until I force-quit it.
Is there a way to prevent PyCharm from constantly invalidating its caches?
PS: I never had such issues with PyCharm on Windows.
Invalid Python SDK
Cannot set up a python SDK at Python 3.9 (demographics-g5XoraTQ) (/Users/mamad/Library/Caches/pypoetry/virtualenvs/up-demographics-g5XoraTQ-py3.9/bin/python). The SDK seems invalid.
It turns out the issue was the number of git repositories, each with a separate project SDK (i.e. venv interpreter), that I had simultaneously open in my PyCharm instance (over 10).
The re-indexing of git caches and Python libraries created memory issues and eventually resulted in corruption of index files; this couldn't be solved unless I restarted my PyCharm instance once a day.
The solution was to:
Either reuse one virtual environment for all projects, which is not desirable at all.
Or, as soon as I am done with a project, remove the project along with its Python interpreter and git repository from PyCharm. To speed things up, I set Poetry to create its .venv inside the project folder (see the command below), so the interpreter and the project can both be removed in a single step.
One undesirable outcome of the second solution is that my shell now displays one venv name for all my virtual environments in all project folders.
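For reference, the Poetry setting that places the .venv inside the project folder is:

poetry config virtualenvs.in-project true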
I've put myself in a big ol' pickle. This morning, I unsynced some heavy folders from Box to save on power consumption and memory. What I didn't realize is that, for some reason, years ago I installed Python within this synced drive that carries all my research, so a lot of dependencies broke after I unsynced a folder called .local. Since then I've re-synced the folder just to have it working for the day, but it has 20,000 files in it and it's taking an eternity to patch things up. Little by little, I'm able to load more libraries, but it's just a band-aid; ultimately what I want to do is move all my Python stuff from that synced drive to a local directory.
I'm using WSL. I've tried "uninstalling" and reinstalling python3 using the Windows installer, but whenever I try to run a .py program from the terminal, I get the same errors as before, such as:
cannot import name '_np_version_under1p14' from 'pandas.compat.numpy'
and a little while later (after some files synced)
AttributeError: module 'pandas' has no attribute 'read_csv'
So it appears Python is still referencing the directory within the synced drive, which is currently missing files because of the sync problem. How can I tell the terminal to use the packages installed elsewhere? I see a directory /mnt/usr/local/lib/python3.6/ that contains the site-packages I think I need. How can I make the python3 command look in this other directory by default?
It seems like you're using the global Python interpreter to run your scripts, which is usually not the best thing to do.
You will run into dependency issues or things breaking, like you described.
A much better way is to create a virtual environment, install all your dependencies (like pandas) into it, and run your scripts from there.
Create new virtual environment:
python3 -m venv .venv
Activate it:
source .venv/bin/activate
Install dependencies:
pip install pandas
And then run your scripts.
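As a quick sanity check that scripts now resolve packages from the venv rather than the broken synced install, something like this hypothetical check_env.py helps:

# check_env.py -- hypothetical sanity-check script
import sys
import pandas as pd

print(sys.executable)           # should point inside .venv/
print(pd.__version__)           # pandas now imports cleanly
print(hasattr(pd, "read_csv"))  # True; the attribute the broken install lacked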
If you have a lot of dependencies, you can use a tool like Poetry to manage them. You also get virtual environment helpers and dependency locking for free.
https://python-poetry.org/
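A minimal sketch of that workflow (the script name is a placeholder):

poetry init                       # create pyproject.toml interactively
poetry add pandas                 # add a dependency and write the lock file
poetry run python your_script.py  # run inside the managed virtualenv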
I use two Macs, one at home and the other at the office. I installed PyCharm, hoping that I could work on the same project from both Macs. So I put the project folder in Dropbox, and everything syncs immediately.
Note that I created a virtual environment in the project folder. In this folder, I can see all the site-packages and the Python executable. But when I try to load the project on the second Mac, an alert appears saying that the interpreter is invalid:
Do you know why?
Python environments aren't portable.
You should either use a docker image, or simply keep a requirements.txt file synchronized and use the local Python environment on both machines.
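Sketched out, the requirements.txt approach looks roughly like this:

# on the machine whose environment currently works:
pip freeze > requirements.txt

# on the other machine, after the file has synced:
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt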
My goal is to run a Python script that uses Anaconda libraries (such as pandas) on an Azure WebJob, but I can't figure out how to load the libraries.
I started just by testing a simple Azure blob-to-blob file copy, which works when run locally but hits "ImportError: No module named 'azure'" when run in the WebJob.
example code:
from azure.storage.blob import BlockBlobService

# Placeholders below (<name>, <key>, <containername>) must be filled in.
blobAccountName = <name>
blobStorageKey = <key>
containerName = <containername>

# Connect to the storage account.
blobService = BlockBlobService(account_name=blobAccountName,
                               account_key=blobStorageKey)
blobService.set_container_acl(containerName)

# Download a blob and re-upload its bytes under the same name (blob-to-blob copy).
b = blobService.get_blob_to_bytes(containerName, 'file.csv')
blobService.create_blob_from_bytes(containerName, 'file.csv', b.content)
I can't even get the Azure SDK libraries to run, let alone Anaconda's.
How do I run a Python script that requires external libraries such as Anaconda's (and even the Azure SDK)? How do I "pip install" this stuff for a WebJob?
It sounds like you already know how to deploy Azure WebJobs, so the steps below show how to load external libraries in Python scripts.
Step 1: Use the virtualenv package to create an independent Python runtime environment on your system. Install it first with pip install virtualenv if you don't have it.
If it installed successfully, you will see it in your python/Scripts folder.
Step 2: Run the command to create the independent Python runtime environment.
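For example, with a hypothetical environment name:

virtualenv my_env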
Step 3: Go into the created directory's Scripts folder and activate it (this step is important, don't miss it).
Don't close this command window afterwards; use pip install <your libraryname> in this same window to download the external libraries.
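On Windows that looks roughly like this, assuming the my_env name from Step 2:

cd my_env\Scripts
activate
pip install pandas  # repeat for each library you need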
Step 4: Compress Sample.py into a zip together with the library packages you rely on from the environment's Lib/site-packages folder.
Step 5: Create a WebJob in the Web App service and upload the zip file; then you can run your WebJob and check the log.
You can also refer to this SO thread: Options for running Python scripts in Azure.
In addition, if you want to use modules from Anaconda, install them individually; there is no need to bundle the entire distribution.
Hope this helps.
You can point your Azure WebJob to your main WebApp environment (and thus its real site-packages). This lets you use the newest, fastest version of Python supported by the WebApp (right now mine is 3.6.4 x64), much better than 3.4 or 2.7 in x86. Another huge benefit is that you don't have to maintain an additional set of packages statically held in a file somewhere (this gave me a lot of trouble with dynamic libraries with heavy dependencies such as psycopg2 and pandas).
HOW: In your WebJobs files, set up a .cmd file that runs your run.py, and in that .cmd file, you can just have one line of code like this:
D:\home\python364x64\python.exe run.py
That's it!
Azure WebJobs looks at .cmd files first, then run.py and others.
See this link for an official MS post on this method:
https://blogs.msdn.microsoft.com/azureossds/2016/12/09/running-python-webjob-on-azure-app-services-using-non-default-python-version/
I have my project stored on OneDrive. It sometimes works on my PC and sometimes on my laptop, both of which run Windows 10. On both machines the project is in the same directory: C:/OneDrive/code/etc...
When I use virtualenv and install different packages on one machine, it works fine there, but when I switch to my laptop nothing works at all (and the same applies the other way around). I get the following error:
Could not import runpy module ImportError:
No module named 'runpy'
What can I do to fix this problem on my laptop and PC? Anyone experiencing a similar issue?
Don't do this. OneDrive, like similar systems such as Dropbox, is meant for sharing documents. It is not meant for code, and even less for installed libraries.
Store your code in a version control system like git, and push it up regularly to a host like Github. Then on each of your computers, clone the repo and install the dependencies locally inside a virtualenv.
I had a similar issue with a virtualenv synced with OneDrive ('pip' was no longer recognized as a command, for example).
I solved it by creating, inside my OneDrive directory, a symbolic link to a virtualenv I created outside of it. This way, your drive provider cannot modify/optimize/etc. your local files, but they will still be synced.
You can create the symlink with Windows cmd (note that mklink typically requires an elevated prompt, or Developer Mode enabled on Windows 10):
mklink /D "C:\...\OneDrive\...\target_dir\venv" "C:\...\source_dir\venv"