I've put myself in a big ol' pickle. This morning, I unsynced some heavy folders from Box in order to save on power consumption and memory. What I didn't realize is that for some reason years ago I installed python within this synced drive that carries all my research. So a lot of dependencies broke after I unsynced a folder called .local. Since then I've re-synced the folder just to have it working for the day but it has 20,000 files in it and it's taking an eternity to patch things up. Little by little, I'm able to load more libraries, but it's just a bandaid and ultimately what I want to do is move all my python stuff from that synced drive to a local directory.
I'm using WSL. I've tried "uninstalling" and reinstalling python3 using the Windows installer, but whenever I try to run a .py program from the terminal, I get the same errors as before, such as:
cannot import name '_np_version_under1p14' from 'pandas.compat.numpy'
and a little while later (after some files synced)
AttributeError: module 'pandas' has no attribute 'read_csv'
So it appears python3 is still referencing the directory within the synced drive, which is currently missing files because of the sync problem. How can I tell the terminal to use the packages installed elsewhere? I see a directory in /mnt/usr/local/lib/python3.6/ that contains the site-packages I think I need. How can I make the python3 command look for packages in this other directory by default?
It seems like you're using the global Python interpreter to run your scripts, which is usually not the best thing to do.
You will run into dependency issues or things breaking, as you've described.
A much better way is to create a virtual environment, install all your dependencies (like pandas) into it, and run your scripts from there.
Create new virtual environment:
python -m venv .venv
Activate it:
source .venv/bin/activate
Install dependencies:
pip install pandas
And then run your scripts.
If you have a lot of dependencies, you can use a tool like Poetry to manage them. You also get virtual environment helpers and lock files for free.
https://python-poetry.org/
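Once the virtual environment is active, a quick sanity check (assuming pandas is one of the packages you reinstalled into it) is:
python3 -c "import sys, pandas; print(sys.executable); print(pandas.__file__)"
Both paths should point into the new .venv rather than the old .local directory on the synced drive.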
I made a program in Python which I now want to turn into an .exe so that other people at my office, who do not have Python or any Python skills, can use it too. This is not the first time I have done this, and I am still working on the same machine; however, this time I run into a "module not found" error when trying to execute the exe.
Basically, I created a GUI with PySimpleGUI and then followed my own guide from last time, where I created a spec file with pyi-makespec, specifying the paths in which the packages are located. These are two locations: in C:\ where Python is installed, and in the "venv" folder of my PyCharm project. PySimpleGUI is located in "venv" but not in C:\.
After creating the spec file I create the exe with pyinstaller. It was suggested to use --hidden-import=PySimpleGUI as an additional flag, which I did, but I still got the same error. I also made sure Python is added to my PATH, but maybe I did something wrong there? The path I used is the one where Python is installed: C:\Users\Username\AppData\Local\Programs\Python\Python39\Lib\site-packages. Is that right?
I am completely stuck and can't find any helpful information. How do I solve this issue? Also important: I do not have admin rights on my computer, so installing anything always means having to call the support desk...
If more information about this project is needed, let me know.
Modules not being found usually comes down to virtual environment issues. Python uses venvs as self-contained folders where it stores a Python interpreter and the packages installed with pip. If you get an error saying some module is not found, you should either install it in your current environment or switch to the one where it is already installed. For more details, I recommend taking a look at a tutorial on how to set up and use them.
Additionally, you do not need to compile Python code at all if you can rely on an interpreter being installed. If you know your target audience has the interpreter installed (one usually ships with most Linux distributions), then you can just give them your source code, or even the compiled bytecode (.pyc) files, which the interpreter can run as well.
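For example, the standard library's compileall module can pre-compile a whole folder (the folder name here is just a placeholder):
python -m compileall yourproject/
This writes the .pyc files into __pycache__ directories alongside your sources.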
I tried three different things at the same time; one of them seems to have solved the problem:
1.) update PySimpleGUI with the following command in the PyCharm terminal
pip install --upgrade PySimpleGUI
2.) update pip with the following command in the PyCharm terminal
C:\Users\Username\AppData\Local\Programs\Python\Python39\python.exe -m pip install --upgrade pip
3.) add python39 to PATH
I checked again and I did have \Python39\Scripts in my PATH, but not \Python39 itself, so I added it. The way to do it on Windows is to search for "environment variables" ("Umgebungsvariablen" in German), edit the "Path" variable (double-click it, or select it and press Edit, which opens a new window), then add the new path (mine is C:\Users\Username\AppData\Local\Programs\Python\Python39).
I sorted them so that Python39 is above Python39\Scripts.
This solved my problem with PySimpleGUI: my program started as expected, but then ran into another missing-module error (xlsxwriter not found), which, fair enough, I had not installed. (Interestingly, when running the code in PyCharm I didn't need xlsxwriter.) I installed it, added it to the venv, and when starting the program again the error showed up even before the GUI appeared. I then upgraded xlsxwriter with
pip install --upgrade xlsxwriter
I again created the .spec file and ran pyinstaller to create the exe. This time it worked.
Now the only issues I have left are not connected to missing modules (just a variable referenced before assignment).
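For reference, the pyi-makespec / pyinstaller invocation described above looks roughly like this (the script name and venv path are placeholders for your own project):
pyi-makespec --paths "C:\Users\Username\PycharmProjects\YourProject\venv\Lib\site-packages" --hidden-import=PySimpleGUI your_script.py
pyinstaller your_script.spec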
I have a Python app running on Windows that has imports for the following packages:
requests
json
psycopg2
I copied the entire project (I used PyCharm to write the app and import the packages) to a new machine and expected it to work. The new machine is also Windows, and I'm trying to run my script from the command line (i.e. no PyCharm on the new machine).
Instead, I get an error saying "ModuleNotFoundError: No module named 'requests'"
If I look at the project, I have the directories:
venv
Lib
site-packages
requests
What am I missing/doing wrong?
You have a couple of options here, but first the problem: you are exporting your code base to a new machine without the required modules installed on that machine and/or within your Python project's environment. After you have Python installed on your new machine, you need to be sure to point your PyCharm project to the proper environment.
File > Default Preferences > Project Interpreter
The window that appears on the right will contain a drop down menu labeled Project Interpreter. If you click on the drop down, it should reveal a list of the available Python environments on your machine.
Based on your description of your site-packages directory, I would assume you do not have your interpreter pointed at the proper environment on your new machine. That said, you would be better served by creating a new virtual Python environment on the new machine and installing each relevant dependency within that environment.
Take a look at this post for your best first option on re-creating your old Python environment on your new machine.
EDIT: I apologize for not reading the question more thoroughly before answering. Since this is running on a Windows machine, you will need to double-check the environment path Python is using. It is very easy to install Python at a different PATH than the one the command-line environment is checking on a Windows box. If, for example, your PATH points to a different version of Python and pip is installing packages somewhere else, this issue can occur. Double-check your system PATH for Python and which version the command line is running.
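For example, on Windows you can check which interpreter and pip the command line actually resolves to with:
where python
python -m pip --version
(use where.exe python in PowerShell). If the interpreter shown is not the one your project's venv uses, packages will appear to be missing.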
On the new machine you must source venv/bin/activate so your path environment variables are set properly. In particular, which python should say venv/bin/python rather than /usr/bin/python. Also, take care to run conda env update or pip install -r requirements.txt so you'll have suitable libraries in the venv on the new machine.
I'm using Google Drive to keep a copy of my code projects in case my computer dies (I'm also using GitHub, but not on some private projects).
However, when I try to create a virtual environment using virtualenv, I get the following error:
PS C:\users\fchatter\google drive> virtualenv env
New python executable in C:\users\fchatter\google drive\env\Scripts\python.exe
ERROR: The executable "C:\users\fchatter\google drive\env\Scripts\python.exe" could not be run: [Error 5] Access is denied
Things I've tried:
I thought it was because the path to the venv included blank spaces, but the command works in other paths with blank spaces. I also tried installing the win32api library, as recommended in the virtualenv docs, but it didn't work.
running the PowerShell as an administrator.
Any ideas on how to solve this? My workaround at the moment is to create the venv outside of the Google Drive, which works but is inconvenient.
After running into the same issue and playing with it for several hours, it doesn't seem possible. It has nothing to do with spaces in file/folder names; I've tested that. It seems that Google Drive Stream performs some action on the file/folder after a period of time that makes Python lose its path to the files. For instance, you can clone a Python module into a Google Drive Stream folder, do a pip install -e ./, and it will work in the virtualenv for a few minutes, for instance importing it into a Python shell. But after a few minutes you can no longer import the module. I suspect that Google Drive Stream is simply not fully compatible with all filesystem system calls, one of which is being used by Python.
Don't set up a virtual environment in a cloud-synced folder, and don't run a Python script from such a folder either. It's a bad idea: those folders are not meant for version control, and write access (modifying files) to them is limited because, in your case, Google Drive will periodically sync the folder, which almost always prevents exclusive write access.
TL;DR: you can't reliably modify files while they are being synced.
I suggest you stick to git for version control.
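If you still want the project itself under Google Drive but the environment outside it, one possible layout (the venv location, project folder and requirements.txt here are only example names) is:
virtualenv C:\venvs\myproject
C:\venvs\myproject\Scripts\Activate.ps1
pip install -r "C:\users\fchatter\google drive\myproject\requirements.txt"
The synced folder then only holds your code and a requirements.txt, and the environment can be recreated anywhere.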
I have a virtualenv located at /home/user/virtualenvs/Environment. Now I need this environment on another PC, so I installed virtualenv-clone and used it to clone /Environment. Then I copied it to the other PC via USB. I can activate it with source activate, but when I try to start the Python interpreter with sudo ./Environment/bin/python I get
./bin/python: 1: ./bin/python: Syntax Error: "(" unexpected
Executing it without sudo gives me an error telling me that there is an error in the binary format.
But how can this be? I just copied it. Or is there a better way to do this? I can't just use pip freeze because there are some packages in /Environment/lib/python2.7/site-packages/ which I wrote myself, and I need to copy them too. As I understand it, pip freeze just creates a list of packages which pip then downloads and installs.
Do the following steps on the source machine:
workon [environment_name]
pip freeze > requirements.txt
copy requirements.txt to other PC
On the other PC:
create a virtual environment using mkvirtualenv [environment_name]
workon [environment_name]
pip install -r requirements.txt
You should be done.
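One caveat for the packages you wrote yourself: pip install -r can only fetch packages that are published on an index such as PyPI, so it won't recreate your own modules. Assuming each of them has a setup.py, a simple option is to copy their source to the other PC and install them into the new environment directly:
pip install /path/to/your_package
or, if you still want to edit them in place,
pip install -e /path/to/your_package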
Other Resources:
How to Copy/Clone a Virtual Environment from Server to Local Machine
Pip Freeze Not Applicable For You?
Scenario: you have libraries installed on your current system that are very hard to migrate using pip freeze, and I am talking really hard, because you have to download and install the wheels manually (such as gdal, fiona, rasterio), and even then the project may still crash because they were installed in the wrong order or the dependencies were wrong, and so on.
This is the experience I had when I was brought on board a project.
For such a case, when you finally get the environment right you basically don't want to go through the same hell again when you move your project to a new machine. Which I did, multiple times. Until finally I found a solution.
Now, disclaimer before I move on:
I don't advocate for this method as the best, but it was the best for my case at the time.
I also cannot guarantee it will work when switching between different OSes, as I have only tried it between Windows machines. In fact, I don't expect it to work when you move from Windows to other OSes, as the structure of the virtualenv folder on a Unix-based OS is different from that on Windows.
Finally, the best way to do all of this is to use Docker. My plan is to eventually do so. I have just never used Docker for a non-web-app project before and I needed a quick fix as my computer broke down and the project could not be delayed. I will update this thread when I can once I apply Docker to the project.
THE HACK
So this is what I did:
Install the same base Python on your new machine. If you have 3.9 on the old one, install 3.9 on the new one, and so on. Keep note of where the executable is located, usually something like C:\Users\User\AppData\Local\Programs\Python\PythonXX
Compress your virtual env folder, copy it into the project directory on your new machine, and extract all the files there.
Using a text editor of your choice, or preferably an IDE, use the 'Search in all files' feature to look for all occurrences of references to your old machine's paths: C:\Users\your-old-username
Replace these with your new references. In my case I had to do it in the following files inside the virtual env folder: pyvenv.cfg, Scripts/activate, Scripts/activate.bat, Scripts/activate.fish and Scripts/activate.nu.
And that's it!
Good luck everyone.
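If you have to do this more than once, the search-and-replace described above can be scripted. Here is a rough sketch; the usernames and the venv path are placeholders, and it assumes the same file list as above:
from pathlib import Path

OLD = r"C:\Users\old-username"          # path prefix from the old machine
NEW = r"C:\Users\new-username"          # path prefix on the new machine
VENV = Path(r"C:\path\to\copied\venv")  # the extracted virtual env folder

for name in ("pyvenv.cfg", "Scripts/activate", "Scripts/activate.bat",
             "Scripts/activate.fish", "Scripts/activate.nu"):
    f = VENV / name
    if f.exists():
        # rewrite old-machine paths in place
        f.write_text(f.read_text().replace(OLD, NEW))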
I think what happens is that the symbolic links in the source environment get copied to the target machine as regular binary files (no longer links). You should copy it with rsync -l so those links are preserved.
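For example, something along these lines (the host and destination path are placeholders) keeps the symlinks intact:
rsync -rl /home/user/virtualenvs/Environment user@otherpc:/home/user/virtualenvs/
(-r recurses, -l copies symlinks as symlinks; -a would also work, since it implies both.)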
Usually I use virtualenv to create a new environment; then I go to the environment I want to copy from, copy all the folders, and paste them into the environment folder I just created. Most importantly, when asked whether you want to replace the destination files, choose to skip those files. This way you keep your settings.
At least for me, this has worked very well.
I hope it works for you too.
I'll share my experience.
Suppose the other PC does not have Python installed.
Python version: 3.7.3
Platform: Windows 10, 7 (64-bit)
The following works for me.
Steps:
download Windows embeddable zip file
download get-pip.py (because the embeddable zip file does not provide pip)
[Optional] install tkinter, see this article: Python embeddable zip: install Tkinter
Choose a packaging method (I use NSIS: https://nsis.sourceforge.io/Download)
folder structure:
- main.nsi
- InstallData
- contains: Step1 & Step2
- YourSitePackages # I mean that packages you do not intend to publish to PyPI.
- LICENSE
- MANIFEST.in
- README.rst
- ...
- requirements.txt
- setup.py
The abbreviated content of main.nsi is as follows:
!define InstallDirPath "$PROGRAMFILES\ENV_PYTHON37_X64"
!define EnvScriptsPath "${InstallDirPath}\Scripts"
...
CreateDirectory "${InstallDirPath}" # Make sure the directory exists before the writing of Uninstaller. Otherwise, it may not write correctly!
SetOutPath "${InstallDirPath}"
SetOverwrite on
File /nonfatal /r "InstallData\*.*"
SetOutPath "${InstallDirPath}\temp"
SetOverwrite on
File /nonfatal /r "YourSitePackages\*.*"
nsExec::ExecToStack '"${InstallDirPath}\python.exe" "${InstallDirPath}\get-pip.py"' # install pip
nsExec::ExecToStack '"${InstallDirPath}\Scripts\pip.exe" install "${InstallDirPath}\temp\."' # install you library. same as: `pip install .`
RMDir /r "${InstallDirPath}\temp" # remove source folder.
...
/*
Push ${EnvScriptsPath} # Be careful about the length of the HKLM Path. It is recommended to write to the HKCU Path instead; the user path is unlikely to exceed the length limit
Call AddToPath # https://nsis.sourceforge.io/Path_Manipulation
*/
hope someone will benefit from this.
I am currently writing a command line application in Python, which needs to be made available to end users in such a way that it is very easy to download and run. For those on Windows, who may not have Python (2.7) installed, I intend to use PyInstaller to generate a self-contained Windows executable. Users will then be able to simply download "myapp.exe" and run myapp.exe [ARGUMENTS].
I would also like to provide a (smaller) download for users (on various platforms) who already have Python installed. One option is to put all of my code into a single .py file, "myapp.py" (beginning with #! /usr/bin/env python), and make this available. This could be downloaded, then run using myapp.py [ARGUMENTS] or python myapp.py [ARGUMENTS]. However, restricting my application to a single .py file has several downsides, including limiting my ability to organize the code and making it difficult to use third-party dependencies.
Instead I would like to distribute the contents of several files of my own code, plus some (pure Python) dependencies. Are there any tools which can package all of this into a single file, which can easily be downloaded and run using an existing Python installation?
Edit: Note that I need these applications to be easy for end users to run. They are not likely to have pip installed, nor anything else which is outside the Python core. Using PyInstaller, I can generate a file which these users can download from the web and run with one command (or, if there are no arguments, simply by double-clicking). Is there a way to achieve this ease-of-use without using PyInstaller (i.e. without redundantly bundling the Python runtime)?
I don't like the single file idea because it becomes a maintenance burden. I would explore an approach like the one below.
I've become a big fan of Python's virtual environments because it allows you to silo your application dependencies from the OS's installation. Imagine a scenario where the application you are currently looking to distribute uses a Python package requests v1.0. Some time later you create another application you want to distribute that uses requests v2.3. You may end up with version conflicts on a system where you want to install both applications side-by-side. Virtual environments solve this problem as each application would have its own package location.
Creating a virtual environment is easy. Once you have virtualenv installed, it's simply a matter of running, for example, virtualenv /opt/application/env. Now you have an isolated python environment for your application. Additionally, virtual environments are very easy to clean up, simply remove the env directory and you're done.
You'll need a setup.py file to install your application into the environment. Say your application uses requests v2.3.0, your custom code is in a package called acme, and your script is called phone_home. Your directory structure looks like this:
acme/
__init__.py
models.py
actions.py
scripts/
phone_home
setup.py
The setup.py would look something like this:
from setuptools import setup  # setuptools (not distutils) supports install_requires

install_requires = [
    'requests==2.3.0',
]

setup(name='phone_home',
      version='0.0.1',
      description='Sample application to phone home',
      author='John Doe',
      author_email='john@doe.com',
      packages=['acme'],
      scripts=['scripts/phone_home'],
      url='http://acme.com/phone_home',
      install_requires=install_requires,
      )
You can now make a tarball out of your project and host it however you wish (your own web server, S3, etc.):
tar cvzf phone_home-0.0.1.tar.gz .
Finally, you can use pip to install your package into the virtual environment you created:
/opt/application/env/bin/pip install http://acme.com/phone_home-0.0.1.tar.gz
You can then run phone_home with:
/opt/application/env/bin/phone_home
Or create a symlink in /usr/local/bin to simply call the script using phone_home:
ln -s /opt/application/env/bin/phone_home /usr/local/bin/phone_home
All of the steps above can be put in a shell script, which would make the process a single-command install.
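A minimal sketch of such a script, using the example paths and URL from above:
#!/bin/sh
virtualenv /opt/application/env
/opt/application/env/bin/pip install http://acme.com/phone_home-0.0.1.tar.gz
ln -s /opt/application/env/bin/phone_home /usr/local/bin/phone_home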
And with slight modification this approach works really well for development environments, i.e. using pip to install/reference your development directory: pip install -e ., where . refers to the current directory (run it from your project directory, alongside setup.py).
Hope this helps!
You could use pip as suggested in the comments. You need to create a MANIFEST.in and setup.py in your project to make it installable. You can also add modules as prerequisites. More info can be found in this question (not specific to Django):
How do I package a python application to make it pip-installable?
This will make your module available in Python. You can then have users run a file that runs your module, by either python path/run.py, ./path/run.py (with +x permission) or python -c "some code here" (e.g. for an alias).
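If you go the setup.py route, a minimal sketch (all names here are placeholders) might look like this; the console_scripts entry point gives users a myapp command after installation:
from setuptools import setup, find_packages

setup(
    name='myapp',
    version='0.1',
    packages=find_packages(),
    install_requires=['requests'],  # third-party dependencies, if any
    entry_points={
        # assumes a main() function in myapp/cli.py
        'console_scripts': ['myapp=myapp.cli:main'],
    },
)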
You can even have users install from a public git repository, like this
pip install git+https://bitbucket.org/yourname/projectname.git
...in which case they also need git.