I tried to install the keras_contrib package on a virtual machine that does not allow internet access, so I manually unzipped the package, navigated to its folder, and ran python setup.py install. After that I can see the package in pip list, but when I import it the package cannot be found, and I cannot find the package folder in anaconda/lib/site-packages.
(link of the package: https://github.com/keras-team/keras-contrib)
These are the screenshots from the beginning and the end of the installation.
Any suggestion? Thanks very much.
You should look at your $PYTHONPATH environment variable. If you are using Mac or Linux, run the which python command in your terminal to see which interpreter you are actually using. You can also look at the link below, which describes a similar setup.py install with Anaconda Python:
https://github.com/sparklingpandas/sparklingpandas/wiki/setup.py-Install-for-Anaconda-Python
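As a quick check (a minimal sketch, not specific to keras_contrib), you can ask Python itself which interpreter it is and where it looks for packages, and compare that with where python setup.py install put the files:
import sys
print(sys.executable)   # the interpreter you are actually running
print(sys.prefix)       # the environment that interpreter belongs to
for p in sys.path:      # every directory scanned when you import something
    print(p)
If your Anaconda site-packages directory does not show up here, the setup.py install most likely ran against a different Python than the one you are importing from.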
I've been facing a problem for some time and haven't found a solution. At the company where I work I'm trying to introduce Python, but when I run the usual pip install pandas in my VS Code terminal it fails because the company blocks the installation of external libraries, so it's as if I had to install these libraries on a PC with no internet connection.
How should I follow this procedure?
I downloaded the .whl library from PyPi:
pandas-1.5.2-cp310-cp310-win_amd64.whl
and ran pip install pandas-1.5.2-cp310-cp310-win_amd64.whl -f ./ --no-index --no-deps
OK, the installation was reported as successful. But this pandas installation from cmd is not reaching my system: when I try to import pandas in VS Code it does not work, as if it had not been installed.
Would it be possible for me to download several libraries and put them in a folder where everyone in the company could use them? For example, could I declare a path where all the libraries live and then import them from there?
First confirm that you can run Python commands from VS Code at all. If you can, you can install your .whl file from your Python installation's Scripts folder:
Look for the Scripts folder of your Python installation.
Copy your .whl file into that folder.
Open the folder in Explorer, click the address bar (or press Alt+D) to select the full path, type cmd and press Enter to open a command prompt in that directory.
Then just run your pip install there and you should be good to go.
Remember, the key is that VS Code must be able to find python.exe. If it can't from the start, you will need to add your Python directory to the PATH environment variable.
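Regarding the shared-folder idea in the question: one low-tech option (a sketch only; the UNC path and package set below are hypothetical) is to pre-install the downloaded wheels once into a shared directory with pip install --no-index --target, and then have every script add that directory to sys.path before importing:
import sys
# Hypothetical shared folder, filled once on a machine that has the wheels, e.g.:
#   pip install pandas-1.5.2-cp310-cp310-win_amd64.whl --no-index --no-deps --target \\fileserver\shared\pylibs
SHARED_LIBS = r'\\fileserver\shared\pylibs'
sys.path.insert(0, SHARED_LIBS)   # make the shared folder visible to this interpreter
import pandas as pd
print(pd.__version__)
Note that every user's Python version must match the wheels (these are cp310 builds), and the shared folder also has to contain pandas' dependencies, since --no-deps skips them.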
My apologies in advance if this isn't the most clear, I'm very new to writing packages. My research group is looking to install our created data analysis package across our computers.
I created an __init__.py and a setup.py for my package and was able to use pip install to install it locally. But when I try to import the package in Jupyter notebooks, on the command line, or from a Python file, I keep getting some version of "ModuleNotFoundError".
I imagine it could be something to do with my environment or permissions, but I figured installing it on the root file system would have fixed that. I'm the only user of the computer.
The code is in this github repository: https://github.com/konnorve/DataAnalysis
I am able to load it as a package if it is in the same directory as my notebook, but I'd like to be able to import it from any path, as I use it a lot.
Thank you in advance.
Check whether you are using a virtual environment. If you are, the package needs to be installed explicitly inside that virtual environment; the system environment and the virtual environment are separate, so a package installed in one is not visible from the other.
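A quick way to see which environment you are in and whether the package is visible from it (a sketch; the import name DataAnalysis is assumed from the repository, adjust it if yours differs):
import sys
import importlib.util
print(sys.prefix)        # the environment this interpreter is running from
print(sys.base_prefix)   # the base interpreter; differs from sys.prefix inside a venv
spec = importlib.util.find_spec("DataAnalysis")  # assumed import name
print(spec.origin if spec else "not found on sys.path")
If the Jupyter kernel prints a different sys.prefix than the terminal where you ran pip install, the notebook is simply using a different environment than the one the package was installed into.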
For a class, I need to be able to use this github library: https://github.com/matsselen/pyolab. I am struggling to download/install it in a way that actually works. I am using anaconda for this and commands I've tried include:
conda install pyolab
pip install pyolab
conda install source_code_file_path
pip install -e git+https://github.com/matsselen/pyolab#egg=pyolab
I've saved the source code into anaconda's 'pkgs' folder and in my root folder.
I really don't know much about creating packages, but in my searching I found that the issue might be that there is no setup.py file included in the code on GitHub. I tried to write my own but I can't get it to work. Here is the code I have for that:
from setuptools import setup
setup(name='pyolab',
      version='0.1.0',  # 'master' is a branch name, not a valid PEP 440 version string
      description='IOLab code',
      author='mats selen',
      packages=['pyolab']
      )
Also, I am being required to use the python 2.7 version of the package instead of the newer python 3 version.
Can anyone help point me in the right direction to get this working?
There is no need to pip-install the repository; Python scripts can be used directly from regular files/directories on the file system.
Just clone the repository:
git clone https://github.com/matsselen/pyolab
and then use sys.path to point at the location of the library's scripts and import them:
import sys
sys.path.append('./pyolab/PyOLabCode/')
# all dirs from sys.path are scanned by Python when you do import
# and the first matched dir where the module is found is used
import commMethods # importing a script ./pyolab/PyOLabCode/commMethods.py
I tried the following code to see if the sodium library can be located:
import ctypes
import ctypes.util
# Taken from line 33 https://github.com/bgaifullin/pysodium/blob/master/pysodium/__init__.py
o = ctypes.util.find_library('sodium')
print(o)
This always returns None.
How do I add external libraries (dependencies) and reference them correctly in my Python code?
EDIT:
I am trying to work with pysodium, which has a dependency on libsodium.
I have downloaded libsodium, but I'm new to Python...
I'm actually using PTVS 2.1 to get up to speed running Python in my familiar dev environment.
If I understood you correctly, what you want is to import a library.
Put the pysodium directory next to the script you want to use it from and then simply do
import pysodium
It is as simple as that.
Usually, what you do is install the libraries on your system, or in a virtualenv, and import them in your Python script. Cloning the repository will generally not help unless the library you want to import sits in the same directory as the script that imports it.
I, personally, would recommend using virtualenv and pip together hand in hand. Read up on virtualenv; it will come in very handy.
Assuming you have both virtualenv and pip, all you need to do is the following
virtualenv venv
source venv/bin/activate
pip install pysodium
This should create a virtualenv container, activate it and install pysodium inside. Your script will only work when the virtualenv is activated. You can deactivate it using the command deactivate.
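Note that pysodium only wraps libsodium; the native shared library still has to be discoverable at runtime. If ctypes.util.find_library('sodium') keeps returning None, one workaround (a sketch only; the paths are hypothetical, adjust them to wherever you put the downloaded binary) is to load it explicitly:
import ctypes
# Point ctypes directly at the libsodium binary you downloaded
libsodium = ctypes.CDLL(r'C:\libs\libsodium.dll')   # on Linux: ctypes.CDLL('/usr/local/lib/libsodium.so')
print(libsodium.sodium_init())   # 0 on success, 1 if already initialized, -1 on failure
On Windows, putting the directory that contains the DLL on PATH is usually enough for find_library('sodium') to start succeeding.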
I have installed OpenCV successfully on my system and am able to import it without a virtualenv. I know I need to copy the cv2.so file into my virtualenv directory to be able to use it inside the virtualenv, but the problem is that there is NO cv2.so file in my local site-packages. All I can see are some .so files with names like libopencv_core*.
I grep-ed for it and tried finding it manually in site-packages and py-modules, but I have no clue why it isn't there.
The build and make steps all completed successfully, and I'm sure nothing was missed during installation, because outside the virtualenv it runs without any problem.
OpenCV Version: 2.4.8
Python Version: 2.7.8
OS: Ubuntu 14.04
To import OpenCV inside your virtualenv you should either install it into the virtualenv or copy cv2.pyd (on Windows) / cv2.so (on Linux) into your venv's site-packages directory.
If you are on Linux you could install it using sudo apt-get install python-opencv.
If you are building it from source you should follow the steps listed here.
On step 12:
Also make sure that everything in the PYTHON fields is filled in (ignore PYTHON_DEBUG_LIBRARY).
Look at the image presented in that step: the Python paths listed there should be your venv's Python paths, not your system's.
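Once a cv2.so (or cv2.pyd) has been copied or installed into the venv's site-packages, a quick sanity check (just a sketch) is to ask cv2 itself which binary was actually imported:
import cv2
print(cv2.__version__)   # e.g. '2.4.8'
print(cv2.__file__)      # path of the cv2 binary being imported; should point inside the venv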
Hope it helps!