Unable to import pyodbc on Apple Silicon - Symbol not found: _SQLAllocHandle - python

I am currently working on a Python (3.8) project on my 2021 MacBook Pro with Apple Silicon. Ultimately, the goal is to build an ML model on data I read from an Azure SQL DB using Apple's TensorFlow fork. Therefore, I am developing the project with native Apple Silicon packages - not using Rosetta.
The problem arises when I try to import the pyodbc package (4.0.30) in order to connect to my DB. I keep getting the following error:
File "<stdin>", line 1, in <module>
ImportError: dlopen({myvenv}/lib/python3.8/site-packages/pyodbc.cpython-38-darwin.so, 2):
Symbol not found: _SQLAllocHandle
Referenced from: {myvenv}/lib/python3.8/site-packages/pyodbc.cpython-38-darwin.so
Expected in: flat namespace
in {myvenv}/lib/python3.8/site-packages/pyodbc.cpython-38-darwin.so
If, however, I do the exact same thing using Rosetta, everything works fine. I couldn't find any other thread describing similar behaviour.
Does anyone know how to resolve this issue?

My feeling is that the package is not compiled properly for the ARM architecture.
You can uninstall pyodbc and install it again. With pip, it looks like this:
pip uninstall pyodbc
and then reinstall it, compiling it locally:
pip install --no-binary :all: pyodbc
Do not forget that on Unix/Linux-like platforms the pyodbc source distribution is built against an ODBC driver manager, so unixODBC is a prerequisite. Example installation with the brew package manager:
brew install unixodbc

Installing from source, without the pre-built wheel, should work:
pip install --pre --no-binary :all: pyodbc
This compiles pyodbc locally, so the resulting binary matches your architecture and the issue goes away.
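Once the locally compiled build is in place, a minimal sanity check like the sketch below should import the native module and open a connection. The server, database and credential values are hypothetical placeholders, and the Microsoft ODBC driver for SQL Server (e.g. ODBC Driver 17) has to be installed separately:
import pyodbc

# Confirm the locally compiled extension loads on Apple Silicon
print(pyodbc.version)

# Hypothetical Azure SQL connection values -- replace with your own
conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydatabase;"
    "UID=myuser;"
    "PWD=mypassword"
)
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())
conn.close()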

Related

bsddb.btopen alternative on Google Colab?

So I have my notebook on Google Colab using Python 3 (and I will use some deep learning libraries, e.g. Keras, TF, Flair, OpenAI...), so I really want to keep using Python 3 and not switch to 2.
However, I have a .db file that I want to open/read; the script that reads it is written in Python 2 because it uses the bsddb library (which is deprecated and doesn't work on Python 3):
self.term_to_id = bsddb.btopen(resource_prefix + '_term_to_id.db', 'r')
I tried modifying the Python 2 file to make it compatible with Python 3 so I can import it as a module in my Google Colab notebook. What I tried:
1) I tried changing bsddb to bsddb3, installing berkeleydb first (!pip install berkeleydb) so that I could later do !pip install bsddb3 and just update bsddb to bsddb3, but upon running !pip install berkeleydb I get the following errors:
ERROR: Could not find a version that satisfies the requirement
berkeleydb (from versions: 18.1.0, 18.1.1, 18.1.2, 18.1.3, 18.1.4)
ERROR: No matching distribution found for berkeleydb
2) I thought maybe I could just import the dependency from the Python 2 file into my Python 3 notebook, but as expected it didn't work because it didn't recognize 'import bsddb' in the Python 2 file.
Any tips/workarounds to make it work on Google Colab?
berkeleydb is only a Python binding for the BerkeleyDB database, which is written in C/C++.
When I try to install it on my local system (Linux Mint), I see an error:
FileNotFoundError: [Errno 2] No such file or directory: 'src/Modules/berkeleydb.h'
which means that it tries to compile some C/C++ code.
This usually requires installing a special package with the C/C++ headers (.h files), typically with the -dev suffix.
Using
!apt search berkeley
I found that libdb5.3 is already installed, so I installed libdb5.3-dev
!apt install libdb5.3-dev
and after that Python can install berkeleydb
This works for me on Colab
!apt install libdb5.3-dev
!pip install berkeleydb
import berkeleydb as bsddb
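Assuming the berkeleydb package keeps the legacy bsddb-style btopen helper (which is what the import alias above relies on), the call from the question should then work almost unchanged; resource_prefix below is a hypothetical path prefix standing in for whatever the original script uses:
import berkeleydb as bsddb

# Hypothetical path prefix -- use whatever the original script passes in
resource_prefix = '/content/resources/my_model'

# Same call as the original Python 2 script; assumes berkeleydb still
# exposes the legacy btopen helper from bsddb/bsddb3
term_to_id = bsddb.btopen(resource_prefix + '_term_to_id.db', 'r')
print(len(term_to_id))
term_to_id.close()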

Python error message "Incompatible library version" libxml and etree.so

Update 2:
the main problem turned out to be different from what I thought it was when I asked for help here. I moved the new question to a new post:
Install custom python package in virtualenv
Update:
OK, so I screwed up my non-virtualenv setup by accident.
The non-virtualenv one (normal bash) I could easily fix by removing the manually installed (via pip) lxml and running
conda install lxml --force
But for some reason, that doesn't work in the virtualenv.
There, running
conda install lxml --force
works without error message, but when I run python and simply say
>>> import lxml
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named lxml
Any suggestions??
old message:
I'm trying to use virtualenv for my Python Flask application.
The python code runs perfectly fine without the virtualenv.
I've installed the packages I need in the virtualenv, but after installing lxml via
pip install lxml
Installing collected packages: lxml
Successfully installed lxml-3.6.0
I get the following error message when running my code:
File "/Users/XXX/xxx/flask-aws/lib/python2.7/site-packages/docx-0.2.4-py2.7.egg/docx.py", line 17, in <module>
from lxml import etree
ImportError: dlopen(/Users/XXX/xxx/flask-aws/lib/python2.7/site-packages/lxml/etree.so, 2): Library not loaded: libxml2.2.dylib
Referenced from: /Users/XXX/xxx/flask-aws/lib/python2.7/site-packages/lxml/etree.so
Reason: Incompatible library version: etree.so requires version 12.0.0 or later, but libxml2.2.dylib provides version 10.0.0
I have seen other people report similar problems on Stack Overflow, and one guy remarked that the problem might be related to the virtualenv, but there was no solution.
Once again: The python code runs perfectly fine without virtualenv! But inside virtualenv, I can't get it to work.
I'm using Anaconda Python 2.7 on a Mac.
I'd appreciate any help guys!
I had the same error and stumbled upon this link, after searching for the incompatible library error "libxml2.2.dylib provides version 10.0.0"
Installing libxml2 is what worked for me:
brew install libxml2
brew link --force libxml2
The solution that works for me in a virtual environment is to force pip to recompile lxml:
pip install lxml --force-reinstall --ignore-installed --no-binary :all:
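To confirm that the recompiled extension now loads a compatible libxml2 inside the virtualenv, the version constants exposed by lxml can be printed as a quick sanity check:
from lxml import etree

# lxml's own version, the libxml2 version it was compiled against,
# and the libxml2 version actually loaded at runtime
print(etree.LXML_VERSION)
print(etree.LIBXML_COMPILED_VERSION)
print(etree.LIBXML_VERSION)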

Importing opencv and getting numpy.core.multiarray failed to import

Trying to install OpenCV and running into an issue where attempting to import cv2 results in this output -
RuntimeError: module compiled against API version 9 but this version of numpy is 7
Traceback (most recent call last):
File "<pyshell#4>", line 1, in <module>
import cv2
ImportError: numpy.core.multiarray failed to import
I'm running on Windows 7 x64, Python v 2.7.9
Thanks!
The error is telling you that you have an out-of-date version of numpy. If you used pip to install things, you can simply run pip install numpy -U, or download the appropriate version from their website.
In case
pip install -U numpy
doesn't work (even with sudo), you may want to make sure you're using the right version of numpy. I had the same "numpy.core.multiarray failed to import" issue, but it was because I had 1.6 installed for the version of Python I was using, even though I kept installing 1.8 and assumed it was installing in the right directory.
I found the bad numpy version by using the following command in my Mac terminal:
python -c "import numpy;print numpy.version;print numpy.file";
This command gave me the version and location of numpy that I was using (turned out it was 1.6.2). I went to this location and manually replaced it with the numpy folder for 1.8, which resolved my "numpy.core.multiarray failed to import" issue. Hopefully someone finds this useful!
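If it is unclear which copy of numpy is being picked up, a small sketch along these lines (standard library only) can help spot a stale install shadowing the freshly installed one:
import os
import sys
import numpy

# Version and location of the numpy that actually gets imported
print(numpy.__version__)
print(numpy.__file__)

# Every sys.path entry that contains a numpy package -- an old copy
# earlier on the path will shadow a newer install
for path in sys.path:
    if path and os.path.isdir(os.path.join(path, 'numpy')):
        print(path)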
I had a similar problem and I solved it by downgrading my numpy version.
What I did was:
pip install opencv-python
pip uninstall numpy
pip install numpy==1.18
This has worked for me using
Python 3.7
opencv-python 4.4.0.46
numpy 1.18.0
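After the downgrade, a quick import check confirms that the two packages load together, i.e. that the numpy C API mismatch is gone:
import cv2
import numpy

# Both imports succeeding means the numpy the interpreter loads is
# compatible with the API version opencv-python was built against
print(cv2.__version__)
print(numpy.__version__)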
Linux: sudo apt-get install python-numpy
If you are using Ubuntu Bionic Beaver, try running: sudo apt-get install python-numpy
I had the same issue and resolved it by running the above command.
Hope it helps.
In your environment you can try these commands:
conda uninstall numpy
conda install -c conda-forge numpy
I use Python 3.7 on a Raspberry Pi 4.
For OpenCV to install properly I had to install the libraries listed in the Stack Overflow topic linked below.
(Not every package was actually installed after the request.)
Regarding numpy, I think one should stick to the latest version.
For me what worked was to uninstall the existing version 1.16.2 and stay on the current stable 1.21.2.
The Stack Overflow topic on the missing libraries is here: ImportError: libcblas.so.3: cannot open shared object file: No such file or directory.

Setup script exited with error: Unable to find vcvarsall.bat

I got the following error while running my script:
Traceback (most recent call last):
File "mysql.py", line 2, in <module>
import MySQLdb
ImportError: No module named MySQLdb
I tried to install mysql-python as suggested in No module named MySQLdb, but I'm running into the following error. Can anyone suggest how to overcome it?
C:\Dropbox\scripts>easy_install mysql-python
Searching for mysql-python
Reading http://pypi.python.org/simple/mysql-python/
Best match: MySQL-python 1.2.5
Downloading https://pypi.python.org/packages/source/M/MySQL-python/MySQL-python-1.2.5.zip#md5=654f75b302db6ed8dc5a898c625e030c
Processing MySQL-python-1.2.5.zip
Running MySQL-python-1.2.5\setup.py -q bdist_egg --dist-dir c:\users\gnakkala\appdata\local\temp\easy_install-kowc5r\MySQL-python-1.2.5\egg-dist-tmp-1gslvq
error: Setup script exited with error: Unable to find vcvarsall.bat
I had a similar issue getting MySQL-python to install properly and work for me. I tried both easy_install and pip; both had issues with vcvarsall.bat. Below is what I did to solve my problem, which I think might be able to lead you in the right direction. I have a Windows 8 machine with Python 2.7 installed, and I run my stuff through Eclipse.
Some Background:
When I did an easy_install it tried to install MySQL-python 1.2.5, which failed with an error: Unable to find vcvarsall.bat. I did an easy_install of pip and tried the pip install, which also failed with a similar error. They both reference vcvarsall.bat, which has something to do with Visual Studio; since I don't have Visual Studio on my machine, I was left looking for a different solution, which I share below.
The Solution:
Reinstall Python 2.7.8 from https://www.python.org/download; this will add any missing registry settings, which are required by the next install.
Install MySQL-python 1.2.4 from http://pypi.python.org/pypi/MySQL-python/1.2.4
After I did both of those installs I was able to query my MySQL db through eclipse.
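For reference, once MySQLdb imports cleanly, a minimal query looks like the sketch below; the host, user, password and database values are hypothetical placeholders:
import MySQLdb

# Hypothetical connection values -- replace with your own
conn = MySQLdb.connect(host="localhost", user="myuser",
                       passwd="mypassword", db="mydatabase")
cursor = conn.cursor()
cursor.execute("SELECT VERSION()")
print(cursor.fetchone())
conn.close()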
1. Install wheel:
pip install wheel
2. Download the .whl file from http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml
Ctrl+F and search for mysql; you will find:
MySQL-python, a Python database API 2.0 interface for the MySQL database
Mysqlclient is a Python 3 compatible fork of MySQL-python.
MySQL_python-1.2.5-cp27-none-win32.whl
MySQL_python-1.2.5-cp27-none-win_amd64.whl
Mysqlclient, a fork of the MySQL-python interface for the MySQL database.
mysqlclient-1.3.8-cp27-cp27m-win32.whl
mysqlclient-1.3.8-cp27-cp27m-win_amd64.whl
mysqlclient-1.3.8-cp34-cp34m-win32.whl
mysqlclient-1.3.8-cp34-cp34m-win_amd64.whl
mysqlclient-1.3.8-cp35-cp35m-win32.whl
mysqlclient-1.3.8-cp35-cp35m-win_amd64.whl
mysqlclient-1.3.8-cp36-cp36m-win32.whl
mysqlclient-1.3.8-cp36-cp36m-win_amd64.whl
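3. Presumably the remaining step is to install whichever of the wheels above matches your Python version and bitness, and then check that it imports; for example (the 64-bit Python 2.7 filename is taken from the list above):
pip install MySQL_python-1.2.5-cp27-none-win_amd64.whl
python -c "import MySQLdb"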

How to install zeroRPC (python) on windows

I would like to try zeroRPC but couldn't install the package properly. I am using the latest Python(x,y) distribution (Python 2.7.3) under Windows 7, and I must say I don't have much experience with installing new modules since the distribution is already pretty complete.
I pulled the master branch of zerorpc-python from GitHub and tried to do "python setup.py install".
I had a first problem with something like "impossible to locate vcvarsall.bat". I solved it by installing MinGW as explained here: error: Unable to find vcvarsall.bat
Then I could run the install until the end, but now, when I import zerorpc, I get the following ImportError (only the end of the stack):
C:\Python27\lib\site-packages\gevent-0.13.8-py2.7-win32.egg\gevent\greenlet.py in <module>()
4 import traceback
5 from gevent import core
----> 6 from gevent.hub import greenlet, getcurrent, get_hub, GreenletExit, Waiter
7 from gevent.timeout import Timeout
8
C:\Python27\lib\site-packages\gevent-0.13.8-py2.7-win32.egg\gevent\hub.py in <module>()
28
29 try:
---> 30 greenlet = __import__('greenlet').greenlet
31 except ImportError:
32 greenlet = __import_py_magic_greenlet()
ImportError: No module named greenlet
I wonder more generally if I am following the right procedure to install new packages (under Windows), or if there is a simpler way (safer with dependencies) that I would be overlooking (easy_install)? I must say I am very new to this, and any hints or links to the relevant documentation would be appreciated.
Thanks in advance,
Samuel
I had been struggling with this question myself for a while. The solution involves several components, and many answers out there seem to relate to different versions of those components that don't always play well together.
Here is the complete solution that worked for me, starting from an empty virtualenv:
mkvirtualenv myenv
python -m pip install --upgrade pip==6.0.8 wheel==0.24.0
pip install gevent-1.0.1-cp27-none-win32.whl pyzmq-13.1.0-cp27-none-win32.whl zerorpc==0.4.4
The first step installs wheel and upgrades pip itself to support wheel package installations. The next step installs binary wheels for gevent-1.0.1 (downloadable from this unofficial but extremely useful Python Windows binaries page) and pyzmq-13.1.0 (available here), and the zerorpc-0.4.4 package from source in the usual way.
Note that I hard-coded source package versions here (pip 6.0.8, wheel 0.24.0, zerorpc 0.4.4) because as I said other versions don't always follow the same build patterns. This may not be necessary and future versions may prove to work just as well together.
The final result for me:
(myenv) C:\work>pip freeze
gevent==1.0.1
greenlet==0.4.5
msgpack-python==0.4.5
pyzmq==13.1.0
wheel==0.24.0
zerorpc==0.4.4
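As a quick smoke test of the resulting environment, a minimal zerorpc server and client can be run in two consoles; the Hello class and the tcp://127.0.0.1:4242 endpoint below are just illustrative:
# server.py
import zerorpc

class Hello(object):
    def hi(self, name):
        # Simple echo-style method exposed over RPC
        return "Hello, %s" % name

server = zerorpc.Server(Hello())
server.bind("tcp://127.0.0.1:4242")
server.run()

# client.py (run in a second console)
import zerorpc

client = zerorpc.Client()
client.connect("tcp://127.0.0.1:4242")
print(client.hi("world"))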
I used a slightly different approach; I am using Anaconda + Jupyter to run my Python notebooks.
I used this link to the zerorpc package and installed it using
conda install -c groakat zerorpc
which installed the package and its dependencies.
