I am using a Windows 10 machine and have installed Python, numpy and OpenCV from the official site using the pre-built binaries. I can successfully import numpy and cv2, but I get an error when I try to import cv.
import cv2
import numpy as np
import sys
import cv
def diceroll():
    rng = cv.RNG(np.random.randint(1,10000))
    print 'The outcome of the roll is:'
    print int(6*cv.RandReal(rng) + 1)
    return

diceroll()
ImportError: No module named cv
P.S.: This is not a duplicate of this question. The user in that question is getting a DLL error, whereas I am stuck with an import error for cv.
After inquiring on the OpenCV community forum, I learnt that the old cv (or cv2.cv) API was removed entirely in OpenCV 3.
One cannot use the RNG function from cv under OpenCV 3; you can instead use numpy.random for the same functionality.
Reference: my question on the OpenCV community forum
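For example, here is a minimal sketch of the same dice roll written with numpy.random instead of the removed cv.RNG (adapted by hand, not code taken from the forum thread):

import numpy as np

def diceroll():
    # numpy's random integers replace cv.RNG / cv.RandReal for this use case
    print('The outcome of the roll is:')
    print(int(np.random.randint(1, 7)))  # uniform integer in [1, 6]

diceroll()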
It is somewhere in there; you just need to search for it. Try running something like the following on your system:
from types import ModuleType

def search_submodules(module, identifier):
    # Recursively walk a module's attributes looking for `identifier`
    # and return its dotted path (e.g. 'cv2.cv.RNG'), or None.
    assert isinstance(module, ModuleType)
    ret = None
    for attr in dir(module):
        if attr == identifier:
            ret = '.'.join((module.__name__, attr))
            break
        else:
            submodule = getattr(module, attr)
            if isinstance(submodule, ModuleType):
                ret = search_submodules(submodule, identifier)
    return ret

if __name__ == '__main__':
    import cv2
    print cv2.__version__
    print search_submodules(cv2, 'RNG')
On my system, this prints:
2.4.11
cv2.cv.RNG
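So on an OpenCV 2.x install the original snippet could be adapted along these lines (a sketch on my part, assuming the legacy cv2.cv sub-module is present; it was removed in OpenCV 3):

import cv2
import numpy as np

# The old cv API lives under cv2.cv in OpenCV 2.x
rng = cv2.cv.RNG(np.random.randint(1, 10000))
print('The outcome of the roll is:')
print(int(6 * cv2.cv.RandReal(rng) + 1))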
I wrote a module for my dataset classes, located in the same directory as my main file.
When I import it and try to instantiate a class, I get a "np is not defined" error, even though numpy is imported correctly in both the main file and the module.
I say "correctly" because if I call another numpy function from the main file, or run the module on its own, no error arises.
This is the code in the main file:
import torch
import numpy as np
from myDatasets import SFCDataset
trainDs = SFCDataset(paths["train"])
and this is the module:
import torch
import numpy as np
from torch.utils.data import Dataset

# Single Fragment Classification Dataset
class SFCDataset(Dataset):
    def __init__(self, path, transform=None, norms=False, areas=False, **kwargs):
        super(SFCDataset, self).__init__()
        self.data = torch.load(path)
        self.fragments = []
        self.labels = []
        for item in self.data:
            self.fragments.append(item[0])
            self.labels.append(item[1])
        self.fragments = np.array(self.fragments)
        self.labels = np.array(self.labels)

        if norms:
            if areas:
                self.fragments = np.transpose(self.fragments[:], (0, 2, 1, 3)).reshape(-1, 1024, 9)[:, :, :7]
            else:
                self.fragments = np.transpose(self.fragments[:], (0, 2, 1, 3)).reshape(-1, 1024, 9)[:, :, :6]
        else:
            if areas:
                self.fragments = np.transpose(self.fragments[:], (0, 2, 1, 3)).reshape(-1, 1024, 9)[:, :, [0, 1, 2, 6]]
            else:
                self.fragments = np.transpose(self.fragments[:], (0, 2, 1, 3)).reshape(-1, 1024, 9)[:, :, :3]

        self.transform = transform

    def __len__(self) -> int:
        return len(self.data)

    def __getitem__(self, index):
        label = self.labels[index]
        pointdata = self.fragments[index]
        if self.transform:
            pointdata = self.transform(pointdata)
        return pointdata, label


if __name__ == "__main__":
    print("huh")
    path = "C:\\Users\\ale23\\OneDrive\\Desktop\\Università\\Tesi\\data\\dataset_1024_AB\\train_dataset_AED_norm_area.pt"
    SFCDataset(path)
I don't know what else to try.
I'm on VSCode, using a 3.9.13 virtual environment.
Edit:
This is the error I'm getting:
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
Cell In[75], line 5
2 import numpy as np
3 from myDatasets import SFCDataset
----> 5 trainDs = SFCDataset(paths["train"])
File c:\Users\ale23\OneDrive\Desktop\Università\Dottorato\EquivariantNN\myDatasets.py:17, in SFCDataset.__init__(self, path, transform, norms, areas, **kwargs)
15 self.fragments.append(item[0])
16 self.labels.append(item[1])
---> 17 self.fragments=np.array(self.fragments)
18 self.labels=np.array(self.labels)
20 if norms:
NameError: name 'np' is not defined
Edit 2: I edited some stuff like path names and parts occurring after the error to lighten up the code; sorry, I should have uploaded all the code as it is actually run.
Edit 3: I was trying to reproduce the error in some other way and built a dummy module. I tried importing the class into that other module and running dummy.py, and it runs. It appears the problem is related to the fact that I'm working in a notebook; is that possible?
the dummy module:
import numpy as np

def test():
    print(np.array(1))
    print(np.array(2))
    print(np.sum(np.random.rand(2)))

from myDatasets import SFCDataset

trainDs = SFCDataset("C:\\Users\\ale23\\OneDrive\\Desktop\\Università\\Tesi\\data\\dataset_1024_AB\\train_dataset_AED_norm_area.pt")
This runs by calling "python testmodule.py" in the console.
Edit 4:
Today I restarted my PC and ran the same code as yesterday, and now the notebook works. Yesterday I tried closing and restarting VSCode, but it did not help.
Maybe something is wrong with the virtual environment? I honestly don't know where to look.
Anyway, the program now runs with no errors. Should I close this?
Thank you all for your time and help.
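(For anyone hitting the same thing: my guess, and it is only a guess since the restart fixed it, is that the notebook kernel was still holding an older, cached copy of myDatasets. A minimal sketch of forcing a fresh import in a notebook cell without restarting the kernel:)

import importlib
import myDatasets

# Re-execute myDatasets.py so the notebook kernel picks up the current file
# instead of a stale, previously imported copy.
importlib.reload(myDatasets)
from myDatasets import SFCDataset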
While using the rpy2 library in Python to work with R, I get the following error message when trying to import a function from the bnlearn package:
# Using R inside python
import rpy2
import rpy2.robjects as robjects
import rpy2.robjects.packages as rpackages
from rpy2.robjects.vectors import StrVector
from rpy2.robjects.packages import importr
utils = rpackages.importr('utils')
utils.chooseCRANmirror(ind=1)
# Install packages
packnames = ('visNetwork', 'bnlearn')
utils.install_packages(StrVector(packnames))
# Load packages
visNetwork = importr('visNetwork')
bnlearn = importr('bnlearn')
tabu = bnlearn.tabu
fit = bnlearn.bn.fit
With the error:
AttributeError: module 'bnlearn' has no attribute 'bn'
Checking the bnlearn documentation, one finds that bn is a class structure. So one should check all the attributes of the object in question, that is, run:
bnlearn.__dict__['_rpy2r']
After that you should get a similar output like the next one, where you find how you would import each attribute of bnlearn:
...
...
'bn_boot': 'bn.boot',
'bn_cv': 'bn.cv',
'bn_cv_algorithm': 'bn.cv.algorithm',
'bn_cv_structure': 'bn.cv.structure',
'bn_fit': 'bn.fit',
'bn_fit_backend': 'bn.fit.backend',
'bn_fit_backend_continuous': 'bn.fit.backend.continuous',
'bn_fit_backend_discrete': 'bn.fit.backend.discrete',
'bn_fit_backend_mixedcg': 'bn.fit.backend.mixedcg',
'bn_fit_barchart': 'bn.fit.barchart',
'bn_fit_dotplot': 'bn.fit.dotplot',
...
...
Then, running the following will solve the issue:
bn_fit = bnlearn.bn_fit
Now, you could, for example, fit a Bayesian network:
structure = tabu(datos, score = "loglik-g")
bn_mod = bn_fit(structure, data = datos, method = "mle")
In general, this approach solves the issue of importing any function from an R package into Python through the rpy2 package.
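As a small illustration of the rule (a sketch, assuming bnlearn is already installed in R): importr exposes R functions whose names contain dots under underscore names, so bn.fit becomes bn_fit, bn.cv becomes bn_cv, and so on.

from rpy2.robjects.packages import importr

bnlearn = importr('bnlearn')

tabu = bnlearn.tabu        # R function: tabu
bn_fit = bnlearn.bn_fit    # R function: bn.fit
bn_cv = bnlearn.bn_cv      # R function: bn.cv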
I'm working on (personal) documentation for the nested matplotlib (MPL) library, which differs from MPL's own documentation in the submodule packages it covers. I'm writing a Python script which I hope will automate document generation from future MPL releases.
I selected the submodules/packages I'm interested in and want to list their main classes, from which I'll generate a list and process it with pydoc.
The problem is that I can't find a way to instruct Python to load a submodule from a string. Here is an example of what I tried:
import matplotlib.text as text
x = dir(text)

i = __import__('matplotlib.text')
y = dir(i)

j = __import__('matplotlib')
z = dir(j)
And here is a 3-way comparison of the above lists through pprint:
I don't understand what's loaded in the y object: it's base matplotlib plus something else, but it lacks the information I wanted, namely the main classes from the matplotlib.text package (the top, blue-coloured part of the screenshot, i.e. the x list).
Please don't suggest Sphinx as a different approach.
The __import__ function can be a bit hard to understand.
If you change
i = __import__('matplotlib.text')
to
i = __import__('matplotlib.text', fromlist=[''])
then i will refer to matplotlib.text.
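A short sketch of the difference (assuming matplotlib is installed):

# Without fromlist, __import__ returns the top-level package;
# with a non-empty fromlist it returns the submodule itself.
top = __import__('matplotlib.text')
sub = __import__('matplotlib.text', fromlist=[''])

print(top.__name__)   # matplotlib
print(sub.__name__)   # matplotlib.text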
In Python 3.1 or later, you can use importlib:
import importlib
i = importlib.import_module("matplotlib.text")
Some notes
If you're trying to import something from a sub-folder e.g. ./feature/email.py, the code will look like importlib.import_module("feature.email")
Before Python 3.3 you could not import anything if there was no __init__.py in the folder with the file you were trying to import (see caveats before deciding if you want to keep the file for backward compatibility, e.g. with pytest).
importlib.import_module is what you are looking for. It returns the imported module.
import importlib
# equiv. of your `import matplotlib.text as text`
text = importlib.import_module('matplotlib.text')
You can thereafter access anything in the module as text.myclass, text.myfunction, etc.
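And for the original goal of listing a submodule's main classes, a small sketch along the same lines (matplotlib.text here is just the example from the question, e.g. its Text and Annotation classes):

import importlib
import inspect

text = importlib.import_module('matplotlib.text')

# Keep only the classes defined in matplotlib.text itself,
# not the ones it merely re-imports.
classes = [name for name, obj in inspect.getmembers(text, inspect.isclass)
           if getattr(obj, '__module__', None) == text.__name__]
print(classes)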
I spent some time trying to import modules from a list, and this is the thread that got me most of the way there, but I didn't grasp the use of __import__.
So here's how to import a module from a string and get the same behavior as a plain import, with a try/except for the error case, too. :)
import sys

pipmodules = ['pycurl', 'ansible', 'bad_module_no_beer']

for module in pipmodules:
    try:
        # because we want to import using a variable, do it this way
        module_obj = __import__(module)
        # create a global object containing our module
        globals()[module] = module_obj
    except ImportError:
        sys.stderr.write("ERROR: missing python module: " + module + "\n")
        sys.exit(1)
And yes, for Python 2.7 and later you have other options, but for 2.6 and earlier, this works.
Apart from using importlib, one can also use the exec built-in to import a module from a string variable.
Here I am showing an example of importing the combinations function from the itertools package using exec:
MODULES = [
    ['itertools', 'combinations'],
]

for ITEM in MODULES:
    import_str = "from {0} import {1}".format(ITEM[0], ', '.join(str(i) for i in ITEM[1:]))
    exec(import_str)

ar = list(combinations([1, 2, 3, 4], 2))
for elements in ar:
    print(elements)
Output:
(1, 2)
(1, 3)
(1, 4)
(2, 3)
(2, 4)
(3, 4)
Module auto-install & import from list
The script below works fine with both submodules and pseudo-submodules.
# PyPI imports
import pkg_resources, subprocess, sys

modules = {'lxml.etree', 'pandas', 'screeninfo'}
required = {m.split('.')[0] for m in modules}
installed = {pkg.key for pkg in pkg_resources.working_set}
missing = required - installed

if missing:
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', '--upgrade', 'pip'])
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', *missing])

for module in set.union(required, modules):
    globals()[module] = __import__(module)
Tests:
print(pandas.__version__)
print(lxml.etree.LXML_VERSION)
I developed these 3 useful functions:
def loadModule(moduleName):
    module = None
    try:
        import sys
        del sys.modules[moduleName]
    except BaseException as err:
        pass
    try:
        import importlib
        module = importlib.import_module(moduleName)
    except BaseException as err:
        serr = str(err)
        print("Error to load the module '" + moduleName + "': " + serr)
    return module

def reloadModule(moduleName):
    module = loadModule(moduleName)
    moduleName, modulePath = str(module).replace("' from '", "||").replace("<module '", '').replace("'>", '').split("||")
    if modulePath.endswith(".pyc"):
        import os
        os.remove(modulePath)
        module = loadModule(moduleName)
    return module

def getInstance(moduleName, param1, param2, param3):
    module = reloadModule(moduleName)
    instance = eval("module." + moduleName + "(param1, param2, param3)")
    return instance
And every time I want to reload a new instance, I just have to call getInstance() like this:
myInstance = getInstance("MyModule", myParam1, myParam2, myParam3)
Finally, I can call all the functions inside the new instance:
myInstance.aFunction()
The only thing to customize here is the parameter list (param1, param2, param3) of your instance.
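A small variant of the same idea (my own sketch, not part of the three functions above): getattr avoids building the eval string and lets you pass an arbitrary parameter list, assuming the module defines a class with the same name as the module.

def getInstanceViaGetattr(moduleName, *args, **kwargs):
    # Same behaviour as getInstance, but resolves the class with getattr
    # instead of eval; relies on reloadModule defined above.
    module = reloadModule(moduleName)
    cls = getattr(module, moduleName)
    return cls(*args, **kwargs)

# e.g. myInstance = getInstanceViaGetattr("MyModule", myParam1, myParam2, myParam3)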
You can also use the exec built-in function, which executes any string as Python code.
In [1]: module = 'pandas'
...: function = 'DataFrame'
...: alias = 'DF'
In [2]: exec(f"from {module} import {function} as {alias}")
In [3]: DF
Out[3]: pandas.core.frame.DataFrame
For me this was the most readable way to solve my problem.
I have come across a strange error when using the dlib.shape_predictor function.
This is my code:
import sys
import dlib
from skimage import io
import matplotlib.pyplot as plt
import os
predictor_model = os.path.join(os.path.dirname(__file__),"Models","shape_predictor_68_face_landmarks.dat")
# Take the image file name from the command line
file_name = "/home/matt/Programming/Python/Face_detection/Images/wedding_photo.jpg"
# Test if they are equivalent
print predictor_model == "/home/matt/Programming/Python/Face_detection/Models/shape_predictor_68_face_landmarks.dat"
predictor_model = "/home/matt/Programming/Python/Face_detection/Models/shape_predictor_68_face_landmarks.dat"
# Create a HOG face detector using the built-in dlib class
face_detector = dlib.get_frontal_face_detector()
face_pose_predictor = dlib.shape_predictor(predictor_model)
If I use predictor_model as built with os.path above, then I get an error:
ArgumentError: Python argument types in
shape_predictor.__init__(shape_predictor, unicode)
did not match C++ signature:
__init__(boost::python::api::object, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)
__init__(_object*)
If I hardcode the path as shown above, the script runs without a problem. Any ideas what may have gone wrong?
I have some simple code, as shown below:
import cv
from opencv.cv import *
from opencv.highgui import *

img = cv.LoadImage("test.jpg")
cap = cv.CreateCameraCapture(0)
while cv.WaitKey(1) != 10:
    img = cv.QueryFrame(cap)
    cv.ShowImage("cam view", img)
cascade = cv.LoadHaarClassifierCascade('haarcascade_frontalface_alt.xml', cv.Size(1,1))
But I ran into this error:
# AttributeError: 'module' object has no attribute 'LoadImage'
When I change the code to the following:
import cv
#from opencv.cv import *
#from opencv.highgui import *

img = cv.LoadImage("test.jpg")
cap = cv.CreateCameraCapture(0)
while cv.WaitKey(1) != 10:
    img = cv.QueryFrame(cap)
    cv.ShowImage("cam view", img)
cascade = cv.LoadHaarClassifierCascade('haarcascade_frontalface_alt.xml', cv.Size(1,1))
the first error is solved, but another error arises:
AttributeError: 'module' object has no attribute 'LoadHaarClassifierCascade'
I need both of the modules, but it seems that they conflict with each other.
What should I do now?
In OpenCV, to load a Haar classifier (in the Python interface anyway) you just use cv.Load.
import cv
cascade = cv.Load('haarcascade_frontalface_alt.xml')
See the examples here.
Also, the samples that come with the OpenCV source are really good (in OpenCV-2.xx/samples/python).
How do you access the stuff you've imported?
# imports the cv module, all stuff contained in it and
# the module itself is now accessible via: cv.classname, cv.functionname
# where classname, functionname is the name of the class/function which
# the cv module provides..
import cv
# imports everything contained in the opencv.cv module
# after this import it is available via its classname, functionname, etc.
# Attention: without prefix!!
from opencv.cv import *
# #see opencv.cv import
from opencv.highgui import *
#see python modules for more details about modules and imports in python.
If you can tell me which classes are contained in which module, I could add a specific solution for your problem.