I am writing some code in C which reads data in and stores it in a pointer of type uint8_t.
I now want to make this data accessible in Python 3.
So far I have figured out how to make my C code callable with the help of ctypes, and I know it works, since I can print the data in the Python terminal; I am just not able to store it in a variable.
My problem is that I do not know how to transform this pointer into an array that can be stored and manipulated in Python. So I ask: if someone has a simple example where they move, for example, a 3-by-1 array from C into Python and then work with it there, it would benefit me a lot.
Okay, so I managed to figure it out. Maybe not the smartest way, but at least it works now. Since I return a pointer of type `uint8_t *` in C, I am then able in Python to use something like this:
g = (ctypes.c_float*5271).from_address(a)
Then g[0], g[:], etc. give me access to the data.
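A cleaner variant is to declare the function's return type so that ctypes does the bookkeeping and the element type matches the declared uint8_t. A minimal sketch, assuming a hypothetical shared library libdata.so whose function get_data() returns a uint8_t * to DATA_LEN bytes (the library name, function name, and length are all placeholders):

```
import ctypes

DATA_LEN = 5271  # hypothetical buffer length

lib = ctypes.CDLL("./libdata.so")
# Tell ctypes that the C function returns uint8_t *.
lib.get_data.restype = ctypes.POINTER(ctypes.c_uint8)

ptr = lib.get_data()
# View the pointer as a fixed-size array, then copy it into Python.
arr = ctypes.cast(ptr, ctypes.POINTER(ctypes.c_uint8 * DATA_LEN)).contents
data = bytes(arr)          # immutable copy, independent of the C buffer
print(data[0], data[:10])  # index and slice like any Python sequence
```

Note that from_address with ctypes.c_float only reads the bytes correctly if the buffer really holds floats; ctypes.c_uint8 matches the type declared in the C code.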
I have the following problem: I want to read a .txt file in Python and use the variables in MATLAB afterwards. I have written a Python script that reads the file line by line and extracts the values that follow certain names I search for.
At this point, the script saves the values as str. To transfer a value via the MATLAB Engine to my workspace, I define it, for example, as
workspace['A'] = float(A)
and this works well. The problem I am facing is the handling of vectors, which my script also reads as str. In the file they are stored as {1, 2, 3}. I replace the { with [, but in the end I am not able to set them up in the right way to get an [a x b] double variable in my workspace.
Multiple ways of converting have not worked so far. Maybe one of you has already run into this kind of problem.
What I have tried: converting the vectors in many ways, e.g. using numpy.
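One approach that should work is to parse the string into a list of floats in Python and hand it to MATLAB as a matlab.double rather than as a string. A minimal sketch, assuming the MATLAB Engine API for Python is installed (the raw string below stands in for a line read from the file):

```
import matlab
import matlab.engine

raw = "{1, 2, 3}"  # hypothetical vector as it appears in the .txt file

# Strip the braces and convert each element to float.
values = [float(x) for x in raw.strip("{}").split(",")]

eng = matlab.engine.start_matlab()
# matlab.double turns a nested Python list into a MATLAB double matrix;
# the extra list wrapper makes it a 1-by-n row vector.
eng.workspace['A'] = matlab.double([values])
print(eng.eval("size(A)", nargout=1))  # should report 1-by-3
```

Transpose on the MATLAB side (A') if a column vector is needed instead.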
I want to ask more experienced people how to get the RGB values of a pixel in an image using oiio's Python bindings.
I have only just begun using oiio and am unfamiliar with the library as well as with image manipulation at the code level.
I poked around the documentation and I don't quite understand how the parameters work. It doesn't seem to work the same way as Python, and I'm having a hard time figuring it out, as I don't know C.
A) What command to even use to get pixel information (it seems like get_pixel could work)
and
B) How to get it to work; I don't understand the parameter requirements exactly.
Edit:
I'm trying to convert the documentation's C example of visiting all pixels to get an average color into something pythonic, but am getting nowhere.
Would appreciate any help, thank you.
Edit: adding the code
buffer = oiio.ImageBuf('image.tif')
array = buffer.read(0, 0, True)
print buffer.get_pixels(array)
the error message I get is:
# Error: Python argument types in
# ImageBuf.get_pixels(ImageBuf, bool)
# did not match C++ signature:
# get_pixels(class OpenImageIO::v1_5::ImageBuf, enum OpenImageIO::v1_5::TypeDesc::BASETYPE)
# get_pixels(class OpenImageIO::v1_5::ImageBuf, enum OpenImageIO::v1_5::TypeDesc::BASETYPE, struct OpenImageIO::v1_5::ROI)
# get_pixels(class OpenImageIO::v1_5::ImageBuf, struct OpenImageIO::v1_5::TypeDesc)
# get_pixels(class OpenImageIO::v1_5::ImageBuf, struct OpenImageIO::v1_5::TypeDesc, struct OpenImageIO::v1_5::ROI)
OpenImageIO has several classes for dealing with images, with different levels of abstraction. Assuming that you are interested in the ImageBuf class, I think the simplest way to access individual pixels from Python (with OpenImageIO 2.x) would look like this:
import OpenImageIO as oiio
buf = oiio.ImageBuf("foo.jpg")
p = buf.getpixel(50, 50)  # x, y
print(p)
p will be a numpy array, so this will produce output like:
(0.01148223876953125, 0.0030574798583984375, 0.0180511474609375)
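For the average-color task mentioned in the question, OpenImageIO 2.x can also return all pixels at once as a numpy array, which reduces the loop to a one-liner. A sketch, assuming foo.jpg is the image of interest:

```
import OpenImageIO as oiio

buf = oiio.ImageBuf("foo.jpg")
# get_pixels returns the whole image as a numpy array
# shaped (height, width, nchannels).
pixels = buf.get_pixels(oiio.FLOAT)
avg = pixels.mean(axis=(0, 1))  # per-channel mean over all pixels
print(avg)                      # e.g. [r g b]
```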
Recently I found this awesome 2-factor authentication code generator written in Python 3. I was trying to convert it to Swift 3, but I am having trouble with one specific part:
def get_hotp_token(secret, intervals_no):
    key = base64.b32decode(secret)                 # shared secret, base32-encoded
    msg = struct.pack(">Q", intervals_no)          # counter as 8 big-endian bytes
    h = hmac.new(key, msg, hashlib.sha1).digest()  # 20-byte HMAC-SHA1 digest
    o = h[19] & 15                                 # dynamic truncation offset
    h = (struct.unpack(">I", h[o:o+4])[0] & 0x7fffffff) % 1000000
    return h
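For context, this function is typically invoked with the current 30-second time window as the interval number (the secret below is a made-up base32 example):

```
import time

# TOTP is just HOTP with the interval derived from the Unix time.
secret = "MZXW6YTBOI======"  # hypothetical base32-encoded secret
token = get_hotp_token(secret, intervals_no=int(time.time()) // 30)
print("%06d" % token)  # codes are conventionally zero-padded to 6 digits
```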
So far I have only been able to translate the first line of the function body, using code from here:
func getHotpToken(secret: String) -> [Int] {
    let data = secret.base32DecodedData
    <...>
    return theTokens
}
I tried reading the documentation on struct.pack here and reading about what packing actually is here, but I still find the concept/implementation confusing, and I have no idea what the equivalent would be in Swift.
According to the documentation, struct.pack returns the packed bytes in the given format. The format in my case is >Q, which means that the byte order is big-endian and the C type is an unsigned long long. Again, I am not exactly sure how this is supposed to look in Swift.
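Running that one line in isolation shows exactly what has to be reproduced in Swift (the counter value 1 is just an example):

```
import struct

# ">Q": big-endian byte order, unsigned 64-bit integer.
msg = struct.pack(">Q", 1)
print(msg)       # b'\x00\x00\x00\x00\x00\x00\x00\x01'
print(len(msg))  # always 8 bytes, whatever the value
```

So the Swift side only has to turn the interval counter into its 8 big-endian bytes.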
... And that is only the second line! I don't really understand how HMAC works (I can't even find the actual 'background' code), so I can't translate the rest of the function. I could not find any native Swift library with this behavior.
Any pointers or help translating this function will be highly appreciated!
P.S. I checked and I think that this is on topic
Relevant imports:
import base64, struct, hmac, hashlib
I just finished converting my code to Swift 3. It is a little different from the Python version, since it is more of a framework-type thing. It took a lot of experimentation to get get_hotp_token to work (for example, the Wikipedia page says it uses SHA-256, but it actually uses SHA-1).
You can find it here.
When you use this, be sure to add a bridging header with #import <CommonCrypto/CommonHMAC.h>
Enjoy!
I have the following parameters in a Python file that is used to send commands pertaining to boundary conditions to Abaqus:
u1=0.0,
u2=0.0,
u3=0.0,
ur1=UNSET,
ur2=0.0,
ur3=UNSET
I would like to place these values inside a list and print that list to a .txt file. I figured I should convert all contents to strings:
List = [str(u1), str(u2), str(u3), str(ur1), str(ur2), str(ur3)]
This works only as long as the list does not contain UNSET, which is a symbolic constant used by Abaqus and is neither an int nor a str. Any ideas how to deal with that? Many thanks!
UNSET is an Abaqus/CAE-defined symbolic constant. It has a member name that returns its string representation, so you might do something like this:
def tostring(v):
    try:
        return v.name       # symbolic constants expose their name
    except AttributeError:  # plain numbers have no .name member
        return str(v)
then do, for example:
bc = [0., 1, UNSET]
print "u1=%s u2=%s u3=%s\n" % tuple([tostring(b) for b in bc])
which prints:
u1=0.0 u2=1 u3=UNSET
EDIT: it is simpler than that. After doing things the hard way, I realized the symbolic constant is handled properly by string conversion, so you can just do this:
print "u1=%s u2=%s u3=%s\n" % tuple(['%s' % b for b in bc])
I need to convert a PyInt to a C int. In my code I have
count=PyInt_FromSsize_t(PyList_Size(pValue))
pValue is a PyObject (a PyList). The problem I am having is that PyList_Size is not returning the correct list size (count is supposed to be 5, but it gave me around 6 million), or there is a problem with data types, since I am in C code interfacing with Python scripts. Ideally, I want count to be a C int.
I have found Python/C APIs that return C long values, which is not what I want. Can anybody point me to the correct method or API?
PyInt_FromSsize_t() builds a full-fledged Python int object in memory and returns its address; that is where the 6-million number is coming from. You just want to take the scalar returned by PyList_Size() and cast it to a C integer, I think:
count = (int) PyList_Size(pValue)
If the list could be very long, you might want to make count a long instead, in which case you would cast to that type.
Note: a count of -1 means that Python encountered an exception while trying to measure the list length. Here are the docs you should read to know how to handle exceptions:
http://docs.python.org/c-api/intro.html#exceptions