Read, process and save a BSON file - Python

I have a .bson file.
Inside the .bson file there is a PDF stored as bytes.
I need to write the PDF that is inside the .bson file out in a readable format. Does that make sense?
I need help with the steps I have to do in between.
Note: I already saved the content to a PDF file, but the viewer says the file is damaged.
My code:
with open('LOL.bson') as myfile:
    content = myfile.read()
print(content)
{"_id":{"$oid":"59d3522618206812388e35f1"},"files_id":{"$oid":"59d3522618206812388e35f0"},"n":0,"data":{"$binary":"JVBERi0xLjUNCiW1tbW1DQoxIDAgb2JqDQo8PC9UeXBlL0NhdGFsb2cvUGFnZXMgMiAwIFIvTGFuZyhwdC1QVCkgL1N0cnVjdFRyZWVSb290IDUzIDAgUi9NYXJrSW5mbzw8L01hcmtlZCB0cnVlPj4+Pg0KZW5kb2JqDQoyIDAgb2JqDQo8PC9UeXBlL1BhZ2VzL0NvdW50IDEvS2lkc1sgMyAwIFJdID4+DQplbmRvYmoNCjMgMCBvYmoNCjw8L1R5cGUvUGFnZS9QYXJlbnQgMiAwIFIvUmVzb3VyY2VzPDwvRXh0R1N0YXRlPDwvR1M1IDUgMCBSL0dTOCA4IDA....
Type of data
read_content = bson.json_util.loads(content)
print(read_content['data'])
b'%PDF-1.5\r\n%\xb5\xb5\xb5\xb5\r\n1 0 obj\r\n<</Type/Catalog/Pages 2 0 R/Lang(pt-PT) /StructTreeRoot 130 0 R/MarkInfo<</Marked true>>>>\r\nendobj\r\n2 0 obj\r\n<</Type/Pages/Count 1/Kids[ 3 0 R] >>\r\nendobj\r\n3 0 obj\r\n<</Type/Page/Parent 2 0 R/Resources<</ExtGState<</GS5 5 0 R/GS8 8 0 R>>/Font<</F1 6 0 R/F2 29 0 R>>/XObject<</Image9 9 0 R/Image11 11 0 R/Image13 13 0 R/Image15 15 0 R/Image17 17 0 R/Image19 19 0 R/Image21 21 0 R/Image23 23 0 R/Image25 25 0 R/Image27 27 0 R/Image32 32 0 R/Image34 34 0 R/Image35 35 0 R/Image37 37 0 R/Image39 39 0 R/Image41 41 0 R/Image43 43 0 R/Image45 45 0 R/Image47 47 0 R/Image49 49 0 R/Image51 51 0 R/Image53 53 0 R/Image55 55 0 R/Image57 57 0 R/Image59 59 0 R/Image61 61 0 R/Image63 63 0 R/Image65 65 0 R/Image67 67 0 R/Image69 69 0 R/Image71 71 0 R/Image73 73 0 R/Image75 75 0 R/Image77 77 0 R/Image79 79 0 R/Image81 81 0 R/Image83 83 0 R/Image85 85 0 R/Image87 87 0 R/Image89 89 0 R/Image91 91 0 R/Image93 93 0 R/Image95 95 0 R/Image97 97 0 R/Image99 99 0 R/Image101 101 0 R/Image103 103 0 R/Image105 105 0 R/Image107 107 0 R/Image109 109 0 R/Image111 111 0 R/Image113 113 0 R/Image115 115 0 R/Image117 117 0 R/Image119 119 0 R/Image121 121 0 R/Image123 123 0 R/Image125 125 0 R/Image127 127 0 R>>/Pattern<</P31 31 0 R/P33 33 0 R>>/ProcSet[/PDF/Text/ImageB/ImageC/ImageI] >>/MediaBox[ 0 0 960 540] /Contents 4 0 R/Group<</Type/Group/S/Transparency/CS/DeviceRGB>>/Tabs/S/StructParents 0>>\r\nendobj\r\n4 0 obj\r\n<</Filter/FlateDecode/Length 4008>>\r\nstream\r\nx\x9c\xbd[\xcb\x8e\x1d\xb7\x11\xdd\x0f0\xff\xd0K\xc9\x80Z|?\x00\xc3\x0b?"\xd8\x88\x11\'V\x90\x85\xe1\x850\x91\x15\x07\x1a\t\x91\x8c
read_content = bson.json_util.loads(content)
print(type(read_content['data']))
<class 'bytes'>
How do I save the .bson content in a readable format (a PDF)?
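A minimal sketch of the missing step, under the assumption that the file holds MongoDB Extended JSON as printed above (where the "$binary" field is base64 text). The document literal below is a stand-in for the real content of LOL.bson; the essential point is writing the bytes in binary mode.

```python
import base64
import os
import tempfile

# Hypothetical document mirroring the structure printed above: in Extended
# JSON the "$binary" field holds the PDF bytes as base64 text. This literal
# is a stand-in for the real content read from LOL.bson.
doc = {"data": {"$binary": base64.b64encode(b"%PDF-1.5 test").decode()}}

pdf_bytes = base64.b64decode(doc["data"]["$binary"])

# The key fix: open the output file in *binary* mode ('wb'). Writing the
# bytes through a text-mode handle corrupts them, which is why the saved
# PDF reports itself as damaged.
path = os.path.join(tempfile.gettempdir(), "output.pdf")
with open(path, "wb") as out:
    out.write(pdf_bytes)
```

When loading with bson.json_util.loads, read_content['data'] is already bytes (as the type check shows), so the base64 step above stands in for that decode and only the 'wb' write is needed.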

Related

Trying to predict the next number in ciphertext using TensorFlow

I am experimenting with machine learning and I wanted to see how difficult it would be to predict a number given a series of other numbers. I have seen it accomplished with people making vectors such as 1-10. However, I wanted to try to do something more difficult. I wanted to do it based on the ciphertext. Here is what I have tried so far:
import numpy as np
import matplotlib.pyplot as plt
# from sklearn.linear_model import LinearRegression
from tensorflow.keras import Sequential
from tensorflow.keras import layers
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator
from tensorflow.keras.layers import Lambda, SimpleRNN
from tensorflow.keras import backend as K
from numpy.polynomial import polynomial as poly
from sklearn.feature_extraction import DictVectorizer
import Pyfhel

def generateInput(x, length):
    # pad the prefix with zeros so every input has the same length
    return np.append(x, [0 for i in range(length)], axis=0)

def main():
    HE = Pyfhel.Pyfhel()
    HE.contextGen(scheme='BFV', n=2048, q=34, t=34, t_bits=35, sec=128)
    HE.keyGen()
    a = "Hello"
    a = np.asarray(bytearray(a, "utf-8"))
    a = HE.encode(a)
    ct = HE.encrypt(a).to_bytes('none')
    ct = np.asarray([c for c in ct])
    length = 100  # How many records to take into account
    batch_size = 1
    n_features = 1
    epochs = 1
    generator = TimeseriesGenerator(ct, ct, stride=length, length=length, batch_size=batch_size)
    model = Sequential()
    model.add(SimpleRNN(100, activation='leaky_relu', input_shape=(length, n_features)))
    model.add(Dense(100, activation='leaky_relu', input_shape=(length, n_features)))
    model.add(Dense(256, activation='softmax'))
    model.compile(optimizer='adam', loss="sparse_categorical_crossentropy", metrics=['accuracy'])
    history = model.fit(generator, epochs=epochs)
    for i in range(1, length):
        try:
            x_input = np.asarray(generateInput(ct[:i], length - len(ct[:i]))).reshape((1, length))
            yhat = model.predict(x_input).tolist()
            yhat_normalized = [float(i) / sum(yhat[0]) for i in yhat[0]]
            yhat_max = max(yhat_normalized)
            yhat_index = yhat_normalized.index(yhat_max)
            print("based on {} actual {} predicted {}".format(ct[:i], ct[i], yhat_index))
        except Exception as e:
            print("Error {}".format(e))

if __name__ == "__main__":
    main()
Now the problem is that all of my predictions are 0. Can anyone explain to me why this is happening? How can I fix this?
Here's what my current output looks like:
based on [94] actual 161 predicted 0
based on [ 94 161] actual 16 predicted 0
based on [ 94 161 16] actual 3 predicted 0
based on [ 94 161 16 3] actual 7 predicted 0
based on [ 94 161 16 3 7] actual 0 predicted 0
based on [ 94 161 16 3 7 0] actual 0 predicted 0
based on [ 94 161 16 3 7 0 0] actual 0 predicted 0
based on [ 94 161 16 3 7 0 0 0] actual 105 predicted 0
based on [ 94 161 16 3 7 0 0 0 105] actual 128 predicted 0
based on [ 94 161 16 3 7 0 0 0 105 128] actual 0 predicted 0
... (the output continues in the same pattern for every window up to length 99; the predicted value is 0 every time)
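One likely contributing factor (a rough sanity check, not a full diagnosis): this setup gives the model almost no training data. Assuming the serialized ciphertext is on the order of 2048 bytes (an illustrative figure chosen to match n=2048 above; the real length depends on Pyfhel's serialization), TimeseriesGenerator with length=100 and stride=100 yields only about 20 windows for a 256-way softmax trained for a single epoch, so the network barely moves from a constant guess:

```python
# Illustrative sizes only; the real ciphertext length is an assumption.
data_len = 2048
length = 100   # window length, as in the question
stride = 100   # stride, as in the question

# TimeseriesGenerator places targets at indices length, length + stride, ...
# so the number of training windows is roughly:
n_windows = len(range(length, data_len, stride))
print(n_windows)  # 20
```

With so few samples, the model tends to predict whichever class is easiest, and since ciphertext bytes are near-uniform noise, even abundant data would not make the next byte predictable.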

Reshape data to new format for object detection

I have a data set in this format that I want in a dataframe:
0--Parade/0_Parade_marchingband_1_849.jpg
2
449 330 122 149 0 0 0 0 0 0
0--Parade/0_Parade_Parade_0_904.jpg
1
361 98 263 339 0 0 0 0 0 0
0--Parade/0_Parade_marchingband_1_799.jpg
45
78 221 7 8 0 0 0 0 0
78 238 14 17 2 0 0 0 0 0
3 232 11 15 2 0 0 0 2 0
20 215 12 16 2 0 0 0 2 0
0--Parade/0_Parade_marchingband_1_117.jpg
23
69 359 50 36 1 0 0 0 0 1
227 382 56 43 1 0 1 0 0 1
296 305 44 26 1 0 0 0 0 1
353 280 40 36 2 0 0 0 2 1
885 377 63 41 1 0 0 0 0 1
819 391 34 43 2 0 0 0 1 0
727 342 37 31 2 0 0 0 0 1
598 246 33 29 2 0 0 0 0 1
740 308 45 33 1 0 0 0 2 1
0--Parade/0_Parade_marchingband_1_778.jpg
35
27 226 33 36 1 0 0 0 2 0
63 95 16 19 2 0 0 0 0 0
64 63 17 18 2 0 0 0 0 0
88 13 16 15 2 0 0 0 1 0
231 1 13 13 2 0 0 0 1 0
263 122 14 20 2 0 0 0 0 0
367 68 15 23 2 0 0 0 0 0
198 98 15 18 2 0 0 0 0 0
293 161 52 59 1 0 0 0 1 0
412 36 14 20 2 0 0 0 1 0
Can anyone tell me how to put these into a dataframe where the first column contains the .jpg path, the second column the count, and the third the coordinates, with each coordinate row kept in correspondence with its .jpg path?
e.g.
Column1 | Column2 | Column3
0--Parade/0_Parade_marchingband_1_849.jpg | 2 | 449 330 122 149 0 0 0 0 0 0
0--Parade/0_Parade_Parade_0_904.jpg | 1 | 361 98 263 339 0 0 0 0 0 0
0--Parade/0_Parade_marchingband_1_799.jpg | 45 | 78 221 7 8 0 0 0 0 0
| | 78 238 14 17 2 0 0 0 0 0
| | 3 232 11 15 2 0 0 0 2 0
| | 20 215 12 16 2 0 0 0 2 0
I have tried this:
count1 = 0
count2 = 0
dict1 = {}
dict2 = {}
dict3 = {}
for i in data[0]:
    if i.find('.jpg') == -1:
        dict1[count1] = i
        count1 += 1
    else:
        dict2[count2] = i
        count2 += 1
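A hedged sketch of one way to parse this format, treating each .jpg line as the start of a record, the next line as the count, and the following lines as coordinate rows. The list literal below is a stand-in for the lines of the real file (data[0] in the question):

```python
import pandas as pd

# Stand-in for the lines of the annotation file.
lines = [
    "0--Parade/0_Parade_marchingband_1_849.jpg",
    "2",
    "449 330 122 149 0 0 0 0 0 0",
    "0--Parade/0_Parade_Parade_0_904.jpg",
    "1",
    "361 98 263 339 0 0 0 0 0 0",
]

records = []
current = None
for ln in lines:
    if ln.endswith(".jpg"):            # a new image record starts
        current = {"path": ln, "count": None, "coords": []}
        records.append(current)
    elif current["count"] is None:     # the line right after the path is the count
        current["count"] = int(ln)
    else:                              # everything else is a coordinate row
        current["coords"].append(ln)

# One dataframe row per coordinate line, repeating the path and count,
# which keeps each coordinate in correspondence with its .jpg path.
df = pd.DataFrame(
    [(r["path"], r["count"], c) for r in records for c in r["coords"]],
    columns=["path", "count", "coords"],
)
print(df)
```

Setting df.set_index(["path", "count"]) afterwards would reproduce the blank-cell look of the expected output, where repeated paths are shown only once.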

Pattern finding in Python, reporting as a tab-separated text file

I have a big text file; here is a small example:
>chr9:128683-128744
GGATTTCTTCTTAGTTTGGATCCATTGCTGGTGAGCTAGTGGGATTTTTTGGGGGGTGTTA
>chr16:134222-134283
AGCTGGAAGCAGCGTGGGAATCACAGAATGGCCGAGAACTTAAAGGCTTTGCTTGGCCTGG
>chr16:134226-134287
GGAAGCAGCGTGGGAATCACAGAATGGACGGCCGATTAAAGGCTTTGCTTGGCCTGGATTT
>chr1:134723-134784
AAGTGATTCACCCTGCCTTTCCGACCTTCCCCAGAACAGAACACGTTGATCGTGGGCGATA
>chr16:135770-135831
GCCTGAGCAAAGGGCCTGCCCAGACAAGATTTTTTAATTGTTTAAAAACCGAATAAATGTT
>chr16:135787-135848
GCCCAGACAAGATTTTTTAATTGTTTAAAAACCGAATAAATGTTTTATTTCTAGAAAACTG
>chr16:135788-135849
CCCAGACAAGATTTTTTAATTGTTTAAAAACCGAATAAATGTTTTATTTCTAGAAAACTGT
>chr16:136245-136306
CACTTCACAAATAGAAGGCTGTCAGAGAGACAGGGACAGGCCACACAAGTGTTTCTGCACA
>chr7:146692-146753
GTGTGACCAAAACTTAGGATGTTAGCCGAACTCTCCGTTACTATCATTTTGGATTTCCAGT
>chr8:147932-147993
GGTAAAGGTAAATACATAAACAAACATAAAACCGATCCTATTGTAATTTTGGTTTGTAACT
This file is divided into groups, and every group has two parts (two lines): the first line, which starts with >, is an ID, and the second line is a sequence of characters. Every sequence is 61 characters long.
I have a short sequence (which is CCGA), and I would like to scan every second line for it. The output would be a text file with two columns:
1st column: the position at which the beginning of the short sequence is located (every second line has 61 characters, so each reported position is a number from 1 to 61).
2nd column: the number of times the beginning of the short sequence occurs at that specific position.
For instance, in the following sequence of characters the short sequence begins at position 49.
GCCTGAGCAAAGGGCCTGCCCAGACAAGATTTTTTAATTGTTTAAAAACCGAATAAATGTT
For the small example, the expected output would look like this:
1 0
2 0
3 0
4 0
5 0
6 0
7 0
8 0
9 0
10 0
11 0
12 0
13 0
14 0
15 0
16 0
17 0
18 0
19 0
20 0
21 1
22 0
23 0
24 0
25 0
26 1
27 0
28 0
29 0
30 0
31 1
32 4
33 0
34 0
35 0
36 0
37 0
38 0
39 0
40 0
41 0
42 0
43 0
44 0
45 0
46 0
47 0
48 0
49 1
50 0
51 0
52 0
53 0
54 0
55 0
56 0
57 0
58 0
59 0
60 0
61 0
I am trying to do that in Python using the following code, but the output is not what I want.
infile = open('infile.txt', 'r')
ss = 'CCGA'
count = 0
for line in infile:
    if not line.startswith('>'):
        for match in pattern.finder(ss):
            count += 1
            POSITION = pattern.finder(ss)
            COUNT = count
Do you know how to fix it?
The below uses finditer to find all non-overlapping occurrences of the CCGA pattern, and creates a mapping from the index of the beginning of the sequence to the number of times a sequence has begun at that index.
from re import compile
from collections import defaultdict

pat = compile(r'CCGA')
mapping = defaultdict(int)

with open('infile.txt', 'r') as infile:
    for line in infile:
        if not line.startswith('>'):
            for match in pat.finditer(line):
                mapping[match.start() + 1] += 1

for i in range(1, 62):
    print("{:>2} {:>2}".format(i, mapping[i]))
This prints exactly the expected output listed above (count 1 at positions 21, 26, 31 and 49, count 4 at position 32, and 0 everywhere else).
One way to export it to a file would be to use the print function:
with open('outfile.txt', 'w+') as outfile:
    for i in range(1, 62):
        print(i, mapping[i], sep='\t', file=outfile)

Pandas Groupby, MultiIndex, Multiple Columns

I just worked on creating some columns using .transform() to count some entries.
I used this reference.
For example:
userID deviceName POWER_DOWN USER LOW_RSSI NONE CMD_SUCCESS
0 24 IR_00 85 0 39 0 0
1 24 IR_00 85 0 39 0 0
2 24 IR_00 85 0 39 0 0
3 24 IR_00 85 0 39 0 0
4 25 BED_08 0 109 78 0 0
5 25 BED_08 0 109 78 0 0
6 25 BED_08 0 109 78 0 0
7 24 IR_00 85 0 39 0 0
8 23 IR_09 2 0 0 0 0
9 23 V33_17 3 0 2 0 134
10 23 V33_17 3 0 2 0 134
11 23 V33_17 3 0 2 0 134
12 23 V33_17 3 0 2 0 134
I want to group them by userID and deviceName.
So that it would look like:
userID deviceName POWER_DOWN USER LOW_RSSI NONE CMD_SUCCESS
0 23 IR_09 2 0 0 0 0
1 V33_17 3 0 2 0 134
2 24 IR_00 85 0 39 0 0
3 25 BED_08 0 109 78 0 0
I also want them sorted by userID, and maybe to make userID and deviceName a multi-index.
I tried df = df.groupby(['userID', 'deviceName']), but that returned a
<pandas.core.groupby.DataFrameGroupBy object at 0x00000249BBB13DD8>,
not a dataframe.
By the way, I'm sorry; I don't know how to copy a Jupyter notebook's In and Out cells.
I believe you need drop_duplicates with sort_values:
df1 = df.drop_duplicates(['userID', 'deviceName']).sort_values('userID')
print (df1)
userID deviceName POWER_DOWN USER LOW_RSSI NONE CMD_SUCCESS
8 23 IR_09 2 0 0 0 0
9 23 V33_17 3 0 2 0 134
0 24 IR_00 85 0 39 0 0
4 25 BED_08 0 109 78 0 0
If you want to create a MultiIndex, add set_index:
df1 = (df.drop_duplicates(['userID', 'deviceName'])
.sort_values('userID')
.set_index(['userID', 'deviceName']))
print (df1)
POWER_DOWN USER LOW_RSSI NONE CMD_SUCCESS
userID deviceName
23 IR_09 2 0 0 0 0
V33_17 3 0 2 0 134
24 IR_00 85 0 39 0 0
25 BED_08 0 109 78 0 0
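If the per-group values could ever differ between rows (they are identical in the sample above, which is what makes drop_duplicates safe), a groupby aggregation such as first would be needed instead. A small sketch on a made-up subset of the data:

```python
import pandas as pd

# Made-up subset of the question's data; values repeat within each group.
df = pd.DataFrame({
    "userID": [24, 24, 23, 23],
    "deviceName": ["IR_00", "IR_00", "V33_17", "V33_17"],
    "POWER_DOWN": [85, 85, 3, 3],
    "CMD_SUCCESS": [0, 0, 134, 134],
})

# groupby sorts by the keys by default and yields a MultiIndex directly,
# so no separate sort_values/set_index step is needed.
out = df.groupby(["userID", "deviceName"]).first()
print(out)
```

This gives the same result as the drop_duplicates approach here, but stays correct even when rows within a group disagree.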

BGR8 raw image conversion to numpy python

So far I have tested acquiring images from a camera in the raw BGR8 format into a NumPy array. I am at the point where I can access the data; however, the image has visible artifacts (vertical lines etc.) and only displays in greyscale.
The following code is used to acquire the image in BGR8 format:
image = ctrl.GetImageWindow(100, 100, 20, 20)  # offset 100,100, 20x20 grid of pixels
data = numpy.array(image)
data holds the following 20-wide and 60-long numpy array; from some testing, the first "row" is blue, the second green, the third red, repeating:
49 48 49 57 68 76 62 59 46 54 62 58 68 64 45 60 65 51 56 70
76 72 62 62 66 59 65 62 53 65 62 67 75 58 59 57 67 64 64 63
54 64 55 67 67 61 64 43 66 60 59 73 48 74 88 77 65 54 69 57
80 59 42 56 79 51 53 67 64 40 53 68 74 83 60 81 53 37 42 72
61 71 73 75 79 63 64 66 70 60 64 61 68 64 56 60 60 61 67 61
60 62 69 83 66 64 76 63 62 72 66 70 58 61 77 83 76 71 75 63
... (54 more rows of 20 values each, all roughly in the 0-105 range)
The output of cv2.imwrite("test-raw.png", data) is the following image, which is grey (it should have a red line).
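For context, cv2.imwrite treats a 2-D array as a single-channel (grayscale) image; to get colour out, the array needs shape (height, width, 3) in BGR channel order. A minimal sketch with made-up dimensions:

```python
import numpy as np

# a 2-D uint8 array carries no channel axis, so cv2.imwrite
# would save it as a single-channel (grayscale) image
grey = np.full((60, 80), 50, dtype=np.uint8)
assert grey.ndim == 2

# colour needs shape (height, width, 3) in BGR order;
# a red line means B=0, G=0, R=255 along one row
colour = np.zeros((60, 80, 3), dtype=np.uint8)
colour[30, :, 2] = 255  # set the red channel of row 30
assert colour.shape == (60, 80, 3)
```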
I also tried data = numpy.array(image).view(numpy.uint8), which returns the following 80-wide, 60-long numpy array:
49 0 0 0 62 0 0 0 49 0 0 0 45 0 0 0 58 0 0 0 63 0 0 0 51 0 0 0 59 0 0 0 50 0 0 0 45 0 0 0 45 0 0 0 38 0 0 0 59 0 0 0 62 0 0 0 42 0 0 0 39 0 0 0 45 0 0 0 57 0 0 0 74 0 0 0 58 0 0 0
... (59 more rows in the same pattern: each pixel value followed by three zeros)
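That repeating pattern of each value followed by three zeros is what .view(numpy.uint8) produces when the underlying array holds 32-bit integers: on a little-endian machine, each 4-byte int is reinterpreted as its low byte followed by three zero bytes. A minimal sketch (the data values are illustrative):

```python
import numpy as np

# a small int32 array standing in for the camera data
a = np.array([[49, 62, 49]], dtype=np.int32)

# .view() reinterprets the raw bytes: each little-endian int32
# shows up as its low byte followed by three zero bytes
print(a.view(np.uint8).tolist())    # [[49, 0, 0, 0, 62, 0, 0, 0, 49, 0, 0, 0]]

# .astype() converts the values instead, keeping one byte per pixel
print(a.astype(np.uint8).tolist())  # [[49, 62, 49]]
```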
EDIT for more context
After some reading into OpenCV and the following answer on SO, I believe it's a case of rotating the numpy array to match the default format.
Grabbing a larger image and transposing based on the above assumption gives the rows-to-columns format OpenCV expects, but the image is still greyscale, is squished, and has artifacts, as below (the text should be red, and the image should be taller):
data = data.transpose()
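Note that a bare transpose() on a 3-D array reverses all of its axes, which moves the channel axis to the front rather than just swapping rows and columns; passing an explicit axis order avoids that. A sketch with hypothetical dimensions:

```python
import numpy as np

# a hypothetical (height, width, channels) image
img = np.zeros((2, 4, 3), dtype=np.uint8)

# plain .transpose() reverses every axis: (2, 4, 3) -> (3, 4, 2),
# so the channel axis ends up where the rows were
assert img.transpose().shape == (3, 4, 2)

# swapping only rows and columns keeps the channels last
assert img.transpose(1, 0, 2).shape == (4, 2, 3)
```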
EDIT: Resultant code looks like:
image = ctrl.GetImageWindow(0, 0, w, h)
data = numpy.array(image, dtype=numpy.uint8).reshape(768, -1, 3)
data = data.transpose()
b, g, r = data[::3], data[1::3], data[2::3]
result = cv2.merge([b, g, r])
cv2.imwrite("test-raw.png", result)
cv2.imshow("test-raw", result)
cv2.waitKey()
OUTPUT
Image Details: {'width': 2048, 'dateTime': '2018-05-03 17:35:34.564646', 'bytesPerPixel': 3, 'height': 1536, 'pixelFormat': 'BGR8'}
libpng warning: Invalid image width in IHDR
libpng warning: Image width exceeds user limit in IHDR
libpng warning: Invalid image height in IHDR
libpng warning: Image height exceeds user limit in IHDR
libpng error: Invalid IHDR data
OpenCV Error: Bad flag (parameter or structure field) (Unrecognized or unsupported array type) in cvGetMat, file C:\projects\opencv-python\opencv\modules\core\src\array.cpp, line 2493
Traceback (most recent call last):
File "ActiveGigeComTypes3.py", line 69, in <module>
cv2.imshow("test-raw", result)
cv2.error: C:\projects\opencv-python\opencv\modules\core\src\array.cpp:2493: error: (-206) Unrecognized or unsupported array type in function cvGetMat
Working example is:
image = ctrl.GetImageWindow(0, 0, w, h)
data = numpy.array(image, dtype=numpy.uint8).reshape(768, -1, 3)
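If GetImageWindow returns a flat BGR8 buffer (an assumption based on the reported pixelFormat and bytesPerPixel), reshaping straight to (height, width, 3) should yield an array cv2.imwrite accepts with no transpose or cv2.merge, since OpenCV already expects BGR channel order. A sketch using the reported 2048x1536 dimensions, with a zero-filled stand-in for the camera buffer:

```python
import numpy as np

w, h = 2048, 1536  # from the reported image details

# stand-in for the flat BGR8 buffer the camera returns
flat = np.zeros(w * h * 3, dtype=np.uint8)

# rows x columns x channels, matching what cv2.imwrite expects
data = flat.reshape(h, w, 3)
assert data.shape == (1536, 2048, 3)
```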
