File contains data in an unknown format. (m4a load from librosa) - python

I am currently working on a DNN that takes in m4a files. I have ffmpeg installed; training creates a few batches and then dies with this error:
Traceback (most recent call last):
File "/users/work/s163838/./main.py", line 126, in <module>
File "/users/work/s163838/./main.py", line 96, in main
print("e")
File "/apl/tryton/python/3.9.5/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 521, in __next__
data = self._next_data()
File "/apl/tryton/python/3.9.5/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1203, in _next_data
return self._process_data(data)
File "/apl/tryton/python/3.9.5/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1229, in _process_data
data.reraise()
File "/apl/tryton/python/3.9.5/lib/python3.9/site-packages/torch/_utils.py", line 425, in reraise
raise self.exc_type(msg)
EOFError: Caught EOFError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/librosa/core/audio.py", line 164, in load
y, sr_native = __soundfile_load(path, offset, duration, dtype)
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/librosa/core/audio.py", line 195, in __soundfile_load
context = sf.SoundFile(path)
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/soundfile.py", line 629, in __init__
self._file = self._open(file, mode_int, closefd)
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/soundfile.py", line 1183, in _open
_error_check(_snd.sf_error(file_ptr),
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/soundfile.py", line 1357, in _error_check
raise RuntimeError(prefix + _ffi.string(err_str).decode('utf-8', 'replace'))
RuntimeError: Error opening 'vox2/dev/aac/id08194/QnBYPze-x9A/00079.m4a': File contains data in an unknown format.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/apl/tryton/python/3.9.5/lib/python3.9/site-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
data = fetcher.fetch(index)
File "/apl/tryton/python/3.9.5/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/apl/tryton/python/3.9.5/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/users/work/s163838/vox_celeb_loader.py", line 53, in __getitem__
load(speaker2utt1, self.num_samples)
File "/users/work/s163838/vox_celeb_loader.py", line 13, in load
wav, sr = librosa.load(path, sr=16000)
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/librosa/util/decorators.py", line 88, in inner_f
return f(*args, **kwargs)
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/librosa/core/audio.py", line 170, in load
y, sr_native = __audioread_load(path, offset, duration, dtype)
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/librosa/core/audio.py", line 226, in __audioread_load
reader = audioread.audio_open(path)
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/audioread/__init__.py", line 111, in audio_open
return BackendClass(path)
File "/users/kdm/s163838/.local/lib/python3.9/site-packages/audioread/rawread.py", line 65, in __init__
self._file = aifc.open(self._fh)
File "/apl/tryton/python/3.9.5/lib/python3.9/aifc.py", line 917, in open
return Aifc_read(f)
File "/apl/tryton/python/3.9.5/lib/python3.9/aifc.py", line 358, in __init__
self.initfp(f)
File "/apl/tryton/python/3.9.5/lib/python3.9/aifc.py", line 314, in initfp
chunk = Chunk(file)
File "/apl/tryton/python/3.9.5/lib/python3.9/chunk.py", line 63, in __init__
raise EOFError
EOFError
I am loading the audio with this call:
wav, sr = librosa.load(path, sr=16000)
Is it just a broken file? If so, how do I skip such files? Or is there something wrong with loading m4a files in general, even though ffmpeg is installed and a single m4a file loads with the desired output when tested on its own?
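If it turns out to be the occasional corrupt clip, one way to keep training going is to catch the load failure inside the Dataset and fall back to another index. A minimal sketch, not the loader from the traceback: the class name and the self.paths list are hypothetical, and only librosa.load(path, sr=16000) is taken from the question.

import random
import librosa
import torch
from torch.utils.data import Dataset

class AudioDataset(Dataset):
    def __init__(self, paths):
        self.paths = paths  # list of m4a file paths (hypothetical attribute)

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        path = self.paths[idx]
        try:
            wav, sr = librosa.load(path, sr=16000)  # raises if the file is unreadable
        except Exception as e:
            print(f"Skipping unreadable file {path}: {e}")
            # fall back to a different random sample so the batch stays full
            return self.__getitem__(random.randrange(len(self.paths)))
        # cropping/padding to a fixed length is omitted here
        return torch.from_numpy(wav)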

Related

Pyautogui failed to read

It says that the file I referenced does not exist, even though it is in the same directory. I do not understand what has gone wrong. Did pyautogui stop supporting the newest Python?
import time
import pyautogui

print("started")
time.sleep(3)
# hello = pyautogui.locateCenterOnScreen('TEST.png')
pyautogui.click('hello.png')
[ WARN:0#3.681] global /Users/xperience/actions-runner/_work/opencv-python/opencv-python/opencv/modules/imgcodecs/src/loadsave.cpp (239) findDecoder imread_('hello.png'): can't open/read file: check file path/integrity
Traceback (most recent call last):
File "/Users/andrewzhuang/Programming/Python/Autogrinder/Autogrind Advanced.py", line 10, in <module>
pyautogui.click('hello.png')
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyautogui/__init__.py", line 598, in wrapper
returnVal = wrappedFunction(*args, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyautogui/__init__.py", line 980, in click
x, y = _normalizeXYArgs(x, y)
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyautogui/__init__.py", line 661, in _normalizeXYArgs
location = locateOnScreen(firstArg)
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyautogui/__init__.py", line 175, in wrapper
return wrappedFunction(*args, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyautogui/__init__.py", line 213, in locateOnScreen
return pyscreeze.locateOnScreen(*args, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyscreeze/__init__.py", line 373, in locateOnScreen
retVal = locate(image, screenshotIm, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyscreeze/__init__.py", line 353, in locate
points = tuple(locateAll(needleImage, haystackImage, **kwargs))
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyscreeze/__init__.py", line 207, in _locateAll_opencv
needleImage = _load_cv2(needleImage, grayscale)
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/pyscreeze/__init__.py", line 170, in _load_cv2
raise IOError("Failed to read %s because file is missing, "
OSError: Failed to read hello.png because file is missing, has improper permissions, or is an unsupported or invalid format
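The OpenCV warning in that log means imread looked for 'hello.png' relative to the current working directory, which is not necessarily the directory the script lives in. A minimal sketch, assuming the PNG sits next to the .py file, that builds an absolute path before calling pyautogui:

import os
import time
import pyautogui

# Resolve the image next to this script so the lookup does not depend on the working directory
IMG_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'hello.png')

print("started")
time.sleep(3)
pyautogui.click(IMG_PATH)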

Exception occurs in for statement itself

I have an exception occurring in a for statement:
for _, data in enumerate(dataloader, 0):
Not in the body of the for statement, but in the for statement itself. How do I catch this and continue?
Here is the entire error trace:
Traceback (most recent call last):
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/reprex/run_DL.py", line 67, in <module>
ut.generate_validation_model(cfg)
File "/panfs/roc/groups/4/miran045/reine097/projects/AlexNet_Abrol2021/reprex/utils.py", line 227, in generate_validation_model
loss = train(trainloader, net, optimizer, criterion, cfg.cuda_avl)
File "/panfs/roc/groups/4/miran045/reine097/projects/AlexNet_Abrol2021/reprex/utils.py", line 96, in train
for _, data in enumerate(dataloader, 0):
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 521, in __next__
data = self._next_data()
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1203, in _next_data
return self._process_data(data)
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1229, in _process_data
data.reraise()
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/_utils.py", line 434, in reraise
raise exception
RuntimeError: Caught RuntimeError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
data = fetcher.fetch(index)
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
return self.collate_fn(data)
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/_utils/collate.py", line 84, in default_collate
return [default_collate(samples) for samples in transposed]
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/_utils/collate.py", line 84, in <listcomp>
return [default_collate(samples) for samples in transposed]
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/_utils/collate.py", line 64, in default_collate
return default_collate([torch.as_tensor(b) for b in batch])
File "/home/miran045/reine097/projects/AlexNet_Abrol2021/venv/lib/python3.9/site-packages/torch/utils/data/_utils/collate.py", line 56, in default_collate
return torch.stack(batch, 0, out=out)
RuntimeError: stack expects each tensor to be equal size, but got [1, 208, 300, 320] at entry 0 and [1, 320, 300, 208] at entry 13
The error occurs on this line:
File "/panfs/roc/groups/4/miran045/reine097/projects/AlexNet_Abrol2021/reprex/utils.py", line 96, in train
for _, data in enumerate(dataloader, 0):
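Since the exception is raised by the iterator itself, one way to catch it is to drive the iteration manually with iter()/next() instead of a for loop. A minimal sketch; note it only skips the failing batch, and the underlying fix would be to make every sample the same shape so default_collate can stack them:

it = iter(dataloader)
batch_idx = 0
while True:
    try:
        data = next(it)  # this is where the collate error surfaces
    except StopIteration:
        break            # normal end of the epoch
    except RuntimeError as e:
        # collate failed for this batch (e.g. mismatched tensor sizes); skip it
        print(f"Skipping batch {batch_idx}: {e}")
        batch_idx += 1
        continue
    # ... original training-step body using `data` goes here ...
    batch_idx += 1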

Error shows up when using df.to_parquet("filename")

I want to save the data set as a parquet file called power.parquet, and I use df.to_parquet(<filename>). But it gives me this error: ValueError: Error converting column "Global_reactive_power" to bytes using encoding UTF8. Original error: bad argument type for built-in operation. I have installed the fastparquet package.
from fastparquet import write, ParquetFile
dat.to_parquet("power.parquet")
df_parquet = ParquetFile("power.parquet").to_pandas()
df_parquet.head() # Test your final value
Traceback (most recent call last):
File "/opt/anaconda3/lib/python3.9/site-packages/fastparquet/writer.py", line 259, in convert
out = array_encode_utf8(data)
File "fastparquet/speedups.pyx", line 50, in fastparquet.speedups.array_encode_utf8
TypeError: bad argument type for built-in operation
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/var/folders/4f/bm2th1p56tz4rq_zffc8g3940000gn/T/ipykernel_85477/3080656655.py", line 1, in <module>
dat.to_parquet("power.parquet", compression="GZIP")
File "/opt/anaconda3/lib/python3.9/site-packages/dask/dataframe/core.py", line 4560, in to_parquet
return to_parquet(self, path, *args, **kwargs)
File "/opt/anaconda3/lib/python3.9/site-packages/dask/dataframe/io/parquet/core.py", line 732, in to_parquet
return compute_as_if_collection(
File "/opt/anaconda3/lib/python3.9/site-packages/dask/base.py", line 315, in compute_as_if_collection
return schedule(dsk2, keys, **kwargs)
File "/opt/anaconda3/lib/python3.9/site-packages/dask/threaded.py", line 79, in get
results = get_async(
File "/opt/anaconda3/lib/python3.9/site-packages/dask/local.py", line 507, in get_async
raise_exception(exc, tb)
File "/opt/anaconda3/lib/python3.9/site-packages/dask/local.py", line 315, in reraise
raise exc
File "/opt/anaconda3/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task
result = _execute_task(task, data)
File "/opt/anaconda3/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task
return func(*(_execute_task(a, cache) for a in args))
File "/opt/anaconda3/lib/python3.9/site-packages/dask/utils.py", line 35, in apply
return func(*args, **kwargs)
File "/opt/anaconda3/lib/python3.9/site-packages/dask/dataframe/io/parquet/fastparquet.py", line 1167, in write_partition
rg = make_part_file(
File "/opt/anaconda3/lib/python3.9/site-packages/fastparquet/writer.py", line 716, in make_part_file
rg = make_row_group(f, data, schema, compression=compression,
File "/opt/anaconda3/lib/python3.9/site-packages/fastparquet/writer.py", line 701, in make_row_group
chunk = write_column(f, coldata, column,
File "/opt/anaconda3/lib/python3.9/site-packages/fastparquet/writer.py", line 554, in write_column
repetition_data, definition_data, encode[encoding](data, selement), 8 * b'\x00'
File "/opt/anaconda3/lib/python3.9/site-packages/fastparquet/writer.py", line 354, in encode_plain
out = convert(data, se)
File "/opt/anaconda3/lib/python3.9/site-packages/fastparquet/writer.py", line 284, in convert
raise ValueError('Error converting column "%s" to bytes using '
ValueError: Error converting column "Global_reactive_power" to bytes using encoding UTF8. Original error: bad argument type for built-in operation
I tried adding object_coding="bytes". How do I solve this problem?
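The message means fastparquet found the "Global_reactive_power" column stored with object dtype and tried to encode its values as UTF-8 strings, but some of them are not strings. A common cause is non-numeric placeholders such as "?" in the raw data. A minimal sketch that coerces the column to numeric before writing, assuming dat is a pandas DataFrame (the traceback suggests a dask DataFrame, where the same coercion would be applied per partition):

import pandas as pd

# Coerce the offending column to float; unparseable values become NaN
dat["Global_reactive_power"] = pd.to_numeric(dat["Global_reactive_power"], errors="coerce")

dat.to_parquet("power.parquet", engine="fastparquet")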

Issue TypeError: argument must be a string or number

There is only one categorical column that I want to encode; it works fine in my notebook, but when it is uploaded to the AIcrowd platform it produces this error.
There are three categorical features in total: one is the target feature, one is the column of IDs, and after excluding those two for training I am left with one feature.
df[['intersection_pos_rel_centre']]
from sklearn.preprocessing import LabelEncoder
le=LabelEncoder()
df[['intersection_pos_rel_centre']]=le.fit_transform(df[['intersection_pos_rel_centre']])
df[['intersection_pos_rel_centre']]
My error is
Selecting runtime language: python
[NbConvertApp] Converting notebook predict.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: python
Traceback (most recent call last):
File "/opt/conda/bin/jupyter-nbconvert", line 11, in <module>
sys.exit(main())
File "/opt/conda/lib/python3.8/site-packages/jupyter_core/application.py", line 254, in launch_instance
return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
File "/opt/conda/lib/python3.8/site-packages/traitlets/config/application.py", line 845, in launch_instance
app.start()
File "/opt/conda/lib/python3.8/site-packages/nbconvert/nbconvertapp.py", line 350, in start
self.convert_notebooks()
File "/opt/conda/lib/python3.8/site-packages/nbconvert/nbconvertapp.py", line 524, in convert_notebooks
self.convert_single_notebook(notebook_filename)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/nbconvertapp.py", line 489, in convert_single_notebook
output, resources = self.export_single_notebook(notebook_filename, resources, input_buffer=input_buffer)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/nbconvertapp.py", line 418, in export_single_notebook
output, resources = self.exporter.from_filename(notebook_filename, resources=resources)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/exporters/exporter.py", line 181, in from_filename
return self.from_file(f, resources=resources, **kw)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/exporters/exporter.py", line 199, in from_file
return self.from_notebook_node(nbformat.read(file_stream, as_version=4), resources=resources, **kw)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/exporters/notebook.py", line 32, in from_notebook_node
nb_copy, resources = super().from_notebook_node(nb, resources, **kw)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/exporters/exporter.py", line 143, in from_notebook_node
nb_copy, resources = self._preprocess(nb_copy, resources)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/exporters/exporter.py", line 318, in _preprocess
nbc, resc = preprocessor(nbc, resc)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/preprocessors/base.py", line 47, in __call__
return self.preprocess(nb, resources)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/preprocessors/execute.py", line 79, in preprocess
self.execute()
File "/opt/conda/lib/python3.8/site-packages/nbclient/util.py", line 74, in wrapped
return just_run(coro(*args, **kwargs))
File "/opt/conda/lib/python3.8/site-packages/nbclient/util.py", line 53, in just_run
return loop.run_until_complete(coro)
File "/opt/conda/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
return future.result()
File "/opt/conda/lib/python3.8/site-packages/nbclient/client.py", line 553, in async_execute
await self.async_execute_cell(
File "/opt/conda/lib/python3.8/site-packages/nbconvert/preprocessors/execute.py", line 123, in async_execute_cell
cell, resources = self.preprocess_cell(cell, self.resources, cell_index)
File "/opt/conda/lib/python3.8/site-packages/nbconvert/preprocessors/execute.py", line 146, in preprocess_cell
cell = run_sync(NotebookClient.async_execute_cell)(self, cell, index, store_history=self.store_history)
File "/opt/conda/lib/python3.8/site-packages/nbclient/util.py", line 74, in wrapped
return just_run(coro(*args, **kwargs))
File "/opt/conda/lib/python3.8/site-packages/nbclient/util.py", line 53, in just_run
return loop.run_until_complete(coro)
File "/opt/conda/lib/python3.8/site-packages/nest_asyncio.py", line 98, in run_until_complete
return f.result()
File "/opt/conda/lib/python3.8/asyncio/futures.py", line 178, in result
raise self._exception
File "/opt/conda/lib/python3.8/asyncio/tasks.py", line 280, in __step
result = coro.send(None)
File "/opt/conda/lib/python3.8/site-packages/nbclient/client.py", line 852, in async_execute_cell
self._check_raise_for_error(cell, exec_reply)
File "/opt/conda/lib/python3.8/site-packages/nbclient/client.py", line 760, in _check_raise_for_error
raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
------------------
df[['intersection_pos_rel_centre']]
from sklearn.preprocessing import LabelEncoder
le=LabelEncoder()
df[['intersection_pos_rel_centre']]=le.fit_transform(df[['intersection_pos_rel_centre']])
df[['intersection_pos_rel_centre']]
------------------
TypeError: argument must be a string or number
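This TypeError usually comes from LabelEncoder hitting values it cannot compare, most often NaN (a float) mixed in with the category strings, which may be present in the platform's data but not in the local notebook. A minimal sketch that encodes the column as a 1-D Series after handling missing values (the 'missing' placeholder is an assumption):

from sklearn.preprocessing import LabelEncoder

le = LabelEncoder()
# Work on a 1-D Series and make every value a plain string before encoding
col = df['intersection_pos_rel_centre'].fillna('missing').astype(str)
df['intersection_pos_rel_centre'] = le.fit_transform(col)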

Solving IOError while generating a PDF using reportlab and a generated QR code image

I am trying to generate a QR code from text and then insert it into a reportlab PDF.
My code:
def qr_code_as_image(text):
    from io import BytesIO
    print("In show_qr")
    img = generate_qr_code(text)
    print(img, type(img))
    i = Image(img)
    print(i, type(i))
    return i

def add_patient_header_with_qr(self):
    line1 = ("Name", self.linkedcustomer.name,
             "Age", self.linkedcustomer.age())
    line2 = ("MRD No.", self.linkedcustomer.cstid,
             "Date", self.prescription_time)
    line3 = ("No.", "#", "Doctor", self.doc.name)
    datatb = [line1, line2, line3]
    patientdetailstable = Table(datatb)
    patientdetailstable.setStyle(self.patientdetails_style)
    col1 = patientdetailstable
    checkin_url = reverse('clinicemr', args=[self.checkin.checkinno])
    qr_image = qr_code_as_image(checkin_url)
    qr_image.hAlign = 'LEFT'
    col2 = Table([[qr_image]])
    tblrow1 = Table([[col1, col2]], colWidths=None)
    tblrow1.setStyle(self.table_left_top_align)
    self.elements.append(tblrow1)

def final_generate(self, footer_content, action=None):
    with NamedTemporaryFile(mode='w+b') as temp:
        from django.http import FileResponse, Http404
        from functools import partial
        # use the temp file
        cmd = "cat " + str(temp.name)
        print(os.system(cmd))
        print(footer_content, type(footer_content))
        doc = SimpleDocTemplate(
            temp.name,
            pagesize=A4,
            rightMargin=20,
            leftMargin=20,
            topMargin=20,
            bottomMargin=80,
            allowSplitting=1,
            title="Prescription",
            author="System.com")
        frame = Frame(doc.leftMargin, doc.bottomMargin, doc.width, doc.height,
                      id='normal')
        template = PageTemplate(
            id='test',
            frames=frame,
            onPage=partial(footer, content=footer_content)
        )
        doc.addPageTemplates([template])
        doc.build(self.elements,
                  onFirstPage=partial(footer, content=footer_content),
                  onLaterPages=partial(footer, content=footer_content)
                  )
        print(f'Generated {temp.name}')
I get the following output:
2020-11-29 13:06:33,915 django.request ERROR Internal Server Error: /clinic/presc/k-0NGpApcg
Traceback (most recent call last):
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/lib/utils.py", line 655, in open_for_read
return open_for_read_by_name(name, mode)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/lib/utils.py", line 599, in open_for_read_by_name
return open(name, mode)
ValueError: embedded null byte
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/lib/utils.py", line 658, in open_for_read
return getBytesIO(datareader(name) if name[:5].lower() == 'data:' else urlopen(name).read())
File "/usr/lib/python3.6/urllib/request.py", line 223, in urlopen
return opener.open(url, data, timeout)
File "/usr/lib/python3.6/urllib/request.py", line 517, in open
req.timeout = timeout
AttributeError: 'bytes' object has no attribute 'timeout'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/django/core/handlers/exception.py", line 34, in inner
response = get_response(request)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/django/core/handlers/base.py", line 115, in _get_response
response = self.process_exception_by_middleware(e, request)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/django/core/handlers/base.py", line 113, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "/home/joel/myappointments/clinic/views.py", line 6879, in GoGetPrescription
clinicobj = clinicobj,
File "/home/joel/myappointments/clinic/views.py", line 16222, in PDFPrescriptions
return prescription.generate_pdf(action=action, rating=True)
File "/home/joel/myappointments/clinic/views.py", line 15415, in generate_pdf
return self.final_generate(footer_content, action=action)
File "/home/joel/myappointments/clinic/views.py", line 15447, in final_generate
onLaterPages = partial(footer, content=footer_content)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/doctemplate.py", line 1291, in build
BaseDocTemplate.build(self, flowables, canvasmaker=canvasmaker)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/doctemplate.py", line 1056, in build
self.handle_flowable(flowables)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/doctemplate.py", line 912, in handle_flowable
if frame.add(f, canv, trySplit=self.allowSplitting):
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/frames.py", line 174, in _add
w, h = flowable.wrap(aW, h)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/tables.py", line 1206, in wrap
self._calc(availWidth, availHeight)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/tables.py", line 641, in _calc
W = self._calcPreliminaryWidths(availWidth) # widths
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/tables.py", line 754, in _calcPreliminaryWidths
new = elementWidth(value, style) or 0
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/tables.py", line 518, in _elementWidth
w = v.minWidth() # should be all flowables
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/tables.py", line 873, in minWidth
style.leftPadding+style.rightPadding)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/tables.py", line 512, in _elementWidth
if hasattr(v, 'drawWidth') and isinstance(v.drawWidth, (int, float)): return v.drawWidth
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/flowables.py", line 494, in __getattr__
self._setup_inner()
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/flowables.py", line 455, in _setup_inner
img=self._img
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/platypus/flowables.py", line 488, in __getattr__
self._img=ImageReader(self._file)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/lib/utils.py", line 813, in __init__
annotateException('\nfileName=%r identity=%s' %
(fileName, self.identity()))
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/lib/utils.py", line 1394, in annotateException
rl_reraise(t, v, b)
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/lib/utils.py", line 147, in rl_reraise
raise v
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/lib/utils.py", line 777, in __init__
self.fp=open_for_read(fileName, 'b')
File "/home/joel/myappointments/venv/lib/python3.6/site-packages/reportlab/lib/utils.py", line 660, in open_for_read
raise IOError('Cannot open resource "%s"' % name)
OSError: Cannot open resource "b'\x89PNG\r\n\x1a\n\x00\x00\x00\rIHDR\x00\x00\x00\xd2\x00\x00\x00\xd2\x01\x00\x00\x00\x00\x17\xe2\xa3\xef\x00\x00\x01$IDATx\x9c\xed\x98An\xc4 \x10\x04k\x16\xdf\xf1\x8f\xe0gyS~\x80\x9f\x92\x1f\xe0;\xab\xde\x03\xc6\xeb\x1c"\xe5d\xd0\xda\x1c\xd0 #\xcbRk4\x9e\xee\x01\xb6\xe5$\xa9 \xe5v\xc3\x83\xbf\xd7\xe7cA\x92\x94\xc1"N\x16k\x86\xa4\xd1x\x9e\x8bA\xc8#\xc8NJ\xbe e\'\xc0]</\x07\xccb\xdb\xfas\xe9\x8eM\xefP\xac\x13b\xed\xc6e0\xccKJ\x80\xd9\xecd\x11\x90T\xfap\x19\x06\xdb\x97\x13!;\xd5v\xd3\x87\xcbH\xd8\xa4=\x14\xeb\xd3\xc0\xb7\x9be$\x9e\x9d\xea\xa5V\x89/u\xab\xca\x94F\xe2yz^\x94\xbc\x04^\xda\x8ePe{,\x9e}\xeaE\xe9\xed\xe6\xe0\xae\x17\xa0\xa6#\xf9\xb2\x9b;\xe9\x1f\xdf}:\xc6A\x80v=\xbau\xba\xd5\xcb\xef_hk7#\xf1\xec\xee_\xf0\x92\x94\x9d.\xde_\xda<\xdd\xde\x19R\xb5\xbfW\xcf\xcb\x03V3\x8b\xb4\x911\xfc\x98\xd9\xd7\xe5\xfb\xcb\x11[f\'\x96\x19\x80\xa7\x8d\xcb\xf3\x0c\xec0O\x13\xbe\xa7b\xf8\x0c\x8b\xdd\xben\xef/\xc0\xf68\xb5E#\xf1\xec\xaaGU\xac\x9d\x08\xba\xba\xdf}\x01<\xf7\xbf\x8cN\xed-\x8a\x00\x00\x00\x00IEND\xaeB`\x82'"
fileName=b'\x89PNG\r\n\x1a\n\x00\x00\x00\rIHDR\x00\x00\x00\xd2\x00\x00\x00\xd2\x01\x00\x00\x00\x00\x17\xe2\xa3\xef\x00\x00\x01$IDATx\x9c\xed\x98An\xc4 \x10\x04k\x16\xdf\xf1\x8f\xe0gyS~\x80\x9f\x92\x1f\xe0;\xab\xde\x03\xc6\xeb\x1c"\xe5d\xd0\xda\x1c\xd0#\xcbRk4\x9e\xee\x01\xb6\xe5$\xa9 \xe5v\xc3\x83\xbf\xd7\xe7cA\x92\x94\xc1"N\x16k\x86\xa4\xd1x\x9e\x8bA\xc8#\xc8NJ\xbe e\'\xc0]</\x07\xccb\xdb\xfas\xe9\x8eM\xefP\xac\x13b\xed\xc6e0\xccKJ\x80\xd9\xecd\x11\x90T\xfap\x19\x06\xdb\x97\x13!;\xd5v\xd3\x87\xcbH\xd8\xa4=\x14\xeb\xd3\xc0\xb7\x9be$\x9e\x9d\xea\xa5V\x89/u\xab\xca\x94F\xe2yz^\x94\xbc\x04^\xda\x8ePe{,\x9e}\xeaE\xe9\xed\xe6\xe0\xae\x17\xa0\xa6#\xf9\xb2\x9b;\xe9\x1f\xdf}:\xc6A\x80v=\xbau\xba\xd5\xcb\xef_hk7#\xf1\xec\xee_\xf0\x92\x94\x9d.\xde_\xda<\xdd\xde\x19R\xb5\xbfW\xcf\xcb\x03V3\x8b\xb4\x911\xfc\x98\xd9\xd7\xe5\xfb\xcb\x11[f\'\x96\x19\x80\xa7\x8d\xcb\xf3\x0c\xec0O\x13\xbe\xa7b\xf8\x0c\x8b\xdd\xben\xef/\xc0\xf68\xb5E#\xf1\xec\xaaGU\xac\x9d\x08\xba\xba\xdf}\x01<\xf7\xbf\x8cN\xed-\x8a\x00\x00\x00\x00IEND\xaeB`\x82' identity=[ImageReader#0x7f1e0987ecf8 filename=b'\x89PNG\r\n\x1a\n\x00\x00\x00\rIHDR\x00\x00\x00\xd2\x00\x00\x00\xd2\x01\x00\x00\x00\x00\x17\xe2\xa3\xef\x00\x00\x01$IDATx\x9c\xed\x98An\xc4 \x10\x04k\x16\xdf\xf1\x8f\xe0gyS~\x80\x9f\x92\x1f\xe0;\xab\xde\x03\xc6\xeb\x1c"\xe5d\xd0\xda\x1c\xd0#\xcbRk4\x9e\xee\x01\xb6\xe5$\xa9 \xe5v\xc3\x83\xbf\xd7\xe7cA\x92\x94\xc1"N\x16k\x86\xa4\xd1x\x9e\x8bA\xc8#\xc8NJ\xbe e\'\xc0]</\x07\xccb\xdb\xfas\xe9\x8eM\xefP\xac\x13b\xed\xc6e0\xccKJ\x80\xd9\xecd\x11\x90T\xfap\x19\x06\xdb\x97\x13!;\xd5v\xd3\x87\xcbH\xd8\xa4=\x14\xeb\xd3\xc0\xb7\x9be$\x9e\x9d\xea\xa5V\x89/u\xab\xca\x94F\xe2yz^\x94\xbc\x04^\xda\x8ePe{,\x9e}\xeaE\xe9\xed\xe6\xe0\xae\x17\xa0\xa6#\xf9\xb2\x9b;\xe9\x1f\xdf}:\xc6A\x80v=\xbau\xba\xd5\xcb\xef_hk7#\xf1\xec\xee_\xf0\x92\x94\x9d.\xde_\xda<\xdd\xde\x19R\xb5\xbfW\xcf\xcb\x03V3\x8b\xb4\x911\xfc\x98\xd9\xd7\xe5\xfb\xcb\x11[f\'\x96\x19\x80\xa7\x8d\xcb\xf3\x0c\xec0O\x13\xbe\xa7b\xf8\x0c\x8b\xdd\xben\xef/\xc0\xf68\xb5E#\xf1\xec\xaaGU\xac\x9d\x08\xba\xba\xdf}\x01<\xf7\xbf\x8cN\xed-\x8a\x00\x00\x00\x00IEND\xaeB`\x82']
From the error, it appears that it is erroring out on getting the name of the image file. But there is no file. The image is being generated from BytesIO.
Your generate_qr_code function, which you did not show us, is NOT returning a BytesIO object. It's returning the raw bytes of the PNG image. When you print(img, type(img)), it told you it was of type "bytes", right? That's a string of bytes, not a BytesIO object. If you wrap those bytes into a BytesIO object, then the reportlab Image constructor will be able to handle it.
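In other words, a minimal fix along these lines (assuming generate_qr_code really does return raw PNG bytes, as the print output indicates):

from io import BytesIO
from reportlab.platypus import Image

def qr_code_as_image(text):
    png_bytes = generate_qr_code(text)  # raw PNG bytes, per the explanation above
    return Image(BytesIO(png_bytes))    # wrap in BytesIO so reportlab gets a file-like object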
