How to pass an image to a Python script using Flask?

I have a website (managed with Python-Flask) with images on canvases, and I would like to pass the content of those canvases to another Python script as images.
The other Python script uses OpenCV to perform face detection.
I know I could upload the image to my server and then read the file in my OpenCV application, but I would prefer not to save any data on my server.
Do you have any ideas?

You will still need to upload the file to the server, because the user's data has to reach your server application.
But instead of saving it as a regular file, you could use something like SpooledTemporaryFile.
In other words, your workflow will look like this (a sketch follows the list):
Send the image to the server with a POST request;
Read the file from the POST request with Flask;
Write it to a SpooledTemporaryFile to get a file-like object;
Use that file-like object with OpenCV.
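A minimal sketch of that workflow, assuming a form field named "image" and a stock Haar cascade for the face detection (both names are placeholders you would adapt):

import cv2
import numpy as np
from flask import Flask, request, jsonify
from tempfile import SpooledTemporaryFile

app = Flask(__name__)

@app.route("/detect", methods=["POST"])
def detect():
    upload = request.files["image"]          # file sent with the POST request
    with SpooledTemporaryFile() as tmp:      # kept in memory until it grows large
        upload.save(tmp)
        tmp.seek(0)
        data = np.frombuffer(tmp.read(), dtype=np.uint8)
    # OpenCV cannot read from a file object directly, so decode the raw bytes
    img = cv2.imdecode(data, cv2.IMREAD_COLOR)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray)
    return jsonify(faces=len(faces))

Nothing is written to disk: the upload lives in the spooled temporary file and is handed to OpenCV as raw bytes.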

Related

How to call a Python script (that cuts a video into frames and then sends them as JSON) from JMeter?

I am trying to load test an API that receives images as JSON and sends them back with blurred faces. All of this is written in Python.
I call the following POST request:
r = requests.post("http://127.0.0.1:8080/function/flask-service", json=json.dumps(files)) #POST to server as json
Now I want to load test my API with JMeter.
The problem, however, is that my Python script first needs to cut a video down into individual frames, which is why the JSON is different each time.
What do I have to do to pass JMeter a different JSON for each of the frames my Python script extracts?
You can call external programs and commands using the OS Process Sampler, for example somewhere in a setUp Thread Group, and store the individual frames as JSON files in a folder.
Then you can use the Directory Listing Config plugin, which reads the next file from the specified directory into a JMeter Variable on each iteration of each virtual user.
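For the setUp part, a minimal sketch of a frame-extraction script the OS Process Sampler could call is shown below; the video path, output folder and JSON layout are assumptions you would adapt to your API:

import base64
import json
import os
import cv2

video = cv2.VideoCapture("input.mp4")        # placeholder video path
os.makedirs("frames_json", exist_ok=True)    # folder the Directory Listing Config will read

index = 0
while True:
    ok, frame = video.read()
    if not ok:
        break
    ok, buf = cv2.imencode(".jpg", frame)    # encode the frame as JPEG in memory
    if not ok:
        continue
    payload = {"frame": index,
               "image": base64.b64encode(buf.tobytes()).decode("ascii")}
    with open(os.path.join("frames_json", f"frame_{index:05d}.json"), "w") as fh:
        json.dump(payload, fh)               # one JSON file per frame
    index += 1

video.release()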

Python API: Tweet with media without file

I'm using Twitter from Python in an environment where I can't store files.
I receive an HTTP POST with text and an image and want to create a tweet from this data without writing a local file (it's Zappa in an AWS API environment).
Tweepy only accepts filenames, which does not work for me.
python-twitter seems to have something like this, but I can't find documentation for it.
Should I just send POST requests to Twitter to upload the images? Is there a simpler way?
Try passing an io.BytesIO to tweepy's API.update_with_media as file. From the tweepy documentation:
filename – The filename of the image to upload. This will automatically be opened unless file is specified
...
file – A file object, which will be used instead of opening filename. filename is still required, for MIME type detection and to use as a form field in the POST data
Edit:
It looks like you have the image data base64 encoded. You can use base64.b64decode to decode it before creating the io.BytesIO:
file = io.BytesIO(base64.b64decode(base64_data))
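Putting it together, a minimal sketch assuming the incoming request carries base64-encoded image data; the credentials and the filename are placeholders:

import base64
import io
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)

def tweet_image(base64_data, status_text):
    # filename is still required for MIME type detection, but nothing is
    # written to disk because the file object is used instead
    image = io.BytesIO(base64.b64decode(base64_data))
    api.update_with_media(filename="image.jpg", status=status_text, file=image)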

Clarifai Python API: passing an image captured from a webcam to the API, without saving it on the hard drive

I am using the Clarifai Python API to classify objects captured by my Raspberry Pi camera (the whole program runs in Python on the Raspberry Pi).
Now, the two ways in which this [Python API] can be used are:
1> passing the path of an image on the hard drive as an argument when calling the API;
2> passing the URL of an image as an argument when calling the API.
Now, I want to pass the image captured by my Pi camera (using the picamera library) directly to this API and get the tags back, without saving it on the hard drive. How can I do it?
What kind of variable should I use to store the image data and to pass it on?
It looks like there are a few ways to go about it:
1) Use the capture function in picamera to save the file, push the file to Clarifai's tag_images function, then delete the file from the drive.
2) POST the captured image's bytes to the Imgur (or a similar) API and push the resulting URL to Clarifai's tag_image_urls function.
3) Use REST instead of the Clarifai Python library and push the image bytes to the /tag endpoint (a sketch of this is shown below).
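A minimal sketch of option 3: picamera can capture straight into an in-memory stream, so nothing touches the SD card. The Clarifai endpoint, authorization header and form field name are placeholders, so check the current Clarifai documentation for the exact values:

import io
import picamera
import requests

stream = io.BytesIO()
with picamera.PiCamera() as camera:
    camera.capture(stream, format="jpeg")    # JPEG bytes go to memory, not disk
stream.seek(0)

response = requests.post(
    "https://api.clarifai.com/v1/tag/",               # placeholder endpoint
    headers={"Authorization": "Bearer ACCESS_TOKEN"},  # placeholder credentials
    files={"encoded_data": ("frame.jpg", stream, "image/jpeg")},
)
print(response.json())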

MySQL connection and loading images into Qt

I want to connect to a MySQL database with Python, query a number corresponding to an image, and load that image in Qt. From what I found online, it is suggested not to store the image itself in the MySQL database, but instead to store a file location on the server. If this is the case, can I load the image (do I have to download it?) into Qt using MySQL, or do I have to open another connection with FTP, download the image to a folder, and then load it with Qt? If there are any resources on this type of workflow, I would appreciate them.
You don't need to download the file using FTP (or the like) to load it into Qt.
Assuming the database stores the correct file path to the image, you can simply use that path once you have queried it, i.e. the file path alone is enough to load the image into Qt; downloading the image yourself adds nothing.
If the database is on a remote server, a possible approach is to use the JDBC API to access the database, fetch the image as binary data and serialize it so that it can be transferred over the network.
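A minimal sketch of the first case in Python, assuming a table named images with id and path columns and a path that is reachable from the machine running the Qt application; table, column names and credentials are placeholders:

import mysql.connector
from PyQt5.QtGui import QPixmap
from PyQt5.QtWidgets import QApplication, QLabel

app = QApplication([])

conn = mysql.connector.connect(host="localhost", user="user",
                               password="secret", database="gallery")
cursor = conn.cursor()
cursor.execute("SELECT path FROM images WHERE id = %s", (42,))
(path,) = cursor.fetchone()          # the stored file location

label = QLabel()
label.setPixmap(QPixmap(path))       # Qt loads the image straight from the path
label.show()
app.exec_()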

Using memcached to host images

I'm writing a simple blogging platform in the Flask microframework, and I'd like to allow users to change the image on the front page without actually writing it to the filesystem. Is it possible to point the src attribute of an img tag to an object stored in memory?
Yes, you can do it.
Create a controller (or servlet) at, for example, www.yoursite.com/getImage/ID.
When that URL is requested, your program should connect to memcached and return the image object that you previously stored in it.
Finally, when your HTML contains src="www.yoursite.com/getImage/ID", the browser will request this URL, but instead of reading a file from disk the server will ask memcached for that specific ID.
Be sure to set the correct Content-Type on the server's response so that the browser understands that you are sending image content.
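A minimal sketch of such a controller in Flask with pymemcache; the key naming scheme, the memcached address and the image MIME type are assumptions:

from flask import Flask, Response, abort
from pymemcache.client.base import Client

app = Flask(__name__)
cache = Client(("localhost", 11211))

@app.route("/getImage/<image_id>")
def get_image(image_id):
    data = cache.get(f"image:{image_id}")    # raw bytes stored earlier under this key
    if data is None:
        abort(404)
    # the correct Content-Type tells the browser this is image data
    return Response(data, mimetype="image/png")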
Fer
