Possible to upload to ftp server with URL? - python

I'm wondering if there is a library that would enable me to upload a file to a remote server via FTP. I know there is ftplib, but from what I can tell it only lets you upload files you already have locally. So if I had a URL like https://vignette.wikia.nocookie.net/disney/images/d/db/Donald_Duck_Iconic.png , could I make a program that downloads/uploads it directly to my FTP server, instead of first having to download it to my own computer and then upload it to the server?
Sorry for the formatting, I'm on mobile.

You can use requests to download the file, and put the content into a BytesIO for upload:
from io import BytesIO
import requests

url = 'https://vignette.wikia.nocookie.net/disney/images/d/db/Donald_Duck_Iconic.png'
response = requests.get(url)
f = BytesIO(response.content)  # in-memory file-like object; nothing written to disk
Then, f is a file-like object that is suitable for FTP.storbinary.
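For completeness, a minimal end-to-end sketch combining the two steps; the FTP host, credentials, and remote path are placeholders to replace with your server's details:

from ftplib import FTP
from io import BytesIO
import requests

url = 'https://vignette.wikia.nocookie.net/disney/images/d/db/Donald_Duck_Iconic.png'
response = requests.get(url)
response.raise_for_status()
f = BytesIO(response.content)  # hold the image in memory only

# Placeholder host, credentials, and remote path.
ftp = FTP('ftp.example.com', 'user', 'password')
ftp.storbinary('STOR /images/donald.png', f)
ftp.quit()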

Related

Download a giant file with HTTP and upload to FTP server without storing it

I have a project in which I need to download some giant files over HTTP and upload them to an FTP server. The simplest way is to first download the file and then upload it to FTP, treating the transfer as two independent stages.
But can we use streams to upload the file as it is being downloaded? That seems more efficient. Any examples, especially in Python, are welcome.
Use the requests module to obtain a file-like object representing the HTTP download and pass it to ftplib's FTP.storbinary:
from ftplib import FTP
import requests

url = "https://www.example.com/file.zip"

# stream=True defers the body download; r.raw exposes the raw socket stream
# as a file-like object that storbinary can read from chunk by chunk.
r = requests.get(url, stream=True)

ftp = FTP(host, user, passwd)
ftp.storbinary("STOR /ftp/path/file.zip", r.raw)
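Because r.raw is read chunk by chunk, the file is never held fully in memory. On Python 3, a variant of the same transfer with cleanup handled and one caveat addressed: r.raw does not decode Content-Encoding (e.g. gzip) by default, so enable decode_content if the server compresses responses. The host, credentials, and paths are placeholders:

from ftplib import FTP
import requests

url = "https://www.example.com/file.zip"
with requests.get(url, stream=True) as r:
    r.raise_for_status()
    r.raw.decode_content = True  # decompress gzip/deflate if the server encodes the body
    with FTP("ftp.example.com", "user", "passwd") as ftp:  # placeholder credentials
        ftp.storbinary("STOR /ftp/path/file.zip", r.raw)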

How to serve a temporary file (non-static) via HTTP request?

I have a Python REST server that is able to download and write a temporary file using Python's tempfile module.
That is, at request time I have a file in the filesystem of the server, but it is not permanent, so the client cannot access it statically (i.e. via http://myserver/path-to-my-file ). I need a way to access an endpoint and get a file returned based on the request parameters (i.e. http://myserver/myendpoint/myrequestparameters ).
How does that work over HTTP? Is it possible?
(For context, right now I am serving the file as a base64-encoded, utf-8-decoded string, but my frontend application needs a direct download link.)
I believe there's a dedicated response type for exactly this in Django. Assuming send_file is your endpoint:
from django.http import FileResponse

def send_file(request):  # the view receives a request, not a response
    img = open('my_image_file.jpg', 'rb')
    return FileResponse(img)
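Since the question is specifically about a temporary file rather than one on disk, a minimal sketch along the same lines, assuming Django 2.1+ (the view name and the generate_png_bytes helper are hypothetical):

import tempfile
from django.http import FileResponse

def download_report(request):
    tmp = tempfile.NamedTemporaryFile(suffix='.png')  # deleted from disk once closed
    tmp.write(generate_png_bytes())  # hypothetical helper producing the file's bytes
    tmp.seek(0)
    # FileResponse streams the file and closes it when the response is done,
    # which also removes the temporary file.
    return FileResponse(tmp, as_attachment=True, filename='report.png')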

Sending file from URL in REST request Python

This is what I'm currently using to send images to the API:
import requests

response = requests.post("http://someAPI",
                         auth=(username, password),
                         files={'imgFile': open(filepath, 'rb')})
It works for local files, but I would like to be able to send images from a URL as well. I know how to do this by saving the file, then opening it as a local file, sending it to the API and then removing it.
Is there any way of doing this without having to save and remove the file?
You can use StringIO. A StringIO instance is a file-compatible object that supports seek, read, and write, so you can load data into it and serve from that without writing a file to disk. I normally use this approach for creating CAPTCHAs.
import StringIO
import requests

# Imagine you read the image file into `image` as base64-encoded data...
buff = StringIO.StringIO()
buff.write(image.decode('base64'))
buff.seek(0)

response = requests.post("http://someAPI",
                         auth=(username, password),
                         files={'imgFile': buff})
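Note that this answer is Python 2. On Python 3, and to send the image straight from a URL as the question asks, a sketch using io.BytesIO (the image URL is a placeholder; the endpoint and credentials are as in the question):

from io import BytesIO
import requests

img = requests.get('https://www.example.com/image.png')  # placeholder image URL
response = requests.post('http://someAPI',
                         auth=(username, password),
                         files={'imgFile': BytesIO(img.content)})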

Download image from URL that automatically creates download

I have a url that, when opened, does nothing but initiate a download and immediately close the page. I need to capture this download (a png) with Python and save it to my own directory. I have tried all the usual urllib and urllib2 methods and even tried mechanize, but it's not working.
The url automatically starting a download and then closing is definitely causing some problems.
UPDATE: Specifically, it is using Nginx to serve the file with an X-Accel-Mapping header.
There's nothing particularly special about the X-Accel-Mapping header. Perhaps the page makes the HTTP request with ajax and uses the X-Accel-Mapping header value to trigger the download?
Here's how I'd do it with urllib2:
import urllib2

# The first request returns the header; its value is the real download URL.
response = urllib2.urlopen(url_to_get_x_accel_mapping_header)
download_url = response.headers['X-Accel-Mapping']
download_contents = urllib2.urlopen(download_url).read()
import urllib

URL = YOUR_URL
IMAGE = URL.rsplit('/', 1)[1]  # use the last path segment as the file name
urllib.urlretrieve(URL, IMAGE)
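On Python 3 the same one-liner lives in urllib.request; a sketch with a placeholder URL:

from urllib.request import urlretrieve

url = 'https://www.example.com/picture.png'  # placeholder
urlretrieve(url, url.rsplit('/', 1)[1])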

python: download file and send it to online storage in realtime

I want to download a file to my server and automatically send it to online storage (Minus or Dropbox) via the Minus or Dropbox API, without saving the downloaded file on my server. So it's like streaming, or piping, the HTTP connection. Right now I'm using the minus.com API, but it requires a file object or a local file as a parameter. I can't figure out how to convert an HTTP response to a file object.
Is it possible to do this? If so, how?
concept :
FILE_ON_ANOTHER_SERVER ----(http)---> MY_SERVER ----(http)----> ONLINE_STORAGE
thanks
You can get the data from a response via the read() method:
response = urllib2.urlopen(request)
data = response.read()
The variable data now holds the binary data from the response.
Now you can create a StringIO object, which handles the data as a file-like object.
import StringIO

datastream = StringIO.StringIO()
datastream.write(data)
datastream.seek(0)

# `client` is assumed to be an already-authenticated Dropbox client
client.put_file('/test', datastream)
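For reference, a Python 3 sketch of the same pipe, assuming the current dropbox SDK (files_upload) and requests; the access token and URLs are placeholders:

import dropbox
import requests

dbx = dropbox.Dropbox('ACCESS_TOKEN')  # placeholder token
r = requests.get('https://www.example.com/file.bin')  # placeholder source URL
dbx.files_upload(r.content, '/test/file.bin')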
urllib2.urlopen(url) will return a file-like object. Can you pass that directly to your Minus API? See the urllib2 docs at
http://docs.python.org/library/urllib2
