Pyrebase uploaded image not loading firebase storage - python

I can't make this work: I want to upload a picture to Firebase with Pyrebase. The file (picture) uploads successfully, but it never loads in the Firebase Storage console (see attached picture).
This is the code:
if request.method == 'POST':
    upload = request.files['upload']
    filename = secure_filename(upload.filename)  # sanitize the client-supplied name (werkzeug.utils.secure_filename)
    upload.save(os.path.join(UPLOAD_FOLDER, filename))
    storage.child("images/test.jpg").put(os.path.join(UPLOAD_FOLDER, filename))

Last I heard, there is a bug in the Firebase console where images don't load if they were uploaded using one of the backend SDKs (as opposed to one of the Firebase client SDKs or the Firebase console). The console preview uses the "access token" that you would see in the "File location" section for that file; if there is no access token, the console won't know how to load the file.
If this is indeed the case for you, please file a bug with Firebase support. It's a known issue, but the report will let them know how many people are affected.

Clicking Create new access token under File Location solves the issue.
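If you also want a URL you can open directly (independent of the console preview), Pyrebase can build one from the upload response. A minimal sketch, assuming the usual Pyrebase client where put() returns the upload metadata and get_url() accepts a download token; the "downloadTokens" key name is an assumption, so inspect the response on your side:

# Sketch, not a guaranteed API: print the dict returned by put() to confirm the key names.
result = storage.child("images/test.jpg").put(os.path.join(UPLOAD_FOLDER, filename))
token = result.get("downloadTokens")  # assumed key carrying the access token
download_url = storage.child("images/test.jpg").get_url(token)
print(download_url)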

Related

SharePoint API: how to read a file having only a sharing link to it

I'm using the Python office365 library to access SharePoint documents. I don't know how to access, via the API, a file that has been shared with me through a sharing link. I need to get the file content and, if possible, its metadata (last modified date). Could anyone help?
The user I'm authenticating with has no access to this SharePoint folder other than a sharing link to a single file.
I tried many variations of the normal file access API, both by hand and through the office365 library, but I couldn't find a way to access a file when I only have a sharing link to it.
My sharing link looks like this:
https://[redacted].sharepoint.com/:x:/s/[redacted]/dir1/dir2/ESd0HkNNSbJMhQFavQsr9-4BNHC2rHSWsnbs3zRdjtZsC3g so there is no real filename in it, and I cannot read the content of any folder via the API because I get the error Attempted to perform an unauthorized operation. Authentication works fine (when I mistype the password I get a different error).
According to my research and testing, you can use the following REST API to read the file (get the file content):
https://xxxx.sharepoint.com/sites/xxx/_api/web/GetFolderByServerRelativeUrl('/sites/xxx/Library_Name/Folder Name')/Files('Document.docx')/$value
If you want to get the last modified date, you can use the following REST API to get the Modified field:
https://xxxx.sharepoint.com/sites/xxx/_api/web/lists/getbytitle('test_library')/Items?$select=Modified
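A hedged sketch of calling the first endpoint through the office365 library rather than raw REST (ClientContext, UserCredential and File.open_binary come from office365-rest-python-client; the site URL, credentials and paths below are placeholders):

from office365.runtime.auth.user_credential import UserCredential
from office365.sharepoint.client_context import ClientContext
from office365.sharepoint.files.file import File

site_url = "https://xxxx.sharepoint.com/sites/xxx"  # placeholder site
ctx = ClientContext(site_url).with_credentials(UserCredential("user@xxxx.com", "password"))  # placeholder credentials

# Download the file content by its server-relative URL.
relative_url = "/sites/xxx/Library_Name/Folder Name/Document.docx"
response = File.open_binary(ctx, relative_url)
with open("Document.docx", "wb") as local_file:
    local_file.write(response.content)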

gcs client library stopped working with dev_appserver

The Google Cloud Storage client library is returning a 500 error when I attempt to upload via the development server.
ServerError: Expect status [200] from Google Storage. But got status 500.
I haven't changed anything with the project and the code still works correctly in production.
I've attempted gcloud components update to get the latest dev_server and I've updated to the latest google cloud storage client library.
I've run gcloud init again to make sure credentials are loaded and I've made sure I'm using the correct bucket.
The project is running on Windows 10 with Python 2.7.
Any idea why this is happening?
Thanks
Turns out this has been a problem for a while.
It has to do with how blobstore filenames are generated.
https://issuetracker.google.com/issues/35900575
The fix is to monkeypatch this file:
google-cloud-sdk\platform\google_appengine\google\appengine\api\blobstore\file_blob_storage.py
def _FileForBlob(self, blob_key):
    """Calculate full filename to store blob contents in.

    This method does not check to see if the file actually exists.

    Args:
      blob_key: Blob key of blob to calculate file for.

    Returns:
      Complete path for file used for storing blob.
    """
    blob_key = self._BlobKey(blob_key)
    # Remove bad characters.
    import re
    blob_fname = re.sub(r"[^\w\./\\]", "_", str(blob_key))
    # Make sure it's a relative directory.
    if blob_fname and blob_fname[0] in "/\\":
      blob_fname = blob_fname[1:]
    return os.path.join(self._DirectoryForBlob(blob_key), blob_fname)
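To check that the patched dev server behaves, a small write/read round trip with the client library is enough (a sketch; the bucket name below is a placeholder):

import cloudstorage as gcs

test_path = '/your-bucket-name/dev_server_check.txt'  # placeholder bucket

# Write a small object, then read it back; before the patch this raised the 500 ServerError.
with gcs.open(test_path, 'w', content_type='text/plain') as f:
    f.write('hello from dev_appserver')

with gcs.open(test_path) as f:
    print f.read()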

Upload image to Google Cloud Storage through a web page form (Python, Google App Engine, AngularJS)

First let me say, I think I've read every Stack Exchange post that seemed relevant to this topic.
What I'm trying to do:
I want to allow certain users to send an image to Google Cloud Storage through a form on my site.
Along with sending that image to Cloud Storage, I also need to create a model of that image in the datastore, with its relevant attributes.
I'm not sure of the best way to go about this, for the following reasons.
Approaches I've read about:
There are several approaches for getting images into Cloud Storage:
through the Blobstore API
through the Python client for Cloud Storage: https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/#Python_About_the_client_library
through the REST API (I was considering the JSON approach): https://cloud.google.com/storage/docs/json_api/v1/how-tos/upload#simple
I've read a lot of conflicting information, so much that I can't keep it straight, but I think the takeaways were:
For Blobstore:
1. The blobstore.uploadtocloudstoragething method may or may not have a file upload size limit (no, that's not the actual call or method name).
2. The Blobstore method sounded like it's going to be phased out.
For the JSON API:
"However, the client library includes App Engine optimizations, so using the REST API might require extra development time. Note: The App Engine development server supports the Cloud Storage Client. It doesn't support the REST API."
For the client library:
I don't know. It looks fine so far. Here is the method I'm thinking about using:
"cloudstorage.open(filename, mode='r', content_type=None,
options=None,
read_buffer_size=storage_api.ReadBuffer.DEFAULT_BUFFER_SIZE,
retry_params=None)
In read mode (r) opens the specified Cloud Storage
object for read. In write mode w, if the specified file exists, it
opens it for an overwrite (append is not supported). If the file
doesn't exist, it is created in the specified bucket."
The stuff I'm using:
Google App Engine
Python (for handlers, models,etc)
angularjs and jquery (front end)
json sometimes (backend)
GAE boilerplate (don't ask)
What I think I'm going to do:
Get the user's image:
So, I found this form which I tested. It does grab a file from your computer:
<form action="<?php echo $upload_url?>" enctype="multipart/form-data"
method="post">
Files to upload: <br> <input type="file" name="uploaded_files" size="40"> <input type="submit" value="Send">
(No, I can't use the, the php junk in the action. I took it from some place in the docs, but I tested it, it does grab a file, no it won't send, and it would be silly to expect it to)
Modify my current handler (which takes care of creating a model of the image in the datastore) to also use the cloudstorage.open method quoted above.
My problems so far:
I don't know how that form works, how to get the right information from it to a handler, or what information that handler will need, so that I can process it, create the model (I already know how to create the model in the datastore, and am already doing so), and also send the image to Google Cloud Storage with the cloudstorage.open method quoted above.
Can anyone give me an example? I can't find any good posts or docs on what to put in the form so that it grabs the file, grabs the other form data the user fills in, and ends up sending the image to Google Cloud Storage.
--UPDATE--:
So, right after posting this, I googled gcs.open, which led to this post:
how to upload image to the cloud storage via app engine?
I believe this could do the trick as far as the backend goes, but I do have a few questions left.
To restate the questions:
1. I don't know what the image name will be; I'd like to use the existing one, or perhaps a random one. Suggestions?
2. Where do I stick this: enctype="multipart/form-data"?
3. Will number 2 mess with my other form data? I need more info than just this image in the POST request.
To answer your questions:
I don't know what the image name will be, I'd like to use the existing one, or perhaps a random one. Suggestions?
I don't think that really matters; just make sure to think about possible name clashes and what would happen then. I think this mostly depends on the needs of your application. You can always use a random filename and retain the actual filename in the datastore next to a reference to the Cloud Storage item.
Where do I stick this: enctype="multipart/form-data" ?
Enctype is a form attribute and so needs to be set on the form element in your HTML:
<form enctype="multipart/form-data" action="/save" ... >
Will number 2 mess with my other form data? I need more info than just this image in the post request.
No, it should not. Just add more form input fields and they will all be submitted with the form data.
You can also use a signed form for a direct upload to GCS. This way you do not need the Blobstore or a POST handler, and you can control the timeout and object name using a policy document. See my answer here.
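If you go the client-library route instead, the handler side can stay small. A hedged sketch under these assumptions: webapp2 and ndb on App Engine, a file input named uploaded_files as in the form above, and a placeholder bucket and model name:

import cloudstorage as gcs
import webapp2
from google.appengine.ext import ndb

class ImageRecord(ndb.Model):
    # Hypothetical datastore model; swap in whatever attributes you actually track.
    filename = ndb.StringProperty()
    gcs_path = ndb.StringProperty()

class UploadHandler(webapp2.RequestHandler):
    def post(self):
        # 'uploaded_files' must match the name of the <input type="file"> in the form;
        # any extra form fields arrive alongside it in self.request.POST.
        field = self.request.POST['uploaded_files']  # a cgi.FieldStorage
        gcs_path = '/your-bucket-name/%s' % field.filename  # placeholder bucket

        # Write the raw bytes to Cloud Storage (write mode creates the object).
        with gcs.open(gcs_path, 'w', content_type=field.type) as gcs_file:
            gcs_file.write(field.file.read())

        # Record the upload in the datastore next to its GCS path.
        ImageRecord(filename=field.filename, gcs_path=gcs_path).put()
        self.response.write('Uploaded %s' % field.filename)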

Python Facebook upload video from external link

I'm trying to upload a video to Facebook from an external URL, but I get an error when I post it. I tried with local videos and everything works fine.
My simple code is:
answer = graph.post(
    path="597739293577402/videos",
    source='https://d3ldtt2c6t0t08.cloudfront.net/files/rhn4phpt3rh4u/2015/06/17/Z7EO2GVADLFBG6WVMKSD5IBOFI/main_OUTPUT.tmp.mp4',
)
and my error is always the same:
FacebookError: [6000] There was a problem uploading your video file. Please try again with another file.
I looked into the docs and found the file_url parameter, but it's still the same issue.
The format of the video is .mp4, so it should work.
Any idea?
Apparently this error message is very confusing. It's the same message you get when your access_token doesn't work. For example, I get this error message when I try with my user access token, but not when I use the Page access token.
I've never used source; I'm pretty sure that's for reading video data off their API. Instead, I use file_url in my payload when passing video file URLs to the Facebook Graph API.
Refer to their API doc for clarity on that...
It's also possible that the tmp.mp4 file extension is causing you problems. I've had issues with valid video URLs that have non-typical file extensions similar to that. Is it possible to alter that at the source so that the URL doesn't have the tmp?
A typical payload passed with the Requests module to their API that works for me might look something like this:
import json
import requests

# Page access token goes in the query string; file_url points at the remote video.
fburl = 'https://graph-video.facebook.com/v2.3/156588/videos?access_token=' + str(access)
payload = {'name': '%s' % (videoName), 'description': '%s' % (videoDescription), 'file_url': '%s' % (videoUrl)}
flag = requests.post(fburl, data=payload).text
print flag
fb_res = json.loads(flag)
I would also highly recommend that you obtain a permanent Page access token. It's the best way to mitigate the complexities of Facebook's OAuth process.
facebook: permanent Page Access Token?
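A hedged sketch of that token flow with the Requests module (APP_ID, APP_SECRET and SHORT_LIVED_TOKEN are placeholders; the endpoints follow the approach in the linked answer, and note that older Graph API versions return a query string rather than JSON from the exchange call):

import requests

# Exchange a short-lived user token for a long-lived one.
resp = requests.get('https://graph.facebook.com/oauth/access_token', params={
    'grant_type': 'fb_exchange_token',
    'client_id': APP_ID,                     # placeholder
    'client_secret': APP_SECRET,             # placeholder
    'fb_exchange_token': SHORT_LIVED_TOKEN,  # placeholder
})
long_lived_user_token = resp.json()['access_token']

# Each page you manage comes back with its own Page access token.
pages = requests.get('https://graph.facebook.com/me/accounts',
                     params={'access_token': long_lived_user_token}).json()
for page in pages['data']:
    print page['name'], page['access_token']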

Why can't I configure OAuth 2.0 using Python?

I was trying to get my YouTube video uploading software to work using this guide/code:
https://developers.google.com/youtube/v3/guides/uploading_a_video
I installed and set up everything that is required. I also made a new project, switched the YouTube Data API v3 to ON, got my credentials, and entered them into client_secrets.json (I also tried downloading one via the button Google provides; I renamed the downloaded file to client_secrets.json manually, so that's not the problem). But when I try to upload something I get this error:
C:\Users\%USERNAME%>c:/Python27/python.exe C:\Users\%USERNAME%\Documents\youtube_uploader\upload.py --file=C:\Users\%USERNAME%\Documents\youtube_uploader\test_klip\test.mp4 --title=Test upload
WARNING: Please configure OAuth 2.0
To make this sample run you will need to populate the client_secrets.json file found at:
C:\Users\%USERNAME%\Documents\youtube_uploader\clientinfo.json
with information from the APIs Console
https://code.google.com/apis/console#access
For more information about the client_secrets.json file format, please visit:
https://developers.google.com/api-client-library/python/guide/aaa_client_secrets
Is there anything I missed or did wrong? If so, please help me get this to work.
Thanks in advance! :)
P.S. I tried making and using both web and installed types of credentials, but neither of them works.
