I'm using Google Colaboratory to create a series of plots using plotly and saving them as .html on Google Drive. I would like to publish those files on a WordPress site.
Furthermore, I will update these plots regularly. So every time I update them, they should be updated on WordPress.
Is there a good way to implement this?
It seems that there isn't a super easy way to get the HTML doc from Google Drive into WordPress, so the next best thing is to automatically pull the raw document from Drive and integrate it on a schedule. Based on this SO post, get the file id for your document (follow these steps here) to be able to access the raw data of your file. Note: you may want the download link instead, which can be found in the second link.
https://drive.google.com/uc?id=file_id
This may not work as-is, as I've found there's a redirect involved, but you could also use the redirected URL to grab the file. I haven't had much success with curl and wget, however. After that, you should be able to upload it to WordPress, or have WordPress itself download and embed it.
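For what it's worth, a minimal Python sketch of that pull step, assuming the file is shared as "anyone with the link" (large files may still hit Drive's virus-scan confirmation page, which this doesn't handle):

```python
import requests

FILE_ID = "your_file_id_here"  # assumption: the file id found via the steps above

# The uc endpoint redirects; requests follows redirects by default,
# which is what trips up plain curl/wget here.
url = f"https://drive.google.com/uc?id={FILE_ID}&export=download"

resp = requests.get(url, allow_redirects=True, timeout=30)
resp.raise_for_status()

# Save the raw HTML so WordPress (or a plugin / cron job) can pick it up.
with open("plot.html", "wb") as f:
    f.write(resp.content)
```

With curl, the trick is to pass -L so the redirect is followed.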
I'm trying to automatically upload JPG photo files from a particular directory on my computer to a particular album on Google Photos. I'd like the photos to periodically get pushed up to Google Photos (every day or so is frequent enough). Google Photos Backup almost does what I want, but it just uploads the files -- it doesn't put them into a particular [pre-existing] album on Google Photos. It's possible that I can somehow use Google Drive and a simple cron job for this, although I don't know how. I am also considering using the Picasa Web Albums API, but that feels like overkill and I'd like to avoid that work unless it's necessary. Are there any straightforward solutions to this?
As you said that Google Photos Backup does the (upload) job, in my opinion the best way is to use a Google Apps Script stored inside your Google Drive (running periodically) to push each newly detected picture into a particular album.
If you need the relevant documentation, you may take a look at the album class documentation and also https://developers.google.com/apps-script/
If you need to use another language to do the job (Python, JS, etc.), please specify which one and also give us more detail (Mac / Windows / Linux).
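If Python on a cron job is the route you'd take, here's a rough sketch against today's Google Photos Library API (an assumption on my part: this API postdates the Picasa one mentioned in the question, and it only lets you add items to albums your own app created). It assumes you already have an OAuth access token and the album id:

```python
import os
import requests

# Assumptions (not from the question): an OAuth2 access token with a
# photoslibrary scope in PHOTOS_TOKEN, and the target album's id in
# PHOTOS_ALBUM_ID. Obtaining the token is out of scope for this sketch.
TOKEN = os.environ["PHOTOS_TOKEN"]
ALBUM_ID = os.environ["PHOTOS_ALBUM_ID"]

def upload_to_album(path):
    # Step 1: upload the raw JPG bytes; the API answers with an upload token.
    with open(path, "rb") as f:
        upload_token = requests.post(
            "https://photoslibrary.googleapis.com/v1/uploads",
            headers={
                "Authorization": f"Bearer {TOKEN}",
                "Content-Type": "application/octet-stream",
                "X-Goog-Upload-Protocol": "raw",
            },
            data=f.read(),
        ).text

    # Step 2: turn the uploaded bytes into a media item inside the album.
    requests.post(
        "https://photoslibrary.googleapis.com/v1/mediaItems:batchCreate",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "albumId": ALBUM_ID,
            "newMediaItems": [{"simpleMediaItem": {"uploadToken": upload_token}}],
        },
    ).raise_for_status()
```

A daily cron entry running this over the watched directory would cover the "every day or so" cadence.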
Use IFTTT for this. The Google Photos channel fits this purpose perfectly. https://ifttt.com/applets/DMgPS2uZ-back-up-new-android-photos-you-take-to-google-photos
I'm writing a program to build automatic reports via Google Analytics, and everything was going fine until I saw that they should include graphical images, like the Analytics web version (most important example: the Behavior Flow).
Is there any way I can implement this, through code, in a PDF?
You mentioned you wrote a program to download the GA data, so you have the knowledge to upload that data to a Google Spreadsheet. For more info on their API, try this URL: https://developers.google.com/google-apps/spreadsheets/?hl=en
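As a minimal sketch of that upload step in Python, using the third-party gspread library (an assumption: the answer doesn't name a library, and this needs a service-account key with the sheet shared to that account; the spreadsheet name here is hypothetical):

```python
import gspread

# Assumptions: a service-account JSON key, and a spreadsheet named
# "GA Report Data" shared with that service account's email address.
gc = gspread.service_account(filename="service_account.json")
ws = gc.open("GA Report Data").sheet1

# rows: whatever your GA download produced, as a list of lists
# (placeholder values below).
rows = [
    ["date", "pageviews", "sessions"],
    ["2015-11-01", 1204, 730],
]

ws.clear()  # wipe last run's data so the charts redraw from fresh values
ws.update(values=rows, range_name="A1")
```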
Once the data is in a sheet within a Google Spreadsheet, you can use Google Data Visualizations. You can put the visualizations on a chart, or you can embed them in HTML, referring to the data sheet as a datasource.
The only challenge would be the Behavior Flow. I suggest that if your page flow is fairly static, you could probably draw something dynamically using Google Spreadsheets' ability to alter the sizes of block images based on data values.
Chart Gallery:
https://developers.google.com/chart/interactive/docs/gallery?hl=en
This would entail a lot of work, as free solutions often do, but you should be able to get the result you want.
I need to save emails I receive so that the user can view them later on. They need to be saved in such a way that the images will remain even if their links are broken (e.g. for the images that are linked rather than attached, upload them to S3 and change the links to point to them).
Can anyone recommend a library that will help me achieve that?
I was thinking of two approaches:
1) Save the email to PDF - but I have no idea how to make it correctly include the images.
2) Save the original email and render it on the client, but then it does not show the attached images.
Either one will do, with a preference for the first option. If it's the first option, I can write it on my RoR server or as an external Python service. If it's the second, I have to write it to work on RoR.
I am aware that this question is similar to: Best way to save email, including images and HTML data, using Java Mail API?
but I need to do it on Rails not Java.
Thank you!
Why not just have an auto-forwarder to a separate account? That way they would effectively be bcc'd on everything you get. I know Gmail can easily do that with filters.
Another option is forwarding the emails to a 'read it later' service and letting their API do the heavy lifting. Not sure if they keep the attachments, but it is worth a look.
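If you end up going with the question's own first idea anyway (rehosting linked images on S3 and rewriting the links), here's a minimal Python sketch of that step, using boto3 and BeautifulSoup with a hypothetical bucket that allows public reads:

```python
import uuid

import boto3
import requests
from bs4 import BeautifulSoup

BUCKET = "saved-email-images"  # hypothetical bucket name
s3 = boto3.client("s3")

def rehost_images(email_html):
    """Download each linked image, push it to S3, and rewrite its src."""
    soup = BeautifulSoup(email_html, "html.parser")
    for img in soup.find_all("img", src=True):
        src = img["src"]
        if not src.startswith(("http://", "https://")):
            continue  # skip cid: attachment references and data: URIs
        resp = requests.get(src, timeout=10)
        if resp.status_code != 200:
            continue  # link already dead; nothing left to preserve
        key = f"emails/{uuid.uuid4().hex}"
        s3.put_object(
            Bucket=BUCKET,
            Key=key,
            Body=resp.content,
            ContentType=resp.headers.get("Content-Type", "image/jpeg"),
        )
        img["src"] = f"https://{BUCKET}.s3.amazonaws.com/{key}"
    return str(soup)
```

This is the Python-service variant; the same flow ports to Ruby with the aws-sdk and Nokogiri if it has to live inside the RoR app.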
I am building a web application as a college project (using Python), where I need to read content from websites. It could be any website on the internet.
At first I thought of using screen scrapers like BeautifulSoup and lxml to read content (data written by authors), but I am unable to extract content with a single piece of logic, as each website is built to different standards.
Thus I thought of using RSS/Atom (with Universal Feed Parser), but I could only get the content summary. I want all the content, not just the summary.
So, is there a way to read any website's content with one piece of logic, using libraries like BeautifulSoup, lxml, etc.?
Or should I use the APIs provided by the websites?
My job becomes easy if it's a Blogger blog, as I can use the Google Data API, but the trouble is: would I need to write code against every different API for the same job?
What is the best solution?
Using the website's public API, when it exists, is by far the best solution. That is precisely why the API exists; it is the way the website administrators say "use our content". Scraping may work one day and break the next, and it does not imply the website administrators' consent to have their content reused.
You could look into content extraction libraries - I've used Full Text RSS (PHP) and Boilerpipe (Java). Both have web services available, so you can easily test whether they meet your requirements. You can also download and run them yourself and further modify their behavior on individual sites.
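If you'd rather stay in Python, here is a crude sketch of the kind of generic heuristic those libraries refine - strip the obvious non-content tags and keep the longer paragraphs. This will misfire on plenty of sites; it's a starting point, not a Boilerpipe replacement:

```python
import requests
from bs4 import BeautifulSoup

def extract_text(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Drop the parts that are almost never article content.
    for tag in soup(["script", "style", "nav", "header", "footer", "aside"]):
        tag.decompose()

    # Keep only reasonably long paragraphs - a rough boilerplate filter.
    paragraphs = [
        p.get_text(" ", strip=True)
        for p in soup.find_all("p")
        if len(p.get_text(strip=True)) > 80
    ]
    return "\n\n".join(paragraphs)
```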
I'm thinking of setting up a Google App that simply displays an RSS or Atom feed. The idea is that every once in a while (via a cron job or at the push of a magic button) the feed is read and copied into the app's internal data, ready to be viewed. This would be done in Python.
I found this page that seems to explain what I want to do, but it assumes I'm using some of the other Google products, as it relies on the Google API.
My idea was more along these lines: I add some new content, host it locally on my machine, go to the app's administration panel, push a button, and my (locally hosted) feed is read and copied.
My questions now are:
Is the RSS (or Atom; one is enough) format specified well enough to handle add/edit/delete?
Are there any flavors or variants I should worry about?
Has this been done before? That would save me some work.
One option is to use the universal feed parser library, which will take care of most of these issues for you. Another option would be to use a PubSubHubbub-powered service such as Superfeedr, which will POST updates to you in a pre-sanitized form, eliminating most of your polling and parsing issues.
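For what it's worth, a minimal feedparser sketch of the add/edit bookkeeping (the feed URL here is hypothetical): the entry id is your stable key, a newer updated timestamp on a known id is an edit, and in plain RSS/Atom a delete only shows up as an id vanishing from the feed.

```python
import feedparser

feed = feedparser.parse("http://localhost:8080/feed.atom")  # hypothetical local feed

for entry in feed.entries:
    guid = entry.get("id", entry.get("link"))  # stable key for add/edit detection
    updated = entry.get("updated", "")         # newer value on a known id = edit
    # Prefer the full content when the feed carries it; fall back to the summary.
    if "content" in entry:
        body = entry.content[0].value
    else:
        body = entry.get("summary", "")
    # Here you'd upsert (guid, updated, entry.get("title", ""), body)
    # into the app's internal data.
```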
What about using an additional library, for instance Feedparser?