Adding JSON files to .xcodeproj via python script?

So basically, I have a Python script that scrapes web data and stores it in JSON files. Now I want to add those JSON files to my Xcode project so I can display the data.
Right now my JSON files are stored in a folder. My first approach was to use a database and have Python push the JSON files into it. For now it seems to sort of work, but these JSON files are intended to be updated daily.
My second approach was to just copy those files from my Python script into a Resources folder I made inside Xcode; however, the files don't show up in the project.
Can anyone who has run into this point me in the right direction? Are there any flaws in my approach that I've missed?
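For what it's worth, if the Resources folder is added to Xcode as a folder reference (a blue folder) rather than a group, anything copied into it on disk is bundled automatically at build time, so no project-file edits are needed. A minimal sketch of the sync step, assuming hypothetical paths for the scraper output and the folder reference:

    import shutil
    from pathlib import Path

    # Hypothetical paths; adjust to your layout.
    SCRAPER_OUTPUT = Path("scraped_json")        # where the scraper writes its JSON
    XCODE_RESOURCES = Path("MyApp/Resources")    # added to Xcode as a folder reference

    XCODE_RESOURCES.mkdir(parents=True, exist_ok=True)
    for src in SCRAPER_OUTPUT.glob("*.json"):
        shutil.copy2(src, XCODE_RESOURCES / src.name)

If the folder was added as a plain group instead, files copied in from outside Xcode won't be registered in the project, which matches the behavior described above.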

Related

Refresh All Data on Excel Workbook in Sharepoint Using Python

To start, I managed to successfully run pywin32 locally, where it opened the Excel workbooks, refreshed the SQL query, then saved and closed them.
I had to download those workbooks locally from SharePoint and sync them with OneDrive to apply the changes.
My question is: would this be possible to do within SharePoint itself? Could I have a Python script scheduled on a server and have the process occur there in the backend through a command?
I use a program called Alteryx where I can have batch files execute scripts, and maybe I could use an API of some sort to accomplish this on a scheduled basis, since that's the only server I have access to.
I have tried looking on this site and other sources, but I can't find a post that addresses this specifically.
I use Jupyter Notebooks to write my scripts and Alteryx to build a workflow with those scripts, but I can use other IDEs if I need to.
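For reference, the working local refresh described above typically looks something like this with pywin32; a sketch, assuming Excel is installed on the machine running it and using a hypothetical workbook path:

    import win32com.client

    # Hypothetical local path to a workbook synced from SharePoint via OneDrive.
    WORKBOOK = r"C:\Users\me\OneDrive\Reports\sales.xlsx"

    excel = win32com.client.DispatchEx("Excel.Application")
    excel.Visible = False
    excel.DisplayAlerts = False
    try:
        wb = excel.Workbooks.Open(WORKBOOK)
        wb.RefreshAll()                          # refresh all queries/connections
        excel.CalculateUntilAsyncQueriesDone()   # wait for async refreshes to finish
        wb.Save()
        wb.Close()
    finally:
        excel.Quit()

Note that Microsoft does not support automating Excel this way in unattended server sessions, which is one reason moving the whole process into SharePoint itself is not straightforward.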

GitHub Actions - Where are downloaded files saved?

I've seen plenty of questions and docs about how to download artifacts generated in a workflow to pass between jobs. However, I've only found one thread about persisting downloaded files between steps of the same job, and am hoping someone can help clarify how this should work, as the answer on that thread doesn't make sense to me.
I'm building a workflow that navigates a site using Selenium and exports data manually (sadly there is no API). When running this locally, I am able to navigate the site just fine and click a button that downloads a CSV. I can then re-import that CSV for further processing (ultimately, it's getting cleaned and sent to Redshift). However, when I run this in GitHub Actions, I am unclear where the file is downloaded to, and am therefore unable to re-import it. Some things I've tried:
1. Echoing the working directory when the workflow runs, and setting up my pandas.read_csv() call to import the file from that directory.
2. Downloading the file and then printing os.listdir() to show the contents of the working directory. When I do this, the CSV file is not listed, which makes me believe it was not saved to the working directory as expected (which would explain why #1 doesn't work).
FWIW, the website in question does not give me the option to choose where the file downloads. When run locally, I hit the button on the site, and it automatically exports a CSV to my Downloads folder. So I'm at the mercy of wherever GitHub decides to save the file.
Last, because I feel like someone will suggest this: it is not an option for me to use read_html() to scrape the file from the page's HTML.
Thanks in advance!
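One way to take the guesswork out of the download location (an assumption on my part, not something from the thread) is to tell the browser explicitly where to save files, so the CSV lands in a known directory on the runner. A sketch for headless Chrome with Selenium:

    import os
    from selenium import webdriver

    # Download into a known folder inside the runner's workspace (hypothetical name).
    download_dir = os.path.join(os.getcwd(), "downloads")
    os.makedirs(download_dir, exist_ok=True)

    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")
    options.add_experimental_option(
        "prefs", {"download.default_directory": download_dir}
    )
    driver = webdriver.Chrome(options=options)

    # ... navigate the site and click the export button as usual ...
    # The CSV should then appear under download_dir, e.g.:
    # df = pandas.read_csv(os.path.join(download_dir, "export.csv"))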

How to download a folder on Dropbox as a zip file and save it locally?

I'm progressively adding images to a Dropbox folder remotely, which I then need to download on my Raspberry Pi 3.
The thing is, I only need the latest uploaded image in that folder, so that I can classify it remotely using some code deployed on my Raspberry Pi 3.
I don't know the Dropbox API well, so I don't know if there's any functionality to directly implement what I said above; instead I'm trying to download the entire folder with all the images locally and then select the image that I want.
Dropbox API v2 says they added functionality to download entire folders as zip files, but whenever I try to implement the code given in the API docs and save the file locally, the local zip file always says it's corrupt and can't be opened.
Does anyone know how this can be implemented in Python?
Edit: Or maybe shed light on whether there's a simpler way to download the latest uploaded image in a folder, without explicitly changing the code for that specific image's name or link?
https://www.dropbox.com/developers/documentation/http/documentation#files-download_zip
Start by getting the download working in a Linux terminal using curl, then work your way up to making the HTTP request with the Python Requests library. That way you can debug it systematically. Make sure there aren't any issues with file permissions on Dropbox or with API tokens.
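A zip that reports as corrupt is very often the result of writing the response out in text mode instead of binary. A sketch of the download_zip call from the docs linked above, using Requests, with a hypothetical access token and folder path:

    import json
    import requests

    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"   # hypothetical; generate one in the Dropbox App Console
    FOLDER_PATH = "/images"              # hypothetical Dropbox folder

    resp = requests.post(
        "https://content.dropboxapi.com/2/files/download_zip",
        headers={
            "Authorization": "Bearer " + ACCESS_TOKEN,
            "Dropbox-API-Arg": json.dumps({"path": FOLDER_PATH}),
        },
    )
    resp.raise_for_status()

    # Write the raw bytes in binary mode; text mode corrupts the archive.
    with open("images.zip", "wb") as f:
        f.write(resp.content)

As for the edit about fetching only the latest image: the official Dropbox Python SDK's files_list_folder returns file metadata with server_modified timestamps, so you can sort the entries, pick the newest file, and download just that one instead of the whole folder.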

Python script to download directory from URL

I want to copy my own photos in a given web directory to my Raspberry Pi so I can display them in a slideshow.
I'm looking for a "simple" script to download these files using Python. I can then paste this code into my slideshow so that it refreshes the pics every day.
I suppose that the python wget utility would be the tool to use. However, I can only find examples of how to download a single file, not a whole directory.
Any ideas how to do this?
It depends on the server used to host the images and whether the script can see a list of images to download. If that list isn't available in some form, e.g. a web page listing, or a JSON or XML feed, there is no way for a script to discover dynamically what files are there.
Another option is for a Python script to SSH into the server, list the contents of the directory, and then download the files. This presumes you have programmatic access to the server.
If access to the server is a no, and there is no dynamic list, then the last option would be to go to the page where you know the photos are, scrape their paths, and download them. However, this may also pick up unwanted data such as other images, icons, etc.
https://medium.freecodecamp.org/how-to-scrape-websites-with-python-and-beautifulsoup-5946935d93fe
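In the spirit of that article, a sketch of the scraping option: fetch the page, pull out links that look like photos, and download each one. The URL and file extensions here are assumptions:

    import os
    import requests
    from urllib.parse import urljoin
    from bs4 import BeautifulSoup

    PAGE_URL = "http://example.com/photos/"   # hypothetical directory-listing page
    DEST = "slideshow_pics"
    os.makedirs(DEST, exist_ok=True)

    html = requests.get(PAGE_URL).text
    soup = BeautifulSoup(html, "html.parser")

    # Follow every link that ends in a known image extension.
    for link in soup.find_all("a", href=True):
        href = link["href"]
        if href.lower().endswith((".jpg", ".jpeg", ".png")):
            url = urljoin(PAGE_URL, href)
            with open(os.path.join(DEST, os.path.basename(href)), "wb") as f:
                f.write(requests.get(url).content)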

How to upload to a folder and replace files using Python and Google Drive?

I crafted this little script, which helps me upload files from the terminal to Google Drive. I am going to use it with cron to make my life a bit easier.
This is the script I have: https://github.com/goranpejovic/drive-uploader/blob/master/upload.py
It essentially uploads a file to Google Drive and optionally converts it to Google Docs format. Now I want to implement two more things, but I am not sure how.
First, I want the ability to upload to folders remotely. Not just create a folder locally and upload it (though this would be nice too, now that I think of it), but upload a file into a remote folder in Google Drive. Is this possible? If so, any suggestions?
Second, I would like to be able to replace files that already exist on Drive. Obviously a duplicate filename doesn't matter to Google Drive, so I am assuming it has to do with the metadata I am passing as "body". Or is there some other way?
Thank you!
Creating a folder in Drive is similar to creating a file; the difference is the MIME type: "application/vnd.google-apps.folder". So instead of uploading a folder, you use files.insert to create a new one in Drive.
Creating or inserting a file in Drive, as you mentioned, will create a new file even if the name already exists. To replace a file, you have to update the existing file with the new content.
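A sketch of both points against the Drive v2 API (where files.insert lives), assuming service is an already-authorized Drive v2 client from google-api-python-client; the names and IDs are placeholders:

    from googleapiclient.http import MediaFileUpload

    # service is assumed: build("drive", "v2", ...) with valid credentials.

    # 1. Create a remote folder: the same insert call as for a file, folder MIME type.
    folder = service.files().insert(
        body={"title": "backups", "mimeType": "application/vnd.google-apps.folder"}
    ).execute()

    # Upload a file into that folder by listing the folder as a parent.
    service.files().insert(
        body={"title": "report.pdf", "parents": [{"id": folder["id"]}]},
        media_body=MediaFileUpload("report.pdf"),
    ).execute()

    # 2. Replace an existing file: update it in place instead of inserting again.
    existing_id = "EXISTING_FILE_ID"   # placeholder; look it up with files().list()
    service.files().update(
        fileId=existing_id,
        media_body=MediaFileUpload("report.pdf"),
    ).execute()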
