I have a Django application deployed on Cloud Run in Google Cloud Platform.
I need to schedule a task to run every day. The task should go through a table in the database and remove rows whose date exceeds today's date.
I have looked into Cloud Functions; however, when I try to create one, it appears to support only Flask, not Django.
Any ideas on how to proceed to build this scheduled function and access the database?
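One common approach, sketched below, is to put the cleanup in a Django management command and have Cloud Scheduler hit a Cloud Run endpoint that runs it once a day, so Cloud Functions is not needed at all. The app name `myapp`, the model `Entry`, and its `date` field are placeholders for your own names:

```python
# myapp/management/commands/purge_expired.py
from django.core.management.base import BaseCommand
from django.utils import timezone

from myapp.models import Entry  # placeholder app and model


class Command(BaseCommand):
    help = "Delete rows whose date has already passed."

    def handle(self, *args, **options):
        # Assumes a row should go once its date is before today; use
        # date__gt instead if future-dated rows are the ones to remove.
        deleted, _ = Entry.objects.filter(
            date__lt=timezone.localdate()
        ).delete()
        self.stdout.write(f"Deleted {deleted} rows.")
```

Cloud Scheduler can then send an authenticated request to a small Cloud Run view that calls `call_command("purge_expired")`, so the task reuses the same database settings the rest of the app already has.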
I have an on-site SQL Server which runs and posts relevant records to a data warehouse accessible via an API endpoint. I need to create a webhook to detect changes whenever rows are added to or deleted from the warehouse table. Preferably, the webhook should trigger a message to Azure Queue Storage via an HTTP trigger.
How can I go about this in Azure? I can't get my hands on any straightforward documentation or tutorial.
If it can't be done in Azure, are there any other third-party platforms with which I can create a webhook that detects changes to the table, given the endpoint's URL?
I have been able to create a webhook in ArcGIS which is currently running successfully on the same logic. However, I am now required to change this so that it is triggered by activity on the data warehouse API. Any help will be appreciated.
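If the warehouse API itself cannot emit events, one workaround is a small scheduled poller (e.g. an Azure Functions timer trigger) that compares the endpoint's current payload to the last one seen and, on a difference, drops a message onto Azure Queue Storage. A rough sketch using the `requests` and `azure-storage-queue` packages; the endpoint URL, connection string, and queue name are placeholders, and in a real Function the digest should be persisted somewhere durable (such as a blob) rather than in a module global:

```python
import hashlib

import requests
from azure.storage.queue import QueueClient

API_URL = "https://example.com/warehouse/records"  # placeholder endpoint
CONN_STR = "<storage-account-connection-string>"   # placeholder
QUEUE_NAME = "table-changes"                       # placeholder

queue = QueueClient.from_connection_string(CONN_STR, QUEUE_NAME)
last_digest = None


def check_for_changes():
    """Compare the endpoint's current payload to the last seen one."""
    global last_digest
    body = requests.get(API_URL, timeout=30).content
    digest = hashlib.sha256(body).hexdigest()
    if last_digest is not None and digest != last_digest:
        # Rows were added or deleted since the previous poll.
        queue.send_message("warehouse table changed")
    last_digest = digest
```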
I am trying to deploy an ETL script which extracts data from BigQuery (via pandas-gbq) and Google Sheets, and then uploads the transformed whole back to BigQuery. I want to deploy it as a Flask app on App Engine.
I am using the Sheets API to access Google Sheets and pandas-gbq to access BigQuery. I have increased the app timeout to 6000 seconds. While I get a response for a small number of rows (~100), for larger loads the app boots workers with increasing PIDs and then shuts down.
I do not get an error message, and the status of the job displays "Ran Successfully"; however, the data is not appended to the correct location, as it was when the number of rows was small or when I ran the script locally.
Do I need more computing power from the VM, or is there another way to run the process? And what would be the best way to deploy a bunch of such apps, scheduled via a cron job to run at various times in a week?
Rewriting all the scripts would be a difficult task, so any method to deploy them directly via App Engine would help.
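For what it's worth, workers re-spawning with new PIDs followed by a silent stop usually points to the serving worker being killed for memory or request-deadline reasons rather than the script itself failing. One mitigation to try is uploading in batches instead of one large frame, which `pandas-gbq` supports directly; the table and project names below are placeholders:

```python
import pandas_gbq


def upload(df):
    # Append in smaller batches so no single request holds the
    # whole payload in memory at once.
    pandas_gbq.to_gbq(
        df,
        destination_table="my_dataset.my_table",  # placeholder
        project_id="my-project",                  # placeholder
        if_exists="append",
        chunksize=10_000,
    )
```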
Thanks in advance.
I have a project on Google App Engine. I have set up Cloud Scheduler to make a GET request every 24 hours to a certain endpoint on the app, which invokes a simple Python script. The script simply reads a Google Sheet and updates Cloud Firestore with the data from the sheet. It was working perfectly, but for the past couple of days it has failed to update the database at the scheduled time and gives an error. When I trigger it manually from the console, it works just fine, so the problem does not seem to be with my script itself. Does anyone have an idea what could be causing the problem?
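For reference, a stripped-down sketch of what the endpoint does; the sheet ID, range, and collection name are placeholders:

```python
from google.cloud import firestore
from googleapiclient.discovery import build


def sync_sheet_to_firestore():
    sheets = build("sheets", "v4")  # uses application default credentials
    rows = (
        sheets.spreadsheets()
        .values()
        .get(spreadsheetId="SHEET_ID", range="Sheet1!A2:C")  # placeholders
        .execute()
        .get("values", [])
    )

    db = firestore.Client()
    for key, *fields in rows:  # assumes the first column is the document ID
        db.collection("records").document(key).set({"fields": fields})
```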
I don't think there is enough information in your question, but I think you should analyze the logs. In Cloud Scheduler > Jobs you will find a "Logs" column, which contains a link for every job. You can access Stackdriver Logging for that particular job directly from there.
I hope this will help!
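If you prefer to pull those logs programmatically, here is a small sketch using the `google-cloud-logging` client; the job name is a placeholder:

```python
from google.cloud import logging

client = logging.Client()
job_filter = (
    'resource.type="cloud_scheduler_job" '
    'AND resource.labels.job_id="my-daily-job"'  # placeholder job name
)
# Most recent scheduler runs first, including any error details.
for entry in client.list_entries(filter_=job_filter, order_by=logging.DESCENDING):
    print(entry.timestamp, entry.payload)
```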
We are building a simple single-page website using Flask, to be deployed on GKE. It runs queries against MSSQL databases (used by another application), and we want to use Celery with a Google Cloud Memorystore Redis instance to run those queries on a schedule once a day, then use the result data on the website for that day, as we do not want to query the databases every time there is a visitor to the site (the data is mostly static for a day).
Now, I am quite new to software development, and particularly to DevOps. After reading up on resources online, I couldn't find much about this and am still unsure how it works.
Is the result data stored in the Redis result backend (Google Cloud Memorystore) for the entire day after the Celery task completes, so that it can be accessed at any time in my Python code via the task result whenever a user visits the site? Or should I access the data in the result backend with another query in my code? Or is the data in the result backend only temporary until the task is marked as done, and cannot be accessed throughout the day? How do I move forward? Can someone please point this out?
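A note on that last point: Celery's Redis result backend keeps a task's result only until `result_expires` elapses (one day by default), and fetching it later requires the task ID. One alternative, sketched below, is to have the scheduled task write its result under a fixed Redis key that the Flask view reads on every visit. The hosts, key name, and the query helper are all placeholders:

```python
import json

import redis
from celery import Celery

# Placeholder Redis endpoints; Memorystore gives you the host and port.
celery_app = Celery(
    "site",
    broker="redis://10.0.0.3:6379/0",
    backend="redis://10.0.0.3:6379/1",
)
cache = redis.Redis(host="10.0.0.3", port=6379, db=2)


@celery_app.task
def refresh_report():
    rows = run_mssql_queries()  # hypothetical helper running the MSSQL queries
    # Store under a fixed key so the Flask view never needs a task ID.
    cache.set("daily_report", json.dumps(rows))


def get_report():
    # Called from the Flask view on each visit; reads the cached copy.
    raw = cache.get("daily_report")
    return json.loads(raw) if raw else None
```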
I've written a Python script that uses Selenium to scrape information from a website and stores it in a CSV file. It works well on my local machine when I execute it manually, but I now want to run the script automatically once per hour for several weeks and save the data in a database. It may take about 5-10 minutes to run the script.
I've just started off with Google Cloud, and it looks like there are several ways of implementing this with either Compute Engine or App Engine. So far, I get stuck at a certain point with all three approaches I have found (e.g. getting the scheduled task to call a URL of my backend instance and getting that instance to kick off the script). I've tried to:
Execute the script via Compute Engine and use datastore or cloud sql. Unclear if crontab can easily be set up.
Use Task Queues and Scheduled Tasks on App Engine.
Use backend instance and Scheduled Tasks on App Engine.
I'd be curious to hear from others what they would recommend as the easiest and most appropriate approach, given that this is truly a backend script that does not need a user-facing front end.
App Engine is feasible, but only if you limit your use of Selenium to a `.remote` out to a site such as http://crossbrowsertesting.com/ -- feasible but messy.
I'd use Compute Engine -- cron is trivial to use on any Linux image; see e.g. http://www.thegeekstuff.com/2009/06/15-practical-crontab-examples/ !
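As an illustration of that route, here is a minimal headless entry point that cron can invoke hourly on a Linux VM; the target URL and script path are placeholders:

```python
# Sketch of the scraper entry point for a headless Linux VM.
# A crontab line such as the following would run it hourly:
#   0 * * * * /usr/bin/python3 /opt/scraper/scrape.py
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")  # no display on a server VM
options.add_argument("--no-sandbox")

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://example.com")  # placeholder target site
    print(driver.title)
finally:
    driver.quit()
```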