How to deploy dash app with states using Heroku? - python

So I successfully deployed my Dash app with Heroku.
My app has many tabs and saves the state of each tab. Meaning if the user changes the table in one tab, switches to another tab, and then comes back, the table still has the same content as before.
The problem is that I don't want the state to be saved if the user exits the site and then enters again.
So far during development I achieved this by re-running the Python command that launches the app, but now I can't do this (I launched the app in a Docker container and it seems that one image is shared between all the sessions).
Is there a way in Heroku to solve this problem? Maybe create a new image for each new session?
Thanks in advance.

You create an image to release your application (say v1.0) and run it on a Docker environment (Heroku or other platforms). The application runs in a container and serves all users: every restart or redeploy of a new image requires downtime, which impacts everyone.
The solution (from what I understand) is that you want to clear the session data for a given user, so the saved state won't be used at the next access.
I think you should do that in your app: for example, when the home URL (i.e. the entrypoint of your app) is accessed, clear any settings related to the user (whom you would typically recognize with a cookie).
Not sure if that helps; if not, share more details on how you save the user state.
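A minimal sketch of that idea, assuming the per-user tab state lives in some server-side store keyed by a session cookie (`session_store` and `reset_user_state` are hypothetical names, not Dash or Flask APIs):

```python
# Hypothetical server-side store: maps a session-cookie value to the
# saved per-tab state for that user. In a real app this could be a
# dict, Redis, or flask.session itself.
session_store = {
    "cookie-abc": {"tab1": {"table": [[1, 2], [3, 4]]}},
}

def reset_user_state(store, session_id):
    """Drop any saved tab state for this session so a fresh visit
    to the entrypoint starts from a clean slate."""
    store.pop(session_id, None)

# In a Flask/Dash app you would call this from whatever serves the
# home URL, e.g. (sketch only):
#
#   @app.server.route("/")
#   def index():
#       reset_user_state(session_store, request.cookies.get("session_id"))
#       ...
```

This way the state survives tab switches within a visit but is wiped whenever the user re-enters through the entrypoint, with no restart or new image needed.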

Related

Django/Wagtail Deployment on Digital Ocean destroys admin session when navigating

Experienced with other cloud hosting providers, I tried Digital Ocean for the first time to set up a Wagtail app (it should become a staging/production environment with a pipeline in the future). Following this tutorial (but just deploying a SQLite database, not a solid one), everything works fine. The app is recognized as a Python app when cloning it from GitHub, and the default build process (via the Python buildpack) followed by running with the Gunicorn server is executed as expected; there is no Dockerfile provided. Afterwards the frontend works as expected when first opening it. The admin panel lets me log in, but when navigating to page editing it destroys the session and I'm faced with the login panel again, probably an auto logout due to an expired session. The django-admin reacts the same way.
The tutorial uses get_random_secret_key. Maybe this is not accepted by Digital Ocean? Another maybe important piece of information: the set-cookie header at first contains an expiry date one year out (as configured). But after the session is destroyed it's set to 1970 (probably something like a null value). Actually, this is just the indicator of the force-ended session, I guess.
Since it's not easy to find out whether this has to do with the code or with safety measures, I didn't share code. But I can do that of course, if needed. It's probably an issue not just for me, and a hint at the cause could help other developers struggling with this too.

Running a python script saved in local machine from google sheets on a button press

I am trying to create Jira issues from data populated in a row in a Google Sheet. I plan to put a button on the sheet that reads the contents of the row and creates the Jira issues. I have figured out the Jira API and wrote the script for it, and also used the Google Sheets API to read the row values to feed into the Jira API.
How do I link the button to the Python script on my local machine in a simple manner? I went through other similar questions here, but they are quite old, and I'm hoping some new way might be available by now.
Please help me achieve this in a simple way; any help is greatly appreciated.
Thank you and stay safe.
Google Sheets cannot run code on your local machine. That means you have a few options:
Click the button locally
Instead of clicking a button on the Google Sheet, you can run the script yourself from the command line. This is probably how you tested it, and not what you want.
Watch the spreadsheet
You could set up your Python script to run every few minutes. This has the benefit of being very straightforward to set up (google "cron jobs"), but there is no button, and it may be slower to update. Also, it stops working if you turn off your computer.
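That polling approach might look like the sketch below; `fetch_new_rows` and `create_jira_issue` are hypothetical placeholders for the Sheets-reading and Jira-creating code you already have:

```python
import time

def fetch_new_rows():
    """Hypothetical: read unprocessed rows via the Google Sheets API."""
    return []  # placeholder

def create_jira_issue(row):
    """Hypothetical: create one Jira issue via the Jira API."""
    pass  # placeholder

def poll_once(fetch=fetch_new_rows, handle=create_jira_issue):
    """One polling pass: fetch pending rows and handle each one.
    Returns the number of rows processed."""
    rows = fetch()
    for row in rows:
        handle(row)
    return len(rows)

# To run continuously, sleep between passes (or let cron invoke the
# script every few minutes instead):
#
#   while True:
#       poll_once()
#       time.sleep(300)  # every 5 minutes
```

The drawback mentioned above still applies: the loop (or cron job) only runs while your machine is on.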
Make the script available for remote execution
You can make it so that your script can be run remotely, but it requires extra work. You could buy a domain and point it at your computer (using dynamic DNS), then have the Google Sheet request your new URL. This is a lot of work and costs real money, so it is probably not the best way.
Move the script into the cloud
This is probably what you want: cut your machine out of the loop. You can use Google Apps Script and rewrite your Jira code there. You can then configure the Apps Script to run on a button click.
Unfortunately, you really can't get a button press in a Google Sheet to launch a local Python script; Google Sheets and your browser cannot access your local files and programs that way.
You can create a button that runs a Google Apps Script (GAS). This is some code based on JavaScript, attached to the spreadsheet, hosted/run by Google. Here's a tutorial on how to run via button press.
If you can port your script into GAS, that is one solution.
If you want to keep the script in Python, you basically need to deploy it and then use GAS to call your Python script. The simplest way I can think of (which is not super simple, but is totally doable!) is as follows:
1. Make your Python script into an API.
Use something like Flask or FastAPI to set up your own API. The aim is that when a certain URL is visited, it will trigger your Python program to run a function which does all the work. With FastAPI it might look like this:
from fastapi import FastAPI

app = FastAPI()

def main():
    print("Access Google Sheet via API...")
    # your code here
    print("Upload to JIRA via API...")
    # your code here

@app.get("/")
def root():
    main()
    return {"message": "Done"}
Here, "/" is the API endpoint. When you visit (or make a "get" request) to the URL of the deployed app, simply ending in "/", the root function will get called, which calls your main function. (You could set up different URL endings to do different things).
We can test this locally. If you follow the setup instructions for FastAPI, you should be able to run the command uvicorn main:app --reload which launches a server at http://127.0.0.1:8000. If you visit that URL in your browser, the script should get run and the message "Done" should appear in your browser.
2. Deploy your Python app
There are many services that can host your Python program, such as Heroku or Google Cloud. They may offer free trials but this generally costs money. FastAPI has instructions for deploying to Deta which seems to currently have a free tier.
When your app is up and running, there should be an associated web address such as "https://1kip8d.deta.dev/". If you access this in the browser, it will run your script and return the "Done" message.
3. Hit your Python API from Google Sheets, using GAS
The last step is to "hit" that URL using GAS, instead of visiting it manually in the browser. Following the tutorial mentioned above, create a GAS script linked to your spreadsheet, and a button which is "assigned" to your script. The script will look something like this:
function myFunction() {
  var response = UrlFetchApp.fetch("https://1kip8d.deta.dev/");
  Logger.log(response.getContentText());
}
Now, whenever you press the button, GAS will visit that URL, which will cause your Python script to execute.
You might want to check out Google Colaboratory. It's a service by Google that can host your Python code (called a "notebook"), connect with your Google Drive (and other Google services), and make calls out to web endpoints (which would be your Jira server). I think those are the three pieces you're dealing with here.
Just to be clear... your code wouldn't be local anymore (if that's really important to you). Instead, it would be hosted by Google. The notebooks are saved to your Google Drive account, so you get the security that provides.

My callback webhook is overloaded - what can I do?

I use an API to synchronise a lot of data to my app every night. The API itself uses a callback system, so I send a request to their API (including a webhook URL) and when the data is ready they will send a notification to my webhook to tell me to query the API again for the data.
The problem is, these callbacks flood in at a high rate (thousands per minute) to my webhook, which puts an awful lot of strain on my Flask web app (hosted on a Heroku dyno) and causes errors for end users. The webhook itself has been reduced down to forwarding the message on to a RabbitMQ queue (running on separate dynos), which then works through them progressively at its own pace. Unfortunately, this does not seem to be enough.
Is there something else I can do to manage this load? Is it possible to run a particular URL (or set of URLs) on a separate dyno from the public facing parts of the app? i.e. having two web dynos?
Thanks
You can deploy your application with the SAME code on more than one dyno using the free tier. For example, your application is called rob1 and hosted at https://rob1.herokuapp.com, with source code accessible from https://git.heroku.com/rob1.git. You can create an application rob2, accessible from https://rob2.herokuapp.com, with source code hosted at https://git.heroku.com/rob2.git.
Then you can push your code to the 2nd application:
$ cd projects/rob1
$ git remote add heroku2 https://git.heroku.com/rob2.git
$ git push heroku2 master
As a result, you have a single repo on your machine and 2 identical Heroku applications running the code of your project. You'll probably need to copy the environment parameters of the 1st app to the 2nd one.
But anyway, you'll have 2 identical apps on the free tier.
Later, once you have obtained a domain name, for example robsapp.example.org, you can give it two CNAME DNS records pointing to your Heroku apps to get load balancing, like this:
rob1.herokuapp.com
rob2.herokuapp.com
As a result, your application webhooks are available on robsapp.example.org and requests are automatically load-balanced between the rob1 and rob2 apps.

best practice for multi user flask application

I have built a Python script that sends users Telegram notifications about things happening in their account on another service.
For this, a user needs to specify API keys for said service so that my script can pull the required information.
Currently, for a new user, I manually create a new folder on my VPS, create a new venv and a new settings file, and run the application from a screen session named after the user. This is becoming tedious with 10+ users, especially with updates to the script.
I am currently building a Flask-based website where users can log in and set their API keys and other parameters on their own dashboard.
What I want to achieve:
if a user registers, a new instance of the script has to be created, with a settings file next to it containing the user's information
the user should have the option to start/stop said application from the dashboard
if I release an update to the script, I want to deploy it to all users at once and restart their script if it was running
basically, the Flask website should only act as a configuration dashboard/frontend for the script that runs on my server, so that people don't need their own VPS or to leave their private machine running 24/7
How do I go about this? Is it "just" file handling, creating new folders and files from a blueprint after a user registers? Are there better practices?
I tried to find answers via Google and the Stack Overflow search, but I did not find a specific recommendation for this use case.
If anyone could point me towards a resource on that or even better an example somewhere I'd really appreciate it!
Thanks in advance.
You should have only one script, with all the configurations saved in a database; when you need to dispatch a notification, just pass the right parameters (the user's API keys and settings) to the script.
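A minimal sketch of that idea, using SQLite from the standard library (the `users` table layout and the `send_notification` helper are assumptions for illustration, not the asker's actual schema):

```python
import sqlite3

def load_users(conn):
    """Read every user's settings from a single shared `users` table."""
    rows = conn.execute(
        "SELECT username, api_key, chat_id FROM users"
    ).fetchall()
    return [{"username": u, "api_key": k, "chat_id": c} for u, k, c in rows]

def send_notification(user):
    """Hypothetical: pull the user's account data with their API key
    and push a Telegram message to their chat_id."""
    print(f"notify {user['username']}")

def run_all(conn, notify=send_notification):
    """One pass of the single shared script: loop over all configured
    users and run the same code with per-user parameters."""
    users = load_users(conn)
    for user in users:
        notify(user)
    return len(users)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (username TEXT, api_key TEXT, chat_id TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'key-1', '42')")
    run_all(conn)
```

With this layout, registering a user is a single INSERT, start/stop from the dashboard is a flag in the same table, and an update means redeploying one script instead of touching one folder per user.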

Run some code whenever I upload the project to the App Engine server

I've built an App Engine project. How can I run some piece of code on the app server only once, i.e. whenever I upload the whole project to the server?
How should I achieve this task?
There isn't an official way to discover whether your application has been modified, although each time you upload your application it gets a unique version number {app version.(some unique number)}; but since there isn't a documented API for getting it, I wouldn't take the risk of using it.
What you need to do is have a script that uploads your application, and when the script is done it calls a handler in your application that sets a value in the datastore marking the application as new.
Once you have that, you can look for that value in the datastore in your handlers and run your one-time code if you find it.
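A rough sketch of that pattern, using a plain dict as a stand-in for the datastore (in a real App Engine app the reads and writes below would go through the datastore API instead):

```python
# Stand-in for the datastore: in a real app these reads/writes would
# target a datastore entity, keyed on something like ("Deploy", "marker").
datastore = {}

def mark_as_new():
    """Called via an HTTP handler by the deploy script, right after
    the upload finishes."""
    datastore["is_new_deploy"] = True

def run_once_after_deploy(task):
    """Called from your normal handlers: run `task` only if the marker
    is set, then clear it so the task never runs twice."""
    if datastore.pop("is_new_deploy", False):
        task()
        return True
    return False
```

The deploy script's last step hits the handler that calls mark_as_new(); every subsequent request checks and clears the marker, so the one-time code runs on the first request after each upload.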
