Django website with a secondary local SQLite database - Python

I have created a web application in Django which functions like an interactive dashboard. The problem is that it contains data that I can't put in an online environment; it has to stay in local SQL storage. The application has to run on three computers that do not have access to the command prompt, so I can't run a local Django project on each of them.
Is the following possible?
Host a Django website (just a shell) with user authentication and an online database that only manages the users.
Each computer has its own local SQL storage that Django accesses, i.e. a SQLite file that is present on the desktop. In other words, the SQL database that I want Django to access is not on the hosting machine but on the recipient's machine.
In this case, the worst-case scenario would be that someone breaks into the web application but cannot access the files, since they are only available locally.
Am I overlooking anything, or is this possible? I'm aware that this is very unusual and not very practical, but it is the only thing I can think of. If there is a better alternative, I would be open to suggestions.
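To make it concrete, something like this multi-database setup in settings.py is what I'm imagining (untested; the engine, host, and paths are all placeholders):

```python
# Rough idea only (untested). 'default' would be the hosted database
# that manages users; 'localdata' points at a SQLite file that lives
# on each recipient's desktop. The part I'm unsure about: Django only
# opens NAME as a path on the machine it runs on, i.e. the host.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'users_db',
        'HOST': 'my-hosted-db.example.com',   # placeholder
        'USER': 'app',
        'PASSWORD': 'secret',
    },
    'localdata': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': r'C:\Users\me\Desktop\data.sqlite3',  # file on the client machine
    },
}
```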
Regards

Related

How to move a MySQL database online with Python

I have built an app that uses a MySQL database with Python. I would love to share some functionality with different applications, which calls for an online database. Kindly give me some insight into how I can move a Python MySQL database online and how to make calls to it, in order to facilitate sharing data between different applications.
I don't know exactly what you mean by a Python database, but there are some options here that you might want to consider.
First, you could use Heroku to host your app and Heroku Postgres to host your database. Alternatively, you can use an AWS EC2 machine to host both your app and its database (in case it's custom code that you can't call from a browser using Heroku). With both of these options you can access your database and the app; with the second one you can also install other services, such as SSH.
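For example, here is a minimal sketch of connecting to a Heroku Postgres database from Python (assuming the psycopg2 package is installed and that Heroku has set the DATABASE_URL config var, as it does for Heroku Postgres):

```python
import os
import psycopg2

# Heroku exposes the database location as a postgres:// URL in the
# environment; psycopg2 accepts it directly as a connection string.
conn = psycopg2.connect(os.environ['DATABASE_URL'])
cur = conn.cursor()
cur.execute('SELECT version()')
print(cur.fetchone())
cur.close()
conn.close()
```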

appengine set up local host with datastore for testing

I have tried to follow Google's documentation on how to set up local development using a database (https://cloud.google.com/appengine/docs/standard/python/tools/using-local-server#Python_Using_the_Datastore). However, I do not have the experience level to follow along. I am not even sure if that was the right guide. The application is a Django project that uses Python 2.7. To run the local host, I usually type dev_appserver.py --host 127.0.0.1 .
My questions are:
1. How do I download the Datastore database from Google Cloud? I do not want to download the entire database, just enough data to populate localhost so I can run tests.
2. Once the database is downloaded, what do I need to do to connect it to localhost? Do I have to change a parameter somewhere?
3. Do I need to download the Datastore at all? Can I just make a duplicate on the cloud and then connect to that Datastore?
4. When I run localhost, should it not already be connected to the Datastore, since the site works when it is running on the cloud? Where can I find the connection URI?
Thanks for the help
The development server is meant to simulate the whole App Engine Environment, if you examine the output of the dev_appserver.py command you'll see something like Starting Cloud Datastore emulator at: http://localhost:PORT. Your code will interact with that bundled Datastore automatically, pushing and retrieving data according to the code you wrote. Your data will be saved on a file in local storage and will persist across different runs of the development server unless it's explicitly deleted.
This option doesn't provide facilities to import data from your existing Cloud Datastore instance, although it's a ready-to-go solution if your testing procedures can afford populating the local database with mock data through a custom script that does so programmatically. If you decide on this approach, just write the data-creation script and execute it before running the tests.
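For instance, on the Python 2.7 runtime such a script could be a throwaway handler you hit once in the browser while the dev server is running - a hypothetical sketch, assuming an ndb model of your own (the Widget model here is made up):

```python
import webapp2
from google.appengine.ext import ndb

class Widget(ndb.Model):
    # Made-up model purely for illustration; use your real models.
    name = ndb.StringProperty()

class SeedHandler(webapp2.RequestHandler):
    def get(self):
        # Put a handful of mock entities into the local emulator.
        ndb.put_multi([Widget(name='widget-%d' % i) for i in range(10)])
        self.response.write('Seeded 10 widgets')

app = webapp2.WSGIApplication([('/seed', SeedHandler)])
```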
Now, there is another option to simulate a local Datastore using the Cloud SDK, which comes with handy features for your purposes. You can find the available information under the Running the Datastore Emulator documentation page. This emulator has support for importing entities downloaded from your production Cloud Datastore, as well as for exporting them into files.
Back to your questions:
1. Export data from the Cloud instance into a GCS bucket following this, then download the data from the bucket to your filesystem following this, and finally import the data into the emulator with the command shown here.
2. To use the emulator, you need to first run gcloud beta emulators datastore start in a Cloud Shell and then, in a separate tab, run dev_appserver.py --support_datastore_emulator=true --datastore_emulator_port=8081 app.yaml.
3. The development server uses one of the two aforementioned emulators; in both cases it is not connected to your Cloud Datastore. You might create another project for development purposes with a copy of your database and deploy your application there, so you don't use the emulator at all.
4. Requests to Datastore are made through the endpoint https://datastore.googleapis.com/v1/projects/project-id, although this is not related to how the emulators manage connections in your local server (see the sketch below).
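As a sketch of that last point: with the emulator running and its environment variables set (gcloud beta emulators datastore env-init prints them), the client library talks to the local emulator instead of the production endpoint. This assumes the google-cloud-datastore package and uses a placeholder project id:

```python
from google.cloud import datastore

# With DATASTORE_EMULATOR_HOST set, this client targets the emulator;
# without it, it would hit https://datastore.googleapis.com instead.
client = datastore.Client(project='my-dev-project')  # placeholder id

key = client.key('Widget', 'sample')
entity = datastore.Entity(key=key)
entity['name'] = 'sample widget'
client.put(entity)
print(client.get(key))
```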
Hope this helps.

How to deploy flask GUI web application only locally with exe file?

I'd like to build a GUI for a few Python functions I've written that pull data from MS SQL Server. My boss wants me to share the magic of Python & SQL with the rest of the team, without them having to learn any coding.
I've decided to go down the route of using Flask to create a webapp and creating an executable file using pyinstaller. I'd like it to work similarly to Jupyter Notebook, where you click on the file and it opens the notebook in your browser.
I was able to hack together some code to get a working prototype of the GUI. The issue is I don't know how to deploy it. I need the GUI/Webapp to only run on the local computer for the user I sent the file to, and I don't want it accessible via the internet (because of proprietary company data, security issues, etc).
The only documentation I've been able to find for deploying Flask covers the routine route of deploying to a web server.
So the question is, can anyone provide any guidance on how to deploy my GUI WebApp so that it's only available to the user who has the file, and not on the world wide web?
Thank you!
So, a few assumptions: since you're a business and you're running SQL Server, you likely have Active Directory, and the computers that need to access this app are all hooked into that domain (so, in reality, you or your system admin do have full control over those computers).
Also, the primary function of the app is to pull data from SQL Server before doing something with that data. If you're deploying the app to each user, I'm guessing you're also shipping the SQL Server login details along with it.
With that in mind, I would just serve the Flask app on the network on its own machine (maybe even the SQL Server machine, if you have the choice), and then either implement security within the app that authenticates against AD, or just use a simple user/pass login you can distribute to users. By default, random computers online aren't going to be able to access that app unless you've set your firewalls to deliberately route WAN traffic to it.
That way, you control the Flask server-- updates only have to occur at one point, making development easier, and users simply have to open up a link in an email you send, or a shortcut you leave on their desktop.
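As one possible shape for the simple user/pass option - a minimal sketch, assuming Flask is installed, with placeholder credentials and route names:

```python
from functools import wraps
from flask import Flask, Response, request

app = Flask(__name__)

USERNAME = 'team'        # placeholder credentials you distribute
PASSWORD = 'change-me'

def requires_auth(view):
    @wraps(view)
    def wrapped(*args, **kwargs):
        auth = request.authorization
        if not auth or (auth.username, auth.password) != (USERNAME, PASSWORD):
            # Ask the browser for HTTP Basic credentials.
            return Response('Login required', 401,
                            {'WWW-Authenticate': 'Basic realm="Reports"'})
        return view(*args, **kwargs)
    return wrapped

@app.route('/')
@requires_auth
def index():
    return 'Hello, authenticated team member!'

if __name__ == '__main__':
    # 0.0.0.0 exposes the app on the LAN; WAN access stays blocked
    # unless you deliberately forward the port at the firewall/router.
    app.run(host='0.0.0.0', port=5000)
```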
Unfortunately, you do not have control over a given user's computer.
You are using Flask, so your application is a web application that exposes your data on some port. I believe the default Flask port is 5000.
Regardless, if your user opens that port in their firewall, and it is also open on whatever router they are connected to, then your application will be publicly visible.
There is nothing that you can do from your python application code to prevent this.
Having said all of that, if you are running on port 5000, it is highly unlikely your user will have that port publicly exposed. If you are running on port 80 or 8080, the chances are higher that you might be exposing something.
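As an aside, if the goal is strictly "this machine only", binding the server to the loopback interface (which is also Flask's default) keeps other hosts out regardless of firewall settings - a small sketch:

```python
from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return 'Local-only GUI'

# 127.0.0.1 is only reachable from the user's own machine, so nothing
# is exposed to the LAN or the internet even if firewall rules change.
app.run(host='127.0.0.1', port=5000)
```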
A follow-up question would be: where is the database your web app is connecting to? Is it also on your user's machine? If not, and your web app can connect to it regardless of whose machine it runs on, I would be more concerned about your DB being publicly exposed.

What Git workflow do I use to avoid overwriting my server-side Django users with my local dummy/test users?

I am developing a Django application locally in Atom and hosting it on PythonAnywhere. When I'm ready, I sync my changes to GitHub and then run a git pull in a PythonAnywhere console. The problem is that, at some point, those changes pushed from my local development environment overwrote all my live users with the local dummy users I'd been using for testing and development.
Basically, I was testing the login and registration system locally, logging in and registering with lots of dummy emails. Once I was happy it was working, I synced the Django code I'd changed to GitHub (with the desktop app) and then pulled it down in a PythonAnywhere (my server) console. The SQLite DB is included in those updates/syncs - is that correct? Or should it just be totally ignored?
I just realised that one (perhaps all?) of those pushes has overwritten my SQLite DB, and there were perhaps 30 or so actual users who had signed up on the website whose data is no longer on the site. I managed to find a trace of them in the Django admin logs, and I've found the version history of the SQLite DB on GitHub, but my question is: how do I avoid this happening?
What is the workflow to avoid this situation in the future? And is there a command I can run in a shell to get those users back into my live database from the backed-up SQLite file?
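The best I've come up with so far for the restore is a stdlib script like this (untested guess on my part; it assumes both files share the same schema, that auth_user is the Django user table, and that the web app is stopped while it runs):

```python
import sqlite3

# Live database and the backup restored from GitHub (placeholder paths).
live = sqlite3.connect('db.sqlite3')
live.execute("ATTACH DATABASE 'backup.sqlite3' AS backup")

# Copy over any user present in the backup but missing from the live DB.
live.execute("""
    INSERT INTO auth_user
    SELECT * FROM backup.auth_user
    WHERE id NOT IN (SELECT id FROM auth_user)
""")
live.commit()
live.close()
```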
I know this is a simple question, but I'm new to development, and while I'm slowly getting there with troubleshooting code, versioning, Git, and workflow are tricky things to get my head around.

Is it possible (and safe) to insert into and update a MySQL database using Python locally?

I'm pretty new to Python, but I have been running a few programs locally using Komodo Edit and then uploading the results manually to my website's MySQL database.
I'm looking into letting Python do this on its own, but as I understand it, to do this I have to open my MySQL database to anyone, regardless of whether they are running scripts on my server or not.
I'm guessing this restriction exists for security reasons, but I don't know how vulnerable it would make my site. Is it a bad idea to do it this way, or would it be better to run my Python program from the server itself? (I've never run Python code from my server, and my Python code, too, might be insecure.)
If you have access to the entire server (i.e. not just the hosting directory, as is common on some shared hosting setups) and can SSH into the server, then your safest (though not easiest) option is to place the script on the server, outside of the web hosting folder. This will stop anyone from remotely accessing the script, and will let you connect to the DB without enabling remote connections.
You could enable remote connections if your hosting setup allows it (some hosting companies disable or prevent this, and you may have to enable it from the start when you create the database). Just select a nice strong password. Then you can use your script locally, and you'd be as secure as your password.
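If you do go the remote-connection route, the script side is just a normal parameterised connection - a sketch, assuming the mysql-connector-python package and placeholder credentials and table names:

```python
import mysql.connector

conn = mysql.connector.connect(
    host='db.example.com',            # your hosting provider's MySQL host
    user='app_user',
    password='a-long-random-password',
    database='site_db',
)
cur = conn.cursor()
# Parameterised queries keep the inserts safe from SQL injection.
cur.execute(
    'INSERT INTO results (name, score) VALUES (%s, %s)',
    ('example', 42),
)
conn.commit()   # INSERT/UPDATE changes are not persisted until commit
cur.close()
conn.close()
```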
