I have created a chat server application using the Twisted framework. I am running it on my local machine and now I want to go global. The application is similar to omegle.com.
How can I deploy it on a third-party commercial server so that it runs continuously?
Do I need to get a dedicated server for it?
As per this SO answer,
You can deploy Twisted on any hosting
provider who gives you a shell prompt
and doesn't limit your long-running
processes.
Some examples that I've used include:
Tummy ltd. and Slicehost.
The hosting server need not be dedicated, in other words, as long as those conditions are met (and of course as long as you have enough quota of RAM, disk, bandwidth, etc, for your purposes).
Take a look at Python friendly hosts to get an idea of what is available and what it will cost you. Typically, you could get away with a shared hosting package as long as you have a shell. However, if your program begins serving tons of clients, you might need to move it to a dedicated host.
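As a point of reference, the kind of "long-running process" such a host must allow is just an ordinary script that never exits. Below is a minimal, hedged sketch of a Twisted TCP server (an echo server standing in for the real chat logic; the port and class names are illustrative, not the OP's code):

```python
# Minimal sketch of a long-running Twisted TCP server (illustrative only,
# not the actual chat application). Requires: pip install twisted
from twisted.internet import protocol, reactor

class Chat(protocol.Protocol):
    def dataReceived(self, data):
        # Echo back whatever the client sends; a real chat server would
        # route messages between paired clients here.
        self.transport.write(data)

class ChatFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return Chat()

reactor.listenTCP(8000, ChatFactory())
reactor.run()  # blocks forever -- this is the "long-running process"
```

On a shared host you would typically launch this inside screen/tmux, with nohup, or under whatever process supervisor the provider offers, so it keeps running after you log out.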
Related
I want to create a web form that stays on forever on a single computer. Users can come to the computer, fill out the form, and submit it. After submitting, it will record the responses in an Excel file and send emails. The next user can then come and fill out a new form automatically. I was planning on using Flask for this task since it is simple to set up, but since I am not doing this on a production server, I will just have it running locally in development mode on the single computer.
I have never seen anyone do something like this with Flask, so I was wondering if my idea is possible or if I should avoid it. I am also new to web development, so I was wondering what problems there could be with keeping a Flask application running 24/7 on a local development computer.
Thanks
There is nothing wrong with doing this in principle; however, it is likely not the best solution in terms of time-to-reward payoff.
First, to answer your question: this could easily be done, even by a beginner; completing it in a few hours with minimal Python and HTML experience is entirely realistic. Your app could crash in the background for many reasons (running out of space, bad memory addresses, etc.), but most likely you will be fine.
As for actually building it, it is all possible: there are libraries you can use to write the results to an Excel file, or you can simply append to a CSV (which is what I would recommend). Creating and sending an email is similarly straightforward, though again, doing it without Python would be much easier.
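To make that concrete, here is a minimal sketch of the approach described above, using Flask plus the standard library's csv and smtplib modules. The form fields, file path, email addresses, and SMTP host are placeholders, not anything from the question:

```python
# Minimal sketch: a Flask form that appends each submission to a CSV and
# sends a notification email. Field names, paths, and SMTP settings are
# placeholders -- adjust for your environment.
import csv
import smtplib
from email.message import EmailMessage
from flask import Flask, request, render_template_string

app = Flask(__name__)

FORM = """
<form method="post">
  Name: <input name="name">
  Comment: <input name="comment">
  <button type="submit">Submit</button>
</form>
"""

@app.route("/", methods=["GET", "POST"])
def form():
    if request.method == "POST":
        row = [request.form.get("name", ""), request.form.get("comment", "")]
        # Append the submission to a CSV file next to the app.
        with open("responses.csv", "a", newline="") as f:
            csv.writer(f).writerow(row)
        # Send a notification email (assumes an SMTP relay on localhost).
        msg = EmailMessage()
        msg["Subject"] = "New form submission"
        msg["From"] = "forms@example.com"   # placeholder
        msg["To"] = "you@example.com"       # placeholder
        msg.set_content("New submission: %s" % row)
        with smtplib.SMTP("localhost") as s:
            s.send_message(msg)
    # Always render a fresh, blank form for the next user.
    return render_template_string(FORM)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```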
If you are not set on flask/python, you could check out Google Forms but if you are set on python, or want to use it as a learning experience, it can definitely be done.
Your idea is possible and while there are many ways to do this kind of thing, what you are suggesting is not necessarily to be avoided.
All apps that run on a computer over a long period of time start a process and keep it going until closed. That is essentially what you are doing.
Having done this myself (and still currently doing it) at my business, I can say that it works great.
The only caveat is that to ensure that it will always be available, you need to have the process monitored by some tool to make sure that it gets restarted if it ever closes due to a variety of reasons.
On Linux, supervisor is a great tool for that. On Windows you could register it as a service. Alternatively, you could just provide an easy way for the user to restart it themselves if it is down when they need it.
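For example, a supervisor program entry along these lines (file location, paths, and names are hypothetical) will relaunch the process whenever it exits:

```ini
; Hypothetical /etc/supervisor/conf.d/formapp.conf -- all paths are placeholders
[program:formapp]
command=/home/user/venv/bin/python /home/user/formapp/app.py
directory=/home/user/formapp
autostart=true
autorestart=true
stdout_logfile=/var/log/formapp.out.log
stderr_logfile=/var/log/formapp.err.log
```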
Yes, this could be done. It's very similar to the applications that run on the servers in data centers.
To keep the application running forever, or to restart it after your system boots, you'll need a service manager similar to systemd on Unix. On Windows you could use NSSM (the Non-Sucking Service Manager) or Service Control to monitor your application and restart it if it crashes. The service will also have to be enabled to start on boot.
Other than this, you could use Waitress to serve your Flask application. Waitress is a WSGI web server with which you can easily configure the number of threads so it can serve multiple users at the same time.
In a production environment, it's always suggested to use a production-grade WSGI server like Gunicorn or Waitress rather than Flask's development server.
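For illustration, serving a Flask app with Waitress is a one-liner; this sketch assumes the Flask `app` object lives in a module named app.py (adjust the import to your project):

```python
# Minimal sketch: serve the Flask app with Waitress instead of the
# development server. Requires: pip install waitress
from waitress import serve
from app import app  # hypothetical module/attribute names

# Listen on all interfaces, port 8080, with a small thread pool.
serve(app, host="0.0.0.0", port=8080, threads=8)
```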
I plan to write a service for my *nix systems that can interact with a select number of GUI applications like Photoshop, Libre Office, etc. on the local machine.
The purpose for the local service is essentially to listen to a remote message and accordingly perform specific operations- for instance, changing the background of a layer in Photoshop, or adding margins to a page in Libre Office (or MS Office). You can assume that the application is active in the display environment of the operating system.
Now my question is:
Is this even possible? I personally find this task impossible unless I get to peek into the source code of these apps and basically augment these applications themselves. But since they are mostly proprietary apps, there are legal implications too.
Assuming it is somehow possible by changing the source of these apps, won't a team have to figure out the architecture and inner workings of each specific application, or are there general frameworks or concepts I should look into?
The UI should be updated from the main thread only. Alternatively, consider sending Windows messages to the target application.
The purpose for the local service is essentially to listen to a remote message and accordingly perform specific operations
Microsoft does not currently recommend, and does not support, Automation of Microsoft Office applications from any unattended, non-interactive client application or component (including ASP, ASP.NET, DCOM, and NT Services), because Office may exhibit unstable behavior and/or deadlock when Office is run in this environment.
If you are building a solution that runs in a server-side context, you should try to use components that have been made safe for unattended execution. Or, you should try to find alternatives that allow at least part of the code to run client-side. If you use an Office application from a server-side solution, the application will lack many of the necessary capabilities to run successfully. Additionally, you will be taking risks with the stability of your overall solution. Read more about that in the Considerations for server-side Automation of Office article.
I have prototyped a system using Python on Linux. I am now designing the architecture to move to a web-based system. I will use Django to serve public and private admin pages. I also need a service running which will periodically run scripts, connect to the internet, and allow API messaging with an admin user. Thus there will be three components: web server, api_service, and database.
1) What is the best mechanism for deploying a python api_service on the VM? My background is mainly C++/C#, and I would usually have deployed a C#-written service on the same VM as the web server and used some sort of TCP messaging wrapper for the API. My admin API code will be ad hoc python scripts run from my machine to execute functionality in this service.
2) All my database code is written to an interface that presently uses flat-files. Any database suggestion? PostgreSQL, MongoDB, ...
Many thanks in advance for helpful suggestions. I am an ex-windows/C++/C# developer who now absolutely loves Python/Cython and needs a little help please ...
Right, am answering my own question. Have done a fair bit of research since posting.
2) PostgreSQL seems a good choice. There seem to be no damning warnings against using it and there is much searchable help. I am therefore implementing concrete PostgreSQL classes to implement my serialization interfaces.
1) Rather than implement my own service in python that sits on a remote machine, I am going to use Celery. RabbitMQ will act as the distributed TCP message wrapper. I can put required functionality in python scripts on the VM that Celery can find and execute as tasks. I can run these Celery tasks in 3 ways. i) A web request through Django can queue a task. ii) I can manually queue a remote Celery task from my machine by running a python script. iii) I can use Celery Beat to schedule tasks periodically. This fits my needs perfectly as I have a handful of daily/periodic tasks that can be scheduled plus a few rare maintenance tasks that I can fire off from my machine.
To summarize then, where before I would have created a windows service that handled both incoming TCP commands and scheduled behaviour, I can use RabbitMQ, Celery, Celery Beat and python scripts that sit on the VM.
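As a rough illustration of what that looks like in code, here is a minimal, hedged Celery sketch covering the three ways of running tasks mentioned above. The module name, broker URL, task name, and schedule are illustrative, and it assumes RabbitMQ is reachable on localhost:

```python
# tasks.py -- minimal Celery sketch (module, broker URL, and task names are
# illustrative). Requires: pip install celery
from celery import Celery
from celery.schedules import crontab

app = Celery("tasks", broker="amqp://guest@localhost//")

@app.task
def nightly_maintenance():
    # Put the periodic work here (the scripts that live on the VM).
    print("running maintenance")

# iii) Celery Beat schedule: run the task every day at 02:00.
app.conf.beat_schedule = {
    "nightly": {
        "task": "tasks.nightly_maintenance",
        "schedule": crontab(hour=2, minute=0),
    },
}

# i) / ii) From a Django view or an ad hoc script on another machine,
# the same task can be queued with: nightly_maintenance.delay()
```

On the VM you would then run a worker with `celery -A tasks worker` and the scheduler with `celery -A tasks beat`.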
Hope this helps anybody with a similar 'how to get started' problem .....
I've been wanting to run my own server for a while and I figured that running one for my django website would be a good start. What do you recommend I use for this? I've been trying to use a Ubuntu Virtual Machine to run it on one of my old laptops that I don't really use anymore until I can buy a dedicated server.
Should I run it from a Virtual Machine? If so, would Ubuntu be best? That appears to be the case, but I want to be sure before I invest in anything. I want to be able to access the website from other computers, just like any other website. Am I going about this wrong? If so, what can you suggest me?
Yes, you will need a static IP address.
If this is your first experiment, my advice would be:
1) Use an old, dedicated PC with no other stuff on it. Unless you do it just right, you should presume hackers could get anything on the disk...
2) Why make life complex with layer after layer of software? Install Ubuntu and run a standard server under a Unix OS.
3) Be very careful about the rest of your attached network. Even if the PC is dedicated, unless you properly managed port forwarding, etc., ALL of your computers could be susceptible to attack.
An old friend of mine discovered, back in the Napster peer-to-peer days, that he could basically go and read EVERYTHING on the hard drives of most people who had set up Napster on their computer.
It really depends on your requirements. Will you be accessing the website externally (making it public) or locally? Running Django from your laptop can work but if you are planning to make it public, you will need an external IP to point your domain to. Unless you have a business account, ISPs usually don't give static IPs to individual customers. Ubuntu would be a wise choice and you can run conda or virtualenv easily.
VPSes are quite cheap these days. You can look into the AWS free tier, which provides you with 750 hours/month on a micro server.
If you are planning to access your website internally then you don't need anything other than your laptop or perhaps raspberry pi. If you are trying to make it available for everyone on the external network, VPS would be the best bet.
Currently, and this changes often, I like to either set up a local development environment using virtualenv (to install dependencies) and Ngrok (to expose the machine at an external address), or use C9.io. If you want further info about setup, I'm happy to provide it.
As already stated, Ubuntu is a good choice, but there is also Debian. I use Debian because I started off working with a colleague who was already using it, and I find it very good. I began with an old, disused desktop PC which I nuked and turned into a proper Linux server. For development I didn't need a very high-spec machine (I think it has 1 GB of RAM). I have it set up in my flat, and my domestic internet connection is fine for most of my needs.
Note: it isn't necessary to have a static IP address for development, although it is preferable if you already have one. As an alternative you can use a service such as dyndns.org, where you can set up virtual domain names that point to your domestic dynamic IP address. Most routers these days have facilities for updating services like dyndns.org with your new dynamic IP address, or you can install a plug-in on your server that will do this for you.
All my projects have their own virtualenvs, and I have VNC Server installed so I can access my server and work from anywhere I have an internet connection. I've been running this way for the past three years with some household-name clients and haven't had any issues at all.
When it comes to production, you can simply use any of the many VPS services that are out there. Amazon has already been mentioned. Someone recommended creating a droplet at DigitalOcean.com when I wanted to host Django applications, and I find them very good and cost-effective. Anyway, just my 2 cents' worth... hope it helps.
I want to be able to run WSGI apps but my current hosting restricts it. Does anybody know a company that can accommodate my requirements?
My automatic response would be WebFaction.
I haven't personally hosted with them, but they are primarily Python-oriented (founded by the guy who wrote CherryPy, for example, and as far as I know they were the first to roll out Python 3.0 support).
I am a big fan of Slicehost -- you get root access to a virtual server that takes about 2 minutes to install from stock OS images. The 256m slice, which has been enough for me, is US$20/mo -- it is cheaper than keeping an old box plugged in, and easy to back up. Very easy to recommend.
Plug plug for PythonAnywhere, our own modest offering in this space.
We offer free hosting for basic web apps, with 1-click config for popular frameworks like Django, Flask, Web2py, etc. MySQL is included, and you also get a full suite of browser-based development tools, like an editor and a console...
I have been using WebFaction for years and am very happy with the service. They are not only Python-oriented; you should be able to run anything within the limitations of shared hosting (unless of course you have a dedicated server).
They are probably not the cheapest hosting service, though. I don't know the prices. But I can still remember very well that my previous hosting provider was unreachable for a week (not their servers, I mean the people).
I've been pretty happy with Dreamhost, and of course Google AppEngine.
Google App Engine and OpenHosting.com.
I have a virtual server with OpenHosting; their support is ultra fast and they have very high uptime.
Check out http://pythonplugged.com/
They are trying to collect information on Python hosting providers using various technologies (CGI, FCGI, mod_python, mod_wsgi, etc.).
I advise you to have a look at http://www.python-cloud.com
This PaaS platform can automatically scale your application up and down according to your traffic. You can also choose precisely whether you want vertical scalability, horizontal scalability, or both. The consequence of this scaling is that you pay as you go: you only pay for your real consumption, not your potential consumption.
Deployment via git.
Non AWS, hosted in tier-4+ datacenters.
Free trial ;)
I use an AWS micro server: one year free, and after that you can get a 3-year reserved instance, which works out to about $75/yr :) The micro server has only 20 MB/sec throughput, ~600 MB of RAM, and a slower CPU. I run a few Mezzanine sites on mine and it seems fine.