I'm trying to understand whether it's possible to protect a Python script from being stolen (I know this isn't truly possible, but let's at least protect it as much as we can) by uploading the program to Google Drive. I did research such as How do I protect my python code? or Store Python scripts & run them online? and many more or less relevant links, but none of them really answers my question.
Let's say I have a Python project turned into a Windows executable (.exe) with GUI2Exe. It has a GUI which loads images from specific folders, etc.
I upload all of that to Google Drive.
Somehow run that program from Google Drive, and if you can keep the user from realising it is coming from Google Drive, even better (login interface, etc.).
I would like to know if one of the following solutions is possible, or if there is another way:
run the exe directly from Google Drive (oh, naivety, or perhaps there is something there I don't see)
run the exe from Google Drive with the help of another Python script, which can be on the user's computer but does nothing more than log in to Google and run the exe from the right folder, or download it to a temporary location and run it from there automatically (see the sketch after this list)
use Google Drive as a Windows service, so I guess you can use it as a simple partition of your computer and run the program from there. Here is a better description
perhaps avoid Google altogether and use some kind of encryption (though I understood that's not really an option for Python)
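For the second idea, a minimal launcher sketch might look like the following. It assumes the exe is shared on Google Drive with "anyone with the link" access and is small enough that Drive serves it directly without a virus-scan confirmation page; FILE_ID is a placeholder, not a real id. Note that the exe still ends up on the user's disk, so this hides the source at best; it does not protect the code.

```python
# Hypothetical launcher: download the exe from Google Drive to a temp folder and run it.
# FILE_ID is a placeholder; the direct-download URL form assumes a publicly shared file.
import subprocess
import tempfile
import urllib.request
from pathlib import Path

FILE_ID = "YOUR_DRIVE_FILE_ID"  # placeholder
URL = f"https://drive.google.com/uc?export=download&id={FILE_ID}"

tmp_dir = Path(tempfile.mkdtemp())
exe_path = tmp_dir / "app.exe"

urllib.request.urlretrieve(URL, str(exe_path))  # download to the temporary folder
subprocess.run([str(exe_path)], check=True)     # run it from there
```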
Because my Python program already has more than 20,000 lines of code, uses about 20 Python libraries, and needs to load formats such as jpg, png and csv (spreadsheets), I don't think Google App Engine is enough. Though I'm quite a noob here, so I could be very, very wrong.
I hope I made myself clear; please just give me a lead if you can on the right way to go about this, and I'll do my homework. I would really appreciate it.
Consider renting an EC2 Windows instance. Set it up with all the libraries you'll require.
Thanks in advance.
I have an nginx server behind which I am running a Python Tornado application server. My Tornado server contains API endpoints (handlers), models (DB table models), and the code for their respective services. We are using it as a backend service for an app that sells goods. Recently, we implemented a complaint feature that lets you upload an image for the products.
I have to write Python code to convert a base64 image into an actual jpg/png (which is done), and then upload it to a different Windows server (I'm stuck on the uploading part). I have been researching this and found a few ways, like:
FTP
Remote Desktop Connection (it is clearly not for me)
and a few more.
If there is a better way to do this, please tell me. I am not really experienced, so please explain your answers in a bit of detail. Thank you for your time.
Previously I was storing the images on my application server, which is clearly not a good thing to do, and I was also not able to expose a URL for the images.
If it just needs to be a file in a directory on the other Windows server, you could turn on file sharing for that specific directory on the other Windows server and then SMB-mount that directory onto your application server. Then your Python code could simply write a file to that directory.
A fairly good guide for doing this can be found here
Note that in a final production environment you want to be sure that the security is set appropriately (which is beyond the scope of this question).
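For illustration, a minimal sketch of the write step, assuming the shared directory has been SMB-mounted at /mnt/product_images on the application server (the mount point and the .jpg extension are assumptions; adjust them to your setup):

```python
# Hypothetical helper: decode the base64 payload and write it to the mounted share.
import base64
import uuid
from pathlib import Path

MOUNT_POINT = Path("/mnt/product_images")  # assumed SMB mount point

def save_complaint_image(b64_data: str) -> Path:
    """Decode a base64 image and save it as a file on the mounted directory."""
    image_bytes = base64.b64decode(b64_data)
    out_path = MOUNT_POINT / f"{uuid.uuid4().hex}.jpg"  # unique file name
    out_path.write_bytes(image_bytes)
    return out_path
```

Once the file is on the share, the Windows server can expose it over HTTP however you like.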
I am creating a Python program and converting it into a .exe with auto-py-to-exe (a GUI for PyInstaller). I uploaded it to my Google Drive for my friend to download, but he can't download it since Chrome and Google Drive think it's a virus. I couldn't even send it through Gmail (I had to use AOL... funny how AOL allows that). I know the file isn't signed, and that's likely what makes it look like a virus. Is there a way to sign the code so that Google doesn't throw a hissy fit about it?
I'm trying to run the spleeter lib on Google Cloud Functions.
I have a problem: the separate_to_file function creates pretrained_models files in the root folder while executing.
The only directory I can write to is /tmp.
Is there any way to change the directory for the pretrained models?
You can set the MODEL_DIRECTORY environment variable to the path of the directory you want models to be written into before running Spleeter. Please note that most models are quite heavy and may require a lot of storage.
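A minimal sketch of what that could look like inside a Cloud Function, assuming the environment variable is read as described above and that /tmp has enough free space for the model (all paths are placeholders):

```python
# Point Spleeter's model download at /tmp before importing it.
import os
os.environ["MODEL_DIRECTORY"] = "/tmp/pretrained_models"  # assumed to be honored as described

from spleeter.separator import Separator

separator = Separator("spleeter:2stems")        # model is fetched into /tmp/pretrained_models
separator.separate_to_file("/tmp/input.mp3",    # input audio previously written to /tmp
                           "/tmp/output")       # stems end up under /tmp/output/
```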
I was doing the same thing. Fetching the models can be solved by pointing to the /tmp directory, as already pointed out.
Unfortunately, that wasn't the end of it. Spleeter depends on Linux binaries that aren't included in serverless environments. I solved this by deploying a Docker image instead of a plain script.
Then there was the issue that Spleeter consumes a lot of memory, especially for 4-way and 5-way splits. Google Cloud doesn't offer enough RAM. AWS Lambdas offer 10 GB of RAM, which is enough to split a regular, radio-friendly song.
I have a web-crawling Python script that takes hours to complete and is infeasible to run in its entirety on my local machine. Is there a convenient way to deploy this to a simple web server? The script basically downloads webpages into text files. How would this best be accomplished?
Thanks!
Since you said that performance is a problem and you are doing web scraping, the first thing to try is the Scrapy framework, a very fast and easy-to-use web-scraping framework. The scrapyd tool would allow you to distribute the crawling: you can have multiple scrapyd services running on different servers and split the load between them (a minimal spider sketch follows the links below). See:
Distributed crawls
Running Scrapy on Amazon EC2
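For reference, a minimal spider that just dumps each crawled page into a text file might look like this (the start URL, output directory and file naming are placeholders):

```python
# Hypothetical spider: save every crawled page to pages/<sha1-of-url>.txt and follow links.
import hashlib
import pathlib

import scrapy


class PageDumpSpider(scrapy.Spider):
    name = "pagedump"
    start_urls = ["https://example.com/"]  # placeholder start URL

    def parse(self, response):
        out_dir = pathlib.Path("pages")
        out_dir.mkdir(exist_ok=True)
        # Use a hash of the URL as a stable file name.
        fname = hashlib.sha1(response.url.encode()).hexdigest() + ".txt"
        (out_dir / fname).write_bytes(response.body)
        # Follow links on the page and repeat.
        for href in response.css("a::attr(href)").getall():
            yield response.follow(href, callback=self.parse)
```

Saved as, say, pagedump_spider.py, it can be run locally with `scrapy runspider pagedump_spider.py` and later scheduled through scrapyd.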
There is also a Scrapy Cloud service out there:
Scrapy Cloud bridges the highly efficient Scrapy development environment with a robust, fully-featured production environment to deploy and run your crawls. It's like a Heroku for Scrapy, although other technologies will be supported in the near future. It runs on top of the Scrapinghub platform, which means your project can scale on demand, as needed.
As an alternative to the solutions already given, I would suggest Heroku. You can easily deploy not only a website, but also scripts for bots to run.
The basic account is free and pretty flexible.
This blog entry, this one and this video contain practical examples of how to make it work.
There are multiple places where you can do that. Just google for "python in the cloud" and you will come up with a few, for example https://www.pythonanywhere.com/.
In addition, there are several cloud IDEs that essentially give you a small VM for free, where you can develop your code in a web-based IDE and also run it in the VM; one example is http://www.c9.io.
In 2021, Replit.com makes it very easy to write and run Python in the cloud.
If you have a Google e-mail account, you have access to Google Drive and its utilities. Choose Colaboratory (or find it in the "More..." options first). This "Colab" is essentially a Python notebook on Google Drive with full access to the files on your drive, and also with access to your GitHub. So, in addition to your local stuff, you can edit your GitHub scripts as well.
Using just Python, is it possible to use a USB flash drive to serve files locally to a browser, and save information off the online web?
Ideally I would only need Python.
Where would I start?
You can use Portable Python on the flash drive: Portable Python. Then code some sort of little Python web server, handling GET and POST by extending the BaseHTTPRequestHandler class.
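A minimal sketch of such a server, assuming Python 3 (with Portable Python 2 the class lives in the BaseHTTPServer module instead of http.server); the folder names and the saved file name are placeholders:

```python
# Hypothetical tiny server run from the flash drive:
# GET serves files from a "files" folder next to this script, POST saves the request body.
import pathlib
from http.server import BaseHTTPRequestHandler, HTTPServer

DRIVE_ROOT = pathlib.Path(__file__).parent  # directory on the flash drive


class FlashDriveHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = DRIVE_ROOT / "files" / self.path.lstrip("/")
        if target.is_file():
            self.send_response(200)
            self.end_headers()
            self.wfile.write(target.read_bytes())
        else:
            self.send_error(404)

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        save_dir = DRIVE_ROOT / "saved"
        save_dir.mkdir(exist_ok=True)
        (save_dir / "upload.bin").write_bytes(body)  # placeholder file name
        self.send_response(204)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), FlashDriveHandler).serve_forever()
```

Run the script from the drive and point a browser at http://127.0.0.1:8000/somefile.txt.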
This doesn't seem much different than serving files from a local hard drive. You could map the thumb drive to always be something not currently used on your machine (like U:).