I am running two instances of PyCharm for crypto trading. Currently, when my computer starts up, I have to open both projects in PyCharm manually and enter two different commands by hand.
Is there a way to run these projects, with the commands they need, automatically right after my computer starts?
A better way to start your bot at startup is to add your Python script to your system's startup programs, or to run your Python script as a service.
Check out this article
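For example, if you are on Windows, a minimal sketch would be a batch file whose shortcut you drop into the Startup folder (shell:startup); the paths, venv locations, and script names below are placeholders, not your actual projects:

:: start_bots.bat - hypothetical name; put a shortcut to it in shell:startup
:: Each "start" opens a bot in its own window so both run in parallel.
start "bot-one" cmd /k "cd /d C:\path\to\project_one && venv\Scripts\python bot_one.py"
start "bot-two" cmd /k "cd /d C:\path\to\project_two && venv\Scripts\python bot_two.py"

On Linux, the equivalent would be a cron @reboot entry or a systemd service.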
I'm trying to debug any Python script that an interpreter runs, as long as I have a reference to that script. I.e. if my connected interpreter runs a script called abc.py, and in my script directory I have abc.py with breakpoints attached, the IDE will automatically stop execution at those breakpoints.
I'm using PyCharm, but I'd like to understand the theory here, so that if I ever wanted to connect VS Code instead I'd be able to do that as well. Additionally, I'm currently connecting to a Docker container running Airflow.
Given the above, I'm assuming that the goal is to do a "remote" debug.
Also, since Python is a scripting language run by an interpreter, I am assuming that if I can hook into the interpreter, and if PyCharm can match the file being run by the interpreter to my local copy, then it should be able to pause execution.
I am additionally assuming that the interpreter can run in "normal" mode, not in a dedicated debug mode as we have in Java.
I have read three approaches:
an SSH interpreter pointing at my Docker container - seems most promising for my current goal, but I'm unsure if it will work
using the Python debug server (Debugging Airflow Tasks with IDE tools?) - still requires manual changes in the specific scripts (see the sketch after this list)
using a Docker interpreter (https://medium.com/@andrewhharmon/apache-airflow-using-pycharm-and-docker-for-remote-debugging-b2d1edf83d9d) - still requires individual debug configs for executing a single DAG / script
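For context, the manual change that the debug-server approach needs looks roughly like this; the host and port are assumptions, pydevd-pycharm has to be installed inside the container, and a matching "Python Debug Server" run configuration has to be listening in PyCharm:

# added at the top of the DAG / script I want to pause (hypothetical host/port)
import pydevd_pycharm
pydevd_pycharm.settrace(
    "host.docker.internal",  # where the PyCharm debug server is listening
    port=5678,               # must match the run configuration's port
    stdoutToServer=True,
    stderrToServer=True,
)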
Is debugging any file executed by a Python interpreter possible, at least in theory?
Is it possible remotely?
Is it possible with Airflow at all?
I want to know if there's a way to have Windows Server 2019 automatically launch Django's development web server. I also want the launch to be performed at startup, and by SYSTEM.
I tried using batch scripts that launch manage.py from the venv's Python interpreter. When I launch the batch file manually (i.e. double-click), it works fine and dandy. But it appears that SYSTEM fails to run the script correctly when it is launched from the scheduled task.
I made SYSTEM launch another script at startup (a simple Python script that creates a txt file from within its own venv) and it works.
If the Django launch script is launched by USER, then it works.
The problem is with launching Django as SYSTEM. I've also tried Streamlit and the result is the same.
Do you have any ideas?
Sample batch script:
cd path\of\managepyfile\
C:\path_to_venv\Scripts\python manage.py runserver
We run a similar application (not Python, but an application that uses a web server).
We have it set up as a task in Task Scheduler so that, when the server starts up, it runs a PowerShell script that executes the command to start the web server.
Link to setup
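If it helps, a rough command-line equivalent of that task (the task name, path, and batch file name here are placeholders, not our actual setup) is:

schtasks /Create /TN "StartDjango" /TR "C:\path\of\managepyfile\start_django.bat" /SC ONSTART /RU SYSTEM /RL HIGHEST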
However, you could also use a web server like IIS, deploy the files to the www folder on the C: drive, and run the site under IIS.
Setting it up on IIS can be a little tricky if you've never used IIS before. Happy to help out, as we have deployed the test access tool for one of our apps this way.
A .pyw script that presents a small GUI (Tkinter) to the user; from a Windows terminal server, it does not run for other users.
I wrote a .pyw script to present a small GUI (Tkinter) to the user. On my Windows desktop, with Python installed, it runs well. I uploaded the script to a terminal server running Windows Server, from which I want a number of users to run it. I can run it when I log onto the terminal server. Other users, however, cannot run it, and it does not display any error messages.
I have ensured that everyone on the server has full access to the script.
The code is running perfectly
Uhm... This was rather simpler than I thought.
Because it is a .pyw file, the error did not show up anywhere; there is no console window, and I did not make use of Python's error-catching tools.
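A minimal sketch of what I mean, assuming a main() that builds the Tkinter window (the file names here are made up):

import traceback
import tkinter as tk

def main():
    root = tk.Tk()
    root.title("Small GUI")
    # ... build the rest of the GUI here ...
    root.mainloop()

if __name__ == "__main__":
    try:
        main()
    except Exception:
        # With .pyw there is no console, so write the traceback somewhere visible
        # (or show it with tkinter.messagebox.showerror instead).
        with open("gui_error.log", "w") as f:
            traceback.print_exc(file=f)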
I have written a Python script for Twitter automation using Tweepy. When I run it on my own Linux machine as python file.py, the file runs successfully and keeps running, because I have scheduled repeated tasks inside the script and I don't want to stop it. But as it is on my local machine, the script might stop when my internet connection goes down or at night, so I can't keep the script running all day on my PC.
So is there any way, website, or method where I could deploy my script and have it execute forever? I have heard about cron jobs in cPanel, which can help with repeated tasks, but in my case I want to keep the script running on the machine until I choose to stop it.
Are there any such solutions? Most of the Twitter bots I see run forever, meaning their script is being executed somewhere 24x7. That is what I want to know: how is that possible?
As mentioned by Jon and Vincent, it's better to run the code from a cloud service. But either way, I think what you're looking for is what to put into the terminal to run the code even after you close the terminal. This is what worked for me:
nohup python code.py &
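If you also want to keep the output and be able to stop the script later, a variant of the same idea (the log and pid file names are just examples) is:

nohup python code.py > bot.log 2>&1 &   # redirect output to a log file
echo $! > bot.pid                       # remember the PID for a later kill $(cat bot.pid)
tail -f bot.log                         # watch the output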
You can add a systemd .service file (a minimal sketch follows this list), which has the added benefits of:
logging (compressed logs at a central place, or over network to a log server)
disallowing access to /tmp and /home-directories
restarting the service if it fails
starting the service at boot
setting capabilities (ref setcap/getcap), disallowing file access if the process only needs network access, for instance
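A minimal sketch of such a unit, assuming the bot lives at /opt/bot/bot.py and runs as a dedicated user (the names and paths are placeholders); save it as /etc/systemd/system/bot.service, then run systemctl enable --now bot:

[Unit]
Description=Twitter bot (example)
After=network-online.target
Wants=network-online.target

[Service]
User=bot
ExecStart=/usr/bin/python3 /opt/bot/bot.py
Restart=on-failure
# hardening options mentioned above
PrivateTmp=yes
ProtectHome=yes

[Install]
WantedBy=multi-user.target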
We have a server running Windows 7 Pro. I have several Python scripts I'd like to save to the server and have it so that client computers can run them by simply double-clicking. The client computers are all running OS X. This is proving to be... problematic.
First I tried to simply make the Python scripts executable, but this doesn't seem to be possible on a Windows server -- since you can't set the 'executable' flag, double-clicking on a file will always open it in an editor (unless I were to go to every single computer and make .py files open with Python). Trying to create a shell script has the same problem -- there's no way to make them executable from the server.
My solution was to just make a simple AppleScript app that sends a command to launch the script. Unfortunately, as soon as I copy the app to the server, it stops working. It seems that OSX apps refuse to execute properly when saved to the server -- if you run the file, nothing happens at all.
Is there a simple solution I'm overlooking?
This is probably what you're looking for: http://oreilly.com/catalog/samba/chapter/book/ch05_03.html says that Samba clients (which OS X uses to connect to Windows shares) can map the archive/hidden/system file attributes to the owner/group/world executable bits, respectively.
Try setting those attributes on the script file and make sure its first line is #!/usr/bin/python. If this mapping is enabled by default, the script will run on double-click.
Actually, the issue is that Windows has no equivalent of the execute bit for files.
The solution is to change the mount options on the share so that all the files have their execute bit set.
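On the OS X side, that can be done when mounting the share by hand, for example (the server, share, and mount point names are placeholders):

mount_smbfs -f 0755 -d 0755 //user@server/scripts /Volumes/scripts

The -f and -d options set the permission bits that files and directories on the mounted share appear to have, which gives the scripts the execute bit the earlier answer mentions.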