My client has provided me with a Python console application which performs some work and writes the result into a .txt file. My task is to write a Windows service which reads that particular .txt file and performs further actions.
I used C# on .NET to write the service. My solution contains 3 projects:
The logic layer project.
The Windows service layer project.
A test app layer project, used for debugging and other purposes.
Both the Windows service layer and the test app layer use the logic layer for core functionality. When I run the application through the test layer, everything works perfectly, but whenever I run it through the service, the Python standalone application that the service launches doesn't write any output files. I can see the Python app running in Task Manager, but there's no output anywhere. I believe the Python code is crashing, but I couldn't determine the exact reason.
I've tried the following ways to debug the issue:
Searched the Windows and System32 directories for any related output files, just to consider the possibility of the service having these directories as the default working directory.
Used absolute paths in the service code to make sure that the Python part is not writing output files to some unknown location.
Had the client implement passing the output directory to the Python code through command line arguments.
Wrote a mock console app in C# which writes a file, tried to call it through the service, but it worked fine and wrote the file as expected.
Suspected that standard I/O could be causing the Python application to crash, so I used standard I/O in my mock program as well, but it worked without any issues.
Tried giving the Python code a long task, one that should have taken about 30 minutes to complete, but the Python script ran and closed immediately, which strongly suggests that it crashes at some point.
Tried running the service under my unelevated Windows user instead of the Local System account.
Tried configuring the service to be able to interact with the desktop.
I am all out of ideas here. Is there any other direction I should search in?
Just a note, if that matters: I am using System.Diagnostics.Process to launch the Python script.
If it works from your test app, it sounds like a permissions issue. What security context/user is the Windows service running as, and does that user have permission to write to the filesystem location where you are expecting the output? Have you tried using a full path to the output file, to be sure where it is expected?
I'd be inclined to write a tiny python app that just saves "hello world" to a file, and get that to work from a windows service, then build up from there.
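Something minimal like this would do; it's just a sketch, and the output path is an assumption (use an absolute path so the service's working directory doesn't matter):

# hello_test.py - writes one line to a known absolute path
with open(r'C:\temp\service_test.txt', 'w') as f:
    f.write('hello world\n')

If even this fails from the service, you know the problem is environmental rather than in the client's script.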
Thanks to the help from timhowarduk, I was finally able to get to the root cause of the problem. The Python script was looking for a configuration file, and when it ran from the Windows service, it looked for that config file in System32.
All Windows services run with System32 as their working directory.
That is why the Python script searched System32 when it ran as part of the Windows service. I guess I might just ask the client to edit the Python script to read the config from the Windows service's application directory instead.
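For reference, a minimal sketch of that fix on the Python side, assuming the config file sits next to the script (the file name here is hypothetical):

import os

# resolve the config file relative to the script itself, so it is found
# even when the working directory is System32
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
CONFIG_PATH = os.path.join(SCRIPT_DIR, 'config.ini')  # hypothetical file name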
I am trying to set up ezsheets for use with Google Sheets. I followed the instructions from here https://ezsheets.readthedocs.io/en/latest/ and here https://automatetheboringstuff.com/2e/chapter14/
The setup process works quite differently on my computer: I managed to download credentials-sheets.json, but I still need to download the token-sheets.pickle and token-drive.pickle files. When I run import ezsheets, no browser window is opened as described in the setup instructions. Nothing happens.
Is there another way to download both files?
I followed the steps you referenced and managed to generate the files, but I ran into the same issue before figuring out the cause. The problem is that there are several possible causes, and the script fails silently without telling you exactly what happened.
Here are a few suggestions:
First off you need to configure your OAuth Consent Screen. You won't be able to create the credentials without it.
Make sure that you have the right credentials file. To generate it you have to go to the Credentials page in the Cloud Console. The docs say that you need an OAuth Client ID. Make sure that you have chosen the correct app at the top.
Then you will be prompted to choose an application type. According to the docs you shared the type should be "Other", but this is no longer available so "Desktop app" is the best equivalent if you're just running a local script.
After that you can just choose a name and create the credentials. You will be prompted to download the file afterwards.
Check that the credentials-sheets.json file has that exact name.
Make sure that the credentials-sheets.json file is located in the same directory where you're running your python script file or console commands.
Check that you've enabled both the Sheets and Drive API in your GCP Project.
Python will try to set up a temporary server on http://localhost:8080/ to retrieve the pickle files. If another application is using port 8080, this will also fail; in my case a previously failed Python script was still holding on to that port (there's a quick check sketched after these suggestions).
To find and close the processes using port 8080 you can refer to this answer for Linux/Mac or this other answer for Windows. Just make sure the process is not something you're currently using.
I just used the single import ezsheets command to get the files, so after getting token-sheets.pickle I had to run it again to get token-drive.pickle. After that, the library should detect that you already have both files.
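As mentioned above, here is a quick sketch you can run to check whether something is already holding port 8080 (localhost and the port number match what the OAuth flow uses):

import socket

# try to bind to port 8080; if it fails, another process is using it
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    try:
        s.bind(('localhost', 8080))
        print('Port 8080 is free')
    except OSError:
        print('Port 8080 is in use; close the other process first')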
I am building a Python script which needs to run indefinitely on a server. It will access a Microsoft Exchange server, read mails, process them, and trigger automated voice calls.
I have successfully implemented the automated call action. Presently the script runs on my PC. I have three questions.
1. For running the script on a server instead of my PC, does any of the code apart from connecting to the server need to change? I mean, do the parts where I'm reading mails and triggering calls need to change, or can the same script run on a server? If changes are needed, can somebody please describe what needs to be done?
2. Since I need to run the script on a server and access a Microsoft Exchange server, can the script run on the Exchange server itself? If yes, please point me to helpful resources.
3. The script does not take any input as such, but it accesses a couple of files that need to be edited manually from time to time. How should I achieve that?
The distinction between PC and server doesn't matter. Your script will require a set of resources and may make assumptions about the OS it's running on; those are the things that matter. As long as the required resources are there, it should run fine. For example, if your script requires Python 3.6+ to run, then Python 3.6+ must be installed on the server/PC. If you are using a particular Python package, it has to be installed too. If you make assumptions about where files are located on disk, those paths either need to be OS-independent or match the OS of the server/PC, and those files need to be there. But the syntax of the Python itself shouldn't change.
If your goal is to run the Python script as a service on the server, then more information about the type of server (Windows/Linux) is needed. Since you are considering running it on an Exchange server, I suppose it's most likely you'll want to run it on Windows. This has been asked and answered here. In relation to your code, you will want to make sure your script can be used as a library, and you won't want to call sys.exit inside your code; rely on exceptions to pass errors up instead. My preferred pattern is something like:
import sys

def main(argv=None):
    argv = argv if argv is not None else sys.argv[1:]
    # parse arguments if you have them and run the script here

if __name__ == '__main__':
    main()
Then in your service you can import and call main(...) without running another executable.
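For example, assuming the script above is saved as myscript.py (the module name and the argument list here are hypothetical):

import myscript

# call the entry point directly instead of spawning another executable
myscript.main(['--poll-interval', '60'])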
See #1. Whether it can run on that server depends on whether all of the required resources and files are available there. There is also the question of whether you would WANT to run the script on your Exchange server; that answer depends on the load the script creates, how busy/active your server is, whether you want the extra software installed on your server, etc.
Your best solution here depends on your situation. If you can log in and edit the files, then maybe that's what you do. If you want to edit them on your PC and then push them up, there are solutions for that too. It all depends on what makes sense for your project/situation.
I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the Python script. This almost works; I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the Firebase logs.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the Python code in another ecosystem (like AWS or Heroku). I'm especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
"the logs in the Firebase console show that the python script is missing dependencies."
That's because the Node.js runtime targeted by the Firebase CLI doesn't have everything you need to run Python programs.
If you need to run a function that's primarily written in Python, you should not use the Firebase CLI; instead, use the Google Cloud tools to target the Python runtime, which should do everything you want. Yes, it might be extra work to learn new tools, and you will not be able to use the Firebase CLI, but it is the right way to run Python in Cloud Functions.
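For reference, a function on the Python runtime is just a main.py whose dependencies are declared in a requirements.txt next to it; the platform installs them for you, so there is no need to copy site-packages by hand. A rough sketch, where the function name and payload shape are assumptions:

# main.py - deployed with something like:
#   gcloud functions deploy run_algorithm --runtime python311 --trigger-http
import numpy as np  # declared in requirements.txt, installed by the platform

def run_algorithm(request):
    # HTTP-triggered functions receive a Flask request object
    data = request.get_json(silent=True) or {}
    values = np.array(data.get('values', []))
    return {'result': float(values.sum())}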
I am working on an Azure web app, and inside the web app I use Python code to run an exe file. The web app receives certain inputs (numbers) from the user and stores them in a text file. Afterwards, an exe file runs, reads the inputs, and generates another text file called "results". The problem is that although the code works fine on my local computer, as soon as I put it on Azure, the exe file does not get triggered by the following line of code:
subprocess.call('process.exe', cwd=case_directory.path, shell=True)
I even tried running the exe file on Azure manually from Visual Studio Team Services (formerly Visual Studio Online) using the "run from Console" option. It just did not do anything. I'd appreciate it if anyone can help me.
Have you looked at using a WebJob to host/run your executable? A WebJob can be virtually any kind of script or Windows executable. There are a number of ways to trigger your WebJob, and you also get a lot of built-in monitoring and logging for free via the Kudu interface.
@F.K I found some information which may be helpful for you; please see below.
According to the Python documentation for the subprocess module, using shell=True can be a security hazard; see the warning under Frequently Used Arguments for details.
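A sketch of an alternative that avoids shell=True, uses an absolute path, and surfaces the exe's output so failures show up in the logs (the path here is hypothetical):

import subprocess

result = subprocess.run(
    [r'D:\home\site\wwwroot\process.exe'],  # hypothetical absolute path
    cwd=case_directory.path,                # as in the original call
    capture_output=True, text=True,
)
print(result.returncode, result.stdout, result.stderr)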
That said, the generally recommended way to satisfy this kind of need is to combine Azure Queue Storage, Blob Storage and Azure WebJobs: save each input as a message in a storage queue, then have a continuous WebJob pick the messages up, run the processing, and save the result files into blob storage.
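A very rough sketch of such a continuous WebJob in Python, using the azure-storage-queue and azure-storage-blob packages; the connection string, queue name, container name and file names are all assumptions:

import subprocess
import time

from azure.storage.blob import BlobServiceClient
from azure.storage.queue import QueueClient

CONN = '<storage-connection-string>'  # assumption: injected via app settings
queue = QueueClient.from_connection_string(CONN, 'inputs')
blobs = BlobServiceClient.from_connection_string(CONN)

while True:
    for msg in queue.receive_messages():
        # write the queued input where the exe expects it
        with open('input.txt', 'w') as f:
            f.write(msg.content)
        subprocess.run(['process.exe'], check=True)
        # upload the generated results to blob storage
        with open('results.txt', 'rb') as f:
            blobs.get_blob_client('results', msg.id + '.txt').upload_blob(f)
        queue.delete_message(msg)
    time.sleep(5)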
I have written a Python tkinter program which runs on my Raspberry Pi and does a number of things, including interfacing with my Google Calendar (read-only access). I can navigate to the directory it is in and run it there; it works fine.
I would like the program to start at boot, so I added it to the autostart file in /etc/xdg/lxsession/LXDE, as per advice from the web. However, it does not start at boot. So I tried running the line I put in that file manually, and I get this:
$ python /home/blahblah/MyScript.py
WARNING: Please configure OAuth 2.0
To make this sample run you will need to download the client_secrets.json file and save it at:
/home/blahblah/client_secrets.json
The thing is, that file DOES exist. But for some reason the Google code doesn't realise this when I run the script from elsewhere.
How then can I get my script to run at bootup?
Figured this out now. It's tough not knowing whether it's a Python, Linux or Google issue, but it was a Google one. I found that other people across the web have had issues with client_secrets.json as well, and the solution is to find where its location is stored in the Python code and, instead of just having the name of the file, include the full path as well, like this:
CLIENT_SECRETS = '/home/blahblahblah/client_secrets.json'
Then it all works fine, both calling it from another folder and starting it at startup. :)
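If you'd rather not hard-code the absolute path, a variant of the same idea is to resolve it relative to the script file; a small sketch:

from pathlib import Path

# find client_secrets.json next to this script, whatever the current directory is
CLIENT_SECRETS = str(Path(__file__).resolve().parent / 'client_secrets.json')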