I'm running a python script through powershell:
python.exe script.py
PowerShell is configured to run as Admin on the machine (Windows). Inside the script I'm trying to start a subprocess:
test_process = subprocess.Popen("subscript.exe")
What I need is for the "subscript.exe" process (the subprocess) not to run as admin on the machine, while not stopping the execution of the original script.
If PowerShell and Python need to run as admin, but you want just subscript.exe to run with basic user privileges, you can probably do something like this (not sure if you need to add arguments separately):
subprocess.Popen('runas /trustlevel:0x20000 subscript.exe')
I don't have Python on my machine at the moment to test the Popen call itself, but the same runas command works, for example, when launching a new PowerShell process that would normally require admin permissions.
Trust level 0x20000 equates to Basic User, which you can see with runas /showtrustlevels; in my case, it's the only one available.
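For completeness, here's a rough, untested sketch of how the full call might look with an argument passed to the child process; subscript.exe and its argument are placeholders:

import subprocess

# runas takes the target command as a single argument, so the child's
# command line (including its arguments) goes in one string.
test_process = subprocess.Popen(
    ['runas', '/trustlevel:0x20000', 'subscript.exe --some-arg']
)
# Popen returns immediately, so the original script keeps running.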
I have written a python script that downloads deb files from launchpad, and then calls out to a shell script to use alien to convert the debs to rpms.
The shell script uses alien, and so it needs to be run as root.
The program uses a thread pool to download the deb files asynchronously, using threadpool.apply_async, and then uses a processing pool to call the shell script asynchronously, so the whole thing happens reasonably quickly.
It all works well, but the shell script that calls alien needs to run as root, otherwise packages don't get built properly. When I first finished the script, I called alien with pkexec, after having tried sudo. In both cases, I had a couple of problems.
The first was that, in starting as root, I lost the user's environment, and so lost the pip-installed Python libraries. I could, perhaps, have used sudo -s or similar, but the second problem was that I had to enter my root password for every package that was built.
What I want to do is run the Python script, Qt GUI and all, as a normal user, select which files to convert, and then hit the install button and only enter my superuser password once.
I decided to split out the install parts of the Python code, which include the threaded download and the threaded call to the shell script, and then try to run those parts as root/superuser.
I created a dbus service, for this install part, and, after a steep dbus learning curve, managed to get the service working. However, I had no joy getting the script authenticated, and raising its privileges.
I have been able to use polkit to show the password dialog and authorise the super user, but I do not know how to use the return value from polkit
`authority.CheckAuthorization(subject, action_id, details, flags, cancellation_id)`
which shows the password dialog for authorisation, but does not handle elevating the script's privileges.
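For context, the call itself looks roughly like the sketch below (the action id is a placeholder, and start-time is simplified; it would normally be read from /proc/<pid>/stat):

import os
import dbus

bus = dbus.SystemBus()
proxy = bus.get_object('org.freedesktop.PolicyKit1',
                       '/org/freedesktop/PolicyKit1/Authority')
authority = dbus.Interface(proxy, 'org.freedesktop.PolicyKit1.Authority')

# The subject is this process; start-time 0 is a simplification.
subject = ('unix-process',
           {'pid': dbus.UInt32(os.getpid()), 'start-time': dbus.UInt64(0)})
action_id = 'org.example.debconverter.install'  # placeholder action id
details = dbus.Dictionary({}, signature='ss')
flags = dbus.UInt32(1)  # ALLOW_USER_INTERACTION -> shows the password dialog
cancellation_id = ''

result = authority.CheckAuthorization(subject, action_id, details,
                                      flags, cancellation_id)
# The return value unpacks as (is_authorized, is_challenge, details); it only
# reports whether the user authenticated, it does not elevate this process.
is_authorized, is_challenge, auth_details = result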
I have set the Python install service's permissions to 0500, so that, hopefully, once I have figured out how to elevate privileges, the root user can read and execute the service, which is currently created on the session bus.
How can I elevate permissions, and, at the same time, keep the environment variables of the user, so that I don't have to install python modules into the root account?
Many thanks for your help in advance...
ps. I have written a polkit action file, and a polkit rule, but in each case I am not sure how the action id relates to the elevation of privileges.
pps. Can I/should I use pam?
I eventually ran the process as root, using pkexec to obtain a password dialog.
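Roughly, that amounts to something like the sketch below; the wrapper script path and the .deb path are placeholders, not my actual files:

import subprocess

# pkexec shows the polkit password dialog and then runs the command as root.
ret = subprocess.call(
    ["pkexec", "/usr/local/bin/alien_wrapper.sh", "/tmp/example_package.deb"]
)
if ret != 0:
    print("conversion failed or authorisation was dismissed")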
I'm running a Python script on my Raspberry Pi, which makes some modifications in a SQL database, writes a log, and uploads everything to Dropbox.
When I'm launching it using command line everything works fine.
UPDATE: When I'm launching it using cron, everything works, except for the Dropbox upload. No error messages in the log. The file simply doesn't appear in my dropbox.
Here is the code I am using:
from subprocess import call

# Hand the whole command line to the shell as a single string
data = "/home/pi/scripts/Dropbox-Uploader/dropbox_uploader.sh upload /home/pi/scripts/database.db /"
call(data, shell=True)
How can this be fixed?
Something that works from an interactive terminal but not from cron is almost always evidence of a PATH or environment problem. In an interactive session, the profile and bashrc files are used to set a number of environment variables, including PATH. None of them are used from cron. So good practices are:
always use absolute paths in scripts that can be launched from cron
explicitly set PATH and any Python-related environment variables in your crontab, or use a minimal shell script to set them first and then start Python (see the sketch below)
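As a sketch of the second point, applied to the uploader from the question: pass an explicit environment to the call instead of relying on whatever cron provides, and capture the output so failures show up somewhere. The PATH value, the HOME hint and the log path are assumptions, not something your setup necessarily needs:

import os
from subprocess import call

# Build an explicit environment for the child process; values are examples.
env = os.environ.copy()
env["PATH"] = "/usr/local/bin:/usr/bin:/bin"
env["HOME"] = "/home/pi"  # dropbox_uploader.sh usually reads ~/.dropbox_uploader

cmd = ("/home/pi/scripts/Dropbox-Uploader/dropbox_uploader.sh "
       "upload /home/pi/scripts/database.db /")
# Capture the uploader's output so failures under cron show up in a log file.
with open("/home/pi/scripts/upload.log", "a") as log:
    call(cmd, shell=True, env=env, stdout=log, stderr=log)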
I know this is an exact copy of this question, but I've been trying different solutions for a while and didn't come up with anything.
I have this simple script that uses PRAW to find posts on Reddit. It takes a while, so I need it to stay alive when I log out of the shell as well.
I tried to set it up as a start-up script and to use nohup in order to run it in the background, but none of this worked. I followed the quickstart and I can get the hello world app to run, but all these examples are for web applications, and all I want is to start a process on my VM and keep it running when I'm not connected, without using .yaml configuration files and such. Can somebody please point me in the right direction?
Well, in the end, using nohup was the answer. I'm new to the GNU/Linux environment and I just assumed it didn't work when I first tried. My program was exiting with an error, but I hadn't checked the nohup.out file, so I was unaware of it.
Anyway here is a detailed guide for future reference (Using Debian Stretch):
Make your script an executable
chmod +x myscript.py
Run the nohup command to execute the script in the background. The trailing & sends the process to the background, and nohup keeps it alive after you log out. I've added a shebang line to my Python script, so there's no need to call python here:
nohup /path/to/script/myscript.py &
Logout from the shell if you want
logout
Done! Now your script is up and running. You can log back in and make sure that your process is still alive by checking the output of this command:
ps -e | grep myscript.py
I've written a Python script. One of the functions opens a port to listen on, and opening that port requires super-user permissions. I don't want to run the script with sudo or with root permissions, etc. I saw an answer here regarding running a sub-process with sudo, but it's not a sub-process I want, as far as I know; it's just a function within the application.
Question: How do I programmatically open a port with super user permissions?
You can't do that. If you could, then malicious code would have free access to any system as root at any time!
If you want super-user privileges, you need to run the script from the root account or use sudo and type in the password - this is the whole point of having user accounts.
EDIT
It is worth noting that you can run external commands from within a Python script - for example using the subprocess module.
import subprocess
subprocess.run(['sudo', 'blah'])
This essentially creates a new process to run the given command.
If you do this, your user will be prompted to enter their password in the same way as you would expect, and the privileges will only apply to the subprocess being created - not to the script that you are calling it from (which may have been what the original question was about).
You could use sudo inside your python script like this, so you don't have to run the script with sudo or as root.
import subprocess
subprocess.call(["sudo", "cat", "/etc/shadow"])
I would like to achieve the following things:
A given file contains a job list which I need to execute one by one on a remote server using SSH APIs, storing the results.
When I try to call the following command directly on the remote server using PuTTY, it executes successfully, but when I try to execute it through Python SSH programming, it says it can't find autosys.ksh.
autosys.ksh autorep -J JOB_NAME
Any ideas? Please help. Thanks in advance.
Fabric is a good bet. From the home page,
Fabric is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
A quick example,
>>> from fabric.api import run, env, cd, settings, hide, show
>>> env.host_string='xxx.xxx.com'
>>> env.user='user'
>>> env.password='password'
>>> run('ls -lart')
After reading your comment on the first answer, you might want to create a shell script whose interpreter (shebang) line points at bash, followed by the autosys commands.
This will create a bash shell and run the commands from the script in that shell.
Again, if you are using autosys commands in the shell, you should set up the autosys environment for the user before running any autosys commands - a rough sketch of doing this over Fabric is below.
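Something like this, using the Fabric API from the first answer and giving the autosys wrapper an absolute path so the remote shell can find it; the path, host details and job name below are placeholders, not values from your setup:

from fabric.api import env, run

# Connection details are placeholders; adjust for your environment.
env.host_string = 'remote.host.example.com'
env.user = 'user'
env.password = 'password'

# Use the full path to autosys.ksh so the non-interactive shell can find it,
# then run the autorep command through it.
output = run('/full/path/to/autosys.ksh autorep -J JOB_NAME')
print(output)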