Handling permissions and data in a distributed Python script

I'm trying to distribute a Python script through PyPI. The script takes input from a user and stores it in a text file. However, because the script creates and writes to a text file, it requires "sudo" every time it runs. I.e.:
$ my_script
Permission error
$ sudo my_script
Success
I've run into this problem while working on another script and solved it by chmod-ing a newly created file. This way, sudo was required only once, to create a file with lowered permissions (which could then be written to without extra privileges). However, I can't believe this is the best answer to such a problem: requiring users to grant privileges to a no-name script seems awfully suspicious. Is there really not a cleaner way to handle recording data when distributing through PyPI?
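For illustration, a minimal sketch of that chmod workaround; the data file location and permission mode here are hypothetical choices, not part of the actual script:

import os

DATA_FILE = "/usr/local/share/my_script/data.txt"   # hypothetical location

if not os.path.exists(DATA_FILE):
    # First run (under sudo): create the file and loosen its permissions so
    # later runs can append to it without elevated privileges.
    os.makedirs(os.path.dirname(DATA_FILE), exist_ok=True)
    open(DATA_FILE, "a").close()
    os.chmod(DATA_FILE, 0o666)

with open(DATA_FILE, "a") as f:
    f.write("user input goes here\n")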

Related

Modify .bash_aliases with Python

So I've been trying to modify .bash_aliases programmatically for a while now, and I've been running into issues with every method I've tried.
Running my script using sudo python3 myscript.py causes the script to modify the .bash_aliases file of the root user. I can't find a way to determine which user ran the script so I can modify their file instead.
Trying to use a shell command such as sudo echo "my string" >> ~/.bash_aliases gives an error: sh: 1: cannot create /home/migue/.bash_aliases: Permission denied, presumably because sudo can't display its password prompt when I call it programmatically.
I can't find a way to temporarily get root permissions after determining the full path (i.e. expanding ~) of the file.
Basically, I'd love to know any reasonable method to modify and append to .bash_aliases through a Python script. I haven't found any questions on this where the solutions worked for me.
I'd prefer for this method to not require any non-standard modules, as installing them will just make the process less seamless for people who use the script.
I can't find a way to determine which user ran the script so I can modify their file instead.
You can reference the file ~/.bash_aliases in your script and run the script without sudo; ~ will then expand to the invoking user's home directory (as long as the current user isn't root).
EDIT:
You simply need to grant write permission on .bash_aliases to each user who needs to modify it.
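A rough sketch of that approach; run it without sudo so ~ expands to the invoking user's home, and note that the alias string is purely illustrative:

import os

aliases_path = os.path.expanduser("~/.bash_aliases")   # the caller's own file
alias_line = 'alias ll="ls -la"\n'                      # example alias only

with open(aliases_path, "a") as f:                      # append, don't overwrite
    f.write(alias_line)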

How to run a Python script from Apache on Raspberry Pi?

So, on a Raspberry Pi I'm using a camera app with a web interface, and I wanted to add LED lighting with a NeoPixel. I have successfully done this and can now turn it on and off by running two Python scripts.
Explanation and question:
I have a Python script in /usr/local/bin that is executable.
It is owned by 'root root'.
I have a shell script in /var/www/html/macros that is executable and has to run the Python script in /usr/local/bin.
The shell script is owned by 'www-data'.
When I manually run the Python file, it executes correctly.
When I manually run the shell script, it executes the Python script.
When I run the shell script by clicking a button on my webpage, the shell script seems to execute correctly, but it doesn't appear to execute the Python script.
What can I do to fix this?
I'm not that experienced with permissions, but I want to emphasize that this is a closed system that does not contain any sensitive information, so safety/best practice is not a concern. I just want to make this work.
I'm not an expert in this area, but I believe you need root privileges to access /usr/local/bin/, which explains why you're having success but Apache isn't.
Rather than giving Apache root permissions, it's best to simply remove the requirement from the individual file you want to execute. This can be accomplished by:
$ cd /usr/local/bin
$ sudo chmod 777 your_script.py
Now, after 11 hours and a group of people thinking along, we found a solution to the problem.
The problem turned out to be that the web interface can only execute things as 'www-data', while the NeoPixel library that the Python script depends on needs to be run as root.
These two constraints mean there is no direct way of getting the scripts to work together.
However, the idea emerged to use some sort of pipe.
A brilliant user suggested using sshpass. This makes it possible to supply the password to ssh non-interactively and have the command effectively run as root.
The data from the web interface is relayed through sshpass, which runs the needed scripts with the required privileges.
Special thanks to Minty Trebor and Falcounet from the RRF for LPC/STM Discord!
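For illustration, a rough sketch of that relay as it might be called from Python; the user, host, password, and script path below are all placeholders, and in practice the password should come from a file with restricted permissions rather than being hard-coded:

import subprocess

def run_privileged_script(script_path):
    # www-data cannot use sudo directly, so ssh back into the Pi as a user
    # that can, letting sshpass supply that user's password non-interactively.
    return subprocess.run(
        ["sshpass", "-p", "raspberry",      # placeholder password
         "ssh", "pi@localhost",             # placeholder user and host
         "sudo", "python3", script_path],
        capture_output=True, text=True, check=True)

run_privileged_script("/usr/local/bin/led_on.py")   # placeholder script name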

How to implement multiple commands with root permissions with only one password prompt?

I'm working on a GUI application which calls two system commands, one after the other.
Those two commands require root permissions to be executed.
The first approach I made, is to call gksu <command_1>, then gksu <command_2>.
This works fine, but the user must enter their password twice in a row, and I believe this is not a good idea from a UX perspective.
I tried to call gksu with the first command and sudo with the second, but I get this error:
sudo: no tty present and no askpass program specified
So I tried to separate those commands into a Python file and call it from the original file with something like gksu python3 commands.py.
I'm not sure whether this would still work after I release a compiled version of the whole project, as I intend to use pyinstaller --onefile on it!
So, what I need exactly is a way for the app to run a specific script with superuser privileges, given that the final state of the app will be an executable binary file, and without running the whole app with root permissions.
Thanks to Itz Wam, whose answer guided me to the correct solution, which is using pkexec instead of gksu, like this:
pkexec bash -c "command_1;command_2"
You could execute this:
gksu -- bash -c 'command1; command2; command3'
It will ask for your password once and execute the three commands as root.
Source: https://askubuntu.com/questions/183608/gksudo-2-commands-with-one-pw-entry
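For illustration, a minimal sketch of calling that pkexec one-liner from a Python GUI; command_1 and command_2 stand in for the two real commands:

import subprocess

def run_privileged(commands):
    # Joining the commands into a single bash -c invocation means pkexec
    # only prompts for the password once.
    subprocess.run(["pkexec", "bash", "-c", "; ".join(commands)], check=True)

run_privileged(["command_1", "command_2"])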

How to use python subprocess.check_output with root / sudo

I'm writing a Python script that will run on a Raspberry Pi, read the temperature from a sensor, and log it to Thingspeak. I have this working with a bash script but want to do it with Python since it will be easier to manipulate and check the read values. The sensor reading is done with a library called loldht. I was trying to do it like this:
from subprocess import STDOUT, check_output
output = check_output("/home/pi/bin/lol_dht22/loldht", timeout=10)
The problem is that I have to run the library with sudo to be able to access the pins. I will run the script as a cron job. Is it possible to run this with sudo?
Or could I create a bash script that executes 'sudo loldht' and then run the bash script from Python?
I will run the script as a cron job. Is it possible to run this with sudo?
You can put python script.py in the crontab of a user with sufficient privileges (e.g. root, or a user with permissions to the files and devices in question).
I don't know which OS you're using, but if Raspbian is close to Debian, there is no need for sudo or root; just use a user with sufficient permissions.
It seems I can also do this: check_output(["sudo", "/home/pi/bin/lol_dht22/loldht", "7"], timeout=10)
Sure, but the Unix user that invokes that Python script will need sudo privileges (otherwise it can't call sudo from subprocess). In which case you might as well do as above and run the cron job as a user with the required permissions.
You can run sudo commands with cron. Just use sudo crontab -e to set up the cron job and it should work fine.
You should be very careful with running things as root. Since root has access to everything, a simple error can potentially render the system unusable.
The proper way to have access to the hardware as a normal user is to change the permissions on the required device files.
It seems that the utility you mention uses the WiringPi library. Some digging in the source code indicates that it uses the /dev/gpiomem (or /dev/mem) devices.
On Raspbian, device permissions are set with udev. See here and also here.
You could give every user access to /dev/gpiomem and other GPIO devices by creating a file, e.g. /etc/udev/rules.d/local.rules, and putting the following text in it:
ACTION=="add", KERNEL=="gpio*", MODE="0666"
ACTION=="add", KERNEL=="i2c-[0-9]*", MODE="0666"
The first line makes the gpio devices available, the second one I2C devices.

Changing user within python shell

I am using Ubuntu on my server.
I have two users; let them be user1 and user2.
Each user has their own project folder with permissions set according to their needs. But user1 needs to run a Python script which is in the other user's project folder. I use subprocess.Popen for this. The Python file has the required access permissions, so I have no problem calling that script. But the log files (which are only accessible to user2) cause a permission denied error, since they belong to the other user, not the one I need to use.
So I tried to change the user with:
Popen("exit", shell=True, stdin=sp.PIPE, stdout=sp.PIPE, stderr=sp.PIPE) #exit from current user, and be root again
Popen(["sudo", "user2"], shell=True, stdin=sp.PIPE, stdout=sp.PIPE, stderr=sp.PIPE)
Popen("/usr/bin/python /some/file/directory/somefile.py param1 param2 param3", shell=True, stdin=sp.PIPE, stdout=sp.PIPE, stderr=sp.PIPE)
But the second Popen fails with:
su: must be run from a terminal
Is there any way to change the user within a Python shell?
PS: For various reasons, I cannot change the permissions on the log files. I need to find a way to switch to the other user...
EDIT: I need to give some explanation to make things clear...
I have two different Django projects running. Each project has its own user and its own folder where the project's code and logs are kept.
Now, in some cases, I need to transfer some data from project1 to project2. The easiest way, it seems to me, is to write a Python script that accepts parameters and does the relevant job (data insertion) on the second project.
So, when certain functions are called within project1, I want to call a Python script that is in project2's folder, so I can update the data on the second project.
But since I call Popen from within project1, the current user is user1, and my script (which is in project2) writes to log files that I'm denied access to...
So, somehow, I need to switch from user1 to user2 so I will not have a permission problem when I call my Python file, or find another way to do this.
Generally, there is no way to masquerade as another user without having root permission or knowing the second user's login details. This is by design, and no amount of Python will circumvent it.
Python-ey ways of solving the problem:
If you have the permission, you can use os.setuid(x) (where x is a numeric user id) to change your effective user id; see the sketch after this list. However, this will require your script to be run as root. Have a look at the os module documentation.
If you have the permission, you can also try su -c <command> instead of sudo; su -c will prompt for the password (for which you can use the pexpect library).
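A minimal sketch of the os.setuid idea, assuming the parent process itself runs as root; the target username and script path mirror the question's example:

import os
import pwd
import subprocess

target = pwd.getpwnam("user2")          # look up user2's numeric uid/gid

def demote(uid, gid):
    # Runs in the child just before the script starts, so only the child
    # drops to user2; the parent keeps its own identity.
    def set_ids():
        os.setgid(gid)
        os.setuid(uid)
    return set_ids

subprocess.Popen(
    ["/usr/bin/python", "/some/file/directory/somefile.py",
     "param1", "param2", "param3"],
    preexec_fn=demote(target.pw_uid, target.pw_gid))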
Unix-ey ways of solving the problem:
Set the group executable permission on the script and have both users in the same group.
Add the setgid bit on the script so that user1 can run it with user2's group permissions:
$ chmod g+s script
This will allow group members to run the script with user2's group. (The same could be done for 'all', but that probably wouldn't be a good idea...)
[EDIT]: Revised question
The answer is that you're being far too promiscuous. Project A and project B probably shouldn't interact by messing around with each other's files via Popening each other's scripts. Instead, the clean solution is to have a web interface on project A that exposes the functionality you need and call it from project B. That way, if in the future you want to move A to a different host, it's not a problem.
If you insist, you might be able to trick sudo (if you have permission) by running it inside a shell instead. For example, have a script:
#!/bin/sh
sudo my/problematic/script.sh
Then chmod it +x and execute it from Python. This will get around the problem of not being able to run sudo outside a terminal.
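The Python side might then look something like this (the wrapper's location is a placeholder):

import subprocess

# Run the wrapper script, which in turn invokes sudo on the problematic script.
subprocess.call(["/path/to/run_with_sudo.sh"])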
