Hey everybody, I'm just a newbie at Python. I wanted to write a script in Python to change the DNS settings.
But after writing the code below, I learned that resolv.conf is a read-only file, because I got this error: IOError: [Errno 13] Permission denied: '/etc/resolv.conf'
myFile = open("/etc/resolv.conf", "w")
Then I did a little searching, found os.chmod(), and added a new line to lift all the restrictions on resolv.conf:
os.chmod("/etc/resolv.conf", 0777)
But now I'm getting the same error: IOError: [Errno 13] Permission denied: '/etc/resolv.conf'
I can't get past this problem and I'm waiting for your advice.
Thank you.
/etc/resolv.conf is typically owned by root. Unless your script is run in such a way that it has root privileges, it won't be able to change the file.
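As a rough sketch of the usual approach (assuming a Unix-like system and nothing beyond the standard library), you can make the script check for root up front and fail with a clear message, instead of loosening the file's permissions:

import os
import sys

# Effective UID 0 means we are running as root (e.g. via sudo).
if os.geteuid() != 0:
    sys.exit("This script must be run as root, e.g.: sudo python changedns.py")

# The nameserver below is just a placeholder; substitute your own.
with open("/etc/resolv.conf", "w") as resolv:
    resolv.write("nameserver 8.8.8.8\n")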
You must run chmod as root before running your script. Once you have the permissions, your script will run without errors.
You should never allow a file like resolv.conf to be writable by all. It looks like you were chmod'ing it (or trying to) to 777, and that's really bad. There is a lot someone could do by changing the resolver on a host and pointing that host at systems that were set up for malicious purposes. For example, one could run their own LDAP server and, by changing resolv.conf, point a system at their resolver and at their LDAP server, thereby possibly gaining privileged levels of access.
Keep this file locked down at all times.
I created a program which modifies the hosts file in order to block some websites, but when I run the program, I get this error.
By the way, I created my Python file using the PyCharm IDE, and my intention is to run this script every time I turn on my PC, using Task Scheduler. So please tell me what I should be running as administrator. Is it PyCharm itself? Most importantly, how do I give it admin permissions permanently?
[Errno 13] Permission denied: 'C:\\Windows\\System32\\drivers\\etc\\hosts'
Please kindly tell me a way to fix this.
On Windows, create a scheduled task triggered at logon and configure it to run under a system account. The task can simply run a batch file, and within the batch file you can run the command below. Ensure the task is set to run with administrator privileges in the task-creation window:
C:\PythonFolder\python.exe yourscript.py
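If you also want the script itself to bail out when it is not elevated, a minimal Windows-only sketch (using only ctypes from the standard library; the blocked site is just an example) could look like this:

import ctypes
import sys

# IsUserAnAdmin() returns nonzero when the process is elevated;
# ctypes.windll only exists on Windows, hence the try/except.
try:
    is_admin = bool(ctypes.windll.shell32.IsUserAnAdmin())
except AttributeError:
    is_admin = False

if not is_admin:
    sys.exit("Run this script from an elevated (administrator) context.")

with open(r"C:\Windows\System32\drivers\etc\hosts", "a") as hosts:
    hosts.write("\n127.0.0.1 example.com")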
I use fabric and have:
put('/projects/configuration-management/prototype','/etc/nginx/sites-available')
The result is:
Underlying exception:
Permission denied
Aborting.
Other configuration files can be uploaded without a problem. How can I avoid this issue?
It looks like you need superuser permissions; run it using sudo and it will work just fine.
The docs (link here) say:
While the SFTP protocol (which put uses) has no direct ability to upload files to locations not owned by the connecting user, you may specify use_sudo=True to work around this. When set, this setting causes put to upload the local files to a temporary location on the remote end (defaults to remote user's $HOME; this may be overridden via temp_dir), and then use sudo to move them to remote_path.
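Applied to the call above, and assuming Fabric 1.x (where put lives in fabric.api), that would be something like:

from fabric.api import put

# Uploads to a temp location on the remote end, then sudo-moves the file
# into the root-owned directory, per the docs quoted above.
put('/projects/configuration-management/prototype',
    '/etc/nginx/sites-available',
    use_sudo=True)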
I have a Python script that I am running via a POST request. The script is located in my cgi-bin, and at the end of the script I am trying to write a file to the /var/www/html/ folder like this:
myFile = open("/var/www/html/file.html", "w")
myFile.write("<html><body><p>test</p></body></html>")
myFile.close()
But I keep getting
<type 'exceptions.IOError'>: [Errno 13] Permission denied: '/var/www/html/file.html'
What is going wrong?
The error itself is already clear: you don't have permission to write to /var/www/html. It is most likely related to the ownership of the directory; if the directory is owned by another user and your current user (here, the user the web server runs your CGI script as) has no write permission on it, this error occurs.
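A quick way to confirm this from inside the CGI script is to test the directory before writing. Here is a small diagnostic sketch; note that os.access checks against the real UID of the process, which for CGI is usually the web server account (e.g. www-data on Debian-like systems):

import os

directory = "/var/www/html"

if not os.access(directory, os.W_OK):
    # Emit a plain-text CGI response explaining which user lacks access.
    print("Content-Type: text/plain\n")
    print("Cannot write to %s as UID %d" % (directory, os.getuid()))

Then either chown the directory to that user or grant it write access with chmod/ACLs.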
I am compiling a website using TeamCity on a server and need to deploy the compiled website to AWS.
As my last build step, I use the Elastic Beanstalk CLI to deploy: "C:\Python34...\eb.exe deploy".
eb init has already been run...but whenever I run "eb deploy", (even when I run it from the command line in an empty directory--which should deploy a default project to AWS), an error appears saying:
Error: PermissionError :: [Errno 13] Permission denied: './pagefile.sys'
I have already run the command on my local machine without any problems; I receive the error on the server regardless of whether I am running the command line as an administrator.
I am wondering if this is a permissions issue with the server, or something else? I haven't been able to gain much insight from the other questions, because they seem to have been solved on a case-by-case basis.
pagefile.sys is the Windows swap file.
It is a special file which cannot be written to or otherwise manipulated. Whatever your command is doing, you need to fix it so that it ignores this file.
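For eb deploy specifically, the EB CLI honors an .ebignore file (with .gitignore-style syntax) in the project root, so one way to keep the bundle away from such files is something like the following (hiberfil.sys is another locked Windows system file you may hit at a drive root):

# .ebignore -- files the EB CLI should skip when bundling the application
pagefile.sys
hiberfil.sys

That said, deploying from a directory that contains only your project, rather than a drive root, avoids the problem entirely.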
I have a Python script running on the default OSX webserver, stored in /Library/WebServer/CGI-Executables. That script spits out a list of files on a network drive using os.listdir.
If I just execute this from the terminal, it works as expected, but when I try to access it through a browser (computer.local/cgi-bin/test.py), I get a permissions error:
<type 'exceptions.OSError'>: [Errno 13] Permission denied: '/Volumes/code/code/_sendrender/queue/waiting'
Is there any way to give the script permission to access the network when accessed via CGI?
I don't know much about the default OSX webserver, but the webserver process is probably running as some specific user, and that user needs to be able to access those files. To find out who that user is, you can use the ps command. Then, depending on the configuration of the network share, you can add that user to the users allowed to access the data.
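Another quick way to see the user is to have the CGI script report it itself (a small diagnostic sketch, standard library only):

#!/usr/bin/python
import getpass
import os

# Report which account the web server runs this CGI script as, so you
# know which user needs access to the network volume.
print("Content-Type: text/plain\n")
print("user=%s uid=%d" % (getpass.getuser(), os.getuid()))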