Our client has a web application running a Django instance from a virtualenv on an Ubuntu server. We did a security audit of that service and found a path traversal vulnerability in a file upload form that could allow an attacker to write arbitrary files to any path owned by the Django user. Example:
A parameter "Import Name" is supplied with value
../some/path/to/create
Then the form's file field is supplied with an arbitrary filename and the desired file contents.
The application then does
try:
    path = os.path.join(DEFAULT_UPLOAD_DIR, <Import Name>)
    os.mkdir(path)
    ...
    with open(os.path.join(path, <Filename From Form>), 'w') as upload_file:
        upload_file.write(<File Contents>)
    ...
The unsafe os.path.join allows the attacker to walk up the directory tree and upload to directories other than DEFAULT_UPLOAD_DIR. So if the attacker can find a path that doesn't yet exist on the server, they can create that folder (avoiding the failure of os.mkdir() inside the try...except) and the file is uploaded there.
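To illustrate the mechanism, a minimal sketch (not the application's actual code; the base directory here is made up):

import os

DEFAULT_UPLOAD_DIR = '/srv/app/uploads'  # hypothetical base directory

# os.path.join happily concatenates traversal components:
unsafe = os.path.join(DEFAULT_UPLOAD_DIR, '../some/path/to/create')
print(unsafe)                    # /srv/app/uploads/../some/path/to/create
print(os.path.realpath(unsafe))  # /srv/app/some/path/to/create -- outside uploads/

# A common guard is to resolve the path and check it stays under the base:
resolved = os.path.realpath(unsafe)
base = os.path.realpath(DEFAULT_UPLOAD_DIR)
if not resolved.startswith(base + os.sep):
    raise ValueError('path escapes the upload directory')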
Now this translates to a real exploit if the attacker is able to write to
../virtualenvs/<env name>/lib/python2.7/
Since Django modules, for example, are loaded from the site-packages subdirectory inside the virtualenv's python directory, and anything directly under lib/python2.7 comes earlier on the module search path, the module loading order essentially allows the attacker to 'overwrite' a module and ensure their code is run on import.
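A quick way to confirm this ordering (a sketch assuming a standard virtualenv layout) is to print the search path from the virtualenv's python shell:

import sys

# In a typical virtualenv, .../lib/python2.7 is listed before
# .../lib/python2.7/site-packages, so a module dropped directly into
# lib/python2.7 shadows a module of the same name in site-packages.
for entry in sys.path:
    print(entry)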
We did a proof-of-concept penetration test and wrote to
../virtualenvs/somepath/__init__.py
Which succeeded but for some reason we are unable to write to
../virtualenvs/<actual env name>/
Which is strange, because the permissions are exactly the same as with somepath, and the owner/group is in both cases the Django user. If I activate the virtualenv as the Django user and open a python shell, the write succeeds, so it seems weird that it fails when called from the vulnerable form view.
The question is: Is there something special about the virtualenv path from which the Django instance is running that makes it unable to write to that path? Or am I missing something?
Related
I am working on a project: a system where students can submit their coding assignments to a server, the server executes their code as part of a set of tests, and it returns the grades they received based on the results of those tests.
There is a security concern that a student submitting code could "damage" the server by including code that deletes the directory where the system's files are stored. The files are stored in a directory hierarchy, so if a student somehow figured out the path to it, they could easily code their program to delete this directory.
I have since set up permissions so that the server is run under a different user. This user only has access to a single directory that stores all the submissions for that module. They could still theoretically wipe out this directory, but that is better than deleting the whole system. While it is still not ideal, I am not sure how to approach it.
The server is coded in Python. I have tried using os.chown etc. to change the ownership of directories to a single user; however, I found out that the program needs to be run as a superuser to change ownership, and also for calls to os.setuid and os.setgid.
Basically, my question is: is there any way to run a program while restricting it to the directory it's running within? This would involve only allowing it to delete files/folders within its working directory. I know there is a chroot command, but that also requires superuser privileges.
It is also not possible to run the program under a different user without sudo privileges. The programs are executed using subprocess.Popen().
I know it's a long shot; I have done a lot of research into permissions, and the current solution of restricting deletion to the submissions data directory is as far as I could get. It is still not ideal, however, and the server will not be allowed to run with sudo privileges.
If there are any program attributes that can be set to prevent that program from deleting files, it would be great, but I don't think such a system exists. I may have to resort to "scanning" the submitted code file for dangerous calls and reject the file if there are any such calls in it.
The following directory hierarchy is used:
.handin/module-code/data (data is where submissions for each student are stored)
Currently, the data directory is created with a group handin, which allows any members of that group to create directories inside it. With the server running under a user handin, it creates directories/files inside that data directory with user handin and group handin. So, the only things the server could delete as user handin are the directories underneath data, rather than the whole .handin directory.
Underneath data, you have directories named after the student IDs, e.g. .handin/module-code/data/12345678, and underneath that you have a directory with the assignment name. The assignment directory is the directory the code is executed in. Ideally, only that directory could be deleted, but if not, the student-id directory.
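For concreteness, a purely illustrative sketch of the kind of setup described above (not the project's actual code; the student ID, assignment name and test command are made up):

import os
import subprocess

# Hypothetical locations following the hierarchy described above.
data_dir = '.handin/module-code/data'
submission_dir = os.path.join(data_dir, '12345678', 'assignment-1')

# The submitted program is executed with its working directory set to the
# assignment directory; nothing here stops it from touching other paths the
# handin user can write to, which is exactly the concern described.
proc = subprocess.Popen(
    ['python', 'solution.py'],
    cwd=submission_dir,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
out, err = proc.communicate()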
So, I have solved the problem using separate Docker containers for each execution. I created separate images for each of the languages the submitted programs could be written in. I then created a user in these containers that had just enough permissions to create/delete files inside their home directory, essentially sandboxing it.
I then used the epicbox python module (https://pypi.org/project/epicbox/) which was created by Stepik.org to grade programming assignments in their own containers (very similar to the problem that I needed to solve).
That creates a volume internally to allow each docker container that is run to share files:
with epicbox.working_directory() as workdir:
    epicbox.run(....)
    epicbox.run(....)
    .....
    epicbox.run(....)
Each run call spins up a Docker container, but through the internally created volume each container can access files produced by the previous call to run(). The module allows you to upload files from your local machine to the Docker container and then compile/execute them there.
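For reference, a fleshed-out sketch of that pattern, based on the usage shown in the epicbox README (the profile name, Docker image, commands and limits here are illustrative, not the ones from my grader):

import epicbox

epicbox.configure(profiles=[
    epicbox.Profile('python', 'python:3.8-alpine'),  # illustrative image tag
])

files = [{'name': 'main.py', 'content': b'print("hello")'}]
limits = {'cputime': 1, 'memory': 64}

with epicbox.working_directory() as workdir:
    # Each run() spins up a fresh container; the shared volume behind
    # workdir lets later runs see files produced by earlier ones.
    epicbox.run('python', 'python3 main.py', files=files,
                limits=limits, workdir=workdir)
    result = epicbox.run('python', 'cat main.py',
                         limits=limits, workdir=workdir)
    print(result)  # dict with exit_code, stdout, stderr, duration, ...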
I did have to do some "hacks" to configure it to my requirements (changing the default working directory in the Docker container, as epicbox did not provide a method to change it easily). This solution adds a few extra seconds to the execution time compared with executing on the server itself, but for my use case it is an acceptable trade-off for security.
I am working on a Django-based application located on my disk at /home/user/Documents/project/application. The application takes some values from the user and writes them into a file located in a folder under the project directory, i.e. /home/user/Documents/project/folder/file. While running the development server with python manage.py runserver everything worked fine; however, after deployment the application/views.py, which accesses the file via open('folder/path','w'), is no longer able to access it, because relative paths are resolved against /var/www by default when deployed via the apache2 server using mod_wsgi.
Now, I am not putting the folder into /var/www because it is not good practice to put any Python code there, as it might become readable by clients, which is a major security threat. Please let me know how I can point the deployed application to read and write the correct file.
The real solution is to install your data files in /srv/data/myapp or some such, so that you can give the webserver user the correct permissions to only those directories. Whether you choose to put your code in /var/www or not is a separate question, but I would suggest putting at least your wsgi file there (and, of course, specifying your DocumentRoot correctly).
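A minimal sketch of that approach, assuming a made-up DATA_DIR setting pointing at such a directory:

# settings.py
DATA_DIR = '/srv/data/myapp'  # hypothetical data directory writable by the webserver user

# views.py
import os
from django.conf import settings

def save_values(values):
    # Build an absolute path instead of relying on the process's current
    # working directory, which differs under mod_wsgi.
    target = os.path.join(settings.DATA_DIR, 'file')
    with open(target, 'w') as f:
        f.write(values)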
We use Django for a project. Prior to 1.7, we were able to dynamically pull files into our Python environment when setup.py was being executed. For example, we have this directory structure:
/foo/bar
/foo/bar/__init__.py
/foo/bar/my_functions.py
Our Django project doesn't know anything about those files to start with. When the web server starts and setup.py is read, we look at a configuration which tells us where to find files to add to our environment. Again, let's assume /foo/bar is in our configuration to be dynamically loaded. We then use import_module() to import it so that anything under /foo/bar essentially becomes part of the project and can be used. It's basically a "plugins" feature.
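Roughly, the loading step looks like this (a simplified sketch of the idea; the configuration lookup is omitted and the names are placeholders):

import sys
from importlib import import_module

# Hypothetical configuration entry: a directory containing plugin packages.
plugin_dir = '/foo'          # parent directory
plugin_package = 'bar'       # package under it, i.e. /foo/bar

if plugin_dir not in sys.path:
    sys.path.append(plugin_dir)

# After this, the package and its modules are importable like any other code.
plugin = import_module(plugin_package)
my_functions = import_module(plugin_package + '.my_functions')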
After Django >=1.7, this causes huge problems, mainly:
django.core.exceptions.AppRegistryNotReady: The translation infrastructure
cannot be initialized before the apps registry is ready. Check that you
don't make non-lazy gettext calls at import time.
It also limits our ability to add new files dynamically as you always have to put them in place and restart your web server. You couldn't have an "Upload Plugin" page to add new files to the server, for example.
Is there a way to add files like this both during the web server startup as well as after the startup without restarting it?
I'm using AWS for the first time and have just installed boto for Python. I'm stuck at the step where it advises to:
"You can place this file either at /etc/boto.cfg for system-wide use or in the home directory of the user executing the commands as ~/.boto."
Honestly, I have no idea what to do. First, I can't find boto.cfg, and second, I'm not sure which command to execute for the second option.
Also, when I deploy the application to my server, I'm assuming I need to do the same thing there too...
"You can place this file either at /etc/boto.cfg for system-wide use
or in the home directory of the user executing the commands as
~/.boto."
The former simply means that you might create a configuration file named boto.cfg within directory /etc (i.e. it won't necessarily be there already, depending on how boto has been installed on your particular system).
The latter is indeed phrased a bit unfortunately - ~/.boto means that boto will look for a configuration file named .boto within the home directory of the user executing the commands (i.e. Python scripts) that use the boto library.
You can read more about this in the boto wiki article BotoConfig, e.g. regarding the question at hand:
A boto config file is simply a .ini format configuration file that specifies values for options that control the behavior of the boto library. Upon startup, the boto library looks for configuration files in the following locations and in the following order:
/etc/boto.cfg - for site-wide settings that all users on this machine will use
~/.boto - for user-specific settings
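For reference, such a file (whether /etc/boto.cfg or ~/.boto) might look like the following; the credential values are of course placeholders:

[Credentials]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY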
You'll indeed need to prepare a respective configuration file on the server your application is deployed to as well.
Good luck!
For those who want to configure the credentials in Windows:
1- Create your file with the name you want (e.g. boto_config.cfg) and place it in a location of your choice (e.g. C:\Users\\configs).
2- Create an environment variable with the Name='BOTO_CONFIG' and Value=file_location/file_name
3- Boto is now ready to work with the credentials automatically configured!
To create environment variables in Windows follow this tutorial: http://www.onlinehowto.net/Tutorials/Windows-7/Creating-System-Environment-Variables-in-Windows-7/1705
For anyone looking for information on the now-current boto3: it does not use a separate configuration file, but rather respects the default one created by the AWS CLI when running aws configure (i.e., it will look at ~/.aws/config).
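Those files are plain INI as well; after running aws configure they typically contain something like the following (illustrative region and placeholder credentials; the keys themselves land in the companion ~/.aws/credentials file):

# ~/.aws/config
[default]
region = us-east-1
output = json

# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY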
I'm trying to run a site with Django on an IIS-based server. I followed all the instructions on the main site (http://code.djangoproject.com/wiki/DjangoOnWindowsWithIISAndSQLServer) and double-checked them against a very good article (http://www.messwithsilverlight.com/2009/11/django-on-windows-server-2003-and-iis6/).
I successfully got as far as setting up IIS to read .py files. Following the main instructions, I can get the server to render Info.py. However, I can't seem to get IIS and Django to play nicely. If, for instance, my virtual directory is "abc", then going to "localhost/abc/" simply shows me the directory listing for that folder. Furthermore, if my urls are set up so that "/dashboard/1" should bring me to a certain page, entering "localhost/abc/dashboard/1" gives me a "page cannot be displayed" error.
I'm fairly certain IIS simply isn't referencing or interacting with Django at all. Does anyone have any ideas how to fix this?
Thanks
Here are the original instructions I followed:
basic instructions: https://code.djangoproject.com/wiki/DjangoOnWindowsWithIISAndSQLServer
additional tips: http://whelkaholism.blogspot.ca/
The first thing you should do is install Python 2.5 or 2.6; for 2.7 you need to recompile PyISAPIe, which I have not done. http://www.python.org/ftp/python/2.6/python-2.6.msi
You need to install the version of PyISAPIe that matches your Python interpreter version; if they do not match, it will fail. Get it here: http://sourceforge.net/projects/pyisapie/files/pyisapie/
Move the extracted folder from the previous step to a sensible location (e.g. C:\)
You need to change the security settings of PyISAPIe.dll; they suggest read access for Network Service, but I set Everyone, to be sure there are no problems with this
You then have to CUT AND PASTE (Important) the Http folder of PyISAPIe to Lib\Site-Packages of your Python installation directory
Next, you set up IIS (open the manager with inetmgr in Run (winkey+r)):
Add a new virtual directory and allow executing ISAPI extensions when prompted by the wizard
Add a new wildcard extension in the properties of your virtual directory, and untick the "check that file exists" setting
Add Web Service Extension to IIS Manager pointing to the dll, ensure it is allowed
From the PyISAPIe folder, copy examples\django\Isapi.py and paste it in Lib\Site-Packages\Http
In Isapi.py, set the path (i.e. c:\inetpub\wwwroot\web_site\django_project) and DJANGO_SETTINGS_MODULE (i.e. django_app.settings), as sketched after these steps
When any change is done to your files, use iisreset in your command prompt to apply the changes
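The edit in Isapi.py boils down to a couple of lines along these lines (a sketch only; the shipped example file contains more boilerplate around it, and the path and settings module are the placeholders from above):

import os
import sys

# Make the Django project importable and tell Django which settings to use.
sys.path.append(r'c:\inetpub\wwwroot\web_site\django_project')
os.environ['DJANGO_SETTINGS_MODULE'] = 'django_app.settings'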
Here are some other things you might need to do:
Ensure the path of your db file (if sqlite is used) is okay; an absolute path built in settings.py is safest (see the sketch below this list)
Do the same with the template location settings
In your urls and html files, ensure the paths start with the name you gave to your virtual directory alias (i.e. web_site in our example)
Finally, you may encounter difficulties with serving your CSS. If you have any troubles, tell me and I will update my post.
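For the sqlite path in particular, a common trick (sketched here; the names are illustrative and assume a DATABASES-style settings module) is to anchor it to the settings file's own location:

# settings.py
import os

# Directory containing this settings.py, used to build an absolute path so
# the database is found regardless of IIS's working directory.
PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(PROJECT_ROOT, 'db.sqlite3'),
    }
}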
Serving Django with any webserver basically involves three key details:
1. Telling the webserver, "I want you to serve content that is provided by this module that invokes python"
2. Telling the python module, "I want you to execute python code using the details in this file"
3. Telling the file, "I want you to use Django"
If you're getting a directory listing back for your virtual directory, then it would seem that you should investigate the VD settings to make sure PyISAPIe is configured for that directory (key detail #1).
From the article you mentioned:
Open the IIS Management Console, create a new virtual directory, and allow executing ISAPI extensions when prompted by the wizard.
View the properties of the new folder and click on the "configuration" button (if it's greyed out, click 'create' first), then add a new wildcard extension (the lower box), locate the pyisapie.dll file and untick the "check that file exists" box.
In the IIS Manager, go to the "Web Service Extensions" section, and right click -> add new web service extension.
Give it a name (it doesn't matter what), add the pyisapie.dll file as a required file and check the box to set the extension status to allowed.