A Django settings file includes sensitive information such as the SECRET_KEY, database passwords, etc., which is unsafe to keep hard-coded in the settings file. I have come across various suggestions for storing this information more securely, including putting it into environment variables or separate configuration files. The bottom line seems to be that this keeps the keys out of version control (in addition to the added convenience of switching between environments), but that on a compromised system this information can still be accessed by an attacker.
Is there any extra benefit from a security perspective if sensitive settings are kept in a data vault / password manager and then retrieved at run-time when settings are loaded?
For example, to include in the settings.py file (when using pass):
import subprocess
# Passing the command as a list avoids shell=True and its injection/quoting pitfalls.
SECRET_KEY = subprocess.check_output(["pass", "SECRET_KEY"]).strip().decode("utf-8")
This spawns a subprocess and returns its output to Django. Is this more secure than setting the value through environment variables?
I think a data vault/password manager solution is a matter of transferring responsibility, but the risk is still there. When deploying Django in production, the server should be treated with the same care as a data vault: a firewall, fail2ban, an up-to-date OS, and so on must be in place. Then, in my opinion, there is nothing wrong or less secure about having a settings.py file with a config parser reading a config.ini file (declared in your .gitignore!) where all your sensitive information lives.
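A minimal sketch of that configparser approach (the file path, section names, and values below are just examples):

```python
import configparser

# For demonstration, create a sample config.ini here; on a real server
# this file already exists, is owned by the app user, and is gitignored.
with open("config.ini", "w") as f:
    f.write("[django]\nsecret_key = change-me\n\n[database]\npassword = change-me\n")

config = configparser.ConfigParser()
config.read("config.ini")

# settings.py then pulls values out of the parsed file.
SECRET_KEY = config["django"]["secret_key"]
DB_PASSWORD = config["database"]["password"]
```

Restricting the file's permissions (e.g. `chmod 600 config.ini`) limits exposure to the user the app runs as.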
Related
When using sessions, Flask requires a secret key. In every example I've seen, the secret key is somehow generated and then stored either in source code or in configuration file.
What is the reason to store it permanently? Why not simply generate it when the application starts?
app.secret_key = os.urandom(50)
The secret key is used to sign the session cookie. If you had to restart your application, and regenerated the key, all the existing sessions would be invalidated. That's probably not what you want (or at least, not the right way to go about invalidating sessions). A similar case could be made for anything else that relies on the secret key, such as tokens generated by itsdangerous to provide reset password urls (for example).
The application might need to be restarted because of a crash, or because the server rebooted, or because you are pushing a bug fix or new feature, or because the server you're using spawns new processes, etc. So you can't rely on the server being up forever.
The standard practice is to have some throwaway key committed to the repo (so that there's something there for dev machines) and then to set the real key in the local config when deploying. This way, the key isn't leaked and doesn't need to be regenerated.
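One common way to express that pattern in Flask (the `APP_SETTINGS` variable name is just an example):

```python
from flask import Flask

app = Flask(__name__)
# Throwaway key, safe to commit; deployments override it below.
app.config["SECRET_KEY"] = "dev-only-not-secret"

# On the server, APP_SETTINGS points at a file containing the real key,
# e.g. SECRET_KEY = "...". silent=True lets dev machines run without it.
app.config.from_envvar("APP_SETTINGS", silent=True)
```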
There's also the case of running secondary systems that depend on the app context, such as Celery for running background tasks, or multiple load balanced instances of the application. If each running instance of the application has different settings, they may not work together correctly in some cases.
I have a number of REST APIs from a software program (ex: Tradingview)
I would like to store the API credentials (e.g. keys, secrets) safely.
I had thought about placing them in a database table, but I am not fond of storing them in clear text there either.
I already know about using OS Environment Variables:
[... snip ...]
import os
import sys
import logging
[... snip ...]
LD_API_KEY = os.getenv("BINANCE_APIKEY")
LD_API_SECRET = os.getenv("BINANCE_API_SECRET")
where keys are stored in a file - but - as mentioned before, I have a number of API keys.
Just leaving them on a server (in clear text) - even though the file is hidden - is not sitting well with me.
Is there any other way to store API Keys?
There are a number of articles on this topic, a quick web search for "Storing API keys" will net you some interesting and informative reads, so I'll just talk about my experience here.
Really, it's all up to preference, the requirements of your project, and the level of security you need. I personally have run through a few solutions. Here's how my project has evolved over time.
Each key stored in environment variables
Simple enough, just had to use os.environ for every key. This very quickly became a management headache, especially when deploying to multiple environments, or setting up an environment for a new project contributor.
All keys stored in a local file
This started as just a file outside source control with an environment variable pointing to the file. I started with a simple JSON file in the following structure.
[
  {
    "name": "Service Name",
    "date": "1970-01-01", // to track rotations
    "key": "1234abcd",
    "secret_key": "abcd1234"
  }
]
This evolved into a class that accessed this file for me and returned the desired key so I didn't have to repeat json.load() or import os in every script that accessed APIs. This got a little more complex when I started needing to store OAuth tokens.
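A sketch of what such a helper class might look like (the class name and file layout are assumptions based on the JSON structure above):

```python
import json

class KeyStore:
    """Loads API credentials once from a JSON file kept outside source control."""

    def __init__(self, path):
        with open(path) as f:
            # Index entries by service name for direct lookups.
            self._keys = {entry["name"]: entry for entry in json.load(f)}

    def get(self, service):
        entry = self._keys[service]
        return entry["key"], entry["secret_key"]

# Demo with a sample file; in practice the path would come from an
# environment variable pointing outside the repository.
with open("keys.json", "w") as f:
    json.dump([{"name": "Service Name", "date": "1970-01-01",
                "key": "1234abcd", "secret_key": "abcd1234"}], f)

store = KeyStore("keys.json")
key, secret = store.get("Service Name")
```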
I eventually moved this file to a private, encrypted (git-secret), local-only git repo so team members could also use the keys in their environments.
Use a secret management service
The push to remote work forced me to create a system for remote API key access and management. My team debated a number of solutions, but we eventually fell on AWS Secrets Manager. The aforementioned custom class was pointed at AWS instead of a local file, and we gained a significant increase in security and flexibility over the local-only solution.
There are a number of cloud-based secret management solutions, but my project is already heavily AWS-integrated, so this made the most sense given the costs and constraints. It also means that each team member now only needs AWS permissions and uses their account's AWS API key for access.
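The AWS Secrets Manager call itself is a single API request; a sketch of the wrapper, with a stub client standing in for the real boto3 client (the secret name and payload are made up):

```python
import json

def get_secret(name, client):
    """Fetch and parse a JSON secret from AWS Secrets Manager.

    In production: client = boto3.client("secretsmanager")
    """
    response = client.get_secret_value(SecretId=name)
    return json.loads(response["SecretString"])

# Stub for illustration; a real boto3 secretsmanager client exposes the
# same get_secret_value(SecretId=...) method and response shape.
class _StubClient:
    def get_secret_value(self, SecretId):
        return {"SecretString": json.dumps({"key": "1234abcd"})}

secret = get_secret("tradingview/api", _StubClient())
```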
At present I have application-related variables such as an external server IP, default patterns, etc. These variables are specific to my application and include my external server username and password.
Now, how can I have a single common place for these, so that they are externalized from my application?
The options which I thought are below:
Have one conf.ini file and use configparser to read it when the Django app starts
But I am not sure where I should put the code that reads it so that it runs at startup.
Other option is to save this in a py file itself and import this file in my modules
Please suggest a good, standard approach for this.
Save the important secret details in an ini file and place it at /etc/project_name/my_settings.ini, then read those settings from settings.py. This keeps them outside your project directory, and they can be read directly in the settings file itself.
Or better way is to set them in bashrc, read the env vars from it.
Check this: Setting envs
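For the environment-variable route, settings.py can fail fast when a variable is missing (the variable name here is just an example):

```python
import os

# For demonstration only; on a real server this comes from ~/.bashrc:
#   export DJANGO_SECRET_KEY='...'
os.environ.setdefault("DJANGO_SECRET_KEY", "demo-secret")

# os.environ[...] raises KeyError if the variable is unset, so a
# misconfigured deploy fails immediately instead of running with no key.
SECRET_KEY = os.environ["DJANGO_SECRET_KEY"]
```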
I've read that it's best practice, for security reasons, to store things like API keys in the instance/settings.py file in Flask. Why is this so, and what is the mechanism that makes it so? I haven't been able to find much documentation about this online.
As @davidism suggested, config files are never meant to be tracked, since they hold your secret keys, and anybody with access to your code would have access to your keys if they were tracked.
But, there is no hard and fast rule in Flask to keep your settings file in a specific location. You can keep them anywhere and name them anything.
But, when adding the config to the app, the correct file path must be given.
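For example, loading an untracked file from the instance folder (the settings file and any keys in it are assumptions):

```python
from flask import Flask

# instance_relative_config=True makes from_pyfile look in the instance/
# folder, which is conventionally kept out of version control.
app = Flask(__name__, instance_relative_config=True)

# Loads instance/settings.py if present; that file might contain
#   API_KEY = "..."
# silent=True tolerates its absence, e.g. on dev machines.
app.config.from_pyfile("settings.py", silent=True)
```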
I am trying to figure out the most secure and flexible solution for storing credentials for database connections and other private information in a config file.
This is inside a Python module that logs the history of user activity in the system to different handlers (MongoDB, MySQL, files, etc.).
This logging module is attached to a handler, and it is there that I need to load the config file for each handler, i.e. database, user, password, table, etc.
After some research on the web and Stack Overflow, I mostly found comparisons of the security risks of JSON versus cPickle (concerning the eval method and type restrictions) rather than the config file storage issue itself.
I was wondering whether storing credentials in JSON is a good idea, given the security risk of having a .json config file on the server (from which the logging handler will read the data). I know that this .json file could be retrieved by an HTTP request. If the parameters are stored in a Python object inside a .py file, I guess there is more security, since any request for that file will first be interpreted by the server, but I lose the flexibility of modularization and easy modification of this data.
What would you suggest for this kind of Security issues while storing this kind of config files in the server and accessed by some Python class?
Thanks in advance,
Luchux.
I'd think about encrypting the credentials file. The process that uses it will need a key/password to decrypt it, which you can store somewhere else, or even enter interactively at server start-up. That way you don't have a single point of failure (though of course a determined intruder can eventually put the pieces together).
(Naturally you should also try to secure the server so that your credentials can't just be fetched by http request)
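One way to sketch that, assuming the third-party `cryptography` package is available and the key is kept out of band (the credential values are made up):

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key lives elsewhere: another file, an env var, or typed at start-up.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt the credentials file once, at deploy time...
ciphertext = fernet.encrypt(b'{"db_user": "app", "db_pass": "change-me"}')

# ...then decrypt it in memory when the process starts.
credentials = fernet.decrypt(ciphertext).decode("utf-8")
```

Only the ciphertext sits on disk with the app; without the key, fetching the file yields nothing usable.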