I am building a simple app that uses the Twitter API. What do I have to do to hide my Twitter app keys? For example, if I put my program on the internet, anybody who looks at the code will see my consumer key, access token, etc. But if I don't include this information in my program, it won't work!
I'm assuming that by putting it on the internet you mean publishing your code on GitHub or similar.
In that case you should always separate code and configuration. Put your API keys in an .ini file, e.g. config.ini, then load that file from your Python program using configparser.
Add the configuration file to your .gitignore so it does not get added to source control.
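A minimal sketch of that layout, assuming a config.ini with a [twitter] section (the section and key names here are just placeholders):
# config.ini -- listed in .gitignore, never committed
# [twitter]
# consumer_key = yourconsumerkey
# consumer_secret = yourconsumersecret
# access_token = youraccesstoken
# access_token_secret = youraccesstokensecret

from configparser import ConfigParser

config = ConfigParser()
config.read('config.ini')

consumer_key = config['twitter']['consumer_key']
access_token = config['twitter']['access_token']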
Assuming you're running on a Unix-like system, one way to handle this is environment variables.
In your shell you can do this:
export TWITTER_API_KEY=yoursecretapikey
Note that you don't use quotes of any kind for this.
Then in your script:
import os
twitter_key = os.environ.get('TWITTER_API_KEY')
Let's say:
I have my python code in main.py and I am using Pandas
I am storing my API key (to some Azure service) in a Windows environment variable (variable name = "AZURE_KEY", variable value = "abc123abc")
I will import this API Key in main.py using azure_key = os.environ.get("AZURE_KEY")
Question:
How can I be sure that the Pandas library hasn't sent azure_key's value somewhere outside my local system?
Possible Approach:
I know one way is to go through all of the Pandas source files and understand the code to see whether anything fishy is happening, but such an approach is not feasible.
Note:
Pandas is just an example for this question. I actually want to use an API key within Streamlit code.
Hence, please treat this question as agnostic to the library.
For a production system (on a server), you could use a firewall to filter outgoing connections.
For a development system (your machine), you could add restrictions to the "API key" account (e.g. only access test data, only access the systems you really need, etc.).
I'm currently using the azure-cosmos module in Python to connect to a database on Azure. I want to fetch the data, make a few transformations, and then push it to a new container.
You need the key and client ID to connect to the database, which I've used as variables in my code for now, as follows:
from azure.cosmos import CosmosClient

url = 'https://xyz.azure.com:443/'
key = 'randomlettersandnumbers=='
client = CosmosClient(url, credential=key)
This intuitively seems like bad practice, and especially once I push this to Git, anyone could gain access to my database. So what's the most secure way to do this?
I'm coming from a non-SWE background, so apologies if this question is dumb.
Thanks!
The way I deal with this kind of problem is by using environment variables.
from azure.cosmos import CosmosClient
import os

url = os.environ.get("URL_ENDPOINT")
key = os.environ.get("API_KEY")
client = CosmosClient(url, credential=key)
You can set them in your shell like this (note that shell variable names can't contain hyphens, hence the underscores):
export URL_ENDPOINT="https://xyz.azure.com:443/"
export API_KEY="randomlettersandnumbers=="
Or you can put them in a bash script, envs.sh:
export URL_ENDPOINT="https://xyz.azure.com:443/"
export API_KEY="randomlettersandnumbers=="
And then load them with the source command:
source envs.sh
There is a good article about storing sensitive data in environment variables here.
I am creating a Python script which reads a spreadsheet and issues REST requests to an external service. A token must be obtained to issue requests to the REST service, so the script needs a username and password to obtain the OAuth2 token.
What's the best or standard way to keep this information from being visible in the Python script?
I recommend using a config file. Let's create a config file and name it config.cfg. The file structure should look more or less like this:
[whatever]
key=qwerertyertywert2345
secret=sadfgwertgrtujdfgh
Then in Python you can load it this way:
from configparser import ConfigParser
config = ConfigParser()
config.read('config.cfg')
my_key = config['whatever']['key']
my_secret = config['whatever']['secret']
In general, the most standard way to handle secrets in Python is by putting them in runtime configuration.
You can do that by reading explicitly from external files or using os.getenv to read from environment variables.
Another way is to use a tool like python-decouple, which lets you use the environment (variables), a .env file, and an .ini file in a cascade, so that developers and operations can control the environment in local, dev, and production systems.
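For example, a minimal sketch with python-decouple, assuming the package is installed and a .env file sits next to your code (the variable names are just placeholders):
# .env -- kept out of source control
# API_KEY=randomlettersandnumbers==
# DEBUG=False

from decouple import config

api_key = config('API_KEY')                        # read from the environment or .env
debug = config('DEBUG', default=False, cast=bool)  # optional default and type cast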
At present my application has variables like an external server IP, default patterns, etc. These variables are specific to my application, and they include my external server username and password.
Now how can I have a single common place for them, so that I can externalise this configuration from my application?
The options I have thought of are below:
Have one conf.ini file, use configparser, and read it during the startup of the Django app.
But I am not sure where the method that reads this should live so that it runs at startup.
The other option is to save this in a .py file itself and import that file in my modules.
Please suggest a good, standard approach for the above.
Save the important secret details in an ini file and place it in /etc/project_name/my_settings.ini. Read these settings from settings.py. This way it will be secure, and the values can be read directly in the settings file itself.
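A minimal sketch, assuming the file lives at /etc/project_name/my_settings.ini with a [secrets] section (the path, section, and key names are placeholders):
# /etc/project_name/my_settings.ini
# [secrets]
# server_user = myuser
# server_password = mypassword

# settings.py
from configparser import ConfigParser

_config = ConfigParser()
_config.read('/etc/project_name/my_settings.ini')

EXTERNAL_SERVER_USER = _config['secrets']['server_user']
EXTERNAL_SERVER_PASSWORD = _config['secrets']['server_password']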
Or, a better way is to set them in your .bashrc and read the env vars from there.
Check this: Setting envs
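A minimal sketch of the env-var approach, with hypothetical variable names:
# in ~/.bashrc:
# export EXTERNAL_SERVER_USER="myuser"
# export EXTERNAL_SERVER_PASSWORD="mypassword"

# settings.py
import os

EXTERNAL_SERVER_USER = os.environ.get('EXTERNAL_SERVER_USER')
EXTERNAL_SERVER_PASSWORD = os.environ.get('EXTERNAL_SERVER_PASSWORD')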
I've got a website that I wrote in Python using CGI. This was great up until very recently, when the ability to scale became important.
I decided, because it was very simple, to use mod_python. Most of the functionality of my site is stored in a Python module which I call to render the various pages. One of the CGI scripts might look like this:
#!/usr/bin/python
import mysite
mysite.init()
mysite.foo_page()
mysite.close()
and in mysite, I might have something like this:
def get_username():
    # os and Cookie are imported at the top of the module
    cookie = Cookie.SimpleCookie(os.environ.get("HTTP_COOKIE", ""))
    sessionid = cookie['sessionid'].value
    ip = os.environ['REMOTE_ADDR']
    # pseudocode: select username from sessions where ip = %foo and session = %bar
    username = ...
    return username
to fetch the current user's username. The problem is that this depends on os.environ getting populated when os is imported by the script (at the top of the module). Because I'm now using mod_python, the interpreter only loads this module once, and only populates the environment once. I can't read cookies because its os module has the environment variables of the local machine, not those of the remote user.
I'm sure there is a way around this, but I'm not sure what it is. I tried re-importing os in the get_username function, but no dice :(.
Any thoughts?
Which version of mod_python are you using? Mod_python 3.x includes a separate Cookie class to make this easier (see here).
Under earlier versions IIRC you can get the incoming cookies inside of the headers_in member of the request object.
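A rough sketch of the mod_python 3.x approach, assuming you pass the request object down into mysite instead of relying on os.environ (lookup_session is a hypothetical stand-in for your existing session query):
from mod_python import Cookie

def get_username(req):
    # cookies and the client address come from the request object, not os.environ
    cookies = Cookie.get_cookies(req)
    sessionid = cookies['sessionid'].value
    ip = req.connection.remote_ip
    username = lookup_session(ip, sessionid)  # hypothetical helper wrapping the session query
    return username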