Flask Session Cookies Expire Almost Instantly, Can't Set SameSite Attribute - python

I am making a web application with a session-cookie login system. The cookies expire within seconds of being set, logging the user out of whatever service they were in. When I open my app I occasionally get a warning in the terminal that states: UserWarning: The session cookie domain is an IP address. This may not work as intended in some browsers. Add an entry to your hosts file, for example "localhost.localdomain", and use that instead. I'm hosting this app on Heroku, so I don't think editing my local hosts file would help, but if there's a way to solve this on Heroku that would be great. Another error message comes from the browser console on the site itself, which reads:
Cookie “session” will be soon rejected because it has the “SameSite” attribute set to “None”
or an invalid value, without the “secure” attribute. To know more about the “SameSite”
attribute, read https://developer.mozilla.org/docs/Web/HTTP/Headers/Set-Cookie/SameSite
I set the session cookie configuration in my web application like this:
app.config["SESSION_FILE_DIR"] = tempfile.mkdtemp()
app.config["SESSION_PERMANENT"] = False
app.config["SESSION_TYPE"] = "filesystem"
app.config["SESSION_COOKIE_SECURE"] = True
app.config["SESSION_COOKIE_SAMESITE"] = "None"
Session(app)
But this didn't solve my problem, and both errors keep coming up. If there's any way to manually set SameSite and Secure, that would be fantastic. Getting an HTTPS connection on Heroku did not help either. I don't know why this is happening, and it breaks the site; any advice would be greatly appreciated!

You need to use a domain name to access the service (https://domain.xxx/) and not the IP address (https://123.123.123.213).
To avoid a lot of pain and errors, aim to use HTTPS, especially if you want cookies to work properly. Both the Secure and SameSite attributes require HTTPS to work in most browsers, and to get HTTPS working you need a domain name and a proper certificate.
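On Heroku specifically, one companion step (a sketch, assuming the app is reached via a real hostname such as your-app.herokuapp.com rather than an IP) is to trust the proxy headers Heroku's router sets, so Flask treats the forwarded request as HTTPS:

from flask import Flask
from werkzeug.middleware.proxy_fix import ProxyFix

app = Flask(__name__)

# Heroku's router terminates TLS and forwards the request over plain HTTP,
# setting X-Forwarded-Proto/X-Forwarded-Host. Trusting one hop of those
# headers lets Flask see the request as secure.
app.wsgi_app = ProxyFix(app.wsgi_app, x_proto=1, x_host=1)

app.config["SESSION_COOKIE_SECURE"] = True
app.config["SESSION_COOKIE_SAMESITE"] = "None"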

Related

Python Django /w Microsoft Graphs - I keep getting value error "state missing from auth_code_flow"

I'm following this Microsoft tutorial for building Django apps with Microsoft Graph (using it on my existing Django web app), and I am having an issue with authentication: https://learn.microsoft.com/en-us/graph/tutorials/python
I'm on the step 'Add Azure AD authentication' and, after implementing it,
I hit the sign-in button and enter credentials... and I keep getting the ValueError "state missing from auth_code_flow".
The "callback" method only makes it to result = get_token_from_code(request) and then fails.
Here is the get_token_from_code method:
def get_token_from_code(request):
    cache = load_cache(request)
    auth_app = get_msal_app(cache)

    # Get the flow saved in the session during the sign-in step
    flow = request.session.pop('auth_flow', {})

    result = auth_app.acquire_token_by_auth_code_flow(flow, request.GET)
    save_cache(request, cache)
    return result
What I'm trying to do is eventually access excel online from my webapp.
Any help is greatly appreciated!
I just had this issue and resolved it. It is one of these two things:
1. You are starting out at 127.0.0.1:8000, and when you're redirected you end up at localhost:8000, which is a different host. Sessions aren't shared between the two, so the state saved before login is gone afterwards. The solution is to start out on localhost:8000 so the session persists across the login redirect (see the sketch after this list).
2. Your browser is using super-strict cookie settings. Microsoft Edge appears to default to this mode on localhost and 127.0.0.1. There is a lock or shield icon in or near your address bar that lets you relax the restrictions on your cookie settings.
Try one or both of these and you should succeed.
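To make the first point concrete, here is a hedged sketch of a sign-in view (the scope and redirect path are illustrative, not taken from the tutorial); the fix is simply that the host you browse to and the host in the registered redirect URI must match:

from django.http import HttpResponseRedirect

# Why the error happens: the sign-in view stores the flow (including its
# 'state') in the session, and the callback pops it back out. If you start
# at 127.0.0.1 but the redirect lands on localhost, the browser treats those
# as different hosts and won't send the session cookie, so the popped flow
# is {} and MSAL raises "state missing from auth_code_flow".
def sign_in(request):
    auth_app = get_msal_app(load_cache(request))  # helpers from the question
    flow = auth_app.initiate_auth_code_flow(
        scopes=['User.Read'],                           # illustrative scope
        redirect_uri='http://localhost:8000/callback',  # same host you browse to
    )
    request.session['auth_flow'] = flow
    return HttpResponseRedirect(flow['auth_uri'])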
I'm a beginner coder, so I'm pretty sure I'm just circumventing the error, but replacing the website URL with http://localhost:8000/# and re-running it somehow got around the error. Maybe that could be of some use.
If you are running on Chrome, run the application on http://localhost:8000 rather than http://127.0.0.1:8000, because Chrome doesn't save the cookies when the IP address is used.

DisallowedHost error not going away when adding IP address to ALLOWED_HOSTS

If I set ALLOWED_HOSTS = ['*'] I am able to make a successful call, but this seems dangerous and counterintuitive.
When I set ALLOWED_HOSTS to the recommended string, it fails. How do I fix this?
Since you've tagged your post with AWS, I assume the host in question is an AWS EC2 instance. If so, try putting in your EC2 private DNS name or your full domain instead, like:
['ip-XX-XX-XX-XX.XX-XXX-X.compute.internal']
OR
['.yourdomain.com']
The leading . in the domain name acts as a wildcard for subdomains, as described in Django's docs.
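For context, the entry goes in settings.py; a minimal sketch with placeholder hostnames (not values from the question):

# settings.py
ALLOWED_HOSTS = [
    'ip-10-0-0-12.eu-west-1.compute.internal',  # example EC2 private DNS name
    '.yourdomain.com',                          # leading dot matches all subdomains
]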
I encountered this and found the reason: there were two different terminal tabs running the server. For testing purposes I had just started the server in another tab. Django doesn't warn about this in the second tab, so your requests are most probably going to the other tab's instance of the server.

Python Django session returns "None"

tl;dr: Python newbie, Django session not propagated correctly while using HTTPS
I'm building a basic web service which relies on sessions/cookies to authenticate a user.
During the first authentication, I set a specific session key like this:
request.session['userSecureId'] = "blabla"
return HttpResponseRedirect('http://localhost/secure')  # note: redirects take no template context
At this point, a new session key has been added to the django_session table. A basic base64 decode of the session_data field confirms the presence of 'userSecureId'.
In my view, I check whether this session value exists like this:
if request.session.get('userSecureId'):
    # do something
If I try this on my local system (plain HTTP), it works great. So my next step was to run it on my remote server with SSL enabled. I've configured
SESSION_COOKIE_SECURE = True in my settings.py, but now the value returned for 'userSecureId' is always None.
This is probably a newbie question, so any pointer will be appreciated =)
Additionally, if I print request.session.session_key I successfully retrieve the session key, meaning Django correctly detects my sessionid cookie but can't decode the content of session_data.
EDIT: I just tried accessing Django on my remote system (same configuration) and I'm facing the same issue. I have no idea why I can't read the session value. The code works using 127.0.0.1 without problems, though.
According to here and here,
to share a session between HTTP and HTTPS (and across subdomains), you should set SESSION_COOKIE_DOMAIN in your settings:
SESSION_COOKIE_DOMAIN = '.example.com'
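A minimal sketch of the setting in context (example.com is a placeholder):

# settings.py
SESSION_COOKIE_DOMAIN = '.example.com'  # send the session cookie to example.com and all subdomains
# Caveat: with SESSION_COOKIE_SECURE = True the cookie is only transmitted over
# HTTPS, so it cannot also be shared with plain-HTTP pages.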

Appropriate cookie domain for Django dev server running multiple sites

I have multiple Django dev sites running locally like http://localhost:8000, http://localhost:8001, http://localhost:8002, etc.
Originally, I had SESSION_COOKIE_DOMAIN and CSRF_COOKIE_DOMAIN set to '' or 127.0.0.1, but this causes each site to overwrite the others' cookies, forcing me to log in again every time I switch between sites. I tried using 127.0.0.1:<port>, but that had no effect.
How do I get these sites to use separate cookies?
One solution to this problem is to use local domain name resolution to reach each of your different development servers. If you leave SESSION_COOKIE_DOMAIN as None, then the returned cookie is a standard domain cookie and will have the same domain as the request.
Have a look at http://en.wikipedia.org/wiki/Hosts_(file), which describes how to add local hosts file entries.
With a hosts file like this:
127.0.0.1 www.testserver1.com www.testserver2.com
You could then access each of your different test servers at:
http://www.testserver1.com:8000
http://www.testserver2.com:8001
I haven't tried this, but I believe it should work.
Alternatively, as per Mikhail's answer, use a different session cookie name for each instance.
Cookies are shared across ports for the same domain according to various RFCs (see e.g. https://www.rfc-editor.org/rfc/rfc6265#section-8.5), so this is not Django-specific.
I think you could use a different SESSION_COOKIE_NAME for each instance to at least keep the sessions separated.
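A minimal sketch of that approach (SESSION_COOKIE_NAME and CSRF_COOKIE_NAME are real Django settings; the values are illustrative):

# settings for the site on port 8000
SESSION_COOKIE_NAME = 'sessionid_site1'
CSRF_COOKIE_NAME = 'csrftoken_site1'

# settings for the site on port 8001 (a separate settings module)
SESSION_COOKIE_NAME = 'sessionid_site2'
CSRF_COOKIE_NAME = 'csrftoken_site2'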

Web/Screen Scraping with Google App Engine - Code works in python interpreter but not GAE

I want to do some web scraping with GAE (the Infinite Campus student information portal, FYI). This service requires you to log in to access the site.
I had some code that worked using mechanize in normal Python. When I learned that I couldn't use mechanize on Google App Engine, I ended up using urllib2 + ClientForm. I couldn't get it to log in to the server, so after a few hours of fiddling with cookie handling I ran the exact same code in a normal Python interpreter, and it worked. I found the log file and saw a ton of messages about the 'Host' header being stripped out of my requests. I found the source file on Google Code, and the Host header was on an 'untrusted' list and removed from all requests made by user code.
Apparently GAE strips out the Host header, which Infinite Campus requires in order to determine which school system to log you into, and that is why it appeared I couldn't log in.
How would I get around this problem? I can't specify anything else in my fake form submission to the target site. Why would this be a "security hole" in the first place?
App Engine does not strip out the Host header: it forces it to be an accurate value based on the URI you are requesting. Assuming that URI's absolute, the server isn't even allowed to consider the Host header anyway, per RFC 2616:
"If Request-URI is an absoluteURI, the host is part of the Request-URI. Any Host header field value in the request MUST be ignored."
...so I suspect you're misdiagnosing the cause of your problem. Try directing the request to a "dummy" server that you control (e.g. another very simple App Engine app of yours) so you can compare all the headers and body of the request as it comes from your GAE app versus how it comes from your "normal python interpreter". What do you observe this way?
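If it helps, here is a minimal sketch of such a "dummy" header-echo endpoint as a plain WSGI app (any simple server that dumps request headers would do):

def application(environ, start_response):
    # WSGI exposes the incoming request headers as HTTP_* keys in environ
    headers = [(key[5:].replace('_', '-').title(), value)
               for key, value in environ.items() if key.startswith('HTTP_')]
    body = '\n'.join('%s: %s' % pair for pair in sorted(headers))
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [body.encode('utf-8')]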
