I'm trying to search for user_name in MongoDB and, if it's found, print the user_id from the db. If it's not found, I want to print "Id not found". But there's a problem: if user_no_aront is not found, it doesn't print anything, not even a "null" message. Is there any way to make the code print a "null"/"None" message if the data I'm asking for is not found?
connection = pymongo.MongoClient("mongodb://xxx:xxx#xxxx.mlab.com:xxx/xxx")
db = connection['aurora']
coll = db['users']
user = ''.join(args)
user_no_aront = re.sub('[^A-Za-z0-9]+', '', user)
curs = coll.find({"user_name": user_no_aront}, {"user_id": True, "_id": False})
for item in curs:
    get_user = item.get("user_id", None)
    print(get_user)
If user_no_aront doesn't exist in coll, the for loop body won't execute at all. That's why nothing gets printed, not even None.
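A minimal sketch of one way to get your "not found" message, assuming the same coll and user_no_aront as above: find_one returns None when no document matches, so you can branch on it directly instead of looping over a cursor.
# find_one returns None when nothing matches, unlike find, which returns
# a (possibly empty) cursor -- so the "not found" case is easy to detect
doc = coll.find_one({"user_name": user_no_aront}, {"user_id": True, "_id": False})
if doc is not None:
    print(doc.get("user_id"))
else:
    print("Id not found")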
# SELECT
myDatabaseCursor.execute("SELECT username, password FROM member")
myDatabase.commit()
# get data from form to make a tuple
userCheck = (request.form["signInUsername"], request.form["signInPassword"])
# iterate selected data tuple into a list
results = []
for selectedData in myDatabaseCursor:
    results.append(selectedData)
# check if there is a match in MySQL database
if userCheck in results:
    session["status"] = "logged"
    session["user_name"] = request.form["signInUsername"]
    return redirect("/member")
else:
    return redirect("/error/?message=wrong username or password")
When I ran my server and tried typing in a username and the right password, I successfully logged in; typing in a username and a wrong password that had no match in the database got rejected. ALL GOOD...
BUT when I tried typing in a username and a wrong password which DOES have a match in the password column, just not for that username, it still successfully logged in.
I am really confused now; I hope you guys have an idea about this situation.
Thanks, I appreciate your replies.
You could change your query to use a WHERE clause. Something along the lines of:
# get data from form to make a tuple
username, password = (
    request.form["signInUsername"],
    request.form["signInPassword"]
)
# SELECT with a parameterized WHERE clause; passing the values as query
# parameters (rather than formatting them into the string) also avoids
# SQL injection
myDatabaseCursor.execute(
    "SELECT username, password FROM member WHERE username = %s AND password = %s",
    (username, password)
)
# no commit is needed for a SELECT
# userCheck is True if the query returned at least one matching row
userCheck = myDatabaseCursor.fetchone() is not None
# if a row was in the returned table
if userCheck:
    session["status"] = "logged"
    session["user_name"] = request.form["signInUsername"]
    return redirect("/member")
else:
    return redirect("/error/?message=wrong username or password")
Probably the problem lies in session['status']. You never set it to e.g. "unlogged", so if you don't close the browser, the status will stay 'logged' after the first successful login.
Try initializing the variable at the beginning of the script, i.e. session["status"] = None, and then on every other page check that the status is actually 'logged', as you're probably already doing.
session["status"]=None
// SELECT
myDatabaseCursor.execute("SELECT username, password FROM member")
myDatabase.commit()
// get data from form to make a tuple
userCheck = (request.form["signInUsername"], request.form["signInPassword"])
// iterate selected data tuple into a list
results = []
for selectedData in myDatabaseCursor:
results.append(selectedData)
// check if there is a match in MySQL database
if userCheck in results:
session["status"]="logged"
session["user_name"]=request.form["signInUsername"]
return redirect("/member")
else:
return redirect("/error/?message=wrong username or password")
In any case, for the sake of best practice, you should amend your code to apply the logic described by @matthewking, retrieving just the password you need to check.
I want to save an API response in a table of my database. I'm using Postgres along with psycopg2.
This is my code:
import json
import requests
import psycopg2

def my_func():
    response = requests.get("https://path/to/api/")
    data = response.json()
    while data['next'] is not None:
        response = requests.get(data['next'])
        data = response.json()
        for item in data['results']:
            try:
                connection = psycopg2.connect(user="user",
                                              password="user",
                                              host="127.0.0.1",
                                              port="5432",
                                              database="mydb")
                cursor = connection.cursor()
                postgres_insert_query = """ INSERT INTO table_items (NAME VALUES (%s)"""
                record_to_insert = print(item['name'])
                cursor.execute(postgres_insert_query, record_to_insert)
                connection.commit()
                count = cursor.rowcount
                print(count, "success")
            except (Exception, psycopg2.Error) as error:
                if connection:
                    print("error", error)
            finally:
                if connection:
                    cursor.close()
                    connection.close()

my_func()
I mean, I just want to sort of "print" all the resulting data from my request into the db. Is there a way to accomplish this?
I'm a bit confused, as you can see: what could be some "print" equivalent to achieve this?
I just want to save the name field from the API response into the database table, or actually INSERT it. I guess psycopg2 has some sort of function for this circumstance?
Any example you could provide?
EDIT
Sorry, I forgot: if I run this code it throws this:
PostgreSQL connection is closed
A particular name
Failed to insert record into table_items table syntax error at or near "VALUES"
LINE 1: INSERT INTO table_items (NAME VALUES (%s)
There are a few issues here. I'm not sure what the API is or what it returns, but I will make some assumptions and suggestions based on those.
There is a syntax error in your query: it is missing a ). It should be:
postgres_insert_query = 'INSERT INTO table_items (NAME) VALUES (%s)'
(I'm also assuming that NAME is a real column in your database.)
Even with this correction, you will have a problem, since:
record_to_insert = print(item['name']) will set record_to_insert to None. The return value of the print function is always None. The line should instead be:
record_to_insert = item['name']
(assuming the key name in the dict item is actually the field you're looking for)
I believe calls to execute must pass replacements as a tuple, so the line cursor.execute(postgres_insert_query, record_to_insert) should be:
cursor.execute(postgres_insert_query, (record_to_insert,))
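Putting those three fixes together, the insert section of the loop might look like this (a sketch; it assumes the same table, column, item dict, and connection as in the question):
postgres_insert_query = "INSERT INTO table_items (NAME) VALUES (%s)"
record_to_insert = item['name']           # the value itself, not print(...)
cursor.execute(postgres_insert_query, (record_to_insert,))  # note the 1-tuple
connection.commit()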
I am using the following code in the free Heroku scheduler add-on to send emails to certain users. After an email is sent, a value in the DB must be changed; to be more precise:
setattr(user, "stats_email_sent", True)
Somehow the db_session.commit() is executed but doesn't save the new value. Here is the code:
all_users = User.query.all()
for user in all_users:
    if user.stats_email_sent is False and user.number_of_rooms > 0:
        if date.today() <= user.end_offer_date and date.today() >= user.end_offer_date - relativedelta(days=10):
            print user.id, user.email, user.number_of_rooms, user.bezahlt
            if user.bezahlt is True:
                with app.app_context():
                    print "app context true", user.id, user.email, user.number_of_rooms, user.bezahlt
                    html = render_template('stats_email_once.html', usersname=user.username)
                    subject = u"Update"
                    setattr(user, "stats_email_sent", True)
                    #send_email(user.email, subject, html, None)
            else:
                with app.app_context():
                    print "app context false", user.id, user.email, user.number_of_rooms, user.bezahlt
                    html = render_template('stats_email_once.html', usersname=user.username)
                    subject = u"Update"
                    setattr(user, "stats_email_sent", True)
                    #send_email(user.email, subject, html, None)
print "executing commit"
db_session.commit()
I tried moving db_session.commit() right after the setattr; then it works, but only for one user (the first user):
setattr(user, "stats_email_sent", True)
db_session.commit()
And gives me this in the logs:
sqlalchemy.orm.exc.DetachedInstanceError: Instance <User at 0x7f5c04d462d0> is not bound to a Session; attribute refresh operation cannot proceed
I also found some topics on detaching an instance, but I don't need to detach anything here, do I?
EDIT:
I now also tried adding db_session.expire_on_commit = False. This sadly had no effect.
I also looked at bulk updates, which also didn't work.
I even tried to ignore and pass on the DetachedInstanceError.
I can't believe that updating multiple rows at once is such an issue. I am running out of ideas here; any help is appreciated. Everything either has no effect or runs into the DetachedInstanceError.
EDIT
I solved the issue; I had a similar one yesterday. I assume the query and its variables were consumed, which is why it didn't work.
To solve this I created a list, client_id_list = [], and appended the ids of all users who got the email and whose value needs to be changed.
Then I created a completely new query and ran essentially the same code with the same logic, but the variables here were not consumed, I guess? I won't post this as an answer, because I am not sure whether that explanation is true or not. Here is the code, appended to the code above, which changes the values:
all_users_again = User.query.all()
for user in all_users_again:
    if user.id in client_id_list:
        setattr(user, "stats_email_sent", True)
db_session.commit()
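For reference, the bulk-update variant mentioned above would normally be written like this, replacing the second loop entirely (a sketch, assuming the same db_session and client_id_list; I can't say why it didn't work in my case):
# update all matching rows in one UPDATE statement instead of a second loop
User.query.filter(User.id.in_(client_id_list)).update(
    {"stats_email_sent": True}, synchronize_session=False
)
db_session.commit()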
I have a portal built on CherryPy that users can access, with some forms that can be submitted and sent to JIRA via the REST API for tracking purposes. Once a form has been submitted, I take the user-supplied information from the form plus the JIRA issue ID and send them to an Oracle DB.
I then extended the portal so submissions can be viewed via a list page, from which a record can be selected to see what is stored in the DB for that submission. I had the idea of using the JIRA REST API to also fetch the status and assignee of the issue within JIRA. Converting my code from submitting to the API to querying it with the necessary JQL statement was fairly simple and can be seen below.
def jira_status_check(jira_id):
    if jira_id != "No JIRA Issue":
        try:
            search_url = "https://myjirainstance.atlassian.net/rest/api/2/search/?jql=issue=" + jira_id + "&fields=status,assignee,resolution"
            print search_url
            username = 'some_user'
            password = 'some_password'
            request = urllib2.Request(search_url)
            base64string = base64.encodestring('%s:%s' % (username, password)).replace('\n', '')
            request.add_header("Authorization", "Basic %s" % base64string)
            request.add_header("Content-Type", "application/json")
            result = urllib2.urlopen(request).read()
            json_results = json.loads(result)
            print json_results
            jira_status = json_results["issues"][0]["fields"]["status"]["name"]
            if json_results["issues"][0]["fields"]["resolution"] is None:
                tmp = "tmp"
            if json_results["issues"][0]["fields"]["resolution"] is not None:
                jira_status = jira_status + " - " + json_results["issues"][0]["fields"]["resolution"]["name"]
            # assignee_name = "TEST"
            # assignee_NT = "TEST"
            if json_results["issues"][0]["fields"]["assignee"] is None:
                assignee_name = "Unassigned"
                assignee_NT = "Unassigned"
            if json_results["issues"][0]["fields"]["assignee"] is not None:
                assignee_name = json_results["issues"][0]["fields"]["assignee"]["displayName"]
                assignee_NT = json_results["issues"][0]["fields"]["assignee"]["name"]
            # if json_results["issues"][0]["fields"]["assignee"]["displayName"] is not None:
            #     assignee_name = json_results["issues"][0]["fields"]["assignee"]["displayName"]
            # if json_results["issues"][0]["fields"]["assignee"] is None:
            #     assignee_NT = "Unassigned"
            # if json_results["issues"][0]["fields"]["assignee"]["name"] is not None:
            #     assignee_NT = json_results["issues"][0]["fields"]["assignee"]["name"]
            print jira_status
            print assignee_name
            print assignee_NT
            output = [jira_status, assignee_name, assignee_NT]
        except:
            jira_status = "No JIRA Issue by that number or JIRA inaccessible"
            assignee_name = "No JIRA Issue by that number or JIRA inaccessible"
            assignee_NT = "No JIRA Issue by that number or JIRA inaccessible"
            output = [jira_status, assignee_name, assignee_NT]
    else:
        jira_status = "No JIRA Issue"
        assignee_name = "No JIRA Issue"
        assignee_NT = "No JIRA Issue"
        output = [jira_status, assignee_name, assignee_NT]
    return output
However, that is limited to searching a single record at a time, which works when you are only viewing a single record, but I was hoping to extend this to my list page and search many issues at once with one API query rather than tons of single-issue queries. I am able to use JQL and the REST API to search with multiple issue numbers at a link like this: https://myjirainstance.atlassian.net/rest/api/2/search/?jql=Issue%3DSPL-3284%20OR%20Issue%3DSPL-3285&fields=status,assignee,resolution
But then I started thinking: what if somehow a bad issue ID gets saved and queried as part of the massive query? Previously that was handled by the except statement in my jira_status_check function, back when it was a single-record query. When I try to query the REST API with a link like the one above, I instead get
{"errorMessages":["An issue with key 'SPL-6666' does not exist for field 'Issue'."],"warningMessages":[]}
I tried to build a query from an advanced search of issues, but when I do something like Issue=SPL-3284 OR Issue=SPL-3285 OR Issue=SPL-6666, I get a response of: An issue with key 'SPL-6666' does not exist for field 'Issue'.
Is there a correct way to search via JQL with multiple issue numbers and get back empty field values for the ones without matching issue numbers?
Or am I stuck doing a ton of single-issue queries to the API to cover my bases? That would be less than ideal, and might push me to limit the API queries to when a single record is viewed rather than the list page, for usability.
Would I be better off moving my JIRA-querying function to JavaScript/jQuery so it can populate the list of submissions after the page is rendered?
I ended up reaching out to Atlassian with my question about JQL, and was given the following REST API documentation and told about the validateQuery parameter to add to my JQL to achieve my search: https://docs.atlassian.com/jira/REST/6.1.7/
When I now use a query similar to this on my REST API link, with the additional parameter:
jql=Issue%3DSPL-3284 OR Issue%3DSPL-3285&fields=status,assignee,resolution&validateQuery=true
I get back a JSON with actual content for the issues that are valid, plus a separate warningMessages object with any that are bad. An example JSON is below (obviously $CONTENT stands in for actual results from the query):
{
    "expand": "schema,names",
    "startAt": 0,
    "maxResults": 50,
    "total": 2,
    "issues": [
        {
            $CONTENT
        },
        {
            $CONTENT
        }
    ],
    "warningMessages": [
        "An issue with key 'SPL-6666' does not exist for field 'Issue'."
    ]
}
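For reference, the multi-issue URL can be built in the same urllib2-era style as the function above (a sketch; the issue keys here are illustrative):
import urllib

issue_keys = ["SPL-3284", "SPL-3285", "SPL-6666"]  # illustrative keys
jql = " OR ".join("Issue=%s" % key for key in issue_keys)
search_url = ("https://myjirainstance.atlassian.net/rest/api/2/search/?jql="
              + urllib.quote(jql)
              + "&fields=status,assignee,resolution&validateQuery=true")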
Hopefully someone else will find this helpful in the future
I've been working on populating a table in a PostgreSQL database using Tweepy and Twitter's Streaming API. I'm extremely close; I believe I'm just one line away from getting it. I've looked at many examples, including:
http://andrewbrobinson.com/2011/07/15/using-tweepy-to-access-the-twitter-stream/
http://blog.creapptives.com/post/14062057061/the-key-value-store-everyone-ignored-postgresql
Python tweepy writing to sqlite3 db
tweepy stream to sqlite database - invalid synatx
Using tweepy to access Twitter's Streaming API
etc, etc
I'm at the point where I can stream tweets quite easily using Tweepy, so I know my consumer key, consumer secret, access key and access secret are correct. I also have Postgres set up and am successfully connecting to the database I created. I tested inserting hard-coded values into the table in my database using psycopg2 from a .py file, and that also works. I am getting tweets streamed in based on keywords I select, and am successfully connected to a table in the database. Now I just need the tweets to stream into the table in my Postgres database. Like I said, I am so close, and any help would be greatly appreciated.
This stripped down script inserts data into my desired table:
import psycopg2

try:
    conn = psycopg2.connect("dbname=teststreamtweets user=postgres password=x host=localhost")
    print "connected"
except:
    print "unable to connect"

namedict = (
    {"first_name": "Joshua", "last_name": "Drake"},
    {"first_name": "Steven", "last_name": "Foo"},
    {"first_name": "David", "last_name": "Bar"}
)

cur = conn.cursor()
cur.executemany("""INSERT INTO testdata(first_name, last_name) VALUES (%(first_name)s, %(last_name)s)""", namedict)
conn.commit()
Below is the script I have been editing for a while now trying to get it to work:
import psycopg2
import time
import json
from getpass import getpass
import tweepy

consumer_key = 'x'
consumer_secret = 'x'
access_key = 'x'
access_secret = 'x'

connection = psycopg2.connect("dbname=teststreamtweets user=postgres password=x host=localhost")
cursor = connection.cursor()

# always use this step to begin clean
def reset_cursor():
    cursor = connection.cursor()

class StreamWatcherListener(tweepy.StreamListener):
    def on_data(self, data):
        try:
            print 'before cursor' + data
            connection = psycopg2.connect("dbname=teststreamtweets user=postgres password=x host=localhost")
            cur = connection.cursor()
            print 'status is: ' + str(connection.status)
            #cur.execute("INSERT INTO tweet_list VALUES (%s)" % (data.text))
            cur.executemany("""INSERT INTO tweets(tweet) VALUES (%(text)s)""", data)
            connection.commit()
            print '---------'
            print type(data)
            #print data
        except Exception as e:
            connection.rollback()
            reset_cursor()
            print "not saving"
            return
        if cursor.lastrowid == None:
            print "Unable to save"

    def on_error(self, status_code):
        print 'Error code = %s' % status_code
        return True

    def on_timeout(self):
        print 'timed out.....'

print 'welcome'
auth1 = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth1.set_access_token(access_key, access_secret)
api = tweepy.API(auth1)
l = StreamWatcherListener()
print 'about to stream'
stream = tweepy.Stream(auth=auth1, listener=l)
setTerms = ['microsoft']
#stream.sample()
stream.filter(track=setTerms)
Sorry if the code is a bit messy, but I have been trying many options. Like I said, any suggestions, links to helpful examples, etc. would be greatly appreciated, as I've tried everything I can think of and am now resorting to a long walk. Thanks a ton.
Well, I'm not sure why you are using classes for this, and why you don't have __init__ defined in your class. It seems complicated.
Here is a basic version of the functions I use to do this stuff. I've only ever used sqlite for it, but the syntax looks basically the same. Maybe you can get something from this.
def retrieve_tweets(numtweets=10, *args):
    """
    This function optionally takes one or more arguments as keywords to filter tweets.
    It iterates through tweets from the stream that meet the given criteria and sends them
    to the database population function on a per-instance basis, so as to avoid disaster
    if the stream is disconnected.
    Both SampleStream and FilterStream methods access Twitter's stream of status elements.
    """
    filters = []
    for key in args:
        filters.append(str(key))
    if len(filters) == 0:
        stream = tweetstream.SampleStream(username, password)
    else:
        stream = tweetstream.FilterStream(username, password, track=filters)
    try:
        count = 0
        while count < numtweets:
            for tweet in stream:
                # a check is needed on text as some "tweets" are actually just API operations
                # the language selection doesn't really work but it's better than nothing(?)
                if tweet.get('text') and tweet['user']['lang'] == 'en':
                    if tweet['retweet_count'] == 0:
                        # bundle up the features I want and send them to the db population function
                        bundle = (tweet['id'], tweet['user']['screen_name'], tweet['retweet_count'], tweet['text'])
                        db_initpop(bundle)
                        break
                    else:
                        # a RT has a different structure. This bundles the original tweet. Getting the
                        # retweets comes later, after the stream is de-accessed.
                        bundle = (tweet['retweeted_status']['id'], tweet['retweeted_status']['user']['screen_name'], \
                                  tweet['retweet_count'], tweet['retweeted_status']['text'])
                        db_initpop(bundle)
                        break
            count += 1
    except tweetstream.ConnectionError, e:
        print 'Disconnected from Twitter at '+time.strftime("%d %b %Y %H:%M:%S", time.localtime()) \
              +'. Reason: ', e.reason

def db_initpop(bundle):
    """
    This function places basic tweet features in the database. Note the placeholder values:
    these can act as a check to verify that no further expansion was available for that method.
    """
    # unpack the bundle
    tweet_id, user_sn, retweet_count, tweet_text = bundle
    curs.execute("""INSERT INTO tblTweets VALUES (null,?,?,?,?,?,?)""", \
                 (tweet_id, user_sn, retweet_count, tweet_text, 'cleaned text', 'cleaned retweet text'))
    conn.commit()
    print 'Database populated with tweet '+str(tweet_id)+' at '+time.strftime("%d %b %Y %H:%M:%S", time.localtime())
Good luck!
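If you do stick with your tweepy listener, the key change is to parse the JSON string before touching the database. A minimal sketch using the names from your question (connection and the tweets table come from your code):
import json

class StreamWatcherListener(tweepy.StreamListener):
    def on_data(self, raw_data):
        # tweepy hands on_data the raw JSON string, not a parsed dict
        status = json.loads(raw_data)
        text = status.get('text')
        if text:
            cur = connection.cursor()
            # execute (not executemany) with a one-element tuple of values
            cur.execute("INSERT INTO tweets (tweet) VALUES (%s)", (text,))
            connection.commit()
        return True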