How to query the OpportunityFieldHistory from Salesforce with Python - python

I am having an issue figuring out how to start a query on the OpportunityFieldHistory object from Salesforce.
The code I usually use, which works fine for querying Opportunity or Lead records, does not work here, and I do not know how it should be written for the field history.
When I want to query the opportunity or Lead I use the following:
oppty1 = sf.opportunity.get('00658000002vFo3')
lead1 = sf.lead.get('00658000002vFo3')
and then do the proper query code with the access codes...
The problem arises when I want to do the analysis on the OpportunityFieldHistory, I tried the following:
opptyhist = sf.opportunityfieldhistory.get('xxx')
Guess what: it does not work. Do you have any clue what I should write between sf. and .get?
Thanks in advance

Looking at the simple-salesforce API, the get method accepts an ID, which you are passing correctly. However, a quick search of the Salesforce API reference suggests that OpportunityFieldHistory records may need to be obtained another way, for example with get_by_custom_id(self, custom_id_field, custom_id). A field-history record looks like this:
(OpportunityFieldHistory){
    Id = None
    CreatedDate = 2012-08-27 12:00:03
    Field = "StageName"
    NewValue = "3.0 - Presentation & Demo"
    OldValue = "2.0 - Qualification & Discovery"
    OpportunityId = "0067000000RFCDkAAP"
},
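Field-history rows are usually retrieved with SOQL rather than fetched by ID. As a hedged sketch (the helper below only builds the SOQL string; running it against a simple-salesforce connection named sf is shown as an untested comment, and the opportunity ID is taken from the record above):

```python
def field_history_soql(opportunity_id):
    """Build a SOQL query for the field-history rows of one opportunity.

    The field names (Field, OldValue, NewValue, CreatedDate) come from
    the record shown above; opportunity_id is assumed to be a valid
    15- or 18-character Salesforce ID.
    """
    return (
        "SELECT Field, OldValue, NewValue, CreatedDate "
        "FROM OpportunityFieldHistory "
        "WHERE OpportunityId = '%s'" % opportunity_id
    )

# With simple-salesforce this would then be run as (untested sketch):
# records = sf.query(field_history_soql('0067000000RFCDkAAP'))['records']
print(field_history_soql('0067000000RFCDkAAP'))
```

This sidesteps the sf.<object>.get question entirely, since a history table is normally filtered by its parent OpportunityId rather than looked up by its own record ID.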

Related

Python 3 CouchDB single update vs batch

This is my first post :) so I'll apologize beforehand. I'm working on exporting data from MySQL to CouchDB; when an item has been saved, I mark the MySQL item with a recently-updated date. Below I have a Python function which takes in a JSON object one by one, and some record id to update in MySQL locally:
def write_json_to_couchdb(json_obj, id):
    # couchdb auto-creates doc_id and rev
    doc_id = ''
    revision_or_exception = ''
    for (success, doc_id, revision_or_exception) in db.update(json_obj):
        print(success, doc_id, revision_or_exception)
    # mark id inside mysql db, so we know it's been saved to couchdb
    mysql.update_item_date(id)
The solution above works but is quite slow, both for writing to CouchDB and for updating MySQL. How can I use the "bulk api" or "batch api" without using curl? I believe couchdb's db.update(item) can also take a list, like db.update(list_of_items). How can I specify "batch ok"? Are there any other methods I'm unaware of? There seem to be few examples online.
Would this increase speed significantly? Also, how can I specify a "batch size" of, let's say, 1000 records?
Here's what I'm thinking a better solution would be:
def write_json_to_couchdb_bulk(json_obj_list, id_list):
    doc_id = ''
    revision_or_exception = ''
    for (success, doc_id, revision_or_exception) in db.update(json_obj_list):
        print(success, doc_id, revision_or_exception)
    # update added_date with current datetime
    for id in id_list:
        mysql.update_item_date(id)
Thanks,
SW
Here's the solution I came up with; it's much faster:
import couchdb
from couchdb import Document

couch = couchdb.Server()   # assumes CouchDB on localhost:5984
db = couch['mydb']         # assumes the target database already exists

def write_json_to_couchdb_bulk(json_obj_list):
    for doc in db.update(json_obj_list):
        print(repr(doc))

json_obj_list = [
    Document(type='Person', name='John Doe'),
    Document(type='Person', name='Mary Jane'),
    Document(type='City', name='Gotham City'),
]
write_json_to_couchdb_bulk(json_obj_list)
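To cap the batch size at, say, 1000 documents, one option is to slice the list before each db.update call. The chunked helper below is a generic sketch; db stands for a python-couchdb database object and is an assumption here, so the call that actually touches CouchDB is only sketched:

```python
def chunked(seq, size):
    """Yield successive slices of seq, each at most `size` items long."""
    for start in range(0, len(seq), size):
        yield seq[start:start + size]

def write_json_to_couchdb_batched(db, json_obj_list, batch_size=1000):
    # each db.update() call becomes one bulk request to CouchDB
    for batch in chunked(json_obj_list, batch_size):
        for result in db.update(batch):
            print(repr(result))

# The slicing itself is easy to check without a server:
print([len(b) for b in chunked(list(range(2500)), 1000)])  # → [1000, 1000, 500]
```

Larger batches mean fewer round trips, which is where most of the speedup of the bulk API comes from; 500-1000 documents per request is a common starting point to tune from.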

Get multiple lists/data from MySQL StoredProcedure

I'm working in Python with MySQL and want to get multiple lists of data from a stored procedure.
I am using PyMySQL to connect to my database, and I am trying to do something like this, but it's not working:
CREATE DEFINER=`root`@`%` PROCEDURE `spGetData`(IN ssscreenId INT(11))
BEGIN
SELECT clientId, clientName FROM apl_cb.Client WHERE isActive = 1 AND isDeleted = 0;
SELECT bankId, bankName, bankAddress FROM apl_cb.Bank WHERE isDeleted = 0 AND isActive = 1;
SELECT eventId, eventName, eventGroup FROM apl_cb.Event WHERE isActive = 1 AND isDeleted = 0 AND menuEvent = 0 AND screenId = ssscreenId;
END
Any kind of help will be appreciated.
Thanks
Have you tried using the nextset() method on your Python cursor to jump to the next result set? (python.org/dev/peps/pep-0249/#nextset)
If you have tried it, and it doesn't work, then you're out of luck; your Python program won't be able to retrieve multiple resultsets from a single stored procedure.
Keep in mind that MySQL stored procedures offer encapsulation advantages but not efficiency advantages. The time taken to issue three queries inside a stored procedure is the same as (or maybe a tiny bit higher than) the time of issuing them one after another from your Python program. Here's an explanation of part of that. http://www.joinfu.com/2010/05/mysql-stored-procedures-aint-all-that/
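As a hedged sketch of that nextset() loop: the helper below only needs a DB-API cursor, while the PyMySQL usage underneath is an untested outline whose connection parameters and unpacking are assumptions based on the procedure shown above:

```python
def fetch_all_result_sets(cursor):
    """Collect every result set produced by a stored-procedure call."""
    results = [cursor.fetchall()]
    while cursor.nextset():          # returns None when no sets remain
        results.append(cursor.fetchall())
    return results

# Usage with PyMySQL (untested sketch, assumes a reachable server):
# import pymysql
# conn = pymysql.connect(host='localhost', user='root',
#                        password='...', db='apl_cb')
# with conn.cursor() as cur:
#     cur.callproc('spGetData', (screen_id,))
#     clients, banks, events = fetch_all_result_sets(cur)
```

The three SELECTs in spGetData would then come back as three entries in the returned list, in the order the procedure emits them.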

Updating a sqlite3 Database with QSqlQuery

I am working on a simple GUI with PyQt and trying to update my database, without any luck so far. I am pretty new to Qt, databases and, well, also Python ;) so I don't understand what's wrong with my code. Maybe someone could help me a little bit.
While working with Python (3) and sqlite3 I could update my db like this:
cur.execute('''UPDATE news SET Raw=? WHERE id=?''', (raw, news_id))
But with pyqt it's only working like this:
query = QSqlQuery()
query.exec_("UPDATE news SET Raw = 'test' WHERE Id = 9")
I tried:
query.exec_("UPDATE news SET Raw = ? WHERE Id = ?", (raw, news_id))
That led to this error:
TypeError: arguments did not match any overloaded call:
QSqlQuery.exec_(str): too many arguments
QSqlQuery.exec_(): too many arguments
In the book Rapid GUI Programming with Qt the author does his queries like this (not exactly like that, but I tried to adapt it):
query.exec_('''UPDATE news SET Raw={0} WHERE id={1}'''.format(raw, news_id))
It doesn't seem to do anything.
and neither does this:
raw = 'test'
query.prepare("UPDATE news SET (Raw) VALUES (:raw) WHERE Id = 9 ")
query.bindValue(":raw", raw)
#query.bindValue(":news_id", 9)
query.exec_()
Well and some other things I have found here and elsewhere but without any luck so far.
It seems that you have not yet tried the proper syntax, which should be the following:
query.prepare("UPDATE news SET Raw = :value WHERE id = :id")
query.bindValue(":value", raw)
query.bindValue(":id", news_id)
query.exec_()

Make changes persistent in Boto

I have a SimpleDB instance that I update and read using boto for Python:
sdb = boto.connect_sdb(access_key, secret_key)
domain = sdb.get_domain('DomainName')
itemName = 'UserID'
itemAttr = {'key1': 'val1', 'key2': 'val2'}
domain.put_attributes(itemName, itemAttr)
That works as expected. A new item with name 'UserID' and values val1 and val2 will be inserted in the domain.
Now, the problem that I am facing is that if I query that domain right after updating its attributes,
query = 'select * from `DomainName` where key1="val1"'
check = domain.select(query)
itemName = check.next()['key2']
I will get an error because the values in the row cannot be found. However, if I add a time.sleep(1) between the write and the read, everything works.
I suspect this problem is due to the fact that put_attributes signals the database to write but does not wait until the change has been made persistent. I have also tried writing by creating an item and then saving it (item.save()), without much success. Does anyone know how I can make sure that the values have been written to the SimpleDB instance before proceeding with the next operations?
Thanks.
The issue here is that SimpleDB is, by default, eventually consistent. So, when you write data and then immediately try to read it, you are not guaranteed to get the newest data although you are guaranteed that eventually the data will be consistent. With SimpleDB, eventually usually means less than a second but there are no guarantees on how long that could take.
There is, however, a way to tell SimpleDB that you want a consistent view of the data and are willing to wait for it, if necessary. You could do this by changing your query code slightly:
query = 'select * from `DomainName` where key1="val1"'
check = domain.select(query, consistent_read=True)
itemName = check.next()['key2']
This should always return the latest values.

How to a query a set of objects and return a set of object specific attribute in SQLachemy/Elixir?

Suppose that I have a table like:
class Ticker(Entity):
    ticker = Field(String(7))
    tsdata = OneToMany('TimeSeriesData')
    staticdata = OneToMany('StaticData')
How would I query it so that it returns the set of Ticker.ticker values?
I dug into the docs and it seems like select() is the way to go. However, I am not too familiar with the SQLAlchemy syntax. Any help is appreciated.
ADDED: My ultimate goal is to have a set of current tickers such that, when a new ticker is not in the set, it will be inserted into the database. I am just learning how to create a database and SQL in general. Any thought is appreciated.
Thanks. :)
Not sure what you're after exactly but to get an array with all 'Ticker.ticker' values you would do this:
[instance.ticker for instance in Ticker.query.all()]
What you really want is probably the Elixir getting started tutorial - it's good so take a look!
UPDATE 1: Since you have a database, the best way to find out if a new potential ticker needs to be inserted or not is to query the database. This will be much faster than reading all tickers into memory and checking. To see if a value is there or not, try this:
Ticker.query.filter_by(ticker=new_ticker_value).first()
If the result is None you don't have it yet. So, all together:
if Ticker.query.filter_by(ticker=new_ticker_value).first() is None:
    Ticker(ticker=new_ticker_value)
    session.commit()
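If you do want the in-memory-set approach from the question instead of hitting the database per ticker, a minimal framework-agnostic sketch could look like this; insert is a placeholder for whatever actually creates the row (e.g. Ticker(ticker=...) followed by a commit):

```python
def add_if_new(existing, new_ticker, insert):
    """Insert new_ticker only if it isn't already in the known set."""
    if new_ticker not in existing:
        insert(new_ticker)
        existing.add(new_ticker)
        return True
    return False

# Tiny self-contained demo with a plain list standing in for the table:
db_rows = []
known = set(db_rows)
add_if_new(known, 'AAPL', db_rows.append)
add_if_new(known, 'AAPL', db_rows.append)   # duplicate, ignored
print(db_rows)  # → ['AAPL']
```

Note the trade-off the answer points out: the set must be loaded and kept in sync with the database, so for anything beyond a small table the filter_by query above is the simpler and faster choice.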
