Use FreeBusy with a calendar other than 'primary' - Python

I am creating events on a non-primary calendar, and I want to check whether the user is busy on that calendar (not on the primary one) for the event.
My query:
the_datetime = tz.localize(datetime.datetime(2016, 1, 3, 0))
the_datetime2 = tz.localize(datetime.datetime(2016, 1, 4, 8))
body = {
    "timeMin": the_datetime.isoformat(),
    "timeMax": the_datetime2.isoformat(),
    "timeZone": 'US/Central',
    "items": [{"id": 'my.email@gmail.com'}]
}
eventsResult = service.freebusy().query(body=body).execute()
It returns:
{'calendars': {'my.email@gmail.com': {'busy': []}},
 'kind': 'calendar#freeBusy',
 'timeMax': '2016-01-04T14:00:00.000Z',
 'timeMin': '2016-01-03T06:00:00.000Z'}
even though I have an event created for that date on my X calendar. However, when I create an event on the primary calendar I get:
{'calendars': {'my.email@gmail.com': {'busy': [{'end': '2016-01-03T07:30:00-06:00',
                                                'start': '2016-01-03T06:30:00-06:00'}]}},
 'kind': 'calendar#freeBusy',
 'timeMax': '2016-01-04T14:00:00.000Z',
 'timeMin': '2016-01-03T06:00:00.000Z'}
Is there a way to tell the API the calendar I want to check?

I found it! :D
In the items list of the request body, put the calendar ID instead of the email address.
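A minimal sketch of what that looks like, assuming a hypothetical secondary-calendar ID such as 'abc123@group.calendar.google.com' (you can look up the real ID in the calendar's settings, or via service.calendarList().list()):

# Hypothetical ID of the non-primary calendar to check.
calendar_id = 'abc123@group.calendar.google.com'

body = {
    "timeMin": the_datetime.isoformat(),
    "timeMax": the_datetime2.isoformat(),
    "timeZone": 'US/Central',
    # Ask for free/busy on the secondary calendar instead of the account's email.
    "items": [{"id": calendar_id}],
}
eventsResult = service.freebusy().query(body=body).execute()
busy_slots = eventsResult['calendars'][calendar_id]['busy']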

Related

Create a delivery order in Inventory from a custom module (Odoo 13)

I have created a custom module with a button. The button should create a delivery order in the Inventory module (just like the Confirm button in the Sales module). But when I click it, it gives me the error in the picture. This is my code:
def delivery_order(self):
    delivery = self.env['stock.picking'].create({
        # 'type': 'out_invoice',
        'state': 'assigned',
        'picking_type_id': 1,
        'partner_id': self.partner_id.id,
        'location_id': 1,
        'location_dest_id': 1,
        'origin': self.name,
        'move_ids_without_package': [(0, 0, {
            'product_id': self.product_id.id,
            'product_uom_qty': self.selected_weight,
            'picking_type_id': 1,
        })]})
    return delivery
You should add the name field in the create() call, because it is a mandatory field; leaving it out is what causes the error.
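A minimal sketch of that fix, assuming the missing mandatory field is the move line's name (depending on your setup, other required fields such as the move's product_uom may also need values):

def delivery_order(self):
    delivery = self.env['stock.picking'].create({
        'state': 'assigned',
        'picking_type_id': 1,
        'partner_id': self.partner_id.id,
        'location_id': 1,
        'location_dest_id': 1,
        'origin': self.name,
        'move_ids_without_package': [(0, 0, {
            # 'name' is mandatory on the move line; the product name is a common choice.
            'name': self.product_id.display_name,
            'product_id': self.product_id.id,
            'product_uom': self.product_id.uom_id.id,  # often required as well
            'product_uom_qty': self.selected_weight,
            'picking_type_id': 1,
        })]})
    return delivery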

Google Calendar API not adding event and not throwing an error

I'm trying to follow Google's example of adding an event. Originally it threw errors because I formatted something wrong, but now it says the event is being created and links me to it, yet the event is never actually created. The third picture I attached is the result; it shows a little popup that says "error adding event". The strange thing is that none of my events show up on the calendar at all, even though I'm logged into the same account in both cases and only have one calendar. I've tried running it multiple times, even with different event data, with the same result: it says the event is added, but it never is. I'm lost; any help is greatly appreciated, and please feel free to ask for more info if you need it!
edit:
here's my event creation function
def createEvent(summary, start_time, end_time, *args, description='', location='', timeZone='America/New_York'):
    credentials = get_credentials()
    service = discovery.build('calendar', 'v3', credentials=credentials)
    event = {
        'summary': summary,
        'location': location,
        'description': description,
        'start': {
            'dateTime': start_time,
            'timeZone': timeZone,
        },
        'end': {
            'dateTime': end_time,
            'timeZone': timeZone,
        },
        'reminders': {
            'useDefault': False,
            'overrides': [
                # {'method': 'email', 'minutes': 24 * 60},
                {'method': 'popup', 'minutes': 10},
            ],
        },
    }
    for arg in args:
        event[arg[0]] = arg[1]
    event = service.events().insert(calendarId='primary', body=event).execute()
    print('Event created: %s' % (event.get('htmlLink')))
edit 2:
this is how the function is called with an example of the information passed in
googEvent = ['CSC 385 hw', '20-1-31T22:59:59', '20-1-31T23:59:59', 'EC Mylavarapu']
createEvent(googEvent[0], googEvent[1], googEvent[2], description=googEvent[3])
After studying your code, I found that you are very close to fixing it. You only need to force the dates into ISO 8601 format. To accomplish that, I used the following Python methods:
import datetime
…
googEvent = ['CSC 385 hw',
             datetime.datetime.strptime("31/01/2020 22:59:59", "%d/%m/%Y %H:%M:%S").isoformat(),
             datetime.datetime.strptime("31/01/2020 23:59:59", "%d/%m/%Y %H:%M:%S").isoformat(),
             'EC Mylavarapu']
createEvent(googEvent[0], googEvent[1], googEvent[2], description=googEvent[3])
This is only one way of doing it. Each date is first created from a human readable string using strptime() and later converted into ISO 8601 with isoformat(). Please, answer me back if you need further help.

Unable to update data in pymongo

I have JSON POST data that a user sends me every time to fetch some data from a third-party service. I plan to cache the data based on a scope ID so that I don't keep inserting the data each time the user requests something. Furthermore, I keep a timestamp for each user request. Below is the POST data that the user sends me every time.
{
    "scope_id": "user1",
    "tool_id": "appdynamics",
    "api_id": "get metrics",
    "input_params": {"user": "myuser", "pwd": "mypwd", "acc_id": "myaccount", "app_id": "TestApp",
                     "metric-path": "ars", "time-range-type": "BEFORE_NOW", "duration-in-mins": 10},
    "output_filters": {}
}
Below is the code snippet to handle the insertion of data
def post(self):
    data = ServiceAPI.parser.parse_args()
    print("First data", data)
    scope_id = data["scope_id"]
    tool_id = data["tool_id"]
    api_id = data["api_id"]
    input_params = data["input_params"]
    output_filter = data["output_filter"]
    if all([scope_id, tool_id, api_id]) and all(input_params.values()):
        check_id = [j for i in users.find({}) for j in i if j == scope_id]
        if check_id and check_id[0] == scope_id:
            users.update({scope_id: [tool_id, api_id, input_params]},
                         {scope_id: [tool_id, api_id, input_params],
                          "timestamp": datetime.now().strftime('%Y-%m-%d %H:%M:%S')}, upsert=True)
        else:
            users.insert_one(
                {scope_id: [tool_id, api_id, input_params],
                 "timestamp": datetime.now().strftime('%Y-%m-%d %H:%M:%S')})
Here the update statement works great if the user's request is exactly the same as last time, but it creates a new entry if the user asks for new information (for example, api_id = "get logs" in the POST request), when ideally it should have updated the user's data with the latest request.
The first time the user makes a POST request, below is the data that gets stored in my database:
[{'user1': ['appdynamics', 'get metrics', {'pwd': 'mypwd', 'metric-path': 'ars', 'user': 'myuser', 'time-range-type': 'BEFORE_NOW', 'acc_id': 'myaccount', 'app_id': 'TestApp', 'duration-in-mins': 10}], 'timestamp': '2018-03-24 21:49:28', '_id': ObjectId('5ab67a901899db6d8a266558')}]
Now if I make the same request again, no new entry is made, since it comes from the same scope ID.
However, if the user now requests some new information, for example:
{
    "scope_id": "user1",
    "tool_id": "appdynamics",
    "api_id": "get logs",
    "input_params": {"user": "myuser", "pwd": "mypwd", "acc_id": "myaccount", "app_id": "TestApp",
                     "metric-path": "ars", "time-range-type": "BEFORE_NOW", "duration-in-mins": 10},
    "output_filters": {}
}
Notice that I have changed "api_id" to "get logs"; it makes a new entry instead of just modifying the existing data in my database. Here is the data now:
[{'user1': ['appdynamics', 'get metrics', {'pwd': 'mypwd', 'metric-path': 'ars', 'user': 'myuser', 'time-range-type': 'BEFORE_NOW', 'acc_id': 'myaccount', 'app_id': 'TestApp', 'duration-in-mins': 10}], 'timestamp': '2018-03-24 21:49:28', '_id': ObjectId('5ab67a901899db6d8a266558')}, {'user1': ['appdynamics', 'get logs', {'pwd': 'mypwd', 'metric-path': 'ars', 'user': 'myuser', 'time-range-type': 'BEFORE_NOW', 'acc_id': 'myaccount', 'app_id': 'TestApp', 'duration-in-mins': 10}], 'timestamp': '2018-03-24 21:55:29', '_id': ObjectId('5ab67bf9089b16e9e77037f4')}]
So here the update seems to fail. What could be going wrong?
Note: This is a Flask app, and I suggest not getting into the details of the implementation. I just need to update the stored data based on the scope ID each time a user makes a request, irrespective of whether it is the same request or a different one.
You are passing upsert=True to update(). Upsert tells MongoDB to update an existing document if one matching the query is found, and to insert a new document otherwise. The first parameter to update() is a query filter used to find the documents to apply the update to. Your filter includes the whole [tool_id, api_id, input_params] list, so when api_id changes to "get logs" the filter matches no existing document and a new one is inserted.
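A minimal sketch of one way to fix it, assuming you want exactly one cached document per scope ID: filter on the scope_id key alone and apply the new values with $set (update_one is the non-deprecated form of update in current PyMongo):

from datetime import datetime

# Match only on the presence of the scope_id key, so the same document is
# reused even when tool_id / api_id / input_params change.
users.update_one(
    {scope_id: {"$exists": True}},
    {"$set": {
        scope_id: [tool_id, api_id, input_params],
        "timestamp": datetime.now().strftime('%Y-%m-%d %H:%M:%S'),
    }},
    upsert=True,
)

With upsert=True this also covers the very first request, so the separate insert_one branch is no longer needed.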

How to improve performance of pymongo queries

I inherited an old Mongo database. Let's focus on the following two collections (removed most of their content for better readability):
Collection user
db.user.find_one({"email": "user@host.com"})
{'lastUpdate': datetime.datetime(2016, 9, 2, 11, 40, 13, 160000),
 'creationTime': datetime.datetime(2016, 6, 23, 7, 19, 10, 6000),
 '_id': ObjectId('576b8d6ee4b0a37270b742c7'),
 'email': 'user@host.com'}
Collection entry (one user to many entries):
db.entry.find_one({"userId": _id})
{'date_entered': datetime.datetime(2015, 2, 7, 0, 0),
'creationTime': datetime.datetime(2015, 2, 8, 14, 41, 50, 701000),
'lastUpdate': datetime.datetime(2015, 2, 9, 3, 28, 2, 115000),
'_id': ObjectId('54d775aee4b035e584287a42'),
'userId': '576b8d6ee4b0a37270b742c7',
'data': 'test'}
As you can see, there is no DBRef between the two.
What I would like to do is to count the total number of entries, and the number of entries updated after a given date.
To do this I used Python's pymongo library. The code below gets me what I need, but it is painfully slow.
import time
from datetime import datetime

from pymongo import MongoClient

client = MongoClient('mongodb://foobar/')
db = client.userdata

# First I need to fetch all user ids. Otherwise the db cursor will time out after some time.
user_ids = []  # build a list of tuples (email, id)
for user in db.user.find():
    user_ids.append((user['email'], str(user['_id'])))

date = datetime(2016, 1, 1)

for user_id in user_ids:
    email, _id = user_id
    t0 = time.time()

    query = {"userId": _id}
    no_of_all_entries = db.entry.find(query).count()

    query = {"userId": _id, "lastUpdate": {"$gte": date}}
    no_of_entries_this_year = db.entry.find(query).count()

    t1 = time.time()
    print("delay ", round(t1 - t0, 2))
    print(email, no_of_all_entries, no_of_entries_this_year)
It takes around 0.83 seconds to run both db.entry.find queries on my laptop, and 0.54 seconds on an AWS server (not the MongoDB server).
With ~20,000 users it takes a painful 3 hours to get all the data.
Is that the kind of latency you'd expect to see in MongoDB? What can I do to improve this? Bear in mind that MongoDB is fairly new to me.
Instead of running the two counts for each user separately, you can get both aggregates for all users at once with db.collection.aggregate().
And instead of a list of (email, userId) tuples, we build a dictionary, as it is easier to use for looking up the corresponding email.
user_emails = {str(user['_id']): user['email'] for user in db.user.find()}

date = datetime(2016, 1, 1)

entry_counts = db.entry.aggregate([
    {"$group": {
        "_id": "$userId",
        "count": {"$sum": 1},
        "count_this_year": {
            "$sum": {
                "$cond": [{"$gte": ["$lastUpdate", date]}, 1, 0]
            }
        }
    }}
])

for entry in entry_counts:
    print(user_emails.get(entry['_id']),
          entry['count'],
          entry['count_this_year'])
I'm pretty sure getting the user's email address into the result could be done but I'm not a mongo expert either.
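A hedged sketch of that idea, assuming MongoDB 4.0+ (needed for $toObjectId, because entry.userId is stored as a string while user._id is an ObjectId) and the collection names shown above:

entry_counts = db.entry.aggregate([
    {"$group": {
        "_id": "$userId",
        "count": {"$sum": 1},
        "count_this_year": {
            "$sum": {"$cond": [{"$gte": ["$lastUpdate", date]}, 1, 0]}
        }
    }},
    # Join back to the user collection to pick up the email address.
    {"$lookup": {
        "from": "user",
        "let": {"uid": {"$toObjectId": "$_id"}},
        "pipeline": [{"$match": {"$expr": {"$eq": ["$_id", "$$uid"]}}}],
        "as": "user_doc",
    }},
])

for entry in entry_counts:
    email = entry['user_doc'][0]['email'] if entry['user_doc'] else None
    print(email, entry['count'], entry['count_this_year'])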

OpenERP: onchange event to create lines on account move

I have an amount field on the Journal Entries form (account.move), and I need to define an onchange event which automatically inserts the journal items once I fill in the amount, but I am not sure how.
Yesterday I had to do something similar to your requirement. I had a sale order (m2o) field on purchase orders, and on the on_change of sale_order I had to fill in the purchase order lines. Have a look; I hope it can help you.
class purchase_order(osv.osv):
    _inherit = 'purchase.order'
    _columns = {
        'sale_order': fields.many2one('sale.order', 'Sale Order'),
        'purchase_type': fields.selection(
            [
                ('order', 'Purchase Order'),
                ('job', 'Job Order'),
            ],
            'Purchase Type',
            required=True,
            states={
                'confirmed': [('readonly', True)],
                'approved': [('readonly', True)],
            },
            select=True,
            help="Define type of purchase order.",
        ),
    }

    def onchange_saleorder(self, cr, uid, ids, order, context=None):
        res = {}
        lis = []
        sorder_id = self.pool.get('sale.order').browse(cr, uid, order)
        for line in sorder_id.order_line:
            print "uom", line.product_id.uom_id.id
            res = {'product_id': line.product_id.id,
                   'name': line.product_id.name,
                   'product_qty': 1,
                   'product_uom': line.product_id.uom_id.id,
                   'price_unit': line.price_unit,
                   'date_planned': time.strftime('%Y-%m-%d')}
            lis.append(res)
        print "list is .........", lis
        res = {'value': {'order_line': lis}}  # here order_line is the o2m field of purchase.order
        return res
Your on_change method has to return the line values as a list of dictionaries, assigned to the one2many field inside the returned 'value' dictionary.
For instance:
res['value']['line_ids'] = [
    {
        'account_id': 1,
        'debit': 100,
    },
    {
        'account_id': 2,
        'credit': 100,
    },
]
Also, see the recompute_voucher_lines method of account.voucher as example.
Sometimes an error will be generated with plain dictionaries, so you have to append the lines using the (0, 0, values) command format instead.
In the case of the above example:
lis.append((0,0,res))
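Tying this back to the original account.move question, a rough sketch in the same old-API style (the amount argument, the hard-coded account IDs, and the method name are assumptions; in these versions the journal items one2many on account.move is line_id, but check your own model):

def onchange_amount(self, cr, uid, ids, amount, context=None):
    # Sketch only: build one debit and one credit line from the amount and
    # return them as (0, 0, vals) commands so the one2many creates the lines.
    if not amount:
        return {}
    lines = [
        (0, 0, {'name': 'Auto debit', 'account_id': 1, 'debit': amount, 'credit': 0.0}),
        (0, 0, {'name': 'Auto credit', 'account_id': 2, 'debit': 0.0, 'credit': amount}),
    ]
    return {'value': {'line_id': lines}}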
