I'm running into a bit of a wall here.
I'm pulling in some data via an API, transforming it, and then appending it to the bottom of a sheet in Google Sheets. For each line of data that I pull, I currently make a separate append() request.
I'm trying to reduce the number of calls I make, and batchUpdate seems like a good start. However, among the available request types for batchUpdate, append doesn't seem to be present, or I'm misreading the docs.
My end goal is to gather a batch of data and append it all to the bottom of a spreadsheet at once, instead of continuously calling the append endpoint.
It looks like you're just looking at the Values collection. A batch append wouldn't buy you much there, because all you'd be doing is appending each batch to the end of the prior one, so you might as well make a single append call with all the data you'd like to append.
However, if you'd like to intermix updates to various other aspects of the spreadsheet (not just the values), then you can use spreadsheets.batchUpdate. One of the kinds of requests it can perform is an AppendCellsRequest.
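For illustration, a minimal sketch of an appendCells request via the Python client; `service` (an authorized Sheets service built with googleapiclient) and SPREADSHEET_ID are assumed to already exist:

body = {
    "requests": [{
        "appendCells": {
            "sheetId": 0,  # numeric sheet ID, not the tab's name
            "rows": [{"values": [
                {"userEnteredValue": {"stringValue": "alice"}},
                {"userEnteredValue": {"numberValue": 42}},
            ]}],
            "fields": "userEnteredValue",
        }
    }]
}
service.spreadsheets().batchUpdate(
    spreadsheetId=SPREADSHEET_ID, body=body).execute()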
However, it's not clear why you need to batch in the first place. With batchUpdate it makes sense, because each update is going to an exact place in the sheet, so you can bundle a lot of updates together. OTOH, "append" doesn't specify a location, it just says "go after the existing data", so there's no point in batching appends up; you can just as easily make a single call w/ all the data you'd like to append.
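In other words (a minimal sketch; `service` and SPREADSHEET_ID assumed as above, and "Sheet1" is a hypothetical tab name):

rows = [
    ["2017-01-01", "widget", 3],
    ["2017-01-02", "gadget", 5],
    # ...every transformed record, carried in one payload
]
service.spreadsheets().values().append(
    spreadsheetId=SPREADSHEET_ID,
    range="Sheet1!A1",               # append lands after the last row of this table
    valueInputOption="USER_ENTERED",
    insertDataOption="INSERT_ROWS",
    body={"values": rows},
).execute()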
Related
I'm currently using gspread and Python. To update values on my spreadsheet, I have to grab the values from the sheet, update them, and then write them back.
What I am looking for is... say I have a cell with a value of 5, is there a way to make an API call that says add +5 to cell "x"? I would no longer need to grab the value first, which would even save an API call. I don't know if this command, or anything similar, is available, but I would appreciate any help here!
Poring over both the documentation for gspread and the Sheets API, I'm fairly confident this cannot be done. All the methods that update values on a sheet require you to specify a range and a list of values, and there is no way to refer to the current values within the range without first retrieving them.
Your best way to reduce API calls is to retrieve and update the data in batches as much as possible, using batch_get() and batch_update().
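For example (a minimal sketch, assuming gspread 3.3+ with service-account credentials configured; the spreadsheet name and cell are hypothetical):

import gspread

gc = gspread.service_account()
ws = gc.open("my-sheet").sheet1

# One read call covering every range you need...
current, = ws.batch_get(["B2"])
new_value = int(current[0][0]) + 5   # the +5 still happens client-side

# ...and one write call covering every range you changed.
ws.batch_update([{"range": "B2", "values": [[new_value]]}])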
If you'd like to submit feedback to Google to implement something like this in a future API update you can check out their issue tracker.
If I want to get data from an API that has different GET endpoints like "/names", "/schools", "/subjects", and more, is there an easy way to find out how many GET endpoints the API has, or an easier way to fetch them all?
I'm going to use data from a .csv to train a model to predict user activity on Google ads (impressions, clicks) in relation to the weather for a given day. I have a .csv that contains 6000+ records of this info and want to parse it into a database using Python.
I tried making a df in pandas, but for some reason the whole table isn't shown. The middle columns (there are about 7 columns, I think) and rows (numbered over 6000, as I mentioned) are replaced with '...' when I print the table, so I'm not sure whether the entirety of the information is being stored and whether it will be usable.
My next attempt will possibly be SQLite, but since it's a local file, will this interfere with someone else making requests to my API endpoint if I don't have the db actively open at all times?
Thanks in advance.
If you used pd.read_csv(), I can assure you all of the info is there; it's just not being displayed.
You can check by doing something like print(df['Column_name_you_are_interested_in'].tolist()) just to make sure, though. You can also use the various count-type methods in pandas to make sure all of your rows are there.
Pandas is pretty versatile, so it shouldn't have any trouble with 6000 lines.
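For instance (a minimal sketch; the filename and table name are hypothetical):

import pandas as pd

df = pd.read_csv("weather_ads.csv")

# The '...' is pandas truncating the *display* only, not the data
print(df.shape)    # (rows, columns): should report all 6000+ rows
print(df.count())  # non-null count per column

# To see every column when printing:
pd.set_option("display.max_columns", None)

# And if you do decide to park it in SQLite later:
import sqlite3
df.to_sql("activity", sqlite3.connect("ads.db"), if_exists="replace", index=False)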
I have an Excel spreadsheet which basically acts as a UI: it lets the user enter some parameters, which are then passed to some Python code on a server via a web service, along with a whole tab full of data.
I am by no means a VBA expert, but I managed to get my data and individual variables submitted. My question is: what is the best-suited VBA data structure to use? Ideally I would like something like a dictionary where the keys would be my defined Names for the Excel cells, plus the data, which in some cases will be a single value and in others a Variant array.
I have to be able to distinguish between the keys and their corresponding values in Python eventually.
So far I have been playing around with Collections:
Dim Main_tab_vars As Collection
Set Main_tab_vars = New Collection
' Add(Item, Key): store each cell's value under a string key
Main_tab_vars.Add Range("Start_Date").Value, "Start_Date_var"
Main_tab_vars.Add Range("Definitions").Value, "Definitions_var"
If I look at the collection in my Watches window, I can see the values correctly stored in Item1 and Item2, but it looks like my key information gets lost.
I would recommend either JSON or XML when sending data to a web service; these are the industry standards. If choosing JSON, then you'd use nested dictionaries and build a string when ready (there's plenty of code on the internet). If using XML, then you could build up the XML document as you go.
I do not know how well Python handles JSON, so I'd probably opt for XML.
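For what it's worth, Python's standard library does ship a json module, so decoding on the server side is straightforward. A minimal sketch, with a hypothetical payload using the defined Names from the question:

import json

# Hypothetical payload built from the Excel defined Names above
payload = '{"Start_Date_var": "2017-06-01", "Definitions_var": [["a", 1], ["b", 2]]}'

data = json.loads(payload)
for key, value in data.items():
    # keys stay paired with their values, whether scalar or nested array
    print(key, value)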
I have data in an Excel file that I would like to use to create a case in PSSE. The data is organized as it would appear in a case in PSSE (i.e. for a bus: bus number, name, base kV, and so on). Of course the data can be entered manually, but I'm working with over 500 buses. I have tried copying and pasting, but that seems to work only sometimes. For machine data, it barely works.
Is there a way to import this data into PSSE from an Excel file? I have recently started running PSSE with Python, so maybe there is a way to do this?
--
MK.
Yes. You can import data from an Excel file into PSSE using the Python package xlrd. However, I would recommend converting your Excel file to CSV before you import it, and working with the CSV, as it is much easier. Importing data using the API is not just a copy-and-paste job into the nicely tabulated spreadsheet that PSSE shows for its case data.
Refer to the API documentation for PSSE, chapter II, and search for the function BUS_DATA_2. You will see that you can create buses with this function.
So your job should be threefold (there's a rough sketch after these steps):
Import the CSV file data, with each line being a list of the data parameters for your bus (voltage, name, base kV, etc.), and store the lines in another list.
Iterate through the list you just created and call:
ierr = bus_data_2(i, intgar, realar, name)
passing in your data from the CSV file (see the PSSE API documentation on how to do this). This will effectively load the data from the CSV file into your case (in the form of nodes, or buses).
After you are finished, call psspy.save("Casename.sav") to save your work as a new PSSE case.
Note: there are similar functions to load in line data, fixed shunt data, generator data, etc.
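A rough sketch of those steps (psspy is only importable inside a PSSE Python environment; the CSV layout, the intgar/realar placeholder values, and the bus count passed to psseinit are all assumptions -- check BUS_DATA_2 in the API manual for the exact array contents):

import csv
import psspy

psspy.psseinit(10000)  # initialize PSSE; the bus limit here is arbitrary

with open("bus_data.csv") as f:       # hypothetical columns: number, name, baseKV
    for row in csv.reader(f):
        bus_number = int(row[0])
        name = row[1]
        intgar = [1, 1, 1, 1]         # placeholders -- bus type/area/zone/owner per the manual
        realar = [float(row[2])]      # placeholder -- base kV first; order/length per the manual
        ierr = psspy.bus_data_2(bus_number, intgar, realar, name)
        if ierr != 0:
            print("bus_data_2 failed for bus %d (ierr=%d)" % (bus_number, ierr))

psspy.save("Casename.sav")            # write the populated case to disk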
Your other option is to call up the PTI folks, as they can give you training.
Good luck
If you have an Excel data file with exactly the same "format" and same "info" as the regular case file (.sav), try this:
Open any small example .sav file from the example sub-folder of PSSE's installation folder
Copy the corresponding spreadsheet data with the same "info" (say, bus, branch, etc.) into the working case (shown in spreadsheet view) in the PSSE GUI
After you finish copying everything, save the edited working case in the GUI as a new working case.
If this doesn't work, I suggest you ask this question on the "Python for Power Systems" forum:
https://psspy.org/psse-help-forum/questions/