The free tier of AlchemyAPI allows 1,000 requests a day (http://www.alchemyapi.com/products/pricing/).
I have been accessing the API from Python like this:
import json
from alchemyapi import AlchemyAPI

alchemyapi = AlchemyAPI()
demo_text = 'Yesterday dumb Bob destroyed my fancy iPhone in beautiful Denver, Colorado. I guess I will have to head over to the Apple Store and buy a new one.'
response = alchemyapi.keywords('text', demo_text)
json_output = json.dumps(response, indent=4)
print(json_output)
I know I ran out of calls because the requests started returning None.
How do I check how many calls I have left through the python interface?
Will the check count as one request?
You can use the alchemy_calls_left(api_key) function from here, and no, the check won't count as a call itself.
This URL returns your daily usage info; replace API_KEY with your own key:
http://access.alchemyapi.com/calls/info/GetAPIKeyInfo?apikey=API_KEY&outputMode=json
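If you'd rather do that check from Python, you could fetch the URL above and parse the JSON. A minimal sketch — the field names here (status, consumedDailyTransactions, dailyTransactionLimit) are assumptions about the endpoint's JSON output, so verify them against a real response for your key:

```python
import json

def calls_left(response_text, daily_limit=1000):
    """Parse a GetAPIKeyInfo JSON response and return remaining calls.

    The field names below are assumptions about the endpoint's output;
    check them against the actual JSON you get back.
    """
    info = json.loads(response_text)
    if info.get("status") != "OK":
        raise RuntimeError("API key info request failed: %r" % info)
    used = int(info["consumedDailyTransactions"])
    limit = int(info.get("dailyTransactionLimit", daily_limit))
    return limit - used

# Example with a response body in the assumed format:
sample = '{"status": "OK", "consumedDailyTransactions": "42", "dailyTransactionLimit": "1000"}'
print(calls_left(sample))  # 958
```

To use it live, fetch the URL above (e.g. with urllib.request.urlopen) and pass the body to calls_left().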
You could also keep a local variable that tracks the number of API calls and resets when the date changes, using datetime.date from the datetime module.
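A minimal sketch of that local counter, assuming all calls go through this one process (it won't survive restarts or count calls made elsewhere):

```python
import datetime

class DailyCallCounter:
    """Track API calls locally and reset the count when the date changes."""

    def __init__(self, daily_limit=1000):
        self.daily_limit = daily_limit
        self.count = 0
        self.day = datetime.date.today()

    def record_call(self):
        today = datetime.date.today()
        if today != self.day:   # date changed: reset the counter
            self.day = today
            self.count = 0
        self.count += 1

    def remaining(self):
        return self.daily_limit - self.count

counter = DailyCallCounter()
counter.record_call()
print(counter.remaining())  # 999
```

Call record_call() right before each real API request and check remaining() to decide whether to proceed.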
You can also use this Java API as follows:
AlchemyAPI alchemyObj = AlchemyAPI.GetInstanceFromFile("/../AlchemyAPI/testdir/api_key.txt");
AlchemyAPI_NamedEntityParams params= new AlchemyAPI_NamedEntityParams();
params.setQuotations(true); // for instance, enable quotations
Document doc = alchemyObj.HTMLGetRankedNamedEntities(htmlString, "http://news-site.com", params);
If you exceed the allowed calls for a given day, the last call will throw an IOException with the message "Error making API call: daily-transaction-limit-exceeded."
You can then catch it, wait 24 hours, and retry.
I currently use the function get_users_following(user_id_account, max_results=1000) to get the list of accounts a user follows on Twitter. So far it works well as long as the user follows fewer than 1,000 people, because the API caps each response at 1,000 users. The problem is that when they follow more than 1,000 people I can't get the rest: the function always gives me the first 1,000 and ignores the others.
https://docs.tweepy.org/en/stable/client.html#tweepy.Client.get_users_followers
https://developer.twitter.com/en/docs/twitter-api/users/follows/api-reference/get-users-id-following
There is a pagination_token parameter, but I don't know how to use it. What I want is just the last X newly followed accounts, so I can add them to a database and get a notification for each new entry.
client = tweepy.Client(api_token)
response = client.get_users_following(id=12345, max_results=1000)
Is it possible to go directly to the last page?
Tweepy handles the pagination with the Paginator class (see the documentation here).
For example, if you want to see all the pages, you could do something like this:
# Use the wait_on_rate_limit argument if you don't handle the exception yourself
client = tweepy.Client(api_token, wait_on_rate_limit=True)
# Instantiate a new Paginator with the Tweepy method and its arguments
paginator = tweepy.Paginator(client.get_users_following, 12345, max_results=1000)
for response_page in paginator:
    print(response_page)
Or you could also directly get the full list of the user's followings:
# Instantiate a new Paginator with the Tweepy method and its arguments
paginator = tweepy.Paginator(client.get_users_following, 12345, max_results=1000)
for user in paginator.flatten():  # without a limit argument, it gets all users
    print(user.id)
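To get just the newly followed accounts, one option is to flatten the Paginator as above, then diff the fresh pull against the IDs already in your database. A minimal sketch of the diff step (pure set logic, no Tweepy calls; the variable names are illustrative):

```python
def new_followings(current_ids, known_ids):
    """Return the IDs present in the fresh pull but not yet stored.

    current_ids: iterable of user IDs from a full Paginator.flatten() pass
    known_ids:   set of IDs already in your database
    """
    known = set(known_ids)
    # Preserve the API's ordering while dropping already-known IDs
    return [uid for uid in current_ids if uid not in known]

print(new_followings([111, 222, 333, 444], {222, 333}))  # [111, 444]
```

After notifying, add the returned IDs to the database so the next run only reports genuinely new follows.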
I am using the Uniswap Python API to get live token prices. I have tried all the variations of the built-in functions, but none of them gives me the right value.
Here is my code:
from uniswap import Uniswap

address = "0x0000000000000000000000000000000000000000"
private_key = None
# infura_url is my Infura endpoint, defined elsewhere
uniswap_wrapper = Uniswap(address, private_key, infura_url, version=2)
dai = "0x89d24A6b4CcB1B6fAA2625fE562bDD9a23260359"
print(uniswap_wrapper.get_eth_token_input_price(dai, 5*10**18))
print(uniswap_wrapper.get_token_eth_input_price(dai, 5*10**18))
print(uniswap_wrapper.get_eth_token_output_price(dai, 5*10**18))
print(uniswap_wrapper.get_token_eth_output_price(dai, 5*10**18))
These are my results, respectively:
609629848330146249678
24997277527023953
25306950626771242
2676124437498249933489
I don't want to use coingecko or coinmarketcaps api as they do not list newly released token prices immediately.
I tried Etherscan to get token prices, but it does not have a built-in function for that. Does anybody have any suggestions on how to fix this, or do you know any alternatives?
I don't have the time or setup to test this right now, but I believe that what you want is something like this:
print(uniswap_wrapper.get_eth_token_input_price(dai, 5*10**18) / (5*10**18))
print(uniswap_wrapper.get_token_eth_input_price(dai, 5*10**18) / (5*10**18))
print(uniswap_wrapper.get_eth_token_output_price(dai, 5*10**18) / (5*10**18))
print(uniswap_wrapper.get_token_eth_output_price(dai, 5*10**18) / (5*10**18))
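Both ETH and DAI use 18 decimals, so the raw integers above are scaled by 10**18. Dividing a quote by the full input amount 5 * 10**18 (note the parentheses in Python — writing /5*10**18 would divide by 5 and then multiply by 10**18) yields a human-readable per-unit rate. For the first result from the question:

```python
raw = 609629848330146249678   # raw quote returned for a 5 * 10**18 input
amount = 5 * 10**18           # the input amount passed to the call
rate = raw / amount           # per-unit rate; both sides use 18 decimals
print(round(rate, 2))  # 121.93
```

That works out to roughly 122 DAI per ETH, which is a plausible market rate, suggesting the raw numbers were correct all along and only needed scaling.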
When using the MSAL library for Python, I cannot get the access token expiration time to change from the default of 1 hour.
I have tried:
import datetime

import msal

now = datetime.datetime.utcnow()
then = datetime.datetime.utcnow() + datetime.timedelta(minutes=10)
claims = {
    "exp": then,
}
app = msal.ConfidentialClientApplication(
    graph_config["client_id"], authority=graph_config["authority"],
    client_credential=graph_config["secret"], client_claims=claims)
I have tried sending this as a Python datetime object and as a string. I have tried appending '_min' to the value, and I have tried 'now + 10_min' like the docs say.
No matter what, I still get an expiration time of:
"expires_in": 3599,
"ext_expires_in": 3599,
i.e. one hour
Docs: https://msal-python.readthedocs.io/en/latest/#publicclientapplication-and-confidentialclientapplication
Please for the love of all that is holy, someone help me get this stupid access token to last longer.
Token lifetimes are managed by policies in Azure AD (https://learn.microsoft.com/en-us/azure/active-directory/develop/active-directory-configurable-token-lifetimes#configurable-token-lifetime-properties), so they aren't something you can change at the user level, though an admin could alter a policy or create a new one to do this. Note also that client_claims affects the client assertion your app sends to authenticate itself, not the access token Azure AD issues back. The default lifespan is 1 hour for security reasons, and unless you have a good reason to change it, you shouldn't: it's generally easy for any app to manage its own token refresh/renewal.
I am trying to learn how to use the new Webhooks service on IFTTT, and I'm struggling to figure out how it is supposed to work. Most of the projects I can find online refer to a deprecated "Maker" service, and there are very few resources for interpreting the new channel.
Let's say I want to check the value of "online" every ten minutes in the following json file: https://lichess.org/api/users/status?ids=thibault
I can write a Python script that extracts this value like so:
import json
from urllib.request import urlopen

response = urlopen('https://lichess.org/api/users/status?ids=thibault')
thibault = response.read()
data = json.loads(thibault)
status = data[0]['online']
If the status is equal to "true", I would like to receive a notification via email or text message. How do I integrate the python script and the webhooks service? Or do I even need to use this script at all? I assume I need some kind of cron job that regularly runs this Python script, but how do I connect this job with IFTTT?
When I create a new applet on IFTTT I am given the ability to create a trigger with a random event name, but it's unclear what that event name corresponds to.
I have a similar setup for my IFTTT webhook service. To the best of my understanding, the answer to your question is yes: you need this script (or something similar) to scrape the online value, and you'll probably want either a cron job (my approach) or to keep the script running (which wouldn't be my preference).
IFTTT's webhooks accept a JSON payload of up to 3 values, which you can POST to a given event and key name.
Below is a very simple excerpt of my webhook API:
import requests

def push_notification(*values, **kwargs):
    # config is in json format
    config = get_config()
    report = {}
    IFTTT = {}
    # set default event/key if kwargs are not present
    for i in ['event', 'key']:
        IFTTT[i] = kwargs[i] if kwargs and i in kwargs.keys() else config['IFTTT'][i]
    # unpack values received (up to 3 is accepted by IFTTT)
    for i, value in enumerate(values, 1):
        report[f"value{i}"] = value
    if report:
        res = requests.post(f"https://maker.ifttt.com/trigger/{IFTTT['event']}/with/key/{IFTTT['key']}", data=report)
        # TODO: add try/except for status
        res.raise_for_status()
        return res
    else:
        return None
You probably don't need all of this, but my goal was to set up a versatile solution. At the end of the day, all you really need is this line:
requests.post(f"https://maker.ifttt.com/trigger/{event}/with/key/{key}", data={my_json_up_to_3_values})
Where you will be placing your webhook event name and secret key value. I stored them in a config file. The secret key will be available once you sign up for the IFTTT webhook service (go to your IFTTT webhook applet settings). You can find your key with a quick help link like this: https://maker.ifttt.com/use/{your_secret_key}. The event can be a default value you set on your applet, or users can choose their own event name if you allow it.
In your case, you could do something like:
if status:
    push_notification("Status is True", "From id thibault", event="PushStatus", key="MysEcR5tK3y")
Note: I used f-strings, which need Python 3.6+ (they're great!), but if you are on a lower version, you should switch all the f-strings to str.format().
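Putting the two halves together, a cron-friendly script might look like the sketch below. The lichess URL comes from the question; the IFTTT event and key are placeholders you would fill in, and build_report is just an illustrative helper that packs values the way the webhook expects. It uses only the standard library so there is no requests dependency:

```python
import json
from urllib.request import Request, urlopen

IFTTT_EVENT = "your_event_name"  # placeholder: your applet's event name
IFTTT_KEY = "your_secret_key"    # placeholder: from the webhook settings page

def build_report(*values):
    """Pack up to 3 values into the payload shape IFTTT expects."""
    return {"value%d" % i: v for i, v in enumerate(values[:3], 1)}

def check_and_notify():
    """Fetch the lichess status and fire the webhook if the user is online."""
    with urlopen("https://lichess.org/api/users/status?ids=thibault") as resp:
        data = json.loads(resp.read())
    if data[0].get("online"):
        url = "https://maker.ifttt.com/trigger/%s/with/key/%s" % (IFTTT_EVENT, IFTTT_KEY)
        body = json.dumps(build_report("thibault is online")).encode()
        req = Request(url, data=body, headers={"Content-Type": "application/json"})
        urlopen(req)

# Run check_and_notify() from a cron job, e.g. every 10 minutes:
#   */10 * * * * /usr/bin/python3 /path/to/check_status.py
print(build_report("a", "b"))  # {'value1': 'a', 'value2': 'b'}
```

The .get("online") lookup guards against the key being absent in the response rather than assuming it is always present.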
Using the GData Calendar API via App Engine in Python, when you create an event there are handy little helper methods to parse the response:
new_event = calendar_service.InsertEvent(event, '/calendar/feeds/default/private/full')
helper = new_event.GetEditLink().href
When you create a new calendar:
new_calendar = gd_client.InsertCalendar(new_calendar=calendar)
I was wondering if there might be related methods that I just can't find in the documentation (or that are--perhaps--undocumented)?
I need to store the new calendar's ID in the datastore, so I would like something along the lines of:
new_calendar = gd_client.InsertCalendar(new_calendar=calendar)
new_calendar.getGroupLink().href
In my code, the calendar is being created, and Google is returning the Atom response with a 201, but before I resort to ElementTree or atom.parse to extract the desired element, I was hoping someone here might be able to help.
Many thanks in advance :)
I've never used the GData API, so I could be wrong, but...
It looks like GetLink() will return the link object for any specified rel. Seems like GetEditLink() just calls GetLink(), passing in the rel of the Edit link. So you should be able to call GetLink() on the response from InsertCalendar(), and pass in the rel of the Group link.
Here's the pydoc info that I used to figure this out: http://gdata-python-client.googlecode.com/svn/trunk/pydocs/gdata.calendar_resource.data.html
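If GetLink() turns out not to be available on the object you get back, Atom entries in the gdata client expose their links as a list of link objects with rel and href attributes, so a generic lookup is easy to sketch. The group-link rel string would need to be confirmed against the actual 201 response body; the Link class below is just a stand-in for illustration:

```python
def find_link(entry_links, rel):
    """Return the href of the first link whose rel matches, else None.

    entry_links: an entry's list of link objects (each with .rel and .href)
    """
    for link in entry_links:
        if link.rel == rel:
            return link.href
    return None

# Stand-in link objects for illustration only:
class Link:
    def __init__(self, rel, href):
        self.rel, self.href = rel, href

links = [Link("edit", "http://example.com/edit"),
         Link("alternate", "http://example.com/alt")]
print(find_link(links, "edit"))  # http://example.com/edit
```

With a real response you would pass new_calendar.link (the entry's link list) and whatever rel value the Group link uses in the returned Atom entry.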