When using the MSAL library for Python, I cannot get the access token expiration time to change from the default of 1 hour.
I have tried:
import datetime
import msal

now = datetime.datetime.utcnow()
then = datetime.datetime.utcnow() + datetime.timedelta(minutes=10)
claims = {
    "exp": then,
}
app = msal.ConfidentialClientApplication(
    graph_config["client_id"], authority=graph_config["authority"],
    client_credential=graph_config["secret"], client_claims=claims)
I have tried sending this as a Python datetime object and as a string. I have tried adding '_min' to the value, and I have tried 'now + 10_min' as the docs describe.
No matter what, I still get an expiration time of:
"expires_in": 3599,
"ext_expires_in": 3599,
i.e. one hour
Docs: https://msal-python.readthedocs.io/en/latest/#publicclientapplication-and-confidentialclientapplication
Please for the love of all that is holy, someone help me get this stupid access token to last longer.
Token lifetimes are managed by policies in Azure AD (https://learn.microsoft.com/en-us/azure/active-directory/develop/active-directory-configurable-token-lifetimes#configurable-token-lifetime-properties), so they aren't something you can change at the user level (though an admin could alter or create a policy to do this). The default lifetime is 1 hour for security reasons, and unless you have a good reason to change it you shouldn't, as it's generally easy for any app to manage its own token refresh/renewal.
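For what it's worth, MSAL can handle the renewal for you. A minimal sketch, reusing the app object from the question and assuming the Microsoft Graph scope (recent MSAL Python versions serve the app token from their cache until it expires, so the 1-hour lifetime rarely matters in practice):
scopes = ["https://graph.microsoft.com/.default"]
# Returns a cached token while it is still valid; requests a fresh one otherwise
result = app.acquire_token_for_client(scopes=scopes)
if "access_token" in result:
    access_token = result["access_token"]
else:
    print(result.get("error"), result.get("error_description"))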
Related
I am fetching a subscription's Secure Score using the Microsoft Azure Security Center (ASC) Management Client Library. All operations in the library state that
You should not instantiate directly this class, but create a Client instance that will create it for you and attach it as attribute.
Therefore, I am creating a SecurityCenter client with the following specification:
SecurityCenter(credentials, subscription_id, asc_location, base_url=None)
However, it seems to me that the only way to get the asc_location information properly is to use the SecurityCenter client to fetch it. The spec says the same as the quote above, You should not instantiate.... So I am stuck: I cannot create the client because I need the ASC location to do so, and I need to create the client to get the ASC locations.
The documentation mentions
The location where ASC stores the data of the subscription. can be retrieved from Get locations
Googling and searching through the Python SDK docs for this "Get locations" gives me nothing (other than the REST API). Have I missed something? Are we supposed to hard-code the location like in this SO post or this GitHub issue from the SDK repository?
As the official API reference for List locations indicates:
The location of the responsible ASC of the specific subscription (home
region). For each subscription there is only one responsible location.
It will not change, so you can hardcode this value if you already know the asc_location of your subscription.
But each subscription may have a different asc_location value (my 2 Azure subscriptions have different asc_location values).
So if you have a lot of Azure subscriptions, you can query asc_location via the REST API (as far as I know, this is the only way I have found to do this) and then use the SDK to get the Secure Score. Try the code below:
from azure.mgmt.security import SecurityCenter
from azure.identity import ClientSecretCredential
import requests

TENANT_ID = ''
CLIENT = ''
KEY = ''
subscription_id = ''

getLocationsURL = "https://management.azure.com/subscriptions/" + subscription_id + "/providers/Microsoft.Security/locations?api-version=2015-06-01-preview"

credentials = ClientSecretCredential(
    client_id=CLIENT,
    client_secret=KEY,
    tenant_id=TENANT_ID
)

# request the asc_location for a subscription
azure_access_token = credentials.get_token('https://management.azure.com/.default')
r = requests.get(getLocationsURL, headers={"Authorization": "Bearer " + azure_access_token.token}).json()
location = r['value'][0]['name']
print("location:" + location)

client = SecurityCenter(credentials, subscription_id, asc_location=location)
for score in client.secure_scores.list():
    print(score)
I recently came across this problem.
Based on my observation, I can use any location under my subscription to initialize the SecurityCenter client. Later, client.locations.list() gives me exactly one ASC location.
from pprint import pprint
from azure.mgmt.security import SecurityCenter

# Any location from SubscriptionClient.subscriptions.list_locations will do
location = 'eastasia'
client = SecurityCenter(
    credential, my_subscription_id,
    asc_location=location
)
data = client.locations.list().next().as_dict()
pprint(f"Asc location: {data}")
In my case, it's always westcentralus, regardless of the fact that my input was eastasia.
Note that you'll get an exception if you use get instead of list:
data = client.locations.get().as_dict()
pprint(f"Asc location: {data}")
# azure.core.exceptions.ResourceNotFoundError: (ResourceNotFound) Could not find location 'eastasia'
So what I did was a bit awkward (sketched below):
1. Create a SecurityCenter client using any location under my subscription
2. Call client.locations.list() to get the ASC location
3. Use the retrieved ASC location to create the SecurityCenter client again
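Roughly, the dance looks like this (a sketch only; credential and subscription_id are assumed to exist already, and 'eastasia' is just an arbitrary seed region):
from azure.mgmt.security import SecurityCenter

# Seed client with an arbitrary region, only to be able to ask for the home region
seed_client = SecurityCenter(credential, subscription_id, asc_location="eastasia")
home_region = next(iter(seed_client.locations.list())).name  # the single responsible ASC location

# Re-create the client with the real ASC location
client = SecurityCenter(credential, subscription_id, asc_location=home_region)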
I ran into this recently too, and initially did something based on #stanley-gong's answer. But it felt a bit awkward, and I checked to see how the Azure CLI does it. I noticed that they hardcode a value for asc_location:
def _cf_security(cli_ctx, **_):
    from azure.cli.core.commands.client_factory import get_mgmt_service_client
    from azure.mgmt.security import SecurityCenter
    return get_mgmt_service_client(cli_ctx, SecurityCenter, asc_location="centralus")
And the PR implementing that provides some more context:
we have a task to remove the asc_location from the initialization of the clients. currently we hide the asc_location usage from the user.
centralus is a just arbitrary value and is our most common region.
So... maybe the dance of double-initializing a client or pulling a subscription's home region isn't buying us anything?
I am using the Uniswap Python API to get live token prices. I have tried all the variations of the built-in functions, but it does not give me the right value.
Here is my code:
from uniswap import Uniswap

address = "0x0000000000000000000000000000000000000000"
private_key = None
uniswap_wrapper = Uniswap(address, private_key, infura_url, version=2)
dai = "0x89d24A6b4CcB1B6fAA2625fE562bDD9a23260359"
print(uniswap_wrapper.get_eth_token_input_price(dai, 5*10**18))
print(uniswap_wrapper.get_token_eth_input_price(dai, 5*10**18))
print(uniswap_wrapper.get_eth_token_output_price(dai, 5*10**18))
print(uniswap_wrapper.get_token_eth_output_price(dai, 5*10**18))
And these are my results respectively,
609629848330146249678
24997277527023953
25306950626771242
2676124437498249933489
I don't want to use the CoinGecko or CoinMarketCap APIs, as they do not list newly released token prices immediately.
I tried Etherscan to get token prices, but it does not have a built-in function for that. Does anybody have any suggestions on how to fix this, or do you know of any alternatives?
I don't have the time or setup to test this right now, but I believe that what you want is something like this:
print(uniswap_wrapper.get_eth_token_input_price(dai, 5*10**18) / (5*10**18))
print(uniswap_wrapper.get_token_eth_input_price(dai, 5*10**18) / (5*10**18))
print(uniswap_wrapper.get_eth_token_output_price(dai, 5*10**18) / (5*10**18))
print(uniswap_wrapper.get_token_eth_output_price(dai, 5*10**18) / (5*10**18))
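The reasoning behind the division (my assumption, not verified against the library): both ETH and DAI use 18 decimals, so the raw quotes come back in the smallest unit, and dividing by the input amount gives an approximate per-token price:
amount_in = 5 * 10**18  # the 5-token input amount used above (18-decimal units)

raw_quote = uniswap_wrapper.get_token_eth_input_price(dai, amount_in)
eth_per_dai = raw_quote / amount_in  # roughly the price of 1 DAI in ETH
print(eth_per_dai)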
I have a simple script, shown below. I want to use the value of holiday_value to create a filter. I am thinking this could be done by putting the value into Zap storage, then retrieving it from storage and using it in a Zap filter. I don't know how to get the value from the script into Zap storage.
from datetime import date
import holidays

us_holidays = holidays.US()
if date.today() in us_holidays:
    holiday_value = 'true'
else:
    holiday_value = 'false'
I'm not really good at Python, but here's what you could do.
Trigger
If you want to trigger another Zap from this Code step, you could use the requests library and Zapier's Webhooks as a Trigger step for your other Zap (Zap that you want to Trigger).
Here are the steps:
1. Set up a Zap with Webhooks as the trigger app. Get the webhook URL.
2. From the Code step, make a POST request to the webhook URL with the value of holiday_value (a rough sketch follows below). Here is a sample POST request; this example is also helpful to refer to.
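A hedged sketch of step 2, using the requests library (available in Zapier's Python code steps); the URL below is a placeholder for the catch-hook URL you copy from your Webhooks trigger:
import requests

# Placeholder URL: copy the real catch-hook URL from your Webhooks trigger step
ZAPIER_HOOK_URL = "https://hooks.zapier.com/hooks/catch/123456/abcdef/"

requests.post(ZAPIER_HOOK_URL, json={"holiday_value": holiday_value})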
Filter
If you are looking to create a filter in the same Zap instead,
you could return the value of holiday_value from this Code step. Refer to the documentation here.
Your code would probably look like this (please check the syntax, I'm not good at Python):
from datetime import date
import holidays

us_holidays = holidays.US()
if date.today() in us_holidays:
    return {'holiday_value': 'true'}
else:
    return {'holiday_value': 'false'}
You can now add a Filter step that only allows the Zap to continue if holiday_value equals 'true' (or 'false', depending on what you want). Documentation for filters is here.
Hope that helps.
I am trying to learn how to use the new Webhooks service on IFTTT, and I'm struggling to figure out how it is supposed to work. Most of the projects I can find online refer to the deprecated "Maker" service, and there are very few resources for the new channel.
Let's say I want to check the value of "online" every ten minutes in the following json file: https://lichess.org/api/users/status?ids=thibault
I can write a Python script that extracts this value like so:
import json
from urllib.request import urlopen

response = urlopen('https://lichess.org/api/users/status?ids=thibault')
thibault = response.read()
data = json.loads(thibault)
status = data[0]['online']
If the status is equal to "true", I would like to receive a notification via email or text message. How do I integrate the python script and the webhooks service? Or do I even need to use this script at all? I assume I need some kind of cron job that regularly runs this Python script, but how do I connect this job with IFTTT?
When I create a new applet on IFTTT I am given the ability to create a trigger with a random event name, but it's unclear what that event name corresponds to.
I have a similar setup for my IFTTT webhook service. To the best of my understanding, the answer to your question is yes, you need this script (or something similar) to scrape the online value, and you'll probably want a cron job (my approach) or to keep the script running (which wouldn't be my preference).
IFTTT's webhooks accept a JSON payload of up to 3 values, which you can POST to a given event name with your key.
Below is a very simple excerpt of my webhook API:
import requests

def push_notification(*values, **kwargs):
    # config is in json format
    config = get_config()
    report = {}
    IFTTT = {}
    # set default event/key if kwargs are not present
    for i in ['event', 'key']:
        IFTTT[i] = kwargs[i] if kwargs and i in kwargs.keys() else config['IFTTT'][i]
    # unpack values received (up to 3 is accepted by IFTTT)
    for i, value in enumerate(values, 1):
        report[f"value{i}"] = value
    if report:
        res = requests.post(f"https://maker.ifttt.com/trigger/{IFTTT['event']}/with/key/{IFTTT['key']}", data=report)
        # TODO: add try/except for status
        res.raise_for_status()
        return res
    else:
        return None
You probably don't need all of this; my goal was to set up a versatile solution. At the end of the day, all you really need is this line:
requests.post(f"https://maker.ifttt.com/trigger/{event}/with/key/{key}", data={my_json_up_to_3_values})
Here you place your webhook event name and secret key value; I stored them in a config file. The secret key will be available once you sign up for the webhook service on IFTTT (go to your IFTTT webhook applet settings). You can find your key with a quick help link like this: https://maker.ifttt.com/use/{your_secret_key}. The event can be a default value you set on your applet, or the user can choose their own event name if you allow it.
In your case, you could do something like:
if status:
    push_notification("Status is True", "From id thibault", event="PushStatus", key="MysEcR5tK3y")
Note: I used f-strings, which require Python 3.6+ (they're great!); if you have a lower version, you should switch all the f-strings to str.format().
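To tie this back to your question: a small glue script like the one below could be run from a cron job every ten minutes (a sketch only; it assumes the push_notification helper above, and the event/key values are placeholders):
import json
from urllib.request import urlopen

# Fetch the lichess status and fire the IFTTT webhook when the user is online.
# "PushStatus" and "MysEcR5tK3y" are placeholders for your own event name and key.
response = urlopen('https://lichess.org/api/users/status?ids=thibault')
data = json.loads(response.read())
if data[0].get('online', False):  # 'online' may be absent when the user is offline
    push_notification("thibault is online", event="PushStatus", key="MysEcR5tK3y")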
A free AlchemyAPI user can make 1000 requests a day (http://www.alchemyapi.com/products/pricing/).
I have been accessing the API with python as such:
import json
from alchemyapi import AlchemyAPI

alchemyapi = AlchemyAPI()
demo_text = 'Yesterday dumb Bob destroyed my fancy iPhone in beautiful Denver, Colorado. I guess I will have to head over to the Apple Store and buy a new one.'
response = alchemyapi.keywords('text', demo_text)
json_output = json.dumps(response, indent=4)
print(json_output)
I know I ran out of calls because the requests were returning None.
How do I check how many calls I have left through the python interface?
Will the check count as one request?
You can use the alchemy_calls_left(api_key) function from here,
and no, it won't count as a call itself.
This URL will return the daily call usage info for your key.
Replace API_KEY with your key:
http://access.alchemyapi.com/calls/info/GetAPIKeyInfo?apikey=API_KEY&outputMode=json
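A quick sketch of calling that endpoint from Python (the exact field names in the response are not guaranteed here, so inspect the printed JSON):
import requests

API_KEY = "your_api_key"  # replace with your real key
info = requests.get(
    "http://access.alchemyapi.com/calls/info/GetAPIKeyInfo",
    params={"apikey": API_KEY, "outputMode": "json"},
).json()
print(info)  # look for the consumed/limit fields to see how many calls remain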
You could keep a local variable that tracks the number of API calls and resets when the date changes, using datetime.date from the datetime module (a rough sketch below).
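A minimal sketch of that idea (purely illustrative; the names are made up):
import datetime

calls_made = 0
counter_date = datetime.date.today()

def record_api_call():
    """Increment a local daily counter, resetting it when the date changes."""
    global calls_made, counter_date
    today = datetime.date.today()
    if today != counter_date:
        counter_date = today
        calls_made = 0
    calls_made += 1
    return calls_made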
You can also use this Java API as follows:
AlchemyAPI alchemyObj = AlchemyAPI.GetInstanceFromFile("/../AlchemyAPI/testdir/api_key.txt");
AlchemyAPI_NamedEntityParams params= new AlchemyAPI_NamedEntityParams();
params.setQuotations(true); // for instance, enable quotations
Document doc = alchemyObj.HTMLGetRankedNamedEntities(htmlString, "http://news-site.com", params);
The last call will cause an IOException (if you exceed the allowed calls for a given day) and the message will be "Error making API call: daily-transaction-limit-exceeded."
You can then catch it, wait for 24 hours and re-try.