I have some code that generates a signed URL:
from boto.cloudfront import CloudFrontConnection
from boto.cloudfront.distribution import Distribution

expire_time = in_an_hour()  # helper returning a Unix timestamp one hour from now
conn = CloudFrontConnection(access_key_id, access_key)
# Enter the id or domain name to select a distribution
distribution = Distribution(connection=conn, config=None, domain_name=domain,
                            id=dist_id, last_modified_time=None, status='')
signed_url = distribution.create_signed_url(url=url_tosign, keypair_id=keypair,
                                            expire_time=expire_time,
                                            private_key_file="/path/to/priv.pem")
return signed_url
I want to include a custom policy that takes the user's IP into account. I wrote a method that generates the policy JSON, and it works fine. I tried adding policy=get_policy(url_tosign, ip) to the distribution.create_signed_url call, but I just get an error saying it's an unexpected keyword argument. How can I modify this code to generate my signed URL with source IP restrictions?
James Dean's link had the solution. There is an ip_address parameter that we can use.
Updated line of code:
signed_url = distribution.create_signed_url(url=url_tosign, keypair_id=keypair, expire_time=expire_time, private_key_file="/path/to/priv.pem", ip_address="x.x.x.x/32")  # CIDR notation optional; x.x.x.x also works as a /32
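For reference, ip_address works because boto folds it into a custom policy for you. A sketch of the policy document it implies, following the CloudFront custom-policy format (illustrative, not boto's exact output):
policy = {
    "Statement": [{
        "Resource": url_tosign,
        "Condition": {
            "DateLessThan": {"AWS:EpochTime": expire_time},
            "IpAddress": {"AWS:SourceIp": "x.x.x.x/32"},
        }
    }]
}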
I wish to call an HTTP API to change a password in the Roundcube mail client using Python code. How can this be done? Where exactly is this configured in Roundcube, and how is it invoked?
Ensure the httpapi driver is available in your install.
In the main roundcube configuration file config.inc.php set the variables:
$config['password_httpapi_url'] =
'http://host:5000/change_user_password'; // required
$config['password_httpapi_method'] = 'GET'; // default
$config['password_httpapi_var_user'] = 'username'; // optional
$config['password_httpapi_var_curpass'] = 'curpass'; // optional
$config['password_httpapi_var_newpass'] = 'newpass'; // optional
Important NOTE: If you use the GET method, the variables arrive as query-string parameters, e.g. request.args.get('username') in Flask.
If you use the POST method, they arrive as form fields, e.g. request.form['username'].
Pass the http api driver name in plugins/password/config.inc.php:
$config['password_driver'] = 'httpapi';
Reload the web server.
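Since the question asks for Python, here is a minimal Flask sketch of the endpoint the config above points at; the change_password helper is hypothetical, wire it to your real password store:
from flask import Flask, request

app = Flask(__name__)

@app.route('/change_user_password', methods=['GET', 'POST'])
def change_user_password():
    # GET: variables arrive as query-string parameters;
    # POST: they arrive as form fields
    source = request.args if request.method == 'GET' else request.form
    username = source.get('username')
    curpass = source.get('curpass')
    newpass = source.get('newpass')
    # change_password() is a hypothetical helper: update the password in
    # your actual backend (SQL, LDAP, ...) and return True on success
    if change_password(username, curpass, newpass):
        return '', 200
    return 'Failed', 403

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)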
I would like to verify an Ethereum (ETH) signature made in MetaMask using Python. I'm developing a website using Flask as the backend.
The JavaScript code sends a POST request to the backend containing the following 3 variables:
{'signature': '0x0293cc0d4eb416ca95349b7e63dc9d1c9a7aab4865b5cd6d6f2c36fb1dce12d34a05039aedf0bc64931a439def451bcf313abbcc72e9172f7fd51ecca30b41dd1b', 'nonce': '6875972781', 'adress': '0x3a806c439805d6e0fdd88a4c98682f86a7111789'}
My goal is to verify that the signature contains the nonce (a random integer) and was signed by the public address.
I'm using JavaScript to sign the nonce with the ethers library:
const ethereum = window.ethereum;
const provider = new ethers.providers.Web3Provider(ethereum)
const signer = provider.getSigner()
var signature = await signer.signMessage(nonce);
I tried with several Python libraries, but I'm unable to format the signature, address and nonce so that it works. Here is an unsuccessful attempt using the ecdsa library:
vk = ecdsa.VerifyingKey.from_string(bytes.fromhex(address), curve=ecdsa.SECP256k1, hashfunc=sha256)
vk.verify(bytes.fromhex(hex(signature)), bytes(nonce, 'utf-8'))
I get the following error:
ValueError: non-hexadecimal number found in fromhex() arg at position 1
Thanks for your help!
Using web3.py you can use w3.eth.account.recover_message to recover the address from the signature and the signed data. After that, compare the recovered address to the expected address (lowercase both first, because web3.py returns a mixed-case checksummed address).
from web3 import Web3
from hexbytes import HexBytes
from eth_account.messages import encode_defunct

# A live provider isn't needed just to recover an address from a signature
w3 = Web3(Web3.HTTPProvider(""))
message = encode_defunct(text="6875972781")
address = w3.eth.account.recover_message(message, signature=HexBytes("0x0293cc0d4eb416ca95349b7e63dc9d1c9a7aab4865b5cd6d6f2c36fb1dce12d34a05039aedf0bc64931a439def451bcf313abbcc72e9172f7fd51ecca30b41dd1b"))
print(address)
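To finish the check, compare the recovered address with the one from the POST body, lowercasing both since recover_message returns a mixed-case checksummed address:
expected = '0x3a806c439805d6e0fdd88a4c98682f86a7111789'
if address.lower() == expected.lower():
    print('Signature is valid')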
You can verify using JS, but I commend you for wanting to verify on the backend with Python. I was faced with the exact same choice and opted for the latter. Basically, I created two APIs: the first generates the message and nonce, and the second verifies the signature.
I'm using FastAPI for this, but you can pattern it to any framework you prefer, such as Django.
Generate message:
import uuid

from fastapi import Body


# ethaccount is the user's wallet included in the body of the request
async def generate_message(ethaccount: str = Body(...)) -> str:
    # I save the wallet to cache for reference later
    set_cache_address(ethaccount)
    # Generate the nonce
    nonce = uuid.uuid4()
    # Generate the message
    message = f'''
Welcome! Sign this message to login to the site. This doesn't cost you
anything and is free of any gas fees.

Nonce:
{nonce}.
'''
    return message
MetaMask takes over from here. Afterwards, verify the signature generated by MetaMask:
from fastapi import Body
from web3.auto import w3
from eth_account.messages import encode_defunct


async def signature(data: dict = Body(...)):  # noqa
    # User's signature from metamask passed through the body
    sig = data.get('signature')
    # The juicy bits. Here I try to verify the signature they sent.
    message = encode_defunct(text=data.get('message'))
    signed_address = (w3.eth.account.recover_message(message, signature=sig)).lower()
    # Same wallet address means same user. I use the cached address here.
    if get_cache_address() == signed_address:
        # Do what you will
        # You can generate the JWT access and refresh tokens here
        pass
NOTE: I've cleaned this code to show only the logic/methods you might need. The actual code I'm using is longer as I generate tokens, etc.
JackDonMClovin, Body is a request auto-parser provided by FastAPI. For other libraries, you can get the signature data from the request object.
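For example, with Flask (a sketch; the route name is illustrative), the same values come out of the request object like this:
from flask import Flask, request

app = Flask(__name__)

# Illustrative endpoint mirroring the FastAPI handler above
@app.route('/signature', methods=['POST'])
def verify_signature():
    data = request.get_json()
    sig = data.get('signature')
    message = data.get('message')
    # ...verify with encode_defunct + recover_message as shown above...
    return {'verified': True}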
I'm using boto3 to copy encrypted EBS snapshots from one region to another, but I've been getting Invalid presigned URL messages when I try to copy. I'm generating the presigned URL using the boto3 client method generate_presigned_url in the source region and calling the copy function in the destination region like so:
from botocore.config import Config

uw2_client = non_prod.client(
    'ec2',
    region_name="us-west-2",
    config=Config(signature_version='s3v4')
)
presigned_url = uw2_client.generate_presigned_url(
    ClientMethod='copy_snapshot',
    Params={
        'SourceSnapshotId': og_snapshot_id,  # Original snapshot ID
        'SourceRegion': 'us-west-2',
        'DestinationRegion': 'us-east-1'
        # I also tried including all parameters from copy_snapshot.
        # It didn't make a difference.
        # 'Description': desc,
        # 'KmsKeyId': 'alias/xva-nonprod-all-amicopykey',
        # 'Encrypted': True,
    }
)
Here's my code to create the copy.
ue1_client = non_prod.client(
    'ec2',
    region_name="us-east-1",
    config=Config(signature_version='s3v4')
)
response = ue1_client.copy_snapshot(
    Description=desc,
    KmsKeyId='alias/xva-nonprod-all-amicopykey',  # Exists in us-east-1
    Encrypted=True,
    SourceSnapshotId=og_snapshot_id,
    SourceRegion='us-west-2',
    DestinationRegion='us-east-1',
    PresignedUrl=presigned_url
)
It successfully returns the presigned URL. But if I attempt to use that presigned URL to copy a snapshot, I get the invalid URL error. If I try to validate the URL:
r = requests.post(presigned_url)
print(r.status_code)
print(r.text)
I get:
<Response>
<Errors>
<Error>
<Code>AuthFailure</Code>
<Message>AWS was not able to validate the provided access credentials</Message>
</Error>
</Errors>
<RequestID>3189bb5b-54c9-4d11-ab4c-762cbea32d9a</RequestID>
</Response>
You'd think that it would be an issue with my credentials, but I'm not sure how... They're the same credentials I'm using to create the presigned URL, and my IAM user has unfettered access to EC2.
I'm obviously doing something wrong here, but I cannot figure out what it is. Any insight would be appreciated.
EDIT
Just to confirm that it's not a permissions issue, I tried this with my personal account which has access to everything. Still getting the same error message.
As it turns out, the documentation is wrong... A presigned URL is NOT required when copying encrypted snapshots within the same account (according to AWS Support).
From AWS Support:
... it's not actually necessary to create the pre-signed URL in order to copy encrypted snapshot from one region to another (within the same AWS account).
However, according to their documentation, it's not possible to copy encrypted snapshots to another account either... ¯\_(ツ)_/¯
The current boto3.EC2.Client.copy_snapshot function documentation says:
PresignedUrl (string) --
When you copy an encrypted source snapshot using the Amazon EC2 Query API, you must supply a pre-signed URL. This parameter is optional for unencrypted snapshots.
Instead, it can simply be accomplished by creating the client object in the destination region and calling the copy_snapshot() method like so:
import boto3

try:
    ec2 = boto3.client(
        service_name='ec2',
        region_name='us-east-1'
    )
    ec2.copy_snapshot(
        SourceSnapshotId='snap-xxxxxxxxxxxx',
        SourceRegion='us-west-2',
        Encrypted=True,
        KmsKeyId='DestinationRegionKeyId'
    )
except Exception as e:
    print(e)
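copy_snapshot returns the new snapshot's ID, so if you capture the response you can block until the copy completes with boto3's snapshot_completed waiter (a sketch building on the code above):
response = ec2.copy_snapshot(
    SourceSnapshotId='snap-xxxxxxxxxxxx',
    SourceRegion='us-west-2',
    Encrypted=True,
    KmsKeyId='DestinationRegionKeyId'
)
# Poll the destination region until the new snapshot is available
waiter = ec2.get_waiter('snapshot_completed')
waiter.wait(SnapshotIds=[response['SnapshotId']])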
I'm trying to use boto3 to query my CloudSearch domain using the docs as a guide: http://boto3.readthedocs.io/en/latest/reference/services/cloudsearchdomain.html#client
import boto3
import json

boto3.setup_default_session(profile_name='myprofile')
cloudsearch = boto3.client('cloudsearchdomain')
response = cloudsearch.search(
    query="(and name:'foobar')",
    queryParser='structured',
    returnFields='address',
    size=10
)
print(json.dumps(response))
...but it fails with:
botocore.exceptions.EndpointConnectionError: Could not connect to the endpoint URL: "https://cloudsearchdomain.eu-west-1.amazonaws.com/2013-01-01/search"
But how am I supposed to set or configure the endpoint or domain that I want to connect to? I tried adding an endpoint parameter to the request, thinking maybe it was an accidental omission from the docs, but I got this error response:
Unknown parameter in input: "endpoint", must be one of: cursor, expr, facet, filterQuery, highlight, partial, query, queryOptions, queryParser, return, size, sort, start, stats
The docs say:
The endpoint for submitting Search requests is domain-specific. You submit search requests to a domain's search endpoint. To get the search endpoint for your domain, use the Amazon CloudSearch configuration service DescribeDomains action. A domain's endpoints are also displayed on the domain dashboard in the Amazon CloudSearch console.
I know what my search endpoint is, but how do I supply it?
I found a post on a Google forum with the answer. You have to add the endpoint_url parameter to the client constructor, e.g.
client = boto3.client('cloudsearchdomain', endpoint_url='http://...')
I hope those docs get updated, because I wasted a lot of time before I figured that out.
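If you'd rather not hard-code the endpoint, you can also fetch it with the configuration service's DescribeDomains action that the docs mention; a sketch, assuming your domain is named 'mydomain':
import boto3

config_client = boto3.client('cloudsearch', region_name='eu-west-1')
status = config_client.describe_domains(DomainNames=['mydomain'])
search_endpoint = status['DomainStatusList'][0]['SearchService']['Endpoint']
client = boto3.client('cloudsearchdomain',
                      endpoint_url='https://' + search_endpoint)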
import boto3

client = boto3.client('cloudsearchdomain',
                      aws_access_key_id='access-key',
                      aws_secret_access_key='some-secret-key',
                      region_name='us-east-1',  # your chosen region
                      # endpoint_url is your Search Endpoint as defined in the AWS console
                      endpoint_url='cloudsearch-url'
                      )
response = client.search(
    query='Foo',  # your search string
    size=10
)
Reference response['hits'] for returned results.
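The matching documents live under response['hits']['hit'], e.g.:
for hit in response['hits']['hit']:
    print(hit['id'], hit.get('fields'))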
I'm trying to get the cryptographic keys just like the portal does, but I can't make my mask work. Can someone tell me what's wrong with the following request? By the way, I am using the following URL:
https://sldn.softlayer.com/reference/services/softlayer_security_certificate_request/getsslcertificaterequests
Just in case: I'm using the Python API client, but REST requests would also work for me to get the SSL certificates.
mask = "mask[accountId, certificateSigningRequest, certificateAuthorityName, id]
response = client['SoftLayer_Security_Certificate_Request'].getsslcertificaterequests()
I also want to find the virtual IPs associated with the certificates, but I can't find which API method does what I need.
Your current code will retrieve only one of the certificates, not the stored security certificates. To make your mask work you need to close the string with a closing " double quote, and the method you are calling should be getSslCertificateRequests. See below:
accountId = 202768  # change this value
mask = "mask[accountId, certificateSigningRequest, certificateAuthorityName, id]"
response = client['SoftLayer_Security_Certificate_Request'].getSslCertificateRequests(accountId, mask=mask)
Currently the portal uses SoftLayer_Account::getSecurityCertificates to retrieve the stored security certificates, including SSL. Use the following Python script:
import SoftLayer
from pprint import pprint as pp

USERNAME = 'set-me'
API_KEY = 'set-me'
# Generate an API key for you or your users, or view yours at
# https://control.softlayer.com/account/users

client = SoftLayer.create_client_from_env(username=USERNAME,
                                          api_key=API_KEY)
accountService = client['SoftLayer_Account']

try:
    # getSecurityCertificates() retrieves stored security certificates (i.e. SSL)
    result = accountService.getSecurityCertificates()
    pp(result)
except SoftLayer.SoftLayerAPIError as e:
    # If there was an error returned from the SoftLayer API then bomb out
    # with the error message.
    print("Unable to retrieve the Account's stored security certificates (i.e. SSL). %s %s"
          % (e.faultCode, e.faultString))
To find the associated virtual IP addresses, use the method getAdcLoadBalancers and send the id value obtained from the previous method. Try this REST request:
https://[username]:[apiKey]@api.softlayer.com/rest/v3.1/SoftLayer_Account/getAdcLoadBalancers?objectFilter={"adcLoadBalancers":{"securityCertificateId":{"operation":[id]}}}
Remember to change the username and apiKey to valid credentials, and the id mentioned above, to retrieve the associated load balancer IP addresses.
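The same filter can also be expressed with the Python client (a sketch continuing the script above; 12345 is a placeholder for the certificate id returned by getSslCertificateRequests):
# Object filter equivalent to the REST request above
object_filter = {
    'adcLoadBalancers': {
        'securityCertificateId': {'operation': 12345}  # placeholder id
    }
}
load_balancers = client['SoftLayer_Account'].getAdcLoadBalancers(filter=object_filter)
pp(load_balancers)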