Starting from zero with the Bitstamp API in Python

Starting from stage one, I am trying to connect to Bitstamp.
I have a funded account and an API key.
I am trying the following:
import bitstamp.client
public_client = bitstamp.client.Public()
print(public_client.ticker()['volume'])
trading_client = bitstamp.client.Trading(username='userNameForWeb', key='apiKeyFromBitStamp', secret='myPasswordFromWeb')
print(trading_client.ticker()['volume'])
print(trading_client.account_balance()['fee'])
Yet getting the error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Applications/Spyder.app/Contents/Resources/lib/python2.7/spyderlib/widgets/externalshell/sitecustomize.py", line 540, in runfile
execfile(filename, namespace)
File "/Users/jasonmellone/Documents/PythonProjects/bitStamp.py", line 18, in <module>
print(trading_client.account_balance()['fee'])
File "/Library/Python/2.7/site-packages/bitstamp/client.py", line 197, in account_balance
return self._post("balance/", return_json=True)
File "/Library/Python/2.7/site-packages/bitstamp/client.py", line 47, in _post
return self._request(requests.post, *args, **kwargs)
File "/Library/Python/2.7/site-packages/bitstamp/client.py", line 80, in _request
raise BitstampError(error)
bitstamp.client.BitstampError: Invalid signature
I grabbed the code directly from the GitHub project.
I'm happy to use another Python library, but I'm looking for input.
Thanks!

trading_client = bitstamp.client.Trading(username='userNameForWeb', key='apiKeyFromBitStamp', secret='myPasswordFromWeb')
The secret should not be myPasswordFromWeb; the error clearly says you have an 'Invalid signature'. The secret should be the API secret that Bitstamp shows you only ONE TIME, when you generate an API key. Generate a new API key and store the secret it gives you. Your secret key will be almost (if not exactly) as long as your API key. Hope this helps.
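As a minimal sketch of the corrected setup (all three values below are placeholders, not real credentials; note that the API username may need to be your numeric Bitstamp customer ID rather than your web login name):
import bitstamp.client

# Placeholders only. The secret must be the API secret that Bitstamp displays
# once when the key is created -- not your web password.
trading_client = bitstamp.client.Trading(
    username='12345',
    key='apiKeyFromBitstamp',
    secret='apiSecretShownOnceAtKeyCreation')

print(trading_client.account_balance()['fee'])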

Related

Why do I get error "Expected a message object, but got kind: " when using Datastore API?

I'm using this Datastore API documentation and here is my code in Python:
from google.cloud import datastore  # import needed by the snippet below

class DatastoreConf:
    def __init__(self):
        self.datastore_client = datastore.Client()

    def insert_entity(self):
        complete_key = self.datastore_client.key("Task", "sampleTask")
        task = datastore.Entity(key=complete_key)
        task.update(
            {
                "category": "Personal",
                "done": False,
                "priority": 4,
                "description": "Learn Cloud Datastore",
            }
        )
        self.datastore_client.put(task)

d = DatastoreConf()
d.insert_entity()
I've already set the GOOGLE_APPLICATION_CREDENTIALS variable to my service account key, and I have sufficient roles, but I'm getting the error
TypeError: Expected a message object, but got kind: "Task" name: "sampleTask".
The traceback that I got is:
Traceback (most recent call last):
File "C:/Users/user/project/datastore_configuration.py", line 24, in <module>
d.get_entity()
File "C:/Users/user/project/datastore_configuration.py", line 21, in get_entity
self.datastore_client.put(task)
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\client.py", line 575, in put
self.put_multi(entities=[entity], retry=retry, timeout=timeout)
File "C:\Users\user\projectvenv\lib\site-packages\google\cloud\datastore\client.py", line 612, in put_multi
current.put(entity)
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\batch.py", line 227, in put
_assign_entity_to_pb(entity_pb, entity)
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\batch.py", line 373, in _assign_entity_to_pb
bare_entity_pb = helpers.entity_to_protobuf(entity)
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\helpers.py", line 207, in entity_to_protobuf
key_pb = entity.key.to_protobuf()
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\key.py", line 298, in to_protobuf
key.path.append(element)
Do you have any idea what causes this error?
I resolved the problem. The error was due to a version difference: Datastore worked with protobuf 3.20.1 but didn't with 4.21.0.
I think it's because most development now happens in Firestore, and the newer protobuf versions are no longer supported by the Datastore client.
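If you hit the same error, a minimal workaround consistent with the above is to pin protobuf in your requirements.txt (the version below is simply the one reported working here, so treat it as a starting point):
google-cloud-datastore
protobuf==3.20.1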

"Private key is missing or invalid. It should be service " when making BigQuery call with credential

I'm following this data prediction using Cloud ML Engine with scikit-learn tutorial for GCP AI Platforms. I tried to make an API call to BigQuery with:
def query_to_dataframe(query):
    import pandas as pd
    import pkgutil
    privatekey = pkgutil.get_data('trainer', 'privatekey.json')
    print(privatekey[:200])
    return pd.read_gbq(query,
                       project_id=PROJECT,
                       dialect='standard',
                       private_key=privatekey)
but got the following error:
Traceback (most recent call last):
[...]
TypeError: a bytes-like object is required, not 'str'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/root/.local/lib/python3.7/site-packages/trainer/task.py", line 66, in <module>
arguments['numTrees']
File "/root/.local/lib/python3.7/site-packages/trainer/model.py", line 119, in train_and_evaluate
train_df, eval_df = create_dataframes(frac)
File "/root/.local/lib/python3.7/site-packages/trainer/model.py", line 95, in create_dataframes
train_df = query_to_dataframe(train_query)
File "/root/.local/lib/python3.7/site-packages/trainer/model.py", line 82, in query_to_dataframe
private_key=privatekey)
File "/usr/local/lib/python3.7/dist-packages/pandas/io/gbq.py", line 149, in read_gbq
credentials=credentials, verbose=verbose, private_key=private_key)
File "/root/.local/lib/python3.7/site-packages/pandas_gbq/gbq.py", line 846, in read_gbq
dialect=dialect, auth_local_webserver=auth_local_webserver)
File "/root/.local/lib/python3.7/site-packages/pandas_gbq/gbq.py", line 184, in __init__
self.credentials = self.get_credentials()
File "/root/.local/lib/python3.7/site-packages/pandas_gbq/gbq.py", line 193, in get_credentials
return self.get_service_account_credentials()
File "/root/.local/lib/python3.7/site-packages/pandas_gbq/gbq.py", line 413, in get_service_account_credentials
"Private key is missing or invalid. It should be service "
pandas_gbq.gbq.InvalidPrivateKeyFormat: Private key is missing or invalid. It should be service account private key JSON (file path or string contents) with at least two keys: 'client_email' and 'private_key'. Can be obtained from: https://console.developers.google.com/permissions/serviceaccounts
When the package runs in a local environment, the private key loads fine, but when it is submitted as an ml-engine training job, the error occurs. Note that the private key fails to load only when I use GCP RUNTIME_VERSION="1.15" and PYTHON_VERSION="3.7"; it loads with no problem when I use PYTHON_VERSION="2.7".
In case it's useful, the structure of my package is:
/babyweight
  - setup.py
  - trainer
    - __init__.py
    - model.py
    - privatekey.json
    - task.py
I'm not sure if the problem is due to a bug in Python, or where I placed privatekey.json.
I was able to solve the problem after I changed read_gbq's argument for the BigQuery access key from private_key to credentials, as recommended by @rmesteves and as shown here. I then set the value to the absolute path of privatekey.json, as shown here. Now the job runs without error.
Note: I only encountered this problem with Python 3+, not with Python 2.7. I'm not sure why; it could be due to the implementation of read_gbq.
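For reference, a minimal sketch of that fix (the key path, project ID, and query below are placeholders; the Credentials class comes from google-auth):
import pandas as pd
from google.oauth2 import service_account

# Placeholder path: use the absolute path to your service account key file.
credentials = service_account.Credentials.from_service_account_file(
    '/path/to/privatekey.json')

df = pd.read_gbq('SELECT 1', project_id='my-project',
                 dialect='standard', credentials=credentials)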

Can Cloud Dataflow not use the "google.cloud.datastore" package?

I want to write to Datastore with a transaction on Cloud Dataflow.
So, I wrote the following:
def exe_dataflow():
    ....
    from google.cloud import datastore

    # called from the pipeline
    def ds_test(content):
        datastore_client = datastore.Client()
        kind = 'test_out'
        name = 'change'
        task_key = datastore_client.key(kind, name)
        for _ in range(3):
            with datastore_client.transaction():
                current_value = datastore_client.get(task_key)
                current_value['v'] += content['v']
                datastore_client.put(current_value)

    # pipeline
    ....
    | 'datastore test' >> beam.Map(ds_test)
But an error occurred, and the log message was displayed as below.
(7b75e0ef2db229da): Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 582, in do_work
work_executor.execute()
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
op.start()
...(SNIP)...
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
return getattr(__import__(module, None, None, [obj]), obj)
AttributeError: 'module' object has no attribute 'datastore'
Can Cloud Dataflow not use the "google.cloud.datastore" package?
Added 2018/2/28:
I added --requirements_file to MyOptions
options = MyOptions(flags = ["--requirements_file", "./requirements.txt"])
and created requirements.txt containing:
google-cloud-datastore==1.5.0
But another error occurred:
(366397598dcf7f02): Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 582, in do_work
work_executor.execute()
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
op.start()
...(SNIP)...
File "my_dataflow.py", line 66, in to_entity
File "/usr/local/lib/python2.7/dist-packages/google/cloud/datastore/__init__.py", line 60, in <module>
from google.cloud.datastore.batch import Batch
File "/usr/local/lib/python2.7/dist-packages/google/cloud/datastore/batch.py", line 24, in <module>
from google.cloud.datastore import helpers
File "/usr/local/lib/python2.7/dist-packages/google/cloud/datastore/helpers.py", line 29, in <module>
from google.cloud.datastore_v1.proto import datastore_pb2
File "/usr/local/lib/python2.7/dist-packages/google/cloud/datastore_v1/__init__.py", line 17, in <module>
from google.cloud.datastore_v1 import types
File "/usr/local/lib/python2.7/dist-packages/google/cloud/datastore_v1/types.py", line 21, in <module>
from google.cloud.datastore_v1.proto import datastore_pb2
File "/usr/local/lib/python2.7/dist-packages/google/cloud/datastore_v1/proto/datastore_pb2.py", line 17, in <module>
from google.cloud.datastore_v1.proto import entity_pb2 as google_dot_cloud_dot_datastore__v1_dot_proto_dot_entity__pb2
File "/usr/local/lib/python2.7/dist-packages/google/cloud/datastore_v1/proto/entity_pb2.py", line 28, in <module>
dependencies=[google_dot_api_dot_annotations__pb2.DESCRIPTOR,google_dot_protobuf_dot_struct__pb2.DESCRIPTOR,google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR,google_dot_type_dot_latlng__pb2.DESCRIPTOR,])
File "/usr/local/lib/python2.7/dist-packages/google/protobuf/descriptor.py", line 824, in __new__
return _message.default_pool.AddSerializedFile(serialized_pb)
TypeError: Couldn't build proto file into descriptor pool!
Invalid proto descriptor for file "google/cloud/datastore_v1/proto/entity.proto":
google.datastore.v1.PartitionId.project_id: "google.datastore.v1.PartitionId.project_id" is already defined in file "google/cloud/proto/datastore/v1/entity.proto".
...(SNIP)...
google.datastore.v1.Entity.properties: "google.datastore.v1.Entity.PropertiesEntry" seems to be defined in "google/cloud/proto/datastore/v1/entity.proto", which is not imported by "google/cloud/datastore_v1/proto/entity.proto". To use it here, please add the necessary import.
The recommended way to interact with Cloud Datastore from a Cloud Dataflow Pipeline is to use the Datastore I/O API, which is available through the Dataflow SDK and provides some methods to read and write data to a Cloud Datastore database.
You can find detailed documentation for the Datastore I/O package for Dataflow SDK 2.x for Python in this other link. The datastore.v1.datastoreio module is the specific module that you want to use. There is plenty of information in the links I am sharing, but in short, it is a connector to Datastore that uses PTransform to read / write / delete a PCollection from Datastore using the classes ReadFromDatastore() / WriteToDatastore() / DeleteFromDatastore() respectively.
You should try using it instead of implementing the calls yourself. I suspect this may be the reason for the error you are seeing, as a Datastore implementation already exists in the Dataflow SDK:
"google.datastore.v1.PartitionId.project_id" is already defined in file "google/cloud/proto/datastore/v1/entity.proto".
UPDATE:
It looks like those three classes collect several mutations and execute them in a single transaction. You can check that in the code describing the classes.
If the aim is to retrieve (get()) and then update (put()) a Datastore entity, you can probably work with the write_mutations() function, which is described in the documentation; it lets you perform a full batch of mutations carrying out the operations you are interested in.

Web2py built-in authentication generates a ticket after filling in the username, email, etc.

I am new to web2py. When I came around to using the built-in authentication, after filling in the username, email, etc., it generated the following ticket.
I am clueless about what to do! Any help would be really welcome!
I am running Mac OS X 10.8 and Postgres 9.3.
The ticket
Traceback (most recent call last):
File "/Users/JoaoBtte/Documents/python/Hello/gluon/restricted.py", line 217, in restricted
exec ccode in environment
File "/Users/JoaoBtte/Documents/python/Hello/applications/Cifra/controllers/default.py", line 90, in <module>
File "/Users/JoaoBtte/Documents/python/Hello/gluon/globals.py", line 378, in <lambda>
self._caller = lambda f: f()
File "/Users/JoaoBtte/Documents/python/Hello/applications/Cifra/controllers/default.py", line 88, in user
return dict(form=auth())
File "/Users/JoaoBtte/Documents/python/Hello/gluon/tools.py", line 1297, in __call__
return getattr(self, args[0])()
File "/Users/JoaoBtte/Documents/python/Hello/gluon/tools.py", line 2554, in register
self.login_user(user)
File "/Users/JoaoBtte/Documents/python/Hello/gluon/tools.py", line 1976, in login_user
user = Row(user)
File "/Users/JoaoBtte/Documents/python/Hello/gluon/dal.py", line 6986, in <lambda>
__init__ = lambda self,*args,**kwargs: self.__dict__.update(*args,**kwargs)
TypeError: 'NoneType' object is not iterable
Do you have the following lines in one of your model files (usually in db.py)?
from gluon.tools import Auth
auth = Auth(db)
Could you show us your default.py controller?
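For reference, a minimal model setup usually looks something like the sketch below (the connection string and table options are assumptions; in a web2py model file, DAL is normally injected into the environment, so the import is only needed outside that context):
from gluon import DAL
from gluon.tools import Auth

db = DAL('postgres://user:password@localhost/mydb')  # placeholder connection string
auth = Auth(db)
auth.define_tables(username=True)  # creates auth_user and the other auth tables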

DeleteResource from Google Docs with Python

I am trying to delete a spreadsheet in Google Docs with this function:
def f_DeleteResource(xls_name):
    """Delete a resource"""
    client = Auth()
    for e1 in client.GetResources().entry:
        e2 = client.GetResource(e1)
        if xls_name == e2.title.text:
            client.DeleteResource(e2.resource_id.text, True)
And I obtain different errors when I change the first parameter of client.DeleteResource(p1,p2):
client.DeleteResource(e2.resource_id.text,True):
Traceback (most recent call last):
File "C:\xmp\D6GDocsDeleteUpload.py", line 164, in <module> main()
File "C:\xmp\D6GDocsDeleteUpload.py", line 157, in main f_DeleteResource(sys.argv[2])
File "C:\xmp\D6GDocsDeleteUpload.py", line 144, in f_DeleteResource client.DeleteResource(e2.resource_id.text,True)
File "C:\Python27\lib\site-packages\gdata\docs\client.py", line 540, in delete_resource uri = entry.GetEditLink().href
AttributeError: 'str' object has no attribute 'GetEditLink'
client.DeleteResource(e2,True):
Traceback (most recent call last):
File "C:\xmp\D6GDocsDeleteUpload.py", line 164, in <module> main()
File "C:\xmp\D6GDocsDeleteUpload.py", line 157, in main f_DeleteResource(sys.argv[2])
File "C:\xmp\D6GDocsDeleteUpload.py", line 144, in f_DeleteResource client.DeleteResource(e2,True)
File "C:\Python27\lib\site-packages\gdata\docs\client.py", line 543, in delete_resource return super(DocsClient, self).delete(uri, **kwargs)
File "C:\Python27\lib\site-packages\gdata\client.py", line 748, in delete **kwargs)
File "C:\Python27\lib\site-packages\gdata\docs\client.py", line 66, in request return super(DocsClient, self).request(method=method, uri=uri, **kwargs)
File "C:\Python27\lib\site-packages\gdata\client.py", line 319, in request RequestError)
gdata.client.RequestError: Server responded with: 403, <errors xmlns='http://schemas.google.com/g/2005'><error><domain>GData</domain><code>matchHeaderRequired</code><location type='header'>If-Match|If-None-Match</location><internalReason>If-Match or If-None-Match header or entry etag attribute required</internalReason></error></errors>
Can anyone help me?
It seems to be a bug in the Google API Python library. I checked gdata-2.0.16 and noticed that the DeleteResource() function uses only the URL of the resource (gdata/docs/client.py, lines 540-543) but later checks for hasattr(entry_or_uri, 'etag') (gdata/client.py, lines 737-741), and of course a string value (the URI) doesn't have an etag attribute.
You may work around it using the force keyword argument:
import gdata.docs.data
import gdata.docs.client

client = gdata.docs.client.DocsClient()
client.ClientLogin('xxxxxx@gmail.com', 'xxxxxx', 'XxX')
for doc in client.GetAllResources():
    if doc.title.text == 'qpqpqpqpqpqp':
        client.DeleteResource(doc, force=True)
        break
If you want, you may report the error to the library maintainers (if it isn't already reported).
This issue has been fixed in this revision:
http://code.google.com/p/gdata-python-client/source/detail?r=f98fff494fb89fca12deede00c3567dd589e5f97
If you sync your client to the repository, you should be able to delete a resource without having to specify force=True.
