How to create cloud object storage on SoftLayer? - Python

I want to create cloud object storage using the Python API and referred to this link: https://sldn.softlayer.com/blog/waelriac/managing-softlayer-object-storage-through-rest-apis.
When I use the following call to order CLOUD_OBJECT_STORAGE, it raises an error. Did I miss some configuration or pass the wrong one?
payload = '{"parameters" : [{"complexType": "SoftLayer_Container_Product_Order_Network_Storage_Hub", "quantity": 1, "packageId": 206, "prices": [{"id": 177725}]}]}'
client['SoftLayer_Product_Order'].placeOrder(payload)
Traceback (most recent call last):
File "", line 1, in
File "/usr/local/lib/python2.7/dist-packages/SoftLayer/API.py", line 392, in call_handler
return self(name, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/SoftLayer/API.py", line 360, in call
return self.client.call(self.name, name, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/SoftLayer/API.py", line 263, in call
return self.transport(request)
File "/usr/local/lib/python2.7/dist-packages/SoftLayer/transports.py", line 195, in call
raise _ex(ex.faultCode, ex.faultString)
SoftLayer.exceptions.SoftLayerAPIError: SoftLayerAPIError(SoftLayer_Exception_Order_InvalidContainer): Invalid container specified: SoftLayer_Container_Product_Order. Ordering a server or service requires a specific container type, not the generic base order container.

In order to call SoftLayer's API methods using SoftLayer's Python client, the requests are built differently from REST requests; I recommend reviewing the documentation to get more information.
This is the code you need to use:
import SoftLayer

USERNAME = "set me"
APIKEY = "set me"

client = SoftLayer.create_client_from_env(
    username=USERNAME,
    api_key=APIKEY
)

order = {
    "quantity": 1,
    "packageId": 206,
    "prices": [{
        "id": 177725
    }]
}

result = client['SoftLayer_Product_Order'].verifyOrder(order)
print(result)
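Once verifyOrder comes back without an error, the same order dictionary can be passed to placeOrder to actually submit the order; a minimal sketch (verify first, since placing the order creates a billable service):

# After a clean verifyOrder, submit the same order dictionary
result = client['SoftLayer_Product_Order'].placeOrder(order)
print(result)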
Another thing to point out is that you need to make sure you are using the correct price. Prices can be different for each account, so in order to make sure you are using the correct one I recommend calling SoftLayer_Product_Package::getItemPrices.
In Python:
result = client['SoftLayer_Product_Package'].getItemPrices(id=206)
print(result)
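Because that list is long, it can help to request the related item data with an object mask and print it next to each price id; a rough sketch (the mask string and the fields printed are assumptions about what you want to inspect):

# Sketch: list price ids together with item key names and descriptions for package 206
mask = "mask[id, item[keyName, description]]"
prices = client['SoftLayer_Product_Package'].getItemPrices(id=206, mask=mask)
for price in prices:
    print(price['id'], price['item']['keyName'], price['item']['description'])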
Regards


openapi-python-client write-only field is required in GET request

I'm using DRF + openapi-python-client for my project.
I have two questions for now:
1. openapi-python-client generate does not accept a URL for generating the client.
2. The generated client fails to serialize a model that has a write-only file field.
Thanks in advance!
Launching my server with the manage.py script works OK; I can get my OpenAPI schema at http://127.0.0.1:8000/control_center/schema/?format=openapi-json with my browser. At the same time, the command openapi-python-client generate --url http://localhost:8000/control_center/schema?format=openapi-json gives me
Traceback (most recent call last):
File "pydantic/main.py", line 522, in pydantic.main.BaseModel.parse_obj
TypeError: 'NoneType' object is not iterable
What am I missing? The workaround for me is to copy the schema from the browser into a JSON file and run openapi-python-client generate --path schema.json.
I have a model StorageFile with a write-only file field. In GET responses this field is omitted; getting the file content is a separate endpoint. Getting storage file details with the browser or with a manual request is OK, but using the generated client like
file_details: models.StorageFile = retrieve_storage_file.sync(id=file.id,
                                                               client=self.api_client)
gives me this error:
Traceback (most recent call last):
File ".../api_client_tests/test_storage_files_api.py", line 18, in test_can_get_storage_file_details
file_details: models.StorageFile = retrieve_storage_file.sync(id=file.id,
File ".../api_client/control_center_client/control_center_client/api/control_center/retrieve_storage_file.py", line 86, in sync
return sync_detailed(
File ".../api_client/control_center_client/control_center_client/api/control_center/retrieve_storage_file.py", line 70, in sync_detailed
return _build_response(response=response)
File ".../api_client/control_center_client/control_center_client/api/control_center/retrieve_storage_file.py", line 43, in _build_response
parsed=_parse_response(response=response),
File ".../api_client/control_center_client/control_center_client/api/control_center/retrieve_storage_file.py", line 32, in _parse_response
response_200 = StorageFile.from_dict(response.json())
File ".../api_client/control_center_client/control_center_client/models/storage_file.py", line 128, in from_dict
file = File(payload=BytesIO(d.pop("file")))
KeyError: 'file'
In schema.json this field is described as
"file": {
"type": "string",
"format": "binary",
"writeOnly": true
}
What am I missing here?

Why do I get error "Expected a message object, but got kind: " when using Datastore API?

I'm using this Datastore API documentation and here is my code in Python:
from google.cloud import datastore

class DatastoreConf:
    def __init__(self):
        self.datastore_client = datastore.Client()

    def insert_entity(self):
        complete_key = self.datastore_client.key("Task", "sampleTask")
        task = datastore.Entity(key=complete_key)
        task.update(
            {
                "category": "Personal",
                "done": False,
                "priority": 4,
                "description": "Learn Cloud Datastore",
            }
        )
        self.datastore_client.put(task)

d = DatastoreConf()
d.insert_entity()
I've already set the GOOGLE_APPLICATION_CREDENTIALS variable to my service account key and I have sufficient roles, but I'm getting the error
TypeError: Expected a message object, but got kind: "Task" name: "sampleTask".
The traceback that I got is:
Traceback (most recent call last):
File "C:/Users/user/project/datastore_configuration.py", line 24, in <module>
d.get_entity()
File "C:/Users/user/project/datastore_configuration.py", line 21, in get_entity
self.datastore_client.put(task)
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\client.py", line 575, in put
self.put_multi(entities=[entity], retry=retry, timeout=timeout)
File "C:\Users\user\projectvenv\lib\site-packages\google\cloud\datastore\client.py", line 612, in put_multi
current.put(entity)
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\batch.py", line 227, in put
_assign_entity_to_pb(entity_pb, entity)
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\batch.py", line 373, in _assign_entity_to_pb
bare_entity_pb = helpers.entity_to_protobuf(entity)
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\helpers.py", line 207, in entity_to_protobuf
key_pb = entity.key.to_protobuf()
File "C:\Users\user\project\venv\lib\site-packages\google\cloud\datastore\key.py", line 298, in to_protobuf
key.path.append(element)
Do you have any idea what causes this error?
I resolved the problem. The error was due to a version difference: Datastore worked with protobuf 3.20.1 but didn't with 4.21.0.
I think it's because Firestore is now being developed much more actively, and the newer protobuf versions are no longer supported by Datastore.
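If someone hits the same error, one way to apply that fix is to pin protobuf to the version that worked here (3.20.1); the exact pin is an assumption, adjust it to whatever your other dependencies allow:

pip install "protobuf==3.20.1"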

How to post data to the Elasticsearch machine learning jobs API

I'm trying to post data to a machine learning job API using Elasticsearch. What format do the JSON docs need to be in?
I've attempted to send data with JSON docs separated by newlines in a .txt file. I've also tried converting back and forth to JSON using dump and load, to no avail. The documentation states that the documents can be separated by whitespace, but no matter what I try it won't accept them.
https://www.elastic.co/guide/en/elasticsearch/reference/current/ml-post-data.html
Here is an example of a json doc saved as file_name.json:
[{"myid": "id1", "client": "client1", "submit_date": 1514764857},
{"my_id": "id2", "client": "client_2", "submit_date": 1514764857}]
Here is the basic code needed to post data:
import json
import os

from elasticsearch import Elasticsearch
from elasticsearch.client.xpack import MlClient

es = elastic_connection()
es_ml = MlClient(es)

def post_training_data(directory='Training Data', file_name='file_name.json'):
    with open(os.path.join(directory, file_name), mode='r') as train_file:
        train_data = json.load(train_file)
        es_ml.post_data(job_id=job_id, body=train_data)

post_training_data()
This is the specific error I am getting with this:
Traceback (most recent call last):
File "<input>", line 1, in <module>
File "..\train_model.py", line 218, in post_training_data
self.es_ml.post_data(job_id=self.job_id, body=train_data)
File "..\inc_anamoly\lib\site-packages\elasticsearch\client\utils.py", line 76, in _wrapped
return func(*args, params=params, **kwargs)
File "..\inc_anamoly\lib\site-packages\elasticsearch\client\xpack\ml.py", line 81, in post_data
body=self._bulk_body(body))
AttributeError: 'MlClient' object has no attribute '_bulk_body'
This turns out to be a bug. Issue reported.
https://github.com/elastic/elasticsearch-py/issues/959

OrientDB Gremlin server not working in python

I am using OrientDB and the Gremlin server in Python. The Gremlin server starts successfully, but when I try to add a vertex to OrientDB through Gremlin code it gives me an error.
query = """graph.addVertex(label, "Test", "title", "abc", "title", "abc")"""
Following is the traceback:
/usr/bin/python3.6 /home/admin-12/Documents/bitbucket/ecodrone/ecodrone/test/test1.py
Traceback (most recent call last):
File "/home/admin-12/Documents/bitbucket/ecodrone/ecodrone/test/test1.py", line 27, in <module>
result = execute_query("""graph.addVertex(label, "Test", "title", "abc", "title", "abc")""")
File "/home/admin-12/Documents/bitbucket/ecodrone/ecodrone/GremlinConnector.py", line 21, in execute_query
results = future_results.result()
File "/usr/lib/python3.6/concurrent/futures/_base.py", line 432, in result
return self.__get_result()
File "/usr/lib/python3.6/concurrent/futures/_base.py", line 384, in __get_result
raise self._exception
File "/home/admin-12/.local/lib/python3.6/site-packages/gremlin_python/driver/resultset.py", line 81, in cb
f.result()
File "/usr/lib/python3.6/concurrent/futures/_base.py", line 425, in result
return self.__get_result()
File "/usr/lib/python3.6/concurrent/futures/_base.py", line 384, in __get_result
raise self._exception
File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/admin-12/.local/lib/python3.6/site-packages/gremlin_python/driver/connection.py", line 77, in _receive
self._protocol.data_received(data, self._results)
File "/home/admin-12/.local/lib/python3.6/site-packages/gremlin_python/driver/protocol.py", line 106, in data_received
"{0}: {1}".format(status_code, data["status"]["message"]))
gremlin_python.driver.protocol.GremlinServerError: 599: Error during serialization: Infinite recursion (StackOverflowError) (through reference chain: com.orientechnologies.orient.core.id.ORecordId["record"]->com.orientechnologies.orient.core.record.impl.ODocument["schemaClass"]->com.orientechnologies.orient.core.metadata.schema.OClassImpl["document"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"]->com.orientechnologies.orient.core.record.impl.ODocument["owners"])
Process finished with exit code 1
First of all, I very much recommend that you do not use the Graph API to make mutations. Prefer the Traversal API for that and do:
g.addV('Test').
property('title1', 'abc').
property('title2', 'abc')
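Since your traceback shows the gremlin_python driver is already in use, here is a rough sketch of submitting that same mutation from Python through the Traversal API (the server URL and the traversal source name "g" are assumptions; adjust them to your Gremlin Server configuration):

from gremlin_python.structure.graph import Graph
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection

# Connect a remote traversal source; URL and source name depend on your setup
connection = DriverRemoteConnection('ws://localhost:8182/gremlin', 'g')
g = Graph().traversal().withRemote(connection)

# Add the vertex through the Traversal API instead of graph.addVertex(...)
g.addV('Test').property('title1', 'abc').property('title2', 'abc').iterate()

# Read it back as plain maps so no OrientDB-specific types need to be serialized
print(g.V().hasLabel('Test').valueMap('title1', 'title2').toList())

connection.close()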
Second, I think that your error is occurring because you are returning a Vertex which contains an ORecordId (the vertex identifier), and Gremlin Server doesn't know how to handle that. I don't know if OrientDB has serializers built to handle that, but if it does then you would want to add them to the Gremlin Server configuration, which is described in a bit more detail here - basically, you would want to know whether OrientDB exposes a TinkerPop IORegistry for all of their custom classes that might be sent back over the wire.
If it does not, then you would want to avoid returning those or convert them yourself. TinkerPop already recommends that you not return full Vertex objects and only return the data that you need. So, rather than g.V() you would want to convert that Vertex into a Map with g.V().valueMap('title') or something similar (perhaps use the project() step). If you definitely need the vertex identifier then you would need to convert it to something TinkerPop serializers understand. That might mean something as simple as:
g.V().has("title1","abc").id().next().toString()

SimpleIDML: How to convert IDML to PDF?

I am new to InDesign CC Server. I have an InDesign Server running on Windows. I need to convert IDML to PDF but I am having issues.
I have used the SimpleIDML Python library to manipulate Adobe® IDML® files.
My sample script is I2P.py:
from simple_idml.indesign import indesign

idml_file = "/home/user/Project/EPS/media/test/2-idml/test001.idml"
indd_file = "/home/user/Project/EPS/media/test/InDesigndocument/test001.indd"
url_path = "http://192.168.1.1:12345/"
client_dir = "/home/user/Project/EPS/media/source"
server_dir = "/home/user/Project/EPS/media/server"

response = indesign.save_as(indd_file, [{
        "fmt": "pdf",
        "params": {"colorSpace": "CMYK"},
    }],
    url_path,
    client_dir,
    server_dir)[0]

with open("my_file.pdf", "w+") as f:
    f.write(response)
In the documentation:
response = indesign.save_as("/path_to_file.indd", [{
        "fmt": "pdf",
        "params": {"colorSpace": "CMYK"},
    }],
    "http://url-to-indesign-server:port",
    "/path/to/client/workdir",
    "/path/to/indesign-server/workdir")[0]
When I run the I2P script, it throws this error:
Traceback (most recent call last):
File "ItoP.py", line 12, in <module>
server_path)[0]
File "/home/user/eps2_env/local/lib/python2.7/site-packages/simple_idml/indesign/indesign.py", line 71, in new_func
logger, logger_extra)
File "/home/user/eps2_env/local/lib/python2.7/site-packages/simple_idml/indesign/indesign.py", line 180, in save_as
responses = map(lambda fmt: _save_as(fmt), dst_formats_params)
File "/home/user/eps2_env/local/lib/python2.7/site-packages/simple_idml/indesign/indesign.py", line 180, in <lambda>
responses = map(lambda fmt: _save_as(fmt), dst_formats_params)
File "/home/user/eps2_env/local/lib/python2.7/site-packages/simple_idml/indesign/indesign.py", line 149, in _save_as
response = cl.service.RunScript(params)
File "/home/user/eps2_env/local/lib/python2.7/site-packages/suds/client.py", line 542, in __call__
return client.invoke(args, kwargs)
File "/home/user/eps2_env/local/lib/python2.7/site-packages/suds/client.py", line 602, in invoke
result = self.send(soapenv)
File "/home/user/eps2_env/local/lib/python2.7/site-packages/suds/client.py", line 649, in send
result = self.failed(binding, e)
File "/home/user/eps2_env/local/lib/python2.7/site-packages/suds/client.py", line 702, in failed
r, p = binding.get_fault(reply)
File "/home/user/eps2_env/local/lib/python2.7/site-packages/suds/bindings/binding.py", line 265, in get_fault
raise WebFault(p, faultroot)
suds.WebFault: Server raised fault: 'The specified script file can not be found: /home/user/Project/EPS/media/server/tmp9LVUWj/save_as.jsx'
I can manually see the dynamically created dir tmp9LVUWj inside the server dir, which the server expects to exist at the same time.
I am not able to figure out how to set the indesign-server workdir, how to access it in the code, and how to solve this. I have spent a lot of time on this and am not able to find help or example code.
Or is there another Python package to convert from IDML to PDF?
Thanks in advance
You wrote:
"I can manually see the dynamically created dir tmp9LVUWj inside the server dir."
That is true, but that is not the error. It is stating that it cannot find a JSX file named save_as.jsx within that directory. Is that in fact the name of the JSX file that you were intending to place there, or the file that is residing there now?
