I'm trying to create an OrientDB graph database using PyOrient, and I can't find enough documentation to get Functions working. I've been able to create a function using record_create into the ofunction cluster; it doesn't crash, but it doesn't appear to work either.
Here's my code:
#!/usr/bin/python
import pyorient
ousername="user"
opassword="pass"
client = pyorient.OrientDB("localhost", 2424)
session_id = client.connect( ousername, opassword )
db_name="database"
client.db_create( db_name, pyorient.DB_TYPE_GRAPH, pyorient.STORAGE_TYPE_PLOCAL )
# Set up the schema of the database
client.command( "create class URL extends V" )
client.command( "CREATE PROPERTY URL.url STRING")
client.command( "CREATE PROPERTY URL.id INTEGER")
client.command( "CREATE SEQUENCE urlseq")
client.command( "CREATE INDEX urls ON URL (url) UNIQUE")
# Get the id numbers of all the clusters
info=client.db_reload()
clusters={}
for c in info:
    clusters[c.name] = c.id
print(clusters)
# Construct a test function
# All this should do is create a new URL vertex. Eventually it will check for uniqueness of url, etc.
code="INSERT INTO URL SET id = sequence('urlseq').next(), url='?'"
addURL_func = {
    '#OFunction': {
        'name': 'addURL',
        'code': 'orient.getGraph().command("sql","%s",[urlparam]);' % code,
        'language': 'javascript',
        'parameters': 'urlparam',
        'idempotent': False
    }
}
client.record_create( clusters['ofunction'], addURL_func )
# Assume allURLs contains the list of URLs I want to store
for url in allURLs:
    client.command("select addURL('%s')" % url)

vs = client.command("select * from URL")
for v in vs:
    print(v.url)
Doing all the select addURL bits runs happily, but doing select * from URL simply times out, presumably because (as I've discovered by examining the database in Studio) there are still no URL vertices. Although why that should time out rather than return an empty list or a useful error message, I'm not sure.
What am I doing wrong, and is there an easier way to create Functions through PyOrient?
I don't want to just write the Functions in Studio, because I am prototyping and want them written from the Python code rather than being lost every time I drop the mangled experimental graph!
I've mainly been using the OrientDB wiki page to find out about OrientDB functions, and the PyOrient github page as almost my only source of documentation for that.
Edit: I've been able to create a working Function in SQL (see my own answer below) but I still can't create a working Javascript Function which creates a vertex. My current best attempt is:
code2="""var g=orient.getGraph();g.command('sql','CREATE VERTEX URL SET id = sequence(\\"urlseq\\").next(), url = \\"'+urlparam+'\\"',[urlparam]);"""
myFunction2 = 'CREATE FUNCTION addURL2 "' + code2 + '" parameters [urlparam] idempotent false language javascript'
client.command(myFunction2)
which runs without crashing when called from PyOrient, but doesn't actually create any vertices. But if I call it from Studio, it works!?! I have no idea what's going on.
OK, after a lot of hacking and Googling, I've got it working:
code="CREATE VERTEX URL SET id = sequence('urlseq').next(), url = :urlparam;"
myFunction = 'CREATE FUNCTION addURL "' + code + '" parameters [urlparam] idempotent false language sql'
client.command(myFunction)
The key here seems to be the use of a colon before parameter names in OrientDB's version of SQL. I couldn't find any reference to this anywhere in the OrientDB docs, but someone online had discovered it somehow.
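For anyone following along, here is a minimal usage sketch of that function from PyOrient; the URLs are placeholders and client is the connected pyorient client from the question:
# Call the SQL function created above once per URL (placeholder URLs).
for url in ["http://example.com", "http://example.org"]:
    client.command("SELECT addURL('%s')" % url)

# Read the vertices back to confirm they were created.
for v in client.command("SELECT FROM URL"):
    print(v.url)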
I'm answering my own question in the hope that this will help others struggling with ODB's poor documentation!
You could try something like:
code = "var g=orient.getGraph();\ng.command(\\'sql\\',\\'%s\\',[urlparam]);"
myFunction = "CREATE FUNCTION addURL '" + code + "' parameters [urlparam] idempotent false language javascript"
client.command(myFunction)
UPDATE
I used this code (OrientDB 2.2.5) and it worked for me:
code = "var g=orient.getGraph().command(\\'sql\\',\\'%s\\',[urlparam]);"
myFunction = "CREATE FUNCTION addURL '" + code + "' parameters [urlparam] idempotent false language javascript"
client.command(myFunction)
Hope it helps
Related
I'll preface this by saying I'm fairly new to BigQuery. I'm running into an issue when trying to schedule a query using the Python SDK. I used the example on the documentation page and modified it a bit but I'm running into errors.
Note that my query does use scripting to set some variables, and it's using a MERGE statement to update one of my tables. I'm not sure if that makes a huge difference.
from google.cloud import bigquery_datatransfer

# Client setup as in the docs example
transfer_client = bigquery_datatransfer.DataTransferServiceClient()

def create_scheduled_query(dataset_id, project, name, schedule, service_account, query):
    parent = transfer_client.common_project_path(project)

    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id=dataset_id,
        display_name=name,
        data_source_id="scheduled_query",
        params={
            "query": query
        },
        schedule=schedule,
    )

    transfer_config = transfer_client.create_transfer_config(
        bigquery_datatransfer.CreateTransferConfigRequest(
            parent=parent,
            transfer_config=transfer_config,
            service_account_name=service_account,
        )
    )

    print("Created scheduled query '{}'".format(transfer_config.name))
I was able to successfully create a query with the function above. However the query errors out with the following message:
Error code 9 : Dataset specified in the query ('') is not consistent with Destination dataset '{my_dataset_name}'.
I've tried passing in "" as the dataset_id parameter, but I get the following error from the Python SDK:
google.api_core.exceptions.InvalidArgument: 400 Cannot create a transfer with parent projects/{my_project_name} without location info when destination dataset is not specified.
Interestingly enough I was able to successfully create this scheduled query in the GUI; the same query executed without issue.
I saw that the GUI showed the scheduled query's "Resource name" referenced a transferConfig, so I used the following command to see what that transferConfig looked like, to see if I could apply the same parameters using my Python script:
bq show --format=prettyjson --transfer_config {my_transfer_config}
Which gave me the following output:
{
  "dataSourceId": "scheduled_query",
  "datasetRegion": "us",
  "destinationDatasetId": "",
  "displayName": "test_scheduled_query",
  "emailPreferences": {},
  "name": "{REDACTED_TRANSFER_CONFIG_ID}",
  "nextRunTime": "2021-06-18T00:35:00Z",
  "params": {
    "query": ....
So it looks like the GUI was able to use "" for destinationDatasetId, but for whatever reason the Python SDK won't let me use that value.
Any help would be appreciated, since I prefer to avoid the GUI whenever possible.
UPDATE:
This does appear to be related to the scripting I used in my query. I removed the scripts from the query and it's working. I'm going to leave this open because I feel like this should be possible using the SDK since the query with scripting works in the console without issue.
This same thing also threw me for a loop, but I managed to figure out what was wrong. The problem is with the
parent = transfer_client.common_project_path(project)
line that is given in the example query. By default, this returns something of the form projects/{project_id}. However, the CreateTransferConfigRequest documentation says of the parent parameter:
The BigQuery project id where the transfer configuration should be created. Must be in the format projects/{project_id}/locations/{location_id} or projects/{project_id}. If specified location and location of the destination bigquery dataset do not match - the request will fail.
Sure enough, if you use the projects/{project_id}/locations/{location_id} format instead, it resolves the error and allows you to pass a null destination_dataset_id.
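In code, the change inside create_scheduled_query might look like the sketch below; the "us" location is an assumption and must match your dataset's region:
# Sketch: build the parent path with an explicit location instead of common_project_path().
parent = transfer_client.common_location_path(project, "us")  # "us" is an assumed region

# With the location in the parent, destination_dataset_id can be left out of the
# TransferConfig, since the MERGE statement names its own destination table.
transfer_config = bigquery_datatransfer.TransferConfig(
    display_name=name,
    data_source_id="scheduled_query",
    params={"query": query},
    schedule=schedule,
)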
I had the exact same issue. The fix is below.
The method below returns projects/{project_id}:
parent = transfer_client.common_project_path(project_id)
Instead, use the method below, which returns projects/{project}/locations/{location}:
parent = transfer_client.common_location_path(project_id , "EU")
After making that change, I was able to schedule a script in BigQuery.
Below is my current code. It connects successfully to the organization. How can I fetch the results of a query in Azure like they have here? I know this was solved, but there isn't an explanation, and there's quite a big gap in what they're doing.
from azure.devops.connection import Connection
from msrest.authentication import BasicAuthentication
from azure.devops.v5_1.work_item_tracking.models import Wiql
personal_access_token = 'xxx'
organization_url = 'zzz'
# Create a connection to the org
credentials = BasicAuthentication('', personal_access_token)
connection = Connection(base_url=organization_url, creds=credentials)
wit_client = connection.clients.get_work_item_tracking_client()
results = wit_client.query_by_id("my query ID here")
P.S. Please don't link me to the GitHub repo or the documentation. I've looked at both extensively for days and it hasn't helped.
Edit: I've added the results line that successfully gets the query. However, it returns a WorkItemQueryResult class which is not exactly what is needed. I need a way to view the column and results of the query for that column.
So I've figured this out in probably the most inefficient way possible, but hope it helps someone else and they find a way to improve it.
The issue with the WorkItemQueryResult class stored in variable "result" is that it doesn't allow the contents of the work item to be shown.
So the goal is to be able to use the get_work_item method, which requires an id; you can get that (in a rather roundabout way) through item.target.id from the results' work_item_relations. The code below builds on the snippet above.
for item in results.work_item_relations:
    id = item.target.id
    work_item = wit_client.get_work_item(id)
    fields = work_item.fields
This gets the id from every work item in your result class and then grants access to the fields of that work item, which you can access by fields.get("System.Title"), etc.
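For example, a short sketch that prints a couple of the standard system fields for each work item in the result, reusing results and wit_client from above; adjust the field reference names to the columns your query actually returns:
# Reuses `results` and `wit_client` from the answer above.
for item in results.work_item_relations:
    work_item = wit_client.get_work_item(item.target.id)
    # "System.Title" and "System.State" are standard system fields.
    print(work_item.id, work_item.fields.get("System.Title"), work_item.fields.get("System.State"))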
I'm in the middle of writing my Bachelor's thesis, and for it I'm creating a database system with Postgres and Flask.
To keep my data safe, I've been working on code to prevent SQL injection, since a user should be able to submit a string via an HTTP request. Most of the functions I use to analyze the HTTP request take kwargs built from a JSON dict in the request, so I was wondering whether it is possible to inject Python code into those kwargs.
And if so, whether there are ways to prevent that.
To make it easier to understand what I mean, here are some example requests and code:
import json
from flask import Flask

app = Flask(__name__)

def calc_sum(a, b):
    c = a + b
    return c

@app.route('/<string:target>/<string:value>')
def handle_request(target, value):
    if target == 'calc_sum':
        cmd = json.loads(value)
        return str(calc_sum(**cmd))
example Request:
Normal : localhost:5000/calc_sum/{"a":1, "b":2}
Injected : localhost:5000/calc_sum/{"a":1, "b:2 ): print("ham") def new_sum(a=1, b=2):return a+b":2 }
Since I'm not near my work, where all my code is, I'm unable to test this out, and to be honest I'm not sure my code example above would even run. But I hope it conveys what I mean.
I hope you can help me, or at least nudge me in the right direction. I've searched for this, but all I can find are tutorials on "how to use kwargs".
Best regards.
Yes you can, but not in the URL path. Try using query arguments like these: localhost:5000/calc_sum?func=a+b&a=1&b=2
To get those arguments, you need to do this in Flask:
from flask import request

@app.route('/<string:target>')
def handle_request(target):
    if target == 'calc_sum':
        func = request.args.get('func')
        a = request.args.get('a')
        b = request.args.get('b')
        result = exec(func)
exec is used to execute python code in strings
I'm building a standalone proximity search tool in python 2.7 (the intent is distributing it using py2exe and NSIS) and tkinter that takes a center point address, queries the database, and returns all addresses in the database within a certain range.
This is my first time venturing into the google api, and I am extremely confused about how to make use of it to retrieve this data.
I followed the code here: http://www.libertypages.com/clarktech/?p=315
And receive nothing but a 610.
I tried using this URL instead, based on a question here on Stack Overflow, and received an Access Denied error: http://maps.googleapis.com/maps/api/geocode/xml?
I've set up the project in the API console, enabled the maps service, added both a browser and a server api key, tried them both, and failed.
I've spent all morning poring through the API documentation, and I can find NOTHING that tells me what URL to specify for a simple API request to the Google Maps API v3.
This is the actual code of my function, it's a slightly modified version of what I linked above, with some debugging output mixed in, when I ran it with http://maps.google.com/maps/geo? I received 610,0,0,0 :
def get_location(self, query):
    params = {}
    params['key'] = "key from console"  # the actual key, of course, is not provided here
    params['sensor'] = "false"
    params['address'] = query
    params = urllib.urlencode(params)
    print "http://maps.googleapis.com/maps/api/geocode/json?%s" % params
    try:
        f = urllib.urlopen("http://maps.googleapis.com/maps/api/geocode/json?%s" % params)
Everything runs perfectly; I just wind up with "610" and "Fail" as the lat and long for all the addresses in my database. :/
I've tried a server app API key and a browser app API key, an Oauth Client ID isn't an option, because I don't want my users to have to allow the access.
I'm REALLY hoping someone will just say "go read this document, you moron" and I can shuffle off with my tail between my legs.
UPDATE: I found this: https://developers.google.com/maps/documentation/geocoding/ implemented the changes it suggests, no change, but it's taking longer to give me the "ACCESS DENIED" response.
The API Key is causing it to fail. I took it out of the query parameters dictionary and it simply works as an anonymous request.
params = { }
params[ 'sensor' ] = "false"
params[ 'address' ] = query
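Putting it together, here is a minimal sketch of the anonymous request, in Python 2 style to match the question; the address is just an example:
import json
import urllib

query = "1600 Amphitheatre Parkway, Mountain View, CA"  # example address
params = urllib.urlencode({'sensor': 'false', 'address': query})
response = json.load(urllib.urlopen("http://maps.googleapis.com/maps/api/geocode/json?%s" % params))
if response['status'] == 'OK':
    location = response['results'][0]['geometry']['location']
    print("%s, %s" % (location['lat'], location['lng']))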
I have a Jython script that is used to set up a JDBC datasource on a Websphere 7.0 server. I need to set several properties on that datasource. I am using this code, which works, unless value is '-'.
def setCustomProperty(datasource, name, value):
    parms = ['-propertyName', name, '-propertyValue', value]
    AdminTask.setResourceProperty(datasource, parms)
I need to set the dateSeparator property on my datasource to just that - a dash. When I run this script with setCustomProperty(ds, 'dateSeparator', '-') I get an exception that says, "Invalid property: ". I figured out that it thinks that the dash means that another parameter/argument pair is expected.
Is there any way to get AdminTask to accept a dash?
NOTE: I can't set it via AdminConfig because I cannot find a way to get the id of the right property (I have multiple datasources).
Here is a solution that uses AdminConfig so that you can set the property value to the dash -. The solution accounts for multiple data sources, finding the correct one by specifying the appropriate scope (i.e. the server, but this could be modified if your datasource exists within a different scope) and then finding the datasource by name. The solution also accounts for modifying the existing "dateSeparator" property if it exists, or it creates it if it doesn't.
The code doesn't look terribly elegant, but I think it should solve your problem:
def setDataSourceProperty(cell, node, server, ds, propName, propVal):
    scopes = AdminConfig.getid("/Cell:%s/Node:%s/Server:%s/" % (cell, node, server)).splitlines()
    datasources = AdminConfig.list("DataSource", scopes[0]).splitlines()
    for datasource in datasources:
        if AdminConfig.showAttribute(datasource, "name") == ds:
            propertySet = AdminConfig.list("J2EEResourcePropertySet", datasource).splitlines()
            customProp = [["name", propName], ["value", propVal]]
            for property in AdminConfig.list("J2EEResourceProperty", propertySet[0]).splitlines():
                if AdminConfig.showAttribute(property, "name") == propName:
                    AdminConfig.modify(property, customProp)
                    return
            AdminConfig.create("J2EEResourceProperty", propertySet[0], customProp)

if __name__ == "__main__":
    setDataSourceProperty("myCell01", "myNode01", "myServer", "myDataSource", "dateSeparator", "-")
    AdminConfig.save()
Please see the Management Console preference settings. Perform what you are attempting now in the console and you should see the Jython equivalent that the console creates for its own use. Then just copy it.
@Schemetrical's solution worked for me. Just giving another example, with JVM args.
Not commenting on the actual answer because I don't have enough reputation.
server_name = 'server1'
AdminTask.setGenericJVMArguments('[ -serverName %s -genericJvmArguments "-agentlib:getClasses" ]' % (server_name))
Try passing the parameters as a single string instead of an array, using double quotes to surround values that start with a dash.
Example:
AdminTask.setVariable('-variableName JDK_PARAMS -variableValue "-Xlp -Xscm250M" -variableDescription "-Yes -I -can -now -use -dashes -everywhere :-)" -scope Cell=MyCell')
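Applied to the original setResourceProperty call, that might look like the untested sketch below; whether the quoted dash survives is exactly what would need verifying:
# Untested sketch: pass the parameters as one string and wrap the dash value in double quotes.
AdminTask.setResourceProperty(datasource, '[-propertyName dateSeparator -propertyValue "-"]')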