I understand that I can set the database location by changing the following line in /conf/neo4j-server.properties
org.neo4j.server.database.location=data/graph.db
Is it possible to do this within a python instance? For example, I'm using neo4jrestclient
from neo4jrestclient.client import GraphDatabase

neo4j_login = {
    "username": "neo4j",
    "password": "supersecret",
    "url": "http://localhost:7474"
}

gdb = GraphDatabase(**neo4j_login)
Can I somehow set the location of the database I'd like to open to a local directory?
If you're using the REST client, then you're not talking to a local directory, so no: you're talking to an HTTP endpoint (even if it happens to be on localhost). Via that REST client, I believe your only option is to point the server at the directory you want and then point the REST client at that server's URL, just as you would have anyway.
Now, if this were Java, you could use the neo4j-shell or other tools to open any directory on your disk as a Neo4j database; I don't believe that's an option with the present Python implementations, but if I'm wrong about that, someone else please jump in and say so.
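So, concretely, the only knob you have is on the server side. A rough, purely illustrative sketch of that workflow (the alternative directory name below is made up, and the server has to be restarted for the change to take effect):

# 1) Point the *server* at the directory you want, in conf/neo4j-server.properties:
#      org.neo4j.server.database.location=data/some_other_graph.db   <- hypothetical directory
# 2) Restart the Neo4j server.
# 3) The Python/REST side stays exactly as in the question:
from neo4jrestclient.client import GraphDatabase

gdb = GraphDatabase(url="http://localhost:7474", username="neo4j", password="supersecret")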
Related
Let's say:
I have my python code in main.py and I am using Pandas
I am storing my API Key(to some azure service) in a Windows Environment Variable ( variable name = "AZURE_KEY" and variable_value = "abc123abc")
I will import this API Key in main.py using azure_key = os.environ.get("AZURE_KEY")
Question:
How can I be sure that Pandas Library hasn't sent azure_key's value to somewhere outside my local system?
Possible Approach:
I know one way is to go through all of the Pandas module files and read the source code to see if anything fishy is happening, but that approach isn't feasible.
Note:
Pandas is just an example for the question; I actually want to use an API key within a Streamlit app.
Hence, please treat this question as agnostic to the library.
For a production system (on a server), you could use a firewall to filter outgoing connections
For a development system (your machine), you could add restrictions to the "API Key" account (e.g. only access test data, only access systems you really need, etc.)
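Neither of those fully answers the "how can I be sure" part, so as a rough complement (my own illustrative sketch, not something the library or the answer above provides), you could spot-check which outgoing connections your own process has open while the script runs, for example with the third-party psutil package:

import psutil

# List the current process's open internet connections; any unexpected remote
# address is a starting point for investigation. This is only a spot check,
# not a guarantee that nothing was ever sent.
proc = psutil.Process()
for conn in proc.connections(kind="inet"):
    if conn.raddr:
        print("%s:%s -> %s:%s (%s)" % (conn.laddr.ip, conn.laddr.port,
                                       conn.raddr.ip, conn.raddr.port, conn.status))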
I'm currently using the azure-cosmos module in Python to connect to a database on Azure. I want to fetch the data, make a few transformations, and then push it to a new container.
You need the key and client ID to connect to the database, which I've used as variables in my code for now, as follows:
from azure.cosmos import CosmosClient

url = 'https://xyz.azure.com:443/'
key = 'randomlettersandnumbers=='
client = CosmosClient(url, credential=key)
This seems to be a bad practice intuitively, and especially once I push this to Git, anyone could gain access to my database. So what's the most secure way to do this?
I'm coming from a non-SWE background, so apologies if this question is dumb.
Thanks!
The way I deal with this kind of problem is by using environment variables:
import os
from azure.cosmos import CosmosClient

url = os.environ.get("URL_ENDPOINT")
key = os.environ.get("API_KEY")
client = CosmosClient(url, credential=key)
You can set them in your shell like this (note that shell variable names can't contain hyphens, hence the underscores):
export URL_ENDPOINT="https://xyz.azure.com:443/"
export API_KEY="randomlettersandnumbers=="
Or you can put them in a bash script, envs.sh:
export URL_ENDPOINT="https://xyz.azure.com:443/"
export API_KEY="randomlettersandnumbers=="
And then load them with the source command:
source envs.sh
There is a good article about storing sensitive data using environment variables here.
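As a small extra safeguard (my own addition, not part of the answer above), you can also fail fast when a variable is missing instead of silently passing None to the client. A minimal sketch, assuming the URL_ENDPOINT and API_KEY names used above:

import os
from azure.cosmos import CosmosClient

def require_env(name):
    # Return the environment variable's value, or raise a clear error if it is unset.
    value = os.environ.get(name)
    if not value:
        raise RuntimeError("Required environment variable %r is not set" % name)
    return value

url = require_env("URL_ENDPOINT")
key = require_env("API_KEY")
client = CosmosClient(url, credential=key)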
I'm using a queue trigger to pass in some data about a job that I want to run with Azure Functions (I'm using Python). Part of the data is the name of a file that I want to pull from blob storage. Because of this, declaring a fixed file path/name in an input binding doesn't seem like the right direction, since the function won't have the file name until it gets the queue trigger.
One approach I've tried is to use the azure-storage sdk, but I'm unsure of how to handle authentication from within the Azure Function.
Is there another way to approach this?
In function.json, the blob input binding can refer to properties from the queue payload. The queue payload needs to be a JSON object.
Since this is function.json, it works for all languages.
See official docs at https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings
For example, in your function.json:
{
  "name": "imageSmall",
  "type": "blob",
  "direction": "in",
  "path": "container/{filename}"
}
And if your queue message payload is:
{
"filename" : "myfilename"
}
Then the {filename} token in the blob's path expression will get substituted.
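To make the wiring concrete, here is a minimal sketch of what the Python function itself might look like with that binding, assuming the v1 programming model (bindings in function.json) and a queue trigger binding named msg, which isn't shown in the snippet above; the parameter names must match the "name" values in function.json:

import logging
import azure.functions as func

def main(msg: func.QueueMessage, imageSmall: func.InputStream):
    # msg is the queue trigger payload; imageSmall is the blob resolved from
    # the {filename} property of that payload.
    payload = msg.get_json()     # e.g. {"filename": "myfilename"}
    data = imageSmall.read()     # bytes of the blob at container/{filename}
    logging.info("Got %d bytes for %s", len(data), payload["filename"])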
Typically, you store connection strings / account keys in the App Settings of the Function App, and then read them by accessing environment variables. I haven't used Python in Azure, but I believe that looks like:
connection = os.environ['ConnectionString']
I've found one example of a Python function which does what you ask for: queue trigger + blob operation.
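As an illustration of the storage-SDK part of the question, here is a minimal sketch using the current azure-storage-blob package (v12+) rather than the older azure-storage SDK mentioned in the question; the app setting name, container name, and blob name are assumptions:

import os
from azure.storage.blob import BlobServiceClient

# The connection string comes from an App Setting, surfaced as an environment variable.
service = BlobServiceClient.from_connection_string(os.environ["ConnectionString"])
blob_client = service.get_blob_client(container="container", blob="myfilename")
data = blob_client.download_blob().readall()   # bytes of the blob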
Storing secrets can (also) be done using App Settings.
In Azure, go to your Azure Functions App Service, then click "Application Settings" and scroll down to the "App Settings" list. This list consists of key-value pairs. Add your key, for example MY_CON_STR, with the actual connection string as the value.
Don't forget to click Save at this point.
Now, in your application (your Function for this example), you can load the stored value using its key. For example, in Python, you can use:
os.environ['MY_CON_STR']
Note that since the setting isn't saved locally, you have to execute it from within Azure. Unfortunately, Azure Functions applications do not contain a web.config file.
I've been using rpclib to auto-generate a WSDL and implement it in Python.
Then I wanted to call a web-service* that has this WSDL using JavaEE, so I simply used the Web Service from WSDL option in the creation wizard in Eclipse (Indigo 3.7.1 with OEPE), but then the Ant build failed with the exception (in short):
weblogic.wsee.tools.WsBuildException Error running JAX-WS wsdlc
Caused by java.lang.NoSuchMethodException: javax.xml.bind.annotation.XmlElementRef.required()
What should I do? How can I call the web-service using JavaEE?
* The web service is configured with: Apache HTTP Server 2.2.2 + mod_wsgi 3.3 + Python 2.6.5 + rpclib 2.6.1.
OK, I stumbled upon your post a second time, so I'll elaborate on the comment I gave before :).
First I recapitulate your set-up:
You have a working web service and a URL pointing to the corresponding WSDL
You're trying to invoke the WS methods from a different Java EE project on a different machine
General options for invoking a WS:
Use Dependency Injection to inject the WS reference
Create your own WS stubs
The first option won't work in your set-up because DI will only work in a container-managed environment (see my comment). That means that the WS class and the executing class have to be in the same container (e.g. the same server).
So what is left is to generate your WS stubs manually. For that you can use the wsimport tool mentioned in your own answer. There are several different ways to use this tool; let's have a look at the CLI usage:
Navigate to the project folder of the WS client used by your IDE: %IDE_WORKSPACE%/your project/src
Create a new folder, e.g. stub
Open a command window in this directory
Execute the following command: wsimport -keep <http://yourwsdl?wsdl>
After a refresh you should see several generated files.
Back in your IDE:
Now you're able to use your generated stub-files to connect to the WS by getting a port from the generated service-file
public class WsClient {
    public static void main(String[] args) {
        // Create the service
        'GeneratedFile'Service service = new 'GeneratedFile'Service();
        // Create the proxy
        'GeneratedFile' proxy = service.get'GeneratedFile'Port();
        // Invoke the WS method
        System.out.println(proxy.yourMethod(yourParam));
    }
}
Last hints:
For portability purposes, check the generated files: in their annotations the WSDL is sometimes linked to a local copy. Just change this back to your WSDL URL.
AFAIK there is an option in the wsimport tool to set this directly in the import routine.
There is a plugin for Eclipse called soapUI which allows you to use the wsimport tool in a GUI from within Eclipse. Once set up, it should speed up your work.
I've also found a quick start guide in developing WS clients with eclipse.
Hope this helped, have Fun!
EDIT: Just to clarify:
After you have used the wsimport tool you should have a directory containing the generated stub files. To make this example clear, you'll need to get a Service from the RequestFileService (this is my WS operation), like RequestFileService service = new RequestFileService();, and after this you'll need a Port on this service, like RequestFile proxy = service.getRequestFilePort();.
After this you can invoke your methods through the port: proxy.yourMethod(yourParam);
I've got a website that I wrote in Python using CGI. This was great up until very recently, when the ability to scale became important.
I decided, because it was very simple, to use mod_python. Most of the functionality of my site is stored in a python module which I call to render the various pages. One of the CGI scripts might look like this:
#!/usr/bin/python
import mysite
mysite.init()
mysite.foo_page()
mysite.close()
and in mysite, I might have something like this:
def get_username():
    cookie = Cookie.SimpleCookie(os.environ.get("HTTP_COOKIE", ""))
    sessionid = cookie['sessionid'].value
    ip = os.environ['REMOTE_ADDR']
    # placeholder for the actual DB query:
    #   select username from sessions where ip = %foo and session = %bar
    username = lookup_username(ip, sessionid)  # hypothetical helper wrapping that query
    return username
to fetch the current user's username. The problem is that this depends on os.environ getting populated when os is imported by the script (at the top of the module). Because I'm now using mod_python, the interpreter only loads this module once, and only populates the environment once. I can't read cookies because its os module has the environment variables of the local machine, not those of the remote user.
I'm sure there is a way around this, but I'm not sure what it is. I tried re-importing os in the get_username function, but no dice :(.
Any thoughts?
Which version of mod_python are you using? Mod_python 3.x includes a separate Cookie class to make this easier (see here)
Under earlier versions, IIRC, you can get the incoming cookies from the headers_in member of the request object.
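A minimal sketch of both approaches under the mod_python handler model (the handler shape and the 'sessionid' cookie name are just illustrative, carried over from the question):

from mod_python import apache, Cookie

def handler(req):
    # mod_python 3.x: parse the request's cookies directly
    cookies = Cookie.get_cookies(req)
    sessionid = cookies['sessionid'].value if 'sessionid' in cookies else None

    # Older versions: read the raw header from the request object instead
    raw_cookie_header = req.headers_in.get('Cookie', '')

    req.content_type = "text/plain"
    req.write("session: %s\n" % sessionid)
    return apache.OK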