I'm building an app that needs to get artwork information out of Spotify via the libspotify API.
I'm building the app in Python on Google App Engine. Does anyone know of a package that will enable me to access the libspotify API? The official library is in C, and I've googled around to try and find a suitable wrapper but can't seem to find one.
Thanks
Tom
There is also https://github.com/mopidy/pyspotify which is actively used in some applications and up to date.
I don't think you will be able to call libspotify at all. From the docs:
"The Python interpreter runs in a secured "sandbox" environment to isolate your application for service and security. The interpreter can run any Python code, including Python modules you include with your application, as well as the Python standard library. The interpreter cannot load Python modules with C code; it is a "pure" Python environment."
Check Spotimeta
http://pypi.python.org/pypi/spotimeta/
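For the artwork lookup, here is a minimal sketch, assuming spotimeta's search helpers as described on its PyPI page (the exact function names and result layout are assumptions worth checking against the package docs); since it is pure Python over the web metadata service, it should also load inside App Engine's sandbox:

    # Sketch only: spotimeta wraps the Spotify Metadata web service in pure
    # Python, so no C extensions are needed (unlike libspotify/pyspotify).
    import spotimeta

    # search_album is assumed from the package description; verify it exists.
    result = spotimeta.search_album("The Dark Side of the Moon")

    # The result is a plain dict/list structure from the metadata service;
    # inspect it to pull out the album and artwork fields you need.
    print(result)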
I am looking for examples or actual documentation for the Train-level OpenStack Octavia (load balancer component) Python API. Everything on the OpenStack project doc site seems to be focused on the CLI. I would like an example and, if possible, an API specification (what to pass in and what to expect out of the functions defined for the OctaviaAPI class in the component). I have been looking for a few hours with little success.
Try https://docs.openstack.org/openstacksdk/latest/user/proxies/load_balancer_v2.html. In general, projects' documentation sites don't cover Python bindings, as they are not part of the projects. OpenstackSDK is its own project.
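For example, here is a minimal sketch against the load_balancer proxy (the cloud name and subnet ID are placeholders; see the linked proxy docs for the full method list):

    import openstack

    # Credentials come from clouds.yaml or OS_* environment variables;
    # "mycloud" is a placeholder cloud name.
    conn = openstack.connect(cloud="mycloud")

    # List existing Octavia load balancers through the load_balancer proxy.
    for lb in conn.load_balancer.load_balancers():
        print(lb.name, lb.provisioning_status)

    # Create a new one (the VIP subnet ID is a placeholder).
    new_lb = conn.load_balancer.create_load_balancer(
        name="example-lb",
        vip_subnet_id="REPLACE-WITH-SUBNET-ID",
    )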
Yeah, the client-based Python bindings are being phased out in favor of the openstacksdk. Octavia went straight to using openstacksdk for its Python bindings.
I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
-Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the python script. This almost works. I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the Python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
"the logs in the Firebase console show that the python script is missing dependencies."
That's because the nodejs runtime targeted by the Firebase CLI doesn't have everything you need to run python programs.
If you need to run a function that's primarily written in Python, you should not use the Firebase CLI and should instead use the Google Cloud tools to target the Python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it is the right way to run Python in Cloud Functions.
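A rough sketch of what that can look like, assuming an HTTP-triggered function deployed with gcloud (all names below are placeholders; dependencies such as numpy go in a requirements.txt next to main.py and are installed at deploy time, which is what solves the missing-dependency problem):

    # main.py -- an HTTP-triggered Cloud Function on the Python runtime.
    # Deployed with something like (sketch, names are placeholders):
    #   gcloud functions deploy run_algorithm --runtime python37 --trigger-http
    # Dependencies (e.g. numpy) are listed in requirements.txt in the same
    # folder and are installed for you at deploy time.
    import json

    import numpy as np  # declared in requirements.txt, not vendored by hand


    def run_algorithm(request):
        """HTTP entry point; `request` is a flask.Request."""
        data = request.get_json(silent=True) or {}
        values = data.get("values", [])
        # Placeholder for the real algorithm.
        result = float(np.sum(values)) if values else 0.0
        return json.dumps({"result": result})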
Can Google Cloud Functions handle python with packages like sklearn, pandas, etc? If so, can someone point me in the direction of resources on how to do so.
I've been searching for a while and it seems like this is impossible; all I've found are resources for deploying the base Python language to Google Cloud.
Python 3.7 is supported now.
Steps to create one via the Google Cloud console:
1. Go to Google Cloud Functions in the Google Cloud console and click "Create function".
2. Specify the function's properties.
3. Select a trigger.
4. Change the runtime to Python 3.7.
5. Enter your Cloud Function logic and entry point (see the minimal sketch after this list).
6. Enter the Python dependencies in requirements.txt.
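To illustrate steps 5 and 6, here is a minimal sketch (the function name is a placeholder and must match whatever you enter as the entry point in the console):

    # main.py -- "hello_world" below must match the "entry point" field set
    # in the console (step 5); third-party packages go in requirements.txt.
    def hello_world(request):
        """HTTP-triggered function; `request` is a flask.Request."""
        name = request.args.get("name", "World")
        return "Hello, {}!".format(name)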
EDIT: As of July 2018 there is now a Python runtime (3.7) available for Google Cloud Functions!
OLD ANSWER: Google Cloud Functions (GCF) are written in JavaScript (executed in a Node.js runtime), so there is no way for them to actually handle Python at this moment. There is a Python module on GitHub that you might have come across; it can be used to write and deploy GCF with one of three trigger types: HTTP, Pub/Sub and bucket. The module takes care of translating your Python logic into JavaScript code that is later run inside Google Cloud Platform.
When it comes to other packages like pandas, AFAIK no one has prepared that 'translation' into JavaScript for them. If you really don't like the idea of jumping into JavaScript and writing the Cloud Function code on your own (with the logic you intended to use in a Python script), you have a possible workaround. You can invoke your Python script from inside the Cloud Function written in JS - the idea was discussed in this topic. Another way is using Object Change Notifications or Pub/Sub Notifications as explained here.
As of 19th July 2018, Google Cloud Functions supports Python 3.7.
Kindly check the runtime environment documentation to find the Python 3.7 runtime and a sample script (based on Flask).
--UPDATED--
Official documentation for Google Cloud Functions - Python 3.7 support (beta release):
"This is a beta release of the Python runtime for Google Cloud Functions. This feature might be changed in backward-incompatible ways and is not subject to any SLA or deprecation policy."
scikit-learn and NumPy are supported in Google Cloud Functions. I've also run a sample test to confirm the availability of pandas, and it's working fine.
https://github.com/mkanchwala/google-functions-python-example
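For reference, here is a minimal sketch of the pattern (not taken from the linked repo; the function and request shape are assumptions):

    # main.py -- requirements.txt alongside it would list e.g.:
    #   pandas
    #   scikit-learn
    # Cloud Functions installs these automatically at deploy time.
    import pandas as pd


    def summarize(request):
        """HTTP entry point; expects a JSON body like {"rows": [{...}, ...]}."""
        payload = request.get_json(silent=True) or {}
        rows = payload.get("rows", [])
        if not rows:
            return "{}"
        df = pd.DataFrame(rows)
        # Return simple summary statistics as JSON to show pandas is usable.
        return df.describe(include="all").to_json()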
Hope this helps to all the "Py" lovers.
You can use AWS Lambda as well if you want to work around this and still use Python as your main language. Some modules/packages will need to be uploaded via a zip file with AWS Lambda, but it has a broader range of usable languages than GCF.
After several weeks looking for information here and on Google, I've decided to post this to see if anyone with the same problem can give me a hand.
I have a Java application developed in Eclipse Ganymede, using Tomcat to connect to my local database. The problem is that I want to send a simple message ("Hello World") to a Kafka topic published on a public server. I've imported the libraries and developed the Kafka function, but something happens when I run it in debug mode. I have no issues or visible errors when compiling, but when I run the application and push the button that triggers this function, it stops in the KafkaProducer function because there is a NoClassDefFoundError for kafka.producer..... It seems like it is not finding the library properly, but I have seen that it is properly imported in the build path.
I am not sure if the problem is compatibility between Kafka and Eclipse or the Java SDK (3.6); could it be? Does anyone know the minimum required version of Java for Kafka?
Also, I have found that Kafka is usually used with Scala, but I want to know if I can keep this Eclipse IDE version rather than change it.
Another solution I found is to use a Python script called from the Java application, but I have no way to call it from there; I followed several tutorials but nothing worked. Still, I have to continue down this path because it seems the easier option. I have developed the .py script and it works with the Kafka server; now I have to find a way to exchange variables between Java and Python. If anyone knows a good tutorial for this, please let me know.
After this summary of my last few days, and after banging my head against the wall, maybe someone has run into this error before and can help me find the solution. I would really appreciate it, and sorry for the long story.
Please include the Kafka client library within the WAR file of the Java application that you are deploying to Tomcat.
Please use org.apache.kafka.clients.producer.KafkaProducer rather than kafka.producer.Producer (which is the old client API) and make sure you have the Kafka client library on the classpath. The client library is entirely in Java. It's the old API that's written in Scala, as is the server-side code. You don't need to import the server library in your code or add it to the classpath if you use the new client API.
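Separately, if you do go with the Python-script fallback mentioned in the question, here is a minimal sketch using the kafka-python package (the broker address and topic name are placeholders); passing the message as a command-line argument is one simple way to hand a value over from the Java side (e.g. via ProcessBuilder):

    # send_message.py -- usage: python send_message.py "Hello World"
    import sys

    from kafka import KafkaProducer  # pip install kafka-python

    # Broker address and topic are placeholders for the public server's values.
    producer = KafkaProducer(bootstrap_servers="broker.example.com:9092")

    message = sys.argv[1] if len(sys.argv) > 1 else "Hello World"
    producer.send("my-topic", message.encode("utf-8"))

    # Make sure the message is actually delivered before the script exits.
    producer.flush()
    producer.close()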
In the end, the problem was that the library was not added properly. I had to add it in the build.xml file, importing the library there. Maybe this is useful for people who use an old Eclipse version.
So now it finds the library, but I have to update the Java version, which is another matter. So this is solved.
I need to write a script in Python to check a webpage which is protected by Kerberos. Is there any way to do this from within Python, and how? The script is going to be deployed on a Linux environment with Python 2.4.something installed.
I think python-krbV could work, and most Linux distributions also have a python-kerberos package. For example, Debian has one of the same name. Here's the documentation on it.
Extract from link:
"This Python package is a high-level wrapper for Kerberos (GSSAPI)
operations. The goal is to avoid having to build a module that wraps
the entire Kerberos.framework, and instead offer a limited set of
functions that do what is needed for client/server Kerberos
authentication based on http://www.ietf.org/rfc/rfc4559.txt. "
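As a rough illustration, here is a sketch of using that kerberos module to fetch a Negotiate/SPNEGO-protected page on Python 2.x (hostname and URL are placeholders; error handling and the mutual-authentication check are omitted):

    # Python 2.x sketch: obtain a Kerberos/GSSAPI token and send it as a
    # "Negotiate" Authorization header, per RFC 4559.
    import urllib2

    import kerberos

    host = "intranet.example.com"  # placeholder hostname
    url = "http://" + host + "/protected/page"  # placeholder URL

    # Build a GSSAPI context for the HTTP service on that host and get a token.
    _, ctx = kerberos.authGSSClientInit("HTTP@" + host)
    kerberos.authGSSClientStep(ctx, "")
    token = kerberos.authGSSClientResponse(ctx)

    request = urllib2.Request(url)
    request.add_header("Authorization", "Negotiate " + token)
    response = urllib2.urlopen(request)
    print response.read()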