Solution for distributing MANY simple network tasks? [closed] - python

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 7 years ago.
I would like to create some sort of distributed setup for running a ton of small/simple REST web queries in a production environment. Each batch of 5-10 related queries executed on a node will generate a very small amount of derived data, which will need to be stored in a standard relational database (such as PostgreSQL).
What platforms are built for this type of problem? The nature, data sizes, and quantities seem to contradict the mindset of Hadoop. There are also more grid-based architectures, such as Condor and Sun Grid Engine, which I have seen mentioned. I'm not sure whether these platforms offer any recovery from errors, though (i.e. checking whether a job succeeded).
What I would really like is a FIFO-type queue that I could add jobs to, with the end result of my database getting updated.
Any suggestions on the best tool for the job?
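The FIFO-queue-plus-database pattern described here can be sketched with the standard library alone; the `fetch` step, the thread count, and the `derived` table below are all hypothetical stand-ins, not a recommendation of a specific platform:

```python
import queue
import sqlite3
import threading

def fetch(url):
    """Stand-in for the real REST query (use urllib/requests in practice)."""
    return len(url)  # pretend the response yields one small number

def worker(jobs, results):
    while True:
        try:
            batch = jobs.get_nowait()  # one job = a batch of 5-10 related queries
        except queue.Empty:
            return
        # Derive a tiny result from the batch, keyed by its first URL.
        results.put((batch[0], sum(fetch(u) for u in batch)))

def run(batches, db_path=":memory:"):
    jobs, results = queue.Queue(), queue.Queue()
    for b in batches:
        jobs.put(b)
    threads = [threading.Thread(target=worker, args=(jobs, results)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Single writer drains the results queue into the relational store.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS derived (key TEXT, value INTEGER)")
    while not results.empty():
        conn.execute("INSERT INTO derived VALUES (?, ?)", results.get())
    conn.commit()
    return conn

conn = run([["http://a/1", "http://a/2"], ["http://b/1"]])
rows = conn.execute("SELECT COUNT(*) FROM derived").fetchone()[0]
```

A real deployment would replace `queue.Queue` with a broker that survives restarts and retries failed jobs, which is exactly the gap Celery-style tools fill.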

Have you looked at Celery?

Related

How to protect code confidentiality (product using Python and ruby) [closed]

Closed 1 year ago.
We developed an AI-assisted work product using Python and Ruby.
For data security reasons, the customer requires the code to be installed on their server.
What technical solutions can we use to ensure easy deployment without leaking our code? Should we use an open-source encryption system, or an AWS service?
Putting aside the legal aspects: from a technical perspective, there is no way to prevent the user from reverse engineering your code given enough time, as ultimately it is running on their server.
If you have anything that should never be leaked, put that logic behind a secured API which the local application can call - that can use AWS API Gateway if you wish to do so.
If the code cannot be deployed behind an API, you can obfuscate it using solutions found online, but you will only make reverse engineering harder, not impossible.
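A minimal sketch of the "logic behind a secured API" split, using only the standard library; the endpoint URL, the `X-Api-Key` header, and the payload shape are illustrative assumptions, not a prescribed protocol. The on-premises client only serialises inputs and attaches credentials; the proprietary logic never ships:

```python
import json
import urllib.request

API_URL = "https://api.example.com/v1/score"  # hypothetical endpoint on YOUR server

def build_request(payload, api_key):
    """Build the POST the thin on-site client would send. The sensitive
    computation happens behind API_URL, not in the deployed code."""
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={"X-Api-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = build_request({"text": "hello"}, api_key="demo-key")
# urllib.request.urlopen(req) would send it; omitted here (no live server).
```

The customer's data still transits to your server in this design, so it only fits if that is acceptable under their security requirements.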

Desktop application database for python program [closed]

Closed 1 year ago.
I am designing a desktop application using Python.
I am facing a problem choosing the most appropriate database: the program works both online and offline.
Offline there is no problem, because I find SQLite very suitable; the problem is when the program works online, because then I need to put the database in the cloud so it is easy to work with and access from anywhere.
My question here is: what is the best solution to this problem? Is there a database that can be embedded with the program for local use and, at the same time, be used as a server in the cloud?
Thanks in advance
Well, I can say that you can use PouchDB on the client side, and for data sync any server-side database that speaks the CouchDB sync protocol is suitable.
I have also worked with PostgreSQL and kinto.js, but they are not as user-friendly as PouchDB.
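Since the question asked for Python: any store speaking the CouchDB replication protocol can be synced by POSTing to the server's `/_replicate` endpoint. A stdlib sketch of building that request (the server URL and database names are made up, and actually sending it requires a running CouchDB):

```python
import json
import urllib.request

def replicate_request(server, source, target, continuous=False):
    """Build a POST to CouchDB's /_replicate endpoint, asking the server
    to pull `source` into `target`. With continuous=True the server keeps
    the two in sync as changes arrive."""
    body = {"source": source, "target": target, "continuous": continuous}
    return urllib.request.Request(
        f"{server}/_replicate",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = replicate_request(
    "http://localhost:5984",                  # local embedded/desktop instance
    "app_db",                                 # local database name
    "http://cloud.example.com:5984/app_db",   # cloud replica
)
```

This is the same mechanism PouchDB uses under the hood, which is why the local and cloud copies can be the "same" database.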

Printing Graphs in Python [closed]

Closed 5 years ago.
I will soon start working on a certain computer science assignment. One of the things I will have to do is use data from a database (I will use SQLite, and its plugin in Mozilla Firefox), and one of the things my program needs to output is a normal distribution graph of the grades. It would be nice if you could tell me whether I can create a GUI program that can do that (for now I am using JetBrains PyCharm Edu). Can someone please tell me how I am supposed to do that? Consider me not a total beginner in Python, but not a professional either. Thank you in advance!
I had to create a GUI for my Python app like you, and there are mainly two choices: either you create a native app using a toolkit such as Qt, or you create a web app. I chose the latter for simplicity and portability.
There are lots of frameworks out there, such as Pyramid. Then, using Plotly (or Bokeh, etc.), you will be able to convert matplotlib figures directly into interactive (HTML+JS) plots.

Performance monitoring/profiling for python server process (similar to New Relic) [closed]

Closed 6 years ago.
Is there a tool/service that can automatically and perpetually instrument and profile python server processes? I'm thinking about processes like Celery or RQ workers? I'd like to get method-level performance timers averaged across multiple similar job executions.
New Relic will do this for Celery, but it only has experimental support for RQ. Unfortunately, it's not recommended for short-lived tasks, and we have lots of those.
I'm aware of cProfile and line_profiler, but I'm hoping to find a service I can use in production where I don't have to capture the output and aggregate it myself. While a permanent service/tool would be preferred, if there's a tool that will aggregate the output of multiple cProfile runs that might work as well.
BTW the processes are running on Heroku non-web worker dynos.
Have you tried AppDynamics? It's an APM like New Relic, but it supports Python (http://www.appdynamics.com/python/).
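On the "aggregate the output of multiple cProfile runs" fallback mentioned in the question: the stdlib `pstats.Stats.add()` merges dump files from separate runs, summing call counts and times. A sketch (the `task` function and file layout are illustrative, not a worker framework integration):

```python
import cProfile
import os
import pstats
import tempfile

def task(n):
    """Stand-in for one short-lived worker job."""
    return sum(i * i for i in range(n))

# Simulate two separate job executions, each dumping its own profile file.
paths = []
for run in range(2):
    path = os.path.join(tempfile.mkdtemp(), f"run{run}.prof")
    cProfile.runctx("task(10000)", globals(), locals(), filename=path)
    paths.append(path)

# Merge the dumps: timers are summed across runs, giving per-method
# totals averaged over many similar job executions.
stats = pstats.Stats(paths[0])
for p in paths[1:]:
    stats.add(p)
stats.sort_stats("cumulative")
total_calls = stats.total_calls
# stats.print_stats(10) would show the merged top-10 by cumulative time.
```

Each worker would need to dump to a shared location (awkward on Heroku's ephemeral filesystem, so the dumps would have to be shipped to S3 or similar), which is exactly why a hosted APM is more convenient.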

How to upload/publish products to Amazon via Amazon MWS API? [closed]

Closed 7 years ago.
I heard the SubmitFeed API is for adding products, but I didn't find any examples.
By the way, I need a Python solution.
Thanks a lot.
The general gist of it is: you use SubmitFeed to send your product list, then you must check the status of the submission, and once the submission is complete you can get the results. You have to repeat these steps for images, pricing, and availability.
It's a bit of a pain to get started with. Amazon supplies a LOT of useful information, but it is scattered everywhere and not particularly easy to understand at first. Experiment with just adding products to your inventory and go from there. Make use of the Scratchpad too; it's a very handy tool indeed.
As for Python, I can't help you there I'm afraid, but I think there is sample code in the Python download available from Amazon.
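The submit-then-poll flow described above can be sketched in Python as a generic polling helper. Note `get_status` here is a hypothetical wrapper around the real MWS calls (GetFeedSubmissionList after a SubmitFeed), not an actual MWS client; the `_IN_PROGRESS_` / `_DONE_` strings are MWS feed processing statuses:

```python
import time

def wait_for_feed(get_status, submission_id, poll_seconds=30,
                  max_polls=20, sleep=time.sleep):
    """Poll until the feed submission finishes. `get_status` stands in for
    an MWS GetFeedSubmissionList call returning the processing status;
    once it reports _DONE_, GetFeedSubmissionResult can fetch the report."""
    for _ in range(max_polls):
        status = get_status(submission_id)
        if status == "_DONE_":
            return status
        sleep(poll_seconds)
    raise TimeoutError(f"feed {submission_id} not done after {max_polls} polls")

# Fake status source standing in for the real API, for illustration only:
answers = iter(["_IN_PROGRESS_", "_IN_PROGRESS_", "_DONE_"])
result = wait_for_feed(lambda sid: next(answers), "1234567890",
                       sleep=lambda s: None)
```

You would run this loop once per feed type (products, images, pricing, availability), exactly as the answer describes.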
