I'm having a problem setting up inter-process communication for my Python application. I have two Python scripts at hand, let's say A and B. A opens a huge file, keeps it in memory and does some processing that MySQL can't do, and B is a process that queries A very often.
Since the file A needs to read is really large, I want to read it only once and keep it in memory, waiting for my B processes to query it.
What I do now is use CherryPy to build an HTTP server. However, that feels awkward, since what I'm trying to do is entirely local. So I'm wondering: is there a more natural way to achieve this?
I don't know much about TCP/sockets, etc. If possible, toy examples would be appreciated (please include the part that reads the file).
Python has good support for ZeroMQ, which is much easier and more robust than using raw sockets.
The ZeroMQ site treats Python as one of its primary languages and offers copious Python examples in its documentation. Indeed, the example in "Learn the Basics" is written in Python.
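As a rough sketch of the A/B setup you describe, using pyzmq's REQ/REP socket pair (the file path, port, and the line-number query protocol below are just placeholder assumptions):

```python
# server_a.py -- process A: load the huge file once, then answer queries forever.
import zmq

# Read the file once and keep it in memory (path is a placeholder).
with open("huge_file.txt") as f:
    lines = f.read().splitlines()

context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://127.0.0.1:5555")

while True:
    query = socket.recv_string()          # here: a line number, as text
    try:
        socket.send_string(lines[int(query)])
    except (ValueError, IndexError):
        socket.send_string("not found")
```

```python
# client_b.py -- process B: query A as often as you like.
import zmq

context = zmq.Context()
socket = context.socket(zmq.REQ)
socket.connect("tcp://127.0.0.1:5555")

socket.send_string("42")                  # ask for line 42
print(socket.recv_string())
```

The REP socket answers exactly one request at a time, which keeps the example simple; if many B processes hammer A concurrently you could move to a ROUTER/DEALER pattern, but the basic idea stays the same.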
Is it a bad idea to mix programming languages? For example, having a Node.js server that sends some data to a Python program, which then goes off and does other things with it. This is pretty vague, but what is the best way to send data between different programs? Or is this just a terrible idea that I shouldn't even consider? Sockets?
I don't see why this is harmful. Polyglot programming is common these days: jQuery and JavaScript on clients, Java or .NET on servers, etc.
It's common to write web services in Java or .NET. It shouldn't bother anyone to mix and match.
Use the best tool for the job.
It's not a bad idea. As a matter of fact, it's often a necessity, especially with the two languages you've mentioned. It's often necessary to have a client-side language and then a separate server-side language. They have different purposes and they're both needed.
As for passing data between them: it's generally not a good idea to mix languages if you need to pass a lot of data back and forth. I'd need more information on your situation to be more specific.
The main problem is that you can't reuse code as easily. This means for example that you might have to duplicate things like the ORM mapping for each language you are using.
The answer is "it depends", but in general there's nothing wrong with it, no.
Many times it's actually beneficial to do so for ease of development and maintenance, since some languages are simply better suited to certain tasks.
Other times, depending on the data you're handling, the communication barrier between languages can be awkward.
And yet other times, you actually end up making your own separate languages (e.g. configuration files) because you simply don't like the available programming languages for a task.
Your general question is a bit vague, as it's unclear what you're trying to do specifically. However, if you're simply trying to get programs written in different languages to communicate with each other, you should consider an RPC library.
Thrift, originally developed by Facebook, is one potential (and pretty good IMO) option: http://thrift.apache.org/
I recall when I first read Pragmatic Programmer that they suggested using scripting languages to make you a more productive programmer.
I am in a quandary putting this into practice.
I want to know specific ways that using Python or Ruby can make me a more productive .NET developer.
One specific way per answer, and even better if you can say whether I could use Python or Ruby or Both for it.
See standard format below.
IronPython / IronRuby
IronPython in Action will do a better job of explaining this (and exactly how best to use IronPython) than can possibly be accommodated in an SO answer. I'm biased -- I was a tech reviewer and am a friend of one of the authors -- but I objectively think it's a great book. (No idea if IronRuby is blessed with a similarly wonderful book yet.)
As you want "one specific way per answer" (incompatible with SO, which STRONGLY discourages a poster posting 25 different answers if they have 25 "specific ways" to indicate...!-): prototyping in order to explore some specific assembly or collection thereof that you're unfamiliar with (to check if you've understood their docs right and how to perform certain tasks) is an order of magnitude more productive in IronPython than in C#, as you can explore interactively and compilation is instantaneous and as-needed. (Have not tried IronRuby but I'll assume it can work in a roughly equivalent way and speed).
Less Code
I think productivity is a direct result of how proficient you are in a specific language. That said, the terseness of a language like Python might save some time getting certain things done.
If I compare how much less code I have to write for simple administration scripts (e.g. clean-up of old files) with the equivalent .NET code, there is a certain productivity gain. (Plus it is more fun, which also helps get the job done.)
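For example, a clean-up script of the kind mentioned above fits in a handful of lines; the directory path and the 30-day cutoff are arbitrary placeholders:

```python
# Delete files older than 30 days from a directory.
import os
import time

directory = "/var/tmp/reports"             # placeholder path
cutoff = time.time() - 30 * 24 * 3600

for name in os.listdir(directory):
    path = os.path.join(directory, name)
    if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
        os.remove(path)
```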
Advanced Text Processing
These are traditional strengths of awk and Perl. You can just glue together a bunch of regular expressions to create a simple data-mining tool on the fly.
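As a quick illustration, here is the kind of throwaway regex glue this refers to; the log format is invented:

```python
# Pull timestamps and error codes out of a log file.
import re

pattern = re.compile(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}).*ERROR (\d+)")

with open("app.log") as log:
    for line in log:
        match = pattern.search(line)
        if match:
            timestamp, code = match.groups()
            print(timestamp, code)
```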
Learning a new language gives you knowledge that you can bring back to any programming language. Here are some things you'd learn.
Add functionality to your objects on the fly.
Mix in modules.
Pass a chunk of code around.
Figure out how to do more with less code: ruby -e "puts 'hello world'"
C# can do some of these things, but a fresh perspective might bring you one step closer to automating your breakfast.
Embedding a script engine
Use IronPython as a scripting engine inside your .NET application, for example to let end-users of your application change its customizable parts with a full-fledged language such as Python.
A possible example might be to expose custom logic to end-users of a workflow engine.
Quick Prototyping - Both
In the simplest cases, firing up a Python interpreter and writing a line or two is way faster than creating a new project in Visual Studio.
And you can use Ruby too. Or Lua, or even Perl, whatever. The point is the implicit typing and the lightweight feel.
Cross platform
Compared to .NET, a simple Python script is more easily ported to other platforms such as Linux. Although it's possible to achieve the same with the likes of Mono, it's simpler to just run a Python script file on different platforms.
Processing received Email
Python has built-in support for POP3 and IMAP, whereas the standard .NET Framework doesn't. This is useful for automating email-triggered tasks.
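A minimal sketch with the standard-library poplib module; the host, account and password are placeholders (imaplib works along similar lines):

```python
# Fetch and print all messages from a POP3 mailbox.
import poplib

mailbox = poplib.POP3_SSL("pop.example.com")
mailbox.user("me@example.com")
mailbox.pass_("secret")

num_messages = len(mailbox.list()[1])
for i in range(num_messages):
    # retr() returns (response, lines, octets); each line is raw bytes.
    for line in mailbox.retr(i + 1)[1]:
        print(line.decode("utf-8", errors="replace"))

mailbox.quit()
```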
I am considering programming the network-related features of my application in Python instead of using the C/C++ API. The intended use of the networking is to pass text messages between two instances of my application, similar to a game passing player positions over the network as often as possible.
Although the Python socket module seems sufficient and mature, I want to check whether there are limitations of the module that could be a problem at a later stage of development.
What do you think of the Python socket module:
Is it reliable and fast enough for production-quality software?
Are there any known limitations that could become a problem if my app needs more complex networking than regular client-server messaging?
Thanks in advance,
Paul
Check out Twisted, a Python networking engine. It has built-in support for TCP, UDP, SSL/TLS, multicast, Unix sockets, and a large number of protocols (including HTTP, NNTP, IMAP, SSH, IRC, FTP, and others).
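To give a feel for the API, here is a minimal TCP echo server in Twisted (the port number is arbitrary):

```python
# A tiny Twisted TCP echo server.
from twisted.internet import protocol, reactor

class Echo(protocol.Protocol):
    def dataReceived(self, data):
        # Send every received chunk straight back to the client.
        self.transport.write(data)

class EchoFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return Echo()

reactor.listenTCP(8000, EchoFactory())
reactor.run()
```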
Python is a mature language that can do almost anything that you can do in C/C++ (even direct memory access if you really want to hurt yourself).
You'll find that you can write beautiful code in it in a very short time, that this code is readable from the start and that it will stay readable (you will still know what it does even after returning one year later).
The drawback of Python is that your code will be somewhat slow. "Somewhat" as in "might be too slow for certain cases". So the usual approach is to write as much as possible in Python because it will make your app maintainable. Eventually, you might run into speed issues. That would be the time to consider to rewrite a part of your app in C.
The main advantages of this approach are:
You already have a running application. Translating the code from Python to C is much simpler than writing it from scratch.
You already have a running application. After the translation of a small part of Python to C, you just have to test that small part and you can use the rest of the app (that didn't change) to do it.
You don't pay a price upfront. If Python is fast enough for you, you'll never have to do the optional optimization.
Python is much, much more expressive than C: a single line of Python can easily do the work of dozens or even hundreds of lines of C.
To answer #1, I know that among other things, EVE Online (the MMO) uses a variant of Python for their server code.
The Python that EVE Online uses is Stackless Python (http://www.stackless.com/), and as far as I understand they use it for the way it implements lightweight threading through tasklets and the like. But since Python itself can handle something like an MMO with 40k people online, I think it can handle just about anything.
This is a bad answer and not really an answer to your question, rather an addition to the previous answer.
Alan.
I want to start writing an HTTP proxy that will modify responses according to some rules/filters I will configure. However, before I start coding it, I want to make sure I'm making the right choice in going with Python. Later, this tool will have to process a lot of requests, so I would like to know that I can count on it to perform when push comes to shove.
As long as the bulk of the processing uses Python's built-in modules, it should be fine as far as performance goes. The biggest strength of Python is its clear syntax and ease of testing/maintainability. If you find that one section of your code is slowing down the process, you can rewrite that section as a C module while keeping the bulk of your control code in Python.
However, if you're looking to write the most optimized Python code, you may want to check out this SO post.
Yes, I think you will find Python to be perfectly adequate for your needs. There's a huge number of web frameworks, WSGI libraries, etc. to choose from, or learn from when building your own.
There's an interesting post on the Python History blog about how Python was supporting high performance websites in 1996.
This will depend on the library you use more than the language itself. The twisted framework is known to scale well.
Here's a proxy server example in Python/Twisted to get you started.
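For a sense of scale, a bare-bones forwarding proxy on top of twisted.web.proxy looks roughly like the sketch below; port 8080 is arbitrary, and your response-rewriting rules would be layered on top:

```python
# A minimal HTTP forwarding proxy using Twisted.
from twisted.internet import reactor
from twisted.web import http, proxy

class ProxyFactory(http.HTTPFactory):
    protocol = proxy.Proxy

reactor.listenTCP(8080, ProxyFactory())
reactor.run()
```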
Bottomline: choose your third party tools wisely and I'm sure you'll be fine.
Python performs pretty well for most tasks, but you'll need to change the way you program if you're used to other languages. See Python is not Java for more info.
If plain old CPython doesn't give the performance you need, you have other options as well.
As has been mentioned, you can extend it in C (using a tool like SWIG or Pyrex). I also hear good things about PyPy, but bear in mind that its toolchain is built on RPython, a restricted subset of Python. Lastly, a lot of people use Psyco to speed up performance.
I want to do some work on the network front: pinging numerous computers on a LAN and retrieving data about the response time. Which would be more useful and productive to work with: Perl or Python?
I agree that which programming language you use is pretty subjective; essentially I would rather get the job done as quickly and efficiently as possible while keeping it supportable, so it depends on your infrastructure...
Can I suggest that you look at Nagios rather than re-inventing the wheel yourself?
While Nagios might have a steeper learning curve in terms of configuration, it will be worth it in the long run, and if you can't find a plugin to suit your requirements, it is easy to write your own. Joel Spolsky has written an interesting article on this.
Well, I work in both Perl and Python, and my day job is supporting network monitoring software. Most of the important points have already been covered, but I'll consolidate/reiterate here:
Don't reinvent the wheel - there are dozens of network monitoring solutions that you can use to perform ping tests and analyze collected data. See for example
Nagios
Zenoss
OpenNMS
PyNMS
If you insist on doing this yourself, it can be done in either Perl or Python - use the one you know best. If you're planning on parsing a lot of text, it will be easier to do it "quick and dirty" in Perl than in Python. Both can do it, but Python's re module is more verbose than Perl's inline regex syntax.
Use libraries - many, many people have done this task before you so look around for a suitable lib like Net::Ping in Perl or the icmplib in Python or this ping.py code.
Use threads or asynchronous pings - otherwise pinging is going to take forever; see, for example, this recipe that uses threads to run pings simultaneously. This is particularly easy to do in Python with either approach, so this is one place where Python will be easier to work with than Perl, IMO.
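A rough Python sketch of the threaded approach, shelling out to the system ping command; the address range is made up, and the -c/-W flags are the Linux variants, so they would need tweaking on other platforms:

```python
# Ping a range of LAN addresses in parallel and report which ones answer.
import subprocess
from concurrent.futures import ThreadPoolExecutor

hosts = ["192.168.1.%d" % i for i in range(1, 21)]

def ping(host):
    # One echo request with a one-second timeout; returncode 0 means a reply.
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return host, result.returncode == 0

with ThreadPoolExecutor(max_workers=len(hosts)) as pool:
    for host, alive in pool.map(ping, hosts):
        print(host, "up" if alive else "down")
```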
Go with Perl.
You'll have access to a nice ping module, Net::Ping, and storing the results in a database is pretty easy.
Either one should work just fine. If you don't have experience with either, flip a coin. No language is inherently productive; languages allow people to be productive. Different people will benefit differently from different languages.
In general, though, when you know your specific task and need to choose a tool, look for the libraries that would make your life easy. For Perl, check out the Comprehensive Perl Archive Network (CPAN). There are modules for just about every networking thing you might need.
Python probably has very similar tools and libraries; I just don't know what they are.
I know Perl better than Python, so my choice would fall on Perl. That said, I'd argue that for low-level tasks (like pinging computers on a network) they are roughly equivalent. Python may have better object-oriented support, but for scripting (which happens to be what you need) the power of Perl is quite obvious. The large pool of tested modules (some of them even object-oriented) that you find on CPAN can usually do everything you need, and they can even scale well if you use them appropriately.
I don't know Python, so I can't comment on what it offers, and I agree with those who suggest Nagios or other existing systems.
However, if you decide to roll your own system with Perl, consider using POE. POE is a cooperative multitasking and networking framework.
POE has a steep learning curve, but you will be repaid for your effort very quickly. POE will provide a solid foundation to build on. Much of the client code you will need is already available on CPAN.
Whichever you know better or are more comfortable using. They both can do the job and do it well, so it is your preference.
Right now I've experimented with the approach of creating some simple unit tests for network services using various TAP libraries (mainly bash+netcat+curl and Perl). The advantage is that you write a single script that you can use for both unit and network testing.
The display is done via TAP::Harness::HTML.
I'd say that if you need something quick and dirty that's up and running by this afternoon, then Perl is probably the better language.
However, for developing a solid application that's easy to maintain and extend, and that you can build on over time, I'd go with Python.
This is of course assuming you know both languages more or less equally well.