For example, http://developer.apple.com/cocoa/pyobjc.html is still for OS X 10.4 Tiger, not 10.5 Leopard. And that's the official Apple documentation for it.
The official PyObjC page is equally bad: http://pyobjc.sourceforge.net/
It's so bad it's baffling. I'm considering learning Ruby primarily because the RubyCocoa stuff is so much better documented, there are lots of decent tutorials (http://www.rubycocoa.com/ for example), and because of the Shoes GUI toolkit.
Even this badly auto-translated Japanese tutorial is more useful than the rest of the documentation I could find.
All I want to do is create fairly simple Python applications with Cocoa GUIs.
Can anyone shed light on the horrible documentation, or point me at some tutorials that don't just give you huge blocks of code and assume you know what NSThread.detachNewThreadSelector_toTarget_withObject_("queryController", self, None) does?
The main reason for the lack of documentation for PyObjC is that there is one developer (me), and like most developers I don't particularly like writing documentation. Because PyObjC is a side project for me, I tend to focus on features and bugfixes, because that's more interesting to me.
The best way to improve the documentation is to volunteer to help on the pyobjc-dev mailing list.
As an aside: the pythonmac-sig mailing list (see Google) is an excellent resource for getting help with Python on Mac OS X (not just PyObjC).
I agree that that tutorial is flawed, throwing random, unexplained code right in front of your eyes. It introduces concepts such as the autorelease pool and user defaults without explaining why you would want them ("Autorelease pool for memory management" is hardly an explanation).
That said…
basically all I want to do is write Cocoa applications without having to learn ObjC.
I'm afraid that for the time being, you will need a basic grasp of ObjC in order to benefit from any language that uses Cocoa. PyObjC, RubyCocoa, Nu and others are niche options at best, and all of them were developed by people intimately familiar with the ins and outs of ObjC and Cocoa.
For now, you will benefit the most if you realistically see those bridges as useful where scripting languages truly shine, rather than trying to build a whole application with them. While this has been done (LimeChat, the RubyCocoa-written app I'm using right now, is one example), it is rare and likely will be for a while.
To be blunt:
If you want to be an effective Cocoa programmer, you must learn Objective-C. End of story.
Neither Python nor Ruby is a substitute for Objective-C via their respective bridges. You still have to understand the Objective-C APIs, the behaviors inherent to NSObject-derived classes, and many other details of Cocoa.
PyObjC and RubyCocoa are a great way to access Python or Ruby functionality from a Cocoa application, including building a Cocoa application mostly -- if not entirely -- in Python or Ruby. But success therein is founded upon a thorough understanding of Cocoa and the Objective-C APIs it is composed of.
Tom's and Martin's responses are definitely true (in just about any open source project, you'll find that most contributors are particularly interested in, well, developing; not so much in semi-related matters such as documentation), but I don't think your particular question at the end would fit well inside the PyObjC documentation.
NSThread.detachNewThreadSelector_toTarget_withObject_("queryController", self, None)
NSThread is part of the Cocoa API, and as such documented over at Apple, including the particular method + detachNewThreadSelector:toTarget:withObject: (I'd link there, but apparently stackoverflow has bugs with parsing it). The CocoaDev wiki also has an article.
I don't think it would be a good idea for PyObjC to attempt to document Cocoa, other than a few basic examples of how to use it from within Python. Explaining selectors is also likely outside the scope of PyObjC, as those, too, are a feature of Objective-C, not PyObjC specifically.
I stumbled across a good tutorial on PyObjC/Cocoa:
http://lethain.com/entry/2008/aug/22/an-epic-introduction-to-pyobjc-and-cocoa/
All I want to do is create fairly simple Python applications with Cocoa GUIs. Can anyone shed light on the horrible documentation, or point me at some tutorials that don't just give you huge blocks of code and assume you know what NSThread.detachNewThreadSelector_toTarget_withObject_("queryController", self, None) does?
[...]
basically all I want to do is write Cocoa applications without having to learn ObjC.
Although I basically agree with Soeren's response, I'd take it even further:
It will be a long time, if ever, before you can use Cocoa without some understanding of Objective C. Cocoa isn't an abstraction built independently from Objective C, it is explicitly tied to it. You can see this in the example line of code you quoted above:
NSThread.detachNewThreadSelector_toTarget_withObject_("queryController", self, None)
This is the Python way of writing the Objective C line:
[NSThread detachNewThreadSelector:@selector(queryController:) toTarget:self withObject:nil];
Now, it's important to notice here that this line can be seen in two ways: (1) as a line of Objective C, or (2) as an invocation of the Cocoa frameworks. We see it as (1) by the syntax. We see it as (2) by recognizing that NSThread is a Cocoa class which provides a set of handy features. In this case, this particular Cocoa class is making it easy for us to have an object start doing something on a new thread.
But the kicker is this: the Cocoa class here (NSThread) is providing us this handy service in a way that is explicitly tied to the language the framework has been written in. Namely, NSThread gave us a feature that explicitly refers to "selectors". Selectors are, in point of fact, the name for something fundamental about how Objective C works.
So there's the rub. Cocoa is fundamentally an Objective-C creation, and its creators have built it with Objective C in mind. I'm not claiming that it's impossible to translate the interface to the Cocoa features into a form more natural for other languages. It's just that as soon as you change the Cocoa framework to stop referring to "selectors", it's not really the Cocoa framework any more. It's a translated version. And once you start going down that road, I'm guessing things get really messy. You're trying to keep up with Apple as they update Cocoa, maybe you hit some parts of Cocoa that just don't translate well into the new language, whatever. So instead, things like PyObjC opt to expose Cocoa directly, in a way that has a very clear and simple correlation. As they say in the documentation:
In order to have a lossless and unambiguous translation between Objective-C messages and Python methods, the Python method name equivalent is simply the selector with colons replaced by underscores.
Sure, it's a bit ugly, and it does mean you need to know something about Objective-C, but that's because the alternative, if one truly exists, is not necessarily better.
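To make the colon-to-underscore rule concrete, here is a minimal sketch, assuming PyObjC is installed. The Foundation classes and selectors used are standard ones; the surrounding script itself is just an illustration:

from Foundation import NSMutableDictionary, NSString

# One colon in the selector becomes one trailing underscore:
#   [NSString stringWithString:@"hello"]
s = NSString.stringWithString_("hello")
print(s)

# Several colons become several underscores, one per argument:
#   [dict setObject:@"value" forKey:@"key"]
d = NSMutableDictionary.dictionary()
d.setObject_forKey_("value", "key")
print(d)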
I didn't know anything at all about Objective C or Cocoa (but plenty about Python), yet I am now writing a rather complex application in PyObjC. How did I learn? I picked up Cocoa Programming for Mac OS X and went through the whole book (a pretty quick process) using PyObjC. Just ignore anything about memory management and you'll pretty much be fine. The only caveat is that very occasionally you have to use a decorator like endSheetMethod (actually I think that's the only one I've hit):
from PyObjCTools import AppHelper

@AppHelper.endSheetMethod
def alertEnded_code_context_(self, alert, choice, context):
    pass
This answer isn't going to be very helpful, but as a developer I hate doing documentation. This being an open source project, it's hard to find people to do documentation.
Tom says it all really. Lots of open source projects have dedicated developers and few who are interested in documenting. It isn't helped by the fact that goalposts can shift on a daily basis which means documentation not only has to be created, but maintained.
Is it feasible to script in a Lisp, as opposed to Ruby/Python/Perl/(insert accepted scripting language)? By this I mean do things like file processing (open a text file, count the number of words, return the nth line), string processing (reverse, split, slice, remove punctuation), prototyping/quick computations, and other things you would normally use Python, etc. for. How productive would doing such tasks in a Lisp be, as opposed to Ruby/Python/Perl/scripting language of choice?
I ask because I want to learn a Lisp but also use it to do something instead of only learning it for the sake of it. I looked around, but couldn't find much information about scripting in a Lisp. If it is feasible, what would be a good implementation?
Thank you!
Today, using "LISP" as if it were obvious which language one is talking about is absurd, since it hasn't been a single language since the 70's or probably even earlier. "LISP" only indicates fully parenthesized Polish prefix notation, just as Pascal, Ruby, Python and Perl are all variations of ALGOL.
Scheme is a standard and Common LISP is a standard. Both of those are general purpose, though Common LISP is batteries-included while Scheme is a minimalistic language. They are quite different in style, so comparing them would be like comparing Java with Python.
Embedded LISPs
There are lots of uses of Scheme and specialized LISP dialects as embedded languages. Emacs is the most widely used editor in the Unix world, and its Lisp, Elisp, is the most used Lisp dialect because of this. The image processing application GIMP embeds a Scheme with extensions for image processing.
Stand alone scripts
It's possible in many Common LISP implementations to use the standard #!-notation to make a script executable and run it as an application. E.g. I use CLISP and have scripts with #!/usr/bin/clisp -C as the first line. I also use Scheme the same way, and in the very fast incremental compiler Ikarus you use #!/usr/bin/ikarus --r6rs-script. Clojure has all the power of the Java libraries, you can use your own classes from it, and it too can be made into a script with #!/usr/bin/env java -cp /path/to/clojure-1.2.0.jar clojure.main
More permanent applications
In Common LISP you can dump an image. It will be a Common Lisp binary with your code already compiled in. Many Scheme implementations have native compilation, and Clojure can compile to Java bytecode (though that's not the most common way to use it). Still, I have seen Ikarus sometimes interpret faster than a compiled executable from Racket, Chicken or Gambit, so when using Scheme I often do my programming in DrRacket and run it in Ikarus.
Try both Common LISP and Scheme, as both of them are good enough for the tasks you specified in your question. There are many free books on the subject, and some are worth their price as well. You may also try Racket, which is a Scheme derivative with lots of libraries for everyday tasks, but it doesn't conform to any standard.
About productivity
I imagine you are referring to how quickly you can write a certain task in a Lisp dialect. I imagine it depends on how used you are to the syntax; it takes a while to get used to it after only knowing ALGOL dialects. It takes a different approach as well, as you need to think in a more functional manner, especially for Scheme. I imagine that once you are as good in Scheme as in your favorite ALGOL dialect, productivity will be similar. E.g. some ALGOL dialects are faster to prototype in than others, and that is true for Lisp dialects as well.
When I first started to poke around Lisp I used it to write shell scripts... I'm somewhat OCD about order and uniformity and I really liked Lisp languages because they have saner syntax (fewer syntax rules, no random decisions related to particular syntax elements).
If you are looking into Common Lisp, then SBCL, available from any Linux distro's package repository, is ready right away for CGI scripting. SBCL also has its own means of processing command line arguments, accessing pipes, processes and so on. If you aren't after portability between different Lisps, then I'd say you are good to go. Just to give you examples of where I used this kind of scripting: a girl in our office compiled and maintained a list of words which I had to further process in our application. This list was available as a Google Docs spreadsheet. My script would download the words table and parse it into the format I needed. I had scripts that helped me with file manipulation and preprocessing before the project was compiled (the project wasn't in Lisp).
Finally, SBCL has its own means of being used with FastCGI (http://kdr2.com/project/sb-fastcgi.html), but of course there are several full-blown HTTP servers which you could either use as is or put behind a proxy. Hunchentoot has historically been the most popular one, but there are others too, like cl-http; here are some more links: http://www.cliki.net/web
#!/usr/bin/env sbcl --script
would be the shebang comment to use.
Furthermore, I use Common Lisp for my classes, simply to do my homework, which is, I imagine, what you were after when you said "open a text file, count the number of words, return the nth line". Here's an example I used for the class on introduction to logic: https://github.com/wvxvw/coursera-logic/blob/master/formula.lisp .
Since there was a discussion on the matter, here's something else to be considered.
Typically scripting languages have a twofold nature: they are written in a low-level language while exposing a high-level API to the programmer. This can be a blessing or a curse. On one hand, languages like Python, Ruby and JavaScript have highly optimised libraries for dealing with common tasks, while Common Lisp typically, similarly to C++ or Java, implements everything in Lisp. Thus, for example, strings are a lot less sophisticated in CL than they are in JavaScript: in CL they are simple arrays of characters, while in many JavaScript engines they are implemented as special trees known as ropes.
Typically, a programmer writing in a scripting language is neither required nor expected to produce high-performance code. The language compensates with highly optimized base-level library code. Unfortunately, once the programmer actually wants to squeeze as much performance as possible out of the scripting language, it turns out they just don't have the tools to do it, because they can't get to the bottom of the implementation.
On the other hand, dealing with the lower level of details means that the programmer will be less productive overall and will need more skill, because optimizing the code to get on par with industry-standard implementations is hard.
Common Lisp generally falls into the second category, but I'd argue it is still good for one-liners and casual programming because of its extensive library and highly developed macro system, which lets you cut down the verbosity usually associated with low-level languages.
I understand that you have asked two questions:
Is it possible to run scheme scripts from a command line
How effective is that?
I can answer your first question, but not your second. How feasible it is depends heavily on your Scheme skills and on how much code you want to (re)write in Scheme.
So I'll just answer your first question :)
Have a look at this SO question: Running Scheme from the command line.
If you have installed, for example, DrRacket (which is a good IDE for many scheme dialects) as your scheme interpreter, you may use the shebang line #! /usr/bin/env mzscheme in your scheme script.
This test script (test.scm)
#!/usr/bin/env mzscheme
#lang scheme
(print (+ 40 2))
can be made executable (with chmod +x test.scm) and executed (with ./test.scm).
I'd say that Lisp/Scheme could be used to write small scripts or big applications, but they are not yet ready for wide use.
The big difference between Python/Ruby and Scheme is that Python has a huge library of modules centralized in one place. Ruby is quite similar to Python with RubyGems.
Scheme, on the other hand, has a small library of modules scattered across the internet, and the quality of those modules doesn't always compare to the popular modules in Python and Ruby.
One could say that they are aiming at different goal but I'd say scheme just got old and people started to forget about it and how it could be used as a tool instead of just a school subject.
About Lisp, I can't really say. But from your description, it's possible to write the scripts you'd like to write; if you need something specific, though, it may not be there and you'll have to write it yourself.
All I can say, is jump in. And become someone who gives a future to this language. Don't be scared. This language has a bright future and you'll learn a lot from it.
WRT your tasks, what about using Emacs, which comes with an interactive Python shell? Then you have the convenience of editing alongside running scripts.
We have an in house developed web-based admin console that uses a combination of C CGI and Perl scripts to administer our mail server stack. Of late we have been thinking of cleaning up the code (well, replacing most of it), making the implementation more secure, and improving the overall behavior.
I don't have much programming knowledge, but I use Ruby on and off (mainly for writing erb templates), and hence was thinking of using ruby/rails for developing such an app (off-duty for now, I also need to learn stuff !).
Before blindly picking up a language though, what would you folks suggest? Please let me know if this is too vague a question; I'll try to supply more information if needed.
Have you considered writing your applications as Webmin modules?
You get a lot of stuff for free when you do so (users and groups, tons of security features, a pretty big variety of helper functions related to config files, and tons of existing code for most aspects of a UNIX/Linux system). You also get a lot of stuff for nearly free, like action logging, packages and updates via wbm or apt or yum, an online help system, etc.
There are some cons, as well. It's an old codebase, so it has some clunky bits in the API among other places. A lot of the old modules can be a bit hard to grok if you're not an old-school Perl programmer. But, it's a well-maintained codebase, and it's been banged on by millions of users for over a dozen years. It's pretty robust. The UI isn't beautiful, but it is relatively theme-able, and if you're distributing a minimized version it becomes easier to customize the UI.
I suspect you can be up and running a lot faster than starting from scratch or using most existing frameworks that aren't targeted specifically to building systems management interfaces the way Webmin is.
Also, it's BSD licensed, so you can do whatever you want with it, including building a custom commercial app with it (hundreds of companies have done so over the years).
If you already know a bit of ruby, then there's no reason not to use that.
If you're interested specifically in learning another language, then what you're trying to do could be done in pretty much any language/framework, it's just a matter of which one you want to learn.
Without knowing much about your existing application I'd say that this effectively boils down to "which language do you like to work with?".
Python and Ruby are both mature languages with ample library infrastructure. They also boast popular, similar web application frameworks, namely Django and Ruby on Rails respectively.
Since you are porting an existing Perl app(lets) it may be worthwhile to note that Ruby is relatively more similar to Perl. Not surprising given that Ruby was influenced "primarily by Perl, Smalltalk, Eiffel and Lisp".
Django has a nice admin interface.
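For what it's worth, "nice admin interface" means roughly this: you register a model and Django generates the CRUD screens for it. A minimal sketch of an app's admin.py (the MailDomain model is hypothetical, standing in for whatever your mail stack manages):

from django.contrib import admin
from myapp.models import MailDomain  # hypothetical model for the mail stack

# One line per model; Django builds list, search, add and edit pages for it.
admin.site.register(MailDomain)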
We need to write simple scripts to manipulate the configuration of our load balancers (ie, drain nodes from pools, enabled or disable traffic rules). The load balancers have a SOAP API (defined through a bunch of WSDL files) which is very comprehensive but using it is quite low-level with a lot of manual error checking and list manipulation. It doesn't tend to produce reusable, robust code.
I'd like to write a Python library to handle the nitty-gritty of interacting with the SOAP interface but I don't really know where to start; all of my coding experience is with writing one-off monolithic programs for specific jobs. This is fine for small jobs but it's not helping me or my coworkers -- we're reinventing the wheel with a different number of spokes each time :~)
The API already provides methods like getPoolNames() and getDrainingNodes() but they're a bit awkward to use. Most take a list of nodes and return another list, so (say) working out which virtual servers are enabled involves this sort of thing:
names = conn.getVirtualServerNames()
enabled = conn.getEnabled(names)
for i in range(0, len(names)):
    if (enabled[i]):
        print names[i]
conn.setEnabled(['www.example.com'], [0])
Whereas something like this:
lb = LoadBalancer('hostname')
for name in [vs.name for vs in lb.virtualServers() if vs.isEnabled()]:
    print name
www = lb.virtualServer('www.example.com').disable()
is more Pythonic and (IMHO) easier.
There are a lot of things I'm not sure about: how to handle errors, how to deal with 20-odd WSDL files (a SOAPpy/suds instance for each?) and how much boilerplate translation from the API methods to my methods I'll need to do.
This is more an example of a wider problem (how to learn to write libraries instead of one-off scripts) so I don't want answers to these specific questions -- they're there to demonstrate my thinking and illustrate my problem. I recognise a code smell in the way I do things at the moment (one-off, non-reusable code) but I don't know how to fix it. How does one get into the mindset for tackling problems at a more abstract level? How do you 'learn' software design?
"I don't really know where to start"
Clearly false. You provided an excellent example. Just do more of that. It's that simple.
"There are a lot of things I'm not sure about: how to handle errors, how to deal with 20-odd WSDL files (a SOAPpy/suds instance for each?) and how much boilerplate translation from the API methods to my methods I'll need to do."
Handle errors by raising an exception. That's enough. Remember, you're still going to have high-level scripts using your API library.
20-odd WSDL files? Just pick something for now. Don't overengineer this. Design the API -- as you did with your example -- for the things you want to do. The WSDLs and the number of instances will become clear as you go. One, ten, twenty doesn't really matter to users of your API library. It only matters to you, the maintainer. Focus on the users.
Boilerplate translation? As little as possible. Focus on what parts of these interfaces you use with your actual scripts. Translate just what you need and nothing more.
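To illustrate the "translate just what you need" point, here is a rough sketch of the kind of thin wrapper being discussed. The class and method names mirror the question; the WSDL URL and the use of suds are assumptions for illustration, not a recommendation:

from suds.client import Client

class LoadBalancerError(Exception):
    """Raised when an underlying SOAP call fails."""

class VirtualServer(object):
    def __init__(self, conn, name, enabled):
        self._conn = conn
        self.name = name
        self._enabled = enabled

    def isEnabled(self):
        return self._enabled

    def disable(self):
        # Mirrors the question's conn.setEnabled(['www.example.com'], [0])
        self._conn.service.setEnabled([self.name], [0])
        self._enabled = False

class LoadBalancer(object):
    def __init__(self, hostname):
        # One suds client per WSDL you actually use; start with just one.
        self._conn = Client("https://%s/soap/VirtualServer.wsdl" % hostname)

    def virtualServers(self):
        try:
            names = self._conn.service.getVirtualServerNames()
            enabled = self._conn.service.getEnabled(names)
        except Exception as exc:  # suds raises WebFault plus transport errors
            raise LoadBalancerError(str(exc))
        return [VirtualServer(self._conn, n, e)
                for n, e in zip(names, enabled)]

With something like that in place, the "Pythonic" loop from the question works as written, and the error handling lives in one spot.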
An API is not fixed, cast in concrete, a thing of beauty and a joy forever. It's just a module (in your case a package might be better) that does some useful stuff.
It will undergo constant change and evolution.
Don't overengineer the first release. Build something useful that works for one use case. Then add use cases to it.
"But what if I realize I did something wrong?" That's inevitable, you'll always reach this point. Don't worry about it now.
The most important thing about writing an API library is writing the unit tests that (a) demonstrate how it works and (b) prove that it actually works.
There's an excellent presentation by Joshua Bloch on API design (and thus leading to library design). It's well worth watching. IIRC it's Java-focused, but the principles will apply to any language.
If you are not afraid of C++, there is an excellent book on the subject called "Large-scale C++ Software Design".
This book will guide you through the steps of designing a library by introducing "physical" and "logical" design.
For instance, you'll learn to flatten your components' hierarchy, to restrict dependency between components, to create levels of abstraction.
This is really "the" book on software design, IMHO.
I recall that when I first read The Pragmatic Programmer, they suggested using scripting languages to make you a more productive programmer.
I am in a quandary putting this into practice.
I want to know specific ways that using Python or Ruby can make me a more productive .NET developer.
One specific way per answer, and even better if you can say whether I could use Python or Ruby or Both for it.
See standard format below.
IronPython / IronRuby
IronPython in Action will do a better job explaining this (and exactly how best to use IronPython) than can possibly be accommodated in an SO answer. I'm biased -- I was a tech reviewer and am a friend of one of the authors -- but objectively I think it's a great book. (No idea if IronRuby is blessed with a similarly wonderful book yet.)
Since you want "one specific way per answer" (which sits awkwardly with SO, which strongly discourages a poster from writing 25 different answers for 25 "specific ways"...): prototyping in order to explore some specific assembly or collection thereof that you're unfamiliar with (to check whether you've understood the docs right and how to perform certain tasks) is an order of magnitude more productive in IronPython than in C#, because you can explore interactively and compilation is instantaneous and as-needed. (I haven't tried IronRuby, but I'll assume it can work in a roughly equivalent way and at a similar speed.)
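For example, a throwaway interactive session along these lines (the assembly and members are picked purely for illustration) replaces a whole scratch C# project:

import clr
clr.AddReference("System.Windows.Forms")
from System.Windows.Forms import MessageBox

# Poke at the API interactively instead of compiling a test harness.
print(dir(MessageBox))
MessageBox.Show("Hello from IronPython")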
Less Code
I think productivity is a direct result of how proficient you are in a specific language. That said, the terseness of a language like Python might save some time getting certain things done.
If I compare how much less code I have to write for simple administration scripts (e.g. clean-up of old files) compared to .NET code, there is a certain amount of productivity gain. (Plus it is more fun, which also helps get the job done.)
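For instance, a sketch of the kind of clean-up script meant here, deleting files older than 30 days (the directory and the age threshold are placeholders):

import os
import time

CUTOFF = time.time() - 30 * 24 * 3600  # anything older than 30 days goes

for root, _dirs, files in os.walk("/var/tmp/myapp"):
    for name in files:
        path = os.path.join(root, name)
        if os.path.getmtime(path) < CUTOFF:
            os.remove(path)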
Advanced Text Processing
Traditional strengths of awk and perl. You can just glue together a bunch of regular expressions to create a simple data-mining system on the go.
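For instance, a few lines like these (the log format is invented for the example) already give you a per-module error count:

import re
from collections import Counter

pattern = re.compile(r"ERROR\s+\[(?P<module>[\w.]+)\]")
counts = Counter()

with open("app.log") as log:
    for line in log:
        match = pattern.search(line)
        if match:
            counts[match.group("module")] += 1

for module, n in counts.most_common():
    print("%5d  %s" % (n, module))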
Learning a new language gives you knowledge that you can bring back to any programming language. Here are some things you'd learn.
Add functionality to your objects on the fly.
Mix in modules.
Pass a chunk of code around.
Figure out how to do more with less code: ruby -e "puts 'hello world'"
C# can do some of these things, but a fresh perspective might bring you one step closer to automating your breakfast.
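Rough Python illustrations of the first and third bullets above (toy code; Ruby's versions differ in detail, but the flavour is the same):

# Add functionality to objects on the fly:
class Order(object):
    pass

order = Order()
order.discount = 0.1  # attach data to an instance at runtime
Order.total = lambda self, amount: amount * (1 - self.discount)
print(order.total(100))  # 90.0

# Pass a chunk of code around:
def apply_twice(func, value):
    return func(func(value))

print(apply_twice(lambda x: x + 1, 40))  # 42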
Embedding a script engine
Use IronPython as a scripting engine inside your .NET application, for example to enable end-users of your application to change customizable parts with a full-fledged language such as Python.
A possible example might be exposing custom logic to end-users of a workflow engine.
Quick Prototyping - Both
In the simplest cases, firing up a Python interpreter and writing a line or two is way faster than creating a new project in Visual Studio.
And you can use Ruby too. Or Lua, or even Perl, whatever. The point is the implicit typing and the lightweight feel.
Cross platform
Compared to .NET, a simple Python script is more easily ported to other platforms such as Linux. Although it's possible to achieve the same with the likes of Mono, it's simpler to just run a Python script file on different platforms.
Processing received Email
Python has built-in support for POP3 and IMAP where the standard .NET framework doesn't. Useful for automating email triggered tasks.
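A minimal sketch using the standard library's poplib (the host and credentials are placeholders):

import poplib

conn = poplib.POP3_SSL("pop.example.com")
conn.user("me@example.com")
conn.pass_("secret")

count, _size = conn.stat()  # number of messages, total mailbox size in bytes
print("Messages waiting: %d" % count)
conn.quit()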
I'm a C# .NET developer and I work on mostly ASP.NET projects.
I want to learn a new programming language:
to improve my programming skills by experiencing a new language,
to see something different than the Microsoft environment,
and maybe to think in a different way.
I'm focusing on two languages for this: Python and Ruby.
Which one would you suggest for me?
What are their pros and cons relative to each other?
Is it worth learning them?
EDIT: Sorry, I edited my post but didn't mention it here: Ruby on Rails has been replaced with Ruby.
Both languages are powerful and fun. Either would be a useful addition to your tool box.
Python has a larger community and probably more mature documentation and libraries. Its object-orientation is a little inconsistent and feels (to me, IMHO) like something that was bolted on to the language. You can alter class behaviour at runtime (monkey-patching), but not for the built-in classes, and it's generally frowned upon.
Ruby might be a little more different from your current experience: it has some flavour of Smalltalk (method-calling is more correctly message-sending, for example). Its object-orientation is built in from scratch, all classes are open to modification, and that is an accepted - if slightly scary - practice. The community is smaller, the libraries less mature and the documentation coverage is thinner.
Both languages will have some level of broken backward compatibility in their next major releases, and both have .NET implementations (IronPython is production-ready, IronRuby is getting there). Both have web frameworks that reflect their strengths (search SO for the Django/Rails debate).
If I'd never seen Ruby, I'd be very happy working in Python, and have done so without suffering when necessary. I always found myself wishing I could do the work in Ruby. But that's my opinion, YMMV.
Edit: Come to think of it, and even though it pains me, if you're seeking to leverage your knowledge of the .Net framework, you might be best off looking at IronPython, as it's more mature than the Ruby equivalent.
First... good for you for wanting to broaden your knowledge! Second, you are comparing a language (Python) with a web framework (Ruby on Rails).
I think your best option is to try a few different frameworks in both Python and Ruby, do the same fairly simple task in each, and only then pick which one you'd like to learn more about. Rails is nice for Ruby, but it's not the only one out there. For Python I like Pylons and Django.
Pros and cons: Ruby is a little cleaner, language-wise, than Python. Python has a much larger set of modules.
Is it worth learning? Yes, to both Python and Ruby.
If you're a beginner, I would recommend you try Django if you decide to start learning Python. Of course if you decide Ruby is your choice of flavor, Rails is the obvious way to go. Whichever language you choose, I can assure you it will be a good choice.
Having said that, my personal choice is Python. I like the language, I like the community, and I use Python for almost every occasion. I use it for command-line apps, GUI apps, and I use it for web apps (Django). Oh and I use it for system administration scripts on Windows and Linux as well.
Having said that as well, I would recommend you learn a language like Haskell or Lisp too. That will really open your eyes to a new perspective on programming. Furthermore, since you say you are mostly familiar with the .NET framework, I would really recommend you start with F#, since you'll already be familiar with the libraries and it will make the transition much smoother. Either way, good luck.
It's always valuable to learn a new programming language. And both Python and Ruby are good ones to know. It's important to note that while Python is a language, Ruby on Rails is a framework. IMHO, you should learn Ruby before you learn Rails.
Go try ruby! to see if you like it. If you do, then try Rails. Otherwise, try Python. Both are similarly useful. To me, Ruby is more "fun". If you like Lisp, you'll probably like Ruby. If you like C, you might prefer Python. Try them both!
Rule of thumb - Python if you like strict rules and Ruby if you hate them.
Another one: if you adore JavaScript - Ruby is your choice :)
What? No mention of IronPython?
IronPython is the flagship language of the DLR. It allows you to use all the familiar .NET libraries, but through Python.
I would definitely try Python and IronPython. You'll learn a lot and might even sneak it into your current projects (you can embed an IronPython engine in a .NET application).
If you're looking to learn Ruby on Rails, the guides site has a great guide for getting started and the further guides for improving your rails-fu.
Also, Tore Darell has written a Survivor's Guide for Ruby on Rails which could prove useful to you too.
I'd get in on Ruby. Seems to have a larger (or at least more active) community, the pace of new projects & continued development is second-to-none, and the learning resources seem to outnumber & outpace those of Python. I could be wrong, but these are my impressions.