What is the sole purpose of Python being an interpreter? [closed] - python

What exactly is the purpose of Python being an interpreted language?
It doesn't produce executable files (how would a commercial software developer use it?).
If any part of the code has a bug, it doesn't show up until Python actually reaches that line at run time. In a large project, not every part of the code gets executed on every run, so there could be many hidden bugs lurking in the project.
Every system needs a Python installation to run such software.
I am using py2exe, and I am puzzled by how large the resulting executable file is.

First, answers to your questions.
They can use it for parts of their system for which they don't mind the source being visible (e.g. extensions), or they can open-source their application. They can also use it to develop backend services for something which they're providing as a service (e.g. YouTube), or for internal tools which they don't plan to release (as at Google).
That's why you need to write tests, exercise discipline and measure test coverage regularly. You sacrifice the compiler's ability to check for certain things, and some speed, for the advantages I've detailed below.
Yes but it's not too hard to bundle Python along with your app. The entire interpreter + libraries is not that big. Python is pretty much a standard on most UNIX environments today. This is usually not a practical problem. The same issue is there with (say) Java (you need the JVM installed).
py2exe bundles all the modules into a single executable. It will be big. If you want to do compiled programs that are lean, don't use Python. Wrong fit.
Now, a few reasons on why "interpreted".
Faster development time. Programmer time is costlier than computer time so we should optimise for that.
No compilation cycle. Very easy to make incremental changes and check. Quick turnaround.
Introspection and dynamic typing allow kinds of coding that aren't possible in statically compiled languages like C (see the sketch after this list).
Cross platform. If you have an interpreter for your platform, the program will run there even if it was written on a different platform.
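As a small illustration of that introspection/dynamic-typing point, here is a minimal sketch (the class and function names are made up for the example): inspecting an object at run time and dispatching on whatever methods it happens to have.

# Minimal sketch of duck typing and run-time introspection (illustrative only).
class Dog:
    def speak(self):
        return "woof"

class Robot:
    def speak(self):
        return "beep"

def describe(obj):
    # Duck typing: we don't care about obj's declared type,
    # only whether it has a callable speak() attribute.
    if hasattr(obj, "speak") and callable(obj.speak):
        print(type(obj).__name__, "says", obj.speak())
    # Introspection: list the object's public attributes at run time.
    print([name for name in dir(obj) if not name.startswith("_")])

describe(Dog())
describe(Robot())

Doing the equivalent in C requires you to design for it up front (function pointers, vtables, reflection libraries); here it falls out of the language.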

You bring up a few different issues, here are some responses:
1) Technically, Python isn't interpreted (usually) - it is compiled to bytecode and that bytecode is run on a virtual machine.
So Python doesn't provide executables because it runs bytecode, not machine code.
You could just as well ask why Java doesn't produce executables.
The standard advantages of virtual machines apply: A big one being a simplified cross-platform development experience.
You could distribute just the .pyc (compiled bytecode) files if you don't want your source to be available.
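As a rough, self-contained sketch of that (the module name mymodule is made up for the example), the standard library's py_compile does the byte-compilation step explicitly:

# Sketch: byte-compile a source file and ship only the .pyc.
import py_compile

# Write a tiny throwaway module so the example is self-contained.
with open("mymodule.py", "w") as f:
    f.write("def greet(name):\n    return 'Hello, ' + name\n")

# cfile places the .pyc next to where the .py would live, so it can be
# imported even after the .py source is removed.
pyc_path = py_compile.compile("mymodule.py", cfile="mymodule.pyc")
print("bytecode written to", pyc_path)

Note that .pyc files are tied to a specific interpreter version, so this hides the source but does not make the program stand-alone.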
2) Here, you are talking about dynamic vs. static languages. There are tradeoffs, of course. One disadvantage of dynamic languages, as you mention, is that you get more run-time errors rather than compile-time errors.
There are, of course, corresponding advantages. I'll point you to some resources discussing both sides:
Dynamic type languages versus static type languages
What do people find so appealing about dynamic languages?
http://research.microsoft.com/en-us/um/people/emeijer/Papers/RDL04Meijer.pdf
3) Quite right. Just as you need the Java VM installed to run Java, perl to run Perl, etc.
Regarding your last point:
The whole idea of running in a VM is that you can install that VM once, then run many different apps. By bundling the whole VM with every app (as py2exe does), you are going against that concept. So yes, you have to pay the cost in terms of size.

The sole purpose of Python is to provide a beautiful language to program in.
Your points #1 and #3 are similar, and the answer is that professional programmers use py2exe/PyInstaller etc. to bundle and distribute their programs; for frameworks and libraries they don't even need to do that.
Your point #2 also applies to statically compiled languages: the fact that something compiles correctly in C++ doesn't mean it won't crash at run time or that the business logic is correct. You need to test each part of your code anyway, so with good unit tests and functional tests Python is on par with other languages at finding bugs, and not needing a compile step, plus being dynamic, means better productivity, in my opinion.
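To make that "lean on tests" point concrete, here is a minimal sketch using the standard unittest module (the add_tax function is made up for the example):

# Sketch: a unit test catching what a compiler would catch in a static language.
import unittest

def add_tax(price, rate):
    return price * (1 + rate)

class AddTaxTests(unittest.TestCase):
    def test_typical_rate(self):
        self.assertAlmostEqual(add_tax(100.0, 0.2), 120.0)

    def test_wrong_type_fails_loudly(self):
        # A misuse that C++ or Java would flag at compile time only
        # surfaces here, at test time, in Python.
        with self.assertRaises(TypeError):
            add_tax("100", 0.2)

if __name__ == "__main__":
    unittest.main()

Run regularly (and measured with a coverage tool), tests like this fill much of the role the compiler plays elsewhere.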

Python is not an interpreter, but an interpreted language.
This question is really about interpreted vs. compiled languages, which has no answer other than the usual "it depends on your needs".
See Noufal Ibrahim's answer for details, but I'm not sure this question is a good fit for SO.

(1) You can provide installers for Python code (which may install the Python environment). This doesn't prevent you from having commercial code. Note that you can also have Java (also "interpreted" or JIT-compiled) commercial or desktop code and require your users to install a JRE.
(2) Any language, even a compiled and strongly typed one, may produce errors that only show up when you actually reach the offending code (e.g. division by zero). I guess you may be referring to strongly vs. loosely typed languages. It's not just a matter of compilation: strongly typed languages generally make it easier to find "structural" bugs (e.g. trying to use a string as a number) during the compilation process. In contrast, loosely typed languages often lead to shorter code, which may be easier to manage. What to use really depends on the goal of your application.
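A tiny illustration of both kinds of error mentioned above (the function is made up for the example):

# Illustrative only.
def ratio(a, b):
    return a / b

# 1) Escapes even a strongly typed compiler: only fails if b happens to be 0.
print(ratio(10, 2))    # 5.0
# print(ratio(10, 0))  # ZeroDivisionError at run time, in any language

# 2) A "structural" bug: a static type checker would reject this call,
#    but Python only raises TypeError when the line actually executes.
# print(ratio("10", 2))  # TypeError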

Interactivity is good. I find it encourages making small, easily testable functions that build together to make an application.
Unless you are writing simple, statically linked applications, you will usually have some run-time baggage that must be included or installed (MFC, .NET, etc.) for compiled languages. Look at the WinSxS folder. Sure, you get to "share" that stuff most of the time, but there is still a lot of space taken up by "needed", if hidden, requirements.
And as far as bugs go, run-time bugs will be the same no matter what. Any good programmer tests as much as possible when making changes. This should catch what would be compile-time bugs in other languages, as well as exercising run-time behavior.

A Python .exe necessarily has to include a complete copy of the Python interpreter. As you say, since the code is interpreted, it won't touch every line until that line is actually run. Some parts may even invoke a Python parse/compile step at run time (e.g. eval(), some regexes, etc.). These would fail in a compiled .exe unless the full interpreter were available.
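For example (purely illustrative), code like the following forces the parser/compiler to be present at run time, which is part of why a py2exe-style bundle has to carry the whole interpreter:

# Sketch: these calls invoke Python's parse/compile machinery at run time.
expression = "2 ** 10 + 24"            # imagine this string came from a config file
print(eval(expression))                # 1048

code_obj = compile("x = [i * i for i in range(5)]", "<generated>", "exec")
namespace = {}
exec(code_obj, namespace)
print(namespace["x"])                  # [0, 1, 4, 9, 16]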


When using the Python Interpreter, is the compiler used at all?

In Google's Python Class it says:
Python is a dynamic, interpreted (bytecode-compiled) language
I know what an interpreter is and I know what bytecode is, but the two together don't seem to fit. After doing some reading it became a bit clearer: basically, Python source code is automatically compiled before it is interpreted; but some new questions emerged.
When using the Python interpreter, does no compilation happen? If it does, when? For example, if you're just typing code at the command line and it gets run each time you hit enter, when does a compiler have the opportunity to do its work?
Also, in the question linked above, delnan gives a pretty broad definition of a compiler:
A compiler is, more generally, a program that converts a program in
one programming language into a program in another programming
language...JIT compilers compile to native machine code at runtime
I guess my question is: what's the difference between an interpreter and automatic compiler? To refine the question a bit, if Python is compiled, why not compile all the way to machine code (or assembly, since I know writing compilers that can produce pure machine code is difficult)?
Perhaps it is best to forget semantics and just try to learn what CPython is actually doing. When you invoke the CPython binary, it does a number of things. Generally speaking, you can expect it to translate the code you've written into a sequence of bytecode instructions. This is the "compiling" stage that people sometimes reference for Python code. Bytecode is a more compact and efficient way to tell the interpreter what to do than your hand-written code. Frequently, Python will cache these instructions for reuse in .pyc files (only regenerating them if the associated .py file is newer).
You can think of Python bytecode as the set of instructions that the Python virtual machine can run; in a lot of ways, it's not really all that different from what you get with Java. When people speak of compiled languages (e.g. C), the compiler's job is to translate your code into a set of instructions that will run directly on your computer's hardware. Languages like Python and Java have an extra level of indirection: the virtual machine. The virtual machine runs directly on the computer's hardware and is responsible for interpreting the bytecode.
Compared to standard "compiled" languages (e.g. C, Fortran), this compilation stage is really lightweight, and Python doesn't do a lot of the checking that "traditional" compilers do (e.g. type checking). It pretty much only checks the syntax and does a few very simple optimizations using the peephole optimizer.
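You can watch that compile step yourself with the standard dis module; this is just a sketch, and the exact instruction names and output vary between CPython versions:

# Sketch: peek at the bytecode CPython produces before interpreting it.
import dis

def greet(name):
    return "Hello, " + name

dis.dis(greet)   # prints LOAD_FAST / BINARY_ADD-style instructions

# The peephole optimizer folds constants at compile time, so this function
# typically loads one precomputed constant instead of doing the math at run time:
def seconds_per_day():
    return 24 * 60 * 60

dis.dis(seconds_per_day)  # usually shows a single constant, 86400

Even code typed at the interactive prompt goes through this same compile-to-bytecode step, once per statement, before it is executed.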

Is it possible to make a Python script its own OS?

I have written a Python script that I want to be able to run separately from the OS. As a result, I want it to be able to run as pretty much its own OS. Everything is shell-based and there are no GUI components. How would I go about making a bootable version of this Python script, and is it even possible? Or would I have to use a bare-bones OS with the Python script installed on top of it?
For all practical purposes, the answer is no, it's not possible. The bare-bones OS with a Python install is your best option.
It is theoretically possible to implement a stand-alone Python interpreter, which runs as the OS of a system. That has been done with several interpreter languages. First of all, BASIC, then APL, Forth, Smalltalk, Java, and most likely some I'm not aware of. There is no reason why this couldn't be done with Python, it merely needs a bit of implementation work.
To give a rough estimation of the amount of work: a caffeine-driven expert coder takes about two days to implement a stand alone Forth interpreter, doubling as operating system, from scratch, on a platform (s)he's familiar with. By then the system will be far from finished or complete, but it will compile source, and allow extending itself with the results of that compilation. Other languages would differ. In case of Java, only porting an assembly implementation from one to an entirely different CPU took a team of 4 people about 2 weeks, while the back end of Java, the JVM, has a considerable resemblance to Forth. I'm not knowledgeable enough about Python inner workings to be able to give a meaningful estimate of the effort to implement a stand alone Python interpreter, but it could be roughly comparable to the effort of implementing a Java VM.

Python vs IronPython for BitCoin Development [closed]

I've been looking at porting a Bitcoin mining pool server across to IronPython.
Apart from the limitation of IronPython only supporting up to Python ~2.7:
Is an IronPython/Windows platform really not suited to this type of high-volume, transaction-heavy system?
What are some of the other limitations I might encounter, or should I just bite the bullet and build up an Ubuntu programming rig?
If the latter, what would be an easy IDE for someone coming from a C#/Visual Studio world?
Thanks in advance.
Without knowing how your existing code is built, or how you intend the new version to work, it's hard to answer specifically, but I can give some generalities.
IronPython limitations/issues, in decreasing order of importance:
* No third-party C extension modules. If you depend on any, you'll have to find a .NET equivalent or reimplement the same thing yourself. If not, not a problem.
* No 3.x. If you know 3.x, Unicode is a lot easier, yield from rocks, etc., and 3.x keeps improving (and even speeding up), while 2.x is stagnant. But how much this is worth is a matter of personal preference, not something anyone else can give you an objective answer on.
* No third-party pure Python modules that rely (in important ways) on refcounting semantics. There aren't too many of these out there, but if you need one, you need it.
* There are a handful of differences between IronPython and CPython, listed in the release notes. Probably none of these will affect you, but you should check.
* If there's any heavy application logic (as opposed to all of the work being networking/files/database), IronPython is much faster than CPython at a few things and much slower at a few others, so you probably need to profile and perf-test before you go too far down either path. (And consider at least PyPy, if not Jython, while you're at it…) But usually this isn't an issue for servers.
Next, is a Windows platform suited to a high-volume systems? Windows is definitely capable of handling thousands of connections per second (or requests, or bytes, or whatever's most relevant to your use case). It will probably never beat linux in pure connections/second/$, but you have to realistically trade off operational costs vs. dev costs. However, oversimplifying a bit, Windows sucks at reactors and rocks at proactors, while linux rocks at reactors and is decent at proactors (and there are other similar issues—e.g., substitute processes/threads for the above). So, even though Windows can do nearly as well as linux in general, it can't do as well on the same code, or often even the same designs. You need to architect your code to take advantage of Windows features and take into account Windows weaknesses. And if you're using a framework to do the heavy lifting for you, there are going to be fewer choices, and the best ones may not work like what you're used to, which is usually a much higher dev cost than changing platforms.
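As a concrete (and anachronistic, since it uses today's asyncio, which postdates this answer) sketch of "designing for the platform": on Windows you would lean on the completion-based proactor loop backed by IOCP, while on Linux the default selector-based, reactor-style loop is usually the right fit.

# Illustrative sketch only: choosing an event loop that plays to the platform.
import asyncio
import sys

if sys.platform == "win32":
    # Proactor loop, built on Windows IOCP (already the default on Python 3.8+).
    asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy())
# On Linux, the default selector (reactor-style) loop needs no override.

async def handle(reader, writer):
    data = await reader.read(1024)
    writer.write(data)          # trivial echo, just to exercise the loop
    await writer.drain()
    writer.close()

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 8888)
    async with server:
        await server.serve_forever()

# asyncio.run(main())  # uncomment to actually run the echo server

The point is not the echo server itself but that the I/O model is a per-platform decision your architecture has to accommodate.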
If you do go with CPython on linux, what IDE should you use? Stick with Visual Studio. You can, and probably should, write either all or most of your code in a cross-platform way that runs on both IronPython and linux CPython. Then you can code, test, and debug with Iron Python in Visual Studio, use native Windows CPython and/or Cygwin CPython for occasional sanity checks, and only touch linux for performance tests, final acceptance tests before releases, and debugging problems you can't repro on Windows. You'll still occasionally need a simple text editor on linux, but IDLE, or whichever of vi-vs.-emacs you prefer, or whichever GUI text editor your linux distro comes with as the default will be fine.
If your application logic is tightly coupled with your reactor design or something, this could be difficult. If it's doable, it's probably worth doing, but if it's not… well, you can still use Visual Studio as an editor and project organizer, even if the actual code is intended to be run on a linux box. You don't get the nifty debugger integration, etc. that way, but you still get a lot of mileage.

Speeding up Parts of Existing Python App with PyPy or Shedskin

I am looking to bring speed improvements to an existing application and I'm looking for advice on my possible options. The application is written in Python, uses wxPython, and is packaged with py2exe (I only target windows platforms). Parts of the application are computationally intensive and run too slowly in interpreted Python. I am not familiar with C, so porting parts of the code over is not really an option for me.
So my question is basically do I have a clear picture of my options as I outline below, or am I approaching this from the wrong direction?
Running with PyPy: Today I started experimenting with PyPy, and the results are exciting: I can run large parts of the code under the PyPy interpreter and I'm seeing 5x+ speed improvements with no code changes. However, if I understand correctly, (a) PyPy support for wxPython is still a work in progress, and (b) I cannot compile it down to an exe for distribution anyway. So unless I'm mistaken, this seems like a no-go for me? There's no way to package things up so that parts of it are executed with PyPy?
Converting code to RPython and translating with PyPy: The next option seems to be actually rewriting parts of the code in PyPy's restricted language, which seems like a pretty large job. But if I do that, parts of the code can then be compiled to an executable (?) and then I can access the code through ctypes (?).
Other restricted options: Shedskin seems to be a popular alternative here; does this fit my requirements better? Other options seem to be CPython, Psyco, and Unladen Swallow, but they are all superseded or no longer maintained.
Using PyPy indeed rules out py2exe and similar tools, at least until one is ported (AFAIK there is no active work on that). Still, as PyPy binaries do not need to be installed, you might get away with a more complicated distribution that includes both your Python source code and a PyPy binary+stdlib and uses a small wrapper (batch file, executable) to ease launching. I can't comment on whether WxPython on PyPy is mature enough to be used, but perhaps someone on pypy-dev, wxpython-dev or either one's IRC channel can give a recommendation if you describe your situation.
Translating your code into RPython does not seem viable to me. The translation toolchain is not really a tool for general purpose development, and producing a C dll for embedding/ctypes seems nontrivial. Also, RPython code really is low-level, making your Python code restricted enough may amount to rewriting half of it.
As for other restricted options: You seem to mix up CPython (the original Python interpreter written in C) with Cython (a compiler for a Python-like language that emits C code suitable for CPython extension modules). Both projects are active. I'm not very familiar with Shedskin, but it seems to be a tool for developing whole programs, with little or no interaction with non-restricted Python code. Cython seems a much better fit: Although it requires manual type annotations and lower-level code to achieve really good performance, it's trivial to use from Python: The very purpose of the project is producing extension modules.
I would definitely look into Cython; I've been playing with it a bit and have seen speedups of ~100x over pure Python. Use the profile module to find the bottlenecks first. Loops are usually the biggest opportunities for speedup when moving to Cython. You should also look at whether you can use array/vector operations in NumPy instead of loops; if so, that can also give extreme performance boosts. For instance:

a = list(range(1000000))
for i in range(len(a)):
    a[i] += 5

is slow, real slow. On the other hand:

import numpy
a = numpy.arange(1000000)
a = a + 5

is fast, real fast.
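If you want to quantify that difference on your own machine before committing to a rewrite, a rough sketch with the standard timeit module looks like this (numbers will vary, and NumPy obviously has to be installed):

# Sketch: compare the pure-Python loop against the NumPy version.
import timeit

loop_version = """
a = list(range(100000))
for i in range(len(a)):
    a[i] += 5
"""

numpy_version = """
a = numpy.arange(100000)
a = a + 5
"""

print("loop :", timeit.timeit(loop_version, number=100))
print("numpy:", timeit.timeit(numpy_version, setup="import numpy", number=100))

The same approach works for profiling candidate hotspots before and after moving them to Cython.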
Correction: Shedskin can be used to generate extension modules, as well as whole programs.

Developing PyPy's Rpython as a general programming language [closed]

Is there any interest in developing Rpython (Restricted Python) from the PyPy project as a general purpose programming language? Perhaps it could be a fork from the PyPy project. Does such a project exist? Since the programs are compiled, one could simply contribute modules written in Rpython, and it could compete with other python implementations including CPython and PyPy.
I can't speak for everyone else, but I personally am extremely interested in using RPython as a general-purpose language. To answer some other people's questions:
Why? Because Cython is a pain to figure out how to use. If you don't put in a lot of tricky type declarations just right, you don't get any speedup. With RPython, it will run fast or it won't run at all.
Using PyPy offers a good speedup, but currently not nearly as much as RPython.
RPython might be a good way to get super-fast, somewhat Pythonic code. Here's an example to help you get started. I'm not aware of any large projects to do this, unfortunately.
Yes, there is already a project to use the translation tool chain of PyPy to create standalone executables and libraries using RPython. It is called RPythonic.
A general purpose RPython would not be a competitor for CPython and PyPy. It wouldn't even be a competitor for things like Cython.
Why? Because RPython is not Python!! RPython is merely a language which shares some syntax with Python; enough that a Python interpreter can execute RPython code, but you cannot take Python code and compile it as RPython.
The restrictions on Python that are added to enable RPython to be statically typed and compiled (indeed, to have static type inference) are so severe that they completely change the language. If you want to write an RPython program, you have to decide to do so up front and write it in RPython. You can't write a program in Python and then decide to compile it as RPython, and you can't even tweak a realistic Python program a bit to make it RPython. Normal Python code is nothing like normal RPython code; writing RPython is more similar to writing Java or C#.
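To make that concrete, here is a tiny made-up function that is perfectly ordinary Python but, as I understand RPython's rules, would be rejected by its translator, because the variable's type can't be pinned down statically:

# Ordinary Python, but not valid RPython: 'result' is sometimes an int and
# sometimes a string, so static type inference has nothing to settle on.
def classify(n):
    if n > 100:
        result = "big"
    else:
        result = n * 2
    return result

print(classify(7))    # 14
print(classify(500))  # "big"

Restrictions like this ripple through a whole codebase, which is why "tweaking" existing Python into RPython rarely works.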
So if you want to write general programs in Python but you want them to go faster, RPython has very little to offer you. That's the niche PyPy's Python interpreter is trying to fill.
If you want to write general programs in a lower level compiled language because you need your program to run faster than Python, then there are existing very mature languages and libraries for that, like Java or C#.
The reason to code in RPython would be to do something that is particularly made better by RPython. Like writing interpreters to which you can add garbage collectors and JIT compilers without having to write them by hand! Here RPython shines, and yes I would be very interested in a more polished and usable RPython interpreter-writing environment. But as a general purpose programming language for writing programs that don't particularly benefit from RPython's specialities, it would simply be a monstrous amount of work to get it to the point where it could compete with existing languages that already fill that role better. It barely even has a standard library now (because almost none of Python's extensive standard library is usable in RPython), for example.
From the looks of it, the restrictions are quite severe and on the whole it's a lot less pleasant to program in, I imagine. That's necessary for implementing PyPy, but generally, if you want fast compiled code that can interact with Python, you'd use Cython (which is targeted at CPython extensions and supports pretty much all of Python seamlessly) or write code in one of the more common languages that can do this. And if you just want fast, compiled code... well, RPython may be more pleasant than e.g. C, but still, I don't see a significant advantage here (at least none that would warrant the effort to create a usable, stable language).
Why would I want to write directly in RPython?
It seems so much simpler to write Python code and run it on PyPy.
Why would I want to write C code?
It seems so much simpler to write Python and have PyPy be implemented in C
Why would I want to write assembler code?
It seems so much simpler to write Python and have PyPy implemented in C and C implemented in Assembler.
I guess it really is turtles all the way down.
Why would I want to stop using the most convenient language and switch to a less convenient language?
What's the value in giving up a nice language?
