PDA

View Full Version : Is Python really better than Perl?



superarthur
March 13th, 2010, 02:28 PM
I am not a good programmer. I just learned programming to help with my university project. I started with Java, but it didn't go too well. Then I learned Perl and rewrote everything in it. It was wonderful. As a result, I love Perl, and if someone asks me what the best programming language is for a beginner to get a task done, I would say Perl.

However, when I search on the internet, everyone says that Python is better. I have yet to see an "unbiased" web page comparing Perl and Python. Why? Is Python really better than Perl?

(I am too busy finishing my final year at university, so I don't have time to look at Python in depth.)

Penguin Guy
March 13th, 2010, 02:35 PM
I've never used Perl, but yes - Python is way better. :p

kleskjr
March 13th, 2010, 02:36 PM
just for the record: Python is widely used in the scientific communities

superarthur
March 13th, 2010, 02:54 PM
just for the record: Python is widely used in the scientific communities

What about BioPerl (http://www.bioperl.org/wiki/Main_Page)?
And Perl saved the human genome project (http://www.bioperl.org/wiki/How_Perl_saved_human_genome).

Bachstelze
March 13th, 2010, 03:22 PM
Apples and oranges...

kleskjr
March 13th, 2010, 03:25 PM
There are probably some other programs in use, but I am speaking from my own experience. Python has become something of a substitute for the commercial MATLAB

km0r3
March 13th, 2010, 03:49 PM
I have learned both Python and Perl, though I started with Perl and later tried Python.
I spent maybe three months with Python intensively, but I fell in love with it from the first second on.

I compared my case with similar ones and I can say it varies from case to case. There are guys like me who started with Perl, then tried Python and hated it, and there are guys who started with Python, then tried Perl and fell in love with it. And then there are [...].

If someone would ask me why I'd choose Python rather than Perl, I'd immediately say:


The interactive Python "shell" interpreter (see also IPython), which is practically my replacement for Bash on my Linux system.
Indentation. Python code is, or can be, so much cleaner and more readable IMO!
Object system. Everything in Python is an object. This is especially helpful when you're doing fancy stuff.
The libraries. There are so many great libraries and 3rd-party modules for Python, though that's one point where Perl is a worthy competitor.
The community.
Programming and testing with Python is fun.
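The object-system point is easy to see straight from the interpreter; a minimal illustrative sketch (the names are made up):

```python
# In Python even integer literals and functions are objects
# with attributes and methods, and can be passed around as values.
def greet(name):
    return "Hello, " + name

print((42).bit_length())      # ints have methods: prints 6
print(type(greet).__name__)   # functions are objects: prints 'function'

shout = greet                 # so they can be assigned like any value
print(shout("Perl"))          # prints 'Hello, Perl'
```

This is what makes the "fancy stuff" (passing functions to functions, attaching attributes, introspection) feel uniform.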

It's quite possible that Perl is better on some of these points, but I'm content, and isn't that what counts in the end? Choose whatever you want, as long as it does what you want and does it well.

Well, you shouldn't listen to anybody; you should just try it out yourself. :)

wmcbrine
March 13th, 2010, 05:35 PM
Either is fine for writing. But only Python can still be read afterwards. :D

Method X
March 13th, 2010, 05:58 PM
Starting to study Python now, but I already know that it's better.
Also, if you want, try Ruby after Python.
;)

Alexandre Putt
March 13th, 2010, 06:08 PM
There is no universally best language. It depends on your preferences and what you want to implement. So don't bother with side remarks, just use what you like and find efficient for your tasks.

nvteighen
March 13th, 2010, 07:04 PM
Apples and oranges...

+1000

Perl is a great language. The issue it has is that its design takes a totally different point of view from most other programming languages. Perl makes use of morphological and semantic resources, which is what lies behind the context system... e.g. @a has to be "inflected" into $a in order to get an element at some index: $a[n]. Most programming languages take a lexico-syntactical approach: use a keyword with a certain syntax to do something. Perl instead gives a keyword different meanings depending on form and/or context.

What makes Perl look bad is its programmers. For some weird reason, Perl is always shown written in convoluted code... when it's fairly easy to write sensible code in Perl.

Remember that Perl is a programming language written by a linguist, not by a computer scientist.

I won't talk about Python, as you all know that I consider it a great language too.

ratcheer
March 13th, 2010, 08:19 PM
I would say that which language is "better" depends on what you need to use it for. Perl is a perfectly good language. Python is also a perfectly good language. I have used Perl fairly extensively in the past.

My long time favorite language is Smalltalk. In the past couple of years, I have tried to identify and learn a more modern language where I could do things in the Smalltalk style. After much research, my final two choices were Python and Ruby. I ended up choosing Ruby and I am still happy with my choice, for me.

But, it seems most are choosing Python. To each his own.

The best answer is, use what suits you as long as it fulfills your needs. When it can no longer do everything you need, find something else that you like and can get the job done. Don't feel pressured to choose something just because that is what most others choose.

Tim

schauerlich
March 13th, 2010, 08:50 PM
My long time favorite language is Smalltalk. In the past couple of years, I have tried to identify and learn a more modern language where I could do things in the Smalltalk style.

Objective-C? (http://en.wikipedia.org/wiki/Objective_C)

nmaster
March 13th, 2010, 08:53 PM
What about BioPerl (http://www.bioperl.org/wiki/Main_Page)?
And Perl saved the human genome project (http://www.bioperl.org/wiki/How_Perl_saved_human_genome).

Last semester a BioE prof here at Berkeley told my friend that Ruby is also starting to be used in research projects.

superarthur
March 13th, 2010, 09:30 PM
btw, you don't have to worry too much about typing perl.com instead of perl.org.
but python...

TheStatsMan
March 13th, 2010, 09:48 PM
http://www.linuxjournal.com/article/3882

Some Penguin
March 13th, 2010, 10:36 PM
*shrug* Perl is a wonderfully efficient language for rapid prototyping. Some things in it are very programmer-friendly, like making it easy to create a function meant to accept a variable number of named arguments in any order through easily-created associative arrays, or the ability to have functions check whether the caller wants a single return value or a list of values and return different results.
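For readers who know Python better, the named-arguments half of that has a rough analogue in keyword arguments gathered into a dict; a minimal sketch (the function and field names are invented for illustration). Note that Python has no direct counterpart to Perl's wantarray caller-context check.

```python
# Rough Python analogue of Perl's named-arguments-via-hash idiom:
# **kwargs collects any named arguments, in any order, into a dict.
def make_user(**kwargs):
    # pull out what we need, with defaults for anything omitted
    name = kwargs.get("name", "anonymous")
    role = kwargs.get("role", "guest")
    return {"name": name, "role": role}

print(make_user(role="admin", name="ada"))  # argument order doesn't matter
print(make_user())                          # defaults kick in
```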

It's also flexible enough that insane^H^H^H^H^H^Hcreative people write poetry in Perl. *Executable* poetry. I am not kidding. See the Perl port of Jabberwocky.
http://www.runme.org/feature/read/+londonpl/+34/

However, that flexibility means that it's not hard to write horribly structured Perl if for some reason that's the first language you chose, and if you lack experience in a low-level language you might not understand the performance or other implications of the extremely convenient primitives it provides.

km0r3
March 13th, 2010, 11:28 PM
+1000

Perl is a great language. The issue it has is that its design takes a totally different point of view from most other programming languages. Perl makes use of morphological and semantic resources, which is what lies behind the context system... e.g. @a has to be "inflected" into $a in order to get an element at some index: $a[n]. Most programming languages take a lexico-syntactical approach: use a keyword with a certain syntax to do something. Perl instead gives a keyword different meanings depending on form and/or context.

What makes Perl look bad is its programmers. For some weird reason, Perl is always shown written in convoluted code... when it's fairly easy to write sensible code in Perl.

Remember that Perl is a programming language written by a linguist, not by a computer scientist.

I won't talk about Python, as you all know that I consider it a great language too.

+1

Very good arguments. I forgot the fact that Larry Wall is a linguist.

Yeah and they're both great languages.

cprofitt
March 14th, 2010, 12:06 AM
I am not a good programmer. I just learned programming to help with my university project. I started with Java, but it didn't go too well. Then I learned Perl and rewrote everything in it. It was wonderful. As a result, I love Perl, and if someone asks me what the best programming language is for a beginner to get a task done, I would say Perl.

However, when I search on the internet, everyone says that Python is better. I have yet to see an "unbiased" web page comparing Perl and Python. Why? Is Python really better than Perl?

(I am too busy finishing my final year at university, so I don't have time to look at Python in depth.)


1. No one language is better for everything.

2. No unbiased opinions exist... a person usually prefers language X or language Y

3. Depending on your particular problem there may be a 'better' language, but that has to be balanced with 'what a person' already knows.

superarthur
March 14th, 2010, 12:12 AM
1. No one language is better for everything.

2. No unbiased opinions exist... a person usually prefers language X or language Y

3. Depending on your particular problem there may be a 'better' language, but that has to be balanced with 'what a person' already knows.

How come there are so many people biased towards Python, but not as many biased towards Perl? :P

Simian Man
March 14th, 2010, 12:23 AM
Perl is perfect for text processing but not ideal for much else. Python is very good for almost anything, but not specially suited to any one task the way Perl is for text. So, as others said, it's comparing apples to oranges. If you mostly do text processing, stick with Perl. If you want a language that is sure-footed in almost any situation, Python is hard to beat.

d3v1150m471c
March 14th, 2010, 12:29 AM
Your own experience is going to be the best answer. Thus, try them both or just read comparison reviews and come to your own conclusion.

ssam
March 14th, 2010, 01:06 AM
How come there are so many people biased towards Python, but not as many biased towards Perl? :P

Maybe people who like Ubuntu are more likely to like Python.

Perhaps on a Slackware forum you would find more Perl fans.

(I'd guess that even on the Ubuntu mailing lists you would find a different distribution of people.)

myrtle1908
March 14th, 2010, 05:07 AM
How come there are so many people biased towards Python, but not as many biased towards Perl? :P

There aren't. You will find that Perl is far more widely adopted than Python. Perl is very solid and only a touch older than its poor cousin :) There is a major bias towards Python on these forums. Ask the same question somewhere else... you may be surprised.

eginon
March 14th, 2010, 06:27 AM
My first language was python. I've tried out a handful of other languages (bash scripts, C, asm, C#, Perl). All of them have some things that are really annoying and some things that are really neat.

In my opinion, merit language debates are fruitless. Every language can be made to do every possible operation. Sure, some things are easier in one language than others, but that's because some other programmer(s) laid the groundwork for you to be able to do that thing easily. In the end, they are all just instructions executed by the machine.

What normally happens to me is, I try out a language and then forget about it. A few months down the road I'm like, "Oh, I remember that this particular task was much easier in... and I can see how I could use it to...". Then I decide to switch and discover the new language does do that one thing more easily, but brings its own set of challenges to overcome.

I think the only way to get a decent answer to your question is to try out Python for yourself. What other people have to say about it is irrelevant. Code a few projects in it, see if you like it, and have fun coding in it. If you do, then you have a new tool in your programmer's bag-o'-tricks to draw on.

soltanis
March 14th, 2010, 07:46 AM
Here comes the flood of "X is better than Y".

We've had this discussion a million times.

Neither language is really better at anything. Python and Perl are two different ways of getting the same thing done. Use whichever one works better for you.

Just remember, There's more than one way to do it.

nvteighen
March 14th, 2010, 04:46 PM
Maybe people who like Ubuntu are more likely to like Python.

Perhaps on a Slackware forum you would find more Perl fans.

(I'd guess that even on the Ubuntu mailing lists you would find a different distribution of people.)

Don't go as far as Slackware... The majority of the Debian community, although it has been increasingly adopting Python, are Perl fans.

MCVenom
March 17th, 2010, 01:14 AM
btw, you don't have to worry too much about typing perl.com instead of perl.org.
but python...
I've made that mistake before :|

ACanOfTuna
July 31st, 2010, 07:07 AM
Hi guys! My first post here! I really love (K)ubuntu and its community. :D

I came to try Perl before and I love it. There's a lot of syntactic sugar that can make your program shorter, and the geeky feeling sure runs high. After doing some projects with it I tried another programming language named Python. The whitespace is sure annoying at first (e.g. if you copy your code into your friend's `lesser` editor, it will get screwed up; I use VIM!), but everything starts to change once you become more and more accustomed to it. The simplicity and clarity of the language structure always help me during large projects. The "there should be only one obvious way to do it" motto also helps me focus more on the algorithm than on the syntactic aspect. Sometimes using Perl might result in fewer lines, but do I really need that? I believe the benefits of clarity and simplicity outweigh the size aspect. The last thing I love about Python is the 'batteries included' feature: almost any programming task can be done with standard CPython. If you need more, you can always do `sudo easy_install package_name`. Easy, eh? :p

Lastly, it's a matter of preference, since Python and Perl are in the same league. If you're really good at Perl (and you like it too), then stick with it and create a big, good project to promote your language. In the end, the community will always get the benefits.

**Sorry for my English, it isn't my native language.

KdotJ
July 31st, 2010, 12:00 PM
This is perhaps one of the worst forums to ask this question; you're not really going to get an "unbiased" comparison here lol

CptPicard
July 31st, 2010, 01:06 PM
This is perhaps one of the worst forums to ask this question; you're not really going to get an "unbiased" comparison here lol

Well... there is a lot of informative stuff to be said about why Python is designed the way it is, and why those design decisions are good things. Not every preference comes from pure unreasoned bias, you know.

trent.josephsen
July 31st, 2010, 02:09 PM
I think the reason for the perceived bias towards Python is because so many of the questions are of the nature "What language should I learn first" rather than "What language should I use for writing such and such a project". Python is IMHO better designed, easier to learn, scalable to very large projects, and an overall great language for new programmers. Perl is bigger, uglier, less appropriate for newbies, and more useful for small scripts and programs that won't grow past (say) 1000 lines. It uses the programmer's time very efficiently (with a certain disregard for the maintainer).

KdotJ
July 31st, 2010, 07:52 PM
Well... there is a lot of informative stuff to be said about why Python is designed the way it is, and why those design decisions are good things. Not every preference comes from pure unreasoned bias, you know.

I totally agree with you CptPicard, and believe me I wasn't trying to make a dig at people. But I was just saying, and you have to admit, as Python is hugely used in Ubuntu and by many people who use Ubuntu, there are many people who would say Python is better than Perl regardless

CptPicard
July 31st, 2010, 08:36 PM
Yes, KdotJ, I know. I also understand that it may seem like there is a Python-"bias" here on the forum... it may be true to a degree, but the reason why the point of view is skewed like that is simply because there have always been a lot of actually knowledgeable Python-afficionados around. They are capable of giving grounds for their point of view if necessary ;)

WitchCraft
July 31st, 2010, 11:19 PM
Yes - Python is better.
You can compile it to C (via Cython), and you can't skip indenting your code.
Indentation also prevents you from mindlessly copy-pasting other people's buggy code.
Apart from that, Python's syntax and regexes are far more readable, which, coupled with indentation, makes large programs far more maintainable and enforces ORDER.
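On the regex-readability point, Python's re module can lay a pattern out with whitespace and per-line comments; a small illustrative sketch (the pattern itself is just an example, not a serious email validator):

```python
import re

# re.VERBOSE lets whitespace and comments live inside the pattern,
# so a non-trivial regex can be read and reviewed line by line.
pattern = re.compile(r"""
    (?P<user>[\w.+-]+)              # local part
    @
    (?P<host>[\w-]+(?:\.[\w-]+)+)   # domain with at least one dot
""", re.VERBOSE)

m = pattern.match("alice@example.org")
print(m.group("user"), m.group("host"))   # prints: alice example.org
```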

However, more often than it should, other people's opinions tend to diverge vastly from mine.

ghostdog74
August 1st, 2010, 04:22 AM
Python is better only if you say so. Same with Perl.
Technically, both are as good to do the job.

KdotJ
August 1st, 2010, 05:04 PM
Yes - Python is better.

...

However, more often than it should, other people's opinions tend to diverge vastly from mine.


my point proven

cprofitt
August 1st, 2010, 11:09 PM
Erlang.

Oh, wait... what was the question.

WitchCraft
August 7th, 2010, 10:10 PM
Erlang.

Oh, wait... what was the question.

The minimum requirement for a good programming language is that it at least has some sort of input/output.

Oh, wait... I'm confusing Erlang with Haskell.


F# (F sharp) for everything!

donsy
August 8th, 2010, 01:02 AM
And then there's the whole Python3 issue. Who knows if Python will ever progress beyond 2.7.x.

simeon87
August 8th, 2010, 01:25 AM
Python is better only if you say so. Same with Perl.
Technically, both are as good to do the job.

Technically, most esoteric programming languages will do the job as well, like Brain****. However we can say that not all languages are created equal for practical purposes.

ghostdog74
August 8th, 2010, 02:43 AM
Technically, most esoteric programming languages will do the job as well, like Brain****. However we can say that not all languages are created equal for practical purposes.
Let's not go into that. Esoteric languages are not for practical, everyday purposes.

slavik
August 8th, 2010, 09:07 AM
0. I like Perl. ;)
1. Larry is a linguist (which makes him more qualified to invent a language than anyone). Noam Chomsky got us much further in Comp Sci, too. ;)
2. GIL is silly and stupid.
3. CPAN owns any other language library. You can get a module for anything you want to do. It will usually have a ::Simple module, too. Two lines and you're doing something.
4.
$a = 5; print $a . "hello" means only one thing.
5. it's a swiss army knife of programming languages.
6. It's friggin' awesome, what else is there to say?
7. Perl6 is awesome with bacon on top. BACON!!!!
8. Indentation is a style, not a syntax.
9. GIL is still stupid.
10. Did I mention I like Perl?
11. The Perl interpreter can easily be built into any C/C++ program (Pidgin has it).
12. OO is a hack, so you can do reflection.

There, 13 points why Perl is better. :)

WitchCraft
August 8th, 2010, 02:10 PM
Larry is a linguist (which makes him more qualified to invent a language than anyone). Noam Chomsky got us much further in Comp Sci, too.

I don't think being a linguist is a good qualification for computer science.
Having a math AND a computer science degree, such as Mr. Google (Guido van Rossum), is a far better qualification.



2. GIL is silly and stupid.
8. Indentation is a style, not a syntax.
9. GIL is still stupid.


2+9: No, the GIL isn't stupid; executing thread-unsafe code in threads is stupid, which is why the GIL exists.
Of course, one could execute thread-unsafe code in threads, as in Perl or C, but...
... C is not only the speed of light, it's also the mark on Dennis Ritchie's physics/math degree.


8. In Python, indentation IS syntax, just like the opening and closing brackets ({}) in C. You will notice that Python programs won't work correctly if you do not indent, just as a C program won't work correctly if you omit the brackets for loops, functions, if statements, etc.
Also, differing indentation and the lack of brackets are why you can't copy/paste Python code as easily.
You'll have to put the copied code into a separate file/module, which is why I say Python enforces order/structure, and that's why it's better.
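The indentation-is-syntax point can be demonstrated directly: the parser itself rejects a block whose body is not indented, before any code runs. A minimal sketch:

```python
# Indentation is part of Python's grammar: the same statements,
# with the body flush-left, are rejected at compile time.
good = "if True:\n    x = 1\n"   # body indented: compiles
bad  = "if True:\nx = 1\n"       # body not indented: syntax error

compile(good, "<demo>", "exec")  # accepted by the parser
try:
    compile(bad, "<demo>", "exec")
except IndentationError:
    print("rejected by the parser, before anything runs")
```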

Obviously, that means Python development speed is a bit lower than Perl's, but in the long run people will realize that Python code is far more maintainable/extendable and less error-prone than Perl code, and the Python libraries will become just as comprehensive as Perl's, which today is the only remaining reason to use Perl.

Of course it doesn't mean you can't write bugs in Python, and it also doesn't mean that a bad Perl programmer will be a good Python programmer or vice-versa.

donsy
August 8th, 2010, 02:58 PM
I don't think being a linguist is a good qualification for computer science.
Having a math AND a computer science degree, such as Mr. Google (Guido van Rossum), is a far better qualification.
You're wrong. Problem-solving has its very basis rooted in language. That's why in OO programming objects are described as *nouns*, properties as *adjectives*, and methods as *verbs*. That's why a programming language has *syntax* and production rules for a context-free *grammar*. Much of the basis for modern programming languages is derived from Chomsky's seminal work, "Syntactic Structures".

shawnhcorey
August 8th, 2010, 05:03 PM
There's one good reason for using Perl: CPAN (http://www.cpan.org/). If you want to do something, chances are there is a module for it on CPAN, and it's much better than anything you could possibly write.

Can+~
August 8th, 2010, 10:09 PM
7. Perl6 is awesome with bacon on top. BACON!!!!


Well, close this thread, the power of bacon has been summoned, and we cannot discuss this further.

Unless...

http://bacolicio.us/http://www.python.org

SNYP40A1
August 8th, 2010, 10:29 PM
Try debugging a 5000+ line perl script made by someone else. I had to do this once, not fun.

raf-kig
August 9th, 2010, 12:32 AM
Try debugging a 5000+ line perl script made by someone else. I had to do this once, not fun.
Good programming practice involves well-structured code with lots of comments. If someone chooses to obfuscate their code, then it'll be a maintenance nightmare no matter what the language. How would you like to debug a 5000+ line Python program where the variables are named a, b, c, ..., x, y, z?

raf-kig
August 9th, 2010, 01:02 AM
Either is fine for writing. But only Python can still be read afterwards. :D

This is becoming a tired argument. Notice what a well-written Perl program looks like: http://www.perlfect.com/articles/sendmail.shtml

KdotJ
August 9th, 2010, 01:32 AM
This is becoming a tired argument. Notice what a well-written Perl program looks like: http://www.perlfect.com/articles/sendmail.shtml

I completely agree with you. I personally am a Java programmer; does this mean that my code is unreadable just because it's not Python? No. I can comment my code and have full freedom over how I format it... which means I can make it as readable as I wish.

I understand the argument about the requirement for indentation in Python, and that it forces "good practise" and produces better readability. But I don't like the way people use this argument as if indentation were not possible in other languages. If anything, I would argue that in this sense Python is in fact restrictive, as it doesn't give the programmer full choice over how they format and style their code, regardless of what is seen as good/best practise.

shawnhcorey
August 9th, 2010, 02:16 AM
I understand the argument about the requirement for indentation in Python, and that it forces "good practise" and produces better readability. But I don't like the way people use this argument as if indentation were not possible in other languages. If anything, I would argue that in this sense Python is in fact restrictive, as it doesn't give the programmer full choice over how they format and style their code, regardless of what is seen as good/best practise.

The problem is that the code shop's "unchangeable" style specification does change every 5 to 7 years. Python's requirement of indentation at least prevents this. And it doesn't have endless arguments about whether the else should be cuddled or not.

ghostdog74
August 9th, 2010, 02:31 AM
Obviously, that means Python development speed is a bit lower than Perl's

development speed is dependent on how well the person knows his stuff. A skilled Python programmer will solve the task easily because he knows what modules and algorithms/steps to use. Same as a skilled Perlist.

eveningsky339
August 9th, 2010, 05:15 PM
I love Python. And C, C++, Java, Lua, and Perl.

Rather than trying to lift one language up over another, why not explore other languages and admire their strong points? Every language has at least one weakness. For example, I feel that C++'s use of classes is horrifically complicated. But I won't let that prevent me from being a C++ fan.



shawnhcorey
August 9th, 2010, 05:42 PM
Rather than trying to lift one language up over another, why not explore other languages and admire their strong points? Every language has at least one weakness. For example, I feel that C++'s use of classes is horrifically complicated. But I won't let that prevent me from being a C++ fan.

I feel just the opposite. It's not which language I like the most, it's which language I despise the least. ;)

nvteighen
August 9th, 2010, 08:36 PM
1. Larry is a linguist (which makes him more qualified to invent a language than anyone). Noam Chomsky got us much further in Comp Sci, too.



I don't think being a linguist is a good qualification for computer science.
Having a math AND a computer science degree, such as Mr. Google (Guido van Rossum), is a far better qualification.



You're wrong. Problem-solving has its very basis rooted in language. That's why in OO programming objects are described as *nouns*, properties as *adjectives*, and methods as *verbs*. That's why a programming language has *syntax* and production rules for a context-free *grammar*. Much of the basis for modern programming languages is derived from Chomsky's seminal work, "Syntactic Structures".

Being a Linguistics student myself, and a hobbyist programmer, I can assure you that the mindsets are different, although related.

A linguist is someone who states rules about how languages work... It's an empirical science: you get some data, state some nice theory and then you get some new data and you test whether your nice theory was able to predict that new data or not.

To create a language (programming, fantasy, IAL, whatever), you don't need to be a linguist... our ancestors weren't linguists and "created" the languages we know and speak :P In the case of artificial languages, you surely have to know some basic stuff, mainly structural properties of languages and knowing lots of them obviously helps too. But it's like you don't need to be a musician in order to study Musicology; it might be better, but it's not a necessary condition.

Programming language designers create languages in order to fulfill some goal in some fashion they consider to be the most descriptive. They seek solutions to problems they've found... A computational linguist may then study that language, compare it to another and explain how they differ or are alike and even find that programming languages share a common base because of what programming is. But the designer doesn't need to know that stuff.

Perl's a very weird and uncommon language because its designer is a linguist. It stresses things that are completely unusual in the world of programming: for example, morphological features (the sigils), syntactic freedom, and the use of a restricted "common framework" semantic system that defines some useful stuff like the "default" array (@_), the "default" value ($_) and all the well-known arcane global variables that serve the purpose of having a little "world" your code can refer to (e.g. when you speak, you can do it more economically if the stuff you're talking about is around).

shawnhcorey
August 9th, 2010, 09:35 PM
I don't find Perl's sigils that unusual. Java programmers use them too. They are not required by the language, but its programmers find them useful. They even have their own name for the practice; they call it Hungarian Notation (http://en.wikipedia.org/wiki/Hungarian_notation).

WitchCraft
August 9th, 2010, 09:45 PM
I love Python. And C, C++, Java, Lua, and Perl.

Rather than trying to lift one language up over another, why not explore other languages and admire their strong points? Every language has at least one weakness. For example, I feel that C++'s use of classes is horrifically complicated. But I won't let that prevent me from being a C++ fan.



Try C#. With it, you can even write a program that rewrites, recompiles, and reloads/extends itself while it is running. I used it for a self-learning artificial neural network. System.CodeDom is awesome.

wmcbrine
August 9th, 2010, 10:51 PM
I personally am a Java programmer, so does this mean that my code is unreadable just because it's not Python?

Java is unreadable because it puts the reader to sleep. Almost the opposite problem from Perl. TL;DR.


I understand the argument about the requirement for indentation in Python and that it forces "good practise" and produces better readability.

This is the least of the things that make Python readable.

Of course one can program badly in any language if one really tries. But it took me longer to encounter a nearly-illegible Python program (posted here, as it happens) than it did for any other language I've looked at, so I have to count that in its favor.

raf-kig
August 9th, 2010, 11:03 PM
This issue is too complex to be resolved by a thread on a forum. Let's just say that they both have their strengths and weaknesses, and it's ultimately up to the programmer to choose whichever language is most suitable for the task at hand.

eveningsky339
August 9th, 2010, 11:56 PM
I think Python can do everything that Perl can do, and vice versa, so it's really a matter of what you want to program and how you as a programmer think.

worksofcraft
August 10th, 2010, 08:19 AM
I programmed hard real-time systems in assembler for many years. Eventually I fully embraced C++ because it is so very powerful, all-inclusive, and steeped in evolution and history, and yet gives me nearly as much control as assembler did.

I absolutely detest Java and other sad C++-wannabe interpreters like PHP and C#.

I was delighted with Prolog and love its inference engine. It is so totally different from any other programming language I've ever seen. IMO Prolog would be far more suited to web pages than that ghastly JavaScript abomination!

I never needed to look at Perl, but when I finally encountered Python, it was love at first sight <3 I'm very impressed and recommend it for learning, for rapid development, for prototyping, and for most day-to-day programming applications that don't need full-blown C++.

We can't be masters of them all, but isn't it great we have the choice ;)

nvteighen
August 10th, 2010, 09:11 AM
I don't find Perl's sigils that unusual. Java programmers use them. They are not required by the language but its programmers find them useful. They even have their own name for them; they call them using Hungarian Notation (http://en.wikipedia.org/wiki/Hungarian_notation).

Yeah, but Java is a statically typed language. This means the declared type of a variable never changes, so, OK, the "sigil" won't change. In Perl, you use the sigil to set how the variable will be used and interpreted (in which context).
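The contrast is easy to make concrete in Python, where a name carries no type at all and can be rebound freely, which is roughly why it needs no sigils; a minimal sketch:

```python
# In Python the *object* has a type, the name does not: the same
# name can refer to an int, then a list, then a function.
x = 5
print(type(x).__name__)   # int
x = [1, 2, 3]
print(type(x).__name__)   # list
x = len
print(type(x).__name__)   # builtin_function_or_method
```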

slavik
August 11th, 2010, 12:12 PM
Being myself a Linguistics student and a hobbyist programmer, I can assure you that the mindsets are different, although related.

A linguist is someone who states rules about how languages work... It's an empirical science: you get some data, state some nice theory and then you get some new data and you test whether your nice theory was able to predict that new data or not.

To create a language (programming, fantasy, IAL, whatever), you don't need to be a linguist... our ancestors weren't linguists and "created" the languages we know and speak :P In the case of artificial languages, you surely have to know some basic stuff, mainly structural properties of languages and knowing lots of them obviously helps too. But it's like you don't need to be a musician in order to study Musicology; it might be better, but it's not a necessary condition.

Programming language designers create languages in order to fulfill some goal in some fashion they consider to be the most descriptive. They seek solutions to problems they've found... A computational linguist may then study that language, compare it to another and explain how they differ or are alike and even find that programming languages share a common base because of what programming is. But the designer doesn't need to know that stuff.

Perl's a very weird and uncommon language because its designer is a linguist. It stresses things that are completely unusual in the world of programming: for example, morphological features (the sigils), syntactic freedom, and the use of a restricted "common framework" semantic system that defines some useful stuff like the "default" array (@_), the "default" value ($_) and all the well-known arcane global variables that serve the purpose of having a little "world" your code can refer to (e.g. when you speak, you can do it more economically if the stuff you're talking about is around).

Because our ancestors weren't linguists, we get such an abomination as English ... seriously, slap German together with Latin, throw in some French and Greek, voila! We got English. :(

The reason why there are 3 negative prefixes (un-wind, a-symmetrical, dis-connect) is simply because the words come from different languages (more info: http://www.linglish.net/2008/09/15/so-many-negative-prefixes/).

Also, obviously you're not familiar with Noel Chompsky's work ... look it up and what his impact was. :)

schauerlich
August 11th, 2010, 06:39 PM
Because our ancestors weren't linguists, we get such an abomination as English ... seriously, slap German together with Latin, throw in some French and Greek, voila! We got English. :(

Language contact and borrowing is a primary motivator for change and linguistic diversity. I wouldn't be so quick to dismiss it as an "abomination".


The reason why there are 3 negative prefixes (un-wind, a-symmetrical, dis-connect) is simply because the words come from different languages (more info: http://www.linglish.net/2008/09/15/so-many-negative-prefixes/).

I'm not sure this is a problem. Every language has synonyms, even ones that seem silly and redundant.


Also, obviously you're not familiar with Noel Chompsky's work ... look it up and what his impact was. :)

Surely you're not questioning a linguistics major's familiarity with Noam Chomsky? :)

nvteighen
August 11th, 2010, 07:03 PM
Also, obviously you're not familiar with Noel Chompsky's work ... look it up and what his impact was. :)

No, of course I'm not... and it's Noam Chomsky... That's why I didn't write this: http://ubuntuforums.org/showpost.php?p=9706261&postcount=11 ;)

interval1066
August 11th, 2010, 07:08 PM
I programmed hard real-time systems in assembler for many years. Eventually I fully embraced C++ because it is so very powerful all inclusive and steeped in evolution and history and yet gives me nearly as much control as assembler did.

Yeah, I agree, I find little reason to use assembler these days, other than a little optimization here and there. I use C for drivers, C++ for everything else. But the real debate is python vs perl, and frankly, I find little difference between the two. I must be old school, but using the py interpreter in lieu of a shell is abhorrent to me. On the other side, perl is a little like C in that it gives you the opportunity to shoot yourself in the foot depending on how close to the hardware you are getting.

As has been remarked elsewhere, it's a toolbox. You use the right tool for the job.

navaneethan
August 11th, 2010, 07:17 PM
Nice

WitchCraft
August 11th, 2010, 08:52 PM
Frankly, every language has its advantages and disadvantages.
And certainly, if anybody purposely wants to obfuscate code, he/she can.

What I find good about Python is that it disables bad practices by design. Yes, it takes away your freedom to indent incorrectly, which is a good thing, and yes, it also discourages copy-paste coding quite effectively, which I find good too. And the lack of closing brackets forces you to comment the program, like # start function xy / # end function xy / # end if, which is the best feature.

On the other hand, it's not like I never used Perl when I didn't have much time, but I prefer it if I don't have to.




I absolutely detest Java and other sad C++ wannabe interpreters like PHP and C#.

Watch your tongue/fingers. C# is not interpreted like PHP or Java; it runs on a runtime, and the code is compiled.


Java has the disadvantage that, apart from the slow start, it consumes resources as if it were the only program running. This may be good for a dedicated server, but for everything else, it's a pain in the ***.

.NET/C# has the advantage that you can port from nearly all major programming languages with almost no changes, and then you can reuse the result in any program written in any language you like.

Think about it: C/C++ code, such as encryption algorithms, can be ported easily to C++.NET or to C#.
Java code, such as databases/libraries, can be ported to J# easily.
VB6/VBA code can be ported to VB.NET with almost no changes.
IronPython programs can also be used in .NET, btw.

And once it is a .NET program, you can use the VB.NET/C#/J# components in any .NET language you like. You can extend the result with the full power of the .NET framework. You can even easily write an interface for a non-.NET language if you like.
F# enables you to write most complex parallel-processing programs with ease. True, you can write asynchronous programs in C/C++ too, but all the callbacks make the program flow complicated and hard to follow (and you have to write all of them, while F# eliminates all that overhead).

So, the idea is being able to write very complex programs in the blink of an eye, in almost no time (in comparison to what it would take you to do the same in C/C++).

And since it's bytecode, it will run on any processor. So there is no need for the (average garden-variety) user to choose the version for his operating system/processor (windows/linux/mac/x64/x32/ppc/ppc64/arm/mips). In theory, you could even write something like .NET applets and deploy them via the browser, just like Java applets.

Additionally, .NET classes are largely reusable. For example, you can reuse large parts of your treeview code from a Windows application in a web application.

Sure, you pay for all that managed code with a slower program that consumes more memory, just as in Java. But does that really matter if you aren't writing an operating system/driver or a very complex 3D program/game? The difference from Java is that .NET doesn't act as if it were the only program present, and its load time isn't nearly as long, since the bytecode is compiled to native code on the first execution. You can even precompile to native code directly, but that eliminates the advantages of bytecode distribution.

So what I want to say is: don't compare Java to C#/.NET. It's not quite the same.
Java is one language, one runtime, and an interpreted one at that.

.NET (C# being only a part of it) is a framework comprised of 4+ major languages, 4+ runtimes, easy interlanguage interoperability, web services (SOAP/XML/JSON), ASP.NET webforms, ASP.NET MVC, Silverlight (a crappy f***ed-up version of Flash, but in .NET), as well as Windows services and Linux/Unix daemons, and it is compiled. And just like Java, it runs on Windows, Linux and UNIX (where the Mono project has ported it bug-free, though the "if ported bug-free" caveat applies to Java as well).

KdotJ
August 11th, 2010, 09:26 PM
These threads all end up the same lol...

worksofcraft
August 11th, 2010, 09:28 PM
.NET (C# being only a part of it) is a framework comprised of 4+ major languages, 4+ runtimes, easy interlanguage interoperability, serialization, reflection, ASP.NET webforms, ASP.NET MVC...


There is a difference between the language and the underlying framework. I don't see any reason we couldn't equally run both Python and Perl on .NET, or compile C++ to the Java VM for that matter.

dv3500ea
August 11th, 2010, 10:00 PM
IMO Ruby beats them both (I have tried all 3).

I really couldn't care less about the whole syntax thing. For me it's more about how logical the languages seem.

To me, python is extremely counter-intuitive. It has string.join(list_to_join), string_to_split.split(string), len(list) etc.
It is all very inconsistent - it's hard to remember what's a method, what's a function etc.

perl is better, as it is consistent - it uses functions, as built in types are not objects. Actually, object orientation in perl is quite a faff and seems unnatural.

ruby is much better - everything is an object, and only methods are used. You have string.split(str_or_re), array.join(str) and array.length - this is far more intuitive. Object orientation is very natural - now only if they replaced classes with prototypal inheritance...
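The asymmetry being described can be seen directly in a few lines of Python (a minimal sketch of the standard library calls mentioned above):

```python
words = ["perl", "python", "ruby"]

# join is a method on the *separator* string, not on the list being joined:
joined = ", ".join(words)

# split, by contrast, is a method on the string being split:
parts = "a b c".split(" ")

# and length is a free-standing built-in function, not a method at all:
n = len(words)

print(joined)  # perl, python, ruby
print(parts)   # ['a', 'b', 'c']
print(n)       # 3
```

In Ruby the same three operations are all methods on the receiver (array.join, string.split, array.length), which is the consistency being praised here.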

It's all just preference though. All scripting languages can make you very productive but they don't have as good performance as compiled languages.

WitchCraft
August 11th, 2010, 10:08 PM
There is a difference between the language and the underlying framework. I don't see any reason we couldn't equally run both Python and Perl on .NET, or compile C++ to the Java VM for that matter.

In theory you're right, but in practice you're welcome to write a C++-to-Java compiler, since I haven't found one on the first Google page.
And as I said, IronPython for .NET exists. But IronPython is a subset of Python.

But, also in practice, it all boils down to the fact that there is no decent Python or Perl support for .NET. So you can't really use the framework from those languages, as opposed to using those languages from the framework (MS is really mean). Actually, I just googled and saw that ActiveState sells Python/Perl for .NET, but it probably works only on Windows (I can't say, because I didn't buy it).

Edit: Actually, I can't find anything on this except
http://findarticles.com/p/articles/mi_m0EIN/is_2000_July_11/ai_63287204/
Seems to be vaporware or undocumented.

Some Penguin
August 11th, 2010, 10:35 PM
There is a difference between the language and the underlying framework. I don't see any reason we couldn't equally run both Python and Perl on .NET, or compile C++ to the Java VM for that matter.

The latter is not very plausible, because that's moving up the abstraction chain. Much C++ code has assumptions on machine architecture (word size, byte alignment, in-memory layout of data structures, et al) that would be difficult to automatically infer and convert to Java. Ditto for such things as calls to operating-system-specific APIs.

WitchCraft
August 11th, 2010, 10:44 PM
The latter is not very plausible, because that's moving up the abstraction chain. Much C++ code has assumptions on machine architecture (word size, byte alignment, in-memory layout of data structures, et al) that would be difficult to automatically infer and convert to Java. Ditto for such things as calls to operating-system-specific APIs.

Yep, that's the only thing I miss in C#: low-level stuff, with the ability to write an aimbot or a rootkit, or both ;-))

Thinking about what I just wrote, I think I should search google first before I make unqualified remarks :lolflag:

But hey, you can call unmanaged code, such as a self-made C++ shared library, from C# ;-)))

worksofcraft
August 11th, 2010, 11:04 PM
In theory you're right, but in practice you're welcome to write a C++-to-Java compiler, since I haven't found one on the first Google page.
And as I said, IronPython for .NET exists. But IronPython is a subset of Python.

But, also in practice, it all boils down to the fact that there is no decent Python or Perl support for .NET...

Evidently the absence of C++ compilers for the Java VM doesn't make Java a superior programming language. In the same way, if there were a big demand for Python and Perl on .NET, I'm sure they would gain better support.

IMHO both Sun and Microsoft could have done us all a big favor had they simply ported existing languages to their innovative frameworks.

C++ was an extension of C. Its object-oriented facilities were a significant improvement. OTOH I personally didn't bother learning Perl because I just couldn't see it having anything particularly new to offer. That doesn't mean it isn't as good as, or maybe slightly better than, what I already learned, but like someone else said, perhaps Ruby is preferable, or I could go learn F# or... well, I am learning Python at the moment :)

interval1066
August 11th, 2010, 11:11 PM
ruby is much better - everything is an object

I agree, Ruby is probably the most OOP of all the languages discussed. But what about using it as a shell? I think I'd rather use python if I had to pick one for use as a shell-like environment.

shawnhcorey
August 11th, 2010, 11:22 PM
I agree, Ruby is probably the most OOP of all the languages discussed. But what about using it as a shell? I think I'd rather use python if I had to pick one for use as a shell-like environment.

Why? Perl is more shell-like than Python. In fact, that's where all its sigils come from.

echo $PATH

interval1066
August 11th, 2010, 11:26 PM
Perl as a shell? Naw... you need to install psh for that. Never messed with it. It may be the greatest thing since sliced bread, but there's such a thing as overkill.

KdotJ
August 11th, 2010, 11:26 PM
I agree, Ruby is probably the most OOP of all the languages discussed.

More than Java? Java is an Object-Obsessed Language...

worksofcraft
August 11th, 2010, 11:30 PM
I agree, Ruby is probably the most OOP of all the languages discussed. But what about using it as a shell? I think I'd rather use python if I had to pick one for use as a shell-like environment.

Yes, I agree with this. Being forced to do everything with objects isn't always what we want, especially for rapid prototyping or one-off solutions. Having it as one of our options is more flexible.

My understanding is that Java is reliant on its underlying virtual machine architecture. C++ makes no such assumptions. It is precisely when people rely on a particular precision, byte ordering, and word alignment that they get problems with C++.

How do Ruby, Perl and Python compare on assumptions they make about word sizes and so on?
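For what it's worth, Python largely hides word-size concerns: its integers are arbitrary-precision, and fixed sizes and byte order only appear when you ask for them explicitly, e.g. via the struct module. A minimal sketch:

```python
import struct

# Python ints never overflow a machine word; they grow as needed.
big = 2 ** 64 + 1
print(big)  # 18446744073709551617

# Fixed sizes and byte order become explicit only at the binary boundary:
packed = struct.pack("<Q", 2 ** 64 - 1)  # little-endian unsigned 64-bit
print(len(packed))  # 8 bytes on every platform

# The platform's pointer width is still visible if you ask for it:
print(struct.calcsize("P") * 8)  # 32 or 64, depending on the build
```

As far as I know, Perl and Ruby behave similarly at this level (Perl via pack/unpack, Ruby via Array#pack), so none of the three bakes machine assumptions into ordinary code the way C++ can.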

interval1066
August 11th, 2010, 11:34 PM
More than Java? Java is an Object-Obsessed Language...

No, probably not more than Java; Java does a good job of being OOP. Maybe too good; for example, I don't see the point of disallowing multiple inheritance, even if the gurus do say that MI is not the true OOP path. Many have told me that single inheritance is true and correct, but no one's really given me a good reason why...

KdotJ
August 11th, 2010, 11:43 PM
No, probably not more than Java; Java does a good job of being OOP. Maybe too good; for example, I don't see the point of disallowing multiple inheritance, even if the gurus do say that MI is not the true OOP path. Many have told me that single inheritance is true and correct, but no one's really given me a good reason why...

Agreed, I too have never been given a good explanation of why MI is not good... I mean, other languages can offer it, so...

Frak
August 12th, 2010, 05:24 AM
Agreed, I too have never been given a good explanation of why MI is not good... I mean, other languages can offer it, so...
http://en.wikipedia.org/wiki/Multiple_inheritance#Criticisms

Biggest thing, future maintenance. It doesn't give you everything now, but it's much easier to understand later on.
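Python, which does allow MI, illustrates both the ambiguity the critics worry about and one way a language can resolve it deterministically. A minimal sketch of the classic "diamond":

```python
# D inherits from B and C, both of which inherit from A.
# Which greet() does D get? That ambiguity is the classic MI criticism.
class A:
    def greet(self):
        return "A"

class B(A):
    def greet(self):
        return "B"

class C(A):
    def greet(self):
        return "C"

class D(B, C):
    pass

# Python answers deterministically via C3 linearization (the MRO):
print(D().greet())  # B  -- the leftmost base wins
print([cls.__name__ for cls in D.__mro__])  # ['D', 'B', 'C', 'A', 'object']
```

The rule is predictable, but a reader still has to know it to follow the code, which is roughly the maintenance point being made above.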

worseisworser
August 12th, 2010, 07:35 AM
MI works well in CLOS, and in particular when combined with CLOSER-MOP.

None of the points mentioned apply; they are not problems there.

worksofcraft
August 16th, 2010, 07:42 AM
I'm actually quite interested in language efficiency and so I was looking at some Benchmarks.

It showed C as being 2% slower than assembler.
Java was 32% slower
Perl 2600% slower
and Python 4300%

Then I looked at how they tested it... and I'm sorry but I think their tests were garbage. Hence I won't reference their web page.

For a start, Java uses a just-in-time compiler to translate virtual machine code to physical machine code on the fly, and then it needs to allocate and garbage-collect all the variables it handles.

Thus if you just hammer the same data over and over in a tight "compile-first-time-through" loop then you aren't really doing a representative test.
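One common way to reduce that distortion is to time several repetitions and keep the best run, so one-off costs (cache warm-up, and on JIT runtimes, compilation) are excluded. A sketch using Python's timeit module:

```python
import timeit

def work():
    # A small CPU-bound task to time.
    return sum(i * i for i in range(10_000))

# Run the timer several times and keep the best result: the first runs pay
# one-off costs, so the minimum is a better estimate of steady-state speed
# than the mean.
runs = timeit.repeat(work, number=100, repeat=5)
best = min(runs)
print(f"best of 5 runs: {best:.4f}s for 100 iterations")
```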

IDK has anyone seen benchmarks they thought sounded quite genuine?

raf_kig
August 16th, 2010, 12:04 PM
This is becoming a tired argument. Notice what a well-written Perl program looks like: http://www.perlfect.com/articles/sendmail.shtml (http://www.perlfect.com/articles/sendmail.shtml)

That's a nice nick you've got there :-)

/e and I agree, though I do prefer python ;-)

raf-kig
August 16th, 2010, 11:08 PM
That's a nice nick you've got there :-)

/e and I agree, though I do prefer python ;-)
A smart guy like you should be programming in Perl. Python is for wimps.

mmix
August 16th, 2010, 11:11 PM
IMHO, no.

perl is better, but then, lisp, c.

Queue29
August 17th, 2010, 02:41 AM
I'm actually quite interested in language efficiency and so I was looking at some Benchmarks.

It showed C as being 2% slower than assembler.
Java was 32% slower
Perl 2600% slower
and Python 4300%

Then I looked at how they tested it... and I'm sorry but I think their tests were garbage. Hence I won't reference their web page.

For a start, Java uses a just-in-time compiler to translate virtual machine code to physical machine code on the fly, and then it needs to allocate and garbage-collect all the variables it handles.

Thus if you just hammer the same data over and over in a tight "compile-first-time-through" loop then you aren't really doing a representative test.

IDK has anyone seen benchmarks they thought sounded quite genuine?

http://shootout.alioth.debian.org/


All the submissions are viewable, and if you think you can write something faster, you're free to submit it. They also enforce level-headed rules, like you can't just have python call a bunch of C libraries and say 'hey look, python is just as fast as C!'.

worksofcraft
August 17th, 2010, 03:05 AM
http://shootout.alioth.debian.org/


All the submissions are viewable, and if you think you can write something faster, you're free to submit it. They also enforce level-headed rules, like you can't just have python call a bunch of C libraries and say 'hey look, python is just as fast as C!'.

Now that is a much more believable comparison than the one I had Googled! Thanks :)


I uploaded a comparison graph showing C++, Java 6 Server, Python 3, Perl, Ruby and PHP. Java and PHP because I think it is so dumb to cripple web server throughput running interpreted scripts... but I suppose that's a different issue :lolflag:

GenBattle
August 17th, 2010, 04:13 AM
IMHO it completely depends on your programming style and preference. From what I've heard of Perl (never actually used it for anything), it tends to minimize code as much as possible by providing many ways to achieve the same end result, giving you plenty of flexibility.

From what I've heard, for most programmers this tends to follow the old mantra of being given enough rope to hang yourself with.

Python tends to focus on having one explicit method for anything, and has an extensive library of functions and objects.

But yeah, as a Python user, I am biased :p I don't think the two target audiences really intersect, which is probably why you've had trouble finding balanced and unbiased opinions.

nvteighen
August 17th, 2010, 10:14 AM
Again, it depends on what you want to do.

For example, and call me a bit crazy, but I've always felt that Perl interfaces much better with system-specific stuff than Python (i.e. the sort of stuff Python places in the os module). Maybe it's the shell-scripting heritage Perl has, or maybe it's just my impression.

But again, use the best tool for the job. And where both will be right, it just boils down to personal tastes.

ja660k
August 17th, 2010, 05:30 PM
Again, it depends on what you want to do.

For example, and call me a bit crazy, but I've always felt that Perl interfaces much better with system-specific stuff than Python (i.e. the sort of stuff Python places in the os module).

+1

juancarlospaco
August 17th, 2010, 10:42 PM
Pearl breaks your ; key
:D

igouy
August 18th, 2010, 12:16 AM
Now that is a much more believable comparison than the one I had Googled! Thanks :)


I uploaded a comparison graph showing C++, Java 6 Server, Python 3, Perl, Ruby and PHP. Java and PHP because I think it is so dumb to cripple web server throughput running interpreted scripts... but I suppose that's a different issue

Did you actually experience a throughput problem?

The chart. (http://shootout.alioth.debian.org/u32/which-programming-languages-are-fastest.php?gpp=on&java=on&v8=on&lua=on&python3=on&perl=on&yarv=on&php=on&calc=chart)

worksofcraft
August 18th, 2010, 12:30 AM
Did you actually experience a throughput problem?

The chart. (http://shootout.alioth.debian.org/u32/which-programming-languages-are-fastest.php?gpp=on&java=on&v8=on&lua=on&python3=on&perl=on&yarv=on&php=on&calc=chart)

Definitely. For instance, IMO, PHP sucks... it sucks up all the server's processing power for absolutely no good reason.

If you invest in the latest multi-core 64-bit processor technology to get improved performance, it seems reasonable that you might also care about how efficient your programming languages are?!

eeperson
August 18th, 2010, 12:55 AM
Definitely. For instance, IMO, PHP sucks... it sucks up all the server's processing power for absolutely no good reason.

If you invest in the latest multi-core 64-bit processor technology to get improved performance, it seems reasonable that you might also care about how efficient your programming languages are?!

Execution speed is not the only metric that determines the usefulness of a language. A language that makes implementation and maintenance easier can be more useful even if it executes more slowly. This is especially true when something is IO bound rather than CPU bound, such as a web application.
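To make the IO-bound point concrete, here is a minimal Python sketch of a simulated request: nearly all of the wall-clock time is the wait, so interpreter overhead barely registers.

```python
import time

def io_bound_request():
    # Simulate a request that waits 50 ms on the network or database...
    time.sleep(0.05)
    # ...then does a trivial amount of "business logic" in the interpreter.
    return sum(range(1000))

start = time.perf_counter()
io_bound_request()
elapsed = time.perf_counter() - start
# Nearly all of the elapsed time is the wait, not interpreter execution,
# which is why a "slow" language can still serve IO-bound workloads well.
print(f"{elapsed:.3f}s total, ~0.050s of it spent waiting")
```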

GenBattle
August 18th, 2010, 03:33 AM
Execution speed is not the only metric that determines the usefulness of a language. A language that makes implementation and maintenance easier can be more useful even if it executes more slowly. This is especially true when something is IO bound rather than CPU bound, such as a web application.

Agreed.

You can basically guarantee that an interpreted script will have no memory leaks (provided there are no bugs in the interpreter itself). Additionally, the code would probably take you half as long or less to write and then self-test.

Optimized C++ code will beat out an interpreted language by a large margin every time, even unoptimized C++ code will win out most of the time, but simply using a language like C++ consumes more development time. These days developer time is usually a more expensive and limited resource than additional computing power.

Also, perceived throughput is very different from actual throughput. Sure, it'll generate a higher server load, but you get around this by running more servers (as I said, cheaper than development/maintenance time/downtime). You might care about the efficiency of your code and how fast it runs, but computer-illiterate clients will usually opt for buying an additional server rather than paying you for extra work. This may be different if you are an uber C++ programmer, but again, cost still comes into it.

As a (even more) side note, if you want C++ execution speed but with much better development/debugging speed, Go is a really good option already (http://golang.org/). But yes, a side note nonetheless.

Anyway, the topic of this thread is Python vs. Perl, which the chart shows to be pretty even.

worksofcraft
August 18th, 2010, 03:47 AM
Execution speed is not the only metric that determines the usefulness of a language. A language that makes implementation and maintenance easier can be more useful even if it executes more slowly. This is especially true when something is IO bound rather than CPU bound, such as a web application.

Yes indeed and as the graph shows, execution speedwise there isn't much to choose between Python and Perl. Perhaps you have some metrics regarding implementation and maintenance costs?

I seriously don't think things like JavaScript and PHP are much of an improvement over C++ in that respect. For instance, generating images on the fly in PHP on heavily loaded servers isn't a good choice of language IMO, but it isn't really the topic of this thread.

When it comes to "better" the question we must ask is for what application? Would you consider writing the Linux Kernel in either Perl or Python? I wouldn't.

GenBattle
August 18th, 2010, 04:36 AM
Aside from the fact that you couldn't (because you can't currently compile those languages), they are not designed for kernel work. Each language has its strengths and weaknesses. Debating which language is best overall is pointless.

In reality, no programmer only knows a single language and just uses that - if you were trying to do something fancy like generate images on the fly from a Python script, you would usually just write the function in C and then call it from Python (I don't know about PHP or Perl).

cj.surrusco
August 18th, 2010, 04:44 AM
btw, you don't have to worry too much about typing perl.com instead of perl.org.
but python...

I've never done that before, so i had to see for myself... Now I know!
:lolflag:

slavik
August 18th, 2010, 12:29 PM
Execution speed is not the only metric that determines the usefulness of a language. A language that makes implementation and maintenance easier can be more useful even if it executes more slowly. This is especially true when something is IO bound rather than CPU bound, such as a web application.

Yes, developer time is more expensive than computing time ... memory is cheap, too.

One semi-recent experience I had is that a company had a cluster of 4 servers running a Java J2EE app. All through the day, we'd see very high load on the systems (Sunfire V490, dual quad-cores). Instead of fixing the app (which does stupid things), they bought 2 more servers from Sun with the Niagara T1 CPUs (128 threads on each host). The load went down. Keep in mind that this is a cluster and has data replication to take care of.

Sure, you can throw hardware at a software problem, but when you decrease the I/O limit by adding systems, suddenly you realise that developer time is the only thing you can throw at the problem.

Throwing hardware at the problem will not somehow magically make your code scale better.

eeperson
August 18th, 2010, 05:53 PM
Yes indeed and as the graph shows, execution speedwise there isn't much to choose between Python and Perl. Perhaps you have some metrics regarding implementation and maintenance costs?

I seriously don't think things like JavaScript and PHP are much of an improvement over C++ in that respect. For instance, generating images on the fly in PHP on heavily loaded servers isn't a good choice of language IMO, but it isn't really the topic of this thread.

When it comes to "better" the question we must ask is for what application? Would you consider writing the Linux Kernel in either Perl or Python? I wouldn't.

There are studies about developer productivity in different languages but the results tend to be subjective since productivity and maintenance costs tend to be difficult things to measure. Here (http://norvig.com/java-lisp.html) is a Lisp biased article that links to some productivity studies. The results are far from conclusive but you might find them interesting.

My reasons for believing that some languages are more productive at certain tasks come from my own personal experience with different languages. I believe that a language will be more productive and easier to maintain if it lets you express your intentions in a simple fashion without having to worry about details unrelated to the problem at hand. I also believe that a smaller feedback loop for testing the code produced results in increased developer productivity.

Take, for example, simple web development. The example you mentioned of a web app that generates images on the fly would be a good example of a time when you would want to write part of the app in C or C++ rather than PHP. However, this is a corner case that doesn't occur too often. On the other hand, there are a large number of web apps that need to pull some data from a database, do some simple logic with it, and embed the data in HTML. This is where PHP excels, because you can embed it in the HTML and it has built-in tools to edit HTML tags. Since it is interpreted, the feedback loop is smaller because there is no compile step. In order to get all of these features in C++ you would basically have to write a PHP interpreter.
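The pull-data-and-embed pattern PHP is praised for looks roughly like this in any scripting language; a minimal Python sketch, with hypothetical rows standing in for a database query result:

```python
from string import Template

# Hypothetical rows, as they might come back from a database query.
rows = [("Alice", 3), ("Bob", 7)]

# A tiny HTML template with a placeholder for the generated list items.
page = Template("""<ul>
$items
</ul>""")

# "Simple logic": render each row as a list item, then embed it in the HTML.
items = "\n".join(f"  <li>{name}: {count}</li>" for name, count in rows)
print(page.substitute(items=items))
```

PHP's difference is that the template and the logic live in the same file with no explicit templating step, which is exactly the convenience being described above.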

igouy
August 18th, 2010, 06:12 PM
Definitly. For instance, IMO, PhP sucks... it sucks up all the server's processing power for absolutely no good reason.

You have not described what "problem" you experienced with that webpage - was there a 15 second delay before the page loaded?

Actually that website is reported by Google Webmaster Tools to be faster than 80% - 95% of websites.



If you invest in latest multi core 64 bit processor technology to get improved performance it seems reasonable you might care also about how efficient your programming languages are?!

The website does not have a dedicated webserver.

Actually the webserver is on the same machine that hosts the repositories and other software for all the 10,000 users of the 900 projects hosted by Alioth.

igouy
August 18th, 2010, 06:25 PM
Anyway, the topic of this thread is Python vs. Perl, which the chart shows to be pretty even.

For that topic there are direct comparisons -

CPython :: Perl (one core) (http://shootout.alioth.debian.org/u32/benchmark.php?test=all&lang=python&lang2=perl)

CPython :: Perl (quad core) (http://shootout.alioth.debian.org/u32q/benchmark.php?test=all&lang=python&lang2=perl)

etc

igouy
August 18th, 2010, 06:47 PM
For instance, generating images on the fly in PHP on heavily loaded servers isn't a good choice of language IMO, but it isn't really the topic of this thread.


As I expect you know, images are generated on the fly by calling C library functions from PHP.

The choice wouldn't be about programming language, the choice would be - cache more images on the server.

igouy
August 18th, 2010, 06:59 PM
Take, for example, simple web development. The example you mentioned of a web app that generates images on the fly would be a good example of a time when you would want to write part of the app in C or C++ rather than PHP. However, this is a corner case that doesn't occur too often.

The PHP functions used to generate images on the fly are just wrappers for functions in a C library - in effect, that part of the app is written in C rather than PHP.


On the other hand, there are a large number of web apps that need to pull some data from a database, do some simple logic with it, and embed the data in HTML. This is where PHP excels, because you can embed it in the HTML and it has built-in tools to edit HTML tags. Since it is interpreted, the feedback loop is smaller because there is no compile step. In order to get all of these features in C++ you would basically have to write a PHP interpreter.

I'm not an experienced Perl programmer - so I find Perl unreadable and tricky. However I'm willing to believe that those who use Perl everyday don't have that difficulty and I can see there are examples that show Perl can be an effective tool - Sweden's Premium Pension System (http://www.oreillynet.com/pub/a/oreilly/perl/news/swedishpension_0601.html).

WitchCraft
August 18th, 2010, 07:57 PM
Sure, you can throw hardware at a software problem, but when you decrease the I/O pressure by adding systems, suddenly you realise that developer time is the only thing you can throw at the problem.

Throwing hardware at the problem will not somehow magically make your code scale better.

I couldn't agree more.
I've had the same problem with ASP.NET / C#.

The thing is that speed is only as high as the slowest component.
Which for a web application is neither server nor database performance; it's data throughput per second, page reloading, and the client's web browser.

You're much better off using PHP than ASP.NET, although PHP is certainly much slower in processing than C#. The reason is the architecture of ASP.NET. It uses (when not MVC) WebForms, which rely on viewstates and postbacks. It aims at eliminating JavaScript. But... that means whenever you click a button, all the info contained in the page is put in the viewstate, the viewstate is sent to the server, some code is processed using the viewstate, and the page is sent and loaded again by the client, with such marvellous updates as a red frame around a textbox indicating an input error.

So whenever you do just about anything, a lot of behind-the-scenes data transfer is taking place, using up bandwidth, time, and server power, while in PHP you simply call a JavaScript/jQuery function in the page, which runs directly on the client.

So the ASP.NET architecture makes the application slow and the server load heavy, even when it is completely unnecessary.

Since what makes the application slow is all the data transfer and reloading, throwing additional servers at it will not solve the problem.

And the second bottleneck is usually the database.
Switching from MySQL/PostgreSQL to Firebird might just bring much more than changing programming languages or adding another server.

worksofcraft
August 18th, 2010, 09:57 PM
You have not described what "problem" you experienced with that webpage - was there a 15 second delay before the page loaded?

Actually that website is reported by Google Webmaster Tools to be faster than 80% - 95% of websites.
...


Hum, I don't remember specifying any particular website.

I was actually thinking of a project I worked on some while back that created pie/bar charts on the fly. It was a generic opinion poll generator that anyone could use in their blog spots and web pages.

The overhead of running all that interpreted PHP code to generate .png files was simply staggering. I also became aware of how excruciatingly painful mixing the HTML of web pages with C++-style PHP syntax really is. Indentation, quoting, and matched-bracketing issues are hardly scratching the surface of the problem.

PHP might be a practical choice at the moment because of its availability, but I'm confident some clever people somewhere will soon sit down, reassess, and come up with a proper design for this application ;)

However, back to Python vs. Perl: I notice Google states they use Python for ALL their scripting applications... Now wouldn't it be great if browsers like Google Chrome supported Python scripts in web pages to replace the abominable JavaScript :lolflag: ?

igouy
August 18th, 2010, 11:19 PM
Hum, I don't remember specifying any particular website.

My mistake - I suppose it was too much to hope that your decision to upload a png chart was caused by an actual problem you experienced when that chart was generated by interpreted PHP scripts.



I also became aware of how excruciatingly painful mixing the HTML of web pages with C++-style PHP syntax really is. Indentation, quoting, and matched-bracketing issues are hardly scratching the surface of the problem.

Sounds like you weren't doing enough to separate the presentation code from the app code.

worksofcraft
August 18th, 2010, 11:59 PM
My mistake - I suppose it was too much to hope that your decision to upload a png chart was caused by an actual problem you experienced when that chart was generated by interpreted PHP scripts.




Sounds like you weren't doing enough to separate the presentation code from the app code.

Lol - wut?

I think working on that project made me very experienced with the implications of interpreted-language inefficiency and the unmaintainability of code that encourages programmers to jumble up HTML, JavaScript, and PHP all in the same files.

worksofcraft
August 19th, 2010, 03:37 AM
As a Python noob I report back with some things I found that I really DON'T like about the language.

- I was happily learning Python on my latest Ubuntu distro, which came with Python 2.6.4. Then I discovered that Python 3.0 was released two years ago... and then a 3.0.1, and then... well, they are up to 3.1.2 now, I think. The point being, all of these are backwards-incompatible with what I've just been learning... and the next Ubuntu distro will come with Python 3-point-something and no version 2 :P

- Unlike C++, Python gives the programmer little control over the scope of objects and relies on garbage collection to avoid running out of memory. Currently the main algorithm is based on reference counting, which is a heavy overhead, is seriously flawed when there is cyclically linked garbage, and gives no guarantee that object destructors will ever get called... so you can't rely on it to do things like flushing buffers and closing files on exit.

I wonder how Perl does on these?

slavik
August 19th, 2010, 12:06 PM
I couldn't agree more.
I've had the same problem with ASP.NET / C#.

The thing is that speed is only as high as the slowest component.
Which for a web application is neither server nor database performance; it's data throughput per second, page reloading, and the client's web browser.

You're much better off using PHP than ASP.NET, although PHP is certainly much slower in processing than C#. The reason is the architecture of ASP.NET. It uses (when not MVC) WebForms, which rely on viewstates and postbacks. It aims at eliminating JavaScript. But... that means whenever you click a button, all the info contained in the page is put in the viewstate, the viewstate is sent to the server, some code is processed using the viewstate, and the page is sent and loaded again by the client, with such marvellous updates as a red frame around a textbox indicating an input error.

So whenever you do just about anything, a lot of behind-the-scenes data transfer is taking place, using up bandwidth, time, and server power, while in PHP you simply call a JavaScript/jQuery function in the page, which runs directly on the client.

So the ASP.NET architecture makes the application slow and the server load heavy, even when it is completely unnecessary.

Since what makes the application slow is all the data transfer and reloading, throwing additional servers at it will not solve the problem.

And the second bottleneck is usually the database.
Switching from MySQL/PostgreSQL to Firebird might just bring much more than changing programming languages or adding another server.
I have seen Databases cause issues. Oracle tends to re-run statistics on tables if they change more than 10% of the data. The new statistics were screwed up for some reason (not an Oracle DB expert here) and caused some giant horrible queries (built via lazily configured hibernate) to run slow as molasses. Needless to say, it was a full day outage for an application that brings in 600M USD per year in revenue.

trent.josephsen
August 19th, 2010, 02:48 PM
As a Python noob I report back with some things I found that I really DON'T like about the language.

- I was happily learning Python on my latest Ubuntu distro, which came with Python 2.6.4. Then I discovered that Python 3.0 was released two years ago... and then a 3.0.1, and then... well, they are up to 3.1.2 now, I think. The point being, all of these are backwards-incompatible with what I've just been learning... and the next Ubuntu distro will come with Python 3-point-something and no version 2 :P

- Unlike C++, Python gives the programmer little control over the scope of objects and relies on garbage collection to avoid running out of memory. Currently the main algorithm is based on reference counting, which is a heavy overhead, is seriously flawed when there is cyclically linked garbage, and gives no guarantee that object destructors will ever get called... so you can't rely on it to do things like flushing buffers and closing files on exit.

I wonder how Perl does on these?
The big upgrade is 2 -> 3; all the 2 versions are backwards compatible, as are all the 3 versions (I'm pretty sure).

Perl is experiencing a similar transition on a larger scale (heard of Perl 6?). In both cases development on both branches is ongoing; Python 2 and Perl 5 will continue to be used and developed as if their successors do not exist. The transition will be slow, but probably slower for Perl; Python is more mathematical and strongly structured, which makes it easier to automate code translation.

Perl also relies on reference counting for runtime GC. But if you're depending on destructors to do things like buffer flushing and file closing, and you're complaining because you don't have control over when it happens, that's a flaw of garbage collection in general, not reference counting in particular. Even in C, nothing is guaranteed to happen when you call free(); many implementations don't release recently freed memory until program termination (it saves time if you need to malloc() something else shortly afterwards). I imagine C++ works much the same way.
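For what it's worth, the usual Python answer to "destructor timing is unpredictable" is to skip __del__ entirely and use the context-manager protocol, which gives deterministic cleanup regardless of when the garbage collector runs. A minimal sketch (the Resource class and its log are purely illustrative):

```python
class Resource:
    """Toy resource that records when it is opened and closed."""
    log = []

    def __enter__(self):
        Resource.log.append("open")
        return self

    def __exit__(self, exc_type, exc, tb):
        # Runs deterministically when the with-block exits,
        # even if an exception was raised inside it.
        Resource.log.append("close")
        return False  # do not suppress exceptions

with Resource():
    Resource.log.append("work")

print(Resource.log)  # → ['open', 'work', 'close']
```

The cleanup here depends on block exit, not on when (or whether) the object is ever collected.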

shawnhcorey
August 19th, 2010, 03:51 PM
Perl is experiencing a similar transition on a larger scale (heard of Perl 6?). In both cases development on both branches is ongoing; Python 2 and Perl 5 will continue to be used and developed as if their successors do not exist. The transition will be slow, but probably slower for Perl; Python is more mathematical and strongly structured, which makes it easier to automate code translation.

Perl 6 is supposed to be able to detect Perl 5 programs and switch compilers automatically. (Or at least, that's the theory.)


Perl also relies on reference counting for runtime GC. But if you're depending on destructors to do things like buffer flushing and file closing, and you're complaining because you don't have control over when it happens, that's a flaw of garbage collection in general, not reference counting in particular. Even in C, nothing is guaranteed to happen when you call free(); many implementations don't release recently freed memory until program termination (it saves time if you need to shortly malloc() something else). I imagine C++ works much the same way.

Many OSes do not have a way for you to return memory to the system. When you use malloc(), a chunk of memory is taken from the system, but when you use free(), it is put on a queue which is examined first when malloc() is called again. Only if it can't find enough room in this free list does it get more from the system.

This is one thing you have to watch for when you are writing daemons. Many of them will restart themselves and kill off the old process to get away from this problem.

worseisworser
August 19th, 2010, 04:32 PM
As a Python noob I report back with some things I found that I really DON'T like about the language.

- I was happily learning Python on my latest Ubuntu distro, which came with Python 2.6.4. Then I discovered that Python 3.0 was released two years ago... and then a 3.0.1, and then... well, they are up to 3.1.2 now, I think. The point being, all of these are backwards-incompatible with what I've just been learning... and the next Ubuntu distro will come with Python 3-point-something and no version 2 :P


Backward compatibility is not something one gets for free; take a look at C++, for instance.



- Unlike C++, Python gives the programmer little control over the scope of objects and relies on garbage collection to avoid running out of memory. Currently the main algorithm is based on reference counting, which is a heavy overhead, is seriously flawed when there is cyclically linked garbage, and gives no guarantee that object destructors will ever get called... so you can't rely on it to do things like flushing buffers and closing files on exit.

I wonder how Perl does on these?

Uh. The GC in Python handles cycles.
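To make that concrete: pure reference counting alone would leak a cycle, but CPython's cyclic collector in the gc module reclaims it. A small demonstration (the Node class is just for illustration; the weakref lets us observe the reclamation):

```python
import gc
import weakref

class Node:
    def __init__(self):
        self.partner = None

# Build a reference cycle: a <-> b
a, b = Node(), Node()
a.partner, b.partner = b, a
probe = weakref.ref(a)  # lets us observe when 'a' is reclaimed

# Drop the only external references. Reference counting alone would
# leak the pair, because each node still refers to the other.
del a, b
gc.collect()            # the cycle detector finds and frees the pair

print(probe() is None)  # → True
```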

WitchCraft
August 19th, 2010, 05:18 PM
I have seen Databases cause issues. Oracle tends to re-run statistics on tables if they change more than 10% of the data. The new statistics were screwed up for some reason (not an Oracle DB expert here) and caused some giant horrible queries (built via lazily configured hibernate) to run slow as molasses. Needless to say, it was a full day outage for an application that brings in 600M USD per year in revenue.

I've done something even better.
I've used a table-valued function in MS-SQL to create the equivalent of a parametrized view.

Now the thing is, it worked very fast on my development machine, where the tables have around 10 to 100 test entries.

But on the production server, with millions of rows...
Made me realize that table-valued functions cache the temporary table in RAM, which will be too small for millions of entries, which means the database system begins swapping - argh, the HD was extremely slow ...

worksofcraft
August 20th, 2010, 12:39 AM
...
Uh. The GC in Python handles cycles.

What I read is that it isn't really guaranteed to work and if it does then there is no telling if and when the destructor gets called.

In C++ when you 'new' an object the constructor gets called and when you 'delete' it then the destructor gets called and it is intended to be under full control of the programmer. GC if any is just an optional belt and braces safety net.

Now I do want to stick to the OP's topic about Python, and on reflection I suppose the GC issue is a red herring. My actual concern is this:



class Base:
    def __init__(self):
        print ('base init')

    def __del__(self):
        print ('base del')

class Derive(Base):
    def __init__(self):
        Base.__init__(self)
        print ('derive init')

    def __del__(self):
        print ('derive del')
        Base.__del__(self)

d = Derive()  # construct one
d.__del__()   # I'm done with it now
print ('done')
# OMG it gets destroyed a second time now!!


I don't like the fact that I have to explicitly call the constructors and destructors of the base class: it seems like sacrilege that a sloppy programmer might create an object that has only been partially constructed! As for the destructors, well, they are utterly useless because of their unpredictability!

StephenF
August 20th, 2010, 01:11 AM
I don't like the fact I have to explicitly call the constructors and destructors of the base class.
You only need to call base class methods explicitly if you override methods of the base class, otherwise they are inherited.


class Base(object):
    def __init__(self):
        print "Constructor was run."

class Derived(Base):
    pass

Derived()

It is assumed when a method is overridden that the method of the subclass knows how the base class method is to be called, with what parameters, or whether it is to be called at all.

In your example you print "derive init" after "base init", but you have the power to switch this around by putting the explicit call after the print statement, or to call any of the other methods of the base class instead of, or in addition to, that one.
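It may also be worth noting that the explicit Base.__init__(self) call can be written with super(), which follows the class's method resolution order; this sketch (class names are illustrative, and uses new-style classes so super() works) records the initialization order in a list:

```python
class Base(object):
    def __init__(self):
        self.trace = ["base init"]

class Derive(Base):
    def __init__(self):
        # super() walks the method resolution order, so the base
        # initializer runs exactly once, even under multiple inheritance.
        super(Derive, self).__init__()
        self.trace.append("derive init")

d = Derive()
print(d.trace)  # → ['base init', 'derive init']
```

Swapping the two lines in Derive.__init__ swaps the order, just as with the explicit call.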

worksofcraft
August 20th, 2010, 02:21 AM
You only need to call base class methods explicitly if you override methods of the base class, otherwise they are inherited.

It is assumed when a method is overridden that the method of the subclass knows how the base class method is to be called, with what parameters, or whether it is to be called at all.

In your example you print "derive init" after "base init", but you have the power to switch this around by putting the explicit call after the print statement, or to call any of the other methods of the base class instead of, or in addition to, that one.

The purpose of a constructor is to initialize an instance of a class so that it becomes a valid object. In my example you have to mentally picture the print statement as doing something vital and necessary to make the object operational.

IMHO it is a serious weakness in the language when the base-class component of a derived class can easily be used without having been correctly initialized.

StephenF
August 20th, 2010, 03:07 AM
IMHO it is a serious weakness in the language when the base-class component of a derived class can easily be used without having been correctly initialized.
When this does happen, it is usually accompanied by a traceback of some kind, which is a hell of a lot better than a segmentation fault - therefore not much of a problem.

slavik
August 20th, 2010, 09:02 AM
I've done something even better.
I've used table-valued function in MS-SQL to create the equivalent of a parametrized view.

Now the thing is, it worked very fast on my development machine, where the tables have around 10 to 100 test entries.

But on the production server, with millions of rows...
Made me realize that table-valued functions cache the temporary table in RAM, which will be too small for millions of entries, which means the database system begins swapping - argh, the HD was extremely slow ...
Sir, I will top that ... TWICE!!!

1. Same application as before. This time, this is a database for BPM (Aqualogic BPM (Business Process Management). Aqualogic was bought by BEA to replace Tuxedo, and BEA was bought by Oracle). So ... there is a table update that needs to happen. Alter table Blah, add 3 columns, default null. The table has 30+ million records. So now the DB has to re-organise the table storage and set the default values. 20 minutes later, a cron job runs that kills that session since it was long-running ... ROLLBACK! ;)

2. Similar issue as yours, except there was an update to a table on every page visit (some kind of security thing). Issue 1: the number of connections the DB allowed from the app server was too low. Issue 2: the connection pools would get overloaded. Turns out, when the developers tested the code in production, it was 10 people (so everyone basically got their own connection to the DB). The load that morning at 9AM was 2k people. This also caused threads to just sit there waiting for the DB. All-day fun ;)

WitchCraft
August 20th, 2010, 10:51 AM
Sir, I will top that ... TWICE!!!

1. Same application as before. This time, this is a database for BPM (Aqualogic BPM (Business Process Management). Aqualogic was bought by BEA to replace Tuxedo, and BEA was bought by Oracle). So ... there is a table update that needs to happen. Alter table Blah, add 3 columns, default null. The table has 30+ million records. So now the DB has to re-organise the table storage and set the default values. 20 minutes later, a cron job runs that kills that session since it was long-running ... ROLLBACK! ;)

Hilarious. But why does a cron job kill the session ?




2. Similar issue as yours, except there was an update to a table on every page visit (some kind of security thing). Issue 1: the number of connections the DB allowed from the app server was too low. Issue 2: the connection pools would get overloaded. Turns out, when the developers tested the code in production, it was 10 people (so everyone basically got their own connection to the DB). The load that morning at 9AM was 2k people. This also caused threads to just sit there waiting for the DB. All-day fun ;)

:lolflag: good one, could have been mine ;-))

WitchCraft
August 20th, 2010, 10:59 AM
What I read is that it isn't really guaranteed to work and if it does then there is no telling if and when the destructor gets called.


Usually on program/function exit/termination, but it's true, that's the general problem of managed/interpreted code.

Though usually there is a function like GC.collect(), which collects all 'pending' garbage. However, it is most of the time not advisable to do so, because the definition of 'pending' is a bit difficult since there's no explicit variable/object lifetime.

shawnhcorey
August 20th, 2010, 01:22 PM
Usually on program/function exit/termination, but it's true, that's the general problem of managed/interpreted code.

No, that's the problem of all code that uses malloc(3), including C and C++. As I said above, some OSes don't even have a function that returns freed memory back to the system. It only gets returned when the process dies.

trent.josephsen
August 20th, 2010, 03:05 PM
What I read is that it isn't really guaranteed to work and if it does then there is no telling if and when the destructor gets called.

In C++ when you 'new' an object the constructor gets called and when you 'delete' it then the destructor gets called and it is intended to be under full control of the programmer. GC if any is just an optional belt and braces safety net.

Now I do want to stick to the OP's topic about Python and on reflection I suppose the GC issue is a red-herring. My actual concern is this:



class Base:
    def __init__(self):
        print ('base init')

    def __del__(self):
        print ('base del')

class Derive(Base):
    def __init__(self):
        Base.__init__(self)
        print ('derive init')

    def __del__(self):
        print ('derive del')
        Base.__del__(self)

d = Derive()  # construct one
d.__del__()   # I'm done with it now
print ('done')
# OMG it gets destroyed a second time now!!


I don't like the fact I have to explicitly call the constructors and destructors of the base class: It seems like sacrilege when a sloppy programmer might create an object that has only been partially constructed! As for the destructors well they are utterly useless because of their unpredictability!
Python allows multiple inheritance, which means that it can't make assumptions about which superclass constructors to call and in what order. How would you suggest resolving this "problem"? Keep in mind that calling two constructors (or destructors) on the same object, or in the wrong order, could be disastrous.

As for __del__ being called twice, calling __del__ is not the same as deleting an object. Python provides the 'del' operator for that. FYI.
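The difference is easy to demonstrate: calling __del__ by hand is just a method call, while the del statement removes a reference, and in CPython specifically the finalizer then runs as soon as the last reference disappears (that immediacy is a reference-counting implementation detail, not a language guarantee). A sketch with an illustrative class:

```python
class Tracked:
    events = []

    def __del__(self):
        # Invoked by the interpreter when the last reference to the
        # instance disappears (in CPython, immediately).
        Tracked.events.append("finalized")

t = Tracked()
del t                  # unbinds the name; refcount drops to zero
print(Tracked.events)  # → ['finalized']
```

Calling t.__del__() directly, as in the earlier example, runs the method but leaves the object alive, which is why the finalizer then fires a second time later.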

shawnhcorey
August 20th, 2010, 03:30 PM
Python allows multiple inheritance, which means that it can't make assumptions about which superclass constructors to call and in what order. How would you suggest resolving this "problem"? Keep in mind that calling two constructors (or destructors) on the same object, or in the wrong order, could be disastrous.

Perl allows multiple inheritance in the order listed. If you declare object X's parent to be A, B, and C, it will search A first, then B, then C. If you say, B, C, A, it will search B first, then C, then A.

Also, if you plan to do a lot of object-oriented programming in Perl, it's better to use a toolkit for it, like Moose (http://search.cpan.org/search?query=Moose&mode=all).
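Python behaves comparably for new-style classes: the method resolution order is derived (via C3 linearization) from the order of the base-class list, and you can inspect it directly. A quick sketch with throwaway class names:

```python
class A(object): pass
class B(object): pass
class C(object): pass

class X(A, B, C):  # search order follows the base list: A, then B, then C
    pass

print([k.__name__ for k in X.__mro__])  # → ['X', 'A', 'B', 'C', 'object']
```

Declaring `class X(B, C, A)` instead would yield the order B, C, A, matching the Perl behaviour described above.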

slavik
August 20th, 2010, 03:58 PM
Hilarious. But why does a cron job kill the session ?




:lolflag: good one, could have been mine ;-))
the cron job kills any long running queries. or sessions ...

worseisworser
August 20th, 2010, 04:32 PM
What I read is that it isn't really guaranteed to work

Perhaps you could provide a source for what you've been reading(?), because I sure would be surprised if this was the case. While I do not have extensive knowledge of the GC as implemented in CPython -- it is certainly not a general problem for several other GC implementations.



... there is no telling ... when the destructor gets called.

This tends not to be a problem; I'm sure you already knew this.



In C++ when you 'new' an object the constructor gets called and when you 'delete' it then the destructor gets called and it is intended to be under full control of the programmer.

You talk as if allocation and deallocation on the heap in performance critical tight loops is a good idea. O_o

That is not to say that any language or environment with a GC cannot possibly also support the deterministic, simple handling of memory/object/resource lifetime you talk of -- for when this makes sense. One example is the `dynamic-extent' declaration seen in Common Lisp, and of course the several `with-xxxxxx' type constructs; macros that expand to `open + unwind-protect + close' type combinations. I'm sure the Python know-hows around here can fill us in as to whether Python has something similar, but I'm pretty sure it does.

..heck, this stuff even actually works; even when combined with exceptions -- you know; because of `unwind-protect' and us having a GC around.. The same can not be said for e.g. C++: http://yosefk.com/c++fqa/exceptions.html .. I know; none of you people actually deal with non-trivial software that could possibly benefit from the usage of "exotic" (lol..) concepts and tools like e.g. conditions, exceptions and restarts anyway.




GC if any is just an optional belt and braces safety net.

ORLY. I'll take my "belt and braces safety net" (F16) and climb miles higher than you (WW1 fighter) or any other more or less tool-less and simple animal (bird) ever could.

CptPicard
August 20th, 2010, 05:58 PM
I'm sure the Python know-hows around here can fill us in as to whether Python has something similar; I'm pretty sure it does.

The "with"-statement. Also, decorators can be used to similar effect. Naturally I prefer the Lisp way, as the macro style means it is neither hardcoded into the language nor just a way to chain functions...
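For readers following along: the decorator-based form lives in contextlib, and behaves much like the `open + unwind-protect + close` combination described above, since the finally-clause runs on both normal and exceptional exit. A sketch (the managed function and its log are hypothetical):

```python
from contextlib import contextmanager

log = []

@contextmanager
def managed(name):
    log.append("acquire " + name)
    try:
        yield name          # control passes to the with-block body
    finally:
        # Runs on normal exit *and* when an exception propagates,
        # much like unwind-protect.
        log.append("release " + name)

with managed("db") as n:
    log.append("use " + n)

print(log)  # → ['acquire db', 'use db', 'release db']
```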

Very good post, btw.

darkstarbyte
August 20th, 2010, 06:20 PM
C++, Perl, Java, and python are all c based languages. Perl was written in nov 14, 1999, or at least when the person or people started to write it. Python was written in c, but before C++, Perl, Java, and python was c before that binary before that analog computers but before quantom computers was the analitical engine and before that many gear based calculators but before that the abacus and posibly before all that rocks or something because a computer is anything that can perform calculations so there i win. One more thing, "Stick with what you know and posibly branch out later."

:lolflag:

shawnhcorey
August 20th, 2010, 06:32 PM
C++, Perl, Java, and python are all c based languages. Perl was written in nov 14, 1999, or at least when the person or people started to write it. Python was written in c, but before C++, Perl, Java, and python was c before that binary before that analog computers but before quantom computers was the analitical engine and before that many gear based calculators but before that the abacus and posibly before all that rocks or something because a computer is anything that can perform calculations so there i win. One more thing, "Stick with what you know and posibly branch out later."

:lolflag:

Actually, before the electronic kind, a computer was a person who did calculations, usually with a slide rule (http://en.wikipedia.org/wiki/Slide_rule). Engineering firms used to have big, hangar-sized rooms full of row upon row of desks, at each of which sat a computer doing calculations.

darkstarbyte
August 20th, 2010, 06:56 PM
The reason I didn't include that last part is that if I had, I would be here all day, making everyone mad at me for leaving such a long comment.

schauerlich
August 20th, 2010, 07:05 PM
Perl was written in nov 14, 1999, or at least when the person or people started to write it.
wat (http://en.wikipedia.org/wiki/Perl#History)


Python was written in c, but before C++, Perl
c++ (http://en.wikipedia.org/wiki/C%2B%2B#History)
python (http://en.wikipedia.org/wiki/Python_(programming_language)#History)

Like I said, wat

slavik
August 20th, 2010, 09:44 PM
Perhaps you could provide a source for what you've been reading(?), because I sure would be surprised if this was the case. While I do not have extensive knowledge of the GC as implemented in CPython -- it is certainly not a general problem for several other GC implementations.




This tends not to be a problem; I'm sure you already knew this.




You talk as if allocation and deallocation on the heap in performance critical tight loops is a good idea. O_o

That is not to say that any language or environment with a GC cannot possibly also support the deterministic, simple handling of memory/object/resource lifetime you talk of -- for when this makes sense. One example is the `dynamic-extent' declaration seen in Common Lisp, and of course the several `with-xxxxxx' type constructs; macros that expand to `open + unwind-protect + close' type combinations. I'm sure the Python know-hows around here can fill us in as to whether Python has something similar, but I'm pretty sure it does.

..heck, this stuff even actually works; even when combined with exceptions -- you know; because of `unwind-protect' and us having a GC around.. The same can not be said for e.g. C++: http://yosefk.com/c++fqa/exceptions.html .. I know; none of you people actually deal with non-trivial software that could possibly benefit from the usage of "exotic" (lol..) concepts and tools like e.g. conditions, exceptions and restarts anyway.




ORLY. I'll take my "belt", "braces" and "safety net" (F16) and climb miles higher than you (WW1 fighter) or any other more or less tool-less and simple animal (bird) ever could.
I am sorry ... what?! If Java's GC, braces, and belt were that good, I wouldn't be doing what I do.

worksofcraft
August 20th, 2010, 10:34 PM
Python allows multiple inheritance, which means that it can't make assumptions about which superclass constructors to call and in what order. How would you suggest resolving this "problem"? Keep in mind that calling two constructors (or destructors) on the same object, or in the wrong order, could be disastrous.

As for __del__ being called twice, calling __del__ is not the same as deleting an object. Python provides the 'del' operator for that. FYI.

What "problem"? It seems blatantly obvious that you should initialize the base classes before the bits that are derived from them and could depend on them. Destroy them in reverse order.

Personally I see absolutely no benefit in depriving the programmer of the ability to control the scope of variables by making use of the heap and GC compulsory even when it is inappropriate.

eeperson
August 21st, 2010, 12:49 AM
What "problem"? It seems blatantly obvious you should initialize the base classes before the bits that are derived from it and could be dependent on it. Destroy them in reverse order.


This still doesn't address the issue of multiple inheritance. If you have two superclasses which one do you construct first? The same applies to destructors. Also, if you have a superclass with more than one constructor or one that takes parameters you are still going to have to call the constructor explicitly. This last part is true for all languages that support this kind of multiple inheritance, including C++.



Personally I see absolutely no benefit in depriving the programmer of an ability to control the scope of variables by making use of heap and GC compulsory even when it is inappropriate.


The benefit of GC is that there is less bookkeeping that has to be done manually. I'm a little confused about your stance. You don't like having to explicitly close resources in Python, so you would much rather work in a language where you have to explicitly delete everything you allocate?

worksofcraft
August 21st, 2010, 01:54 AM
If you have two superclasses which one do you construct first?

This is not an issue. The point is that things are constructed before they get used. Competing superclasses will not use each other during their construction, so a simple rule like "do them in the order specified by the superclass derivation" is perfectly adequate.

The point of having languages like Python and Perl is to automate things and make it easier to program than in languages like C++.




The benefit of GC is that there is less book keeping that has to been done manually. I'm a little confused about your stance. You don't like having to explicitly close resources in Python so you would much rather work in a language where you have to explicitly delete everything you allocate?

Even in ancient C most variables have automatic and highly efficient allocation AND deletion (using a stack). Their scope is well defined and easy to understand. There is no "book keeping" involved and no requirement to explicitly delete things.

GC is fine when you really need to use "the heap", but retaining an option to explicitly delete things is a great asset to house keeping in languages where destructors do work predictably.

eeperson
August 21st, 2010, 05:40 AM
This is not an issue. The point is that things get constructed before they get used. Competing superclasses will not use each other during their construction, so a simple rule like "do them in the order specified by the superclass derivation" is perfectly adequate.

The point of having languages like Python and Perl is to automate things and make it easier to program than in languages like C++.


I think that is more along the lines of 'probably should not' have superclasses use each other during construction. Although there certainly could be scenarios where that would be necessary.

My impression of Python is that it tends to favor being more explicit than implicit. I would argue that Python is still easier to program than C++ despite this one aspect being more verbose (maybe even because of it). I can certainly understand if you would prefer something more concise. In my limited experience with Perl, I find that it tends to emphasize conciseness much more (sometimes at the expense of readability).



Even in ancient C most variables have automatic and highly efficient allocation AND deletion (using a stack). Their scope is well defined and easy to understand. There is no "book keeping" involved and no requirement to explicitly delete things.

GC is fine when you really need to use "the heap", but retaining an option to explicitly delete things is a great asset to house keeping in languages where destructors do work predictably.


I don't see how the C rules for releasing memory are any better defined or easier to understand. C has two separate sets of rules for the heap and the stack. Python has one rule: 'if it has no more references then it is collected'.

Of course the bookkeeping savings are going to be when you are using the 'heap'. That's the place where things tend to get more complicated. At the same time you're not really losing anything when you're using variables that would be on the 'stack'.

I'm not really sure what you gain by explicitly deleting resources. You would still have to remove all of the references to the object before you delete it. All you are doing is adding an extra step.

CptPicard
August 21st, 2010, 09:33 AM
Personally I see absolutely no benefit in depriving the programmer of an ability to control the scope of variables by making use of heap and GC compulsory even when it is inappropriate.

It removes much of mostly pointless manual labour which is prone to causing a large class of common bugs. Also, some really interesting language constructs such as closures essentially require the existence of a garbage collector, as the lifecycle of a closure is pretty much required to be quite implicit for it to be useful... if you had to manually tell it when to clean up the bindings, things would get complicated.

Personally, I see no real reason, at least in the general case, to give up the benefits so that I could "get" to manage my own memory. Of course in the corner cases where it is actually needed, you lose the ability, but that's just a tradeoff one has to make.

Interestingly, by the way, in Lisp you still can use malloc to your heart's content if you want to.



I'm not really sure what you gain by explicitly deleting resources.

It's the determinism, and the time savings in not running the heap analysis of the GC. Whether this is a requirement or not, depends.

worseisworser
August 21st, 2010, 10:23 AM
IMHO it is a serious weakness in the language when the base-class component of a derived class can easily be used without having been correctly initialized.

Why?

Do you think a language that allows one to try to read from an uninitialized variable residing outside of a class/object in general has a "serious weakness" too?



(defclass super ()
  ((some-field :initform (write-line "SUPER SOME-FIELD"))))

(defmethod initialize-instance :before ((super super) &key)
  (write-line ":BEFORE SUPER"))

(defmethod initialize-instance :after ((super super) &key)
  (write-line ":AFTER SUPER"))

(defclass sub (super)
  ((some-other-field :initform (write-line "SUB SOME-OTHER-FIELD"))))

(defmethod initialize-instance :before ((sub sub) &key)
  (write-line ":BEFORE SUB"))

(defmethod initialize-instance :after ((sub sub) &key)
  (write-line ":AFTER SUB"))

(defmethod initialize-instance :around ((sub sub) &key)
  (write-line ":AROUND SUB {")
  ;; This is a "serious weakness"? Why? Here an exception will be thrown, but
  ;; since we have a GC and stuff like `unwind-protect' available it is not a problem.
  (restart-case
      (pprint (slot-value sub 'some-field))
    (continue ()))
  (call-next-method)
  (write-line ":AROUND SUB }"))





CL-USER> (make-instance 'sub)
:AROUND SUB {
; Evaluation aborted.
CL-USER>


Yes, things were aborted (and I was shown a debugger in my IDE here); but that is not a "serious problem or weakness", as nothing will leak.




CL-USER> (handler-bind ((error (lambda (c) (invoke-restart 'continue))))
(make-instance 'sub))
:AROUND SUB {
:BEFORE SUB
:BEFORE SUPER
SUPER SOME-FIELD
SUB SOME-OTHER-FIELD
:AFTER SUPER
:AFTER SUB
:AROUND SUB }

#<SUB {100414B9A1}>
CL-USER>


..and as you can see it is not a problem in a run-time end-user context either (where I've obviously wrapped my code or application in a handler); I could try to use uninitialized objects and/or their slots/fields all day long without problems.

I mean, sure; it can cause serious trouble (memory corruptions, leaks) in languages like C++, but not in higher-level languages/environments with the tools and power to deal with this sort of thing.

worksofcraft
August 21st, 2010, 11:09 AM
Why?

Do you think a language that allows one to try to read from an uninitialized variable outside of a class/object in general has a "serious weakness" too?


With effective use of constructors and destructors I used to find it very productive to ensure that everything will always be in a consistent and predictable state when it gets used.

I only just started with Python but find it is quite sloppy and error prone in many ways. That is fine for some situations as I wasn't looking for a substitute for languages like C++.

One of Python's strengths is its ease of use in rapid prototyping, which has resulted in its use in numerous high-quality online tutorials. Another is its ability to bring together functionality from other packages and libraries.

worseisworser
August 21st, 2010, 12:19 PM
With effective use of constructors and destructors I used to find it very productive to ensure that everything will always be in a consistent and predictable state when it gets used.

.. "..effective use.." what does this even mean?

Anyway, this doesn't mean construction/initialization is always a simple, sequential, almost atomic operation where time does not exist while it is in progress.

There can be a lot of coordination and back-and-forth needed when constructing/initializing objects -- especially when slots/fields have interdependencies, and the ability to control or override details with regards to order is useful.

shawnhcorey
August 21st, 2010, 01:06 PM
It's the determinism, and the time savings in not running the heap analysis of the GC. Whether this is a requirement or not, depends.

Well, I think you're worried about something that doesn't happen very often. Python, like Perl and Ruby, will free up heap resources when their reference count goes to zero. That's immediate. GC is only needed when you create a cycle of references and don't break the cycle when finished. To avoid GC, write a delete function that will break the cycle. Then the rest of the resource will be freed automatically. That means, for 99% of your work, you don't need to write a delete function.
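That "delete function" idea can be sketched as follows; the `Node`/`close` names are illustrative, and the `weakref` probe just makes the immediate reclamation visible (CPython behaviour):

```python
import weakref

class Node:
    def __init__(self, name):
        self.name = name
        self.partner = None

    def close(self):
        # The hand-written "delete function": break the cycle so plain
        # reference counting can reclaim both objects on its own.
        self.partner = None

a, b = Node("a"), Node("b")
a.partner, b.partner = b, a     # a <-> b forms a reference cycle

probe = weakref.ref(a)
a.close()                        # cycle broken
del a, b                         # refcounts now reach zero; no cycle GC needed
print(probe() is None)           # True (CPython)
```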

eeperson
August 21st, 2010, 04:47 PM
It removes much of mostly pointless manual labour which is prone to causing a large class of common bugs. Also, some really interesting language constructs such as closures essentially require the existence of a garbage collector, as the lifecycle of a closure is pretty much required to be quite implicit for it to be useful... if you had to manually tell it when to clean up the bindings, things would get complicated.

Personally, I see no real reason to at least in the general case to give up the benefits so that I could "get" to manage my own memory. Of course in the corner cases where it is actually needed, you lose the ability, but that's just a tradeoff one has to make.

Interestingly, by the way, in Lisp you still can use malloc to your heart's content if you want to.


That's a good point about GC and closures. Closures can vastly simplify the expression of a large class of problems.

I was not aware that Lisp supported manual memory allocation. That's good to know.



It's the determinism, and the time savings in not running the heap analysis of the GC. Whether this is a requirement or not, depends.


I should have been clearer. I meant to imply that explicitly deleting resources doesn't gain you anything in terms of making a program easier to understand. I'm aware that there are other reasons to avoid GC.

Also, whether or not GC is deterministic depends on the type of GC. Tracing GC, such as that used by Java, is non-deterministic. However, reference counting GC, such as that used by Python and Perl, can be. Otherwise destructors would make no sense in Python.

CptPicard
August 21st, 2010, 06:40 PM
Well, I think you're worried about something that doesn't happen very often.

Oh no, I am not the one to be worried about it, I'm just briefly mentioning why some people may worry about it.



That's a good point about GC and closures. Closures can vastly simplify the expression of a large class of problems.

Not to mention that closures can be seen as the conceptual foundation of something like object-oriented programming...



I was not aware that Lisp supported manual memory allocation.

Well, in the sense that its foreign function interface can be used to call anything, malloc included. Then you'll just have to use the typical Lisp idioms to call free() at an opportune time.

worksofcraft
August 22nd, 2010, 11:54 PM
I think that is more along the lines of 'probably should not' have superclasses use each other during construction. Although there certainly could be scenarios where that would be necessary.


When you inherit from a base class, the base class can't have any knowledge of what else is in the derived class because it wouldn't be a base class then.
So I really think we must be talking at cross purposes here.



I dont't see how the C rules for releasing memory are any more defined and or easier to understand. C has two separate sets of rules for the heap and the stack. Python has one rule 'if it has no more references then it is collected'.


Actually Python has a different rule for garbage that somehow got cyclically linked, but I think I made it quite clear it's not an issue of recovering lost memory; it's an issue of the destructors getting called. C++ has one rule: the constructor gets called when an object is created, and the destructor gets called when it is destroyed.

On the one hand you say it is good that Python lets you choose if and when to call constructors and destructors but on the other you don't think that calling the destructor explicitly should actually destroy said object :confused:



Not to mention that closures can be seen as the conceptual foundation of something like object-oriented programming...


It would help if you can explain what you mean by "closure".

p.s. In C++ there is a template class for reference counting "smart pointers" that you can use when needed but for this topic the question is really how does Python compare with the way Perl invokes constructors and destructors? e.g. Can you neglect to initialize objects in Perl too and will they get destroyed when they go out of scope, or at some indeterminate later stage?

schauerlich
August 23rd, 2010, 02:03 AM
Constructor gets called when an object is created. Destructor gets called when it is destroyed.

On the one hand you say it is good that Python lets you choose if and when to call constructors and destructors but on the other you don't think that calling the destructor explicitly should actually destroy said object :confused:

Use the del keyword.


It would help if you can explain what you mean by "closure".

http://en.wikipedia.org/wiki/Closure_(computer_science)

Basically, it's a function defined in some context (such as within another function) that can "capture" variables in that context and carry them with it.

For instance, in Common Lisp:


CL-USER> (defun make-incrementer (start-value)
           (flet ((incrementer ()
                    (incf start-value)))
             #'incrementer))
MAKE-INCREMENTER
CL-USER> (defvar my-incrementer (make-incrementer 10))
MY-INCREMENTER
CL-USER> (funcall my-incrementer)
11
CL-USER> (funcall my-incrementer)
12
CL-USER> (funcall my-incrementer)
13
Here I define a function which returns another function (closure), "incrementer". It's a function which has captured the variable "start-value" from its surrounding context (the body of make-incrementer) and allows it to "live on" past where you'd expect it to go out of scope just looking at the code. Every time I call "my-incrementer", it increments the current value of "start-value" and returns it. So, basically, it's a function which can carry around state with it indefinitely. Combined with message passing, you have the basis for an object system. Structure and Interpretation of Computer Programs (http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-20.html#%_sec_3.1.1) has some interesting stuff about this (in Scheme).
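For comparison, a rough Python 3 equivalent of the same incrementer. It uses `nonlocal` to rebind the captured variable, which is the only part that differs much from the Lisp version:

```python
def make_incrementer(start_value):
    def incrementer():
        # 'start_value' is captured from the enclosing call and
        # lives on after make_incrementer has returned.
        nonlocal start_value
        start_value += 1
        return start_value
    return incrementer

my_incrementer = make_incrementer(10)
print(my_incrementer())   # 11
print(my_incrementer())   # 12
print(my_incrementer())   # 13
```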

worksofcraft
August 23rd, 2010, 03:01 AM
Basically, it's a function defined in some context (such as within another function) that can "capture" variables in that context and carry them with it...


Thanks for that explanation :)

My initial thought on functions that carry around data from elsewhere is that I would put the data in an object and make the function a method of that class.

I must admit I have no intention of learning Lisp as well though. I have more than enough to contend with just learning practical aspects of programming.

This is precisely why I am interested in Python. I could have chosen Perl, but Python seems to be more fashionable and I wouldn't even be surprised if Google add it as a scripting language choice to their web browser one day. :D

eeperson
August 23rd, 2010, 04:11 AM
When you inherit from a base class, the base class can't have any knowledge of what else is in the derived class because it wouldn't be a base class then.
So I really think we must be talking at cross purposes here.


I thought we were talking about two base classes knowing about each other not a base class knowing about derived class?



Actually python has a different rule for garbage that somehow got cyclicly linked, but I think I made quite clear it's not an issue of recovering lost memory it's an issue of the destructors getting called. C++ has one rule: Constructor gets called when an object is created. Destructor gets called when it is destroyed.

On the one hand you say it is good that Python lets you choose if and when to call constructors and destructors but on the other you don't think that calling the destructor explicitly should actually destroy said object :confused:




p.s. In C++ there is a template class for reference counting "smart pointers" that you can use when needed but for this topic the question is really how does Python compare with the way Perl invokes constructors and destructors? e.g. Can you neglect to initialize objects in Perl too and will they get destroyed when they go out of scope, or at some indeterminate later stage?


It's not that I think it is good that Python lets you decide when to call base class constructors and destructors; it's that it is not possible for Python to automatically determine how and when to call them. Python could pick a default and automatically handle certain cases (as C++ does for zero-argument constructors), but that still would not force you to initialize or destroy the base class properly.

Python takes a hybrid approach to garbage collection (GC). For the most part, Python uses reference counting (http://en.wikipedia.org/wiki/Reference_counting) GC. This is the type of GC used by Perl and the same type of GC that you get when you use C++ 'smart pointers'. The advantage of this type of GC is that it is 'deterministic': you can know exactly when data will be removed by the GC. This is a requirement for 'sane' destructors. The problem with reference counting GC is that it can't handle circular references, hence your issues. In order to make handling this case easier, Python has an optional GC for circular references (on by default). The problem is that the cleanup of circular references cannot be made 'deterministic', for the same reason that Python cannot automatically determine which order to call the destructors of multiple superclasses: there is no way to know which to delete first (unless you specify it using Python's 'weakref').
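A sketch of that `weakref` approach: if the back edge of a parent/child pair is a weak reference, no strong cycle ever forms and plain reference counting stays deterministic. The class names here are illustrative, and the observable timing assumes CPython:

```python
import weakref

class Parent:
    def __init__(self):
        self.children = []

    def add_child(self, child):
        self.children.append(child)
        # The back-pointer is a weak reference, so parent <-> child
        # never forms a strong cycle; refcounting alone can free both.
        child.parent = weakref.ref(self)

class Child:
    def __init__(self):
        self.parent = None

p = Parent()
c = Child()
p.add_child(c)

assert c.parent() is p     # call the weak reference to dereference it
probe = weakref.ref(p)
del p, c                   # no strong cycle: both freed immediately (CPython)
print(probe() is None)     # True
```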

When using Python you kind of have a choice between using destructors and getting more advanced GC. I personally would opt for the more advanced GC and make use of Python's 'with' for cleanup.
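A minimal sketch of that `with`-based cleanup, with a hypothetical `Resource` class standing in for a file or socket. `__exit__` runs at a deterministic point, the end of the block, regardless of what the garbage collector is doing:

```python
class Resource:
    def __init__(self, name):
        self.name = name
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Runs at the end of the 'with' block, even if an exception
        # was raised -- deterministic, and independent of the GC.
        self.closed = True
        return False

with Resource("data") as r:
    print(r.closed)   # False: still open inside the block

print(r.closed)       # True: __exit__ already ran
```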

As far as I know Perl doesn't offer GC for circular references.

amanjsingh
September 4th, 2010, 11:26 PM
I am not a good programmer. I just learned programming to help my university project. I started with Java, but it didn't go too well. Then I learned Perl, and rewrote everything in Perl. It was wonderful. As a result, I love Perl, and if someone ask me what's the best programming language for a beginner get a task done, I would say Perl.

However, when I search on the internet, everyone says that Python is better. I am yet to see an "unbiased" web page comparing Perl and Python. Why? Is Python really better than Perl?

(I am too busy to finish my final year in university, so I don't have time to look at Python in depth.)

I have always felt that Perl is more 'loose' and 'lenient' than Python.