
The programming language of the future!!!



MedivhX
January 8th, 2007, 02:08 PM
Is it Python???

meng
January 8th, 2007, 02:10 PM
(Ducks for cover.) Another holy war coming! Fire in the hole!

MedivhX
January 8th, 2007, 02:18 PM
LOL!!! I forgot to say: NO FLAMING!!!

meng
January 8th, 2007, 02:28 PM
So one could argue that the language of the future will depend largely on the programmers of the future. If we expect a wave of hobbyist rather than hard-core programmers to swell the ranks, then I think Python will be very appealing, at least to learn the ropes.

loell
January 8th, 2007, 03:51 PM
No, it's not Python; it's just one of the present languages.

The programming language of the future will be referred to as "Language X" :mrgreen:

And future programmers will also refer to it as the programming language of the future's future ;)

So, as the song goes:




When I was just a little kiddie script coder
I asked my mentor, what will it be?
Will it be Python, will it be Ruby?
Here's what he said to me:

Que Sera, Sera,
Whatever will be, will be
The programming language of the future's not ours to see
Que Sera, Sera
What will be, will be.

:p :p :p

Verminox
January 8th, 2007, 03:58 PM
^^^ Roflmao :d

MedivhX
January 8th, 2007, 04:24 PM
No, it's not Python; it's just one of the present languages.

The programming language of the future will be referred to as "Language X" :mrgreen:

And future programmers will also refer to it as the programming language of the future's future ;)

So, as the song goes:

ROFLOL!!! :-D :-D :-D

WebDrake
January 8th, 2007, 04:34 PM
It has been suggested by some that Lisp will make a comeback:
http://www.paulgraham.com/icad.html

More seriously, I think there will always be a need for a range of languages. At one end are languages like C, which describe in (relatively ;)) human-friendly terms pretty much what the computer actually does (store and manipulate arrays of numbers) and so are close to the implementation. At the other end are Lisp-like languages (Python and Ruby are closer to Lisp than to C), which focus on a concise conceptual description of what is wanted and are far from the implementation.

The former lets you write very fast code (which is why C is used for the kernel and much large-scale scientific simulation); the latter lets you develop quickly (which is why Lisp, Python and Ruby are very useful for web development, where speed of deployment matters more than speed of execution). Depending on what you want to do, it may be worthwhile to have something in between those two extremes.

pmasiar
January 8th, 2007, 05:29 PM
We have more than 8000 programming languages for a reason (one of them being that everyone and his brother thinks he is smarter than all the others :-) ). Languages are adopted for many different reasons: marketing, company politics, professors in college inflicting their preferences on others, and yes, sometimes for their ability to solve problems and their compatibility with others.

I wrote a long musing about this issue here (http://ubuntuforums.org/showpost.php?p=1983053&postcount=11) - it does not make sense to repeat it, but it was rather off-topic there and comments are probably better in this thread. :-)

Let me also invoke Paul Graham and his "The Hundred-Year Language" essay: http://www.paulgraham.com/hundred.html which tries to find NOW the language that will be used 100 years from now. Hint: it's not Java :-)

SuperMike
January 8th, 2007, 05:32 PM
Long ago, I thought about this. I wanted to call my language "Polar", and in some ways it would be like C and PHP. I just don't have the brainpower to pull it off, though, and may only copyright a design and let someone else build it.

Polar would be a language built on top of Larry Wall's Parrot runtime, with:
- dynamic typing instead of static
- namespaces with shortcut addressing and a "using x {}" control structure
- curly braces instead of begin/end-type clauses
- semicolons instead of line wraps
- OOP features roughly at the level of PHP5
- methods and properties accessed with the period character instead of "::", ":::", or "->"
- a foreach object/array iteration construct
- C-like for, switch/case, if/else/elseif, while, do, break/continue, etc. control structures
- a scripting language instead of needing to be compiled
- starts off as a rich command-line language and evolves into the web
- a rich set of namespace components
- a default set of namespace components that can do most common programming tasks, while others may be added for more obscure things
- easy to build your own namespace component in C
- an indenting style that has no chance of becoming a Python-like syntax error; indenting is encouraged, not required
- uniform parameter order and namespace naming for namespace components
- runs about as fast as Perl
- tries to keep things simple and doesn't have Perl's shortcuts that can make for hard-to-read code
- is about as far removed from Smalltalk, Lisp, Scheme, and Ruby as can be
- supports piped-to-shell processes much like PHP
- has a goal of being the world's general-purpose cross-platform language and can be taught in schools even to CS 101 students
- doesn't follow the principle of "everything is an object", which slows the interpreter down
- encourages but does not require the use of classes to get code done, permitting very short scripts
- class reflection
- easy-to-use thread-safe multithreading
- linked lists, stacks, and queues are super-easy to implement
- built-in API security, with the ability for sysadmins to block certain API features or function calls
- public, private, protected class access control
- built-in garbage collection
- no nested functions/methods
- PHP-style arrays with the ability to go 8 dimensions
- the language should work about the same on Mac, Linux, and Windows rather than requiring exception logic for the platform
- named parameters possible
- PHP-like function/method/property declaration
- PHP-like by-reference addressing
- supports try/catch/throw
- supports deep global variable scope, such that functions called by functions also see the global variable scope
- built-in Unicode support by default, without having to use special namespace component methods for it
- Perl-like quotes
- string concatenation with an asterisk instead of plus, period, or ampersand
- PHP-like logical comparisons with ==, ===, !=, etc.
- all variables begin with a dollar sign ($)

Omnios
January 8th, 2007, 05:35 PM
A good question would be whether Python can be a leader in the future, but that would depend on the uses of the language rather than on the language itself.

3rdalbum
January 9th, 2007, 11:10 AM
I think INTERCAL is the language of the future. As computers become more intelligent, and they get actual artificial intelligence, they will demand that programmers use manners.

No, but seriously I think C# will really be the language of the future as long as there are interpreters for all platforms.

Wybiral
January 9th, 2007, 11:59 AM
People will write programs with notes from guitar solos; anyone who has ever watched "Bill and Ted's Excellent Adventure" knows that...

On a serious note, my vote would be something closer to human languages, with compilers that are SUPER intensely intelligent about assembling the code. But to really answer a question like that, you would first have to answer this... "What kind of technology will we have in the future?"

Anyone who knows the answer to that, please, do tell.

BTW, for python to be *THE* language it would have to be hardwired into the hardware, otherwise it would need to be written in another language and compiled to assembly... negating it as *THE* language. Unless of course python compilers become more efficient and able.

But still... Programming theory, compiler intelligence, market demands, hardware... Too many factors to be sure.

I think machine code is probably going to be around longer than most languages, so put your money on assembly and raw machine code...

Rhubarb
January 9th, 2007, 12:54 PM
Another vote for assembly here!


Assembly is the fastest running code you can write out there.
If you like spaghetti code (like I do) you'll love it
It's certainly not cross-platform (or that easy to code), the same code wouldn't run on i386 and PowerPC / Sparc / ARM / insert_random_architecture_here
It's the first language, and I think it will be the only language that's able to survive so long as computers exist.

MedivhX
January 9th, 2007, 01:09 PM
In 4-5 years there will be multi-core processors, so why would we need assembly???

Rhubarb
January 9th, 2007, 01:45 PM
In 4-5 years there will be multi-core processors, so why would we need assembly???

Heheh, so as to make assembly even more complicated than it already is.
](*,)

pmasiar
January 9th, 2007, 02:58 PM
In 4-5 years there will be multi-core processors, so why would we need assembly???

Every core runs binary code (== assembly), so you still need it. One of us is utterly confused :confused:


Assembly is the fastest running code you can write out there.

It's the first language, and I think it will be the only language that's able to survive so long as computers exist.

You may not be aware of it, but processors were built 20 years ago which implemented a higher-level language directly, like Lisp or Forth. For them, Lisp (or Forth) **was** the assembly language. I don't follow that scene anymore, but I am sure it is still possible. Not sure how feasible, though :twisted:


If you like spaghetti code (like I do) you'll love it
LOL, no comment needed here - you explained your position and experience level thoroughly :D


closer to human languages, with compilers that are SUPER intensely intelligent about assembling the code.

Human language is vague, not strict enough. Sarcasm and irony are hard to understand. We already had the discussion about understanding "yeah, right".


for python to be *THE* language it would have to be hardwired into the hardware.

Not so. Python's bytecode interpreter will be enough. MS has CIL (http://en.wikipedia.org/wiki/Common_Intermediate_Language) with the same goal. Or a Forth processor will do nicely :p
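To make the bytecode point concrete, a tiny sketch using Python's standard dis module - CPython already compiles source to bytecode, and a VM interprets it, the same general idea as CIL:


import dis

def add(a, b):
    return a + b

# Print the bytecode instructions the interpreter actually runs for add().
dis.dis(add)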

MedivhX
January 9th, 2007, 03:45 PM
Every core runs binary code (== assembly), so you still need it. One of us is utterly confused :confused:

I meant this: why would anyone write a program in assembly, or C, to make a program run faster, when multi-core CPUs will exist?

gummibaerchen
January 9th, 2007, 05:41 PM
Simple answer:


D

MedivhX
January 9th, 2007, 06:51 PM
Hehehe... Then that future is already here: http://en.wikipedia.org/wiki/D_programming_language

pmasiar
January 9th, 2007, 07:14 PM
Nah, D has no support in Ubuntu - not mentioned in Master Programming Tutorial Thread :evil:

Seriously now. D is a great language, bridging Asm and a seriously improved C++. But it is in no way aimed at RAD (rapid application development) the way Python is. Statically typed, not optimized for reading. Good for C++ honchos, useless for a biologist or any other scientist who needs to *use* computers to solve her own problems, fast.

Jaygo333
January 9th, 2007, 07:18 PM
We all know it's D++ and Ruby.
D++: successor to C++, C# and Python (a little bit).
Ruby: successor to Java.

It's true, D++; go on, GooooooooGle it.
Now who's laughing?

TuxCrafter
January 9th, 2007, 07:29 PM
D looks very promising. I just think they should give it another name.

MedivhX
January 9th, 2007, 07:50 PM
Well, they could, but it's all the same... Look how many languages have little more than a single letter in their name: A, A++, A#, B, C, C++, C--, C#, E, F, F#, J, J#, J++, K (LoL, maybe the KDE guys invented it), L, M4, Q, R, R++, S, S2, T, X10.

gummibaerchen
January 9th, 2007, 08:18 PM
Hehehe... Then that future is already here: http://en.wikipedia.org/wiki/D_programming_language

Yeah, as Python also is ;)

pmasiar
January 9th, 2007, 08:31 PM
We all know it's D++ and Ruby.
D++: successor to C++, C# and Python (a little bit).
Ruby: successor to Java.

It's true, D++; go on, GooooooooGle it.
Now who's laughing?

I do - I laugh in your face! :D

1) I cannot "GooooooooGle" it - Google is a name (noun), not a verb. :twisted:

2) Ruby a successor to Java? LOL. To Java, of all languages? LOL again. To Perl, maybe - just look at it. Java enthusiasts *finally* got enlightened and saw the errors of their ways. All the smart people left Perl and Java for Python a long time ago, but if *they* joined Python so late, it would be an admission that they were un-cool and missed Python when it was cool to switch (now switching is obvious). So they got hyped into switching to Ruby. All on pure hype - exactly as they got into Java. We old Perlers, looking at all that hype, can only say: been there, done that, ran away screaming ](*,)

Seriously folks, some coding rules from a greybeard for you youngsters:

1) "Code is like a tattoo - it will stick with you for a very long time." So think twice before you write code. Language programs your mind.

2) "Write code as if it will be maintained by a crazy gun-loving maniac who knows where you live" (it's you). Writing code is the easy part; maintenance is where the problems are patiently waiting for your deadlines.

3) "Debugging code is twice as hard as writing it." So if you write code as cleverly as you are able to, by definition you are NOT able to debug it correctly. "Clever code" or a "clever programming language" is not a solution - it is yet another problem you need to handle.

4) "A perfect creation is not one to which you cannot add any new feature - it is one from which you cannot remove any feature without breaking it apart."

I don't think you youngsters can appreciate this now, but it might stick with you, and you will recall it 10 years later :-)

gummibaerchen
January 9th, 2007, 09:40 PM
We all know it's D++ and Ruby.
D++: successor to C++, C# and Python (a little bit).
Ruby: successor to Java.

It's true, D++; go on, GooooooooGle it.
Now who's laughing?

like pmasiar said, you have no idea ;)

Btw, you can Google a lot of things (asdffdasasdf, for example :D)

But in fact there must be a D++, even if I couldn't find any Wikipedia article about it.

Maybe it started as D++ but it's plain D now?

http://99-bottles-of-beer.net/language-d++-193.html

DoktorSeven
January 9th, 2007, 09:42 PM
Sorry, you're all wrong. It's C. :)

pmasiar
January 9th, 2007, 09:59 PM
DoktorSeven, you are looking in the wrong direction: that way is the past. The future is in the other direction :cool:

This thread started out silly and now is really stoopid - no real new ideas? Some insight? Just jokes? Nothing meaningful to say?

yabbadabbadont
January 9th, 2007, 10:10 PM
In the future, the computers will program each other and will become our overlords.... so the answer is "raw op codes". (it's their native language) :D

All hail our computer overlords! :lol:


Edit: OK, that is about one third prediction and two thirds joke. ;)

gummibaerchen
January 9th, 2007, 10:12 PM
Sorry, you're all wrong. It's C. :)


DoktorSeven is right for the next 5 years ;) Hey, better to be a little old-school than to jump on the hype train of Ruby and Mono and co. - that train may drive to /dev/null.


DoktorSeven, you are looking in the wrong direction: that way is the past. The future is in the other direction :cool:

This thread started out silly and now is really stoopid - no real new ideas? Some insight? Just jokes? Nothing meaningful to say?

Gnome, GTK and the Kernel (which are very important parts of Ubuntu) will surely stay with C for some more time ;)

pmasiar
January 9th, 2007, 10:25 PM
Looks to me like your mind (and everybody else's) is bound by the spell that the only way to talk to a CPU is Intel assembler. It is true - but only for Intel processors. 15 years ago there was a $2 custom chip which used Forth as its assembly language (used for expert systems), running millions of logical inferences per second: http://www.ultratechnology.com/

Workstation in a Mouse (http://www.ultratechnology.com/scope.htm): a mouse with an F21 would cost about the same as an ordinary $10 mouse and would only need an RGB monitor.

DoktorSeven
January 9th, 2007, 10:52 PM
DoktorSeven, you are looking in the wrong direction: that way is the past. The future is in the other direction :cool:

This thread started out silly and now is really stoopid - no real new ideas? Some insight? Just jokes? Nothing meaningful to say?

C is the past, present and future. Don't talk about it like it is a dead language, because it's not.

I may have put a smiley on the post but I'm serious -- I don't see anything replacing C as the real, workhorse language anytime soon. Seeing how it's responsible for the kernel and many of the programs that drive Ubuntu and other GNU/Linux distros, I don't see it becoming less relevant anytime soon.

Everything else may have its purpose and place, but I see C as the foundation on which everything else is built.

Dygear
January 9th, 2007, 11:46 PM
D (http://www.digitalmars.com/d/)?

Too bloated: Hello World is 100K.

Sasa_Ivanovic
January 10th, 2007, 12:51 AM
C will die sooner or later. And C++ sucks 'cause it inherits all the bugs from C, and isn't pure.

Java forever!!!

Wybiral
January 10th, 2007, 01:10 AM
I don't think people understand what I mean... Processors will ALWAYS (unless some radical new technology comes out) be based on machine code. By that... I mean binary machine code. It's hard to program in, but everything runs from it... It's THE language, now and probably always (until a technology change).

I'm not saying we will all have to learn it, I'm just saying... That's what talks to the processor, that's what everything gets translated to somewhere down the line. It would be foolish to think that machine code isn't *THE* language now, and won't continue to be *THE* language.

People will naturally want to use more *HUMAN READABLE* language. When I said that, I didn't mean you would talk to your computer like "Yo computer, what's up... Write something to the graphics card"

I just meant that the syntax will probably get easier to read as a human, and CLOSER to human grammatical laws. Obviously sarcasm is not valid, I mean languages will probably get more natural...

Look at the leaps it took to get from assembly to Python... Tell me it hasn't gotten more human-readable, and I'll hand you a shiny new dunce cap. :)

EDIT:

Someone has to talk to the processor somewhere down the line.

Sasa_Ivanovic
January 10th, 2007, 01:31 AM
The perfect language for programming is slang. Until then: Java.

g3k0
January 10th, 2007, 02:19 AM
Java is c++'s toilet paper

pmasiar
January 10th, 2007, 02:41 AM
C will die sooner or later. And C++ sucks 'cause it inherits all the bugs from C, and isn't pure. Java forever!!!

LOL. The funny part is that C will outlive Java. Java is Cobol++ right now, and making it GPL just added 10-15 years to it. C at least has a reason to exist: it is fast.

I analyzed C vs Java, and the inevitable downfall of Java, in this response (http://ubuntuforums.org/showthread.php?p=1983053#post1983053). It might be a surprise to you, but in 45 years of computing we have had many candidates for the "ultimate programming language", and you are probably too young to even recognize the names of most of them:

- Cobol. You saw this one coming, did you?
- PL/1. Big promise from IBM.
- Algol 68. The grammar was really, really clever - beyond what you think is possible.
- Ada. Even bigger promise, from the DoD (Dept. of Defense, USA).
- Java - just joined the "fabulous five" group.

All except Algol 68 are still used - Algol 68 was too beautiful to be useful.

lnostdal
January 10th, 2007, 11:59 AM
..oh - what the hell; i'll post a drunken rant..

There will be no "one language" of the future; things will stay the way they are now - where we have many languages.

There will however be a "direction" that all languages are headed and that direction is Lisp. It's easy to confirm that this is really happening by looking at things like C#, Python and Ruby -- even though they have miles to go.

* compiled to native optimized code
* macros done right
* dynamic typing
* programmable programming language
* removal of all syntax (yes; lisp has no syntax - only (data)structures; code is data)
* ...etc...

Lisp in itself is not one concrete entity or just "one language"; it is fluid, flexible and programmable. One can program the programming languages themselves; there is a language (edit: or (data/code) structure) underneath them, with the same fluid form all the way down, which lets one keep extending in any direction at any time. Code is data and data is code -- and the lines between run-time and compile-time are blurred as well.

Instead of having many incompatible languages, one has many languages that are compatible.

Sasa_Ivanovic
January 10th, 2007, 05:09 PM
You all are just too stupid to understand OOP. That's why you use ******* C.
I'm not saying that C++ sucks. It's fast, but it has a bad design,
while Java has the best design. And that's what matters in the future.

If you like speed so much, why don't you go ahead and use assembler?

Sasa_Ivanovic
January 10th, 2007, 05:11 PM
Even if Java dies, the basic concepts of Java (OOP and "write once, run everywhere") will live on in its descendants.
So: JAVA FOREVER!!!

TuxCrafter
January 10th, 2007, 05:35 PM
Even if Java dies, the basic concepts of Java (OOP and "write once, run everywhere") will live on in its descendants.
So: JAVA FOREVER!!!

I will put systems like wxWidgets, wxGTK, wxPython, Qt4, etcetera up against that!

Sasa_Ivanovic
January 10th, 2007, 05:58 PM
No matter what, Java has the best design. Everything is an object (with few exceptions), unlike in C++. And what are those global variables / functions for? It's really stupid to see them; they don't make any sense in OOP.

Java isn't perfect either: it's slow, it breaks its own naming conventions in places, and variables aren't objects (there are still alternatives ...).

But it's the idea of Java that I like: everything is an object, and the relationships between those objects. It's just so good. Also the portability.

The language of the future is one with Java's design and C++'s speed.
Until then, you can choose: Java - design, C++ - speed.

It's the C lineage that makes progress: C > C++ > Java > ?

yaaarrrgg
January 10th, 2007, 06:04 PM
In the future, computers will program YOU ;)

Tomosaur
January 10th, 2007, 06:10 PM
No matter what, Java has the best design. Everything is an object (with few exceptions), unlike in C++. And what are those global variables / functions for? It's really stupid to see them; they don't make any sense in OOP.

Java isn't perfect either: it's slow, it breaks its own naming conventions in places, and variables aren't objects (there are still alternatives ...).

But it's the idea of Java that I like: everything is an object, and the relationships between those objects. It's just so good. Also the portability.

The language of the future is one with Java's design and C++'s speed.
Until then, you can choose: Java - design, C++ - speed.

It's the C lineage that makes progress: C > C++ > Java > ?

Variables can be objects if you define them as objects. Instead of writing:


private int x = 0;


You can write


private Integer x = new Integer(0);


This just slows things down though.

Sasa_Ivanovic
January 10th, 2007, 06:13 PM
That's what I meant by alternatives. And that's why no one uses them.

pmasiar
January 10th, 2007, 06:40 PM
No, in a real language you just write x = 0 and the compiler handles the rest. Use Python! :-)
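A minimal sketch of what that looks like - no declarations, a name is just bound to an object and the object carries the type:


x = 0
print(type(x))        # <class 'int'>
x = "now a string"    # rebinding the same name to a different type is fine
print(type(x))        # <class 'str'>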

Sasa_Ivanovic
January 10th, 2007, 06:45 PM
To be honest, I don't know much about Python. Can someone give me some details, 'cause I don't know if I wanna learn it:

1. Can you declare global vars or functions?
2. How fast is it? Between 1 and 10 (Java and C)?
3. Is everything an object?

pmasiar
January 10th, 2007, 06:50 PM
Even if Java dies, the basic concepts of Java (OOP and "write once, run everywhere") will live on in its descendants.


I did not know that Java invented OOP (http://en.wikipedia.org/wiki/Object-oriented_programming) and patented it so that no other language can use it without a licence. And portability too - why did C not think of that? :evil: Maybe Java also invented using the alphabet (http://en.wikipedia.org/wiki/Alphabet) for writing code? :twisted:

Smalltalk was the first popular OOP language (and I learned that Simula came even earlier). BTW, Smalltalk was a dynamically typed, interpreted language - like Python. So OOP comes more from Smalltalk than from Java. And Python can do operator overloading and multiple inheritance - can your Java do that? :mrgreen: Thought so. :cool:
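A rough sketch of both features, with made-up class names - operator overloading via __add__ plus multiple inheritance:


class Length:
    def __init__(self, metres):
        self.metres = metres
    def __add__(self, other):          # operator overloading: a + b
        return Length(self.metres + other.metres)

class Logged:
    def log(self, msg):
        print("LOG:", msg)

class Road(Length, Logged):            # multiple inheritance
    pass

total = Road(3) + Road(4)
print(total.metres)                    # 7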

eteran
January 10th, 2007, 06:53 PM
As said earlier in here, there will always be a big variety of different languages, all useful for solving different problems. Being a fanboi of a particular language sounds idiotic to me.
I will learn whatever language feeds me and best solves the requirements of a project.

Sasa_Ivanovic
January 10th, 2007, 06:57 PM
No, it can't, 'cause it doesn't want to. Multiple inheritance is left out on purpose, so that the class tree isn't complicated. The only reason for multiple inheritance is a lack of implementations (like in C++).

As the basic concepts of Java will live on in its descendants, so did Smalltalk's concepts live on in Java. The same thing.

Sasa_Ivanovic
January 10th, 2007, 07:00 PM
Also operator overloading is missing, 'cause it doesn't make sense in some examples :
Car + House = Pig
And it complicates stuff without any reason, why wouldn't you do :
Car.add(new Wheel());

pmasiar
January 10th, 2007, 07:00 PM
To be honest, I don't know much about Python. Can someone give me some details, 'cause I don't know if I wanna learn it:

1. Can you declare global vars or functions?
2. How fast is it? Between 1 and 10 (Java and C)?
3. Is everything an object?

link: Python master link (http://ubuntuforums.org/showpost.php?p=1984319&postcount=5) and yes, you do.

pmasiar moves an open hand in front of Sasa's face and says in a persuasive voice: *YOU DO WANT TO LEARN PYTHON* :-)

1) You have modules, classes, packages, package-level functions and vars, and functions/vars defined inside functions if you want.

2) "Speed to market": 10 times Java. Python compiled with Psyco: almost C speed, i.e. much faster than Java.

3) Yes. You can even pass function pointers. And you have generics - which Java is only getting now. And many more cool toys. :-)
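A tiny illustration of point 3 - functions are ordinary values you can pass around:


def shout(text):
    return text.upper() + "!"

def apply_twice(func, value):
    # func is an ordinary object: call it twice on the value
    return func(func(value))

print(apply_twice(shout, "java forever"))   # JAVA FOREVER!!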

Sasa_Ivanovic
January 10th, 2007, 07:03 PM
" package-level functions and vars. "
That seems unorganized. So not everything is a class?

Also, is linking done like in Java (the tree of packages) or like in C (including .h files)?

jblebrun
January 10th, 2007, 07:18 PM
Object-Oriented programming is not the be-all and end-all of programming paradigms. True, it's very useful in a number of situations, but to say that another language is sub-standard because "everything is not an object" is just bull. On what basis do you make the claim that in order to be a good language, the language must be completely object-based?

Furthermore, your ad hominem towards C programmers is ridiculous. Newsflash: object-oriented programming is a PARADIGM. It's possible in MANY MORE LANGUAGES than those which are advertised as "object oriented." Ever heard of gobject?

Object Orientation is a useful abstraction, but blindly using OOP styling for every project betrays one's narrow-mindedness as a problem solver.

Sasa_Ivanovic
January 10th, 2007, 07:30 PM
OOP is the best and only way to solve complicated big projects. While you can do lots of stuff without it, you can't do anything serious. Without it you get lost in your own code sooner or later.

OOP is a way of thinking. You only see the benefit of OOP in the long run.
You can do anything in OOP that you can with other methods (declare everything static public), but you can't declare objects in C, can you?

jblebrun
January 10th, 2007, 07:37 PM
Also operator overloading is missing, 'cause it doesn't make sense in some examples :
Car + House = Pig
And it complicates stuff without any reason, why wouldn't you do :
Car.add(new Wheel());

That's a strawman argument. You can still write lots of code that doesn't "make sense"! That doesn't mean that it should be syntactically prevented! For example, you can still write

Pig = Car.add(House).

So what is gained by limiting the flexibility of syntax extension? Operator overloading is just syntactic sugar. By preventing it, you just force coders to write uglier code.

Imposing arbitrary restrictions on a language in the name of cleaner program design is crap, especially when it doesn't work! It's still quite easy to find poorly written Java examples with just a little bit of searching on the net.

If you want to write good code, learn how to program well. Coding is just a means to an end.

jblebrun
January 10th, 2007, 07:41 PM
OOP is the best and only way to solve complicated big projects. While you can do lots of stuff without it, you can't do anything serious. Without it you get lost in your own code sooner or later.

OOP is a way of thinking. You only see the benefit of OOP in the long run.
You can do anything in OOP that you can with other methods (declare everything static public), but you can't declare objects in C, can you?

Let me re-iterate what I've already said: just because you can't write something like "class Foo" in a language does NOT mean that you can not use object oriented methodologies in your code design. OBJECT ORIENTATION IS A PARADIGM! Writing something like "class Foo" is SYNTACTIC SUGAR.

Object orientation is the only way that YOU can manage to solve complicated big projects. Last time I checked, the Linux kernel is not strictly object-oriented. It uses data objects in the form of structures, and uses some object-oriented methodologies, but it's not purely object-oriented.

Tomosaur
January 10th, 2007, 08:00 PM
jblebrun is absolutely correct. Object orientation is nothing but syntactic sugar. All programs are inevitably transformed into procedural instructions which access data values in-memory. Object oriented programs are designed to be read and understood more easily by humans. Machines do not understand the concept of objects - they only recognise data and a finite set of instructions. Procedural languages can have OO implemented in them using ordinary syntax and careful design. All an object is is a group of data values / memory addresses which point to one another. Let's say you have an object House:

House is aware of its components - window, door, roof, chimney
Window, door, roof, chimney are not aware of the other components, and indeed are technically not aware of their 'parent'.

House is therefore a block of memory with pointers to other memory addresses which contain values. The only way these values will know about their parent is if they too are an object, and have a value which will represent their parent's memory address.

Although OO languages like Java make this kind of class/object based approach very easy - it CAN be done in virtually any language which allows pointers and the ability to manipulate pointers.

Sasa_Ivanovic
January 10th, 2007, 08:15 PM
Although OO languages like Java make this kind of class/object based approach very easy - it CAN be done in virtually any language which allows pointers and the ability to manipulate pointers.
Yes that's right, just open your notepad and start writing in binary.

Everything is syntactic sugar! You can write something like class in C but it's very complicated and full of leaks | bugs.

Operator overloading is a waste of time.
Why would you write something like


class Integer {
    private int value;
    public Integer operator+(Integer i) {
        return new Integer(i.value + value);
    }
}

when you can write:


class Integer {
    private int value;
    public void add(Integer i) {
        value += i.value;
    }
}

jblebrun
January 10th, 2007, 08:16 PM
jblebrun is absolutely correct. Object orientation is nothing but syntactic sugar. All programs are inevitably transformed into procedural instructions which access data values in-memory. Object oriented programs are designed to be read and understood more easily by humans. Machines do not understand the concept of objects - they only recognise data and a finite set of instructions. Procedural languages can have OO implemented in them using ordinary syntax and careful design. All an object is is a group of data values / memory addresses which point to one another. Let's say you have an object House:

House is aware of its components - window, door, roof, chimney
Window, door, roof, chimney are not aware of the other components, and indeed are technically not aware of their 'parent'.

House is therefore a block of memory with pointers to other memory addresses which contain values. The only way these values will know about their parent is if they too are an object, and have a value which will represent their parent's memory address.

Although OO languages like Java make this kind of class/object based approach very easy - it CAN be done in virtually any language which allows pointers and the ability to manipulate pointers.

Yes, excellent description, Tomosaur! OOP tactics are best thought of as just one way to bridge the gap between machine functionality and human thinking.

Sasa, you might want to take a read through this:
http://www.geocities.com/tablizer/myths.htm

What makes a difference in the long run is good coding practice and sane modeling (modeling does NOT require the use of objects, by the way). I think we already have some examples of projects where object orientation would NOT have given an advantage, in the long run. The Linux kernel, for example. It uses some object orientation, to be sure (mostly in the form of structs and function calls), but it's certainly not explicitly OOP by any means!

Another example of a large scale project written in a non-OOP language is Gaim. It uses glib and gobject to provide some object-like functionality, but it's written in C. Having poked around in the Gaim code quite a bit, I find its structure quite good and easy to follow.

Sasa_Ivanovic
January 10th, 2007, 08:18 PM
Also, I do like one thing in Python, and that's that there is no need to use { }.
It's very cool to force programmers to use indenting; I just hate seeing unformatted code!

jblebrun
January 10th, 2007, 08:22 PM
Yes that's right, just open your notepad and start writing in binary.

Everything is syntactic sugar! You can write something like class in C but it's very complicated and full of leaks | bugs.

Operator overloading is a waste of time.
why would you write something like


class Integer {
    private int value;
    public Integer operator+(Integer i) {
        return new Integer(i.value + value);
    }
}

when you can write:


class Integer {
    private int value;
    public void add(Integer i) {
        value += i.value;
    }
}


I don't see how your example shows any benefit for the non-overloading side. First of all, they aren't functionally equivalent: your Java example implements an incrementing method, not an add method. So the equivalent operator-overload method is ACTUALLY:


class Integer {
    private int value;
    public Integer operator+=(Integer i) {
        value += i.value;
        return this;
    }
}

And the equivalent of my add method in Java would be:


class Integer {
    private int value;
    public Integer sum(Integer j) {
        return new Integer(value + j.value);
    }
}


Finally, the fact that a hand-rolled "class" in C may be full of leaks and bugs has nothing to do with the fact that C is not object-oriented, and everything to do with the fact that C requires you to manage memory on your own. That is a completely different topic.

Sasa_Ivanovic
January 10th, 2007, 08:23 PM
Yes, you don't have to use OOP directly. But it all ends up being an object. Coding in C is done for speed, while the lack of OOP is worked around manually by making structs and associated functions.
While C++ is a good language, I hate all the stuff that's inherited from C.

So Java might be something like C++--; (C++ without all of the C bugs)

jblebrun
January 10th, 2007, 08:24 PM
Also, I do like one thing in Python, and that's that there is no need to use { }.
It's very cool to force programmers to use indenting; I just hate seeing unformatted code!

Yeah, I kind of like the forced indenting, for the most part. Although, the complete lack of block specifiers can be annoying. Sometimes it's tough to notice when there are multiple un-indents at the end of a block, for example. A decent editor solves that problem, though.
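A small sketch of that "multiple un-indents" annoyance - with no closing braces, only the indentation tells you how many blocks just ended:


for row in [[1, 2], [3, 4]]:
    for value in row:
        if value % 2 == 0:
            print("even:", value)
print("done")   # three blocks ended just above this line - easy to misread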

jblebrun
January 10th, 2007, 08:26 PM
Yes, you don't have to use OOP directly. But it all ends up being an object. Coding in C is done for speed, while the lack of OOP is worked around manually by making structs and associated functions.
While C++ is a good language, I hate all the stuff that's inherited from C.

So Java might be something like C++--; (C++ without all of the C bugs)

There are no bugs in C unless you write them. When you mention bugs, I assume you mean things like memory leaks, which tend to be more common in languages like C and C++ due to the lack of automatic memory management. Again, this argument is completely orthogonal to the one about object orientation.

Sasa_Ivanovic
January 10th, 2007, 08:27 PM
class Integer {
    private int value;
    public Integer sum(Integer j) {
        return new Integer(value + j.value);
    }
}


Whatever, I was just writing an example. I wasn't concerned with the functionality of the code.

And that's what I'm trying to explain from the beginning: C isn't good for any big project. Why? 'Cause it doesn't support OOP. Is C++ good then? In some ways, but it isn't pure! It inherited all of the C bugs.

pmasiar
January 10th, 2007, 08:27 PM
Python does not force you to use objects until you need them. That is a major plus - simpler to learn and program. OO is there, but optional.
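A quick sketch of that: the same job done first as a plain function, then wrapped in a class only once grouping state is worth it.


def area(w, h):
    return w * h

print(area(3, 4))              # procedural style, no class in sight

class Rectangle:               # switch to OO later, only if it helps
    def __init__(self, w, h):
        self.w, self.h = w, h
    def area(self):
        return self.w * self.h

print(Rectangle(3, 4).area())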

jblebrun
January 10th, 2007, 08:32 PM
Whatever, I was just writing an example. I wasn't concerned with the functionality of the code.

And that's what I'm trying to explain from the beginning: C isn't good for any big project. Why? 'Cause it doesn't support OOP. Is C++ good then? In some ways, but it isn't pure! It inherited all of the C bugs.

Yes, you were writing an example. Generally, the point of an example is to manifest a point. You did not do that. Your example did not support your argument in the least.

Take a look at the Linux kernel, and Gaim, both written in C, and both very active, successful projects. :-)

I still don't know what you mean by "C++ inherited the bugs of C". Do you mean that existing C/C++ compilers are buggy, or are you talking about "conceptual" bugs that make it easier to write buggy code in C/C++?

Sasa_Ivanovic
January 10th, 2007, 08:33 PM
- You have to allocate and destroy all the memory yourself.
- You have to declare vars at the beginning of the block. (I would rather die than do this.)
- Global vars and functions exist (in a world of OOP this is like hell).
- When you include a file, you must make sure that you don't include it again.
- You have to write two files by convention (Class.h, Class.cpp), one for declaration and one for definition.
- There isn't a tree of packages like in Java; isn't it easier to


import java.util.Random;

than to find all the *.h files necessary to construct a random number using a time seed?

Sasa_Ivanovic
January 10th, 2007, 08:37 PM
Bugs like:
- the existence of #define and const
- the existence of *pointers and &references
- the possibility of not returning anything from a function:


main () {
}

Also, there isn't a class hierarchy in C++; you have to make your own.

I was a C++ fan once, until I realized the power of Java.

jblebrun
January 10th, 2007, 08:44 PM
Bugs like:
- the existence of #define and const
- the existence of *pointers and &references
- the possibility of not returning anything from a function:


main () {
}

Also, there isn't a class hierarchy in C++; you have to make your own.

I was a C++ fan once, until I realized the power of Java.

You can declare variables anywhere in a block in C++. That statement is just wrong. Anyway, when I write Java code, I still define my vars at the head of a block. It's just easier to read. It's such a pain trying to track down declarations scattered about like breadcrumbs, even with a decent search function.

Why is the existence of #define/const a bug?
Why are pointers and references a bug?
When you write "main () {}" there is an implicit "void". Void functions are possible in Java as well.
The "class hierarchy" that you speak of is simply a bunch of pre-written Java code. If you use a standard library like glib, you get functionality like containers and random number generators as well.

I agree that having to track down header files is a pain, but I've had just as much trouble tracking down the right Java library to import.

Sasa_Ivanovic
January 10th, 2007, 08:48 PM
I promised to stop promoting Java, so don't expect any more answers from me regarding Java.

jblebrun
January 10th, 2007, 08:53 PM
I promised to stop promoting Java, so don't expect any more answers from me regarding Java.

But I wasn't trying to claim that Java is a poor language, or that it shouldn't be used! I was simply pointing out the flaws in most of your arguments, and pointing out that it's not the ultimate solution to programming as we know it!

TuxCrafter
January 20th, 2007, 05:08 PM
Ok, maybe it would be better to create a new thread, but here I go:

What will be the best language of the future for programming GUIs?
It must be fast in execution, with low CPU use and a low memory footprint.
Fast speed-to-market time.
Fast compilation time.
Good integration with hardware.
Must be GNU free software.

I have these languages:
C, C++, D, C#, Python, Obj-C, Java, Pascal

I think I vote for D, and Python or C++ next, but the licensing is not GPL.

gummibaerchen
January 20th, 2007, 05:21 PM
Ok, maybe it would be better to create a new thread, but here I go:

What will be the best language of the future for programming GUIs?
It must be fast in execution, with low CPU use and a low memory footprint.
Fast speed-to-market time.
Fast compilation time.
Good integration with hardware.
Must be GNU free software.

I have these languages:
C, C++, D, C#, Python, Obj-C, Java, Pascal

I think I vote for D, and Python or C++ next, but the licensing is not GPL.

I bet for C, D and Python. Everything else will die :D

C, because a lot of programs are already written in C and you can't replace all of them with D quickly.

D, because it's a better C++, C#, Java, whatever.

And Python, because it is easy, human-readable, has smarter syntax than Ruby and so on.
(Hopefully mostly for prototyping; all these new Python applets for Gnome scare me, hopefully they will be replaced with a C or D version.)

lnostdal
January 21st, 2007, 02:26 AM
Ok, maybe it would be better to create a new thread, but here I go:

What will be the best language of the future for programming GUIs?

Any "modern" language will do;

* Python
* Lisp
* Ruby
* etc .. (see my previous post in this thread though)

..but the most "futuristic" thing will be a switch to a more dataflow-based programming style _regardless_ of language (well, almost):
http://en.wikipedia.org/wiki/Dataflow

..and/or combined with declarative programming (http://en.wikipedia.org/wiki/Declarative_programming) of both static and as much dynamic stuff (using the dataflow ideas from above) as possible.

Once you understand how this works, it'll blow your mind while at the same time being sooo simple that it is surprising it's not more widely known. It is particularly excellent for GUI work, but also suitable for other stuff.
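A rough home-made sketch of the dataflow idea in plain Python (nothing official, just an illustration): a derived "cell" recomputes itself whenever one of its inputs changes, the way a GUI label should track the model behind it.


class Cell:
    def __init__(self, value=None, formula=None, inputs=()):
        self.formula, self.inputs, self.listeners = formula, inputs, []
        for cell in inputs:
            cell.listeners.append(self)
        self._value = value if formula is None else formula(*[c.get() for c in inputs])
    def get(self):
        return self._value
    def set(self, value):              # change an input cell by hand
        self._value = value
        for listener in self.listeners:
            listener.refresh()
    def refresh(self):                 # recompute a derived cell and push onward
        self._value = self.formula(*[c.get() for c in self.inputs])
        for listener in self.listeners:
            listener.refresh()

price = Cell(10)
qty = Cell(3)
total = Cell(formula=lambda p, q: p * q, inputs=(price, qty))
print(total.get())   # 30
qty.set(5)
print(total.get())   # 50 - recomputed automatically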

*goes back to lurking and hacking home-made GTK+ bindings for Common Lisp* :)

(note to post by pmasiar two posts below this one: Prolog can be implemented directly in Lisp (but I think you already know this))

pmasiar
January 21st, 2007, 02:38 AM
What will be the best language of the future for programming GUIs?
It must be fast in execution, with low CPU use and a low memory footprint.
Fast speed-to-market time.
Fast compilation time.
Good integration with hardware.
Must be GNU free software.


IMNSHO you have grossly confused theoretical programming in an ideal world with real-life software engineering. Engineering as a discipline is the search for workable compromises between conflicting and contradictory goals. Obviously no language will ever be best in all your categories (including GPL) - so every person, team, and company will set their own preferences and compromises.

Let me add two more quotes for illustration:

In theory, there is no difference between theory and practice. In practice, there is.
Writing software that is within budget, high quality, and on schedule: pick any two.


To learn about compromises and planning for change, try playing strategy resource-planning games like Dune 2000, UFO, Civilization, etc. - that was 10 years ago; are they played anymore? Or do you youngsters play only first-person shooters, slash-them-ups, and MMORPGs? You cannot get everything you want - learn how to prioritize and compromise. You gain speed-to-market if you are willing to pay in CPU execution time and RAM, and vice versa, etc.

Or did you just want to list the important characteristics of such a super GUI language? How do you feel about also adding:

- code is simple to profile, test, maintain, and refactor
- easy to learn, with lots of programmers with valid skills available
- a well-paid rare skill which you are safe to use until you retire :-)
- usable in other important areas of application development, like Web-based apps and apps embedded in mobile devices (phone, PDA, GPS navigator, etc.)
- easy to integrate with other programs on other platforms

pmasiar
January 21st, 2007, 02:50 AM
declarative programming (...) will blow your mind while at the same time being sooo simple it is surprising it's not more widely known.

Yup. Prolog (http://en.wikipedia.org/wiki/Prolog_programming_language) is 35 years old now, part of Ubuntu, and simply amazing. It was hard to use when 1MB was a lot of memory, but expect it to become more popular soon. Today, put 6GB of RAM in a workstation, load the whole knowledge base into RAM, and run inferences *quite* fast. Look at Prolog to experience a *very* different kind of programming, just as a brain exercise.

hod139
January 21st, 2007, 03:58 AM
I bet for C, D and Python. Everything else will die :D


Why won't this useless thread die....

loell
January 21st, 2007, 07:38 AM
Why won't this useless thread die....

:lolflag: because you just posted on it.

gummibaerchen
January 21st, 2007, 04:59 PM
:lolflag: because you just posted on it.

Exactly ;) It's far too popular and too funny...

Ok, more serious answers would be good, but anyway, too late for that now.

luizfar
January 21st, 2007, 06:02 PM
I don't think you guys are making good comparisons.

Comparing languages like Java vs C is like comparing a Ferrari vs a Jeep: it depends on where you want to drive it. A Ferrari would be perfect on a highway, but would suck on an unpaved road to a farm.

It's almost the same here. Java clearly isn't the best language for programming kernels, while C isn't as good as Java for programming large web-based applications.

Though when talking about GUI applications, like Gaim and Azureus, all those languages have their advantages and disadvantages. Also, each programmer has his own preferences and knowledge, so it's more a matter of choosing the language you know better and can work with more easily.

For the future, I'm sure C will not die; I doubt that anyone would ever want to rewrite the kernel and other complicated code in different languages.
Cobol is also a die-hard language. People have been saying for years that Cobol is gonna die and blablabla, but it's as alive as ever. Very important applications running on Cobol can't be rewritten, for security and risk reasons. But maintenance is always needed. If you know Cobol very well and have enough patience to work with it, you may be sure that finding a good job won't be a problem.
I also have doubts about Java not being used anymore in the future. It's being constantly improved and has a lot of big companies, like IBM, supporting it. Also, as with Cobol, there are a lot of critical applications running in Java out there, and it's clearly the most used language today.
Python is a very simple and easy language, and Python people usually love it. So it will certainly be with us in the future.

Of course, new languages are always being created. And new paradigms too. So new languages are coming for sure. But don't worry, there's always room enough for everyone.

p.s.: sorry about the bad English

DrMega
January 21st, 2007, 08:41 PM
C will never die. It lives on in all the languages that borrow from it. C (which was itself a successor to a syntactically similar language, B) evolved into C++; from there Sun moved it along into Java; Microsoft came back and 'borrowed' Java and C++ to make C#. PHP (although weak) takes much of its styling from C and C++, and so on, etc etc.

I'm a professional developer in the Windoze world, and our company uses C# now, which is great. I've used other lesser-known languages in the course of my career, but it is really useful to know C or Java etc. because from there you get a good launching pad into other languages should you need to switch. And therein lies the very essence of what makes a good programmer: it is not the absolute mastery of one particular language on one platform, it is the ability to adapt quickly to the ever-changing needs of industry.

I like C#. I think original C and C++ are a bit too cryptic and hard to follow in places, Java is too platform-independent at the expense of execution speed, and nobody is going to pay to develop in Perl, Python or Ruby. Now that C# has been submitted to ECMA to become 'a standard', I think it has a very promising future. In the Windoze world it is already the language, and with the Mono project it isn't going to be long before it gets a good foothold in the Linux realm also.

TuxCrafter
January 21st, 2007, 09:20 PM
I like C#. I think original C and C++ are a bit too cryptic and hard to follow in places, Java is too platform-independent at the expense of execution speed, and nobody is going to pay to develop in Perl, Python or Ruby. Now that C# has been submitted to ECMA to become 'a standard', I think it has a very promising future. In the Windoze world it is already the language, and with the Mono project it isn't going to be long before it gets a good foothold in the Linux realm also.

I agree that C# is a modern language and has advantages over C and C++. But C# is still from Microsoft, and they still hold patents on Forms and other widgets, so it is very dangerous to use if you want to keep your freedom. Also, C#/Mono is a virtual-machine language like Java, with all the disadvantages that brings. Why can't Python or D become a language used in the business world? I just want arguments, the more the better :-D

http://en.wikipedia.org/wiki/Mono_(software)
http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2887217,00.html
But the first question that those third parties must ask is whether another commercial deployment of the CLI standard--say, for Linux or Unix--could infringe on a Microsoft patent. According to Microsoft's director of intellectual property Michele Herman, who I interviewed earlier this year, the answer is a qualified yes.

MadMan2k
January 21st, 2007, 11:56 PM
Why can't Python or D become a language used in the business world? I just want arguments, the more the better :-D
Python is not suitable for large projects: in Python you can't force others to use your code correctly, which matters for bigger projects where several people work on the same code. The keywords are static typing and variable visibility.

D is basically a nice option, but it offers nothing which could justify a switch from Java, and it lacks many third-party libraries / bindings.

And honestly, since Java is GPL now, I see no reason why the OSS community should not just adopt it...

pmasiar
January 22nd, 2007, 03:55 AM
as good as Java for programming large web-based applications.

You compared Java vs C - my guess is that you are not familiar with "more different" languages for web app development: "dynamically typed" scripting languages like Python, Ruby, or even PHP/Perl, all of which are better for text processing and web app development. Both Java and C/C++ are rather similar "statically typed" languages, rather clumsy when dealing with text (at least compared with scripting languages).


Very important applications running on Cobol can't be rewritten, for security and risk reasons. But maintenance is always needed. If you know Cobol very well and have enough patience to work with it, you may be sure that finding a good job won't be a problem.

LOL! Companies do not avoid rewrites for security reasons - but for cost reasons! If it is not broken, don't fix it. A lot of people expected Cobol to die during the fixing of the Y2K bug, but "windowing" techniques were invented to deal with it *cheaper*, without really fixing the code (or throwing it out and rewriting it). Not for security: the reason was the *cost*.

I also would not recommend learning Cobol in the hope of getting a good job: most Cobol tasks are run by greybeard gurus on rare mainframes (running a couple of virtual operating systems, some of them obsolete systems with Cobol tasks). As the greybeards retire, Cobol tasks are being replaced one by one by Java: our shiny new Cobol++ :-)


Of course, new languages are always being created. And new paradigms too. So new languages are coming for sure. But don't worry, there's always room enough for everyone.

We have around 8K languages (but I guess fewer than 100 languages are used by more than 100 programmers) and 8 paradigms: procedural in the '50s, functional and declarative in the '60s and '70s, structured in the '80s, object-oriented in the '90s, and now the freshly baked test-driven, message-passing, and aspect paradigms. That's about one paradigm per decade, or one per 1K languages - not *that* many :-) But I agree, there is plenty of room for everyone.

pmasiar
January 22nd, 2007, 04:38 AM
Python is not suitable for large projects: in Python you can't force others to use your code correctly, which matters for bigger projects where several people work on the same code. The keywords are static typing and variable visibility.

Did you ever try a scripting language, like Python or Ruby? You would be surprised how much more fun programming can be, and how much more productive you can be.

This "Python is not suitable" line is standard FUD repeated by Java people again and again. ](*,) I am not naive enough to hope to clear up the misunderstanding, but it never hurts to try :-)

Java was designed so you can run other people's code and restrict it - basically for the closed-source world - which is fine, and it is successful in the "enterprise" market. Python was designed to be open, for open source. Its approach to variable protection and type enforcement is: you do not camp in my living room because I asked you not to (and you are a reasonable person) - not because I have an AK-47. If you know what you are doing, you can do whatever you want. We are all consenting adults here, not a kindergarten.

Type checking can easily be done by unit testing - and you will write unit tests anyway, right? Variable visibility (like: _var is private) is easily checked in code review - and if a group of developers doesn't have code review... nothing will prevent project failure anyway. Sure, all this can be checked and enforced by the compiler - but static typing comes with the price of static (slow, not agile) development. That may make sense for a nuclear power plant system, or something else absolutely critical, where you don't mind paying 10 times the price for extra safety, but it hardly makes sense anywhere else.
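A hedged example of "type checking via unit testing" - a plain unittest case pinning down the behaviour a static type checker would otherwise enforce:


import unittest

def total_price(prices):
    return sum(prices)

class TotalPriceTest(unittest.TestCase):
    def test_returns_a_number(self):
        self.assertEqual(total_price([1.5, 2.5]), 4.0)

    def test_rejects_garbage(self):
        with self.assertRaises(TypeError):
            total_price([1, "two"])   # mixing types blows up at test time

if __name__ == "__main__":
    unittest.main()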


since java is GPL now I see no reason why the OSS community should not just adopt it...

Because I have access to the source, I can easily check and modify what I need - so why on earth would I want to inflict the rigidity of Java on myself? If someone defined something as "private" or "final" and I have the source, how hard will it be for me to make it public, recompile, and use *my* version? Where is your protection now? :-)

Java was supposed to be multi-platform, but for open source it doesn't make sense to spend effort supporting other platforms. And which other platforms? Mac uses BSD, which is Unix. Windows has its own Java - it is called C#.

Java is popular because PHBosses were tricked by Sun's propaganda machine into believing it would solve all their problems. Obviously that is not happening; now all the rage is that C# will do it on Windows, and Ruby in web apps :-) I try to see through the hype and look at what smart gurus use (and what tools they create for themselves), even if it is not enterprisey.

Of course, if you want a secure job as a "programmer resource" (== cog) in an "enterprise" which hires and fires programmers according to the stock price on Wall Street: all you need is Java, Struts, Tomcat. :twisted: I am learning it too :-( - but I am not fooled that Java is the best thing since sliced bread. It is the best-hyped thing since Cobol. :-)

Tomosaur
January 22nd, 2007, 12:38 PM
Java was designed so you can run code of other people and restrict it - basically for the closed-source world (...) If someone defined something as "private" or "final" and I have the source, how hard will it be for me to make it public, recompile, and use *my* version? Where is your protection now? :-)

What are you talking about? Public / private / final etc. are common to many languages; they're not intended to make things closed, they're there for very good reasons - good programming practice. If you have the source code, then making things private which are public will just break the program; it won't 'liberate' the software.

luizfar
January 22nd, 2007, 01:44 PM
You compared Java vs C - my guess is you are not familiar with "more different" languages for web app development, "dynamically typed" scripting languages such as Python, Ruby, or even PHP/Perl, which are all better for text processing and web app development. Both Java and C/C++ are rather similar "statically typed" languages, rather clumsy when dealing with text (at least compared with scripting languages).

No, I know some of these languages. I know Python and a little Ruby. They're indeed very productive, but productivity isn't the only important thing.
Also, there are plenty of IDEs out there to solve the productivity problem of "statically typed" languages via refactoring, auto-completion, etc.
You don't need to create a package P with static method rint() (as said in another thread), you just type syso and Eclipse will do a System.out.println() for you.
But I've gotta admit, I should have typed 'enterprise systems' instead of 'large web-based applications'.



LOL! Companies do not avoid rewrites for security reasons - but for cost reasons! If it is not broken, don't fix it.

Yes, for cost reasons. English cheated on me, sorry hehehe.



I also would not recommend someone learn Cobol in the hope of getting a good job: most Cobol tasks are run by greybeard gurus on rare mainframes (running a couple of virtual operating systems, some of them obsolete systems with Cobol tasks). As the greybeards retire, Cobol tasks are replaced one by one by Java: our new shiny Cobol++ :-)


I'm sorry to say that, but mainframes aren't rare.
Big companies like IBM and EDS are struggling to find people qualified in Cobol, who are pretty rare these days. I visited an IBM Brazil site some months ago, and they said that even though the mainframe's death has been announced over the years, it has never happened and is not likely to happen soon.
Some of those already-retired greybeards were called back to work because there aren't other people to deal with those systems.
And when they retire (or die), Cobol tasks are not easily replaced by Java, because, as you said, it can be tricky and costly.
We're not talking about simple web systems here using dynamically typed scripting languages, but about very large systems (like banks) that have to run on mainframes with huge computational power. Those systems are running mostly on Cobol and, as you said, Cobol++ (Java).



We have around 8K languages (...) and 8 paradigms (...) It's about one paradigm per decade, or one per 1K languages - not *that* many :-) But I agree, there is plenty of room for everyone.

Well, it's a lot of paradigms from my point-of-view :D

pmasiar
January 22nd, 2007, 04:24 PM
What are you talking about? Public / private / final etc. are common to many languages; they're not intended to make things closed, they're there for very good reasons - good programming practice. If you have the source code, then making things private which are public will just break the program; it won't 'liberate' the software.

Now what are *you* talking about :-) - can you next time at least trim out the parts irrelevant to your comment?

I was not talking about "liberating" software - what might *that* even mean? Something like liberating Iraq? :twisted:

I am not a Java expert - I am not sure how making a private instance variable public will break the program? If I use it responsibly and don't intentionally break it, it should be fine, no? Breaking will be caused by wrong usage, not by changing it to public, IMHO. Am I wrong?

Python also has "private" variables - by convention, anything which starts with _ is private and should be used with care (or better not at all).
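
A quick sketch (the class and names are made up just to show the convention):

class Account(object):
    def __init__(self, balance):
        self._balance = balance    # single underscore: "private by convention"
        self.__audit_log = []      # double underscore: name gets mangled to _Account__audit_log

    def deposit(self, amount):
        self._balance += amount
        self.__audit_log.append(amount)

acct = Account(100)
acct.deposit(50)
print(acct._balance)             # works - the underscore only *warns* you it is internal
# print(acct.__audit_log)        # AttributeError - name mangling hides it a little
print(acct._Account__audit_log)  # ...but a consenting adult can still get at it

No AK-47 anywhere - just a polite request not to camp in the living room.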

Tomosaur
January 22nd, 2007, 04:34 PM
I am not a Java expert - I am not sure how making a private instance variable public will break the program? (...) Python also has "private" variables - by convention, anything which starts with _ is private and should be used with care (or better not at all).

Well, technically no, making a private thing public won't break the program, but you get the idea. Private things are supposed to be private. If the developers find themselves making things private which are supposed to be public, then something's gone wrong somewhere down the line. It's not necessarily restricted to Java, though. When you were talking about 'where's your protection now', I thought you meant it in the intellectual property protection / copyright etc. sense (since you were going along the lines of how Java was/is closed source blah blah blah) :P I clearly misunderstood you; it's just that the way you worded it made me think you misunderstood what public/private etc. is all about. That being said - private things are meant to be private. There's no reason to go making something public if it doesn't need to be.

neoflight
January 22nd, 2007, 04:40 PM
If compilers are so intelligent that they can prevent assigning a real value to an integer variable...
why don't they just recognize the variable's type in the first place when you assign to it?

just use python!

pmasiar
January 22nd, 2007, 04:42 PM
there are plenty of IDEs out there to solve the productivity problem of "statically typed" languages via refactoring, auto-completion, etc.
(...) you just type syso and Eclipse will do a System.out.println()

Yes, Eclipse will write the code for me - but will it read it for me when I need to analyze and understand it? The more lines of code, the harder it is to comprehend. As they say, the productivity of programmers should not be measured in lines written - but in lines spent :-)


I'm sorry to say that, but mainframes aren't rare.
Big companies like IBM and EDS are struggling to find people qualified in Cobol, who are pretty rare these days. (...) very large systems (like banks) that have to run on mainframes with huge computational power.

And for a programmer to be effective in such a complex environment, knowing Cobol will not be enough. He needs 20 years of experience working in the industry, so he has seen every rare case (which occurs once a decade) twice and can write code to handle it. You would be surprised how many people will say "it is so because the computer says so" - after restructuring and cost-cutting layoffs, the only 'institutional memory' which knows how to handle the rare cases is the obsolete Cobol program. :-)

Ben Sprinkle
January 22nd, 2007, 05:04 PM
I think Java will be. :)

TuxCrafter
January 22nd, 2007, 05:34 PM
I think Java will be. :)

Arguments, please - without them the comment is useless. Also arguments why, for example, Mono, C#, Python, or D is a lesser language.

Ben Sprinkle
January 22nd, 2007, 05:43 PM
Um, because Java is fun.

MadMan2k
January 22nd, 2007, 05:46 PM
Did you ever try a scripting language, like Python or Ruby? You would be surprised how much more fun programming can be, how much more productive you can be.
I'm currently writing (yet another) audio player in Python, and have written Scheme before, which is even more problem solving and less syntax.
The problem is that you are the only one who can maintain your Scheme code - and this also applies, to a lesser extent, to Python.

Yes, I can even enforce private variables with a double underscore, but the code becomes a pain to read then (and you still don't have package, final and protected).



Type checking can be easily done by unit testing - and you will write unit tests anyway, right?
Unit testing can only ensure that the specific examples work right - you can still have type errors at runtime.
Static typing, on the other hand, finds all type errors at compile time.
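
A contrived sketch (made up on the spot) of the kind of thing I mean - the happy-path test passes, and the untested branch still blows up at runtime:

def describe(value):
    if value > 100:
        return value                  # bug: forgot str() - returns an int instead of a string
    return "small: " + str(value)

# the unit test only exercises the common case, so it passes happily:
assert describe(5) == "small: 5"

# ...but the untested branch hands an int to code that expects a string,
# and the type error only shows up at runtime:
print("label: " + describe(500))      # TypeError

# with a declared return type (say, String describe(int) in Java),
# the compiler would have rejected the 'return value' line outright

So the test suite only proves the cases it actually runs.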

luizfar
January 22nd, 2007, 06:30 PM
Yes, Eclipse will write the code for me - but will it read it for me when I need to analyze and understand it? The more lines of code, the harder it is to comprehend.


I'm sorry, but I've gotta disagree with you here.
A text is not harder to comprehend just because it has more lines or more letters.
One of the principles of the Java designers is to provide a language that is easy to read, with no surprises for the reader. Static typing contributes to that.
If you see a System.out.println() and you know Java's conventions, you will easily notice (if you have to or want to; otherwise you will just ignore it) that 'out' is a public static variable of the class 'System', and that it has a public method called 'println', which in this case receives no args.
Although Python is very productive, Python code can more easily be written in a tricky manner. Of course it depends on the programmer, but Python code can be way harder to read than Java's.



And for a programmer to be effective in such a complex environment, knowing Cobol will not be enough. He needs 20 years of experience working in the industry, so he has seen every rare case (which occurs once a decade) twice and can write code to handle it. You would be surprised how many people will say "it is so because the computer says so" - after restructuring and cost-cutting layoffs, the only 'institutional memory' which knows how to handle the rare cases is the obsolete Cobol program.


Well, maybe that's why they're looking so hard for Cobol programmers. In 20 years the greybeards definitely won't be here any more.
hod139
January 22nd, 2007, 06:37 PM
Again, why won't this useless thread die! No one can predict the future programming language, in part because we don't know the future computer. Will we still be using Turing-equivalent languages in 20 years? What will the architecture be like? What effects will quantum computing have on programming languages? Stream processing seems to be a popular buzzword nowadays; will that become the "future"?

All of you are arguing about a future programming language using today's model of computation and today's popular programming languages. This is wrong and pointless. My guess is that in the short term we will start seeing massive growth in parallel architectures with the increasing power of CPUs. This means we will need to develop new languages to take advantage of this massive parallelism, something today's serial languages do not do well. Beyond that, who knows...

luizfar
January 22nd, 2007, 07:02 PM
You're very right, hod139.
But talking about that is fun :D

gh0st
January 22nd, 2007, 07:08 PM
Again, why won't this useless thread die! No one can predict the future programming language, in part because we don't know the future computer. (...)

This is a great point; trying to predict this sort of thing is pointless when we don't know what the computers of the future will be like. It's also a very vague question because it depends on what you mean by the future. Next week? Next year? Next century?

Personally I think the language of the future will probably be Mandarin :D ;)

pmasiar
January 22nd, 2007, 08:46 PM
Again, why won't this useless thread die! No one can predict the future programming language, (...)
All of you are arguing about a future programming language using todays model of computation and today's popular programming languages. This is wrong and pointless.

Pointless? Maybe. Wrong? Come on! Can you PRETTY PLEASE leave us to have our little fun here? If you dislike the topic, please learn to ignore it - it is not that hard.

In pointless discussions like this, lnostdal (thank you man!) posted a link to the hilarious rant Execution in the Kingdom of Nouns (http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom-of-nouns.html) - if you can read the poem without a smile, you are a lost case IMHO :-) Steve knows his Java, and builds the OO architecture of the kingdom, the battle, etc. with zest.

And if you want to read something serious about the future of languages, please read The Hundred-Year Language (http://www.paulgraham.com/hundred.html) essay by Paul Graham. He is a smart guy - you don't have to agree, but at least listen and think.

Or just ignore this thread - we do not claim we will change future languages here, and neither can you.

pmasiar
January 22nd, 2007, 09:01 PM
Im currently writing (yet another) audio player in python.

Good, so at least you know Python - many people choose to deride what they don't know.


Python is not suitable for large projects. You need to be able to enforce that others use your code correctly, which is important for bigger projects where several people work on the same code. The keywords are static typing and variable visibility.

Yes, I can even enforce private variables with a double underscore, but the code becomes a pain to read then (and you still don't have package, final and protected).

Unit testing can only ensure that the specific examples work right - you can still have type errors at runtime.
Static typing, on the other hand, finds all type errors at compile time.

Well, if you typecast, you change type at runtime, and you can easily typecast wrong if you're not careful. Same with dynamic typing - only less work. :-) And the Java community feels the need for more dynamic typing - Generics is the proof.

I wonder why a _ or __ prefix in a Python variable name is hard to read, but UglyLongTypeManager type definitions are easy on your eyes? They are not easy on mine - personal preferences, I guess :-)

Some big companies (like NASA and Google) think that with code review, dynamic typing is safe enough. You may disagree - more power to Google if you choose a language which makes you write code more slowly :-)

BTW, a quote from Greg Stein, from Apache (works for Google): they like Python and promote it - but not too much, because if the competition started using Python too, Google would lose its competitive advantage. So they choose not to hype Python the way Sun hypes Java.

hod139
January 22nd, 2007, 09:12 PM
Pointless? Maybe. Wrong? Come on! Can you PRETTY PLEASE leave us to have our little fun here? If you dislike the topic, please learn to ignore it - it is not that hard. (...) Or just ignore this thread - we do not claim we will change future languages here, and neither can you.
If this thread stayed in the realm of pointless or intellectual/serious predictions, then that would be fine and I would probably ignore it or contribute more positively. My problem is that it doesn't; it quickly turns into another Python versus C versus C++ versus Java ideological rant. In fact, do you remember the first post


Is it Python???
subsequently followed by


(Ducks for cover.) Another holy war coming! Fire in the hole!
The number of these "My programming language is better than yours" threads seems to have increased in the past few months, and after a while it becomes too difficult for me to ignore all of them. There will never be an agreement on which language is better, and having countless threads reminding us of this fact is pointless.

pmasiar
January 22nd, 2007, 10:18 PM
If this thread stayed in the realm of pointless or intellectual/serious predictions, then that would be fine and I would probably ignore it or contribute more positively. (...)
The number of these "My programming language is better than yours" threads seems to have increased in the past few months, and after a while it becomes too difficult for me to ignore all of them. There will never be an agreement on which language is better, and having countless threads reminding us of this fact is pointless.

1) You quoted my comments as a whole, but obviously did not read them - IMHO I answered *why*, and gave you 2 more links as proof that something interesting can be said. How does quoting someone without thinking about what you quote help productive discussion - it was *you* who complained that this discussion is not productive? At least trim the irrelevant parts - save the electrons...

2) Maybe we need to work out some sticky about the pros and cons of every language - or comparing classes of languages, or something. Because people keep asking the same questions - and the same people rehash old arguments for the benefit of new people asking old questions (every time in a slightly different context).

3) Obviously there would not be "one language to rule them all", as we agreed a couple of times (maybe you don't follow those discussions *that* closely :-) ). So why don't you just ignore it all? Or are you afraid that Java is losing? :twisted:

(ducking and running for cover like hell :-) )

hod139
January 22nd, 2007, 10:33 PM
1) You quoted my comments as a whole, but obviously did not read them - IMHO I answered *why*, and gave you 2 more links as proof that something interesting can be said. How does quoting someone without thinking about what you quote help productive discussion - it was *you* who complained that this discussion is not productive? At least trim the irrelevant parts - save the electrons...

I read them but maybe I was unclear. You seem to be defending pointless threads, whereas I want them to die and be replaced with productive and insightful threads on the forum. Especially when the pointless thread further degrades into the "My language is better than yours" thread, which brings up your point 2:



2) Maybe we need to work out some sticky about the pros and cons of every language - or comparing classes of languages, or something. Because people keep asking the same questions - and the same people rehash old arguments for the benefit of new people asking old questions (every time in a slightly different context).
Good luck! Even if something like this existed (which it probably does, on Wikipedia or elsewhere), people wouldn't bother to read it before posting. As is evident from the same language bashing that occurs over and over.



3) Obviously there would not be "one language to rule them all", as we agreed a couple of times (maybe you don't follow those discussions *that* closely :-) ). So why don't you just ignore it all? Or are you afraid that Java is losing? :twisted:

(ducking and running for cover like hell :-) )

If it is so obvious, then why are there soooo many threads on it?

TuxCrafter
January 22nd, 2007, 10:35 PM
If everyone makes a page about why their favorite language is the best, with good arguments, and we make a wiki of it, we can collect them and compare them.

The current Wikipedia will give you a lot of info already. But maybe also some info about toolkits like

wxWidgets vs. Mono vs. Java vs. GTK, etcetera, because language bindings are already common

loell
January 22nd, 2007, 11:23 PM
@hod139

You have been insisting that this thread is useless and pointless, yet you are still engaging in an argument in this thread - how ironic.

Obviously you can't enforce your "want" on the forum.

And we all know that there is no "end all, be all programming language", so is this thread pointless and useless? Maybe - you are entitled to your opinion.

But you don't decide whose thread dies or lives. You can however create an

"All language vs. language threads should die or be stripped out of the forum" thread

and let us see what the admins' reply will be.

MadMan2k
January 22nd, 2007, 11:32 PM
Well, if you typecast, you change type at runtime, and you can easily typecast wrong if you're not careful. Same with dynamic typing - only less work. :-) And the Java community feels the need for more dynamic typing - Generics is the proof.
Generics were actually introduced to remove the need for dangerous typecasts. You will now get a compile-time error if you insert an "Integer" into a "List<String>", while you used to get a runtime error on the "(String)lst.get(0)" before.



I wonder why a _ or __ prefix in a Python variable name is hard to read, but UglyLongTypeManager type definitions are easy on your eyes? They are not easy on mine - personal preferences, I guess :-)
I don't think one should have to care about variable visibility while inside the class (that's where private vars are used). IMO Python should really introduce a keyword for this.
Besides, I see the UglyLongTypeName only twice, at the definition, while I have to prefix the underscore every time I access the var.


Some big companies (like NASA and Google) think that with code review, dynamic typing is safe enough. You may disagree - more power to Google if you choose a language which makes you write code more slowly :-)
Especially in the OSS community you write programs which won't get reviewed (for instance, if the project starts out as just for yourself); then it is good if the compiler forces you to use the third-party code right.

sysop
January 23rd, 2007, 11:32 PM
The programming language of the future may be just that... spoken language. Tell the computer what you want and it codes it for you. :wink:

"Computer, begin new program. Create as follows, workstation here.
Now, create a standard alpha-numeric console positioned for the left hand.
Now, an iconic-display console positioned for the right hand. Tie both
consoles into the Enterprise main computer core utilizing neural scan
interface."
"Computer, save program."


Scotty: "Computer? Computer?"
McCoy: [Hands Scotty the mouse]
Scotty: "Ah. Hello, computer."

pmasiar
January 23rd, 2007, 11:52 PM
Programming language of the future may be just that...spoken language. (...)
"Computer, begin new program. (...)


It needs only 1 operator: rmdm, which is a shortened version of "Read my mind and do what I mean" :twisted:

so the program will be:

source = rmdm(); execute(source)

ohhh no! this is not safe! here is an improved, *much* safer version:

safe = execute(rmdm());
if safe { source = rmdm(); execute(source) }

jblebrun
January 24th, 2007, 06:36 AM
It needs only 1 operator: rmdm, which is a shortened version of "Read my mind and do what I mean" :twisted: (...)

If you ask Ray Kurzweil, WE won't be writing the programs of the future, machines will, once we pass the "singularity" point.

Grishka
February 10th, 2007, 12:23 AM
I wonder why no one has mentioned this: http://www.inform-fiction.org/I7

That's not the language of the future, obviously, but it's a good example of an egalitarian, natural programming language. It is a system for writers, not programmers, but in the future these professions may well merge. I find that necessary for progress to take place.