View Full Version : Some programming languages that honestly shouldn't exist anymore



jacobpratt909
February 23rd, 2015, 01:39 AM
I have been thinking about programming for a bit now, and have wondered (considering today's tech) why some programming languages are still alive. One of the many I can point out is Visual Basic Script (.vbs), as some old-school Windows users might know it. The best thing this language could do was create an interesting, one-use quiz application... and yet it still exists. It's not even a good starting language, considering it lets you use variables without ever declaring them. If you wanted a better language with a better security stance, you could use Java (which is a good language to start with) - still not so good on security, but for starters it works.
Post what languages you think shouldn't exist in a reply, or argue a good stance against my statement. I would enjoy a gathering of programmers :D

-Wyz

Li_Wu
February 23rd, 2015, 02:17 AM
We cannot do without Oberon. It's the best. Modula-2 and Pascal are runners-up. Sooner or later C will be abandoned in favour of a Wirth'ian language designed at ETHZ in Zurich.

DCO
February 23rd, 2015, 03:57 AM
Well, VB has been brought into the .NET framework, so until that Goliath falls, VB is going to be around for a while. I learned that during the college courses that helped me decide to switch to Linux. (Network Operating Systems was the clincher, since I was dual-booting at the time and was introduced to Active Directory.)

The rest of the reason for the prevalence is the existence of programs written in those languages and the cost it would take to port them to something else. Python 3 is having a taste of this right now, hence the persistence of Python 2.7.

buzzingrobot
February 23rd, 2015, 03:04 PM
Languages seem to survive as long as there are programs to maintain written in them. Visual Basic in its different forms will be around as long as Visual Basic code is running.

When I used to do a bit or two of programming, I found I was more comfortable with Wirth-ian languages like Pascal and Modula-2. I was comfortable with C, as well.

People tend to conceptualize programming in different ways. If they can, they're probably better off using a language that meshes with the way they think.

sffvba[e0rt
February 23rd, 2015, 03:19 PM
I don't understand why a language can't exist... it is not like it detracts from any other language for simply being available?!

HermanAB
February 23rd, 2015, 05:09 PM
I think Brain**** really shouldn't exist anymore, but how does one kill a language?

pfeiffep
February 23rd, 2015, 07:38 PM
I don't understand why a language can't exist... it is not like it detracts from any other language for simply being available?!
^^^^^
Many times the language exists because it's still being used ... just ask the finance industry about Common Business-Oriented Language (COBOL)

It may not be sexy or new but ... 70-75% of the business and transaction systems around the world run on COBOL (https://cis.hfcc.edu/faq/cobol)

So if you want to ensure yourself a job in programming, learning COBOL might be your ticket!

jacobpratt909
February 24th, 2015, 03:07 AM
Killing a programming language is like killing a real language. Latin is a practically dead language, so follow the roots from that, use your brain to map that onto computing, and voila! You know how a language will fail (maybe).

-Wyz

Matthew_Harrop
February 24th, 2015, 09:56 AM
I learnt Latin - it's not dead; we used to speak it to learn it, too.

Technically there are languages that are dead. Think of the original languages that were written (FLOW-MATIC, for example); nobody (or very few people) writes in them these days.


Wirth'ian language

What's that? Like Delphi and Modula-2? (What I could find out in a short Google search.)

And what are they developing in Zurich?

coldraven
February 24th, 2015, 10:10 AM
And what are they developing in Zurich?
Where did you think Gnome came from?
http://en.wikipedia.org/wiki/Gnomes_of_Z%C3%BCrich

jacobpratt909
March 1st, 2015, 11:16 PM
Ironic


And what are they developing in Zurich?


Where did you think Gnome came from?
http://en.wikipedia.org/wiki/Gnomes_of_Z%C3%BCrich

Sasha_Aderolop
March 3rd, 2015, 11:26 PM
There was a time when everyone seemingly programmed in Perl. But for those of us who used the language regularly, there was something about it that didn’t seem right.
Indeed, even its creators seemed to (implicitly) acknowledge that something was wrong, kicking off work on Perl6, currently under development as a complete revamp of the language. Work on Perl6 started in… the year 2000. Where is it? Who cares? Perl is dead. Don’t bother learning it.

Matthew_Harrop
March 3rd, 2015, 11:35 PM
There was a time when everyone seemingly programmed in Perl. But for those of us who used the language regularly, there was something about it that didn’t seem right.
Indeed, even its creators seemed to (implicitly) acknowledge that something was wrong, kicking off work on Perl6, currently under development as a complete revamp of the language. Work on Perl6 started in… the year 2000. Where is it? Who cares? Perl is dead. Don’t bother learning it.

But what was the problem?

robsoles
March 4th, 2015, 12:47 PM
we cannot do without Oberon. Itz the best. Modula-2 and Pascal are runners up. Sooner or later C will be abandoned in favour of a Wirth'ian language designed at ETHZ in Zurich.

Hmmm...


...

And what are they developing in Zurich?

+1!!!


Where did you think Gnome came from?
http://en.wikipedia.org/wiki/Gnomes_of_Z%C3%BCrich

-(SEVERAL_THOUSAND) There is this thing about threads with more than 1 reply; you should read more than 1 reply before answering what you think is a random question. I eventually learned to bother, so you could too :)



But seriously: what is this super "Wirth'ian" language they are developing (WHO_CARES_WHERE)?

JKyleOKC
March 4th, 2015, 11:02 PM
Technically there are languages that are dead. Think of the original languages that were written (FLOW-MATIC for example) nobody (or very few people) write in them these days.

Perhaps ALGOL and PL/I could be added to such a list; I've not seen much mention of them in the past 15 years or so. Both were quite significant milestones in language development. ALGOL-60 (created in 1960 by a huge committee) was possibly the very first "block structured" language and was the first one I encountered that had the concept of "strings" (back in 1966, not one of the software gurus at General Electric could tell me what that word meant when I was trying to learn ALGOL, although it was one of three languages supported on their pioneering time-share system -- the other two were Basic and Fortran). And PL/I was IBM's one-size-fits-all descendant of it, becoming the primary language for MIT's Project MAC and, because of that, the intellectual ancestor of C itself.

The first language with which I did any system programming, TRAC (Text Reckoning And Compiling), seems to actually be extinct, although a clone known as SAM76 for use in personal computers was around for a little while in the early 1980s.

For that matter, the original Dartmouth Basic with its 15, count 'em, 15 keywords is long gone. Its only error message was "What?" and it dealt only in floating-point numbers, no integers or text. But it gave way to many descendants of the same or similar name, resulting in the Basic we know today, so it's a bit of a stretch to call it dead!

Matthew_Harrop
March 4th, 2015, 11:59 PM
But it gave way to many descendants of the same or similar name, resulting in the Basic we know today, so it's a bit of a stretch to call it dead!

Should we then redefine what a dead language is perhaps? What criteria should it have to fulfill to be classified as dead?

I'll suggest some to get the ball rolling:
1. It must no longer be actively used - actively defined as any new code/programs written in the last 20 years.
2. Its descendants must not carry any major part of the language, but can include design or implementation lessons - major as defined by syntax structure, keywords, etc.
3. The 'off the shelf' compiler must be sufficiently hard to get - it requires a contact who has it stuffed in the back of a box, and it takes more than 6 weeks to get to work/requires a major rewrite.
4. The last learning textbook must have been written for it over 20 years ago.

Gustaf_Alhll
March 7th, 2015, 02:35 PM
Why should a programming language be removed? I'm sure there's always someone using it, and that applies to all programming languages.

You can't really say that something (or someone - that also applies) should disappear. Since everyone has different opinions about everything, you cannot just pull the string and say that something should be done. For example, you can't say that we should get rid of Brainfudge because it's a "strange" programming language, since there will always be someone who thinks the opposite. A thing might be strange to someone; someone else might think it's completely natural.
That applies to programming in general. There are people who think programming, no matter what language, is strange. I assume none of us here thinks that, but I have a friend who holds that opinion.

That's a big issue with our society. If we don't like something, we want to get rid of it. We never think from perspectives other than our own.

Matthew_Harrop
March 7th, 2015, 06:44 PM
I'm not sure that is the argument at all. What I understood was that there are certain languages that no longer exist because they are not actively written in, not that they should be 'removed'.

Gustaf_Alhll
March 7th, 2015, 11:25 PM
I'm not sure that is the argument at all. What I understood it as was that there are certain languages that no longer exist because they are not actively written in, not that they should be 'removed'

I was referring to the first/earlier posts:

Post what languages you think shouldn't exist in a reply
I guess I should've quoted it first.

mbott
March 8th, 2015, 02:42 PM
MOBOL. Ran on the MDS Series 21 that I worked with back in the mid-80s.

http://bitsavers.trailing-edge.com/pdf/mohawk/mds21/

--
Mike

Matthew_Harrop
March 10th, 2015, 09:16 AM
@Gustaf Sorry, my bad.

See, there are loads of languages out there that are no longer used and could be considered dead. It might even be that the machines that they were used on no longer exist, making them even dead-er ;)

flaymond
March 18th, 2015, 10:54 AM
Should assembly and binary still remain to exist?

robsoles
March 18th, 2015, 12:04 PM
Should assembly and binary still remain to exist?
lol.

(Though, on the other hand: if that isn't intentionally a gag, then say so, and somebody will probably beat me to explaining what each of those is (aside from binary not being a programming language).)

flaymond
March 18th, 2015, 02:41 PM
I know binary is not a programming language; it's the machine language. I just brought it up to see if this thread is still active. ;)

And if you got the answer... why should assembly not be dead? :O

I'm a newbie, sorry. 8-[

Matthew_Harrop
March 18th, 2015, 05:10 PM
Assembly language is a series of mnemonics that map very closely onto binary machine instructions, i.e. one term (mov) to one binary instruction.

For example:

mov r1, 0x1

This loads register r1 with the hex value 0x1. That is literally one instruction to the CPU - and keep in mind a CPU can handle millions per second. Many of the very basic aspects of a kernel are written in assembly (like certain graphics handlers, for example).

Also written in assembly is the bootloader of a computer. This is because it needs to access specific sections of memory and perform very specific functions while being very small and running very quickly.

A high-level language, like C, is converted into assembly by the compiler. The assembly generated from what you've written in C may not be the most efficient way of carrying something out. Further, the abstraction of a high-level language means that you don't control, or need to be concerned about, memory management or disk management or anything like that.
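The "one mnemonic, one instruction" idea can be sketched with a toy assembler. This is a minimal illustration only - the mnemonics and opcode numbers below are invented, not any real instruction set:

```python
# Toy assembler: each mnemonic maps to exactly one opcode number,
# mirroring the near one-to-one mnemonic-to-instruction correspondence
# of real assembly languages. The opcode table is invented for illustration.
OPCODES = {"nop": 0x00, "mov": 0x01, "add": 0x02, "ret": 0x03}

def assemble(lines):
    """Translate mnemonic source lines into a list of opcode numbers."""
    program = []
    for line in lines:
        mnemonic = line.split()[0].lower()  # the mnemonic is the first token
        program.append(OPCODES[mnemonic])
    return program

print(assemble(["mov r1, 0x1", "add r1, r2", "ret"]))  # → [1, 2, 3]
```

Three source lines become exactly three machine instructions; that direct correspondence is what makes assembly fast but tied to one architecture.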

If you're interested further by assembly (or I've completely confused you) then try reading:
- http://en.wikibooks.org/wiki/X86_Assembly
- http://en.wikipedia.org/wiki/Assembly_language

And I think we settled it with, there are dead languages that were written for machines that no longer exist (FLOW-MATIC for example)

PondPuppy
March 18th, 2015, 05:28 PM
The old gag I've read goes something like:

There are three reasons for using assembly:

1. speed
2. speed
3. speed

Not that I would know firsthand -- I'm a complete idjit except in a small area of machine automation, where I ascend to the level of semi-amateur.

robsoles
March 18th, 2015, 11:12 PM
I know binary is not a programming language, it's the machine language. I just bring it up to see if this thread still active. ;)

and if you got the answer...why assembly should not be dead? :O

I'm a newbie, sorry. 8-[

Binary is not a language - not a machine language and not a human language - because it is just a number system, one very well suited to systems where switches can only represent an 'on' or 'off' state. Memory and the like are literally just banks of programmable switches.

Assembly/assembler is the language of the processors themselves and can be represented using any number system, but binary encodings are employed because, to the best of my knowledge, that works out (funnily enough) as easiest to implement.

I'd have left it at what Matthew_Harrop said but he didn't cover binary so...

flaymond
March 19th, 2015, 02:09 AM
Binary is not a language, not a machine language and not a human language because it is just a number system which is very suited to systems where switches can only represent an 'on' or 'off' state. Memory and such like are literally just banks of programmable switches.

Assembly/Assembler is the language of the processors themselves and can be represented using any number system but binary methods of encoding it are employed because it works out (funnily enough) as easiest to implement to the best of my knowledge.

I'd have left it at what Matthew_Harrop said but he didn't cover binary so...

It's just information I picked up from several Linux articles I read about binary... I don't want to argue about which is true or not, because that would start a flame war instead of a clean conversation here.

Anyway, thanks for the information robsoles, PondPuppy, and Matthew.

I need to learn a lot.

SantaFe
March 19th, 2015, 02:18 AM
You want fun, try programming in BAL for an IBM 360! http://cs.ecs.baylor.edu/~maurer/SieveE/bal.htm :D

Matthew_Harrop
March 19th, 2015, 10:29 AM
Thanks for filling in the missing piece Rob :)


It just information I know from several Linux articles I read about binary...I don't wanna arguing which is true or not...because this will start flame war instead of a clean conversation/forum here..

Anyway, thanks for the information robsole, PondPuppy, and Matthew.

I need to learn a lot.

I wouldn't worry so much about Assembly and Binary. It may be good to have an understanding of what they are, but (I should imagine) most programmers can go through their lives without knowing any assembly. I chose to learn about it because I wanted to know the very basics of computing (Like how the computer knows which sector to boot from and how to actually boot the computer in the first place)

user1397
March 20th, 2015, 03:12 AM
I know binary is just a number system made of only 1s and 0s but isn't binary referred to as machine language? If not, what is machine language? It most certainly isn't assembly, as that is a low level programming language, as far as I understand these things.

robsoles
March 20th, 2015, 04:51 AM
Hi ubuntuman001

Machine language ultimately comes down to well-aligned 1s and 0s, and assembly/assembler is its human-readable counterpart; have a look at the Wikipedia article Matthew_Harrop linked to in an earlier post: http://en.wikipedia.org/wiki/Assembly_language

To say a processor is 'programmed' to read the instructions and act accordingly would (at least probably) be less accurate than to say a processor is 'configured' to react to bit patterns in predetermined fields so as to execute programs.

I personally think that where it reads
An assembly language is a low-level programming language for a computer, or other programmable device, in which there is a very strong (generally one-to-one) correspondence between the language and the architecture's machine code instructions. Each assembly language is specific to a particular computer architecture, in contrast to most high-level programming languages, which are generally portable across multiple architectures, but require interpreting or compiling.

I would have probably written the bit "in which there is a very strong (generally one-to-one) correspondence between the language and the architecture's machine code instructions" as "the mnemonics of assembly are used to represent the numeric instructions required by the processor this assembly is for; mnemonics are a human-readable representation of machine code and only in this way form any sort of 'language' to help humans cope with it."

Somebody would have probably had a go at me for being too ambiguous and re-written it as it stands now tho, so never mind.

I learnt Z80 about 30 years ago. I was able to hand-compile Z80 mnemonics into the required numbers up until probably about 27 years ago, but all I remember now is that the 'RET' instruction was 0xC9, which is 201 in decimal and 11001001 in binary. It was arbitrary, while I was loading the program into memory with BASIC 'poke' statements way back then, whether I wrote

Poke nextlocation,&hC9

or

Poke nextlocation,201

or, if I could remember how to express binary numbers in Locomotion BASIC, I would show that. (2nd Edit: Actually remembered - it would have been 'Poke nextlocation,&B11001001', and it may have been called Locomotive BASIC, and that version of BASIC may not have allowed very long variable names, but it has been a while - sort of all arbitrary if I have managed to make my point(s) anyway.)

Binary != ML (Machine Language); it is just easiest to represent many instructions in binary, as many instructions rely on forms of bit masking to differentiate which register (or other target) the operation will be performed on.
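The RET numbers check out, and the bit-masking point can be shown concretely. Here is a small sketch in Python; the LD r, r' encoding (0b01dddsss with 3-bit register fields) comes from the published Z80 documentation, but treat the decoder itself as illustrative:

```python
# The RET example: the same instruction byte in three notations.
assert 0xC9 == 201 == 0b11001001

# Bit masking in decoding: the Z80's "LD r, r'" family is encoded as
# 0b01dddsss, where ddd and sss are 3-bit register fields.
REGS = {0b000: "B", 0b001: "C", 0b010: "D", 0b011: "E",
        0b100: "H", 0b101: "L", 0b111: "A"}

def decode_ld(opcode):
    """Decode a Z80 LD r, r' opcode into a mnemonic, or return None."""
    if opcode >> 6 != 0b01:       # top two bits select the LD r, r' group
        return None
    dst = (opcode >> 3) & 0b111   # mask out the destination register field
    src = opcode & 0b111          # mask out the source register field
    if dst in REGS and src in REGS:
        return f"LD {REGS[dst]}, {REGS[src]}"
    return None

print(decode_ld(0x78))  # 0x78 = 0b01111000 → LD A, B
print(decode_ld(0xC9))  # RET is not in this group → None
```

The masks pick apart the fields inside one byte, which is exactly why binary is such a convenient notation for machine instructions.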

robsoles
March 20th, 2015, 03:00 PM
Hmmm, a pity I had to get reasonably drunk before searching my feelings for what I am pretty sure I will stand by as the right answer to this thread, in a general sense, when I make the potentially quotable statement:

There is no such thing as a programming language that *should* be deprecated, because such things will deprecate themselves when that is truly their end.



(:rofl: :lolflag: joke is on me pending how I feel about this post when I am sober next ](*,))

user1397
March 23rd, 2015, 12:44 AM
Hi ubuntuman001

Machine Language is ultimately down to well aligned 1s and 0s and is commonly referred to as assembly/assembler; have a look at the wikipedia article Matthew_Harrop linked to in a post earlier: http://en.wikipedia.org/wiki/Assembly_language

To say a processor is 'programmed' to read the instructions and act accordingly would (at least probably) not be as accurate as to say a processor is 'configured' to react to bit patterns in predetermined fields so as to execute programs.

I personally think that where it reads [the definition quoted above], I would have probably written the bit "in which there is a very strong (generally one-to-one) correspondence between the language and the architecture's machine code instructions" as "the mnemonics of assembly are used to represent the numeric instructions required by the processor this assembly is for; mnemonics are a human-readable representation of machine code and only in this way form any sort of 'language' to help humans cope with it."

Somebody would have probably had a go at me for being too ambiguous and re-written it as it stands now tho, so never mind.

I learnt Z80 about 30 years ago. I was able to hand-compile Z80 mnemonics into the required numbers up until probably about 27 years ago, but all I remember now is that the 'RET' instruction was 0xC9, which is 201 in decimal and 11001001 in binary. It was arbitrary, while I was loading the program into memory with BASIC 'poke' statements way back then, whether I wrote

Poke nextlocation,&hC9

or

Poke nextlocation,201

or, if I could remember how to express binary numbers in Locomotion BASIC, I would show that. (2nd Edit: Actually remembered - it would have been 'Poke nextlocation,&B11001001', and it may have been called Locomotive BASIC, and that version of BASIC may not have allowed very long variable names, but it has been a while - sort of all arbitrary if I have managed to make my point(s) anyway.)

Binary!=ML (Machine Language); it is just easiest to represent many instructions as binary as many of the instructions rely on forms of bit masking to differentiate between which register (or other target) the operation is going to be performed on.


I see. I guess I was just thrown off by the statement from the Wikipedia article on machine code (http://en.wikipedia.org/wiki/Machine_code) where it says
Numerical machine code (i.e. not assembly code) may be regarded as the lowest-level representation of a compiled and/or assembled computer program, or as a primitive and hardware-dependent programming language

robsoles
March 23rd, 2015, 12:59 AM
Wikipedia is brilliant in most places, but parts of it are often written by 'people in the know' in a form that will only help others who already knew and just needed something about the topic reinforced. The snippet you have quoted falls into that category for me - obviously near enough to right, to me, but not really providing clarity for novices.

There are Wikipedia articles (not many, really quite few) that are complete rubbish and should not be trusted in the slightest (please don't ask me to dig any up - a fruitless exercise, really; I haven't come across one in a while, tbh), and there are (many) excellent Wikipedia articles that any novice could really benefit from reading, although ambiguity is a bit of a loss in many of them.

I often think of Wikipedia: many, many more hands than (real/rational/sensible) minds involved in the editing - and by this I mean that the ratio of hands to minds is far greater than 2 to 1.

user1397
March 23rd, 2015, 05:31 AM
Gotcha.