
Programming languages... Where are they going??


guestgulkan (Technical User) - Sep 8, 2002
As a technical user and sometime dabbler in programming,
I have seen HUGE strides taken on the hardware development side.
On the programming side:
I dabbled in Assembly (& hex), then moved to structured programming (C), and now OOP (C++).
What next?? I have seen no progress in languages in the last 10 years.

Will the next generation of programmers, in 20 years' time or so,
still be programming my 50GHz, x-terabyte-HDD, voice-recognition, holographic-VDU PC in C++??
Heaven forbid!
 
OK, I know nothing about programming, or what exactly people class as "programming" languages.
But haven't languages such as Java & JavaScript, Mercury, PHP, Python, VBScript, even Perl been developed in the last 10-15 years, and aren't they widely used on the net?
Or am I way off??


 
I think guestgulkan is asking about the development of programming technology, not the formation of new languages.

A brief history --

In the early days, we had what is known as 1st generation languages, or machine languages, where all the programming was done in 1s and 0s, and/or hard-wired relays.

Then came the 2nd generation, which includes the assembly languages, where mnemonic encoding of machine instructions became available.

In the 60's we began to see the development of third generation languages, such as FORTRAN, COBOL, and PL/I. These languages had a formal syntax and grammar, and now English-like statements could be built and combined into reasonable programs. Also, the 3rd generation required the development of interpreters and compilers, and to a lesser extent, translators. This generation can perhaps best be described as procedure-oriented languages.

Now it gets muddy. Different people have different ideas about 4th and 5th generation languages.

What is a 4th generation language? Some people argue that SQL is 4th generation. I don't agree. SQL, at least IMHO, did not enhance programming technology just because a new syntax for database access had been defined.

Others argue, and I am in this camp, that the introduction of objects, an entirely new technology and methodology for programming, defines the 4th generation.

Summarizing the generations:
1st - Machine languages
2nd - Mnemonics (assembly)
3rd - Procedure-oriented / compiled
4th - Object-oriented

As said, the first three generations are fairly well accepted by all, but there is divergence from the 4th generation and up.

With that in mind, guestgulkan's question could probably be better restated as "What new programming technologies will we see in the next generation?"

I would have to surmise that the next generation of programming technology will move in the direction of Natural Language Processing. But I also think that, due to the inherent ambiguities, this will not be so easy. I think we'll see, at least in the near term, more advances in programming development interfaces, and not so much in programming language technology.


BTW guestgulkan - with all the questions that you have asked, is it true that you've yet to find anything helpful?
Good Luck
--------------
As a circle of light increases so does the circumference of darkness around it. - Albert Einstein
 
-> CajunCenturion,
It's not that I haven't found anything helpful, even though
I've asked a lot of questions.
I'm between jobs at the moment - so I'm just hanging out in these two forums and having a good time (and I post a reply or two in some other forums).

However, I start work at an ICT (Information & Communication Technology) centre soon - then I'll really start asking some serious questions :) so I have been saving the 'stars'.
 
I will have to disagree a little with CajunCenturion. I consider OOP to be still 3rd generation (OOP languages allow someone to create 4th-generation abstractions, but the languages themselves are still as completely tied up in implementation details as their predecessors).

The whole point of a 4th generation language is to allow declarative logic without having to worry about the implementation. In this sense, SQL is one of the best examples we have so far of a 4GL. It is declarative, based almost purely on responses to logical requests rather than on details of storage, ordering, optimizing, etc. (all those things must be considered by a good DBA, but the SQL user -- any application layer -- doesn't need to be concerned with them). Now, I would be the first to agree that SQL is not perfect, and has muddied the "pure 4GL" ideal with many unnecessary implementation details, but I personally can't believe an experienced programmer could go so far as to say it "did not enhance programming technology".

It wasn't just a new syntax for database access. We had plenty of those already, and that was exactly the problem: they were all application-dependent and architecture-dependent, and every single one of them forced the user (application) to be intimately concerned with implementation details, instead of separating physical storage from logical request processing. Thus, it was impossible to guarantee database integrity if the database was being accessed from more than one application (think: just about any enterprise-level database).
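To make that declarative/imperative contrast concrete, here's a minimal sketch in Python using the standard sqlite3 module (the orders table and both queries are invented for illustration). The first half does what the old record-at-a-time interfaces forced on every application; the second simply states the logical request and lets the engine worry about the how:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(1, "smith", 40.0), (2, "jones", 15.0), (3, "smith", 60.0)])

    # Record-at-a-time style: the application decides HOW -- it walks every
    # row and does its own filtering, grouping, and accumulating.
    totals = {}
    for _id, customer, total in conn.execute("SELECT id, customer, total FROM orders"):
        totals[customer] = totals.get(customer, 0.0) + total
    big_spenders = [c for c, t in totals.items() if t > 50]

    # Declarative style: state WHAT is wanted; storage, ordering, and
    # optimization stay the engine's problem, not the application's.
    rows = conn.execute("SELECT customer FROM orders "
                        "GROUP BY customer HAVING SUM(total) > 50").fetchall()

    print(big_spenders, [r[0] for r in rows])   # both yield ['smith']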

Are there any advances over SQL? Yes. C.J. Date, one of the most prominent database writers/theorists, has proposed a truly relational language that is both simpler than SQL and even more purely logical, allowing the database designer to declare any level of logical constraint upon the database, and to use datatypes of arbitrary complexity. For years, most of the database industry ignored him, but apparently some companies are starting to take his ideas seriously. Dataphor implements a language they call D4, which is a relational language but is also a 4GL application development language, and SolidTech (makers of the Solid database engine) are working on something similar.

Basically, if these languages are put to proper use, they will get rid of 90% of the application code needed these days to implement database-backed applications.

Even in my own limited experience (PHP, Perl, etc.), I found that once I stopped just using SQL as a simple storage facility and really pursued the logical capabilities of an SQL DBMS (PostgreSQL), I was able to radically cut back on the amount of application code needed.
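For instance, a minimal sketch of that idea (the accounts/payments schema is hypothetical, and SQLite stands in here for a self-contained example, though the poster was using PostgreSQL): rules declared once in the database replace the same validation code repeated in every PHP or Perl script that touches the data:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")   # SQLite enforces these only when asked
    conn.executescript("""
        CREATE TABLE accounts (
            id      INTEGER PRIMARY KEY,
            balance REAL NOT NULL CHECK (balance >= 0)      -- the rule lives here
        );
        CREATE TABLE payments (
            id         INTEGER PRIMARY KEY,
            account_id INTEGER NOT NULL REFERENCES accounts(id),
            amount     REAL    NOT NULL CHECK (amount > 0)
        );
    """)
    conn.execute("INSERT INTO accounts VALUES (1, 100.0)")

    # No application-side 'if' statements: the DBMS itself refuses bad data,
    # no matter which of the applications touching it is doing the writing.
    try:
        conn.execute("INSERT INTO payments VALUES (1, 999, 50.0)")   # no such account
    except sqlite3.IntegrityError as err:
        print("rejected by the database:", err)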

Another language that aims toward 4GL thinking is Tcl (Tool Command Language), which is kind of SQL-like, although it is aimed at front-end application logic. Tcl users argue it is a perfect complement to an SQL back-end.

And, an attempt at a 5GL is available at

-------------------------------------------
Big Brother: "War is Peace" -- Big Business: "Suspicion is Trust"
 
Mmmm - Never thought of SQL as a programming language.
But what lies ahead for programming languages??
Will programming languages 20 years from now use AI?

Will I be able to write code like this:

"When I press the xyz button, open a connection to ..."

and the compiler will be able to understand what I mean??
 
I would hope not - if they ever made anything like that, you'd end up with a ton of really inefficient coding!

I've never seen any real computer intelligence and I doubt we will for a lot longer than 20 years.
 
So, out of a total of 300 thousand plus members - no one has a clue about the future of programming languages?

But I'm sure we're all agreed, whatever the future holds, it will have a really silly name (like SQUEAK).
 
Well, if all 300,000 members are looking at this thread, and no one is coming up with something, then that's pretty sad. But I think generally most members here don't even want to think about the future of computing ;-).

Let me do my best to gather up some interesting web links and discussions I have seen in the past 2 years about this topic. But before that, I will have to stop a moment and mention SQL again, because I think it has bearing on this topic.

>Mmmm - Never thought of SQL as a programming language.

This is exactly the point. Our perception of what a programming language is will have to change somewhat in order to progress toward better abstractions in computing. Moving from procedural to object-oriented was a big step, in some ways, although there have been plenty of well-voiced debates about that also. But still, with OO programming, the language is a generalized programming environment having all the same operators and methods, with just a couple of additions.

When I said SQL is the best example of a 4GL available at the moment, that is just what I mean. I don't mean that SQL is a wonderful language; I find much fault with it. It is, however, a programming language; it is just not a generalized programming environment. And that is exactly the point. If we keep trying to make generalized languages which cover every facet of development, we will always be mired in implementation details, instead of being able to simply and directly deal with the logic of our software. SQL database systems are usually written in C/C++, using all of the implementation details of those languages, in order to shield the SQL user from those very details. That doesn't do away with C programming, but it provides an elegant separation of responsibilities that we would do well to emulate in other areas. Imagine a GUI front-end language that shielded the programmer in the same way, concentrating only on the logical task at hand, allowing the "back end" (the program that parses the language) to be tweaked as a generalized system without ever affecting the front end.
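As a toy sketch of that separation (the spec syntax and the build() helper are entirely invented, with Python standing in for the C/C++ "back end"): the front-end text carries only the logic, and everything behind build() could be rewritten without the spec's author ever noticing:

    # A made-up declarative front end: the 'user' writes only this text and
    # never sees how widgets are parsed, created, or drawn.
    SPEC = """
    window title="Contacts"
    button label="Save" action=save
    button label="Close" action=quit
    """

    # The 'back end' (imagine it written in C/C++): every implementation
    # detail lives behind this function, the way an SQL engine hides storage.
    def build(spec):
        widgets = []
        for line in spec.strip().splitlines():
            kind, _, rest = line.partition(" ")
            attrs = dict(pair.split("=", 1) for pair in rest.split())
            widgets.append((kind, {k: v.strip('"') for k, v in attrs.items()}))
        return widgets

    for kind, attrs in build(SPEC):
        print("creating", kind, attrs)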

So, that brings me to a point about the supposed "progression", from 1GL to 4GL: every one of those levels is still in use even today. I mean, someone still has to make the physical chips, and deal with the ones and zeros. The progression should not be thought of as a measure of past->future, but a measure of low-level->high-level. So, the future will bring us more high-level abstractions, but there will always be a need for people to deal with the lower levels. The C compiler wasn't written in C, was it? Java wasn't written in Java, was it?

There are many ways to progress toward different types of abstraction and metaphor, each of which might be useful to one group while irritating to another. For example, the idea of a completely "visual" programming environment, where there is never any text representation at all, seems to me a very frustrating concept, but it might be a wonderful one for those developing certain sorts of projects.

Check out this comparison between Windows vs Apple scriptability:
Also, of course, there is the negative side to increased abstraction. Check out Joel Spolsky's "Law of Leaky Abstractions" and the ensuing Slashdot discussion.
There are all kinds of companies working on 4GL this and 5GL that, each claiming that they are pursuing the ultimate breakthrough which will enable people to simply and easily tell the computer what they want, and receive clear responses. Let's face it, we can't even do that well when dealing with one another!!! People are constantly misunderstanding each other, having to repeat questions, rephrase them, and ask for a rephrased response in order to make sure the other person understood them -- and even then, they often find out that the request is being processed in exactly the way the requester did not want.

Also, abstractions can actually bring greater complexity if we are not careful. I consider Java to be a perfect example of that. C was a dangerous language for novice programmers, but it is also a fairly simple and straightforward language. Read Kernighan & Ritchie's "The C Programming Language" (about 270 pages) and try to imagine a Java book of the same size, and whether it could truly explain Java in that amount of space. And to properly use Java, you need to be familiar with mountain after mountain of APIs even to accomplish some of the simplest tasks.

I don't know... if I were forced to make a prediction, I would say that sooner or later, society will have to find a simple, direct way for the general public to interface with computers. This is because computers will be a part of everything. But it will involve training the human as much as programming the computers. It will probably be some sort of simple declarative language -- NOT "natural language" processing; it will be more like a subset of our spoken language. An extremely simplified subset, kind of like a pidgin English, or a pidgin version of the speaker's native language. People will be taught this language from kindergarten onward, so that overall, they will be able to tell computers what to do for the everyday aspects of their lives, such as "Display all phone calls from my friend George". Then, of course, those who get further into dealing with computers professionally will learn more advanced aspects of this language. It will be possible to speak it, or enter it at a keyboard.
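Just to make the idea concrete, a sketch of how such a pidgin might be parsed (the grammar rule and the interpret() helper are invented; the example sentence is the one above):

    import re

    # One production of a hypothetical pidgin command language:
    #   Display all <things> from my friend <name>
    RULE = re.compile(r"Display all (?P<what>[\w ]+?) from my friend (?P<who>\w+)")

    def interpret(sentence):
        m = RULE.fullmatch(sentence.strip())
        if m is None:
            return None   # unlike natural language, the pidgin just rejects ambiguity
        return {"action": "display", "what": m["what"], "friend": m["who"]}

    print(interpret("Display all phone calls from my friend George"))
    # -> {'action': 'display', 'what': 'phone calls', 'friend': 'George'}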

Of course, sooner or later someone will hook up the human brain to a computer (it's already being done, but not on a very sophisticated level yet). I think this will be a painful exercise at first, as our brains tend not to stay "on topic" too well. Again, this will involve "training the human" as much as training the computer. Maybe we will come up with some sort of mental visualization system that allows us to manipulate higher abstractions (like the "ideoplasts" in David Zindell's Neverness). By that point, we probably won't even be thinking of these machines as computers.

-------------------------------------------

Big Brother: "War is Peace" -- Big Business: "Suspicion is Trust"
 
As my cultural anthropology professor said: "Advances in technology always increase the amount of complexity and specialization in the work force."

In the programming world, an example is component writers vs. script writers, in the sense that there are those who design reusable objects (components) and those who organize components into 'scripts' (terms used loosely).

Scripting technology will eventually manifest itself as an Application Markup Language (AppML) based on XML or some other human-readable declarative syntax. Component technology will evolve beyond the procedural and OO (Object-Oriented) paradigms, maybe to include AOP (Aspect-Oriented Programming) or some new paradigm nobody has thought of yet.
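A rough sketch of what petey describes might look like this ("AppML" is the poster's hypothetical name; the markup, the COMPONENTS table, and the loader below are all invented for illustration): the script writer wires pre-built components together declaratively, and a thin loader instantiates them:

    import xml.etree.ElementTree as ET

    # Built by the component writers; the script writer never opens these.
    COMPONENTS = {
        "grid":  lambda source: "grid component bound to " + source,
        "chart": lambda source: "chart component bound to " + source,
    }

    # Written by the script writer: declarative, human-readable, no procedures.
    APP = """
    <app name="sales-report">
        <grid source="orders"/>
        <chart source="orders"/>
    </app>
    """

    for element in ET.fromstring(APP):
        print(COMPONENTS[element.tag](element.get("source")))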

petey
 
I believe we are starting to see the next generation today. It is not so much the development of the language itself as a migration in the thinking pattern of how we implement the software. Object-Oriented Programming (OOP) was the first step.

The future of programming lies in the development of implementation specifications and interfaces that allow components to collaborate in building large-scale applications. The programming language of choice will become irrelevant.
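A small sketch of that idea (the Storage contract and both classes are invented for illustration): once collaborators depend only on a published interface, the component behind it can be swapped freely, which is the spirit of the RMI and CORBA examples below:

    from abc import ABC, abstractmethod

    class Storage(ABC):
        """The published contract -- the only thing collaborators depend on."""
        @abstractmethod
        def put(self, key: str, value: str) -> None: ...
        @abstractmethod
        def get(self, key: str) -> str: ...

    class MemoryStorage(Storage):
        """One interchangeable component; a remote or non-Python one could
        stand here instead, as long as it honours the same contract."""
        def __init__(self):
            self._data = {}
        def put(self, key, value):
            self._data[key] = value
        def get(self, key):
            return self._data[key]

    def application(store: Storage):
        # Application code sees only the specification, never the component.
        store.put("greeting", "hello")
        print(store.get("greeting"))

    application(MemoryStorage())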

I know, this has been going on for a long time. The bigger issue has been the commercial market's unwillingness to use open-standards-based technology.

RMI and CORBA are small examples of providing a distributed interface.

X windows with GNOME and KDE is another example where an interface was developed and a more complex structure was placed on top of it.

ODBC drivers and SQL are good examples from a database standpoint.

IVM (VPIMv3) is another good example of where things are heading. What I have found interesting about IVM is that Microsoft knocked off a European CODEC, put a file header on it, and called it their own. They then tried to force it into the specification. Fortunately, G.711 is the only CODEC that is required to be supported. IVM is the natural migration of a customer revolt several years ago that forced the development of AMIS analog networking. I'm not going into the history lesson here.

I know many will argue with me. Everyone wants to have "the language".

I also realize that there are sacrifices in programming performance that are sometimes made with this approach. That is where the hardware improvements have helped.

Will Natural Language programming come about? Probably. It will most likely overlay an existing language such as C++. It's really a matter of developing a large enough repository of objects with tightly woven interfaces to handle the variety of requests.



James Middleton
ACSCI/ACSCD/MCSE
Xeta Technologies
jim.middleton@xeta.com
 
I think that with the introduction of genetic engineering and bio-informatics, and quantum theory, we might in fact move to a totally different level of programming in the next 20 years (such as with Quantum Computers you would be able to literally have infinite processing powers), and bio-informatics would enable systems as complex, or nearly as complex, as human brains. I am pretty sure the traditional Object Oriented approach to programming would not quite be sufficient to fit in with these scientific developments.
 
"with Quantum Computers you would be able to literally have infinite processing powers"

Is that like

"with nuclear power electricity will be too cheap to meter"

?

I doubt it, basically.

But while we're talking about speed: computing power (speed) and what we (programmers) do with it seems to work like this.

Users have a finite length of time that they're willing to wait, after pushing a button, before getting bored and throwing something out of the pram.

As computers get faster, we fill that time with "nice" stuff: GUIs, pop-up help text, animated graphical buttons.

The programs of ten years ago, written in COBOL etc., do very similar work to the programs of today, written in C++ & Perl - and they seem to do it at about the same speed...

Mike

Want to get great answers to your Tek-Tips questions? Have a look at faq219-2884

It's like this; even samurai have teddy bears, and even teddy bears get drunk.
 
Speaking of a finite time to wait, users also have a finite amount of processing power upstairs. As things get faster, it is almost a requirement to keep them moving no faster than the user can follow. Quantum computers will be extremely handy in scientific and entertainment fields because they will allow much larger amounts of number crunching. When it comes to basic desktop programs, though, that extra power will be close to useless, as humans do have a maximum speed at which they can receive and send information.

"Traditional" object oriented programming is not tied to the machine in any way shape or form, it is a mindset and a theoretical outlook on how to program. While it is probable that someone will come up with a new way to look at programming, OOP will not become extinct because computers become faster, but instead because someone finds a better way.

-Tarwn
________________________________________________________________________________
Want to get great answers to your Tek-Tips questions? Have a look at faq333-2924
 
"that extra power will be close to useless"

To be honest, no -- I don't think so.

We'll just do "nicer" things with it. Cleverer speech recognition will chew up cpu cycles very nicely thank you. Improved graphics for games, improved AI for computer generated characters in games -- you name it, we can think of it. Mike

Want to get great answers to your Tek-Tips questions? Have a look at faq219-2884

It's like this; even samurai have teddy bears, and even teddy bears get drunk.
 
When I think of "infinite" processing and compare it to current processing power, I tend to think in a give or take 10Ghz mode. Most of the really nifty things I can think to do with more processor power have a finite upper bound that throwing more power at will affect very little. There will always be additions, but I think after a point the applications themselves will reach a point where increased processor power will be unnecessary for improvements.

Right now a high-end game takes only a little more processing power than an office application, compared to some of the genetics work or nuclear research going on. Instead of comparing a 2 or 3GHz processor to a processor with infinite power, try comparing a 2GHz processor to a 100GHz processor, or a 1,000GHz processor, or a 10,000GHz processor... after a while the limit is going to be human I/O and imagination.

-Tarwn
________________________________________________________________________________
Want to get great answers to your Tek-Tips questions? Have a look at faq333-2924
 