
Differences between programming languages


kbert777 (Technical User), Jul 21, 2003
Well, it's time to move on; I'm looking at different employment opportunities in my local area. I ran into a question during a recent interview for a Web Developer position that I think I could have answered much better, but it pretty much caught me off guard. Here it is:

Describe the differences between programming for the web and traditional 3rd/4th generation languages.

While I came up with some differences, I would really like to hear how other people would have responded.

Thank you very much in advance.
 
To me a 3rd-gen language is COBOL, Fortran, C, or Java, and a 4th-gen language is Visual Basic.

So I don't really understand their question, because most web programming is done in Java anyway if you are using an app server such as WebSphere with a backend database.

Unless someone can enlighten me as to how web programming does not use 3rd/4th-gen languages and how it is "different."
 
I don't think the question is very well worded, because of the phrase "traditional 3rd/4th generation languages." That in and of itself is a debatable issue.

But perhaps the thrust of the question is to investigate the difference between stateless web applications and stateful desktop applications.
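
To make that contrast concrete, here is a minimal sketch of the web side (my own illustration, using the classic javax.servlet API; the class name and "count" attribute are invented). A desktop app can keep a running total in an ordinary variable, but a web app gets a brand-new request each time and has to park that state somewhere, such as the session:

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    // Hypothetical example: HTTP is stateless, so the servlet must
    // fetch, update, and re-store its state on every request.
    public class CounterServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            HttpSession session = req.getSession();       // per-user state store
            Integer count = (Integer) session.getAttribute("count");
            count = (count == null) ? 1 : count + 1;       // rebuild, then update
            session.setAttribute("count", count);          // park it again
            resp.setContentType("text/plain");
            resp.getWriter().println("You have hit this page " + count + " times.");
        }
    }

A stateful desktop program would just increment a field and be done; the bookkeeping above is the price of statelessness.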

Good Luck
--------------
As a circle of light increases so does the circumference of darkness around it. - Albert Einstein
 
Hard to be sure what they were after.

It is possible that "state management" was one of the issues they had in mind, but plenty of programming in "traditional" languages was (and is) done in a transaction-processing model similar to most web programming. Only simple batch or single-user programs really work otherwise. If they were coming from a desktop programming environment, they might well be thinking that way.

They might also be looking at the issue of separating presentation logic from business logic. In many cases you write client-side script to enrich the interface, in addition to the code that runs at the server. In other cases you might isolate certain business-logic functions into yet another layer that is called from the server logic that manages state and processes user input received from the browser, sometimes via stored database procedures and sometimes via compiled components.
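
As a rough sketch of that layering (all names invented for illustration), in Java it might look like this: the servlet only parses input and renders output, while the business rule lives in its own class, which a stored procedure or compiled component could equally stand in for:

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Business layer: no HTML and no HTTP in sight.
    class DiscountPolicy {
        double discountFor(double orderTotal) {
            return orderTotal >= 1000 ? 0.10 : 0.0;
        }
    }

    // Presentation layer: parse the request, delegate, render the reply.
    public class QuoteServlet extends HttpServlet {
        private final DiscountPolicy policy = new DiscountPolicy();

        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            double total = Double.parseDouble(req.getParameter("total"));
            double rate = policy.discountFor(total);   // the rule lives elsewhere
            resp.setContentType("text/html");
            resp.getWriter().println("<p>Your discount is " + (rate * 100) + "%</p>");
        }
    }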

Another thing I commonly do with web applications is use them to integrate one or more legacy applications on existing platforms with additional new logic and a web front-end. Older-style applications would more typically go right to the database and reinvent logic, or call a few libraries of shared code, rather than act as a new layer "in front of" existing programs. But often a condition of adding a web interface to legacy applications is that the customer wants little or no change to the existing code and no "new legacy" code.

The latter leads into the use of various middleware and communication technologies to integrate parts of an application that "live on" different platforms. Most old-style environments relied heavily on monolithic development: the code ran on the box the database engine ran on, or used direct API calls into database logic rather than an engine at all for most non-relational databases. Things like ODBC popularized moving the database to a separate server, but desktop databases like MS Access still use an "engine" that runs on the application machine. Web applications often rely more on middleware, ranging from simple things like socket connections between servers at each layer, to structured synchronous links like DCOM or CORBA, to asynchronous message-oriented middleware.
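
At the simple end of that spectrum, a socket link between layers amounts to something like this toy sketch (the host name, port, and line protocol are all invented): the web tier opens a plain TCP connection to a back-end service on another box and exchanges a line of text:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;

    // Toy middleware sketch: ask a legacy service on another machine
    // for data over a plain line-oriented TCP socket.
    public class InventoryClient {
        public static void main(String[] args) throws IOException {
            // "legacy-host" and port 9000 are placeholders for a real service.
            try (Socket socket = new Socket("legacy-host", 9000)) {
                PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(socket.getInputStream()));
                out.println("STOCK 4711");        // one-line request
                System.out.println("Reply: " + in.readLine());
            }
        }
    }

DCOM, CORBA, and message queues buy you typing, discovery, and reliability on top of essentially this idea.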

There are often additional platform, application, and network security issues to deal with in web applications. This is especially true for web apps that will be exposed to the Internet, but also true in-house. Old-style PC applications required deployment to individual workstations, while most web apps effectively "self-deploy" when you hit the server with the right URL. Getting hold of the desktop code was often seen as a component of application security, in addition to logon rights and so on.

I wonder if they were really just talking about the simple separation of user-interface logic running at the client from the server logic layer, though. Most web applications have incredibly simple architectures and are basically written like an old mainframe terminal program, substituting HTML output for terminal screen-control codes and effectively treating URLs as transaction codes.
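
That "URL as transaction code" style looks roughly like this sketch (the transaction names are invented): one servlet reads the path, branches to a handler, and emits HTML where a 3270 program would have emitted screen-control codes:

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Sketch of the "terminal program" style: the URL path plays the
    // role of a mainframe transaction code.
    public class DispatchServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            String tx = req.getRequestURI();      // e.g. /app/ORD1 or /app/CUST2
            if (tx.endsWith("/ORD1")) {
                out.println("<h1>Order entry screen</h1>");
            } else if (tx.endsWith("/CUST2")) {
                out.println("<h1>Customer inquiry screen</h1>");
            } else {
                out.println("<h1>Unknown transaction: " + tx + "</h1>");
            }
        }
    }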
 
I was always tought:
1GL = Machine
2GL = Assemblers
3GL = HOL (high order languages)
4GL = VHOL (very high order languages)
5GL = computer responds to spoken commands

SPguru - not sure if VB fits in 4GL. 4GLs and 3GLs are basically the same except that a 4GL allows the programmer to define "what" without necessarily having to define "how". The compiler makes "logical" (not always) assumptions about the "how". Typical 4GLs are Progress, Oracle, etc.
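
A quick toy illustration of that "what" vs. "how" split, using JDBC against a hypothetical orders table (the connection string is a placeholder): the SQL declares what rows are wanted, while the surrounding 3GL code spells out how to step through them:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class WhatVsHow {
        public static void main(String[] args) throws SQLException {
            try (Connection con = DriverManager.getConnection("jdbc:example");
                 Statement st = con.createStatement();
                 // 4GL-style "what": name the result, not the access path.
                 ResultSet rs = st.executeQuery(
                         "SELECT id, total FROM orders WHERE total > 1000")) {
                // 3GL-style "how": step through the rows one by one.
                while (rs.next()) {
                    System.out.println(rs.getInt("id") + " " + rs.getDouble("total"));
                }
            }
        }
    }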


longhair
 
Hey longhair- how many times were you tought, I mean, taught this? Didn't stick the first time? Just busting you. I'm having a bad day, I think.
 
jack1955,

1) never one a spellign be
2) too much reliance on M$ spell checkers.
3) or maybe i meant to type thought ;>)
4) or maybe it's been a bad week (and it has)
the day is almost over so hopefully your day will get better

regards,

longhair
 
I think the biggest difference is that the web uses a lot of scripting and markup languages (JavaScript, VBScript, HTML, etc.) which are not considered "real" programming languages for various reasons (being a subset of another language, lacking traditional programming constructs, etc.). Although you can use some of the 3/4 GLs on the web, you usually have to incorporate one of the scripting or markup languages somewhere. Don't know if that helps, but it's my 2 cents.
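
A trivial sketch of that mixing (an invented page, not a real one): even when the server side is a "real" 3GL like Java, what actually reaches the browser is markup plus a dab of script:

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // The 3GL runs on the server; the HTML and JavaScript it emits
    // are what the browser actually interprets.
    public class MixedPageServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            out.println("<html><body>");
            out.println("<p id='msg'>Generated by Java on the server.</p>");
            // This line runs later, in the browser, not on the server.
            out.println("<script>document.getElementById('msg').style.color="
                      + "'green';</script>");
            out.println("</body></html>");
        }
    }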
 
Why are there so many languages that do the same things, except for this or that? Let's get rid of VB and make VB.NET! Let's get rid of COBOL and RPG and go with C++ for business programming! Why? How does this help the customer, the person we are supposed to be helping in this career?
 
Would you rather existing languages were constantly expanded and added to in order to do things they were never intended to do?
COBOL for hardware access, maybe?

In fact, VB is an example of that. VB.NET IS VB, though expanded to do things VB was never intended to do.
The result is usually a disaster: language constructs that are impossible to work with, terrible performance, etc.

For programmers it might be a bit frustrating to arrive somewhere where they need to learn a different language, but it's not much more work than learning a different API or toolkit for a language they already know.

Or do you propose to do away with all that and go back to hacking binary data into the CPU directly, which is the way programming started?
If those first programmers had thought like you, "who needs another language," no one would ever have written a compiler and you'd still be stuck working in binary machine code.
 
You bring up a very interesting point, jwenting, with respect to the evolution of a language vs. the creation of a new one.

General-purpose languages vs. special-purpose languages.

And then there is natural language. All natural languages evolve, in both written and spoken forms. New words are added to account for new inventions and discoveries (or are borrowed from other languages), new words are created as combinations of existing words, and so on. Words die off as well, as they fall out of vogue or no longer apply.

Perhaps we can have some fun discussing how and why the same kind of evolutionary growth that already exists in natural language cannot be migrated into programming languages. For the sake of the discussion, perhaps we should agree to leave the commercial argument out of it, or is that really too strong a factor to ignore? Is it even technically feasible?

Good Luck
--------------
As a circle of light increases so does the circumference of darkness around it. - Albert Einstein
 
Jeez, jwenting, don't get your knickers in a twist! I just think there are TOO many languages. I understand the need for some number of languages. Some become popular for a short period of time, then go away. It just seems to me that we get away from our purpose, solving customer problems, and into "we need to use another language, just for the sake of IT!" I work in a business environment, not technical, not hardware design, purely business: finance, sales, marketing, supply chain.

Keep this discussion going....
 
I've said it before, and I'll keep saying it: learning the syntax of a new language takes a good professional a matter of days. Learning a new approach to programming (going from structured to OOP, for example) takes a lot longer. It doesn't matter in the slightest how many languages there are; what matters is how many underlying ideas there are. And you can't stop new underlying ideas, can you?
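
For example, here is the same tiny task in a structured style and an OO style (both invented for this post); the syntax barely differs, but deciding where responsibility lives is the idea that takes time to learn:

    // Same task, two paradigms: the syntax is nearly identical, the
    // way responsibility is assigned is not.
    public class Paradigms {
        // Structured style: data and the code that acts on it are separate.
        static double area(double radius) {
            return Math.PI * radius * radius;
        }

        // OO style: the data owns its behavior.
        static class Circle {
            private final double radius;
            Circle(double radius) { this.radius = radius; }
            double area() { return Math.PI * radius * radius; }
        }

        public static void main(String[] args) {
            System.out.println(area(2.0));               // procedural call
            System.out.println(new Circle(2.0).area());  // message to an object
        }
    }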

On the same basis, should the world abandon RISC processors on the grounds that they only muddy the waters when we are all used to the rich Intel instruction set?
 
Many a language is created as a hobby or a test of skill, and grows and grows.
The creator goes to work for a company and uses the language for some small internal tool.
Colleagues see it, take a look, and try it.
Before you know it, the entire company is using it as one of its core languages.

People leave the company and go work elsewhere, sometimes taking the language with them.

That's how many languages grow, not by some diabolical plan to make our lives harder but by random drifting and connecting of neurons :)

Of course there are exceptions. Java was created specifically for the purpose of marketing it, and so was Visual Basic (for example), though that one was based on an older language that did start as I described.

Effectively, languages are evolving. Many current languages are natural progressions of older ones. Dialects evolve, merge, and die out over time as people add features, creating their own versions of the language which either get abandoned or merged into the common codebase.
In Pascal, for example, there were several object-oriented versions, all but one of which died out. The one that survived picked up some elements of the other dialects (as well as elements from other languages like C++).

I think the parallel between programming languages and natural languages can be drawn quite far, up to and including attempts to prevent "pollution" of the language by "foreign elements", which in natural languages is typified by the French attempts to lock out terms from other languages (remember last month's announcement to ban the word e-mail?).
 
For the most part very few people are in the position of adding features to a programming language. Most people are using commercially released language processors (LPs: compilers, interpreters) and aside from "new feature request" processes there is little opportunity for input.

There are exceptions in certain minority areas, and I suppose you could consider the few people actively working on open-source compilers. But in general, language evolution is either an academic venture or an internal one by commercial LP providers to correct deficiencies, add features, or accommodate standards processes.

Frankly, language evolution is a thorny problem. You want to do new things, but you have to consider the installed base of production software source code that will need to be upgraded too. For example, VB5 to VB6 had its share of troubles, but VB6 to VB.NET is significantly messier.

I have a hard time believing vast numbers of programmers have a clue regarding RISC or CISC instruction sets, let alone that they program in assembly languages. This one is really a minority concern today, though it doesn't mean it isn't important.
 
jwenting, I think the days of new computer languages being created by hobbyists are long gone...
 
Not quite, strongm; it's still being taught at universities.
Building a compiler is AFAIK still part of the curriculum of many IT courses (which I didn't do, being a physicist by training and a programmer by profession).

Just look at SourceForge for the number of compilers and scripting languages there.
On the first page browsing compilers I see at least 2 (out of 20 entries).
It's similar on further pages.

But indeed the volume seems to be down, in favour of creating add-on libraries for existing languages.
I notice the same elsewhere: people are more interested in "quickly" changing something that's already there than in creating something new from scratch.
This is probably part of the instant-gratification mindset pushed by advertising. People just aren't patient enough anymore to do much really constructive work, spending months before they can see their application do more than fail to compile because core features aren't yet implemented.
 
I think you and I will just have to agree to disagree, as I fail to find any evidence at SourceForge that supports your argument that "many a language is created as a hobby" or that "that's how many languages grow". I also, in 20 years working in IT, have found no evidence of companies adopting hobbyist languages as their core development language.

Sure, building a compiler is still a part of many IT courses, but generally the compilers written tend to be for existing languages (and often just a subset), as the exercise is designed to teach principles.
 
Depends on what you're developing.

PHP started as a way for Rasmus Lerdorf to monitor the number of times his resume was accessed on his web site. Sounds like a hobby to me.

And according to Netcraft, PHP has been the most-used web programming language for a while now.

Want the best answers? Ask the best questions: TANSTAAFL!!
 
I don't think anybody is disputing that innovation can (and does, and should) come from any quarter. We just got off on a tangent here about programming languages, many of which spring up all the time, though few gain broad acceptance and use.

Anyone have more input on the original question?
 
