
The free lunch is over


chiph
The article says that processor power has topped out, and that for any future performance gains, programmers must learn how to write concurrent software to take advantage of multi-core CPUs and hyperthreading.
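
To make that concrete, here's a minimal sketch (mine, not the article's) of what "concurrent software" means in practice: splitting a CPU-bound loop across a thread pool sized to the machine's core count.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Minimal sketch: split a CPU-bound sum across a thread pool
// sized to the number of cores, so a multi-core CPU is actually used.
public class ParallelSum {
    public static void main(String[] args) throws Exception {
        long[] data = new long[10_000_000];
        for (int i = 0; i < data.length; i++) data[i] = i;

        int threads = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(threads);

        int chunk = data.length / threads;
        List<Future<Long>> parts = new ArrayList<>();
        for (int t = 0; t < threads; t++) {
            final int lo = t * chunk;
            final int hi = (t == threads - 1) ? data.length : lo + chunk;
            // each worker sums its own slice; no shared mutable state
            parts.add(pool.submit(() -> {
                long sum = 0;
                for (int i = lo; i < hi; i++) sum += data[i];
                return sum;
            }));
        }

        long total = 0;
        for (Future<Long> f : parts) total += f.get(); // waits for each slice
        pool.shutdown();
        System.out.println("total = " + total);
    }
}

The hard part isn't the plumbing, it's carving your problem into slices that don't step on each other.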


Chip H.


 
There was an article a week ago about HP Research developing technology that replaces transistors and, in doing so, promises almost limitless computing power. I don't recall where I read the article, but it was interesting. It also mentioned, however, that it would be years before it would be used as a primary method (if ever, I would add).

And to think this R&D occurred on Carly's watch and they turn around and oust her as CEO. How fair is that?
 
The article says that processor power has topped out, and that for any future performance gains, programmers must learn how to write concurrent software to take advantage of multi-core CPUs and hyperthreading.
Or maybe cleaner code that isn't so power-hungry. Then again with JVMs and "Frameworks" to contend with...
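
To give one everyday JVM example (my own toy illustration, not from the article), the same output can cost wildly different amounts of work depending on how it's written:

// Building a large string with += creates a new throwaway String
// on every pass; StringBuilder reuses one growing buffer.
public class Concat {
    public static void main(String[] args) {
        int n = 50_000;

        // Power-hungry: O(n^2) character copies
        String s = "";
        for (int i = 0; i < n; i++) s += i;

        // Cleaner: O(n) work, one buffer
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.append(i);

        System.out.println(s.length() == sb.length()); // true
    }
}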
 
Or maybe cleaner code that isn't so power-hungry.

I agree, but the current drive to get code written in the cheapest place in the world can only result in bad code being written. Saw a good one at work the other day:

SELECT a, b, c
FROM tbl_x
WHERE a = a
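-- note: a = a is true for every non-NULL a, so this WHERE clause
-- merely excludes NULL rows; effectively the whole table is returned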

I can only imagine that template-driven code will become the norm, and we all know how efficient that stuff is. :-(

The only thing that might replace a careful design done by someone who knows the system inside & out would be a design done by a genetic algorithm, and then there's a serious trust issue -- how do you know that the GA's code isn't working because of some unintended side-effect?
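
To illustrate the trust problem (a toy sketch of my own, not a real design tool): a GA optimizes exactly what its fitness test measures, and nothing else.

import java.util.Random;

// Toy genetic algorithm: evolve bit strings toward a fitness score.
// The GA satisfies the *test*, not the intent -- if the fitness
// function is incomplete, the evolved result can pass it via an
// unintended side-effect the test never checks for.
public class ToyGA {
    static final int LEN = 32, POP = 40, GENS = 300;
    static final Random rnd = new Random();

    // The "spec" as the GA sees it: count of 1 bits.
    static int fitness(boolean[] g) {
        int f = 0;
        for (boolean b : g) if (b) f++;
        return f;
    }

    static boolean[] pick(boolean[][] pop) { // 2-way tournament selection
        boolean[] a = pop[rnd.nextInt(POP)], b = pop[rnd.nextInt(POP)];
        return fitness(a) >= fitness(b) ? a : b;
    }

    public static void main(String[] args) {
        boolean[][] pop = new boolean[POP][LEN];
        for (boolean[] g : pop)
            for (int i = 0; i < LEN; i++) g[i] = rnd.nextBoolean();

        for (int gen = 0; gen < GENS; gen++) {
            boolean[][] next = new boolean[POP][LEN];
            for (int c = 0; c < POP; c++) {
                boolean[] ma = pick(pop), pa = pick(pop);
                int cut = rnd.nextInt(LEN);            // one-point crossover
                for (int i = 0; i < LEN; i++)
                    next[c][i] = (i < cut) ? ma[i] : pa[i];
                if (rnd.nextInt(10) == 0)              // occasional mutation
                    next[c][rnd.nextInt(LEN)] ^= true;
            }
            pop = next;
        }

        int best = 0;
        for (boolean[] g : pop) best = Math.max(best, fitness(g));
        System.out.println("best fitness after " + GENS + " generations: " + best);
    }
}

Nothing in that loop knows what the bits mean; it only knows the score the test hands back -- which is exactly the trust issue.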

Chip H.


 
Reminds me of those bloated, finned boats we drove in the 1950s and '60s. The engineering placed no emphasis on clean and efficient energy use (or, for that matter, safety). It took a series of political embarrassments before anything was done at all (Nader, the Saudis exerting the muscle that car culture had given them, etc.).

Anyone with any foresight can see a crisis looming in computing too. Wasteful use of a cheap resource available in bulk from outside sources is a recipe for power plays that change the rules of the game.
 
Wasteful use of a cheap resource available in bulk

I think that such waste has been happening since the 286 was released. Every generation of PC hardware (the 286, the 386, the 486, etc.) has seen significant increases in computing power, available memory and storage.
And what have we done with it? Apart from recompiling the same old code from time to time, what have we really done with it? Clippy? Streaming video? Oooh, shiny!

get code to be written in the cheapest place in the world, it can only result in bad code being written
If I'm not mistaken, buffer overflows are our daily nemesis, and have been since the beginning of the Internet. These buffer overflows were not implemented by low-paid Indian hacks; they were designed by highly paid Western programmers, many of them with degrees in Engineering or Science. Fat lot of good that did.
I don't know anything about education in India (or any other third-world country, for that matter), and I am not aware that any Indian-based company has produced any code worthy of recognition yet, but given all the trouble we have with poorly designed mail clients and OSes now, I fail to see how low-paid programmers can do any worse.

Never mind, it's Monday and I'm probably grumpy.

Pascal.
 
I think "bad" in the sense being discussed here was meant as wasteful rather than outright buggy. The more I think about it though, the real waste is probably in the rapid replacement of hardware just to get the latest and greatest.

Most desktop systems sit running but idle for a lot of hours each day. So maybe the waste is really in not using those resources to any good purpose. In that sense, maybe using otherwise bloated development environments that result in bloated applications at least makes some use of the hardware. If there is any gain in doing so (development productivity?), then it may be all to the good, since those overkill machines are just sitting around anyway.

I have to wonder if a typical business desktop really ever needs the resources of anything greater than around 256MB and 500MHz. Machines of that power scale ought to be darned cheap and about the size of a really thick paperback book if built using today's technology, with substantially reduced power consumption compared with older ones. Most of the size would be the hard drive and CD/DVD writer, assuming a small external power supply.

At the worst I can't imagine why they'd need to be larger than a case for a CD duplicator (one reader/one writer cases).
 
Is code inefficiency really escalating in proportion to advances in PC performance? I think one major reason why the "computer experience" hasn't netted users a huge speed increase is that there are simply more apps being run concurrently today than before. There are so many apps running in the background, more multitasking, etc. As new software ideas come out, more programs appear that are attractive to users.

 
I agree with dilettante in the sense that not using system resources is not always efficient. Sure, if you can get an app to need only 10MB of memory instead of 30MB, that sounds great. But is it really? We programmer types often fail to acknowledge the real-world implications of our drive to make code smaller and faster. Sometimes bigger and slower can make a lot of business sense.

For example, let's examine creating a business process that needs to run nightly to update data.

Let's say we can spend 2 weeks building and optimizing it to run super efficiently, and have it take 15 minutes to run on a $5000 box.

Or we can spend 1 week building it with a RAD tool, and have it take 1 hour to run on the same $5000 box.

What did we gain by spending the extra week? A fourfold improvement in efficiency! Sounds great! But if the business needs are such that we have a window of 3 hours to complete the task each night, we really have gained nothing at all! We pat ourselves on the back for being efficient; meanwhile, the developer with the bloated code is twice as productive and ultimately of more use to the business world.
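
To put numbers on it (using the figures above, with the 3-hour window as the only hard constraint):

Nightly window:  180 minutes
Optimized build: 2 weeks of development, 15-minute run -> 165 minutes of window to spare
RAD build:       1 week of development, 60-minute run -> 120 minutes of window to spare

Both versions comfortably meet the constraint, so the extra week of development buys nothing the business can actually use.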

Sure, this does not apply to all cases. Sometimes it is worth the extra time and effort to maximize efficiency, and it is imperative to be able to do so. It is also imperative, from an economic standpoint, to know when to just take the shortcut and let the app be technically inefficient.

I would definitely argue that there are certain classes of applications where efficiency is extremely important: any operating system or program meant to run continuously in the background. With these, one should always aim for the smallest possible footprint, to leave resources available for the apps that actually do the productive work.

 
I think "bad" in the sense being discussed here was meant as wasteful rather than outright buggy.

Correct. I meant it in the sense of lots of cut-n-paste code, where the programmer doesn't have a sense of what it does, but just finds something that sort of works, and futzes with it until it works adequately.

And, this isn't limited to outsourced code. There are plenty of western programmers who do it too. But my experience has been that it primarily comes from overseas programmers.

Chip H.


 
What did we gain by spending the extra week?

Let me see: 2 weeks of build time for 15 minutes of run time, or one week of build time for one hour of run time.

What is gained by spending another week optimizing?
More time for the backups that run after/before the process.
More flexibility for the administrator, secure in the knowledge that an important business process can be rescheduled without severely impacting the nightly run schedule.
The code taking one hour to complete will be judged obsolete sooner because of time constraints, whereas the 15-minute code will be able to justify its usefulness longer, since it will allow for more tasks to run in the same night.

The price of the box is irrelevant; there has to be a box anyway. The cost of the coder(s) is relevant, but that just pushes back the date at which the code can be deemed to provide value for money (however the company decides to calculate that date).

I agree that quick & dirty is sometimes an acceptable way of doing things. Unfortunately, quick & dirty is what gave us buffer overflows in the first place. And cut&paste code is what has kept them alive.

My opinion is that it always pays to carefully plan an application and design it as well as possible -- even if it is not destined to support a critical business process.

Pascal.
 
When you say you have seen "nothing from India..." you are closer than you realise. The concept of zero was invented by Indian mathematicians while the Scots and English were running around covered in blue paint, hiding from the Roman invaders. India has a richer and longer mathematical heritage than most other civilisations.

It is always easy to find faults in foreigners and ignore the worse ones at home.

But hey, what can I say? I'm from New Zealand, where high technology is the electric fence. :)

Editor and Publisher of Crystal Clear
 
I'm from New Zealand, where high technology is the electric fence.

You're being modest -- Weta Digital has what must be the largest compute cluster in the southern hemisphere.

You're right -- we wouldn't have gotten very far without the concept of zero. It's one of those things that everyone just knew, but that nobody applied to counting until someone in India thought of it.

It is always easy to find faults in foreigners and ignore the worse ones at home.

I've seen lots of bad code written by US programmers, too. I'm not denying it.

What I suspect is happening is that the Indian software industry is about where we were in 1998 -- they're hiring everyone who can spell "object" and can fog a mirror. As a consequence, the quality of code is rather low.

Chip H.


 
Coming from a thoroughly western education, I remember being told at university not to re-write code that I could already take from somewhere else (in those days Fortran, which I'd have had to type in again). In fact, this was the very thinking that later grew into black-box object-ism, and thence into rapid application development tools.

How can we criticise people for cut-'n'-pasting, and not bothering to understand what the black boxes really do, if we told them to do it that way in the first place?
 
Every new computer has been an improvement, from my viewpoint. I can do the same things faster and also a few extra bits.

------------------------------
An old man [tiger] who lives in the UK
 
Computers are not the issue. It is the software that allows them to do something, or to crash faster. The new dual-core CPUs will now allow multi-threaded applications to crash two threads, independently.

BocaBurger
<===========================||////////////////|0
The pen is mightier than the sword, but the sword hurts more!
 
Several years ago there was a thread about increasing CPU power for PCs, and I feel the same now as I did then -- it's a big yawn.

Sure, for a server, you want the power. But my guess is that 95% of business desktops are overpowered. You just don't need this kind of power to run Word, Excel, IE, and most of the commercial business apps out there. Memory--yes--most desktops might benefit from more memory due to the bloat and inefficiencies mentioned. But more power wouldn't be noticed by most business users.

Gamers, yeah. Graphics, yeah. But not business desktops.

Me, I want internet bandwidth and memory.
--Jim
 