
Should code and data be separated? 2


lionelhill
Last week I saw an article (New Scientist I think) about hardware solutions to the buffer overflow weakness (as exploited by hackers etc. to gain control of other people's computer equipment).
The idea seemed to be to separate code and data by hardware means, such that memory containing code could not physically be written to (in this case by an overflow from a preceding block of data).
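For anyone who hasn't met it, this is roughly what that separation looks like from user space on a POSIX system (a minimal sketch using mmap/mprotect; the article's hardware scheme would presumably enforce the same rule in silicon, with no way for software to opt out):

[code]
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    size_t pagesz = (size_t)sysconf(_SC_PAGESIZE);

    /* Stage some "code" in an ordinary writable page... */
    unsigned char *page = mmap(NULL, pagesz, PROT_READ | PROT_WRITE,
                               MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (page == MAP_FAILED) { perror("mmap"); return 1; }
    memset(page, 0x90, pagesz);   /* 0x90 = x86 NOP, just as a stand-in */

    /* ...then seal it: readable and executable, but no longer writable. */
    if (mprotect(page, pagesz, PROT_READ | PROT_EXEC) != 0) {
        perror("mprotect"); return 1;
    }

    page[0] = 0x00;   /* this store now faults (SIGSEGV): code is not data */
    return 0;
}
[/code]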

Does anyone else, like me, feel that we're throwing the baby out with the bath-water? One of the greatest realisations in the history of computing was surely that code is just a specialised form of data, and not a distinct special thing. And now we're undoing that realisation?

And incidentally, what's the difference between self-modifying code and "just-in-time compiled" code? I admit self-modifying code is a niche interest found in only a few specialised applications, but it's still one of the useful tools available to the constructive programmer.
I'm interested in other viewpoints and would greatly appreciate finding out more from anyone who has time and knows a bit more than I gleaned from a short article.
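To put the JIT question in concrete terms, a just-in-time compiler does something like the following (a minimal x86-64 Linux sketch; the byte string encodes "mov eax, 42; ret", and under a strict hardware code/data split the whole first half of this would be forbidden):

[code]
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    /* x86-64 machine code for: mov eax, 42; ret */
    static const unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

    /* A JIT first writes instructions out as ordinary data... */
    unsigned char *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                              MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) { perror("mmap"); return 1; }
    memcpy(buf, code, sizeof code);

    /* ...then asks the OS to re-tag the page as executable code. */
    if (mprotect(buf, 4096, PROT_READ | PROT_EXEC) != 0) {
        perror("mprotect"); return 1;
    }

    int (*fn)(void) = (int (*)(void))(void *)buf;
    printf("jitted code returned %d\n", fn());   /* prints 42 */
    return 0;
}
[/code]

Self-modifying code does the same data-becomes-code trick, only in place and repeatedly, which is exactly why a hard hardware split outlaws both at a stroke.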
 
It's the usual problem, isn't it? Do you want something you can actually use to get something done, or do you want something so safe it's almost impossible to use? Microsoft clearly reckoned the former was more saleable (in the end it's software that makes hardware sell, and if you make it hard to write the software, you won't sell the hardware no matter how wonderful it is). But I do appreciate the beauty of the tagged approach. Thanks for the excellent explanation, dilettante.

I still feel the best way to avoid viruses is to accept only boring ASCII text and boring, unadorned, plain bitmaps from outside sources (i.e. anything arriving up a cable rather than on a CD/diskette) - eliminate whizzy macro languages and executables that can run remotely. But I'm a Luddite!
 
Protected memory schemes don't make things 'impossible'. The whole world ran happily on secure operating systems before the arrival of the PC. Early microprocessor chips were too primitive to provide security, but that was addressed with the Intel 80386. Unfortunately Microsoft, and therefore practically everybody else, just couldn't be bothered. It will probably change, but 99% of computer users just don't realise it is a simple, easy option, so they don't demand it. As the virus problem is beginning to get overwhelming, things will change. Let's pray microsoft.com gets blown out by a denial-of-service trojan. Then Billy will get off his a**e and do something.

 
lionelhill said:
Do you want something you can actually use to get something done, or do you want something so safe it's almost impossible to use?

I don't think that is the right question at all. Software was being developed quite nicely with third-generation languages for over 25 years before the arrival of the PC - including, probably even more so due to the lack of resources, self-modifying code. Programmers knew what they were doing, and had the discipline and the professionalism required to get the job done.

I will grant you that programming was not commoditized as it is today. Microsoft, and I'm sure other vendors, made it much easier to program in order to increase sales and make money, and in the process removed the discipline and knowledge required. Nothing is free, and today viruses are one way that we are paying that price.

Good Luck
--------------
As a circle of light increases so does the circumference of darkness around it. - Albert Einstein
 
Microcomputers began as toys for hobbyists. Their growth to become a backbone of modern life was gradual, and each new machine sold better if it could run existing software.

We need a system where data is just data and doesn't take over your computer and start doing things. Not even when it's displaying a picture or a pop-up.

Establish the basic principle that people should control their own home machines; machines at work are a different matter, but they should definitely be protected from outsiders.

Modern machines are so fast that a proper split between data and instructions is not going to be that hard. And life is getting impossible without it. Who trusts an e-mail these days? Who downloads the nice bits of shareware that you used to be able to accept without risking anything worse than being bored by them?

------------------
A view from the UK
 
One answer is to try to replace fully programmable PCs with fixed-function devices. You'd have all of the software in ROM or in flash memory that can't be rewritten without flipping a hardware "upgrade" switch.

The problem here is that this seems to have gone in two directions:
[ul][li]"Terminal" devices that require hosted services to do most of the heavy lifting.[/li]
[li]Smarter devices that only perform a few fixed functions like email, word processing, playing music, creating/viewing photo albums, etc. with no macro or other programmable features.[/li][/ul]
The former was unpopular for several reasons, not the least of which was fear that it would be used as a way to extract even more cash from users than the current model does.

The latter might be useful at some level if you added web browsing. But so many sites require script or other active technology today that the result on a "hard" device can be pretty unacceptable.

I still think one or both of these approaches (combined in one unit?) may eventually win out. It is probably just a question of what will find enough acceptance in the market. It is quite a "chicken and egg" problem though. It won't be popular until it is really cheap. It won't be cheap until it can be mass-marketed.

Though it wouldn't meet every need, a lot of users (both home and business) don't need much more than such a hard-wired device. Such users make very limited use of the flexibility a PC offers today.
 
Cajun, I agree, but I think you're taking my quote not quite how I intended. I agree with you that good software was written, and was safe, without special help from hardware, long before the IBM-compatible PC came into existence. What I meant was that an aggressively defensive hardware/OS system, as described by dilettante, would render programming a very specialist activity that could only be carried out by the privileged few, armed with extremely specialist compiler software. The thing that's made the PC so useful and attractive is that hobbyists have been able to write useful software at home, and smallish concerns, universities etc. have been able to program extensively without special resources.

I'm sure it goes back historically to the fact that early personal computers always came with a way to program them, but never came with anything else! The PC owes its history to them, and not to the specialist word-processors that grew out of electronic typewriters and died out because of Word, WordPerfect, and good printers.
There will always be a place for strongly defensive systems as dilettante described, but for most of us, that's not the right machine at all.

The thing that's made the PC so vulnerable is a combination of three sub-things:
(a) the idea that it's nice for a computer's activity to be, in some way, directable from outside (even if that's only by embedded macroish things in web-pages etc.)
(b) the vast increase in complexity of PCs, so more things are going on without us users knowing,
(c) the general feeling that the user should have to know as little as possible about what's going on (which means they don't know what might be harmful, or how to stop it).
If you look back to the DOS days, life may not have been very sophisticated, but it was safer, because no one could make my machine do anything unless I put the diskette in and said yes; and I knew what the machine was up to. It couldn't do two things at once. Even boot-sector viruses relied on me forgetting and leaving the diskette in.

And just take a look at that, because it's an important point. That's where the rot set in. Boot-sector viruses were the big problem back then, and they relied on the one thing your computer did on its own without you typing anything - the one and only automatic, self-running function in the earliest operating systems.

Ironic that automation, the very point of computers, should also be their downfall.
 
Lionel

It wasn't more difficult to program then; it was easier. The DEC System 20 that I mentioned, with its proper memory protection, was something I programmed at university, running open-source compilers. Things actually got harder when the C/Unix generation arrived. Before that, the operating software was supposed to help you do things and catch your errors. It was as if we all had automatic transmissions that wouldn't let you accidentally go into reverse. Then the hippies came along with their manual gearboxes. Good fun - yes. Cheaper - yes. Safer - oh no.

You're right about the mad dash to automated program loading, updates, etc., but the rise of half-baked operating environments is purely down to the immaturity of Bill Gates' uneducated programmers.

 
With all this MS bashing, has anyone stopped to think that Bill Gates' OS is targeted because it is the most popular, while other systems are secure through obscurity?

Dimandja
 
Dimandja,

You're so right that real importance is measured by actual market use. I think it's only the Vocal Minority who indulge in this Bill-bashing on a regular basis. If you look on the site home page at the list of Hot Forums (which I believe is based on the top 10 current posting volumes), you find that Javascript appears halfway down the list - everything else on the list is Microsoft.

________________________________________________________________
If you want to get the best response to a question, please check out FAQ222-2244 first

'If we're supposed to work in Hex, why have we only got A fingers?'
 
Dimandja, johnwm:

I will admit that the popularity of Mi¢ro$oft's products makes them a more attractive target to those sociopaths who release viruses into the wild. That's why Al Qaeda hit the World Trade Center -- until 2001-09-11, from 8:00 to 17:00 U.S. Eastern Time, the piece of property on which those buildings stood was the most densely populated place on the planet Earth.

However, the popularity of Mi¢ro$oft's products doesn't change the fact that those products couldn't be a target for the sociopaths if the security holes weren't there in the first place.



Want the best answers? Ask the best questions!

TANSTAAFL!!
 
So I guess we have gone around in a circle.

We started by talking about one of the ways the industry is looking at making PCs less vulnerable to attack through holes created by imperfect programmers. Now we're saying we wouldn't need these protections if everyone just wrote hardened code.

But what costs more?

I suppose a sideshow here is the idea that if we had greater platform diversity we could live with less perfect code on less defensive platforms. There might be some merit in that idea too, but I don't see it coming anytime soon. For the short term at least I expect to see relative market penetration numbers stay about as they are today.
 
lionelhill said:
What I meant was that an aggressively defensive hardware/OS system as described by dilettante would render programming a very specialist activity that could only be carried out by the privileged few, armed with extremely specialist compiler software. The thing that's made the PC so useful and attractive is that hobbyists have been able to write useful software at home, and smallish concerns, universities etc. have been able to program extensively etc. without special resources.

Actually, on the platforms I was talking about, things aren't nearly as grim as you describe.

You can still use programming languages like C that encourage unsafe coding. It's just that some really unsafe things aren't available, and others are effectively "sandboxed."

There are several Algol compilers available that grant varying levels of access to privileged functionality (or none at all). Making these compilers available, and deciding which users get them, is a site-management responsibility. Very few users need access to the OS compiler, though. Even being able to run an "unsafe" compiler doesn't mean you can execute the resulting code.

You also have the option of several Cobol compilers, Cobol-generators of various stripes, a Pascal compiler, a Basic compiler, several Fortran compilers, an Ada compiler, and so on. Other languages such as Jovial and PL/I have been supported when they were in fashion. There is even a Java runtime allowing bytecode compiled elsewhere to be interpreted on this machine.


Speaking of Java, you have that already as an option for general use on many platforms. .Net and its supported languages also support the execution of managed or sandboxed code. The only difference here is that this relies (maybe too) strongly on software mechanisms for protection. These "sandboxes" need to be really tight code you can trust. They also cost you in terms of performance.


I'm just saying that "safe" doesn't have to mean that your hands are tied as a user or developer. Of course you do have to live within some limitations. But how often do you really need to write into another process' private data, stack, or code segment?
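To make that performance cost concrete, a managed runtime effectively wraps every risky memory access in a check along these lines (a hand-rolled C sketch of the idea; real VMs emit the equivalent in their interpreter loop or JIT output, and the names here are mine):

[code]
#include <stdio.h>
#include <stdlib.h>

/* A toy "sandboxed" array: every store is bounds-checked, so a bad
   index becomes a clean runtime error instead of a silent scribble
   over someone else's stack, heap, or code. */
typedef struct {
    int   *data;
    size_t len;
} SafeArray;

static void safe_store(SafeArray *a, size_t i, int value) {
    if (i >= a->len) {                 /* the price: one test per access */
        fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, a->len);
        abort();                       /* fail loudly, as a VM would */
    }
    a->data[i] = value;
}

int main(void) {
    int storage[8] = {0};
    SafeArray a = { storage, 8 };
    safe_store(&a, 3, 42);    /* fine */
    safe_store(&a, 9, 42);    /* trapped: the overrun never happens */
    return 0;
}
[/code]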
 
But how often do you really need to write into another process' private data, stack, or code segment?

Once should be enough, shouldn't it?

Dimandja
 
Dimandja, thanks for the input. I'm personally not Microsoft-bashing; for better or worse they have provided a very universal system that lots of people have used for good things.

Dilettante, I think what I'm getting at is that for all the added hardware security, you haven't excluded viruses. If you have a Java interpreter which can also write files, it can modify files of Java code compiled elsewhere, and it would be quite possible to write a Java-specific virus affecting this machine. I don't think it's possible to exclude viruses completely without excluding the creation of any new programs (e.g. by running exclusively from ROM and having no interpreters at all). For most of us, this is too much to lose. Everyone makes that choice when they decide whether to enable Word/Excel macros or not.
 
So if you believe that "hardened" platforms aren't an answer (at both hardware and software levels), and you aren't willing to forego having the ability to alter your computing platform... what's the answer?

The hardware protection topic we started out with is an attempt to compensate for, or guard against, faulty code. You might be interested in the limitations, and the effort that goes into trying to make Java safe, described at Securing Java.


But the important question is: So where do we go next?
 
It seems pretty clear that Microsoft is beginning this journey with Win XP, SP 2: Execution Protection

"Although the only processor families with Windows-compatible hardware support for execution protection that are currently shipping are the AMD K8 and the Intel Itanium processor families, it is expected that future 32-bit and 64-bit processors will provide execution protection. Microsoft is preparing for and encouraging this trend by supporting execution protection in its Windows operating systems."

I assume we can expect the BSDs, Linux, NetWare, Solaris on x86, and so on to follow suit - if they don't have plans in place already.

Oops, I see there is an OpenBSD/amd64 port for the Athlon 64 already.
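For reference, the bug class all of this hardware is aimed at is the plain stack smash (a deliberately broken sketch; with execution protection on, even a successful overwrite of the return address can't land in attacker-supplied bytes on the stack, because stack pages are marked no-execute):

[code]
#include <string.h>

/* The classic hole: a long 'input' runs past 'buf' and overwrites the
   saved return address.  Traditionally the attacker points that address
   back into shellcode carried inside 'input' itself.  With NX/Execution
   Protection the stack page is non-executable, so the final jump faults
   instead of running the payload. */
void vulnerable(const char *input) {
    char buf[16];
    strcpy(buf, input);        /* no length check: writes past buf */
}

int main(int argc, char **argv) {
    if (argc > 1)
        vulnerable(argv[1]);   /* e.g. a 200-byte argument overflows */
    return 0;
}
[/code]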
 
And for older machines:
Microsoft said:
Sandboxing

To help control this type of attack on existing 32-bit processors, Service Pack 2 adds software checks to the two types of memory storage used by native code: the stack, and the heap. The stack is used for temporary local variables with short lifetimes; stack space is automatically allocated when a function is called and released when the function exits. The heap is used by programs to dynamically allocate and free memory blocks that may have longer lifetimes.

The protection added to these two kinds of memory structures is called sandboxing. To protect the stack, all binaries in the system have been recompiled using an option that enables stack buffer security checks. A few instructions added to the calling and return sequences for functions allow the runtime libraries to catch most stack buffer overruns. This is a case where a little paranoia goes a long way.

In addition, "cookies" have been added to the heap. These are special markers at the beginning and ends of allocated buffers, which the runtime libraries check as memory blocks are allocated and freed. If the cookies are found to be missing or inconsistent, the runtime libraries know that a heap buffer overrun has occurred, and raise a software exception.


Software instead of hardware, but deja vu all over again. ;-)
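Written out by hand, the stack check Microsoft describes amounts to something like this (a sketch only; GCC's -fstack-protector and the recompile-with-/GS Microsoft mentions insert the cookie and the epilogue test automatically, and plain C makes no real promises about where locals sit relative to each other):

[code]
#include <stdlib.h>
#include <string.h>

/* Set once at program startup from an unpredictable source. */
static unsigned long stack_cookie = 0x5abeefcafe5a5a5aUL;

void guarded_copy(const char *input) {
    unsigned long cookie = stack_cookie;  /* prologue: cookie sits between */
    char buf[16];                         /* the locals and the return addr */

    strcpy(buf, input);   /* the unsafe copy the check is guarding */

    /* Epilogue: an overrun long enough to reach the return address must
       have trampled the cookie first, so abort now rather than return
       through a corrupted address. */
    if (cookie != stack_cookie)
        abort();   /* real code calls __stack_chk_fail / its /GS cousin */
}

int main(void) {
    guarded_copy("short and harmless");
    return 0;
}
[/code]

The heap "cookies" in the quote are the same idea applied to allocator metadata: markers before and after each block that the runtime re-checks on every allocate and free.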
 
'ullo Dilettante,
you asked me a question:

>So if you believe that "hardened" platforms aren't an answer (at both hardware and software levels), and you aren't willing to forego having the ability to alter your computing platform... what's the answer?

I personally think we all need different answers according to who we are. For many people, who need never create new programs themselves, and who don't live on vast quantities of slightly-dodgy self-modifying legacy code, the hardened platforms at hardware level are ideal.
For others, like me, the best approach is probably to turn off the macro abilities of utterly everything, and never open those wretched e-mail attachments (and hope desperately that buffer-overflow problems, something of which any self-respecting programmer would be deeply ashamed, gradually get sorted out. I mean! Writing without consideration of range! Shocking!).

I feel it's maybe time for the PC to split more enthusiastically into dedicated versions for different functions. Somehow dedicated web-surfing equipment, or dedicated e-mail equipment, just hasn't succeeded in the way that dedicated games consoles have. The PC itself should ideally remain flexible, even if that makes it less safe. But that's just my view.

 
I personally think we all need different answers according to who we are.

Amen.

I like to code creative programs. That means I like my hardware and software to be flexible enough and smart enough to keep up with my designs. A dumbed-down machine simply won't do.

There are a million ways to introduce annoying data and code into a computer - any computer. We simply need to design a million and one ways to stop malicious software. Simplistic, common-denominator solutions only defeat the purpose of having a computer in the first place.

Dimandja
 