I can understand why an analog computer wouldn't work, but why only binary? With trinary you get 50% more voltage levels to work with, and the storage capacity grows faster as you add more bits, errrr trits.
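A quick way to see the storage side of that question: n trits can represent 3^n values versus 2^n for n bits, so each trit is worth about log2(3) ≈ 1.58 bits. A small illustrative calculation (not tied to any particular hardware):

```
import math

# Compare how many distinct values n bits vs. n trits can represent.
for n in (1, 4, 8, 16):
    print(f"n={n:>2}: {2**n:>6} values in binary, {3**n:>10} values in ternary")

# Each trit carries log2(3) bits of information.
print(f"bits per trit: {math.log2(3):.3f}")
```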
I believe it is because computers work with electricity. The power can either be OFF (0) or ON (1). No degrees of 'on-ness'. Hence binary. Everything else is determined from there (powers of 2).
At college (many years ago) I can even recall loading programs in using just toggle switches that were either off or on... painstaking doesn't even begin to describe it!
Indeed. There has been experimentation with a trinary system of on, off, and maybe (to put it somewhat loosely) as states.
It brings some advantages in certain areas but apparently not enough to warrant large-scale commercial ventures.
Come to think of it, the reason *could* be that when designing CPUs etc. some of the partial circuits in the CPU are re-used from previous designs to be cost-effective?!
So a shift would result in...
0001 - 1
0010 - 3
0100 - 9
1000 - 27
0002 - 2
0020 - 6
0200 - 18
2000 - 54
That'll be hard to get used to ;-)
(I kinda like the powers of 2 )
if you want to cut something in half, use a right shift...
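Just to check the numbers in that list: in base 3 a left shift multiplies by 3 and a right shift divides by 3 (throwing away the remainder), so it cuts things into thirds rather than halves. A small illustrative sketch, since no mainstream language has a built-in trit shift:

```
def ternary(n, width=4):
    """Render a non-negative integer as a fixed-width base-3 string."""
    digits = []
    for _ in range(width):
        digits.append(str(n % 3))
        n //= 3
    return "".join(reversed(digits))

x = 1
for _ in range(4):
    print(ternary(x), "-", x)   # 0001 - 1, 0010 - 3, 0100 - 9, 1000 - 27
    x *= 3                      # a ternary "left shift"

print(ternary(27 // 3), "-", 27 // 3)  # a ternary "right shift": 0100 - 9
```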
Plus the screen sizes have been based on 2^x
We would no longer be able to say it's all 1's And 0's...
Or...
There are only 10 kinds of people in this world, those who know binary and those who don't...
Though I guess it would be... There are only 10 kinds of people in this world: those who know trinary, those stuck in binary, and the rest of the world...
not sure about the...
it's all 1's And 0's... and uhhh 2's too
These days, performance is achieved with higher frequencies, which let logic gates switch faster between states (on and off). The usual voltage for modern CPUs varies between 1.3 and 1.7 volts (in the past 5 volts was the standard, though CMOS circuits could run on anything from about 3 volts up). The lower voltages allow two things: speed and low power.
The threshold is the key design issue in these logic gates, because it has to have enough tolerance to let you reliably decide which state the gate is in: on or off.
At higher frequencies, inputs and outputs tend not to be that steady. So tri-state or trinary computers would probably have to run at lower speeds (because they have to distinguish three states instead of two), and the performance would end up about the same. So "why complicate things?" is probably the answer. (Still, I believe trinary would be more efficient.)
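To make the threshold point concrete, here is a rough back-of-the-envelope sketch (the 1.5 V swing is just an assumed, illustrative number): splitting the same swing into three bands instead of two roughly halves the noise margin around each decision threshold.

```
# Rough noise-margin comparison for the same supply swing (illustrative numbers only).
swing = 1.5  # assumed core voltage swing in volts

# With N distinguishable levels, adjacent levels are swing / (N - 1) apart,
# and the usable margin on each side of a decision threshold is half of that.
for levels in (2, 3):
    spacing = swing / (levels - 1)
    margin = spacing / 2
    print(f"{levels} levels: spacing {spacing:.2f} V, margin per threshold ~{margin:.3f} V")
```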
Modems are also a good example: the first ones used two frequencies to modulate 1's and 0's, while newer ones use phase modulation technologies that let them transmit up to 256 distinct levels, i.e. 8 bits per symbol (please correct me if I am wrong).
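For reference, the number of bits carried per symbol is just log2 of the number of distinguishable levels, so a 256-level scheme carries 8 bits per symbol. A quick sanity check, not a statement about any specific modem standard:

```
import math

# Bits per symbol for a given number of distinguishable signal levels.
for levels in (2, 4, 16, 256):
    print(f"{levels:>3} levels -> {math.log2(levels):.0f} bits per symbol")
```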
(1) The first electronic computer I know of, a horrendously huge thing using valves, was actually decimal. It was still based on the concept of on/off rather than 10 levels of "on-ness", but it used a ring of 10 bits, only one of which could be on, to represent a decimal digit. It was called ENIAC. It's easy to be critical in retrospect, but those 10 bits could have stored a number from 0 to 1023 instead of a single digit from 0 to 9.
(2) Analogue computers did work. The Bush differential analyzer is probably the best known, but there have been numerous analogue computing devices over the years. Mostly they amount to a mechanical expression of a mathematical equation, with an ability to use it to calculate or plot values. In a sense, orreries (models of the solar system) were analogue computers.
(extending that, the gearing between the minute and hour hands of your watch is probably amongst the most common calculating engines - even if it merely divides by 12.)
(3) Maybe the reason why binary is so popular is that it works so well, not just in electronics but also in Boolean algebra/logic. Have a think about the philosophy of a trinary computer and its logic! What do you do with three? 0=No, 1=Maybe, 2=Yes? What's the result of an OR operation? Yup, you can work this sort of thing out: No or No = No; anything or Yes = Yes; of the remainder, anything or Maybe = Maybe.
But there are other ways you could interpret the trits: 0=No, 1="Yes, I just know it", and 2="Yes, and I can prove it". This can change how the logic ought to work.
Also it gets harder to work out how to use logic gates to make an adder (a very simple thing in binary). For our logical or, 1 or 1 = 1, but now the adder needs to see that 0+2 is the same as 1+1.
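To make the adder point concrete, here is a minimal sketch of a single-trit adder (my own illustrative decomposition, not a description of any real hardware): the sum and carry are just the base-3 remainder and quotient, so 0+2 and 1+1 both give sum 2 with carry 0, while 2+2 gives sum 1 with carry 1.

```
def trit_add(a, b, carry_in=0):
    """Add two ternary digits (0, 1, 2) plus a carry; return (sum_trit, carry_out)."""
    total = a + b + carry_in
    return total % 3, total // 3

print(trit_add(0, 2))  # (2, 0) -- the same result as...
print(trit_add(1, 1))  # (2, 0)
print(trit_add(2, 2))  # (1, 1) -- 4 in base 3 is "11"
```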
As an Electronics Engineer I can't help but interject here...
There are logic devices which are called "tristate", meaning that there are three logic states. The three states come from the obvious "on" state, the "off" state, and the high-impedance ("disconnected") state. So, *if* the chip manufacturers really, really wanted to, then I believe that there could be a "trinary" computer system. I encourage you to look into tristate logic devices and see if you can hard-wire a simple circuit which will use three-state logic. This could be interesting, but I will tell you that you MUST have a very stable, noise-free power supply in order to get adequate results from this... Good luck!
LF
"As far as the laws of mathematics refer to reality, they are not certain; as far as they are certain, they do not refer to reality."--Albert Einstein
If you want to build a trinary computer, the most difficult part will be determining what the levels are. Because computer chips nowadays run on a very low internal voltage, the differences between high, mid and low would give very poor threshold margins. By using an extra negative supply, high would be +V, mid would be ground and low would be -V. But it isn't as easy to make two supply planes on a chip as you may think. OK, on a chip with a few transistors and AND/OR functions running at low speed it will work, but with the millions of transistors needed for a medium-sized processor it is very difficult, because the basic substrate has to be at a single voltage.
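A minimal sketch of the decoding problem described above, assuming the +V / ground / -V scheme (the ±0.75 V thresholds and the sample voltages are just illustrative numbers): every incoming voltage has to be binned into one of three levels, and the usable margin around each threshold shrinks as the supply drops.

```
def decode_trit(voltage, threshold=0.75):
    """Map an analogue voltage onto a balanced trit (-1, 0, +1) using two thresholds."""
    if voltage > threshold:
        return +1   # "high": near +V
    if voltage < -threshold:
        return -1   # "low":  near -V
    return 0        # "mid":  near ground

for v in (1.4, 0.9, 0.1, -0.2, -1.3):
    print(f"{v:+.1f} V -> trit {decode_trit(v):+d}")
```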
All circuits are made of gates; the most basic are AND, OR and NOT.
1 AND 1 = 2?
2 AND 2 = ???
1 OR 2 = ???
NOT 2 = 0
0 OR 2 = ???
0 AND 2 = ???
Even sequential circuits are made with AND gates.
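One common way to answer those question marks (an assumption on my part, since there is more than one possible ternary logic) is three-valued logic in the Kleene/Łukasiewicz style, where AND is min, OR is max, and NOT x is 2 - x:

```
# Three-valued logic over the levels 0, 1, 2 (min/max/complement convention --
# one possible choice, not the only one).
def t_and(a, b): return min(a, b)
def t_or(a, b):  return max(a, b)
def t_not(a):    return 2 - a

print(t_and(1, 1))  # 1  (AND is min, not addition)
print(t_and(2, 2))  # 2
print(t_or(1, 2))   # 2
print(t_not(2))     # 0  (matches the post)
print(t_or(0, 2))   # 2
print(t_and(0, 2))  # 0
```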
My guess: too difficult with too little gain. It is said that electrical engineering is the second most difficult and technical endeavour of man; the first is software engineering. Now imagine both of them being rewritten to be more complicated.
Remember Occam's razor: "All things considered, the simplest answer is most often the correct one." Well, there is a principle of engineering here too... make the simplest thing that gets the job done, and then refactor (simplify).
If this is up your alley, you may want to look into Fuzzy Logic.
The Russians made a trinary computer back in the 50s.
Apparently it used magnetic cores & semiconductor diodes.
The three states were +, 0, and -.
It was quite successful, and an improved model was produced in 1970, but the glorious leaders decided that such an off-the-wall device was too good to be true and cancelled it.
If you search on ternary arithmetic computers, it'll bring up appropriate sites in Russia (in English).