Hi friends,
Is it true that in ASCII we represent characters using 7-bit binary, and in EBCDIC using 8-bit binary? How can we use 7-bit binary in ASCII? Please clarify.
Also, can you highlight the significant differences between ASCII and EBCDIC?
Basically, ASCII just uses the lower 7 bits. The eighth bit in a byte is not used (except in extended code tables) and is regarded as 0 in standard ASCII; all values above 127 have no meaning. The very useful page Pipk supplied shows it all (good one, Pipk).
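If you have Python handy, you can see the difference concretely. Here's a quick sketch using the built-in "ascii" and "cp037" codecs (cp037 is IBM's US/Canada EBCDIC code page; EBCDIC has several national variants, so treat it as one representative mapping, not the only one):

# Compare the byte each encoding assigns to the same character.
for ch in ["A", "a", "0"]:
    ascii_byte = ch.encode("ascii")[0]   # standard ASCII: always < 128, high bit 0
    ebcdic_byte = ch.encode("cp037")[0]  # EBCDIC uses the full 8 bits
    print(f"{ch!r}: ASCII {ascii_byte:#04x} ({ascii_byte:08b})  "
          f"EBCDIC {ebcdic_byte:#04x} ({ebcdic_byte:08b})")

For example, 'A' comes out as 0x41 (01000001) in ASCII but 0xC1 (11000001) in cp037; the high bit is always 0 in standard ASCII and often set in EBCDIC. Another practical difference worth knowing: the EBCDIC alphabet isn't contiguous (A-I, J-R and S-Z sit in separate blocks), whereas in ASCII the letters run straight through, which matters for things like range checks on characters.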
I have had some of what I consider to be fairly fundamental beliefs turned upside down and inside out by this forum. I guess it just goes to show the value of these pages and having so many peers whose brains you can pick.