Would using constants such as #define WIDTH 25 somehow be better than using variables such as short int width = 25 (in terms of speed and/or size)? Thanks.
Zech,
As a beginner in C programming, I have found (often to my cost) that using macros helps with faster processing speeds and a smaller footprint as far as file sizes are concerned. However, if you have a value which needs modification/alteration even once during program execution, it is better to have a variable - short, long, double, all depending upon the usage of the variable, of course.
For example, if I were to write a dot-matrix printing routine whereby I would have to specify the page sizes, I would go about doing the following:
#define PRN80 80
#define PRN132 132
I only need these two definitions because printer page sizes mostly come in two column variants - 80 and 132 columns.
I hope I have been able to give you an insight into what is involved in the usage of macros vis-a-vis that of variables.
Happy programming,
Udai.
To make programs faster you might want to try using the register declaration.
example: register int i;
This tells the compiler to [if possible] make use of a physical register within the CPU [rather than storage allocated in main memory]. Accessing a register in the CPU is much faster.
Secondly, it is [in most cases] true that accessing constants is faster than accessing variables.
But we have to know which objects are best stored in registers. Too much improper use of register variables may actually slow down your program's execution. (Some compilers will optimize how variables are stored and decide for themselves whether to keep them in registers; some will even place a variable in a register automatically, without your knowledge.)
And one thing: you CAN'T access the memory location of a register variable.
eg:
register int i;
scanf ("%d",&i); <== is illegal.
The address of 'i' is inaccessible.
NOTE: if the compiler cannot accommodate all of the program's register variables, it is NOT considered an error. The prefix will simply be ignored and they will be treated as if they had been defined as automatic variables [BUT you still cannot access their addresses]. So, make sure your preferred choices are registered first.
Let's say I compile and build a program that makes use of register variables; would that affect the portability of my program (same OS - let's say the Win32 family - but possibly different machines)?
I believe that '#define' is faster than 'const' because, when you define a value using '#define' (let's say #define HEIGHT 25), the preprocessor actually replaces the word 'HEIGHT' with '25' at build time. So it is exactly the same as changing every single 'HEIGHT' in your code to the value 25.
On the other hand, 'const int height = 25' is treated just like any other variable (the compiler allocates space in memory to store the value), with the exception that it is protected by the compiler from careless alteration in your program. (It is NOT completely protected, though, because you can still change its value when you pass it by reference or through a pointer.)
Actually, when you declare a constant outside of a class, it should be pretty much unchangeable. You can't pass around pointers or references to it unless they're pointers or references to constants... which means the compiler won't let you change anything through them. You can do a const_cast on them to get the program to compile, but that'll still probably cause your program to crash at run-time.
Your logic about #defines being faster because they affect the actual program text is correct in theory. In actuality, however, because of the above fact (constants really are constant), the compiler, if it's worth anything, will perform an optimization to make accessing the constant exactly as fast as it would be with a #define.
Since there's really no difference in speed between the two options, it's usually a better idea not to use #defines if you can help it. Since all they do is change text, they don't know anything about type-safety. They pollute the global namespace. They can cause subtle changes to other parts of your program that are really hard to detect. In short, they're dangerous. Constants respect types and namespaces. Use them in place of #defines if at all possible. When you must use #defines, give them really long, messy, all-caps names like STATIC_ASSERTION_MACRO so they don't conflict with other parts of your program.
I should point out that there's one exception to the pointers/references deal:
A string literal, like "Hello, world!", is really an array of const char (which decays to a const char*). However, for historical reasons (sloppy programmers), you can make a plain char* point to it, even though that should require const char*. This fact still doesn't stop the compiler from making the optimization, though, and your program will, again, probably crash if you try to change the literal through the barely-legal pointer.