Since we're made up of an amalgam of DOS and UNIX programmers, the source code found at Dendrite is an odd combination of conflicting styles. An example is how each deals with the possibility of errors. UNIX programmers rarely check if a malloc succeeds ("Hey, it's all virtual, man!"), but will be fastidious about checking any file operation ("The 'change directory' could fail, even though it immediately follows a successful 'create directory', because between the two, another process could remove the directory"). MSDOS programmers, who have only the possibility of a single process, but the reality of the 640K memory limit, are just the reverse. The funny thing is that both groups are sure that the checks they skip are unnecessary, and the ones they do perform are needed for "robustness".
In the following samples (drawn from actual Dendrite code), the programmer has seemingly gone out of his way to guard against some problem. Can you identify what it is? And can you spot the subtle problem that the "fix" is causing?
#define MAX_BUFFER_LEN 4096
char *buffer = malloc(MAX_BUFFER_LEN);
char *GetAString(); /* Returns a ptr to an ASCIIZ string */
memcpy(buffer, GetAString(), MAX_BUFFER_LEN);
1) Why use a memcpy() instead of a strcpy()? What should they have used instead? Why?
#define LOGOUT_STRING "logout"
if (strncmp(command, LOGOUT_STRING, strlen(LOGOUT_STRING)) == 0)
2) Why use a strncmp here instead of a strcmp? (Wouldn't it do the same thing?) And how could we make this more efficient?
Answers:
1) The memcpy was used because the programmer didn't trust GetAString() to actually return a proper ASCIIZ string. If it had not, strcpy() would have kept copying bytes until a 0x00 byte was reached. If a zero was not reached in MAX_BUFFER_LEN bytes, the copy would have overflowed buffer, and started overwriting memory it didn't own.
So, what's wrong with all this? Well, while we are so concerned about not writing to memory we don't own, we completely disregard the problem of reading memory we don't own. GetAString() is defined as returning an ASCIIZ string, which means we only know for sure that it owns the memory up to the first 0x00 byte. Nevertheless, the statement above would attempt to read the next 4K starting at the address returned by GetAString(). If that happens to be near the end of the program's allocation, it could cause a memory protection violation.
What else is wrong with this? In general, functions like GetAString() return a string of about 100 characters or less. (In the actual function used in the code I took this from, the string returned was normally about 10 characters.) Nevertheless, in that example, we are always copying 4K of data, taking, in general, 40 times the time actually needed.
So, what can we do about it? In place of the memcpy(), use a strncpy(). It reads the source only up to the first 0x00 byte, or MAX_BUFFER_LEN bytes, whichever comes first, so it protects against memory errors on both the source and the destination. (Two caveats: strncpy() pads the destination with 0x00 bytes out to the full count, so it still writes all 4K; and if no 0x00 turns up within the count, it won't null-terminate the destination at all, so you must terminate it yourself.)
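A minimal sketch of what the fix might look like, assuming the GetAString() declared above (the wrapper and its name, CopyCommand(), are mine, not Dendrite's):

#include <string.h>

#define MAX_BUFFER_LEN 4096

extern char *GetAString(void); /* returns a ptr to an ASCIIZ string */

/* Copy the string into buffer without reading past the source's
   terminating 0x00 or writing past buffer's MAX_BUFFER_LEN bytes. */
void CopyCommand(char *buffer)
{
    strncpy(buffer, GetAString(), MAX_BUFFER_LEN - 1);
    buffer[MAX_BUFFER_LEN - 1] = '\0'; /* strncpy() won't terminate a too-long source */
}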
Some more to think about: Under what conditions would GetAString() return a bad value? Would it require a bad memory chip? A corrupt database? A bug in GetAString()? There's not much sense in going out of your way to prevent an error from causing more damage, and then pretending the error never occurred and proceeding as if nothing happened, when the error itself indicates that the damage already done would prevent any real work from being performed.
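For example, here's a hypothetical sketch of what "not pretending" looks like (ProcessCommand() and the error message are mine, invented for illustration):

#include <stdio.h>

extern char *GetAString(void);

/* Hypothetical caller -- the point is the check, not the names. */
void ProcessCommand(void)
{
    char *str = GetAString();
    if (str == NULL || *str == '\0')
    {
        /* Something upstream has already failed; report it and stop,
           rather than quietly copying garbage and pressing on. */
        fprintf(stderr, "GetAString() returned nothing usable\n");
        return;
    }
    /* ... only now is it safe to copy and use str ... */
}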
2) strcmp() would compare the two strings, byte for byte, up to and including the end-of-string 0x00. But strlen() tells you the number of characters in a string not including the null, so while strcmp("ABCD","ABC") would fail, strncmp("ABCD","ABC",3) would say they match. Since it can be assumed from the context that command contains the last thing sent before a modem dropped carrier, there's a good chance that the command is followed by line noise, so we'd have to consider just the first part of it.
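To see the difference concretely, here's a throwaway test of my own (not Dendrite code):

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* strcmp() compares through the terminating 0x00: 'D' vs. 0x00 -- no match */
    printf("strcmp matches:  %d\n", strcmp("ABCD", "ABC") == 0);     /* prints 0 */
    /* strncmp() stops after 3 bytes: "ABC" vs. "ABC" -- match */
    printf("strncmp matches: %d\n", strncmp("ABCD", "ABC", 3) == 0); /* prints 1 */
    return 0;
}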
So, what's wrong with all this? The #define in that line expands to
strncmp(command, "logout", strlen("logout"))
-- two separate copies of the string. Some compilers are bright enough to combine them, but not all are, and with those, you're wasting a bit of memory. Even if the compiler does generate references to a single string, it's still wasting time. "logout" is a constant, and therefore, so is its length. So, why go to the trouble of calling a function every time that line executes, counting the characters in the string, when the count will always be the same number? What about our old friend sizeof()? It works on string constants as well, and does all its work at compile time, so the calculation only has to be done once:
strncmp(command, LOGOUT_STRING, sizeof(LOGOUT_STRING))
Wait - hold the phone, Central! There's a problem with that last line. sizeof() returns the size of the string including the null at the end, which makes that statement work exactly the same as strcmp(). Oops, back to where we started from -- not a problem, just remember to subtract 1 from sizeof() (unless you're really meticulous, in which case you'll subtract "sizeof(char)").
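Putting it all together, the test finally comes out as:

#define LOGOUT_STRING "logout"

/* sizeof() counts the trailing 0x00, so subtract 1 to compare only
   the six visible characters -- the same count strlen() gives, but
   computed at compile time. */
if (strncmp(command, LOGOUT_STRING, sizeof(LOGOUT_STRING) - 1) == 0)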
Copyright © 1994, James M. Curran