
Buffer Overflows: The Developer's Bane

For a brief moment last week, it appeared as though someone had discovered a genuine back door in a Microsoft Web product. As it turns out, the product has no back door, but it does have some interesting code and a nasty buffer overflow condition.

The story broke last Thursday night when a researcher informed Microsoft that he thought a particular component that ships with various Web platforms had a back door. Apparently, someone found a suspicious string of words inside a file (dvwssr.dll, part of Visual InterDev 1.0), thought it might be the telltale sign of a back door, and tipped off the researcher. The researcher investigated the code and reported his findings to Microsoft. The string inside the DLL clearly read, "Netscape engineers are weenies!" and after some investigation, the researcher learned that the string served to obscure URL-based file requests sent to the DLL in question.

According to Microsoft, barely an hour after it received the initial bug report, a reporter from the Wall Street Journal called asking the company to confirm or deny the alleged back door. By Friday afternoon, Microsoft had publicly confirmed that a bug did exist in the DLL. In Security Bulletin MS00-025 (released Friday), the company said that the DLL might let a Web author access certain files of other Web sites on the same server if the relevant server files had incorrect permission settings.

As it turns out, the embedded phrase is not a true back door, only a key string used to obscure part of a URL. Someone with knowledge of the obscuring routine still needs specific file access permissions to exploit it, so no risk exists unless an administrator has set those permissions in a very particular way. But that isn't the end of the story.
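Microsoft hasn't published the routine's internals, so what follows is only a minimal sketch of how a fixed key string can obscure a request. I've assumed a simple repeating-key XOR; the obscure function and the sample request are my own illustrative choices, not code from dvwssr.dll. The sketch makes the same point the bulletin does: a hard-coded, discoverable key provides obscurity, not access control.

#include <stdio.h>
#include <string.h>

/* Hypothetical sketch only: the actual routine inside dvwssr.dll is
 * not documented here. A repeating-key scheme combines the key string
 * with each byte of the request; anyone who learns the key can
 * trivially reverse the step. */
static const char KEY[] = "Netscape engineers are weenies!";

void obscure(char *data, size_t len)
{
    size_t keylen = strlen(KEY);
    for (size_t i = 0; i < len; i++)
        data[i] ^= KEY[i % keylen];   /* XOR is its own inverse */
}

int main(void)
{
    char request[] = "/myweb/page.htm";
    size_t len = strlen(request);

    obscure(request, len);   /* "encode" the file request */
    obscure(request, len);   /* applying the key again restores it */
    printf("%s\n", request); /* prints /myweb/page.htm */
    return 0;
}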

Researchers began looking for other problems in dvwssr.dll and quickly found them. By late Friday afternoon, a message circulating on various mailing lists stated that a buffer overflow condition existed in the file: an attacker could launch a Denial of Service (DoS) attack against the server simply by sending the DLL a URL parameter string of 5,000 characters. Worse, under certain circumstances, the overflow could let an attacker run arbitrary code on the remote system.
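The flawed code itself isn't public, but the bug class is well understood: copying attacker-controlled input into a fixed-size stack buffer without checking its length. Here is a minimal sketch in C; the function and variable names are hypothetical, not taken from dvwssr.dll.

#include <string.h>

/* Hypothetical illustration of the classic flaw: the handler trusts
 * the length of a caller-supplied URL parameter. */
void handle_request(const char *url_param)
{
    char filename[256];        /* fixed-size stack buffer */

    /* strcpy copies until it hits a NUL terminator, with no regard
     * for the destination size. A 5,000-character parameter writes
     * far past the end of filename and smashes the stack: at best
     * the process crashes (the DoS), at worst a crafted string
     * overwrites the saved return address and redirects execution
     * to attacker-supplied code. */
    strcpy(filename, url_param);

    /* ... decode and open the requested file ... */
}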

After news of the overflow reached Microsoft, the company revised its original security bulletin with the new risk details. Because Visual InterDev 1.0 is so old and probably not widely used, the company also recommended that administrators simply delete the dvwssr.dll file from their servers to eliminate the associated risks.

The entire episode brought out arguments for and against two old sore spots in the security community: full and immediate disclosure of vulnerability information, and the potential benefits of open source projects for secure coding practices. As soon as the story hit the news outlets on Friday morning, the debates began on several public forums.

Critics cried foul because they felt the initial vulnerability report was misleading and confusing, and they used the incident to argue that full and immediate disclosure is detrimental. Proponents countered that without such disclosure, researchers wouldn't have found the buffer overflow condition in the first place. I think both sides have valid arguments. Sometimes a risk should be held in confidence for a period of time for good reason; in other cases, the best course is to release full risk information immediately. The right approach depends on the circumstances, so no single rule applies across the board.

On the open source issue, supporters believe that making source code available for review reduces the number of security risks in that code because more eyes will find more problems. But is this really true?

Elias Levy (CTO of SecurityFocus) pointed out in a recent commentary on open source projects that there is no guarantee people will review open source code from a security perspective, nor any guarantee that they will report the security problems they find. Keep in mind that black hats review code to exploit bugs, not to report them. The bottom line: peer review of source code is only as valuable as the skills and morals of the peers performing it.

The real priority in developing solid code is educating developers about the finer points of secure programming so that they avoid common pitfalls such as buffer overflows. This approach stops basic security problems before they originate instead of depending on peer review to discover them after the fact. Giving developers better knowledge and better tool sets will quickly decrease the number of security-related problems we encounter, which means everyone can enjoy a safer network.
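As a closing illustration, here is the same hypothetical handler from the earlier sketch, rewritten with the two habits that prevent this entire bug class: validate input length up front, and use a copy routine that respects the destination size.

#include <stdio.h>
#include <string.h>

/* Hypothetical sketch: the bounds-checked counterpart to the
 * vulnerable handler shown earlier. */
int handle_request(const char *url_param)
{
    char filename[256];

    /* Reject oversized input outright rather than truncating it,
     * so a 5,000-character parameter never reaches the copy. */
    if (strlen(url_param) >= sizeof(filename))
        return -1;                       /* bad request */

    /* snprintf always NUL-terminates and never writes past the
     * destination, even if the length check above were removed. */
    snprintf(filename, sizeof(filename), "%s", url_param);

    /* ... decode and open the requested file ... */
    return 0;
}

Until next time, have a great week.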
