In-Depth

Protection Through Isolation

This admin got a shock when he discovered that usernames and passwords for his new employer's sensitive documents were being sent in cleartext. Fortunately, he was able to see the forest for the trees.

Taking over an IT infrastructure at a new job is often like buying a used car—it's important to give everything a good once-over to see where potential problems lie. When I started working for my current employer, things looked to be in great shape on the surface: Compaq servers with hardware RAID, managed switches, a well-known firewall, and a server room with its own HVAC system and a lock on the door. So far, so good.

For me, this particular infrastructure was different from the ones before it because of the sensitive nature of the company's business. Large amounts of money changed hands electronically on a daily basis, making security of the utmost importance. Ethereal is my sniffer of choice when it comes to network security, so I had a good-sized packet capture to look at by the end of my first week on the job. It never occurred to me that I'd find something so alarming so soon, but there it was: Usernames and passwords for the company's IIS server were being sent in cleartext.
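
Ethereal showed me those passwords interactively, but the same check is easy to script after the fact. Here's a rough sketch of the idea in Python using the scapy library; the capture filename is just a placeholder, and any tool that can read a pcap file would do.

```python
# Rough sketch: scan a saved capture for FTP commands that carry credentials in the clear.
# Assumes a capture file named "capture.pcap" and the scapy library; both are placeholders.
from scapy.all import IP, Raw, TCP, rdpcap

for pkt in rdpcap("capture.pcap"):
    # FTP control traffic rides on TCP port 21; credentials show up as USER/PASS commands.
    if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw) and pkt[TCP].dport == 21:
        payload = bytes(pkt[Raw].load)
        if payload.startswith((b"USER ", b"PASS ")):
            print(f"{pkt[IP].src} -> {pkt[IP].dst}: {payload.decode(errors='replace').strip()}")
```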

After picking my jaw up off the floor, I knew more investigation was necessary. The existing documentation on the company's network setup included copious amounts of data regarding server setup, network design, disaster recovery, etc., but nothing on the IIS design or how our internal software interfaced with the IIS server. The firewall had only ports 80 and 21 open for inbound traffic, and only to the aforementioned IIS server, which dramatically narrowed my search for where these passwords were coming from. After a quick look at the IIS setup, the problem became all too clear: Clients were logging into their own individual directories on the FTP server with usernames and passwords to send and receive encrypted data files. The files were protected with a strong encryption algorithm, so that portion of the issue didn't worry me, but the vulnerability of the network as a whole was of great concern. I knew that if I had any interest in keeping my new job for any length of time, this issue had to go to the top of my priority list.

There's a dark little secret in the world of older Internet services: FTP, SMTP, and Telnet all send their usernames and passwords in cleartext. Back in the 1970s and early 1980s, when these protocols were designed, only a handful of people had the knowledge to pluck passwords off a network line without the sender's or recipient's knowledge. Today that number is in the millions. Unfortunately, the death knell for these protocols has yet to be sounded, putting many admins in the awkward position of trying to secure an infrastructure built on fundamentally insecure protocols. In my case, the culprit was FTP.
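
To make the point concrete, here's what an FTP login actually looks like at the protocol level, sketched with nothing more than Python's socket module (the host and credentials are placeholders). Every line below, password included, crosses the wire as readable text.

```python
# Rough sketch: an FTP login by hand, straight over a TCP socket. The host and credentials
# are placeholders; the point is that USER and PASS are sent as ordinary readable text.
import socket

def ftp_login_cleartext(host: str, user: str, password: str) -> None:
    with socket.create_connection((host, 21), timeout=10) as s:
        print(s.recv(1024).decode())                 # 220 banner from the server
        s.sendall(f"USER {user}\r\n".encode())       # username, sent in the clear
        print(s.recv(1024).decode())                 # 331 password required
        s.sendall(f"PASS {password}\r\n".encode())   # password, also in the clear
        print(s.recv(1024).decode())                 # 230 logged in (or 530 on failure)
        s.sendall(b"QUIT\r\n")

# ftp_login_cleartext("ftp.example.com", "client01", "not-a-real-password")
```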

Searching for Solutions
The answer to this problem was obvious: Stop using non-anonymous FTP. There are a number of secure protocols that don't send passwords in cleartext, and in my case encrypting the transfers themselves didn't matter, because the files were already encrypted before they left the clients' computers. In reality, though, moving off of FTP wasn't a viable option in the near term. The transfer mechanism for these files was built into a major software package used throughout the company.
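
For illustration only, here's roughly what the client side would have looked like had we been able to move these transfers to SFTP, sketched with the third-party paramiko library. The host, account, and paths are placeholders, and in our case the client was baked into the software package, so this wasn't on the table.

```python
# Rough sketch: the same kind of upload over SFTP (SSH), where the whole session, credentials
# included, is encrypted. Uses the third-party paramiko library; all names are placeholders.
import paramiko

def sftp_upload(host: str, user: str, password: str, local_path: str, remote_path: str) -> None:
    client = paramiko.SSHClient()
    # In production you'd verify the server's host key instead of auto-accepting it.
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    try:
        sftp = client.open_sftp()
        sftp.put(local_path, remote_path)   # the file travels inside the encrypted SSH channel
        sftp.close()
    finally:
        client.close()

# sftp_upload("transfer.example.com", "client01", "not-a-real-password",
#             "payroll.enc", "/inbound/payroll.enc")
```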

In addition, our IIS setup utilized a cool trick in Microsoft's implementation of FTP that allows authenticated users to log in to their individual home directories automatically (see Microsoft Knowledge Base Article 201771, "How To Set Up an FTP Site So That Users Log Onto Their Folders"). Any new solution would have to incorporate that functionality. Furthermore, the developer was unwilling to rewrite our version of the software to remove this security hole. An upgraded version of the entire program was "in development," he said, and should be available "in a few months"—in developer-speak, this means I'd be lucky to see it within a year. I couldn't wait that long. Another option was to try to encapsulate the FTP connection within an SSL link to encrypt the FTP authentication, but the FTP client our customers used was built into the software package and couldn't support it. I was back to being at the mercy of the software developer, but no good MCSE gives up that easily.
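
For the record, the FTP-over-SSL idea looks something like this in code, sketched with Python's ftplib; it assumes a server that actually speaks FTPS, and the server name and file names are placeholders.

```python
# Rough sketch: FTP wrapped in TLS using Python's ftplib. login() negotiates AUTH TLS before
# the credentials are sent, and prot_p() encrypts the data connection as well. The server
# name and file names are placeholders, and the server must support FTPS.
import ftplib

def ftps_upload(host: str, user: str, password: str, local_path: str, remote_name: str) -> None:
    with ftplib.FTP_TLS(host) as ftps:
        ftps.login(user, password)   # control channel is already encrypted at this point
        ftps.prot_p()                # protect the data channel too
        with open(local_path, "rb") as fh:
            ftps.storbinary(f"STOR {remote_name}", fh)
```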

Unacceptable Risks
At this point, I had to take a step back and evaluate just what this security hole represented. The accounts involved were domain user accounts with access to one particular directory on one FTP server. They also had the right to log on locally to the FTP server (they had to—see KB 239120, 262233 and 200475). Other than that, they were locked down rather well. However, the fact that someone outside the company could sniff the username and password of a domain account with "log on locally" rights was an unacceptable risk. Even if the accounts were locked down, anyone with one of those usernames and passwords could sit down at any machine on the domain and log on, opening up any number of potential attack vectors, none of which I was willing to accept.

I was envisioning scenarios like someone bringing in a USB memory key and running the code of his choice, or using an as-yet-unpatched exploit in Windows 2000 Server or Windows XP to elevate his privileges. However, the information on the FTP server itself was relatively secure—everything was strongly encrypted, and nothing in the filenames or directory structure could identify our clients. The users didn't even have rights to delete the files they placed on the server. My thoughts then turned to securing the rest of the network, and all of a sudden it hit me: Put the FTP server in its own forest!

Upon reflection, the idea seemed a little drastic, even to me. I couldn't find an article on the Internet suggesting this sort of setup, and it would be a significant amount of extra work to implement. Microsoft's site wasn't very forthcoming on "best practices" for DMZ setup, although one KB article (309682) hinted at this idea, which gave me hope that I wasn't barking up the wrong tree. I had colleagues suggest that I simply set up the FTP box as a stand-alone server, but I didn't like the idea of having to recreate all those local user accounts if that one server were to fail. Also, that configuration introduces issues of setting up trust relationships. Others suggested setting up the FTP server in its own domain within the forest, but Win2K AD has implicit two-way trusts between domains in the same forest.

Drawing up the Plan
At first, I came up with a plan of attack on paper: Remove the FTP server from the local domain, use DCpromo to bring it up in its own forest, then add an old computer as a second DC in the new forest for redundancy. Then I'd set up a one-way trust in which the DMZ domain trusts the internal domain, but not the other way around. Finally, I'd use Microsoft's Active Directory Migration Tool (ADMT) to migrate all the user accounts from the internal forest to the new DMZ forest; because ADMT can carry each account's original SID along as SID history, the rights those accounts had in the old configuration would follow them into the new forest.
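
One way to confirm that the trust really is one-way, by the way, is to read the trust objects straight out of the DMZ domain's System container. Here's a rough sketch using the third-party ldap3 library; the server name, credentials, and DNs are placeholders for whatever your DMZ forest ends up being called.

```python
# Rough sketch: read the trust objects from the DMZ domain's System container to confirm
# the trust is one-way. Uses the third-party ldap3 library; the server, credentials, and
# DNs are placeholders.
from ldap3 import Connection, NTLM, SUBTREE, Server

server = Server("dc1.dmz.example.com")
conn = Connection(server, user="DMZ\\administrator", password="not-a-real-password",
                  authentication=NTLM, auto_bind=True)

conn.search(search_base="CN=System,DC=dmz,DC=example,DC=com",
            search_filter="(objectClass=trustedDomain)",
            search_scope=SUBTREE,
            attributes=["trustPartner", "trustDirection"])

# trustDirection as stored on the queried domain: 1 = inbound, 2 = outbound, 3 = two-way.
for entry in conn.entries:
    print(entry.trustPartner, entry.trustDirection)
```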

In reality, it didn't go quite that smoothly. Because this FTP server was an upgrade from NT 4, a complete wipe and rebuild seemed a good idea. First, I backed up the IIS configuration to a spare server using a very cool IIS export utility (see www.adsonline.co.uk/iisexport). Then I backed up the FTP server to tape, removed the server from the internal domain, and wiped it clean. On went Win2K SP4, AD with a new domain in a new forest, and the IIS configuration (using the aforementioned tool).

It was when I went to set up the trust relationship that things got a little interesting. I hadn't totally thought through how I was going to handle DNS between the two forests. Since the whole point of this exercise was to increase security, the last thing I wanted to do was replicate our internal DNS structure to the DMZ, so complete two-way replication was out of the question. Because there were only two servers involved in the new DMZ forest, I chose to manually add A records for them on the internal DNS server and leave it at that. On the new forest's DNS server, I added a record for one DC on the internal network so that the trust relationship could be established.
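
Because those hand-entered records are the only glue holding name resolution together between the forests, a quick scripted check of both directions is worth keeping around. Here's a rough sketch using the third-party dnspython library; the server addresses and host names are placeholders.

```python
# Rough sketch: verify that each side can resolve the handful of names it needs from the
# other. Uses the third-party dnspython library; server IPs and host names are placeholders.
import dns.resolver

def check(ns_ip: str, fqdn: str) -> None:
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [ns_ip]      # ask this specific DNS server, not the local resolver
    try:
        answer = resolver.resolve(fqdn, "A")
        print(f"{fqdn} via {ns_ip}: {[rr.address for rr in answer]}")
    except Exception as exc:
        print(f"{fqdn} via {ns_ip}: FAILED ({exc})")

# The internal DNS server should resolve the two DMZ boxes; the DMZ DNS server should
# resolve the one internal DC used for the trust.
check("10.0.0.10", "ftp1.dmz.example.com")
check("10.0.1.10", "dc1.internal.example.com")
```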

Password Nightmare
When the trust was in place, it was time to move the user accounts. ADMT is a powerful yet temperamental utility. Moving the user accounts was a piece of cake, but getting the passwords to move with them proved to be a nightmare. The tool has a specific list of configuration changes that must be made before it will migrate passwords, but for the life of me I couldn't get it to work. The error message was always some variation of "could not contact the source server," so I checked and rechecked the DNS entries for both forests. Everything appeared fine, and I could browse each server from the other by FQDN. I tried making the trust relationship between the forests two-way, with no success. I even went so far as to install a WINS server in the DMZ and set up replication between the DMZ and internal WINS servers. It didn't help. I ended up scrapping the password migration idea and manually re-entering all the passwords for the client accounts. It wasn't the best use of a few hours, but it got me past the problem. Afterward, I made sure to remove the WINS server in the DMZ and rebuild the trust relationship as one-way.
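
I re-keyed those passwords by hand, but if I were scripting that fallback with today's tooling, it might look roughly like the sketch below, which resets each migrated account's password in the new forest over an LDAPS bind. It uses the third-party ldap3 library, and the server name, credentials, DNs, and password file are all placeholders.

```python
# Rough sketch, not what I actually ran: bulk password resets in the new forest over LDAPS,
# using the third-party ldap3 library. The server, credentials, DNs, and the CSV of new
# passwords are all placeholders.
import csv
from ldap3 import Connection, NTLM, Server

server = Server("dc1.dmz.example.com", use_ssl=True)   # password writes require an encrypted bind
conn = Connection(server, user="DMZ\\administrator", password="not-a-real-password",
                  authentication=NTLM, auto_bind=True)

with open("new_passwords.csv", newline="") as fh:      # rows of: user_dn,new_password
    for user_dn, new_password in csv.reader(fh):
        ok = conn.extend.microsoft.modify_password(user_dn, new_password)
        print(user_dn, "reset" if ok else "FAILED")
```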

With the passwords back in place, it was time to try out the new configuration. I called a few clients who like to work weekends and asked them to send me a few dummy files. It worked like a charm. I breathed a great sigh of relief, knowing that this project wasn't going to turn into an all-nighter.

Closing the Gates
Even though the hard part was over, the project wasn't complete. An internal network is like a house—it's only as secure as its weakest point of entry. Now that the DMZ was configured, it was time to limit the attack vectors from the DMZ into the internal network. The first line of defense is a good firewall and, thankfully, I had two of them available. The one between the Internet and the DMZ was already set up to allow only inbound connections on ports 21 and 80, and only to the IIS server. Between the DMZ and the internal network, things couldn't be locked down quite that tightly, but I didn't want to leave things wide open, either. I opened up port 21 to the FTP server, and opened the ports required to allow the trust relationship to exist between the DMZ and internal network (see KB 179442).
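
A quick way to confirm the lockdown from the DMZ side is to probe an internal DC on just those ports. Here's a rough sketch; KB 179442 has the authoritative port list for trusts, so the TCP ports below are only an illustrative subset, and the DC address is a placeholder.

```python
# Rough sketch: probe an internal DC from a DMZ host to confirm only the intended ports are
# reachable. KB 179442 has the full list for trusts; this TCP subset and the DC address are
# placeholders for illustration.
import socket

INTERNAL_DC = "10.0.0.10"
TRUST_PORTS = {53: "DNS", 88: "Kerberos", 135: "RPC endpoint mapper",
               139: "NetBIOS session", 389: "LDAP", 445: "SMB"}

for port, name in TRUST_PORTS.items():
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2)
    result = s.connect_ex((INTERNAL_DC, port))   # 0 means the TCP connection succeeded
    print(f"{port:>5} ({name}): {'open' if result == 0 else 'blocked or filtered'}")
    s.close()
```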

Lessons Learned
While this solution worked out very well for my particular situation, it's not for everyone. If your DMZ servers host only anonymous connections, there's little value in setting up a separate forest. If passwords are sent but are always strongly encrypted in transit, this also isn't for you. Furthermore, some servers won't respond well to this configuration (I'm thinking of Exchange servers in particular; see KB 839049). Finally, the best solution is still to rewrite your applications to stop using insecure protocols. For the times when passwords can't be protected, though, a separate forest may be just what you're looking for.

About the Author

John Dunphy is the pseudonym for an IT administrator, MCSE, MCP+I, CCNA, employed with a financial services company.
