5 Steps to a More Secure Datacenter

In the age of virtualization and cloud computing, administrators need a holistic approach.

With the advent of cloud computing, rich Internet applications, service-oriented architectures and virtualization, datacenter operations are becoming more dynamic, with fluid boundaries.

The shift toward a new computing environment adds layers of complexity that have broad implications for how IT managers secure the components of a datacenter to protect data from malicious attack or compromise.

Organizations should bake security into the design of the datacenter, said Henry Sienkiewicz, technical program director of the Defense Information Systems Agency's (DISA) Computing Services Directorate.

"The datacenter is an entire ecosphere, which has to be looked at as individual components but also holistically," Sienkiewicz said. "If we don't do that, we will miss something."

IT managers increasingly want to secure transactions end to end, which means securing everything from the desktop and network to applications and storage, said Jim Smid, datacenter practice manager for Apptis Technology Solutions. Traditionally, the network support team secured the networks and the application team handled data encryption. But the present and emerging environments call for new methods, Smid said.

Organizations need to verify that all datacenter operations interact correctly and that each element of the datacenter is secure, he said.

In addition, organizations must manage the policies, people and technologies needed to secure dynamic, fluid datacenters. Every security layer is important, so it is hard to say if one is more important than another, datacenter security managers and industry experts say.

Organizations such as the Cloud Security Alliance, a coalition of industry leaders, global associations and security experts, have published guidance to promote best practices and provide education on the use of cloud computing. The consortium has released guidelines that cover 15 security domains, ranging from computing architecture to virtualization, that organizations can apply to datacenter security.

However, based on conversations with several datacenter security managers and industry experts, here is a list of five things to consider when securing a datacenter.

1. Get Physical: Control Physical Access to the Datacenter
The first step for many companies is to consider whether they want to continue to maintain their own datacenters or outsource the task, Smid said. Still other datacenter managers might start with harder tasks, such as controlling access to each system or the network layer.

But Corbin Miller, IT security group manager at NASA's Jet Propulsion Laboratory (JPL), prefers to start by locking down physical security to the datacenter.

"A lot of people will forget the physical because there is so much on the network side," Miller said.

At a Federal Aviation Administration (FAA) datacenter in Oklahoma, the approach is layered security, according to Mike Myers, the FAA's former Enterprise Services Center IT director. At that center, physical security includes a fenced-off campus, badge access to the main building and datacenter, a guard who escorts visitors, key card admittance to rooms, video surveillance of the datacenter, and locked cages for servers depending on the sensitivity of the data they contain.

Miller also is working to establish layers of physical security at JPL's datacenter to partition testing, development and production areas.

The center's manager wants to set up a development laboratory in the datacenter, but Miller wants to keep it separated from the production area that is home to systems that keep JPL's operations running.

"I want to keep the production zone at the highest security level," only allowing authorized systems administrators into the area, he said. "So I envision three zones within that one datacenter." One zone would be for researchers to test and stage equipment, one would provide more control over which development work on applications and systems is performed before putting them into production, and one would be a production zone accessible only to core systems administrators.

For the inner layers, one doesn't necessarily need badge access, but some type of access control (such as locks on server racks) is necessary for production systems, Miller said. "I just don't want the accidental lab guy that is working on his equipment to say, 'I need power. Let me plug it in here,' and he overloads the power circuit for production," he said.

2. Establish Secure Zones in the Network
After establishing physical security processes, the hard job of securing the network begins.

"I would concentrate on zoning into the network layer," Miller said. At JPL, "the first zone is a little bit looser environment because it is a development area. The next one is a test subnet, which is isolated from the random traffic of the development area but looser than the production area."

The third zone -- the production or mission support subnetwork -- is where systems administrators spend a lot of time and effort. That zone has only approved production equipment, so administrators must deploy new systems to the production network in a controlled manner, Miller said.

At JPL, administrators can physically or virtually deploy systems to subnetworks attached to virtual local-area networks, and they can set strict rules about incoming and outgoing traffic. For example, administrators could restrict a mail server's Internet traffic to Port 25 and a Web server's to Port 80 without affecting other approved traffic in that zone.

Datacenter managers need to consider the types of business that various subnetworks will handle, Miller said. Applications such as e-mail and some database-monitoring activities would use ports that link to the outside world. However, those production machines shouldn't be going to CNN.com, ESPN.com or Yahoo News. After administrators set those rules, they can better detect anomalous activity, Miller said.
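
A minimal sketch of that kind of rule set -- an egress allowlist per zone, with anything outside it flagged as anomalous. The zone names and ports are illustrative, not JPL's actual configuration:

```python
# Minimal sketch of per-zone egress rules like those Miller describes.
# Zone names, hosts and ports are illustrative, not JPL's actual config.
ALLOWED_EGRESS = {
    "prod-mail": {25},       # mail servers may only talk SMTP
    "prod-web": {80, 443},   # Web servers may only serve HTTP/HTTPS
}

def is_anomalous(zone: str, dest_port: int) -> bool:
    """Flag any outbound connection the zone's rules don't allow."""
    return dest_port not in ALLOWED_EGRESS.get(zone, set())

# A production Web server reaching out on port 6667 (IRC) is suspect:
assert is_anomalous("prod-web", 6667)
assert not is_anomalous("prod-mail", 25)
```

Once the allowed set is that small, watching for the exceptions is cheap, which is the point Miller makes below about knowing which three machines should be talking on the Web.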

By the time a machine moves onto a production network, you should know what it is running, who has access to the operating system layer and what other systems it is communicating with across the network, Miller said.

"Now you can better understand where to put your security monitors or data-leakage-prevention monitors," he said. "If I know only three machines are going to be talking on the Web, it is easy for me to watch traffic and look for specific things."

Although wireless networks are popular, Miller said wireless access points are not necessary in a datacenter. They're difficult to control even with RADIUS and two-factor authentication, he said.

DISA takes a three-pronged approach to datacenter security, Sienkiewicz said. The first part is NetOps, the operational framework that ensures that the Defense Department's Global Information Grid has availability, protection and integrity. The second is technical protection, and the final piece is accreditation and certification of applications, he said.

NetOps consists of GIG Enterprise Management, GIG Network Assurance and GIG Content Management. DISA has devoted specific people, policies, processes and business support functions to operate NetOps, Sienkiewicz said.

On the technical side, the Department of Defense (DOD) demilitarized zone is a focal point. All of the Defense Enterprise Computing Centers' (DECC) traffic funnels through DOD and DISA demilitarized zones.

As a result, Internet connections to DOD Web servers are inspected and managed from the Internet access point all the way to the host machine. There also is physical separation between Internet-accessible Web infrastructure and other DECC infrastructures and logical separation between users and server types.

With that setup, DISA can limit access points, manage command and control, and provide centralized security and load balancing across the environment.

Additionally, "we run an out-of-band network, so production traffic does not cascade into the way we manage the infrastructure," Sienkiewicz said. Through virtual private network connections, users can manage their own environments. The VPN connections provide paths for production hosts to send and receive enterprise systems management traffic.

3. Lock Down Servers and Hosts
At the FAA facility in Oklahoma, all servers are registered in a database that contains contact information and details about whether the servers contain privacy information. Most of the database is manually maintained, but the process could be improved by automation, Myers said. Problem areas have been change and configuration management -- some of those processes are automated, and some are manual. FAA is working with Remedy software to improve automation.

Server security is standardized and subject to Statement on Auditing Standards No. 70 (SAS 70) and annual inspector general audits. FAA also has standardized on National Institute of Standards and Technology (NIST) security checklists, which are available on NIST's Web site. In addition, FAA is implementing patching programs and tracking vulnerabilities on servers by scanning them at least monthly.
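
A sketch of what such a registration record and a monthly-scan check might look like; the field names and the 30-day window are assumptions for illustration, not FAA's actual schema:

```python
# Sketch of a server registry like the FAA's, with a monthly scan check.
# Field names and the 30-day window are assumptions for illustration.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ServerRecord:
    hostname: str
    owner_contact: str
    holds_privacy_data: bool
    last_scanned: date

def overdue_for_scan(server: ServerRecord, today: date) -> bool:
    """Servers should be scanned at least monthly."""
    return today - server.last_scanned > timedelta(days=30)

registry = [
    ServerRecord("payroll-db-01", "ops@example.gov", True, date(2009, 5, 1)),
]
stale = [s.hostname for s in registry if overdue_for_scan(s, date(2009, 7, 1))]
print(stale)  # ['payroll-db-01']
```

Automating checks like this is exactly the gap Myers identifies: the data exists, but keeping it current is still largely a manual process.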

FAA handles data security separately from server security. The agency is doing more appliance-based encryption than software encryption, which it has found too restrictive and prone to system compatibility problems. The agency set up firewalls to separate private data from government data. FAA also uses scanning technology to monitor data in motion for potential privacy leaks. The scanning technology seeks to ensure that data goes to the right recipients and is properly encrypted, Myers said.

At DISA, datacenter managers are working to address security concerns caused by server virtualization. "When we look at virtualization, that notion of how do we increase server virtualization has brought on new security issues," Sienkiewicz said.

Among the questions DISA security managers must answer: How do we ensure that the hypervisors are locked down? How do we make sure that additions, deletions and moves are properly protected? VMware has been a good partner in helping DISA work through the security attributes of virtualization, he said.

"One of the big things we have used for virtualization is separation and isolation," Sienkiewicz said. "We do try to separate applications, Web services, application services, database services into physical separate racks, so that there is no possibility of data linkage or spillage or something else happening in the environment."

"We are also hardening other parts of the environment," he said.

As with FAA, DISA requires hosts to be registered on a white list. The agency also has installed host-based security systems, which monitor and detect malicious activity.

"Are we completely there yet? No. Do we have an aggressive plan? Yes," Sienkiewicz said. DISA's broad perspective strives to protect each individual host.

Lastly, DISA is using the public-key infrastructure initiative it runs for DOD to manage physical security. Users must log on to systems with a Common Access Card, which provides two-factor authentication.

"That is giving us digital data signing -- the ability to encrypt e-mail traffic -- and it is a real-time certification program," Sienkiewicz said.

4. Scan for Application Vulnerabilities
Application-scanning and code-scanning tools are important, NASA's Miller said.

At JPL, if someone wants to deploy an application, it must undergo a scan before administrators release it in the production environment. Miller uses IBM Rational AppScan to look at Web applications. AppScan tests for vulnerabilities that hackers can easily exploit and provides remediation capabilities, security metrics and dashboards, and key compliance reporting.

On the other hand, developers who write their own code must run it through a code scanner, which could be a Perl script that looks for specific functions or a commercial product, such as Fortify, that scans source code for buffer overflows or other vulnerabilities that crept into the code, Miller said.
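
In the spirit of the Perl script Miller mentions, here is a toy scanner that flags calls to C functions commonly implicated in buffer overflows; the function list is illustrative, not a complete audit set:

```python
# Toy source scanner in the spirit of the Perl script Miller mentions:
# flag calls to C functions commonly implicated in buffer overflows.
# The function list is illustrative, not a complete audit set.
import re
import sys

RISKY = re.compile(r"\b(strcpy|strcat|sprintf|gets)\s*\(")

def scan(path: str) -> None:
    with open(path, encoding="utf-8", errors="replace") as src:
        for lineno, line in enumerate(src, start=1):
            if RISKY.search(line):
                print(f"{path}:{lineno}: risky call: {line.strip()}")

if __name__ == "__main__":
    for filename in sys.argv[1:]:
        scan(filename)
```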

5. Coordinate Communication Between Security Devices
With cloud computing, agencies need to change their whole approach to securing the datacenter, said Tim LeMaster, director of systems engineering at Juniper Networks.

"In cloud computing, it is about securing the data flows between datacenters, client systems and datacenter, and between virtual machines within the datacenter," LeMaster said. Therefore, application visibility becomes important.

"You have to have visibility into these flows to validate that the traffic is legitimate and is not malware [because] a lot of malicious traffic tries to mask itself as something else." A lot of that traffic uses Port 88 or tunnels with Secure Sockets Layer encryption. Network administrators must have the knowledge and application identification to understand what that traffic is, he said.

Juniper has developed application identification technology that looks beyond port protocols to the context of the data and tries to apply signatures that help determine if an application is really a shareware program or peer-to-peer program.
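
A toy version of that idea: classify a flow by its first payload bytes rather than trusting the destination port. The signatures below are simplified stand-ins, not Juniper's engine:

```python
# Toy signature-based application identification: classify a flow by
# its first payload bytes instead of trusting the destination port.
# Signatures are simplified; real engines use far richer heuristics.
SIGNATURES = [
    (b"GET ", "http"),
    (b"\x16\x03", "tls"),                  # TLS handshake record
    (b"\x13BitTorrent protocol", "bittorrent"),
]

def identify(payload: bytes) -> str:
    for prefix, app in SIGNATURES:
        if payload.startswith(prefix):
            return app
    return "unknown"

# P2P traffic hiding on port 80 still identifies as BitTorrent:
print(identify(b"\x13BitTorrent protocol..."))    # bittorrent
print(identify(b"GET /index.html HTTP/1.1\r\n"))  # http
```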

Juniper's technology also focuses on application denial-of-service attacks. Denial-of-service attacks are not new, but the traditional countermeasure was to "black hole" the traffic. That approach helps the attack accomplish exactly what it intended -- denying service -- because the network administrator must remove all traffic from the server that is under attack.

Application denial-of-service prevention software provides a profiling capability for administrators to determine if traffic is legitimate or not. With such tools, administrators can look at other data flows, such as client-to-server traffic, and compare them to flows that exist in other datacenters or between servers -- or between virtual machines within the virtualized datacenter.
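
A sketch of the profiling idea: compare each client's current request rate with a learned baseline and throttle only the outliers, rather than black-holing everything. The numbers and the 10x threshold are illustrative assumptions:

```python
# Sketch of application-DoS profiling: compare each client's request
# rate to a learned baseline instead of black-holing all traffic.
# The baseline values and the 10x threshold are illustrative.
from collections import defaultdict

baseline_rps = defaultdict(lambda: 5.0)   # learned requests/sec per client
current_rps = {"10.0.0.7": 4.0, "10.0.0.9": 900.0}

def should_throttle(client: str) -> bool:
    return current_rps.get(client, 0.0) > 10 * baseline_rps[client]

print([c for c in current_rps if should_throttle(c)])  # ['10.0.0.9']
```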

"You can have traffic between the virtual machines that can escape the normal security appliances or services you offer," LeMaster said. Juniper has partnered with a company to offer an intervirtual machine firewall capability, he said.

"I would like my intrusion prevention to see that malicious worm [and] not just drop it but to talk with the SSL device and eliminate only that bad session," he said.

The concept of coordinating networking devices, firewalls, SSL devices, and intrusion prevention solutions becomes useful in a cloud computing infrastructure. Juniper is working with the Trusted Network Connect Work Group, a consortium of users and service providers that published standards that will allow security components made by different companies to share information about a device, LeMaster said.
