Green IT: Going Green -- Does Your Company Care?
Take a good look at your organization's priorities and policies, and you may discover you're going green just by trying to save some green.
The term green IT started rearing its head seven or eight years ago. Most Americans have become familiar with it through IBM's recent Disney-esque television ads. The trouble is that green IT hasn't caught on as quickly as pundits and advocates originally hoped. The intent is there, but the action behind it is lacking.
Many companies profess a concern for the environment, but few are remodeling their data centers or taking major steps toward greener technology. Recent consulting experiences have led me to formulate a hypothesis about green IT: Either your company's executives have made a personal commitment to green IT, or your company doesn't really care.
"Personal commitment" may take the form of a CEO's personal concern for the environment, or it may come from a board decision to pursue greener business techniques for publicity reasons.
The underlying cause doesn't matter: With a corporate mandate, green IT can happen. Without that top-level mandate, green IT won't happen. Aside from personal environmental concerns or publicity motives, how can we create that mandate in more companies?
What Is Green IT?
Typically, green IT simply means building an IT infrastructure that uses fewer resources, most notably energy. That means not only the energy consumed by the computers themselves, but also the energy needed to keep them cool and to manage them over the corporate network. In the office and in the data center, desktops and servers require an extraordinary amount of power.
One other bit of misinformation around green IT initiatives involves virtualization as a greener technology. Let's be perfectly clear: Virtualization does not automatically equal green IT.
True, virtualization lets us use fewer servers, but the servers we do use are often bigger and more powerful, consuming more energy and producing more heat. They're also more likely to be fully loaded with work, meaning they run at maximum power consumption and heat production. In a few instances, virtualization has actually increased energy costs, as a brand-new, super-powerful server replaced several older, less power-hungry machines.
Virtual Desktop Infrastructure (VDI) is also something to be skeptical of when it comes to green IT. For example, moving 50 desktops that draw 250 watts each (12,500 watts in total) to a single 2,000-watt server might look like a 10,500-watt energy savings, unless of course you're going to keep those desktop computers to use as VDI endpoints. If you are, then you're not removing any load at all; you're adding the server's 2,000 watts on top of the desktops' 12,500 and working toward the opposite of green IT.
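To make that arithmetic concrete, here's a quick sketch of the VDI scenario; the wattages are the illustrative figures from the example and are assumed to be steady-state draw:

```python
# Power math for the VDI scenario above; all wattages are
# illustrative figures, assumed to be steady-state draw.
DESKTOPS = 50
DESKTOP_WATTS = 250        # per-desktop draw
VDI_SERVER_WATTS = 2_000   # the single VDI host

before = DESKTOPS * DESKTOP_WATTS          # total load before VDI
retire_desktops = VDI_SERVER_WATTS         # desktops actually removed
keep_desktops = before + VDI_SERVER_WATTS  # desktops kept on as endpoints

print(f"Before VDI:                 {before:,} W")
print(f"Desktops retired:           {retire_desktops:,} W")
print(f"Desktops kept as endpoints: {keep_desktops:,} W")
```

The second figure is the best case (and ignores whatever replacement thin clients draw); the third is what happens if the old desktops stay powered on as endpoints, where total consumption goes up rather than down.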
If you can reduce your power consumption then you can be greener. In the process, you'll almost always reduce heat output as well, because power consumption leads directly to heat. And, by the way, you'll also be paying less for electricity. That's the way to create an executive mandate: Green can be a nice public relations side effect of real monetary savings.
Green = Cheaper
Saving money is something most businesses can understand. That was, in fact, the theme of IBM's green IT commercials: You get to do something for the environment and potentially save a boatload of money. The problem is that most companies don't have a clear handle on how much they actually spend to power and cool all of their computers.
The electricity bill is usually just one big lump sum that includes lights, coffee makers and so on. It's difficult to determine exactly where to realize some savings. Furthermore, most IT departments don't even know how much power any given server consumes. You can't just look at the data plate on the back, because the wattage listed there is the power supply's maximum rated output, not the server's actual draw.
Most of us don't run our servers at maximum load all the time, so they're often consuming less than the stated wattage. If that's the case, which servers are the best candidates for potential savings? Finally, most companies don't want to undergo massive restructuring solely to achieve some nebulous reduction in energy costs. Any kind of restructuring is risky, labor-intensive and disruptive. Will the savings be worth it?
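One way to get past the lump-sum electricity bill is a back-of-the-envelope cost estimate per server. The sketch below assumes a measured average draw (not the nameplate rating), a flat $0.12-per-kWh utility rate, and a PUE of 1.8 to approximate cooling overhead; all three figures are assumptions you'd replace with your own:

```python
# Back-of-the-envelope annual electricity cost for one server.
# Assumptions (replace with your own numbers): a measured average
# draw rather than the nameplate rating, a flat $0.12/kWh utility
# rate, and a PUE of 1.8 to fold in cooling and distribution overhead.
HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.12

def annual_cost(avg_watts: float, pue: float = 1.8) -> float:
    """Yearly power-and-cooling cost in dollars for a given average draw."""
    kwh = avg_watts / 1000 * HOURS_PER_YEAR * pue
    return kwh * RATE_PER_KWH

# A server averaging 300 W, well under its 650 W nameplate rating:
print(f"${annual_cost(300):,.2f} per year")
```

Even rough numbers like these let you compare servers against each other and put a dollar figure on a proposed consolidation before anyone commits to it.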
Going Green Through Better Utilization
The real trick to achieving green IT is to accept that your current IT assets are fixed: You're not going to eliminate many physical servers. What you can do, however, is get a clearer picture of your current servers' utilization and a feel for which servers are more efficient. Identify which of your servers produce more workload for less energy. By identifying the most efficient machines and those with spare capacity, you can start to slowly consolidate tasks, using virtualization in most cases, and perhaps reduce energy costs a bit.
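One simple way to spot those candidates is to rank servers by workload delivered per watt. The names and numbers in this sketch are made up; real figures would come from power metering and your monitoring tools:

```python
# Rank servers by workload delivered per watt to find consolidation
# targets. Names and numbers are hypothetical; real data would come
# from power meters and monitoring tools.
servers = [
    # (name, avg workload units, avg watts drawn, % utilization)
    ("web01",  400, 250, 40),
    ("db01",   900, 450, 85),
    ("file01", 120, 300, 15),
]

ranked = sorted(servers, key=lambda s: s[1] / s[2], reverse=True)
for name, work, watts, util in ranked:
    print(f"{name}: {work / watts:.2f} work-units/W at {util}% utilization")
```

A machine that scores low on both efficiency and utilization (file01 in this made-up data) is exactly the kind of workload worth moving, as a virtual machine, onto a more efficient host with spare capacity.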
More important, you can avoid adding IT assets. By making better use of existing high-efficiency servers, you can support new IT initiatives without adding more energy load. That isn't as exciting as drastically cutting energy consumption by eliminating a bunch of servers, but it's a lower-risk, slow-but-steady way to curb your energy growth and take a greener approach to IT in the future.
As you begin to purchase new servers, focus on their energy-to-workload ratios. Select machines that provide the most computing power for the least amount of energy consumption. Try to ensure that every watt is being used to full advantage by moving older, lower-efficiency servers into virtual machines running on newer, higher-efficiency ones.
Absent an executive mandate driven by environmental conviction or publicity, this is the approach most companies can practically implement. They'll save money and, incidentally, help the environment a bit.
Don Jones is a multiple-year recipient of Microsoft’s MVP Award, and is an Author Evangelist for video training company Pluralsight. He’s the President of PowerShell.org, and specializes in the Microsoft business technology platform. Follow Don on Twitter at @ConcentratedDon.