Amazon Gains More Direct Connections, Security Options

Amazon Web Services in August launched AWS Direct Connect, an option that lets customers build a dedicated network link from their own datacenter or colocation facility to an Amazon facility, which it calls an AWS Region.

Amazon added the option for customers concerned about privacy and network costs, as well as those seeking better connectivity than the Internet provides, the company said at the time. But the connectivity was limited to one location: an Equinix colocation facility in Virginia.

This week Amazon added four new AWS Direct Connect regions: Northern California (Equinix in San Jose and CoreSite One Wilshire in Los Angeles), EU West (TelecityGroup Docklands in London), Singapore (Equinix) and Tokyo (Equinix).

While Amazon's AWS Region in the EU is in Ireland and its Northern California AWS Region is in Silicon Valley, the London and Los Angeles additions are intended to provide "additional flexibility when connecting to AWS from those cities," AWS evangelist Jeff Barr said in a blog post.

Separately, Amazon customers who want added security beyond what Amazon offers can now use a variety of security gateways offered by Check Point Software. Check Point, an Amazon partner, last week said it is providing its security gateways as Amazon Machine Images, or AMIs, to customers looking to extend such services as intrusion detection, data loss prevention, firewall, VPN and URL filtering.
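Because the gateways ship as standard AMIs, launching one works like launching any other EC2 instance. Here's a minimal sketch using the boto Python SDK; the AMI ID, instance type and key pair name below are placeholders for illustration, not actual Check Point values.

    # Launch a (hypothetical) security gateway AMI as a regular EC2 instance.
    import boto.ec2

    # Connect to the target EC2 region.
    conn = boto.ec2.connect_to_region("us-east-1")

    # Start one instance from the placeholder gateway AMI.
    reservation = conn.run_instances(
        "ami-12345678",            # placeholder AMI ID, not a real Check Point image
        instance_type="m1.large",  # placeholder size; check the gateway's requirements
        key_name="my-keypair",     # an existing EC2 key pair in your account
    )
    print("Launched gateway instance:", reservation.instances[0].id)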

While Amazon offers its own VPN access and encryption, Check Point said the availability of these added security options should appeal to customers leery of using the cloud for sensitive data.

"As customers bring their servers and applications and their own IT stack [to the cloud], there's security that needs to be done that Amazon just doesn't provide," said Fred Kost, Check Point's head of marketing. For example, some customers require added access control mechanisms or want to watch for intrusions, Kost said.

Customers with perpetual Check Point licenses can extend them to Amazon's cloud; Check Point also offers some of its gateways on a subscription basis.

Posted by Jeffrey Schwartz on 01/12/2012 at 1:14 PM


Internap Extends Cloud Footprint with Voxel Acquisition

Looking to expand its global footprint and addressable market, cloud hosting provider Internap this week said it has acquired Voxel Holdings for $30 million in cash.

By adding Voxel, Internap can target smaller customers that require more automation and self-service implementation of cloud and hosting services. Internap addresses large enterprises that typically spend a minimum of $10,000 per month and as much as $1 million monthly, said President and CEO Eric Cooney, speaking Tuesday during an investor conference call announcing the acquisition.

Typical Voxel customers, by comparison, spend less than $10,000 per month and tend to require rapid, self-service provisioning of infrastructure.

Internap offers two cloud services. The first is its flagship Custom Public Cloud, based on VMware's vCloud Director platform and geared toward organizations running enterprise applications that require high availability. The second, launched in October, was the first public cloud service based on the compute service of OpenStack, the popular open source platform aimed at building interoperable clouds.

Voxel's self-service platform runs on Red Hat Linux. While Voxel has closely explored OpenStack and even contributed to the community, it has not yet deployed OpenStack. Another difference: Internap owns its own datacenters, while Voxel rents its facilities.

Cooney indicated that in the coming quarters Internap will look at relocating some of Voxel's infrastructure into Internap facilities. Voxel, based in New York, has a large presence there, where leasing and powering datacenters can be more expensive than in other locations. But Voxel also has a presence in Amsterdam and Singapore, giving Internap an expanded global footprint.

"With Voxel's strong track record in dedicated hosting and public cloud and Internap's robust collocation and enterprise hosting presence, we are now positioned to capture and hold customers through the majority of their IT infrastructure lifecycle," Cooney said.

Though Cooney forecast that Voxel closed out 2011 with a 25 percent growth rate, he suggested the company lacked the sales and marketing infrastructure to compete with far larger rivals such as Amazon Web Services and Rackspace. Internap has 60 direct sales reps and a network of channel partners that will help expand Voxel's footprint while extending Internap's portfolio.

"We've been doing very well in the marketplace competing against the three major players in head-to-head RFPs, and now we look at this Internap deal and just clearly see a huge potential with combining that kind of product set and Infrastructure-as-a-Service platform with an established sales and marketing organization," said Voxel Founder and CTO Raj Dutt during the investor call. "So we're very optimistic we can continue to grow the business with that combination."

Will Internap transition the Voxel infrastructure to OpenStack? Cooney indicated that is the plan. "Obviously, they are separate platforms. We have not done the engineering development work to integrate the two platforms, but you can expect that at the top of our list of integration activities is to integrate those two public cloud platforms together," Cooney said. "Without putting too much of our future roadmap on the table, suffice it to say that we expect OpenStack to be a key part of the long-term platform."

Posted by Jeffrey Schwartz on 01/05/2012 at 1:14 PM


IBM-Green Hat Deal Targets Cloud-Based Software Testing

IBM kicked off its first acquisition of the new year on Wednesday, saying it has agreed to buy Green Hat, a provider of tools that let application developers use cloud resources to test their software.

Big Blue did not disclose financial terms of the deal. Green Hat will be folded into IBM's Rational Software business, where it will be offered in conjunction with the company's application lifecycle management (ALM) portfolio.

By acquiring Green Hat, IBM said it is looking to give developers lower-cost ways to test and ensure the quality of their software without cutting corners. Properly testing software today requires an investment in simulation labs that can cost anywhere from $5 million to $30 million, said Charles Chu, IBM Rational's director of product management and strategy.

"That's not a trivial number no matter what size company you are," Chu said. "By providing this virtualized environment, the agile developer is able to come much closer to the vision of realizing continuous integration and continuous testing."

While IBM emphasized that Green Hat's GH Tester, GH Performance and Green Hat Virtual Integration Environment (GH VIE) tools are aimed at using cloud resources, Chu acknowledged that developers must use private clouds today. "We don't currently have any announced plans for a public cloud offering," Chu said. However, IBM has a cloud service in beta that allows developers to provision a test environment.

GH VIE allows for cloud testing in a virtual environment, Chu said. "Once the configuration is done, that environment is available and provisioned in a matter of minutes for a developer to run tests against," he said.

Founded in 1996, Green Hat offers tools that support a variety of software environments, including those from IBM, Microsoft, Oracle, SAP, Software AG, Sonic MQ and Tibco, as well as Web services and Java protocols.

Posted by Jeffrey Schwartz on 01/04/2012 at 1:14 PM


Top 5 Cloud Acquisitions of 2011

With the rush to the cloud, it's no surprise that some of the largest IT vendors shelled out big bucks to gain ground in 2011.

Many players were slow to acknowledge that the cloud is for real, or they have legacy platforms that are not easy to transform. Consequently, some cloud computing pioneers are now becoming part of established vendors such as Verizon, Oracle and SAP. Meanwhile, leading cloud pioneer Salesforce.com continues to fill gaps in its own offerings.

With the new year upon us, we can expect to see more big cloud computing acquisitions in 2012. Here's a look at the most noteworthy cloud acquisitions of 2011:

Verizon-Terremark
Earlier this year, Verizon Communications announced it would pay $1.4 billion to acquire Terremark Worldwide, one of the largest independent providers of managed hosting and cloud services for large enterprises and government agencies.

The move gave Verizon instant credibility as a provider of enterprise cloud services. Verizon President and CEO Lowell McAdam told investors when the deal was announced that it made more sense to acquire a leading provider of cloud infrastructure services than to build out datacenters throughout the world.

"This is a classic make-buy decision," he said.  "By the time you build datacenters and then outfit them and build the employee capability and all of the software applications that ride in these datacenters, it takes time, and to be honest, that is not our core competency."

At the time of the deal, Miami-based Terremark operated 13 datacenters throughout the world. Now Terremark is a subsidiary of Verizon, led by Nelson Fonseca, who was named president of the new unit on Dec. 16.  Fonseca was previously Terremark's chief operating officer.

CenturyLink-Savvis
Verizon wasn't the only communications provider to jump onto the cloud bandwagon in 2011. Shortly after that deal was announced, Time Warner Cable said it would acquire NaviSite for $230 million. But both deals would prove paltry compared with CenturyLink's $2.5 billion bid to acquire Savvis.

After the deal was completed, CenturyLink combined Savvis with its managed hosting business; that unit is led by Savvis CEO Jim Ousley. CenturyLink in 2011 became the third-largest communications provider with its $12.2 billion acquisition of Denver-based Qwest Communications.

By acquiring Savvis, CenturyLink said it now operates 48 datacenters throughout North America, Europe and Asia, with nearly 2 million square feet of floor space. With its domestic 207,000-mile fiber network and 190,000-mile international access network, CenturyLink is looking to offer combined colocation, network access, managed hosting and cloud services to large enterprises and government agencies, as well as small and medium businesses.

Salesforce.com-Radian6
Relative to the other deals in this roundup, Salesforce.com's acquisition of Radian6 was the smallest. Nonetheless, it defined the cloud-based CRM giant's emphasis in 2011: enterprise social networking.

Salesforce.com acquired Radian6 for $326 million. The Radian6 Engagement Console allows customers to discover what is being said about them on a variety of social networks, including Facebook and Twitter, as well as on blogs. This sophisticated search engine is important for organizations that are concerned about their reputations. Radian6 customers include Dell, General Electric, Kodak, Molson Coors, PepsiCo and United Parcel Service.

Marc Benioff, Salesforce.com's outspoken CEO, spent much of 2011 emphasizing his belief that enterprises need to embrace social networking to interact with each other as well as with customers, suppliers and partners. "We were born cloud in 1999 but we've been reborn social" was his oft-quoted sound bite in 2011.

As Salesforce.com continues to bolster its Chatter service, look for the company and others to continue their emphasis on enterprise social networking in 2012.

Oracle-RightNow
The success and growth of Salesforce.com's cloud-based applications have been a thorn in the side of Oracle CEO Larry Ellison. The tension has been hard to ignore as Ellison and Benioff have taken the gloves off in a number of public rants over the years, which came to a head this fall.

Ellison put his money where his mouth is, though, with Oracle's recent deal to acquire Salesforce.com rival RightNow Technologies for $1.5 billion. RightNow covers the customer service niche; its cloud-based Customer Service Cloud product is used by 2,000 organizations, according to the company.

RightNow competes with Salesforce.com's Service Cloud; both companies go after the call center by helping organizations use the Internet and social networks to modernize how they provide customer service.

The deal is the latest sign that Oracle is looking to step up its emphasis on cloud computing. In October, the company launched the Oracle Public Cloud, a Platform as a Service (PaaS) offering that runs Oracle applications (including its Fusion Applications), middleware and database platforms.

SAP-SuccessFactors
SAP may have a huge base of customers that use its ERP and line of business applications, but the company has struggled to embrace the cloud. Its core cloud initiative, Business ByDesign, has been slow out of the gate.

Looking to address one growing segment, human capital management, SAP in early December said it has agreed to acquire SuccessFactors for $3.4 billion. SuccessFactors is a leading provider of cloud-based human capital (a.k.a. human resources) management solutions.

While the deal complements SAP's premises-based HR software and gives the company a more robust path to the cloud, SAP sees it as having broader implications. SuccessFactors Founder and CEO Lars Dalgaard will lead SAP's overall cloud strategy. "Now is the time to take this game to the next level," he told analysts on a call announcing the deal.

Indeed, SAP needs to take its cloud game to the next level. It will be interesting to see if Dalgaard sticks around and is able to make that happen.

Posted by Jeffrey Schwartz on 12/20/2011 at 1:14 PM


HP-Microsoft Cloud Pact Targets Large Enterprises

While it is not known how many customers have signed on to Microsoft's Office 365 service since it launched nearly six months ago, Office division president Kurt DelBene last month said 90 percent are small businesses. Gunning for the largest corporations and government agencies, Microsoft and Hewlett-Packard said they will jointly offer Office 365 with the HP Enterprise Cloud Services portfolio.

The two companies announced a four-year partnership in which HP will host Microsoft's Exchange, SharePoint and Lync at its own datacenters, as well as resell the subscription-based Office 365. The pact is aimed at organizations with more than 5,000 seats, Patricia Wilkey, HP's global director of marketing for workplace services, told me this week.

Most enterprises of that size are typically not looking to migrate their entire user bases to public cloud services like Office 365, for reasons that include governance, compliance and the need for specific service levels. By bundling Office 365 with a private cloud implementation of Exchange, SharePoint and Lync, the two companies argue they can offer these large customers an integrated hybrid cloud.

"It is a solution at a private cloud level, still allowing rapid scalability, but it is designed to meet segregation of data needs, a single governance model, auditing rights, a higher SLA for customers who might need immediate real-time collaborative access to business applications and support of multiple applications," Wilkey said.

Through HP Enterprise Cloud Services, she said, private-public cloud integration would allow for a seamless user experience: looking up free/busy time on calendars, sharing SharePoint content, combined directories and other interactions among individuals.

HP hasn't announced any large enterprise wins from this pact, though Wilkey insists there are numerous interested parties.

Posted by Jeffrey Schwartz on 12/14/2011 at 1:14 PM


Tier 3 Adds .NET to Cloud Foundry

VMware's open source Cloud Foundry Platform as a Service (PaaS) is getting an unlikely addition: support for Microsoft's .NET Framework.

It's not coming from VMware but from cloud provider Tier 3, which announced it is contributing its own .NET fork of Cloud Foundry to the open source community. The fork will allow developers to port their .NET applications to Cloud Foundry.

The move comes just one day after Microsoft announced an upgraded release of its PaaS -- the Windows Azure platform -- which among other things includes a preview of its Hadoop connectors, a Node.js software development kit (SDK) and a JavaScript plug-in for Eclipse developers.

Bellevue, Wash.-based Tier 3 will contribute its .NET fork of Cloud Foundry, called Iron Foundry, as well as its Windows version of the Cloud Foundry Explorer and a Visual Studio plug-in for Cloud Foundry. Tier 3 is making the code available at ironfoundry.org and at GitHub under an Apache 2 license.

"Because developers can run their own instances of Iron Foundry in-house or with any service provider who supports it, developers finally have a truly open, interoperable .NET PaaS solution that can be run inside and outside the firewall," said a company blog post. "And because you can run your own instances of Iron Foundry, it's easy to have a full test, QA, and staging environment before pushing to production. In addition, operations teams now have the freedom to choose among various service providers that meet their needs in areas such as security, compliance, availability, location, etc."

In a bid to accelerate adoption of its .NET fork of Cloud Foundry, Tier 3 is offering developers trial usage consisting of one Web and one database instance for 90 days, running on the company's cloud platform.

Cloud Foundry, launched in April, appears to be gaining momentum: developers rated it the top cloud PaaS platform in a survey by Evans Data Corp. last month. Cloud Foundry is designed to run Spring, Rails, Node.js and Scala applications; .NET support should only broaden its appeal.

Posted by Jeffrey Schwartz on 12/13/2011 at 1:14 PM


Amazon Wins Cloud Storage Shootout, Microsoft Places Second

Amazon Web Services topped a field of 16 cloud storage providers in a 26-month stress test that measured scalability, availability, stability and performance.

The company's Simple Storage Service (S3) was one of only six that made the cut, with Microsoft's Windows Azure coming in second. The tests were conducted by Nasuni, a provider of premises-based network attached storage (NAS) gear that uses cloud storage providers for primary storage backups and/or disaster recovery.

While this benchmark is based on one vendor's assessment aligned with its own criteria and service-level requirements, it is the first I have come across that has measured cloud storage providers over a prolonged period of time and publicly disclosed its findings.

In addition to Amazon and Microsoft, AT&T, Nirvanix, Peer1 Hosting and Rackspace all passed Nasuni's stress test. The company declined to name the 10 that didn't make the cut, noting that as those providers mature their offerings, they stand a good chance of passing the tests in the future.

"The large providers certainly have a leg up with regard to economies of scale and tenure of performance," said Jennifer Sullivan, Nasuni's VP of marketing, in an e-mail. "We'll continue to monitor a variety of cloud providers, and as adoption of the cloud increases (e.g., different use cases for the use of cloud in organizations emerge), this will help shape what cloud storage has to become to be adopted and embraced by the enterprise."

Nasuni has maintained that the cloud is merely one component of an overall storage solution, particularly for enterprises with distributed locations. Nasuni's on-premises storage controller, which leverages the cloud as a target for data, provides added security and access control.

When offering its solution, Nasuni chooses a cloud storage provider for a customer that will meet the service-level agreements at any given time. "We choose the cloud provider and we can also migrate providers if we feel that one provider offers better performance," Sullivan said, likening the process to computer makers that choose hard disk drives for customers. "We dedicate the cycles to evaluating the providers so our customers don't have to."

Here are some findings from the report:

  • Writing large files: Windows Azure had the highest average speed at 2.38 MB per second (MBps), with Nirvanix close behind at 2.32 MBps. The remaining four posted similar speeds except for Peer1, whose average write speed was 1.49 MBps.
  • Reading large files: Nirvanix was fastest at 13.3 MBps, with Windows Azure close behind at 13.2 MBps. Amazon posted 11.28 MBps.
  • Writing medium-sized files: Windows Azure led at 2.1 MBps, followed closely by Amazon S3 at 2.0 MBps. The remainder came in 28 to 70 percent slower.
  • Reading medium-sized files: Amazon significantly outpaced everyone else at 9.2 MBps. Microsoft came in second, though 28 percent slower at 6.6 MBps.
  • Reading small files: Amazon S3, at 387 files per second, was 41 percent faster than its nearest rival, AT&T.
  • Writing objects: Windows Azure led with 154 files per second, with Amazon S3 second at 135 files per second and AT&T third at 98. The remaining three were much slower.
  • Outages: Amazon had the fewest, averaging 1.4 per month with insignificant durations, giving it an uptime of nearly 100 percent. (Those who experienced some of its major disruptions earlier this year, including April's four-day outage, may beg to differ.) Microsoft had 11.1 outages per month with an overall uptime of 99.9 percent. Peer1 had 6.8 outages per month, Rackspace 10.3 and AT&T 10.4 (though AT&T posted uptime of 99.5 percent). Nirvanix was less fortunate with 332, though the outages apparently were brief, since its uptime still came in at 99.8 percent.
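For context on what those uptime figures mean in practice, here's a quick back-of-the-envelope conversion (my own arithmetic, not Nasuni's) from an uptime percentage to expected downtime in a 30-day month:

    # Convert an uptime percentage into expected downtime per 30-day month.
    minutes_per_month = 30 * 24 * 60  # 43,200 minutes

    for uptime_pct in (99.9, 99.8, 99.5):
        downtime = (100 - uptime_pct) / 100.0 * minutes_per_month
        print("%.1f%% uptime -> ~%.0f minutes of downtime per month" % (uptime_pct, downtime))

    # 99.9% -> ~43 minutes; 99.8% -> ~86 minutes; 99.5% -> ~216 minutes

By that math, the gap between 99.9 and 99.5 percent uptime is roughly three hours of downtime per month.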

Sullivan said it will be interesting to see whether Amazon holds the top spot, noting Microsoft has a good chance of taking the lead. "Time will tell," she said. A copy of the report is available for download here.

Posted by Jeffrey Schwartz on 12/13/2011 at 1:14 PM


Survey Finds Cost Savings from Cloud Elusive for Some

Access to data from multiple mobile devices outweighs cost savings as a justification for deploying cloud-based solutions.

That's the rather curious finding from a study released this week by CSC, the global integrator based in Falls Church, Va. CSC's Cloud Usage Index, a report based on a survey of 3,645 IT decision makers in eight countries, found that 33 percent cited access to data from mobile devices as the primary reason for adopting cloud computing. Only 17 percent said reducing costs was the most important reason for moving to the cloud.

Meanwhile, 82 percent said their cloud efforts have reduced their IT costs, but in many cases those savings are minimal. In the United States, 23 percent of enterprises and 45 percent of small businesses with fewer than 50 employees said they were not saving any money at all, while 35 percent of U.S. organizations said those savings were less than $20,000.

"Although requirements for business agility and cost savings certainly factor in, neither is the single most important driver for cloud adoption," according to the report. At the same time, "in terms of overall IT performance, an overwhelming 93 percent of respondents say cloud improved their data center efficiency/utilization or another measure. And 80 percent see improvements like these within six months of moving to the cloud."

Among the other findings in CSC's Cloud Usage Index:

  • 14 percent downsized their IT departments after moving to the cloud.
  • 20 percent of organizations hired more cloud experts.
  • 65 percent signed on for subscriptions lasting more than one year.
  • 64 percent reported the cloud has helped lower energy use.
  • 48 percent of U.S. government agencies moved at least one workflow to the cloud in line with the "cloud-first" initiative.

Despite the minimal overall IT savings, 47 percent said they saw lower operating costs after moving to the cloud.

I find it surprising that cost savings isn't a higher priority and outcome. I'd be curious to hear if the IT cost savings are more important and substantial in your organization than CSC's Cloud Index suggests. Leave a comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 12/07/2011 at 1:14 PM


Cisco Outlines Cloud Framework

Like every IT vendor these days, Cisco Systems has talked up the cloud for some time. But now, it has a new umbrella cloud strategy.

The networking giant on Tuesday outlined its framework aimed at tying together private, hybrid and public clouds using its network gear, datacenter infrastructure and apps and services.

The framework, called CloudVerse, is designed to enable customers and partners to construct, connect and manage public, private and hybrid clouds, as well as cloud-based applications.

CloudVerse brings together three of the company's key segments:

  • Unified Data Center, which includes integrated servers, access networking, storage networking and the management of those components.

  • Cloud Intelligent Network, the networking components and management infrastructure aimed at providing connectivity and automation among multiple clouds, including its Nexus and Catalyst switches, as well as routers, firewalls and related hardware and software.

  • Cloud Applications and Services, Cisco's portfolio of cloud collaboration offerings, including its WebEx and TelePresence services.

The launch of CloudVerse comes just one week after Cisco released its first Cloud Index Report, in which it forecast twelvefold growth in cloud computing traffic between 2010 and 2015, to 1.6 zettabytes of data. Cisco sees an opportunity to use its presence as a leading supplier of network automation gear to bring together stovepiped clouds and datacenters.

"Until now cloud technology resided in silos, making it harder to build and manage clouds, and to interconnect multiple clouds, posing critical challenges for many organizations," said Padmasree Warrior, Cisco's senior VP of engineering and chief technology officer, in a statement.

Cisco announced several cloud providers and enterprises that are already using CloudVerse, including Fujitsu, LinkedIn, Qualcomm, Silicon Valley Bank, Verizon's Terremark subsidiary and Xerox's Affiliated Computer Services (ACS).

While CloudVerse is a framework that brings together existing products and services, Cisco also announced some key new offerings to advance its goal of bridging existing cloud silos.

Among them are Cisco Intelligent Automation for the Cloud, an offering that includes automated cloud provisioning and an on-demand orchestration engine; Cisco Network Services Manager 5.0, which lets organizations combine existing network and cloud resources into a multi-tenant datacenter architecture; and its Cloud-to-Cloud Connect based on Cisco's Network Positioning System (NPS), a technology that exposes network intelligence to the cloud.

NPS will be included with Cisco's forthcoming Aggregation Services Routers 1000 and 9000, due out next year. Cisco said the new routers will provide network automation between datacenters and clouds.

Posted by Jeffrey Schwartz on 12/06/2011 at 1:14 PM


SAP Makes Cloud Play with Deal To Acquire SuccessFactors

Over the weekend, SAP announced it has agreed to acquire SuccessFactors, a provider of cloud-based human capital management solutions, for $3.4 billion.

The deal represents a 52 percent premium over SuccessFactors' share price at Friday's market close. SAP, primarily known for its premises-based line-of-business and ERP software, hopes the acquisition will propel its push into the cloud.

"SAP's cloud strategy has been struggling with time-to-market issues, and its core on-premises HR management software has been at a competitive disadvantage with best-of-breed solutions in areas such as employee performance, succession planning, and learning management," said Forrester analyst Paul Hamerman in a blog post. "By acquiring SuccessFactors, SAP puts itself into a much stronger competitive position in human resources applications and reaffirms its commitment to software-as-a-service as a key business model."

Hamerman noted that SAP's subscription revenue has been flat for the first nine months of the year, representing only 3.7 percent of software revenues. With SuccessFactors growing 42 percent last quarter, he said SAP's SaaS effort -- which includes Business ByDesign (ERP), Sales OnDemand (CRM), Carbon Impact OnDemand (sustainability), Sourcing OnDemand and Travel OnDemand (expense reporting) -- should accelerate.

SAP said that SuccessFactors, with 3,500 customers in 168 countries, is forecast to generate $400 million in revenue in 2012, and that its revenue grew 59 percent during the first nine months of this year.

SuccessFactors will operate as an independent SAP business unit, much as database and mobile integration software provider Sybase is run. In addition to heading the new subsidiary, SuccessFactors Founder and CEO Lars Dalgaard will lead SAP's overall cloud strategy. "Now is the time to take this game to the next level," Dalgaard said on a conference call for analysts Saturday.

"They will provide leadership and expertise to accelerate our cloud strategy," added SAP co-CEO Bill McDermott. "They truly understand the go-to-market dynamics in this fast evolving cloud space, and are one of the fastest growing cloud companies based on 10 years of on demand expertise."

Posted by Jeffrey Schwartz on 12/05/2011 at 1:14 PM


Cisco Forecasts Twelvefold Increase in Cloud Traffic by 2015

According to Cisco's first Global Cloud Index Report released this week, cloud computing traffic will reach 1.6 zettabytes by 2015, a twelvefold increase over last year's traffic, which topped 166 exabytes.

That translates to a 66 percent compound annual growth rate (CAGR). The cloud today represents 11 percent of datacenter traffic, which Cisco says is growing at a 33 percent CAGR and is expected to reach 4.8 zettabytes by 2015 (a zettabyte is 1 trillion gigabytes). By then, the cloud will account for 33 percent of datacenter traffic, according to Cisco's forecast.
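As a quick sanity check on those figures (my arithmetic, not Cisco's), a twelvefold increase over the five years from 2010 to 2015 implies a CAGR in the mid-60s, in line with the roughly 66 percent the report cites:

    # Sanity-check Cisco's growth math: what CAGR yields 12x growth in 5 years?
    growth_factor = 12.0  # twelvefold increase
    years = 5             # 2010 through 2015

    cagr = growth_factor ** (1.0 / years) - 1
    print("Implied CAGR: %.1f%%" % (cagr * 100))  # ~64.4%

    # Reverse check: compounding 66% annually for five years
    print("Growth at 66%% CAGR: %.1fx" % (1.66 ** years))  # ~12.6x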

Cisco said in a whitepaper that it gathered data such as server shipments from a number of analyst firms, from which it calculated workloads by type and implementation. The company also assembled network statistics from 10 enterprises and Internet centers.

Here are some of Cisco's other findings:

  • The number of workloads per installed traditional server will increase from 1.4 in 2010 to 2.0 in 2015.
  • The number of workloads per installed cloud server will increase from 3.5 in 2010 to 7.8 in 2015.
  • By 2014, more than 50 percent of all workloads will be processed in the cloud.
  • In 2015, global cloud IP traffic will reach 133 exabytes per month.

All of this is further validation of a significant transition of workloads from the datacenter to the cloud, but Cisco doesn't see the in-house systems going away anytime soon. Rather, the cloud will take up a substantial chunk of workloads and storage in the coming years.

Are these findings by Cisco consistent with where you see your organizations headed? Leave a comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 11/30/2011 at 1:14 PM


Would You Heat Your Home with a Cloud Data Furnace?

Are you frustrated by the high cost of heating your home? With winter weather arriving in many parts of the country and furnaces kicking into high gear, once again we can look forward to exorbitant bills for oil or natural gas.

If you can't justify the hefty investment in solar panels or other alternative energy sources, would you consider replacing that furnace with a cabinet full of servers, storage and network gear?

That's what researchers at Microsoft and the University of Virginia are proposing. They have introduced the concept of the Data Furnace, or DF for short, to heat homes and office buildings while reducing operational costs for cloud providers by offloading at least some of the expense of running servers in large datacenters, which consume huge amounts of energy and require substantial cooling facilities.

Stocked with the servers that power cloud computing infrastructures, these DFs can generate enough heat to act as a primary heating system in a home or building, the researchers proposed in a paper presented in June at the annual USENIX Workshop on Hot Topics in Cloud Computing in Portland, Ore.

Though the paper got little attention at the time, New York Times columnist Randall Stross wrote about it in his popular Digital Domain column Sunday, thereby exposing the idea to a mass audience.

The authors cited three primary benefits of deploying DFs in homes and office buildings: a reduced carbon footprint, lower total cost of ownership per server, and the ability to bring compute and storage closer to users (by further caching data, for example). A DF has a footprint similar to a typical furnace's: a metal cabinet linked to ducts or hot-water pipes.

"DFs create new opportunities for both lower cost and improved quality of service, if cloud computing applications can exploit the differences in the cost structure and resource profile between Data Furnaces and conventional data centers," the authors wrote. "Energy efficiency is not only important to reduce operational costs, but is also a matter of social responsibility for the entire IT industry."

A typical server farm generates heat ranging from 104 to 122 degrees Fahrenheit (40 to 50 degrees Celsius). While not hot enough to sustainably regenerate electricity, that range is ideal for powering heating systems, clothes dryers and water heaters, the authors wrote.

Cheaper servers, improved network connectivity and advances in systems management also make this a practical notion, thanks to the ability to remotely re-image or reboot a server.

Still, there are obstacles. Residential electricity to power a DF can cost anywhere from 10 to 50 percent more than what cloud providers pay in an industrial complex. Network bandwidth to the home could also be more costly. And maintaining geographically dispersed systems becomes more complex and expensive.

Co-author Kamin Whitehouse, an assistant professor of computer science at the University of Virginia, told Stross that he has received more response than he typically gets when publishing a scientific paper. In fact, he said, he has heard from some who are already heating their homes with servers, "which shows that it works."

While it may work, I'd like to see some cloud providers try it out, find out how well it works in a home or building, and see what the total economics are. It seems reasonable that the industry should seriously evaluate this concept.

Posted by Jeffrey Schwartz on 11/30/2011 at 1:14 PM

