IBM Launches Cloud Partner Program

IBM is kicking off a new program intended to make it easier for partners to sell the company's cloud-based products.

Big Blue unveiled the new Cloud Computing Specialty at its PartnerWorld Leadership Conference in Orlando last week. Up until now, the company lacked a cohesive go-to-market strategy for its various cloud initiatives.

"We did have a couple of programs that were in certain areas but as we looked at what cloud has become, there are so many more opportunities now," said Dave Mitchell, director of strategy and emerging business for IBM's ISV and developer relations organization. "Our portfolio has become so much wider, and our capabilities have become so much broader, that we felt it was a good time to really see how we could bring this together into a single story."

The company has identified five types of cloud partners it is addressing:

  • Cloud Application Providers: Typically, these are ISVs that are moving toward the Software as a Service (SaaS) model.
  • Cloud Builders: Systems integrators, consultants and solution providers that build private clouds for customers.
  • Cloud Infrastructure Providers: Telcos, regional hosters and managed service providers, and anyone building out cloud infrastructure.
  • Cloud Services Solution Providers: Those that resell multiple cloud services to offer custom or vertical solutions for customers.
  • Cloud Technology Providers: Companies that provide tools aimed at extending the value of public clouds, such as cloud management, billing and monitoring software.

Each of the five parts of the program has its own defined criteria for skills, revenue and references that partners must establish, Mitchell said. In return, there are detailed go-to-market and technical support benefits customized for each of the paths.

Using its LotusLive network, IBM is also linking partners together so they can collaborate on deals, Mitchell added.

While the Cloud Computing Specialty is aimed at partner development, IBM last week also launched its first authorization and certification program for its software resellers. The IBM Cloud Computing Authorization program is an extension of the company's existing software reseller program known as the IBM Software Value Plus program.

"These two programs are very complementary," Mitchell said, referring to the skills and specialties required.

Partners who join the program and earn certifications in the right combination of products will be eligible for incremental margin incentives. Qualifying products include the WebSphere CloudBurst appliance, Tivoli Service Manager, Tivoli Provisioning Manager, IBM Solution Delivery Manager and some of the cloud development tools in the Rational development suite.

Posted by Jeffrey Schwartz on 02/24/2011 at 1:14 PM


Axcient Brings Data Protection to Cloud for SMBs

Axcient is the latest provider of data protection technology that lets small and medium-sized enterprises use the cloud for disaster recovery and business continuity.

Though not a well-known company, Axcient supplies appliances that offer multiple levels of backup and recovery. Here's how it works: The customer pays a monthly fee for the appliances, which consist of servers loaded with Axcient's software, storage and networking infrastructure, all controlled through a Web-based interface. Files and data from both clients and servers can be backed up to the appliance, which in turn backs up to the vendor's cloud-based facilities.
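To picture that two-tier flow, here's a purely illustrative Python sketch; it reflects nothing of Axcient's actual software, which ships preinstalled on the appliance and is driven through its Web interface:

```python
# Purely illustrative: clients back up to a local appliance, which then
# replicates everything off-site to the vendor's cloud facilities.

def backup(files, appliance, cloud):
    for name, data in files.items():
        appliance[name] = data    # first tier: fast local restore point
    for name, data in appliance.items():
        cloud[name] = data        # second tier: off-site disaster recovery copy

appliance_store, cloud_store = {}, {}
backup({"payroll.db": b"...", "mail.pst": b"..."}, appliance_store, cloud_store)
print(sorted(cloud_store))        # both files now have an off-site copy
```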

"We've built a platform that basically provides an SMB total access to data, regardless of what happens, total continuity and total disaster recovery," said Justin Moore, Axcient's CEO. The appliance, Moore explained, is the only component a customer needs to deploy. "It eliminates the need to install a single piece of software on the network," he said.

Moore said Axcient has 1,000 customers with 50,000 devices installed. While there are a number of players in this space, Axcient is looking to differentiate itself. It is hoping a partnership announced last week with Hewlett-Packard Co. will give it a more consistent hardware stack and access to the larger company's channel partners to extend its reach.

"We are not a hardware company, we are a software and Software as a Service platform, company," he explained. "Now our entire platform, from the on-premise appliance to our cloud compute infrastructure, is all going to be on HP's platform."

Enterprise Strategy Group senior analyst David Chapa, who published a report this month on the backup-as-a-service market, said linking with HP promises to give Axcient greater recognition in the market. "They are starting to make a name for themselves and the announcement of the HP relationship certainly will help with that awareness," Chapa said.

As part of the pact, Axcient will use HP's ProLiant SL and DL servers, StorageWorks and networking products. The current solution is designed to handle 20 terabytes of compressed data and can fail over to seven such appliances. Pricing starts at $150 for a 250 GB implementation.

Data and server images are automatically replicated to Axcient's data centers, which the company says are SAS 70 Type II-certified.

Posted by Jeffrey Schwartz on 02/24/2011 at 1:14 PM


Government Plans 'Cloud-First' Policy in New IT Budget

The information technology portion of the 2012 federal budget proposal submitted by President Barack Obama this week calls for a "Cloud-First" policy, meaning agencies are being encouraged to use cloud-based solutions when such options exist.

The shift is outlined in a report released last week by Federal CIO Vivek Kundra. "This policy is intended to accelerate the pace at which the government will realize the value of cloud computing by requiring agencies to evaluate safe, secure cloud computing options before making any new investments," according to the report.

Obama's budget calls for a slight uptick in IT spending overall, totaling $79.5 billion, a 1.9 percent increase over the current fiscal year, which ends Sept. 30. In the Federal Cloud Computing Strategy report, Kundra said $20 billion of that IT spending, or roughly 25 percent, can move to the cloud.
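For readers who want to check the math, here's a quick sketch of the arithmetic behind those figures:

```python
# Quick check of the figures cited from the Federal Cloud Computing Strategy.
total_it_budget = 79.5   # proposed fiscal 2012 IT spending, in billions of dollars
cloud_eligible = 20.0    # spending Kundra says can move to the cloud, in billions

share = cloud_eligible / total_it_budget
print(f"Cloud-eligible share of the IT budget: {share:.1%}")   # ~25.2 percent

# A 1.9 percent increase implies current-year spending of roughly:
print(f"Implied current-year spending: ${total_it_budget / 1.019:.1f} billion")  # ~$78.0 billion
```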

"By using the cloud computing model for IT services, we will be able to reduce our data center infrastructure expenditure by approximately 30 percent,"  the report states, pointing to inefficiencies in the current IT environment.

"Cloud computing has the potential to play a major part in addressing these inefficiencies and improving government service delivery. The cloud computing model can significantly help agencies grappling with the need to provide highly reliable, innovative services quickly despite resource constraints," according to the report.

But analyst Ray Bjorklund of Federal Sources told Computerworld that he is skeptical the government can achieve such lofty goals. "Trying to wave your wand and say we are going to achieve 30 percent savings is not that simple," he said.

Bjorklund has a point, but the government's push into cloud computing will be an important barometer of the successes and challenges of this shift in how IT is provided and managed.

Posted by Jeffrey Schwartz on 02/17/2011 at 1:14 PM


CEO Insists Rackspace Is Not for Sale

With Verizon Communications agreeing to acquire Terremark for $1.4 billion last month and Time Warner Cable following up with a deal to buy NaviSite for $230 million, the question arises: are we going to see a wave of consolidation in the cloud computing industry? The obvious answer is: of course.

One company that claims it's not in play is Rackspace Hosting. Rackspace would appear to be a natural acquirer or acquisition target. The company is one of the leading players in the cloud computing industry. Revenues in 2010 were up 24 percent, topping $780 million, the company announced last week. At that run rate, Rackspace's revenues this year will fall just shy of $1 billion.
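That back-of-the-envelope projection is easy to verify:

```python
# Rackspace's 2011 run rate, assuming 2010's 24 percent growth holds.
revenue_2010 = 780e6               # 2010 revenue topped $780 million
projected_2011 = revenue_2010 * 1.24
print(f"Projected 2011 revenue: ${projected_2011 / 1e9:.2f} billion")  # ~$0.97B, just shy of $1 billion
```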

CEO Lanham Napier insists Rackspace is not for sale. His game plan is to grow Rackspace into a "giant" organization. "We are a committed long term player to this market," Napier said during the company's earnings call last week. "There is going to be a Web giant that emerges from this technology shift and we want to be that giant."

He put it more bluntly in an interview with Forbes: "Our company is not for sale, and we want to whup these guys," he said, while acknowledging it would be his fiduciary obligation to shareholders to consider any substantial bid. And considering we are in the midst of a cloud computing gold rush, there are some deep-pocketed companies that could make Napier an offer that's hard to refuse. Naturally, any company is for sale at the right price.

Committed to hitting that $1 billion revenue milestone in this, its 13th year, Napier believes the $2 billion goal is in sight, along with growing from 130,000 customers to 1 million. "Today we believe it is very much within our reach and that we have a solid plan to get there," he said during the earnings call.

And that plan does not call for buying any rivals to grow share, he added. "We will continue to be an organic based growth company," he said. "We will do acquisitions like we just announced [such as the deal to buy Cloudkick in December] but these are acquisitions about building capability, not trying to buy revenue or scale."

Posted by Jeffrey Schwartz on 02/17/2011 at 1:14 PM


Marketplace for Cloud Capacity Debuts

Need to use 10,000 servers for just an hour or two? Or perhaps you have excess capacity that will otherwise go unused and unsold? If you are in either camp, there's now a marketplace aimed at matching buyers and sellers of cloud computing capacity.

Enter SpotCloud. Enomaly, a longtime provider of cloud computing infrastructure software, launched the public beta of SpotCloud on Monday. Cloud service providers can make unused capacity available on the exchange.

Enomaly describes SpotCloud as the first structured marketplace where service providers can sell their excess computing capacity to an open field of buyers and resellers. The company vets all providers, who can offer as little as one 8-core server and 500 gigabytes of storage.

Buyers such as those dealing with brief spikes in workloads or shops that want to run performance tests of their applications are typical candidates, said Reuven Cohen, Enomaly's founder and CTO, in an interview this week. A buyer may need capacity in a certain part of the world or perhaps they want to acquire it from multiple distributed providers, Cohen explained.

Enomaly's business model is to take a commission on every transaction. Asked if there were enough suppliers interested in participating in such a marketplace, Cohen said he has already received queries from hundreds of providers. "We certainly have had no issue getting a large group of suppliers," he said.

"People are willing to buy something from you that you weren't going to sell otherwise. It's pretty easy to convince someone to make their excess capacity available as long as you can show that you're doing so in a secure and structured way. It's not a hard pitch."

SpotCloud is a marketplace built using the company's Enomaly ECP platform (it will support other cloud infrastructure platforms in the future, the company says) and runs atop Google App Engine.

Here's how it works, according to the SpotCloud Web site (a hypothetical sketch of the buyer-side flow follows the list):

  1. Buyers deposit an initial credit into the SpotCloud platform. (It's a pay-as-you-go model.)
  2. Buyers create a VM appliance using the Enomaly SpotCloud package builder.
  3. Buyers then upload the VM appliance (SpotCloud can provide sample VM images with phone-home capabilities) using the SpotCloud management interface.
  4. Sellers dynamically define hardware profiles, location information, duration of available capacity and associated resource costs.
  5. Buyers select providers based on cost and location.
  6. VMs are automatically delivered to sellers' cloud infrastructures (sellers must have a SpotCloud-compatible IaaS cloud platform), where the VM packages run according to buyers' requirements.
  7. SpotCloud monitors usage and debits buyers on an hourly utility basis, with a notification sent when credits drop below a minimum threshold.
  8. At the end of the month, sellers are paid directly for any capacity utilized via the SpotCloud marketplace.
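To make those mechanics concrete, here's the promised hypothetical sketch of the buyer's side of that loop in Python. SpotCloud's actual interface is Web-based, and none of these class or method names come from Enomaly; they are purely illustrative:

```python
# Hypothetical buyer-side flow, mirroring the numbered steps above.

class SpotCloudBuyer:
    def __init__(self, credit):
        self.credit = credit                      # step 1: pay-as-you-go deposit

    def choose_offer(self, offers, region, max_hourly_cost):
        # step 5: pick a vetted seller by location and price
        matches = [o for o in offers
                   if o["region"] == region and o["hourly_cost"] <= max_hourly_cost]
        return min(matches, key=lambda o: o["hourly_cost"]) if matches else None

    def run_appliance(self, vm_image, offer, hours):
        # steps 2-3 and 6: the packaged VM appliance runs on the seller's
        # SpotCloud-compatible infrastructure for the requested duration
        cost = offer["hourly_cost"] * hours       # step 7: hourly utility billing
        if cost > self.credit:
            raise RuntimeError("credit below minimum threshold -- top up first")
        self.credit -= cost
        return f"ran {vm_image} in {offer['region']} for {hours}h at ${cost:.2f}"

buyer = SpotCloudBuyer(credit=100.00)
offer = buyer.choose_offer(
    [{"region": "eu-west", "hourly_cost": 0.05},
     {"region": "us-east", "hourly_cost": 0.04}],
    region="us-east", max_hourly_cost=0.10)
print(buyer.run_appliance("test-appliance.img", offer, hours=48))
```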

A notable attempt at a clearinghouse like this dates back more than a decade, to when Enron launched a bandwidth marketplace, Cohen said. That effort failed for quite a few reasons, of course, but Cohen believes the time is now right for a marketplace to exchange cloud capacity.

Will SpotCloud evolve into a larger business than Enomaly's software infrastructure platform, the company's core offering for more than six years? "If the numbers keep going the way we currently see it, I would say the future of our business is probably looking more like SpotCloud and less like traditional infrastructure-as-a-service software we've done in the past," Cohen said. "It's been very strong, we've had a significant number of people sign up who are paying for the service. It's looking very, very promising."

If you're a buyer or seller of cloud capacity, would you use a marketplace like SpotCloud? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/17/2011 at 1:14 PM


Oracle Database Added to Amazon Web Services

Amazon Web Services plans to make the Oracle Database 11g R2 available on its Relational Database Service (RDS) next quarter.

RDS is a service designed to let customers install, run and scale relational databases in the cloud. Until now, only the MySQL database was an option on RDS, a service Amazon said is used by thousands of customers.

"As with today's MySQL offering, Amazon RDS running Oracle database will reduce administrative overhead and expense by maintaining database software, taking continuous backups for point-in-time recovering, and exposing key operational metrics via Amazon CloudWatch," said Amazon Web Services evangelist Jeff Barr, in a blog post. CloudWatch is a service that lets customers monitor their AWS cloud resources.

Customers can also scale compute and storage capacity using the AWS Management Console, Barr added.

Amazon is offering three licensing options. Customers can bring their own licenses and run them without additional licensing fees. The second option is On-Demand DB Instances, in which customers pay by the hour for use of an Oracle database; it requires no setup fees or long-term commitments, and the hourly rate is based on the database edition and DB Instance size selected. The final option is Reserved DB Instances, which lets customers make a one-time payment for each DB Instance and then run it at a discounted hourly rate, with one-year and three-year commitments available.
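For a sense of what provisioning such an instance looks like programmatically, here's a minimal sketch using the boto3 SDK, which postdates this announcement; every identifier and credential below is a placeholder:

```python
# Sketch: launch an Oracle DB Instance on RDS under the bring-your-own-license
# option. Assumes AWS credentials are already configured in the environment.
import boto3

rds = boto3.client("rds", region_name="us-east-1")

rds.create_db_instance(
    DBInstanceIdentifier="my-oracle-db",      # placeholder name
    Engine="oracle-ee",                       # Oracle Database Enterprise Edition
    LicenseModel="bring-your-own-license",    # the first of the three options
    DBInstanceClass="db.m5.large",            # instance size drives the hourly rate
    AllocatedStorage=100,                     # storage in GiB; can be scaled later
    MasterUsername="admin",
    MasterUserPassword="change-me",           # placeholder credential
)
```

Scaling compute or storage later maps to the same API's modify_db_instance call, the programmatic counterpart of the AWS Management Console operation Barr describes.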

Oracle will provide technical support for those who bring their own licenses, while Amazon will provide support for the On-Demand and Reserved DB Instance options.

Posted by Jeffrey Schwartz on 02/10/2011 at 1:14 PM


Google Aims for E-Mail Dial Tone

In my Redmond magazine cover story this month, "Clouds Collide in Strategic Microsoft vs. Google Battle," I reported how fiercely the two companies are going after each other in the hosted e-mail and Web productivity space.

Google is trying to eat into Microsoft's cash cows: Windows, Exchange, SharePoint and Office -- and the article explains how. While Microsoft likes to tout missing features in Google Apps, it's chasing a moving target: Google updates Google Apps frequently.

One of the critiques of Google Apps was its approach to service level agreements. Specifically, downtime of less than 10 minutes wasn't counted as part of the SLA. Google has amended its SLA so that even shorter periods of downtime are now included.

Moreover, Google has removed an SLA provision that permitted scheduled downtime. "Going forward, all downtime will be counted and applied towards the customer's SLA," wrote Matthew Glotzbach, Google's enterprise product management director, in a blog post last month announcing the new policy.

Despite some outages, Glotzbach said that Gmail was available 99.984 percent of the time last year, covering both business users and consumers. "People expect e-mail to be as reliable as their phone's dial tone, and our goal is to deliver that kind of always-on availability with our applications," he noted.
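To put that figure in perspective, a quick bit of arithmetic shows what 99.984 percent availability allows:

```python
# Annual downtime implied by 99.984 percent availability.
availability = 0.99984
minutes_per_year = 365 * 24 * 60

downtime = (1 - availability) * minutes_per_year
print(f"{downtime:.0f} minutes per year")   # ~84 minutes, about 7 minutes a month
```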

As Google and Microsoft argue over whose service has features more conducive to how information is created and shared, availability and meeting service-level commitments will remain high on customers' lists. Providing e-mail dial tone must be a given. Google and Microsoft both appear to get that, but the proof will be in whether they can deliver.

Posted by Jeffrey Schwartz on 02/10/2011 at 1:14 PM


Forecast: Certain Verticals Will Accelerate Public Cloud Growth

Expenditures for public cloud services will grow to nearly $30 billion over the next three years, according to a report released this week by market researcher IDC.

The latest forecast projects a compound annual growth rate of 21.6 percent from 2009, when revenues were $11 billion. With a number of reports showing robust growth for cloud computing services, this one is noteworthy because it looks at public cloud service revenues across 18 vertical industries.
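The headline number checks out; a quick sketch of the compounding:

```python
# IDC's projection: 21.6 percent CAGR on $11 billion in 2009 revenue.
base_2009 = 11.0                 # billions of dollars
cagr = 0.216
years = 5                        # 2009 through 2014

projected_2014 = base_2009 * (1 + cagr) ** years
print(f"Projected 2014 revenue: ${projected_2014:.1f} billion")  # ~$29.2B -- "nearly $30 billion"
```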

IDC found that the professional services, communications, media and manufacturing markets will generate the most revenue for public cloud service providers. Professional services is especially high on the radar because of the number of "information-dependent" midsized companies expected to adopt Software as a Service, or SaaS.

The services and distribution sector, which includes retail, wholesale, professional services, consumer and recreational services, and transportation, accounts for the largest share of revenue. That sector, now a $3 billion market, will nearly triple to $8.5 billion by 2014, according to IDC.

Other verticals that are regulated or have significant security and/or privacy concerns will limit their use of public cloud services to e-mail, messaging and collaboration. Those industries include government, banking and healthcare.

Healthcare will account for only 5 percent of public cloud revenue in 2014, yet it will post a CAGR of 23 percent.

Posted by Jeffrey Schwartz on 02/10/2011 at 1:14 PM


OpenStack Gains Ground

NASA and Rackspace Hosting last week marked the six-month anniversary of their open-source cloud effort by announcing that it has more than 40 partners on board. One of those partners, Internap Network Services, said it is building a cloud storage platform based on OpenStack.

Rackspace chairman Graham Weston believes OpenStack will become a key factor in open-source cloud computing. The goal is to provide portability among cloud providers who build their infrastructures on OpenStack. "At Rackspace we think OpenStack is the next Linux," Weston said in a company-produced video posted on the OpenStack Web site.

OpenStack is a cloud operating system freely available under the Apache 2.0 license. Rackspace decided to open source the code behind two of its cloud services: CloudServers and CloudFiles, its compute and storage offerings, respectively. In the process of open sourcing that code, Rackspace learned that NASA was working on similar technology and was interested in open sourcing its own effort. The two combined forces and launched OpenStack in July.

Currently, OpenStack consists of two core efforts: OpenStack Compute and OpenStack Object Storage. OpenStack Compute is software designed to deploy and manage large clusters of virtual private servers, while OpenStack Object Storage is designed to scale to terabytes and petabytes of stored data.
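For a sense of what OpenStack Object Storage looks like to a developer, here's a minimal sketch using python-swiftclient, the client library that later grew up around the project's Swift code; the endpoint and credentials are placeholders for whatever a provider issues:

```python
# Store and fetch an object in an OpenStack Object Storage (Swift) cluster.
from swiftclient.client import Connection

conn = Connection(
    authurl="https://swift.example.com/auth/v1.0",  # placeholder auth endpoint
    user="account:user",                            # placeholder credentials
    key="secret",
)

conn.put_container("backups")                       # create a container
conn.put_object("backups", "notes.txt", contents=b"hello, swift")

headers, body = conn.get_object("backups", "notes.txt")
print(body.decode())                                # -> hello, swift
```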

The OpenStack consortium launched with 25 partners. Now it has more than 40, including Citrix, Dell and Intel, as well as smaller companies like Internap, CloudKick, Cloudscaling, Limelight Networks, RightScale and Gigaspaces.

"If they can spread this platform, it gives them a better way to compete," said Redmonk analyst Michael Cote. "There's been a huge amount of interest." Outside of NASA and Rackspace, Internap is the first cloud provider to implement it in a product.

For its part, Internap said it has released a beta of Internap XIPCloud Storage, a public cloud storage service aimed at complementing its managed hosting business. The company is building its elastic cloud storage service using OpenStack, said Scott Hrastar, senior vice president of technology at Internap.

While Hrastar admits OpenStack's availability was good timing, he told me "we're a big proponent of the open-source community aspect of the project and just saw it as a natural way for us to build on top of an interesting and differentiated solution."

A new version of OpenStack, code-named "Bexar," is slated for release early next month. Bexar represents a stabilization of the code base, said Jonathan Bryce, chairman of the OpenStack Project Oversight Committee and co-founder of the Rackspace Cloud.

"It's been a lot of fun to see it grow and see it really pick up momentum and see the software improving," he said.

Posted by Jeffrey Schwartz on 01/25/2011 at 1:14 PM


Did Google's Schmidt Lose Control?

Google's decision to name Larry Page to replace Eric Schmidt as CEO caught me and everyone else who follows the company off guard. After all, Google was showing quarter-over-quarter revenue and profit growth that most companies would kill for. But in retrospect, the handwriting was on the wall.

While there are rumors they weren't on the same page, Google insists there was no friction between Schmidt and the two co-founders, Page and Sergey Brin. Putting aside Schmidt's controversial comments on privacy and the decision last year to pull out of China, Schmidt had his own ideas for how the company should develop technology and bring products to the market.

One example involves the genesis of the Chrome browser and Chrome OS -- Google's vision for replacing typical PCs with computers that rely on the cloud for everything they do. The company last month launched a controlled beta of its Chrome OS notebook, dubbed the Cr-48. Schmidt has said he wanted no part of Google being in the browser or OS business, and as CEO he had earlier put the kibosh on the idea.

At least so he thought. That's when the end run happened, as Schmidt explained at last month's Chrome OS launch event:

"Larry and Sergey wanted to be in the browser and OS business and I absolutely was not interested in being in either, and I said no... They sneakily hired a number of people who were very clever, to work on Firefox browser which we helped fund through an advertising deal, and ultimately that core team was able to build this phenomenal browser called Chrome, which finally broke through the architectural frameworks that people had with respect to security and speed."         

While Schmidt was apparently trying to portray the end run as one that worked out for the best, it nevertheless raises questions about how much control he had.

Whether Schmidt really is on board today with the decision to build Chrome and Chrome OS is a moot point. Page and Brin did it without him. And perhaps the two decided it was time to stop "sneaking" around and take control of their company's destiny -- for better or for worse.

Posted by Jeffrey Schwartz on 01/25/2011 at 1:14 PM


Microsoft Readies New Cloud Data Center

Microsoft is getting ready to launch the newest datacenter for its cloud services. The new facility, in Quincy, Wash., will go live early this year, Microsoft announced this week.

It incorporates many of the principles of its Chicago and Dublin datacenters, notes Kevin Timmons, Microsoft's general manager of datacenter services.

Timmons points out that there are some nuances. The Dublin facility uses server PODs, which rely on outside air to reduce cooling costs. The Chicago datacenter, by comparison, uses Microsoft's IT Pre-Assembled Components (ITPACs). Quincy will use ITPACs.

"An ITPAC is a pre-manufactured, fully-assembled module that can be built with a focus on sustainable materials such as steel and aluminum and can house as little as 400 servers and as many as 2,000 servers, significantly increasing flexibility and scalability," Timmons notes in a blog post.

"The expansion in Quincy takes these ideas a step further," he adds, "by extending the flexibility of PACs across the entire facility using modular 'building blocks' for electrical, mechanical, server and security subsystems.  This increase in flexibility enables us to even better support the needs of what can often be a very unpredictable online business and allows us to build datacenters incrementally as capacity grows.  Our modular design enables us to build a facility in significantly less time while reducing capital costs by an average of 50 to 60 percent over the lifetime of the project."

The new datacenter will be adjacent to Microsoft's existing 500,000-square-foot facility, but this one will be housed in structures that resemble tractor sheds, allowing Microsoft to pull in cool outside air while protecting the equipment from the elements.

Posted by Jeffrey Schwartz on 01/06/2011 at 1:14 PM


Midmarket Doesn't Get Cloud

A survey of mid-sized companies employing fewer than 1,000 people found that nearly half don't understand what the cloud is.

Fielded by cloud provider Vitacore Systems, the survey found that 48 percent of respondents are confused about the cloud. And despite using services such as Salesforce.com or Google Docs, 54 percent said they had no idea they were using a cloud service.

Vitacore surveyed 210 business and IT pros at mid-sized companies. Of those, roughly one-third were IT pros; the rest came from the business side. Vitacore provides cloud services to midmarket companies.

Posted by Jeffrey Schwartz on 01/06/2011 at 1:14 PM

