VMware's Cloud Foundry project released a beta version of its open source Platform as a Service (PaaS) for developer laptops and desktops.
The Micro Cloud Foundry is software for developers who want to build and test applications locally without having to connect to the cloud. Developers can download a virtual machine image of Micro Cloud Foundry, which is compatible with VMware Fusion for Mac OS X, VMware Workstation and VMware Player for Linux and Windows.
"Today we are taking the next step toward providing developers what they need -- a simple PaaS solution you can quickly download and install on your machine," said VMware CTO Steve Herrod in a blog post.
The release of the client-side software, which was expected, is aimed at making it easier for developers to build or repurpose their apps for Cloud Foundry by providing the environment on their local machines.
Herrod said Micro Cloud Foundry supports the Spring (Java), Ruby on Rails/Sinatra and Node.js frameworks in addition to the MySQL, MongoDB and Redis databases. It also works with Cloud Foundry's scriptable command-line interface, called vmc, and with the Eclipse-based SpringSource Tool Suite (STS), Herrod noted. "This allows developers to retarget deployments between on-premise and public environments without code modifications," he said.
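That retargeting works in part because Cloud Foundry exposes bound services to applications through the VCAP_SERVICES environment variable rather than hard-coded connection strings. A minimal Python sketch of the pattern (the service key name and credential fields are illustrative, and Python is used here only for brevity; the beta's supported frameworks were Spring, Rails/Sinatra and Node.js):

```python
import json
import os

def database_uri(default="mysql://root@localhost:3306/dev"):
    """Read MySQL credentials from Cloud Foundry's VCAP_SERVICES
    environment variable, falling back to a local default when the
    app runs outside any Cloud Foundry environment."""
    raw = os.environ.get("VCAP_SERVICES")
    if not raw:
        return default  # plain local development
    services = json.loads(raw)
    for name, instances in services.items():
        if name.startswith("mysql"):
            cred = instances[0]["credentials"]
            return "mysql://%s:%s@%s:%s/%s" % (
                cred["user"], cred["password"],
                cred["hostname"], cred["port"], cred["name"])
    return default
```

The same code runs unchanged on Micro Cloud Foundry or a public Cloud Foundry instance; only the injected environment differs.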
The beta is available for download.
Posted by Jeffrey Schwartz on 08/26/2011 at 1:14 PM
Verizon Communications has acquired CloudSwitch, whose software makes it possible to move applications and workloads between public and internal datacenters. Terms of the deal were not disclosed.
The company's gateway appliance includes software that allows administrators to move workloads from enterprise datacenters to public clouds without changing the application or infrastructure layer. Applications maintain policies when moved between various cloud environments and internal datacenters.
CloudSwitch will become part of Verizon's Terremark division. Verizon acquired enterprise cloud provider Terremark Worldwide earlier this year in a deal worth $1.4 billion.
"Our founding vision has always been to create a seamless and secure federation of cloud environments across enterprise data centers and global cloud services," said CloudSwitch CEO John McEleney in a statement. "Together, we will be able to provide enterprises with an unmatched level of flexibility, scalability and control in the cloud with point-and-click simplicity. This will go a long way in helping achieve widespread adoption of the cloud especially when managing complex workloads."
Verizon is betting that the addition of CloudSwitch will ease customer resistance to moving enterprise workloads to the cloud.
Posted by Jeffrey Schwartz on 08/26/2011 at 1:14 PM
Eucalyptus Systems disclosed plans to roll out the third version of its open source Infrastructure as a Service (IaaS) private cloud software.
The new release, dubbed Eucalyptus 3, adds high availability (HA) to the cloud platform, meaning customers are assured uptime in the event of a hardware, software or network failure. If a component goes down for any reason, including a failed disk drive, memory corruption or even a power outage, the software fails over to a "hot spare" service running on different hardware.
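Conceptually, the failover logic is simple: route requests to the primary until a health check fails, then promote the hot spare. The sketch below is purely illustrative, not Eucalyptus's actual implementation:

```python
class Node:
    """A stand-in for a service instance on one piece of hardware."""
    def __init__(self, name, healthy=True):
        self.name = name
        self._healthy = healthy

    def healthy(self):
        return self._healthy

    def handle(self, request):
        return "%s served %s" % (self.name, request)

class HAService:
    """Hot-spare failover: serve from the active node until its
    health check fails, then swap in the spare on separate hardware."""
    def __init__(self, primary, hot_spare):
        self.active = primary
        self.spare = hot_spare

    def call(self, request):
        if not self.active.healthy():
            # Disk failure, memory corruption, power loss: any failed
            # health check triggers promotion of the hot spare.
            self.active, self.spare = self.spare, self.active
        return self.active.handle(request)
```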
"Implementing HA was the obvious next major evolution for Eucalyptus," said Eucalyptus CEO Marten Mickos in a blog post. "After all, a key reason for organizations to run a private (i.e. on-premise) cloud is typically that they want more control than they have over a public cloud. Public clouds provide amazing uptime. But at the end of the day, this uptime is nothing you can influence. In a private cloud, however, it's your hardware and you can set the parameters. You can determine the level of assurance. With Eucalyptus 3 supporting HA in the cloud platform itself, your cloud rises to a new level of availability."
Mickos acknowledged that the company initially intended to offer HA only to customers who requested it, but it has since determined that the feature is one most customers will need.
Eucalyptus claims it has deployed 25,000 private clouds and counts 21 of the Fortune 100 that have deployed Eucalyptus-based clouds. Among its customers are Puma, the USDA, Plinga, Aerospace Corp., InterContinental Hotels Group, Wetpaint and USASpending.gov.
The appeal of Eucalyptus software is that it lets customers build internal clouds based on the Amazon Web Services API, allowing customers to move applications and data between AWS and their private clouds. Customers can also run Amazon Machine Images (AMIs) on Eucalyptus and AWS-compatible clouds and can use management tools to administer Eucalyptus clouds.
That could be attractive to organizations that want to create hybrid clouds, where some of their systems run in the public cloud and others remain in the datacenter. It also allows for portability among public and private clouds.
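That portability rests on both sides speaking the same EC2 Query API: an identical request works against AWS or a Eucalyptus cloud, with only the endpoint changed. A hedged sketch (the Eucalyptus endpoint shown uses a typical default port and path, not a real address, and request signing is omitted for brevity):

```python
from urllib.parse import urlencode

def run_instances_url(endpoint, image_id, count=1):
    """Build an EC2-style Query API request. Because Eucalyptus
    implements the same API, only the endpoint differs between AWS
    and a private cloud."""
    params = {
        "Action": "RunInstances",
        "ImageId": image_id,
        "MinCount": count,
        "MaxCount": count,
    }
    return "%s?%s" % (endpoint, urlencode(params))

# Same call, two targets:
aws = run_instances_url("https://ec2.amazonaws.com", "ami-12345678")
euca = run_instances_url(
    "http://cloud.example.com:8773/services/Eucalyptus", "emi-12345678")
```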
Also new in Eucalyptus 3, users will be able to boot from Amazon Elastic Block Storage (EBS). Eucalyptus 3 will also support the AWS Identity and Access Management (IAM) API, as well as the ability to map identities from LDAP and Microsoft Active Directory servers to Eucalyptus accounts, groups and users.
Eucalyptus 3 will be available next quarter.
Posted by Jeffrey Schwartz on 08/25/2011 at 1:14 PM
Amazon Web Services on Tuesday launched a new Web caching service that lets customers deploy an in-memory cache for applications running in the cloud.
The company says its new ElastiCache service lets customers add an in-memory cache to their application architectures. That will enable them to boost the performance of applications by letting customers retrieve information from the in-memory cache rather than from slower disk-based databases.
ElastiCache is suited to read-heavy workloads such as social networking, gaming and media sharing, Amazon said.
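The pattern such workloads rely on is the classic cache-aside read: check the in-memory cache first and fall back to the database only on a miss. A minimal sketch, with a plain dict standing in for the Memcached client (the real client exposes similar get/set calls):

```python
import time

cache = {}          # stands in for a Memcached/ElastiCache client
TTL_SECONDS = 300   # illustrative expiry

def slow_database_read(key):
    # Placeholder for a disk-based database query.
    return "value-for-%s" % key

def cached_read(key, now=None):
    """Cache-aside: try the in-memory cache first, fall back to the
    database on a miss, then repopulate the cache with a TTL."""
    now = time.time() if now is None else now
    entry = cache.get(key)
    if entry is not None and entry[1] > now:
        return entry[0]                      # cache hit
    value = slow_database_read(key)          # cache miss: hit the DB
    cache[key] = (value, now + TTL_SECONDS)  # repopulate
    return value
```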
"Caching is a core part of so many Web applications today, but running your own caching infrastructure is time-consuming and rarely adds differentiated value for your business," said Raju Gulabani, AWS' vice president of database services, in a statement. "Until today, businesses have had little choice but to shoulder this responsibility themselves -- and indeed, many AWS customers have built and managed caching solutions on top of AWS for some time. Amazon ElastiCache answers one of the most highly requested functionalities of AWS customers by providing a managed, flexible and resilient caching service in the cloud."
ElastiCache supports Memcached, an open source distributed memory object caching system. Existing apps, tools and code that use Memcached can be migrated to ElastiCache with minimal effort, Amazon said.
"If you are already running Memcached on some Amazon EC2 instances, you can simply create a new cluster and point your existing code at the nodes in the cluster," said AWS evangelist Jeff Barr in a blog post. "If you are not using any caching, you'll need to spend some time examining your application architecture in order to figure out how to get started. Memcached client libraries exist for just about every popular programming language."
Via the AWS Management Console, customers can launch a Cache Cluster composed of a number of Cache Nodes. Customers can add or subtract nodes to scale the amount of memory tied to a cache cluster.
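Memcached clients decide which node holds a given key by hashing. A simplified sketch of that distribution (real clients commonly use consistent hashing instead of plain modulo, so that resizing the cluster remaps fewer keys):

```python
import hashlib

def node_for_key(key, nodes):
    """Pick a cache node by hashing the key. With simple modulo
    hashing, adding or removing a node remaps most keys, which is
    why production clients often prefer consistent hashing."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

# Hypothetical three-node Cache Cluster:
nodes = ["cache-node-1", "cache-node-2", "cache-node-3"]
```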
Customers can monitor performance characteristics associated with Cache Nodes via Amazon's CloudWatch.
Amazon said pricing is based on the size of the Cache Nodes used, with an entry price of $0.095 per hour for a small Cache Node (1.3 GB of memory). A large Cache Node (7.1 GB) costs $0.38 per hour and an extra-large Cache Node (14.6 GB) is $0.76 per hour. ElastiCache is initially available in Amazon's Virginia region and will be rolled out to its other regions in the coming months.
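At those rates, a rough monthly estimate for an always-on node is simple arithmetic (assuming a 720-hour month and ignoring data transfer or any other charges):

```python
RATES = {"small": 0.095, "large": 0.38, "xlarge": 0.76}  # $/hour

def monthly_cost(size, hours=720):
    """Approximate cost of one always-on Cache Node for a 30-day
    month, before any other charges."""
    return round(RATES[size] * hours, 2)

# A small node running 24/7 comes to roughly $68.40/month.
```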
Posted by Jeffrey Schwartz on 08/23/2011 at 1:14 PM
It's not every day that a company plunks down $10 billion and the news becomes a sidebar. But that's what happened last week when Hewlett-Packard Co. said it will acquire Autonomy for that amount.
The deal was overshadowed by the news that HP is weighing the sale or spinoff of its PC business, that it was shutting down its webOS hardware business and that it was lowering its revenue outlook for the year. All of that, and questions about the merits of its Autonomy acquisition, led to analyst downgrades and a 20 percent drop in HP's share price Friday (it recovered a bit Monday, gaining 3.6 percent).
The fact that HP is using the bulk of its $13 billion in cash reserves to acquire Autonomy, which will only account for 1 percent of HP's revenues, has not been well received. HP has signaled that it is transforming itself from a consumer and enterprise company to just an enterprise company, following in the footsteps of IBM. Seemingly, the plan is: Get out of low-margin businesses and focus on more profitable niches like software, services and the cloud.
But will Autonomy get HP there? And is the company squandering its cash reserves at that price tag, which is nearly 11 times revenues? CEO Leo Apotheker said on HP's earnings call Thursday that it is worth it. "Autonomy represents an opportunity for HP, for us to accelerate our vision to decisively and profitably lead a large and growing space, which is the enterprise information management space," Apotheker said.
"It also brings HP higher value business solutions that will help customers manage the explosion of information. But as we execute this deal, this will position HP as a leader in the large and growing space. It will complement our existing technology portfolio and enterprise strategy. It will provide differentiated IP for services and extensive vertical capability in key industries. We will provide IPG [Imaging and Printing Group] a base for content management platform. It will, over time, significantly enhance HP's financial profile. And the board believes that the transaction is accretive to HP's non-GAAP earnings in its first full year after completion."
Autonomy's flagship Intelligent Data Operating Layer (IDOL) software, which provides cross-enterprise search and content management, brings together silos of unstructured data including e-mail, text, Web pages, voice and video. It's used for e-discovery, records management, archiving and Web content management, among other things.
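To illustrate the cross-silo idea only (IDOL itself uses probabilistic, meaning-based matching, not simple keyword lookup), a toy index across heterogeneous sources might look like this; the document IDs and text are invented:

```python
from collections import defaultdict

def build_index(documents):
    """Toy keyword index spanning heterogeneous silos: the same
    query hits e-mail, Web and voice-transcript sources alike."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

silos = {
    "email:1023": "contract renewal with acme",
    "web:/news/acme": "acme announces renewal of partnership",
    "voice:call-77": "transcript discussing the acme contract",
}
index = build_index(silos)
```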
HP noted that Autonomy's revenue growth was running at a compounded annual growth rate of 55 percent. Its 25,000 customers, a mix of major corporations, law firms and government agencies, include AOL, BAE Systems, BBC, Bloomberg, Boeing, Citigroup, Coca-Cola, Daimler AG, Deutsche Bank, DLA Piper, Ericsson, FedEx, Ford, GlaxoSmithKline, Lloyds TSB, NASA, Nestlé, the New York Stock Exchange, Reuters, Shell, Tesco, T-Mobile, the U.S. Department of Energy, the U.S. Department of Homeland Security and the U.S. Securities and Exchange Commission.
It also has 400 OEM partners including Symantec, Citrix, HP, Novell, Oracle, Sybase and TIBCO.
Sounds good, but is all that worth $10.3 billion for a company that generated $870 million in revenues last year? Some analysts are questioning that. "The IDOL IP is stagnant. There hasn't been a major release of IDOL in over 5 years," wrote Forrester Research analyst Leslie Owens in a blog post.
"Buyers of its technology though have been less enthusiastic, regularly citing a firm that is arrogant in its dealings with customers, confused roadmap messages, and technology (particularly the core IDOL platform) that is overly complex and expensive to use," wrote Alan Pelz-Sharpe, a principal with The Real Story Group. "On the other side of the equation is HP, a hardware and services firm that has had very little success with software."
Others are more bullish on the deal. "The Autonomy acquisition brings a broad portfolio of information management technology, along with a growing number of search-based applications," IDC said in a research note. "Autonomy is a highly profitable software business, with a solid customer base worldwide and a significant cash reserve -- that is, it should add significantly to HP's margins."
Pointing to Autonomy's most recent earnings call, IDC noted that Autonomy's cloud business represents 62 percent of revenues today and is growing at a clip of 17 percent. "HP's converged infrastructure solutions are another linchpin of its risk management-aware cloud computing strategy," the IDC note said.
Apotheker said one-third of Autonomy's revenues are from its Software as a Service business. Autonomy also recently acquired Iron Mountain Digital for $380 million.
"Autonomy is successfully selling enterprise software into a market in which enterprise spending on technology is growing, even while IT budgets are flat or in decline," said Constellation Research analyst Michael Dortch in a blog post.
Nevertheless, the deal has a lot of people scratching their heads. One doesn't have to look far to see that HP has a history of making big investments and later changing strategies. While HP claims it will salvage its webOS software business, it seems all but inevitable that the company will be writing off its $1.2 billion outlay for Palm.
Time will tell if HP does better with Autonomy, but it sure seems like it spent a fortune for a company that only has some of the pieces of the puzzle it is trying to stitch together. And it remains to be seen how well those pieces fit.
Posted by Jeffrey Schwartz on 08/22/2011 at 1:14 PM
To say cloud providers are less than forthcoming on their approach to cloud security would be an understatement. Call it paranoia or prudence, customers are demanding more transparency about security practices before making the leap to the cloud.
The Cloud Security Alliance (CSA) this fall will launch a searchable registry that existing or prospective customers can access free of charge to query how cloud providers are approaching security. Customers will be able to look up cloud providers and review their security practices.
The CSA Security, Trust & Assurance Registry (STAR) aims to document the security controls cloud computing providers have in place, letting users determine how their existing or potential providers address security. Providers file reports to the registry documenting their practices.
"The purpose of the registry is to prod the industry a bit to really be more transparent in their security practices," said Jim Reavis, executive director of the CSA. "We need to have security by transparency. It's really going to create a big mindset shift that while there are definitely a lot of the details about security practices that must be closely held, that in fact to have cloud actually function as a compute utility, we have to have a lot more knowledge about how it works and operates."
The CSA is looking to strike the right balance between transparency and secrecy, but Reavis believes the industry currently leans too far toward secrecy, which inhibits the adoption of cloud computing and keeps customers in the dark about providers' security practices.
"It's a high-level type of shift in the mentality and mindset in how we protect our systems, how we disclose things, how we respond to audits, how we do due diligence," Reavis said. "But I think that will have far-reaching impact on the whole of security and compliance and it could even forestall the need for some pretty heavy-handed government regulation of cloud computing, if we are actually are able to show that the industry can self-regulate to a degree and really expose a prudent amount of information about what they're doing. That's the big effect we're trying to get."
STAR is open to all types of cloud providers, which have the option of submitting two different reports that would indicate their compliance:
- The Consensus Assessments Initiative (CAI) is a questionnaire that lets providers document what security controls exist in their IaaS, PaaS and SaaS offerings, based on industry-accepted methods. It consists of 140 questions a customer or auditor might ask of a cloud provider.
- The Cloud Controls Matrix (CCM) is a spreadsheet-based tool of the CSA's recommended security controls across 13 domains.
STAR should be welcomed by customers evaluating cloud providers. But so far, it remains to be seen how many providers will contribute to the registry. Reavis, for his part, is confident there will be broad industry participation.
"Under NDA I have seen this documentation that we're asking for from virtually every cloud provider," he said, noting they've had to provide it for their bigger customers. "I think based on the fact that they've already done this work -- and we've had really positive conversations -- we expect most major cloud providers to have this documentation posted very close to our go-live date."
Posted by Jeffrey Schwartz on 08/18/2011 at 1:14 PM
Cloud-based business intelligence (BI) startup GoodData has landed $15 million in venture funding, bringing the total it has raised to $28.5 million.
The funding was led by Andreessen Horowitz, with existing investors General Catalyst Partners, Fidelity Growth Partners and Windcrest Partners also contributing to the latest round.
San Francisco-based GoodData offers BI in a Software as a Service model, providing dashboards, analytics and data warehousing capabilities to businesses as an alternative to traditional on-premises BI platforms.
Since its launch in 2009, GoodData has landed 2,500 customers, including Capgemini, Groupon, Nike and Time Warner Cable. GoodData said that its platform use has grown 500 percent this year. In July the company managed 6,500 unique data marts and produced 2 million reports.
SaaS cloud providers such as Zendesk, Twilio, Pardot and Get Satisfaction offer their own analytics apps that run on the GoodData BI service. GoodData can run analytics against Facebook, Microsoft Dynamics CRM, Salesforce.com and SugarCRM data, among others. The company said it intends to use the additional funding to build new analytics apps and expand its partner network.
GoodData also said it has added two new members to its board of directors: John O'Farrell, general partner at Andreessen Horowitz, and Dave Girouard, president of Google Enterprise.
Posted by Jeffrey Schwartz on 08/18/2011 at 1:14 PM
VMware Inc.'s Cloud Foundry Platform as a Service (PaaS) has signed on several new supporters of its open-cloud effort.
Cloud Foundry, which runs atop VMware's vSphere and vCloud platforms, went into beta in April. It is striving to offer a platform by which customers can develop and deploy applications to multiple clouds.
Now Cloud Foundry has a number of new partners joining in its cause: Canonical, Dell, enStratus and Opscode, all of which are aiming to make it easier to deploy Cloud Foundry-based infrastructures.
"We are announcing collaborations with key industry partners, dramatically expanding the integrated tooling for running Cloud Foundry almost anywhere, and delivering on our portability vision," Cloud Foundry said on its blog.
Here's what the various partners are bringing to Cloud Foundry:
- Canonical: The Ubuntu 11.10 Linux distribution will include Cloud Foundry. The company said it has added client and server deployment tools, enabling both single-node and multi-node PaaS environments.
- Dell: Its open source Crowbar framework provides a recognized bare-metal installation package. Cloud Foundry will release a Crowbar "barclamp" that installs and configures Cloud Foundry.
- enStratus: The company has added Cloud Foundry to its service catalog, allowing Cloud Foundry to be deployed on any of the 18 clouds it supports. enStratus will offer a management and automation platform spanning vSphere, vCloud and Cloud Foundry infrastructures.
- Opscode: The creator of the popular Chef open-source cloud automation system said it will simplify the building of complex Cloud Foundry topologies.
Cloud Foundry is predicting that these new integrations and the Cloud Foundry VMC client will show up on "millions" of developer and server operating systems.
Posted by Jeffrey Schwartz on 08/17/2011 at 1:14 PM
Intuit this week said it will "double down" on its partnership with Microsoft to integrate its Intuit Partner Platform (IPP) with the Windows Azure platform for those looking to build Software as a Service (SaaS) apps.
The two companies had jointly announced in January 2010 that Intuit would name Windows Azure as a preferred platform for cloud application development on the IPP. To facilitate that, they developed and released the Windows Azure SDK for the IPP.
In parallel, though, Intuit had its own Platform as a Service (PaaS) cloud aimed at providing an end-to-end service for partners. Alex Chriss, director of the IPP, said in a blog post that while it helped some partners get to market, it lacked the scale that would appeal to its developer community.
"While we've continued to invest in our native platform, other platforms have moved at lightning speed," Chriss wrote. "These platforms have evolved way past anything we could create and honestly provide a better development experience than we could."
As a result, Intuit said it will no longer offer its native development stack for developers. Instead, the company plans to focus on the top of the stack to make data and services for developers a top priority. Hence, the decision to extend its reach with Azure.
"We believe with additional investment in our Windows Azure SDK for IPP, we can make developing an app on Azure, leveraging QuickBooks data and IPP's go-to-market services, and getting into the Intuit App Center Channel, DROP DEAD EASY," Chriss wrote.
A forthcoming 2.0 release of the SDK will provide support for one-click publishing and authentication of federated IPP and Intuit Anywhere apps hosted on Windows Azure, Microsoft said on the Windows Azure Team blog. The companies did not say when they will release the new SDK.
The two companies have launched a program called the Front Runner for Intuit Partner Platform, where developers can get access to SDKs, content and support; test their applications' compatibility; and then publish apps to the Intuit App Center, a marketplace of small-business applications and services.
"If you're a member of the Front Runner Program (for third party developers) then you'll also benefit from technical and business support that simplifies getting your apps hosted onto Windows Azure, while also integrating with QuickBooks data," wrote Liz Ngo, senior business development manager in Microsoft's Global ISV Group, in a blog post. "Once you are on-boarded, Intuit & Microsoft will promote your application to Intuit's 26 million (and growing) SMB customers."
Posted by Jeffrey Schwartz on 08/11/2011 at 1:14 PM
Are your non-technical friends and family members familiar with the term cloud computing? If they're not, don't despair. Most people aren't, either.
Only 22 percent of consumers are familiar with the term cloud computing, according to a study conducted by NPD Group. But that doesn't mean they aren't using cloud computing to conduct various tasks and activities.
"Whether they understand the terminology or not, consumers are actually pretty savvy in their use of cloud-based applications," said Stephen Baker, NPD's vice president of industry analysis, in a statement. "They might not always recognize they are performing activities in the cloud, yet they still rely on and use those services extensively. Even so, they are not yet ready to completely give up on traditional PC-based software applications."
Seventy-six percent of respondents to the research firm's "Digital Software and the Cloud Report" said they have used some sort of cloud service over the past year. Top services were e-mail, tax preparation and online gaming, NPD said.
Despite the emergence of cloud services, 24 percent of those surveyed said they purchased a computer-based software app over the past six months.
Posted by Jeffrey Schwartz on 08/11/2011 at 1:14 PM
Amazon Web Services' woes continued on Tuesday as the cloud provider worked to recover from a bolt of lightning that caused power outages in its Dublin, Ireland, datacenter over the weekend.
The lightning strike brought down Amazon's EC2 and RDS services, as well as Microsoft's Business Productivity Online Services. Microsoft's outage reportedly lasted several hours on Sunday; service has since been restored, the company said on its Twitter feed.
But Amazon on Tuesday was still trying to restore its Elastic Block Store (EBS) volumes following the explosion and fire that knocked out the company's backup generators. The company said on its Service Health Dashboard on Sunday:
"Due to the scale of the power disruption, a large number of EBS servers lost power and require manual operations before volumes can be restored. Restoring these volumes requires that we make an extra copy of all data, which has consumed most spare capacity and slowed our recovery process. We've been able to restore EC2 instances without attached EBS volumes, as well as some EC2 instances with attached EBS volumes. We are in the process of installing additional capacity in order to support this process both by adding available capacity currently onsite and by moving capacity from other availability zones to the affected zone. While many volumes will be restored over the next several hours, we anticipate that it will take 24-48 hours until the process is completed."
The company said some EC2 instances and EBS servers lost power before writes to their volumes were completed. "Because of this, in some cases we will provide customers with a recovery snapshot instead of restoring their volume so they can validate the health of their volumes before returning them to service. We will contact those customers with information about their recovery snapshot."
Vincent Partington, co-founder and CTO of XebiaLabs, was among those who received bad news from Amazon. "Just got an email from Amazon AWS saying they lost some of my data," Partington tweeted. "I'm OK with an outage now and then but this really blows!"
Apparently, the problem extends beyond the lightning strike. Amazon discovered a bug in the software that cleans up unused snapshots. "During a recent run of this EBS software in the EU-West Region [Dublin], one or more blocks in a number of EBS snapshots were incorrectly deleted," the company said. "We've addressed the error in the EBS snapshot system to prevent it from recurring. We have now also disabled all of the snapshots that contain these missing blocks."
As of Tuesday morning, Amazon said it has delivered recovery snapshots for more than half of the volumes that were affected by the power outage. "We are continuing to make steady progress on creation and delivery of the remaining recovery snapshots," the company said.
If that wasn't enough, Amazon suffered a brief outage Monday night at its U.S. East datacenter in Northern Virginia. Though it lasted only about two hours, the outage affected the Web sites of customers including Foursquare, Netflix and Reddit. Amazon attributed it to connectivity issues between instances.
Posted by Jeffrey Schwartz on 08/10/2011 at 1:14 PM
Cloud service provider Skytap has come up with a way to automate the launching of servers without requiring third-party management tools.
The company has added "cloud orchestration" to its flagship service, which it says will eliminate the need for software from the likes of CA Technologies, Hewlett-Packard and IBM to manage the sequencing of servers.
When provisioning apps to the cloud, often servers need to be brought up in a particular sequence. For example, a company may want to launch Active Directory first to ensure all user credentials are available, then SQL Server where data is stored, then Microsoft's Team Foundation Server followed by an application server, explained Sundar Raghavan, Skytap's VP of marketing.
"When people move to the cloud, they want the same high-fidelity environment but in our opinion they don't have to buy all that expensive software and have an IT person set it all up," Raghavan said. "So what we are doing is allowing customers to use our self-service UI and create rules by which these startup sequences will occur and in which the shutdown and suspend will occur. Being able to schedule those operating dependencies is very crucial for moving to the cloud."
Skytap has also added a network routing feature that lets users consolidate server environments, letting customers recreate a server hub-and-spoke model in the cloud. That would allow, for example, customers to use centralized servers such as Active Directory without having to replicate it to the nodes.
"The benefit of this is if you're a team manager, you don't have to wait around for IT to set up your environment," Raghavan said. "You can go, consolidate, use our UI to route between the environments, and get going immediately. Reduce your setup errors, [avoid your obsolete machines and reduce] your usage cost. So really it's about consolidating your servers and simplifying your architecture yet having the benefits and economics of scale of cloud computing."
Posted by Jeffrey Schwartz on 08/10/2011 at 1:14 PM