Intel is investing $30 million to expand its research in cloud and embedded computing technologies.
The money will be used by Intel Labs to fund the creation of Intel Science and Technology Centers (ISTCs) at Carnegie Mellon University. The ISTCs are aimed at fostering collaboration between university researchers and Intel. The results of that collaboration will be made publicly available via technical journals and open-source software releases, Intel said.
"These new ISTCs are expected to open amazing possibilities," said Justin Rattner, Intel's CTO, in a prepared statement. "Imagine, for example, future cars equipped with embedded sensors and microprocessors to constantly collect and analyze traffic and weather data. That information could be shared and analyzed in the cloud so that drivers could be provided with suggestions for quicker and safer routes."
Intel said researchers will examine technologies that will impact the cloud, such as built-in application optimization, big data analytics and extending cloud capabilities to the edge of the network and client devices.
Posted by Jeffrey Schwartz on 08/09/2011 at 1:14 PM
CloudBees, which offers a cloud-based Platform as a Service (PaaS) for Java applications, has secured $10.5 million in Series B financing from Lightspeed Venture Partners.
The startup, founded last year by former JBoss CTO Sacha Labourey, offers a PaaS for developing and deploying Java applications.
Waltham, Mass.-based CloudBees currently offers two core services: DEV@cloud, which provides a Java-based development infrastructure for building and testing apps, and RUN@cloud, which provides an app server in the cloud for deployment.
Among CloudBees' customers are Cisco, Digg, Lawrence Livermore National Laboratory and Sandia National Laboratories.
The company received $4 million in Series A financing in November from Matrix Partners, which also participated in the current round. CloudBees said it will use the funds for product development and to expand sales and marketing efforts.
Posted by Jeffrey Schwartz on 08/08/2011 at 1:14 PM
Amazon Web Services has rolled out three new features to its portfolio of cloud services that should appeal to enterprise users by making it easier to extend their datacenters to the cloud.
Today marks the general availability of AWS Virtual Private Cloud (VPC), AWS Direct Connect and new identity federation capabilities.
"With today's launch of Amazon VPC worldwide, AWS Direct Connect and the new IAM federated identity capabilities, enterprises have even more flexibility and control over deploying their workloads to the cloud," said Amazon Web Services VP Adam Selipsky in a statement. "These capabilities provide even more privacy, and along with AWS's existing cloud services allow enterprises to choose the environment that is best suited to each of their workloads."
The new AWS Direct Connect feature allows customers to create connections from their datacenters to an AWS location over a dedicated network link. By bypassing the Internet in favor of dedicated connections, users gain improved privacy, increased bandwidth and reduced latency between the customer's datacenter and AWS, according to the company.
Currently, AWS Direct Connect is available at one location, Equinix's colocation facility in Virginia, allowing users to connect to services in AWS's East Coast region. In the coming months, Direct Connect locations will be available in San Jose, Los Angeles, London, Tokyo and Singapore.
The new Amazon VPC offering lets customers provision instances on a private, isolated section of the AWS cloud service. "You can now build highly available AWS applications that run in multiple Availability Zones within a VPC, with multiple (redundant) VPN connections if you'd like," wrote AWS evangelist Jeff Barr in a blog post. "You can even create redundant VPCs. And, last but not least, you can do all of this in any AWS Region."
Amazon said users can define a virtual network topology that closely matches a typical network an organization might run in its own datacenter. Customers have control over the virtual networking environment, including IP address ranges, creation of subnets, and configuration of route tables and network gateways.
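To make the topology-planning idea concrete, the sketch below carves an address range into subnets and builds a simple route table using only Python's standard `ipaddress` module. It is a conceptual illustration, not the VPC API: the CIDR ranges, subnet names and gateway identifier are all hypothetical.

```python
import ipaddress

def plan_vpc(cidr, subnet_prefix, names):
    """Split a VPC CIDR block into equally sized subnets, one per name."""
    vpc = ipaddress.ip_network(cidr)
    subnets = list(vpc.subnets(new_prefix=subnet_prefix))[:len(names)]
    return {name: str(net) for name, net in zip(names, subnets)}

def route_table(subnets, gateway="igw-hypothetical"):
    """Local routes for each subnet, plus a default route to a gateway."""
    routes = [{"destination": cidr, "target": "local"} for cidr in subnets.values()]
    routes.append({"destination": "0.0.0.0/0", "target": gateway})
    return routes

# Hypothetical two-tier layout: a public web subnet and a private app subnet
# carved out of a 10.0.0.0/16 block, as an organization might do on-premises.
layout = plan_vpc("10.0.0.0/16", 24, ["web", "app"])
table = route_table(layout)
```

The point of the sketch is simply that the same addressing decisions an organization makes in its own datacenter (block size, subnet boundaries, default route) carry over directly to a virtual topology.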
In his blog post, Barr wrote:
- The VPC is available in multiple Availability Zones in every AWS Region.
- A single VPC can now span multiple Availability Zones.
- A single VPC can now support multiple VPN connections.
- You can now create more than one VPC per Region in a single AWS account.
- You can now view the status of each of your VPN connections in the AWS Management Console. You can also access it from the command line and via the EC2 API.
- Windows Server 2008 R2 is now supported within a VPC, as are Reserved Instances for Windows with SQL Server.
Lastly, AWS rolled out Identity Federation for its Identity and Access Management (IAM) service. With Identity Federation, customers can use existing enterprise identities to access AWS without having to create new ones: users create temporary security credentials that let identities from an existing directory, such as an LDAP server, use IAM's access controls.
In a separate blog post, Barr explained how the Identity Federation capability can allow organizations to request temporary security credentials. "Identity federation opens up new use cases for our enterprise customers," he noted. "You can provision temporary security credentials for identities you are responsible for managing, with no limits on the number of identities you can represent or the number of credentials you can obtain."
Posted by Jeffrey Schwartz on 08/04/2011 at 1:14 PM
Chris Kemp, who stepped down as CTO of NASA back in March, has launched a startup company called Nebula that offers a turnkey appliance based on the OpenStack platform.
OpenStack is the open source project co-developed by NASA and Rackspace Hosting. Nebula came out of stealth mode last week by announcing that it has developed an appliance that it said will allow organizations to create large private clouds using thousands of commodity computers. Nebula is named after the project Kemp oversaw at the NASA Ames Research Center.
"Until today, this computing power has only been accessible to organizations like NASA and a small number of elite Silicon Valley companies," Kemp said in a statement. "We intend to bring it to the rest of the world."
The company was seeded by Sun Microsystems co-founder Andy Bechtolsheim, along with investors David Cheriton and Ram Shriram, all of whom were among the first to invest in Google. Nebula also has secured funding from Kleiner Perkins Caufield & Byers and Highland Capital Partners.
Posted by Jeffrey Schwartz on 08/03/2011 at 1:14 PM
Looking to bolster its IT monitoring portfolio, CA Technologies has agreed to acquire Web site monitoring company Watchmouse. Terms were not disclosed.
The remote user monitoring service is a subscription-based offering that monitors the performance and availability of Web sites, services and transaction-level applications from 62 monitoring stations in more than 40 countries.
Watchmouse, based in Utrecht, Netherlands, replicates real-time transactions and creates scripts that help identify slow response times and other issues that could cause performance problems.
"The idea is as applications start to become more and more cloud-delivered, it's hard to instrument from inside," said Lokesh Jindal, senior VP of strategy and business development at Nimsoft, a division of CA that provides IT management solutions for midsized enterprises.
Watchmouse will be offered as an adjunct offering to the company's Nimsoft Monitor solution, allowing customers to ascertain how their Web sites and applications are performing from around the world. The service helps provide root causes of slow performing Web sites.
Watchmouse will also be sold as an add-on to CA's higher-end Application Performance Management (APM) solution, targeted at large enterprises. Watchmouse will offer greater visibility into application performance, whether the app is running in a datacenter, in the cloud or from a managed services provider, CA said.
Posted by Jeffrey Schwartz on 08/03/2011 at 1:14 PM
Dell is delivering an Infrastructure as a Service cloud solution based on the open source OpenStack platform.
Called the Dell OpenStack Cloud Solution, it consists of a reference architecture that outlines how to integrate the OpenStack operating system, Dell PowerEdge C servers and Crowbar, a component that installs the software on bare-metal systems. Also available are services from Dell and Rackspace Cloud Builders.
OpenStack appears to be emerging as a key open source option for deploying cloud services, and Dell's newest release is a notable milestone. The OpenStack Project, kicked off a year ago, is based on a cloud operating system developed by NASA and Rackspace that was contributed to the open source community under the Apache 2.0 license.
Dell was among 25 companies that pledged to contribute to the OpenStack project when it launched last year. Now there are more than 90 member companies, including Cisco, Citrix, Equinix, Intel, Opscode and RightScale. And Dell's largest rival, Hewlett-Packard, just became the latest to join the project.
"Since day one we investigated the code, started looking at reference architectures, figuring out how we can get a solution together to enable our customers to move on OpenStack," said Joseph George, director of marketing for Dell Cloud solutions.
While Dell engineers tweaked the code provided by NASA and Rackspace, getting an OpenStack instance up and running in a physical environment was problematic, George explained.
"It's not trivial to actually install a cloud, and it's not trivial to install an OpenStack cloud and so we decided we should automate that," he said, hence the development of Crowbar. "Crowbar is a software framework that allows users to deploy a multi-node OpenStack cloud on bare metal server technology in a matter of hours and even minutes, in some cases, as opposed to a couple of days that it would take if you were to do it manually."
Crowbar allows operators to set up the BIOS and update RAID configurations, and it performs some level of network discovery and monitoring. "At the heart of it, it's a nice, simple automated way to be able to get an OpenStack cloud environment running on your servers very quickly," he said.
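The workflow George describes (discovery, BIOS setup, RAID configuration, then the OpenStack install, repeated across many nodes) can be pictured as a per-node pipeline. The sketch below is a loose illustration of that shape; the step names and the logging structure are assumptions for the example, not Crowbar's actual design.

```python
# Hypothetical per-node provisioning pipeline, loosely modeled on the
# bare-metal workflow described in the article.
STEPS = ["discover", "configure_bios", "configure_raid", "install_openstack"]

def provision(nodes):
    """Run each provisioning step on every node, recording what was done."""
    log = []
    for node in nodes:
        for step in STEPS:
            # A real tool would execute the step here (e.g. via IPMI or PXE);
            # the sketch just records that it ran.
            log.append((node, step))
    return log

def is_ready(log, node):
    """A node is ready once every step has been recorded for it."""
    done = {step for n, step in log if n == node}
    return done == set(STEPS)

history = provision(["node-01", "node-02"])
```

The value of automating this, per the article, is turning a multi-day manual process into one measured in hours or minutes by running the same ordered steps on every node.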
Dell is releasing the code for Crowbar to the open source community with the hope that it will become a key project within the OpenStack initiative.
Posted by Jeffrey Schwartz on 07/27/2011 at 1:14 PM
Count Hewlett-Packard as the latest member company to join the OpenStack project, the open source cloud effort formed a year ago.
HP is the largest IT vendor in terms of revenue, so it can be argued that its presence is a major boost for OpenStack. Or perhaps the vendor realized that some of its key rivals -- Cisco and Dell -- were already major contributors to an effort that appears to be gaining momentum.
NASA and Rackspace contributed the code for an open source cloud operating system, released to the community under the Apache 2.0 license. Besides Cisco and Dell, the OpenStack Project boasts a laundry list of players including Citrix, Equinix, Intel, Opscode and RightScale.
"HP recognizes that open and interoperable cloud infrastructure and services are critical in delivering the next generation of cloud-based services to developers, businesses and consumers," said Emil Sayegh, HP's VP of Cloud Services, in a blog post. "HP is taking an active role in the OpenStack community and we see this as an opportunity to enable customers, partners and developers with unique infrastructure and development solutions across public, private and hybrid cloud environments."
Sayegh said HP will sponsor the OpenStack Design Summit and OpenStack Conference, slated for October in Boston.
Posted by Jeffrey Schwartz on 07/27/2011 at 1:14 PM
Capgemini, the global consulting and systems integration giant, is putting a major emphasis on Microsoft's Windows Azure cloud platform.
The company last week said it has agreed to align its resources around Windows Azure and to market and deliver services based on the platform. The two companies have a longstanding partnership, and this agreement is an extension of that. Capgemini's Microsoft-related business last year was $1.7 billion.
A key component of the agreement is a commitment to train 1,500 of Capgemini's architects on Windows Azure, SQL Azure and the Azure AppFabric.
"It is our intent to cross-train roughly half of our .NET development staff because it's our belief that Azure and the Azure development environment and the apps specifically are going to be how next-generation applications are built," said Don Jones, Capgemini's Group VP for global channels and partners. "So Azure will be our default development environment for cloud-based applications going forward."
While Capgemini also has partnerships with Google, IBM and Amazon Web Services, it appears the SI is putting its emphasis on bringing full cloud solutions to customers via Windows Azure.
"What we think for the vast majority of our clients, Azure is going to be the right application development environment from a Platform as a Service perspective," Jones said. "If you look at the ISV community who are developing applications for Azure, it's our intent to become an orchestration service for that."
One such ISV is Geneva-based Temenos, which offers systems software for banks. Temenos announced last week that in May it deployed its T24 system on Windows Azure for a network of six Mexican microfinance institutions (MFIs).
"It is our intent to go industry by industry and identify the top three to five Azure-based solutions and take those to market and draft Azure behind those," Jones said. "A core competency of Capgemini is working with these ISVs and bringing those solutions to market."
Capgemini intends to offer the Windows Azure service across 22 countries, with an initial emphasis on the United States, Canada, the United Kingdom, the Netherlands, France, Belgium and Brazil.
Posted by Jeffrey Schwartz on 07/26/2011 at 1:14 PM
The OpenStack consortium is readying the next release of its open-source cloud operating platform with a target date of late September.
OpenStack, a project launched a year ago this week by Rackspace Hosting and NASA, now is 80 member companies strong. The last release of the OpenStack operating system, Cactus, came out in April. The group is now moving to a six-month release cycle, said Jonathan Bryce, chairman of the OpenStack Project Policy Board and a founder of the Rackspace Cloud.
The next release is called Diablo. I chatted about Diablo with Bryce, who believes it will further broaden the appeal of OpenStack to both emerging service providers and large enterprises.
"The Diablo release is going to be a really good release for the traditional and average IT shop that's out there," Bryce said.
While Diablo will include a number of new features, Bryce emphasized three of them: improved networking, identity management and new service provider capabilities.
The networking in the first few versions of OpenStack Compute has been built around how you take network traffic and get it to and from virtual machines. "It's IP assignment and some basic routing. It's not a full networking stack by any means," he said.
With contributions from a number of players including notably Cisco, Diablo will step up the networking capabilities of OpenStack by dealing with the switching and routing side of things, "and even concepts of moving from having a network card and a virtual machine that you need to get traffic to, to actually dealing with ports on both sides of the line and coming up with a real model for virtualizing all of that so you can control all of the aspects of the networking of your datacenter," he explained.
The approach is to set up an interface, an API, where you can configure and create new ports, establish new connections between ports, create new routes and manage it all, he added.
"This is why Cisco has gotten involved, so you get to the point where you can really create some powerful automation at all layers of the network, and virtualize pieces of it, use physical hardware for pieces of it, but you control it all through one interface that is tied into this whole OpenStack cloud operating system."
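The port-centric model Bryce sketches -- create ports, connect them, and manage everything through one interface -- can be illustrated with a tiny in-memory API. This is purely a conceptual sketch; the class and method names are invented for the example, not the networking API that actually shipped in OpenStack.

```python
class VirtualNetwork:
    """Toy model of a port-based virtual network: create ports, wire them up,
    and query connectivity through a single interface."""

    def __init__(self):
        self.ports = {}     # port id -> owner (a VM, a virtual switch, etc.)
        self.links = set()  # unordered pairs of connected port ids

    def create_port(self, port_id, owner):
        self.ports[port_id] = owner
        return port_id

    def connect(self, a, b):
        """Establish a link between two existing ports."""
        if a not in self.ports or b not in self.ports:
            raise KeyError("both ports must exist before connecting")
        self.links.add(frozenset((a, b)))

    def peers(self, port_id):
        """Every port directly linked to the given port."""
        return {p for link in self.links if port_id in link
                for p in link if p != port_id}

net = VirtualNetwork()
net.create_port("vm1-eth0", "vm1")
net.create_port("sw0-p1", "switch0")
net.connect("vm1-eth0", "sw0-p1")
```

The shift the quote describes is visible even in this toy: instead of reasoning about "a network card on a VM that traffic must reach," both sides of every link become first-class ports that one API can create, connect and inspect.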
Next is identity. A new feature called Keystone provides an identity and authentication management service. Keystone can tap into existing authentication systems such as Active Directory and LDAP, but it also provides the multitenant concepts that the compute and object storage services need to keep users' data and virtual machines separate and secure, and it gives a single source of truth on the identity side, Bryce explained.
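The multitenant separation Bryce describes can be sketched as a token scoped to a tenant, which services then use to filter what a caller may see. The sketch below is a minimal illustration of that concept only; the names and shapes are invented and real Keystone's API differs.

```python
import uuid

class TokenService:
    """Toy tenant-scoped token issuer: one token grants access to one tenant."""

    def __init__(self):
        self.tokens = {}  # token -> (user, tenant)

    def authenticate(self, user, tenant):
        token = uuid.uuid4().hex
        self.tokens[token] = (user, tenant)
        return token

    def tenant_for(self, token):
        return self.tokens.get(token, (None, None))[1]

def list_servers(inventory, token_service, token):
    """Return only the servers belonging to the token's tenant -- how a
    compute service might keep tenants' resources separate."""
    tenant = token_service.tenant_for(token)
    return [name for name, owner in inventory if owner == tenant]

svc = TokenService()
inventory = [("web-1", "acme"), ("db-1", "acme"), ("web-1", "globex")]
acme_token = svc.authenticate("alice", "acme")
```

The "single source of truth" point is that every service consults the same token service for the tenant scope, rather than each keeping its own notion of who owns what.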
Finally, Diablo will gain new service provider features. Among those Bryce pointed out are usage tracking for billing and chargeback, and monitoring the health of a service. Today, usage data can be pulled out of the system, but it is not clean and separated enough to feed directly into a billing or accounting system, he said.
Some of the OpenStack developers are working on an interface meant to pull data out of all of these systems, whether usage data for billing or monitoring data for health checks, Bryce said. "It's basically a system to subscribe to these feeds of data that you need so there's a unified framework for how that data is being published and exported."
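The "subscribe to feeds of data" idea can be sketched as a small publish/subscribe hub that tags each record by feed, so a billing system and a monitoring system each consume only what they need. This is a conceptual sketch with invented names, not the interface the OpenStack developers were building.

```python
from collections import defaultdict

class MeteringHub:
    """Toy pub/sub hub: services publish records to named feeds, and
    consumers subscribe to just the feeds they care about."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, feed, handler):
        self.subscribers[feed].append(handler)

    def publish(self, feed, record):
        for handler in self.subscribers[feed]:
            handler(record)

hub = MeteringHub()
billing, health = [], []
hub.subscribe("usage", billing.append)   # the billing system wants usage only
hub.subscribe("health", health.append)   # monitoring wants health checks only
hub.publish("usage", {"tenant": "acme", "vcpu_hours": 12})
hub.publish("health", {"service": "compute", "status": "ok"})
```

The unified part is the hub itself: every service exports through one publishing path, and the cleanup problem Bryce describes (separating billing-grade data from everything else) is handled by the feed names rather than by each consumer scraping raw system state.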
Posted by Jeffrey Schwartz on 07/21/2011 at 1:14 PM
It was a stellar quarter for IBM as the company's reported second quarter revenues of $26.7 billion were up 12 percent, and net income of $3.7 billion was up 8 percent. That's certainly not a bad way to cap off a quarter in which the company turned 100 years old.
Buoyed by strong services, software and hardware revenues, Big Blue also talked up its cloud business, which is on track to double this year. Within its Integrated Technology Services business, cloud-based services revenue grew 200 percent. Since IBM did not break out those revenues, it's hard to get too excited about that figure for the moment.
However, IBM said it has won 2,000 cloud deals so far this year, and the average private cloud transaction has tripled, said IBM's senior VP and CFO Mark Loughridge in his prepared remarks during the company's earnings call. "In the first half of 2011, cloud revenue has already exceeded our full year 2010 results, keeping us on track to double our cloud revenue for the year," Loughridge said.
IBM beefed up its cloud offerings earlier this year, when it launched SmartCloud, a product designed to enhance the company's public cloud infrastructure services, and Workload Deployer, a private cloud solution.
Posted by Jeffrey Schwartz on 07/20/2011 at 1:14 PM
Looking to add networking punch to its datacenter and cloud solutions portfolio, Dell on Wednesday said it has agreed to acquire Force10 Networks.
Terms of the deal, set to close later this summer, were not disclosed, though the San Jose, Calif.-based company posted $200 million in revenues over the past 12 months.
Dell said Force10 lets customers "transform their network infrastructures into an open, reliable and scalable datacenter and cloud computing fabric" via its Open Networking framework, which Dell said is based on open standards, automation and virtualization.
Though it doesn't have the installed base of some of its larger networking rivals, Force10 counts as its customers Web 2.0 and Fortune 100 companies, telecommunications carriers, research laboratories and government organizations. The company sells its networking gear in 60 countries, but 80 percent of its business comes from North America.
Dell said it intends to maintain and expand Force10's existing channel partner program just as it did with its acquisitions of Compellent and EqualLogic.
Posted by Jeffrey Schwartz on 07/20/2011 at 1:14 PM
This week marked the debut of the first cloud-focused ETF, a fund composed of cloud computing companies.
The release of the First Trust ISE Cloud Computing Index Fund (ticker: SKYY) raised the question: Are cloud stocks going the way of dotcom holdings more than a decade ago?
In a short segment Thursday, CNBC asked two analysts to weigh in on the matter: Brad Whitt, enterprise software analyst at Gleacher and Co., and James Staten, vice president and principal analyst at Forrester Research.
Whitt is bullish on cloud stocks, pointing to the fact that cloud companies have executed well since the end of 2009 and he doesn't believe that their market valuations are overheated.
"We've seen the metrics continue to accelerate in the past six or seven quarters," Whitt said. "We are also seeing tremendous end user demand. We attend a lot of user conferences and trade shows and we continue to see a lot of demand. Systems integrators are definitely getting behind the cloud computing initiative, and lastly I think the expectations in valuations are reasonable. We are nowhere near the dotcom valuation levels, so if the companies continue to execute, which we think they will, the end user demand is still there."
Forrester's Staten was more bearish, arguing the hype might be outstripping demand for cloud services.
"The hype around how much cloud demand is out there is beyond what really we're seeing," Staten said. "We're definitely seeing enterprises that are interested in using the cloud and we see them actually putting things in there, but they're not getting ready to shut down their datacenters and move it all in there, and that's a lot of what is behind all the hype."
Whitt countered that the companies he considers cloud stocks are all seeing revenue growth of anywhere from 20 to 40 percent.
"That's not what I consider hype growth-type expectations," Whitt said. "I'd agree with James that companies are going to move slow. They're going to start with private clouds, they're going to adopt Software as a Service, which we've seen from companies like Salesforce."
Posted by Jeffrey Schwartz on 07/08/2011 at 1:14 PM