Dell Boomi on Tuesday released an upgrade to its AtomSphere cloud integration service, adding the ability to use the collective intelligence of its customer base to map applications and data.
It's been nearly one year since Dell said it agreed to acquire Boomi, whose AtomSphere service is designed to simplify integration between cloud and premises-based applications. The new release, dubbed AtomSphere Fall 11, is aimed at addressing more complex integration requirements.
"The interesting trend here is that the integrations that we're seeing that our customers are using Boomi for are increasing in complexity," said Rick Nucci, founder and CTO of Dell Boomi, during a briefing with press and analysts. "The average number of applications connected today across Boomi's customer base is 11, which is a nice indicator of increasing complexity in integration that Boomi is helping to simplify."
Boomi Suggest, launched last year, allows customers to share an anonymous index of every data map built by Boomi users. Nucci said while customers have the option of opting out, so far no one has done so. The index suggests how developers should be mapping their data fields. When Boomi Suggest was launched, there were 5,000 maps in the index; now there are 50,000. And within those 50,000 maps, over 1.5 million individual data mappings are in that index, Nucci said.
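The idea behind Boomi Suggest can be illustrated with a short sketch: given an anonymized index of field mappings that other customers have built, suggest the most frequently chosen target for a given source field. This is a hypothetical illustration of the pattern, not Boomi's actual algorithm; the index structure and function names are assumptions.

```python
from collections import Counter

# Hypothetical anonymized index: (source_field, target_field) pairs
# aggregated from data maps that customers have shared.
mapping_index = [
    ("bill_to_name", "BillingName"),
    ("bill_to_name", "BillingName"),
    ("bill_to_name", "CustomerName"),
    ("ship_city", "ShippingCity"),
]

def suggest_mapping(source_field, index):
    """Return the most common target mapped from source_field,
    plus the share of maps that made that choice."""
    targets = Counter(t for s, t in index if s == source_field)
    if not targets:
        return None
    target, count = targets.most_common(1)[0]
    return target, count / sum(targets.values())

print(suggest_mapping("bill_to_name", mapping_index))
# "BillingName" was the choice in 2 of the 3 matching maps
```

At Boomi's scale (50,000 maps, 1.5 million individual mappings), the same frequency lookup simply runs over a much larger index.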
With AtomSphere Fall 11, Dell Boomi is adding a mapping function that lets customers build field-level transformations on data.
Dell Boomi has also added the following new features in the release:
- Business Rules: Users can now build multi-step conditional business rule logic into their integration workflows.
- Partner Dashboards: These provide the ability for Boomi partners to create centralized dashboards. Currently, Dell Boomi has 70 ISV and systems integration partners, and the Partner Dashboards let them have centralized control of data and application integrations, while enabling them to monitor usage detail.
- Expanded trust.boomi.com: Launched last year, this feature is designed to provide transparency of Dell Boomi's operational performance. Now the company is extending that capability by providing "improved transparency," where metrics will be published.
- Predictive Assistance: Customers can gather return-on-investment metrics using analytics that measure usage characteristics.
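The Business Rules feature above describes multi-step conditional logic inside an integration workflow. As a rough sketch of that general pattern (hypothetical, not Boomi's actual rule format), a rule set can be modeled as an ordered list of condition/action steps applied to each record:

```python
# Hypothetical multi-step business rules: each step pairs a condition
# with an action, and a record flows through the steps in order.
def apply_rules(record, rules):
    for condition, action in rules:
        if condition(record):
            record = action(record)
    return record

rules = [
    # Route large orders for manual review.
    (lambda r: r["amount"] > 10_000, lambda r: {**r, "route": "review"}),
    # Flag records missing a customer ID.
    (lambda r: not r.get("customer_id"), lambda r: {**r, "error": "no customer"}),
]

print(apply_rules({"amount": 25_000, "customer_id": "C42"}, rules))
```

A visual rule builder like Boomi's generates the equivalent of this step list from configuration rather than code.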
The new features in Dell Boomi AtomSphere are available now, though Business Rules is only offered with Dell Boomi AtomSphere Professional or higher. Pricing starts at $550 per month.
Posted by Jeffrey Schwartz on 10/25/2011 at 1:14 PM
It seems everyone wants a piece of the cloud storage pie.
Cloud storage provider Dropbox this week received a whopping $250 million infusion from Index Ventures, with new investors Benchmark Capital, Goldman Sachs, Greylock Partners, Institutional Venture Partners, RIT Capital Partners and Valiant Capital Partners also contributing.
Dropbox is the subject of a cover story in the Nov. 7 issue of Forbes, which recounts a meeting nearly two years ago between founder Drew Houston and Apple founder Steve Jobs, who apparently had an interest in Dropbox. Jobs, who passed away earlier this month after a long battle with pancreatic cancer, warned Houston that Apple was looking to develop a cloud storage service like Dropbox. That service, of course, is iCloud, which debuted last week.
And yet another cloud storage high-flyer, Box.net, last week received $81 million from Salesforce.com, SAP Ventures and Bessemer Venture Partners. The infusion brings Box.net's total amount of funding raised up to $162 million, giving it a reported $600 million valuation.
Box.net reportedly spurned a $600 million takeover bid from Citrix, according to Forbes. Citrix, which now views cloud storage and document management as a key pillar of its desktop virtualization story, instead last week said it has acquired ShareFile for an undisclosed amount.
Like Box.net and Dropbox, ShareFile allows business users to store and synchronize files across multiple devices.
"Think of this as iCloud for the enterprise," said Sumit Dhawan, a Citrix Group VP. "iCloud is designed for consumers for sharing their pictures and videos and keeping it online along with your consumer apps. Think of [ShareFile] as enterprise-grade, with all the capabilities that enterprises need, including encryption, control and policy and things like that where you give users the flexibility to have access to their documents and data in the cloud on any device, while at the same time ensuring IT does not lose control but actually regains control so that users are not using unsanctioned ways of sending data to these devices where all of a sudden where you are compromising all IT policies and running into compliance issues."
Dropbox, which says it has 45 million users, is primarily a consumer-based cloud storage service, though individuals certainly are known to use it for business purposes. By comparison, Box.net and ShareFile bill themselves as cloud services providers for enterprises with an emphasis on document sharing and collaboration.
ShareFile, based in Raleigh, N.C., launched its first service in 2005 and says it now has 17,000 corporate customers. With the acquisition of ShareFile, Citrix said it will offer "follow-me-data" capabilities, allowing users to access data from any device and to collaborate with others.
Box.net said it has 7 million individuals among 100,000 enterprises using its service, including an 18,000-user deal with Procter & Gamble. Next month Box.net plans to launch the Box Innovation Network, aimed at enabling developers to integrate their applications with the company's cloud service.
The company said it already integrates with 120 applications from the likes of Google (Apps), Salesforce.com, NetSuite and SAP. The company said it plans to use the funding to expand its international presence and to extend its U.S. infrastructure, including the launch of a third datacenter next year. Box.net and Citrix say they plan to boost their ISV and partner ecosystems.
For its part, Dropbox believes it has only addressed a sliver of its addressable market of 2 billion Internet-connected users, Houston told Reuters.
While anything is possible, it doesn't look like either of these companies is going to get scooped up any time soon. As larger rivals such as Apple, Google, Microsoft and now Citrix, among others, build out their cloud storage capabilities, it will be interesting to see to what extent Box.net and Dropbox upset the status quo.
Posted by Jeffrey Schwartz on 10/20/2011 at 1:14 PM
Enterprises wanting to move from in-house e-mail, productivity and collaboration software to Microsoft's new Office 365 may have to embark on upgrades they don't want to make. For example, to get the most out of Office 365's enterprise edition, shops need to run current versions of Office and be able to provide single sign-on via Microsoft's Active Directory Federation Services (ADFS).
Third-party cloud providers are finding ways to help customers circumvent such requirements.
The latest to do so is a company called CenterBeam. The San Jose, Calif.-based managed services provider this week launched CenterBeam 365+ and has come out swinging at the cloud-based offerings from Google, Microsoft and others.
"We believe most of the players out there are delivering consumer-class services," said Shahin Pirooz, CenterBeam's CTO. "The solutions are vanilla and the intention is they don't offer a lot of customizing. With a lot of the systems, one size fits all. There's a mentality of 'figure it out yourself,' if you will, whether it's through a partner or through a third party."
For its part, closely held CenterBeam has been around since 1999, has 185 employees, serves users in 49 countries and claims it is profitable. The company says it developed the first multi-tenant hosted version of Exchange in conjunction with Microsoft back in 1999. At the time, Microsoft had invested in the company to gain access to CenterBeam's IP, said Karen Hayward, the company's CMO. CenterBeam's services are geared toward mid-sized enterprises with anywhere from 100 to 4,000 PC users (typically distributed) or with revenues ranging from $10 million to $1 billion.
With CenterBeam 365+, customers can use Office 2003, Active Directory 2000 and .PST files, while enjoying greater flexibility for customization versus Microsoft's Office 365, company officials said. CenterBeam 365+ is available with Office Web Apps. And CenterBeam has a 24x7 helpdesk staffed with level 2 engineers, many of whom have Microsoft and other (such as Apple, Cisco, Citrix and VMware) certifications. The engineers are equipped to remote into user desktops, if needed, Pirooz said.
"One of the key differentiators we bring to the table is how cloud can play in a true hybrid environment with integration back into the customer's environment," he said. "This model gives our customers the ability to move at their pace on their timing and move the pieces and parts that they feel appropriate."
Pirooz said CenterBeam 365+ offers other features that mimic the in-house Exchange environment, such as support for public folders, BlackBerry application push, global address list segmentation and the ability to provision users in multiple datacenters. "The solution has all the bells and whistles and advantages of an enterprise-class model that can be customized, but it has the financial and scale benefits of a cloud-based model," he said. "We deliver similar functionality to customers as if they had their on-premise environment."
With the release of CenterBeam 365+, the company is looking to extend its reach. It is doing so by launching its first-ever partner program. CenterBeam is recruiting both referral partners and resellers, Hayward said.
The company's key appeal to partners is that it will allow them to control the billing relationship with customers, something Microsoft allows only with its largest partners. "We allow the partner to bill the customer," Hayward said. The company handles the onboarding of customers for partners, she added.
Pricing starts at $10 per user per month and runs up to $21 for the highest grade of service.
Posted by Jeffrey Schwartz on 10/19/2011 at 1:14 PM
Google has been busy over the past few weeks upgrading its Google App Engine cloud service.
The company last week said it has updated its cloud storage, added premium support and released the preview of a new cloud database service. Google App Engine is the company's Platform as a Service (PaaS) cloud offering, designed for those who want to build and host their applications.
More than 200,000 developers have built apps that run atop Google's cloud services, claimed Google Group product manager Jessie Jiang in a blog post. Google appears to be hoping that its newly announced upgrades to Google App Engine will make its cloud platform more appealing to larger enterprises.
Google Storage for Developers, introduced earlier this year, is now called Google Cloud Storage. The company has lowered the price of the storage service and is no longer charging upload fees. Larger customers are eligible for volume discounts, said Google Cloud Storage product manager Navneet Joneja in a blog post.
Pricing starts at 13 cents per gigabyte and scales down to 10.5 cents per gigabyte for up to 90 TB. For those requiring additional storage, customers can contact the company.
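As a worked example of this kind of tiered pricing, a monthly storage bill can be computed by charging each gigabyte at the rate of the tier it falls into. The tier boundary below is an assumption for illustration; Google's published schedule has more tiers between the 13-cent and 10.5-cent rates.

```python
# Hypothetical marginal-rate tiers: the first 1,024 GB at $0.13/GB,
# the remainder at $0.105/GB. Boundaries are illustrative only.
TIERS = [(1024, 0.13), (float("inf"), 0.105)]

def monthly_storage_cost(gb):
    """Charge each gigabyte at the rate of the tier it falls into."""
    cost, remaining, prev_cap = 0.0, gb, 0
    for cap, rate in TIERS:
        in_tier = min(remaining, cap - prev_cap)
        cost += in_tier * rate
        remaining -= in_tier
        prev_cap = cap
        if remaining <= 0:
            break
    return round(cost, 2)

print(monthly_storage_cost(500))   # entirely within the first tier
print(monthly_storage_cost(5000))  # spans both tiers
```

Under these assumed tiers, 500 GB costs $65.00 a month, while 5,000 GB blends the two rates.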
The company also added two experimental features to Google Cloud Storage. One is the ability to read and write data using App Engine's Files API. The other is an API that gives users detailed usage information.
With the new Google App Engine Premier Accounts, customers will receive a 99.95 percent uptime service-level agreement and will be permitted to host an unlimited number of apps on their accounts. Also, there are no minimum monthly fees for each app. Customers only pay for what they use, Google said.
The accounts cost $500 a month, with support available only during business hours, not weekends or holidays. Customers are also expected to first "use reasonable efforts to fix any error, bug, malfunction or network connectivity defect without escalation to Google," according to the technical support guidelines.
Google also launched the preview of Google Cloud SQL, its cloud-based database. Google characterized a cloud database as one of the most-requested features for App Engine.
The preview, which allows customers to run MySQL databases in the cloud, is designed to support applications developed in Java or Python, allows for database instances up to 10 GB, and supports both asynchronous and synchronous replication.
Posted by Jeffrey Schwartz on 10/18/2011 at 1:14 PM
IBM's SmartCloud public and private cloud portfolio is being fleshed out in answer to growing demand from IT and enterprise users.
Based on an IBM survey of 500 enterprise IT and business executives, 33 percent have deployed more than one cloud pilot to date, a figure poised to double by 2014. The survey also found that 40 percent see the cloud as bringing "substantial change" to their IT environments. IBM said it will be supporting 200 million users in the cloud by the end of next year.
"It's clear to us, what we're seeing is a fundamental transformation of how our clients are trying to change the economics of their IT environment and speed the delivery of new innovative products and services," said Scott Hebner, VP of market strategy at IBM Tivoli.
On the public cloud front, IBM plans to launch a platform-as-a-service (PaaS) called SmartCloud Application Services, which will consist of a managed services offering that will provide application infrastructure. The service will offer application lifecycle management, application integration and the ability to manage applications in the hosted environment.
The new PaaS offering, due to go into beta later this quarter, will run atop IBM's SmartCloud Enterprise and SmartCloud Enterprise+, the company's public infrastructure-as-a-service (IaaS) offerings announced in April. IBM this week announced the commercial availability of the IaaS offerings in the United States, with full global availability slated for the end of 2012.
Building on its private cloud portfolio, IBM launched its SmartCloud Foundation, for customers looking to build and operate cloud infrastructures internally. The company introduced three offerings:
- SmartCloud Entry: a starter kit that allows organizations to build private clouds running on standard x86 or IBM's Power processor-based infrastructure. The modular offering lets organizations scale as demand for capacity increases.
- SmartCloud Provisioning: Software consisting of a provisioning engine and an image management platform that can spin up more than 4,000 virtual machines in less than an hour, IBM said.
- SmartCloud Monitoring: A tool that provides views of virtual and physical environments including storage, networks and servers. It offers predictive and historical analytics designed to warn IT admins of potential outages.
IBM also announced the availability of its SAP Managed Application Services. Announced back in April, the service will allow for the automated provisioning and management of SAP environments.
Posted by Jeffrey Schwartz on 10/14/2011 at 1:14 PM
Microsoft's SQL Server database server platform and its cloud-based SQL Azure may share many core technologies but they are not one and the same. As a result, moving data and apps from one to the other is not all that simple.
Two companies this week set out to address that during the annual PASS Summit taking place in Seattle. Attunity and CA Technologies introduced tools targeted at simplifying the process of moving data from on-premises databases to SQL Azure.
Attunity Replicate loads data from SQL Server, Oracle and IBM's DB2 databases to SQL Azure and does so without requiring major development efforts, claimed Itamar Ankorion, Attunity's VP of business development and corporate strategy.
"The whole idea is to facilitate the adoption of SQL Azure, allow organizations and ISVs to benefit from cloud environments and the promise of SQL Server in the cloud," Ankorion said. "One of the main challenges that customers have is how do they get their data into the cloud. Today, it requires some development effort, which naturally creates a barrier to adoption, more risk for people, more investment, with our tools, it's a click away."
Attunity Replicate does so by using Microsoft's SQL Server Integration Services, which provides data integration and transformation, and Attunity's change data capture (CDC) technology, designed to efficiently process and replicate data as it changes. "We wanted to basically provide software that would allow you to drag a source and drag a target, click, replicate and go," Ankorion said.
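Change data capture, in general, works by recording each insert, update or delete as it happens and replaying only those changes at the target, rather than re-copying whole tables. A minimal sketch of the pattern using SQLite triggers follows; it is purely illustrative, as Attunity's product reads database transaction logs and is far more sophisticated.

```python
import sqlite3

# Source database with a trigger that records inserts in a change
# table -- the core bookkeeping behind change data capture (CDC).
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE changes (op TEXT, id INTEGER, amount REAL);
    CREATE TRIGGER orders_ins AFTER INSERT ON orders
    BEGIN
        INSERT INTO changes VALUES ('INSERT', NEW.id, NEW.amount);
    END;
""")

src.execute("INSERT INTO orders VALUES (1, 99.50)")
src.execute("INSERT INTO orders VALUES (2, 12.00)")

# Replicate: apply only the captured changes at the target, then
# clear the change table so the next pass ships just the deltas.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
for op, oid, amount in src.execute("SELECT * FROM changes"):
    if op == "INSERT":
        tgt.execute("INSERT INTO orders VALUES (?, ?)", (oid, amount))
src.execute("DELETE FROM changes")

print(tgt.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 2
```

Because only the change table is shipped, subsequent synchronization passes move just the rows that changed since the last run.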
For its part, CA rolled out a new version of its popular ERwin data modeling tool for SQL Azure. CA ERwin Data Modeler for SQL Azure lets customers integrate their in-house databases with SQL Azure.
"CA ERwin Data Modeler for Microsoft SQL Azure provides visibility into data assets and the complex relationships between them, enabling customers to remain in constant control of their database architectures even as they move to public, private and hybrid IT environments," said Mike Crest, CA's general manager for data management, in a statement.
CA's tool provides a common interface for combining data assets from both premises and cloud-based databases. Customers can use the same modeling procedure to maintain their SQL Azure databases.
Posted by Jeffrey Schwartz on 10/13/2011 at 1:14 PM
Looking to quell criticism that it has too much control over the OpenStack Project, Rackspace has agreed to establish an independent foundation next year that will assume ownership and governance for the open source cloud platform.
Rackspace had been under pressure to make such a move, and did so last week, announcing its intention to form the foundation during the OpenStack Conference in Boston.
"This marks a major milestone in the evolution of OpenStack as a movement to establish the industry standard for cloud software," said Mark Collier, Rackspace's VP of business and corporate development, in a blog post. "The promise of a vendor-neutral, truly open cloud standard is within reach. By doing this important work together, as a community, we can achieve something much bigger with a lasting impact on the future of computing."
NASA and Rackspace developed OpenStack and made the code freely available under the terms of the Apache 2.0 license last year. More than 100 companies have since jumped on the OpenStack bandwagon, including Canonical, Citrix, Cisco, Dell, Hewlett-Packard, Intel and SuSE. Many have contributed code and have committed to developing cloud products and services based on OpenStack. While the project has grown, Rackspace's control over the effort was an ongoing concern.
The company held a meeting last Thursday to take the first step toward creating the foundation. According to the meeting notes of Scott Sanchez, director of business development for Rackspace Cloud Builders, the gathering was a session to answer questions, gather input and for the company to explain its intentions. No decisions were made regarding how the foundation will be structured or funded.
Lew Moorman, Rackspace's president and chief strategy officer, told attendees that the company's motives were not to defray costs or re-assign his company's personnel to other tasks but rather to prevent OpenStack from "forking," according to the meeting notes.
One concern was the potential for the transition process to cause new member companies to delay joining, since they will want to see how the organization is structured. The issue, noted Moorman, is to "ensure long term independence for OpenStack," while not creating short-term barriers to progressing the effort.
Rackspace moved forward this week, creating a mailing list aimed at getting the discussion going. The initial discussion will revolve around determining the foundation's mission and scope, Collier noted in an updated blog post.
It appears Rackspace is taking an important step that should be welcomed by the OpenStack community and future stakeholders. The challenge will be to get everyone on board with the structure and funding, neither of which is trivial, while not moving so slowly that the process ends up in limbo.
What's your take on Rackspace's decision to move OpenStack to an independent foundation? Leave a comment below or drop me a line at firstname.lastname@example.org.
Posted by Jeffrey Schwartz on 10/12/2011 at 1:14 PM
In a move that will boost its portfolio of high-performance computing (HPC) and cloud management software, IBM on Tuesday said it has agreed to acquire Platform Computing for an undisclosed sum.
Founded 19 years ago, Toronto-based Platform is regarded as a leading provider of workload management software for distributed, clustered and grid computing environments. IBM boasts Platform has more than 2,000 customers, including 23 of the world's 30 largest enterprises. Among them are the European Organization for Nuclear Research (better known as CERN), Citigroup and Pratt & Whitney.
"IBM considers the acquisition of Platform Computing to be a strategic element for the transformation of HPC into the high growth segment of technical computing and an important part of our smarter computing strategy," said Helene Armitage, general manager of Systems Software at IBM, in a statement.
That strategy includes allowing customers to move HPC workloads to public and private clouds. Among its offerings is Platform ISF, a private cloud management tool that manages workloads across various virtual machines, operating systems, cloud resources and physical and virtual servers.
Customers can also extend clusters to cloud and hosting providers with Platform LSF, a workload management platform, and Platform MultiCluster, a cluster consolidation tool, enabling policy-based distribution of workloads between in-house HPC clusters and cloud-based resources.
The addition of Platform will augment IBM's existing efforts to bridge HPC-based applications to the cloud. Big Blue's HPC Management Suite for Cloud enables provisioning of different operating system images on bare metal hardware and virtual machines, provides access to the HPC infrastructure via a Web-based management interface and allows for the sharing of HPC resources.
Posted by Jeffrey Schwartz on 10/11/2011 at 1:14 PM
Adobe Systems has been slow to move its traditional desktop software business to the cloud, but the company will take a key step forward to change that when it lets users of its Creative Suite of apps share and synchronize content through a new cloud service it plans to launch next month.
The company announced Creative Cloud at its annual AdobeMAX 2011 conference in Los Angeles this week. Initially, the service will offer 20 GB of storage capacity to users of Adobe Touch Apps, also launched this week, and the flagship Adobe Creative Suite, enabling collaboration and sharing of the content created with the software.
Adobe Creative Cloud will be the hub for Adobe Touch Apps, designed to allow creative professionals to deliver content that runs on tablet devices. Content developed with Adobe Touch Apps can be shared and viewed across devices and transferred to Adobe Creative Suite CS5.5, the company said.
Early next year, the service will offer access to Adobe's flagship Creative Suite tools, which include Photoshop, InDesign, Illustrator, Dreamweaver, Premiere Pro, Edge and Muse.
"The move to the Creative Cloud is a major component in the transformation of Adobe," said Kevin Lynch, Adobe's chief technology officer, in a statement
Pricing for the service will be announced next month.
Posted by Jeffrey Schwartz on 10/06/2011 at 1:14 PM
Hosting provider Savvis this week said it will offer Microsoft's SQL Server and Oracle's Enterprise 11g RAC databases in the cloud.
Savvis said its new Symphony Database lets customers provision the databases without having to license the software or acquire hardware, while providing a scale-up and scale-down architecture.
"Unlike traditional database offerings, Symphony Database does not require hardware provisioning and software licensing, freeing enterprises from long-term contracts and expenses," said Brent Juelich, Savvis senior director of managed services, in a statement.
The database offering is the latest in a series of new services added by Savvis, which earlier this year was acquired by CenturyLink for $2.5 billion. The company also recently launched its Virtual Private Data Center Premier offering, aimed at providing a higher level of performance, security and support for mission-critical applications.
Savvis is in the midst of expanding its datacenters in North America. The company added new capacity in Atlanta and Seattle and is set to expand its facilities in Boston, Piscataway, N.J. and Toronto in the coming weeks.
Posted by Jeffrey Schwartz on 10/06/2011 at 1:14 PM
Looking to convince large enterprises to use its broader suite of infrastructure and platform cloud services, Google has launched its Cloud Transformation Program.
To date, much of the company's enterprise cloud emphasis has focused on Google Apps, its suite of e-mail, calendaring, collaboration and productivity tools. Now the company is looking to extend its other cloud offerings, notably its Google App Engine Platform as a Service (PaaS), to large enterprises.
The company has tapped seven partners to help large organizations use its cloud services, including App Engine, Google Storage for Developers, Google Apps Script and the Google Prediction API.
The partners include CSC, Cloud Sherpas, Cognizant, Opera Solutions, Razorfish, SADA Systems and TempusNova. Google said it intends to bring onboard additional cloud implementation partners.
Google wants enterprises to use its cloud services to build Web sites, mobile, social media, business process and customer-facing applications using App Engine and Apps Script, said Rahul Sood, global head of enterprise partnerships, in a blog post.
With the Prediction API, Google sees customers building apps that detect fraud and analyze customer churn, for example. And with Google Storage for Developers, Google is pitching an array of services such as backup and data sharing.
Posted by Jeffrey Schwartz on 10/06/2011 at 1:14 PM
Startup Piston Computing came out of stealth mode this week, introducing a hardened operating system based on the open source OpenStack project for private enterprise clouds.
Piston is led by CEO and co-founder Joshua McKenty, who was technical lead and cloud architect of NASA's Nebula Cloud Computing Platform. NASA and Rackspace co-founded the OpenStack Project. Just last month, former NASA CTO Chris Kemp launched Nebula, which offers a turnkey appliance based on the OpenStack platform. McKenty left NASA last summer to launch Piston with the goal of bringing private clouds like Nebula to enterprises based on OpenStack.
McKenty maintains that much of the attention on OpenStack has been on the potential for service providers to build clouds based on the open source platform, but there has been little emphasis on opportunities for private clouds.
"A lot of the early contributions were around service provider requirements and there seemed to be more and more focus on that side of the story," McKenty said. "We had enterprise customers showing up at every [OpenStack] Design Summit saying, 'Hey what about our needs? We need things to deal with regulatory compliance and security and we know NASA worked on these -- why aren't they in the code base?' We really set out to rectify that. In a lot of ways I'm trying to finish what I started [at NASA]."
Piston is launching pentOS, which stands for Piston Enterprise Operating System. The three key attributes of pentOS that Piston is emphasizing are its built-in security, interoperability and ease of deployment.
McKenty said pentOS is based on what the company calls a "null-tier" architecture that integrates compute, storage and networking on every node, providing a massively scalable platform.
Thanks to a hardened custom-built Linux distribution, pentOS is secure, McKenty said. Enabling IT to securely deploy pentOS is a feature called Cloud Key, which allows for the automated distribution of the software onto servers and switches via a USB stick. Admins can configure the OS on a laptop and then install it onto the hardware. This provides a critical component of security, McKenty explained, because it minimizes the number of administrators who need credentials for the physical hardware.
McKenty said 50 percent of all attacks come from insiders, and that reducing the number of people who need credentials makes the environment more secure. "This is the largest single concern for enterprise IT security," he said. "So the fewer users that have administrative rights on your physical hardware, the better, in my opinion."
Piston claims pentOS includes the first implementation of the Cloud Audit standard, which provides a common interface and namespace, or repository, for cloud providers to automate audit, assertion, assessment and assurance of their environments. McKenty, who is on the Cloud Audit working group, said implementing the standard is important to enterprises who rely on certifications such as HIPAA, PCI, NIST 800-53 and other compliance frameworks.
The pentOS software can be installed on any server hardware and initially on switches supplied by Arista Networks and, shortly, on Hewlett-Packard and Dell Force10 switches, McKenty said, with others to follow.
Founded earlier this year, Piston has $4.5 million in Series A funding from Hummer Winblad and True Ventures.
Piston will issue a developer preview of pentOS next week at the OpenStack Design Summit with general availability scheduled for Nov. 29. The company is not yet revealing pricing but it will be based on per-server licensing and a subscription service for security updates.
Posted by Jeffrey Schwartz on 09/28/2011 at 1:14 PM