Often noted for moving slowly to the cloud, SAP used its annual TechEd conference in Las Vegas to pick up the pace.
The move comes just two weeks after key rival Oracle extended its cloud services and applications portfolio at its annual OpenWorld conference. SAP has moved to offer more of its applications as a service and has made key acquisitions to extend those offerings, such as its $3.4 billion deal to buy SuccessFactors. Now the company is adding some key platform services.
Notable on the cloud front at TechEd this week was the HANA in-memory database, used for building apps that perform real-time analytics incorporating large amounts of data. The new SAP HANA Cloud is targeted at developers who want to build modern applications using native HANA, Java and other development environments.
SAP HANA Cloud is designed to integrate with portals, support application integration, enable mobile development and allow developers to build apps that enable business intelligence and collaboration.
The first application service SAP is offering with HANA Cloud is NetWeaver Cloud, designed to let enterprise developers and ISVs build Java-based apps with modern UIs for mobile devices and consumer-type apps that integrate premises-based and cloud applications. Along with the NetWeaver Cloud launch, SAP released a free unlimited developer license, removing a previous 90-day restriction. HANA Cloud also includes a database-as-a-service offering called HANA DBServices.
In addition to launching its own platform as a service (PaaS), SAP launched the HANA One platform, which allows shops to provision HANA on Amazon Web Services' (AWS) EC2 service on the fly. Developers can provision up to 60 GB of memory per instance, which is suited for transaction-oriented apps and apps that require real-time analytics.
Developers can provision HANA One via the AWS Marketplace. Pricing starts at $0.99 per hour for the use of SAP's software. "Because you can now launch HANA in the cloud, you don't need to spend time negotiating an enterprise agreement, and you don't have to buy a big server," wrote AWS evangelist Jeff Barr in a blog post. "If you are running your startup from a cafe or commanding your enterprise from a glass tower, you get the same deal. No long-term commitment and easy access to HANA, on an hourly, pay-as-you-go basis, charged through your AWS account."
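For a sense of what that pay-as-you-go model costs in practice, here's a minimal Python sketch. The $0.99-per-hour software fee comes from the announcement; the EC2 instance rate is a hypothetical placeholder, since the actual figure depends on the instance type chosen:

```python
# Rough cost sketch for HANA One's hourly, pay-as-you-go pricing.
# SAP_SOFTWARE_RATE is from the announcement; EC2_INSTANCE_RATE is an
# assumed placeholder for a memory-heavy instance, not a published price.
SAP_SOFTWARE_RATE = 0.99   # USD per hour, per the AWS Marketplace listing
EC2_INSTANCE_RATE = 2.40   # USD per hour, hypothetical

def hana_one_cost(hours):
    """Total cost of running one HANA One instance for the given hours."""
    return round(hours * (SAP_SOFTWARE_RATE + EC2_INSTANCE_RATE), 2)

print(hana_one_cost(40))   # a 40-hour proof of concept
```

The point Barr makes is visible in the arithmetic: the bill scales linearly with hours used, with no up-front license or server purchase.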
Posted by Jeffrey Schwartz on 10/18/2012 at 1:14 PM
IBM and AT&T today said they are getting together to offer their shared customers cloud services that use Big Blue's compute and storage infrastructure and network connectivity provided by the telecommunications carrier.
Both companies will jointly sell the combined offering to their respective enterprise customers -- it's targeted at giants in the Fortune 1000 -- as an alternative to traditional infrastructure as a service (IaaS) cloud services, which often use standard Internet connections.
The offering, to be released early next year, will use IBM's SmartCloud Enterprise+, the company's public IaaS-based cloud service, along with new virtual private network technology developed by AT&T Labs. The new VPN capability is designed to ensure secure connections and more reliable performance than Internet links provide, said Dennis Quan, vice president of IBM's Smart Cloud infrastructure.
"We feel it will give clients a lot more control over security, privacy and performance and we think will resolve some of the issues enterprises have with adopting cloud computing," Quan said in a telephone interview. Quan added the service is suitable for development and testing as well as to run enterprise applications and even transaction oriented Web sites.
Given the target audience of Fortune 1000 customers and the implied, though undisclosed, improvement in performance, this so-called "breakthrough" VPN capability will likely come with a price premium, though the companies aren't saying. Also worth noting is that both companies are members of the OpenStack effort. It is unclear when, or if, this service will support the OpenStack networking protocols. Quan would only say IBM is "deeply committed" to OpenStack.
The new VPN technology from AT&T automatically allocates network services to compute infrastructure, according to the announcement. This automation lets customers scale resources on demand much faster than if provided manually, both companies said. The companies said they will offer service-level agreements, over 70 security functions and "high levels" of security embedded on both wired and wireless devices authenticated to a customer's VPN.
The announcement had me wondering if AT&T is going to scale back its own public cloud infrastructure and platform services in favor of sourcing IBM's. In an e-mailed statement from an AT&T spokeswoman: "This is AT&T's most recent step in executing its strategy to deliver cloud services that meet the needs of a wide variety of users including large and medium enterprises, developers, and internet-centric businesses. We recognize that one size does not fit all when it comes to cloud, and see the opportunity to provide a managed alternative to AT&T Compute as a Service that pairs AT&T's virtual private network technology with IBM's SmartCloud Enterprise+ infrastructure to deliver a highly secure and flexible cloud offer to businesses."
While these companies compete to some extent, it appears both stand to benefit from working together. AT&T can provide direct links from private cloud and premises-based data centers to IBM SmartCloud Enterprise+, filling a gap in Big Blue's portfolio, while giving AT&T another option for delivering IaaS, even if the service is not AT&T's own.
It is not clear how big AT&T's enterprise public cloud service is, but IBM's is presumably bigger. IBM said it expects its cloud revenue to hit $7 billion by 2015. While the company hasn't disclosed its cloud revenues to date, IBM said that revenue doubled last year over 2010.
What's your take on this pairing arrangement? Comment here or drop me a line at [email protected].
Posted by Jeffrey Schwartz on 10/09/2012 at 1:14 PM
Box.com has made new pacts that will address security and compliance issues that are often associated with cloud storage offerings. The company announced the offerings at its BoxWorks conference in San Francisco this week.
On the compliance side, it has released a new reporting API that will enable third-party business intelligence partners, such as GoodData, to create dashboards that will let administrators detect unusual activity such as an individual downloading an excessive number of files.
On the security side, Box has a pact with Proofpoint to use its "security as a service" to add a layer of security over documents shared via the Box service. Using the Box API, Proofpoint's service can recognize when content is uploaded in violation of a security policy and notify an admin of the suspicious activity.
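To illustrate the kind of dashboard check the new reporting API enables, here is a minimal sketch; the event format and the 100-file threshold are assumptions for illustration, not part of Box's actual API:

```python
# Flag users whose download counts look anomalous. The (user, action)
# event tuples and the threshold are invented for this sketch.
from collections import Counter

def flag_excessive_downloaders(events, threshold=100):
    """Return users whose download count exceeds the threshold."""
    downloads = Counter(user for user, action in events if action == "download")
    return sorted(user for user, n in downloads.items() if n > threshold)

events = [("alice", "download")] * 150 + [("bob", "download")] * 10
print(flag_excessive_downloaders(events))   # only alice trips the threshold
```

A real dashboard from a partner such as GoodData would presumably apply richer statistics, but the core idea is the same aggregation over the activity feed.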
Security is one barrier, but making content accessible is another key issue among would-be users of software as a service (SaaS) offerings. At its annual BoxWorks customer conference this week in San Francisco, Box said it is moving to make it easier for customers to access data from various other SaaS-based services.
The company's new Box Embed is a framework based on HTML 5 that allows customers to access features from Box such as the ability to preview files, add or view comments and incorporate Box-based search into applications developed in house or by partners. Box revealed 10 partners that are using Box Embed within their cloud-based apps including Concur, Cornerstone OnDemand, DocuSign, Eloqua, FuzeBox, Jive, NetSuite, Oracle, SugarCRM and Zendesk.
Box Embed will let users store content securely and centrally while accessing it from partner apps. In the case of Jive, for example, Box users will be able to incorporate enterprise social networking into Box's document collaboration service.
Customers can also use Box Embed to embed content from the service to their Web sites, intranets or services provided by third parties. NetSuite and SugarCRM launched Box Embed-supported features this week and Box said others will follow suit in the next quarter.
Posted by Jeffrey Schwartz on 10/09/2012 at 1:14 PM
At this year's Oracle OpenWorld conference in San Francisco, cloud computing is once again front and center with the launch of new IaaS, PaaS and SaaS offerings. The company has added new online storage services, cloud-based asynchronous message queuing, cloud-based tools to build sites on social networks and in the Oracle Public Cloud, and a variety of new SaaS-based line of business tools including financial planning and analytics capabilities.
The continued emphasis on cloud follows last year's OpenWorld, when the company made its big cloud push with the launch of Oracle Public Cloud and the release of a slew of SaaS applications. As with Microsoft, HP, IBM, Dell and others, cloud computing is the focus of everything Oracle is talking about now.
It's always interesting to hear Oracle CEO Larry Ellison sing the praises of cloud computing these days, years after shrugging it off. In 2008, Ellison famously described cloud computing as the fashion du jour. "What the hell is cloud computing?" Ellison said to financial analysts four years ago. "I'm not going to fight this thing. I don't understand what we would do differently in the light of cloud computing other than changing the wording on some of our ads. It's crazy. That's my view."
Well, Ellison has since refined his view. During his keynotes at OpenWorld this year, Ellison described Oracle as the only company that has addressed private and public cloud computing at the IaaS, PaaS and SaaS layers. I think Microsoft and others might beg to differ, but Oracle has clearly launched a broad portfolio of SaaS apps as well as substantial infrastructure and platform services.
Ellison effectively said everything Oracle offers will be available for use on premise in traditional datacenters, for private clouds and in the Oracle Public Cloud. Moreover, customers can host their apps on dedicated hardware in Oracle datacenters or run their apps on shared infrastructure. And finally, much of the software in the Oracle Applications suite is now available as a SaaS offering, with social networking hooks, and those applications are based on the same Java-based application infrastructure as the premises-based versions of its software.
One thing that many will take issue with, though, is Ellison's claim that its cloud is standards-based. "Industry standards are extremely important," Ellison said, pointing to the fact that all of Oracle's cloud apps, databases and middleware are based on Java, Linux and the Xen hypervisor. Yet Oracle is one of the few major IT vendors that has not joined the OpenStack initiative (Microsoft, Amazon and Google are other notable players not involved). Cisco, Dell, HP, IBM, Rackspace, Red Hat and VMware are all on board, among nearly 200 other players.
If you're not concerned about portability, this won't matter. But if you're already locked into Oracle, its cloud offerings may present some compelling options.
And that's going to be the key focus for Oracle in the foreseeable future. Prior to making his Wednesday afternoon keynote, Ellison told CNBC's Maria Bartiromo, who spent two days broadcasting her show live from OpenWorld, that the cloud will take priority over making any major acquisitions in the coming years. "We are not planning any major acquisitions right now," Ellison said. "We are really focused on the fact that over the last seven or eight years, we re-engineered our applications for the cloud. We think that's a huge opportunity for organic growth."
While that's not the first time Ellison has said Oracle has spent that many years re-engineering its apps for the cloud, you might wonder how that's possible if he was describing it as a fashion trend in 2008. More than likely, Oracle was watching the growth of Salesforce.com and was very much hedging its bets in the SaaS model.
At OpenWorld, Oracle announced the following new cloud offerings are available for preview:
- Oracle Planning and Budgeting Cloud Service, a subscription-based version of its Hyperion Planning app
- Oracle Financial Reporting Cloud Service for creating financial statements
- Oracle Data and Insight Cloud Service for self-service analytics
- Oracle Social Sites Cloud Service, which provides the ability for non-technical users to create sites on social networks such as Facebook
- Oracle Developer Cloud Service for developers who want to build their apps using a public cloud service
- Oracle Storage Cloud Service for providing object storage content linked to existing Oracle Cloud services
- Oracle Messaging Cloud Service, an asynchronous message queuing service to link data between disparate sources.
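The messaging service on that list embodies the classic asynchronous-queuing pattern: producers hand off messages and move on, while consumers drain the queue at their own pace. Here is a generic sketch of the pattern with Python's standard-library queue standing in for the hosted service (this illustrates the concept, not Oracle's actual API):

```python
# Producer/consumer decoupled by a queue: the producer never blocks
# waiting for the consumer to finish processing each message.
import queue
import threading

q = queue.Queue()
results = []

def consumer():
    while True:
        msg = q.get()          # blocks until a message arrives
        if msg is None:        # sentinel tells the consumer to stop
            break
        results.append(msg.upper())

worker = threading.Thread(target=consumer)
worker.start()

for msg in ("order-created", "invoice-posted"):
    q.put(msg)                 # enqueue and continue immediately
q.put(None)                    # signal shutdown
worker.join()

print(results)
```

The decoupling is the point: a hosted queue lets two applications exchange data without either knowing when, or whether, the other is currently running.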
What's your take on Oracle's cloud strategy? Feel free to comment below or drop me a line at [email protected].
Posted by Jeffrey Schwartz on 10/03/2012 at 1:14 PM
Nasdaq today said it's launching a cloud-based system hosted by Amazon Web Services aimed at letting broker-dealers store critical records in order to meet compliance requirements.
While it's not the first stock market exchange to turn to the cloud -- the New York Stock Exchange last year launched a community cloud -- the move by Nasdaq will test the limits of using a major public cloud service to host sensitive data.
Specifically, Nasdaq's new offering, called FinQloud, will consist of two systems for broker-dealers: Regulatory Records Retention (R3), a storage and retrieval system, and Self Service Reporting (SSR), which will let firms perform queries and analysis of stored trading records on demand.
Ironically, compliance has been a major showstopper for many firms in the financial services industry as well as other vertical industries where a data security breach or outage could land a company in hot water. But Nasdaq, which for decades has been an early adopter of new technology, appears to believe that a breach or outage is no more likely to happen in the cloud than in its own datacenter.
Eric Noll, executive VP of transaction services for Nasdaq's U.S. and U.K. operations, told CNBC this morning that the economics of using the cloud are too compelling to pass up. Broker-dealers running their records management systems in Nasdaq's FinQloud will be able to reduce their costs by 80 percent compared with operating them in their own datacenters, Noll said.
By law, financial services firms are required to save records including e-mails for seven years. "In today's complicated market with more and more electronic communications, those storage costs have grown and grown and grown," Noll said. "By partnering with Amazon, what we think we are able to do is offer a lower cost solution to reduce the cost for broker-dealers for their storage and their retention requirements for the regulators."
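The seven-year rule itself is simple to encode. In this sketch the retention period comes from the regulatory requirement Noll describes; the helper name and date handling are illustrative only:

```python
# Decide whether a record is still inside its mandatory retention window.
from datetime import date

RETENTION_YEARS = 7   # per the regulatory requirement described above

def must_retain(record_date, today):
    """True while the record must still be kept."""
    try:
        cutoff = record_date.replace(year=record_date.year + RETENTION_YEARS)
    except ValueError:   # a Feb. 29 record date rolls forward to Mar. 1
        cutoff = date(record_date.year + RETENTION_YEARS, 3, 1)
    return today < cutoff

print(must_retain(date(2006, 9, 1), date(2012, 9, 25)))   # inside the window
print(must_retain(date(2004, 9, 1), date(2012, 9, 25)))   # window has lapsed
```

At the scale of a broker-dealer's e-mail archive, the storage behind a check like this is exactly the cost Noll says FinQloud is meant to shrink.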
As the Nasdaq systems outage during the first hour of trading on the day of Facebook's initial public offering in May demonstrated, an internally run system is subject to the same material failures as those that have occurred in the public cloud. Nasdaq has worked with Amazon for six years, and Noll believes FinQloud can generate revenues for the exchange without requiring capital investment or operational costs, by utilizing usage-based compute and storage.
Moreover, Noll, in his CNBC interview, expressed confidence that Amazon can ensure data will be kept secure. At the same time, Noll said Nasdaq is also applying its own data encryption.
"There's always going to be concerns about data security," Noll said. "Cyber attacks are a reality of today's modern society -- we're going to have to deal with them. I think there are attacks on standalone units as well as other sources of data out there but what we're going to be working with Amazon on is not only taking their info security protections but we're adding layers on it through Nasdaq as well and we will preserve the sanctity of this information for the users of the cloud and with us."
AWS senior VP Andy Jassy, who was present with Noll during the CNBC interview, said financial services companies have used Amazon's public cloud services for years, and expressed confidence that usage will grow exponentially. Jassy said he sees a day, perhaps 10 to 20 years from now, when AWS generates as much revenue as Amazon's $40 billion online retail business. "AWS has hundreds of thousands of customers at this point in over 190 countries and it's growing very, very dramatically," he said.
Yet many companies are loath to put sensitive and mission critical data in the cloud. Nasdaq's decision to do so publicly will no doubt be a closely watched public cloud deployment.
Posted by Jeffrey Schwartz on 09/25/2012 at 1:14 PM
It was a long time coming but OpenStack, the open-source infrastructure as a service (IaaS) project founded by NASA and Rackspace two years ago, is no longer the property of Rackspace. In keeping with its long-stated plan, Rackspace handed off OpenStack to an independent foundation with its own governance body.
The new OpenStack Foundation kicks off with $10 million in funding committed by its member companies over the next three years. That should be ample to cover its operating requirements, said Jim Curry, GM of Rackspace Private Cloud and a member of the newly formed OpenStack Board of Directors.
Curry said Rackspace invested approximately $5 million per year to operate OpenStack, which included running its various conferences and twice-yearly summits, development and marketing. Asked if Rackspace designated a financial value to OpenStack as an asset it is effectively spinning off, Curry said it was substantial, though he declined to elaborate.
Now that the OpenStack Foundation is an independent body, it removes the perception that it's a Rackspace-controlled entity. While Curry insisted that Rackspace long ago gave up such control, the fact remained that the company still owned the assets. The handoff took place Wednesday Sept. 19.
"I think we are in the position where OpenStack can now serve the best interest of OpenStack and its community and there won't be conflicts of interests with what Rackspace is trying to accomplish," said Randy Bias, CTO and co-founder of Cloudscaling, which provides open-source cloud software for carriers. Bias is also a member of the new OpenStack board.
Bias said the next key priority will be forming consensus over the foundation's mission. Specifically it must determine if it wants to focus on driving adoption of OpenStack versus creating standards to ensure interoperability. For his part, Bias will push for the former. "I think that second path is perilous because it's such early days," he said. "We would probably end up with a lot of bickering with what is the right way to build OpenStack."
Now that OpenStack is an independent foundation, the question is: will it become the de facto open source IaaS platform? It's too early to tell, said Forrester Research analyst James Staten, noting that while some of its partners have been selling OpenStack-based solutions over the past several months, sales remain weak, likely because customers are awaiting the Folsom release due out next week.
But Staten noted OpenStack is also competing with other options, such as the Citrix-backed CloudStack effort, released to the Apache Foundation earlier this year; Eucalyptus; and commercial offerings from the likes of Amazon Web Services, which are experiencing revenue growth.
"CloudStack has a strong following among ISPs, especially overseas, and with enterprise divisional IT departments who want to replicate public clouds inside their walls," Staten noted. "OpenStack will be competing against CloudStack for this same business. It isn't clear yet how the two will differentiate from one another. That is also something they have to work on."
The key targets for OpenStack and CloudStack are providers (hosters, managed services providers and telcos) and business unit IT, Staten added. On the provider side, Staten said "this is Cloudstack's market to lose today and the most attractive immediate opportunity for OpenStack."
The two open-source cloud offerings along with Eucalyptus, which has a partnership with Amazon Web Services, will also appeal to business units looking to shift workloads between public and private clouds.
"It's the business unit developers who are leading the adoption of public clouds and when they recognize the need for this sales type of a service from within their own corporate walls, they aren't turning to central IT to give this to them (their efforts to do so, so far, have fallen well short of developer expectations)," he noted. "So a small but growing trend is for the business unit itself to build and operate the private cloud. They want to start with a solution that looks and acts like a public cloud first, not a virtualization solution. This is the best second market for OpenStack and CloudStack and why Amazon and Eucalyptus partnered."
While some say CloudStack has an advantage today, OpenStack has the benefit of almost all of the blue-chip IT providers supporting it, including AT&T, Cisco, Dell, Hewlett-Packard, IBM and, most recently, VMware. Besides Rackspace, the various providers are still in the early stages of delivering their OpenStack products and services.
Angel Diaz, IBM's VP for software standards and cloud, said Big Blue's SmartCloud software and public cloud service will support OpenStack in the near future, though he declined to be more specific. "The idea there is within our SmartCloud Foundation is a technology similar to OpenStack architected in the same way," Diaz said.
OpenStack will be as important to SmartCloud as the Apache server was to IBM's WebSphere application server technology, Diaz added. "What you'll see is a lot of the technology in our SmartCloud Foundation at the infrastructure as a service layer will work its way into OpenStack, in the not too distant future," he said. "You will see OpenStack in SmartCloud just like Apache is in WebSphere."
Now that OpenStack is an independent foundation, does it appeal to you any more than it did previously? Share your opinions here or drop me a line at [email protected].
Posted by Jeffrey Schwartz on 09/21/2012 at 1:14 PM
Looking to dispel the notion that his company is a one-trick pony, Salesforce.com CEO Marc Benioff wants to extend the company's software-as-a-service applications well beyond customer relationship management, the foundation of its business.
At its annual Dreamforce conference in San Francisco, Benioff told 90,000 attendees, and thousands more who tuned in to his live keynotes broadcast via a Facebook feed, that Salesforce.com is moving into new areas such as document sharing, marketing automation and human resources.
While this is not the first time Salesforce.com has veered from its charter of offering tools aimed at helping organizations better interact with their customers, it's arguably the broadest extension of its service offerings into lines of business it hadn't previously touched.
In a sign that Salesforce.com is eager to stake a claim in these new areas, many of the services launched this week won't be available until the second half of next year. That's out of character for Salesforce.com, which typically doesn't pre-announce services well before their availability, said R. "Ray" Wang, CEO of Constellation Research.
"Marc did not announce things in general availability other than the marketing apps, which was unusual," Wang said. "They did a lot of forward-marketing which is not like Salesforce. I think he's afraid other people are going to jump into this market, so he's announcing things in development that are not released as code."
Wang, who attended the Dreamforce conference, said it was telling how many HR, finance and marketing executives were at the conference, upstaging those from IT organizations. That's no coincidence. Citing a Gartner projection that chief marketing officers will spend more on IT than CIOs by 2017, Benioff said "now we're inviting them into the cockpit into this incredible new marketing cloud."
The offering is Salesforce Marketing Cloud, which combines the scanning of social networks, advertising, business measurement and workflow, based on technology acquired from Buddy Media and Radian6. The key audience for Salesforce Marketing Cloud is these CMOs.
Benioff planted the seeds for extending Salesforce.com's reach with the launch two years ago of Chatter, its social media tool best described as a version of Facebook designed for use in an enterprise or extranet-type scenario.
As noted by my colleague John K. Waters, Salesforce.com used Dreamforce to jump into the HR field -- or, as it is called these days, "human capital management" (HCM) -- with the launch of Work.com, a system designed to let managers and HR benchmark and reward the performance of employees in a social context.
Through an expanded partnership with leading HCM SaaS provider Workday, Salesforce.com will provide integration of Work.com with Workday's HCM offerings. Salesforce.com's push into HCM comes as key rivals have moved into this rapidly growing segment: Oracle acquired Taleo, SAP picked up SuccessFactors and IBM just announced it has acquired Kenexa for $1.3 billion.
Salesforce.com is also spreading its wings with a foray into document management, a move aimed at offering an alternative to the likes of Box and Dropbox. The company described the new Salesforce Chatterbox service as the "Dropbox for the enterprise." With so many services such as Dropbox and Box.net out there, is Salesforce a) moving too far adrift? And b) likely to gain a foothold in document sharing?
"Although Box.net has gained lots of attention and many users, it hasn't established a firm hold on the B2B and enterprise markets," said Jeffrey Kaplan, managing director of Thinkstrategies, a consulting firm focused on cloud computing. "And, there are no dominant players in the other areas."
Likewise, Kaplan, who also attended Dreamforce, said Salesforce.com will raise the profile of the other new functional fields it's entering. "Salesforce.com will bring greater legitimacy to each of these areas, in the same way it has championed the idea of social networking in the enterprise with Chatter," he said.
Perhaps the biggest challenge facing Salesforce.com's ambitions to widen its footprint is the number of companies it has acquired in recent years, said Joshua Greenbaum, principal analyst of Enterprise Applications Consulting.
"Now that they have all these assets they need to do a better job of integrating them," Greenbaum said. "They need to focus on allowing developers and customers to integrate all this functionality and stop creating silos of functionality that are as problematic as any legacy silo is."
Posted by Jeffrey Schwartz on 09/20/2012 at 1:14 PM
Acronis, a provider of backup and recovery software, said it has acquired GroupLogic for an undisclosed amount. GroupLogic is a 22-year-old company best known for its file synchronization software, which provides the enterprise equivalent of cloud-based storage and file-sharing services such as Dropbox, Microsoft SkyDrive or Google Drive without the risks associated with using such consumer services.
Dmitri Joukovski, Acronis VP of product management, said the acquisition of GroupLogic's file synchronization software fits well with its portfolio of disaster recovery and data protection solutions, which are designed for physical, virtual and cloud environments. Joukovski said Acronis had been seeking a software vendor that would let it add file sync capabilities for some time.
"People are telling us they feel confident within the firewall their data is protected but they see more and more people work with corporate data outside the firewall," Joukovski said. "They e-mail documents to each other and they use insecure consumer file sharing solutions. They are looking for solutions that enable very secure access and collaboration with corporate documents."
GroupLogic's latest product, mobilEcho 4.0, lets users access corporate data running on file servers, network-attached storage (NAS) and within SharePoint environments. While giving users access to data from their PCs, Macs, iPhones and iPads, IT administrators can also manage access to data and wipe data from devices as needed, while also encrypting stored data. The software integrates with Microsoft's Active Directory and LDAP-based directory services.
Joukovski said that Acronis will operate GroupLogic as a separate subsidiary, though it's to be determined whether the company's brand will be phased out over time.
Posted by Jeffrey Schwartz on 09/14/2012 at 1:14 PM
Racemi, a company that offers software to move Windows and Linux images from one bare-metal server to another, this week added cloud migrations to its portfolio. The company's new Cloud Path is a software as a service (SaaS) offering that lets administrators move server images to public cloud services.
IT pros can use Cloud Path from a Web browser to migrate physical and virtual servers to infrastructure as a service (IaaS) cloud providers. Atlanta-based Racemi claims migrations will cost on average $800 less than performing manual re-imaging of data.
Pricing is based on a usage-based model. The service also eliminates the need to rely on experienced administrators, since server images can be moved without templates or scripts. In addition to moving workloads from in-house systems to cloud-based servers, Cloud Path lets customers migrate cloud instances between different supported cloud providers.
Racemi charges $299 per successful migration, which includes an initial 20 GB of free storage. Customers can currently migrate Windows Server 2008 R2, Red Hat Enterprise Linux and CentOS systems to cloud services provided by Amazon Web Services, GoGrid, Rackspace and Verizon's Terremark. Racemi said it plans to support additional server OSes and cloud service providers over time.
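A pre-flight check against that support matrix is easy to sketch; the function is invented for illustration, and the sets contain only the combinations the company has announced:

```python
# Check whether a source OS / target provider pair is currently supported
# by Cloud Path, per the combinations named in the announcement.
SUPPORTED_OSES = {"Windows Server 2008 R2", "Red Hat Enterprise Linux", "CentOS"}
SUPPORTED_PROVIDERS = {"Amazon Web Services", "GoGrid", "Rackspace", "Terremark"}

def can_migrate(os_name, provider):
    """True if both the source OS and the target provider are supported."""
    return os_name in SUPPORTED_OSES and provider in SUPPORTED_PROVIDERS

print(can_migrate("CentOS", "GoGrid"))        # a supported pair
print(can_migrate("Ubuntu", "Rackspace"))     # OS not yet supported
```

As Racemi adds OSes and providers, a check like this simply grows its two sets.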
Posted by Jeffrey Schwartz on 09/06/2012 at 1:14 PM
Attunity plans to offer a service that will let organizations synchronize and replicate data from their in-house servers to Amazon Web Services' S3 storage service. The offering, called Attunity CloudBeam, will also tie data from one AWS datacenter to another.
Best known for its database connectivity software, Attunity expanded its focus into Big Data replication with last year's acquisition of RepliWeb, which developed the data transfer engine powering CloudBeam. Attunity announced a partnership with AWS in July and released the CloudBeam service for beta testing last week.
CloudBeam is targeted at those who want to use S3 for disaster recovery or for AWS customers wanting to ensure their data is available if a datacenter experiences an outage, said Matt Benati, Attunity's VP of global marketing. It's especially designed for those that are reluctant to use cloud services for production-level or business critical applications.
The service will be available in two forms: for those who want to replicate their servers to S3, administrators can download a thin-client that lets them map folders on a server to the target S3 resources. Attunity will also offer CloudBeam as a service, where a customer can establish replication between one AWS Availability Zone and another.
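A minimal sketch of that folder-mapping idea, assuming a plain dictionary of local folders to S3 destinations (CloudBeam's real configuration format isn't documented in the post, and the paths here are invented):

```python
# Map local folders to target S3 locations and resolve individual files.
# Both the mapping format and the s3:// destinations are assumptions.
MAPPINGS = {
    "/srv/finance/reports": "s3://example-dr/finance/reports/",
    "/srv/hr/records": "s3://example-dr/hr/records/",
}

def target_for(path):
    """Return the replicated S3 location for a local file, or None."""
    for folder, destination in MAPPINGS.items():
        if path.startswith(folder + "/"):
            return destination + path[len(folder) + 1:]
    return None

print(target_for("/srv/hr/records/2012/review.pdf"))
```

The thin client's job, per the description, is essentially to maintain such a mapping and keep the two sides in sync.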
The company hopes to offer CloudBeam with other popular cloud services, Benati said. One likely candidate is Microsoft's Windows Azure service, he indicated, pointing to Attunity's existing relationship with the company.
Attunity hasn't disclosed pricing, though Benati said it will be aligned with Amazon's S3 cost models.
Posted by Jeffrey Schwartz on 09/06/2012 at 1:14 PM
Microsoft this week launched Windows Server 2012, describing it as a cloud OS. Company officials illuminated the cloud characteristics of its new server operating system during a carefully scripted, pre-recorded Webcast Tuesday.
Marketing hype aside, the release of Windows Server 2012 culminates a four-year engineering effort to build a common architecture for IT to develop and deploy applications on private, hybrid and public clouds. Microsoft has talked up the combination of Windows Server and System Center as a platform for private and hybrid clouds for some time.
Dubbing Windows Server 2012 a cloud OS simply advances Microsoft's messaging and lets IT decision makers compare it to a slew of other cloud infrastructure operating systems such as Eucalyptus and the open-source software distributions delivered by those aligned with OpenStack, including Rackspace and Red Hat, as well as VMware's proprietary virtualization and cloud software portfolio.
With the emergence of so-called "modern apps" -- those designed for various computing and mobile device types -- Microsoft wants developers to build server-side applications in its new .NET Framework 4.5 using Visual Studio 2012 and have the ability to deploy them on Windows Server 2012 in their own datacenters, hosted by third-party cloud providers such as Hostway and Rackspace, in Microsoft's own Windows Azure public cloud, or any combination.
"We built Windows Server 2012 with the cloud in mind," said Bill Laing, corporate VP of Microsoft's server and cloud division, who led the engineering team that developed Windows Server 2012. "It's the deepest and broadest release ever built for companies of all sizes, whether you run a single server connected to the cloud, or a large data center."
In the video, Laing pointed to four key attributes that make Windows Server 2012 a cloud OS:
- Scalable and elastic: The latest generation of Hyper-V lets a customer scale from one to thousands of virtual machines as workloads dictate. It supports up to 320 logical processors and 4 TB of RAM per server, and Laing said it can virtualize 99 percent of all SQL Server databases. The new OS can run large virtual machines with up to 64 virtual processors and 1 TB of memory per VM. So far, Laing said, it has scaled to 8,000 VMs per cluster.
- Shared resources: Windows Server 2012 is architected for multi-tenancy, critical for ensuring that the workloads of a given group, organizational unit or customer don't impact others. Combined with System Center 2012 SP1, Windows Server 2012 enables software defined networking, or SDN, which means "you can easily and dynamically provision isolated virtual networks running on the same physical fabric," explained Jeff Woolsey, a principal program manager for Windows Server and Cloud at Microsoft.
- Always-On: A feature called Live Migration provides VM mobility, which facilitates the movement of virtual machines from one physical server to another locally or over a wide area network. This cluster-aware feature is designed to provide continuous availability during patches, upgrades or failures.
- Automation and self-service: Users in departments can self-provision compute and storage resources. Windows Server 2012 enables automation with over 2,400 new PowerShell cmdlets, designed to eliminate manual tasks and allow IT to manage large numbers of servers. Combined with System Center 2012, Windows Server 2012 offers automation via user-defined policies.
Microsoft makes a good case for Windows Server as a cloud OS, and it should appeal to its installed base as customers build new apps for the cloud. But customers will determine whether Windows Server 2012, combined with System Center, is a viable alternative to VMware's cloud stack and to the open-source and Amazon Web Services-compatible options.
Posted by Jeffrey Schwartz on 09/05/2012 at 1:14 PM
Security concerns might be the number one inhibitor to using public cloud services, yet the horses may have already left the barn.
A global survey of 4,000 IT managers and executives found nearly half, or 49 percent, already use cloud services to store sensitive or confidential information and another 33 percent plan to do so over the next two years. Only 19 percent said they don't, according to the survey, conducted by security and privacy research consultancy the Ponemon Institute and commissioned by Thales, an IT security and encryption software and services firm.
The findings piqued my interest, given that public cloud services are a non-starter for many large organizations, especially those with regulatory or compliance restrictions. However, the Ponemon study canvassed enterprises of all sizes, including small and mid-sized organizations, explained Ponemon Institute chairman and founder Larry Ponemon.
I pointed Ponemon to the findings of the Open Data Center Alliance (ODCA), in which 40 percent of its membership said security was the key barrier to using public cloud services. "Even organizations that say security is an inhibitor still seem to be using cloud services," Ponemon remarked.
The findings also showed that 44 percent believe cloud providers are responsible for protecting data in the cloud, while 30 percent felt it was their own responsibility and 24 percent reported it should be shared.
"Like anything else, you need to be careful in selecting your business partners," Ponemon said. "A public cloud provider is a business partner, and the fact that they have access to your data, and possibly confidential and sensitive information, is a big deal. Organizations need to see the cloud as a place that can be very insecure and the source of data breaches and security exploits. Not all cloud providers are the same."
When asked about the impact of cloud services on their organization's security posture, 44 percent said there was no change and 39 percent said it had weakened. Only 10 percent said it had improved, while 7 percent were unsure.
Only a small percentage, 11 percent, said their cloud provider encrypts data for them, while the rest assume responsibility for encryption themselves. Of those, 38 percent encrypt data in transit, 35 percent encrypt it before it is transferred to the cloud provider and 16 percent use encryption selectively at the application layer within the cloud environment.
Thirty-six percent of those using encryption handle key management within their own organizations, while 22 percent rely on a third party other than the cloud provider. Another 22 percent let the cloud provider manage the keys, and 18 percent said it was a combination.
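For the 35 percent who encrypt before transfer while keeping key management in-house, the workflow is straightforward: generate a key locally, encrypt the data, and hand only ciphertext to the provider, so a breach on the provider's side exposes nothing readable. A toy sketch of that flow, using a throwaway XOR one-time pad purely as a stand-in for a real cipher such as AES-GCM (all function names here are illustrative, not from any vendor's product):

```python
import os

def encrypt_before_upload(plaintext: bytes) -> tuple[bytes, bytes]:
    """Toy client-side encryption via XOR one-time pad.
    The key never leaves the organization; only the ciphertext
    blob is handed to the cloud provider."""
    key = os.urandom(len(plaintext))                    # key stays in-house
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    """Reverse the XOR with the locally held key."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

record = b"confidential customer record"
blob, key = encrypt_before_upload(record)   # 'blob' is all the provider stores
assert decrypt_after_download(blob, key) == record
```

The survey's other key-management options change only who holds `key`: a third-party escrow service, or the cloud provider itself (which weakens the guarantee, since the provider can then read the data).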
Posted by Jeffrey Schwartz on 08/28/2012 at 1:14 PM