In a move that appeared timed to rain on Citrix's parade, VMware last week demonstrated technology that aims to provide the control plane for managing user workspaces. VMware revealed its hybrid cloud architecture for creating "next-generation" workspaces, and did so on its rival's home turf at Citrix's annual Synergy conference in Orlando.
VMware took the wraps off Project Enzo, a new platform that it claims will change the way IT organizations deploy and manage virtual desktop environments, just as Citrix demonstrated its new Workspace Cloud architecture to customers at Synergy. Similar to Project Enzo, Citrix Workspace Cloud aims to provide the control plane, based on a hybrid cloud architecture, for managing user workspaces on a variety of form factors ranging from PCs, phones and tablets to even Raspberry Pi devices.
"Project Enzo is a new hybrid cloud-scale architecture that is designed to combine the economic benefits of cloud-based VMware virtual desktops and application technology with the simplicity of hyper-converged infrastructure to transform the IT experience," wrote Sumit Dhawan, VMware's senior vice president and general manager for desktop products and end-user computing, in a blog post announcing Project Enzo. "Project Enzo will enable the unified management of on-premises and cloud-based virtual workspace services (desktops and applications) through a single Web-based portal that will be available as a cloud service on VMware vCloud Air."
While there are certainly architectural differences between Citrix Workspace Cloud and VMware's Project Enzo, they appear to have the same goal in mind: being the center of deploying and securely managing user work environments on a variety of device types. The most noteworthy difference is that Project Enzo seems to prefer vCloud Air when it comes to providing the public cloud infrastructure. By comparison, Citrix executives went to great pains last week to emphasize that the Citrix Workspace Cloud can run in any infrastructure as a service, including AWS, Microsoft Azure and IBM SoftLayer. Unlike VMware, Citrix doesn't operate a public cloud of its own and, when asked last week at Synergy if it planned to, executives indicated firmly that the company has no interest in doing so, given the massive investment required. Both companies are relying on cloud service provider partners to deliver these new platforms.
Each company also described its new architecture as designed to bring together and manage "hyper-converged" software-defined infrastructures. Microsoft outlined a similar vision with its Azure Stack, revealed earlier this month at the Ignite conference in Chicago. Azure Stack will come with the next release of Microsoft's server operating system, Windows Server 2016. VMware's Dhawan said the technical preview for Project Enzo is now available.
A key component introduced with Project Enzo technical preview, according to Dhawan, is its VMware Smart Node technology, which integrates hyper-converged solutions. "Smart Node technology enables the intelligent orchestration and automation of common set-up, delivery and management tasks of virtual workspace services across the hybrid cloud," he said.
Apparently VMware's decision to rain on Citrix's parade by announcing Project Enzo was payback, as The Register's Simon Sharwood pointed out, recalling that Citrix announced new versions of its XenApp, XenDesktop and Receiver products at VMworld last year.
Posted by Jeffrey Schwartz on 05/21/2015 at 3:19 PM
Rarely a day goes by when I don't receive a news release on behalf of a company announcing it was included in one of Gartner's Magic Quadrants as a leader. If I had a dollar for every one of those announcements I deleted, I could retire now. Today both Amazon Web Services and Microsoft notified me that their respective public cloud services were recognized as leaders in the research firm's infrastructure-as-a-service (IaaS) report.
Gartner pointed to Amazon and Azure as the only "leaders" in its IaaS classification of cloud providers. Close followers, which Gartner describes as "visionaries," include CenturyLink, Google, VMware and IBM/SoftLayer. Among the others Gartner described as "niche" providers were Rackspace, Joyent, Virtustream, Interoute, CSC, Dimension Data, Fujitsu, NTT Communications and Verizon.
Despite grouping AWS and Azure as the only leaders, Gartner singled out Amazon as the superior cloud provider. "AWS has a diverse customer base and the broadest range of use cases, including enterprise and mission-critical applications," the report said. "It is the overwhelming market share leader, with over 10 times more cloud IaaS compute capacity in use than the aggregate total of the other 14 providers in this Magic Quadrant."
The report said AWS still maintains a multiyear competitive advantage over both Microsoft and Google. A Microsoft spokeswoman said in a statement that Azure still offers more than twice as much cloud IaaS compute capacity in use as the aggregate total of the remaining providers in the Magic Quadrant, excluding AWS. Microsoft officials also frequently point out that, with 19 datacenter regions around the world, the company has more than AWS and Google combined.
"This speaks strongly to Gartner's belief that the IaaS market is quickly consolidating around a small number of leading vendors," she said. "Microsoft is seeing significant usage and growth for Azure with more than 90,000 new Azure customer subscriptions every month, more than 50 trillion objects now stored in the Azure storage system and 425 million users stored in Azure Active Directory. In addition to strong use of Infrastructure-as-a-Service capabilities, we're also seeing over 60 percent of Azure customers using at least one higher level service."
The report opined that "Amazon has the richest array of IaaS features and PaaS-like capabilities. It continues to rapidly expand its service offerings and offer higher-level solutions." Microsoft argued it's the only vendor identified as a leader in both Gartner's IaaS and PaaS categories. "We also are differentiated in our ability to enable customers to use these capabilities together in a seamless fashion," she said. "For example, Azure Resource Manager enables a single coherent application model for IaaS and PaaS services and the Azure Preview Portal blends IaaS and PaaS seamlessly so that customers no longer have to work in multiple, disparate environments."
Gartner also pointed to reliability problems that have plagued Azure, including numerous outages, though it noted substantial improvements over the past year. "We are committed to applying learnings when incidents occur to prevent recurrences of similar interruptions and improve our communications and support response so that customers feel confident in us and the service," she said, pointing to Microsoft's Root Cause Analysis reports for details on the most recent improvements.
The report also urged those choosing Azure not to jump in too fast. "Customers who intend to adopt Azure strategically and migrate applications over a period of one year or more (finishing in 2016 or later) can begin to deploy some workloads now, but those with a broad range of immediate enterprise needs may encounter challenges," the report advised.
Microsoft said it has aggressive plans to add new features, something Gartner acknowledged in the report. The spokeswoman's statement added: "Over the past 12 months, we've added more than 500 new features and services to the Azure platform, including robust IaaS and PaaS capabilities as well as offerings that enable consistency across on-prem and the cloud so customers can achieve the hybrid scenarios they demand."
At its Ignite conference last month, Microsoft announced extensive new hybrid cloud computing features coming in the form of the new Azure Stack, which the company believes will give it a further edge over both AWS and Google.
Of course, different surveys and customer sets have their own benchmarks and criteria, as I noted last week when Nasuni's third evaluation of major cloud providers gave the edge to Microsoft Azure. Whether or not you give credence to Gartner's Magic Quadrants, this one seems to match industry sentiment: AWS remains the dominant public cloud, but Azure is a clear No. 2. Both companies would agree this race is far from over.
Posted by Jeffrey Schwartz on 05/20/2015 at 5:21 PM
At its annual Synergy conference in sweltering Orlando, Citrix last week staked its future on the company's new Workspace Cloud and a number of new wares that aim to establish it as the purveyor of the modern digital workplace. The major focus of this year's conference was Workspace Cloud, a platform that aims to ease the design, deployment, orchestration and management of secure work environments for mobile workers.
While virtual desktops and apps account for a small percentage of the workplace computing environments deployed today, their usage isn't trivial. Moreover, it stands to grow in the coming years in new forms, including desktop as a service, as workers use more device types, rely more on access from various places, and organizations look to better secure information accessed by employees, contractors and even customers on these new form factors. The growth of hybrid cloud and the move to bring-your-own-device policies are also enabling these new environments.
Looking to extend its reach beyond its core strength of offering virtual desktop and application environments, Citrix first started discussing Workspace Cloud a year ago but only demonstrated it publicly at Synergy last week, where customers also began testing the new platform. The company hopes to make it generally available in the third quarter. Mark Templeton, Citrix's CEO, who is revered for his focus on engineering and user experience, showcased Workspace Cloud as the culmination of the company's effort to bridge public, private and hybrid clouds with the new ways people work across multiple device types. Templeton said the new digital workspace consists of Windows-based PCs, Macs, iPads, Android tablets, Chromebooks, new Linux-based systems and even embedded devices that enable Internet of Things-type environments.
"We think of Workspace as the core engine of the software-defined workplace," Templeton said in last week's keynote. "So if you don't do a great job with workspaces across all of those kinds of digital tools, then you're not going to have the engine of the software-defined workplace. And we know that everyone's workspace environment is different." The Citrix Workspace Cloud is based on a cloud delivery architecture similar to the company's BlackBeard reference architecture, which provides the service architecture to distribute XenDesktop and XenApp in hybrid cloud environments and RainMaker, which provides the orchestration across servers and nodes.
The control plane that powers Citrix Workspace Cloud is its ShareFile document sharing platform. Citrix, which acquired ShareFile in 2011, is a smaller competitor to the likes of Box and Dropbox. But Citrix has spent the ensuing years building on the core ShareFile engine to enable it to become the control plane for the new Citrix Workspace Cloud, which the company describes as a management platform for creating mobile workspaces that include desktops, applications and data provisioned in a hybrid cloud environment that could consist of a private datacenter, as well as a public or private cloud.
A key component of Citrix Workspace Cloud is the Lifecycle Manager, which creates blueprints that ease the migration of earlier versions of XenApp to current releases and provides the ability for IT to deploy them in the new management platform. These blueprints "are effectively groupings of things that you need to do to define whatever workload it is you want to deliver," explained Christian Reilly, CTO for the Citrix Workspace. "And then obviously the management piece comes after that. I'm not talking specifically about just delivering XenApp and XenDesktop because that's a key short term focus. The power of blueprints is if you kind of expand that out to two worlds, one in dealing with blueprints that can group together with different parts of the network topology, different bits of the infrastructure that need to be orchestrated to create an application workload and blueprints that can then provision or talk to Netscaler or other devices to complete the configuration."
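Reilly's description of a blueprint, a grouping of the orchestration steps needed to deliver one workload, can be sketched as a simple data structure. The names and steps below are illustrative, not Citrix's actual Lifecycle Manager schema:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    """One orchestration task, e.g. provision a VM or configure a NetScaler."""
    name: str
    target: str  # the infrastructure component this step touches

@dataclass
class Blueprint:
    """A grouping of everything needed to deliver one workload."""
    workload: str
    steps: list = field(default_factory=list)

    def deploy(self):
        """Run each step in order, returning a log of what was orchestrated."""
        return [f"{s.name} -> {s.target}" for s in self.steps]

# A hypothetical blueprint for delivering a XenApp workload
xenapp = Blueprint("XenApp", [
    Step("provision-vms", "resource-location"),
    Step("install-vda", "session-hosts"),
    Step("configure-gateway", "NetScaler"),
])
print(xenapp.deploy())
```

The point of grouping steps this way is that the same blueprint can be replayed against different infrastructure, which is what makes hybrid cloud deployment repeatable.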
In keeping with its history of not running its own public cloud, Citrix is empowering its base of 1,900 cloud service providers to provision Workspace Cloud in any environment, including Amazon Web Services, Azure and IBM's SoftLayer cloud, among others. The control plane itself runs in Microsoft Azure, but Citrix officials insisted that no customer data or apps touch the control plane, or Azure in particular, unless they want it to.
While building the control plane on ShareFile, Workspace Cloud brings together the XenDesktop and XenApp platforms as well as networking gear such as Netscaler and CloudBridge. Stitching these together gives Citrix the opportunity to bundle -- and potentially upsell its wares -- though Templeton said the architecture allows organizations to plug in their own components, such as Microsoft and VMware hybrid cloud infrastructure. Workspace Cloud is an ambitious effort by the company to move itself forward with a major new platform designed to create and manage secure user work environments tailored around workers' tendencies to use multiple and often nontraditional devices to access their Windows environments. In addition to launching Workspace Cloud, Citrix previewed several other key offerings in its pipeline, including extensions to its XenMobile enterprise mobility management platform, networking and security upgrades to its Netscaler and CloudBridge tools, and data loss prevention and other security improvements to its ShareFile enterprise file sharing offering. It also showed off new automation capabilities in its XenDesktop and XenApp platforms.
Attendees at Citrix Synergy last week seemed impressed with Workspace Cloud, though even its most visible customers said they need to understand how it might fit into their environments. "We will start playing with the beta," said David Enriquez, senior director of information technology for the Miami Marlins. "It looks to me something we could take advantage of such as spring training temporary deployments, if we have to do something at a minor league park or if we have an event at the ballpark that needs infrastructure but we don't want to put it on our existing infrastructure."
Posted by Jeffrey Schwartz on 05/19/2015 at 1:18 PM
Now that Microsoft has outlined its Universal Windows Platform and its new model for building and deploying modern applications for its Azure cloud, the company has hit the road to make its case to developers throughout the world.
The Build tour is taking place today in London and New York. These one-day developer confabs follow last month's annual Build conference, which took place in San Francisco. In the coming days and weeks Microsoft is hosting Build events in Atlanta, Berlin, Moscow, Tokyo and Mexico City, among numerous other locations.
I attended the New York event today where a few hundred developers took in a condensed version of the larger Build conference, featuring an overview of Microsoft's UWP and Azure strategy and demos on how to build and port apps to the new platform. Kevin Gallo, Microsoft's partner director for developer ecosystem and platform, gave the opening keynote address in which he called on developers to learn how they can enhance Android and Apple iOS applications for the new UWP by bridging them to Windows. At the same time, Gallo emphasized the ability to port legacy Win32 apps to the new UWP.
Key to the new UWP is the notion that developers can build their apps for Windows regardless of whether they're for a phone, tablet, PC or the new Surface Hub large conferencing system. "All these new APIs are to help create different user experiences on different devices," Gallo said.
Likewise, Microsoft's new embrace of open source platforms is carrying over to the Build tour, both in the new UWP, which includes the new Edge browser and native support for HTML5, and in XAML, which Microsoft officials emphasized is well suited to building responsively designed apps. Gallo explained how the new "bridge" capabilities for developers address four key sources of apps: the Web, Win32 apps, applications built in Objective C for iOS, and Android apps built with editors such as CodeMe. Developers "can use their Android tooling and port to Windows," Gallo said.
Gallo also showcased Microsoft's new Windows for IoT (Internet of Things). The new Windows IoT Core is available for Raspberry Pi, the popular low-cost small-board computer, and for MinnowBoard Max, the open source board for developing embedded apps on Intel Atom processors.
Building on Azure
In the second half of the Build tour keynote session, Neil Hutson, a Microsoft engineering evangelist, took the stage to talk about extensions to the Azure public cloud. In keeping with Microsoft's emphasis that Azure's not just for Windows and .NET apps, Hutson said that 25 percent of the instances running in the company's cloud are Linux based. "If you want to use a favorite language, platform, framework and operating system, we pretty much have you covered so you can do your best work with us," Hutson said.
While Microsoft has extolled that message for some time now, the next key selling point for Azure rests on its new Azure App Service, which lets developers build and deploy modern Web and mobile apps that can scale globally across the Azure cloud.
Hutson also gave airtime to the array of new data services hosted in Azure, ranging from what he described as a high-performance SQL DB to support for a variety of alternative SQL and NoSQL database types. Hutson outlined three new features in Microsoft's Azure SQL DB service. First is the Elastic Database Pool, which will let customers maintain separate, isolated databases, aggregate activity to smooth out peaks and define strict performance SLAs. Second is support for full-text search. The most warmly received new feature, judging by the audience applause, was the third: support for transparent data encryption (TDE). "That means when data is at rest, it's fully encrypted," Hutson said. Hutson also talked up the new Azure SQL Data Warehouse, which "lets you suck information from SQL DB, and aggregate on premises systems," he said.
Circling back to the Internet of Things, Hutson also talked up how the various data services, including Azure Machine Learning and Event Hubs, can connect to and process data from millions of devices in near real time.
Stream Analytics consolidates all of that data and allows users to create reports using the new Power BI. They can store vast amounts of data in the new Azure Data Lake. Hutson also underscored the new Office APIs that will enable interactivity between third-party apps and components of the productivity suite, especially Outlook. With the new Office APIs and Office Graph, developers can build native application experiences into Office 365, Hutson explained. Rather than toggle between Outlook and Salesforce.com, the two can now be integrated, he said.
Microsoft knows developers will be critical to the success of UWP, Azure App Service and the next generation of Office. If the road show comes to your neighborhood, you may want to learn the details and decide for yourself.
Posted by Jeffrey Schwartz on 05/18/2015 at 11:20 AM
Companies looking for a large global cloud provider to store significant amounts of data will do well choosing either Amazon Web Services or Microsoft Azure, with the latter performing slightly better, according to Nasuni's third biennial cloud storage report. Google, the only other cloud provider with large enough global scale to be compared with the two by Nasuni's standards, came in a distant third.
It's important to keep in mind that this is one benchmark by a single provider with its own requirements -- primarily using a large global cloud provider as a NAS target. But Nasuni performs the tests to determine which services it should offer as storage targets, and claims it's not wedded to any one player unless a customer specifies one. Nasuni first began sharing its benchmarks in 2012, when AWS had an overwhelming edge, though that was before Microsoft had a mature infrastructure as a service available.
Today, depending on the service, Nasuni primarily distributes its workloads between AWS and Azure and is always willing to add or shift to other suppliers. Nasuni currently prefers Microsoft Azure Blob Storage and Message Queue, though it uses AWS' DynamoDB database and EC2 compute instances, said John Capello, Nasuni's VP of product strategy. The primary test Nasuni conducted between October and February for the report evaluated a variety of read-write and delete scenarios, according to Capello.
"For our purposes, which is to write files for mid-sized to large enterprises to the cloud, Microsoft Azure Blob storage is a better target for us than Amazon or Google," he said. "Amazon is a very, very close second. Amazon and Microsoft seem to be, as many others have said, the real two competitors in this space in providing cloud services in general, but specifically with storage, they're very, very close in terms of both their speed, availability and their scalability."
The report, which is available for download if you're willing to give up your contact information, found that Microsoft outpaced Amazon and Google in writing data to a target in 13 of the 23 scenarios of varying thread counts or file counts. When it came to reading files, Microsoft consistently performed better, though not to the extent it did in the write tests. Microsoft was twice as fast as Amazon at deleting files and five times as fast as Google.
For system availability, Amazon's average response time of 0.1 seconds slightly edged Microsoft's 0.14 seconds, while Google was roughly five times slower. Nasuni also measured scalability: when writing 100 million objects to look at the number of read and write misses, "Microsoft had, by far, the largest write variance, which was more than 130 times larger than Google's, who had the smallest variance." Read and write errors were almost nonexistent, according to a summary of the report. "Only Amazon showed any misses at all: five write errors over 100 million objects, which gives an error rate of .00005 percent."
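Nasuni hasn't published its harness, but the shape of such a test, timing repeated write, read and delete operations and averaging the latencies, can be sketched against an in-memory stand-in. The stub store and operation names here are illustrative, not Nasuni's methodology:

```python
import time
import statistics

class StubObjectStore:
    """In-memory stand-in for a cloud object store (Blob/S3-style)."""
    def __init__(self):
        self._objects = {}
    def write(self, key, data):
        self._objects[key] = data
    def read(self, key):
        return self._objects[key]
    def delete(self, key):
        del self._objects[key]

def benchmark(store, n=1000, payload=b"x" * 1024):
    """Time n write/read/delete round trips; report mean latency per operation."""
    timings = {"write": [], "read": [], "delete": []}
    for i in range(n):
        key = f"obj-{i}"
        for op, call in (("write", lambda: store.write(key, payload)),
                         ("read", lambda: store.read(key)),
                         ("delete", lambda: store.delete(key))):
            start = time.perf_counter()
            call()
            timings[op].append(time.perf_counter() - start)
    return {op: statistics.mean(t) for op, t in timings.items()}

results = benchmark(StubObjectStore())
print(sorted(results))  # ['delete', 'read', 'write']
```

A real harness would point the same loop at the providers' SDKs and vary thread counts and file sizes, which is where the 23 scenarios in Nasuni's report come from.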
Nasuni omitted several key players from the test, notably IBM's SoftLayer, which was undergoing system upgrades that led to frequent periods of planned downtime during the testing period, according to Capello. HP was also initially in the test, though Capello said Nasuni chose to leave the company out this time because of HP's announced changes to its cloud strategy. "Before we decided we weren't going to continue testing them, they actually did surprisingly well, in some cases -- better than Amazon and Microsoft in some of the read-write and delete benchmarks," he said. "If we had run the full test, it would be interesting to see where they came out."
Posted by Jeffrey Schwartz on 05/15/2015 at 10:56 AM
Yesterday was the latest Patch Tuesday, the ritual that takes place on the second Tuesday of every month. But its days could be numbered. Patch Tuesday won't disappear anytime soon, but the release of Windows 10 will set the stage for its ultimate transition to a different release cadence.
Microsoft said at last week's Ignite conference that its procedure for issuing security patches will change with the new OS's release via new "distribution rings" similar to the fast and slow rings offered with the Windows 10 Technical Preview. The company describes it as part of its Windows-as-a-service transition. It will only apply to Windows 10 and future operating systems, given the change in the way Microsoft builds software.
"Our goal is patch Tuesday will go away," said Stella Chernysak, a senior director for Windows Commercial at Microsoft, during an interview last week at Ignite. "Windows Update for Business essentially means the challenges customers have historically had with their patching will be easier to address."
Chernysak said this will let Microsoft issue patches as needed and allow organizations to better automate how they apply those patches, while at the same time allowing for customers to maintain a predictable process.
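The "distribution rings" model Microsoft described amounts to staggering the same patch across groups of machines with different deferral periods. A minimal sketch, with ring names and delays invented for illustration (Microsoft had not published the actual ring configuration):

```python
from datetime import date, timedelta

# Hypothetical ring names and deferral periods, not Microsoft's actual defaults.
RINGS = {
    "fast": 0,        # pilot machines get patches immediately
    "slow": 7,        # broader rollout after a week of soak time
    "critical": 30,   # sensitive systems wait a month for validation
}

def rollout_schedule(patch_released: date) -> dict:
    """Map each distribution ring to the date its machines receive a patch."""
    return {ring: patch_released + timedelta(days=delay)
            for ring, delay in RINGS.items()}

# A patch released on the June 2015 Patch Tuesday
schedule = rollout_schedule(date(2015, 6, 9))
print(schedule["slow"])  # 2015-06-16
```

The predictability Chernysak described comes from the fixed offsets: IT knows in advance exactly when each ring will pick up a given patch.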
"This is our new foundation for us to deliver Windows as a service," she explained. Microsoft will start delivering the first set of Windows Update for Services this summer with additional functionality to be added in the fall.
Posted by Jeffrey Schwartz on 05/13/2015 at 11:50 AM
Microsoft's effort to displace passwords with its forthcoming biometrics-based Windows Hello and Passport technologies has received a fair amount of attention over the past few weeks. But Microsoft has another new technology slated for Windows 10 called Device Guard, which aims to further protect Windows from malware and from both known and new advanced persistent threats.
Device Guard, announced at last month's RSA Conference in San Francisco, will be an option for those who want deeper protection against APTs and malware in instances where intruders get in. Device Guard uses hardware-based virtualization to block the ability to execute unsigned code. It does so by creating a virtual machine that's isolated from the rest of the operating system. Device Guard can protect data and applications from attackers and malware that have already managed to gain access, according to Chris Hallum, a senior product manager for commercial Windows client security at Microsoft.
"This gives it a significant advantage over traditional antivirus and app control technologies like AppLocker, Bit9, and others which are subject to tampering by an administrator or malware," Hallum explained in an April 21 blog post. "In practice, Device Guard will frequently be used in combination with traditional AV and app control technologies. Traditional AV solutions and app control technologies will be able to depend on Device Guard to help block executable and script-based malware while AV will continue to cover areas that Device Guard doesn't such as JIT based apps (e.g.: Java) and macros within documents. App control technologies can be used to define which trustworthy apps should be allowed to run on a device. In this case IT uses app control as a means to govern productivity and compliance rather than malware prevention."
Device Guard blocks against malware and zero days targeting Windows 10 by only allowing trusted apps signed by software vendors, the Windows Store and internally developed software, according to Hallum. "You're in control of what sources Device Guard considers trustworthy and it comes with tools that can make it easy to sign Universal or even Win32 apps that may not have been originally signed by the software vendor," he explained.
When I met with Hallum and Dustin Ingalls, group program manager for OS security, at the RSA Conference in San Francisco last month, we primarily discussed Windows Hello and Passport, which Microsoft is hoping will replace passwords by enabling biometric authentication. Device Guard is not quite as sexy since it'll be invisible to individual end users but will allow enterprise IT administrators to make it impossible for attackers to execute code not recognized by Device Guard, Ingalls explained.
The VM created with Device Guard creates what Ingalls called a "tiny OS" where the operating system's decision-making components are isolated. "We take the actual critical integrity components and move those out of the main OS," Ingalls explained. "Now we have an operating system that's much more difficult to compromise. On top of that we make use of a feature called user mode integrity, which they know is vetted in the Windows Store."
Stella Chernysak, a senior director for Windows Commercial at Microsoft, described Device Guard as similar to BitLocker in concept. "Device Guard will be on business systems, where IT has an opportunity to turn it on," Chernysak explained in an interview last week at Microsoft's Ignite conference in Chicago. "It will be an option for IT to take advantage of that feature or IT may make the decision to ask an OEM or partner to turn it on."
Posted by Jeffrey Schwartz on 05/11/2015 at 12:48 PM
As rumors surfaced this week that Salesforce.com is seeking bidders to acquire the company, possibly including Microsoft, they were barely noticed at Chicago's McCormick Place, where the Ignite conference was taking place. The mere mention of them to attendees garnered uninterested looks; attendees were focused on taking in the wave of new wares Microsoft rolled out.
Though shares of Salesforce.com's stock were halted briefly on Tuesday when they spiked on the rumors, the reports don't suggest Microsoft has actually even made a bid, only that there are at least two bidders and that Microsoft could be one of them and SAP the other. So the speculation swirling around Wall Street is just that, though oftentimes speculation does turn into reality.
When I first read about the rumors, my reaction was that it does seem rather far-fetched that Microsoft would shell out as much as $60 billion to acquire Salesforce.com, even if it is the leading supplier of cloud-based software-as-a-service applications. One has to wonder if the huge investment and cash outlay would be worth it, considering such deals rarely have the upside the architects envision.
Considering Salesforce.com's market cap is about $49 billion, a deal to acquire it at a typical premium could reach $60 billion, give or take a few billion. Salesforce.com's 2014 revenues were just over $4 billion, with guidance for this year of $5.3 billion, or roughly 30 percent growth. While not shabby, that's down from last year's 35 percent increase, suggesting growth is slowing. Another problem: despite its revenue growth, Salesforce.com lost $263 million last year. Microsoft also has competing, though less successful, product lines, such as Dynamics and Yammer.
On the other hand, acquiring Salesforce.com, which is hugely popular with enterprises, could accelerate Microsoft's shift in transitioning into a SaaS provider and extend its developer footprint into the open source community. It would also give Microsoft an even larger presence in Silicon Valley.
Some of that upside notwithstanding, does Microsoft need to bet the farm on Salesforce.com when it could use that cash hoard in more fruitful ways?
Posted by Jeffrey Schwartz on 05/08/2015 at 12:30 PM
When Microsoft rolled out Windows Server 2012 and System Center 2012, the company branded them, along with the then-new Azure Pack, as its Cloud OS. While Cloud OS indeed provided the building blocks to create Azure-like clouds in private datacenters and at third-party hosting providers, many say it's not seamless. Azure itself is also a very different cloud than it was in 2012.
Cloud OS is a generic term used by a number of other providers, including Cisco and Hewlett-Packard. You can expect Microsoft to phase out the Cloud OS brand that described its approach to Windows Server and System Center in favor of the new Azure Stack. Along with the new Operations Management Suite, which enables management of multiple servers, clouds and virtual machines, Azure Stack substantially advances on the Azure Pack in that it aims to allow enterprises and hosting providers to build and manage cloud infrastructure that truly mirrors the functionality and experience of the Azure public cloud.
While Cloud OS as an amalgamation of Microsoft's datacenter software offerings didn't quite live up to its billing, Microsoft officials were confident at Ignite that the Azure Stack, along with its new operating system software, including the new Nano Server configuration and containers, will enable a common infrastructure for on-premises datacenters and Azure. Time will tell whether Microsoft delivers on that promise, but Azure Stack will come next year with Windows Server 2016 and System Center 2016, Microsoft officials explained here in Chicago this week. Corporate VP Brad Anderson introduced Azure Stack Monday in the opening keynote of Ignite.
"This is literally us giving you all of Azure for you to run in your datacenters," Anderson said. "What this brings you is you get that great IaaS and PaaS environment in your datacenters. You have incredible capability like a unified application model that gives you a one-click deployment experience for even the most complex, multitier applications and then you get that cloud-inspired infrastructure. We're giving you the same software controller that we built for our network, the name is the same, network controller. We're giving you our load balancing. We're giving you all the storage innovation."
Microsoft released the second technical preview of Windows Server 2016 Monday, and the Azure Stack is slated for a later test version. Ryan O'Hara, a Microsoft program director, explained in an Ignite briefing Tuesday that the Azure Stack will offer more features than the Azure Pack. Among other things, it will offer all of the services of both IaaS and PaaS and all of the Azure management tools. "We think about Azure Stack as the delivery of Azure innovations on premises," O'Hara said.
In Monday's keynote, Jeff Woolsey, a Microsoft senior technical product manager, demonstrated the Azure Stack. "You see the same IaaS virtual machines, the same network interfaces, the same public IP addresses, the same BLOB storage, the same SQL [and] the same role-based access control both in Azure and in Azure Stack," he said. Through the Azure Portal, Woolsey showed how to associate such Azure services as networking, compute and storage, as well as Azure's software-based load balancers, software-defined network controllers and the distributed firewall, into the Azure Stack. "We've packaged those up and put those in the Azure Stack for you so you're getting those same software-defined networking capabilities," he said.
Azure Stack will be a key component of the next version of Windows Server, but it will be a separate offering. As it rolls out, we'll see if it delivers the true vision of the hybrid cloud platform formerly known as Cloud OS.
Posted by Jeffrey Schwartz on 05/06/2015 at 12:10 PM
Days after courting developers to build apps for its new Universal Windows Platform at the Build conference in San Francisco, Microsoft deluged more than 23,000 IT pros attending its inaugural Ignite conference in Chicago with a barrage of new offerings to manage and secure the new platform and the entire IT stack.
Ignite kicked off today with a three-hour keynote headlined by CEO Satya Nadella, who talked up how the company's new wave of software and cloud services will enable IT and business transformation in line with the ways people now work. He also highlighted the need for better automation of systems, processes and management of the vast amount of data originating from new sources such as sensors.
Among the new offerings revealed during the keynote presentation were: Azure Stack, which brings Azure's IaaS and PaaS capabilities to on-premises datacenters; the Microsoft Operations Management Suite, to administer multiple server OSes, including Linux, as well as clouds and VMs; and Windows Update for Business, all while making the case for Windows 10 for enterprise users.
Nadella talked up the company's focus on "productivity and platforms" tied with the shift to cloud and mobility, saying everything Microsoft offers aims to bring all of that together in line with the changes in the way people work and new data types generated from sensors and other Internet-of-Things-type nodes.
"Every layer of the IT stack is going to be profoundly impacted," Nadella said in the keynote session. "This sets up our context. It sets up the tension we have as we set out to manage this IT landscape. "We want to enable to our IT professionals and end users to make their own choices for their own devices, yet we need to ensure the security, the management. We want to enable our business units to choose the SaaS applications of their choice, yet we want to have the compliance and control of efficiency."
Nadella emphasized three themes: making personal computing more personal and secure, bringing together productivity and process, and providing more agile back-end infrastructure. Just about everything Microsoft offers will be updated.
SQL Server 2016 will be "the biggest breakthrough in database infrastructure," with a technology called Stretch allowing a single table to span from the datacenter to Azure. Microsoft released the second preview of Windows Server 2016 and is readying System Center 2016 "to make it possible for you to have Azure in your datacenter which is consistent with the public cloud Azure," Nadella said. The new Microsoft Operations Management Suite will bring to datacenter administration what the Enterprise Mobility Suite provides for client device management, said Corporate VP Brad Anderson.
The company also gave major airtime to new security wares, including the release of the new Advanced Threat Analytics tool, which, among other things, monitors activity in Active Directory logs. The company is also moving from its traditional Patch Tuesday delivery of security updates, which takes place on the second Tuesday of every month, to "rings" of security releases that will start with the delivery of Windows 10.
For the most part, Microsoft emphasized its new release wave and how it will integrate with key platforms, notably iOS and Android. But in a departure, Windows Chief Terry Myerson couldn't resist taking a shot at Google while talking up the new wares designed to keep Windows even more secure: "Google just ships a big pile of [pause for emphasis] ... code, and leaves you exposed with no commitments to update your device." The jab was intended to showcase Microsoft's new focus on providing regular security updates for Windows.
Joe Belfiore, corporate VP for Microsoft's operating systems group, showcased the new Windows Hello technology, tied to the company's new Passport authentication service, coming to Windows 10. While Windows Hello will support all forms of biometrics, Belfiore demonstrated Windows 10 using facial recognition to authenticate the user. Belfiore also demonstrated many popular Windows 7 features that will reemerge in Windows 10, as well as new ones, like Cortana, the personal assistant that answers questions. "My mission is to convince you and give you the tools with the belief your end users will love and desire Windows 10," Belfiore said.
In coming posts, we'll drill down into these new offerings, which represent much of Microsoft's product waves expected in the next six-to-12 months.
Posted by Jeffrey Schwartz on 05/04/2015 at 2:08 PM
Microsoft spent the last two days trying to convince its own and the rest of the software development community that building applications to its new Universal Windows Platform (UWP) will let them create innovative and competitive apps. Indeed, providing a common architecture for PCs, tablets and Xbox is a lofty goal with promising implications. Likewise, HoloLens, Microsoft's virtual reality headgear, is a worthy attempt to create new capabilities, though its success remains to be seen. The move to provide interfaces that will let Android and iOS developers extend their apps to Windows -- and vice versa -- raised eyebrows this week. It could be a last-ditch effort to save Windows Phone from fading into obscurity, but even if it can't save Microsoft's struggling smartphone platform, UWP could still be a hit in other ways.
In short, UWP will support everything from legacy Win32 apps in the new Windows Store to Web, Android and iOS apps. Michael Domingo, editor in chief of Redmond magazine sister publication Visual Studio Magazine, is at the Build conference this week, where he detailed the various tools that will create these bridges. Among them, Project Astoria is the Android runtime bridge, which can be used from the Android Studio IDE to refactor Android app code for the Windows 10 platform. It will include a Windows emulator and is supposed to allow for debugging and testing of apps from either the Android IDE or the Visual Studio IDE. The new Project Islandwood toolkit is an iOS bridge for developing from Objective C. Windows chief Terry Myerson demonstrated some of the progress his group has made with the tool, showing the ability to debug and test Xcode projects from within the Visual Studio IDE. Project Centennial is aimed at Windows developers who want a shortcut for recasting current .NET and Win32 Windows apps for the newer Windows Store.
"Windows 10 is going to enable you to reuse your Web code, your .NET and Win32 code, your Android, Java and C++ code, to build amazing new applications, bringing the code over, extending it, putting it in the Windows Store and reaching 1 billion Windows 10 customers," said Terry Myerson, executive vice president and leader of Microsoft's Windows team, in Wednesday's opening keynote at Build, held in San Francisco. Likewise, "you will be able to compile the same Objective C code that's being used in iOS applications within Visual Studio on Windows, enabling you to leverage that code and extend it with the capabilities only found on the Windows platform."
David Treadwell, a corporate VP for operating systems at Microsoft, yesterday demonstrated how Windows 10 will provide a bridge for the Universal Windows Platform and store. "Apps written to these classic platform technologies will be able to be packaged and deployed with AppX," Treadwell said. "You'll get the same fast, safe, trusted deployment as apps written to the Universal Windows Platform."
Critics were quick to question how well Android and iOS apps will work on UWP, particularly on Windows Phone. "Okay programmers, what do you get when you run something in emulation?" asked blogger Steven J. Vaughan-Nichols in a ZDNet post. "That's right. You get slow performance."
Vaughan-Nichols, an expert on the open source community and a Microsoft critic, had a more fundamental question: "If you're a Windows Phone or RT developer, may I ask why?" pointing to the platform's below-4-percent and falling market share. "Microsoft has handed the keys to the Windows Mobile kingdom to Android and iOS programmers. Whether those developers will bother with it is another question. After the first flush of excitement, they too will face considerable technical and market problems getting their apps profitably on Windows. I think Microsoft is making a desperate play to stay relevant in the mobile space with its own operating system and it's one that's destined to fail."
Key to disproving the Vaughan-Nichols theory will be the ability to bridge these apps with ease, agility, speed and with no degradation in performance. Now that Microsoft has built it, will the developers come?
Posted by Jeffrey Schwartz on 05/01/2015 at 12:18 PM
Microsoft believes its new Windows 10 operating system will find its way onto 1 billion PCs, tablets, phones, Xbox gaming consoles and emerging device form factors like its HoloLens by fiscal year 2018, which begins in just over two years. Terry Myerson, executive vice president for Microsoft's Windows group, made the bold prediction during the opening keynote presentation at the annual Build conference, which kicked off today in San Francisco.
But convincing developers to build applications for the new Universal Windows Platform and its application store will be critical if Microsoft is to achieve that goal. By providing a common code base across form factors, Microsoft believes it will give customers an appealing reason to embrace Windows 10.
In opening remarks, Microsoft CEO Satya Nadella made the case for Windows 10. "Windows 10 represents a new generation of Windows built for an era of more personal computing from Raspberry Pi (the low-cost single-board computer) to the holographic computer," Nadella said.
"Universal Windows apps are going to enable you to do things you never thought were possible," Myerson said. "With Windows 10 we are targeting the largest device span ever. We're talking about one platform -- a single binary that can run across all these devices." While Microsoft has talked up that theme for some time, Myerson announced four key developments that could further embolden Windows to developers and consequently millennials who tend to gravitate to other computing and device platforms.
Perhaps most noteworthy is the ability to port application code for iOS and Android to the new Universal Windows Platform. Windows phones will include an Android subsystem, and extensions to Windows will enable Android apps to be brought over to Windows, Myerson said. Developers will be able to bring the code over, extend it and put it in the Windows Store, "reaching 1 billion Windows 10 customers," he said.
Myerson also announced developers will be able to compile the same Objective C code used to build Apple iOS apps for iPhones and iPads within Visual Studio on Windows, "enabling you to leverage that code and use capabilities only found on the Windows platform."
Addressing Web applications, Myerson announced that developers will be able to package Web sites as Universal Windows apps, reusing their server-hosted code and tools. "Developers will be able to give Web sites live tiles, integrate with Xbox Live and more," Myerson said. Developers can also now enable Cortana notifications, he noted.
Microsoft is also adding support for .NET and Win32 apps into the Windows Store, enabling these apps to take advantage of all of the Universal Windows Platform capabilities. It does so by drawing on lessons from Microsoft's App-V technology, which lets developers run their applications in virtual environments. Adobe said its Photoshop Elements and Illustrator will be available in this environment.
The ability to run iOS, Android, legacy Win32 and .NET code could address key barriers to Windows adoption, but what will ultimately make Windows 10 fly is its ability to deliver capabilities not currently available elsewhere. Much of that is now in, or coming into, the hands of developers.
Posted by Jeffrey Schwartz on 04/29/2015 at 2:06 PM
Testing beta software is always fraught with unexpected challenges but the new Windows 10 Technical Preview Build 10061, released last week, might test your patience. If you've already downloaded it, you know what I mean. If you're not on the "fast ring" release cycle of the Windows Insider program and haven't seen it, prepare to roll up your sleeves.
Microsoft gave a heads up about some of the bugs that are always present when we agree to test beta software. None of the problems seems insurmountable. The most obvious issue is that Win32 apps, including Microsoft Office, won't launch from the Start Menu. Microsoft was aware of this when it released the new preview but wanted to showcase the new features and tweaked look.
There's an easy fix for this problem, as Gabe Aul, chief of the Windows Insider program, explained in last week's blog post announcing the release. "We know this one will be a bit painful but there is a bug with this build in which Win32 (desktop) apps won't launch from the Start Menu," he explained. "The workaround is to use search to find and launch these apps and pin them to your taskbar for quick access." Once you do that, launching applications will be fine.
If you liked using Microsoft's new browser, code-named Project Spartan, it may appear Microsoft pulled it from the Technical Preview, along with the beta of the new Windows Store. Both are still there -- you just have to find them and "repin them to your Taskbar from All apps on your Start Menu," Aul said.
Despite some of these issues, Microsoft wanted to showcase some of the newest features coming to Windows 10. Among them, according to Aul, are:
- Start Menu can be resized by users
- Black theme across Start Menu, TaskBar and Action Center
- Taskbar optimized for tablets. In tablet mode the size of the Start button, Cortana and Task View buttons increases making them optimized for touch
- Boot-to-tablet mode is the default setting for tablets smaller than 10 inches
- Virtual desktops: Users can now create as many as they need
Fixes from the previous build include:
- Hyper-V now works
- Visual Studio won't crash when creating Universal Apps
- Project Spartan browser bugs repaired
What's your reaction to the latest Windows 10 build?
Posted by Jeffrey Schwartz on 04/27/2015 at 12:55 PM
It's been a long time coming for Amazon.com investors, who have grown increasingly impatient with the drag its cloud computing business has imposed on profits, but the company last week gave them some good news. Under pressure to give a more detailed breakdown of the revenues and profitability of its Amazon Web Services subsidiary, the company caved and promised to share that information commencing with last week's Q1 2015 earnings report.
In its first disclosure on AWS, Amazon said its cloud computing subsidiary is a $5 billion business that's growing. Specifically, it posted $1.57 billion in revenue for the period, a 49 percent year-over-year increase. Based on analyst estimates, that would put AWS on a $6 billion run rate, according to Redmond's new sister site AWSInsider. The most surprising revelation from Amazon's earnings report was that AWS is profitable, with an operating margin of 17 percent. Macquarie Analyst Ben Schachter told The Wall Street Journal that AWS "is significantly more profitable than we expected."
Noting the 15 percent jump in the company's stock on the news, Finro Equity Analyst Lior Ronen was among a number of analysts suggesting that Amazon spin off AWS. In a Seeking Alpha blog post, Ronen said that based on AWS' $1.57 billion quarterly revenue and 11 percent quarterly growth, the segment is on a $6.9 billion annual revenue run rate. Assuming a price-to-sales ratio ranging from 7 to 10, AWS is worth between $48 billion and $69 billion, Ronen predicted.
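Ronen's arithmetic is easy to reproduce. The sketch below assumes the $6.9 billion figure comes from annualizing the quarter and then applying one period of 11 percent growth, since the post doesn't spell out the method:

```python
# Reproduce the run-rate and valuation arithmetic from Ronen's post.
# Assumption: annualize the quarter, then apply 11% growth once.
quarterly_revenue_b = 1.57  # AWS Q1 2015 revenue, $ billions
quarterly_growth = 0.11     # assumed quarter-over-quarter growth

run_rate_b = 4 * quarterly_revenue_b * (1 + quarterly_growth)
print(f"annual run rate: ${run_rate_b:.1f}B")

# Bracket the valuation with 7x and 10x price-to-sales multiples.
low_b, high_b = 7 * run_rate_b, 10 * run_rate_b
print(f"implied value: ${low_b:.0f}B to ${high_b:.0f}B")
```

That yields a run rate just under $7 billion and a valuation band of roughly $49 billion to $70 billion, in line with the $48 billion to $69 billion range Ronen cites.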
"By spinning AWS, Amazon will be able to create two tech giants -- one focused on e-commerce and online retail business and the other on cloud computing and IaaS services," he said. "Amazon could leverage the two companies to create a whole that is bigger the sum of its parts: AWS could focus on its niche, develop new revenue streams, and invest further in its technology, while Amazon could do the same on its e-commerce platform. That is the only way Amazon could create a sustainable growth for the long term and employ the advantages it has in both businesses."
However, Equity Analyst James Brumely was among those skeptical about AWS' long-term prospects. In a separate Seeking Alpha post, Brumely argued that as cloud services become more commoditized, margins for AWS will come under pressure, and that Google and Microsoft will continue to squeeze Amazon. "Even as exciting as unexpected operating profits are for the Amazon Web Services (AWS) arm of the e-commerce giant, it doesn't change the fact that the company still lost money last quarter, nor does it change the fact that margins for AWS are more likely to continue to shrink rather than widen as cloud-computing continues to become commoditized," he said.
Furthermore, the 17 percent margin isn't as impressive as it seems, he argued, pointing out that 291 of the companies in the Fortune 500 have operating margins of 15 percent or higher. More alarming, he said, is that the 17 percent margin represents a marked decline from the 23 percent margin AWS posted in the same quarter last year.
"What happened?," he asked. "In simplest terms, Amazon (in a very Amazon-esque manner) has decided to become and remain the low-price leader with the cloud-storage world, and didn't worry about making much -- if any -- profit in the business. As turns out, it still made some operating profit as a cloud-computing provider, but it's making progressively, relatively less as time moves along."
The figures from Amazon's AWS disclosure may be vague, but they're a noteworthy step not just for investors but for buyers of cloud infrastructure services, who -- while looking to get the best deal possible -- surely don't want to see their provider lose money indefinitely. And just as competitors tend to respond to one another's pricing moves, it'll be interesting to see if Microsoft, Google, IBM and others follow suit.
Posted by Jeffrey Schwartz on 04/27/2015 at 7:50 AM
Microsoft is extending its bug bounty program, which pays up to $100,000, to include Azure, Hyper-V and the new Project Spartan browser that will be included in the new Windows 10 operating system.
Microsoft's bounty program has existed for several years and has already provided awards for detecting flaws in Internet Explorer and Office 365. Microsoft Azure CTO Mark Russinovich announced the addition of the cloud service, Hyper-V and Project Spartan to the bounty program at this week's RSA Conference in San Francisco. Among his three talks at the conference was an overview of the security of the Azure cloud service; he made the announcement at the end of that presentation.
"We want to make sure we don't have attacks discovered on Hyper-V before we do, so we're asking now for researchers to be there so we can get on top of them before attackers can take advantage of them," Russinovich said. "It's showing that we really want to keep our systems secure and make sure that the good guys aren't encouraged to go take their information and do evil with it but rather help everybody get some incentive like this."
Bounties for fixes that cover known flaws range from $500 to $15,000, and Microsoft will pay up to $100,000 for a mitigation bypass of any of the company's isolation technologies. It also offers $50,000 BlueHat bonuses for the discovery and mitigation of zero-day vulnerabilities. Jason Shirk of the Microsoft Security Response Center said in a blog post Wednesday that the bounty extension will cover Azure virtual machines, Azure Cloud Services, Azure Storage and Azure Active Directory, among others. The bounty will also cover Sway.com, the preview of Microsoft's new content-creation and presentation app. Shirk said Microsoft is only offering the bounty for Project Spartan through June 22.
Stephen Sims, a security researcher at the SANS Institute, said Microsoft has paid handsomely for a number of discoveries, such as last year's $100,000 award to Yang Yu, who disclosed three exploit mitigation bypass techniques to the company. "My experience with MSRC is they're kind of a pain, they're not very friendly about it, but it is good that they have that program set up," Sims said. "But they do pay if you can prove to them without a doubt. If you can find one bug, it's a year's salary, potentially."
Posted by Jeffrey Schwartz on 04/24/2015 at 12:13 PM
When the developers of the original RSA encryption algorithms built what has become the mainstream means of encrypting and decrypting data, it wasn't lost on them that some bad guys might also find malicious uses for it. Two of its inventors yesterday said they were alarmed at the use of encryption for ransomware, which has become a pervasive way of gaining access to users' PCs and enterprise servers using increasingly sophisticated social engineering and phishing techniques.
"As a security threat, encrypting ransomware has flown beneath the radar of many IT departments. It emerged as a consumer problem and at smaller companies and agencies," said Paul Kocher, president and chief scientist at Cryptography Research, who once again moderated this year's Cryptography Panel at the RSA Conference in San Francisco. "Many IT admins, unfortunately, write off the potential for ransomware incidents as unavoidable end-user errors that merit a slap on the wrist, but can't be helped. But all evidence suggests the problem isn't going away."
Given that two of the panelists -- Adi Shamir, a professor at the Weizmann Institute in Israel, and MIT Professor Ronald Rivest -- invented the algorithms behind much of today's RSA-based encryption, Kocher asked them for their perspective on its use for ransomware.
"As the inventor of one of the algorithms, I sort of feel like the mother whose son has been brainwashed and he's off to become a Jihadist in Syria somewhere," Rivest said. "I think that ransomware is one of those areas where our community failed in a particularly miserable way," Shamir added. "There are good security programs you can use in order to protect yourself from this ransomware."
Shamir said he fears the worst is yet to come as the Internet of Things enables homes and businesses to become more connected. "Think about your TV being ransomware'd stopping to work, with a big display that you have to pay in to get the TV service back," Shamir said. "I think it's a very serious problem. It's going to stay with us and we really have to think about new techniques to stop it."
Shamir also noted that because systems can be infected silently for weeks or months before a user is aware of it, backing up files also won't solve the problem. "Eventually your files on the backup are going to be the encrypted files," he said. "This is a huge issue of the correctness of backed up data, which is a major problem."
This month's Redmond magazine cover story looked at the continued impact of ransomware on consumers and enterprises alike. Panelist Ed Giorgio, a cryptographer and security expert, said the malicious use of encryption is just part of the problem. "Ransomware is not just about encrypting your data so you don't have access to it; in order to do ransomware you have to first penetrate somebody's computer, then you have some sort of an exploit," Giorgio said. "But as we all know, criminals are very innovative and once they penetrate your file, they will find other things in your computer they can blackmail you for. Even if we do solve the loss of data problem, ransomware will still be around."
Posted by Jeffrey Schwartz on 04/22/2015 at 12:16 PM
This is not your father's RSA. That was the message the company's new president, Amit Yoran, effectively delivered in Tuesday's opening keynote at the annual RSA Security Conference in San Francisco, attended by more than 30,000 IT security professionals. While it's hosted by RSA, the EMC subsidiary known for developing the industry-standard RSA public key cryptography algorithm, the conference is an industry event with participation by partners and competitors alike.
While Yoran focused his keynote on issues that plague security professionals, it also set the stage for changes he is planning for the company, whose reins he took last year from longtime President Art Coviello, who recently retired. "We're reengineering RSA across the board to enable us to deliver on this vision," Yoran said toward the end of his address. "This time next year, we won't be the same RSA you have known for decades."
Yoran didn't use his keynote to explain how he plans to remake RSA. But in brief remarks at a gathering of press and analysts a day earlier, he indicated a move away from RSA's original SecurID strong authentication token platform. Addressing the current risk factors, which extend beyond enterprise perimeters thanks to the growing ubiquity of public and hybrid cloud services, he noted the launch of the new Via identity management product line and extensions to RSA Security Analytics.
RSA described its new Via portfolio as the first set of smart identity tools that use contextual awareness, rather than static rules such as traditional passwords, to grant single sign-on access to systems. The first in the portfolio, RSA Via Access, is a software-as-a-service (SaaS) offering that provides step-up authentication using mobile devices along with single sign-on. The portfolio also includes RSA Via Governance, built on the identity management and governance platform RSA gained with its acquisition of Aveksa, which provides views into access privileges, automates user access and flags orphan accounts and inappropriate access, according to the company. Also built on the Aveksa acquisition is the new Via Lifecycle, a user provisioning platform.
The other major area of emphasis is the extended capability of RSA Security Analytics. Based on RSA's 2011 acquisition of NetWitness, which Yoran led as CEO at the time, the new release will focus on extending visibility from the endpoint to the cloud. And that gave Yoran much of the fodder for his opening keynote.
Referring to the 2014 Verizon Data Breach Investigations Report finding that less than 1 percent of successful advanced threat attacks were spotted by SIEM systems, he pressed his call for change. "We're still clinging to our old maps," he said. "It's time to realize that things are different."
Given that existing defense mechanisms are not sufficient in and of themselves these days, he believes analytics will be key to proactively identifying attacks. "We must adopt a deep and pervasive level of true visibility everywhere, from the endpoint to the network to the cloud, if we have any hope of being able to see the advanced threats that are increasingly today's norm," he said.
The Stuxnet, Equation Group and Carbanak intrusions are a handful of examples he pointed to. "One of the defining characteristics across all of them is their stealthy nature," he said. "Until written about they were virtually undetectable because they bypassed traditional defenses. Even now many organizations operate completely blind as to whether they are victim to these published techniques. Traditional forms of visibility are one-dimensional, yielding dangerously incomplete snapshots of an incident, let alone any semblance of understanding of an attack campaign. Without the ability to rapidly knit together multiple perspectives on an attack, you'll never fully understand the scope of the overall campaign you're facing."
Arguing he wasn't merely hawking his products, Yoran said: "I'm not just standing up here and saying 'buy RSA gear.' I'm the first to admit that we need to go further than what is available today. We're on a journey to full visibility. Our environments, business practices and adversaries continue to evolve and so must we."
As I said, this is not your father's RSA.
Posted by Jeffrey Schwartz on 04/21/2015 at 2:19 PM
Microsoft late last week said it's shutting down the MS Open Tech subsidiary it formed three years ago to invest in open source initiatives and will absorb it back into the company. Microsoft announced the formation of Microsoft Open Technologies Inc. in April 2012, staffing it with an interoperability strategy team in Redmond that aimed to accelerate the company's push into the open source community.
In a blog post late Friday, MS Open Tech's president Jean Paoli said the independent organization accomplished what it set out to do and the time is right to bring its people and efforts back into Microsoft. "MS Open Tech has reached its key goals, and open source technologies and engineering practices are rapidly becoming mainstream across Microsoft," Paoli said. "It's now time for MS Open Tech to rejoin Microsoft Corp., and help the company take its next steps in deepening its engagement with open source and open standards."
The move is hardly surprising. In the past year, Microsoft has extended its push into the open source community further than most ever would have expected. Not that Microsoft is positioning itself as an open source company, but it in some way supports every major initiative and has made contributions once unthinkable, including open sourcing its .NET Framework. Mark Russinovich, CTO for Azure, raised eyebrows earlier this month by floating the possibility of Microsoft open sourcing Windows, saying "it's definitely possible."
"Open source has become a key part of Microsoft's culture," Paoli said in his Friday post. "Microsoft's investments in open source ecosystems and non-Microsoft technologies are stronger than ever, and as we build applications, services, and tools for other platforms, our engineers are more involved in open source projects every day. Today, Microsoft engineers participate in nearly 2,000 open source projects on GitHub and CodePlex combined."
Paoli also noted that Microsoft has brought "first-class support" for Linux to Azure, partnered with Docker to enable its containers on Azure and Windows, built Azure HDInsight on Apache Hadoop and Linux, and created developer support for open platforms and languages including Android, Node.js and Python. In addition to deep support for Docker, Paoli pointed to integration with other key environments, both open and competing proprietary platforms, notably iOS. Among other projects he noted were contributions to Apache Cordova, Cocos2d-x, OpenJDK, and dash.js, support for Office 365 on the Moodle learning platform and collaboration on key Web standards including HTML5, HTTP/2 and WebRTC/ORTC.
As Microsoft absorbs MS Open Tech, it will create the Microsoft Open Technology Programs Office, according to Paoli. "Team members will play a broader role in the open advocacy mission with teams across the company," he said. "The Programs Office will scale the learnings and practices in working with open source and open standards that have been developed in MS Open Tech across the whole company. Additionally, the Microsoft Open Technology Programs Office will provide tools and services to help Microsoft teams and engineers engage directly with open source communities, create successful Microsoft open source projects, and streamline the process of accepting community contributions into Microsoft open source projects."
Posted by Jeffrey Schwartz on 04/20/2015 at 11:17 AM
Microsoft's efforts to support containers in Windows took another step forward yesterday with the release of the Docker Client for Windows. The release of Microsoft's Docker command-line interface for Windows comes with Docker's updated container platform, dubbed Docker 1.6.
It comes after an active week for Docker, which on Tuesday received a huge equity investment of $95 million that the company said it will use in part to further its collaborations with partners including Microsoft, Amazon Web Services and IBM. Microsoft also just announced that Docker containers are coming to Hyper-V and Windows Server.
"Docker Client for Windows can be used to manage Docker hosts running Linux containers today, and managing Windows Server Containers and Hyper-V Containers will be supported in the future to provide the same standard Docker Client and interface on multiple development environments," wrote Ahmet Alp Balkan, a software engineer for Azure Compute at Microsoft. Microsoft and Docker have collaborated to port the Docker Client to the Windows environment in Docker's open source project, which you can see on GitHub."
Balkan also said IT pros will be able to find Windows Server Container images in the Docker Hub among the 45,000 Docker images for Linux already available, a figure that continues to grow. IT pros and developers can download the Docker Client for Windows via the Chocolatey package manager, or they can install Boot2Docker, which creates a Docker development environment within a virtual machine, Balkan noted.
The new Docker 1.6 includes container and image labels, which let IT pros and developers attach user-defined metadata to containers and images for use by various tools, Docker said. In addition, the new Docker Registry and API, along with the Docker 1.6 Engine, boast improved reliability and performance.
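On the image side, the labels feature surfaces as a new Dockerfile instruction. The following is a minimal, hypothetical sketch; the base image, label keys and values are illustrative examples, not taken from Docker's announcement:

```dockerfile
# Illustrative Dockerfile using the LABEL instruction added in Docker 1.6.
# The base image and label keys here are hypothetical examples.
FROM ubuntu:14.04
LABEL com.example.vendor="Example Corp" \
      com.example.release-date="2015-04-16"
CMD ["echo", "labeled image"]
```

Once an image is built this way, the metadata travels with it and can be read back with `docker inspect`, giving tools a standard place to hang deployment or auditing information.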
Docker's updated Compose 1.2 tool, designed for defining and running complex multi-container applications, reduces repetitive setup. The release also includes the Swarm 0.2 clustering component, which the company said turns a pool of Docker hosts into a single virtual host. The update adds a new spread strategy for scheduling containers, support for more Docker commands such as pulling and inspecting images, and the ability to add more clustering drivers.
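To make the repetitive-setup point concrete, Compose captures a multi-container application in a single YAML file. A minimal sketch of the era's docker-compose.yml format follows; the service names and images are illustrative assumptions, not from Docker's release notes:

```yaml
# Illustrative docker-compose.yml in the Compose 1.x format of the time.
# Service names and images are hypothetical examples.
web:
  image: nginx        # front-end container
  ports:
    - "8080:80"       # host port 8080 mapped to container port 80
  links:
    - db              # makes the db container reachable from web
db:
  image: postgres     # back-end database container
```

A single `docker-compose up -d` then starts both containers with the declared links and port mappings, replacing the sequence of manual `docker run` commands it encodes.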
Finally, Docker added Machine 0.2, which the company said has an improved driver interface, more reliable provisioning and the ability to regenerate TLS certificates to ensure better security when a host's IP address changes.
Posted on 04/17/2015 at 1:03 PM
Ask most people which companies are Microsoft's biggest rivals and some will say Apple, but most will identify Google. Several published reports even point to powers in Redmond as a key force behind regulators coming down on the search giant this week. IT pros may throw VMware and Red Hat into the mix of major Microsoft competitors, but Microsoft's neighbor Amazon Web Services is right up there, having launched its famous cloud infrastructure services years ahead of Microsoft. Azure itself got off to a slow start, lacking a complete infrastructure service to rival the offerings of AWS.
Microsoft talked up the chink in the armor last year when Amazon shocked investors with heavier-than-expected losses, due primarily to its investments in AWS. Anyone who knows founder and CEO Jeff Bezos is aware he's not going to throw in the towel on AWS that quickly, though some would like to see him agree to spin off the cloud business into a separate company. But Bezos' rationale for AWS was always to let the two businesses feed off each other.
Proponents of a divestiture could be buoyed or deflated depending on what Amazon's numbers look like when it reports next week but, to date, history is not on their side. While we've reported on gains by Microsoft, IBM and numerous others at the expense of AWS, by no means is it game over for Amazon, which continues to crank out new offerings on a weekly basis. Consider the past week, when AWS held one of its regional summits, this one in San Francisco. The company pointed to the fact that software partners continue to extend support for AWS services, simplified its Amazon Machine Learning service, announced its new Elastic File System service and extended its burgeoning Amazon WorkSpaces offering.
Rarely does a week go by when there isn't something new coming out of what is still the largest provider of infrastructure services, which is why Redmond's parent company, 1105 Media, has launched the new AWSInsider site, which debuted this week. The new sister site is a welcome addition to our portfolio, but it will in no way diminish the way Redmond covers AWS for Microsoft-focused IT pros. Rather, it only promises to enhance it.
The timing couldn't be better, as AWS furiously fights off cloud competitors. And its No. 1 antagonist and Pacific Northwest rival, Microsoft, is about to step up that battle at its Build conference in two weeks and at Ignite in early May. In a preview leading up to Microsoft's big splash, Jeffrey Snover, lead architect for the Windows Server division, last week talked about six key sessions he'll be participating in with the likes of Azure CTO Mark Russinovich, covering the company's datacenter vision moving forward, which includes new versions of Windows Server, System Center and Azure. A key component will go deep on how this new datacenter vision extends the company's hybrid cloud platform, aka Cloud OS, with new levels of automation aided by PowerShell's Desired State Configuration and support for containers.
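For readers unfamiliar with Desired State Configuration, the automation is declarative: an administrator describes the end state a server should be in, and the DSC engine converges the machine to it and keeps it there. A minimal, hypothetical sketch follows; the configuration name, node name and feature are illustrative, not from the Ignite session previews:

```powershell
# Hypothetical Desired State Configuration (DSC) document.
# The configuration name, node name and feature are illustrative examples.
Configuration WebServerBaseline {

    Node "Server01" {

        # Declare that IIS must be installed; DSC installs it if absent
        # and can detect and correct drift afterward.
        WindowsFeature IIS {
            Name   = "Web-Server"
            Ensure = "Present"
        }
    }
}

# Invoking the configuration compiles it to a .mof file, which the
# Local Configuration Manager on the target node then applies.
WebServerBaseline -OutputPath .\Mof
```

The appeal for the datacenter vision Snover describes is that the same declarative document works whether the node is on-premises or in Azure.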
It will be interesting to see the new offerings coming not only from Microsoft and AWS but from all of the major players, as well as the lesser-known ones, all of which will play a key role in how organizations view and procure IT in the future.
Posted by Jeffrey Schwartz on 04/16/2015 at 1:19 PM
The European Union has once again thrown down the gauntlet on Google, this time charging the company with violating antitrust laws by using its dominance in search to favor its own comparison shopping service at the expense of others. The EU is also launching a separate investigation into whether Google has used its clout as the dominant supplier of mobile phone software to hold back providers of competing mobile operating systems, namely Apple and Microsoft. Google denied both allegations.
Regarding the charges that it skews results in its search engine to benefit its own shopping comparison service, the EU charged that "Google gives systematic 'favourable' treatment to its comparison shopping product (currently called 'Google Shopping') in its general search results pages, e.g. by showing Google Shopping more prominently on the screen."
The complaint also charged that Google diverts traffic from competing comparison shopping services, obstructing their ability to compete.
"The Commission is concerned that users do not necessarily see the most relevant results in response to queries -- this is to the detriment of consumers, and stifles innovation," it said in a statement. The EU wants Google to operate its own comparison shopping services the same as it treats those of rivals. Google has 10 weeks to respond, at which point the EU will hold a formal hearing.
In response to that allegation, Google said in a blog post it has plenty of competitors and argued its own offerings are often underdogs. "Indeed if you look at shopping -- an area where we have seen a lot of complaints and where the European Commission has focused in its Statement of Objections -- it's clear that (a) there's a ton of competition (including from Amazon and eBay, two of the biggest shopping sites in the world) and (b) Google's shopping results have not harmed the competition," Amit Singhal, senior vice president of Google Search, said in a blog post. "Companies like Facebook, Pinterest and Amazon have been investing in their own search services and search engines like Quixey, DuckDuckGo and Qwant have attracted new funding. We're seeing innovation in voice search and the rise of search assistants -- with even more to come."
As for Android, the EU said it's investigating whether Google has violated antitrust regulations by thwarting the development of mobile applications for other operating systems, providing incentives to smartphone and tablet suppliers to install Google's apps and services exclusively. "Distribution agreements are not exclusive, and Android manufacturers install their own apps and apps from other companies as well," said Hiroshi Lockheimer, Google's VP of engineering for Android, in a blog post addressing the investigation. "And in comparison to Apple -- the world's most profitable (mobile) phone company -- there are far fewer Google apps preinstalled on Android phones than Apple apps on iOS devices."
Do you feel the EU has a case or are the latest charges just a witch hunt?
Posted by Jeffrey Schwartz on 04/15/2015 at 11:30 AM
Microsoft today, as planned, is releasing the Skype for Business client just weeks after introducing the technical preview. The company announced the release of the new Skype for Business as part of the April Office 2013 update rollout. All Office 365 customers are scheduled to receive the update by the end of May.
Organizations not ready to let their users transition to the new Skype for Business client can have administrators switch them back to the existing Lync interface, Microsoft said. The company posted instructions for rolling back to the current Lync client, both for shops with Lync Online and those with Lync Server.
A new version of the Lync Server, to be called Skype for Business Server, is scheduled for release next month. Microsoft said in March that the new server edition will support high availability including support for the company's AlwaysOn capability included in SQL Server.
Skype for Business is the new phone and conference interface that replaces Lync and brings the look and functions of Skype to Office. Microsoft claims that more than 300 million consumers use Skype, which it acquired in 2011 for $8.5 billion, its largest acquisition to date. Now comes the litmus test of whether Microsoft will get bang for its buck. By integrating the enterprise features of Lync with the interface of Skype, Microsoft is hoping to raise the profile of its communications technology among business and enterprise users.
Microsoft first indicated plans to integrate Lync with Skype and give it the Skype brand late last year, and it released the technical preview of the new Skype for Business client at last month's Convergence conference in Atlanta. Skype for Business represents Microsoft's latest effort to gain an even stronger foothold in unified communications, a market it has long aspired to lead. Microsoft introduced Lync nearly five years ago as a revamped iteration of its Office Communications Server.
The company is hoping that familiarity with, and access to, the 300 million users Microsoft claims Skype has will increase its appeal and usage both within businesses and among consumers. Microsoft says Skype for Business has "enterprise-grade security" and controls for compliance. Just like the existing Skype and Lync clients, the new Skype for Business provides IM, presence, voice and video calls and meetings. With this new release, Skype is integrated directly into Office.
In the new client, users can initiate and control calls from their Office contact lists. It also brings Skype emoticons to discussions, improved file transfer including drag and drop, the ability for recipients to see file details such as size and name, and OneNote integration for taking notes from within the client. It also includes the Skype call monitor.
How quickly do you see your organization using Skype for Business?
Posted by Jeffrey Schwartz on 04/14/2015 at 10:26 AM
Investors are so bullish about Docker's potential to play a major role in the future of enterprise IT infrastructure and software development that they filled its coffers with $95 million in Series D funding. Docker, regarded as the leading provider of containers that let enterprise developers build service-oriented, scalable and portable software, took the huge cash infusion even though it hasn't used up the $40 million it raised last fall.
The company's meteoric rise in just two years has quickly garnered support from enterprise IT heavyweights including Amazon Web Services, Google, Microsoft, VMware and IBM. Not only does Docker aspire to make operating systems and virtual machines much less relevant, but it wants to make it possible for developers to build software without regard to the OS or public cloud provider, with the ability to scale with or without a virtual machine. The company claims that the Docker platform shrinks software development times from weeks to minutes and drives 20x improvements in computing resource efficiency.
Docker says it has logged 300 million downloads of instances from its Docker Hub hosted offering, and 15 Fortune 50 companies are now testing its forthcoming Docker Hub Enterprise offering. More than 1,200 open source developers have contributed to the Docker platform, according to the company. David Messina, Docker's VP of marketing, said on a conference call with journalists that the company plans to use the $95 million to expand the orchestration, networking, storage and security features of the Docker platform and build on the APIs that enable extensions to platforms from partners like Amazon, Microsoft and IBM.
"I think we've been given a mandate to build something and clearly there's a community of people who are very excited about what we've built so far," said Docker founder and CTO Solomon Hykes, speaking during the conference call. "The expectations are extremely high, almost impossibly high. Our goal is to build a universal tool. Very specifically we're trying to solve fundamental problems that affect all applications, and although at any given time the implementation is limited in its scope. For example most obviously you can only run applications in Docker if they can run in Linux, but over time we're working to expand that scope, and the most dramatic example is our partnership with Microsoft."
Docker's partnership with Microsoft, launched in June and extended in October, is significant. The next version of Windows Server, code-named "v.Next," will ship with native support for containers and with a ported version of Docker that will support Windows container technology, Hykes noted. Microsoft last week said it will release the next technical preview of Windows Server next month.
"As a developer in an enterprise, you'll be able to develop, build and test applications using Docker's standard tooling and interfaces for both Linux and Windows environments," Hykes said. The notion of using Docker containers is that developers can build applications for .NET, Java and modern programming languages that are portable and scan scale without requiring huge investments in virtualization.
More Lightweight and Faster than a VM
"The first thing that happens when people play with containers in their development projects is it looks like a VM but faster and more lightweight and consuming less memory. And those are all true," Hykes said. "Several years before Docker existed, a common, wisdom among [IT pros and developers] was a container was just that: smaller, more lightweight, a faster VM. The fundamental difference between Docker and other lower-level container tools is simply we disagree. We think containers and VMs are fundamentally different. They operate at different levels of the stack and as a result they are not mutually exclusive. You can use containers with VMs, you can use containers without VMs, directly on bare metal, and we're seeing organizations do both."
Currently most customers aren't using Docker containers to replace virtual machines, Hykes said, emphasizing the notion that containers are designed to ensure existing infrastructure and applications don't require change.
"Typically what we've seen is the way developers reason about containers is not as a possible replacement for VMs, but as an additional layer on top of their infrastructure, which allows them to pick and choose the best infrastructure for each job. [This] means it now becomes easier for an organization to use VMs where VMs make sense, to use bare metal when bare metal makes sense and, of course, to pick and choose between multiple physical machine providers and virtual machine providers and then layer on top of all these different parts of their infrastructure. On top of which they can express their applications. So the bottom line is containers are for applications and VMs are for machines."
To the point regarding the type of applications Docker containers are best suited, Hykes said they can be applied to any type. "Docker can be applied to any sort of application. Over time, I think we're seeing more and more practitioners grow comfortable with the technology, comfortable with the best practices and evolve from the original pilot project, which is typically a non-vital project and gradually trust Docker with projects of larger and larger magnitude."
Messina said the appeal of Docker and its forthcoming Docker Hub Enterprise offering played a key role in Goldman Sachs and Northern Trust joining the parade of investors funding this new round. The two companies have used Docker for various development efforts, "and now these organizations are standardizing on Docker in their application lifecycle infrastructure," Messina said.
Insight Venture Partners led the round with new investments from Coatue, Goldman Sachs and Northern Trust. Also participating in the round were previous investors Benchmark, Greylock Partners, Sequoia Capital, Trinity Ventures and Jerry Yang's AME Cloud Ventures.
Posted by Jeffrey Schwartz on 04/14/2015 at 1:38 PM
Microsoft last week filed a legal brief challenging a court order that is forcing the company to turn over a customer's e-mails stored in a foreign datacenter.
The brief, filed April 8 with the United States Court of Appeals for the Second Circuit, seeks to overturn last summer's court order requiring Microsoft to turn over the messages from the customer, who is a suspect in an alleged drug-related matter. The identity of the suspect is not known, and Microsoft said at the time of the ruling, which was upheld by Judge Loretta Preska, that it would appeal the order.
A number of major technology companies last year had filed briefs in support of Microsoft's appeal including Apple, AT&T, Cisco and Verizon, along with the Electronic Frontier Foundation, noting that the outcome promises to set a precedent for all U.S.-based cloud providers storing data abroad.
"Settled doctrine makes this Court's job simple: Because laws apply only domestically unless Congress clearly provides otherwise, the statute is properly read to apply only to electronic communications stored here, just as other countries' laws regulate electronic communications stored there," according to the brief, which Microsoft published. "Even if the Government could use a subpoena to compel a caretaker to hand over a customer's private, sealed correspondence stored within the United States, however, it cannot do so outside the United States without clear congressional authorization."
Brad Smith, Microsoft's general counsel and executive vice president for legal and corporate affairs, indicated in a blog post that he's confident the company will prevail. "As we stated in our brief, we believe the law is on the side of privacy in this case," he said. "This case is about how we best protect privacy, ensure that governments keep people safe and respect national sovereignty while preserving the global nature of the Internet."
Smith also argued that the feds are long overdue in updating electronic privacy laws. "While there are many areas where we disagree with the government, we both agree that outdated electronic privacy laws need to be modernized," he said. "The statute in this case, the Electronic Communications Privacy Act, is almost 30 years old," he noted. "That's an eternity in the era of information technology."
Those differences, of course, pertain to combating criminal activity versus protecting privacy. Smith acknowledged that conflict but renewed his plea for the government to find a resolution. "Law enforcement needs to be able to do its job, but it needs to do it in a way that respects fundamental rights, including the personal privacy of people around the world and the sovereignty of other nations," he said. "We hope the U.S. government will work with Congress and with other governments to reform the laws, rather than simply seek to reinterpret them, which risks happening in this case."
Posted by Jeffrey Schwartz on 04/13/2015 at 11:11 AM
PC shipments have been on the decline. But if you want to look at the glass half-full, those declines aren't as bad as originally forecast.
IDC yesterday reported that the 69 million PCs shipped in the first quarter of this year amounted to a 6.7 percent decline from the same period last year. Though that's the lowest number of PCs shipped for the quarter since 2009, the decline wasn't as sharp as IDC had originally forecast last fall, when the market researcher predicted volumes would drop by 8.2 percent.
The better-than-expected number -- if you do look at the glass half full -- came from a slower decline in the United States than in other parts of the world, according to IDC senior research analyst for PCs Rajani Singh. In the U.S., 14.2 million PCs shipped in the first quarter, a 1 percent decline from the same period last year, according to IDC. The strongest growth came from portables, notably new categories such as Bing PCs, Chromebooks, convertible PC-tablets and ultra-slim notebooks, according to IDC, which said desktop shipments were sluggish for the quarter.
Gartner said desktop declines were in the double digits, though detailed figures won't be available for another few weeks, according to analyst Mikako Kitagawa. For its part, Gartner had forecast more moderate declines and said shipments fell 5.2 percent. It is also forecasting moderate PC growth in the years to come.
"The PC industry received a boost in 2014 as many companies replaced their PCs due to the end of Windows XP support. But that replacement cycle faded in the first quarter of 2015," Kitagawa said in a statement. "However, this decline is not necessarily a sign of sluggish overall PC sales long term. Mobile PCs, including notebooks, hybrid and Windows tablets, grew compared with a year ago. The first quarter results support our projection of a moderate decline of PC shipments in 2015, which will lead to a slow, consistent growth stage for the next five years."
The pending arrival of Windows 10 should boost PC shipments later this year, IDC's Singh said. "Windows 10 should be a net positive as there is pent-up demand for replacements of older PCs," she noted. "Only part of the installed base needs to replace systems to keep the overall growth rate above zero for the rest of the year."
Kitagawa in an e-mail said she doesn't anticipate the arrival of Windows 10 playing a significant role in an uptick of PC demand. "We don't expect Windows 10 will stimulate the demand, but will see shipment growth from the supply side as manufacturers will try to push the volume," she said. "If related marketing activities are visible enough, then it can draw buyers' attention, but it does not mean that it can increase the sales to the end users."
Both research firms also noted that the two largest PC providers, Lenovo and Hewlett-Packard, were the only suppliers to grow sales during the quarter. IDC said Lenovo, with a 19.6 percent share of the market, shipped 13.4 million PCs, an increase of 3.4 percent. HP's shipments of just under 13 million systems were up 3.3 percent, giving it a 19 percent share of the market. Dell, the No. 3 player, shipped 9.2 million PCs, a 6.3 percent decline, giving it a 13.5 percent share.
Smaller PC vendors, defined as "others," accounted for a third of the market and saw their shipments decline 17.6 percent, according to IDC. Naturally that impacted their market share, which dropped from 38.4 percent to 33.9 percent. If that trend continues, expect to see the big get bigger and the rest of the market to be squeezed. One variable is whether HP will be able to maintain its scale after it splits into two companies.
Posted by Jeffrey Schwartz on 04/10/2015 at 12:40 PM
In a move Microsoft says will further advance container technology to more deployment scenarios and workloads, and allow developers to build more scalable apps, the company today said it will offer Hyper-V Containers.
The introduction of Hyper-V Containers comes just weeks before Microsoft plans to debut the preview of the next version of Windows Server, code-named "v.Next," which the company will demonstrate at its Build conference in San Francisco. As part of today's announcement, Microsoft also revealed plans to offer a scaled-down, container-optimized version of Windows Server called Nano Server, aimed at modern, cloud-native applications. Reports that Nano Server was under development first surfaced last month.
Microsoft's Hyper-V Containers will offer another deployment option for running applications on Windows Server. The company announced last fall that the next version of Windows Server will support containers, which are lightweight runtime environments that provide many of the isolation benefits of a virtual machine while sharing the host OS, designed to package and execute so-called microservices. While the addition of containers to Windows Server v.Next has been described before, the Hyper-V Containers addition is something new.
Microsoft has previously described a partnership with Docker to ensure its containers could run in Windows Server environments. The move followed an earlier announcement in June 2014 to ensure that the Microsoft Azure public cloud could run Docker containers on Linux-based virtual machines. Microsoft has also indicated that Azure will support Docker's open orchestration APIs and Docker Hub images in the Azure Gallery and Portal.
The latest addition, Hyper-V Containers, will offer a deployment option with extended isolation that draws on not just the Windows operating system but Hyper-V virtualization, according to Mike Neil, Microsoft's general manager for Windows Server, in a blog post.
"Virtualization has historically provided a valuable level of isolation that enables these scenarios but there is now opportunity to blend the efficiency and density of the container model with the right level of isolation," Neil wrote. "Microsoft will now offer containers with a new level of isolation previously reserved only for fully dedicated physical or virtual machines, while maintaining an agile and efficient experience with full Docker cross-platform integration. Through this new first-of-its-kind offering, Hyper-V Containers will ensure code running in one container remains isolated and cannot impact the host operating system or other containers running on the same host."
Hyper-V Containers will support the same development and management tools as those designed for Windows Server Containers, Neil noted. Moreover, he said, developers won't need to modify applications built for Windows Server Containers to run them in Hyper-V Containers.
And for modern application scenarios where Hyper-V and Windows Server would be overkill, Neil described the new Nano Server as "a minimal footprint installation option of Windows Server that is highly optimized for the cloud, including containers. Nano Server provides just the components you need -- nothing else, meaning smaller server images, which reduces deployment times, decreases network bandwidth consumption, and improves uptime and security. This small footprint makes Nano Server an ideal complement for Windows Server Containers and Hyper-V Containers, as well as other cloud-optimized scenarios."
Posted by Jeffrey Schwartz on 04/08/2015 at 10:34 AM
Microsoft late Friday issued a short reminder that the Windows Server preview released in October is set to stop working on April 15. A new preview is slated to arrive in May, the company announced.
The company will release a fix so that testers can continue using the preview between April 15 and the release of the second technical preview, according to the Windows Server blog. "If you would like to continue your evaluation, we will soon deliver a solution until the next preview is released in May," read the blog post. "We will update this blog with more information shortly."
Given that it will have taken seven months for Microsoft to release the second Windows Server technical preview, it'll be interesting to learn what major changes make it into the new build, especially after a panel discussion last week at ChefConf in Santa Clara, where Microsoft Azure CTO Mark Russinovich said it's "definitely possible" that Microsoft could make Windows an open source platform.
Windows Server 2016, as it is now called, is scheduled for release next year. The platform, including System Center, is undergoing a "deep refactoring," according to Jeffrey Snover, distinguished engineer for the Windows Server Group. As reported last month, Microsoft is aligning the components of each to create a more software-defined, cloud-optimized platform.
It'll also be interesting to see whether the reported "Nano Server" edition of Windows Server appears in the forthcoming technical preview; it would be a smaller-footprint option than the Server Core option that currently exists in Microsoft's flagship Windows Server 2012 R2.
Posted by Jeffrey Schwartz on 04/06/2015 at 1:10 PM
The 40th anniversary of Microsoft's founding is tomorrow, April 4. And in a twist of irony, its stock closed yesterday at just a hair above $40 per share ($40.29 to be precise). While that's still higher than the $35 it was trading at when Satya Nadella succeeded Steve Ballmer, the stock has declined 20 percent since November.
Microsoft's history is among the most interesting growth stories of a company that started with nothing and grew into one of the world's most influential companies. It all started in 1975, when Paul Allen and Bill Gates got their version of BASIC up and running on the Intel microprocessor-based MITS Altair.
Certainly most Microsoft IT professionals and developers know the rich history behind that, but for those who don't (or want to recap those interesting times), you can check out this seven-minute Channel 9 video, which recaps some key milestones from 1975. Back then, a gallon of gas was 53 cents and Microsoft's revenues that year were $16,705. Microsoft enjoyed many years as the world's most valuable company until recent years, when Apple overtook it -- aided by many mistakes made by Microsoft over the past decade.
Nevertheless, since Satya Nadella took over as Microsoft's CEO just over a year ago, the company has remade itself, fully embracing open source and rival proprietary software and services and making mobility and cloud the core of everything it delivers. Nadella defined Microsoft as a "productivity and platforms" company. Investors cheered until January, when the company forecast a weaker outlook for the current quarter, leading to its current stock decline.
Most Wall Street analysts see the current decline as a hold or a buying opportunity, but there are a handful of skeptics. Among them is Goldman Sachs, which downgraded Microsoft to a Sell on Wednesday. Analyst Heather Bellini, in a research note, set a 12-month price target of $38 per share, below the consensus of $46.97. To be sure, this is a contrarian view but it's worth pointing out the headwinds she sees. Among them:
- Microsoft was buoyed last year by the end of life of Windows XP and there's no equivalent issue that will force upgrades this year.
- The free Windows 10 upgrades will impact Windows licensing revenue.
- There's currently little room for lowering costs.
- PC sales will remain flat.
- Cloud licensing products such as Office 365 have much tighter margins than traditional software.
Bullish reports counter that Microsoft's strong commercial software business is gaining share and its cloud business, backed by strong Azure growth, is showing signs that it'll become a key cash cow in the future. Microsoft's tendency to exceed expectations has tempered some concerns over the lower guidance.
Noting both the Goldman findings and the bullish reports, Morningstar yesterday said it's holding its four-star rating of the company. "Microsoft remains a cash flow juggernaut," the report said. "Generating more than $26 billion in free cash flow in the past fiscal year and with more than $85 billion in cash on its balance sheet, the technology powerhouse has the financial flexibility and resources to remake itself."
Microsoft's biggest challenge moving forward is to keep Windows successful, while attracting and retaining new talent that will help the company move into the future, as described by The Economist.
Many dread turning 40 while others relish the milestone. Microsoft appears to have moved past its own mid-life crisis before turning 40, but time will tell.
Posted by Jeffrey Schwartz on 04/03/2015 at 9:49 AM
Andreessen Horowitz last week invested an additional $52 million in security startup Tanium, adding to the $90 million the Silicon Valley venture capital firm infused into the endpoint security provider last year. Steven Sinofsky, the president of Microsoft's Windows group until his unceremonious departure more than two years ago, is now at Andreessen Horowitz and largely leading the firm's investment in Tanium.
Tanium claims its endpoint security platform is designed to provide near real-time visibility to cyber threats against the largest of organizations. The company's vision is to scale without degradation regardless of the size of the organization or number of endpoints. The Tanium platform has two key components: Endpoint Security and Endpoint Management. Endpoint Security provides threat detection, incident response, vulnerability assessment and configuration compliance and Endpoint Management performs patch management, software distribution, asset management and asset utilization reporting.
The company claims with the Endpoint Security component of its platform it can provide 15-second threat detection and remediation, can connect to key external threat intelligence feeds and supports open standards such as OpenIOC (the open framework for sharing threat intelligence contributed by Mandiant), Yara for creating signatures that identify malware families, the Structured Threat Information eXpression (STIX) language for describing threat information and Trusted Automated eXchange of Indicator Information (TAXII). The Endpoint Management component aims to provide accurate assessments of vulnerability by providing accurate visibility of all endpoint assets.
During a CNBC interview this week, Sinofsky, who sits on Tanium's board, described the company's addressable market as up to 1 billion endpoints and argued Tanium's approach to threat detection and systems management is broader than what traditional security providers currently offer.
"It's broader than any one security company, or any one in the traditional area we used to call systems management," Sinofsky said. "What's incredible about Tanium is it takes a modern and novel innovative approach to the difference between what used to be the mundane task of inventory -- of just tracking your PCs on the network -- and edge detection of companies like Palo Alto Networks and FireEye, and the like. And Symantec is kind of a legacy provider of endpoint protection -- the signature files, and malware. Tanium is all about 15-second response across a billion endpoints in the enterprise world."
The added $52 million, which brings the total investment up to $142 million, has doubled Tanium's valuation to $1.75 billion, according to reports. Sinofsky declined to confirm the reported valuation when asked by CNBC.
Posted by Jeffrey Schwartz on 04/03/2015 at 12:42 PM
If you were on the fence about attending next month's brand new Ignite conference, designed to bring together the former Tech-Ed, SharePoint Conference and other events into one mega show, you're too late. Microsoft says Ignite, slated for May 4-7 at the McCormick Place Convention Center in Chicago, is sold out.
According to the Web site Microsoft set up for Ignite, full conference passes are no longer available. But if you're just interested in attending the expo you can still get into that. A spokeswoman for Microsoft said that 20,000 attendees have registered for the inaugural Ignite conference. Ignite is targeted at enterprise IT pros and is expected to be the site where top executives outline the future of key products including Windows Server, System Center, Hyper-V, SharePoint Server and others.
Microsoft CEO Satya Nadella is slated to kick off Ignite on Monday, May 4, with the opening keynote. Though it's a rebranded and expanded iteration of TechEd, many MVPs have raised concerns over the way Microsoft has organized sessions for the event. Among them is Redmond magazine Windows Insider columnist and Pluralsight author Greg Shields, who made no bones about his take on the event, saying in last month's column that "the company's next big event may be all flash and no substance."
Describing Ignite as the equivalent of a whitewashed corporate white paper, Shields pointed out, with help from his Pluralsight partner Don Jones, that instead of allowing people to propose an outline for a session, Microsoft was looking for MVPs to nominate themselves and offer up three broad topics. "I'd been wondering why the newly rebranded Ignite removed the Ed in TechEd," Shields noted in his column. "Now, I think I know why. A tightly controlled and on-message event is a brilliant spectacle, but at the same time disingenuous. Select groups of trusted speakers make for a perfectly executed storyline, but at the cost of introducing new souls and their thoughts into the process."
While longtime TechEd speakers Jones and Shields won't be speaking at Ignite, they're not boycotting the show either. And apparently neither are 20,000 others.
Posted by Jeffrey Schwartz on 04/01/2015 at 11:49 AM
Two leading suppliers of tools that enable migration from SharePoint Server to Office 365, OneDrive for Business and other cloud services are coming together. Metalogix, which also offers Exchange migration tools, today said it has acquired rival MetaVis for an undisclosed sum.
The deal not only eliminates a key rival for Metalogix but it facilitates the company's move to offer a richer set of cloud services tools. Among the MetaVis portfolio are SharePoint Migrator, Office 365 Migration and Management Suite and the Architect Suite, which will extend Metalogix's Content Matrix, Migration Expert, ControlPoint and SharePoint Backup tools.
"What's great is we were in the process of building out a fully integrated cloud solutions platform and MetaVis has exactly that, a very easy-to-install, agentless, simple cloud platform where we're able to combine our efforts," said Metalogix CEO Steven Murphy. "Now we're able to hit the market with a very nice, integrated migration suite which includes ongoing management with a focus on security and compliance administration, which are some really important issues."
With the pending arrival of SharePoint Server 2016, along with several older versions and the growing use of Office 365, SharePoint Online and OneDrive for Business, many organizations risk making key mistakes if they don't plan out their migrations and ensure data is moved in a form that makes it usable once it reaches its new target, especially if it's online. Maggie Swearingen, a SharePoint consultant at Protiviti and a Redmond magazine contributor, pointed this out in the April issue.
"As organizations consider a myriad of options from Microsoft, it becomes essential to have not only a long-term strategic technology vision -- but also a SharePoint migration and upgrade roadmap that's big on efficiency and low on cost," Swearingen wrote. "The sad reality is that many SharePoint migrations are considered failures by the organization and business even when the content successfully moves from point A to point B."
In her report, Swearingen pointed to five of the most popular SharePoint migration providers. In addition to Metalogix and MetaVis, they include AvePoint, Dell Software and Sharegate (of course Microsoft offers its own free tools).
"We think in a nutshell, this [merger of Metalogix and MetaVis] will solidify our stake as a leader in the migration and movement of content to the cloud -- Office 365 and other targets and this will just extend our range around compliance, security and administration," said Murphy.
Posted by Jeffrey Schwartz on 04/01/2015 at 11:59 AM
The first preview of Microsoft's next-generation browser, code-named "Project Spartan," made its public debut yesterday. The Project Spartan browser is included with the latest Windows 10 Technical Preview, build 10049. I downloaded the new build, and Project Spartan made its presence known the first time I booted up Windows 10.
As promised last week, Internet Explorer 11 is included as a separate browser and is unchanged, deviating from an earlier plan to include Project Spartan's new EdgeHTML rendering engine, which the company argues is much faster, more secure and reliable. Microsoft claims that the new browser is better suited for modern apps than Internet Explorer. Project Spartan is said to lack Microsoft's legacy Trident rendering engine, but Microsoft has also suggested that Project Spartan will have good compatibility with Web apps and intranet sites nonetheless. For organizations not seeing that compatibility, though, IE will still be around.
"It is fast, compatible and built for the modern Web. Project Spartan is designed to work the way you do, with features enabling you to do cool things like write or type on a Web page," said Joe Belfiore, Microsoft corporate VP for operating systems, in a blog post. "It's a browser that is made for easy sharing, reading, discovery and getting things done online."
Project Spartan aims to deemphasize the fact that you're using a browser, effectively putting the user's focus on the content, Belfiore said. The new browser integrates with Cortana, Microsoft's digital assistant built into Windows Phone 8.1 and introduced into the Windows 10 Technical Preview. When I highlighted text in a page, Cortana guessed what I was looking for and rendered it alongside the page I was reading. It uses the Bing search engine to find information, and it will be interesting to see if Windows 10 (with the new browser) gives a boost to Microsoft's search share, now dominated by Google.
The new Project Spartan browser also introduces a new feature called "inking," which lets users type or write with an electronic pen directly onto the Web page. You can make comments on a piece of the page and share it as a Web note either as an e-mail or onto social networks. It's similar to marking up a PDF file. Belfiore also pointed out that users can easily compile Web Notes and save them in Microsoft OneNote. In my quick test of that feature, it permitted me to share the Web Note on Facebook, Twitter, Yammer and several other networks and apps such as Microsoft OneNote, though there was no obvious way to send it as an e-mail.
Also new in the Project Spartan browser are Reading Lists and Reading Views, designed to make it easier to put aside information from Web pages. It does so by letting you save any Web page or PDF into a Reading List for easy access at a later time. I saved a file to a reading list, which in a way combines the function of Favorites and Web browsing histories, except you choose what's key to that history and can organize it accordingly.
At first glance, the browser does appear to render pages faster, and it introduces some useful new features. As for Cortana, that relationship has yet to take off. Every time I have tried to speak with her, the response has been, in effect, "try again later."
Have you downloaded the new build and looked at Project Spartan? Share your observations.
Posted by Jeffrey Schwartz on 03/31/2015 at 1:34 PM
Microsoft today introduced the thinnest and lightest model to date of its Windows 8.1 tablet. The new Surface 3, which will appear at Microsoft's retail stores tomorrow, weighs just 1.37 pounds and measures a paper-thin 0.34 inches thick.
The new $499 device will include a one-year Office 365 subscription and, in keeping with its free Windows 10 upgrade offer, it will be eligible for the new operating system once Microsoft releases it this summer. The device is powered by Intel's latest system-on-a-chip, the quad-core Atom x7 processor, and Microsoft makes clear it is not aimed at those engaging in compute-intensive tasks. It's more suited for everyday productivity such as e-mail, Web browsing and other traditional office purposes.
"If you do very demanding work -- things like editing and rendering video or complex 3D modelling -- then the power and performance of a Surface Pro 3 is for you," said Panos Panay, corporate VP for the Surface product line at Microsoft, in a blog post announcing the Surface 3. "If the majority of your work is less intense -- working in Office, writing, using the Internet (using IE, Chrome, or Firefox!), and casual games and entertainment, then you'll find that Surface 3 delivers everything you need."
Microsoft claims that the new Surface 3 will get 10 hours of battery life even when running video and the company also has eliminated its proprietary charger, instead offering support for a Micro USB charger. The Surface 3 comes with a 3.5 megapixel camera in front and its 8 megapixel rear-facing camera comes with a new autofocus feature.
Though it runs the low-power system-on-a-chip processor, Panay emphasized the Surface 3 can run the 64-bit version of Windows 8.1 Pro, making it suitable for business users in addition to students. It will work with the Surface Pen, though that will cost extra, Microsoft said. Panay underscored the Surface 3's appeal to enterprise users, noting customers including BASF, Prada and the University of Phoenix.
Posted by Jeffrey Schwartz on 03/31/2015 at 1:12 PM
Many people remain skeptical that wearable computing and communications devices will grow at the pace of smartphones, but sales this year are expected to more than double, according to a forecast released today by IT market research firm IDC.
Sales of "wristwear," which will account for 89.2 percent of all wearables, will grow from 17.7 million units in 2014 to 40.7 million this year, IDC is predicting. Wristwear, by IDC's definition, includes bands such as the Microsoft Band, bracelets and watches. Other types include clothing, eyewear, ear pieces and modular devices, accounting for the remaining 10.8 percent.
Wristwear capable of running third-party apps will account for the largest amount of new wearables sold, with 25.7 million predicted to sell this year. That's more than a 500 percent increase from the 4.2 million units bought last year. Next month's release of the Apple Watch will certainly play a big role in that growth, and other popular smartwatches, including the Moto 360 and Samsung Gear watches, will also contribute. Just as Apple kicked off the personal music player, smartphone and tablet markets, IDC predicts that the new Apple Watch will fuel the market for wrist-worn devices.
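As a quick sanity check on those figures, the growth percentages can be worked out from the unit numbers IDC quoted (a minimal sketch; the figures are IDC's, the arithmetic is mine):

```python
def pct_increase(old_units, new_units):
    """Percent increase from old_units to new_units (units in millions)."""
    return (new_units - old_units) / old_units * 100

# Overall wristwear: 17.7 million units in 2014 to a forecast 40.7 million in 2015.
print(round(pct_increase(17.7, 40.7)))  # → 130, i.e. more than double

# App-capable wristwear: 4.2 million units to a forecast 25.7 million.
print(round(pct_increase(4.2, 25.7)))   # → 512, "more than a 500 percent increase"
```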
"Smart wearables are about to take a major step forward with the launch of the Apple Watch this year," said IDC Research Manager Ramon Llamas, in a statement. "The Apple Watch raises the profile of wearables in general and there are many vendors and devices that are eager to share the spotlight. Basic wearables, meanwhile, will not disappear. In fact, we anticipate continued growth here as many segments of the market seek out simple, single-use wearable devices."
The jury is still out on whether the Apple Watch and other devices like it will be a novelty, or if there is a killer app for these devices other than the convenience of being able to look at your e-mails and texts (and answer your phone). As I noted earlier this month, the Apple Watch doesn't have to be a hit right away but the apps available for it and others like it will have to offer a capability not available with smartphones today.
Do you see a killer app coming or is the current convenience alone enough to drive this new market?
Posted by Jeffrey Schwartz on 03/30/2015 at 12:07 PM
In a move that could broaden the discussion on income equality beyond gender, race and status, Microsoft will require its suppliers of contract workers to offer them paid vacation and sick time. The new policy, which applies to suppliers with 50 or more employees ranging from engineering and development staff to maintenance and security personnel at its numerous facilities, requires they offer either 10 days of paid vacation and five days of paid sick leave or 15 days of unrestricted paid time off.
Brad Smith, Microsoft's general counsel, said yesterday in a blog post that employees in the U.S. who have worked at least nine months, or 1,500 hours, "who perform substantial work for Microsoft," will be eligible. It's unusual for a large U.S. company to issue what could amount to a costly stipulation for suppliers, but one long sought by proponents of fair pay and income equality. The move could lead other companies to enact similar benefits, a report in The New York Times today suggested.
The U.S. doesn't require paid sick leave and 43 million workers aren't offered it, according to the report. Consequently, many people are forced to come to work when they're sick, which often makes others sick and reduces productivity, Smith said in his blog post. Citing a University of Pittsburgh study, Smith said when an employee doesn't come to work when he or she has the flu, it reduced the risk of others catching it from that person by 25 percent and when taking two days off it reduced transmission of the virus by 39 percent.
Another survey, whose source he didn't identify, found only 49 percent of those in the bottom fourth of earners receive paid time off. "Lack of paid time off also has a disproportionate impact on minorities at a time when the tech sector needs to do a better job of promoting diversity," Smith noted.
While it isn't clear how many employees will benefit from the company's new mandate, Microsoft uses 2,000 outside suppliers who provide contract employees overall, according to The Times report. The policy is likely to raise the ire of those working for suppliers with fewer employees, or of individuals who provide contract services to Microsoft, since they won't be covered. "We recognize that this approach will not reach all employees at all of our suppliers, but it will apply to a great many," Smith said in his blog post. "We've long recognized that the health, well-being and diversity of our employees helps Microsoft succeed. Our commitment to them extends beyond the workplace."
Some suppliers surely won't welcome the move as offering paid time off will be more costly for them. Smith indicated that Microsoft will work with them. "We also want to be sensitive to the needs of small businesses," Smith said. "For these reasons, we are going to launch a broad consultation process with our suppliers so we can solicit feedback and learn from them about the best way to phase in the specific details."
Whether or not Microsoft's move will lead other companies to enact similar policies remains to be seen. Employees working for a smaller company may feel further left out. But if you're a proponent of fairness in pay and compensation, this is a noteworthy step to further that goal.
Posted by Jeffrey Schwartz on 03/27/2015 at 12:46 PM
In a move aimed at making it more appealing for developers and business decision makers to use its cloud platform as a service (PaaS), Microsoft is bringing together its separate Azure app services into one complete offering.
Microsoft describes its new Azure App Service, now available, as a fully managed service that provides a simple way for developers to build customer-facing apps. The new packaging effectively brings together three offerings that, until now, were disparate services -- Web Sites, Mobile Services and BizTalk Services -- and lets them easily integrate with SaaS and on-premises systems.
"It brings those together in new unified experiences," said Omar Khan, Microsoft's director of Azure engineering. "Developers are challenged with trying to connect all that data from these different systems into their apps. That's what App Service helps with. It helps developers integrate data from on-premises and from popular cloud services into their Web and mobile apps. And App Service also has new capabilities around allowing businesses to automate their business processes more easily, allowing them to be more agile."
Khan explained how the four offerings are coming together:
- Web Apps: Online tools and templates that make it easy to build, deploy and scale apps that are customer facing, for employee productivity or partners.
- Mobile Apps: Services that enable the tailoring of Web and other apps to key mobile platforms, notably iOS, Android and (of course) Windows.
- BizTalk Apps: Also described as Logic Apps, these offer 50 connectors to popular SaaS and on-premises apps including Office 365, Microsoft Dynamics, Salesforce.com, Oracle, SAP, Facebook, Twitter and others.
- API Apps: These provide the services to expose APIs with the three App Services so the other three app types -- Mobile Apps, Logic Apps and Web Apps -- can consume those APIs.
"API Apps allow you to take any existing API, whether it's an API in the cloud or an API on-premises, and project that into App Service by adding some simple metadata," Khan said. "And in doing so it exposes it in the Swagger format, which is a popular format for describing APIs, thus allowing the other app types to consume those APIs. API Apps also let you then project your own custom APIs into App Service."
The service essentially uses a JSON file to describe your API, which you can load into App Service using Microsoft's standard publishing mechanism. "We support Git, so it's basically uploading a JSON file via Git, and then App Service can basically make those APIs available in a reasonable form. And then you can use them within the regular apps within App Service," he added.
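For readers unfamiliar with the format Khan describes, a Swagger descriptor is simply a JSON document enumerating an API's operations. The sketch below builds a minimal, hypothetical Swagger 2.0 descriptor of the kind that could then be uploaded via Git; the API name and path are invented for illustration and aren't taken from Azure's documentation:

```python
import json

# A minimal, hypothetical Swagger 2.0 descriptor for a custom API.
# App Service reads a file like this to learn which operations the API
# exposes, so Web, Mobile and Logic Apps can consume them.
api_descriptor = {
    "swagger": "2.0",
    "info": {"title": "Orders API", "version": "1.0"},  # hypothetical API
    "paths": {
        "/orders/{id}": {
            "get": {
                "operationId": "getOrder",
                "responses": {"200": {"description": "A single order"}},
            }
        }
    },
}

# Write the descriptor out; per Khan, the file would then be committed
# and pushed to App Service via Git.
with open("apiapp.json", "w") as f:
    json.dump(api_descriptor, f, indent=2)
```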
Asked how the service connects to on-premises applications and systems, Khan explained that the BizTalk connectors address that. "We have virtual networking in Azure that allows you to connect on-premises resources to the cloud," he said. "They also support hybrid connections which is a BizTalk capability that allows you to do app-to-app connection across firewalls. So these API Apps and the Oracle connector or the SAP connector, among others, utilize those connectivity options in Azure to connect to the on-premises resources and then there's a connector piece that you can run on premises that connects to that API App."
Microsoft is betting that bringing these simplified services together will draw more applications to the Azure PaaS cloud. But Microsoft today is also targeting the emerging developers who'll ultimately decide what platforms to build their applications on. Microsoft is now offering Azure for student developers, who can get free usage to learn how to build cloud-based mobile and Web apps using services such as the aforementioned Azure Apps and Azure Insights, which "gives students a 360-degree view across availability, performance and usage of ASP.NET services and mobile applications for Windows Phone, iOS and Android," wrote Microsoft's Steve "Gugs" Guggenheimer, in a blog post announcing the offering.
"Student developers are growing up in a world that requires them to leverage cloud services to deliver cool and modern experiences," Guggenheimer noted. "Microsoft Azure is a great fit for students because of its speed and flexibility enabling the creation and development of Web sites and Web apps. This new offer for students, available today in 140 countries, gives young developers access to the latest technology, allowing them to develop in or deploy sites and apps to the cloud, at no cost and with no credit card required."
In addition to Azure Apps and Azure Insights, Guggenheimer noted that the free offering lets students use Microsoft's Visual Studio Online.
Posted by Jeffrey Schwartz on 03/24/2015 at 2:46 PM
In its latest show of support for non-Windows hardware, Microsoft on Monday said that 11 device makers will preinstall key apps from its Office Suite onto the vendors' respective tablets and smartphones for consumers and business users. Leading the pack was Samsung, which said it will preinstall Microsoft Word, Excel, PowerPoint and OneNote on its tablets in the second half of this year. Microsoft said it will offer the apps via a new Microsoft Office 365 and Samsung Knox Business Pack.
Microsoft also said that Dell, along with original device manufacturer Pegatron, local device makers TrekStor from Germany, JP Sa Couto of Portugal, Italy's Datamatic, Russia's DEXP, Hipstreet of Canada, QMobile in Pakistan, Tecno from Africa and Turkey's Casper, will preinstall the Office 365 components.
Samsung's latest move follows this month's news at the Mobile World Congress in Barcelona that the electronics giant will offer OneNote, OneDrive and Skype on the new Galaxy S6 and Galaxy S6 Edge smartphones. At the time, reports surfaced that a deal to offer Office 365 apps for Samsung's portfolio of Android devices with Samsung Knox Workspace security integration was in the works -- a scuttlebutt that came to fruition this week. Adding to its Galaxy phone announcement from earlier in the month, Microsoft said the devices will come with an additional 100GB of Microsoft OneDrive free storage for two years.
Microsoft said businesses and enterprises that buy Samsung devices through its channel partners will have a choice of three Office 365 plans: Business, Business Premium or Enterprise bundled with Samsung's Knox, the company's Android-based security platform. The agreement also covers support and setup services.
Samsung and the other 10 hardware providers will offer the preinstalled Office 365 capability later this year, according to Peggy Johnson, Microsoft's executive vice president for business development, hired by Microsoft CEO Nadella from Qualcomm six months ago. "These deals demonstrate how we are working with hardware partners in new ways to deliver rich experiences through their scale," she said in a blog post. "This is a big step forward for our cross-platform and cross-device services strategy, which will bring an array of Microsoft services to every person on every device."
While this is a bundling deal rather than a major technical breakthrough, it's likely to lead Android tablet and smartphone buyers who want to continue using Office to procure new subscriptions or keep their existing plans.
Posted by Jeffrey Schwartz on 03/24/2015 at 2:41 PM
Microsoft's announcement earlier this week that users of pirated versions of its PC operating system can also take advantage of its free Windows 10 upgrade offer has an important caveat: it's no more official than the bootlegged version.
Terry Myerson, executive vice president of Microsoft's operating systems group, made the head-scratching announcement during Windows Hardware Engineering Conference (WinHEC) this week in Shenzhen, China. Talking up Microsoft's January announcement that users of Windows 7, 8, 8.1 and Windows Phone could upgrade their systems to the new Windows 10, Myerson told Reuters: "We are upgrading all qualified PCs, genuine and non-genuine, to Windows 10."
Microsoft's goal is to "re-engage" with the hundreds of millions of users of Windows in China, he told the news service, though he declined to elaborate. Given 90 percent of Microsoft software used in China alone is said to be pirated, that's a lot of free software. But that begs the question: why buy the software when you can get a bootlegged version for a fraction of the cost, if not free? Answering that question, Microsoft issued a statement which points out that if you're upgrading your pirated software, you still have an unlicensed version of Windows 10.
"We have always been committed to ensuring that customers have the best Windows experience possible," according to the statement. "With Windows 10, although non-genuine PCs may be able to upgrade to Windows 10, the upgrade will not change the genuine state of the license. Non-genuine Windows is not published by Microsoft. It is not properly licensed, or supported by Microsoft or a trusted partner. If a device was considered non-genuine or mislicensed prior to the upgrade, that device will continue to be considered non-genuine or mislicensed after the upgrade. According to industry experts, use of pirated software, including non-genuine Windows, results in a higher risk of malware, fraud (identity theft, credit card theft, etc.), public exposure of your personal information and a higher risk for poor performance or feature malfunctions."
By tapping China-based PC makers Lenovo, Qihu 360 and Tencent, Microsoft is hoping it'll convince customers to buy legitimate licensed versions of Windows.
Posted by Jeffrey Schwartz on 03/20/2015 at 11:38 AM
When Microsoft talked up the company's next-generation browser, code-named "Project Spartan," at this week's Convergence conference in Atlanta, the obituaries came pouring in for Internet Explorer. The news flashed on the screen of CNBC, appeared on every general news site and was talked about all over social media.
Perhaps those outside of IT hadn't heard about Project Spartan, which Microsoft began talking about in some detail back in January. Microsoft explained at the time that the new browser will contain a new rendering engine, called "EdgeHTML." To create that new rendering engine, Microsoft forked the code in Internet Explorer's Trident engine. Spartan will offer both rendering engines, including the legacy Trident (MSHTML) engine. As reported by my colleague Kurt Mackie:
Organizations will be able to use the Spartan browser even if they have legacy IE support issues to address. When legacy support needs arise, Spartan will be capable of running the old IE Trident engine via Enterprise Mode. Microsoft's Enterprise Mode technology is an IE 11 solution that emulates earlier IE browser technologies all of the way back to IE 5 for compatibility purposes.
Reports this week that Microsoft is killing the Internet Explorer brand suggest that retiring the name is indeed the long-term plan. MIT Technology Review Senior Editor Rachel Metz is among those who believe Microsoft should retire Internet Explorer.
"The changes both to the browser and the branding make a lot of sense," she wrote. "Internet Explorer, first released in the mid-1990s, dominated the browser market at its peak in the early 2000s, but it came to be associated with poor security and compatibility with other browsers and has since languished. Spartan's success is critical if Microsoft is to remain relevant in the Web browser business -- a market in which it used to dominate but now trails Google's Chrome."
At the same time, enterprise users aren't going to want to see the rendering capabilities of Internet Explorer go away, whatever Microsoft calls its next browser. For its part, Microsoft is promising the new browser will offer the same compatibility it has offered in past upgrades. "Project Spartan is Microsoft's next generation browser, built just for Windows 10," according to a company statement. "We will continue to make Internet Explorer available with Windows 10 for enterprises and other customers who require legacy browser support."
Now the question is: what will Microsoft call its new browser?
Posted by Jeffrey Schwartz on 03/20/2015 at 1:03 PM
In what may seem like a bizarre move, Microsoft said its free Windows 10 upgrade offer also applies to those with pirated older versions of the operating system.
Terry Myerson, executive vice president of Microsoft's operating systems group, made the startling announcement at the Windows Hardware Engineering Conference (WinHEC), taking place this week in Shenzhen, China.
Microsoft earlier this year said it will offer Windows 10 as a free upgrade to Windows 7 and Windows 8.x users. The offer, which will also include earlier versions of Windows Phone, is good only for one year after the release of the new operating system and doesn't apply to all enterprise users.
Given that the vast majority of PC software in China is said to be pirated, Myerson chose a fitting place to announce the move. "We are upgrading all qualified PCs, genuine and non-genuine, to Windows 10," Myerson told Reuters. Microsoft's goal is to "re-engage" with the hundreds of millions of users of Windows in China, he told the news service, though he declined to elaborate.
It appears Microsoft is hoping to get users of pirated software to use legitimate versions of Windows. Microsoft has tapped China-based PC makers Lenovo, Qihu 360 and Tencent to help in that effort.
Posted by Jeffrey Schwartz on 03/18/2015 at 2:13 PM
Microsoft has put Windows 10 on the fast track, saying in an unexpected announcement that the new OS will arrive this summer. That's earlier than the fall timeline targeted by Microsoft Chief Operating Officer Kevin Turner back in December.
The company announced the earlier-than-expected delivery date at the Windows Hardware Engineering Conference (WinHEC), taking place in Shenzhen, China this week. Also at WinHEC, as reported yesterday, Microsoft revealed that Windows 10 will aim to transition users away from passwords for logging into their systems, offering instead Microsoft's new biometric authentication tool, Windows Hello.
While it wasn't initially clear to what extent the Windows Hello technology would be supported in Windows 10, Terry Myerson, executive vice president for the Windows platform group at Microsoft, said at WinHEC and in a blog post that all OEMs have agreed to support it.
Windows 10 will be available in 190 countries and 111 languages when it launches, according to Myerson. Of course, "summer" is a wide window, given the OS could arrive anytime between June 21 and Sept. 20. But the expedited release may suggest that Microsoft doesn't want to miss this year's back-to-school season, a time when many students buy new systems. If that's the case, it will need to come in June or July, rather than late September.
The big question an earlier-than-expected release raises: is Microsoft looking to rush Windows 10 out the door too soon, and will it come out feature-complete? After all, there are many new features testers have yet to see, such as the new browser component called Spartan and yesterday's reveal, Windows Hello. Joe Belfiore, corporate vice president for Microsoft's operating systems group, unveiled Windows Hello at WinHEC, which he said provides system-level support for biometric authentication, including fingerprint and facial recognition, as a replacement for passwords.
Hello isn't the first effort to bring biometrics to Windows PCs. Makers of PCs have offered fingerprint scanners on a small selection of their PCs for years now. But few used them and most devices today have done away with them. This time, it looks like Microsoft is aiming for biometrics that will be pervasive in Windows 10 devices. "We're working closely with our hardware partners to deliver Windows Hello-capable devices that will ship with Windows 10," Myerson said. "We are thrilled that all OEM systems incorporating the Intel RealSense F200 sensor will fully support Windows Hello, including automatic sign-in to Windows."
Myerson said Microsoft is also offering a new version of Windows for smaller Internet of Things devices, ranging from ATMs to medical equipment, thanks to partnerships with the Raspberry Pi Foundation, Intel, Qualcomm and others. Microsoft also showcased Qualcomm's DragonBoard 410c for Windows 10 devices, the first Windows 10 developer board to integrate Wi-Fi, Bluetooth and GPS, along with the Qualcomm Snapdragon 410 chipset.
Posted by Jeffrey Schwartz on 03/18/2015 at 12:55 PM
Prominent Forrester Analyst James Staten is joining Microsoft today as chief strategist for Microsoft's Cloud and Enterprise division. Word of his move from Forrester to Microsoft spread quickly over the weekend when Staten updated his LinkedIn status.
In a tweet last night, Staten alerted followers: "Just landed in Redmond. Ready to start my new career at #Microsoft." When asked by a Twitter friend if he's relocating to Redmond, Staten replied: "Staying in Silicon Valley. Working w/VCs, startups and local companies r key to my job."
Staten didn't immediately respond to an e-mail from me, though I've known him for many years as someone who had a firm grasp on the competitive strengths and weaknesses of all the public infrastructure-as-a-service (IaaS) cloud players. Furthermore, he had forecast the evolution of IaaS pretty much from the beginning and has consulted with many customers making cloud computing decisions.
It is a safe bet that his primary focus will be to advance the Azure cloud business. Staten's tweets regarding his move included the Azure hashtag. Given his deep knowledge of the IaaS landscape, Microsoft will have a strong and credible executive on the Azure team, which should also help the company's effort to gain further inroads in Silicon Valley.
At Forrester, Staten was based in Silicon Valley; prior to joining the research firm, he worked at Azul Systems and in Sun Microsystems' software business.
In a statement, Microsoft said Staten will report to Cloud and Enterprise Executive VP Scott Guthrie, where he'll "work closely with Scott's leadership team on delivering the most complete set of cloud capabilities and services to customers small and large."
Posted by Jeffrey Schwartz on 03/16/2015 at 10:18 AM
Microsoft kicked off this week's annual Convergence conference in Atlanta by announcing a preview of Office 2016 for IT pros and developers. It was among several releases, which also included a preview of Skype for Business.
Office 2016 is the first major upgrade of the Office desktop suite since Office 2013 and follows last month's preview releases of touch-enabled versions of the Word, Excel and PowerPoint apps for Windows 10. Today's Office 2016 release for IT pros and developers gives a far broader look at the new suite, including some new features such as click-to-run deployment, extended data loss prevention (DLP) support and the new Outlook 2016 client.
Julia White, general manager of Microsoft's Office division, demonstrated the new Outlook 2016 during the opening Convergence keynote. In the demo, she played up Outlook's ability to handle content linked with OneDrive for Business. When a user goes to attach a file, the most recently accessed documents appear and are added as a link to the sender's OneDrive for Business account.
"When I hit send, it looks like an attachment, it feels like an attachment and when I send it, it actually sends the access to the file," White said. "So I don't have to send a physical attachment and deal with versioning." Users can link up on the same document in the cloud, continued White. She also added that the new Outlook will still let users attach actual files.
The new Outlook also offers significant technical improvements, said Kirk Koenigsbauer, corporate vice president for Microsoft's Office 365 Client Apps and Services team, in a blog post announcing the Office 2016 preview. The improvements for IT pros he pointed to include:
- MAPI-HTTP protocol: RPC-based sync replaced with a new Internet-friendly MAPI-HTTP protocol that supports Exchange/Outlook connectivity
- Foreground network calls: The use of foreground network calls eliminated to ensure Outlook stays responsive on unreliable networks
- Multi-factor authentication: Support multi-factor authentication via integration with the Active Directory Authentication Library (ADAL)
- E-mail delivery performance: The amount of time it takes to download messages, display message lists and show new e-mail after resuming from hibernation reduced
- Smaller storage footprint: New settings let users better manage storage by only retaining 1, 3, 7, 14 or 30 days of mail on the device
- Search: Improved reliability, performance and usability of Outlook search; the FAST-based search engine is integrated into Exchange.
The new Office 2016 preview doesn't include all of the features that Microsoft is planning for the new release, Koenigsbauer noted. The new DLP support builds on what Microsoft now offers with Exchange, Outlook, OneDrive for Business and SharePoint. "Now we're bringing these same classification and policy features to Word, Excel and PowerPoint," he said. "With these new capabilities, IT admins can centrally create, manage and enforce polices for content authoring and document sharing -- and end users will see policy tips or sharing restrictions when the apps detect a potential policy violation."
Koenigsbauer noted that the new click-to-run deployment feature for Office 365 customers introduces Microsoft's Background Intelligent Transfer Service (BITS), which Koenigsbauer said aims to prevent network congestion. "BITS throttles back the use of bandwidth when other critical network traffic is present," he said.
Other deployment improvements showcased in the new Office 2016 preview include tighter integration with Microsoft's System Center Configuration Manager (SCCM), more flexible update management for handling feature updates and bug fixes and improved activation management added to the Office 365 Admin Portal.
White also talked up Skype for Business, which Microsoft said back in November would represent the rebranding of the company's Lync platform with its Skype service. "Now all of the Skype for Business users can connect with Skype from a contacts perspective and communicate [with] them with IM, voice and video," she said. "So imagine a sales person connecting with any customer, a doctor connecting with a patient, an employer interviewing someone via Skype. There's so many possibilities with this new experience."
Posted by Jeffrey Schwartz on 03/16/2015 at 2:13 PM
Kemp Technologies is now offering a free version of its LoadMaster application load balancer, which the company hopes developers and DevOps managers will use for distributed workloads that don't require a lot of capacity.
While the move is aimed at seeding its virtual appliance, the company is letting customers use it permanently for production workloads. The catch is that operators will have to reregister it with Kemp's licensing server every 30 days, throughput is capped at 20 Mbps and it doesn't support the high availability features of the commercial version, said Kemp Product Manager Maurice McMullin.
Nevertheless, the company believes developers and DevOps managers will use the free virtual load balancing for testing, development and running non-critical applications that don't have large amounts of network traffic. "It's reasonably configured," McMullin said, noting that it includes the intrusion detection system and Web application firewall.
"People could use it in a preproduction environment and a dev-test environment and potentially even some production environments for non-critical low volume workloads," he added. One example might be a time sheet entry system where the application is distributed among locations but isn't used frequently and is not business critical.
Kemp isn't the first to offer a free load balancer. For example, HAProxy offers an open source version of its namesake load balancing software and operates a community site. But McMullin argues the Kemp offering is suited for mainstream VMware and Hyper-V workloads and can run in public clouds including Amazon Web Services, Microsoft Azure and VMware vCloud Air.
The company said its free offering includes complete testing and validation of applications with Kemp's global site load balancing (GSLB) and Edge Security Pack, which offers single sign-on. The free software also supports the Kemp Web Application Firewall Pack, as well as its REST API and Windows PowerShell and Java API wrappers.
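The REST API mentioned above lends itself to automating load balancer configuration from scripts. As a rough sketch, the Python helper below builds LoadMaster-style API URLs. The HTTPS-plus-`/access/` path convention follows Kemp's published API style, but the `listvs` and `addvs` command names and their parameters are illustrative assumptions; consult the LoadMaster API reference for the real command set.

```python
from urllib.parse import urlencode

def kemp_api_url(host, command, **params):
    """Build a LoadMaster-style REST API URL.

    Commands live under the /access/ path and take query-string
    parameters. Command and parameter names used below are
    hypothetical examples, not a definitive API listing.
    """
    url = "https://{}/access/{}".format(host, command)
    if params:
        # Sort for a deterministic query string.
        url += "?" + urlencode(sorted(params.items()))
    return url

# List configured virtual services, then add one balancing TCP port 80.
print(kemp_api_url("lb.example.com", "listvs"))
print(kemp_api_url("lb.example.com", "addvs",
                   vs="10.0.0.50", port=80, prot="tcp"))
```

In practice the request would be sent over HTTPS with the appliance's admin credentials; the same URL-building approach is what the PowerShell and Java wrappers abstract away.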
Posted by Jeffrey Schwartz on 03/12/2015 at 3:39 PM
Intel's warning that revenues could be off by about $1 billion weighed on its shares Thursday, stoking fears that PC sales may remain weak until Microsoft ships its new Windows 10 operating system.
The chipmaker's revised forecast for the first quarter is revenue of $12.8 billion, give or take $300 million, compared to the prior prediction of $13.7 billion, give or take $500 million. Intel said lower than anticipated demand for business desktop PCs across the supply chain spurred the revised forecast.
Lower than expected Windows XP refresh activity has left inventory building up across the supply chain, Intel said. Businesses and consumers are taking an "if it ain't broke, don't fix it" attitude toward their old PCs, Summit Research Analyst Srini Sundararajan told Reuters. But with Microsoft's Windows 10 waiting in the wings, many PC buyers are likely putting off upgrades as well. As research by Redmond magazine and others shows, many users plan to upgrade to Windows 10 and are awaiting the new hardware in the pipeline that will support it.
Like many other companies, Intel said currency conditions in Europe will affect revenues. Those conditions have led to an increase in PC prices in Europe, which has impacted demand, according to various reports. BlueFin Research Partners is forecasting that about 76 million PCs will ship this quarter, a decline of 8 to 9 percent, Reuters reported.
Intel's datacenter business forecast remains unchanged, the company said.
Posted by Jeffrey Schwartz on 03/12/2015 at 11:42 AM
News that Hillary Clinton operated an e-mail server out of her house in Chappaqua, N.Y. for both personal and official communications while serving as Secretary of State underscores how far people will go for convenience and control -- even if it means bypassing IT to do so.
While maintaining that she didn't break any laws or send any classified messages using her personal e-mail account instead of the official e-mail system the rest of the government uses, Clinton has raised fierce debate over the propriety of her decision to take matters into her own hands. Aside from the legal issues and obvious questions, such as whether her use of personal e-mail really went unnoticed for four years and whether the system she used was as secure as the government's network (some argue hers might have been more secure), her actions are far from unique.
In my reporting on this month's Redmond magazine cover story about the forthcoming end of life of Windows Server 2003, IT pros lamented that they discovered many unsanctioned servers in use -- often under employees' desks. For better or worse, many companies have become more tolerant of employees bypassing IT than they were years ago in part due to the bring-your-own-device (BYOD) trend brought on by the advent of smartphones and tablets. I often receive business-related e-mail from high-level people at companies of all sizes from their personal e-mail addresses -- usually a Gmail, Yahoo or Outlook.com address -- and I'm sure you do too.
Many employees in organizations not wanting to wait for IT to provision systems have spun up VMs by setting up an Amazon Web Services or Azure account with a credit card. And it's certainly become common for business users to set up accounts using Salesforce.com and Workday, among other SaaS applications. Many are also setting up Office 365 accounts on their own. Services such as OneDrive, Google Drive, Dropbox and Box have replaced flash drives for copying and storing files. A survey by data protection vendor Vision Solutions found that 52 percent of organizations don't have processes to manage the use of such services, putting at risk the loss of confidential data.
Apparently the U.S. government -- or at least the State Department, which she headed from 2009 to 2013 -- was one of them. Indeed, setting up your own Exchange Server (I'm assuming that's what she used, since that's what runs the Clinton Foundation's e-mail) is a more brazen move than most take, and one most are unlikely to attempt, given the cost.
From news accounts we know she's not the only government official to use personal e-mail for routine business communications, though she's the highest-level and, for now, the most infamous one to do so. If we're to take Clinton at her word -- and I realize many don't -- she used her own e-mail for convenience, as she said yesterday. If that translates to trying to balance her work and personal life with one account, I think we can all agree that was a bad idea; she said as much, though she could have separated the accounts and still used one device, as many of us do.
Given that she was the nation's top diplomat and a potential presidential candidate, the fallout from this is far from certain as this issue continues to create discourse.
In the end, businesses and government agencies of all sizes have to establish policies that address what employees at all levels can and can't do when it comes to using IT. Absent any clear rules and enforcement of them, you likely have at least one person, if not many, like Hillary Clinton in your organization.
Posted by Jeffrey Schwartz on 03/11/2015 at 1:19 PM
Many people have asked me if I plan on getting an Apple Watch when it comes out next month. The answer is, not the first version and probably not the second either. I'm not sure if I'll ever buy one but haven't ruled it out in case the price and performance are right.
Apple's launch event yesterday confirmed what we already presumed. The Apple Watch will ship next month (preorders begin April 10 and they'll appear in stores April 24) and the starting price is $350. If you want to spring for one with an 18-karat gold band, that'll cost $10,000 and if you must have the most expensive model -- with sapphire faces -- it'll set you back $17,000. If you collect Rolexes and the like, it'll be the perfect addition to your collection.
Who needs an extension of their iPhone on their wrist? Let's face it, the Apple Watch is just the latest accessory to the iPhone, which is required for the watch to work. If you want to glance at your messages, view alerts and maybe access some information without removing your iPhone from your pocket, it certainly could be convenient. The question is how much most people will be willing to pay for that convenience.
Back in 2007, when the first iPhone came out, it cost $599, and that was with the carrier subsidy (only AT&T offered the phone for the first few years). And it was basically just an iPod with a phone, e-mail access and a handful of other apps. When the iPod came out in 2001, Apple certainly wasn't the first to release an MP3 player. In both cases, though, the company was the first to legitimize and create a mass market for its offerings in a way others before it were unable to do.
Will Apple be able to catch lightning in a bottle yet again? While that remains to be seen, it's a reasonable bet we'll see substantially less expensive versions of the Apple Watch in the next few years that'll be much more functional than the $17,000 models that are now debuting.
Posted by Jeffrey Schwartz on 03/10/2015 at 2:13 PM
In its push to simplify migration of Windows applications to cloud infrastructures without dependencies on hardware or software platforms, Microsoft has added Sphere 3D as its latest partner to deliver Windows containers. The two companies announced a partnership today to deliver Glassware 2.0 Windows containers for Azure.
Sphere 3D said it's collaborating with Microsoft to develop tools to simplify the migration of Windows-based end user applications to Azure. The two companies are first working to offer Glassware 2.0-based workloads in Azure for schools. Later in the year, Sphere 3D will offer other tools, the company said. Unlike Microsoft's higher-profile container partner Docker, which is open source, Glassware 2.0 is a proprietary platform designed to virtualize applications without requiring a virtualized desktop.
The Glassware 2.0 suite includes a micro hypervisor that the company calls a "Microvisor." Unlike a traditional hypervisor, which requires a guest OS for applications to run, the "Microvisor only pulls in elements of the OS stack needed for the software application to run, and also fills in any gaps that may be present, in particular with applications needing functionality not inherent in whichever OS stack you happen to be using," according to the company's description.
Glassware 2.0 also includes containers, management tools and clustering software. The containers are designed to run multiple instances of the same app in a Glassware 2.0-based server. It provides the ability to share binaries, libraries or the Glassware 2.0 Microvisor, according to the company. This environment provides access only to those components of an operating system an application needs to run. It supports applications running in Windows XP, Windows 7 and Windows 8.x environments.
"When we created Glassware 2.0, we envisioned a time where any application, regardless of its hardware or operating dependencies, could be easily delivered across multiple platforms from the cloud," said Eric Kelly, Sphere 3D's CEO, in a statement. "Today, by joining forces with Microsoft, we have taken a substantial step towards realizing that vision."
The company says the Glassware 2.0 Microvisor can virtualize infrastructure components and application stacks from both Windows and non-Windows systems, and claims it can "outperform" any existing hypervisor-based infrastructure. Furthermore, the company said it can be used for systems and cloud management, orchestration and clustering. The Glassware Manager runs on Windows Server 2008 and above.
Posted by Jeffrey Schwartz on 03/09/2015 at 1:23 PM
Forget about next week's anticipated launch of the Apple Watch or the fact that the company will be added to the select 30 companies in the Dow Jones Industrial Average, knocking out AT&T. Apple's latest assault on the enterprise -- a market it has largely eschewed over its 39-year history -- is said to include a new iPad which, in many ways, could appeal to the same potential users of Microsoft's Surface Pro 3.
A larger iPad has been rumored for some time now. The new device is said to include a USB 3.0 port, keyboard and mouse, according to a report in today's Wall Street Journal. At this point, according to the report (on which, as usual, Apple had no comment), the company is still just considering the USB port -- a feature the late Steve Jobs was staunchly against. Apple had originally told suppliers it was looking to deliver the larger 12.9-inch iPad this quarter; now it apparently is in the pipeline for the second half of the year.
At 12.9 inches, the new iPad would actually be slightly larger than the current Microsoft Surface Pro 3, which sports a 12-inch form factor. It would be considerably larger than the current iPad Air, which is 9.7 inches.
Would a larger iPad with USB support, a keyboard and mouse compete with the Surface Pro 3 and other Windows-based PCs? Before jumping on me for comparing apples to oranges, keep in mind that an iPad that could let people run Office for basic work processes could appeal to numerous individuals, even if the tablets don't have the 8GB of RAM and latest Intel Core processors that the Surface Pros have.
The move makes sense for Apple, which sold a mere 21 million iPads last quarter. While Microsoft has shipped only a fraction of that many Surface Pros, its sales are on the rise, though they still lack the momentum some would like to see. Certainly the new extra-large iPhone 6 Plus is cannibalizing sales of iPads, so the natural move is to target the tablets toward enterprise workers. Just as the bolstered iPad will compete with Windows PCs, it also has the potential to cut into at least some sales of low-end Macs.
In the latest boost for Apple's iOS platform, the company's new enterprise partner IBM this week launched additional applications in its MobileFirst for iOS suite at the Mobile World Congress in Barcelona. Among the new apps are Dynamic Buy for retailers, Passenger Care for companies in the travel and transportation industry and Advisor Alert for banks. Big Blue said with the launch of its latest apps, 50 customers are already using them including Air Canada, American Eagle Outfitters, Banorte, Boots UK, Citigroup and Sprint.
The rise of iOS in the enterprise may not have peaked yet. But at the same time, it's not a death knell for Windows. Microsoft is well aware of the new client device landscape and the fact that these new device types aren't going away. From my standpoint, different devices are suited to different purposes, and it's nice to have both at my disposal.
Posted by Jeffrey Schwartz on 03/06/2015 at 11:28 AM
LinkedIn, the popular social network for business users, has informed customers that its social connector will no longer work with older versions of Outlook after next Monday, March 9.
The announcement, sent in an e-mail to customers, stated that the connector will still work with the most recent release, Outlook 2013. Those with Outlook 2003, 2007 and 2010 will no longer be able to view information about their LinkedIn contacts, the company said in the e-mail.
"Our team is working with Microsoft to build even more powerful tools to help you stay connected with your professional world," according to the e-mail that I received this afternoon. "Until then you can get similar capabilities with the 'LinkedIn for Outlook' app for Outlook 2013 from the Office Store."
The move comes just over five years after Microsoft debuted the Outlook Social Connector for LinkedIn. Though I've tested the feature, I don't currently use it.
Do you use the connector actively, or at all? If you have an older version of Outlook, is this move likely to convince you to upgrade to Outlook 2013?
Posted by Jeffrey Schwartz on 03/02/2015 at 1:08 PM
Failure to update your systems and applications running Windows Server 2003 could have deadly consequences. That's the message that Microsoft Distinguished Engineer Jeffrey Snover conveyed over the weekend when he tweeted his warning about what will happen to those who keep Windows Server 2003-based systems running after July 14:
Not updating from WS2003 is like the guy who jumps off a building on the way down says, "so far so good." #ThisIsNotGoingToEndWell
Microsoft has been pretty vocal about the need to update Windows Server 2003. Snover, who is highly regarded in the Microsoft MVP community, is the latest of many in Redmond trying to be clear about the risks Microsoft says customers will face if they don't address the situation. In simple terms, there are still millions of Windows Server 2003-based systems in service. After July 14, Microsoft will no longer issue security patches, meaning those servers could become conduits for malware and other threats. I go much deeper into that in this month's Redmond magazine cover story.
Microsoft recommends upgrading to Windows Server 2012 R2, or urges users to consider moving the affected applications to the cloud, if that makes sense. According to a survey conducted last year by application remediation vendor AppZero, more than one third of respondents, 36 percent, said there will be a cloud component to their upgrade process.
Perhaps one of the biggest challenges to upgrading is that it requires organizations to decommission their Windows Server 2003 Active Directory domain controllers and migrate the schema to a more current iteration of AD. MVP John O'Neill Sr., who has joined the roster of Redmond magazine contributors, aptly explains how to do so.
Do you have a plan in place? Will you migrate to Windows Server 2012 R2 or are you looking at a pure cloud-based deployment of your applications? Perhaps you're planning a hybrid architecture? Or maybe you simply don't agree with Snover's latest warning of the perils of doing nothing? Share what you're going to do about the pending end of support for Windows Server 2003.
Posted by Jeffrey Schwartz on 03/02/2015 at 11:49 AM
Docker today released several new tools aimed at letting IT pros and developers build and manage distributed applications that are compatible across environments including Amazon Web Services, Google, IBM, Joyent, Mesosphere, Microsoft and VMware.
Over the past year, these players have pledged support for Docker's open source container environment, which has quickly emerged as the next-generation architecture for developing and provisioning distributed apps. Today's beta releases are key deliverables by Docker and its ecosystem partners to advance the building, orchestration and management of the container-based platform.
For its part, Microsoft said it is supporting the newly released betas of Docker Machine, a provisioning tool that gives IT pros and developers the ability to automate the creation and management of Docker hosts on Linux or Windows servers, and Docker Swarm, Docker's native clustering tool, which lets those hosts be pooled across infrastructures including Azure virtual machines. Microsoft also said it will natively support the new Docker Compose developer tool as a Docker Azure extension.
The new Docker Machine beta allows administrators to select an infrastructure on which to deploy an application built in the new environment. Microsoft has contributed an Azure driver that allows for rapid and agile deployment of Docker hosts on Azure Virtual Machines. "There are several advantages to using Docker Machine, including the ability to automate the creation of your Docker VM hosts on any compatible OS and across many infrastructure options," said Corey Sanders, Microsoft's director of Azure program management, in a post on the Microsoft Azure blog.
"With today's announcement, you can automate Docker host creation on Azure using the Docker Machine client on Linux or Windows," he added. "Additionally, Docker Machine provides you the ability to manage and configure your hosts from a single remote client. You no longer have to connect to each host separately to perform basic monitoring and management tasks, giving you the flexibility and efficiencies of centralized devops management."
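As a rough sketch of the workflow Sanders describes, provisioning an Azure-backed Docker host from a single client might look like the following. The driver flags reflect the Docker Machine beta and the subscription values are placeholders; exact flag names may differ by release.

```shell
# Provision a Docker host on an Azure VM using the Azure driver
# (the subscription ID and certificate file are placeholders).
docker-machine create --driver azure \
    --azure-subscription-id="<subscription-id>" \
    --azure-subscription-cert="mycert.pem" \
    azure-host

# Point the local Docker client at the new remote host...
eval "$(docker-machine env azure-host)"

# ...then manage it like any local Docker engine, without
# connecting to the host separately.
docker ps
```

The same client can hold environments for many hosts, which is the centralized management Sanders refers to.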
With Docker Swarm, which turns a pool of Docker hosts into a single virtual host, IT pros can deploy their container-based apps and workloads using Docker's native clustering and scheduling functions, Sanders added. It also lets customers select cloud infrastructure such as Azure, enabling them to scale as dev and test needs dictate. Sanders noted that using the Docker CLI, customers can deploy Swarm to enable scheduling across multiple hosts. The Docker Swarm beta is available for download on GitHub.
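A minimal sketch of that clustering flow, using flag names from the Swarm beta (the token and host names below are placeholders):

```shell
# Generate a cluster discovery token (run against any Docker engine);
# this prints a <token> used to join nodes to the cluster.
docker run swarm create

# Provision a Swarm master and a worker node via Docker Machine,
# substituting the token printed above.
docker-machine create -d azure --swarm --swarm-master \
    --swarm-discovery token://<token> swarm-master
docker-machine create -d azure --swarm \
    --swarm-discovery token://<token> swarm-node-01

# Point the Docker CLI at the whole cluster; Swarm then schedules
# containers across the pooled hosts as if they were one virtual host.
eval "$(docker-machine env --swarm swarm-master)"
docker info
```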
The Docker Compose tool enables and simplifies modeling of multi-container Docker solutions using the declarative YAML file format. "This single file will be able to take a developer-modeled application across any environment and generate a consistent deployment, offering even more agility to applications across infrastructure," Sanders noted. "In Azure, we are working to expand our current Docker extension to support passing of the YAML configuration directly through our REST APIs, CLI or portal. This will make the simple even simpler, so you can just drop your YAML file details into the Azure portal and we take care of the rest."
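To illustrate, a minimal Compose file in that YAML format might model a two-container app as follows; the service names and images here are arbitrary examples, not from Microsoft's announcement.

```yaml
# docker-compose.yml: a web front end linked to a Redis container
web:
  build: .          # build the web image from the local Dockerfile
  ports:
    - "5000:5000"   # expose the app on host port 5000
  links:
    - redis         # make the redis service reachable by name
redis:
  image: redis      # use the stock Redis image
```

Running `docker-compose up` would then build and start both containers together; because the topology lives in the file rather than in per-environment scripts, the same application model can be deployed consistently across infrastructures, which is the agility Sanders describes.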
Microsoft said it will be releasing documentation for building Docker hosts on Azure virtual machines using Docker Machine.
Posted by Jeffrey Schwartz on 02/26/2015 at 12:53 PM
The National Security Agency (NSA) continues to hold its stance that the only way to thwart terrorist attacks and other crimes is to continue the surveillance programs exposed by Edward Snowden nearly two years ago. The latest report alleges that the NSA, along with its British counterpart, the Government Communications Headquarters (GCHQ), hacked encryption keys from SIM cards in smartphones.
Documents provided by Snowden and reported last week by The Intercept allege that the U.S. and British governments specifically were hacking into SIM cards from Gemalto, the largest provider of the cards, which are used in smartphones to store encrypted identity information. According to the report, the breach was outlined in a secret 2010 GCHQ document.
If indeed the encryption keys were stolen, it gave the agencies the ability to eavesdrop on and wiretap voice and data communications without approval from governments or wireless providers. The bulk key theft also gave the agencies the ability to decrypt communications that they had already intercepted, according to the report. The ability to do so was the result of mining communications of engineers and other Gemalto employees, the report added, noting that the company was "oblivious to the penetration of its systems."
Now Gemalto is casting doubt on the severity of the breach. The company released a statement acknowledging that it detected the intrusions, which took place in 2010 and 2011. The findings of its investigation "give us reasonable grounds to believe that an operation by NSA and GCHQ probably happened," according to the Gemalto statement. However, in questioning the extent of the breach, the statement said that "the attacks against Gemalto only breached its office networks and could not have resulted in a massive theft of SIM encryption keys."
By 2010, the company said it had already implemented a secure transfer system with its customers and in only some rare instances could theft have occurred. Moreover, in many of the targeted countries at the time, many of the networks only had 2G mobile communications networks, which are inherently insecure. The modern 3G and 4G networks weren't vulnerable to such interceptions, according to the company. Gemalto said none of its other cards were affected by the attack. While the statement also pointed to some inconsistencies in the document that was leaked, including some of the customers it claimed the company worked with, Gemalto said that the SIM cards have customized encryption algorithms for each telecom provider.
For its part, the NSA is making no apologies for its surveillance policies. NSA Director Mike Rogers spoke last week at the New America Foundation's cyber security conference in Washington, D.C., where he said backdoors would not have a negative impact on privacy, weaken encryption or dampen demand for technology from the U.S.
Alex Stamos, Yahoo's chief information security officer, who was in attendance at the conference, took Rogers to task over the contention that the government should have backdoors or master keys, according to The Guardian. Stamos asked Rogers how Yahoo, which has 1.3 billion users throughout the world, could be expected to address backdoor requests from foreign governments, likening the idea of a backdoor to "drilling a hole in a windshield." Rogers reportedly skipped over the question of foreign requests, replying: "I think that this is technically feasible. Now it needs to be done within a framework."
The problem is, it's unlikely that the feds will come up with a framework that will sit well with many people.
Posted by Jeffrey Schwartz on 02/25/2015 at 10:27 AM
Lenovo Chief Technology Officer Peter Hortensius yesterday apologized for the Superfish spyware installed on several of its PC models, saying it shouldn't have happened and that the company is putting together a plan to ensure it never happens again.
"All I can say is we made a mistake and we apologize," Hortensius said in an interview with The New York Times. "That's not nearly enough. So our plan is to release, by the end of the week, the beginning of our plan to rebuild that trust. We are not confused as to the depth that this has caused people not to trust us. We will do our best to make it right. In the process, we will come out stronger. But we have a long way to go to make this right."
Hortensius said so far Lenovo has not seen any evidence that the malicious software embedded deep within the company's systems put any customers or their data at risk. "We are not aware of this actually being used in a malevolent way," he told The Times' Nicole Perlroth. Asked if it's possible that Lenovo engineers installed this on any models other than the two already reported (the Yoga 2 models and Edge 15), Hortensius said he didn't believe so, but the company is investigating and will have an answer by the end of the week.
Nevertheless, some of his responses were troubling. Why did it take more than a month for Lenovo to get to the bottom of this once it was reported to the company? "At that time, we were responding to this issue from a Web compatibility perspective, not a security perspective," he said. "You can argue whether that was right or wrong, but that's how it was looked at." Hortensius also wasn't able to answer Perlroth's question regarding how the opt-in processes work.
He was also unable to explain how the company was unaware that Superfish was hijacking the certificates. "We did not do a thorough enough job understanding how Superfish would find and provide their info," he said. "That's on us. That's a mistake that we made."
Indeed mistakes were made. Some might credit him for saying as much and apologizing. But based on the comments from my report on the issue earlier this week, it may be too little, too late.
"I didn't trust Lenovo even before this issue," said one commenter who goes by the name "gisabun." "Expect to see sales drop a bit [even if the corporate sales are generally unaffected]. Microsoft needs to push all OEMs to remove unnecessary software."
"Bruce79" commented: "Inserting a piece of software that opens unsuspecting users up to security attacks? That is a clear betrayal, regardless of price."
Kevin Parks said, "We need a class-action lawsuit to sue them into oblivion. That would tell vendors that we won't accept this kind of behavior."
Another had a less extreme recommendation: "What Lenovo could and should do is simple. Promise to never put third-party software on their machines for [X number] of years. After X number of years, no software will be preloaded; Lenovo will ask if you want the software downloaded and installed."
Was Lenovo CTO's apology a sincere mea culpa or was he just going into damage-control mode? Do you accept his apology?
Posted by Jeffrey Schwartz on 02/25/2015 at 9:36 AM
BlueStripe Software today said its FactFinder monitoring suite now supports distributed applications residing in Docker containers. The company said an updated release of FactFinder will let IT operations administrators monitor and manage applications deployed in Docker containers.
Granted, full-blown transaction-oriented systems developed and deployed in Docker containers are few and far between today. But BlueStripe Director of Marketing Dave Mountain said a growing number of development teams are prototyping new applications that can use Docker containers, which are portable and require much less overhead than traditional virtual machines.
"It's something where there's a lot of activity on the dev side with Docker containers," Mountain said. "Generally when you hear people talking about it, it's very much at the development level of [those] prototyping new ideas for their applications and they're pulling things together to build them quickly. We're not seeing them being deployed out to the production environments, but it's coming. As such, we want to be ready for it."
FactFinder is a tool used to monitor transactions distributed across server, virtualization and cloud platforms and is designed to troubleshoot the root cause of a transaction that might be failing or just hanging. The company last year added a Microsoft System Center Operations Manager module and support for the Microsoft Azure Pack for hybrid cloud infrastructures. The BlueStripe "collectors" scan physical and virtual machines to detect processes taking place. With the new release, BlueStripe can view, isolate and remediate processes housed within a container just as if they were running in a physical or virtual machine.
Though these are early days for Docker containers, Mountain believes they will indeed become a key tier in distributed datacenter and cloud architectures. "As it continues to go, we expect this to become more mainstream. So this was a move on our part to make sure we're addressing that need," Mountain said. "I think it's real, I don't think this is just hype."
Posted by Jeffrey Schwartz on 02/24/2015 at 10:25 AM
Lenovo's decision to install the adware program Superfish on some of its PCs, notably the Yoga 2 models and Edge 15, was the latest inexcusable action by a company that we should be able to trust to provide a secure computing environment. It's hard to understand how Lenovo could let a system into the market that was able to bypass the antimalware software it bundled from McAfee (as well as others).
While Microsoft swiftly updated its Windows Defender to remove the certificate for Superfish and Lenovo on Friday released its own downloadable removal tools including source code, this wasn't just another typical bug or system flaw.
Unbeknownst to customers, Lenovo apparently installed the Superfish software, designed to track users' online sessions including all SSL traffic, leaving their passwords and other sensitive information vulnerable to theft by hackers. Adding insult to injury, Lenovo took the rather unscrupulous step of installing it at the BIOS level, making it impervious to antimalware and AV protection software.
Justifying the move, Lenovo said it had knowingly installed the adware under the guise that it would "enhance the shopping experience." The only thing it enhanced was users' suspicion that the companies Lenovo does business with are putting their information at risk to further their own objectives.
Just in the past few weeks, we learned that hackers stole user information from Anthem, the nation's second largest health insurer. Some 80 million customers (myself included) had their private information exposed in that attack. Also last week, the latest leak by Edward Snowden to The Intercept accused the National Security Agency (NSA) and the British government of hacking into SIM cards from Gemalto, a company whose chips are used in smartphones to store personal information such as passports and identity data. And the list goes on.
What's galling about the Lenovo incident is that the company only put a stop to it when Peter Horne, the person who discovered it, raised the issue (the company argued it was due to negative user feedback). Horne, a veteran IT professional in the financial services industry, came across the Superfish installation on the Lenovo Yoga 2 notebook he bought. Horne told The New York Times that not only did the bundled McAfee software fail to discover it, but Superfish also got past the Trend Micro AV software he installed. Looking to see how widespread the problem was, he visited Best Buy stores in New York and Boston, as well as retailers in Sydney and Perth, and the adware was installed on all the PCs he tested.
Yet upon fessing up, Lenovo argued that it was only installed on consumer systems, not ThinkPad, ThinkCentre, Lenovo Desktop, ThinkStation, ThinkServer and System x servers. Horne had a rather pointed suspicion about Lenovo's decision to install the adware in the first place. "Lenovo is either extraordinarily stupid or covering up," he told The Times. "Either one is an offense to me."
But he noted an even bigger issue. "The problem is," he said, "what can we trust?"
Posted by Jeffrey Schwartz on 02/23/2015 at 2:50 PM
Many of us look forward to the day when we can ask for any information we want and have systems intelligently bring us what we're looking for. In a sense that's what Microsoft's new Azure Machine Learning service aims to do. While IBM has demonstrated the concept with Watson and is looking to advance that technology as well, Microsoft aims to bring the service to the masses more easily and affordably.
"Simply put, we want to bring big data to the mainstream," wrote Joseph Sirosh, corporate vice president for Machine Learning, in a blog post announcing the general availability of the service, which was first announced last summer. Azure Machine Learning is based on templates and visual workflows that support APIs and Web services, enabling developers to tap into the Azure Marketplace to easily pull together components to build applications that incorporate predictive analytics capabilities.
"It is a first-of-its-kind, managed cloud service for advanced analytics that makes it dramatically simpler for businesses to predict future trends with data," Sirosh added. "In mere hours, developers and data scientists can build and deploy apps to improve customer experiences, predict and prevent system failures, enhance operational efficiencies, uncover new technical insights or a universe of other benefits. Such advanced analytics normally take weeks or months and require extensive investment in people, hardware and software to manage big data."
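For a sense of what that API access looks like, a published Azure ML model can be scored over its request-response Web service. The workspace and service IDs, region, API key and column names below are placeholders; the JSON shape follows the service's documented request format.

```shell
# Score one row of data against a published Azure ML web service.
# API_KEY and the <workspace-id>/<service-id> values must be
# replaced with those of an actual published service.
curl -X POST \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "Inputs": {
          "input1": {
            "ColumnNames": ["feature1", "feature2"],
            "Values": [["0.5", "1.2"]]
          }
        },
        "GlobalParameters": {}
      }' \
  "https://ussouthcentral.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/execute?api-version=2.0"
```

The response comes back as JSON containing the model's predicted values, which is what lets developers wire predictive analytics into their apps in hours rather than weeks.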
Yet despite the rapid growth and rollout of new Hadoop-based services, which are the underpinnings of the most sought-after predictive analytics platforms, adoption is somewhat stalled, according to a survey conducted during Gartner's latest quarterly Hadoop webinar. The percentage of the 1,200 participants who said this month that they have deployed Hadoop-based applications has remained flat since last quarter's survey (only 15 percent said they have actually deployed).
However, when the Gartner survey results are examined based on respondents who said they were in the "knowledge gathering" mode, the percentage of Hadoop deployments was lower than 15 percent. Meanwhile, those who said in the survey that they were developing strategies for Hadoop had rates of deployment that were higher than 15 percent. Gartner Research VP Merv Adrian indicated in a blog post that while it's hard to draw any broad conclusions, it may indicate renewed interest by those who have put their plans on hold. "My personal speculation is that it comes from some who have been evaluating for a while," he said.
And indeed there is plenty to look at. Microsoft has rolled out some noteworthy new offerings and is gaining partner support. That includes the latest entry to the Azure Marketplace, Informatica, which released its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage.
"Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data," wrote Informatica's Ronen Schwartz, in a blog post. "The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others -- for advanced analytics."
Do you think machine learning is ready for prime time?
Posted by Jeffrey Schwartz on 02/20/2015 at 12:54 PM
Microsoft CEO Satya Nadella "loves" Linux. So it should come as little surprise that Microsoft is planning to support its Azure HDInsight big data analytics offering on the open source server platform. The company announced the preview of HDInsight on Linux at the Strata + Hadoop World conference in San Jose, Calif. Microsoft also announced the release of Azure HDInsight running Storm, the popular Apache platform for streaming analytics.
The open source extensions aim to widen Microsoft's footprint in the growing market for big data services, enable users to gather more information that they can parse and analyze to make better decisions, and bring big data into mainstream use, as Microsoft has indicated with its development of Cortana, now available on Windows Phone and in beta on Windows 10.
In addition to the public preview of HDInsight on Linux and general availability of Apache Storm for HDInsight, Microsoft announced Hadoop 2.6 support in HDInsight, new virtual machine sizes, the ability to grow or reduce clusters running in HDInsight and a Hadoop connector for DocumentDB.
"This is particularly compelling for people that already use Hadoop on Linux on-premises like on Hortonworks Data Platform because they can use common Linux tools, documentation and templates and extend their deployment to Azure with hybrid cloud connections," said T. K. "Ranga" Rengarajan, corporate vice president for Microsoft's Data Platform and Joseph Sirosh, corporate vice president for Machine Learning, in a blog post.
Support for Storm is also another key advance for Microsoft as it has emerged as a widely adopted open source standard for streaming analytics. "Storm is an open source stream analytics platform that can process millions of data 'events' in real time as they are generated by sensors and devices," according to Ranga. "Using Storm with HDInsight, customers can deploy and manage applications for real-time analytics and Internet-of-Things scenarios in a few minutes with just a few clicks."
Despite its open source push, Microsoft isn't part of the Open Data Platform initiative that was announced this week to ensure an interoperable Apache Hadoop core. Among those on board are GE, Hortonworks, IBM, Infosys, Pivotal, SAS, Altiscale, Capgemini, CenturyLink, EMC, Splunk, Verizon Enterprise Solutions, Teradata and VMware.
Asked why, a Microsoft spokeswoman stated, "Microsoft is already partnered with Hortonworks to use HDP which will utilize the Hadoop core from the Open Data Platform Initiative moving forward. We also will continue to contribute to the broader Apache Hadoop ecosystem." The statement also offered support for the project. Microsoft sees the Open Data Platform Initiative as a good step forward to having everyone run on the same Hadoop core including HDFS, YARN and Ambari. "We see standardization in the Hadoop space as a good thing as it reduces fragmentation and makes adoption of the technologies easier."
In addition, Microsoft is focused on contributing to Hadoop projects like Hive (Project Stinger, Tez), YARN, REEF and others, as well as partnering with Hortonworks, she said. "We see this Open Data Platform Initiative as complementary to these efforts and it will help the overall Hadoop ecosystem."
Posted by Jeffrey Schwartz on 02/20/2015 at 12:20 PM
A number of prominent SharePoint MVP experts say they are confident that the on-premises server edition of SharePoint has a long future, despite Microsoft's plans to extend the capabilities of its online counterpart -- Office 365 -- as well as options to host it in a public cloud service such as Azure. At the same time, many realize that customers are increasingly moving some or all of their deployments to an online alternative, or considering doing so, either by hosting SharePoint in the cloud or by moving to Office 365 and SharePoint Online.
In one of many Tweetjams -- online discussions via Twitter -- hosted by prominent SharePoint MVP Christian Buckley (@buckleyplanet), the experts weighed in on the forthcoming SharePoint 2016 release, due out later this year, and what it will mean for the future of the premises-based edition.
"On-prem is very much alive and well. Don't think it's going away anytime soon," said MVP Asif Rehmani (@asifrehmani), founder and CEO of Chicago-based VisualSP. "Alive and well. Oh, and heavily customized," added Daniel Glenn (@DanielGlenn), Technical Consultant at InfoWorks and president of the Nashville SharePoint User Group.
Not everyone sees it that way. Some participants say the move toward hybrid deployments is gaining traction and is a sign that SharePoint in the datacenter has peaked. "SharePoint OnPrem is trending down, but still steady and above 70 percent -- there is room to grow still," tweeted Jeff Shuey (@jshuey), chief evangelist at K2, an ISV that provides workflow apps for SharePoint.
Barry Jinks (@bjinks), CEO of collaboration app provider Colligo, argued that the economies of Office 365 are compelling to many customers. "Eventually enterprises will move there," Jinks tweeted. "Just going to take way longer than hoped."
Buckley, the moderator and principal consultant with GTConsult, noted that while Microsoft may want everyone to move to the cloud, enterprises have too much invested in their on-premises SharePoint deployments. "SP on-prem CAN'T be killed by MSFT or anyone, only supplanted as cloud gets ever better," he tweeted. "Our Enterprise customers are looking at Hybrid. Still loving the on-prem #SharePoint as they have hefty investments there," said Gina Montgomery (@GinaMMontgomery), strategic director for Softmart, where she manages its Microsoft practice.
"IT [and collaboration tools] are evolving much faster than a 3 year DVD release cycle," said SharePoint and Office 365 Architect Maarten Visser (@mvisser), managing director of meetroo. "SharePoint OnPrem gets old quickly."
Asked if hybrid SharePoint deployments are becoming the new norm, the experts argued the hype doesn't match what they're seeing from their customers. "I don't think it will be a norm as much as what will be the best fit to meet requirements," said Stacy Deere-Strole (@sldeere), owner of SharePoint consultancy Focal Point Solutions.
"MSFT want it to be [the new norm]," observed SharePoint MVP Jason Himmelstein (@sharepointlhorn), Sr. Tech Director at Atrion. "Like with much of what we see coming out of Redmond it will be a bit before the desire matches the reality."
Yet many acknowledged that customers are moving to hybrid deployments, or are in the process of planning to do so. "The story for OnPremises #SharePoint only gets better when you can work seamlessly with the cloud #SPO -- Hybrid is a must," said SharePoint MVP Fabian Williams (@FabianWilliams). "Is hybrid the right idea? DAMN RIGHT. Move the right workloads for the right reasons," Himmelstein added.
"Yes, hybrid is becoming the norm for enterprises as well now. It just makes sense," Rehmani added. "Hybrid brings conservative customers the stability they need and allows them to experiment in the cloud," said Visser. "That's why SharePoint 2016 will be all about hybrid to force the transition," said MVP Michael Greth (@mysharepoint), based in Berlin. "Soon -- complex enterprise landscape will require a balance that hybrid can provide," tweeted Michelle Caldwell (@shellecaldwell), a director at integrator Avanade and a founder of the Buckeye SharePoint User Group in Ohio. "Many are still planning and dabbling."
Williams added: "Hybrid can be considered normal because you need a 'bridge' to work between SPO & ONPrem since not all features are on both."
Many are also looking forward to hearing about new management features coming in SharePoint 2016. "This will be the super exciting stuff at @MS_Ignite," said Dan Holme, co-founder & CEO of IT Unity, based in Maui. "I believe it will be the differentiator over O365," Glenn said. "But O365 will absorb some (if not all) of it via Azure services over time." Buckley is looking forward to hearing more from Microsoft on this. "There has always been a gap for management across SP farms, much less hybrid," he said. "Will be interesting to see what is coming next."
What is it about SharePoint 2016 you are looking forward to hearing about?
Posted by Jeffrey Schwartz on 02/19/2015 at 7:31 AM
The latest Silicon Valley startup looking to ride the wave of cloud-based software-defined datacenters (SDDCs) and containerization has come out of stealth mode today with key financial backers and founders who engineered the VMware SDDC and the company's widely used file system.
Whether Sunnyvale, Calif.-based Springpath will rise to prominence remains to be seen, but the company's hyper-converged infrastructure aims to displace traditional OSes, virtual machines (VMs) and application programming models. In addition to the VMware veterans, Springpath is debuting with $34 million in backing from key investors with strong track records. Among them is Sequoia Capital's Jim Goetz, whose portfolio has included such names as Palo Alto Networks, Barracuda Networks, Nimble Storage, Jive and WhatsApp. Also contributing to the round are New Enterprise Associates (NEA) and Redpoint Ventures.
Springpath's founders have spent nearly three years building their new platform, which they say will ultimately host, manage and protect multiple VMs, server OSes, apps and application infrastructure including Docker containers. The company's namesake Springpath Data Platform in effect aims to let organizations rapidly provision, host and virtualize multiple VMs (VMware, Hyper-V and KVM), compute and application instances, storage pools and distributed file systems while managing and protecting them running on commodity servers and storage.
Founders Mallik Mahalingam and Krishna Yadappanavar are respectively responsible for the development of VMware VXLAN, the underpinnings of the software-defined network (SDN), and VMFS, the widely deployed file system in VMware environments. The two believe the new distributed subscription-based datacenter platform they're rolling out will reshape the way enterprises develop and host their applications in the future.
Springpath's software runs on any type of commodity server and storage hardware and is offered in a Software-as-a-Service (SaaS) subscription model. The hosting and systems management platform costs $4,000 per server per year and ties into existing enterprise management systems. Not surprisingly, it initially supports VMware vCenter, but it will also run as a management pack in Microsoft System Center.
The platform is based on what Springpath calls its Hardware Agnostic Log-structured Objects (HALO) architecture, which purports to offer a superior method for provisioning data services, managing storage and offering a high-performance and scalable distributed infrastructure. Rather than requiring customers to buy turnkey appliances, the company is offering just the software. It's currently supported on servers offered by Cisco, Dell, Hewlett-Packard and Supermicro. Customers can deploy the software themselves or have a partner run it on one of the supported systems. Springpath has forged a partnership with mega distributor Tech Data to provide support for the platform.
Ashish Gupta, Springpath's head of marketing, described the HALO architecture in an interview. HALO consists of distribution, caching, persistence optimization and structured object layers, Gupta explained. "Married to all of this are the core data services that all enterprises are going to need from a management perspective like snapshots, clones, compression [and so on]," he said. "The idea here is you can put the Springpath Data Platform on a commodity server and then scale and essentially give core capabilities that can replace your array-based infrastructure. You will no longer need to buy expensive multimillion-dollar arrays and isolate yourself in that environment; you are buying commodity servers, putting the software on top, and you're going to get all the enterprise capabilities and functionality that you expect."
The hardware-agnostic distribution layer, for example, enables enterprises to take advantage of all the underlying hardware to support an application, he added. "The platform can be running on N number of servers. We can take advantage of all the resources underneath the servers, be it the disk, be it the memory or the flash environment and essentially present that to the applications that are supported by the platform."
In that context the applications can run in a virtualized environment from VMware, Hyper-V or KVM, and can be supported in containerized environments. Gupta noted Springpath is also providing its Platform-as-a-Bare-Metal offering. "So it can look like a traditional storage device, except it can scale seamlessly in a bare metal deployment," he said. Gupta added Springpath has its own file system and supports either block objects or Hadoop plug-in infrastructures. "In essence, we can give you a singular platform for all your application needs," he said.
While it's not the only hyper-converged platform available, it is the first potentially major one that is not tied to a specific hardware platform or offered as an appliance, said Chuck Bartlett, senior VP for Tech Data's advanced infrastructure solutions division, which is working to line up systems integration partners to offer the platform. "The fact that it is compute-agnostic, meaning the end-user client can use the server platform they have or want to implement, is unique and compelling."
Looking ahead, Springpath's Gupta sees HALO targeting emerging Web-scale applications and distributed programming environments built in programming languages such as Node.js, Scala, Akka or Google Go. The initial release is designed to support VMware environments and OpenStack Horizon-based clouds, though plans call for supporting Microsoft Azure, Hyper-V and System Center this summer.
Posted by Jeffrey Schwartz on 02/18/2015 at 4:02 PM
President Obama issued an executive order aimed at persuading companies that suffer breaches to share information, in an effort to provide a more coordinated response to cyberattacks. Though the order stops short of mandating that they do so, the president is also proposing legislation that would pave the way for greater information sharing between the private sector and government agencies, including the Department of Homeland Security. The legislation also calls for modernizing law enforcement authorities to fight cybercrime and creating a national breach reporting authority.
The order, signed today by the president at the Cybersecurity Summit at Stanford University in Palo Alto, Calif., sets the stage for the latest round of debate on how to protect the nation's infrastructure and consumer information without compromising privacy and civil liberties. Obama's push to promote information sharing, which could yield better threat intelligence and methods of responding to attacks, nonetheless won't sit well with organizations that are loath to share for fear of liability and business impact.
Specifically, the president has proposed the formation of information sharing and analysis organizations (ISAOs), private sector groups that would share information and collaborate on cybersecurity issues along the lines of the existing Information Sharing and Analysis Centers (ISACs). The proposal extends the information-sharing executive order Obama issued two years ago to the day, outlined in his State of the Union address, which led to the release of last year's Cybersecurity Framework.
Since then, of course, cyberattacks have become more severe, from the 2013 Target breach to last year's major attacks against Apple, Home Depot, the IRS and Sony, and now this year's compromise of customer information at Anthem, the second-largest health insurance provider.
Obama also met today with some key industry executives at the summit, including Apple CEO Tim Cook and Intel president Renee James. Besides Cook, however, top CEOs were conspicuous by their absence, including those of Facebook, Google, IBM, Microsoft and Yahoo.
The order also seeks to let law enforcement agencies prosecute those who sell botnets, while making it a crime to sell stolen U.S. financial information such as credit card and account numbers to anyone overseas. It will also give federal law enforcement agencies authority to go after those who sell spyware and give courts the authority to shut down botnets.
Several key IT providers and large at-risk companies attending the summit announced their support for the framework, including Intel, Apple, Bank of America, U.S. Bank, Pacific Gas & Electric, AIG, QVC, Walgreens and Kaiser Permanente, according to a fact sheet released by the White House.
While some simply announced support for the framework, Intel released a paper outlining its use and said it is requiring all of its vendors to use it as well. Apple said it's incorporating the framework as part of its broader security across its networks. Bank of America is also requiring its vendors to use the framework, while insurance giant AIG said it is incorporating the NIST framework into how it underwrites cyber insurance for businesses of all sizes and will use it to help customers identify gaps in their approach to cybersecurity.
The White House also said several members of the Cyber Threat Alliance, which includes Palo Alto Networks, Symantec, Intel and Fortinet, have formed a cyber threat-sharing partnership that aims to create standards aligned with its information sharing order. Along with that, according to the White House, Box plans to participate in creating standards for ISAOs with plans to use its Box platform to extend collaboration among ISAOs. Further, FireEye is launching an Information Sharing Network, which will let its customers receive threat intelligence in near real time (including anonymized indicators).
Several companies are also announcing efforts to extend multifactor authentication, including Intel, which is releasing new authentication technology that seeks to make biometrics a more viable alternative to passwords. Credit card providers and banks, including American Express, MasterCard and its partner First Tech Federal Credit Union, are all advancing efforts to pilot and/or roll out new multifactor authentication methods, including biometrics and voice recognition.
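The specific technologies these companies are piloting aren't public here, but many multifactor rollouts layer biometrics or hardware tokens on top of a time-based one-time password (TOTP) along the lines of RFC 6238. As a minimal illustrative sketch (not any vendor's implementation), a second-factor code can be derived from nothing but a shared secret and the clock:

```python
import hmac
import struct
from hashlib import sha1

def totp(secret: bytes, for_time: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    counter = struct.pack(">Q", for_time // step)   # 8-byte big-endian time step
    digest = hmac.new(secret, counter, sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: secret "12345678901234567890", T=59
print(totp(b"12345678901234567890", 59, digits=8))  # -> 94287082
```

A server and an authenticator sharing the secret compute the same code independently; because it changes every 30 seconds, a stolen password alone isn't enough to log in.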
Much of the buzz is about the tech CEOs who failed to attend, but today's event at Stanford showed some potentially significant advances by companies, along with proposals by the president that will certainly raise the volume of the debate from Silicon Valley to the Beltway.
What's your take on the president's latest executive order?
Posted by Jeffrey Schwartz on 02/13/2015 at 12:48 PM
The popular Sysinternals site, which Microsoft acquired nearly a decade ago along with its troubleshooting utilities, tools and help files, is now SSL-enabled. Mark Russinovich, the site's co-creator and steward and Microsoft's Azure CTO, tweeted the news earlier in the week.
Microsoft and many others are moving their Web sites to SSL -- the long-established Secure Sockets Layer standard used for encrypted Web sessions. Long enabled on sites where financial transactions and other secure communications are necessary, the move from HTTP to HTTPS sessions is spreading rapidly as the hazards of intercepted communications are on the rise.
If you're wondering why a site that just hosts documentation would need an HTTPS connection, consider that there are lots of executables there as well. Though all the binaries are signed, using SSL to access the tools via the online share prevents man-in-the-middle tampering in cases where the user doesn't validate the signature before launching a tool.
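As a small illustration of the point (the helper below is just a sketch; only the live.sysinternals.com host name comes from the real service), a download script can insist on the encrypted endpoint before fetching anything:

```python
from urllib.parse import urlparse, urlunparse

def force_https(url: str) -> str:
    """Rewrite an http:// URL to https:// so the transfer resists tampering in transit."""
    parts = urlparse(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunparse(parts)

# The signed binary still gets fetched, but now over an encrypted channel:
print(force_https("http://live.sysinternals.com/procexp.exe"))
# -> https://live.sysinternals.com/procexp.exe
```

Validating the signature on the downloaded binary remains a good second check: HTTPS protects the transfer, while the signature proves the publisher.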
Posted by Jeffrey Schwartz on 02/12/2015 at 1:28 PM
Microsoft's announcement back in October that it has partnered with Docker to bring containers to Windows was an important step toward enabling what promises to be the next wave in computing beyond virtualization. While things can change on a dime, it looks like Microsoft is going all in on a new computing model -- widely endorsed by IBM, Google, VMware and others -- based on application portability and a more efficient use of compute, storage and network resources.
It sounds quite grand but so did virtualization -- and the idea of consolidating server resources -- when it hit the scene a decade ago. Of course, the proof will be in the implementation. It's very likely we'll hear about how to enable Linux containers in Windows Server at the upcoming Build and Ignite conferences in late April and early May, as Microsoft Distinguished Engineer Jeffrey Snover hinted last week.
"We're also going to talk about containers, Docker containers for Windows," Snover said. "There will be two flavors of the compute containers. There'll be a compute container focused in on application compatibility, so that will be server running in a containers, and then there will be containers optimized for the cloud. And with those containers you'll have the cloud optimized server."
Those wanting to run Linux containers in Azure can start now, based on documentation Microsoft posted yesterday. "Docker is one of the most popular virtualization approaches that uses Linux containers rather than virtual machines as a way of isolating data and computing on shared resources," according to the introduction. "You can use the Docker VM extension to the Azure Linux Agent to create a Docker VM that hosts any number of containers for your applications on Azure."
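Once the extension is in place, the Docker daemon on the VM is reachable over the Docker Remote API. The host name and port below are placeholder assumptions (4243 was the extension's documented remote port at the time; later releases moved to TLS on 2376), so treat this as a sketch rather than the official procedure:

```python
import json
from urllib.request import urlopen

DOCKER_HOST = "my-docker-host.cloudapp.net"  # placeholder cloud service DNS name
DOCKER_PORT = 4243                           # assumed extension port; verify against the docs

def endpoint(path: str) -> str:
    """Build a Docker Remote API URL for the extension-enabled Azure VM."""
    return "http://%s:%d%s" % (DOCKER_HOST, DOCKER_PORT, path)

# Listing the running containers (requires the VM to actually be reachable):
# containers = json.load(urlopen(endpoint("/containers/json")))
print(endpoint("/containers/json"))
# -> http://my-docker-host.cloudapp.net:4243/containers/json
```

Any standard Docker client pointed at that endpoint can then manage containers on the Azure VM as if it were local.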
In its description of Docker containers, the documentation points out that they're currently one of the most popular virtualization alternatives to virtual machines in that they isolate data and computing on shared resources, enabling developers to build and deploy apps across Docker hosts, which may run in different environments.
As I noted earlier in the week, DH2i is now offering a platform that enables containers to run in Windows Server -- the difference being that they're Windows-, not Linux-based, though DH2i says they will work with Docker containers as well.
But if you're looking to start with Docker in Azure, Microsoft is making the push.
Posted by Jeffrey Schwartz on 02/12/2015 at 1:27 PM
Microsoft's new ExpressRoute service could emerge as a key piece of its hybrid cloud story for customers wary of using the public Internet to link their private datacenters to Azure. ExpressRoute, introduced at last May's TechEd conference in Houston, effectively provides dedicated links that are more reliable, faster and more secure. To encourage customers and partners to try it out, Microsoft is offering the service free of charge through the end of June.
The offer covers Microsoft's ExpressRoute 10 Mbps Network Service Provider (NSP) plan for new and existing customers in all Azure regions where ExpressRoute is currently offered. Several of Microsoft's telecommunications partners that offer ExpressRoute are joining in the promotion, including British Telecom, Colt and Level 3. Also, AT&T is offering six-month trials of its new NetBond service, and Verizon is providing six months' use of its Secure Cloud Interconnect offering to first-time users of its basic data plan who sign two-year agreements for up to 1TB per month.
"ExpressRoute gives you a fast and reliable connection to Azure, making it suitable for scenarios such as data migration, replication for business continuity, disaster recovery, and other high-availability strategies," wrote Sameer Sankaran, a senior business planner within Microsoft's Azure group. "It also allows you to enable hybrid scenarios by letting you seamlessly extend your existing datacenter to Azure."
The service especially complements offerings such as Azure Site Recovery, which provides disaster recovery using Azure targets and Hyper-V replication, and suits applications requiring private or more reliable links than an Internet connection provides.
ExpressRoute is designed to connect on-premises resources such as physical and virtual server farms, storage, media services and Web sites, among other services. The service requires you to order circuits via one of the connectivity partners. Customers can choose either a direct layer 3 connection via an exchange provider or a standard layer 3 link from an NSP. Customers can enable one or both types through their Azure subscriptions but must configure both to connect to all supported services.
Posted by Jeffrey Schwartz on 02/11/2015 at 1:28 PM
A little-known startup that offers data protection and SQL Server migration tools today released what it calls the first native container management platform for Windows Server, claiming it can move workloads among virtual machine and cloud architectures. DH2i's DX Enterprise encapsulates Windows Server application instances into containers, removing the association between the apps, data and the host operating systems tied to physical servers.
The Fort Collins, Colo.-based company's software is a lightweight 8.5MB server installation that offers a native alternative to Docker containers, which are Linux-based, though Microsoft and Docker are jointly working to bring container technology to the next version of Windows Server, as announced last fall. In addition to its relationship with Microsoft, Docker has forged ties with all major infrastructure and cloud providers, including Google, VMware and IBM.
In his TechDays briefing last week, Microsoft Distinguished Engineer Jeffrey Snover confirmed that the company will include support for Docker containers in the next Windows Server release, known as Windows vNext.
DH2i president and CEO Don Boxley explained why he believes DX Enterprise is a better alternative to Docker, pointing to that fact that it's purely Windows Server-based.
"When you look at a Docker container and what they're talking about with Windows containerization, those are services that they're looking at then putting some isolation kind of activities in the future," Boxley said. "It's a really important point that Docker's containers are two containerized applications. Yet there are still going to be a huge amount of traditional applications simultaneously. We'll be able to put any of those application containers inside of our virtual host and have stop-start ordering or any coordination that needs to happen between the old type of applications and the new and/or just be able to manage them in the exact same way. It forces them to be highly available and extends now to a containerized application."
The company's containers, called "Vhosts," each have their own logical host name, associated IP addresses and portable native NTFS volumes. A Vhost's metadata governs container workload management, directing the managed app to launch and run locally, according to the company. Vhosts share a single Windows Server operating system instance and can be stacked on either virtual or physical servers. The result is a more consolidated way of managing application workloads and enabling instance portability, Boxley explained.
Unlike Docker, there are "no companion virtual machines running Linux, or anything like that at all," Boxley said. "It's just a native Windows application, you load it onto your server and you can start containerizing things right away. And again, because of that universality of our container technology, we don't care whether or not the server is physical, virtual or running in the cloud. As long as it's running Windows Server OS, you're good to go. You can containerize applications in Azure and in Rackspace and Amazon, and if the replication data pipe is right, you can move those workloads around transparently." At the same time, Boxley said it will work with Docker containers in the future.
Boxley said a customer can also transparently move workloads between any virtual machine platforms, including VMware, Hyper-V and Xen. "It really doesn't matter because we're moving the applications, not the machine or the OS," he said. Through its management console, DX Enterprise automates the handling of resource issues, including contention among containers. The management component also provides alerts and ensures applications are meeting SLAs.
Asked why it chose Windows Server for DX Enterprise, Boxley said he believes it will remain the dominant environment for virtual applications. "We don't think -- we know it's going to grow," he said. IDC analyst Al Gillen said that's partly true, though Linux servers will grow in physical environments. Though he hasn't tested DX Enterprise, Gillen said the demo looked promising. "For customers that have an application that they have to move and they don't have the ability to port it, this is actually a viable solution for them," Gillen said.
The solution is also a viable option for organizations looking to migrate applications from Windows Server 2003, which Microsoft will no longer support as of July 14, to a newer environment, Boxley said. The software is priced at $1,500 per server core (if running on a virtual machine it can be licensed via the underlying core), regardless of the number of CPUs. Support including patches costs $360 per core per year.
Boxley said the company is self-funded and started out as a Microsoft BizSpark partner.
Posted by Jeffrey Schwartz on 02/10/2015 at 12:15 PM
The White House late last week said it has named Tony Scott as its CIO, making him only the third person charged with overseeing the nation's overall IT infrastructure. Scott, who served as Microsoft's CIO, also held that role at Disney and VMware and was CTO at GM.
Scott's official title will be U.S. CIO and Administrator of OMB's Office of Electronic Government and Information Technology, succeeding Steve VanRoekel. The first CIO was Vivek Kundra, who launched the government's Cloud First initiative. The Obama Administration will task Scott with implementing its Smarter IT Delivery agenda outlined in the president's 2016 proposed budget.
I actually first met Scott in the late 1990s, when, as GM's CTO, he spoke at a Forrester conference about managing large vendors, which included Microsoft, Sun Microsystems, Oracle and numerous others, including some startups of the dot-com era. I caught up with him again more than a decade later at Microsoft's Worldwide Partner Conference in 2010 in Washington, D.C.
Among his key initiatives as Microsoft's CIO was enabling the internal use of new technologies the company had recently brought to market, including Azure. "I think we've done what Microsoft always has done traditionally, which is we try to dog-food our own stuff and get the bugs out and make sure the functionality is there," he said during an interview at WPC, though he qualified that by adding: "We'll move them or migrate them as the opportunity arises and as the business case makes sense."
Nevertheless, he was known as a proponent of cloud-enabling internal applications as quickly as possible. Scott's tenure at Microsoft ran from 2008 until 2013, and he has spent the past two years at VMware.
Posted by Jeffrey Schwartz on 02/09/2015 at 12:49 PM
The news that the next versions of Windows Server and System Center won't come until next year caught off guard many who were under the impression they would arrive later this year. With that announcement, plus the promise of a new version of SharePoint Server later this year, Microsoft brought its enterprise product roadmap into fuller view.
This latest information came at a fortuitous time for my colleague Gladys Rama, who was putting the finishing touches on the updated 2015 Microsoft Product Roadmap for sister publication Redmond Channel Partner. Check it out if you want to know planned release dates for anything noteworthy to an IT pro from Microsoft.
As for the delay of Windows Server v.Next, ordinarily it would seem par for the course. But after Microsoft released Windows Server 2012 R2 just a year after Windows Server 2012 and signaled a move toward a faster release cadence, it was more surprising. Whether by design or otherwise, the news removes a key decision point for IT pros who were considering waiting for the new Windows Server before migrating their Windows Server 2003-based systems.
As soon as Microsoft got the word out that the new Windows Server is on next year's calendar, it issued another reminder that Windows Server 2003's end of support is less than six months away. Takeshi Numoto, corporate VP for cloud and enterprise marketing, gave the latest nudge this week in a blog post once again warning of the risks of running the unsupported operating system after the July 14 deadline.
"Windows Server 2003 instances will, of course, continue to run after end of support," he noted. "However, running unsupported software carries significant security risks and may result in costly compliance violations. As you evaluate security risks, keep in mind that even a single unpatched server can be a point of vulnerability for your entire infrastructure."
Microsoft has urged customers to migrate to Windows Server 2012 R2 and, where customers feel it makes sense, consider a cloud service such as Office 365 to replace Exchange Server on-premises as well as Azure or other cloud infrastructure or platform services to run database applications, SharePoint and other applications.
Did the news that Windows Server v.Next is delayed have any impact on your Windows Server 2003 migration plans? Or was the prospect of it possibly coming later this year too close for comfort for your planning purposes?
Posted by Jeffrey Schwartz on 02/06/2015 at 4:05 PM
It's been three years since VMware last upgraded its flagship hypervisor platform, but the company yesterday took the wraps off vSphere 6, which it said offers at least double the performance of its predecessor, vSphere 5.5. VMware describes the latest release as the "foundation for the hybrid cloud," thanks to the release of its OpenStack distribution and upgrades to components of the suite that integrate virtualized software-defined storage and networking.
The new wares, set for release by the end of March, will offer a key option for enterprise IT decision makers to consider as they choose their next-generation virtual datacenter and hybrid cloud platforms. With the new wave of releases, VMware is advancing and integrating its new NSX software-defined networking technology. VMware, to some extent, is also challenging the business model of its corporate parent EMC by offering new storage virtualization capabilities with its new Virtual SAN 6 and vSphere Virtual Volumes, which will enable virtualization of third-party storage arrays.
The move comes as VMware seeks to maintain its dominant hold on large datacenter installations looking to move to hybrid and public clouds as giants such as Microsoft, Google, IBM and Amazon Web Services are looking to position their cloud platforms as worthy contenders. In what appeared to be a strategically timed move, Microsoft published its Cloud Platform roadmap, as reported, just hours before the VMware launch event.
With this release, it remains to be seen whether VMware can leverage its virtualized server stronghold, extending its network and storage virtualization to its public cloud, vCloud Air, as Microsoft tries to lure those customers away with its Cloud OS model consisting of Windows Server, Hyper-V and Microsoft Azure. Despite Microsoft's gains, VMware is still the provider to beat, especially in large enterprise installations.
"VMware's strength remains their virtualization installed base, and what they're doing through NSX is building that out into cloud environments," said Andrew Smith, an analyst at Technology Business Research. "VMware realizes that they need to gain ground, especially in private cloud deployments, so they're going to use NSX to tie into security along with vSphere, to really try and take the hybrid cloud space by storm. And I think with updates to vSphere and the integration with OpenStack it's all pieces of the puzzle coming together to make that a reality to customers."
OpenStack Cloud Support
The VMware Integrated OpenStack (VIO) distribution, the company's first OpenStack distro, provides API access to OpenStack-enabled public and private clouds and VMware vSphere infrastructure. "VIO is free to vSphere Enterprise Plus customers and comes as a single OVA file that can be installed in fewer than 15 minutes from the optional vSphere Web client. VIO support, which includes support for both OpenStack and the underlying VMware infrastructure, is charged on a per-CPU basis," said Tom Fenton, a senior validation engineer and lab analyst with IT analyst firm Taneja Group, in a commentary published on our sister site Virtualizationreview.com.
"VMware brings a lot to the OpenStack table with VIO," according to Fenton. "Many common OpenStack tasks are automated and can be performed from vCenter. vRealize Operations is able to monitor OpenStack, and LogInsight can parse OpenStack logs to separate the considerable amount of log noise from actionable items." The new NSX software-defined networking infrastructure will enable ties to OpenStack-compatible clouds, as well as VMware's own vCloud Air public cloud.
"The company now has offerings that address all layers of the Kusnetkzy Group virtualization model, including access, application, processing, storage and network virtualization, as well as both security and management for virtualized environments, sometimes called software-defined environments," wrote noted analyst Dan Kusnetsky, in his Dan's Take blog post.
Hypervisor Improvements with vSphere 6
Martin Yip, a VMware senior product manager, said in a company blog post announcing vSphere 6 that it has more than 650 new features, increased scale, performance and availability, storage efficiencies for virtual machines (VMs) and simplified datacenter management. "vSphere 6 is purpose-built for both scale-up and scale-out applications including newer cloud, mobile, social and big data applications," Yip noted.
Compared with the existing vSphere 5.5, vSphere 6 supports 64 hosts per cluster and double the VMs per cluster, 480 CPUs per host versus 320, triple the RAM at 12TB per host and quadruple the VMs per host at 2,048. It also supports double the number of virtual CPUs per VM at 128 and quadruple the virtual RAM per VM at 4TB, according to Yip.
The starting price for vSphere 6 is $995 per CPU. vSphere with Operations Management 6 starts at $1,745 per CPU and vCloud Suite 6 starts at $4,995 per CPU.
Posted by Jeffrey Schwartz on 02/03/2015 at 12:19 PM
Goverlan last week said it's giving away its GUI-based Windows Management Instrumentation (WMI) tool for remote desktop management and control. The company's WMI Explorer (WMIX) lets IT pros with limited scripting or programming skills perform agentless remote administration of Windows-based PCs.
The free tool -- an alternative to Microsoft's own command-line interface, WMIC -- leverages WMI, Microsoft's implementation of the WBEM and CIM management standards. WMI is built into all versions of Windows, allowing scripts to manage PCs and some servers remotely.
According to Goverlan, WMIX includes a WMI Query Wizard, which will appeal to administrators with limited scripting or coding skills because it lets them create sophisticated WMI Query Language (WQL) queries with a filtering mechanism that returns only matching results from the specified remote systems. The WMIX GUI can also automatically generate Visual Basic scripts from administrator-defined parameters, produce reports, create WMI-based Group Policy Objects (GPOs) and perform agentless system administration.
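Goverlan hasn't published what its wizard emits, but the queries themselves are plain WQL. As an illustrative sketch (the helper and the class/property names are my own example, not Goverlan's code), assembling a filtered query looks like this:

```python
def build_wql(wmi_class, properties=("*",), where=None):
    """Assemble a WQL (WMI Query Language) SELECT statement."""
    query = "SELECT %s FROM %s" % (", ".join(properties), wmi_class)
    if where:
        query += " WHERE " + where
    return query

# Fixed local disks (DriveType=3), returning just name and free space:
print(build_wql("Win32_LogicalDisk", ("Name", "FreeSpace"), "DriveType=3"))
# -> SELECT Name, FreeSpace FROM Win32_LogicalDisk WHERE DriveType=3
```

Run against a remote machine, a query like this returns only the matching instances, which is exactly the kind of filtering WMIX's wizard builds through its GUI.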
What's in it for the company to give away this free tool? Ezra Charm, Goverlan's vice president of marketing, noted that the company has never officially launched the tool nor has it significantly promoted it, yet it's popular among those who use it. "We are seeing a ton of interest," Charm said. Though many companies release free tools hoping to upsell customers to premium releases, Charm said the release of WMIX is primarily aimed at overall awareness for those who want to perform advanced WMI-based system administration functions and reports. Nevertheless, the effect would be the same.
WMIX is a component of the company's flagship systems administration tool, Goverlan Remote Administration Suite v8, which, depending on your environment, is an alternative or supplement to Microsoft's System Center Configuration Manager.
"At first, WMIX was implemented as an internal development tool to assist the integration of WMI Technology within the Goverlan Client Management Tool, but we quickly realized that this product would be of great services to all Windows System Administrators out there as it would allow anyone without advanced scripting knowledge to consume this powerful technology," said Goverlan CEO, Pascal Bergeot, in a statement announcing the release of the free tool
Goverlan requires contact information to activate the WMIX download.
Posted by Jeffrey Schwartz on 02/03/2015 at 12:20 PM
When Microsoft last month announced that it will offer Windows 10 as a free upgrade to Windows 7, Windows 8 and Windows 8.1 users, it said the deal doesn't apply to enterprise users. The company clarified that point late last week, saying the free upgrade is available to business users who have Windows Pro, but those wanting enterprise management capabilities should stick with, or move to, Software Assurance.
In a blog post outlining how Microsoft will release new Windows 10 features and fixes via its new Windows-as-a-service model, the company elaborated on the free offer. Simply put, it's available to anyone with Windows 7 Pro or Windows 8.x Pro. For those who have Windows 7 Enterprise or Windows 8.x Enterprise and require the same enterprise-grade capabilities, "Windows Software Assurance (SA) will continue to offer the best and most comprehensive benefits," wrote Jim Alkove, a leader on the Windows enterprise program management team.
While Microsoft could change what's offered in its as-yet-unnamed Pro and Enterprise editions, the latter will offer DirectAccess, BranchCache, AppLocker, support for VDI and the Windows To Go capability, according to a Microsoft description of the differences between the two SKUs. It's clear that Microsoft wants to get as many users as possible onto Windows 10; the free offer requires users to upgrade within a year of its release.
Posted by Jeffrey Schwartz on 02/02/2015 at 1:51 PM
Allaying concerns that Microsoft wasn't planning to develop any more on-premises versions of SharePoint, the company today said a new server release is scheduled for the second half of 2015. Microsoft's emphasis on SharePoint Online had many wondering at times whether the company was planning a new server release, although the company had indicated back in March that a new version was coming.
Despite its push to the cloud version, Microsoft has acknowledged and promoted hybrid architectures, which provide connectivity between on-premises servers and SharePoint Online, as the best way for organizations to transition. However, unlike the on-premises version, the Office 365-based SharePoint Online doesn't support the trusted code and apps developed for SharePoint Server.
"While we've seen growing demand for SharePoint Online, we recognize that our customers have a range of requirements that make maintaining existing SharePoint Server deployments the right decision for some," said Julia White, general manager for Microsoft's Office Products division in a blog post today. "We remain committed to meeting those needs. We're excited about the next on-premises version of SharePoint and we're sure you will be too. It has been designed, developed and tested with the Microsoft Software as a Service (SaaS) strategy at its core, drawing from SharePoint Online. With this, SharePoint Server 2016 will offer customers enhanced, flexible deployment options, improved reliability and new IT agility, enabled for massive scale."
The company didn't offer details on its plans for SharePoint Server 2016 other than to say more will come at its forthcoming Ignite conference in Chicago in the first week of May, although White suggested Microsoft would aim to extend the hybrid capabilities that the current on-premises and cloud versions offer.
"A hybrid SharePoint deployment can provide IT agility by creating a consistent platform spanning datacenters and cloud, simplifying IT and delivering apps and data to users on any device, anywhere," she said. "With SharePoint Server 2016, in addition to delivering rich on-premises capabilities, we're focused on robust hybrid enablement in order to bring more of the Office 365 experiences to our on-premises customers."
Microsoft is already jumpstarting that hybrid effort this year with the rollout of APIs and SDKs that aim to bridge the gap between the on-premises and cloud worlds, as noted in last month's Redmond magazine cover story. Sessions at Ignite will run the gamut of SharePoint and Office 365 technologies, including Delve, OneDrive, Project, Visio and Yammer.
Posted by Jeffrey Schwartz on 02/02/2015 at 7:33 AM
Microsoft today released versions of its widely used Outlook mail, calendaring and contacts app for iPhones and iPads, along with a preview version for Android devices. More than 80 million iPad and iPhone users have downloaded Office, according to Microsoft.
"We have received tremendous customer request for Outlook across all devices, so we are thrilled to fulfill this for our customers," said Julie White, general manager for the Office product management team. Microsoft was able to develop the new Outlook apps thanks to code developed by Acompli, which Microsoft acquired last month , she noted.
I didn't see it in the App Store earlier today, but a link Microsoft provided enabled an easy download of the Outlook app for the iPad. It looks a lot like the traditional Outlook client, though it is clearly stripped down. In the Settings folder (Figure 1) you can choose swipe options, pick browsers and create a signature for each or all accounts. However, unlike the Windows Outlook client, you can only create generic signatures.
The mail client has a search button and allows you to see your folders (Figure 2). It also provides access to OneDrive, Dropbox and Google Drive storage accounts (Figure 3). In the configuration setting, it provides access to your Exchange account, Outlook.com, OneDrive, iCloud, Gmail, Yahoo Mail, Dropbox and Box (Figure 4).
Unfortunately, if you want to connect to any other POP- or IMAP-based service, you're out of luck. Microsoft didn't indicate whether that will change, though White noted that "you will see us continue to rapidly update the Outlook app, delivering on the familiar Outlook experience our customers know and love." For my testing purposes, I have a Yahoo mail account that I use for less-critical purposes, which enabled me to test the new Outlook client.
Microsoft said the new Outlook app replaces Outlook Web Access for iOS and Android, though the company will keep those apps around for now because some advanced Exchange and Office 365 features still aren't available in the new Outlook app.
Microsoft also announced that Office for Android tablets is now generally available in the Google Play store, replacing the preview versions of Word, Excel and PowerPoint. It joins the already available OneNote for Android. The company said it will also support a native implementation running on Intel chipsets within the quarter.
Posted by Jeffrey Schwartz on 01/29/2015 at 11:26 AM
Microsoft is giving its Power BI analytics service an upgrade that adds connectivity sources and iOS support, and introduces a $9.99-per-month premium edition to be called Power BI Pro. The company will also offer a free version with limited functionality that will retain the Power BI name.
Power BI, a cloud-based business analytics service launched a year ago, was aimed at both technical and general business users. The browser-based software-as-a-service (SaaS) tool generates operational dashboards. As noted in our First Look at Power BI last year, this tool adds new functionality to existing Microsoft information management offerings, namely Excel 2013, Office 365 and SharePoint 2013.
Microsoft currently has three pricing tiers for Power BI, running as high as $52 per user per month when included with Office 365 Pro Plus, $40 for a standalone version and $33 when added on to an Office 365 E3/E4 subscription. Starting Feb. 1, Microsoft is offering one paid subscription at the substantially reduced $9.99 price, 75 percent less than the $40 standalone version. The company will offer the free version when the upgrade becomes generally available.
The free version is limited to 1GB of data per month per user, whereas the paid subscription will allow up to 10GB according to a comparison of the two options. Users of the paid version will also have access to 1 million rows per hour of streaming data compared to 10,000 rows for the free service. The paid Power BI Pro is required to use such features as access to live data sources, the data management gateway and various collaboration features including the ability to share refreshable team dashboards, create and publish customized content packs, use of Active Directory groups for sharing and managing access control and shared data queries through the data catalog.
With the new preview, users can sign in with any business e-mail address, initially only in the United States. Microsoft said it'll be available for those in other countries in the future. The new data sources supported include GitHub, Marketo, Microsoft Dynamics CRM, Salesforce, SendGrid and Zendesk, noted James Phillips, Microsoft's general manager for data experiences, in a blog post Tuesday. In the pipeline are Inkling Markets, Intuit, Microsoft Dynamics Marketing, Sage, Sumo Logic, Visual Studio Application Insights and Visual Studio Online, among others. "Power BI is 'hybrid' by design, so customers can leverage their on-premises data investments while getting all the benefits of our cloud-based analytics service," he said.
Microsoft is also offering a new tool called Power BI Designer, which lets business analysts connect to, model and analyze data, and easily publish the results to any other Power BI user, he added. The company also released a preview of Power BI for iPad, which can be downloaded from the Apple App Store. Phillips noted versions for iPhones, Android and Windows universal apps will be available later this year.
Posted by Jeffrey Schwartz on 01/28/2015 at 3:20 PM
One day after Microsoft delivered a disappointing quarterly earnings report, Apple on Tuesday did the complete opposite by posting its best quarter ever -- far exceeding expectations. In fact, Apple is said to have posted the most profitable quarter of any publicly traded company ever, buoyed by sales of 74.5 million iPhones between Oct. 1 and Dec. 27.
Apple recorded $74.6 billion in revenue with a record $18 billion profit (a 39.9% gross margin), topping the previous record set by ExxonMobil in the second quarter of 2012. The company's shares are soaring a day after Microsoft stock sank about 9% (making it a key contributor to a down day for the stock market on Monday).
It's not that Microsoft had a horrible quarter -- revenues of $26.5 billion were up 8% year over year -- but profits were down and the growth outlook for next quarter was considerably slower than Wall Street had anticipated. The results have led many to question whether Satya Nadella's honeymoon is over as he approaches his one-year anniversary as CEO. During his first year, Nadella made some major moves to turn Microsoft around, as Mary Jo Foley noted in her monthly Redmond magazine column posted this week.
In addition, Microsoft posted only a 5% increase in commercial licensing and an overall operating income decline of 2%. Impacting earnings were the restructuring of its Nokia business unit and the transition to cloud computing, the company said.
Microsoft also blamed the strong dollar and weakness in China and Japan, which could further impact the current quarter should the dollar continue to strengthen, CFO Amy Hood warned on the earnings call. Ironically, strong results in China were a key contributor to Apple's growth, while the company indicated only modest impact from the dollar's surge.
Among some other noteworthy improvements that Microsoft reported:
- 9.2 million Office 365 Home and Personal licenses were sold. This is up 30% since last quarter.
- $1.1 billion in Surface sales (a 24% increase).
- 10.5 million Lumia smartphones sold, up 30 percent. However, Windows Phone still accounts for just 3 percent of the smartphone market.
But despite those improvements, Apple outshined Microsoft in a number of ways. Apple said it sold its 1 billionth iOS device in November. Average selling prices of iPhones increased $50 as customers opted for more expensive devices equipped with more storage. Not that any major market shifts were expected, but its huge spike in iPhone sales aided by the larger form factor of the latest models continues to leave Windows Phone in the dust.
While Microsoft's Windows Pro and non-Pro OEM revenue each declined 13 percent, sales of Apple Macintoshes rose 14% to 5.5 million units. Apple App Store revenues also jumped 41 percent. One weak spot for Apple was the decline in iPad sales: 21.6 million units were sold, down from 26 million during the same period last year. The decline is not surprising given the release of the iPhone 6 Plus, which many might buy in place of an iPad Mini since the tablet is only slightly larger. Also, recent upgrades have added only modest new features, giving existing users little incentive to replace the iPads they now have.
Still, the iPad is becoming a formidable device in the enterprise, and Apple CEO Tim Cook said on Tuesday's call that the company's partnership with IBM has helped boost its use in the workplace. "I'm really excited about the apps that are coming out and how fast the partnership is getting up and running," he said.
Many are expecting Apple to increase its enterprise push with the release of a larger iPad. For its part, Microsoft is rumored to have a new Surface device coming later this year, though the company has yet to confirm its plans.
Perhaps Cook is having a better week than Nadella but Microsoft has many other fish to fry where Apple is not in its path. Nadella's turnaround for Microsoft is very much still in progress. If the honeymoon is over, as some pundits suggest, then Nadella's success will ride on his ability to keep the marriage strong.
Posted by Jeffrey Schwartz on 01/28/2015 at 12:32 PM
When Microsoft released the newest Windows 10 Technical Preview on Friday, testers saw some major new features the company is hoping to bring to its operating system designed to switch seamlessly between PC and tablet modes. Among the key new features are a new Start Menu and Cortana, the digital assistant that, until now, was only available on Windows Phone.
Along with those and other new features, the new Build 9926 takes a key step forward in showing the progress Microsoft is making to remove the split personality that epitomizes Windows 8.x. Microsoft is designing Windows 10 to launch desktop apps from the Windows Store interface and vice versa. Upon downloading the new build you'll want to look at the following:
Start Menu: One of the biggest mistakes Microsoft made when it rolled out Windows 8 was the removal of the popular Start Button. While the company brought some of its capabilities back with Windows 8.1, the new build of the Technical Preview introduces a new Start Menu that Windows 7 users who have avoided Windows 8.x should feel comfortable with. The Start Menu displays the apps you use most on the left side of your screen and lets you customize the rest of the page with tiles that can be sized however the user chooses and grouped based on preferences such as productivity tools and content. It can be viewed in desktop mode (Figure 1) or in the pure tablet interface (Figure 2).
Cortana: The digital voice assistant available to Windows Phone users is now part of the Windows PC and tablet environment, and it was the first thing I wanted to test upon downloading the new build. It wasn't able to answer many questions, though when asked who's going to win the Super Bowl, Cortana predicted the New England Patriots. We'll see how that plays out. Ask about the weather and Cortana gives a brief forecast. In other cases, certain questions initiated a Bing query that returned the search results in a browser view. Cortana is also designed to search your system, OneDrive and other sources based on your queries. Microsoft has warned that Cortana for Windows is still in early development, but it could emerge as a useful feature if it works as the company hopes. Like the Start Menu, Cortana works on the traditional desktop (Figure 3) or in the tablet mode (Figure 4).
Continuum: A key design goal of Windows 10 is letting users transition between desktop and touch-based tablet modes. In either environment, you should be able to access desktop or Windows Store apps. For example, if you have downloaded Google's Chrome browser as a desktop app, it will also appear as an app when in tablet mode. In either case, you're accessing the same browser, just from a different interface.
Farewell Charms: Microsoft introduced Charms with Windows 8 as a hip new way of configuring machines, but many found them cumbersome and confusing. In the new build, Charms are gone, replaced by a new Settings component (Figure 5). As the name implies, Settings offers an easy way to customize the display, connect peripherals and configure networks.
New Windows Store: Microsoft is preparing a new store that has a common design for PC, tablet and phone users as well as those accessing it via the Web. The new Windows Store beta (Figure 6) appears as a gray icon, though the existing Windows Store is still available in green.
File Explorer: Many users have complained that the File Explorer in Windows 8.x doesn't allow for a default folder. Now when opening the File Explorer in the new preview, it can be set to open to a default folder (Figure 7).
Because this is still an early beta, you'll find bugs, and features that appear here won't necessarily end up in the shipping version this fall. If you've looked at this build, please share your opinions on the latest Windows 10 Technical Preview.
Posted by Jeffrey Schwartz on 01/26/2015 at 3:40 PM
After putting its plans to go public on hold last year, Box's widely anticipated IPO got out of the starting gate Friday, with its shares up as much as 70 percent by midday. The company plans to use the estimated $180 million in proceeds to maintain operations and invest in capital infrastructure to grow its enterprise cloud offering.
Founder and CEO Aaron Levie launched the company in 2005 with the aim of offering an alternative to large premises-based enterprise content management systems. Over the years, Levie publicly put a target on SharePoint's back. His ambitions earlier this decade to establish Box as a SharePoint killer peaked before Office 365 and OneDrive for Business arrived. While Levie still has strong aspirations to make Box the primary storage and file-sharing service for businesses, the market is more challenging now that Office 365 with OneDrive for Business, Google Drive and others are widely entrenched within enterprises.
For its part, Box has always targeted large enterprises, boasting such customers as GE, Toyota, Eli Lilly, Boston Scientific, Procter & Gamble, Chevron, Schneider Electric and Stanford University. Speaking on the New York Stock Exchange trading floor with CNBC, Levie emphasized that the best opportunity for the company lies with targeting large enterprises and mid-size firms with 500 to 1,000 employees.
Amid enthusiasm for Box, there's also plenty of skepticism among analysts. The company incurs large customer acquisition and retention costs, which include "customer success managers" assigned to those with contracts for the life of the relationship, according to the company's S1 filing with the Securities and Exchange Commission (SEC). Moreover, Box is unprofitable with no target date for profitability in sight. According to the filing, Box recorded a $169 million loss for the fiscal year ended January 31, 2014 and has an accumulated deficit of $361 million.
Also in its filing, Box points to competitors including EMC, IBM, Microsoft (Office 365 and OneDrive), Citrix (ShareFile), Dropbox and Google (Google Drive). There are plenty of other rivals, both entrenched players and newcomers such as Acronis, Carbonite, ownCloud and CloudBerry Lab, among numerous others.
Now that Box believes it doesn't have to displace SharePoint and OneDrive for Business in order to succeed, the company last summer forged an agreement to collaborate with Microsoft. The collaboration pact ensured that Office 365 could use Box to store and synchronize files as an alternative to OneDrive, both of which now offer unlimited storage for paying customers. Microsoft and Box rival Dropbox forged a similar arrangement.
Box also offers APIs that let developers build applications using Box as the content layer, enabling an application to centrally and securely store its content within Box's service. Salesforce.com and NetSuite are among those that have used the API to tie their offerings together. In addition, Box last month added a new enterprise mobility management service, Box for Enterprise Mobility Management (EMM), which fits into the company's new Box Trust effort. That initiative consists of a string of partnerships with security vendors and providers of data loss protection management tools. Symantec, Splunk, Palo Alto Networks, Sumo Logic and OpenDNS join existing partners Skyhigh Networks, Hewlett Packard, Okta, MobileIron, CipherCloud, Recommind, Ping Identity, Netskope, OneLogin, Guidance Software and Code Green Networks.
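For readers curious what building on that content layer looks like in practice, here is a minimal, hypothetical Python sketch of how an application might assemble a file-upload request against Box's v2 content API, which accepts files as a multipart POST with a JSON "attributes" form field. The access token, folder ID and file name below are placeholders, not details from Box or the article:

```python
import json


def build_box_upload_request(access_token, folder_id, file_name):
    """Assemble the endpoint URL, auth header and 'attributes' JSON
    for a Box v2 multipart file upload (request construction only;
    nothing is sent over the network here)."""
    url = "https://upload.box.com/api/2.0/files/content"
    headers = {"Authorization": "Bearer " + access_token}
    # Box expects the file's name and destination folder as a JSON
    # string in the 'attributes' multipart form field.
    attributes = json.dumps({"name": file_name, "parent": {"id": folder_id}})
    return url, headers, attributes


# Example: prepare an upload of "report.pdf" to the root folder ("0").
url, headers, attributes = build_box_upload_request("TOKEN", "0", "report.pdf")
```

An application would then POST those pieces as multipart/form-data along with the file bytes; the point is simply that Box's service, rather than the application's own storage, becomes the system of record for the content.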
It remains to be seen if Box and its chief rival Dropbox can go it alone or if they'll become attractive takeover candidates. Of course that will depend on their ability to grow and ultimately turn a profit. Do you see Box or Dropbox becoming your organization's primary file store and sharing platform?
Posted by Jeffrey Schwartz on 01/23/2015 at 12:27 PM
One of the unexpected surprises at yesterday's Windows 10 prelaunch event and webcast came when Microsoft showed off slick-looking eyewear designed to bring holography to the mainstream. Whether Google got word of it days earlier when it pulled its own failed Google Glass experiment off the market is unknown. But the irony of the timing notwithstanding, Microsoft's new HoloLens appears to have more potential.
Microsoft Technical Fellow Alex Kipman, who works in Microsoft's operating systems group and is known as the "father of Kinect," made the surprise introduction of HoloLens at the Windows 10 event. While he didn't say when it would come out or how much it will cost, he positioned it as a product that's designed around Windows 10 and the suggestion is we'll see it sometime later this year.
"Holographic computing enabled by Windows 10 is here," said Kipman. "Every Windows 10 device has APIs focused on humans and environment understanding. Holographic APIs are enabled inside every Windows 10 build from the little screens to the big screens to no screens at all." It has a built-in CPU and GPU, but doesn't require external markers, cameras, wires or a computer connection, he added, and it will blend the physical and digital worlds.
HoloLens is designed to overlay holograms on the wearer's existing environment, creating a 3D-like visual experience. While it surely will enhance Microsoft's gaming and entertainment portfolio, including Xbox and Minecraft, the company also underscored practical uses for HoloLens. In a video, the company described how HoloLens can let workers share ideas, collaborate, teach and learn in a more visually immersive way.
Compared to Google Glass, HoloLens appears to have more practical use cases and may actually offer broader appeal. How broad will depend on price and how useful it ultimately proves. But Microsoft has the advantage of seeing where Google Glass fell short and potentially has a larger ecosystem behind it. Perhaps it's even the catalyst that can bring developers of modern apps for other platforms into the fold?
Either way, Google hasn't thrown in the towel in this segment and it could prove to be a burgeoning market alongside other wearable gadgets. Kipman said Microsoft has worked on HoloLens in its research labs for many years, suggesting the demo wasn't just vaporware. It's yet another way Microsoft could draw demand for Windows 10 if users find HoloLens appealing. That could be the case if the price is right and it works as advertised.
Posted by Jeffrey Schwartz on 01/22/2015 at 12:28 PM
It's great that Microsoft will let Windows 7 and Windows 8.x users upgrade their systems to the new Windows 10 for free when it comes out this fall. But before you cheer too loud, beware of the fine print: the deal doesn't apply to Windows Enterprise editions.
A Microsoft official told me earlier in the week that the company will hold an event in March emphasizing the enterprise features of Windows 10. Hopefully Microsoft will reveal then whether it will offer the free upgrade or some other incentive for enterprise users. In the fine print discovered by my colleague Kurt Mackie, Microsoft noted the exclusions, which also include the small number of Windows RT users.
"It is our intent that most of these devices will qualify, but some hardware/software requirements apply and feature availability may vary by device," according to the explanation. "Devices must be connected to the Internet and have Windows Update enabled. ISP fees may apply. Windows 7 SP1 and Windows 8.1 Update required. Some editions are excluded: Windows 7 Enterprise, Windows 8/8.1 Enterprise, and Windows RT/RT 8.1. Active Software Assurance customers in volume licensing have the benefit to upgrade to Windows 10 Enterprise outside of this offer. We will be sharing more information and additional offer terms in coming months."
In many instances, new system rollouts could negate this issue. Will a free upgrade make or break your organization's decision to move to Windows 10?
Posted by Jeffrey Schwartz on 01/22/2015 at 12:22 PM
Microsoft potentially removed a crucial barrier to the future of its Windows franchise by saying it will offer the next version -- Windows 10 -- as a free upgrade to existing Windows 7 and Windows 8.x users. The company is also adding some compelling new features that may make the upgrade worth the effort if these new capabilities live up to their promise.
Speaking at the anticipated launch event in Redmond today, Terry Myerson, Microsoft's executive vice president of operating systems, announced the free upgrade. The caveat is users must install Windows 10 within a year of its release, though it remains to be seen whether that deadline will hold. Perhaps for consumers, which today's event was aimed at, that won't be a big deal. But businesses and enterprises do things on their own clocks, based on need and compatibility.
In an earlier post, I thought it would be a wise move to offer the free upgrade, though I had no knowledge Microsoft would ultimately do so. As part of the release, Microsoft is also shifting to what it calls Windows as a service, where it will provide continuous upgrades.
"When it comes to Windows as a service, it's a pretty profound change," Microsoft CEO Nadella said at today's event. "For customers, they're going to get a continuous stream of innovation. Not only a continuous stream of innovation but also the assurance their Windows devices are secure and trusted. For developers, it creates the broadest opportunity to target. For our partners, hardware and silicon partners, they can coincident with our software innovation, drive hardware innovation. We want people to love Windows on a daily basis."
Microsoft gave a number of reasons to "love" Windows besides the free upgrade. The company announced the rumored Spartan Web browser, which has a rendering engine better suited for modern Web applications, Myerson said. Microsoft will also offer Cortana, the digital assistant released for Windows Phone last year, for Windows running on PCs and tablets.
Officials also demonstrated a set of universal apps, such as Word, Excel, PowerPoint, OneNote and Outlook, that are optimized for each form factor yet consistent across them, designed to let a user stop working on one device and quickly pick up where he or she left off on another. Also announced was an Xbox app for Windows that will allow Xbox users to run games on their PC or Windows tablet.
Microsoft also revealed its vision for augmented reality and took the wraps off HoloLens, which, ironically, is a Google Glass-like device. Microsoft said it has a built-in CPU, GPU and an array of sensors, and described it as the world's first holographic computer. Its APIs are designed to work with the new Windows 10 environment.
More hardware is in the works from third parties and Microsoft. The event showcased the new Surface Hub, a Windows 10-based 84-inch Ultra HD display with Skype for Business built-in, sensors, cameras and the ability to mark up content with any phone or device. The company will also offer a 55-inch version and indicated other Surface hardware is in the works.
The company will release a new Windows 10 technical preview next week with a Windows 10 build for phones scheduled for release in early February. Many of the new features Microsoft demonstrated today will work their way into builds of the technical preview over the next three to five months, said Joe Belfiore, a vice president in the operating system group. Microsoft also plans to reveal more features for enterprises in March, according to a company official. The company still plans for the commercial release of Windows 10 in the fall timeframe.
Posted by Jeffrey Schwartz on 01/21/2015 at 2:29 PM
Mike Culver, who served in a number of strategic roles at Amazon Web Services from the launch of the company's popular public cloud, lost his battle with pancreatic cancer this week. He was 63. Culver, who before joining AWS was a technical evangelist at Microsoft in the early days of the .NET Framework rollout, was deeply respected in Redmond and throughout the world.
In his roles at AWS, Culver trained developers across the emerging cloud industry in how to build and deploy apps on EC2 and scale them with the company's Simple Storage Service (S3). "Mike was well known within the AWS community," wrote AWS evangelist Jeff Barr, who had shared an office with Culver years back. "He joined my team in the spring of 2006 and went to work right away. Using his business training and experience as a starting point, he decided to make sure that his audiences understood that the cloud was as much about business value as it was about mere bits and bytes."
Culver spoke at many AWS and industry events including Visual Studio Live. I met Culver at Visual Studio Live in 2008, where he gave a session on how to scale ASP.NET applications with cloud-based content delivery. At the time Culver was head of developer relations at AWS. Keep in mind, this was before Microsoft officially announced Azure and AWS S3 was brand new. I was quite impressed by his presentation and sat down with him. Though that was the only time we met, we became friends on Facebook and occasionally commented on one another's posts. I'm quite saddened by his loss, both for him and for his wife, grown children, siblings and many colleagues who clearly had deep admiration and respect for him.
When he was diagnosed with pancreatic cancer in 2013, Culver was quite candid about his treatment and kept an upbeat yet realistic outlook on his battle. Pancreatic cancer is among the deadliest of cancers; I lost my father to it nearly a decade ago. A few weeks ago, Culver was accepted into a trial of a new therapy to fight the disease, though in the end, the disease was too far advanced. Culver entered hospice last week. RIP Michael.
Posted by Jeffrey Schwartz on 01/21/2015 at 2:30 PM
However you feel about the emerging wearables market, many rightfully have found the notion of Google Glass over the top. Given its obvious potential to distract one's attention, it should be illegal to wear them on the streets and certainly when driving.
Google's announcement yesterday that it will end the Google Glass experiment on Jan. 19 was inevitable since all experiments come to an end. On the other hand, Google has a history of labeling new products or services either tests or beta for an extended amount of time -- remember when Gmail was a beta product for more than five years despite the fact that millions were using it?
Certainly millions weren't using Google Glass, and given its $1,500 price tag, it's also not surprising that Jan. 19 is the last day Google will offer it. The company's announcement yesterday that it is moving Google Glass from the Google X research labs, where the effort is headed by Glass chief Ivy Ross, into the Nest unit run by Tony Fadell makes sense.
Nest, which Google acquired last year for $3.2 billion, manufactures and sells network-enabled smart thermostats. A few months after that deal, Nest acquired Dropcam, a provider of cameras that, like its thermostats, have built-in Wi-Fi connectivity, for $555 million.
Some reports are cheering the demise of Google Glass, though the company seems to have future plans for it. Hopefully the Nest division will focus Google Glass on practical uses: vertical and specialty functions that can give medical practitioners and all kinds of field workers a tool to do useful things they currently can't.
Posted by Jeffrey Schwartz on 01/16/2015 at 12:32 PM
It appears President Obama's forthcoming legislative proposal to crack down on cybercrime could impose additional liabilities on IT pros, with potential penalties for failing to put in place proper policies, auditing practices and breach reporting.
In a speech Tuesday, the President outlined plans to propose new legislation aimed at stiffening the penalties for all forms of cybercrime that put the nation's critical information infrastructure, as well as individual privacy, at risk. Obama will emphasize the legislative proposal to Congress in his annual State of the Union address.
"We want to be able to better prosecute those who are involved in cyberattacks, those who are involved in the sale of cyber weapons like botnets and spyware," Obama said in Tuesday's speech. "We want to be sure we can prosecute insiders who steal corporate secrets or individuals' private information. We want to expand the authority of courts to shut down botnets and other malware. The bottom line: we want cyber criminals to feel the full force of American justice because they are doing as much if not more these days as folks who are involved in conventional crime."
The White House also announced it will host a cybersecurity and consumer protection summit at Stanford University on Feb. 13, which will include speeches, panel discussions and a number of topic-specific workshops. Stanford said it is still finalizing details of the summit.
In addition to calling for better information sharing, the legislation will require compliance with "certain privacy restrictions such as removing unnecessary personal information and taking measures to protect personal information that must be shared in order to qualify for liability protection." According to an outline on the White House Web site, the President will also propose giving law enforcement the tools they need to "investigate, disrupt and prosecute cybercrime."
The administration has also revised an existing proposal pertaining to security breach reporting "by simplifying and standardizing the existing patchwork of 46 state laws (plus the District of Columbia and several territories) that contain these requirements into one federal statute, and putting in place a single clear and timely notice requirement to ensure that companies notify their employees and customers about security breaches."
Over the next five years, the Department of Energy will also provide $25 million in grants to fund the training of cybersecurity professionals. The move, of course, comes amidst growing concerns about high-profile breaches over the past year including Target, Home Depot and most recently Sony, among others.
Yet the President is sure to face a battle, especially as it relates to information sharing, where the IT industry is fighting to ensure customer privacy and civil rights. For its part, Microsoft has led that fight in its battle to protect data residing on servers in Dublin, despite last year's court order mandating the release of that information. The Electronic Frontier Foundation, the non-profit organization focused on protecting civil liberties, swiftly denounced the President's proposal.
"President Obama's cybersecurity legislative proposal recycles old ideas that should remain where they've been since May 2011: on the shelf," according to a statement it released following Obama's proposal. "Introducing information sharing proposals with broad liability protections, increasing penalties under the already draconian Computer Fraud and Abuse Act, and potentially decreasing the protections granted to consumers under state data breach law are both unnecessary and unwelcome."
But the White House isn't alone in its effort to crack down on cybercrime. New York State Attorney General Eric Schneiderman yesterday said he plans to propose legislation that would require companies to inform customers and employees following any type of cyberattack or breach. The legislation would also broaden the scope of data companies would be required to protect, impose tighter technical and physical security protection and offer a safe harbor for organizations meeting certain standards, according to a statement released by the AG's office. "With some of the largest-ever data breaches occurring in just the last year, it's long past time we updated our data security laws and expanded protections for consumers," Schneiderman said.
While it's good that cybercriminals will face harsher penalties for their crimes -- and they should -- it's not likely to thwart those determined to inflict the most harm. Still, no one wants to be the next Target or Sony. As the content of this new legislation is debated, it also puts enterprises on notice that they will need to take measures to protect their critical data -- for their benefit and for everyone else's.
Posted by Jeffrey Schwartz on 01/15/2015 at 9:48 AM
Facebook apparently does intend to enter the enterprise social networking market with its own offering targeted at business users. The new Facebook at Work will let Facebook users establish work accounts that are separate from their personal accounts.
Rumors that Facebook was developing a business network first came to light in November. News that the Facebook at Work pilot would launch today surfaced this morning in a report by Recode. A Facebook spokeswoman confirmed that the company has launched the pilot with some undisclosed participants testing the new service.
"We're not disclosing the handful of companies in the pilot, since it's early and we're still testing," the spokeswoman said in response to an e-mail inquiry. I was able to download the new Facebook at Work app on my iPhone but when searching for it with my iPad and Windows 8.1 PC, the app didn't appear in their respective app stores (as of Wednesday afternoon). The Facebook at Work FAQ indicated that it's also available in the Google Play app store.
"With a Facebook at Work account, you can use Facebook tools to interact with coworkers," according to the FAQ. "Things you share using your work account will only be visible to other people at your company. To set up an account, your company must be using Facebook at Work." The current app allows users to request more information on how employers can establish accounts.
What effect a full-blown launch of Facebook at Work might have on incumbent enterprise social network providers such as Microsoft's Yammer, Salesforce.com's Chatter and Jive remains to be seen. But as SharePoint and Yammer expert Christian Buckley of GTConsult said back in November, "they will undoubtedly attract users, and have a number of high-profile deployments, but there is a very real line of demarcation between consumer and business platforms, and I just don't see Facebook as being able to close that gap in any serious way."
Do you see your organization using Facebook at Work?
Posted by Jeffrey Schwartz on 01/14/2015 at 12:44 PM
Microsoft is making a big splash at this year's annual National Retail Federation (NRF) show in New York. The company is showcasing a number of major brand name chains that have kicked off efforts to improve their in-store experiences by using Azure, predictive analytics and new ways of interacting using apps delivered on mobile devices and kiosks.
While Microsoft emphasized at last year's NRF show that many of its customers were rolling out mobile devices for their employees, the apps that various retailers and restaurant chains are rolling out this year rely on Microsoft Azure in a big way. A number of big chains, including GameStop and McDonald's, are using applications built on Azure Machine Learning, Microsoft's predictive analytics tool rolled out last year.
Usage of Azure by retailers has grown exponentially in the past year, Tracy Issel, general manager of Microsoft's retail sector, said in an interview. "It used to be [that] we talked with people about going to the cloud and the perceived risk and their concern about scalability," Issel said. "I haven't had one of those conversations in a long time. Now it's 'what do I move first and when do I do it?' Many are moving to Office 365 and Azure simultaneously."
In a roundtable discussion today, Issel introduced four customers that are in the midst of major new efforts using Azure and/or Windows 8.1-based tablets and kiosks. Here's a brief synopsis of their efforts:
GameStop: Jeff Donaldson, senior vice president of GameStop Technology Institute, outlined a number of initiatives that use the retailer's mobile app to help store employees engage with customers when they visit. The app uses a variety of analytics tools, including Hewlett Packard Vertica, SAS for statistical analysis and Azure Machine Learning, to inform a sales rep of past interactions when a customer comes into a store. "When they come into the store, we want to make sure the employees know about the messages we sent to customers so they better understand the intent of the visit," Donaldson said. Another major effort calls for delivering Ultra HD content into each of its 6,400 stores using Azure Media Services.
CKE Restaurant Holdings, aka Carl's Jr. and Hardee's: The popular fast-food chain has concluded that millennials would much rather interact with a kiosk than a person to order their food, said Thomas Lindblom, senior vice president and chief technology officer. As such, Hardee's is rolling out kiosks that let customers choose and customize their burgers, and that are designed to upsell. Lindblom is using off-the-shelf Dell 24-inch touch-based PCs to deliver the highly visual application. Lindblom said CKE is "a significant user of Azure" for a number of functions, including storage and disaster recovery. CKE has also rolled out Office 365.
TGI Fridays: Looking to modernize the dining experience, the chain is equipping waiters and waitresses with eight-inch off-the-shelf Windows tablets running apps developed by Micros (now part of Oracle). The point-of-sale solution is designed to track customer preferences through loyalty cards. "We are cutting training times [and] we are able to deliver this digital touch point in the hands of our servers as they serve their guests," said CIO Tripp Sessions.
McDonald's: Microsoft partner VMob has rolled out an Azure-based application at McDonald's locations in Europe that enables the chain to track customer preferences using information gathered from their purchasing patterns. VMob has also started rolling it out at locations in Japan. VMob founder and CEO Scott Bradley, who was demonstrating the solution in Microsoft's booth, indicated he's still working on getting the app into United States locations, but he implied that may take some time and said it's not a done deal. Nevertheless, he said he believes McDonald's eventually will roll it out in the U.S.
Posted by Jeffrey Schwartz on 01/13/2015 at 1:48 PM
At last week's Consumer Electronics Show in Las Vegas, there were robots, smartwatches, driverless cars, ultra-high-definition TVs and home automation systems. Even the traditional PC desktop display got a facelift.
Hewlett Packard was among a number of suppliers showcasing new curved desktop displays, designed to provide a more "immersive" experience, as Ann Lai, director of commercial displays at HP, put it in a briefing prior to the show.
"With a curve desktop display, you're really sitting right in front of it, so the curve is wrapping around you and providing you a very immersive experience that also makes it easier to read and more comfortable to use for longer periods of time," Lai said . "As displays have gotten larger, we've noticed the edges can be harder to read, especially when you're sitting close to it. With a curved display, you're going to have much more comfortable peripheral viewing as well as a much more immersive experience as you're using your display."
I'm reserving judgment as I've never tried one, though my first reaction was that these would have more cosmetic appeal for an executive's office than value in making workers more productive. HP's new Pavilion 27 CM Elite display and Elite display S273 are both 27-inch curved displays priced at $399. The price is a slight premium over displays without a curve.
If you were looking for a new desktop display, would one with a curve be on your checklist?
Posted by Jeffrey Schwartz on 01/12/2015 at 3:32 PM
When Microsoft released Windows Server 2012 R2 back in the fall of 2013, one of the many features we pointed out at the time was "Workplace Join," which is designed to let organizations give single sign-on capability to their bring your own device (BYOD) employees -- or for anything not designed to join an Active Directory domain. Simply put, it lets you register a non-domain-based device running Windows 8.1, iOS and Android to Active Directory.
Microsoft was especially happy to tout Workplace Join when it launched its Windows RT-based Surface 2 back in September 2013. In an interview at the time, Surface Director of Marketing Cyril Belikoff talked up the Workplace Join capability with me. "Workplace Joins are the access components of a directory service that allows a user to use their ID and password to access their corporate network documents and shares in a secure way," Belikoff said. "It's not a fully domained device but you get the administration of mobile device management and get the access component." Last year, Microsoft added support for Windows 7-based systems as well.
Workplace Join does require Active Directory Federation Services, Active Directory on premises and the Device Registration Service, all part of the "Federation Services Role on Windows Server 2012 R2," as described in the TechNet library.
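The core idea behind Workplace Join, registering a non-domain device with a directory so that only registered devices get single sign-on access to corporate resources, can be illustrated with a toy model. The Python sketch below is purely hypothetical: the function names, the set-based "directory" and the access check are all invented for illustration and do not reflect actual Active Directory, ADFS or Device Registration Service APIs.

```python
# Toy model of the idea behind Workplace Join: a non-domain (BYOD)
# device is registered with a directory service, and only registered
# devices pass an SSO-style access check.
# Purely illustrative; not the Active Directory or ADFS API.

registered_devices = set()

def workplace_join(device_id, user, directory):
    """Register a BYOD device on behalf of a known directory user."""
    if user not in directory:
        raise PermissionError(f"{user} is not a directory user")
    registered_devices.add(device_id)

def can_access(device_id, user, directory):
    """Access requires both a known user and a registered device."""
    return user in directory and device_id in registered_devices

directory = {"alice", "bob"}  # toy stand-in for Active Directory
workplace_join("alices-ipad", "alice", directory)

print(can_access("alices-ipad", "alice", directory))   # joined device
print(can_access("bobs-android", "bob", directory))    # never joined
```

The point of the pattern is that the user's credentials alone aren't enough; the device itself must have been enrolled, which is what gives IT a handle for mobile device management.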
I've talked to a variety of mobile device management vendors and suppliers of Active Directory management and auditing tools and I've heard various views. Some say customers (or prospective ones) are questioning how these tools will support Workplace Join and others recommend using the device enrollment features in their wares.
If Microsoft boosts the functionality of Workplace Join in the forthcoming Windows 10 operating system, that could help build its popularity.
Please share your experience or views on Workplace Join. Is it suitable for your BYOD authentication requirements? Drop me a line at firstname.lastname@example.org.
Posted by Jeffrey Schwartz on 01/08/2015 at 2:31 PM
IT pros apparently plan to give Windows 10 a warmer welcome than they gave Windows 8 when it arrived, according to an online survey of Redmond magazine readers conducted during December and early this month. A respectable 41 percent said they plan to deploy PCs with Windows 10 within a year after it ships and 30 percent will go with Windows 8.1, the survey shows.
To put that in perspective, nearly two years ago only 18 percent of Redmond magazine readers said they planned to deploy Windows 8, in a survey fielded six months after that operating system shipped. The 41 percent now saying they will deploy Windows 10 is all the more respectable given that Windows 10 ranked second when respondents were asked which of Microsoft's "newer" offerings they plan to deploy this year. Topping the list, to no surprise, was Office 365, which nearly 46 percent say they plan to deploy this year. Many respondents also plan to deploy Lync, Azure IaaS and Azure Active Directory this year.
To be sure, in a different question where Windows 10 was not an option (since it's still in Technical Preview), respondents overwhelmingly see Windows 7 as their client platform of choice to replace aging PCs, though not as many as in previous years, which is to be expected. Here's the breakdown of PC replacement plans:
- Windows 7: 57%
- Windows 8.1: 30%
- Virtual desktop/thin client: 6%
- Whatever employee requests: 1%
- Linux: 1%
It stands to reason that those that have started to mix Windows 8.1 devices into their shops will continue doing so. Anecdotally, though, I'm hearing many are awaiting the arrival of Windows 10 before making any firm commitments.
While it's hard to predict how IT decision makers ultimately will choose to replace aging desktops and portable PCs, it appears both Windows 8.1 and Windows 10 for now are the likely favorites among this audience in the coming year.
Drop me a line if you want to share your top IT priorities for 2015. I'm at email@example.com.
Posted by Jeffrey Schwartz on 01/07/2015 at 1:02 PM
As the annual Consumer Electronics Show kicks off today in Las Vegas, you can expect to hear lots of buzz about driverless cars, home automation systems, the so-called "Internet of Things" and of course wearable computing devices (including smartwatches and fitness bands).
Having spent most of December using the new Microsoft Band, as I reported last month, it has some nice features but it's still buggy and, in my opinion, not worth the steep $199 price tag. When I returned my Microsoft Band, the clerk asked why. I mentioned the buggy Bluetooth synchronization with iOS, which she admitted is a common problem with the Microsoft Band. It was also a problem CNBC On-Air Editor Jon Fortt emphasized while interviewing Matt Barlow, Microsoft's general manager of new devices.
"I'm sorry to hear about the challenges you're running into, but a bunch of other people using those devices are having a great time being fit with Microsoft Band," Barlow responded. Not letting Barlow off the hook, Fortt told Barlow that Microsoft has already acknowledged the Bluetooth connectivity issues with iOS. "With any types of new product rollout, you're going to have updates that need to occur with software," Barlow responded. "We're updating software and we're updating usability, so I'm definitely convinced that [we're] seeing people using the Microsoft Band with all phone types without any issues moving forward."
Barlow went on to tout the unique 24-hour heart-tracking capability of the Microsoft Band, along with its on-board GPS, guided workouts and e-mail, text and Facebook integration. "People are really looking for value, and when I think about what we have with the Microsoft Band ... at a $199 price point, [it] is certainly magical," he argued.
Clearly these are the early days for wearable devices, and it remains to be seen if they will take off. For its part, Microsoft has only offered its band through its retail stores, further limiting its presence. One could argue that many are waiting for the Apple Watch, due out this quarter, but at a starting price of $349, it's not likely to fly off the shelves either. Results of a survey by Piper Jaffray confirmed that.
Not that its earlier surveys showed pent-up demand either, but now only 7 percent of 968 iPhone users surveyed said they intend to purchase an Apple Watch, down from 8 percent back in September when it was introduced and 10 percent in September 2013.
"We believe that the muted response to the Watch is due to consumer questions including what is the killer feature of the watch?," wrote Gene Munster, senior analyst and known Apple bull, in a Dec. 21 research note. People also want to know, "what applications will be available for the watch? We believe that as we get closer to launch [this] year, Apple will answer many of these questions and demand will increase; however, we still believe expectations for the first year of the watch should remain conservative."
Do you use a wearable such as the Apple Watch, the Microsoft Band or one of the plethora of similar devices from other players?
Posted by Jeffrey Schwartz on 01/05/2015 at 12:25 PM
Microsoft has taken a beating from critics over this month's security patch guidance, which initially suggested that Windows 10 Technical Preview testers might need to uninstall Office before the company came up with a less invasive workaround. Despite that and numerous other frustrations with the Windows 10 Technical Preview, Microsoft reported 1.5 million testers have their hands on a preview version of Windows 10, and nearly a third of them are "highly active." The company also claims that more people are testing it than any beta release of Windows to date.
Apologizing for not having a major new build this month, Gabriel Aul, a data and fundamentals team lead at Microsoft's Operating Systems Group, promised it would be worth the wait. "We're really focused on making the next build something that we hope you'll think is awesome," Aul wrote in a Windows Insider blog post Wednesday. "In fact, just so that we have a *daily* reminder to ourselves that we want this build to be great, we even named our build branch FBL_AWESOME. Yeah, it's a bit corny, but trust me that every Dev that checks in their code and sees that branch name gets an immediate reminder of our goal."
Microsoft recently said it will reveal what's in store in the next build on January 21 and Aul indicated it would have some substantive new features. Though Microsoft didn't release a major new build in December, Aul pointed out the company has issued numerous bug fixes as a result of feedback from the 450,000 active testers. Given the poor reception for Windows 8.x, the record number of testers of the Windows 10 Technical Preview is an encouraging sign.
While the large participation doesn't guarantee Windows 10 will be a hit, a sparse number of testers would obviously lower the odds of success. "That hardcore usage will help us fix all the rough edges and bugs," Aul said, noting his favorite was a "very rare" instance when the OneDrive icon in File Explorer could be replaced by an Outlook icon. So far, he noted, testers have helped discover 1,300 bugs. Aul said while most are minor bugs, Microsoft will implement UX changes based on the feedback as well. Many will be small changes and others will be major.
What's on your wish list for the next build and where does the final release of Windows 10 fit in your plans for 2015?
Posted by Jeffrey Schwartz on 12/19/2014 at 10:54 AM
It's been a tough few months for IBM. The company has seen its shares tumble in 2014 amid weak earnings. But looking to show it may be down but not out, Big Blue said it has picked up the pace of building out its cloud footprint after getting off to a slow start several years ago.
To close the year, the company yesterday said it has added 12 new datacenters to the IBM Cloud: facilities in Frankfurt, Mexico City and Tokyo, plus nine other locations hosted through colocation provider Equinix in Australia, France, Japan, Singapore, the Netherlands and the United States.
The IBM Cloud now has datacenters in 48 locations around the world. That's double the number it had last year thanks to its promise to invest $1.2 billion to expand its cloud network, which kicked into high gear with last year's acquisition of SoftLayer. On top of that, IBM invested $1 billion to build its BlueMix platform-as-a-service technology, designed for Web developers to build hybrid cloud applications.
Enabling that expansion, IBM made aggressive moves including its deal with Equinix to connect to its datacenters using the large colocation provider's Cloud Exchange network infrastructure. IBM also inked partnerships with AT&T, SAP and Microsoft. With its recently announced Microsoft partnership, applications designed for Azure will work on the IBM Cloud (and vice versa). As IBM and Microsoft compete with each other, Amazon and numerous other players, the partnership with Microsoft promises to benefit both companies and their customers.
As a result of its aggressive expansion this year, IBM says it, Microsoft and Amazon have the largest enterprise cloud infrastructure and platform offerings. Nevertheless, few would dispute Amazon remains the largest cloud provider.
In its announcement yesterday, IBM also said it recently inked more than $4 billion in long-term enterprise cloud deals with Lufthansa, WPP, Thomson Reuters and ABN Amro, and that its customer base has doubled in the past year to more than 20,000. While many are traditional Big Blue shops, IBM says thousands are new companies, including startups. IBM said its cloud revenues are growing at a 50 percent rate and on pace for $7 billion in 2015.
Posted by Jeffrey Schwartz on 12/18/2014 at 10:53 AM
Like many cloud service providers, Microsoft has identified disaster recovery as a key driver for its hybrid infrastructure-as-a-service (IaaS) offering. Microsoft this year delivered a critical component of its disaster recovery-as-a-service (DRaaS) offering with Azure Site Recovery.
If you saw Brien Posey's First Look at Azure Site Recovery, you may have quickly lost interest if you're not a Microsoft System Center user. That's because Azure Site Recovery required System Center Virtual Machine Manager. But with last week's Microsoft Azure release upgrade, the company lifted the SCVMM limitation.
The new Azure Site Recovery release allows customers to replicate and recover virtual machines using Microsoft Azure without SCVMM. "If you're protecting fewer VMs or using other management tools, you now have the option of protecting your Hyper-V VMs in Azure without using System Center Virtual Machine Manager," wrote Vibhor Kapoor, director of marketing for Microsoft Azure, in a blog post outlining the company's cloud service upgrades.
Making Azure Site Recovery available without SCVMM brings DRaaS to branch offices and smaller organizations that can't afford Microsoft's systems management platform or simply prefer other tools, explained Scott Guthrie, executive vice president of Microsoft's enterprise and cloud business, in a blog post. "Today's new support enables consistent replication, protection and recovery of Virtual Machines directly in Microsoft Azure. With this new support we have extended the Azure Site Recovery service to become a simple, reliable and cost effective DR Solution for enabling Virtual Machine replication and recovery between Windows Server 2012 R2 and Microsoft Azure without having to deploy a System Center Virtual Machine Manager on your primary site."
Guthrie pointed out that Azure Site Recovery builds upon Microsoft's Hyper-V Replica technology built into Windows Server 2012 R2 and Microsoft Azure "to provide remote health monitoring, no-impact recovery plan testing and single click orchestrated recovery -- all of this backed by an SLA that is enterprise-grade." Since organizations may have different uses for Azure Site Recovery, Guthrie underscored the One-Click Orchestration using Recovery Plans option, which provides various Recovery Time Objectives (RTOs) depending on the use case. For example, test and planned failovers typically require different RTOs than unplanned disaster recovery failovers.
In addition to Hyper-V Replica in Windows Server 2012 R2, Azure Site Recovery can use Microsoft's SQL Server AlwaysOn feature. Azure Site Recovery also integrates with SAN replication infrastructure from NetApp, Hewlett Packard and EMC. And according to a comment by Microsoft's Roan Daley in our First Look, Azure Site Recovery protects VMware workloads across VMware hosts using its new InMage option. Acquired back in July, InMage Scout is an on-premises appliance that offers continuous, real-time data capture, simultaneously performing local backups or remote replication via a single data stream. Microsoft is licensing Azure Site Recovery with the Scout technology on a per-virtual or per-physical instance basis.
Are you using Microsoft's Azure Site Recovery, planning to do so, or looking at the various third-party alternatives as cloud-based DRaaS becomes a more viable data protection option?
Posted by Jeffrey Schwartz on 12/17/2014 at 12:08 PM
Salesforce.com today launched a connector that aims to bridge its cloud-based CRM portfolio of services with enterprise file repositories. The new Salesforce Files Connect will let organizations centralize their customer relationship management content with file stores including SharePoint and OneDrive, with a connector to Google Drive coming in a few months.
The release of the connector to SharePoint and OneDrive was promised back in late May when both Salesforce.com and Microsoft announced a partnership to integrate their respective offerings. While the two companies have a longstanding rivalry, they also share significant overlapping customer bases. The companies at the time said they would enable OneDrive for Business and SharePoint Online as integrated storage options for the Salesforce platform.
In today's announcement, Salesforce claims it's the first to create a repository that natively integrates CRM content and files among popular enterprise file stores. Salesforce.com said it provides a simple method of browsing, searching and sharing files located in various repositories.
Salesforce.com described two simple use cases. One would enable a sales rep to attach a presentation stored on OneDrive for Business to a sales lead in the Salesforce CRM app. The other would allow a service representative to pull FAQ content from OneDrive for Business while working in the Salesforce Service Cloud app.
The connector supports federated search to query repositories simultaneously from any device and lets users attach files to social feeds, groups or records, enabling them to find contextually relevant information in discussions running in Salesforce Chatter. The tool is also designed to enforce existing file permissions.
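As a rough illustration of the fan-out pattern behind federated search, the following Python sketch dispatches one query to several independent repositories and merges the hits into a single list. Everything here (the connector functions, repository names and result fields) is hypothetical and is not the Salesforce Files Connect API.

```python
# Toy illustration of federated search: one query fans out to several
# independent file repositories and the hits are merged into a single,
# unified result list. Names and fields are invented for illustration.

def search_repository(repo_name, files, query):
    """Return matching files from a single repository."""
    q = query.lower()
    return [{"repo": repo_name, "name": f} for f in files if q in f.lower()]

def federated_search(repositories, query):
    """Fan the query out to every repository and merge the hits."""
    results = []
    for repo_name, files in repositories.items():
        results.extend(search_repository(repo_name, files, query))
    # Sort merged results by file name for a stable, unified view.
    return sorted(results, key=lambda hit: hit["name"])

repos = {
    "SharePoint": ["Q4-pricing.pptx", "onboarding-faq.docx"],
    "OneDrive": ["pricing-notes.txt", "travel-policy.pdf"],
}
for hit in federated_search(repos, "pricing"):
    print(f'{hit["repo"]}/{hit["name"]}')
```

A production connector would of course also pass the user's identity to each backend so that existing file permissions are enforced on every hit, which is the harder part of the problem.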
For customers and third-party software providers wanting to embed file sharing into their applications, Salesforce.com also is offering the Salesforce Files Connect API.
Posted by Jeffrey Schwartz on 12/17/2014 at 12:26 PM
Microsoft last month entered the wearables market with the Microsoft Band, which, paired with the new Microsoft Health Web site and app for the wrist band, is designed to track your physical activities and bring some productivity features to your wrist.
The Microsoft Band, in my opinion, does a lot of interesting things, though it doesn't really excel at any of them at this point. Among the productivity features included are alerts that let you glance at the first sentence or two of a message, texts, Facebook posts, Facebook Messenger, phone calls, voicemails, schedules, stock prices and an alarm clock that vibrates gradually for deep sleepers who don't like to jump out of bed.
Then there's the health component that has a pedometer to track your steps, a monitor for runners as well as one to track general workouts. It also tracks your sleep including your average heart rate and how often you supposedly woke up. You can synchronize whatever physical activity it monitors with Microsoft Health, an app that runs on any iOS, Android and Windows Phone device. If you use it with Windows Phone, you get the added benefit of using Microsoft's Cortana, the digital assistant that responds to spoken commands. You can also look at reports on the Microsoft Health Web site.
My personal favorite: the Starbucks app, which presents a scannable image of your account for the barista to scan when you make a purchase. Most baristas seeing it for the first time responded with awe, with one saying that "this is the wave of the future."
Though I've been skeptical about wearables like this and others like it from Fitbit, Samsung, Garmin, Nike, Sony and dozens of other providers, it's clearly a growing market and it may very well be the wave of the future -- or at least a wave of the future. Market researcher Statista forecasts that the market for these wearables will be close to $5.2 billion this year, more than double last year's total. In 2015, sales of these gadgets will hit $7.1 billion, and by 2018, $12.6 billion.
Gartner last month reported that while smart wristbands are poised for growth, its latest survey shows at least half of respondents are considering smartwatches. It actually sees the smartwatch market growing from 18 million units this year to 21 million in 2015, while purchases of wristbands will drop from 20 million to 17 million. Certainly the release of the Apple Watch, despite its hefty starting price of $349, will likely fuel that market, though I already questioned how much demand we'll see for it.
I haven't tested other devices so it's hard to say how the Microsoft Band rates compared to them. But I find the notion of having information on my wrist more compelling than I had expected. However, the performance of my Microsoft Band is flaky. I've encountered synchronization problems that have required me to uninstall and reinstall the Microsoft Health app on my iPhone on a number of occasions. It has presented realistic heart rates while I'm at the gym, then suddenly given numbers that aren't believable. When I click the e-mail button, it often says I have nothing new, and even when I can read messages, they are cryptic and don't always indicate the sender.
I like that the Microsoft Band does synchronize with some other health apps, such as MyFitnessPal, which I use to track my meals these days. By importing that data, it provides more relevant info that I'd otherwise have to figure out and enter manually. The problem is, I don't believe I could have possibly burned 2,609 calories from a recent visit to the gym, though it would be nice if that was indeed the case.
That's why after spending several weeks with it, I can say I like the concept but it's not worth its $199 price tag unless money is no object to you. While I agree with my colleague Brien Posey that the Microsoft Band has some nice features, I think I'd wait for an improved version of the Microsoft Band and a richer Microsoft Health site before buying one of these (unless they become remarkably less expensive).
That stated, I hope Microsoft continues to enhance the Microsoft Band by adding more capacity and battery life to make it a more usable and comfortable device. If everyone had accurate readings of our physical activities, maybe it would lead to healthier lifestyles.
Posted by Jeffrey Schwartz on 12/15/2014 at 7:28 AM
Once hailed as the future of in-vehicle communications and entertainment, a partnership between Ford and Microsoft has all but unraveled. Ford this week said it's replacing Microsoft Sync with BlackBerry's QNX software.
Ford yesterday announced its Sync 3 platform, which ushers in significant new features and will show up in 2016 vehicles sometime next year. Though Ford didn't officially announce it was walking away from Microsoft Sync in favor of BlackBerry QNX, The Seattle Times reported in February that the automaker was on the verge of making the switch. Raj Nair, Ford's CTO of global product development, said in numerous reports yesterday that QNX is now the new platform. Some 7 million Ford vehicles are reportedly equipped with Microsoft Sync, but the systems have continuously scored poorly in consumer satisfaction reports due to frequent malfunctions.
Swapping out Microsoft Sync for QNX would also result in cost savings, according to The Seattle Times, which noted that QNX is also used in the in-vehicle navigation systems of Audis and BMWs. Apple and Google also have alliances with various car manufacturers. While BlackBerry smartphones may be rapidly disappearing, QNX has gained significant ground in the in-vehicle systems market. Microsoft Sync, based on Windows Embedded, is said to also run the vehicle entertainment systems of some BMW, Kia, Fiat and Nissan models. Ford and Microsoft announced with great fanfare in 2007 their plans to roll out models with the entertainment system as an option.
Microsoft Sync was initially designed to link iPods and Zune music players to entertainment systems, debuting just at the dawn of the smartphone age. At the time, Microsoft founder Bill Gates saw Microsoft Sync as another element of the company's "Windows Everywhere" effort. As we all know, much has changed since then.
If Microsoft has new plans for Sync, the next logical time to announce them would be at next month's annual Detroit Auto Show.
Posted by Jeffrey Schwartz on 12/12/2014 at 11:28 AM
While IP address conflicts are as old as networks themselves, the growing number of employee-owned devices in the workplace is making them a more frequent problem for system administrators. Because PCs and devices have become transient, connecting to many different networks, it's not uncommon for a device to still think it's linked to one network, causing an IP address conflict when it tries to connect to another.
SolarWinds is addressing that with its new IP Control Bundle, which identifies and resolves IP address conflicts. The bundle consists of the new SolarWinds IP Address Manager (IPAM) and the SolarWinds User Device Tracker (UDT). There are two parts to the IP address resolution process.
First, IPAM identifies the IP conflicts by subnet, provides a history of where the user and machine connected to the network, identifies the switch and port on which that system connected, and disables that user's connection. Next, UDT uses that information to disable the switch port, assign a new IP address and update any DNS entries as necessary for the device to work before reconnecting.
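The detection half of that process can be sketched in a few lines. This is purely an illustration of the underlying idea, not SolarWinds' actual logic or API; the record fields and sample values below are hypothetical ARP-style data of the kind gathered from switches. A conflict is simply an IP address claimed by more than one MAC.

```python
from collections import defaultdict

def find_ip_conflicts(arp_entries):
    """Group (ip, mac, switch_port) records and flag any IP claimed by
    more than one MAC address -- the signature of an address conflict."""
    macs_by_ip = defaultdict(set)
    ports_by_ip = defaultdict(set)
    for ip, mac, port in arp_entries:
        macs_by_ip[ip].add(mac)
        ports_by_ip[ip].add(port)
    return {ip: {"macs": sorted(macs_by_ip[ip]), "ports": sorted(ports_by_ip[ip])}
            for ip in macs_by_ip if len(macs_by_ip[ip]) > 1}

entries = [
    ("10.0.1.15", "aa:bb:cc:00:00:01", "sw1/gi0/4"),
    ("10.0.1.15", "aa:bb:cc:00:00:02", "sw2/gi0/9"),  # a second MAC claims the same IP
    ("10.0.1.20", "aa:bb:cc:00:00:03", "sw1/gi0/7"),
]
conflicts = find_ip_conflicts(entries)
```

On the sample data this flags 10.0.1.15 along with both switch ports involved, which is the kind of information an administrator needs before deciding which port to disable.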
IPAM and UDT are typically installed on a separate server, and when a problem arises an administrator can use the software to scan the network and IP address ranges. It also interrogates routers, switches and other network infrastructure to gather relevant troubleshooting information. Rather than using agents, it relies on standard protocols, notably SNMP.
In addition to troubleshooting and remediating client-based devices, the SolarWinds package can handle IP address conflicts occurring on servers and virtual machines, says Chris LaPoint, vice president of product management at SolarWinds.
"If I'm the owner of that critical app trying to figure out what's going on, I can go to this tool and see that Joe over in another part of the datacenter has spun up a new VM and that's what's creating issues with my application," LaPoint explains. "So now I can probably notify Joe and tell him I'm kicking him off the network because it's actually affecting the availability of a customer-facing application that we need to have running."
Pricing for IPAM starts at $1,995 and UDT begins at $1,795.
Separately, SolarWinds this week said its SolarWinds Web Help Desk now works with DameWare Remote Support. SolarWinds acquired DameWare in 2011 but it operates as a separate business unit. The products are collectively used by 25,000 customers and the combined solution will allow help desk technicians to connect with remote devices or servers, collect support data including chat transcripts and screen shots and generate reports.
The SolarWinds Web Help Desk offering provides automated ticketing, SLA alerts, asset management and reporting, while DameWare Remote Support provides remote access to client devices and servers, allowing administrators to take control of those systems, manage multiple Active Directory domains and reset passwords.
Posted by Jeffrey Schwartz on 12/12/2014 at 11:27 AM
During the Thanksgiving break, I had a number of encounters with PCs in public places still sporting the Windows XP logo, and it got under my skin. Among them was a computer near the checkout area at Home Depot. And within an hour I spotted another on a counter right next to the teller stations at my local Bank of America branch.
Given that we know Windows XP systems are no longer patched by Microsoft, the sight of them is becoming as uncomfortable as being near someone who has a nasty cold and coughs without covering his or her mouth. Speaking of spreading viruses, I've even been to two different doctors' offices in recent months that were running Windows XP-based PCs -- one of them is used to actually gather patient information and the other to schedule appointments. In both cases, when I asked if they planned to upgrade those systems, I got the equivalent of a blank stare. I don't think they had any idea what I was talking about.
Nevertheless, seeing a Windows XP PC just after I used the self-checkout terminal at Home Depot was especially unsightly given the retailer's massive breach last month in which e-mail addresses were stolen. Home Depot Spokeswoman Meghan Basinger said: "Thanks for reaching out, but this isn't detail we'd discuss."
Now the Bank of America situation is a bit different. The day after the Thanksgiving holiday weekend, InformationWeek announced its IT chief of the year: Cathy Bessant, who heads Bank of America's 100,000-person Global Technology & Operations organization. That's a lot of IT pros and developers.
Bank of America appeared to have a strong IT organization just by the nature of the way the company is often first to market with new e-banking features and mobile apps. The bank's systems tend to be reliable, and it hasn't had any major breaches that I can recall. Also, having worked in the past for InformationWeek Editor-in-Chief Rob Preston, who interviewed Bessant and reported on the bank's ambitious IT efforts, I have no doubt the choice was a well-vetted one.
So when he noted among the bank's many milestones this year that its IT team completed the largest Windows 7 migration to date (300,000 PCs), I felt compelled to check in with Bank of America Spokesman Mark Pipitone. Perhaps after updating so many systems, my inquiry sounded petty, but I was curious as to how they were dealing with these stray Windows XP systems. Were they paying $200 for premium support per system or maybe the PC was just front-ending an embedded system? (Microsoft does still support Windows XP embedded.) As such, I sent a picture of the system to Pipitone.
"Not knowing exactly what device you took a picture of, the best the team can tell is that it's an excepted device (there are some across our footprint), or it's a device that's powered on but not being used on a regular basis," Pipitone responded.
I made a trip to the branch and asked what the XP machine was used for. A rep there told me that it was used for those needing to access their safe deposit boxes. I informed Pipitone of that, though he declined to comment further. Maybe the lone PC I saw isn't connected to the Internet or it is otherwise protected. But the mere public display of Windows XP machines in so many different places for many tech-aware people is still disconcerting.
I laud Bank of America and others who have undertaken the painful move of modernizing their PC environments. At the same time, I look forward to a day when I don't have to see that Windows XP logo when I walk into a place of business, whether it's a doctor's office, a local restaurant or a major retailer or bank. Windows XP was a great operating system when it came out, and I know some defenders of the legacy OS will be outraged by my stance -- many of whom are angered by Microsoft's decision to stop supporting it. But unless a Windows XP machine is not, and never will be, connected to a network, it's likely unprotected.
There is some encouraging news. Waiting in my inbox on December 1 right after the holiday weekend was a press release from StatCounter reporting that there are more Windows 8.1 PCs out there than those with Windows XP. According to the November report, 10.95 percent of systems are running Windows 8.1. Windows XP still accounts for 10.67 percent. This marks the first time that there are more Windows 8.1-based systems than Windows XP PCs, according to its analysis. Back in August, the combination of Windows 8 and Windows 8.1 systems achieved that milestone, so it could be argued the latest report is a minor feat.
Nevertheless, the stragglers will remain for some time, according to Sergio Galindo, general manager of GFI Software, a provider of Web monitoring and patch management software. "I'm aware of several companies that continue running large XP installations -- and even larger IT budgets -- that may have custom XP agreements," Galindo said. "Windows XP will continue to survive as long as it meets people's needs. To keep a network secure, IT admins and computer consultants can 'lock down' the accounts on the XP machines. I strongly advise that machines running XP be allowed only minimal capabilities and have no admin access. I also favor using more secure browsers such as Chrome versus Internet Explorer in these cases. Also, IT admins may want to shut off some of the more common attack vectors such as Adobe Flash. In the case of XP, less (software) is more (secure)."
By the way, just a friendly reminder: there are just over 200 days left before Microsoft will no longer support Windows Server 2003. You'll be hearing a lot about that from us, and Redmond magazine's Greg Shields primed the pump last month.
Posted by Jeffrey Schwartz on 12/10/2014 at 12:54 PM
At Microsoft's annual shareholder meeting Wednesday in Bellevue, Wash., CEO Satya Nadella cashed in big. Shareholders approved his proposed $84 million pay package, a reward for a job well done. The pay package, which includes $59.2 million in stock options and $13.5 million in retention pay, according to Bloomberg, has come under attack as excessive by Institutional Shareholder Services, an investor advisory organization.
Indeed Nadella ranks among the most highly paid CEOs. According to this year's Wall Street Journal/Hay Group CEO Compensation report ranking the 300 largest companies in revenue, the median pay package was $11.4 million, with Oracle CEO Larry Ellison taking the top spot in 2013 earning $76.9 million.
By that measure, Nadella isn't breaking any records. Oracle's share price rose nearly 29 percent, while Microsoft's share price jumped 32 percent since Nadella took over in early February. Nevertheless, investor advocates have scrutinized CEO compensation in wake of the financial crisis.
While Microsoft's prospects look better than they have in a long time, the package for some may look excessive. Others would argue Nadella has plenty of incentive to continue Microsoft's turnaround, which is still in its early stages and certainly not yet a sure thing, given rapid changes in fortune that can take place in the IT industry.
Do you believe Nadella's compensation is excessive or is it fair?
Posted by Jeffrey Schwartz on 12/05/2014 at 12:09 PM
It was hard to ignore the hype over the Thanksgiving weekend's traditional Black Friday and Cyber Monday barrage of cut-rate deals, including this year's decision by quite a few retailers to open their doors earlier than ever. Many, including the Microsoft Store, opened as early as 6 p.m. on Thanksgiving Day, hoping to lure people away from their turkey dinners to get a jump on their holiday shopping.
Content with spending Thanksgiving Day with my family and not a big fan of crowds anyway, I decided to stop by my local Staples at a normal hour on Friday morning. To my surprise, there were just a handful of people in the store. When I asked an employee why the store was so empty on Black Friday, she said the crowds were all there Thanksgiving night.
When I asked her how many people bolted early from their turkey dinners, she said there was a line of about 100 people outside the store prior to its 6 p.m. opening Thursday evening. Apparently a good chunk of them were waiting for the $99 Asus EeeBook X205TA, which normally sells for at least double that price. Truth be told, that's why I popped in, though I had anticipated the allotment would be sold out. I had already looked at the 11.6-inch Windows 8.1 notebook, which can also function as a tablet with its removable keyboard. It's powered with an Intel Atom processor, 2 GB of RAM and a 32GB SSD.
I asked her how many people in line were waiting for that device and she replied that more than half were. While many Windows 8.1 notebooks and tablets were on sale during the holiday rush, the two prominent $99 deals were the aforementioned Asus device and the HP Stream 7. The latter is a 7-inch Windows 8.1 tablet and it comes with a one-year Office 365 Personal subscription good for the tablet and one other PC. The discounted HP Stream 7 is only available today at the Microsoft Store, which is also offering up to $150 off on the most expensive Surface Pros with Intel Core i7 processors.
The HP Stream 7 is also powered by an Intel Atom processor and has a 32GB SSD, but only 1GB of RAM. While you shouldn't plan on doing much multitasking with this device, it's certainly a viable option if you want an ultra-portable tablet that can quickly access information and serve as an alternative to a Kindle Fire (the Kindle app is among many apps now available in the Microsoft Store).
Given I already have a Dell Venue 8 Pro with similar specs and 2GB of RAM, the HP Stream 7 was of little interest to me, though it would make a good gift for someone at that price. Back at Staples, I asked the employee if there were any of the Asus convertibles left at the $99 price and to my surprise she said they were all out but I could order one with free delivery from the store's kiosk. It's slated to arrive today. Apparently you can still order one on this Cyber Monday on Staples' Web site (you can probably get a competitor to match the price).
Today the National Retail Federation released a report forecasting that sales over the Thanksgiving weekend overall were down 11 percent and there are a number of theories for why that's the case. The drop in sales does show that all of those retailers who felt compelled to open their doors on Thanksgiving Day may want to rethink that strategy for next year.
Posted by Jeffrey Schwartz on 12/01/2014 at 12:54 PM
Microsoft this week gave developers and IT pros a deep dive on major new features coming to Office 365, which the company has described as the fastest growing new product in its history. The demos, which include several APIs and SDKs aimed at driving existing SharePoint users to Office 365, gave a close look at how building and administering applications for collaboration is going to change dramatically for IT pros, developers and end users alike.
Because Microsoft has made clear that organizations running applications developed in native code for SharePoint won't be able to migrate them to Office 365, the company is trying to convince customers to plan for the eventual move using the company's new app model. Microsoft is betting by offering compelling new capabilities, which it describes as its "Office Everywhere" effort, that organizations will find making the move worthwhile.
The APIs and new Office 365 features demonstrated include the new My Apps user interface, which the company also calls the App Launcher, due out for preview imminently after what the company described as a brief delay. My Apps gives users a customizable interface to the applications they use, such as Word, Excel, PowerPoint, contacts, mail and files. They can also add other Microsoft services and, ultimately, those of third parties.
Jeremy Thake, a senior Microsoft product manager, demonstrated the new Office 365 platform and underlying API model Thursday at the Live! 360/SharePoint Live! conference in Orlando. Thake said the Microsoft Graph demo was the first given in the United States since the company unveiled it two weeks ago at TechEd Europe, where Microsoft also released the preview of the new Office 365 APIs.
"The Microsoft Graph is essentially allowing me to authenticate once and then go to every single endpoint across Microsoft. And not just Office but Dynamics to Azure and anything I've got running Windows, such as Live, Outook.com and whatnot," Thake said during the demo, noting the plan is to tie it to third-party services that have registered to Microsoft Graph. "It's an access to those things from that one endpoint. This is a really powerful thing that isn't out yet. It's in preview; it will be coming next year."
Consultant Andrew Connell, organizer of the SharePoint Live! track at Live! 360, said the release of the APIs and the Microsoft Graph bode well for the future of Office 365 and SharePoint. "It opens so much more of the company data, not just so much more of our data that we're using in Microsoft services from a uniform endpoint for other companies to interact with and provide additional value on it," he said during the closing conference wrap-up panel. "That's going to be tremendous. That [Microsoft Graph] API is being pushed by the 365 group but it's a Microsoft thing -- it touches everything we do."
Thake demonstrated numerous other APIs, including a discovery service and the new Android and iOS SDKs, among other things. Major changes are coming to Office 365 in 2015, and they will have a huge impact on the IT pros and developers who build for and manage it. It was a big topic at SharePoint Live! and I'll be sharing the implications of what's in the pipeline in the coming weeks.
Posted by Jeffrey Schwartz on 11/21/2014 at 9:02 AM
While Microsoft this year has rolled out extensive additions to its data management portfolio as well as business intelligence and analytics tools, SQL Server is still its core database platform. Nevertheless, Microsoft has unleashed quite a few new offerings that DBAs, developers and IT decision makers need to get their arms around.
"I think Microsoft needs to have the full stack to compete in the big data world," said Andrew Brust, who is research director at Gigaom Research. Brust Tuesday gave the keynote address at SQL Server Live!, part of the Live! 360 conference taking place in Orlando, Fla., which like Redmond, is produced by 1105 Media. Microsoft CEO Satya Nadella has talked of the data culture that's emerging, as noted in the Redmond magazine October cover story.
Brust pointed out that Microsoft has delivered some significant new tools over the past year including its Azure HDInsight, its Apache Hadoop-based cloud service for processing unstructured and semi-structured Big Data. Microsoft recently marked the one-year anniversary of Azure HDInsight with the preview of a new feature, Azure Machine Learning, which adds predictive analysis to the platform.
"Since the summer, they've added half a dozen new data products, mostly in the cloud but they're significant nonetheless," Brust said in an interview, pointing to the variety of offerings ranging from Stream Analytics, the company's real-time events processing engine to Azure Data Factory, which lets customers provision, orchestrate and process on-premises data such as SQL Server with cloud sources including Azure SQL database, Blobs and tables. It also offers ETL as a service. Brust also pointed to the new Microsoft DocumemtDB, the company's new NoSQL entry, which natively supports JSON-compatible documents.
Microsoft's release of SQL Server 2014, which adds in-memory processing to its flagship database, takes aim at SAP's HANA. "Microsoft is going after it from the point of view that you can have in-memory and just stay in SQL Server instead of having to move to a specialized database," Brust said. "It's a version one, so I don't expect adoption to be huge, but it will be better in the next version. They are definitely still working on it. It's not just a one-off that they threw out there -- it's very strategic for them."
Posted by Jeffrey Schwartz on 11/19/2014 at 1:38 PM
Microsoft today said it has merged Windows code into Docker, allowing administrators on a Windows client to manage Docker containers running on Linux hosts. It's the latest move by Microsoft to jump on the Docker bandwagon, which began earlier this year with its support for Linux containers in the Azure public cloud, and continued with last month's pact by the two companies to develop native Docker clients for Windows Server.
The company released a command-line interface (CLI) along with reference documentation that illustrates how to compile the Docker client on Windows. "Up 'til today you could only use a Linux-based client CLI to manage your Docker container deployments or use boot2docker to set up a virtualized development environment in a Windows client machine," wrote Khalid Mouss, a senior program manager for the Azure runtime, in a blog post.
"Today, with a Windows CLI you can manage your Docker hosts wherever they are directly from your Windows clients," Mouss added. The Docker client is in the official Docker GitHub repository. Those interested can follow its development under Pull Request#9113.
While noteworthy, this is not the announcement of Windows-based Docker containers -- it's just a move to enable management of Linux-based containers from a Windows client, said Andrew Brust, research director at Gigaom Research, who, when I saw the news on my phone, happened to be sitting next to me at our Live! 360 conference, taking place this week in Orlando, Fla. "This is simply a client that lets you manage Linux-based Docker containers," Brust said. "It's interesting but it's not a huge deal."
Furthermore, Mouss noted that on the heels of Microsoft open sourcing the .NET Framework last week, the company this week also released a Docker image for ASP.NET on Docker Hub, enabling developers to create ASP.NET-ready containers from the base image.
See this month's Redmond magazine cover story on Microsoft's move toward containers as potentially the next wave of infrastructure and application virtualization.
Posted by Jeffrey Schwartz on 11/18/2014 at 11:52 AM
When Amazon Web Services announced Aurora as the latest database offering last week, the company put the IT industry on notice that it once again believes it can disrupt a key component of application infrastructures.
Amazon debuted Aurora at its annual AWS re:Invent customer and partner conference in Las Vegas. Amazon said the traditional SQL database for transaction-oriented applications, built to run on monolithic software and hardware, has reached its outer limits. Amazon Web Services' Andy Jassy said in the opening keynote address that the company has spent several years developing Aurora in secrecy.
Built on the premise that AWS' self-managed flagship services EC2, S3 and its Virtual Private Cloud (VPC) are designed for scale-out, service-oriented and multi-tenant architectures, Aurora removes half of the database out of the application tier, said Anurag Gupta, general manager of Amazon Aurora, during the keynote.
"There's a brand new log structured storage system that's scale out, multi-tenant and optimized for database workloads, and it's integrated with a bunch of AWS services like S3," said Gupta, explaining Aurora is MySQL-compatible. Moreover, he added, those with MySQL-based apps can migrate them to Aurora with just several mouse clicks and ultimately see a fivefold performance gain.
"With Aurora you can run 6 million inserts per minute, or 30 million selects," Gutpa said. "That's a lot faster than stock MySQL running on the largest instances from AWS, whether you're doing network IOs, local IOs or no IOs at all. But Aurora is also super durable. We replicate your data six ways across three availability zones and your data is automatically, incrementally, continuously backed up to S3, which as you know is designed for eleven nines durability."
Clearly Amazon is trying to grab workloads that organizations have built for MySQL but the company is also apparently targeting those that run on other SQL engines that it now hosts via its Relational Database Service (RDS) portfolio including Oracle, MySQL and Microsoft's SQL Server.
Aurora automatically repairs failures in the background recovering from crashes within seconds, Gupta added. It can replicate six copies of data across three Availability Zones and backup data continuously to S3. Customers can scale an Aurora database instance up to 32 virtual CPUs and 244GB of memory. Aurora replicas can span up to three availability zones with storage capacities starting at 10GB and as high as 64TB.
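The durability claim is easy to reason about. Assuming the six copies are spread evenly at two per Availability Zone (an assumption on my part; Amazon didn't spell out the exact distribution), losing an entire zone still leaves four copies standing:

```python
def surviving_copies(copies_per_az, failed_azs):
    """With copies spread across three Availability Zones (six total in
    Aurora's case), count the copies left after zone failures.
    Zone names are illustrative."""
    azs = {"az-a": copies_per_az, "az-b": copies_per_az, "az-c": copies_per_az}
    return sum(n for az, n in azs.items() if az not in failed_azs)

# An entire zone going dark still leaves four of the six copies:
left = surviving_copies(2, failed_azs={"az-b"})
```

That headroom is what lets the service repair failed copies in the background, as described above, without the database ever dropping below a usable number of replicas.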
Gupta said the company is looking to price this for wide adoption, with pricing starting at 29 cents for a two-virtual CPU, 15.25-GB instance.
The preview is now available. Do you think Amazon Aurora will offer a viable alternative to SQL databases?
Posted by Jeffrey Schwartz on 11/17/2014 at 12:32 PM
Facebook is secretly developing a social network aimed at enterprise users, according to a report published in today's Financial Times. The report said Facebook at Work could threaten Microsoft's Yammer enterprise social network as well as LinkedIn and Google Drive.
At first glance, it's hard to understand how Facebook at Work would challenge both Yammer and LinkedIn. Though they're both social networks, they are used in different ways. Granted, there's some overlap, but Yammer is a social network for a closed group of users. Facebook's own employees have apparently used Facebook at Work over the past year for internal communications, and the company has let others test it as well.
The report was otherwise quite vague and I wonder if the author even understands the difference between Yammer, LinkedIn and Google Drive. It's not unreasonable to think Facebook would want to offer a business social network similar to Yammer or Salesforce.com's Chatter. But as PC Magazine points out, many businesses might have issues with a service provided by Facebook.
That said, I reached out to SharePoint and Yammer expert Christian Buckley, who recently formed GTConsult, to get his take. Buckley said there's been buzz about Facebook's ambitions for some time but he's skeptical that Facebook could make a serious dent in the enterprise social networking market despite its dominance on the consumer side.
"Honestly I think they're a couple years behind in making any serious move in this space," Buckley said. "They will undoubtedly attract users, and have a number of high-profile deployments, but there is a very real line of demarcation between consumer and business platforms, and I just don't see Facebook as being able to close that gap in any serious way."
Buckley also noted that Google, LinkedIn and Yammer have very different value propositions to enterprises. "Each have their own struggles," Buckley said. "LinkedIn may be displacing Yahoo Groups and other public chat forums, but my understanding is that they are having a difficult time translating that moderate growth into additional revenue beyond job postings. Yammer's difficulties may be a closer comparison and highlight Facebook's uphill battle to win over the enterprise by aligning ad hoc social collaboration capabilities with business processes. Microsoft has Yammer at the core of its inline social strategy, and like SharePoint, the individual Yammer brand will fade (in my view) as the core features are spread across the Office 365 platform. Instead of going to a defined Yammer location, the Yammer-like features will happen in association with your content, your e-mail, your CRM activities, and so forth."
What's your take on this latest rumor?
Posted by Jeffrey Schwartz on 11/17/2014 at 11:56 AM
When Microsoft CEO Satya Nadella last month said, "Microsoft loves Linux" and pointed to the fact that 20 percent of its Azure cloud is already running the popular open source platform, he apparently was getting ready to put his money where his mouth is.
At its Connect developer conference this week, Microsoft said it will open source its entire .NET Framework core and bring it to both Linux and the Apple Macintosh platform. It is the latest move by Microsoft to open up its proprietary .NET platform. Earlier this year, the company made ASP.NET and the C# compiler open source. This week the company released the .NET Core development stack and in the coming months, Microsoft will make the rest of .NET Core Runtime and .NET Core Framework open source.
Citing more than 1.8 billion .NET installations and over 7 million downloads of Visual Studio 2013 during the past year, Microsoft Developer Division Corporate Vice President S. Somasegar said in a blog post, "we are taking the next big step for the Microsoft developer platform, opening up access to .NET and Visual Studio to an even broader set of developers by beginning the process of open sourcing the full .NET server core stack and introducing a new free and fully-featured edition of Visual Studio." These were all once unthinkable moves.
Just how big a deal is this? Consider the reaction of Linux Foundation Executive Director Jim Zemlin: "These are huge moves for the company," he said in a blog post. "Microsoft is redefining itself in response to a world driven by open source software and collaborative development and is demonstrating its commitment to the developer in a variety of ways that include today's .NET news."
Zemlin lauded a number of Microsoft's open source overtures including its participation in the OpenDaylight SDN project, the AllSeen Alliance Internet of Things initiative and the Core Infrastructure Initiative.
For IT pros, the move is Microsoft's latest affirmation of the company's embrace of open source and Linux in particular. At the same time, while some believe Microsoft is also doing so to deemphasize Windows, the company's plans to provide Docker containers in Windows Server suggests the company has a dual-pronged strategy for datacenter and applications infrastructure: bolster the Windows platform to bring core new capabilities to its collaboration offerings while ensuring it can tie to open source platforms and applications as well.
At the same time, it appears that Microsoft is seeking to ensure that its development environment and ecosystem remains relevant in the age of modern apps. Zemlin believes Microsoft has, in effect, seen the light. "We do not agree with everything Microsoft does and certainly many open source projects compete directly with Microsoft products," he said. "However, the new Microsoft we are seeing today is certainly a different organization when it comes to open source. Microsoft understands that today's computing markets have changed and companies cannot go it alone the way they once did."
Posted by Jeffrey Schwartz on 11/14/2014 at 11:11 AM
When Microsoft last month announced it has 100-plus partners adopting its burgeoning Cloud OS Network, which aims to provide Azure-compatible third-party cloud services, it left out perhaps one of the biggest fish it has landed: Rackspace.
The two companies are longtime partners, and as I recently reported, Rackspace has extended its Hyper-V-compatible offerings and dedicated Exchange, SharePoint and Lync services. But Rackspace also has a formidable cloud infrastructure as a service that competes with the Azure network. The news that Rackspace now will provide Azure-compatible cloud services, announced on Monday with Rackspace's third-quarter earnings report, signals a boost for both companies.
For Microsoft it brings one of the world's largest public clouds and dedicated hosting providers into the Azure fold. Even if it's not all in or the core of Rackspace business -- that is still reserved for its own OpenStack-based infrastructure, a healthy VMware offering and the newly launched Google Apps practice -- Rackspace has a lot of Exchange and SharePoint hosting customers who may want to move to an Azure-like model but want to use it with the service level that the San Antonio, Texas-based company emphasizes.
"Those who are down in the managed 'colo' world, they don't want to be managing the infrastructure. They want us to do that," said Jeff DeVerter, general manager of Microsoft's Private Cloud business at Rackspace. "They're happy to let that go and get back into the business of running the applications that run that business."
Customers will be able to provision Azure private cloud instances in the Rackspace cloud and use the Windows Azure Pack to manage and view workloads. This is not a multitenant offering like Azure or similar infrastructure-as-a-service clouds, DeVerter pointed out. "These are truly private clouds from storage to compute to the networking layer and then the private cloud that gets deployed inside of their environment is dedicated to theirs. We deploy a private cloud into all of our datacenters [and] it puts the customers' cloud dual homing some of their management and reporting back to us so that we can manage hundreds and then thousands of our customers' clouds through one management cloud."
Microsoft first launched the Cloud OS Network nearly a year ago with just 25 partners. Now with more than 100, Marco Limena, Microsoft's vice president of Hosting Service Providers, claimed in a blog post late last month that there are in excess of 600 Cloud OS local datacenters in 100 companies serving 3.7 million customers. The company believes this network model will address the barriers among customers who have data sovereignty and other compliance requirements.
Among the members of the Cloud OS Network listed in an online directory are Bell Canada, CapGemini, Datapipe, Dimension Data and SherWeb. "Microsoft works closely with network members to enable best-practice solutions for hybrid cloud deployments including connections to the Microsoft Azure global cloud," Limena said.
Asked if it's in the works for Rackspace to enable Cloud OS private cloud customers to burst workloads to the Microsoft Azure service, DeVerter said: "Those are active conversations today that we're having internally and having with Microsoft. But right now our focus is around making that private cloud run the best it can at Rackspace."
Posted by Jeffrey Schwartz on 11/12/2014 at 1:01 PM
If the Microsoft Azure public cloud is going to be the centerpiece of its infrastructure offering, the company needs to bring third-party applications and tools along with it. That's where the newly opened Microsoft Azure Marketplace comes in. The company announced the Microsoft Azure Marketplace at a press and analyst briefing in San Francisco late last month led by CEO Satya Nadella and Scott Guthrie, executive VP of cloud and enterprise. As the name implies, it's a central marketplace in which providers can deliver software to customers to run as virtual images in Azure.
A variety of providers have already ported these virtual images to the marketplace -- some are pure software vendors, while others are providers of vertical industry solutions -- and a number of notable offerings have started appearing. Many providers announced their offerings at last month's TechEd conference in Barcelona.
One that Microsoft gave special attention to at the launch of the Azure Marketplace was Cloudera, the popular supplier of the Apache Hadoop distribution. Cloudera has agreed to port its Cloudera Enterprise distribution, which many Big Data apps are developed on, to Microsoft Azure. That's noteworthy because Microsoft's own Azure HDInsight Hadoop as a Service is based on the Hortonworks Apache Hadoop distribution. While it could cannibalize Azure HDInsight, customers already committed to Cloudera would be far less likely to come to Azure if Cloudera weren't there.
"To date, most of our customers have built large infrastructures on premises to run those systems, but there's increasing interest in public cloud deployment and in hybrid cloud deployment, because infrastructure running in the datacenter needs to connect to infrastructure in the public cloud," said Cloudera Founder and Chief Strategy Officer Mike Olsen, speaking at the Microsoft cloud briefing in San Francisco. "This we believe is, for our customers, a major step forward in making the platform more consumable still."
Also up and running in the Azure Marketplace is Kemp Technologies, a popular provider of Windows Server load balancers and application delivery controllers. The Kemp Virtual LoadMaster for Azure lets customers create a virtual machine (VM) optimized to run natively in the Microsoft cloud, said Maurice McMullin, a Kemp product manager.
"Even though Azure itself does have a load balancer, it's a pretty rudimentary one," McMullin said. "Having the Kemp load balancer in there totally integrated into the Azure environment allows you to script some of those environments and application scenarios. The impact of that is, for an organization that's looking toward the cloud, one of the big challenges is trying to maintain the consistency by having a consistent load balancer from on premises, meaning you get a single management interface and consistent management of apps and policies on premises or in the cloud."
Lieberman Software has made available as a virtual image in the marketplace its Enterprise Random Password Manager (ERPM), which the company said provides enterprise-level access controls over privileged accounts throughout the IT stack, both on premises and now in Azure.
The company says ERPM removes persistent access to sensitive systems by automatically discovering, securing and auditing privileged accounts across all systems and apps within an enterprise. Authorized administrators can delegate to users quick access to specific business applications, as well as corporate social media sites, in a secure environment, and those activities are automatically recorded and audited. It also ensures access to such identities is temporary, helping prevent unauthorized or anonymous access to sensitive data.
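The core mechanics behind that description -- rotating credentials to random values and granting only time-boxed, audited check-outs -- can be sketched in a few lines. This is a hypothetical model of the pattern, not Lieberman's implementation; the function and account names are invented:

```python
import secrets
import string
import time

AUDIT_LOG = []  # every privileged-account event gets recorded here

def rotate_password(account, length=20):
    """Replace an account's credential with a fresh random value and
    log the event, so no one retains a long-lived known password."""
    alphabet = string.ascii_letters + string.digits + "!@#$%"
    new_password = "".join(secrets.choice(alphabet) for _ in range(length))
    AUDIT_LOG.append({"account": account, "action": "rotate",
                      "ts": time.time()})
    return new_password

def check_out(account, user, ttl_seconds=3600):
    """Grant temporary access; the expiry enforces that privileged
    access is never persistent."""
    expires = time.time() + ttl_seconds
    AUDIT_LOG.append({"account": account, "action": "checkout",
                      "user": user, "expires": expires})
    return expires

pw = rotate_password("svc-sql01")
expiry = check_out("svc-sql01", "alice", ttl_seconds=900)
```

A real product layers discovery, approval workflows and tamper-proof audit storage on top, but the rotate-then-check-out loop is the essential idea.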
Another security tool is available from Waratek Ltd., a supplier of a Java Virtual Machine (JVM) container, which lets enterprises bring their own security to the cloud. Called Runtime Application Self-Protection (RASP), it monitors for key security issues and provides policy enforcement and attack blocking from the JVM.
In the JVM, the company offers a secure container where administrators can remotely control their own security at the application level, said Waratek CEO Brian Maccaba. "This is over and beyond anything the cloud provider can do for you and it's in your control," Maccaba says. "You're not handing it to Microsoft or Amazon -- you're regaining the reins, even though it's on the cloud."
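Runtime application self-protection boils down to interposing a policy check between the application and sensitive operations. Waratek does this inside the JVM; the following Python sketch only models the interception idea with invented names, and is not Waratek's API:

```python
class PolicyViolation(Exception):
    """Raised when a guarded operation is forbidden by policy."""

class RuntimeGuard:
    """Toy RASP layer: wraps operations and blocks those the
    policy forbids, recording each incident for audit."""

    def __init__(self, blocked_ops):
        self.blocked_ops = set(blocked_ops)
        self.incidents = []

    def guard(self, op_name, func):
        def wrapped(*args, **kwargs):
            if op_name in self.blocked_ops:
                self.incidents.append(op_name)      # audit trail
                raise PolicyViolation("blocked: " + op_name)
            return func(*args, **kwargs)
        return wrapped

guard = RuntimeGuard(blocked_ops={"exec_shell"})
safe_len = guard.guard("string_length", len)        # allowed op
shell = guard.guard("exec_shell", lambda cmd: cmd)  # forbidden op

result = safe_len("hello")
try:
    shell("rm -rf /")
    blocked = False
except PolicyViolation:
    blocked = True
```

The point Maccaba makes is that this enforcement layer travels with the application, so the tenant, not the cloud provider, holds the policy.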
The number of offerings in the Azure Marketplace is still relatively small -- close to 1,000, based on a search via the portal -- though it is growing.
Posted on 11/10/2014 at 12:44 PM
Microsoft got some positive ink yesterday when it announced that Office 365 users on iPhones and iPads can now edit their documents for free and that the same capability was coming to Android tablets. Indeed it is good news for anyone who uses one or more of those devices (which is almost everyone these days).
But before you get too excited, you should read the fine print. As Directions on Microsoft Analyst Wes Miller noted on his blog, "Office is free for you to use on your smartphone or tablet if, and only if you are not using it for commercial purposes [and] you are not performing advanced editing."
If you do fit into the above-mentioned buckets or you want the unlimited storage and new Dropbox integration, it requires either an Office 365 Personal, Home or a commercial Office 365 subscription that comes with the Office 365 ProPlus desktop suite, Miller noted. As Computerworld's Gregg Keizer put it: "What Microsoft did Thursday was move the boundary between free and paid, shifting the line."
In Microsoft's blog post announcing the latest free offering, it does subtly note that this offer may not be entirely free. "Starting today, people can create and edit Office content on iPhones, iPads, and soon, Android tablets using Office apps without an Office 365 subscription," wrote Microsoft Corporate VP for Microsoft Office John Case, though that fine print was at the end of his post. "Of course Office 365 subscribers will continue to benefit from the full Office experience across devices with advanced editing and collaboration capabilities, unlimited OneDrive storage, Dropbox integration and a number of other benefits." Microsoft offers similar wording on the bottom of its press release issued yesterday.
Still, while noting this is great news for consumers, it's going to be problematic for IT organizations, Miller warned, especially those that have loose BYOD policies. "For commercial organizations, I'm concerned about how they can prevent this becoming a large license compliance issue when employees bring their own iPads in to work."
Are you concerned about this as well?
Posted by Jeffrey Schwartz on 11/07/2014 at 11:04 AM
BlueStripe Embeds App Monitor into System Center, Windows Azure Pack
BlueStripe Software is now offering its Performance Center tool as a management pack for Microsoft System Center 2012 R2 Operations Manager. The company earlier this year released the dashboard component of FactFinder, which monitors distributed applications across numerous modern and legacy platforms.
With the addition of Performance Center, the company has embedded its core FactFinder tool into System Center. FactFinder can monitor everything from mainframe infrastructure, including CICS and SAP R/3 transactions, to applications running on Unix, Linux and Windows. BlueStripe said it provides visibility into application components and the root causes of performance problems across physical, virtual and cloud environments. It works with third-party public cloud services, as well.
FactFinder integrates with Operations Manager workflows, providing data such as response times, failed connections, application loads and server conditions, the company said. It also maps all business transactions by measuring performance across each hop of a given chain and is designed to drill into the server stack to determine the cause of a slow or failing transaction.
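Measuring per-hop latency to find the bottleneck in a transaction chain is a simple reduction once the timings exist. The sketch below illustrates the drill-down idea with made-up tier names and numbers; it is not BlueStripe's data model:

```python
def slowest_hop(hop_timings):
    """Given (tier, milliseconds) pairs for one transaction chain,
    return the hop most responsible for latency plus the total --
    the kind of drill-down an APM tool automates."""
    total = sum(ms for _, ms in hop_timings)
    worst_tier, worst_ms = max(hop_timings, key=lambda hop: hop[1])
    return worst_tier, worst_ms, total

# One simulated transaction crossing three tiers (times in ms).
chain = [("web-frontend", 12.0), ("app-server", 48.0), ("database", 310.0)]
culprit, latency, total = slowest_hop(chain)
```

In this toy chain the database hop dominates the 370 ms round trip, which is exactly the signal an operator would drill into next.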
In addition to the new System Center Management Pack, BlueStripe launched Performance Center for the Windows Azure Pack, which is designed to provide administrators common visibility of their Windows Server and Microsoft Azure environments. This lets administrators and application owners monitor the performance via the Windows Azure Pack.
BlueStripe Marketing Manager Dave Mountain attended last week's TechEd Conference in Barcelona and said he was surprised at the amount of uptake for the Windows Azure Pack. "There's a recognition of the need for IT to operate in a hybrid cloud world," Mountain said. "IT's reason for existing is to ensure the delivery of business services. Tools that allow them to focus on app performance will be valuable and that's what we are doing with FactFinder Performance Center for Windows Azure Pack."
Netwrix Tackles Insider Threats with Auditor Upgrade
Netwrix Corp. has upgraded its auditing software to offer improved visibility to insider threats, while warning of data leaks more quickly. The new Netwrix Auditor 6.5 offers deeper monitoring of log files and privileged accounts, which in turn provides improved visibility to changes made across a network, including file servers and file shares.
The new release converts audit logs into more human-readable formats, according to the company. It also lets IT managers and systems analysts audit configurations from any point in time, while providing archives of historical data against which to match. Netwrix said this helps ensure compliance with security policies and thwarts rogue employees from making unauthorized changes.
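Converting a raw change-log record into a readable sentence is essentially a field-mapping step. The example below is a hypothetical sketch of that transformation -- the field names and record layout are invented for illustration and are not Netwrix's schema:

```python
def humanize_event(event):
    """Render a raw change record as one readable audit line."""
    return "{who} {action} '{what}' on {server} at {when}".format(**event)

# A made-up raw record of the sort a collector might emit.
raw = {
    "who": "DOMAIN\\jsmith",
    "action": "modified",
    "what": "\\\\fs01\\payroll\\q3.xlsx",
    "server": "fs01",
    "when": "2014-11-05 09:14",
}
line = humanize_event(raw)
```

The value of the product is in collecting and correlating such records at scale; the readable rendering is just the last step.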
In all, Netwrix said it has added more than 30 improvements to the new release of Auditor, resulting in higher scalability and performance.
Riverbed Extends Visibility and Control
Riverbed Technology this week launched the latest version of its SteelHead WAN optimization platform, including a new release of its SteelCentral AppResponse management tool to monitor hybrid environments, including Software-as-a-Service (SaaS) apps.
Core to the new SteelHead 9.0 is its tight integration with SteelCentral AppResponse, which Riverbed said simplifies the ability to troubleshoot applications using the app's analytics engine, making it easier to manage such processes as policy configuration, patch management, reporting and troubleshooting. The SteelCentral dashboard lets administrators track performance of applications, networks, quality of service and reports on how policies are maintained.
SteelCentral AppResponse 9.5 also gives administrators metrics on end-user experience of traditional and SaaS-based apps, even if they're not optimized by the SteelHead WAN platform. Riverbed said providing this information aims to let IT groups respond to business requirements and issues causing degraded performance. The new SteelHead 9.0 also is designed to ensure optimized performance of Office 365 mailboxes.
Posted by Jeffrey Schwartz on 11/07/2014 at 10:50 AM
A majority of chief information security officers (CISOs) at some of the largest organizations strongly believe the sophistication of attackers is outstripping their own ability to fend them off, and that the number of threats has increased markedly. According to IBM's third annual CISO study, 59 percent are concerned about their inability to keep pace with increasingly sophisticated attackers, and 40 percent say it's their top security challenge.
Moreover, 83 percent said external threats have increased over the past three years with 42 percent of them saying the increases were dramatic. IBM revealed results of its study at a gathering of CISOs held at its New York offices.
The survey also found CISOs are more frequently questioned by the C-suite and corporate boards, while changes to the global regulatory landscape promise to further complicate efforts to stem threats. Kristin Lovejoy, IBM's general manager of security services, said malware creation is a big business in unregulated countries, which are the origin of most attacks.
"Where we say we're worried about external attackers and we're worried about financial crime data theft, there's a correlation between people getting Internet access in unregulated, unlegislated countries where it's an economic means of getting out," Lovejoy said. "When you interview the criminals, they don't even know they're performing a crime -- they're just building code. We have to be careful here, this external attacker thing, it's not going to get any better, it's going to get worse."
Most are able to exploit the naivety of employees, she added, noting 80 to 90 percent of all security incidents were because of human error. "They're getting in because users are pretty dumb," she said. "They click on stuff all the time. It's going to continue." She added organizations that are most secure are those that have good IT hygiene, automation, configuration management, asset management, especially those that implement ITIL practices.
Posted by Jeffrey Schwartz on 11/05/2014 at 1:23 PM
A survey of small and medium enterprises found that only 8 percent are prepared to recover from an unplanned IT outage, while 23 percent of them report it would take more than a day to resume operations.
Underscoring the risk to companies with fewer than 1,000 employees, a vast majority of the 453 organizations surveyed have experienced a major IT outage in the past two years. Companies with 50 to 250 employees were especially at risk. A reported 83 percent have gone through a major IT failure, while 74 percent of organizations with 250 to 1,000 employees have experienced a significant outage.
One-third are using cloud-based disaster recovery as a service, which has rapidly started to gain momentum this year, according to the survey, conducted by Dimensional Research and sponsored by DRaaS provider Axcient. Daniel Kuperman, director of product marketing at Axcient, said the results confirmed what the company had suspected. "In a lot of cases, companies still don't put emphasis on disaster recovery," he said.
Axcient didn't reveal whose DRaaS offerings the organizations were using, though Kuperman said Dimensional Research selected the companies it polled from its own resources. DRaaS is one of the leading use cases for organizations making their foray into using cloud services.
A survey by cloud infrastructure provider EvolveIP last month found that nearly 50 percent benefited from a cloud recovery service by avoiding outages from a disaster. Nearly three-quarters, or 73 percent, cited the ability to recover from an outage as the prime benefit of using a cloud service. As a result, 42 percent of those responding to EvolveIP's survey have increased their cloud spending budgets this year, while 54 percent plan to do so in 2015.
Posted by Jeffrey Schwartz on 11/05/2014 at 12:01 PM
In its latest bid to offer better failover and replication in its software and cloud infrastructure, Microsoft demonstrated its new Storage Replica technology at last week's TechEd conference in Barcelona.
Microsoft Principal Program Manager Jeff Woolsey demonstrated Storage Replica during the opening TechEd keynote. Storage Replica, which Microsoft sometimes calls Windows Volume Replication (or WVR), provides block-level, synchronous replication between servers or clusters to provide disaster recovery, according to a Microsoft white paper published last month. The new replication engine is storage-agnostic and Microsoft says it can also stretch a failover cluster for high availability.
Most notable is that Storage Replica provides synchronous replication, which as Microsoft describes it, enables organizations to mirror data within the datacenter with "crash-consistent volumes." The result, says Microsoft, is zero data loss at the file system level. By comparison, asynchronous replication, which Microsoft added to Windows Server 2012 via the Hyper-V Replica and updated in last year's Windows Server 2012 R2 release, allows site extension beyond the limitations of a local metropolitan area. Asynchronous replication, which has a higher possibility for data loss or delay, may not be suited for scenarios where instantaneous real-time availability is a requirement, though for general purposes it's considered adequate.
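The trade-off between the two modes can be made concrete with a toy model: a synchronous write is acknowledged only after both copies commit, while an asynchronous write is acknowledged immediately and queued for later shipment. This is a simplified illustration of the semantics described above, not Microsoft's implementation:

```python
class Volume:
    """Toy block store: a dict of block_id -> data."""
    def __init__(self):
        self.blocks = {}

def write_synchronous(primary, replica, block_id, data):
    """Synchronous: both copies commit before the ack, so an
    acknowledged write can never be lost (zero data loss)."""
    primary.blocks[block_id] = data
    replica.blocks[block_id] = data   # must land before the ack
    return "ack"

def write_asynchronous(primary, replica, block_id, data, queue):
    """Asynchronous: ack immediately, ship later -- lower latency
    over long distances, but queued writes are lost if the primary
    fails before the queue drains."""
    primary.blocks[block_id] = data
    queue.append((block_id, data))    # replicated in the background
    return "ack"

p, r = Volume(), Volume()
write_synchronous(p, r, 1, b"payroll")

p2, r2, pending = Volume(), Volume(), []
write_asynchronous(p2, r2, 2, b"logs", pending)
```

After the synchronous write, both volumes hold block 1; after the asynchronous one, the replica does not yet hold block 2 -- which is exactly the window of potential data loss the article describes.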
In the TechEd demo, Woolsey simulated a scenario with four server nodes, two in New York and two across the river in New Jersey. The goal was to ensure that if users are unable to access data on the two nodes in New York, they automatically and transparently fail over to New Jersey without losing any data, Woolsey explained. The setup also uses a new feature in the Microsoft Azure service called Cloud Witness.
"To do a stretch cluster you need to have a vote for the cluster quorum," Woolsey explained. "In the past, this meant extra hardware, extra infrastructure, extra cost. Now we're just making this part of Azure as well. So that's an option to take advantage of the Cloud Witness. As you can see, we're baking hybrid capabilities right into Windows Server."
In the demo, Woolsey accessed the file share data to enable replication via the new storage replication wizard. From there he selected the source log disk, then the destination storage volume and log disk. "Literally in just a few clicks, that's it, I've gone ahead and I've set up synchronous replication," he said.
In the recently published white paper, Microsoft lists the following Storage Replica features as implemented in the Windows Server Technical Preview:
- Synchronous replication: yes (server to server only)
- Storage hardware agnostic
- Windows Server stretch cluster creation
- Write order consistency across volumes
- Transport over TCP/IP or RDMA
- Replication network port firewall requirements: single IANA port (TCP 445 or 5445)
- Over-the-wire encryption and signing
- Per-volume failovers allowed
- Dedup and BitLocker volume support
- Management UI in-box: Windows PowerShell, Failover Cluster Manager
Microsoft also has emphasized that Storage Replica is not intended for backup and recovery scenarios. And because of the general-purpose nature of the product, the company noted it may not be suited to specific application behaviors. In addition, Microsoft warns that with Storage Replica, organizations could see feature gaps in applications and hence could be better served by app-specific replication technologies.
What's your take on Microsoft's latest efforts to embed disaster recovery into Windows Server and Azure?
Posted by Jeffrey Schwartz on 11/03/2014 at 1:26 PM
Microsoft used its TechEd conference in Barcelona this week to give customers a first look at the new Azure cloud in a box. The so-called Cloud Platform System (CPS), announced at an event held last week in San Francisco led by CEO Satya Nadella and Executive VP for Cloud and Enterprise Scott Guthrie, is Microsoft's effort to let customers or hosting providers run their own Azure clouds.
The first CPS is available from Dell, though describing the company as the "first" to provide one implies that other major hardware providers may have plans for their own iterations -- or perhaps it's only at the wishful thinking stage. At any rate, CPS has been a long time coming.
As you may recall, Microsoft first announced plans to release such an offering more than four years ago. At the time, Dell, Hewlett Packard and Fujitsu were planning to offer what was then coined the Windows Azure Platform Appliance, and eBay had planned to run one. Though Microsoft took it on a roadshow that year, it suddenly disappeared.
Now it's back and Corporate VP Jason Zander showcased it in his TechEd Europe opening keynote, inviting attendees to check it out on the show floor. "This is an Azure-consistent cloud in a box," he said. "We think this is going to give you the ability to adopt the cloud with even greater control. You energize it, you hook it up to your network and you're basically good to go."
The CPS appears more modest than the original Windows Azure Platform Appliance in that it is sold as a converged rack-based system rather than a prefabricated container with air conditioning and cooling systems. The racks are configured with Dell PowerEdge servers, storage enclosures and network switches. Each rack includes 32 CPU nodes and up to 282TB of storage. On the software side, customers get Windows Server 2012 R2 with Hyper-V, configured in a virtualized multi-tenant architecture, System Center 2012 R2 and the Windows Azure Pack to provide Azure-like functionality within a customer's datacenter.
So far, the first two known customers of the CPS are NTTX, which will use it to provide its own Azure infrastructure as a service in Japan and CapGemini, which will provide its own solutions for customers running in the Azure cloud.
CapGemini is using it for an offering called SkySight, which will run a variety of applications including SharePoint and Lync as well as a secure policy-driven orchestration service based on its own implementation of Azure. "SkySight is a hybrid solution where we will deliver a complete integrated application store and a developer studio all using Microsoft technologies," said CapGemini Corporate VP Peter Croes, in a pre-recorded video presented by Zander during the keynote. "CPS for me is the integrated platform for public and private cloud. Actually it's the ideal platform to deliver the hybrid solution. That is what the customers are looking for."
Microsoft last week tried to differentiate itself from Amazon Web Services and Google in its hybrid approach. CPS could become an important component of Azure's overall success.
Posted by Jeffrey Schwartz on 10/31/2014 at 1:00 PM
Andy Rubin is leaving Google to join a technology incubator dedicated to startups building hardware, according to published reports. Whether you are an Android fan or not, it's hard to argue that Google's acquisition of the company Rubin founded wasn't one of the most significant deals made by the search giant.
While Rubin continued to lead the Android team since Google acquired it in 2005, Google reassigned him last year to lead the company's moves into the field of robotics, which included overseeing the acquisition of numerous startups.
Rubin's departure comes a week after Google CEO Larry Page promoted Sundar Pichai to head up all of Google's product lines except for YouTube. Page in a memo to employees published in The Wall Street Journal said he's looking to create a management structure that can make "faster, better decisions."
That effectively put Pichai in charge of the emerging robotics business as well. A spokesman told The New York Times that James Kuffner, who has worked with Google's self-driving cars, will lead the company's robotics efforts.
The move comes ironically on the same day that Google sold off the hardware portion of its Motorola Mobility business to Lenovo. The two companies yesterday closed on the deal, announced in January. Lenovo said it will continue to leverage the Motorola brand.
As for Rubin, now that he's incubating startups, he'll no doubt be on the sell-side of some interesting companies again.
Posted by Jeffrey Schwartz on 10/31/2014 at 12:58 PM
Microsoft kicked off what looks to be its final TechEd conference with the launch of new services designed to simplify the deployment, security and management of apps running in its cloud infrastructure. In the opening keynote presentation at TechEd, taking place in Barcelona, officials emphasized new capabilities that enable automation and the ability to better monitor the performance of specific nodes.
A new feature called Azure Operational Insights will tie the cloud service and Azure HDInsight with Microsoft's System Center management platform. HDInsight, the Apache Hadoop-based Big Data analytics service, will monitor and analyze machine data from cloud environments to determine where IT pros need to reallocate capacity.
Azure Operational Insights, which will be available in preview mode next month (a limited preview is currently available), initially will address four key functions: log management, change tracking, capacity planning and update assessment. It uses the Microsoft Monitoring Agent, which incorporates an application performance monitor for .NET apps and the IntelliTrace Collector in Microsoft's Visual Studio development tooling, to collect complete application-profiling traces. Microsoft offers the Monitoring Agent as a standalone tool or as a plugin to System Center Operations Manager.
Dave Mountain, vice president of marketing at BlueStripe Software, was impressed with the amount of information it gathers and the way it's presented. "If you look at it, this is a tool for plugging together management data and displaying it clearly," Mountain said. "The interface is very slick, there's a lot of customization and it's tile-based."
On the heels of last week's announcement that it will support the more-robust G-series of virtual machines, which boast up to 32 CPU cores of compute based on Intel's newest Xeon processors, 450GB of RAM and 6.5TB of local SSD storage, Microsoft debuted Azure Batch, which officials say is designed to let customers use Azure for jobs that require "massive" scale out. The preview is available now.
Azure Batch is based on the job scheduling engine used by Microsoft internally to manage the encoding of Azure Media Services and for testing the Azure infrastructure itself, said Scott Guthrie, Microsoft's executive VP for cloud and enterprise, in a blog post today.
"This new platform service provides 'job scheduling as a service' with auto-scaling of compute resources, making it easy to run large-scale parallel and high performance computing (HPC) work in Azure," Guthrie said. "You submit jobs, we start the VMs, run your tasks, handle any failures, and then shut things down as work completes."
The new Azure Batch SDK is based on the application framework from GreenButton, a New Zealand-based company that Microsoft acquired in May, Guthrie noted. "The Azure Batch SDK makes it easy to cloud-enable parallel, cluster and HPC applications by describing jobs with the required resources, data and one or more compute tasks," he said. "With job scheduling as a service, Azure developers can focus on using batch computing in their applications and delivering services without needing to build and manage a work queue, scaling resources up and down efficiently, dispatching tasks, and handling failures."
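Stripped of VM provisioning and auto-scaling, the "submit jobs, run tasks, handle failures" loop Guthrie describes is a dispatch-and-retry queue. The sketch below is a toy model of that loop under stated assumptions -- it is not the Azure Batch SDK, and all names are invented:

```python
def run_batch(tasks, worker, max_retries=2):
    """Tiny model of job-scheduling-as-a-service: run each task,
    retry transient failures, and report what succeeded. The real
    service also provisions and scales the compute behind `worker`."""
    results, failed = {}, []
    for task_id, payload in tasks:
        for attempt in range(max_retries + 1):
            try:
                results[task_id] = worker(payload)
                break
            except Exception:
                if attempt == max_retries:
                    failed.append(task_id)   # exhausted retries
    return results, failed

calls = {"n": 0}
def flaky_square(x):
    """Simulated worker that fails twice before succeeding on x == 3."""
    calls["n"] += 1
    if x == 3 and calls["n"] < 3:
        raise RuntimeError("transient failure")
    return x * x

results, failed = run_batch([("a", 3), ("b", 4)], flaky_square)
```

Here the first task survives two transient failures thanks to the retry loop, which is the failure-handling the service takes off the developer's plate.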
Microsoft also said it has made its Azure Automation service generally available. The tool is designed to automate repetitive cloud management tasks that are time consuming and prone to error, the company said. It's designed to use existing PowerShell workflows or IT pros can deploy their own.
Also now generally available is WebJobs, the component of Microsoft Azure Websites designed to simplify the running of programs, services or background tasks on a Web site, according to a post today by Product Marketing Manager Vibhor Kapoor on the Microsoft Azure blog.
"WebJobs inherits all the goodness of Azure Websites -- deployment options, remote debugging capabilities, load balancing and auto-scaling," Kapoor noted. "Jobs can run in one instance, or in all of them. With WebJobs all the building blocks are there to build something amazing or, small background jobs to perform maintenance for a Web site."
Posted by Jeffrey Schwartz on 10/28/2014 at 10:59 AM
Will Microsoft's final TechEd conference this week in Barcelona go out with a bang? We'll have a better sense of that over the next two days as the company reveals the next set of deliverables for the datacenter and the cloud. Microsoft has kept a tight lid on what's planned but we should be on the lookout for info pertaining to the next versions of Windows Server, System Center and Hyper-V, along with how Microsoft sees containers helping advance virtualization and cloud interoperability.
In case you missed it, this week's TechEd, the twice-yearly conference Microsoft has held for nearly two decades, will be the last. Instead, Microsoft said earlier this month it will hold a broader conference for IT pros and developers, called Ignite, in Chicago during the first week of May. Ignite will effectively envelop TechEd and the SharePoint and Exchange conferences.
Given the company's statements about faster release cycles, if officials don't reveal what's planned for the next releases of Windows Server, System Center and the so-called "Cloud OS" tools that enable it to provide an Azure-like infrastructure within the datacenter, partner cloud services and its own public cloud, I'd be quite surprised.
If you caught wind of presentations made last week by CEO Satya Nadella and Scott Guthrie, EVP of Microsoft's cloud and enterprise group, it was clear that besides some noteworthy announcements, they were aimed at priming the pump for future ones. For example, Microsoft announced the Azure Marketplace, where ISV partners can develop virtual images designed to accelerate the use of Azure as a platform. Also revealed was the Azure G-series of virtual machines, powered by the latest Intel Xeon processors, which Guthrie claimed will be the largest VMs available in the public cloud -- at least for now. Guthrie said the new VMs provide twice the memory of the largest Amazon cloud machine.
As Microsoft steps up its moves into containerization with the recent announcement that it's working with Docker to create Docker containers for Windows Server, it will be interesting to hear how that will play into the next release of the server operating system. It will also be interesting to learn to what extent Microsoft will emphasize capabilities in Windows Server and Azure that offer more automation as the company moves to build on the evolving software-defined datacenter.
The opening keynote is tomorrow, when we'll find out how much Microsoft intends to disclose about what's next for its core enterprise datacenter and cloud platforms. I'd be surprised and disappointed if it wasn't substantive.
Posted by Jeffrey Schwartz on 10/27/2014 at 3:17 PM
While almost every part of Microsoft's business faces huge pressure from disruptive technology and competitors, the software that put the company on the map -- Windows -- continues to show it's not going to go quietly into the night. Given Microsoft's surprise report that Surface sales have surged and the company promising new capabilities in the forthcoming release of Windows 10, expectations of the operating system's demise are at least premature and potentially postponed indefinitely.
Despite the debacle of its first Surface rollout two years ago, this year's release of the Surface Pro 3 and its impressive sales performance show that Windows still has a chance to remain relevant despite the overwhelming popularity of iOS and Android among consumers and enterprises. Granted, we now live in a multiplatform world, which is a good thing that's not going to change. The only question still to play out is where Windows will fit in the coming years, and that will be determined by Microsoft making the right moves. Missteps by Apple and Google will play a role as well, of course, but the ball is in Microsoft's court to get Windows right.
Amid yesterday's impressive results for the first quarter of Microsoft's 2015 fiscal year were increases along key lines, including Office 365, enterprise software and the cloud business, and the disclosure of $908 million in revenues for its Surface business. That's more than double what the Surface business brought in a year earlier. This report covers the first full quarter that the new Surface Pro 3 has been on the market. Presuming there wasn't significant channel stuffing, this is promising news for the future of Windows overall.
Indeed, while showing promise, the latest report on Surface sales doesn't mean Windows is out of the woods. Despite the surge in revenues, Microsoft didn't reveal Surface unit sales. And while the company said its Surface business is now showing "positive gross margins" -- a notable milestone given the $900 million charge the company took five quarters ago due to poor device sales -- Microsoft didn't say how profitable the devices are, noted Patrick Moorhead, principal analyst with Moor Insights & Strategy.
Moorhead argued that marketing costs and the expense of building out Microsoft's much-improved global channel and distribution reach likely offset much of those gross margins. "I can say with 99 percent confidence they are losing money on Surface still," he said. "That may not be bad for two reasons. They need to demonstrate Windows 8 can provide a good experience and second of all it puts additional pressure on traditional OEMs that they need to be doing a better job than what they do."
Also worth noting: the $908 million in Surface revenues was about 17 percent of the $5.3 billion Apple took in for iPads during the same period (revenues for Macintoshes, which are in many ways more comparable to the Surface Pro 3, were $6.6 billion, Apple said). Apple's iPads, which often displace PCs for many tasks, are also hugely profitable, though ironically sales of the tablets have declined for the past three quarters amid the sudden surge in Surface sales. Naturally the devices also have different capabilities, but the point is to underscore the positive signs the growth of Surface portends for the future of Windows.
Moorhead said the current quarter and notably holiday sales of all Windows devices, led by an expected onslaught of dirt-cheap Windows tablets (possibly as low as $99), could be an inflection point, though he warned that Microsoft will need to execute. "If Microsoft continues to operate the way they are operating, they will continue to lose considerable consumer relevance," he said. "If during the holidays, they make news and sell big volumes, I would start to think otherwise."
Key to the quarter's turnaround was the company's expanded global distribution and extended sales to corporations through its channel partners, though that effort is still at a formative stage. Despite his skeptical warning, Moorhead believes Google's failure to displace Windows PCs with large Android devices and Chromebooks gives Microsoft a strong shot at keeping Windows relevant.
"Google had this huge opportunity to bring the pain on Microsoft with larger devices and eat into notebooks," he said. "They never did it. They really blew their opportunity when they had it. While Android may have cleaned up with phones, when you think about it what they did was just blocking Microsoft as opposed to going after Microsoft, which would be in larger form factor devices in the form of Android notebooks and Chromebooks. The apps are designed for 4-inch displays, not a 15-inch display or 17-inch display. And with Chrome, its offline capabilities just came in too slowly and there really aren't a lot of apps. They just added the capability to add real apps."
Meanwhile, Moorhead pointed out that Apple this month has delivered on what Microsoft is aiming to do: provide an experience that lets a user begin a task on say an iPhone and resume that task on an iPad or Mac.
Hence keeping Windows relevant, among other things, may rest on Microsoft's ability to deliver a Windows 10 that can do that and improve on the general lack of apps on the OS, which in the long run would give developers incentive to come back. The promising part of that is the renewed focus on the desktop, Moorhead said. "When they converge the bits in a meaningful way, I think they can hit the long tail because of the way they're doing Windows 10 with Windows apps and the ability to leverage those 300 million units to a 7-inch tablet and a 4-inch phone. I think that is an enticing value proposition for developers."
Given the Surface Pro 3's target audience of business users, it also is a promising signal for the prospects of Windows holding its own in the enterprise. Do you find the new surge in Surface sales, coupled with the design goals of Windows 10, to be encouraging signs for the future of Windows, or do you see it more as one last burst of energy?
Posted by Jeffrey Schwartz on 10/24/2014 at 12:48 PM
Microsoft may be trying to compete with IBM in the emerging market for machine learning-based intelligence, but like many rivals, the two companies, which share a storied past, have plenty of mutual interests even as they tout competing enterprise public clouds. Hence the two are the latest to forge a cloud compatibility partnership.
The companies said today they are working together to ensure some of their respective database and middleware offerings can run on both the IBM Cloud and Microsoft Azure. Coming to Microsoft Azure is IBM's WebSphere Liberty application server platform, MQ middleware and DB2 database. IBM's Pure Application Service will also run on Microsoft Azure the two companies said.
In exchange, Windows Server and SQL Server will work on the IBM Cloud. Both companies are collaborating to provide Microsoft's .NET runtime for IBM Bluemix, the company's new cloud development platform. While the IBM Cloud already has support for Microsoft's Hyper-V, IBM said it will add expanded support for the virtualization platform that's included in Windows Server. It was not immediately clear how they will improve Hyper-V support on the IBM Cloud.
Andrew Brust, a research director at Gigaom Research, said that the IBM Cloud, which is based on the SoftLayer public cloud IBM acquired last year for $2 billion, runs a significant amount of Hyper-V instances. "They explained to me that they have a 'non-trivial' amount of Windows business and that they support Hyper-V VMs," Brust said.
"With that in mind, the announcement makes sense, especially when you consider [Microsoft CEO] Satya's [Nadella] comment on Monday that Azure will 'compose' with other clouds," Brust added. Nadella made the comment Monday while articulating Microsoft's strategy to build Azure into a "hyperscale" cloud. "We are not building our hyperscale cloud in Azure in isolation," Nadella said. "We are building it to compose well with other clouds."
Nadella spelled out recent efforts to do that including last week's announcement that Microsoft is working with Docker to develop Docker containers for Windows Server, its support for native Java via its Oracle partnership (which, like IBM, includes its database and middleware offerings) as well as broad support for other languages including PHP, Python and Node.js. "This is just a subset of the open source as well as other middle-tier frameworks and languages that are supported on Azure," Nadella said at the event.
Most analysts agree that Amazon, Microsoft and Google operate the world's largest cloud infrastructures but with SoftLayer, IBM has a formidable public cloud as well. Both IBM and Microsoft are seeing considerable growth with their respective cloud offerings but have reasonably sized holes to fill as well.
Nadella said Monday that Microsoft has a $4.4 billion cloud business -- still a small fraction of its overall revenues but rapidly growing. For its part, IBM said on its earnings call Monday that its public cloud infrastructure is at a $3.1 billion run rate and its overall cloud business is up 50 percent, though the company's spectacular earnings miss has Wall Street wondering if IBM has failed to move quickly enough. The company's shares have tumbled in recent days and analysts are questioning whether the company needs a reboot similar to the one former CEO Lou Gerstner gave it two decades ago.
"Overall, this looks like a marriage of equals where both stand to gain by working harmoniously together," said Pund-IT Analyst Charles King. Forrester Research Analyst James Staten agreed. "IBM and Microsoft both need each other in this regard so a nice quid pro quo here," he said.
For Microsoft, adding IBM to the mix is just the latest in a spate of cloud partnerships. In addition to its partnership with Oracle last year, Microsoft recently announced a once-unthinkable cloud partnership with Salesforce.com and just tapped Dell to deliver its latest offering, the new Cloud Platform System, which the company describes as an "Azure-consistent cloud in a box" that it will begin offering to customers next month.
It also appears that IBM and Microsoft held back some of their crown jewels in this partnership. There was no mention of IBM's Watson or Big SQL, which is part of its InfoSphere Platform on Hadoop, based on the Hadoop Distributed File System (HDFS). During a briefing last week at Strata + Hadoop World in New York, IBM VP for Big Data Anjul Bhambhri described the recent third release of Big SQL in use with some big insurance companies. "Some of their queries which they were using on Hive, were taking 45 minutes to run," she said. "In Big SQL those kinds of things are 17 rejoins is now less than 5 minutes."
Likewise, the announcement doesn't seem to cover Microsoft's Azure Machine Learning or Azure HDInsight offerings. I checked with both companies and while both are looking into it, there was no response as of this posting. It also wasn't immediately clear when the offerings announced would be available.
Update: A Microsoft spokeswoman responded to some questions posed on the rollout of the services on both companies' cloud. Regarding the availability of IBM's software on Azure: "In the coming weeks, Microsoft Open Technologies, a wholly owned subsidiary of Microsoft, will publish license-included virtual machine images with key IBM software pre-installed," she stated. "Customers can take advantage of these virtual machines to use the included IBM software in a 'pay-per-use' fashion. Effective immediately, IBM has updated its policies to allow customers to bring their own license to Microsoft Azure by installing supported IBM software on a virtual machine in Azure."
As it pertains to using Microsoft's software in the IBM cloud, she noted: "Windows Server and SQL Server are available for use on IBM Cloud effective immediately. IBM will be offering a limited preview of .NET on IBM Cloud in the near future." And regarding plans to offer improved support for Hyper-V in the IBM Cloud: "Hyper-V is ready to run very well on IBM SoftLayer to provide virtualized infrastructure and apps. IBM is expanding its product support for Hyper-V."
Posted by Jeffrey Schwartz on 10/22/2014 at 2:22 PM
The launch today of the new Apple Pay service for users of the newest iPhone and iPad -- and ultimately the Apple Watch -- is a stark reminder that Microsoft has remained largely quiet about its plans to pursue this market when it comes to Windows Phone or through any other channels.
If smartphone-based payments or the ability to pay for goods with other peripherals such as watches does take off in the coming year, it could be the latest reason consumers shun Windows Phone, which despite a growing number of apps, still is way behind the two market leaders.
So if payments become the new killer app for smartphones, is it too late for Microsoft to add it to Windows Phone? The bigger question should be is it too late for Microsoft as a company? Perhaps the simplest way to jump in would be to buy PayPal, the company eBay last month said it will spin off. The problem there is eBay has an estimated market valuation of $65 billion -- too steep even for Microsoft.
If Microsoft still wants to get into e-payment -- which, in addition to boosting Windows Phone, could benefit Microsoft in other ways including its Xbox, Dynamics and Skype businesses, among others -- the company could buy an emerging e-payment company such as Square, which is said to be valued at a still-steep (but more comfortable) $6 billion.
Just as Microsoft's Bill Gates had visions of bringing Windows to smartphones nearly two decades ago, he also foresaw an e-payments market similar to the one now emerging. Gates was reminded of the fact that he described possible e-payment tech in his book, "The Road Ahead," by Bloomberg Television's Erik Schatzker in an interview released Oct. 2.
"Apple Pay is a great example of how a cellphone that identifies its user in a pretty strong way lets you make a transaction that should be very, very inexpensive," Gates said. "The fact that in any application I can buy something, that's fantastic. The fact I don't need a physical card any more -- I just do that transaction and you're going to be quite sure about who it is on the other end -- that is a real contribution. And all the platforms, whether it's Apple's or Google's or Microsoft, you'll see this payment capability get built in. That's built on industry standard protocols, NFC and these companies have all participated in getting those going. Apple will help make sure it gets critical mass for all the devices."
Given his onetime desire to lead Microsoft in offering digital wallet and payment technology, Schatzker asked Gates why Microsoft hasn't entered this market already? "Microsoft has a lot of banks using their technology to do this type of thing," Gates said. "In the mobile devices, the idea that a payment capability and storing the card in a nice secret way, that's going to be there on all the different platforms. Microsoft had a really good vision in this." Gates then subtly reminded Schatzker the point of their interview was to talk about the work of the Bill and Melinda Gates Foundation.
But before shifting back to that topic, Schatzker worked in another couple of questions, notably should Microsoft be a player the way that Apple is looking to become (and Google has) with its digital wallet? "Certainly Microsoft should do as well or better but of all the things that Microsoft needs to do in terms of making people more productive in their work, helping them communicate in new ways, it's a long list of opportunities," he said. "Microsoft has to innovate and taking Office and making it dramatically better would be really high on the list. That's the kind of thing I'm trying to help make sure they move fast on."
For those wishful that Microsoft does have plans in this emerging segment, there's hope. Iain Kennedy last month left Amazon.com where he managed the company's local commerce team to take on the new role of senior director of product management for Microsoft's new commerce platform strategy, according to his LinkedIn profile. Before joining Amazon, Kennedy spent four years at American Express.
Together with Gates' remarks, it's safe to presume that Microsoft isn't ignoring the future of digital payments and e-commerce. One sign is that Microsoft is getting ready to launch a smartwatch within the next few weeks that is focused on fitness. While that doesn't address e-payments, it's certainly a reasonable way to get into the game. According to a Forbes report today, the watch will work with multiple operating systems.
It's unclear what role Windows Phone will play in bringing a payments service to market, but it's looking less likely it will have a starring role. Gates' shift of the conversation to Office suggests that, as a "productivity and platforms" company, Microsoft may not make e-payments a priority. But if the company moves soon, it may not be too late.
Posted by Jeffrey Schwartz on 10/20/2014 at 1:49 PM
Apparently stung by his remarks last week that women shouldn't ask for raises but instead look for "karma," Microsoft CEO Satya Nadella said he is putting controls in place that will require all employees to attend diversity training workshops to ensure not just equal pay for women but opportunities for advancement regardless of gender or race.
Nadella had quickly apologized for his remarks at last week's Hopper Celebration of Women in Computing in Phoenix, but apparently he's now putting his money where his mouth is. Microsoft's HR team reported to Nadella that women in the U.S. last year earned 99.7 percent of what men earned at the same title and rank, according to an e-mail sent to employees Wednesday that was obtained by GeekWire.
"In any given year, any particular group may be slightly above or slightly below 100 percent," he said. "But this obscures an important point: We must ensure not only that everyone receives equal pay for equal work, but that they have the opportunity to do equal work."
Given the attention Microsoft's CEO brought to the issue over the past week, it raises the question: do women in your company earn the same amount as men for the same job title and responsibilities and have the same opportunities for advancement, or is there a clear bias? IT is an industry dominated by men, though educators are trying to convince more women to take up computer science.
Posted by Jeffrey Schwartz on 10/17/2014 at 12:59 PM
Nearly a year after launching its Hadoop-based Azure HDInsight cloud analytics service, Microsoft believes it's a better and broader solution for real-time analytics and predictive analysis than IBM's widely touted Watson. Big Blue this year has begun commercializing its Watson technology, made famous in 2011 when it came out of the research labs to appear and win on the television game show Jeopardy.
Both companies had a large presence at this year's Strata + Hadoop World Conference in New York, attended by 5,000 Big Data geeks. At the Microsoft booth, Eron Kelly, general manager for SQL Server product marketing, highlighted some key improvements to Microsoft's overall Big Data portfolio since last year's release of Azure HDInsight including SQL Server 2014 with support for in-memory processing, PowerBI and the launch in June of Azure Machine Learning. In addition to bolstering the offering, Microsoft showcased Azure ML's ability to perform real-time predictive analytics for the retail chain Pier One.
"I think it's very similar," in terms of the machine learning capabilities of Watson and Azure ML, Kelly said. "We look at our offering as a self-service on the Web solution where you grab a couple of predictive model clips and you're in production. With Watson, you call in the consultants. It's just a difference fundamentally [that] goes to market versus IBM. I think we have a good advantage of getting scale and broad reach."
Not surprisingly, Anjul Bhambhri, vice president of Big Data for IBM's software group disagreed. "There are certain applications which could be very complicated which require consulting to get it right," she said. "There's also a lot of innovation that IBM has brought to market around exploration, visualization and discovery of Big Data which doesn't require any consulting." In addition to Watson, IBM offers its InfoSphere BigInsights for Hadoop and Big SQL offerings.
As it broadens its approach with a new "data culture," Microsoft has come on strong with Azure ML, noting it shares many of the real-time predictive analytics of the new personal assistant in Windows Phone called Cortana. Now Microsoft is looking to further broaden the reach of Azure ML with the launch of a new app store-type marketplace where Microsoft and its partners will offer APIs consisting of predictive models that can plug into Azure Machine Learning.
Kicking off the new marketplace, Joseph Sirosh, Microsoft's corporate VP for information management and machine learning, gave a talk at the Strata + Hadoop conference this morning. "Now's the time for us to try to build the new data science economy," he said in his presentation. "Let's see how we might be able to build that. What do data science and machine learning people do typically? They build analytical models. But can you buy them?"
Sirosh said with Microsoft's new data section of the Azure Marketplace, marketplace developers and IT pros can search for predictive analytics components. It consists of APIs developed both by Microsoft and partners. Among those APIs from Microsoft are Frequently Bought Together, Anomaly Detection, Cluster Manager and Lexicon Sentiment Analysis. Third parties selling their APIs and models include Datafinder, MapMechanics and Versium Analytics.
Microsoft's goal is to build up the marketplace for these data models. "As more of you data scientists publish APIs into that marketplace, that marketplace will become just like other online app stores -- an enormous selection of intelligent APIs. And we all know as data scientists that selection is important," Sirosh said. "Imagine a million APIs appearing in a marketplace and a virtuous cycle like this that us data scientists can tap into."
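Microsoft hasn't published the API contract here, but marketplace-style predictive APIs are typically invoked as web services: the caller posts a JSON document of feature columns and values and receives a score back. The sketch below is a rough, hypothetical illustration of composing such a request -- the endpoint, key header and payload schema are invented for illustration, not a documented Azure Marketplace contract.

```python
import json

# Hypothetical request builder for a marketplace-style scoring API.
# The endpoint URL, bearer-token header and "Inputs" schema are guesses
# made for illustration; consult the actual service docs before use.
def build_scoring_request(api_url, api_key, columns, rows):
    """Return (headers, body) for a JSON scoring call to api_url."""
    headers = {
        "Authorization": "Bearer " + api_key,
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "Inputs": {
            "input1": {"ColumnNames": columns, "Values": rows}
        }
    })
    return headers, body

# Example: asking a "Frequently Bought Together"-style model for
# recommendations (placeholder endpoint and key).
headers, body = build_scoring_request(
    "https://example.invalid/score",  # placeholder endpoint
    "MY-API-KEY",                     # placeholder key
    ["basket_item"], [["garden hose"]],
)
```

Packaging models behind an HTTP contract like this is what lets a marketplace treat them as interchangeable components that developers can discover and plug in.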
Also enabling the real-time predictive analytics support is support for Apache Storm clusters, announced today. Though it's in preview, Kelly said Microsoft is adhering to its SLAs with use of the Apache Storm capability, which enables complex event processing and stream analytics, providing much faster responses to queries.
Microsoft also said it would support the forthcoming Hortonworks Data Platform, which has automatic backup to Azure BLOB storage, Kelly said. "Any Hortonworks customer can back up all their data to an Azure Blob in a real low cost way of storing their data, and similarly once that data is in Azure, it makes it real easy for them to apply some of these machine learning models to it for analysis with Power BI [or other tools]."
Hortonworks is also bringing HDP to Azure Virtual Machines as an Azure certified partner. This will bring Azure HDInsight to customers who want more control over it in an infrastructure-as-a-service model, Kelly said. Azure HDInsight is currently a platform as a service that is managed by Microsoft.
Posted by Jeffrey Schwartz on 10/17/2014 at 9:54 AM
In perhaps its greatest embrace of the open source Linux community to date, Microsoft is teaming up with Docker to develop Docker containers that will run on the next version of Windows Server, the two companies announced today. Currently Docker containers, which are designed to enable application portability using code developed as micro-services, can only run on Linux servers.
While Microsoft has stepped up its efforts to work with Linux and other open source software over the years, this latest surprise move marks a key initiative to help make containers portable among respective Windows Server and Linux server environments and cloud infrastructures. It also underscores a willingness to extend its ties with the open source community as a key contributor to make that happen.
In addition to making applications portable, proponents say containers could someday supersede the traditional virtual machine. Thanks to their lightweight composition, containers can provide the speed and scale needed for next-generation applications and infrastructure components, including applications that make use of Big Data and perform complex computations.
Containers have long existed, particularly in the Linux community and from third parties such as Parallels, but each implementation has been its own. Docker has taken the open source and computing world by storm over the past year since the company, launched less than two years ago, released a container format that became a de facto standard for how applications can move from one platform to another running as micro-services.
Many companies have jumped on the Docker bandwagon in recent months including Amazon, Google, IBM, Red Hat and VMware, among others. Microsoft in May said it would enable Docker containers to run in its Azure infrastructure as a service cloud. The collaboration between Docker and Microsoft was a closely held secret.
Microsoft Azure CTO Mark Russinovich had talked about the company's work with Docker to support its containers in Azure in a panel at the Interop show in New York Sept. 30 and later in an interview. Russinovich alluded to Microsoft's own effort to develop Windows containers, called Drawbridge. Describing it as an internal effort, Russinovich revealed the container technology is in use within the company internally and is now available for customers that run their own machine learning-based code in the Azure service.
"Obviously spinning up a VM for [machine learning] is not acceptable in terms of the experience," Russinovich said during the panel discussion. "We are figuring out how to make that kind of technology available publicly on Windows."
At the time, Russinovich was tight-lipped about Microsoft's work with Docker and the two companies' stealth effort. Russinovich emphasized Microsoft's support for Linux containers on Azure, and when pressed about Drawbridge he described it as a superior container technology, arguing its containers are more secure for deploying micro-services.
As we now know, Microsoft has been working quietly behind the scenes with Docker to enable the Docker Engine, originally architected only to run in a Linux server, to operate with Windows Server as well. The two companies are working together to enable the Docker Engine to work in the next version of Windows Server.
Microsoft is working to enable Docker Engine images for Windows Server that will be available in Docker Hub, an open source repository housing more than 45,000 Docker applications via shared developer communities. As a result, Docker images will be available for both Linux and Windows Server.
Furthermore, the Docker Hub will run in the Microsoft Azure public cloud, accessible via the Azure Gallery and the Azure Management Portal. This will allow cloud developers, including its ISV partners, to access the images. Microsoft also said it will support Docker orchestration APIs, which will let developers and administrators manage applications across both Windows and Linux platforms using common tooling. This will provide portability across different infrastructures, such as from on-premises servers to the cloud. It bears noting that individual containers remain tied to the operating system they are derived from.
The Docker Engine for Windows Server will be part of the Docker open source project where Microsoft said it intends to be an active participant. The result is that developers will now be able to use preconfigured Docker containers in both Linux and Windows environments.
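The "common tooling, OS-specific images" point can be sketched in a few lines: the same command-building logic serves both platforms, and only the image name changes. The image names below are invented for illustration; this is not an official example from either company.

```python
# Sketch of common tooling across platforms: one helper builds the
# `docker run` invocation regardless of whether the image targets
# Linux or Windows Server. Image names are hypothetical.
def docker_run_command(image, name, ports=None):
    """Compose a `docker run` command line for a detached container."""
    parts = ["docker", "run", "-d", "--name", name]
    for host_port, container_port in (ports or []):
        parts += ["-p", f"{host_port}:{container_port}"]
    parts.append(image)
    return " ".join(parts)

# Same tooling, different OS-specific images:
linux_cmd = docker_run_command("ubuntu:14.04", "web-linux", [(8080, 80)])
windows_cmd = docker_run_command("windowsserver-iis", "web-windows", [(8081, 80)])
```

The point of the partnership is that the commands and APIs stay identical while the images themselves remain bound to the kernel they were built for.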
Microsoft is not saying when the technology will appear, noting its timing is in the hands of the open source community, according to Ross Gardler, senior technology evangelist for Microsoft Open Technologies. To what extent Microsoft will share the underlying Windows code is not clear. Nor would Gardler say to what extent, if any, the existing work from Docker will appear in this effort, other than to say the company has gained deep knowledge from the project.
"This announcement is about a partnership of the bringing of Docker to Windows Server to insure we have interoperability between Docker containers," Gardler said. "The underlying implementation of that is not overly important. What is important is the fact that we'll have compatibility in the APIs between the Docker containers on Linux, and the Docker container on Windows."
David Messina, vice president of marketing at Docker, said the collaboration and integration between the two companies on the Docker Hub and the Azure Gallery, will lead to the merging of the best application content from both communities.
"If I'm a developer and I'm trying to build a differentiated application, what I want to focus on is a core service that's going to be unique to my enterprise or my organization and I want to pull in other content that's already there to be components for the application," Messina said. "So you're going to get faster innovation and the ability to focus on core differentiating capabilities and then leveraging investments from everybody else."
In addition to leading to faster development cycles, it appears containers will place less focus on the operating system over time. "It's less about dependencies on the operating system and more about being able to choose the technologies that are most appropriate and execute those on the platform," Microsoft's Gardler said.
Microsoft Azure Corporate VP Jason Zander described the company's reasoning and plan to support Docker in Windows Server and Azure in a blog post. Zander explained how they will work:
Windows Server containers provide applications an isolated, portable and resource controlled operating environment. This isolation enables containerized applications to run without risk of dependencies and environmental configuration affecting the application. By sharing the same kernel and other key system components, containers exhibit rapid startup times and reduced resource overhead. Rapid startup helps in development and testing scenarios and continuous integration environments, while the reduced resource overhead makes them ideal for service-oriented architectures.
The Windows Server container infrastructure allows for sharing, publishing and shipping of containers to anywhere the next wave of Windows Server is running. With this new technology millions of Windows developers familiar with technologies such as .NET, ASP.NET, PowerShell, and more will be able to leverage container technology. No longer will developers have to choose between the advantages of containers and using Windows Server technologies.
IDC analyst Al Hilwa said in an e-mail that Microsoft has taken a significant step toward advancing container technology. "This is a big step for both Microsoft and the Docker technology," he said. "Some of the things I look forward to figuring out is how Docker will perform on Windows and how easy it will be to run or convert Linux Docker apps on Windows."
Posted by Jeffrey Schwartz on 10/15/2014 at 2:10 PM
Satya Nadella's comments suggesting that women shouldn't ask for pay raises or promotions have prompted outrage on social media. But to his credit, he swiftly apologized, saying he didn't mean what he said.
To be sure, Nadella's answer to the question of what advice he'd give women uncomfortable asking for a raise was, indeed, insulting to women. Nadella suggested "karma" was the best way for women to expect a salary increase or career advancement, a comment the male CEO couldn't have made at a worse place: the Grace Hopper Celebration of Women in Computing in Phoenix, where he was interviewed onstage by Maria Klawe, president of Harvey Mudd College in Claremont, Calif. Even more unfortunate, Klawe is a Microsoft board member, one of the people Nadella reports to.
Here's exactly what Nadella said:
"It's not really just asking for the raise but knowing and having faith that the system will give you the right raises as you go along. And I think it might be one of the additional super powers that, quite frankly, women who don't ask for a raise have. Because that's good karma, it will come back, because somebody's going to know that 'that's the kind of person that I want to trust. That's the kind of person that I want to really give more responsibility to,' and in the long-term efficiency, things catch up."
Accentuating his poor choice of words was Klawe's immediate and firm challenge: "This is one of the very few things I disagree with you on," she responded, to rousing applause. But it also gave Klawe an opportunity to tell women how not to repeat a mistake she made in the past.
Klawe explained that she was among those who could easily advocate for someone who works for her but not for herself. Klawe related how she got stiffed on getting fair pay when she took a job as dean of Princeton University's engineering school because she didn't advocate for herself. Instead of finding out how much she was worth, when the university asked Klawe how much she wanted to be paid, she told her boss, who was a woman, "Just pay me what you think is right." Princeton paid her $50,000 less than the going scale for that position, Klawe said.
Now she's learned her lesson and offered the following advice: "Do your homework. Make sure you know what a reasonable salary is if you're being offered a job. Do not be as stupid as I was. Second, roleplay. Sit down with somebody you really trust and practice asking for the salary you deserve."
Certainly, the lack of equal pay and advancement for women has been a problem as long as I can remember. On occasion, high-profile lawsuits, often in the financial services industry, will bring it to the forefront and politicians will address it in their campaign speeches. The IT industry, perhaps even more so than the financial services industry, is dominated by men.
Nadella's apology appeared heartfelt. "I answered that question completely wrong," he said in an e-mail to employees almost immediately after making the remarks. "Without a doubt I wholeheartedly support programs at Microsoft and in the industry that bring more women into technology and close the pay gap. I believe men and women should get equal pay for equal work. And when it comes to career advice on getting a raise when you think it's deserved, Maria's advice was the right advice. If you think you deserve a raise, you should just ask. I said I was looking forward to the Grace Hopper Conference to learn, and I certainly learned a valuable lesson. I look forward to speaking with you at our monthly Q&A next week and am happy to answer any question you have."
Critics may believe Nadella's apology was nothing more than damage control. It's indeed the first major gaffe committed by the new CEO, but I'd take him at his word. Nadella, who has two daughters of his own, has encouraged employees to ask if they feel they deserve a raise. If Nadella's ill-chosen comments do nothing else, they'll elevate the discussion within Microsoft and throughout the IT industry and business world at large.
Of course, actions speak louder than words, and that's where the challenge remains.
Posted by Jeffrey Schwartz on 10/10/2014 at 3:00 PM
In its bid to replace the traditional Windows and client environment with virtual desktops, VMware will release major new upgrades of its VMware Workstation and VMware Player desktop virtualization offerings in December. Both will offer support for the latest software and hardware architectures and cloud services.
The new VMware Workstation 11, the company's complete virtual desktop offering and the company's flagship product launched 15 years ago, is widely used by IT administrators, developers and QA teams. VMware Workstation 11 will support the new Windows 10 Technical Preview for enterprise and commercial IT testers and developers who want to put Microsoft's latest PC operating system through the paces in a virtual desktop environment.
Built with nested virtualization, VMware Workstation can run other hypervisors inside the VM, including Microsoft's Hyper-V and VMware's own vSphere and ESXi. In addition to running the new Windows 10 Technical Preview, VMware Workstation 11 will add support for other operating systems including Windows 2012 R2 for servers, Ubuntu 14.10, RHEL 7, CentOS 7, Fedora 20, Debian 7.6 and more than 200 others, the company said.
Also new in VMware Workstation 11 is support for the most current 64-bit x86 processors including Intel's Haswell (released late last year). VMware claims that based on its own testing, using Haswell's new microprocessor architecture with VMware Workstation 11 will offer up to a 45 percent performance improvement for functions such as encryption and multimedia. It will let IT pros and developers build VMs with up to 16 vCPUs, 8TB virtual disks and up to 64GB of memory. It will also connect to vSphere and the vCloud Air public cloud.
For more mainstream users is the new VMware Player 7. Since it's targeted at everyday users rather than just IT pros and administrators, it has fewer of the bells and whistles, but it gains support for the current Windows 8.1 operating system, as well as offering continued support for Windows XP and Windows 7 in desktop virtual environments. "Our goal is to have zero-base support," said William Myrhang, senior product marketing manager at VMware.
VMware Player 7 adds support for the latest crop of PCs and tablets and will be able to run restricted VMs, which, as the name implies, are secure clients that are encrypted, password restricted and can shut off USB access. VMware said the restricted VMs, which can be built with VMware Workstation 11 or VMware Fusion 7 Pro, run in isolation between host and guest operating systems and can have time limits built in.
Posted by Jeffrey Schwartz on 10/08/2014 at 2:25 PM
Veeam today said it will offer a free Windows endpoint backup and recovery client. The move is a departure from its history of providing replication and backup and recovery software for virtual server environments. However, company officials said the move is not a shift in focus, which will remain targeted on protecting server virtual machines, but rather a recognition that most organizations are not entirely virtual.
The new Veeam Endpoint Backup software will run on Windows 7 and later OSes (though not Windows RT), backing up to an internal or external disk or flash drive (such as a USB drive), and will support a network-attached storage (NAS) share within the Veeam environment. The company will issue a beta in the coming weeks, and the product is due to be officially released sometime next year. The surprise announcement came on the closing day of its inaugural customer and partner conference, VeeamON, held in Las Vegas.
Enterprise Strategy Group Analyst Jason Buffington, who follows the data protection market and has conducted research for Veeam, said offering endpoint client software was unexpected. "At first, I was a little surprised because it didn't seem congruent with that VM-centric approach to things," Buffington said. "But that's another great example of them adding a fringe utility. In this first release, while it's an endpoint solution, primarily, there's no reason you technically couldn't run it on a low-end Windows Server. I'm reasonably confident they are not going to go hog wild into the endpoint protection business. This is just their way to kind of test the code, test customers' willingness for it, as a way to vet that physical feature such that they have even a stronger stranglehold on that midsize org that's backing up everything except a few stragglers."
At a media briefing during the VeeamON conference, company officials emphasized that the company remains focused on its core business of protecting server VMs as it plots its growth toward supporting various cloud environments as backup and disaster recovery targets. Doug Hazelman, Veeam vice president of product strategy, indicated that the endpoint software could be used for various physical servers as well. Hazelman said the company is largely looking to see how customers use the software, which can perform file-level recoveries. He also noted that the endpoint software doesn't require any of the company's other software and vowed it would remain free as a standalone offering.
"We are not targeting this at an enterprise with 50,000 endpoints," Hazelman said. "We want to get it in the hands of the IT pros and typical Veeam customers and see how we can expand this product and see how we can grow it."
Indeed, the VeeamON event was largely held to launch a major new release of its flagship suite, to be called the Data Availability Suite v8. Many call Veeam the fastest-growing provider in its market since the company's launch in 2006. In his opening keynote address in the partner track, CEO Ratmir Timashev said that Veeam is on pace to post $500 million in booked revenue (non-GAAP) and is aiming to double that to $1 billion by 2018.
In an interview following his keynote, Timashev said the company doesn't have near-term plans for an initial public offering (IPO) and insisted the company is not looking to be acquired. "We're not looking to sell the company," he said. "We believe we can grow. We have proven capabilities to find the next hot market and develop a brilliant product. And when you have this capability, you can continue growing, stay profitable and you don't need to sell."
Timashev added that Veeam can reach those fast-growth goals without deviating from its core mission of protecting virtual datacenters. Extending to a new network of cloud providers will be a key enabler, according to Timashev. The new Data Availability Suite v8, set for release next month (he didn't give an exact date), will incorporate a new interface called Cloud Connect that will let customers choose from a growing network of partners who are building cloud-based and hosted backup and disaster recovery services.
The new v8 suite offers a bevy of other features, including what it calls "Explorers" that can now protect Microsoft's Active Directory and SQL Server and provide extended support for Exchange Server and SharePoint. The WAN acceleration introduced in the last release has also been extended to cover replication, and a new feature called Backup IO adds intelligent load balancing.
Posted by Jeffrey Schwartz on 10/08/2014 at 11:30 AM
Hewlett-Packard has for decades resisted calls by Wall Street to break itself into multiple companies, but today it heeded the call. The company said it would split into two separate publicly traded businesses next year. Both will leverage the existing storied brand: the PC and printing business will be called HP Inc., and the infrastructure and cloud businesses HP Enterprise.
Once the split is complete, Meg Whitman will lead the new HP Enterprise as CEO and serve only as chairman of HP Inc. The move comes less than a week after eBay said it would spin off its PayPal business unit into a separately traded company. Ironically, Whitman, HP's current CEO, was the longtime CEO of eBay during the peak of the dotcom bubble, and eBay, too, was recently under pressure from activist investors to spin off PayPal in the belief that both companies would fare better apart.
HP has long resisted calls by Wall Street to split itself into two or more companies. The pressure intensified following the early 2005 departure of CEO Carly Fiorina, whose disputed move to acquire Compaq remains controversial to this day. The company had strongly considered selling off or divesting its PC and printing businesses under previous CEO Leo Apotheker. When he was abruptly dismissed after just 11 months and Whitman took over, she continued the review but ultimately decided "One HP" would make it a stronger company.
In deciding not to divest back in 2011, Whitman argued that remaining together gave HP more scale and put it in a stronger position to compete. For instance, she argued HP was the largest buyer of CPUs, memory, drives and other components.
Now she's arguing that the market has changed profoundly. "We think this is the best alternative," she told CNBC's David Faber in an interview this morning. "The market has changed dramatically in terms of speed. We are in a position to position these two companies for growth."
Even though the question of HP divesting businesses has come up repeatedly over the years, today's news was unexpected, and it comes after the company was rumored to be eyeing an acquisition of storage giant EMC and, earlier, cloud provider Rackspace. It's not clear how serious those talks, if there were any, were.
HP's decision to become smaller rather than larger reflects the growing pressure on large companies to become more competitive against nimbler rivals. Many of the large IT giants have faced similar pressure: Wall Street has been pushing EMC to split itself from VMware, IBM last week completed the sale of its industry-standard server business to Lenovo, and Dell, led by its founder and CEO Michael Dell, has become a private company.
Microsoft has also faced pressure to split itself up over the years, dating back to the U.S. government's antitrust case. Investors have continued to push Microsoft to consider splitting off or selling its gaming business, and potentially its devices business, since former CEO Steve Ballmer announced he was stepping down last year. The company's now-controversial move to acquire Nokia's handset business for $7.2 billion and the selection of insider Satya Nadella as its new CEO have made that appear less likely. Nadella has said that Microsoft has no plans to divest any of its businesses. But HP's move shows how things can change.
Despite its own "One Microsoft" model, analysts will surely step up the pressure for Microsoft to consider its options. Yet Microsoft may have a better argument that it should keep its businesses intact, with the exception of perhaps Nokia if it becomes a drag on the company.
But as Whitman pointed out, "before a few months ago, we weren't positioned to do this. Now the time is right." And that's why never-say-never is the operative term in the IT industry.
Posted by Jeffrey Schwartz on 10/06/2014 at 11:42 AM
Microsoft may have succeeded in throwing a curve ball at the world by not naming the next version of its operating system Windows 9. But as William Shakespeare famously wrote in Romeo and Juliet, "A rose by any other name would smell as sweet." In other words, if Windows 10 is a stinker, it won't matter what Microsoft calls it.
In his First Look at the preview, Brien Posey wondered if trying to come up to speed with Apple's OS X had anything to do with the choice of name -- a theory many others quickly echoed while wondering what Microsoft is up to (the company apparently didn't say why it chose Windows 10). Perhaps Microsoft is trying to appeal to the many Windows XP loyalists?
The name notwithstanding, I too downloaded the Windows 10 Preview, a process that was relatively simple. Posey's review encapsulates Microsoft's progress in unifying the modern interface with the desktop and gives his thoughts on where Microsoft needs to move forward before bringing Windows 10 to market. One thing he didn't touch upon is the status of the Charms feature. Introduced in Windows 8, Charms were intended to provide shortcuts for managing your device.
In the preview, the Charms are no longer accessible with a mouse, only with touch. If you got used to invoking the Charms with a mouse, you'll have to readjust to using the Start button again, even as touch users can continue to use the Charms with their fingers. Would you like to see Microsoft make the Charms available when using a mouse? What would be the downside?
Meanwhile, have you downloaded the Windows 10 Preview? Keep in mind, this is just the first preview, and Microsoft is looking for user feedback to decide which features make the final cut as it refines Windows 10. We'd love to hear your thoughts on where you'd like Microsoft to refine Windows 10 so that it doesn't become a stinker.
Posted by Jeffrey Schwartz on 10/03/2014 at 12:35 PM
As the capabilities of virtual machines reach their outer limits in the quest to build cloud-based software-defined datacenters, containers are quickly emerging as their potential successor. Though containers have long existed, notably in Linux, the rise of the Docker open source container has created a standard for building portable applications in the form of micro-services. As they become more mature, containers promise portability, automation, orchestration and scalability of applications across clouds and virtual machines.
Since Docker was released as an open source container for Linux, just about every major company has announced support for it in its operating systems, virtual machines or cloud platforms, including IBM, Google, Red Hat, VMware and even Microsoft, which in May said it would support Linux-based Docker containers in the infrastructure-as-a-service (IaaS) component of its Azure cloud service. Docker is not available in Microsoft's platform as a service (PaaS) because the PaaS doesn't yet support Linux, though it appears to be only a matter of time before that happens.
"We're thinking about it," said Mark Russinovich, who Microsoft last month officially named CTO of its Azure cloud. "We hear customers want Linux on PaaS on Azure."
Russinovich confirmed that Microsoft is looking to commercialize its own container technology, code-named "Drawbridge," a library OS effort kicked off in 2008 by Microsoft Research Partner Manager Galen Hunt, who in 2011 detailed a working prototype of a Windows 7 library operating system that ran then-current releases of Excel, PowerPoint and Internet Explorer. In the desktop prototype, Microsoft said the securely isolated library operating system instances worked via the reuse of networking protocols. In a keynote address at the August TechMentor conference (which, like Redmond magazine, is produced by 1105 Media) on the Microsoft campus, Redmond magazine columnist Don Jones told attendees about the effort and questioned its future.
During a panel discussion at the Interop conference in New York yesterday, Russinovich acknowledged Drawbridge is alive and well. While he couldn't speak to plans for the Windows client, he also stopped short of saying Microsoft plans to include it in Windows Server and Hyper-V. But he left little doubt that it's in the pipeline for Windows Server and Azure. Russinovich said Microsoft has already used the Drawbridge container technology in its new Azure-based machine learning technology.
"Obviously spinning up a VM for them is not acceptable in terms of the experience," Russinovich said. "So we built with the help of Microsoft Research our own secure container technology, called Drawbridge. That's what we used internally. We are figuring out how to make that kind of technology available publicly on Windows." Russinovich wouldn't say whether it will be discussed at the TechEd conference in Barcelona later this month.
Sam Ramji, who left his role as leader of Microsoft's emerging open source and Linux strategy five years ago, heard about Drawbridge for the first time in yesterday's session. In an interview he argued that if Windows Server is going to remain competitive with Linux, it needs to have its own containers. "It's a must-have," said Ramji, who is now VP of strategy at Apigee, a provider of cloud-based APIs. "If they don't have a container in the next 12 months, I think they will probably lose market share."
Despite Microsoft's caginess on its commercial plans for Drawbridge and containers, reading between the lines it appears they're a priority for the Azure team. While talking up Microsoft's support for Docker containers for Linux, Russinovich seemed to position Drawbridge as a superior container technology, arguing its containers are more secure for deploying micro-services.
"In a multi-tenant environment you're letting untrusted code from who knows where run on a platform and you need a security boundary around that," Russinovich said. "Most cloud platforms use the virtual machines as a security boundary. With a smaller, letter-grade secure container, we can make the deployment of that much more efficient. That's where Drawbridge comes into play."
Ramji agreed that the ability to provide secure micro-services is a key differentiator between the open source Docker and Drawbridge. "It's going to make bigger promises for security, especially for third-party untrusted code," Ramji said.
Asked if cloud platforms like the Red Hat-led open source OpenShift PaaS can make containers more secure, Krishnan Subramanian argued that's not their role. "They are not there to make containers more secure. Their role is for the orchestration side of things," Subramanian said. "Security comes with the underlying operating system that the container uses. If they're going to use one of those operating systems in the industry that are not enterprise ready, probably they're not secure."
Russinovich said customers do want to see Windows-based containers. Is that the case for you? How do you see them playing in your infrastructure, and how imperative is it that they arrive sooner rather than later?
Posted by Jeffrey Schwartz on 10/01/2014 at 2:25 PM
Microsoft Research today opened its new online Prediction Lab, a move it said aims to reinvent the way polls and surveys are conducted. The new lab, open to anyone in the format of a game, seeks to provide more accurate predictions than today's surveys can.
Led by David Rothschild, an economist at Microsoft Research and a fellow at the Applied Statistics Center at Columbia University, the Prediction Lab boasts a credible track record even prior to its launch. In one example released today, the lab predicted an 84 percent chance that Scottish voters would reject independence in the referendum held to decide whether Scotland should secede from the United Kingdom.
The predictions, published in a blog post called "A Data-Driven Crystal Ball," also included the winners of all 15 World Cup knockout games this year. And in the 2012 presidential election, the lab got the Obama versus Romney results right in 50 of 51 jurisdictions (the 50 states plus Washington, D.C.). The new interactive platform, now released to the public, aims to gather more data and sentiment from the general population.
"We're building an infrastructure that's incredibly scalable, so we can be answering questions along a massive continuum," Rothschild said in the blog post, where he described the Prediction Lab as "a great laboratory for researchers [and] a very socialized experience" for those who participate.
"By really reinventing survey research, we feel that we can open it up to a whole new realm of questions that, previously, people used to say you can only use a model for," Rothschild added. "From whom you survey to the questions you ask to the aggregation method that you utilize to the incentive structure, we see places to innovate. We're trying to be extremely disruptive."
Rothschild also explained why traditional polling technology is outdated and why new methods like the Prediction Lab need to be researched in the era of big data. "First, I firmly believe the standard polling will reach a point where the response rate and the coverage is so low that something bad will happen. Then, the standard polling technology will be completely destroyed, so it is prudent to invest in alternative methods. Second, even if nothing ever happened to standard polling, nonprobability polling data will unlock market intelligence for us that no standard polling could ever provide. Ultimately, we will be able to gather data so quickly that the idea of a decision-maker waiting a few weeks for a poll will seem crazy."
Microsoft is hoping to keep participants engaged with its game-like polling technique, where participants can win or lose points based on making an accurate prediction (if you're wrong, you lose points). This week's "challenge" looks to predict whether President Obama will name a new attorney general before Oct. 5. The second question asks if the number of U.S. states recognizing gay marriages will change next week and the final poll asks if there will be American active combat soldiers in Syria by Oct. 5.
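Microsoft hasn't published how the Prediction Lab actually computes point awards, but the mechanic described above (you gain points for an accurate prediction and lose points for a wrong one) can be sketched with a hypothetical scoring function. The `score` function, its parameters and the multiplier formula below are all illustrative assumptions, not the lab's real rules:

```python
# Hypothetical point-scoring sketch for a prediction game of the kind
# described (NOT Microsoft's actual scoring rules): a player stakes points
# on a yes/no question at a stated confidence, gaining more for confident
# correct calls and losing more for confident wrong ones.

def score(stake: int, confidence: float, predicted: bool, outcome: bool) -> int:
    """Return the point change for one prediction.

    stake      -- points the player risks
    confidence -- player's probability (0.5 to 1.0) that the call is right
    predicted  -- the player's yes/no call
    outcome    -- what actually happened
    """
    if not 0.5 <= confidence <= 1.0:
        raise ValueError("confidence must be between 0.5 and 1.0")
    multiplier = 2 * confidence - 1      # 0 at a coin flip, 1 at certainty
    delta = round(stake * multiplier)
    return delta if predicted == outcome else -delta

# A confident correct call earns points; the same call, wrong, loses them.
print(score(100, 0.84, True, True))   # -> 68
print(score(100, 0.84, True, False))  # -> -68
```

A scheme like this rewards calibration rather than mere boldness: claiming near-certainty pays off only when the player is actually right, which is one plausible way such a game could keep participants honest about their confidence.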
Whether the Microsoft Prediction Lab will gain the status of more popular surveys such as the Gallup polls remains to be seen. But the work in Microsoft Research shows an interesting use of applied quantitative research. Though Microsoft didn't outline plans to extend the Prediction Lab, perhaps some of its technology will have implications for the company's offerings such as Cortana, Bing and even Delve, the new Office Graph technology formerly code-named "Oslo" for SharePoint and Office 365. Now in preview, Delve is built on Microsoft's FAST enterprise search technology and is designed to work across Office 365 app silos.
Posted by Jeffrey Schwartz on 09/29/2014 at 11:54 AM
Rackspace, which over the past few years tried to transform itself into the leading OpenStack cloud provider, is shifting gears. The large San Antonio-based service provider last week began emphasizing a portfolio of dedicated managed services that let enterprises run their systems and applications on their choice of virtual platforms -- Microsoft's Hyper-V, VMware's ESX or the open source OpenStack platform.
The new Hyper-V-based managed services include, for the first time, the complete System Center 2012 R2 stack, including Windows Server and Storage Spaces. The orchestrated services are available as dedicated single-tenant managed services for testing and are set for general availability in the United States in November, followed by the United Kingdom and Sydney in the first quarter of next year. Insiders had long pushed the company to offer Hyper-V-based managed services, but until recently those calls were met with resistance from leadership that wanted to stick with VMware for hosted services.
Months after Rackspace CEO Lanham Napier stepped down in February and the company retained Morgan Stanley to seek a buyer in May, the company last week said it didn't find a buyer and decided to remain independent, naming Rackspace President Taylor Rhodes as the new CEO. A day later, the company said it's no longer just emphasizing OpenStack. Instead, the company is promoting its best-of-breed approach with Hyper-V, VMware and OpenStack.
I caught up with CTO John Engates at a day-long analyst and customer conference in New York, where he explained the shift. "The vast majority of IT is still done today in customer-run datacenters by IT guys," Engates explained. "The small fraction of what's going to the cloud today is early stage applications and they're built by sometimes a sliver of the IT organization that's sort of on the bleeding edge. But there are a lot of applications that still move to the cloud as datacenters get old, as servers go through refresh cycles. But they won't necessarily be able to go to Amazon's flavor of cloud. They will go to VMware cloud, or a Rackspace-hosted VMware cloud, or a Microsoft-based cloud."
In a way, Rackspace is going back to its roots. The company was born as a hosting provider; a few years ago it decided to base its growth on competing with the large cloud providers by building out its entire cloud infrastructure on OpenStack as an alternative to Amazon Web Services' cloud offerings, with the added benefit of its so-called "fanatical support." Rackspace co-developed OpenStack with NASA as an open source means of offering portable, Amazon-compatible cloud services. While it continued to offer other hosting services, including Microsoft-centric Exchange, Lync and SharePoint managed services, that was a relatively small portion of its business, ran only on VMware and remained in the shadow of Rackspace's OpenStack push.
A report released by 451 Research shows OpenStack, though rapidly growing, accounts for $883 million of the $56 billion market for cloud and managed services. "Rackspace for a long time was seen as part and parcel of the Amazon world with public cloud and they're clearly repositioning themselves around the fastest growing, most important part of the market, which is managed cloud and private cloud," said 451 Research Analyst and Senior VP Michelle Bailey. "They can do cloud with customer support, which is something you don't typically get with the larger public providers. They have guarantees around availability, and they'll sign a business agreement with customers, which is what you'll see from traditional hosting and service providers."
Like others, Bailey said there's growing demand for managed services based on Hyper-V-based single-tenant servers. "With the Microsoft relationship they're able to provide apps," she said. "So you're able to go up the stack. It's not just the infrastructure piece that you're getting with Microsoft but specifically Exchange, SharePoint and SQL Server. These are some of the most commonly used applications in the market and Microsoft has made it very good for their partners to be able to resell those services now. When you get into an app discussion with a customer, it's a completely different discussion."
Jeff DeVerter, general manager for the Microsoft private cloud practice at Rackspace, acknowledged it was a multi-year effort to get corporate buy-in for offering services on Hyper-V and ultimately the Microsoft Cloud OS stack. "I had to convince the senior leadership team at Rackspace that this was the right thing to do," said DeVerter. "It was easier to do this year than it was in previous years because we were still feeling our way from an OpenStack perspective. If you look at the whole stack that is Microsoft's Cloud OS, it really is a very similar thing to what the whole stack of OpenStack is. Rackspace has realized the world is not built on OpenStack because there really are traditional enterprise applications [Exchange, Lync and SharePoint] that don't fit there. They're not written for the OpenStack world."
DeVerter would know: he came to the company six years ago as a SharePoint architect and helped grow the SharePoint, Exchange and ultimately Lync business to $50 million in revenues. Aiding that growth was the February 2012 acquisition of SharePoint 911, whose principals, Shane Young and Todd Klindt, helped make the case for moving those platforms from VMware to Hyper-V.
Posted by Jeffrey Schwartz on 09/26/2014 at 7:46 AM
Within moments of last week's news that Larry Ellison had stepped down as Oracle's CEO to become CTO, social media lit up. Reactions such as "whoa!" and "wow!" preceded every tweet or Facebook post. In reality, it seemed like a superficial change in titles.
For all intents and purposes, the new CEOs, Mark Hurd and Safra Catz, were already running the company, while Ellison had final say in technical strategy. Hence it's primarily business as usual with some new formalities in place. Could it be a precursor to some bombshell in the coming days and weeks? We'll see, but there's nothing obvious to suggest that.
It seems more likely Ellison will fade away from Oracle over time, rather than have a ceremonial departure like Bill Gates did when he left Microsoft in 2008. Don't be surprised if Ellison spends a lot more time on his yacht and on Lanai, the small island he bought in 2012 near Hawaii that he is seeking to make "the first economically viable, 100 percent green community," as reported by The New York Times Magazine this week.
For now, Hurd told The Times that "Larry's not going anywhere." In fact, Hurd hinted that despite directing his wrath at SAP and Salesforce.com over the past decade, Ellison may revisit his old rivalry with Microsoft, where he and Scott McNealy, onetime CEO of Sun Microsystems (ironically now part of Oracle), fought hard and tirelessly to end Microsoft's Windows PC dominance. They did so in lots of public speeches, lawsuits and a strong effort to displace traditional PCs with their network computers, which were ahead of their time. Ellison also did his part in helping spawn the Linux server movement by bringing new versions of the Oracle database to the open source platform first, and only much later to Windows Server.
While Oracle fiercely competed with Microsoft in the database market and in the Java-versus-.NET battle over the past decade, with little left to fight about, Ellison largely focused his ire on IBM, Salesforce and SAP. Nevertheless, Oracle's agreement with Microsoft last year to support Java, the Oracle database and Oracle virtual machines natively on the Microsoft Azure public cloud was intriguing, given how unlikely such a move once seemed.
Now it appears Ellison, who years ago mocked cloud computing, has Azure envy, given that Azure has one of the more built-out PaaS portfolios. Newly promoted CEO Hurd told The Times that Oracle is readying its own platform as a service (PaaS) aimed at competing with the Azure PaaS and SQL Server, and that Ellison will reveal it in a keynote at next week's Oracle OpenWorld conference in San Francisco. Suggested (but not stated) was that Ellison will try to pitch it as optimized for Microsoft's .NET developers.
Oracle's current pact with Microsoft is primarily focused on Azure's infrastructure-as-a-service (IaaS) offerings, not PaaS. Whether Oracle offers a serious alternative to running its wares on IaaS remains to be seen. If Oracle indeed aims to compete with Microsoft on the PaaS front, the offering will likely give existing Oracle customers who would never port their database and Java apps to Azure an alternative cloud.
However Ellison positions this new offering, unless Oracle has covertly been building a few dozen datacenters globally that match the scale and capacity of Amazon, Azure, Google and Salesforce.com -- which would set off a social media firestorm -- it's more likely to look like a better-late-than-never service, well suited to and long awaited by many of Oracle's customers.
Posted by Jeffrey Schwartz on 09/24/2014 at 11:14 AM
Apple today said it has sold 10 million of its new iPhones over the first three days since they arrived in stores and at customers' doorsteps Friday. This exceeds analysts' and the company's forecasts. In the words of CEO Tim Cook, sales of its new iPhone 6 and iPhone 6 Plus models have led to the "best launch ever, shattering all previous sell-through records by a large margin."
Indeed, analysts are noting that the figures are impressive, especially considering the phones haven't yet shipped in China, where there's large demand for them. Whether the initial results will strengthen the iPhone and iOS in a market that has lost leadership share to Android remains to be seen. But the company's unwillingness to deliver a larger phone earlier must have cost it market share among those with pent-up demand for larger smartphones. Most Android phones and Windows Phones are larger than the previous 4-inch iPhone 5s, and the majority of devices these days are even larger than 4.5 inches.
The company didn't break out sales of the bigger iPhone 6 versus the even larger 5.5-inch iPhone 6 Plus, but some believe the latter may have had an edge even though supplies are scarcer. If the sales Apple reported are primarily from existing iPhone users, that will only stabilize its existing share, not extend it. However, as the market share for Windows Phone declines, demand for the iPhone will grow on the back of features such as Apple Pay and the forthcoming Apple Watch that saturate the media. This won't help the market for Microsoft-based phones (and could bring some Android users back to Apple).
It doesn't appear the new Amazon Fire phones, technically Android-based devices, are gaining meaningful share. Meanwhile, BlackBerry is readying its first new phone since the release of BlackBerry 10 last year. Despite minuscule market share, BlackBerry CEO John Chen told The Wall Street Journal that the 4.5-inch display on its new Passport will help make it an enterprise-grade device targeted at productivity. It will also boast a battery that can power the phone for 36 hours and a large antenna designed to provide better reception. In addition to enterprise users, the phone will be targeted at medical professionals.
With the growing move by some to so-called phablets, which the iPhone 6 Plus arguably is (some might say devices over 6 inches better fit that description), these larger devices are also expected to cut into sales of 7-inch tablets. In Apple's case, that includes the iPad mini. But given the iPad mini's price and the fact that not all models have built-in cellular connectivity, the iPhone 6 Plus could bolster Apple more than hurt it.
Despite Microsoft's efforts to talk up improvements to Windows Phone 8.1 and its emphasis on Cortana, it appears the noise from Apple and the coverage surrounding it is all but drowning it out. As the noise from Apple subsides in the coming weeks, Microsoft will need to step up the volume.
Posted by Jeffrey Schwartz on 09/22/2014 at 12:03 PM
Microsoft earlier this week said two more longtime board members are stepping down, and the company has already named their replacements, effective Oct. 1.
Among them are David Marquardt, a venture capitalist who was an early investor in Microsoft, and Dina Dublon, the onetime chief financial officer of J.P. Morgan. Dublon was on Microsoft's audit committee and chaired its compensation committee, The Wall Street Journal noted. The paper also raised an interesting question: Will losing the two board members now, along with others over the past two years, result in a gap in "institutional knowledge"? Marquardt, with his Silicon Valley ties, played a key role in helping Microsoft "get off the ground and is a direct link to the company's earliest days."
The two new board members are Teri List-Stoll, chief financial officer of Kraft Foods, and Visa CEO Charles Scharf, who once ran J.P. Morgan Chase's retail banking and private investment operations. Of course, the moves come just weeks after former CEO Steve Ballmer stepped down from the board.
Nadella will be reporting to a board that has added six new members out of a total of 10 since 2012. That should please Wall Street investors who pined for new blood during last year's search for Ballmer's replacement and didn't want to see an insider get the job.
But the real question is whether these new voices, teamed with Nadella, can strike the balance needed for Microsoft to thrive in the most fiercely competitive market it has faced to date.
Posted by Jeffrey Schwartz on 09/18/2014 at 3:32 PM
IBM's New M5 Servers Include Editions for Hyper-V and SQL Server
In what could be its last major rollout of new x86 systems if the company's January deal to sell its commodity server business to Lenovo for $2.3 billion goes through, IBM launched the new System x M5 line. The new lineup of servers includes systems designed to operate the latest versions of Microsoft's Hyper-V and SQL Server.
IBM said the new M5 line offers improved performance and security. With a number of models for a variety of solution types, the new M5 includes various tower, rack, blade and integrated systems that target everything from small workloads to infrastructure for private clouds, big data and analytic applications. The two systems targeting Microsoft workloads include the new IBM System x Solution for Microsoft Fast Track DW for SQL Server 2014 and an upgraded IBM Flex System Solution for Microsoft Hyper-V.
The IBM System x Solution for Microsoft SQL Data Warehouse on X6 is designed for data warehouse workloads running the new SQL Server 2014, which shipped in April. IBM said it offers rapid response to data queries and scales as workloads increase. Specifically, the new systems are powered by Intel Xeon E7-4800/8800 v2 processors. IBM said they offer 100 percent faster database performance than the prior release, with three times the memory capacity and a third of the latency of PCIe-based flash. The systems can also place up to 12TB of flash memory-channel storage near the processor.
As a Microsoft Fast Track partner, IBM also added its IBM System x3850 X6 Solution for Microsoft Hyper-V. Targeting business-critical systems, the two-node configuration uses Microsoft Failover Clustering, aimed at eliminating any single point of failure, according to IBM's reference architecture. The Hyper-V role is installed on each clustered server, which hosts the virtual machines.
Unitrends Adds Reporting Tools To Monitor Capacity and Storage Inventory
Backup and recovery appliance supplier Unitrends has added new tools that track storage inventory and capacity, designed to help administrators more accurately gauge their system requirements in order to lower costs.
The set of free tools provides views of storage, file capacity and utilization. Unitrends said the tools give administrators single point-in-time snapshots of the storage and files distributed throughout their datacenters, letting them calculate how much storage to set aside for backups and prioritize which files get backed up based on the capacity available.
There are two separate tools. The Unitrends Backup Capacity Tool provides the snapshot of all files distributed throughout an organization on both storage systems as well as servers, providing file-level views of data to plan backups. It provides reports for planning, while outlining file usage. The other, the Unitrends Storage Inventory Tool, provides a view of an organization's entire storage infrastructure, which the company says offers detailed inventories of storage assets in use and where there are risks of exceeding available capacity.
These new tools follow July's release of BC/DR Link, a free online tool that helps organizations create global disaster recovery plans. It includes 1GB of centralized storage in the Unitrends cloud, where customers can store critical documents.
CloudLink Adds Microsoft BitLocker To Secure Workloads Running in Amazon
Security vendor CloudLink's SecureVM, which provides Microsoft BitLocker encryption, has added Amazon Web Services to its list of supported cloud platforms. The tool lets customers implement the native Windows encryption to their virtual machines -- both desktop and servers -- in the Amazon cloud. In addition to Amazon, it works in Microsoft Azure and VMware vCloud Air, among other public clouds.
Because virtual and cloud environments naturally can't use the BitLocker encryption keys typically stored in TPM or USB hardware, SecureVM emulates that functionality by providing centralized management of the keys. It also lets customers encrypt their VMs outside of AWS.
Customers can start their VMs only in the intended environment; the tool is policy based to ensure the VMs launch only when authorized. It also provides an audit trail tracking when VMs are launched, along with other information such as IP addresses, hosts and operating systems.
Data volumes attached to an instance are encrypted, and customers can encrypt additional data volumes as well, according to the company. Enterprises maintain control of encryption key management, including the option to store keys in-house.
Posted by Jeffrey Schwartz on 09/18/2014 at 12:09 PM
As the IT industry looks at the future of the virtual machine, containers have jumped out as the next big thing, and every key player with an interest in the future of the datacenter is rallying around Silicon Valley startup Docker. That includes IBM, Google, Red Hat, VMware and even Microsoft. Whether it is cause or effect, big money is following Docker as well.
Today Sequoia Capital pumped $40 million into Docker in a Series C round, bringing the company's total funding to $66 million and its estimated valuation to $400 million. Early investors in Docker include Benchmark, Greylock Partners, Insight Ventures, Trinity Ventures and Yahoo cofounder Jerry Yang. Docker aims to move beyond the traditional virtual machine with its open source platform for building, shipping and running distributed applications.
As Docker puts it, the limitation of virtual machines is that they include not only the application, but the required binaries, libraries and an entire guest OS, which could weigh tens of gigabytes, compared with just a small number of megabytes for the actual app. By comparison, the Docker Engine container consists of just the application and its dependencies.
"It runs as an isolated process in user space on the host operating system, sharing the kernel with other containers," according to the company's description. "Thus, it enjoys the resource isolation and allocation benefits of VMs but is much more portable and efficient."
With the release of Docker 1.0 in June, Microsoft announced support for the Linux-based containers by updating the command-line interface in Azure, allowing customers to build and deploy Docker-based containers in Azure. Microsoft also said customers can manage the virtual machines with the Docker client. As reported here by John Waters, Microsoft's Corey Sanders, manager of the Azure compute runtime, demonstrated this capability at DockerCon at the time.
On the Microsoft Open Technologies blog, Evangelist Ross Gardler outlined how to set up and use Docker on the Azure cloud service. According to Gardler, common use cases for Docker include:
- Automating the packaging and deployment of applications
- Creation of lightweight, private PaaS environments
- Automated testing and continuous integration/deployment
- Deploying and scaling web apps, databases and backend services
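The use cases above all build on the same packaging model: an image that carries only the app and its dependencies, sharing the host kernel rather than bundling a guest OS. A minimal Dockerfile sketches the idea; the base image, file name and dependency here are hypothetical examples, not drawn from Gardler's post:

```dockerfile
# Hypothetical sketch: the image layers in only the app and its
# dependencies; the host kernel is shared, so no guest OS is bundled.
FROM python:2.7
COPY app.py /app/app.py
RUN pip install flask
CMD ["python", "/app/app.py"]
```

Built with `docker build` and launched with `docker run`, the resulting container adds only megabytes of app-specific content on top of a shared base image, rather than the tens of gigabytes of a full virtual machine.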
At VMworld last month, VMware talked up its support for Docker, saying it has teamed with the company, joined by its sister company Pivotal as well as Google, to enable their collective enterprise customers to run and manage apps in containers in public, private and hybrid cloud scenarios as well as on existing VMware infrastructure.
Containers are a technology to watch, whether you're a developer or an IT pro. The entire IT industry has embraced them (at least publicly) as the next generation of virtual infrastructure. And for now, Docker seems to be setting the agenda for this new technology.
Posted by Jeffrey Schwartz on 09/16/2014 at 11:57 AM
Apple today said preorders of its new iPhone 6 and its larger sibling, the 6 Plus, totaled a record 4 million in the first 24 hours, which doubled the preorders that the iPhone 5 received two years ago. But those numbers may suggest a rosier outlook than they actually portend.
Since Apple didn't release similar figures for last year's iPhone 5s, which was for the most part an incremental upgrade over its then year-old predecessor (alongside the lower-end 5c), the numbers suggest customers have been sitting on aging iPhones. That includes earlier models now reaching the point of sluggishness as iOS upgrades tax their slower processors, devices that can also only run on 3G networks.
Perhaps boosting demand was the fact that Apple for the first time offered an attractive promotion -- it will offer a $200 trade-in for an older iPhone if it's in working condition. For now, that offer is only good through this Friday but it wouldn't be surprising if Apple extended it or reintroduced it through the holiday season.
Also, while visiting a local Verizon-owned and -operated store on Saturday, I noticed a few customers interested in swapping their larger Android phones for the new 6 Plus. But most of the orders Apple reported must have come from online, because the vast majority of customers in the store were looking at Android-based phones. Those wanting a new iPhone, especially the larger one, will have to wait a month or more, although Apple always seems to play up those supply shortages early on when releasing new devices.
Many customers these days are less loyal to any one phone platform and are willing to switch if another has hardware specs that meet their needs -- perhaps the size, the camera or even the design. For example, I observed one woman who wanted to replace her damaged iPhone with an iPhone 6 Plus, but when the rep told her she'd have to wait a few weeks, she said she'd just take an Android phone since she needed a new one right away. I saw another man warning his son that if he switched to an Android phone, he'd lose all his iOS apps. The teenager was unfazed and also bought an Android phone.
Meanwhile, no one was looking at the Nokia Lumia 928s or Icons, and the store employees told me they sell few Windows Phones, which they estimated account for less than 5 percent of phone sales. Perhaps that will change following Microsoft's deal to acquire Mojang, purveyor of the popular Minecraft game? That's a discussion for my post about today's announcement that Microsoft will acquire Mojang for $2.5 billion.
For those who did purchase a new iPhone, it appears there was more demand for the larger unit with the 5.5-inch display than for its junior counterpart, which measures 4.7 inches (still larger than the earlier 5/5s models). Apple didn't provide a breakdown. If you're an iPhone user, do you plan to upgrade to a newer one or switch to another platform? What size do you find most appealing?
Posted by Jeffrey Schwartz on 09/15/2014 at 12:08 PM
Microsoft has pulled the trigger on a $2.5 billion deal to acquire Mojang, the developer of the popular Minecraft game. Rumors that a deal was in the works surfaced last week, though the price tag was initially said to be $2 billion. It looks like the founders of the 5-year-old startup squeezed another half-billion dollars out of Microsoft over the weekend.
Minecraft is the largest-selling game on Microsoft's popular Xbox platform. At first glance, this move could be a play to make it exclusive to Microsoft's gaming system to keep it out of the hands of the likes of Sony. It could even signal a move to boost its declining Windows Phone business or its Windows PC and tablet software. However, if you listen to Xbox head Phil Spencer and Microsoft's push to support all device platforms, that doesn't appear to be the plan.
"This is a game that has found its audience on touch devices, on phones, on iPads, on console and obviously its true home on PC. Whether you're playing on an Xbox, whether you're playing on a PlayStation, an Android or iOS device, our goal is to continue to evolve with and innovate with Minecraft across all those platforms," Spencer said in a prerecorded announcement on his blog.
If you consider CEO Satya Nadella's proclamation in July that Microsoft is the "productivity and platforms company" and it spent more than double what it cost to acquire enterprise social media company Yammer on Mojang, it may have you wondering how this fits into that focus. In the press release announcing the deal, Nadella stated: "Minecraft is more than a great game franchise -- it is an open world platform, driven by a vibrant community we care deeply about, and rich with new opportunities for that community and for Microsoft."
That could at least hint that the platform and community Mojang's founders created could play a role in UI design that doesn't rely on Windows or even Xbox. Others have speculated that this is a move to make Microsoft's gaming business ripe for a spinoff or sale, something investors want but a move Nadella has indicated he is not looking to make.
"The single biggest digital life category, measured in both time and money spent, in a mobile-first world is gaming," Nadella said in his lengthy July 10 memo, announcing the company's focus moving forward. "We also benefit from many technologies flowing from our gaming efforts into our productivity efforts --core graphics and NUI in Windows, speech recognition in Skype, camera technology in Kinect for Windows, Azure cloud enhancements for GPU simulation and many more. Bottom line, we will continue to innovate and grow our fan base with Xbox while also creating additive business value for Microsoft."
The deal also makes sense from another perspective: Minecraft is hugely popular, especially with younger people -- a demographic critical to the success of any productivity tool or platform. Clearly Nadella is telling the market, and especially critics, that it's not game over for Microsoft.
Posted by Jeffrey Schwartz on 09/15/2014 at 12:24 PM
Hewlett Packard's surprising news that it has agreed to acquire Eucalyptus potentially throws a monkey wrench into Microsoft's recently stepped-up push to enable users to migrate workloads from Amazon Web Services to the Microsoft Azure public cloud.
As I noted last week, Microsoft announced its new Migration Accelerator, which migrates workloads running on the Amazon Web Services cloud to Azure. It's the latest in a push to accelerate its public cloud service, which analysts have recently said is gaining ground.
By acquiring Eucalyptus, HP gains an Amazon-sanctioned tool for running AWS-compatible workloads in private clouds. Eucalyptus signed a compatibility pact with Amazon in March 2012 that lets it use Amazon APIs, including Amazon Machine Images (AMIs), in its open source private cloud software.
The deal, announced late yesterday, also means Eucalyptus CEO Marten Mickos will become general manager of HP's Cloud services and will report to HP Chairman and CEO Meg Whitman. Mickos, the onetime CEO of MySQL, has become a respected figure in infrastructure-as-a-service (IaaS) cloud circles. But the move certainly raised eyebrows.
Investor and consultant Ben Kepes in a Forbes blog post questioned whether Eucalyptus ran out of money and was forced into a fire sale or if the acquisition was a desperate move by HP to give a push to its cloud business. HP has had numerous management and strategy shifts with its cloud business.
"One needs only to look at the employee changes in the HP cloud division." Kepes wrote. "Its main executive, Biri Singh, left last year. Martin Fink has been running the business since then and now Mickos will take over -- he'll apparently be reporting directly to CEO Meg Whitman but whether anything can be achieved given Whitman's broad range of issues to focus on is anyone's guess."
Ironically, on Redmond magazine's sister site Virtualization Review, well-known infrastructure analyst Dan Kusnetzky had just talked with Bill Hilf, senior vice president of product and service management for HP Cloud, who shared his observations on HP's new Helion cloud offering prior to the Eucalyptus deal announcement. Helion is built on OpenStack, though HP also has partnerships with Microsoft, VMware and CloudStack.
"The deal represents HP's recognition of the reality that much of what enterprise developers, teams and lines of business do revolves around Amazon Web Services," said Al Sadowski, a research director at 451 Research.
"HP clearly is trying to differentiate itself from Cisco, Dell and IBM by having its own AWS-compatible approach," Kusnetzky added. "I'm wondering what these players are going to do once Eucalyptus is an HP product. Some are likely to steer clients to OpenStack or Azure as a way to reduce HP's influence in their customer bases."
It also raises questions about the future of Helion, which, despite HP's partnerships, emphasizes OpenStack -- something Mickos has been a longtime and vocal supporter of. "We are seeing the OpenStack Project become one of the largest and fastest growing open source projects in the world today," Mickos was quoted as saying in an HP blog post.
Hmm. According to a research report released by 451 Research, OpenStack accounted for $883 million, or about 13 percent, of IaaS revenues. That figure is forecast to roughly double to $1.7 billion of a projected $10 billion IaaS market in 2016, or 17 percent. Not trivial, but not a huge chunk of the market either.
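As a quick sanity check on those figures (my arithmetic, not 451 Research's), the numbers hold together:

```python
# Back-of-the-envelope check of the 451 Research figures cited above.
openstack_2014 = 0.883   # OpenStack IaaS revenue, $ billions
share_2014 = 0.13        # cited as ~13 percent of IaaS revenue
implied_iaas_2014 = openstack_2014 / share_2014  # implied total market

openstack_2016 = 1.7     # forecast OpenStack revenue, $ billions
iaas_2016 = 10.0         # forecast total IaaS market, $ billions
share_2016 = openstack_2016 / iaas_2016

print(round(implied_iaas_2014, 1))                # implied 2014 IaaS market, ~$6.8B
print(round(share_2016 * 100))                    # ~17 percent
print(round(openstack_2016 / openstack_2014, 1))  # ~1.9x, i.e. roughly double
```

So the implied 2014 IaaS market is just under $7 billion, and the forecast growth from $883 million to $1.7 billion is indeed roughly a doubling.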
HP clearly made this move to counter IBM's apparent headway with its cloud service, even though it appears the three front runners are Amazon, Microsoft and Google. In order to remain a player, HP needs to have compatibility with all three, as well as OpenStack, and acquiring Eucalyptus gives it a boost in offering Amazon compatibility, even if it comes at the expense of its server business, as noted by The New York Times.
In my chats with Mickos over the years, he hadn't ruled out Azure compatibility for Eucalyptus, but he admitted the effort hadn't gone very far when we last spoke, more than a year ago. Time will tell if this becomes a greater priority for both companies.
Regardless, 451 Research's Sadowski indicated that HP's move likely targets IBM more than Microsoft. "HP is hoping to capture more of the enterprise market as organizations make their way beyond Amazon (and Azure) and build out their private and hybrid cloud deployments," he said. "We would guess that in acquiring Eucalyptus, the company is seeking to replicate the story that IBM has built with its SoftLayer buy and simultaneous OpenStack support."
Posted by Jeffrey Schwartz on 09/12/2014 at 12:51 PM
Apple's much-anticipated launch event yesterday "sucked the air out of the room" during the opening keynote session at the annual Tableau Customer conference, taking place in Seattle, as my friend Ellis Booker remarked on Facebook yesterday.
Regardless of how you view Apple, the launch of its larger iPhone 6 and 6 Plus models, along with the new payment service and smartwatch, was hard to ignore. Positioned as an event on par with the launches of the original Macintosh 30 years ago, the iPod, the iPhone and the iPad, the new Apple Watch and Apple Pay made for the largest launch event the company has staged in over four years, arguably with the largest number of new products showcased at a single Apple event. Despite all the hype, it remains to be seen whether it will be remembered as being as disruptive as Apple's prior launches. Yet given the company's history, I wouldn't quickly dismiss the potential in what Apple announced yesterday.
The Apple Watch was the icing on the cake for many fans looking for the company to show it can still create new markets. But critics were quickly disappointed when they learned the new Apple Watch will only work when linked to a new iPhone (or last year's iPhone 5s). Many were surprised to learn that it will be some time before the component circuitry providing all the communications functionality of a phone or tablet can be miniaturized into a watch.
This is not dissimilar to other smartwatches on the market. But this technology is not yet the smartwatch Dick Tracy would get excited about. No one knows if the Apple Watch (or any smartwatch) will be the next revolution in mobile computing or if it will be a flop. And even if it is the next big thing, it isn't a given that Apple, or any other player, will own the market.
Yet Apple does deserve some benefit of the doubt. Remember when the first iPod came out and it only worked with a Mac? Once Apple added Windows compatibility, everything changed. Likewise, when the first iPhone came out in 2007, it carried a price tag of $599. After few sold, Apple quickly cut the price to $399. But it w