Microsoft Has Ambitious Plans To Broaden IoT Deployment

Microsoft has released the technical preview of IoT Central, the company's SaaS offering designed to let organizations roll out IoT-based solutions with minimal configuration or knowledge of the complexities of integrating operational systems with IT.

The company, which announced IoT Central back in April, said customers can use its preconfigured application templates to deploy IoT capabilities within hours and without the need for developers skilled in IoT. The new SaaS offering will complement Microsoft's Azure IoT Suite, a PaaS-based offering that requires more customization and systems integration.

"For Microsoft IoT Central, the skill level [required] is really low," said Sam George, Microsoft's director of Azure IoT, during a press briefing earlier this week. Both IoT Central and the IoT Suite are built upon Azure IoT Hub as a gateway that the company said provides secure connectivity, provisioning and data gathering from IoT-enabled endpoints and devices.

They also both utilize other Azure services such as Machine Learning, Stream Analytics, Time Series Insights and Logic Apps. The Azure IoT Suite is consumption-based, while IoT Central is priced on a subscription model tied to usage, starting at 50 cents per user per month (or at a fixed rate of $500 per month). The company is also offering free trials.

"Up until now IoT has been out of reach for the average business and enterprise," George said. "We think it's time for IoT to be broadly available. There is nothing with rapid development like this from a major cloud vendor on the market."

IoT Central gives each device a unique security key and the service provides a set of device libraries, including Azure IoT device SDKs that support different platforms including Node.js, C/C# and Java. The service offers native support for IoT device connectivity protocols such as MQTT 3.1.1, HTTP 1.1 and AMQP 1.0. The company claims IoT Central can scale to millions of connected devices and millions of events per second gathered through the Azure IoT cloud gateway and stored in its time-series storage.
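
As a rough illustration of the device-side pattern those SDKs and protocols enable, the sketch below uses the Azure IoT device SDK for Python (not among the languages Microsoft listed, but it follows the same model) to send device-to-cloud telemetry over the SDK's default MQTT transport. The connection string is a hypothetical placeholder; IoT Central normally provisions per-device credentials.

```python
import json
import time

# Sketch of device-to-cloud telemetry with the Azure IoT device SDK for
# Python (pip install azure-iot-device). The connection string below is a
# hypothetical placeholder; the SDK speaks MQTT to the IoT Hub-based
# gateway by default.
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

for _ in range(3):
    # Send a small JSON telemetry payload every few seconds.
    telemetry = {"temperature": 21.5, "humidity": 43.0}
    client.send_message(Message(json.dumps(telemetry)))
    time.sleep(5)

client.disconnect()
```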

The number of IoT-based connected "things" is forecast at 8.4 billion for this year and on pace to reach 20 billion by 2020, according to Gartner. But ITIC Principal Analyst Laura DiDio warns that most customers are still in the early stages of true IoT deployments, though she said they are on pace to ramp up rapidly over the next two to three years. "Scalability will be imperative," DiDio said. "Corporations will require their devices, applications and networks to grow commensurately with their business needs."

Posted by Jeffrey Schwartz on 12/06/2017 at 11:54 AM


Breaking Up with SCCM Is Hard To Do

It's no secret that Microsoft wants enterprises to migrate all their PC users to Windows 10 as a service and to move to its new modern approach to configuring, securing and managing those systems and the applications associated with them. This year's launch of Microsoft 365 -- a subscription service that bundles Windows 10 licenses, Office 365 and the Enterprise Mobility + Security (EMS) service -- is the strongest sign yet that the company is pushing IT pros away from the traditional approach of imaging and managing PCs with System Center Configuration Manager (SCCM) in favor of Microsoft Intune in EMS.

While many IT pros have embraced the new modern systems management model, others are bemoaning it and quite a few remain unsure, according to a spot survey of Redmond readers over the weekend.

Nearly half, or 48 percent, of Redmond subscribers who are planning Windows 10 migrations said they intend to continue using SCCM to configure, deploy and manage those systems, according to 201 respondents to an online poll. Yet 31 percent are undecided and only 4 percent have decided on Microsoft's EMS offering. And while 9 percent will use a third-party MDM/EMM/MAM offering, a near-equal share will implement a mixture of the aforementioned options.

While 19 percent responded that they plan to use Microsoft 365 where it makes sense, only 10 percent plan to use it enterprisewide. A formidable number, 42 percent, said their organization has no plans to use Microsoft 365, while 28 percent are undecided.

As Intune takes on more automated deployment capabilities, organizations upgrading to Windows 10 -- which they must do by 2020 -- may find SCCM becoming less essential in a growing number of scenarios. Brad Anderson, Microsoft's corporate vice president for enterprise client mobility, drove that point home during a keynote session at the recent Ignite conference in Orlando. "One of the big things about modern management is we are encouraging you to move away from imaging," Anderson said. "Stop maintaining those images and all of the libraries and drivers and let's move to a model where we can automatically provision you from the cloud."

Anderson estimated that SCCM now manages 75 percent of all PCs (hundreds of millions) and continues to grow by 1 million users per week, which underscores that this shift away from SCCM won't happen quickly. Microsoft continues to upgrade SCCM, moving from its traditional release cycle of every two to three years to three releases per year under the current branch model, with new test builds issued to insiders every month. Microsoft also now offers an SCCM co-management bridge and PowerShell scripts for Intune.

Organizations have various decisions to make, and there are still many moving parts. That is why, even though 600 million systems now run Windows 10, many organizations still haven't migrated or are only in the early stages of doing so.

Join MVP and longtime Redmond contributor Greg Shields, Enterprise Strategy Group Senior Analyst Mark Bowker and me tomorrow at 11 a.m. PT for a Redmond webinar: Microsoft 365 for Modern Workplace Management: Considerations for Moving to a Post-SCCM World. You can sign up here.

Posted by Jeffrey Schwartz on 12/04/2017 at 1:10 PM


Box Platform Now Available in Microsoft Azure

The Box collaboration and enterprise content management (ECM) service is now available in Microsoft's Azure public cloud, the latest in a series of integrations between the two companies in recent years. Box and Microsoft, which are also competitors with overlapping collaboration capabilities, found it in their respective best interests two years ago to work together, starting with basic Office 365 integration.

Both companies extended their partnership in June and elaborated on their roadmap at last month's BoxWorks 2017 conference. Box had indicated then that the first new capability would arrive when it began offering its service across Microsoft's global Azure regions. The service Box made available last week allows its customers to use Azure as the primary storage for Box content, Sanjay Manchanda said in an interview.

"It's Box content management capabilities with the content being stored in Azure," he said. Box historically has run its own datacenter operations throughout the world, but decided its best route to scaling would be to partner with the large global cloud providers. In addition to its arrangement with Microsoft, Box has partnerships with Amazon Web Services (AWS) and IBM. Manchanda, who worked at Microsoft for more than 10 years before joining Box, said the partnership with his former employer is similar to its relationship with IBM, where both include technical and comarketing pacts.

Manchanda noted that giving Box customers access to its service in Azure will simplify cross-organization collaboration among employees and their external partners, suppliers and customers. It will also provide secure content management by tapping Box's integrations with more than 1,400 SaaS-based applications, including those offered via Office 365, and will allow organizations to do the same when building custom applications.

The roadmap calls for Box to use Microsoft's Azure Key Vault service, which lets organizations bring and manage their own encryption keys. Box offers its own encryption key management service, Box KeySafe, which uses a hardware security module (HSM), but that capability won't carry over to the new service. "We plan to use the service that Azure Key Vault provides," he said.
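
For readers unfamiliar with the bring-your-own-key model Manchanda is describing, here is a minimal sketch of how an application can use Azure Key Vault to wrap a data-encryption key with a customer-managed key. This is not Box's integration; the vault URL and key name are hypothetical, and it assumes the azure-identity and azure-keyvault-keys Python packages.

```python
import os

# Illustrative only: wrap a locally generated data-encryption key with a
# customer-managed key held in Azure Key Vault. The vault URL and key name
# are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient
from azure.keyvault.keys.crypto import CryptographyClient, KeyWrapAlgorithm

credential = DefaultAzureCredential()
key_client = KeyClient(vault_url="https://example-vault.vault.azure.net", credential=credential)
kek = key_client.get_key("content-key-encryption-key")  # key-encryption key in the vault

crypto = CryptographyClient(kek, credential=credential)
data_key = os.urandom(32)  # symmetric key that would encrypt the actual content
wrapped = crypto.wrap_key(KeyWrapAlgorithm.rsa_oaep, data_key)
print(f"Wrapped key length: {len(wrapped.encrypted_key)} bytes")
```

The point of the pattern is that the key-encryption key never leaves the vault; the application only ever handles wrapped keys.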

Box also plans, as part of its roadmap, to integrate Microsoft Cognitive Services with its service, allowing customers to automatically identify content, categorize it, run workflows and make it easier for users to find information.

The service is now available in the United States, and Box intends to roll it out across Microsoft's 40 Azure regions, which will allow customers with data sovereignty requirements to ensure their data doesn't leave the confines of a specific country or locale, Manchanda explained. Asked whether Box plans to support Azure Stack, either deployed on a customer's premises or via a third-party managed services provider, Manchanda said that isn't part of the current roadmap, but he didn't rule it out if there's customer demand.

Posted by Jeffrey Schwartz on 11/29/2017 at 1:47 PM


McAfee To Acquire Leading CASB Provider Skyhigh Networks

McAfee is filling a void in its security portfolio with its plan to acquire leading cloud access security broker (CASB) provider Skyhigh Networks. The deal, announced today for an undisclosed amount, will give McAfee one of the most highly regarded CASB offerings as the company looks to join the fray of vendors blending cloud security, network protection and endpoint management.

Prior to spinning off from Intel earlier this year, McAfee determined it needed to focus on two key threat protection and defense control points: the endpoint and the cloud, a strategy Symantec similarly embraced last year with its $4.6 billion acquisition of Blue Coat. Cisco also came to that conclusion last year with its acquisition of CloudLock, which it is integrating with its OpenDNS, Talos threat analytics and existing firewall and data loss prevention (DLP) offerings.

Microsoft jumped into the CASB arena two years ago with the acquisition of Adallom, which was relaunched last year as Cloud App Security and refreshed in September at the Ignite conference with support for conditional access. It's now offered as an option with Microsoft's Enterprise Mobility + Security (EMS) service. Other popular providers of CASB tools include BitGlass, CipherCloud, Forcepoint, Imperva and Netskope.

Raja Patel, McAfee's VP and general manager for corporate products, said in an interview that customers and channel partners have all asked what the company had planned in terms of offering a CASB solution. "We think there is a large market for CASBs and the capabilities that Skyhigh brings," Patel said.  "They were the original CASB player in the marketplace, and they have led the category and really moved the needle in terms of their leadership and evolving the category."

Upon closing of the deal, which is scheduled to take place early next year, Skyhigh Networks CEO Rajiv Gupta will report to McAfee CEO Chris Young and will oversee the combined vendor's cloud security business. Gupta's team, which will join McAfee, will also help integrate Skyhigh Networks' CASB with McAfee's endpoint and DLP offerings. "Combined with McAfee's endpoint security capabilities and operations center solutions with actionable threat intelligence, analytics and orchestration, we will be able to deliver a set of end-to-end security capabilities unique in the industry," Gupta said in a blog post.

Patel said the company plans to let customers bring the policies implemented in their McAfee endpoint protection software and network DLP systems to their cloud infrastructure and SaaS platforms. Another priority is bringing more context to its endpoint tools and integrating the CASB with Web gateways and cloud service providers.

While only 15 percent of organizations used CASBs last year, Patel cited a Gartner forecast that 85 percent will use them by 2020. "If you look at the exponential growth of people adopting public cloud environments over the past two years, it extends to moving the security posture around it."

Posted by Jeffrey Schwartz on 11/27/2017 at 12:17 PM


Microsoft Joins MariaDB Foundation and Plans Azure Release

Microsoft is adding MariaDB to the list of open source relational database platforms it will bring to Azure, where it will join MySQL and PostgreSQL, both announced earlier this year and now available in preview. In addition to adding it to that menu, Microsoft has joined the MariaDB Foundation as a Platinum sponsor.

MariaDB is a fork -- or, as the foundation describes it, "an enhanced, drop-in replacement" -- of MySQL, developed in the wake of Oracle's acquisition of Sun Microsystems, which had earlier bought the popular open source database. MySQL founder Michael "Monty" Widenius created MariaDB to ensure an option remained that maintained MySQL's open source principles.

Microsoft announced its MariaDB support this week at the annual Connect conference, held in New York. While it's primarily SQL-based, it also has GIS and JSON interfaces and is used by notable organizations including Google, WordPress.com and Wikipedia.

Scott Guthrie, Microsoft's executive VP of cloud and enterprise, told analysts and media that MariaDB has become popular with a growing segment of developers. "Like our PostgreSQL and MySQL options, it's 100 percent compatible with all of the existing drivers, libraries and tools and can take full advantage of the rich MariaDB ecosystem," Guthrie said during his Connect keynote address.
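
Guthrie's compatibility claim boils down to existing MySQL drivers working unchanged against the managed service. The hedged sketch below points a standard Python connector at a hypothetical Azure Database for MariaDB endpoint; the server name, login format and credentials are placeholders modeled on how Azure's existing managed MySQL service works.

```python
# Minimal sketch of the "existing drivers just work" point: a standard MySQL
# connector (pip install mysql-connector-python) talking to a managed
# MariaDB endpoint in Azure. Server name and credentials are hypothetical;
# Azure's managed MySQL service expects the 'user@servername' login format
# and SSL, which would presumably apply here as well.
import mysql.connector

conn = mysql.connector.connect(
    host="example-mariadb.mariadb.database.azure.com",
    user="appuser@example-mariadb",
    password="<password>",
    database="inventory",
)

cursor = conn.cursor()
cursor.execute("SELECT VERSION()")  # same query works against self-hosted MariaDB
print(cursor.fetchone()[0])
cursor.close()
conn.close()
```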

Microsoft announced MySQL and PostgreSQL as managed services in Azure at its annual Build conference, held back in May in Seattle. Since releasing the previews, Microsoft has added PostgreSQL extensions and new compute tiers, and the services are now available in 16 regions and on pace for release in all 40 regions, said Tobias Ternstrom, principal group program manager for Microsoft's Database Systems Group, in a blog post.

Ternstrom visited Widenius in Sweden recently, where they discussed adding MariaDB to Azure and ultimately decided to join the foundation and participate in the open source project. "We are committed to working with the community to submit pull requests (hopefully improvements...) with the changes we make to the database engines that we offer in Azure," Ternstrom noted. "It keeps open source open and delivers a consistent experience, whether you run the database in the cloud, on your laptop when you develop your applications, or on-premises."

Microsoft has posted a waitlist for those wanting to test the forthcoming preview.

Posted by Jeffrey Schwartz on 11/20/2017 at 9:26 AM


Microsoft Previews New VS Live Share for Real-Time Collaboration

Microsoft is looking to help remotely distributed development teams collaborate on code in real time with the introduction of Visual Studio Live Share. The new capability, demonstrated at the Microsoft Connect conference in New York and streamed online, works with both the Visual Studio IDE and the lightweight, multiplatform VS Code editor and debugger.

Visual Studio Live Share is among several new capabilities the company is adding to its tooling portfolio and suite of Azure services, all focused on making it easier for organizations to shift to DevOps management and methodologies, support continuous integration/continuous deployment release cycles and target multiple platforms and frameworks.

Based in Azure, the new Visual Studio Live Share lets remotely distributed developers create live sessions in which they can interactively share code projects to troubleshoot problems and iterate in real time, while working in their preferred environment and setup. Visual Studio Live Share aims to do away with the current practice of remote development teams exchanging screen images via e-mail or chat sessions in Slack or Teams; instead, it enables live code-sharing sessions.

The core Visual Studio Live Share service runs in Azure and allows code to be shared among developers using either the full Visual Studio IDE or the new lightweight VS Code editor. Developers can share code between the two tools without having to use the same client platform, programming environment or even the same languages.

"We really think this is a game changer in terms of enabling real-time collaboration and development," said Scott Guthrie, Microsoft's executive VP for cloud and enterprise, speaking during the Connect keynote. "Rather than just screen sharing, Visual Studio Live Share lets developers share their full project context with a bidirectional, instant and familiar way to jump into opportunistic, collaborative programming" he added in a blog post, outlining all of the Connect announcements.

As Chris Dias, Microsoft's Visual Studio group product manager, demonstrated VS Live Share during the keynote, numerous developers shared their approval on Twitter, which Guthrie remarked on as the demo concluded. Julia Liuson, Microsoft's corporate VP for Visual Studio, said in an interview at Connect that the reaction on Twitter reflected the frustration developers typically encounter with the limitations of sharing screen grabs or relying on chat sessions or phone conversations. "It's painful. We hear this all the time, even in our own day-to-day interactions," she said.

The company released a limited private preview of the new VS Live Share capability yesterday, but did not disclose how it will be offered, or whether it might be extended to other development environments over time. "We're going to make sure we have the right offering first and will talk later about the business model," Liuson said, adding that some sort of free iteration is planned.

Microsoft introduced multiple other capabilities at Connect, spanning Visual Studio and its Azure services, as described below.

Azure DevOps Projects: A complete DevOps pipeline running on Visual Studio Team Services, Azure DevOps Projects is available in the Azure Portal in preview. It's aimed at making DevOps the "foundation" for all new projects, supporting multiple frameworks, languages and Azure-hosted deployment targets.

Visual Studio Connected Environment for Azure Container Service (AKS): Building on Microsoft's new AKS offering, Microsoft Principal Program Manager Scott Hanselman demonstrated how developers can edit and debug modern, cloud-native apps running in Kubernetes clusters using Visual Studio, VS Code or a command-line interface. Hanselman showed how developers can switch between .NET Core/C# and Node, and between Visual Studio and VS Code. The service is also in preview.

Visual Studio App Center: The app lifecycle platform that lets developers build, test, deploy, monitor and iterate based on live usage and crash analytics telemetry is now generally available. Microsoft describes Visual Studio App Center as a shared environment for multiplatform mobile, Windows and Mac apps, supporting Objective-C, Swift, Java, Xamarin and React Native, connected to any code repository.

Visual Studio Tools for AI: A new modeling capability for Visual Studio, now in preview, gives developers and data scientists debugging and editing support for key deep learning frameworks, including Microsoft's Cognitive Toolkit, TensorFlow and Caffe, to build, train, manage and deploy their models locally and scale them out to Azure.

Azure IoT Edge: A service that deploys cloud intelligence to IoT devices via containers, running workloads built with Azure Machine Learning, Azure Functions and Azure Stream Analytics, is now in preview.

Posted by Jeffrey Schwartz on 11/16/2017 at 12:54 PM


Microsoft Extends Analytics Platform Portfolio with Databricks Spark Service in Azure

Microsoft and Apache Spark creator Databricks are building a globally distributed streaming analytics service natively integrated with Azure for machine learning, graph processing and AI-based applications.

The new Databricks Spark-as-a-service offering was introduced at Microsoft's annual Connect developer conference, which kicked off today in New York. The service, available in preview, is among an extensive list of announcements covering Microsoft's various SQL and NoSQL database products and services; productivity, cross-platform and language improvements to the Visual Studio and VS Code developer tools; new DevOps capabilities; and new machine learning, AI and IoT tooling.

During the opening keynote, Scott Guthrie, Microsoft's executive VP for Cloud and Enterprise, emphasized that Databricks is the creator of, and steward of, Apache Spark, and the new service will enable organizations to build modern data warehouses that support self-service analytics and machine learning using all data types in a secure and compliant architecture.

Databricks has engineered a first-party Spark-as-a-service platform for Azure. "It allows you to quickly launch and scale up the Spark service inside the cloud on Azure," Guthrie said. "It includes an incredibly rich, interactive workspace that makes it easy to build Spark-based workflows, and it integrates deeply across our other Azure services."

Those services include Azure SQL Data Warehouse, Azure Storage, Azure Cosmos DB, Azure Active Directory, Power BI and Azure Machine Learning, Guthrie said. It also provides integration with Azure Data Lake stores, Azure Blob storage and Azure Event Hub.  "It's an incredibly easy way to integrate Spark deeply across your apps and drive richer intelligence from it," he said.
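
To give a sense of what a Spark-based workflow against Azure storage looks like, here is a minimal PySpark sketch. The storage account, container and schema are hypothetical placeholders, and it assumes a cluster (such as Azure Databricks) where the WASB connector for Azure Blob storage is already configured.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal PySpark sketch; storage account, container and file path below are
# hypothetical placeholders.
spark = SparkSession.builder.appName("azure-databricks-sketch").getOrCreate()

# Read CSV data directly from Azure Blob storage via the WASB connector
# (available on Azure-hosted Spark clusters such as Azure Databricks).
events = spark.read.csv(
    "wasbs://telemetry@examplestorage.blob.core.windows.net/events/*.csv",
    header=True,
    inferSchema=True,
)

# Simple aggregation: event counts and average reading per device.
summary = (
    events.groupBy("device_id")
    .agg(F.count("*").alias("events"), F.avg("value").alias("avg_value"))
)
summary.show()
```

In a Databricks notebook the `spark` session is already provided, so the same transformations could be pasted in directly and the results handed off to Power BI or another downstream service.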

Databricks customers have been pushing the company to build its Spark platform as a native Azure service, said Ali Ghodsi, the company's cofounder and CEO, who joined Guthrie on stage. "We've been hearing overwhelming demand from our customer base that they want the security, they want the compliance and they want the scalability of Azure," Ghodsi said. "We think it can make AI and big data much simpler."

In addition to integrating with the various Azure services, it's designed to let those who want to create new data models do so. According to Databricks, a user can target data regardless of size and create projects with various analytics services, including Power BI, SQL, Streaming, MLlib and Graph. "Once you manage data at scale in the cloud, you open up massive possibilities for predictive analytics, AI, and real-time applications," according to a technical overview of the Azure Databricks service. "Over the past five years, the platform of choice for building these applications has been Apache Spark. With a massive community at thousands of enterprises worldwide, Spark makes it possible to run powerful analytics algorithms at scale and in real time to drive business insights."

However, deploying, managing and securing Spark at scale has remained a challenge, which Databricks believes will make the Azure service compelling.

Internally, Databricks is using Azure Container Service to run the Azure Databricks control plane and data planes in containers, according to the company's technical primer. It's also using accelerated networking services to improve performance on the latest Azure hardware.

Posted by Jeffrey Schwartz on 11/15/2017 at 1:33 PM


Cloud Foundry Rising as Runtime for App Portability

More than half of the large enterprises that have implemented Cloud Foundry as their application runtime, orchestration and DevOps environment for business modernization projects are using it across multiple clouds, according to the results of a survey published last month.

Adoption of Cloud Foundry Application Runtime is on the rise among large and midsize enterprises, who are running new or transformed business apps across different clouds, including Amazon Web Services, Microsoft Azure, Google Cloud Platform and OpenStack, as well as in VMware vSphere virtual machines.

The survey of 735 users consisting of developers, architects, IT executives and operators, sponsored by the Cloud Foundry Foundation, revealed that 53 percent are using it across multiple clouds. According to the survey that was conducted by the 70-member consortium, 54 percent of Cloud Foundry Application Runtime users are running it on AWS, followed by 40 percent on VMware's vSphere, 30 percent on Microsoft's Azure, 22 percent on OpenStack and 19 percent on Google Cloud Platform. An additional 17 percent said they are running it on various provider-managed PaaS, including IBM Bluemix.

"Whether companies come to Cloud Foundry early or late in their cloud journey, the ability to run Cloud Foundry Application Runtime across multiple clouds is critical to most users," according to an executive summary of the report based on the survey conducted by research and consulting firm ClearPath Strategies in late August.

Accelerated Development and Deployment Reported
Organizations that use the open-source Cloud Foundry Application Runtime are seeing accelerated application development cycles when building their cloud-native applications, the survey also found. A majority, 51 percent, said it previously took more than three months to deploy a cloud application, with only 16 percent able to do so faster. After moving their applications to Cloud Foundry Application Runtime, 46 percent claim cloud app development cycles of under a week, 25 percent of those claiming less than a day, while only 18 percent still report development cycles of more than three months.

Before using Cloud Foundry Application Runtime, 58 percent said cloud applications were developed and deployed manually, 52 percent used custom installed scripts, 38 percent relied on configuration management tools, 27 percent VM images, 20 percent Docker containers and 19 percent Linux packages.

A vast majority, 71 percent, said they are using or evaluating adding container orchestration and management to their Cloud Foundry Runtime environment now that the Cloud Foundry Container Runtime is available. Half of Cloud Foundry Application Runtime users are currently using containers, such as Docker, with another 35 percent evaluating or deploying containers.

The release of Cloud Foundry Container Runtime, the Kubernetes-based container management project, has generated significant interest among Cloud Foundry Application Runtime users: those currently using or evaluating container engines, primarily Docker or rkt, are now interested in adding container orchestration and management to their Cloud Foundry Application Runtime environments.

A majority, 54 percent, use Cloud Foundry to develop, deploy and manage microservices, with 38 percent using it for their Web sites, 31 percent for internal business applications, 27 percent for software-as-a-service (SaaS) and 8 percent for legacy software transformation.

Early Stages
While Cloud Foundry is a relatively new technology -- only 15 percent have used it for more than three years, 45 percent for less than a year and 61 percent are in the early stages, according to the survey -- a noteworthy 39 percent report they have broadly integrated it already. Also, while 49 percent of those adopting it are large enterprises such as Ford, Home Depot and GE, 39 percent are smaller enterprises and 39 percent are small and medium businesses (SMBs), according to the breakdown of respondents.

The number of applications in deployment is still relatively small. More than half, 54 percent, have deployed fewer than 10 apps, 22 percent claim between 11 and 50 apps, and only 8 percent have deployed more than 500.

Pivotal Cloud Foundry, the leading commercial distribution of Cloud Foundry from Dell Technologies subsidiary Pivotal, is among the most widely consumed services in Azure, according to Microsoft. Redmond contributor Michael Otey explains how to deploy it in Azure, which you can find here.

Posted by Jeffrey Schwartz on 11/13/2017 at 11:35 AM


Microsoft Reveals DevOps Tool for DBAs and Developers

Microsoft is readying a new lightweight database development and management tool that aims to give DBAs and developers common DevOps tooling to manage Microsoft's various on-premises and cloud database offerings. The new Microsoft SQL Operations Studio, demonstrated for the first time at last week's PASS Summit in Seattle, brings together the capabilities of SQL Server Management Studio (SSMS) and SQL Server Data Tools (SSDT) in a modern, cross-platform interface.

The company demonstrated the forthcoming tool during the opening PASS keynote, showing the ability to rapidly deploy SQL Server in Linux and Windows containers and to work against SQL Server, Azure SQL Database and Azure SQL Data Warehouse. The first technical preview is set for release in the coming weeks. SQL Operations Studio lets developers and administrators build and manage T-SQL code in a more agile DevOps environment.

"We believe this is the way of the future," said Rohan Kumar, Microsoft's general manager of database systems engineering, who gave the opening keynote at last week's PASS event. "It's in its infancy. We see a lot of devops experiences, [where] development and testing is being used right on containers, but this is only going to get better and SQL is already prepared for it."

Kumar said Microsoft will release a preview for Windows, Mac and Linux clients within the next few weeks. It will enable "smart" T-SQL code snippets and customizable dashboards to monitor and discover performance bottlenecks in SQL databases, both on-premises or in Azure, he explained. "You'll be able to leverage your favorite command line tools like Bash, PowerShell, sqlcmd, bcp and ssh in the Integrated Terminal window," he noted in a blog post. Kumar added that users can contribute directly to SQL Operations Studio via pull requests from the GitHub repository.
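
The sort of performance triage Kumar describes ultimately comes down to running diagnostic T-SQL against whichever SQL Server or Azure SQL Database the tool is pointed at. As a rough illustration (not a feature of SQL Operations Studio itself), the sketch below runs a query-stats check through pyodbc so the same T-SQL works whether the server sits on Windows, Linux, in a container or in Azure; the server name and credentials are hypothetical.

```python
# Hedged sketch of a basic performance check run over ODBC
# (pip install pyodbc; requires the Microsoft ODBC Driver for SQL Server).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-sql.database.windows.net;DATABASE=appdb;"
    "UID=dbuser;PWD=<password>"
)

# Top five statements by total CPU time, from the query stats DMV.
rows = conn.cursor().execute(
    """
    SELECT TOP 5 qs.total_worker_time, qs.execution_count,
           SUBSTRING(st.text, 1, 80) AS query_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_worker_time DESC
    """
).fetchall()

for total_cpu, executions, text in rows:
    print(total_cpu, executions, text)
```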

Joseph D'Antoni, a principal consultant with Denny Cherry and Associates and a SQL Server MVP, has been testing SQL Operations Studio for more than six months. "It's missing some functionality but it's a very solid tool," said D'Antoni, who is also a Redmond contributor. "For the most part, this is VS Code with a nice database layer."

Posted by Jeffrey Schwartz on 11/06/2017 at 12:01 PM


Power BI Soon Will Query Datasets with 1 Trillion Rows

Microsoft's Power BI can now query 10 billion rows of data, but a forthcoming release will push that threshold to 1 trillion, a capability demonstrated at this week's annual PASS Summit, where the company also released Power BI Report Server, the first on-premises version of the service.

Microsoft gave Power BI major play at PASS, held in Seattle, where the company also underscored the recently released SQL Server 2017 and its support for Linux and Docker, hybrid implementations of the on-premises database with SQL Azure and its next-generation NoSQL database CosmosDB.

Christian Wade, a Microsoft senior program manager, demonstrated the ability to search telemetry data from the cell phones of 20 million people traveling across the U.S., picking up their location data and battery usage as often as 500 times per day from each user. Though it wasn't actual usage data, Wade showed how long it was taking people to reach various destinations based on their routes by merely dragging and dropping data from the Power BI dashboard interface. Wade queried Microsoft's Spark-based Azure HDInsight service.

"This is what you will be able to do with the Power BI interface and SQL Azure Analysis Services," Wade said during a brief interview following his demonstration. Wade performed the demo during a session focused on Power BI Wednesday, which followed the opening keynote by Rohan Kumar, Microsoft's general manager of database engineering. In the main keynote Wade demonstrated a query against a 9 TB instance with 10 billion rows supported in the current release.

"This is a vision of what's to come, of how we are going to unlock these massive data sets," Wade said, regarding the prototype demo. According to Wade, this new capability demonstrated was the first time anyone was able to use Power BI to perform both direct and in-memory queries against the same tabular data engine,

Kamal Hathi, general manager of Microsoft's BI engineering team, described the new threshold as a key milestone. "We have a history with analysis services and using the technology to build smart aggregations and apply them to large amounts of data," Hathi said in an interview. "It's something we have been working on for many years. Now we are bringing it to a point where we can bring it to standard big data systems like Spark and with the power of Power BI."

While he wouldn't say when it will be available, or if there will be a technical preview, Hathi said it's likely Microsoft will release it sometime next year.

What is now available is the new Power BI Report Server, which brings the SaaS-based capability on-premises for the first time. It still requires the SaaS service, but addresses organizations that can't let certain datasets leave their own environments.

Microsoft is offering the Power BI Report Server only for its Power BI Premium customers -- typically large organizations that pay based on capacity rather than on a per-user basis. The Power BI Report Server lets users embed reports and analytics directly into their apps by using the APIs of the platform, Kumar said during the keynote. "It essentially allows you to create report on-premises using your Power BI desktop tool. And once your report gets created you can either save it on premises to the reporting server [or the] cloud service," Kumar said. "The flexibility is quite amazing."

Posted by Jeffrey Schwartz on 11/03/2017 at 1:30 PM


Microsoft 365 Business Collaboration and Device Management Bundle Is Now Available

A version of the Microsoft 365 service for small- and mid-size organizations with up to 300 users is now generally available. Microsoft 365 Business, released today, is the last of four versions of the new service announced in July that brings together Office 365, the Enterprise Mobility + Security (EMS) device configuration and management service and Windows 10 upgrades and licenses.

As part of the new release, Microsoft is introducing three new tools for business users called Connections, Listings and Invoicing, on top of previously announced apps, which include a mileage tracker, customer manager and appointment scheduling tool. The tools are available for customers in the U.S., Canada and the U.K. and are included with the $20 per user, per month subscription.

The company today is also releasing Microsoft StaffHub, its new tool for firstline workers designed to help them manage their work days, which is included in Microsoft 365 Business and Office 365 Business Premium subscriptions.

Microsoft released the technical preview for the business version back in August and had said it would become generally available this fall. The company has already released Microsoft 365 Enterprise, Microsoft 365 F1 for firstline workers and a version with two licensing options for educational institutions.

The company is betting that the new Microsoft 365 Business option will appeal to the millions of customers who currently pay $12.50 per user per month for Office 365 Business Premium subscription plans. It'll cost them an additional $7.50 per user per month for the new plan, but they'll gain configuration, management and security services, plus Windows 10 and the new apps.

Those who manage Office 365 Business Premium users will transition to the Microsoft 365 portal, which is effectively the same as the Office 365 portal but brings in the features included with Microsoft 365 Business, said Caroline Goles, Microsoft's director of Office for SMBs.

"We designed it so it looks exactly the same, except it just lights up those extra device cards," she said during a prelaunch demo in New York. "If you manage Office 365, it will be familiar but will bring in those new Microsoft 365 capabilities."

Unlike the enterprise version, Microsoft 365 Business includes a scaled-down iteration of EMS suited for smaller organizations. Garner Foods, a specialty provider of sauces based in Winston-Salem, N.C., is among the first to test and deploy the new service. Already an Office 365 E3 customer, Garner Foods was looking to migrate its Active Directory servers to Azure Active Directory, said COO Heyward Garner, who was present at the New York demo.

"They were able to downgrade most of their Office 365 users and gain the security and management capabilities offered with Microsoft 365," said Chris Oakman, president and CEO of Solace IT Solutions, the partner who recommended and deployed the service for Garner Foods. "It's a tremendous opportunity for small business."

Also now available are three previously announced tools: Microsoft Bookings for scheduling and managing appointments, MileIQ for tracking mileage and Outlook Customer Manager for managing contacts. In addition, Microsoft is adding three new tools: Connections, designed for e-mail marketing campaigns; Listings, for managing brand engagement on Facebook, Google, Bing and Yelp; and Invoicing, for generating customer bills with integration into QuickBooks. These apps can all be managed in the Microsoft Business Center, and the company is considering additional tools for future release.

 

Posted by Jeffrey Schwartz on 11/01/2017 at 12:06 PM


Google Takes on Azure Stack with Hybrid Cloud System from Cisco

In a move to make Google's public cloud services more appealing to enterprise customers, the company and Cisco are partnering to bring hybrid cloud infrastructure that's compatible with the Google Cloud Platform (GCP). The pact, announced today, will enable workloads to run on Cisco UCS hyper-converged infrastructure hardware and the GCP.

The partnership is a major boost for Google as it looks to take on Amazon Web Services (AWS) and Microsoft, which both offer hybrid cloud solutions. Both have a wide lead on Google with the world's largest cloud footprints and infrastructure. Microsoft is hoping to maintain its lead over Google and gain ground on AWS with its new hybrid cloud solution, Azure Stack, which is now just starting to ship from Dell EMC, Hewlett Packard Enterprise and Lenovo. Cisco is also taking orders for its Azure Stack solution, which is set for imminent release.

Now that Cisco will also offer infrastructure compatible with GCP, Cisco is widening its cloud reach, while Google is gaining significant extension into enterprises. "Applications in the cloud can take advantage of on-premises capabilities (including existing IT systems)," said Kip Compton, VP of Cisco's Cloud Platform and Solutions Group, in a blog post announcing the pact. "And applications on-premises can take advantage of new cloud capabilities."

Cisco HyperFlex HX-Series systems will enable hybrid workloads to run on-premises and in GCP. The hybrid GCP offering is based on Kubernetes, the open source container orchestration and management platform that will provide lifecycle management, support for hybrid workloads and policy management. Kubernetes now integrates with Cisco's software-defined networking architecture, just upgraded earlier this month with the third release of Cisco's Application Centric Infrastructure (ACI).

The new ACI 3.0 includes improved network automation, security and multi-cloud support. Now that ACI offers Kubernetes integration, customers can deploy workloads as microservices in containers. Cisco said the Kubernetes integration also provides unified networking constructs for containers, virtual machines and bare-metal hardware and lets customers set ACI network policy.

Cisco's hybrid cloud offering also will include the open source Istio service management tooling. Istio connects, manages and secures microservices. According to a description on its Web site, Istio manages traffic flows between microservices, enforces access policies and aggregates telemetry data without requiring changes to the code within the microservices. Running on Kubernetes, Istio also provides automated HTTP, gRPC, WebSocket and TCP load balancing and various authentication and security controls.

The Cisco offering will also include the Apigee API management tool. Apigee, a leading provider of API management software, was acquired by Google last year. It enables legacy apps to run on-premises and connect to the cloud via the APIs.

"We're working together to deliver a consistent Kubernetes environment for both on-premises Cisco Private Cloud Infrastructure and Google's managed Kubernetes service, Google Container Engine," said Nan Boden, Google's head of global technology partners for GCP, in a blog post published by Cisco. "This way, you can write once, deploy anywhere and avoid cloud lock-in, with your choice of management, software, hypervisor and operating system." Boden added that Google will provide a cloud service broker to connect on-premises workloads to GCP services for machine learning, scalable databases and data warehousing.
The partnership with Cisco promises to make GCP a stronger candidate for enterprises considering moving workloads to the Google public cloud, though it's not the first. Among some notable partnerships, Google earlier this year announced that Nutanix will run a GCP-compatible implementation of Kubernetes on its hyper-converged systems. And at VMworld, Google and Pivotal launched the Pivotal Container Service (PKS) to provide compatibility between Kubernetes running on vSphere and the Google Container Engine. However, that VMworld announcement was overshadowed by VMware's biggest news of its annual conference: the plan to offer its VMware Cloud on AWS service.

While Cisco is offering customers an alternative to Azure Stack with its new Google partnership, Microsoft has made significant investments in support for Kubernetes orchestration. In addition to its Azure Container Service (ACS) with support for Kubernetes, Microsoft yesterday launched the preview of its managed Kubernetes service, called AKS.

Testing of the GCP-compatible Cisco offering will begin early next year, with general release planned for the latter part of 2018.

Posted by Jeffrey Schwartz on 10/25/2017 at 12:47 PM


Cray To Bring Supercomputing Services to Microsoft Azure

If existing high-performance computing (HPC) isn't enough for you, Microsoft is bringing the supercomputing capabilities provided by Cray to its Azure public cloud.

This is a noteworthy deal because Cray has been regarded for decades as the leading provider of supercomputing systems. Cray's supercomputers are capable of processing some of the most complex, high-performance and scientific workloads performed. The two companies today announced what Cray described as an "exclusive strategic alliance" aimed at bringing supercomputing capability to enterprises.

While the term "exclusive" is nebulous these days, this pact calls for the two companies to work together with customers to offer dedicated Cray supercomputers running in Azure datacenters for such workloads as AI, analytics and complex modeling and simulation "at unprecedented scale, seamlessly connected to the Azure cloud," according to Cray's announcement.

The deal marks the first time Cray is bringing its supercomputers to a cloud service provider, according to a statement by Peter Ungaro, the company's president and CEO. The two companies will offer the Cray XC and Cray CS supercomputers with Cray's ClusterStor storage systems for dedicated customer provisioning in Azure datacenters, offered as Azure services.

"Dedicated Cray supercomputers in Azure not only give customers all of the breadth of features and services from the leader in enterprise cloud, but also the advantages of running a wide array of workloads on a true supercomputer, the ability to scale applications to unprecedented levels, and the performance and capabilities previously only found in the largest on-premise supercomputing centers," according to Ungaro.

Cray's systems will integrate with Azure Virtual Machines, Azure Data Lake storage, Microsoft's AI platform and Azure Machine Learning (ML) services. The Cray Urika-XC analytics software suite and the CycleCloud orchestration service, which Microsoft now offers following its August acquisition of Cycle Computing, can be used for hybrid HPC management.

The fact that Microsoft would want to bring Cray into the Azure equation is not surprising given CEO Satya Nadella's focus on bringing supercomputer performance to the company's cloud, a priority he demonstrated last year at the Ignite conference in Atlanta. In last year's Ignite keynote, Nadella revealed some of the supercomputing functions Microsoft had quietly built into Azure including an extensive investment in field programmable gate arrays (FPGAs) throughout the Azure network backbone, bringing 25Gbps backbone connectivity, up from 10Gbps, combined with GPU nodes.

Nadella stepped it up a notch in his opening keynote at the recent Ignite gathering held last month in Orlando, where he revealed an extensive research and development effort focused on one day offering quantum computing. That form of high-performance computing requires breakthroughs in physics, mathematics and software programming that are still many years away. While IBM and others have long showcased some of their R&D efforts, Microsoft revealed it, too, has been working on quantum computing for many years.

Nadella and his team of researchers said Microsoft will release some free tools by year's end that will let individuals experiment with quantum computing concepts and programming models. The pact with Cray will bring supercomputing processing capabilities to Azure that will solve the most complex challenges in climate modeling, precision medicine, energy, manufacturing and other scientific research, according to Jason Zander, Microsoft's corporate VP for Azure.

"Microsoft and Cray are working together to bring customers the right combination of extreme performance, scalability, and elasticity," Zander stated in a blog post. While it's not immediately clear to what extent, if any, Cray will be working with Microsoft on quantum computing, it's a safe bet that they will do so at some level, if not now, then in the future.

Posted by Jeffrey Schwartz on 10/23/2017 at 12:12 PM


Azure Stack Certified for Intel's New Xeon Scalable CPUs

Microsoft has given Azure Stack the green light to run on systems powered by Intel's next-generation Xeon Scalable Processors, code-named "Purley." With Azure Stack validated for the new Purley CPUs, enterprises and service providers can run Microsoft's cloud operating system at much greater scale, and with more expansion capability, than on systems based on the current Intel Xeon E5 v4 family (code-named "Broadwell").

The first crop of Azure Stack appliances, including those customers have used over the past year with the technical previews, are based on the older Broadwell platform. Some customers may prefer them for various reasons, notably because gear with the latest processors costs more, and organizations may be fine with older processors, especially for conducting pilots. But for those looking to deploy Azure Stack broadly, the newer generation might be the way to go.

Azure Stack appliances equipped with the new Purley processors offer improved I/O, support up to 48 cores per CPU (compared with 28) and provide 50 percent better memory bandwidth, with support for up to 1.5TB of memory, according to a blog post by Vijay Tewari, Microsoft's principal group program manager for Azure Stack.

Intel and Microsoft have worked to tune Azure Stack with the new CPUs for over six months, according to Lisa Davis, Intel's VP of datacenter, enterprise and government and general manager of IT. In addition to the improved memory bandwidth and increased number of cores, the new processors will offer more than 16 percent greater performance and 14 percent higher virtual machine capacity, compared with the current processors, Davis said in a blog post.

The validation of the new CPUs comes three weeks after Microsoft announced the official availability of Azure Stack from Dell EMC, Hewlett Packard Enterprise and Lenovo, enabling customers and service providers to replicate the Azure cloud in their respective datacenters or hosting facilities. Cisco, Huawei and Wortmann/Terra are also readying Azure Stack appliances for imminent release. 

The first units available are based on the older Broadwell processor. Tewari noted some of the vendors will offer them for the next year. "For customers who want early as possible adoption, Broadwell is a good fit because that's what's going to be off the truck first," Aaron Spurlock, senior product manager at HPE, said during a meeting at the company's booth at Ignite. "For customers who want the longest possible lifecycle on a single platform, [Purley] might be a better fit. But in terms of the overall user experience it's going to be greater [on the new CPUs] 90 percent the time."

Paul Galjan, Dell EMC's senior director of product management for hybrid cloud solutions, said any organization that wants the flexibility to scale in the future will find the newer processors based on the company's new PowerEdge 14 hyper-converged server architecture a better long-term bet. Systems based on the PowerEdge 14 will offer a 153 percent improvement in capacity, Galjan said in an interview this week.

"It is purely remarkable the amount of density we have been able to achieve with the 14g offering," Galjan said. One of the key limitations of systems based on the Broadwell platform is that they'll lack the ability to expand nodes on a cluster, a capability Microsoft will address in 2018. But it'll require the new Intel CPUs. Galjan said most customers have held off on ordering systems based on Azure Stack, awaiting the new processors from Intel for that reason.

"That's one of the reasons we are being so aggressive about it," he said. "Azure Stack is a future looking cloud platform and customers are looking for a future looking hyper-converged platform." Galjan compared in a blog post the differences between Azure Stack running on its PowerEdge 13 and the new PowerEdge 14 systems.

Lenovo this week also officially announced support for the new Xeon Scalable processors with its ThinkAgile SX for Azure Stack appliance, and it will be the first to support the Intel Select program early next year, which includes additional testing designed to ensure verified, workload-optimized and tuned systems.

Select Solutions, a program announced by Intel earlier this year, is a system evaluation and testing process designed to simplify system configuration selection for customers, according to Intel's Davis. It targets high-performance applications that use Azure Stack with an all-flash storage architecture.

Posted by Jeffrey Schwartz on 10/20/2017 at 1:55 PM


Microsoft Debuts Surface Book 2 as 'Most Powerful Yet'

Looking to distance itself further from the highest-performing MacBook Pros, Microsoft is unleashing its most powerful Surface PCs to date with the launch of the new Surface Book 2. Microsoft showed off the new device today as an extra surprise tied to the release of the Windows 10 Fall Creators Update, which comes with support for new mixed reality experiences and improvements for IT pros, including enhancements to the console.

The new Surface Book 2 will be available Nov. 16, initially from the Microsoft Store, with either 13.5- or 15-inch displays and will be up to five times more powerful than the original Surface Book, which Microsoft launched two years ago. The 13.5-inch model will start at $1,499 with the larger unit starting at $2,499, available with either dual- or quad-core Intel 8th Generation Core processors and the current NVIDIA GeForce GTX GPUs.

Engineered to process 4.3 trillion math operations per second, the newest Surface Books are for those who perform compute-intensive work such as video rendering, CAD, scientific calculations and compiling code at high speeds. The 15-inch model, which also targets gamers, includes a wireless Xbox controller, has a 95-watt power supply and can render 1080p games at 60 frames per second.

"This thing is 'Beauty and the Beast,'" said Panos Panay, corporate VP for devices overseeing Microsoft's entire hardware lineup, during a briefing with Windows Insiders and media, where we got to spend some time with the new units. "There's no computer, no laptop, no product that's ever pushed this much computational power in this mobile form factor. It's quite extraordinary, the performance."

Panay emphasized that the Surface Book 2 aims to bring out the best in the new Windows 10 Fall Creators Update, Office 365, digital inking and mixed reality. Like its predecessor, the Surface Book 2 screen detaches so it can be used as a tablet, or users can fold it back into studio mode. It sports what Panay said is the thinnest LCD available, with a 10-point multitouch display. It also supports the newly refined Surface Pen and the company's Surface Dial.

In addition to the fivefold power boost over the original Surface Book of two years ago (the 13.5-inch model is four times more powerful), Panay said the newest version is twice as powerful as the latest MacBook Pro and boasts 70 percent greater battery life in video playback mode. Microsoft is claiming 17 hours of battery life (5 hours when used as a tablet), though laptops from Microsoft or anyone else rarely reach those maximums. Nevertheless, it is designed to ensure all-day use in most conditions.

The 13.5-inch model weighs 3.38 pounds, with the 15-inch unit weighing 4.2 pounds with the keyboard attached. "There has never been this much computational power in a mobile form factor this light," Panay said in a blog post announcing the Surface Book 2. "You can show off your meticulously designed PowerPoint deck or complex Pivot tables in Excel with Surface Book 2's stunningly vibrant and crisp PixelSense Display with multi-touch, Surface Pen, and Surface Dial on-screen support. You won't believe how much the colors and 3D images will pop in PowerPoint on these machines."

The 15-inch model offers just shy of 7 million pixels, or 260 DPI, which Microsoft claims is 45 percent more than the MacBook Pro. The 13.5-inch model produces 6 million pixels at 267 DPI (3000x2000).

Among the various configurations, they are available with a choice of Intel 7th or 8th Generation Core i5 or i7 processors and the option of dual- and quad-core processors. Depending on the model, customers can choose between 8GB or 16GB of RAM with 256GB, 512GB or 1TB SSD storage. 

 

Posted by Jeffrey Schwartz on 10/17/2017 at 11:44 AM


Microsoft Brings Cross-Platform and Java Support to Azure Serverless Compute

Nearly a year after rolling out its Azure Functions serverless compute option for running event-driven, modern PaaS apps and services, Microsoft has given it a cross-platform boost. The company announced it had ported the Azure Functions service to the new .NET Core 2.0 framework during the Ignite conference in Orlando, Fla., late last month. On the heels of that release, Microsoft made available a public preview of its Java runtime for Azure Functions during last week's JavaOne conference in San Francisco.

Azure Functions provides elastic compute triggered by events within any Azure or third-party service, as well as on-premises infrastructure, according to Microsoft. With the port to .NET Core 2.0, both the Azure Functions Core Tools and the runtime are now cross-platform, Microsoft announced at Ignite, though it acknowledged in the Sept. 25 post that there are some known issues and functional gaps.

Java support in Azure Functions has been a top request, according to the announcement posted last week by Nir Mashkowski, partner director for the Azure App Service. "The new Java runtime will share all the differentiated features provided by Azure Functions, such as the wide range of triggering options and data bindings, serverless execution model with auto-scale, as well as pay-per-execution pricing," Mashkowski noted.

The preview includes a new Maven plug-in, a tool for building and deploying Azure Functions from that environment, he noted. "The new Azure Functions Core Tools will support you to run and debug your Java Functions code locally on any platform," he said.

Until now, Azure Functions supported C#, F#, JavaScript (Node.js), PowerShell, PHP, Python, CMD, BAT and Bash. In addition, the Azure Functions runtime is open source and available on GitHub. Azure Functions integrates with a range of SaaS applications and interfaces, and supports authentication via standard OAuth providers including Azure Active Directory, a Microsoft account, Facebook, Google and Twitter.

Microsoft also posted some five-minute tutorials that demonstrate how to build and deploy Java app services and serverless functions, and it held two sessions at JavaOne on building and deploying serverless Java apps in Azure that are now available for replay.

Posted by Jeffrey Schwartz on 10/13/2017 at 12:31 PM


Microsoft Confirms What Most of Us Assumed: No More Windows Phone

If there was any hope that Microsoft planned to come out with new phones based on Windows 10 Mobile, or to add new features to that version of the OS this year, HP and Microsoft both appear to have dashed it.

The chatter about the all-but-forgotten Windows Phone emerged a week ago when The Register reported that HP will no longer sell or support its Elite x3 Windows-based phone after the end of 2019 and will only offer whatever inventory is still available. The report quoted Nick Lazaridis, who heads HP's EMEA business, speaking at the Canalys Channels Forum. Noting that HP had insisted as recently as August that it was committed to the platform, and specifically the Windows Continuum feature, Lazaridis attributed the change to Microsoft's shift in strategy with Windows Phone.

HP's Elite x3 3-in-1 device received positive reviews, including one published by Redmond magazine earlier this year. However, Samsung's release of the Galaxy S8 phone and DeX Station dock, which lets users dock their phones and run Windows-based remote desktops and virtual apps, has proven a strong option for those who want Windows on a smartphone device.

Asked to confirm HP's plan, a company spokeswoman responded: "We will continue to fully support HP Elite x3 customers and partners through 2019 and evaluate our plans with Windows Mobile as Microsoft shares additional roadmap details," according to the e-mailed statement. "Sales of the HP Elite x3 continue and will be limited to inventories in country. HP remains committed to investing in mobility solutions and have some exciting offerings coming in 2018."

In the wake of the HP report, Joe Belfiore, corporate VP in Microsoft's Operating Systems Group, posted a series of Twitter responses on Sunday night about Windows Phone support. One question was about whether it was time to give up on using Windows Mobile.

"Depends who you are," he tweeted. "Many companies still deploy to their employees and we will support them!" But, he continued: "As an individual end-user, I switched platforms for the app/hw diversity. We will support those users too! Choose what's best 4 u."

Responding to a tweet asking directly about the future for Windows Phone, Belfiore stated what was until now largely presumed: "Of course we'll continue to support the platform. bug fixes, security updates, etc. But building new features/hw aren't the focus," he said with regret (based on his emoji).

If that left any ambiguity, a response from a Microsoft spokeswoman should put it to rest. "We get that a lot of people who have a Windows 10 device may also have an iPhone or Android phone, and we want to give them the most seamless experience possible, no matter what device they're carrying," according to the spokeswoman. "In the Fall Creators Update, we're focused on the mobility of experiences and bringing the benefits of Windows to life across devices to enable our customers to create, play and get more done. We will continue to support Lumia phones such as the Lumia 650, Lumia 950 and Lumia 950 XL, as well as devices from our OEM partners."

Posted by Jeffrey Schwartz on 10/11/2017 at 1:42 PM


GE Chooses AWS for Internal IT, with Predix Still Set for Azure

Amazon Web Services yesterday announced that it is the "preferred cloud provider" for General Electric, one of the world's largest industrial conglomerates. However, what was not stated in the announcement is that it only pertains to GE's internal IT apps. GE is moving forward with its plans announced last year to run its Predix industrial exchange on Microsoft Azure.

Responding to my query about the nature of yesterday's announcement, issued only by AWS, a GE spokeswoman clarified the company's plans. "The AWS announcement refers to our internal IT applications only," she stated.

The plan to run GE Predix, the giant industrial cloud platform aimed at building and operating Internet of Things (IoT) capabilities, on Azure was announced at Microsoft's Worldwide Partner Conference (WPC) in July of 2016. Predix is a global exchange and hub designed to gather massive quantities of data from large numbers of sensors and endpoints and apply machine learning and predictive analytics to the data to optimize operations and apply automation. The company uses it for its own customers and partners.

Jeffrey Immelt, the chairman and CEO of GE who recently retired, joined Microsoft CEO Satya Nadella on stage during the keynote session kicking off last year's WPC, where he described GE's plans to integrate Predix with the Azure IoT Suite and Cortana Intelligence Suite.

"For our Predix platform, our partnership with Microsoft continues as planned," according to the GE spokeswoman. "By bringing Predix to Azure, we are helping our mutual customers connect their IT and OT [operational technology] systems. We have also integrated with Microsoft tools such as Power BI (which we demonstrated at our Minds + Machines event last year) and have a roadmap for further technical collaboration and integration with additional Microsoft services."

GE will have more to say about that at this year's annual Minds + Machines conference, scheduled to take place at the end of this month in San Francisco.

Posted by Jeffrey Schwartz on 10/06/2017 at 10:27 AM


NetApp To Bring Enterprise NFS File Service to Microsoft Azure

Microsoft and NetApp are working to deliver a native Network File System (NFS) service for Azure based on the storage provider's technology. The new Enterprise NFS Service is built on NetApp's flagship Data ONTAP storage operating system and data management platform and will be available for public preview in early 2018.

The two companies inked the latest of their many partnerships over the years, this one aimed at letting enterprises move their storage workloads to Microsoft Azure. The new Azure Enterprise NFS Service was announced at NetApp Insight, the company's annual customer and partner conference.

The announcement was overshadowed by the fact that the conference is taking place at the Mandalay Bay hotel in Las Vegas, the site of Sunday night's deadly mass shooting. While the preconference sessions were cancelled on Monday because the hotel was still on lockdown, the conference resumed yesterday and was marked by a moment of silence.

Administrators will be able to access the new service from the Azure console, which NetApp said will appeal to cloud architects and storage managers seeking to bring NFS services natively to Azure for workloads such as database-oriented analytics, e-mail and disaster recovery. In a statement, Anthony Lye, senior VP of NetApp's cloud business unit, said the solution will use NFS to provide "visibility and control across Azure, on-premises and hosted NFS workloads."

The offering will enable provisioning and automation of NFS services at scale using RESTful APIs, with added data protection through the ability to create on-demand, automated snapshots. The service will support both NFS v3 and v4 workloads running in Azure as well as hybrid deployments, according to NetApp. It will also include integration with various Azure services and workloads, including SQL Server and SAP HANA on Azure.

NetApp also said the pact calls for integrating Data ONTAP software with Azure Stack, which CEO George Kurian said in a recorded video presentation will speed the migration of enterprise applications to Azure Stack and Azure.

Kurian also said the two companies are integrating NetApp's recently launched all-flash-based FabricPool technology, which he said "manages cold data by tiering in a cost-effective manner to the cloud," with Azure Blob Storage. The combination "gives customers a really capable hybrid cloud data service and allows them to optimize their own datacenters," he added.

NetApp's Cloud Control for Office 365 now supports Azure Storage and will soon be available in EMEA and APAC regions, allowing local instances of Exchange, OneDrive and SharePoint. The two companies are working to provide more extensive integration between Cloud Control for Office 365 and the NetApp AltaVault archiving platform, enabling customers to choose between hot, cool and cold storage options in Azure for backup and disaster recovery requirements.

Posted by Jeffrey Schwartz on 10/04/2017 at 11:44 AM


IT Industry in Crosshairs of Las Vegas Shooting Tragedy

As the world today awoke to the news of last night's horrific mass shooting that originated from a 32nd floor room of the Mandalay Bay hotel in Las Vegas, killing at least 58 and injuring more than 550, many IT pros had just arrived or were on their way to NetApp's Insight conference.

The NetApp conference was set to begin today with preconference technical sessions, but as the shooting unfolded, the hotel was evacuated and all of today's activities were cancelled. NetApp later in the day said it had decided to resume the conference tomorrow for those still in attendance, with the keynote session to be kicked off by President and CEO George Kurian.

"I am shocked and saddened by the tragic event that occurred in Las Vegas at the Mandalay Bay last night," Kurian said in a statement. "I am sure you all share these sentiments. My heard and the hearts of thousands of NetApp employees break for the loved ones of those affected by the terrible events. NetApp will stand strong in the face of senseless violence and continue with the conference for those who want to attend."

Also kicking off this week is the Continuum Conference, an event for managed services providers, taking place further down the Las Vegas Strip at the Cosmopolitan. It too is set to go on, as reported by SMBNation's Harry Brelsford.

Some expressed mixed opinions on whether NetApp should resume the conference while the hotel remains an investigation scene. Some attendees were looking to help and donate blood, while others, as of last evening, still were unable to return to their rooms.

This tragedy, which has shocked the world, is also unnerving to many IT professionals who find themselves in Las Vegas several times a year for large conferences. Las Vegas hosts hundreds of conferences of all sizes every year, and it is the site of many major tech gatherings. Many of you, myself included, were at the Mandalay Bay just over a month ago for VMworld 2017. Many other tech vendors and conference organizers, including Veritas, Okta, Cisco, Hewlett Packard Enterprise, Dell EMC and VMware, have held major conferences in Las Vegas.

TechMentor, produced by Redmond magazine parent 1105 Media, was last held in Las Vegas in early 2016. The upcoming TechMentor conference is scheduled to take place in Orlando, Fla., Nov. 12-17 as part of the Live! 360 cluster of conferences, and will return to the Microsoft campus in Redmond next summer, as it did this year. After years of steering its conferences away from Las Vegas, Microsoft recently announced that next year's Inspire partner conference will take place in Las Vegas.

"My heart and thoughts are with all the people, families and responders in Las Vegas impacted by this horrific and senseless violence," Microsoft CEO Satya Nadella tweeted. VMware CEO Pat Gelsinger, pointed to a GoFundMe page posted by Steve Sisolak, Clark County Commission chair from Las Vegas for victims. "Our thoughts and prayers this morning are with the victims and families of Las Vegas shooting," Gelsinger tweeted.

NetApp is one of the largest providers of enterprise storage hardware and software with $5.52 billion in revenues for its 2017 fiscal year, which ended April 28. Many storage experts attending Microsoft's Ignite conference, which wrapped up on Friday, had signaled they were headed to Las Vegas for the NetApp event. Software and cloud providers listed as major sponsors of the NetApp technical event include Amazon Web Services, Cisco, Fujitsu, Google Cloud, IBM Cloud, Intel, Microsoft, Rackspace, Red Hat, VMware and many others, according to the company's sponsorship roster.

The shooter is alleged to have targeted an outdoor festival, firing round after round at the crowd for more than 10 minutes. It is believed to be the worst shooting by an individual, in terms of the number of deaths and casualties, in U.S. history. People have been asked to donate blood if they can, and according to reports, attendees at the NetApp conference and other events taking place are doing that and helping out however they can.

Unfortunately, what has transpired in Las Vegas could happen in many places in the U.S. and abroad. Hotels may now have to consider scanning the bags of guests just as airlines now do, and we need to remain aware of our surroundings, which we've become conditioned to do since the attacks of Sept. 11, 2001.

Posted by Jeffrey Schwartz on 10/02/2017 at 1:54 PM


Microsoft Means Business with Bing

Microsoft created some interesting buzz at this year's Ignite conference with the news that it is putting Bing at the center of its artificial intelligence and enterprise search efforts.

The new Bing for Business will be a key deliverable from the new AI Research Group Microsoft formed a year ago this week, led by Harry Shum, which brought together Microsoft Research with the company's Bing, Cortana, Ambient Computing and Robotics and Information Platform groups.

Bing for Business brings the Microsoft Graph to the browser, allowing employees to conduct personalized and contextual searches that incorporate information from Azure Active Directory, Delve, Office 365 and SharePoint. Li-Chen Miller, partner group program manager for AI and Research at Microsoft, demonstrated Bing for Business in the opening keynote session at Ignite, showing how she could look up her organization's conference budget.

Using machine reading and deep learning models, Bing for Business went through 5,200 IT and HR documents. "It didn't just do a keyword match. It actually understood the meaning of what I was asking, and it actually found the right answer, the pertinent information for a specific paragraph in a specific document and answered my question right there," Miller said. "The good news is Bing for Business is built on Bing and with logic matching, it could actually tell the intent of what I was trying to do."

Miller added that organizations deploying Bing can view aggregated but anonymized usage data. "You can see what employees are searching for, clicking on and what they're asking, so you can truly customize the experience," she said. It can also be used with Cortana.

The idea behind Bing for Business is it takes multiple approaches to acquiring information within an enterprise -- such as from SharePoint, a file server and global address book -- and applies all the Active Directory organizational contexts, as well as Web search queries, to render intelligent results, said Dave Forstrom, Microsoft's director of conversational AI, during an interview at Ignite.

"If you're in an enterprise that has set this up, now you can actually work that into your tenant in Office 365 and then it's set up through your Active Directory for authentication in terms of what you have access to," Forstrom said.

Quite a few customers are using it now in private beta, according to Forstrom. The plan is to, at some point, deliver it as a service within Office 365.

Seth Patton, general manager of Microsoft's Office Productivity Group, said in a separate interview that the Microsoft Graph brings together the search capabilities into a common interface. It also includes Microsoft's Bot Framework.

"Being able to have consistent results but contextualized in the experience that you're in when you conduct the search is just super powerful," Patton said. "We've never before been able to do that based on the relevance and the contextual pieces that the Graph gives."

Posted by Jeffrey Schwartz on 09/28/2017 at 10:40 AM


Rackspace Extends Protection to Hyper-V and PCI for Azure

Rackspace will secure Hyper-V workloads via its managed security service offering and has announced that its Microsoft Azure offering is now PCI certified. The company, one of the world's largest managed services providers, is talking up the new Hyper-V protection capabilities at this week's Microsoft Ignite conference, taking place in Orlando, Fla.

The Hyper-V protection extends across the company's Rackspace Managed Security (RMS) service, which provides overall threat protection using analytics and remediation across the company's managed services offerings. Also, a new Rackspace Cloud Replication for Hyper-V will be offered under the umbrella of the Rackspace portfolio of Microsoft services. The managed Rackspace Cloud Replication for Hyper-V offering is based on Microsoft's Azure Site Recovery and offers replication, storage and failover of Hyper-V virtual machines.

While Rackspace said the new offering lets organizations use the managed service to target Microsoft Azure rather than on-premises infrastructure, it also provides an alternative to moving those Hyper-V workloads into an infrastructure-as-a-service scenario, said Jeff DeVerter, CTO of Microsoft technologies at Rackspace, during an interview at Ignite.

"As excited as the world is about the cloud, not every workload is ready to be made into a modern application, and so customers can actually have a single tenant using Hyper-V and solve the problem of getting out of the datacenter, which most companies want to do today," DeVerter said. "But they don't have to take on the added expense of running an IaaS inside of Azure, and IaaS workload tends to cost more in azure than it does in a tenant."

DeVerter said running them in Hyper-V solves the problem of moving those workloads out of on-premises datacenters. Rackspace can work with customers over time to start extending those applications out into Azure, if transforming the application is the ultimate goal. Rackspace had already offered protection of workloads for VMware-based VMs.

Rackspace also used Ignite to announce that its managed Microsoft Azure service is now PCI certified for those running workloads that carry payment data. Microsoft Azure is already PCI compliant, but because Rackspace manages the workloads, it too had to secure the certification.

Posted by Jeffrey Schwartz on 09/27/2017 at 2:25 PM


Azure Portal Takes on More On-Premises Capabilities

The Azure management portal isn't just for Microsoft's public cloud. Microsoft kicked off its annual Ignite conference in Orlando, Fla., this week by announcing that Azure Stack appliances from Dell EMC, Hewlett Packard Enterprise and Lenovo are now available, along with news that it is bringing PowerShell, change tracking, update management, log analytics and simplified disaster recovery to the Azure portal.

Azure Stack appliances are the cornerstone of Microsoft's long-stated hybrid cloud strategy, bringing on-premises infrastructure in line with the same portal management experience as its Azure cloud and providing the ability to build and provision instances and applications using common APIs. "Azure Stack also enables you to begin modernizing your on-premises applications even before you move into the public cloud," said Scott Guthrie, Microsoft's executive VP for cloud and infrastructure, speaking in a keynote session at Ignite today.

"The command and control [are] identical," said Sid Nag, Gartner's research director for cloud services, during an interview at Ignite following the Guthrie's session. "If I have a craft, I don't have to learn new skills. I can transition very smoothly without a learning curve."

However, like any major new piece of infrastructure, despite significant interest, the pace and number of deployments remain to be seen, according to Nag. "Clients have been looking for an onramp to the public cloud, but they are not ready to commit," Nag said.

Microsoft maintains that enterprises should embrace the hybrid path it has championed for some time, but the company is also apparently giving them a nudge by bringing the Azure portal to their world, whether or not they use the public cloud. Adding these new capabilities brings the portal even to those not using Azure Stack. During the session, Corey Sanders, Microsoft's director of Azure compute, demonstrated the new features coming to the Azure portal.

PowerShell Built into Azure Portal: PowerShell is now built into the Azure portal, aimed at simplifying the creation of virtual machines. "It's browser based and can run on any OS or even from an iPhone," Sanders said. "If you are familiar with PowerShell, it used to take many, many commands to get this going. Now it takes just one parameter," he said. "With that, I put in my user name and password and it creates a virtual machine, so you don't have to worry about the other configurations unless you want to." Sanders said IT pros can use classic PowerShell WhatIf queries to validate what a given command will do.

Change Tracking: When a virtual machine is running, this feature tracks every change on the VM, including every file, event and registry change. It can scan a single machine or an entire environment, letting the IT pro discover all changes and investigate anything that requires attention.

Log Analytics: Administrators can now call on a set of prebuilt operations and statistics to discover how many threats exist. It looks beyond the built-in antimalware, letting the administrator go into the analytics designer to create queries that, Sanders said, are simple to write. "They are very SQL-like and allow me to do very custom things," he said. For example, a query can look at the processor time of all the virtual machines in a specific subscription over the last seven days, group them by computer and display a time chart showing CPU spikes across those seven days.

Update Management: Administrators looking to see which updates or patches have been installed, or are awaiting installation, can use this new feature in the Azure portal. It displays details of what the updates include and allows the administrator to choose which ones to act on. Sanders emphasized this works across an entire Azure or on-premises infrastructure of Windows and Linux machines.

Disaster Recovery: Noting that planning how to back up infrastructure and ensure a workable recovery plan is complex, Sanders said the site recovery capability in the Azure portal lets administrators pick a target region and provides a picture of how a failover scenario will actually appear. "The key point here is this isn't just a single machine. You can do this across a set of machines, build a recovery plan across many machines, do the middleware and actually run scripts according to that plan," he said.

Microsoft said these features will appear in preview mode today. The company hasn't disclosed a final release date.

Posted by Jeffrey Schwartz on 09/25/2017 at 12:01 PM


Veritas Broadens Data Management Integration with Microsoft Azure

Veritas is upgrading its entire data management software portfolio with performance improvements, more extensive support for hybrid and public clouds, and extended integration with Microsoft Azure.

The deeper Azure integration is key among the announcements at this week's annual Veritas Vision customer and partner conference in Las Vegas, along with news that the company is extending data deduplication and optimization in its flagship NetBackup and Backup Exec backup, recovery and DR offerings. The company today is also introducing Veritas Cloud Storage, a software-defined storage offering designed for massive amounts of data that applies intelligent analytics and classification to improve data discovery.

Late in addressing support for major public clouds including Amazon Web Services, Microsoft Azure and Google Cloud, Veritas last year stepped up its effort after spinning off from Symantec by delivering native integration with the major public clouds in NetBackup 8 and Backup Exec 16. At the time, Veritas pledged to offer better support for all three of the largest global public clouds, making data protection and management across hybrid environments a priority. 

Among the key promises at last year's Vision partner and customer conference, Veritas had announced a partnership with Microsoft to add deeper support for its Azure public cloud. Earlier this year the two companies extended their agreement to include Veritas data archiving solutions, including Enterprise Vault.

The deliverables announced this week include integrated support of Veritas' data archiving, management and orchestration tools with Azure. Specifically, the Veritas Resiliency Platform (VRP), which provides backup and disaster recovery management, orchestration and automation across multiple clouds and on-premises storage, can now establish Azure as a recovery target. It provides monitoring of failover and failback to and from Azure.

Also, a new release of the Veritas Access software-defined storage NAS scale-out file system supports Azure cloud storage. Veritas Access enables simultaneous access for multiprotocol file and object types such as NFS, CIFS, FTP, SMB and S3. And the company's relatively new Veritas Information Map, a visualization tool designed to give administrators a real-time, aggregated and detailed view of where data is stored for cost management and risk mitigation, will support Azure Blob Storage and Azure File Storage, as well as other Microsoft cloud storage services.

Veritas launched Information Map with a grand plan of giving administrators extensive visibility across the spectrum of cloud and on-premises storage. Building on that product, the company announced the new Connection Center, designed to provide visibility into 23 different cloud stores, with more planned. In addition to adding support for Azure Blob Storage and Azure File Storage, Veritas said it will be rolling out connectors to OneDrive, Google Drive, Box, G Suite, Google Cloud Platform Cloud Storage, AWS S3, Office 365, Exchange Online and SharePoint Online.

The company said Veritas Information Map can help determine which data must be preserved to help meet compliance regulations, such as the forthcoming European Union's General Data Protection Regulation (GDPR), which takes effect next year. Veritas Information Map connectors will support a variety of other Microsoft and third-party data sources. This integration with Azure will be available in the coming quarters.

The new NetBackup 8.1, aimed at large enterprises, brings improved data deduplication to improve backup and recovery times and lower bandwidth utilization. Also new in the release is Parallel Streaming, aimed at supporting hyperconverged clusters and scale-out backups for big data workloads, including support for HDFS.

A new release of Veritas Backup Exec 16 FP2 for small and midsize organizations also offers improved deduplication and the new CloudConnect Optimizer to improve network bandwidth utilization. Also building on its support for Azure released last year, the Backup Exec upgrade will add compatibility with Azure Government and Azure Germany, as well as Amazon Web Services S3 storage and Google Cloud regional class storage.

The new Veritas Cloud Storage offering, now in beta, is aimed at huge amounts of data. The company said the new data store, in addition to supporting the AWS S3 and standard Swift object storage protocols, will offer support for MQTT and CoAP to address new IoT workloads. Veritas said organizations can use it in geodistributed scenarios, and it can support complex storage policies, suited for GDPR and other environments where sensitive data are handled.

Posted by Jeffrey Schwartz on 09/20/2017 at 12:32 PM


Massive Equifax Breach Epitomizes Reckless IT Security Practices

Given the scope of a growing number of major data breaches, each one is harder to top, although security experts know there's no bottom limit to what could be next. The compromise of 143 million individual accounts that Equifax reported on Sept. 7, which included names, birthdates and credit card numbers, may be one of the most damaging breaches disclosed to date. Apparently tied to the Equifax breach, news surfaced Friday that information on more than 200,000 credit card accounts was also stolen.

The way Equifax executives and the IT security team appear to have failed to adequately apply patches, the amount of time it took to discover the depth of the breach and the delay in ultimately reporting it certainly paint a picture of a colossal failure at all levels -- including the curiously timed stock sales by top executives (who deny knowledge of the breach at the time of the sale) just days before the disclosure, as reported by Bloomberg.

Fallout from the breach has, not surprisingly, led to the reported departures of CIO Dave Webb and CSO Susan Mauldin late last week. Signs of trouble trace back to March 8, when Cisco warned that a security flaw in Apache Struts, the open source, Java-based framework widely used on interactive Web sites, was already "being actively exploited" -- months before the July 29 discovery of trouble at Equifax and the Sept. 7 revelation of how many potential customer records were stolen, according to a detailed report published by The Wall Street Journal today.

While the report noted many details remain unknown, it is understood that hackers pillaged information between May and the July 29 discovery. A few days later, Equifax brought in security consulting firm Mandiant, now a unit of FireEye and associated with many high-profile forensics investigations including the Yahoo breach last year, when data on more than 1 billion accounts were exposed.

Initially, Mandiant believed that 50 million accounts were compromised. But as its investigation continued, it determined it was nearly three times that amount, according to the report, which also noted the company registered the EquifaxSecurity2017.com domain for customers to seek information.

The report also noted last week's revelation by Alex Holden, founder of Hold Security, that an Equifax portal in Argentina was "wide open, protected by perhaps the most easy-to-guess password combination ever: 'admin/admin,'" he told the KrebsonSecurity Web site.

The true impact of the Equifax breach is yet to unfold, but it already has brought a new awareness to the risks at hand that many have long overlooked or ignored. How organizations address their own risks in wake of this remains to be seen.

Posted by Jeffrey Schwartz on 09/18/2017 at 7:39 PM


Rackspace To Acquire Rival Datapipe Creating Managed Cloud Services Giant

Once rumored as an acquisition target by a larger cloud provider, now it's Rackspace that's getting bigger. The company today said it is making its largest acquisition to date with its agreement to buy rival Datapipe, a combination that will bring together two leading providers of managed hosting and cloud services.

While terms weren't disclosed, the deal is expected to close by the end of the year, pending financing and regulatory approval. Bringing together the two companies will create a managed hosting and cloud service provider with a total of 40 datacenters throughout the world, making it the largest, according to Rackspace CEO Joe Eazor.

"It will have a big impact on our ability to deliver the multi-cloud services that today's customers are demanding," Eazor said, in a blog post. "They want us to give them expert, unbiased advice, and manage their applications on whichever clouds will best meet their unique needs. They want us to serve them at scale, from data centers and offices across the globe. And they want the world's best customer experience, across digital tools and results-obsessed customer service. Our mission is to meet those needs -- today and as they evolve."

Rackspace, based in San Antonio, has 11 datacenters worldwide and Jersey City, N.J.-based Datapipe runs 29. Both are privately held and offer managed public and multi-cloud services that collectively include the Alibaba Cloud, Amazon Web Services, Google Cloud Platform and Microsoft Azure.

The companies also have distinct services. Datapipe offers Alibaba Cloud integration, giving Rackspace a foothold in China, while Datapipe customers will be able to benefit from Rackspace's Google-based services. Rackspace, which was a key creator of OpenStack before contributing it to the open source community, offers managed OpenStack services, as well as VMware-hosted services and, through its Microsoft partnership, managed Exchange, SharePoint and Windows Server hosting. It also provides Office 365 and Google at Work services, along with managed Oracle and SAP services.

Datapipe will boost the Rackspace portfolio in several ways. It has a strong public sector business, with such customers as the U.S. Department of Defense, Department of Energy and the Treasury Department, and holds FedRAMP and FISMA certifications. It also serves agencies outside the U.S., including the U.K. Cabinet Office, Ministry of Justice and Department for Transport. Datapipe gives Rackspace an extended presence in many places where it lacked coverage or was limited, including the U.S. West Coast, Brazil, China and Russia. Datapipe also brings added managed public cloud migration services, colocation services covering four continents and experience bringing cloud services to midsize organizations as well as large enterprises.

The agreement is indeed a reversal of fortunes for Rackspace, which had struggled to grow a few years ago, and had put itself up for sale in 2014. After failing to forge an acceptable deal, Rackspace decided to go it alone, and added public cloud services from AWS, Google and Microsoft to its managed services portfolio. The company also went private last year after Apollo Global Management and its investors acquired it for $4.3 billion. Eazor came in as Rackspace's CEO earlier this year after leading Earthlink, which was acquired late last year by Windstream.

Posted by Jeffrey Schwartz on 09/11/2017 at 3:03 PM


Okta Extends Security Capabilities of Cloud Directory Service

Building on its goal to extend the single sign-on (SSO) capability of its cloud-based directory service, Okta has added native LDAP support to its Okta Universal Directory and has extended its multifactor authentication (MFA) offering to bypass on-premises ADFS servers, among other moves. The upgrades to its SSO portfolio were announced at the company's annual Oktane gathering, held last week in Las Vegas.

Until now, Okta has connected to LDAP directories, which are often found on legacy on-premises applications, security and network systems, by using replication. Now Okta's directory supports the LDAP protocol natively, allowing LDAP-based applications to authenticate against it directly, which the company said eliminates the need for multiple on-premises directories tied to specific systems and applications, including VPNs.

"You just point [applications] at the Okta Universal Directory, and it speaks the protocol, and you're integrated and on your way," said Okta CEO Todd McKinnon. "You can now retire those legacy directory workloads, make it much easier for you and more cost effective." By adding LDAP support, organizations can eliminate multiple on-premises directories, IDC Analysts Tom Austin and Frank Dickson, noted in a research note.

Okta said it is also responding to the growing push to bring multifactor authentication (MFA) into broader use. The company said basic two-factor authentication will be a standard feature for all customers. "Every company using our SSO product gets basic multifactor authentication for free," McKinnon said. "We think it pushes the industry forward. It makes it incredibly easy to deploy multifactor authentication in a usable, scalable way across your entire ecosystem and we think this will push the security industry forward."

The company has added new functionality to its Adaptive MFA (AMFA) offering, which provides context-based security capabilities. Among them is a new capability that will prevent its users from using common passwords or those already exposed from known breaches. Okta has also added IP blacklisting to protect against DDoS attacks. "AMFA can also detect anomalies based on user location and client, and determine whether authentication event is using a trusted/untrusted device," Austin and Dickson noted.

AMFA can also now be used with LDAP as well as a broader set of on-premises custom applications, including ADFS, Web apps, RADIUS and other SSO products such as CA SiteMinder, Oracle Access Manager and IBM's Tivoli Access Manager. "We've now extended our Adaptive MFA offering, enabling you to connect to anything behind an ADFS server, also to connect directly to anything speaking the remote desktop protocol," McKinnon said. "What you are seeing here is a broadening and a deepening of this integration and this product. It's not about applications, it's about being securely connected to everything. This is critical as you manage and secure your extended enterprise."

Okta, which said it now has 3,000 customers using the Okta Identity Network and 5,000 native connections, also announced a new developer toolkit and integrations with a number of providers, including ServiceNow, Workato, Palo Alto Networks, Cisco, F5 Networks, Citrix (NetScaler), Akamai, Box, Google (G Suite and Google Cloud), Sumo Logic, Splunk, Skyhigh, Netskope, MuleSoft, IBM (DataPower and QRadar) and Amazon Web Services.

Okta is regarded as a leading provider of SSO services to large enterprises and one whose business is now easier to gauge than others, thanks to the fact that it went public earlier this year. While the Andreessen Horowitz-backed company is still quite in the red, Okta surprised Wall Street yesterday by beating revenue estimates and upping its forecast for the rest of the year. Revenues of $61 million during its second FY18 quarter rose 63% over the same period last year. The company showed incremental progress on its road to profitability, reporting a net loss of $27.2 million, or 44.5% of total revenue, compared with $20.6 million, or 54.9% of revenue, a year earlier.

Posted by Jeffrey Schwartz on 09/08/2017 at 10:18 AM


VMware Adds Windows 10, Office 365 and Mac MDM to Workspace One

VMware has upped the stakes in delivering unified client and application enrollment and management with broad extensions to its Workspace One platform. In addition to configuring mobile phones and tablets, Workspace One can now enroll and manage Windows 10 devices and Chromebooks, with Mac support coming this fall, VMware announced at the company's VMworld 2017 conference in Las Vegas this week. VMware also revealed that Workspace One will enforce Office 365 data loss prevention policies, distribute policies and patches peer-to-peer via the Adaptiva software it licensed earlier this year, and automate Windows desktops in its Horizon offering.

Employees can now enroll their own or company-provided Windows 10 PCs the same way they configure their mobile phones and tablets, using Workspace One's AirWatch mobile device management (MDM) rather than joining them to an Active Directory domain. AirWatch last week also became the first third party to support the new Google Enterprise option for managing Chromebooks, scheduled for release by the end of September. The Mac client support will come this fall when Apple releases the next version of its operating system, known as High Sierra.

"What we are trying to do is give our customers and users a great experience to access all of their applications on any device in a consumer-simple and enterprise-secure way," said Sumit Dhawan, senior VP and general manager of the company's End User Computing business in his keynote address Monday afternoon at VMworld. "We do this with two things: disruptive innovations in our product and those partnerships in the ecosystem. The reason is, today's environment that most customers have are siloes. Siloes of our desktop, mobile and line of business applications. And with the product innovations and partnerships we believe we can stitch together [them] into a platform."

That common user and management experience gives Workspace One a much broader capability than existing MDM offerings including Microsoft's Enterprise Mobility + Security (EMS) suite, said Mitch Berry, VP of Unified Endpoint Management at Mobi, which provides managed mobility lifecycle management services and software. "I think their technology is a lot more advanced than a Microsoft, or a MobileIron or Citrix in that the experience they are able to provide across multiple device types really gives them the lead," said Berry, whose company has partnerships with all of the major MDM providers including Microsoft.

"Few vendors provide the breadth of Workspace ONE's offering, and VMware did a good job of telling a comprehensive EUC transformation story at VMworld," said Gartner Analyst Andrew Garver. Enterprises looking to shift to this more holistic approach to system and applications management, which Gartner calls "unified workspaces," will find the Workspace One appealing, according to Garver, because it provides "modern management across traditional and mobile endpoints, tight coupling with Horizon VDI and Apps and robust set of gateways for both cloud and on-premises."

The unified MDM capability is now possible because Microsoft, Apple and Google have released their management APIs for Windows, Mac OS X and the Chrome operating system. Microsoft did so earlier this year when it released the Intune portion of its Graph APIs. Google said it would do the same when it made its partnership announcement with VMware last week to enable Chromebook management with Workspace One, and Apple came on board during Dhawan's VMworld keynote, when Matt Brennan, head of global enterprise strategy at Apple, joined him on stage.

"Within VMware, we have leveraged those public APIs extensively," Dhawan said. By "extensively," Dhawan explained that their use goes beyond just enrollment and providing policy management; it's about integrating identity management and applying context, while striking a balance between providing user control and privacy and ensuring that corporate data remains secure.

Dhawan said Workspace One has evolved to meet its mission of bringing mobile, desktop and application management together. The company has added the VMware Identity Manager into its AirWatch console, which it said will provide a common interface for managing devices, context and identity. It also has a simplified mobile single sign-on interface and, using the Microsoft Graph API, it can apply Office 365 enrollment and management, as well as support for other SaaS apps. The new Workspace One release will manage and enforce security policies and provide Office 365 data loss prevention (DLP) upon release of the Office APIs by Microsoft.

"It gives you one way of unifying the experience across all applications and one place to unify your management across all devices," Dhawan said. "This we believe is a massive change and we think is a great opportunity for you."

Workspace One will enable administrators to control how policies, patches and upgrades are pushed out to branch offices using the Adaptiva OneSite tool that VMware licensed earlier this year. By distributing the updates on a peer-to-peer basis using a content delivery network (CDN), organizations don't need to have servers at those branch locations, said Jason Roszak, VMware's Windows 10 director of product management.

In addition to enabling PCs, Macs and Chromebooks to be configured and managed like mobile devices, VMware also said that the Workspace One Horizon 7 VDI and virtual application platform will be available on Microsoft's Azure cloud service in October. VMware, which announced its plans to offer Horizon 7 on Azure back in May, released the technical preview last week. The company, which first extended Horizon beyond vSphere to the IBM Cloud earlier this year, said the Horizon Cloud service running on Microsoft Azure will start at $8 per user, per month.

VMware also plans to enable automation of Windows desktops and applications using its Just in Time Management Platform (JMP) tools, which include Instant Clone, VMware App Volumes and User Environment Manager, by bringing them into a single console.

That will let administrators more easily design desktop workspaces based on a user's needs, said Courtney Burry, senior director of product marketing for Horizon at VMware, who gave a demo of the new capability during the keynote. "The underlying JMP automation engine [will] build those desktops for you," she said. The integrated JMP console is now in preview.

Posted by Jeffrey Schwartz on 09/01/2017 at 1:32 PM


Samsung Pushes Smartphone Threshold with Galaxy Note8

Samsung unveiled its widely anticipated new Galaxy Note8, a smartphone that continues to tip the scales toward providing PC power in a phone. It may also test the practical size limit of a phone with its 6.3-inch Quad HD+ AMOLED Infinity Display. The new Note8 will be available Sept. 15 and is among several major smartphone upgrades anticipated over the next month, including a new iPhone and Google's new Pixel 2.

The Galaxy Note is Samsung's premium (and most expensive) smartphone, aimed at "the ultimate multitaskers and power users," said D.J. Koh, president of Samsung's Mobile Communications Business, at the launch event held in New York City. Besides its large, high-resolution display (2960x1440 and 521ppi), the Note8's most prominent features include the new S Pen, aimed at broadening the use of drawing, annotating and notetaking; the ability to multitask on a split screen; and two rear 12MP cameras with image stabilization on both lenses. The Note8 is configured with an octa-core 64-bit processor with 6GB of RAM, 64GB of flash storage, a microSD slot that supports up to 256GB of additional external storage, and support for Bluetooth 5 and USB-C.

Given its name, the Note's hallmark feature is the S Pen, designed to let users draw and take handwritten notes on the device. The Note8's new S Pen appears more practical for mainstream use with this release and may even appeal to those who have had little interest in that capability in the past. "I think for the first time the pen is actually usable for more people. There's no excuse not to use the pen -- you can use it with the screen off," said Patrick Moorhead, president and principal analyst with Moor Insights and Strategy.

Improvements to the S Pen include a much finer tip and the ability to respond more naturally thanks to refined pressure sensors on the display. A new Live Messages feature lets users share handwritten messages as animated GIFs in popular messaging apps. The S Pen supports Samsung's "Always On" capability, letting users draw or take handwritten notes within apps as well as on a notepad app that is now easier to use when the phone is locked. The S Pen offers better precision and, based on my brief test of it, works quite intuitively. The offscreen notepad supports up to 100 pages of notes.

The new App Pair feature lets users simultaneously launch and use two apps on the same screen. App Pair will let users link most apps to the device's "Edge" panel, allowing any two to launch simultaneously in the new Multi window mode. One could look at a document, for example, while exchanging e-mails or participating in a videoconference.

Samsung officials gave significant emphasis to the Note8's two rear 12MP cameras that have Optical Image Stabilization (OIS) on both lenses, including the 2x telephoto lens. The camera's dual-capture mode lets users take two pictures simultaneously and keep both images, enabling the bokeh effect or allowing a background to have the same level of detail and focus as the foreground. A dual-pixel sensor on the wide-angle lens aims to compensate for low-light situations, the company said. It has an 8MP camera in front for selfies and conferencing.

With the launch of the Note8, the company is also looking to put to rest the embarrassing and costly hit its reputation took following the release of the device's predecessor, the Galaxy Note7, last year. Samsung had to pull the Note7 from the market shortly after its release when a flaw in the battery caused some devices to catch fire. Besides the billions of dollars in losses the company incurred, the episode briefly tarnished the reputation of one of the top brands among consumers and business users. "None of us will ever forget what happened last year," Koh said. "I know I won't. I will never forget how millions of dedicated Note users stayed with us."

Samsung appears to have already recovered from last year's debacle with this spring's release of the Galaxy S8, S8+ and the DeX Station, a dock that allows home and business users to connect the phones to a full-size display, mouse and keyboard, run them on a network and run virtual Windows desktops with Amazon WorkSpaces, Citrix Receiver and VMware desktops, as we reported this month. The Galaxy Note8 will also work with the DeX Station.

Posted by Jeffrey Schwartz on 08/25/2017 at 11:12 AM


Intel Showcases Solar Eclipse in Launch of 8th Gen Core Processors

Leading up to today's solar eclipse, Intel's client computing team celebrated the launch of its new 8th Gen Core i5 and i7 processors, which will fuel the next crop of PCs coming this fall for both the consumer holiday buying season and commercial and enterprise upgrades. The new chips are designed to give a major boost to Intel's entire line of mobile, desktop, commercial and high-performance PC processors, including a quad-core processor in a thin-and-light notebook form factor, and the company used today's historic total eclipse to showcase how they can enable virtual reality experiences.

Gregory Bryant, senior vice president of Intel's Client Computing Group, this morning gave a 15-minute live presentation with key members of his engineering team, talking up key characteristics of the new processors, which, like previous upgrades, offer a 40 percent performance boost over the previous 7th Gen Core processors. Intel also claims a roughly twofold boost over PCs that are more than five years old. The company also said a system with the new processors can play 4K UHD video locally on a single charge.

Systems with the new processors will also allow users to create a slide show up to 48 percent faster than on a five-year-old device and render an edited video that would take 45 minutes on that same machine in just three minutes, Bryant said in a blog post announcing the new processors. Bryant also noted the processors are optimized for Microsoft's new Windows Mixed Reality technology coming in the Windows 10 Fall Creators Update and can use Thunderbolt 3 external graphics, up to 4K, for advanced gaming and virtual reality.

Intel is releasing the new processors in stages over the coming months, starting with its U-Series mobile CPUs in the coming weeks with new PCs from Dell, HP, Lenovo, Asus and Acer. "We've put more performance and capabilities than ever into our mobile 15-watt Core i5 and i7 processors, we've added two powerful and power efficient cores and we've created a quad-core processor that fits into the same tiny package as its dual-core predecessor," said Karen Regis, Intel's mobile marketing manager, speaking during the brief broadcast event. The initial release will be based on Intel's existing 14-nanometer process, but the company is moving to 10-nanometer processors in the future.

Later this fall, Intel will ship its S-Series processors for desktop and all-in-one PCs, followed by the Y-Series processors intended for fanless detachable tablets and finally the H-Series for high-performance laptops and mobile workstations.

Timing the launch with the solar eclipse was indeed a publicity effort. Nevertheless, the company used the backdrop of the eclipse to showcase its capabilities for advanced photography. Bryant said the company plans to release a virtual reality experience created by artist and photogrammetrist Greg Downing, who is capturing the solar eclipse in Jackson Hole, Wyo. One of the challenges Downing said he faces with the eclipse is the very rapid change in light, with only two minutes to take all of the photographs needed. "The advancement of the amount of compute that we can throw at a problem has completely transformed what we've been able to do with photogrammetry," Downing said. "The eclipse is a very rare and wonderful natural phenomenon that we hope to allow the rest of the world to see that missed it. We'll use everything you can give us for processing power."

Posted by Jeffrey Schwartz on 08/21/2017 at 12:12 PM


Microsoft Extends Serverless Computing with Azure Event Grid

Microsoft this week launched Azure Event Grid, a new managed service aimed at developers who are increasingly building event-based, responsive applications that require event routing and handling. This aims to fill what Microsoft said is a key missing piece in its serverless computing platform.

Azure Event Grid, available as a technical preview, extends Microsoft's existing serverless offerings including its Azure Functions compute engine and Azure Logic Apps, which provides serverless workflow orchestration. The addition of Azure Event Grid addresses the growth of the responsive applications appearing on Web sites and mobile apps, as well as from data streams generated from sensors, embedded systems and other IoT devices, according to Corey Sanders, director of Azure compute, who announced the new service in a blog post.

The new service is the latest addition to address what Microsoft sees as a growing shift toward serverless computing that is allowing developers to build their applications without having to focus on infrastructure, provisioning or scaling. Sanders said Azure Event Grid is a single service that manages the routing of programmed events from any source to any endpoint and with any application. "Azure Event Grid completes the missing half of serverless applications. It simplifies event routing and event handling with unparalleled flexibility," according to Sanders.

"With Azure Event Grid, you can subscribe to any event that is happening across your Azure resources and react using serverless platforms like Functions or Logic Apps," he added. "In addition to having built-in publishing support for events with services like Blob Storage and Resource Groups, Event Grid provides flexibility and allows you to create your own custom events to publish directly to the service."

The service also includes built-in handlers for various Azure services such as Functions, Logic Apps and Azure Automation, Sanders noted, and provides flexibility in handling events by supporting custom WebHooks that can publish events to any service, including third-party services outside of Azure.

Azure Event Grid allows for direct event filtering, is designed to scale "massively" and eases the move toward ops automation with a common event management interface for addressing operational and security automation, including policy enforcement, enabled by Azure Automation's ability to respond to the creation of virtual machines or changes within the infrastructure.

In the current preview, Azure Event Grid can integrate with five event publishers -- Blob Storage, Resource Groups, Azure Subscriptions, Event Hubs and Custom Topics -- and four event handlers: Azure Functions, Logic Apps, Azure Automation and WebHooks.

Additional event sources that Microsoft plans to add include Azure Active Directory, API Management, IoT Hub, Service Bus, Azure Data Lake Store, Azure Cosmos DB, Azure Data Factory and Storage Queues. Pricing for Azure Event Grid is usage-based and is detailed on Microsoft's pricing page. During the preview Microsoft is offering the first 100,000 operations at no charge and the cost thereafter is $0.30 per million operations.
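As a rough illustration of that pricing, a custom topic that handled, say, 10 million operations in a month during the preview would run about:

\[ (10{,}000{,}000 - 100{,}000) \times \frac{\$0.30}{1{,}000{,}000} \approx \$2.97 \]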

Posted by Jeffrey Schwartz on 08/16/2017 at 11:20 AM0 comments


Microsoft Targets Cloud HPC Orchestration with Acquisition of Cycle Computing

Microsoft has acquired Cycle Computing, a leading provider of orchestration software designed to deploy and manage large workloads in the three major public clouds, as well as in on-premises and hybrid datacenter infrastructures.

Cycle Computing was ahead of its time 12 years ago with its orchestration software designed to deploy and manage virtual clusters and storage to enable HPC workloads such as machine learning, genomic research and complex simulations. The company's orchestration software now can run those workloads using any combination of the three largest public clouds: Amazon Web Services, Microsoft Azure and Google Cloud. Cycle Computing says its software is used by organizations of all sizes, including some with the largest workloads, among them JP Morgan Chase, Lockheed Martin, Pfizer and Purdue University.

The orchestration platform, called CycleCloud, includes a workflow engine and load balancer that utilize cloud resources to enable computation at any scale, while using encryption to ensure data remains secure. Cycle Computing is on pace to manage 1 billion core-hours this year and is growing at a 2.7x annual pace, according to the company's CEO Jason Stowe, in an announcement about the Microsoft acquisition posted today.

Figure 1. Cycle Computing's CycleCloud HPC workflow and load balancing platform.

Cycle Computing's customers spend $50 to $100 million on cloud services, according to Stowe, though he emphasized its software also supports on-premises and hybrid workloads. "Our products have helped customers fight cancer and other diseases, design faster rockets, build better hard drives, create better solar panels and manage risk for peoples' retirements," Stowe noted.

Microsoft Corporate VP Jason Zander, who announced the acquisition today, said Cycle Computing brings its tools, capabilities and experience supporting some of the largest supercomputing scenarios to Azure. Zander also believes Cycle Computing will play a role in picking up the pace in mainstream cloud migration.

"Cycle Computing will help customers accelerate their movement to the cloud and make it easy to take advantage of the most performant and compliant infrastructure available in the public cloud today," according to Zander.

Asked whether CycleCloud will continue to support AWS and Google Cloud, a spokeswoman for the company said it will continue to support existing clients whose implementations include the two competing clouds, but "future Microsoft versions released will be Azure focused. We are committed to providing customers a seamless migration experience to Azure if and when they choose to migrate."

Interestingly, Amazon Web Services showcased Cycle Computing at its first AWS Summit event in New York four years ago. In the blog post announcing the acquisition, Stowe noted that the company was self-funded with just $8,000 put on a credit card and took in only some funding from angel investors.

In an interview during the 2013 AWS Summit, Stowe described an HPC job that used 10,600 server instances in Amazon's EC2 to perform a job for a major pharmaceutical firm. To run that simulation in-house would require the equivalent of a 14,400 square-foot datacenter which, based on calculations from market researcher IDC, would cost $44 million, he said at the time. "Essentially, we created an HPC cluster in two hours and ran 40 years of computing in approximately 11 hours," Stowe explained. "The total cost for the usage from Amazon was just $4,472."
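Those figures hold up to a rough sanity check. The short Python sketch below works through the arithmetic, with the reading of "40 years of computing" as serial, single-core time stated explicitly as an assumption.

    # Rough sanity check of the figures Stowe cited (assumptions noted inline).
    instances = 10_600
    wall_clock_hours = 11
    total_cost = 4_472.0

    instance_hours = instances * wall_clock_hours               # 116,600 instance-hours
    print(f"{instance_hours:,} instance-hours")
    print(f"~${total_cost / instance_hours:.3f} per instance-hour")  # roughly $0.04

    # Reading "40 years of computing" as serial single-core time (an assumption):
    serial_hours = 40 * 365 * 24                                 # ~350,400 hours
    print(f"implied speedup of roughly {serial_hours / wall_clock_hours:,.0f}x")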

At the time of our discussion, when I asked if he was looking to other cloud services such as Microsoft Azure, Google Cloud or others, he responded that only AWS was suited to such jobs. "Who knows what the future holds but from our perspective AWS has a tremendous advantage in its technology," he said back then. Clearly, Stowe didn't envision a future in which Cycle Computing would become part of Microsoft, but that future has arrived.

Posted by Jeffrey Schwartz on 08/15/2017 at 7:46 AM0 comments


Consumer Reports Casts Doubts on Microsoft Surface Hardware Reliability

In a rebuke of Microsoft's Surface Pros, Surface Books and new Surface Laptops, Consumer Reports magazine last week removed its "recommended" designation from the PCs and tablets. The nonprofit, subscriber-funded publication rescinded its stamp-of-approval after a survey of its readership led Consumer Reports to forecast that 25 percent of Microsoft's Surface-based systems will "present owners with problems" within two years of owning them.

The Consumer Reports National Research Center surveyed subscribers who bought new tablets and laptops between 2014 and the beginning of 2017. Complaints ranged from systems freezing or shutting down unexpectedly to dissatisfaction with the responsiveness of the touchscreens.

Consumer Reports, which purchases and tests products in its labs on various metrics including display quality, battery life and ergonomics, had previously rated several of Microsoft's Surface systems either "Very Good" or "Excellent," including the newest Surface Pro, released in June. However, the publication made the change because many customers care equally about reliability, which its reader survey has now brought into question.

"While we respect Consumer Reports, we disagree with their findings," wrote Microsoft's Corporate VP for devices Panos Panay, in a blog post challenging the 25 percent failure rate. "In the Surface team we track quality constantly, using metrics that include failure and return rates -- both our predicted 1-2-year failure and actual return rates for Surface Pro 4 and Surface Book are significantly lower than 25 percent. Additionally, we track other indicators of quality such as incidents per unit (IPU), which have improved from generation to generation and are now at record lows of well below 1 percent"

Panay also noted that a survey conducted by researcher Ipsos, commissioned by Microsoft, found a 98 percent satisfaction rate among Surface Pro 4 customers. Interestingly, the Consumer Reports advisory didn't make note of the widely publicized battery issues faced by Surface Pro 3 users. My Surface Pro 3 barely gets three hours these days, despite my applying all of the firmware patches Microsoft has released.

Among those commenting on the Consumer Reports advisory, a good number had various complaints, while others offered more positive assessments of their experience with the hardware. Personally, I turn to Consumer Reports when making a major purchase such as a car or home appliance (though not for computer hardware or software). Having just bought a new Surface Laptop of my own a few weeks ago, I'm not planning on returning it based on the Consumer Reports advisory, for a number of reasons.

First, based on its own lab tests of the Surface hardware, Consumer Reports' testers were impressed with the various systems. It's hard to conclude that 25 percent of these machines will develop problems within two years, though any system can experience issues of varying degrees. But perhaps most important, the survey period ended earlier this year -- before Microsoft released its latest systems -- meaning it has no bearing on the current hardware.

Panay said in his post that Microsoft has learned a lot since rolling out its first Surface devices five years ago. Analyst Patrick Moorhead, founder of Moor Insights & Strategy, who has owned every Surface model Microsoft has released, agreed. In an e-mail, Moorhead noted the various software issues Microsoft experienced with its Skylake-based SKUs almost two years ago. "These issues have been resolved and led to Microsoft's conservativeness with their latest crop of products," Moorhead noted. "Notice on Surface Laptop and Surface 5 that Microsoft did not embrace Kaby Lake, USB-C or Thunderbolt 3, as this conservatism should lead to very high-quality experiences."

So far, I'm happy with the Surface Laptop and hope that I'll feel the same way in a few years. I'll follow up with my impressions of the device after a few more weeks of using it.

Posted by Jeffrey Schwartz on 08/14/2017 at 10:56 AM0 comments


3 Top Tips for Sysinternals Tools

Aaron Margosis (@AaronMargosis), a self-described "Windows nerd" and an 18-year Microsoft veteran who is now a principal consultant for the company's Global Cybersecurity Practice, is one of the leading experts when it comes to Sysinternals, a set of free Windows management utilities. In addition to working with key customers on various security issues, Margosis focuses on his core expertise of Windows security, least privilege, app compatibility and configuring locked-down environments, according to his bio. He has also collaborated with Microsoft's Mark Russinovich on the recently updated book Troubleshooting with the Windows Sysinternals Tools, 2nd Edition (Microsoft Press).

During Margosis' presentation at this week's annual TechMentor conference in Redmond, Wash., titled "The Sysinternals Tools Can Make You Better at Your Job," he gave a deep dive into several of the more popular tools, including Autoruns, Process Monitor and Desktops. "These are valuable, lightweight tools that aren't available anywhere else," Margosis said.

Autoruns for Troubleshooting
During his presentation, Margosis shared how to use Autoruns to check out startup entries. By default, the tool hides anything that is part of Windows or comes from Microsoft, to expedite troubleshooting. "The pink items are potentially suspicious," he explained. Autoruns can also help verify code signatures, to ensure entries are digitally signed by their vendor, and submit questionable entries to VirusTotal, a Web application service that hosts more than 60 antivirus engines. "You can submit files and ask it what it thinks of this program," he said. "You can also click on links and get details."

To find the things that are problematic, use the Hide VirusTotal Clean Entries option, which leaves you with the unknowns or entries that were flagged by one or more antivirus engines. "So, what do you do when you find something in there you don't want on your computer?" he asked, segueing into another demo. "Right click on the entry to go to where that item is configured, or right click and go to the image," he said. "This goes to the actual file in Windows Explorer. Then you can right click to delete the entry completely. When you delete it, there's no way to get it back. That might cause damage you can't undo."

The safer thing to do is simply uncheck the questionable entry. "That will disable the thing from running, but don't delete it." Margosis then used Process Monitor to help with a problem he had recently experienced in Office, where the white theme occasionally reverted to the default colorful one. "The next thing I want to show you is something I was just working on last week. The 'colorful' theme is the new default for Office. It kept switching back to 'white.' The tool I decided to use for this is Process Monitor."

Process Monitor
Process Monitor is the tool for tracking system activity; it captures registry and file system operations as they happen. "This is the tool I want to use to determine what is setting the colorful theme back in place," he says. "By default, all that data is stored in virtual memory of the Process Monitor. So, there are a few different ways to run a trace for a long time."

One is backing files: instead of writing to virtual memory, Process Monitor writes to a file and keeps appending to it. Another is history depth, which stores only a certain number of events and drops the older ones. Dropping filtered events is the third. "And that's what I used," says Margosis. "I set the filter only looking for specific actions. I'm going to look for things happening in Excel and writing to the registry."

Margosis said he ran his trace for six hours, captured 21 events and did not bog down the system at all. "I was able to nail down exactly what happened," he says. "It was actually [a bug within] Office itself."

Desktops
Margosis wrapped up his presentation with a demo of the Sysinternals Desktops tool. Within a Windows window station, you can run one or more desktops. Windows run on each desktop and can send messages back and forth to communicate between desktops. You can create up to four desktops and use hotkeys to switch between them -- the theory being you could have work on three of them and soccer on the fourth. "The Desktops tool takes care of all that," he says. "And it will not lose track of which should be hidden or shown."

The Sysinternals utilities are critical tools for maintaining the security of Windows, and the topic will get more attention when MVP Sami Laiho gives an all-day workshop on how to secure Windows workstations, servers and domains at the next TechMentor conference, which takes place as a track in the annual Live! 360 confab, Nov. 13-17 in Orlando.

Posted by Lafe Low on 08/11/2017 at 8:00 AM0 comments


Windows 10 Adapts to the Millennial-Dominated Workplace

Like it or not, IT pros must now cater to the whims and tastes of the millennial workforce to keep them productive and satisfied with their jobs. Giving employees of this digital generation what they expect -- more in terms of capability and less in terms of restrictions -- runs counter to the way IT organizations have traditionally operated and requires a whole new mindset.

These sweeping changes mean IT must give employees the leeway they need to remain productive, while maintaining security and control of enterprise information. IT needs to apply enough control to ensure productivity and security, but not so much as to scare away this new generation, according to Michael Niehaus, director of product marketing for Microsoft's Windows Commercial group, speaking earlier this week in the opening keynote address at the TechMentor conference, taking place at Microsoft's headquarters in Redmond, Wash. (TechMentor is produced by Redmond magazine parent company 1105 Media.)

Millennials in the workforce are quickly outnumbering Gen Xers and Baby Boomers combined, and are expected to account for more than half of all workers by 2020, according to various forecasts. According to surveys, they're also more likely than previous generations to leave an organization if they're not satisfied. This has created a contrast between what Niehaus calls classic and modern IT, but it's not so much a question of one model against the other. "I really see this as embracing both at the same time; using both of them when appropriate," Niehaus told attendees. "There is a place for multiple devices and a place for proactive versus reactive support. We need to take both into account."

Niehaus said migrating to Windows 10, which is on the rise this year, is the best approach to supporting both models. During his one-hour keynote, titled "Modernizing Windows 10 Deployment and Servicing," Niehaus outlined the new cadence and the steps IT professionals should take to keep their Windows shops up to date. Moving to Windows 10 not only means deploying a new operating system; it also introduces a shift to more rapid updates and servicing. Niehaus recognizes that this in itself can be quite a disruption to the status quo.

"While technology clearly changes on a rapid regular basis, when it comes to profound changes, there haven't been as many lately," he said. "It feels like we've been stagnant for last 15 years or so. The pace of change hasn't really affected IT as much. The PC environment probably looks same today as it did 10 or 15 years ago." Such stagnation is no longer going to be the norm, he said.

Employees of this digital generation expect more in terms of capability and less in terms of IT exerting control over them. "They're used to having all sorts of devices. They don't have to be taught," he said. "They don't understand why IT is so heavy handed in control of PCs. They expect something different."

With the need to strike the proper balance of giving employees more flexibility (while maintaining an appropriate level of control), Niehaus pointed to the recently announced Microsoft 365 service, which includes Windows 10 Enterprise, Office 365 ProPlus and the Enterprise Mobility + Security service. The latter, EMS, includes Azure Active Directory for authentication, Intune device management and data loss protection features. The new Microsoft 365 bundle, still in preview, aims to help organizations move to the cloud, Niehaus said. "There are lots of cloud services feeding into this system, with Windows being updated more frequently -- updates twice a year instead of twice per decade," he said.

Windows updates essentially become a cloud-based service. The new shift to Windows as a Service means users will get the latest features on their devices. "The goal is to make this as simple as possible," says Niehaus. "Windows as a Service is an iterative process. We want to be able to move to [a] servicing approach. We also want an easy path to Windows 10."

Niehaus acknowledges there are scenarios where traditional deployment with system images will still be required, but he hopes to minimize those, especially as Windows servicing progresses. "Our hope is you never need to do that again," he said. "We'll keep you up to date using Windows as a Service." A day later, on Aug. 9, Niehaus published a blog post following up on his presentation and explaining all of the Windows-as-a-Service changes. The post referred to documentation posted in April and a five-minute Microsoft Mechanics video posted last month.

Microsoft intends to have the Microsoft 365 service arrive ready to use. "Ideally we take this further and ship [the device] to the employee," Niehaus said. "Then, straight out of the box, when they get it, with just a few simple steps, they can set it up without IT touching it."

Windows Autopilot, which Microsoft announced in late June, is also a key piece of that strategy. "We can do this from the cloud completely," Niehaus said. "The idea with the cloud-driven approach is you can take a device and register it in the cloud so when that device is turned on, it can, more or less, configure itself. That is the golden path. We want to eliminate all touching and imaging of the devices. We want to make it easy for users to do it themselves."

The basic steps for using Windows Autopilot, Niehaus said, are to register a device to indicate it belongs to an organization and then assign a profile, which includes system settings. Then you ship the device to the user. "They turn it on and off they go," he says. "That's when the magic happens."

After a detailed demonstration of the configuration process for the Microsoft 365 service, Niehaus shifted back to discussing the new Windows delivery and deployment cadence.  "Once we have Windows 10, we never want you to go through an upgrade project again. We want you to stay current on that." To maintain that cadence, there will be new Windows 10 releases twice a year, in March and September. "Let's just call them semi-annual channel releases."

On the System Center Configuration Manager side, Microsoft will issue three release updates: in March, July and November. "The first and last roughly correlate to Win 10 updates," he said. "They need that intermediate middle of the year release for Configuration Manager."

And that is the primary message from Microsoft: to prepare for the new update cadence. "We used to talk about every three years. Now it's every six months to get latest productivity and security features," Niehaus said. "We need to stay ahead of the bad guys. We will always have new security capabilities. It's a constantly moving target. And [improved security] is the number one reason we've seen organizations move to Windows 10."

A number of sessions at the next TechMentor, scheduled for Nov. 13-17 as part of the Live! 360 conference in Orlando, Fla., will address Windows 10 migration.

Posted by Lafe Low on 08/10/2017 at 8:31 AM0 comments


Targeted Attacks Stoke Concerns of Rising Cyberespionage

Nearly every IT security professional is concerned that the latest advanced persistent threats (APTs) have made their organizations potential targets of sophisticated cyberespionage campaigns. A survey of IT security leaders in the U.S. and several European countries conducted by security software provider Bitdefender found that 96 percent are concerned about APTs, while 61 percent worry about becoming victims of targeted corporate or industrial espionage.

The survey of 1,051 IT security decision makers, conducted in April and May of this year, also found that 58 percent believe they could be targeted by cyberespionage campaigns using APTs, with 36 percent acknowledging that they were at risk of sophisticated cyberespionage attacks aimed at exfiltrating critical information.

Office 365 attacks are of particular concern since they provide access to e-mail accounts and files stored in OneDrive. Cloud access security broker (CASB) Skyhigh Networks last month revealed a campaign specifically targeting its large enterprise customers' Office 365 accounts.

Skyhigh reported it detected 100,000 failed login attempts originating from 67 IP addresses and 12 networks throughout the world. The campaign targeted 48 of its customers' Office 365 accounts, according to Sandeep Chandana, senior VP of engineering at Skyhigh Networks. Chandana revealed the brute force attack in a blog post on July 20, noting the attack didn't cast a wide net, but rather was targeted at high-level executives.

"The attack was really sophisticated," Chandana said in an interview this week. "It worked really slow, under the radar. Typical systems didn't detect it because it was timed in such a way to evade typical solutions." Based on the intelligence Skyhigh gathered, the attackers appeared to have passwords of high-level executives, many of them C-level, Chandana said, but not their login IDs. "They were trying to use different variations of user names with the same passwords," he said.

Chandana said Skyhigh alerted the ISPs and Microsoft of the incident, and the attempted logins have since tapered off. No one was breached, as far as the company is aware, he said, noting these were all Fortune 250 companies that use two-factor authentication.

IT security pros believe competitors (61 percent) are the top culprits behind these campaigns, according to the Bitdefender survey, followed by hacktivists (56 percent), foreign state-sponsored actors (48 percent) and national government agencies (41 percent). "Most advanced persistent threats are not limited to state-sponsored attacks, as enterprises can also fall victim to attackers that exploit zero-day vulnerabilities to install highly targeted malware to spy on companies and steal intellectual property," according to the report's executive summary. Only 32 percent believe that insiders are likely attackers when it comes to APTs.

Posted by Jeffrey Schwartz on 08/04/2017 at 1:37 PM0 comments


Microsoft Extends Windows Hello Reach with New Keyboard

While a growing number of PCs now support Windows Hello authentication, many still can't take advantage of the feature, which is aimed at letting users access their systems without a password. The new Microsoft Modern Keyboard with Fingerprint ID, launched last week, brings Windows Hello authentication to any PC running Windows 10.

Biometric authentication was among many key new features introduced with Windows 10, released two years ago this week. The number of new systems that support Windows Hello is on the rise, but many still lack the necessary hardware -- especially lower-end PCs and older ones upgraded to Windows 10. Microsoft is looking to wean users off the habit of reusing passwords across accounts and believes Windows Hello will encourage that shift.

"Studies show more than 80 percent of people use the same password across multiple Web sites, managing around 20-30 different accounts," noted Jennifer Thompson, product marketing manager for accessories, in a blog post. "We want to make sure that everyone running Windows 10 can experience the beautiful relief that comes from letting go of your written Pa55w0Rd5! So, we worked to deliver a predictable, intent-driven and simple solution for someone to quickly and securely log into their PC, or authenticate an action."

Microsoft earlier this year pointed to a number of peripherals that provide Windows Hello authentication from Acer, Asus, Fitbit, HID, Huawei, Nymi, Plantronics, Vanconsys and Yubico. RSA Secure Access Authenticator also provides Windows Hello authentication. Nevertheless, while Windows Hello supports biometric authentication to PCs, which reduces the likelihood of unauthorized access to an unattended PC, the number of apps that support Windows Hello integration is paltry. Besides Microsoft OneDrive, the company now points to only 11 third-party tools that support Windows Hello, such as Dropbox, Citrix ShareFile, OneLocker Password File and Cloud Drive.

The new keyboard can link to PCs via Bluetooth or a hardwire connection and costs $130, with a matching mouse priced at $50.

Posted by Jeffrey Schwartz on 07/31/2017 at 11:40 AM0 comments


Azure Container Instances Take VMs Out of the Mix

Microsoft's new Azure Container Instances (ACI) introduces a way to spin up workloads with the precise amount of CPU cores and memory needed on demand, use them for as little as a few seconds, then instantly remove them and pay just for what was used. And it does so without the need to provision and manage Hyper-V or any other virtual machines.

ACI, revealed yesterday, is what Microsoft believes is the fastest and easiest way yet to deploy and manage container-based workloads in the cloud, pushing the infrastructure up a layer. Corey Sanders, director of Azure Compute at Microsoft, announced the new service during a webcast, where he demonstrated the ability to spin up and take down container-based workloads.

Microsoft's existing Azure Container Service, released over a year ago, introduced the ability to deploy containers in its public cloud. Using a choice of orchestration tools such as Mesosphere DC/OS or Docker Swarm, the current Azure Container Service deploys containers as a set of virtual machines. Administrators must manage both the VMs and the containers, whereas the new ACI provides on-demand provisioning and deprovisioning without requiring virtual machines to spin up the containers.

"That is what makes Azure Container Instances unique," Sanders said. "It directly deploys the containers instead of deploying and requiring customers to manage the virtual machines underneath. This pushes the infrastructure up a layer, enabling customers to just work with containers, and no longer have to worry about container deletion, creation, patching and scaling of virtual machines. That's all taken care of by just exposing the container as the top-level resources."

Does this new capability portend the demise of the virtual machine? Given that Hyper-V is a key component upon which Azure itself is built, probably not anytime soon. Even with ACI, Sanders noted that Docker Hub and the Azure Container Registry will be supported and that Hyper-V virtualization in Azure enables the deployment and secure isolation of containers. It's still early days for containers.

"Containers are certainly emerging as a platform for newer workloads and for enabling some of the points around agility and scale," he said. "[But] we expect that many workloads will require the isolation characteristics and scale characteristics that come with full virtual machines. We do expect those will continue to be used, and deployed by customers, both in the public cloud and on premises, whether it be on Hyper-V or other hypervisors."

 

Posted by Jeffrey Schwartz on 07/27/2017 at 2:20 PM0 comments


Microsoft Readies Simplified and Rapid Container Deployment Service in Azure

Looking to simplify the deployment of containers, Microsoft is ramping up a new service in Azure that will allow customers to deploy them on the fly. The company today announced its new Azure Container Instances (ACI) and a Kubernetes-based connector that will enable orchestration.

Microsoft is offering the new ACI service in technical preview, initially for Linux containers, with support for Windows planned in the coming weeks. The preview initially will be available in the U.S. East, U.S. West and Europe West Azure regions. Alongside the preview, the company has also released its new ACI Connector for Kubernetes, an open source tool that lets Kubernetes-based clusters orchestrate containers running on the new service.

Corey Sanders, Microsoft's director of Azure Compute, described ACI as variably sized single containers that deploy on the fly without requiring virtual machines, with usage to be billed by the second. "It offers the fastest and easiest way to run a container in the cloud," Sanders said during a webcast announcing ACI and the Kubernetes connector. Sanders noted that for typical container deployments, administrators must create a virtual machine to host that container.

"That amount of time to get started and that amount of work to deploy a container has now gone away with Azure Container Instances," Sanders said. "It offers a much faster way to launch those containers by directly launching the container itself instead of first creating the virtual machine."

Sanders elaborated on the announcement in a blog post. "As the service directly exposes containers, there is no VM management you need to think about or higher-level cluster orchestration concepts to learn," he noted. "It is simply your code, in a container, running in the cloud."
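To make "your code, in a container" a bit more concrete, the sketch below expresses the kind of container-group definition ACI works with -- an image plus explicit CPU and memory requests -- as a plain Python dictionary. The field names approximate the ARM/REST shape and are assumptions on my part, so treat them as illustrative rather than an exact schema.

    # Illustrative ACI container-group definition expressed as a plain dict.
    # Field names approximate the ARM/REST schema and are assumptions; check the
    # current API reference before relying on them.
    import json

    container_group = {
        "location": "westus",
        "properties": {
            "osType": "Linux",
            "containers": [{
                "name": "hello-web",                      # hypothetical name
                "properties": {
                    "image": "nginx:latest",
                    "resources": {"requests": {"cpu": 1.0, "memoryInGB": 1.5}},
                    "ports": [{"port": 80}],
                },
            }],
            "ipAddress": {"type": "Public", "ports": [{"protocol": "TCP", "port": 80}]},
        },
    }

    print(json.dumps(container_group, indent=2))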

In addition to simplifying the deployment and management of container-based workloads by taking the VM out of the picture, Sanders emphasized that the appeal of containers is their ability to scale and shut down on the fly in an agile and cost-effective manner. In addition to the per-second billing, customers can pay by the gigabyte and CPU, allowing them to choose the capacity needed for the containers.

"This lets customers and developers make sure the platform is perfectly fitting their workload, not paying for a second more than required and not paying for a gigabyte more of memory than necessary," he said during today's call.  "Customers can designate the amount of memory separate from the exact count of vCPUs, so your application perfectly fits on the infrastructure," he added in his blog post. "With ACI, containers are a first-class object of the Azure platform, offering Role-Based Access Control (RBAC) on the instance and billing tags to track usage at the individual container level."

In addition to orchestrating ACI through the Kubernetes connector, customers can deploy ACI instances from a Docker Hub repository, which supports a large set of commercial, custom or internally built containers, from an Azure Container Registry, or from private registries.

Along with today's new service, Microsoft said it has joined the Cloud Native Computing Foundation (CNCF) as a Platinum member. CNCF, which is a project sponsored by the Linux Foundation (Microsoft joined last year), governs various container projects including Kubernetes, Prometheus, OpenTracing, Fluentd, Linkerd, containerd, Helm and gRPC, among others. Gabe Monroy, lead program manager for Microsoft Azure containers, is joining the CNCF board.

Posted by Jeffrey Schwartz on 07/26/2017 at 1:51 PM0 comments


Microsoft Claims Reports of Paint's Death Greatly Exaggerated

When Microsoft last week posted an update of features that will be removed or deprecated with the release of the Windows 10 Fall Creators Update, the venerable Paint image editing and illustration tool was among those on the list, which also included PowerShell 2.0 support, IIS 6 management tools, Outlook Express and the Reader app. Obituaries and tweets lamented the death of MS Paint, which was among the first free add-on tools offered in Windows.

Microsoft has deprecated Paint in favor of the new Paint 3D tool, which debuted in April with the Windows 10 Creators Update and has taken its place on the Windows Start menu. While Paint 3D is now pinned to the Start menu, the original Paint is still included among the apps bundled with the OS; with the fall update, only Paint 3D will be included.

Reports of Paint's demise went viral yesterday, apparently much to Microsoft's surprise. The company issued a brief statement saying that Paint isn't disappearing and reiterating that it'll remain available in the Windows Store as a free download.

"MS Paint is here to stay. It will just have a new home soon in the Windows Store where it will be available for free," according to the statement by Megan Saunders, Microsoft's general manager for 3D for Everyone, Windows Experiences. "If there's anything we learned, it's that after 32 years, MS Paint has a lot of fans. It's been amazing to see so much love for our trusty old app."

Nevertheless, that doesn't mean Paint will remain in the Windows Store forever. Tools that Microsoft classifies as "deprecated" could be removed in future releases, according to the company.

Frankly, it's not that big a deal. Paint 3D, though it has a much different interface, has the same basic functionality Paint offers, while giving users more illustration and photo editing options, including the ability to render images in 3D.

In fact, when Microsoft first revealed and demonstrated Paint 3D back in October of last year, at the announcement of the Windows 10 Creators Update and Surface Studio, the company said the new tool was a replacement for Paint due to its ability to create richer graphics and images (see Michael Desmond's assessment of Paint 3D and some of the other features in the Windows 10 Fall Creators Update Technical Preview).

By the time MS Paint finally does disappear from the Windows Store, in all likelihood few will notice or care other than those who are sentimental. And those traditionalists will likely once again mourn its final demise, though in reality MS Paint just has a new name and added features.

Posted by Jeffrey Schwartz on 07/25/2017 at 1:08 PM0 comments


What Does SharePoint Online's Growth Mean?

Growth of Azure once again was the center of attention following Microsoft's latest quarterly earnings report last week. But the company also gave a strong nod to continued expansion of its Office 365 business, buoyed by the news that Office 365 revenues for the first time surpassed sales of licenses of the traditional suite. Another important stat worth noting was that SharePoint Online usage nearly doubled last quarter over the same period last year.

CEO Satya Nadella shared that data point during the company's investor conference call last week to discuss its results for Q4 FY 2017. Although Nadella alluded to SharePoint Online's growth in passing, and without further elaboration, it gives some indication that the company's goal to move more SharePoint on-premises users to Office 365 is accelerating. To what extent the doubling of usage is meaningful depends on how many organizations are now using it, which the company hasn't disclosed. Various reports have suggested SharePoint Online usage still represents a small percentage of the overall SharePoint user base that either continues to run SharePoint Server on-premises or through hosting or cloud services.

In his prepared remarks, Nadella spoke of Office 365's expansion both in terms of subscriber growth and the depth of premium services used by commercial customers.  "Customers are moving beyond core workloads to adopt higher value workloads," Nadella said. "For example, we've seen a significant increase in SharePoint usage, which nearly doubled year-over-year." Does this signify a turning point for SharePoint Online?

During the Q&A portion of the call, financial analyst Raimo Lenschow of Barclays asked about its competitive edge in the context of the Office 365 Premium packages such as E5. "There's a lot more in the Office 365 adoption cycle beyond Exchange and e-mail," Nadella responded. Indeed, the impetus for many organizations to move to Office 365 is to get out of the business of deploying and managing Exchange Servers on premises, a common practice for more than two decades.

But moving to SharePoint Online hasn't been a priority for many organizations, especially those that have built custom server-side applications that aren't portable to Office 365. The release of the SharePoint Framework and recent extensions, along with new tools such as Microsoft Flow, Power Apps and Microsoft Forms are helping to ease that transition. Growth of the new Microsoft Teams also could portend increased usage of SharePoint Online.

But as research conducted earlier this year by graduate students at the Marriott School of Management at Brigham Young University, spearheaded by Christian Buckley's CollabTalk, confirmed, more than half of the SharePoint administrators, developers and decision makers surveyed have cloud-based implementations of SharePoint but don't have plans to transition entirely to the online version. (Redmond magazine was a media sponsor of the survey, which is available for download here.)

Microsoft corporate VP Jeff Teper, who leads the company's SharePoint, OneDrive and Office efforts, welcomed Nadella's mention in a tweet, saying: "One of first highlights from Satya in today's earning's call – near 2x YoY growth in #SharePoint Online usage."

Indeed, that's a sign of growth. While Office's commercial revenue saw a $277 million, or 5%, year-over-year increase for the quarter, Microsoft said the growth was driven by a 43% increase in Office 365 commercial revenues. The overall 5% growth in commercial Office revenues, which totaled $5.5 billion, was offset by the shift to Office 365 from traditional licenses, according to Microsoft. But to what extent SharePoint Online has contributed and, in its own right, is accelerating is hard to determine. While Microsoft said that there are now 27 million consumer Office 365 subscribers, the company didn't share the number of commercial Office 365 subscriptions. Nadella did say back in April, during the company's Q3 FY2017 earnings call, that Office 365 commercial monthly active users had surpassed 100 million.

As the aforementioned survey revealed, while 32 percent of organizations of all sizes reported that they have hybrid SharePoint deployments, nearly half, or 49 percent, have hybrid server licenses, 35 percent have on-premises licenses and 17 percent use SharePoint Online only. It will be telling what future surveys show with regard to SharePoint Online usage.

It's clear, and certainly not a surprise, that SharePoint Online is growing. The fact that usage has nearly doubled year-over-year validates that point. But it's difficult to draw any solid conclusions as to whether the near-doubling of SharePoint Online usage last quarter points to acceleration or just incremental growth.

Is your organization moving more of its SharePoint workloads to Office 365?

 

Posted by Jeffrey Schwartz on 07/24/2017 at 6:32 AM0 comments


Office 365 Revenues Surpass Sales of Traditional Software Licenses

Office 365 revenues were greater than revenues from traditional perpetual software licenses during the last quarter, marking the first time that has happened. Microsoft revealed the new milestone yesterday when it reported its quarterly earnings for the period ended June 30.

While Microsoft said there are now 27 million consumer subscriptions, the company hasn't disclosed the number of commercial and enterprise Office 365 subscribers. The company did say Office 365 commercial revenue for Q4 of its fiscal year 2017 rose 43% over the same period last year. Microsoft pointed to a number of large new subscribers, including Nissan, Quicken Loans, Key Bank and Deutsche Telekom, as part of Office 365's success.

Office 365 is becoming more profitable as well. Microsoft said gross margin for its commercial cloud portfolio, which, in addition to Office 365, includes Azure and Dynamics 365, was 52% for the period -- a 10-point increase.

Microsoft CFO Amy Hood indicated during the quarterly earnings call that growth was occurring across different license types, and she cited momentum for its E5 premium subscriptions, which the company believes will expand with the coming of Microsoft 365, the bundle of Office 365, Windows 10 and the Enterprise Mobility + Security service.

"Office 365 commercial revenue growth will continue to be driven by install base growth, ARPU [average rate per user] expansion, and adoption of premium services like E5 and should outpace the rate of transactional decline," Hood said. She added that LinkedIn revenue of $1.1 billion for the quarter, which is also part of Microsoft's Productivity and Business Process segment, contributed 15 points of growth.

Posted by Jeffrey Schwartz on 07/21/2017 at 12:32 PM0 comments


Microsoft Partner Program Offers Support for Software-Defined and Hyper-Converged Capabilities in Windows Server 2016

Microsoft has created a new software-defined datacenter certification program for storage and hyper-converged systems running Windows Server 2016, which shares many of the same capabilities as Azure, such as Storage Spaces Direct and software-defined networking (SDN). The new program is targeted at customers interested in the certified appliances coming in September that will allow enterprises and service providers to run some of Microsoft's public cloud technologies in their own datacenters.

The Windows Server Software-Defined (WSSD) program lets partners deliver infrastructure capable of building hyperscale datacenters based on Microsoft's software-defined datacenter guidelines and specifications. Systems certified by Microsoft under WSSD undergo the company's Software-Defined Datacenter Additional Qualifications (SDDC AQ) testing process and "must firmly follow Microsoft's deployment and operation policies to be considered part of the WSSD program," as explained by QCT, one of the first six partners certified. The other partners include DataON, Fujitsu, HPE, Lenovo and Supermicro.

"Deployments use prescriptive, automated tooling that cuts deployment time from days or weeks to mere hours," according to the WSSD announcement on Microsoft's Hyper Cloud blog. "You'll be up and running by the time the WSSD partner leaves your site, with a single point of contact for support." The partners are offering three types of systems:

  • Hyper-Converged Infrastructure (HCI) Standard: "Highly virtualized" compute and storage combined in a single server-node cluster, which Microsoft says makes them easier to deploy, manage and scale.
  • Hyper-Converged Infrastructure (HCI) Premium: Hardware that Microsoft describes as "software-defined datacenter in a box" that adds SDN and Security Assurance to HCI Standard, which the company claims simplifies the ability to scale compute, storage and networking as needed, similar to public cloud usage models.
  • Software-Defined Storage (SDS): Enterprise-grade shared storage that's built on server-node clusters, designed to replace traditional external storage devices with support for all-flash NVMe drives, which customers can scale based on demand.

The specific WSSD-certified offerings announced by Microsoft include:

  •  DataON's S2D-3000 with the company's DataON MUST deployment and management tool, which provides SAN-type storage monitoring.
  •  Fujitsu's Primeflex for Storage Spaces Direct.
  •  Lenovo's Cloud Validated Design for Microsoft Storage Spaces Direct, optimized to provide cloud-based storage to Microsoft Hyper-V and Microsoft SQL database workloads.
  • QCT's QxStack WS2016, MSW2000 and MSW6000 hyper-converged appliances (its MSW8000 is still under review).
  •  Supermicro SYS-1028U-S2D HCI appliance, which it describes as a "highly dense" all-NVMe system for cloud-scale software-defined data centers.

While some obvious large providers of software-defined and engineered hyper-converged systems weren't on the initial list, notably Cisco Systems and Dell EMC, Microsoft said it expects to add more partners over time.

Posted by Jeffrey Schwartz on 07/19/2017 at 1:24 PM0 comments


Microsoft Execs Discuss Blockchain's Potential for Industry Disruption

More than 500 people attended a panel discussion Wednesday to hear Microsoft officials explain the company's plans to become a leading provider of enterprise blockchain software and services. Microsoft is among a rising number of IT infrastructure providers and customers that believe blockchain, the distributed ledger technology that has helped enable Bitcoin and the mounting number of cryptocurrencies like it, is poised to disrupt almost every industry.

Two key leaders at the center of Microsoft's marathon effort to build out robust private blockchain capabilities via its Azure cloud service described the technology's potential to upend the economics of many industries during a session at the company's Inspire partner conference, which took place this week in Washington, D.C. The two officials -- Yorke Rhodes, Microsoft's global strategist for blockchain, and Craig Hajduk, a principal program manager for blockchain engineering -- joined me, Jeremy Epstein and moderator Eric Rabinowitz, CEO of Nurture Marketing, to explain what blockchain is and how they see it bringing about major changes across various industries by removing intermediaries from processes and transactions.

Microsoft's blockchain efforts, code-named "Project Bletchley," started about two years ago. In the last year, the company has made a push to build Azure blockchain as a service as well. The efforts have gained credibility in the banking, capital markets and insurance industries. They are also of interest in sectors such as manufacturing, logistics and healthcare, where record keeping and security are needed for transactions.

"This technology allows us to digitize assets in a way that we've never been able to do so before," Rhodes said during Wednesday's session. "As soon as I digitize a cow, I can do futures. There's tremendous things going on and this wave of interesting technology is taking us places we've never believed that we could get to before." Panelist Jeremy Epstein, CEO of Never Stop Marketing, has focused most of his time these days working with startups using or offering blockchain-based technology such as Storj.io, OpenBazzar and Everledger because "this thing is like a tsunami 50 miles off the coast, and basically no one knows it's coming," Epstein said.

Blockchain is best known as the open source distributed ledger (think of a shared, write-once, read-many spreadsheet) that enables bitcoin transactions. Bitcoin, the peer-to-peer cryptocurrency, was considered a breakthrough when it was invented in 2009 because it "was a very novel, creative combination of existing technologies that were out there applied in an interesting way, solving the problem that researchers and computer scientists have struggled to solve up until that point," Hajduk explained in his remarks.
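For readers who have never looked under the hood, the toy Python sketch below illustrates the core "write-once" idea: each record carries a hash of the record before it, so tampering with history breaks every hash that follows. It is purely a conceptual illustration of the data structure -- none of the distribution, consensus or cryptocurrency machinery of a real blockchain is represented here.

    # Toy append-only ledger: each block stores a hash of the previous block,
    # so earlier entries cannot be altered without invalidating what follows.
    import hashlib
    import json

    def block_hash(block: dict) -> str:
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    ledger = [{"index": 0, "data": "genesis", "prev_hash": "0" * 64}]

    def append(data) -> None:
        prev = ledger[-1]
        ledger.append({"index": prev["index"] + 1, "data": data,
                       "prev_hash": block_hash(prev)})

    def verify() -> bool:
        return all(ledger[i]["prev_hash"] == block_hash(ledger[i - 1])
                   for i in range(1, len(ledger)))

    append({"from": "alice", "to": "bob", "amount": 5})
    append({"from": "bob", "to": "carol", "amount": 2})
    print(verify())                      # True
    ledger[1]["data"]["amount"] = 500    # tamper with history
    print(verify())                      # False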

"Blockchain refers to the technologies that sit behind that cryptocurrency and, in fact, there are over 75 different distributions of blockchain or distributed ledger technologies that are available on the market today," Hajduk continued. The amount is actually much higher, evidenced by the number of initial coin offerings (ICOs), or new cryptocurrency startups launched (it's important to point out that these fall outside traditional regulated markets and venture capital fundraising requirements). "There's a lot that you can go into, but the thing to know is that there's a lot of choices for customers and partners today. There's [also] a lot of excitement and hype -- that means it can be very hard to separate what's real from what is just pure hype," he said. "The engineering talent here is frickin' mind blowing, but the marketing is just horrific."

However, it's still early days for blockchain. And while hundreds of large organizations are exploring it and running pilots, no one knows how quickly it will take off.  But there's significant interest and hype around the technology. If you've struggled to get a grasp of what blockchain might portend for existing infrastructure and applications, you're not alone -- it has confounded many veteran IT experts. Rhodes, who also teaches electronic commerce at New York University, said it took him some time and extensive research to get up to speed.

"I discovered blockchain by reading about directed acrylic graphs in the summer of 2015 after Ethereum was launched," Rhodes said. "I started to read, and I started realizing that there were all these new nouns and verbs and I had no clue what they meant, so I studied literally for six months on and then came together with another colleague, some of you may know Marley Gray, and we started this engine and ourselves down the blockchain road."

It was after meeting both of them in May 2016, during the inaugural ID 2020 Summit at the United Nations, that Microsoft's extensive blockchain focus came to light for me. That event, and last month's follow-up, put forth the notion that blockchain could someday give everyone a secure and private digital identity, especially the estimated 1.2 billion undocumented individuals -- many of whom are exploited and deprived of basic human rights. At last month's ID 2020 gathering, attended by 300 people, Accenture and Microsoft announced that they have jointly developed a digital biometric identity system based on blockchain.

Microsoft, through its key role in organizing and sponsoring the inaugural ID 2020 event, put the spotlight on how blockchain could become more than just a mechanism for removing intermediaries from financial transactions, a notion that already had the financial services establishment on edge with an urgency to act before upstarts beat them to the punch. Microsoft has quickly won over a large number of well-known banks, financial market firms and exchanges because of its focus on enabling private blockchains based on Ethereum, along with its commitment to supporting other blockchains in the Azure Marketplace, such as Chain Core, the IBM- and Linux Foundation-backed Hyperledger Fabric and J.P. Morgan's open source Quorum.

"I would tell you that about 50 percent of the investment right now in blockchain-based applications is going into the financial sector," Hajduk said. "Not surprising, [since] this technology came from cryptocurrencies, it does value transfer very well. On the other hand, retail or manufacturing supply chain is an area where there's a lot of participating investment. We also see it in healthcare. It's very important in the public sector as well. So, this is a technology that has broad applicability and has broad interest. It's not really about cryptocurrency -- it's about the shared source of truth and it's about rethinking, reworking and transforming your business. That, folks, is what drives the excitement about distributed ledger and blockchain technology, and that's also why we're here today."

Still, everyone on the panel cautioned that the hype cycle around blockchain has reached fever pitch. "If you look at distributed ledger technology as a whole, the hype is still deafening," Hajduk warned. "I'd say in probably six out of 10 cases [or] seven out of 10 cases, when the engineering team sits down with a customer, they say, 'Oh we really want to do a blockchain project.' When you start to talk about the use cases, most of the use cases will actually not be appropriate for a distributed ledger."

For background on blockchain, see this month's Redmond magazine cover story, "Microsoft's Bet on a Blockchain Future," which explains why blockchain is important to IT pros and developers. Microsoft's blockchain technology is further discussed in an article by Terrence Dorsey, "Inside Microsoft's Blockchain as a Service," while partner efforts are described in the Redmond Channel Partner article, "Microsoft Aims to Eliminate the Middleman with Blockchain."

Posted by Jeffrey Schwartz on 07/14/2017 at 12:28 PM0 comments


Azure Stack To Advance Hybrid Cloud, DevOps and Serverless Computing

More than two years after Microsoft revealed plans to offer its Azure Stack software to makers of hybrid cloud-based appliances, Azure Stack is now set for release this September. Azure Stack, which lets enterprises and service providers run their own mirror images of Microsoft's cloud platform in their own datacenters, is a strategic deliverable as the company looks to advance modern IT architectures including hybrid cloud, DevOps and serverless computing.

The first Azure Stack appliances will be available in September from Dell EMC, Hewlett Packard Enterprise and Lenovo, with Microsoft's newest partners, Cisco and Huawei, set to release their offerings by year's end and in the first quarter of 2018, respectively. Microsoft announced the release plans, pricing and the service options at its Inspire conference (formerly known as Worldwide Partner Conference), taking place this week in Washington, D.C.

"We're building out Azure as the first hyper-scale cloud that supports true distributed computing with Azure Stack," said Microsoft CEO Satya Nadella in Monday's opening keynote address.

Some beg to differ as to whether Azure Stack appliances will be "the first" hybrid cloud offerings delivered to organizations, since products from VMware or OpenStack-based appliances may have claimed that turf. But Microsoft argues it brings the software-defined infrastructure offered in Windows Server 2016 (such as Storage Spaces Direct, Hyper-V and support for containers) to a common application development, deployment and systems management model.

"You're writing one set of code, you're updating one set of code, you're deploying one set of code but it is running in two places," said Microsoft Corporate VP Julia White, during a press briefing at Inspire. "In a Visual Studio dropdown, you can select Azure or Azure Stack. It's that simple."

The initial systems will allow customers to provision and manage both IaaS and PaaS workloads via the Azure Portal, effectively choosing Azure Stack as a region. While workloads running in Azure Stack initially are limited, Microsoft officials say they cover the most widely used capabilities in Azure. Among them are Virtual Machines (base VMs and Windows Server), Storage (Blob, Table and Queue) and PaaS offerings via the Azure App Service (including Web Apps, Mobile Apps, API Apps and Functions).

Microsoft said it will continue to push additional capabilities and templates over time. In the short-term pipeline are IoT Hub and Azure Data Services, said Microsoft Senior Product Director Mark Jewett. While Azure Stack doesn't yet support Azure Data Services, customers can run SQL Server in Azure Stack. "We can certainly deliver database as a service," said Jewett.

Jewett and White also pointed to the ability to run the Azure App Service on premises via Azure Stack, notably its PaaS services, the common API model and Azure Functions, which lets organizations move to serverless computing. Nadella in his keynote also said he sees serverless computing as the next wave in application development, deployment and management. "Virtualization has been amazing, but now this new era of micro services, containers and serverless [computing] is going to be fundamentally transformative to the core of what we write as applications," he said.

Azure Stack will appeal to organizations with data sovereignty requirements that prevent information from being stored in the public cloud, to edge computing scenarios where connectivity is unavailable or sporadic (such as cruise ships and oil rigs), and to those looking to build new cloud-based applications that run on premises or extend existing legacy systems.

While Azure Stack isn't the first hybrid cloud appliance, Microsoft is looking to make the case that it's the first to share a common control plane across on-premises and public clouds. Paul Galjan, senior director of product management for Dell EMC Hybrid Cloud Platforms, agrees. "It is unique," he said. "It fits into a niche in the market that no other software vendor is offering anything quite like it."

Dell EMC, which launched a single-node developer edition of Azure Stack back in May that costs $20,000, will offer appliances that support up to 12 nodes. The initial systems will be based on the company's PowerEdge R730XE systems. Dell EMC will follow shortly with iterations based on its new 14G server platform, announced at Dell EMC World, which will be built using the new Intel Xeon Scalable ("Skylake") processors, officially launched yesterday.

The configurations from Dell EMC, which will range in cost from $100,000 to $300,000, will vary in average capacity from 100 to 1,000 Azure D1-class virtual machines with up to 8TB of persistent storage, according to Galjan.

Microsoft's Azure usage pricing is now set and specific costs can be found here.

Posted by Jeffrey Schwartz on 07/12/2017 at 12:10 PM0 comments


Largest Windows 10 Migration Yet Is Set to Reach 400,000 Employees

Accenture has migrated nearly 75 percent of its 400,000 employees to Windows 10 and is on pace to complete the upgrade next year. It appears to be the largest known Windows 10 migration to date, Microsoft acknowledged this week.

It's not entirely surprising that Accenture has fast tracked its Windows 10 migration, considering it is the largest independent global IT consulting and systems integration firm and one of Microsoft's closest partners. Accenture also is the largest customer using Microsoft's OneDrive with 6 Petabytes of business data stored in the online cloud storage service, tied to Windows 10 and Office 365.

Besides the fact that the two are invested in their Avanade joint venture, both companies are working closely on a number of technology initiatives. Two prominent examples were Accenture's endorsement of Microsoft Teams when Microsoft launched it back in November and more recently their work together on advancing secure identities using blockchain, announced earlier this month.

Still, Accenture's migration is a remarkable example of an enterprise that has taken an aggressive posture toward upgrading so many employees in relatively short order. As Windows 10 reaches its two-year anniversary next month, upgrades are on the rise. But as reported by my colleague Kurt Mackie earlier this week, a survey by Adaptiva found just under half (46 percent) of respondents report having migrated 10 percent or less of their PCs and devices to Windows 10, while 41 percent of participants said they plan to have 51 percent or more of their devices migrated to Windows 10 within the next year.

The Accenture migration is also noteworthy in that the company's workforce is four times the size of Microsoft's. Naturally, the push gives Accenture cover to say it eats its own -- and Microsoft's -- dogfood, to borrow an old phrase. "Not only do we enable our people with the latest technology, we're also setting ourselves up to be a reference for the state of what's possible with Microsoft," Accenture Managing Director Brad Nyers said in a promotional video, embedded in Microsoft's announcement. "It's demonstrating to our clients that we can be a market leader in the adoption of Microsoft technology."

In addition to its massive Windows 10 migration, Accenture is making a big push with Office 365 and Microsoft's Enterprise Mobility + Security (EMS) service, which Microsoft earlier this month ported to the Azure portal and integrated with the Microsoft Graph APIs (as detailed in this month's Redmond magazine cover story).

In addition to showcasing its use of the new technology, Accenture CIO Andrew Wilson noted that 75 percent of its workforce are millennials. Both factors are driving the company's move to provide employees with more modern experiences such as Microsoft Teams, which Wilson discussed at the November launch, and the ability to support BYOD, where EMS plays a key role.

"Managing the identity of the user, differentiating between enterprise and between personal use at the application and the data level, we can operate in both a mobile way and a secure way [without] disrupting the user experience at the same time," Wilson said in the above-mentioned video. "We believe we're at the very leading edge of keeping ourselves and our clients totally relevant."

Posted by Jeffrey Schwartz on 06/30/2017 at 9:47 AM0 comments


Massive Petya Ransomware Outbreak Puts Spotlight on Prevention

The massive Petya ransomware attack crippled companies and governments across the globe yesterday, putting many workers on the sidelines, thousands of whom were unable to access business-critical files. The attack is similar to last month's WannaCry ransomware attack, which exploited a flaw in Windows Server Message Block 1 (SMBv1), and it affects systems that didn't receive Microsoft's critical MS17-010 patch issued in March. WannaCry had a kill switch, but there's no known kill switch for the Petya ransomware (also called "NotPetya" by some researchers).

Its effect was indeed extensive. The attack yesterday infected more than 12,500 users in 64 countries, including Belgium, Brazil, Germany, Russia and the United States. Microsoft late yesterday posted a detailed account of Petya's technique, which the company described as "sophisticated."

Microsoft said it released updates to its signature definition packages shortly after confirming the nature of the malware. The updates are available in Microsoft's free antimalware products, including Windows Defender Antivirus and Microsoft Security Essentials, or administrators can download the files manually at the Malware Protection Center. Microsoft also noted that the new Windows Defender Advanced Threat Protection (Windows Defender ATP), released with the latest Windows 10 update, "automatically detects behaviors used by this new ransomware variant without any updates."

Experts said this attack is the latest reminder that organizations need more advanced options to protect themselves from becoming victims of ransomware. A report by ISACA found that 62 percent of those surveyed were attacked by ransomware, yet only 53 percent have any type of formal approach to mitigate it. Moreover, only 31 percent said they routinely test their security controls and 13 percent never test them, according to ISACA's recently released State of Cyber Security report.

Organizations need to build better architectures based on zero-trust segmentation, processes (automation, threat intelligence and patch management) and culture and communication, according to a blog post by Forrester Analyst Jeff Pollard. "The more dependent on digital business your revenue is, the higher the likelihood a cybersecurity issue will cripple you," Pollard said.

With this attack and last month's WannaCry incident, security firms are reiterating the following security best practices and guidelines (while also making a case for their own security wares):

  • Backup and recovery: In conversations and at conferences, companies such as Acronis, Commvault and Veeam have stressed that merely backing up your data doesn't mean it will be protected from ransomware. The recent release of Acronis Backup 12.5 includes a new feature called Acronis Active Protection, which uses behavioral heuristics to keep malware from finding its way into a backup in the first place. "We are making sure the ransomware cannot get into our agent and get into our backups," said Frank Jablonski, VP of global product marketing at Acronis.
  • Manage privileges: The Petya exploit, like other ransomware variants, requires elevated administrator rights to gain access to systems, said Morey Haber, CTO of BeyondTrust. Organizations with lax privilege management should remove end-user administrator rights, which helps ensure that only digitally signed software is trusted. However, that will only stop the initial infection, Haber warned. "Once the first machine is compromised, administrator rights are not needed to propagate the worm due to the severity of the vulnerability and methods used for exploitation," he said.
  • Keep software up to date: In addition to removing administrator rights, Haber said organizations should perform vulnerability assessment and install security patches promptly.

The number of individuals who know what to do when receiving a suspicious message is surprisingly low, commented Marty Kamden of NordVPN in an advisory released today. "If you encounter a 'Check Disk' message, quickly power down to avoid having the files encrypted by the ransomware," said Kamden. It's also important to know which file to block. "Stop the spread within a network from the Windows Management Instrumentation by blocking the file C:\Windows\perfc.dat from running," he noted. "If such a file doesn't exist yet, create it yourself and make it read-only."
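
For administrators who want to script the "vaccine" step Kamden describes, a minimal Python sketch might look like the following. It simply creates the file named in the advisory and flags it read-only; it assumes administrator rights, and note that some researchers cite C:\Windows\perfc (without the extension) as the filename to create.

```python
import os
import stat
from pathlib import Path

# Minimal sketch of the step quoted above: create the file the ransomware
# checks for and mark it read-only. Run with administrator rights; the
# filename follows the advisory quoted in this post.
target = Path(r"C:\Windows\perfc.dat")

if not target.exists():
    target.touch()  # create an empty placeholder file

# Clearing the write bit sets the read-only attribute on Windows.
os.chmod(target, stat.S_IREAD)
print(f"{target} exists and is read-only: {not os.access(target, os.W_OK)}")
```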

Posted by Jeffrey Schwartz on 06/28/2017 at 2:12 PM0 comments


Microsoft Delays Guest Access in Teams, Adds File Storage Options

Organizations that now use or are considering the Microsoft Teams chat tool offered with Office 365 business and enterprise subscriptions received both good and bad news last week. The good news is that Microsoft Teams can now integrate with third-party cloud storage and file sharing services such as Box, Dropbox, Citrix ShareFile and Google Drive. The bad news is that support for guest access, which will allow external users to participate in Microsoft Teams groups, will arrive later than expected. When Microsoft Teams was released in March, the company had targeted adding the guest access capability by the end of June.

The delay in support for adding external users to Teams should be brief, according to Suphatra Rufo, a Microsoft program manager for Microsoft Teams, who provided notification of the delay in a support forum. Although Rufo didn't provide an exact timeframe, she insisted the company still plans to deliver the feature, which will allow groups to add outside contractors, partners, customers and suppliers.

"We just are hitting some technical issues," according to Rufo's explanation. "The next date is not too far from the original June target. This is a top priority, so trust me when I say you will have it soon!"

Several posters on the forum commented that "soon" is too vague for their liking.  "I'm about to launch a multisite project about lean manufacturing," according to a comment by someone identified as Gerald Cousins. "Nearly 8 companies, 20+ project leaders to coordinate / inform / communicate. A good opportunity to use Teams. Have you any expected date for availability?  So that I can decide to delay ... or to postpone."

Marco, who apparently works for a school district, also was hopeful the delay would be brief.  "This is becoming a real problem now. You 'committed to June' and I relied on this and made commitments for it to my customers. There are school migrations planned during July that rely on this feature. School starts again in August. So, will we be able to use Teams with external access or not."

Providing access to external users obviously must be delivered correctly to ensure customers aren't introducing security risks. Based on Rufo's comments, it appears the feature will arrive in a relatively short timeframe, but it's understandable that those who have made commitments feel left holding the bag.

As for the good news, Microsoft is extending the storage and file sharing options for Teams, which, until now, required organizations to draw from OneDrive and SharePoint. Microsoft Teams members can now also use Box, Citrix ShareFile and Google Drive. Office 365 admins can configure the individual storage providers in the Office 365 Admin Center, according to the announcement posted by Katharine Dundas, a Microsoft senior product manager for Office 365.

"By bringing content from Box into Teams, organizations can share their files more easily and collaborate on projects in real time, all while keeping their content securely managed in Box," said Jon Fan, a senior director for product management at Box, in his post announcing the integration with Microsoft Teams.

Ross Piper, VP of enterprise at Dropbox, added that integrating Office 365 and Microsoft Teams with Dropbox, set to be available next month, will make it easier to find, share and gather feedback in a chat without having to leave the conversation. "Once the integration is authorized by an administrator, users will be able to add Dropbox folders to a channel," according to Piper's announcement. "From there, they'll be able to upload files to conversations, and create Office files directly on a shared Dropbox folder in Teams."

Posted by Jeffrey Schwartz on 06/26/2017 at 11:39 AM0 comments


Microsoft in the Crosshairs of Amazon's Whole Foods Deal

While Amazon's surprise deal last week to acquire Whole Foods for $13.7 billion is poised to upend the entire grocery and retail industry, as well as how suppliers distribute their goods, it also could affect which cloud services providers large retailers, distributors and goods suppliers use. Microsoft is in the middle of a battle that has already emerged in the wake of last week's deal, in which Amazon's rivals are concerned about issues related to supply chain visibility and loath to enrich a fierce competitor attacking their margins.

Whole Foods, which has 462 high-end grocery stores throughout North America and Europe, is already a large Microsoft Azure customer. Meanwhile, Amazon's most formidable rival in the retail industry, Wal-Mart, which has a storied history of using its IT purchasing clout to its advantage, is among several large retailers that have told IT providers that if they want the retailers' business, they can't use solutions dependent on AWS, according to a report in the Wall Street Journal this week.

Wal-Mart, which keeps the bulk of its data on premises, does use Azure and other providers when running some cloud-based services and acknowledged that there are cases when it pushes for alternatives to AWS, according to the report. Amazon reportedly responded that such conditions amount to bullying and are bad for business.

Wal-Mart reportedly did approach cloud service provider Snowflake Computing at one point. However, Snowflake's data warehousing service currently runs only in AWS. In an interview late last year, CEO Bob Muglia didn't rule out Azure and other clouds in the future if there's a business case to do so, but at the time it didn't appear to be a priority.

Ironically, Muglia is a former longtime Microsoft executive and president of the company's server and tools business, and was on the team that launched Azure back in 2010. Even more ironic is Whole Foods' existing commitment to Azure, described in a November case study published on Microsoft's Web site. In the midst of a five-year plan to move all of its infrastructure and software to a SaaS model running in Microsoft Azure, Whole Foods has already deployed Microsoft's Enterprise Mobility + Security (EMS) service. Whole Foods has also migrated 91,000 employees from Active Directory running on premises to Azure AD Premium, which gives all of them single sign-on access to 30 SaaS applications.

It goes without saying that should the deal go through, the role of Azure at Whole Foods will surely diminish. Microsoft has the most to lose, though IBM and Oracle have formidable retail clientele as well, many of which are using their emerging cloud offerings. Nevertheless, AWS and Microsoft Azure are considered to have by far the largest cloud infrastructure service portfolios, datacenter footprints and industry leadership, according to Gartner's latest annual Magic Quadrant report.

For its part, Microsoft has invested significantly in targeting solutions for retailers, distributors and consumer goods suppliers. Microsoft has its Retail Experience facility in Redmond, Wash., which I saw two years ago. The facility offers partners and customers who visit a glimpse of the broad advances and investments Microsoft has made in offering retailers and wholesalers new capabilities, showcasing what the company is working on in areas such as automation, IoT, machine learning and new ways of managing payments.

The deal to acquire Whole Foods, by far the largest Amazon has ever made, brings new questions to what retail experiences will look like in the coming years. Though not surprising, the move shows that Amazon CEO Jeff Bezos is willing to spend whatever he feels is necessary to win. And it's possible Amazon could find itself having to raise its bid. Whole Foods shares have traded slightly above the agreed-upon price of the all-cash deal on belief, or speculation, that a rival suitor might top Amazon's price.

Whatever Amazon's determination to pull off the deal at any price, the reality is that the giant online retailer now has its sights set on brick-and-mortar distribution in a big way (it is already experimenting with several bookstores), which could bring a new wave of business disruption.

While it's impossible to predict how this will play out and what moves will follow, I think it's a reasonable bet that Microsoft won't try to one-up Amazon and acquire a retailer such as Wal-Mart. Certainly, let's hope not.

Posted by Jeffrey Schwartz on 06/23/2017 at 2:02 PM0 comments


Lenovo Declares Ambitious Plan To Dominate Datacenter Market

Lenovo has rebooted its entire datacenter portfolio with what it described as its largest and broadest rollout of new and refreshed hardware with the introduction of 26 new servers, storage and network gear and a new line of engineered appliances and hyper-converged systems. At a one-day event in New York City for hundreds of customers, analysts and press, Lenovo's top brass declared the company intends to extend its footprint in datacenters and become the leading supplier of high-performance and super-computing systems.

The company's event was more than just a large rollout of datacenter, client and workstation products; it was an assertion that Lenovo is gunning to outpace rivals Cisco, Dell EMC and HPE. Such competitive chest thumping is of course common, and Lenovo's assertion is ambitious considering its rivals currently have much broader and more modern datacenter portfolios (and, consequently, greater market share).

Nevertheless, Lenovo officials at yesterday's event noted that the company will deliver its 20 millionth server next month. Best known for taking over IBM's struggling PC business 12 years ago and achieving market leadership, Lenovo is relatively new to the datacenter, entering after Big Blue again decided to offload a business to Lenovo, this time selling its commodity x86 server unit in January of 2014 for $2.3 billion, a deal that was completed a year later.

While Lenovo has rolled out various upgrades since and has signaled its plan to extend its datacenter footprint, yesterday's Lenovo Transform event was the kickoff of a strategy that brings together new products and revamped development, manufacturing, distribution, marketing and service capabilities. "We are going to disrupt the status quo and accelerate the pace of innovation -- not just in our legacy server solution but also in the software-defined market," said Kirk Skaugen, president of Lenovo's Data Center Group.

Skaugen, a former senior executive at Intel who was tapped late last year to lead Lenovo's datacenter business, believes Lenovo has some distinct advantages over Cisco, Dell EMC and HPE -- notably that it doesn't have those companies' legacy businesses. "We don't have a huge router business or a huge SAN business to protect," he said. "It's that lack of legacy that's enabling us to invest and get ahead of the curve on this next transition to software-defined. You are going to see us doing that through building our internal IP, through some significant joint ventures [and] also through some mergers and acquisitions through the next several quarters."

Another key advantage is that Lenovo manufactures its own systems, he emphasized. Bold as Lenovo's statements yesterday might sound (they also include wanting to be the top player in supercomputing within the next several years), the company has the resources to disrupt the status quo if it can execute. "I've never seen a big, bold statement from Lenovo on the datacenter side," said Patrick Moorhead, principal analyst with Moor Insights and Strategy, during a brief interview right after the keynote presentation. Moorhead, who said he needs to drill deeper into the roadmap, said Lenovo has been building toward this focus for over a year. "They've thrown down the gauntlet and are definitely at the table," he said.

Moving away from IBM's System x brand, Lenovo launched two new brands: ThinkSystem and ThinkAgile. ThinkSystem is a broad portfolio of new servers, storage and network switches that will roll out with this summer's release of Intel's new Xeon Scalable Family platforms.

The new rack-based ThinkSystem offerings include the SR950 4U system, which is targeted at mission-critical workloads such as transactional systems, ERP and in-memory databases, and is scalable from two to eight processors. The denser SN850 blade server compute node is designed for the company's Flex System chassis. The SD530, the company's high-performance computing entry in the 2U4N form factor, is designed for its new D2 chassis. Also added to the ThinkSystem line is the new DS Series entry-level to mid-range storage offering, available in all-flash and hybrid SAN configurations.

ThinkAgile is what Lenovo describes as its software-defined infrastructure line, consisting of engineered systems targeting modern hybrid cloud workloads, including hyper-converged systems based on platforms from Microsoft, Nutanix and VMware. Lenovo's planned Azure Stack appliance will fall under the ThinkAgile portfolio and will be called the ThinkAgile SX for Microsoft Azure Stack.

Both the ThinkSystem and ThinkAgile portfolios are based on Lenovo's new systems management platform, XClarity Controller, which the company said offers a modern and intuitive interface that can provide centralized configuration, policy and systems management across both.

Lenovo officials said that while the company plans to accelerate the release of new products and partnerships, the company has made some key operational changes over the past year that will give its datacenter group better focus. Among the changes, Skaugen said the company has moved to a dedicated sales and marketing organization. "In the past, we had people that were shared between PC and datacenter," he said. "Now thousands of sales people around the world are 100 percent dedicated end-to-end to our datacenter clients."

Skaugen added that Lenovo now has a dedicated supply chain, procurement organization and has brought in new leadership that can focus on various technology and industry segments. Lenovo has also revamped its channel organization. A year ago, Lenovo's datacenter group had five different channel programs around the world. "We now have one simplified channel program for dealer registration," he said. "I think our channel is very, very energized to go out to market with Lenovo technology across the board. And we have a whole new set of system integrator relationships [and] a whole new set of partnerships to build solutions together with our system integrator partners."

Moorhead said the moves were long overdue. "While I think Lenovo should have done this two or three years ago, right after they chewed up IBM's xSeries business, these moves should help them become more successful," he said.

Operational miscues aside, Lenovo had the misfortune of picking up the IBM xSeries business at the peak of the market, Charles King, principal analyst with Pund-IT, observed in a research note. Lenovo completed the acquisition of IBM's x86 business in January 2015, when global server sales had grown just 2.3%, even though the x86 systems segment grew at twice that rate, according to IDC figures King recalled. Two years later, the global server market declined 4.6% to $14.6 billion in the fourth quarter of 2016.

"While Lenovo was working to integrate IBM's System x x86 systems and personnel with its own strategies, products and company culture, it was also navigating a notable decline in hardware sales and revenues," King noted.

Now that Lenovo has rebooted its portfolio, King said that, posturing about legacy aside, the company's success in the PC business means it would be premature to underestimate its ability to extend its footprint in the datacenter over time. "It would be a mistake to assume Lenovo isn't fully ready and able to take its efforts in datacenter solutions and sales to the next level."

Posted on 06/21/2017 at 1:35 PM0 comments


Accenture and Microsoft Team on Biometric Identity Tied to Blockchain

Accenture today demonstrated a prototype of a technology it developed with Microsoft that can allow individuals to create digital identities based on the Ethereum blockchain protocol using biometric authentication. The prototype, demonstrated during the second ID2020 Summit at the United Nations, showed how an individual can create a digital identity on a blockchain tied to a biometric identifier such as a fingerprint or facial recognition and maintain control over it.

The demo was based on a recent effort by Accenture, which tapped Microsoft and the companies' joint venture Avanade, to provide a biometric identity system for the UN High Commissioner for Refugees (UNHCR), which has already enrolled more than 1.3 million refugees from more than 75 countries and hopes to support more than 7 million by 2020.

David Treat, director of Accenture's Blockchain practice, showed attendees of the ID2020 Summit how an undocumented refugee could create his identity and update it through his life with information such as birth, banking, health, education, career and any information needed to authenticate a given transaction. The user enrolls his credentials using Accenture's Unique Identity Service Platform, which uses Microsoft's Azure Blockchain as a Service, to provide and share identity attributes based on permissions defined by the user.

The Ethereum blockchain is suited for giving individuals control over their personal data because it's based on a "permissioned," distributed peer-to-peer architecture. Accenture and Microsoft recently showcased blockchain's potential to give the estimated 1.2 billion people throughout the world who lack any form of identification a digital fingerprint, an example of how the ID2020 working group within the UN sees partnering with the IT community as a positive step. The goal is to agree on a global identity solution by 2020 that could be implemented by 2030, as defined in 2015 by the UN's Sustainability Development Goals. "This is going to be a long haul. It's not something we are going to solve overnight," said Dakota Gruener, ID2020's executive director. The first ID2020 Summit gathered last year at the UN, and the group intends to continue working on its mission, Gruener said.

Treat said the technology is designed to connect with existing identity systems and is based on the recently announced Decentralized Identity Foundation, a consortium led by Accenture, Microsoft, RSA and a number of Blockchain startups aiming to create "an open source decentralized identity ecosystem for people, organizations, apps and devices."

Accenture's biometric registration capability has been in the field for three years. "What we did was make sure that it's scalable and runs well on our cloud, and then add this consumer-owned identity piece," said Yorke Rhodes, Microsoft's global strategist for blockchain. Over time, Rhodes says, blockchain offers the potential to give users control over how their identities are used by Google, Facebook and LinkedIn, as well as in Active Directory.

"If you look at a lot of the problems associated with identity, there are hacks associated with honeypots," Rhodes said. "So, the ideal world is you can get away from that by not actually pulling together all this data into these large databases."

Accenture's Treat said the pilot discussed actually amounts to a simple use of blockchain. "It lets you leave the data where it is, and lets you use blockchain to capture that identifier, index that information, use it where it is and create that linkage between disparate sources."
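
Treat's point, leave the data where it is and put only an identifier on the chain, can be illustrated with a short Python sketch. It uses the web3 library solely for its Keccak-256 hashing helper; the record fields are invented for illustration, and nothing here reflects the actual Accenture/Microsoft implementation.

```python
import json
from web3 import Web3

# Identity attributes stay in an off-chain store (hypothetical record).
record = {
    "subject_id": "refugee-0001",
    "attribute": "education.certificate",
    "issuer": "UNHCR field office",
}

# Only a fingerprint of the record would be anchored on a blockchain,
# letting a verifier confirm the off-chain data later without exposing it.
digest = Web3.keccak(text=json.dumps(record, sort_keys=True))
print("on-chain identifier:", digest.hex())
```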

Posted by Jeffrey Schwartz on 06/19/2017 at 1:33 PM0 comments


Microsoft To Shrink Nano Server to Focus on Containers

Microsoft this week unveiled plans to bring Windows Server into its semi-annual update release cycle, starting this fall, alongside a new Nano Server image configuration with a footprint reduced by 50 percent. The company is stripping the infrastructure roles from Nano Server to optimize the headless configuration option for container-based environments. Microsoft revealed the Nano Server changes during last month's Build conference in Seattle, along with other features coming to Windows Server, including Linux container support and plans to move the server OS to the new semi-annual cycle, but the company on Thursday made its plans official and provided some additional details.

Only those opting for the newly minted semi-annual channel can implement any of the new technical features planned for Windows Server, which include the pending stripping down of Nano Server. In addition, organizations will be required to have Software Assurance coverage to access the semi-annual channel releases. The revamp of Nano Server is noteworthy because, leading up to its release last year, Microsoft had touted the minimal-footprint deployment option for Windows Server 2016 for its suitability for large clusters in Web-scale application and datacenter environments. However, Microsoft has since found that the vast majority of Nano Server deployments, from a workload perspective, are running container-based applications based on Docker, Kubernetes and others. Since container-based workloads do not require the infrastructure components, Microsoft determined that removing them would result in a more efficient server environment and advance the move toward containers.

"Nano Server will be optimized as a container runtime image and we will deprecate the infrastructure roles," said Chris Van Wesep, Microsoft's director of enterprise cloud product marketing, "so for anybody who had wanted to do smaller footprint compute and storage clusters, Server Core will be the right implementation to do that." By deprecating the infrastructure features in the Nano Server option, the removal of that code will make way for Microsoft's new .NET Core 2.0, "which enables customers to use more of their code in more places [and] make Nano Server the best option for new container-based development," said Erin Chapple, general manager for Windows Server, in a blog post announcing the new release options.

Microsoft is recommending Server Core for hosting virtual machines as well as containers, which Chapple said can run Nano Server or Linux container images. The Windows Server update this fall will support Linux workloads via extended Hyper-V isolation, which will allow Linux containers to run without having to deploy two separate container infrastructures for Linux and Windows-based applications. As previously announced, Microsoft is also bringing the Windows Subsystem for Linux (a.k.a. the Windows Bash component) to Windows Server, allowing application administrators and developers to use common scripts and container images for both Linux and Windows Server container hosts, according to Chapple.

Collectively, these technical changes to Windows Server and the continuous release cycle option associated with it are part of Microsoft's strategy to bring more consistency to the server OS and Azure. The changes also promote the development of modern cloud apps and migration of legacy apps and systems to these environments using container images. "Many customers don't realize that Server Core is the base image that runs Azure and Azure Stack," Chapple noted. "This means the investments we make in Windows Server for Azure can be made available to customers to use in their own datacenters. This makes Server Core a great choice for Azure customers who like the idea of consistent technologies between their datacenter and the cloud. One example of this in the upcoming feature update is the cluster sets functionality for increased scale of hyper-converged deployments. We're also continuing to add security investments such as the ability to quickly encrypt network segments on their software-defined networking infrastructure per their security and compliance needs. You can expect new features to continue to optimize efficiency, performance and security for software-defined datacenters."

Server Core will also play a key role with the modernization of applications, Van Wesep emphasized. "One of our big pushes for next year is going to be around getting folks that have traditional .NET applications to drop those into containers running on Windows Server 2016, potentially even moving them into Azure," he said. The new features will only be available to those opting for the new semi-annual channel, which will require Microsoft Software Assurance or Azure cloud subscriptions.

Microsoft explained how the new semi-annual channel release update will work. The company will offer new feature updates every spring and fall, the same model it recently moved to for Windows 10, Office 365 ProPlus and System Center. Microsoft will offer Windows Server previews shortly before the final release via its Windows Insiders for Business program, which is now open to those who want to sign up. Each semi-annual release will come with a pilot availability period of three to four months, and once the software is deemed stable, Microsoft will lock it down into a "Broad" release. Those releases will carry the Windows Server name with no year attached to it, instead using the Windows 10 versioning model. The first release, scheduled for September, will be called Windows Server 1709. Chapple noted that the semi-annual channel feature updates are cumulative, containing previous updates. The semi-annual channel feature updates will get combined for Microsoft's long-term servicing channel releases, a servicing model conceived for environments that can't tolerate change.

The long-term servicing channel will include five years of mainstream support, five years of extended support and the option for six years of Premium Assurance. Van Wesep acknowledged that the long-term channel will be the most common in the near term. "I don't imagine that the vast majority of the people will come out of the gates and say this is our new model and we will wholeheartedly switch to this -- that would be naïve," he said. "I think there has been enough demand and feedback from customers that they want a more active way of consuming our software that there will certainly be a meaningful size of the installed base that will start looking at this and working to adopt it. I can see a scenario where every customer would find both channels compelling for different parts of their organization."

Indeed, many organizations may be resistant to such a model, and Van Wesep acknowledged that many existing applications and systems don't lend themselves to a continuous release update model. But as many organizations look to transform their business processes or models over time, that can change. "This is on us to do the education process," Van Wesep said. "People need to start thinking about the layers of abstraction between the app, the OS and the underlying hypervisor/fabric. It all can be thought of independently and should be."

Posted by Jeffrey Schwartz on 06/15/2017 at 12:27 PM0 comments


Microsoft Joins Cloud Foundry Foundation and Pledges Extended Azure Integration

Microsoft is giving a significant boost to its support for Cloud Foundry, the popular open source platform for building and deploying cloud-based applications. The company announced yesterday it has joined the Cloud Foundry Foundation and is offering extended tools and integration between the popular PaaS architecture and Azure.

Cloud Foundry is quickly emerging as a DevOps platform of choice among enterprises looking to develop applications in any language and deploy them in container images on any supported infrastructure. Conceived originally at SpringSource, which was acquired by VMware in 2009, the Java-oriented Cloud Foundry project was later spun off into Pivotal. Pivotal contributed to the Cloud Foundry open source project while offering its own commercial stack.

In addition to Pivotal, among its principal stakeholders are the rest of the Dell Technologies family including Dell EMC and VMware. Cisco, Google, IBM, SAP and Suse are also among those who have made strategic bets on Cloud Foundry. Microsoft first announced support for Cloud Foundry two years ago with the release of service broker integration and a cloud provider interface to provision and manage Cloud Foundry in Azure via an Azure Resource Manager template. Microsoft added Pivotal Cloud Foundry to the Azure Marketplace last year.

A growing number of enterprises have found Cloud Foundry appealing because of its ability to scale and support automation in multiple hybrid and public cloud environments. GE, Ford, Manulife and Merrill are among a number of large Microsoft customers using Cloud Foundry as their cloud application platform, noted Corey Sanders, Microsoft's director of Azure Compute. Sanders announced Microsoft's plan to become a Gold sponsor of Cloud Foundry at the annual Cloud Foundry Summit, taking place this week in Santa Clara, Calif. While Microsoft already offers some support for Cloud Foundry in Azure, the company is making a deeper commitment, Sanders explained yesterday in a blog post.

"Cloud Foundry on Azure has seen a lot of customer success, enabling cloud migration with application modernization while still offering an open, portable and hybrid platform," Sanders noted. Microsoft's extended Cloud Foundry support in Azure includes integration with Azure Database (PostgreSQL and MySQL) and cloud broker support for SQL Database, Service Bus and Cosmos DB. Microsoft has also added the Cloud Foundry CLI in its Azure Cloud Shell, which Sanders said will provide "easy CF management in seconds."

Sanders outlined some other key integration points and tools that will enable deeper support for Cloud Foundry running in Azure. Among those, as described by Sanders in his blog post:

  • Azure Cloud Provider Interface - The Azure CPI provides integration between BOSH and the Azure infrastructure, including the VMs, virtual networks and other infrastructural elements required to run Cloud Foundry. The CPI is continually updated to take advantage of the latest Azure features, including support for Azure Stack.
  • Azure Meta Service Broker - The Azure Meta Service Broker provides Cloud Foundry developers with an easy way to provision and bind their applications to some of the most popular services, including Azure SQL, Azure Service Bus and Azure Cosmos DB.
  • Visual Studio Team Services plugin - The Cloud Foundry plugin for Visual Studio Team Services (VSTS) provides rich support for building continuous integration/continuous delivery (CI/CD) pipelines for CF, including the ability to deploy to a CF environment from a VSTS-hosted build agent, allowing teams to avoid managing build servers.
  • Microsoft Operations Management Suite Log Analytics - Integration with Log Analytics in OMS allows users to collect system and application metrics and logs for monitoring CF applications.

Microsoft, Google and Red Hat are among those working to build support into their service brokers to offer a single, simple means of providing services to cloud-native software and SaaS offerings based on Cloud Foundry (as well as OpenShift and Kubernetes). The resulting Open Service Broker API project, announced in December, aims to provide a single interface across multiple application and container service platforms. While Microsoft announced support for the API at the time, Sanders said it is formally joining that group, which includes Fujitsu, GE, IBM and SAP.
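
To give a flavor of what that single interface looks like, here is an illustrative Python sketch of the provisioning call the Open Service Broker API defines. The broker URL, credentials and GUIDs are made up; in practice a platform such as Cloud Foundry issues this call on the developer's behalf.

```python
import requests

# Illustrative sketch of the call the Open Service Broker API standardizes:
# a platform asks a broker to provision a service instance. All values below
# are placeholders for illustration only.
BROKER = "https://broker.example.com"
INSTANCE_ID = "6d2a3f3e-0000-0000-0000-000000000001"

resp = requests.put(
    f"{BROKER}/v2/service_instances/{INSTANCE_ID}",
    auth=("broker-user", "broker-pass"),
    headers={"X-Broker-API-Version": "2.13"},
    json={
        "service_id": "<service-guid-from-catalog>",
        "plan_id": "<plan-guid-from-catalog>",
        "organization_guid": "<org-guid>",
        "space_guid": "<space-guid>",
    },
)
print(resp.status_code, resp.json())
```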

"Working with this group, I hope we can accelerate the efforts to standardize the interface for connecting cloud native platforms," Sanders said, promising that will also ensure application portability across platforms and clouds. Microsoft will host a webinar on July 20 that describes how to create applications using Cloud Foundry on Azure, including examples of existing customer implementations.

Posted by Jeffrey Schwartz on 06/14/2017 at 12:34 PM0 comments


Microsoft's Ambition: Integrate Power BI Everywhere

Microsoft today is showcasing its aspiration to broaden the reach of analytics and to bring new ways to integrate and visualize operational data. Kicking off the Microsoft Data Insights Summit in Seattle, Wash., today, the company announced the general availability of its new Power BI Premium offering, unveiled last month. Microsoft also revealed forthcoming Power BI capabilities that it said will add more drill-down features and more advanced ways of querying data using conversational input.

Since the release of Power BI three years ago, Microsoft has fast-tracked the addition of new tools and integration capabilities and expanded the scale of its self-service visualization offering. Power BI is now used by 2 million people who create 30,000 data models in the Power BI service every day, said James Phillips, corporate VP of Microsoft's Business Applications, Platform and Intelligence (BAPI) organization, speaking in the event's opening keynote.

Phillips said Power BI is integrated, in the form of "content packs," with thousands of software and SaaS offerings using its common API. "If you go back three or four years, there were reasonable questions about Microsoft's commitment to the business intelligence market," Phillips said. "I think if you look at the levels of investments that we've made, the progress that we've made, the growth of our user population, I think those questions are far behind us."

Microsoft believes it will change the economics of delivering visualized operational analytical data from any data source. The release of Power BI Premium comes with a new licensing model that allows for the distribution of reports enterprise wide, as well as externally to business partners and customers. The cost of embedding Power BI Premium into apps starts at $625 per month.

Power BI Premium is offered in a variety of capacity, virtual core and memory sizes, depending on the requirements of the application. Microsoft posted a Power BI Premium calculator to help customers estimate the configurations they'll need and the monthly costs. Microsoft touts that Power BI's scale comes from its deployment throughout the Azure footprint, while the new premium offering also includes an on-premises Power BI Report Server. In addition to Power BI Premium customers, those with SQL Server Enterprise Edition and Software Assurance (SA) can deploy the Power BI Report Server via their entitlements.

Microsoft Technical Fellow Amir Netz also took the stage during the opening keynote to announce new offerings coming to Power BI Premium this summer that will allow customers to embed Power BI into their applications. He also demonstrated how Power BI Premium allows key Microsoft applications to generate visualizations. Among them was the ability to embed Power BI reports into SharePoint collaboration projects "that are fully dynamic and fully updateable that you can distribute to everybody," Netz said. In addition to working with SharePoint, Netz said Power BI works with the new Microsoft Teams collaboration workspace application, allowing members of a team to add Power BI reports. "All you have to do is go to the channel, and pick one of the workspaces you want it to work with and just like that I have a Power BI report embedded into Microsoft Teams," he added. Also demonstrated was a forthcoming Microsoft Visio custom visual, which the company will release next month, though a preview is now available.
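
One common approach to embedding a report into a custom application goes through the Power BI REST API, which issues an embed token that client-side code then uses to render the report. The Python sketch below is illustrative only: the workspace and report IDs are placeholders, and acquiring the Azure AD access token is omitted.

```python
import requests

# Request an embed token for a report so it can be rendered in a custom app.
# The IDs and the Azure AD access token below are placeholders.
GROUP_ID = "<workspace-id>"
REPORT_ID = "<report-id>"
AAD_TOKEN = "<azure-ad-access-token>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/reports/{REPORT_ID}/GenerateToken",
    headers={"Authorization": f"Bearer {AAD_TOKEN}"},
    json={"accessLevel": "View"},
)
resp.raise_for_status()
print("embed token:", resp.json()["token"])
```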

Netz also described new capabilities that will allow users to act on data generated from Power BI Premium, which will be enabled through Microsoft's Power Apps offering. Netz said users will be able to write back directly to Power BI. The keynote also showcased the new Quick Insights feature, which will allow advanced predictive analytics via the Power BI Desktop, letting people gather answers to queries.

Posted by Jeffrey Schwartz on 06/12/2017 at 2:05 PM0 comments


Microsoft Condemns U.S. Withdrawal from Paris Climate Agreement

Microsoft was among the numerous tech companies and leading businesses across all industries that spent the last few months urging key officials in the Trump administration not to withdraw from the Paris Climate Agreement. President Trump's decision last week to pull out was widely and sharply criticized by IT and business leaders, as well as elected officials at the federal, state and local levels.

For its part, Microsoft was a champion of the agreement from the outset and viewed last week's decision as a key setback. "We believe climate change is an urgent issue that demands global action," Microsoft CEO Satya Nadella tweeted on Thursday just after the president announced plans to withdraw. "We remain committed to doing our part."

Brad Smith, Microsoft's president and chief legal officer, delivered a strong rebuke of the president's announcement. "We are disappointed with today's decision by the White House to withdraw the United States from the landmark, globally supported Paris Agreement on climate change," Smith wrote in a LinkedIn Pulse blog post.

"Continued U.S. participation benefits U.S. businesses and the economy in important and multiple ways," Smith added. "A global framework strengthens competitiveness for American businesses. It creates new markets for innovative clean technologies, from green power to smart grids to cloud-enabled solutions. And by strengthening global action over time, the Agreement reduces future climate damage to people and organizations around the world. We remain steadfastly committed to the sustainability, carbon and energy goals that we have set as a company and to the Paris Agreement's ultimate success. Our experience shows us that these investments and innovations are good for our planet, our company, our customers and the economy."

Trump argued that the agreement would cost the U.S. economy $3 trillion in lost GDP and 6.5 million jobs, while slashing annual household incomes by $7,000 or more. "Not only does this deal subject our citizens to harsh economic restrictions, it fails to live up to our environmental ideals," Trump said. "As someone who cares deeply about the environment, which I do, I cannot in good conscience support a deal that punishes the United States -- which is what it does -- the world's leader in environmental protection, while imposing no meaningful obligations on the world's leading polluters."

A number of major news organizations reported that many of the claims Trump made weren't accurate, notably the vast number of jobs that would be lost, the economic impact and the potential for blackouts and brownouts. Smith noted that Microsoft spent several months with administration officials, impressing upon them the importance of the agreement.

On the eve of the decision, Microsoft joined other key technology providers, including Adobe, Apple, Facebook, Google, Hewlett Packard Enterprise, Intel and Salesforce.com, in running full-page ads in the New York Times, New York Post and The Wall Street Journal. According to the ad, presented in the form of a letter to Trump, sticking with the agreement would strengthen U.S. competitiveness, create new jobs with providers of clean energy and reduce long-term risks to businesses that can be harmed by consequences to the environment.

Indeed, Microsoft, now one of the largest datacenter operators in the world, has long endorsed efforts and participated in initiatives aimed at reducing the need for carbon-based energy. Microsoft said five years ago that it was 100 percent carbon-neutral, and it implemented an internal carbon fee. Since imposing that fee, Microsoft reported in November 2016 that it has reduced carbon dioxide emissions by 8.5 million metric tons, purchased more than 14 billion kilowatt hours of green power and supported more than 7 million people through its community projects globally, all of which align with the United Nations (UN) Sustainable Development Goals.

Microsoft was a strong proponent of the Paris Agreement, ratified in early November of last year, and within two weeks it participated in the ICT industry's launch of SMARTer2030 Action, joining technology companies, businesses and policy makers to support the Paris Agreement goals "through public awareness, education and mobilization."

The company's position on the withdrawal from the Paris Agreement, as described by Smith, drew mixed reaction in the comments section of his post. Many praised Microsoft's defense of the agreement, while others criticized Smith, saying they shared Trump's view that it wasn't good for America. "Microsoft supports global socialism," Mark Elert, a senior account manager at U.S. Internet, commented. "Apparently Microsoft doesn't actually believe in real science," added Greg Renner, director of information systems at Covenant Services Worldwide.

Another critic, Pat Esposito, a SharePoint and Office 365 consultant who has contributed to Redmond magazine, offered a more measured response, offering Trump the benefit of the doubt. "Let's see what Trump's next move is," Esposito commented. "If he invests the $3B allocated for the accord back into U.S.  green initiatives, perhaps Microsoft and others will follow suit. I say develop the models for success and then bring them to the rest of the world to follow."

I asked him via e-mail why he didn't support the agreement. "Economically, our money will go further spent internally than diluted across the many countries in the accord," he responded. "Lack of binding enforcement, non-guaranteed financing and the fact that several studies indicate even with the plan as configured, it will not have a positive impact. There are other ways we can and must demonstrate a commitment to this world crisis. Only if Mr. Trump chooses to do nothing should we start calling him out... he's the president, we have to give him a chance to perform."

Posted by Jeffrey Schwartz on 06/05/2017 at 12:02 PM0 comments


Microsoft Gives Skype a Facelift

Microsoft is rebuilding its Skype client communications interface "from the ground up," and it's now available for those with Android phones and planned for iOS devices soon. Windows and Mac versions are slated for release over the next few months. Skype's new look is both cosmetic and functional as it offers a platform for intelligent bots that can let people use it to search for information, products and services.

The revamp applies to the free Skype consumer client that's also now included with Windows. The move is an ambitious push to pick up more mainstream user support by bringing modern communications services to the app. The wholesale rebuilding of Skype fits with Microsoft CEO Satya Nadella's goal to bring conversational computing to the mainstream. However, standing out among options from the likes of Amazon, Apple, Google and Facebook, and even apps that are popular with millennials such as Snapchat, will be challenging.

Images of the new Skype show an entirely new modern interface, which the company proclaims will deliver a vast improvement in how people communicate. The new Skype puts chat "front and center," Microsoft said Thursday when it announced the release of the new interface to Android users.

"We want to help you deepen connections within your personal network," read yesterday's announcement by Microsoft's Skype team. "There's only one of you in this world, so now you can show off your personal style by customizing Skype with your favorite colors. When in a conversation, you should always make sure your voice is heard, or more specifically, your emoticon is seen."

The new Skype interface lets users share their feedback during a chat session or video call by tapping on a reaction icon. Skype also offers a new Highlights section that lets users create a "highlight reel" of their day with photos and videos. Users can send a Highlight directly to select contacts or groups. Microsoft wants people to use Skype as their canvas to share experiences and communicate more expressively with friends, families and defined groups.

But Microsoft wants people to use Skype for more than simple chat, voice and video communications. The new Skype offers add-ins and bots to provide an "infinitely searchable" tool. In one example, a StubHub bot helps find seats for an event, showing seating options and pricing. An Expedia bot lets users find travel options, with other bots and add-ins forthcoming.

Since its release a day ago, Microsoft has shared some known issues on its support page. Among them: incoming calls default to the speakerphone, and those wanting to switch to earpiece mode should tap the speakerphone icon. Skype doesn't yet allow users to receive SMS messages within the app. The translator function isn't available, and voicemail is currently restricted to those with Skype phone numbers. For Skype-to-Skype calls, Microsoft said it's not yet supported, but users can sign in to other Skype clients to receive them.

Perhaps the biggest known complaint is that those using the new Skype client with existing phone numbers aren't showing up as being online for other Skype users, an issue Microsoft said on the support page it is resolving. In the meantime, contacts who search for you using your full name in Skype will find you. Until Microsoft resolves the issue, the support page offers a workaround.

It will be interesting to see if the new interface broadens the use of Skype in a crowded field of communications and chat options. But, for sure, this isn't your father's Skype. If and how these features are brought over to Skype for Business also remains to be seen, but the team will undoubtedly monitor the new consumer interface to see what works.

Posted by Jeffrey Schwartz on 06/02/2017 at 10:50 AM0 comments


Microsoft Readies Cross Region DR Service for Azure VMs

Microsoft is building on its Azure Site Recovery (ASR) service with a new disaster recovery option intended to ensure customers can restore Azure virtual machines running in its public cloud IaaS offering. The new service, released today as a technical preview, will let customers replicate IaaS-based applications in Azure to a different region of their choice without having to deploy additional software or appliances to existing ASR subscriptions.

While Microsoft emphasizes Azure's high availability, compliance requirements stipulate the need for a disaster recovery solution that can provide business continuity. Ensuring data is adequately protected requires a simplified approach to replicating data to an alternate region, according to Microsoft Principal Program Manager Rochak Mittal.

"This new capability, along with Azure Backup for IaaS virtual machines, allows you to create a comprehensive business continuity and disaster recovery strategy for all your IaaS based applications running on Azure," Mittal noted in today's announcement.

Microsoft is adding new features to its disaster recovery service on the heels of forging tighter ties with key suppliers of data protection software. Among them is Commvault, which claims it was the first to offer a Windows-based data protection offering 17 years ago, and has built its own DR as a service based on Microsoft Azure.

Randy De Meno, chief technologist for Commvault's Microsoft partnership, says he's not concerned about Microsoft's ASR, noting his company offers a broad set of archiving, Office 365 protection, e-discovery and migration capabilities that go beyond core data replication. "We are going to drive more Azure consumption and give customers an enterprise-wide solution," he said during a meeting yesterday at the company's Tinton Falls, N.J., offices.

Key Microsoft execs have given a nod to some of these key partnerships. In a promotional video released today, Microsoft Corporate VP Steve Guggenheimer said: "We're partnering together to help those customers move to the cloud, in a way that helps take advantage [of what] Microsoft has built in Azure."

Earlier this month, a number of key Microsoft executives gave talks at Veeam's annual gathering of customers and partners in New Orleans. In addition to a keynote by Microsoft Azure CTO Mark Russinovich, Principal Program Manager Jeff Woolsey gave a talk at the VeeamON conference. "Azure Site Recovery is just replicating virtual machines out to Azure," said Clint Wycoff, a technical marketing evangelist at Veeam, during an interview at VeeamON. "It's somewhat complex to set up, and a lot of customers and partners have had challenges around its complexity and consistency."

Scott Woodgate, director of Microsoft Azure Management and Security, begged to differ. "This is the simplest disaster recovery or business continuity configuration ever, in particular because it's all within Azure; there's no need to worry about configuring the interface to your organization's firewalls," Woodgate said during a brief interview. "It's virtually a simple wizard. Many of the other vendors are offering applications running in virtual machines, where I as the end user still need to manage and patch and update and secure that application. Azure Site Recovery is actually a SaaS service, so Microsoft does all that work for you, which reduces the overall cost of ownership versus the VM-based solutions."

The new DR features added to Azure Site Recovery offer redundancy without requiring additional on-premises infrastructure, allowing administrators to enable cross-region DR simply by choosing the VMs and a target location, according to Mittal. By delivering the DR function as a service, customers avoid the need to deploy, monitor, patch and maintain on-premises DR infrastructure, he added.

Administrators can enable cross-region DR by selecting a VM, choosing the target Azure region and creating replication settings. Customers can choose how to orchestrate failovers and determine the required recovery time and recovery point objectives. The cross-region preview is available in all regions where Microsoft offers ASR.
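To make that workflow concrete, here is a minimal sketch in Python of the choices an administrator works through: which VMs to protect, which target region to replicate into and what recovery point objective the replication policy should meet. The class and field names (DrPlan, ReplicationPolicy, the region strings) are hypothetical and illustrative only; they are not part of any Microsoft SDK.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical data model illustrating the cross-region DR choices
    # described above; names are illustrative, not a Microsoft API.

    @dataclass
    class ReplicationPolicy:
        rpo_minutes: int = 15          # target recovery point objective
        retention_hours: int = 24      # how long recovery points are kept

    @dataclass
    class DrPlan:
        source_vm: str                 # Azure VM to protect
        source_region: str             # region the VM currently runs in
        target_region: str             # region chosen for failover
        policy: ReplicationPolicy = field(default_factory=ReplicationPolicy)

        def describe(self) -> str:
            return (f"Replicate {self.source_vm} from {self.source_region} "
                    f"to {self.target_region} with an RPO of "
                    f"{self.policy.rpo_minutes} minutes")

    # Example: protect two IaaS VMs by replicating them to another region.
    plans: List[DrPlan] = [
        DrPlan("web-frontend-01", "eastus", "westus2"),
        DrPlan("sql-backend-01", "eastus", "westus2",
               ReplicationPolicy(rpo_minutes=5, retention_hours=72)),
    ]

    for plan in plans:
        print(plan.describe())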

Posted by Jeffrey Schwartz on 05/31/2017 at 7:56 AM


Colin Powell Rouses IT Pros on Cybersecurity and Leadership

Former Secretary of State General Colin Powell is best known for his tenure as the nation's top diplomat and for his role as chairman of the Joint Chiefs of Staff during Operation Desert Storm a quarter century ago. But he told an audience of several thousand IT pros yesterday that he's no Luddite when it comes to enterprise technology and cybersecurity.

During a one-hour keynote address yesterday closing out this week's Citrix Synergy conference in Orlando, Fla., Powell shared his IT credentials and his encounters with cybersecurity challenges over the years. He also emphasized the importance of strong leadership and the need to grapple with immigration and cultural diversity, issues quite germane to any IT organization.

While Powell steered clear of weighing in on the current investigations into charges that the Russians hacked e-mail systems in an effort to steer the outcome of last year's presidential election, he did address the private e-mail server that Democratic nominee Hillary Clinton used when she served as Secretary of State and how it brought attention to his own use of personal e-mail while serving in the same top diplomatic role.

"Hillary had a problem," Powell said, which stirred extended laughter and applause. "They came after me. They said: 'If Hillary did it, Powell must have done it.' So, they chased me around. And they subpoenaed my AOL account and I said 'go ahead, be my guest.' And they looked at it and discovered there was nothing there."

While he didn't dwell on Clinton's e-mail server during the talk, Powell gave this view: "Hillary really didn't do anything wrong," he said, a position he has shared in the past. "It was not done well, but there was no criminal intent or criminal act there, and that's what ultimately was discovered. But what we have to make sure of from now on is that we manage these systems the proper way."

An 80-year-old decorated military general and former diplomat who served four U.S. presidents might seem an unlikely candidate to offer insights on the state of cybersecurity and IT management today, but Powell explained why the audience should take him seriously. Early in his military career, the U.S. Army told Powell it was sending him to graduate school, though not to get a master's in foreign policy or political science. Instead, he was directed to get an MBA in data processing.

"I went there, and I graduated two years later, had a straight-A average, and of the 5,300 people here, this morning, I am probably the only one left, who knows how to program in Fortran, Cobol and to deal with 80-column punch card machines," he said.

"The reality is, as most of you I hope know, that a lot of what you have done and mastered over the years came out of the military," he added. "I was on the DARPAnet [Defense Advanced Research Projects Agency Network, the building block of today's modern Internet] 40 years ago, and information and technology and communications have always been an essential part of my military career of my life and any success that I have had always rested on the ability to move information around rapidly, effectively and make sure it gets to where it's supposed to be."

Powell said the highlight of that came more than two decades ago, during the Persian Gulf War in 1990, when he and field commander General Norman Schwarzkopf Jr. had to move 500,000 troops from the U.S. to Saudi Arabia for the war, then known as Operation Desert Storm. "When it was time to issue the order to start the conflict, we had one of the most perfectly secure means of doing it," Powell recalled. "Not something you might think of now, but it was a fax machine. And by using that secure fax machine, I knew the order only went to one person. It wasn't a cable that could spread around all over the organization. So, cybersecurity was always a major part in war planning."

At the same time, Powell recognized that security that was too tight could stymie getting critical information to key people in the field and undermine an effective outcome. Or worse, it could put soldiers in harm's way. "I didn't want our security to be too tight ... and those down at the lower level weren't going to get information they needed to get because it was secure, it was secret. And I found early in my career and during this period of Desert Storm that we have to always triage information. There's a lot of information to give out unclassified because no enemy can react fast enough to it. It was a tactical situation."

That would come into play a decade later, in 2001, when Powell became Secretary of State under President George W. Bush. "One of the challenges we are all facing now, and I faced when I became Secretary of State, is how to make sure you have an information system that is getting the information to where it has to be, when it's needed, in order to be actionable. And make sure you are not cluttering the whole system by overclassifying things."

When Powell took over the State Department, one of his first actions was to make sure its 44,000 employees all had Internet-connected computers on their desks. While employees locally and in every embassy throughout the world now had connectivity, the systems were secured for internal use only.

"That would allow you to send e-mails to the guy next door but you couldn't get the Internet outside to see what was going on in the rest of the world," he said. "I had to change that, so I got new software and hardware and then I had to change the brain-ware. I had to change the thinking within the State Department."

Powell did that by sending unclassified messages from his AOL account to State Department employees.  "They knew Secretary Powell was liable to send an e-mail, so they said 'I better have my system on.'  Now there [was the question] -- should you be using your AOL account that way? It was unclassified information [and] I had a secure system when I was dealing with secure material."

Nevertheless, he saw it as a critical step toward encouraging better communications at the time. "All I had to do was grease the wheels, grease the engine. I had to make sure these people understood the power of the network, the power of the information systems, and the only way to do that was for me to lead and set an example and get it going," Powell said.

Indeed, that worked, though it came back to haunt him when Hillary Clinton's use of personal e-mail dominated her entire presidential campaign.

Posted by Jeffrey Schwartz on 05/26/2017 at 1:12 PM


Citrix Plans Secure Browser and Windows 10 S Access to Win32 Apps

Citrix Systems is developing a secure browser that it will host in the Microsoft cloud. The new Citrix Secure Browser Essentials, set for release by year's end, will allow IT organizations to present desktop images to users regardless of whether they run any of the company's VDI or app virtualization offerings.

Citrix's new secure browser, designed to isolate corporate desktop images and data from personal information and apps, is among a number of new wares Citrix revealed at this week's annual Synergy conference, taking place in Orlando, Fla. Citrix and Microsoft are working together to deliver the new secure browser, which Citrix will make available in the Azure Marketplace. The secure browser will offer a version of Citrix Receiver and a new analytics service, and it is the next step in Microsoft and Citrix's broad pact to build the Citrix Cloud on Microsoft Azure.

"The browser itself needs a lot of protection and we will be delivering it with Microsoft," said Citrix CEO Kirill Tatarinov, in the opening keynote session. Tatarinov is the former longtime Microsoft senior executive who took the reins of Citrix early last year and quickly reached out to his former employer to extend their work together.

PJ Hough, Citrix's senior VP for product and also a long-time Microsoft exec who had worked on the Office 365 team before joining Tatarinov last year, said at this year's Synergy that the Citrix Secure Browser Essentials will isolate public Internet browsing from access to enterprise applications and resources. "It's going to be a great isolated browsing experience for customers who want to separate the corporate browsing they do from other applications and experiences on the device," Hough said.

"It's not only separating corporate from personal on the device, it's actually taking the corporate image and putting it up in the cloud," said Brad Anderson, Microsoft's corporate VP for enterprise mobility and security management, who joined Hough on stage during the keynote session at Citrix Synergy.  

Certainly, it's not the first time Citrix or others have released a secure browser, but the fact that it's hosted on Azure, and that it can isolate personal from corporate data and apps on any user-owned device, is a good way to introduce those who don't use Citrix or virtual clients to the company's offerings.

"The potential here is since it's hosted in Azure, there's opportunity to protect the apps and data even further," said Mark Bowker, an Enterprise Strategy Group analyst. "Microsoft is a big target from threat vectors, and having [the browser] on Azure can give it the opportunity to provide an even higher level of protection just due to what they see out on the Internet."

Hough said the new Citrix Secure Browser Essentials will arrive by year's end and will be available in the Azure Marketplace, with pricing starting at $180 per year (with a three-year subscription for a minimum of 50 subscribed users).

Citrix Receiver for Windows 10 S
Citrix and Microsoft are also working to deliver a release of the Citrix Receiver client for the new Windows 10 S operating system aimed at rendering traditional Win32 desktop apps.

Windows 10 S, which Microsoft announced earlier this month, is a locked-down version of the OS, meaning it will run only Universal Windows Platform (UWP) apps and tools available in the Windows Store. The new Citrix Receiver for Windows "opens the door for the Win32 apps to run on Windows 10 S," according to a description posted by Vipin Borkar, a director of product management at Citrix. He added that it will provide a way for organizations that want the Windows 10 S UWP experience but also need support for specific Win32 apps or environments not likely to find their way into the Windows Store, such as Google's Chrome browser.

Hough said it should be available in the Windows Store "any day."

Citrix also said it is working with Microsoft to develop a threat analytics service that can pull telemetry from its XenDesktop and XenApp solutions to address advanced security threats. The Citrix Analytics Service will offer continuous monitoring that uses the telemetry of users, devices, applications and networks to detect anomalies that may indicate malicious activity and to offer specific responses to prevent an attack.
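Anomaly detection over telemetry of this kind generally means comparing new activity against a per-user or per-device baseline. The sketch below is a generic illustration of that idea, not Citrix's algorithm: a simple z-score test over hypothetical per-user activity counts.

    import statistics

    # Generic illustration of baseline-driven anomaly detection; the sample
    # data is invented and the threshold is arbitrary.

    def is_anomalous(history, latest, threshold=3.0):
        """Flag the latest observation if it sits more than `threshold`
        standard deviations away from the historical mean."""
        mean = statistics.mean(history)
        stdev = statistics.stdev(history)
        if stdev == 0:
            return latest != mean
        return abs(latest - mean) / stdev > threshold

    # Daily count of app launches for one user over two weeks.
    launches = [12, 9, 11, 14, 10, 13, 12, 11, 9, 12, 13, 10, 11, 12]
    today = 87  # a sudden spike that might indicate credential misuse

    if is_anomalous(launches, today):
        print("Flag session for review: activity far outside the user's baseline")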

The plan to offer the Citrix Analytics Service, which will run on Azure as a part of the Citrix Cloud, comes as Microsoft is in the process of rolling out its own Windows Defender Advanced Threat Analytics service. Since the Citrix Cloud runs in Microsoft Azure, it's reasonable to presume they're exploring a number of integration points, including using Azure Machine Learning and the Microsoft Security Graph, as well as extending on the work they're completing with tying the Citrix platform to Microsoft Enterprise Mobility + Security (EMS).

Meanwhile, Hough and Anderson pointed to the deliverables announced at last year's Synergy conference, among them the ability to run Citrix XenDesktop Essentials and XenApp Cloud Essentials in hybrid environments on Windows Server 2016 and Microsoft Azure, and the integration of Microsoft's EMS service and the Intune mobile application management functions with Citrix XenMobile. "Citrix has taken all their apps and Intune-MAM-enabled them," Anderson said. "IT professionals get one common management paradigm for managing all of their apps. And that translates to a much easier user experience because users have all of this working underneath one policy as one. It just flows a lot easier for them."

Posted by Jeffrey Schwartz on 05/24/2017 at 1:56 PM


Microsoft Azure, Office 365 and AWS To Gain Data Recovery Options

When data protection provider Veeam last week held its annual gathering of customers and partners, the company outlined how the next release of its availability suite will offer near-real-time recovery, provide integration with a much wider range of storage hardware and use Microsoft Azure and Amazon Web Services for disaster recovery services. Veeam also said it will provide extended support for Office 365.

As reported last week, Veeam Availability Suite v10 will offer continuous data protection (CDP) with recovery point objectives of 15 minutes. Co-CEO Peter McKay, speaking during the opening keynote of the company's VeeamON conference in New Orleans, described v10 as a "major extension" of the platform, one that will widen its ability to back up and recover physical systems, which still account for an estimated 25 percent of datacenter resources, as well as Linux and Windows endpoints, covering PCs, personal mobile devices and IoT-based hardware with embedded software. A new API will allow connectivity to substantially more NAS storage offerings and reduce long-term retention costs with support for native object storage. But the other key focus at last week's conference was the rapid growth of cloud infrastructure, both hybrid and public services.

"Seven percent of the company's customers already have a cloud-first strategy when it comes to IT," said Paul Mattes, vice president of Veeam's global cloud group, according to a recent survey. Mattes, who gave a keynote in the second day of the conference, cited IDC's projection that 46 percent of IT spending will be in the cloud. "This is happening at a pace much greater than anything we've seen before in the history of IT," said Mattes, who joined Veeam last fall after a long tenure with Microsoft and its Azure team. "The cloud is being adopted at a rate that's much faster than anything we've seen, and that's because of the business opportunity and agility that cloud technology has afforded. So Veeam has embraced this aggressively [and] we are going to continue to deliver solutions that are cloud focused."

Several years after releasing its Cloud Connect interface, which supports Amazon Web Services and Microsoft Azure as targets, the v10 release will bring the CDP technology into much deeper integration with the two companies' large public clouds, among others. While Veeam gave many of its key alliance partners ample airtime during the three-day gathering, including Microsoft, VMware, Cisco, NetApp, IBM and Hewlett Packard Enterprise, the company emphasized the deeper relationship it has formed with Microsoft. "Our Microsoft partnership has been one of the strongest and deepest partnerships over the years and we are going to take that further and deeper," McKay said.

As part of last week's new pact, Veeam and Microsoft said they are working together to offer Veeam Disaster Recovery in Microsoft Azure, which will provide automated availability of business-critical workloads by delivering DR as a service in Azure. The new Veeam DR in Azure will tie together Veeam's Direct Restore tool and the new Veeam Powered Network (PN), a tool that the company said will automate the setup of DR sites in Azure. "What the Veeam Powered Network is about is delivering a free lightweight tool that you can use to quickly set up a DR site in Azure using an easy-to-configure, software-defined networking tool," Mattes said. It doesn't require a VPN and will be offered to organizations of all sizes.

Also coming is native support for Microsoft's Azure Blob cloud object storage service. Veeam's Scale-Out Backup Repository (SOBR) will treat cloud object storage as an archive tier, the company said, allowing customers to retain Veeam backups in Blob storage, a lower-cost option for retaining data. The Veeam Management Pack will also tie in with Microsoft's Operations Management Suite (OMS), a Web-based service that provides views of system and multi-cloud resources.
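The archive tier SOBR describes builds on capabilities Azure Blob storage itself exposes. As a hedged illustration of that underlying capability (not Veeam's implementation), the Python snippet below uses the azure-storage-blob SDK to upload a backup file and demote it to the low-cost Archive tier; the connection string, container and file names are placeholders.

    from azure.storage.blob import BlobServiceClient

    # Illustration only: writing backup data to Azure Blob object storage
    # and moving it to the Archive tier for cheap long-term retention.
    # All names below are placeholders.

    conn_str = "<storage-account-connection-string>"
    service = BlobServiceClient.from_connection_string(conn_str)
    container = service.get_container_client("backup-archive")

    with open("weekly-full-backup.vbk", "rb") as data:
        blob = container.upload_blob(name="2017/week-21/weekly-full-backup.vbk",
                                     data=data, overwrite=True)

    # Demote the backup to the Archive tier, the cheapest (and slowest to
    # rehydrate) storage class for long-term retention.
    blob.set_standard_blob_tier("Archive")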

Veeam also said it is upgrading its recently launched Veeam Backup for Office 365. The protection tool will provide extended support for Exchange Online and will bring SharePoint Online and OneDrive for Business into the mix, providing support for hybrid SharePoint environments. The new release will provide multi-tenancy and support for multiple repositories, which will allow its network of 15,000 cloud service providers to deliver more secure, scalable and broader options. It will also add automation with a new PowerShell SDK and a RESTful API. "We will deliver full PowerShell support for creating jobs and modifying your organizations and adding infrastructure components to fully automating the recovery of e-mail items," said Mike Resseler, director of product management, who demonstrated the new Office 365 features. Resseler noted that the new RESTful API will interface with existing automation tools.

In addition to Microsoft Azure, Veeam said it has invested in a company that will allow it to provide native and agentless protection of AWS EC2 instances, along with some of Amazon's other offerings including Aurora, Redshift and RDS. The company inked a partnership with N2WS, which will offer what it claims is the industry's first cloud-native, agentless backup, recovery and DR software that can recover AWS applications, moving them to other AWS accounts or regions, as well as into hybrid multicloud environments. During a session after the keynote, Rick Vanover, Veeam's director of product marketing, demonstrated what he described as a mock scenario (since the capability isn't available yet) in which data backed up in AWS was moved over to Azure, using, as Vanover noted, components that don't yet exist. "I showed the functionality that the Connector will have," he noted in a follow-up e-mail. "It will pick up AWS snapshots and write in a repository of Veeam."

Posted by Jeffrey Schwartz on 05/22/2017 at 1:00 PM


Veeam's Availability Platform Will Gain Continuous Data Protection

Veeam this week explained how it plans to fulfill its goal of extending its popular virtual machine protection platform to offer near-real-time availability of mission-critical systems running in hybrid environments. The company outlined its ambitious plans, which center around the forthcoming Veeam Data Availability Platform 10.0, at its VeeamOn gathering of 3,000 customers and partners in New Orleans.

The deliverables, set to start rolling out toward the latter part of this year, will fill some key gaps in the company's systems availability portfolio, such as continuous data protection (CDP) and protection of physical servers, cloud instances, SaaS-based data including Office 365, and Linux and Windows endpoints. The 10-year-old company, which built its brand and a reputation as a leader in protecting VMs, last summer signaled its intent to make its data protection offering viable for large global enterprises. At this week's conference, the company introduced Veeam Availability Suite 10.0 and supporting software aimed at building on its footprint.

The deliverables Veeam outlined this week include:

  • Continuous data protection of tier-1 and mission-critical applications with RPOs/RTOs of 15 minutes, supporting hybrid, multicloud, hosted and SaaS environments.
  • Disaster recovery in Microsoft's Azure public cloud, bringing Veeam's Direct Restore and the new Veeam Powered Network (PN) to automate the DR process.
  • Agentless data protection of data running in Amazon Web Services (AWS).
  • An agent that can protect Windows-based physical servers and endpoints, along with applications running in Microsoft Azure, AWS and other public clouds.
  • Native object storage support using Amazon S3, Amazon Glacier, Microsoft Azure Blob and any S3/Swift compatible storage.
  • Extended Office 365 protection including OneDrive for Business and SharePoint Online.
  • A new Universal Storage API framework that will add IBM, Lenovo and Infinidat to Veeam's list of storage partners, joining Cisco, Dell EMC, Hewlett Packard Enterprise (HPE), NetApp and hyper-converged systems provider Nutanix.

In addition to broadening beyond its core competency of protecting VMs, the company said it is transforming itself with a new brand and new software licensing that are in line with the growing trend toward subscription-based pricing, while providing closer integration with key cloud providers. "We created a great business in the last 10 years. But now the game is changing completely because customers are moving to a multicloud world, so we have to adapt," said Ratmir Timashev, the company's cofounder, during an interview at VeeamOn. The company, whose revenue last year reached $607 million, up from $474 million in 2015, aims to take in $800 million this year and exceed $1 billion in 2018.

Timashev last year determined that to reach its lofty growth goals, the company needed to move beyond protecting VMs and change its business model of offering perpetual software licenses, as customers increasingly demand that IT providers offer cloud-based subscription models. To do so, he brought in Peter McKay, a senior executive from VMware's client virtualization group, as co-CEO, putting him in charge of operations and growing the company. McKay, who joined Veeam last July, has quickly jumped into filling some of those technology gaps and broadening development efforts to focus on the shift to cloud-based infrastructure and application services.

"We dominate the virtualized world, but we needed to do physical and we needed to be more aggressive in the cloud space and so a lot of these announcements do just that," McKay said in an interview at VeeamOn. In order to become more aggressive in the cloud, and challenge the established data protection providers, McKay said he quickly determined that providing near-real-time recovery with CDP was a key priority. "That is huge," McKay said. "It's probably one of the most exciting announcements we have, if not the most exciting, and that changes the game in the market for us."

Posted by Jeffrey Schwartz on 05/19/2017 at 1:25 PM


VMware Horizon Cloud Virtual Clients To Run on Microsoft Azure

VMware is bringing its new Horizon Cloud virtual app and desktop-as-a-service offering to Microsoft Azure later this year. Azure will be only the second public cloud available to Horizon Cloud customers to date, and by far the largest.

By bringing Horizon Cloud to Azure, which VMware announced on Tuesday, customers will have another option for running managed apps and desktops in Microsoft's public cloud when it launches in the second half of this year. Rival Citrix Systems, which is set to have its annual Synergy technology conference in Orlando, Fla. next week, nearly a year ago launched a virtual client Windows-as-a-service offering hosted on Microsoft Azure as part of a broad partnership between those two companies.

While VMware last year announced plans to offer cross-cloud management services on Microsoft Azure and other large public clouds, the plan to add Horizon Cloud to the mix gives the burgeoning service a broader set of infrastructure flexibility, said Courtney Burry, VMware's senior director of marketing.

"We often see customers requiring fully cloud-hosted desktops and, while they can take advantage of what we have available today with Horizon Cloud and IBM SoftLayer, obviously lots of customers are moving toward Azure," Burry said.  "We want to make sure we support those customers with the ability to manage that Azure infrastructure and manage those applications through that common cloud control plane and take advantage of those different deployment models."

Burry said one of the notably appealing benefits of Microsoft Azure is the availability of sub-hourly billing and its global datacenter footprint. Customers running Horizon Cloud in Azure will be able to use some of its other attributes, such as federated access into Azure Active Directory and a common management interface.

A customer will have the option to connect Azure infrastructure with the Horizon Cloud control plane, letting VMware manage desktops, apps and the entire infrastructure through that control plane. "Unlike our IBM model, in which a customer would come and buy the infrastructure and the desktops and apps through VMware, this provides customers with the flexibility that they have when using Azure today," said Burry.

Horizon Cloud, the outgrowth of the company's Horizon Air virtual desktop service, was launched earlier this year as part of VMware's Workspace One portfolio, initially supporting IBM's SoftLayer as its only public cloud provider. Customers can also run Horizon Cloud on the VxRail hyper-converged infrastructure from its parent company Dell EMC, as well as hyper-converged infrastructure from Quanta and Hitachi Data Systems.

VMware launched Horizon Cloud with the goal of upping the performance and functionality of virtual clients by offering them on a common backplane. Horizon Cloud provides a common platform for managing virtual clients, devices and system images with common monitoring and reporting and service updates. In addition to the Horizon DaaS, Horizon Cloud  includes the new Horizon Apps, which delivers published SaaS and mobile apps to the workspace.

Horizon Cloud's new Just-in-Time Management Platform (JMP) can offer real-time app delivery, rapid provisioning of virtual desktops and contextual policy management, according to the company. VMware also has touted Horizon Cloud's new Digital Workspace Experience with BEAT (Blast Extreme Adaptive Transport), its new UDP-based network link designed to optimize user experiences regardless of network conditions, making it suitable for low-bandwidth, high-latency and high-packet-loss connections.

Horizon Cloud is designed to let organizations provision fully featured PC desktops and applications running either solely in the public cloud or on hyper-converged infrastructure that can scale to the public cloud.

Initially, the Azure-hosted Horizon Cloud service will support the virtual desktop offering, with the app service set to follow. Burry also said the release of the Skype for Business option for Horizon Cloud, announced last fall, is imminent.

Posted by Jeffrey Schwartz on 05/17/2017 at 5:12 PM


Apple iTunes Is Coming to the Windows Store

As the list of major apps joining the Windows Store continues to grow, albeit incrementally, Microsoft scored another coup this week, announcing that the Apple iTunes music and app stores will be available in the Windows Store by year's end.

Microsoft has struggled to get big names into the Windows Store but a number have jumped on board including Facebook and Spotify. In addition to iTunes, Microsoft announced that SAP's Digital Boardroom and the popular Autodesk Stingray 3D gaming and real-time rendering engine were being added to the Windows Store. Stingray isn't the first Autodesk modern app to join the Windows Store. Autodesk Sketchbook was added last year and Microsoft reported it's seeing 35 percent sales growth each month for the app this calendar year so far.

The release last month of Windows 10 Creators Update, announced last fall, appears to have improved prospects for Microsoft's Universal Windows Platform (UWP), though it's far from having reached critical mass. Microsoft is also bringing Office 365 to UWP. The cover story for this month's issue of Redmond magazine offers some analysis on the impetus for the latest Windows 10 release and the company's "mixed reality" tooling to motivate a growing number of developers to build UWP apps and make them available in the Windows Store.

In his monthly Redmond Windows Insider column, Ed Bott questioned whether there's enough incentive for the broad cross-section of software publishers and developers to dedicate resources to UWP. "It doesn't help that Microsoft has pivoted its mobile platform more often than the New England Patriots backfield," Bott writes. "Just ask a longtime Microsoft developer about Silverlight. And be prepared to duck."

Terry Myerson, Microsoft's executive VP for Windows and Devices, announced at Build this week a number of efforts to motivate developers. Among them are .NET Standard 2.0 for UWP and XAML Standard, slated for release later this year, which Myerson said will provide a more simplified and modernized code base. Myerson remains encouraged.

"There has never been a better time to bring your apps to the Windows Store," he wrote in his blog post, where he claimed monthly in-app purchases in the Windows Store have doubled.  Myerson also noted that Microsoft will deliver  complete  UWP functionality to Visual Studio Mobile Center this fall, which will include automated build support and an extended number of Windows devices in its test cloud. 

Posted by Jeffrey Schwartz on 05/12/2017 at 12:41 PM


Dell EMC Turbocharges Flagship PowerEdge Servers Hitched to Datacenter Refresh

Dell EMC is refreshing its compute, storage and networking offerings with a broad portfolio of modernized infrastructure designed to underpin hybrid cloud datacenters of all sizes. At the core of the company's new lineup of datacenter offerings, outlined this week at Dell EMC World in Las Vegas, is an upgraded version of the flagship Dell EMC PowerEdge servers, the first developed by the newly merged company.

The company kicked off the datacenter portion of the conference with the launch of its PowerEdge 14G servers (due out this summer), which are tied to the release of Intel's next-generation Xeon processors, code-named "Skylake Purley." It's the first refresh of the PowerEdge server line in three years and, in keeping with any refresh, the new systems offer the typical boosts in feeds and speeds. And while the PowerEdge refresh will appeal to anyone looking for the latest servers, the release is also the key component of the entire Dell EMC converged and hyper-converged systems portfolio, as well as new purpose-built appliances and engineered systems.

In addition to a new line of tower and rack-based servers, the PowerEdge 14G will be the core compute platform for the forthcoming Azure Stack system and a new portfolio of datacenter tools, including a new release of its NetWorker data protection offering and upgrades to the VxRail 4.5, VxRack and XC Series engineered systems (which run Windows Server, Linux and VMware, among others). "This is our 14th generation of servers, which is actually the bedrock of the modern datacenter," said David Goulden, president of Dell EMC, during the opening keynote session.

The new PowerEdge 14G servers will be available for traditional datacenter applications as well as Web-scale, cloud-native workloads. Among the key upgrades Dell EMC will deliver in the new PowerEdge line are increased app performance and response times. The company claims the servers will offer a 19x boost in Non-Volatile Memory Express (NVMe) low-latency flash storage, single-click BIOS tuning that allows for simplified and faster deployment of CPU-intensive workloads, and the ability to choose from a variety of software-defined storage (SDS) options. "We knew we had to accelerate the workloads. We had to reduce the latency to make sure we have handled the performance to transform people's businesses," said Ashley Gorakhpurwalla, president of the Server Solutions division at Dell EMC. The server's new automatic multi-vectoring cooling allows a greater number of GPU accelerators, which the company claims can increase the number of VDI users by 50 percent.

In addition to the performance boost, company officials are touting a more simplified management environment. The servers will support the new OpenManage Enterprise console and an expanded set of APIs, which Dell EMC said will deliver intelligent automation. The company described the new OpenManage Enterprise as a virtualized enterprise system management console with a simple user interface that supports application plugins and customizable reporting. A new Quick Sync feature offers server configuration and monitoring on mobile devices. It boasts a 4x improvement in systems management performance over the prior version and can offer faster remediation with its ProSupport Plus and Support Assist, which the company claims will reduce the time to resolve failures by up to 90 percent.

Dell EMC has also added some noteworthy security capabilities embedded in the hardware, which the company said offer new defenses including SecureBoot, BIOS Recovery, signed firmware and an iDRAC RESTful API that conforms to the Redfish standard. It also has better protection from unauthorized access control changes, with a new System Lockdown feature and a new System Erase function that ensures all data is wiped from a machine when it is taken out of commission.
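Because the iDRAC API conforms to the DMTF Redfish standard, basic inventory queries follow a well-known REST pattern. The snippet below is a minimal sketch, assuming a reachable management controller; the address and credentials are placeholders, and the exact properties returned vary by firmware version.

    import requests

    # Minimal sketch of querying a Redfish-conformant management controller
    # (such as iDRAC) over its REST API. Host address and credentials are
    # placeholders.

    BASE = "https://idrac.example.com"       # hypothetical management address
    AUTH = ("root", "calvin")                # replace with real credentials
    VERIFY_TLS = False                       # lab-only; verify certs in production

    # The Redfish standard roots the API at /redfish/v1; Systems is the
    # collection of managed server nodes.
    systems = requests.get(f"{BASE}/redfish/v1/Systems",
                           auth=AUTH, verify=VERIFY_TLS).json()

    for member in systems.get("Members", []):
        system = requests.get(f"{BASE}{member['@odata.id']}",
                              auth=AUTH, verify=VERIFY_TLS).json()
        print(system.get("Id"), system.get("PowerState"), system.get("Model"))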

The new PowerEdge servers were part of a number of other key datacenter offerings announced by the company this week. "Our new 14G servers will be built into our full Dell EMC product portfolio, bringing out our seventh generation of storage and data protection as well," Goulden said. The servers will be offered with a variety of the company's new software-defined enterprise storage systems, including a new version of the Dell EMC ScaleIO software-defined storage (SDS); upgrades to the company's Elastic Cloud Storage (ECS) platform, including the ECS Dedicated Cloud Service for hybrid ECS deployments and ECS.Next, which will offer upgraded data protection and analytics; and the new Project Nautilus SDS offering for storing and streaming IoT data. The servers will also power Dell EMC's new Ready Node portfolio, designed to transition traditional datacenters into cloud-scale infrastructure.

In addition to storage, Dell EMC said the PowerEdge 14G will power the company's new Open Networking switches, including what the company claims is a top-of-rack switch offering more than twice the in-rack throughput of traditional 10GbE switches, a unified platform for network switching and a new line for small and midsize organizations.

It will also run Dell EMC's forthcoming Azure Stack appliance, announced last week and on display on the show floor. I have a meeting on that offering and will file a separate post with more information; it isn't set to ship until the second half of this year.

Posted by Jeffrey Schwartz on 05/10/2017 at 2:14 PM


Surface Laptop Success Will Depend on Long Battery Life

When Microsoft introduced its Surface Laptop last week, the company boldly promised it would "reset" the mobile PC category. Besides some innovative mechanical engineering, an impressive high-resolution PixelSense display that renders 3.4 million pixels and a lightweight, thin form factor, company officials were especially proud of the battery life the Surface Laptop is poised to achieve: 14.5 hours when continuously running video.

Most people take such battery life claims with a grain of salt, a point I raised with the lead engineer for the Surface Laptop at last week's launch event in New York City. The engineer, who requested his name not be used, seemed to take exception to my skepticism of such best-case claims. The device was a year in the making in collaboration with Intel, and the engineer was emphatic that the Surface Laptop's battery life will prove impressive.

First off, he emphasized the improvements in Intel's 7th Generation Core processors and the work the two teams have done to ensure that the new Windows operating system and the engineering applied to the Surface Laptop deliver long battery life. Second, the team drew on its previous efforts, using telemetry from earlier versions of the operating system, the Edge browser and Office.

"Architecturally we took a slightly different approach to developing the Surface Laptop in that we deliberately load-switched almost all of the subsystems to optimize those subsystems for when we need to bring them up and power them down," the engineer explained. "From the beginning there was a conscious effort to prolong battery life, increase connected standby time and off state power to minimize it."

The fact that the battery pack has four equal cells is also a key factor, he added.  "What's nice about that, is there are two serial and two parallel cells, which optimizes battery life because every cell works exactly the same way. You don't leave a lot of capacity on the table, and over the life of the battery, you have less aging."

Similar to the Surface Pro hybrid laptop PCs, the battery in the new laptop can't be swapped out. Many users of the Surface Pro 3 reported short battery life on certain models, attributed to some bad batteries, and customers with expired or no extended warranties were out of luck. Will those who purchase the new Surface Laptop have better luck? The engineer was pretty confident that they will and that Microsoft has learned a lot about optimizing battery life since then.

"I do understand the issue that you may have had with the claims versus reality," he admitted. "We've done a lot of work over the last couple of years to make sure that the claims match the experience a lot more so. A lot of work went into getting to the 14.5 hours. We wouldn't have claimed it if we hadn't validated it with numerous SKUs and multiple lots, and a substantial number of devices."

Besides the battery life, I asked what else will help the Surface Laptop justify its premium price over top-tier Ultrabooks from key OEMs. "Look at the thickness of this device, the fitting of the motherboard underneath that keyset and key travel, the vapor chamber design, the heat pipes underneath and the spreading of that heat," he said as he showed me the system. "And then we vent all of the heat out of the back, so there's no exposure. And if the fan drives all of the exhaust air out the back, we actually have a real challenge in that we hold the touch temperatures to a really low temperature."

It sounds impressive but the temperature will be pretty high if the Surface Laptop doesn't offer the superior battery life promised.

Posted by Jeffrey Schwartz on 05/08/2017 at 10:29 AM


Dell EMC Reveals Hybrid Cloud Platform Running Azure Stack

Today marks the two-year anniversary of Microsoft's first Ignite conference in Chicago where the company revealed plans to offer Azure Stack, the same software that runs its public cloud, and also unveiled the technical specifications allowing customers and service providers to run iterations of Azure in their own datacenters. While the company's vision for Azure Stack changed last year after the release of the first technical preview, Microsoft has signaled it will appear in the latter half of this year and there are now signs it will soon see the light of day.

Dell EMC offered a key indicator today that Microsoft is on track, introducing a single-node server for developers and a 4-node converged system aimed at dev and test, both of which will appear when Microsoft officially releases the software later this year. Dell EMC is one of four top datacenter infrastructure providers with which Microsoft has engineering and co-development pacts to offer Azure Stack-certified systems. In addition to Dell EMC, Microsoft is working with Cisco, Hewlett Packard Enterprise (HPE) and Lenovo, which have all indicated they'll have Azure Stack-based systems and have demonstrated prototypes over the past few months.

Following an interview with Dell EMC officials, it's clear that the company is taking a different approach to its Azure Stack systems than the Cloud Platform System running Windows Azure Pack (WAP). The CPS was introduced by Dell prior to its merger with EMC. Now that the two companies are combined, EMC's influence on the Azure Stack platform is apparent. Today's announcement comes in advance of its Dell EMC World gathering, which will take place next week in Las Vegas, where officials plan to emphasize hybrid cloud.

Since the completion of the Dell-EMC merger last summer, the server, network and storage infrastructure product groups have been consolidated at EMC's headquarters in Hopkinton, Mass. The development of its Azure Stack appliances is now based on EMC's approach to hybrid cloud hardware, said Kevin Gray, Dell EMC's director of product marketing for hybrid cloud platforms.

"Our first enterprise hybrid cloud was based on the VMware vRealize Suite and all their software defined datacenter components and the virtualization," Gray said. "We build integrations to IaaS layer, things like backup and encryption as a service, and we're extending that approach and model to Azure Stack. We are leveraging the partners we've had with Microsoft and the expertise we both have with hybrid cloud."

In addition to its vRealize-based offering, Dell EMC offers its Pivotal Cloud Foundry native hybrid cloud platform, which Gray said focused on enterprises and partners building cloud-native analytics modules. "We are moving this model to Azure Stack," he said.

Gray said the company isn't revealing hardware specs at this time, other than to note that the entry-level 1-node system doesn't come with the entire infrastructure and tooling planned for deployment, as it's intended only for skilled developers. It will carry a list price of $20,000; the 4-node system will carry a list price of $300,000.

However, Gray said where Dell EMC believes it will have a differentiated Azure Stack offering is its backup-as-a-service capability, based on Dell EMC NetWorker and Data Domain storage. "We back up not just the data produced by the applications, we actually protect that underlying infrastructure of the Azure environment," he said. "So, all of the bits that are created at the IaaS and PaaS layer are protected, as well as the underlying infrastructure, making sure we back up the full environment."

Dell EMC's Azure Stack Platform will also offer the company's CloudLink SecureVM encryption offering. This secure VM tool is available in the Azure catalog and enables encryption of virtual machines such that they're portable and remain secure as they move between hosts. "That really ensures that workloads remain secure, wherever they are running in the datacenter, whether it's in public cloud or if it's the on-premises instance of Azure Stack," Gray said.

While Gray emphasized that the 4-node system will be targeted at development as well, he indicated systems aimed at production deployments will also arrive by launch time.

Posted on 05/04/2017 at 1:27 PM


Microsoft Planning New UWP-Only Windows 10 S and Unveils Surface Laptop

Microsoft wants to see Windows PCs, Office 365 and its forthcoming mixed reality wares in every classroom from kindergarten through high school and college. The company has taken the wraps off perhaps its broadest effort yet to accomplish that goal.

At an event in New York today, Microsoft unveiled Windows 10 S, a version of Office 365 optimized for educational environments, and a new Surface Laptop that company officials said will exceed the capabilities of existing mobile PCs, Chromebooks and Apple MacBooks. The company also released a version of its Intune configuration and management service customized for educational environments.

It's not lost on any provider of technology that capturing the student demographic is critical, since that's the time they form preferences and allegiances to specific platforms and applications. Likewise, making it easier for students to learn and collaborate with each other, teachers and parents is critical, said Microsoft CEO Satya Nadella, who discussed how he's spent the past two years visiting classrooms all over the world.

"Technology should make teachers' lives simpler and spark students' creativity, not distract from it," Nadella said, in remarks kicking off today's MicrosoftEDU event. "This is a top priority we are focused on at Microsoft. Today we are delivering an accessible streamlined platform, readily available to all classrooms so teachers spend less time focused on technology and more time doing what they love doing: inspiring students."

While speculation about Microsoft's plans to release new hardware has mounted for months, perhaps the biggest surprise was the launch of Windows 10 S, a version of the operating system optimized for classroom environments. It will support forthcoming View Mixed Reality learning experiences, as well as various new teaching applications and STEM-based lesson plans and apps, such as Lego's WeDo 2.0 tools focused on headsets, interactive whiteboards and accessibility.

Terry Myerson, executive VP of Microsoft's Windows and devices group, described Windows 10 S, which can run on partner devices that start at $189, up to Microsoft's high-end Surface Book, as a streamlined version of the OS that's secure and able to maintain consistent performance over years of usage.

Windows 10 S will also test the appetite for Microsoft's Universal Windows Platform (UWP) in a big way because it will not run classic Win32 software -- only apps available in the Windows Store. This restriction will ensure consistent performance and better security, Myerson explained.

"Everything that runs on Windows 10 S is downloaded from the Windows Store, which means first it's verified for security and performance," Myerson said. "When it's downloaded to the device, it runs in a safe container to ensure that the execution of applications don't impact the overall performance of the rest of the system, allowing the performance of the device to be the same on day one as day 1,000."

Still lacking in the Windows Store is the complete desktop suite of Office applications consisting of Word, Excel, PowerPoint and Outlook, which Myerson said, "will be coming soon." Another limitation that might raise some eyebrows, but also with the same goal of ensuring consistent performance and security, is the fact that Windows 10 S will only run Microsoft's Edge browser. Also, Windows 10 S won't support domain joins to Active Directory on-premises -- only via Azure AD.

Windows 10 S is slated for release this summer and can be deployed on existing Windows 10 Pro systems free of charge. New PCs sold for educational use will also include free subscriptions to Minecraft: Education Edition. Microsoft is also offering Office 365 for Education, including use of the new Microsoft Teams free of charge to students and educators. The company also released a version of Windows Intune for Education, which is now available.

Surface Laptop: 'Resets the Category'
The other big news at today's MicrosoftEDU event was the launch of the Surface Laptop, a thin and lightweight device with a 13.5-inch display available in high-end configurations that aims to offer a viable alternative to Apple MacBooks. While this is not the first Surface introduction to make such a claim -- in fact, most have -- it may have made the strongest argument yet, though it's too early to draw any conclusions since the device isn't shipping.

"This is the laptop that resets the category," said Panos Panay, Microsoft's corporate VP for devices. While it runs Windows 10 S, this system will clearly not just appeal to students, though clearly the company wants to grab the attention of those who want to go back to school this fall with a MacBook. Panay emphasized the engineering of the device, which is made of anodized metal, an Alcantara-covered textured backlit keyboard with keys that are 1.5 mm and a .2mm pitch which is has a maximum of 0.57 inches and weighs just 2.76 pounds. Microsoft officials claim the Surface Laptop will get 14.5 hours of battery life, but typically systems never achieve maximum power estimates.

Although I only spent a few minutes with the new Surface Laptop, it felt quite light given its size, and its 3.4 million-pixel display renders high-resolution visuals. The systems are priced in the same general range as Microsoft's current Surface Pro 4 line. An entry-level Surface Laptop costs $999, configured with an Intel 7th Generation Core i5 processor, 128GB SSD and 4GB of RAM. The company is taking orders for i5-based systems today, which are slated to ship June 15. Surface Laptops based on the Intel Core i7 processor are scheduled to ship in August. A Surface Laptop with an i7 processor, 16GB of RAM and a 512GB SSD is priced at $2,199.

While the Surface Laptop will run the new Windows 10 S operating system, it will also support Windows 10 Pro, and presumably enterprise editions. Microsoft is initially only selling the Surface Laptop through its own retail and online stores. Asked about plans for other retailers, resellers or channel partners to offer the Surface Laptop, a Microsoft spokeswoman said the company has no information to share at this time.

Posted by Jeffrey Schwartz on 05/02/2017 at 11:48 AM


Microsoft Showcases Customers Using Azure Cloud for Digital Transformation

Microsoft this week showcased customers that are investing in the company's newest cloud-based offerings such as the Internet of Things (IoT) and machine learning using predictive analytics.

At an event held in New York called Microsoft's Digital Experience, the company showcased more than a dozen companies that have committed in some form to piloting or implementing new wares aimed at rapidly accelerating an existing process, enabling new revenue opportunities, or both.

Companies that Microsoft said are using such Azure-based technologies include Bank of America, Hershey, Fruit of the Loom, Geico, Maersk and UBS. For example, Hershey is using Azure Machine Learning, Azure IoT and Power BI to better predict temperatures in vats based on sensor feeds, reducing waste in the process of manufacturing Twizzlers.

"This is a beautiful correlation plot that we run in Power BI, said George Lenhart III, senior manager of IS innovation and disruptive technologies at Hershey. "We turned on the machine learning, and by adjusting every five minutes with the prediction of whether it was going to be heavy or light, we were able make changes accordingly."

Maersk Transport and Logistics, one of the world's largest logistics providers, plans to use Microsoft's technology to automate its supply chain, with the goal of shaving tens of millions of dollars from its costs by bringing information to management and customers and predicting activities that may be the source of delays. "It's all about speed -- how do we go to market faster, how do we do much more with less and how do we increase our return on investments," said Ibrahim Gokcen, Maersk's chief digital officer.

Gokcen said Maersk started using Azure Machine Learning, the Azure IoT Suite and other services about a year ago and has decided to engage in a long-term effort to build out an Azure-based digital marketplace. Maersk is also considering the potential of Azure Stack, running it on more than 1,000 ships that could be anywhere in the world.

"It will be great because as we build applications in the cloud, we will be able to just drop them on premises and do some streaming analytics on data that we generate on the vessels that can't be transmitted to the cloud when they're in the middle of the ocean. Then when they reach a port they can replicate and synchronize the data."

Patrick Moorhead, principal analyst at Moor Insights & Strategy, said the event is a sign that Microsoft is making more progress with customers than it often gets credit for. "I think they're making a lot more progress with their customers than people realize. And if there's one thing I can fault Microsoft for, is I think they need to be telling this story more," Moorhead said. "They have infinitely more customers using Google cloud and AWS."

Microsoft's talk at Digital Experience included results from a Harvard Business Review study that Microsoft had commissioned. The global survey of 783 people with various roles in business operations and management found that only 16 percent considered their operations fully digital, though 61 percent said they have started going down that path. Only 23 percent said their businesses rely on many digital technologies. The survey is available for download here.

Posted by Jeffrey Schwartz on 04/28/2017 at 12:59 PM


Windows, Office and SCCM Release Schedules Now in Sync

Microsoft's announcement last week that it has combined the release cycles of Windows, Office 365 ProPlus and System Center Configuration Manager (SCCM) should be welcome news to IT managers. Even better is the fact that Microsoft has designated that those releases will come out twice per year -- in March and September.

Given Microsoft's legacy of missing deadlines by many months (and often longer for many of its major products), it's reasonable to be skeptical that Microsoft can keep to the regimen it committed to last week. But more recently, the company has pretty much kept to its release plans.

It always seemed incongruous that the different products came out at different times, though with new versions coming out every three years or so, it was certainly more understandable, especially given Microsoft's organizational makeup in the past. Now that Microsoft has moved to more frequent release cycles, the company says customers have asked that the servicing models become more consistent and predictable.

As IT plans system upgrades moving forward with the knowledge that the latest versions of Windows, Office 365 ProPlus and SCCM update releases are now aligned, this should help bring better clarity and consistency to those for whom that matters. And for IT pros who are more ambivalent about the change, there's little downside.

Microsoft said it will be fielding more questions on the changes next week during a Windows as a service AMA, scheduled for May 4 at 9 a.m. PST.

Posted by Jeffrey Schwartz on 04/26/2017 at 10:04 AM


Microsoft Broadens IoT Reach with New SaaS Offering

Microsoft has unleashed numerous new offerings to build up its extensive suite of Internet of Things (IoT) technology. Looking to extend the reach of IoT to novices, the company is planning to release a new Microsoft IoT Central SaaS-based offering.

Unlike the rest of its IoT services and tools, Microsoft IoT Central is a managed SaaS solution for those that don't have experience building automation and data-gathering capabilities. It aims to significantly accelerate those customers' ability to deploy automation and data gathering using Windows 10 IoT Core and to integrate those capabilities with existing applications and systems.

Microsoft said in last week's announcement that it will roll out the new service in the next few months. Although the company didn't reveal much about the new SaaS offering, it appears Microsoft IoT Central is aimed at customers and partners that know little about IoT, want to build applications that utilize sensors and intelligent components, and don't want to use its PaaS-based offering.

The Microsoft IoT Central announcement was one of a number of new offerings introduced last week and earlier this month. The company is showcasing its Microsoft Azure IoT Suite PaaS offering this week at the annual Hannover Messe conference and is using the event to unveil its new Connected Factory offering. Microsoft said Connected Factory is designed to simplify connecting on-premises environments -- those based on the OPC Foundation's platform-independent Unified Architecture (UA) industry standard as well as older, Windows-specific OPC Classic devices -- to Microsoft Azure, which the company said provides operational management information. IT managers can use the Azure-based Connected Factory to view and configure embedded factory devices.

Partners offering IoT gateways designed to bridge data gathered from IoT sensors and endpoint devices to the Microsoft Azure IoT Suite include Hewlett Packard Enterprise, Softing and Unified Automation. Microsoft indicated that its IoT software is integrated into those partner gateways, which reduces the configuration work needed for Azure IoT Hub.

Also adding to its IoT portfolio, Microsoft last week launched its new Azure Time Series Insights, which the company described as a managed analytics, storage and visualization service allowing users to analyze billions of events from an IoT solution interactively and on-demand. The service offers a global view of data over different event sources, allowing organizations to validate IoT products.

Microsoft said the tool is designed to uncover trends, find anomalies and perform root-cause analysis in near real time, and its user interface is simple enough that line-of-business users can create capabilities without requiring dev teams to write any code. Microsoft is also offering APIs that it said will allow customers to integrate the functionality into existing applications. The new Azure Time Series Insights service, available now in preview, is already built into the above-mentioned new Microsoft IoT Central and the existing Microsoft Azure IoT Suite.

Microsoft already has an extensive IoT portfolio with its Azure IoT offerings and Windows 10 IoT Core. The company earlier this month made available its Azure IoT Hub, announced last fall, as well as the Azure DM Client Library, an open source library that lets developers build device management capabilities into Windows 10 IoT Core devices connected to Azure. Microsoft says the new client library uses the same approach to device management as its enterprise Windows management tools.

The new Windows IoT Azure DM Client Library covers functions such as device restart and certificate and app management, along with many other capabilities introduced with the new Azure IoT Hub device management. The library is designed to accommodate the resource restrictions of sensors and other devices with embedded components, and it allows Azure IoT to remotely manage those devices.
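The DM Client Library itself targets Windows 10 IoT Core, but it rides on the same device-to-cloud connection any Azure IoT Hub device uses. As a rough illustration only, here is a minimal sketch using the Node.js flavor of the Azure IoT device SDK (the azure-iot-device and azure-iot-device-mqtt packages); the connection string and telemetry payload are placeholders, not anything from Microsoft's announcement.

```typescript
// Minimal device-to-cloud sketch against Azure IoT Hub using the Node.js
// device SDK. The connection string and payload below are placeholders.
import { Client, Message } from "azure-iot-device";
import { Mqtt } from "azure-iot-device-mqtt";

// Per-device connection string issued by IoT Hub (hypothetical value).
const connectionString =
  "HostName=<your-hub>.azure-devices.net;DeviceId=<device-id>;SharedAccessKey=<key>";

const client = Client.fromConnectionString(connectionString, Mqtt);

client.open((err) => {
  if (err) {
    console.error("Could not connect to IoT Hub:", err.message);
    return;
  }
  // Send one telemetry event; a real device would do this on a schedule.
  const telemetry = new Message(JSON.stringify({ temperature: 21.5 }));
  client.sendEvent(telemetry, (sendErr) => {
    if (sendErr) {
      console.error("Send failed:", sendErr.message);
    } else {
      console.log("Telemetry sent");
    }
    client.close(() => process.exit(0));
  });
});
```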

Addressing scenarios where connectivity to the cloud is an issue, Microsoft last week also announced a preview of Azure Stream Analytics running on edge devices. Azure Stream Analytics on edge devices uses the Azure IoT Gateway SDK, which runs on either Windows or Linux endpoints and supports hardware ranging from small components and single-board computers to full PCs, servers and dedicated field gateway devices, Santosh Balasubramanian, principal program manager for Azure Stream Analytics, explained in a separate blog post. It uses Azure IoT Hub to provide secured bi-directional communications between gateways and Azure, he noted.
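Stream Analytics jobs are defined in the service's SQL-like query language, so the snippet below is only a conceptual stand-in: a plain TypeScript sketch of the kind of tumbling-window aggregation an edge job might perform to reduce raw sensor readings to periodic summaries before they leave the gateway. The reading shape and 60-second window are assumptions for illustration.

```typescript
// Illustrative only: a tumbling-window average over sensor readings, the kind
// of reduction an edge analytics job performs before forwarding data upstream.
interface Reading {
  deviceId: string;
  value: number;
  timestamp: number; // epoch milliseconds
}

const WINDOW_MS = 60_000; // assumed 60-second tumbling window

function tumblingAverages(readings: Reading[]): Map<string, number> {
  // Group readings into fixed, non-overlapping windows per device.
  const buckets = new Map<string, { total: number; count: number }>();
  for (const r of readings) {
    const windowStart = Math.floor(r.timestamp / WINDOW_MS) * WINDOW_MS;
    const key = `${r.deviceId}:${windowStart}`;
    const bucket = buckets.get(key) ?? { total: 0, count: 0 };
    bucket.total += r.value;
    bucket.count += 1;
    buckets.set(key, bucket);
  }
  // Emit one average per device per window -- far less data than the raw feed.
  const averages = new Map<string, number>();
  for (const [key, { total, count }] of buckets) {
    averages.set(key, total / count);
  }
  return averages;
}
```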

Finally, on the IoT front, Microsoft said it has bolstered security with support for key industry standards and partnerships with several component makers at the silicon layer. Microsoft said Azure IoT now supports the Device Identity Composition Engine (DICE), which allows silicon manufacturers to put unique identification on every device, and Hardware Security Module (HSM) technology to secure device identities. New partners Micron and STMicro will enable the HSM and DICE security technologies for Microsoft, while Spyrus will support HSM as part of Secure Digital (SD) and USB storage devices.

Posted by Jeffrey Schwartz on 04/24/2017 at 2:05 PM0 comments


Microsoft: Some Don't Realize They Have a Hybrid Cloud

Most organizations have either deployed their first hybrid cloud or intend to do so within the next 12 months. But many who plan to do so may not realize they already have a hybrid cloud, according to a new report published last week by Microsoft.

The Microsoft State of the Hybrid Cloud 2017 report revealed that 63 percent of mid- and large-size enterprises already have implemented a hybrid cloud. However, Microsoft discovered that of the 37 percent claiming they haven't yet implemented their first hybrid cloud (but intend to within the next 12 months), 48 percent already unknowingly have deployed one.

Microsoft came to that conclusion, which the report described as a surprising figure, based on the infrastructure described by IT pros surveyed in December and January. "Odds are you've got a hybrid cloud strategy and have already started implementing this approach," said Julia White, corporate VP for the Microsoft cloud platform, in a blog post prefacing the report. "We know this because nine in ten of IT workers report that hybrid cloud will be the approach for their organizations five years from now."

Overall, the 1,175 IT pros who responded to Microsoft's survey had similar definitions of hybrid cloud -- defined as some type of infrastructure that integrates an on-premises environment and a public cloud. Yet Microsoft was surprised by the disparity between that definition and reality. "It seems that while the conceptual definition of hybrid cloud is not challenging for people, identifying hybrid cloud in practice is more difficult," the report stated.

That raises the question: what are the deployed solutions that customers didn't recognize as a hybrid cloud? A spokeswoman for Microsoft referred to the report's appendix, which spells out the use-case definitions. They're broken down into four categories:

  • Hybrid Applications: Any application that shares common APIs and end-user experiences with an on-premises implementation and IaaS or PaaS services. Typically, an organization can extend an application by combining on-premises data with on-demand public cloud resources when workload requirements spike.
  • Data: It's become increasingly common to use some form of public cloud to back up and/or archive data. Likewise, many mobile and Web-based front-end analytics tools may run in a public cloud, even if they're querying on-premises data.
  • User: Organizations running Office 365 or using cloud-based file sharing services such as OneDrive, Box or Dropbox, are examples of those which likely have a hybrid cloud. Cloud-based directories federated with on-premises user account information, most notably Active Directory, have become popular forms of providing single sign-on to enterprise resources including SaaS-based applications and tools. Many organizations are using either Azure Active Directory or one of several third-party identity-management-as-a-service (IDMaaS) offerings.
  • Infrastructure: The ability to extend the compute capacity of an on-premises datacenter by connecting to a public cloud-based IaaS or PaaS via a dedicated gateway, network or VPN makes up a basic hybrid cloud. It's what enables the above-mentioned hybrid cloud applications. But anyone using a monitoring tool that offers visibility and/or control over both on-premises systems and those in a public cloud also has a hybrid cloud, according to Microsoft. Likewise, many organizations that require near real-time recovery of on-premises data are using new disaster recovery-as-a-service (DRaaS) offerings, which are hybrid cloud-based. These DRaaS offerings are public cloud-based services capable of providing nearly synchronous replication between the on-premises environment and the cloud. An organization that has deployed advanced security analytics tools providing unified views and protection against threats to any on-premises endpoint or cloud-based resource also has a hybrid cloud infrastructure.

Is it a big deal that some might not realize they already have a hybrid cloud? Not really. IT pros are focused on building and running infrastructure and understanding how everything works. But the report underscores Microsoft's emphasis on hybrid clouds, which is evident in almost everything Microsoft and most other companies are developing these days. From that standpoint, the report provides a baseline for Microsoft's view of hybrid clouds, with some notable data points.

For example, it shows that the most popular use cases for hybrid clouds were (in order) productivity/collaboration, high-performance networking, SSO, DRaaS, cloud front ends to on-premises data, archiving, analytics, unified monitoring, dev test in cloud and production on-premises, advanced security and global applications.

Not surprisingly, there are notable variations depending on the industry and country the respondents were from. The 1,175 IT pros who responded were from the U.S., U.K., India or Germany. Those in regulated industries such as banking and healthcare were among the top adopters, and retailers ranked high as well. Adoption in Germany was notably low due to data sovereignty laws, while India showed high adoption, likely because of latency issues there and the need to keep data accessible.

Deployments also varied based on the age of a company. Older companies with on-premises infrastructure are more likely to have hybrid clouds than younger ones. More than half (52 percent) of companies in business for less than 10 years have hybrid clouds, while 91 percent of companies that are 25 to 49 years old have one. Curiously, the percentage drops to 84 percent among companies that are 50 years old or more.

Let's not forget that as Microsoft promotes hybrid clouds, its eye is toward the future. And that future is Azure, which White underscored last week when she pointed to Microsoft's claim that organizations can save up to 40 percent on Windows Server VMs run in Azure by using existing licenses with Software Assurance. The company outlined how, pointing to its new Azure Hybrid Use Benefit and a free migration assessment tool.

Posted by Jeffrey Schwartz on 04/20/2017 at 12:46 PM0 comments


Steve Ballmer Builds Expansive Online Government Database

Now that everyone has filed their tax returns, many may wonder where that money goes. A new effort spearheaded by former Microsoft CEO Steve Ballmer can help. Ballmer has recently started talking about the new Web site, USAFacts.org, a massive database of federal, state and local government information presented much like the 10-K filings publicly traded companies must release.

Ballmer, who is funding the site as a nonprofit, nonpartisan project, has assembled a team of people, many of whom helped create Microsoft's financial reports, and started talking up the effort in February during a TEDxPennsylvaniaAvenue talk. When he went looking for information on how government money is spent, he found no single source where people could easily learn about everything from how Medicaid is used to how many people work for the government.

To get a sense of how much information is available, check out this 2017 report, which provides performance information for a wide spectrum of expenditures, from the Medicare Trust Financials to spending on national defense and veterans' affairs to child safety. And that was just at the federal level. Ballmer noted there are more than 90,000 organizations overall in federal, state and local governments, as well as school, fire and water districts.

Recalling how, as CEO of Microsoft, he needed to know how every business unit and product was performing and why, Ballmer set out to create the same kind of benchmark for how the government is performing. "I'm a numbers guy," Ballmer said during his February presentation. "For me, numbers are a tool to tell a story and to bring things together in ways that are more precise [and] that has more context. There has to be more clarity than you'll find in any other way. Numbers add to the discussion. They take a collage of mess and turn it into something that helps you in the playing field of a complex platform."

According to the site, USAFacts provides a data-driven portrait of the American population, government, its finances and the impact it has on society. It bills itself as a comprehensive, integrated database that brings together information from federal, state and local government sources, while providing logically organized and contextual information. It draws only on the most recently reported government data.

"What we want people to be able to do is to go to work and say what do my tax dollars go for, what is the money going to get used for and what kind of outcome," he said. Ballmer emphasized that the information doesn't give forecasts and is not biased. "We don't do any forecasting, no prediction that's not factual. We are just reporting on the history," he said. "We'll let people do their own forecasting and prediction. We also don't try to take a position on any issue."

Not surprisingly, the site runs in Microsoft Azure, Ballmer said in an interview published by Geekwire yesterday. The application is .NET-based and uses REST APIs to access the system, while the front-end interfaces are JavaScript-based. Plans call for enabling users to create their own visualizations with Power BI.

Posted by Jeffrey Schwartz on 04/19/2017 at 1:48 PM0 comments


Hybrid SharePoint Deployments Poised to Expand

The results of what is believed to be the first independent survey to analyze the makeup of hybrid SharePoint environments are in. The findings reveal that nearly a third of organizations have hybrid SharePoint deployments, while nearly half (46 percent) still have deployments that are entirely on-premises and 22 percent use the SharePoint Online service offered via Office 365.

More than half have cloud-based implementations of SharePoint, yet claim they don't have plans to transition entirely to the online version, according to the survey, which was fielded last month by graduate students at the Marriott School of Management at Brigham Young University and spearheaded by Christian Buckley's CollabTalk. The survey was sponsored by Microsoft, a number of ISVs and several media partners including Redmond magazine (and is available for download here).

The survey's aim was to determine the makeup of hybrid SharePoint deployments and the extent to which organizations plan to maintain those environments as they move more applications and infrastructure to the cloud and as Microsoft continues to emphasize Office 365 as the future of its collaboration strategy. More than half of those with on-premises deployments plan to have hybrid solutions by 2020, according to the survey, which validates what proponents have long emphasized: that most organizations with a long history of using SharePoint either already have hybrid deployments or plan to move part of the platform's functionality to the cloud while maintaining hybrid implementations.

"While Microsoft's messaging continues to focus on cloud-first, mobile-first when it comes to product innovation, the company has realized the need to bring hybrid features and messaging more to the forefront in recent years," Buckley said in the introduction to a report based on the findings of the results released today. "Office 365 may be the future of collaboration, but Microsoft has softened their tone in regard to on-prem customers needing to move to the cloud -- not only reassuring them that on-prem will be supported as long as customers are using it, but acknowledging hybrid as a valid strategy for some organizations."

While 32 percent of organizations of all sizes claimed to have hybrid SharePoint deployments, nearly half (49 percent) have hybrid licenses, while 35 percent said they have on-premises licenses and 17 percent used SharePoint Online.

The overall makeup of SharePoint licenses, according to the survey, shows that 63 percent were on-premises and 37 percent were online. Given the overlap and the fact that it's not unusual for Microsoft customers to have more licenses than actual usage, the report based on the survey stated: "We know that the number of SharePoint users are fewer than the number of licenses, and consequently, the percentage of licenses coming from companies using hybrid solutions will be greater than the number of users they have."

Among the 626 respondents representing 510 different organizations, the findings not surprisingly showed that mid- and large-size enterprises are more partial to hybrid and on-premises implementations of SharePoint and less likely to move everything online. Small businesses are more likely to already have or plan to use SharePoint entirely online. More than half of those with on-premises SharePoint implementations today will have hybrid deployments by 2020.

Small businesses with 51 to 200 employees accounted for 16.7 percent of respondents, while 21.4 percent had 1,001 to 5,000 employees and 17.5 percent had more than 10,000. Respondents came from all over the world, with the U.S. making up 35 percent of the sample.

The findings underscored the overlap between licenses and users. To that point, the survey noted: "In many cases, we assume that a single user possesses a single license, which, in many cases may be true, but not in a hybrid environment, where a single user may have two licenses: SharePoint Online and on-premises SharePoint. Recognizing this might not always be true, with some on-premises users not having SharePoint Online licenses and some online users without access to the on-prem environment, because the scope of our analysis focused specifically on SharePoint, we found that respondents overwhelmingly owned or planned to acquire Office 365 e-licenses where SharePoint is included. In other words, SharePoint on-prem users were generally given appropriate online licenses."

Organizations least likely to move from SharePoint on-premises to Office 365 are those with workloads that require SQL Server Reporting Services Integrated Mode, PowerPivot for SharePoint or PerformancePoint Services, John White, SharePoint MVP and CTO of UnlimitedViz, noted in the report.

"Companies that have made significant investments here cannot move these assets, making a complete move to the cloud impossible. Hybrid is the only cloud option," White stated. "Combine this with the prevalence of third-party solutions (Nintex, K2, etc.) and custom solutions, and it is easy to see why some on-premises presence will be with us for quite some time."

Such issues, and a vocal SharePoint community, are among the reasons Microsoft has shifted its emphasis toward hybrid deployments. Speaking to that issue in the report, Jared Shockley, senior service engineer at Microsoft, said: "Migration of customizations used in on-premises installations is biggest blocker to cloud migrations of SharePoint. Many companies do not want to rethink or redevelop these solutions as that is an additional expense to the migration. There are tools and frameworks, like Cloud App Model, to accomplish this work. But they are not the same as on-prem tools and frameworks. This training for the development teams can be one of the main blockers for migrations to SharePoint Online."

While the report notes Microsoft is closing this gap, "the challenge of reducing functionality to end users is not trivial," said Ed Senez, president of UnlimitedViz, an ISV that provides an analytics solution for SharePoint, Office 365 and Yammer. "A simple example is that SQL Server Reporting Services (SSRS) does not currently have a cloud solution. It would be a hard sell to tell employees that you can no longer get the reports that you need to do your job. Again, this gap is closing, but it remains an issue."

Posted by Jeffrey Schwartz on 04/17/2017 at 3:43 PM0 comments


McAfee Emphasizes Threat Intelligence with Spinoff from Intel Now Complete

McAfee is once again a freestanding provider of security software, following last week's completion of its divestiture from Intel, which was announced last fall. Private equity firm TPG acquired a majority 51 percent stake in the McAfee spinoff for $3.1 billion, though Intel retains a strong vested interest with its remaining 49 percent ownership. Now free from Intel's control, McAfee is no longer beholden to the interests of the chip provider, giving it a freer hand to compete with the likes of IBM, Symantec, Sophos and Trend Micro, among others.

Chris Young, who ran Intel Security, is now McAfee's CEO. TPG has suggested further acquisitions are likely and said in its strategy statement that it intends to "build and create one of the largest, independent, pure-play cybersecurity companies in the industry." As many have noted, Intel's $7.7 billion acquisition of McAfee back in 2011 didn't live up to its promise. Now McAfee hopes to gain ground in a much different IT security landscape.

Nevertheless, McAfee has a formidable and wide range of cybersecurity offerings, including its flagship endpoint security software, intrusion detection and prevention tools, its Enterprise Security Manager SIEM offering, and e-mail security, Web security and vulnerability scanning tools. While it exited the next-generation firewall (NGFW) business, ePolicy Orchestrator has become an "anchor" platform for Intel Security, and now McAfee, according to ESG Senior Principal Analyst Jon Oltsik, in a Network World blog post. Oltsik, who has followed McAfee since the days it was known as Network Associates, said McAfee's challenge is to regain its leadership in endpoint security, become less product-focused, emphasize the C-suite and focus on cloud security, an area the company hasn't adequately addressed.

One area where McAfee has invested heavily is threat intelligence: ePolicy Orchestrator ties into its Threat Intelligence Exchange (TIE), whose wide gamut of partners supports its Data Exchange Layer (DXL), which the company recently made available as open source in hopes of extending adoption.

In the first McAfee Labs Threat Report following the spinoff, the company identified five critical challenges to handling threat intelligence: volume, validation, quality, speed and correlation. The 49-page report is available for download, though here's an edited synopsis of the five challenges McAfee Labs believes the industry must address:

  • Volume: The Internet of Things has led to the deployment of millions of security sensors creating high volumes of data that feed into threat intelligence tools, which include streaming analytics and machine-learning software that process and analyze the data. While these tools have improved internal threat detection, they have created a massive, as-yet-unsolved signal-to-noise problem. Vendors are tackling this in various ways, such as building access monitors that scan sensitive data, sophisticated sandboxes and traps that can resolve contextual clues about a potential attack or suspicious event.
  • Validation: Given the ability for threat actors to issue false threat reports designed to mislead or overwhelm threat intelligence systems, it's essential to validate the sources of shared threat intelligence.
  • Quality: Vendors need to rearchitect security sensors to capture and communicate richer trace data to help decision support systems identify key structural elements of a persistent attack. Filters, tags and deduplication are critical. McAfee is among six founding members of the new Cyber Threat Alliance (CTA), launched in February during the RSA Conference, which is looking to address the quality issue. Joined by Check Point, Cisco, Fortinet, Palo Alto Networks and Symantec, the CTA will automatically score the quality of threat intelligence data, but it can only do so if members supply quality input.
  • Speed: The latency between a threat detection and the receipt of critical intelligence remains an issue. Open and standardized communication protocols, optimized for sharing threat intelligence, are essential for successful threat intelligence operations. Advanced persistent threats and sophisticated, targeted campaigns often hit multiple organizations in specific vertical industries, meaning communications through an intermediary or exchange must occur within hours of the first indication of an attack.
  • Correlation: As threat intelligence is received, correlating the information -- while looking for patterns and key data points relevant to the organization -- is critical. Vendors must find improved ways to share threat intelligence among different products and improve methods to automatically identify relationships between the intelligence collected and ultimately to employ machine assistance to simplify triage.

While the report points to an industry call to action, it also gives a synopsis of McAfee's priorities regarding threat intelligence, an emphasis kicked off back in 2014 with the launch of its DXL threat exchange. Oltsik noted the DXL platform is effectively security middleware. The TIE includes products from dozens of exchange members that offer network management, application and database security, incident response, forensics, endpoint and mobile device management platforms, authentication, encryption, data loss prevention and cloud security.

Posted by Jeffrey Schwartz on 04/14/2017 at 11:42 AM0 comments


Despite Conflicting Reports, Signs Point to PC Growth

PC shipments have increased for the first time in five years, according to the latest quarterly report issued by market researcher IDC. The first-quarter PC Device Tracker report, which also showed HP regaining the top position from Lenovo, found that the 60.3 million units shipped worldwide during the period represented year-over-year growth of 0.6 percent.

While the increase may appear negligible, it was a surprise versus the projected 1.8 percent decline, according to IDC, which released the quarterly report yesterday (the same day Microsoft released its new Windows 10 Creators Update). The growth spurt is particularly surprising given that the beginning of the year is historically slow for PC sales.

Jay Chou, research manager of IDC's PC Tracker reports, noted in a statement that competition from tablets and smartphones, along with longer PC lifecycles, has pushed PC shipments down 30 percent since their peak at the end of 2011, though he disputed the notion that those devices are the reason for the decline.

"Users have generally delayed PC replacements rather than giving up PCs for other devices," Chou stated. "The commercial market is beginning a replacement cycle that should drive growth throughout the forecast. Consumer demand will remain under pressure, although growth in segments like PC gaming as well as rising saturation of tablets and smartphones will move the consumer market toward stabilization as well."

Despite the first uptick in global PC shipments since 2012, the U.S. market wasn't a contributor to that growth. According to IDC, the overall PC market in the U.S. declined slightly with shipments of 13.3 million units. The report noted strong demand for Chromebooks as well.

HP Back on Top
It appears HP was a key driver of last quarter's surprise surge. HP's shipments rose 13.1 percent, making it the market-leading supplier of PCs for the first time since 2013, when it ceded the top spot to Lenovo, which now drops to the No. 2 position. HP shipped 13.1 million units in the first quarter, compared with 11.6 million during the same period last year, giving it 21.8 percent of the market versus a 19.4 percent share a year earlier. Lenovo shipped 12.3 million units, giving it a 20.4 percent share of the market.
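For context, the share figures follow directly from the unit counts; a quick back-of-the-envelope check in TypeScript (the small difference on HP's number comes down to rounding in the published figures):

```typescript
// Back-of-the-envelope check of the Q1 market-share figures reported above,
// using IDC's 60.3 million worldwide total.
const totalUnitsM = 60.3;
const hpUnitsM = 13.1;
const lenovoUnitsM = 12.3;

const share = (units: number): string =>
  ((units / totalUnitsM) * 100).toFixed(1) + "%";

console.log("HP share:", share(hpUnitsM));         // ~21.7% (IDC reports 21.8%)
console.log("Lenovo share:", share(lenovoUnitsM)); // 20.4%, matching IDC
```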

Dell, the No. 3 supplier, saw 6.2 percent growth with 9.6 million units shipped. Apple came in fourth with 4.2 million shipments, up 4.1 percent, and Acer's 4.1 million systems shipped represented a 2.9 percent increase. Outside the top five players, the rest, lumped together as "other," saw an 11.4 percent decline.

Despite IDC's optimistic report, rival Gartner's quarterly findings, also released yesterday, contradicted those numbers, reporting a 2.4 percent overall decline. According to its report, it was the first time in years that global PC shipments fell below the 63 million-unit threshold. Likewise, while HP took the top spot in U.S. shipments, Lenovo retained its leadership globally, though the latter's growth was considerably lower.

It's not the first time the two firms have issued contradictory reports, because they use different tracking methods and metrics. The two did agree that the top three players -- Dell, HP and Lenovo -- will battle it out among enterprises. "The market has extremely limited opportunities for vendors below the top three, with the exception of Apple, which has a solid customer base in specific verticals," said Gartner Principal Analyst Mikako Kitagawa, in a statement.

Depending on whose numbers you buy into, the PC business in 2017 has either gotten off to a surprisingly good start or has yet to hit rock bottom. In either case, despite this week's release of the Windows 10 Creators Update, it may be premature to say we've entered the post post-PC era.

Posted by Jeffrey Schwartz on 04/12/2017 at 2:29 PM0 comments


Microsoft Packs New Tools with Windows 10 Creators Update SDK Release

As Microsoft gets set to roll out the next major release of Windows 10, the company is also priming the pump for developers to take advantage of the latest new features coming to the OS -- 3D, mixed reality, improved natural language interaction with Cortana, enhanced inking capabilities and support for the new Surface Dial -- with this week's release of a new SDK.

Microsoft is set to start rolling out the latest Windows 10 upgrade, called the "Creators Update," this coming Tuesday. It's the second major upgrade to Windows 10 since Microsoft launched the OS nearly two years ago. The first came last summer with the release of the Windows 10 Anniversary Update and offered better stability and improved security. There are noteworthy security improvements in the Creators Update as well, but Microsoft has a big stake in the new usability features designed to make the OS more attractive to end users.

The new Creators Update SDK, which was made available for download on Wednesday, includes a broad set of tooling for developers. In addition to the SDK, the download includes Visual Studio 2017 UWP Tooling. The Windows Store is also accepting applications built around the Windows 10 Creators Update. "We expect users to once again move rapidly to the latest and best version of Windows," said Kevin Gallo, Microsoft's corporate VP for Windows developers, in a post announcing the release of the Creators Update SDK. "For developers, this is the time to get ready for the next wave."

The SDK lets developers create a new Universal Windows Platform (UWP) app or build one from existing app code on Windows. Microsoft posted a list of new and improved features for developers, as well as a list of new namespaces added to the Windows SDK.

A laundry list of new capabilities in the SDK outlined by Microsoft includes:

  • Desktop to UWP Bridge, to help convert existing applications to UWP apps and integrate them with other apps
  • Ink improvements that offer richer input and ink analysis, which analyzes ink stroke input for Windows Ink apps and covers shape detection and recognition, handwriting recognition, and layout interpretation and classification. The SDK also includes a new Ink toolbar, support for input injection to programmatically generate and automate input from a variety of devices, and the ability for developers to specify inking apps via the new Ink Workspace.
  • Windows IoT Core updates with support for Cortana, a spruced-up IoT Dashboard, Azure Device Management and Device Guard for IoT, among other new updates.
  • UWP App Streaming Install, which lets users launch an app before it's fully installed for faster access.

Gallo pointed to other features in the SDK, including APIs for the Surface Dial and "significant Bluetooth improvements with Bluetooth LE GATT Server, peripheral mode for easier discovery of Windows Devices and support for loosely coupled Bluetooth devices (those low energy devices that do not have to be explicitly paired)." He also pointed to the recently released Android SDK for Project Rome.

While Microsoft has emphasized that the Creators Update will allow users to generate new 3D images, support for mixed reality headsets priced in the $300 range will test user interest in holograms. Microsoft also recently rolled out its Mixed Reality toolkit. Little is known about release dates and other specifics for the various headsets in the pipeline, though Mashable's Lance Ulanoff yesterday was the first to publish a review of the forthcoming Acer Mixed Reality Headset.

Posted by Jeffrey Schwartz on 04/07/2017 at 12:54 PM0 comments


Microsoft Stores Sell Samsung's New Android Phone

A Microsoft Store might not be the first place one would look for an Android phone, but the company's retail locations are taking orders for the new Samsung Galaxy S8, launched last week.

Even as Microsoft stores already sell other competitive devices, such as the Facebook Oculus virtual reality headsets, and the company itself now supports Apple's iOS and Android in numerous ways, the idea of it now selling an Android phone -- and one from the largest supplier -- is somewhat striking.

Certainly, it's not remarkable considering Microsoft's support of other platforms and that Samsung has offered Microsoft apps on its previous phones, the Galaxy S6 and S7 (Samsung's Note 7, by contrast, was pulled from the market last year after some units started catching fire). Indeed, 11 device manufacturers agreed to preinstall Office on their Android devices more than two years ago.

In addition to Office, the Samsung Galaxy S8 Microsoft Edition can be customized with OneDrive, Cortana and Outlook, among other Microsoft offerings. Presuming Samsung has put the battery issues that forced it to pull the Note 7 off the market behind it, the S8 is expected to be a hit, considering Android is the most widely used mobile platform and Samsung makes the most popular Android-based phones.

The Galaxy S8, which will offer improved video recording and rendering and introduce new biometric authentication options, is the first to use Qualcomm's new Snapdragon 835, which ultimately will introduce support for new high-speed Gigabit LTE-class connectivity that carriers are expected to roll out later this year (here's a listing of specs).

Microsoft is only selling the new phones in its stores, not online. They're due to arrive April 21.

Posted by Jeffrey Schwartz on 04/03/2017 at 12:19 PM0 comments


Remembering Microsoft's First Communications Exec Pam Edstrom

Pam Edstrom, who many say played a key role in shaping the image of Microsoft and its cofounder Bill Gates, passed away last week at the age of 71 following a four-month battle with cancer.

Microsoft hired Edstrom in 1982 as its first director of public relations where she crafted the company's communications strategy, which many believe helped bring visibility to what was then an obscure startup. Two years later, she joined Melissa Waggener Zorkin, who, at the time, had a boutique PR agency. The two later formed Waggener Edstrom, now known as WE Communications.

Initially, Edstrom balked at Waggener-Zorkin's overtures to join her, until she convinced Gates and then-Microsoft President Jon Shirley of what the two of them could do for Microsoft together. In those early years, Edstrom cultivated relationships with influential business and technology reporters, helping spread the mission and values of Gates and Microsoft and the oft-described goal of bringing PCs to every user. "We spent time very directly designing our roles, and spent endless hours simply working side by side," Waggener-Zorkin said in a blog post published on the agency's Web site in a tribute to Edstrom.

Many veterans of the agency have landed key communications roles at companies such as Expedia, Lenovo, Starbucks and T-Mobile, according to Geekwire, which was the first to report on Edstrom's death. Microsoft's current VP of communications, Frank X. Shaw, is a former president of the agency, the report noted.

In response to her death, Gates told The New York Times that Edstrom "defined new ways of doing PR that made a huge mark on Microsoft and the entire industry."

Posted by Jeffrey Schwartz on 04/03/2017 at 12:18 PM0 comments


Microsoft Adds Web Application Firewall to Azure Application Gateway

Looking to protect sites running in its public cloud from malicious attacks, Microsoft this week released a new Web Application Firewall (WAF) option for Azure Application Gateway, its HTTP load-balancing service.

Microsoft said its new centralized WAF service, announced last fall at Microsoft's Ignite conference, will protect Web apps running with the Azure Application Gateway from common exploits such as SQL injections and cross-site scripting attacks.

Preventing Layer-7 app-level attacks is difficult, requiring laborious maintenance, patching and monitoring throughout the application tiers, according to Yousef Khalidi, Microsoft corporate VP for Azure Networking. "A centralized Web application firewall (WAF) protects against Web attacks and simplifies security management without requiring any application changes," Khalidi said in a blog post this week announcing the release of the Azure WAF service. "Application and compliance administrators get better assurance against threats and intrusions."

Microsoft's Azure Application Gateway is the company's Application Delivery Controller (ADC) Layer-7 network service, which includes SSL termination, load distribution and URL path-based routing and can host multiple sites, according to Khalidi. The new ADC service in Azure also offers SSL policy control and end-to-end SSL encryption and logging.

"Web Application Firewall integrated with Application Gateway's core offerings further strengthens the security portfolio and posture of applications protecting them from many of the most common Web vulnerabilities, as identified by Open Web Application Security Project's (OWASP) top 10 vulnerabilities," Khalidi noted. The WAF comes with OWASP ModSecurity Core Rule Set (3.0 or 2.2.9), designed to protect against these common threats, he added.

Besides SQL injection and cross-site scripting, Khalidi noted the WAF offering protects against command injection, HTTP request smuggling, HTTP response splitting and remote file inclusion attacks. It also addresses HTTP protocol violations, bots, crawlers, scanners and common misconfigurations of application infrastructures, notably in IIS and Apache.
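For a rough sense of what such rules do mechanically, here is a deliberately simplified TypeScript sketch of request inspection. It is a toy stand-in only; the OWASP Core Rule Set applied by the managed service is far more extensive, and the two patterns below are illustrative assumptions, not actual rules.

```typescript
// Toy illustration of WAF-style request inspection. The real OWASP ModSecurity
// Core Rule Set is far broader; these two patterns are examples only.
const suspiciousPatterns: { name: string; pattern: RegExp }[] = [
  { name: "sql-injection", pattern: /'\s*(or|union|select)\b/i },
  { name: "cross-site-scripting", pattern: /<\s*script\b/i },
];

function inspect(queryString: string): string[] {
  const decoded = decodeURIComponent(queryString);
  return suspiciousPatterns
    .filter(({ pattern }) => pattern.test(decoded))
    .map(({ name }) => name);
}

// A crude injection attempt trips the first rule.
console.log(inspect("id=1%27%20OR%201=1")); // ["sql-injection"]
```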

As one would expect from a WAF, Microsoft's new service is designed to fend off denial-of-service attacks occurring simultaneously against multiple Web apps. Azure Application Gateway can currently host up to 20 sites behind each gateway, all of which can be protected against such attacks. The service is offered with the medium and large Azure Application Gateway types, at $94 and $333 per month, respectively.

Microsoft said it intends to add the new WAF service to its Azure Security Service, which scans cloud-based subscriptions for vulnerabilities and recommends ways to remediate issues that are discovered. That service currently doesn't include protection of Web apps that aren't covered by a WAF, though it does offer third-party firewalls from Barracuda Networks Inc., Check Point Software Technologies Inc., Cisco, CloudFlare, F5, Fortinet Inc., Imperva Inc. and Trend Micro, among others.

Posted by Jeffrey Schwartz on 03/31/2017 at 11:48 AM0 comments


Skype for Business via Office 365 To Interoperate with Cisco Endpoints

Among the slew of improvements to Microsoft's Skype for Business and Cloud PBX offering announced at this week's annual Enterprise Connect conference in Orlando, Fla., one that stood out was Polycom's new RealConnect, which will allow Office 365 users with Skype for Business to add Cisco devices to meetings.

It's noteworthy because it marks an important step by Microsoft and Polycom to extend the reach of Skype for Business Online by permitting users to connect using other vendors' conferencing equipment. Given that Cisco is the leading provider of VoIP phones and PBXs and has a formidable videoconferencing systems business, the capability promises to widen the reach of Skype for Business.

"We want to make sure you have a great video experience regardless of what platform your users are on, and regardless of what platform you develop on, that extends across a number of different platforms," said Ron Markezich, Office 365 corporate VP, during his keynote address today at the conference.

Delanda Coleman, a Microsoft product marketing manager, joined Markezich on stage to demonstrate the interoperability capability that Polycom's RealConnect will offer when it's released next month. "Now any legacy Cisco VTC [videoconferencing system] can join the Skype for Business meeting without any problems," Coleman said.

Polycom is the leading provider of Skype for Business handsets and videoconferencing systems. While connecting to Cisco devices is an important step, it also suggests Polycom will look to connect with other devices, software and services. "Polycom RealConnect for Office 365 simplifies the video world by connecting Skype for Business online users with those using other video systems," Mary McDowell, Polycom's CEO stated in Markezich's blog post announcing the Cloud PBX and SfB upgrades. "This cloud service protects customers' investments in existing video systems as it allows these users to join a Skype for Business meeting with a single click."

It's reasonable to presume Microsoft will certify bridging solutions from other partners, which could lead to further usage of Skype for Business over the long haul, even if it means organizations will hold onto those existing systems longer.

Posted by Jeffrey Schwartz on 03/29/2017 at 1:39 PM0 comments


Microsoft Emboldens EMS as Partners and Rivals Explore Its APIs

Microsoft's Enterprise Mobility + Security (EMS) service has come a long way over the past year, with added integration and new capabilities, as organizations grapple with what role it will play. If Microsoft has its way, EMS, bundled with Office 365 and Windows 10, will ensure customers won't choose third-party tools to secure access to data, apps and cloud services or for authentication and policy management. But despite Microsoft's declaration that EMS is the most "seamless" enterprise information protection offering, the company is also showing a pragmatic side with the recent release of the Intune APIs and partnerships with providers of rival solutions, including VMware, SailPoint and Ping Identity, among others.

Nevertheless, Corporate VP Brad Anderson has long argued the case for EMS and claims it's the most widely used enterprise mobility, application and device management offering. Anderson released a 35-minute video last week called "Everything You Want to, Need to, and/or Should Know About EMS in 2017," in which he made the case for EMS and gave demonstrations showcasing the new EMS portal and features such as conditional access, ties to the new Microsoft Security Graph, integration with Azure Information Protection and the recently released Windows Information Protection in Windows 10, and the mobile application management (MAM) SDKs, which allow EMS controls to be embedded into apps.

The slickly produced video came on the heels of a post by Anderson two weeks earlier that highlighted the coming together of PC and device management using mobile device management (MDM) approaches. It is indeed a trend that is gaining notice. It was a key topic of discussion during last December's TechMentor track at the annual Live! 360 conference in Orlando, Fla., produced by Redmond publisher 1105 Media. The application of MDM to PC and device management is also the focus of this month's Redmond magazine cover story, "Breaking with Tradition: Microsoft's New MDM Approach."

Anderson earlier this month pointed to the results of analyst firm CCS Insight's recent survey of 400 IT decision makers responsible for mobility, which found that 83 percent plan to converge PC and mobility operations into a single team within the next three years and 44 percent will do so this year. Worth noting is that 86 percent reported plans to upgrade their PCs to Windows 10 within three to four years and nearly half (47 percent) planned to do so this year.

Microsoft has reported that more than 41,000 different customers use EMS. Anderson last week argued that's more than double the size of VMware's AirWatch installed base and triple that of MobileIron. Anderson also argues that Azure Active Directory (AAD), the identity management solution offered for both EMS and Office 365, obviates the need for third-party identity management-as-a-service (IDMaaS) offerings.

"There are more than 85 million monthly access of office 365, just shy of 84 million of them use the Microsoft solution to manage and synchronize all of their identities into the cloud." Anderson said in reference to CCS Insight's annual survey. "What that means, just a little over 1 percent of all of the monthly active users of Office 365 use competing identity protection solutions. EMS is the solution that you need to empower your users to be productive how, where and when they want, and give them that rich, engaging experience."

Asked if he agrees with Anderson's strong assertion, CCS Insight analyst Nick McQuire responded that Intune and EMS have had quite a large impact on the market over the past year, fueled by interest in Windows 10 and Office 365's growth. "Perhaps the biggest impact is the pause that it has generated with existing AirWatch, Blackberry and MobileIron customers," McQuire said. "The EMM market is slowing down and penetration rates of EMM into their customer bases is low and this is a challenge they need to address. Microsoft has contributed to this slowdown in the past 12 months, without question."

That said, McQuire isn't saying it's game over for the other providers. "At the moment, there is a real mix," he said. "Some customers are making the switch to Microsoft. Others may not have made the switch but are absolutely kicking the tires on the product and waiting to see if Intune and EMS becomes the real deal, given that it arrived late to the market and is playing catch up."

McQuire also noted that switching EMM products is not straightforward and churn rates in the industry, although unreported, are very low. "This is evidenced in the renewal rates across all the long-standing EMM players which are high (average 80 to 90 percent range) indicating that when EMM is deployed, it sticks and it becomes very hard to ask customers to rip and replace," he said.

The release of the Microsoft Graph and Intune APIs for Office 365 will help customers who don't want to move to EMS, he noted. Because EMS is offered with Microsoft Enterprise Agreements, using it with other tools will become more practical and make more customers open to using it in concert with those offerings.

"At the moment, we don't see many customers with a production environment under the coexistence model but we do see this growing rapidly this year," McQuire noted. "Microsoft's strategy here is not to concede these accounts but to land and expand."

Why does it make sense for rivals such as VMware's AirWatch or MobileIron to use the APIs? Ojas Rege, MobileIron's VP of strategy, said there are two sides to the EMS equation. One is the EMS-Intune console on the front end and the other is a set of middleware services on the back end based on the Microsoft Graph.

"If other consoles like MobileIron want to leverage them, they can," Rege said. "What does matter are these additional proprietary Microsoft features. It doesn't make sense for us to use the Graph API to activate an Intune function to lock an iOS device because we just lock the iOS device directly, but it does make sense to use the Graph API, to set a security control on Office 365."

Adam Rykowski, VMware's VP of UEM Product Management, agrees that traditional desktop PC management and MDM are coalescing and that it's fueling growth. "We are actually seeing some pretty major customers ramp up even sooner than we had expected," Rykowski said.

Andrew Conway, general manager for EMS marketing at Microsoft, posted a brief update last week on EMS and the Microsoft Graph APIs, describing them as a gateway to various offerings, including Azure AD, Outlook, OneDrive, SharePoint and Intune, among others. "The Microsoft Graph API can send detailed device and application information to other IT asset management or reporting systems," Conway noted. "You could build custom experiences which call our APIs to configure Intune and Azure AD controls and policies and unify workflows across multiple services."
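As a rough, hypothetical sketch of what calling those APIs can look like, the snippet below uses the msal-node library to acquire an app-only token and then queries the Graph endpoint for Intune-managed devices. The tenant, client ID and secret are placeholders, the app registration would need the relevant device-management permissions consented in Azure AD, and Node 18 or later is assumed for the built-in fetch.

```typescript
// Hypothetical sketch: acquire an app-only token with msal-node and list
// Intune-managed devices through Microsoft Graph. All identifiers below are
// placeholders; real use requires consented device-management permissions.
import { ConfidentialClientApplication } from "@azure/msal-node";

const cca = new ConfidentialClientApplication({
  auth: {
    clientId: "<app-client-id>",
    authority: "https://login.microsoftonline.com/<tenant-id>",
    clientSecret: "<client-secret>",
  },
});

async function listManagedDevices(): Promise<void> {
  const result = await cca.acquireTokenByClientCredential({
    scopes: ["https://graph.microsoft.com/.default"],
  });
  if (!result) {
    throw new Error("Token acquisition failed");
  }
  const response = await fetch(
    "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices",
    { headers: { Authorization: `Bearer ${result.accessToken}` } }
  );
  const body = await response.json();
  // Each entry describes one enrolled device (name, OS, compliance state, ...).
  console.log(`${body.value?.length ?? 0} managed devices returned`);
}

listManagedDevices().catch(console.error);
```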

Posted by Jeffrey Schwartz on 03/27/2017 at 1:09 PM0 comments


Google To Distrust 30k HTTPS Certs in Chrome Issued by Symantec

Google has accused Symantec of improperly issuing 30,000 Extended Validation (EV) certificates and will immediately start distrusting them in its widely used Chrome browser.

The move, a stinging indictment of the giant Certificate Authority (CA) and security provider, means SSL/TLS certs issued by Symantec will be invalidated. Users who visit affected HTTPS Web sites will receive warnings that a site's cert isn't valid but will still have access if they choose to ignore the warning. It could force Web site operators to move to other CAs. Symantec is disputing Google's charge that the certs were improperly validated.
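Site operators wondering whether they're affected can check which authority issued their certificate directly; a small sketch using Node's built-in tls module (the hostname is a placeholder) looks like this:

```typescript
// Check which certificate authority issued a site's TLS certificate using
// Node's built-in tls module. The hostname below is a placeholder.
import * as tls from "tls";

function printIssuer(host: string): void {
  const socket = tls.connect({ host, port: 443, servername: host }, () => {
    const cert = socket.getPeerCertificate();
    // issuer is an object of distinguished-name fields, e.g. { C, O, CN }.
    console.log(`${host} certificate issued by:`, cert.issuer);
    socket.end();
  });
  socket.on("error", (err) => console.error(err.message));
}

printIssuer("example.com");
```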

An investigation by the Google Chrome team, initiated on Jan. 17, initially centered on just 127 certificates. But the company has now found more than 30,000 certificates issued over a period of several years that don't fully meet Chrome's Root Certificate Policy. Ryan Sleevi, a software engineer on the Google Chrome team, announced the move yesterday in a newsgroup post.

"The Google Chrome team has been investigating a series of failures by Symantec Corporation to properly validate certificates," Sleevi noted. "Over the course of this investigation, the explanations provided by Symantec have revealed a continually increasing scope of misissuance with each set of questions from members of the Google Chrome team. This is also coupled with a series of failures following the previous set of misissued certificates from Symantec, causing us to no longer have confidence in the certificate issuance policies and practices of Symantec over the past several years."

As of two years ago, certificates issued by Symantec accounted for more than 30 percent of the valid certificates based on volume, according to Sleevi. A Root Certificate Policy requires all CAs to ensure that they validate domain controls, audit logs for signs of unauthorized issuance of certs and protect their infrastructures to avoid the ability to issue fraudulent certs. Symantec has failed to meet those requirements, Sleevi stated.

"On the basis of the details publicly provided by Symantec, we do not believe that they have properly upheld these principles, and as such, have created significant risk for Google Chrome users," said Sleevi. "Symantec allowed at least four parties access to their infrastructure in a way to cause certificate issuance, did not sufficiently oversee these capabilities as required and expected, and when presented with evidence of these organizations' failure to abide to the appropriate standard of care, failed to disclose such information in a timely manner or to identify the significance of the issues reported to them."

Despite the ongoing investigation, Symantec was surprised by Google's move. In a statement sent via e-mail by a Symantec spokesman, the company disputed Sleevi's accusations. "We strongly object to the action Google has taken to target Symantec SSL/TLS certificates in the Chrome browser. This action was unexpected, and we believe the blog post was irresponsible. We hope it was not calculated to create uncertainty and doubt within the Internet community about our SSL/TLS certificates."

For now, Sleevi noted Google is taking the following steps:

  • Newly issued Symantec certificates will only be valid for nine months or less, to reduce any impact to Google Chrome users of potential future certs that aren't issued properly.
  • All Symantec-issued certs covering different Chrome releases must be revalidated and replaced.
  • The Extended Validation status of Symantec-issued certificates will be removed for at least one year and cannot be reinstated until Symantec meets Google's guidelines.

Symantec argued the certs in question caused no harm to Web site visitors and believes Google is singling it out. Google's criticism of Symantec's certificate issuance policies is "exaggerated and misleading," according to its statement, which noted that it discontinued its third-party registration authority program to reduce concerns about trust in its SSL/TLS certs.

"This control enhancement is an important move that other public certificate authorities (CAs) have not yet followed," according to Symantec. "While all major CAs have experienced SSL/TLS certificate misissuance events, Google has singled out the Symantec Certificate Authority in its proposal even though the misissuance event identified in Google's blog post involved several CAs."

Will the two companies iron out their differences? Symantec insists it maintains extensive controls over how it issues SSL/TLS certs and says it wants to discuss the matter further with Google in hopes of resolving the dispute.

Posted by Jeffrey Schwartz on 03/24/2017 at 12:26 PM0 comments


New Tool Aims To Bring AI to Outlook via Microsoft Graph

Looking to make Outlook the center of the digital workspace, Harmon.ie has launched a new tool that brings commonly used business apps and services together in one place, gathering information from disparate apps and cloud services and using AI and machine learning to apply context to a given task. Harmon.ie, which claims its namesake Outlook plugin is used in 1,200 organizations as a common interface for interacting with information stored in SharePoint and Office 365 files, has widened its scope by supporting additional tools such as Yammer, IBM Connections, Zendesk and Salesforce.com's Chatter and CRM offerings.

The new tool, called Collage, is among the first enterprise applications to make use of the Microsoft Graph APIs, released last year, according to Harmon.ie CEO Yaacov Cohen. "It's a very important API," Cohen said in an interview. "During the last six months, Microsoft has released more and more of Graph APIs, which give not just information but insights such as who I am working with. Based on these APIs, we can deliver an experience that can deduce things for you."

Cohen indicated that Collage will work with numerous other popular workplace tools. Collage seeks to make Outlook the center of an employee's workspace. But with the AI and machine learning APIs of the Microsoft Graph, Collage also provides more context with its ability to recognize keywords used across different apps, according to Cohen.
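The "who I am working with" signal Cohen refers to is exposed through Graph endpoints such as the People API. A minimal sketch, assuming a delegated access token with the People.Read scope has already been obtained (token acquisition is omitted here) and that Node 18+ provides fetch, might look like this:

```typescript
// Minimal sketch of the "people I work with" signal from Microsoft Graph.
// Assumes a delegated access token with the People.Read scope is in hand.
async function peopleIWorkWith(accessToken: string): Promise<void> {
  const response = await fetch("https://graph.microsoft.com/v1.0/me/people", {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  const body = await response.json();
  // Results come back ranked by relevance to the signed-in user.
  for (const person of body.value ?? []) {
    console.log(person.displayName);
  }
}
```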

During a demo, Cohen showed how Collage recognizes topics people are working on and associates them with related relevant information among different apps. Collage lets users access information from Outlook in their native app experiences and brings documents from SharePoint and Office 365 as links rather than attachments to ensure information is current, Cohen explained. While it works with SharePoint Server (on-premises), it requires organizations to use OneDrive for Business to store data, Cohen said, noting its dependence on the Graph and Office 365.

"The tool is great for organizations who use Office 365 as well as other services that Collage can connect to," SharePoint MVP Vlad Catrinescu, who is president of vNext Solutions, said via e-mail. "By showcasing information 'in-context' from multiple services directly in Outlook, it allows users to be more productive, really get work done and make the right decisions because they have all the information available to them. I think Outlook is a great place to be the 'hub' of this information because, let's be honest, that's where most of the work of the classic information worker is."

In a review posted on his Web site, Catrinescu illustrated how Collage connects users with SharePoint sites relevant to a given topic, letting users view, open and drag and drop documents into SharePoint. Users can also drag and drop documents from the sidebar in Outlook and create an e-mail with a link to the document. As users with access to that shared document make changes, it ensures everyone has the most recent version, he noted.

"The big challenge for Harmon.ie is really to make sure their AI engine and Machine Learning will do an amazing job in tagging content from so many systems, so it becomes useful for the users," Catrinescu said, adding that it performed well in his recent tests.

Harmon.ie first introduced Collage last fall at Microsoft's Ignite conference in Atlanta, offering the free preview as a separate tool outside of Outlook. Now it's available to enterprise customers at a price of $6 per user per month and is unified with its flagship Harmon.ie client.

Posted by Jeffrey Schwartz on 03/22/2017 at 5:12 PM


SharePoint Summit Will Emphasize 'Connected Workplace' with Office 365

Microsoft is expected to share what's next for SharePoint during an event scheduled for May 16, and it should come as no surprise that the announcements will involve further integration with Office 365, OneDrive, Yammer, Windows, PowerApps, Flow and the new Microsoft Teams chat service.

Corporate VP Jeff Teper, who oversees SharePoint and Office 365, is scheduled to lead the virtual event, joined by Corporate VPs James Phillips and Chuck Friedman, who will explain how Office 365, connected with Windows and Azure, "is reinventing productivity for you, your teams and your organization," according to the invite posted by Microsoft. In addition to introducing new products, the company at this point has only indicated that attendees will "learn how to create a connected workplace." Vague as that is, the teaser is consistent with Microsoft's focus on bringing a more social and connected user experience with SharePoint and Office 365.

The roadmap update is scheduled slightly more than one year after Microsoft held its Future of SharePoint event in San Francisco, where Teper introduced the new SharePoint Framework and marked the official launch of SharePoint Server 2016. At last year's launch, Microsoft promised to release upgrades to the on-premises server edition in 2017 but said it would continue to emphasize Office 365 and hybrid cloud implementations. The forthcoming event isn't being promoted as a major launch but as an informational update that will include some release news.

It raises the question: to what extent are SharePoint shops moving the collaboration suite to the cloud, either hybrid or purely online? As noted last week, a survey conducted by a team of graduate students at the Marriott School of Management at Brigham Young University and Christian Buckley's CollabTalk -- sponsored by Microsoft, a number of ISVs and several media partners including Redmond magazine -- aims to shed light on the extent to which SharePoint shops are moving to either hybrid or all-cloud deployments. According to preliminary results:

  • 31 percent are using hybrid SharePoint solutions, with almost 50 percent remaining entirely on-premises.
  • More than 60 percent of those with hybrid in place are managing SharePoint on-premises.
  • Of those entirely on-prem, almost half plan to shift to hybrid within the next one to three years.

The survey is much deeper than that, and today's the last day to weigh in here. We look forward to sharing the results next month in advance of Microsoft's event.

 

Posted by Jeffrey Schwartz on 03/22/2017 at 11:53 AM


AirWatch Upgrade Extends Windows 10 Management and Endpoint Security

VMware has extended the Windows 10 endpoint management and security options offered in Workspace One, the digital workspace platform released last year that brought together the company's AirWatch mobile device management (MDM) tooling and Horizon application delivery offerings.

The upgrade, released this week, is the first major update to Workspace One. In addition to adding new Windows 10 security and management controls to its AirWatch 9.1 MDM offering, which the company describes as unified endpoint management (UEM), the update adds new support for Android devices, real-time threat detection and an advanced rules engine for ruggedized computers and devices used in specific vertical industries.

Workspace One and AirWatch 9.1 add more granular controls for patching off-network Windows 10 endpoints within the restrictions imposed by Microsoft's new Windows Update as a service model, and include a new dashboard to track patch compliance and audit Windows updates. AirWatch 9.1 also now provides advanced BitLocker configurations, which the company says eliminates the need for encryption management tools from Microsoft or other third-party providers.

Adam Rykowski, VMware's VP of UEM Product Management, said in an interview that the upgrade lets Windows administrators encrypt an entire disk or system partition and use a device's built-in TPM chip, eliminating the need for USB flash drives to hold Secure Boot or startup keys. At the same time, it enables the enforcement of PINs in conjunction with the TPM chip to block the OS from starting up or resuming until a user is authenticated. It also offers controls for key rotation policies and recovery, and the ability to suspend BitLocker enforcement policies when deploying critical maintenance updates.
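
As a point of reference, here is a small, purely illustrative sketch of how an administrator might check the BitLocker and TPM protection state that such a policy would govern. This is not how AirWatch enforces anything; it simply shells out to Windows' built-in manage-bde tool (run from an elevated prompt), and the function name is hypothetical.

```python
# Illustrative only -- not AirWatch's enforcement mechanism. A quick way for a
# Windows admin to inspect the BitLocker state on a volume using the built-in
# manage-bde command-line tool (requires an elevated prompt).
import subprocess

def bitlocker_status(volume: str = "C:") -> str:
    """Return the raw manage-bde status output for the given volume."""
    result = subprocess.run(
        ["manage-bde", "-status", volume],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    report = bitlocker_status()
    print(report)
    if "Protection On" in report:
        print("BitLocker protection is enabled on C:")
```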

AirWatch 9.1 also now supports Microsoft Business Source Packages (BSPs), the set of components designed for specific hardware, as well as the Windows Store for Business. Rykowski said that will ease the deployment of applications from Microsoft's Windows Store via VMware's Workspace One company store catalog.

VMware has also added real-time threat detection and access control remediation for Windows by integrating with VMware's TrustPoint endpoint security tool. The company added TrustPoint as an option to AirWatch through an OEM agreement inked last year with Tanium, a provider of real-time security and endpoint management tools, which both companies claim can query millions of endpoints in seconds to detect and remediate threats. VMware said TrustPoint offers accelerated compliance and threat management.

Many of the key updates are the result of feedback from early customers of last year's Workspace One release, which included the new AirWatch 9.0; those customers wanted to bring more MDM-like management to Windows 10, according to Rykowski. "It was interesting to see how Windows management required another deeper level of control that's very different from mobile," he said. "But the way we're doing it, is in this cloud way where you can apply updates and patches in real time with devices that are not on the network, regardless of where they are. We can provide the same level of deep capabilities but do it in that more modern management style."

The new Android support added to the update includes new onboarding support for managed devices, which now offers configuration of devices via a QR code or an e-mail to the user. It supports automatic app permissions and configurations, enforces app-level passcode policies for work applications without requiring an SDK and adds improved integration of Google Play and App Config setups. VMware's update also adds support for Apple's forthcoming iOS 10.3 and macOS 10.12.4 releases with a new SDK, app wrapping engine and support for productivity applications.

A newly added browser plugin now offers single sign-on access to nonfederated software-as-a-service Web apps that don't support SAML. VMware is also slashing the price of the various packaging options. The new standard license was reduced from $4.33 per device to $3.50 and the advanced cut from $6 to $5.50.

Posted by Jeffrey Schwartz on 03/17/2017 at 1:14 PM


Research To Measure State of Hybrid SharePoint Market

While SharePoint Server now lends itself well to hybrid deployments with Office 365, it's unclear to what extent organizations are taking advantage of that. A research project underway hopes to measure how many organizations are actively pursuing hybrid SharePoint deployments, and what's motivating them or holding them back.

Graduate students at the Marriott School of Management at Brigham Young University and Christian Buckley's CollabTalk are conducting the research, which is sponsored by Microsoft, a number of ISVs and several media partners including Redmond magazine. The research includes an online survey of SharePoint and Office 365 customers and MVPs, and customer interviews with contribution from outside research firms.

When Buckley, a Redmond contributing author and well-known MVP in the SharePoint community, reached out to me about our editorial cosponsorship of this research project, he noted that the size of the hybrid SharePoint market remains unclear, along with why organizations opt to remain on-premises. In addition to getting a better measure of hybrid deployments, the research aims to understand what fears many organizations still have about moving SharePoint to the cloud, as well as what technical issues are making it difficult to migrate existing server infrastructure, custom apps and third-party solutions.

A report based on the research will also explain how customers that have implemented hybrid SharePoint deployments are addressing those various challenges, including management and security considerations. Buckley said there's no accurate data on how many customers are pursuing hybrid SharePoint deployments as a long-term strategy versus a stepping stone toward moving their on-premises deployments to the cloud.

"Whether you are already well into your cloud journey or just getting started, this study will help us understand where you're at and what we can do to help," Bill Baer, Microsoft's senior technical product manager focused on hybrid SharePoint deployments said in a statement.

The report will be released next month in various forms, including webcasts by some of the sponsors. In addition to Microsoft, the corporate sponsors include PixelMill, B&R Business Solutions, Rencore, Crow Canyon Systems, tyGraph, Focal Point Solutions and AvePoint.

Of course, we'll share the findings and analysis as well. If you'd like to weigh in, the deadline to participate in the survey is March 22.

 

Posted by Jeffrey Schwartz on 03/16/2017 at 10:30 AM


Okta IPO Will Measure Growth of Identity-as-a-Service Market

Okta, one of the largest identity-as-a-service (IDaaS) providers, is finally going public. The company, rumored for years to have its eyes on an initial public offering (IPO), this week registered plans with the Securities and Exchange Commission (SEC) to offer Class A common stock to be traded on the Nasdaq market.

The move is the latest sign that the company is undeterred by claims from Microsoft that organizations no longer need third-party IDaaS offerings (such as the Okta Identity Cloud) if they use Azure Active Directory Premium and the rest of its Enterprise Mobility + Security (EMS) services. Despite such claims, Okta and its customers point to the larger array of integrations offered with its cloud-based SSO service. Okta says it now offers 5,000 connectors to legacy and software-as-a-service (SaaS) applications.

Okta's S-1 filing reveals its revenues have soared in recent years. During its last fiscal year, revenues grew 109%, from $41 million to $85.9 million. For the first nine months of calendar years 2015 and 2016, Okta said revenues grew from $58.8 million to $111.5 million, respectively. Despite the revenue growth, however, Okta posted steep losses -- $59.1 million in FY 2015, $76.3 million in FY 2016 and $54.9 million and $65.3 million for the first nine months of calendar years 2015 and 2016, respectively.

However, the company also touted its base of 2,900 enterprise customers including 20th Century Fox, Adobe, Engie, Flex, GitHub, LinkedIn, MassMutual, MGM Resorts, Pitney Bowes and Twilio, and key partnerships with Amazon Web Services, Box, Google Cloud, Microsoft, NetSuite, SAP, ServiceNow and Workday.

Indeed, many organizations are looking at third-party IDaaS options, said Forrester Research analyst Merritt Maxim in a blog post commenting on Okta's IPO. "Over the past 18 months, Forrester has had a steadily increasing number of IDaaS-related inquiries from enterprise clients looking to deliver identity and access management (IAM) capabilities to their employees via a SaaS subscription model," Maxim noted. "Okta's revenue growth aligns with the strong growth in demand we see from our clients."

While Okta's cloud-based single sign-on platform is among the most widely used IDaaS offerings, the company has added complementary services including a new mobility management tool and last week's acquisition of API management provider Stormpath.  

"The need for a unified identity across every app, service and device is exploding as every company transforms their business with digital services," said Okta CEO Todd McKinnon, in a post announcing the Stormpath deal. "Additionally, developers are becoming major buying centers and decision makers within organizations. And with no signs of that trend slowing, the need for secure application integration is growing."

Okta also faces some formidable competitors. In addition to Microsoft, VMware now offers its namesake VMware Identity Manager with its Workspace One and AirWatch suites, and MobileIron, a leading supplier of enterprise mobility management (EMM) services, recently added a single sign-on tool to its offering. Then there's a slew of other rival providers including Centrify, OneLogin, Ping Identity, Quest and SailPoint, among others.

While Okta and Microsoft are partners, the two companies are also in a heated battle. In an interview last year, McKinnon argued Okta is winning a lot of enterprise IDaaS deals from Microsoft. "They're losing and they don't like losing," McKinnon told me at the time. "We beat them eight out of 10 times, deal-for-deal, and that's even with them bundling in their products in their Enterprise Agreements for a pretty low price."

Microsoft argues it now has 41,000 customers using its EMS service. And, ironically, a key pillar of Okta's revenue growth has come from the rise of Microsoft Office 365 subscriptions, which automatically results in an uptick of Azure AD accounts.

Posted by Jeffrey Schwartz on 03/15/2017 at 1:12 PM


30 Years After Microsoft's Initial Public Offering

It's the 30th anniversary of Microsoft's initial public offering. Since the company went public on March 13, 1986, its stock has gained 89,000%, though the vast majority of that growth came on Cofounder Bill Gates' watch.

Reporting on the anniversary was CNBC's Josh Lipton, who noted an investment of $1,000 at the time of the IPO would now be worth $900,000, presuming you held on to the shares. "Not a bad gain for those early employees," Lipton said.

Despite incrementally liquidating his Microsoft shares, Gates reportedly still holds $87 billion in Microsoft stock, while Cofounder Paul Allen's holdings in the company are valued at $20 billion.

Of course, the bulk of Microsoft's growth came while Gates was CEO, when shares rose 73,000%. Microsoft fell on harder times once Steve Ballmer took the reins. During his tenure between 2000 and 2014, shares fell 34%. Though some of that decline stemmed from key missteps, many tech stocks took a beating over that period -- first with the dotcom crash and then with the financial crisis of 2008. Also, Gates was still active at Microsoft as chief software architect during the first eight years of Ballmer's tenure.

Since Satya Nadella took over as Microsoft's third CEO in 2014, Lipton noted the company's stock has risen 78% from the Ballmer days.

Posted by Jeffrey Schwartz on 03/13/2017 at 12:57 PM


Locked Out of Outlook.com After Service Upgrade

It's a good thing Outlook.com isn't my main e-mail account. Otherwise it would have been a long month. That's how long I was locked out of my Outlook.com account. Strangely, I had no problem accessing Microsoft and Office 365 accounts, which are all tied to the same credentials as Outlook.com.

Though I've had an Outlook.com account for years, I rarely use it and never give it out, since I have a personal e-mail domain and use Yahoo and Gmail as backup accounts. But given the recent upgrades to the service and its ability to synchronize schedules and contacts, I decided to check it out to consider my options. When I merely tried to access Outlook.com while already logged into my personal Microsoft Office 365 account, I was redirected to a page that read: "Sorry Something Went Wrong." Trying directly from Outlook.com didn't work either. After trying with a different browser, then via another computer and finally with an iPad, I presumed Outlook.com was experiencing an outage. Since it wasn't critical, I tried the next day but still was unable to access my Outlook.com account.

Realizing it must have something to do with my account, I connected with the Outlook.com support desk via an online chat session. One can't expect telephone support with a free service, after all. First the technician said I should change my password in case my credentials were compromised. That seemed unlikely, since if that were the case I wouldn't have been able to access any of my other Microsoft account services, but it's hard to argue with the wisdom of changing your password these days. Nevertheless, that didn't fix anything and the rep was unable to help.

During a visit to my local Microsoft Store the next day, the technician there said he'd reach out to the company's Global Escalation support team. Upon receiving a call from an Outlook.com technician, I explained the situation. I gave him remote access to my PC, though I told him I doubted the problem originated from the client.

The admin didn't see anything obviously wrong on my end but said he'd get back to me in a few days. We went back and forth a few times, and he ultimately suggested I capture my session traffic using Telerik's FiddlerCap Web recorder. The team apparently wanted to review the session traffic (HTTP, cookies and other information that could point to a possible problem).

A few days later he got back to me, apparently having found the solution. He instructed me to log in to my Microsoft account and view my settings. Under "Your Info," he asked me to click on "Manage How to Sign-In to Microsoft." This is the section that lets you view and manage all of your account aliases. Asked if my Outlook.com username was my primary alias, I replied no. The primary alias for my Microsoft account was the e-mail address for my personal domain, which I have always used to log in to all of my Microsoft accounts. The admin instructed me to specify my Outlook.com address as my primary alias. And in one click, the problem was solved. The good news is I can still use my personal domain e-mail address to log in -- I just can't specify it as my primary alias (which is no big deal).

I asked the support rep at Global Escalation Services why it took a month to get to the bottom of this. (I should note that he was determined to solve the problem and dutifully called me every few days to check in.) As I reported late last month, with the launch of Outlook.com Premium, Microsoft has transitioned the Outlook.com service over the past year from its older Web-based system to Microsoft Exchange Server. The move to Exchange Server came with some caveats. In this case, those who use personal domains can no longer use Exchange ActiveSync (EAS) with those domain aliases.

That doesn't answer the question of why it took so long, especially when this January community support thread showed a handful of others with similar problems, who noted that EAS is no longer supported with personal domain addresses. Perhaps if accessing the account had been more urgent rather than a background task, I'd have found that myself.

If you also use a personal domain e-mail address to access your Outlook.com account (or create a new one), now you know that all you have to do is change your primary username.

Of course, if you do want to tie your personal domain to Outlook.com, you could sign up for the premium service, which allows you to link your personal domain and four others for $10 per year on top of the existing annual fee. And if you sign up by month's end, Microsoft is offering Outlook.com Premium for $19.95 a year. Microsoft says the regular price is $49.99, but we'll see if the company sticks with that price.

The addition of that option would also explain why those who used their domain names as their primary credentials were suddenly locked out.

Posted by Jeffrey Schwartz on 03/10/2017 at 12:40 PM


Howard Schmidt's Legacy Shaped Cybersecurity Policy and Trustworthy Computing

Cybersecurity experts are mourning the loss of Howard Schmidt, the nation's first cybersecurity czar, who died last week at the age of 67 after a long battle with brain cancer. Schmidt, who served two U.S. presidents, was a onetime chief security officer (CSO) at Microsoft and played a key role in shaping the company's Trustworthy Computing initiative.

Schmidt was recruited from Microsoft by President George W. Bush in April 2002, in the wake of the Sept. 11 terrorist attacks, to serve as vice chairman of the President's Critical Infrastructure Protection Board. President Barack Obama later tapped Schmidt as the nation's first cybersecurity czar -- his actual title was Special Assistant to the President, Cyber Security Coordinator.

In that role, Schmidt is credited with advancing Obama's efforts to foster private and public sector cooperation in shaping more coordinated policy that promotes sharing attack and threat information in the common interest of protecting the nation's critical infrastructure.

In an interview with DarkReading in 2011, Schmidt said his key efforts centered on the need for the government and private sector to share attack intelligence. "We are able to coalesce intelligence … the government has information that comes from its unique position, so part of this is taking that information and [showing] we care about putting the bad guys in jail," he said at the time. "We want to make sure we are sharing with our private sector partners."

Schmidt was instrumental in numerous White House initiatives. Of note was the National Strategy for Trusted Identities in Cyberspace (NSTIC), which the White House has since removed from its Web site; he also helped create a strategy for how the U.S. would defend itself from a major international cyberattack. As noted by DarkReading, Schmidt once warned of the "cascading effects" of targeted malware attacks against nation states. Schmidt left the administration toward the end of Obama's first term.

While he was well known for his government service, Schmidt's cybersecurity career spanned 40 years, with roles in the military and commercial sectors as well. In addition to serving as Microsoft's CSO, he spent two years as eBay's chief information security officer (CISO), was chairman of Codenomicon and held numerous board, director and other non-executive roles at various security companies, including Fortify, RatePoint, Neohapsis, BigFix and HAS Security.

Schmidt also served as international president of the Information Systems Security Association (ISSA) and as president and CEO of the Information Security Forum (ISF). After leaving the White House in 2012, Schmidt served as executive director of the Software Assurance Forum for Excellence in Code (SAFECode), a non-profit organization focused on promoting best practices in developing secure code. When Schmidt became too ill to continue at SAFECode, his longtime Microsoft protégé Steve Lipner took the reins of the association last fall.

Indeed, Schmidt left his imprint on Microsoft as well, having served during the period that led up to Bill Gates' Trustworthy Computing initiative and having cofounded what ultimately became the company's Trustworthy Computing Group, as recalled by Threatpost, a blog produced by Kaspersky Lab.

Along with Lipner, who ran the Microsoft Security Response Center back then, Schmidt helped create the team whose work led up to Gates' famous Trustworthy Computing e-mail. Lipner and Schmidt also worked closely together on the response to some of the largest Internet cyberattacks of the time, including Code Red.

Lipner last week told Threatpost that Bush recruited Schmidt from Microsoft just three months before Gates launched the Trustworthy Computing initiative. "Howard always felt a higher calling to service to the government of the United States," Lipner told Threatpost. "There's no better demonstration of that than the fact that, in late 2001, after the 9/11 attacks, he left Microsoft to join the White House cybersecurity policy office. His departure meant that he was no longer at Microsoft when the Trustworthy Computing e-mail -- which reflected a lot of effort on his part -- was released."

Posted by Jeffrey Schwartz on 03/06/2017 at 11:46 AM


New Docker EE Targets Containers for Business Apps

Looking to bolster the container-based software it has helped standardize, Docker has launched a new enterprise edition with the goal of helping DevOps teams build, deploy and manage business-critical applications that it said can run in production at scale across various infrastructure and cloud services.

The new Docker Enterprise Edition (EE), announced today, is what the company calls its open and modular container platform for building and maintaining cloud-scale applications that are portable across operating systems and cloud services. The new Windows Server 2016, released last fall, includes the Docker runtime engine free of charge. In addition to Windows and Hyper-V, Docker is certified to run on a wide array of Linux distributions, VMware's hypervisor platforms and cloud services including Microsoft Azure, Amazon Web Services and Google Cloud Platform.

Docker EE is available in three configurations: Basic, Standard and Advanced. Basic includes the Docker platform designed to run on certified infrastructure, with support for Certified Containers and Plugins available in the Docker Store. The Standard configuration adds support for multi-tenancy, advanced image and container management, and integration with Active Directory and other LDAP-based directories. The Advanced version adds Docker Security Scanning and continuous vulnerability monitoring.

The launch of Docker EE comes just over a year after the company launched Docker Datacenter, its DevOps platform built to enable the deployment of Containers as a Service (CaaS) on premises or in virtual managed private clouds. In addition to the Docker Engine container runtime, it brought together the Docker Trusted Registry for image management, security and collaboration. Docker Datacenter also introduced the company's Universal Control Plane (UCP) with embedded Swarm for integrated management tooling.

Docker Datacenter is now a component of Docker EE, available with the Standard and Advanced versions. All of the free Docker versions will now be renamed Community Edition (CE).

With the release of Docker EE, the company has also broadened its certification program, with a wider array of third-party offerings available in the Docker Store that are compatible with the new edition. In addition to the infrastructure the company's container platform supports, the new program will allow ISVs to distribute and support their software as containers that are reviewed and tested before they're made available in the Docker Store.

The program also looks to create an ecosystem of third-party plugins that must pass API compliance testing to ensure compatibility across apps, infrastructure and services. "Apps are portable across different network and storage infrastructure and work with new plugins without recoding," said Dan Powers, Docker's head of partner and technology management, in a blog post announcing the new certification program.

Docker also plans to accelerate its release cycles. Docker will now issue new releases of the community and enterprise editions each quarter. The company will support each release of Docker EE for a year and the community editions for four months, though Docker CE users will have a one-month window to update from one release to the next. The Docker API version will remain independent from the Docker platform, said Docker Product Manager Michael Friis, in a blog post announcing Docker EE.

"Even with the faster release pace, Docker will continue to maintain careful API backwards compatibility and deprecate APIs and features only slowly and conservatively," Friis noted. "And Docker 1.13 introduces improved interoperability between clients and servers using different API versions, including dynamic feature negotiation."

Posted by Jeffrey Schwartz on 03/02/2017 at 3:43 PM


Azure Stack Steps Closer to Release with TP3 and Expanded Roadmap

Microsoft today released the third and final technical preview of Azure Stack, the software that will be offered on integrated appliances letting enterprises and hosting providers run Redmond's public cloud platform in their own datacenters. The company also revealed it will offer consumption-based pricing for Azure Stack.

Signaling it's on target to become generally available this summer, the new Azure Stack TP3 comes nearly eight months after the company released the second preview and more than a year after the first test release. While the company has shifted course a number of times, most notably last summer, Microsoft says it has firmed up its roadmap for Azure Stack and, with the release of TP3, is promising continuous refreshes right through its release this summer.

Shortly after today's TP3 release, Microsoft will add support for Azure Functions, and shortly thereafter Blockchain, Cloud Foundry, and Mesos templates. Also in the coming months, Microsoft will release a refresh to TP3 that will offer multi-tenant support. "Our goal is to ensure that organizations choosing hybrid cloud environments have this same flexibility and innovation capability to match their business objectives and application designs," said Microsoft Technical Evangelist Jeffrey Snover, in a blog post published today revealing the latest Azure Stack news.

Snover said TP3 will let customers:

  • Deploy with ADFS for disconnected scenarios
  • Start using Azure Virtual Machine Scale Sets for scale out workloads
  • Syndicate content from the Azure Marketplace to make available in Azure Stack
  • Use Azure D-Series VM sizes
  • Deploy and create templates with Temp Disks that are consistent with Azure
  • Take comfort in the enhanced security of an isolated administrator portal
  • Take advantage of improvements to IaaS and PaaS functionality
  • Use enhanced infrastructure management functionality, such as improved alerting

Apparently, Microsoft is letting testers keep their previews. Upon release, Snover said the company will rename its proof-of-concept (PoC) deployments Azure Stack Development Kit, which customers can use as a "single-server dev/test tool to prototype and validate hybrid applications," according to Snover. After Azure Stack is commercially available, Snover said customers should expect to see frequent release updates focused on application modernization, improved management and scale.

Snover emphasized the goal for Azure Stack to offer a consistent, hybrid application development platform that ensures apps can run in either the public or private versions of Azure IaaS and PaaS services. "Individuals with Azure skills can move projects, teams, DevOps processes or organizations with ease," he noted. The APIs, portal, PowerShell cmdlets and Visual Studio experiences are all the same.
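
A rough illustration of what that API consistency means in practice: the same Azure Resource Manager REST call can target public Azure or an Azure Stack deployment, with only the management endpoint (and the token it accepts) swapped out. The Azure Stack URL below is the typical development-kit address and is an assumption, not something confirmed in this post; the function name is likewise illustrative.

```python
# Illustrative sketch of ARM API consistency across public Azure and Azure
# Stack. The Azure Stack endpoint shown is the assumed development-kit
# address; swap in your deployment's management URL and a valid token.
import requests

AZURE_PUBLIC = "https://management.azure.com"
AZURE_STACK = "https://management.local.azurestack.external"  # assumed ASDK endpoint

def list_resource_groups(endpoint: str, subscription_id: str, token: str):
    """List resource groups through the ARM API at the given endpoint."""
    url = f"{endpoint}/subscriptions/{subscription_id}/resourcegroups"
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {token}"},
        params={"api-version": "2016-09-01"},
    )
    resp.raise_for_status()
    return [g["name"] for g in resp.json().get("value", [])]
```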

Microsoft also said it is looking to apply the same "cloud economics" to Azure Stack as offered through Azure. As such, customers will pay on a consumption basis, though they will pay somewhat less since Microsoft isn't incurring the underlying infrastructure cost, according to a brief video explaining the usage-based pricing model. Hardware providers Microsoft has partnered with have indicated they will also offer subscription-based pricing plans.

The first systems will come from Dell EMC, HPE and Lenovo, and later in the year from newly added partner Cisco, each offering Azure Stack systems jointly engineered with Microsoft. "From what we have seen of it so far, it is starting to take on the fit and finish of a completed product," said Jeff Deverter, chief technologist for the Microsoft Practice at Rackspace, in an e-mail. "This announcement has really ignited the productization work here at Rackspace as we strive for being a launch-day partner."

Rand Morimoto, a Microsoft Azure MVP and president of Walnut Creek, Calif.-based Convergent Computing, which provides IT development and deployment services, has been building POCs with Azure Stack for a number of large enterprises since Microsoft released the first technical preview early last year, and said in an e-mail he has been awaiting TP3. "This is a huge solution for a lot of our banking and healthcare customers," Morimoto said. "We are anxiously waiting for the final release of Azure Stack and corresponding hardware solutions around it."

Posted by Jeffrey Schwartz on 03/01/2017 at 12:54 PM


Microsoft Quietly Releases Outlook.com Premium Service

The premium, ad-free version of Microsoft's Outlook.com e-mail service, which offers personal domain addresses that can be shared by up to five people, is now generally available. Outlook.com Premium will once again test users' appetite for paying for personal e-mail services they've received for free for decades, if they ever paid for them at all. The new Outlook.com Premium service, which costs $49.99 per year, is available at a promotional rate of $19.95 until March 31.

Microsoft released Outlook.com Premium last year, first as a private invitation-only pilot, and then launched a public preview in October, as documented by Mary Jo Foley in her ZDNet blog. Microsoft recently removed the "preview" designation from the service, though the company didn't say when, according to a Feb. 14 update. A few weeks earlier, Microsoft told Foley that the migration of the 400 million Outlook.com mailboxes to the Office 365 infrastructure, which is powered by Microsoft Exchange, was 99.9 percent complete.

All of the major e-mail providers, including Google, Yahoo and Microsoft, have long offered premium ad-free e-mail services and other perks. What could give the new Outlook.com Premium service added appeal is its support for personal e-mail domains: customers can either register any available domain or tie the service to one they already have. Furthermore, a user can allow four additional people to use the domain name of the e-mail address, allowing them to also share calendars, contacts and files.

Customers who sign up for the promotional $19.95 rate will get use of the personal domain free for the first year. While they can renew for the same price, each personal domain costs an additional $10 per year. The service automatically synchronizes Outlook.com mailboxes, along with hotmail.com, live.com and msn.com accounts, into one mailbox.

Personal e-mail domains require an Outlook.com Premium subscription, according to Microsoft's FAQ about the Outlook.com Premium service, which implies that cancelling the latter means you must give up your domain name. You can create up to 10 aliases per Microsoft account and are permitted to change them up to 10 times per calendar year.

Outlook.com Premium users can use the same password to sign into an account with any alias tied to that Microsoft account, according to the FAQ. The company describes how to add or remove aliases in Outlook.com Premium here. If you bring your own domain, the registration process requires that you verify ownership and update its mail (MX) records by following the instructions under the "bring your own domain" setting.
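
If you want to confirm that an MX update like the one described above has actually propagated, one informal way is a quick DNS lookup with the dnspython library (pip install dnspython). This is a generic check, not a Microsoft tool, and "example.com" is a placeholder for your own personal domain.

```python
# A quick, unofficial way to confirm that a domain's MX records have
# propagated after updating them, using dnspython (pip install dnspython).
# "example.com" is a placeholder for your own personal domain.
import dns.resolver

def check_mx(domain: str = "example.com"):
    """Print the mail exchangers currently published for the domain."""
    answers = dns.resolver.resolve(domain, "MX")
    for record in sorted(answers, key=lambda r: r.preference):
        print(f"priority {record.preference}: {record.exchange}")

if __name__ == "__main__":
    check_mx()
```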

Users who don't want e-mails from their different accounts to appear in the same inbox can create rules to automatically move messages from a specific account to different e-mail folders. Outlook.com Premium also allows users to send and receive e-mail from AIM Mail, Gmail and Yahoo Mail accounts, with Outlook.com Premium serving as the primary inbox for those other accounts. As with Microsoft's free e-mail services, attachment size limits are 10MB -- or 300GB when linked via OneDrive.

Outlook.com Premium, according to the FAQ, does not "currently" support e-mail auto-forwarding, e-mail groups, DomainKeys Identified Mail (DKIM) or Domain-based Message Authentication, Reporting and Conformance (DMARC). Perhaps the use of "currently" suggests those features may appear in the future. For now, Outlook.com Premium is only available to customers in the U.S.

Posted by Jeffrey Schwartz on 02/27/2017 at 12:47 PM


Will Amazon Chime Steal Share from Skype and UCaaS Providers?

Amazon's entry last week into the crowded unified communications fray with the launch of its Chime service could pose a challenge to the likes of Microsoft's Skype for Business, Google Voice and a swath of fast-growing players such as Twilio, Bandwidth and BlueJeans, among numerous others. However, a quick look at the initial service suggests it has no obvious technical advantage over its rivals, so it's not immediately likely to attract enterprises that use VoIP and UC offerings from Cisco, Avaya or Microsoft.

Anyone who has followed Amazon's meteoric growth over the past two decades knows not to turn a blind eye to the company's track record of upsetting most -- though not all -- of the industries it has invested heavily in. Chime comes out of the company's AWS cloud business, though technically it's available to consumers and businesses alike. I was able to create an account using my Amazon Prime credentials. And if you look at the success the company's retail business has had with its Echo and Dash intelligent devices that respond to voice commands, one can imagine the possibilities Amazon might have up its sleeve.

The company has released three packages: the free version offers chatrooms and allows video calls between two parties; the "Plus" plan, priced at $2.50, adds screen sharing and access to a corporate directory; and the "Pro" package, which costs $15 per month, allows meetings with up to 100 participants. Scott Gode, VP of products at Seattle-based Unify Square, a longtime provider of management services for large Microsoft Skype for Business customers, said Amazon's pricing is comparable with the cost of services offered by other UC providers.

"It's very much akin to unified communications-as-a-service providers that are out there and similar to the pricing that Microsoft offers as well," Gode said. "The big difference Microsoft has is it's bundled together with Office 365, whereas the Amazon stuff -- even though its backend is on AWS -- has a standalone pricing model."

Telecommunications service provider analyst Irwin Lazar, a VP at Chicago-based Nemertes Research, agreed that Chime lacks the integration Microsoft's Skype for Business offers with Office 365. But he believes over time that could change. Likewise, Lazar believes Amazon is looking to challenge Google's G Suite. "Ultimately I think their goal is to compete with Office 365 and G Suite. But they have a long way to go," Lazar said. "The biggest immediate impact is potentially downward price pressure on the Web conferencing providers," including PGi, GoToMeeting, Zoom, BlueJeans and Cisco's WebEx, among others.

While AWS runs the world's largest cloud infrastructure, the company hasn't focused on communications and networking services. To boost its entry into the unified communications as a service (UCaaS) market, AWS has partnered with network backbone provider Level 3 Communications and Vonage. Amazon signaled its interest in jumping into the market late last year when it acquired San Francisco-based Biba, a provider of chat, video and audio conferencing tools, a year after buying video signal processing company Elemental for just under $300 million.

Lazar noted that many UCaaS providers that Chime will now compete with run their services on AWS. "But Cisco, Google, and Microsoft have pretty robust cloud infrastructures of their own, so I don't think AWS gives them any real advantage," he said.

Posted by Jeffrey Schwartz on 02/24/2017 at 1:21 PM


Veritas To Move Archiving Service to Microsoft Azure

Veritas is migrating its Enterprise Vault.cloud data archiving service from its own managed datacenters to Microsoft Azure. The move to outsource EV.cloud is part of a global, multiyear partnership announced today by Veritas to optimize its portfolio of data protection offerings for hybrid cloud environments, using public Microsoft Azure cloud storage as a backup and archive target.

In addition to EV.cloud and its on-premises EV, Veritas is best known for its Backup Exec for Windows and NetBackup enterprise data protection software, which now support Microsoft Azure as a target, as promised last fall. Veritas said it is announcing the Microsoft pact as employees today celebrate the one-year anniversary of its spinoff from Symantec. Private equity firm Carlyle Group officially closed the $7.4 billion buyout from Symantec in late January of last year and Veritas has since pledged to accelerate making its data protection wares more ready for the cloud. Prior to its pact with Microsoft, Veritas last week said it will integrate its Veritas 360 Data Management tool with Amazon Web Services cloud offerings.

While Veritas' decision to move EV.cloud to a global service provider stemmed from its determination to focus on its software development priorities, Microsoft Azure was a logical contender given that it's a popular archiving target for Exchange, SharePoint, Skype for Business, Lync messaging and Office 365 data. "There is a lot of potential in having archival technology living and breathing within the same cloud framework as Office 365," said Jason Buffington, a principal analyst at Enterprise Strategy Group.

Nevertheless, Veritas' decision to move EV.cloud to Azure represents an important endorsement of Microsoft's public cloud infrastructure. Despite the obvious connections between Office 365, Microsoft's on-premises software and Azure, Buffington said that Veritas could just have easily chosen Amazon Web Services (AWS) or Google Cloud Platform (GCP), which are also partners and have comparable global-scale clouds.

"When you think about what Enterprise Vault is about, cloud or no cloud, it's your archive copy of last resort. It's the copy the auditor requires and, in many cases, the copy that means you won't have absorbent fines or go to jail," Buffington said. "Veritas has decided that they trust the Azure platform. When they looked at what the underlying frameworks were like and their potential to innovate on top of that platform, and yet an assurance of geopolitical boundaries and a whole bunch of considerations, Veritas' relocation of their cloud is a huge testament to the Azure platform."

Veritas decided to outsource the hosting of EV.cloud so it could focus on adding new capabilities to the platform to support a growing number of new industry and government regulations that are affecting data retention and availability requirements. Alex Sakaguchi, director of solutions marketing at Veritas, said the growth of Office 365 and hosted Exchange, SharePoint and Skype for Business is contingent on the ability to provide the necessary protections and governance.

"The primary driver of that is moving from Exchange on premises to Exchange Online," Sakaguchi said. "And then there's some unique capabilities for customers to move to Office 365 to employ Azure Storage. From a technology standpoint, this will help facilitate that movement and ensure customers have their data management capabilities, visibility and the information that they need as they make that transition."

Veritas also said as part of its partnership, the company's NetBackup 8.0 enterprise data protection software now supports storage tiering to the Azure cloud for those looking to use it for information lifecycle management. Likewise, Veritas Backup Exec, the company's data protection software for small- and mid-size organizations, supports Azure as a target. Veritas last fall had announced plans to support Azure storage with common connectors it calls Open Storage Technology (OST), which also provide connections to Enterprise Vault, the on-premises solution, and EV.cloud.
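
For readers unfamiliar with what "Azure as a target" means at the storage level, here is a minimal illustration using the azure-storage-blob SDK. It is not how NetBackup or Backup Exec actually move data; the container name, blob name and connection string are placeholders.

```python
# A minimal illustration of Azure Blob storage as a backup/archive target --
# not how NetBackup or Backup Exec move data. Assumes the azure-storage-blob
# package and a storage account connection string; names are placeholders.
from azure.storage.blob import BlobServiceClient

def archive_backup(conn_str: str, local_path: str,
                   container: str = "backups", blob_name: str = "nightly.bak"):
    """Upload a local backup file and move it to the Archive access tier."""
    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container=container, blob=blob_name)
    with open(local_path, "rb") as data:
        blob.upload_blob(data, overwrite=True)
    # Cold, rarely accessed copies can be demoted to the cheaper Archive tier.
    blob.set_standard_blob_tier("Archive")
```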

The move of EV.cloud to the Microsoft Azure public cloud would suggest that the service will also support Azure Stack, the forthcoming converged systems that will provide Microsoft's cloud infrastructure and platform services in customer datacenters, set for release later this year. For now, Azure Stack isn't part of this agreement, said Tad Brockway, Microsoft's partner director for Azure Storage. Still, Veritas' decision to move EV.cloud to Azure will allow the company to lower the cost of running the service.

"Backup and archival are natural scenarios for the public cloud," Brockway said. "The public cloud gives customers for archival and backup the cost and cloud economics, which make sense for those workloads. Especially, if you look at the exponential growth in data, the requirements can only be met via public cloud."

Veritas' Sakaguchi said the migration will be staged over time, though the company isn't providing any timelines at this point. However, he said customers will be informed.

Posted by Jeffrey Schwartz on 02/22/2017 at 1:11 PM


Most Organizations Still Lack Adequate Identity Management Controls

An overwhelming number of organizations appear to lack mature best practices when it comes to identity and access management for their systems, making them more vulnerable to breaches, according to a survey of 203 IT decision makers conducted by Forrester Consulting.

Results of the survey, commissioned by IAM provider Centrify, were released this week at the RSA Conference in San Francisco, where Centrify CEO Tom Kemp presented the findings Monday during the Cloud Security Alliance event. A report based on the survey's findings determined that the least mature organizations experienced twice the number of breaches as the most mature ones.

That's not to say those who have adequately addressed authentication are immune to breaches -- they reported an average of 5.7 incidents per year, while those with the least mature identity and access management practices reported an average of 12.5 incidents per year. Across the board, two-thirds said they have experienced five or more breaches during the past two years, with misuse of identities and passwords the key causes.

Nevertheless, most IT and information security managers aren't ignoring authentication and identity management, Corey Williams, senior director of product management at Centrify, acknowledged. "It's a more piecemeal approach. They do a few tactical things but not looking at things holistically," Williams said. The Forrester report emphasized issues stemming from privileged access as a common cause of breaches.

During the RSA Conference, Centrify polled another 100 security managers, which found 68 percent enforce single sign-on and 43 percent have multi-factor authentication implemented in their organization. Only 36 percent responded that they don't allow sharing of their privileged accounts, with 13 percent not allowing session recording, 12 percent implementing granular deprovisioning of access across server and application accounts and only 8 percent having privilege elevation management.

Posted by Jeffrey Schwartz on 02/17/2017 at 12:02 PM


Microsoft: Proposed 'Digital Geneva Convention' Must Neutralize Nation-State Cyberattacks

Microsoft President and Chief Legal Officer Brad Smith's proposed "Digital Geneva Convention" would seek to forge ties between international governments and the tech sector, committing them to nonproliferation of cyberweapons and to making "the Internet a safer place," with the goal of putting an end to nation-state attacks. Smith proposed the neutral organization, comparable to the International Atomic Energy Agency and its focus on non-proliferation of nuclear weapons, during his Tuesday RSA Conference keynote address.

Smith's "Digital Geneva Convention" would have the scope of a global convention that would, among other things, call on the world's governments to disengage from nation-state attacks against targets in the private sector, pledge to respond to vulnerabilities and put an end to the spread of vulnerabilities by sharing them with appropriate tech providers. Acknowledging the rise of nationalism, the "global technology sector needs to become a trusted and neutral digital Switzerland," Smith told RSAC 2017 attendees.

"We need governments to take the page out of the 1949 Geneva Convention, and other instruments that have followed," Smith continued. "We need an agency that brings together the best of the best and the brightest in the private sector, in academia and public sector. We need an agency that has the international credibility to not only observe what is happening but to call into question and even identify attackers when nation-state attacks will happen."

Smith emphasized that this organization must be global and neutral. This would help ensure that the group would focus on providing 100 percent defense and would not support offense-based counterattacks. Technology providers must also make clear that they will assist and protect all customers irrespective of the country they're from and commit to refusing to aid government-sponsored attacks on customers.

"These two principals have been at the heart and soul of what we've been doing. We need to stay on that path," he said. "We need to make the case to the world that the world needs to retain its trust in technology. And regardless of a government's politics, or policies or individual issues at any moment in time, we need to persuade every government that it needs a national and global IT infrastructure that it can trust."

In his remarks, echoed in a blog post, Smith emphasized that the Internet is based on infrastructure provided by private companies, making them the "first-responders" to nation-state attacks and obligating them to commit resources accordingly. Smith gave a plug to Microsoft's own efforts, which include the $1 billion per year it spends on security, last year's release of Advanced Threat Protection for Exchange Online, which identifies malware and suspicious patterns within the content of messages, and, more recently, the addition of Office 365 Threat Intelligence. Smith also mentioned Microsoft's new Office 365 Secure Score, a tool launched last week that helps administrators assess risk factors.

Where this will go remains to be seen, but Smith's choice of his opening keynote at RSAC 2017 as the platform to propose this "Digital Geneva Convention" suggests he wanted it to be heard. RSAC is the largest gathering of information security professionals, policy makers and law enforcement officials, and is perhaps the most viable vehicle to get talks started. It will require a strong show of force and a willingness of governments with conflicting interests to come together, which, of course, is no easy task. But despite the obstacles, there's too much at stake to remain idle.

Posted by Jeffrey Schwartz on 02/15/2017 at 12:36 PM


Cisco UCS To Join Microsoft's Azure Stack Roster

Microsoft has tapped Cisco Systems as the fourth partner to offer the forthcoming Azure Stack cloud platform. Cisco will offer Azure Stack in its UCS converged server and network appliance, which Microsoft today said both companies will engineer.

The datacenter and networking giant was conspicuously absent from the list of Azure Stack partners when Microsoft first announced that Dell, HPE and Lenovo would offer the software on their respective systems. Microsoft is co-engineering the Azure Stack solutions on those three players' converged systems as well.

Despite Cisco's absence from last summer's list, it has multiple partnerships with Microsoft tying together wares from both companies, including one inked in 2014 to integrate Cisco Nexus switches and UCS Manager with System Center integration modules, and Cisco PowerTools with Windows Server 2012 R2, System Center 2012 R2, PowerShell and Microsoft Azure. The two had also announced plans to integrate Microsoft's wares with Cisco ACI and Cisco Intercloud.

Cisco's cloud strategy is going through a transition. The company is now emphasizing its hybrid cloud portfolio on UCS as it prepares to shut down its Intercloud network of interconnected public clouds at the end of March. Cisco had launched Intercloud, an ambitious multi-cloud project, in 2014. Based on OpenStack, the company at the time said its $1 billion effort would become the largest public network of interconnected clouds. Azure Stack, which brings into a converged system the same software and engineering design Microsoft runs in its public cloud, appears to fit into Cisco's hybrid cloud portfolio.

Microsoft first revealed plans to offer Azure Stack in May 2015 and had targeted the end of last year to release the software. But after the first technical preview, the company decided to limit partnerships with the three OEMs to provide systems co-engineered between Microsoft and the respective hardware suppliers. While the decision angered a number of partners and service providers who were already testing the software on their own hardware, Microsoft defended the move, saying it wanted to ensure that the first Azure Stack implementations provided a consistent experience with the company's public cloud.

Nevertheless, Microsoft officials have indicated the possibility of offering Azure Stack through other partners over time. Besides Dell, HPE and Lenovo, Cisco is among the largest providers of datacenter infrastructure and the leading supplier of networking hardware and software. Among the four developing Azure Stack-based systems, Cisco is newer to the server business. The company launched UCS, which consists of virtualized servers, switches and management software, in 2009.

It appears the company is targeting its offering, called the Cisco Integrated System for Microsoft Azure Stack, at service and hosting providers, though large enterprises have expressed interest in deploying the private instantiations of the Microsoft Azure Cloud.

"The Cisco Integrated System for Microsoft Azure Stack reinforces Cisco's complete approach to cloud, offering businesses the freedom to choose the best environment and consumption model for their traditional and cloud-native applications," wrote, Liz Centoni, senior VP and general manager of Cisco's Computing Systems Product Group, in a blog post announcing its Azure Stack offering.

Slated for release in the third quarter of this year, the Cisco Integrated System for Microsoft Azure Stack will offer both IaaS and PaaS services integrated in UCS with high-performance networking switches and the company's Cisco Virtual Interface Card optimized for Azure Stack, Centoni noted.

Cisco's Azure Stack offering will also provide a common management interface for managing compute, an adapter optimized for Azure Stack, and networking with templates to support policies for multi-tenant environments. The Azure Stack offering on UCS will also include Cisco ONE Enterprise Cloud Suite, which automates application deployment and management and is offered with more than 20 hybrid cloud platforms.

Posted by Jeffrey Schwartz on 02/10/2017 at 3:29 PM


Microsoft Begins Rollout of Integrated EMS Console

Microsoft's Enterprise Mobility + Security (EMS) service is getting a facelift. A unified EMS console will roll out in the coming months to customers of Microsoft's cloud-based platform for enrolling, configuring and securing devices and services, including Office 365.

The upgrade comes as Microsoft makes an aggressive push to accelerate growth of its combined enterprise mobile device management and identity management-as-a-service offerings, announced less than three years ago as a bundle of Intune for device configuration and management, Azure Active Directory Premium and Azure Information Protection (aka Azure Rights Management).

More than 41,000 organizations have paid subscriptions to Microsoft's EMS with 4,000 signing up in the last quarter, according to a tweet by Corporate VP for Mobility Management Brad Anderson. Paid seats last quarter grew 135 percent, his tweet added.

The new EMS console will provide one common system for mobile device management and user policies, Anderson underscored in a blog post late last month, announcing the planned rollout. "This means that you no longer have to go to one console to set identity policies, and then another console to set device/app policies. It's all together," Anderson noted.

Customers will be advised when their existing EMS tenants will change, which Anderson said should be complete over the next several months. The new EMS console will be part of a Web-based portal that won't be dependent on the currently used Silverlight-based approach. Any new subscribers and those signing up for trials will automatically have access to the new EMS Console, and existing customers can sign up for free trials if they want access to it right away.

"What we are delivering with this new EMS console is an integrated administrative experience that makes the end-to-end scenarios we've enabled far simpler, much more powerful, and even more flexible," Anderson noted. In an example of what the integrated administrative experience offers, Anderson's post described how admins can set conditional access policies.

"Conditional Access enables IT to define the rules under which they will allow access to corporate data -- which EMS then enforces in real time," Anderson explained. "With an integrated EMS console, we can now bring together all the different areas where IT wants to define risk polices that govern access -- this allows you to define a complete and comprehensive set of rules."

The EMS console lets IT managers define their own risk policies and set rules for access, such as whether certain log-in attempts should be deemed suspicious or whether a device meets an organization's mobile device management (MDM) policies. "We will now evaluate in real time the risk in each of those areas and only grant access to a service/application if the risk is within the constraints you define," he noted.
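
To make the kind of rule evaluation Anderson describes concrete, here is a minimal sketch in Python of how sign-in risk and device compliance might be combined into a single allow-or-deny decision. It is an illustration only, not Microsoft's EMS implementation or API; the policy fields, app names and thresholds are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SignInContext:
    sign_in_risk: str        # "low", "medium" or "high", as scored by the identity service
    device_compliant: bool   # whether the device meets the organization's MDM policies
    app: str                 # the service or application being requested

# Hypothetical policy: the names and thresholds below are illustrative only.
POLICY = {
    "max_sign_in_risk": "medium",          # block anything riskier than this
    "require_compliant_device": True,      # device must satisfy MDM policies
    "protected_apps": {"Exchange Online", "SharePoint Online"},
}

RISK_ORDER = {"low": 0, "medium": 1, "high": 2}

def evaluate_access(ctx: SignInContext) -> bool:
    """Return True if access should be granted under the sample policy."""
    if ctx.app not in POLICY["protected_apps"]:
        return True  # the policy only governs the protected apps
    if RISK_ORDER[ctx.sign_in_risk] > RISK_ORDER[POLICY["max_sign_in_risk"]]:
        return False  # the sign-in looks too suspicious
    if POLICY["require_compliant_device"] and not ctx.device_compliant:
        return False  # the device fails MDM compliance checks
    return True

print(evaluate_access(SignInContext("high", True, "Exchange Online")))  # False
print(evaluate_access(SignInContext("low", True, "Exchange Online")))   # True
```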

In addition to managing devices with the EMS console, EMS customers can apply policies to more than 3,000 third-party SaaS offerings, as well as to applications running on premises.

Posted by Jeffrey Schwartz on 02/10/2017 at 12:33 PM


VMware Brings Windows Desktops and Apps to IBM Cloud

VMware is upgrading its Horizon virtual digital workspace portfolio to make desktop-as-a-service (DaaS) more viable over low-bandwidth networks and for those requiring high performance. The company said it is implementing new protocols and deployment options, including several different hyperconverged appliances and support for IBM's SoftLayer cloud.

The company announced three new additions to its end user computing set of offerings: the new Horizon Cloud Service, Horizon Apps, and the Horizon 7.1 desktop and apps delivery platform. The new offerings take advantage of new protocols VMware has acquired and optimized to boost performance, making them more suitable for various virtual client and app environments, including Microsoft's Skype for Business. These improvements to Horizon are currently rolling out at the technology preview stage, according to VMware.

VMware has extended its Blast virtual client protocol, which it introduced last year, with technology designed to reduce latency by offering improved adaptive bit rates, reducing packet loss and providing better forward error correction. The company claims its new Blast Extreme Adaptive Transport (BEAT) offers 6x faster transfers over intercontinental networks and 4x when the connection is cross-continental, while reducing bandwidth by 50 percent. Those findings are based on the company's own testing, but officials say the protocol is also suited for use on public Wi-Fi networks. "It really opens up a lot of use cases, where now we can really deliver that great user experience," said Sheldon D'Paiva, a director of product marketing at VMware.
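
The adaptive bit-rate idea behind a transport like BEAT can be illustrated with a simplified sketch. VMware has not published this logic, so the thresholds and step sizes below are assumptions; the point is simply that the sender measures recent packet loss and latency, backs the encoder's bit rate off aggressively on a poor link and probes upward cautiously on a clean one.

```python
def next_bitrate(current_kbps: float, packet_loss: float, rtt_ms: float,
                 floor_kbps: float = 500, ceiling_kbps: float = 10_000) -> float:
    """Pick the next encoder bit rate from simple loss and latency measurements.

    packet_loss is a fraction (0.02 == 2 percent loss); all thresholds are illustrative.
    """
    if packet_loss > 0.05 or rtt_ms > 300:
        candidate = current_kbps * 0.7      # back off hard on a lossy or slow link
    elif packet_loss > 0.01:
        candidate = current_kbps * 0.9      # trim gently on mild loss
    else:
        candidate = current_kbps * 1.05     # probe upward when the link is clean
    return max(floor_kbps, min(ceiling_kbps, candidate))

# Example: a 4 Mbps stream on congested public Wi-Fi steps down toward 2.8 Mbps.
print(next_bitrate(4000, packet_loss=0.08, rtt_ms=250))
```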

The company is also offering new integrated management controls. The new Just-in-Time Management Platform (JMP), which the company said provides an integrated approach to managing virtual desktops and apps, enables real-time video delivery, contextual policy management and improved configuration and provisioning. The initial JMP release will allow customers to take advantage of VMware's Instant Clones, previously only available for virtual desktops, extending support to published apps as well as to cloud deployments.

"We feel bringing these best of breed technologies together allows customers to  dynamically scale desktops and apps on demand both on premises and in the cloud to best meet their use case," said Courtney Burry, VMware's senior director of end user computing. "The benefit of what this provides to customers when you combine all of these things together is that they get the ability to pool their infrastructure on the back end to drive down costs. They can destroy and create sessions on demand for improved security. And users get a better, more personalized experience every time they log in."

Robert Young, research director for IDC's IT service and client virtualization software practice, said the addition of the Just-in-Time management tools should be welcomed by customers and prospects who want common controls across DaaS, on-premises and hybrid virtual client offerings. "If you're going to take advantage of Horizon and want to use Horizon Cloud on prem or in the public cloud, or a mixture of both, how you can utilize that common set of management tools to do that," Young said. "That's important because a lot of large companies don't want to go all in on DaaS. They want to do some DaaS, still some on-prem. The ability to have those management tools function across those different environments is an interesting innovation from VMware in that respect."

JMP will be offered with the new Horizon App Volumes advanced edition, which offers RDSH apps and session-based desktops and includes the User Environment Manager, VMware vSphere and VMware vCenter. It's priced at $200 per named user and $325 per concurrent user. A standard edition that doesn't include the JMP technology is priced at $125 per named user and $200 per concurrent user.

The new Horizon Cloud Service will let organizations deploy virtual Windows desktops and apps using a VMware-managed service hosted on the IBM SoftLayer public cloud. VMware teamed up with IBM last summer, announcing at its annual VMworld conference that it would offer services managed on Big Blue's SoftLayer cloud. Horizon Cloud supports 3D graphics with devices that have Nvidia GRID virtual GPUs. In addition to using the public IBM SoftLayer cloud, the new Horizon Cloud service allows organizations to provision their own hyperconverged appliances. The company is supporting systems from its parent company, Dell EMC, as well as from Quanta and Hitachi Data Systems.

Posted by Jeffrey Schwartz on 02/08/2017 at 9:00 AM


Slack Counters Microsoft Teams with Enterprise Grid

Slack last week showed it's not about to let Microsoft walk away with the market for chat-based workgroup collaboration without a strong fight. Addressing some of the weaknesses in Slack, which has emerged as a widely popular service for creating ad-hoc workgroups based on its chat interface, the company announced its new Slack Enterprise Grid.

The move comes amid the pending release of Microsoft Teams this quarter, which will pose a formidable challenge to Slack's namesake chat service. Slack Grid aims to counter some of the objections by IT and compliance managers to the company's current offering, letting administrators control permissions and configure integrations on a per-workspace basis, the company highlighted in a blog post announcing the new release.

Slack Grid will also let administrators build shared channels between workspaces. Among improved functions, Slack Grid will offer data loss protection (DLP) controls and a console that'll let administrators manage security, policy and compliance across an enterprise. Slack Grid will also let administrators customize policies and data retention settings for each workspace. Hoping to give it more appeal to enterprises, Slack Grid will also offer unlimited workspaces among multiple groups and departments.

Relatively new to the scene, Slack was founded in 2013 and counts numerous emerging companies as its customers but also has become increasingly popular in large organizations that include Accenture, Capital One Bank and IBM, among others. Slack is in the early stages of benefiting from the whole bring-your-own-device and so-called "shadow-IT" trend that has enabled services like Salesforce.com, Box, Dropbox and even Amazon Web Services to gain critical mass.

Courting Millennials
Seeing Slack's viral success in the workplace, Microsoft also saw an alarming trend: millennials -- those born between 1980 and 1997 -- are expected to account for half the workforce by 2020 and have embraced Slack as a way of setting up ad-hoc collaboration teams. Slack's workspace is a familiar and consistent form of communication many millennials use in their personal interactions, which is why it has become so popular in the workplace.

"As a younger generation are used to in their personal lives dealing with things like Snapchat and Google Hangouts and these other services with persistent chat capabilities, Slack is definitely a major attraction," said Shyam Oza, a senior product marketing manager at AvePoint, which has partnered with Microsoft to integrate with Teams. "There's not much of a learning curve in moving into the Slack Workspace."

Microsoft Teams Momentum
Yet Microsoft has one key advantage with Microsoft Teams: it'll be included with Office 365 subscriptions. This should give it significant gravitational pull with IT from a management and cost perspective. Microsoft Teams, announced in November, is slated for release within the next seven weeks, if not sooner.

In a blog post announcing the status of Microsoft Teams, Kirk Koenigsbauer, corporate VP of the company's Office group, revealed that more than 30,000 organizations in 145 markets and 19 languages "actively" used the preview release during January.

Koenigsbauer tacitly suggested Microsoft Teams will address the shortcomings of the current Slack offering. In the coming weeks, he said, the company will release compliance and reporting features in Microsoft Teams, aimed at addressing DLP and other security concerns.

"Great collaboration tools don't need to come at the cost of poor security or a lack of information compliance," Koenigsbauer said. Microsoft Teams will also come with WhoBot, based on the Language Understanding Intelligent Service (LUIS) developer model for creating chatbots, when the new service is released, Koenigsbauer added.

Google Aligns with Slack
Microsoft's "update" last week came conveniently timed on the day of the Slack Grid announcement, and likewise on the same day Google announced new enterprise controls for G Suite (formerly Google Apps). Google's announcement is relevant to Slack since the two in December extended their partnership to help keep G Suite in the battle against Microsoft's Office 365 juggernaut.

The G Suite update now makes it competitive with Microsoft's Office 365 E5 SKU, said Tony Safoian, CEO of SADA Systems, one of the largest systems integrators working with both companies' offerings. "It's one part of the story for them to offer a more comprehensive, secure, holistic and higher value enterprise offering," Safoian said in an interview last week.

While Google's partnership with Slack also helps broaden that focus, Safoian noted SADA in October announced a partnership with Facebook to offer its Workplace offering. "We plan to help customers implement this new service without organizational headaches and maximum productivity through user adoption programs, bringing about transformational value for their teams," Safoian said in a statement at the time.

It was very clear from the time of the Microsoft Teams launch event in early November that the company is putting a lot of weight behind it, claiming it already has 150 partners lined up to support it upon its release. AvePoint, a longtime SharePoint ISV, which now also supports Office 365, is among those with plans to support Teams.

Oza said that while the Slack Grid announcement promises to make that offering more compelling than its current version, he wants to see how it works, what tooling is offered, how well it integrates with existing infrastructure (including Microsoft's Active Directory) and how much it will cost.

"There's a lot in the [Slack] announcement that is powerful, and it shows they are moving in a positive direction and kind of addressing a lot of the criticism and gaps they face," Oza said. "It will be very interesting to see the product out in the wild for sure."

Posted by Jeffrey Schwartz on 02/06/2017 at 1:04 PM


Microsoft Requests Exception from Immigration Travel Ban

Microsoft President and Chief Legal Officer Brad Smith today said that the company has formally requested that the U.S. government grant an exception from the travel ban enacted by President Donald Trump to its employees with nonimmigrant visas who live and work in this country. The president's executive order, enacted late last week, temporarily bans entry into the U.S. from seven countries: Syria, Iraq, Iran, Libya, Somalia, Yemen and Sudan. 

In the formal letter to the newly sworn-in Secretary of State Rex Tillerson and Homeland Security Secretary John Kelly, Smith requested the exception for its employees, citing the secretaries' authority under Section 3(g) of the executive order. That clause allows petitioners with "pressing needs" to enter the U.S. As reported Monday, Microsoft has 76 employees with nonimmigrant visas who live and work in the U.S. and are affected by the travel ban.

In today's filing, Smith added that those 76 employees have 41 dependents, including one with a terminally ill parent.  "These situations almost certainly are not unique to our employees and their families," Smith stated. "Therefore, we request that you create an exception process to address these and other responsible applications for entry into the country."

Citing the Section 3(g) clause of the executive order, Smith said "the Secretaries of State and Homeland Security may, on a case-by-case basis, and when in the national interest, issue visas or other immigration benefits to nationals of countries for which visas and benefits are otherwise blocked." Smith stated that this "is not only consistent with the Executive Order, but was contemplated by it."

The Microsoft employees and family members the exception would apply to have already gone through extensive vetting by the U.S. government when approved for employment with their nonimmigrant visas, he noted. That vetting includes background checks by the Interagency Border Inspection System (IBIS), which required fingerprint and name checks, along with submitting US-VISIT's Automated Biometric Identification System (IDENT) fingerprint information, DHS's Traveler Enforcement Compliance Program (TECS) name check and National Crime Information Center (NCIC) information, Smith underscored.

Smith pointed to the disruption the travel restrictions are having on Microsoft and all companies with employees who need to travel internationally. "The aggregate economic consequence of that disruption is high, whether in administrative costs of changing travel plans or the opportunity cost of cancelled business meetings and deals," he noted. But he closed his request by underscoring the human consequences the executive order has had since it was enacted last week.

"Even among just our own employees, we have one individual who is unable to start her new job in the U.S.; others who have been separated from their spouses; and yet another employee who is confronted with the gut-wrenching decision of whether to visit her dying parent overseas," Smith stated. "These are not situations that law-abiding individuals should be forced to confront when there is no evidence that they pose a security or safety threat to the United States."

Posted by Jeffrey Schwartz on 02/02/2017 at 9:27 AM


Google Bolsters G Suite Enterprise Management and Security

Looking to chip away at Microsoft's growth in Office 365 subscriptions, Google this week added new enterprise, management and security features to its rival G Suite. Google will roll out the upgraded features this month to organizations with G Suite Enterprise edition licenses.

Google's G Suite remains the most direct alternative to Office 365, given that the search giant offers a complete and comparable alternative platform comprising productivity, collaboration, messaging and voice communication services.

By most accounts, Office 365 has made strong inroads among enterprises of all sizes, as Microsoft moves to shift many of its Exchange and SharePoint customers to the cloud. But the market is still ripe and the upgrades Google is adding to G Suite Enterprise promise to make it more appealing to businesses by addressing critical data retention, corporate policy and security requirements.

Google claims it has 3 million paying customers, including PwC, Whirlpool and Woolworths, while Microsoft last week reported nearly 24.9 million consumer-based subscriptions of Office 365 -- up from 20.6 million a year ago. Microsoft hasn't revealed enterprise subscriptions, though it said its overall cloud business, which includes Azure, grew 93% year-over-year last quarter.

The upgrade to Google G Suite announced this week aims to close the gap between the two services by improving management controls and security. "Having greater control and visibility when protecting sensitive assets, however, should also be a top concern in today's world," wrote Reena Nadkarni, Google's product manager for G Suite, in a blog post announcing the upgrades. "That's why starting today, we're giving customers the critical control and visibility they expect (and their CTOs and regulators often require) in G Suite."

The new features include:

  • Extended access control for administrators with Security Key enforcement, allowing IT management to require the use of these keys. Admins will also be able to manage the deployment of Security Keys and view usage reports, according to Nadkarni.
  • Data Loss Prevention (DLP) extended to Google Drive, building on the DLP for Gmail features launched in 2015. Nadkarni said administrators can now configure rules and scan both Google Drive files and Gmail messages to enforce policies (a simplified sketch of this kind of content scan follows this list).
  • S/MIME support for Gmail. Customers will have the option to bring their own certificates to encrypt messages using S/MIME, while administrators will be able to enforce usage and establish DLP policies based on the requirements of specific lines of business.
  • BigQuery integration with Gmail, aimed at offering extended analytics. BigQuery aims to allow IT to run custom reports and dashboards.
  • Support for third-party e-mail archiving.  In addition to Google Vault, customers will now have the option to use other archiving services, including HP Autonomy and Veritas.
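
What a DLP rule of the kind described in the list above boils down to is scanning content against sensitive-data patterns and triggering an action on a match. The sketch below is a generic Python illustration of that idea, not Google's Admin console configuration or any Google API; the card-number pattern and the "quarantine" action are stand-ins.

```python
import re

# Illustrative DLP rule: detect 16-digit, card-like numbers and act on any match.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){15}\d\b")

def scan_document(text: str) -> str:
    """Return the action a hypothetical DLP policy would take for this content."""
    if CARD_PATTERN.search(text):
        return "quarantine"   # e.g. block external sharing and notify an admin
    return "allow"

print(scan_document("Invoice total due: $4,200"))                     # allow
print(scan_document("Card on file: 4111 1111 1111 1111, exp 09/27"))  # quarantine
```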

Whether these additions will move the needle on market share remains to be seen. But these features certainly up the ante by offering key data protection services many IT decision makers are demanding, and they should be welcomed by existing customers. Many of these features are slated to roll out this month, according to Google's G Suite Release Calendar.

Posted by Jeffrey Schwartz on 02/01/2017 at 10:32 AM


Trump's Immigration Ban Hits Home for Microsoft CEO

President Donald Trump's stunning executive order late Friday, temporarily banning entry into the United States for visa holders from seven countries, has hit home for several prominent tech leaders, including Microsoft CEO Satya Nadella, an immigrant from Hyderabad, India. The global turmoil created by the immigration ban is predictably taking a huge toll on the technology community, which employs many immigrants.

While the administration underscored the ban is temporary -- 120 days for all refugees and 90 days for those from earmarked countries -- it quickly wreaked havoc among thousands of travelers. A federal judge on Saturday blocked part of the order, but the move caused chaos at U.S. airports where many travelers were detained and protestors held demonstrations. It notably created uncertainty among those working for tech companies as well as IT professionals and developers working in the United States. Tech CEOs were among the most vocal to raise alarms about the move.

Nadella joined a chorus of tech CEOs condemning the president's unprecedented ban. "As an immigrant and as a CEO, I've both experienced and seen the positive impact that immigration has on our company, for the country and for the world," Nadella said on his LinkedIn page. "We will continue to advocate on this important topic."

Nadella also pointed to an e-mail to employees from Microsoft President Brad Smith, who noted 76 employees are citizens of the countries covered by the administration's ban: Syria, Iraq, Iran, Libya, Somalia, Yemen and Sudan. Microsoft has reached out to those employees individually, Smith noted, and encouraged employees with citizenship in those countries whom the company may not be aware of, as well as those unsure whether they're affected, to reach out.

"As we have in other instances and in other countries, we're committed as a company to working with all of our employees and their families," Smith stated in his e-mail, which Nadella shared in his post. "We'll make sure that we do everything we can to provide fast and effective legal advice and assistance."

Smith also underscored Microsoft's commitment to all its employees who are immigrants. "We appreciate that immigration issues are important to a great many people across Microsoft at a principled and even personal level, regardless of whether they personally are immigrants," Smith stated. "Satya has spoken of this importance on many occasions, not just to Microsoft but to himself personally. He has done so publicly as well as in the private meetings that he and I have attended with government leaders."

Both Smith and Nadella were among a parade of tech leaders who met with President Trump in New York on Dec. 15, along with Amazon's Jeff Bezos, Apple's Tim Cook, Alphabet CEO Larry Page, Facebook's Sheryl Sandberg, IBM's Ginni Rometty and Oracle's Safra Catz. Immigration and global trade were among the issues discussed during that meeting. Nadella reportedly raised the issue of immigration with Trump during the gathering, according to a Recode report, in which the Microsoft CEO emphasized that much of the company's R&D spending takes place in the U.S. and benefits from the contributions of immigrants. The report indicated that Trump responded positively by saying "let's fix that," though he gave no specifics or promises.

Nadella's peers also condemned Trump's order. Among them was Apple's Cook, who reportedly told employees in an e-mail that he heard from numerous people concerned about the implications of the move. "I share your concerns. It is not a policy we support," Cook wrote, as reported by BuzzFeed. "In my conversations with officials here in Washington this week, I've made it clear that Apple believes deeply in the importance of immigration -- both to our company and to our nation's future. Apple would not exist without immigration, let alone thrive and innovate the way we do."

Google Cofounder Sergey Brin, who's also president of its parent company Alphabet, was spotted at a demonstration protesting the move in San Francisco. "I'm here because I'm a refugee," Brin reportedly told Forbes. More than 100 Google employees were immediately impacted by the ban, according to an e-mail to employees by CEO Sundar Pichai, obtained by Bloomberg. "It's painful to see the personal cost of this executive order on our colleagues," Pichai stated in the e-mail. "We've always made our view on immigration issues known publicly and will continue to do so."

The order was among a barrage of swift and controversial moves made by Trump since he was sworn in as the 45th U.S. president 10 days ago. Although the moves are in keeping with his campaign promises, which surely are pleasing to his supporters, it appears Trump is doing so with approaches that are testing the limits of presidential power. As rulings are issued, it remains to be seen what steps, if any, Congress will take. The economic consequences are uncertain, though the markets are down sharply today. Surely, that will hardly offer consolation to those who don't know what their futures hold.

Posted by Jeffrey Schwartz on 01/30/2017 at 10:26 AM


AppDynamics Deal Boosts Cisco's Infrastructure and App Stack Push

Cisco Systems this week made an offer to AppDynamics it couldn't refuse. Cisco's $3.7 billion deal to acquire AppDynamics, the leading supplier of application performance monitoring (APM) software, came just days before AppDynamics planned to go public.  

Cisco reportedly paid more than twice the amount AppDynamics was aiming to fetch at the time of its IPO, which was scheduled for Wednesday, leading some to shake their heads at the premium the networking giant was willing to pay. With this deal, Cisco is acquiring AppDynamics for roughly 20 times its annual revenue. AppDynamics recorded revenues of $158 million during the first nine months of its fiscal year 2016, according to the company's prospectus. That pointed to more than a $200 million year for AppDynamics, whose year-over-year growth during the nine-month period was 54%. Despite the premium paid, Cisco executives reminded Wall Street that it has a good track record for capitalizing on its big deals.

Cisco CEO Chuck Robbins responded to a question by CNBC about the premium paid, noting that AppDynamics is the leading supplier of cloud-based APM and argued it's growing nearly twice as fast as its next competitor, New Relic. Moreover, Robbins argued AppDynamics is growing faster than any publicly traded software company today.

Indeed, AppDynamics is regarded as the leading provider of cloud-based APM software. The eight-year-old company can take telemetry from the application stack down to the code layer both to predict pending performance issues and to track the cause of those that occur, while providing business impact analysis. AppDynamics can monitor performance of applications in datacenters as well as in public clouds including Amazon Web Services, Microsoft Azure and the Google Cloud Platform. As organizations move to hybrid clouds and software-defined infrastructures and use more Internet-of-Things devices, Cisco is looking toward the ability to holistically measure application telemetry across all of those tiers.

"The synergies between the application analytics that they can drive and the infrastructure analytics that we can drive across both private and public clouds creates business insights for our customers that no one else can deliver," Robbins told CNBC. "What we're doing with AppDynamics is to really help our customers understand what's going on in their environments."

During a conference call with media and analysts this week, AppDynamics President and CEO David Wadhwani explained why he decided to ditch the company's IPO at the 11th hour and accept Cisco's bid.  "It's inevitable in my mind that we are moving to a world that's going to be focused on systems of intelligence," he said. "We are in that rarified position that can redefine not just how IT departments operate but how enterprises as a whole operate."

AppDynamics is used by many large enterprises including Nasdaq, Kraft, Expedia and the Container Store. In all, AppDynamics claims that 270 of the 2000 largest global companies use its platform. Wadhwani said Cisco, which is used by almost all large organizations, has an opportunity to extend AppDynamics' reach through its extensive channel partner network. The deal to acquire AppDynamics also comes as Cisco is on the cusp of rolling out its Tetration analytics platform, which the company claims will provide monitoring and telemetry for all activity in the datacenter.

"We saw an opportunity here together to provide a complete solution," said Rowan Trollope, senior VP and general manager of Cisco's Internet of Things and applications business, during this week's analyst and media briefing. "Infrastructure analytics, paired together with application analytics, to provide not just visibility into the application performance, candidly visibility and insight into the performance of the business itself."

Enterprise Management Associates VP of Research Dennis Drogseth said that, presuming Cisco can successfully integrate AppDynamics with Tetration, the combination could advance business performance management technology. Vendors touting BPM solutions have yet to deliver on the promises of such offerings, according to Drogseth.

"One of the questions I'm not sure they know the answer to is are they going to have a central analytic pool? Will all of the data from AppDynamics feed into the machine learning engine of Teration? Or, when they have the two together, could Tetration move to a cloud environment? And how will those two integrate? But again, it could be a very promising combination."

Asked by analysts on a conference call announcing the deal why Cisco was paying such a high premium for AppDynamics, executives pointed to its $1.2 billion acquisition of Meraki, which provided a cloud-based platform for central management of Wi-Fi networks and mobile devices. At the time that deal was announced, Meraki's annual revenues were $100 million. "Today, it's at a $1 billion bookings rate," said Rob Salvagno, Cisco's VP of corporate development. "And we see the same potential for synergies in this opportunity."

Posted by Jeffrey Schwartz on 01/27/2017 at 1:15 PM


LinkedIn and Former Google Exec Kevin Scott Tapped as Microsoft's CTO

In a sign that Microsoft wants to utilize the technical chops of talent it has brought in with last month's acquisition of LinkedIn, Microsoft CEO Satya Nadella has named Kevin Scott to the newly created position of chief technology officer (CTO). Scott, who was the senior vice president of engineering at LinkedIn, joins Nadella's senior leadership team. He will be in charge of driving company initiatives. It's a dual promotion for Scott, who will continue at LinkedIn in a stepped-up role as senior VP of infrastructure.

The surprising move comes about seven weeks after Microsoft closed its $26.2 billion deal to acquire LinkedIn, by far the company's largest acquisition. With this choice, Nadella is signaling confidence that Scott can play a major role in bringing together the Microsoft and LinkedIn graphs that are key to creating new forms of search, machine learning and artificial intelligence capabilities.

When initially announcing the acquisition deal, Nadella indicated the potential LinkedIn brought to Microsoft to help combine users' professional social networks with the tools provided in the workplace. "You look at Microsoft's footprint across over a billion customers and the opportunity to seamlessly integrate our network within the Microsoft Cloud to create a social fabric, if you will, that can be seamlessly integrated into areas like Outlook, Calendar, Office, Windows, Skype, Dynamics [and] Active Directory. For us, that was an incredibly exciting opportunity," Nadella said at the time. (See Redmond magazine's August 2016 cover story analyzing the implications of the deal.)

Given Scott's deep engineering and management background at LinkedIn (and before that at Google and AdMob), it appears Nadella believes Scott will lead that effort. "We are thrilled that Kevin will bring to Microsoft his unique expertise developing platforms and services that empower people and organizations," Nadella stated in today's announcement. "Kevin's first area of focus is to bring together the world's leading professional network and professional cloud."

The brief announcement emphasized that Scott will remain active at LinkedIn and on its executive team. "I am very optimistic about where Microsoft is headed and how we can continue to use technology to solve some of society's most important challenges," Scott stated.

Scott has a 20-year career spanning academia and roles as a researcher, engineer and leader, according to Microsoft. In addition to LinkedIn, Scott has held engineering and management roles at Google and AdMob, and has advised a number of startups. Microsoft said Scott is also an active angel investor who founded Behind the Tech, an early stage nonprofit organization focused on giving visibility to lesser-known engineers responsible for, and involved in, advances in technology.

Posted by Jeffrey Schwartz on 01/24/2017 at 12:37 PM


LANDesk and Heat Software Merge into Ivanti

After acquiring four companies over the past several years to extend beyond its core specialty of PC patch management, LANDesk has combined with Heat Software; effective today, the combined company is called Ivanti.

Heat Software is a SaaS-based provider of IT service management (ITSM) and endpoint configuration and control tools that is part of private equity firm Clearlake Capital's portfolio of companies. Clearlake earlier this month agreed to acquire LANDesk from Thoma Bravo. Terms weren't disclosed, though The Wall Street Journal reported the deal is valued at more than $1.1 billion.

While the individual brands will remain, at least for now, Ivanti is the new identity of the combined company. Steve Daly, LANDesk's CEO, will lead Ivanti. Daly said combining with Heat Software will accelerate LANDesk's move from traditional endpoint management software to offering its products as SaaS-based tools.

"At LANDesk, historically we've been slow to the cloud," Daly acknowledged during an interview following the announcement of the deal. "From a Heat perspective, what it brings to us at LANDesk is first and foremost, their very robust cloud platform. First and foremost, that for us was the main strategic reason for this deal."

Daly said Heat has invested extensively over the past several years in bringing its ITSM tools to the cloud, and he added that Heat offers a workflow engine that can handle endpoint lifecycle management. "That's really where the power is," he said.

LANDesk has a long history as a provider of patch management software but Daly has looked to extend its portfolio, most recently with last year's acquisition of AppSense, a popular provider of endpoint virtualization software. The other two companies under LANDesk's umbrella are Shavlik, which provides a broad range of security, reporting and management tools that include a System Center Configuration Manager (SCCM) add-in module, and Wavelink, a provider of mobile modernization and mobile enablement tools.

The combined company offers a broad portfolio that includes privilege and patch management, security, IT asset management, ITSM, password control, desktop management and what Daly described as a complete suite of device management and reporting tools. The merger comes as Windows builds in more security and self-updating features, and as the task of managing endpoints falls on both IT and security teams, Daly said.

"Because a lot of the management techniques are getting easier, the OS is building more and more management into the platform. It's really about how you secure that end user environment," Daly said. "This is particularly acute at the endpoints because the endpoint is such a dynamic environment, whereas the datacenter is pretty static and well controlled. Our endpoints change every day, as we download stuff or as we add content."

Going forward, Daly said he believes that the profile technology it uses for Windows 10 migrations and building support for mobile devices will become a key factor in delivering a so-called "digital workplace" because of the end user activity the platform gathers. "If you lose your laptop and you need a new one, bang, you don't lose anything -- we just grab your personality that we've watched and stored."

Posted on 01/23/2017 at 12:57 PM


Microsoft Emphasizes Retail Shift to Cloud, IoT and Big Data Analytics

With many retailers reporting declining in-store growth as consumers conduct more of their shopping online, Microsoft last week emphasized how a growing number of its retail customers are deploying IoT-based sensors to capture data that can help improve operations and sales in stores.

The company was among nearly every major IT player showcasing technology with that same focus at last week's annual National Retail Federation (NRF) show, held in New York. At this year's gathering, there was a greater focus on using beacons to deliver data that can help lift sales and control inventory, giving retailers the ability to replenish inventory more rapidly without overstocking while also delivering promotional messages to customers on their phones based on their proximity to certain products.

"Every retailer knows they have products that go out of stock. But if you can quantify how long an item has been out of stock and convert that into dollar figure and then spread that across your whole store, they can react to it," said Marty Ramos, CTO of Microsoft's retail industry segment, during an interview at the company's booth at NRF. "So, we're seeing sensors that tell how much product we have on the shelf. We are doing that with shelf mats and a robot that roams by the shelf and just checks whether or not that shelf needs to be restocked. Analytics has this power where you can just pop back in these numbers. I love the fact that you can quantify these hidden metrics."

During a Microsoft-sponsored session at NRF, executives at Nordstrom Rack, Mars Drinks, Hickory Farms and Coca-Cola discussed how they're using some of the artificial intelligence (AI) and predictive analytics features offered with Azure Machine Learning and Azure IoT services.

Nordstrom Rack is in the midst of a pilot at 15 stores where it has deployed in-store beacons provided by Microsoft partner Footmark, which connects the beacons to Azure IoT. The beacons track customers who are running an app on their phones to provide promotional offers based on their proximity to certain products. Among the 15 stores that have deployed the beacons, a better-than-expected number of customers have opted to use the app, said Amy Sommerseth, Nordstrom Rack's senior director of service and experience. Sommerseth also discussed the retailer's ability to reduce waiting times at the checkout counter and better help customers find merchandise. She described the pilot as successful, and the company plans to roll it out to its 260 stores this year.

Hickory Farms, whose business ranges from a pure online channel to seasonal pop-up locations in shopping malls, is deploying Microsoft's new Dynamics 365 to replace its legacy inventory management systems. The company plans to start rolling it out in April, initially to upgrade and automate back-office and inventory management, with the retail component slated to be operational by October in time for the peak holiday shopping season, said Gordon Jaquay, Hickory Farms' director of IT. Microsoft further described the two companies' implementations, among others showcased at NRF, in a blog post.

Consumer packaged goods companies, which rely on retail to distribute their goods, are also using some of these new services. Saurabh Parihk, a VP at Coca-Cola, discussed at the NRF session how the company has started using Azure services to gather data for those in the field serving a broad constituency, which includes grocery markets, restaurants and vending machine operators. "Our current 2017 big focus is on predictive capabilities," Parihk said. "That's where we are understanding how we can better predict demand using our internal transaction data as well as external data, and, at some time in the future, we want to do more optimization because of some of the capabilities are not there yet."

Posted by Jeffrey Schwartz on 01/20/2017 at 12:17 PM


Microsoft Pitches Blockchain To Help Troubled Retail Supply Chains

Microsoft has set its sights on retail as an industry that could benefit from Blockchain technology. The company hosted a demo at its booth during this week's annual National Retail Federation (NRF) show in New York that aims to help retailers streamline their supply chain operations by creating smart contracts based on Blockchain.

A solution developed by Microsoft partner Mojix, best known for its RFID hardware and data analytics software, lets retailers automate their supply chains with smart contracts, making the delivery of goods more reliable with less overhead, according to officials at both companies. While RFID, which uses RF signals to track the whereabouts of high-value inventory and pallets, has made major inroads in certain segments of the retail industry, notably apparel, retailers that have adopted it still lack holistic visibility and control over their entire supply chains.

The solution developed by Mojix allows for Blockchain-based smart contracts between retailers, suppliers and logistics providers. During a discussion in Microsoft's booth at NRF, Mojix VP of Products Scot Stelter explained how a grocery chain implementing a smart contract could stipulate that an order of blueberries had to be picked on a certain day, arrive within five days and stored within a specific temperature range throughout the logistics and shipping processes.

"At each step of the way, that's a smart contract, where effectively a box gets checked, cryptographically locked and published to the Blockchain," he said. "When I am at the end of the chain, I see it so I can track the prominence of those berries so when they arrive I know if they are fresh. All parties to a contract have to agree that all the boxes that are checkable. Once they are checkable, the contract gets locked and it fulfills itself."

The smart contracts are based on Microsoft's Azure Blockchain-as-a-service, code-named Project Bletchley, which consists of a distributed ledger that serves as an immutable record of every transaction, where specific values can be shared as desired, ensuring that even competitors in the chain can't access or compromise data not applicable to them.
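
A minimal sketch of the idea behind such a ledger, assuming nothing about Project Bletchley's or Mojix's actual implementation: each checkpoint in the shipment is appended with a hash covering both its own data and the previous entry's hash, so any later edit to a recorded condition (say, the storage temperature) breaks the chain and is detectable by every party.

```python
import hashlib
import json

def add_entry(ledger: list, checkpoint: dict) -> None:
    """Append a checkpoint, chaining it to the previous entry's hash."""
    prev_hash = ledger[-1]["hash"] if ledger else "genesis"
    payload = json.dumps({"data": checkpoint, "prev": prev_hash}, sort_keys=True)
    ledger.append({"data": checkpoint, "prev": prev_hash,
                   "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(ledger: list) -> bool:
    """Recompute every hash; any edited or reordered entry fails the check."""
    prev_hash = "genesis"
    for entry in ledger:
        payload = json.dumps({"data": entry["data"], "prev": prev_hash}, sort_keys=True)
        if entry["prev"] != prev_hash or entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

ledger = []
add_entry(ledger, {"step": "picked", "day": "2017-01-10"})
add_entry(ledger, {"step": "in transit", "temp_c": 3.5})
add_entry(ledger, {"step": "delivered", "days_elapsed": 4})
print(verify(ledger))                 # True
ledger[1]["data"]["temp_c"] = 9.0     # tamper with the cold-chain record
print(verify(ledger))                 # False
```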

For Mojix, offering smart contracts using Blockchain is a natural extension of its OmniSenseRF Inventory Service and ViZix IoT Software Platform, which provide location-based, near real-time inventory management information and performance data. Microsoft has made an aggressive push to offer Blockchain services for the past year, as reported last summer. Almost all major banks and financial services companies are conducting extensive pilots using Blockchain, which is the technology the bitcoin currency is based on.

Microsoft has said it believes Blockchain has applications in many other industries as well, and Microsoft's Yorke Rhodes said the company is working with Mojix and other partners to help automate supply chains. "In a typical supply chain, you have 10 or more legal entities that are disparate from each other," Rhodes said during a session at NRF. "[The supply chain] is a prime example of where Blockchain technology comes into play. The nature of the shared distributed ledger actually allows all parties to be contributing to the same ledger without one party owning the ledger. And all parties agree on what is actually the one state of truth. So, there's huge benefits here, across industries."

Merrill Lynch and mining operator BHP Billiton are among those using Microsoft's Blockchain service, and Rhodes believes automating retail makes sense as well. "What we are trying to do is pick use cases across sectors to be leading-light use cases," he said in a brief interview following yesterday's session.

Traditional retailers will need all the help they can get, with several reporting slower in-store sales than their online counterparts during last quarter's peak holiday season. Since the beginning of the month, Macy's and Sears have announced that they will close hundreds of stores, and Target today said in-store sales fell 3%, though its online sales surged 30%.

Posted by Jeffrey Schwartz on 01/18/2017 at 11:26 AM


Microsoft Aims at Smarter AI with Maluuba Acquisition

Microsoft is taking another key step to advance artificial intelligence (AI) and machine learning with its agreement to acquire Maluuba, a Montreal startup that promises to squash the limitations of search and AI. Maluuba, founded in 2011, has some of the most advanced deep learning and natural language research and datasets in the world, according to Microsoft.

The Maluuba team will become part of Microsoft's Artificial Intelligence and Research group. Announced last fall, the new group brings together 5,000 engineers and computer scientists from the company's Bing, Cortana, Ambient Computing and Robotics, and Information Platform groups. Microsoft has made advancing AI, machine learning and conversational computing a key priority. Cortana, Microsoft's digital assistant in Windows, is a key pillar of that effort.

But as Maluuba CTO and Cofounder Kaheer Suleman said last year at a demonstration in New York, today's digital assistants are confined to a limited number of buckets, or domains. That's why, if you ask Siri, Cortana or Google Now a question they're designed to answer, they work well. But ask anything outside of that and they fall flat on their faces, Suleman said (see the 17-minute video and YouTube demo here).

"Maluuba's vision is to advance toward a more general artificial intelligence by creating literate machines that can think, reason and communicate like humans -- a vision exactly in line with ours," said Harry Shum, executive VP of Microsoft's AI and Research Group, in a blog post announcing the deal. "Maluuba's impressive team is addressing some of the fundamental problems in language understanding by modeling some of the innate capabilities of the human brain -- from memory and common sense reasoning to curiosity and decision making. I've been in the AI research and development field for more than 20 years now and I'm incredibly excited about the scenarios that this acquisition could make possible in conversational AI."

The company's founders, CEO Sam Pasupalak and Suleman, met in an AI class at the University of Waterloo and share a common goal of making AI more intuitive by developing machine learning techniques optimized for natural interactions.  Professor Yoshua Bengio, described as "one of Deep Learning's founding fathers," was key in supporting their efforts and will advise Microsoft's Shum while maintaining his role as head of the Montreal Institute for Learning Algorithms.

"Understanding human language is an extremely complex task and, ultimately, the holy grail in the field of AI," the two said in a blog post announcing the Microsoft deal. "Microsoft provides us the opportunity to deliver our work to the billions of consumer and enterprise users that can benefit from the advent of truly intelligent machines."

The two also pointed to Microsoft's Azure public cloud as the optimal choice for applying the datasets the company has developed because of the GPUs and field-programmable gate arrays (FPGAs) used to bolster its global infrastructure, which the company revealed back at last fall's Ignite conference in Atlanta. Maluuba's most recent advance came last month when the company announced the release of two natural language datasets.

The first dataset is called NewsQA, which the company said can train algorithms to answer complex questions that typically require human comprehension and reasoning skills. It uses CNN articles from the DeepMind Q&A Dataset, described as a collection methodology based on "incomplete information and fostered curiosity." The questions in the dataset "require reasoning to answer, such as synthesis, inference and handling ambiguity, unlike other datasets that have focused on larger volumes yet simpler questions. The result is a robust dataset that will further drive natural language research," according to the company.

The other dataset, called Frames, addresses motivation and is based on 19,986 turns designed to train deep-learning algorithms to engage in natural conversations. The model simulates text-based conversations between a customer interacting with a travel agent.

Terms of the deal were not disclosed but the company was backed by Samsung Ventures.

Posted by Jeffrey Schwartz on 01/13/2017 at 12:40 PM


VMware Brings Adaptiva to AirWatch UEM for Windows 10 Deployments

In its latest effort to extend its AirWatch platform beyond core mobile device management (MDM), VMware today said it has tapped Adaptiva to integrate its OneSite peer-to-peer software distribution tool with its new AirWatch Unified Endpoint Management (UEM) offering.

Adaptiva's OneSite is a popular option among large enterprises using Microsoft's System Center Configuration Manager (SCCM) for distributing OS and software images to thousands of client endpoints. OneSite's appeal is its efficient form of software distribution, using peer-to-peer delivery across end user endpoints as a content delivery network (CDN) rather than a server-based approach. OneSite was designed specifically to bring this software distribution capability to SCCM; the pact with VMware, Adaptiva's first with the company, will extend OneSite's use while also offering an important option to organizations adopting the new AirWatch UEM.
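
The peer-to-peer idea is straightforward to sketch: before pulling a large package from a distant server or the cloud, a client first looks for a nearby peer that already caches it. The Python below is a generic illustration of that selection logic, with hypothetical peer records and package names; it is not Adaptiva's or VMware's code.

```python
import ipaddress

# Hypothetical peer records: address and the packages each peer already caches.
PEERS = [
    {"addr": "10.1.20.14", "cached": {"win10-1703-image"}},
    {"addr": "10.1.20.77", "cached": set()},
    {"addr": "10.9.5.3",   "cached": {"win10-1703-image"}},
]

def pick_source(client_addr: str, package: str, subnet_prefix: int = 24) -> str:
    """Prefer a peer on the client's own subnet that already has the package;
    otherwise fall back to downloading from the cloud."""
    client_net = ipaddress.ip_network(f"{client_addr}/{subnet_prefix}", strict=False)
    for peer in PEERS:
        if package in peer["cached"] and ipaddress.ip_address(peer["addr"]) in client_net:
            return f"peer:{peer['addr']}"
    return "cloud"

print(pick_source("10.1.20.50", "win10-1703-image"))  # peer:10.1.20.14
print(pick_source("10.7.8.9",   "win10-1703-image"))  # cloud
```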

OneSite will be built into AirWatch UEM and offered as an option sold by VMware. AirWatch UEM, launched in October, looks to broaden the focus of MDM to include configuration and lifecycle management. It's regarded as the most formidable alternative to Microsoft's Enterprise Mobility + Security (EMS), which includes its Intune device management tool and Azure Active Directory, Azure Information Protection and the ability to deliver Remote Desktop Services (RDS).

The new AirWatch UEM aims to include MDM but also is a platform for deployments, security and managing all endpoints. Aditya Kunduri, VMware's product marketing manager for AirWatch UEM, said UEM was designed to align with Windows 10's modern application framework, while allowing organizations to continue to run Win32 applications in their native modes.

"IT can now deploy their traditional Win32 applications, be it EXEs, MSIs or ZIP files from the same platform, and also their existing mobile applications," Kunduri explained. "And it's not just about pushing down an application -- they can also tie these applications with additional lifecycle management features such as attaching dependencies to those apps."

For example, deploying WebEx Outlook plugins to end users who don't have Outlook would be a waste of network resources, he said. "We provide this intelligent platform that can manage those features alongside just pushing down the app. And on top of that, you can manage the application patches when they're available and attach these to those files too," he said.
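
The dependency check Kunduri describes amounts to comparing each endpoint's installed-software inventory against an app's declared prerequisites before queuing the push. The sketch below is a generic illustration with hypothetical catalog entries and device records, not the AirWatch API.

```python
# Hypothetical package catalog: each app lists the software it depends on.
CATALOG = {
    "WebEx Outlook Plugin": {"requires": ["Microsoft Outlook"]},
    "VPN Client": {"requires": []},
}

def plan_deployment(app: str, devices: list) -> list:
    """Return only the devices whose inventory satisfies the app's dependencies."""
    required = CATALOG[app]["requires"]
    return [d["name"] for d in devices
            if all(dep in d["installed"] for dep in required)]

devices = [
    {"name": "LAPTOP-01", "installed": ["Microsoft Outlook", "Chrome"]},
    {"name": "LAPTOP-02", "installed": ["Chrome"]},   # no Outlook: skip the plugin
]
print(plan_deployment("WebEx Outlook Plugin", devices))  # ['LAPTOP-01']
```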

Kunduri argued that pushing deployments that can involve gigabytes of data out to PCs is no longer practical with traditional PC Lifecycle Management (PCLM) tools (especially for organizations with thousands or tens of thousands of devices) because those tools rely on distribution management nodes and servers, which add significant overhead and points of failure. "You don't need to manage those distribution servers and distribution points anymore because it's all delivered from the cloud and directly to the peers. Now you're reducing your network and bandwidth infrastructure so it's a huge cost savings," he said.

Adaptiva COO Jim Souders said MDM platforms will continue to evolve into a common platform for managing all endpoints over the next few years. "Ultimately you will get to a common means of which to deploy software across disparate types of device types," Souders said. "Obviously, Microsoft and VMware are two of the bigger players going down that path. I think that's where there will be potential redistribution or affirmation of players."

Posted by Jeffrey Schwartz on 01/11/2017 at 1:10 PM


Citrix Launches Windows 10 Service on Microsoft's Azure Cloud

Following last year's plan to deliver new virtual desktop and application-as-a-service offerings on Microsoft's Azure public cloud, Citrix today released its Windows 10 desktop-as-a-service VDI offering that will run on Microsoft Azure. Citrix also said its apps-as-a-service offering, poised to replace Microsoft's Azure RemoteApp, will arrive this quarter.

The new services, called Citrix XenDesktop Essentials and XenApp Essentials, were launched today at the annual Citrix Summit, taking place in Anaheim, Calif. Both are key new offerings developed by the two companies as part of the broad extension of their longstanding partnership announced back in May at Citrix's Synergy conference. Citrix XenDesktop Essentials, the first offering announced, lets organizations provision virtual Windows 10 instances as a service on Azure using their existing software licenses. The forthcoming XenApp Essentials is the apps-as-a-service offering that will replace Azure RemoteApp.

This is aimed at "... those organizations seeking a simplified way to deploy Windows 10 virtual desktops in the Microsoft Azure cloud," said Calvin Hsu, VP of product marketing at Citrix, discussing several key announcements at its partner conference. "Microsoft customers who have licensed Windows 10 Enterprise on a per-user basis will have the option to manage their Windows 10 images on Azure through our XenDesktop VDI solution. Once XenDesktop Essentials is set up and running, the service can be managed by the Citrix Cloud."

The two companies, which have a longstanding partnership, described last year's extension of their pact as their broadest to date. In addition to offering VDI and app services on the Azure public cloud (managed by Citrix), the extended pact aims to offer a new delivery channel for Windows desktops and apps, including Skype for Business, Office 365 and Microsoft's Enterprise Mobility Suite (including Intune), via the Citrix Workspace Cloud. Citrix is running its digital workspace platform on Microsoft Azure. Citrix, through its service provider partners, will offer these new services via XenApp Essentials.

Citrix also said it will kick off a pilot for its network of service providers looking to offer its workspace platform using the Citrix Cloud. Based on the licensing model found in other Citrix offerings, the company is looking for existing and new service providers to deliver hosted, managed desktop-as-a-service offerings and app workspaces.

Another pillar of last year's pact between the two companies included plans to integrate Microsoft's Enterprise Mobility Suite with Citrix NetScaler, the company's application delivery controller (ADC) and load balancing appliance. Citrix said the resulting integration of the two, the Citrix NetScaler Unified Gateway with Microsoft Intune, is now available. Citrix said the new offering lets administrators apply policies tied to Microsoft's EMS to NetScaler, allowing for conditional single sign-on access based on specific endpoint and mobile devices.

"Together, our solution allows IT administrators to define access control policies based on the state of the end user mobile device," explained Akhilesh Dhawan, principal product marketing manager at Citrix, in a blog post. "These policies will check each end-user mobile device before a user session is established to determine whether the device is enrolled with Microsoft Intune and is compliant with the security policies set by an organization and -- only then -- grant or deny access accordingly."

For customers looking for hybrid solutions, the company launched a new program that will provide hyper-converged infrastructure running on integrated appliances. Providers of hyper-converged infrastructure hardware that are part of the new Citrix Ready HCI Workspace Appliance Program will offer appliances designed to automate the provisioning and management of XenDesktop and XenApp.

Hewlett Packard Enterprise and Atlantis Computing are the first partners signed on to offer a solution based on the program, according to Citrix's Hsu. The appliance will include XenApp or XenDesktop running on HPE's new Edgeline Converged EL4000 Edge System, a 1U system available in configurations of one to four nodes and four to 16 cores, based on Intel Xeon processors with GPU compute and integrated SSD storage. Included on the HPE system is Atlantis' USX software-defined storage offering, which creates the hyper-converged infrastructure delivering the Citrix Workspace.

Citrix also announced today that it has acquired Unidesk, a well-regarded supplier of virtual application packaging management software that, among other things, can manage both Citrix XenDesktop and Microsoft's Remote Desktop Services (RDS). Citrix said Unidesk "offers full-stack layering technology, which enhances compatibility by layering the entire Windows workspace as modular virtual disks, including the Windows operating system itself (OS layer), apps (app layers), and a writable persistent layer that captures all user settings, apps and data."

Citrix also described the latest Unidesk 4.0 architecture as a scalable implementation of the app-layering technology that aims to ease customers' transition to the cloud by providing a single app image covering both on-premises and cloud-based deployments. Citrix said it will continue to offer Unidesk as a standalone product for VMware Horizon and Microsoft virtual desktop customers.

Posted by Jeffrey Schwartz on 01/09/2017 at 1:42 PM


Dell Paints New Picture of Computing with Canvas

Looking to give digital content creators a new way to interact with Microsoft's forthcoming Windows 10 Creators Update, Dell took the wraps off a novel desktop device with a next-generation display that promises more immersive user experiences. The new Dell Canvas debuted at this week's Consumer Electronics Show (CES) in Las Vegas and effectively merges display and input into a single device.

At first glance, the 27-inch device will compete with Microsoft's recently launched Surface Studio. Both are aimed at bringing forth the new input capabilities coming to Windows 10 Creators Update later this quarter, but the Surface Studio is an all-in-one computer, while the Dell Canvas is only a display that connects to any PC running the new operating system.

Like the Surface Studio, the Dell Canvas will appeal to digital content creators, ranging from engineers and industrial designers to users in finance who create simulations and models. The Dell Canvas even supports a new interface device called the Totem. The Totem, about the size of a hockey puck, is similar to the new Surface Dial, which Microsoft introduced as an option for the Surface Studio.

Despite the similarities, Dell believes the new offering, which the company's engineers collaborated on with Microsoft's Windows team for more than two years, will offer a more immersive and interactive user experience. "It's transitioning from physical analog interactions that are hardware based to more dynamic digital-based interactions," explained Sarah Burkhart, a product manager with Dell's precision computing group, who gave me a demonstration of the Dell Canvas during a briefing last week in New York.

Burkhart believes that the Canvas and the use of the Totem, which, unlike the Surface Dial, delivers digital signals and can stick on the surface of the display, will open new ways for vertical software vendors such as Avid, Cakewalk, Solidworks and Sketchable to interact with Windows. "We have been pleasantly surprised at the interest in horizontal plus vertical from lots of areas outside of digital content creation," she said.

The Surface Dial will also work with the Dell Canvas, Burkhart said. The device comes with the Dell Canvas Pen, which is based on the electromagnetic resonance method, and supports 20 points of touch. The display can also be positioned at any angle or laid flat.

Posted by Jeffrey Schwartz on 01/06/2017 at 1:19 PM


Volvo Brings Skype for Business and Cortana to the Dashboard

Drivers of Volvo's newest 90 Series of cars will be able to initiate and participate in calls and conferences using Microsoft's Skype for Business. Volvo is displaying the new in-dash Skype for Business feature at the Consumer Electronics Show (CES), taking place this week in Las Vegas.

Skype for Business will be part of the dashboard infotainment experience in Volvo's high-end S90 sedan, as well as the company's XC90 SUV and V90 Cross Country wagon. Drivers get access through Skype for Business subscriptions, offered with Office 365 licenses, using an in-car Skype for Business application that lets them bring up their schedules and contacts to initiate or join calls and conferences.

"With the addition of Microsoft Skype for Business app, Volvo will eliminate the need for fumbling with your phone while driving or remembering long conference call codes," according to the narrative of a one-minute video advertising the feature. Once calls are completed, drivers can send themselves a note with the recording of the call.

"We see a future where flexible in-car productivity tools will enable people to reduce time spent in the office," said Anders Tylman-Mikiewicz, Volvo's VP of consumer connectivity services, in a prepared statement.  "This is just the beginning of a completely new way of looking at how we spend time in the car."

Volvo is among a handful of automakers that have recently begun working with Microsoft to improve manufacturing and in-car experiences. Just over a year ago, Volvo revealed it was experimenting with Microsoft's HoloLens augmented reality headsets. Other noteworthy automakers working with Microsoft include Nissan, Toyota and BMW.

Microsoft recently showcased its work with BMW at last fall's Ignite conference in Atlanta, where Executive VP for Cloud and Enterprise Scott Guthrie demonstrated software that he said provides better customer engagement and an "immersive end-user experience" spanning both the dashboard display and native mobile apps that the car owner can use to manage the car remotely. BMW built the capabilities using Azure IoT and Microsoft's Azure data analytics services.

Asked if Volvo plans to make Skype for Business available in all of its models, a spokesman said for now the company is only offering it in the 90 Series 2017 models. Volvo said the company is also exploring the possibility of embedding Cortana, Microsoft's digital assistant technology built into Windows 10, into its vehicles to enable voice recognition.

Volvo signaled strong interest in extending the use of voice recognition into its cars at last year's CES, when the company showcased the ability to control navigation, the heating system, door locks, lights and the horn using its Volvo on Call app via the Microsoft Band 2. A few months ago, Microsoft quietly discontinued the Microsoft Band, which could explain why Volvo is now looking at Cortana.

Posted by Jeffrey Schwartz on 01/04/2017 at 2:06 PM


Slack 'Teams' Up with Google

After reportedly flirting with acquiring Slack -- the chat-based collaboration service popular with millennial workers -- for $8.5 billion, Microsoft decided to build rather than buy. The result was Microsoft Teams, which the company unveiled last month and will add to Office 365 Business and Enterprise subscriptions early next year. Apparently not taking the competition lightly, Slack ran a full-page ad in The New York Times on Nov. 2, the day of the launch, with a lengthy letter to Microsoft warning the company that chat-based collaboration is more complicated than it looks. The move was borrowed from the late Apple cofounder Steve Jobs, whose company famously ran the "Welcome, IBM. Seriously." ad in 1981 when Big Blue unveiled its first PC.

Apparently realizing talk is cheap, Slack has found a potentially more potent way to defend its turf by extending its ties with Google. Slack already works with Google Docs, one of its early integration deals. Google Drive files are imported into Slack roughly 60,000 times each weekday, according to Google, which calculates that works out to a file shared about every 1.4 seconds. Now the two companies have embarked on a joint development pact, which Google announced last week.

"We're increasing our joint product and engineering efforts to strengthen the link between the content in Google Drive and the communication in Slack," said Nan Boden, Google's head of technology partners, in a blog post. The two companies are going to develop the following capabilities:

  • Instant permissions: Permissions in Google Drive will be adjusted so that files shared in Slack are immediately available to the proper teammates (a rough sketch of this idea follows the list).
  • Previews and notifications: Notifications from Google Drive and Docs will be delivered in Slack, which will also enable file previews.
  • Slack channels tied to Team Drives: Team Drives, the centralized repositories for shared content that Google announced in September, will be able to be paired with Slack channels to keep content and conversations in sync; the capability is slated for release early next year.
  • Provisioning Slack from G Suite: Administrators will be able to provision Slack from the G Suite (the new name for Google Apps) admin console.
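
Neither company has published how the "instant permissions" plumbing will work, but the general shape is familiar: when a file lands in a channel, grant the channel's members access to it in Drive. The sketch below is an assumption-laden illustration using the Google Drive v3 .NET client; the service-account file, file ID and teammate address are placeholders, and the trigger (a Slack file-share event) is omitted:

```csharp
using Google.Apis.Auth.OAuth2;
using Google.Apis.Drive.v3;
using Google.Apis.Drive.v3.Data;
using Google.Apis.Services;

class InstantPermissionsSketch
{
    static void Main()
    {
        // Placeholder credential; a real integration would act on behalf of the file owner.
        GoogleCredential credential = GoogleCredential
            .FromFile("service-account.json")
            .CreateScoped(DriveService.Scope.Drive);

        var drive = new DriveService(new BaseClientService.Initializer
        {
            HttpClientInitializer = credential,
            ApplicationName = "slack-drive-sketch"
        });

        // Grant a teammate read access to the file that was just shared in a channel.
        var permission = new Permission
        {
            Type = "user",
            Role = "reader",
            EmailAddress = "teammate@example.com" // would come from the channel roster
        };
        drive.Permissions.Create(permission, "FILE_ID_FROM_SLACK_MESSAGE").Execute();
    }
}
```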

Google's coming updates will aim to make it easier for users to find and access files, simplify group scheduling, add more capabilities to its Google Sheets spreadsheet app and add support for external conferencing, among other new features. With Microsoft and the Google-Slack tandem targeting next quarter to release their respective new offerings, the battle for the mindshare of millennials in the workplace is set to intensify in the new year.

Posted by Jeffrey Schwartz on 12/15/2016 at 1:17 PM


Security Skills Gap Widens, Pushing Up Pay

Deciding on which vendors' security tools to implement is a complex process, especially as threat and attack vectors frequently change, along with the environment itself (new infrastructure, apps and devices) that IT pros need to protect. But an even bigger challenge for IT professionals is finding skilled security experts.

A survey conducted by Intel Security's McAfee unit with the Center for Strategic and International Studies, released last summer, pointed to a global shortage of skilled cybersecurity professionals. The report gave a grim assessment of the current availability of those with security expertise across all disciplines. "The cybersecurity workforce shortfall remains a critical vulnerability for companies and nations," the report's introduction warned. "Conventional education and policies can't meet demand. New solutions are needed to build the cybersecurity workforce necessary in a networked world."

The survey of 775 IT decision makers in eight countries (including the U.S.) found that 82 percent are coping with a shortage of cybersecurity skills in their IT department, 71 percent report that the lack of talent is causing direct and measurable damage and 76 percent believe their respective governments aren't investing adequately in educating or training cybersecurity talent.

The top three skill sets in greatest demand among respondents are intrusion detection, secure software development and attack mitigation. The report also estimated that up to 2 million cybersecurity jobs will remain unfilled in 2019. In the U.S., 209,000 open cybersecurity positions went unfilled and job postings have risen 74 percent over the past five years, according to the Bureau of Labor Statistics.

IT compensation expert David Foote, founder of Foote Partners and author of this month's Redmond feature "IT Jobs Poised To Pay," emphasized the shortage of cybersecurity experts during the opening keynote address at last week's annual Live! 360 conference, held in Orlando, Fla. "Companies are struggling to make this leap from information security to cybersecurity," Foote said. "Information security in so many companies and in so many regulated industries [is addressed] with very skeletal staffs."

Foote's report for the third quarter of 2016 showed that skills-based pay premiums for cybersecurity experts increased 10.7 percent for the year. Based on data compiled across 65 U.S. cities, the average salary for an information security analyst is $99,398, while senior information security analysts earn $127,946. The average salary for security architects was $133,211. At the management tier, VPs of information security earned an average of $206,331, while compensation for security managers averaged $137,466.

Posted by Jeffrey Schwartz on 12/14/2016 at 10:48 AM


Windows Hello Brings Multifactor Authentication to Intel Security's True Key

Intel Security's True Key password manager can now work with Windows Hello, enabling multifactor authentication using biometrics.

The integration of both authentication capabilities comes nearly two years after Intel Security introduced True Key at the January 2015 Consumer Electronics Show. At the time, Intel Security saw it as a better alternative to Windows Hello, a feature delivered in Windows 10 six months later.

Intel Security Chief Technology Officer Steve Grobman hadn't ruled out supporting Windows Hello at some point. But during an early 2015 interview he pointed out that True Key targeted all devices and operating systems, not just Windows 10.

"I think Microsoft has done a lot of good things in Windows 10, but I don't know if Hello and Passport will completely change the use of passwords right away because users have many devices they work on," Grobman said at the time. "But Microsoft is definitely taking steps in the right direction." Intel described True Key as a password manager that supports multifactor authentication including fingerprints and any 2D camera, and works with Intel's RealSense cameras for more extensive security.

Now Intel Security -- which will become McAfee once TPG completes its $4.2 billion buyout of a 51 percent majority stake in the business unit, with Intel Corp. retaining a 49 percent minority investment -- is offering True Key as an app extension that integrates with Windows Hello via the Edge browser. The app extension lets users add Windows Hello as an authentication factor to a True Key profile, which remembers existing and new passwords. Once the user authenticates with Windows Hello, True Key will automatically enter a username and password when logging in to apps and Web sites.
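
Intel Security hasn't detailed how the extension calls into Windows Hello, but from an application's point of view the handshake generally looks like the standard Windows 10 consent prompt. The sketch below is generic WinRT API usage, not True Key's actual implementation; the class name and prompt text are made up:

```csharp
using System.Threading.Tasks;
using Windows.Security.Credentials.UI;

static class HelloGateSketch
{
    // Ask Windows Hello (face, fingerprint or PIN) to verify the user before
    // releasing a stored credential -- roughly the role Hello plays as an
    // added factor in a password manager.
    public static async Task<bool> VerifyUserAsync()
    {
        var availability = await UserConsentVerifier.CheckAvailabilityAsync();
        if (availability != UserConsentVerifierAvailability.Available)
            return false; // No Hello-capable sensor or PIN configured on this device.

        var result = await UserConsentVerifier.RequestVerificationAsync(
            "Unlock your saved passwords");
        return result == UserConsentVerificationResult.Verified;
    }
}
```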

"The password problem won't disappear overnight, which is why working with Windows Hello is a big step in the shared vision between Intel Security and Microsoft of a password-free future," said Intel Security CTO for True Key Richard Reiner in a statement. "By providing the True Key app with its enhanced multi-factor protection and support for dynamic Web form-filling, we continue to build an application that will encourage better password management and online security."

Users can download the free app from the Windows Store and then enable Windows Hello in their security settings.

In addition to Edge, Intel Security last week said True Key also supports Internet Explorer, Chrome and Firefox. Intel Security has also extended the number of authentication factors Android and iOS users can combine to three. Factors supported include facial recognition, fingerprint, a trusted device and a master password. Android users can also now authenticate with a fingerprint and use it in the Android browser or Opera.

Posted by Jeffrey Schwartz on 12/12/2016 at 12:18 PM


CAs To Apply Microsoft's New Digital Cert Code-Signing Requirements

Microsoft for the first time will require certificate authorities to institute minimum requirements for how digital certificates tied to Windows-based executables and scripts are verified. The move is being made in hopes of making it much more difficult to distribute malware. The new requirements apply to code signing, the process of applying digital signatures to executables and scripts to verify an author's identity and validate that the code hasn't changed and isn't malicious. Following two years of discussions, the Certificate Authority Security Council (CASC), a group that includes the top seven CAs, this week said its members have agreed on the code-signing requirements for Windows-based systems.

The new requirements will apply to Windows 7 and above, which introduced the dynamic root store update program that lets roots be removed easily from the root store. CASC officials said standardizing code signing has become essential in verifying that software installed in an OS is authentic.

Windows and most browsers won't run code or executables whose certificates are unsigned without first requiring the user's approval. However, more sophisticated rogue actors have obtained seemingly legitimate certificates. Because CAs had no common code-signing standards, a rogue signature only needed to get by once for the malicious code to spread. The new CASC requirements, which the CAs and Microsoft will implement Feb. 1, aim to block such attempts to distribute malicious code with invalid signatures. CASC officials said Microsoft will be the first to institute the new guidelines, with other key players expected to follow in the near future.
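
For readers who haven't looked at a signature programmatically, the check an OS or admin performs boils down to pulling the publisher certificate out of the signed file and validating its chain, including revocation status. The console sketch below uses standard .NET APIs to inspect a signed executable; the file path is a placeholder, and this is an illustration of the concept rather than the full trust check Windows itself runs:

```csharp
using System;
using System.Security.Cryptography.X509Certificates;

class SignedFileInspector
{
    static void Main(string[] args)
    {
        // Placeholder path to a signed executable; pass your own as an argument.
        string path = args.Length > 0 ? args[0] : @"C:\tools\example.exe";

        // Extract the publisher certificate embedded in the file's Authenticode
        // signature (this throws if the file carries no signature at all).
        var signer = new X509Certificate2(X509Certificate.CreateFromSignedFile(path));
        Console.WriteLine($"Signed by: {signer.Subject}");
        Console.WriteLine($"Issued by: {signer.Issuer}");
        Console.WriteLine($"Expires:   {signer.NotAfter:d}");

        // Build the certificate chain with online revocation checking -- the step
        // the new two-day revocation requirement is meant to make more dependable.
        using (var chain = new X509Chain())
        {
            chain.ChainPolicy.RevocationMode = X509RevocationMode.Online;
            Console.WriteLine($"Chain builds cleanly: {chain.Build(signer)}");
        }
    }
}
```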

"The main aim was to encourage better [digital] key protection, make sure there was a standard for validating identity information within digital certificates and to make sure there is a very prompt and streamlined process for revoking certificates if they are found to be used with malware. And then implement brand new standards for time-stamping services so that you can time-stamp your code and it will work on a longer period," Jeremy Rowley, VP of business development at DigiCert and a member of the CASC, said. "What we came up with is something everyone is happy with. It looks like it will accomplish those advantages." Rowley said all of the members of CASC are supportive of the new code-signing requirements, including the top CAs, which in addition to his company include Comodo, Entrust, GlobalSign, GoDaddy, Symantec and Trustwave. "The entire CA community and industry have bought into this," he said.

"Since it's being added to Microsoft's policy and part of their root distribution policy, it ends up being a mandatory item for any CAs working with that policy to follow the guidelines," added Bruce Morton, another CASC member and a director at Entrust Certificate Services. "In other words you didn't have a choice."

Morton added that this will extend beyond just Windows and Microsoft's browsers. "We did write the policy so it could work with non-Microsoft root policies with the expectation that other browser providers or other software vendors who rely on code-signing certificates would eventually want to use it," he said.

Having spent the entire week at the Live! 360 conference in Orlando, I asked some security experts attending TechMentor sessions about the new rules. MVP Sami Laiho, CEO of Adminize, who last week disclosed a Windows in-place upgrade security flaw, said the move is important.

"It's very big, because before this the whole certificate issuing industry has been the biggest cause of lacking trust," Laiho said. "We've had these issuers but we've had no restrictions on who the issuers can be or how they operate. This will increase security on the technical side. The whole issue of this is the whole concept of finally having some sort of a certification for those partners."

Dale Meredith, an ethical hacking author for Pluralsight, was among a few who wondered if the move will make it harder for legitimate users such as students, researchers and startups. Nevertheless, Meredith agreed with Laiho that it should improve security. "It will definitely make it harder for attackers," he said. "If it's done right it will protect users and companies from malicious attacks."

CASC spelled out three of the key guidelines, which include:

  • Stronger protection for private keys: The best practice will be to use a FIPS 140-2 Level 2 HSM or equivalent. Studies show that code-signing attacks are split evenly between certificates issued to bad publishers and certificates issued to good publishers that unknowingly allow their keys to be compromised, which enables an attacker to sign malware stating it was published by a legitimate company. Therefore, companies must either store keys in hardware they keep on premises or store them in a new secure cloud-based code-signing service.
  • Certificate revocation: Most likely, a revocation will be requested by a malware researcher or an application software supplier like Microsoft if they discover users of their software may be installing suspect code or malware. After a CA receives a request, it must either revoke the certificate within two days, or alert the requestor that it has launched an investigation. 
  • Improved time-stamping of code signatures: CAs must now provide a time-stamping authority (TSA) and meet specified requirements for the TSA and the time-stamping certificates. Application software suppliers are encouraged to allow code signatures to stay valid for the life of the time-stamp certificate. The standard allows for 135-month time-stamping certificates.

The CASC published a technical white paper that describes the new best practices, which is available for download here.

Posted by Jeffrey Schwartz on 12/09/2016 at 12:43 PM


IT Salary Expert Reveals Where Pay Is Headed

Rapid changes in technology, and businesses finding they must become more agile if they want to survive, mean IT pros and developers need to calibrate their skills accordingly. That was the call to action by David Foote, founder and chief analyst at Foote Partners, who gave the opening keynote at the annual Live! 360 conference, taking place this week in Orlando, Fla.

Demand for IT pros and developers remains strong for those who have honed their skills on the right technologies. But having strong "soft skills" and knowledge of the business is also more important than ever. However, those who can't keep pace with those requirements are likely to find a more challenging environment in the coming years, warned Foote.

Foote would know. His company has spent the past two decades tracking IT salaries to create quarterly and annual benchmark reports based on data from 250,000 employees at companies in the U.S. and Canada in 40 different industries. The scope covers 880 IT job titles and skills, both certified and non-certified.

Foote's keynote coincided with his most recent findings and analysis, published in this month's issue of Redmond magazine. "It's a great time to be working in technology. There's so many jobs, there is so much need coming down the pike," Foote said. Many IT pros and developers, nevertheless, may find those jobs coming in areas they have never considered, he said. One example: as the growth of the Internet of Things continues to accelerate, more software experts will need to learn specialized hardware skills.

Likewise, cross-skilling will be critical, he said. Despite the hype, digital transformation initiatives are placing skills premiums on experience in key areas that include architecture, Big Data, governance, machine learning, Internet of Things, cloud, micro-services, containers and security. These areas are among those on this quarter's "hot skills" list, which on average command skills premiums of 12 percent.

Skills-based pay is the premium a specific skill or job title commands, expressed as a percentage of base pay that employers are willing to pay for it. That pay is guaranteed without targets, but the percentages can fluctuate from year to year as demand for skills changes. Currently, large organizations are looking to put these digital transformation efforts on a fast track for their business survival, Foote explained.

"More and more companies are creating innovation groups," Foote said. "They are usually not part of IT, they are on their own. They may take people from IT, but they figured out that this has to be a department that sits in the business and they need to be left alone to do their stuff because they are creating the future of the company. These companies are betting that their money on these digital groups as something they need now but they are really going to need five to ten years from now."

Salaries for those at the director level who can run these efforts start at $225,000, with bonuses often reaching 50 percent of base pay. With stock options, these experts can earn more than $400,000 per year, he said. At the same time, many IT people find themselves working in organizations for many years earning below market rates.

"Don't feel bad if you're basically happy but could do better. Because, as part of my advice here is that you probably need to look for a job somewhere else," Foote said. "There's a thing called salary compression, which if you are in the compensation world, it means that job is growing faster in the market in terms of pay value than the annual increase that you are getting at the company."

The need for companies to become more agile is often leading to more contingent hiring, where instead of employing full-timers, organizations are hiring consultants or those willing to work on a contract basis. "One of the reasons we have such an under-employed population out there in the world is that companies are just not hiring, because it's expensive, and they need an agile workforce. They need somebody to come in and get this work done," he said.

Another reason for turning to the so-called "gig economy" is that hiring the right person on a full-time basis is time-consuming and complex these days, he noted. Many organizations will hire those consultants and contract workers full time in the future when it makes sense, he added.

To view his entire presentation and see what's in store, click here.

Posted by Jeffrey Schwartz on 12/07/2016 at 1:31 PM


Microsoft Clears Final Regulatory Hurdle in LinkedIn Deal

The European Union's approval of Microsoft's $26.2 billion acquisition of LinkedIn clears the final regulatory holdup, allowing the deal to close in the coming days.

If there was any potential for Microsoft to miss its target of closing its $26.2 billion acquisition of LinkedIn, announced in June, the European Union yesterday put that to rest by approving the deal, which is now set to close in the coming days.

The approval dashes the last hope of those objecting to the deal -- notably Salesforce.com CEO Marc Benioff, who filed objections with U.S. regulators and subsequently with the EU. Benioff, whose company was outbid by Microsoft in an attempt to acquire LinkedIn, raised concerns that Microsoft would lock out rival social networking services using its large Office and Windows market share.

Before clearing the deal, the EU last month sought some concessions from Microsoft, which had reportedly offered compromises. Microsoft President Brad Smith outlined the key commitments it will maintain for at least five years. Among them Microsoft will:

  • Continue to make its Office add-in program available to third-party professional social networking providers, which will let developers integrate those services into Outlook, Word, PowerPoint and Excel.
  • Maintain programs that allow third-party social networking providers to promote their services in the Office Store.
  • Allow IT pros, admins and users to customize their Office configurations by letting them choose whether to display LinkedIn profile and activity information in the user interface once Microsoft delivers the anticipated integrations.
  • Ensure PC manufacturers aren't required to install new LinkedIn apps or tiles in the European Economic Area (EEA). Likewise, Microsoft is promising not to hinder users from uninstalling the apps and tiles. Microsoft also said it won't use Windows itself to prompt users to install a LinkedIn app, though the app will remain in the Windows Store and Microsoft may promote it to customers in other ways.

Smith also said that, in the EEA, Microsoft has agreed not to form agreements with PC makers to pre-install LinkedIn exclusively, thereby blocking competitors. "We appreciated the opportunity to talk through these and other details in a creative and constructive way with the European Commission," Smith noted in yesterday's post. Having cleared approval in Europe, Smith said Microsoft is ready to move forward.

"Microsoft and LinkedIn together have a bigger opportunity to help people online to develop and earn credentials for new skills, identify and pursue new jobs and become more creative and productive as they work with their colleagues," Smith noted. "Working together we can do more to serve not only those with college degrees, but the many people pursuing new experiences, skills and credentials related to vocational training and so-called middle skills. Our ambition is to do our part to create more opportunity for people who haven't shared in recent economic growth."

Posted by Jeffrey Schwartz on 12/07/2016 at 10:45 AM


Box CEO Talks Up Added Office 365 Ties

Box last week continued to show it can grow its cloud-based enterprise file sharing and collaboration business despite competition from Microsoft and Google, among other players, by posting more than $100 million in revenue for the quarter. Even as Box competes with OneDrive, Microsoft's file-sharing service for SharePoint and Office 365, Box CEO and Cofounder Aaron Levie believes his company's partnership with Redmond is beneficial.

Announcing third-quarter fiscal 2017 revenue of $102 million, the company projected improved demand in the current quarter, with sales expected to top $108 million. "The need for Box is clear," Levie said during last week's earnings call (according to a transcript). "Today, business content is spread across separate legacy systems, on-premises storage, disparate collaboration and workflow tools, and sync and share solutions."

Box last week released several new ties to Office 365 including integration with the Office 365 Android client, the ability to create files to be saved in Box from Office Online and improved search sorting and previewing of Excel files in Box. The company described the added integration capabilities in a blog post.

In an interview last week, Levie and I discussed the partnership between Box and Microsoft and the benefits and risks of teaming up with a company that has a major stake with its own collaboration platform.

"I have to say, Microsoft has really transformed itself over the past couple of years to being a much more partner-friendly and customer-friendly organization, and I can say we are getting that feedback from customers right now because of that openness has been phenomenal," Levie said. "I think unequivocally what they [Microsoft] are realizing is with billions of people on the Internet and on mobile devices, the market is so much bigger. It is so much broader in terms of opportunity and they are taking advantage of that by partnering and building for multiple platforms."

Asked if Box will provide integration with Microsoft Teams, the new chat-based collaboration feature Microsoft will add to Office 365 business and enterprise subscriptions, Levie said that, presuming there's demand for it, Box will provide the integration.

"I haven't played with it yet but I would say strategically this is an important space," Levie said. "The future of communications is no longer going to be just dominated by e-mail. I don't believe that e-mail necessarily goes away but I think we are going to use different tools to communicate in different contexts. It's not exactly just instant messaging like they have with Skype and Lync, and it's not as asynchronous as e-mail. It's really a space in the middle. Microsoft is recognizing that this is a very real opportunity, so I think they have to go after this space. That being said, Slack is rapidly growing and I think what we are going to all benefit on as users and partners in this ecosystem is more innovation."

Levie didn't always see Microsoft as an innovator or a good partner. Years ago, Box was founded to offer an alternative to SharePoint and Office. While Levie still believes that Box offers a superior cloud-based content management solution for large enterprises, he also said there's room for both, while arguing that the two offer very different types of capabilities.

"Our focus is that we're trying to build an incredibly simple, yet robust platform to help enterprises manage, collaborate and secure all of their data. And when you look at what we built up, it builds a very different kind of experience than SharePoint or OneDrive," Levie said. "In many respects, we've been building out a very different kind of product over the past decade where it's much more of a platform. it's a real end-to-end content platform that can solve every use case around working with documents, working with files and working with your information. But then importantly, it connects into every application in your organization and that's what's fundamentally different about Box and any other product in this space."

The new security and governance features Box has rolled out this year are also taking hold, Levie said on last week's earnings call, noting a $500,000 deal with a multinational pharmaceutical company centered on the new Box Governance offering. Box Governance allows organizations to meet data retention requirements and, most recently, security classification requirements.

"Security is massive," Levie said. "It's one of the key reasons why customers will go to Box. It's one of the bigger catalysts that drive our growth and more and more we have chief information security officers that are driving the adoption of Box within the enterprise."

Posted by Jeffrey Schwartz on 12/05/2016 at 12:07 PM


AWS Covers All Bases To Extend Cloud Dominance

Amazon executives this week left no stone unturned, delivering an extensive barrage of new offerings covering a wide gamut of services -- many of which will define the company's next generation of cloud services. This week's annual springboard of announcements ranged from added instance types to services aimed at addressing the next wave of IT, which, for many business and technology planners, rests on undergoing digital transformation efforts.

During five hours of keynotes spanning two days at the annual AWS re:Invent conference in Las Vegas, Amazon executives set out to prove that their company's cloud dominance won't be seriously contested anytime soon. More than 30,000 customers and partners attended this year's event as AWS enters its second decade with a consistent focus on driving organizations to tap its broad portfolio of services to replace more and more of the functions now running in datacenters.

AWS has become so dominant that 451 Research this week put out a report predicting that "AWS +1" will become a strategic choice, or "operating principle," for enterprises in 2017. At this year's gathering, which has become AWS' largest staging event for new offerings, the new services ranged from a simpler starter kit for developers to spin up virtual private servers via the company's new Lightsail offering to a visual workflow coordination service called AWS Step Functions. Also on display were a wider range of compute options, including GPUs for all instance types and new field-programmable gate arrays for gaming and high-performance computing, a new batch processing service, management and automation capabilities, extended open source contributions and tools to advance the company's push into AI, machine learning, Internet of Things and edge locations.

Customers' trust in the public cloud to transform the way they deliver IT was equally a key theme, as well-known customers including Capital One Bank, McDonald's, Netflix and FINRA all explained how they are broadening their use of AWS. Netflix, which streams 150 million hours of programming each day and has 86 million customers, remains the poster child of companies that have transitioned from on-premises datacenters, an effort that dates back to 2008. Chief Product Officer Neil Hunt told attendees that this year marked the final phase of that transition. "We unplugged our last datacenter," he said.

Still, the customers touted by AWS are the exception rather than the rule, said Chris Wegmann, managing director at Accenture, which this week extended the partnership with AWS it first kicked off a year ago. Accenture's AWS Business Group now has 2,000 IT pros holding 500 AWS certifications, working with several large enterprises such as Avalon, Hess, Mediaset and Talen Energy. Wegmann said Accenture believes cloud migrations, especially to Amazon's cloud, will become more prevalent in the coming years, even as concerns still linger for some.

"We are seeing customers that are still hesitant," he said. "They're still trying to figure out whether or not it's the right thing for them, or whether or not they are going to try to have cloud independence. We are seeing them try to go slow and hit some roadblocks and they lose momentum. When you lose momentum, it doesn't go very well." Often those organizations "can't get out of their own way," Wegmann added.

In contrast, organizations that are successful in making the transition take disciplined approaches and stick with their plans.

"The companies that are being successful are maintaining that momentum," he added. "They are not wavering on their decisions and they make realistic decisions, while not trying to end world hunger."

Posted by Jeffrey Schwartz on 12/02/2016 at 1:42 PM


Q&A: The Case for Running SharePoint Server in Azure IaaS

One of the most important new products from Microsoft this year was SharePoint Server 2016. In addition to coming into closer sync with Office 365, it's the version of SharePoint best suited to run as an instance in an infrastructure-as-a-service (IaaS) cloud. Dan Usher, lead associate with Booz Allen Hamilton, believes Microsoft Azure is the most logical IaaS for SharePoint, though he said customers can certainly run it in any cloud infrastructure environment that meets their needs and budgets.

Usher, who has helped his clients deploy SharePoint Server in Microsoft Azure, will be joined by Scott Hoag, an Office 365 solutions architect at Crowley Maritime, at next week's SharePoint Live! conference in Orlando, where they'll demonstrate how to deploy SharePoint in Azure. Both presenters spoke with my colleague Lafe Low last month; separately, I also recently met up with Usher, and we discussed specifically why he recommends running SharePoint in Azure, either entirely or in a hybrid architecture.

Are you seeing customers looking to provision or move SharePoint Server farms into Azure?
It goes back to what your use case is. If you've got a mission-critical system and you don't have a datacenter already, the question is, why not? Cloud services, at least from a procurement perspective, will be a lot easier because you don't have to find space in a co-lo [colocation facility] and pay for electricity.

Is there a case to use SharePoint in Azure these days rather than using SharePoint Online in Office 365?
I'd say there is. A lot of organizations still want to be able to deploy applications onto the server and interact directly with the server.

And of course, the newest release, SharePoint 2016, lends itself to do that more than any previous version.
Tremendously. You have things like the cloud search service application to act as that conduit to let Office 365 go in and actually do your crawl across the enterprise and work through your index effectively. That helps out tremendously to find that information. But if you have that specific thing that needs to sit in SharePoint Server and that you don't want in Office 365, for whatever reason -- whether you're afraid, or have security compliance requirements, or you've got some specific application code you need to put directly on the server -- that's one of the main core reasons to stay on-premises.

If they're going to take that on-premises version, does it make sense to put in Azure versus AWS, or some other third-party cloud provider?
If they don't want to do it in their own datacenter, or if they want to have it out there in a more available, and in some cases more secure, infrastructure and need multiple 9s of availability -- which can be pretty difficult as an IT pro -- I don't think there's a reason not to use Azure. I know for some systems, depending on how they want to architect it out, they might run into some limitations. Some can't deploy something that requires them to join it to Azure Active Directory, and while the Azure team has made it possible to put out Azure AD Domain Services so you can connect servers into Azure Active Directory, you're still hitting some areas where things need to be integrated into your home network. Even then, using Azure still works pretty well.

Where do you hit those limitations?
A capability that was put out in preview awhile back, Azure Active Directory Domain Services, doesn't let you extend the schema. If there's something where you have a customization back in your own Active Directory that extends schema, you might run into a limitation there.

Does that impact SharePoint itself?
I believe there is only one spot that actually touches this sort of schema for a connection point to just identity itself.

So it's not a showstopper?
No, not by any means.

What about the cost of running your SharePoint Server farms in the public cloud versus your own datacenter? Are you going to save money? Will it cost more but perhaps have other benefits?
I hate saying the "it depends" line, but the problem you run into is everyone says cloud is going to be cheaper for you. And if you're running 24x7x365, you may actually be better off looking at software as a service [like Office 365] instead of going with infrastructure as a service, because you're paying for a usage license instead of compute, storage, network and everything.

How many of the clients you're working with are looking at, or running, SharePoint in Azure?
It's mixed. Some are still staying on-premises; some are taking advantage of Azure just because they don't want to have to run something internally. I'd say one of the more interesting things is the folks that are looking at keeping things on-premises but also setting up hybrid. And now that they're seeing things like the cloud search service application, they're basically buying in and saying let's move those workloads that aren't critical up into Office 365, because it just makes more sense -- then you don't have as much to back up and keep operational, and we can make it more accessible. One of the cooler things that pushes that story even further forward is OneDrive for Business, with the commitments they have made around supportability of the synchronization engine.

How would you characterize the improvements to synchronization?
It's working better and better.

Given the latest developments, what will you be talking about during your SharePoint Live! session?
We will be running demos about how to set SharePoint up in Azure and how to configure it. A lot of folks will step in and say, 'oh, I see this template that's already here for me -- I'll just use that.' You can definitely go down that path, and Bill Baer [senior technical product manager and Microsoft Certified Master for SharePoint] and that team have put a lot of effort into having an environment that you can automatically spin up. But there are use cases where you step in and say, 'hey, it's great, but I'd still like to be able to go in and customize this or customize the way SharePoint interacts with those other components.' So we're going to be walking through some of the ARM [Azure Resource Manager] templates -- how to use them to build out a SharePoint environment. If you don't want to use that, if you just want to use the AutoSPInstaller tool to build out your installation for you, I'll be showing off some of that as well, and showing some of the more complex use cases you might have with getting things connected back to your own network.

Overall how much interest are you seeing in this?
There's a lot of interest from customers who want to get out of their own datacenter. I would say a lot of organizations have a SharePoint guy, and that SharePoint guy may have five other roles and probably started off doing something completely different, such as Windows Server administration or Exchange administration. And when he steps into SharePoint, he freaks out. So we're hoping this can at least give them the information and knowledge to go back and get their job done more effectively.

Find out more about Live! 360 2016 in Orlando here.

Posted by Jeffrey Schwartz on 11/30/2016 at 12:33 PM


Google Will Help Build Future Versions of .NET Core

Love is a two-way street. Microsoft has shown considerable affection for Linux over the past two years and, by some measure, consummated its marriage to open source by joining the Linux Foundation. That news, however, overshadowed the fact that Google has joined the .NET Foundation's Technical Steering Committee.

Microsoft announced both milestones earlier this month at its Connect() conference in New York, where the company outlined several key deliverables that will extend its portfolio of tools and platform capabilities, including a new Bot Service preview and Azure cloud platform services aimed at bringing intelligence to apps. While many of Microsoft's key moves show more than just lip service to the open source community, the company is equally committed to extending its own .NET platform.

Having released the first iteration of the multiplatform .NET Core this summer, the company revealed the preview of .NET Core 1.1, which adds 1,380 new APIs, support for additional Linux distributions, performance improvements, thousands of new features, hundreds of bug fixes and improved documentation. Bringing Google into the fold builds on Microsoft's ".NET Everywhere" strategy, which aims to bring C# code to phones, TVs, open source infrastructure and cloud services.

But Google's decision to join the .NET Foundation's Technical Steering Committee drew more muted applause than the news earlier that same morning that Microsoft was joining the Linux Foundation. When Scott Hanselman, Microsoft's principal program manager for Azure, ASP.NET and Web tools, made the announcement during his keynote presentation at Connect(), there was a brief silence, followed by a polite, though hardly exuberant, applause. "I don't want your pity applause," Hanselman quipped.

Allowing that the reaction was more surprise than pity, since the move could give a boost to the Google Cloud Platform, Hanselman explained the significance of having Google join Red Hat, JetBrains, Unity and Samsung in helping steer the direction of .NET Core.

"Googlers have quietly been contributing to .NET Foundation projects and they are also helping drive the ECMA standard for C#," Hanselman told attendees at Connect(). "The Google Cloud Platform has recently announced full support for .NET developers and they have integrations into Visual Studio and support for PowerShell. All of this is made possible by the open source nature of .NET and of .NET Core, and I could not be more thrilled to be working with our friends at Google to move C# forward and to move .NET forward and to make it great everywhere."

Google's support for .NET goes back several years, when the company added .NET support throughout its infrastructure, including libraries for over 200 of its cloud services. The company has also recently added native support in Google Cloud Platform for Visual Studio and PowerShell. Back in August, Google announced support for the .NET Core release, including ASP.NET and Microsoft's Entity Framework.

Google isn't the only major player to give .NET a lift. Apps built with .NET Core will also find their way to TVs, tablets, phones, wearables and other devices with help from Samsung, which used the Connect() event to release the first preview of Visual Studio Tools for Tizen, the embedded OS used to power Samsung devices. The tooling will enable mobile application development using device emulators and an extension to Visual Studio's IntelliSense and debugging features.

Martin Woodward, executive director of the .NET Foundation, noted that there are 50 million Tizen-powered devices in the wild. Said Woodward: "This will allow .NET developers to build applications to deploy on Tizen across the globe and continues in our mission to bring the productive .NET development platform to everyone."

Posted by Jeffrey Schwartz on 11/28/2016 at 1:34 PM


Microsoft's Pending LinkedIn Acquisition Ignites Both Fear and Optimism

Microsoft is hoping to finalize its acquisition of professional social networking giant LinkedIn by year's end. But it still has a key hurdle to clear -- gaining approval from European Union regulators. Salesforce CEO Marc Benioff, whose company was unable to outbid Microsoft's successful $26.2 billion offer to acquire LinkedIn, is trying to convince the EU it shouldn't approve the deal, arguing it will stifle competition.

While the deal has already cleared regulatory approval in the United States and other countries, the EU is expected to decide by Dec. 6 either to clear it or to delay it pending further investigation. A report by Reuters on Nov. 16 revealed that Microsoft officials last week met with EU regulators and offered concessions. The report, citing unnamed EU officials, said details weren't disclosed but that regulators will share the proposed concessions with competitors and customers.

Benioff has been complaining to regulators since the deal was first announced by the two companies in mid-June. In late September, Salesforce Chief Legal Officer Burke Norton told CNN in a statement that "Microsoft's proposed acquisition of LinkedIn threatens the future of innovation and competition. By gaining ownership of LinkedIn's unique dataset of over 450 million professionals in more than 200 countries, Microsoft will be able to deny competitors access to that data, and in doing so obtain an unfair competitive advantage."

Amid the latest hurdles, I was among four participants in a panel discussion Tuesday night in New York City and again Wednesday morning in Woodbridge, N.J., at meetings of Microsoft partners. The sessions were planned a while back to discuss whether, and how, Microsoft partners might benefit from the deal.

In advance of the meetings, I presented results of the Redmond magazine survey we conducted and shared in our August cover story. More than 20 attendees were at each meeting, held by the local chapters of the International Association of Microsoft Channel Partners (IAMCP), and some were hopeful the integration of LinkedIn's large pool of data and social graph will open new opportunities for them, such as offering new capabilities on the Dynamics CRM suite to make it easier to sell and market to customers.

Among those bullish about the potential of the deal is Eric Rabinowitz, CEO of Nurture Marketing, who organized the panel discussion and recruited me to participate along with Jeffrey Goldstein, founder and managing director of Queue Associates, a Dynamics integrator, and Karl Joseph Ufert, president of Mitre Creative, a New York-based digital agency that works with various Microsoft partners.

Rabinowitz uses LinkedIn today to mine his clients' "circle of influences," he explained. "We look at our clients, go into LinkedIn and look at what surrounds them, who their peers are, who they report to and who reports to them," Rabinowitz explained. "Then we harvest that information and get e-mail addresses for those people. And instead of marketing to one person in an organization, we reach their whole circle of influence. What's beautiful about the Microsoft acquisition, I think what I just described will all be there at the push of a button."

Asked later if he was concerned that this capability would marginalize his business, Rabinowitz argued to the contrary. "Right now, it's a labor-intensive service that does not make us much money," he said. "If it's a service then what Microsoft will do for us is improve our service, gaining improved results. Also, I can envision us packaging the service into something else we do."

Others, though, are concerned that once Microsoft absorbs LinkedIn, the resulting services could put the squeeze on their own offerings, depending on how they're priced. That's especially the case for those who offer virtual software training services. Lisa Eyekuss, president of Corporate Training Group, based in Iselin, N.J., shared her concerns, especially if Microsoft bundles LinkedIn's Lynda.com training service with Office 365 subscriptions. "If they include it, then Microsoft just slapped the face of its partners who do all the end user training because it's the first item in the budget to be cut."

While she takes some comfort in having diversified her business in recent years, Eyekuss is wondering to what degree Microsoft will slash the cost of Lynda.com to Office 365 subscribers. "I know it will affect our business," she said. "We have to figure out how to stay away from it."

It also remains to be seen to what degree Microsoft will integrate LinkedIn data, including contacts and the newsfeed, with Office 365 and the new Microsoft Teams (announced earlier this month), as well as with the new Dynamics 365 business suite, particularly CRM. Goldstein believes it will add much richer capabilities to the CRM stack, and he is hoping Microsoft shares more specifics as soon as the deal closes -- presuming the EU doesn't delay it. "I'm excited and I don't know why," he said. "The only thing I do know is if Marc Benioff is upset about this acquisition and is trying to block it, it's got to be good. There's got to be something there."

Posted by Jeffrey Schwartz on 11/17/2016 at 10:17 AM


Black Friday PC Deals Will Include Discounts for Surface

If you have considered treating yourself or someone else to a new PC, next weekend's Black Friday and Cyber Monday might be the best opportunity yet to do it. Microsoft today revealed on its Windows blog that it will offer attractive deals on several of its Surface Pro 4 and Surface Book models, and pointed to notable markdowns on PCs from third-party OEMs at a number of retailers. As far as I can recall, this year's lineup of holiday discounts may be the best Microsoft has ever offered during the traditional holiday buying season.

Arguably the most tempting offer is a Surface Pro 4 with an Intel Core m3 processor, 4GB of RAM, a 128GB SSD and the Signature Type Cover keyboard for $599; the bundle regularly costs $1,029. For those wanting a step up, the company is also offering the Surface Pro 4 with a Core i5 processor, a 256GB SSD, 8GB of RAM and the keyboard for $999 (the bundle typically costs $1,429).

Those $400-plus discounts will only be available at Best Buy and the Microsoft Store from Nov. 24 to Nov. 28. While there are rumors that Microsoft might roll out a Surface Pro 5 at some point next spring -- which would likely bring markdowns on its predecessors anyway -- the discounts offered on the Surface Pro 3 after the launch of the Surface Pro 4 weren't as attractive as these current holiday deals.

Also during Black Friday next weekend, Microsoft is taking $400 off its Surface Book, offering a model with a Core i5 processor, a discrete GPU, 8GB of RAM and a 256GB SSD for $1,499. The entry-level Surface Book with a 128GB SSD and 8GB of RAM, which normally sells for $1,499, will be on sale for $1,249.

In addition to its own hardware, Microsoft pointed to a variety of other Black Friday and Cyber Monday deals available at other retailers from its OEM partners Dell, Lenovo, HP, Acer, Asus and Samsung. One of the deals Microsoft considered most appealing to business users is $150 off the new Lenovo Yoga 910 at Best Buy, with Intel's 7th Generation Core i7 processor, a 14-inch display, 8GB of RAM and a 256GB SSD for $1,049 (not mentioned but also planned is a similar model with 16GB of RAM and a 4K Ultra HD display for $1,399). During the weekend, Costco members can take $300 off the costly and compact Dell XPS 13 Ultrabook or get a $100 markdown on the Acer Spin 5.

Those looking for lower-end PCs that are lighter on the wallet will find some good deals worth considering as well, including:

  • Dell Inspiron PCs starting at $399 and HP Notebook 15 for $299 at the Microsoft Store.
  • HP X360 for $229 or Lenovo Ideapad for $400 at Best Buy.
  • HP laptops NT Ci3 for $269 and NT Ci5 for $329 at Office Depot.
  • ASUS Transformer Mini T102 for $299 from Amazon.
  • Dell's Inspiron 11 3000 for $99 at the company's Web site.

For gamers, Microsoft said it is discounting its 1TB Xbox One and Xbox One S consoles by $50, bringing the latter to its lowest price ever, at $249. The game console deals will be available starting next Sunday, Nov. 20, through Wednesday, Nov. 23.

Posted by Jeffrey Schwartz on 11/14/2016 at 1:29 PM0 comments


How Will Trump’s Victory Impact Cybersecurity, Tech and Internet Policy?

Whether you're pleased or shocked by the stunning upset Donald Trump notched last night, his election as the 45th president raises questions on how his administration will try to change Internet policy and address the wide number of cybersecurity issues facing businesses and end users.

If some of his remarks about cybersecurity, encryption and Internet regulation, including net neutrality, during his 17-month campaign are any indication, there are reasons to believe big changes are in store. One big question is whether he will press for tighter restrictions on encryption and how he will shape the government's overall approach to it.

When California Federal District Court Magistrate Judge Sheri Pym earlier this year ordered Apple to help the FBI decrypt the iPhone used by suspected terrorist Syed Rizwan Farook, who, with his wife Tashfeen Malik, killed 14 people in the December 2015 San Bernardino, Calif., shootings, Trump called for a boycott of Apple.

"Apple ought to give the security for that phone, OK. What I think you ought to do is boycott Apple until such a time as they give that security number. How do you like that? I just thought of it. Boycott Apple," Trump said at the time. "The phone's not even owned by this young thug that killed all these people. The phone's owned by the government, OK, it's not even his phone," Trump said. "But [Apple CEO] Tim Cook is looking to do a big number, probably to show how liberal he is. But Apple should give up, they should get the security or find other people."

According to a post today by the Information Technology and Innovation Foundation, Trump supports weakening encryption in favor of stronger homeland security. The post also addresses his positions on a number of other issues that will impact the IT industry, including his opposition to H-1B visas, and notes that he has articulated little about whether he would advocate for increased R&D investment in technology.

The new president-elect has also advocated restricting Internet access to stop terrorist organizations like ISIS from recruiting. "I would certainly be open to closing areas where we are at war with somebody," Trump said, according to the On the Issues site. "I don't want to let people that want to kill us use our Internet."

The site also questioned Trump's understanding of net neutrality when he compared it to the Fairness Doctrine.

Judging by his remarks, the president-elect is clearly not up to speed on many of these issues. Now that the campaign is over and once he assumes office, though, the advisors he surrounds himself with and the appointments he makes could have a major impact.

What's your prediction on how he will address cybersecurity, Internet policy and other IT-related issues?

Posted by Jeffrey Schwartz on 11/09/2016 at 1:15 PM0 comments


Microsoft's AI and Speech Breakthroughs Eclipsed by New IBM Watson Platform

Researchers at Microsoft have achieved what they say is a breakthrough in speech recognition, claiming they've developed a system that's as effective as, or better than, people with professional transcription skills. The software's word error rate (WER) is down to 5.9 percent -- an improvement from the 6.9 percent WER the team reported in September. The milestone was achieved with the new Microsoft Cognitive Toolkit, the software that underpins those speech recognition advances (as well as image recognition and search relevance). Microsoft announced both developments two weeks ago, though the timing wasn't the best, as IBM was holding its huge World of Watson event in Las Vegas.
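
For readers unfamiliar with the metric, WER is conventionally computed as the word-level edit distance between the recognizer's output and a human reference transcript, divided by the number of words in the reference. The following is a minimal, generic Python sketch of that calculation; it illustrates the standard metric, not Microsoft's own code:

def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance (substitutions + deletions + insertions)
    divided by the number of reference words -- the conventional WER."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] holds the edit distance between ref[:i] and hyp[:j].
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# One substitution against a six-word reference yields a WER of roughly 0.17 (17 percent).
print(word_error_rate("the cat sat on the mat", "the cat sat on a mat"))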

Watson, of course, is Big Blue's AI system made famous several years ago when it appeared on Jeopardy and, in advance of its latest rollout, made the talk-show circuit, including CNN and CBS's 60 Minutes, where IBM Chairman, President and CEO Ginni Rometty talked up Watson's achievements, including the ability to discover potential cancer cures deemed not possible by humans alone, among other milestones.

Microsoft believes it has the most powerful AI and cognitive computing capabilities available. The Microsoft Cognitive Toolkit is the new name for what it previously called the Computational Network Toolkit, or CNTK. In addition to helping the researchers hit the 5.9 percent WER, the new Microsoft Cognitive Toolkit 2.0 helped the researchers enable what the company is calling "reinforcement learning."

The new open source release, like its predecessors available on GitHub, now supports Python in addition to C++. "We've removed the barrier for adoption substantially by introducing Python support," said Xuedong Huang, a Microsoft distinguished engineer, in a recent interview. "There are so many people in the machine learning community who love using Python."
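
To give a feel for what that Python support looks like, here is a minimal sketch of defining and training a tiny network with the toolkit's cntk package, written against the CNTK 2.0-era API; exact names and signatures have shifted across releases, and the data is random, so treat it as illustrative rather than canonical:

import numpy as np
import cntk as C

features = C.input_variable(2)
labels = C.input_variable(2)

# A tiny two-layer classifier built with the layers library.
model = C.layers.Sequential([
    C.layers.Dense(16, activation=C.relu),
    C.layers.Dense(2)
])(features)

loss = C.cross_entropy_with_softmax(model, labels)
error = C.classification_error(model, labels)

lr_schedule = C.learning_rate_schedule(0.1, C.UnitType.minibatch)
trainer = C.Trainer(model, (loss, error), [C.sgd(model.parameters, lr_schedule)])

# Train on one minibatch of random data, just to show the call pattern.
x = np.random.rand(32, 2).astype(np.float32)
y = np.eye(2, dtype=np.float32)[np.random.randint(0, 2, 32)]
trainer.train_minibatch({features: x, labels: y})
print("minibatch loss:", trainer.previous_minibatch_loss_average)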

Most noteworthy, Huang said, is that the new software has a significant boost in performance, enabling it to scale across multiple Nvidia GPUs, including those deployed alongside the field-programmable gate arrays in the Azure cloud. Huang acknowledges the toolkit isn't as popular as other open source frameworks such as Google's TensorFlow, Caffe or Torch, but he argues it's more powerful, extensible and able to scale across multiple machines and environments.

"It' the fastest, most efficient distributed deep learning framework out there available," Huang said. "The performance is so much better. It's at least two to three times faster than the second alternative." The new Microsoft Cognitive Toolkit includes algorithms that can't degrade computational performance, he added.

For its part, IBM made some pretty big news of its own. The company released the new Watson Data Platform (WDP), a cloud-based analytics development platform that allows programming teams including data scientists and engineers to build, iterate and deploy machine-learning applications.

WDP runs on IBM's Bluemix cloud platform, integrates with Apache Spark, works with the IBM Watson Analytics service and will underpin the new IBM Data Science Experience (DSX), which is a "cloud-based, self-service social workspace that enables data scientists to consolidate their use of and collaborate across multiple open source tools such as Python, R and Spark," said IBM Big Data Evangelist James Kobielus in a blog post outlining last month's announcements at the company's World of Watson conference in Las Vegas. "It provides productivity tools to accelerate data scientists' creation of cognitive, predictive machine learning and other advanced analytics for cloud-based deployment. It also includes a rich catalog of learning resources for teams of data science professionals to deepen their understanding of tools, techniques, languages, methodologies and other key success enablers."
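
The Python, R and Spark workflow Kobielus describes is easiest to picture as notebook code. Below is a minimal, generic PySpark sketch of the kind of exploratory analysis such a workspace hosts; nothing in it is an IBM-specific API, and the file name and column names are hypothetical:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("churn-exploration").getOrCreate()

# Hypothetical customer data set; the path and schema are placeholders.
df = spark.read.csv("customers.csv", header=True, inferSchema=True)

# Simple aggregation: average monthly spend and customer count by churn flag.
summary = (df.groupBy("churned")
             .agg(F.avg("monthly_spend").alias("avg_spend"),
                  F.count("*").alias("customers")))
summary.show()

spark.stop()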

There are also free enterprise plans that include 10 DSX user licenses and a Spark Enterprise 30 Executor Plan, he noted. IBM claims more than 3,000 developers are working on the WDP and upwards of 500,000 users are now trained on its capabilities.

Has IBM already won the war against Microsoft, Google, Amazon and Facebook when it comes to intelligent cognitive computing, AI and machine learning? Karl Freund, a senior analyst at Moor Insights & Strategy, said Microsoft without question has a long way to go to compete with IBM Watson for mindshare. "IBM is a brilliant marketing machine, and they are spending a lot of money to establish the Watson brand. Microsoft has nothing comparable," Freund said. "From a technology perspective, however, IBM has not convinced the technologists that they have anything special. In fact, most people I speak with would say that their marketing is ahead of their reality."

Freund said what IBM is offering is a collaborative development platform. "Microsoft is releasing their software as open source," he added. "IBM is all about the services, while Microsoft is seeking to gain broad support for their software stack, regardless of where you run it."

Microsoft's new toolkit and its multi-GPU support are significant, while Watson is likely to appeal to existing Big Blue shops including those with mainframes and organizations using the IBM SoftLayer cloud.

Posted by Jeffrey Schwartz on 11/07/2016 at 1:57 PM0 comments


Will Microsoft Teams Bring Millennials to Office 365?

The launch of Microsoft Teams this week is an important building block in Microsoft's focus on digital transformation and ensuring a new generation of workers gravitate to Office 365. It's critical because many millennials hate e-mail, are more accustomed to using chat and likely use other productivity suites such as Google Apps.

Microsoft Teams is poised to be omnipresent in the workforce early next year when the company releases it to all business and enterprise subscriptions of Office 365, though it apparently will let users opt in rather than opt out. But the goal is clear -- the company wants Teams to evolve into users' core digital workspace, with the Web-based interface serving as a hub for all collaboration.

"Think of Microsoft Teams as a digital transformation of an open office space environment. One that fosters easy connection and conversation to help people build relationships. One that makes work visible, integrated and accessible across the team so that everyone can stay in the know," said Kirk Koenigsbauer, corporate VP overseeing Microsoft's Office client team, at this week's New York launch of the new offering.

Workers under the age of 30 are more accustomed to communicating in environments such as Snapchat, Facebook Messenger and a slew of other chat-focused apps. As they join the workforce, Microsoft risks many of them being turned off by Office 365, and Outlook e-mail in particular, which is what Teams aims to overcome.

"Our workforce is already two-thirds millennial," said Andrew Wilson, CIO of the large IT consulting firm Accenture, speaking at Wednesday's event.  "So they have been behaving like this in the consumer space.  But what this does is provide enterprise security, enterprise foundation and that nice integration with the things we've already invested in."

Alaska Air is another customer that has spent several months testing Microsoft Teams. Systems engineer Randy Cochran said the airline's customer service group is testing the new tool. "Teams provides persistent chat for keeping track of conversations," said Cochran. "So if a customer calls and speaks to a different rep, they can retrieve a history of the prior discussion. They can just discover documents [and] it gives them the ability to share knowledge and go back and retrieve knowledge that's already been maybe in the silo and get that back. It also gives us a single pane of glass for documents and manuals and everything that's already been put into SharePoint."

Time will tell whether millennials embrace Microsoft Teams. But the early testers believe it has a good shot. "This will be a no-brainer," said Wictor Wilén, a digital workplace architect at Avanade, who is currently testing Microsoft Teams. "They will understand from day one exactly how to use it." When asked about the older generation, he said that remains to be seen, given they've used e-mail for two decades.

Posted by Jeffrey Schwartz on 11/04/2016 at 2:16 PM0 comments


Office 365's New Chat-Based Digital Workspace: Microsoft Teams

After mulling an acquisition of popular enterprise chat platform Slack for $8.5 billion earlier this year, Microsoft decided to build its own. The company today revealed Microsoft Teams, hoping it will emerge as the hub for user experience across Office 365 and third-party apps and services.

Microsoft Teams, introduced at an event in New York, is a new chat-based workspace that brings together all the components of Office 365 and Skype for Business into an integrated experience tied to the Microsoft Graph.

Microsoft Teams brings together Word, Excel, PowerPoint, SharePoint, OneNote, Planner and Power BI into a common workspace. Microsoft Graph is used to share intelligence and context among the apps, and Active Directory and Office 365 Groups are used to associate people with information. Microsoft also released an SDK that allows enterprise and commercial developers to build connectors, based on the same model as Exchange Server connectors, that let outside services feed notifications into team chats. For example, Microsoft Teams can integrate Twitter or GitHub notifications into the workspace.
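
As a concrete illustration of the connector model, a service can push a notification into a Teams channel by posting JSON to an incoming-webhook URL generated when the connector is added to that channel. A minimal Python sketch follows; the webhook URL is a placeholder and the card fields are kept to the basics:

import requests

WEBHOOK_URL = "https://outlook.office.com/webhook/PLACEHOLDER"  # hypothetical URL

def post_notification(title: str, text: str) -> None:
    """Send a simple connector card that Teams renders in the channel feed."""
    payload = {"title": title, "text": text}
    response = requests.post(WEBHOOK_URL, json=payload, timeout=10)
    response.raise_for_status()

post_notification("Build status", "Nightly build 1234 completed successfully.")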

Among other things, Microsoft Teams will appeal largely to the millennial workforce, which has become accustomed to using online chat for communication while reducing reliance on e-mail. Slack is among a number of popular tools many workers have started using in recent years, and Microsoft believes it has a solution that provides security, governance and context enabled by the underlying Microsoft Graph, Office 365 and much of its machine learning efforts, including its new Bot Framework and Azure Machine Learning service.

"Microsoft Teams is a chat-based workspace where people can come together in a digital forum to have casual conversations, work on content, create work plans -- integrated all within one unified experience," CEO Satya Nadella said at the New York launch event. "It's designed to facilitate real-time conversations and collaboration while maintaining and building up that institutional knowledge of a team."

Kirk Koenigsbauer, corporate VP overseeing Microsoft's Office client team, gave an overview of the new chat platform, saying it will be added to Office 365 Business and Enterprise subscriptions in the first quarter of next year. "It will be automatically provisioned in Office 365 and managed as any other Office 365 service," he said.

A preview is now available along with an SDK that will allow developers to tie their own apps to Microsoft Teams. Currently 70 Office 365 connectors are available in the toolkit, Koenigsbauer said. Support for Microsoft and third-party services will come via planned enhancements to the company's Bot Framework.

The toolkit lets developers build Tabs that present users with individual Web experiences within Teams, providing instant access to a specific service and allowing for collaboration around that application's content. Using the Bot Framework, developers can create Bots that allow individuals and teams to chat with a service by making queries, while also allowing users to enable Connector notifications. Microsoft officials also previewed T-Bots, which include help features for Microsoft Teams, and Who-Bots, a feature under development that will help individuals discover colleagues with specific knowledge based on their identities and the context of their communications.

The APIs will also allow for customizable workspaces and the company said it expects to have 150 integration partners at launch next year. Among them are Hootsuite, Intercom, ZenDesk and Asana. The company emphasized that Microsoft Teams will share the security and compliance support offered in its other services including EU Model Clauses, ISO 27001, SOC 2 and HIPAA, among others.

Time will tell if it's a Slack-killer, but the richly valued startup certainly showed it was paying attention. In addition to running a full-page ad in The New York Times, Slack noted the arrival of Microsoft Teams prominently on its own site.

"Congratulations on today's announcements," the company said in a company blog post. "We're genuinely excited to have some competition. It's validating to see you've come around to the same way of thinking.  It's not the features that matter. You're not going to create something people really love by making a big list of Slack's features and simply checking those boxes."

"We're here to stay," the company said. "We are where work happens for millions of people around the world. And we are just getting started."

Slack claims it has one million paid subscribers (2.5 million overall) with $100 million in annual recurring revenue and six million apps now installed among its members' teams.

Posted by Jeffrey Schwartz on 11/02/2016 at 9:53 AM0 comments


Windows Chief Slams Google for Premature Vulnerability Alert

Microsoft officials appeared to be fuming this week over Google's disclosure Monday of a zero-day vulnerability just days after alerting Microsoft to it. Microsoft said yesterday a patch will be available next week and argued Google should have waited. Google defended its decision to disclose the vulnerability, saying it's a serious flaw that has been actively exploited.

The search giant acknowledged it was disclosing the vulnerability despite the fact that Microsoft still hasn't issued a fix, urging users to use the auto-updater for Adobe Flash and to apply the patches to Windows when Microsoft releases them.

Terry Myerson, executive vice president of Microsoft's Windows and Devices Group, made known his displeasure with Google's decision to issue its alert before Microsoft had a patch ready. "We believe responsible technology industry participation puts the customer first, and requires coordinated vulnerability disclosure," Myerson stated. "Google's decision to disclose these vulnerabilities before patches are broadly available and tested is disappointing, and puts customers at increased risk."

Myerson noted it wasn't the first time Google has done so, pointing to another occasion nearly two years ago and to the company's call for better-coordinated disclosure to prevent vulnerabilities from being exploited before patches can be readied.

The disclosure fueled continued debate over how vulnerabilities should be disclosed in the best interest of users. Udi Yavo, co-founder and CTO of threat detection vendor enSilo, said in an e-mail sent to media that Google was wrong. In addition to advocating for a 90-day window for disclosure, Yavo called for legislation to hold companies legally accountable.

"In the case of Google's disclosure, justification for only allowing a week for Microsoft to develop a patch is because Google researchers were seeing the vulnerability actively exploited in the wild," Yavo noted. "To me, this doesn't ultimately help achieve everyone's goal, which should be keeping consumers and their data safe. By disclosing a vulnerability early, without allowing time for a patch, Google opened-up the small pool of people who found the vulnerability and knew how to exploit it, to all."

Not everyone shares that view. Ilia Kolochenko, CEO of Web security firm High-Tech Bridge, said in an e-mail that Google did the right thing. "I think it's not a question of days, but rather of efficient cooperation to fix the vulnerability," he said. "Google has great cybersecurity experts and engineers who can definitely help other companies to understand the problem faster and help fixing it. Instead of endless discussions about the ethics of full disclosure, we should rather concentrate on inter-corporate coordination, cooperation and support to make the Internet safer."

What's your take? Should Google have waited or do you think it did the right thing by making the vulnerability known?

Posted by Jeffrey Schwartz on 11/02/2016 at 11:38 AM0 comments


Sizing Up the Newly Combined Dell EMC

Having spent the past year rationalizing its mammoth $67 billion acquisition of EMC, the newly combined company this month hit the ground running. During the Dell EMC World Conference two weeks ago in Austin, Texas, Michael Dell and the senior executive team of the new Dell EMC outlined a laundry list of deliverables that will bring together the two organizations' respective technologies.

From the outset of this month's conference, company officials emphasized their portfolio approach to bringing together EMC and the companies it controlled (including VMware, RSA, Pivotal and Virtustream) with Dell. All now fall under the umbrella of Dell Technologies, with Dell EMC consisting of the server, storage, network and enterprise infrastructure assets of the two companies.

Experts believe Dell EMC is now the dominant provider of datacenter infrastructure. But it has formidable competition from Cisco, Hewlett Packard Enterprise (HPE), Lenovo, IBM and Hitachi Data Systems, and smaller players as well.

At Dell EMC World, company officials showcased the benefits of coming together with this month's launch of a new software-defined version of EMC's Data Domain backup and recovery software for its PowerEdge servers. The company claims a six-times increase in scalability when the software is added to the new Dell EMC PowerEdge servers.

But Michael Dell showed he clearly wants to provide greater linkage among the assets beyond the core EMC storage business, notably VMware, Pivotal, RSA and Virtustream. Evidence of that was clear with the announcement of plans to deliver an integrated security solution that will bring together the assets of the EMC, RSA and VMware AirWatch businesses and Dell SecureWorks. That's just one instance outlined at Dell EMC World.

Other examples include the launch of Dell EMC's VCE-based hyper-converged appliances built on Cisco's Unified Computing System (UCS), as well as its VxRack hyper-converged infrastructure, both of which are now available with Dell PowerEdge servers, and the new Dell EMC Elastic Cloud Storage (ECS) 3.0 object-storage platform for cloud-native environments. VxRack is based on the object store that runs the Virtustream public cloud for mission-critical applications, which EMC acquired last year, while the new Dell EMC Analytic Insights Module (AIM) is based on the Pivotal Cloud Foundry platform. In addition to bringing Dell PowerEdge as an option to the Cisco-powered VxRail, Dell EMC is offering an option with VMware's Horizon client virtualization platform in December. While Dell Technologies has a controlling interest in the independently run companies, Michael Dell talked of the benefits of this structure.

 "This unique structure allows us to be nimble and innovative like a startup, but the scale of a global powerhouse," he said in his Dell EMC World keynote address. "For you that means a technology partner that can be number one in everything, all in one place."

Commitment to Cisco and Microsoft
At the same time, Dell made sure to emphasize that the new company will continue to embrace its partners that, in the context of some of this next-generation infrastructure, are also competitors, notably Cisco and Microsoft.

Backing away from either company would not only alienate a formidable customer base committed to Cisco's UCS, the core component of Dell EMC's VxRack line, and to the Microsoft Windows Server, Hyper-V and Azure platforms, but would also undermine a substantial source of revenue that Dell needs to make this merger work, said Gina Longoria, an analyst at Moor Insights & Strategy. Longoria observed that Dell and company officials talked up the company's commitment to both Cisco and Microsoft, but like others, she agreed with my observation that the emphasis did appear to tip toward the Dell Technologies portfolio.

"I wasn't surprised how much focus they had on VMware but I was disappointed they didn't focus more on their Microsoft capabilities," she said. "I'm sure that's coming. Azure Stack is obviously not coming until next year, but hopefully next year they'll round that out a bit more. It's a little bit from a wait and see but I'd like to see a more balanced message.

Focus on Digital Transformation
David Goulden, who was CEO of EMC before the merger and now is president of the Dell EMC business unit, outlined how digital transformation will fuel growing demand for the hyper-converged products and new object storage platforms that will support containers and the building of next-generation cloud-native applications. While EMC had built its integrated systems with server and compute systems from Quanta before the merger, offering Dell's PowerEdge today has significant implications. For example, until now, with the Quanta server in VxRail, it was a four-node 2U system with an entry-level list price of $60,000. The new VxRail launched this month is available in a 1U rack unit and at a starting price of $45,000 (25 percent lower), said Bob Wambach, vice president of marketing for the Dell EMC VCE business. The company claims the new appliances, which feature the most current Intel Broadwell-based compute platforms, are available in 250 more configurations, offer 40 percent higher CPU performance and are available with double the storage in all-flash nodes. Similarly, the new Dell EMC VxRack System with the PowerEdge servers offers two and a half times more capacity and 40 percent greater performance, the company claims.

"There's a much wider range from entry level going down in starting configurations to significantly more processing power and performance in the larger boxes," Wambach said. "It represents a very dramatic change in scope of use cases we can address."

Bringing the PowerEdge server to its converged and hyper-converged platforms will play a critical role in Dell EMC's hybrid cloud ambitions, according to Krista Macomber, a senior analyst covering datacenter infrastructure at Technology Business Research. "Dell's legacy manufacturing prowess is another major benefit in positioning the VCE team to more quickly and cost-effectively deliver more customized hyper-converged appliances and ensure spare parts availability," she noted in a research note published last week.

While converged and hyper-converged system growth is outpacing that of traditional servers, it still is a small percentage of the market. At Dell EMC World, Goulden said the company recognizes even as the future points to hyper-converged infrastructure, the vast majority of its customers still prefer, or at least rely upon, the building block approach to engineering. "The data shows that over 80 percent of you today still want the servers and the storage building blocks to build your own," he said in his Dell EMC World keynote. "So don't worry, we are still fully committed to these building blocks."

Posted by Jeffrey Schwartz on 10/31/2016 at 3:13 PM0 comments


Assessing the Damage of Last Week's Powerful DDoS Attack

The damage from last week's distributed denial-of-service attack suggests it was the most powerful to date and it could be a precursor to an even more sustained attack. A bipartisan committee of senators formed over the summer wants answers, but some critics want the government to act more swiftly. The incident also puts a spotlight on the vulnerability of Internet of Things-based components, ranging from sensors on gas meters to IP-connected webcams and thermostats. There are currently 6.4 billion IoT-connected devices in use and that figure is expected to grow to 20 billion by the year 2020, according to Gartner's most recent forecast.

DNS provider Dyn was overwhelmed last week by the massive DDoS attack carried out by the Mirai botnet; it was one of a handful of DNS providers targeted. The operation brought down or interrupted hundreds of sites last Friday, including Amazon, Netflix, Reddit, Twitter and Spotify. It also brought down services that enterprises rely on, including Okta's single sign-on cloud service, Box, GitHub and Heroku.

The source is still not known. But according to an analysis by Flashpoint, it didn't have the characteristics of a nation-state attack. The action took advantage of security flaws in IoT-based components provided by China-based XiongMai, which responded this week by recalling millions of its devices. Dyn EVP of Products Scott Hilton on Thursday described the intensity of the attack in a company blog post, noting that while his team is still analyzing the data, he believes the botnet came from as many as 100,000 endpoints, and pointing to reports that packets were coming in at speeds of up to 1.2Tbps, though that remains unverified at this time.

"Early observations of the TCP attack volume from a few of our datacenters indicate packet-flow bursts 40 to 50 times higher than normal," he stated. "This magnitude does not take into account a significant portion of traffic that never reached Dyn due to our own mitigation efforts as well as the mitigation of upstream providers."

Hilton described it as a complex and sophisticated attack that used targeted and masked TCP and UDP traffic over port 53. It generated compounding recursive DNS retry traffic, he noted, which further intensified its impact. Hilton confirmed that the Mirai botnet was the primary source of the attack and that Dyn is working with authorities conducting criminal investigations.

In addition to law enforcement, Democratic U.S. Sen. Mark R. Warner, a member of the Senate Select Committee on Intelligence, who joined Republican Cory Gardner of Colorado over the summer in forming the bipartisan Senate Cybersecurity Caucus, wants answers and issued a statement calling for better protections. Warner called on three federal agencies -- the FCC, FTC and the Department of Homeland Security's National Cybersecurity & Communications Integration Center (NCCIC) -- to provide information on the tools available and needed to prevent attacks stemming from flaws in consumer devices and IoT components, including IP-based cameras, connected thermostats and other products that have connectivity. An FCC spokesman said the agency is still reviewing Warner's letter.

In his letter to FCC Chairman Tom Wheeler, Warner questioned what can be done about the fact that consumers aren't likely to change passwords on their IoT devices (if changing them is even an option). One implication was perhaps mandating improved software that enables automatic firmware updates. Warner also questioned the feasibility of enabling ISPs "to designate insecure network devices as 'insecure' and thereby deny them connections to their networks, including by refraining from assigning devices IP addresses? Would such practices require refactoring of router software, and if so, does this complicate the feasibility of such an approach?"

Morey Haber, VP of Technology at BeyondTrust, in a blog post earlier this week, called on Congress to come up with legislation that would put security requirements on all IoT devices. Haber believes the legislation should put the following requirements and restrictions on all IoT and Internet-connected devices:

  • Internet-connected devices should not ship with common default passwords
  • Default administrative passwords for each device should be randomized and unique per device (a minimal sketch of this appears after the list)
  • Changing of the default password is required before the device can be activated
  • The default password can only be restored by physically accessing the device
  • The devices cannot have any administrative backdoors or hidden accounts and passwords
  • The firmware (or the operating system) of the device must allow for updates
  • Critical security vulnerabilities identified on the device for at least three years after the last date of manufacture must be patched within 90 days of public disclosure
  • Devices that represent a security risk that are not mitigated or fail to meet the requirements above can be subject to a recall
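
As a minimal illustration of the randomized-password item above, a manufacturer's provisioning step could generate a distinct credential for each unit before it ships rather than burning in one shared default. The sketch below uses Python's secrets module; the device serial numbers are hypothetical:

import secrets
import string

ALPHABET = string.ascii_letters + string.digits

def provision_default_password(device_id: str, length: int = 16) -> dict:
    """Return a per-device record with a cryptographically random password."""
    password = "".join(secrets.choice(ALPHABET) for _ in range(length))
    # In practice the password would be printed on the device label and stored hashed.
    return {"device_id": device_id, "default_password": password}

for device_id in ("CAM-000123", "CAM-000124"):  # hypothetical serial numbers
    print(provision_default_password(device_id))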

Gartner analyst Tim Zimmerman last month called on IT organizations to address these proposed issues throughout all their infrastructure and software. Haber also believes the legislation is critical. "I think last Friday was just a test. I think it was just a huge warning," Haber said. "It was minuscule compared to what could have happened, and that could result in huge financial losses and other implications." While praising XiongMai's recall, Haber also warned that "unfortunately, this is just one device of many, from door bells to baby monitors, that have the same type of problem."

Posted by Jeffrey Schwartz on 10/28/2016 at 11:50 AM0 comments


Microsoft Reveals Windows 10 Creators Update and Surface Studio Desktop PC

Microsoft today laid out a new vision for the future of Windows, taking the OS into the realm of content and information creation and sharing.

Coming next spring, the new Windows 10 Creators Update will add support for 3D and virtual reality, Microsoft said. The company also took the wraps off new hardware designed to showcase the new Windows capabilities, entering the high-end all-in-one desktop market with its 28-inch Surface Studio and updating the Surface Book with twice the graphics processing power of the current top-end system and 30 percent more battery life.

While the forthcoming OS update and new hardware unveiled at an event for media, analysts and Windows Insiders in New York were interesting in their own right, the message was clearly about where Microsoft is taking Windows -- as a platform for not just consuming information and content but creating it as well. Microsoft's top executives emphasized that the free Windows 10 Creators Update will introduce new ways to create content and communicate better with people.

The Windows 10 Creators Update aims to bring 3D and "mixed reality" to mainstream use. The updated OS will ease the path for anyone to create holograms, while providing ties to Office, including the ability for individuals to bring together different communications channels such as e-mail, Skype and SMS.

Microsoft CEO Satya Nadella, during brief remarks at today's event, described these new features as key to bringing Windows into its new realm. The first wave of computing gave users the ability to become more productive, he noted. Over the past 10 years, advances in software, end-user computing devices and cloud services have introduced new ways for people to discover and consume information. Now Microsoft's goal with Windows is to enable new forms of content and information creation.

"I believe the next 10 years will be defined by technology that empowers profound creation," Nadella said. "At Microsoft, our mission is to empower every person and every organization on the planet to achieve more. We are the company that stands for the builders, the makers, the creators. That's who we are. Every choice we make is about finding that balance between consumption and creative expression. I am inspired by what I have seen in the Minecraft generation who see themselves as not players of a game but creators of new worlds they dream up -- the new forms of creativity and the expression we can unleash. This is what motivates us about Windows 10."

Windows 10 Creators Update
The ability to create and transform images and content in 3D will come in a tool most Windows users are quite familiar with: Paint. The new Windows Paint 3D will let users create or capture any image or photo, convert it into 3D and share those 3D images anywhere, including on social media. To enhance that capability, Microsoft announced it is teaming with Trimble, maker of the 3D modeling app SketchUp, which claims millions of creators and pieces of content in its 3D Warehouse site.

While 3D will be a core focus in the next Windows release, Microsoft also intends to extend it to Office apps. Megan Saunders, a general manager in Microsoft's emerging technologies incubation group, demonstrated the creation of a 3D image in PowerPoint. Microsoft also said new inking capabilities coming to Windows will extend to Word, as described in today's Office Blog post.

A new MyPeople feature will let users pin the most important people to the taskbar. Saunders also demonstrated sending an e-mail message to her husband, who could receive it via Skype or SMS (on Android or Windows phones). "Over the next year, you will see us integrate 3D across our most popular Microsoft applications," she said.

Terry Myerson, executive VP of Microsoft's Windows and devices group, said the new Windows 10 Creators Update will bring new capabilities and devices for a wide audience ranging from gamers to software developers, artists and engineers, as well as for everyday collaboration. "It is our mission to see that everyone can achieve their potential at work and play," he said.

In addition to new Xbox hardware, Microsoft is taking a key step toward bringing its HoloLens technology to the mainstream through its OEM partners. Myerson announced that Acer, Asus, Dell, HP and Lenovo will offer headsets enabled with the Windows 10 Creators Update that can create holograms, at prices starting at $299. The headsets from those companies will have sensors that offer "six degrees of freedom" and won't require any setup when moving between physical and virtual worlds, he said.

The New Surface Studio All-in-One Desktop Canvas
Just as Microsoft pushed into the laptop market last year with the launch of the Surface Book, the company is now entering the all-in-one desktop market. It took the wraps off the Surface Studio, a sleek system with a 28-inch collapsible 4.5K Ultra HD screen that the company claims produces 13.5 million pixels (63 percent more pixels than a high-end 4K TV). Microsoft Corporate VP for Devices Panos Panay introduced the new Surface Studio, which the company said transforms the function of a workstation into a "powerful digital canvas."

It produces 192 PPI, offers TrueColor DCI-P3 and initially comes in three configurations. The entry-level unit, equipped with an Intel Core i5 processor, 1TB of storage, 8GB of RAM and a 2GB GPU, costs $3,000. The mid-range system has an i7 processor, 16GB of RAM, a 2GB GPU and 1TB of storage and will cost $3,500. And the most powerful system, equipped with an i7, 16GB of RAM, a 4GB GPU and 2TB of storage, will set you back $4,200.

A New Peripheral for Creators: The Dial
"We want to transform the way you create and think about creating," Panay said. "It's built to pull you in, it is all fundamentally designed to immerse you into the content or the creation you want to work with."

One way Microsoft hopes to do that is with its new Dial, a peripheral shaped somewhat like a hockey puck, which Panay believes will provide a new way to navigate and interact with content. The new Dial is priced at $99 and will be available early next year. It will also work with the Surface Pro 3 and 4, the Surface Book and the new Surface Studio.

New Surface Books
Microsoft is also rolling out three new Surface Books, also based on Intel 6th Generation Core CPUs, which Panay said will sport double the graphics processing power and 30 percent more battery life than the original, bringing the total to 16 hours. The three new systems (specs are here) will be available next month, ranging in cost from $1,899 to $2,799.

Patrick Moorhead, president and principal analyst with Moor Insights & Strategy, noted Microsoft decided not to roll out new systems with Intel's new 7th Generation processor, code-named Kaby Lake, or USB-C Thunderbolt interfaces.

"They didn't take as may risks as they did last year," Moorhead said. "They were conservative in my opinion." That said, he expects the new hardware will be a hit. "They are going to sell a ton of them."

Posted by Jeffrey Schwartz on 10/26/2016 at 2:12 PM0 comments


Massive DDoS Attack Exploited IoT Vulnerabilities

It was only a matter of time before hackers would find a way to unleash a massive distributed denial-of-service (DDoS) attack by taking advantage of millions of unprotected endpoints on Internet-connected sensors and components in consumer devices such as webcams, according to security experts. Friday's botnet attack on Dyn, a major DNS provider based in Manchester, N.H., was what Chief Strategy Officer Kyle York said will likely be remembered "as an historic attack." It intermittently took down sites such as PayPal, Twitter, Netflix and Amazon, and it also impacted business-critical service providers, including cloud-based authentication provider Okta and various providers of electronic medical record systems.

The attacker and the motive for the attack are not immediately clear. But threat-assessment firm Flashpoint confirmed that the attackers unleashed botnets based on Mirai, malware that was used last month to bring down the popular Krebs on Security site run by cybersecurity expert Brian Krebs, as well as French hosting provider OVH. Flashpoint said it wasn't immediately clear if any of the attacks were linked to each other. The attackers unleashed a 620Gbps attack on Krebs' site, which he noted is many times the amount of traffic necessary to knock most sites offline.

The Mirai malware targets Internet of Things (IoT) devices ranging from routers and digital video recorders (DVRs) to the webcams in security cameras, according to a description of the attack published by Flashpoint, which also noted that a hacker going by the name of "Anna Senpai" released Mirai's source code online. Flashpoint has also confirmed that the botnet was specifically compromising flaws in DVRs and webcams manufactured by XiongMai Technologies, based in China. Flashpoint researchers told Krebs all of the electronics boards infected with Mirai share the default "username: root and password xc3511." Most concerning is that "while users could change the default credentials in the devices' Web-based administration panel, the password is hardcoded into the device firmware and the tools needed to disable it aren't present," Krebs noted. XiongMai today said it is recalling millions of its devices.

Security experts have long warned that such devices and other IoT-based sensors and components are vulnerable because they are not protected. Following the attack last month on the Krebs site, security expert Bruce Schneier warned in a blog post that it validated such fears. "What was new about the Krebs attack was both the massive scale and the particular devices the attackers recruited," Schneier wrote two weeks ago. "What this attack demonstrates is that the economics of the IoT mean that it will remain insecure unless government steps in to fix the problem. This is a market failure that can't get fixed on its own." Schneier last month suggested he had strong reason to believe these are nation-state attacks.

Morey Haber, vice president of technology at BeyondTrust, a provider of privileged identity and access management software, agrees. In an interview this morning, Haber said the government should require all Internet-connected hardware, including IoT sensors, to ship with firmware that allows passwords to be set and changed.

This attack could be just the tip of the iceberg, considering that only 10 percent of the Mirai nodes were actually involved in these attacks, said Dale Drew, CSO of Internet-backbone provider Level 3 Communications, in a brief video. "But we are seeing other ones involved as well," Drew said.

If that's the case, Haber said, it appears someone is trying to send a message. "What would 50 or 90 percent look like if all of the bots were turned on and used?" Haber asked. "That begs the question, was this a test, or was it a paid-for hire? If it really is only 10 percent, as recorded by Level 3, we could be in store for something a lot larger because we haven't torn down that network yet."

Level 3's Drew advises companies that believe the attacks are impacting their sites to stay in contact with their ISPs and to use multiple DNS providers.
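
One simple way to act on that multiple-provider advice is to periodically confirm that a domain still resolves through more than one public resolver. The sketch below uses the third-party dnspython package (resolve() in dnspython 2.x; older releases call it query()); the domain and resolver addresses are just examples:

import dns.resolver

RESOLVERS = {
    "Google Public DNS": "8.8.8.8",
    "OpenDNS": "208.67.222.222",
}

def check_resolution(domain: str) -> None:
    """Query the same domain through several resolvers and report the results."""
    for name, ip in RESOLVERS.items():
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [ip]
        resolver.lifetime = 5  # seconds before giving up
        try:
            answers = resolver.resolve(domain, "A")
            print(name, [record.address for record in answers])
        except Exception as exc:  # timeouts, SERVFAIL and the like
            print(name, "lookup failed:", exc)

check_resolution("example.com")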

Posted by Jeffrey Schwartz on 10/24/2016 at 1:27 PM0 comments


Steve Ballmer Says Microsoft Once Offered $24 Billion To Acquire Facebook

Microsoft tried to acquire Facebook in its early years for $24 billion, former CEO Steve Ballmer told CNBC today. The fact that he tried to buy Facebook nearly a decade ago is hardly a shocking revelation, given his appetite at the time to make Microsoft more relevant to consumers. Ballmer's comment, though, appears to be the first public acknowledgment of the company's interest in sealing such a deal.

"He said no, and I respect that," Ballmer told the hosts of CNBC's Squawk Box during a guest segment on the early morning program. Microsoft of course did settle for an early $240 million investment in Facebook in early 2007, a fraction of one percent of what Microsoft was willing to pay for the whole thing. It was a noteworthy move back then when Facebook was valued at $15 billion. Certainly Microsoft wasn't the only company circling its wagons around Facebook back then.

Putting aside whether it was a strategic fit for Microsoft, clearly Founder and CEO Mark Zuckerberg and the rest of Facebook made the right bet for themselves as the now publicly held company is valued at $375 billion. One of the CNBC hosts pointed out a bit of irony in the fact that Snapchat rebuffed Zuckerberg when he tried to buy the popular mobile messaging company for $3 billion. "I think trading that for some short-term gain isn't very interesting," Snapchat Founder Evan Spiegel told Forbes back in 2014, months after turning down the offer. In effect, Zuckerberg was denied in the same way he turned down Ballmer.

Asked about reports that he was recently looking into acquiring Twitter after buying a 4 percent stake in the company last year, Ballmer said, "I have never, ever, ever wanted to buy Twitter myself. I got a good life right now. I don't need to do that." Asked which company he believed would be the best fit to acquire Twitter, Ballmer said Google's search engine would be an ideal platform to surface tweets.

As CEO of Microsoft, Ballmer said, he never tried to acquire Salesforce.com. "I believe in revenue and profit," Ballmer said, in an apparent dig at his onetime nemesis Marc Benioff. "I could never make the math work on Salesforce." Apparently Ballmer's successor, Satya Nadella, saw things differently last year, when he reportedly tried to acquire Salesforce.com for $60 billion.

Posted by Jeffrey Schwartz on 10/21/2016 at 12:20 PM0 comments


Combined Dell-EMC-VMware To Deliver Integrated Endpoint Security Solutions

Dell will deliver a new platform that combines its existing endpoint security offerings with those from the newly acquired EMC and its RSA and VMware AirWatch businesses. The new endpoint security and management portfolio is the first major new offering resulting from Dell's $67 billion acquisition of EMC, which closed six weeks ago, representing the largest-ever merger of two IT infrastructure providers.

The new security offering is one of a significant number of announcements made at Dell EMC World, a two-day gathering in Austin, Texas, that kicked off this morning.

As with any large deal, many fail to see its potential benefits, but Michael Dell, the company's founder and chairman, showed no doubt during the opening keynote that the newly combined company will not only succeed but thrive.

"Today Dell is the largest enterprise systems company in the entire world," he said. Among the many new offerings released and in the pipelines revealed today, Dell pointed to the benefits of coming together to address the mounting cyberattacks and threats.

"Every time I sit down with our colleagues with RSA or SecureWorks and they take us through not what happened in the last quarter but just this week, it's very scary," Dell said during a press conference following the keynote session. "The nature of the attacks and the sophistication of the attacks is increasing."

The new suite will come out of Dell's client solutions group, which includes its PC offerings. "You have a package from us that meets the changing needs of the workforce," said Jeff Clarke, vice chairman of operations and president, client solutions group, during the keynote session announcing the new suite. "We now have the ability for endpoint security that's unmatched in the market."

Initially Dell will offer the new portfolio as a bundled suite, Clarke said, and over time the various products will interoperate with each other. Ultimately Dell will offer a single management console for the entire offering, Clarke said.

The company is looking at this platform as three key components: identity and authentication, data protection and unified endpoint management. The bundle will include:

  • Dell Data Protection-Endpoint Security Suite, which provides authentication, file-based encryption and advanced threat protection
  • MozyEnterprise and MozyPro, the company's cloud-based backup and recovery, file sync and encryption offering.
  • RSA SecurID Access, the multifactor, single sign-on authentication solution.
  • RSA NetWitness Endpoint, a tool designed to utilize behavioral analytics and machine learning to provide more rapid remediation to advanced threats.
  • VMware AirWatch, the company's mobile device management offering. With this release Dell said organizations can use Dell Data Protection with AirWatch to report on activity for compliance.

Dell believes companies are looking to streamline the number of security solutions in their organizations. Technology Business Research's new benchmark survey showed that large companies typically have 50 security products and smaller organizations have 10; TBR's latest research report shows companies would like to get those numbers down to 45 and 8, respectively. This is the second year in a row that customers are spending more of their new budgets on endpoint security than any other segment, said TBR analyst Jane Wright during a panel session she moderated at Dell EMC World yesterday.

"Customers are telling us that they're spending more money than ever on endpoint security," Wright said. "And it's not just that they want to protect those nice pretty endpoints, they're looking to protect the data that's on those endpoints or passing through those endpoints at any given time."

The increased spend isn't just on products, Wright added. "They are dedicating more people and creating more policies and procedures and testing than ever before."

Posted by Jeffrey Schwartz on 10/19/2016 at 12:48 PM0 comments


Lenovo Brings Its Unified Workspace to the Cloud

When you think of Lenovo, ThinkPads, Yogas and servers may come to mind -- but not digital workspace technology. The company is hoping to change that view over the next few years as Lenovo aims to extend its software and cloud solutions portfolio.

Lenovo last week announced its new Unified Workspace Cloud, a managed service based on its on-premises Unified Workspace technology. Like the on-premises platform, which it has offered since its acquisition of Stoneware four years ago, Unified Workspace Cloud is HTML5-based and uses RDP to deliver applications in a consistent manner on any PC or mobile device. Sal Patalano, Lenovo Software's chief revenue officer, said in an interview that unlike the digital workspace offerings Citrix and VMware have recently rolled out, Unified Workspace doesn't require an agent or plugins. "The big thing is being able to do it via browser. We don't deal with any desktop agents," Patalano said, adding it also doesn't require VPN connections. "The ability to negotiate and get into my corporate environment without having to deal with a VPN logon is huge."

The on-premises Unified Workspace front-ends a secure proxy, and users log in to it with their Active Directory credentials to access applications in a datacenter -- internally hosted, Web and SaaS apps. With the current on-premises version, the customer runs two servers: one to interact with Active Directory and connect into any of the internal private applications users need to access, said Dan Ver Wolf, a Lenovo senior sales engineer, and a second deployed in the DMZ functioning as a relay. "Users that are remote, using personal devices, whatever it might be, access everything through that external relay, so they get secure access, remain physically separated from the datacenter, but still get access to internal resources," Ver Wolf said.

The new hosted offering takes a similar approach, though it's a managed service hosted on Amazon Web Services and administered by Lenovo's professional services team. "When a customer wants a new application added to the service, they just call to have it deployed," he said.

In addition to no longer requiring the infrastructure associated with the current version, the cloud service is half the price. The MSRP for the on-premises offering is $50 per user per month for single-user access and $100 per user per month for concurrent access, versus $25 and $50, respectively, for the new cloud offering. Granted, no one pays MSRP, and pricing will vary based on the number of employees.

Lenovo also announced it has inked a partnership with Nimble Storage, a rapidly growing provider of flash storage systems. The two companies will look to deliver a "self-healing" converged solution with Nimble's InfoSight. Lenovo said the first product based on that solution, the ThinkAgile CX Series, is set for release at the end of the month.

Posted by Jeffrey Schwartz on 10/17/2016 at 11:44 AM0 comments


Windows Server 2016 Arrives Ready to Rev Docker Engine

The release of Windows Server 2016 this week is a major upgrade to Microsoft's venerable server OS, thanks to a number of significant new features. But it could be argued that the most distinct new capability is its support for containers. That's important because with containers, Windows Server 2016 will be able to run applications and workloads not built to run on Windows, notably Linux, as well as those designed to run in cloud environments.

In addition to supporting Windows and Hyper-V containers -- initially via the runtime environment of the Docker open source container platform -- Windows Server 2016 will include a commercial version of the Docker Engine.

At last month's Ignite conference in Atlanta, Docker and Microsoft said they have extended their partnership, inked more than two years ago, in which a commercially supported version of the Docker engine will be included with Windows Server 2016 at no extra cost.

"This makes it incredibly easy for developers and IT administrators to leverage container-based deployments using Windows Server 2016," Microsoft Executive VP for Cloud and Enterprise Scott Guthrie said in the Ignite keynote.

I had a chance to speak with Docker COO Scott Johnson at Ignite, where he described the next phase of the two companies' relationship, the details of the new arrangement and how the company hopes to widen the reach of Docker containers to the Windows world.

With regard to this new arrangement, does that mean Docker Engine is built into Windows Server 2016?
If you buy Windows Server 2016, you have access to the Docker Engine, which, behind the scenes, will be downloaded from Docker. The user will have the option to just activate Docker and it will appear in front of them.

Are you both providing joint support?
The deal has three legs. First is the commercially supported Docker Engine, the second is commercial support and that's provided by Microsoft, backed by Docker. And the third leg is with our Docker Datacenter product, which helps IT organizations manage these containerized workloads. So that will be jointly promoted by Microsoft and Docker to the Windows Server user base.

Where do you see customers using Docker Datacenter?
What we see is they'll start with the Docker Engine. They will play with a couple of containers, get them fired up. But once IT operations gets a sense that this is a real application architecture, IT operations will ask, "How do I manage all of these containers? How do I move them from lab to production? How do I move from datacenter to cloud?" Docker Datacenter is the management tooling that helps them do that. So it's the management tools on top of the runtime.

Will it work with Microsoft's System Center?
It pairs well with System Center and OMS [Operations Management Suite] in that you can think of them as managing the infrastructure layer. So they're managing the hardware and the hypervisors, and Docker Datacenter is managing the applications in the containers on top of the infrastructure. Microsoft actually produced an OMS monitoring agent for Docker already. So there's good integration happening already.

How do the Windows Containers fit into Docker containers? Meaning, what is the relationship between them?
The Windows kernel has the container primitives and the Docker Engine takes advantage of those primitives. So when Microsoft says Windows Containers or Hyper-V containers, that's synonymous with Docker Engine containers. The way you take advantage of Windows containers is using the Docker Engine interface. They're part and parcel of the same thing.
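
Johnson's point that the Docker Engine interface is how you drive Windows containers can be illustrated with the Docker SDK for Python. This is a minimal sketch, assuming the Docker Engine is running on a Windows Server 2016 host and that the microsoft/windowsservercore base image has already been pulled; the command is just a placeholder:

import docker

client = docker.from_env()  # connects to the local Docker Engine

# Run a short-lived Windows Server Core container and capture its output.
output = client.containers.run(
    image="microsoft/windowsservercore",
    command='cmd /S /C "echo Hello from a Windows container"',
    remove=True,  # clean up the container when it exits
)
print(output.decode().strip())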

Are you anticipating a lot of Windows Server shops will go this route?
What we've seen with the tech previews is that there's actually quite a bit of pickup, even in a raw technology preview stage, of Windows shops doing a lift-and-shift-type motion with their .NET apps. So they will take an existing app, pick it up off the host, off the VMs, put it over into a Docker container and right away they're able to iterate faster in their CI [continuous integration] efforts. They have a build artifact that they can move from developer to developer. So with Tyco's use case, they're using Docker containers on Windows Server to do a lift and shift and bring a DevOps process to their Windows development, which a couple of years ago would have made people's heads explode. But you're seeing those two worlds come together, largely facilitated by the Docker containers.
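
As a rough illustration of that lift-and-shift motion (a hypothetical sketch, not Tyco's pipeline), the same Python SDK can build an image from an existing app folder and run the result, which becomes the build artifact the CI process passes around:

```python
# A hypothetical sketch: ./legacy-webapp and its Dockerfile (for example, one based
# on a Windows ASP.NET image) are assumptions, not a real project.
import docker

client = docker.from_env()

# Build an image from the existing app folder; the tag is the CI build artifact.
image, build_logs = client.images.build(path="./legacy-webapp", tag="legacy-webapp:ci-1")
for chunk in build_logs:
    print(chunk)

# Run the freshly built artifact the same way in dev, test or production.
container = client.containers.run("legacy-webapp:ci-1", detach=True, ports={"80/tcp": 8080})
print("running:", container.short_id)
```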

What apps lend themselves best for this lift and shift?
Web apps work very well. Mobile apps work very well.

There's this question of whether containers will replace the VM. Does this lend credibility to that argument?
I think that's a long-term discussion. VMs have a hardware isolation level that is built in with 10 years of development. The automation and tooling and security signoffs have all been built around VMs, and the entire army of VMware and Hyper-V admins have built their careers on VMs. So they're not going away anytime soon. And VMs and containers are actually very complementary, because containers are OS virtualization and VMs are hardware virtualization. So they're different layers of the stack. Today they're not one to one, where one replaces the other. In the future is that how it rolls forward? We'll have to wait and see, but that's not how we're positioning the main benefit today.

But it is an option when you talk about trying to reduce the footprint?
What we see happening is customers using a single VM with Docker and multiple containers in that VM, so they get the isolation benefits and the automation benefits that they've already invested in. They also get the density benefits of multiple containers with multiple apps inside a single VM. You can get the best of both worlds by doing that and still take your footprint down, while still having the security and automation tooling.

Posted by Jeffrey Schwartz on 10/14/2016 at 11:44 AM0 comments


IT Skills and Compensation Expert To Keynote at Live! 360 Conference

I'm thrilled to share that David Foote, chief analyst with Foote Partners and a prominent expert in IT skills, certification and salary benchmarks, will be the keynote speaker at this year's Live! 360 conference, where he'll reveal some important trends and pay data and discuss the importance of choosing a set of IT skills amid digital transformation efforts that are taking priority in a growing number of organizations.

Live! 360, which is produced by this publication's conference group, is an annual gathering of IT professionals and developers, scheduled to kick off Dec. 5 in Orlando, Fla. It brings together a number of conferences we produce, including TechMentor, SharePoint Live!, SQL Live!, Visual Studio Live!, Modern Apps and App Dev Trends.

Foote is a renowned expert on IT skills and compensation, and for two decades his firm has produced deep research and analysis on technology, economic and, most notably, compensation data covering hundreds of technology skills and disciplines. I've known Foote for many years; he plans to discuss how IT skills are valued, taking into account shifts in technology and business requirements, and will present key data and some forecasts. Amazon, IBM, Google, Oracle, Red Hat and Microsoft, among others, often talk up how businesses and the public sector are looking to become more agile. And with the availability of new technologies and methodologies ranging from cloud, mobile, DevOps and automation to the rapid acceleration of real-time analytics, machine learning and the Internet of Things, demand for IT professionals and developers with new skills is growing.

Digital transformation is a key theme the CEOs of all the major tech providers are talking up these days. According to a Gartner survey earlier this year, 50 percent of CEOs and senior business executives say digital business transformation is on their agenda. Likewise, organizations are struggling to find people with modern cybersecurity skills to address current threats and those introduced by these new technologies. Amid these changes, IT professionals will need to carefully consider their approach to maintaining their skills to ensure they can maximize their earning potential, he says.

"If you are looking for job opportunities and greater pay, it's important to maintain cross-skilling," Foote said. "The whole ocean is rising." These digital transformation initiatives are changing and creating demand for in a variety of new areas such as UX designers, digital product designers, digital modelers and digital analytics managers, with expertise in a variety of platforms, form factors, development and infrastructure environments.

During the keynote, Foote will field questions from audience members about how they should consider managing their careers and how organizations are placing premiums on various skills and certifications. It should be an interesting opportunity if you're looking to make sure you're aligning your skills with where business and technology are headed. I hope to see you there.

Posted by Jeffrey Schwartz on 10/11/2016 at 1:49 PM0 comments


AWS and Microsoft Azure Gain Native IPv6 Connectivity

Amazon Web Services and Microsoft have each brought native IPv6 connectivity to their respective cloud services, as the need for the new addresses continues to rise amid new software that requires them and a dwindling pool of addresses based on the original IPv4 standard.

AWS first added IPv6 to its S3 storage offering back in August and last week extended it to other services. Microsoft announced that Azure now supports native IPv6 addresses at last month's Ignite conference in Atlanta, though the move was overshadowed by the news that the company had quietly upgraded all of its network nodes with field-programmable gate arrays (FPGAs), providing 25Gbps throughput, a substantial boost. Internet service and cloud providers have been under pressure for years to upgrade their networks to support IPv6, and demand is now being fueled by compliance requirements and a proliferation of new devices.

"The demand for IPv6 has never been greater with the explosive growth in mobile devices and billions of Internet of Things (IOT) devices entering the market," wrote Yousef Khalidi Microsoft's corporate VP for Azure networking, in a blog post late last month announcing the IPv6 availability in Azure. Khalidi also discussed other network enhancements including the accelerated support powered by the FPGAs, cross-premises Vnet-to-Vnet connectivity via the new high-availability Azure VPN gateways and the release of Azure DNS, which Khalidi said now lets customers host domains and manage DNS records using the same credentials, APIs, tools, billing and support as other Azure services.

Khalidi noted that Microsoft has used IPv6 for Microsoft services for more than three years. The new native connectivity in Azure is available for connectivity to both Windows and Linux virtual machines. In separate posts, Microsoft has outlined how to connect IPv6 network endpoints to the Azure Load Balancer either by using a template, using PowerShell or using Azure Resource Manager with the Azure CLI.
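
For those who prefer code to the documented template, PowerShell and CLI routes, the following is a minimal sketch using the Azure SDK for Python to request an IPv6 public IP address of the kind that can be attached to a Load Balancer front end. The resource names are placeholders, and method names vary somewhat across SDK versions.

```python
# A minimal sketch, assuming the azure-identity and azure-mgmt-network packages.
# Resource group, names and region are placeholders; exact method names differ
# slightly between SDK versions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

subscription_id = "<subscription-id>"
client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.public_ip_addresses.begin_create_or_update(
    "example-rg",
    "example-ipv6-pip",
    {
        "location": "eastus",
        "public_ip_address_version": "IPv6",      # request an IPv6 rather than IPv4 address
        "public_ip_allocation_method": "Dynamic",
    },
)
result = poller.result()
print(result.name, result.public_ip_address_version)
```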

The IPv6 connectivity is currently available in most regions with the exceptions of parts of Australia, the U.K., Germany, China and the U.S. government cloud. Current availability is posted here.

The AWS support for IPv6 announced in August covered access to content in its S3 storage offering with all features except for Web site hosting and access via BitTorrent.

At the time, it also didn't cover transfer acceleration, though that capability was added in last week's expanded set of services supporting IPv6. The update also extends IPv6 support to the CloudFront content delivery network (CDN) at all of AWS' 60-plus edge locations and to its Web application firewall (WAF) service. In last week's announcement, AWS Cloud Evangelist Jeff Barr explained how to implement the respective APIs.
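
Barr's posts walk through the specifics for each service; as a general illustration (not taken from his examples), the boto3 SDK reaches S3 over the dual-stack IPv4/IPv6 endpoints by flipping a single config option. The bucket name below is a placeholder.

```python
# A minimal sketch, assuming boto3 is installed and AWS credentials are configured:
# accessing S3 over its dual-stack (IPv4/IPv6) endpoints via use_dualstack_endpoint.
import boto3
from botocore.config import Config

s3 = boto3.client(
    "s3",
    region_name="us-east-1",
    config=Config(s3={"use_dualstack_endpoint": True}),  # routes requests to the dual-stack endpoint
)

# List a few objects in a hypothetical bucket over the dual-stack endpoint.
resp = s3.list_objects_v2(Bucket="example-bucket", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```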

Posted by Jeffrey Schwartz on 10/10/2016 at 5:12 PM0 comments


Microsoft and Google Are Taking Different Cloud-Based AI Paths

Artificial Intelligence is the hottest term in IT these days as players of all sizes talk up their latest AI projects. Google this week strutted its latest advances in AI by launching Google Home, a device similar to the Amazon Echo that taps the Google search engine and has its own personal assistant. The company also jumped back into the smartphone market with its new Pixel. The slick new Android phone is the company's first with the personal assistant built in. Google's latest AI-based wares come on the heels of last week's Ignite conference in Atlanta, where Microsoft CEO Satya Nadella talked up how the company has deployed field-programmable gate arrays in every node of its Azure cloud to enable it to process the tasks of a supercomputer, accelerating its machine learning, AI and Intelligent Bots framework.

Just as Microsoft is building out Azure to power the AI-based capabilities it aims to deliver, the Google Cloud Platform will do the same for the search giant. Google has the largest bench of AI scientists, with Microsoft and China's Baidu, which runs its own search engine and cloud, a close second, said Karl Freund, a senior analyst at Moor Insights & Strategy. Microsoft last week said it has formed an AI group staffed with 5,000 engineers.

Freund explained in a blog post published by Forbes that Microsoft's stealth deployment of field programmable gate arrays in Azure over the past few years is a big deal and will likely be an approach that other large cloud providers looking to boost the machine learning capabilities of their platforms consider, if not already under way.

Microsoft Research networking expert Doug Burger, who joined Nadella on stage during the Ignite keynote, revealed the deployment of FPGAs and GPUs in every Azure node, providing what he described as "ExaScale" throughput, meaning it can run on the order of a billion billion operations per second. That means Azure has "10 times the AI capability of the world's largest existing supercomputer," Burger claimed, noting that "a single [FPGA] board turbo charges the server, allowing it to recognize the images significantly faster." From a network throughput standpoint, Azure can now support network speeds of up to 25Gbps, faster than anyone else has claimed to date, he said.

Freund said in an interview that he was aware of Microsoft's intense interest in FPGAs five years ago when Microsoft Research quietly described Project Catapult, outlining a five-year proposal of deploying the accelerators throughout Azure. The company first disclosed its work with FPGAs two years ago when Microsoft Research published a research paper describing its Project Catapult deployment of the fabric on 1,632 servers to accelerate the Bing search engine.

Still, it was a surprise that Microsoft actually moved forward with the deployment, Freund said. Freund also emphasized how Microsoft's choice of deploying FPGAs contrasts with how Google is building AI into its cloud using non-programmable ASICs. Google's fixed-function chip is called the TPU, or tensor processing unit, and is designed around TensorFlow, Google's machine learning library and dataflow graph framework for processing complex mathematical calculations. Google developed TensorFlow and contributed it to the open source community. Google revealed back in May that it had started running the TPUs in its cloud more than a year ago.
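
For readers unfamiliar with what "TensorFlow libraries and graphs" means in practice, here is a minimal sketch in the TensorFlow 1.x style of that era: the math is declared as a graph up front, and the runtime then maps that graph onto whatever accelerator is available, whether CPU, GPU or TPU. The tiny single-neuron model is purely illustrative.

```python
# A minimal sketch of TensorFlow's dataflow-graph model, written against the
# TensorFlow 1.x API of that era; the single-neuron "network" is illustrative only.
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None, 4), name="features")
w = tf.Variable(tf.random_normal([4, 1]), name="weights")
b = tf.Variable(tf.zeros([1]), name="bias")
y = tf.sigmoid(tf.matmul(x, w) + b)   # the computation is described as a graph, not executed yet

with tf.Session() as sess:            # the session binds the graph to an available device
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(3, 4).astype("float32")
    print(sess.run(y, feed_dict={x: batch}))
```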

The key difference between Google and Microsoft's approach to powering their respective clouds with AI-based computational and network power is that FPGAs are programmable and Google's TPUs, because they're ASIC-based, are not. "Microsoft will be able to react more readily. They can reprogram their FPGAs once a month because they're field-programmable, meaning they can change the gates without replacing the chip, whereas you can't reprogram a TPU -- you can't change the silicon," he said. Consequently, Google will have to swap out the processors in every node of its cloud, he explained.

The advantage Google has over Microsoft is that its TPUs are substantially faster -- potentially ten times faster -- than today's FPGAs, Freund said. "They're going to get more throughput," he said. "If you're the scale of Google, which is a scale few of us can even comprehend, you need a lot of throughput. You need literally hundreds or thousands or even millions of simultaneous transactions accessing these trained neural networks. So they have a total throughput performance advantage versus anyone using an FPGA. The problem is if a new algorithm comes along that allows them to double the performance, they have to change the silicon and they're going to be, you could argue, late to adopt those advances."

So who has an advantage: Microsoft with its ability to easily reprogram their FPGAs or Google using its faster TPUs? "Time will tell who has the right strategy but my intuition says they are both right and there is going to be plenty of room for both approaches, even within a given datacenter."

And speaking of the datacenter, while Microsoft didn't say so outright, officials and partners acknowledged the potential to deploy FPGAs in Azure Stack hardware at some point after its release, scheduled for next summer. Indeed, many organizations with large SANs already use FPGAs to boost connectivity, and modern network infrastructure incorporates them as well.

As for Amazon Web Services, the technology it uses for its EC2 cloud is a well-guarded secret. Back in June, AWS launched its Elastic Network Adapter (ENA), boosting its network speed from 10Gbps to 20Gbps. While Amazon isn't revealing the underlying hardware behind its ENAs, Freund said it's reasonable to presume the company's 2015 acquisition of Israeli chip maker Annapurna Labs is playing a role in boosting EC2. Annapurna was said to be developing system-on-a-chip-based programmable ARM network adapters, he said.

Baidu has already taken a huge jump into deploying FPGAs as well, Freund said, and Microsoft's push lends credibility to Intel's $16.7 billion acquisition of FPGA leader Altera last year and its bold prediction that 30 percent of all servers will have FPGAs by 2020. "Intel certainly saw the writing on the wall," he said. "They realized this would be a threat to their business if they didn't have a place in it."

Posted by Jeffrey Schwartz on 10/07/2016 at 1:30 PM0 comments


What's New in Azure Stack Technical Preview 2

Update 10/10: An earlier version of this report stated that a third technical preview will have multi-node support. While there will likely be a TP3 prior to Azure Stack's general release, Microsoft said it won't have multi-node support. Also the statements were inadvertently attributed to Corey Sanders, who we met with at Ignite, but it was Mike Schutz, general manager of Microsoft's Cloud and Enterprise division, who explained the future Azure Stack preview plans.

 

Nearly eight months after issuing the first technical preview of Azure Stack, Microsoft last week released an update to the test version of the software that will ultimately let enterprises and hosting providers replicate the Azure public cloud within their own datacenters -- albeit on a smaller scale.

The second technical preview, commonly referred to by the company and testers alike as TP2, introduces a number of new features and services, covering the entire stack: the Azure Portal, security, compute, network and storage. TP2 also offers added monitoring capabilities within the Azure Portal and, for hosting providers, support for billing and usage monitoring.

The first new feature pointed out by Mike Schutz, general manager of Microsoft's Cloud and Enterprise division, during a meeting at the Ignite conference in Atlanta, was the ability to use Microsoft's Key Vault for providing secure management of keys and passwords. "It helps from a key management perspective and security perspective," Schutz said.
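
Because Azure Stack is meant to expose Azure-consistent APIs, working with Key Vault should look much like it does in public Azure. The following is a minimal sketch using the azure-keyvault-secrets Python package; the vault URL and secret name are placeholders, and this is not sample code from Microsoft's TP2 documentation.

```python
# A minimal sketch, assuming the azure-identity and azure-keyvault-secrets packages.
# The vault URL and secret name are placeholders, not values from the TP2 docs.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://example-vault.vault.azure.net",  # an Azure Stack vault would use its own endpoint
    credential=DefaultAzureCredential(),
)

client.set_secret("sql-admin-password", "P@ssw0rd-placeholder")   # store a password
secret = client.get_secret("sql-admin-password")                  # retrieve it later
print(secret.name, "retrieved")
```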

Testers can now also utilize application queuing and federated access to the Azure Marketplace, with the ability to deploy select solutions, as demonstrated in a session by Jason Zander that is now available for viewing on-demand.

Microsoft outlined a list of the new Azure Stack features in an online document. The noteworthy features and services in TP2, in short, include:

  • Network: Support for iDNS, which enables internal network name registration and Domain Name System (DNS) resolution without additional DNS infrastructure. This is important for the handling of external DNS names, while also letting admins register internal virtual network names. "By doing so, you can resolve VMs on the same virtual network by name rather than IP address, without having to provide custom DNS server entries," wrote Microsoft program manager Scott Napolitan in a blog post. Other new network capabilities in the preview include "User Defined Routes," enabling the routing of network traffic through firewalls, security appliances and other services, and the ability to provision network resources from the Azure Marketplace.
  • Storage: Premium Storage API account support, the ability to create shared access signatures in storage accounts, the ability to create Append Block operations within storage blobs and tenant storage service support for common tools and SDKs, including Microsoft's Azure CLI, PowerShell and .NET, as well as Python and the Java SDK (see the sketch following this list).
  • Compute: The ability to de-allocate virtual machines (VMs), resize VM disks and attach multiple network interfaces to a VM, while allowing administrators to redeploy VM extensions when configuring or troubleshooting.
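
The storage additions map onto the standard Azure storage SDKs. Below is a minimal sketch against the azure-storage-blob Python package (the v12-style API, newer than the SDKs of the TP2 era) showing an Append Blob with Append Block operations plus a shared access signature; the connection string, account key, container and blob names are placeholders.

```python
# A minimal sketch, assuming the azure-storage-blob package (v12-style API).
# The connection string, account key, container and blob names are placeholders.
from datetime import datetime, timedelta
from azure.storage.blob import BlobServiceClient, BlobSasPermissions, generate_blob_sas

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = service.get_blob_client(container="logs", blob="events.log")

blob.create_append_blob()               # create the Append Blob once
blob.append_block(b"first event\n")     # each call is an Append Block operation
blob.append_block(b"second event\n")

# A read-only shared access signature scoped to this one blob, valid for an hour.
sas_token = generate_blob_sas(
    account_name=service.account_name,
    container_name="logs",
    blob_name="events.log",
    account_key="<storage-account-key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)
print(sas_token)
```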

Similar to the first preview, TP2 is restricted to running on a single machine and one Active Directory node and is intended for proof-of-concept evaluation, not scale. Schutz said Microsoft will offer a third technical preview at an unspecified time, though that public preview will only be available for single-node configurations. (Note: an earlier version of this report said the third preview will be a multi-node release but the company said that's not the plan.) 

Schutz also defended Microsoft's controversial decision to release Azure Stack next summer initially only on dedicated systems provided by partners Dell, Hewlett Packard Enterprise (HPE) and Lenovo, though the company indicated that other options will become available in the future. The three companies displayed prototypes of 4- and 8-node systems designed for TP2 at Ignite, as outlined in a separate post, "Azure Stack Prototypes Debut at Ignite."

"What we've found is most private cloud deployments fail because of how complex it is to bring together cloud hardware and cloud software," Schutz said. "So we're still very focused on delivering Azure Stack with integrated systems and over time we'll evaluate how broad we can go in terms of other deployment models, but right now we're very focused on the integrated-systems approach."

Posted by Jeffrey Schwartz on 10/03/2016 at 2:08 PM0 comments


Azure Stack Prototypes from Dell, HPE and Lenovo Debut at Ignite

Thousands of IT pros and developers have now had their chance to see what the first Azure Stack systems might look like when the Microsoft public cloud platform is made available for datacenters and hosting facilities next summer.

Dell, Hewlett Packard Enterprise (HPE) and Lenovo each had prototypes spec'd for Azure Stack on display at last week's Microsoft Ignite conference in Atlanta, where the company also announced it has released the second technical preview of the software.

Attendees crowded around the various displays of the prototypes on the Ignite exhibit floor, while technical sessions on the topic were standing-room-only. For its part, HPE disclosed it will offer 4-node and 8-node Azure Stack systems built on its popular DL380 servers with up to 14 cores per node. The company will offer usage-based pricing, said Ken Won, HPE's director of integrated cloud solutions.

Of the three suppliers, only Won would offer a hint of what the systems might cost -- likely $250,000 to $300,000, though he described that as a "guesstimate." Those going with usage-based pricing plans can get one bill from HPE for their datacenter and Microsoft Azure public cloud consumption, Won said.

HPE provides its Operations Bridge multi-cloud and datacenter management suite to manage Azure Stack, and it will work with its own ArcSight SIEM and security analytics platform. Won said ArcSight, though not part of Microsoft's Azure Security Center marketplace, is fully compatible with Azure, which HPE CEO Meg Whitman has described as the company's preferred public cloud.

"We can pull data, log information, out of Azure to the extent that they publish it, and use it as part of the ArcSight solution to look at all of the data you need to look at to identify any security issues," Won said.

At the Lenovo booth, the company was showcasing its Azure Stack prototype, as well. Lenovo previewed its converged 8-node system running 22 cores per Intel Broadwell-based CPU.

The Azure Stack systems were popular among those visiting the booth, said Lenovo advisory engineer Michael Miller, noting that the vast majority of those inquiring about the systems represented enterprises of all sizes, primarily from Europe, a region with strict data sovereignty regulations.

"We've talked to a lot of developers who use Azure who are interested in doing their work on-premises and then be able to move it into the public cloud," Miller said. "There's been a lot of interest from health care companies [and] actually some interest from government agencies and at least two transportation agencies. A lot of people want to know pricing, and we've told them that hasn't been worked out, that the system is still in development."

Jim Ganthier, vice president and general manager of engineered solutions for the HPC and cloud units at Dell-EMC, said his company will be "on stage when Microsoft releases Azure Stack," but steered the conversation toward the launch of its new SQL Server 2016 and Exchange Server integrated solutions, only saying that Dell doesn't talk about unannounced products -- though it, too, had a prototype on display at Ignite.

Posted by Jeffrey Schwartz on 10/03/2016 at 9:06 AM0 comments


Microsoft Turbocharges Azure Cloud for AI Supercomputing with SDN-Based FPGAs and GPUs

Microsoft this week made the interesting revelation that it has quietly upgraded every node in its Azure public cloud with software-defined network (SDN) infrastructure, developed using field-programmable gate arrays (FPGAs).

Described as a massive global SDN upgrade, it means that the Microsoft Azure public cloud fabric is now built on a 25 gigabit-per-second backbone -- up from 10Gbps -- with a 10x reduction in latency, which Microsoft believes translates to Azure having the highest-speed network among cloud services. Combined with new GPU nodes, recently made available in the Azure Portal, Microsoft also claims its cloud can function as the world's fastest supercomputer, capable of running artificial intelligence, cognitive computing and even neural network-based applications.

The stealth upgrade started two years ago when Microsoft began installing the FPGAs -- effectively SDN-based processors from Altera, now a part of Intel. At this week's Ignite conference taking place in Atlanta, Microsoft revealed the Azure infrastructure and network upgrade. Microsoft CEO Satya Nadella demonstrated some of the AI supercomputing capabilities the newly bolstered Azure is capable of during his keynote session late Monday.

"We have the ability, through the magic of the fabric that we've built to distribute your machine learning tasks and your deep neural nets to all of the silicon that is available so that you can get performance that scales," Nadella said.

Doug Burger, a networking expert from Microsoft Research, joined Nadella on stage to describe why Microsoft made a significant investment in the FPGAs and SDN architecture. "FPGAs are programmable hardware," Burger explained. "What that means is that you get the efficiency of hardware, but you also get flexibility because you can change their functionality on the fly. And this new architecture that we've built effectively embeds an FPGA-based AI supercomputer into our global hyper-scale cloud. We get awesome speed, scale and efficiency. It will change what's possible for AI."

Burger said Microsoft is using a special type of neural network called a 'convolutional neural net', which can recognize the content within a collection of images.  Adding a 30-watt FPGA to a server turbocharges it, allowing the CPU to recognize images significantly faster. "It gives the server a huge boost for AI tasks," he said.

Showing a more complex task, Burger demonstrated how adding 4 FPGA boards to a high-end 24-CPU-core configuration can translate the 1,400-page book War and Peace from Russian to English in 2.5 seconds. "Our accelerated cognitive services run blazingly fast," he said. "Even more importantly, we can now do accelerated AI on a global scale, at hyper-scale."

Applying 50 FPGA boards to 50 nodes, the AI-based cloud supercomputer can translate 5 billion words into another language in less than a tenth of a second, according to Burger, amounting to 100 trillion operations per second. "That crazy speed shows the raw power of what we've deployed in our intelligent cloud," he said.
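
Working backward from Burger's own figures, which is simple arithmetic on the demo numbers rather than anything Microsoft has published about the implementation, the claim implies roughly 2,000 operations per translated word:

```python
# Simple arithmetic on Burger's quoted demo numbers; the per-word figure is derived,
# not something Microsoft has published.
words = 5_000_000_000      # 5 billion words translated
seconds = 0.1              # "less than a tenth of a second"
ops_per_second = 100e12    # 100 trillion operations per second

words_per_second = words / seconds
ops_per_word = ops_per_second / words_per_second
print(f"{words_per_second:.1e} words/s, roughly {ops_per_word:.0f} ops per word")
# prints: 5.0e+10 words/s, roughly 2000 ops per word
```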

In an interview, Burger described the deployment of this new network infrastructure in Azure as a major milestone and differentiator for the company's public cloud. "This architecture is disruptive," Burger said, noting it's also deployed in the fabric of the Bing search engine. "So when you do a Bing search, you're actually touching this new fabric."

Posted by Jeffrey Schwartz on 09/28/2016 at 11:46 AM0 comments


Adobe Taps Azure To Run its Entire SaaS Portfolio

Microsoft kicked off its annual Ignite conference for IT pros and developers by announcing that Adobe's software-as-a-service offerings will run primarily in the Azure public cloud. Adobe President and CEO Shantanu Narayen joined Microsoft's CEO Satya Nadella on stage in opening moments of the kickoff keynote session of Ignite, taking place all week in Atlanta.

The two announced the extension of a partnership kicked off two years ago when the two companies worked together to optimize Adobe's apps for Windows 10 and Microsoft's Surface PC tablet. In addition to running the Adobe Creative Cloud, Adobe Document Cloud and Adobe Marketing Cloud on Azure, the companies' new partnership includes optimizing the latter for the new Microsoft Dynamics 365 suite.

"We think there is an opportunity out of the box to provide integration for all of our joint customers to have one integrated sales and marketing service, Narayen said. For Microsoft, Adobe's decision to run its massive SaaS infrastructure on Azure is the latest major endorsement of Redmond's public cloud. Among other key customers using Azure include General Electric, Renault-Nissan, Tyco and Boeing, among others

"This coming together of the intelligent cloud with transformative SaaS applications with creativity and marketing is a massive milestone," Nadella said. This latest pact between the two companies will lead to more integrated offerings tying Office 365 and Dynamics 365 to Adobe's respective offerings, though the two companies haven't elaborated on that point.

Posted by Jeffrey Schwartz on 09/26/2016 at 12:03 PM0 comments


No Fix in Sight for Surface Pro 3 Battery Drain

Surface Pro 3 users may have to wait indefinitely for a fix to a battery-drain problem that has left devices with even worse battery life than they had before a patch was released nearly a month ago, Microsoft is now indicating.

In a Microsoft forum post yesterday, a company official and moderator going by the name of "Greg" claimed that the Aug. 29 patch is not the cause of the degraded performance in its Surface Pro 3 systems and said the battery drain issue is only affecting "a limited number" of users. Yet that conflicts with the view of some users. "I am not understanding [Microsoft's] stance that the update did not cause the issues we are having," Jeremy Bronson stated in the forum. "Thousands of people did the updates and now have the problem."

The Surface Pro 3 I have can barely run for three hours, and that's in power-save mode with the brightness at only 25 percent. Moreover, the system has become more unstable, to the point where I intend to reimage it. In his brief comment, Greg said fixing it is a priority. "Our team is actively looking in to the issue to determine the cause and identify a fix," he said. "We will post an update as soon as we have more information to share."

That brief statement aside, Microsoft has largely kept Surface Pro 3 users in the dark about this problem for some time and in his latest Redmond magazine Windows Insider column, Ed Bott made his position clear on the company's handling of the situation (see "Shame on Microsoft for Leaving Surface Pro Users in the Dark"). Bott described Microsoft's response, or lack thereof, as the latest example of the company shying away from problems rather than addressing them. Other examples, he noted, include Microsoft's finally aborted "Get Windows 10" campaign and complaints about privacy policies in the new OS. Bott wants to know:

Why the consistently timid response? Part of the reason might be the corporate equivalent of the Miranda rule: As several generations of Microsoft executives will testify, anything you say can and will be held against you -- by antitrust officials, publicity-happy state regulators and a clickbait-driven tech press.

Bott argues it's not that Microsoft is incapable of communicating; rather, the company hides from conflict by choice:

Microsoft certainly knows how to communicate. On the nuts and bolts of product design and implementation, the company is incredibly forthcoming in its communications, with an overwhelming number of blogs and Knowledge Base articles documenting even the tiniest of details.

The company did promise to keep us in the loop, and of course we'll share whatever we learn, hopefully sooner rather than later.

Posted by Jeffrey Schwartz on 09/23/2016 at 12:27 PM0 comments


The Ford Museum Puts Windows Azure Pack on Display

As Microsoft looks to bring its public cloud into the datacenters of customers and hosting providers with the forthcoming release of Azure Stack, its predecessor is still a viable solution for some organizations. The IT decision makers at The Henry Ford, a nonprofit museum complex situated on a 250-acre campus with a broad collection of historical items spanning 300 years, last year decided to use the Windows Azure Pack (WAP) to drive a new private cloud as part of an effort to modernize and digitize a massive collection of objects to make them more accessible to visitors -- both in kiosks at its Dearborn, Mich. facility and online.

Not to be confused with Azure Stack, which will run the identical public cloud platform in a converged bare-metal system when it arrives in the middle of next year, WAP is a hybrid cloud solution introduced four years ago. Unlike Azure Stack, WAP brings the Azure portal interface to System Center 2012/R2 running with Windows Server 2012/R2. Microsoft introduced WAP as an option to Windows Server 2012 with fanfare, billing it as the Cloud OS. Most customers and hosting providers waited a year until the R2 release, which brought a more suitable version of Hyper-V, experts said at the time.

The extent to which WAP is used isn't clear, though more affordable implementations have only recently become available. Dell and Hewlett Packard Enterprise last year rolled out scaled down versions of their Cloud Platform Systems, coengineered with Microsoft. Redmond magazine recently examined HPE's iteration of the WAP-based system.

Organizations that don't want to spend a few hundred thousand dollars on a CPS can also go to a subscription model via third-party cloud and hosting providers. Among those that launched new WAP-based services last year were Rackspace and Hostway, both of which were among the first to join Microsoft's Cloud OS Network, which now includes more than 100 companies around the world. The Henry Ford considered both providers but ultimately chose Hostway. Matthew Majeski, director at The Henry Ford, described the effort as a major initiative in a recent interview.

"This was a massive undertaking," Majeski said. For one thing, the museum and its various constituencies were decentralized with 40 Web sites and microsites hosting about 70,000 pages, all with different content management systems, most of which were custom built more than a decade ago. "It was a challenge not from a maintenance standpoint but from the standpoint of driving a consistent experience," Majeski said. "You could go from site to site with different navigation and color patterns fonts. It was difficult for the consumer to navigate."

The IT management issues notwithstanding, the larger priority was the business initiative to provide a better experience for visitors. To achieve that, museum officials decided years ago to digitize its mammoth collection of objects. To date it has scanned and made available 56 million artifacts consisting of 10TB of data, ranging from cars, parts and signs to other historical vestiges of the century-old automaker. The Henry Ford, a longtime Microsoft shop, wanted the new private cloud to be a Microsoft-based solution, given that its existing systems are primarily Windows Server and SQL Server based.

At the same time, The Henry Ford museum wanted a dedicated privately hosted cloud that was managed rather than running it purely in Microsoft Azure. "It was really all about stabilization," Majeski said. "We needed a complete solution from a hosting perspective, based on the blueprint for where we wanted to go."

The museum also now runs the Sitefinity 1.1 CMS, which allows it to generate ASP.NET-based pages, and has an ongoing process of digitizing objects and content aimed at building stories around them, powered by an Elasticsearch search engine. Majeski has brought in a team of nine people consisting of designers, curators and staff who build metadata around the content.
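
To give a flavor of how a visitor-facing query might hit that search index, here is a purely hypothetical sketch using the Elasticsearch Python client; the host, index name, fields and query are invented for illustration and are not drawn from The Henry Ford's implementation.

```python
# A purely hypothetical sketch using the elasticsearch Python client; the host,
# index name and fields are invented for illustration.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

results = es.search(
    index="artifacts",
    body={
        "query": {"match": {"description": "Model T"}},  # full-text match on artifact descriptions
        "size": 5,
    },
)

for hit in results["hits"]["hits"]:
    print(hit["_score"], hit["_source"].get("title"))
```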

The project has also centered around taking data out of EMu, a widely used museum collections management system, pulling it into the SQL Server databases and using newly built APIs to retrieve the appropriate data based on the front-end query from a user, Majeski explained. Perficient, the digital agency brought in to push the initiative forward, built the APIs.

Pete Ferraro, the Perficient project manager and architect, said it was the first WAP-based private cloud project he has worked on, though he's worked with the Microsoft Azure public cloud in the past. Leading up to its Feb. 29 launch, Ferraro said there were some performance issues, which he likened to the team coming up to speed on the platform. "A lot of it had to do with some idiosyncrasies connecting the instances themselves, like spinning up multiple instances," Ferraro said.

The Ford project was one of the first large Hostway customers using the WAP solution, which was also a factor. Hostway had a specific team focused on troubleshooting WAP issues, he added. Tony Savoy, senior vice president and general manager for managed hosting and cloud services at Hostway, acknowledged it was one of its earlier projects offered under what it calls its Virtual Private Cloud service.

"The Ford Museum was a fan of Azure but they knew they wanted to take advantage of a private cloud.," Savoy said. "Given we had launched the solution around Windows Azure Pack, it was a perfect fit for what they're looking for. They were looking for performance; they were looking for scalability but they were looking for it in an isolated environment where they didn't have to share infrastructure with other customers."

It's possible some workloads will go to the Microsoft Azure public cloud, which Hostway offers managed services for as well, Savoy noted. Disaster recovery using Azure Site Recovery is one option, he said. Majeski said The Ford Museum isn't ruling out splitting workloads that might not require the isolated dedicated network of WAP onto Azure.

WAP has taken several years to find a niche and even now it's not clear how many customers are using it. Savoy said since Hostway launched its service over a year ago, it has more than 30 customers using it. Some are larger than the Ford Museum, though he says it's the most prominent case study the company has shared to date.

While this case study is an example of a successful virtual private cloud based on WAP, it raises the question: would the Ford Museum be better off with Azure Stack once it arrives next summer? Perhaps so, though waiting would have set the project back by at least 12 to 18 months, and most likely longer.

Posted by Jeffrey Schwartz on 09/21/2016 at 9:21 AM0 comments


Larry Ellison Takes on Amazon with Oracle's Latest Cloud Salvo

Oracle isn't the first company that comes to mind when asked who the largest cloud infrastructure service providers are, but apparently Founder and CTO Larry Ellison plans to change that. In his traditional keynote address to open the annual Oracle OpenWorld conference, Ellison kicked off this year's gathering with a "cloud-first" focus that will apply to everything from its flagship database to its middleware and applications. He also made a brazen claim that Oracle will mount a challenge against Amazon Web Services (AWS) for cloud dominance.

Such bravado is nothing new for Ellison, but even for him it would be quite a coup if Oracle were to surpass AWS, which is by far the largest public cloud infrastructure provider, now at a $10 billion run rate. Only Microsoft, Google and, to a lesser extent, IBM SoftLayer have public cloud infrastructures that approach the scale and revenues of AWS. Even so, analysts say AWS has a strong lead over everyone, including second-place Microsoft Azure.

Though not impossible, given Oracle's resources, the company has a significant challenge ahead before it can threaten AWS, Microsoft and Google, among others -- and that presumes customers embrace Oracle's IaaS. "Oracle has a pretty substantial build-out ahead to compete with Microsoft and Amazon on the global scale in which those companies operate. That does not mean that they cannot compete in some geos effectively in the near term," said IDC Analyst Al Gillen. "The company thinks it can disrupt incumbents on the basis of price. That's great, although basic IaaS without any value add is ultimately a race to $0. I think it is important to be in that market segment to be seen as serious and competitive."

Indeed, while describing cloud compute and storage services as commodities, Ellison argued Oracle's cloud will cost less and offer better reliability than AWS. Ellison said Oracle is building out and upgrading its existing global datacenters with a new generation of availability zones, each consisting of three nearby facilities connected by fiber-optic rings with their own separate power supplies, allowing data to be stored in triplicate in each availability zone.

"We have a modern architecture for infrastructure where there's no single point of failure," Ellison said during last night's keynote address. "Faults are isolated therefore faults are tolerated. If we lose a datacenter, you don't even know about it."

Moreover, Ellison claimed Oracle's second-generation datacenters offer twice as many cores and memory as AWS, four times more storage and more than 10 times the I/O capacity at a lower cost.

"Amazon's lead's over," Ellison said. "Amazon is going to have serious competition going forward. We're very proud of our second-generation infrastructure as a service, we're going to be focusing on it and aggressively featuring it."

Oracle is also taking on Microsoft, Dell, Hewlett Packard Enterprise (HPE), Lenovo and VMware in the emerging market for preengineered, on-premises cloud appliances with the Oracle Exadata Database Machine, which will allow organizations to run the Oracle cloud platform in the local datacenter and move data back and forth between the two environments, Ellison said.

 "If you run your Oracle database on premise on your own hardware, you can move that data to our cloud, Ellison said. "The cloud and customer machines are identical. It will be 101 percent compatible."

Ellison also talked up the new Oracle 12c database. Of course the "c" stands for cloud, and Ellison said it's designed with improved multitenancy and support for data sharding, in which a database is split into fragments, or shards, distributed across servers. Oracle 12c can process and synchronize hundreds or even thousands of shards, Ellison said. "The new database has an in-memory option. We can now keep the column store in Active Data Guard [the fault-tolerance features in the Oracle database] and give huge performance increases. We think more and more people are going to go to in-memory database as memory gets cheaper and cheaper and cheaper."
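
For readers unfamiliar with the term, sharding simply means splitting one logical database into many smaller ones and routing each row by a key. The generic hash-routing sketch below illustrates the idea only; it is not Oracle's implementation, which the company has not detailed.

```python
# A generic sketch of hash-based sharding: routing rows to database fragments by
# hashing a sharding key. Purely illustrative, not Oracle's implementation.
import hashlib

SHARDS = [f"shard-{i:03d}" for i in range(1000)]   # e.g., a thousand shard databases

def shard_for(customer_id: str) -> str:
    """Route a row to a shard by hashing its sharding key."""
    digest = hashlib.sha1(customer_id.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

print(shard_for("customer-42"))   # every lookup for this key lands on the same shard
```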

Posted by Jeffrey Schwartz on 09/19/2016 at 1:40 PM0 comments


Backup Exec and NetBackup Will Finally Connect to Azure Storage

Microsoft Azure will finally become a native storage target for users of Backup Exec and NetBackup, which are among the most widely used data protection and DR software offerings. Veritas, the maker of the two backup and recovery suites, this week announced Azure as a target planned for new releases scheduled for next quarter. Moreover, Veritas said it will offer usage-based pricing in addition to the current subscription and perpetual licenses now offered with both Backup Exec and NetBackup.

The long-awaited native Azure connectivity was just one in a barrage of new offerings as the company looks to extend its core storage software offerings into a broader range of information management and governance tools now that Veritas is no longer a part of security vendor Symantec.

Nearly a year after Symantec sold Veritas off to private equity firm Carlyle Group, the company this week held its first Veritas Vision conference since 2004, when the security giant acquired it for $13.5 billion. At the time, the companies argued that backup, recovery and DR are key components of data security. But the cultures clashed, and many years of turmoil at Symantec followed, taking a toll on the Veritas business, though it remains a leading supplier of data protection software.

Veritas has remained largely quiet in recent years, though the company kicked off the conference with a new, more aggressive attitude. Putting its stake in the ground, company officials reminded customers that it's a pure software provider that intends to, among other things, reduce customers' dependency on expensive storage hardware.

To make its new attitude clear, the company came out swinging at Dell and its newly acquired EMC and controlling stake in VMware. Veritas ran a full-page ad in The Wall Street Journal with the headline: "What's Worse Than a Life of Hardware with EMC? An Eternity with Dell." Company officials talked up the ad and distributed the newspaper after the keynote. Throughout the event, Veritas took aim at Dell, EMC and VMware, saying the latter's proprietary software is designed to lock customers into hardware.

Mike Palmer, Veritas' senior VP for data insights and orchestration solutions, sees that as an opportunity for customers and a clear target of its efforts.

"I think we're starting to see the end of VMware run," Palmer said in an interview during this week's conference. "We're seeing customers looking for alternatives to their VMware environments. And that is fueling an adoption of containers as the next solution, if you will, to give them the same sort of hardware value that VMware did, that they are looking for some service augmentation."

With the company's focus on extending its core backup and recovery software and its Enterprise Vault Archive (EVA) platform, used for archiving Exchange and now Office 365 data, Veritas is readying a number of new offerings that promise full information discovery and more scalability for emerging hybrid cloud architectures. Among some new wares in the works:

  • Information Map: a new visualization tool, initially tied to NetBackup, that provides visibility into unstructured "dark" data. It is available now, and the company is offering a free trial.
  • Veritas Access: New scale-out network-attached storage (NAS) software that can run on commodity x86 hardware, which the company said will let organizations more easily discover and access information and move unstructured data to appropriate disk tiers or public cloud storage based on usage patterns. Veritas Access will support AWS and OpenStack clouds.
  • Hybrid Cloud Orchestration: NetBackup will integrate with the Veritas Resiliency Platform 2.1 to provide automated orchestration to facilitate complex recoveries supporting thousands of virtual machines.  
  • HyperScale: A software-defined storage architecture for deploying tier-one applications in private clouds, initially OpenStack from Red Hat. A version for Docker containers is also under development.

Analysts said the company has created a strong roadmap built off its core strengths. "For them to come out and say that they're going to have an intelligent data management platform and that platform is going to enable both the data management and governance side but do it by actually increasing the availability of the lower tiers, I think is really smart. It's very aspirational," said Enterprise Strategy Group Analyst Jason Buffington.

Nevertheless, Veritas remains most synonymous with its bread and butter data protection software. Indeed, Backup Exec is among the more widely used platforms for backing up data running on Windows and Windows Server, especially Exchange, SharePoint and SQL Server as well as file servers. For larger enterprise environments, NetBackup is designed to protect larger systems that are operating system and virtual machine independent. Among Veritas' key rivals, Acronis, CommVault and Veeam are offering connectivity to Azure as well as Amazon Web Services (AWS) S3 storage. Veritas extended support to AWS and Google Cloud Platform earlier this year. Connectors to Azure storage will come in Backup Exec 16 and NetBackup 8.0 in the fourth quarter. The new releases will also include connectors to private clouds based on Red Hat's OpenStack distribution.

Veritas has built the new cloud connectors based on what it calls its Open Storage Technology (or OST), used across its entire data protection portfolio including its Enterprise Vault Archive (EVA) platform, used for archiving Exchange and now Office 365 data, among other content. "OST is an open standard in terms of how vendors can plug in to us," said Simon Jelley, VP of backup and recovery product management at Veritas, in an interview during the conference this week. "It enables not just our ability to work with any storage vendor to ensure the certification, support, compatibility and guaranteed performance between us but it also ensures that any data that moves to a storage target, be it a cloud, disk or tape target, that we still catalog and track the copies of that data."

Veritas executives expressed confidence in the company's new roadmap. "Your information is your most important asset, but you really can't harness the power of it today," said Veritas CEO Bill Coleman during the conference's opening keynote session this week.  "Data is exploding. It's everywhere, it's redundant, it's hard to access and the costs are going through the roof. And if you don't learn how to access that data and your competitors do, they're going to win. We can help you discover the true value of your data. We want to give information to you that you need, to map out your own digital transformation."

Posted by Jeffrey Schwartz on 09/16/2016 at 2:39 PM0 comments


McAfee Spinoff from Intel Will Face Challenges

The IT security software landscape has changed quite a bit since Intel acquired McAfee back in 2011 for $7.7 billion. After rumors surfaced earlier this year that Intel was considering exiting the security software business, the company this week pulled the trigger, agreeing to sell a controlling 51 percent stake to private equity firm TPG for $3.1 billion in cash while retaining 49 percent of the new company (which will bring back the original McAfee name).

The spinoff of Intel Security values the new business at $4.2 billion, or just over half what Intel paid for McAfee. That valuation stems from a $1.1 billion commitment by TPG to invest in McAfee, which Intel says will create one of the largest "pure-play" cybersecurity providers. Intel insists this is not a retreat from the security business, a claim that has some merit given the stake it's retaining.

"Security remains important in everything we do at Intel and going forward we will continue to integrate industry leading security and privacy capabilities in our products from the cloud to billions of smart, connected computing devices," said Brian Krzanich, CEO of Intel, in a prepared statement announcing the spinoff. "Intel will continue our collaboration with McAfee as we offer safe and secure products to our customers."

In this week's announcement, Intel pointed to last fall's Focus conference in Las Vegas where the security group outlined its strategy to extend its portfolio and ecosystem. The company outlined a number of key areas of focus including its new endpoint threat detection and response platform, its new push into continuous monitoring and its new orchestration tools.

The company also discussed its Threat Intelligence Exchange (TIE), based on its McAfee Data Exchange Layer (DXL), which the company describes as its "architecture for adaptive security." Last year's Focus event, which I attended, also was an introduction for a new executive team led by Chris Young, who had headed Cisco's security business and this week was named as the CEO of the new McAfee business after the spinoff is completed in the second quarter of next year.  

As I reported, analysts at the event said Intel Security was making progress, but the company needed to shore up in the area of analytics tools. In a letter to customers and partners, Young said the move will help the company address the entire threat defense lifecycle. "As a pure-play provider, McAfee will accelerate the rate of innovation in delivering an integrated portfolio that is increasingly automated and orchestrated," Young said.

However, McAfee may face a challenge from its founder, John McAfee, who earlier this year became CEO of MGT Capital. MGT has filed suit in the U.S. District Court for the Southern District of New York, asking the court for a judgment regarding "their use of or reference to the personal name of John McAfee and/or McAfee in their business," as reported by PCWorld.

Of course, John McAfee is no stranger to controversy, having famously become a fugitive after police near his home in Belize wanted him for questioning following a murder at a neighbor's home. McAfee, who fled to Guatemala before returning to the U.S., is no longer wanted for questioning in that case by Belize authorities, and earlier this year he gave his side of the story, published by Business Insider. Last year McAfee announced he was running as a Libertarian candidate for President of the United States.

Posted by Jeffrey Schwartz on 09/09/2016 at 1:03 PM0 comments


Michael Dell's Defining Moment Arrives with Completion of EMC Acquisition

Nearly a year after announcing the IT industry's largest acquisition to date, Dell Technologies today finalized its $67 billion acquisition of EMC, which includes a controlling stake in VMware and a portfolio of other business units including RSA Security, SecureWorks, Virtustream and Pivotal. The deal officially closed this morning after last week's regulatory approval in China cleared the final hurdle, paving the way for the formation of the industry's most expansive supplier of datacenter infrastructure.

Now that the deal is done, the IT industry will be watching closely to see what duplicative product lines the newly merged company will sunset, a key concern where both Dell and EMC had overlapping storage products and security wares, among others. (See our breakdown of the deal several months after it was announced.)

While none of that was addressed today, Chairman and CEO Michael Dell, joined by EMC's David Goulden, who will lead the combined company's data infrastructure group, and Dell CFO Tom Sweet, gave prepared remarks effectively kicking off the new company and taking a few questions from analysts and reporters. Michael Dell said the combined company will draw $74 billion in revenues this year and reiterated his belief that bigger is better.

The new Dell Technologies will have three core businesses: client devices, datacenter infrastructure and the separate business units that include RSA, Virtustream, SecureWorks and Pivotal, which he vowed will continue to have autonomy to work with companies that compete with Dell and EMC's core businesses. "We also strategically aligned our capabilities where it makes sense to deliver integrated solutions in areas like hybrid cloud and security and seamless technology infrastructure from the edge to the core to the cloud," Michael Dell said. "Taken together, we have an incredibly powerful set of solutions."

Dell also argued that, despite the $47 billion in debt to finance the deal, critics who say the company won't be able to pay down the debt are wrong. "Our cash flow to debt service ratios are phenomenal," he said. "In fact our debt payments are much less than just the share buybacks and dividends [of large publicly traded companies]. Any FUD out there to the contrary is factually incorrect."

CFO Sweet added that since Dell's original $25 billion leveraged buyout three years ago, it has paid down $5 billion of the debt servicing that deal. "In addition to ongoing cost savings initiatives underway at both Dell and EMC, we expect to achieve additional cost synergies associated with increased efficiencies in the combined company's supply chain and in the general administrative areas," Sweet said. "On the revenue side, we will be driving significant cross-sale opportunities in our infrastructure solutions group and our appliance solutions group, and with VMware we expect our annual leveraged flow will be three times our total pro forma cost of service and debt."

As he has emphasized since the LBO, Dell pointed to the flexibility associated with being privately held. "We don't have to cater to short-term thinking that exists in the market," he said. "We can think in decades."

Experts will also be watching to see if the new Dell Technologies becomes the successful IT "powerhouse" company officials say they have created, despite numerous mergers that have failed including Symantec's acquisition of Veritas and the Hewlett Packard-Compaq deal. While critics of the deal say Dell will be saddled with debt, will risk losing key talent and may be unable to innovate at the pace of smaller companies, proponents say Michael Dell shouldn't be underestimated.

"Both Dell and EMC have proactively adapted to industry and marketplace changes, and remained solidly profitable while doing so," said Pund-IT Chief Analyst Charles King, in a blog post. "Going private in 2013 allowed Dell to fully focus its assets and acumen on successfully completing its transformation. EMC's innovation and billions in earnings to the mix will substantially speed and enhance that process, not harm it."

Posted by Jeffrey Schwartz on 09/07/2016 at 1:53 PM0 comments


VMware To Add Windows 10 Configuration and Lifecycle Management to AirWatch

VMware believes it can lower the cost of deploying, configuring, managing and securing Windows 10 PCs by up to 30 percent with its new "modern" Unified Endpoint Management (UEM) approach, key advances of which were outlined at this week's VMworld conference, taking place in Las Vegas.

The new UEM approach outlined by VMware aims to further position the company as the "Switzerland" of endpoint and mobile device lifecycle management, in the words of Sanjay Poonen, executive vice president and general manager of VMware's End-User Computing business, who spoke in yesterday's keynote session and elaborated on the strategy in a meeting with press and analysts.

Poonen also outlined improved automation capabilities coming to VMware Identity Manager in its Workspace One digital workspace, including improved provisioning of Office 365 accounts, support for Salesforce.com's suite of SaaS offerings and added security with support for two-factor authentication. As part of its emphasis on end-user computing, VMware also announced enhancements to its Horizon virtual desktop platform, including a claimed boost in performance via improvements to its Blast protocol and support for new hyper-converged systems from EMC, Dell, Hewlett Packard Enterprise, Hitachi Data Systems and QCT.

VMware said it will add the new UEM capabilities for Windows 10 to AirWatch, with details set to be announced in October at the annual AirWatch Connect conference, scheduled to take place in Atlanta. While packaging, branding, licensing and pricing are still to be determined, the plan is to extend AirWatch, which VMware claims is used by 20,000 organizations, beyond mobile devices to provide what it believes will be a best-of-breed alternative to Microsoft's EMS combined with System Center Configuration Manager (SCCM). The new Windows 10 UEM technology will use the AirWatch Web interface as a single pane of glass, explained Sumit Dhawan, senior VP and general manager of VMware's End User Computing group.

In addition to AirWatch MDM for Windows, the UEM technology added to the service will include configuration management, software distribution, OS patch management and client health and security, as described on a Web site Poonen pointed to, WindowsUEM.com.

"Unified endpoint management is absolutely a reality," said Poonen. "The key proposition of what we are trying to get done here is lower total cost of ownership in Windows 10. We want to lower that cost from $7k -- what it is typically today with a lot of servers, lots of other labor -- down to something 15 to 13 percent lower by taking endpoint management, security and lifecycle automation and pulling them together."

Posted by Jeffrey Schwartz on 08/31/2016 at 12:29 PM0 comments


VMware's Cross-Cloud Services Aims To Tie Largest Providers Together

VMware has developed new SaaS-based tools that will enable organizations to bridge applications and workloads across multiple public clouds, including Amazon Web Services EC2 and S3, Microsoft Azure, IBM SoftLayer and the Google Cloud Platform. The company's new VMware Cross-Cloud services provide deployment, security and management of applications that can be shared across multiple clouds.

Cross-Cloud services are among a number of new technologies and offerings VMware is introducing this week at its annual VMworld conference in Las Vegas to provide extended hybrid cloud infrastructures and to showcase advances in software-defined networking, storage and end user computing.

Emphasizing hybrid clouds on the first day of the event, the company also introduced its VMware Cloud Foundation, which brings together its vSphere, NSX and Virtual SAN offerings into hyper-converged systems integrated with VMware SDDC Manager. The new VMware Cloud Foundation will integrate with the IBM Cloud.

While the VMware Cloud Foundation extends the company's core virtualization and SDDC technologies to private and hybrid clouds, the new Cross-Cloud services target existing and new customers with a SaaS-based offering that doesn't require existing VMware infrastructure, company officials said. The services were demonstrated as a technology preview, and VMware officials didn't say when the company will offer them.

The preview of VMware Cross-Cloud Services is the company's latest effort to extend its core virtualization expertise beyond its vCloud Air offering. The Cross-Cloud services are SaaS components that will tie together cloud usage and costs for IT professionals running multiple applications and processes in hybrid on-premises environments that are bursting workloads across multiple public cloud providers, while allowing for the use of those providers' native APIs.

"Cross-Cloud services are truly a breakthrough through innovation that only this ecosystem can deliver," said VMware CEO Pat Gellsinger, in the opening keynote. Despite numerous providers of multi-cloud management wares, Gellsinger added at a press briefing that he believes VMware can provide better managed and secure services because of its history with virtualization and its software-defined datacenter expertise. "We think VMware is uniquely positioned in industry to be a neutral provider of those services," he said.

The company didn't say when it will release the new offering, but said the first capabilities previewed include discovery and analytics to enable onboarding and governance of public cloud applications; compliance and security using "micro-segmentation" and monitoring for cross-cloud governance; and tools to enable deployment and migration for developers building across public clouds.

Posted by Jeffrey Schwartz on 08/29/2016 at 10:42 AM0 comments


Veeam Gets Physical, Targeting Servers, Hybrid Cloud and Office 365 Support

Veeam Software made its name by providing an easy way for VMware shops to back up their virtual machines. But now Microsoft appears to be playing a new role in the company's plan to extend the types of workloads it can protect.

Moving beyond the VM, Veeam is now looking to protect hybrid cloud infrastructure, physical servers and Office 365 Exchange Online. At a launch event in Boston this week, also broadcast online, Veeam outlined an extensive pipeline of new wares arriving in the coming quarters that promise to extend the range of IT assets its offerings can back up and restore.

A superset of the new management and orchestration tools and connectors in the works includes the new Veeam Availability for Hybrid Clouds, which includes the forthcoming Veeam Availability Suite 9.5, along with agents for physical servers and public cloud targets. The launch of Veeam Backup for Microsoft Office 365, which will give customers on-premises copies and e-discovery capability of e-mail using the Veeam Exchange Backup Explorer, also marks Veeam's move from protecting on-premises workloads to software-as-a-service cloud applications.

In adding Office 365 to its portfolio, Veeam is taking on providers that specialize in SaaS-based data protection, such as EMC's Spanning and Datto's Backupify, which also protect other cloud-based applications from the likes of Salesforce.com and Google. Veeam officials said many customers who used the company's Veeam Backup for Microsoft Exchange on premises have since transitioned their shops to the Office 365 online e-mail service.

Enterprise Strategy Group Analyst Jason Buffington said that adding Office 365 support as well as protection of hybrid cloud and physical server resources will give Veeam customers less reason to seek out secondary backup providers. "They didn't build it from scratch -- they extended their core platform to support it," Buffington said.

Indeed, as Veeam officials explained it, the Backup for Office 365 tool brokers Office 365 connections through the native Exchange Web Services API, which lets administrators map a mailbox hierarchy and create archive jobs on a mailbox-by-mailbox basis, or set separate schedules.

The archives are stored in the native Exchange Extensible Storage Engine (ESE), once known as the Joint Engine Technology (JET) Blue database format. Veeam claims the Backup for Office 365 tool will offer granular recovery options through Veeam Explorer for Microsoft Exchange, which gives administrators access to archives so they can recover entire mailboxes, as well as individual e-mail items, including calendar entries and tasks. Administrators can also export mail items to a .PST file for offline viewing.

In addition to riding the wave of Office 365's growth, Microsoft is playing a key role in a major performance boost coming in the new Veeam Availability Suite, which the company said will ship in October. Specifically, Veeam engineers became enamored with the upgraded Resilient File System (ReFS) coming to Windows Server 2016 this fall (see our first look at the technical preview by MVP Paul Ferrill, published earlier this year). Recovery of workloads that could take hours or days has been reduced to 15 minutes, company officials said. Doug Hazelman, Veeam's VP of product strategy, said ReFS is enabling that. While it will require organizations to use the new server as a backup repository, the original data can reside on any platform, he said.

"I think what you're going to see when once 9.5 is out the door, and Windows Server 2016 is out the door with ReFS, is the integration we are doing is deeper than any other vendor," Hazelman said. "One of the cool things we are able to deliver is a file system that is far and away above anything we could have done ourselves."

While Veeam is riding on some success coming out of Redmond these days, the company prides itself on a broad number of partnerships, which still include strong ties with VMware as well as, among others, Hewlett Packard Enterprise (HPE), EMC, Cisco, NetApp and the latest, announced this week: IBM, whose Storwize and SVC storage arrays will be supported in Veeam Availability Suite 10 next year.

While Veeam long eschewed agent-based technology for its VM protection platform, the company has developed agents to target cloud and physical machines, Hazelman said. "We're going with agent technology on a per-VM basis because there are no sockets in the cloud -- it's just workloads," Hazelman said at the event.

The Veeam Availability Platform for Hybrid Cloud will incorporate its namesake availability suite, targeted at private clouds and virtual server workload availability, with support for public clouds and physical servers via its new Veeam Agent for Linux and Windows, a new disaster recovery orchestration tool and cloud-based management consoles for large enterprises and service providers. Components will start rolling out next quarter.

Making his first appearance as the company's newly appointed president and COO was Peter McKay, who most recently was general manager of VMware's $3.5 billion Americas business, where he spent three years after the company acquired Desktone, where he was CEO. "I come with experience of scale but also from the speed and agility and entrepreneurship that's required in the smaller companies," McKay said. "I do see we are at an inflection point. This new [release] is going to be very impactful."

Posted by Jeffrey Schwartz on 08/26/2016 at 12:31 PM0 comments


Microsoft Open Source Czar Takes Spotlight at LinuxCon

Microsoft's new open source czar, Wim Coekaerts, made his public debut at this week's LinuxCon in Toronto, where he gave a keynote address. Coekaerts, who spearheaded Oracle's Linux efforts, joined Microsoft in April as corporate VP for open source.

In his keynote yesterday, followed by his first Microsoft blog post, Coekaerts told the Linux faithful why he joined Microsoft, a company whose onetime dominance and heavy-handed licensing tactics inspired the Linux movement two decades ago.

Microsoft has added open source notches to its belt on a regular basis over the past two years. Last week, Microsoft announced a long-awaited contribution of PowerShell code to GitHub and the release of its Linux-based PowerShell environment. Still, Coekaerts said he realizes Microsoft has a long way to go to prove its mettle with the open source community. In a sign of how far Microsoft has come, the company was one of the top-level "diamond" sponsors of this year's LinuxCon, with Huawei and IBM being the other two.

"Over the past few months I've been asked more times than I can count, 'Wim, why did you join Microsoft?' As a Linux guy who has watched the company from afar, I am the first to admit that Microsoft hasn't always been the most open company," Coekaerts noted in his debut Microsoft blog post. "After talking to some of the executives at the company, I found that the days of a closed Microsoft are over."

Indeed, Microsoft executive VP Scott Guthrie made a strong push to convince Coekaerts of that, according to a recent Business Insider account. Microsoft's delivery of .NET Core to Linux, last week's news that it has done the same with PowerShell and the plan to offer SQL Server on Linux were among the many moves that helped convince the once-skeptical Coekaerts, he said.

Linux is also playing a key role as Microsoft continues to flesh out its new Operations Management Suite. Coekaerts said the preview of the recently announced Docker container monitoring component for OMS is now available. Microsoft Azure CTO Mark Russinovich demonstrated the Docker container management module for OMS at the recent DockerCon gathering in Seattle.

OMS, which provides visibility and control over applications and workloads running in Azure and other public clouds, will also benefit from last week's PowerShell on Linux announcement, according to Technical Fellow Jeffrey Snover. OMS "enables customers to transform their cloud experience when using PowerShell on both Linux and Windows Server," Snover noted in last week's announcement. "OMS Automation elevates PowerShell and Desired State Configuration (DSC) with a highly available and scalable management service from Azure. You can graphically author and manage all PowerShell resources including runbooks, DSC configurations and DSC node configurations from one place."

During yesterday's LinuxCon keynote, Coekaerts revealed that Linux now accounts for 40 percent of the instances in the Microsoft Azure cloud, as reported by Steven J. Vaughan-Nichols, a longtime critic of Microsoft, who outlined the company's turnaround in a cover story for Redmond magazine last year. "We've come a long way, but you'll soon see a lot more work," Coekaerts said. "We're a different company than we used to be and now is the time to prove it."

Posted by Jeffrey Schwartz on 08/24/2016 at 12:30 PM0 comments


Microsoft To Make Office 365 More Conversational with Acquisition of AI Startup Genee

Microsoft is giving Cortana, the company's AI-based personal digital assistant, a boost with the addition of Genee, a startup Microsoft today said it has agreed to acquire for an undisclosed amount. Genee, which launched just two years ago, lets users schedule meetings with individuals or groups by speaking to it.

The addition of Genee builds on Microsoft CEO Satya Nadella's plan to bring "conversations-as-a-service" to the mainstream as the next UI and framework for how applications are used and connected -- a strategy he outlined in late March at the company's annual Build developers conference. Enabling that are the Microsoft Bot Framework and the Cortana Intelligence Suite. Genee is a meeting and scheduling service that connects with all major e-mail and calendaring apps including Outlook, Gmail and Apple's iCloud, chatbot services offered by Facebook, Twitter and Skype, as well as SMS-based services.

Cofounders Ben Cheung and Charles Lee, who will join Microsoft, launched Genee to "simplify the time-consuming task of scheduling (and rescheduling) meetings," said Rajesh Jha, Microsoft's corporate VP for Outlook and Office 365, in a blog post announcing the deal. "It's especially useful for large groups and for when you don't have access to someone's calendar."

Jha said Genee is built with optimized decision-making algorithms and natural-language processing that allow people to interact with the virtual assistant as though it were a human. Its artificial intelligence lets a user request a meeting simply by asking; the service takes the requester's schedule and preferences into account and sends an e-mail to the intended recipients. Lee described the vision for Genee in a post announcing Microsoft's intention to acquire it.

"In our drive to deliver large productivity gains through intelligent scheduling coordination and optimization, we often found ourselves on the forefront of technology involving natural language processing, artificial intelligence (AI), and chat bots," he noted. "We consider Microsoft to be the leader in personal and enterprise productivity, AI, and virtual assistant technologies, so we look forward to bringing our passion and expertise to a team that is committed to delivering cutting-edge language and intelligence services."

Lee said that in the wake of the deal, the Genee service will be shut down on Sept. 1. Jha was vague about what happens next, other than to state that he sees the Genee team helping Microsoft "bring intelligence into every digital experience" as it builds new productivity capabilities into Office 365.

It's also unclear whether Cortana will have a rival in Genee or if Microsoft plans to combine the DNA of the two.

Posted by Jeffrey Schwartz on 08/22/2016 at 12:30 PM0 comments


Microsoft Open Sources PowerShell, Releases Linux and Mac Versions

Microsoft today announced that PowerShell has entered the open source community and versions for both Linux and Mac are now available.

The long-awaited move, which promises to broaden the use of Microsoft's popular command-line shell and scripting language that lets IT pros build task-based scripts to provide more automation to the Windows operating system, is consistent with Microsoft CEO Satya Nadella's embrace of the open source community two years ago when he proclaimed "Microsoft Loves Linux."

Among the many initiatives that have followed, Microsoft announced plans to open source its once-proprietary .NET Framework. Given that PowerShell is built on the .NET Framework, it made sense that it was potentially in the open source pipeline as well. Alluding to that possibility back in April was Microsoft Technical Fellow Jeffrey Snover, known as the inventor of PowerShell, who has long evangelized its ability to automate tasks, with more modern releases supporting Desired State Configuration (DSC is the PowerShell push-pull method for keeping system configurations in check) and, more recently, the use of management APIs to enable cross-platform automation and configuration management using PowerShell.

Snover gave a nod to the possibility at the Build conference in San Francisco, as reported by Kurt Mackie. During a Build session discussing DevOps and Microsoft's forthcoming Windows Server 2016 Nano Edition, an attendee asked about Microsoft's intentions regarding a port of PowerShell to Linux. In response, Snover acknowledged "significant interest" in PowerShell on Linux. With a nod and a wink, he recalled that the onetime key barrier -- that PowerShell is written in .NET code (C#) -- no longer applied now that .NET Core had been contributed to the open source community.

"Jeffrey Snover predicted internally in 2014 that PowerShell would eventually be open sourced but it was the advent of .NET Core and getting .NET Core working on multiple Linux distros that kick-started the process," said Scott Hanselman, Microsoft's highly visible program manager focused on the bridging of the .NET, ASP.NET and open source communities, in a blog post. "If you were paying attention it's possible you could have predicted this move yourself. Parts of PowerShell have been showing up as open source DSC Resources, Script Analyzer, Editor Services, PowerShell Documentation, and the PowerShell Gallery.

Indeed, it was Snover who delivered today's announcement, and the celebration of the release was captured on the PowerShell and DSC team's YouTube channel. "PowerShell on Linux is now designed to enable customers to use the same tools, and the same people, to manage everything from anywhere," Snover said. The initial ports are available on Ubuntu, CentOS and Red Hat, he noted. "PowerShell also now runs on Red Hat Linux and Mac OS X with additional platforms planned for the future." The "alpha" builds and source code are now on GitHub, he noted.

"Now, users across Windows and Linux, current and new PowerShell users, even application developers can experience a rich interactive scripting language as well as a heterogeneous automation and configuration management that works well with your existing tools," Snover added. "Your PowerShell skills are now even more marketable, and your Windows and Linux teams, who may have had to work separately, can now work together more easily."

The open sourcing of PowerShell promises to give the popular scripting language a broader reach and potentially greater popularity. "Today's news about open sourcing PowerShell and making it available on Linux stands as a testimony to Microsoft's new era of openness and broad ecosystem extensibility," said Greg Shields, a longtime Microsoft MVP, author-evangelist at Pluralsight and Redmond's former "Windows Insider" columnist (and a cochair of the TechMentor conferences, which, like Redmond, are produced by 1105 Media). "More culturally, though, I'm just personally excited at how Microsoft has become 'a cool company' to work with. Today's news reinforces that emotion."

PowerShell has already found a home in the open source community, especially where it applies to DevOps where players such as Chef and Puppet have already begun supporting it. Snover, who had a high-profile presence at last year's ChefCon conference, has emphasized PowerShell's potential to run in containerized environments such as the Chef Server. Don Jones, also a former Redmond columnist, a curriculum director at Pluralsight and TechMentor cochair, said today's move is poised to extend its reach.

"Microsoft has not only made PowerShell open source and available for Linux and Mac OS, but has taken care to integrate with the existing shell experience on those operating systems," said Jones, who also cofounded the PowerShell.org site and has authored numerous technical books about PowerShell. "Running 'ps' won't run some PowerShell command you don't know; it runs 'ps', and lets you integrate that command's output into a pipeline that can also include PowerShell commands. It's a polar opposite of the 2000s-era "embrace and extend" motto at Microsoft.'

Jones said the move also validates Microsoft's belief that the Windows shell has value across operating systems, while tacitly acknowledging that it needn't be the only scripting environment. "Along with the upcoming extension of PowerShell Remoting to support SSH as a native transport protocol in addition to WS-MAN, Microsoft is really acknowledging the reality of the modern datacenter," he said. "They're also willing to compete on the value of their products, not a single-stack lock-in -- and that's not the Microsoft we know, but it's one we can come to love, I think."

Snover will be giving a talk about the news tonight (Aug. 18) on PowerShell.org's Scripting Live podcast.

Posted by Jeffrey Schwartz on 08/18/2016 at 2:28 PM0 comments


Rackspace Readies Managed Security Services for Microsoft Azure and Hyper-V

Rackspace last week said its new managed security service for the Microsoft Azure public cloud should be generally available next month. The Rackspace Managed Security service, announced nearly a year ago, is targeted at organizations that can't justify or don't want to operate their own security operations centers and want to turn over the management to another provider.

Rackspace already provides the managed security service for Amazon Web Services, VMware Cloud and its own cloud services. The addition of Microsoft Azure to the mix is part of Rackspace's effort to build out a multi-cloud security service aimed at midsize and large enterprises that require nonstop monitoring and remediation against both targeted cyber attacks and advanced persistent threats (APTs).

The company manages Azure environments and hybrid cloud environments based on the Windows Azure Pack (WAP), and is eager to offer its own dedicated versions of Azure once Microsoft releases Azure Stack next year. At the same time, Rackspace also provides management of the public Microsoft Azure cloud, where it will offer the managed security service. Rackspace also will offer managed security services for Hyper-V workloads running in its cloud service portfolio.

Considering it's becoming more common for enterprises to run business-critical workloads spread across multiple public cloud providers, their risk and compliance needs are becoming more complicated, said Jarret Raim, director of strategy and operations for Rackspace's managed security business. "Most of our customers are looking for holistic security for all of their environments," Raim said.

Raim said Rackspace, headquartered in San Antonio, Texas, has access to highly skilled security professionals, helped by the fact that the NSA and a major Air Force cyber command center are located nearby. The service is largely targeted at the existing customer base to which Rackspace already provides managed cloud services, though Raim believes it could attract new customers as well. Among other things, Rackspace takes on the evaluation and decision-making process of choosing the best tools to address various components of security in public and hybrid cloud environments.

Raim said Rackspace is always evaluating security tools that it considers best suited for various environments and scenarios, though, in most cases, it uses the same providers to monitor and protect the multiple public clouds. On the back end, Rackspace uses Splunk for security analytics and ServiceNow for ticketing and workflows. Alert Logic provides its intrusion detection service and log management, while Rapid7 provides added vulnerability management. Rackspace offers an optional compliance service, which is based on an agent technology provided by CloudPassage. For endpoint protection, Rackspace uses CrowdStrike.

Asked why Rackspace isn't tying into the new Azure Security Center, which went live last month, Raim said his team likely will tap into that over time but it wasn't available as it put its current service together. "We talked to them a few weeks ago about their roadmap and how Rackspace could use that," he said. "We will always default to using platform tools where we can because those fit better for our customers, and then we can bring external tools to supplement that where we feel it's necessary."

Pricing for the service will start at $3,500 per month, plus $1,000 for those opting for compliance and reporting, though Raim said monthly costs will be in line with usage.

Posted by Jeffrey Schwartz on 08/17/2016 at 10:00 AM0 comments


Microsoft Is Taking No Chances with Azure Stack Rollout

Some early testers of the Azure Stack Technical Preview were angered last month by Microsoft's decision to offer the software only on integrated systems provided by Dell, Hewlett Packard Enterprise and Lenovo. Some say the move will put the new offering out of their reach. Reading between the lines, it appears quite a few testers of Azure Stack's first technical preview weren't deploying the software properly.

While Microsoft didn't say as much, the company's public explanation last week emphasized how much is at stake. Unlike its existing Windows Azure Pack (WAP), which provides an Azure-like experience with the older Azure portal interface running atop Microsoft's System Center and Windows Server 2012 R2, Azure Stack is the same software that runs Microsoft's public cloud.

"Azure Stack is Azure in your datacenter," said Vijay Tewari, a principal group program manager, enterprise cloud solutions, in a video posted last week. "What we are really taking is Azure that runs in the massive scale that we have in Microsoft, and we are trying to shrink wrap that and give it to you as customers so you can basically operate Azure Services out of your datacenter on hardware and systems that run in your datacenter."

Nevertheless, numerous customers and partners testing Azure Stack TP1 were vocal in their displeasure with the decision, which they didn't see coming, as evidenced by dozens of comments on corporate VP Mike Neil's blog post announcing the rollout plan last month.

"Where is the democratization of Azure Stack by forcing 3 overpriced vendors as turnkey solutions onto your customers? Democratization would be offering the Azure Stack platform as a standalone product offering and allowing your customers to choose their own hardware, support plans, etc.," wrote a poster calling himself Andrew McFalls. "This is absurd, a deal breaker for us and I assume many," added another commenter identifying himself as Kevin Mahoney. "Once again Microsoft refuses to listen to its customers."

Tewari said the move was necessary in order to provide a consistent experience. A single cluster in the public Azure cloud is about 880 servers, and the challenge for Microsoft is to make Azure Stack run at a scale customers can manage, which will start at just four servers. While Microsoft doesn't require WAP to run on preintegrated systems, it offers that option via Dell and HPE with its Cloud Platform System, which Tewari noted Microsoft updates on a monthly basis. Given the rapid updates of the Azure public cloud, those updates must be deployed properly on Azure Stack in order for customers to get a consistent experience. Likewise, by limiting the initial delivery of Azure Stack to three partners, Microsoft is ensuring it can maintain control over how the systems are engineered.

"We need a system where we have this continuous validation of all the firmware, all of the software they own, not just initially, but subsequently as the system gets revved up, we provide that in a validated and orchestrated and updated manner to our customers," he explained. "Remember with Azure Stack, it's a full lifecycle product. That means we don't just give you the product and give you the updates and leave it to you as an exercise to apply the updates. We actually do the validation of those updates, and give it you in an orchestrated manner [and] apply those updates, which range all the way from firmware, all the way to all of the Azure services on top, so that they get updated in a manner that your tenant workloads don't go down."

Indeed, many defended the move. Jeff DeVerter, chief technologist for Rackspace's Microsoft practice, admits he was surprised, but said that in retrospect the approach will make it easier to deploy Azure Stack for its hybrid Azure services.

"It actually helps us in some areas," DeVerter said. "When we roll out a rack solution into Equinix-type hosting facilities, they are completely self-contained from the network to the compute to the storage. This solves the problem of me having to engineer, or retrofit my Azure Stack racks, and just use the hyper-converged system from whoever the vendor of the day is."

Many suggested the decision was made because some early testers weren't deploying it properly. In a recent interview, Mark Jewett, senior director of product marketing in Microsoft's cloud platform division, hinted that was the case.

"One of the important things to focus on is making sure that people that are excited by the hybrid vision of being able to have a consistent platform across different types of environments, hosted and public and private, making sure that it goes beyond vision and gets to success, and I think we can mutually agree that there are some examples out there where that has not resulted in success, and so that's where our focus has really been and it has been the learnings in the early adopter program," Jewett said. "Telemetry told us that it was important to deliver the solution this way to avoid that problem. We actually saw numbers behind it."

Azure Stack could be the most important new datacenter software the company has offered in years and if early adopters pan it, that could be a huge blow to Microsoft's push to advance its hybrid cloud strategy.

"Overcoming a bad rollout is really difficult, so I can see why Microsoft is doing this to take the risk out of bad rollouts," said Terri McClure, senior analyst at Enterprise Strategy Group. "Over time, I would expect they will ease up."

Tewari said that's the goal. "As we get more feedback we will expand the diversity of hardware that we run on and eventually maybe even allow customers to run on existing hardware that they own," he said. "But we have to start with [systems] that are well engineered, that are fully validated, with us and our partners so we can really provide that robust experience so customers can be successful with Azure Stack as we go out the door."

Posted by Jeffrey Schwartz on 08/15/2016 at 5:37 AM0 comments


Microsoft To Shut Down Azure RemoteApp Service

Microsoft today announced it will shut down the Azure RemoteApp service. The service launched two years ago as an online alternative to remote desktop Windows application services and the company's application virtualization (App-V) technology. Microsoft will shift its efforts to helping Citrix build and roll out the new cloud-based iteration of its XenApp platform.

The two companies announced a broad expansion of their decades-long partnership back in May. Citrix agreed to use the Microsoft Azure cloud as the backplane for its new portfolio of services. At that time, Citrix CEO Kirill Tatarinov, a former Microsoft executive, pledged to move all of the company's core software offerings, including XenDesktop, XenApp and NetScaler, to the cloud. The two companies said today's move is part of this effort.

"Customers have provided us consistent feedback that they want a comprehensive, end-to-end, cloud-based solution for delivering Windows apps," read a post announcing the plan on the Microsoft's Enterprise Mobility and Security Blog. "The best way for us to deliver this is with Citrix through XenApp 'express', currently under development. This new Citrix service will combine the simplicity of application remoting and the scalability of Azure with the security, management and performance benefits of XenApp to deliver Windows applications to any employee on any device."

Microsoft said it will provide more information on the new offering in the coming months. Bill Burley, corporate VP and general manager of Citrix's Workspace Services Business Unit, in a blog post described the joint effort between Citrix and Microsoft as Azure "RemoteApp 2.0," in which it will provide "simplicity and speed" of Azure RemoteApp with the core functions of Citrix XenApp. Burley signaled this move is part of the company's new Citrix cloud focus centered around simplifying the delivery of virtualized desktop and applications as a service.

"The Citrix Cloud XenApp and XenDesktop service provides our enterprise customers with superior performance and flexibility by moving the backend infrastructure to the cloud," Burley noted. "Now Citrix can combine the power and flexibility of our Citrix Cloud platform including the industry leading HDX user experience and extend the original simple, prescriptive, and easy-to-consume vision of Microsoft Azure RemoteApp by orchestrating the applications in the cloud as well. This offering will radically simplify app delivery without sacrificing management or end user experience."

Burley also said Citrix and Microsoft are developing and delivering the new Citrix Cloud XenApp service in tandem with the planned XenDesktop for Windows on Azure service, which the company announced back at its May Synergy conference.

"By integrating all these efforts across the board, both Citrix and Microsoft will be able to drive down the cost of app and desktop delivery while still providing enhanced security, management and application performance from one of the industry's largest public clouds: Microsoft Azure," Burley noted.

Microsoft said it will continue to operate and support its Azure RemoteApp customers through Aug. 31, 2017, and the company noted it is working with customers on a transition plan. The company will stop offering new Azure RemoteApp purchases on Oct. 1 of this year. In addition to the new Citrix service, customers could opt to use Remote Desktop Services deployed on Azure IaaS or one of the offerings provided by Microsoft's hosting partners. It was not immediately clear how, or even if, Microsoft and its partners will sell the service, though the company noted it will have more to say at next month's Ignite conference in Atlanta.

Posted by Jeffrey Schwartz on 08/12/2016 at 11:32 AM0 comments


Cloud-Based Secure Data Exchange Detailed by Microsoft Research

Microsoft researchers have developed a way for parties to use the cloud to transact secure trades of encrypted data, while giving the owners of that information complete control over which specific datasets are actually shared.

Using a cloud-based exchange and a concept the researchers call "secure multiparty computation," owners of data can encrypt and store it online, with the ability to process and share pieces of information earmarked for specific parties. This is done without compromising the security and privacy of other information in the larger dataset, according to Ran Gilad-Bachrach, a researcher in Microsoft's Cryptography Research group and coauthor of a paper published in June.

Microsoft revealed the researchers' work in a blog post this week and suggested the technology might be available soon. The idea behind "secure multiparty computation" is that it allows an individual holder of data to share it with multiple parties in a group, but ensures no one sees information about other members of that transaction.

For example, if a group of employees wanted to determine how their salaries rank among one another without actually telling everyone how much they earn, they'd have to find one trusted colleague to disclose their information to, who would then compute the ranking and share the results. Using this cloud-based secure multiparty exchange, users could compute the same result without needing to share their salaries with that trusted colleague, according to another coauthor of the paper, Peter Rindal, a Microsoft intern and PhD candidate at Oregon State University with expertise in secure multiparty computation.
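
To make the salary example concrete, here is a minimal, purely illustrative sketch of additive secret sharing, one common building block for secure multiparty computation. It is not the construction or library described in the Microsoft paper; it simply shows how a group can learn a total (and therefore an average) while no participant ever sees another participant's raw number.

```python
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime


def make_shares(value, n_parties):
    """Split `value` into n_parties random shares that sum to `value` mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares


def secure_sum(salaries):
    """Combine everyone's shares; only the total (never an individual salary) is revealed."""
    n = len(salaries)
    # shares[i][j] is the share that party i hands to party j
    shares = [make_shares(s, n) for s in salaries]
    # each party j locally adds the shares it received ...
    partial_sums = [sum(shares[i][j] for i in range(n)) % PRIME for j in range(n)]
    # ... and the partial sums combine to the true total
    return sum(partial_sums) % PRIME


salaries = [72_000, 85_000, 64_000, 91_000]
total = secure_sum(salaries)
print("average salary:", total / len(salaries))  # 78000.0, with no salary disclosed
```

Each employee could then compare a private salary against the computed average without any individual figure ever being disclosed to the group.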

An exchange like this could be useful among those who want to share expensive medical research at a lower cost while ensuring the privacy of certain information. Those who own any kind of data could encrypt hundreds or even thousands of components and issue a buyer a key specific to its portion of the data to decrypt, according to the report. Because the keys are stored in the cloud, the Microsoft researchers noted, the security and privacy of the data would be compromised if the keys were shared directly.

"Instead, we want to use the keys to decrypt the data inside a multiparty computation," Kim Laine, a post-doctoral researcher, coauthor of the paper and also a member of Microsoft's Cryptography Research team, explained in Microsoft's post. Laine is studying how to compute on encrypted data.

Microsoft said that while it's a research project now, "the team aims to publicly release the library, or tools, needed to implement the secure data exchange in the near future."

Posted by Jeffrey Schwartz on 08/10/2016 at 12:39 PM0 comments


Nadella Bullish About LinkedIn and Minecraft Growth

When Microsoft announced back in June that it had agreed to acquire LinkedIn for $26.2 billion, it raised a lot of eyebrows. The huge price tag was triple the amount it paid for Skype back in 2011, then Microsoft's largest acquisition. The jury is still out on that one, but Microsoft's two other megadeals -- Nokia's handset business for $7.2 billion three years ago and aQuantive for $6.3 billion in 2007 -- were busts, and overall Microsoft's reputation for monetizing large acquisitions is poor.

But none of those deals were pulled off by current CEO Satya Nadella, who wrote off most of the Nokia acquisition. The first major acquisition by Nadella, game maker Mojang, which Microsoft acquired two years ago for $2.5 billion, appears to be paying off. Mojang recently announced that its flagship game Minecraft has sold 100 million copies to date, making it the second-best-selling game in history. Nadella pointed to its success and his optimism that the LinkedIn deal will be a catalyst for Microsoft's growth in an interview last week with Bloomberg's Dina Bass.

"When I look at both Minecraft and LinkedIn, they're great businesses that are growing," Nadella told Bass. "And so, in fact, if anything, our core job is to take that franchise and give it more momentum. In the case of Minecraft, it's the biggest PC game, and we are the PC company. Their growth was moving to console. We have a console. Therefore, we were a perfect owner. Same thing with LinkedIn. They're a professional network for the world. We have the professional cloud. Time will tell, but I'm very, very bullish."

Indeed, LinkedIn surprised analysts last week with much better earnings and revenue than expected. While the company posted a dismal loss of $119 million, non-GAAP earnings of $1.13 per share handily exceeded expectations of $0.78 per share, and revenue of $933 million for the quarter increased 31 percent. Also worth noting is that membership grew from 433 million to 450 million members and member page views increased 32 percent, yielding 21 percent year-over-year growth in page views per member accessing the service, the company said.

While indeed $26.2 billion is a huge price tag for any acquisition, Microsoft is effectively paying $58.22 per member, nearly $2 less than the $60.04 at the time the deal was announced, now that the member roster has increased.
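
For readers who want to check that back-of-the-envelope figure, the per-member math works out as follows (a quick illustrative calculation using only the deal price and the latest member count cited above):

```python
deal_price = 26.2e9   # $26.2 billion purchase price
members = 450e6       # roughly 450 million members after the latest quarter

print(f"${deal_price / members:.2f} per member")  # prints $58.22 per member
```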

There are plenty of skeptics wondering whether the deal will prove to be worth the price, including Microsoft MVP and Redmond contributor Brien Posey.

A survey of Redmond magazine subscribers we fielded in July found more than one-third of the respondents (37 percent) believe acquiring LinkedIn was a smart move by Microsoft and that it will live up to its promise of helping the company create new ways for people to connect with people through its core offerings, including Office 365 (see this month's Redmond cover story). Only 22 percent believe it was a bad deal, with the remaining 42 percent saying it's either too confusing to predict or it has no obvious upside (but poses little risk). A formidable number of respondents, 40 percent, believe Microsoft will make LinkedIn much or somewhat more useful, while 31 percent believe it will become much or somewhat less useful. The remainder don't anticipate significant change, or have no opinion.

Posted by Jeffrey Schwartz on 08/08/2016 at 1:04 PM0 comments


Study Finds SharePoint Employee Usage Remains a Challenge

SharePoint is widely deployed among organizations. But getting employees and specific groups to use it as their enterprise content and document management system remains a challenge, according to AIIM's Impact of SharePoint 2016 report.

AIIM, the large and influential nonprofit trade association covering the enterprise information management market, released the findings of its latest SharePoint report this week. Based on a survey of 274 AIIM members in June, the report found that SharePoint use has increased incrementally, but more than half (58 percent) said SharePoint remains a challenge. The key problem: 43 percent said employees or groups have their own preferred file-sharing applications for everyday usage, according to the report.

Nearly one quarter (23 percent) said they are planning hybrid environments, with plans to host a majority of their content in Office 365 SharePoint Online and some portion of data deemed sensitive remaining on premises. According to the survey, 17 percent of respondents are planning to move everything into the online version.

Also, despite the widely publicized release of SharePoint Server 2016 in May, only 23 percent are aware of what it offers, while 29 percent have no idea what it includes. A quarter said they're still on SharePoint Server 2010 and 41 percent are on SharePoint Server 2013. Only 17 percent said they'll increase spending to upgrade to SharePoint Server 2016, while 13 percent say they'll purchase the new release for the first time and a mere 2 percent have gone live with it, though it had only been on the market for a bit over a month when the survey was conducted.

The survey showed that SharePoint is the main enterprise content management/document management (ECM/DM) system for 28 percent of organizations, that 13 percent consider it important to their overall ECM/DM environment and that 22 percent report reluctant employees are hampering adoption of SharePoint.

Since its early releases, many organizations have struggled to get broad groups of employees to use SharePoint. A substantial percentage (40 percent) said their SharePoint implementations aren't successful, with two-thirds (67 percent) blaming inadequate user training, 66 percent saying it's too difficult to use and 64 percent reporting that a lack of support from senior management is the reason their deployments have failed.

"I've heard personally from folks that have said senior management decided to deploy SharePoint because it's SharePoint but they have no clear understanding of what it's supposed to do," said Bob Larrivee, VP and chief analyst of AIIM Market Intelligence. "Change management is still a big issue when it comes to SharePoint." According to the survey, 58 percent of respondents cited change management for their reluctance to use SharePoint, with 51 percent blaming it on a lack of technical expertise and more than one-third pointing to inconsistent metadata classifications as a key problem.

Barry Jinks, CEO of Colligo, which, along with Gimmal, cosponsored the survey, said executive sponsorship is critical. "If you don't have senior management support and governance for SharePoint, people will say 'maybe I'll use SharePoint, maybe Dropbox, Box or something else,'" Jinks said. "So you get this mishmash of document storage solutions throughout a company."

Despite those challenges, there is some good news: 58 percent say that revitalizing their SharePoint projects through user training is a priority, and 50 percent said they plan to update their SharePoint governance policies. More than one-third (35 percent) say they plan to focus on making SharePoint their primary ECM.

Posted by Jeffrey Schwartz on 08/05/2016 at 2:04 PM0 comments


Surveys Reveal Various Windows 10 Migration Priorities

Migrating PCs to Windows 10 is the top end-user computing priority of VMware customers, according to the results of a survey published yesterday. Yet another survey, by systems management software supplier Adaptiva, found more than three-quarters of respondents see no urgency to upgrade, with 64 percent saying they'll upgrade within the next year.

With any survey, it depends on who you ask. VMware polled organizations of various sizes from small businesses to large enterprises. Adaptiva's customers are typically very large enterprises that use Microsoft's System Center Configuration Manager.

Naturally, both companies timed the release of their findings in concert with Microsoft's release of its Windows 10 "anniversary update," which, for many IT decision makers, is the equivalent of what historically was known as the operating system's first service pack. IT organizations typically wait for that first major update before embarking on major upgrades to the new operating system. Interestingly, only 30 percent of the 300 Adaptiva customers with more than 10,000 seats say the anniversary update is signaling them to upgrade right away.

VMware's survey of 600 customers, conducted in June, found that 75 percent of organizations will migrate their users within two years. Nearly two-thirds (64 percent) identified Windows 10 migration as their top priority, with maintaining and supporting existing Windows 7/8 PCs ranking second (56 percent) and PC hardware refresh third (51 percent). Nearly three-quarters (74 percent) said the ultimate end of life of Windows 7 and 8 is the primary incentive for organizations to migrate to Windows 10. Almost half (49 percent) said that the inclusion of migration plans in their license agreements was also a factor. The survey found 43 percent saying that they wanted Windows 10 because it can take advantage of new hardware capabilities.

Among some other noteworthy data points from VMware's survey:

  • Only 25 percent are migrating to take advantage of better management capabilities, and even fewer (15 percent) are motivated by the new enterprise mobility management (EMM) capabilities. Only 11 percent said that they are upgrading because of the new applications the OS supports.
  • More than half (58 percent) said they will not change the way they manage Windows when upgrading.
  • A mere 17 percent said they were aware Windows 10 is optimized to implement mobility management functions in general and only 14 percent know it's better suited than previous versions to enable cloud-based management.
  • Small and midsize businesses are the furthest ahead (38 percent) with their Windows 10 migrations, compared with larger commercial organizations (21 percent) and the largest enterprise customers (17 percent).
  • Half of the respondents (50 percent) said they'd rather use Windows 10 to work on projects.
  • Roadblocks to migrating include concerns about application compatibility (61 percent), staffing (41 percent) and negative perceptions by some employees (37 percent).

Despite lack of interest or awareness of Windows 10's EMM capabilities by a noteworthy percentage of respondents to the VMware survey, those polled by Adaptiva ranked the OS's improved security as a primary factor for upgrading. Specifically, Windows 10's enterprise data protection, Device Guard, Credential Guard and security auditing are key drivers for the migration, according to the Adaptiva survey. Yet those Adaptiva respondents that have started migrations are pleased with their deployments, with more than half (52 percent) planning to have their systems migrated within the next year. At the same time, 65 percent say their deployments have had challenges. Among some of Adaptiva's findings:

  • System Center Configuration Manager (ConfigMgr) is the top deployment tool of choice, 83 percent overall, and nearly 90 percent among large enterprises.
  • According to the survey, 56 percent are utilizing Current Branch for Business with only 22 percent using Long-Term Servicing Branch.
  • To manage ongoing updates, 42 percent are using SCCM Windows 10 upgrade task sequences and 29 percent are using offline servicing features.

The appeal of Windows 10's improved management support is strongest for smaller organizations that don't have System Center or a similar enterprise-grade management platform. In a blog post revealing VMware's survey results, Mark Margevicius, the company's director of end user computing strategy, noted that the enterprise mobility management capabilities built into Windows 10 are not resonating, including the OS's new automatic enrollment, off-domain-join management and Microsoft's new update release cycle.

"Despite these new capabilities, most customers were not motivated by these features or preferred to stay with traditional Windows management techniques (i.e. imaging, relying on patching/updating or PCLM tools)," Margevicius noted. "We believe that customers who rely on traditional management are ready for a better way. Focusing on EMM can dramatically improve Windows management, while dropping operational costs."

The timing and dynamics of the data from VMware are interesting. Microsoft and VMware both have competing EMM offerings: Microsoft with its Enterprise Mobility + Security (recently renamed from Enterprise Mobility Suite) and VMware offers the popular AirWatch offering. At the same time, Microsoft has partnered with its rival to make sure AirWatch can take advantage of all Windows 10's EMM capabilities -- a surprising pact announced at last year's VMworld conference.

VMware also conveniently announced the release of its new AirWatch Express, which it describes as a cloud-based MDM offering targeted at businesses and organizations that have limited and, in some cases, no IT staff. The offering supports iOS, Mac, Android and Windows 10 devices, with pricing beginning at a monthly subscription fee of $2.50 per device. Based on the same AirWatch platform, the company said the Express offering aims to simplify the configuration of applications, e-mail, WiFi and the implementation of security and encryption.

Posted by Jeffrey Schwartz on 08/03/2016 at 12:38 PM0 comments


Windows 10 'Anniversary Update' Arrives Along with SDK

Windows 10 users can now upgrade to Microsoft's widely anticipated release of its "anniversary update". It's the first major upgrade to Windows 10 since its release a year ago and the equivalent of a first service pack, though with a notable number of new features.

In a post announcing the release, Michael Fortin, corporate VP for the Windows and Devices Group, noted users don't need to do anything to get it, as the company is pushing the update out automatically through the Windows Update process. Those who want it right away can go into the settings and manually request the update, presuming they have administrative rights to do so.

In short, to perform the upgrade, go to Settings > Update & Security > Windows Update and then click "Check for Updates." Make sure you're downloading version 1607.
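
For those who want to confirm which release a machine actually landed on, one quick, illustrative way (assuming a Windows 10 machine with Python installed; this is not a procedure described in Microsoft's announcement) is to read the version values Windows 10 records in the registry:

```python
import winreg

# Windows 10 records its release (e.g., "1607" for the anniversary update)
# under HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion.
key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                     r"SOFTWARE\Microsoft\Windows NT\CurrentVersion")
release_id, _ = winreg.QueryValueEx(key, "ReleaseId")
build, _ = winreg.QueryValueEx(key, "CurrentBuild")
winreg.CloseKey(key)

print(f"Windows 10 version {release_id}, build {build}")  # expect 1607 / 14393
```

If the value still reads an earlier release, the update hasn't arrived yet and can be requested manually through Windows Update as described above.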

MSDN subscribers can access the update as well.

Now that Microsoft has rolled out the anniversary update, its next priority is to get more apps on its Universal Windows Platform, which is critical to the long-term success of the OS. To that end, Microsoft today also released the Windows 10 anniversary update SDK, which the company said includes more than 2,700 improvements to the Universal Windows Platform. The SDK aims to enable developers to build functionality into their apps around some of the core improvements in the Windows 10 anniversary update, including Ink, Cortana and Windows Hello.

Microsoft said it is also opening the Dev Center and the Windows Store, allowing developers to submit apps built for PCs, phones and HoloLens that target the Windows 10 anniversary update SDK. The company will begin accepting apps via the Desktop Bridge, the tool formerly code-named Project Centennial, which is designed to let developers convert existing Windows apps to the new Universal Windows Platform.

"While we build the pipeline into the Windows Store to publish these apps, our team will work directly with developers to get their converted apps and games into the Windows Store," said Microsoft's Kevin Gallo, general manager for Microsoft's developer division. Developers can contact Microsoft here to submit an app using the Desktop Bridge to the Windows Store, he said.

Microsoft will also outline how developers submit Xbox apps targeting the Windows 10 Anniversary Update SDK into the Windows Store via the Dev Center, he added. "The Store is open for business and new innovations with Inking, Cortana and Edge will enable new experiences that simply aren't possible on other platforms," Gallo said.

Now that Microsoft has built it, will the developers come?

Posted by Jeffrey Schwartz on 08/02/2016 at 11:31 AM0 comments


Microsoft Says Goodbye to Passport with Updated Windows Hello

Tomorrow's release of the Windows 10 Anniversary Update will give the operating system improved biometric logon capabilities via its Windows Hello feature, but it will mean goodbye to Passport, the component that enabled authentication. The demise of Passport is a change in name only, Microsoft said in late June. It won't be the first time Microsoft has cast aside the Passport brand, which was once the name of the single sign-on service also formerly known as Live ID (and now called the Microsoft Account).

It was surprising that Microsoft resurrected Passport last year when introducing Windows Hello but perhaps the thinking behind the revival was that it would help customers correlate it with the notion of single sign-on. When Microsoft released Windows 10 a year ago, Passport specifically was the component in Windows 10 that let users authenticate to a Microsoft account, an Active Directory within Windows Server, an Azure Active Directory account and any service that supports the Fast ID Online (FIDO) authentication specification. Windows Hello represents the biometric authentication component, which could include a fingerprint scanner available on some systems, facial recognition or a gesture. Windows Hello can also enable authentication via a PIN. Windows 10, when released last year, used Passport to authenticate users.

"Collectively, these represented our FIDO 2.0 aligned end-to-end multi-factor authentication solution," said Nathan Mercer, a Microsoft technical evangelist, who explained the change in a TechNet post in late June. "Moving forward, Windows Hello will represent the brand we will refer to for our FIDO-aligned end-to-end multi-factor authentication solution. Microsoft Passport will be retired as a brand and the credential is now considered part of Windows Hello," Mercer noted. "From a customer's perspective, this is simply a semantics issue and there are no material changes from a configuration or security perspective."

When Windows 10 was released last summer, the FIDO 2.0 spec -- the framework around which the Hello technology was originally conceived -- was still under development by the FIDO Alliance, a consortium of more than 250 members including Microsoft, Google, PayPal, Bank of America, Dropbox and GitHub. The alliance has since submitted the spec to the World Wide Web Consortium (W3C), the Internet standards body responsible for specs and languages including HTML, CSS, SOAP and XML, among others. The FIDO Alliance is working with the W3C to ensure these Web APIs allow FIDO 2.0 authentication to work across platforms and anywhere on the Web. "Once this work is complete, we will synchronize any applicable changes into Windows Hello itself," Mercer said.
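
To give a sense of where that W3C work leads on the Web side, the browser-facing API that grew out of the FIDO 2.0 submission lets a site ask the platform authenticator -- Windows Hello, in Microsoft's case -- to create a public key credential. The following TypeScript sketch reflects the API as it later standardized (the Web Authentication API) rather than anything Microsoft has shipped yet, and its relying-party and user values are placeholders:

```typescript
// Sketch of FIDO-style credential registration through the Web API that grew out of the
// FIDO 2.0 submission to the W3C (the Web Authentication API). Relying-party and user
// values are placeholders; in practice the challenge is issued by the server.
async function registerWindowsHelloCredential(): Promise<Credential | null> {
  const options: PublicKeyCredentialCreationOptions = {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // placeholder; server-issued in practice
    rp: { name: "Example Relying Party" },
    user: {
      id: new TextEncoder().encode("user-1234"),
      name: "user@example.com",
      displayName: "Example User",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],      // ES256
    authenticatorSelection: { userVerification: "required" }, // e.g. a Windows Hello PIN or biometric
  };
  // The browser hands the request to the platform authenticator and returns a public key credential.
  return navigator.credentials.create({ publicKey: options });
}
```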

Another improvement is that Windows Hello will protect credentials from theft and unauthorized changes by anchoring them to a device's hardware-based Trusted Platform Module (TPM). On devices that lack embedded TPM hardware, Windows Hello will fall back to software-based encryption.

In addition to the Windows Hello improvements, the 350 million Windows 10 users receiving tomorrow's update will see some other noteworthy security improvements in the OS, including malware scanning, threat detection, and the ability to schedule scans and receive reports when threats are discovered. Enterprise customers will also get the new Windows Defender Advanced Threat Protection (WDATP) feature, which "detects, investigates, and responds to advanced malicious attacks on networks by providing a more comprehensive threat intelligence and attack detection," said Yusuf Mehdi, corporate vice president for the Microsoft Windows and Devices Group, in a blog post last month. Several new PCs will support Windows Hello, including the latest Asus ZenBook Pros and Transformer -- with fingerprint scanners and a camera that enables facial recognition, respectively -- and the Dell Inspiron 7000. For systems that don't support Windows Hello, MouseComputer has announced a peripheral camera that users can plug into any USB port. The company, based in Japan, also will offer a peripheral fingerprint scanner.

Posted by Jeffrey Schwartz on 08/01/2016 at 12:10 PM0 comments


Last Chance for the Windows 10 Free Upgrade?

Unless you have spent the last year in a cave, it should come as little surprise that the deadline to receive Microsoft's free Windows 10 upgrade expires this Friday. If you're one of those who files your tax returns at the stroke of midnight on April 15 and have a similar strategy for upgrading your Windows PC, the rules are comparable: you have until 11:59 p.m. on July 29, in your time zone, to upgrade. So will there be a last-minute rush by the procrastinators? Perhaps, but I wouldn't bet on it.

While IT organizations often do wait until, and beyond, the last minute to complete certain tasks, if 2014's Windows XP expiration and threats of Armageddon didn't light a fire under many users of that venerable OS, why would the supposed last chance to get Windows 10 at no charge create urgency? Some may believe Microsoft will extend the deadline, though there's strong reason to believe that's unlikely. The company has remained quite firm about this deadline and, given that the Windows 10 Anniversary Update arrives Tuesday, it makes sense that Microsoft wouldn't want to have both offers in play simultaneously. It's plausible that Microsoft may revive the offer at some point. But even then it wouldn't likely be in the near term, and it could come with different stipulations.

There's another reason Microsoft will stick to its guns with Friday's deadline. The free offer certainly was no gift to OEMs, as it gave customers a chance to breathe new life into their existing PCs -- especially those Windows 8 models that are already touch-enabled. If there wasn't an implicit understanding between Microsoft and the OEMs that the free Windows 10 offer was limited to one year, extending it as the back-to-school season kicks off certainly wouldn't help matters. A number of new devices are set for release.

A great majority of businesses and enterprises tend to upgrade their OSes in concert with the refresh of their hardware. The option to let customers try and keep Windows 10 for free was a smart move to get consumers and IT organizations to take a risk-free approach to upgrading to the new operating system. Given the skepticism about Windows after most users outright rejected Windows 8, and the fact that many were considering non-Microsoft alternatives, it was a wise thing to do. The shift to Windows as a service moving forward will make this a moot point. But if you have a PC running Windows 7 or Windows 8.1 that's on its last legs, it may be worth finding an hour or two to give it a refresh with Windows 10 and see if it gets a new lease on life. Of course, don't forget to back up your data -- just in case -- though in the few upgrades I've done, I never had a problem.

Chances are by now, if you haven't taken advantage of the free upgrade, it was a calculated decision. Perhaps your hardware or software isn't conducive to it (or at least you're concerned about the possibility). Indeed, some people have horror stories to share, including my colleague Kurt Mackie. And while I'm aware of a handful of others who experienced problems, most upgrades have gone off without a hitch. I even upgraded a 10-year-old desktop that originally shipped with Windows Vista without any problem, though that system, with just 1GB of RAM, does run quite slow -- but it runs. Even with the strong likelihood that this is your last chance for this upgrade, the penalty for remaining on the sidelines isn't as severe as filing your taxes late.

Posted by Jeffrey Schwartz on 07/27/2016 at 11:13 AM0 comments


A New Approach to DevOps and Hybrid Cloud: HCIaaS

Gridstore, a company that has made a name for itself over the past few years in the rapidly growing market of hyperconverged systems with an emphasis in Microsoft Hyper-V environments, is casting a wider net with the acquisition of DCHQ, a supplier of container-based orchestration tools. The newly combined company will now call itself HyperGrid, offering what it claims is the first hyperconverged computing as a service. The so-called HCIaaS consists of its hardware, public cloud services and container-based development, orchestration, deployment and management environment integrated and delivered with consumption-based pricing.

The new HyperGrid is aiming to broaden the reach of hyperconverged computing, which offers software-defined compute, storage and network control in tightly integrated systems with shared resources and common management for Web-scale applications requiring high availability. Gridstore has offered its appliances designed for Hyper-V environments for several years for operations requiring high availability such as VDI, but the market for hyperconverged gear has become highly competitive. Dell, Hewlett Packard Enterprise, Lenovo and Cisco all have released or are readying new hyperconverged systems. In addition to the major suppliers, Gridstore competes with a number of other startups including Nutanix, which has several OEM deals, a partnership with Microsoft to deliver its Cloud Platform System (CPS) and is seeking to go public.

The market for what Gartner calls hyperconverged integrated systems (HCIS) is expected to grow 79 percent to $2 billion this year, according to the market research firm, which predicts the technology will be in widespread use by enterprises and service providers within the next five years. While the traditional hardware and storage markets still dwarf HCIS in terms of size, the new technology is the fastest growing and over time is expected to cannibalize the current systems and storage market. Gartner is forecasting revenues of $5 billion for HCIS in 2019.

DCHQ offers development, orchestration, deployment and management of containerized software that works with emerging tools from the likes of Docker and Mesosphere as well as traditional virtual environments including Hyper-V, VMware vSphere, KVM, bare metal systems and OpenStack. HyperGrid will bundle the software with its hyperconverged systems, which will also include subscriptions to public clouds -- initially AWS with Azure and others to follow.

The software DCHQ provides allows developers to build and manage new cloud-native, SaaS-style apps as well as to package traditional apps into containers. By integrating that software with the Gridstore hardware, the new HyperGrid believes that offering the entire hardware, software and public cloud services bundle with consumption-based pricing will bring hyperconverged computing to organizations that otherwise can't afford it today.

HyperGrid Chairman and CEO Nariman Teymourian believes its business model allows the company to effectively offer usage-based hybrid cloud services with its hardware, software and public cloud service, all supplied and billed by HyperGrid based on an organization's entire consumption over a given month. For customers, it promises to lower the cost of acquiring hyperconverged systems and software, Teymourian said. HyperGrid can offer this consumption-based pricing because the company doesn't have to concern itself with cannibalizing an existing commodity hardware business, he said. Teymourian said he would know, having managed the hyperconverged systems business at HPE prior to its split from HP. Before that he was an executive at Dell.

"Customers don't want to buy, they just want to rent it," Teymourian said. "They want it as services and to be able to use it when they need it just like they do with AWS."

HyperGrid will offer its services, initially with the ability to scale out to AWS, with other popular cloud services in the pipeline in the coming months including Azure, Digital Ocean and OpenStack, Teymourian said.

"We can provide an AWS-like experience that you can control in datacenter or through a service provider and have it consumed as an AWS-like feature but at a significantly reduced cost."

Enterprise Strategy Group Senior Analyst Terri McClure said the major hardware providers all offer sophisticated financing and consumption-based packages but HyperGrid's approach lends itself to organizations looking for a complete DevOps-based hybrid cloud offering. "This could help to bridge the gap between the development and the operations groups when it comes to DevOps," she said. "HyperGrid is doing this with a bridge to the cloud so they don't have to put as much capacity on-premises."

A Dell official said the company offers cloud offerings supported by consumption-based pricing plus "other creative financing offers from Dell Financial Services. We offer this through a portfolio of Scale Ready Payment Solutions, designed to enable customer adoption of cloud solutions wherever they are on their cloud journey. The programs include Scale On Demand, Pay as you Grow and Provision and Pay, as well as its Cloud Flex Pay, which lets customers evaluate its DCHS gear for several months."

Time will tell whether HCIaaS sticks to the list of "as-a-service" buzzwords.

Posted by Jeffrey Schwartz on 07/25/2016 at 12:32 PM0 comments


Microsoft To Showcase New SharePoint Features at Ignite

Microsoft is planning to put its new SharePoint Framework in the spotlight at the company's Ignite conference, scheduled for the last week of September in Atlanta. In addition to the likely release of at least some of the "Feature Packs" recently announced for the server edition and its revamped JavaScript-based SharePoint UI, Microsoft will hold more than 60 technical sessions at Ignite centered on new and forthcoming features of its flagship collaboration platform as well as Office 365, both of which will share the same framework.

Jeff Teper, corporate VP for Microsoft's SharePoint and OneDrive for Business, revealed the Ignite plans yesterday during the 100th episode of the Office 365 Developer Podcast (listen here). Microsoft's plan to emphasize the on-premises collaboration platform and Office 365 at Ignite is noteworthy because many in the SharePoint community felt the company downplayed it at last year's inaugural event in Chicago. The lack of emphasis at last year's Ignite led many to question whether Microsoft was committed to ongoing on-premises releases, especially since the event was conceived to consolidate the former TechEd and SharePoint conferences hosted by the company in past years.

One year after the inaugural Ignite conference, Teper outlined new features coming to SharePoint Server based on the online Office 365 version -- arriving in the form of Feature Packs -- along with developer capabilities via its new PowerApps and Flow tools, the revamped modern UI and the new framework, which enables the JavaScript rendering engine along with support for other modern languages.

Ignite is now Microsoft's largest technical conference but it targets a broad cross section of IT executives, administrators, operations managers and developers, drawing more than 20,000 attendees. Teper, known as the "father of SharePoint," talked up the Ignite plans during the 30-minute discussion with co-hosts Richard diZerega and Andrew Coats, both Microsoft technical evangelists. "I want to get a little hat that says 'make SharePoint great again,'" Coats quipped, though Teper quickly responded "let's stay away from either party."

All kidding aside, Teper is largely credited with the early success of SharePoint Server. After a two-year hiatus from the SharePoint team to work on strategy for CEO Satya Nadella, Teper was put back in charge of it and the OneDrive for Business organization. Teper presided over the May Future of SharePoint event.

Since that event, Teper said Microsoft has made significant progress in meeting goals outlined at the time. The company is about to hold its third "dev kitchen," the name of in-person events held by the product teams in Redmond with MVPs. "I think by Ignite, you'll see stuff people can use more broadly and we are really, really excited about it," Teper said in the podcast. "If you look at the performance of some of the new Pages, in the new SharePoint UI, you will see a taste of what's ahead."

The new SharePoint Framework is built with integration hooks into the Microsoft Graph and support for open source tooling that delivers client-side JavaScript rendering. This will give SharePoint a much more modern appearance that is also optimized for mobile devices. By supporting JavaScript and other popular environments such as Go and Swift, Microsoft is looking to enable developers who don't want to use C# or tools like SharePoint InfoPath to embrace the platform.
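
To give a flavor of that model, here is a minimal sketch of a client-side web part under the new framework. The package and base-class names reflect the early preview tooling and should be treated as assumptions rather than final API:

```typescript
// Minimal client-side web part: the framework instantiates the class in the browser and
// hands it a DOM element; render() fills it with markup, with no server-side page rendering.
import { BaseClientSideWebPart } from '@microsoft/sp-webpart-base';

export default class HelloSharePointWebPart extends BaseClientSideWebPart<{}> {
  public render(): void {
    this.domElement.innerHTML = `
      <div>
        <h2>Hello from the SharePoint Framework</h2>
        <p>Rendered entirely on the client in TypeScript.</p>
      </div>`;
  }
}
```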

When SharePoint was conceived more than a decade ago, the idea was to extend the capabilities of the desktop Office suite at the time to servers to enable information sharing via a document management platform, Teper noted. Today, SharePoint is one of Microsoft's core offerings (1.3 million LinkedIn members say they have SharePoint skills, Teper said). New features now typically appear first in the SharePoint Online offering in higher-end Office 365 subscriptions before coming down to the server product, which is a switch from the old SharePoint Server 2013 days.

In a sense, Teper was brought back to keep SharePoint great, at least from Microsoft's perspective.

"When I came back to the team I had a chance to think about what do we really have to do to make SharePoint better," he said. "It's what made it great was embracing core collaboration, content management made up of building blocks -- Pages, Parts, Lists, Libraries and so forth, but we needed to modernize both applications, make it easier to use out of the box, make it run great on a phone, and modernize the development platform. We also need to leverage what people are doing in JavaScript and give people a simple answer for a RAD tool around PowerApps."

Teper said the changes coming to SharePoint as an application platform are the most significant from an architectural perspective since 2003, when the company embraced the ASP.NET page model that controls all of the SharePoint parts; the move now is to a 100 percent JavaScript-based model that will support other modern programming languages such as Go and Swift. Teper is also hopeful this will draw new developers to the SharePoint ecosystem, including those in the open source community and developers building cloud-native, container-based apps. At the same time, he's optimistic that the large SharePoint community that now exists will find this equally appealing.

"I think this [new] model is really good," Teper said. "I think you'll see a simple user experience [and] we will have an extensibility model. The goal is not to have somebody have to read 1,000-page how-to program in a SharePoint book. The goal is for any developer out there to come to SharePoint and feel welcome to the party. The .NET developers -- we're thrilled [to have] but we want to attract more [developers using other programming languages] to grow SharePoint. I think besides having a forward-looking architecture this was a way to grow the community."

Posted by Jeffrey Schwartz on 07/22/2016 at 11:51 AM0 comments


Windows Nano Server Will Require Microsoft Volume License Agreements

The Nano Server edition of Windows Server 2016 will only be available to customers with Microsoft's Software Assurance plan, offered with certain volume license agreements. Organizations will have to have Software Assurance to use Nano Server, which also must follow the Current Branch for Business update cycle.

While Microsoft revealed the Software Assurance requirement last week, it may have slipped under the radar for some because it was the subtext of the company's announcement that Windows Server 2016 and System Center will be available this fall.

Microsoft has heavily promoted Windows Server 2016's Nano Server, which is notable for being a "headless" or lightweight server option that lacks a GUI, although it has management benefits from having a small footprint. One of its touted benefits is to support "cloud-native applications built with containers and micro-services." The Software Assurance requirement will restrict the use of Nano Server to customers with volume license agreements, though a Microsoft official explained that this server deployment option is for customers building scale-out Windows-based infrastructure, which are typically large organizations with such arrangements. Nevertheless, those without Software Assurance won't be able to deploy the Nano Server option.

"They're setting it up that only an enterprise customer will be able to run Nano Server because not all SMB and midmarket customers have Software Assurance," said Clint Wyckoff, Microsoft Cloud, Hyper-V and Datacenter MVP and a technical evangelist at data protection software vendor Veeam Software.

The Current Branch for Business update model, which will typically come with two to three updates per year, makes sense for cloud-scale environments, according to Wyckoff. "If they integrate and add different modules to Nano Server, for instance a new domain controller, or Microsoft adds another technology to Nano Server, all customers will get the latest features and functionality because they are included in the Software Assurance program," he said. "But there is also a higher price tag associated with it."

Indeed, Microsoft wants customers using Nano Server to be on the latest releases because it's intended for cloud-native applications and operations, said Mark Jewett, senior director of cloud platform marketing. "It really is the focus on aligning with the pace of innovation," Jewett said. "In fact we want to keep that thing up and the fact we want to have a servicing model relationship with customers established, has helped us do that."

The Current Branch for Business service update model for Windows Server 2016 and Nano Server follows an approach that's similar to that of Windows 10. However, unlike the client OS version, administrators have more manual control over installing the server updates. They still can't be more than two updates behind, though, since at that point the branch stops getting future updates. Microsoft wants organizations deploying Nano Server to be those "moving at a 'cloud cadence' of rapid development lifecycles and wish to innovate more quickly." Microsoft said Software Assurance is necessary to support Nano Server's Current Branch for Business model.

Posted by Jeffrey Schwartz on 07/20/2016 at 3:20 PM0 comments


Microsoft: 1 Billion Windows 10 Installs by 2018 Is Unlikely

Microsoft will likely fall short of its prediction that Windows 10 will be running on 1 billion devices by mid-2018. Terry Myerson, executive VP of Microsoft's Windows and Clients group, delivered that bold forecast a year ago at the Build 2015 conference, just three months before releasing Windows 10.

However, that prediction presumed that Microsoft's newest Lumia rollout last year would boost, or at least maintain, its fortunes in the mobile phone market. The company's decision to scale back its efforts has changed the equation enough for Microsoft to acknowledge reaching the 1 billion milestone could take longer than originally anticipated.

The confirmation came from Microsoft Corporate VP Yusuf Mehdi in a statement to author Ed Bott, who offered some perceptive insights on the lessons learned in the year since Windows 10 shipped in his ZDNet blog. According to Mehdi's statement:

Windows 10 is off to the hottest start in history with over 350 million monthly active devices, with record customer satisfaction and engagement. We're pleased with our progress to date, but due to the focusing of our phone hardware business, it will take longer than FY18 for us to reach our goal of 1 billion monthly active devices. In the year ahead, we are excited about usage growth coming from commercial deployments and new devices -- and increasing customer delight with Windows.

Bott pressed the issue, noting that when Myerson set the 1 billion device goal last year, he "did the math on that claim" and determined it was indeed realistic. "But my numbers relied on Windows Phone continuing to sell at least 50 million handsets per year, in addition to upgrades, for a total of at least 200 million Windows 10 Mobile devices in the market by 2018," he explained in last week's analysis. "That's not going to happen. And, meanwhile, the traditional PC market continues to shrink, slowly."

Despite Microsoft pulling back its Windows Phone ambitions, Bott noted in his Redmond magazine Windows Insider column that the move doesn't necessarily spell the end of Windows as a platform for mobile devices, including tablets and third-party phones. A noteworthy example is the forthcoming Windows 10-based HP Elite x3 running on an ARM processor. HP introduced the Elite x3 device in February at the Mobile World Congress in Barcelona. The x3 aims to bridge the functions of a phablet and a PC using Microsoft's Continuum, the technology introduced in Windows 10 and Windows 10 Mobile that allows users to connect large displays, keyboards and mice and run both Universal Windows Platform apps from the Microsoft Store and traditional Win32 and .NET applications.

HP today said the device will start shipping later this month starting at $699, with a complete bundled solution including the Desk Dock set for release Aug. 29 starting at $799. In addition to HP, Microsoft has recently identified Acer, Alcatel, Trinity and Vaio as its current roster of phone hardware partners.

Microsoft can only hope that demand for HP's x3 and other forthcoming devices exceeds expectations and proves its scaled-back forecast wrong. But for now, no one appears to be betting on that.

Posted by Jeffrey Schwartz on 07/18/2016 at 12:29 PM0 comments


Microsoft Wins Appeal in Dublin Datacenter Warrant Case

A federal appeals court today overturned a lower court ruling ordering Microsoft to turn over the contents of e-mails stored in an offshore datacenter. The decision is a major victory for Microsoft and all major cloud and data services providers seeking to ensure user data is protected under the laws of the countries where information is stored.

The closely watched case centered around Microsoft's refusal to comply with a search warrant by U.S. law enforcement demanding that it turn over the contents of specific e-mails residing in its Dublin, Ireland datacenter in connection with a criminal investigation. While Microsoft did turn over the data in its U.S. datacenters, the company argued that complying with the search warrant's demand to provide the e-mails was beyond the scope of U.S. law. Despite support from Apple, AT&T, Cisco and Verizon, along with the Electronic Frontier Foundation, the U.S. District Court for the Southern District of New York upheld the search warrant issued nearly two years ago. Microsoft subsequently filed the appeal last year arguing privacy laws superseded the court's ruling. "This case is about how we best protect privacy, ensure that governments keep people safe and respect national sovereignty while preserving the global nature of the Internet," Microsoft President and Chief Legal Officer Brad Smith stated at the time.

Smith hailed today's decision as a key precedent for privacy. "It ensures that people's privacy rights are protected by the laws of their own countries; it helps ensure that the legal protections of the physical world apply in the digital domain; and it paves the way for better solutions to address both privacy and law enforcement needs," Smith stated in a blog post.

A panel of three appeals court judges from the U.S. Court of Appeals for the Second Circuit found that last year's ruling ordering Microsoft to comply with the search warrant was invalid, noting that search warrants carried out by U.S. law enforcement are typically limited to locations within the country's borders or locations over which it has jurisdiction. Warrants don't traditionally extend further.

"Because Microsoft has complied with the Warrant's domestic directives and resisted only its extraterritorial aspects, we REVERSE the District Court's denial of Microsoft's motion to quash, VACATE its finding of civil contempt, and REMAND the cause with instructions to the District Court to quash the Warrant insofar as it directs Microsoft to collect, import, and produce to the government customer content stored outside the United States," according to the ruling.

Smith yesterday gave a speech at Microsoft's Worldwide Partner Conference in Toronto, where he spoke with impassioned emphasis about the company's commitment to fighting for customer privacy and addressed this closely watched case. "We need an Internet that respects people's rights," he said. "We need an Internet that is governed by law [and] we need an Internet that's governed by good law."

Smith has been a vocal proponent of the view that the Stored Communications Act -- part of the Electronic Communications Privacy Act of 1986, on which the warrant was based -- is dated, and he has urged Congress and the White House to create laws that are consistent with the modern world. Apparently the appeals court agreed, as noted in the ruling: "Three decades ago, international boundaries were not so routinely crossed as they are today, when service providers rely on worldwide networks of hardware to satisfy users' 21st–century demands for access and speed and their related, evolving expectations of privacy."

Posted by Jeffrey Schwartz on 07/14/2016 at 1:50 PM0 comments


Nadella Sees Pokémon Go as a Catalyst for HoloLens

Pokémon Go, released last week, has gone viral at an opportune time for Microsoft. The new mobile game for smartphones, which is suddenly introducing augmented reality to the masses, arrived a few days before Microsoft's annual Worldwide Partner Conference kicked off, where the company is showcasing its HoloLens headsets. Microsoft released a developer edition consisting of the headset and software at the Build developer conference in late March, and now the company is looking to spread interest in Windows-based augmented reality among the 20,000 partners gathered this week.

Pokemon Go revives the once-popular Nintendo characters licensed to the Pokemon Co. and Niantic for Apple's iOS and Android. The free app, which utilizes a smartphone's camera and GPS function, presents the game in the kind of augmented reality scenario for which HoloLens was devised. As of yesterday, 2 million downloads of the app were generating $1.6 million per day in in-app purchases, including upgrades to a premium version of the app, according to a report by The Wall Street Journal.

"This Pokemon interest hopefully will translate into a lot of interest in HoloLens," Microsoft CEO Satya Nadella told CNBC's Jon Fortt moments after the opening keynote session at WPC, taking place this week in Toronto. "Think about it, the game physics of that app are built for HoloLens. Of course the phone is great, the installed base of the phone is so enormous that it makes it possible. But think about what that game on HoloLens would mean. You're not trying to use a phone when you can actually use your eyes to look through. It's the ultimate computing paradigm and I am happy for Pokemon but I am happy for these industrial applications."

Nadella emphasized HoloLens in the opening keynote session where he was joined by Microsoft Program Director Arantxa Lasa Cid, who demonstrated how Japan Airlines is using it to train engine mechanics how to diagnose and repair its jet engines.

Asked if Microsoft wouldn't be better off targeting its HoloLens technology with a gaming app like Pokemon Go, rather than focusing on industrial use first, Nadella responded that the new game is ideal for helping everyone understand the opportunity of augmented reality. "I think it's fantastic to see augmented reality applications getting built," he said. "The best thing that can happen when you're creating a new category is for applications that are these killer apps, whether it be gaming or in the industrial scenario, to get invested in."

Given that the HoloLens Developer Kits cost $3,000 each, even if Pokemon Go has staying power and isn't just a fad, don't expect it to come to HoloLens right away.

Posted by Jeffrey Schwartz on 07/12/2016 at 12:14 PM0 comments


GE Pact Validates Microsoft's Cloud and IoT Chops

Last year, when GE launched its Predix cloud aimed at letting its industrial customers and supply chain partners build, deploy and run industry-specific applications, the company chose Amazon Web Services' compute, storage and other services.

That's not a leap of faith, considering AWS operates the most widely used cloud services. GE has since ported it to other clouds and today announced Predix will also run on Microsoft Azure.

Numerous large enterprises now use Azure in addition to, or as an alternative to, AWS. But had GE decided to snub Azure in the mix of clouds Predix runs on, it certainly would have given Microsoft a black eye. GE CEO Jeffrey Immelt joined Microsoft CEO Satya Nadella on stage in the opening keynote session of the annual Microsoft Worldwide Partner Conference (WPC) in Toronto to announce a broad partnership between the two companies, effectively putting any question about that to rest.

In addition to hosting Predix on Azure, GE has said it will integrate the platform with Microsoft's Azure IoT Suite and Cortana Intelligence Suite, and will tie in Office 365, the newly announced Dynamics 365, and the Power BI digital dashboard and analytics tool. Immelt indicated later in a CNBC interview that Azure's large geographic footprint and ability to address data sovereignty requirements in certain countries were also key factors.

Adding Predix to the Azure platform will allow GE's customers and partners to connect industrial data with business processes and analytics information. Nadella emphasized the fact that GE, the 140-year-old industrial conglomerate, is no longer a traditional customer that purchased and consumed Microsoft and other companies' IT products. Rather, GE is an example of a traditional company that now is a tech company in its own right. Nadella emphasized that this is a shift more and more organizations, including industrial companies, are making -- and Microsoft is adjusting its relationships accordingly.

"Having this cloud infrastructure, this intelligent cloud makes it possible for us to help build out this world where every company going forward is going to be a digital company. Every company is going to build their own digital technology and these digital feedback loops. And the goal of having a cloud infrastructure is to be able to realize that potential," Nadella said.

Sitting down with Immelt on the WPC stage, Nadella asked about the inflection point for GE's decision to make that transition.

"Increasingly, we saw that the nature of our technology was changing," Immelt responded. "The jet engine was no longer a jet engine. A jet engine had 30 sensors [and] was taking a terabyte of data at each flight. And I would say 2009, 2010, we basically said, 'This isn't a question of whether or not we're a software company or an industrial company.' Industrial companies are going to become software companies, and the nature of our customers were changing and the nature of our products were changing. We said to be a better industrial company, we can't allow our digital future to be created by others. And so we've invested massively to drive this digital transformation."

Predix on Azure is slated for release toward the middle of 2017, with a developer preview scheduled for rollout by the end of this year. The two companies will be discussing Predix for Azure later this month at the Predix Transform event in Las Vegas and at the Minds + Machines conference in November.

Posted by Jeffrey Schwartz on 07/11/2016 at 2:42 PM0 comments


Microsoft Seeks to Shift CRM Market with Dynamics 365

In the latest ambitious move by Microsoft, the company is bringing its respective Dynamics CRM and ERP suites together into a new software-as-a-service offering called Dynamics 365. The new service, slated to arrive this fall, will offer what Microsoft is describing as "deep integration" with Microsoft's Office 365 suite.

It's perhaps the most aggressive salvo Microsoft has fired yet in its effort to challenge Salesforce.com in the market for SaaS-based customer relationship, HR and business process automation applications, aside from reportedly trying to acquire the company last year. Microsoft's move also takes aim at Oracle, SAP and ServiceNow, among others. In a company blog post announcing the new service, Takeshi Numoto, Microsoft's corporate VP for cloud and enterprise, described Dynamics 365 and a suite of tools called AppSource as a move toward offering more modern and intelligent business apps.

Dynamics 365 will morph Microsoft's existing disparate Dynamics CRM and ERP cloud solutions into a single cloud service with specific apps such as Financials, Field Service, Sales, Operations, Marketing, Project Service Automation and Customer Service. While each is part of a common service, the apps can be deployed independently, allowing customers to pay just for what they require, and they work together as customers add on capabilities, Numoto said. Microsoft is also natively embedding its Power BI dashboard tool and Cortana Intelligence, which he said will "help customers achieve their business goals with predictive insights, prescriptive advice and actionable next steps."

In a sales situation, Cortana Intelligence might serve up cross-sell opportunities, or it could use IoT data to enable a field service agent to predict a failure before it occurs, he said, adding that the "deep integration" between Dynamics 365 and Office 365 will enable the structured processes and workflows in business applications to understand and interact with Microsoft's productivity tools. For example, it will allow an e-mail reply to draw on data from finance and sales apps, rendering information such as pricing within Outlook.

The launch of Dynamics 365 will also include a new marketplace called AppSource, consisting of SaaS apps from both Microsoft and partners. Microsoft said AppSource will launch with more than 200 SaaS apps. The apps include content packs and vertical industry apps from the likes of AFS, which offers merchandising, field sales, asset management and auditing tools for consumer packaged goods companies; AvePoint Citizen Services, supplier of an automated incident response solution; and Veripark's marketing software.

Microsoft has timed the announcement just a week before its annual Worldwide Partner Conference, which will be held in Toronto this year, where it will surely seek to build its ecosystem to take on entrenched rivals.

The move is not trivial considering the Dynamics suite is based on a variety of disparate products Microsoft acquired from Great Plains Software and Navision in 2000 and 2002 respectively, as well as its own internally built components. Providing a common framework was something Microsoft talked up long ago before its apps were offered online.

It also sheds more light on Microsoft CEO Satya Nadella's plan to spend $26.2 billion to acquire LinkedIn, announced last month. First, Microsoft is going to provide native integration between Office 365 and Dynamics. Then, once LinkedIn is part of Microsoft -- though the large professional social network will operate as a separate unit -- the two companies envision bringing their respective intelligent graphs together, which would bring Office and Dynamics functionality into LinkedIn and vice versa.

Posted by Jeffrey Schwartz on 07/06/2016 at 2:22 PM0 comments


Windows 10 Anniversary Edition Set for August 2 Release

Ink this on your calendar -- Microsoft will release its Windows 10 Anniversary Edition on Aug. 2, the company announced today. Microsoft is also looking to motivate customers to take advantage of the free upgrade offer, which expires one month from today, with several promotional incentives: $300 off for students purchasing a Surface and Xbox One at Microsoft Stores, a free TV when buying Dell PCs priced at more than $699, $150 off various HP systems and $100 discounts on certain ASUS and Samsung devices.

Now that there's only one month left before the free upgrade expires, it'll be interesting to see if those with Windows 7 and Windows 8.1-based systems who have either put off taking advantage of it or had no plans to do so have last-minute changes of heart, especially those who don't want to have to pay for the upgrade at a later date. While there's always the possibility Microsoft will extend the deadline for taking advantage of the free upgrade offer or make it permanent, the company has remained firm about the offer's expiration date.

Also many users have made clear they don't want to upgrade their older versions, as evidenced by the uproar over Microsoft's heavy-handed push that tricked many users into inadvertently upgrading to Windows 10, a practice Microsoft yesterday indicated it will stop when it modifies its Get Windows 10 (GWX) application this week. That promise came just days following the report of a lawsuit that resulted in a $10,000 judgment in favor of Teri Goldstein, CEO and owner of TG Travel Group LLC, based in Sausalito, Calif., who claimed damage from a Windows 10 upgrade that she allegedly didn't authorize.

The new Windows 10 Anniversary Update will be pushed out to users after August 2 and it will introduce some important new features. Among them are a new digital inking capability, improvements to Cortana, extended security including Windows Hello support for apps, refinements to the new Edge browser and the ability to share Xbox gaming experiences across PCs, tablets and phones.

Microsoft announced the planned release date today, though the company had indicated Windows 10 Anniversary Edition would arrive around this time when issuing the first Windows Insider technical preview during the Build Conference back in late March (see our first take here).

Yusuf Mehdi, corporate vice president for Microsoft's Windows and Devices Group, underscored six key improvements to Windows 10, which the company said is now used by 350 million users. The first is security, which will now extend Windows Hello support to companion devices to replace passwords with biometric authentication and improvements to its Windows Defender anti-malware service. It will also offer the ability to schedule scans and provide reports notifying users of any threats discovered. Enterprise customers will also get the new Windows Defender Advanced Threat Protection feature (WDATP), which Mehdi said "detects, investigates, and responds to advanced malicious attacks on networks by providing a more comprehensive threat intelligence and attack detection."

The first obvious change Windows users will notice is the prominence of Cortana, the voice-enabled digital assistant that competes with Apple's Siri, Google Now and Amazon Echo. In addition to its place in the search menu, Cortana will show up on the lock screen, allowing users to dictate commands or questions without unlocking the system. Cortana will also let users request personal information such as where you've parked your car or an account number, and it will provide notifications across all devices. Cortana is a key thrust for Microsoft, which recently announced its roadmap to create "intelligent bots" as a service.

Microsoft is also hoping that the new digital ink capabilities in the new release will finally make Windows PCs and tablets an appealing way to draw and take notes -- something the company has long aspired to bring to the mainstream.

"Now, using a Windows 10 device, graphic designers can be more creative, musicians can write digital music, lawyers can edit documents with the power of the pen, and students can do mathematical equations and learn by writing," Mehdi said in a blog post announcing the release date. The Digital Ink feature will make it easy to take notes or annotate an image, while the new "Smart Sticky Notes" option will let users create simple reminders. A number of apps support the new inking capability including Office and the Edge browser.

Share your views on the new Windows 10 update.

Posted by Jeffrey Schwartz on 06/29/2016 at 12:55 PM0 comments


Microsoft-Red Hat Technical Collaboration Pact Blooms

When Microsoft and Red Hat late last year inked what they described as their most extensive joint development partnership to date, it covered a broad cross section of interoperability initiatives. The two companies this week are demonstrating the fruits of those efforts at the annual Red Hat Summit, taking place in San Francisco.

In concert with Microsoft's release of .NET Core 1.0, Red Hat announced the availability of .NET Core on its flagship Red Hat Enterprise Linux (RHEL) operating system and its OpenShift cloud platform-as-a-service (PaaS) stack, delivered via certified containers and supported under the two companies' partnership.

"This makes Red Hat the only commercial Linux distribution to feature full, enterprise-grade support for .NET, opening up platform choice for enterprises seeking to use .NET on a flexible Linux and container-based environments," according to a Red Hat blog post.

It also allows for microservices in which some components are .NET-based and others are built in Java. All can work together on RHEL and Red Hat OpenShift, the company noted. It also means .NET Core workloads are portable between Windows Server and RHEL, while bringing .NET to OpenShift cloud environments.

Microsoft also demonstrated SQL Server 2016 running on RHEL, which Red Hat underscored it will support. Revealed in March, SQL Server 2016 for Linux is still in private preview. Red Hat will include a demo of SQL Server 2016 for Linux on Thursday during the general session, which the company will make available through a live Web stream starting at 1:45 PST.

Red Hat's new CloudForms 4.1 cloud management software now supports Azure. The tool provides horizontal scaling of cloud platforms as well as management and reporting. And for its part, Microsoft said its Azure Resource Manager template is now available on GitHub, providing a path to deploy RHEL and OpenShift in Azure and letting developers build, deploy and scale applications in Microsoft's cloud using Red Hat's self-service container-based platform.
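
Microsoft hasn't detailed the deployment steps, but Azure Resource Manager templates are typically kicked off with a single deployment call against the management REST API. The sketch below is a rough, hypothetical TypeScript illustration of that call; the subscription, resource group, API version and template URI are placeholders rather than values from the announcement:

```typescript
// Hypothetical sketch: start an Azure Resource Manager template deployment over the REST API.
// The subscription ID, resource group, deployment name, api-version and template URI below
// are placeholders, not values from Microsoft's announcement.
async function deployTemplate(accessToken: string): Promise<void> {
  const subscriptionId = "<subscription-id>";
  const url =
    `https://management.azure.com/subscriptions/${subscriptionId}` +
    `/resourcegroups/openshift-rg/providers/Microsoft.Resources` +
    `/deployments/openshift-demo?api-version=2016-09-01`;     // api-version is an assumption

  const response = await fetch(url, {
    method: "PUT",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      properties: {
        mode: "Incremental",                                           // add to, rather than replace, the group
        templateLink: { uri: "https://example.com/azuredeploy.json" }, // placeholder template location
      },
    }),
  });
  console.log(`Deployment request returned HTTP ${response.status}`);
}
```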

Microsoft Corporate VP Joseph Sirosh in a blog post described the releases and demos as a key milestone for the partnership between the two companies and said the Red Hat Summit was an important stage to showcase their progress.

"This is a huge accomplishment for the entire open source ecosystem -- with more than 18,000 developers representing more than 1,300 companies contributing to .NET Core 1.0," Sirosh said. "The new version also includes the first release of the .NET Standard Library, which will enable developers to reuse their code and skills for applications that run on servers, the cloud, desktops and across any device including Windows, iOS and Android."

Posted by Jeffrey Schwartz on 06/29/2016 at 7:57 AM0 comments


Salesforce.com Releases Lightning Connector for Outlook

Salesforce.com today released a connector that will give Office users dynamic and customizable access to their CRM data in their Outlook inbox. The new Lightning Connector is a key deliverable based on a partnership Salesforce.com and Microsoft inked two years ago.

The release comes at a rather awkward time for the two companies. Microsoft two weeks ago agreed to acquire social networking giant LinkedIn for $26.2 billion. The price tag for the deal was reportedly pushed up by Salesforce.com's own offer to acquire LinkedIn, CEO Marc Benioff told Recode -- a year after Microsoft tried to acquire Salesforce.com.

Whether or not there are hard feelings, the two companies are partners, and the connector will benefit both, which are under pressure from their shared customer base to improve the utility of their respective widely used offerings.

Lightning for Outlook brings functionality from the Salesforce.com SaaS suite into users' inboxes. That would let a sales rep update a price quote with Salesforce SteelBrick CPQ directly in Outlook or use a Salesforce.com partner app such as Altify, which provides a sales productivity tool, according to two examples provided in today's announcement.
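
Salesforce.com hasn't published the connector's internals, but functionally this kind of inbox integration rides on the Outlook add-in model, in which a Web-based task pane loaded by Outlook uses the Office.js library to read the open message and pull in related CRM data. The sketch below illustrates only that general model, not Salesforce.com's code; the CRM endpoint and page element are hypothetical:

```typescript
// Illustration of the Outlook add-in model only -- not Salesforce.com's implementation.
// Office.js is loaded by the add-in's host page; the CRM endpoint and the 'results'
// element in the task pane are hypothetical.
declare const Office: any;

function lookupCrmRecords(senderEmail: string): Promise<object[]> {
  // Hypothetical call to a CRM REST endpoint keyed on the sender's address.
  return fetch(`/api/crm/records?email=${encodeURIComponent(senderEmail)}`)
    .then((res) => res.json());
}

Office.initialize = () => {
  const item = Office.context.mailbox.item;   // the message currently open in Outlook
  const sender = item.from.emailAddress;      // sender details are synchronous in read mode
  lookupCrmRecords(sender).then((records) => {
    const el = document.getElementById("results");
    if (el) {
      el.textContent = `${records.length} related record(s) found for ${sender}`;
    }
  });
};
```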

The connector is based on Salesforce.com's Lightning Components, announced last week, which is a framework of reusable code blocks, many of which are available on the company's AppExchange. The framework comes with Lightning App Builder tool, which lets developers drag and drop those components to build new apps or embed them into third-party software. Users will soon be able to embed those components into Outlook.

The new Lightning for Outlook is redesigned to offer a more intuitive user experience. As users scroll through their inboxes, it presents a more modern display and embedded search capability that lets them find Salesforce data without leaving Outlook, the company said.

Lightning for Outlook is available in the Office Store as a free add-in. Lightning Sync is also free and available now for Salesforce.com customers with Sales Cloud Lightning Enterprise Edition licenses or higher. The Lightning Components Framework for Outlook is set for release this fall. The company said it will disclose the price of the framework upon its release.

Posted by Jeffrey Schwartz on 06/28/2016 at 12:02 PM0 comments


Brexit Could Impact IT Spending and Uncertain EU-U.S. Data Transfer Policy

The unexpected decision by British voters to withdraw from the European Union continues to roil financial markets, stoked by fears that the referendum will lead to a global recession. While it throws the value of the pound and the euro into flux, Britain's withdrawal from the EU will have a more far-reaching ripple effect on industries, including IT, that either reside or do business there.

Whether or not the current state of panic in the markets is overblown -- and what impact it'll have -- remains to be seen. Some believe it will create market deterioration in the U.K. and Europe. The extent and duration of that is unknown. At the very least, it will be more expensive to do business in Britain and potentially all of Europe.

S&P today reduced the U.K.'s credit rating from AAA to AA, a downgrade that could trigger changes to trade deals -- and certainly increases the cost of new ones. Anticipating this fallout, Gartner on Friday reduced its IT spending forecast for the remainder of this year, with its projection for worldwide IT spending growth dropping from 1.5 percent to 1.2 percent.

"Businesses will want to reassess where they are physically located, where they need to be in future and where they buy from, to maximize their market potential and minimize cost," Gartner analysts said in a research note. "Many businesses will trigger strategic reviews of how they operate and buy services."

Despite the uncertainty Brexit brings, the analysts advised CIOs, particularly in the EU, not to panic. It recommended the following:

  • Work with the business to create project teams to assess the impact of Brexit and plan for needed changes.
  • Review any European government contracts, which often contain specific Brexit clauses.
  • Reduce uncertainty in your workforce by communicating proactively about the situation and what your business is doing to protect the rights and benefits of staff.

The timeframe for Britain's withdrawal remains to be seen, but first the country has to vote in a new government in the wake of the resignation of Prime Minister David Cameron, who had vocally opposed the decision. Cameron has said he'll stay on for at least three months. The U.K.'s Information Commissioner's Office issued a statement noting that existing privacy laws remain in effect. But according to numerous reports, it could take two years before the withdrawal is complete. As reported by Fortune, "that means years of uncertainty, with tech firms and investors unable to know for sure how regulations will evolve (or devolve) in the U.K. and, indeed, the EU."

Brexit also calls into question data transfers between the U.S. and Europe, which are still trying to come up with data transfer and personal information regulations in the wake of the European Court of Justice's decision to strike down the Safe Harbor agreement. Shelly Palmer, CEO of Palmer Advanced Media, noted in a blog post that Britain is now a "lame duck" in those discussions.

"As an EU member, the world's 5th largest economy (now 6th because the British pound got hammered that hard by the BREXIT vote) had a loud voice at the table with Germany and France, which want far stricter controls on EU data flow," Palmer wrote. "The U.K.'s voice of reason on the subject of EU data flow is now silent."

Posted by Jeffrey Schwartz on 06/27/2016 at 1:42 PM0 comments


Microsoft Has Significant Plans for Blockchain as a Service

Microsoft has ratcheted up its effort over the past year to build out a framework based on blockchain and to participate in helping build a standards-based ecosystem. Many experts believe blockchain will transform how payments and transactions are made and it's poised to create a trusted, self-sovereign, user-managed identity framework.

Blockchain, the distributed ledger technology enabled by smart contracts that underlies bitcoin cryptocurrency transactions, has existed for a while. But over the past year, it has advanced to the point where every major supplier of IT is building blockchain-based tools and scores of major banks and financial services firms that participate in trading networks are involved in pilots. Microsoft started talking about its potential late last year and has made aggressive moves ever since, which is why it is the cover story for the July issue of Redmond magazine.

It appears blockchain is becoming more strategic to Microsoft. Marley Gray, who headed Microsoft's financial services vertical industry segment as its director of technology strategy, is now the company's blockchain business development and strategy lead.

In its latest bid to extend its blockchain ambitions, Microsoft revealed Project Bletchley, which Gray described last week in a blog post as Microsoft's "vision for an open, modular blockchain fabric powered by Azure and highlights new elements we believe are key in enterprise blockchain architecture." It aims to bring a middleware layer to its Azure Blockchain as a Service (BaaS) that he said "will provide core services functioning in the cloud, like identity and operations management, in addition to data and intelligence services like analytics and machine learning."

The other component of Project Bletchley is cryptlets, which Gray described as a "new building block of blockchain technology [that] will enable secure interoperation and communication between Microsoft Azure, ecosystem middleware and customer technologies." Gray noted Project Bletchley addresses key requirements early blockchain adopters are demanding:

  • An open interoperable platform.
  • Features including identity, key management, privacy, security, operations management and interoperability that are integrated.
  • Performance, scale, support and stability.
  • Members-only consortium blockchains that are permissioned networks for members to execute contracts.

Microsoft's blockchain capabilities are being delivered through its Azure public cloud via various open protocols, including simple Unspent Transaction Output-based (UTXO) protocols such as Hyperledger, as well as smart contract-based protocols such as Ethereum and others as they surface, Gray noted.

At the recent Build conference, Microsoft and its partner ConsenSys announced support for the Ethereum contract programming language Solidity as an extension to Visual Studio, Redmond's flagship integrated development environment (IDE). Ethereum is an open source project that allows applications to run precisely as programmed with no way for third parties to interfere or create fraudulent transactions, according to the Ethereum Foundation.

The applications run on a custom-built blockchain, which in Microsoft's case is the service built on Azure with ConsenSys to offer Ethereum Blockchain as a Service (EBaaS). "What you are seeing is Microsoft tactically reducing the barriers to entry to build these types of applications," said Andrew Keys, director of business development at ConsenSys. Additionally, Project Bletchley will extend Azure BaaS to other platforms, partners and customers, according to Gray.
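
To make Keys' point about lowered barriers concrete, here is a minimal, hypothetical sketch of what talking to such a chain looks like from code. It connects to an Ethereum node's JSON-RPC endpoint -- for example, one provisioned through the Azure-based EBaaS -- using the web3.js client and reads the current block height; the endpoint URL is a placeholder and web3.js is simply one of several clients that speak the protocol:

```typescript
// Minimal sketch: connect to an Ethereum node's JSON-RPC endpoint (for example, one stood up
// through Azure Blockchain as a Service) and read the current block height. The URL is a
// placeholder, and web3.js is just one of several clients that speak this protocol.
import Web3 = require("web3");

const web3 = new Web3(new Web3.providers.HttpProvider("http://my-azure-node.example:8545"));

web3.eth.getBlockNumber((err: Error | null, blockNumber: number) => {
  if (err) {
    console.error("Could not reach the node:", err.message);
    return;
  }
  console.log(`Connected. Current block height: ${blockNumber}`);
});
```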

Posted by Jeffrey Schwartz on 06/24/2016 at 1:01 PM0 comments


Feds Certify AWS and Microsoft Azure for the Most Sensitive Workloads

The Federal Risk and Authorization Management Program (FedRAMP) has finally certified Amazon Web Services and Microsoft's respective public cloud services with the highest level of compliance clearance. CSRA, a provider that offers IT services specifically to government agencies, also reached the long-awaited FedRAMP Joint Authorization Board (JAB) Provisional Authority to Operate (P-ATO) clearance.

Though the move was long expected, it paves the way for federal agencies to run the most sensitive, high-impact workloads on the three companies' public clouds, including those involving personally identifiable information, financial data and law enforcement information, among other forms of unclassified content. In all, it covers 400 different security controls.

AWS and Microsoft Azure have both been FedRAMP-compliant for several years, but only for low- and moderate-impact workloads. "Now, Azure Government has controls in place to securely process high-impact level data -- that is, data that, if leaked or improperly protected, could have a severe adverse effect on organizational operations, assets, or individuals," said Susie Adams, Microsoft Federal's Chief Technology Officer, in a blog post announcing its upgraded FedRAMP status.

The Azure Government FedRAMP High accreditation now covers 13 customer-facing services, including Azure Key Vault, ExpressRoute and Web Apps, which were released last month, "representing a significantly more agile pace of accreditation to the benefit of Federal customers," she noted.

"Agencies are already using the service," FedRAMP Director Mark Goodrich told my colleague Mark Rockwell at sister publication Federal Computer Week. The clearances apply to the AWS Government and Azure Government offerings but promise to validate to all cloud users that the companies have implement best practices to ensure compliance.

Teresa Carlson, Amazon's Public Sector VP, claimed in a statement that more than 2,300 government customers worldwide use the AWS cloud. "By demonstrating the security of the AWS Cloud with the FedRAMP High baseline, agencies can confidently use our services for an even broader set of critical mission applications and innovations."

That baseline, the statement noted, "is mapped to National Institute of Standards and Technology (NIST) security controls, which classify data as 'High' if a compromise would severely impact an organization's operations, assets or individuals."

Posted by Jeffrey Schwartz on 06/23/2016 at 12:04 PM


iOS SharePoint App Aims To Mobilize Intranets and Collaboration

The new SharePoint Mobile app for iOS is now available and it represents an important milestone in Microsoft's effort to bring the corporate intranet and collaboration to users' phones. Given how many people use their phones more than their PCs for routine purposes, bringing the capabilities that have made SharePoint Sites so widely used over the past decade is critical if Microsoft wants SharePoint to remain relevant as people shift the way they search and access information.

That's why Jeff Teper said creating a modern interface was a key priority when he laid out his vision for the Future of SharePoint last month. At that event in San Francisco, Teper first revealed the revamped experience for mobile users, with a modern yet intuitive and immersive interface.

The new SharePoint Mobile app is designed to bring corporate intranets to mobile devices with easy access to team sites, portals and a snapshot of what colleagues in a workgroup or team are working on. It uses the machine learning intelligence of the Office Graph, which renders the most relevant documents used by individuals and groups.

In addition to supporting SharePoint Online in Office 365, the new mobile app supports on-premises and hybrid implementations of SharePoint Server. It works with the two most recent releases, SharePoint Server 2013 and SharePoint Server 2016. Once users install the app on their phones, they can log in using their SharePoint credentials, or via Azure Active Directory, NTLM and Forms-Based Authentication (FBA), among others. It's also configured to let users switch between Office 365 accounts.

The new mobile client provides a gateway to SharePoint with what Andy Haon, principal group program manager on Microsoft's SharePoint engineering team, called "front door" access. In a video demonstration, Haon walked through the "front door" interfaces of the new SharePoint mobile client.

The Sites tab provides a list of sites users frequently access or follow. It shows recently accessed files, lists, subsites and pages, and allows a user to share a site. When a user opens a SharePoint document library within a team site, the app hands off to the OneDrive mobile app for iOS, allowing users to view, share, search, follow and manage files stored across Office 365. Microsoft has posted documentation on how the two apps work together.

Organizations can provide access to key corporate resources via the Links tab. "Administrators can program links for their employees and these links will show up both in the SharePoint Mobile app and also in the SharePoint Home Web experience," Haon explained. "This enables users to quickly access their Office 365 video portal, company or divisional portals, an important site, cafeteria menus or other organizational resources."

Have you tried the new iOS client yet? Share your views.

Posted by Jeffrey Schwartz on 06/22/2016 at 1:27 PM


Dropbox Adds Workgroup Collaboration Improvements to File Storage Service

In its bid to appeal to more business users, Dropbox today has added a number of new features to its popular online file storage and sharing service, giving it more functionality for iOS-based devices and offering improved document management and sharing from Windows PCs and Macs.

Dropbox users can now create Office documents from their iPhones and iPads, as well as use those devices to scan documents. All Dropbox users can now insert comments into PDFs, documents and images, a feature previously limited to premium accounts, and they can now view version histories, among other new features.

The company claims 500 million users have Dropbox accounts, though more than half have free accounts, which provide basic file storage. In recent years, Dropbox has added new business features to its service. Dropbox is among the most popular of the file storage services, though it competes with many providers, including Google Drive, Citrix ShareFile and, most notably, Box. Dropbox also competes with Microsoft's OneDrive for Business, but Dropbox, Box and Citrix all have partnerships with Microsoft to provide compatibility with Windows and Office.

One of Dropbox's most appealing features is its rapid file sync capability. Back in March, Dropbox said its service is used in 8 million businesses, with 150,000 paying business customers and 25,000 new ones coming online monthly.

In a blog post today, the company highlighted the new features, which include the ability to:

  • Scan documents into Dropbox folders from the mobile app, which lets users organize scans from whiteboards, receipts and sketches, with the ability to search within scanned files.
  • Create Microsoft Office documents (Excel, Word and PowerPoint) on mobile devices using a new "plus" button offered in the iOS app.
  • Share files and folders from desktops: Here the company says it is simplifying the saving and sharing of documents. When right-clicking on a folder on a Windows PC or Mac, users can share it right from the desktop, eliminating the need to be redirected to the Web or to copy a link into an e-mail.
  • Allow all users to add comments to a file directly from Dropbox, not just premium ones.
  • Preview contents of a file before restoring it to ensure it's the correct version.
  • Let users share a single document with specific people, which is available with a new file-sharing control as well as a read-only option.
  • Search within scanned documents in Dropbox folders including the use of OCR.

The new features are available to users with free accounts as well as those with paid subscriptions, with the exception of the ability to search within scanned files, which is reserved for paid plans.

The company is also now letting users set the destination folder when uploading photos and videos to Dropbox, as well as allowing them to be grouped. Dropbox Basic users will need the desktop app installed on their computers in order to use automatic camera uploads.
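
For teams that want to automate this kind of uploading and sharing rather than work through the desktop or mobile apps, Dropbox also exposes these operations through its public API. Below is a minimal sketch using the official Dropbox Python SDK; the access token and file paths are placeholders, and the snippet illustrates the general API rather than the new app features described above.

    import dropbox

    # Placeholder OAuth access token, generated from the Dropbox App Console.
    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
    dbx = dropbox.Dropbox(ACCESS_TOKEN)

    # Upload a local file to a Dropbox folder, overwriting any existing copy.
    with open("report.xlsx", "rb") as f:
        dbx.files_upload(f.read(), "/Team Docs/report.xlsx",
                         mode=dropbox.files.WriteMode.overwrite)

    # Create a shared link that can be sent to colleagues.
    link = dbx.sharing_create_shared_link_with_settings("/Team Docs/report.xlsx")
    print("Share this URL:", link.url)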

Posted by Jeffrey Schwartz on 06/22/2016 at 1:28 PM


Mark Russinovich Demos New Docker Datacenter for Azure

Microsoft Azure CTO Mark Russinovich appeared on stage today at the annual DockerCon conference in Seattle where he demonstrated the new Docker Datacenter (DDC) hybrid cloud development and management platform with nodes connected to the Azure public cloud as well as with the company's forthcoming Azure Stack hybrid cloud environment. Also running on the Azure Stack node was the first-ever public demonstration of SQL Server for Linux.

The appearance and demo by Russinovich, his second at DockerCon in as many years, showcased Microsoft's progress on a number of fronts in embracing Docker containers, an effort kicked off merely two years ago. While Microsoft's Windows Server 2016 Technical Preview now supports Docker containers and Hyper-V Containers, the focus of today's demonstration was the new DDC platform, announced in February and released yesterday for use in the Amazon Web Services and Microsoft Azure clouds.

Available in both companies' respective cloud marketplaces, DDC is designed to allow enterprises and service providers to deploy containers as a service either on-premises or in a virtual private cloud, providing a managed and secure environment in which developers can build and deploy self-service applications. DDC includes Docker's Universal Control Plane with the native Swarm clustering tool embedded; Docker Trusted Registry (DTR) 1.4.3, which provides image management, security and an environment for collaboration; and the most recent version of the Docker Engine runtime.

"Our focus is on getting containers ready for the enterprise and addressing enterprise needs when it comes to running containers in production," Russinovich said as he began his demo, where he clicked on DDC in the Azure Marketplace, which launched an Azure Resource Manager template that initiated a form-filled process. "Within a few minutes, I can get a highly available first class Docker Datacenter cluster up and running up in Azure," he said.

For high availability, he said, at least three controller nodes were needed, and he also used eight Docker Swarm cluster nodes. Russinovich also showed the Docker Trusted Registry running 68 container images for a voting app, demonstrating the building of results containers, with a worker container processing the votes. A bug in the code briefly stalled the demonstration.

Russinovich also showed that DDC is deeply integrated with various services, including the Azure Load Balancer, allowing the voting and results apps to run on the nodes connected to it. He then showed one node, called "local Linux," that had a different subnet address than the other nodes. That node was the Azure Stack server running the SQL Server for Linux database instance. Running on an Ubuntu Linux server, it was, Russinovich said, the first public demonstration of SQL Server for Linux, which the company revealed in March and has since made available in private previews. Those in the private previews can access SQL Server for Linux on Ubuntu as Docker images.
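
As a rough illustration of what running such an image involves, the sketch below uses the Docker SDK for Python to start a SQL Server container. The image name, SA password and port mapping are hypothetical placeholders, since the actual preview images were distributed only to participants in Microsoft's private preview program.

    import docker

    # Connect to the local Docker daemon.
    client = docker.from_env()

    # Hypothetical image name and credentials; the real SQL Server for Linux
    # preview image was available only to private preview participants.
    IMAGE = "example/mssql-linux-preview"
    container = client.containers.run(
        IMAGE,
        detach=True,
        environment={"ACCEPT_EULA": "Y", "SA_PASSWORD": "Str0ng!Passw0rd"},
        ports={"1433/tcp": 1433},  # expose the default SQL Server port
        name="sql-linux-demo",
    )

    print("Started container:", container.short_id)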

"This is essentially Azure on your own hardware in your own datacenter," he said. "Azure Stack includes the infrastructure services of public Azure, some of the platform services as well as the portal and the API surface that matches public Azure. So we get this consistent experience going from public Azure to on-premises. What we've got here is a hybrid Docker Datacenter Cluster that spans on-prem to public Azure. The VMs on this server are connected to public Azure using site-to-site virtual private network for secure connection between these nodes that are on the back end as well as the nodes that are on the front end."

Russinovich also demonstrated the ability to monitor the containers using Microsoft's Operations Management Suite service, where an administrator could see different views of the 68 containers running across the production servers.

"What you're seeing for the first time here, and this is kind of mind blowing, Microsoft's SQL Server, running on Linux in a Docker container, on Azure Stack in a Swarm cluster that's hybrid and being managed by Docker Data Center, running up in public Azure."

Posted by Jeffrey Schwartz on 06/21/2016 at 3:11 PM


Dell Sheds Software Assets with Quest Software and SonicWall Deal

Four years ago Dell won a protracted bidding war to acquire Quest Software, which has a wide portfolio of systems management and migration tools, primarily for Microsoft-based environments including Active Directory, SharePoint and Windows. Now Dell is casting aside Quest, along with SonicWall, a supplier of network and e-mail security appliances.

Private equity firm Francisco Partners and hedge fund operator Elliott Management today said they have agreed to acquire the Dell Software Group's security, systems and information management and data analytics solutions. Terms of the deal were not disclosed, but Reuters reports the value to be more than $2 billion. The move comes as Dell is on the cusp of closing its $67 billion deal to acquire storage giant EMC, the largest merger in the IT industry to date. While Dell isn't saying why it has decided to sell off its Quest tools and SonicWall business, they apparently no longer fit its strategy of focusing on hardware, spanning its PC and devices business and the datacenter with its portfolio of servers, networking and the extended storage and datacenter infrastructure assets that will come from EMC. It also stands to reason Dell could use the cash to finalize the deal. Dell has already shed its consulting business, Perot Systems, and spun off its SecureWorks unit in an initial public offering.

Dell will retain Boomi, the software business that provides integration and connectivity between cloud and on-premises apps along with API and master data management. The Quest software portfolio consists of advanced analytics, database management, data protection, endpoint systems management, identity and access management, Microsoft platform management, network security, and performance monitoring, according to Francisco Partners.

"We see tremendous growth opportunity for these businesses," said Brian Decker, head of security investing at Francisco Partners, in a statement. "Network security and identity and access management are increasingly strategic imperatives for enterprises."

It isn't clear what the operating structure will be for Quest and SonicWall or what the future holds for Dell Software President John Swainson. Dell said further information will come at the closing of the deal, which is subject to usual regulatory approvals.

Posted by Jeffrey Schwartz on 06/20/2016 at 11:32 AM


VMware Steps Up Endpoint Security and Windows 10 Migration

VMware this week fleshed out its app and device management portfolio with a new management suite that adds endpoint security, threat detection and Windows 10 image and OS migration. The company also revealed new identity management features for its forthcoming Workspace One digital workspace platform, including multifactor authentication and Active Directory integration.

The move comes just a few weeks after VMware's key rival Citrix tapped Microsoft to provide tighter integration of its rival offerings with Redmond's Enterprise Mobility Suite. For its part, Microsoft is in the thick of a battle with Citrix and VMware, having made a major and unusual appearance at last summer's VMworld, where the two said they're working together to ensure Windows 10 support in the AirWatch management platform.

In its latest emphasis on endpoint security, VMware inked an OEM agreement with Tanium, whose technology is a key component of a new tool called TrustPoint. VMware's TrustPoint is a management suite that gives administrators views of all endpoints and activities on an enterprise network. It also has a natural language search interface to provide specific information about all of the endpoints and help track the use of unmanaged devices or unusual activity. In addition to using Tanium's endpoint protection and threat management tool, it incorporates VMware's "layered OS migration technology."

Why provide a unified endpoint management platform and a Windows 10 migration tool in the same product? TrustPoint, which is available for a one-time license fee of $75 per device, includes VMware's Image Service technology and is predicated on the notion that upgrading to Windows 10 will improve endpoint security. But going through the migration can introduce problems if not performed properly, especially for organizations that do it at scale. VMware claims TrustPoint simplifies the process by automating migration, enabling each technician to perform 100 Windows 10 migrations per day using the tool's central management console.

"We saw a need for scalability and speed at really high scale," said Blake Brannon, VP of marketing at the AirWatch mobile device management division of VMware. "Because you want that agility to be able to change as the threats change and as your assets change." TrustPoint can detect unmanaged endpoints and block them from the network, ensuring every device connected is in compliance with an organization's security policy, he added.

Identifying unmanaged devices is a critical aspect of ensuring a secure environment, said Curt Aubley, Tanium's VP of global strategic alliances and technology. Many IT managers are often surprised at the scope of devices in use on their networks that aren't managed and therefore could introduce security risks. And that's especially the case for those preparing for an OS migration, according to Aubley. "People don't know what they have," he said, in an interview. "They don't know the hardware or the firmware, or other things are ready to be migrated. That's one of the things we've immediately identified through the security process, and then we can hand it off to VMware so they can do that migration."

Aubley, who recently came over to Tanium from Intel Security, points out that the shift to personal cloud services, mobility and the use of employee-owned devices has led to more dynamic computing environments. "Endpoints (laptops, notebooks, desktops, servers, virtual machines, containers, clouds) are constantly changing by being moved, created, put to sleep or retired," he noted in a guest blog post on VMware's Web site. "To protect millions of moving targets, you need agile visibility and control at a scale and speed traditional prevention and hierarchical management frameworks are unable to provide. As endpoint security and management converge, IT leaders are taking a more holistic approach to securing and managing their environments."

VMware said it will add what it calls "Identity-Defined Workspaces" to Workspace One, the company's new platform revealed in February and due out next quarter, which seeks to combine enterprise system and application mobility management. Workspace One costs $8 per user for a cloud-based subscription or $150 for a perpetual license.

The new authentication capabilities are the result of updates to the company's enterprise mobility platform, AirWatch 8.4, and to VMware Identity Manager, which aim to improve Workspace One's ability to combine the two and extend authentication to apps and services on any device. The new VMware Verify application offers two-factor authentication built into the platform, using employees' personal phones and tablets as tokens. An employee can authenticate by responding to an SMS notification on the device that says "verify." Workspace One also lets users access a native OS and application experience on their own devices without requiring MDM profiles, an increasingly common demand among users who don't want their phones or tablets registered. Instead, the user can download Workspace One, log in with a corporate e-mail address, and the single sign-on capability provides access to a company's intranet and native Windows apps.

The new releases let administrators activate a feature called "Workspace Services," which enables native operating system data protection to ensure sensitive data isn't printed or otherwise copied outside of the Workspace One environment. At the same time, VMware said it protects the privacy of employees' personal data and doesn't allow administrators to access personal apps, control device functions such as GPS, or place other device restrictions outside the confines of Workspace One. Within the Workspace One environment, however, VMware said it has added advanced conditional access policies, device auditing, automated remediation and lifecycle management.

VMware's Workspace One unified catalog will also support the Microsoft Store for Business to simplify how IT managers and administrators acquire, distribute and manage Windows 10 apps. VMware claims this promises to reduce application delivery and lifecycle management complexity by tying its application catalog and application delivery capabilities to the Microsoft Store for Business. Among the benefits are the ability to purchase apps in bulk, cache them for offline distribution, allow administrators to reassign licenses and roll out applications built in-house.

Posted by Jeffrey Schwartz on 06/17/2016 at 2:09 PM


Microsoft Partner To Offer Hyperconverged Azure Hybrid Cloud Platform System

Nutanix has become the third hardware company to ink a pact with Microsoft to deliver the hybrid iteration of its Azure Cloud Platform System (CPS). Until now, only Dell and Hewlett Packard Enterprise have offered CPS. Nutanix today announced a partnership with Microsoft to offer the CPS Standard edition. The Nutanix CPS, which will offer Microsoft Azure-consistent environments, will be available next month.

The announcement comes on the eve of the Nutanix .NEXT partner and customer conference next week in Las Vegas, where Jeffrey Snover, Microsoft Technical Fellow and lead architect for its Enterprise Cloud Group, is scheduled to speak.

Similar to the partnerships Microsoft inked last year with Dell and HPE to offer CPS Standard based on the hybrid cloud reference architecture, the Nutanix CPS Standard will be based on a jointly engineered solution, the company said. Also like the Dell and HPE offerings, the Nutanix system is based on Windows Server 2012 R2, System Center 2012 R2 and the Windows Azure Pack. Nutanix hyper-converged systems are best suited for organizations looking to consolidate their applications, workloads and infrastructure management on a common platform. The system comes with Nutanix's Prism management platform and Acropolis, which the company describes as a scale-out data fabric for storage, compute and virtualization.

Like its existing hyper-converged systems, the Nutanix CPS Standard is aimed at running business critical workloads such as SQL Server, Oracle and SAP. According to Nutanix, 50 percent of all systems it sells run applications or infrastructure from Microsoft, Oracle and Splunk.

While Nutanix is not as well-known as Dell and HPE, the company is a rapidly growing provider of hyper-converged systems that integrate storage and compute capabilities for large scale-out workloads. Nutanix, which has OEM agreements with Dell, Lenovo and others, recently registered for an initial public offering.

Posted by Jeffrey Schwartz on 06/17/2016 at 12:24 PM


Symantec Charts New Path with $4.6 Billion Bid for Blue Coat

Looking to revive its struggling IT security business hampered by miscues and a tapped out market for its flagship endpoint security offering, Symantec Monday said it has reached an agreement to acquire Blue Coat, a provider of Web analytics, threat assessment and remediation software and services, for $4.6 billion. Blue Coat CEO Greg Clark will take the helm at Symantec once the deal closes. As part of the deal, Blue Coat majority shareholder Bain Capital will reinvest $750 million in the combined company and Silver Lake Partners is doubling its investment to $1 billion.

The deal, set to close next quarter, would be the largest for Symantec in more than a decade, and it is paying a high premium for Blue Coat, which had $598 million in revenue in its latest fiscal year and had recently filed for an initial public offering (IPO).

Experts say acquiring Blue Coat, which claims 15,000 customers including 75 percent of the Global Fortune 500, should further propel Symantec into the threat analytics market after it pulled its Web security offering last year in a move to cut costs. Blue Coat has now withdrawn its IPO filing. Others wonder whether a successful integration of the two companies will be enough to rejuvenate the leadership position Symantec once had as an IT security provider. Bob Phelps, managing director for security services at Accenture, said Symantec is playing catchup with companies such as FireEye and Palo Alto Networks, among others. "Symantec's problem has been their cash cow business is going away, they have not had good management and they have not integrated companies together effectively," he said. Observers also said Symantec has experienced a significant drain of executives, such as Jeff Scheel, senior vice president of corporate strategy, who left to join Ionic Security, a startup that earlier this month pulled in $45 million in funding from Amazon and Goldman Sachs. In late April, Symantec CEO Michael Brown unexpectedly stepped down after less than two years on the job, following a projected falloff in sales.

Having spun off its Veritas backup and recovery business, Symantec is making a huge bet on Blue Coat, a rapidly growing provider of both on-premises and cloud-based security analytics tools, which the company claims are used by more than 70 percent of the Global Fortune 500.

Blue Coat's Secure Web Gateway, which includes ProxySG, provides user authentication, Web traffic filtering, cloud application monitoring, data loss prevention and threat prediction, as well as visibility into encrypted traffic. ProxySG works with Blue Coat's Content Analysis System, which provides static code analysis and sandbox brokering. Blue Coat says the system also draws on Web and threat intelligence from 15,000 of the largest global enterprises.

Blue Coat's security analytics appliance, the Solera Networks DeepSee recorder for forensics, deep packet inspection and on-demand session replay, will bolster Symantec's late entry into the threat detection, remediation and forensics arena, according to an IDC research note. The addition of Blue Coat could pit Symantec against IBM's QRadar Security Intelligence Platform and the RSA Security Analytics platform in threat detection, incident response and forensics investigations, according to IDC, though the firm warned that "these competitors have much greater traction and embedded customers today and will be hard to unseat."

In addition to Blue Coat's SaaS-enabled ProxySG, Blue Coat recently acquired Elastica and Perspecsys, which IDC believes would bolster Symantec's Web security and cloud security product offerings. "IDC views the Symantec acquisition with cautious optimism and believes this could be an opportunity to modernize the company's portfolio and compete strongly in this crowded market for large enterprise security deals," according to the research note.

On a conference call with analysts Monday, interim President and COO Ajei Gopal rationalized the deal. Citing a 125 percent increase in the number of zero-day attacks, a 55 percent rise in spear-phishing attempts and a continued surge in ransomware campaigns, Gopal claimed the combined company will give Symantec "best-in-breed" protection against those threat vectors, both on-premises and in the cloud.

"With Advanced Threat Protection capabilities from both companies, we will be able to detect and remediate the most sophisticated cyberattacks," Gopal said. "Not only will the combination of our products deliver comprehensive cyber defense, but our best of breed products will also individually get better."

Gopal said that will be achieved by the data Symantec now collects and analyzes from 175 million endpoints running Symantec Endpoint Protection, the 2 billion e-mails it scans daily and the millions of physical, virtual and cloud workloads it protects. For its part, Blue Coat collects and analyzes more than 1.2 billion new Web requests secured per day through its Secure Web Gateway, Gopal noted, and monitors and protects more than 12,000 cloud applications through its Cloud Access Security Broker.

"Symantec and Blue Coat collect different and complementary data sets," he added. "Taken together, we will have unmatched visibility and data about the global threat landscape. We believe combining these two massive data sets enables Symantec to deliver unparalleled cyber defense. As more enterprises embrace the cloud, they need to secure their users, data and applications wherever they reside. Previously that would have taken over eight to ten point vendors to provide the security capabilities needed."

Posted by Jeffrey Schwartz on 06/15/2016 at 12:13 PM


Why Is Microsoft Spending $26.2 Billion for LinkedIn?

Microsoft's announcement today that it is acquiring LinkedIn had many observers scratching their heads wondering how the company can justify shelling out $26.2 billion to bring the second largest social network into the fold. Executives at both companies now have to sell the deal to customers, partners and investors and convince them that the move will bring enough revenue and profit growth to justify the huge outlay in what will be Microsoft's largest acquisition to date.

Initial reaction to the deal was mixed. While some believe it can transform Microsoft in many ways, others pointed out that most huge deals rarely live up to their promise. "There's no reason to believe the LinkedIn deal cannot work, I just think it may take a long time and it may not produce the earnings leverage Microsoft shareholders deserve for a $26 billion deal," Roger McNamee, a co-founder of Elevation Partners, told CNBC.

On a conference call with investors, Microsoft executives led by CEO Satya Nadella, along with LinkedIn CEO Jeff Weiner, described a vision that brings the LinkedIn experience into all components of Office 365 and Dynamics, as well as key tools such as Cortana, Azure Machine Learning and Active Directory, enriched with LinkedIn news feeds, courseware and profiles.

"When we talk about Microsoft's mission, we talk about empowering every person and every organization on the planet to achieve more. There is no better way to realize that mission than to connect the world's professionals to make them more productive and successful. That's really what this acquisition is about," Nadella said.

While LinkedIn will retain its separate brand and organizational structure, the two plan to deeply integrate their respective platforms in ways that aim to extend the usage of both companies' respective products and services. They spoke broadly about plans to tap into each other's respective artificial intelligence (AI) capabilities -- the Microsoft Graph and the LinkedIn Graph -- to create what Weiner described as an "economic graph."

"When you combine the Microsoft Graph with LinkedIn's professional graph, we think we are going to be able to take a very substantial leap forward in terms of the realization of our vision, which is creating economic opportunity for every member of the global work force," Weiner said. It focuses on providing digital representations for profiles, skills, hiring and learning, he said. "The goal is to step back and to allow all forms of capital -- intellectual capital, working capital and human capital to pull it where it can best be levered, and in so doing transform the global economy, he said. "We believe we will be better positioned to make this possible."

The idea is that knowing what project a worker is engaged in, and tapping information from users' news feeds and the activities of those in their social networks, can improve workflows and provide better access to information, Nadella said. A Dynamics CRM user, for example, might be able to follow up on a lead or a customer meeting more effectively using account information, and Cortana might provide information about the people a user is about to meet with, he said.

McNamee stopped short of telling CNBC reporters that the acquisition will fail, but he emphasized his skepticism. "It's really simple, big deals don't work," he said. "These things are sold, not bought. From Microsoft's point of view, they have to pay attention to the fact that LinkedIn has never been a particularly profitable company, and their focus on job boards and hiring and all of that has shifted the value proposition in a way that I do think limits their ability to integrate it with Microsoft's enterprise suite. It's not to say they can't do it, it will take real work, and I think quite a lot of time."

Posted by Jeffrey Schwartz on 06/13/2016 at 2:32 PM


Microsoft Makes Deal To Acquire LinkedIn for $26.2 Billion

In a bold and stunning move, Microsoft today announced it is acquiring the professional social network LinkedIn for $26.2 billion in cash and debt. Presuming the deal goes through, it will be by far the largest acquisition Microsoft has made in its 41-year history, adding a roster of 433 million registered users, of which 105 million unique visitors access their accounts at least once a month.

The deal, already approved by the boards of directors of both companies, is expected to close by year's end. Setting aside the company's failed $44.6 billion bid to acquire Yahoo back in 2008, it is roughly three times the size of the largest deal Microsoft has completed to date. By acquiring LinkedIn at a 50 percent premium over its closing price on Friday, Microsoft CEO Satya Nadella is making his biggest bet yet in his bid to grow the company's business. Despite the huge strides the company has made in reshaping itself for the post-PC era, investors and analysts have shown impatience with its pace of growth. Microsoft's decision to acquire LinkedIn also demonstrates that the company is looking to play in a market it has largely avoided. While Microsoft acquired social networking technology with Yammer and invested $240 million in Facebook in 2007, this deal marks the first time Microsoft will try running a huge social network in a market dominated by Facebook, Twitter and Google, among others.

Microsoft said LinkedIn will remain independent, with CEO Jeff Weiner remaining CEO of the widely used professional social networking service. The deal raises questions as to what benefits adding a huge public social network will bring to Microsoft's existing portfolio. In a 90-second video created by Microsoft with Nadella and Weiner, the two gave brief statements on the rationale for the deal. Nadella said he has long contemplated acquiring LinkedIn, believing it fits in with the company's overall productivity and platforms focus.

"For sure I am a deep believer in productivity tools and communication tools because that's what empowers people to be able to be great at their job," Nadella said in the video. "But think about taking that, and connecting it with the professional network and really having the entirety of what is your professional life be enhanced, more empowered, where you're acquiring new skills and being more successful in your current job and finding a greater, bigger next job. That's that vision."

Weiner said that during the discussions leading up to the deal, both sides agreed the two companies were aligned in two key areas: purpose and structure. "Satya said time and time again, 'You're going to have your independence, we have this shared sense of alignment, so let's dream big, let's think about what's possible.' That's going to be first principle."

Keeping an acquired company like LinkedIn independent is not an unusual structure for Microsoft, at least at the outset of such deals. When Microsoft acquired Yammer, Skype and Nokia's handset business, among the largest acquisitions it had made previously, similar structures were established initially, only for those companies to eventually become more integrated into the Microsoft corporate structure. That has had mixed results. Most of the core Yammer team is now gone and Microsoft has pared back most of the Nokia handset operations. Skype has fared better so far and is evolving into a key component of Office 365.

Microsoft also appears to be betting that leveraging a large and established community of professional users will enable new opportunities. LinkedIn has had more than 45 billion quarterly member page views, a figure that has grown 34 percent year over year. LinkedIn also hosts 7 million job listings, up 101 percent over the past year, while 60 percent of its users access the service from mobile devices.

Initial reaction to the deal has been mostly that of surprise, with many sharing intrigue over the possibilities to link offerings such as Office 365, SharePoint, Dynamics and Azure in some way. One key task for LinkedIn will be to find ways to engage with many users who find the service has become a platform full of clutter and unwanted connection requests. "Satya Nadella makes bold final attempt to stop LinkedIn from e-mailing him," quipped author Matt Gemmell. Two hours later the comment had been retweeted 817 times.

Posted by Jeffrey Schwartz on 06/13/2016 at 9:10 AM


Microsoft Researchers Correlate Bing Searches to Signal Pancreatic Cancer

Findings from a study conducted by three Microsoft researchers, who retrospectively analyzed the searches of people diagnosed with pancreatic cancer, showed patterns that may help connect symptoms not obviously associated with the disease and lead to a diagnosis sooner.

Using anonymized data from Bing's Web search logs, the three researchers were able to identify users who appeared to have been diagnosed with pancreatic cancer and then look for common queries those users made before discovering they had the deadly disease. Pancreatic cancer is among the deadliest forms of the disease because it spreads fast and symptoms rarely appear before very late stages.

The Microsoft researchers included Technical Fellow Eric Horvitz; Ryen White, chief technology officer of health intelligence and a principal researcher; and former Microsoft intern John Paparrizos, who is now a doctoral candidate at Columbia University. Their findings were published this week in the Journal of Oncology Practice. An abstract of the report explained that online activities could offer clues about emerging health conditions. The researchers said they used statistical analyses of large-scale anonymized search logs, examining symptom queries from millions of people.

"We identified searchers in logs of online search activity who issued special queries that are suggestive of a recent diagnosis of pancreatic adenocarcinoma," according to the report's abstract. "We then went back many months before these landmark queries were made, to examine patterns of symptoms, which were expressed as searches about concerning symptoms. We built statistical classifiers that predicted the future appearance of the landmark queries based on patterns of signals seen in search logs."

The results showed that the classifiers could identify 5 percent to 15 percent of cases in which search activity could potentially predict the existence of the disease, according to the report.

Horvitz, an artificial intelligence expert with both a Ph.D. and an M.D. from Stanford University, said in a Microsoft blog post that queries about certain strings of symptoms could offer early warnings of the disease, potentially leading to a diagnosis weeks or months earlier than patients might otherwise receive one. Given the low survival rate of pancreatic cancer, that could at the very least increase the odds of remission, though he warned the findings are just a proof of concept. Microsoft said it doesn't plan to produce any products based on the findings but suggested the work sets the stage for future discussion.

Posted by Jeffrey Schwartz on 06/10/2016 at 11:15 AM


SSO and GPU Boost Coming to Citrix Virtual Desktop and Apps

While the big news for Citrix XenApp and XenDesktop customers is the company's plans to offer Microsoft Azure as a back-end service for its virtual application and VDI offerings, updated releases of both offerings are also coming this month.

The latest versions of Citrix XenApp and Citrix XenDesktop will include single sign-on support through a new federated identity management service, offer extended GPU support for graphics-intensive applications and are now suited for higher performance via integration with key hyper-converged computing platforms. Citrix said its XenApp and XenDesktop 7.9 release will integrate with SAML-based identity providers and Active Directory connected to its Citrix NetScaler load balancing, networking and application distribution controller.

Citrix officials outlined the new features in XenApp and XenDesktop at last month's Citrix Synergy conference in Las Vegas. Over the past year, Citrix has moved XenApp and XenDesktop to a common code base, and, rather than delivering major new releases once per year, Citrix is now updating both more frequently with incremental new features.

The federated identity management service added to the 7.9 release lets each business or group owner administer their own Active Directory domains. Those setting up collaborative workgroups, supply chains or other extended networks can share multiple instances of Active Directory and Azure Active Directory. Citrix bills this offering as an alternative to building two-way or multi-way trusts between parties. "Instead they can just point them to the federated authenticated service and that would do the brokering between them," explained Carisa Stringer, Citrix principal product marketing manager for desktop and apps, in a blog post describing the new release.

On the device side, the new 7.9 release adds improved GPU support, including the ability to run Linux clients using NVIDIA's latest processor, which NVIDIA has made available at significantly reduced cost and software subscription fees for XenApp implementations. "Modern applications are using graphics, and this makes GPUs more widely available," said Calvin Hsu, Citrix VP of marketing for XenApp and XenDesktop, during an interview at Synergy.

Also in the graphics area, Citrix said the release now supports Intel's new Iris Pro and its server virtual GPU, called Graphics Virtualization Technology virtual GPU (GVT-g), aimed at making graphics acceleration more available to virtual desktop instances. Citrix said it has integrated its HDX 3D Pro with Intel Iris Pro server graphics. "With the next generation of servers, these things will be built into the server -- it's part of the chip, it's integrated at the CPU, it's great for Office, for browser applications, even Windows 10 itself," Hsu said. "Just the operating system leverages the GPU, it'll render stuff on screen, and so that will really help with the overall graphics capabilities of virtual desktop."

When used with the new XenServer 7, the latest release of Citrix's hypervisor platform that's now generally available, Intel Iris Pro makes it less expensive to run graphics-intensive applications because it doesn't require added hardware or software licenses. It's suited to browser-based apps, PowerPoint and computer-aided design (CAD) tools. Citrix said XenServer 7 supports up to 128 NVIDIA GRID vGPU-enabled VMs, a 33 percent increase over earlier versions.

The third key feature in the XenApp and XenDesktop 7.9 update is optimized storage I/O and support for hyper-converged infrastructure. Citrix inked a partnership with Nutanix, whose Acropolis hypervisor platform will work with XenServer. Because XenServer now understands all of the Nutanix Acropolis APIs and provisioning steps, it can be set up via the XenDesktop console. Nutanix also introduced InstantON VDI. The company claims the latest iteration of its hyper-converged compute and storage platforms takes an average of just four hours to get a VDI implementation up and running. It's priced at $415 per user and aimed at midsize organizations with at least 300 users, but it can scale to thousands. It comes bundled with Citrix XenDesktop VDI Edition, the Nutanix enterprise cloud platform (including the company's AHV hypervisor), the Nutanix connector for Citrix MCS and three years of support. Citrix also said it will work with hyper-converged equipment from Atlantis Computing and Fujitsu.

Citrix also extended its Lifecycle Management Blueprints with support for any XenApp and XenDesktop deployment. Announced last fall, the Citrix Lifecycle Management Blueprints were designed to simplify and enhance new deployments of all Citrix offerings. The latest Blueprints for XenApp and XenDesktop include Update Service, aimed at ensuring software updates are applied properly, and Smart Scale, which tracks usage of cloud services and adjusts it to guard against unneeded use of public clouds.

Posted by Jeffrey Schwartz on 06/06/2016 at 1:49 PM


Dell Ups Datacenter Infrastructure Ante with New Liquid Cooling Tech

Every so often, someone reveals a major advance in keeping datacenter infrastructure cool -- the key to gaining efficiencies in running some of the largest deployments of server clusters. Dell, with major help from Intel, says it has found a breakthrough approach to liquid-based cooling. Revealed yesterday, the new hyper-converged system will first be deployed by eBay as a proof of concept, Dell said.

The new technology is the latest iteration by the Dell Extreme Scale Infrastructure Group to push the envelope in applying liquid cooling to the rack in a manner that can push the performance and cost-effectiveness of deploying scale-out infrastructure for sites and compute functions that generate massive numbers of transactions per second. Running water through computing racks is considerably more expensive on a per-gallon basis than using forced-air cooling methods. But according to Dell officials, water is 25 times more efficient at removing heat from the CPUs (critical for ensuring reliability and performance) than air is, watt for watt.

At a briefing in Dell's ESI lab in Austin, Texas this week, Dell officials demonstrated the new system code-named Triton. Dell is not the first company to deliver liquid-cooling technology. IBM actually first started doing so for its mainframes back in the 1970s. Two years ago, Hewlett Packard (now known as Hewlett Packard Enterprise), introduced the Apollo 8000, a supercomputer-class system featuring liquid-based cooling.

Dell and Intel collaborated on a customized 200W, 20-core Intel Xeon E5 Broadwell CPU that they claim can provide "double-digit" performance gains. By removing the liquid-to-liquid heat exchangers and water pumping systems typically found in datacenter liquid-cooling racks, Dell is able to achieve higher efficiency and lower water consumption. Dell claims Triton has a power usage effectiveness (PUE) rating as low as 1.02 and requires 97 percent less datacenter cooling power than typical air-cooled systems and 62 percent less power than HPE's Apollo 8000.
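
For context, PUE is a standard efficiency metric: the ratio of total facility power to the power drawn by the IT equipment itself, so a value approaching 1.0 means nearly all power goes to compute rather than to cooling and other overhead. The arithmetic below simply restates Dell's claimed figure in those terms.

    \mathrm{PUE} = \frac{P_{\text{total facility}}}{P_{\text{IT equipment}}},
    \qquad \mathrm{PUE} = 1.02 \;\Rightarrow\; \frac{1.02 - 1.00}{1.00} = 2\% \text{ overhead beyond the IT load}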

The cooling technology enables the processor to stay in turbo mode all the time. The system pushes water at high pressure without having to add additional cooling to it. The water is distributed through copper pipes with specially sealed joints that support 350 PSI, based on current tests, and are rated above 5,000 PSI.

"Triton, unlike a lot of other liquid cooling solutions, is unique in that we don't have a distribution unit and we don't have a secondary cooling unit," explained Austin Shelnutt, principal thermal architect for datacenter solutions at Dell, who demonstrated Triton. "We bring that facility water directly into each one of our server nodes. Do not pass go. Collect $200 go straight to the node and strike the heat right from the CPU."

Each sled in the unit has the customized Intel Xeon Broadwell processor, four 2.5-inch hard disk drives and each has its own node power distribution board with leak detection sensors that, in the event of a leak, will kill off water flow into just that sled. The power to the sled is also shut off and an alert is issued via the management interface.

The system still does have fans to cool the PCIe devices, memory and hard drives, though the liquid-distribution technology provides 80 percent of the cooling, Shelnutt said. (Dell released a video of Shelnutt demonstrating Triton.)

Dell and Intel designed the system to meet the search performance required by eBay. In a statement, Nick Whyte, eBay's VP and fellow for search technology, said Triton produced a 70 percent increase in throughput in terms of queries per second (QPS). While not everyone needs such performance, Triton could find its way into more mainstream datacenter environments over time, said Gina Longoria, a senior analyst at Moor Insights & Strategy.

"Triton today is designed for very large scale customers so it will only be relevant to a small set of customers," Longoria said. "Dell is currently looking at a 'closed loop' version of Triton that offers the same core liquid cooling technology and CPU support but removes the need for datacenters to have facility water at the rack. This has the potential to bring liquid cooling to an even broader set of scale-out customers." While liquid cooling will remain a technology used by a limited set of customers in the near term, she said it could be a good fit for customers with CPU-intensive workloads or those that have unique energy efficiency requirements. "It may be used more broadly as the technology evolves to be more easily integrated into existing datacenter," she said.

The fact that Triton can use direct cooling tower water with filtration and sensors to allow it to do so is what gives it an edge today, Longoria added. "Other systems, like [HPE's] Apollo, require a chilled water loop which is an intermediate exchange and use energy to cool the water," she said. "This makes the facilities preparation and operations less expensive with Triton which could result in an overall TCO advantage. Also since Triton requires no intermediate liquid-to-liquid heat exchange, it is more effective in removing heat. This theoretically allows the Triton system to cool components that other liquid-cooled solutions can't."

Posted by Jeffrey Schwartz on 06/03/2016 at 2:01 PM


Microsoft Taps Partners in Blockchain Community To Advance Global Identity Initiative

While Microsoft was the primary technology sponsor of last month's ID2020 Summit at the United Nations, about 100 other tech providers, ranging from consultants to startups and other large companies, had a presence as well. Less than two weeks following the May 20 event at the U.N., Microsoft indicated it was a fruitful gathering.

Having attended the event, I could see that firsthand. Yorke Rhodes III, a blockchain business strategist at Microsoft who also spoke at the ID2020 Summit, announced in a blog post Tuesday that Microsoft is partnering with Blockstack Labs, ConsenSys and other developers worldwide on a system for giving everyone an identity. If there were any lingering doubts about Microsoft's commitment before the ID2020 Summit, the interaction that took place there put them to rest. The U.N. has set a goal of ensuring everyone an identity by the year 2030 as part of the 17 Sustainable Development Goals it announced last year. In his post, Rhodes described what the ID2020 group hopes to achieve in the coming five years.

The goal is to create an "open source, self-sovereign, blockchain-based identity system that allows people, products, apps and services to interoperate across blockchains, cloud providers and organizations," Rhodes explained. "Our goal in contributing to this initiative is to start a conversation on blockchain-based identity that could improve apps, services, and more importantly, the lives of real people worldwide by enabling self-owned or self-sovereign identity."

The latter part is key. While most of those reading this likely have some form of legal identity, perhaps a driver's license, bank account or passport, not to forget numerous online accounts, there are 1.5 billion people around the world who have no such identity. Rhodes shared some other statistics presented at last month's ID2020 Summit:

  • One in three children under the age of five does not officially exist because their birth has not been recorded.
  • Cumulatively, 230 million children under the age of five have no birth certificate. This number is growing.
  • Each year, 50 million children, a population roughly the size of the U.K.'s, are born without a legal identity.

Rhodes maintains that it's possible to implement a self-sovereign identity management system using components of blockchain-based systems. By partnering with Blockstack Labs and ConsenSys to leverage their Ethereum-based identity solutions and stealth provider uPort, Rhodes said the goal is to collaborate on an open source initiative "to produce a cross-chain identity solution that can be extended to any future blockchains or new kinds of decentralized, distributed systems."

As noted last week, there are many obstacles to overcome. Besides technical interoperability, coming up with geo-political consensus and a governance structure for this model will be a complex effort, to say the least. But the wheels are now in motion. Microsoft, which has identified identity management as a key component of its technology stack, intends to play a major role in the effort to see how blockchain may play a part in the future of all credentials.

Posted by Jeffrey Schwartz on 06/01/2016 at 1:03 PM


Citrix and Microsoft's Relationship Just Became Cozier

Citrix and Microsoft have worked closely together for 25 years. At one point in the late 1990s, Microsoft even bailed Citrix out with a cash infusion and investment that kept the company going. While Citrix has carved its own niche in desktop virtualization ever since and the two companies have worked closely together, now they have extended their partnership in a new and potentially meaningful way.

The two companies described their latest pairing, announced at this week's annual Citrix Synergy conference in Las Vegas, as their most significant to date. It's the most extensive partnership in terms of the number of offerings involved from both companies and the level of engineering that has taken place or will take place.

The announcement comes three months after Kirill Tatarinov, a longtime top executive at Microsoft, took the helm as president and CEO of Citrix. Tatarinov arrived amid activist investor Elliott Management pushing for changes -- ostensibly growth for a company that has invested heavily in engineering but not yielded payback. Tatarinov has remained largely silent since taking over in January. At Synergy, Citrix refreshed its entire product line and Tatarinov said "we're all in the cloud," adding that the company's entire portfolio will be cloud-focused over the coming months. Clearly, Microsoft is going to be a key enabler of Citrix's accelerated push to the cloud, a statement of direction Citrix made last year when it said its new Workspace Cloud, since renamed Citrix Cloud, would use Azure as its control plane.

The new pact, revealed by Tatarinov in the opening keynote session at Synergy, identifies Microsoft Azure as the strategic cloud for delivering on Citrix's imperative of cloud-enabling all of its offerings. The new partnership also covers Office 365, the Enterprise Mobility Suite (EMS) and Windows 10 Enterprise migrations and deployments. On Citrix's end, it also covers a broad swath of its portfolio, including the XenDesktop and XenApp offerings; AppDNA, the company's app migration tool; NetScaler, its application-aware load balancing controller and gateway; and ShareFile, Citrix's file storage and synchronization platform.

"We are taking our relationship to the next level," Tatarinov said in his keynote. By identifying Azure as its cloud of choice to deliver its own services, Tatarinov emphasized Citrix will continue to support multiple cloud services. The company has enabled customers, for example, to run their Citrix infrastructure in the Amazon Web Services cloud, though on a self- or partner-managed basis. Citrix offerings are also available as managed services by various hosting partners. "Our job is to give customers choice and enable them to run Citrix from any cloud they choose," Tatarinov said in a press briefing following the keynote. "It's only logical for those who are deploying Office 365, Windows 10 powered from XenApp and Xen Desktop, for those Microsoft customers to expect those capabilities to come from the Azure cloud."

The portion of the partnership that drew the largest applause was that Citrix will offer customers who have licensed Windows 10 Enterprise (Microsoft's Current Branch for Business) the option of managing their Windows 10 images on Azure via XenDesktop without having to pay an additional license fee. It's the first time Microsoft has permitted this capability. Delivered as a service, it'll include the ability to use the Citrix AppDNA migration tool and to deploy virtual desktops or apps.

"This is an industry first, it's the first time we announced the ability for a Windows client to be hosted in a public shared cloud," said Brad Anderson, Microsoft corporate VP for enterprise and client mobility, who was joined by Bill Burly, VP and general manager of Citrix's Workspace Services Business unit, on stage in Wednesday's keynote session. "It's a big, big deal for the industry. I really think this Windows 10 VDI service on Azure is going to open doors up. People are dying to take advantage of the Azure power to deliver VDI."

Anderson noted that this is not just a licensing agreement -- the two companies have worked on technical integration for nearly a year. "I love the integration that's happened where XenApp apps can now be hosted in Config Manager, which [manages] 70 percent of the world's Windows devices," Anderson said. "It's the tool that everybody is using to upgrade to Windows 10 and now XenApp just fits inside, and you can publish all apps of all sizes in that Config Manager console and in one console you see everything."

On the Office 365 side, Citrix will now enable the suite to run in XenApp and XenDesktop environments. Xen users will be able to run macros and plug-ins as well as run Skype for Business with what Citrix officials described as "dial-tone" service.

On the enterprise mobility management side, the two companies are competitors and collaborators. Each offers its own enterprise mobility management suite -- Microsoft's EMS includes Intune, Azure Active Directory and Rights Management Service, while Citrix offers XenMobile. Under the new pact, Citrix XenMobile and NetScaler will support Microsoft's mobility management tools.

"NetScaler is going to be EMS-enabled," Anderson said. "What that means is as EMS-managed apps and devices come through NetScaler at the perimeter, NetScaler is going to interact with EMS services, and we'll be able to enforce access based upon the policies that are set by EMS. Literally every EMS customer can also be a NetScaler customer."

The two companies are also taking Citrix NetScaler capabilities and integrating them into the MAM SDK of Microsoft's EMS. That will allow any app built with the SDK to connect with on-premises apps, according to Anderson. And Citrix is going to build its own new enterprise mobility service that will run on Azure but be offered as a Citrix service. The two companies' competing mobility offerings will interoperate with each other, Anderson said. "Citrix will bring all of their experience in security compliance, especially these highly regulated businesses, and we'll do all that work in the cloud and apply that," he said.

Time will tell whether these moves merely help Citrix hold on to its existing customers or help grow the business. "All of the technology stuff makes sense. The big question to me is how will they execute together on that," said Enterprise Strategy Group Analyst Mark Bowker. "There's gaps in Citrix's portfolio, such as identity and access management. Microsoft clearly can help fill in that gap."

Investors and customers will be watching closely whether Microsoft can give Citrix the pull it needs to grow its business again.

Posted by Jeffrey Schwartz on 05/26/2016 at 7:53 AM0 comments


Microsoft Among Those Pitching Blockchain at U.N. Summit To End Identity Crisis

Leading and emerging technology providers, human rights activists and U.N. ambassadors from around the world gathered Friday for the inaugural ID2020 Summit to kick off a major and non-trivial effort of developing a technical, organizational and political framework that would ultimately ensure everyone in the world has a trusted digital identity. It's a significant undertaking considering 1.5 billion people, 20 percent of the world's population, don't have any form of identity today, leaving them vulnerable to exploitation and human trafficking and depriving them of basic rights because they have no provable or traceable form of identification.

I attended the inaugural "ID2020 Summit -- Harnessing Digital Identity for the Global Community," held at the United Nations headquarters in its Trusteeship Council Chamber with 100-plus tech companies and business leaders appearing. The U.N. is participating in the project because it has determined the effort fits with one of its 17 Sustainable Development Goals (SDGs) outlined last fall. SDG 16, which seeks to create a global knowledge platform, aims to ensure everyone is given a globally accepted digital identity by 2030. Microsoft and PricewaterhouseCoopers (PwC) were the primary sponsors of the ID2020 event, though contributors also included companies and standards bodies of all sizes, among them Cisco, the Depository Trust & Clearing Corp. (DTCC), the International Telecommunication Union, the Open Identity Exchange (OIX) and OASIS. Joined by U.N. ambassadors, humanitarian leaders and others, the goal was to lay the groundwork for what will be a long process of trying to solve the global identity crisis. It's a crisis because those with no identity are subject to human trafficking -- particularly women who are captured by sex traffickers. Not having an identity also affects migrant workers and those living in underdeveloped parts of the world who are denied access to health care, education and voting rights, among other things.

Blockchain's Potential
A growing contingent of technology providers and businesses is becoming bullish that Blockchain -- the technology that enables anonymous Bitcoin transactions -- could solve many issues in handling transactions, and could also be a key ingredient in providing an interoperable identity framework. Leading financial services firms are looking at Blockchain, described simply as a distributed ledger with digitally recorded data in packages called blocks that are unchangeable and verifiable. Many experts are exploring Blockchain's ability to offer a verifiable identity.
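
To make that description concrete, here is a minimal, illustrative Python sketch of the property being invoked: each block records a hash of the block before it, so altering any historical record breaks the chain and is immediately detectable. It is a teaching toy under simplified assumptions -- a single in-memory list, no distributed network, no consensus algorithm and no vendor's actual implementation.

```python
import hashlib, json, time

def block_hash(block):
    # Hash the block's full contents, including the pointer to its predecessor.
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    block = {
        "index": len(chain),
        "timestamp": time.time(),  # time-stamped record
        "data": data,              # e.g., an identity attestation
        "prev_hash": block_hash(chain[-1]) if chain else "0" * 64,
    }
    chain.append(block)
    return block

def verify(chain):
    # Valid only if every block still points at an unaltered predecessor.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, {"subject": "person-001", "claim": "birth registered"})
append_block(chain, {"subject": "person-001", "claim": "school enrollment"})
print(verify(chain))                      # True
chain[0]["data"]["claim"] = "forged"      # tamper with history...
print(verify(chain))                      # False -- the alteration is detectable
```

In an actual distributed ledger, many parties each hold a copy of this record and a consensus algorithm keeps the copies in sync, which is what lets parties who don't know each other still trust the shared history.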

There's no lack of identity frameworks in the developed world. On the commercial side, people have Social Security numbers, Facebook accounts, Google IDs and Microsoft Accounts that offer some level of interconnectivity via recognized legal and technical standards. There are also various emerging Blockchain frameworks, which simply are distributed repositories that offer electronic stamps that can't be changed. Because they are proprietary and still formative, they support varying forms of authentication, oftentimes biometrics. In financial services, transactions are timestamped and don't require entities to have a relationship.

John Farmer, director of technology and civic innovation at Microsoft, pointed to three key benefits of Blockchain: It's immutable, meaning data and points of time can't be altered; it has a trustless nature, meaning that parties don't have to know each other yet they can have faith the Blockchain is accurate; and it's a transparent agreed-upon network.

"In terms of Blockchain in particular, we think it is great because of its potential to empower people," Farmer said. "This fits so well with Microsoft's own mission statement."

Early Days
Nevertheless, Blockchain is still an emerging technology, and there's no standard way to interconnect proprietary networks, nor a global political and legal framework surrounding them. "When I see Blockchain, I see something that's in its early days," Farmer said. "I look at this akin to where the Web may have been in the early 1990s, and at the same time, a lot of us see the potential of where it can be. It's simply going to take some work to get from here to there."

Scott David, director of policy at the University of Washington Center for Information Assurance and Cybersecurity and a principal consulting analyst at TechVision Research, believes Blockchain has potential to solve the problems the ID2020 forum is hoping to someday solve.

"The way the technology works is that the Blockchain has the potential to act as a computational sovereign that can help you address problems that are trans-jurisdictional, such as identity for migrating populations," David said. "It can also help the weighting of multiple NGOs' [non-governmental organizations] trust frameworks for interoperability."

ID2020 Focus on Universal Identity
Indeed, many leading providers of Blockchain technology at ID2020 talked up Blockchain as an underlying technology to achieve these goals. ID2020 Founder John Edge, also a proponent of Blockchain, said last week's event was not about selling the technology to the U.N.

"ID2020 is a hub for gathering the world's innovative technologies to the United Nations and other organizations who would benefit potentially from discovering this technology," Edge said at a brief press conference during the event. "It definitively is not recommending that technology. This is the start of a 15-year ambition. We have a lot to figure out." Edge said the intent is to spend the next five years determining whether it can move forward with rolling out an identity framework by 2030.

While attendees were all in agreement that the key priority is saving hundreds of millions of people -- especially young girls -- from a life of exploitation and in many cases captivity, many attendees were there to discuss, or learn about, how Blockchain might play a role in that. The technology is starting to get strong attention in the financial world as well and has been endorsed by the likes of The World Bank and Citigroup, among others.

Microsoft's Blockchain Efforts
For its part, Microsoft has been following the technology for some time, and in November announced its plan to deliver on the technology when it teamed with ConsenSys to offer Ethereum Blockchain as a Service (EBaaS) on Azure, aimed at providing a single-click, cloud-based Blockchain developer environment.

While not organizers of last week's event, Microsoft and PwC were the major sponsors and had a strong presence. Avoiding the appearance of pushing Blockchain as the answer, officials from both companies nevertheless didn't hide that they are strong proponents of it.

"This notion of establishing a secure identity for not just individuals but everything," said Marley Gray, director of business development and cloud enterprise at Microsoft. "Being able to track and be able to transact and secure identity is one of those key enablers to allow you to flow through these different technology innovations. The challenge is not necessarily the technology, or the organization -- it's bringing them all together. And that's only going to be done through these partnerships. At Microsoft we do have a lot of identity solutions in the enterprise, we also have a huge footprint across the consumer. In the enterprise space, we do think Blockchain technology is a part of the solution. Will it be the solution? We don't know. We think there's some good ideas there we can use to evolve this partnership, that's the main reason we are here. It's to try to help raise the tide, lift all boats including those in underdeveloped markets, the unbanked and the underserved."

In his post back in November, Gray indicated why he is a proponent of Blockchain. "The Blockchain is a time-stamped, non-repudiable database that contains the entire logged history of the system," he noted. "Each transaction processor on the system maintains their own local copy of this database and the consensus formation algorithms enable every copy to stay in sync."

At last month's Envision conference, Microsoft announced it is working with R3, a consortium of 40 large financial services companies, to test Blockchain technologies.

Is Blockchain the Solution?
Skeptics warn not to look at Blockchain as the entire answer to this identity management dilemma. David Crocker, who was at the forefront of pushing forward early Internet working protocols, was at the ID2020 event. Asked if he believes Blockchain is the future of identity management, Crocker said: "It absolutely is not. Now that I said that, I didn't say it's useless. I said it's not the solution. There is no solution. There are components of technology that are useful, and Blockchain is clearly one of them. But you don't build a system out of a component of technology -- you build them out of a bunch of components. And more importantly you build it out of a system design, and that system design has to look at the usage scenarios, and at large-scale system design."

Clearly it will be no easy task for the U.N. and the ID2020 organization to pull off their goals. Experts agreed there are many technical, economic and political issues that could pose barriers. But given the societal impact it could have, many believe the stakeholders will find a solution. TechVision's Scott David, who is also a former partner with K&L Gates (formerly Preston, Gates & Ellis), said in an interview that the event helped set the stage to move forward.

"What we had here was this dynamic stable state; you have governments and companies with creative tensions between them, and that's where the answers are," David said. "There isn't an answer either/or. There's not going to be some new substitutes for existing institutions, it will be a synthesis of these things. That's why I said it's very satisfactory. This was like a first date in this context of those organizations."

ID2020 Moving Forward
David was encouraged by the support from Microsoft, Cisco and others. "As corporations have gotten more powerful, and countries have gotten not diminished power but there are new powers in town, it's heartening to see corporations finding ways to get involved in initiatives like this," he said. "We don't know which one is going to be a winner, but at least there's energy -- that kind of funding and resource and attention that's going to be important."

One thing everyone at ID2020 agreed upon is that there's a common desire to move forward, but no one knows what the political, organizational and technical frameworks will look like. Still, the wheels are in motion for this to be hashed out over the coming years. If such a framework evolves, it will have huge benefits for commerce but, more importantly, it will give 1.5 billion people throughout the world rights they never had -- and for millions it will mean rescue from inhumane exploitation or, sadly, from a life of slavery with no identity to ever save them. If ID2020 can't bring everyone together, it's unlikely anything else will.

Posted by Jeffrey Schwartz on 05/25/2016 at 1:18 PM0 comments


Google and Microsoft Advance Efforts To Deliver Bots

Google last week put a new horse in the race of "bots," the technology that aims to bring virtual assistants to every computer, tablet, phone and embedded device. The search giant demonstrated its new virtual assistant at its annual I/O developer conference at its Mountain View, Calif. headquarters.

It's hardly a surprise that Google wants to offer bot technology, given the massive amount of data it gathers from individual searches and the company's well-known machine learning technology. The new Google Assistant will power Google Home, a hardware device designed to respond to voice commands. Conceptually, Google Home is similar to Amazon's Echo. I have an Echo, which does some interesting things such as provide weather forecasts and, of course, lets me order merchandise from the online retailer. But at this point, it doesn't take much to ask it something it can't answer, although presumably that'll change over time.

The same is true for Microsoft's Cortana and Apple's Siri, though I haven't put any of these through the paces yet. For its part, Microsoft made clear it will advance Cortana, having released the Cortana Analytics Suite for developers and the Bot Framework, which it announced at the recent Build 2016 conference in San Francisco. The company last month subsequently released the APIs for the Bot Framework and extended support for its Skype bots to Macs. Since Build a few months ago, Microsoft has emphasized its plans to invest heavily in bots, and now it looks like we'll see them extended to Bing.

A job posting, discovered and reported by Mary Jo Foley on Friday in her All About Microsoft blog, revealed that Microsoft is looking for experts to develop its "Bing Concierge" bot. The company, which didn't comment when she inquired, curiously removed the job post after Foley reported on it. But according to the text of the job posting:

"In Bing Concierge Bot we are building a highly intelligent productivity agent that communicates with the user over a conversation platform, such as Skype, Messenger, SMS, WhatsApp, Telegram, etc. The agent does what a human assistant would do: it runs errands on behalf of the user, by automatically completing tasks for the user. The users talk to the agent in natural language, and the agent responds in natural language to collect all the information; once ready, it automatically performs the task for the user by connecting to service providers. For example, the user might ask 'make me a reservation at an Italian place tonight', and the agent will respond with 'for how many people?'; after several such back-and-forth turns it will confirm and book the restaurant that the user picked."

Foley also noted that Microsoft last week announced that its Bot Framework now connects with Kik Interactive's bots.
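
The back-and-forth the job posting describes is essentially a slot-filling loop: the agent keeps asking for whichever required detail is still missing, then hands the completed request to a service provider. The Python sketch below is purely illustrative -- the slot names, the stubbed-out "understanding" step and the booking function are hypothetical stand-ins, not Microsoft's implementation.

```python
# Toy slot-filling agent: prompt until every required detail is known, then
# hand the completed request to a (hypothetical) booking service.
REQUIRED_SLOTS = {
    "cuisine": "What kind of food would you like?",
    "party_size": "For how many people?",
    "time": "What time tonight?",
}

def naive_extract(text):
    # Stand-in for natural-language understanding; a real agent would use a
    # trained language model rather than a keyword check.
    return {"cuisine": "Italian"} if "italian" in text.lower() else {}

def fake_booking(slots):
    return f"{slots['cuisine']} table for {slots['party_size']} at {slots['time']}"

def run_conversation(initial_utterance, extract=naive_extract, book=fake_booking):
    slots = extract(initial_utterance)
    for name, prompt in REQUIRED_SLOTS.items():
        while name not in slots:
            answer = input(prompt + " ")
            slots.update(extract(answer) or {name: answer.strip()})
    print("Booked:", book(slots))  # after several turns, act on the user's behalf

if __name__ == "__main__":
    run_conversation("make me a reservation at an Italian place tonight")
```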

While Microsoft continues its storied push to increase Bing use, and is hoping Cortana -- both in Windows and in other platforms -- will be the catalyst, Google used its Google I/O conference to demonstrate significant progress in its own efforts.

As John K. Waters reported on Redmond sister site Application Development Trends, Google is claiming that 20 percent of all its queries are spoken. Sundar Pichai, presiding over Google I/O for the first time as the company's CEO, described his vision to create a more ubiquitous, conversational and "assistive" way to interact with technology. "This is a pivotal moment in where we are with the company," Pichai told attendees. "It's not just enough to give people links. We need to help them get things done in the real world."

In the days preceding Google I/O, the company also announced that it is making the developer tool kit for its SyntaxNet artificial intelligence, which Google calls "the world's most accurate parser," free for developers to modify. It came just days after Amazon open-sourced its Deep Scalable Sparse Tensor Network, which is available on GitHub.

The battle for creating the front end of AI-based personal assistance is still in its early stages and it's looking like no one provider will have the only bot people use. And as Microsoft, Google, Amazon and Apple advance their efforts, don't forget about Facebook, which has an estimated 1 billion users at the front end of its social network.

What's your favorite bot, or virtual assistant?

Posted by Jeffrey Schwartz on 05/23/2016 at 12:32 PM0 comments


New Tools Extend Migration to SharePoint 2016 and Office 365

OneDrive for Business is evolving as the common denominator for all things SharePoint and Office 365. But as it becomes a common repository for data, migrating files and protecting them can be cumbersome. Companies such as AvePoint, EMC, Harmon.ie and Metalogix say they can make it easier.

In concert with this month's release of SharePoint Server 2016 and Microsoft's new roadmap for the on-premises version of the collaboration platform and for Office 365, these companies are among those offering new tools that build upon what Microsoft provides.

Metalogix unveiled its new Essentials for Drives: File Shares to OneDrive Freemium. As the name suggests, it's a free edition, but one that can accommodate a reasonable number of users and amount of data before the meter starts running.

Organizations can now migrate up to 500 users or 1TB of data to OneDrive for Business. In addition to files and folders, customers can move versions, metadata and permissions from file systems, file shares and network-attached storage (NAS), the company said. The tool, based on the Essentials software offered by MetaVis, which Metalogix acquired last year, can accurately enhance and assign metadata in OneDrive for Business, flatten the file structure and automatically convert folder names into metadata. It also provides detailed logs to verify migrations including failed item reprocessing. Organizations exceeding the freemium parameters can purchase basic, standard and enterprise subscriptions priced respectively at $9, $12 and $15 per user per month.

Harmon.ie, which offers access to SharePoint content in Outlook, has launched a consolidated version of its tool. The namesake offering lets users upload, classify and share e-mails and documents to SharePoint and Office 365 from any e-mail client, including the Windows version of Outlook, Outlook on a Mac, the mobile app iterations of Outlook and Web browsers. The company also added its new App for Outlook Enterprise Edition, which lets users automatically add metadata when uploading content to SharePoint from the Macintosh, mobile app and browser clients. Office 365 Group files can now be accessed from the harmon.ie e-mail sidebar, and users can now include e-mails as records in SharePoint for records management.

While Microsoft offers data protection of Office 365 e-mails, calendars and files, data inadvertently or maliciously deleted does not fall under the company's purview -- and there are other limitations as well. EMC offers a data protection and restoration offering that it claims lets users intuitively restore lost data from a variety of SaaS offerings including Google Apps, Salesforce.com and now Microsoft's Office 365. The new offering, which costs $48 per user per year, comes under its Spanning brand. Spanning for Office 365 lets customers automate their backups either daily or on a more granular basis. It's designed to allow the end user to restore deleted data and covers all of a user's devices.

A new release of DocAve, migration and compliance software from AvePoint, adds support for SharePoint Server 2016. DocAve 6 SP 7 also lets customers move data from SharePoint Server 2016 and earlier versions (as well as other vendors' collaboration platforms) to Office 365's OneDrive for Business. AvePoint claims it can transfer SharePoint, OpenText LiveLink and file shares five times faster than its earlier version.

SharePoint customers who use Nintex Workflow can use the updated DocAve release to migrate, replicate and transform their on-premises workflows to Office 365. AvePoint claims DocAve can do so while keeping all workflow definitions intact. The new release also supports deployment and management of the new SharePoint Server 2016 release on-premises or in third-party cloud-hosted environments. It also provides enforcement of corporate and regulatory records management policies and requirements.

Posted by Jeffrey Schwartz on 05/20/2016 at 12:29 PM0 comments


Microsoft and SAP Integration Pact Is Perhaps Their Broadest Yet

Microsoft and SAP have inked a broad partnership that integrates some key offerings including Azure and Office 365 with the SAP HANA data platform. The pact, which includes ties with other products from both companies, was the big headline at the 28th Annual SAP Sapphire NOW user conference, which kicked off Tuesday in Orlando, Fla.

The two companies have aligned in the past, but today's announcement promises to benefit many organizations that use both companies' respective offerings, while boosting SAP's efforts to broaden its reach into midsize organizations with the new SAP Business Suite 4 HANA offering. SAP had decided that, rather than support third-party databases underneath the various components of its suite -- targeted at transaction-oriented, business-critical functions -- it would tie the components only to its own HANA in-memory database.

The two companies have agreed to certify HANA and the suite to run dev, test and production workloads in the Microsoft Azure public cloud. On the software-as-a-service side of things, the two companies are integrating SAP's Concur, Fieldglass, SuccessFactors and Ariba offerings with Office 365. The Office 365 integration will include document sharing, calendar, communications and other collaboration and data sharing functions offered by Microsoft's service with SAP's SaaS apps.

Microsoft CEO Satya Nadella appeared on stage during the keynote session yesterday at Sapphire NOW, led by SAP CEO Bill McDermott. Addressing McDermott, Nadella said: "This partnership is perhaps one of the broadest things that we've done. We have a long heritage of working together but when I look at the footprint of what you're accomplishing today to benefit customers, it is breathtaking." In addition to the Office 365 integration with SAP's SaaS offerings, Nadella noted Microsoft will offer security and management of the HANA portfolio.

"The applications you build can be managed with EMS and Intune, so you can set security principles for data loss protection," he said. "Now that seamlessly flows into the apps that you build. This combination of integration is really going to accelerate the growth that our customers really seek by bringing together things they're bringing with us."

Pund-IT Principal Analyst Charles King said both companies have thousands of customers in common, which all stand to benefit from the new integration capabilities announced at Sapphire NOW. "Both companies are leading players in their respective spheres -- Microsoft's Azure is one of the industry's largest business clouds while SAP's HANA is the fastest growing in-memory database solution," King said. "Business customers will applaud bringing those platforms closer together but the move should also cheer SAP and Microsoft's technology partners, most of which are delivering or building Azure- and HANA-related service offerings."

While the extended pact with Microsoft was among the most prominent, a number of other key SAP partners announced new capabilities this week. Among them were Amazon Web Services and Dell. AWS launched a new X1 Instance Type designed for workloads generated by in-memory databases such as HANA. The new x1.32xlarge instance size is powered by an Intel Haswell Xeon E7 8880 v3, available with up to 64 cores and 128 virtual CPUs. AWS today posted the specs of its high-end compute node.

Dell announced appliances for its new "Dell Validated Systems for SAP HANA Edge," which will be released next quarter. The new PowerEdge servers running the SAP HANA Edge edition is targeted at midsize organizations that are notably smaller than the traditional SAP customer. "We want to make it not only affordable but enable smaller SAP implementations and democratize it for SMBs," said Jim Ganthier, VP and general manager of Dell Engineered Solutions, HPC and Cloud.

For those looking to add database replication to the mix, Dell said a new version of its SharePlex data replication offering will support SAP HANA, Teradata and EnterpriseDB Postgres. Until now, it only offered Oracle-to-Oracle replication. Dell also launched the Dell Automation and Cloud for SAP Software bundle, which includes support for Cloudera and Dell's new IoT portfolio.

Posted by Jeffrey Schwartz on 05/18/2016 at 1:55 PM0 comments


Latest Windows 10 Insider Preview Build Sends Edge Browser Extensions to the Store

Microsoft has released a number of test builds for the new Windows 10 Anniversary Update since announcing its pending arrival this summer. The latest Windows 10 Insider Preview build, released last week, gives testers a look at a number of enhancements and bug fixes. But the primary emphasis is on the management of extensions to the Edge Web browser. In addition to introducing several new extensions, the build requires users to access all of them from the Windows Store rather than finding them in folders.

The Windows Insider Preview Build 14342, pushed out to Fast Ring testers Wednesday, will also remove existing extensions from Edge, and users will have to reinstall and/or add new ones after downloading them from the Windows Store, said Corporate VP Gabe Aul, in a blog post announcing the latest release.

The new Edge extensions Aul highlighted were AdBlock, Adblock Plus, Pin It Button, Mouse Gestures, Reddit Enhancement Suite, Microsoft Translator and OneNote Web Clipper. Aul advised against installing both AdBlock and AdBlock Plus as they cause conflicts with one another. Following the initial release, Aul posted an updated note to the build that suggests uninstalling any extensions you no longer want rather than turning them off.

The new build also provides real-time Web notification with Edge. For instance, if you receive a message via Skype for Web while using an Xbox app, a notification will appear just as it does in an app. Edge users also now have swipe navigation, allowing users to swipe from anywhere on a page to return to a prior page.

Besides improvements to Edge, Microsoft outlined improvements to the OS itself including:

  • Bash on Ubuntu: Symlinks within the Windows Subsystem for Linux now work on the mounted Windows directories.
  • Updated Windows Ink Workspace icon: Microsoft claims the taskbar icon looks better and more consistent with others in the notification area. 
  • Skype UWP Preview Update: Support for multiple Skype accounts.
  • Project Rome: Microsoft announced (and demonstrated) at Build 2016 plans to support redirecting of Web sites to open with an app. A new page at "Settings > System > Apps" is now available, though Aul noted no apps yet support that capability.

If you're on the Fast Ring and have looked at the latest builds, feel free to share your observations.

Posted by Jeffrey Schwartz on 05/16/2016 at 10:42 AM0 comments


RHEL Upgrade Includes Improved Active Directory Integration

It appears the partnership Microsoft and Red Hat formed late last year is paying dividends. The release this week of Red Hat Enterprise Linux 6.8, which includes a variety of systems management and security improvements, offers cleaner ties with Active Directory.

Aiming to improve client-side performance and management, the company has added new capabilities to RHEL 6.8's Identity Management client code in System Security Services Daemon (SSSD). The SSSD in RHEL is Red Hat's integration interface between its own Identity Management (IdM) and various identity and authentication providers including Active Directory and LDAP. SSSD also caches user credentials so they're available for authentication if an identity provider goes offline.

A new "cached authentication lookup" capability on the client reduces extraneous communications of credentials with Active Directory servers. Also added was support for the open source adcli, a tool for managing Active Directory domains. Red Hat said SSSD also now supports smart card authentication for login to systems and for functions such as sudo, used for privileged administrative access.
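
SSSD itself is written in C and driven by sssd.conf, but the pattern behind these features -- authenticate against the directory when it is reachable, fall back to a locally cached credential when it is not, and refresh the cache on each successful online login -- can be sketched in a few lines of illustrative Python. Everything below is a hypothetical stand-in for explanation, not Red Hat's code.

```python
import hashlib, os

class CachedAuthenticator:
    """Cache-then-provider lookup, loosely modeled on what an offline-capable
    client such as SSSD must do. Illustrative only, not production code."""

    def __init__(self, provider_authenticate):
        self.provider_authenticate = provider_authenticate  # e.g., talks to AD or LDAP
        self.cache = {}                                      # user -> (salt, hash)

    def _hash(self, password, salt):
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    def authenticate(self, user, password):
        try:
            ok = self.provider_authenticate(user, password)  # online path
        except ConnectionError:
            ok = None                                        # provider unreachable
        if ok is True:
            salt = os.urandom(16)
            self.cache[user] = (salt, self._hash(password, salt))  # refresh the cache
            return True
        if ok is False:
            return False
        # Offline: fall back to the locally cached credential, if one exists.
        entry = self.cache.get(user)
        return bool(entry) and self._hash(password, entry[0]) == entry[1]
```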

By utilizing the SSSD support, RHEL 6.8 becomes a viable alternative to direct LDAP configuration for applications for those looking to deliver enterprise single sign-on using Apache modules, because it enables Kerberos-based authentication, meaning users who authenticate via Active Directory don't have to enter a user ID and password again, Red Hat engineering director Dmitri Pal said in an April 26 blog post. It also offers improved security, failover, load balancing and awareness of domains and sites versus LDAP, Pal noted.

"Overall, external authentication based on Apache modules adds a lot of flexibility in authentication methods, supports complex domain infrastructures, has better security properties, and raises the resiliency of the application without requiring extra equipment," he stated. However, it does require more complex configuration, he said, "but this complexity is well worth the (small amount of) extra effort."

Posted by Jeffrey Schwartz on 05/13/2016 at 9:06 AM0 comments


SharePoint On-Premises Lives On

Long live SharePoint Server.

Given Microsoft's relentless focus on Office 365 and the online version of SharePoint in recent years, those vested in the future of the traditional on-premises offering often wondered if the newest server release was set to be the last major upgrade. Those rumors persisted for years. Microsoft last week put that issue to rest, at least for now, confirming that SharePoint Server 2016 won't be the last on-premises version.

At the official launch of SharePoint Server 2016, customers learned it won't be the swan song for the on-premises collaboration offering. Not only that, Microsoft revealed plans to offer new capabilities designed first for Office 365 that will come to the server edition. Even more encouraging, SharePoint Server customers won't have to wait the typical three years for a new version: Feature Packs that bring forth many capabilities built for Office 365 SharePoint Online will arrive next year.

That promise came from Jeff Teper, regarded as the father of SharePoint, who was brought back to the team last year as corporate VP for SharePoint and OneDrive for Business. Teper presided over last week's "Future of SharePoint" event in San Francisco, where observers were expecting to learn that the future was entirely in the cloud, albeit with support for hybrid environments championed as the basis of the new SharePoint Server 2016 release.

Make no mistake -- Office 365 is where all of the new collaboration features will first appear moving forward, and Microsoft wants customers to move as much of their SharePoint Server collaboration as possible to the cloud and Office 365. But Microsoft is also well aware that many organizations can't move all of their data to the cloud, and it appears SharePoint is following the same path as the strategy the company is embarking on with Azure Stack, which allows organizations to build private and hybrid clouds in their datacenters.

Teper relished sharing the news that the server edition will live on. "We are committed to on-premises deployments and new on-premises innovation is a big deal," Teper said.

Teper noted that there are some things Microsoft can't bring to SharePoint Server, such as some of the machine learning capabilities in the Microsoft Graph. But other capabilities will make the trip: a new, more modern user experience; unified access and intelligent discovery of information in SharePoint and OneDrive; an "intelligent" intranet that spans all device types; app models with support for an improved sync architecture and interfaces; and the new SharePoint Framework.

The new SharePoint Framework, which is built with integration hooks into the Microsoft Graph and support for open source tooling that delivers client-side JavaScript rendering, will include the Sites and Files APIs. These features will roll out incrementally over the coming months and quarters. Teper emphasized as a milestone that many of these capabilities will show up in a new on-premises Feature Pack, which will build on SharePoint Server 2016 after this month's release. "For the first time in history in SharePoint, we are accelerating the pace of on-premises innovation," he said.

SharePoint MVP Ruven Gotz, director of digital workplace and innovation at Avanade, said that his clients will welcome the news of the planned Feature Packs (others in the SharePoint community welcomed the strategy as well). "Microsoft is making sure their clients know they haven't forgotten about on-premises and that it is also a first-class destination for features and updates," Gotz said. "I think we're going to see hybrid for years."

Posted by Jeffrey Schwartz on 05/11/2016 at 11:35 AM0 comments


Microsoft Releases PowerShell Module for Azure Machine Learning

Microsoft last week released the preview of the PowerShell Module for its Azure Machine Learning (ML) service. The Azure ML PowerShell cmdlets library, available on GitHub, lets users interact with Azure ML workspaces, experiments, Web services and endpoints.

The .NET-based PowerShell DLL module will let users fully manage their Azure ML workspaces, according to a blog post by Microsoft Principal Program Manager Hai Ning. The module includes the entire source code, which Ning said has a cleanly separated C# API layer.

"This means you can also reference this DLL from your own .NET project and operate Azure ML through .NET code," Ning stated. "In addition, you can also find the underlying REST APIs the DLL depends on and leverage the REST APIs directly from your favorite client as well."

The module will let IT pros and developers provision new ML workspaces through a management certificate, export and import JSON files that represent an experiment graph, and recreate new endpoints from published Web services, Ning added.
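
As a rough illustration of what calling Azure ML over plain HTTPS from "your favorite client" can look like, the Python sketch below posts a scoring request to a published web service's request/response endpoint with a bearer API key. The URL, key and column names are placeholders and the payload shape varies by service, so treat this as a hedged example of the JSON-over-HTTPS pattern rather than a reference for the management APIs the module wraps.

```python
import json
import urllib.request

# Placeholder endpoint and key; substitute the values shown on your own
# published Azure ML web service's API help page.
URL = ("https://example.services.azureml.net/workspaces/WORKSPACE_ID/"
       "services/SERVICE_ID/execute?api-version=2.0&details=true")
API_KEY = "YOUR_API_KEY"

def score(rows, column_names):
    # Request/response scoring: a JSON "Inputs" payload authenticated with a bearer key.
    body = json.dumps({
        "Inputs": {"input1": {"ColumnNames": column_names, "Values": rows}},
        "GlobalParameters": {},
    }).encode("utf-8")
    request = urllib.request.Request(URL, data=body, headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + API_KEY,
    })
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

if __name__ == "__main__":
    # Made-up feature columns purely for illustration.
    print(score([["2016-05-01", 72, 0.30]], ["date", "temperature", "humidity"]))
```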

In a separate GitHub post, Ning described how to build a generic machine learning workflow using the example of a global bicycle rental franchise looking to build a regression model to predict rental demand based on historical data. The post documents how to train the algorithm on multiple datasets with the same feature sets but different feature values, producing models specific to each dataset using the Azure ML retraining API and Azure ML PowerShell automation.

Microsoft intends to extend the Azure ML PowerShell library, according to Ning. The Azure ML workspace and Web service are set to become managed resources with cmdlets pegged to support new ARM-based resources over time.

Posted by Jeffrey Schwartz on 05/09/2016 at 12:51 PM0 comments


Yammer Conspicuously Absent at Future of SharePoint Event

The SharePoint and Office 365 community has plenty to chew on now that Microsoft has articulated the future of its collaboration software and service. It's no surprise that Microsoft wants as much collaboration as possible to take place in the cloud via Office 365 and SharePoint Online, but the company reassured the SharePoint Server community that this wasn't the last on-premises version. Clearly absent from any emphasis was Yammer, the enterprise social networking service Microsoft acquired four years ago, which remains a silo in the Office 365 service.

Yammer was barely mentioned during the two-hour event, which took place in San Francisco. Watching the event via webcast at Microsoft's Technology Center in New York, alongside a gathering of more than 200 SharePoint customers, partners and ISVs, it was pretty clear Yammer was left out of the core discussion about the future of SharePoint and Office 365.

"When they bought Yammer they disrupted the enterprise social industry," said Yaacov Cohen, CEO and cofounder of Harmon.ie, a software provider that offers tools for SharePoint and Office 365 integration, which organized the New York conference. "Jeff Teper is driving the strategy and Yammer is a silo service." Teper, known as the father of SharePoint, last year was asked to lead the group again and is now corporate vice president for the OneDrive and SharePoint teams.

Indeed, anyone watching the presentation by Teper and his team for anything new about Yammer had to have felt left out. Yammer was mentioned in passing perhaps two or three times at most. Teper did mention it once when describing how Microsoft is incorporating Groups into all functions of SharePoint and Office 365.

"The core idea of groups is that we have a list of people that are on a team and they can use all the resources in Office 365," Teper said in the early part of the presentation. "They can use Outlook, SharePoint and they can use Yammer, and they can use Skype for Business. A group is simply a list of people that's designed to reduce the friction of collaborations across modality and SharePoint is the place where that group goes to work. We're going to give every group in Office 365 a new, fast and mobile SharePoint site."

A spokeswoman for Microsoft said in an e-mail that Yammer remains on the Office 365 roadmap, and pointed to last week's feature upgrade, which adds support for external groups. "Yammer continues to play a key role in Office 365. Earlier this year, we announced the completion of our foundational work to bring Yammer fully into Office 365. And later this year, we'll complete the integration of Yammer with the Office 365 Groups service."

Nevertheless, observers were surprised there wasn't at least somewhat more emphasis on Yammer. "I heard them mention it once," said Julie Walleshauser, a solutions engineer at Metalogix. "It seems like more social features are being put into SharePoint but more through Office 365 and the Graph API."

Pat Esposito, a consultant with Blue Dog Technology, was also disappointed more wasn't said about how Yammer fits into the future of SharePoint and Office 365. Many of his clients use Yammer, he said, but he doesn't believe Yammer's omission from this week's discussion means Microsoft won't live up to its plan of integrating it with Office 365. "They still have a lot of work to do but I don't think that means it's going away," Esposito said. "I think they still have two years of work to do before it's fully integrated. If you need a site to go to or a group, where all you want to do is have a threaded conversation, Yammer is the place to start. But I have a feeling we may see the brand diminished. It's hard to tell because they're not being clear about it."

Ruven Gotz, an MVP and director of digital workplace and innovation at Avanade, said his company is a user of Yammer and many of the consultancy's clients use it. Gotz was also hoping to hear more about the plans for Yammer but said not to read too much into it. "Groups is definitely the future," Gotz said. "The question is whether and how Yammer integrates into groups. That's an important aspect."

Questions about the future of Yammer began to surface earlier this year, when Microsoft's Customer Success Managers (CSMs) were let go. Many on the original Yammer team have also exited, though that's not unusual when a startup or small company is acquired. Experts say the fact that Yammer is based on Java and other open source code could be impacting any integration effort.

Esposito noted that the radio silence may not portend the worst for Yammer, or at least for the notion of delivering threaded discussions to SharePoint and Office 365. Last year at this time at Microsoft's Ignite conference, many in the SharePoint community were alarmed by the lack of emphasis on SharePoint. Said Esposito: "No one said anything about the future of SharePoint there."

Posted by Jeffrey Schwartz on 05/06/2016 at 1:11 PM0 comments


Google and Fiat-Chrysler To Build 100 Driverless Minivans

Google and Fiat-Chrysler in the coming months will jointly develop and manufacture 100 driverless cars. The search giant's driverless car technology will be added to 100 Chrysler Pacifica minivans for the 2017 model year. However, if you were looking to buy one, Chrysler parent FCA Group said they're not for sale -- the companies are building the vehicles for test purposes only.

The two companies' engineering teams will work together at a facility in southeastern Michigan. FCA Group said it will initially design and engineer the 100 vehicles to accommodate Google's driverless car technology. Google will integrate the various sensors and CPUs that will let the cars drive autonomously.

While Google has engaged in high-profile tests of driverless cars it has built on its own, it's the first time the company has collaborated with an outside automaker. "FCA has a nimble and experienced engineering team and the Chrysler Pacifica Hybrid minivan is well-suited for Google's self-driving technology," said John Krafcik, CEO of the Google Self-Driving Car Project, in a statement included in FCA's announcement. "The opportunity to work closely with FCA engineers will accelerate our efforts to develop a fully self-driving car that will make our roads safer and bring everyday destinations within reach for those who cannot drive."

Google isn't the only major IT provider working with automakers to develop driverless cars. Volvo tapped Microsoft for its own driverless car effort back in November, though the companies didn't commit to rolling out vehicles in any specific time frame. The Volvo Concept 26 apparently will tap Microsoft's machine learning technology. Microsoft is also looking at new ways of applying machine learning to embedded solutions in various IoT initiatives. The two companies are using Microsoft's HoloLens mixed reality 3D headsets together as well. IBM has also demonstrated driverless car technology, most recently at the Detroit Auto Show. Apple is believed to have its own aspirations to offer a driverless car, with analysts consistently suggesting the company buy Tesla, an idea most observers have scoffed at. Right now, it looks like Google is in the fast lane.

Posted by Jeffrey Schwartz on 05/04/2016 at 11:15 AM0 comments


Microsoft Takes Money Off the Table for Republican Convention

Microsoft said it is departing from its past practice of contributing money to fund the Republican National Convention, which takes place in Cleveland this summer. However, it will offer both parties the use of Azure, Office 365 and Surface devices at their respective nominating events. The company, which said it will also sponsor host committee activities taking place during the Democratic National Convention in Philadelphia, made its decision back in the fall and revealed the plan on Friday.

In a blog post explaining Microsoft's plans for the political conventions, Fred Humphries, Microsoft's corporate VP for U.S. government affairs, noted the company has provided support for both parties' national conventions since 2000 with the goal of helping supply technology that can give "voters' access to information to make informed decisions" and make sure "election results are reported accurately, efficiently and securely."

Four years ago, Microsoft provided the Republicans $1.5 million (according to published reports) when they gathered in Tampa to nominate Mitt Romney. While Microsoft notified the Republican National Committee of its change last fall, the revelation came a day after Google told Politico it would be providing the official livestream of the Republican Convention. Emphasizing that their intent is to be bipartisan, while looking to give voters access to information coming out of both conventions, Google and Microsoft are finding themselves in an unfamiliar position with this year's contentious campaigns. Activist opponents of Republican frontrunner Donald Trump, including ColorOfChange, Free Press Action Fund and others, orchestrated an elaborate demonstration outside Google's Mountain View, Calif. headquarters last week, where they collected 500,000 signatures asking that the company withdraw any support of the convention, according to the Politico report.

Humphries noted that Microsoft has received numerous inquiries about its plans for the convention for some time and said the company started considering how it would support both gatherings a year ago. "Based on our conversations with the Republican National Convention's host committee and committee on arrangements, we decided last fall to provide a variety of Microsoft technology products and services instead of making a cash donation. For the Democratic National Convention, we're providing access to similar Microsoft technology as well as some sponsorship of host committee activities," he said. "We believe that technology from Microsoft and other companies provides an important tool that helps the democratic process work better."

Posted by Jeffrey Schwartz on 05/02/2016 at 12:14 PM0 comments


Apple Endures a Week of Bad News

It was clearly a week Apple CEO Tim Cook, anyone on his team and those with an interest in the company's fortunes would like to forget. Though the first quarter wasn't kind to other key IT players, notably Google, IBM and Microsoft, Apple was beset with fallout from a disappointing quarter, continued grief from the FBI and even an employee suicide on its main campus in Cupertino, Calif.

On Tuesday, Apple reported revenues and profits declined for the first time in 13 years -- while iPhone sales dropped for the first time ever. Making the news worse, Apple fell short of analysts' consensus expectations, and reported falling margins and a weak outlook. In the most recent quarter ended March 31, Apple reported revenues of $50.6 billion, 13% lower year-over-year, while net income of $10.5 billion was down 23% and earnings per share of $1.90 fell below expectations of $2 per share. Apple said it shipped 4 million Macs, a 12% decline, and iPad shipments of 10.3 million fell 10%.

Among some bright spots were an increase in services revenue ($6 billion), including a rise in subscribers to its music streaming service to 13 million from 11 million in February. Although sales of the Apple Watch after its first year are reported to have totaled 12 million units -- double the number of iPhones that shipped in that device's first year -- critics have largely dubbed the Watch a flop, though I'd say it's too early to draw any conclusions.

Nevertheless, after its shares started surging in February, those gains are gone, with the stock down 10 percent over the past three days. The news and overall concerns about the economy, weakness in the dollar and disappointing sales in China have led analysts to lower their price targets for Apple's shares and a number of prominent growth funds to sell all of their Apple shares. Those sellers include activist investor Carl Icahn, who began investing heavily in Apple three years ago. Yesterday Icahn told CNBC that his funds no longer hold Apple stock. While he said he believes the stock is cheap, he was concerned about growth in China. "China could be a great shadow for it," Icahn said on yesterday's broadcast, noting he called Cook to tell him of the stock dump. "I told him I think it's a great company. Tim did a great job. It's not that a tsunami hit it but I wonder if a tsunami could hit it."

If Apple's bad week on Wall Street wasn't bad enough, the FBI this week confirmed that it will not reveal how it was able to crack the encryption via a security flaw on the iPhone 5C, used by the shooter in the San Bernardino, Calif. terrorist attack.

Apple was also in the spotlight yesterday following the suicide of a 25-year old employee who reportedly shot himself in the head in a conference room in one of its main buildings. It wasn't immediately clear what led the employee to kill himself.

Posted by Jeffrey Schwartz on 04/29/2016 at 11:56 AM0 comments


A Look at Office 365 Native E-Mail Auditing Alternatives

It's no secret that the biggest obstacles for some organizations considering moving their e-mail from on-premises Exchange Server to Microsoft's Office 365 are security and the ability to meet compliance requirements. Microsoft's answer is the auditing features that come with Office 365.

Among the auditing capabilities offered with Exchange Online are the abilities for administrators to enable logging for user mailboxes using remote PowerShell and to view records from the Exchange admin audit log and from user mailbox audit logs. IT can also run admin role group audit reports to see which users were added or removed from role groups, according to a Microsoft TechNet post, along with searching the Office 365 Security and Compliance Center.

Many shops may require more than the auditing capabilities included in the Office 365 service. At least that's what several suppliers of Exchange auditing tools contend, including Cogmotive, Knowledge Vault and Netwrix, among others. The latter supplier last week launched its new Netwrix Auditor for Office 365, which the company said protects Exchange Online from security threats by providing visibility into the entire e-mail environment.

The tool detects and reports changes made to Exchange objects, configurations and permissions including non-owner mailbox access, the company said. Netwrix CEO and Cofounder Michael Fimin said in an e-mailed response that Microsoft's native Office 365 auditing feature has limited capabilities.

"Native tools lack reporting capabilities, search and filtering options," Fimin said. "They also lack the ability to consolidate changes and non-owner access into a single view. Retrieving data using native Office 365 auditing requires tedious work."

Netwrix Auditor's predefined audit reports and overview dashboards aggregate data from the entire Exchange Online organization and analyze what's happening in the hosted e-mail server. "The reports and dashboards have flexible filtering, sorting and exporting options, and can be scheduled via report subscription option," Fimin said. Microsoft's native tools don't provide options for archiving auditing data, he added.

For its part, Cogmotive said its Compliance & Audit tool collects data of all employees' Office 365 activities using the Office 365 Management Activity API, including all files modified, login attempts, password changes, mailbox access and other actions.
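
For those curious what consuming that audit feed looks like in code, here is a hedged Python sketch of the general flow against the Office 365 Management Activity API: obtain an Azure AD token, list the available content blobs for a content type such as Audit.Exchange, then fetch each blob's records. The tenant ID and app credentials are placeholders, the sketch assumes a subscription for the content type has already been started, and the endpoint details should be verified against Microsoft's current documentation.

```python
import json
import urllib.parse
import urllib.request

# Placeholder tenant and Azure AD app registration with permission to read the activity feed.
TENANT_ID = "YOUR_TENANT_ID"
CLIENT_ID = "YOUR_CLIENT_ID"
CLIENT_SECRET = "YOUR_CLIENT_SECRET"

def get_token():
    # Client-credentials flow scoped to the manage.office.com resource.
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": "https://manage.office.com",
    }).encode("utf-8")
    url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token"
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.loads(resp.read())["access_token"]

def print_exchange_audit_records(token):
    headers = {"Authorization": "Bearer " + token}
    base = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
    # Each listed content blob points at a batch of audit records.
    listing = urllib.request.Request(
        base + "/subscriptions/content?contentType=Audit.Exchange", headers=headers)
    with urllib.request.urlopen(listing) as resp:
        blobs = json.loads(resp.read())
    for blob in blobs:
        with urllib.request.urlopen(
                urllib.request.Request(blob["contentUri"], headers=headers)) as resp:
            for record in json.loads(resp.read()):
                # Fields such as CreationTime, UserId and Operation identify who did what.
                print(record.get("CreationTime"), record.get("UserId"), record.get("Operation"))

if __name__ == "__main__":
    print_exchange_audit_records(get_token())
```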

And KnowledgeVault emphasizes its ability to automate auditing processes across user groups and privileged administrators, while discovering suspicious behavior. The company said its tool also creates a variety of reports that offer details on user logon patterns, non-owner access and activity and analyses of changes in mailbox and user permissions as well as license changes.

Note: An earlier version of this post inadvertently attributed quotes by Netwrix CEO Michael Fimin to the incorrect company official.

Posted by Jeffrey Schwartz on 04/27/2016 at 12:49 PM0 comments


Microsoft Joins Dell's New IoT Partner Effort

Microsoft is among 25 partners to join Dell's effort to create an Internet of Things (IoT) ecosystem, launched last week. Dell, which unveiled its first IoT gateway nearly a year ago, has lined up a variety of players ranging from key IT providers such as SAP and Software AG to those that provide industrial automation wares and services, including INEX Advisors, Kepware, OSIsoft and PTC.

Dell's IoT push comes as a growing number of enterprise and consumer tech providers see a large opportunity to gather machine data from operational technology (OT) infrastructure, including sensors and other machine components, for the purpose of automation and data mining. Spending on commercial IoT infrastructure, software and services -- excluding devices, sensors and automation technology -- is expected to reach $247 billion this year, according to Technology Business Research. The researcher forecasts that, growing at a 20.6 percent CAGR, those expenditures will mushroom to $629 billion by 2021.
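
The forecast is easy to sanity-check: compounding the $247 billion base at 20.6 percent a year for the five years through 2021 lands right around TBR's projected figure.

```python
# Compound $247 billion at a 20.6% CAGR for five years (roughly 2016 through 2021).
base, cagr, years = 247, 0.206, 5
print(round(base * (1 + cagr) ** years))  # ~630 ($ billions), in line with the ~$629B forecast
```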

Attention on IoT has built up in recent weeks as industrial manufacturers gather this week in Germany for Hannover Messe, where Microsoft's CEO gave the opening keynote yesterday. Nadella used the occasion to introduce a number of customers that are using Microsoft's Azure Machine Learning and IoT technologies to automate their manufacturing processes, including Rolls Royce, Jabil and Liebherr Group. In the case of Liebherr, the company has linked up with Microsoft to automate its refrigerators and freezers using Windows 10 IoT (formerly Windows Embedded), Azure and Power BI to automate repair orders when a product isn't working properly.

"I would posit that what's new today is that the very thing that you produce, the very thing that you manufacture, for the first time, is connected with all of the web of activity around it," Nadella told the audience. "That is what's really different. These new digital feedback loops that I refer to as 'systems of intelligence' is the new inflection point that we collectively across the software industry or the digital industry and the manufacturing industry are, in fact, bringing forth."

Nadella added that the bridging of IT and operational technology (OT) will continue to enable a growing number of these "digital transformation" efforts. But as Dell and some of its partners noted at a media gathering earlier this month to discuss the computer giant's new initiative, bridging IT and OT is a complex endeavor.

For its part, the initiative Dell launched began several years ago as a business plan, and the company has since defined IoT as a key area it plans to embrace, said Jason Shepherd, director of strategy of partnerships and solutions. In addition to the 25-plus partners announced last week, Shepherd said Dell is gathering others, ranging from ISVs to system integration partners. Unlike other traditional IT areas, building IoT-enabled applications requires a multidisciplinary approach that brings together operational hardware and back-end IT systems.

Dell's key entry came a year ago when it announced it had created a new division focused on offering IoT gateways, which are effectively small wall-mountable PCs with wireless receivers loaded with an embedded operating system -- either Linux or Windows 10 IoT. The edge gateways typically receive signals from sensors or other fixed operational components, then tag or process that data so it can be passed along to a database cluster or other processing engine where analytics are performed. Naturally, security and analytics are key components of the IT part of the industrial automation equation, and Dell points to its own offerings and those of partners that can perform various forms of predictive analytics.

Dell, with Microsoft and Blue Pillar, is now offering solutions for utilities to help streamline the distribution of electrical power. INEX Advisors is one partner using the recently launched Dell Edge Gateway 5100 in a variety of scenarios, including orchards, vineyards, greenhouses and logistics operations, often in environments subject to extreme temperatures. Christopher Rezendes, INEX Advisors' president, said many of these automation systems are still evolving but Dell offers one of the more robust IoT edge gateways available.

"I think we got them early, so there is still a fair amount of feature sets that need to be stabilized or needs to be released, but we think with the approach they are taking, we're going to get a nice collection and selection of tools and utilities that will be pretty easily integratable in their stack," Rezendes said. "So it's not a finished platform so to speak, because not all the services and not all the features have been enabled in the services stack. But it performs brilliantly."

Most of the pilots he's involved in haven't used Microsoft's new Azure IoT offering yet, and Rezendes suspects cost is a factor. "I'm hearing they're very expensive," he said. Given Dell's close ties with Microsoft in other key areas, it makes sense for the two to tie up in advancing IoT, said Ezra Gottheil, principal analyst for IoT, devices and platforms at Technology Business Research.

"The Microsoft alliance is a good one for both Microsoft and Dell; Azure will be the platform of choice for many customers, especially for the 'start small' approach that Dell favors," Gottheil said. "This is not to imply that Azure won't scale up, but Azure is already present in a lot of environments as a result of Azure Active Directory, so some customers will not choose another cloud/IoT platform for small-scale IoT implementations."

Gottheil also noted Dell has a number of important key IoT components including its "well differentiated" software/services like Statistica and Boomi. "Their gateway is in essence a hardened PC, and will fit some solutions, but not all," Gottheil said. "Cisco's gateways are, naturally, more focused on connectivity and HPE's are server based. Many OT vendors also offer gateway solutions. Most of Dell's other IoT offerings are part of their conventional IT portfolio, as is also true of HPE, but Dell has a head start in working with its embedded and OEM partners to incorporate Dell solutions. On the other hand, companies like IBM and Cisco have a head start relative to Dell in bringing in new customers through an interest in IoT."

Posted by Jeffrey Schwartz on 04/25/2016 at 2:33 PM0 comments


Nadella Emphasizes Office 365 as Catalyst for Growth

The embrace-and-extend philosophy that made Windows the standard for business computing is shifting to Office 365. That doesn't mean Microsoft is moving away from Windows, but as developers shift to modern, cloud-scale applications in what is no longer a one-platform world, the jury is still out on where the Universal Windows Platform will fit in the next era of computing, though it appears likely to have a place.

However this ultimately plays out, CEO Satya Nadella last night attempted to educate Wall Street on what he told developers at last month's Build Conference and at the company's new Envision event in New Orleans a few weeks ago: that the growth of Azure and Office 365 is attributable to widespread support of the Microsoft cloud by ISVs and infrastructure providers. While the company missed consensus analyst forecasts for earnings per share in its fiscal 2016 third quarter, Nadella emphasized Office 365's current growth and its future prospects during last night's quarterly conference call.

Growth of Office 365 was quite remarkable, Nadella said, pointing to 22 million consumers and 70 million commercial customers with active Office 365 subscriptions. Nadella said the commercial segment grew 57% year over year, validating that Microsoft's focus on productivity and business process is helping the company reach new users, expand into new markets and create a growing platform for developers.

Nadella told analysts on the call that the addition of Office 365 Graph APIs and the opening of Skype for Business to developers are contributing to the success of this shift. The number of applications calling on Microsoft's APIs has grown month over month, he said. "It can be as simple as Starbucks enabling somebody to e-mail a cup of coffee or it can be as sophisticated as DocuSign streamlining processes for digital signatures based on the understanding of peoples' availability," Nadella said.

Office 365 and Azure are also helping Microsoft expand into new markets in areas such as security, analytics and voice communications, he said, pointing to expanded capabilities such as e-discovery, the company's new security focus and the growth of Power BI, which the company recently said has 5 million subscribers. This has helped drive 35% quarter-over-quarter growth in monthly active users of E5, its new premium edition of Office 365.

Windows 10 Growth
Despite a slowdown in the PC business, which was a given before the release of the latest quarterly results, Microsoft pointed to the growth of Windows 10, which Nadella said at 270 million users is double the number of Windows 7 users at the same point after that OS's release. Granted, that's skewed by the fact that upgrades to Windows 10 have been free for existing users. But Nadella also pointed to the Surface business, which was up 61%, making it the third time -- and second consecutive quarter -- in which revenues exceeded $1 billion. It's also the first quarter in which Surface sales hit that threshold outside of the holiday shopping season.

Windows 10 is also buoying Microsoft's search efforts. The new OS drove 35% of its search revenues. "Developers have already built over 1,000 apps designed for Cortana," Nadella said. "These new third-party experiences and the 6.3 billion questions people have asked are helping make Cortana smarter and driving search engagement."

Posted on 04/22/2016 at 11:32 AM0 comments


Massive Layoffs at Intel Amid Shift from Core PC Business

Intel yesterday signaled it has abandoned hope of a sustained revival of the traditional PC business that put it on the map, announcing a restructuring that will eliminate 11 percent of its workforce, or 12,000 employees.

In a letter to employees yesterday, Intel CEO Brian Krzanich said the company's primary growth is coming from its datacenter, cloud, memory and Internet of Things businesses. Those businesses provided $2.2 billion in revenue growth last year and accounted for 40 percent of the company's revenue. They also accounted for the bulk of Intel's profit, Krzanich said.

"Our results demonstrate a strategy that's working and a solid foundation for growth," according to his letter. "Our opportunity now is to accelerate our momentum and build on our strengths. But this requires some difficult decisions. It's about driving long-term change to further establish Intel as the leader for the smart, connected world."

Revenue from Intel's client computing group of $7.5 billion for the quarter ended March 31 grew only 2% year over year and dropped 14% sequentially, though the end of the year traditionally has the highest volumes, according to the company's first-quarter earnings report released yesterday. By comparison, the company's Data Center Group posted revenue of $4 billion, up 9% year over year. Intel's Internet of Things Group revenue of $651 million was up 22% year over year.

While Intel's not abandoning the PC business, the restructuring will let the company focus on more realistic revenue and growth moving forward. In addition to its focus on datacenter and IoT, Intel also sees growth in commercial PCs including hybrid tablet PCs (also described as 2-in-1s), connectivity solutions, gaming and home gateways.

"Intel appears to have decided that it can't wait any longer for a hoped-for revitalization of traditional PC markets," said Pund-IT Principal Analyst Charles King, in a blog post. "That doesn't mean that PCs are dead by any means. There will likely be occasional upward spikes in sales related to new features, vendor innovations and businesses upgrading their office environments, but the salad days of the industry, when people camped out overnight to buy new PCs, are receding into the past."

King also noted that Intel is focusing on what has evolved into its core strengths and where consumer and enterprise IT is headed. "It also works strategically by opening new business opportunities and spurring future growth. IoT could be an especially fruitful area since it plays to Intel's data center leadership position and its strengths in mobility."

Posted by Jeffrey Schwartz on 04/20/2016 at 12:48 PM0 comments


Bill Gates Backs Microsoft's Snooping Suit Against Feds

Microsoft co-founder Bill Gates said he's on board with the company's move last week to sue the U.S. Department of Justice, seeking to restrict the number of court orders demanding it turn over e-mails and documents residing in its datacenters. At issue is the vast number of those warrants that include gag orders, meaning customers are in the dark when they're under investigation -- a violation of First and Fourth Amendment rights, according to Microsoft.

Gates, who of course isn't involved in day-to-day operations at Microsoft, was asked to weigh in on the lawsuit, among other matters such as free trade, the current U.S. presidential campaign and the gender gap in computer science, during an interview by Reuters Editor-in-Chief Stephen Adler (the one-hour discussion was webcast and is available here).

Regarding Microsoft's latest lawsuit, Gates said the government will have to strike the same balance between customers' right to privacy and the needs of law enforcement that prior administrations did when federal wiretapping was commonplace. Gates predicted Congress will ultimately have to strike that balance, though he didn't discuss any timeframe, or its willingness to address it in the current political environment.

"This one is about is there a higher threshold that the government has to meet in order to have that disclosure take place without notification to the company involved," Gates said. "Certainly there probably are some cases where they should be able to go in covertly and get information about a company's e-mail, but the position Microsoft is taking in this suit is that should be extraordinary. It shouldn't just be a matter of course there's a gag order automatically put in."

The suit revealed the government has filed 5,624 demands for e-mails or documents residing in a Microsoft cloud datacenter. Nearly half of those requests -- 2,576 -- came with gag orders attached. In 68 percent of those cases, there's no end date for the order, the company claimed.

Adler also asked Gates which of the candidates running for president he prefers. Given that none of the candidates share his views on global free trade, which Gates strongly believes is critical to the U.S. economy, he declined to endorse anyone.

"I'm not going to publicly pick a candidate, our foundation will work with whoever gets elected," Gates said. "There are things like being for free trade, certain types of innovation in the education space, R&D, those are the things. I hope voters weigh those in and we get someone who is willing to invest in the future."

Despite opposition to President Barack Obama's free trade agreements, Gates believes they are critical to the U.S. economy and to its overall expansion. "I think it would be a mistake for the country to not continue to engage in free trade deals," he said. "Some people think we ought to get out of the free trade deals that we're in. And I do know there are large parts of the American economy that benefit from global scale. In fact, we are the economy that by far benefits from global scale."

Regarding the lack of progress in drawing more women into IT engineering, Gates indicated he was troubled by that trend. "We have to look, what is it about the appearance of the people that's pushing them away," he said. "You see the dropping away happens in high school, it happens in college, even from Microsoft. When we hire women to be a programmer, more of them will switch over into general management and get off of the pure engineering track than men. By the time you get 10 years into a career, you have a gigantic difference between the number of men in the hard core coding jobs versus the women."

Posted by Jeffrey Schwartz on 04/20/2016 at 1:00 PM0 comments


Microsoft's New Lawsuit Against the U.S. Raises Alarming Privacy Concerns

Microsoft's lawsuit yesterday against the U.S. Department of Justice sets the stage for a new showdown that could have major implications on the future of data privacy in the cloud era.

The numbers Microsoft revealed in its suit, filed yesterday in the U.S. District Court for the Western District of Washington, were alarming. Over the past 18 months, the government has filed 5,624 demands for information residing in a Microsoft cloud datacenter and has put gag orders on nearly half, 2,576 of those requests, meaning the company couldn't alert customers that their data was accessed. And in 68 percent of those cases, there's no end date for the order, the company claimed.

"We believe that with rare exceptions consumers and businesses have a right to know when the government accesses their e-mails or records," said Microsoft President and Chief Legal Officer Brad Smith, in a blog post explaining the filing of the lawsuit. "Yet it's becoming routine for the U.S. government to issue orders that require e-mail providers to keep these types of legal demands secret. We believe that this goes too far and we are asking the courts to address the situation."

Smith acknowledged that there are legitimate occasions when such secrecy is necessary, such as if harm to others is at stake or destruction of critical evidence is at risk. "But based on the many secrecy orders we have received we question whether these orders are grounded in specific facts that truly demand secrecy. To the contrary, it appears that the issuance of secrecy orders has become too routine."

The large number of orders demanding secrecy, often indefinitely, violates the Fourth Amendment of the Constitution, Smith argued, on the grounds that individuals and businesses have the right to know when their property is searched. Smith said it also violates the First Amendment, which guarantees the company's right to speak with customers about such searches.

"The numbers of demands made just on Microsoft alone should alarm everyone, said Rebecca Herold, CEO of the Privacy Professor, which consults with enterprise IT executives. "1,752 of the requests had no end date to their data monitoring? The FBI/DOJ seems to be morphing into a continuous surveillance agency, seemingly without specific investigations involved. Microsoft makes some very strong points supported by sound reasoning backed by technology facts."

The move comes just days after Senate Select Committee on Intelligence Chairman Richard Burr (R-N.C.) and Vice Chairman Dianne Feinstein (D-Calif.) released a draft of their Compliance with Court Orders Act of 2016, a bill that could put restrictions on the use of strong encryption, a move many believe could have a chilling effect.

"It was largely trying to codify the All Writs Act usage into a modern law which was very disturbing because if you actually followed the letter of the law that was being proposed, fundamentally technology providers would have to write in back doors into their technology," said Aaron Levie, CEO of Box, in an interview (see Q&A with Aaron Levie). "It was a very overarching, incredibly aggressive piece of legislation that comes from a place of trying to modernize the law but in the wrong way."

Herold added the government's persistence on blocking encryption could backfire. "I believe the FBI will soon see their persistence will result in even more use of strong encryption add-ons, and that these encryption solutions will come from other countries not under U.S. jurisdiction," she said. "The FBI could very well be making their goals harder to reach as a result."

Yesterday's suit represents the fourth filed by Microsoft over digital privacy rights. The first lawsuit "resulted in a good and appropriate settlement allowing us to disclose the number of legal requests we receive," Smith noted. The most recent lawsuit, prior to yesterday's, challenging a U.S. search warrant for customer e-mail in a Microsoft datacenter in Dublin belonging to a non-US citizen, is pending in the U.S. Court of Appeals for the Second Circuit.

Box's Levie, who also was among many tech providers supporting Apple in its challenge with the FBI last month, was quick to endorse Microsoft's latest lawsuit, pointing to its implications. "While they obviously deal with this at a much greater scale than we do, we are concerned about the direction that these kinds of requests are heading in," he said. "And we do think it's important that there are much more modern approaches to government subpoenas and the intersection of digital security and our laws in general and this is a great example of that intersection."

Posted by Jeffrey Schwartz on 04/15/2016 at 1:19 PM0 comments


Badlock Security Vulnerability: How Bad is It?

Experts say administrators should apply, as soon as possible, an important security patch (MS16-047: Security update for SAM and LSAD remote protocols) released by Microsoft to repair a vulnerability that could let an attacker intercept network communication between Linux and Unix clients and servers running the open source Samba file and print services connected to Active Directory. Despite that advice, some are scrutinizing the alarm level over the bug, which bears the ominous name Badlock.

Critics claim the noise level about Badlock is unusually high, noting there are no known compromises based on the flaw. A report in Wired Tuesday indicated the German security company SerNet, whose engineer Stefan Metzmacher discovered and disclosed the bug late last month, has hyped the severity of the bug. Some critics questioned the need for the separate Badlock.org domain and Web site SerNet created -- complete with a prominent logo in the style of the Heartbleed site -- to publish information about the bug.

"While we do recommend you roll out the patches as soon as possible -- as we generally do for everything -- we don't think Badlock is the 'Bug to End All Bugs,'" said Tod Beardsley, senior security research manager at security vulnerability and threat analytics provider Rapid7, in a blog post. "In reality, an attacker has to already be in a position to do harm in order to use this, and if they are, there are probably other, worse (or better depending on your point of view) attacks they may leverage."

The Badlock site contains information on the bug. Defending its decision to draw attention to Badlock, an explanation on the site argued: "What branded bugs are able to achieve is best said with one word: awareness. Furthermore, names for bugs can serve as unique identifiers, other than different CVE/MS bug IDs."

Badlock is a protocol flaw in DCE/RPC, the core remote procedure call protocol used for Samba file and print services, along with its underlying sub-protocols, the Local Security Authority (LSA) protocol for domain policies and the Security Account Manager Remote Protocol (SAMR), which are collectively used in Microsoft's Windows Active Directory infrastructure.

The risk of Badlock, if successfully exploited, is that an attacker could intercept those over-the-network protocols, enabling distributed denial-of-service attacks or man-in-the-middle (MITM) attacks that allow an attacker to circumvent Active Directory authentication and change permissions and passwords.

"There are several MITM attacks that can be performed against a variety of protocols used by Samba," according to the Badlock site. "These would permit execution of arbitrary Samba network calls using the context of the intercepted user." The vulnerability applies to all applications implementing this protocol, including Samba - CVE-2016-2118, and Microsoft Windows - CVE-2016-0128."

In a worst case scenario, the site warned an attacker could "view or modify secrets within an AD database, including user password hashes, or shutdown critical services," or on a "standard Samba server, modify user permissions on files or directories." The Samba project issued its own patches.

For Microsoft's part, the most critical of the 13 bulletins released this week was MS16-039, Security Update for Microsoft Graphics Component. That patch remediates vulnerabilities in Windows, the .NET Framework, Office, Skype for Business and Lync. The most significant risk is the potential for a remote code execution attack if a user opens certain documents or accesses Web pages with infected fonts, Microsoft said. Chris Goettl, a product manager at security vendor Shavlik, told the popular KrebsOnSecurity site that the Microsoft Graphics Component bulletin addresses four vulnerabilities, two of which have already been detected in exploits.

Posted by Jeffrey Schwartz on 04/14/2016 at 12:43 PM0 comments


Facebook Releases New API to Build Intelligent Bots

Facebook and Microsoft may sit on opposite ends of the IT industry spectrum but their worlds continue to come closer together. In addition to operating among the largest cloud infrastructures in the world, the two companies have overlapping efforts to change the way people communicate, collaborate and share information. Over the past two weeks, both companies have stepped up their focus on bringing artificial intelligence delivered by bots to the mainstream.

At its annual F8 developer conference in San Francisco yesterday, Facebook CEO Mark Zuckerberg outlined his vision for the world's largest social network. Just as Microsoft CEO Satya Nadella and his top executive team at last month's Build conference introduced a new framework for developers to create and embed intelligent bots into apps that can function as personal digital assistants, Zuckerberg and his team at F8 introduced the company's own effort to deliver bots to Facebook users, among a number of new capabilities outlined at the developer conference.

Coming to the new Facebook Messenger Platform is support for bots, the company announced at F8. Facebook said bots running on Messenger can provide customized messages to users such as automated weather and traffic updates, notifications and specialized content delivery such as receipts. The company's new Messenger Send/Receive API allows not only sending and receiving of basic text messages but images and what Facebook describes as "rich bubbles containing multiple calls-to-action."

The API will be offered to companies that want to deliver these intelligent bots using the Facebook Messenger service. "Developers with the Send/Receive API will be able to build a custom bot in Messenger, providing anything from customer service to rich, interactive experiences such as automated responses or spawning a search," said Deborah Liu, Facebook's VP of product and marketplace, in a blog post, which outlines a number of other new tools that have recently been released or are coming soon.
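
For a rough sense of what calling the Send API looks like from a bot's backend, here is a minimal TypeScript sketch that posts a text reply to a user; the Graph API version, token handling and environment variable name are assumptions based on the Messenger Platform documentation of the time, not code from Facebook's announcement.

    // Minimal sketch: reply to a Messenger user through the Send API.
    // Assumes Node.js 18+ (built-in fetch) and a page access token in a hypothetical env variable.
    const PAGE_ACCESS_TOKEN = process.env.FB_PAGE_ACCESS_TOKEN;

    async function sendTextMessage(recipientId: string, text: string): Promise<void> {
      const res = await fetch(
        `https://graph.facebook.com/v2.6/me/messages?access_token=${PAGE_ACCESS_TOKEN}`,
        {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            recipient: { id: recipientId }, // user ID delivered by the webhook callback
            message: { text },              // plain text; "rich bubbles" use structured templates instead
          }),
        }
      );
      if (!res.ok) {
        throw new Error(`Send API call failed: ${res.status}`);
      }
    }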

Facebook Product Manager Seth Rosenberg described in a separate blog post how developers can deploy bots to Messenger. Effective immediately businesses and developers can build their apps with the API and then submit them to Facebook for review. "We will gradually accept and approve submissions to ensure the best experiences for everyone on Messenger," Rosenberg stated. "We're putting people first with new guidelines, policies and controls to offer the best interactions we possibly can."

Just as Microsoft's Nadella announced Redmond's effort to build conversations as a platform, Facebook bots will also support natural language intelligence, Rosenberg noted. Facebook released a new engine "to facilitate more complex conversational experiences, leveraging our learnings," Rosenberg explained. "The wit.ai Bot Engine enables ongoing training of bots using sample conversations. This enables you to create conversational bots that can automatically chat with users. The wit.ai Bot Engine effectively turns natural language into structured data as a simple way to manage context and drive conversations based on your business or app's goals."
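
To illustrate what "structured data" means in practice, the sketch below sends an utterance to the Wit.ai message endpoint and gets back the detected intents and entities; the endpoint, query parameters and token handling are assumptions drawn from Wit.ai's public API conventions rather than from Rosenberg's post.

    // Minimal sketch: turn a user utterance into structured intent/entity data via Wit.ai.
    // Assumes a Wit.ai server access token; the API version date is illustrative only.
    async function interpret(utterance: string, witToken: string): Promise<unknown> {
      const url = `https://api.wit.ai/message?v=20160330&q=${encodeURIComponent(utterance)}`;
      const res = await fetch(url, {
        headers: { Authorization: `Bearer ${witToken}` },
      });
      if (!res.ok) {
        throw new Error(`Wit.ai request failed: ${res.status}`);
      }
      return res.json(); // JSON describing intents/entities the bot can branch on
    }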

Facebook and Microsoft may not be arch rivals but a growing number of their products have somewhat similar use cases. And it wouldn't be surprising if in some way, these two efforts intersect -- or collide.

Posted by Jeffrey Schwartz on 04/14/2016 at 9:14 AM0 comments


Microsoft Pushes to Extend Reach of Power BI

When Microsoft launched Power BI for Office 365 two years ago, the goal was to bring business intelligence to the mainstream. Microsoft last month claimed  it has 5 million Power BI subscribers and now the company is reaching for more.

To extend that reach, Microsoft announced Power BI Embedded at last week's Build conference in San Francisco, an SDK and tool that lets developers build the Power BI service into their applications. Power BI Embedded is intended for developers, primarily ISVs, and is delivered through the Microsoft Azure cloud.

"You can now take the Power BI data visualization and reporting functionality and directly integrate it within your own applications," Scott Guthrie, executive VP of Microsoft's cloud and enterprise business, told developers at last Thursday's Build general session. "You can do this without requiring your users to buy or even be aware of what Power BI is. Instead you can basically take advantage of the Power BI Embedded SDK so it just feels naturally like part of your application using the same authentication, login and overall consumer experience that you're already delivering to your end users."

It's priced like any other service offered in Azure, he added. If the SDK is widely adopted by ISVs and SaaS providers, it could help make Power BI more pervasive. Microsoft disclosed a number of ISVs that plan to embed Power BI in their own offerings: SharePoint workflow vendor Nintex, location-based marketing software supplier Solomo Technology, sales enablement tools provider Highspot and Milliman, which offers a financial modeling and actuarial tool popular among those who do risk analysis.
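
As a rough sketch of what that integration looks like in practice, the TypeScript snippet below embeds a report into a host page using the powerbi-client JavaScript library; the report ID, embed URL and token are placeholders, and the configuration shown reflects the library's general embed pattern rather than the exact preview-era API.

    // Minimal sketch: embed a Power BI report inside a host Web app (powerbi-client library).
    // The IDs, embed URL and token are placeholders your backend would supply at runtime.
    import * as pbi from "powerbi-client";

    const embedConfig: pbi.IEmbedConfiguration = {
      type: "report",
      id: "REPORT_ID_PLACEHOLDER",
      embedUrl: "https://embedded.powerbi.com/appTokenReportEmbed?reportId=REPORT_ID_PLACEHOLDER",
      accessToken: "APP_TOKEN_PLACEHOLDER", // issued server-side; never hard-code in production
      tokenType: pbi.models.TokenType.Embed,
    };

    const powerbiService = new pbi.service.Service(
      pbi.factories.hpmFactory,
      pbi.factories.wpmpFactory,
      pbi.factories.routerFactory
    );

    // Render the report inside a container element the host app already owns.
    const container = document.getElementById("reportContainer") as HTMLElement;
    powerbiService.embed(container, embedConfig);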

Nick Caldwell, the general manager for Microsoft's Power BI, noted in a blog post that in addition to eliminating the need for ISVs to develop and maintain their own visualization and BI controls, it will offer better compatibility.

"They are guaranteed that their visualizations will work across all devices, and that they can leverage all of the value and innovation that is constantly being added to the Power BI service," Caldwell said. "By removing the complexities of designing and developing a custom BI solution for applications, and making it available with the scale of Azure, we've removed the barriers that previously slowed the development of intelligent enterprise applications."

Posted by Jeffrey Schwartz on 04/08/2016 at 12:52 PM0 comments


Microsoft Outlines Office 365 Deployment and Admin Improvements

Office 365 wasn't the main attraction at last week's annual Build Conference in San Francisco, but Microsoft gave its popular productivity platform and cloud service reasonable airtime, showcasing new capabilities that IT pros and developers alike should welcome.

During Thursday's general session, Qi Lu, executive vice president of Microsoft's Applications and Services Group, outlined a number of new capabilities for Office 365, which he said is now used by 60 million commercial subscribers and has an installed base of 340 million mobile downloads.

As recapped by senior product manager Jeremy Chapman in Microsoft's Office blog Friday, Lu emphasized three key areas of improvement specifically targeted at IT pros and developers:

  • Added intelligence to the Microsoft Graph, the unified API for Office 365 released back in November that gives developers the ability to incorporate the intelligence of Microsoft services such as Azure Active Directory, OneDrive for Business, Outlook and SharePoint into their own apps.
  • Improved capability for deploying extensible add-ins to Office 365, allowing centralized deployment and programmatic development of specific ribbons and buttons
  • New extensibility for Skype and Office 365 Groups.

The Microsoft Graph's added intelligence aims to let developers build "even smarter apps -- powered by data and insights from Office 365," wrote Microsoft Office 365 Corporate VP Kirk Koenigsbauer, in a separate post. "Developers can now use the Microsoft Graph to access a user's out-of-office status and recent email attachments."

During the Build keynote session, Yina Arenas,  Microsoft's senior program manager lead, demonstrated how that intelligence allows DocuSign, the popular provider of electronic signatures, to use the out-of-office status to automatically pass on a request to an alternate contact who is available and authorized to approve or e-sign a document. "DocuSign is leveraging all of the data and intelligence behind the Microsoft Graph to bring and create more contextual applications for their users," she explained.
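
For a rough idea of what such a call involves, the TypeScript sketch below reads a user's automatic-replies (out-of-office) setting from the Microsoft Graph REST endpoint; token acquisition is omitted, and the endpoint path and permission name are assumptions based on Graph's mailboxSettings resource rather than the code DocuSign actually uses.

    // Minimal sketch: check a user's out-of-office status via Microsoft Graph.
    // Assumes an OAuth 2.0 access token with the MailboxSettings.Read permission was already acquired.
    async function isOutOfOffice(accessToken: string): Promise<boolean> {
      const res = await fetch(
        "https://graph.microsoft.com/v1.0/me/mailboxSettings/automaticRepliesSetting",
        { headers: { Authorization: `Bearer ${accessToken}` } }
      );
      if (!res.ok) {
        throw new Error(`Graph request failed: ${res.status}`);
      }
      const settings = await res.json();
      // status comes back as "disabled", "alwaysEnabled" or "scheduled"
      return settings.status !== "disabled";
    }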

The improved capability for deploying extensible add-ins to Office 365 -- already available for Windows, the Web and iOS on iPad -- now adds Office 2016 for Mac to the mix of supported platforms, Lu announced, noting the goal is to ultimately support all key platforms. The support for extensible add-ins allows developers to implement popular Web technologies such as CSS, HTML5, JavaScript and other modern Web frameworks.

 "You can add a simple XML manifest that describes how your applications will integrate into Office and then you can use a collection of JavaScript APIs to interact with the Office apps, for example Word, PowerPoint, Excel and Outlook," Lu explained. 

Lu also announced that developers can now make their own solutions a native part of Office on every platform using Office 365's modern app distribution model. The model lets users install add-ins themselves from the Office App Store. Office 365 administrators can "deploy add-ins to the users or groups within their organizations by a simple button click." The Office 365 Graph now supports Webhooks, Lu announced, which allows applications to respond to changes in data in real time.
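
Webhook support works by registering a subscription that tells the Graph where to POST change notifications; the sketch below shows one way that registration might look, with the resource path, notification URL and expiration value all placeholders based on the Graph subscriptions API rather than details from Lu's announcement.

    // Minimal sketch: subscribe to change notifications (webhooks) for new inbox messages.
    // Assumes an OAuth 2.0 access token and a publicly reachable notification endpoint (placeholder URL).
    async function createInboxSubscription(accessToken: string): Promise<unknown> {
      const res = await fetch("https://graph.microsoft.com/v1.0/subscriptions", {
        method: "POST",
        headers: {
          Authorization: `Bearer ${accessToken}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          changeType: "created",
          notificationUrl: "https://example.com/api/notifications", // placeholder endpoint you host
          resource: "me/mailFolders('Inbox')/messages",
          expirationDateTime: new Date(Date.now() + 60 * 60 * 1000).toISOString(), // ~1 hour out
          clientState: "opaque-validation-token",
        }),
      });
      if (!res.ok) {
        throw new Error(`Subscription request failed: ${res.status}`);
      }
      return res.json(); // contains the subscription ID used for renewals
    }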

The new extensibility for Skype and Office 365 Groups, now lets developers integrate their services into conversations within Office using Outlook and Skype, Lu said. "These two conversation canvases will provide new and powerful ways for your services to engage with your users," he said.

Chapman noted that while these updates apply primarily to developers, the new centralized deployment typically falls into the domain of the Office 365 administrator, who'll likely want to customize the desktop environment with add-ins, personalization and security policies, without expecting users to load the add-ins or configure Office.

"The Office add-in model for COM add-ins has been around for a long time, and developers of those add-ins tend to deliver those as .EXE or .MSI installation files," Chapman noted. "This is often done in order to use software distribution systems -- like Microsoft Intune or System Center Configuration Manager -- to deploy add-ins as part of a customized Office environment."

That approach brings some challenges with the COM add-in deployment model, which Chapman noted Microsoft has solved with the new model:

  • Deployment is usually targeted at a device, not a user.
  • Getting add-in experiences to users often requires software distribution systems, where devices need to be enrolled into the management system.
  • Updating and managing add-ins can be challenging when changes are needed.

"The new Office add-in model itself solves the update problem by moving client-side operations either to the cloud or a privately hosted service," Chapman noted. "When updates are needed, the updates can be done centrally against the service that powers the add-in or via the Office 365 admin portal. The new centralized deployment model with Office add-ins also solves for the other two challenges, by enabling admins to target deployment to specific users or groups and eliminating the need for software distribution systems to deploy these new add-ins. You can now upload add-in files and distribute them to your users from the Office 365 admin portal. "

This could also be appealing in organizations where employees who use their personal devices don't want to enroll them in device management systems, he added. Instead, a small .XML file, which Office apps can easily discover and load when Word, Excel or PowerPoint launches, is installed, he said.

"These updates make it easier for both IT admins and developers to ensure users benefit from Office add-in experiences."

 

Posted by Jeffrey Schwartz on 04/04/2016 at 5:12 PM0 comments


Astronaut Scott Kelly Describes Using HoloLens in Space

Microsoft's HoloLens mixed-reality headset has come down to earth -- literally. Making his first public appearance since spending nearly a year in space, Captain Scott Kelly today described how he and his NASA colleagues used HoloLens, making the agency one of several select pilot users of the company's Windows 10-based wearable computing device that renders 3D holograms.

The astronaut's endorsement of HoloLens on stage with Microsoft CEO Satya Nadella at the company's inaugural Envision conference in New Orleans came just days after the company made the devices and associated SDKs and APIs available for purchase by qualified software developers, as promised just over a month ago. In addition to announcing the release of the HoloLens Development Edition at last week's Build Conference in San Francisco, Microsoft also said it would be supported in the forthcoming Windows 10 Anniversary Edition, slated for release this summer.

Nadella is an avid proponent of holographic computing, which creates 3D mixed reality-type experiences. The company believes HoloLens has the potential to enhance how people work, learn, communicate and play. Nadella emphasized that point to developers at Build last week and reiterated it today in the opening session of Envision, the company's new event for business leaders tasked with digital transformation.

"This notion of this new medium of mixed reality that comes to life with HoloLens is reshaping architecture, industrial design and many other fields of operations," Nadella said in his Envision keynote today, following a demonstration of how Ecolab is using it as part of a digital transformation effort to address the shortage of water throughout the world.

Microsoft and NASA last year revealed their plans to test HoloLens for a project called Sidekick, aimed at enabling the ground crew to provide visual assistance to the astronauts, who had two pairs of the wearable devices in the space station. Using Microsoft's Skype for Business, NASA's ground crew was able to troubleshoot and fix problems in the International Space Station with the benefit of holographic output.

On stage at Envision, Kelly admitted that when he first saw HoloLens prior to embarking on his historic 340-day space mission he was captivated with the concept but skeptical it would be useful. "Before I flew in space on this mission, I went to Seattle and looked at the HoloLens and was very, very impressed. But I was also a little doubtful we would be able to make this technology work on the space station," Kelly said. "Usually when you're doing anything, there are startup transients and we have a WiFi network up there that isn't always working top notch. But when we turned it on, I was pretty amazed at how seamlessly it worked with their system on board the space station and was very impressed what will in the future be its ability to help us do our work."

At Build last week, Microsoft outlined how a number of early testers have spent the past year running pilot programs that make use of HoloLens, including automaker Volvo, Japan Airlines, computer-aided design software supplier Autodesk and a group of researchers from Case Western Reserve University and Cleveland Clinic.

Case Western Reserve University and Cleveland Clinic, which have teamed together to build a new medical school, revealed at last year's Build their intent to test HoloLens. Subsequently at last week's Build event, Case Western Reserve University School of Medicine Dean Pamela Davis said the new medical school plans to use HoloLens as part of the curriculum. In her presentation at Build, Davis and two members of her team described how HoloLens will help students get a better understanding of the digestive and nervous systems and how to diagnose a brain tumor.

"Being untethered and able to walk around 3D holographic content gives our students a real advantage," Davis said. "Students have commented that a 15-minute session with HoloLens could have saved them dozens of hours in the cadaveric lab. When we have only four short years to train them, this is invaluable."

Now that HoloLens is in the hands of a broader developer community, it should provide an even better picture of how commercially viable these devices might become.

Posted by Jeffrey Schwartz on 04/04/2016 at 1:01 PM0 comments


Microsoft To Offer Xamarin for Free and Open Source Its Core Platform

Microsoft executive VP for cloud and enterprise Scott Guthrie kicked off the second day of the company's Build conference for developers by announcing it will offer Xamarin's cross-platform tooling free of charge. Xamarin offers popular tooling for developing mobile apps based on the .NET Framework and C# language. Guthrie said Xamarin, which Microsoft last month agreed to acquire, will be available free of charge to all Visual Studio developers, including those using the company's free Visual Studio Community edition.

Attendees responded to the news with rousing applause because it opens a path for .NET developers to build C#-based applications that run natively on iOS, Android and Windows without having to rewrite them for each platform.

Looking to further extend the reach of Xamarin and the .NET Framework, Guthrie also announced that Microsoft will contribute the Xamarin core platform to the open source community via the .NET Foundation, the independent organization Microsoft created to foster collaboration between the open source and C# developer communities.

"This means everything you need to run a Xamarin app on any device is open source," Guthrie said. "We think this makes Xamarin more attractive for mobile development for any device." The .NET Foundation received a major boost from Red Hat, Unity and JetBrains, who are the latest to join, Guthrie announced. "It's a really exciting time to be a .NET developer and we're excited to see .NET apps be built."

Posted by Jeffrey Schwartz on 03/31/2016 at 9:57 AM0 comments


Microsoft Reveals Roadmap for Conversational Computing with Intelligent Bots

Microsoft has already signaled its ambitions to make Cortana and its machine learning-based engine more useful, and today CEO Satya Nadella articulated a grand vision for making "conversational computing" based on artificial intelligence and machine learning pervasive.

In the opening session of Microsoft's Build Conference in San Francisco, Nadella described how Microsoft will enable AI-based conversational computing in everything from Windows, its Edge browser, Outlook and Skype to the entire ecosystem of software, hardware and cloud-based SaaS offerings. Nadella sees conversational computing using speech as the next UI and as a framework for how all applications are developed and connected.

Giving emphasis to the name of the conference, Nadella called on developers to "build" the next wave of applications based on tools and services that the company is rolling out including 22 Cognitive Services APIs for its Azure-based Cortana Intelligence Suite and the new Microsoft Bot Framework for developers to use to build intelligent bots that use natural language across platforms and apps. Nadella said the new Bot Framework, which includes an intelligence runtime, will work in any programming language and will allow developers to build interconnected applications that use speech as the UI. "We think this intelligence runtime [in the Bot Framework] and Cortana Intelligence suite is going to be core, just like how the .NET runtimes were, to all of the applications you are going to build," Nadella said. The notion of the Bot Framework runtime, Nadella said, is that developers can build intelligence into their apps and ultimately use voice as the new UI.
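
To give a sense of what building on the framework looks like, the sketch below is a minimal echo bot written against the Node.js Bot Builder SDK that accompanied the announcement; the package names, classes and environment variables reflect the v3-era SDK as commonly documented, so treat them as assumptions rather than Microsoft's exact sample.

    // Minimal sketch: an echo bot built with the Bot Framework's Node.js Bot Builder SDK (v3-era API).
    // The app ID and password are placeholders issued when the bot is registered with the framework.
    import * as restify from "restify";
    import * as builder from "botbuilder";

    const connector = new builder.ChatConnector({
      appId: process.env.MICROSOFT_APP_ID,          // hypothetical environment variable names
      appPassword: process.env.MICROSOFT_APP_PASSWORD,
    });

    // The same dialog logic is routed by the framework to Skype, Web chat and other channels.
    const bot = new builder.UniversalBot(connector, (session: builder.Session) => {
      session.send("You said: %s", session.message.text);
    });

    const server = restify.createServer();
    server.post("/api/messages", connector.listen()); // endpoint the Bot Framework connector calls
    server.listen(process.env.PORT || 3978, () => console.log("Bot is listening"));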

"We want to take that power of human language and apply it more pervasively to all of the computing interface and the computing interactions," Nadella said. "To do that, you have to infuse into the computers and computing around us intelligence. It means you have to bring forth these technologies of artificial intelligence and machine learning so that we can teach computers to learn human language, have conversational understanding, teach them about the broad context…so that they can really help you with your everyday tasks both at work and elsewhere. So as we infuse intelligence into everything, I think it's very important to have a principled approach to guide our design, as well as how we build things."

Microsoft is certainly not alone in its push to let users ask their computing and mobile devices for information or perform tasks by merely speaking. Apple, Google and even Amazon with its Echo device have similar offerings, which for the most part are still limited novelties with unpredictable performance.

Microsoft's effort to preview the capability of a "chat bot" called Tay that can interact on social media ended up giving the company a black eye last week after it tweeted racist, anti-Semitic and sexist remarks. The company pulled it and apologized. "We quickly learned it was not up to this mark, and so we're back to the drawing board," Nadella said.

Microsoft will roll out the framework over time, though the company is bringing it to Skype with the Skype Bot Platform, which the company said consists of an SDK, API and workflows available in its new Skype Bot Portal.

Posted by Jeffrey Schwartz on 03/30/2016 at 2:54 PM0 comments


The Return of the Microsoft/Yahoo Acquisition Rumor Mill

Rumors that Microsoft has met with private equity investors interested in acquiring Yahoo, and is weighing the possibility of offering significant financing to help them succeed, have many tech watchers feeling a sense of déjà vu. No evidence has surfaced that Microsoft actually wants to take a second shot at acquiring Yahoo itself, but the whispers that surfaced late last week, suggesting it may be a major player in Yahoo's uncertain future, bring back memories of the two companies' storied past.

If Microsoft plays a key role in an acquisition of Yahoo's core assets, even indirectly, it certainly doesn't mean former CEO Steve Ballmer was right all along back in 2008 when he unsuccessfully waged a $45 billion gamble on the struggling company. At the time, many observers and Microsoft insiders feared the huge financial bet was a disaster in the making. The fact that Yahoo founder Jerry Yang and his board were foolish enough to turn down the premium offer is mind-boggling, considering it was evident even then that the company was unlikely to see an offer -- or achieve organic value -- that would have benefited shareholders anywhere close to what Microsoft put on the table. The soap opera that has defined Yahoo ever since speaks for itself. Yet Yahoo's pain over that time became Microsoft's gain on many fronts, though to be sure, Redmond had many rough years as well.

After the deal failed, Microsoft ultimately negotiated a 10-year search deal with Yahoo, making Bing its default search engine. That gave Microsoft much of what it wanted at the time with far less of an upfront investment. The pact had incremental benefits for both companies but when Marissa Mayer took over as CEO in 2012, the former Google exec felt differently. Mayer was unhappy with the arrangement and managed to convince Microsoft to scale it back, allowing Yahoo to take another stab at developing its own online search engine.

Back in early February 2008 when Microsoft made the stunning and unexpected hostile takeover bid for Yahoo, we were just wrapping up the March issue of Redmond magazine. The implications were so significant that we ripped up the issue so we could adequately cover it. Looking back at our cover story (Can MicroHoo Take On Google?), which raised serious questions about whether it was a wise move or if it would even go through, the deal had promised to help Microsoft take on Google in search and in shifting its then Windows Live ambitions to a Web services portal model that would reach more users. Acquiring Yahoo even offered a hedge against what is now Google Apps for Work, which Microsoft saw as a threat to its Office franchise. It also potentially offered Microsoft a way to transition into the open source world, we reported. Microsoft had yet to announce it was planning to roll out what is now Azure, and no one knew to what extent people would embrace the new Apple iPhone -- or any smartphone.

As reported today on CNBC, any moves by Microsoft to take a piece of Yahoo, or influence its fate, are in the early stages. They may gain traction, or someone else may make Yahoo an offer it can't refuse. Still, the current auction covers Yahoo's core businesses outside of its stake in Alibaba. That puts the cost in the range of $10 billion if Yahoo has its way -- though the assets are valued at much less. That's far less risk than the $45 billion of eight years ago. But why would Microsoft want a piece of, or influence over, the future of Yahoo?

Mayer's efforts to turn Yahoo around have failed on a number of fronts, including her decision to pull away from Microsoft, which still has plenty to lose if someone were to pit those assets against Bing. As spelled out in a Seeking Alpha post by Mark Hibben, "Yahoo's recent earnings report for 2015 Q4 and the full year reveals how badly the strategy to circumvent Microsoft backfired." While last year's revenues rose a modest 7.5% to $5 billion, he noted traffic acquisition costs mushroomed from $217.5 million in 2014 to $877.5 million in 2015.

"The path to restoring profitability is fairly straightforward: dump Mayer," Hibben added. "Dump the attempts to develop an independent search capability. Return to using Microsoft Bing. Integrate future mobile and contextual search efforts within the larger Microsoft cloud services strategy. Rarely in the acquisition of troubled companies such as Yahoo is an opportunity presented such as what currently faces Microsoft. Microsoft can be the salvation of Yahoo. Indeed, it's the only company that can. Any other acquirer just gets an empty shell when it comes to search. Microsoft would get an empty shell as well, but it's a shell it can profitably fill."

As reported by Re/code's Kara Swisher last Thursday, the "strategic bidders" besides Microsoft include AT&T, Comcast and Verizon, the last of which acquired AOL last year.

Yahoo is on the auction block thanks to the aggressive efforts of activist investor Starboard Value, which launched a proxy fight looking to replace the company's entire board, including Mayer. If the private equity firms looking at Yahoo's assets -- which, according to Swisher (citing numerous sources), include Advent International, Vista Equity Partners, TPG, KKR and others -- get Microsoft to fund the deal, it would be an unconventional way for Microsoft to ensure Bing remains core to Yahoo's future.

As reported by Mary Jo Foley, while Bing may still only have a small piece of the search pie dominated by Google, it is still an important asset to Microsoft. Its core components power Windows 10's Cortana and it is the basis of a graph "to make more useful the interactions between its own products, like Xbox, Office, Windows and its various cloud services."

Microsoft, along with Amazon, IBM, HPE, Google and many others, see machine learning and artificial intelligence (AI) as a huge opportunity. That could explain why Microsoft is at the very least exploring how it can keep Yahoo within its reach. Even if that happens, and that remains to be seen, it wouldn't be a bet-the-company move this time.

Posted by Jeffrey Schwartz on 03/28/2016 at 1:29 PM0 comments


Docker Client App Beta for Windows 10 and Mac Released

Docker has developed software that will allow Windows 10 and Mac users to run its container platform as a packaged native application. The company yesterday announced the beta of Docker for Mac and Windows. For now, Docker is only offering it as a "limited" beta, with general availability planned for later in the year.

The new Docker for Mac and Docker for Windows are expected to offer a faster and more reliable native experience without requiring a VirtualBox VM. The improvements are the result of work among Docker, Apple and Microsoft engineers to provide OS integration and allow for host-native hypervisor support, specifically Apple's Hypervisor framework and Microsoft's Hyper-V.

"The Docker engine is running in an Alpine Linux distribution on top of an xhyve Virtual Machine on Mac OS X or on a Hyper-V VM on Windows, and that VM is managed by the Docker application," wrote engineer Patrick Chanezon, who last year left Microsoft to join Docker. "You don't need a Docker machine to run Docker for Mac and Windows." Chanezon said the new releases will also make it easier to run containers on local host networks. "Docker for Mac and Windows include a DNS server for containers, and are integrated with the Mac OS X and Windows networking system," he noted.

The new software is primarily aimed at developers and will let them build, test and deliver programs that can run as native Mac and Windows apps from the system toolbar like packaged applications from app stores.

"Docker for Mac and Windows reflects deep OS system-level work from our Unikernel Systems team and demonstrates how, moving forward, we can leverage native platform capabilities to provide users with the same optimized Docker experience on all platforms," said Solomon Hykes, Docker's founder, CTO and chief product officer, in a statement announcing the release. "These integrated software packages are designed to remove an additional layer of 'dependency hell' for Mac and Windows developers by allowing them to develop directly inside a container."

By removing the need for developers to install OS- and language-specific dependencies, the new clients let them build their programs with one tool, facilitating testing and overall deployment of containerized apps. That, Docker is hoping, will make it easier for developers to ship "Dockerized" apps directly from their machines to registries, and it allows them to mount application code and other components into a volume. This is also important because apps automatically refresh when code changes are made.

Based on the system requirements, the new Docker clients appear to be aimed at serious business apps. The Docker for Windows client requires Hyper-V and Windows 10 Pro (the 1511 November update, Build 10586) or later. The Docker for Mac client must run on a system built in 2010 or later running at least OS X 10.10.3 Yosemite.

While the Mac and Windows clients share a significant amount of code, the two are at different stages of development, according to Chanezon. "Docker for Windows will initially be rolled out to users at a slower pace but will eventually offer all the same functionality as Docker for Mac," he noted. The company now has a list for those wanting to sign up to test the software.

Posted by Jeffrey Schwartz on 03/25/2016 at 1:00 PM0 comments


Citrix Launches Secure Browser Service for Virtual Web Apps

Citrix Systems today released its new Secure Browser service, a subscription-based offering that lets IT organizations deliver SaaS and custom Web apps to employees without requiring them to make any changes to their browsers such as installing plugins.

In addition to removing compatibility issues among different browser types and versions to ensure apps perform and present properly, the company said Secure Browser lets organizations lock down SaaS offerings such as Office 365 and Google Apps for Work, as well as the suite of business Web apps from Salesforce.com, Workday and Concur, among others.

The new service is an effort by Citrix to deliver browser-based apps to users that don't have, or don't wish to procure, Citrix's flagship XenDesktop or XenApp infrastructure, though an on-premises version will work with them. It's available now at a starting monthly subscription cost of $20 per user, though the cost could be lower for customers with larger volumes, company officials said. Instead of running Citrix Receiver on a client system, the service is deployed in the public cloud, and the Web app is optimized and designed for any HTML5-compatible browser, said Brett Waldman, Citrix's senior manager for product marketing.

"What we have done with Secure Browser is we've taken the technology of XenApp and we've streamlined it for a specific use case, in this case delivering a virtual browser," Waldman said. "We take advantage of the HTML5 technology so it just runs inside the browser. There's nothing to install. It's completely untouched by IT or even the end user."

In addition to making it easier to ensure compatibility of different Web apps with any HTML5 browser, the idea behind Citrix Secure Browser is to rein in lines of business that merely subscribe to SaaS services without any IT control, also known as "shadow IT." Citrix Secure Browser lets IT organizations publish any SaaS or internally procured Web application, making it accessible via URL. Administrators can specify the browser best suited for a specific app as the target browser running on the backend. The service runs on Microsoft Azure, but Waldman said Citrix manages and controls the Secure Browser service.

"Secure browsers are essential for companies that have Web apps and need to control access and eliminate potential vulnerabilities," said Mark Bowker, a senior analyst at Enterprise Strategy Group. "Secure browsers simply make Web app delivery more predictable. The benefit of Citrix is that they have years of customer experience that helps them lock in secure browser capabilities with minimal friction for IT and employees."

The company is also offering a version called XenApp Secure Browser as a perpetual license at a starting price of $150 per user or client device. Citrix also is providing the Secure Browser Development Kit for XenApp and XenDesktop customers (though not for those with VDI edition).

Posted by Jeffrey Schwartz on 03/24/2016 at 10:35 AM0 comments


IT Industry Mourns PC Pioneer Andrew Grove

Many IT pros who have benefitted from the rise of the Intel microprocessor that fueled the WinTel-based PC industry owe a portion of their careers to Andy Grove, who died last night at the age of 79. Although the cause of death was not immediately known, Grove, who survived prostate cancer in the 1990s, had suffered from Parkinson's disease in recent years and had contributed to efforts to find cures for both diseases.

Grove was a legendary figure in Silicon Valley and played a key role in the formation of the famed WinTel alliance, through which he and Microsoft co-founder and then-CEO Bill Gates saw their respective companies mushroom from small startups into the most important technology providers in the world from the late 1980s through the end of the 1990s. When Grove became CEO in 1987, Intel's revenues were $1.9 billion; on his watch the company ballooned to $26 billion by the time he stepped down in 1998.

"Grove played a critical role in the decision to move Intel's focus from memory chips to microprocessors and led the firm's transformation into a widely recognized consumer brand," the company acknowledged in the announcement of his death. Grove championed the decision to exit the DRAM business and bet the company on the growth of x86-based PC microprocessors amid mounting and unsustainable price pressure from Asian providers of memory.

 "In my mind, that was probably the most fateful decision Intel made," said Crag Barrett, a company veteran who succeeded Grove as CEO, in a Bloomberg interview. "It really was the second birth of Intel."

Technically, Grove was not an Intel founder, but he was brought on board the day the company was incorporated in 1968 by founders Gordon Moore and Robert Noyce; all three had previously worked together at Fairchild Semiconductor.

Grove was named president under then-CEO Moore in 1979 and took the reins from Moore in 1987, when networked DOS computers had begun taking over functions once the primary domain of mainframes and host-based minicomputers, though PCs were still relatively rare in the home. Grove was known for his business acumen and operational skills. In the years that followed, notably with the release of Windows 3.0 in 1990, the PC industry began its rapid rise with Grove and Gates at its epicenter, though each played very distinct roles and their dealings were at times combative. The two would also find themselves at the center of huge, widely publicized antitrust suits.

"Andy Grove foresaw the eventual breakup of the vertically integrated computer industry, and was able to specialize in creating the core component of computing -- the microprocessor," wrote Michael Blanding, a Harvard Business School Working Knowledge staff writer. "And then he championed that silicon part with the famous 'Intel Inside' marketing campaign. Blanding's assessment of Grove came last year in the context of the release of Strategy Rules, a book by Harvard professor David Yoffie and Michael Cusumono, MIT Sloan School of Management a distinguished professor. The book compared and contrasted the management approaches of Grove, Gates and Apple Founder Steve Jobs based on decades of studying the three of them.

"The notion that you could brand a product that no one had ever seen and that no one understood what it did was brilliant," Yoffie told Blanding of Grove. "That changed the entire structure of the semiconductor industry forever." Gates last night tweeted that he was saddened to learn of Grove's death. "I loved working with him. He was one of the great business leaders of the 20th century."

Among others who paid tribute was VMware CEO Pat Gelsinger, who spent decades at Intel as a senior executive before arriving at VMware. "Probably no one person has had a greater influence in shaping Intel, Silicon Valley, and all we think about today in the technology world than Andy Grove," Gelsinger told The Wall Street Journal's Don Clark. "I would not be where I am or who I am if it were not for the enormous influence of Andy as a mentor and friend for 30 years."

Unlike Gates, who came from a wealthy family, Grove was born to a Jewish family in Hungary and escaped persecution from the Nazis and later the Soviets before arriving in New York City, where he earned a degree in chemical engineering from the City College of New York and later a PhD from the University of California at Berkeley. Grove also authored a number of influential books, perhaps most notably Only the Paranoid Survive. Barrett noted Grove's firm and demanding style: "Andy really created the corporate culture at Intel, the confrontational nature of the culture, the problem-solving nature of the culture, the 'don't be afraid to take a risk' part of the culture. Those were all built into his DNA. He kept asking 'why?' to get to the root cause, solve the root cause, don't let anyone give you any BS along the way."

The Bloomberg report included an excerpt from Grove that pointed to his childhood and the impact the Nazi occupation had on him. "People who have lived with terror do exactly that," he said. "You get used to horrible things happening and go on with the rest of your life."

Posted by Jeffrey Schwartz on 03/22/2016 at 1:02 PM


Apple Shrinks the iPad Pro

Apple today rolled out a smaller version of the iPad Pro that, for all intents and purposes, is an upgraded iPad Air with some noteworthy features that the company claims make it a worthy alternative to Windows PCs. The new iPad Pro was among a number of offerings Apple launched at an event held in San Francisco, including an upgraded version of its four-inch phone, called the iPhone SE, with a notably lower price of $399; a new entry-level Apple Watch priced at $299; and new software, including a new release of iOS.

The new Apple iPad Pro is the size of its traditional iPad -- 9.7 inches -- but packs markedly more punch, with an improved display, much better speakers, support for USB camera connections and SD cards, and the company's A9 processor, among other new features.

While the new entry-level version of the iPad Pro with Wi-Fi will start with 32GB of storage, it'll also cost $100 more than the iPad Air 2 did, coming in at $599. A version with 128GB will cost $749 and one with 256GB is priced at $899. They're due to ship March 31.

Just as the company said last fall when it launched the larger iPad Pro, the company sees its newest tablets as suitable alternatives to PCs.

Phil Schiller, Apple's senior VP of marketing, told attendees that there are 600 million Windows PCs that are at least five years old that should consider the iPad Pro. "This is really sad. These people could really benefit from an iPad Pro," Schiller said. "It's the ultimate PC replacement."

Touting the fact that there are one million iOS apps for the iPad in the App Store, Schiller said the new iPad Pro's 2048x1536 display is 40 percent less reflective than the iPad Air 2's, delivers 500 nits of brightness and offers 25 percent higher color saturation. In addition to supporting Apple's new pen, introduced with the larger version, it also will work with what Schiller called a "smart keyboard."

It also comes with four speakers, supporting twice the volume and improved audio fidelity, he said, along with a new 12MP camera and a 5MP FaceTime camera.

Another noteworthy feature is its support for video recording at near 4K video (3840x2160) at 30 fps, 1080p HD at 30 or 60 fps, 720p HD at 30 fps or slo-mo video support for 1080p at 120 fps and 720p at 240 fps. Other specs can be found here.

Whether or not the new smaller iPad Pro appeals to users of older Windows PCs remains to be seen. The bigger question is will it appeal to those with much older iPads?

 

Posted by Jeffrey Schwartz on 03/21/2016 at 3:48 PM


Google Gains Entry to Cuba to Expand Connectivity

While the restoration of ties between the U.S. and Cuba became official last year, President Barack Obama's historic visit to the country this week, and his meeting with President Raul Castro, aims to cement relations and pave the way toward removing the longstanding trade embargo.

The opening of relations promises to vastly modernize information technology in a country where citizens have limited cellular and Internet connectivity. The few who have phones or computers have old technology and only 5 percent are said to have access to wireless data and the Internet. ABC News this morning aired an interview with Obama, who indicated that the Cuban government has reached an agreement with Google to expand broadband and Wi-Fi access in Cuba, as noted by Reuters. "Google has a deal to start setting up more Wi-Fi and broadband access on the island," Obama said.

The president arrived in Cuba yesterday, accompanied by his family, a large delegation of lawmakers from Congress and a group of 11 CEOs from companies including Airbnb, Marriott, PayPal, Starfish Media and Xerox. Cuba largely lacks access to modern technology, and companies across the spectrum are likely to look for ways to bring it into the 21st century.

Microsoft has yet to announce its plans to tackle the Cuban market, though it could be fertile ground for the company to push Windows 10, Office 365 and its various datacenter and consumer-oriented offerings. Perhaps Microsoft, Google and Amazon will build their own datacenters to operate their public clouds?

Last May, Microsoft Director of Civic Technology Matt Stempeck visited Cuba and wrote about his findings. The technology he did see there was mostly decades-old systems, often PCs with old CRT displays (remember those?). Despite the limitations of Cuba's tech scene, Stempeck wrote in a blog post that the country is slowly making some strides.

"While Internet service remains rare, especially outside of tourist areas, third-party cellular businesses have sprouted up alongside adjacent farmacias and grocers to serve a far larger percentage of the population than other high tech," Stempeck noted. "Freedom House produces reports about connectivity in Cuba, which is poor but improving."

Naturally concerns about issues such as government surveillance and other restrictions imposed on users will set the stage for how far the IT industry will gain access in Cuba. Over time though, the country's businesses and citizens will gain a new view of the world.

Posted by Jeffrey Schwartz on 03/21/2016 at 12:25 PM


Microsoft Bets on Xamarin to Boost Universal Windows Platform

When Microsoft pulled the trigger and agreed to acquire Xamarin, it hardly came as a surprise to those familiar with the company. Xamarin's tools let developers build and test their mobile applications to run on multiple platforms without rewriting them from scratch. They offer a framework for building apps in the C# programming language for not only Windows but other platforms such as Android and iOS. The Xamarin tooling has become popular with developers who have preferences for Windows as well as those focused on building applications for other platforms.

Days after the deal was announced in late February, Microsoft said it was cancelling plans to deliver Project Astoria, the bridging technology aimed at letting Android developers port their apps to run on Windows 10 and the new Universal Windows Platform (UWP). Industry followers had for many months predicted the demise of Project Astoria, which was built around providing an Android Open Source Project (AOSP) subsystem in Windows 10 Mobile so that Android apps could run with relatively few alterations. Microsoft first revealed Project Astoria at last year's Build Conference along with a similar iOS bridging technology called Project Islandwood, designed to let developers bring iOS apps written in Apple's Objective-C to the Windows Store. Microsoft delivered the iOS bridge in August and said it continues to frequently release updates.

Right after last year's Build, where the bridges were announced, there was skepticism, and questions soon followed about financial liability and Java licensing with Oracle, further dimming Project Astoria's prospects. Regardless, following many months of rumors, Microsoft has made it official that it has decided to focus its bridging efforts on iOS.

"The philosophy behind the Bridges has always been to make it as easy as possible for you to bring existing code to Windows, and our investments in the iOS Bridge will make this straightforward," wrote Windows Developer Platform VP Kevin Gallo, in a Feb. 26 blog post. "We received a lot of feedback that having two Bridge technologies to bring code from mobile operating systems to Windows was unnecessary, and the choice between them could be confusing."

Some developers weren't buying Microsoft's explanation. Hector Martinez argued in the comments section of Gallo's post that Android developers use Java, which is very close to C#, while iOS developers mainly use Objective-C, which he noted is "a much more difficult language to work with. If you think iOS developers are going to make a huge contribution to UWP, then you are thinking wrong." Another commenter more bluntly stated: "You cancelled it because you felt breaking a promise you made to developers was less costly than continuing development."

Microsoft Program Manager Chris Rutkas defended the move. "The bridges had different implementations," Rutkas replied to the post. "The Astoria decision wasn't easy, and we carefully evaluated it for months. We do want to help you bring your existing code to Windows 10, including your Android code, and we think Xamarin will offer interesting possibilities. We'll have more to share at Build on our plans."

Those grumblings notwithstanding, Xamarin has numerous fans, who expressed enthusiasm about Microsoft's decision to pull the trigger and bring Xamarin into the fold. "I'm surprised it took so long," said Eric Shapiro, founder of ArcTouch, a San Francisco shop that designs and builds mobile apps with clients including Audi, CBS, Salesforce.com and Yahoo. Shapiro, who first discovered Xamarin about three years ago, had expected Microsoft to buy Xamarin more than a year ago.

Since ArcTouch started using Xamarin for cross-platform development of mobile apps, Shapiro said, it has grown to account for a third of the firm's work. He anticipates its use will continue to grow, ultimately covering two-thirds of all its projects.

"When a client says 'I want iOS, Android and Windows,' then there really is no other option besides doing it in Xamarin," Shapiro said. Though he acknowledged that there are other cross-platform frameworks such as Appcelerator's Titanium and the Javascript/HTML-5-based Sencha Touch, Shapiro believes Xamarin produces higher performing apps that are easier to get approved in the Apple Store.

"We have had a longstanding partnership with Xamarin, and have jointly built Xamarin integration into Visual Studio, Microsoft Azure, Office 365 and the Enterprise Mobility Suite to provide developers with an end-to-end workflow for native, secure apps across platforms," said Microsoft Cloud and Enterprise Group Executive VP Scott Guthrie, who announced the acquisition a few weeks ago.

Shapiro is also hoping that Microsoft's acquisition of Xamarin will bring more visibility to the tooling and that the company will make it more affordable to use. With training and support, each Xamarin developer license costs $3,600. Given that Microsoft's motive for acquiring Xamarin was likely not potential revenue gains but advancing its ability to support cross-platform mobile apps, Shapiro is hopeful the company will add the tools to MSDN subscriptions. "Microsoft has so many more resources for training that are free," he said. "To me it will be interesting to see how this shifts."

Indeed, how this all shifts should become clearer at the forthcoming Build Conference at the end of this month in San Francisco, where Microsoft is expected to outline what's next for UWP.

 

Posted by Jeffrey Schwartz on 03/18/2016 at 11:47 AM


ManageEngine Adds Single Sign-On to SaaS Apps

IT systems management provider ManageEngine this week released a new tool that provides single sign-on access to software-as-a-service (SaaS) apps, including Google Apps and Office 365. The company's new ADSelfService Plus offers password management and allows IT to provision single sign-on access to apps using Active Directory credentials.

The company said ADSelfService Plus lets organizations automate the process of basic password and credential management. It includes a Web-based portal that lets employees automatically reset their passwords and it issues reminders when the passwords expire. The self-service function aims to remove those tasks from IT help desks. The new tool also can synchronize passwords across platforms and provides search capability.

ManageEngine, which offers a broad set of IT help desk, Active Directory and network, server and app management tools, joins a wide field of providers offering single sign-on capabilities, ranging from major players such as Microsoft, Google, Salesforce.com and VMware to a number of players that specialize in identity and access management (IAM) tools, including Centrify, OneLogin, Okta and PingIdentity, among others.

The ADSelfService offering provides authentication using Active Directory domain password policies that can be applied to all applications, according to the company. The single sign-on capability also works for applications that don't support SAML and provides password management for iOS and Android-based devices. The company offers a free edition for shops with fewer than 50 users. The professional edition, which can manage up to 500 user accounts, costs $595 per year.

 

Posted by Jeffrey Schwartz on 03/18/2016 at 11:44 AM


Okta CEO: Microsoft Is Losing in Enterprise Mobility

Microsoft in recent years has tried to play nice with competitors, making it all the more surprising when it took a perceived potshot this week at one rival apparently encroaching too close to its turf. Okta, which offers single sign-on and device management tools that have emerged as a threat to Azure Active Directory, was notified by Microsoft this week that its request to sponsor and exhibit at this fall's Ignite show was denied due to competitive conflict.

Since word of the snub was first reported by Business Insider, Microsoft said it has reconsidered Okta's application and the company is now invited. But it wasn't clear why Microsoft made the unusual move in the first place. "We have a lot of criteria we consider when reviewing sponsorships," a Microsoft spokesman said in an e-mailed statement. "We revisited the historical criteria and determined that Okta qualified for sponsorship opportunities. We've reached out to Okta and look forward to seeing them at Ignite."

Given the fact that Okta has sponsored Ignite (and TechEd) for years, as well as Microsoft's Worldwide Partner Conference and Build, the sudden rejection came out of the blue, said Okta Cofounder and CEO Todd McKinnon. In an interview, McKinnon said he was stunned.

The move was especially galling to McKinnon given Okta's close ties with Microsoft's Office 365 team. The two have worked together to migrate large customers. "We've been really good partners with Microsoft, mostly around Office 365, and we've sponsored their shows and they've been really good shows," he said. "I know they're competitive with us in certain parts of the business, but this is a little bit out of the blue and unexpected."

However, McKinnon said identity management solutions have been a thorn in Microsoft's side. Microsoft has put a large focus on its Enterprise Mobility Suite, which, in addition to Azure Rights Management, Intune and Azure Active Directory for user management, aims to offer single sign-on. Microsoft COO Kevin Turner last year identified EMS as a potential $1 billion product for Microsoft. As reported in our October cover story, Okta and a number of third-party providers believe they have reasonable alternatives, or add-ons, to Azure AD, including PingIdentity, OneLogin and Centrify, as well as password-vaulting platforms from the likes of BeyondTrust and CyberArk Software, among others. Some, according to our story at the time, are considering a move away from Active Directory, still the most widely used enterprise directory for authorization, credentials and permissions management. So far, it appears most organizations intend to rely on Active Directory, though to what extent over the long haul remains to be seen.

Given the threat Okta says it poses, McKinnon seems less inclined to believe this was the mistake of a junior-level person. "We see over the last year how they've ramped up their aggressiveness in competing with us," he said. "They see this as really important now. They're losing and they don't like losing. We beat them eight out of 10 times, deal-for-deal, and that's even with them bundling in their products in their Enterprise Agreements for a pretty low price. We still beat them. They realize that if they can control the mobility and identity, then they can control choice, that's in their favor. They don't want to lose that control point."

While Okta is back on the guest list, the company is going to leave Microsoft hanging for now, though it's safe to presume the company will have a presence there. It's not clear if any other vendors were banned from sponsoring or exhibiting at Ignite.

Another key player coming after Azure Active Directory and EMS, VMware's AirWatch division, said it wasn't denied sponsorship or the opportunity to exhibit at Ignite. "We have a great relationship with Microsoft and we will be participating in this year's Ignite," an AirWatch spokeswoman said. Had Microsoft rejected VMware's AirWatch, it would have been a more marked reversal, coming off the company's unusual move to have Windows enterprise executive Jim Alkove become the first Microsoft executive to appear on stage at VMware's annual VMworld conference last summer in San Francisco. There, the two agreed to work together on ensuring compatibility with AirWatch, which now has its own Azure AD alternative.

Asked about that, Okta's McKinnon argued Microsoft had no choice, given the large base of AirWatch and VMware customers. "They put Office on the iPad because they kind of lost that," McKinnon said. "If they clearly lose they'll be open, but I think they still think they have a chance to win this one," referring to Azure AD and EMS. "They see it as a strategic control point."

 

Posted by Jeffrey Schwartz on 03/16/2016 at 2:04 PM


LANDesk Jumps into Virtual Desktops with AppSense Acquisition

LANDesk, a software provider long known for its patch management and endpoint security tools, this week said it has agreed to acquire AppSense for an undisclosed amount. AppSense gives LANDesk desktop virtualization management tools, filling an important hole in its portfolio.

AppSense also offers tools that gather telemetry data from the endpoints it manages and provide graphical presentations for systems managers. Building on its recent emphasis on added reporting and analysis capabilities, LANDesk intends to integrate the AppSense telemetry-gathering tool with its own management software, according to LANDesk CEO Steve Daly. LANDesk is one of the industry's older systems management tool providers; it was formed by Intel in the early 1990s (its history actually extends back to 1985, when it was known as LANSystems) and later spun off. The company has since had multiple owners. Recently LANDesk has faced stiff competition from enterprise mobility management and systems management providers, and it has responded by making a number of acquisitions over the past few years.

Among the deals that have helped LANDesk compete in mobile device management were the acquisitions of LetMobile in 2014 and Wavelink in 2012. Over the past few years LANDesk has also extended its patch management portfolio with the acquisition of Shavlik from VMware in 2013; other companies LANDesk has purchased include Naurtech and asset management supplier Managed Planet.

Daly said in an interview that LANDesk has historically focused on providing an operational approach to endpoint protection and acknowledged the company has had success in areas such as patch and device management, which includes controlling encryption. "As we've been going through our strategic planning process, we recognize one of the areas that's becoming operationalized really quickly is the idea of application control, and we're seeing a lot of our customers moving in the direction," Daly said.

Enterprise Strategy Group analyst Mark Bowker said he likes the acquisition but warned that LANDesk "is entering some bumpy IT territory." Moving forward, Bowker added, LANDesk "will need to leverage its market tenure and experience to exude confidence with senior IT and business executives if they want to break further into mobility strategies that include managing applications, data, devices and users."

Bowker also said LANDesk should step up its focus on providing security, not just operational management. "ESG research shows that security of endpoints and threat detection are the top two challenges when it comes to managing applications and end point devices," Bowker noted.

Daly said LANDesk plans to continue to acquire companies that can fill out its emphasis on application and user management.

Posted by Jeffrey Schwartz on 03/16/2016 at 2:01 PM


Google Extends Single Sign-On to Office 365

Google today said it has configured its identity and access single sign-on tool to work with some unlikely partners' SaaS offerings, notably Facebook and Microsoft. The company's new SAML 2.0-based Google Identity offering released in October can now provide single sign-on access to Office 365 as well as Facebook at Work through the use of Google Apps credentials.

In addition, Google Identity is also now configured to work with a number of other popular SaaS offerings from the likes of Box, Concur, Coupa, New Relic, Panorama9 and Slack, among others.

In all, the company claims Google Identity now works with hundreds of apps and services available in the Google Apps Marketplace. Google Identity came about back in October, when the company added SAML (Security Assertion Markup Language) 2.0 support to its existing OpenID Connect (OIDC) identity provider.
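For readers less familiar with how SAML-based single sign-on is typically wired up, the sketch below shows the service-provider side of the exchange: building a bare-bones SAML 2.0 AuthnRequest and encoding it for the standard HTTP-Redirect binding. This is a minimal, generic illustration using only the Python standard library; the entity IDs and URLs are placeholders, and it is not Google's or Microsoft's actual implementation (real deployments add request signing, metadata exchange and response validation).

```python
import base64
import datetime
import urllib.parse
import uuid
import xml.etree.ElementTree as ET
import zlib

SAMLP = "urn:oasis:names:tc:SAML:2.0:protocol"
SAML = "urn:oasis:names:tc:SAML:2.0:assertion"

def build_sso_redirect(sp_entity_id, acs_url, idp_sso_url):
    """Build a minimal SAML 2.0 AuthnRequest and return the URL a service
    provider would redirect the browser to (HTTP-Redirect binding:
    raw DEFLATE, then base64, then URL-encode)."""
    request = ET.Element(f"{{{SAMLP}}}AuthnRequest", {
        "ID": "_" + uuid.uuid4().hex,
        "Version": "2.0",
        "IssueInstant": datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ"),
        "AssertionConsumerServiceURL": acs_url,
    })
    issuer = ET.SubElement(request, f"{{{SAML}}}Issuer")
    issuer.text = sp_entity_id

    xml_bytes = ET.tostring(request)
    compressor = zlib.compressobj(9, zlib.DEFLATED, -15)  # raw DEFLATE, no zlib header
    deflated = compressor.compress(xml_bytes) + compressor.flush()
    saml_request = base64.b64encode(deflated).decode("ascii")
    return idp_sso_url + "?" + urllib.parse.urlencode({"SAMLRequest": saml_request})

# Placeholder values for illustration only.
print(build_sso_redirect(
    sp_entity_id="https://sp.example.com/metadata",
    acs_url="https://sp.example.com/saml/acs",
    idp_sso_url="https://idp.example.com/sso"))
```

In a real deployment, the identity provider authenticates the user and returns a signed assertion to the service provider's assertion consumer service (ACS) URL, which is validated before a session is established.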

Google Identity brought single sign-on capability, managed through the Google Admin Console, to SaaS offerings in Google's marketplace. At the time of its release, Google had indicated that Office 365 was among those in the queue.

Administrators can also combine Google's identity services with Google Apps enterprise mobility management controls, such as setting thresholds for password strength, lock-screen requirements and application management, according to today's announcement. Google also published details on how to integrate its identity provider with other apps or systems.

The identity services also support Google's Smart Lock multifactor authentication capabilities available with Google Accounts, and Google offers identity services for enterprise mobility management offerings as well.

Posted by Jeffrey Schwartz on 03/14/2016 at 11:56 AM


AWS Cloud Storage Service Turns 10

Today marks the 10th anniversary of Amazon Web Services' (AWS) Simple Storage Service (S3) and the company is marking the occasion by noting some key milestones. In many ways, the launch of S3 on March 14, 2006 signaled the beginning of the cloud era.

Few IT professionals were aware at the time of the shift in computing that was about to take place. As Amazon saw it, storage and compute should have no limits in terms of capacity and scalability, and they should be accessible at much lower cost. "Amazon S3 is based on the idea that quality Internet-based storage should be taken for granted," Andy Jassy, AWS vice president, said at the time in the press release announcing S3's general availability. In the decade since its launch, Amazon claims S3 hosts trillions of objects and, at peak, processes millions of requests per second.

When Amazon launched S3, the price was $0.15 per GB stored per month and $0.20 per GB for data transferred. Standard S3 storage now costs $0.03 per GB per month -- 80 percent less -- while its Glacier archival storage is available for less than a penny per GB per month.
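As a quick sanity check on those figures (using only the per-GB monthly storage rates cited above and ignoring request and transfer charges), a few lines of arithmetic reproduce the claimed reduction:

```python
# Storage rates cited above, in dollars per GB-month.
launch_rate = 0.15   # March 2006
current_rate = 0.03  # March 2016

reduction = (launch_rate - current_rate) / launch_rate
print(f"Price reduction: {reduction:.0%}")  # 80%

# Illustrative monthly storage bill for 1,000 GB at each rate
# (storage only; transfer and request charges ignored).
for label, rate in (("2006", launch_rate), ("2016", current_rate)):
    print(f"{label}: 1,000 GB x ${rate:.2f} = ${1000 * rate:.2f}/month")
```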

The EC2 compute service came five months after the launch of S3. Among other milestones highlighted by the company were the launch of the AWS Management Console in January 2009, the Relational Database Service (RDS) later that year and the April 2012 introduction of the AWS Marketplace, which allowed for the purchase of preconfigured tools and applications. The AWS Marketplace is now divided into 35 categories with more than 2,500 software offerings from upwards of 800 ISVs.

Amazon for the first time broke out the revenues and earnings for AWS a year ago, claiming it was a $5 billion business. Although there are thousands of cloud providers throughout the world offering services of all types and scope, AWS still is the largest and leading cloud provider, with Microsoft Azure at No. 2 and Google in third -- the only other likely player that can rival the infrastructure the other two now offer.

Nevertheless, AWS maintains that edge and S3 is a key part of that. "At this point it's the widest, most used service, even in companies where a lot of people aren't aware they're using it just because it's such a simple obvious place to use the cloud," said Michael Fauscette, chief research officer at G2 Crowd, a market research firm.

A recent survey conducted by G2 found higher customer satisfaction ratings for S3 than for Azure, Amazon's closest rival among enterprise customers. Customer satisfaction with S3 was 96 percent and with EC2 91 percent, while Azure as a whole scored 83 percent, according to the survey.

Amazon S3 reviewers highlighted the product's price and availability as the key reasons for using the service, according to the report. Respondents said their primary uses of S3 are hosting static Web pages, storing backup data, and syncing or sharing data across multiple locations. Among those saying why they like Azure (the entire service, not just storage), compatibility with Microsoft software, scalability, the availability of support, networking options and the ease of configuration and management were cited as the primary benefits of the offering.

AWS customers ranked S3 highest for availability, scalability and its overall data storage architecture, while the biggest issues they have with the offering are user management, configuration management and logging, according to the report. For Azure, customers gave the highest marks for high availability, scalability and integration with Microsoft's Visual Studio development platform, while the biggest complaints were with user and configuration management and Docker integration.

Fauscette acknowledged the differences in the way customers procure cloud services from the two companies. "When you're using AWS, you're making a specific decision around a specific offering like storage but with Microsoft you may not be making that explicit decision because you may have decided to buy the entire platform," he said. "It may simply be an awareness issue."

Naturally, it also depends on who is surveyed and by what measure. A technical comparison last year by storage service provider Nasuni of Azure Blob Storage versus S3 gave Microsoft's offering the edge.

Amazon and Microsoft were the only companies ranked by Gartner last year as leaders in its Magic Quadrant, which said that Amazon's infrastructure services have a multi-year advantage over Azure. It'll be interesting to see how the two, and others, stack up this year.

Posted by Jeffrey Schwartz on 03/14/2016 at 12:34 PM


DoJ Accuses Apple and Supporters of Hypocrisy

The U.S. Department of Justice has taken the gloves off, accusing Apple -- and the slew of tech companies including Microsoft that supported Apple -- of hypocrisy for refusing to comply with a court order that it help the FBI unlock the iPhone used by a suspected terrorist in December's San Bernardino, Calif. shootings that killed 14 people.

In a sharply worded filing yesterday, the DoJ, the nation's top prosecutor, lambasted Apple for its response to a California federal district court order. In its objection the DoJ contended that Apple has previously helped U.S. law enforcement, as well as investigators in other countries including China, access information on iPhones.

The filing also accused Apple and those who filed amici briefs of confusing the issue by arguing it would set legal precedent for governments to build backdoors into devices and Internet infrastructure.

"Apple's rhetoric is not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights: the courts, the Fourth Amendment, longstanding precedent and venerable laws, and the democratically elected branches of government," the filing stated. "Apple and its amici try to alarm this Court with issues of network security, encryption, back doors and privacy, invoking larger debates before Congress and in the news media. That is a diversion. Apple desperately wants -- desperately needs -- this case not to be 'about one isolated iPhone.' But there is probable cause to believe there is evidence of a terrorist attack on that phone, and our legal system gives this Court the authority to see that it can be searched pursuant to a lawful warrant. And under the compelling circumstances here, the Court should exercise that authority, even if Apple would rather its products be warrant-proof."

The DoJ also disputed Apple's claim that the All Writs Act is "an obscure law dredged up by the government to achieve unprecedented power. That premise is false."

Apple General Counsel Bruce Sewell told reporters on a conference call that the filing had no merit and called it a cheap shot. "In 30 years of practice, I don't think I have ever seen a legal brief that was more intended to smear the other side with false accusations and innuendo, and less intended to focus on the real merits of the case," he said, calling its response "desperate," containing "rhetoric" that was "unsupported and unsubstantiated." He also added that "the tone of the brief reads like an indictment."

It was clear last week at the RSA Conference in San Francisco that the DoJ disagreed with Apple's position when Attorney General Loretta Lynch discussed her views during a conference session.

 "I respect Tim Cook a great deal and his views will heard in court and they'll be decided in court," Lynch said. "I think we have to decide also as a country, in this conversation, do we let this company, no matter how great the company, no matter how beautiful their devices, do we let one company decide this issue for all of us? Should one company say 'this is how investigations are going to be conducted and no other way?' We don't do that in any other areas."

Apple's Sewell is expected to file a response next week and on March 22 both sides will face off in court.

 

Posted by Jeffrey Schwartz on 03/11/2016 at 1:42 PM


Most Large Enterprises Moving to Cloud E-Mail Opt for Office 365

The number of large enterprises that have moved to cloud e-mail is still relatively small, though growing, according to a recent Gartner survey, which found that 80 percent of those that do move are going with Office 365. However, the survey showed that smaller shops are evenly split between Google Apps for Work and Microsoft's Office 365.

Despite that growth, only 13 percent of publicly traded companies have moved to a cloud e-mail service from either Google or Microsoft: 8.5 percent use Office 365, while 4.7 percent use Google Apps for Work. The rest of those surveyed either run e-mail on-premises, use hybrid hosting or have e-mail hosted by smaller providers, according to Gartner.

The 80 percent figure applies to large companies -- those with more than $10 billion in revenues -- that have moved to cloud e-mail, according to Gartner. "Google's popularity is better among smaller companies, approaching a 50 percent share of companies with revenue less than $50 million," said Gartner research VP Jeffrey Mann, in a statement.

It appears both Google and Microsoft have found niches in certain industries. Microsoft's edge is among utilities, energy and aerospace companies, while Google's stronghold is in publishing, retail, advertising, media, education, consumer products and travel, according to the survey. Gartner noted those choosing Google tend to be companies bound by fewer compliance requirements.

Not surprisingly, certain industries have gravitated to the cloud faster than others. Gartner said more than one third of the largest companies in the travel and hospitality, professional services and consumer products sectors have moved their on-premises e-mail infrastructure to either Google's or Microsoft's offerings.

 

Posted by Jeffrey Schwartz on 03/09/2016 at 9:59 AM


Google Boosts Data Loss Prevention in Gmail

Both Google and Microsoft know data leakage is a big concern among organizations considering the move to cloud-based e-mail and collaboration and both companies are now touting the data loss prevention (DLP) in their respective offerings.

Microsoft stepped up its support for protecting against data leakage last fall when it announced the rollout of DLP for both OneDrive for Business and SharePoint Online. For its part, while Google doesn't support DLP in Google Drive yet, the company said it is planning on offering it later this year. But Google last week said it has added new DLP capabilities for Gmail shops that use Google Apps Unlimited, which the company describes as the premium version of its Google Apps for Work offering.

Google started offering DLP capability for its Google Apps Unlimited customers in December, enabling admins to set policies so that users can't send or forward e-mail with certain types of information, such as social security numbers.

The upgraded DLP features announced last week add support for scanning documents using optical character recognition (OCR), predefined content detectors and new rule parameters. Specifically, the OCR capability can extract text from scanned images so administrators can apply DLP from the admin console at the organizational unit (OU) level, enforce content compliance and set rules for objectionable information. The new detectors can be set to discover personally identifiable information and data types that fall under HIPAA rules. Administrators can fine-tune parameters -- for example, permitting a message that contains a single credit card number while blocking bulk transmissions.
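A rule of that shape is easy to picture in code. The sketch below is a minimal, hypothetical illustration of the "single card number is OK, bulk is blocked" policy described above -- the regular expression and threshold are placeholders, not Google's actual detectors:

```python
import re

# Loose pattern for a 16-digit card-like number, allowing spaces or dashes.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){15}\d\b")
BULK_THRESHOLD = 5  # hypothetical cutoff for what counts as "bulk"

def dlp_action(message_body: str) -> str:
    """Return 'block' when card-like numbers appear in bulk, else 'allow'."""
    hits = CARD_PATTERN.findall(message_body)
    return "block" if len(hits) >= BULK_THRESHOLD else "allow"

print(dlp_action("Refund issued to card 4111 1111 1111 1111"))  # allow
print(dlp_action("4111111111111111\n" * 20))                     # block
```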

Posted by Jeffrey Schwartz on 03/08/2016 at 4:58 PM


Cisco Readies SDN-Based Hyper-Converged Infrastructure

Cisco Systems last week jumped into the market for hyper-converged infrastructure and software-defined storage, which is emerging as a popular new way to deploy software-defined compute, storage and networking with shared policy management and automation. The company made the move at the annual Cisco Partner Summit, where it also launched new Nexus switches and said it will acquire CliQr, which provides orchestration tools to manage multiple cloud environments.

By launching its own hyper-converged infrastructure, Cisco is taking on a market now dominated by a number of startups such as GridStore, Maxta, Nutanix, Scale Computing and SimpliVity. Until now, Cisco has relied on its partnership with SimpliVity to offer that capability, and it reportedly tried to acquire both SimpliVity and Nutanix, according to CRN. Cisco's new offering, called HyperFlex Systems, runs on the Cisco UCS platform. HyperFlex uses SpringPath Inc.'s hyper-convergence software, which virtualizes servers into a single pool of compute and storage resources. The SpringPath Data Platform that Cisco is using in its HyperFlex Systems eliminates the need for network storage, instead tying automatically into existing management software. It uses what SpringPath calls "adaptive scaling capabilities," allowing organizations to scale their compute, caching or storage resources. SpringPath said its monitoring capabilities are designed to ensure uptime.

In addition to integrating with Cisco's UCS servers, the HyperFlex solution integrates with the company's Nexus switches and Application Centric Infrastructure (ACI), Cisco's fabric for providing monitoring, automation and policy management. The first products in the new HyperFlex line, called the HX Series, are available in three models, starting with a 1-rack-per-node unit with two processors per node, a 480GB SSD and six 1.2TB HDDs. At the high end are units configurable from two to six racks with four processors per node, a 1.6TB SSD and 23 1.2TB HDDs. HyperFlex is designed for both current applications and those based on microservices architectures and containers, according to Satinder Sethi, Cisco's VP for data center solutions engineering and UCS product management.

 "With Cisco HyperFlex, we're delivering the capabilities customers tell us they've been waiting for in a hyper-converged solution," Sethi said in a blog post. "By extending our strategy of software defined, policy driven infrastructure to hyper-convergence, Cisco will accelerate mainstream adoption of this valuable technology and provide customers a future-ready platform for evolving applications."

Cisco also launched several upgraded versions of its Nexus switches. At the high end for cloud-scale requirements, Cisco claims its new Nexus 9000 offers a 25 percent boost in non-blocking performance, real-time network telemetry at 100Gbps and the ability to scale IP addresses up to 10 times and support for more than 1 million containers per rack.

Also last week Cisco said it has agreed to acquire CliQr Technologies Inc. for $260 million in cash. CliQr offers a cloud modeling, deployment and orchestration tool for building out private, public and hybrid clouds. Cisco noted that CliQr already supports ACI and UCS.

"Customers today have to manage a massive number of complex and different applications across many clouds," said Rob Salvagno, Cisco's VP of corporate development, in a statement. "With CliQr, Cisco will be able to help our customers realize the promise of the cloud and easily manage the lifecycle of their applications on any hybrid cloud environment."

Posted by Jeffrey Schwartz on 03/07/2016 at 2:19 PM


IT Industry Mourns Death of E-Mail Inventor Ray Tomlinson

The sender of the first e-mail between two different computers, Raymond Tomlinson, died over the weekend. Tomlinson famously chose the "@" sign, followed by a computer hostname and appended to a user name, as the syntax for addressing messages to users on other machines.

"It is with great sadness we acknowledge the passing of our colleague and friend, Ray Tomlinson, read a Raytheon statement. "A true technology pioneer, Ray was the man who brought us e-mail in the early days of networked computers. His work changed the way the world communicates and yet, for all his accomplishments, he remained humble, kind and generous with his time and talents. He will be missed by one and all."

Tomlinson was an engineer at Bolt, Beranek and Newman (BBN), a government contractor that is now part of Raytheon. He was part of a team working on software to create and upgrade the Advanced Research Projects Agency Network (ARPAnet), which was the first network to run TCP/IP and became the basis of today's Internet.

Until Tomlinson created the ability to send messages between machines, e-mail could only be exchanged among users of the same computer. He was working on improving an inter-user e-mail program called SNDMSG in which, like others of the era, "a mailbox was simply a file with a particular name," according to a Raytheon document Tomlinson authored. In that document he explained how he enabled sending a message from one person on one computer to someone else on another by improving SNDMSG and incorporating code from CPYNET, then an experimental file transfer protocol, initially over ARPAnet. Here's how it happened, in his words:

The idea occurred to me that CPYNET could append material to a mailbox file just as readily as SNDMSG could.  SNDMSG could easily incorporate the code from CPYNET and direct messages through a network connection to remote mailboxes in addition to appending messages to local mailbox files. The missing piece was that the experimental CPYNET protocol had no provision for appending to a file; it could just send and receive files. Adding the missing piece was a no-brainer -- just a minor addition to the protocol.  I don't recall the protocol details, but appending to a file was the same as writing to a file except for the mode in which the file was opened. Next, the CPYNET code was incorporated into SNDMSG.  It remained to provide a way to distinguish local mail from network mail.  I chose to append an @ sign and the host name to the user's (login) name.  The @ sign seemed to make sense.  The purpose of the @ sign (in English) was to indicate a unit price (for example, 10 items @ $1.95).  I used the @ sign to indicate that the user was "at" some other host rather than being local.

The two messages first sent in 1971 involved two computers side-by-side connected via ARPAnet, he recalled.
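The rule Tomlinson describes boils down to a one-line test on the address. Here is a minimal sketch of that routing decision, with purely illustrative user and host names (not the actual 1971 hostnames):

```python
def route_message(recipient: str, local_host: str):
    """Apply the rule Tomlinson describes: an '@' means the mailbox lives
    on another host, so the message goes out over the network; otherwise
    it is appended to a local mailbox file."""
    if "@" in recipient:
        user, host = recipient.rsplit("@", 1)
        if host == local_host:
            return ("local", user)
        return ("network", user, host)
    return ("local", recipient)

print(route_message("ray@remote-host", "this-host"))  # ('network', 'ray', 'remote-host')
print(route_message("ray", "this-host"))              # ('local', 'ray')
```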

Luminaries throughout the tech industry shared their condolences. "Very sad news," tweeted Internet pioneer Vint Cerf. And Google's Gmail team tweeted:  "Thank you, Ray Tomlinson, for inventing email and putting the @ sign on the map."

Tomlinson was inducted into the Internet Hall of Fame by the Internet Society.  "I'm often asked 'Did I know what I was doing?'" Tomlinson said during his induction. "The answer is: Yeah. I knew exactly what I was doing. I just had no notion whatsoever about what the ultimate impact would be."

Posted by Jeffrey Schwartz on 03/07/2016 at 1:47 PM


Apple vs. FBI Takes Spotlight at RSA Conference

Apple's refusal to help the FBI unlock an iPhone used by the suspected terrorist involved in December's mass shooting deaths of 14 people in San Bernardino, Calif., predictably took the spotlight at this week's annual RSA Conference in San Francisco.

The RSA Conference is considered the largest gathering of security and encryption experts, drawing 40,000 attendees. Some of the world's leading encryption experts sparred with each other, the nation's top law enforcement official and the head of the U.S. Department of Defense about Apple's refusal to unlock the phone and whether restrictions on encryption are necessary to protect against criminal and terrorist activity -- or to investigate prior crimes.

Microsoft President and chief legal officer Brad Smith used his RSA keynote address to double down on last week's promise to stand behind Apple's refusal to unlock the phone. "Businesses have a right to know so they can defend themselves, and it's why we at Microsoft are joining other companies across our industry to stand up for and stand with Apple in this new, important case," Smith said. "We need to stand up, be thoughtful and also be vocal. Despite the best of intentions, one thing is clear: The path to hell starts at the back door and we need to make sure that encryption technology remains strong."

In addition to Microsoft, industry giants including Amazon.com, Box, Cisco, Dropbox, Evernote, Facebook, Google, Mozilla, Nest, Pinterest, Slack, Snapchat, WhatsApp and Yahoo signed a 31-page amicus brief filed yesterday in support of Apple.

"The principal argument we make in our joint brief is straightforward," Smith said in a blog post. "The court order in support of the FBI request cites the All Writs Act, which was enacted in 1789, and last significantly amended in 1911. We believe the issues raised by the Apple case are too important to rely on a narrow statute from a different technological era to fill the Government's perceived gap in current law. Instead we should look to Congress to strike the balance needed for 21st century technology."

Associating the All Writs Act with this issue is a clear indication of the need to modernize laws to align the use of digital technology with privacy, Smith added. "What's needed are modern laws passed by our elected representatives in Congress, after a well-informed, transparent and public debate," he noted.

Some of the world's leading encryption experts debated the issue as well during the traditional Cryptographers' Panel at RSA. Among those taking issue with the FBI's stance was Martin Hellman, who played a key role in developing public key cryptography and is now a professor emeritus of electrical engineering at Stanford University. Hellman said he was filing an amicus brief in support of Apple asking the court to vacate the order. "I think it is a mistake," Hellman said during the panel discussion. "The danger is it will set a precedent."

Adi Shamir, who co-developed the RSA cryptosystem (the "S" in RSA) with Ron Rivest and Leonard Adleman, disputed the notion that the FBI has asked Apple to create a back door. "The FBI is asking Apple to do something very specific," said Shamir, who is now a professor of computer science at the Weizmann Institute in Israel. "The FBI will give Apple a particular phone and ask Apple privately to open up that particular phone. It has nothing to do with placing back doors in millions of telephones around the world. It's not an issue of mass surveillance. I think we are confusing the issue."

U.S. Attorney General Loretta Lynch, who spoke at an RSA session and fielded questions by Bloomberg TV anchor Emily Chang, said she was miffed by Apple's refusal to develop code to unlock the phone in this particular case. "The issue we are facing now is how do we, as an American law enforcement agency, fully investigate the worst terrorist attack on American soil since 9/11. Those issues are very important," Lynch said. "Up until recently, Apple maintained the ability to provide information to the government without any loss of safety or security of the data on their devices. It happens all the time, every day of the week all across America. This is a very different decision by Apple to not participate in that national directive."

Apple acknowledges it has done so in the past, but only for versions of iOS 7 or earlier. However, iPhones running iOS 8 or higher feature passcode-based encryption, designed to provide customers with higher levels of security and privacy. "We are no longer able to use the data extraction process on an iPhone running iOS 8 or later," the company said in a letter to customers last week. "Hackers and cybercriminals are always looking for new ways to defeat our security, which is why we keep making it stronger."

Lynch argued the FBI is requesting the software for this one phone, and it's not requesting that Apple turn over that software to law enforcement. "What we are asking them to do is to help us with this particular device, not to give the technology to us," she said. "They can keep it, they could destroy it, they could essentially be done with it and it would let us try and get into the phone. We don't even want them to be the ones to get into it."

Apple has argued it's not that simple. Apple General Counsel Bruce Sewell this week testified before a House Judiciary Committee that developing software to unlock this one device is effectively asking the company to create a back door for all phones. "They are asking for a back door into the iPhone -- specifically, to build a software tool that can break the encryption system which protects personal information on every iPhone," Sewell testified in his prepared remarks. "As we have told them -- and as we have told the American public -- building that software tool would not affect just one iPhone. It would weaken the security for all of them."

In a separate session at RSA, U.S. Secretary of Defense Ash Carter fielded questions from Ted Schlein, general partner with the venture capital firm Kleiner Perkins. Carter declined to speak of the Apple case specifically, but said: "I'm not a believer in back doors or a single technical approach to what is a complex and complicated problem. I don't think that's realistic, I don't think that's technically accurate."

Just before speaking at RSA, Carter appointed Eric Schmidt, former Google CEO and now executive chairman of parent company Alphabet, to lead a new Defense Advisory Board intended to help balance ensuring privacy with the security of the nation's defense infrastructure. "He knows that you can't have freedom, you can't have innovation, you can't take care of your families, you can't have a career if there isn't security," Carter said. "So somebody's got to provide security. It's a serious business. It's not a game."

Posted by Jeffrey Schwartz on 03/04/2016 at 1:00 PM


Microsoft HoloLens Heads Closer to Reality

Microsoft is now letting developers order its new HoloLens and related software, representing an important step toward bringing its "mixed-reality" glasses to market.

The HoloLens Development Edition, consisting of the glasses and software APIs, will ship March 30. The release of the kits is Microsoft's first and critical step toward letting developers build applications for HoloLens, which the company unveiled over a year ago and has touted ever since, while also keeping it at arm's length.

The $3,000 package is initially only available to Windows Insiders in the U.S. and Canada. The number of devices that'll be available appears limited, based on Microsoft's messaging, and the company is looking for those who will create broadly appealing applications that'll extend its Windows 10 operating system.

"Today represents a monumental step forward," said Microsoft Technical Fellow Alex Kipman in a blog post announcing the release. "This is the first step in our journey to consumers. A step focused on our commercial partnerships and on supporting developers, who will help pave the way to consumer availability with amazing and new holographic experiences."

Kipman claims Microsoft has seen significant interest from developers. Among those Microsoft has already announced as partners are Volvo Cars, Autodesk Fusion 360, Case Western Reserve with the Cleveland Clinic, Trimble and NASA. The NASA Mars Online Project, announced early last year, is now "mission operational," Kipman noted in his post.

Microsoft has huge ambitions for HoloLens, which the company describes as a Windows 10 device and APIs that allow developers to create holographic computing experiences. "With Windows, holograms are Universal Windows [Platform] apps and all Universal Windows apps can be made to work on Windows holographic," Kipman said. "Similarly, holographic apps in the Windows Store can be monetized in the same way that all other UWP apps are today."

The forthcoming kit will include development tools that allow programmers to create projects with Microsoft's Visual Studio, an emulator and, over time, the devices themselves. The HoloLens has sensors that are powered by the company's Holographic Processing Unit (HPU) and an Intel 32-bit architecture, he noted. Kipman described the HPU as custom silicon that lets the HoloLens recognize gestures.

HoloLens aims to extend the reach of Universal Windows Apps available in the Windows Store, including OneDrive, Maps, Remote Desktop, People, Movies & TV, Groove Music and Office apps.

"The Microsoft HoloLens Development Edition also gives developers access to a showcase of holographic app experiences to help get them started," he said. "These experiences are designed to demonstrate what the device can do, and how it operates, in order to inspire developers to create incredible things."

From Microsoft's standpoint, there's pent-up demand among developers for HoloLens. Whether or not that translates into customer demand remains to be seen. Now it's in the hands of developers.

Posted by Jeffrey Schwartz on 02/29/2016 at 11:20 AM


Q&A: Why Intel Authenticate Will Bolster PC Security

Intel is rolling out multifactor authentication (MFA) technology that will work in any new PC equipped with its 6th Generation Core processors, code-named "Skylake." Tom Garrison, VP and general manager of Intel's Business Client Products, recently outlined the new MFA technology, called Intel Authenticate. Garrison announced Intel Authenticate, which is available in preview now, during a late January press conference. Following the press conference, Garrison fielded our questions about the new technology and how Intel will work with Microsoft later this year to deliver and promote Intel Authenticate and the two companies' respective security capabilities. Also see the March Redmond magazine cover story about Intel Authenticate, which includes analysis and industry reaction.

Q: Can you talk about some of the co-marketing and joint development work you'll be doing with Microsoft?
A: Whenever we have new processors with new capabilities, we've got deep engineering engagement between the two companies. As they have either OS features or us with hardware features, we work collaboratively in that sense. That's the same way we're going with Authenticate. Our engagement right now has been very much focused on making sure that Authenticate [works] seamlessly with Windows 10, as well as Windows 7 and Windows 8.1. Longer term, from an Intel perspective, our goal is that Windows authentication will use the security capabilities that are built into the 6th Generation platforms and beyond. So, for example, if Windows 10 or whatever version of Windows 10 that the user is running on a 6th Generation platform, we would like the operating system to be able to know that it's talking to a machine that has hardened multifactor authentication capability and log in using that. It raises the level of trust of the platform. In the cases where the machine isn't capable of that -- maybe it's an older platform or it's a different architecture -- then it would log in with whatever capability the machine has. And in that case, it would be less secure than what we are providing with our machine, but then that would give choice to the IT decision makers who are buying the platforms. Longer term that's what we're working toward with Microsoft.

From an engineering perspective, we're absolutely engaged between our platform teams and we have security-focused individuals, as does Microsoft. There's no agreement on when we'd be able to do that, or if we will do that, but that is certainly our goal. It's beyond security, even. Our goal is always that we want the operating system to be able to take advantage of the capabilities that are in the hardware, and this is another example of that. So we want to make sure if we have a feature, like Authenticate, that the operating system can take advantage of it, and the user gets the value, in this case the value being a more secure security posture for the endpoint.

Microsoft points to security features including Passport and Hello, Azure Active Directory, BitLocker, enterprise data protection and Device Guard. Do any of those conflict, or compete, with what Intel is doing with Authenticate?
Some of them are completely unrelated. But, for example, with Hello versus Authenticate: Microsoft will give you a set of choices of which factors to use -- you can use face, a password and [others] -- but their implementation is based on what we call a software implementation, so it's visible at the software level and, therefore, you could be exposed to certain classes of attacks. Even with older platforms using username and password, we know there are lots of attacks worldwide. I think the estimate -- if I remember correctly -- is 117,000 attacks every day on corporations. There are lots of attacks, and almost all of those are software-based attacks. So when you use something like Hello, it is certainly better than a simple username-and-password solution, but what [Intel] Authenticate does is build on that even more. It's hardened multifactor authentication -- it puts it in hardware. What we're providing is an even better security capability because it's rooted in hardware and, therefore, all the software classes of attack -- simple phishing techniques, key-loggers, screen scrapers, those kinds of more traditional attacks -- will not work with Authenticate, because the credentials themselves are all stored in hardware. There are other classes of attacks where the credentials are actually removed from the PC when they're stored in the software layer. All of those classes of attacks are thwarted with Authenticate, but again, IT can make the choice for large businesses. They can choose which level of security they want to have. Traditional username and password is probably the least secure. Windows Hello improves on that with its facial login and some of the other attributes [being added]. And then Authenticate, we believe, is the level businesses should be looking to deploy to give the best security posture possible for their client endpoints.

At the [January press] briefing, you mentioned that Windows Hello "trains" Authenticate. Can you elaborate on what happens there and whether that was jointly engineered with Microsoft or if you just engineered that capability?
When you train the PC -- say your fingerprint as an example -- it's called enrollment. The partnership we have with Microsoft is very broad. We know how the enrollment works and our goal is obviously from a user-experience standpoint, to make sure that the experience is positive for the people that are using Authenticate so they don't have to have multiple enrollments. They can do it once and use it in a Windows 10 context or in an Authenticate context.

Where do you see the integration, or these capabilities, broadening?
I think the most obvious ones, at least the ones we're talking about right now: we'll continue to add authentication factors. Today you'll see machines with integrated fingerprint solutions -- Lenovo, for example, has a hardened fingerprint solution that takes advantage of Intel Authenticate today, built in with a hardened factor. You'll see Authenticate can also be used for what we call soft factors: factors where some element is visible to the operating system or other software. If, for whatever reason, there is a particular class of factor that an IT shop really wants, Authenticate is flexible enough that you can use any of those as part of your decision criteria. And obviously from our perspective, the more you can choose hardened factors, the better -- each of those individual factors is more robust and less likely to be compromised. But over time, you'll see more factors come in from us. We'll be adding things like hardened facial login. You'll see other biometrics come in from other OEMs -- I can't discuss the details, but suffice it to say, there will be other biometrics coming. And then, over time, we will be expanding beyond identity, which is what we're focused on today with Authenticate. Our strategy is to build upon the security capabilities of the PC, with data protection being an example of an area where we can add capability that significantly improves the protection of data -- in motion or at rest -- on a PC.

You're referring to encryption?
It is encryption, but it's encryption that would allow data to be stored securely on the PC. For example, if you were going to share information between one PC and another, you could do that in a more secure fashion, and then if your PC was somehow compromised through whatever class of attack, and the information was removed from your PC, that information would be fully encrypted and basically useless to whoever was trying to take advantage of it. Those are capabilities that are coming, but we're not talking a lot about them yet. It also is about how it interacts with how you would store data either on your PC or off your PC, as well. It's exciting stuff, and my point in raising this is that what we're doing with Authenticate right now is exciting -- it hits the major classes of attacks; more than half of attacks are related to stolen or misused credentials. That's why we focused on identity first. And we have a plan to innovate around the broader security of the platform beyond identity in subsequent platforms. As we continue to improve identity, we'll add capabilities beyond identity such as data protection, for example.

Trusted Platform Module (TPM), although it has been around for a while, has never gotten broad adoption. Is it possible Authenticate is ending up in the same type of scenario or do you anticipate it will be more broadly adopted?
Our goal is that it's more broadly adopted. I think TPM, in general, is an example of an interesting technology that wasn't, what I would say, broadly adopted. The reason I think [that] is the use cases of TPM were relatively limited. What we are trying to do with Authenticate is focus on a use case and a threat that is a broad exposure. And a use case that's done every single day multiple times a day. We are making the overall experience positive from a user standpoint. TPM is a solution that delivers a higher level of trust, but it's primarily the user who doesn't really care about it. It's the IT organization that would care about it. With Authenticate, you're getting the value of a more trusted machine, which is what the IT shop would care about. And, you're getting a better user experience because to the user it'll look like they don't have any more passwords. They don't need to remember complex passwords that are changing every 30 to 60 days. It's delivering that double value.

From an authentication standpoint, you're a member of the FIDO Alliance. At what point will Authenticate be FIDO-compliant?
Microsoft is obviously taking a leadership role there. We're working very closely with them so that everything that we're doing with Authenticate in terms of the various factors and so forth that we're enabling are all FIDO-compliant. We are working very closely with Microsoft to make sure that's the case.

I know it's difficult to promise an exact timeline but do you have any sense? Is it going to be a year from now before it's FIDO-compliant?
It's still evolving so it's impossible to forecast.

Once that's achieved, do you think that will be a key impetus in the broad adoption of this technology?
I think it's important, but I don't think people are saying, "I'm not going to do anything until I know this is FIDO-compliant." The FIDO element is an important aspect when it comes to the overall industry and how you coordinate and drive the industry forward on security, so that's important, and that's why we want to make sure that we're compliant in that regard. But I don't perceive any sort of waiting by customers to hear what comes out of FIDO 2.0 before they want to take action. I think as long as they know that companies like Intel and Microsoft are working together, and that our plan and our strategy is to continue to engage, be part of that FIDO 2.0 consortium and be compliant with it, that's fine -- that's all they need to know.

Can you elaborate on Intel Authenticate's support for System Center and Active Directory?
We want to make sure that with Authenticate you don't need a new set of tools. So we have provided plug-ins for McAfee EPO and Microsoft System Center Configuration Manager [SCCM], and with those plug-ins, whatever you're using -- whether it's SCCM or Active Directory or EPO -- you keep the tools you're used to and don't need to learn new ones. Through those plug-ins you now have the ability to do policy-based management of your fleet. So, for example, if you want to say that all 6th Generation platforms, in order to log in, must have a Bluetooth-enabled trusted phone and also a fingerprint solution, you can do that in a policy-based way with whatever tool you use, whether that's EPO or Active Directory or SCCM.

How do third-party authentication tools from companies like RSA fit in with Authenticate? Are you competing with them or do you see yourselves integrating with them?
No, we're actually working with them. I can't speak specifically about RSA, but in general we've been working with them. They're very interested in Authenticate because of the limitations of some of those other solutions -- hardware tokens, for example, are a pain point for their customers because of lost tokens or just the cost of having to replace or resynchronize them.

Given the credentials are stored in the hardware, can you describe what happens on the hardware?
Within our chipset, we have a management engine. That management engine does lots of different functions. For example, it provides the capabilities we have around vPro, so you can do out-of-band management when the operating system isn't available, or when you want to do some sort of patching or other things that are outside the operating system. That's where vPro's capabilities are rooted. But also on that same management engine, with Authenticate, we have the ability to store credentials. We also have the ability to put the IT policy engine there. So when IT decides, "I want to have a person use a Bluetooth phone and a fingerprint to log in," that's a policy decision, and they can make those policies, which are all stored in hardware. The actual biometric information itself is stored there as well, which was super critical to have it trusted and protected to the highest level possible. The combination of those three things -- the biometric information, the security credentials and the IT policy engine -- are all stored in this management engine, and this management engine is what does the work of Authenticate. So based on the information it has to work with, whether it's the biometric information, the policy engine and so forth, it will decide whether or not to issue the certificate. Those are all decisions that are handled there in the management engine.

It's below the operating system, so none of that information is stored in any way that would be available to hackers to compromise, steal or remove from the PC. Because it's in hardware, it sits below the operating system, the drivers and any other application software. None of those can see into the hardware space, which gives a much higher level of security and trust.
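To make that decision flow concrete, here is a minimal, purely illustrative sketch -- not Intel's API or firmware, just a model of what Garrison describes, in which an IT-defined policy plus the set of factors a machine has actually verified determines whether a credential is released. The factor names and policy shape are invented for the example.

```python
# Illustrative model only -- not Intel Authenticate's actual interface.
# It mirrors the described flow: IT stores a policy (required factors),
# the hardware verifies factors, and a credential is released only if
# every required factor checks out.
from dataclasses import dataclass


@dataclass(frozen=True)
class Policy:
    required_factors: frozenset  # e.g. {"bluetooth_phone", "fingerprint"} (hypothetical names)


def release_credential(policy: Policy, verified_factors: set) -> bool:
    """Return True only if all factors the policy requires were verified."""
    return policy.required_factors <= verified_factors


# Hypothetical policy for 6th Generation platforms: trusted phone + fingerprint.
policy = Policy(required_factors=frozenset({"bluetooth_phone", "fingerprint"}))

print(release_credential(policy, {"bluetooth_phone", "fingerprint"}))  # True: log the user in
print(release_credential(policy, {"fingerprint"}))                     # False: deny the login
```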

Posted by Jeffrey Schwartz on 02/26/2016 at 1:16 PM0 comments


Microsoft 'Wholeheartedly' Supports Apple's Stance Against FBI Demands

Microsoft President Brad Smith said his company "wholeheartedly" backs Apple's refusal to cooperate with the FBI's demand that it decrypt the data on an iPhone used by suspected terrorist Syed Rizwan Farook, who, along with his wife Tashfeen Malik, killed 14 people in December's San Bernardino, Calif. shooting attack.

Smith appeared yesterday before a U.S. House Judiciary Committee hearing investigating the contentious legal battle between the FBI and Apple. The clash has escalated since Apple CEO Tim Cook said Apple will not comply with a California federal district court order issued last week by Magistrate Judge Sheri Pym that required Apple to cooperate with the law enforcement agency's demands. The showdown between Apple and the FBI has pitted civil liberties proponents against those who believe Apple and the IT community have a duty to cooperate with law enforcement in the interest of national security, as reported Monday.

In his testimony, Smith backed Cook's proposal that Congress form a commission to investigate the issue, pointing out that the current laws are antiquated. "We do not believe that courts should seek to resolve issues of 21st-century technology with a law that was written in the era of the adding machine," Smith said in response to a question from Rep. Zoe Lofgren, D-Calif.

"We need 21st-century laws that address 21st-century technology issues," Smith continued. "And we need these laws to be written by Congress. We, therefore, agree wholeheartedly with Apple that the right place to bring this discussion is here, to the House of Representatives and the Senate, so the people who are elected by the people can make these decisions."

A transcript and video of Smith's response to Lofgren's question regarding Apple is available here.

In his prepared remarks, Smith pointed to the overall need to "update outdated privacy laws," including the Electronic Communications Privacy Act (ECPA), a key digital privacy law that was passed three decades ago. To showcase the changes in technology since then, Smith showed an IBM laptop with a monochrome display and a floppy disk, which was considered a modern system at the time, next to a new Surface Pro.

"When the U.S. House of Representatives passed that bill by voice vote on June 23, 1986, Ronald Reagan was president, Tip O'Neill was speaker of the house, and Mark Zuckerberg was 2 years old," he said. "Obviously, technology has come a long way in the last 30 years."

Until yesterday, Microsoft had been silent, at least publicly, on the dispute between Apple and the FBI, even though companies such as Facebook and Google backed Apple early on. It was nice to finally hear where Microsoft stands. Given its own actions when compelled by law enforcement, Microsoft's position isn't surprising.

Posted by Jeffrey Schwartz on 02/26/2016 at 10:37 AM0 comments


HP Launches Windows 10 Phablet at Mobile World Congress

HP is readying a new Windows 10-based combination of a phablet, laptop and desktop PC for business users, planned for release this summer. The HP Elite x3, revealed Monday at the annual Mobile World Congress (MWC) convention in Barcelona, could provide a promising new use case for Microsoft's Continuum technology and perhaps create a new niche for Windows 10 Mobile among business users.

The device is a six-inch Windows 10 Mobile phone with a WQHD (2560 x 1440) AMOLED multi-touch display. It will include built-in native support for Salesforce.com's Salesforce1 CRM offering and will come with HP software that supports virtualized application environments.

The new HP Elite x3 aims to bridge the functions of a phablet and a PC using Continuum, the technology introduced in Windows 10 and Windows 10 Mobile that allows users to connect large displays, keyboards and mice and run Universal Windows Platform apps from the Windows Store at desktop scale (traditional Win32 and .NET applications are delivered separately through virtualization, as described below). "HP's new Elite x3 is an innovative new Windows 10 mobile device," wrote Terry Myerson, executive VP of Microsoft's Windows and Devices Group. "With a unique new docking station, the Elite x3 brings Continuum to life in new ways, seamlessly transitioning between a phone, notebook or desktop."

The new HP Elite x3, powered by Qualcomm's Snapdragon 820 processor with support for LTE-based mobile networks, also uses Microsoft's Windows Hello biometric authentication technology. It's equipped with 4GB of RAM, a Cat 6 LTE modem and 2×2 802.11ac Wi-Fi. The phone offers 64GB of eMMC 5.1 storage, expandable up to 2TB via an optional SD card.

Among the other new mobile Windows 10 devices Myerson showcased at MWC were the Vaio Phone Biz, the Alcatel OneTouch, the Trinity NuAns NEO and the Huawei MateBook. Microsoft also announced its latest Lumia phone, the 5-inch, $200 Lumia 650.

This week's show also demonstrated that HP is developing more than just a phone that can plug into standard desktop peripherals using Continuum. The Elite x3 will include HP Workspace, software that will give users access to catalogs of x86-based apps delivered as a virtualized solution.

According to HP, HP Workspace is a client-side app backed by a group of services that give IT administrators the ability to manage a catalog of virtualized legacy apps for end users based on their Active Directory profiles. Organizations will be able to use Citrix XenApp or Microsoft Azure RemoteApp to provide virtual app experiences on the Elite x3 clients, the company said. It will also support monitoring and reporting.

The optional Desk Dock will include two USB-A ports, a USB-C port, an Ethernet port and a DisplayPort connection to let users connect external monitors. Another option is the HP Mobile Extender, which HP said gives the experience of a laptop with a near-zero-bezel 12.5-inch diagonal HD display, weighing in at about 2.2 lbs. The Mobile Extender won't store any data and users must authenticate from the Elite x3.

HP hasn't disclosed pricing or a release date for the Elite x3, but it's an interesting sign for Windows Phone fans who have seen Microsoft's mobile market share continue to decline. The fact that a player like HP is making a push with a new twist could at least help move the dial in the other direction, even if it only finds niche uses.

Don't expect to find this device at Best Buy or your local AT&T dealer. "This is for large businesses and organizations, not consumers," wrote Patrick Moorhead, president of Moor Insights & Strategy, who is at MWC in Barcelona. "HP, with its Elite x3, is attacking smartphone modularity and extensibility in the enterprise. HP is spot-on that mobile-compute power is skyrocketing, enabling PC-like capabilities. The Qualcomm Snapdragon 820 is an absolute beast of a performer, too. The reason I add the '-like' is that the x3 can't perform all PC use cases well, and if that's your bar, [you're] missing the point. The x3 isn't intended to replace your full PC experience just as the iPad Pro isn't intended to replace the Mac experience."

If you're a fan of Windows Phone, are you encouraged by the launch of this new device, and the hope that others will follow suit?

Posted by Jeffrey Schwartz on 02/24/2016 at 2:00 PM0 comments


Apple Pitches Commission to Balance Civil Liberties with National Security

Last week's court decision ordering Apple to develop the equivalent of a key that would unlock the encrypted data on a suspected terrorist's iPhone has quickly emerged as one of the most divisive issues to date in the IT industry, in politics and in the law enforcement community. It has pitted civil liberties proponents against those who believe Apple and the IT community have a duty to cooperate with law enforcement in the interest of national security. Of course, it's not that simple.

Apple has until later this week to file a response, in which it has said it will vigorously defend its decision not to comply with the order issued by California federal district court Magistrate Judge Sheri Pym. The order says Apple must help the FBI access the contents of the iPhone of Syed Rizwan Farook, who, along with his wife Tashfeen Malik, was responsible for December's attack that killed 14 people in San Bernardino, Calif.

FBI Director James Comey posted a statement on the agency's Web site aimed at downplaying concerns raised by privacy advocates. "The relief we seek is limited and its value increasingly obsolete because the technology continues to evolve," Comey stated. "We simply want the chance, with a search warrant, to try to guess the terrorist's passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That's it. We don't want to break anyone's encryption or set a master key loose on the land. I hope thoughtful people will take the time to understand that. Maybe the phone holds the clue to finding more terrorists. Maybe it doesn't. But we can't look the survivors in the eye, or ourselves in the mirror, if we don't follow this lead."

Apple CEO Tim Cook has doubled down today on his position, suggesting that the FBI should withdraw its demand. Instead, Cook suggested that the government should form a commission or panel of experts representing all aspects of the debate. It should consist of leading experts in intelligence, technology and civil liberties to weigh the implications of encryption on both national security and privacy, Cook said in a letter to employees this morning (acquired by BuzzFeed) and excerpted in a FAQ published on Apple's Web site.

"Our country has always been strongest when we come together," Cook wrote. "We feel the best way forward would be for the government to withdraw its demands under the All Writs Act and, as some in Congress have proposed, form a commission or other panel of experts on intelligence, technology, and civil liberties to discuss the implications for law enforcement, national security, privacy, and personal freedoms. Apple would gladly participate in such an effort."

In his FAQ this morning, Cook acknowledged that it's technologically possible to develop a way to access the encrypted information. "It is certainly possible to create an entirely new operating system to undermine our security features as the government wants. But it's something we believe is too dangerous to do. The only way to guarantee that such a powerful tool isn't abused and doesn't fall into the wrong hands is to never create it."

Indeed, that is a reasonable concern, said Aneesh Chopra, who served as the first U.S. CTO under President Barack Obama from 2009 to 2012. "There is no question in my mind, we have a much bigger concern about the rogue actor threat," Chopra told CNBC. "If you think of the honeypot that would be created if they knew that such a capability was possible, this would be like a race amongst all of the evildoers in the cybersecurity land to find out how Apple did it." Chopra said he wasn't concerned about a rogue government official accessing the capability but rather someone else. "Our law enforcement is world class and we have great confidence that they follow the rules and I believe it's a high capability there," he said. "But for the others in the system, it's just going to create a lot of havoc, and there already is pressure."

Not everyone outside of law enforcement believes Apple is acting altruistically. Among those backing the government's assertion that Apple is protecting its business interests was Scott McNealy, the outspoken longtime former CEO of Sun Microsystems.

"Apple's got a master stroke here," McNealy said, also speaking on CNBC this morning. "They are looking like they're for the privacy and liberties of folks, but I think they're also quite worried that any business in China is going to go away if the Chinese government think that the FBI gets a backdoor into an Apple phone that is sold in China. There's a lot of business. I know when Huawei was banned from the U.S. that created a negative backlash on the equipment providers here in the U.S. who are trying to sell into the Chinese market. So there's big economic reasons, as well as PR reasons, for what's going on here."

Among IT pros, it appears most are hoping Apple and other companies don't cave. Last week's initial report of the court's order generated numerous responses, most of which appeared to back that premise.

"Freedom and liberty comes with costs ... and those costs mean that the government cannot step on or endanger our freedoms just so they can open one iPhone," wrote Asif Mirza. "I'm not a big fan of Apple, but I am glad they are standing their ground. I just hope it does not crumble under them."

RocRizzo added: "If Apple opens a backdoor into iOS for law enforcement, every hacker, including those in China, ISIS, will have it. Other hackers will use it to exploit users, just as they have done with the new ransomware that has been going around. It will only be a matter of hours before those who the police claim to protect us from will have the means to break into THEIR encrypted iPhones as well.  If encryption is illegal, only criminals will have encryption."

Regardless of how you feel about Apple, Microsoft and others in the IT community, the outcome of this case will have a huge impact on every IT pro who manages devices and data running in the cloud.

Posted by Jeffrey Schwartz on 02/22/2016 at 12:17 PM0 comments


Microsoft Backs New IoT Standards Consortium

Microsoft is among nine leading companies that have formed the new Open Connectivity Foundation (OCF), a consortium launched today that aims to ensure interoperability of Internet of Things devices through standards. Among other initial backers of the new group are Arris, CableLabs, Cisco, Electrolux, GE Digital, Intel, Qualcomm and Samsung.

The OCF said it combines the work of the former Open Interconnect Consortium and the UPnP Forum, and is a nonprofit organization chartered with uniting key providers of silicon, software platforms and finished products around interoperability. The OCF's initial emphasis is on its sponsorship of IoTivity, an open source framework designed to deliver device-to-device connectivity.

A reference implementation of the OCF's IoTivity is available under the Apache 2.0 license. The initial IoTivity implementation includes documentation for Linux, Arduino and Tizen, but the OCF said that the code is portable, with future builds planned for additional operating systems. That will include Windows 10, according to Terry Myerson, executive vice president of Microsoft's Windows and Devices Group. "Windows 10 devices will natively interoperate with the new OCF standard, making it easy for Windows to discover, communicate and orchestrate multiple IoT devices in the home, in business and beyond," Myerson said in a blog post announcing Microsoft's participation in the group's formation. "The OCF standards will also be fully compatible with the 200 million Windows 10 devices."

Microsoft will provide APIs that will let developers integrate their software with OCF-compatible devices, he added. The OCF said that while there are various targeted efforts to address connectivity and interoperability of IoT devices, it doesn't see a single effort focused on addressing all of the requirements.

"The companies involved in OCF believe that secure and reliable device discovery and connectivity is a foundational capability to enable IoT," according to a FAQ on the group's Web site. "The companies also believe that a common, interoperable approach is essential, and that both standard and open source implementation are the best route to enable scale."

The OIC 1.0 spec, published last month, includes a core framework with security and trust protocols; a Smart Home Device reference, including the OIC Smart Home Resource Specification; and a resource specification that uses RAML (the RESTful API Modeling Language) to expose the resources' APIs and JSON schemas to define the payloads for the resource representations. The latter address device control, notification, environmental control, and energy management and saving, according to the spec. Remote access is provided via various standard protocols, including XMPP.
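As a rough illustration of what a JSON schema payload definition looks like in practice, here is a short sketch that validates a hypothetical smart-home switch payload using the widely used Python jsonschema package. The schema, property names and payload are assumptions for the example, not text lifted from the OIC specification.

```python
# Hypothetical example of a JSON-schema payload definition and validation;
# the schema and payload below are invented for illustration, not copied
# from the OIC 1.0 specification.
from jsonschema import validate  # pip install jsonschema

# Schema for a simple binary-switch resource representation.
switch_schema = {
    "type": "object",
    "properties": {
        "rt": {"type": "array", "items": {"type": "string"}},  # resource type(s)
        "value": {"type": "boolean"},                           # on/off state
    },
    "required": ["rt", "value"],
}

# A payload a device might report for its switch resource.
payload = {"rt": ["oic.r.switch.binary"], "value": True}

validate(instance=payload, schema=switch_schema)  # raises ValidationError on mismatch
print("payload conforms to the schema")
```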

Posted by Jeffrey Schwartz on 02/19/2016 at 12:42 PM0 comments


Apple Sets Stage for Huge Encryption Showdown with Feds

Apple CEO Tim Cook has set the stage for a showdown with law enforcement that will have significant ramifications for the future of encryption used to protect privacy on individuals' devices. A stunning court order issued last night requires Apple to help the FBI decrypt the iPhone of the suspected terrorist who gunned down 14 people in December's infamous attack in San Bernardino, Calif.

The FBI has reportedly tried on its own to break into the iPhone of Syed Rizwan Farook, who, along with his wife Tashfeen Malik, was responsible for the attack, using brute-force password-cracking techniques. After Apple refused to cooperate, the FBI went to court, and California federal district court Magistrate Judge Sheri Pym ordered Apple to help access the contents of the phone. The FBI is trying to determine whether the two attackers were part of a larger terror ring and whether additional attacks are looming, and it believes the contents of the phone could help answer those questions.

Cook responded swiftly and, predictably (based on his prior public statements on the issue in general), refused to comply with the order. Noting that Apple has no sympathy for terrorists, Cook stated:

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software -- which does not exist today -- would have the potential to unlock any iPhone in someone's physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The government argues it needs such backdoors to protect people, as Eileen Decker, the U.S. Attorney for the Central District of California, which covers San Bernardino, told The New York Times. "We have made a solemn commitment to the victims and their families that we will leave no stone unturned as we gather as much information and evidence as possible. These victims and families deserve nothing less."

Nevertheless, the order, if Apple or anyone else were to comply, could have a chilling effect on the future of privacy. The Electronic Frontier Foundation applauded Apple's defiance. "For the first time, the government is requesting Apple write brand new code that eliminates key features of iPhone security -- security features that protect us all," said Kurt Opsahl, the EFF's deputy director and general counsel, in a statement. "Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we're certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security."

The fact that Apple is refusing to help the FBI is not surprising, as it and other tech giants, including Microsoft and Google, have pledged to protect user privacy. The most noteworthy example until now was Microsoft's challenge to a court order demanding it provide access to e-mail in a Dublin datacenter. In the case of Apple, the government is asking the company to help it develop a backdoor.

It is already a huge national story that will likely work its way into the most divisive presidential campaign in recent memory. Given this case could reach the Supreme Court, it could also raise the tenor of the debate over filling the seat that's now open following Saturday's death of Justice Antonin Scalia.

So do you think Apple and other companies should create backdoors or do you support its refusal to comply?

Posted by Jeffrey Schwartz on 02/17/2016 at 1:16 PM0 comments


VMware Boosts Employee Privacy and IT Security in AirWatch Update

VMware's acquisition of AirWatch two years ago for $1.54 billion remains the company's largest buy. The company is now giving the popular mobile device management platform a boost aimed at improving employee privacy and IT security at the same time.

Company officials maintain AirWatch remains a separate business unit, but they also describe it as a core component of the VMware stack. Instead of calling it AirWatch by VMware, the company is calling it VMware AirWatch. Structural and naming issues aside, the company has tied the new VMware AirWatch 8.3 release more closely to its NSX network virtualization platform. The integration lets administrators set policies and app-level VPN access for native mobile apps, restricting what a mobile user and the app can reach, said Blake Brannon, VMware AirWatch's VP of product marketing.

Brannon said without this integration, "it puts organizations at risk where you've got a connection coming into the datacenter that sort of has the full flat level network access." AirWatch integrated with NSX provides the ability to "micro-segment" applications within the datacenter or cloud infrastructure. "I can have an administrator roll out a brand new application, spin up a new service on the back end to host that application out to the users and have the software layer open up the applicable connections to those back-end services through the networking software layer without ever involving IT, without ever involving a network admin and it can be solely done via the AirWatch console, so it greatly improves the overall security."

At the same time, Brannon argued that it provides more agility and automation by making it simpler for businesses to roll out new apps internally, with the flexibility of moving them or their associated workloads to a cloud-hosted datacenter. Asked why this makes apps more secure, Brannon explained that it ensures a user's device and the app are isolated and can only connect to a specific app server within the network; hence, it restricts access to other components in the network.
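To picture what that app-level isolation means in practice, here is a minimal sketch -- not the AirWatch or NSX API, just a generic allow-list model using assumed app and service names -- showing how a per-app policy limits which back-end service a managed app's tunnel can reach.

```python
# Generic illustration of per-app micro-segmentation (hypothetical names,
# not an AirWatch or NSX interface): each managed app may reach only the
# back-end services its policy lists.
ALLOWED_DESTINATIONS = {
    "expense-app": {"expense-api.internal:443"},
    "crm-app": {"crm-api.internal:443"},
}


def connection_allowed(app: str, destination: str) -> bool:
    """Permit the tunnel only if the app's policy lists the destination."""
    return destination in ALLOWED_DESTINATIONS.get(app, set())


print(connection_allowed("expense-app", "expense-api.internal:443"))  # True
print(connection_allowed("expense-app", "crm-api.internal:443"))      # False: blocked
```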

The new release also aims to make employees feel more comfortable letting IT manage user-owned devices. One way it does so is with a new FAQ Web site that shows what administrators can and can't do to their devices (e.g., whether they can access private data or remotely wipe personal photos or other content).

Also new in AirWatch 8.3 is single sign-on support, which includes the ability to log in without a password by using Microsoft's Passport for Work functionality built into the Windows 10 Pro and Enterprise editions. Microsoft describes Passport for Work as "an enhanced version of Microsoft Passport that includes the ability to centrally manage Microsoft Passport settings for PIN strength and biometric use through Group Policy Objects (GPOs)."

AirWatch 8.3 lets administrators configure Passport on a Windows device without having to join it to an Active Directory domain to set up group policies and other configurations, Brannon said. This is especially important for organizations with devices that are in the field. "It's a struggle in general to join them to the domains. In some cases the EMM approach to managing them is just a cleaner, lighter, more agile way to manage the device to begin with, as opposed to having to deal with joining it to a domain and having the network connectivity requirements and challenges for updates to it," he said. "Obviously Passport for Work gives you security benefits with credentials, using a different way to sign in to these applications as opposed to using passwords."

It's noteworthy that Microsoft and VMware worked closely to provide this integration, having talked up their partnership back in August, when Windows enterprise executive Jim Alkove became the first Microsoft executive to appear on stage at VMware's annual confab.

At the same time, Microsoft considers its own Enterprise Mobility Suite (EMS) one of its fastest-growing products. Microsoft Corporate VP Brad Anderson has argued on numerous occasions that organizations don't require a third-party EMM suite when using EMS.

Posted by Jeffrey Schwartz on 02/17/2016 at 1:21 PM0 comments


Microsoft Adds SDK, PaaS and DevOps to Azure Stack Preview

Microsoft this week rolled out additional features to its Azure Stack Technical Preview. The release comes just two weeks after the long-awaited debut of the Azure Stack preview, which promises to let organizations and service providers run, in their own clouds, software similar to what Microsoft uses to operate its Azure public cloud datacenters.

The new features, rolled out on Monday, include the Azure SDK, with Windows PowerShell and cross-platform CLI support; the Web Apps feature of the Azure App Service; SQL and MySQL database resource providers designed to support the Web Apps data tier; and, for developers, native Visual Studio support.

"We're making additional Azure PaaS services and DevOps tools available for you to deploy and run on top of your Technical Preview deployments," according to a blog post. "This represents the first installment of continuous innovation towards helping you deliver Azure services from your datacenter."

There are now over 70 Azure services. When Microsoft released the Azure Stack Technical Preview in late January, the company signaled it would roll out functionality or services incrementally. The initial release consisted of the core infrastructure-as-a-service stack, said Jeff DeVerter, chief technologist for Rackspace's Microsoft practice. DeVerter shared his observations during an interview yesterday at the Rackspace: Solve conference in New York.

"They initially gave us storage, compute, SDN was there and Resource Manager was there. It was literally a handful of things. And then what they're doing is lighting up additional features as resource packs after the fact, which is what they did this week when they gave database as a service for both SQL and MySQL databases and the Web sites component. The behavior this week suggests they're going to give it in packs. Which gives you some idea as to how Azure is built. It's built as a service framework with capabilities that hang off it."

DeVerter, who said the Azure Stack Technical Preview "is pretty clean," believes the quick release of the second set of features is a positive sign. "It bodes well," DeVerter said. "It also tells you that at some point they will be comfortable with that service framework that is Azure."

The Azure Stack Technical Preview does have one surprising limitation: for the purposes of the technical preview, Microsoft has limited it to running on a single hypervisor. Also, Rackspace had to acquire new hardware to deploy it in its test labs -- it picked up quad-core servers with 128GB of RAM. It's too early to say how much hardware Azure Stack will require once it goes GA in the fourth quarter, he said, since the current preview has the single-hypervisor limit.

Are you testing Azure Stack? Share your observations.

Posted by Jeffrey Schwartz on 02/12/2016 at 10:44 AM0 comments


Is the IT Bubble Bursting Again?

Some major tech companies saw their shares plummet late last week. Many of those falling hardest are the ones that have appeared overheated for some time. Despite the unemployment rate dropping to 4.9 percent, the lowest since 2008, the number of new jobs gained last month fell short of the 185,000 expected, with only 155,000 added. Meanwhile, plunging oil prices, less-than-stellar fourth-quarter earnings and economic jitters over troubles in China and other emerging markets have plagued the markets since the beginning of the year. All of this is triggering concerns that tech spending and IT hiring will be hit hard this year, though some believe such fears are overstated.

Shares of data analytics, security and especially cloud providers took a hard hit Friday and continued to fall today. Tableau, whose value was cut nearly in half Friday after a disappointing quarterly earnings report, was down about 10% today. Splunk, down 20% Friday, traded down another 12% today, as did security vendor Palo Alto Networks, which dropped 20% Friday and fell 9% today. Major online and cloud provider Amazon was down 13% last week and 3% today; Salesforce.com fell 13% Friday and 7.6% today; Workday plunged 16% Friday and another 10% today; and ServiceNow fell 11% Friday and 10% today. Overall, the tech-heavy Nasdaq fell 1.8% today and the S&P 500 was down 1.4%.

Meanwhile, the economic headwinds have taken their toll on IT hiring. According to the Bureau of Labor Statistics (BLS), only 5,500 IT jobs were added in January, down from 6,100 in December, which in turn was only half the 12,300 added in November. In a breakdown by Foote Research Group, which tracks IT hiring, the segments that generated 93 percent of the IT jobs added over the past year accounted for all of the new IT jobs in December. Management and Technical Consulting Services added 2,200 jobs, sharply down from the 4,400 monthly average in 2015. Computer Systems Design/Related Services gained 3,400 jobs last month, just slightly less than the 3,800 added in December but sharply below the segment's 6,842 monthly average for 2015.

Foote Partners Chief Analyst David Foote noted that there were volatile swings in IT job demand both last year and at the beginning of this year, but he sees some favorable indicators for IT employment. "It appears that IT employment is under a certain amount of pressure this year even though economists are suggesting that the American economy is holding up well despite a slowdown in China, growing risks in emerging markets and turmoil in the stock market," Foote said. "The financial markets are leery but the labor market still looks like it's continuing to grow." Foote Partners' compensation benchmark report shows salaries and demand for IT skills remain strong across various areas, particularly cloud and mobile platforms, database and data analytics, security, app development and a variety of methodology and process skills.

Adding to the mix is that it's a presidential election year, which always creates uncertainty. Nevertheless, it doesn't appear the volatile markets will have any bearing on shifts in technology, either from the end-user (client device) or the datacenter (infrastructure and cloud) perspective, nor should they affect this year's priorities.

How is the latest market meltdown affecting your views on IT demand and hiring?

Posted by Jeffrey Schwartz on 02/08/2016 at 1:35 PM0 comments


Microsoft To Address Power BI Web Preview Security Concerns

Microsoft this week released the preview of a new feature for its Power BI data visualization tool that lets users publish dashboards to the Web. However, its initial lack of IT controls quickly raised security concerns among IT pros, which the company is promising to address.

The new Power BI "publish to Web" preview lets users publish their data visualizations to Web sites or blogs but it currently lacks the ability for IT administrators to disable it, which could allow users to unwittingly or intentionally publish data not intended to leave their organizations.

A number of critics used the comments section of the announcement by Microsoft group product manager Faisal Mohamood to lament the inability to disable the feature. "Please provide admin security for this Preview immediately," wrote Craig Piley -- a request echoed by several others, who otherwise lauded the new feature's capability. "I understand the benefit of this ambitious functionality, yet any Power BI user now has the ability to share confidential data outside the organization by simply bypassing a warning message. This presents immense risk for companies, especially those in healthcare."

Responding to the concern yesterday, Mohamood promised the ability to disable the feature will arrive next week. "We released the publish to Web capability in preview, and are working on a variety of additional updates including the ability for admins to enable/disable," he wrote. "The ability to enable/disable publish to Web will arrive next week, worldwide."

Despite that missing feature, the ability to publish data visualizations to the Web from Power BI is likely to be a popular one. "As the amount of data generated around us continues to accelerate at a blistering pace, there is a tremendous desire and need for the ability to present data in visually engaging ways," Mohamood stated. "This has influenced the way we tell stories and share data insights online. The best stories are interactive, include rich data and are visual and interactive. A significant number of bloggers, journalists, newspaper columnists and authors are starting to share data stories with their audience for an immersive experience."

Among the benefits, he said:

  • You can connect to hundreds of sources -- files, databases, applications and public data sources are all easy to connect to and use in your data stories.
  • Your reports can auto-refresh and stay up to date.
  • You can visualize the data with utmost flexibility, using custom visuals.
  • You have ownership and manageability of the content you publish; administrators will have control over this capability as well.

Mohamood pointed to a number of organizations that have already started using the new feature including The Elizabeth Glaser Pediatric AIDS Foundation (EGPAF), the Grameen Foundation, Water1st International, J.J. Food Service, SOLOMO Technology, Inc., The Civic Innovation Project and Microsoft's Bing Predicts team, which among other things used Power BI to analyze the final debate among Republican presidential candidates in Iowa.

Posted by Jeffrey Schwartz on 02/05/2016 at 5:14 PM0 comments


Google, the World's Most Valuable Company, Seeks Cloud Dominance

Alphabet, the parent company of Google, has unseated Apple as the world's most valuable publicly traded company, and it appears to be growing faster than analysts expected, with no slowdown in sight.

It's the first time the company has broken out results for its businesses since creating the new structure back in August. Alphabet reported revenues of $21.3 billion for the fourth quarter of 2015, up 18% over the same period a year earlier and $500 million higher than analysts had forecast for the quarter. Revenues for the full year of roughly $75 billion increased 14% over 2014. Operating income for the core Google segment of $23.4 billion was up 23%, with Google's core business showing a 38% operating margin, up from 35% in the prior year.

It was little surprise that the core Google business -- which includes its search engine, advertising, Android, YouTube and Gmail -- accounted for the bulk of revenues. But having insight into the non-core businesses gave investors a sense of how the flagship operation is performing. It also showed to what extent Alphabet's so-called "other bets" are putting a drag on margins. For the entire year, Google's revenues were $74.5 billion with operating income of $23 billion, compared with 2014's revenues of $65.7 billion and operating income of $19 billion. Revenue for "other bets," also described as "moonshots," was $448 million, up from $327 million. However, those "other bets," which include businesses such as its driverless cars, Nest home automation, Fiber and Verily healthcare, showed a loss of $3.6 billion compared with 2014's $1.9 billion.

                                              Twelve Months Prior to    Twelve Months Prior to
                                              December 31, 2014         December 31, 2015
Google Segment Revenues (in millions)         $65,674                   $74,541
Google Operating Income (in millions)         $19,011                   $23,425
"Other Bets" Revenue (in millions)            $327                      $448
"Other Bets" Operating Loss (in millions)     $1,942                    $3,567

While the company talked up a growing Google Apps business, it's not viewed as a huge moneymaker, despite the fact that Google has one of the world's largest-scale infrastructures. Google has already signaled it's going to step up its enterprise efforts with the hiring of VMware co-founder Diane Greene.

Google CEO Sundar Pichai emphasized the company's goals during Monday's earnings call with analysts. "We are already getting significant traction," Pichai said. "It's a strongly growing business for us. And we plan to invest significantly in 2016. It'll be one of our major investment areas."

Pichai said the Google cloud already hosts 4 million trusted applications, and he pointed to the infrastructure that hosts its search business and YouTube as an example of the scale it can bring to enterprise customers. "Public cloud services are a natural place for us," he said. "Our datacenters, infrastructure, machine learning, and premium data services are leaders in the cloud space, as is our price-to-performance ratio. And we are now able to bring this to bear just as the movement to cloud has reached a tipping point."

Posted by Jeffrey Schwartz on 02/03/2016 at 1:27 PM0 comments


Gridstore Plots Expansion of Hyperconverged Infrastructure

Converged infrastructure means different things depending on the supplier and the IT professional you ask. Gridstore has decided to focus on offering hyperconverged infrastructure for Hyper-V-only environments. The company, which offers an appliance consisting of Intel-based multicore servers and scalable flash storage, last week said it has raised $19 million in equity funding from Acero Capital, GGV Capital and ONSET Ventures.

Along with the funding, Gridstore said it has named Nariman Teymourian chairman. Teymourian has led a number of startups and most recently was a general manager and SVP at Hewlett-Packard Co. The company also has tapped James Thomason, former CTO of Dell's cloud marketplace, as CSO. Thomason arrived at Dell when it acquired Gale Technologies, a startup led by Teymourian. Another former Dell executive, Kevin Rains, who was director of operations for Dell's software group, joins as CFO.

CEO George Symons said the company will use the funding to expand its engineering and sales reach. While Cisco, Dell and Hewlett Packard Enterprise are among those offering hyperconverged infrastructure, Symons said in an interview that Gridstore mainly competes with startups Nutanix and SimpliVity, whose gear is targeted at midsized enterprises and departments of larger organizations. Two years ago, Gridstore decided to focus its hyperconverged systems on Hyper-V. Symons said the company's focus on Microsoft's hypervisor platform makes its systems best suited for Windows Server-based applications such as databases and others requiring high performance. "We're seeing a lot of SQL Server consolidation as well as people moving from VMware to Hyper-V," Symons said.

The Gridstore appliance is a 2U single-rack system that can scale to 256 nodes, each of which can run up to 96TB of raw flash storage. Customers typically start with 4 to 8 nodes and some have worked their way up to 30 nodes. One prominent law firm has a configuration with more than 100 nodes, according to Symons. The systems have variable configuration options ranging from 2 to 24 cores and up to 1TB of memory in each node.

Asked if Gridstore plans to add networking components to its appliances, Symons said it's possible but not in the short-term plan. "Strategically we will probably do it but not in the next year," Symons said. "We make it easier to integrate with the existing networks. Those seem to be one of the challenges customers have with their environments. One of the key benefits with hyperconverged infrastructure is ease of deployment [and] ease of management. We do want to have that integration so that it all happens quickly and easily so you're up and running."

In the near term, look for new management capabilities, including new reporting and analysis, he said. Gridstore is also readying its hyperconverged infrastructure to support both Windows Server containers and Hyper-V containers. Symons believes organizations moving to a DevOps model will find containers attractive.

Asked if an Azure appliance is in the works, he hinted that's likely. "It's an area we can play heavily in," he said. "We think we can do a lot of management integration to make it easier in terms of moving workloads, in terms of looking at orchestration. There are some interesting opportunities for us there. We think it's pretty important."

Posted by Jeffrey Schwartz on 02/03/2016 at 1:26 PM0 comments


Microsoft Reveals Prototype Underwater Datacenter

Microsoft today revealed a research project that shows promise for submerging and operating cloud-scale datacenters in the ocean.

The experimental datacenter, dubbed Project Natick, was submerged into the Pacific Ocean about a half-mile off the coast from August to November of last year. Enclosed in a special 38,000-pound container that looks like a small submarine, the datacenter had the CPU power of 300 desktop PCs and validated the potential of developing the technology at a much higher scale. The researchers conducting last year's pilot called the container Leona Philpot, the name of the popular Xbox character.

Project Natick grew out of Microsoft's quest to develop datacenters that can be provisioned easily, at low cost and in an environmentally sustainable way, according to the company, which emphasized that 50 percent of the world's population lives near a major body of water. Not only would an underwater datacenter have significantly lower power and cooling requirements, thanks to the cold temperatures and the ocean's potential to generate renewable energy, but it would provide a lower-cost way of delivering datacenter capacity more quickly, while providing greater proximity to users.

"Project Natick is a radical approach to how we deploy datacenters," said Ben Cutler, the Microsoft Research project manager in a short video released today showing how it was developed and submerged (it's less than 3 minutes and worth viewing). "Basically what we're doing is we're taking green datacenters [and] deploying them in the ocean [and] off the coast. The overall goal here is to deploy datacenters at scale anywhere in the world -- from decision to power-on within 90 days."

During the video, Eric Peterson, a Microsoft mechanical architect, noted that Project Natick is based on traditional servers "you would find in any datacenter that have been modified for this particular environment since it's in a marine situation." Jeff Kramer, a Microsoft research engineer on the project, added that the servers have the cooling system attached to the outside via special tubing designed to withstand the underwater conditions.

"There's all the control electronics that live on the outside of the rack and then as you move further out you get to the steel shell outside [where] we have all of the heat exchangers that are attached to that shell as well as the outputs for all of the cable and we go back to the surface," Kramer said. "It's kind of like launching a satellite to space: once you've built it, you hand it to the guys with the rocket -- in our case, a crane. You can't do anything about it if it screws up."

According to a FAQ, the servers inside a Natick datacenter are expected to last for five years, which is consistent with the lifespan of most computers and related gear. Every five years the datacenter would be pulled from the ocean, loaded with new infrastructure and then submerged again. Each datacenter would have a target lifespan of at least 20 years, according to Microsoft. Once taken out of commission, it would be recycled.

As for when, and in what form, Project Natick might become a production reality, Microsoft emphasized that it's too early to say.

Posted by Jeffrey Schwartz on 02/01/2016 at 12:41 PM0 comments


Azure and Kinect Making Inroads Among Retailers

A growing number of retailers are relying on Azure and Kinect to improve in-store and online shopping experiences, according to Microsoft. I spent some time at the Microsoft booth at last week's National Retail Federation show in New York, where company officials showcased advances that are starting to (or poised to) appear in stores.

Microsoft for years has invested heavily in improving retail experiences. The best evidence of that is Microsoft's Retail Experience Center, a 20,000-square-foot facility with individual store experiences meant to allow visitors to see technology under development in a variety of environments -- from coffee shops and electronics stores to boutiques that sell clothing, among others. New developments and solutions unveiled at NRF, the retail industry's largest annual gathering of IT professionals, will be brought into the Microsoft Retail Innovation Center, said Martin Ramos, Microsoft's CTO for retail consumer products and services.

I first met Ramos last summer during a two-hour tour of the Redmond facility, and it was quickly evident he showcased it as if it were his home. "There is so much cool technology," said Ramos, a 10-year Microsoft veteran who had spent decades at Walmart. One "cool" technology that's becoming popular is Bluetooth beacons, which can pinpoint a shopper's presence to within 10cm. Bluetooth beacons can track the movement and activity of a preferred customer via their phones. Ramos said that using "micro-location," a retailer can correlate a customer with a product, and the customer can receive personalized information in a number of different ways.

"That information may be product information, it could be a video that tells me about that product or it could tell me what my price is," Ramos said. "Because it's a personal device, I can hold my phone in front of a product and it will tell me what my price is. Maybe not what everyone else pays but what my price is."

Internet of Things controllers that let stores automate the lighting and use of video displays near shelves were also on display. Microsoft is also working with Intel on IoT sensors that can determine the movement of products on a shelf for inventory tracking purposes.

So where does Azure fit into this? In a growing number of these scenarios, Azure represents all or part of the infrastructure to process the new information. "You almost have to have a cloud component -- it's just much more efficient," Ramos said. "The struggle at the individual store level is what do I put in the cloud, what do I keep in the store? You always want to be able to scan an item and find the piece."

Ramos said Microsoft is working with a growing number of retailers to implement an edge cloud, essentially a small cloud in the store that looks like Azure. "The nice thing about that is we can push our applications into the edge cloud and if the network to the store goes down, the applications are still there locally," he said. "From an Azure standpoint, we still use the cloud to handle all of the data -- security, device management and data replication -- [so that if a] device goes down in the store we still have copies of everything in the cloud."

In many cases, the key ingredients are Azure Machine Learning and Azure IoT services, used to gather information collected from sensors. Microsoft showcased how retailers such as 7-Eleven are using Azure with a tool from VMob to gain contextual analytics and customer engagement.

 "We're showing how we are taking the streaming information out of all the devices in the store and then pumping that up into the cloud, using a set of services for streaming data analytics for storage using Data Lake technology and then we're also able to query that and expose the analytics on that using Power BI," said Brendan O'Meara, senior director for Microsoft's retail industry segment.

O'Meara also explained how Kinect, paired with a solution from ISV partner AVA Retail, is used on store shelves, making it possible to determine to within 4 inches where a customer's hand is on a shelf, which prompts a monitor to display information, perhaps even a video, about a product. By monitoring customer activity, sales employees can also become aware of customers who may need assistance.

One customer using a similar take on this technology is Mondelez International, which has begun a road show demonstrating vending machines that are interactive and can measure customer sentiment, according to O'Meara. The new vending machine sports a 3D display that provides information on products and promotions including the ability to give free samples. Its Azure IoT Dashboard tracks inventory and alerts field technicians when a machine isn't working properly.

That should make people think twice about kicking those machines when they don't work.

Posted by Jeffrey Schwartz on 01/29/2016 at 1:24 PM0 comments


Real-Time Office Online Editing Added to Cloud Storage Services

Microsoft today said it is letting Office Online users coauthor files stored in third-party cloud storage services -- including Box, Citrix ShareFile, Dropbox and Egnyte -- in real time. Until now, files had to be stored in either SharePoint or Microsoft OneDrive to be coauthored in Office Online in real time, a capability first added in 2013.

Since launching its Cloud Storage Partner Program (CSPP) last year, Microsoft has signaled support for letting third-party service providers integrate with Office Online in new ways. In addition to adding the real-time coauthoring capability for CSPP members, Microsoft announced that it's extending its Office for iOS integration to all partners in the program and adding integration between its Outlook.com email service and Box and Dropbox.

Microsoft had already permitted Dropbox users to coauthor Office files from Android and iOS devices and today the company is incrementally extending that to the rest of the CSPP members, though just for iOS.

"This integration lets users designate these partner cloud services as 'places' in Office, just as they can with Microsoft OneDrive and Dropbox," said Office Corporate VP Kirk Koenigsbauer, in a blog post announcing the incremental new capability. "Users can now browse for PowerPoint, Word and Excel files on their favorite cloud service right from within an Office app. They can open, edit or create in these apps with confidence that their files will be updated right in the cloud. Users can also open Office files from their cloud storage app in Office, then save any changes directly back to the cloud."

Edward Shi, an associate product manager at Box, said in a blog post that users who select its service as the default can access all of their Box files from Office and edit them using Microsoft's native cloud apps. "Further, when starting directly in Box's iOS app, you can open Office to edit your files and those changes automatically saved back to Box," Shi noted. "Or create a fresh Word, PowerPoint or Excel document, assign tasks to specific colleagues and save to Box. The ability to create, open, edit and save Office content in the Box for iOS app enables customers to efficiently create and collaborate on content from anywhere."

Microsoft's Koenigsbauer indicated Android support would follow later in the year. He also said that over the next few weeks, Microsoft will let Outlook.com users attach files stored in Box and Dropbox to their messages.

Posted by Jeffrey Schwartz on 01/27/2016 at 11:55 AM0 comments


Microsoft's 1TB Surface Book Sells Out

When Microsoft launched its Surface Book back in October, it caught the industry by surprise. Not only did no one see this sleek new take on laptop design coming from Microsoft, but it also became a must-have system for numerous Windows enthusiasts because of its potential performance. Perhaps the most sought-after configuration among power users -- the high-end unit with a 6th Generation Intel Core i7 processor, 1TB of storage, 16GB of RAM and an NVIDIA GPU -- finally became available last week before very quickly selling out.

Initially when I looked up the fully loaded Surface Book on the Microsoft Store Web site yesterday, it wasn't available to order. Today the site is accepting orders with the caveat that Microsoft won't ship it before Feb. 5. Several stores confirmed that they don't yet have any systems in stock and when systems do come in, they're gone within hours.

A Microsoft spokeswoman confirmed the supply shortage. "We've seen strong demand for the 1TB Surface Book and have sold out of current inventory," she said in a statement.

Also now shipping is a fully loaded Surface Pro 4 tablet PC, which also sports the new Intel Core i7 processor, 1TB of storage and 16GB RAM. According to the microsoftstore.com site, that configuration, priced at $2,699 plus $129 for the keyboard, is ready to ship immediately.

Microsoft also added a gold version of its new Surface Pen (added to the choices of black, silver and dark blue). It comes with the Surface Book and Surface Pro 4 but additional pens, which also work with the older Surface Pro 3, cost $59.

Back in November, we took a close look at the new Surface Book entry and the latest Surface Pro 4 (see "Touching the Surface").

Posted by Jeffrey Schwartz on 01/26/2016 at 12:27 PM0 comments


Surface Tablets Didn't Cause Patriots To Lose AFC Championship

Microsoft's Surface tablets took an unwelcome hit during yesterday's AFC Championship game in Denver when the units used by the New England Patriots briefly stopped working. The issue, which Microsoft later said was a networking outage unrelated to the Surface tablets (not a BSOD), briefly put the devices in the spotlight for everyone watching the game.

"Our team on the field has confirmed the issue was not related to the tablets themselves but rather an issue with the network," a Microsoft spokesperson said. "We worked with our partners who manage the network to ensure the issue was resolved quickly."

Update: NFL spokesman Brian McCarthy late Monday issued a statement confirming Microsoft's claim. "Near the end of the 1st quarter, we experienced an infrastructure issue on the Patriots sideline that impacted still photos for the coaching tablets. The issue was identified as a network cable malfunction and was resolved during the 2nd quarter.  The issue was not caused by the tablets or the software that runs on the tablets. We have experienced no issues with the tablets this season. Any issues were network related."

CBS reporter Evan Washburn, who was on the sidelines, pointed out the problem (though not the cause) during the national broadcast: "They're having some trouble with their Microsoft Surface tablets," Washburn reported. "On the last defensive possession the Patriots' coaches did not have access to those tablets to show pictures to their players. NFL officials have been working at it. Some of those tablets are back in use, but not all of them. [There's] a lot of frustration that they didn't have them on that last possession."

Unless you've been hiding under a rock (or totally tune out professional football), you know the Patriots ultimately lost the game by two points to the Denver Broncos, denying them a return trip to the Super Bowl on Feb. 7. In a post-game news briefing this morning at Gillette Stadium in Foxborough, Mass., Bill Belichick, the coach of the New England Patriots, said the brief outage didn't affect the outcome of the game.

"It is what it is," Belichik said. It's a pretty common problem. We have ways of working through it. There is really nothing you can do. It's not like the headsets where one side is affected. You just deal with it. We've had it at home, on the road. That didn't affect the outcome of the game, no way," he added. "Just part of it. Sometimes it works, sometimes it doesn't."

Microsoft's Surface Pro is the official tablet used during NFL games, thanks to a $400 million deal inked two years ago. Among other things, it's used on the sidelines by coaching staffs to review plays and access statistics.

For a while, broadcasters were referring to them as iPads. The scuttlebutt on social media was that if they were iPads the problem wouldn't have happened. We all know that's not true. But what is true is that even though the Seattle Seahawks aren't returning to the Super Bowl, neither are the Patriots. It's hard to imagine too many folks in the Pacific Northwest are sorry about yesterday's outcome.

Posted by Jeffrey Schwartz on 01/25/2016 at 3:32 PM0 comments


Docker Looks to Extend Container Reach with Unikernel Systems Acquisition

Looking to forge better integration between container-based software and hardware infrastructure, Docker today said it has acquired Unikernel Systems, a Cambridge, U.K.-based company run by the original developers of the Xen open source hypervisor project. Docker acquired the company because it believes unikernels will be an important technology for enabling more intelligent, software-defined infrastructure that ties application code more directly to datacenter and cloud hardware, as well as to endpoint devices used as sensors in Internet of Things-type scenarios.

Docker isn't revealing terms of the deal, which closed last month, but said all 13 employees of the company are now part of Docker including Anil Madhavapeddy, CTO of Unikernel Systems and a key architect for the Xen Project.

In an interview, Madhavapeddy explained why unikernels are critical to advancing the use of Docker containers and so-called micro services. While the Docker container API aims to provide a way for developers to build and ship applications that can run in any operating system, virtual machine or cloud environment, he explained that unikernels are microcode within hardware (e.g., routers, IoT sensors and other infrastructure) that can compile the source code into a custom library OS providing only the functionality required by the application or system logic. Initially, Docker sees unikernels becoming more widely added at the firmware layer of hardware to enable software-defined intelligence in a small but scalable footprint.

"The other goal is for the platform you are deploying on should never be a lock-in for the developer," he said. "So unikernels provide a lot more flexibility when it comes to building and orchestrating some of these hybrid micro services that are becoming dominant in the ways we're building these applications."

Many hardware manufacturers are already using Docker to build more software-defined intelligence into their network equipment, storage gear and IoT devices, said David Messina, Docker's vice president of marketing, though he declined to name any specific suppliers. The firmware within these hardware subsystems can only accommodate microcode, which is what unikernels are, according to Messina. "It would just be additive," he said.

Madhavapeddy said developers who need maximum flexibility and resource multiplexing are starting to use the container model. "The Docker APIs are now also being adapted to support the levels of isolation and specialization that result in unikernels," he added. "So if you're building these hyper-specialized applications that run directly against the hypervisor, it's entirely possible. The hypervisor doesn't have to be Xen; it can be Hyper-V, for example. We are working on support for this and it fits very well into the Windows model of hybrid Hyper-V or native containers that will come in Windows Server 2016."

Any hypervisor or hardware layer can be supported by adding a library to the unikernel library suite, according to Madhavapeddy. "So the idea is that the virtual hardware is just exposed as part of the unikernel suite and you just link in the right one at build time," he explained. "You just tell it that you want to build a VMware-specialized unikernel or a Xen-specialized one on EC2 or Hyper-V, and the results will be output according to the developer."

Messina said adding the Unikernel Systems team will address a key need to facilitate more application-aware, or software-defined, infrastructure. Unikernels today are at the stage of maturity that containers were at a few years ago, he said.

"The Unikernel Systems team just brings a lot of expertise in the low-level plumbing and infrastructure required for Docker to be deployed just universally across the cloud," Messina said, adding it will enable interaction with Internet of Things-based hardware and applications. "The idea is we have about a billion virtual machines on Amazon running Xen and the next trillion will probably be embedded devices running on very, very low power systems all over the physical world. We're incredibly excited that the technology can extend Docker's reach into systems that can not actually run Linux or any general purpose operating system in very tiny device. So this is truly going to become a universal Docker API for developers to base their applications on."

While Unikernel Systems' efforts initially will be infrastructure-focused, the goal is to make the business logic within the applications developers are compiling accessible as well, Madhavapeddy said. "So once we cross the chasm and move the infrastructure, just because we're accessible, I see no reason why every developer coding in F# or C# shouldn't be able to gain the same benefits as well."

While unikernels will give Docker-based containers and micro services intelligent, software-defined infrastructure, they should operate at a low level, Madhavapeddy said. "Our goal is that unikernels are invisible -- developers should never know they're using them," he said. "They will just become another option in the build pipeline whenever a system is being constructed. So if they never hear about it we've been successful."

Posted by Jeffrey Schwartz on 01/21/2016 at 2:27 PM0 comments


Microsoft's $1 Billion Cloud Gift

Microsoft CEO Satya Nadella took to the world stage at the annual World Economic Forum in Davos, Switzerland, to announce the company will contribute $1 billion in cloud resources over the next three years to nonprofit organizations, researchers and underserved communities. While Microsoft has a storied history of donating software, the latest pledge is a huge contribution.

Nadella in a blog post said the contribution will serve 70,000 nonprofits including researchers and universities that want to "make it easier for governments and NGOs to use the public cloud for public good." The contribution will include Azure, Office 365, Power BI, Enterprise Mobility Suite and Dynamics CRM Online.

Noting the United Nations commitment last fall to tackle such issues as poverty, hunger, health and education by 2030, Nadella said cloud services will help researchers "to reason over quantities of data to produce specific insights and intelligence. It converts guesswork and speculation into predictive and analytical power."

Noting that governments are looking for a framework that would make the best use of cloud resources, Nadella said the four critical components are infrastructure, skills development, trusted computing and leadership. "This framework would encourage more pervasive use of the public cloud for public good," he noted.

For example, he pointed to the L V Prasad Eye Institute in India, which has treated 20 million patients with cataracts. By digitizing medical records and other data, "doctors now can pinpoint the procedures needed to prevent and treat visual impairments," he said. Another example he cited was last April's massive earthquake in Nepal, where U.N. relief workers used the cloud to gather and analyze huge amounts of data about schools and hospitals to expedite relief efforts.

Microsoft President and Chief Legal Officer Brad Smith outlined three initiatives that the company hopes to achieve with the contribution.

  1. Microsoft Philanthropies, a new organization formed last month and headed by Mary Snapp, will aim to make Microsoft's entire cloud portfolio available to select nonprofit organizations. The organization will roll out the program this spring.
  2. Cloud access will be extended to universities via Microsoft Research and Microsoft Philanthropies. "We will significantly expand our Microsoft Azure for Research program, which grants free Azure storage and computing resources to help faculty accelerate their research," Smith wrote. "To date this program has provided free cloud computing resources for over 600 research projects on six continents. We will build on what works and will expand our donations program by 50 percent, with a focus on reaching important new research initiatives around the world."
  3. Microsoft intends to reach "new communities" where last-mile connectivity and cloud services aren't available. One technology Microsoft has already started helping deliver to address last-mile connectivity is TV White Spaces, which makes use of unlicensed VHF and UHF spectrum.

Naturally, it's a good business move to donate products and services. It has served Microsoft, Google and others well over the years. Microsoft has a vested interest in seeing its services used as widely as possible. It's nice if doing so actually helps make the world better.

Posted by Jeffrey Schwartz on 01/20/2016 at 12:00 PM0 comments


Intel Reveals Multifactor Hardware Authentication for PCs

Intel believes it has broken the password barrier with new technology that will enable hardware-based multifactor authentication. The company today unveiled Intel Authenticate, firmware for Windows PCs running its new 6th Generation Core processors that'll enable up to four factors of authentication based on policies determined by IT.

The company released a preview of Intel Authenticate for customers to test, though the company hasn't said when it'll be generally available. Intel is working with its key OEM partners and Microsoft to optimize and deliver the new technology. Though ideal for Windows 10, Intel Authenticate will work on Windows 7 and Windows 8.1 but requires the new CPUs, said Tom Garrison, vice president and general manager of the company's business client division.

Garrison privately previewed the technology late last week and formally launched it today at an event in San Francisco. Intel Authenticate could give IT decision makers the biggest reason yet to upgrade their PCs by removing the largest enabler of data theft -- compromised user credentials. In a demo I caught in New York last week, Garrison said IT can create policies that combine multiple authentication factors: biometrics -- initially fingerprint scanning, with facial and iris recognition coming later -- plus other factors such as logical location (when using vPro), proximity to a user's smartphone via Bluetooth, or PINs generated by the Intel graphics engine and entered with a mouse or touchscreen to avoid breaches from key loggers.

"Biometrics is the wave of the future," Garrison said. "We think this will go a long way to making clients more secure."

IT can establish policies that require only one form of authentication if a user is coming in from a known network and require MFA when the user is trying to gain access from a public location, Garrison said. IT can also determine which forms of authentication are required and in what order.

While Microsoft is aiming to make biometric authentication mainstream with its Windows Hello and Passport technologies in Windows 10, Intel Authenticate promises to deliver embedded hardware-based MFA to business computers for the first time, said Patrick Moorhead, president and principal analyst with Moor Insights & Strategy. "You can be more secure by adding single-factor biometrics but you still have a password and it still can be taken from you," Moorhead said. "With this multifactor authentication here, nothing is hacker proof, but it reduces the likelihood that social engineering or compromised credentials will be the cause of a breach."

Garrison played down any notion that Intel Authenticate will compete with Windows Hello, noting both companies support the FIDO alliance which is creating biometric authentication standards. Those standards will be key to ultimately enabling single sign-on using biometrics. Intel Authenticate actually uses Windows Hello to train the hardware to recognize a biometric identity, Garrison said. Furthermore, Intel and Microsoft, along with PC OEMs, are working together and will be jointly supporting these capabilities as the year goes on, Garrison said. The technology won't be available on hardware other than the latest 6th generation Core processors, though vPro is only necessary when using logical location as a form of authentication. Garrison credits this to a significant leap in performance at the CPU level. "This hardware is actually in our chipset, and Intel runs the firmware in the chipset," he said. "It does all the factor-matching, the IT policy enforcement as well as deciding whether or not to grant access."

While that all happens in hardware, he said it's supported in key system and credential platforms including Intel Security's McAfee ePolicy Orchestrator, Microsoft's System Center Configuration Manager and Enterprise Mobility Suite. It can also utilize Active Directory and Group Policy settings.

Posted by Jeffrey Schwartz on 01/19/2016 at 2:17 PM0 comments


Amazon and Microsoft Escalate Cloud Price War

If there was any doubt that Amazon Web Services (AWS) and Microsoft would keep their cloud price war alive in 2016 and beyond, both companies jumped into the new year with their latest round of cost reductions.

AWS made the first move last week, reducing the price of EC2 instances by 5%, with Microsoft following suit yesterday by slashing the cost of Windows Server and Linux Azure Dv2 VM instances by up to 13% and 17%, respectively. The moves come as both companies aggressively court enterprise IT decision makers to move or share datacenter workloads and data to their public clouds.

Microsoft yesterday acknowledged the price cuts reflect its commitment to keep its prices competitive with AWS. Nicole Herskowitz, Microsoft's director of Cloud Platform marketing, argued that Azure customers with Enterprise Agreements will enjoy an even lower price than Amazon offers. In a blog post, Herskowitz pointed to a number of other advantages.

"Unlike AWS, Azure virtual machine usage is billed on per-minute rate so you only pay for the compute that you use. With AWS you pay for an hour even if you only use a few minutes," she stated. "For developers, Microsoft provides up to $150 free Azure credits per month along with discounts for Dev/Test workloads through the Microsoft Developer Network program. Any developer will be able to get $25 free Azure credits per month for one year with the Dev Essentials program coming soon."

Herskowitz added that Microsoft's Azure prepurchase plan offers a 63% savings when buying VMs for a full year. "If you're moving a significant number of workloads to the cloud and are looking for great pricing with lots of flexibility, check out the Azure Compute option," she noted. "With this program, you can run any compute instance in Azure and realize discounts up to 60% in exchange for add-ons to your Windows Server annuity licenses."

The Dv2 VMs, based on Intel Xeon Haswell processors, are 35 percent faster than the original D-series instances, Herskowitz emphasized, while also noting that the company's Dv2 instances, unlike AWS EC2 instances, offer load balancing and auto-scaling at no additional charge.

For its part, most of AWS' price cuts apply to Linux instances. Specifically, AWS cut its On-Demand, Reserved Instance and Dedicated Host prices (for its C4, M4 and R3 instances) running Linux in specified regions by 5%. Less significant reductions apply to Windows, SLES and RHEL instances. AWS said last week's reduction was its 51st price cut, with some of the changes retroactive to Jan. 1 and others to be phased in during the coming weeks. AWS said it will update its price list API in the coming weeks.

Posted by Jeffrey Schwartz on 01/15/2016 at 12:11 PM0 comments


Citrix Outlines Retooled Focus for 2016

Looking to move past the turmoil that forced out longtime CEO Mark Templeton and gave activist investors a hand in sharpening its focus, Citrix this week made some key announcements at its annual partner conference. Citrix announced the acquisition of Comtrade System Software & Tools management packs for Microsoft's System Center Operations Manager (SCOM). The acquisition covers just the technology IP of SCOM management packs for Citrix-specific environments. Comtrade, a provider of various management and monitoring tools, is a longtime Citrix partner, and its SCOM management packs cover XenDesktop, XenApp, XenMobile and NetScaler. (Note: an earlier version of this post said Citrix had acquired the entire company; however, the deal covers just the Citrix management pack technology.)

The announcements, made at the Citrix Summit in Las Vegas this week, show a company that in many ways is returning to its roots -- managing desktops and apps in virtual environments. Citrix last year was looking at strategic options for its GoTo service and earlier this week said that it is selling off its CloudPlatform and CloudPortal Business Manager product lines to Accelerite.

Comtrade provides a SCOM management pack that lets administrators manage Citrix XenDesktop, XenApp and Workspace Suite environments, explained Calvin Hsu, Citrix's vice president of product marketing for Windows App Delivery. "They provide an end-to-end monitoring solution, specifically with Citrix," Hsu said. "It's already installed and working with hundreds of customers. They are already part of our ecosystem and it integrates with System Center Operations Manager. It provides a more comprehensive end-to-end user experience, monitoring and a number of licensing and provisioning and storefront components."

Hsu also said Citrix announced the release of XenApp and XenDesktop 7.7 (the company quietly made it available last month), which delivers integration with Skype for Business. "We are seeing huge amounts of interest in being able to use Skype for Business from a virtual desktop or virtual hosted application like XenApp because people are creating these mobile workspaces where they want not only their application but also all of their telephony, and videoconferencing and all of that stuff to follow around with them," Hsu said. "It was developed in very close collaboration with Microsoft as they were building out and introducing their new Skype architecture. We were in lockstep with them for that."

The Skype for Business functionality works on Windows, Macintosh and Linux desktops, Hsu said. Citrix also started talking about the next releases of XenApp and XenDesktop. The 7.8 versions, due out later this quarter, will offer AppDisk, app layering tech that lets IT pros package and manage apps separate from the master operating system image, Hsu said. The forthcoming release will also offer management of Microsoft App-V packages, improved app publishing and better graphics performance.

Also at the Citrix Summit, the company announced a new long-term service branch offering, providing extended support for older products. Hsu said it's for shops that don't want to upgrade but want ongoing support for retired products.

Posted by Jeffrey Schwartz on 01/13/2016 at 12:38 PM0 comments


Samsung Steps Up Windows 10 Support

Microsoft may not have its own keynote slot at the annual Consumer Electronics Show anymore, but Windows and Devices chief Terry Myerson made an appearance during Samsung President Won-Pyo Hong's address at last week's confab, where the two companies announced a new partnership centered on Windows 10.

During his keynote, Hong brought Myerson on stage to launch the new Galaxy Tab Pro S, an ultrathin detachable tablet PC that's 6.3mm thin and weighs just 1.5 lbs., which Myerson noted in a blog post is lighter than the iPad Pro. And while he didn't mention it, it's also lighter than the Surface Pro 4. It supports an optional active pen and claims 10 hours of battery life.

The Galaxy Tab Pro S will come with an Intel 6th Generation Core M processor and is the first Windows device to support LTE Cat 6, a new class of broadband cellular modem designed to offer much faster upload and download speeds than current LTE Cat 4 devices.

The Galaxy Tab Pro S will sport a 12-inch 2160x1440 Super AMOLED display and will come with 4GB of RAM and a choice of either a 128GB or 256GB SSD. Set for release next month, the new systems will be showcased in Microsoft's retail stores.

Also during the keynote, the two discussed future "Intelligent IoT [Internet of Things]" devices. Samsung, of course, is a key supplier of home appliances, and the companies demonstrated the potential of the new technologies. But as reported by Mary Jo Foley in her ZDNet All About Microsoft blog last week, the two companies didn't announce anything specific regarding a partnership on that front.

Myerson did note that Microsoft last year released its Windows 10 IoT Core for suppliers of circuit boards including the Qualcomm DragonBoard 410c, Intel Minnowboard MAX and the Raspberry Pi 2. Myerson pointed to last month's release of Windows 10 IoT Core Pro, targeted at large suppliers looking for more update controls, as one of the company's major Windows IoT Core products.

Most PC suppliers had new systems to introduce including -- but certainly not limited to -- Dell, HP and Lenovo.

Though I didn't make the trip to CES, HP in late November gave me a sneak preview of its new Elite x2 1012 G1 Tablet, another device that borrows some of the characteristics of the Microsoft Surface Pro. As I reported at the time, like the most recent Surface Pro models, the new Elite x2 1012 doesn't have a fan, has a kickstand and is designed to offer 10 hours of battery life. Where it stands out is that it's completely encased in aviation-grade aluminum, has more interfaces, added security support and a wider array of peripherals.

Dell also unveiled new commercial systems that borrow from its popular XPS line of systems. Its latest Latitude models offer added security and management capabilities, the company said. The new Latitude 3000, 5000 and 7000 Series offer the thin bezels, among other features, found in the XPS line. Lenovo also added a new member to its flagship commercial ThinkPad line with the ThinkPad X1 Yoga, an Ultrabook that the company says offers 11 hours of battery life.

Posted by Jeffrey Schwartz on 01/11/2016 at 9:18 AM0 comments


New Windows Insider Columnist Ed Bott Debuts

Ed Bott takes over as Redmond magazine's Windows Insider starting with this month's edition. In his first column, Bott questions the way Microsoft handled its decision to offer unlimited storage for OneDrive only to take it away.

As a longtime expert on Microsoft, Bott is adding the Windows Insider column to his repertoire of activities. Bott is the author of the popular Ed Bott Report, a blog published by ZDNet. In his Windows Insider column, Bott will give opinions and advice for IT managers who administer Windows and the technology surrounding it.

Bott, an award-winning technology journalist, has covered Microsoft for more than 25 years and has written numerous books covering Windows and Office, including his best-selling "Inside Out" series from Microsoft Press. You can follow him on Twitter @EdBott.

Posted by Jeffrey Schwartz on 01/08/2016 at 11:06 AM0 comments


Microsoft To Notify Targets of State-Sponsored Cyberattacks

Microsoft will inform holders of its online and cloud accounts when those accounts are compromised or targeted by representatives of nation states, the company announced late last week. The company revealed its decision to notify Microsoft Account holders of state-sponsored attacks on Dec. 30, in the wake of several other leading providers -- including Facebook, Twitter and Yahoo -- doing the same.

The announcement came following a report by Reuters that 1,000 Hotmail accounts were compromised in 2011 by representatives of the Chinese government. The accounts were used by Uighur and Tibetan leaders and diplomats from Japan and Africa, along with human rights lawyers and others, according to two former Microsoft employees who weren't identified by Reuters. Microsoft Spokesman Frank Shaw told the news service it had never confirmed the origin of those attacks. The report revealed that Microsoft hadn't informed the Hotmail users that their messages were collected.  

All Microsoft Accounts including Outlook.com (aka Hotmail) and OneDrive are covered by the new disclosure policy, said Scott Charney, corporate VP of Trustworthy Computing, in a blog post announcing the company's new disclosure policy.

"We're taking this additional step of specifically letting you know if we have evidence that the attacker may be 'state-sponsored' because it is likely that the attack could be more sophisticated or more sustained than attacks from cybercriminals and others," Charney wrote. "These notifications do not mean that Microsoft's own systems have in any way been compromised. If you receive one of these notifications it doesn't necessarily mean that your account has been compromised, but it does mean we have evidence your account has been targeted, and it's very important you take additional measures to keep your account secure."

Charney noted that Microsoft doesn't plan to disclose details about the attackers or their methods given the evidence collected could be sensitive. "But when the evidence reasonably suggests the attacker is 'state sponsored,' we will say so," according to Charney.

Charney advised that customers can protect themselves by using two-step verification, watching for suspicious activity, refraining from opening suspicious e-mails or visiting questionable Web sites, using strong passwords, and keeping software updated and patched.

Posted by Jeffrey Schwartz on 01/05/2016 at 11:58 AM0 comments


Robust Holidays Bring Windows 10 Tally Up to 200 Million

Windows 10 activations soared during the holiday season as Microsoft kicked off the new year saying a total of 200 million systems are now running the operating system.

That is nearly double the 110 million licenses last reported by Microsoft. More than 40 percent of new Windows 10 devices were activated since Black Friday six weeks ago, which Microsoft said in a blog post today represents the fastest growth trajectory of any version of Windows. It outpaced Windows 7 by approximately 140 percent and Windows 8 by nearly 400 percent, the company said.

"We are also seeing accelerating and unprecedented demand for Windows 10 among enterprise and education customers," wrote Microsoft Senior VP Yusuf Mehdi. "As of today, more than 76 percent of our enterprise customers are in active pilots of Windows 10, and we now have over 22 million devices running Windows 10 across enterprise and education customers."

Indeed, I stopped into the new Microsoft flagship store on 5th Avenue in New York City during the week after Christmas and the store was packed -- and it wasn't just the Xbox department. Customers were looking at a variety of PC types and even the floor dedicated to Dell PCs was crowded.

The growth of Windows 10 doesn't come as a surprise and it's not just consumers who are propelling its growth. According to a Redmond magazine survey conducted in November, one-third of a sample of 759 readers said they have Windows 10-based systems in production with a quarter planning to do so this year.

Are you one of the 200 million users? Share your views on Windows 10 and the device you are running it on.

Posted by Jeffrey Schwartz on 01/04/2016 at 12:01 PM0 comments


Microsoft Acquires Ray Ozzie's Talko

Microsoft has acquired Ray Ozzie's latest startup Talko, a company that provides conferencing and telephony services, for an undisclosed amount. Talko developed a mobile messaging app that Microsoft believes will improve its Skype service.

While Talko's team is now a part of Microsoft, Ozzie is not coming back for a second tour of duty, as reported by Mary Jo Foley in her ZDNet All About Microsoft blog. Ozzie spent five years at Microsoft, ascending to founder and former CEO Bill Gates' role as chief software architect. Microsoft had acquired Ozzie's Groove Networks, which became an integral part of SharePoint Server. Microsoft Office Groove evolved into what became SharePoint Workspace 2010 and now provides some of the core synchronization capabilities in OneDrive for Business.

The Talko app combined messaging, calling and conferencing and was designed to make workers who use mobile devices more productive. Talko said Microsoft has ended the existing service and will integrate it into Skype. Microsoft expects to launch the new features in March. The company was vague about what Talko will bring to Skype.

"As part of the Skype team, we'll leverage Talko's technology and the many things we've learned during its design and development," according to the announcement on Talko's site. "We'll strive to deliver the best of our product's innovations far more broadly than on our current path." For those who used the Talko service, the company said: "Please rest assured that we'll be giving you a way to request an export of all your past Talko conversations - voice, text, and photos - as simple files."

Posted by Jeffrey Schwartz on 12/22/2015 at 1:30 PM0 comments


2015: The Year Microsoft Turned the Ship Around

As 2015 comes to a close, it'll be remembered as a year full of surprises. Who thought back in April that the New York Mets would make the playoffs (along with the Chicago Cubs) and somehow make it to the World Series? While Microsoft was on pace to have a good year as 2014 was coming to a close, it was hard to imagine Redmond would gain such a high level of respect among so many longstanding critics.

If Goldman Sachs' mea culpa last week wasn't enough, when I picked up this week's issue of Barron's magazine -- reading it is my Saturday morning ritual -- I glanced at the cover and saw the headline, "The New Microsoft," with Satya Nadella's photo plastered on it. Usually when a company is featured on the cover of Barron's, it's because the publication has determined its stock is going to soar -- or crater. In this case it was the former. Noting that Microsoft's shares are up 48 percent since Nadella took over -- 67 percent if you go back to the day his predecessor Steve Ballmer announced he was "retiring" -- the article argues Microsoft has reemerged as a growth company whose shares could jump another 30 percent. The article suggests that while Amazon's AWS cloud business is Microsoft's most significant competitor, the growth of Azure will give it a run for its money. At the same time, the author suggests Microsoft is a beneficiary of Amazon's growth. In April, Amazon disclosed AWS' revenues and profits for the first time, and its shares have since grown 70 percent.

While information on Azure's financials is more "opaque," Microsoft disclosed that revenue for that business grew 135 percent in its first fiscal quarter. Based on that, Bernstein analyst Mark Moerdler has forecast that Azure revenues are approximately $1 billion and could rise to $7.3 billion by mid-2018, with profits as good "as Amazon's or better." The article emphasized how a number of Wall Street analysts have regained confidence in Microsoft, which has turned straw into gold with its Surface business, made a successful transition to the cloud in other product areas such as Office, and proved its operating systems aren't dead with the successful launch of Windows 10 this year. But those are topics we've covered throughout 2015.

Reflecting on the year, it was quite an eventful one and full of surprises. Among them:

  • Office 365 transitioned into a platform and subscription growth suggested the franchise is fast becoming the newest cash cow.
  • The move to support containers marked a key step toward DevOps by enabling a new generation of systems automation.
  • The next generation of Windows Server and System Center (and how it will provide a more holistic hybrid cloud architecture with the forthcoming Azure Stack) came into view.
  • Windows 10 arrived along with all sorts of new devices, including new Surfaces and third-party PCs, and early success has indicated that the death predictions of the famed desktop OS were premature.
  • Biometric authentication took a key step forward this year with the debut of Passport and Windows Hello, but the coming year will determine whether it's poised to replace, or merely augment, the password.
  • Once a fierce opponent of open source software, Microsoft has embraced it and become an accepted participant in the community much to the surprise of Redmond critic Steven J. Vaughan-Nichols, who authored our September cover story.
  • While Microsoft didn't pull off any megadeals, such as the rumored acquisition of Salesforce.com, it bought 14 closely held companies, the most since 2008. Many of them had an eye toward bolstering security and mobility management. Microsoft also stepped up its partnerships with the likes of Salesforce, Red Hat and even VMware.

Windows Phone was less of a success story. While Microsoft's new Windows 10 Mobile, loaded on an improved crop of Lumia phones, brought uniformity to all versions of Windows with the introduction of Continuum, it didn't move the needle in terms of market share. Perhaps it's too early, but analysts aren't expecting a significant uptick.

Nadella took some heat for putting Project Astoria, the tooling that would enable Android developers to port their apps to Windows, on the back burner. Ironically, Nadella's loudest critic on the reported move was his predecessor Steve Ballmer, who made an issue of it at the recent Microsoft shareholder meeting. That notwithstanding, it's safe to say Nadella has reshaped Microsoft in a big way this year -- and certainly made moves that Ballmer never would have.

Posted by Jeffrey Schwartz on 12/21/2015 at 1:06 PM0 comments


Cybersecurity Information Sharing Act Sets Back Privacy

Passing the federal budget before the end of the year was a key priority, and it became a convenient opportunity for Congress to slip the controversial Cybersecurity Information Sharing Act of 2015 (CISA) into the spending bill, which President Obama signed into law on Friday. IT providers Amazon, Apple, Google, Microsoft and others have opposed measures in CISA, which seeks to thwart crime and terrorism but facilitates mass surveillance via the sharing of information between companies and the government, notably the National Security Agency.

CISA, as described by Wired, creates an "information-sharing channel, ostensibly created for responding quickly to hacks and breaches [with the effect that it] could also provide a loophole in privacy laws that enabled intelligence and law enforcement surveillance without a warrant." The law also allows the president to create portals for law enforcement agencies to enable companies to facilitate the sharing of that information.

What alarms privacy advocates more is that an earlier version of the bill "had only allowed that backchannel use of the data for law enforcement in cases of 'imminent threats,' while the new bill requires just a 'specific threat,' potentially allowing the search of the data for any specific terms regardless of timeliness," read the Wired article.

Privacy advocates such as the Electronic Frontier Foundation have staunchly opposed CISA and waged a strong campaign against it in 2015. In a statement back in March, the EFF argued the risks of CISA: "Under this bill, DHS would no longer be the lead agency making decisions about the cybersecurity information received, retained, or shared to companies or within the government," it said. "Its new role in the bill mandates DHS send information to agencies -- like the NSA -- 'in real-time.' The bill also allows companies to bypass DHS and share the information immediately with other agencies, like the intelligence agencies, which ensures that DHS's current privacy protections won't be applied to the information. The provision is ripe for improper and over-expansive information sharing."

No Tradeoff Between Privacy and Security
CISA aside, the debate over privacy versus how far the government should engage in surveillance to protect the U.S. from crime and terrorist attacks will remain a focal issue in 2016. Apple CEO Tim Cook reiterated that his company -- and the IT community at large -- won't back down on its position that Americans shouldn't compromise their privacy to give law enforcement access to encrypted data. While saying Apple complies with subpoenas, Cook said on last night's 60 Minutes that people should not have to give up their privacy so that law enforcement can provide security.

"I don't believe the tradeoff here is privacy versus national security," Cook told Charlie Rose. "I think that is an overly simplistic view. We're America -- we should have both." Cook and Microsoft President and Chief Counsel Brad Smith are steadfast proponents of that notion.

When I sat down with Smith and several other IT journalists a few months ago, Smith said "we will protect you from being attacked. Your data is private, it's really your data, it's not our data and it's under your control." 

Microsoft Corporate VP Scott Charney testified back in January before the U.S. Senate Committee on Homeland Security and Governmental Affairs at a hearing entitled: "Protecting America from Cyber Attacks: the Importance of Information Sharing." In a blog post following his testimony, Charney spelled out where Microsoft stands. "Information sharing forums and processes need not follow a single structure or model, and governments should not be the interface for all sharing," he said.

A Key Issue Among Presidential Candidates
While their positions on information sharing probably won't solely determine the outcome of the 2016 presidential election, candidates are making their positions known -- or are dodging the issue. I noted last week where the Republican candidates stand. During the debate among the top three Democratic presidential candidates Saturday night in New Hampshire, Martin O'Malley appeared to be the clearest opponent of giving the Feds more access to user data. "I believe that we should never give up our privacy; never should give up our freedoms in exchange for a promise of security," O'Malley said.

Front-runner Hillary Clinton acknowledged she doesn't understand much of the technology saying "maybe a back door is the wrong door" but adding that "we always have to balance liberty and security, privacy and safety, but I know that law enforcement needs the tools to keep us safe."

The debate question from ABC's Martha Raddatz arose following FBI Director James Comey's Dec. 9 testimony before the U.S. Senate Judiciary Committee, in which he argued for voluntary measures between tech providers and law enforcement, such as smartphone makers no longer offering phones whose encrypted communications can't be unlocked. "It's actually not a technical issue; it's a business model question," Comey said. "A lot of people have designed systems so that judges' orders can't be complied with ... The question we have to ask is: should they change their business model?"

How hard a line would you like to see IT and communications providers take on this issue?

Posted by Jeffrey Schwartz on 12/21/2015 at 10:32 AM0 comments


Goldman Sachs Admits Mistake in Banishing Microsoft

Back in April 2013, Goldman Sachs downgraded Microsoft to "sell" -- a rating infrequently cast upon large tech companies and a clear signal that the influential investment bank had little faith in a turnaround in Redmond. Goldman Sachs today said that, despite the uphill battle Microsoft faced -- a view shared by most in the industry -- it underestimated the company's ability to revitalize its business.

Taking a still-measured approach given the challenges that remain, Goldman upgraded Microsoft to "neutral." Certainly Goldman wasn't the only skeptic about Microsoft's prospects a few years ago. Microsoft's missteps in the mobile revolution had many critics wondering if Amazon Web Services, Apple, Google and VMware were poised to make Microsoft irrelevant. But since Satya Nadella took over in February 2014 -- followed by this year's release of Windows 10, the success of Office 365 and Azure, and a string of key acquisitions -- Microsoft is a much different company, a shift Goldman Sachs acknowledged in the release of its report, titled "Righting a Wrong."

Not only did it take Microsoft off of the sell list, but Goldman Sachs analysts acknowledged in hindsight that they shouldn't have put the company there in the first place. "We were wrong," read the report. "We failed to appreciate that the stock would disconnect from downward EPS revisions, and the significant upward rerating of the multiple driven by MSFT's transition to the cloud (Office 365 and Azure)."

Since downgrading Microsoft on April 11, 2013, Microsoft's stock has risen 84% compared with a 29% increase of the S&P 500. The firm set a 12-month price target for Microsoft's stock between $45 and $57 per share. At the time Goldman downgraded Microsoft, the analysts did so on the basis that Microsoft was facing declining PC sales and their view that consensus earnings projections were too high.

According to the report: "The company has been successfully transitioning its Office installed base to Office 365, is the number two leader in cloud services behind Amazon Web Services and has shown strong operating expense discipline and capital allocation."

Goldman also says it's optimistic about the Microsoft cloud transition and believes it will be a top driver of growth for the company and that the transition to Office 365 "will lead to an uplift in revenue and gross profit, and new leadership has executed well, reallocating OPEX away from slower growing areas and towards higher growth and more strategic areas of the company such as cloud."

The firm is also increasing EPS estimates for FY17 and beyond, as it anticipates improving gross margins from Azure and Office 365. "We expect Azure can reach AWS-like margins when they reach similar revenue levels as AWS."

Goldman still believes consensus estimates for Microsoft's FY17, which kicks off July 1, 2016, are too high. Its estimates are $2.86 per share versus $3.12 and warns there's still downside risk in the coming two fiscal years.

Investors appear to have shrugged off the upgrade, with Microsoft's shares trading down 1.4% at midday on a day when the broad market is down sharply. Despite the slight drop, Microsoft's shares are trading at approximately $55 -- at the higher end of Goldman's price target.

Posted by Jeffrey Schwartz on 12/18/2015 at 12:16 PM0 comments


Candidates Spar on Encryption During Presidential Debate

The recent terrorist attacks in Paris and in San Bernardino, Calif. have led to renewed calls for technology providers and social network operators to make it harder for terrorists to communicate and easier for law enforcement to track their activities. It was a point of contention in this week's Republican presidential debate in Las Vegas. Democratic front runner Hillary Clinton and President Barack Obama have weighed in over the past few days as well.

Companies including Facebook, Google/YouTube, Snapchat, Twitter and Microsoft, among others, found themselves taking heat for refusing to create backdoors or put limits on encryption, and for allowing terrorists to use their infrastructures for recruiting purposes. Senator Rand Paul of Kentucky remains the most vocal candidate in the presidential race opposing more government oversight. "We are not any safer through the bulk collection of all Americans' records. In fact, I think we're less safe," Paul said during the CNN broadcast. Senator Marco Rubio of Florida argued that limitations placed on the National Security Agency and other intelligence agencies with regard to accessing phone records posed a major setback. "The next time there is an attack on this country, the first thing people are going to want to know is, why didn't we know about it and why didn't we stop it?" Rubio said. "And the answer better not be because we didn't have access to records or information that would have allowed us to identify these killers before they attacked."

Candidate John Kasich, the governor of Ohio, argued that if the terrorists in San Bernardino couldn't encrypt their communications, the intelligence agencies would have had a better chance of tracking their activities.

"The people in San Bernardino were communicating with people who the FBI had been watching but because their phone was encrypted, because the intelligence officials could not see who they were talking to, it was lost," Kasich said. "We have to solve the encryption problem. It is not easy. A president of the United States, again, has to bring people together and have a position. We need to be able to penetrate these people when they are involved in these plots and these plans. And we have to give the local authorities the ability to penetrate to disrupt. That's what we need to do. Encryption is a major problem, and Congress has got to deal with this and so does the president to keep us safe."

Senate Intelligence Chairman Richard Burr (R-NC) also is a proponent of placing restrictions on the use of encryption. "Today encryption is becoming more and more a problem with our ability to see inside of the communications of individuals, both in the United States with each other and with people abroad," Burr said on CBS's Face the Nation on Sunday.

Besides encryption, Kasich also argued that agencies should be permitted to hold onto metadata longer. Candidate Carly Fiorina, a former CEO of Hewlett Packard, argued during the debate that the Feds need to do a better job of tapping the IT industry to interpret metadata faster.

"Why did we miss the Tsarnaev brothers, why did we miss the San Bernardino couple? It wasn't because we had stopped collected metadata. It was because, I think, as someone who comes from the technology world, we were using the wrong algorithms," Fiorina said. "This is a place where the private sector could be helpful because the government is woefully behind the technology curve. But secondly, the bureaucratic processes that have been in place since 9/11 are woefully inadequate as well. What do we now know? That DHS vets people by going into databases of known or suspected terrorists."

Republicans aren't alone in their call for tech companies to work more closely with the government.

In his address from the Oval Office Sunday night, President Barack Obama made a passing reference to the issue. "I will urge high-tech and law enforcement leaders to make it harder for terrorists to use technology to escape from justice," he said.

Nearly three years ago, the President urged the IT industry to share information, and earlier this year he issued an executive order aimed at stepping up that effort.

Obama's remarks came hours after Democratic presidential candidate and former Secretary of State Hillary Clinton touched on the issue, suggesting freedom of speech shouldn't preclude providers from making it harder for terrorists to use their networks to recruit and to carry out their goals. In her Sunday speech at the Brookings Institution in Washington, D.C., she took a somewhat harsher tone toward the IT community than in the past.

"Resolve means depriving jihadists of virtual territory, just as we work to deprive them of actual territory," she said, as reported by The New York Times. "They are using Web sites, social media, chat rooms and other platforms to celebrate beheadings, recruit future terrorists and call for attacks. We should work with host companies to shut them down."

Clinton reiterated that call Tuesday. "We need stronger relationships between Washington, Silicon Valley and all of our great tech companies and entrepreneurs," she said. "American innovation is a powerful force and we have to put it to work, defeating ISIS. That starts with understanding where and how recruitment happens."

Tech companies including Microsoft have vowed to put customer privacy first, as they should. It doesn't appear the rhetoric of the presidential campaign will die down until after the election next November. Hopefully, a reasoned approach will prevail by whoever becomes the next president.

Posted by Jeffrey Schwartz on 12/17/2015 at 3:15 PM0 comments


Cloud Backup Provider Carbonite Scales Up with Deal To Acquire EVault

Carbonite, a popular provider of backup services for consumers and small businesses, today agreed to acquire EVault from disk drive vendor Seagate for $14 million. EVault, which offers higher-end backup services with an on-premises appliance for mid-sized organizations, uses Microsoft's Azure public cloud to store data. EVault also claims one-hour failover.

EVault was founded in 1997 and acquired by Seagate in 2006. When the deal closes, likely next quarter, EVault will give Carbonite the ability to target larger enterprises and provide data protection for VMware-based virtual infrastructures. On an investor call with analysts to announce the deal, Carbonite President and CEO Mohamad Ali said EVault has 5,000 customers and its services are offered by 500 managed service providers. By acquiring EVault, Carbonite will compete with Barracuda Networks, CommVault and Datto, though there are a considerable number of players targeting small enterprises, including Acronis, Arcserve, Asigra, Nasuni, Unitrends and Veeam, among others.

"EVault brings probably one of the most sophisticated cloud failover capabilities available in the market," Ali said on the call. "EVault is able to provide one-hour SLA failover. EVault can spin up that server in its cloud, and your business is up and running again off of that server. A lot of companies can only talk about some of the capabilities, [while] EVault probably has the most mature and capable version of it out there. Immediately we're able to expand what we can do, what we can offer [and] what size of customers we can service. I think competitively we're in a really great position."

The EVault Backup Services for Microsoft Azure allows customers to combine their cloud licenses under one enterprise agreement and provide protection for Linux, Unix and Windows-based application servers as well as Hyper-V and VMware ESXi VMs. The EVault service offers front-end data deduplication and uses the company's block-level processing to ensure only new and changed blocks of data are transmitted.
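
EVault doesn't publicly document the internals of that block-level processing, but the general technique is straightforward: split each file into fixed-size blocks, fingerprint them and send only the blocks whose fingerprints changed since the last backup. Here is a minimal Python sketch of the idea, with the block size and hash algorithm chosen purely for illustration:

    import hashlib

    BLOCK_SIZE = 4 * 1024 * 1024  # 4MB blocks; the real block size is an assumption

    def block_hashes(path):
        """Return one SHA-256 digest per fixed-size block of the file."""
        hashes = []
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                hashes.append(hashlib.sha256(block).hexdigest())
        return hashes

    def changed_blocks(path, previous_hashes):
        """Yield (index, byte offset) of blocks that are new or differ from the last backup."""
        for i, digest in enumerate(block_hashes(path)):
            if i >= len(previous_hashes) or digest != previous_hashes[i]:
                yield i, i * BLOCK_SIZE  # only these blocks would cross the wire

Only the changed blocks need to be transmitted and applied to the copy in the cloud, which is what keeps backup windows and bandwidth consumption down on large servers.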

Whether Carbonite will further utilize Azure for its offering, or move EVault away from it, remains to be seen. David Raissipour, Carbonite's SVP of engineering and products, said: "Carbonite already has highly efficient and scalable cloud infrastructure so the Azure platform wasn't a key component of the acquisition."

Posted by Jeffrey Schwartz on 12/16/2015 at 1:12 PM0 comments


Dell Weighs Asset Divestitures to Seal EMC Deal

In its first quarterly financial disclosure since becoming a privately held company two years ago, Dell revealed its revenues and what it may do to finance its $67 billion acquisition of EMC and the storage giant's controlling interest in VMware. Dell acknowledged it may need to divest or spin off some of its assets in order to adequately finance the deal and pass muster with investors. In addition to divestitures, Dell said it may buy back up to $3 billion of VMware's tracking stock.

Dell disclosed the financials and other details related to the deal, the largest ever in the IT industry, in a filing with the Securities & Exchange Commission (SEC) yesterday, through a new privately held entity called Denali Holding Inc. The Denali S-4 filing comes just after the end of a 60-day go-shop period EMC was entitled to, which permitted it to accept a better offer if one emerged. It wasn't considered likely that anyone would top Dell's historic bid, and the clock expired late last week, making the deal official. While EMC shares have remained relatively flat since the deal was announced Oct. 12, VMware shares are down sharply.

Among the assets Dell could sell or spin off are its services business, RSA Security, SonicWall and the Dell Software Group business from its acquisition of Quest back in 2012. Boomi, which provides connectivity middleware between public cloud services and legacy systems, is another possible spin-off candidate, the company acknowledged in the filing. The company didn't identify any specific business it will sell or divest, but the wording suggested it will need to liquidate some of its assets.

"Denali expects that it may divest certain businesses lines, assets, equity interests or properties of Denali and EMC to be determined, the proceeds of which may be used to, among other purposes, repay indebtedness incurred in connection with the merger," according to the filing. "Such divestitures may be material to each company's financial condition and results of operations. As of the date of this proxy statement/prospectus, there is no commitment or probable transaction related to these potential divestitures, and the manner in which any potential divestitures might be effected has not been determined."

 One asset that it will leverage for cash is SecureWorks, the business unit that provides threat intelligence, incident response and managed security services. On Friday, Dell announced it plans to issue a stock offering for SecureWorks. In Dell's IPO registration with the SEC, the company declined to say how many shares of Class A stock it will issue, or how much it hopes to raise, only stating that those details will be determined at a future date.

As for how Dell is doing, the company said revenues for the quarter ended July 31 of $14 billion were down 6% year over year, and its $27 billion in revenues for the six-month period ended Aug. 1 reflected a 7% decline. On a GAAP basis, net income of $265 million was down 49% compared with last year's quarter, while for the six-month period net income was $769 million, reflecting a 25% drop in profits. Revenue for Dell Software Group for the six months ended July 31 of $680 million declined 5% year over year.

Posted by Jeffrey Schwartz on 12/15/2015 at 1:52 PM0 comments


Microsoft Releases Video and Speaker Recognition APIs

Microsoft today said its Speaker Recognition APIs and Video APIs are now available in public preview and that it is accepting requests from those wanting to test its forthcoming speech-to-text service. The new APIs are the latest milestone in Microsoft Research's Project Oxford effort to make machine learning mainstream.

The first Project Oxford APIs, designed for facial recognition, debuted last month. And like those, the newest interfaces allow developers to add functionality with just a few lines of code, according to Microsoft. When Microsoft released the facial recognition APIs, the company said it would follow with the speaker recognition, video and speech-to-text APIs by year's end.

Those wanting to test Microsoft's speech to text service, called Custom Recognition Intelligence Service (CRIS), can register at www.ProjectOxford.ai.

As the name implies, the Speaker Recognition APIs can determine who is speaking based on their voice. Microsoft warns that the Speaker Recognition APIs shouldn't be used in place of stronger authentication methods -- they are better suited as a second factor used alongside a password, PIN or a physical device such as a keypad, smartphone or credit card. More practically though, it appears the company is aiming them at customer service-oriented functions.

"Our goal with Speaker Recognition is to help developers build intelligent authentication mechanisms capable of balancing between convenience and fraud," said Ryan Galgon, senior program manager at Microsoft Technology and Research, in a blog post announcing the new APIs. "Achieving such balance is no easy feat."

Galgon noted that the Speaker Recognition APIs learn specific voices during enrollment. "During enrollment, a speaker's voice is recorded and a number of features are extracted to form a unique voiceprint that uniquely identifies an individual," Galgon noted. "These characteristics are based on the physical configuration of a speaker's mouth and throat, and can be expressed as a mathematical formula. During recognition, the speech sample provided is compared against the previously created voiceprint."
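
Microsoft's Project Oxford documentation spells out the actual REST endpoints; the Python sketch below is meant only to illustrate the enroll-then-verify flow Galgon describes. The base URL, endpoint paths and subscription key are placeholders and assumptions, not the published API surface.

    import requests

    # The base URL and endpoint paths below are illustrative assumptions, not the
    # documented Project Oxford surface; check projectoxford.ai for the real API.
    BASE = "https://api.projectoxford.ai/spid/v1.0"
    HEADERS = {"Ocp-Apim-Subscription-Key": "<subscription-key>",
               "Content-Type": "application/octet-stream"}

    def enroll(profile_id, wav_path):
        """Upload a voice sample to build (or extend) a speaker's voiceprint."""
        with open(wav_path, "rb") as audio:
            resp = requests.post(f"{BASE}/verificationProfiles/{profile_id}/enroll",
                                 headers=HEADERS, data=audio)
        resp.raise_for_status()
        return resp.json()

    def verify(profile_id, wav_path):
        """Compare a new speech sample against the previously created voiceprint."""
        with open(wav_path, "rb") as audio:
            resp = requests.post(f"{BASE}/verify?verificationProfileId={profile_id}",
                                 headers=HEADERS, data=audio)
        resp.raise_for_status()
        return resp.json()  # typically an accept/reject result plus a confidence level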


Meanwhile, the Video APIs apply processing algorithms for those editing videos, automating functions such as stabilizing footage, detecting faces and performing motion detection on a frame-by-frame basis. Microsoft said the service improves the video editing process by ignoring shadows and lighting changes to reduce false positives.
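
Because processing a clip can take minutes, the Video APIs work asynchronously: the client uploads the video, gets back a URL for the running operation and polls it until the job finishes. A hedged Python sketch of that pattern follows; the endpoint path and header names are assumptions for illustration.

    import time
    import requests

    # Endpoint path and header names are assumptions; the asynchronous
    # submit-then-poll shape is the point of the example.
    BASE = "https://api.projectoxford.ai/video/v1.0"
    HEADERS = {"Ocp-Apim-Subscription-Key": "<subscription-key>",
               "Content-Type": "application/octet-stream"}

    def stabilize(video_path):
        """Submit a video for stabilization and poll until the operation completes."""
        with open(video_path, "rb") as video:
            resp = requests.post(BASE + "/stabilize", headers=HEADERS, data=video)
        resp.raise_for_status()
        operation_url = resp.headers["Operation-Location"]  # where to poll for status

        while True:
            status = requests.get(operation_url, headers=HEADERS).json()
            if status.get("status") in ("Succeeded", "Failed"):
                return status
            time.sleep(10)  # video jobs are long-running; poll politely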

Posted by Jeffrey Schwartz on 12/14/2015 at 12:55 PM0 comments


Synergy 'Composable Infrastructure' Will Be HPE's Big Play for 2016

Look for the lines between server, storage and network gear to blur further in 2016 as a new crop of converged systems arrives offering what several key suppliers this year have begun describing as "composable infrastructure," which is especially suited for organizations that have moved to a DevOps model of building and managing distributed apps and systems. Hewlett Packard Enterprise last week was the latest to announce next-generation datacenter hardware, called Synergy, that falls under that description.

HPE executives revealed plans to deliver Synergy in the second quarter of 2016 at its Discover conference, which took place in London, where the company also launched its Azure Cloud Platform System (CPS). Synergy is a converged system that combines compute, storage and networking fabric into common resource pools controlled through a single API that enables management via the HPE OneView systems management platform.

The unified API combined with its software-defined intelligence and fluid resource pools will let Synergy automatically optimize itself based on specific applications designed either for traditional datacenters or cloud infrastructure. Arista, Capgemini, Chef, Docker, Microsoft, NVIDIA and VMware are among those that have said they will support HPE's new Synergy API in their respective wares.

Synergy is the latest form of so-called composable infrastructure -- a term HPE, Cisco and others are likely to tout more in 2016. Cisco claims it's the first to deliver composable infrastructure with its Cisco UCS M-Series Modular Servers and the Cisco UCS C3260 Rack Server.

As HPE describes Synergy (which it also argues is the first composable infrastructure), it composes physical and virtual resources into any configuration for any application.

"What it has is fluid resources that can dynamically flex to the specific needs of an application in any deployment model, whether it be virtual, physical or bare metal or even in containers," said Ric Lewis, SVP and General Manager of HPE Converged Data Center Infrastructure, during the opening keynote session at the Discover conference.

These fluid resource pools are the basis of the unified API, Lewis explained. "It allows ops and developers alike to simply access the resources as code. This infrastructure boots up ready to run a workload. There's not a whole bunch of configuration and things like that you need to do. As you add elements, it auto-identifies those elements and presents them ready to run applications."

The unified API provides a single interface to diagnose, discover, provision, search and update Synergy. Lewis emphasized that the composable API is designed to provision the resources needed for specific applications with a single line of code, which eliminates the need for scripting.
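
HPE hadn't published the Synergy API specification at the time of the announcement, so the following Python sketch is purely illustrative of what provisioning through a single unified endpoint might look like; the host, path and payload fields are all assumptions, loosely modeled on the REST style HPE uses elsewhere for OneView.

    import requests

    # Everything here -- host, path, payload fields -- is a hypothetical sketch of a
    # composable-infrastructure call, not HPE's published Synergy interface.
    COMPOSER = "https://composer.example.com"
    HEADERS = {"Auth": "<session-token>", "Content-Type": "application/json"}

    def compose_resources(template_name, count):
        """Ask the composer to assemble compute, storage and fabric for an application."""
        payload = {
            "template": template_name,   # a profile template describing the workload
            "instances": count,          # how many nodes to carve from the fluid pools
            "deployment": "bare-metal",  # could equally be virtual or containers
        }
        resp = requests.post(COMPOSER + "/rest/compositions",
                             headers=HEADERS, json=payload)
        resp.raise_for_status()
        return resp.json()

The point Lewis was making is that the template, not a pile of scripts, carries the knowledge of what the application needs; the same call works whether the target is bare metal, VMs or containers.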

"I like what I see with Synergy," said Patrick Moorhead, president and principal analyst with Moor Insights and Strategy. "It spans server, storage and networking. It uses fluid resource pools across many different environments. The key is the Synergy API." That said, IT decision makers will need to determine whether a composable infrastructure such as Synergy is suited to their applications and workloads.

"It's important for enterprises to have the ability to cost compare to traditional racks and blades, even though comparing cost alone misses the point," he said. "There is a cost to composable's speed and flexibility, and enterprises will want to know what that is."

Posted by Jeffrey Schwartz on 12/11/2015 at 11:03 AM0 comments


IBM Launches Security App Exchange

IBM this week launched a new security marketplace that aims to integrate third-party tools with its QRadar security information and event management (SIEM) platform. The IBM Security App Exchange allows partners and security operation center (SOC) administrators to share security intelligence, workflows, use cases and analytics. It extends upon IBM's X-Force, a 700TB database of security threat intelligence data it gathers from its customers, which the company opened up back in April to anyone who wants to use it. X-Force is now used by 2,000 customers.

The IBM Security App Exchange builds on the X-Force model by letting participants tie into QRadar with custom or commercial tools. IBM kicked off the app exchange with 14 tools, most from IBM as well as from four third-party providers that offer their own specialized monitoring and endpoint protection software.

Among the first partners to offer their tools with the new exchange are Bit9 + Carbon Black, BrightPoint Security, Exabeam and Resilient Systems, whose analytics, endpoint protection and incident response tools are all available on the exchange. The partner tools are integrated with QRadar, which is ranked among the leading SIEM platforms used in security operations centers. IBM touts QRadar's data analytics engine as using threat intelligence to detect and prevent security incidents.  

Partners can use QRadar's APIs to provide compatibility between the platform and their wares.

"This app exchange is a new home for partners and customers to share and download applications, and to enhance and extend their defenses," said Kevin Skapinetz, IBM's director of security strategy. "Think of it as a new way to deploy custom applications leveraging our analytics and our security system as a platform. And equally as important to help with that, we've opened up our analytics platform to support it, so we have new APIs and a new software development kit on top of the QRadar technology."

Brian Hazzard, VP of technical alliances at Bit9 + Carbon Black, explained that by tying its endpoint detection and response tool into QRadar, SOC administrators don't have to pivot between the two products. By exposing the functionality of Carbon Black in QRadar, an administrator or security analyst can view activity in a common interface, Hazzard said.

"That eliminates the swivel chair effect going across products to make the solution unified," Hazzard said. "Ultimately the goal is to make it so that you can do better detection of the threats that are going on and when there is a detection event, the objective is to respond very quickly to ultimately stop the compromise. The key is sharing info and exposing capabilities across the solutions."

IBM's move comes just a week after Microsoft launched the technical preview of its Azure Security Center, with partners that include Barracuda, Checkpoint, Cisco, CloudFlare, F5 Networks, Fortinet, Imperva and Trend Micro. Like IBM's exchange, the new Azure Security Center will aim to share intelligence with Microsoft's new Advanced Threat Analytics, built on the technology of Aorato, a company it acquired a year ago.

"They need to provide Microsoft-branded security offerings without alienating existing and potential security partners," said IDC analyst Robert Westervelt. "[Microsoft CEO Satya] Nadella and his team will have to do a balancing act and Azure Security Center could be the fulcrum  by enabling Microsoft to partner broadly with security vendors that provide specialized functionality and those that Azure customers have already adopted for compliance, controls and visibility. There are definitely similarities to IBM's new Security App Exchange. By offering QRadar, customers have access to custom applications that make the product more powerful and IBM cuts out the cost of acquiring or building out technology to bolt onto the SIEM product."

While software-as-a-service delivery of security is a relatively small market, IDC predicts adoption will grow significantly over the next two to four years. More than half of Web security market revenue will come from cloud-based offerings rather than traditional on-premises gateways by 2020, according to IDC. Westervelt said Amazon's marketplace introduced an easy way for organizations to buy and implement security offerings and noted Splunk has also been successful in adding security functionality from third parties through its application store. "In addition, networking and endpoint security vendors are attempting to capture the opportunity by increasing subscription based security services such as threat intelligence feeds or cloud-based sandboxes to detect advanced threats," he said.

Case in point: Intel Security, which launched its Threat Intelligence Exchange (TIE) based on its McAfee Data Exchange Layer (DXL), counted 16 DXL Alliance partners in October, including Windows privilege management provider Avecto, Brocade, ForeScout, Mobile Iron, Titus and TrapX Security. For its part, IBM's Skapinetz said it has numerous other partners waiting in the wings for its Security App Exchange. "There are others we have in the queue but not ready to talk about," he said. "We'll be adding a lot more in 2016."

Posted by Jeffrey Schwartz on 12/09/2015 at 1:22 PM0 comments


Can EMC and VMware Make the Dell Deal Sweeter?

EMC and VMware are reportedly considering a stock buyback and a restructuring of the Virtustream cloud unit that would make Dell's agreement to acquire the companies for $67 billion more attractive to investors.

VMware shares have declined 18 percent since Dell made its historic and risky deal on Oct. 12 to acquire EMC and take a controlling interest in the virtualization company. The drop in VMware's share price has lowered the per-share value Dell is ultimately paying for EMC from $33 to $30, according to Bloomberg, which first reported on talks to restructure the deal.

Dell has no plans to raise its cash offer for EMC, nor will it shift its stock-to-cash ratio, according to the report, which quotes unidentified sources familiar with the plan. Also under consideration is abandoning the plan to establish Virtustream as a 50-50 joint venture of EMC and VMware, according to the report. EMC acquired Virtustream, a rapidly growing cloud provider, for $1.2 billion earlier this year.

Days after Dell announced its plan to acquire EMC, the joint-venture ownership of Virtustream was announced. But VMware investors reportedly fear potential near-term losses from Virtustream despite forecast revenues of hundreds of millions of dollars next year. Having EMC alone account for Virtustream's results would make the deal more palatable to VMware shareholders.

EMC's go-shop period expires Friday, and no bidder likely to top Dell's offer has surfaced to date.

Posted by Jeffrey Schwartz on 12/08/2015 at 1:08 PM0 comments


How Apple's Open Sourcing of Swift Could Boost Windows

Apple's announcement last week that it has released its Swift programming language to the open source community could help Microsoft accelerate its effort to help developers port their iOS applications to Windows, some experts believe.

In case you missed it, Apple last week announced it has fulfilled its promise to contribute its rapidly growing Swift programming language under the Apache 2 open source license, with a runtime library exception, for building apps that can run on iOS, Mac OS X, watchOS, tvOS and Linux. The Swift source code, along with a compiler, libraries, debugger and package manager, is now on GitHub. The open source move, which includes the launch of the Swift.org site with various resources, should allow other providers to incorporate Swift into their own software and port it to other platforms, said Craig Federighi, Apple's senior vice president of software engineering.

It's possible that one motivation for the move from Apple's standpoint is to extend the appeal of iOS and Macs into enterprises. Not surprisingly, IBM, Apple's key partner in building those platforms into enterprises, was the first to release a Swift sandbox for developers.

"The IBM Swift Sandbox is an interactive Web site that lets you write Swift code and execute it in a server environment -- on top of Linux," wrote John Petitto, a Swift developer working in IBM's Innovation Lab in Austin, Texas on the company's DeveloperWorks blog. "Each sandbox runs on IBM Cloud in a Docker container."

Based on recent reports, it appears Microsoft may be looking to further its efforts to help developers port their iOS apps to the new Universal Windows Platform with tooling called Project Islandwood, while potentially sidelining a similar plan to provide a bridge for Android apps known as Project Astoria. The future of Project Astoria is now uncertain at best and possibly on ice, which drew criticism last week from former CEO Steve Ballmer. Windows Phones must run Android apps, Ballmer reportedly argued at last week's Microsoft shareholder meeting. Besides technical problems and developer rejection of Project Astoria, licensing issues with Google parent Alphabet are contributing to Microsoft's waning interest in supporting Android.

Mark Hibben, an independent iOS developer and Seeking Alpha blogger for technology investors posted another interesting analysis:

Windows 10, Mac OS X and iOS apps are compiled, unlike Android. Compilation takes the programming language code and converts it into the binary instructions that a computer understands. Once the app is compiled, it's relatively straightforward for the operating system to feed those instructions into the processor. The basic similarities between the app architectures of Windows and iOS make porting iOS apps much easier to do.

Android has a completely different app architecture in which an Android Run Time (ART) takes high level Java and C++ code and converts it into the binary code the processor understands. Each Android app runs in its own instantiation of ART. If it sounds cumbersome, it is.

Consequently, Hibben suggests Microsoft has determined it no longer needs to offer bridges for both platforms, since almost any app it would want in its store is already available for iOS. By open sourcing Swift, Apple could be trying to motivate Microsoft to build a Swift compiler for Windows.

If Microsoft were to do so, such a move could potentially accelerate the delivery of popular apps that are yet to find their way into the new Windows Store.  

 

Posted by Jeffrey Schwartz on 12/07/2015 at 11:56 AM0 comments


AWS Adds Active Directory Services

Amazon Web Services is now offering Active Directory as a managed service in its EC2 cloud. The company this week said it's offering three options for its new cloud-based AWS Directory Service.

The least expensive option is Simple AD, which provides only basic Active Directory capabilities. The second is AWS Directory Service for Microsoft Active Directory (Enterprise Edition), based on the most recent version included in Windows Server 2012 R2. The third is AD Connector, which customers can link with on-premises AD domains.

The company has provided documentation to help determine which service is most suitable. For those looking to create or manage user accounts and group memberships, domain-join Amazon Elastic Compute Cloud (Amazon EC2) instances running Linux and Windows, and use Kerberos-based single sign-on (SSO) and group policies, Simple AD is the best choice, according to the company. It's the most suitable option for organizations with fewer than 5,000 user accounts.

Organizations with more than that, or those that require trust relationships between the AWS-hosted version of Active Directory and on-premises directories, are better off using the new AWS Directory Service for Microsoft AD, Amazon recommends. It's available when an administrator chooses it as a directory type and is provisioned as a pair of domain controllers that run in multiple AWS Availability Zones in any region connected to a customer's virtual private cloud (VPC), according to the company. AWS said the service includes host monitoring, recovery, replication, snapshots and software updates, all configured and managed by the company.

AWS describes the AD Connector as a proxy service that links on-premises Active Directory with AWS for customers that don't want to host AD Federation Services or other intricate directory synchronization configurations. The company recommends the connector for those with Active Directory on premises that don't require replication to the AWS-hosted directory. Developers can link to Active Directory using the AWS Directory Service API. Separate reference documentation for that API includes descriptions, syntax and examples of the various actions and data types within the service.
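
For developers, the boto3 SDK wraps that API. The sketch below shows roughly how each of the three directory types is created; the names, passwords, VPC and subnet IDs are placeholders, and the exact required parameters should be checked against the AWS Directory Service API reference.

    import boto3

    ds = boto3.client("ds", region_name="us-east-1")
    vpc = {"VpcId": "vpc-0123456789", "SubnetIds": ["subnet-aaaa", "subnet-bbbb"]}

    # Simple AD: basic directory features for smaller organizations.
    simple_ad = ds.create_directory(
        Name="corp.example.com", ShortName="CORP",
        Password="<admin-password>", Size="Small", VpcSettings=vpc)

    # AWS Directory Service for Microsoft Active Directory (Enterprise Edition).
    managed_ad = ds.create_microsoft_ad(
        Name="corp.example.com", ShortName="CORP",
        Password="<admin-password>", VpcSettings=vpc)

    # AD Connector: proxy authentication requests to an existing on-premises domain.
    connector = ds.connect_directory(
        Name="corp.example.com", ShortName="CORP",
        Password="<service-account-password>", Size="Small",
        ConnectSettings={**vpc, "CustomerDnsIps": ["10.0.0.10"],
                         "CustomerUserName": "svc-adconnector"})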

Posted by Jeffrey Schwartz on 12/04/2015 at 12:01 PM0 comments


Ballmer Criticizes Nadella at Shareholder Meeting

Nearly two years after Microsoft's former CEO Steve Ballmer left the company, he publicly rebuked his successor Satya Nadella for not adequately disclosing cloud revenues. The open critique took place at the company's shareholder meeting Wednesday.

While taking on Nadella might sound like sour grapes, Ballmer has a lot of skin in the game as Microsoft's largest individual shareholder. As reported by Dina Bass of Bloomberg, Ballmer takes issue with the fact that Microsoft only reports run rates, calling that "bulls---" and saying Microsoft should report actual revenues and margins.

"It's sort of a key metric -- if they talk about it as key to the company, they should report it," Ballmer reportedly said at the meeting, held in Bellevue, Wash. "They should report the revenue, not the run rate,"

In discussing the issue with Microsoft, Ballmer said he couldn't even guess what the actual figures are, according to the report. Chris Suh, Microsoft's general manager for investor relations, told Bloomberg: "We enjoy a regular dialogue with Steve, and welcome his input and feedback, as we do from our other investors."

Ballmer also interrupted Nadella when he defended Microsoft's focus on allowing developers to write apps for devices of all sizes with the company's new Universal Apps platform, suggesting the company should give priority to enabling Android and iOS apps to run on Windows. The comment was an implicit criticism of reports that Microsoft has indefinitely put on hold Project Astoria, the effort announced at the Build conference back in April to make it easier for Android developers to port their apps to Windows 10 Mobile.

If Project Astoria is indeed on hold, it possibly means Nadella is making a Hail Mary bet on the success of Windows 10's "Continuum" technology, which switches the Windows 10 phone interface to work on PCs. Without an Android bridge, however, Android users will have less of an incentive to use Windows Phones in the first place.

As for Microsoft's reporting of cloud revenues, the company will have a hard time convincing skeptics that its cloud transition is meeting expectations unless it provides more than just annual run rates.

 

Posted by Jeffrey Schwartz on 12/03/2015 at 12:22 PM0 comments


Demand for Detachables Rise in Declining Tablet Market

Demand for tablets has shifted to those with detachable keyboards, though global shipments this year are forecast to have declined 8.1 percent, according to IDC's latest report. The growing preference for detachables, or 2-in-1s, has helped Windows gain a bigger, albeit still marginal piece of the tablet pie, according to the IDC Tablet Tracker released Monday.

Shipments of tablets are expected to reach 211 million worldwide, according to the IDC forecast, following three quarters of declines. The research firm expects demand for detachable tablets to continue to rise, which is evident in robust sales of Microsoft Surface Pros, the company's release of the Surface Book and a number of new 2-in-1 units released this quarter by the major OEM providers.

"In the coming year, IDC expects shipments of detachable tablets to increase 75 percent. The proliferation of detachable offerings from hardware vendors continues to help drive this switch," said IDC's Tablet Research Director Jean Philippe Bouchard, in a statement. Jitesh Ubrani, IDC senior research analyst, added in a separate statement that Apple's new iPad Pro is likely to help iOS gain more share among prosumers and enterprise users, despite mixed reviews for the new device. "At the same time we expect Windows-based devices  -- slates and detachables combined --  to more than double its market share by 2019, driven by a combination of traditional PC OEMs as well as more household smartphone vendors," Ubrani said.

Windows-based devices are expected to account for only 8.5 percent of all tablets this year, up from just 5 percent last year, according to IDC. Next year, the share of Windows tablets will grow to 10.5 percent and is projected to reach nearly 18 percent by 2019. Microsoft's gains are expected to come largely from declines in Android share, while iOS is forecast to remain flat.

Another noteworthy trend is waning demand for smaller tablets in favor of medium-sized units. Tablets ranging from seven to nine inches in size accounted for 57.7 percent of those sold this year, down from 61 percent in 2014. The share of tablets greater than nine inches but smaller than 13 inches is expected to rise this year to 41.9 percent from 35.8 percent in 2014. By 2019, those medium-sized tablets are forecast to account for 55.1 percent, while the smaller ones will be 43 percent.

Posted by Jeffrey Schwartz on 12/02/2015 at 12:46 PM0 comments


HPE Faces Off with Dell with Launch of Azure Box

Hewlett Packard Enterprise CEO Meg Whitman today fulfilled the promise she made on last week's earnings call that she'd outline a new partnership with Microsoft to make Azure a preferred cloud service.  It turned out Whitman needed some help in doing so, thanks to a poorly timed illness that made it difficult for her to speak during the opening keynote of HPE's Discover conference in London.

Although Whitman made some brief remarks, she deferred to her senior executive team to provide the details. As she had indicated on the earnings call, Azure will be a preferred public cloud and, in return, Microsoft will use the company's gear for key parts of the service. HPE also introduced its own converged rack-mountable system designed to run Azure within the datacenter, taking on Dell, which last year launched the Azure Cloud Platform System (CPS) and in October announced a scaled-down standard version of Azure CPS.

"We are extending our partnership to the cloud capabilities of the edge with a hyper-converged infrastructure," said Antonio Neri, executive VP and general manager of HPE's enterprise group, who was among the executives standing in for Whitman. Joining Neri via a live video stream, Microsoft CEO Satya Nadella, served up his talking points on Microsoft's hybrid cloud strategy saying it's consistent with HPE's. "We are building out a hyper scale cloud service in Azure, but we think of our servers as the edge of our cloud," Nadella said.

The HPE Hyper Converged 250 for Microsoft CPS Standard offers an Azure-consistent system in a 2U chassis running the latest HPE Apollo Gen9 servers, allowing for either three or four dual-server nodes powered by Intel Xeon E5-2640 v3 or E5-2680 v3 processors, configurable with 128GB to 512GB of RAM and a mixture of SAS and solid state drives. The 3-node unit offers up to 8.5TB of storage and the 4-node system supports 11.5TB.

Microsoft's contribution is a multitenant build of Windows Server 2012 R2, the Windows Azure Pack and HPE OneView for Microsoft System Center, according to the spec sheet. Azure Backup and Azure Site Recovery are available as options.

Dell's new CPS offering, available as a four-node converged system, is also in a 2U box. "They are very similar," Patrick Moorhead, president and principal analyst at Moor Insights & Strategy, said of the Dell and HPE Azure CPS offerings. "The main difference is Dell's focus is on service enablement and provisioning and HPE appears more focused on services and integration."

For Microsoft, it should be a welcome sign that both key players are on board with delivering Azure on premises.

Posted by Jeffrey Schwartz on 12/01/2015 at 12:49 PM0 comments


Hewlett Packard Enterprise To Name Microsoft Azure a Preferred Public Cloud

Hewlett Packard Enterprise this week will announce plans to tap Microsoft Azure as a preferred public cloud provider, giving customers who were using its own Helion service a place to go when it shuts down Jan. 31.

CEO Meg Whitman revealed HPE will make Azure a preferred public cloud during the final earnings call for the company previously known as Hewlett Packard Co., which after 76 years split into two separate businesses (HP Inc. and HPE) on Nov. 1.

"Microsoft shares our view of a hybrid IT approach for enterprises and we both see opportunity to simplify hybrid infrastructure for our customers," Whitman told analysts during the earnings call. "Going forward, Microsoft Azure will become a preferred public cloud partner. HPE will serve as a preferred provider of Microsoft infrastructure and services for its hybrid cloud offerings. Overall the move to a hybrid cloud environment presents a significant growth opportunity for us and you can expect to hear more about our approach in the coming months."

Notice that Whitman described Azure as a preferred public cloud partner rather than the preferred partner. Whitman could announce the partnership as early as this week during HPE's Discover conference, taking place in London. It's not surprising that HPE isn't forging an exclusive partnership with Microsoft, or anyone else for that matter. After all, the company has made a substantial investment in OpenStack and has the assets, through its acquisition of Eucalyptus, to offer private and hybrid clouds compatible with Amazon Web Services.

It will be interesting to learn the details of this partnership, how much emphasis the new HPE puts on Azure and whether it offers its own iteration of the Azure Cloud Platform System. Currently only Dell offers CPS, and last month it revealed a more mainstream version.

HPE's partnership with Microsoft goes beyond Azure. Earlier this month HPE launched new consulting services for customers looking to roll out Windows 10-compatible applications. The offering includes cloud and mobility consulting services as well as vertical industry application transformation.

 

Posted by Jeffrey Schwartz on 11/30/2015 at 11:54 AM0 comments


HP Challenges Surface Pro with Commercial Windows 10 Tablet

HP Inc., the new PC and printer business that spun off from Hewlett Packard Co. earlier this month, is planning to offer a combined Windows 10-based PC-tablet in January that it believes will appeal to businesses. However, like Microsoft's Surface Pro 4, it may have some limitations in enterprise environments. The company has taken the wraps off the HP Elite x2 1012 G1, which looks like a Surface Pro with a similar 12-inch display but is completely encased in aviation-grade aluminum, has more interfaces and added security support, and works with a wider array of peripherals. Detailed pricing wasn't released other than an entry price of $899, which will include a standard keyboard and pen.

Like the most recent Surface Pro models, the new Elite x2 1012 doesn't have a fan, has a kickstand and is designed to offer 10 hours of battery life (actually Microsoft rates the Surface Pro 4 at 9 hours). In a one-hour meeting in New York this week with Keith Hartsfield, VP of mobility and product management with HP's personal systems group, I had the opportunity to look at the system, and it definitely had a more commercial feel without taking on any industrial (aka ruggedized) characteristics. It's thin, with a much more solid keyboard also made of aluminum. HP offers two keyboard options: a standard one and one that's a hair thicker to support smartcards or NFC.

A fingerprint scanner is on the rear of the device itself and the aluminum kickstand lets you hold the device when used as a pure tablet. The device has two USB ports, one full-size USB-A and a USB-C supporting Thunderbolt docking. HP offers multiple docking options: the USB-C-based adapter, a traditional Thunderbolt docking station and its high-end HP Wireless Docking Station, which is based on WiGig, where "you get zero latency, dual display kind of functionality," Hartsfield explained.

Other aspects of the new device Hartsfield emphasized are the Elite x2 1012's support for Intel's vPro systems management, HP Client Management Solution and the company's own HP Client Security offering. Also included is HP's SureStart BIOS, which enables a restore from memory when a system is corrupted by malware.

One capability that may appeal to many IT pros is the standard screws used on the HP device, which let them open the system to replace the battery or drives or to upgrade memory themselves if they choose, or even remove the drive before sending the unit back for servicing, which is a requirement of many government agencies. The kickstand can withstand 10 lbs. of pressure, but if it breaks, it too is connected to the device by screws, making it easy to replace, Hartsfield said.

The Bang & Olufsen audio and multiple microphones with HP's noise cancelling software make the device suitable for Skype for Business or other types of conferencing services. The HP Active Pen, which enables the launch of any app with the push of a button, supports 2,048 pressure points compared with the Surface Pen's 1,024.

While it has a number of features the Surface Pro 4 lacks, many do overlap (both offer TPM and lack fans) and there are some areas where the Microsoft offering outshines the new HP device. The Surface Pro 4 has a 12.3-inch 2736x1824 display compared with the Elite x2 1012 G1's 12-inch 1920x1280 display, and HP's device weighs 2.72 pounds with the travel keyboard (2.89 pounds with the advanced keyboard) compared with 1.73 lbs. for the Surface Pro 4. Also, the Surface Pro 4 comes available with Core M processors as well as Intel's 6th Generation i5 and i7 processors with SSDs up to 1TB, while the new HP system only comes with a number of Core M processors and maxes out with a 512GB SSD.

Hartsfield said to expect additional configurations to appear but believes the Core M processor is best suited for a fanless system and to ensure optimal battery life. "Core M is a really great choice. This is for the mobile professional and executives that are using PowerPoint and Office or watching movies on a plane," he said. "This is for the person who is always on the move needing ubiquitous connectivity, and the ability to use a real keyboard and a good mouse experience."

Microsoft's release of the Surface Pro, which HP also now resells, has helped shape the market for tablet PCs, Hartsfield said. "I'm glad they created a category that needed to be created," Hartsfield said. "And I'm glad that they spent the better part of $1 billion marketing it to create awareness. We weren't first but I do think we're best. I think one of the things you'll see from HP Inc. as a new company is we've got our innovation mojo back."

Posted by Jeffrey Schwartz on 11/20/2015 at 8:42 AM0 comments


VMware CoFounder Diane Greene To Head New Google Cloud Biz

Google yesterday said it is combining its various cloud groups and has tapped VMware cofounder Diane Greene to lead the newly reorganized business. Greene will run a new group that includes Google for Work, the Google Cloud Platform and Google Apps. Bebop, the startup Greene has been running, has also been acquired by Google.

The move will provide a more coordinated group, said Google CEO Sundar Pichai, in a post on the company's blog. "This new business will bring together product, engineering, marketing and sales and allow us to operate in a much more integrated, coordinated fashion," Pichai said. It also brings one of the IT industry's most influential women back into the fold, more than seven years after she was ousted as VMware's CEO by EMC CEO Joe Tucci, when the two failed to see eye to eye.

Greene has since kept a lower profile though she did join Google's Board of Directors three years ago and will remain there, according to Pichai's post. "Cloud computing is revolutionizing the way people live and work, and there is no better person to lead this important area," he said.

Pichai described the startup that Greene was running and now Google has acquired, Bebop, as a new development platform to build and maintain enterprise apps. "Bebop and its stellar team will help us provide integrated cloud products at every level: end-user platforms like Android and Chromebooks, infrastructure and services in Google Cloud Platform, developer frameworks for mobile and enterprise users, and end-user applications like Gmail and Docs," Pichai said. "Both Diane and the Bebop team will join Google upon close of the acquisition."

Posted by Jeffrey Schwartz on 11/20/2015 at 9:11 AM0 comments


Hyper-V Containers Debut in New Windows Server 2016 Preview

Microsoft today released Windows Server 2016 Technical Preview 4, providing a first look at Hyper-V containers, an additional deployment option for those looking to create multitenant environments. Hyper-V containers provide a higher level of isolation, which offers better security, according to Microsoft. The preview is available for download now along with the new System Center 2016 Technical Preview 4.

In addition to the debut of Hyper-V containers, Microsoft has issued improvements to Windows Server containers and the Docker Engine for Windows, which made their appearance in the last technical preview, released back in August.

"Hyper-V containers isolate applications with the guarantees associated with traditional virtualization, but with the ease, image format and management model of Windows Server containers, including the support of Docker Engine," according to Microsoft's Server and Cloud blog post.  "You can make the choice at deployment of whether your application needs the isolation provided by Hyper-V containers or not, without having to make any changes to the container image or the container configuration."

Microsoft acknowledges that while there's still room for improvement, application compatibility is a key focus in the new technical preview. Among the applications and application frameworks that now work with Windows Server containers are ASP.NET 3.5 and 4.6. In addition, the new Nano Server deployment option allows for deployment both as a container host and as a container runtime in which the OS runs within the container. Microsoft said this is "a lean, efficient installation of Windows Server ideal for born-in-the-cloud applications." Microsoft also added support for shared folders, which Docker calls volumes, as well as hostname configuration.

Nano Server, Microsoft's reduced-servicing deployment option, also includes support for Desired State Configuration, which is aimed at helping automate large server deployments and is championed by Jeffrey Snover, Microsoft Distinguished Engineer and lead architect for the server and cloud group. "These give you the tools for rapid iteration and lighter weight DevOps," Snover said.

Nano Server can now run as a DNS Server or Web Server (IIS). Another key new feature added to Nano Server is the Windows Server Application (WSA) installer based on AppX, which Microsoft said provides a way to install other agents, tools and applications on the server.

Microsoft also announced new software-defined datacenter improvements. Building on the Azure-consistent stack of the last technical preview, Microsoft has added high availability to the network controller, along with improved load balancing, container networking and live migration support. Microsoft has also added Virtual Machine Multi-Queue to enable 10G+ performance.

On the storage front, Microsoft has upgraded its Storage Spaces Direct feature to support all flash configurations with NVMe SSD and SATA SSD devices and Erasure Coding, which Microsoft said offers better storage efficiency. The Storage Health Service has improved health monitoring and operations, with one monitoring point per cluster and Storage QoS now supports adjusting the normalization size of the algorithm from the current default 8 KB settings, Microsoft said.

The new security features include shielded VMs and Just Enough Administration, which restricts administrator rights. The latest preview also supports domain controllers and server maintenance roles.

Posted by Jeffrey Schwartz on 11/19/2015 at 12:39 PM0 comments


Docker Launches Universal Control Plane for App Portability

Docker this week introduced the latest part of its commercial stack, which it claims will provide the software to enable the sharing of distributed applications across cloud, operating system and virtual machine environments without requiring developers to reprogram their code.

The Docker Universal Control Plane, launched at DockerCon Europe this week in Barcelona, runs on-premises on Windows and Linux physical or virtual servers, or behind a firewall in a public cloud instance, for the purpose of managing "Dockerized" distributed apps that are in production on any type of infrastructure. It's targeted at DevOps organizations looking for an operational system to deploy and manage distributed applications in production while limiting the tasks required of developers.

"If you were moving your distributed application into a specific cloud, you had to build an operational skill set and tooling aligned with one specific cloud -- sometimes even changing the code to align with the cloud providers that containers are running in," explained David Messina, Docker's VP of marketing. "The benefit of the Universal Control Plane is that it cuts across all of that, it allows you to pick and choose your cloud."

The other benefit, he noted, is it provides a self-service portal for the development team. With infrastructure that has already been provisioned by the ops team, developers can deploy and manage distributed applications wherever they want. "The benefit of that is you have control baked in for IT operations," Messina said. "It comes back to the core tenets of Docker, which is that you want to have things run on any infrastructure, you want freedom of choice, and you want the developers to remain agile through and through."

Messina said a number of large Fortune 500 companies have taken part in private beta tests of the Docker Universal Control Plane, and the company is now opening the beta to a larger group of testers who sign up. Docker hasn't determined general availability or pricing yet, though Messina said the company is hoping to release it in the first quarter of next year.

The Docker Control Plane is designed to run symbiotically with the Docker Trusted Registry, according to Messina. "The registry is where the content is, the control plane is the thing that manages the infrastructure and helps with the deployment of the content," he said. Asked if the two will be offered as a bundle, Messina said that's under consideration.

Also at the DockerCon conference this week, the company announced new capabilities aimed at making its container platform more secure. Docker announced what it claims is the first hardware signing of container images, which builds on Docker Content Trust, a framework it released in August to address a key shortcoming: there was no way to validate content.

The hardware signing comes via a partnership with Yubico, whose YubiKey 4 USB hardware encryption devices prevent access to a machine without the device and an end user's second form of authentication.

"This provides the ability to do hardware signing for container content," Messina explained. "It validates the publisher and being sure that content comes from that publisher. We've also set up a model where the publisher themselves can't be compromised by any malicious or nefarious attack. Their root key, which is the most important key in the process of signing content, is always protected."

Posted by Jeffrey Schwartz on 11/19/2015 at 10:40 AM0 comments


Microsoft Readies Customer Lockbox for Office 365

Microsoft's Customer Lockbox and Advanced eDiscovery tools, announced earlier this year with the aim of providing higher levels of compliance for Office 365, will be available Dec. 1. The company announced the pending release as part of the broad initiative outlined yesterday by Microsoft CEO Satya Nadella to build out an operational security graph and invest $1 billion per year in security for Azure, Office 365 and Windows.

Scott Charney, corporate vice president for Microsoft's Trustworthy Computing, revealed the Customer Lockbox feature for Office 365 in his keynote address at the annual RSA Conference in San Francisco earlier this year. Customer Lockbox ensures that Microsoft employees cannot access customer data or content without their explicit permission. At the time it was announced, Microsoft said the Customer Lockbox feature would be enabled for Exchange Online by year's end and for SharePoint Online in the first quarter of 2016.

If Microsoft needs to access Office 365 content in its datacenters, Customer Lockbox puts an approval process in place, allowing customers to approve or reject the request. Microsoft engineers will be unable to access the data until the request is approved, according to Microsoft's description of the service. "The lockbox gives you the capability to secure your data, to encrypt your data and give only key access when required," Nadella said in yesterday's speech.

Posted by Jeffrey Schwartz on 11/18/2015 at 10:57 AM0 comments


Most Dell and EMC Customers Surveyed Welcome Merger

A month after Dell announced its definitive agreement to acquire EMC for $67 billion, making it the largest IT merger ever, it appears customers of both companies feel they'll be better off as a result of the deal. At least that's what one survey of 200 C-level IT execs conducted earlier this month by IT advisory firm Enterprise Strategy Group has found.

The results, published today, show that 75 percent of customers believe the merger of the two companies will benefit their organizations, while 17 percent don't expect it to have any impact. Only 3 percent said they are worried it will have a negative impact and the other 4 percent don't know.

Customers who rely on both companies for their datacenter infrastructure were even more bullish -- 84 percent said they expect to benefit. Of those who were primarily Dell-only or EMC-only customers, 68 percent and 66 percent, respectively, believe the deal is good.

ESG pointed out that the survey was not commissioned by any third party, making the data all the more noteworthy. It's also interesting to note that the top benefit expected from the merger is "complete and more innovative solutions," according to 65 percent of the respondents. Rival Hewlett Packard Co. completed its planned split arguing that two more-focused firms would be more innovative. The second-largest benefit is that customers believe the two companies will offer more stability combined than apart (58 percent), followed by an expectation that Dell-EMC will provide lower-cost infrastructure (55 percent). More than half (53 percent) see using the combined company for complete solutions spanning from endpoint devices to datacenter infrastructure, while 50 percent welcome purchasing from fewer vendors.

One area where customers may not put all their IT investments into a combined Dell-EMC basket is in network infrastructure. When asked about those who procure software-defined infrastructure from the VCE alliance, a majority (55 percent) said they'll buy networking components from EMC and Cisco, not Dell (though 26 percent would purchase components coming from both Dell and EMC).

Once combined, 60 percent say they expect to spend more with the new company while only 1 percent said they'll spend less. The obvious but important caveats to this particular data set are a) many of the questions are clearly predicated upon the transaction actually closing and b) this data represents the attitudes of the current Dell-EMC install base.

"While retaining and growing these customers will be critical to the new firm's success, its aspirations will not stop there," according to the ESG report. "As the new Dell-EMC looks to steal market share from incumbents such as HPE and IBM, and fend off encroachment from cloud entrants like AWS (among others), its ability to foster the levels of interest described by respondents in this brief with these non-customers will be essential."

Some of these findings may fly in the face of the conventional wisdom that when industries consolidate, fewer choices lead to higher prices and deteriorated service (anyone who has flown on a major airline lately can attest to that). I suggested as much to ESG Senior Analyst Colm Keegan, but he doesn't believe that comparison holds. "There are still plenty of market pressures coming from legacy vendors [IBM, HP, Microsoft, etc.], cloud vendors [AWS, Microsoft, Google, etc.] and emerging technology companies [Nutanix, SimpliVity, Nimble, Solidfire, etc.] to keep Dell/EMC honest," Keegan said. "Plus with the huge debt service on the deal [$47 billion], they can ill-afford to start shedding customers by not over servicing them and keeping them happy."

The bottom line, he added, is that there will still be concerns about how the combined company integrates its sales and engineering teams, ensures any staff reductions aren't overly disruptive and rationalizes the product portfolios without leaving clients in the lurch. "But the initial client sentiment at this point is positive," Keegan said. "Execution, as always, will be key."

Are you as optimistic about this merger as the ESG respondents?

Posted by Jeffrey Schwartz on 11/18/2015 at 11:49 AM0 comments


Satya Nadella Lays Out Microsoft's Security Roadmap

Microsoft CEO Satya Nadella today gave a holistic view of the company's focus on security and revealed it is building an intelligent operational security graph. Emanating from its Cyber Defense Operations Center in Redmond, this graph is a framework that allows real-time intelligence to be gathered and shared across every Microsoft offering, ranging from sensors, Xbox and Windows devices to the datacenter and public cloud. Nadella said Microsoft is sharing the graph with a new ecosystem of partners and also revealed the company is investing $1 billion in R&D to build security into Windows, Azure and Office 365.

Nadella addressed an audience of federal IT security pros in Washington, D.C. at the Government Cloud Forum (a replay is available on demand), where he spelled out the new intelligent security graph. He described it in the context of reporting on the progress Microsoft has made in addressing the threat landscape of 2015 and beyond.

Nadella spoke of the numerous new security features in Windows, Office 365 and its public cloud Azure with the release of the Azure Security Center, but also spoke of the huge escalation in breaches this year. "2015 has been a tough year around cyber security," Nadella said. "Just the top eight or so data breaches have led to 160 million data records being compromised." But Nadella also emphasized that it takes an average of 229 days for an organization to detect an intrusion, suggesting that the graph will help reduce the time it takes to discover and respond to threats.

Time will tell, but today's speech could be remembered as Microsoft's Trustworthy Computing Initiative 2.0. While Nadella didn't describe it as such, he referred to the original Trustworthy Computing Initiative launched by Bill Gates, the company's co-founder and first CEO, and suggested Microsoft is now shifting that focus from software development to operations.

"Fourteen years ago, Bill Gates wrote about Trustworthy Computing as a priority for Microsoft, and we have made tremendous amount of progress on it but with this changing environment, which is no longer just about our code, and the threat modeling and the testing but it is in fact about the operational security posture that we have in this constantly evolving environment, this constantly under attack," Nadella said. "The operational security posture to me is where it all starts."

At the center of the graph that enables Microsoft's new operational security posture is the Cyber Defense Operations Center, located on the Redmond campus, which I recently had the opportunity to visit. It is a war room rivaled by few others in the world, which Microsoft uses to detect threats in advance and share the information with all of its related product and service groups as well as its partners.

"We don't have silos, we actually have people who are able to in real-time connect the dots between what's happening across all of these services," Nadella said "That operations center, and the output of the operations center, is this intelligence graph that is being used in turn by our products to create security in the products themselves, and we share that intelligence broadly with our customers [and] with our partners."

Among the partners that Microsoft has tapped to integrate their security offerings with Microsoft's Azure Security Center are firewall suppliers Barracuda, Trend Micro, Cisco, Fortinet and Check Point. Also revealed were partners whose wares will integrate with the Microsoft Enterprise Mobility Suite's Intune service. The integration will let those partners utilize the policy, security settings and data protection capabilities of Intune. Among them are Adobe, Acronis, Box, Citrix, Foxit and SAP.

"Box for EMM with Intune for is the only way that customers can fully manage and secure Office files on mobile devices," explained Chris Yeh, Box senior VP for product and platform, in a blog post. "With the offering, users can determine which applications interact with Box and access corporate content, and implement additional controls and policies on Box and other managed applications."

Nadella also took time to spell out the new levels of security Microsoft has introduced across the board over the past year. The presentation included some demos from Julia White, Microsoft's general manager for Office 365.

In describing the new intelligent security graph and the move to bring it into operation, Nadella has put forward his own Trustworthy Computing Initiative.

Posted by Jeffrey Schwartz on 11/17/2015 at 12:41 PM0 comments


Will Apple's iPad Pro Kill the PC?

Curious to see the new iPad Pro that began shipping last week, I went to Best Buy Saturday to check it out and, as I expected, it's an overgrown version of Apple's standard tablet. Indeed it has many refinements, including a faster processor, sharper resolution and much better speakers. But it's very expensive for what it is.

Comparisons to Microsoft Surface Pros, other laptops and even MacBook Pros are to be expected and, for some, the iPad Pro might be a suitable replacement. Generally speaking, though, they're different beasts. When Apple's original iPad came out five years ago, it was truly a novel device that let us do things we couldn't before. The iPad Pro does offer new capabilities, such as an optional pen for drawing, which certainly gives it new use cases, but it's not anything revolutionary.

Apple CEO Tim Cook stirred up the debate last week when he said, "I think if you're looking at a PC, why would you buy a PC anymore?" That may be his goal -- and it certainly was the goal of his predecessor, the late Steve Jobs -- but it still seems like a stretch to say no one would have a reason to buy a PC any more.

I have an older iPad and use it quite extensively to consume content. But when it comes to work and productivity, the PC is a far more useful device. At this point, I wouldn't want to replace either, but if the Universal Windows Platform lives up to its promise, I can see giving up my iPad. It's hard to see anyone wanting to give up their PC or Mac for an iPad Pro, and certainly not for one that will cost more than $1,200 if you opt for the 128GB version.

Even if I were to replace the iPad I have now, I'd likely swap it out for the more affordable and lightweight iPad Air 2, which is half as expensive and can handle almost everything its larger sibling can.

What's your take on the new iPad Pro? Do you see buying one, or your organization's employees using them? If so, will it be to replace or merely supplement their PCs?

Posted by Jeffrey Schwartz on 11/16/2015 at 11:23 AM0 comments


Microsoft Helps Developers Build Emotion Detection into Apps

In most cases it's relatively easy to get some sense of how others are reacting or feeling in a live situation, but online or via videoconference, such subtleties are much more difficult to detect. Microsoft this week released the public beta of an API and tool that lets developers program the ability to detect emotions into their apps.

The new emotion API, which debuted at Microsoft's Future Decoded conference in London, was developed by the company's Project Oxford team and demonstrated by Chris Bishop, head of Microsoft Research in Cambridge, U.K., during his keynote address. Microsoft revealed Project Oxford at its Build conference back in April.

Microsoft describes Project Oxford as a portfolio of REST-based APIs and SDKs that allow developers to add intelligence into their applications and services using the company's machine learning technology that comes out of Microsoft Research. Among the APIs now in beta are facial detection, speech and computer vision.

This week's new tool, released to beta testers, is designed to let developers build the ability to detect the eight most common states of emotion: anger, contempt, fear, disgust, happiness, neutral, sadness or surprise. The state is detected by reading the kind of facial expressions that typically convey those feelings. In a blog post, Microsoft described some scenarios where those APIs would be useful, such as developing systems that let marketers gauge reactions to a store display, movie or food, or creating apps that render options based on the emotion recognized in a photo or video.
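
To make the idea concrete, here is a rough sketch of what calling such a REST-based API from Python might look like. The endpoint URL, subscription-key header and response fields are assumptions based on how the Project Oxford beta was documented at the time, so treat this as illustrative rather than definitive.

```python
# A rough sketch of calling the beta emotion API with Python's requests library.
# The endpoint, header name and response shape are assumptions drawn from the
# Project Oxford beta documentation of the time; substitute your own key and image.
import requests

SUBSCRIPTION_KEY = "your-project-oxford-key"  # placeholder
ENDPOINT = "https://api.projectoxford.ai/emotion/v1.0/recognize"  # assumed beta endpoint

headers = {
    "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
    "Content-Type": "application/json",
}
body = {"url": "https://example.com/photo-of-a-face.jpg"}  # publicly reachable image URL

response = requests.post(ENDPOINT, headers=headers, json=body)
response.raise_for_status()

# The service returns one entry per detected face, each with a bounding
# rectangle and a score for each of the eight emotional states.
for face in response.json():
    scores = face["scores"]
    dominant = max(scores, key=scores.get)
    print(face["faceRectangle"], dominant, scores[dominant])
```

Because the scores come back as a simple dictionary keyed by emotional state, picking the dominant emotion is just a matter of taking the highest value, as the last few lines show.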

Microsoft also showcased a scenario tied to the facial hair fundraising effort Movember, for which the company released MyMoustache, a tool that rates facial hair. Microsoft also released a spell check API beta. It's a context-aware programmatic interface that can detect slang as well as proper word usage (such as when "four," "for" or "fore" is correct). It also supports brand names and commonly used terms.

By year's end, Microsoft will be releasing additional tools from the Project Oxford team, including a video API based on some of the same technology found in Microsoft Hyperlapse that can automatically clean up video. Also coming by year's end are speaker recognition tools and the Custom Recognition Intelligent Service (CRIS), which can detect speech in noisy environments.

Posted by Jeffrey Schwartz on 11/13/2015 at 12:46 PM0 comments


Windows 10 Ready for Enterprise with First Major Update

Microsoft today has begun rolling out the first major update to its Windows 10 operating system and has effectively declared it ready for enterprise deployment.

The new update includes a number of usability improvements to Windows 10, including a boost in performance and added features for components of the OS such as Cortana and the Edge browser. It also marks the launch of the Windows Store for Business.

The November Update, as Microsoft describes it, is critical for those considering rolling out Windows 10 in their businesses, said Terry Myerson, executive VP of Microsoft's Windows and Devices Group, in a blog post announcing the release. Also known as Version 1511, the update is available on MSDN.

"With this update, there are improvements in all aspects of the platform and experience, including thousands of partners updating their device drivers and applications for great Windows 10 compatibility," Myerson said. "Windows 10 also starts rolling out to Xbox One today and select mobile phones soon. But most importantly, with this free update we have reached the point in the platform's maturity where we can confidently recommend Windows 10 deployment to whole organizations."

While Myerson claimed that the update offers improved performance and faster boot times, he underscored that with this release Windows 10 is now ready for enterprises, thanks to the availability of Windows Store for Business and Windows Update for Business.

"Windows Store for Business provides IT a flexible way to find, acquire, manage and distribute apps to Windows 10 devices -- both Windows Store apps and custom line of business apps," Myerson said. "Organizations can choose their preferred distribution method by directly assigning apps, publishing apps to a private store or connecting with management solutions."

As reported last week, the Windows Store for Business will be critical to enterprises looking to deploy their own custom-developed Universal Windows apps that aren't in the general Windows Store, as well as to those looking to enable private stores and application license management.

Meanwhile, Microsoft claims that Windows Update for Business will add management controls for IT pros. For instance, they can set up Windows 10 device groups for end users with staged deployments.

The Windows 10 update also includes support for mobile device management and Azure Active Directory Join, giving users a single login while letting administrators join any Windows 10 device to an enterprise domain.

The update now lets users turn off telemetry data, though Microsoft recommends against doing so. Not included in this update but "coming soon" according to Myerson, is enterprise data protection, allowing organizations to segregate enterprise data from personal information and content.

The new features added to Cortana include the ability to recognize e-mail addresses and phone numbers, event tracking, the ability to book rides on Uber (which Microsoft has invested in) and support for additional countries besides the U.S. including Japan, Australia, Canada and India. The Microsoft Edge browser also has improved performance, security and preview capabilities.

Posted by Jeffrey Schwartz on 11/12/2015 at 10:06 AM0 comments


Will Potential Tax Liability Derail Dell-EMC Deal?

Could the historic deal reached by Dell to acquire EMC for $67 billion unwind if the IRS decides it wants a piece of it? Dell officials are reportedly concerned that the IRS could rule that the transfer of the tracking stock that comes with EMC's 81 percent stake in VMware could trigger a taxable distribution that would add $9 billion to the bill.

That's a worst-case scenario posed by the tech business site Re/code, citing multiple unidentified Dell insiders saying such a scenario is a concern. According to the report, Section 355, a provision of the U.S. tax law, describes circumstances that could trigger a tax upon the transfer of the VMware shares.

"Simply put, the law is intended to prevent corporate spinoffs or share distributions from helping pay for an acquisition, which appears to be what Dell is attempting to do,"  wrote Re/code author Arik Hesseldahl.  "If the IRS were to rule that the tracking stock qualifies as a taxable distribution of shares as defined in Section 355, it would remove a key plank of Dell's financing for the transaction. At minimum it would require Dell to borrow more money to pay EMC shareholders for the full value of the company. At worst, sources said, the added tax expense could derail the deal entirely."

Dell is aware of the risks, he noted, having made note of it in the Oct. 12 merger agreement on file with the SEC, which states: "Neither the Company nor any of its Subsidiaries has been a 'controlled corporation' or a 'distributing corporation' in any distribution occurring during the two-year period ending on the date hereof that was purported or intended to be governed by Section 355 of the Code."

Rather, because EMC hasn't assumed any control of VMware over the past two years, the deal should "qualify as an exchange described in Section 351 of the Code," according to the merger agreement.

"This is a valid worry, but not a deal breaker," FBR Capital Markets analyst Daniel Ives told Reuters. "We see Michael Dell as making sure this deal goes through, even if it takes some deal tweaks along the way."

Posted by Jeffrey Schwartz on 11/10/2015 at 12:39 PM0 comments


Refresh Your Surface Pro 3 with a New Keyboard

One of the many stories of last month's Surface Pro 4 launch was the new keyboard, which is not only easier to use but available with an embedded fingerprint scanner that'll let you log in using the new Windows Hello authentication feature in Windows 10.

The new Surface Pro 4 keyboard has thicker keys that are more responsive and more widely spaced, and the wider trackpad is also much easier to use. Having used one in place of the original keyboard offered with the Surface Pro 3, I can attest that it's much easier to type on.

Just as useful is the built-in fingerprint scanner that lets me log into Windows by touching it rather than typing my password. Not only does it dispense with the tedious effort of entering a password to get into Windows, but it's also more secure. Right now the scanner, which implements Windows Hello, is limited in that you still must use your password for other systems and sites, but it was worth the $30 premium for the keyboard.

Perhaps you're wondering why Microsoft didn't put the fingerprint scanner on all the new keyboards for the Surface Pro 4. The reason seems to be support for earlier Surface models. The new Surface Pro 4's camera can enable Windows Hello using facial recognition; older Surface systems don't have that camera, and those users may still want the fingerprint scanner option.

Even if you purchased a Surface Pro 3 when it was released in June of last year, you'd be hard-pressed to justify replacing it at this point unless you have a real need for more storage or the new processor. But swapping in the new keyboard can go a long way toward making it feel new and improved.

Posted by Jeffrey Schwartz on 11/09/2015 at 12:49 PM0 comments


Tools Debut This Week To Boost Compliance

While IT pros have historically had to concern themselves with meeting compliance requirements, the rise of hybrid cloud, mobility and the ease with which data can now be moved have led organizations to place increased emphasis on them. That's especially the case for those that must meet specific governance guidelines, but it's becoming common sense for any business or organization with a stake in maintaining better control of its information, since the burden is on them to guard against data leakage.

Several suppliers have added new compliance and auditing capabilities to their wares in recent weeks, including Centrify, Netwrix and Okta.

The latter, one of the leading identity and access management suppliers, this week announced an update to its Okta Mobility Management tool. The update gives the tool new compliance reporting, allowing administrators to view configurations within apps and compare them with their policy or identity management systems of record, the company said. Administrators can also adjust access based on group policies. Okta added a new rules engine that simplifies assigning users to existing groups based on their memberships, and new workflow capabilities are on the way as well.

Okta, which held its Oktane15 customer conference in Las Vegas this week, also announced the suite will offer consistent device management for both PCs and Macs, allowing common security policies across all devices. The update also includes an iOS Safari extension that will allow users to log in to their accounts from Safari on their iOS devices.

Netwrix, another provider focused on managing permissions, today launched a new version of its auditing tool. The new Netwrix Auditor 7.1 boasts simplified access to historic audit trails, enabling administrators to determine the root cause of breaches when unauthorized access is suspected. Reporting in the new release is improved to provide account permission audits showing who has access to file shares and storage systems, allowing administrators to ensure privileges are consistent with their business requirements, the company said. The new release also generates system health reports and adds support for NetApp ONTAP-based storage clusters; it already supported Microsoft, VMware and EMC storage.

Centrify, which also targets compliance with its own identity management suite, added support for five key cloud access security brokers (CASBs). The integrations with CloudLock, Elastica, Imperva, Netskope and Skyhigh Networks aim to protect SaaS apps such as Microsoft's Office 365, Box, Dropbox, Google Apps and Salesforce.com against data leakage by providing data loss prevention.

"Our new CASB partnerships mean that end users benefit from password-free access to SaaS apps, while privileged users and IT benefit from maximum visibility and monitoring of app usage and suspicious behavior to ensure security is not compromised, " said Bill Mann, Centrify's chief product officer, in a statement.

Through the integrations, Centrify said it can provide a common interface to configure SAML-based access tokens for enforcement and inspection purposes and allow for monitoring of privileges and access control.

Posted by Jeffrey Schwartz on 11/06/2015 at 11:12 AM0 comments


Intel Steps Up Focus on Security

Intel Security last week made the case to partners and customers that it is now ready to be their primary supplier of enterprise security solutions. This caps a year of major revamping for the business.

Following last year's renaming of the business from McAfee to Intel Security (though the McAfee branding remains on specific products), the company wants to be seen as one of the big boys, challenging the likes of Cisco, IBM, RSA and Symantec. It aims to do so by securing everything -- sensors, mobile devices, PCs and all points in the enterprise datacenter and public cloud -- while providing the wares needed to equip a security operations center (SOC).

While Intel Security wants to provide many of those wares, it will partner with others to ensure it is providing all of the protection enterprises need. At its Focus event last week in Las Vegas, executives said they want to help customers reduce the number of security vendors needed to defend against an ever-growing threat landscape. Over the past year Intel Security, which had annual revenues of $3 billion last year, has reshaped its corporate strategy to focus on the expanding attack surface spanning endpoint, network and cloud control points.

The company has brought in seasoned security veterans, starting with Chris Young, senior vice president and general manager of Intel Security Group, who came over from Cisco, where he was senior VP of its security business. In his keynote address to partners at Focus last week, Young said Intel Security is shifting away from a strategy that relies on acquisitions toward internal development and partnerships.

"McAfee traditionally was all about buying," Young said, pointing to a long history of buying companies and trying to make them fit. "Everything we did, we acquired this company, we acquired that company. I think that's something we're going to have to change. We're putting more emphasis on building a lot of our technologies." Young didn't rule out buying companies where it makes sense.

Young presided over the introduction of two key new products that he said will bring home the message that the company is not the McAfee antivirus company of the past. One is the new McAfee Endpoint Security 10.x, an endpoint threat detection and response platform, which Young said is based on significant architectural changes from the initial version released early last year. Besides improved performance, the company claims it offers visibility into advanced threats along with high-speed detection and remediation.

The other new product, McAfee Active Response, provides continuous monitoring while giving administrators views via the McAfee ePolicy Orchestrator (ePO) management console. "We're getting into the game to help our customers be better at hunting and detecting for the undetectable threats with McAfee Active Response," Young said. "We can offer our customers the ability to go out and become really, really compliant by giving them better detection and correction, by automating the process that the smartest security analysts have to follow if they're going to go out and look for the hardest to find threats in their customers' environments. McAfee Active Response is designed to do just that."

Young also talked up support for its Threat Intelligence Exchange (TIE), based on its McAfee Data Exchange Layer (DXL), which the company describes as its "architecture for adaptive security." Intel announced the exchange last year and described DXL as "a real-time, bidirectional communications fabric allowing security components to operate as one immediately sharing relevant data between endpoint, gateway, and other security products enabling security intelligence and adaptive security."

Intel says it enables product integration via an open API that ties to any layer of the exchange without requiring point-to-point integration.

Intel said 16 vendors are now DXL Alliance partners, including Windows privilege management provider Avecto, ForeScout, Titus and TrapX Security. The company added two more last week: Brocade and MobileIron.

Analysts are watching Intel's new security push closely. At a dinner with press and analysts hosted by Intel last week, I sat next to Frank Dickson, research director for information and network security at Frost & Sullivan, who said Intel Security is in a transition phase that kicked off with the hiring of Young.

"We are just starting to see the fruits of the change," Dickson said. "The key is to do more, faster. Active Response is a good first step, but it is only first step. Security professionals are desperate for tools to simplify the administration of security. Intel Security needs to do more. It seems headed down that path. Additional analytics need to be brought to the problem of security; automated analytics that does not involve a human.  I did not hear much at all on automated analytics."

Posted by Jeffrey Schwartz on 11/05/2015 at 12:11 PM0 comments


Microsoft and Red Hat Announce New Open Source Partnership

In what Microsoft is describing as perhaps its deepest partnership with another major enterprise infrastructure player to date, the company is joining forces with leading Linux and open source provider Red Hat to ensure that their respective operating systems and cloud platform technologies interoperate.

The partnership, announced today, will involve a Red Hat engineering team moving to Redmond to provide joint technical support for Red Hat Enterprise Linux workloads running in the Microsoft Azure public cloud and on its hybrid cloud offerings. The pact also calls for applications developed on the Microsoft .NET Framework to run on RHEL, OpenShift and the new Red Hat Atomic Host container platform.

If describing this pact as the deepest partnership ever sounds like hype, consider that neither Microsoft nor Red Hat has made such a claim about other jointly announced partnerships.

 "We don't do this depth and level of support with any other partner at this point," said Paul Cormier, Red Hat's president of products and technologies, during a Web conference announcing the pact. "It's a much more comprehensive partnership than we have with any of our other public cloud providers. The colocation of our support teams is really a significant differentiator for our enterprise customers."

The other differentiator is that Windows and RHEL are the most widely deployed enterprise server platforms, he said. Scott Guthrie, Microsoft's EVP of Cloud and Enterprise, who was on the webcast with Cormier, agreed.  "I am not aware of us doing anything like this with any other partner before," Guthrie said.

Making the agreement even more noteworthy is it's coming from companies that once had nothing but disdain for each other. "There wasn't much trust there," Cormier said flatly.

The devil will be in the details and the execution of what the two onetime rivals announced today, and in whether it lives up to its promise. Specifically, there are five components to what the two companies hope to deliver:

  • Combined support services for hybrid cloud including Red Hat products and on-premises customer environments running on Microsoft Azure. "What this means is as we bring our solutions together to solve customers' real problems," Cormier said. "We need to do that in such a way that we can really give enterprise class support that our customers want, need and expect." To accomplish this, he said Red Hat will collocate its engineers with Microsoft's so when issues or questions arise regarding integration points, their respective engineers will work together to address them.
  • Red Hat will use the newly open sourced .NET technologies in its platforms including RHEL, its OpenShift cloud PaaS offering and Red Hat Atomic Host, the company's container offering. This integration will let developers build applications that include .NET services, which could be more appealing now that Microsoft has open sourced the framework. "I think this will give us greater interoperability across the technologies and really start to give our customers the heterogeneous world that they really want," Cormier said.
  • Subscriptions to Red Hat products supported on Azure and Windows will be supported by Red Hat the same way that they are supported when running in traditional enterprise environments. Red Hat will also expand cooperation and work on Windows being supported on Red Hat products including RHEL, OpenStack and OpenShift.
  • Red Hat's systems management offering CloudForms will include management of workloads on Azure. This should provide a common management pane for physical workloads and private cloud environments, including its OpenStack distribution and OpenShift.
  • Microsoft is joining the Red Hat Certified Cloud and Service Provider Program (CCSP) to provide certified support for all of its products in Azure. With Microsoft joining the CCSP, Guthrie said that "Red Hat subscriptions will become portable to Microsoft Azure, with full Red Hat cloud access, enabling a consistent application platform on and off premises."

The two teams will start their collocation in Redmond over the next few weeks, Guthrie said. Over time, they may send teams to other locations as well. In the coming months, Guthrie said the two companies will let customers sign up and pay for licenses of Red Hat products on Azure on a usage basis.

"You'll see the integration of the Red Hat CloudForms offering with both Azure and our System Center VMM product that will enable consistent workload management in a hybrid way," Guthrie said. "We'll also enable Red Hat enterprise Linux Atomic Host on Azure and basically deliver a supported and certified small footprint container host that customers can use for enterprise Linux containers."

The deal is the latest evidence that Microsoft CEO Satya Nadella meant business when he said last year that "Microsoft loves Linux," a shift further evidenced in our September cover story on how Microsoft's DNA is changing. Now the onus is on both companies to make their respective wares interoperate as they promised today.

Posted by Jeffrey Schwartz on 11/04/2015 at 1:44 PM0 comments


Office 365 Customers Slam Decision to End Unlimited OneDrive Storage

Microsoft giveth and Microsoft taketh away. In what has quickly resulted in unwelcome backlash, Microsoft last night announced it's withdrawing its unlimited storage option for consumer Office 365 accounts. Why? Apparently some customers felt that entitled them to store their entire digital movie collections, in some cases exceeding 75TB. The limit scales back to 1TB effective immediately. Customers will have a year to keep what's already there but are unable to store more.

Not surprisingly, the move is going over as well as JetBlue's decision to start charging customers to check their bags earlier this year. "Wow, wtf is Microsoft thinking... They essentially just killed OneDrive," tweeted customer Al-Qudsi, a dental student who's likely to remember this when he sets up his own office.

The problem with Microsoft's reasoning is that it should have realized when it made the offer that some would actually take the company up on it. "I don't understand why companies offer unlimited something and then are surprised when some people try to use it," tweeted tech editor Harry McCracken, now a contributor to Fast Company. Many customers slammed Microsoft on its own blog post announcing the change.

"I've been using OneDrive since the early days of SkyDrive and I have more than 300GB of storage from Surface and early adopter bonus," in a comment posted by Jim. "I will cancel my Office 365, demand a *full* refund, and move my files back to my NAS. I will also file a complaint to the BBB. The OneDrive team, you should really be ashamed of yourself. MSFT spent the last two, three years to rebuild the trust with its power users, and you managed to destroy all that hard work by a blog post."

Microsoft explained its reasoning in the announcement. "Since we started to roll out unlimited cloud storage to Office 365 consumer subscribers, a small number of users backed up numerous PCs and stored entire movie collections and DVR recordings. In some instances, this exceeded 75TB per user or 14,000 times the average. Instead of focusing on extreme backup scenarios, we want to remain focused on delivering high-value productivity and collaboration experiences that benefit the majority of OneDrive users."

The changes, according to the post, are as follows:

  • Microsoft is no longer providing unlimited storage to Office 365 Home, Personal, or University subscribers. Subscriptions now include 1TB of OneDrive storage.
  • 100GB and 200GB paid plans are no longer an option for new users and will be replaced with a 50GB plan for $1.99 per month in early 2016.
  • Free OneDrive storage will be reduced from 15GB to 5GB, and the camera roll storage bonus will be discontinued, at some point in early 2016.

This isn't the first time Microsoft has scaled back on its storage offerings in OneDrive. Back when it was called SkyDrive, Microsoft offered all customers 25GB of free storage, then slashed it to 7GB, until just over a year ago, when it reversed course and announced the unlimited option. "Today, storage limits just became a thing of the past with Office 365," announced Chris Jones, who was corporate VP for the OneDrive and SharePoint team until he was reassigned back in March, as reported by blogger Mary Jo Foley.

However, if you're among those who never received the unlimited option, you're not alone. "During the past few months, I've heard from several dissatisfied Office 365 subscribers who never received their unlimited storage upgrades," wrote Ed Bott, in his ZDNet Ed Bott Report blog.  "Now we know why."

How severe the backlash will be remains to be seen, but it's likely to raise questions about Microsoft's free Windows 10 upgrade offer. At the same time, there are many who won't be fazed by the latest move. I've come across many people (some even IT pros) who were unaware of the amount of capacity Microsoft offers. For me, 1TB isn't shabby, but for those sold on the more generous option, that may offer little solace.

Posted by Jeffrey Schwartz on 11/03/2015 at 11:34 AM0 comments


HP Officially Splits into 2 Companies

Today marks the first day of business for two new companies spawned from the split of one of the oldest and most storied technology companies in Silicon Valley. Hewlett Packard Co. completed its Nov. 1 split to form HP Inc., the former company's PC and printer business, and Hewlett Packard Enterprise (HPE), which provides IT infrastructure and services.

The official split caps the company's shift in strategy announced a year ago. To mark the occasion, HP CEO Meg Whitman, who now takes on that role for HPE while serving as HP Inc.'s chairman, rang the opening bell of the New York Stock Exchange. Combined, the companies recorded more than $110 billion in revenues for their most recent fiscal year, with HPE accounting for $53 billion and HP Inc. $57 billion. Wall Street analysts had broached the idea of the original HP splitting into two companies for more than a decade, ever since many questioned the wisdom of its merger with Compaq Computer. When Whitman took over as HP's CEO four years ago, she overturned a hastily announced plan to sell off HP's PC business, arguing that having both PCs and enterprise hardware offerings gave the company greater scale than its rivals, notably Dell, IBM and Lenovo.

Last year, Whitman reversed course, saying that two companies would be more competitive than one. Apparently Michael Dell and his investors see things quite differently, given last month's deal by Dell to acquire EMC and its stake in VMware for $67 billion, the largest acquisition in the IT industry.

"They are two diametrically opposed strategies," Whitman told CNBC's David Faber in an interview from the NYSE this morning. "It's quite interesting that we have chosen to deleverage our balance sheet to get smaller [and] to be more nimble and lean into new technology like our 3PAR all flash storage array, like the next generation of networking and servers and our software business, while Dell has chosen to get much bigger, lever way up and really consolidate and make a cost play around older technology. We looked at Hewlett Packard's strengths and we said being smaller and more nimble in this market is a huge advantage."

Some question whether that's the case, including the outspoken analyst Toni Sacconaghi of Sanford C. Bernstein & Co., who told The Wall Street Journal that "a big part of the market is moving in a single direction, and [HPE] arguably doesn't have offerings to cater to those [customers]. Traditional vendors have challenges in terms of that migration, but you could argue that others have at least taken more visible, positive steps in that direction."

Those "traditional" vendors he identified were IBM and Oracle, which Sacconaghi argues has a better-articulated hybrid cloud strategy than HPE. Last month, HP said it was shutting down its public cloud, opting to resell cloud services from Amazon Web Services and Microsoft.

"The truth is that Amazon and Azure are way out in front in the public cloud, and our view was is there a way to partner with them while we provide the hybrid environment," Whitman told CNBC's Faber. "It's the classic where to play, and how to win, and we decided we have a better chance of being the leader in private cloud and virtual private cloud, managed private cloud and then of course our Cloud System Automation that helps you orchestrate your cloud in a multi cloud environment."

Now that the two companies have split, there are a number of outcomes to consider:

  • Will the new HP Inc. become a more innovative and competitive provider of PCs and printers and will it be able to compete on price as it has in the past?
  • Despite a huge portfolio of storage, server and networking hardware, how will the new HPE stack up against Cisco, Oracle, IBM and of course a combined Dell-EMC?
  • What alliances will HPE make that were never under consideration in years past?
  • Does HPE become an aggressive acquirer or itself an acquisition target?

"You never know what's going to happen," Whitman said. "This is a very dynamic environment which is one of the reasons we thought it was important to be smaller, so we could be more agile in an environment that requires that."

What's your take on HP's split and the pending combination of Dell and EMC?

Posted by Jeffrey Schwartz on 11/02/2015 at 9:14 AM0 comments


Will Google Merge Android and Chrome OS?

Google has denied a report that it is two years into an engineering effort that would bring both its Android and Chrome OS together as one operating system. The plan, reported by The Wall Street Journal citing unidentified sources, calls for the effort to be rolled out in 2017, though a preview of the integrated OS could appear next year.

However, according to several reports, Google is not killing Chrome OS. Hiroshi Lockheimer, senior VP for Android, Chromecast and Chrome OS, tweeted, "There's a ton of momentum for Chromebooks and we are very committed to Chrome OS. I just bought two for my kids for schoolwork!"

The report said Google's engineering team is looking to meld Chrome OS into Android, the most widely deployed operating system on tablets and phones. The notion of combining the two has often been floated, considering the small share of Chromebooks in use, which is believed to be less than 3 percent. Yet they are popular in certain markets, notably education.

If Google were to follow through with the plan that was reported, it would be a tacit acknowledgment that a browser-modeled operating system that largely eschews running applications and storing data locally has limited appeal.

At the same time, combining the two operating systems could position Google to offer Android on PCs. Such a move could potentially mount a challenge to Mac OS and Windows, though it would remain to be seen if the new operating system's functionality would offer a meaningful threat.

Given that the source of the report was two unidentified engineers, it appears there's at least some internal development taking place. But as we know, many technologies developed in companies' research labs never make it out.

Update Nov. 2: Lockheimer offered more clarification in an official blog post. "Over the last few days, there's been some confusion about the future of Chrome OS and Chromebooks based on speculation that Chrome OS will be folded into Android," he noted. "While we've been working on ways to bring together the best of both operating systems, there's no plan to phase out Chrome OS."

A number of new Chromebooks are on tap for release next year, he added, also noting the new Chromebook for Work. While Chrome OS may not be going away anytime soon, Lockheimer did point to the App Runtime for Chrome (ARC), which allows Android apps to run on Chromebooks. "We have plans to release even more features for Chrome OS, such as a new media player, a visual refresh based on Material Design, improved performance, and of course, a continued focus on security," he said.

Posted by Jeffrey Schwartz on 10/30/2015 at 2:48 PM0 comments


Veeam Reaches for Larger Enterprises with Availability Wares

Having made a name for itself with its unique approach to backup and recovery -- focusing on VMs rather than physical machines in small and midsize datacenters -- Veeam is gunning to move up the food chain with its new enterprise availability suite and cloud connectivity tool.

Veeam now wants to be known as the provider of "availability for the always-on enterprise." The company will take a key step toward that goal with Veeam Availability Suite v9, slated for release this quarter. The new release adds an unlimited scale-out data repository and built-in replication in one solution. With its tight integration with Hyper-V and VMware, as well as with storage devices from its alliance partners Cisco, EMC, HP and NetApp, the company claims it'll deliver recovery time and point objectives within 15 minutes.

In addition, it will offer improved support for cloud services with enhancements to its year-old Cloud Connect tool that'll make it available to clouds of all sizes, ranging from small single-site locations to the largest, with support coming for Microsoft Azure.

"Veeam has built a great business in the medium-sized business and SMB," CEO Ratmir Timashev said a keynote address at the company's second annual VeeamOn conference in Las Vegas, where I spent several days talking with company executives, industry analysts, customers and partners. "We want to be the de facto standard within the enterprise."

The company indeed has made inroads in taking on larger workloads but some of its claims are ambitious, said Dan Kusnetzky, principal analyst with Kusnetzky Group. "They talk about availability in the modern datacenter but they don't cover all the workloads in the datacenter," Kusnetzky said. "For example, they don't do anything with mainframes and they don't do anything with Unix systems." Nevertheless, Enterprise Strategy Group Analyst Jason Buffington said that "they're still the one to beat in virtualization." Noting the added support for tape, added snapshot integration and last year's support for physical Windows servers, Buffington said that Veeam is making inroads into supporting legacy systems.

Veeam took a further step in legacy system support by announcing Veeam Backup for Linux, a free agent that can run on any physical Linux server. It'll let administrators restore backups from on-premises servers as well as cloud instances and will work with the new v9 suite. A closed beta will kick off in the first half of next year. "They're incrementally getting rid of those last reasons to keep that legacy solution," Buffington said.

Doug Hazelman, Veeam's senior director of product strategy, made no bones about the fact that the company is looking to take on some of its larger rivals, such as CommVault, IBM and Veritas (the company Symantec is spinning off that offers Backup Exec and NetBackup). "We've been having a lot of success," Hazelman said. "There's been several instances where our new license cost is less than the maintenance renewal for their existing software, and yet we have even greater capabilities."

The company also said it has 1,000 providers offering its new Cloud Connect tool and announced at the event the launch of the new Veeam Managed Backup Portal, aimed at making it easier for additional providers to roll out cloud-based backup-as-a-service offerings. See my report about the Azure-based portal on our sister site Redmond Channel Partner.

Posted by Jeffrey Schwartz on 10/30/2015 at 10:05 AM0 comments


Microsoft Celebrates the Release of Windows 10 Devices at New NYC Flagship Store

More than five years after opening its first retail stores, Microsoft finally has one in New York City. The store, which opens today, will serve as the flagship of the company's 100-plus locations. The new store, also by far its largest, opens on the same day Microsoft is making its new lineup of Windows 10 devices available for purchase, including the new Surface Book, Surface Pro 4, Lumia phones and Microsoft Band 2.

The new store is in the heart of midtown Manhattan on 5th Ave., just a few blocks away from Apple's flagship retail outlet. Although I was unable to attend the grand opening, the new store is vast and carries a wide variety of Microsoft-branded products, including some not displayed at any of its other locations. Most notable for now is the Surface Hub, Microsoft's giant-screen conference room system that allows for video meetings using Skype for Business.

Also on display is HoloLens, Microsoft's futuristic-looking glasses that let users see holograms, though customers rushing there to see it can look but not touch: the units are in a sealed glass showcase. The location is a premier spot surrounded by some of the world's largest retailers, a popular tourist destination and home to many large Fortune 500 companies. Microsoft will have specialists on hand who speak a total of 19 languages.

Like its other retail stores, the flagship location will have an Answer Desk and a floor for conferences and presentations. But it also has a huge two-story video wall showcasing Microsoft's offerings. "The larger footprint means a deeper customer experience of Microsoft's ecosystem in what we consider to be one of the greatest shopping districts in the world," said Soligon, general manager for worldwide marketing, in a statement. "It really is an awesome canvas to be able to highlight products and have them come to life for customers."

Whenever Microsoft opens a new store, it holds large festivities, and this one will include the launch of the Xbox One game Halo 5: Guardians, letting customers meet the game's developers from 343 Industries. Also sure to garner attention is a free concert tonight by rapper Pitbull at Rockefeller Center, a few blocks away.

Posted by Jeffrey Schwartz on 10/26/2015 at 11:51 AM0 comments


Private Equity Investors To Buy SolarWinds for $4.5 Billion

IT systems management provider SolarWinds this week agreed to be acquired for a healthy $4.5 billion by private equity investors Silver Lake Partners and Thoma Bravo. That's a 44 percent premium over the company's share price as of Oct. 8, right before the company disclosed it had been approached by an unsolicited bidder.

At the time, the company acknowledged that it had retained J.P. Morgan as its financial advisor and DLA Piper LLP to provide legal counsel. That's a nice payday for SolarWinds investors. But what does that mean for SolarWinds customers?

"Becoming a private company will provide SolarWinds with optimal operating flexibility to execute on its long-term strategy of providing superior products for IT and Dev Ops Pros all over the world," said SolarWinds President and CEO Kevin Thompson, in a statement. "We are extremely excited about partnering with Silver Lake and Thoma Bravo in the next chapter of the SolarWinds story."

Indeed, at this week's Dell World conference in Austin, Texas (where, as it happens, SolarWinds is based), Dell Founder and CEO Michael Dell talked up the benefits of being a private company. "No 90 day shot clock," Dell said in his keynote address, referring to the pressure on public companies to meet quarterly growth expectations. He added that the company is now able to focus on customer-oriented initiatives and direct R&D accordingly.

SolarWinds is a popular provider of systems management tools for Windows and VMware environments. It has also extended into hybrid cloud management and entered the managed services provider business with the 2013 acquisition of N-able.

The deal, pending shareholder and regulatory approval, is slated to close next quarter.

Posted by Jeffrey Schwartz on 10/23/2015 at 11:08 AM0 comments


HP To Shut Down Its Public Cloud

As it gets ready to split into two companies in a couple of weeks, HP has confirmed its new enterprise business does not plan to continue its public cloud. HP said it will sunset the public cloud on Jan. 31, 2016; despite huge ambitions to take on Amazon Web Services and others, it never gained ground.

Amazon Web Services, Microsoft and Google have established themselves as the largest global cloud infrastructure providers, and while there are a number of other major providers, including IBM SoftLayer, VMware vCloud Air and Rackspace, their footprints don't currently match the scale and customer base of the big three. Still, there are thousands of small and midsize hosting and cloud providers.

The move by HP to shut down its public cloud isn't a huge surprise. Bill Hilf, senior vice president and general manager of HP Cloud, made it official Wednesday. "We have made the decision to double-down on our private and managed cloud capabilities," he said in a blog post announcing the transition.

As it withdraws from the public cloud business, HP will emphasize its OpenStack-based Helion platform and CloudSystem private cloud offering. The company will also support customers seeking managed virtual private cloud offerings, he said, adding that HP will have some announcements coming within a few weeks.

"Customer tell us that they want the ability to bring together multiple cloud environments under a flexible and enterprise-grade hybrid cloud model," Hilf said. "In order to deliver on this demand with best-of-breed public cloud offerings, we will move to a strategic, multiple partner-based model for public cloud capabilities, as a component of how we deliver these hybrid cloud solutions to enterprise customers."

Hilf added that this will include support for Microsoft's Azure public cloud, along with Office 365, and that HP will utilize last year's Eucalyptus acquisition to provide access to Amazon Web Services, as well as offer PaaS support for Cloud Foundry-based clouds.

Posted by Jeffrey Schwartz on 10/22/2015 at 10:18 AM0 comments


Dell Brings Azure On-Premises Appliance to Mainstream Enterprises

One year after launching the Microsoft Azure Cloud Platform System for service providers and the largest of enterprises, Dell is looking to bring it to mainstream datacenters. Dell Founder and CEO Michael Dell, joined by Microsoft CEO Satya Nadella, introduced the new CPS Standard edition today during the opening keynote of the annual Dell World conference taking place in Austin, Texas.

The much larger CPS Premium is out of reach to all but the largest datacenters. The new version will be available in a single rack-based converged system running as few as 100 virtual machines. CPS provides the functionality that's consistent with the Microsoft Azure public cloud but brings it on-premises, allowing for hybrid or private clouds based on the same infrastructure.

"We've had a very successful launch of the premium Cloud Platform System and now we're going to democratize it and make it more accessible to every business, any size, by bringing a standard edition," Nadella said. "This really brings hybrid computing to everyone. The combination of the work that we're doing between Azure and CPS, I think is the way to deliver hybrid computing and its future for our customers."

Dell says the CPS Standard edition can be deployed in as little as three hours and, like the larger edition and the public cloud, updates and patching are handled automatically for customers. The new configuration, intended for workloads of 100 to 400 VMs, is a turnkey system consisting of a 2U compute rack loaded with Dell's PowerEdge C6320 servers, plus networking and storage built on Microsoft's Storage Spaces, available with hard disk drives and SSDs. It's managed with Dell Cloud Manager.

For now, Dell is the only provider of the Azure Cloud Platform System in a single rack, though companies can build their own version using Windows Server 2012 R2, System Center and the Windows Azure Pack. Doing so is expected to become easier next year when Microsoft delivers Windows Server 2016 and the new Azure Stack, which Microsoft has said will bring more complete Azure functionality, using the same portal interface, than the current Windows Azure Pack provides.

Those deploying the new CPS Standard before the release of Azure Stack will be able to add on that new capability when it becomes generally available.

Posted by Jeffrey Schwartz on 10/21/2015 at 12:40 PM0 comments


Docker Extends Distributed App Portability Effort with Tutum Acquisition

Docker has acquired Tutum, a startup that provides a platform for development, deployment and management of cloud-scale apps, for an undisclosed amount.

The two-year-old startup has built its entire platform and cloud service on the native Docker APIs and has native drivers for Amazon Web Services, Microsoft Azure, IBM SoftLayer and DigitalOcean, among others, to enable the movement of containerized workloads across clouds and private datacenters.

Tutum's offering is still in beta but the company has 24,000 beta users and hundreds of deployments, said Borja Burgos-Galindo, cofounder and CEO of Tutum, in an interview. The company has not committed to a timeframe for general availability, he said.

"What's most important is the notion of letting users bring their own infrastructure," Burgos-Galindo said. "This also goes to Docker's choice to users. What this enables our users to do is bring the infrastructure they have provisioned that Docker may not natively integrate with their on-premises infrastructure."

Today the platform can grab any Linux virtual machine -- whether it's on a laptop, on an OpenStack deployment or on VMware -- and bring it into Tutum, he explained. "Tutum can use that infrastructure as an endpoint on which to deploy the containerized applications," he said.
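
The "native Docker APIs" angle is the interesting technical piece here: any engine reachable over the Docker remote API can serve as a deployment endpoint. The sketch below illustrates that general idea using the Docker SDK for Python; the host address, TLS file names and image are illustrative assumptions, and this is not Tutum's actual API.

```python
# A minimal sketch of driving a remote Docker engine through the native Docker API,
# using the Docker SDK for Python. The host, certificate paths and image are
# placeholders, not details of Tutum's service.
import docker

# Connect to a Docker engine running on infrastructure you've provisioned yourself
# (an on-premises VM, an OpenStack instance, a cloud VM, etc.), secured with TLS.
tls_config = docker.tls.TLSConfig(
    client_cert=("client-cert.pem", "client-key.pem"),
    ca_cert="ca.pem",
)
client = docker.DockerClient(
    base_url="tcp://my-linux-node.example.com:2376",  # hypothetical endpoint
    tls=tls_config,
)

# Deploy a containerized workload to that node, the same basic operation a
# management platform built on the Docker API would perform on your behalf.
container = client.containers.run("nginx:latest", detach=True, ports={"80/tcp": 8080})
print(container.id, container.status)
```

The point of the sketch is simply that the deployment target is interchangeable: swap the `base_url` for a different provisioned host and the same call deploys the same workload there, which is the portability story Tutum is building on.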

It will also be able to grab Windows workloads once Windows Server 2016 with support for Docker Hub becomes available. "We're excited to bring this from the Linux world to the Windows world," Burgos-Galindo said.

The ability to build, ship and run apps in this type of infrastructure will lend itself well to those that need to run in distributed environments comprising a private data center and public cloud environments, said David Messina, vice president of enterprise marketing at Docker.

Tutum has 11 employees, all of whom have joined Docker.  

Posted by Jeffrey Schwartz on 10/21/2015 at 2:44 PM0 comments


Microsoft Announces the Surface 4 'Enterprise Bundle'

In preparation for what it hopes will be larger corporate demand for the Surface Pro 4, Microsoft today announced what it's calling the "Enterprise Bundle," which includes the newest Surface device, a keyboard and a warranty available only to large enterprises (those purchasing thousands of units). The announcement comes in advance of next week's general release of its new crop of Windows 10 devices, notably the Surface Pro 4, Surface Book laptops and Lumia phones.

The warranty program, called Complete for Enterprise, will allow IT to pool devices, so if one exceeds the two-replacement limit, it can still be eligible. Also, in situations where an organization has an unbootable system but doesn't want to send it back because sensitive data remains on the drive, it can destroy the device instead.

Microsoft also disclosed the early corporate adopters for its Surface Pro 4 in an effort to demonstrate current demand by large enterprises for its new devices. "This is the most anticipated device for enterprises in years," said Brian Eskridge, senior director for Surface, in an interview. "The response we've seen for the Surface Pro 4 is by far going to be the fastest Surface adopted by businesses ever."

Granted, it's only the sixth launch of a Surface and the first in nearly a year and a half, but until last year's release of the much-refined Surface Pro 3, it was questionable whether Microsoft's tablet PCs would ever be widely adopted by enterprises.

A unit of Berkshire Hathaway is among the first of 10 large enterprises that Microsoft is revealing today have ordered the Surface Pro 4, mostly on spec. Berkshire Hathaway Automotive has already started replacing older Windows desktops and laptops with Surface Pro 3 devices and now plans to continue its refresh with the Surface Pro 4.

David Austin, Berkshire Hathaway Automotive's CTO, wasn't sure offhand how many units it will ultimately roll out but he said it would be several hundred per month as part of its refresh of older PCs. "We no longer order PCs for our end users," Austin said in response to an e-mail. "This is our one-size-fits-all device. This provides a more seamless experience for the employees and guests."

Asked why Berkshire chose Windows 10 tablets over iPads or Android-based devices, Austin said familiarity with Windows and security, in particular, were key reasons. "We know how to secure, patch, and control access on a Windows device," Austin said. Nevertheless, Berkshire is not rolling out Universal Windows 10 apps. Instead it will be using standard HTML 5 apps in most cases, according to Austin. In addition to the Surface Pro 4s, he said Berkshire will provide the new Surface Book, Microsoft's first-ever laptop, which is also slated for release next week, for power users.

Besides Berkshire Hathaway Automotive, Microsoft today said those ordering Surface Pro 4s include BNY Mellon, Carlyle Group, Clifford Chance, Covana, The Global Fund, Land O'Lakes, USI Insurance and Wessanen. Three educational institutions in the U.K. have also placed orders: Brighton College, Sheffield Hallam University and the University of Central Lancashire.

For Microsoft, the Surface business has gone from a black eye for the company to one of its fastest growing businesses. Since revamping its devices business and rolling out the Surface Pro 3 last year, it has grown to a $3.6 billion business for the 2015 fiscal year. It appears Microsoft has even higher hopes for the new Surface Pro 4. And if the Surface Book becomes a runaway hit, that'll surely be gravy.

Is your organization considering Microsoft's newest hardware?

Posted by Jeffrey Schwartz on 10/20/2015 at 2:52 PM0 comments


Google Apps Free Promo Targets Enterprises Using Office 365

The rapid growth of Office 365 has become a thorn in the side of the Google Apps team, and the company is now fighting back. Google today said it will offer its Google Apps service free of charge to midsized enterprises (organizations from 350 to 3,000 employees) for the duration of their contracts with Microsoft, IBM or any other productivity suite provider. The offer technically applies to any business, but Google is targeting mid-market companies -- those with roughly 250 to 3,000 seats -- and no company will be excluded just because it doesn't fall within that range.

Google has kicked off the offer as it has seen Microsoft overtake its lead in the subscription-based personal and group productivity and SaaS-based e-mail and collaboration markets. While Microsoft reports that there are 15.2 million subscribers to the Office 365 Personal and Home editions, the company hasn't disclosed how many have subscribed to enterprise-oriented SKUs, including licenses with Exchange and/or SharePoint Online. However, a report published in August by security vendor Bitglass showed that Office 365 has overtaken Google Apps over the past year.

Usage of Office 365 increased 300 percent over the past year, with 25.2 percent of enterprises now using the services, compared with 7.7 percent last year, according to the report. Google grew too, but only to 22.8 percent of enterprises, up from 16.3 percent last year. Microsoft has fared even better among enterprises with more than 500 employees, where 34.3 percent use Office 365 compared with 22.9 percent that use Google Apps. Bitglass, which fielded the report, gathered the numbers using traffic reports from 120,000 organizations. Microsoft pointed to the report in the Office Blog and disclosed that Bitglass supports both Office 365 and Google Apps, though its listed partners are Microsoft, Dropbox, Box, Salesforce.com, OneLogin, Okta, Deloitte, Forsythe and Sayers (Google was not listed).

"For several years Google was the standard for cloud based productivity suites. However, with the release of Office 365, Microsoft has provided a viable alternative to Google Docs," said Alan Lepofsky, VP and principal analyst at Constellation Research. "The timing for this offer is good, as many organizations are evaluating if they should switch from Microsoft's on-premises offering to the cloud, and this helps place Google Apps for Work into the decision making matrix."

Google is hoping that if organizations take advantage of the ability to try its apps suite for the duration of an existing Office 365 contract that they'll see some of the benefits it offers. Ryan Tabone, director of product management for Google Docs, explained during an interview that the company has added 350 features to Google Apps, including improved fidelity, support for pivot tables in its Sheets spreadsheet, redlining, real-time editing and the ability to work with legacy file formats and workloads.

Tabone demonstrated two new features added last month that take advantage of Google's cloud-based machine learning capabilities. One was the ability to perform real-time voice-to-text transcription; the other automatically renders visualizations from datasets to surface key business results.

"We are trying to push the industry again," Tabone said. "We are not just a good solution in this space -- we are actually a better solution for most enterprises."

To take advantage of the offer, customers must go through one of Google's 13,000 implementation partners. The company is also offering $25 per user toward implementing and integrating the service.

Posted by Jeffrey Schwartz on 10/19/2015 at 10:51 AM0 comments


New Containers in Azure Aim To Migrate Legacy Apps

Sphere 3D this week said its Glassware 2.0 containerization tools and hypervisor now work with Microsoft's Azure cloud. The entire Glassware 2.0 offering, which includes containers designed to migrate Windows and open source apps to the cloud, now runs on Azure. The two companies announced a partnership earlier this year to ensure compatibility.

Glassware 2.0 consists of a thin hypervisor the company calls a "microvisor" and a proprietary containerization technology that Sphere 3D considers simpler to implement than open source Docker containers because it doesn't require a virtualized desktop. It's designed to port legacy applications to modern architectures without having to rewrite or translate code; DevOps teams can use the microvisor to move Windows workloads into the containers and on to the cloud. The company claims microvisors have the intelligence to use only those components of the OS necessary to migrate software to cloud-style infrastructure.

Glassware 2.0 technology on Microsoft Azure solves the challenges associated with transitioning applications to the cloud, "be it Web scalability, mobility, infrastructure management or cloud compatibility," said Sphere 3D CEO Eric Kelly. The apps can run natively on Windows 10, as well as on iPads, Chromebooks and thin clients. The company says what makes its approach different is that the host operating system isn't installed on a server; rather, only the kernel components the applications need to run are included, which should appeal to those who have end-of-life systems and want to guard against security vulnerabilities.

The company launched two offerings for Azure. The first is Exosphere, which requires a direct engagement with the company and is intended for applications that must scale to hundreds of thousands of users. It lets organizations build enterprise app catalogs of Windows apps ported to this containerized platform with images that they have verified for rollout. Once deployed, the apps are network aware and can be distributed to the various platforms.

The other, intended for applications that are accessed by hundreds or maybe thousands of users, is the Glassware G-Series. It will be available next month in the Microsoft Azure Marketplace and is a virtual appliance for application containerization.

Posted by Jeffrey Schwartz on 10/16/2015 at 12:39 PM0 comments


Microsoft Gives Skype a Wider Net with Expanded Chat and Video

Microsoft wants to broaden the reach of its Skype service and is doing so by making it possible for Skype users to invite those on other services such as Facebook Messenger, Twitter and WhatsApp to join a conversation. The move comes just a week after Microsoft released new previews of Skype for Business for Office 365 users.

The expanded chat capability, released today, is described by Microsoft as a small update that will change the way people use Skype -- and, the company hopes, draw more people to the service.

"Anyone can join the chat as a guest from their computer using Skype for Web and enjoy one to one or group instant messaging, voice and video calls," according to the announcement by the Skype team. "No Skype account or app download [is] required. Now you can use Skype to chat with anyone and not just the people in your Skype contact list."

Overall, the increased emphasis on Skype and Skype for Business shows that Microsoft is looking to get more customers using the portfolio of services, though clearly the focus is on the business customers, especially when it comes to video.

In a research note published on Seeking Alpha, Dallas Salazar, chief analyst at CapGainr.com, said he believes Microsoft's latest Skype for Business update could make the company a key player in business videoconferencing, given the forthcoming release's integration with Active Directory, improved user interface, new dashboard and integration with Office apps.

"Basically, this puts Skype for Business on-par with [Google] Hangouts from a tech capacity/use case standpoint. When Microsoft updates Skype for Business to be inclusive of the productivity suite access, this will then become my go-to video app. That's going to be a game changer for me," he wrote. "This also is the next important evolution of video chat to pay very close attention to -- the buildout of cross-sell capacity."

The cross-sell will come with Office 365 and the new Surface Hub. Do you envision using Skype more for personal or business use?

Posted by Jeffrey Schwartz on 10/15/2015 at 1:58 PM0 comments


VMware Is the 'Crown Jewel' in the Historic Dell-EMC Deal

Dell's agreement to acquire EMC for $67 billion, which was announced this week, is by far the largest IT industry deal ever, and it leaves Dell poised to have one of the most comprehensive portfolios of server, storage and enterprise IT management software and services. The deal nevertheless was not welcomed by VMware shareholders, who have hammered the stock in the days since the deal was reached.

A day after Dell Founder, Chairman and CEO Michael Dell and EMC Chairman and CEO Joe Tucci talked up the "synergies" and potential cost savings of a deal they said will ultimately generate $1 billion in increased growth in the combined datacenter business, VMware CEO Pat Gelsinger told CNBC why he feels the market's fears are overblown.

"To us the deal is about growth and having Dell and EMC come together in a larger entity accelerating growth of VMware, we think is a huge opportunity for us in the mid and long term," Gelsinger said. "As you move past this period of volatility, the growth starts to become very apparent from the marketplace. We believe ultimately it will be a very good thing for the market, for VMware for our employees and customers."

While the combined companies become a larger provider of IT solutions, Gelsinger said VMware is at the epicenter of that potential growth. "VMware being the crown jewel of the family is about seeing that growth potential," he said.

Given that large mergers rarely live up to their potential and a good number are outright failures, Dell's largest competitor, Hewlett-Packard Co., has the most to gain or lose. In a widely reported letter to employees, HP CEO Meg Whitman said Dell shareholders "will have to pay roughly $2.5 billion in interest alone." Perhaps that's sour grapes for Whitman, who a year ago was rumored to be courting a deal with EMC. Instead, HP is breaking itself into two companies, a split that becomes official Nov. 1.

"If this goes through HP is totally hosed," wrote independent IT industry analyst Rob Enderle. "Dell would appear as a far more complete vendor than HP is and with HP's crippling layoffs its customers will quickly be looking for enterprise-class alternatives. With EMC Dell could aggressively go after that business hitting HP before the firm can stabilize and protect its client base."

While Enderle believes the Dell-EMC deal can succeed, independent analyst Charles King of Pund-IT believes the main issue is that the new entity will create a new, non-voting class of VMware shares (worth 0.111 of EMC's 81% stake in the company), which could devalue and destabilize the common shares. Investors are also concerned that the owners of the new shares (particularly large institutional investors like Elliott Management, which has been agitating for EMC to sell off its VMware stake to create a sizable one-time dividend) will sell off the tracking shares as soon as the deal is finalized, he noted.

"There's also some disappointment that VMware wasn't spun off entirely, mixed in with concerns about Dell's ability to effectively manage the company," King said. "Frankly, this scenario was never likely given the value that VMware has provided EMC over the past decade."

Patrick Moorhead, an independent IT industry analyst with Moor Insights & Strategy, who was interviewed separately by CNBC, said not to overlook that the combination of two large hardware companies with largely complementary product lines could offer greater scale.

"There's a hedge strategy which says you go with the curve and develop scale and that's what I believe is happening here," Moorhead said. "What this will do is create the largest, by scale, hardware company that's out there. That's the play that Dell is making here."

Asked if he believes Dell will be forced to issue more stock or spin off businesses such as the popular SecureWorks unit, Moorhead pointed to the fact that Dell has benefitted from historically low interest rates. Since going to the bond markets two years ago to finance its deal to go private, Dell's debt has risen from junk to investment grade, he noted.

"What that says is they're able to pay down that debt," Moorhead said. "I believe that gave the confidence for this to be primarily a debt deal. If they had to spin off VMware or something like SecureWorks to generate that cash at an even greater clip, they do have that option.  But I don't necessarily think that's Plan A  for them. I think that would be more along a Plan B."

Posted by Jeffrey Schwartz on 10/14/2015 at 1:03 PM0 comments


Dell To Become IT 'Powerhouse' with Deal to Acquire EMC

After a week of speculation, Dell Inc. has agreed to acquire EMC Corp. for $67 billion in what would be the largest IT industry deal ever. The deal, pending approval of EMC shareholders and government regulators, is slated to close sometime next year, and will clearly reshape the IT competitive landscape, giving Dell the leading enterprise storage business, a controlling interest in virtualization giant VMware Inc., the Big Data and analytics business of Pivotal, and the data security and encryption provider RSA.

The combined company aims to provide turnkey solutions spanning PCs and thin clients, servers, storage, cloud, security and integrated software-defined datacenter offerings. "We're putting ourselves in an incredible position in this combined powerhouse enterprise company," said EMC Chairman and CEO Joe Tucci on a conference call this morning with analysts and media. The call was led by Dell founder, Chairman and CEO Michael Dell, VMware CEO Pat Gelsinger and Egon Durban, managing partner at Silver Lake Partners, which is providing $3.5 billion in capital to help finance the deal; the deal will also be funded through the high-yield bond markets.

Both companies are major forces in the IT industry in their own right, though EMC lacked a server and networking business. Its controlling interest in VMware, which will continue as a publicly traded company, will help Dell extend into the growing segments of mobility management via AirWatch, software-defined datacenter, and next-generation virtualization, which includes public and private cloud, micro-services, and containers.

"From earlier reaction I've had from some of our best and longstanding customers," Dell said, "customers don't want to have to integrate things themselves," pointing to the potential to provide new technology based on the companies' complementary technologies. Dell didn't elaborate to what extent the new company would provide such integrated solutions.

With the huge technology and product portfolio of the combined company, the new Dell would become a formidable competitor to IBM Corp., Oracle Corp., Cisco Systems Inc., Amazon Web Services Inc. and even Microsoft. It also raises questions as to what extent Dell will be motivated to see its relationship with Microsoft evolve further, given it will have competing virtualization, cloud, container and related datacenter infrastructure technology.

On the call, Dell said the company would continue its partnerships with companies such as Microsoft, Cisco and Red Hat Inc., while offering competitive solutions, pointing to the industry's long history of "coopetition," the industry catchphrase for partnering with competitors. "We will be thoughtful about how we do that to make sure we do it in the right way," Dell said. "We do have a history of working together in the past and I think that will accelerate our ability to bring solutions to customers in a faster way."

VMware's Gelsinger pointed to his company's longstanding relationships with Dell rivals Hewlett-Packard Co., Lenovo and others. "Dell has demonstrated their commitment to openness over the decades of the company," Gelsinger said. "That's been a hallmark of them and, likewise, the independent ecosystem of VMware."

Michael Dell also noted the company will continue to support its VCE partnership with Cisco and indicated he'd like to extend EMC's federated business model of maintaining separately run businesses such as Pivotal and RSA. "We've certainly studied this capability and have gotten to know the organization and how the federation works [and] we look forward to enhancing that," Dell said. He offered two examples. "Within Dell we have some fantastic new businesses we've been incubating ourselves. SecureWorks would be one [and] we have another one called Boomi, which is a fast-growing cloud data integration company," Dell said. "I think there are some great aspects to this structure and as we come together as a combined company, I think we will be even more powerful."

In a blog post on our sister site VirtualizationReview.com, industry analyst Dan Kusnetzky of the Kusnetzky Group pointed to the vast number of areas in which the two companies would become a larger force. "Dell clearly wants to be in a position to knock on any enterprise door and be welcomed in as a major enterprise player," Kusnetzky said.

Indeed, this huge deal comes at a time when the competitive landscape of enterprise technology providers is reshaping, in part due to large changes in the way organizations procure and deploy IT infrastructure and services. For its part, activist investor Elliott Management was pushing EMC and VMware to find a way to accelerate growth. HP will separate into two companies next month (Hewlett Packard Enterprise and the PC and printer business HP Inc.), and even companies such as Citrix and SolarWinds are under pressure by investors to consider their options.

Under the terms of the agreement, EMC has 60 days to consider a better offer, but given the price Dell and Silver Lake have offered and the fact that this deal has quietly been in the works for more than a year, it doesn't appear likely that a better offer will emerge.

Posted by Jeffrey Schwartz on 10/12/2015 at 1:11 PM0 comments


Unsolicited Suitor Leads SolarWinds to Consider Options

Popular systems management provider SolarWinds today said it is considering the outright sale of the company or other options, following an unsolicited inquiry by a third party.

After Reuters reported on the inquiry, SolarWinds acknowledged it has retained J.P. Morgan as its financial advisor and DLA Piper LLP to provide legal counsel following what the company described as an expression of interest from a third party. It wasn't clear whether the third party made an outright offer to acquire SolarWinds or only wanted to discuss a possible transaction. SolarWinds emphasized it's too early to determine whether the review will result in any type of transaction.

"Consistent with its duties, our board of directors has determined that it is prudent to undertake a review to see which alternative or alternatives, including our standalone plan, are the best way to maximize shareholder value," said SolarWinds CEO Kevin Thompson in a statement. "As the board conducts its review, we remain focused on strong revenue growth and cash flow and excited about continuing to deliver our world class products to help IT professionals manage all things IT."

A spokeswoman said the company had no further comment. The company's stock jumped 13.4 percent on the news Friday, giving it a market cap of $3.6 billion. SolarWinds has been under pressure of late as it makes the transition to cloud subscription models, posting lower-than-expected licensing revenues last quarter, and the company had lowered its revenue guidance in July.

SolarWinds is a popular provider of infrastructure, network and management software and services as well as security, database support and IT help desk software. It is also the parent company of remote monitoring and management provider N-able.

Posted by Jeffrey Schwartz on 10/09/2015 at 1:45 PM0 comments


Microsoft To Release HoloLens SDK and Kicks Off Local Demos

Microsoft is taking its HoloLens augmented reality glasses to the next stage of their evolution from demoware to market development. The company will release an SDK next quarter and is kicking off a multicity road show next week to provide demos of the HoloLens.

At the Windows 10 Devices launch event earlier this week, the company gave airtime to HoloLens and demonstrated a game code-named "Project Xray" that would let players engage in shooting matches with holographic guns. Besides "mixed reality gaming" and entertainment, the company's top brass is bullish on the prospects for HoloLens in a variety of industrial settings.

"Instead of immersing yourself in a world of pixels on a screen, HoloLens brings experiences into our real world, opening a new window into the future of personal computing," said Terry Myerson, executive VP for Microsoft's Windows and devices organization. "Whether it's for productivity, design, healthcare or entertainment, HoloLens creates innovative experiences that are simply not possible on any other device or any other platform."

It appears Microsoft for now wants only those serious about building commercial applications for HoloLens. Myerson said developers must apply for the SDK and fork over $3,000 per device to get their hands on it. "Now is the time we want to unlock the creativity of Windows developers worldwide," Myerson said. Among those already working with HoloLens are NASA, Autodesk and Case Western Reserve University. The road show kicks off in Seattle next week and will hit Chicago, Atlanta, Los Angeles, Minneapolis, Salt Lake City, Toronto, New York, San Francisco, Phoenix and Austin. All of the cities appear to be sold out currently, though you can get onto a waiting list.

Posted by Jeffrey Schwartz on 10/09/2015 at 12:00 PM0 comments


LogMeIn To Acquire Popular Password Manager LastPass

LogMeIn today said it has agreed to acquire LastPass, the popular password management service, for $110 million, furthering the company's push into identity and access management (IAM).

By acquiring LastPass, LogMeIn believes it can help its customers address the common problem of employees using the same password for every application they access. LastPass provides single sign-on access along with the ability to generate complex, encrypted passwords for each system accessed. LogMeIn said that 64 percent of Internet users rely on the same passwords for every application and service they access.
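
LogMeIn didn't detail how LastPass generates those credentials, but the underlying idea -- a distinct, randomly generated password for every system rather than one reused everywhere -- can be illustrated with a short Python sketch using the standard library's secrets module (the character set, length and site names below are arbitrary choices for illustration, not LastPass's implementation):

    import secrets
    import string

    def generate_password(length: int = 20) -> str:
        """Return a random password drawn from letters, digits and punctuation."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    # One unique credential per system, instead of a single reused password.
    for system in ("mail.example.com", "crm.example.com", "vpn.example.com"):
        print(system, generate_password())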

"This transaction instantly gives us a market leading position in password management, while also providing a highly favorable foundation for delivering the next generation of identity and access management solutions to individuals, teams and companies," said Michael Simon, LogMeIn's Chairman and CEO, in a prepared statement.

LogMeIn acquired another IAM provider last year, Meldium, and the company plans to offer both for now, though long-term plans call for developing a common single sign-on service based on LastPass.

Posted by Jeffrey Schwartz on 10/09/2015 at 12:01 PM0 comments


Dell Wants EMC So It Can Gain Control of VMware

Advanced talks between Dell and its backer Silver Lake Partners to acquire storage giant EMC for $50 billion would result in the largest deal in the tech industry and potentially one of the largest leveraged buyouts of a public company. It also appears that Dell's primary interest in EMC is to take control of virtualization giant VMware, in which EMC now holds an 83 percent stake.

The two companies have reportedly been negotiating a combination for several months and such a deal would involve a huge amount of debt -- $40 billion on top of the $11 billion that Dell now has on the books for its own leveraged buyout in 2013. According to CNBC the two companies are looking to tap the high-yield bond markets to finance the deal. The talks were first reported late yesterday by The Wall Street Journal, which said a deal could come together within a week, though it could also fall apart. The initial report indicated that EMC would want to spin off VMware but CNBC is reporting that Dell wants control over the company. "I know that Dell wants to maintain control over VMware," said CNBC reporter David Faber. "While there might be more sold into the public market, it would not be a wholesale sale of the VMware stake or a spin to shareholders or anything like that. Key to the reason Dell wants to own this is to also control VMware now."

While the reporting is still speculative at this point and neither company has commented, CNBC is reporting that there's a strong possibility that a deal could come together. EMC and VMware have been under pressure for some time by Elliott Management, an activist investor with a track record of forcing change on companies, to find new ways to generate shareholder value. Adding fuel to speculation that EMC will do something is the fact that longtime Chairman and CEO Joe Tucci has announced he will retire next year. EMC, the largest provider of enterprise storage, remains under huge competitive pressure from upstarts with lower cost gear and a growing shift toward cloud storage.

Hewlett Packard earlier this year reportedly was in talks to acquire longtime rival EMC as well. Both companies are under pressure to move beyond their core portfolios of hardware as enterprises move more workloads to cloud providers. HP appears to be a less likely candidate to jump into any bidding given it's in the process of splitting into two companies, which will finalize next month.

Dell and EMC have a storied history. When Dell first started expanding beyond its core PC business, it had a reseller agreement with EMC, which later fell apart when Dell started acquiring storage companies such as EqualLogic and Compellent Technologies, among others.

Naturally, Dell taking control of EMC and VMware would have implications for its relationship with Microsoft. While Dell and Microsoft are close partners across both their software and hardware businesses, VMware and Microsoft are key competitors, though they did announce a partnership at the recent VMworld show in San Francisco, where the two are working together to support Windows 10 deployments using AirWatch. In addition to having rival virtual machine platforms, VMware is making a big push in mobility management with its AirWatch unit and its vCloud Air public cloud, which competes with Microsoft Azure. At this point the former doesn't appear to be a threat.

Posted by Jeffrey Schwartz on 10/08/2015 at 7:51 AM0 comments


A More Powerful Surface Pro 4 Puts the Pen at Front and Center

Microsoft's surprise entry into the laptop market with the launch of its new high-end Surface Book may have stolen the limelight at yesterday's Windows 10 Devices launch event in New York but the Surface Pro 4 will likely be the system that addresses the needs of most mainstream users.

There's no question that if you look at the Surface Book, which Microsoft calls the ultimate laptop, you'll crave one until you compare its cost with the Surface Pro 4. If you're a power user it may be worth the premium price but if not, you'll quickly determine that the Surface Pro 4 is no slouch either. The comparison is ironically the same as that of the MacBook Air and the MacBook Pro, which Microsoft compares respectively to the Surface Pro and Surface Book.

While it certainly would be wise to wait and see some of the new systems that emerge from various OEMs before deciding on what to put on your wish list, the Surface Pro 3 has emerged as the most popular Windows device, according to our own reader surveys. With the new Surface Pro 4, which will be available Oct. 26, Microsoft appears to have made some impressive refinements. They include improved resolution, lighter weight, a thinner design and an improved pen that attaches magnetically.

By including the pen, Microsoft is making a clear statement that it wants users to take advantage of the ability to annotate content and Web pages. A surprising 50 percent of Surface owners already use the pen, according to Microsoft. The new pen is more responsive and supports 1,024 levels of pressure sensitivity, designed to provide the feel of writing with a traditional pen and paper. It's based on the pen technology gained in Microsoft's acquisition of N-trig earlier this year.

Customers will also have somewhat better configuration options. The popular version with an i5 processor is now available with 16GB of RAM for an extra $200, bringing the cost to $1,499 with 256GB of storage. The 8GB version is $1,299. A fully loaded system with an i7 processor and 16GB of RAM costs $2,699.

The Surface Pro 4 is also 30 percent faster than its predecessor, according to Microsoft, and it's slightly thinner (0.33 inches) and lighter, weighing just 1.69 pounds. Thanks to a thinner bezel, it has a slightly larger 267 DPI PixelSense display (12.3 inches versus 12 inches) that generates 5 million pixels, while the physical footprint remains the same, allowing support for existing peripherals.

Despite the performance boost from the new Intel Core 6 processors, Microsoft said the Surface Pro 4 is rated at nine hours of battery life, no different from its predecessor, though an engineer at the event told me that, given the way the new CPU manages power, users could see a more efficient system. The new features, along with the improved PixelSense display and a new Microsoft Pen, make the Surface Pro 4 a nice upgrade. But most people who bought a Surface Pro 3 last year will have a hard time justifying replacing it just yet.

Customers, especially enterprise buyers, may want to swap in the new keyboard, which introduces a fingerprint scanner that takes advantage of Passport authentication based on the Windows Hello technology introduced in Windows 10. While the new keyboard's Windows Hello security feature for logons may make its $129 cost justifiable, the fact that the keys are also more responsive and quieter could be an added incentive. The new keyboard will work with both Surface Pro models and, likewise, the older keyboards can connect to the new systems.

Another welcome addition is the new Surface Dock, an improved port replicator that is the size of a small brick. The Surface Dock, which is portable, works with the Surface Book, Surface Pro 3 and Surface Pro 4 and is configured with four USB 3.0 ports, two Mini DisplayPorts, a Gigabit Ethernet interface and an audio output port.

Considering it was once a drag on the company, the Surface line has become one of Microsoft's hottest offerings and, presuming the new hardware performs as advertised, it stands a good chance of reaching new heights.

Posted by Jeffrey Schwartz on 10/07/2015 at 12:46 PM0 comments


Windows 10 Vision Unfolds with a Surprise: The Surface Book Laptop

Microsoft's year-long process of laying out its vision for Windows 10 reached a crescendo today.

At an event for media, analysts and Windows Insiders in New York, the company rolled out new Lumia phones, an upgrade to its Surface Pro line with an improved Windows Hello-enabled keyboard and the company's debut in the high-end laptop market with the new Surface Book. The hardware rollouts put some metal around Windows 10, which originally launched back in July.

The biggest surprise was the introduction of the new 13.5-inch Surface Book. It's a convertible laptop and tablet that Microsoft officials claim can run 12 hours on a single charge and is two times more powerful than a MacBook Pro. "It redefines everything you would expect in a laptop," said Panos Panay, Microsoft's Surface VP, describing it as "the ultimate laptop" too many times to count.

The Surface Book starts at $1,499 for an i5-based system using the new Intel Core 6 processor with 8GB of RAM and scales up to an i7-based system with 16GB of RAM and a 1TB drive, both supporting optional Nvidia GPUs. The Surface Book is targeted at artists, engineers and gamers who require significant processing power. When the top part is removed from the keyboard to function as a "clipboard," it weighs 1.6 pounds; with the keyboard, it weighs 3.5 pounds. The device has a 13.5-inch, 10-point multitouch PixelSense display that renders up to 6 million pixels at 267 DPI, Panay said. "There's nothing close to it," he said. A new Surface Pen supports 1,024 levels of pressure sensitivity. "If you look at a photo on it, it will look real, if you look at a video it'll immerse you," Panay said. The top-end system will cost a hefty $2,699.

The launch of the new laptop shows that Microsoft CEO Satya Nadella remains committed to broadening the Surface line. He's not walking away from it as some critics have called on Microsoft to consider.

Terry Myerson, the executive VP who heads the Windows and devices group, announced that 110 million devices now run Windows 10. He added that 1 billion questions have been asked of Cortana and that users have viewed 650 billion Web pages in Microsoft's new Edge browser. "It's by far the fastest ramp-up we've ever had," Nadella said regarding Windows 10, in an appearance to wrap up the event.

While the Surface Book clearly now stands as Microsoft's premium device, the Surface Pro 4 is poised to have mainstream appeal to mobile workers, students and consumers. It's an incremental upgrade to the Surface Pro 3 but has some welcome improvements, including an upgraded pen that magnetically attaches to the device. There's a new top-of-the-line system configured with Intel's new Core 6 i7 processor, a 1TB SSD and 16GB of RAM, while an entry-level system will have 4GB of RAM and a 128GB drive. The Surface Pro 4 is slightly thinner and lighter than the Surface Pro 3, but the battery life remains the same at nine hours.

The Surface Pro 4 has the same form factor as its predecessor but it has a slightly larger display at 12.3 inches, made possible by a thinner bezel. Perhaps the most noteworthy addition to the Surface Pro 4 is its new Windows Hello-compatible keyboard, with a fingerprint sensor. The keyboard will work with the new system as well as the prior Surface Pro 3 model, Microsoft said. Likewise, those who want to use their old Surface Pro 3 keyboard with the new system can do so, although an engineer at the event told me that "once people see the new keyboard, they're going to want it." Besides the fact it's thinner and, Microsoft argues, more sturdy, it has improved keys that are quieter and have faster transport. It'll cost the same amount as the old one.

Microsoft also launched a Microsoft Band 2 product today with a number of new features and a perhaps more comfortable fit. Also unveiled were three new Lumia phones, including an entry-level unit priced at $139. Lastly, the company announced an SDK for developers who want to build apps for Microsoft's HoloLens augmented reality device.

Most of the new products launched today will be available on Oct. 26. Customers can preorder them now.

Microsoft described today's product launches as the "beginning of a new era" for Windows. The company also will support upcoming product rollouts by its key OEM partners, including Dell, Hewlett Packard, Acer, Asus, Fujitsu, Panasonic, LG and Toshiba.

Posted by Jeffrey Schwartz on 10/06/2015 at 1:12 PM0 comments


Windows 10 Devices Officially Launch Tomorrow

Microsoft took a decidedly low-key approach to the release of Windows 10 back on July 29, largely because the devices optimized for the new operating system weren't ready. But by making Windows 10 Pro available free of charge for users to upgrade on their existing systems, the company has given them a chance to see and interact with the new operating system.

Now that the hardware is about ready and the key fourth-quarter purchasing season is about to heat up, Microsoft and its partners this week will be unveiling a slew of new PCs, tablets and phones optimized for Windows 10, taking advantage of features such as the Windows Hello biometric authentication capability, Continuum, electronic inking, Cortana and Skype. Perhaps most noteworthy is that many of the high-end new ultrabooks and all-in-one desktops will come with the new Intel Core 6 Skylake processor.

At a launch event tomorrow in New York City, Microsoft is expected to debut two new Lumia phones for Windows 10, an updated Microsoft Band and, perhaps most notably, a new Surface Pro. Some reports have indicated there could be two Surface systems -- a 12-inch model with a similar form factor to the Surface Pro 3 and possibly a larger 14-inch configuration to respond to the challenge of Apple's larger iPad.

Suppliers of Windows devices remain deferential in giving Microsoft the first word. Nick Parker, corporate VP of Microsoft's OEM division, showcased some devices in the pipeline at last month's IFA conference in Berlin. Among them was a forthcoming Dell Latitude 11, with an 11-inch display powered by the Core 6 processor and offering a smaller footprint than the popular Dell Latitude 13. Other new business-focused convertible two-in-one systems showcased by Parker were the Hewlett-Packard EliteBook Folio and a new Lenovo Yoga 360.

For the education market, Parker talked up the new Acer Aspire 1 tablet, which is 18mm thin and is equipped with two microphones to make it suited for Cortana and Skype use; and the 14-inch Lenovo IdeaPad, which boasts 12 hours of battery life. Two all-in-one desktops highlighted were the new Dell Inspiron 24 with a Windows Hello camera and an edge-to-edge 24-inch display, and the Asus Zen, also a 24-inch system with a 4k option, Windows Hello camera and support for accelerated graphics cards. Parker also talked up the microphones on the Asus device, an important feature for those using Cortana and Skype.

Another system that stood out was the new Toshiba Radius 12, a 12-inch system equipped with the new Intel processor, a Windows Hello camera and a 360-degree hinge. Speaking of Toshiba, Parker also pointed to the company's Data Logger for industrial Internet of Things operations. The device has 20 sensors, which can track information such as GPS, barometric pressure and temperature, he said.

Come back tomorrow when we report on Microsoft's launch or catch the live stream here.

Posted by Jeffrey Schwartz on 10/05/2015 at 11:40 AM0 comments


Windows Phone's Fans Love It, but Is that Enough?

It's no secret that, as far as most people are concerned, there are two major mobile phone platforms: Android and iPhone. Windows Phones are out there, but they represent such a small minority -- by most accounts, less than 3 percent of the market -- that only fans of the platform seek them out. And it appears those fans are few and far between.

At this week's Visual Studio Live! conference in Brooklyn, N.Y., however, it was a markedly different world. A disproportionately large number of the 300 attendees were carrying Windows Phones. When I did have my iPhone in view, I was jeered on occasion, leaving me to feel like some sort of traitor. Of course, seeing lots of Windows Phones at a conference for Windows developers is hardly a surprise, even if it isn't in sync with the real world.

In the keynote session at the conference on Wednesday, ZDNet blogger Mary Jo Foley, who also is a Redmond magazine columnist, lamented the problem bedeviling Microsoft when it comes to Windows Phone. "I'm one of the 2 percent," she said. "Windows Phone is the one I am the most leery about Microsoft pulling a turnaround."

Yet those that know Foley are aware she likes Windows Phone. "I am a Windows Phone user, not just because of my job [covering Microsoft] but by choice," she said.

In this month's Redmond column, Foley explained how Microsoft has rebooted its Windows Phone strategy again, with Windows 10 Mobile, which will introduce the new Continuum technology that lets developers build for one Windows platform and have the apps carry over to any form factor, enabling users to switch smoothly between PCs, tablets and phones. Microsoft is hoping this will build enthusiasm for Windows Phone. 

Indeed, it's an interesting concept, but even those developers I spoke with at Visual Studio Live! weren't sold on the appeal. It's one thing to want to switch from a PC to a mid-sized tablet, but why would someone want to do so with a phone, wondered UX expert Billy Hollis of Next Version Systems during a Birds of a Feather lunch discussion.

Meanwhile, Windows Phone enthusiasts are dropping like flies. Our editorial director, Scott Bekker, was devoted to Windows Phone for four years and remained hopeful that the apps available on Android and iOS would appear. They didn't and still haven't. When his Lumia phone's screen cracked, he threw in the towel without hesitation and decided to get an iPhone. In the end it was the apps. Windows Phone enthusiasts often insist they don't care about the apps missing from the platform, but the reaction to Bekker's column this week on our sister site Redmond Channel Partner describing his decision to switch was largely supportive.

"I LOVE Windows Phone, but the lack of apps are killing me," wrote Jamison West in the comments section.

The post "expresses exactly how I feel," added Peter Zarras. "I love my [Lumia] 1020 and have no desire to replace it -- but yet, don't yet feel excited or ready to buy a new Windows Phone unless/until certain app limitations are addressed. While I don't miss some apps, I know they're there and some could be helpful to me."

According to unconfirmed but widely reported leaks, Microsoft will debut next week the Lumia 950 and Lumia 950 XL, which are 5.2-inch and 5.7-inch devices, respectively. Both will support Continuum and the Windows Hello biometric authentication feature, according to Foley's ZDNet blog post, and will work with the Microsoft Display Dock, an interface that lets a user connect a phone to a full-size keyboard and monitor, which Microsoft demonstrated earlier this year.

Yet Microsoft will have to defy expectations for these new units to lure customers who don't already have Windows Phones. But for the 2 percenters, the new phones could be a nice holiday gift.

Posted by Jeffrey Schwartz on 10/02/2015 at 1:56 PM0 comments


Microsoft Eases Clustering with Container Service for Azure Resource Manager

Amid the slew of announcements at Tuesday's AzureCon (a series of Channel 9 videos released by Microsoft outlining milestones and the future of the company's enterprise cloud service), Microsoft revealed how its Azure Container Service will simplify the way organizations build, configure and manage clusters.

Specifically, Microsoft announced the Azure Container Service Resource Provider for the Azure Resource Manager (ARM) and released a preview of its new Azure Quickstart Templates, which are available for download on GitHub.

Making the Azure Container Service Resource Provider available for ARM is noteworthy because ARM is the new canvas for deploying and managing virtual machines on Azure IaaS. Microsoft recently rolled out ARM as an alternative to, and intended long-term replacement for, the Azure Service Manager (ASM). Microsoft describes ARM as a much more agile way of deploying and administering virtual machines and containers on Azure than the current ASM.

That's because ARM has a simpler API set than the existing ASM APIs, which power functions such as the Azure portal and are baked into all the Azure tooling, said Michael Collier, a Microsoft cloud solution architect who spoke Tuesday at the Visual Studio Live! developer conference, taking place this week in Brooklyn, N.Y. (Like this site, Visual Studio Live! is produced by 1105 Media.) While ASM isn't going away, Collier indicated Microsoft is emphasizing its development efforts on ARM.

"You will see a lot more focus on Azure Resource Manager going forward," Collier said. "It's where everything is happening and not much innovation is going to be happening with the legacy [Azure Service Manager] API."

The Azure Container Service is the outgrowth of Microsoft's work with Docker and Mesosphere to allow for the deployment and management of scalable clusters of hosts where containerized applications can be deployed, orchestrated and managed, explained Ross Gardler, an Azure program manager at Microsoft, in a blog post. "By leveraging ARM, Azure Container Service will make it easy for you to create and manage clusters of hosts pre-configured with Docker, Apache Mesos, Marathon and Docker Swarm," Gardler wrote. "This work couples Azure's hyper-scale and enterprise-grade cloud with proven open source technologies to deliver the foundation for the container deployment, orchestration, and management service any team building container apps will need."

Gardler underscored that the initial goal of the Azure Container Service is to ease the building and configuration of clusters using Docker and Docker Swarm for code compatibility, along with orchestration and management tools such as Marathon, Chronos and Apache Mesos to ensure scalability to tens of thousands of containers. The to-be-released Azure Container Service Resource Provider for ARM, he explained, will let IT pros define and manage the resulting clusters using the ARM APIs, which will make configuration much simpler by reducing the specialized knowledge it otherwise requires.

"Our Resource Provider will abstract away much of this complexity," Gardler said. "Those thousands of lines will be reduced to tens of lines for default configurations. This simplification means fewer configuration errors when deploying and managing these complex clusters. This new Resource Provider will also allow you to utilize Azure features such as integrated tagging of resources, Role-Based Access Control (RBAC) and the Azure management portal. The result is that you can take advantage of the enterprise-grade features of Azure while still maintaining code portability from the orchestration layer up."

In addition to scaling software, Gardler said it's critical to scale hardware. Hence, Microsoft will leverage the Azure Virtual Machine Scale Sets (VMSS), a service that instantiates functions such as create/delete/update on a group of identical VMs through a single API call. "For Azure Container Service those identical VMs are the agents on which containers will be hosted," he said. "Since all VMs in a VMSS have the same configuration, VMSS supports rapid auto scaling of VMs."
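
To give a sense of how template-driven deployment through ARM works, here is a minimal, hedged sketch in Python that submits one of the quickstart templates to the ARM REST API using the requests library. The subscription ID, resource group, bearer token, template path, parameter name and API version are all placeholders or assumptions for illustration; this is not the Azure Container Service Resource Provider itself, just the general ARM deployment pattern it builds on:

    import requests

    SUBSCRIPTION_ID = "<subscription-id>"     # placeholder
    RESOURCE_GROUP = "container-demo-rg"      # hypothetical resource group
    DEPLOYMENT_NAME = "acs-cluster-demo"      # hypothetical deployment name
    TOKEN = "<azure-ad-bearer-token>"         # obtained separately from Azure AD

    # ARM exposes deployments as just another resource under the subscription.
    url = (
        "https://management.azure.com/subscriptions/{sub}/resourcegroups/{rg}"
        "/providers/Microsoft.Resources/deployments/{name}"
        "?api-version=2015-01-01"             # assumed API version
    ).format(sub=SUBSCRIPTION_ID, rg=RESOURCE_GROUP, name=DEPLOYMENT_NAME)

    # Point ARM at a quickstart template and pass a parameter; ARM expands the
    # template into the underlying VM, network and storage resources.
    body = {
        "properties": {
            "templateLink": {
                "uri": "https://raw.githubusercontent.com/Azure/azure-quickstart-templates/"
                       "master/<template-folder>/azuredeploy.json"   # placeholder path
            },
            "parameters": {
                "agentCount": {"value": 3}    # hypothetical template parameter
            },
            "mode": "Incremental"
        }
    }

    response = requests.put(url, json=body, headers={"Authorization": "Bearer " + TOKEN})
    print(response.status_code, response.json())

The point Gardler makes is that the forthcoming Resource Provider would hide most of what sits inside such a template, so a handful of lines like these stand in for the thousands a hand-built cluster definition would otherwise require.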

Posted by Jeffrey Schwartz on 09/30/2015 at 12:33 PM0 comments


Microsoft Readies Big Data Service That Includes New U-SQL Language

Earlier this year, Microsoft revealed plans to offer a new HDFS-compatible data store, called Azure Data Lake, designed to run large analytics workloads. So far, the technical preview hasn't appeared, but the company today reiterated that the service, which it will actually call the Azure Data Lake Store, will be available later this year, and it also announced some new services planned for its Azure-based Big Data portfolio.

Microsoft describes the Azure Data Lake Store as a single repository that lets users capture data of any size or format without requiring changes to the application as data scales. Data can be securely stored and shared and can be processed and queried from HDFS-based applications and tools, said T. K. "Ranga" Rengarajan, Microsoft's corporate vice president for data platform, in a blog post today outlining the new Azure Data Lake Store.

"Azure Data Lake includes all the capabilities required to make it easy for developers, data scientists, and analysts to store data of any size, shape and speed, and do all types of processing and analytics across platforms and languages," Rengarajan said. "It removes the complexities of ingesting and storing all of your data while making it faster to get up and running with batch, streaming, and interactive analytics.  Azure Data Lake works with existing IT investments for identity, management, and security for simplified data management and governance. It also integrates seamlessly with operational stores and data warehouses so you can extend current data applications."

Complementing the Azure Data Lake Store, Microsoft announced its new Azure Data Lake Analytics, an Apache YARN-based service that's designed to dynamically scale to handle large Big Data workloads. The new Azure Data Lake Analytics service will be based on U-SQL, a language that will "unify the benefits of SQL with the power of expressive code," Rengarajan said. "U-SQL's scalable distributed query capability enables you to efficiently analyze data in the store and across SQL Servers in Azure, Azure SQL Database and Azure SQL Data Warehouse."

In an MSDN blog post today, Michael Rys, a principal program manager for Big Data at Microsoft, explained why U-SQL is suited for Azure Data Lake Analytics:

Taking the issues of both SQL-based and procedural languages into account, we designed U-SQL from the ground-up as an evolution of the declarative SQL language with native extensibility through user code written in C#. This unifies both paradigms, unifies structured, unstructured, and remote data processing, unifies the declarative and custom imperative coding experience, and unifies the experience around extending your language capabilities.

U-SQL is built on the learnings from Microsoft's internal experience with SCOPE and existing languages such as T-SQL, ANSI SQL, and Hive. For example, we base our SQL and programming language integration and the execution and optimization framework for U-SQL on SCOPE, which currently runs hundred thousands of jobs each day internally. We also align the metadata system (databases, tables, etc.), the SQL syntax, and language semantics with T-SQL and ANSI SQL, the query languages most of our SQL Server customers are familiar with. And we use C# data types and the C# expression language so you can seamlessly write C# predicates and expressions inside SELECT statements and use C# to add your custom logic. Finally, we looked to Hive and other Big Data languages to identify patterns and data processing requirements and integrate them into our framework.
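
Based on Rys' description, a simple U-SQL script might look something like the following sketch; the file paths and column schema are purely illustrative, and the Region.StartsWith call shows a C# string method used directly as a predicate inside a SQL-style query:

    @searchlog =
        EXTRACT UserId int,
                Region string,
                Duration int
        FROM "/input/searchlog.tsv"
        USING Extractors.Tsv();

    @result =
        SELECT Region,
               SUM(Duration) AS TotalDuration
        FROM @searchlog
        WHERE Region.StartsWith("en")    // C# expression inside the WHERE clause
        GROUP BY Region;

    OUTPUT @result
        TO "/output/duration_by_region.csv"
        USING Outputters.Csv();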

Microsoft also announced the general availability of managed clusters for its Azure HDInsight service on Linux, which the company claims has a 99.9 percent uptime SLA. The company also is offering Azure Data Lake Tools for Visual Studio and said that ISV solutions can be offered in the Azure Marketplace.

Posted by Jeffrey Schwartz on 09/28/2015 at 2:01 PM0 comments


BlackBerry Looks for Revival with Android Phone

Despite its continued woes, BlackBerry CEO John Chen is showing no signs of throwing in the towel as he continues to make acquisitions to shore up the company's position as a supplier of secure smartphones and device management infrastructure. However, Chen apparently is finally backing away from the company's storied BlackBerry operating system, which has powered its flagship phones, and going in a different direction: the company today said it'll offer a new line of phones powered by Android.

BlackBerry announced the new Android-based Priv in its earnings release today. The Priv name underscores the company's heritage in protecting privacy. "Priv combines the best of BlackBerry security and productivity with the expansive mobile application ecosystem available on the Android platform," Chen said in a statement included in the earnings release, which fell short of expectations.

Despite the shift to Android, Chen insisted BlackBerry remains committed to the company's latest operating system, BlackBerry 10. Chen also underscored the company's progress with its popular BlackBerry Messenger (BBM) platform, including plans to offer new services based on it such as BBM Protected, BBM Meetings and BBM Money. The company also recently released BBM for Windows Phone.

As BlackBerry revenues continue to plummet ($490 million, down from $816 million during the same quarter a year ago), the company's $66 million loss was greater than expected. Chen emphasized on the earnings call that the company is still "known as the leader in secure data with all the encryption technology," pointing to two key acquisitions to advance its position. First is the pending acquisition of Secusmart, a German provider of secure voice and text communication. The other is U.K.-based Movirtu, which provides virtual SIM cards that let users run both personal and business phone numbers on a single device, whether iOS, Android or BlackBerry OS. The company has also agreed to acquire mobile device management supplier Good Technology for $425 million, as reported earlier this month.

By shifting to Android, BlackBerry could find an audience among customers who would never before consider the Google mobile platform because of its known susceptibility to malware, while gaining the millions of apps that come with it. Even so, BlackBerry has a much larger task at hand.

Posted by Jeffrey Schwartz on 09/25/2015 at 12:05 PM0 comments


App Vendors Jump on AirWatch ACE Bandwagon but Microsoft Will Pass

SAP was among 21 application providers that have joined an effort launched earlier this year by VMware's AirWatch business to support device-native, mobile OS standards for enterprise mobility management systems. That brings the total to 44 suppliers that have joined AirWatch's App Configuration for Enterprise group, the company announced today at its AirWatch Connect conference in Atlanta.

Among the other members now on board are Box, Boxer, Cisco, Dropbox, DocuSign, StarMobile, SkyGiraffe, TeamViewer, VMware and Xamarin. However, don't expect Microsoft to join the project anytime soon, despite its close work with AirWatch on ensuring Windows 10 support.

"Microsoft does not have plans to support ACE," a spokeswoman for Microsoft stated in response to an inquiry. "However, Microsoft already supports app management capabilities available in the latest mobile operating systems. Plus we go beyond that to deliver unique mobile application management capabilities with Office mobile apps for secure mobile productivity."

During a conference call with media, VMware executives said they'd like to see Microsoft join and add support for its various apps, including, of course, the various Office components. "We take advantage of Apple's iOSconfig, take advantage of Google's spec for doing individual application setups, and we also have done that for looking at Windows and Windows 10," said Noah Wasmer, VMware's vice president of product management and CTO for end-user computing. "We would love to see Microsoft come to the table and we welcome them to see the open standards that are available on these different platforms, and we would love to see them come to the table and take advantage of the spec."

AirWatch, which VMware acquired two years ago for $1.6 billion, describes ACE as a community project. ACE describes the spec as a way for enterprise application developers to interpret app configurations and security policies from EMM (Enterprise Mobility Management) systems, and for EMM systems to configure and secure mobile applications. With ACE, app developers can build a single application that works across all EMM vendors, without maintaining multiple copies of the application, integrating proprietary code, or relying on SDKs or legacy app-wrapping solutions.

According to ACE, functions such as App Tunnel, Kerberos-based single sign-on and various security settings -- both those that require development effort and those that don't -- will leverage standard APIs and features built into the operating systems and offered to EMM vendors to "selectively enable on devices."

Posted by Jeffrey Schwartz on 09/23/2015 at 12:08 PM0 comments


VMware Adds Windows 10 Management and Privacy to AirWatch

At VMworld earlier this month, VMware promised support for Windows 10 and today the company explained how. The company's AirWatch business unit will add Windows 10 configuration and management to its namesake enterprise mobility management suite. The added Windows 10 support is among a number of announcements made at the annual AirWatch Connect conference, which kicked off today in Atlanta.

It's not unusual that VMware would be supporting Windows 10 mobile device configuration and management by updating the AirWatch EMM suite. What did gain notice back at VMworld in San Francisco three weeks ago was how closely the two companies have worked in the background to enable that support. On top of their longstanding heated rivalry on the systems virtualization front, Microsoft last year decided to jump into the crowded market for enterprise mobility management software with its own Enterprise Mobility Suite, which includes Intune, Azure Active Directory and Azure Rights Management. Microsoft now identifies EMS as a $1 billion market opportunity. Microsoft's entry was a direct attack on AirWatch, which VMware snapped up two years ago for $1.6 billion, its largest acquisition to date. VMware has responded in kind by launching its own single sign-on tool, VMware Identity Manager, which in June was added to its higher-end AirWatch editions and was announced at VMworld as a standalone offering, taking on Azure Active Directory.

Despite going at each other -- again, not unusual for the two companies -- Microsoft is helping VMware make AirWatch more competitive. "In order to address modern security threats, it's critical [that] hardware and software be designed to work together in tight partnership," said Windows enterprise executive Jim Alkove, who joined Sanjay Poonen, general manager and executive vice president for end-user computing at VMware, on stage during the VMworld keynote session. "We love Windows 10 because you've opened up Windows 10 for an enterprise mobile management player like AirWatch," Poonen told Alkove.

At today's AirWatch Connect, the company outlined how the EMM suite will support Windows 10. Perhaps most noteworthy -- and clarifying why Microsoft is helping AirWatch to the extent it is -- AirWatch will enable the onboarding of Windows 10 devices through integration with Azure Active Directory. According to AirWatch, using Azure AD join, organizations can allow individual users to enroll enterprise- and employee-owned Windows devices on their own without needing an administrator. The new Windows 10 support in AirWatch includes the ability to bulk-provision a system without having to reimage it. Both Win32 and Windows Universal Apps are supported, while device management policies can be implemented to ensure secure access to enterprise resources. Such policies include, among others, native e-mail configuration, per-app VPN connectivity and security setting enforcement, the company said.

Besides Windows 10 support, also coming to AirWatch are new privacy controls, aimed at providing transparency and consistency for both IT administrators and employees using their own devices to access corporate information systems and services. The company is adding the AirWatch Privacy First Program that provides transparency to individual users so they know what an IT administrator can see, access and change on their device.

At the same time, it ensures that if data or an application on an employee-owned device could introduce a problem to enterprise applications or data, the organization's apps and information can be removed. Likewise, it ensures there's no potential for data leakage from the employee device. "One of the key areas here is to allow us to maintain security on the devices but do so in a way that doesn't infringe on a user's privacy," said Noah Wasmer, VMware's vice president of product management and CTO for end-user computing.

The new AirWatch Privacy First features will be available next quarter. The Windows 10 support is available now.

Posted by Jeffrey Schwartz on 09/22/2015 at 12:08 PM0 comments


Office 2016 for Windows Arrives with Emphasis on Collaboration

Microsoft today released Office 2016, the latest version of its widely used productivity suite. Subscribers with Office 365 accounts can now upgrade by downloading the new version. With the new release, Microsoft is looking to make the suite's core applications better suited for real-time collaboration on the fly. In addition to built-in real-time coauthoring in Word, IM via Skype for Business and connectivity to Office 365 Groups are among the headline new features.

Also critical, and coming later this month from Microsoft, is improved synchronization with OneDrive for Business, which the company is promising will be more reliable and offer selective sync. The company had announced Sept. 10 that the new suite would be available today. Customers with volume licensing agreements will be able to download the upgrade on Oct. 1. Although Microsoft had released some early limited previews last year, the company didn't offer a broad technical preview until March.

"We designed it to help change the nature of work within organizations of all sizes," said Microsoft CEO Satya Nadella in a blog post announcing the release, underscoring the three scenarios of modern work environments: that users work on multiple devices and locations, they want to collaborate and many use multiple applications. The release also marks Microsoft's new plan for delivering new functions to Office via continuous updates as it is doing with the new Windows 10 operating system, said Kirk Koenigsbauer, corporate vice president for the Office Client Applications and Services team, in the official announcement. "It's a new day for our desktop apps," Koenigsbauer said.

"We set out to make working together easier and more impactful by building a suite of integrated apps and services that removes barriers and empowers teams to do and achieve more," Koenigsbauer noted. That effort includes adding real-time coauthoring to Word initially with plans to add that capability to the rest of the suite over time. The new Office 365 Groups feature is included with Outlook 2016 as well as an Outlook Groups app on iOS, Android and Windows Phone. However, the Groups feature depends on having an Office 365 subscription; it's not available with the standalone Office 2016 product. With Groups anyone can create public or private teams and every group has its own shared inbox, calendar, cloud storage for shared files and a common OneNote notebook to keep the team productive. Skype for Business integration lets users initiate an IM session within a file, or create an audio or video call.

Among the other new features are Tell Me, an improved help feature offered in Office 365, and Smart Lookup, designed to make it easy to perform Web searches while creating a document or other content. Excel 2016 now lets users output their data to Power BI for creating new types of charts (see Redmond's First Look at Power BI 2.0 here).

Some forthcoming features released for technical preview today include a new task-management tool, called Office 365 Planner, which lets users create plans that can be viewed in a dashboard with alerts that track the progress of a project, and GigJam, revealed in July at Microsoft's Worldwide Partner Conference. GigJam is designed to let users quickly retrieve, display and share data across different applications and platforms, with the Cortana digital assistant acting as the facilitator. As we reported back in July, GigJam displays data from different sources as separate card-sized windows on the desktop "canvas." Users can then choose to share each card, as well as how much data from each card others can see. Users can also ask Cortana to link different datasets for convenient sorting. GigJam could be important to the future of collaboration and workflow management, wrote Office 365 and SharePoint MVP Christian Buckley, who is CMO of Beezy.

For enterprises, Koenigsbauer underscored Office 2016's newly added Data Loss Prevention (DLP) features aimed at reducing leakage of sensitive data and support for multifactor authentication. Later in the year, the new Enterprise Data Protection (EDP) capabilities will enable secure exchange of information.

In case you have missed some of our Office 2016 content, you may find Redmond's earlier coverage useful in understanding what's new in this release.

If you've started using Office 2016, share what you find most useful as well as what you don't like about it, by commenting below or dropping me a line at [email protected].

Posted by Jeffrey Schwartz on 09/22/2015 at 9:48 AM0 comments


80 Percent of Mobile Malware Now Strikes Windows PCs

Windows PCs are now the target of 80 percent of all malware hitting mobile networks, according to a report released last week. That may offer little consolation if you're an iPhone or iPad user who has just learned about a malware program called XcodeGhost, a corrupted version of Apple's Xcode development tools that was embedded in a slew of apps, most notably the popular WeChat, representing the first time an exploit has gotten into Apple's App Store.

The rise in spyware and continued attacks on Windows PCs, as well as the continued rise in vulnerabilities in Android, are the latest findings from Alcatel-Lucent's Motive Security Labs, the company's malware analysis lab. The Motive Security Labs H1 2015 Malware Report found that, after a 0.5 percent decline in infections hitting Android-based devices in the first quarter, a surge in attacks led to a 0.75 percent rise in the second quarter, driven by increased adware infections running on Windows-based PCs connected to mobile networks.

Windows PCs connected on mobile networks, particularly via dongles, mobile Wi-Fi devices or tethered to smartphones, are the most vulnerable. "They are responsible for a large percentage of the malware infections observed," according to the report. "This is because these devices are still the favorite of hardcore professional cybercriminals who have a huge investment in the Windows malware ecosystem. As the mobile network becomes the access network of choice for many Windows PCs, the malware moves with them."

Two years ago malware hitting mobile devices was split evenly (50-50) between Windows PCs and Android devices, according to Alcatel-Lucent. The fact that 80 percent of attacks now strike Windows machines and only 20 percent strike Android devices (the amount on iOS and BlackBerry is negligible) is likely the result of Google's efforts to eliminate malware from Google Play and the company's new Verify Apps feature, introduced to Android and available on nearly 80 percent of devices running Android 4.2 (Jelly Bean) or higher. Yet despite accounting for a smaller proportion of devices attacked, the number of Android malware samples doubled in the first half of this year, according to the report.

Despite the release of Verify Apps, most malware distributed to Android devices is delivered via Trojans, and Android remains the easiest target because the platform is open, apps are available from third-party app stores and Web sites, and apps can be self-signed, making it difficult to trace malware back to its developer, the report added. The study also noted that attackers can easily hijack Android apps, inject code and re-sign them.

As for the proportional shift to Windows, the period covered precedes the release of Windows 10, an upgrade with which Microsoft has made Windows a much more difficult target. These findings could bolster the case for upgrading to Windows 10, which adds a number of key new security features including multifactor authentication and biometric identity management.

It'll be interesting to see what the stats look like next year. Who knows where iOS will be in the mix.

Posted by Jeffrey Schwartz on 09/21/2015 at 3:38 PM0 comments


Former Security Strategist Sues Microsoft for Gender Discrimination

A former senior security strategist at Microsoft has filed a class-action lawsuit alleging gender discrimination. The lawsuit, filed this week in a federal court in Seattle, comes nearly a year after Microsoft CEO Satya Nadella’s infamous and poorly received remarks suggesting that "karma" was the best way women should expect to receive salary increases and promotions.

While Nadella swiftly issued an apology, saying Microsoft "wholeheartedly" supports closing the pay gap and responded a week later with a new company diversity initiative, the incident was the latest to put the spotlight on discrimination in the tech industry.

Katherine Moussouris, who filed the complaint, is accusing Microsoft of passing her over for promotions ultimately given to less-qualified men, as reported by Reuters. The report added she was also told by supervisors that they didn’t like her "manner or style."

The complaint also claims Moussouris was given a low bonus after reporting sexual harassment. After seven years with Microsoft, she resigned in 2014 after the company failed to address what she described as "pervasive discrimination."

Additionally, Moussouris complained that other women were also discriminated against and consistently ranked below their male counterparts in routine performance reviews.

"Microsoft systematically undervalues the efforts and achievements of its female technical employees," her attorney Adam Klein of New York-based Outten & Golden told Reuters.  

While at Microsoft, Moussouris "was instrumental in prompting" the company to create its bug bounty program launched in 2013, according to a Wired report. She also was a BlueHat content chair, lead subject matter expert in the U.S. National Body for the ISO work item 29147 "Vulnerability Disclosure" published last year, editor of the 2014 International Standard ISO 30111 on vulnerability handling processes and owner of vulnerability disclosure policy for Microsoft in terms of overall strategy, according to her LinkedIn profile.

Moussouris left Microsoft in May of 2014 to become chief policy officer at HackerOne.

Microsoft in a statement issued to Wired disputed the allegations. "We’ve previously reviewed the plaintiff’s allegations about her specific experience and did not find anything to substantiate those claims, and we will carefully review this new complaint."

Posted by Jeffrey Schwartz on 09/17/2015 at 12:16 PM0 comments


Microsoft Looks Beyond the Enterprise with Azure AD Extensions

Active Directory is used by well over 90 percent of enterprises to authenticate users to core systems, from file servers to the various other resources connected to organizations' networks. In a key step toward extending its reach to support partners and customers, Microsoft today is adding two new services to Azure AD that the company says can scale and manage external identities.

The new Azure AD B2C Basic service was designed to enable support for customer-facing apps, and Azure AD B2B Collaboration will add security for business-to-business partners. Microsoft is releasing technical previews of both new services today. Microsoft indicated an Azure AD B2C Premium edition is also in the works.

 "You just turn on the ability to establish trusted relationships between you and the set of partners who you want to work with," said Alex Simons, Microsoft's senior director for Active Directory, during a conversation prior to the announcement. The B2C (business-to-consumer) service that targets consumers can scale to "hundreds of millions of consumer identities," Simons said in a blog post, noting customers can authenticate now with Facebook and Google credentials and soon to be supported are Microsoft Accounts. The service can support hundreds of millions of consumer identities. The first 50,000 are free. Microsoft posted pricing for those with more than 50,000.

"Along with security and scale, Azure Active Directory B2C also easily integrates with nearly any platform, and it is accessible across devices. This functionality means that your consumers will be able to use their existing social media accounts or create new credentials to single-sign on to your applications through a fully customizable user experience. Optional multifactor authentication will also be available to add additional protection."

The Azure AD B2B Collaboration component will allow organizations to add contractors and business partners, while ensuring resources are protected. It will support single sign-on to resources based on permissions to such apps and services as Workday, Dropbox and Salesforce.com. Simons noted that Microsoft is demonstrating it in the DevZone section at Salesforce.com's Dreamforce, the software-as-a-service company's annual customer and partner event taking place this week in San Francisco.

Organizations using Azure B2B Collaboration can create advanced trust relationships between Azure AD tenants to share access to business applications and data, according to Simons.

A few early access partners, including Real Madrid, lens maker Carl Zeiss and Kodak Alaris, acknowledged in Simons' post that they're testing the new services. In the case of Kodak Alaris, the company set it up to support thousands of partners accessing a new extranet.

"It's the equivalent of setting up a trust between two tenants in Azure Active Directory, the difference being that it's done at an individual group or user level between the tenants," Simons said. "So you wouldn't just have Microsoft say ‘I trust Intel,' it would be Microsoft saying ‘oh I want these five people or these three groups that Intel has specified to be able to use my applications.'"

Posted by Jeffrey Schwartz on 09/16/2015 at 11:29 AM0 comments


Microsoft President Brad Smith: 'We Want to Sustain People's Trust'

By promoting Brad Smith to president and chief legal officer last week, Microsoft CEO Satya Nadella amplified the company's signal that ensuring trust and privacy is a top and ongoing priority. Equally critical are Microsoft's longstanding commitments to digital equality and environmental sustainability -- both of which Smith has emphasized.

As Microsoft's general counsel and executive VP for legal affairs, Smith, 56, who joined Microsoft in 1993, has become even more visible lately. This year Smith gave well-received keynotes at the RSA Security Conference in San Francisco back in April and more recently in July at Microsoft's Worldwide Partner Conference in Orlando. Late last month Smith spent nearly an hour meeting with a handful of journalists visiting the Microsoft campus in Redmond, where he emphasized the importance of regaining customer trust more than two years after Edward Snowden's revelation of the NSA's surveillance programs including PRISM.

"I think it's sort of self-evident that trust was shattered two years ago, when the world started to learn what Edward Snowden knew and obviously shared," Smith told us. "Where the cloud is going, where computing is going and just where people are going, trust is an imperative. You don't have to be a lawyer to appreciate what the Supreme Court captured in its decision a few months ago when they said they're putting their lives on the devices.  When you're put your lives on the device, you're putting your lives in the cloud, all of the personal information about you. This is as true for the future of enterprises, as it is for the future of people. So as we often say around here, look people won't use technology they don't trust, and therefore it's just an imperative for us as a company to ensure we put trust in our technology."

That's one reason why Microsoft is challenging a court order insisting that Microsoft turn over e-mails, stored in its Dublin datacenter, belonging to a customer suspected in an alleged drug-related matter (oral arguments in that challenge kicked off last week).

"The facts for this are pretty straightforward, the U.S. government served on us a warrant to get the email contents of a customer who is not in the United States, and the email exists and resides in our datacenter in Ireland," Smith said. "Our proposition has been 'hey come on, for 250 years one principal was clear, search warrants reached to the border, they don't reach across the border, police can't take a search warrant unilaterally and search any other country, they have to go through law enforcement in the other country. There there are treaties that exist for this purpose.'"

Regaining customer trust in wake of the Snowden revelations is critical to Microsoft's success as it transitions into a cloud services company focused on mobility, Smith emphasized in our meeting last month.

"If we want to be a successful kind of services company and we want to sustain people's trust, we really need to have a coherent and proactive strategy for how we're going to do this," Smith said. "We want our customers around the world, to be able to make use of the technology we created. We can't do that unless we have an effective regulatory plan, that then manifests itself in engineering features and specifications, and that's what we've sought to do. Ultimately it's forced us to focus very hard on what it is that we've seen them for, what is it that we want customers to know that we stand for."

That has led to the four cloud metrics Microsoft has defined and I'll paraphrase here:

  • Microsoft will ensure data is protected from any attack.
  • Customers own and control their data.
  • With its legal team, Microsoft will make sure your data is managed in accordance with the law.
  • Commitment to transparency: "You will know what we are doing with your data," Smith said.

Microsoft is also committed to supporting advanced encryption, according to Smith. "Encryption plays a vital role," he said. "It is the most important technology for safeguarding people's privacy and protecting their security. So you will see us and others deploying stronger encryption, across services and devices and identifying new ways to do it on an end-to-end basis."

That may mean architecting services that ensure Microsoft and others can't mine data from offerings such as Cortana, he noted.

Smith has a lot of credibility and has generated goodwill in the industry. By making Smith president, it appears Nadella wants to take that even further.

Posted by Jeffrey Schwartz on 09/14/2015 at 10:57 AM0 comments


2015 Reader's Choice Awards Announced

It should come as little surprise that the largest hardware and software providers surrounding the Microsoft ecosystem received the most votes in this year's annual Reader's Choice Awards (PDF). And once again, Dell Inc. won the most awards, this time in 46 categories.

Dell's edge of course comes from the numerous software, security and hardware companies it has acquired in recent years. So who's nipping at Dell's heels? The strongest gainers this year were Hewlett Packard Co. and VMware Inc., while consistently strong performers Cisco Systems Inc. and SolarWinds Inc. maintained last year's number of awards. Some strong players in more narrow segments are also coming at Dell -- among them Kaseya, ManageEngine, Netwrix Corp., Goverlan, Acronis, Veeam Software, LANDesk, Flexera Software LLC, NetIQ Corp., Barracuda Networks Inc., Kaspersky Lab, Webtrends, IBM Corp., EMC Corp. (including its RSA Security division), NetApp, AppDynamics, Proofpoint and Lenovo Group Ltd.

These awards are decided by readers, not Redmond magazine, and are based on responses to our survey, fielded from late June to early July to our circulation database of 100,000 readers, of which 1,056 participated. We include all known vendors in each category except for Microsoft (with a few exceptions) to get a sense of what "third-party" software, hardware and services were most favored for working with and in support of Microsoft's core platforms. The exceptions were in the browser, public cloud and tablet categories, where we wanted to see how Internet Explorer, Azure and Surface stacked up these days.

The entire list of categories and winners is available for download.

Posted by Jeffrey Schwartz on 09/11/2015 at 12:03 PM0 comments


Is Apple's New iPad Pro a Threat to Microsoft?

Apple's long-rumored iPad Pro surfaced (pardon the pun) this week and with it are comparisons to Microsoft's Surface Pro. Technically speaking, comparing the iPad Pro to a Surface Pro or any Windows Pro-based system is an apples-to-oranges comparison. One is a tablet and the other is a computer.

Yet functionally speaking, for some it may be possible, even desirable, to ditch a Windows PC -- at least when out of the office -- in favor of the new iPad Pro. The question is to what extent that will happen, and whether some users will even replace their Windows machines with Apple's new ultra-tablet.

The new iPad Pro, launched Wednesday and available in November, comes with a much higher price tag than traditional iPads. It starts at $799, compared to $499 for the 9.7-inch iPad Air or $399 for an older iPad. But the least expensive iPad Pro only has 32GB of storage. If that's not enough storage for your liking, the next option with 128GB costs $949. If you want a version with built-in cellular, you must, at least for now, go with the 128GB version, which costs $1,079. The optional keyboard is $169 and the new electronic stylus, called the Apple Pencil, is $100.

For sure, the new iPad Pro appears to be an impressive device. It's 6.9 mm thin, weighs 1.57 pounds and boasts Apple's high-definition Retina display, able to render 4K video. Apple said that with its new third-generation A9X processor, the iPad Pro is 1.8 times more powerful than the iPad Air 2. If you're already an iOS and Office 365 user, then carrying around an iPad Pro for basic work and play is an appealing proposition, especially if you don't need the functions and speed of a PC.

At the same time, the iPad Pros are priced closer to convertible PCs than tablets, and those who need or desire the power of a computer might not be able to justify one. That's especially the case for those who can get the tablet functionality they want out of the larger iPhones now available.

The good news for Microsoft is even if business users end up flocking to the new iPad Pro, it'll likely drive Office 365 consumption. What's your take? Is the new iPad Pro a threat to Windows?

Posted by Jeffrey Schwartz on 09/11/2015 at 12:04 PM0 comments


BlackBerry Makes Buy In Hopes of Staying Competitive in Crowded MDM Market

Things weren't going so well for mobile device management vendor Good Technology, which late Friday agreed to be acquired by rival BlackBerry for $425 million. While BlackBerry may be best known for its rapidly fading mobile phone business, the company also has a formidable mobile device management platform, deemed by many federal government agencies to be the most secure. But both the BlackBerry Enterprise Server and Good's MDM suite continue to face pressure from larger rivals including Citrix, Microsoft, VMware's AirWatch unit and IBM, among others. With Good off the board, MobileIron is the last of the large independent mobile device management suppliers.

It appeared Good Technology was flying high last year. The company held a large customer and developer event (with 5,000 in attendance) in New York in June 2014, at a time when the company had filed for an initial public offering. Good Technology CEO Christy Wyatt said in an interview at the time that she wasn't concerned about being squeezed out by larger vendors, pointing to IBM, which acquired FiberLink, supplier of the MaaS360 suite; VMware, with its $1.54 billion acquisition of market leader AirWatch (the company's largest deal ever); and Microsoft, with its move into the market with the Enterprise Mobility Suite.

"We have a very strong relationship with Microsoft, but as with any one of these providers, there's going to be parts of the company that want to do MDM and there will be parts of the company that want to partner with Good," Wyatt said at the time. Analysts last summer had valued both Good Technology and MobileIron at $1 billion each, though at least Good apparently was never able to cash in at that level.

The July 2014 cover story of Redmond magazine raised the question: can Microsoft disrupt the mobile device management business with its new EMS, which it had just announced at the time? With much fanfare, Microsoft Corporate VP Brad Anderson argued (and continues to do so), that organizations won't need third-party MDM suites if they use EMS, which includes Azure Active Directory, Intune and Azure Rights Management. "It all begins with identity," Anderson said at the time of the launch, pointing to the move to bring Active Directory to Azure. It's apparently a message that rival VMware believes as well as it launched a standalone version of its VMware Identity Management at last week's VMworld conference.

Perhaps the big players are stacking the deck against smaller ones such as Good and BlackBerry. Both are hoping by joining forces they will become a more formidable competitor. "By acquiring Good, BlackBerry will better solve one of the biggest struggles for CIOs today, especially those in regulated industries: securely managing devices across any platform," said BlackBerry executive chairman and CEO John Chen. "By providing even stronger cross-platform capabilities our customers will not have to compromise on their choice of operating systems, deployment models or any level of privacy and security. Like BlackBerry, Good has a very strong presence in enterprises and governments around the world and, with this transaction, BlackBerry will enhance its sales and distribution capabilities and further grow its enterprise software revenue stream."

Good has 6,200 customers and claims it has half of the companies in the Fortune 100 including all of the top commercial banks, aerospace and defense firms and key players in other large industries such as healthcare, manufacturing and retail, BlackBerry said in its announcement. Yet Wyatt reportedly told analysts last week that Good was not profitable and it was apparently burning through cash. According to a report in The Wall Street Journal, Good only had seven months of operating cash as of last summer.

Meanwhile, look for Citrix, IBM, Microsoft and VMware, among others, to continue to put the squeeze on smaller suppliers of mobility management software. Unless those suppliers can offer capabilities that are truly differentiated, the big boys are soon going to own the MDM market.

Posted by Jeffrey Schwartz on 09/08/2015 at 10:15 AM0 comments


Jeffrey Snover Promoted to Microsoft Technical Fellow

Microsoft has promoted Windows PowerShell inventor Jeffrey Snover to Technical Fellow.

Snover previously was a Microsoft Distinguished Engineer, another esteemed Microsoft title, although his new role is a step up. Snover joins an exclusive group of the company's top engineers. He acknowledged the news on Twitter:

"Bad news: I'll never get another promotion Good news: I've been promoted to Technical Fellow (there's nothing above that)."

Many tweeted in response that it was a promotion long overdue and noted the irony (as did Snover) that the only way to go from here is down -- a direction he actually took voluntarily years ago, when he accepted a demotion so he could further his efforts in advancing Windows PowerShell.

Lately Snover has been promoting Microsoft's DevOps vision, partly enabled by Windows Server 2016, which includes the headless Nano Server. His main role at Microsoft has been to improve Windows Server management with system automation and orchestration across platforms, an effort centered in part on PowerShell and a GUI-less Windows Server 2016. At Microsoft conferences, and even at this past spring's ChefConf, Snover has evangelized Desired State Configuration (a PowerShell push-pull method for keeping system configurations in check) and management APIs that enable cross-platform automation and configuration management using PowerShell.

Snover first started talking up the need for automation 13 years ago with his Monad Manifesto, which laid out the vision for what became Windows PowerShell and called on IT pros to automate their work with scripts. In a recent interview with Redmond, Snover explained why he's making that same case to developers as part of the DevOps vision.

"It's a logical next step," he said. "One of the things I see is that a number of the configuration tasks currently done by operators late in the process are going to move forward and be done by developers as part of the build process."

Posted by Jeffrey Schwartz on 09/02/2015 at 2:18 PM0 comments


VMware Announces Azure AD Identity Manager Alternative

At this week's VMworld, currently going on in San Francisco, VMware announced a new service that looks to go head-to-head with Azure Active Directory (AD) for enterprise single sign-on.

The company introduced a new iteration of its VMware Identity Manager offering, first launched in June, that aims to give it broader reach. Identity Manager was previously available only to premium AirWatch customers on either the Yellow or Blue editions.

VMware describes its new VMware Identity Manager Advanced Edition as a standalone Identity Management-as-a-Service (IMaaS) offering that can support major device types including Windows PCs, Chromebooks and Apple Macs.

"What's interesting about this new standalone edition is that we include still elements of the AirWatch console to make good on this concept of adaptive access," said Kevin Strohmeyer, director of product marketing for Workspace Services and End-User Computing, in an interview at VMworld. "What really differentiates our strategy is by having these device-specific adapters, the ability to register a mobile device or even a Windows 10 device that allows us to basically have customized authentication flows that are specific for that operating system."

Strohmeyer said he believes VMware Identity Manager handles federation and management of user identities better than Azure AD and is easier to bridge to legacy and Software-as-a-Service-based applications. In addition to addressing the problem of federated identity management, Strohmeyer said the new VMware offering lets administrators manage security groups.

The challenge for VMware, however, is that the market for federated identity management tools is crowded, and the company is considered new to the arms race. "A lot of customers have integration to Active Directory as the primary source of their identity management," said IDC analyst Al Gillen. "So for VMware to be trying to drive their own directory strategy outside of that seems a little bit like fighting an old battle that's already won. But they seem pretty committed to it."

Indeed, in a VMware press conference, CEO Pat Gelsinger claimed that customers have been pushing the company to add IMaaS to its offerings. "We're getting such good response from the industry and from our customers in making that a standard part of our suite," Gelsinger said. "We are very optimistic about the potential for that as yet another element of what we're presenting to our customers."

Posted by Jeffrey Schwartz on 09/02/2015 at 11:56 AM0 comments


VMware Offers SQL Server Online in vCloud Air

VMware this week launched a new database as a service for organizations looking to move their SQL Server applications online without having to modify them. The new vCloud Air SQL cloud database, announced at this week's VMworld conference in San Francisco, is available for testing through VMware's early access program and is planned for general availability by year's end.

The service will initially use Microsoft's SQL Server database to support various memory, compute and storage configurations, though VMware said it will offer other relational databases in the future. It'll work in hybrid cloud environments, allowing organizations to use the company's public cloud to scale their databases.

"It's the identical SQL Server you might run inside your own datacenter, so that differentiates it from the Azure competitive offering [now called Azure Database], which is not the same," said Matthew Lodge, VMware's VP of cloud services and product marketing, in an interview at VMworld. "For a lot of our customers, that compatibility really matters to them. Database migrations are hard. Companies don't want to change the technology unless it's a port to a new platform, and they are probably doing that for a different reason."

The company sees two use cases for the forthcoming vCloud Air SQL service: "Organizations can accelerate time-to-market using vCloud Air SQL to rapidly provision database instances in the cloud for development and testing and then run applications in production, either on VMware vCloud Air, or back on-premises in a 100 percent compatible environment," said Michael Cincinatus, VMware's senior director of product marketing for cloud services, in a blog post. "Additionally, organizations can extend on-premises applications with next generation mobile or Web-based cloud native applications running in VMware vCloud Air, using VMware vCloud Air SQL."

VMware is offering its early access testers up to $300 in service credits to trial the new database service.


Posted by Jeffrey Schwartz on 09/02/2015 at 9:47 AM0 comments


Microsoft Talks Mobility Management on VMworld Keynote Stage

As competition in the IT industry has brought together strange bedfellows lately, it appears Microsoft and VMware are the latest to publicly share their love-hate relationship. Of course it's to mutual benefit. In the keynote session on the second day of VMworld 2015, taking place this week in San Francisco, Windows Enterprise Executive Jim Alkove became the first Microsoft executive to appear on stage at VMware's annual confab.

Sanjay Poonen, general manager and executive vice president for end-user computing at VMware, called Alkove on stage during his keynote presentation, where both execs explained how the two companies worked together to ensure tight integration between Windows 10 and VMware's AirWatch enterprise mobility management platform.

"We love Windows 10 because you've opened up Windows 10 for an enterprise mobile management player like AirWatch," Poonen said to Alkove. "It's unprecedented."  In response, Alkove said: "In order to address modern security threats, it's critical [that] hardware and software be designed to work together in tight partnership. With Windows 10 we're bringing enterprise mobility management to the entire family of Windows devices and we are simplifying deployment to put an end to the days of wipe and reload."

The unique aspect of this brief public display of mutual admiration comes as Microsoft is fiercely aiming to take on VMware AirWatch, which is one of the leading enterprise mobile management platforms (VMware acquired it in 2014). Microsoft corporate VP Brad Anderson has said on numerous occasions that with Redmond's own new Enterprise Mobility Suite, organizations don't require a third-party EMM suite.

"You would think of Microsoft as being low in the ability to execute," said Ben Goodman, product manager for VMware Identity Manager, in an interview at VMworld. "They are a big company." At the same time Goodman lauded Microsoft for its commitment to ensuring compatibility between Windows 10 and AirWatch. "To Microsoft's credit, they've been great in terms of a development partner," he said. "The question is, do you believe [VMware] AirWatch can manage desktops or do you believe Microsoft can manage mobile? We're both kind of new to both spaces."

Looking to demonstrate it can leapfrog others in mobility management, VMware revealed Project A2, which ties together AirWatch and App Volumes, the tool introduced at last year's VMworld that can deliver hundreds of virtual apps. Project A2, which will be made available for technical preview and released next year, will enable the management of virtual and physical apps on desktops.

Also on the end-user computing side, VMware announced Horizon 6.2 and Horizon 6.2 for Linux, which the company said will provide richer user experiences, support for Microsoft's Skype for Business and Nvidia's GRID vGPU (virtual graphics processing unit), improved VMware Virtual SAN storage optimizations, support for biometric fingerprint authentication and FIPS 140-2 compliance for those with federal government governance requirements.

Updated Sept. 4: An earlier version of this blog erroneously gave attribution to VMware's Kevin Strohmeyer. The correct executive was Ben Goodman, though both were interviewed in separate meetings at VMworld.

Posted by Jeffrey Schwartz on 09/01/2015 at 1:29 PM0 comments


Microsoft Says 1.5 Million Enterprise Users Have Deployed Windows 10

While Microsoft this week said 75 million users have upgraded to Windows 10, an additional stat IT pros may find noteworthy is that 1.5 million of those upgrades were Enterprise edition licenses deployed by organizations.

Members of the Windows team meeting with journalists on the Redmond campus today revealed the figure to demonstrate the rapid adoption of Windows 10, which was released just a month ago. While the number pales in comparison to consumer upgrades of the Windows 10 Home and Pro editions, the pace of enterprise deployments just four weeks after the release is "unprecedented," said Stella Chernyak, a senior director for Windows Commercial at Microsoft.

Chernyak didn't say how many companies the 1.5 million licenses represent, but said they include some of Microsoft's largest global customers. "Some of them are rolling out hundreds of machines in some very large pilots," she said. As with all Windows upgrades over the past few decades, most organizations tend to wait up to a year before performing large rollouts of a new version, a trend that appears to be holding despite the large pilots.

The officials noted that the company anticipates an even larger uptake of Windows 10 once a whole new crop of devices rolls out later this year and new features are added through the Windows as a Service updates. Added integration with offerings such as Azure Rights Management is currently in the works. Enterprises are also quite enamored with the new biometric authentication feature called Windows Hello, which is aimed at replacing passwords.

Posted by Jeffrey Schwartz on 08/28/2015 at 6:12 PM0 comments


Machine Learning Is Key To Improving Accuracy of Next Microsoft Band

When Microsoft came out with its Fitbit-like band last year, it introduced some interesting new capabilities to the crowded market for such gadgets. But one of the reasons I returned the first iteration of the Microsoft Band after using it for a month was that it didn't appear to render precise and consistent heart rate data. I have since learned I wasn't the only one to reach that conclusion, and I decided to wait and see what the next version offers before spending $200. In all fairness, not all share that view.

It appears the company is readying the next version for the upcoming fourth-quarter holiday season, according to published reports. In a briefing at the Microsoft Research center on the Redmond campus yesterday, Corporate VP Peter Lee, who oversees new experiences at the lab (NExT), indicated that the next Microsoft Band will render more accurate heart rates thanks to improved sensors, though he didn't get into the timing of the release. These improvements aren't coming in the form of better hardware but rather from major advances in the software developed by Microsoft, he explained.

These same advances apply to other work, such as Microsoft's Bing platform, all of which are emerging as major assets in Microsoft's efforts to advance machine learning. Lee offered the brief discussion of the Microsoft Band as an example of a skunkworks project, called "Jewel," that came out of the research labs' focus on applying machine learning. While machine learning has long been a key focus at Microsoft Research, the company took a step forward this year with the release of Azure ML. The compute and storage enabled by cloud-based machine learning has helped improve the algorithms used to render data such as blood flow, according to Lee.

Lee admitted that the hardware BOM (bill of materials) included in the sensors of the Microsoft Band was limited and indicated that won't change in the next version. "We found the signal was woefully inadequate, especially when under physical stress, the key use-case for the Microsoft Band," he said. "While I am making disparaging remarks, it is on par or better than what you would find in the Apple Watch and other fitness bands."

It'll be interesting to see how the improvements in the software algorithm contribute to the next version of the Microsoft Band both in terms of accuracy and other yet undisclosed features of the new device.

Posted by Jeffrey Schwartz on 08/28/2015 at 12:32 PM0 comments


Server Market Continues Modest Growth Trend

PC sales growth may be on a downward spiral, but expenditures on servers continue to rise, albeit at a single-digit rate. The latest quarterly reports from Gartner and IDC show revenues for servers increased 7.2 percent and 6.1 percent, respectively, in the second quarter of 2015.

While Gartner and IDC have somewhat different methodologies, both show continued demand for servers. That may surprise some cloud computing purists who wonder why anyone would still buy a server. For sure, many of these sales are shifting to cloud providers and MSPs, though many of the largest operators build their own systems.

Most of the $13 billion that Gartner and IDC report was spent on servers in the last three months went to the key players: HP, Dell, IBM, Lenovo and Cisco. It marks the fifth consecutive quarter of year-over-year revenue growth, IDC said.

"The recent growth trend in the server market is confirmation of the larger IT investment taking place, despite dramatic change occurring in system software thanks to open source projects such as Docker and OpenStack," said Al Gillen, IDC's program VP of servers and systems software, in a statement. "While we do anticipate an impact on product mix and potentially on volumes, it is too early in the adoption cycle for these new software products to have a material impact on servers today. In the meantime, the market demonstrated healthy revenue and shipment growth this quarter."

Much of the growth is coming from demand for x86-based hyper-scale systems as well as refreshes of servers among small and medium businesses, likely an outgrowth of Windows Server 2003's end of support, which Microsoft reached in July. Refreshes of IBM mainframes helped growth on the high end, though mid-range systems declined 5.4 percent, according to IDC.

In terms of shipments, HP remains the leader with 21.7 percent of the market, posting 2.5 percent growth while No. 2 Dell at 18 percent saw a slight decline (0.4 percent), according to Gartner. Though a distant No. 3, Lenovo, which recently acquired IBM's x86 server business, saw volume growth of 185.7 percent year over year. Both researchers said Lenovo saw 500-plus percent revenue growth, although obviously both increases were aided by picking up IBM's commodity server line.


Posted by Jeffrey Schwartz on 08/26/2015 at 12:19 PM0 comments


Microsoft's Cloud SQL Database Gets Row Level Security

Microsoft's cloud-based SQL Database now supports row level security (RLS), a feature offered in a number of other databases. RLS lets administrators provide row-level access to data based on a user's identity or role.

The company released the RLS feature in its Azure SQL Database last week. RLS will appeal to organizations looking to restrict access to financial data based on an employee's region and role, to ensure specific tenants of a multitenant app can only access their own rows of data, and to allow analysts to query various subsets of data based on their position, according to Tommy Mullaney, Microsoft's program manager for SQL Database.

"RLS enables you to store data for many users in a single database and table, while at the same time restricting row-level access based on a user's identity, role, or execution context," Mullaney said in a blog post. "RLS centralizes access logic within the database itself, which simplifies and reduces the risk of error in your application code."

In his post, Mullaney shared how Grant Dickinson, an architect at SharePoint workflow vendor K2, was able to ensure the company was enforcing security and policies across all database vectors. Before implementing RLS, his team had to use query predicates, but that mode of enforcing security was "onerous and prone to bugs," according to Dickinson.

"Furthermore, the data access layer and business logic are able to evolve independently from the RLS policy logic; this separation of concerns improves code quality," he said. "The developers could use a policy language they were familiar with -- T-SQL  -- and as such we were productive on RLS from day one."

Mullaney said Microsoft plans to add new RLS capabilities through its iterative development and deployment process.

Posted by Jeffrey Schwartz on 08/24/2015 at 1:26 PM0 comments


Surface Pro 4 with Windows Hello May Appear in October

Rumors of new devices, including a Surface Pro 4, and of a major launch event planned by Microsoft for October intensified this week following a number of published reports. Though the chatter comes from unnamed sources, all along we've said it makes sense that the next wave of systems would hit at that time, presuming Intel's next-generation, sixth-generation Core architecture is ready. Even without it, October is the time all the major players roll out their lineups for the critical fourth-quarter holiday buying season.

The buzz about a new Surface Pro 4 picked up earlier in the week when the Chinese site WPDang reported that the new tablet-PC will be joined by two new Lumia phones and a Microsoft Band 2 (as of midday Friday the report was not accessible, suggesting perhaps it was pulled). A subsequent report by The Verge added that Microsoft indeed is planning a launch event. The Surface Pro 4 will be similar to its predecessor, meaning it will support the same peripherals and docking station but it's believed it will have Intel's new RealSense camera and will support the new Windows Hello capability. Windows Hello is the new sensor technology designed to let users replace passwords with facial recognition or fingerprint readers to log into the OS.

It is unclear whether the new device will include the new sixth-generation Intel Core processor, code-named Skylake, but Intel is set to release the chipset in two weeks, according to several reports including this Zacks research note. Skylake, like all new CPUs, is faster and more power-efficient, and will "drive multiple 4K displays, feature novel instructions to accelerate security operations, and hardened memory defenses [and] has enhanced Iris Pro integrated graphics which can drive up to three 4K monitors at 60Hz." The site Softpedia published a breakdown of Skylake this week.

At the Intel Developer Forum (IDF) this week in San Francisco, the company announced a broadening of the RealSense camera sensor interface technology for Windows, Android and Mac OS X. Intel also released its RealSense SDK for Windows, which includes a tool for developers to access the sensor-based capabilities from the Unity platform.

Posted by Jeffrey Schwartz on 08/21/2015 at 11:53 AM0 comments


Citrix Releases Its Workspace Cloud Platform

In what could be the most significant new product offering from Citrix in years, the company today said its new cloud-based offering for deploying and managing virtual and mobile devices is now available. The company unveiled the Citrix Workspace Cloud back in May, hailing it as an architecture for the modern digital workplace.

With its control-plane architecture, the company designed the Citrix Workspace Cloud to give IT administrators or third-party managed service providers the ability to securely deliver virtual desktops or applications to users via any public or hybrid cloud offering. At the core of the Citrix Workspace Cloud is Lifecycle Manager, which was built using the engine from ShareFile, the document-sharing platform Citrix acquired back in 2011.

The Lifecycle Manager creates blueprints that ease the migration of earlier versions of XenApp to current releases and provides the ability for IT to deploy them in the new management platform. These blueprints "are effectively groupings of things that you need to do to define whatever workload it is you want to deliver," as Citrix VP and CTO Christian Reilly explained back at Synergy.

Citrix said its Lifecycle Management packages let IT pros design, edit and deploy application and desktop blueprints and, once those are distributed, administrators can manage and monitor the deployed images or apps. The company is offering an entry-level version free of charge to existing customers with maintenance agreements. Citrix is offering various other Lifecycle Management configurations starting at $2.50 per month per user.

Among the deployments it will manage are the Workspace Cloud Virtual Apps and Desktops services, which Citrix said will let IT shops securely deliver Windows and Linux apps, browsers and desktop images to any type of device. Priced at $35 per month, the offering lets IT pros build, design, edit and roll out app and desktop images. It also includes file sharing and synchronization functions. Another option, for $40 a month, is the Integrated Apps and Data Suite, which adds mobile device and app management, as well as productivity tools, on top of the Virtual Apps and Desktops services.

"With Citrix Workspace Cloud, we are opening up virtualization and VDI to a whole new range of customers, for whom it was too complex or too expensive for in the past," said Jesse Lipson, vice president and general manager, Citrix Workflow and Workspace Cloud, in a statement. The move to expand its customer base comes as the company is under pressure by activist investor Elliott Management, which holds 7.5 percent of Citrix's common stock, to grow the company.

Uptake for the Citrix Workspace Cloud by managed service providers and IT organizations is poised to be a critical measure of whether the company can achieve further growth.

Posted by Jeffrey Schwartz on 08/20/2015 at 1:29 PM0 comments


Windows Containers Debut in New Windows Server 2016 Preview

Microsoft today released the third technical preview of Windows Server 2016, giving IT pros and developers their first chance to see the company's new Windows Server Containers, which include support for the open source Docker Engine. The latest Windows Server 2016 technical preview also includes improvements to Active Directory (both AD DS and AD FS), Hyper-V, failover clustering, remote desktop services and file and storage services. Microsoft posted an outline of all the new features introduced in this release. Not included in Windows Server 2016 Technical Preview 3 (TP3 in Microsoft lingo) are the new Azure Stack, announced back in May at Ignite, which will bring Azure functionality to Windows Server, and the Hyper-V Containers, though Microsoft has indicated both will show up in a technical preview later this year.

Windows Server Containers and the yet-to-be-released Hyper-V Containers, together with the Docker Engine, will introduce a new, faster and more scalable way for organizations to build and deploy applications in on-premises and cloud environments. Through last year's partnership with Docker, the two companies worked to build Windows Server Containers into the server OS and to ensure applications built for them are interoperable via new APIs that are compatible with Linux containers, from both a deployment and an orchestration standpoint.

"This is a big step on a journey we started a while ago," said Microsoft Azure CTO Mark Russinovich, in an interview this week. "This TP3 release is the first time we're making publicly available in preview form Windows Server Containers with complete Docker tool chain ported to Windows as well as integration of container deployment and management through Visual Studio."

Docker Senior Engineering Manager Arnaud Porterie noted in a blog post today that "the Docker Engine for Windows Server port is not a fork, nor a different project: it's the same open source code base being built for Linux and Windows." Porterie also emphasized that "the Docker daemon for Windows Server doesn't run Linux images! No virtualization is involved. The Windows Server Containers reuse the host kernel and create a sandboxed environment for the process, exactly like it does on Linux."

From the perspective of developing applications for the Azure public cloud and the on-premises Azure Stack, Windows Server Containers are the most important new capability in Windows Server 2016 TP3. Yet the technology matters just as much to IT pros, Russinovich explained, because containerization is central to DevOps and to giving organizations more business agility.

"If you take a look now at any enterprise, they've got to get applications out faster and they've got to iterate on them faster," he explained. "The business motivation for an agile development workflow is what's driving a lot of enterprises [to] the top-level business requirement that you're seeing [from] what started as a grass-roots-driven wave by developers themselves. They're looking for a faster way to iterate as they develop their applications. One of the value propositions Docker likes to tout is a developer can debug and test their container on their own development laptop, and then with high confidence know that that tested application is going to deploy in the same exact way to a production server. And if they're iterating, for example, on their development laptop, they can do that very quickly because the containers deployed so quickly."

Using containers to let DevOps organizations iteratively build scalable applications for modern cloud environments based on microservices is still a new concept, but many Fortune 50 companies are either piloting or running small deployments of applications built on these new architectures. Because each container is small and isolated, they should help create applications that are less monolithic and more secure. While the technology is still emerging, Russinovich believes the use of microservices will drive the next wave of software development, virtualization and IT operations.

"What  you're seeing is the power of this isolation and agility fit nicely with an application level that's decomposed, which also goes back to the business driver of agility," Russinovich said. "If you've got a complex application and many enterprise applications, and even CSV type applications or ISV type applications are complex, [they] consist of many subsystems. If you break down that monolithic application into constituent components, containerize them, and then have the whole thing managed by a microservice application platform you get the benefits of containerization, the agility of deployment, you've got things like rolling update, you get the independence of updates so different teams can work on different parts."

Enterprise Strategy Group Senior Analyst Mark Bowker says he's seeing growing interest among large IT organizations enamored of the agility of sites such as Amazon and Facebook and of those who offer modern apps, though he warns it's still very early days. "I think you're seeing the operating system vendors essentially react and design, and ultimately look at where modern applications are headed, and ultimately making a more efficient operating system to run those types of workloads. It doesn't necessarily have all the full-blown features because a lot of those features are actually written into the application itself," Bowker said.

Many IT pros still don't understand containerization. Russinovich gave a far more extensive explanation of Microsoft's view of what the future of containers holds in a blog post published earlier this week. It's worth reading to understand Microsoft's container vision and what it means for the future of Windows.

UPDATE: An earlier version of this post incorrectly stated that the technical preview included Hyper-V Containers and has been updated to reflect that they'll actually come in a subsequent release.

Posted by Jeffrey Schwartz on 08/19/2015 at 9:50 AM0 comments


Microsoft's Latest OneDrive Update Doesn't Include Popular Placeholders Feature

Looking to make OneDrive the cloud storage service of choice, Microsoft last week added improved synchronization, search and support for the new Apple Watch. The OneDrive upgrade now includes the ability to synchronize shared folders from the desktop app and visibility into when someone edits a file, and Microsoft says it's now easier to search for documents. While customers welcomed the improvements, many lamented the lack of a once-popular feature called Placeholders, also known as Smart Files, which Microsoft removed in January.

Microsoft first introduced Placeholders in OneDrive with Windows 8.1 but subsequently removed the feature. Placeholders allowed users to see files that are online in addition to local documents. Users complained about the missing feature in the comments section of the blog post by OneDrive Group Program Manager Jason Moore last week announcing the upgrade, imploring Microsoft to bring Placeholders back.

"I switched from Dropbox to OneDrive because of the placeholders," wrote Mark Newton. "Now, OneDrive is just Microsoft's Dropbox." Tom H. added: "This is great folks but there are thousands clamoring for placeholders, especially those of us who bought your Surface tablets. I would have hoped you would have brought placeholders before this other stuff."

Many complained that they bought the Surface and Surface Pro specifically because of Placeholders. "The elimination of placeholders has made my 'tablet that can replace my laptop' just a tablet," wrote Jeremy. "OneDrive was the only thing allowing your own devices to do what you advertised. Thanks for selling me a thousand dollar tablet. Downgrading now."

Microsoft's Moore did weigh in last week, welcoming the feedback. "Folks -- I definitely recommend checking out our UserVoice and leaving feedback there -- write up the features you want and how you'd like us to go about them! We love seeing the passion," he said. One poster, who didn't leave a name, called that lip service. "You already know that placeholders back on Uservoice [Microsoft's customer feedback platform] have more than 13,000 voices, but it's easier to say, 'give your feedback, but we will do what we like to do.'"

Not everyone commenting on Moore's post wants Microsoft to bring Placeholders back. "The placeholders were a nightmare for me. Just keep adding reliable functionality," wrote Rusty Gates. But those commenting who want Microsoft to bring Placeholders back far outnumber those who could live without it.

The obvious compromise would be to offer both options and allow customers to choose either configuration.

In the meantime, MVP Kent Chen explains in a blog post on Next of Windows how to map OneDrive files to a network drive, or alternatively suggests considering third-party apps such as Odrive.
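
One widely shared workaround of this sort (not necessarily the exact steps in Chen's post) maps the consumer OneDrive WebDAV endpoint as a Windows network drive. Below is a rough Python sketch that simply shells out to the built-in net use command; the drive letter and the <CID> placeholder are assumptions you would replace with your own values.

```python
# Rough sketch only: map consumer OneDrive as a network drive on Windows via
# its WebDAV endpoint, by shelling out to the built-in `net use` command.
# Replace <CID> with your own OneDrive customer ID (visible in the OneDrive
# web URL); the Z: drive letter is arbitrary.
import subprocess


def map_onedrive(cid: str, drive_letter: str = "Z:") -> None:
    webdav_url = f"https://d.docs.live.net/{cid}"
    # /persistent:yes re-creates the mapping at each sign-in; Windows will
    # prompt for Microsoft account credentials the first time.
    subprocess.run(
        ["net", "use", drive_letter, webdav_url, "/persistent:yes"],
        check=True,
    )


if __name__ == "__main__":
    map_onedrive("<CID>")  # placeholder, not a real customer ID
```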

Posted by Jeffrey Schwartz on 08/17/2015 at 1:24 PM0 comments


Microsoft Is Working with the Canonical Linux Container Hypervisor Project

Ben Armstrong, Microsoft's Hyper-V and virtualization guru, will outline the company's participation in two open source projects: the Canonical-backed LXD Linux container hypervisor project and OpenStack. Armstrong revealed that he'd be speaking at ContainerCon 2015, a Linux Foundation event, and at OpenStack Day. Both will be taking place on Microsoft's home turf of Seattle.

As reported Wednesday by open source expert Steven J. Vaughan-Nichols, Microsoft and Canonical, the company that distributes Ubuntu Linux, are working together on LXD, a project to develop a Linux container hypervisor. Canonical unveiled LXD last year as a new stratum on top of LXC (Linux Containers) that brings the advantages of a traditional hypervisor to the faster, more efficient world of containers, as Dustin Kirkland, a member of Ubuntu's product and strategy team, described it in a blog post yesterday. "Hosts running LXD are handily federated into clusters of container hypervisors, and can work as Nova Compute nodes in OpenStack, for example, delivering Infrastructure-as-a-Service cloud technology at lower costs and greater speeds."

Microsoft, of course, has openly embraced Linux containers through its collaboration with Docker, where the two are among the companies working on the Open Container Initiative, so the collaboration with Canonical on LXD appears to be a natural evolution of that work. In a quote shared in Kirkland's post, Armstrong, whose actual title is principal program manager lead for Microsoft's core virtualization and container technologies, said that "Canonical's LXD project is providing a new way for people to look at and interact with container technologies. Utilizing 'system containers' to bring the advantages of container technology to the core of your cloud infrastructure is a great concept. We are looking forward to seeing the results of our engagement with Canonical in this space."
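
For readers who haven't seen LXD's "system container" idea in practice, the short sketch below drives the lxc client that ships with LXD from Python. It is purely illustrative and not taken from Canonical's or Microsoft's posts; the image alias and container name are assumptions.

```python
# Illustrative only: launch a full Ubuntu userspace as an LXD "system
# container" that behaves much like a lightweight VM, using the `lxc` client.
# Image alias and container name are hypothetical.
import subprocess


def run(cmd: list) -> None:
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    run(["lxc", "launch", "ubuntu:22.04", "demo-host"])      # create and start
    run(["lxc", "exec", "demo-host", "--", "uname", "-a"])   # run a command inside
    run(["lxc", "delete", "--force", "demo-host"])           # tear down
```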

Meanwhile, Armstrong noted in his own MSDN blog post that he'll be speaking at OpenStack Day Seattle 2015 later in the week, which is taking place in conjunction with ContainerCon. While Microsoft has been a relatively quiet player in OpenStack, Armstrong pointed to the company's contributions, including Nova and Open vSwitch drivers for Hyper-V, connections to Active Directory and Cinder drivers for Windows iSCSI.


Posted by Jeffrey Schwartz on 08/13/2015 at 11:16 AM0 comments


Investors To Acquire Symantec's Data Protection Business for $8 Billion

Long wishing to exit the backup and recovery and high-availability business to focus on IT security, Symantec today said it has agreed to sell its Veritas business to private equity firm The Carlyle Group and Singapore sovereign wealth fund GIC for $8 billion in cash. Carlyle has tapped Bill Coleman, the founder and former CEO of BEA Systems, as Veritas chief executive, while former 3Com chairman and CEO Bill Krause was named chairman.

The moves will end a decade in which Symantec and Veritas never seemed to find the symmetry they were seeking when Symantec acquired the server and storage management and data protection software provider in 2005 for $13.5 billion. Symantec last year said it was creating two separate businesses, with the data management business retaking the Veritas name. The transaction is expected to close by the end of the year.

Once the transaction is complete, products such as Backup Exec, NetBackup and Cluster Server will no longer carry the Symantec name. As Veritas reenters the data protection market it will find numerous new competitors who have already spent years going after the business run by Symantec, among them Acronis, ArcServe, Asigra, CommVault, Dell, EMC, IBM, NetApp, Veeam, VMware, Unitrends, Vision Solutions and Zerto.

It also remains to be seen what Veritas' new owners have in store for the business, such as an eventual sale or IPO. For its part, Symantec said it plans to use the proceeds to shore up its security business.

Posted by Jeffrey Schwartz on 08/11/2015 at 2:02 PM0 comments


Satya Nadella Congratulates Google's New CEO Sundar Pichai

Google has a new CEO and that's significant news for the company, its competitors, partners and those who use its wide array of offerings -- consumers and businesses alike. So it's hardly surprising that Microsoft CEO Satya Nadella was among those who reached out yesterday via Twitter to Sundar Pichai, who was unexpectedly named to run Google as part of the largest restructuring in the company's history: "Congrats @sundarpichai well deserved!"

Through that massive company reorg, Google becomes the largest of several subsidiaries that will fall under a new corporate holding unit called Alphabet, which will be led by former Google CEO Larry Page and President Sergey Brin, both of whom are also cofounders. The surprising development appears to be aimed at allowing Page and Brin to focus on new and emerging businesses and creating a new financial reporting structure.

Page described Pichai as a natural choice to lead Google. Pichai, who most recently headed all product development and engineering for the company, is responsible for the development of the Chrome browser and had recently headed up the Android mobile division. "I know Sundar will always be focused on innovation -- continuing to stretch boundaries," Page said in yesterday's announcement. "I know he deeply cares that we can continue to make big strides on our core mission to organize the world's information."

Nadella and Pichai share a common heritage, as both hail from India, and Pichai's name was among dozens of outsiders rumored for the Microsoft CEO job when the company was searching for a replacement for Steve Ballmer.

When Page named Pichai to replace Andy Rubin as head of the Android business two years ago, he described him in a blog post as someone who has a "talent for creating products that are technically excellent yet easy to use -- and he loves a big bet," as pointed out by The New York Times.

Nadella and Pichai share something else in common: each is only the third CEO of his respective company, though Microsoft is considerably older than Google. Pichai's challenge is to lead Google so it can age gracefully -- and with fewer bumps in the road than Microsoft, Apple and others have traveled.

Posted by Jeffrey Schwartz on 08/11/2015 at 11:28 AM0 comments


Google To Restructure as New Company Called Alphabet

Google today said it will create a new publicly traded company called Alphabet, which will serve as the parent for its separate business units including Google itself. Larry Page, Google's cofounder and CEO, will lead Alphabet with Cofounder Sergey Brin as president.

The move is clearly the largest restructuring in the company's history and a major change in organizational makeup for any company its size. The creation of Alphabet aims to separate Google's core search and cloud business from other groups such as the company's investment arms, its Calico life sciences business and the X lab, the incubator for new technologies such as drones, Page said in a blog post announcing the planned move.

"Our company is operating well today, but we think we can make it cleaner and more accountable," Page said. "Alphabet is about businesses prospering through strong leaders and independence. In general, our model is to have a strong CEO who runs each business, with Sergey and me in service to them as needed. We will rigorously handle capital allocation and work to make sure each business is executing well. We'll also make sure we have a great CEO for each business, and we'll determine their compensation. In addition, with this new structure we plan to implement segment reporting for our Q4 results, where Google financials will be provided separately than those for the rest of Alphabet businesses as a whole."

Details of the new organization are still unfolding, but it appears Google is looking to provide a new reporting structure to make its business more attractive to investors. Page said that by leading the parent company, he and Brin can focus more on emerging businesses while turning Google over to Sundar Pichai, who will take over as CEO. Pichai is currently Page's top lieutenant at Google.

Ruth Porat, who recently took over as Google's CFO, will assume that role at Alphabet. The name of the parent company has raised some eyebrows. "We liked the name Alphabet because it means a collection of letters that represent language, one of humanity's most important innovations, and is the core of how we index with Google search," Page said.

Page emphasized the goal isn't to establish Alphabet as a consumer brand. "The whole point is that Alphabet companies should have independence and develop their own brands."

Posted by Jeffrey Schwartz on 08/10/2015 at 3:59 PM0 comments


IBM To Swap 200,000 Employee PCs with Macs

IBM's partnership with Apple has taken a new twist as Big Blue plans to deploy up to 200,000 Macs. That could equate to more than half of IBM's workforce.

In an internal corporate video published by MacRumors, IBM CIO Jeff Smith revealed the company's plans to roll out up to 50,000 MacBooks to employees to replace their existing Lenovo ThinkPads. Of course, it was IBM that developed the ThinkPad before selling its PC business to Lenovo a decade ago. Nevertheless, ThinkPads remained the client device of choice at IBM. In a separate video clip, Smith recalled a conversation in which IBM Vice President Fletcher Previn told Apple CEO Tim Cook that one day 50 percent to 75 percent of IBM employees could have Macs.

Apple and IBM formed a partnership last year under which IBM is developing industry-specific mobile apps for iOS and offering services to help deploy them. The apparent leak was clearly a precursor to last week's announcement from IBM, in which it said it was offering new cloud-based services to help large enterprises deploy and integrate Macs within their IT infrastructures. In the announcement, IBM said Mac deployments in enterprises are on the rise.

The services let IT managers order Macs and have them delivered directly to employees with the system image installed, without requiring setup or configuration. IBM partnered with JAMF Software, whose Casper Suite enables IT to create and deploy the system images.

IBM's decision is somewhat ironic considering it delivered the first enterprise PC back in 1981, though the company has no vested interest in the fate of Windows PCs. It remains to be seen how aggressive IBM intends to be in bringing Macs to more businesses as Microsoft looks to convince customers to upgrade to Windows 10. Perhaps IBM is finally getting even with Microsoft for leaving it holding the bag with OS/2. Is a new battle brewing?

Posted by Jeffrey Schwartz on 08/10/2015 at 12:38 PM0 comments


Will Microsoft's Planned Badges Someday Replace MCSE and MCSA Certifications?

The revamp of Microsoft's certification process is under way, and the newest offering will come in the form of skills-specific badges for those who don't want, need or see the value in MCSE or MCSA certifications. For those who do want to pursue MCSE and MCSA certifications, Microsoft is rolling out major improvements to the testing process, including performance-based exams and the ability to take them anywhere using the company's new online proctoring capability.

Liberty Munson, Microsoft's Principal Psychometrician, outlined the new certification initiatives in a fireside chat earlier this week at the TechMentor conference, taking place on the company's Redmond campus. TechMentor, like Redmond, is produced by 1105 Media. The fireside chat was moderated by Greg Shields, conference cochair, Redmond columnist and author-evangelist at IT training company Pluralsight.

"We're continuing to evolve the program, and as a result we're going to make some changes where we start badging skills," Munson said. "So we're going to really focus on learning paths, where you pick the skills that you want to go and learn. Some of these courses you will be seeing you can take specific skills and you can get badges and those badges will show up on your transcript and you can show people that you're skilled in certain areas."

The badging training curriculum is likely to roll out in January, according to Munson, and the initial focus will be on Windows Server. Asked if these badges will one day diminish the requirement or desire to seek MCSE or MCSA certifications, Munson told me after the keynote session that it's addressing a generational shift. "The millennials are really about bite-sized chunks, and so we're trying to address that need," she said. "But as a result, certification becomes more difficult of a sell to them because of their learning mind set."

Elaborating on that point, she added: "I think you are going to see fewer people get certifications in the future. If we look a decade from now, two decades from now, certification is going to be something -- I'm going to call it like self-service -- where somebody goes in and they pick the skills that they want to be certified. They design... their own exams and their own certification. I think what we need to do to appeal to that younger generation is give them the flexibility to choose what they want to be measured on so certification takes on a whole different meaning in that potential future. And quite honestly, if certification is going to survive, I think it has to do something like this, it needs to break this mold. Otherwise the millennials are not going to buy into it."

Shields agreed with her, telling me that the notion of badging is a novel approach that many hiring managers and candidates alike may prefer to broader, more extensive certifications that don't necessarily prove specific skills. "As a hiring manager, the ability for someone to look at the badge or to see that someone has certified or passed an assessment in a very particular technology is something that very directly will help him as he goes through finding the right people for the right jobs," Shields said.

At the same time, Microsoft is rolling out a number of significant new capabilities for those who still see value in the more traditional MCSE and MCSA certifications, Munson emphasized. The new performance-based testing, which is also slated to roll out in January, will represent Microsoft's second attempt to offer exams that go beyond traditional multiple-choice questions, enabling candidates to showcase their actual skills. Microsoft's first stab at performance-based testing years ago was a bust, primarily because the underlying infrastructure supporting it wasn't reliable. Given the rollout of Azure, Munson says she's confident that issue is now resolved.

"We think we cracked that nut with Azure in the cloud and really leveraging some of the big things that Microsoft is really focused on right now," she said. "Right now we're in the proof-of-concept phase with our performance-based testing. We probably will start with Windows Server and you'll start seeing elements where performance-based requirements are part of the certification process on the Windows Server exams."

Microsoft is already in the midst of rolling out online proctoring, which eliminates the need for people to travel to testing centers. Under the current guidelines, depending on where a candidate lives, he or she might need to travel hundreds of miles to take an exam, whereas online proctoring allows people to take exams at home or any place they can be monitored to ensure they're not cheating, Munson said. "You have to make sure you're going to be in a location that's going to be uninterrupted," she said. People can take the tests at "home, work, wherever, it doesn't matter. You can be in your PJs, but you do have to be dressed, because there is a proctor that is going to be watching you."

Online proctoring is already available in 53 countries, and Munson hopes to have it fully rolled out by the end of the calendar year.

Posted by Jeffrey Schwartz on 08/06/2015 at 11:57 AM0 comments


Major Changes Coming from Microsoft Will Impact Your Career

Windows Server, System Center and every other key Microsoft product are now undergoing fundamental architectural and design changes, and if you don't adapt to them and embrace cloud computing, your career in IT likely will be cut short. Regardless of how much experience you have as an MCSE or MCSA, Pluralsight Curriculum Director Don Jones pointed to key changes coming from Microsoft that will have a major impact on the careers of all IT pros and developers who specialize in any component of the Redmond stack.

The changes in Windows Server 2016, the move toward systems automation, the shift to container-based applications, Microsoft's cloud-first approach and a move to continuous updates will all require IT pros to gain new skills. Speaking at the TechMentor conference on Microsoft's main campus in Redmond, Wash. (the conference, like Redmond, is produced by 1105 Media), Jones issued a stern warning to attendees: keep up on these changes and get educated accordingly "or run calculations on the days you have until retirement."

Jones, a cochair of the TechMentor conference (and a former Redmond columnist), also warned that IT pros who aren't proficient in PowerShell -- or have people on their teams who are -- will face problems. It should be noted that Jones is a longtime proponent of using PowerShell and scripting for automation and is a cofounder and president of PowerShell.org.

"It's PowerShell or bust," Jones said. "That's the future." That's because in order to use the forthcoming Windows Server 2016 Nano Server, administrators will have to rely on PowerShell remoting since the server OS won't support video connections or have a GUI.  "How many of you have administrators [and] server admins who are not comfortable with PowerShell? That's a limiting career decision. You're still going to have GUIs. Even Nano has a GUI. Nano is actually going to offer a Web-based GUI, potentially even in the cloud, that connects to the actual Nano server via PowerShell remoting. They are not going to run on the server."

Jones warned that learning PowerShell is a major undertaking. "The thing is if you're not doing anything with PowerShell already, and I don't want to say this in a bad way, you've kind of missed the boat. PS has gone from a curve to a giant block, and it's a lot to learn, if you can devote enough time you can still learn this tech. But, my god, don't wait any longer."

Perhaps the best route is to learn Desired State Configuration, or DSC, the "forward-evolution" of PowerShell, Jones suggested. "It is literally the most important management service that Microsoft has ever created. And if you're thinking, 'well my company is not sure if we want to use it,' you should consider whether that company deserves your time or not. Or whether you would be safer career wise someplace else. This is a big deal... saying we're not going to use DSC is like saying we're not putting gas in the car but we'd still like to drive it."
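
To give a flavor of what Jones is pointing at, here is a minimal DSC sketch that declares a desired state (IIS installed) and applies it. The Configuration, Node and WindowsFeature keywords and Start-DscConfiguration are standard DSC; the Python wrapper, the configuration name and the output path are illustrative assumptions, and it would need a Windows Server host with PowerShell to run.

```python
# Minimal Desired State Configuration sketch, driven from Python by shelling
# out to PowerShell on a Windows host. The DSC block declares the end state
# (IIS present); the Local Configuration Manager enforces it. Configuration
# name and output path are illustrative placeholders.
import subprocess

DSC_SCRIPT = r"""
Configuration EnsureWebServer {
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Node 'localhost' {
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
    }
}
EnsureWebServer -OutputPath C:\DscDemo
Start-DscConfiguration -Path C:\DscDemo -Wait -Verbose
"""

if __name__ == "__main__":
    subprocess.run(
        ["powershell.exe", "-NoProfile", "-ExecutionPolicy", "Bypass",
         "-Command", DSC_SCRIPT],
        check=True,
    )
```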

Here are some other changes coming from Redmond to some of its core products that IT pros will need to adapt to, according to Jones:

System Center: Microsoft's systems management platform in 10 years will look nothing like it does today, yet it will remain critical as Microsoft offers fewer tools in the operating system itself. "If you've never used the System Center products before, you need to start becoming familiar with what they can do, particularly System Center Virtual Machine Manager, Configuration Manager and things like Operations Manager," he said. "Pieces of it will move to the cloud, pieces of it will live on premises and pieces of it will change completely."

Hybrid Cloud: Jones said those who ignore the rise of cloud computing architectures will be making fatal career mistakes. Take Windows Server 2016 and beyond: the new operating system release, with the Azure Stack, shows a shift toward Microsoft upgrading its public cloud infrastructure and rolling those pieces into Windows Server releases. So even your datacenters will evolve into cloud environments, he said. "You ignore this cloud thing at a very high risk to your career," he said. "It is not going to be long before every single corporation of any size has got some workload in the cloud." How should IT pros skill themselves for this shift? "I think you need to be doing something so that you get familiar with incorporating your on-premises services with certain cloud services. I'm not saying you have to migrate all of your stuff to the cloud -- that is not the right answer for most organizations. But you need to start looking at the workloads that are suitable and doing some pilots so you as a person can get familiar with it, whether your company appreciates it or not. How many of you could draw a meaningful picture and walk me through the process of setting up a VPN from on-premises into a private section of Azure? That is a core skill."

Big Data: Though it's one of the most ambiguous and disliked terms in IT, the ability to process massive amounts of data to facilitate the move toward better systems automation will be important, Jones said. "Imagine scraping every single log and performance monitor counter you have and constantly being able to predict what that data means based on past patterns. This is adding a lot of operational intelligence and insights to IT operations and is a big part of automation. Big data for us is where it loops back into automation. We look for trends, patterns and correlations. These are all important things."

Heterogeneity: The dream of working in all-Windows environments is long gone, Jones notes, and specializing in just one operating system is not going to be the accepted norm moving forward. "If you don't know how to do some basic maintenance on a Mac or have never built a Linux device, it's a good hobby to take up because it's going to be a key part of your career," Jones advised. "This is a good way to hedge your bets."

Exchange Server: It's no secret that high on the list of endangered species are Exchange administrators thanks to the rapid push toward Office 365 and hosted versions of the e-mail platform. "Don't bet the rest of your career on a counter argument to something like Exchange is going to go away," he said. "Have a plan B."

Active Directory: The ability to configure and manage Azure Active Directory and AD Connect is critical. Those who earn a living simply by adding users to Active Directory are the equivalent of those who pump gas at full-service stations. "If you have someone in your organization because they add users to Active Directory, that paycheck is in threat," he said.

Posted by Jeffrey Schwartz on 08/05/2015 at 1:31 PM0 comments


Windows 10: Try Before You Buy

Last week's launch of Windows 10 was really about the release of the bits online to those who could get them -- mostly Windows Insiders. PC makers took a backseat because they only recently received the final bits. That's a historic deviation for new releases of Windows, but as the OS moves to a more continuous upgrade cycle, it looks to be a moot point going forward.

Between that and Intel's delay in shipping its Skylake processors, PC makers had little choice but to dial back on the hoopla. Most issued reminders that many of their most recent releases support Windows 10 and that many more systems are coming in August and into the latter part of the year. That's not to say they entirely ignored the launch. There's tons of business to be had by hardware manufacturers following the launch of Windows 10, and Dell, HP, Lenovo, Asus, Acer and Toshiba all pointed to their existing and forthcoming systems.

HP briefed media and analysts last week, outlining which systems are now optimized for Windows 10 and which are in the pipeline. The unstated message was 'try before you buy': take advantage of the free upgrade on your existing system, if eligible, and become familiar with what Windows 10 has to offer.

"We've been working with Microsoft from the very beginning on Windows 10. As a result we had the opportunity to design our entire 2015 portfolio with Windows 10 in mind," said Mike Nash, VP of product management at Hewlett Packard, on the Tuesday call. "Whether you buy a product that comes from the factory with Windows 10 or you have one of our products that came with Windows 8.1 and you upgrade in the field, it's going to deliver a great Windows 10 experience. We're really confident about that."

According to HP's own research, 22 percent of those it surveyed earlier this year said they'll purchase a new device, while 44 percent will upgrade their current system. Given the falloff in PC sales, that's not a surprising finding, and Nash, a former longtime Microsoft exec, is encouraging users to upgrade their existing systems to Windows 10. The implication, of course, is that users will like what it has to offer so much that they'll ultimately want new hardware that takes advantage of its features. And since HP and others are still readying new hardware, the notion of try before you buy will suit them well, according to Nash. "The upgrade becomes a way for you to demo and try out Windows 10," he said.

Indeed, when people ask me if they should buy a new system, I say run Windows 10 on your existing system for now but wait until later this year, as the best is yet to come.

Posted by Jeffrey Schwartz on 08/03/2015 at 11:57 AM0 comments


Windows 10: 14 Million Served on First Day

Some 14 million devices are now running Microsoft's Windows 10 operating system. But many who have reserved their free upgrades will have to wait for days or weeks, the company reiterated last night.

"While we now have more than 14 million devices running Windows 10, we still have many more upgrades to go before we catch up to each of you that reserved your upgrade," wrote Yusuf Mehdi, corporate VP for Microsoft's Windows and Devices Group, in a post last night on the Windows Blog. "Rest assured we are working 24×7 to continue the upgrade process and are prioritizing the quality of your upgrade experience over anything else. We are grateful for your excitement and enthusiasm and we appreciate your patience over the days and weeks ahead as we carefully roll out Windows 10 in phases to all of you that have reserved."

Microsoft's 5 million Windows Insiders who tested the Windows 10 Technical Preview are getting priority, according to the company. While I ran the Technical Preview on one of my machines, I have two others, which I don't use often, that continued running Windows 8. If you don't want to wait, there are many ways you can fast-track the process, as reported earlier this week. One is to go to a Microsoft Store and have them download it for you (it's free). Another avenue is to follow Brien Posey's advice to get to the front of the line.

Posted by Jeffrey Schwartz on 07/31/2015 at 10:51 AM0 comments


Windows 10 Launch Puts Spotlight on Its Retail Stores

The Windows 10 launch Wednesday put Microsoft's 110 retail stores worldwide in the spotlight, as the company decided to celebrate with local charities it supports, such as the Girl Scouts, Habitat for Humanity and the YouthSpark Summer Camp program, among others. While the Microsoft Stores are intended for consumers, partners and IT pros often come in to peruse the stores' wares.

An hour before yesterday's opening of the Microsoft Store at Roosevelt Field, in the New York City suburb of Garden City, store manager Scott Goeke spent a few minutes chatting about the preparations for the Windows 10 launch and some of his expectations. Until recently Goeke managed the Microsoft Store in Santa Clara, but he took the opportunity to return east when an opening came up to manage the Garden City, N.Y., location.

What have you been doing to gear up for the Windows 10 launch?
There's been a ton of excitement from our employees and customers. Our main job is to harness that and make sure we really deliver. We've been working on training our employees for months now, making sure that they have the ability to properly recognize and show off what's going on to our customers and show off all the cool features of Windows 10 and really be the experts.

Have customers been asking questions about what to do about Windows 10 -- should they wait -- and those kinds of questions?
It's not so much about should they wait; they've been coming in ever since we announced it earlier this year. Very rarely are we hearing "should I get it?" It's more "when can I get it," which is really cool for us because that's exactly the kind of customer event we want to have for Windows 10.

How are you dealing with these upgrades?
We have a well-oiled plan as to how we work with each one of these upgrades. Our employees are ready to roll. If they have Vista and they're looking to upgrade, they have the ability to purchase Windows 10. But from our initial stuff getting checked in -- we've been checking in computers for the last few days and preparing them so people can have them today -- it's been all Windows 7 and Windows 8. Our goal here is to make the experience for customers as easy as possible. A lot of people don't know but we offer a ton of free services and the upgrade is a key one of those. We're not looking to charge customers for that. That's a last-case scenario. Most of our customers who are going to be walking in are going to have a great experience.

The online upgrade process has been slow in the early hours of the release.
It's a rollout. We haven't been given specifics and times and places where people are going to get it but anybody can come in here right now. We have the bits for it. We can get it on your computer for free. If you have other computers that you're not sure about or are not running well, we can fix those up for free as well. We are trying to expose our customers to all the cool services that Microsoft specifically has to offer. I think it's fair to say that the Microsoft Store is going to provide the absolute best Windows 10 upgrade experience you can find anywhere.

Are you going to initially try to persuade customers to purchase new Windows 10-based computers?
It's going to be a case-by-case basis. It's certainly not our mission to drive devices exclusively. We're going to assess each computer that comes in on a case-by-case basis. There's a system of rating based on what capabilities a computer has and what operating system it has, if it's full of viruses or malware or things like that. We're going to give them the options that they have.

When customers come in for the free upgrade, are you advising them to reimage the machine with a clean install?
No, it's a bootstrap install. It leaves your old Windows image on there and it holds your information on to your new image.

It appears everyone on line is here to see Abby Wambach. No one I spoke with seemed interested in Windows 10.
We have Abby Wambach, which is awesome, but I'm expecting a great kickoff and I think it will escalate into the weekend and beyond. I know we have a significant number of appointments to get their upgrades.

Do customers ask a lot about using Windows 10 for business?
When we talk to business customers, the common thing I hear is "we're still on Windows 7 and it's time to get something new so I'm sure my IT person is going to be doing this." That's the common theme that we're hearing. They're just kind of under the assumption that this is the answer their company has been waiting for.

Many machines are already able to run Windows 10 but, given the quick release of the bits, it's mostly the new machines that will take advantage of new features like Windows Hello. Are you going to let customers, particularly power users, know that they may want to wait for the newer machines coming out later in August and into October?
Absolutely. If I'm talking about giving the best experience possible, we have to be very realistic with our customers and help them understand that there are certain features of Windows 10 that not all hardware can take advantage of. People need to understand that this is a service that will continue updating, unlike Windows in the past.

When people do buy new systems, what has been the situation regarding the Office 365 upsell? Do most customers do that?
Oh yes. Office 365 has been doing phenomenal for us. When people start to understand really what's in it for them and the OneDrive storage opportunity that comes with Office 365. We have Office 2016 coming out.

Do some balk at the idea of an annual subscription?
Of course that conversation comes up. For us it's helping them understand. And individual licenses are still the right solutions for some people.

What's your favorite new feature in Windows 10?
The Start menu is my thing. I love the fact that it's back, but more importantly that I have my Live Tiles on my Start menu, so I have the best of both worlds.

Do you think those who avoided Windows 8 because they didn't like the idea of Live Tiles will balk when they see that, even though the Start menu is back, the Live Tiles are still there?
You know what? They see it and say "I didn't expect those to be on there." Then they look at it and understand it if you show them what they can do with them. They say it makes sense. So I don't foresee them balking at that. The way it works and how easy it is to move it around and make it personalized, I think people will be perfectly fine with them.

Once you go through that core new interface with them, what's the next feature in Windows 10 you emphasize?
Microsoft Edge. Showing them the new browser and showing them the reading options and the inking options -- which they think is pretty cool.

Have you been playing up Cortana?
That's usually number three for me. I see a lot of my employees after they go to the Start menu that's their number two. It's a great tool to make it more personal. That's what we want people to understand, that it's a personalized experience.

Do you talk about Continuum or are customers not interested in that?
It comes up with certain customers. The ones who are power users are the ones having that conversation or the people with the 2-in-1 experience.

What's your level of confidence as to how many Windows Universal apps will be there?
I'm very, very confident about it. Hearing [CEO] Satya [Nadella] at MGX at our Global Exchange [employee conference] a couple of weeks ago gives me that confidence. I came here from Silicon Valley and over the last couple of years, I've had the opportunity to build some pretty good relationships with customers and understand where they're at in their journey and I see the level of confidence they have which gives me a heck of a lot of confidence as well.

And these are developers who were skeptical about Windows in the past?
They're the whole spectrum but there are plenty who are skeptical that weren't developing for Windows that are developing for iOS and Android. When we started making these announcements and they started seeing what the capabilities are, they were excited about it too. And they're some of the most skeptical people in the world. So if they're excited about it, I have no reason not to be.

Even if Windows Phone doesn't gain critical mass, which appears to be the case at this point, are you confident that Windows as a platform has a strong future?
Oh yes. The size and scope of what we're doing here, this is a huge rollout. This is the best Windows we've ever made. It's a free upgrade, there's no reason it's not going to get there.

How many customers come in asking about the Surface Pro 4?
It's a common conversation. My answer is "I wish I had an idea."

Posted by Jeffrey Schwartz on 07/30/2015 at 10:45 AM0 comments


Microsoft Urges Patience to Those Waiting for Windows 10 Download

Did you reserve a free upgrade for Windows 10 and still can't get it even though it's supposed to be available today? Apparently you're not alone.

Microsoft's forum page for Windows 10 upgrades currently has a note stating that those who reserved a free upgrade, but don't have it yet, should "watch for your notification in the coming weeks."

Microsoft said Windows Insiders -- those who were testing the Technical Preview -- get first dibs. A Microsoft spokeswoman sent the following explanation:

"With millions of reservations and Windows Insiders to serve, we want to make sure everyone has a great upgrade experience, so we're rolling-out Windows 10 in phases to help manage the demand. We are rolling out Windows 10 to our Windows Insiders. From there, we began notifying reserved systems in waves, slowly scaling up. If you reserved your upgrade of Windows 10, we will notify you once our compatibility work confirms you will have a great experience, and Windows 10 has been downloaded on your system."

That explains why the system on which I was running the Technical Preview already has the upgrade while a separate Windows 8.1-based system can't get it. There are ways around that, of course. One is to go to one of Microsoft's 110 retail stores and drop off your machine, though it's not clear how long that will take, and perhaps you don't live near one, so that may not be an option.

Another alternative is to take Redmond contributor Brien Posey's advice and run a script that will apparently push through the upgrade. Or you can do what most people will likely do -- just wait!

Posted by Jeffrey Schwartz on 07/29/2015 at 12:54 PM0 comments


Windows 10 Launch: Online but Not On Line

What if you had a party and found out everyone came for the food and not to see you? That was the case at Microsoft's Windows 10 launch party at the Roosevelt Field mall in Garden City, N.Y., home to one of the nine Microsoft-owned retail stores picked for celebrity events marking the launch of Windows 10. Lower-key, in-store festivities were planned for all 110 Microsoft Stores in the United States, Canada and Puerto Rico.

It's a far cry from Microsoft's biggest launch event nearly two decades ago, when computer and electronics stores all over the world opened at midnight for the release of Windows 95, the desktop OS that ushered in the mainstream PC era. Today's launch of Windows 10 was decidedly more low-key and primarily virtual, and in the early hours at least it appeared to be a nonevent.

At the Best Buy store in Westbury, N.Y., other than a few balloons and a couple of signs, it was business as usual. In fact, the sign at the front entrance flags the newest Apple MacBook, and the Surface Pro 3s on display still have Windows 8.1 running on them. A few machines do appear to have Windows 10 on them, but other than a few people browsing the Best Buy Microsoft department, there was no extra influx of customers. At the Staples store next door, no one was at its modest PC section, and the few systems that were turned on were also running Windows 8.1.

When I asked an employee at Micro Center, also in Westbury, why the store wasn't opening early for the Windows 10 launch, he said Microsoft prohibited the store from doing so. "They still call the shots and they want the focus to be on their stores," he said, though he added that the retailer has Windows 10-equipped machines ready to go.

At Roosevelt Field, one of the largest shopping malls in the country, Microsoft is showcasing an appearance by Abby Wambach, who was on the U.S. team that won the women's World Cup Soccer championship this summer.

A line for passes to Wambach's 7 p.m. appearance -- and to take a look at Windows 10, of course -- formed at 7 a.m. By the time I arrived around 9 a.m., there were about 100 people on that line. I started asking people if they were there to check out Windows 10, and they universally said they were there to see Abby Wambach. One man there asked me, "What is Windows 10?" With his two daughters in tow, he continued: "I'm a paper and pencil guy. I still look for payphones." Another person waiting on line said bluntly that he uses a Mac and was also just there to score some passes to see Wambach.

Well, at least someone was there with an interest in Windows 10. It turns out Jimmy Solis is a consultant in the New York City suburbs to local small businesses, mostly with 10 to 20 employees. Solis said he's a Windows Insider who has been testing the Windows Technical Preview since it was released in early October, and he's impressed with the final build.

"It's good -- it's going to be like the new Windows 7," Solis said. "It's user friendly, that's for sure. A lot of my clients have complained about Windows 8 but they've fixed all of its bugs." By bugs he was referring to the design of the operating system. Like any IT consultant, Solis said he's going to wait for the first set of patches before recommending any of his clients upgrade to Windows 10.

Scott Goeke, the store manager at the Roosevelt Field location, took about 20 minutes to talk just before the store opened and said he wasn't dismayed when I told him that the crowd was primarily there to see Wambach and not Windows 10. "I'm fully expecting a great kickoff and think it will escalate this weekend," Goeke said. "People have been coming in for months asking about Windows 10."

Many customers had already dropped off their PCs to have the store take care of the free upgrade for them, according to Goeke. The store is offering free installation for those who can't or don't want to go through the process of the download.

While I didn't expect to discover people camped out for the release, I was surprised to see fewer people waiting in advance than the crowd that showed up a year ago to celebrate the grand opening of that store. It appears a free Demi Lovato concert on a weekend is a bigger draw than a weekday meet-and-greet with Wambach.

Posted by Jeffrey Schwartz on 07/29/2015 at 11:53 AM0 comments


Citrix CEO Mark Templeton To Retire Amid Push for Growth

Citrix today said longtime President and CEO Mark Templeton will retire once the company appoints a successor. Simultaneously, the company has agreed to give activist investor Elliott Management, which holds 7.5 percent of Citrix's common stock, a seat on its board. The seat will be filled by Jesse Cohen, who will replace Asiff Hirji. Citrix also said its board has formed an operations committee to work closely with the company's management team to find ways to improve margins, profits and its capital structure.

The board has also agreed to Elliott's demand that it consider the sale or spinoff of the Citrix "GoTo" business, which includes GoToMeeting, GoToMyPC and GoToWebinar, among other related product lines. Citrix said it will conduct a review of strategic alternatives for that business.

Elliott last month sent a letter to Templeton and Citrix Chairman Thomas Bogan indicating that it wants to see the company improve its operations and spin off some assets, arguing that Citrix is significantly undervalued and suggesting its stock could be worth up to $100 per share by the end of next year. The stock closed at $61.47 per share on Thursday, though it rose on the news and a slightly better-than-expected second-quarter report. The company last quarter collected $797 million in revenue, up 2 percent year over year, and posted $103 million in earnings.

In addition to the new board director Cohen, the operations committee will include Citrix director Robert Calderoni, who was also named executive chairman of the board. Bogan was also named to the committee and he'll become lead independent director of the Citrix board.

"We believe the addition of new and fresh perspectives to our board will ensure Citrix continues to lead in application networking and virtualization markets," Bogan said in a statement

Added Elliott's Cohen: "We are confident that the initiatives announced today and the addition of new directors to the company's board will allow Citrix to build upon its position as an innovative industry leader, and to drive significant shareholder value."

Citrix, best known these days for its XenDesktop and XenApp desktop virtualization platforms, is also betting big on its new Workspace Cloud platform. The company demonstrated the Workspace Cloud for the first time at its Synergy conference in Orlando, Fla., back in May. Citrix Workspace Cloud is the company's next-generation digital workspace for Windows-based PCs, Macs, iPads, Android tablets, Chromebooks, new Linux-based systems and even embedded devices that enable Internet of Things-type environments. It's based on a cloud delivery architecture that provides orchestration across servers and nodes.

Posted by Jeffrey Schwartz on 07/28/2015 at 3:17 PM0 comments


What's Next for Microsoft's Surface Tablet PC?

Microsoft may be scaling back its hardware ambitions but the company claims it's still very much committed to its Surface tablet PC business, which several years ago gave the company a black eye following lackluster demand. These days, Microsoft's Surface business is on the rise.

Sales of Surface devices were $900 million in the last quarter and $3.6 billion for the entire fiscal year 2015, which ended June 30. Don't expect a vast line of Surfaces, as Microsoft doesn't want to alienate its OEM partners. But CEO Satya Nadella appears upbeat about the business.

"Surface is clearly a product where we have gotten the formula right, earned fans, and can apply this formula to other parts of the hardware portfolio," Nadella said during Microsoft's earnings call on Tuesday.

So what's next? The Windows 10 release is now just days away and many are wondering whether an upgraded Surface Pro is in the wings. The company isn't saying but DigiTimes last week reported various components suppliers have pegged a new high-end system powered by Intel's Skylake processor, the successor to its Broadwell chips, that arrived early this year but later than planned. Skylake is expected to offer incremental CPU power improvements and improved battery life for devices. According to Digitimes and other reports, the newest Surface Pro devices will maintain the same form factor.

Some news regarding the Surface 3 did hit today: the company announced the general availability of its newest Surface 3 tablet PCs with 4G LTE via AT&T Wireless and T-Mobile. The new units should appeal to those who want or need a device with cellular connectivity when Wi-Fi isn't available.

A cellular option, available for iPads, Chromebooks and Android tablets, had been absent from the latest crop of Surface devices, and it effectively adds $100 to the price. Microsoft released the Surface 3, based on Intel's latest system-on-a-chip, the quad-core Atom x7 processor, back in April. It's available with either 2GB of RAM and a 64GB SSD for $499 or 4GB of RAM and a 128GB SSD for $599. With 4G LTE, the units cost $599 and $699, respectively.

The company isn't saying whether it will add 4G LTE to the Surface Pro 3, but given that it's been out for more than a year, it's more likely that if 4G LTE is slated for a Pro unit, Microsoft will offer it with the next version.

Posted by Jeffrey Schwartz on 07/24/2015 at 2:23 PM0 comments


Amazon Shocker: Profit Buoyed By AWS' Runaway Growth and Efficiencies

Anticipating another report of an unprofitable quarter from the margin-pressed Amazon.com, analysts were shocked by the e-retailer's $92 million profit for the second quarter on $23.2 billion in revenues, reported Thursday. The profit was minuscule, but Amazon notoriously posts losses, and last quarter wasn't projected to be an exception. Remarkably, what took the squeeze off margins was its Amazon Web Services (AWS) public cloud business.

AWS revenues surged 81 percent year-over-year to $1.82 billion, markedly accelerated growth over the first quarter's 49 percent jump in sales. Analysts also were impressed by AWS' margin of 21 percent, up from 17 percent the previous quarter.

Investors rewarded Amazon, pushing its shares up more than 16 percent in Friday morning trading, boosting its market cap above $250 billion and giving the company a larger valuation than its brick-and-mortar rival Wal-Mart, making it the most valuable retailer in the world. In addition to tighter companywide cost controls, analysts were notably impressed by the performance of AWS, with some even suggesting Amazon could profit by spinning it off or selling it. Overnight, Wall Street has fallen back in love with Amazon.

What a difference a year makes. At this time in 2014, Amazon was posting disappointing results quarter after quarter, with much of the blame placed on the tight margins of its retail business, huge investments and a slowing AWS business.

Despite AWS' heavy price cutting to compete with Microsoft and Google, Amazon Chief Financial Officer and Senior VP Brian Olsavsky credited the subsidiary's team with the rollout of numerous new services and features for enterprise customers as well as improved cost efficiencies. "We are seeing continued increases in usage, both sequentially and year-over-year," Olsavsky said on Thursday night's earnings call. "Innovation is accelerating not decelerating. We had over 350 significant new features and services and we believe that's what resonates with customers. While pricing is certainly a factor we don't believe it's always the primary factor. In fact what we hear from our customers is that the ability to move faster and more agile is what they value."

AWS' performance gives it new momentum. Though AWS remains the undisputed leader in the growing market for enterprise public cloud services, Microsoft and Google have built out large global infrastructures that rival its footprint, features and pricing. IBM, Hewlett Packard, Oracle and Rackspace, among others, continue to mount challenges as well. While these rivals talk up their hybrid differentiators and continue to expand and gain share, so far they aren't stymying AWS' growth.

Posted by Jeffrey Schwartz on 07/24/2015 at 10:09 AM


Cloud Native Computing Foundation Launches To Oversee Container Standards

An industry group formed last month to create standards for containers is on a fast track to get its work done. The Linux Foundation today announced the formation of the Cloud Native Computing Foundation, which will take on the work of the Open Container Initiative (OCI), formed at last month's DockerCon gathering in Santa Clara, Calif.

The group, which initially called itself the Open Container Project before changing its name to the OCI, describes cloud-native apps and services as those packaged as microservices-style containers, and it aims to ensure that cloud-native apps and services, such as automation tools, work irrespective of cloud service, operating system and virtual machine. The Linux Foundation used the annual O'Reilly OSCON conference in Portland to launch the new Cloud Native Computing Foundation.

Among the founding members of OCI at last month's DockerCon were Amazon Web Services, Apcera, Cisco, CoreOS, Docker, EMC, Fujitsu Limited, Goldman Sachs, Google, HP, Huawei, IBM, Intel, Joyent, Mesosphere, Microsoft, Pivotal, Rancher Labs, Red Hat and VMware. AT&T, ClusterHQ, Datera, Kismatic, Kyup, Midokura, Nutanix, Oracle, Polyverse, Resin.io, Sysdig, SUSE, Twitter and Verizon have since signed on.

In forming the new organization, Docker has contributed its base container runtime, which will serve as the underlying compute spec. It will fall under the governance of the OCI, which will also draw on the Application Container (appc) spec. The foundation has published its governance charter, and the specs are available on GitHub.

Interestingly, the technical lead at Docker organizing the effort is Patrick Chanezon, who was hired away from Microsoft in April after a two-year stint in Redmond, where he worked with Azure CTO Mark Russinovich on the Docker container ecosystem. "My main role there was to bring all the Docker ecosystem partners on Azure," he said. "And Microsoft loved the Docker workflow so much that they decided to implement it for Windows. What Mark said a year ago is happening right now."

Docker founder Solomon Hykes recruited Chanezon from Microsoft to help work on the next wave of the Docker platform. The OCP effort kicked off at last month's DockerCon, with Chanezon becoming the company's liaison for the project. Working on standards is nothing new for him: he worked on the JSR 168 Java portlet specification at Sun Microsystems, and at Google he worked on the HTML5 and OpenSocial specs.

Chanezon said the formation of the Cloud Native Computing Foundation and agreement on specifications over the past month have happened faster than on any other such project he has worked on. "I remember at Sun with JSR 168 there was endless discussion between different vendors," Chanezon recalled. "Here, six weeks after we announced, we will have the first draft spec on which all participants agree. I've never seen anything get to an agreement so fast. And one of the reasons that's the case is I think container-based computing is being adopted by everyone in the industry. Lots of people want to innovate at the higher level, which is at the orchestration level, and then we can all agree on the standard image format."

From the perspective of advancing interoperability of containers, Chanezon compared OCI to the adoption of the TCP/IP networking standards in the 1990s. "We had lots of protocols, like FTP, Gopher, HTTP and there was lots of competition between all of these protocols, but TCP/IP was the basis on which everyone would agree," he said. "I think with OCI, we're establishing a single basis, and then there will be a lot of competition at the orchestration layer."

The runc spec is now available for comment, and the OCI's goal is to have a first draft ready in the next three weeks.

 

Posted by Jeffrey Schwartz on 07/23/2015 at 10:14 AM


Microsoft's Enterprise Mobility Suite Could Be a $1 Billion Business

One year after releasing its Enterprise Mobility Suite (EMS), Microsoft says it's the "hottest" product the company now offers. Microsoft COO Kevin Turner last week said EMS is on pace to become the company's next $1 billion product (in annual revenue).

While Turner didn't indicate when that might happen, the company yesterday said in its earnings release that it has 17,000 Enterprise Mobility customers, up 90 percent year-over-year for its fourth fiscal quarter, which ended June 30. The overall installed base has increased 600 percent, the company said, though naturally from a small base. EMS is a cloud-based service consisting of Intune, Azure Active Directory and Azure Rights Management. Subscriptions start as low as $4 per month per user.

"It is the hottest product we have in the company," Turner said in his keynote address at Microsoft's Worldwide Partner Conference in Orlando, Fla. "This product has exploded. It will be a $1 billion product in the future and the market is being made on it now."

At the same time, many competitors with mobile device management suites and various security tools are fighting back. The latest to do so is VMware, which last month enhanced its AirWatch suite by adding its own single sign-on offering that could compete with Azure Active Directory. Adding insult to injury, VMware was named a Leader in Gartner's Magic Quadrant for enterprise mobility management along with Citrix, IBM, MobileIron and Good Technology, while Microsoft was a Visionary for having "a strong vision but falling short of the leaders in terms of execution." Likewise, Okta was showcased as the only Leader in the Identity and Access Management as a Service category, with Azure AD showing as a Visionary. When I spoke with Microsoft at the time, the company spun it as a good thing that it was recognized as a Visionary for a product that hadn't been on the market for even a year.

When Microsoft launched EMS at last year's TechEd conference in Houston, Corporate VP Brad Anderson said outright he believes it will obviate the need for traditional MDM products. In a blog post today showcasing EMS, Anderson maintains that prediction.

"The market is definitely still emerging," Anderson said.  "As the value and necessity of EMM grows, we see customers evolving their approach, innovating, and bringing new needs and demands every day.  On a really regular basis I see the traditional point solution MDM vendors, or the identity and access management vendors, struggling to keep up with these demands – customers are seeking more comprehensive and holistic solutions that are architected for (and can scale to) the cloud."

Since EMS' release a year ago, Anderson said, Microsoft has extended it with support for the newest Outlook app, improved Android support, e-discovery, privileged identity management and improved connectivity between Active Directory and Azure Active Directory via the release of AD Connect.

The stakes are big for Microsoft's move into the MDM market, and its success, or lack thereof, could have a major ripple effect on other offerings, notably Active Directory. Is EMS on your short list?

 

Posted by Jeffrey Schwartz on 07/22/2015 at 2:29 PM


Microsoft Posts Worst Loss Ever but Looks Ahead to Win in Cloud

When Microsoft acquired Nokia's handset business last year, many feared it was doomed on arrival. The new CEO, Satya Nadella, was stuck with the deal struck by his predecessor Steve Ballmer and stuck with trying to make it work. Clearly that didn't happen as Nadella earlier this month warned employees of plans to write off most of the business, resulting in Microsoft posting the worst quarterly loss in its history. 

Microsoft decided to take its medicine and start the new fiscal year 2016 with a clean slate and focus on Windows 10, which arrives a week from today, and its growing cloud business. In the conference call with investors last evening to discuss the quarterly and year-end results, Nadella emphasized Microsoft's growing cloud business, which includes Office 365, Azure and Dynamics, among other services.

Nadella said its cloud business is on an $8 billion run rate this year and on pace to become a $20 billion business in 2018. On the call, Nadella reiterated the three key areas of focus for Microsoft moving forward, which he outlined at last week's Worldwide Partner Conference in Orlando, Fla. The three areas are:

  • Productivity and business processes, including its Office 365, SharePoint and Dynamics offerings.
  • Building an "intelligent" cloud platform via Azure, which includes everything from the Enterprise Mobility Suite (consisting of Intune, Azure Active Directory Premium and Azure Rights Management) to the company's business intelligence, analytics and security offerings.
  • Windows: making computing more personal with Windows across PCs, tablets and phones. While phones may not play out the way Microsoft had once hoped in terms of gaining large share, its support for iOS and Android is designed to keep that broader ambition alive.

For those three to play out, Microsoft is betting tens of billions of dollars on building out its cloud infrastructure and beating out its rivals -- most of whom are also critical partners, including Amazon, IBM, Google, Hewlett Packard, Oracle and VMware.

"We need to own the cloud," said Kevin Turner, speaking in his WPC keynote address, where he typically gives his annual competitive sizing of the markets Microsoft compete in. "The cloud market is being made right now. And I promise you if we don't own it with the customer, somebody else is going to own it. We have the technology, we have the solutions this is the time to own the cloud."

Turner claims that 85 percent of the Fortune 500 runs at least one Azure service and 60 percent run two or more. While it's too early to say whether Turner's prediction of owning the cloud will play out, the numbers, opaque as they are, look promising at this point. The $8 billion run rate represents an 88 percent year-over-year increase. If Microsoft can maintain its cloud growth goals, its worst loss ever will be a lot easier to swallow.

Posted by Jeffrey Schwartz on 07/22/2015 at 11:49 AM


Microsoft Premieres First TV Spot for Windows 10

The ad blitz for Windows 10 has begun. Microsoft last night premiered its first TV spot online for the new operating system, which the company is set to release next week. The new Windows 10 spot introduces the Cortana personal assistant and Windows Hello, the feature that aims to replace passwords with biometric authentication. Those two features promise to change the way people interact with Windows, and Microsoft used babies and young children to bring this point home, saying that "these kids will grow up with Windows 10."

The 60-second spot is emotional yet gets to the heart of what Windows 10 is all about with the takeaway line: "Windows 10: The more human way to do." Microsoft posted the commercial on YouTube last night.

"The campaign tells the Windows 10 story through the lens of the newest generation, inviting people to join a new era with us," according to a post on Microsoft's Blogging Windows. The commercial shows young children "in their natural settings" in England, the U.S., Iceland Morocco and Thailand. According to the company, " the ads show how technology should be more natural, human and intuitive and adapt to people's needs. The key notion -- Windows 10 delivers a more human way to do."

The commercial kicks off Microsoft's year-long advertising and promotional campaign, which the company is calling "Upgrade Your World." Microsoft's ability to convince everyday users to upgrade to Windows 10 will be critical to the future of the operating system. When it comes to advertising, Microsoft doesn't have the history of connecting with consumers the way others, notably Apple, have long been able to do. Of course, convincing IT decision makers to roll it out on PCs over the next two years is equally crucial. But as more people bring their own devices to work (or use them for work), winning over consumers has never been more critical.

What do you think of the new commercial?

Posted by Jeffrey Schwartz on 07/20/2015 at 11:36 AM


Rackspace Will Offer Support for Microsoft Azure Cloud

In what may sound like an unusual arrangement even by today's standards of "coopetition," Rackspace will offer managed support services for Microsoft's Azure cloud. It's a curious arrangement in that Rackspace runs its own public cloud infrastructure as a service that competes with Microsoft Azure. At the same time, the two companies, partners for 13 years, obviously concluded both could benefit from offering the service.

Rackspace is known for its so-called "fanatical support" for the managed hosting and cloud services it offers. While that may sound like hype, I've talked to many Rackspace customers who have said it's really true. Apparently Rackspace has found there's money to be made in helping customers monitor and manage the rival cloud service. Rackspace recently added Office 365 support as well, has long offered SharePoint hosting and more recently added SQL Server support, along with other Microsoft infrastructure-based services. Last year Rackspace said it would support Microsoft's Cloud OS hybrid cloud software, as well as emphasizing new services based on Hyper-V. Meanwhile Microsoft, which offers Azure primarily as a self-service cloud, is relying on partners like Rackspace to offer Azure-based services and support.

Rackspace's own cloud IaaS is based on OpenStack, which the company helped create and championed moving into the open source community five years ago. But the company has struggled over the past few years and recently put itself up for sale, only to subsequently decide to remain a publicly traded company. The company recently named Taylor Rhodes CEO, and he joined Scott Guthrie, executive vice president for Microsoft's cloud and enterprise business, on stage at Microsoft's Worldwide Partner Conference earlier this week in Orlando.

"We have hundreds of Microsoft-certified professionals on our team," Rhodes said at WPC. "And now they can help customers who want to leverage the power of Azure to architect their applications the right way from the start.  To deploy to the platform much faster than they usually can on their own, and importantly, you will keep evolving your product, so we'll keep up with your releases in hopes that they use Azure to its full potential so they get full value. "

Rackspace will offer managed 24x7x365 support via its staff of Microsoft-certified engineers, along with architectural guidance, monitoring and hybrid deployment services. Customers can purchase support for only the Azure services they already have, or buy Azure infrastructure and have Rackspace support it as well.

Posted by Jeffrey Schwartz on 07/17/2015 at 2:28 PM


Google Surges, Crashes and Snoozes

Higher advertising spending on YouTube and on mobile platforms, along with growth in programmatic ad buying helped Google post a stronger-than-expected second quarter. Cost cutting and hints by its new CFO Ruth Porat that Google may, for the first time, offer investors a dividend or repurchase shares helped push its stock and market cap today to an all-time high past $400 billion.

Google revenues of $17.7 billion were up 11% for the quarter, blowing past Wall Street expectations of $14.28 billion. That's right -- the company posted $3 billion more in revenues than analysts expected. Investors applauded the strong quarter by pushing its shares up more than 14% today. Porat, who came over to Google last month from Morgan Stanley, noted that viewership on YouTube has increased 60 percent --  the fastest growth in two years.

The surge in its overall business competed for attention with news that Google's self-driving car was involved in another crash yesterday, this one in Mountain View, Calif., where the company is headquartered. Google said the crash wasn't the fault of the driverless car: a Lexus SUV behind it didn't brake at all, wrote Chris Urmson, who oversees Google's driverless car program. In a blog post, Urmson said it was the 14th time that another driver hit one of its cars, 11 of which were rear-end collisions.

"The clear theme is human error and inattention," Urmson said. "We'll take all this as a signal that we're starting to compare favorably with human drivers. Our self-driving cars can pay attention to hundreds of objects at once, 360 degrees in all directions, and they never get tired, irritable or distracted."

Meanwhile, if all of this makes you want to snooze, Google has taken care of that today as well. Google has added a snooze button to the interface of its Gmail program. So if you have a message you want to pop up to the top of your inbox at a more appropriate time, the company now supports that capability. Google said users can now snooze such messages as restaurant reservations, hotel confirmations, calendar invites and package tracking updates.

Posted by Jeffrey Schwartz on 07/17/2015 at 12:23 PM


Microsoft's Top Office Exec Affirms Commitment to SharePoint

Despite Microsoft's revelation that a new version of SharePoint Server is on its roadmap for next year, many customers and partners have wondered how committed the company is to the on-premises version of its collaboration platform. Those concerns escalated following the company's first Ignite conference back in May, where the company emphasized Office 365 and new tools such as Delve and left SharePoint Server 2016 largely in the background.

Julia White, general manager of Microsoft's Office division, which oversees SharePoint, admitted that she received 423 e-mails following Ignite asking why she barely mentioned SharePoint in her keynote address at the time. In her keynote at Microsoft's Worldwide Partner Conference, taking place in Orlando this week, White acknowledged giving short shrift to SharePoint at Ignite and said the company is not deemphasizing SharePoint. "Today, I'm here to say, SharePoint," she said in her keynote Monday. "SharePoint, SharePoint, SharePoint, SharePoint, SharePoint. We are absolutely committed. We have a fantastic SharePoint Server 2016 coming out. Rock-solid code based on the cloud. For the first time, we're taking the cloud code base and delivering that with our SharePoint Server 2016, which means you get the great reliability, performance [and] scalability that we've learned from the cloud into the Server code base."

SharePoint 2016 is slated for release in the second half of next year, and it will be built with a 'cloud-first' approach. To be sure, despite the lack of love given to SharePoint by White in her Ignite keynote, Microsoft did hold some sessions at the Chicago confab to outline plans for next year's release. A technical preview or beta is likely later this summer. As reported at the time by my colleague Kurt Mackie, Microsoft is continuing to focus on SharePoint's Files, Content Management, Sites and Portals components going forward. It plans to make it easier to use hybrid architectures (SharePoint Server plus Office 365 services) and make it easier for organizations to perform migrations when they are ready.

"We know that SharePoint will be in hybrid for a long time, [as is ] the nature of that workload," White said. "But it doesn't mean our customers don't want to take advantage of cloud capabilities as well. So now you can have all of your custom workflow on-premises and still be able to tap into the Office Graph in a hybrid capability.  And that's what [SharePoint] Server is delivering."

New features will also include built-in data loss prevention (DLP), auditing, reporting and e-discovery, White emphasized.

Microsoft realizes and has acknowledged that many enterprises will want to use SharePoint Server to keep certain data on premises. At the same time, it appears Microsoft is emphasizing the hybrid nature of SharePoint Server 2016, tying the new on-premises server with much of what's available via Office 365 services.

Posted by Jeffrey Schwartz on 07/16/2015 at 11:46 AM


Windows 10 Will Be the Last Major Microsoft OS Release

Microsoft has alluded to it for some time, but when COO Kevin Turner gave his annual pep talk to partners today, he said in no uncertain terms that Windows 10 will be the last major new release of the operating system.

Speaking at Microsoft's Worldwide Partner Conference, taking place this week in Orlando, Fla., Turner said the move to more continuous upgrades means the company is officially moving away from its model of releasing substantial new versions of Windows every three years. "This will be the last monolithic release we have that was built around the three-year upgrade cycle," Turner said. "We will continually be improving the product."

Does that mean there will be no Windows 11, Windows 12, etc.? While Microsoft hasn't explained how it'll number or name these more frequent upgrades, it would be a safe bet that those with bug fixes and just a handful of new features will be point releases. Upgrades with more significant feature sets could get new version numbers, much like Apple does with iOS and OS X. Hence, a Windows 10.1 followed by Windows 10.2, etc., appears a likely scenario for point releases. It also wouldn't be surprising to see Microsoft downplay those version numbers over time.

What remains to be seen is the business model for Windows moving forward. Will Windows as a service mean customers over time must pay to receive continuous upgrades? Customers that have become accustomed to free operating systems on other device platforms may be reluctant to pay a subscription fee for Windows, particularly consumers. One possible route Microsoft could go is to have a bare bones starter edition that's free and premium versions that are subscription based. It could also be tied to other subscription-based offerings, notably Office 365.

Microsoft's shift to the new Windows-as-a-service model is a likely reason the company is taking a different low-key approach to this launch. Rather than hosting a major event, Microsoft is having distributed celebrations.

Turner and others have noted that Windows 95 hit RTM 20 years ago today. The Windows 95 launch event, which took place in August 1995 on the Redmond campus outdoors with the rights to the Rolling Stones' "Start Me Up" and Jay Leno serving as master of ceremonies, was a major media spectacle.

As Microsoft prepares to deliver Windows as a service moving forward, a new way of introducing it and generating interest makes sense.

 

Posted by Jeffrey Schwartz on 07/15/2015 at 12:09 PM


Microsoft Reveals Plans for Windows 10 Launch Events

As the July 29 Windows 10 release draws near, Microsoft today revealed its plans to debut the new operating system. The company will kick off a TV ad blitz next week (July 20) and hold events at its own retail stores and third-party retailers.

Microsoft is also planning "special events" at 13 of its stores throughout the world on launch day, in cities including Sydney, Tokyo, Singapore, Beijing, New Delhi, Dubai, Nairobi, Berlin, Johannesburg, Madrid, London, Sao Paulo and New York City. Among the major retail chains participating are Best Buy, Bic Camera, Croma, Currys/PC World, Elkjøp, Jarrir, Incredible Connection, Media Markt, Staples, Wal-Mart, Yamada, Yodobashi and others, Microsoft said.

As part of its promotional campaign called "Upgrade Your World," Microsoft will showcase those who use Windows 10. The company is also contributing $10 million to 10 global and 100 local nonprofit organizations to promote awareness of their causes, Microsoft said. They include CARE, Code.org, Keep a Child Alive, Malala Fund, Pencils of Promise, Save the Children, Special Olympics, The Global Poverty Project and The Nature Conservancy.

The launch events and ad campaign are largely targeted at consumers, but the announcement was conveniently timed with Microsoft's annual Worldwide Partner Conference, which CEO Satya Nadella, Windows and Devices chief Terry Myerson and other key executives kicked off with the opening keynote session. In talking about Windows 10, Nadella emphasized major new security features that enable IT to encrypt and manage access to all business data and separate it from personal information. Nadella took the stage reminding the company's vast partner base of the principle on which Microsoft was founded: bringing personal computers to mainstream business and home users.

Computing will also become more personal thanks to Continuum, which will bring a common Windows platform to all types of devices, from sensors and phones to tablets, PCs and large conference rooms. "We are going to have this one unified platform, and that to me is a key differentiator of what Windows stands for," Nadella said in his WPC keynote.

Other critical extensions to Windows 10 will include the Cortana digital assistant and HoloLens, which will bring 3D user experiences to consumer and business environments.

Posted by Jeffrey Schwartz on 07/13/2015 at 2:37 PM


Anticipation of Windows 10 Release Stalls PC Sales

It's not unusual for PC sales to fall off in advance of a new operating system release and last quarter was no exception.

PC shipments plummeted 11.8 percent in the three-month period that ended June 30, compared with the same period last year, according to IDC's quarterly PC Tracker report released Thursday night. The decline was 1 percentage point steeper than IDC had earlier projected but overall in line with expectations, given that the comparative period last year was buoyed by Windows XP's end of life and that sales channels were reducing inventories to make way for this month's release of Windows 10.

Similar to Gartner, IDC doesn't anticipate an immediate bump after Microsoft's July 29 release of Windows 10. Gartner earlier this week said it's predicting a 5.7 percent decrease in PC spending this year. IDC points to another noteworthy, but certainly not surprising, factor: the free Windows 10 upgrade for those with Windows 7 Home and Pro editions will stall new PC purchases.

Another reason IT professionals will want to wait, at least initially, is for new PCs based on Intel's new processor line, code-named Skylake, as well as a new line of Broadwell CPUs. "All of the hardware vendors are readying new designs based on Skylake and to take advantage of the new Windows design with thinner, lighter and better battery life," said Patrick Moorhead, president and principal analyst with Moor Insights & Strategy.

Moorhead, who follows the PC industry closely, believes Windows 10 will be a popular operating system. Despite the obvious criticism of its predecessor, Moorhead believes the return of application developers will be key to its success.  "I believe there will be many more apps in this ecosystem, if nothing else because of the ease for which you can get them into Windows 10," Moorhead said.

However, Moorhead believes some predictions that Windows 10 will get a strong lift in the first year are overstated. That includes our survey, published Wednesday, which found that 55 percent will upgrade in the first year, with 21 percent doing so in the first three months. A more reasonable expectation, Moorhead said, is that 20 to 30 percent will roll out Windows 10 within a year. "I don't believe any research out there is worth anything because upgrades will be dependent on the promotions Microsoft does," Moorhead said. "We haven't seen them yet but they're coming."

Posted by Jeffrey Schwartz on 07/10/2015 at 12:05 PM


Windows 10 Upgrade Outlook Appears Strong Despite Weak PC Demand

With just three weeks until Windows 10 is scheduled to arrive, early indications continue to suggest that there will be significant demand among business PC users, though it's less clear whether the new OS will boost Microsoft's share of the tablet market.

More than half of the 675 Redmond magazine readers responding to an online survey conducted last week said they plan to upgrade their existing PCs to Windows 10 within one year. According to the survey, 55 percent will upgrade in the first 12 months and 21 percent will do so within the first three months of Windows 10's release. Thirty-five percent say they intend to upgrade within the first six months.

The question was asked in conjunction with the annual Redmond Third-Party Reader's Choice Awards, which will be published in September. With the release of Windows 10 on July 29, 35 percent of those responding said they plan to refresh their PCs at a faster pace than before. On the other hand, 65 percent have no plans to expedite PC refreshes.

Microsoft's plan to offer consumers Windows 10 as a free upgrade could have an impact on demand for new systems, given that the new OS could extend the life of older systems. However, only those with Windows 7 (or later) Home and Pro editions are eligible for free Windows 10 upgrades. Volume licensees and those with Enterprise editions must have Software Assurance to get the Windows 10 upgrade, which has to be downloaded from the Microsoft Volume Licensing Service Center.

Microsoft has indicated that it sees a strong pipeline for new PCs and devices despite its push to get customers to upgrade their existing systems. Yet a number of indicators don't bode well for new system sales for the rest of this year. Among the recent signs that demand for PCs will remain weak were Gartner's report this week that the $606 billion PC and computing device market will decline 5.7 percent this year and AMD's warning that its revenues for the quarter that ended June 27 would be down 8%, much sharper than the 3% decline that was earlier forecast.

AMD attributed the decline to weaker PC sales, sending its shares down more than 15% Tuesday. Weak PC demand is also likely to impact Intel, according to a Goldman Sachs research note last month. Micron, the largest provider of RAM for PCs, last month blamed a falloff in PC demand for the lower revenues it posted in its third-quarter earnings report.

Does your organization plan to upgrade to Windows 10? If so, will it be via a new system or an upgrade of your existing device? Either way, what is the impetus for you to upgrade to Windows 10 (or not, if that's the case)? Feel free to comment below or drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 07/08/2015 at 1:40 PM


Microsoft Links System Center to New Operations Management Suite

When Microsoft launched its new cloud-based Operations Management Suite (OMS) two months ago, the company emphasized that IT organizations could use it directly or as an extension to their existing System Center management implementations. Microsoft last week added that extension with the release of its System Center add-on for OMS.

The idea behind OMS, released during Microsoft's Ignite conference in early May, is that organizations that don't use System Center can use the cloud-based management platform to administer workloads running on Microsoft Azure, Windows Server, Amazon Web Services, Linux, VMware and OpenStack.

"The OMS add-on provides System Center customers access to the full suite of OMS solutions at one low cost," the company said in a blog post. "For every System Center Standard or Datacenter license you own with Software Assurance, you will be able to purchase a corresponding Microsoft Operations Management Suite add-on for access to allocated solutions that enable you to extend your datacenter, quickly enable hybrid cloud scenarios, and take advantage of cloud bursting, migration and dev/test scenarios."

According to Microsoft, OMS for System Center Standard Edition is priced at $60 per month for a two VM pack, though with an annual commitment the cost is $36 a month for those signing up by year's end. If purchasing OMS separately without System Center, the service costs $83 per month, Microsoft said. The company posted a price list, noting that for each System Center license owned, customers can purchase a corresponding OMS add-on.
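
To put those list prices in perspective, here is a quick back-of-the-envelope sketch in Python. It assumes add-on pricing scales linearly per two-VM pack, which the price list implies but doesn't state outright; the VM counts are purely illustrative, not figures from Microsoft.

```python
# Back-of-the-envelope math using only the figures quoted above.
# Assumption: add-on pricing scales linearly per two-VM pack.
PACK_SIZE = 2                # VMs covered per OMS add-on pack
STANDARD_PACK = 60           # $/month per pack, month-to-month
ANNUAL_COMMIT_PACK = 36      # $/month per pack with an annual commitment

def monthly_cost(vm_count: int, price_per_pack: int) -> int:
    """Monthly OMS add-on cost for vm_count VMs, rounded up to whole packs."""
    packs = -(-vm_count // PACK_SIZE)  # ceiling division
    return packs * price_per_pack

vms = 10  # illustrative VM count
print(monthly_cost(vms, STANDARD_PACK))       # 5 packs x $60 = $300/month
print(monthly_cost(vms, ANNUAL_COMMIT_PACK))  # 5 packs x $36 = $180/month
```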

Microsoft also said it's now looking to extend its use of Azure for backup and recovery to a broader base of customers including small and medium businesses (SMBs) by offering its Azure Site Recovery service with OMS. The company is extending its Azure Backup offering with a new System Center Data Protection Manager (DPM) agent, available for download. Also new is Azure Backup for IaaS VMs, which Microsoft says provides access to multi-disk storage and PowerShell automation and a new Azure Backup management console.

Posted by Jeffrey Schwartz on 07/07/2015 at 10:34 AM


Microsoft's Stature on Wall Street Surges

As Microsoft wraps up another fiscal year, it appears it was a good one, with significant changes that'll shape the company for years to come. Despite the tough choices that CEO Satya Nadella said last week Microsoft must make, the company is also getting respect for the moves he's made over the past year in Silicon Valley, as well as in another important place: Wall Street.

Microsoft has risen sharply in the annual Barron's ranking of the 100 most respected companies among institutional investors, though it still has a long way to go before coming close to Apple, which once again topped the list. While Google dropped one notch to No. 4 this year, the two were the only tech companies in the top 10. Amazon.com, which still scored better than Microsoft, saw its ranking drop to 17 this year from 7 last year, while Intel at 25 dropped a bit from 21.

Last year, Microsoft was closer to the bottom of the list at 62, but it has jumped to 27. Despite that impressive rise in standing among investors, the question is: does Microsoft have much to show for itself? Its stock, today hovering just below $45 per share, is up 8% year-over-year. If you use Apple as the metric, Microsoft's growth was impressive but not earth-shattering. Apple shares have jumped 40% year-over-year. On the other hand, while investors may respect Google more than Microsoft, they haven't put their money where their mouths are when it comes to the search giant. Google shares are down 9% over the past 12 months. Some other notables in the tech market: Cisco is up 12%, Oracle is flat, IBM is down 10% and VMware shares are down 12%. The S&P 500 is up just under 6% during the past year. The tech-heavy Nasdaq 100 index has jumped 14%.

When you're in that plus-or-minus 10% zone, of course, valuations can swing sharply on major news or even the most speculative of rumors. So is Apple the only one to enjoy a meteoric rise in value? Well, Amazon is up 31% (putting aside that it's primarily an online retail business, many believe Amazon Web Services is fueling its growth). Salesforce.com is up about 20%, buoyed by recent speculation that Microsoft made an offer to acquire the huge SaaS provider. According to reports, Salesforce.com has held out for much more, and many believe such a combination would wreak havoc on both companies and their customers.

In addition to cultural differences, Barron's cover story this week examined another well-known trend in Silicon Valley, the Pacific Northwest and the broader technology industry that has many wondering whether we're in another tech bubble.

Salesforce.com is among a number of companies that use non-GAAP reporting to hide the expense of huge stock-based compensation, a practice critics say masks the true earnings of a company. Among others who use non-GAAP reporting in a big way to pay large amounts of stock to executives are Amazon, Google, Facebook, Twitter and Qualcomm, Barron's noted. Microsoft, Apple and Intel eschew such compensation. That would certainly complicate Microsoft absorbing Salesforce.

Many IT pros and developers make big investments in these companies, whether or not they hold shares. If Wall Street is taking more notice -- and interest -- that may have all types of consequences in Microsoft's next fiscal year and beyond.

 

Posted by Jeffrey Schwartz on 06/29/2015 at 12:19 PM


What 'Tough Choices' Are in Store for Microsoft?

Microsoft CEO Satya Nadella yesterday warned employees against believing the company's culture can remain "static" and cautioned that some "tough choices" will be made. The letter, obtained and published by GeekWire and confirmed as authentic by Mary Jo Foley's All About Microsoft blog, doesn't specify what those tough choices will be, but it's a reasonable guess they could have a major impact on Microsoft's smartphone ambitions.

The memo also said that Microsoft will evolve last year's goal to become a productivity and platforms company in a mobile-first, cloud-first world, to "reinvent productivity services for digital work that span all devices."  As Foley pointed out as she read "between the lines" of Nadella's memo, "Tough choices is many times a sign for layoffs and/or product-line phase-outs." That's true and as Nadella put it, "we will need to innovate in new areas, execute against our plans, make some tough choices in areas where things are not working and solve hard problems in ways that drive customer value."

The market share for Windows Phone remains static in the 3 percent range and there's little evidence that share will increase at all. At the very least, it wouldn't be surprising if Microsoft writes off its $7.2 billion acquisition of Nokia's handset business. Despite limited demand for Windows Phone, it would be surprising to see Microsoft give up on it before the release of Windows 10, which potentially could prop up demand for Microsoft's mobile platform. Last week's decision to combine its devices business with operating systems does suggest there will be some job reductions, certainly in areas where there's overlap. The move also reaffirms Nadella's commitment to the One Microsoft mission initiated two years ago by former CEO Steve Ballmer. Microsoft also appears committed to gaming, which, as Foley pointed out in this month's Redmond magazine column, is important.

Also worth pointing out, Nadella said in his memo that Microsoft will invest "in three interconnected and bold ambitions" which aim to:

  • Reinvent productivity and business processes. "We will reinvent productivity services for digital work that span all devices. We will also extend our experience footprint by building more business process experiences, integrated into content authoring and consumption, communication and collaboration tools. We will drive scale and usage by appealing to 'dual-use' customers, providing productivity services that enable them to accomplish more at work and in the rest of their life activities with other people."
  •  Build the intelligent cloud platform. "All these experiences will be powered by our cloud platform -- a cloud that provides our customers faster time to value, improved agility and cost reduction, and solutions that differentiate their business. We'll further provide a powerful extensibility model that is attractive to third-party developers and enterprises. This in turn enables us to attract applications to our cloud platform and attach our differentiated capabilities such as identity management, rich data management, machine learning and advanced analytics."
  • Create more personal computing. "We will build the best instantiation of this vision through our Windows device platform and our devices, which will serve to delight our customers, increase distribution of our services, drive gross margin, enable fundamentally new product categories, and generate opportunity for the Windows ecosystem more broadly. We will pursue our gaming ambition as part of this broader vision for Windows and increase its appeal to consumers. We will bring together Xbox Live and our first-party gaming efforts across PC, console, mobile and new categories like HoloLens into one integrated play."

As Foley pointed out, Nadella never mentioned Windows Phone, though he used the term mobile. Some have also raised the question of whether Microsoft will continue to put resources into Bing, which Nadella didn't mention.

What tough choices do you think Microsoft should make?

Posted by Jeffrey Schwartz on 06/26/2015 at 10:11 AM


Startup Snowflake Launches Cloud-Based Data Warehouse Service

A Silicon Valley startup led by former Microsoft Server and Tools President Bob Muglia has launched a cloud-based data warehouse service that it claims will significantly extend the limits of traditional analytics platforms. Snowflake Computing made its Snowflake Elastic Data Warehouse, which Muglia described as a big data platform built from scratch and designed specifically to run in the public cloud, generally available this week. Its release is the latest effort to bring data warehousing to the masses.

Muglia joined Snowflake, founded in 2012, last year. The company, which was founded by the lead architect of the Oracle RAC product along with other database and storage veterans, also said this week it received a $45 million Series C investment from Altimeter Capital. That brings the total amount invested in the company to $71 million.

Snowflake says what separates its offering from other high-performance data warehouse technology is that it was built from scratch to run in the public cloud. Company officials argue that this makes the service much more scalable and less expensive because it uses low-cost cloud storage. Muglia said Snowflake has come across numerous customers capturing machine-generated, semi-structured data in Hadoop-based clusters who have struggled to transform that data into a form that a traditional data warehouse can handle and that a business analyst can connect to with common BI tools such as Excel or Tableau.

"We just eliminate those steps and load the semi-structured data directly into Snowflake and immediately business analysts can run queries against it directly using the tools they understand, and this is a huge savings of complexity and time," Muglia said. With traditional data warehouses, "typically there's a loss that happens from a data perspective as you go through that transformation. Some data is not available because it is not transformed and also the data warehouses tend to only be able to handle a subset of the data. One of our customers was only loading one week's worth of data into their data warehouse because that's all they could support in that, whereas with Snowflake they could keep all the historical data around and when analysts wanted to run a query on data that's say three months old, that was no problem."

Muglia said it's not a problem because of the effectively unlimited storage available in Snowflake's back-end repository -- Amazon Web Services' S3 storage. Many of the early customers are keeping hundreds of terabytes of data in S3. Muglia said the service can easily scale to multiple petabytes of data stored in S3.

"There's no limit to the amount of data you can store," Muglia said. "Obviously if you issued a query that for any reason needed to scan a petabyte of data, that would be very costly and it would be long query to run. Physics still apply but one of the key things is that the way we store that data and the information we gather about the data allows our query processor to do something we call pruning. If you stored five years' worth of data and let's say it was 5 petabytes in size, and you issued a query against one week's worth of that data, regardless of what week it was, we could just select exactly the data we needed. And let's just say we only need half a terabyte of something, we could munch through that relatively quickly and return the results pretty quickly even though you may have 5 petabytes of data."

Asked whether, in scenarios where a customer has petabytes of data, Snowflake uses AWS' Glacier archive service, Muglia said that's not necessary given the current economics of storing data in S3. "It's probably the most economical place to store it actually," Muglia said. "Compared to enterprise-class storage it's crazy cheaper, and compared to putting it on nodes, which is what you'd need to do with Hadoop, it's also quite a bit less expensive. Even what people tend to think of as the low-cost alternative they talk about building things like data lakes using Hadoop. In those cases, the data is stored on the active nodes and while that's a whole lot cheaper than EMC storage, it's much more expensive than S3 would be and much more expensive therefore than Snowflake would be."

At today's rate of about $30 per terabyte per month for S3 storage from AWS, Snowflake charges "a slight premium" on top of that for the service. Snowflake is a software-as-a-service (SaaS) offering, so the underlying cloud storage infrastructure is relevant only to the extent customers have other data there. Despite the fact that Muglia left Microsoft four years ago after 23 years, it was ironic to hear him hawking AWS. As Microsoft's Server and Tools president, a role overseeing what was a $17 billion business when he left in 2011, Muglia was one of the first to extol the virtues of Azure to IT pros prior to and right after its launch; he frequently gave the keynote addresses at the company's annual TechEd conferences. Snowflake hasn't ruled out offering its service on Azure in the future, though the company has not conducted a detailed analysis of Microsoft's Azure Blob Storage. "It's feasible to do it in the short run. The question is customer demand," Muglia said. "I think there will probably be a day but it's not tomorrow that we'll do it."

Running on S3, Snowflake will compete most directly with Amazon's own Redshift data warehousing service and ultimately with Microsoft's Azure SQL Data Warehouse, announced in late April at the company's Build conference and set for release this summer. Snowflake Vice President of Product Marketing Jon Bock said Amazon's Redshift is among the most affordable cloud-based data warehouse services today, but he argued that its underlying engine is based on code from existing database technology.

"We have the benefit of starting with a completely new architecture and writing the full code base ourselves," Bock said. "One simple example of that is semi-structured data support, that machine data that's increasingly common that people want to analyze. Traditional databases weren't designed for that at all. So what you ended up having to do is put another system in front of that, which is where Hadoop came in. Then you preprocess that data, take the result and load it onto a relational database. We spent time making sure you didn't have to do that with Snowflake. You can take that machine data, load it directly into Snowflake and be able to query it immediately without any of that preprocessing or delay in the pipeline."

If the service lives up to its claims, it should boast a nice valuation, or become an attractive takeover target.

Posted by Jeffrey Schwartz on 06/24/2015 at 11:19 AM


Microsoft Demos First Multiplatform Container App

A year ago, Docker was just the latest upstart promising to change the way software is developed, deployed and managed. With key industry players supporting Docker's software container platform, Microsoft's partnership with the company last June was quickly noted for its promise to enable developers to build applications that work regardless of operating system, virtual machine and cloud. On the heels of joining the Docker-driven Open Container Project this week, Microsoft Azure CTO Mark Russinovich gave a keynote address at the annual DockerCon conference in San Francisco, demonstrating a key milestone toward building apps that can run on multiple server operating systems.

Russinovich demonstrated what Microsoft claims is the first ever multiplatform container application. "Built, shipped and running using Docker, this container application is the first in the industry to work across both Windows Server and Linux," said Corey Sanders, Microsoft's director for Azure product management, in a blog post. "We want to bring you broad choice and flexibility for building your apps, combining Windows Server and Linux containers with Docker Compose and Docker Swarm, to offer a truly cross-platform experience."

Also demonstrated for the first time was how developers and IT pros can use Microsoft's Azure Marketplace to select and deploy single- or multicontainer apps sourced from a Docker Hub image using the Docker Compose developer interface, Sanders noted.

Microsoft also showcased its support for the new Docker Trusted Registry VM image, an on-premises authentication repository launched by Docker this week. The Docker Trusted Registry VM image, also added to the Azure Marketplace, runs on premises where customers can store and share Docker container images.

Along with Microsoft, Amazon Web Services and IBM also are offering the Registry, with costs starting at $150 per month. Docker describes it as a highly available registry that offers integration with Active Directory, LDAP directories and other authentication platforms and offers role-based access control and audit logs for organizations looking to manage authorization or with compliance requirements.

Posted by Jeffrey Schwartz on 06/24/2015 at 11:16 AM


Combining Windows and Devices Engineering Makes Sense

The big headline from last week's annual Microsoft reorganization was that Stephen Elop and several other longtime senior executives and engineering heads are leaving Redmond. But the combination of Windows engineering with devices, and the move of Dynamics under Cloud and Enterprise, are signs that CEO Satya Nadella is looking to further break down the silos and, quite frankly, fiefdoms that stood in the way of successful products.

Having the executive who leads engineering for the software that powers devices such as the Surface tablet PCs, Lumia phones, Xbox gaming consoles and even the new HoloLens also in charge of that hardware seems like a no-brainer.

Nadella likely realized that, which meant Elop and Terry Myerson couldn't both remain in charge of their respective organizations indefinitely. As The New York Times reported today, Myerson's elevation makes him among the most powerful and visible executives at Microsoft under Nadella. Myerson's 18-year tenure with Microsoft began with an engineering role in the Exchange group, though his work in the operating systems group goes back to when he reported to onetime devices lead Andy Lees, with whom he reportedly clashed.

Before heading up the Windows group, Myerson was tapped to replace Lees as Windows Phone chief. Realizing that Windows Phone's predecessor, Windows Mobile, had too much legacy baggage to be salvaged a year after the modern phone OSes iOS and Android started to take hold, Myerson was a key advocate for starting from scratch and developing Windows Phone, The Times report noted.

Likewise, bringing the Dynamics group into the Cloud and Enterprise Group under Scott Guthrie's leadership promises to align the Dynamics applications with Azure and the various services and APIs built around it.

While bringing the two groups together looks like a wise move, there's more at stake with the consolidation of operating systems and devices. The question about both is whether they were brought together too late.

Posted by Jeffrey Schwartz on 06/22/2015 at 12:45 PM


Java License Snafu May Explain Lack of Native Android for Windows 10

Perhaps one of the most interesting announcements Microsoft made at its Build conference in San Francisco six weeks ago was the ability for developers to bridge their Android and iOS applications to the new Universal Windows Platform (UWP). Making it easier for developers to port their existing mobile apps, as well as new ones, to UWP could be critical to making Windows 10 an appealing OS.

However, developers must take a somewhat different approach when bringing their Android apps to UWP than when bringing iOS apps. Developers looking to port their apps from iOS to the new Windows 10 OS may find it easier than doing so with Android. That's because Microsoft EVP Terry Myerson -- who, in addition to overseeing Windows, is also now responsible for devices -- said in his Build keynote that developers can compile the same Objective-C code used to build iOS apps for iPhones and iPads within Visual Studio on Windows, "enabling you to leverage that code and use capabilities only found on [the] Windows platform." For Android apps, there isn't a comparable native approach.

That isn't sitting well with at least some Android developers. During the Q&A segment of a one-day Build roadshow stop in New York last month (the show is currently on a worldwide tour), an attendee asked if Microsoft has any plans to do for Android the equivalent of what it has done for iOS and allow native Java development in Visual Studio. Kevin Gallo, who is Microsoft's partner director for the Developer Ecosystem and Platform organization and who gave the day's keynote, responded: "I would love to do that. The challenge we have is there's a company that is currently the steward of Java who doesn't make that easy. For Objective-C it was possible. So we went that route. We see it as a great way of integrating with the application, where you can start with the codebase you've got and just use that language to build."

Implicitly, Gallo was saying Apple didn't stand in Microsoft's way.

The preferred language for native Android is Java. "Java is a very popular language. There are challenges in the Java space," Gallo said. "We're evaluating what we can do there. It's something we definitely are considering." The audience member asked if the issue lies with Oracle. "They're the steward," Gallo said. "They call it a steward for a reason."

The response was followed by laughter, and that was the end of that discussion.

Yet that brief exchange quickly reminded me of the feud Microsoft had in the late 1990s with the original steward of Java, Sun Microsystems, which Oracle acquired in 2010 for $7.4 billion. Sun revoked Microsoft's Java Virtual Machine license in Internet Explorer after accusing Microsoft of violating its terms. The dispute resulted in a lawsuit between the two companies. Concerns that Oracle would be able to engage in anti-competitive behavior also came up among critics of its bid to acquire Sun before the deal was approved by regulators. In more recent years, relations between Microsoft and Oracle have improved and the two announced a broad cloud compatibility pact two years ago.

To what extent Microsoft attempted to gain access to those Java components for UWP isn't exactly clear, but it doesn't look as though a contentious dispute -- if it is even contentious at all -- is rising to the levels of yesteryear.

IDC analyst Al Hilwa said the issue appears to be more of a licensing matter than a technical barrier. "I actually think that it would be a big win for Oracle to be involved more directly with an Android development platform on Windows and potentially elsewhere," Hilwa said. "In theory, vendors like Microsoft can do a parallel implementation of Java, much like Google did, but the legal exposure here is still open because of the Oracle/Google case."

Oracle said it's not commenting. A Microsoft spokesperson pointed to the company's new Project Astoria, the tooling announced at Build, which lets developers bring code from their Android apps to Windows while working within their current IDE. "It uses an Android subsystem to run code from your app in a highly efficient, Windows-optimized way," the spokesperson said.

For iOS, Microsoft's Project Islandwood, also announced at Build, imports iOS apps to the Windows ecosystem. "Via a Visual Studio extension, Islandwood takes a project-based approach -- translating your existing Xcode project into a Visual Studio solution, providing native support for Objective-C and its runtime, and providing a strong subset of the APIs you'll find on iOS, which makes it easy to bring your code across without significant modification," the spokesperson said. "Once it's in Visual Studio, it's a regular Universal Windows app that happens to be written in Objective-C and uses some APIs from another ecosystem."

IDC's Hilwa said he believes most Java developers should find the bridging approach suitable. "The feedback I hear from developers and app owners is that they are more likely to use the Android 'bridge' approach to bring an app to Windows 10," he said. "It promises to be quicker, though admittedly less strategic in terms of commitment to the platform. But, as long as the services being used are Microsoft's services, then essentially Microsoft gets what it wants out of the exercise."

Forrester analyst John Rymer said that to his knowledge, Microsoft doesn't have a Java license but "it relies on Oracle and Azul for the Java VMs that run on Azure." Azul and Microsoft partnered two years ago on a Windows distribution build of the OpenJDK that will run on Microsoft's Azure cloud platform.

"Underscoring all this, of course, is the assumption that Android and iOS developers will have any interest in Astoria and Islandwood in the first place," wrote Ars Technica's Sean Gallagher last month in a deep look at Microsoft's push to incent Android and iOS developers to port their apps to UWP. "With its broader reach, Islandwood may be the easier sell, but none of this is automatic. There are plenty of developers today using Xamarin to write apps for Android and iOS using C# and .NET. These apps should be easily portable to Windows 8 and Windows Phone -- but often those developers aren't bothering."

Either way, that history presumably explains why Microsoft is offering native iOS support for UWP but not for Android. As for whether Microsoft will offer native Java support in Project Astoria, the spokesperson said Microsoft has no immediate plans but recommended an alternative in the form of a plug-in available in the Visual Studio Gallery.

Posted by Jeffrey Schwartz on 06/19/2015 at 12:53 PM0 comments


Key Microsoft Execs To Depart Amid Restructuring

Microsoft today said Stephen Elop, once considered a frontrunner to replace Steve Ballmer as CEO in 2013 following the company's $7.2 billion acquisition of Nokia's handset business, is among several senior executives who'll be exiting Redmond. CEO Satya Nadella said Elop will leave along with Chief Strategist Mark Penn, Dynamics Chief Kirill Tatarinov and SVP for Technical Strategy Eric Rudder.

The shakeup comes as Microsoft looks to close out its fiscal year on June 30, around the time the company typically has its annual reorg. Nadella said in an e-mail to employees that Microsoft is consolidating its business units. The Dynamics business long held by Tatarinov will fall under the Cloud and Enterprise business led by Scott Guthrie, while the devices group headed by Elop will be combined with the Operating Systems Group led by Terry Myerson. The combined organization is called the Windows and Devices Group.

"This new team brings together all the engineering capability required to drive breakthrough innovations that will propel the Windows ecosystem forward," Nadella said in his e-mail. "WDG will drive Windows as a service across devices of all types and build all of our Microsoft devices including Surface, HoloLens, Lumia, Surface Hub, Band and Xbox. This enables us to create new categories while generating enthusiasm and demand for Windows broadly."

Nadella noted that combining Dynamics with the Cloud and Enterprise Group "will enable us to accelerate our ERP and CRM work even further and mainstream them as part of our core engineering and innovation efforts. C+E will work closely with ASG [Applications and Services Group] to ensure the end-to-end experience is cohesive across communications, collaboration and business processes."

The only surprise about Elop's departure is that it took so long, considering he was the only senior executive not seen at major events. Penn is sticking around until September, according to Nadella's e-mail, to pursue something else. Could it be that Penn will be jumping back into the political arena?

Bringing these groups and their engineering teams together makes sense as the company looks to build a more cohesive strategy around mobility, productivity, platforms and cloud.

Posted by Jeffrey Schwartz on 06/17/2015 at 12:13 PM0 comments


LastPass Breach Again Underscores Weakness of Passwords

The popular password management service LastPass disclosed yesterday that it discovered "suspicious activity" on its network in which e-mail addresses, password reminders and authentication hashes were breached, though the company said it doesn't believe encrypted user vault data was seized.

LastPass is among numerous cloud-based password management services that allow individuals and enterprise users to store their encrypted passwords in an online vault to provide single sign-on to Web sites and mobile application services. I have used the LastPass service for several years and have found it useful in an age when we have scores of passwords to remember. The inherent risk of using a password vault service such as LastPass is that if your master password is compromised, every site you have registered is at risk as well. The LastPass breach is the latest evidence that passwords are indeed hard to protect, even by experts.

Founder and CEO Joe Siegrist said he has confidence in the encryption methods LastPass uses to protect passwords. "LastPass strengthens the authentication hash with a random salt and 100,000 rounds of server-side PBKDF2-SHA256, in addition to the rounds performed client-side," he wrote in a blog post announcing the breach yesterday. "This additional strengthening makes it difficult to attack the stolen hashes with any significant speed."
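To make those round counts concrete, here's a minimal sketch of that kind of two-stage key stretching using only Python's standard library. The parameters and salt handling are illustrative assumptions, not LastPass's actual implementation.

```python
import hashlib
import os

# Illustrative parameters only -- these do not reproduce LastPass's actual scheme.
CLIENT_ROUNDS = 5_000     # rounds a client might perform on the user's device
SERVER_ROUNDS = 100_000   # additional server-side strengthening, per Siegrist's figure

def make_stored_hash(master_password: str, email: str):
    # Client side: stretch the master password with PBKDF2-SHA256, using the
    # account e-mail as a deterministic salt so the same key can be derived
    # on every device the user signs in from.
    client_key = hashlib.pbkdf2_hmac(
        "sha256", master_password.encode(), email.lower().encode(), CLIENT_ROUNDS)

    # Server side: strengthen the received value again with a random per-user
    # salt before storing it. An attacker who steals the stored hash must
    # repeat all of this work for every password guess.
    server_salt = os.urandom(16)
    stored_hash = hashlib.pbkdf2_hmac(
        "sha256", client_key, server_salt, SERVER_ROUNDS)
    return server_salt, stored_hash

salt, auth_hash = make_stored_hash("correct horse battery staple", "user@example.com")
print(auth_hash.hex())
```

The point of the high iteration counts is simply to make each guess expensive: stealing the hashes doesn't help much if every candidate password costs hundreds of thousands of hash operations to test.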

Even if you're willing to accept that your encrypted passwords are still safe, the stolen password reminders could be used in targeted attacks, Columbia University computer science professor Steve Bellovin told Brian Krebs on his KrebsOnSecurity news site. The bottom line is that users should change their master passwords.

The breach hasn't made me stop using LastPass, but it does make me look forward to the day when biometrics or the widespread use of two-factor authentication replaces passwords, even though that comes with its own baggage.

Posted by Jeffrey Schwartz on 06/16/2015 at 11:09 AM0 comments


Microsoft Pulls Modern App To Give Skype the Right Touch

Microsoft's decision to scrap the modern version of Skype for PCs means, effective July 7, you'll need to settle for the desktop version of the popular voice, video and chat application. Many PC users probably won't have an issue with the move, announced last week, but it's yet another black eye for Microsoft's Modern app model, which many still call "Metro."

After July 7, clicking on the Modern app version of Skype will automatically switch you to the desktop version. For its part, Microsoft points out that PC users have said a touch-optimized Skype isn't a necessary part of their experience.

"With the upcoming release of Windows 10 for PCs, it makes sense to use the Skype application optimized for mouse and keyboards use, capable of doing touch as well rather than two separate applications performing the same function," said Aga Guzik, Skype's head of desktop product marketing, in a blog post announcing the switch. Guzik said the move won't impact Microsoft's plans to offer Skype functionality within a specific app planned for Windows 10. "This way if you want to quickly make a call or send a message you can use task based apps and for those of you power users who like the advantages of the all in one app, you can pick what's right for you."

As Mary Jo Foley pointed out today in her All About Microsoft blog, "Microsoft's decision to kill off the Modern Skype client and move customers to the Desktop version seemed to fly in the face of what company officials have been telling developers and customers for the past several years," referring to its push toward Modern (aka Metro) apps.

Guzik told Foley that Universal phone, video and text apps based on Skype will be built into Windows 10, but they won't come with the initial version of the operating system that ships on July 29. Rather, Microsoft will release the three Skype-based Universal communications apps (chat, video and phone) for Windows 10 after launch.

Microsoft's focus on the Universal Windows Platform is built on the notion of a common code base for applications whether they run in desktop or touch modes. As such, perhaps this shift away from the Modern Skype app is indicative of Microsoft's plan to deemphasize the Windows 8 app model. Microsoft has said, though, that the modern Skype app will remain available to Windows RT users, whose systems can only run modern apps.

Posted by Jeffrey Schwartz on 06/15/2015 at 3:20 PM0 comments


Citrix Targeted by Activist Investor Elliott Management

Activist investor Elliott Management wants to meet with the top brass at Citrix, one of Microsoft's largest partners, to discuss options that would boost the value of Citrix's shares, the hedge fund manager said this week.

In a letter sent Thursday to Citrix CEO Mark Templeton, Chairman Thomas Bogan and the company's board of directors, Elliott indicated it wants to see the company improve its operations and spin off some of its assets. Given Elliott's 7.1 percent stake in Citrix, and its history of targeting companies it believes are undervalued because of how they're structured, the move pushed Citrix shares up 7 percent yesterday, and they were up another 2.6 percent midday Friday. With that kind of influence and holding, Citrix can't ignore the request.

In the letter, Elliott said it believes Citrix is significantly undervalued, suggesting its stock could be worth up to $100 per share by the end of next year. It was trading at $72 Friday but has jumped since Elliott released the letter Thursday, following a 13D filing, which is required when an investor acquires more than 5 percent of a publicly traded company's shares.

Elliott argued that the roughly 50 percent jump in value "is achievable because Citrix has leading technology franchises in attractive markets but has struggled operationally for years. As a result, today Citrix's operations and product portfolio represent an opportunity for improvement of uniquely significant magnitude." In a statement, Citrix responded: "We will review Elliott's suggestions and respond as we do with all shareholders who engage with us. The Citrix Board and management team continually evaluate ideas to drive shareholder value and are committed to acting in the best interests of all our shareholders."

Citrix stock had grown only 2.8 percent over the past year before this week's jump in response to Elliott's move. In its April earnings release, Citrix lowered its outlook for the year amid a 48 percent drop in profit stemming from its January restructuring. Citrix is betting big on its new Workspace Cloud platform, which aims to bring together key assets including ShareFile, XenDesktop, XenApp and Netscaler. Workspace Cloud, which Citrix emphasized at its annual Synergy conference last month in Orlando, can function as an organization's cloud control plane, bridging multiple public and private clouds, including core datacenter assets, to various client systems including a wide array of mobile devices.

According to various published reports, Elliott likely wants Citrix to shed or spin off some assets, reduce headcount or buy back shares.

Posted by Jeffrey Schwartz on 06/12/2015 at 12:08 PM0 comments


Many Enterprises Will Wait Before Deploying Windows 10

While Microsoft is expecting 1 billion devices to run Windows 10 within the next three years, the company had better not count on large enterprises to help hit that ambitious goal early. The results of a survey of enterprise Windows IT pros show that a vast majority will wait at least six months to begin deploying the new operating system and a substantial share will wait more than a year.

The survey, conducted at Microsoft's Ignite conference in Chicago last month by Windows management tools vendor Adaptiva, showed that 71 percent will wait at least six months and nearly half (49 percent) will wait more than a year to upgrade. Only 186 people participated in the survey, which would lead some to question its statistical validity, but Adaptiva Founder and CTO Deepak Kumar explained that all of the people interviewed are deeply involved in the management of their organizations' client systems.

Those responding to an online survey conducted by Redmond magazine at the beginning of this year found that nearly 41 percent planned to upgrade in the first year.

A vast majority (84 percent) of respondents used Microsoft's System Center Configuration Manager (SCCM) and 40 percent managed more than 10,000 nodes; 8 percent managed 100,000 nodes or more. The largest percentage of those surveyed (35 percent) managed between 1,000 and 10,000 systems, 19 percent had 100 to 1,000 systems and 7 percent were small shops with fewer than 100 systems. Not surprisingly, the larger organizations are the most inclined to wait.

"With the free upgrade and some of the technology they've put in for ease of upgrade, the expectation in the market is there will be this landslide of Windows 10 adoption," said Kumar. "The surprise for me, is what people are planning to do with Windows 10 is the same as they have done with every other version:  slow adoption."

The fact that organizations may wait is hardly a shocking news flash -- it's consistent with best practices analysts and consultants have given regarding major operating system upgrades since organizations first deployed Windows PCs. "People want to test it and have complete control where it goes out when it goes out," Kumar said.

Microsoft could still hit the 1 billion milestone if it's successful in convincing the vast number of consumers running Windows 7 and Windows 8 to take advantage of the free upgrade, along with the wide swath of new PCs and tablets expected to appear in the coming months.

Another noteworthy finding from the survey: only 11 percent had at least some machines still running Windows XP. This is a marked decline from a year ago at TechEd, where 53 percent claimed to still have PCs running the discontinued operating system. Not surprisingly, Windows 7 was the most prevalent operating system, with 84 percent saying their organization runs it. Windows 8 was in use, at least to some extent, by 57 percent of those organizations.

Posted by Jeffrey Schwartz on 06/11/2015 at 3:14 PM0 comments


Microsoft Acquires BlueStripe To Boost Heterogeneous Monitoring of New OMS

Microsoft today said it has acquired BlueStripe Software, a popular provider of software used to monitor the performance of infrastructure and help systems administrators track down bottlenecks at the transaction layer of applications. The companies didn't disclose terms.

While BlueStripe's performance management platform is popular among enterprises that use System Center Operations Manager, its wares appealed to IT managers for their ability to monitor events throughout the infrastructure and applications stack. The two companies only started working together a few years ago, but ever since BlueStripe began talking up the SCOM management packs it developed, I wondered whether Microsoft would swallow the company.

Microsoft said it will stop offering BlueStripe's products on a standalone basis -- temporarily, the company indicated -- and will absorb the technology into the new Microsoft Operations Management Suite. OMS, announced at last month's Ignite conference, is Microsoft's new cloud-based offering that will help manage the components of its new Azure Stack, which the company said will provide Azure-like functionality in the enterprise.

The new OMS, which will work with System Center but not require it, will also provide management of key components of an enterprise infrastructure, including (presumably) VMware's virtualization and vCloud Air public cloud, Amazon Web Services and OpenStack-based clouds. BlueStripe and its team could help Microsoft further achieve that goal with OMS. BlueStripe last year deepened its ties with Microsoft, offering a System Center Management Pack for its FactFinder service, and late last year integrated it with the Windows Azure Pack.

Despite its ties with Microsoft, BlueStripe also boasted the ability to monitor the impact of transactions in the performance of key applications such as IBM CICS and SAP R3, as well as other transaction-oriented applications running on everything from mainframes to RISC-based Unix systems and Linux-based applications.

"BlueStripe's solution helps map, monitor and troubleshoot distributed applications across heterogeneous operating systems and multiple datacenter and cloud environments. BlueStripe is commonly used today by customers to extend the value of Microsoft System Center by adding application-aware infrastructure performance monitoring," wrote Mike Neil, Microsoft's general manager, enterprise cloud. "Now that we have welcomed BlueStripe employees into the Microsoft family, we will be hard at work making BlueStripe's solution an integral part of Microsoft's management products and services, like System Center and Operations Management Suite (OMS). We will discontinue selling the BlueStripe solutions in the near term while we work on these updates, although we will continue to support existing BlueStripe customers during this time."

Certainly BlueStripe's customers will wonder what this means for the future of its performance management software as a standalone offering. While that remains to be seen, Microsoft said at Ignite that so-called "solution packs" will be a key part of OMS.

Posted by Jeffrey Schwartz on 06/10/2015 at 1:25 PM0 comments


Facebook Unfriends Microsoft

Microsoft apps are no longer linked to Facebook accounts. The news came in the form of an e-mail to millions of Microsoft account holders this morning. The social network severed the link between its Facebook Connect interface and Microsoft apps thanks to a change in the Facebook Graph API, which the company officially announced over a year ago.

The link connected contacts from Facebook accounts into Outlook.com and the Windows 8.1 People app. In addition to synchronizing contact info, Facebook Connect linked to various apps that were common among the two companies such as Microsoft's Photo Gallery, Movie Maker and OneDrive.com, allowing users to share content directly in Facebook.

"Due to these changes, Facebook Connect features will no longer be supported," Microsoft's e-mail stated. "If you've connected Facebook to your Microsoft account, your Facebook contacts will stop syncing to Outlook.com or the People app on your Windows devices, and the sharing options to Facebook will stop working as early as June 9th."

While Facebook didn't address the removal of support for Microsoft apps in last year's announcement, the company did say it was "removing several rarely used API endpoints."

Is the lack of this connectivity something you'll miss?

Posted by Jeffrey Schwartz on 06/09/2015 at 10:39 AM0 comments


Edward Snowden: No Regrets Two Years After First NSA Leaks

Two years ago, Edward Snowden took it upon himself to release classified documents that revealed widespread surveillance activities by United States and foreign governments, most notably the National Security Agency. In so doing, Snowden became one of the most wanted fugitives by U.S. law enforcement.  Yet he also became the IT industry's most famous martyr for putting the spotlight on how the government was "monitoring the private activities of ordinary citizens who had done nothing wrong."  

Snowden reflected on the impact of his actions in an op-ed column published last Thursday in The New York Times. The revelations of programs like PRISM outraged privacy experts and IT decision makers alike, making them wonder whether data stored in the cloud and their electronic communications were secure. To Microsoft's credit, Chief Counsel Brad Smith led an effort to fight back, as I noted last week following Congress's long battle to put some limits on the Patriot Act, which officially expired June 1.

"Privately, there were moments when I worried that we might have put our privileged lives at risk for nothing -- that the public would react with indifference, or practiced cynicism, to the revelations," Snowden admitted.  "Never have I been so grateful to have been so wrong." Despite violating key espionage laws, the former NSA contractor Snowden marveled at the rapid impact of his actions. "In a single month, the N.S.A.'s invasive call-tracking program was declared unlawful by the courts and disowned by Congress," he noted of the initial fallout of his revelations. "After a White House-appointed oversight board investigation found that this program had not stopped a single terrorist attack. Even the president who once defended its propriety and criticized its disclosure has now ordered it terminated. This is the power of an informed public."

Two years later, the beat goes on, he added, pointing to the impact of his ongoing leaks. "Ending the mass surveillance of private phone calls under the Patriot Act is a historic victory for the rights of every citizen, but it is only the latest product of a change in global awareness," he wrote. "Since 2013, institutions across Europe have ruled similar laws and operations illegal and imposed new restrictions on future activities. The United Nations declared mass surveillance an unambiguous violation of human rights. In Latin America, the efforts of citizens in Brazil led to the Marco Civil, an Internet Bill of Rights. Recognizing the critical role of informed citizens in correcting the excesses of government, the Council of Europe called for new laws to protect whistle-blowers."

For sure, the recent changes to the Patriot Act under the new Freedom Act are the equivalent of a Band-Aid on a bullet wound. Still, it's a step in the right direction. Snowden also credited the way the IT security industry has responded. "Technologists have worked tirelessly to reengineer the security of the devices that surround us, along with the language of the Internet itself," he noted. "Secret flaws in critical infrastructure that had been exploited by governments to facilitate mass surveillance have been detected and corrected. Basic technical safeguards such as encryption -- once considered esoteric and unnecessary -- are now enabled by default in the products of pioneering companies like Apple, ensuring that even if your phone is stolen, your private life remains private."

Despite the changes enacted in the wake of Snowden's actions, he still used his privileged access to classified systems to disclose their contents, making him, by the letter of the law, a traitor. The prospect of anyone in an organization having such unfettered access has also put IT decision makers on notice that their information systems are at risk. Initial reaction to the revelations two years ago made many organizations wary that data they presumed safe from prying eyes was subject to backdoor penetration by the Feds.

I recently asked noted security technology expert Bruce Schneier, whom I've known for many years, whether he believed Snowden was a traitor or a hero. The silence following my question was deafening, and I suddenly felt like a journalist asking a quarterback how he felt after throwing a Super Bowl interception that cost his team the game. While Schneier wasn't going to give me a sound bite, he gave me an opaque response. "I don't care about the person. What matters is the documents," he said. "That question will be answered in 20 years by history." OK, but did Snowden do the right thing by breaking the law and disclosing what was going on? "Yes," Schneier said.

Judging by his op-ed piece, Snowden appears to feel the same way, having marked the June 4 anniversary by revealing that the U.S. has expanded its spying initiatives at the border to find hackers. "The balance of power is beginning to shift," he concluded. "With each court victory, with every change in the law, we demonstrate facts are more convincing than fear."

Do you see Snowden as a traitor, or have his disclosures of wrongdoing made him worthy of vindication?

Posted by Jeffrey Schwartz on 06/08/2015 at 10:45 AM0 comments


Microsoft Extends Global Cloud Footprint with New Datacenters

Microsoft capped off the week by touting the latest expansion of its ever-growing global network of cloud datacenters. The company announced plans to add two datacenters in Canada and to expand facilities in the Netherlands and India.

In all, Microsoft said it will shortly have 22 cloud regions and datacenters for Azure and nine for Office 365 and Dynamics CRM Online. Microsoft COO Kevin Turner presided over this week's launch of the new Canadian datacenters, where the company announced it'll host its cloud services in Quebec City and Toronto sometime next year. "This data residency will provide Canadian businesses with improved latency and geographic redundancy to help them better operate and compete," said Takeshi Numoto, Microsoft's corporate VP for cloud and enterprise marketing, in a blog post today.

Microsoft pointed to an IDC forecast that projects spending on cloud computing services in Canada will grow 45 percent next year to $2.5 billion. Microsoft claims 80,000 businesses are already using the company's cloud services. Among those Microsoft referenced was Diply.com, a startup based in Ontario that uses Azure to push 850 million page views per day. Other key Canadian customers include AutoTrader, Genetec and PCL Construction.

The new datacenters in India will be available in preview mode next month, with general availability planned for the first half of next year, Microsoft Executive Vice President for Cloud and Enterprise Scott Guthrie said during a presentation in Bangalore Wednesday. "Services from local datacenters will open infinite computing capacity for Indian government departments, banks, telecom companies and enterprises of all sizes," Guthrie said. "This will help make Digital India a reality."

Posted by Jeffrey Schwartz on 06/05/2015 at 1:15 PM0 comments


Active Directory Certificate Services Advice Roils PKI Security Experts

The idea came off as a simple one: implement Microsoft's Active Directory Certificate Services (AD CS) and you'll be on your way to a more secure infrastructure. That was the premise of this month's Windows Insider column by Greg Shields, which was quickly criticized by some well-known Microsoft security MVPs and PKI experts.

Among the critics who posted comments were Brian Komar, author of a number of books and guides on Windows PKI security, including Windows Server 2008 PKI and Certificate Security, and Paul Adare, an MVP, consultant and trainer with a focus on enterprise PKI and Active Directory Rights Management Services (RMS) deployments. Several others weighed in, all lambasting Shields for suggesting that deploying AD CS will jump-start a more secure IT infrastructure.

"This is one of the worst security related columns I have ever seen," Adare wrote in the comments section. "Redmond magazine should be ashamed for allowing this to be published at all. As an MVP myself, and one who specializes in AD CS, I'm embarrassed by this." Komar suggested we should remove the article from our site. "It really should be pulled as it would only create disaster in a company as written," Komar  noted. "One of the worst articles I have ever read on PKI. This should be titled ADCS: Worst Practices."

Several commenters criticized Shields for suggesting that you should install the AD CS role onto an existing Active Directory domain controller. It turned out that was a typo, which has since been corrected; that appeased some critics but not others, who took issue with the column beyond the typo. Shields acknowledged and regretted that implying a quick AD CS deployment was a best practice wasn't the best phrasing, and in a rebuttal he said some of his critics missed his point.

"The comments below have succeeded in manifesting the opposite effect of what I had originally intended. My goal was to merely incent individuals to get started and to highlight a barest minimum of steps that might accomplish that -- even as those steps aren't, as y'all have stated, the very best ones," Shields responded. "There appears an unspoken assertion in these comments that the mere presence of this article presents something like a danger to society. But let's be rational adults here. Would any IT pro, experienced or no, seriously go about creating a PKI solution based solely on a handful of paragraphs in a trade magazine? Likely not."

Shields added that the point of his column was to get people started toward building more secure infrastructure and perhaps on the road to building a more extensive PKI. "And if they turn to Brian's book, Paul's community contributions or the body of PKI knowledge elsewhere, then I've accomplished my goal."

I reached out to David Strom, a longtime colleague and expert on networking, security, PKI and other issues, to get his take on the column and subsequent criticism. "There's an element of truth in both sides," said Strom. "Yes, CAs should be used more, and you should know what you are doing."

Posted by Jeffrey Schwartz on 06/04/2015 at 12:24 PM0 comments


Microsoft Praises Curbs on U.S. Surveillance

It was the latest bruising battle among lawmakers, but the U.S. Senate finally agreed on a compromise that reins in the Patriot Act, which had allowed government eavesdropping on telephone and electronic communications since the terrorist attacks of Sept. 11, 2001.

The new Freedom Act, signed by President Barack Obama last night hours after its passage by the Senate in a 67-32 vote, takes away the National Security Agency's authority to gather the calling data of millions of Americans and instead puts that data in the hands of the phone companies.

In the wake of Edward Snowden's disclosures two years ago of the scope of the NSA's surveillance efforts, which proponents say are critical to thwarting terrorist attacks, Congress and the president came under intense pressure from privacy advocates to stop the practice. Despite last month's Senate agreement to come up with a compromise that would water down the Patriot Act, which officially expired Monday morning, a strong coalition moved at the 11th hour to preserve it in the days leading up to the deadline.

Opponents of the new Freedom Act argue that it still doesn't go far enough to protect the privacy of Americans, but it is "an important step forward in striking a better balance between public safety and privacy," as Microsoft Chief Counsel Brad Smith noted in a post today.

"The USA Freedom Act will increase trust in technology by implementing essential reforms to the USA Patriot Act," said Smith in a blog post. "The legislation will ensure that the public is aware of what their government is doing by allowing companies to publish detailed transparency reports. Governments also need to act with proper accountability with proper regard for legal process and people's rights. The reforms of the FISA Court in the bill moves government accountability forward by increasing the transparency of its proceedings and rulings and introducing a process for amicus curiae. And the new law ends the bulk collection of data -- a program that a federal court recently struck down."

For its part, Smith, on behalf of Microsoft, took a leading role in bringing the IT industry together to push for the easing of surveillance. While praising the new Freedom Act, Smith renewed his call on the government to take further steps to ensure data privacy. "There's still more work to do, both here in the United States and internationally," Smith wrote. "High on that list is the creation of new international legal frameworks to tackle other important issues we face in ensuring the free flow of information around the world while respecting national sovereignty."

Passage of the new Freedom Act doesn't put an end to surveillance but keeps it out of the direct purview of the feds. Of course, if you have little trust in the phone companies' ability to better ensure your privacy, the new Freedom Act probably offers little solace.

Posted by Jeffrey Schwartz on 06/03/2015 at 1:35 PM0 comments


Windows 10 To Arrive July 29

Microsoft has finally revealed a delivery date for Windows 10. The official release date is slated for July 29. The company used the eve of the annual Computex conference in Taipei to make the announcement. It's not a surprise Microsoft chose Computex to make it official given that the event is the largest gathering of OEM systems (PC and device) manufacturers.

Millions of Windows 7 and Windows 8.1 users will be eligible for a free upgrade on that date, as Microsoft announced in January. Microsoft is letting eligible customers reserve the free upgrades starting today. A Windows 10 icon will appear to the right of the taskbar on eligible devices, allowing users to click on it at any time to initiate their reservation. Microsoft posted instructions on how the process works.

By releasing it in late July, Microsoft is making sure Windows 10 is available for the back-to-school season, which begins in earnest in August. When Microsoft said the OS will ship this summer, many wondered if that meant mid-September to target the fourth quarter holiday buying season. While indeed that's important, making Windows 10 available to students is equally --  if not more -- important.

"With Windows 10, we start delivering on our vision of more personal computing, defined by trust in how we protect and respect your personal information, mobility of the experience across your devices, and natural interactions with your Windows devices, including speech, touch, ink, and holograms," wrote Terry Myerson, Microsoft's executive VP for operating systems, on Microsoft's Windows blog. "We designed Windows 10 to run our broadest device family ever, including Windows PCs, Windows tablets, Windows phones, Windows for the Internet of Things, Microsoft Surface Hub, Xbox One and Microsoft HoloLens—all working together to empower you to do great things."

Following an eight-month technical preview, we also now know which features Microsoft will include in Windows 10 and which will get scrapped. Myerson emphasized the following components will in fact come in Windows 10:

  • Cortana: Microsoft's digital assistant will run on all versions of Windows and other platforms, although languages and device types initially will have limitations
  • Microsoft Edge: the company's new modern browser, which Myerson said offers "built-in commenting on the Web -- via typing or inking -- sharing comments, and a reading view that makes reading web sites much faster and easier."
  • New Apps: Photos, Videos, Music, Maps, People, Mail and Calendar. All are designed to conform to device form factors and can sync with OneDrive.
  • Windows Continuum: The ability for convertible tablet-PCs to easily switch modes from one to the other.
  • Windows Hello: The new logon technology that will enable devices to have FIDO-compatible biometrics as an alternative to password-based authentication.
  • Windows Store: Revamped and unified to support Microsoft's new Universal Windows Platform applications.
  • Windows 10 Home: Updates from Windows Update will automatically be available.
  • Windows 10 Pro and Windows 10 Enterprise: Users will have the ability to defer updates.

Those upgrading from earlier versions of Windows will have to prepare to download apps that were once included in the operating system, according to a Microsoft release note.  Here are some other noteworthy changes:

  • Viewing DVDs will require playback software.
  • Windows 7 desktop gadgets removed.
  • The Solitaire, Minesweeper and Hearts games preinstalled on Windows 7 will be removed; new versions of Solitaire and Minesweeper arrive as the Microsoft Solitaire Collection and Microsoft Minesweeper.
  • USB-based floppy drives will require download of manufacturer drivers.
  • Windows Live Essentials with OneDrive application removed and replaced with the inbox version of OneDrive.

In the release note, Microsoft also outlined system requirements, which include a 1 GHz or faster processor or system-on-chip (SoC), 1 GB of RAM for 32-bit systems and 2 GB for 64-bit systems, up to 20 GB of storage, DirectX 9 or later with a WDDM 1.0 driver, and a display with at least 1024x600 resolution.

Do you plan to upgrade to Windows 10?

Posted by Jeffrey Schwartz on 06/01/2015 at 10:17 AM0 comments


Satya Nadella Ranked Top Visionary Among High-Tech Leaders

Microsoft CEO Satya Nadella is the most influential leader in the high-tech industry, according to Juniper Research's annual ranking released Tuesday. Nadella bested the likes of Apple Chief Design Officer Jony Ive, Uber CEO Travis Kalanick, Netflix Cofounder and CEO Reed Hastings, Alibaba Founder and Chairman Jack Ma, Amazon Founder and CEO Jeff Bezos and Tesla Founder and Chairman Elon Musk.

Nadella's ability to surge past this eclectic and prominent group of technology executives is an unusual turn of events, given that many people have written off Microsoft as lacking the very vision on which Juniper bases its ranking.

"The rankings, which are based on Juniper's assessment of key criteria including vision, innovation and personal capital, noted that Nadella's implementation of 'Windows-as-a-Service' represented a fundamental change to Microsoft's OS-focused business model, resulting in a very different process of development at Redmond in future," according to Juniper Research's announcement of this year's survey.

Juniper Research's survey of tech visionaries may not be the most prominent of rankings, but it will be interesting to see if Nadella comes up more often when it comes to who's setting the agenda for the IT industry. And that's not a given, considering Nadella was named CEO just under 16 months ago.

While Nadella has set the pace for change in a short amount of time and has strived to show the world Microsoft still has some new tricks up its sleeve, his work has just begun and whether it will succeed remains to be seen. Nevertheless, this ranking is another sign that critics may not want to write Microsoft off just yet.

Posted by Jeffrey Schwartz on 05/27/2015 at 8:39 AM0 comments


VMware Showcases Plans for the Future of Workspaces on Citrix's Home Turf

In a move that appeared timed to rain on Citrix's parade, VMware last week demonstrated forthcoming technology that aims to provide the control plane for managing user workspaces. VMware revealed its hybrid cloud architecture for creating "next-generation" workspaces on its rival's home turf at Citrix's annual Synergy conference in Orlando.

VMware took the wraps off Project Enzo, a new platform that it claims will change the way IT organizations deploy and manage virtual desktop environments, just as Citrix demonstrated its new Workspace Cloud architecture to customers at Synergy. Similar to Project Enzo, Citrix Workspace Cloud aims to provide a control plane, based on a hybrid cloud architecture, for managing user workspaces on a variety of form factors ranging from PCs, phones and tablets to Raspberry Pi devices.

"Project Enzo is a new hybrid cloud-scale architecture that is designed to combine the economic benefits of cloud-based VMware virtual desktops and application technology with the simplicity of hyper-converged infrastructure to transform the IT experience," wrote Sumit Dhawan, VMware's senior vice president and general manager for desktop products and end-user computing, in a blog post announcing Project Enzo. "Project Enzo will enable the unified management of on-premises and cloud-based virtual workspace services (desktops and applications) through a single Web-based portal that will be available as a cloud service on VMware vCloud Air."

While there are certainly architectural differences between Citrix Workspace Cloud and VMware's Project Enzo, they appear to have the same goal in mind: being the center of deploying and securely managing user work environments on a variety of device types. The most noteworthy difference is that Project Enzo seems to prefer vCloud Air when it comes to providing the public cloud infrastructure. By comparison, Citrix executives went to great pains last week to emphasize that Citrix Workspace Cloud can run in any infrastructure as a service, including AWS, Microsoft Azure and IBM SoftLayer. Unlike VMware, Citrix doesn't operate a public cloud of its own, and when asked at Synergy if it planned to do so, executives indicated firmly that the company has no interest, given the massive investment required. Both companies are relying on cloud service provider partners to deliver these new platforms.

Each company also described its new architecture as designed to bring together and manage "hyper-converged" software-defined infrastructures. Microsoft has a similar vision with Azure Stack, revealed earlier this month at the Ignite conference in Chicago. Azure Stack will come with the next release of Microsoft's server operating system, Windows Server 2016. VMware's Dhawan said the technical preview for Project Enzo is now available.

A key component introduced with Project Enzo technical preview, according to Dhawan, is its VMware Smart Node technology, which integrates hyper-converged solutions. "Smart Node technology enables the intelligent orchestration and automation of common set-up, delivery and management tasks of virtual workspace services across the hybrid cloud," he said.

Apparently VMware's decision to rain on Citrix's parade by announcing Project Enzo was payback, as pointed out by The Register's Simon Sharwood, who recalled Citrix announcing new versions of its XenApp, XenDesktop and Receiver products at VMworld last year.

Posted by Jeffrey Schwartz on 05/21/2015 at 3:19 PM0 comments


AWS Outguns Azure in Gartner's IaaS Cloud Magic Quadrant

Rarely a day goes by when I don't receive a news release on behalf of a company announcing it was included in one of Gartner's Magic Quadrants as a leader. If I had a dollar for every one of those announcements I deleted, I could retire now. Today both Amazon Web Services and Microsoft notified me that their respective public cloud services were recognized as leaders in the research firm's infrastructure-as-a-service (IaaS) report.

Gartner pointed to Amazon and Microsoft Azure as the only "leaders" in its IaaS classification of cloud providers. Close followers, which Gartner describes as "visionaries," include CenturyLink, Google, VMware and IBM/SoftLayer. Among those Gartner described as "niche" providers were Rackspace, Joyent, Virtustream, Interoute, CSC, Dimension Data, Fujitsu, NTT Communications and Verizon.

Despite grouping AWS and Azure as the only leaders, Gartner singled out Amazon as the superior cloud provider. "AWS has a diverse customer base and the broadest range of use cases, including enterprise and mission-critical applications," the report said. "It is the overwhelming market share leader, with over 10 times more cloud IaaS compute capacity in use than the aggregate total of the other 14 providers in this Magic Quadrant."

The report said AWS still maintains a multiyear competitive advantage over both Microsoft and Google. A Microsoft spokeswoman said in a statement that Azure still offers more than twice as much cloud IaaS compute capacity in use as the aggregate total of the remaining providers in the Magic Quadrant other than AWS. Microsoft officials also frequently point out that, with 19 datacenters around the world, the company has more than AWS and Google combined.

"This speaks strongly to Gartner's belief that the IaaS market is quickly consolidating around a small number of leading vendors," she said. "Microsoft is seeing significant usage and growth for Azure with more than 90,000 new Azure customer subscriptions every month, more than 50 trillion objects now stored in the Azure storage system and 425 million users stored in Azure Active Directory. In addition to strong use of Infrastructure-as-a-Service capabilities, we're also seeing over 60 percent of Azure customers using at least one higher level service."

The report opined that "Amazon has the richest array of IaaS features and PaaS-like capabilities. It continues to rapidly expand its service offerings and offer higher-level solutions." Microsoft argued it's the only vendor identified as a leader in both Gartner's IaaS and PaaS categories. "We also are differentiated in our ability to enable customers to use these capabilities together in a seamless fashion," she said.  "For example, Azure Resource Manager enables a single coherent application model for IaaS and PaaS services and the Azure Preview Portal blends IaaS and PaaS seamlessly so that customers no longer have to work in multiple, disparate environments."

Gartner also pointed to reliability problems that have plagued Azure, including numerous outages, though it noted substantial improvements over the past year. "We are committed to applying learnings when incidents occur to prevent recurrences of similar interruptions and improve our communications and support response so that customers feel confident in us and the service," the spokeswoman said, pointing to Microsoft's Root Cause Analysis reports for the most recent improvements.

The report also urged those choosing Azure not to jump in too fast. "Customers who intend to adopt Azure strategically and migrate applications over a period of one year or more (finishing in 2016 or later) can begin to deploy some workloads now, but those with a broad range of immediate enterprise needs may encounter challenges," the report advised.

Microsoft said it has aggressive plans to add new features, and said Gartner even acknowledged as much in the report. And, the spokeswoman's statement added:  "Over the past 12 months, we've added more than 500 new features and services to the Azure platform, including robust IaaS and PaaS capabilities as well as offerings that enable consistency across on-prem and the cloud so customers can achieve the hybrid scenarios they demand."

At its Ignite conference last month, Microsoft announced extensive new hybrid cloud computing features coming in the form of the new Azure Stack, which the company believes will give it a further edge over both AWS and Google.

Of course different surveys and customer sets have their own benchmarks and criteria, as I noted last week, when Nasuni's third evaluation of major cloud providers gave preference to Microsoft Azure. Whether or not you give credence to Gartner's Magic Quadrants, it seems to match industry sentiment that AWS remains the dominant public cloud but Azure is a clear No. 2. Both companies would agree this race is far from over.

Posted by Jeffrey Schwartz on 05/20/2015 at 5:21 PM0 comments


Citrix Demos Its Next-Gen Virtual Cloud Workspace Platform

At its annual Synergy conference in sweltering Orlando last week, Citrix staked its future on the company's new Workspace Cloud and a number of new wares that aim to establish it as the purveyor of the modern digital workplace. The major focus of this year's conference was Workspace Cloud, a platform that aims to ease the design, deployment, orchestration and management of secure work environments for mobile workers.

While virtual desktops and apps account for a small percentage of the workplace computing environments deployed today, their usage isn't trivial. Moreover, it stands to grow in the coming years in new forms, including desktop as a service, as workers use more device types and access resources from more places, and as organizations look to better secure the information accessed by employees, contractors and even customers on these new form factors. The growth of hybrid cloud and the move to bring-your-own-device policies are also enabling these new environments.

Looking to extend its reach from its core strength of offering virtual desktop and application environments, Citrix first started discussing Workspace Cloud a year ago but only demonstrated it publicly at Synergy last week, where customers also began testing the new platform. The company hopes to make it generally available in the third quarter. Mark Templeton, Citrix's CEO, who is revered for his focus on engineering and user experience, showcased Workspace Cloud as the culmination of the company's effort to bridge public, private and hybrid clouds with the new ways people work across multiple device types. Templeton said the new digital workspace consists of Windows-based PCs, Macs, iPads, Android tablets, Chromebooks, new Linux-based systems and even embedded devices that enable Internet of Things-type environments.

"We think of Workspace as the core engine of the software-defined workplace," Templeton said in last week's keynote. "So if you don't do a great job with workspaces across all of those kinds of digital tools, then you're not going to have the engine of the software-defined workplace. And we know that everyone's workspace environment is different." The Citrix Workspace Cloud is based on a cloud delivery architecture similar to the company's BlackBeard reference architecture, which provides the service architecture to distribute XenDesktop and XenApp in hybrid cloud environments and RainMaker, which provides the orchestration across servers and nodes.

The control plane that powers Citrix Workspace Cloud is its ShareFile document sharing platform. Citrix, which acquired ShareFile in 2011, is  a smaller competitor to the likes of Box and Dropbox. But Citrix has spent the ensuing years building on the core ShareFile engine to enable it to become the control plane for the new Citrix Workspace Cloud, which the company describes as a management platform for creating mobile workspaces that include desktops, applications and data provisioned in a hybrid cloud environment that could consist of a private datacenter, as well as a public or private cloud.

A key component of Citrix Workspace Cloud is the Lifecycle Manager, which creates blueprints that ease the migration of earlier versions of XenApp to current releases and provides the ability for IT to deploy them in the new management platform. These blueprints "are effectively groupings of things that you need to do to define whatever workload it is you want to deliver," explained Christian Reilly, CTO for the Citrix Workspace. "And then obviously the management piece comes after that. I'm not talking specifically about just delivering XenApp and XenDesktop because that's a key short term focus. The power of blueprints is if you kind of expand that out to two worlds, one in dealing with blueprints that can group together with different parts of the network topology, different bits of the infrastructure that need to be orchestrated to create an application workload and blueprints that can then provision or talk to Netscaler or other devices to complete the configuration."

In keeping with its history of not running its own public cloud, Citrix is empowering its base of 1,900 cloud service providers to provision Workspace Cloud in any environment, including Amazon Web Services, Azure and IBM's SoftLayer cloud, among others. The control plane itself runs in Microsoft Azure, but Citrix officials insisted that no customer data or apps touch the control plane, or Azure in particular, unless they want it to.

While building the control plane on ShareFile, Workspace Cloud brings together the XenDesktop and XenApp platforms as well as networking gear such as Netscaler and CloudBridge. Stitching these together gives Citrix the opportunity to bundle -- and potentially upsell -- its wares, though Templeton said the architecture allows organizations to plug in their own components, such as Microsoft and VMware hybrid cloud infrastructure. Workspace Cloud is an ambitious effort by the company to move itself forward with a major new platform designed to create and manage secure user work environments tailored around workers' tendencies to use multiple and often nontraditional devices to access their Windows environments. In addition to launching Workspace Cloud, Citrix previewed several other new offerings in its pipeline, including extensions to its XenMobile enterprise mobility management platform, networking and security upgrades to its Netscaler and CloudBridge tools, and data loss prevention and other security improvements to its ShareFile enterprise file sharing offering. It also showed off new automation capabilities for its XenDesktop and XenApp platforms.

Attendees at Citrix Synergy last week seemed impressed with Workspace Cloud, though even its most visible customers said they need to understand how it might fit into their environments. "We will start playing with the beta," said David Enriquez, senior director of information technology for the Miami Marlins. "It looks to me something we could take advantage of such as spring training temporary deployments, if we have to do something at a minor league park or if we have an event at the ballpark that needs infrastructure but we don't want to put it on our existing infrastructure."

Posted by Jeffrey Schwartz on 05/19/2015 at 1:18 PM0 comments


Worldwide Build Tour Lures Developers for Universal Windows and Azure

Now that Microsoft has outlined its Universal Windows Platform and its new model for building and deploying modern applications for its Azure cloud, the company has hit the road to make its case to developers throughout the world.

The Build tour is taking place today in London and New York. These one-day developer confabs follow last month's annual Build conference, which took place in San Francisco. In the coming days and weeks Microsoft is hosting Build events in Atlanta, Berlin, Moscow, Tokyo and Mexico City, among numerous other locations.

I attended the New York event today where a few hundred developers took in a condensed version of the larger Build conference, featuring an overview of Microsoft's UWP and Azure strategy and demos on how to build and port apps to the new platform. Kevin Gallo, Microsoft's partner director for developer ecosystem and platform, gave the opening keynote address in which he called on developers to learn how they can enhance Android and Apple iOS applications for the new UWP by bridging them to Windows. At the same time, Gallo emphasized the ability to port legacy Win32 apps to the new UWP.

Key to the new UWP is the notion that developers can build their apps for Windows regardless of whether they're for a phone, tablet, PC or the new Surface Hub large conferencing system. "All these new APIs are to help create different user experiences on different devices," Gallo said.

Likewise, Microsoft's new embrace of open source platforms is carrying over to the Build tour, particularly in the new UWP, which includes the new Edge browser and native support for HTML5 and XAML, which Microsoft officials emphasized is appealing for building apps with responsive designs. Gallo explained how the new "bridge" capability addresses four key sources of code: the Web, Win32 apps, applications built in Objective-C for iOS, and editors for Android developers such as CodeMe. Developers "can use their Android tooling and port to Windows," Gallo said.

Gallo also showcased Microsoft's new Windows for IoT (Internet of Things). The new Windows IoT Core is available for Raspberry Pi, the popular low-cost small device platform, and for MinnowBoard Max, the open source hardware board for developing embedded apps on Intel Atom processors.

Building on Azure
In the second half of the Build tour keynote session, Neil Hutson, a Microsoft engineering evangelist, took the stage to talk about extensions to the Azure public cloud. In keeping with Microsoft's emphasis that Azure's not just for Windows and .NET apps, Hutson said that 25 percent of the instances running in the company's cloud are Linux based. "If you want to use a favorite language, platform, framework and operating system, we pretty much have you covered so you can do your best work with us," Hutson said.

While Microsoft has promoted that message for some time now, the next key selling point for Azure rests on its new Azure App Services. The notion behind Azure App Services is that it enables developers to build and deploy modern and mobile apps that can scale globally across the Azure cloud.

Hutson also gave airtime to the array of new data services hosted in Azure, ranging from what he described as a high-performance SQL DB to support for a variety of alternative SQL and NoSQL database types. He outlined three new features in Microsoft's Azure SQL DB service. First is Elastic Database Pool, which will allow customers to maintain separate isolated databases while aggregating activity to smooth peaks and defining strict performance SLAs. Second is support for full-text search. The third, and the most warmly received new feature based on audience applause, was support for transparent data encryption (TDE). "That means when data is at rest, it's fully encrypted," Hutson said. He also talked up the new Azure SQL Data Warehouse, which "lets you suck information from SQL DB, and aggregate on premises systems."
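To make the TDE piece concrete, here is a minimal sketch of what switching it on looks like from code, assuming a Python client using pyodbc against a hypothetical Azure SQL database named SalesDb. The server, database and credential values are placeholders, and depending on permissions the ALTER DATABASE statement may need to be issued against the logical server's master database; this is illustrative only, not anything shown at the event.

    # Minimal sketch: enabling TDE on a hypothetical Azure SQL database and
    # checking its encryption state. All connection values are placeholders.
    import pyodbc

    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=tcp:yourserver.database.windows.net,1433;"
        "DATABASE=SalesDb;UID=youradmin;PWD=yourpassword"
    )

    with pyodbc.connect(CONN_STR, autocommit=True) as conn:
        cur = conn.cursor()
        # TDE is enabled per database with a single T-SQL statement (depending
        # on permissions this may need to run from the server's master DB).
        cur.execute("ALTER DATABASE [SalesDb] SET ENCRYPTION ON;")

        # encryption_state 3 means the data is fully encrypted at rest;
        # 2 means encryption is still in progress.
        cur.execute(
            "SELECT DB_NAME(database_id), encryption_state "
            "FROM sys.dm_database_encryption_keys;"
        )
        for name, state in cur.fetchall():
            print(name, "encrypted" if state == 3 else f"state={state}")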

Circling back to the Internet of Things, Hutson also talked up how the various data services, including Azure Machine Learning and Event Hub, can connect to and process data from millions of devices in near real time.

Stream Analytics consolidates all of that data and allows users to create reports using the new Power BI, and they can store virtually unlimited amounts of data in the new Azure Data Lake. Hutson also underscored the new Office APIs that will enable interactivity between third-party apps and components of the productivity suite, especially Outlook. With the new Office APIs and Office Graph, developers can build native application experiences into Office 365, Hutson explained. Rather than toggling between Outlook and Salesforce.com, for example, users can now work with the two in an integrated experience, he said.
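As a rough illustration of the ingestion end of that pipeline, the sketch below pushes a couple of simulated device readings into an Azure event hub using the azure-eventhub Python SDK (a current SDK, not the tooling demonstrated at the event); the connection string and hub name are placeholders, and a downstream consumer such as a Stream Analytics job would pick the events up from there.

    # Illustrative only: sending simulated telemetry to an event hub so a
    # downstream job (Stream Analytics, for example) can consume it.
    import json
    from azure.eventhub import EventHubProducerClient, EventData

    CONNECTION_STR = (
        "Endpoint=sb://<namespace>.servicebus.windows.net/;"
        "SharedAccessKeyName=<policy>;SharedAccessKey=<key>"
    )
    EVENT_HUB_NAME = "devicetelemetry"

    producer = EventHubProducerClient.from_connection_string(
        CONNECTION_STR, eventhub_name=EVENT_HUB_NAME
    )

    with producer:
        batch = producer.create_batch()
        for device_id in ("sensor-01", "sensor-02"):
            reading = {"deviceId": device_id, "temperature": 21.7}
            batch.add(EventData(json.dumps(reading)))
        producer.send_batch(batch)  # events are now visible to any consumer group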

Microsoft knows developers will be critical to the success of UWP, Azure App Services and the next generation of Office. If the road show comes to your neighborhood, you may want to learn the details and decide for yourself.

 

Posted by Jeffrey Schwartz on 05/18/2015 at 11:20 AM0 comments


Microsoft Azure Blob Storage Slightly Edges Amazon S3 in Latest Shootout

Companies looking for a large global cloud provider to store significant amounts of data will do well choosing either Amazon Web Services or Microsoft Azure, with the latter performing slightly better according to Nasuni's third biennial cloud storage report. Google, the only other cloud provider with a large enough global scale that could be compared with the two by Nasuni's standards, came in a distant third place.

It's important to keep in mind that this is one benchmark by a single provider with its own requirements -- primarily using a large global cloud provider as a NAS target. But Nasuni performs the tests to determine which services it should use as a storage target and claims it's not wedded to any one player, unless a customer specifies one. Nasuni first began sharing its benchmarks in 2012, when AWS had an overwhelming edge, though that was before Microsoft had a mature infrastructure-as-a-service offering available.

Today, depending on the service, Nasuni primarily distributes its workloads between AWS and Azure and is always willing to add or shift to other suppliers. Nasuni currently prefers Microsoft Azure Blob Storage and Message Queue, though it uses AWS' DynamoDB database and EC2 compute instances, said John Capello, Nasuni's VP of product strategy. The primary test Nasuni conducted between October and February for the report evaluated a variety of read, write and delete scenarios, according to Capello.

"For our purposes, which is to write files for mid-sized to large enterprises to the cloud, Microsoft Azure Blob storage is a better target for us than Amazon or Google," he said. "Amazon is a very, very close second. Amazon and Microsoft seem to be, as many others have said, the real two competitors in this space in providing cloud services in general, but specifically with storage, they're very, very close in terms of both their speed, availability and their scalability."

According to the report, which is available for download if you're willing to give up your contact information, Microsoft outpaced Amazon and Google when it came to writing data to a target in 13 of the 23 scenarios of varying thread counts or file counts. When it came to reading files, Microsoft consistently performed better, though not to the extent it did in the write tests. Microsoft was twice as fast as Amazon when it came to deleting files and five times as fast as Google.

For system availability, Amazon's average response time of 0.1 seconds slightly edged Microsoft's 0.14 seconds, while Google was roughly five times slower. Nasuni also measured scalability by writing 100 million objects and tracking the number of read and write misses: "Microsoft had, by far, the largest write variance, which was more than 130 times larger than Google's, who had the smallest variance." Read and write errors were almost nonexistent, according to a summary of the report. "Only Amazon showed any misses at all: five write errors over 100 million objects, which gives an error rate of .00005 percent."
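For a sense of what those scenarios exercise, here is an illustrative (and deliberately tiny) version of a write/read/delete round-trip against Azure Blob Storage using the azure-storage-blob Python SDK. It is not Nasuni's harness; the connection string, container and object names are placeholders, and the container is assumed to already exist.

    # Illustrative only: one timed write/read/delete cycle of the kind the
    # report benchmarks at far larger scale and thread counts.
    import os
    import time
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONN"])
    blob = service.get_container_client("benchmark").get_blob_client("object-0001")
    payload = os.urandom(1024 * 1024)  # 1 MB test object

    start = time.perf_counter()
    blob.upload_blob(payload, overwrite=True)          # write
    assert blob.download_blob().readall() == payload   # read back
    blob.delete_blob()                                 # delete
    print(f"round trip took {time.perf_counter() - start:.2f}s")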

Nasuni omitted several key players from the test, notably IBM's SoftLayer, which was undergoing system upgrades that led to frequent periods of planned downtime during the testing period, according to Capello. HP was also initially in the test, though Capello said Nasuni chose to leave the company out this time because of HP's announced changes to its cloud strategy. "Before we decided we weren't going to continue testing them, they actually did surprisingly well, in some cases -- better than Amazon and Microsoft in some of the read-write and delete benchmarks," he said. "If we had run the full test, it would be interesting to see where they came out."

 

Posted by Jeffrey Schwartz on 05/15/2015 at 10:56 AM0 comments


Microsoft Sets the Stage for Patch Tuesday To 'Go Away'

Yesterday was the latest Patch Tuesday, the ritual that takes place on the second Tuesday of every month. But its days could be numbered. Patch Tuesday won't disappear anytime soon, but the release of Windows 10 will set the stage for its ultimate transition to a different release cadence.

Microsoft said at last week's Ignite conference that its procedure for issuing security patches will change with the new OS's release via new "distribution rings" similar to the fast and slow rings offered with the Windows 10 Technical Preview. The company describes it as part of its Windows-as-a-service transition. It will only apply to Windows 10 and future operating systems, given the change in the way Microsoft builds software.

"Our goal is patch Tuesday will go away," said Stella Chernysak, a senior director for Windows Commercial at Microsoft, during an interview last week at Ignite. "Windows Update for Business essentially means the challenges customers have historically had with their patching will be easier to address."

Chernysak said this will let Microsoft issue patches as needed and allow organizations to better automate how they apply those patches, while at the same time allowing for customers to maintain a predictable process.

"This is our new foundation for us to deliver Windows as a service," she explained. Microsoft will start delivering the first set of Windows Update for Services this summer with additional functionality to be added in the fall.

 

Posted by Jeffrey Schwartz on 05/13/2015 at 11:50 AM0 comments


Device Guard Joins Windows Hello and Passport To Lock Down Windows 10

Microsoft's effort to displace passwords with its forthcoming biometrics-based Windows Hello and Passport technology has received a fair amount of attention over the past few weeks. But Microsoft has another new technology slated for Windows 10, called Device Guard, which aims to further protect Windows from malware and from known and new advanced persistent threats.

Device Guard, announced at last month's RSA Conference in San Francisco, will be an option for those who want deeper protection against APTs and malware in instances where intruders get in. Device Guard uses hardware-based virtualization to block the ability to execute unsigned code. It does so by creating a virtual machine that's isolated from the rest of the operating system. Device Guard can protect data and applications from attackers and malware that have already managed to gain access, according to Chris Hallum, a senior product manager for commercial Windows client security at Microsoft.

"This gives it a significant advantage over traditional antivirus and app control technologies like AppLocker, Bit9, and others which are subject to tampering by an administrator or malware," Hallum explained in an April 21 blog post. "In practice, Device Guard will frequently be used in combination with traditional AV and app control technologies. Traditional AV solutions and app control technologies will be able to depend on Device Guard to help block executable and script-based malware while AV will continue to cover areas that Device Guard doesn't such as JIT based apps (e.g.: Java) and macros within documents. App control technologies can be used to define which trustworthy apps should be allowed to run on a device. In this case IT uses app control as a means to govern productivity and compliance rather than malware prevention."

Device Guard protects against malware and zero-day attacks targeting Windows 10 by allowing only trusted apps to run: those signed by software vendors, delivered through the Windows Store or signed as internally developed software, according to Hallum. "You're in control of what sources Device Guard considers trustworthy and it comes with tools that can make it easy to sign Universal or even Win32 apps that may not have been originally signed by the software vendor," he explained.

When I met with Hallum and Dustin Ingalls, group program manager for OS security, at the RSA Conference in San Francisco last month, we primarily discussed Windows Hello and Passport, which Microsoft is hoping will replace passwords by enabling biometric authentication. Device Guard is not quite as sexy since it'll be invisible to individual end users but will allow enterprise IT administrators to make it impossible for attackers to execute code not recognized by Device Guard, Ingalls explained.

The VM created by Device Guard forms what Ingalls called a "tiny OS" where the operating system's decision-making components are isolated. "We take the actual critical integrity components and move those out of the main OS," Ingalls explained. "Now we have [an] operating system that's much more difficult to compromise. On top of that we make use of a feature called user mode integrity, which they know is vetted in the Windows Store."

Stella Chernysak, a senior director for Windows Commercial at Microsoft, described Device Guard as similar to BitLocker in concept. "Device Guard will be on business systems, where IT has an opportunity to turn it on," Chernysak explained in an interview last week at Microsoft's Ignite conference in Chicago. "It will be an option for IT to take advantage of that feature or IT may make the decision to ask an OEM or partner to turn it on."

Posted by Jeffrey Schwartz on 05/11/2015 at 12:48 PM0 comments


Will Microsoft Really Bid on Salesforce.com?

As rumors surfaced this week that Salesforce.com is seeking bidders to acquire the company, possibly including Microsoft, they were barely noticed at Chicago's McCormick Place, where the Ignite conference was taking place. The mere mention of them garnered uninterested looks, as attendees were focused on taking in the wave of new wares Microsoft rolled out.

Though shares of Salesforce.com's stock were halted briefly on Tuesday when they spiked on the rumors, the reports don't suggest Microsoft has actually even made a bid, only that there are at least two bidders and that Microsoft could be one of them and SAP the other. So the speculation swirling around Wall Street is just that, though oftentimes speculation does turn into reality.

When I first read about the rumors, my reaction was that it does seem rather far-fetched that Microsoft would shell out as much as $60 billion to acquire Salesforce.com, even if it is the leading supplier of cloud-based software-as-a-service applications. One has to wonder if the huge investment and cash outlay would be worth it, considering such deals rarely have the upside the architects envision.

Considering Salesforce.com's market cap is about $49 billion, a deal to acquire it with the typical premium could reach $60 billion, give or take a few billion. Salesforce.com's 2014 revenues were just over $4 billion, with guidance for this year of $5.3 billion -- growth of roughly 30 percent. While that's not shabby, last year's annual revenues increased 35 percent, suggesting growth is slowing. Another problem: despite its revenue growth, Salesforce.com lost $263 million last year. Microsoft also has competing, though less successful, product lines, such as Dynamics and Yammer.

On the other hand, acquiring Salesforce.com, which is hugely popular with enterprises, could accelerate Microsoft's shift in transitioning into a SaaS provider and extend its developer footprint into the open source community. It would also give Microsoft an even larger presence in Silicon Valley.

Some of that upside notwithstanding, does Microsoft need to bet the farm on Salesforce.com when it could use that cash hoard in more fruitful ways?

Posted by Jeffrey Schwartz on 05/08/2015 at 12:30 PM0 comments


Microsoft Transitions Cloud OS to New Azure Stack

When Microsoft rolled out Windows Server 2012 and System Center 2012, the company branded them, along with the then-new Azure Pack, as its Cloud OS. While Cloud OS indeed provided the building blocks to build Azure-like clouds in private datacenters and at third-party hosting providers, many say it's not seamless. Azure itself is also a very different cloud than it was in 2012.

Cloud OS is a generic term used by a number of other providers including Cisco and Hewlett Packard. You can expect to see Microsoft phase out the Cloud OS brand that described its approach to Windows Server and System Center in favor of the new Azure Stack. Along with the new Operations Management Suite, which enables management of multiple servers, clouds and virtual machines, Azure Stack is a product that substantially advances on the Azure Pack in that it aims to allow enterprises and hosting providers to build and manage cloud infrastructure that truly mirrors the functionality and experience of the Azure public cloud.

While Cloud OS as an amalgamation of Microsoft's datacenter software offerings didn't quite live up to its billing, Microsoft officials were confident at Ignite that Azure Stack, along with its new operating system software, including the new Nano Server configuration and containers, will enable a common infrastructure for on-premises datacenters and Azure. Time will tell whether Microsoft delivers on that promise, but Azure Stack will come next year with Windows Server 2016 and System Center 2016, Microsoft officials explained here in Chicago this week. Corporate VP Brad Anderson introduced Azure Stack Monday in the opening keynote of Ignite.

"This is literally us giving you all of Azure for you to run in your datacenters," Anderson said. "What this brings you is you get that great IaaS and PaaS environment in your datacenters. You have incredible capability like a unified application model that gives you a one-click deployment experience for even the most complex, multitier applications and then you get that cloud-inspired infrastructure. We're giving you the same software controller that we built for our network, the name is the same, network controller.  We're giving you our load balancing. We're giving you all the storage innovation."

Microsoft released the second technical preview of Windows Server 2016 Monday, and the Azure Stack is slated for a later test version. Ryan O'Hara, a Microsoft program director, explained in an Ignite briefing Tuesday that the Azure Stack will offer more features than the Azure Pack. Among other things, it will offer all of the services of both IaaS and PaaS and all of the Azure management tools. "We think about Azure Stack as the delivery of Azure innovations on premises," O'Hara said.

In Monday's keynote, Jeff Woolsey, a Microsoft senior technical product manager, demonstrated the Azure Stack. "You see the same IaaS virtual machines, the same network interfaces, the same public IP addresses, the same BLOB storage, the same SQL [and] the same role-based access control both in Azure and in Azure Stack," he said. Through the Azure Portal, Woolsey showed how such Azure services as networking, compute and storage, as well as Azure's software-based load balancers, software-defined network controllers and the distributed firewall, carry over into the Azure Stack. "We've packaged those up and put those in the Azure Stack for you so you're getting those same software-defined networking capabilities," he said.

Azure Stack will be a key component of the next version of Windows Server but it will be a separate offering. As it rolls out, we'll see if this provides the true vision of the hybrid cloud platform formerly known as Cloud OS.

Posted by Jeffrey Schwartz on 05/06/2015 at 12:10 PM0 comments


Microsoft Kicks off Inaugural Ignite Event with Barrage of New Tech for IT Pros

Days after courting developers to build apps for its new Universal Windows Platform at the Build conference in San Francisco, Microsoft deluged more than 23,000 IT pros attending its inaugural Ignite conference in Chicago with a barrage of new offerings to manage and secure the new platform and the entire IT stack.

Ignite kicked off today with a three-hour keynote headlined by CEO Satya Nadella, who talked up how the company's new wave of software and cloud services will enable IT and business transformation in line with the ways people now work. He also highlighted the need for better automation of systems, processes and management of the vast amount of data originating from new sources such as sensors.

Among the new offerings revealed during the keynote presentation were: Azure Stack, which brings Azure's IaaS and PaaS cloud to on-premises datacenters; Microsoft Operations Management Suite, to administer multiple server OSes including Linux, clouds and VMs; and Windows Update for Business, all while making the case for Windows 10 for enterprise users.

Nadella talked up the company's focus on "productivity and platforms" tied with the shift to cloud and mobility, saying everything Microsoft offers aims to bring all of that together in line with the changes in the way people work and new data types generated from sensors and other Internet-of-Things-type nodes.

"Every layer of the IT stack is going to be profoundly impacted," Nadella said in the keynote session. "This sets up our context. It sets up the tension we have as we set out to manage this IT landscape. "We want to enable to our IT professionals and end users to make their own choices for their own devices, yet we need to ensure the security, the management. We want to enable our business units to choose the SaaS applications of their choice, yet we want to have the compliance and control of efficiency."

Nadella emphasized three themes: making personal computing more personal and secure, bringing together productivity and process and providing more agile back-end infrastructure. Just about everything Microsoft offers will be updated.

SQL Server 2016 will be "the biggest breakthrough in database infrastructure," with a technology called Stretch allowing a single table to stretch from the datacenter to Azure. Microsoft released the second preview of Windows Server 2016 and is readying System Center 2016 "to make it possible for you to have Azure in your datacenter which is consistent with the public cloud Azure," Nadella said. The new Microsoft Operations Management Suite will do for datacenter administration what the Enterprise Mobility Suite provides for client device management, said Corporate VP Brad Anderson.

The company also gave major airtime to new security wares, including the release of the new Advanced Threat Analytics tool, which, among other things, monitors activity in Active Directory logs. The company also is moving from its traditional Patch Tuesday delivery of security updates, which takes place on the second Tuesday of every month, to "rings" of security releases that will start with the delivery of Windows 10.

For the most part, Microsoft emphasized its new release wave and how it will integrate with key platforms, notably iOS and Android. But in a departure, Windows chief Terry Myerson couldn't resist talking up Microsoft's added security features in Windows, and the company's new wares to keep Windows even more secure, taking a shot at Google: "Google just ships a big pile of [pause for emphasis] ... code, and leaves you exposed with no commitments to update your device." The remark was intended to showcase Microsoft's new focus on providing regular security updates for Windows.

Joe Belfiore, corporate VP for Microsoft's operating systems group, showcased the new Windows Hello technology, tied to the company's new Passport authentication service, coming to Windows 10. While Windows Hello will support multiple forms of biometrics, Belfiore demonstrated facial recognition being used to authenticate into Windows 10. Belfiore also demonstrated many popular features from Windows 7 that will reemerge in Windows 10, as well as new features like Cortana, the new personal assistant that will provide answers to questions. "My mission is to convince you and give you the tools with the belief your end users will love and desire Windows 10," Belfiore said.

In coming posts, we'll drill down into these new offerings, which represent much of Microsoft's product waves expected in the next six-to-12 months.

Posted by Jeffrey Schwartz on 05/04/2015 at 2:08 PM0 comments


Can Android and iOS Bridges Really Save Windows Phone?

Microsoft spent the last two days trying to convince its own and the rest of the software development community that building applications for its new Universal Windows Platform (UWP) will let them create innovative and competitive apps. Indeed, providing a common architecture for PCs, tablets and Xbox is a lofty goal with promising implications. Likewise, HoloLens, Microsoft's augmented reality headgear, is a worthy attempt to create new capabilities, though its success remains to be seen. The move to provide interfaces that will let Android and iOS developers extend their apps to Windows -- and vice versa -- raised eyebrows this week. It could be a last-ditch effort to save Windows Phone from fading into obscurity, but even if it can't save Microsoft's struggling smartphone, UWP could still be a hit in other ways.

In short, UWP will support everything from legacy Win32 apps in the new Windows Store to Web, Android and iOS apps. Michael Domingo, editor in chief of Redmond magazine sister publication Visual Studio Magazine, is at the Build conference this week. Domingo explained the various tools that will create these bridges. Among them is Project Astoria, the Android runtime bridge, which can be used from the Android Studio IDE to refactor Android app code for the Windows 10 platform. It will include a Windows emulator and is supposed to allow for debugging and testing of apps from either the Android IDE or Visual Studio IDE. The new Project Islandwood toolkit is an iOS bridge for developing from Objective C. Myerson demonstrated some of the progress his group has made with the tool, showing the ability to debug and test Xcode projects from within the Visual Studio IDE. Project Centennial is aimed at Windows developers who want a shortcut for recasting current .NET and Win32 Windows apps for the newer Windows Store.

"Windows 10 is going to enable you to reuse your Web code, your .NET and Win32 code, your Android, Java and C++ code, to build amazing new applications, bringing the code over, extending it, putting it in the Windows Store and reaching 1 billion Windows 10 customers," said Terry Myerson, executive vice president and leader of Microsoft's Windows team, in Wednesday's opening keynote at Build, held in San Francisco. Likewise, "you will be able to compile the same Objective C code that's being used in iOS applications within Visual Studio on Windows, enabling you to leverage that code and extend it with the capabilities only found on the Windows platform."

David Treadwell, a corporate VP for operating systems at Microsoft, yesterday demonstrated how Windows 10 will provide a bridge for the Universal Windows Platform and store.  "Apps written to these classic platform technologies will be able to be packaged and deployed with AppX," Treadwell said. "You'll get the same fast, safe, trusted deployment as apps written to the Universal Windows Platform."

Critics were quick to question how well Android and iOS apps will work on UWP, particularly Windows Phone. "Okay programmers, what do you get when you run something in emulation?" asked blogger Steven J. Vaughan-Nichols in a ZDNet post. "That's right. You get slow performance."

Vaughan-Nichols, an expert on the open source community and Microsoft critic, had a more fundamental question: "If you're a Windows Phone or RT developer, may I ask why?" pointing to its below 4 percent and falling market share. "Microsoft has handed the keys to the Windows Mobile kingdom to Android and iOS programmers. Whether those developers will bother with it is another question. After the first flush of excitement, they too will face considerable technical and market problems getting their apps profitably on Windows. I think Microsoft is making a desperate play to stay relevant in the mobile space with its own operating system and it's one that's destined to fail."

Key to disproving the Vaughan-Nichols theory will be the ability to bridge these apps with ease, agility, speed and with no degradation in performance. Now that Microsoft has built it, will the developers come?

 

Posted by Jeffrey Schwartz on 05/01/2015 at 12:18 PM0 comments


Microsoft Build: 1 Billion Devices Predicted To Run Windows 10 In Next Few Years

Microsoft believes its new Windows 10 operating system will find its way onto 1 billion PCs, tablets, phones, Xbox gaming consoles and emerging device form factors like its HoloLens by fiscal year 2018, which begins in just over two years. Terry Myerson, executive vice president for Microsoft's Windows group, made the bold prediction in part of the opening keynote presentation at the annual Build conference which kicked off today in San Francisco.

But convincing developers to build applications for the new Universal Windows Platform and its application store will be critical if Microsoft is to achieve that goal. By providing a common code base for different form factors, Microsoft believes it will give customers an appealing reason to embrace Windows 10.

In opening remarks, Microsoft CEO Satya Nadella made the case for Windows 10. "Windows 10 represents a new generation of Windows built for an era of more personal computing from Raspberry Pi (the low-cost single-board computer) to the holographic computer," Nadella said.

"Universal Windows apps are going to enable you to do things you never thought were possible," Myerson said. "With Windows 10 we are targeting the largest device span ever. We're talking about one platform -- a single binary that can run across all these devices." While Microsoft has talked up that theme for some time, Myerson announced four key developments that could further embolden Windows to developers and consequently millennials who tend to gravitate to other computing and device platforms.

Perhaps most noteworthy is the ability to port application code for iOS and Android to the new Universal Windows Platform. Windows phones will include an Android subsystem that apps can be written against, and extensions to Windows will enable those Android apps to be extended with Windows capabilities, Myerson said. Developers will be able to bring the code over, extend it and put it in the Windows Store, "reaching 1 billion Windows 10 customers," he said.

Myerson also announced developers will be able to compile the same Objective C code used to build Apple iOS apps for iPhones and iPads within Visual Studio on Windows, "enabling you to leverage that code and use capabilities only found on [the] Windows platform."

Addressing legacy Windows applications, Myerson announced that developers will be able to turn existing Web sites into Universal Windows apps by reusing their server-hosted code and tools. "Developers will be able to give Web sites live tiles, integrate with Xbox Live and more," Myerson said. Developers can also now enable Cortana notifications, he noted.

Microsoft is also adding support for .NET and Win32 apps into the Windows Store, enabling these apps to take advantage of all of the Universal Windows platform capabilities. It does so using the learnings from Microsoft's App-V technology that lets developers run their applications in virtual environments. Adobe said its Photoshop Elements and Illustrator will be available in this environment.  

The ability to run iOS, Android, legacy Win32 and .NET code could address key barriers to Windows but what will ultimately make Windows 10 fly is the ability to deliver capabilities not currently available. Much of that is now in, or coming into, the hands of developers.

Posted by Jeffrey Schwartz on 04/29/2015 at 2:06 PM0 comments


AWS Is Surprisingly Profitable: Now What?

It's been a long time coming for Amazon.com investors who have grown increasingly impatient with the drag its cloud computing business has imposed on profits but the company last week gave them some good news. Under pressure to give a more detailed breakdown of revenues and profitability of its Amazon Web Services subsidiary, the company caved and promised it would share that information commencing with last week's Q1 2015 earnings report.

In its first disclosure on AWS, Amazon said its cloud computing subsidiary is a $5 billion business that's growing. Specifically, it posted $1.57 billion in revenue for the period, a 49% year-over-year increase. Based on analyst estimates that would put AWS on a $6 billion run rate, according to Redmond's new sister site AWSInsider. The most surprising revelation from Amazon's earnings report was that AWS is profitable. AWS had a margin of 17 percent. Macquarie Analyst Ben Schachter told The Wall Street Journal that AWS "is significantly more profitable than we expected." 

Noting the 15 percent jump in the company's stock on the news, Finro equity analyst Lior Ronen today was among a number of observers suggesting that Amazon spin off AWS. In a Seeking Alpha blog post, Ronen said that, based on AWS' $1.57 billion revenue for the quarter and 11 percent quarterly growth, the AWS segment is on a $6.9 billion annual revenue run rate. Assuming a price-to-sales ratio ranging from 7 to 10, AWS is worth between $48 billion and $69 billion, Ronen predicted.
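The back-of-the-envelope math is easy to reproduce. The sketch below is one plausible reading of Ronen's figures (annualize the quarter, apply a year of 11 percent growth, then bracket the result with 7x and 10x sales multiples), not his actual model.

    # Rough reconstruction of the valuation arithmetic described above
    # (an illustrative reading of the figures, not Ronen's model).
    quarterly_revenue = 1.57e9            # reported Q1 2015 AWS revenue
    growth = 0.11                         # assumed quarter-over-quarter growth

    run_rate = quarterly_revenue * 4 * (1 + growth)
    print(f"forward run rate: ${run_rate / 1e9:.2f}B")       # ~$6.97B, i.e. roughly $6.9B

    for multiple in (7, 10):              # price-to-sales multiples
        valuation = run_rate * multiple
        print(f"{multiple}x sales: ${valuation / 1e9:.0f}B")  # ~$49B and ~$70B
    # Close to the $48 billion to $69 billion range cited above.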

"By spinning AWS, Amazon will be able to create two tech giants -- one focused on e-commerce and online retail business and the other on cloud computing and IaaS services," he said. "Amazon could leverage the two companies to create a whole that is bigger the sum of its parts: AWS could focus on its niche, develop new revenue streams, and invest further in its technology, while Amazon could do the same on its e-commerce platform. That is the only way Amazon could create a sustainable growth for the long term and employ the advantages it has in both businesses."

However, equity analyst James Brumely was among those skeptical about AWS' long-term prospects. In a separate Seeking Alpha post, Brumely argued that as cloud services become more commoditized, future margins for AWS will come under pressure. Brumely also said that Google and Microsoft will continue to put pressure on Amazon. "Even as exciting as unexpected operating profits are for the Amazon Web Services (AWS) arm of the e-commerce giant, it doesn't change the fact that the company still lost money last quarter, nor does it change the fact that margins for AWS are more likely to continue to shrink rather than widen as cloud-computing continues to become commoditized," he said.

Furthermore, the 17 percent margin isn't as impressive as it seems, he argued, pointing to the fact that 291 of the companies in the Fortune 500 have operating margins of 15 percent or higher. More alarming, he said, is the fact that the 17 percent margin represents a marked decline from the 23 percent margin posted in the same quarter last year.

"What happened?," he asked. "In simplest terms, Amazon (in a very Amazon-esque manner) has decided to become and remain the low-price leader with the cloud-storage world, and didn't worry about making much -- if any -- profit in the business. As turns out, it still made some operating profit as a cloud-computing provider, but it's making progressively, relatively less as time moves along."

The findings from Amazon's AWS stats may be vague, but the disclosure is a noteworthy step not just for investors but for buyers of cloud infrastructure services, who -- while looking to get the best deal possible -- surely don't want to see their provider lose money indefinitely. And just as competitors tend to respond to one another's pricing moves, it'll be interesting to see if Microsoft, Google, IBM and others follow suit.

 

Posted by Jeffrey Schwartz on 04/27/2015 at 7:50 AM0 comments


Bugs in New Windows 10 Preview Could Test Your Patience

Testing beta software is always fraught with unexpected challenges but the new Windows 10 Technical Preview Build 10061, released last week, might test your patience. If you've already downloaded it, you know what I mean. If you're not on the "fast ring" release cycle of the Windows Insider program and haven't seen it, prepare to roll up your sleeves.

Microsoft gave a heads-up about some of the bugs that are always present when we agree to test beta software. None of the problems seems insurmountable. The most obvious issue is that Win32 apps, including Microsoft Office, won't launch from the Start Menu. Microsoft was aware of this when it released the new preview but wanted to showcase the new features and tweaked look.

There's an easy fix for this problem, as Gabe Aul, chief of the Windows Insider program, explained in last week's blog post announcing the release. "We know this one will be a bit painful but there is a bug with this build in which Win32 (desktop) apps won't launch from the Start Menu," he explained. "The workaround is to use search to find and launch these apps and pin them to your taskbar for quick access." Once you do that, launching applications will be fine.

If you liked using Microsoft's new browser, code-named Project Spartan, it may appear that it was pulled from the Technical Preview, along with the beta of the new Windows Store. Both are still there -- you just have to find them and "repin them to your Taskbar from All apps on your Start Menu," Aul said.

Despite some of these issues, Microsoft wanted to showcase some of the newest features coming to Windows 10. Among them, according to Aul, are:

  • Start Menu that users can resize
  • Black theme across Start Menu, TaskBar and Action Center
  • Taskbar optimized for tablets. In tablet mode the size of the Start button, Cortana and Task View buttons increases making them optimized for touch
  • Boot-to-tablet mode is the default setting for tablets smaller than 10 inches
  • Virtual desktops: Users can now create as many as they need

Fixes from previous build include:

  • Hyper-V now works
  • Visual Studio won't crash when creating Universal Apps
  • Project Spartan browser bugs repaired

What's your reaction to the latest Windows 10 build?

Posted by Jeffrey Schwartz on 04/27/2015 at 12:55 PM0 comments


Azure, Hyper-V and Spartan Browser Added to Microsoft's Bug Bounties

Microsoft is extending its bug bounty program, which pays up to $100,000, to include Azure, Hyper-V and the new Project Spartan browser that will be included in the new Windows 10 operating system.

Microsoft's bounty program has existed for several years and had already provided awards for detecting flaws in Internet Explorer and Office 365. Microsoft Azure CTO Mark Russinovich announced the addition of its cloud service, Hyper-V and Project Spartan to the bounty program at this week's RSA Conference in San Francisco. Among his three talks at the RSA Conference was an overview of the security of the Azure cloud service where he made the announcement at the end of his presentation.

"We want to make sure we don't have attacks discovered on Hyper-V before we do, so we're asking now for researchers to be there so we can get on top of them before attackers can take advantage of them," Russinovich said. "It's showing that we really want to keep our systems secure and make sure that the good guys aren't encouraged to go take their information and do evil with it but rather help everybody get some incentive like this."

Bounties for fixes that cover known flaws range from $500 to $15,000, and Microsoft will pay up to $100,000 for a mitigation bypass of any of the company's isolation technologies. It also offers $50,000 BlueHat bonuses for discovery and mitigation of zero-day vulnerabilities. Jason Shirk of the Microsoft Security Response Center said in a blog post Wednesday that the bounty extension will include Azure virtual machines, Azure Cloud Services, Azure Storage and Azure Active Directory, among others. The bounty will also cover Sway.com, the preview of Microsoft's new tool for creating and sharing interactive reports and presentations. Shirk said Microsoft is only offering the bounty for Project Spartan through June 22.

Stephen Sims, a security researcher at the SANS Institute, said Microsoft has paid handsomely for a number of discoveries, such as last year's $100,000 award to Yang Yu, who disclosed three exploit mitigation bypass techniques to the company. "My experience with MSRC is they're kind of a pain, they're not very friendly about it, but it is good that they have that program set up," Sims said. "But they do pay if you can prove to them without a doubt. If you can find one bug, it's a year's salary, potentially."

Posted by Jeffrey Schwartz on 04/24/2015 at 12:13 PM0 comments


RSA Encryption Inventors Lament Its Use for Ransomware

When the developers of the original RSA encryption algorithm built what has become the mainstream means of encrypting and decrypting data, it wasn't lost on them that some bad guys might also find malicious uses for it. Two of its inventors yesterday said they were alarmed at the use of encryption for ransomware, which has become a pervasive attack on users' PCs and enterprise servers, delivered using increasingly sophisticated social engineering and phishing techniques.
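For readers who haven't looked at the mechanics, the appeal to attackers comes straight from how public-key encryption works: anyone holding a public key can encrypt data, but only the holder of the matching private key can recover it. Below is a generic, textbook sketch of that property using the Python cryptography package; it is illustrative only and has nothing to do with any particular ransomware family.

    # Standard RSA-OAEP usage: encryption needs only the public key, while
    # decryption is impossible without the private key.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    secret = b"per-file symmetric key"  # bulk data would use a symmetric cipher
    oaep = padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    )

    ciphertext = public_key.encrypt(secret, oaep)      # anyone can do this
    recovered = private_key.decrypt(ciphertext, oaep)  # only the key holder can
    assert recovered == secret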

"As a security threat, encrypting ransomware has flown beneath the radar of many IT departments. It emerged as a consumer problem and at smaller companies and agencies," said Paul Kocher, president and chief scientist at Cryptography Research, who once again moderated this year's Cryptography Panel at the RSA Conference in San Francisco. "Many IT admins, unfortunately, write off the potential for ransomware incidents as unavoidable end-user errors that merit a slap on the wrist, but can't be helped. But all evidence suggests the problem isn't going away."

Given that two of the panelists -- Adi Shamir, a professor at the Weizmann Institute in Israel, and MIT professor Ronald Rivest -- invented the RSA algorithm at the heart of today's encryption methods, Kocher asked them for their perspective on its use for ransomware.

"As the inventor of one of the algorithms, I sort of feel like the mother whose son has been brainwashed and he's off to become a Jihadist in Syria somewhere," Rivest said. "I think that ransomware is one of those areas where our community failed in a particularly miserable way," Shamir added. "There are good security programs you can use in order to protect yourself from this ransomware."

Shamir said he fears the worst is yet to come as the Internet of Things enables homes and businesses to become more connected. "Think about your TV being ransomware'd stopping to work, with a big display that you have to pay in to get the TV service back," Shamir said. "I think it's a very serious problem. It's going to stay with us and we really have to think about new techniques to stop it."

Shamir also noted that because systems can be infected silently for weeks or months before a user is aware of it, backing up files also won't solve the problem. "Eventually your files on the backup are going to be the encrypted files," he said. "This is a huge issue of the correctness of backed up data, which is a major problem."

This month's Redmond magazine cover story looked at the continued impact of ransomware on consumers and enterprises alike. Panelist Ed Giorgio, a cryptographer and security expert, said the malicious use of encryption is just part of the problem. "Ransomware is not just about encrypting your data so you don't have access to it; in order to do ransomware you have to first penetrate somebody's computer, then you have some sort of an exploit," Giorgio said. "But as we all know, criminals are very innovative and once they penetrate your file, they will find other things in your computer they can blackmail you for. Even if we do solve the loss of data problem, ransomware will still be around."

 

Posted by Jeffrey Schwartz on 04/22/2015 at 12:16 PM0 comments


RSA's New Leader To Tackle Rapidly Shifting Security Landscape

This is not your father's RSA. That was the message the company's new president, Amit Yoran, effectively gave in the opening keynote on Tuesday at the annual RSA Security Conference in San Francisco attended by more than 30,000 IT security professionals. While it's hosted by RSA, a subsidiary of EMC known for its development of the industry standard RSA public key cryptography algorithm, the conference is an industry event with participation by its partners and competitors alike.

While Yoran focused his keynote on issues that plague security professionals, it also set the stage for changes he is planning for the company, whose reins he took last year from longtime president Art Coviello, who recently retired. "We're reengineering RSA across the board to enable us to deliver on this vision," Yoran said toward the end of his address. "This time next year, we won't be the same RSA you have known for decades."

Yoran didn't use his keynote to explain how he plans to remake RSA. But in brief remarks at a gathering of press and analysts a day earlier, he indicated a move away from RSA's original SecurID strong authentication token platform. Addressing the current risk factors, which extend beyond enterprise perimeters thanks to the growing ubiquity of public and hybrid cloud services, he noted the launch of the new Via identity management product line and extensions to RSA Security Analytics.

RSA described its new Via portfolio as the first set of smart identity tools that use contextual awareness, rather than static rules such as traditional passwords, to provide single sign-on access to systems. The first in the portfolio, RSA Via Access, is a software-as-a-service (SaaS) offering that provides step-up authentication using mobile devices for single sign-on access. The portfolio also includes RSA Via Governance, built on the identity management and governance platform RSA acquired from Aveksa, which provides views into access privileges, automates user access and flags orphan user accounts and inappropriate user access, according to the company. Also built on the Aveksa acquisition is the new Via Lifecycle, a user provisioning platform.

The other major area of emphasis for the company is the extended capabilities of RSA Security Analytics. Based on RSA's 2011 acquisition of NetWitness, which Yoran led as CEO at the time, the company is launching a new release of RSA Security Analytics that will focus on extending visibility from the endpoint to the cloud. And that gave Yoran fodder for many of his talking points in his opening keynote.

Referring to the 2014 Verizon Data Breach Investigations Report, which found less than 1 percent of successful advanced threat attacks were spotted by SIEM systems, he made his case for change. "We're still clinging to our old maps," he said. "It's time to realize that things are different."

Given existing defense mechanisms are not sufficient in and of themselves these days, he believes analytics will be key to proactively identifying attacks. "We must adopt a deep and pervasive level of true visibility everywhere, from the endpoint to the network to the cloud, if we have any hope of being able to see the advanced threats that are increasingly today's norm," he said.

The Stuxnet, Equation Group and Carbanak intrusions are a handful of examples he pointed to. "One of the defining characteristics across all of them is their stealthy nature," he said. "Until written about they were virtually undetectable because they bypassed traditional defenses. Even now many organizations operate completely blind as to whether they are victim to these published techniques. Traditional forms of visibility are one-dimensional, yielding dangerously incomplete snapshots of an incident, let alone any semblance of understanding of an attack campaign. Without the ability to rapidly knit together multiple perspectives on an attack, you'll never fully understand the scope of the overall campaign you're facing."

Arguing he wasn't hawking his products, Yoran said "I'm not just standing up here and saying 'buy RSA gear.' I'm the first to admit that we need to go further than what is available today. We're on a journey to full visibility. Our environments, business practices and adversaries continue to evolve and so must we."

As I said, this is not your father's RSA.

Posted by Jeffrey Schwartz on 04/21/2015 at 2:19 PM0 comments


Microsoft Shuts Down Open Source Subsidiary

Microsoft late last week said it's shutting down the MS Open Tech subsidiary it formed three years ago to invest in open source initiatives and will absorb it back into the company. The company announced the formation of Microsoft Open Technologies Inc. in April 2012, staffed with an interoperability strategy team in Redmond aimed at accelerating its push into the open source community.

In a blog post late Friday, MS Open Tech's president Jean Paoli said the independent organization accomplished what it set out to do and the time is right to bring its people and efforts back into Microsoft. "MS Open Tech has reached its key goals, and open source technologies and engineering practices are rapidly becoming mainstream across Microsoft," Paoli said. "It's now time for MS Open Tech to rejoin Microsoft Corp., and help the company take its next steps in deepening its engagement with open source and open standards."

The move is hardly surprising. In the past year, Microsoft has extended its push into the open source community more than most would ever have expected. Not that Microsoft is positioning itself as an open source company, but it in some way supports every major initiative and has made once-unthinkable contributions, including open sourcing its .NET Framework. Mark Russinovich, CTO for Azure, raised eyebrows earlier this month when he raised the prospect of Microsoft open sourcing Windows, saying "it's definitely possible."

"Open source has become a key part of Microsoft's culture," Paoli said in his Friday post. "Microsoft's investments in open source ecosystems and non-Microsoft technologies are stronger than ever, and as we build applications, services, and tools for other platforms, our engineers are more involved in open source projects every day. Today, Microsoft engineers participate in nearly 2,000 open source projects on GitHub and CodePlex combined."

Paoli also noted that Microsoft has brought "first-class support" for Linux to Azure, partnered with Docker to integrate its containers for support on Azure and Windows, built Azure HDInsight on Apache Hadoop and Linux, and created developer support for open platforms and languages including Android, Node.js and Python. In addition to deep support for Docker, Paoli pointed to integration with other key environments, both open and competing proprietary platforms, notably iOS. Among other projects, he noted contributions to Apache Cordova, Cocos2d-x, OpenJDK and dash.js, support for Office 365 on the Moodle learning platform and collaboration on key Web standards including HTML5, HTTP/2 and WebRTC/ORTC.

As Microsoft absorbs MS OpenTech, it will create the Microsoft Open Technology Programs Office, according to Paoli. "Team members will play a broader role in the open advocacy mission with teams across the company," he said. "The Programs Office will scale the learnings and practices in working with open source and open standards that have been developed in MS Open Tech across the whole company. Additionally, the Microsoft Open Technology Programs Office will provide tools and services to help Microsoft teams and engineers engage directly with open source communities, create successful Microsoft open source projects, and streamline the process of accepting community contributions into Microsoft open source projects."

 

Posted by Jeffrey Schwartz on 04/20/2015 at 11:17 AM0 comments


Docker Client for Windows Released 

Microsoft's efforts to support containers in Windows took another step forward yesterday with the release of the Docker Client for Windows. The release of Microsoft's Docker command-line interface for Windows comes with Docker's updated container platform, dubbed Docker 1.6.

It comes after an active week for Docker, which on Tuesday received a huge equity investment of $95 million that the company said it will use in part to further its collaborations with partners including Microsoft, Amazon Web Services and IBM. Microsoft also just announced that Docker containers are coming to Hyper-V and Windows Server.

"Docker Client for Windows can be used to manage Docker hosts running Linux containers today, and managing Windows Server Containers and Hyper-V Containers will be supported in the future to provide the same standard Docker Client and interface on multiple development environments," wrote Ahmet Alp Balkan, a software engineer for Azure Compute at Microsoft. Microsoft and Docker have collaborated to port the Docker Client to the Windows environment in Docker's open source project, which you can see on GitHub."

Balkan also noted that IT pros will be able to find Windows Server Container images in the Docker Hub among the 45,000 Docker images for Linux already available, a figure that continues to grow. IT pros and developers can download the Docker Client for Windows via the Chocolatey package manager, or they can install Boot2Docker, which creates a Docker development environment within a virtual machine, Balkan noted.
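For those who prefer scripting to the raw command line, the same scenario Balkan describes -- a Windows machine managing a remote Docker host running Linux containers -- can also be sketched with the docker Python SDK; that choice is my own, since the post itself covers only the command-line client, and the DOCKER_HOST and TLS environment variables are assumed to point at the remote engine.

    # Illustrative sketch: managing a remote Linux Docker host from Windows.
    import docker

    # Reads DOCKER_HOST, DOCKER_TLS_VERIFY and DOCKER_CERT_PATH from the environment.
    client = docker.from_env()

    print(client.version()["Version"])  # engine version on the remote host

    # Pull an image from Docker Hub and run a throwaway Linux container.
    client.images.pull("alpine", tag="latest")
    output = client.containers.run("alpine", "echo hello from a Linux container", remove=True)
    print(output.decode().strip())

    for container in client.containers.list():  # containers running on the host
        print(container.short_id, container.image.tags, container.status)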

The new Docker 1.6 adds labels that let IT pros and developers attach user-defined metadata to containers and images and work with it in various tools, Docker said. The release also includes a new Docker Registry and API, and the Docker 1.6 Engine boasts improved reliability and performance.

Docker's updated Compose 1.2 tool, designed for running complex applications, streamlines repetitive processes. The release also includes the Swarm 0.2 clustering component, which the company said turns a pool of Docker hosts into one virtual host. The updated Swarm adds a new spread strategy for scheduling containers, the ability to add more clustering drivers and support for more Docker commands, such as pulling and inspecting images.

Finally, Docker added Machine 0.2, which the company said has an improved driver interface, more reliable provisioning and the ability to regenerate TLS certificates to ensure better security when a host's IP address changes.

Posted on 04/17/2015 at 1:03 PM0 comments


Microsoft Pushes To Further Distance Itself from AWS

Ask most people which companies are Microsoft's biggest rivals and some will say Apple, but most will identify Google. Several published reports even point to powers in Redmond as a key force behind regulators coming down on the search giant this week. IT pros may throw VMware and Red Hat into the mix of major Microsoft competitors, but its neighbor Amazon Web Services is right up there, having launched its famous cloud infrastructure services years ahead of Microsoft. Even Azure's entry got off to a slow start, lacking a complete infrastructure service to rival the offerings of AWS.

Microsoft talked up the chink in AWS' armor last year when Amazon shocked investors with heavier-than-expected losses, due primarily to its investments in AWS. Anyone who knows founder and CEO Jeff Bezos is aware he's not going to throw in the towel on AWS that quickly, though there are some who'd like to see him agree to spin off the cloud business into a separate company. But Bezos' rationale around AWS was always to let the two businesses feed off each other.

Proponents of a divestiture could be buoyed or deflated depending on what Amazon's numbers look like when it reports next week but, to date, history is not on its side. While we've reported on gains by Microsoft, IBM and numerous others at the expense of AWS, by no means is it game over for Amazon, which continues to crank out new offerings on a weekly basis. Consider the past week, when AWS held one of its regional summits, this one in San Francisco. The company pointed to the fact that software partners continue to extend support for AWS' services, simplified its Amazon Machine Learning service, announced its new Elastic File System service and extended its burgeoning Amazon WorkSpaces offering.

Rarely does a week go by when there isn't something new coming out from what is still the largest provider of infrastructure services, which is why Redmond parent company 1105 Media has launched the new AWSInsider site, which debuted this week. This new sister site is a welcome addition to our portfolio, but in no way will it diminish the way Redmond covers AWS for Microsoft-focused IT pros. Rather, it only promises to enhance it.

The timing couldn't be better, as AWS furiously fights off cloud competitors. And its number one antagonist and Pacific Northwest rival Microsoft is about to step up that battle at its Build conference in two weeks and at Ignite in early May. In a preview leading up to Microsoft's big splash, Jeffrey Snover, lead architect for the Windows Server division, last week talked about six key sessions he'll be participating in with the likes of Azure CTO Mark Russinovich, where they'll talk about the company's datacenter vision moving forward, which includes the combination of new versions of Windows Server, System Center and Azure. A key session will go deep on how this new datacenter vision extends its hybrid cloud platform, aka Cloud OS, with new levels of automation aided by PowerShell's Desired State Configuration and support for containers.

It will be interesting to see the new offerings coming not only from Microsoft and AWS but from all of the major players, as well as the lesser-known ones, which will also play a key role in how organizations view and procure IT in the future.

Posted by Jeffrey Schwartz on 04/16/2015 at 1:19 PM0 comments


European Union Accuses Google of Violating Antitrust Laws

The European Union has once again thrown down the gauntlet at Google, this time charging the company with violating antitrust laws by using its dominance in search to favor its own comparison shopping service at the expense of others. The EU is also launching a separate investigation to see if Google has used its clout as the dominant supplier of mobile phone software to hold back providers of competing mobile operating systems, namely Apple and Microsoft. Google denied both allegations.

Regarding the charges that it skews results in its search engine to benefit its own shopping comparison service, the EU charged that "Google gives systematic 'favourable' treatment to its comparison shopping product (currently called 'Google Shopping') in its general search results pages, e.g. by showing Google Shopping more prominently on the screen."

Google diverts traffic from competing comparison shopping services, obstructing their ability to compete, the EU complaint said.

"The Commission is concerned that users do not necessarily see the most relevant results in response to queries -- this is to the detriment of consumers, and stifles innovation," it said in a statement. The EU wants Google to operate its own comparison shopping services the same as it treats those of rivals. Google has 10 weeks to respond, at which point the EU will hold a formal hearing.

In response to that allegation, Google said in a blog post it has plenty of competitors and argued its own offerings are often underdogs. "Indeed if you look at shopping -- an area where we have seen a lot of complaints and where the European Commission has focused in its Statement of Objections -- it's clear that (a) there's a ton of competition (including from Amazon and eBay, two of the biggest shopping sites in the world) and (b) Google's shopping results have not harmed the competition," Amit Singhal, senior vice president of Google Search, said in a blog post. "Companies like Facebook, Pinterest and Amazon have been investing in their own search services and search engines like Quixey, DuckDuckGo and Qwant have attracted new funding. We're seeing innovation in voice search and the rise of search assistants -- with even more to come."

As for Android, the EU said it's investigating whether Google has violated antitrust regulations by hindering the development of rival mobile operating systems and applications, including by providing incentives to smartphone and tablet suppliers to install Google's apps and services exclusively. "Distribution agreements are not exclusive, and Android manufacturers install their own apps and apps from other companies as well," said Hiroshi Lockheimer, Google's VP of engineering for Android, in a blog post addressing the investigation. "And in comparison to Apple -- the world's most profitable (mobile) phone company -- there are far fewer Google apps preinstalled on Android phones than Apple apps on iOS devices."

Do you feel the EU has a case or are the latest charges just a witch hunt?


Posted by Jeffrey Schwartz on 04/15/2015 at 11:30 AM


Docker Founder: 'We've been Given a Mandate To Build Something'

Investors are so bullish about how Docker is poised to play a major role in the future of enterprise IT infrastructure and software development that they filled its coffers with $95 million in Series D funding. Docker, regarded as the leading provider of containers for enterprise developers building service-oriented, scalable and portable software, took the huge cash infusion even though it hasn't used up last fall's $40 million round.

The company's meteoric rise in just two years has quickly garnered support from enterprise IT heavyweights including Amazon Web Services, Google, Microsoft, VMware and IBM. Not only does Docker aspire to make operating systems and virtual machines much less relevant, it wants to make it possible for developers to build software without regard to the OS or public cloud provider, with the ability to scale with or without a virtual machine. The company claims that the Docker platform shrinks software development times from weeks to minutes and drives 20x improvements in computing resource efficiency.

Docker says it has logged 300 million downloads from its hosted Docker Hub offering, and 15 Fortune 50 companies are now testing its forthcoming Docker Hub Enterprise offering. More than 1,200 open source developers have contributed to the Docker platform, according to the company. David Messina, Docker's VP of marketing, said on a conference call with journalists that the company plans to use the proceeds of the $95 million to expand the orchestration, networking, storage and security features of the Docker platform and build on the APIs that enable extensions to platforms from partners like Amazon, Microsoft and IBM.

"I think we've been given a mandate to build something and clearly there's a community of people who are very excited about what we've built so far," said Docker founder and CTO Solomon Hykes, speaking during the conference call. "The expectations are extremely high, almost impossibly high. Our goal is to build a universal tool. Very specifically we're trying to solve fundamental problems that affect all applications, and although at any given time the implementation is limited in its scope. For example most obviously you can only run applications in Docker if they can run in Linux, but over time we're working to expand that scope, and the most dramatic example is our partnership with Microsoft."

Docker's partnership with Microsoft, launched in May and extended in October, is significant. The next version of Windows Server, code-named "v.Next," will ship with native support for containers and with a ported version of Docker that will support Windows container technology, Hykes noted. Microsoft last week said it will release the next technical preview of Windows Server next month.

"As a developer in an enterprise, you'll be able to develop, build and test applications using Docker's standard tooling and interfaces for both Linux and Windows environments," Hykes said. The notion of using Docker containers is that developers can build applications for .NET, Java and modern programming languages that are portable and scan scale without requiring huge investments in virtualization. 

More Lightweight and Faster than a VM

"The first thing that happens when people play with containers in their development projects is it looks like a VM but faster and more lightweight and consuming less memory. And those are all true," Hykes said. "Several years before Docker existed, a common, wisdom among [IT pros and developers] was a container was just that: smaller, more lightweight, a faster VM. The fundamental difference between Docker and other lower-level container tools is simply we disagree. We think containers and VMs are fundamentally different. They operate at different levels of the stack and as a result they are not mutually exclusive. You can use containers with VMs, you can use containers without VMs, directly on bare metal, and we're seeing organizations do both."

Currently most customers aren't using Docker containers to replace virtual machines, Hykes said, emphasizing the notion that containers are designed to ensure existing infrastructure and applications don't require change.

"Typically what we've seen is the way developers reason about containers is not as a possible replacement for VMs, but as an additional layer on top of their infrastructure, which allows them to pick and choose the best infrastructure for each job. [This] means it now becomes easier for an organization to use VMs where VMs make sense, to use bare metal when bare metal makes sense and, of course, to pick and choose between multiple physical machine providers and virtual machine providers and then layer on top of all these different parts of their infrastructure. On top of which they can express their applications. So the bottom line is containers are for applications and VMs are for machines."

To the point regarding the types of applications Docker containers are best suited for, Hykes said they can be applied to any type. "Docker can be applied to any sort of application. Over time, I think we're seeing more and more practitioners grow comfortable with the technology, comfortable with the best practices and evolve from the original pilot project, which is typically a non-vital project and gradually trust Docker with projects of larger and larger magnitude."

Messina said the appeal of Docker Hub and the forthcoming Docker Hub Enterprise played a key role in Goldman Sachs and Northern Trust joining the parade of investors funding this new round. "The two companies have used Docker for various development efforts and now these organizations are standardizing on Docker in their application lifecycle infrastructure," Messina said.

Insight Venture Partners led the round with new investments from Coatue, Goldman Sachs and Northern Trust. Also participating in the round were previous investors Benchmark, Greylock Partners, Sequoia Capital, Trinity Ventures and Jerry Yang's AME Cloud Ventures.

Posted by Jeffrey Schwartz on 04/14/2015 at 1:38 PM


Microsoft Releases Skype for Business Client for Office 2013

Microsoft today released the Skype for Business client as planned, just weeks after introducing the technical preview. The company announced the release of the new Skype for Business as part of the Office 2013 April rollout. All Office 365 customers are scheduled to receive the update by the end of May.

Organizations not ready to transition their users to the new Skype for Business client can have administrators switch back to the existing Lync interface, Microsoft said. The company posted instructions for how customers can roll back to the current Lync client, both for shops with Lync Online and those with Lync Server.

A new version of Lync Server, to be called Skype for Business Server, is scheduled for release next month. Microsoft said in March that the new server edition will support high availability, including the AlwaysOn capability found in SQL Server.

Skype for Business is the new phone and conference interface that replaces Lync and will bring the look and functions of Skype to Office. Microsoft claims that more than 300 million consumers use Skype, which Microsoft acquired in 2011 for $8.5 billion, its largest acquisition to date. Now comes the litmus test on whether Microsoft will get bang for its buck. By integrating the enterprise features of Lync with the interface of Skype, Microsoft is hoping it can raise the profile of its communications technology among business and enterprise users.

Microsoft first indicated plans to integrate Lync with Skype and give it the Skype brand late last year, and it released the technical preview of the new Skype for Business client at last month's Convergence conference in Atlanta. Skype for Business represents Microsoft's latest effort to gain an even stronger foothold in unified communications, something it has long aspired to do. Microsoft introduced Lync nearly five years ago as a revamped iteration of its Office Communications Server.

The company is hoping that the familiarity and access to all of the 300 million users Microsoft claims Skype has will increase its appeal and usage both within businesses and among consumers. Microsoft says Skype for Business has "enterprise-grade security" and controls for compliance. Just like the existing Skype and Lync clients, the new Skype for Business provides IM, presence, voice and video calls and meetings. With this new release, Skype is integrated directly into Office.

In the new client, users can initiate and control calls from their Office contact lists. It also brings Skype emoticons to conversations, improves file transfer with drag-and-drop support, lets recipients see file details such as a file's size and name, and lets users take notes from within the client via OneNote. It also includes the Skype call monitor.

How quickly do you see your organization using Skype for Business?

Posted by Jeffrey Schwartz on 04/14/2015 at 10:26 AM


Microsoft Challenges Court Order To Turn Over E-Mail in Dublin Datacenter

Microsoft last week filed a legal brief challenging a court order that is forcing the company to turn over a customer's e-mails stored in a foreign datacenter.

The brief, filed April 8 with the United States Court of Appeals for the Second Circuit, seeks to overturn last summer's court order requiring Microsoft to turn over the messages of the customer, who is a suspect in an alleged drug-related matter. The identity of the suspect is not known, and Microsoft said at the time of the ruling, which was upheld by Judge Loretta Preska, that it would appeal the order.

A number of major technology companies last year had filed briefs in support of Microsoft's appeal including Apple, AT&T, Cisco and Verizon, along with the Electronic Frontier Foundation, noting that the outcome promises to set a precedent for all U.S.-based cloud providers storing data abroad.

"Settled doctrine makes this Court's job simple: Because laws apply only domestically unless Congress clearly provides otherwise, the statute is properly read to apply only to electronic communications stored here, just as other countries' laws regulate electronic communications stored there," according to the brief, which Microsoft published. "Even if the Government could use a subpoena to compel a caretaker to hand over a customer's private, sealed correspondence stored within the United States, however, it cannot do so outside the United States without clear congressional authorization."

Brad Smith, Microsoft's general counsel and executive vice president for legal and corporate affairs, indicated in a blog post that he's confident the company will prevail. "As we stated in our brief, we believe the law is on the side of privacy in this case," he said. "This case is about how we best protect privacy, ensure that governments keep people safe and respect national sovereignty while preserving the global nature of the Internet."

Smith also argued that the feds are long overdue in revisiting electronic privacy laws. "While there are many areas where we disagree with the government, we both agree that outdated electronic privacy laws need to be modernized," he said. "The statute in this case, the Electronic Communications Privacy Act, is almost 30 years old," he noted. "That's an eternity in the era of information technology."

Those differences, of course, center on combating criminal activity versus protecting privacy. Smith acknowledged that conflict but renewed his plea for the government to find a resolution. "Law enforcement needs to be able to do its job, but it needs to do it in a way that respects fundamental rights, including the personal privacy of people around the world and the sovereignty of other nations," he said. "We hope the U.S. government will work with Congress and with other governments to reform the laws, rather than simply seek to reinterpret them, which risks happening in this case."


Posted by Jeffrey Schwartz on 04/13/2015 at 11:11 AM


PC Sales Decline Not as Bad as Expected

PC shipments have been on the decline. But if you want to look at the glass half-full, those declines aren't as bad as originally forecast.

IDC yesterday reported that the 69 million PCs shipped in the first quarter of this year amounted to a 6.7 percent decline from the same period last year. Though that's the lowest number of PCs shipped for the quarter since 2009, this year's decline wasn't as sharp as IDC had originally forecast last fall, when the market researcher predicted volumes would drop by 8.2 percent.

The better-than-expected number -- if you do look at the glass half full -- came from a slower decline in the United States than in other parts of the world, according to IDC Senior Research Analyst for PCs Rajani Singh. In the U.S., 14.2 million PCs shipped in the first quarter, a 1 percent decline from the same period last year, according to IDC. The strongest segment of growth was portables, notably newer categories such as Bing PCs, Chromebooks, convertible PC-tablets and ultra-slim notebooks, according to IDC, which said desktop shipments were sluggish this quarter.

Gartner said desktop declines were in the double digits, though detailed figures won't be available for another few weeks, according to analyst Mikako Kitagawa. The firm had forecast a more moderate decline and says overall shipments fell 5.2 percent. Gartner is also forecasting moderate PC growth in the years to come.

"The PC industry received a boost in 2014 as many companies replaced their PCs due to the end of Windows XP support. But that replacement cycle faded in the first quarter of 2015," Kitagawa said in a statement. "However, this decline is not necessarily a sign of sluggish overall PC sales long term. Mobile PCs, including notebooks, hybrid and Windows tablets, grew compared with a year ago. The first quarter results support our projection of a moderate decline of PC shipments in 2015, which will lead to a slow, consistent growth stage for the next five years."

The pending arrival of Windows 10 should boost PC shipments later this year once Microsoft releases it, IDC's Singh stated. "Windows 10 should be a net positive as there is pent-up demand for replacements of older PCs," she noted. "Only part of the installed base needs to replace systems to keep the overall growth rate above zero for rest of the year."

Kitagawa said in an e-mail that she doesn't anticipate the arrival of Windows 10 playing a significant role in an uptick of PC demand. "We don't expect Windows 10 will stimulate the demand, but will see shipment growth from supply side as manufacturers will try to push the volume," she said. "If related marketing activities are visible enough, then it can draw buyers' attention, but it does not mean that it can increase the sales to the end users."

Both research firms also noted that the two largest PC providers, Lenovo and Hewlett-Packard respectively, were the only suppliers to grow sales during the quarter. IDC said Lenovo, with a 19.6 percent share of the market, shipped 13.4 million PCs, an increase of 3.4 percent. HP's sales of just under 13 million systems were up 3.3 percent, giving it a 19 percent share of the market. Dell, the No. 3 player, shipped 9.2 million PCs, a 6.3 percent decline, giving it a 13.5 percent share.

Smaller PC vendors, defined as "others," accounted for a third of the market and saw their shipments decline 17.6 percent, according to IDC. Naturally that impacted their market share, which dropped from 38.4 percent to 33.9 percent. If that trend continues, expect to see the big get bigger and the rest of the market to be squeezed. One variable is whether HP will be able to maintain its scale after it splits into two companies.


Posted by Jeffrey Schwartz on 04/10/2015 at 12:40 PM


Microsoft Adds Hyper-V and Nano Server to List of Planned Containers

In a move Microsoft says will extend container technology to more deployment scenarios and workloads, and allow developers to build more scalable apps, the company today said it will offer Hyper-V Containers.

The introduction of Hyper-V Containers comes just weeks before Microsoft plans to debut the preview of the next version of Windows Server, code-named "v.Next," which the company will demonstrate at its Build conference in San Francisco. As part of today's announcement, Microsoft also revealed plans to offer a scaled-down, container-optimized version of Windows Server called Nano Server, aimed at modern, cloud-native applications. Reports that Nano Server was under development surfaced last month.

Microsoft's Hyper-V Containers will offer another deployment option for running applications on Windows Server. The company announced last fall that the next version of Windows Server will support containers, which are lightweight runtime environments with many of the core components of a virtual machine and the isolated services of an OS, designed to package and execute so-called microservices. While the addition of containers to Windows Server v.Next had been described before, the Hyper-V Containers addition is something new.

Microsoft has previously described a partnership with Docker to ensure its containers could run in Windows Server environments. The move followed an earlier announcement in June 2014 to ensure that the Microsoft Azure public cloud could run Docker containers on Linux-based virtual machines. Microsoft has also indicated that Azure will support Docker's open orchestration APIs and Docker Hub images in the Azure Gallery and Portal.

Announced today, Hyper-V Containers will provide a deployment option with extended isolation, drawing on the attributes of not just the Windows operating system but also Hyper-V virtualization, according to Mike Neil, Microsoft's general manager for Windows Server, in a blog post.

"Virtualization has historically provided a valuable level of isolation that enables these scenarios but there is now opportunity to blend the efficiency and density of the container model with the right level of isolation," Neil wrote. "Microsoft will now offer containers with a new level of isolation previously reserved only for fully dedicated physical or virtual machines, while maintaining an agile and efficient experience with full Docker cross-platform integration. Through this new first-of-its-kind offering, Hyper-V Containers will ensure code running in one container remains isolated and cannot impact the host operating system or other containers running on the same host."

Hyper-V Containers will support the same development and management tools as those designed for Windows Server Containers, Neil noted. Moreover, he said developers won't need to modify applications built for Windows Server Containers to run them in Hyper-V Containers.
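
To make that distinction concrete, here is a minimal sketch of how the choice between the two isolation levels later surfaced in Docker tooling on Windows, written with the Docker SDK for Python; the isolation values, image name and command are my own illustrative assumptions and were not part of Microsoft's announcement.

  import docker

  client = docker.from_env()

  # The same image can run as a process-isolated Windows Server Container or,
  # by changing one setting, inside its own lightweight Hyper-V partition.
  image = "mcr.microsoft.com/windows/nanoserver:ltsc2022"   # placeholder image
  command = ["cmd", "/c", "echo hello from a Windows container"]

  client.containers.run(image, command, isolation="process", remove=True)
  client.containers.run(image, command, isolation="hyperv", remove=True)

The application inside the container is unchanged in both cases, which is the point Neil makes about not having to modify code to get the stronger isolation.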

And for modern application scenarios where Hyper-V and Windows Server would be overkill, Neil described the new Nano Server as "a minimal footprint installation option of Windows Server that is highly optimized for the cloud, including containers. Nano Server provides just the components you need -- nothing else, meaning smaller server images, which reduces deployment times, decreases network bandwidth consumption, and improves uptime and security. This small footprint makes Nano Server an ideal complement for Windows Server Containers and Hyper-V Containers, as well as other cloud-optimized scenarios."

Posted by Jeffrey Schwartz on 04/08/2015 at 10:34 AM


Microsoft To Release Updated 'Windows Server 2016' Technical Preview Next Month

Microsoft late Friday issued a short reminder that the Windows Server preview released in October is set to stop working on April 15. A new preview is slated to arrive in May, the company announced.

The company will release a fix so that testers can continue using it between April 15th and the release of the second technical preview, according to the Windows Server blog. "If you would like to continue your evaluation, we will soon deliver a solution until the next preview is released in May," read the blog post. "We will update this blog with more information shortly."

Given that it will have taken seven months for Microsoft to release the second Windows Server technical preview, it'll be interesting to learn what major changes make it into the new build, especially after a panel discussion last week at ChefConf in Santa Clara where Microsoft Azure CTO Mark Russinovich said it's "definitely possible" that Microsoft is considering making Windows an open source platform.

Windows Server 2016, as it is now called, is scheduled for release next year. The platform, including System Center, is undergoing a "deep refactoring," according to Jeffrey Snover, distinguished engineer for the Windows Server Group. As reported last month, Microsoft is aligning the components of each to create a more software-defined, cloud-optimized platform.

It'll also be interesting to see if the reported "Nano Server" edition of Windows Server appears with the forthcoming technical preview. Nano Server would be an even smaller deployment option than the Server Core option that currently exists in Microsoft's flagship Windows Server 2012 R2.

Posted by Jeffrey Schwartz on 04/06/2015 at 1:10 PM


Former Windows Chief Leads Investment Group in Security Startup

Andreessen Horowitz last week invested an additional $52 million in security startup Tanium, adding to the $90 million the Silicon Valley venture capital firm infused into the endpoint security provider last year. Steven Sinofsky, the president of Microsoft's Windows group until his unceremonious departure more than two years ago, is now at Andreessen Horowitz and is largely leading the firm's investment in Tanium.

Tanium claims its endpoint security platform is designed to provide near real-time visibility into cyber threats against even the largest of organizations. The company's vision is to scale without degradation regardless of the size of the organization or the number of endpoints. The Tanium platform has two key components: Endpoint Security and Endpoint Management. Endpoint Security provides threat detection, incident response, vulnerability assessment and configuration compliance, while Endpoint Management performs patch management, software distribution, asset management and asset utilization reporting.

The company claims the Endpoint Security component of its platform can provide 15-second threat detection and remediation, connect to key external threat intelligence feeds and support open standards such as OpenIOC (the open framework for sharing threat intelligence contributed by Mandiant), YARA for creating signatures that identify malware families, the Structured Threat Information eXpression (STIX) language for describing threat information and the Trusted Automated eXchange of Indicator Information (TAXII). The Endpoint Management component aims to provide accurate vulnerability assessments by giving full visibility into all endpoint assets.
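
As a small illustration of one of those building blocks, the snippet below compiles and runs a toy YARA signature using the yara-python bindings; the rule and the sample data are invented for illustration and have nothing to do with Tanium's own content.

  import yara

  # A toy signature of the kind used to identify a malware family;
  # the string it looks for is purely illustrative.
  RULE = r"""
  rule Demo_Family
  {
      strings:
          $marker = "EVIL_PAYLOAD_MARKER"
      condition:
          $marker
  }
  """

  rules = yara.compile(source=RULE)

  sample = b"....EVIL_PAYLOAD_MARKER...."
  for match in rules.match(data=sample):
      print("matched rule:", match.rule)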

During a CNBC interview this week, Sinofsky, who sits on Tanium's board, described the company's addressable market as up to 1 billion endpoints and argued Tanium's approach to threat detection and systems management is broader than what traditional security providers currently offer.

"It's broader than any one security company, or any one in the traditional area we used to call systems management," Sinofsky said. "What's incredible about Tanium is it takes a modern and novel innovative approach to the difference between what used to be the mundane task of inventory of just tracking your PCs with the network and edge detection of companies like Palo Alto Networks and FireEye, and the like land Symantec is kind of a legacy provider of endpoint protection -- the signature files, and malware. Tanium is all about 15-second response across a billion end points in the enterprise world."

The added $52 million, which brings the total investment to $142 million, has doubled Tanium's valuation to $1.75 billion, according to reports. Sinofsky declined to confirm the reported valuation when asked by CNBC.


Posted by Jeffrey Schwartz on 04/03/2015 at 12:42 PM


Microsoft Turns 40 as Stock Drops to 40

The 40th anniversary of Microsoft's founding is tomorrow, April 4. And in a twist of irony, its stock closed yesterday at just a hair above $40 per share ($40.29 to be precise). While that's still higher than the $35 it was trading at when Satya Nadella succeeded Steve Ballmer, the stock has declined 20 percent since November.

Microsoft's history is among the most interesting growth stories in business: a company that started with nothing and rose to become one of the world's most influential. It all started in 1975, when Paul Allen and Bill Gates got their version of BASIC up and running on the Intel microprocessor-based MITS Altair.

Certainly most Microsoft IT professionals and developers know the rich history behind that, but for those who don't (or who want to recap those interesting times), check out this seven-minute Channel 9 video, which recaps some key milestones starting in 1975. Back then, a gallon of gas was 53 cents and Microsoft's revenues for the year were $16,705. Microsoft enjoyed many years as the world's most valuable company until Apple overtook it in recent years -- aided by many mistakes Microsoft made over the past decade.

Nevertheless, since Satya Nadella took over as Microsoft's CEO just over a year ago, the company has remade itself, fully embracing open source and rival proprietary software and services and making mobility and cloud the core of everything it delivers. Nadella has defined Microsoft as a "productivity and platforms" company. Investors cheered until January, when the company forecast a weaker outlook for the current quarter, leading to the current stock decline.

Most Wall Street analysts see the current decline as a buying opportunity, but there are a handful of skeptics. Among them is Goldman Sachs, which downgraded Microsoft to a Sell on Wednesday. Analyst Heather Bellini warned in a research note that the stock will sit at $38 per share over the next 12 months, below the consensus target of $46.97. To be sure, this is a contrarian view, but it's worth pointing out the headwinds she sees. Among them:

  • Microsoft was buoyed last year by the end of life of Windows XP and there's no equivalent issue that will force upgrades this year.
  • The free Windows 10 upgrades will impact Windows licensing revenue.
  • There's currently little room for lowering costs.
  • PC sales will remain flat.
  • Cloud licensing products such as Office 365 have much tighter margins than traditional software.

Bullish reports counter that Microsoft's strong commercial software business is gaining share and its cloud business, backed by strong Azure growth, is showing signs that it'll become a key cash cow in the future. Microsoft's tendency to exceed expectations has tempered some concerns over the lower guidance.

Noting both the Goldman findings and the bullish reports, Morningstar yesterday said it's holding its four-star rating on the company. "Microsoft remains a cash flow juggernaut," the report said. "Generating more than $26 billion in free cash flow in the past fiscal year and with more than $85 billion in cash on its balance sheet, the technology powerhouse has the financial flexibility and resources to remake itself."

Microsoft's biggest challenge moving forward is to keep Windows successful, while attracting and retaining new talent that will help the company move into the future, as described by The Economist.

Many dread turning 40 while others relish the milestone. Microsoft appears to have moved past its own mid-life crisis before turning 40, but time will tell.


Posted by Jeffrey Schwartz on 04/03/2015 at 9:49 AM


Microsoft's New Ignite Mega Conference Sold Out with 20,000 Attendees

If you were on the fence about attending next month's brand-new Ignite conference, designed to bring together the former TechEd, SharePoint Conference and other events into one mega show, you're too late. Microsoft says Ignite, slated for May 4-7 at the McCormick Place Convention Center in Chicago, is sold out.

According to the Web site Microsoft set up for Ignite, full conference passes are no longer available. But if you're just interested in attending the expo you can still get into that. A spokeswoman for Microsoft said that 20,000 attendees have registered for the inaugural Ignite conference. Ignite is targeted at enterprise IT pros and is expected to be the site where top executives outline the future of key products including Windows Server, System Center, Hyper-V, SharePoint Server and others.

Microsoft CEO Satya Nadella is slated to kick off Ignite on Monday, May 4 with the opening keynote. Though it's a rebranded and expanded iteration of TechEd, many MVPs have raised concerns over the way Microsoft has organized sessions for the event. Among them is Redmond magazine Windows Insider columnist and Pluralsight author Greg Shields, who made no bones about his take on the event, saying in last month's column that "the company's next big event may be all flash and no substance."

Describing Ignite as the equivalent of a whitewashed corporate white paper, Shields pointed out, with help from his Pluralsight partner Don Jones, that instead of allowing people to propose an outline for a session, Microsoft asked MVPs to nominate themselves and offer up three broad topics. "I'd been wondering why the newly rebranded Ignite removed the Ed in TechEd," Shields noted in his column. "Now, I think I know why. A tightly controlled and on-message event is a brilliant spectacle, but at the same time disingenuous. Select groups of trusted speakers make for a perfectly executed storyline, but at the cost of introducing new souls and their thoughts into the process."

While longtime TechEd speakers Jones and Shields won't be speaking at Ignite, they're not boycotting the show either. And apparently neither are 20,000 others.

Posted by Jeffrey Schwartz on 04/01/2015 at 11:49 AM


SharePoint Migration Provider Metalogix Acquires Rival MetaVis

Two leading suppliers of tools that enable migration from SharePoint Server to Office 365, OneDrive for Business and other cloud services are coming together. Metalogix, which also offers Exchange migration tools, today said it has acquired rival MetaVis for an undisclosed sum.

The deal not only eliminates a key rival for Metalogix but it facilitates the company's move to offer a richer set of cloud services tools. Among the MetaVis portfolio are SharePoint Migrator, Office 365 Migration and Management Suite and the Architect Suite, which will extend Metalogix's Content Matrix, Migration Expert, ControlPoint and SharePoint Backup tools.

"What's great is we were in the process of building out a fully integrated cloud solutions platform and MetaVis has exactly that, a very easy-to-install, agentless, simple cloud platform where we're able to combine our efforts," said Metalogix CEO Steven Murphy. "Now we're able to hit the market with a very nice, integrated migration suite which includes ongoing management with a focus on security and compliance administration, which are some really important issues."

With the pending arrival of SharePoint Server 2016, several older versions still in use, and the growing adoption of Office 365, SharePoint Online and OneDrive for Business, many organizations risk making key mistakes if they don't plan out their migrations and ensure data is moved in a form that keeps it usable once it reaches its new target, especially if that target is online. Maggie Swearingen, a SharePoint consultant at Protiviti and a Redmond magazine contributor, pointed this out in the April issue.

"As organizations consider a myriad of options from Microsoft, it becomes essential to have not only a long-term strategic technology vision -- but also a SharePoint migration and upgrade roadmap that's big on efficiency and low on cost," Swearingen wrote. "The sad reality is that many SharePoint migrations are considered failures by the organization and business even when the content successfully moves from point."

In her report, Swearingen pointed to five of the most popular SharePoint migration providers. In addition to Metalogix and MetaVis, they include AvePoint, Dell Software and Sharegate (of course Microsoft offers its own free tools).

"We think in a nutshell, this [merger of Metalogix and MetaVis] will solidify our stake as a leader in the migration and movement of content to the cloud -- Office 365 and other targets and this will just extend our range around compliance, security and administration," said Murphy.


Posted by Jeffrey Schwartz on 04/01/2015 at 11:59 AM


Microsoft Launches the Lighter, Thinner Surface 3

Microsoft today introduced the thinnest and lightest model to date of its Windows 8.1 tablet. The new Surface 3, which will appear at Microsoft's retail stores tomorrow, weighs just 1.37 pounds and measures a paper-thin 0.34 inches thick.

The new $499 device will include a one-year Office 365 subscription and, in keeping with the free Windows 10 upgrade offer, will be eligible for the new operating system once Microsoft releases it this summer. Powered by Intel's latest system-on-a-chip, the quad-core Atom x7 processor, the device is not aimed at compute-intensive tasks, Microsoft makes clear. It's more suited for everyday productivity such as e-mail, Web browsing and other traditional office purposes.

"If you do very demanding work -- things like editing and rendering video or complex 3D modelling -- then the power and performance of a Surface Pro 3 is for you," said Panos Panay, corporate VP for the Surface product line at Microsoft, in a blog post announcing the Surface 3. "If the majority of your work is less intense -- working in Office, writing, using the Internet (using IE, Chrome, or Firefox!), and casual games and entertainment, then you'll find that Surface 3 delivers everything you need."

Microsoft claims that the new Surface 3 will get 10 hours of battery life even when running video, and the company has also eliminated its proprietary charger, instead offering support for Micro USB charging. The Surface 3 has a 3.5-megapixel front camera and an 8-megapixel rear-facing camera with a new autofocus feature.

Though it runs a low-power system-on-a-chip processor, Panay emphasized that the Surface 3 runs the 64-bit version of Windows 8.1 Pro, making it suitable for business users in addition to students. It will work with the Surface Pen, though that will cost extra, Microsoft said. Panay also emphasized the Surface 3's appeal to enterprise users, noting customers including BASF, Prada and the University of Phoenix.


Posted by Jeffrey Schwartz on 03/31/2015 at 1:12 PM


Project Spartan Browser Debuts with New Windows 10 Technical Preview

The first preview of Microsoft's next-generation browser, code-named "Project Spartan," made its public debut yesterday. The Project Spartan browser is included with the latest Windows 10 Technical Preview, build 10049. I downloaded the new build, and Project Spartan made its presence known the first time I booted up Windows 10.

As promised last week, Internet Explorer 11 is included as a separate, unchanged browser, a deviation from an earlier plan to also offer Project Spartan's new EdgeHTML rendering engine in IE. The company argues the new engine is much faster, more secure and more reliable, and claims the new browser is better suited for modern apps than Internet Explorer. Project Spartan lacks Microsoft's legacy Trident rendering engine, but Microsoft has suggested it will nonetheless have good compatibility with Web apps and intranet sites. For organizations that don't see that compatibility, though, IE will still be around.

 "It is fast, compatible and built for the modern Web. Project Spartan is designed to work the way you do, with features enabling you to do cool things like write or type on a Web page," said Joe Belfiore , Microsoft corporate VP for operating systems, in a blog post. "It's a browser that is made for easy sharing, reading, discovery and getting things done online."

Project Spartan aims to deemphasize the fact that you're using a browser, effectively putting the user's focus on the content, Belfiore said. The new browser integrates with Cortana, Microsoft's digital assistant built into Windows Phone 8.1 and now part of the Windows 10 Technical Preview. When I highlighted text on a page, Cortana guessed what I was looking for and rendered it alongside the page I was reading. It uses the Bing search engine to find information, and it will be interesting to see if Windows 10 (with the new browser) gives a boost to Microsoft's search share in a market now dominated by Google.

The new Project Spartan browser also introduces a feature called "inking," which lets users type or write with an electronic pen directly onto a Web page. You can comment on a piece of the page and share it as a Web Note, either as an e-mail or to social networks. It's similar to marking up a PDF file. Belfiore also pointed out that users can easily compile Web Notes and save them in Microsoft OneNote. In my quick test of that feature, it permitted me to share the Web Note on Facebook, Twitter, Yammer and several other networks and apps, including Microsoft OneNote, though there was no obvious way to send it as an e-mail.

Also new in the Project Spartan browser are Reading Lists and Reading Views, designed to make it easier to set aside information from Web pages. The browser lets you save any Web page or PDF into a Reading List for easy access later. I saved a file to a Reading List, which in a way combines the function of Favorites and browsing history, except you choose what's worth keeping and can organize it accordingly.

At first glance, the browser does appear to render pages faster, and it introduces some useful new features. As for Cortana, that relationship has yet to take off. Every time I have tried to speak with her, the response has been, in effect, "try again later."

Have you downloaded the new build and looked at Project Spartan? Share your observations.


Posted by Jeffrey Schwartz on 03/31/2015 at 1:34 PM


IDC Forecasts Huge Growth for Wearable Devices

Many people remain skeptical that wearable computing and communications devices will grow at the pace of smartphones, but shipments this year are expected to more than double, according to a forecast released today by IT market researcher IDC.

Sales of "wristwear," which will account for 89.2 percent of all wearables, will grow from 17.7 million units in 2014 to 40.7 million this year, IDC is predicting. Wristwear, by IDC's definition, includes bands such as the Microsoft Band, bracelets and watches. Other types include clothing, eyewear, ear pieces and modular devices, accounting for the remaining 10.8 percent.

Wristwear capable of running third-party apps will account for the largest share of new wearables sold, with 25.7 million units predicted to sell this year. That's more than a 500 percent increase from the 4.2 million units bought last year. Next month's release of the Apple Watch will certainly play a big role in that growth, and other popular smartwatches, including the Moto 360 and Samsung Gear watches, will also contribute. But just as Apple kicked off the personal music player, smartphone and tablet markets, IDC predicts that the new Apple Watch will fuel the market for wrist-worn devices.

"Smart wearables are about to take a major step forward with the launch of the Apple Watch this year," said IDC Research Manager Ramon Llamas, in a statement. "The Apple Watch raises the profile of wearables in general and there are many vendors and devices that are eager to share the spotlight. Basic wearables, meanwhile, will not disappear. In fact, we anticipate continued growth here as many segments of the market seek out simple, single-use wearable devices."

The jury is still out on whether the Apple Watch and other devices like it will be a novelty, or if there is a killer app for these devices other than the convenience of being able to look at your e-mails and texts (and answer your phone). As I noted earlier this month, the Apple Watch doesn't have to be a hit right away but the apps available for it and others like it will have to offer a capability not available with smartphones today.

Do you see a killer app coming or is the current convenience alone enough to drive this new market?

Posted by Jeffrey Schwartz on 03/30/2015 at 12:07 PM


Microsoft Mandates Key Suppliers To Give Employees Paid Sick and Vacation Time

In a move that could broaden the discussion on income equality beyond gender, race and status, Microsoft will require its suppliers of contract workers to offer them paid vacation and sick time. The new policy, which applies to suppliers with 50 or more employees ranging from engineering and development staff to maintenance and security personnel at its numerous facilities, requires they offer either 10 days of paid vacation and five days of paid sick leave or 15 days of unrestricted paid time off.

Brad Smith, Microsoft's general counsel, said yesterday in a blog post that employees in the U.S. who have worked at least nine months, or 1,500 hours, and "who perform substantial work for Microsoft," will be eligible. It's unusual for a large U.S. company to issue what could amount to a costly stipulation for suppliers, but it's one long sought by proponents of fair pay and income equality. The move could lead other companies to enact similar benefits, a report in The New York Times today suggested.

The U.S. doesn't require paid sick leave, and 43 million workers aren't offered it, according to the report. Consequently, many people are forced to come to work when they're sick, which often makes others sick and reduces productivity, Smith said in his blog post. Citing a University of Pittsburgh study, Smith said that when an employee with the flu stays home, the risk of others catching it from that person drops by 25 percent, and staying home for two days reduces transmission of the virus by 39 percent.

Another survey, whose source he didn't identify, found that only 49 percent of those in the bottom fourth of earners receive paid time off. "Lack of paid time off also has a disproportionate impact on minorities at a time when the tech sector needs to do a better job of promoting diversity," Smith noted.

While it isn't clear how many employees will benefit from the company's new mandate, Microsoft uses 2,000 outside suppliers that provide contract employees overall, according to The Times report. The policy is likely to disappoint workers at suppliers with fewer than 50 employees, as well as individuals who provide contract services to Microsoft. "We recognize that this approach will not reach all employees at all of our suppliers, but it will apply to a great many," Smith said in his blog post. "We've long recognized that the health, well-being and diversity of our employees helps Microsoft succeed. Our commitment to them extends beyond the workplace."

Some suppliers surely won't welcome the move as offering paid time off will be more costly for them. Smith indicated that Microsoft will work with them. "We also want to be sensitive to the needs of small businesses," Smith said. "For these reasons, we are going to launch a broad consultation process with our suppliers so we can solicit feedback and learn from them about the best way to phase in the specific details."

Whether or not Microsoft's move will lead other companies to enact similar policies remains to be seen. Employees working for a smaller company may feel further left out. But if you're a proponent of fairness in pay and compensation, this is a noteworthy step to further that goal.

Posted by Jeffrey Schwartz on 03/27/2015 at 12:46 PM


Microsoft Simplifies Azure PaaS with Combined App Services

In a move aimed at making it more appealing for developers and business decision makers to use its cloud platform as a service (PaaS), Microsoft is bringing together its separate Azure app services into one complete offering.

Microsoft describes the new Azure App Service, now available, as a fully managed service that provides a simple way for developers to build customer-facing apps. The new packaging effectively brings together three offerings that, until now, were disparate services -- Web sites, Mobile Services and BizTalk Services -- and makes it easier to integrate with SaaS and on-premises systems.

"It brings those together in new unified experiences," said Omar Khan, Microsoft's director of Azure engineering. "Developers are challenged with trying to connect all that data from these different systems into their apps. That's what App Service helps with. It helps developers integrate data from on-premises and from popular cloud services into their Web and mobile apps. And App Service also has new capabilities around allowing businesses to automate their business processes more easily, allowing them to be more agile."

Khan explained how the four offerings are coming together:

  • Web Apps: Online tools and templates that make it easy to build, deploy and scale apps that are customer facing, for employee productivity or partners.
  • Mobile Apps: Services that enable the tailoring of Web and other apps to key mobile platforms, notably iOS, Android and (of course) Windows.
  • BizTalk Apps: Also described as Logic Apps, these draw on 50 connectors Microsoft now offers to popular SaaS and on-premises apps, including Office 365, Microsoft Dynamics, Salesforce.com, Oracle, SAP, Facebook, Twitter and others.
  • API Apps: These provide the means to expose APIs within App Service so the other three app types -- Mobile Apps, Logic Apps and Web Apps -- can consume them.

"API Apps allow you to take any existing API, whether it's an API in the cloud or an API on-premises, and project that into App Service adding some simple metadata," Khan said. "And in doing so it exposes a slider format, which is a popular format for describing APIs, and thus allowing the other app types to consume those APIs. API Apps also let you then project your own custom APIs into App Service."

The service essentially takes a JSON file describing your API, which you can load into App Service using Microsoft's standard publishing mechanisms. "We support Git, so it's basically uploading a JSON file via Git, and then App Service can basically make those APIs available in a reasonable form. And then you can use them within the regular apps within App Service," he added.
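
For context, the metadata Khan refers to is a machine-readable description of the API's operations. Below is a minimal sketch of producing such a JSON file from Python; the API title, paths and field values are invented for illustration, and the exact fields Azure App Service expects aren't spelled out in the interview.

  import json

  # An illustrative API description in the Swagger 2.0 style Khan alludes to;
  # every value here is a placeholder.
  api_description = {
      "swagger": "2.0",
      "info": {"title": "Orders API", "version": "1.0"},
      "paths": {
          "/orders/{id}": {
              "get": {
                  "parameters": [
                      {"name": "id", "in": "path", "required": True, "type": "string"}
                  ],
                  "responses": {"200": {"description": "A single order"}},
              }
          }
      },
  }

  # The resulting file is what would be checked into Git and published.
  with open("apiapp.json", "w") as f:
      json.dump(api_description, f, indent=2)

Checking a file like this into Git and pushing it is, per Khan, essentially all that's needed for App Service to surface the API to the other app types.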

Asked how the service connects to on-premises applications and systems, Khan explained that the BizTalk connectors address that.  "We have virtual networking in Azure that allows you to connect on-premises resources to the cloud," he said. "They also support hybrid connections which is a BizTalk capability that allows you to do app-to-app connection across firewalls. So these API Apps and the Oracle connector or the SAP connector, among others, utilize those connectivity options in Azure to connect to the on-premises resources and then there's a connector piece that you can run on premises that connects to that API App."

Microsoft is betting that packaging these simplified services together will bring more applications to its Azure PaaS. But Microsoft is also targeting the emerging developers who'll ultimately decide which platforms to build their applications on. The company is now offering Azure for student developers, who can get free usage to learn how to build cloud-based mobile and Web apps using services such as the aforementioned Azure Apps and Azure Insights, which "gives students a 360-degree view across availability, performance and usage of ASP.NET services and mobile applications for Windows Phone, iOS and Android," wrote Microsoft's Steve "Guggs" Guggenheimer in a blog post announcing the offering.

"Student developers are growing up in a world that requires them to leverage cloud services to deliver cool and modern experiences," Guggenheimer noted. "Microsoft Azure is a great fit for students because of its speed and flexibility enabling the creation and development of Web sites and Web apps. This new offer for students, available today in 140 countries, gives young developers access to the latest technology, allowing them to develop in or deploy sites and apps to the cloud, at no cost and with no credit card required."

In addition to Azure Apps and Azure Insights, Guggenheimer noted that the free offering lets students use Microsoft's Visual Studio Online.


Posted by Jeffrey Schwartz on 03/24/2015 at 2:46 PM


11 Device Makers Will Preinstall Office 365 Apps on Android

In its latest show of support for non-Windows hardware, Microsoft on Monday said that 11 device makers will preinstall key apps from its Office suite onto their respective tablets and smartphones for consumers and business users. Leading the pack was Samsung, which said it will preinstall Microsoft Word, Excel, PowerPoint and OneNote on its tablets in the second half of this year. Microsoft said it will offer the apps via a new Microsoft Office 365 and Samsung Knox Business Pack.

Microsoft also said that Dell, along with original device manufacturer Pegatron and regional device makers TrekStor of Germany, JP Sa Couto of Portugal, Italy's Datamatic, Russia's DEXP, Canada's Hipstreet, Pakistan's QMobile, Africa's Tecno and Turkey's Casper, will preinstall the Office 365 components.

Samsung's latest move follows this month's news at the Mobile World Congress in Barcelona that the electronics giant will offer OneNote, OneDrive and Skype on its new Galaxy S6 and Galaxy S6 Edge smartphones. At the time, reports surfaced that a deal to offer Office 365 apps across Samsung's portfolio of Android devices, with Samsung Knox Workspace security integration, was in the works -- speculation that came to fruition this week. Adding to its Galaxy phone announcement from earlier in the month, Microsoft said the devices will come with an additional 100GB of free OneDrive storage for two years.

Microsoft said businesses and enterprises that buy Samsung devices through its channel partners will have a choice of three Office 365 plans: Business, Business Premium or Enterprise bundled with Samsung's Knox, the company's Android-based security platform. The agreement also covers support and setup services.

Samsung and the other 10 hardware providers will offer the preinstalled Office 365 capability later this year, according to Peggy Johnson, Microsoft's executive vice president for business development, hired by Microsoft CEO Nadella from Qualcomm six months ago. "These deals demonstrate how we are working with hardware partners in new ways to deliver rich experiences through their scale," she said in a blog post. "This is a big step forward for our cross-platform and cross-device services strategy, which will bring an array of Microsoft services to every person on every device."

While this is more a bundling deal than a major technical breakthrough, it's likely to appeal to buyers of Android tablets and smartphones who want to keep using Office, whether that means taking out new subscriptions or staying satisfied with their existing plans.

Posted by Jeffrey Schwartz on 03/24/2015 at 2:41 PM


The Death of Internet Explorer in Name Only

When Microsoft talked up its next-generation browser, code-named "Project Spartan," at this week's Microsoft Convergence conference in Atlanta, the obituaries came pouring in for Internet Explorer. The news flashed on the screen of CNBC, appeared on every general news site and was talked about all over social media.

Perhaps those outside of IT hadn't heard about Project Spartan, which Microsoft began talking about in some detail back in January. Microsoft explained at the time that the new browser will contain a new rendering engine, called "EdgeHTML," created by forking the code of Internet Explorer's Trident engine. Spartan will offer both rendering engines, including the legacy MSHTML engine, known as Trident. As reported by my colleague Kurt Mackie:

Organizations will be able to use the Spartan browser even if they have legacy IE support issues to address. When legacy support needs arise, Spartan will be capable of running the old IE Trident engine via Enterprise Mode. Microsoft's Enterprise Mode technology is an IE 11 solution that emulates earlier IE browser technologies all of the way back to IE 5 for compatibility purposes.

Reports this week that Microsoft is killing the Internet Explorer brand do appear to reflect the long-term plan from a branding perspective. MIT Technology Review senior editor Rachel Metz is among those who believe Microsoft should retire Internet Explorer.

"The changes both to the browser and the branding make a lot of sense," she wrote. "Internet Explorer, first released in the mid-1990s, dominated the browser market at its peak in the early 2000s, but it came to be associated with poor security and compatibility with other browsers and has since languished. Spartan's success is critical if Microsoft is to remain relevant in the Web browser business -- a market in which it used to dominate but now trails Google's Chrome."

At the same time, enterprise users aren't going to want to see Internet Explorer, or at least the rendering capabilities of whatever Microsoft calls its next browser, go away. For its part, Microsoft is promising the new browser will offer the same compatibility it has offered in past upgrades. "Project Spartan is Microsoft's next generation browser, built just for Windows 10," according to a company statement. "We will continue to make Internet Explorer available with Windows 10 for enterprises and other customers who require legacy browser support."

Now the question is: what will Microsoft call its new browser?

Posted by Jeffrey Schwartz on 03/20/2015 at 1:03 PM


Free Windows 10 Upgrades for Pirates Still Not Genuine

Microsoft's announcement earlier this week that users of pirated versions of its PC operating system can also take advantage of its free Windows 10 upgrade offer has an important caveat: it's no more official than the bootlegged version.

Terry Myerson, executive vice president of Microsoft's operating systems group, made the head-scratching announcement during the Windows Hardware Engineering Conference (WinHEC) this week in Shenzhen, China. Talking up Microsoft's January announcement that users of Windows 7, 8, 8.1 and Windows Phone could upgrade their systems to the new Windows 10, Myerson told Reuters: "We are upgrading all qualified PCs, genuine and non-genuine, to Windows 10."

Microsoft's goal is to "re-engage" with the hundreds of millions of users of Windows in China, he told the news service, though he declined to elaborate. Given that an estimated 90 percent of the Microsoft software used in China is pirated, that's a lot of free software. But it raises the question: why buy the software when you can get a bootlegged version for a fraction of the cost, if not free? Answering that question, Microsoft issued a statement pointing out that if you upgrade your pirated software, you still have an unlicensed version of Windows 10.

"We have always been committed to ensuring that customers have the best Windows experience possible," according to the statement. "With Windows 10, although non-genuine PCs may be able to upgrade to Windows 10, the upgrade will not change the genuine state of the license. Non-genuine Windows is not published by Microsoft. It is not properly licensed, or supported by Microsoft or a trusted partner. If a device was considered non-genuine or mislicensed prior to the upgrade, that device will continue to be considered non-genuine or mislicensed after the upgrade. According to industry experts, use of pirated software, including non-genuine Windows, results in a higher risk of malware, fraud (identity theft, credit card theft, etc.), public exposure of your personal information and a higher risk for poor performance or feature malfunctions."

By tapping China-based companies Lenovo, Qihoo 360 and Tencent, Microsoft is hoping it'll convince customers to buy legitimate licensed versions of Windows.

 

Posted by Jeffrey Schwartz on 03/20/2015 at 11:38 AM


Windows 10 Free for Pirated Versions Too

In what may seem like a bizarre move, Microsoft said its free Windows 10 upgrade offer also applies to those with pirated older versions of the operating system.

Terry Myerson, executive vice president of Microsoft's operating systems group, made the startling announcement at the Windows Hardware Engineering Conference (WinHEC), taking place this week in Shenzhen, China.

Microsoft earlier this year said it will offer Windows 10 as a free upgrade to Windows 7 and Windows 8.x users. The offer, which will also include earlier versions of Windows Phone, is good only for one year after the release of the new operating system and doesn't apply to all enterprise users.

Given the vast majority of all PC software in China is said to be pirated, Myerson chose a fitting place to announce the move. "We are upgrading all qualified PCs, genuine and non-genuine, to Windows 10," Myerson told Reuters. Microsoft's goal is to "re-engage" with the hundreds of millions of users of Windows in China, he told the news service, though he declined to elaborate.

It appears Microsoft is hoping to get users of pirated software to move to legitimate versions of Windows. Microsoft has tapped China-based companies Lenovo, Qihoo 360 and Tencent to help in that effort.

 

Posted by Jeffrey Schwartz on 03/18/2015 at 2:13 PM


Say Hello to Windows 10 This Summer and Goodbye to Passwords

Microsoft has put Windows 10 on the fast track, saying in an unexpected announcement that the new OS will arrive this summer. That's a surprising acceleration from the fall timeline targeted by Microsoft Chief Operating Officer Kevin Turner back in December.

The company announced the earlier-than-expected delivery date at the Windows Hardware Engineering Conference (WinHEC), taking place in Shenzhen, China this week. Also at WinHEC, as reported yesterday, Microsoft revealed that Windows 10 will aim to transition users away from passwords for logging into their systems, offering instead Microsoft's new biometric authentication tool, called Windows Hello.

While it wasn't initially clear to what extent the Windows Hello technology would be supported in Windows 10, Terry Myerson, executive vice president for the Windows platform group at Microsoft, said at WinHEC and in a blog post that all OEMs have agreed to support it.

Windows 10 will be available in 190 countries and 111 languages when it launches, according to Myerson. Of course, "this summer" is a wide window, given that the OS could arrive anytime between June 21 and Sept. 20. But the expedited release may suggest that Microsoft doesn't want to miss this year's back-to-school season, a time when many students buy new systems. If that's the case, it will need to arrive in June or July rather than late September.

The big question an earlier-than-expected release raises: is Microsoft rushing Windows 10 out the door too soon, and will it come out feature-complete? There are also many new features testers have yet to see, such as the new browser component called Spartan and yesterday's reveal: Windows Hello. Joe Belfiore, corporate vice president for Microsoft's operating systems group, unveiled Windows Hello at WinHEC, saying it provides system-level support for biometric authentication, including fingerprint and facial recognition, as a replacement for passwords.

Hello isn't the first effort to bring biometrics to Windows PCs. PC makers have offered fingerprint scanners on a small selection of models for years, but few customers used them and most devices today have done away with them. This time, it looks like Microsoft is aiming for biometrics that will be pervasive in Windows 10 devices. "We're working closely with our hardware partners to deliver Windows Hello-capable devices that will ship with Windows 10," Myerson said. "We are thrilled that all OEM systems incorporating the Intel RealSense F200 sensor will fully support Windows Hello, including automatic sign-in to Windows."

Myerson said Microsoft is also offering a new version of Windows for smaller Internet of Things devices, ranging from ATMs to medical equipment, thanks to partnerships with the Raspberry Pi Foundation, Intel, Qualcomm and others. Microsoft also showcased Qualcomm's DragonBoard 410C, the first Windows 10 developer board with integrated Wi-Fi, Bluetooth and GPS, built around the Qualcomm Snapdragon 410 chipset.

Posted by Jeffrey Schwartz on 03/18/2015 at 12:55 PM


Office 2016 and Skype for Business Previews Now Available

Microsoft kicked off this week's annual Convergence conference in Atlanta by announcing the preview of Office 2016 for IT pros and developers. It was among several announcements, which also included a preview of Skype for Business.

Office 2016 is the first major upgrade of the Office desktop suite since Office 2013 and follows last month's preview releases of the touch-enabled Word, Excel and PowerPoint apps for Windows 10. Today's Office 2016 release for IT pros and developers gives a far broader look at the new suite, including new features such as Click-to-Run deployment, extended data loss prevention (DLP) support and the new Outlook 2016 client.

Julia White, general manager of Microsoft's Office division, demonstrated the new Outlook 2016 during the opening Convergence keynote. In the demo, she played up Outlook's ability to handle content linked with OneDrive for Business. When a user goes to attach a file, the most recently accessed documents appear and are added as a link to the sender's OneDrive for Business account.

"When I hit send, it looks like an attachment, it feels like an attachment and when I send it, it actually sends the access to the file," White said. "So I don't have to send a physical attachment and deal with versioning." Users can link up on the same document in the cloud, continued White. She also added that the new Outlook will still let users attach actual files.

The new Outlook also offers significant technical improvements, said Kirk Koenigsbauer, corporate vice president for Microsoft's Office 365 Client Apps and Services team, in a blog post announcing the Office 2016 preview. The Outlook improvements he pointed to for IT pros include:

  • MAPI-HTTP protocol: RPC-based sync is replaced with a new Internet-friendly MAPI-HTTP protocol that supports Exchange/Outlook connectivity
  • Foreground network calls: Foreground network calls are eliminated to ensure Outlook stays responsive on unreliable networks
  • Multi-factor authentication: Support for multi-factor authentication via integration with the Active Directory Authentication Library (ADAL); see the sketch after this list
  • E-mail delivery performance: The time it takes to download messages, display message lists and show new e-mail after resuming from hibernation is reduced
  • Smaller storage footprint: New settings let users better manage storage by retaining only 1, 3, 7, 14 or 30 days of mail on the device
  • Search: Improved reliability, performance and usability of Outlook search, with the FAST-based search engine integrated into Exchange
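
Outlook's ADAL integration is baked into the client itself, but the same library family is available to developers. The following is a minimal sketch using the separate Python adal package, with a hypothetical tenant, app ID and resource, purely to illustrate the device-code flow ADAL enables: the user completes sign-in (including any multi-factor challenge) in a browser while the app waits for a token.

    import adal

    AUTHORITY = "https://login.microsoftonline.com/contoso.onmicrosoft.com"  # hypothetical tenant
    RESOURCE = "https://outlook.office365.com"                               # hypothetical resource
    CLIENT_ID = "11111111-2222-3333-4444-555555555555"                       # hypothetical app ID

    ctx = adal.AuthenticationContext(AUTHORITY)

    # Ask Azure AD for a device code; the user signs in (and satisfies MFA) in a browser.
    code_info = ctx.acquire_user_code(RESOURCE, CLIENT_ID)
    print(code_info["message"])

    # Poll until the sign-in completes, then use the returned access token.
    token = ctx.acquire_token_with_device_code(RESOURCE, code_info, CLIENT_ID)
    print("Signed in as:", token.get("userId"))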

The new Office 2016 preview doesn't include all of the features that Microsoft is planning for the new release, Koenigsbauer noted. The new DLP support builds on what Microsoft now offers with Exchange, Outlook, OneDrive for Business and SharePoint. "Now we're bringing these same classification and policy features to Word, Excel and PowerPoint," he said. "With these new capabilities, IT admins can centrally create, manage and enforce policies for content authoring and document sharing -- and end users will see policy tips or sharing restrictions when the apps detect a potential policy violation."

Koenigsbauer noted that the new Click-to-Run deployment feature for Office 365 customers introduces the use of Microsoft's Background Intelligent Transfer Service (BITS), which he said aims to prevent congestion on the network. "BITS throttles back the use of bandwidth when other critical network traffic is present," he said.

Other deployment improvements showcased in the new Office 2016 preview include tighter integration with Microsoft's System Center Configuration Manager (SCCM), more flexible update management for handling feature updates and bug fixes and improved activation management added to the Office 365 Admin Portal.

White also talked up Skype for Business, which Microsoft said back in November would represent the rebranding of the company's Lync platform with its Skype service. "Now all of the Skype for Business users can connect with Skype from a contacts perspective and communicate [with] them with IM, voice and video," she said. "So imagine a sales person connecting with any customer, a doctor connecting with a patient, an employer interviewing someone via Skype. There's so many possibilities with this new experience."

Posted by Jeffrey Schwartz on 03/16/2015 at 2:13 PM


Microsoft Taps Forrester Analyst James Staten as Chief Cloud Strategist

Prominent Forrester Analyst James Staten is joining Microsoft today as chief strategist for Microsoft's Cloud and Enterprise division. Word of his move from Forrester to Microsoft spread quickly over the weekend when Staten updated his LinkedIn status.

In a tweet last night, Staten alerted followers: "Just landed in Redmond. Ready to start my new career at #Microsoft." When asked by a Twitter friend if he's relocating to Redmond, Staten replied: "Staying in Silicon Valley. Working w/VCs, startups and local companies r key to my job."

Staten didn't immediately respond to an e-mail from me, though I've known him for many years as someone who had a firm grasp on the competitive strengths and weaknesses of all the public infrastructure-as-a-service (IaaS) cloud players. Furthermore, he had forecast the evolution of IaaS pretty much from the beginning and has consulted with many customers making cloud computing decisions.

It is a safe bet that his primary focus will be to advance the Azure cloud business. Staten's tweets regarding his move included the Azure hashtag. Given his deep knowledge of the IaaS landscape, Microsoft will have a strong and credible executive on the Azure team, which should also help the company's effort to gain further inroads in Silicon Valley.

At Forrester, Staten was based in Silicon Valley and, prior to joining the research firm, he worked at Azul Systems and in Sun Microsystems' software business.

In a statement Microsoft said Staten will report to Cloud and Enterprise Executive VP Scott Guthrie, where he'll "work closely with Scott's leadership team on delivering the most complete set of cloud capabilities and services to customers small and large."

 

Posted by Jeffrey Schwartz on 03/16/2015 at 10:18 AM


Windows 10 Anticipation Puts PC Sales on Skids

Intel's warning that revenues could be off by about $1 billion weighed on its shares Thursday, stoking fears that PC sales may remain weak until Microsoft ships its new Windows 10 operating system.

The chipmaker's revised forecast for the first quarter is revenue of $12.8 billion, give or take $300 million, compared to the prior prediction of $13.7 billion, give or take $500 million. Intel said lower than anticipated demand for business desktop PCs across the supply chain spurred the revised forecast.

Lower than expected Windows XP refresh activity has affected inventory levels across its supply chain, Intel said. Businesses and consumers are taking an "if it ain't broke, don't fix it" attitude toward their old PCs, Summit Research Analyst Srini Sundararajan told Reuters. But with Microsoft's Windows 10 waiting in the wings, many PC buyers are likely putting off upgrades as well. As research by Redmond magazine and others is showing, many users plan to upgrade to Windows 10 and are awaiting the new hardware in the pipeline that will support it.

Like many other companies, Intel said currency conditions in Europe will also affect revenues. That has led to an increase in PC prices in Europe, which has dampened demand, according to various reports. BlueFin Research Partners is forecasting that about 76 million PCs will ship this quarter, a decline of 8 to 9 percent, Reuters reported.

Intel's datacenter business forecast remains unchanged, the company said.

Posted by Jeffrey Schwartz on 03/12/2015 at 11:42 AM


Free Scaled-Down App Load Balancer Now Available for DevOps

Kemp Technologies is now offering a free version of its LoadMaster application load balancer, which the company hopes developers and DevOps managers will use for distributed workloads that don't require a lot of capacity.

While the move is aimed at seeding its virtual appliance, the company is letting organizations use it permanently for production workloads. The catches are that operators have to re-register it with Kemp's licensing server every 30 days, throughput is capped at 20 Mbps and it doesn't support the high availability features of the commercial version, said Kemp Product Manager Maurice McMullin.

Nevertheless, the company believes developers and DevOps managers will use the free virtual load balancer for testing, development and running non-critical applications that don't generate large amounts of network traffic. "It's reasonably configured," McMullin said, noting that it includes the intrusion detection system and Web application firewall.

"People could use it in a preproduction environment and a dev-test environment and potentially even some production environments for non-critical low volume workloads," he added. One example might be a time sheet entry system where the application is distributed among locations but isn't used frequently and is not business critical.

Kemp isn't the first to offer a free load balancer. For example, HAProxy offers an open source version of its namesake load balancing software and operates a community site. But McMullin argues the Kemp offering is suited for mainstream VMware and Hyper-V workloads and can run in public clouds including Amazon Web Services, Microsoft Azure and VMware vCloud Air.

The company said its free offering allows complete testing and validation of applications with Kemp's global site load balancing (GSLB) and Edge Security Pack, which offers single sign-on. The free software also supports the Kemp Web Application Firewall Pack as well as its REST API and Windows PowerShell and Java API wrappers.

 

Posted by Jeffrey Schwartz on 03/12/2015 at 3:39 PM


Hillary Clinton Isn't the Only One To Bypass IT

News that Hillary Clinton operated an e-mail server out of her house in Chappaqua, N.Y. for both personal and official communications while serving as Secretary of State underscores how far people will go for convenience and control -- even if it means bypassing IT to do so.

While maintaining that she didn't break any laws or send any classified messages using her personal e-mail account instead of the official e-mail system the rest of the government uses, Clinton has sparked fierce debate over the propriety of her decision to take matters into her own hands. Aside from the legal issues and obvious questions, such as whether her use of personal e-mail really went unnoticed for four years and whether the system she used was as secure as the government's network (some argue hers might have been more secure), her actions are far from unique.

In my reporting on this month's Redmond magazine cover story about the forthcoming end of life of Windows Server 2003, IT pros lamented that they discovered many unsanctioned servers in use -- often under employees' desks. For better or worse, many companies have become more tolerant of employees bypassing IT than they were years ago in part due to the bring-your-own-device (BYOD) trend brought on by the advent of smartphones and tablets. I often receive business-related e-mail from high-level people at companies of all sizes from their personal e-mail addresses -- usually a Gmail, Yahoo or Outlook.com address -- and I'm sure you do too.

Many employees in organizations not wanting to wait for IT to provision systems have spun up VMs by setting up an Amazon Web Services or Azure account with a credit card. And it's certainly become common for business users to set up accounts with Salesforce.com and Workday, among other SaaS applications. Many are also setting up Office 365 accounts on their own. Services such as OneDrive, Google Drive, Dropbox and Box have replaced flash drives for copying and storing files. A survey by data protection vendor Vision Solutions found that 52 percent of organizations don't have processes to manage the use of such services, putting confidential data at risk of loss.

Apparently the U.S. government -- or at least the State Department, which she headed from 2009 to 2013 -- was one of them. Indeed, setting up your own Exchange Server (I'm assuming that's what she used, since that's what runs the Clinton Foundation's e-mail) is a more brazen move than most will make, given the cost.

From news accounts we know she's not the only government official to use personal e-mail for routine business communications, though she's the highest level and, for now, the most infamous one to do so. If we're to take Clinton at her word -- and I realize many don't -- she said yesterday she used her own e-mail for convenience. If that means she was trying to balance her work and personal life by using one account, I think we can all agree it was a bad idea; she said as much, though she could have separated the two and still used one device, as many of us do.

Given that she was the nation's top diplomat and is a potential presidential candidate, the fallout from this is far from certain as the issue continues to stir debate.

In the end, businesses and government agencies of all sizes have to establish policies that address what employees at all levels can and can't do when it comes to using IT. Absent any clear rules and enforcement of them, you likely have at least one person, if not many, like Hillary Clinton in your organization.

Posted by Jeffrey Schwartz on 03/11/2015 at 1:19 PM


The Apple Watch Doesn't Have To Be a Hit Right Away

Many people have asked me if I plan on getting an Apple Watch when it comes out next month. The answer is, not the first version and probably not the second either. I'm not sure if I'll ever buy one but haven't ruled it out in case the price and performance are right.

Apple's launch event yesterday confirmed what we already presumed. The Apple Watch will ship next month (preorders begin April 10 and they'll appear in stores April 24) and the starting price is $350. If you want to spring for one with an 18-karat gold band, that'll cost $10,000 and if you must have the most expensive model -- with sapphire faces -- it'll set you back $17,000. If you collect Rolexes and the like, it'll be the perfect addition to your collection.

Who needs an extension of their iPhone on their wrist? Let's face it, the Apple Watch is just the latest accessory to the iPhone, which is required for the watch to work. If you want to glance at your messages, view alerts and maybe access some information without removing your iPhone from your pocket, it certainly could be convenient. The question is how much most people will be willing to pay for that convenience.

Back in 2007, when the first iPhone came out, it cost $599, and that was with the carrier subsidy (only AT&T offered them for the first few years). And they were basically just iPods with phones, e-mail access and a handful of other apps. When the iPod came out in 2001, Apple certainly wasn't the first to release an MP3 player. In both cases, though, the company was the first to legitimize and create a mass market for its offerings in a way others before it were unable to do.

Will Apple be able to catch lightning in a bottle yet again? While that remains to be seen, it's a reasonable bet we'll see substantially less expensive versions of the Apple Watch in the next few years that'll be much more functional than the $17,000 models that are now debuting.

 

Posted by Jeffrey Schwartz on 03/10/2015 at 2:13 PM


Microsoft and Sphere3D To Develop Windows Containers for Azure

In its push to simplify migration of Windows applications to cloud infrastructures without dependencies on hardware or software platforms, Microsoft has added Sphere 3D as its latest partner to deliver Windows containers. The two companies announced a partnership today to deliver Glassware 2.0 Windows containers for Azure.

Sphere 3D said it's collaborating with Microsoft to develop tools to simplify the migration of Windows-based end user applications to Azure. The two companies are first working to offer Glassware 2.0-based workloads in Azure for schools. Later in the year, Sphere 3D will offer other tools, the company said. Unlike Microsoft's higher-profile container partner Docker, which is open source, Glassware 2.0 is a proprietary platform designed to virtualize applications without requiring a virtualized desktop.

The Glassware 2.0 suite includes a micro hypervisor which the company calls a "Microvisor." Unlike a traditional hypervisor, which requires a guest OS for applications to run, the "Microvisor only pulls in elements of the OS stack needed for the software application to run, and also fills in any gaps that may be present, in particular with applications needing functionality not inherent in whichever OS stack you happen to be using," according to the company's description.

Glassware 2.0 also includes containers, management tools and clustering software. The containers are designed to run multiple instances of the same app in a Glassware 2.0-based server. It provides the ability to share binaries, libraries or the Glassware 2.0 Microvisor, according to the company. This environment provides access only to those components of an operating system an application needs to run. It supports applications running in Windows XP, Windows 7 and Windows 8.x environments.

"When we created Glassware 2.0, we envisioned a time where any application, regardless of its hardware or operating dependencies, could be easily delivered across multiple platforms from the cloud," said Eric Kelly, Sphere 3D's CEO in a statement. "Today, by joining forces with Microsoft, we have taken a substantial step towards realizing that vision."

The company says the Glassware 2.0 Microvisor can virtualize infrastructure components and the application stacks from both Windows and non-Windows-based systems and claims it can "outperform" any existing hypervisor-based infrastructure. Furthermore, the company said it can be used for systems and cloud management, orchestration and clustering. The Glassware Manager runs in Windows Server 2008 and above.

Posted by Jeffrey Schwartz on 03/09/2015 at 1:23 PM


Apple Is Prepping Larger iPad To Challenge Surface Pro

Forget about next week's anticipated launch of the Apple Watch or the fact that the company will be added to the select 30 companies in the Dow Jones Industrial Average, knocking out AT&T. Apple's latest assault on the enterprise -- a market it has largely eschewed over its 39-year history -- is said to include a new iPad which, in many ways, could appeal to the same potential users of Microsoft's Surface Pro 3.

A larger iPad has been rumored for some time now. The new device is said to include a USB 3.0 port, keyboard and mouse, according to a report in today's Wall Street Journal. At this point, according to the report (on which, as usual, Apple had no comment), the company is still just considering the USB port -- a feature the late Steve Jobs was staunchly against. Apple had originally told suppliers it was looking to deliver the larger 12.9-inch iPad this quarter. Now it apparently is in the pipeline for the second half of the year.

At 12.9-inches, the new iPad would actually be slightly larger than the current Microsoft Surface Pro 3, which now sports a 12-inch form factor. It would be considerably larger than the current iPad Air, which is 9.7 inches.

Would a larger iPad with USB support, a keyboard and mouse compete with the Surface Pro 3 and other Windows-based PCs? Before jumping on me for comparing apples to oranges, keep in mind that an iPad that lets people run Office for basic productivity tasks could appeal to numerous individuals, even if the tablets don't have the 8GB of RAM and latest Intel Core processors that the Surface Pros have.

The move makes sense for Apple, which sold a mere 21 million iPads last quarter. While Microsoft has shipped only a fraction of that many Surface Pros, sales are also on the rise, though they still lack the momentum some would like to see. Certainly the new extra-large iPhone 6 Plus is cannibalizing sales of iPads, so the natural way to target the tablets is toward enterprise workers. Just as the bolstered iPad will compete with Windows PCs, it also has the potential to cut into at least some sales of low-end Macs.

In the latest boost for Apple's iOS platform, the company's new enterprise partner IBM this week launched additional applications in its MobileFirst for iOS suite at the Mobile World Congress in Barcelona. Among the new apps are Dynamic Buy for retailers, Passenger Care for companies in the travel and transportation industry and Advisor Alert for banks. Big Blue said with the launch of its latest apps, 50 customers are already using them including Air Canada, American Eagle Outfitters, Banorte, Boots UK, Citigroup and Sprint.

The rise of iOS in the enterprise may not have peaked yet. But at the same time, it's not a death knell for Windows. Microsoft is well aware of the new client device landscape and the fact that these new device types aren't going away. From my standpoint, different devices are suited to different purposes and it's nice to have both at my disposal.

Posted by Jeffrey Schwartz on 03/06/2015 at 11:28 AM


LinkedIn Pares Back Support for Outlook Social Connector

LinkedIn, the popular social network for business users, has informed customers that its social connector will no longer work with older versions of Outlook after next Monday, March 9.

The announcement, sent in an e-mail to customers, stated that the connector will still work with the most recent release, Outlook 2013. Those with Outlook 2003, 2007 and 2010 will no longer be able to view information about their LinkedIn contacts, the company said in the e-mail.

"Our team is working with Microsoft to build even more powerful tools to help you stay connected with your professional world," according to the e-mail that I received this afternoon. "Until then you can get similar capabilities with the 'LinkedIn for Outlook' app for Outlook 2013 from the Office Store."

The move comes just over five years after Microsoft debuted the Outlook Social Connector for LinkedIn. Though I've tested the feature, I don't currently use it.

Do you use the connector actively, or at all? If you have an older version of Outlook is this move likely to convince you to upgrade to Outlook 2013?

 

Posted by Jeffrey Schwartz on 03/02/2015 at 1:08 PM


Update Windows Server 2003 or Die?

Failure to update your systems and applications running Windows Server 2003 could have deadly consequences. That's the message that Microsoft Distinguished Engineer Jeffrey Snover conveyed over the weekend when he tweeted his warning about what will happen to those who keep Windows Server 2003-based systems running after July 14:

Not updating from WS2003 is like the guy who jumps off a building on the way down says, "so far so good." #ThisIsNotGoingToEndWell

Microsoft has been pretty vocal about the need to migrate off Windows Server 2003. Snover, who is highly regarded in the Microsoft MVP community, is the latest of many in Redmond trying to be clear about the risks Microsoft says customers will face if they don't address the situation. In simple terms, there are still millions of Windows Server 2003-based systems in service. After July 14, Microsoft will no longer issue security patches. That means those servers could become conduits to spread malware or other threats. I go much deeper into that in this month's Redmond magazine cover story.

Microsoft recommends upgrading to Windows Server 2012 R2 or urges users to consider moving the applications affected by the loss of support to the cloud, if that makes sense. According to a survey conducted last year by application remediation vendor AppZero, more than one-third of respondents (36 percent) said there will be a cloud component to their upgrade process.

Perhaps one of the biggest challenges to upgrading is that it requires organizations to decommission their Windows Server 2003 Active Directory domain controllers and migrate the schema to a more current iteration of AD. MVP John O'Neill Sr., who has joined the roster of Redmond magazine contributors, aptly explains how to do so.

Do you have a plan in place? Will you migrate to Windows Server 2012 R2 or are you looking at a pure cloud-based deployment of your applications? Perhaps you're planning a hybrid architecture? Or maybe you simply don't agree with Snover's latest warning of the perils of doing nothing? Share what you're going to do about the pending end of support for Windows Server 2003.

 

Posted by Jeffrey Schwartz on 03/02/2015 at 11:49 AM


Microsoft Extends Support for New Docker Orchestration Tools

Docker today released several new tools aimed at letting IT pros and developers build and manage distributed applications that are compatible across environments including Amazon Web Services, Google, IBM, Joyent, Mesosphere, Microsoft and VMware.

Over the past year, these players have pledged support for Docker's open source container environment, which has quickly emerged as the next-generation architecture for developing and provisioning distributed apps. Today's beta releases are key deliverables by Docker and its ecosystem partners to advance the building, orchestration and management of the container-based platform.

For its part, Microsoft said it is supporting the newly released betas of Docker Machine (download here), the tool that gives IT pros and developers the ability to automate the provisioning and management of Docker hosts on Linux or Windows servers, and Docker Swarm, Docker's native clustering tool, which pools Docker hosts (including Azure virtual machines) into a single virtual host. Microsoft also said it will natively support the new Docker Compose developer tool through its Docker Azure extension.

The new Docker Machine beta allows administrators to select an infrastructure on which to deploy applications built in the new environment. Microsoft has contributed an Azure driver that allows for rapid and agile creation of Docker hosts on Azure Virtual Machines. "There are several advantages to using Docker Machine, including the ability to automate the creation of your Docker VM hosts on any compatible OS and across many infrastructure options," said Corey Sanders, Microsoft's director of Azure program management, in a post on the Microsoft Azure blog.

"With today's announcement, you can automate Docker host creation on Azure using the Docker Machine client on Linux or Windows," he added. "Additionally, Docker Machine provides you the ability to manage and configure your hosts from a single remote client. You no longer have to connect to each host separately to perform basic monitoring and management tasks, giving you the flexibility and efficiencies of centralized devops management."

With Docker Swarm, which turns a pool of Docker hosts into a single virtual host, IT pros can deploy their container-based apps and workloads using native Docker clustering and scheduling functions, Sanders added. It also lets customers select cloud infrastructure such as Azure, enabling them to scale as dev and test needs dictate. Sanders noted that, using the Docker CLI, customers can deploy Swarm to enable scheduling across multiple hosts. The Docker Swarm beta is available for download on GitHub.
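
Here's a similarly hedged sketch of that CLI workflow: generating a cluster token with Swarm's hosted discovery service and then creating Swarm-enabled hosts with Docker Machine. The --swarm flags and the token:// discovery scheme reflect the beta tooling as I understand it, and the Azure driver credentials are omitted for brevity.

    import subprocess

    # Generate a cluster ID (token) via Swarm's hosted discovery service.
    token = subprocess.check_output(["docker", "run", "--rm", "swarm", "create"]).decode().strip()

    # Create a Swarm master, then an agent, on Azure VMs (Azure credential flags omitted).
    subprocess.check_call([
        "docker-machine", "create", "-d", "azure",
        "--swarm", "--swarm-master", "--swarm-discovery", "token://" + token,
        "swarm-master",
    ])
    subprocess.check_call([
        "docker-machine", "create", "-d", "azure",
        "--swarm", "--swarm-discovery", "token://" + token,
        "swarm-node-00",
    ])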

The Docker Compose tool enables and simplifies modeling of multi-container Docker solutions using the declarative YAML file format. "This single file will be able to take a developer-modeled application across any environment and generate a consistent deployment, offering even more agility to applications across infrastructure," Sanders noted. "In Azure, we are working to expand our current Docker extension to support passing of the YAML configuration directly through our REST APIs, CLI or portal. This will make the simple even simpler, so you can just drop your YAML file details into the Azure portal and we take care of the rest."
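
To make that concrete, here is a minimal sketch of a two-container Compose model written and launched from Python. The YAML follows the original docker-compose format of the time and uses stock nginx and redis images; it's an illustration of the declarative approach rather than anything taken from Microsoft's or Docker's announcements.

    import subprocess

    COMPOSE_YAML = """\
    web:
      image: nginx
      ports:
        - "80:80"
      links:
        - redis
    redis:
      image: redis
    """

    # Write the declarative model to disk, then bring the whole application up.
    with open("docker-compose.yml", "w") as f:
        f.write(COMPOSE_YAML)

    subprocess.check_call(["docker-compose", "up", "-d"])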

Microsoft said it will be releasing documentation on building Docker hosts on Azure virtual machines using Docker Machine.

Posted by Jeffrey Schwartz on 02/26/2015 at 12:53 PM


How Far Did the NSA Go in Alleged SIM Card Hack?

The National Security Agency (NSA) continues to hold its stance that the only way to thwart terrorist attacks and other crimes is to continue the surveillance programs exposed by Edward Snowden nearly two years ago. The latest report alleges that the NSA, along with its British counterpart, the Government Communications Headquarters (GCHQ), stole the encryption keys used by the SIM cards in smartphones.

Documents provided by Snowden and reported last week by The Intercept allege that the U.S. and British governments specifically were hacking into SIM cards from Gemalto, the largest provider of SIM cards, used in smartphones to store encrypted identity information. According to the report, the breach was outlined in a secret 2010 GCHQ document.

If indeed the encryption keys were stolen, it gave the agencies the ability to eavesdrop on and wiretap voice and data communications without approval from governments or wireless providers. The bulk key theft also gave the agencies the ability to decrypt communications that they had already intercepted, according to the report. The ability to do so was the result of mining communications of engineers and other Gemalto employees, the report added, noting that the company was "oblivious to the penetration of its systems."

Now Gemalto is casting doubt on the severity of the breach. The company released a statement acknowledging that it detected the intrusions, which took place in 2010 and 2011. The findings of its investigation "give us reasonable grounds to believe that an operation by NSA and GCHQ probably happened," according to the Gemalto statement. However, in questioning the extent of the breach, the statement said that "the attacks against Gemalto only breached its office networks and could not have resulted in a massive theft of SIM encryption keys."

By 2010, the company said, it had already implemented a secure transfer system with its customers, and only in some rare instances could theft have occurred. Moreover, many of the targeted countries at the time had only 2G mobile networks, which are inherently insecure; the modern 3G and 4G networks weren't vulnerable to such interceptions, according to the company. Gemalto said none of its other products were affected by the attack. While the statement also pointed to some inconsistencies in the leaked document, including some of the customers it claimed the company worked with, Gemalto said that its SIM cards have customized encryption algorithms for each telecom provider.

For its part, the NSA is making no apologies on its surveillance policies. NSA Director Mike Rogers spoke last week at the New America Foundation's cyber security conference in Washington, D.C., where he said backdoors would not have a negative impact on privacy, weaken encryption or dampen demand for technology from the U.S.

Alex Stamos, Yahoo's chief information security officer, who was in attendance at the conference, took Rogers to task over his contention that the government should have backdoors or master keys, according to The Guardian. When Stamos asked Rogers how Yahoo, which has 1.3 billion users throughout the world, could be expected to address requests for backdoors, Rogers reportedly skipped over the foreign requests, describing the overall process as "drilling a hole in a windshield. I think that this is technically feasible. Now it needs to be done within a framework."

The problem is, it's unlikely that the feds will come up with a framework that will sit well with many people.

Posted by Jeffrey Schwartz on 02/25/2015 at 10:27 AM


Lenovo CTO Finally Apologizes for PC Security Fiasco

Lenovo Chief Technology Officer Peter Hortensius yesterday apologized for the Superfish spyware installed on several of its PC models, saying it shouldn't have happened and that the company is putting together a plan to ensure it never happens again.

"All I can say is we made a mistake and we apologize," Hortensius said in an interview with The New York Times. "That's not nearly enough. So our plan is to release, by the end of the week, the beginning of our plan to rebuild that trust. We are not confused as to the depth of that this has caused people not to trust us. We will do our best to make it right. In the process, we will come out stronger. But we have a long way to go to make this right."

Hortensius said so far Lenovo has not seen any evidence that the malicious software that was embedded deep within the company's systems put any customers or their data at risk. "We are not aware of this actually being used in a malevolent way," he told The Times' Nicole Perlroth. Asked if it's possible that Lenovo engineers installed this on any other models than the two already reported (the Yoga 2 models and Edge 15), Hortensius said he didn't believe so but the company is investigating and will have an answer by the end of the week.

Nevertheless, some of his responses were troubling. Why did it take more than a month for Lenovo to get to the bottom of this once it was reported to the company? "At that time, we were responding to this issue from a Web compatibility perspective, not a security perspective," he said. "You can argue whether that was right or wrong, but that's how it was looked at." Hortensius also wasn't able to answer Perlroth's question regarding how the opt-in process works.

He was also unable to explain how the company was unaware that Superfish was hijacking the certificates. "We did not do a thorough enough job understanding how Superfish would find and provide their info," he said. "That's on us. That's a mistake that we made."

Indeed mistakes were made. Some might credit him for saying as much and apologizing. But based on the comments from my report on the issue earlier this week, it may be too little, too late.

"I didn't trust Lenovo even before this issue," said one commenter who goes by the name "gisabun." "Expect to see sales drop a bit [even if the corporate sales are generally unaffected]. Microsoft needs to push all OEMs to remove unnecessary software."

"Bruce79" commented: "Inserting a piece of software that opens unsuspecting users up to security attacks? That is a clear betrayal, regardless of price."

Kevin Parks said, "We need a class-action lawsuit to sue them into oblivion. That would tell vendors that we won't accept this kind of behavior."

Another had a less extreme recommendation: "What Lenovo could and should do is simple. Promise to never put third-party software on their machines for [X number] of years. After X number of years, no software will be preloaded; Lenovo will ask if you want the software downloaded and installed."

Was the Lenovo CTO's apology a sincere mea culpa or was he just going into damage-control mode? Do you accept his apology?

Posted by Jeffrey Schwartz on 02/25/2015 at 9:36 AM


Docker Container Management Comes to FactFinder App Monitoring Suite

BlueStripe Software today said its FactFinder monitoring suite now supports distributed applications residing in Docker containers. The company said an updated release of FactFinder will let IT operations administrators monitor and manage applications deployed in Docker containers.

Granted, full-blown transaction-oriented systems that are developed and deployed in Docker containers today are few and far between. But BlueStripe Director of Marketing Dave Mountain said a growing number of development teams are prototyping new applications that can use Docker containers, which are portable and require much less overhead than traditional virtual machines.

"It's something where there's a lot of activity on the dev site with Docker containers," Mountain said. "Generally when you hear people talking about it, it's very much at the development level of [those] prototyping new ideas for their applications and they're pulling things together to build them quickly. We're not seeing them being deployed out to the production environments, but it's coming. As such, we want to be ready for it."

FactFinder is a tool used to monitor transactions distributed across server, virtualization and cloud platforms and is designed to troubleshoot the root cause of a transaction that might be failing or just hanging. The company last year added a Microsoft System Center Operations Manager module and the Microsoft Azure Pack for hybrid cloud infrastructures. The BlueStripe "collectors" scan physical and virtual machines to detect the processes taking place. With the new release, BlueStripe can view, isolate and remediate processes housed within a container just as if they were in a physical or virtual machine.

Though these are early days for Docker containers, Mountain believes they will indeed become a key tier in distributed datacenter and cloud architectures. "As it continues to go, we expect this to become more mainstream. So this was a move on our part to make sure we're addressing that need," Mountain said. "I think it's real, I don't think this is just hype."

Posted by Jeffrey Schwartz on 02/24/2015 at 10:25 AM


Lenovo Betrayed Customer Trust by Installing Insecure Adware

Lenovo's decision to install the adware program Superfish on some of its PCs, notably the Yoga 2 models and Edge 15, was the latest inexcusable action by a company we should be able to trust to provide a secure computing environment. It's hard to understand how Lenovo could let systems carrying software able to bypass the antimalware protection it bundled from McAfee (as well as others) into the market.

While Microsoft swiftly updated its Windows Defender to remove the certificate for Superfish and Lenovo on Friday released its own downloadable removal tools including source code, this wasn't just another typical bug or system flaw.

Unbeknownst to customers, Lenovo apparently installed the Superfish software, designed to track users' online sessions including all SSL traffic, leaving their systems vulnerable to hackers stealing passwords and other sensitive information. Adding insult to injury, Lenovo took the rather unscrupulous step of installing it at the BIOS level, making it impervious to antimalware and AV protection software.

Justifying the move, Lenovo said it had knowingly installed the adware under the guise that it would "enhance the shopping experience." The only thing it enhanced was users' suspicion that the companies Lenovo does business with are putting their information at risk to further their own objectives.

Just in the past few weeks, we learned that hackers stole user information from Anthem, the nation's second largest health insurer; some 80 million customers (myself included) had their private information exposed in that attack. Also last week, the latest leak by Edward Snowden to The Intercept accused the National Security Agency (NSA) and the British government of hacking into SIM cards from Gemalto, a company whose chips are used in smartphones, passports and identity documents to store personal information. And the list goes on.

What's galling about the Lenovo incident is that the company only put a stop to it when Peter Horne, the person who discovered it, raised the issue (the company argued it acted due to negative user feedback). Horne, a veteran IT professional in the financial services industry, came across the Superfish installation on the Lenovo Yoga 2 notebook he bought. Horne told The New York Times that not only did the bundled McAfee software not discover it, but Superfish also got past the Trend Micro AV software he installed. Looking to see how widespread the problem was, he visited Best Buy stores in New York and Boston and retailers in Sydney and Perth, and the adware was installed on all the PCs he tested.

Yet upon fessing up, Lenovo argued that it was only installed on consumer systems, not ThinkPad, ThinkCentre, Lenovo Desktop, ThinkStation, ThinkServer and System x servers. Horne had a rather pointed suspicion about Lenovo's decision to install the adware in the first place. "Lenovo is either extraordinarily stupid or covering up," he told The Times. "Either one is an offense to me."

But he noted an even bigger issue. "The problem is," he said, "what can we trust?"

Posted by Jeffrey Schwartz on 02/23/2015 at 2:50 PM


Microsoft Extends Open Source Support for Big Data Services

Microsoft CEO Satya Nadella "loves" Linux. So it should come as little surprise that Microsoft is planning to support its Azure HDInsight big data analytics offering on the open source server platform. The company announced the preview of HDInsight on Linux at the Strata + Hadoop World conference in San Jose, Calif. Microsoft also announced the release of Azure HDInsight running Storm, the popular Apache platform for streaming analytics.

The open source extensions aim to widen Microsoft's footprint in the growing market for big data services, enable users to gather more information they can parse and analyze to make better decisions, and bring big data into mainstream use, as Microsoft has signaled with its development of Cortana, now available on Windows Phone and in beta on Windows 10.

In addition to the public preview of HDInsight on Linux and general availability of Apache Storm for HDInsight, Microsoft announced Hadoop 2.6 support in HDInsight, new virtual machine sizes, the ability to grow or reduce clusters running in HDInsight and a Hadoop connector for DocumentDB.

"This is particularly compelling for people that already use Hadoop on Linux on-premises like on Hortonworks Data Platform because they can use common Linux tools, documentation and templates and extend their deployment to Azure with hybrid cloud connections," said T. K. "Ranga" Rengarajan, corporate vice president for Microsoft's Data Platform and Joseph Sirosh, corporate vice president for Machine Learning, in a blog post.

Support for Storm is also another key advance for Microsoft as it has emerged as a widely adopted open source standard for streaming analytics. "Storm is an open source stream analytics platform that can process millions of data 'events' in real time as they are generated by sensors and devices," according to Ranga. "Using Storm with HDInsight, customers can deploy and manage applications for real-time analytics and Internet-of-Things scenarios in a few minutes with just a few clicks."
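
Storm topologies on HDInsight are typically written in Java or C#, but Storm's multi-lang protocol also supports Python. As a flavor of the programming model, here is a minimal bolt sketch assuming the storm.py helper that ships with Storm's multi-lang resources; the topology definition that wires it to a spout is omitted.

    import storm

    class SplitSentenceBolt(storm.BasicBolt):
        # Receives tuples containing a sentence and emits one tuple per word.
        def process(self, tup):
            for word in tup.values[0].split(" "):
                storm.emit([word])

    SplitSentenceBolt().run()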

Despite its open source push, Microsoft isn't part of the Open Data Platform initiative that was announced this week to ensure an interoperable Apache Hadoop core. Among those on board are GE, Hortonworks, IBM, Infosys, Pivotal, SAS, Altiscale, Capgemini, CenturyLink, EMC, Splunk, Verizon Enterprise Solutions, Teradata and VMware.

Asked why, a Microsoft spokeswoman stated, "Microsoft is already partnered with Hortonworks to use HDP which will utilize the Hadoop core from the Open Data Platform Initiative moving forward. We also will continue to contribute to the broader Apache Hadoop ecosystem." The statement also offered support for the project. Microsoft sees the Open Data Platform Initiative as a good step forward to having everyone run on the same Hadoop core including HDFS, YARN and Ambari. "We see standardization in the Hadoop space as a good thing as it reduces fragmentation and makes adoption of the technologies easier."

In addition, Microsoft is focused on contributing to Hadoop projects like Hive (Project Stinger, Tez), YARN, REEF and others, as well as partnering with Hortonworks, she said. "We see this Open Data Platform Initiative as complementary to these efforts and will help the overall Hadoop ecosystem."

Posted by Jeffrey Schwartz on 02/20/2015 at 12:20 PM


Microsoft Aims To Make Machine Learning Mainstream

Many of us look forward to the day when we can get any information we want and have systems intelligently bring us what we're looking for. In a sense that's what Microsoft's new Azure Machine Learning service aims to do. While IBM is among those who have demonstrated the concept with Watson and is looking to advance the technology as well, Microsoft is looking to bring the service to the masses more easily and affordably.

"Simply put, we want to bring big data to the mainstream," wrote Joseph Sirosh, corporate vice president for Machine Learning, in a blog post announcing the general availability for the service that was announced last summer. Azure Machine Learning is based on templates and usual workflows that support APIs and Web services, enabling developers to tap into the Azure Marketplace to easily pull together components to build applications that incorporate predictive analytics capabilities.

"It is a first-of-its-kind, managed cloud service for advanced analytics that makes it dramatically simpler for businesses to predict future trends with data," Sirosh added. "In mere hours, developers and data scientists can build and deploy apps to improve customer experiences, predict and prevent system failures, enhance operational efficiencies, uncover new technical insights or a universe of other benefits. Such advanced analytics normally take weeks or months and require extensive investment in people, hardware and software to manage big data."

Yet despite the rapid rollout of new Hadoop-based services, which are the underpinnings of the most sought-after predictive analytics platforms, adoption has somewhat stalled, according to a survey conducted during Gartner's latest quarterly Hadoop webinar. The percentage of the 1,200 participants who this month said they have deployed Hadoop-based applications has remained flat since last quarter's survey (only 15 percent said they have actually deployed).

However, when the Gartner survey results are examined based on respondents who said they were in the "knowledge gathering" mode, the percentage of Hadoop deployments was lower than 15 percent. Meanwhile, those who said in the survey that they were developing strategies for Hadoop had rates of deployment that were higher than 15 percent. Gartner Research VP Merv Adrian indicated in a blog post that while it's hard to draw any broad conclusions, it may indicate renewed interest by those who have put their plans on hold. "My personal speculation is that it comes from some who have been evaluating for a while," he said.

And indeed there is plenty to look at. Microsoft has rolled out some noteworthy new offerings and is gaining partner support. That includes the latest entry to the Azure Marketplace, Informatica, which released its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage.

"Users of Azure data services such as Azure HDInsightAzure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data," wrote Informatica's Ronen Schwartz, in a blog post. "The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others -- for advanced analytics."

Do you think machine learning is ready for prime time?

 

Posted by Jeffrey Schwartz on 02/20/2015 at 12:54 PM


SharePoint MVPs: 'On-Prem is Very Much Alive and Well'

A number of prominent SharePoint MVP experts say they are confident that the on-premises server edition of SharePoint has a long future, despite Microsoft's plans to extend the capabilities of its online counterpart -- Office 365 -- as well as options to host it in a public cloud service such as Azure. At the same time, many realize that customers are increasingly moving, or considering moving, some or all of their deployments to an online alternative, either by hosting SharePoint in the cloud or by moving to Office 365 and SharePoint Online.

In one of many Tweetjams -- online discussions via Twitter -- hosted by prominent SharePoint MVP Christian Buckley (@buckleyplanet), the experts weighed in on the forthcoming SharePoint 2016 release, due out later this year, and what it will mean for the future of the premises-based edition.

"On-prem is very much alive and well. Don't think it's going away anytime soon," said MVP Asif Rehmani (@asifrehmani), founder and CEO of Chicago-based VisualSP. "Alive and well. Oh, and heavily customized," added Daniel Glenn (@DanielGlenn), Technical Consultant at InfoWorks and president of the Nashville SharePoint User Group.

Not everyone sees it that way. Some participants say the move toward hybrid deployments is gaining traction and is a sign that SharePoint in the datacenter has peaked. "SharePoint OnPrem is trending down, but still steady and above 70 percent --  there is room to grow still," tweeted Jeff Shuey (@jshuey), chief evangelist at K2, an ISV that provides workflow apps for SharePoint.

Barry Jinks (@bjinks), CEO of collaboration app provider Colligo, argued that the economies of Office 365 are compelling to many customers. "Eventually enterprises will move there," Jinks tweeted. "Just going to take way longer than hoped."

Buckley, the moderator and principal consultant with GTConsult, noted that while Microsoft may want everyone to move to the cloud, enterprises have too much invested in their on-premises SharePoint deployments. "SP on-prem CAN'T be killed by MSFT or anyone, only supplanted as cloud gets ever better," he tweeted. "Our Enterprise customers are looking at Hybrid. Still loving the on-prem #SharePoint as they have hefty investments there," said Gina Montgomery (@GinaMMontgomery), strategic director for Softmart, where she manages its Microsoft practice.

"IT [and collaboration tools] are evolving much faster than a 3 year DVD release cycle," said SharePoint and Office 365 Architect Maarten Visser (@mvisser), managing director of meetroo. " SharePoint OnPrem gets old quickly."

Asked if hybrid SharePoint deployments are becoming the new norm, the experts argued the hype doesn't match what they're seeing from their customers. "I don't think it will be a norm as much as what will be the best fit to meet requirements," said Stacy Deere-Strole (@sldeere), owner of SharePoint consultancy Focal Point Solutions.

"MSFT want it to be [the new norm]," observed SharePoint MVP Jason Himmelstein (@sharepointlhorn), Sr. Tech Director at Atrion. "Like with much of what we see coming out of Redmond it will be a bit before the desire matches the reality."

Yet many acknowledged that many are moving to hybrid deployments, or are in the process of planning to do so. "The story for OnPremises #SharePoint only gets better when you can work seamlessly with the cloud #SPO -- Hybid is a must," said SharePoint MVP Fabian Williams (@FabianWilliams). "Is hybrid the right idea? DAMN RIGHT. Move the right workloads for the right reasons," Himmelstein added.

"Yes, hybrid is becoming the norm for enterprises as well now. It just makes sense," Rehmani added. "Hybrid brings conservative customers the stability they need and allows them to experiment in the cloud," said Visser. "That's why SharePoint 2016 will be all about hybrid to force the transition," said MVP Michael Greth (@mysharepoint), based in Berlin. "Soon -- complex enterprise landscape will require a balance that hybrid can provide," tweeted Michelle Caldwell (@shellecaldwell), a director at integrator Avanade and a founder of the Buckley, Ohio SharePoint User Group. "Many are still planning and dabbling."

"Hybrid can be considered normal because you need a 'bridge' to work between SPO & ONPrem since not all features are on both," Williams tweeted.

Many are also looking forward to hearing about new management features coming in SharePoint 2016. "This will be the super exciting stuff at @MS_Ignite," said Dan Holme, co-founder & CEO of IT Unity, based in Maui. "I believe it will be the differentiator over O365," Glenn said. "But O365 will absorb some (if not all) of it via Azure services over time." Buckley is looking forward to hearing more from Microsoft on this. "There has always been a gap for management across SP farms, much less hybrid," he said. "Will be interesting to see what is coming next."

What is it about SharePoint 2016 you are looking forward to hearing about?

 

Posted by Jeffrey Schwartz on 02/19/2015 at 7:31 AM0 comments


Hardware-Agnostic Approach to Hyper-Converged Infrastructure Debuts

The latest Silicon Valley startup looking to ride the wave of cloud-based software-defined datacenters (SDDCs) and containerization has come out of stealth mode today with key financial backers and founders who engineered the VMware SDDC and the company's widely used file system.

Whether Sunnyvale, Calif.-based Springpath will rise to prominence remains to be seen, but the company's hyper-converged infrastructure aims to displace traditional OSes, virtual machines (VMs) and application programming models. In addition to the VMware veterans, Springpath is debuting with $34 million in backing from key investors with strong track records. Among them is Sequoia Capital's Jim Goetz, whose portfolio has included such names as Palo Alto Networks, Barracuda Networks, Nimble Storage, Jive and WhatsApp. Also contributing to the round are New Enterprise Associates (NEA) and Redpoint Ventures.

Springpath's founders have spent nearly three years building their new platform, which they say will ultimately host, manage and protect multiple VMs, server OSes, apps and application infrastructure including Docker containers. The company's namesake Springpath Data Platform in effect aims to let organizations rapidly provision, host and virtualize multiple VMs (VMware, Hyper-V and KVM), compute and application instances, storage pools and distributed file systems while managing and protecting them running on commodity servers and storage.

Founders Mallik Mahalingam and Krishna Yadappanavar are respectively responsible for the development of VMware VXLAN, the underpinnings of the software-defined network (SDN), and VMFS, the widely deployed file system in VMware environments. The two believe the new distributed subscription-based datacenter platform they're rolling out will reshape the way enterprises develop and host their applications in the future.

Springpath's software runs on any type of commodity server and storage hardware and is offered in a Software-as-a-Service (SaaS) subscription model. The hosting and systems management platform costs $4,000 per server per year and ties into existing enterprise management systems. Not surprisingly, it initially supports VMware vCenter, but it will also run as a management pack in Microsoft System Center.

The platform is based on what Springpath calls its Hardware Agnostic Log-structured Objects (HALO) architecture, which purports to offer a superior method for provisioning data services, managing storage and offering a high-performance and scalable distributed infrastructure. Rather than requiring customers to buy turnkey appliances, the company is offering just the software. It's currently supported on servers offered by Cisco, Dell, Hewlett-Packard and Supermicro. Customers can deploy the software themselves or have a partner run it on one of the supported systems. Springpath has forged a partnership with mega distributor Tech Data to provide support for the platform.

Ashish Gupta, Springpath's head of marketing, described the HALO architecture in an interview. HALO consists of distribution, caching, persistence optimization and structured object layers, Gupta explained. "Married to all of this are the core data services that all enterprises are going to need from a management perspective like snapshots, clones, compression [and so on]," he said. "The idea here is you can put the Springpath Data Platform on a commodity server and then scale and essentially give core capabilities that can replace your array-based infrastructure. You will no longer need to buy expensive multimillion arrays and isolate yourself in that environment, you are buying commodity servers putting the software on top, and you're going to get all the enterprise capabilities and functionality that you expect."

The hardware-agnostic distribution layer, for example, enables enterprises to take advantage of all the underlying hardware to support an application, he added. "The platform can be running on N number of servers. We can take advantage of all the resources underneath the servers, be it the disk, be it the memory or the flash environment and essentially present that to the applications that are supported by the platform."

In that context the applications can run in a virtualized environment from VMware, Hyper-V or KVM, and can be supported in containerized environments. Gupta noted Springpath is also providing its Platform-as-a-Bare-Metal offering. "So it can look like a traditional storage device, except it can scale seamlessly in a bare metal deployment," he said. Gupta added Springpath has its own file system and supports either block objects or Hadoop plug-in infrastructures. "In essence, we can give you a singular platform for the app your application needs," he said.

While it's not the only hyper-converged platform available, it is the first potentially major one that isn't tied to a specific hardware platform or offered as an appliance, said Chuck Bartlett, senior VP for Tech Data's advanced infrastructure solutions division, which is working to line up systems integration partners to offer the platform. "The fact that it is compute-agnostic, meaning the end-user client can use the server platform they have or want to implement, is unique and compelling."

Looking ahead, Springpath's Gupta sees HALO targeting emerging Web-scale applications and distributed programming environments built in programming languages such as Node.js, Scala, Akka or Google Go. The initial release is designed to support VMware environments and OpenStack Horizon-based clouds, though plans call for supporting Microsoft Azure, Hyper-V and System Center this summer.

Posted by Jeffrey Schwartz on 02/18/2015 at 4:02 PM0 comments


Obama's Order Ups the Ante on Cyber Security Information Sharing

President Obama issued an executive order aimed at persuading companies that suffer breaches to share information in an effort to provide a more coordinated response to cyberattacks. The order stops short of mandating that they do so, but the president is also proposing legislation that would pave the way for greater information sharing between the private sector and government agencies, including the Department of Homeland Security. The legislation also calls for the modernization of law enforcement authorities to fight cybercrime and the creation of a national breach reporting authority.

The order, signed today by the president at the Cybersecurity Summit at Stanford University in Palo Alto, Calif., sets the stage for the latest round of debate on how to protect the nation's infrastructure and consumer information without compromising privacy and civil liberties. Obama's push to promote information sharing, which could help provide better threat intelligence and methods of responding to attacks, nonetheless won't sit well with organizations that are loath to do so out of concern over liability and business impact.

Specifically, the president has proposed the formation of information sharing and analysis organizations (ISAOs) -- private sector groups that would share information and collaborate on cybersecurity issues, along the lines of the existing Information Sharing and Analysis Centers (ISACs). The proposal extends the information sharing executive order Obama issued two years ago to the day -- outlined in his State of the Union address that year -- which led to the release of last year's Cybersecurity Framework.

Since then, of course, cyberattacks have become more frequent and severe, with the 2013 Target breach, major attacks last year against Apple, Home Depot, the IRS and Sony, and now this year's compromise of customer information at Anthem, the second largest health insurance provider.

Obama also met today with some key industry executives at the summit, including Apple CEO Tim Cook and Intel President Renee James. Other top CEOs -- including those of Facebook, Google, IBM, Microsoft and Yahoo -- were conspicuous by their absence.

The order also seeks to let law enforcement agencies prosecute those who sell botnets, while making it a crime to sell stolen U.S. financial information such as credit card and account numbers to anyone overseas. It will also give federal law enforcement agencies authority to go after those who sell spyware and give courts the authority to shut down botnets.

Several key IT providers and large at-risk companies attending today's summit announced their support for the framework, including Intel, Apple, Bank of America, U.S. Bank, Pacific Gas & Electric, AIG, QVC, Walgreens and Kaiser Permanente, according to a fact sheet released by the White House.

While some simply announced support for the framework, Intel released a paper outlining its use and said it is requiring all of its vendors to use the framework as well. Apple said it is incorporating the framework as part of its broader security efforts across its networks. Bank of America is also requiring its vendors to use the framework, while insurance giant AIG said it is incorporating the NIST framework into how it underwrites cyber insurance for businesses of all sizes and will use it to help customers identify gaps in their approach to cybersecurity.

The White House also said several members of the Cyber Threat Alliance, which includes Palo Alto Networks, Symantec, Intel and Fortinet, have formed a cyber threat-sharing partnership that aims to create standards aligned with its information sharing order. Along with that, according to the White House, Box plans to participate in creating standards for ISAOs with plans to use its Box platform to extend collaboration among ISAOs. Further, FireEye is launching an Information Sharing Network, which will let its customers receive threat intelligence in near real time (including anonymized indicators).

Several companies are also announcing efforts to extend multifactor authentication, including Intel, which is releasing new authentication technology that seeks to make biometrics a more viable alternative to passwords. Credit card providers and banks, including American Express, MasterCard and its partner First Tech Federal Credit Union, are all advancing efforts to pilot and/or roll out new multifactor authentication methods, including biometrics and voice recognition.

Much of the buzz is about the failure of the top tech CEOs to attend, but today's event at Stanford showed some potentially significant advances by companies, along with proposals from the president that will certainly raise the volume of the debate from Silicon Valley to the Beltway.

What's your take on the president's latest executive order?

 

Posted by Jeffrey Schwartz on 02/13/2015 at 12:48 PM0 comments


Microsoft: How To Use Docker Containers in Azure

Microsoft's announcement back in October that it has partnered with Docker to enable Linux containers to run in Windows was an important step forward in enabling what promises to be the next wave in computing beyond virtualization. While things can change on a dime, it looks like Microsoft is going all in by supporting a new computing model -- widely endorsed by IBM, Google, VMware and others -- based on application portability and a more efficient use of compute, storage and network resources.

It sounds quite grand but so did virtualization -- and the idea of consolidating server resources -- when it hit the scene a decade ago. Of course, the proof will be in the implementation. It's very likely we'll hear about how to enable Linux containers in Windows Server at the upcoming Build and Ignite conferences in late April and early May, as Microsoft Distinguished Engineer Jeffrey Snover hinted last week.

"We're also going to talk about containers, Docker containers for Windows," Snover said. "There will be two flavors of the compute containers. There'll be a compute container focused in on application compatibility, so that will be server running in a containers, and then there will be containers optimized for the cloud. And with those containers you'll have the cloud optimized server."

Those wanting to start running Linux containers in Azure can start now, based on documentation posted by Microsoft yesterday. "Docker is one of the most popular virtualization approaches that uses Linux containers rather than virtual machines as a way of isolating data and computing on shared resources," according to the introduction. "You can use the Docker VM extension to the Azure Linux Agent to create a Docker VM that hosts any number of containers for your applications on Azure."

The documentation describes what Docker containers are and walks through how to use the Docker VM extension and the Azure Linux Agent to stand up a Docker host VM and deploy containers to it on Azure.

In its description of Docker containers, it points out they're currently one of the most popular virtualization alternatives to virtual machines in that they isolate data and computing on shared resources, enabling developers to build and deploy apps across Docker resources, which may run in different environments.
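
For those who want to kick the tires once a Docker-enabled VM is up in Azure, the Docker remote API can be scripted from just about any language. Below is a minimal sketch using the Docker SDK for Python; the host name, port, container image and certificate paths are placeholders for whatever your own deployment provisions, so treat it as an illustration rather than anything lifted from Microsoft's documentation.

    # Sketch: run a container on a remote Docker host, such as an Azure VM
    # provisioned with the Docker VM extension. The host name and certificate
    # paths below are placeholders -- substitute values from your deployment.
    import docker

    tls_config = docker.tls.TLSConfig(
        client_cert=("certs/cert.pem", "certs/key.pem"),  # client cert and key
        ca_cert="certs/ca.pem",                           # CA that signed the daemon's cert
    )

    # The extension typically exposes the Docker daemon over TLS on port 2376.
    client = docker.DockerClient(base_url="tcp://mydockervm.cloudapp.net:2376",
                                 tls=tls_config)

    # Pull the nginx image and start a container, publishing port 80.
    container = client.containers.run("nginx:latest", detach=True,
                                      ports={"80/tcp": 80}, name="hello-web")
    print(container.short_id, container.status)

Drop the TLS arguments and point base_url at a local socket and the same calls work against a Docker daemon running on your own machine.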

As I noted earlier in the week, DH2i is now offering a platform that enables containers that run in Windows Server -- the difference being that they're Windows, not Linux-based, though they purport to work with Docker containers as well.

But if you're looking to start with Docker in Azure, Microsoft is making the push.

Posted by Jeffrey Schwartz on 02/12/2015 at 1:27 PM0 comments


Popular Windows Troubleshooting Site Is Now SSL-Enabled

The popular Sysinternals site, which Microsoft acquired nearly a decade ago and which hosts troubleshooting utilities, tools and help files, is now SSL-enabled. The site's co-creator and longtime steward, Mark Russinovich, Microsoft's Azure CTO, tweeted the news earlier in the week.

Microsoft and many others are moving their Web sites to SSL -- the long-established Secure Sockets Layer standard used for encrypted Web sessions. Long required on sites handling financial transactions and other secure communications, the shift from HTTP to HTTPS sessions is now spreading rapidly as the hazards of intercepted communications rise.

If you're wondering why a site that just hosts documentation and utilities needs an HTTPS connection, consider that there are lots of executables there as well. Though all the binaries are signed, using SSL to access the tools via the online share prevents man-in-the-middle tampering in cases where the user doesn't validate the signature before launching the tool.
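
As a quick illustration, here's a short Python sketch that pulls a tool from the live Sysinternals share over HTTPS; the file name is just an example, and checking the Authenticode signature after downloading is still a good habit.

    # Sketch: download a Sysinternals tool over HTTPS from the live share.
    # Recent Python versions validate the server certificate by default,
    # which is the point of the move to SSL/TLS. The tool name is an example.
    import urllib.request

    url = "https://live.sysinternals.com/Autoruns.exe"
    with urllib.request.urlopen(url) as response:
        data = response.read()

    with open("Autoruns.exe", "wb") as f:
        f.write(data)

    print("Downloaded %d bytes over HTTPS" % len(data))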


 

 

Posted by Jeffrey Schwartz on 02/12/2015 at 1:28 PM0 comments


Azure Customers Can Use ExpressRoute Free Through June

Microsoft's new ExpressRoute service could emerge as a key piece of its hybrid cloud story for customers wary of using the public Internet to link their private datacenters to Azure. ExpressRoute, introduced at last May's TechEd conference in Houston, effectively provides dedicated links that are faster, more reliable and more secure than public Internet connections. To encourage customers and partners to try it out, Microsoft is offering the service free of charge through the end of June.

The offer covers Microsoft's 10 Mbps ExpressRoute Network Service Provider (NSP) option for new and existing customers in all Azure regions where ExpressRoute is currently offered. Several of Microsoft's telecommunications partners that offer ExpressRoute are joining in the promotion, including British Telecom, Colt and Level 3. In addition, AT&T is offering six-month trials of its new NetBond service, and Verizon is providing six months' use of its Secure Cloud Interconnect offering to first-time users of its basic data plan who sign two-year agreements for up to 1TB per month.

"ExpressRoute gives you a fast and reliable connection to Azure, making it suitable for scenarios such as data migration, replication for business continuity, disaster recovery, and other high-availability strategies," wrote Sameer Sankaran, a senior business planner within Microsoft's Azure group. "It also allows you to enable hybrid scenarios by letting you seamlessly extend your existing datacenter to Azure."

The service is especially complementary to offerings like Azure Site Recovery, which provides disaster recovery using Azure targets and Hyper-V replication, and to applications requiring private or more reliable links than an Internet connection can provide.

ExpressRoute is designed to connect on-premises resources such as physical and virtual server farms, storage, media services and Web sites, among other services. The service requires you to order circuits via one of the connectivity partners. Customers can choose either a direct layer 3 connection via an exchange provider or a standard layer 3 link from an NSP. Customers can enable one or both types through their Azure subscriptions but must configure both to connect to all supported services.

Posted by Jeffrey Schwartz on 02/11/2015 at 1:28 PM0 comments


Containers Now Available in Windows Server

A little-known startup that offers data protection and SQL Server migration tools today released what it calls the first native container management platform for Windows Server and claims it can move workloads between virtual machines and cloud architectures. DH2i's DX Enterprise encapsulates Windows Server application instances into containers removing the association between the apps, data and the host operating systems connected to physical servers.

The Fort Collins, Colo.-based company's software is a lightweight 8.5 MB server installation that offers a native alternative to Docker containers, which are Linux-based, though Microsoft and Docker are working on bringing Docker containers to Windows, as announced last fall. In addition to its relationship with Microsoft, Docker has forged ties with all major infrastructure and cloud providers including Google, VMware and IBM. Docker and Microsoft are jointly developing a container technology that will work on the next version of Windows Server.

In his TechDays briefing last week, Microsoft Distinguished Engineer Jeffrey Snover confirmed that the company will include support for Docker containers in the next Windows Server release, known as Windows vNext.

DH2i president and CEO Don Boxley explained why he believes DX Enterprise is a better alternative to Docker, pointing to that fact that it's purely Windows Server-based.

"When you look at a Docker container and what they're talking about with Windows containerization, those are services that they're looking at then putting some isolation kind of activities in the future," Boxley said. "It's a really important point that Docker's containers are two containerized applications. Yet there are still going to be a huge amount of traditional applications simultaneously. We'll be able to put any of those application containers inside of our virtual host and have stop-start ordering or any coordination that needs to happen between the old type of applications and the new and/or just be able to manage them in the exact same way. It forces them to be highly available and extends now to a containerized application."

The company's containers, called "Vhosts," each have their own logical host name, associated IP addresses and portable native NTFS volumes. The Vhost's metadata handles container workload management, directing the managed app to launch and run locally, according to the company. Multiple Vhosts share a single Windows Server operating system instance and can be stacked on either virtual or physical servers. This results in a more consolidated way of managing application workloads and enables instance portability, Boxley explained.

Unlike Docker there are "no companion virtual machines running Linux, or anything like that at all," Boxley said. "It's just a native Windows application, you load it onto your server and you can start containerizing things right away. And again, because of that universality of our container technology, we don't care whether or not the server is physical, virtual or running in the cloud. As long as it's running Windows Server OS, you're good to go. You can containerize applications in Azure and in Rackspace and Amazon, and if the replication data pipe is right, you can move those workloads around transparently." At the same time, Boxley said it will work with Docker containers in the future.

Boxley said a customer can also transparently move workloads between any virtual machine platform including VMware, Hyper-V and Xen. "It really doesn't matter because we're moving the applications not the machine or the OS," he said. Through its management console, it automates resource issues including contention among containers. The management component also provides alerts and ensures applications are meeting SLAs.

Asked why it chose Windows Server to develop DX Enterprise, Boxley said he believes it will remain the dominant environment for virtual applications. "We don't think -- we know it's going to grow," he said. IDC analyst Al Gillen said that's partly true, though Linux servers will grow in physical environments. Though he hasn't tested DX Enterprise, Gillen said the demo looked promising. "For customers that have an application that they have to move and they don't have the ability to port it, this is actually a viable solution for them," Gillen said.

The solution is also a viable option for organizations looking to migrate applications from Windows Server 2003, which Microsoft will no longer support as of July 14, to a newer environment, Boxley said. The software is priced at $1,500 per server core (if running on a virtual machine it can be licensed via the underlying core), regardless of the number of CPUs. Support including patches costs $360 per core per year.

Boxley said the company is self-funded and started out as a Microsoft BizSpark partner.

Posted by Jeffrey Schwartz on 02/10/2015 at 12:15 PM0 comments


White House Taps Former Microsoft CIO Tony Scott

The White House late last week said it has named Tony Scott as its CIO, making him only the third person charged with overseeing the nation's overall IT infrastructure. Scott, who served as Microsoft's CIO, also held that role at Disney and VMware and was CTO at GM.

Scott's official title will be U.S. CIO and Administrator of OMB's Office of Electronic Government and Information Technology, succeeding Steve VanRoekel. The first CIO was Vivek Kundra, who launched the government's Cloud First initiative. The Obama Administration will task Scott with implementing its Smarter IT Delivery agenda outlined in the president's 2016 proposed budget.

I first met Scott when he was GM's CTO in the late 1990s, when he spoke at a Forrester conference about managing large vendors -- a list that included Microsoft, Sun Microsystems, Oracle and numerous dot-com-era startups. I caught up with him again more than a decade later at Microsoft's Worldwide Partner Conference in 2010 in Washington, D.C.

Among his key initiatives at the time as Microsoft's CIO was enabling the internal use of new technologies  the company had recently brought to market, among them Azure. "I think we've done what Microsoft always has done traditionally, which is we try to dog-food our own stuff and get the bugs out and make sure the functionality is there," he said during an interview at WPC, though he qualified that by adding: "We'll move them or migrate them as the opportunity arises and as the business case makes sense."

Nevertheless he was known as a proponent of cloud-enabling internal applications as quickly as possible. Scott's tenure at Microsoft ran from 2008 until 2013 and he has spent the past two years at VMware.

 

Posted by Jeffrey Schwartz on 02/09/2015 at 12:49 PM0 comments


Microsoft Reveals Its Product Roadmap, Issues Windows Server 2003 Warning

The news that the next versions of Windows Server and System Center won't come until next year caught off guard many who were under the impression they would arrive later this year. With that announcement, and the promise of a new version of SharePoint Server later this year, Microsoft brought its enterprise product roadmap into fuller view.

This latest information came at a fortuitous time for my colleague Gladys Rama, who was putting the finishing touches on the updated 2015 Microsoft Product Roadmap for sister publication Redmond Channel Partner. Check it out if you want to know planned release dates for anything noteworthy to an IT pro from Microsoft.

As for the delay of Windows Server v.Next, ordinarily it would seem par for the course. But after Microsoft released Windows Server 2012 R2 just a year after Windows Server 2012 and signaled that it was moving toward a faster release cadence, the news was more surprising. Whether by design or otherwise, it removes a key decision point for IT pros who were considering waiting for the new Windows Server to come out before migrating their Windows Server 2003-based systems.

As soon as Microsoft got the word out that the new Windows Server is on next year's calendar, it issued another reminder that Windows Server 2003's end of support is less than six months away.  Takeshi Numoto, corporate VP for cloud and enterprise marketing, gave the latest nudge this week in a blog post once again warning of the risks of running the unsupported operating system after the July 14 deadline.

"Windows Server 2003 instances will, of course, continue to run after end of support," he noted. "However, running unsupported software carries significant security risks and may result in costly compliance violations. As you evaluate security risks, keep in mind that even a single unpatched server can be a point of vulnerability for your entire infrastructure."

Microsoft has urged customers to migrate to Windows Server 2012 R2 and, where customers feel it makes sense, consider a cloud service such as Office 365 to replace Exchange Server on-premises as well as Azure or other cloud infrastructure or platform services to run database applications, SharePoint and other applications.

Did the news that Windows Server v.Next won't arrive until next year have any impact on your Windows Server 2003 migration plans? Or was the prospect of it possibly coming later this year too close for comfort for your planning purposes?

Posted by Jeffrey Schwartz on 02/06/2015 at 4:05 PM0 comments


VMware Releases Long-Awaited vSphere 6 Upgrade

It's been three years since VMware last upgraded its flagship hypervisor platform, but the company yesterday took the wraps off vSphere 6, which it said offers at least double the performance of its predecessor, vSphere 5.5. VMware describes the latest release as the "foundation for the hybrid cloud," thanks to the release of its OpenStack distribution and upgrades to components of the suite that integrate virtualized software-defined storage and networking.

The new wares, set for release by the end of March, will offer a key option for enterprise IT decision makers to consider as they choose their next-generation virtual datacenter and hybrid cloud platforms. With the new wave of releases, VMware is advancing and integrating its new NSX software-defined networking technology. VMware, to some extent, is also challenging the business model of its corporate parent EMC by offering new storage virtualization capabilities with its new Virtual SAN 6 and vSphere Virtual Volumes, which will enable virtualization of third-party storage arrays.

The move comes as VMware seeks to maintain its dominant hold on large datacenter installations looking to move to hybrid and public clouds as giants such as Microsoft, Google, IBM and Amazon Web Services are looking to position their cloud platforms as worthy contenders. In what appeared to be a strategically timed move, Microsoft published its Cloud Platform roadmap, as reported, just hours before the VMware launch event.

With this release, it now remains to be seen whether VMware can successfully leverage the virtualized server stronghold it has with its network and storage virtualization extensions to its public cloud, vCloud Air, as Microsoft tries to lure those customers away with its Cloud OS model consisting of Windows Server, Hyper-V and Microsoft Azure. Despite Microsoft's gains, VMware is still the provider to beat, especially when it comes to large enterprise installations.

"VMware's strength remains their virtualization installed base, and what they're doing through NSX is building that out into cloud environments," said Andrew Smith, an analyst at Technology Business Research. "VMware realizes that they need to gain ground, especially in private cloud deployments, so they're going to use NSX to tie into security along with vSphere, to really try and take the hybrid cloud space by storm. And I think with updates to vSphere and the integration with OpenStack it's all pieces of the puzzle coming together to make that a reality to customers."

OpenStack Cloud Support
The VMware Integrated OpenStack (VIO) distribution, the company's first OpenStack distro, provides API access to OpenStack-enabled public and private clouds as well as VMware vSphere infrastructure. "VIO is free to vSphere Enterprise Plus customers and comes as a single OVA file that can be installed in fewer than 15 minutes from the optional vSphere Web client. VIO support, which includes support for both OpenStack and the underlying VMware infrastructure, is charged on a per-CPU basis," said Tom Fenton, a senior validation engineer and lab analyst with IT analyst firm Taneja Group, in a commentary published on our sister site Virtualizationreview.com.

"VMware brings a lot to the OpenStack table with VIO," according to Fenton. "Many common OpenStack tasks are automated and can be performed from vCenter. vRealize Operations is able to monitor OpenStack, and LogInsight can parse OpenStack logs to separate the considerable amount of log noise from actionable items." The new NSX software-defined networking infrastructure will enable ties to OpenStack-compatible clouds, as well as VMware's own vCloud Air public cloud.

"The company now has offerings that address all layers of the Kusnetkzy Group virtualization model, including access, application, processing, storage and network virtualization, as well as both security and management for virtualized environments, sometimes called software-defined environments," wrote noted analyst Dan Kusnetsky, in his Dan's Take blog post.

Hypervisor Improvements with vSphere 6
Martin Yip, a VMware senior product manager, said in a company blog post announcing vSphere 6 that it has more than 650 new features, increased scale, improved performance and availability, storage efficiencies for virtual machines (VMs) and simplified datacenter management. "vSphere 6 is purpose-built for both scale-up and scale-out applications including newer cloud, mobile, social and big data applications," Yip noted.

Compared with the existing vSphere 5.5, vSphere 6 supports 64 hosts per cluster (double the previous limit) and double the VMs per cluster, 480 CPUs per host versus 320, triple the RAM at 12TB per host and quadruple the VMs per host at 2,048. It also supports double the number of virtual CPUs per VM at 128 and quadruple the amount of virtual RAM per VM at 4TB, according to Yip.
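
If you're curious how close your existing clusters come to those ceilings, the raw numbers are easy to pull from the vSphere API. Here's a rough sketch using the open source pyVmomi Python bindings; the vCenter address and credentials are placeholders, and it's offered as an illustration rather than anything VMware ships.

    # Sketch: count hosts and VMs per cluster via the vSphere API (pyVmomi).
    # The vCenter address and credentials are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    ctx = ssl._create_unverified_context()  # lab use only; validate certs in production
    si = SmartConnect(host="vcenter.example.com",
                      user="administrator@vsphere.local",
                      pwd="secret", sslContext=ctx)
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.ClusterComputeResource], True)
        for cluster in view.view:
            vm_count = sum(len(host.vm) for host in cluster.host)
            print("%s: %d hosts, %d VMs" % (cluster.name, len(cluster.host), vm_count))
    finally:
        Disconnect(si)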

The starting price for vSphere 6 is $995 per CPU. vSphere with Operations Management 6 starts at $1,745 per CPU and vCloud Suite 6 starts at $4,995 per CPU.

Posted by Jeffrey Schwartz on 02/03/2015 at 12:19 PM0 comments


Goverlan Releases Free WMI Scripting Tool for Remote Windows Management

Goverlan last week said it's giving away its GUI-based Windows Management Instrumentation (WMI) tool for remote desktop management and control. The company's WMI Explorer (WMIX) lets IT pros with limited scripting or programming skills perform agentless remote administration of Windows-based PCs.

The free tool -- an alternative to Microsoft's own command line interface, WMIC -- leverages WMI, a stack of Windows driver component interfaces supporting key standards including WBEM and CIM. WMI is built into all versions of Windows, allowing scripts to be deployed to manage PCs and some servers remotely.

According to Goverlan, its WMIX tool includes a WMI Query Wizard, which will appeal to administrators with limited scripting or coding skills because it lets them create sophisticated WMI Query Language (WQL) queries with a filtering mechanism that generates results matching the needs of the specified remote systems. Goverlan's WMIX GUI also lets administrators automatically generate Visual Basic scripts and reports from the parameters they define, create WMI-based Group Policy Objects (GPOs) and perform agentless system administration.
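
WMIX generates VBScript, but the same WQL can be issued from any WMI-capable client if scripting happens to be more your speed. Here's a small illustrative sketch in Python using the third-party wmi package (which sits on top of pywin32); the computer name and credentials are placeholders.

    # Sketch: run a WQL query against a remote Windows machine using the
    # third-party "wmi" package (pip install wmi; Windows only).
    # The computer name and credentials below are placeholders.
    import wmi

    conn = wmi.WMI(computer="PC-042", user=r"CORP\admin", password="secret")

    # Report free space on local fixed disks (DriveType = 3).
    wql = ("SELECT DeviceID, FreeSpace, Size FROM Win32_LogicalDisk "
           "WHERE DriveType = 3")
    for disk in conn.query(wql):
        free_gb = int(disk.FreeSpace) / (1024 ** 3)
        size_gb = int(disk.Size) / (1024 ** 3)
        print("%s  %.1f GB free of %.1f GB" % (disk.DeviceID, free_gb, size_gb))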

What's in it for the company to give away this free tool? Ezra Charm, Goverlan's vice president of marketing, noted that the company has never officially launched the tool nor has it significantly promoted it, yet it's popular among those who use it. "We are seeing a ton of interest," Charm said. Though many companies release free tools hoping to upsell customers to premium releases, Charm said the release of WMIX is primarily aimed at overall awareness for those who want to perform advanced WMI-based system administration functions and reports. Nevertheless, the effect would be the same.

WMIX is a component of the company's flagship systems administration tool, Goverlan Remote Administration Suite v8, which, depending on your environment, is an alternative or supplement to Microsoft's System Center Configuration Manager.

"At first, WMIX was implemented as an internal development tool to assist the integration of WMI Technology within the Goverlan Client Management Tool, but we quickly realized that this product would be of great services to all Windows System Administrators out there as it would allow anyone without advanced scripting knowledge to consume this powerful technology," said Goverlan CEO, Pascal Bergeot, in a statement announcing the release of the free tool

Goverlan requires contact information to activate the WMIX download.

Posted by Jeffrey Schwartz on 02/03/2015 at 12:20 PM0 comments


Microsoft Will Deliver On-Premises SharePoint 2016 This Year

Allaying concerns that Microsoft wasn't planning to develop any more on-premises versions of SharePoint, the company today said a new server release is scheduled for the second half of 2015. Microsoft's emphasis on SharePoint Online had many wondering at times whether the company was planning a new server release, although the company had indicated back in March that a new version was coming.

Despite its push to the cloud version, Microsoft has acknowledged and promoted the idea that the best way for organizations to transition is via a hybrid architecture providing connectivity between SharePoint Server and Microsoft's SharePoint Online service. However, unlike the on-premises version, SharePoint Online in Office 365 doesn't support the trusted code and apps developed for SharePoint Server.

"While we've seen growing demand for SharePoint Online, we recognize that our customers have a range of requirements that make maintaining existing SharePoint Server deployments the right decision for some," said Julia White, general manager for Microsoft's Office Products division in a blog post today. "We remain committed to meeting those needs. We're excited about the next on-premises version of SharePoint and we're sure you will be too. It has been designed, developed and tested with the Microsoft Software as a Service (SaaS) strategy at its core, drawing from SharePoint Online. With this, SharePoint Server 2016 will offer customers enhanced, flexible deployment options, improved reliability and new IT agility, enabled for massive scale."

The company didn't offer any details on what its plans are for SharePoint Server 2016 other than to say it will provide more details at its forthcoming Ignite conference in Chicago in the first week of May, although White insinuated that Microsoft would aim to extend the hybrid capabilities that the current on-premises and cloud versions offer.

"A hybrid SharePoint deployment can provide IT agility by creating a consistent platform spanning datacenters and cloud, simplifying IT and delivering apps and data to users on any device, anywhere," she said. "With SharePoint Server 2016, in addition to delivering rich on-premises capabilities, we're focused on robust hybrid enablement in order to bring more of the Office 365 experiences to our on-premises customers."

Microsoft is already jumpstarting that hybrid effort this year with the rollout of APIs and SDKs that aim to bridge the gap between the on-premises and cloud worlds, as noted in last month's Redmond magazine cover story. The topics to be discussed in sessions at Ignite cover the gamut of SharePoint and Office 365 technologies including Delve, OneDrive, Project, Visio and Yammer.

 

Posted by Jeffrey Schwartz on 02/02/2015 at 7:33 AM0 comments


Microsoft: Software Support Provides Best Route to Windows 10 Enterprise Edition

When Microsoft last month announced that it will offer Windows 10 as a free upgrade to Windows 7, Windows 8 and Windows 8.1 users, the company said the deal doesn't apply to enterprise users. The company clarified that point late last week saying that the free upgrade is available to business users who have Windows Pro, but those wanting enterprise management capabilities should stick with or move to Software Assurance.

In a blog post outlining the way Microsoft will release new Windows 10 features and fixes via its new Windows-as-a-service model, the company elaborated on the free offer. Simply put, the free offer is available to anyone with Windows 7 Pro or Windows 8.x Pro. For those who have Windows 7 Enterprise or Windows 8.x Enterprise and require the same enterprise-grade capabilities, "Windows Software Assurance (SA) will continue to offer the best and most comprehensive benefits," wrote Jim Alkove, a leader on the Windows Enterprise Program Management team.

While Microsoft could change what's offered in the forthcoming Pro and Enterprise editions, the latter will offer DirectAccess, BranchCache, AppLocker, support for VDI and the Windows To Go capability, according to a Microsoft description of the differences between the two SKUs. It's clear that Microsoft wants to get as many users as possible onto Windows 10; the free upgrade requires users to move within a year of its release.

 

 

Posted by Jeffrey Schwartz on 02/02/2015 at 1:51 PM0 comments


Microsoft Releases Outlook for iOS and Android

Microsoft today released versions of its widely used Outlook mail, calendaring and contacts app for iPhone and iPad users, along with a preview version for Android devices. More than 80 million iPad and iPhone users have downloaded Office, according to Microsoft.

 "We have received tremendous customer request for Outlook across all devices, so we are thrilled to fulfill this for our customers," said Julie White, general manager for the Office product management team. Microsoft was able to develop the new Outlook apps thanks to code developed by Acompli, which Microsoft acquired last month , she noted. 

I didn't see it in Apple's App Store earlier today, but a link Microsoft provided enabled an easy download of the Outlook app for the iPad. It looks a lot like the traditional Outlook client, though it is clearly stripped down. In the Settings folder (Figure 1) you can choose swipe options and browsers and create a signature for each or all accounts. However, unlike the Windows Outlook client, you can only create generic signatures.

[Click on image for larger view.]  Figure 1. Settings folder on the new Outlook app for the iPad.

The mail client has a search button and allows you to see your folders (Figure 2). It also provides access to OneDrive, Dropbox and Google Drive storage accounts (Figure 3). In the configuration setting, it provides access to your Exchange account, Outlook.com, OneDrive, iCloud, Gmail, Yahoo Mail, Dropbox and Box (Figure 4).

[Click on image for larger view.]  Figure 2. Search feature for the iOS version of Outlook.

 

[Click on image for larger view.]  Figure 3. Available storage accounts.
[Click on image for larger view.]  Figure 4. Available e-mail accounts.

Unfortunately, if you connect to any other POP- or IMAP-based service, you're out of luck. Microsoft didn't indicate whether that will change, though White noted that "you will see us continue to rapidly update the Outlook app, delivering on the familiar Outlook experience our customers know and love." For testing, I used a Yahoo Mail account that I keep for less-critical purposes, which enabled me to try out the new Outlook client.

Microsoft said the new Outlook app replaces Outlook Web Access (OWA) for iOS and Android, though it will keep those apps around for now because some advanced Exchange and Office 365 features still aren't available in the new Outlook app.

Microsoft also announced that its Office for Android tablets is now generally available in the Google Play store. This replaces the preview versions of Word, Excel and PowerPoint. It joins the already available version of OneNote for Android. The company said it will also support native implementation running on Intel chipsets within the quarter.

 

Posted by Jeffrey Schwartz on 01/29/2015 at 11:26 AM0 comments


Microsoft Upgrades and Discounts Power BI SaaS-Based Analytics Tool

Microsoft is giving its Power BI analytics service an upgrade with added connectivity sources, support for iOS and a new premium edition, to be called Power BI Pro, priced at $9.99 per user per month. The company will also offer a free version with limited functionality that will retain the Power BI name.

Power BI, a cloud-based business analytics service launched a year ago, was aimed at both technical and general business users. The browser-based software-as-a-service (SaaS) tool generates operational dashboards. As noted in our First Look at Power BI last year, this tool adds new functionality to existing Microsoft information management offerings, namely Excel 2013, Office 365 and SharePoint 2013. 

Microsoft currently has three pricing tiers for Power BI, running as high as $52 per user per month included with Office 365 Pro Plus, $40 for a standalone version and $33 when added on to an Office 365 E3/E4 subscription. Starting Feb. 1 Microsoft is offering one paid subscription at the substantially reduced $9.99 price, which is 75 percent less expensive than the highest tier. The company will offer the free version when the upgrade becomes generally available.

The free version is limited to 1GB of data per month per user, whereas the paid subscription will allow up to 10GB according to a comparison of the two options. Users of the paid version will also have access to 1 million rows per hour of streaming data compared to 10,000 rows for the free service. The paid Power BI Pro is required to use such features as access to live data sources, the data management gateway and various collaboration features including the ability to share refreshable team dashboards, create and publish customized content packs, use of Active Directory groups for sharing and managing access control and shared data queries through the data catalog.
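
For a sense of what those streaming limits apply to, here's a rough Python sketch that pushes rows into a Power BI dataset over the service's REST API. It assumes an Azure AD access token has already been acquired (not shown), and the endpoint paths follow the currently documented v1.0 API, which differs slightly from the preview-era beta endpoints.

    # Sketch: create a push dataset and stream rows into it via the Power BI
    # REST API. ACCESS_TOKEN is a placeholder for an Azure AD bearer token.
    import requests

    ACCESS_TOKEN = "<azure-ad-access-token>"
    HEADERS = {"Authorization": "Bearer " + ACCESS_TOKEN}
    BASE = "https://api.powerbi.com/v1.0/myorg"

    # Define a simple dataset with a single table.
    dataset = {
        "name": "SalesFeed",
        "tables": [{
            "name": "Orders",
            "columns": [
                {"name": "OrderId", "dataType": "Int64"},
                {"name": "Region", "dataType": "string"},
                {"name": "Amount", "dataType": "Double"},
            ],
        }],
    }
    resp = requests.post(BASE + "/datasets", json=dataset, headers=HEADERS)
    resp.raise_for_status()
    dataset_id = resp.json()["id"]

    # Push a batch of rows; the per-hour row limits discussed above apply here.
    rows = {"rows": [{"OrderId": 1, "Region": "East", "Amount": 129.99}]}
    requests.post(BASE + "/datasets/%s/tables/Orders/rows" % dataset_id,
                  json=rows, headers=HEADERS).raise_for_status()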

With the new preview, users can sign in with any business e-mail address, initially only in the United States. Microsoft said it'll be available for those in other countries in the future. The new data sources supported include GitHub, Marketo, Microsoft Dynamics CRM, Salesforce, SendGrid and Zendesk, noted James Phillips, Microsoft's general manager for data experiences, in a blog post Tuesday.  In the pipeline are Inkling Markets, Intuit, Microsoft Dynamics Marketing, Sage, Sumo Logic, Visual Studio Application Insights and Visual Studio Online, among others. "Power BI is 'hybrid' by design, so customers can leverage their on-premises data investments while getting all the benefits of our cloud-based analytics service," he said.

Microsoft also is offering a new tool called Power BI Designer, designed to let business analysts connect with, model and analyze data, he added, letting them easily publish results to any other Power BI user. The company also released a preview of Power BI for iPad, which can be downloaded from the Apple App Store. Phillips noted versions for iPhones, Android and Windows universal apps will be available later this year.

 

Posted by Jeffrey Schwartz on 01/28/2015 at 3:20 PM0 comments


Apple Results Embarrass Microsoft: Is Nadella's Honeymoon Over?

One day after Microsoft delivered a disappointing quarterly earnings report, Apple on Tuesday did the complete opposite by posting its best quarter ever -- far exceeding expectations. In fact, Apple is said to have posted the most profitable quarter of any publicly traded company ever, buoyed by sales of 74.5 million iPhones between Oct. 1 and Dec. 27.

Apple recorded $74.6 billion in revenues with a record $18 billion profit (a gross margin of 39.9%), topping the previous record set by ExxonMobil in the second quarter of 2012. The company's shares are soaring a day after Microsoft stock sank about 9% (making it a key contributor to a down day for the stock market on Monday).

It's not that Microsoft had a horrible quarter -- revenues of $26.5 billion were up 8% year over year -- but profits were down and the growth outlook for next quarter was considerably slower than Wall Street had anticipated. The results have led many to question whether Satya Nadella's honeymoon is over as he approaches his one-year anniversary as CEO. During his first year, Nadella made some major moves to turn Microsoft around, as Mary Jo Foley noted in her monthly Redmond magazine column posted this week.

In addition, Microsoft posted only a 5% increase in commercial licensing and an overall operating income decline of 2%. Impacting earnings were the restructuring of its Nokia business unit and the transition to cloud computing, the company said.

Microsoft also blamed the strong dollar and weakness in China and Japan, which could further impact the current quarter should the dollar continue to strengthen, CFO Amy Hood warned on the earnings call. Ironically, strong results in China were a key contributor to Apple's growth, while the company indicated only modest impact from the dollar's surge.

Among some other noteworthy improvements that Microsoft reported:

  • 9.2 million Office 365 Home and Personal licenses were sold. This is up 30% since last quarter.
  • $1.1 billion in Surface sales (a 24% increase).
  • 10.5 million Lumia smartphones sold, up 30 percent. However, Windows Phone still accounts for just 3 percent of the smartphone market.

But despite those improvements, Apple outshined Microsoft in a number of ways. Apple said it sold its 1 billionth iOS device in November. Average selling prices of iPhones increased $50 as customers opted for more expensive devices equipped with more storage. Not that any major market shifts were expected, but its huge spike in iPhone sales aided by the larger form factor of the latest models continues to leave Windows Phone in the dust.

For Microsoft, Windows Pro OEM revenue declined 13 percent, as did non-Pro revenue, while sales of Apple Macintoshes rose 14% to 5.5 million units. Apple App Store revenues also jumped 41 percent. One weak spot for Apple was the decline in iPad sales -- 21.6 million units, down from 26 million during the same period last year. The decline is not surprising given the release of the iPhone 6 Plus, which many buyers might choose in place of an iPad Mini since the tablet is only slightly larger. Recent iPad upgrades have also offered only modest new features, giving existing users little incentive to replace the iPads they now have.

Still, iPads are becoming a formidable device in the enterprise and Apple CEO Tim Cook said on Tuesday's call that the company's partnership with IBM has helped boost the use of them in the workplace. "I'm really excited about the apps that are coming out and how fast the partnership is getting up and running," he said.

Many are expecting Apple to increase its enterprise push with the release of a larger iPad. For its part, Microsoft is rumored to have a new Surface device coming later this year, though the company has yet to confirm its plans.

Perhaps Cook is having a better week than Nadella but Microsoft has many other fish to fry where Apple is not in its path. Nadella's turnaround for Microsoft is very much still in progress. If the honeymoon is over, as some pundits suggest, then Nadella's success will ride on his ability to keep the marriage strong.

 

Posted by Jeffrey Schwartz on 01/28/2015 at 12:32 PM0 comments


Hands On: Windows 10's New Preview

When Microsoft released the newest Windows 10 Technical Preview on Friday, testers saw some major new features the company is hoping to bring to its operating system designed to switch seamlessly between PC and tablet modes. Among the key new features are a new Start Menu and Cortana, the digital assistant that, until now, was only available on Windows Phone.

Along with those and other new features, the new Build 9926 takes a key step forward in showing the progress Microsoft is making to remove the split personality that epitomizes Windows 8.x. Microsoft is designing Windows 10 to launch desktop apps from the Windows Store interface and vice versa. Upon downloading the new build you'll want to look at the following:

Start Menu: One of the biggest mistakes Microsoft made when it rolled out Windows 8 was the removal of the popular Start Button. While the company brought some of its capabilities back with Windows 8.1, the new build of the Technical Preview introduces a new Start Menu that Windows 7 users who have avoided Windows 8.x should feel comfortable with. The Start Menu displays the apps you use most on the left side of your screen and lets you customize the rest of the page with tiles that can be sized however the user chooses and grouped based on preferences such as productivity tools and content. It can be viewed in desktop mode (Figure 1) or in the pure tablet interface (Figure 2).

[Click on image for larger view.]  Figure 1. The Start Menu in Windows 10's desktop mode.

 

[Click on image for larger view.]  Figure 2. The Start Menu in Windows 10's tablet mode.

Cortana: The digital voice assistant available to Windows Phone users is now part of the Windows PC and tablet environment, and it was the first thing I wanted to test upon downloading the new build. It wasn't able to answer many questions, though when asked who's going to win the Super Bowl, Cortana predicted the New England Patriots. We'll see how that plays out. Ask for the weather forecast and Cortana will give you a brief answer. In other cases, my questions would initiate a Bing query and return the search results in a browser view. Cortana is also designed to search your system, OneDrive and other sources based on your queries. Microsoft has warned that Cortana for Windows is still in early development, but it could emerge as a useful feature if it's able to work as the company hopes. Like the Start Menu, Cortana works on the traditional desktop (Figure 3) or in the tablet mode (Figure 4).

[Click on image for larger view.]  Figure 3. Cortana on the desktop.

 

[Click on image for larger view.]  Figure 4. Cortana in the tablet mode.

Continuum: The design goal of Windows 10 is its ability to let users transition between desktop and touch-based tablet modes. In either environment, you should be able to access desktop or Windows Store apps. For example if you have downloaded Google's Chrome browser as a desktop app, when in the tablet mode it will appear as an app in that environment. In either case, you're accessing the same browser, just from a different interface.

Farewell Charms: Microsoft introduced Charms with Windows 8 as a hip new way of configuring machines, but many found them cumbersome and confusing. In the new build, Charms are gone, replaced by a new Settings component (Figure 5). As the name implies, Settings offers an easy way to customize the display, connect peripherals and configure networks.

[Click on image for larger view.]  Figure 5. Windows 10's new Settings component.

New Windows Store: Microsoft is preparing a new store that has a common design for PC, tablet and phone users as well as those accessing it via the Web. The new Windows Store beta (Figure 6) appears as a gray icon, though the existing Windows Store is still available in green.

[Click on image for larger view.]  Figure 6. The Windows Store beta.

File Explorer: Many users have complained that the File Explorer in Windows 8.x doesn't allow for a default folder. Now when opening the File Explorer in the new preview, it can be set to open to a default folder (Figure 7).

[Click on image for larger view.]  Figure 7. Windows 10's File Explorer.

Because this is still an early beta, you'll find bugs, and features you see here may not end up in the shipping version this fall. If you've looked at this build, please share your opinions on the latest Windows 10 Technical Preview.

 

Posted by Jeffrey Schwartz on 01/26/2015 at 3:40 PM0 comments


Box Shares Soar as IPO Shows Investor Appetite for the Cloud

After putting its plans to go public last year on hold, Box's widely anticipated IPO got out of the starting gate today with its shares up as much as 70 percent midday Friday. The company plans to use the estimated $180 million in proceeds to maintain operations and invest in capital infrastructure to grow its enterprise cloud offering.

Founder and CEO Aaron Levie launched the company in 2005 with the aim of offering an alternative to large premises-based enterprise content management systems. Over the years, Levie publicly put a target on the back of SharePoint. Levie's ambitions earlier this decade to establish Box as a SharePoint killer peaked before Office 365 and OneDrive for Business arrived. While Levie still has strong aspirations to become the primary storage and file sharing service for businesses, the market is more challenging now that Office 365 with OneDrive for Business, Google Drive and others are widely entrenched within enterprises.

For its part, Box has always targeted large enterprises, boasting such customers as GE, Toyota, Eli Lilly, Boston Scientific, Procter & Gamble, Chevron, Schneider Electric and Stanford University. Speaking on the New York Stock Exchange trading floor with CNBC, Levie emphasized that the best opportunity for the company lies with targeting large enterprises and mid-size firms with 500 to 1,000 employees.

Amid enthusiasm for Box, there's also plenty of skepticism among analysts. The company incurs large customer acquisition and retention costs, which include "customer success managers" assigned to those with contracts for the life of the relationship, according to the company's S1 filing with the Securities and Exchange Commission (SEC). Moreover, Box is unprofitable with no target date for turning a profit in sight. According to the filing, Box recorded a $169 million loss for the fiscal year ended January 31, 2014 with a deficit of $361 million.

Also in its filing, Box points to competitors including EMC, IBM and Microsoft (Office 365 and OneDrive), Citrix (ShareFile), Dropbox and Google (Google Drive). There are plenty of other rivals as well, both entrenched vendors and newer players such as Acronis, Carbonite, ownCloud and CloudBerry Lab.

Now that Box believes it doesn't have to displace SharePoint and OneDrive for Business in order to succeed, the company last summer forged an agreement to collaborate with Microsoft. The collaboration pact ensured that Office 365 could use Box to store and synchronize files as an alternative to OneDrive, both of which now offer unlimited storage for paying customers. Microsoft and Box rival Dropbox forged a similar arrangement.

Box also offers APIs that let developers build applications using Box as the content layer, so content from a given application can be centrally and securely stored within the Box service. Salesforce.com and NetSuite are among those that have used the API to tie their offerings together. In addition, Box last month added a new enterprise mobility management service, Box for Enterprise Mobility Management (EMM), which fits into the company's new Box Trust effort. That initiative consists of a string of partnerships with security vendors and providers of data loss protection management tools. Symantec, Splunk, Palo Alto Networks, Sumo Logic and OpenDNS join existing partners Skyhigh Networks, Hewlett Packard, Okta, MobileIron, CipherCloud, Recommind, Ping Identity, Netskope, OneLogin, Guidance Software and Code Green Networks.
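For developers curious about what using Box as the content layer mentioned above looks like in practice, here is a minimal sketch -- not code from Box or its partners -- assuming Box's v2 REST endpoints, the Python requests library and a valid OAuth 2.0 access token (a placeholder here). It lists the items in an account's root folder and uploads a file into it; check Box's developer documentation for the current upload form fields.

import json
import requests

# Placeholder token; obtain a real one via Box's OAuth 2.0 flow.
ACCESS_TOKEN = "YOUR_BOX_ACCESS_TOKEN"
HEADERS = {"Authorization": "Bearer " + ACCESS_TOKEN}

# List the items in the root folder (folder ID "0" is the root in Box's API).
resp = requests.get("https://api.box.com/2.0/folders/0/items", headers=HEADERS)
resp.raise_for_status()
for item in resp.json().get("entries", []):
    print(item["type"], item["id"], item["name"])

# Upload a local file into the root folder so another application can
# reference it as shared content.
attributes = {"name": "proposal.pdf", "parent": {"id": "0"}}
with open("proposal.pdf", "rb") as f:
    upload = requests.post(
        "https://upload.box.com/api/2.0/files/content",
        headers=HEADERS,
        data={"attributes": json.dumps(attributes)},
        files={"file": f},
    )
upload.raise_for_status()
print("Uploaded file ID:", upload.json()["entries"][0]["id"])

The same bearer-token-plus-REST pattern is roughly how a third-party app would surface Box content alongside its own records, though production integrations would add error handling and token refresh.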

It remains to be seen if Box and its chief rival Dropbox can go it alone or if they'll become attractive takeover candidates. Of course that will depend on their ability to grow and ultimately turn a profit. Do you see Box or Dropbox becoming your organization's primary file store and sharing platform?

 

Posted by Jeffrey Schwartz on 01/23/2015 at 12:27 PM


Windows 10 Free Upgrade Doesn't Apply to Enterprise Editions

It's great that Microsoft will let Windows 7 and Windows 8.x users upgrade their systems to the new Windows 10 for free when it comes out this fall. But before you cheer too loudly, beware of the fine print: the deal doesn't apply to Windows Enterprise editions.

A Microsoft official earlier in the week told me that the company will hold an event in March emphasizing the enterprise features of Windows 10. Hopefully Microsoft will reveal then whether it will offer the free upgrade or some other incentive for enterprise users. In the fine print discovered by my colleague Kurt Mackie, Microsoft noted the exclusions, which also include the small number of Windows RT users.

"It is our intent that most of these devices will qualify, but some hardware/software requirements apply and feature availability may vary by device," according to the explanation. "Devices must be connected to the Internet and have Windows Update enabled. ISP fees may apply. Windows 7 SP1 and Windows 8.1 Update required. Some editions are excluded: Windows 7 Enterprise, Windows 8/8.1 Enterprise, and Windows RT/RT 8.1. Active Software Assurance customers in volume licensing have the benefit to upgrade to Windows 10 Enterprise outside of this offer. We will be sharing more information and additional offer terms in coming months."

In many instances, new system rollouts could negate this issue. Will a free upgrade make or break your organization's decision to move to Windows 10?

 

Posted by Jeffrey Schwartz on 01/22/2015 at 12:22 PM


Microsoft's HoloLens Doesn't Look Like a Google Glass Wannabe

One of the unexpected surprises at yesterday's Windows 10 prelaunch event and webcast came when Microsoft showed off slick-looking eyewear designed to bring holography to the mainstream. Whether Google got word of it days earlier when it pulled its own failed Google Glass experiment off the market is unknown. But the irony of the timing notwithstanding, Microsoft's new HoloLens appears to have more potential.

Microsoft Technical Fellow Alex Kipman, who works in Microsoft's operating systems group and is known as the "father of Kinect," made the surprise introduction of HoloLens at the Windows 10 event. While he didn't say when it would come out or how much it will cost, he positioned it as a product that's designed around Windows 10 and the suggestion is we'll see it sometime later this year.

"Holographic computing enabled by Windows 10 is here," said Kipman. "Every Windows 10 device has APIs focused on humans and environment understanding. Holographic APIs are enabled inside every Windows 10 build from the little screens to the big screens to no screens at all." It has a built-in CPU and GPU, but doesn't require external markers, cameras, wires or a computer connection, he added, and it will blend the physical and digital worlds.

When worn, HoloLens is designed to overlay holograms on the wearer's existing environment, giving a 3D-like visual experience. While it surely will enhance Microsoft's gaming and entertainment portfolio, including Xbox and Minecraft, the company also underscored practical uses for HoloLens. In a video, the company described how HoloLens can let workers share ideas, collaborate, teach and learn in a more visually immersive way.

Unlike Google Glass, HoloLens appears to have more practical use cases and may actually offer broader appeal. How broad will depend on price and how useful it ultimately is. But Microsoft has the advantage of seeing where Google Glass fell short and potentially has a larger ecosystem behind it. Perhaps it's even the catalyst that can bring developers of modern apps for other platforms into the fold?

Either way, Google hasn't thrown in the towel in this segment and it could prove to be a burgeoning market alongside other wearable gadgets. Kipman said Microsoft has worked on HoloLens in its research labs for many years, suggesting the demo wasn't just vaporware. It's yet another way Microsoft could draw demand for Windows 10 if users find HoloLens appealing. That could be the case if the price is right and it works as advertised.

Posted by Jeffrey Schwartz on 01/22/2015 at 12:28 PM


Windows 10 Isn't Just a Free Upgrade -- It Talks and Is Smart

Microsoft potentially removed a crucial barrier to the future of its Windows franchise by saying it will offer the next version -- Windows 10 -- as a free upgrade to existing Windows 7 and Windows 8.x  users. The company is also adding some compelling new features that may make the upgrade worth the effort if these new capabilities live up to their promise.

Speaking at the anticipated launch event in Redmond today, Terry Myerson, Microsoft's executive vice president of operating systems, announced the free upgrade. The caveat is users must install Windows 10 within a year of its release, though it remains to be seen whether that deadline will hold. Perhaps for consumers, which today's event was aimed at, that won't be a big deal. But businesses and enterprises do things on their own clocks, based on need and compatibility.

In an earlier post, I thought it would be a wise move to offer the free upgrade, though I had no knowledge Microsoft would ultimately do so. As part of the release, Microsoft is also shifting to what it calls Windows as a service, where it will provide continuous upgrades.

"When it comes to Windows as a service, it's a pretty profound change," Microsoft CEO Nadella said at today's event. "For customers, they're going to get a continuous stream of innovation. Not only a continuous stream of innovation but also the assurance their Windows devices are secure and trusted. For developers, it creates the broadest opportunity to target. For our partners, hardware and silicon partners, they can coincident with our software innovation, drive hardware innovation. We want people to love Windows on a daily basis."

Microsoft gave a number of reasons to "love" Windows besides the free upgrade. The company announced the rumored Spartan Web browser, which has a rendering engine better suited for modern Web applications, Myerson said. Microsoft will also offer Cortana, the digital assistant released for Windows Phone last year, for Windows running on PCs and tablets.

Officials also demonstrated a set of universal apps -- such as Word, Excel, PowerPoint, OneNote and Outlook -- that, while optimized for each form factor, are consistent across devices and designed to let users stop working on one device and quickly pick up where they left off on another. An Xbox app for Windows that will allow Xbox users to run games on their PC or Windows tablet was also announced.

Microsoft also revealed its vision for augmented reality and took the wraps off HoloLens, which ironically is a Google Glass-looking device that Microsoft said has a built-in CPU, a GPU and an array of sensors. Microsoft described it as the world's first holographic computer. Its APIs are designed to work with the new Windows 10 environment.

More hardware is in the works from third parties and Microsoft. The event showcased the new Surface Hub, a Windows 10-based 84-inch Ultra HD display with Skype for Business built-in, sensors, cameras and the ability to mark up content with any phone or device. The company will also offer a 55-inch version and indicated other Surface hardware is in the works.

The company will release a new Windows 10 technical preview next week with a Windows 10 build for phones scheduled for release in early February. Many of the new features Microsoft demonstrated today will work their way into builds of the technical preview over the next three to five months, said Joe Belfiore, a vice president in the operating system group. Microsoft also plans to reveal more features for enterprises in March, according to a company official.  The company still plans for the commercial release of Windows 10 in the fall timeframe.

Posted by Jeffrey Schwartz on 01/21/2015 at 2:29 PM


.NET and AWS Pioneer Mike Culver Passes Away

Mike Culver, who served in a number of strategic roles at Amazon Web Services from the launch of the company's popular public cloud, lost his battle with pancreatic cancer this week. He was 63. Culver, who before joining AWS was a technical evangelist at Microsoft in the early days of the .NET Framework rollout, was deeply respected in Redmond and throughout the world.

In his roles at AWS, Culver trained insiders across the emerging cloud industry on how to build and deploy apps on EC2 and scale them with the company's Simple Storage Service (S3). "Mike was well known within the AWS community," wrote AWS evangelist Jeff Barr, who had shared an office with Culver years back. "He joined my team in the spring of 2006 and went to work right away. Using his business training and experience as a starting point, he decided to make sure that his audiences understood that the cloud was as much about business value as it was about mere bits and bytes."

Culver spoke at many AWS and industry events, including Visual Studio Live. I met him at Visual Studio Live in 2008, where he gave a session on how to scale ASP.NET applications with cloud-based content delivery. At the time, Culver was head of developer relations at AWS. Keep in mind, this was before Microsoft officially announced Azure, and AWS S3 was brand new. I was quite impressed by his presentation and sat down with him. Though that was the only time we met, we became friends on Facebook and occasionally commented on one another's posts. I'm quite saddened by his loss -- for him, his wife, his grown children, his siblings and the many colleagues who clearly had deep admiration and respect for him.

When he was diagnosed with pancreatic cancer in 2013, Culver was quite candid about his treatment and kept an upbeat yet realistic outlook on his battle. Pancreatic cancer is among the deadliest of cancers; I lost my father to it nearly a decade ago. A few weeks ago, Culver was accepted into a trial of a new therapy to battle the disease, though in the end the disease was too far advanced. He entered hospice last week. RIP, Michael.

Posted by Jeffrey Schwartz on 01/21/2015 at 2:30 PM


Google Glass Exits Incubation and Enters the Nest

However you feel about the emerging wearables market, many rightfully have found the notion of Google Glass over the top. Given its obvious potential to distract the wearer's attention, it should be illegal to wear the device on the street and certainly while driving.

Google's announcement yesterday that it will end the Google Glass experiment on Jan. 19 was inevitable, since all experiments come to an end. On the other hand, Google has a history of labeling new products or services as tests or betas for an extended amount of time -- remember when Gmail was a beta product for more than five years despite the fact that millions were using it?

Certainly millions weren't using Google Glass, and given its $1,500 price tag, it's also not surprising that Jan. 19 is the last day Google will offer it. The company's announcement yesterday that it is moving Google Glass from the Google X research labs, where it was headed by Glass chief Ivy Ross, into the Nest unit run by Tony Fadell makes sense.

Nest is the company that manufactures and sells network-enabled smart thermostats, which Google acquired last year for $3.2 billion. A few months later, Nest also acquired Dropcam, a provider of cameras that, like Nest's thermostats, have built-in Wi-Fi connectivity, for $555 million.

Some reports are cheering the demise of Google Glass, though the company seems to have future plans for it. Hopefully the Nest division will focus Google Glass on practical uses: vertical and specialty functions that give medical practitioners and all kinds of field workers a tool to do useful things they're now incapable of doing.

Posted by Jeffrey Schwartz on 01/16/2015 at 12:32 PM


Obama's New Cybercrime Proposal Could Put Onus on IT Orgs

It appears President Obama's forthcoming legislative proposal to crack down on cybercrime could impose additional liabilities on IT organizations, with potential penalties for failing to put in place proper policies, auditing practices and breach reporting.

In a speech Tuesday, the President laid out his plans to propose new legislation aimed at stiffening the penalties for all forms of cybercrime that put the nation's critical information infrastructure, as well as individual privacy, at risk. Obama will emphasize his legislative proposal to Congress in his annual State of the Union address.

"We want to be able to better prosecute those who are involved in cyberattacks, those who are involved in the sale of cyber weapons like botnets and spyware," Obama said in Tuesday's speech. "We want to be sure we can prosecute insiders who steal corporate secrets or individuals' private information. We want to expand the authority of courts to shut down botnets and other malware. The bottom line: we want cyber criminals to feel the full force of American justice because they are doing as much if not more these days as folks who are involved in conventional crime."

The White House also announced it will host a cybersecurity and consumer protection summit at Stanford University on Feb. 13, which will include speeches, panel discussions and a number of topic-specific workshops. Stanford said it is still finalizing details of the summit.

In addition to calling for better information sharing, the legislation will call for compliance with "certain privacy restrictions such as removing unnecessary personal information and taking measures to protect personal information that must be shared in order to qualify for liability protection." According to an outline on the White House Web site, the President will also propose giving law enforcement the tools they need to "investigate, disrupt and prosecute cybercrime."

The administration has also revised an existing proposal pertaining to security breach reporting "by simplifying and standardizing the existing patchwork of 46 state laws (plus the District of Columbia and several territories) that contain these requirements into one federal statute, and putting in place a single clear and timely notice requirement to ensure that companies notify their employees and customers about security breaches."

Over the next five years, the Department of Energy will also provide $25 million in grants to fund the training of cybersecurity professionals. The move, of course, comes amidst growing concerns about high-profile breaches over the past year including Target, Home Depot and most recently Sony, among others.

Yet the President is sure to face a battle, especially as it relates to information sharing, where the IT industry is fighting to ensure customer privacy and civil rights. For its part, Microsoft has led that fight in its battle to protect data residing on servers in Dublin, despite last year's court order mandating the release of that information. The Electronic Frontier Foundation, the non-profit organization focused on protecting civil liberties, swiftly denounced the President's proposal.

"President Obama's cybersecurity legislative proposal recycles old ideas that should remain where they've been since May 2011: on the shelf," according to a statement it released following Obama's proposal. "Introducing information sharing proposals with broad liability protections, increasing penalties under the already draconian Computer Fraud and Abuse Act, and potentially decreasing the protections granted to consumers under state data breach law are both unnecessary and unwelcome."

But the White House isn't alone in its effort to crack down on cybercrime. New York State Attorney General Eric Schneiderman yesterday said he plans to propose legislation that would require companies to inform customers and employees following any type of cyberattack or breach. The legislation would also broaden the scope of data companies would be required to protect, impose tighter technical and physical security protection and offer a safe harbor for organizations meeting certain standards, according to a statement released by the AG's office. "With some of the largest-ever data breaches occurring in just the last year, it's long past time we updated our data security laws and expanded protections for consumers," Schneiderman said.

While it's good that cybercriminals will face harsher penalties for their crimes -- and they should -- it's not likely to thwart those determined to inflict the most harm. Still, no one wants to be the next Target or Sony. As the content of this new legislation is debated, it also puts enterprises on notice that they will need to take measures to protect their critical data -- for their benefit and for everyone else.

 

Posted by Jeffrey Schwartz on 01/15/2015 at 9:48 AM


Facebook Begins Enterprise Social Network Service Testing

Facebook apparently does intend to enter the enterprise social networking market with its own offering targeted at business users. The new Facebook at Work will let Facebook users establish work accounts that are separate from their personal accounts.

Rumors that Facebook was developing a business network first came to light in November. News that the Facebook at Work pilot would launch today surfaced this morning in a report by Recode. A Facebook spokeswoman confirmed that the company has launched the pilot with some undisclosed participants testing the new service.

"We're not disclosing the handful of companies in the pilot, since it's early and we're still testing," the spokeswoman said in response to an e-mail inquiry. I was able to download the new Facebook at Work app on my iPhone but when searching for it with my iPad  and Windows 8.1 PC, the app didn't appear in their respective app stores (as of Wednesday afternoon). The Facebook at Work FAQ indicated that it's also available in the Google Play app store.

"With a Facebook at Work account, you can use Facebook tools to interact with coworkers," according to the FAQ. "Things you share using your work account will only be visible to other people at your company. To set up an account, your company must be using Facebook at Work." The current app allows users to request more information on how employers can establish accounts.

What effect a full-blown launch of Facebook at Work might have on incumbent enterprise social network providers such as Microsoft's Yammer, Salesforce.com's Chatter and Jive remains to be seen. But as SharePoint and Yammer expert Christian Buckley of GTConsult said back in November, "they will undoubtedly attract users, and have a number of high-profile deployments, but there is a very real line of demarcation between consumer and business platforms, and I just don't see Facebook as being able to close that gap in any serious way."

Do you see your organization using Facebook at Work?

Posted by Jeffrey Schwartz on 01/14/2015 at 12:44 PM


Microsoft Showcases Retailer Azure Adoption

Microsoft is making a big splash at this year's annual National Retail Federation (NRF) show in New York. The company is showcasing a number of major brand name chains that have kicked off efforts to improve their in-store experiences by using Azure, predictive analytics and new ways of interacting using apps delivered on mobile devices and kiosks.

While Microsoft emphasized at last year's NRF show that many of its customers were rolling out mobile devices for their employees, the types of apps that various retailers and restaurant chains are rolling out this year use Microsoft Azure in a big way. A number of big chains, including GameStop and McDonald's, are deploying applications that use Azure Machine Learning, Microsoft's predictive analytics tool rolled out last year.

Usage of Azure by retailers has grown exponentially in the past year, Tracy Issel, general manager of Microsoft's retail sector, said in an interview. "It used to be [that] we talked with people about going to the cloud and the perceived risk and their concern about scalability," Issel said. "I haven't had one of those conversations in a long time. Now it's 'what do I move first and when do I do it?' Many are moving to Office 365 and Azure simultaneously."

In a roundtable discussion today, Issel introduced four customers that are in the midst of major new efforts using Azure and/or Windows 8.1-based tablets and kiosks. Here's a brief synopsis of their efforts:

GameStop: Jeff Donaldson, senior vice president of the GameStop Technology Institute, outlined a number of initiatives that aim to have customers use the retailer's mobile app in ways that let store employees better engage with them when they visit. The app uses a variety of analytics tools, including Hewlett Packard Vertica, SAS for statistical analysis and Azure Machine Learning, to inform a sales rep, when a customer comes into a store, of what interactions have taken place in the past. "When they come into the store, we want to make sure the employees know about the messages we sent to customers so they better understand the intent of the visit," Donaldson said. Another major effort calls for delivering Ultra HD content to each of its 6,400 stores using Azure Media Services.

CKE Restaurant Holdings, aka Carl's Jr. and Hardee's: The popular fast-food chain has concluded that millennials would much rather order their food from a kiosk than from a person, said Thomas Lindblom, senior vice president and chief technology officer. As such, Hardee's is rolling out kiosks that allow customers to choose and customize their burgers, and the software is designed to upsell. Lindblom is using off-the-shelf 24-inch Dell touch-based PCs to deliver the highly visual application. Lindblom said CKE is "a significant user of Azure" for a number of functions, including storage and disaster recovery. CKE has also rolled out Office 365.

TGI Fridays: Looking to modernize the dining experience, the chain will equip waiters and waitresses with off-the-shelf eight-inch Windows tablets running apps developed by Micros (which is now part of Oracle). The point-of-sale solution is designed to track customer preferences through loyalty cards. "We are cutting training times [and] we are able to deliver this digital touch point in the hands of our servers as they serve their guests," said CIO Tripp Sessions.

McDonald's: Microsoft partner VMob has rolled out an Azure-based application at McDonald's locations in Europe that lets the chain track customer preferences using information gathered from their purchasing patterns. VMob has also started rolling it out at locations in Japan. VMob founder and CEO Scott Bradley, who was demonstrating the solution in Microsoft's booth, indicated he's still working on getting the app into United States locations, but he implied that may take some time and said it's not a done deal. Nevertheless, he said he believes McDonald's eventually will roll it out in the U.S.

Posted by Jeffrey Schwartz on 01/13/2015 at 1:48 PM


Curved Desktop Displays: Ergonomic or Cosmetic?

At last week's Consumer Electronics Show in Las Vegas, there were robots, smartwatches, driverless cars, ultra-high-definition TVs and home automation systems. Even the traditional PC desktop display got a facelift.

Hewlett Packard was among a number of suppliers showcasing new curved desktop displays, designed to provide a more "immersive" experience, as Ann Lai, director of commercial displays at HP, put it in a briefing prior to the show.

"With a curve desktop display, you're really sitting right in front of it, so the curve is wrapping around you and providing you a very immersive experience that also makes it easier to read and more comfortable to use for longer periods of time," Lai said . "As displays have gotten larger, we've noticed the edges can be harder to read, especially when you're sitting close to it. With a curved display, you're going to have much more comfortable peripheral viewing as well as a much more immersive experience as you're using your display."

I'm reserving judgment as I've never tried one, though my first reaction was that these would have more cosmetic appeal for an executive's office than practical value in making workers more productive. HP's new Pavilion 27 CM Elite display and Elite display S273 are both 27-inch curved displays priced at $399. The price is a slight premium over displays without a curve.

If you were looking for a new desktop display, would one with a curve be on your checklist?

Posted by Jeffrey Schwartz on 01/12/2015 at 3:32 PM


Do You Use 'Workplace Join' for BYOD?

When Microsoft released Windows Server 2012 R2 back in the fall of 2013, one of the many features we pointed out at the time was "Workplace Join," which is designed to let organizations give single sign-on capability to their bring-your-own-device (BYOD) employees -- or to anything else not designed to join an Active Directory domain. Simply put, it lets you register a non-domain-joined device running Windows 8.1, iOS or Android with Active Directory.

Microsoft was especially happy to tout Workplace Join when it launched its Windows RT-based Surface 2 back in September 2013. In an interview at the time, Surface Director of Marketing Cyril Belikoff talked up the Workplace Join capability with me. "Workplace Joins are the access components of a directory service that allows a user to use their ID and password to access their corporate network documents and shares in a secure way," Belikoff said. "It's not a fully domained device but you get the administration of mobile device management and get the access component." Last year, Microsoft added support for Windows 7-based systems as well.

Workplace Join does require Active Directory Federation Services, Active Directory on premises and the Device Registration Service, all part of the "Federation Services Role on Windows Server 2012 R2," as described in the TechNet library. 

I've talked to a variety of mobile device management vendors and suppliers of Active Directory management and auditing tools and I've heard various views. Some say customers (or prospective ones) are questioning how these tools will support Workplace Join and others recommend using the device enrollment features in their wares.

If Microsoft boosts the functionality of Workplace Join in the forthcoming Windows 10 operating system, that could help build its popularity.

Please share your experience or views on Workplace Join. Is it suitable for your BYOD authentication requirements? Drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 01/08/2015 at 2:31 PM


Survey: Many Enterprises Will Implement Windows 10 Within a Year

IT pros apparently plan to give Windows 10 a warmer welcome than they gave Windows 8 when it arrived, according to an online survey of Redmond magazine readers conducted during December and early this month. A respectable 41 percent said they plan to deploy PCs with Windows 10 within a year after it ships and 30 percent will go with Windows 8.1, the survey shows.

To put that in perspective, nearly two years ago only 18 percent of Redmond magazine readers said they planned to deploy Windows 8 in a survey fielded six months after that operating system shipped. The 41 percent now saying they will deploy Windows 10 was all the more respectable given it ranked second when asked which of Microsoft's "newer" offerings they plan to deploy this year. Topping the list, to no surprise, was Office 365, where nearly 46 percent say they plan to deploy it this year. Many respondents also plan to deploy Lync, Azure IaaS and Azure Active Directory this year.

To be sure, in a different question where Windows 10 was not an option (since it's still in Technical Preview), respondents overwhelmingly see Windows 7 as their client platform of choice to replace aging PCs, though not as many as in previous years, which is to be expected. Here's the breakdown of PC replacement plans:

  • Windows 7: 57%
  • Windows 8.1:  30%
  • Virtual desktop/thin client: 6%
  • Whatever employee requests: 1%
  • Linux: 1%

It stands to reason that those that have started to mix Windows 8.1 devices into their shops will continue doing so. Anecdotally, though, I'm hearing many are awaiting the arrival of Windows 10 before making any firm commitments.

While it's hard to predict how IT decision makers ultimately will choose to replace aging desktops and portable PCs, it appears both Windows 8.1 and Windows 10 for now are the likely favorites among this audience in the coming year.

Drop me a line if you want to share your top IT priorities for 2015. I'm at [email protected].

Posted by Jeffrey Schwartz on 01/07/2015 at 1:02 PM


Redmond Exec Defends Against Criticism of Microsoft Band

As the annual Consumer Electronics Show kicks off today in Las Vegas, you can expect to hear lots of buzz about driverless cars, home automation systems, the so-called "Internet of Things" and of course wearable computing devices (including smartwatches and fitness bands).

Having spent most of December using the new Microsoft Band, as I reported last month, I found it has some nice features, but it's still buggy and, in my opinion, not worth the steep $199 price tag. When I returned my Microsoft Band, the clerk asked why. I mentioned the buggy Bluetooth synchronization with iOS, which she admitted is a common problem with the Microsoft Band. It was also a problem CNBC On-Air Editor Jon Fortt emphasized while interviewing Matt Barlow, Microsoft's general manager of new devices.

"I'm sorry to hear about the challenges you're running into, but a bunch of other people using those devices are having a great time being fit with Microsoft Band," Barlow responded. Not letting Barlow off the hook, Fortt told Barlow that Microsoft has already acknowledged the Bluetooth connectivity issues with iOS. "With any types of new product rollout, you're going to have updates that need to occur with software," Barlow responded. "We're updating software and we're updating usability, so I'm definitely convinced that [we're] seeing people using the Microsoft Band with all phone types without any issues moving forward."

Barlow went on to tout the unique 24-hour heart-tracking capability of the Microsoft Band, along with its on-board GPS, guided workouts and e-mail, text and Facebook integration.  "People are really looking for value, and when I think about what we have with the Microsoft Band ... at a $199 price point, [it] is certainly magical," he argued.

Clearly these are the early days for wearable devices, and it remains to be seen whether they will take off. For its part, Microsoft has only offered its band through its retail stores, further limiting its presence. One could argue that many are waiting for the Apple Watch, due out this quarter, but at a starting price of $349, it's not likely those will be flying off the shelves either. Results of a survey by Piper Jaffray confirmed that.

Not that its earlier surveys showed pent-up demand either, but now only 7 percent of 968 iPhone users surveyed said they intend to purchase an Apple Watch, down from 8 percent back in September when it was introduced and 10 percent in September 2013.

"We believe that the muted response to the Watch is due to consumer questions including what is the killer feature of the watch?," wrote Gene Munster,  senior analyst and known Apple bull, in a Dec. 21 research note. People also want to know, "what applications will be available for the watch? We believe that as we get closer to launch [this] year, Apple will answer many of these questions and demand will increase; however, we still believe expectations for the first year of the watch should remain conservative."

Do you use a wearable such as the Apple Watch, the Microsoft Band or another device from the plethora of other players offering similar products?

 

Posted by Jeffrey Schwartz on 01/05/2015 at 12:25 PM


Microsoft Pledges Windows 10 Will Be 'Awesome' and 'Rock Solid'

Microsoft has taken a beating from critics over this month's security patch, for which the company initially suggested that Windows 10 Technical Preview testers might need to uninstall Office before it came up with a less invasive workaround. Despite that and numerous other frustrations with the Windows 10 Technical Preview, Microsoft reported that 1.5 million testers have their hands on a preview version of Windows 10, and nearly a third of them are "highly active." The company also claims that more people are testing it than any beta release of Windows to date.

Apologizing for not having a major new build this month, Gabriel Aul, a data and fundamentals team lead at Microsoft's Operating Systems Group, promised it would be worth the wait. "We're really focused on making the next build something that we hope you'll think is awesome," Aul wrote in a Windows Insider blog post Wednesday. "In fact, just so that we have a *daily* reminder to ourselves that we want this build to be great, we even named our build branch FBL_AWESOME. Yeah, it's a bit corny, but trust me that every Dev that checks in their code and sees that branch name gets an immediate reminder of our goal."

Microsoft recently said it will reveal what's in store in the next build on January 21 and Aul indicated it would have some substantive new features. Though Microsoft didn't release a major new build in December, Aul pointed out the company has issued numerous bug fixes as a result of feedback from the 450,000 active testers. Given the poor reception for Windows 8.x, the record number of testers of the Windows 10 Technical Preview is an encouraging sign.

While the large participation doesn't guarantee Windows 10 will be a hit, a sparse number of testers would obviously lower the odds of success. "That hardcore usage will help us fix all the rough edges and bugs," Aul said, noting his favorite was a "very rare" instance when the OneDrive icon in File Explorer could be replaced by an Outlook icon. So far, he noted, testers have helped discover 1,300 bugs. Aul said while most are minor bugs, Microsoft will implement UX changes based on the feedback as well. Many will be small changes and others will be major.

What's on your wish list for the next build and where does the final release of Windows 10 fit in your plans for 2015?

Posted by Jeffrey Schwartz on 12/19/2014 at 10:54 AM


IBM Extends Its Global Cloud Footprint and Customer Roster

It's been a tough few months for IBM. The company has seen its shares tumble in 2014 amid weak earnings. But looking to show it may be down but not out, Big Blue said it has picked up the pace in building out its cloud footprint after getting off to a slow start several years ago.

To close the year, the company yesterday said the IBM Cloud now has 12 new datacenters around the world including in Frankfurt, Mexico City, Tokyo and nine other locations through colocation provider Equinix in Australia, France, Japan, Singapore, the Netherlands and the United States.

The IBM Cloud now has datacenters in 48 locations around the world. That's double the number it had last year thanks to its promise to invest $1.2 billion to expand its cloud network, which kicked into high gear with last year's acquisition of SoftLayer. On top of that, IBM invested $1 billion to build its BlueMix platform-as-a-service technology, designed for Web developers to build hybrid cloud applications.

Enabling that expansion, IBM made aggressive moves including its deal with Equinix to connect to its datacenters using the large colocation provider's Cloud Exchange network infrastructure. IBM also inked partnerships with AT&T, SAP and Microsoft. With its recently announced Microsoft partnership, applications designed for Azure will work on the IBM Cloud (and vice versa). As IBM and Microsoft compete with each other, Amazon and numerous other players, the partnership with Microsoft promises to benefit both companies and their customers.

As a result of its aggressive expansion this year, IBM says it, Microsoft and Amazon have the largest enterprise cloud infrastructure and platform offerings. Nevertheless, few would dispute Amazon remains the largest cloud provider.

In its announcement yesterday, IBM also said it recently inked more than $4 billion in long-term enterprise cloud deals with Lufthansa, WPP, Thomson Reuters and ABN Amro, and that its customer base has doubled in the past year to more than 20,000. While many are traditional Big Blue shops, IBM says thousands are new companies, including startups. IBM said its cloud revenues are growing at a 50 percent rate and on pace for $7 billion in 2015.

 

Posted by Jeffrey Schwartz on 12/18/2014 at 10:53 AM


Salesforce.com Launches File Connector with Links to SharePoint Online and OneDrive

Salesforce.com today launched a connector that aims to bridge its cloud-based CRM portfolio of services with enterprise file repositories. The new Salesforce Files Connect will let organizations centralize their customer relationship management content with file stores including SharePoint and OneDrive, with a connector to Google Drive coming in a few months.

The release of the connector to SharePoint and OneDrive was promised back in late May when both Salesforce.com and Microsoft announced a partnership to integrate their respective offerings. While the two companies have a longstanding rivalry, they also share significant overlapping customer bases. The companies at the time said they would enable OneDrive for Business and SharePoint Online as integrated storage options for the Salesforce platform.

In today's announcement, Salesforce claims it's the first to create a repository that natively integrates CRM content and files among popular enterprise file stores. Salesforce.com said it provides a simple method of browsing, searching and sharing files located in various repositories.

Salesforce.com described two simple use cases. One would enable a sales rep to attach a presentation stored in OneDrive for Business to a sales lead in the Salesforce CRM app. The other would allow a service representative to pull FAQ content from OneDrive for Business while working in the Salesforce Service Cloud app.

The connector supports federated search to query repositories simultaneously from any device and lets users attach files to social feeds, groups or records, enabling them to find contextually relevant information in discussions running in Salesforce Chatter. The tool is also designed to enforce existing file permissions.

For customers and third-party software providers wanting to embed file sharing into their applications, Salesforce.com also is offering the Salesforce Files Connect API.
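As a rough illustration of how a developer might call that API -- this is a sketch, not sample code from Salesforce.com -- the snippet below assumes the Files Connect resources are exposed under the Connect REST API's content-hub path, a v32.0 endpoint and an OAuth access token already in hand (both values below are placeholders). Verify the exact paths, API version and response shape against Salesforce's documentation for your org.

import requests

# Placeholder values; obtain these from the standard Salesforce OAuth 2.0 flow.
INSTANCE_URL = "https://yourInstance.salesforce.com"
ACCESS_TOKEN = "YOUR_SESSION_TOKEN"
HEADERS = {"Authorization": "Bearer " + ACCESS_TOKEN}

# Files Connect surfaces external repositories (SharePoint Online, OneDrive
# for Business and so on) through the Connect REST API's content-hub resources.
base = INSTANCE_URL + "/services/data/v32.0/connect/content-hub"

# List the external repositories configured for this org.
repos = requests.get(base + "/repositories", headers=HEADERS)
repos.raise_for_status()

# The response shape may vary by API version; print it to inspect the
# repository IDs, which can then be used to browse or search items, e.g.
# GET {base}/repositories/{repositoryId}/items/{itemId}
print(repos.json())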

 

Posted by Jeffrey Schwartz on 12/17/2014 at 12:26 PM


Azure Site Recovery Now Works Without SCVMM

Like many cloud service providers, Microsoft has identified disaster recovery as a key driver for its hybrid infrastructure-as-a-service (IaaS) offering. Microsoft this year delivered a critical component of its disaster recovery as a service (DRaaS) strategy with Azure Site Recovery.

If you saw Brien Posey's First Look at Azure Site Recovery, you may have quickly lost interest if you're not a Microsoft System Center user. That's because Azure Site Recovery required System Center Virtual Machine Manager.  But with last week's Microsoft Azure release upgrade, the company lifted the SCVMM limitation.

The new Azure Site Recovery release allows customers to replicate and recover virtual machines using Microsoft Azure without SCVMM. "If you're protecting fewer VMs or using other management tools, you now have the option of protecting your Hyper-V VMs in Azure without using System Center Virtual Machine Manager," wrote Vibhor Kapoor, director of marketing for Microsoft Azure, in a blog post outlining the company's cloud service upgrades.

Making Azure Site Recovery available without SCVMM brings the DRaaS to branch offices and smaller organizations that can't afford Microsoft's systems management platform or simply prefer other tools, explained Scott Guthrie, executive vice president of Microsoft's enterprise and cloud business, in a blog post. "Today's new support enables consistent replication, protection and recovery of Virtual Machines directly in Microsoft Azure. With this new support we have extended the Azure Site Recovery service to become a simple, reliable and cost effective DR Solution for enabling Virtual Machine replication and recovery between Windows Server 2012 R2 and Microsoft Azure without having to deploy a System Center Virtual Machine Manager on your primary site."

Guthrie pointed out that Azure Site Recovery builds upon Microsoft's Hyper-V Replica technology built into Windows Server 2012 R2 and Microsoft Azure "to provide remote health monitoring, no-impact recovery plan testing and single click orchestrated recovery -- all of this backed by an SLA that is enterprise-grade." Since organizations may have different uses for Azure Site Recovery, Guthrie underscored the One-Click Orchestration using Recovery Plans option, which provides different Recovery Time Objectives (RTOs) depending on the use case. For example, test and planned failovers typically require different RTOs than unplanned failovers during an actual disaster.

In addition to Hyper-V Replica in Windows Server 2012 R2, Azure Site Recovery can use Microsoft's SQL Server AlwaysOn feature. Azure Site Recovery also integrates with SAN replication infrastructure from NetApp, Hewlett Packard and EMC. And according to a comment by Microsoft's Roan Daley in our First Look, Azure Site Recovery protects VMware workloads across VMware hosts using its new InMage option. Acquired back in July, InMage Scout is an on-premises appliance that offers continuous, real-time data capture and simultaneously performs local backups or remote replication via a single data stream. Microsoft is licensing Azure Site Recovery with the Scout technology on a per-virtual or per-physical instance basis.

Are you using Microsoft's Azure Site Recovery, planning to do so, or are you looking at the various third-party options as cloud-based DRaaS becomes a more viable data protection alternative?

 

Posted by Jeffrey Schwartz on 12/17/2014 at 12:08 PM


The Microsoft Band: Not a Good Fit (Yet)

Microsoft last month entered the wearables market with the Microsoft Band, which, paired with the new Microsoft Health Web site and app for the wrist band, is designed to track your physical activities and bring some productivity features to your wrist.

The Microsoft Band, in my opinion, does a lot of interesting things, though it doesn't really excel at any of them at this point. Among the productivity features included are alerts that let you glance at the first sentence or two of a message, texts, Facebook posts, Facebook Messenger, phone calls, voicemails, schedules, stock prices and an alarm clock that vibrates gradually for deep sleepers who don't like to jump out of bed.

Then there's the health component that has a pedometer to track your steps, a monitor for runners as well as one to track general workouts. It also tracks your sleep including your average heart rate and how often you supposedly woke up. You can synchronize whatever physical activity it monitors with Microsoft Health, an app that runs on any iOS, Android and Windows Phone device. If you use it with Windows Phone, you get the added benefit of using Microsoft's Cortana, the digital assistant that responds to spoken commands. You can also look at reports on the Microsoft Health Web site.

My personal favorite: the Starbucks app, which presents a scannable image of your account, allowing the barista to scan it when you make a purchase. Most baristas seeing it for the first time responded with awe, with one saying that "this is the wave of the future."

Though I've been skeptical about wearables like this one and others from Fitbit, Samsung, Garmin, Nike, Sony and dozens of other providers, it's clearly a growing market and it may very well be the wave of the future -- or at least a wave of the future. Market researcher Statista forecasts that the market for these wearables will be close to $5.2 billion this year, more than double last year's total. In 2015, sales of these gadgets will hit $7.1 billion, and by 2018 they will reach $12.6 billion.

Gartner last month reported that while smart wristbands are poised for growth, its latest survey shows at least half of those surveyed are considering smartwatches. It actually sees the smartwatch market growing from 18 million units this year to 21 million in 2015, while purchases of wristbands will drop from 20 million to 17 million. Certainly the release of the Apple Watch, despite its hefty starting price of $349, will likely fuel that market, though I've already questioned how much demand we'll see for it.

I haven't tested other devices, so it's hard to say how the Microsoft Band rates compared to them. But I find the notion of having information on my wrist more compelling than I had thought. However, the performance of my Microsoft Band is flaky. I've encountered synchronization problems that have required me to uninstall and reinstall the Microsoft Health app on my iPhone on a number of occasions. It has presented realistic heart rates when I'm at the gym, only to suddenly give numbers that aren't believable. When I click on the e-mail button it often says I have nothing new, and even when I can read messages, they're cryptic and don't always indicate the sender.

I like that the Microsoft Band does synchronize with some other health apps, such as MyFitnessPal, which I use to track my meals these days. By importing that data, it provides more relevant info that I'd otherwise have to figure out and enter manually. The problem is, I don't believe I could have possibly burned 2,609 calories from a recent visit to the gym, though it would be nice if that was indeed the case.

That's why after spending several weeks with it, I can say I like the concept but it's not worth its $199 price tag unless money is no object to you. While I agree with my colleague Brien Posey that the Microsoft Band has some nice features, I think I'd wait for an improved version of the Microsoft Band and a richer Microsoft Health site before buying one of these (unless they become remarkably less expensive).

That stated, I hope Microsoft continues to enhance the Microsoft Band by adding more capacity and battery life to make it a more usable and comfortable device. If everyone had accurate readings of our physical activities, maybe it would lead to healthier lifestyles.

Posted by Jeffrey Schwartz on 12/15/2014 at 7:28 AM


Ford Officially Dumps Microsoft for BlackBerry

Once hailed as the future of in-vehicle communications and entertainment, a partnership between Ford and Microsoft has all but unraveled. Ford this week said it's replacing Microsoft Sync with BlackBerry's QNX software.

Ford launched its Sync 3 platform, which ushers in significant new features and will show up in 2016 vehicles sometime next year, the company announced yesterday. Though Ford didn't officially announce it was walking away from Microsoft Sync in favor of BlackBerry QNX, The Seattle Times reported in February that the automaker was on the verge of making the switch. Raj Nair, Ford's CTO of global product development, said in numerous reports yesterday that QNX is now the new platform. Some 7 million Ford vehicles are reportedly equipped with Microsoft Sync but the systems have continuously scored poorly in consumer satisfaction reports due to frequent malfunctions.

Swapping out Microsoft Sync for QNX would also result in cost savings, according to The Seattle Times, which noted QNX is also used in the in-vehicle navigation systems of Audis and BMWs. Apple and Google also have alliances with various car manufacturers. While BlackBerry smartphones may be rapidly disappearing, QNX has gained significant ground in the in-vehicle systems market. Microsoft Sync, based on Windows Embedded, is said to also run the vehicle entertainment systems of some BMW, Kia, Fiat and Nissan models. Ford and Microsoft announced with great fanfare in 2007 their plans to roll out models with the entertainment system as an option.

Microsoft Sync was initially designed to link iPods and Zune music players to entertainment systems, debuting just at the dawn of the smartphone age. At the time, Microsoft founder Bill Gates saw Microsoft Sync as another element of the company's "Windows Everywhere" effort. As we all know, much has changed since then.

If Microsoft has new plans for Sync, the next logical time to announce them would be at next month's annual Detroit Auto Show.

Posted by Jeffrey Schwartz on 12/12/2014 at 11:28 AM


SolarWinds Tackles IP Address Management Conflicts

While IP address conflicts are as old as networks themselves, the growing number of employee-owned devices in the workplace is making them a more frequent problem for system administrators. Because PCs and devices have become transient, connecting to any number of networks, it's not uncommon for a device to still think it's linked to one network, causing an IP address conflict when it tries to connect to another.

SolarWinds is addressing that with its new IP Control Bundle, which identifies and resolves IP address conflicts. The bundle consists of the new SolarWinds IP Address Manager (IPAM) and the SolarWinds User Device Tracker (UDT). There are two parts to the IP address resolution process.

First, IPAM identifies the IP conflicts by subnet, provides a history of where the user and machine connected to the network, and identifies the switch and port to which that system is connected. Next, UDT uses that information to disable the switch port, after which a new IP address can be assigned and any DNS entries updated as necessary for the device to work when it reconnects.

IPAM and UDT are typically installed on a separate server, and when a problem arises an administrator can use the software to scan the network and IP address ranges. The software also interrogates routers, switches and other network infrastructure to gather relevant troubleshooting information. Rather than using agents, the tools rely on standard protocols, notably SNMP.
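The IP Control Bundle itself is closed source, but the agentless, SNMP-driven approach can be illustrated with a short Python sketch using the pysnmp library (the 4.x synchronous API). The router address, community string and conflict heuristic below are assumptions made for illustration, not SolarWinds' implementation: the script walks a router's ARP table (ipNetToMediaPhysAddress) twice and flags any IP address whose MAC address changes between polls, one simple signal of a possible conflict.

from pysnmp.hlapi import (
    SnmpEngine, CommunityData, UdpTransportTarget, ContextData,
    ObjectType, ObjectIdentity, nextCmd,
)

ROUTER = "192.0.2.1"               # hypothetical router or layer-3 switch
COMMUNITY = "public"               # read-only SNMP community string
ARP_OID = "1.3.6.1.2.1.4.22.1.2"   # ipNetToMediaPhysAddress: IP -> MAC

def read_arp_table(host, community):
    """Walk the device's ARP table via SNMP and return an {ip: mac} map."""
    table = {}
    for err_ind, err_stat, _idx, var_binds in nextCmd(
            SnmpEngine(), CommunityData(community),
            UdpTransportTarget((host, 161)), ContextData(),
            ObjectType(ObjectIdentity(ARP_OID)), lexicographicMode=False):
        if err_ind or err_stat:
            break
        for oid, value in var_binds:
            # The last four components of the OID encode the IP address.
            ip = ".".join(oid.prettyPrint().split(".")[-4:])
            table[ip] = value.prettyPrint()
    return table

# Poll twice; an IP whose MAC changed between polls is a likely conflict.
first = read_arp_table(ROUTER, COMMUNITY)
second = read_arp_table(ROUTER, COMMUNITY)
for ip, mac in second.items():
    if ip in first and first[ip] != mac:
        print("Possible IP address conflict on", ip, ":", first[ip], "->", mac)

A commercial tool like UDT goes further, correlating this kind of data with switch forwarding tables to pinpoint, and optionally shut down, the offending port.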

In addition to troubleshooting and remediating client-based devices, the SolarWinds package can handle IP address conflicts occurring on servers and virtual machines, says Chris LaPoint, vice president of product management at SolarWinds.

"If I'm the owner of that critical app trying to figure out what's going on, I can go to this tool and see that Joe over in another part of the datacenter has spun up a new VM and that's what's creating issues with my application," LaPoint explains. "So now I can probably notify Joe and tell him I'm kicking him off the network because it's actually affecting the availability of a customer-facing application that we need to have running."

Pricing for IPAM starts at $1,995 and UDT begins at $1,795.

Separately, SolarWinds this week said its SolarWinds Web Help Desk now works with DameWare Remote Support. SolarWinds acquired DameWare in 2011 but it operates as a separate business unit. The products are collectively used by 25,000 customers and the combined solution will allow help desk technicians to connect with remote devices or servers, collect support data including chat transcripts and screen shots and generate reports.

The SolarWinds Web Help Desk offering provides automated ticketing, SLA alerts, asset management and reporting while DameWare Remote Support provides remote access to client devices and servers, allowing administrators to take control of those systems and manage multiple Active Directory Domains as well as resetting passwords.

 

Posted by Jeffrey Schwartz on 12/12/2014 at 11:27 AM


Windows XP Systems Fading but Still On the Prowl

During the Thanksgiving break, I had a number of encounters with PCs in public places still sporting the Windows XP logo, and it got under my skin. Among them was a computer near the checkout area at Home Depot. And within an hour I spotted another on a counter right next to the teller stations at my local Bank of America branch.

Given that we know Windows XP systems are no longer patched by Microsoft, the sight of them is becoming as uncomfortable as being near someone who has a nasty cold and coughs without covering his or her mouth. Speaking of spreading viruses, I've even been to two different doctors' offices in recent months that were running Windows XP-based PCs -- one of them is used to actually gather patient information and the other to schedule appointments. In both cases, when I asked if they planned to upgrade those systems, I got the equivalent of a blank stare. I don't think they had any idea what I was talking about.

Nevertheless, seeing a Windows XP PC just after I used the self-checkout terminal at Home Depot was especially troubling given the retailer's massive breach last month in which e-mail addresses were stolen. Home Depot Spokeswoman Meghan Basinger said: "Thanks for reaching out, but this isn't detail we'd discuss."

Now the Bank of America situation is a bit different. The day after the Thanksgiving holiday weekend, InformationWeek announced its IT chief of the year: Cathy Bessant, head of Bank of America's 100,000-person Global Technology & Operations organization. That's a lot of IT pros and developers.

Bank of America appeared to have a strong IT organization if only because the company is often first to market with new e-banking features and mobile apps. The bank's systems tend to be reliable and it hasn't had any major breaches that I can recall. Also, having worked in the past for InformationWeek Editor-in-Chief Rob Preston, who interviewed Bessant and reported on the bank's ambitious IT efforts, I have no doubt the choice was a well-vetted one.

So when he noted among the bank's many milestones this year that its IT team completed the largest Windows 7 migration to date (300,000 PCs), I felt compelled to check in with Bank of America Spokesman Mark Pipitone. Perhaps my inquiry sounded petty after the bank had updated so many systems, but I was curious how it was dealing with these stray Windows XP systems. Was it paying $200 per system for premium support, or was the PC just front-ending an embedded system? (Microsoft does still support Windows XP Embedded.) So I sent a picture of the system to Pipitone.

Figure 1. The Bank of America PC used to access the safe deposit boxes.

"Not knowing exactly what device you took a picture of, the best the team can tell is that it's an excepted device (there are some across our footprint), or it's a device that's powered on but not being used on a regular basis," Pipitone responded.

I made a trip to the branch and asked what the XP machine was used for. A rep there told me that it was used by customers needing to access their safe deposit boxes. I informed Pipitone of that, though he declined to comment further. Maybe the lone PC I saw isn't connected to the Internet or is otherwise protected. But for many tech-aware people, the mere public display of Windows XP machines in so many different places is still disconcerting.

I laud Bank of America and others who have undertaken the painful move of modernizing their PC environments. At the same time, I look forward to a day when I don't have to see that Windows XP logo when I walk into a place of business, whether it's a doctor's office, a local restaurant, a major retailer or a bank. Windows XP was a great operating system when it came out, and I know some defenders of the legacy OS will be outraged by my stance -- many of them are angered by Microsoft's decision to stop supporting it. But a Windows XP machine is likely unprotected unless it isn't, and never will be, connected to a network.

There is some encouraging news. Waiting in my inbox on December 1 right after the holiday weekend was a press release from StatCounter reporting that there are more Windows 8.1 PCs out there than those with Windows XP. According to the November report, 10.95 percent of systems are running Windows 8.1. Windows XP still accounts for 10.67 percent. This marks the first time that there are more Windows 8.1-based systems than Windows XP PCs, according to its analysis. Back in August, the combination of Windows 8 and Windows 8.1 systems achieved that milestone, so it could be argued the latest report is a minor feat.

Nevertheless, the stragglers will remain for some time, according to Sergio Galindo, general manager of GFI Software, a provider of Web monitoring and patch management software. "I'm aware of several companies that continue running large XP installations -- and even larger IT budgets -- that may have custom XP agreements," Galindo said. "Windows XP will continue to survive as long as it meets people's needs. To keep a network secure, IT admins and computer consultants can 'lock down' the accounts on the XP machines. I strongly advise that machines running XP be allowed only minimal capabilities and have no admin access. I also favor using more secure browsers such as Chrome versus Internet Explorer in these cases. Also, IT admins may want to shut off some of the more common attack vectors such as Adobe Flash. In the case of XP, less (software) is more (secure)."

By the way, just a friendly reminder: there are just over 200 days left before Microsoft will no longer support Windows Server 2003. You'll be hearing a lot about that from us, and Redmond magazine's Greg Shields primed the pump last month.

 

Posted by Jeffrey Schwartz on 12/10/2014 at 12:54 PM0 comments


Shareholders Approve Microsoft CEO Satya Nadella's $84 Million Payday

At Microsoft's annual shareholder meeting Wednesday in Bellevue, Wash., CEO Satya Nadella cashed in big. Shareholders approved his proposed $84 million pay package, a reward for a job well done. The package, which includes $59.2 million in stock options and $13.5 million in retention pay, according to Bloomberg, has come under attack as excessive by Institutional Shareholder Services, an investor advisory organization.

Indeed, Nadella ranks among the most highly paid CEOs. According to this year's Wall Street Journal/Hay Group CEO Compensation report, which ranks the 300 largest companies by revenue, the median pay package was $11.4 million, with Oracle CEO Larry Ellison taking the top spot in 2013 at $76.9 million.

By that measure, Nadella isn't breaking any records. Oracle's share price rose nearly 29 percent, while Microsoft's share price has jumped 32 percent since Nadella took over in early February. Nevertheless, investor advocates have scrutinized CEO compensation in the wake of the financial crisis.

While Microsoft's prospects look better than they have in a long time, the package for some may look excessive. Others would argue Nadella has plenty of incentive to continue Microsoft's turnaround, which is still in its early stages and certainly not yet a sure thing, given rapid changes in fortune that can take place in the IT industry.

Do you believe Nadella's compensation is excessive or is it fair?

Posted by Jeffrey Schwartz on 12/05/2014 at 12:09 PM0 comments


Cyber Monday Deals Include $99 Windows Tablet PCs and More

It was hard to ignore the hype over the Thanksgiving weekend's traditional Black Friday and Cyber Monday barrage of cut-rate deals, including this year's decision by quite a few retailers to open their doors earlier than ever. Many, including the Microsoft Store, opened as early as 6 p.m. on Thanksgiving Day, hoping to lure people away from their turkey dinners to get a jump on their holiday shopping.

Content with spending Thanksgiving Day with my family and not a big fan of crowds anyway, I decided to stop by my local Staples at a normal hour on Friday morning. To my surprise, there were just a handful of people in the store. When I asked an employee why the store was so empty on Black Friday, she said the crowds were all there Thanksgiving night.

When I asked her how many people bolted early from their turkey dinners, she said there was a line of about 100 people outside the store prior to its 6 p.m. opening Thursday evening. Apparently a good chunk of them were waiting for the $99 Asus EeeBook X205TA, which normally sells for at least double that price. Truth be told, that's why I popped in, though I had anticipated the allotment would be sold out. I had already looked at the 11.6-inch Windows 8.1 notebook, which can also function as a tablet with its removable keyboard. It's powered by an Intel Atom processor, 2GB of RAM and a 32GB SSD.

I asked her how many people in line were waiting for that device and she replied that more than half were. While many Windows 8.1 notebooks and tablets were on sale during the holiday rush, the two prominent $99 deals were the aforementioned Asus device and the HP Stream 7. The latter is a 7-inch Windows 8.1 tablet and it comes with a one-year Office 365 Personal subscription good for the tablet and one other PC. The discounted HP Stream 7 is only available today at the Microsoft Store, which is also offering up to $150 off on the most expensive Surface Pros with Intel Core i7 processors.

The HP Stream 7 is also powered by an Intel Atom processor and has a 32GB SSD, but only 1GB of RAM. While you shouldn't plan on doing much multitasking with this device, it's certainly a viable option if you want an ultra-portable tablet that can quickly access information and serve as an alternative to a Kindle Fire (the Kindle app is among the many apps now available in the Microsoft Store).

Given I already have a Dell Venue 8 Pro with similar specs and 2GB of RAM, the HP Stream 7 was of little interest to me, though it would make a good gift for someone at that price. Back at Staples, I asked the employee if there were any of the Asus convertibles left at the $99 price and to my surprise she said they were all out but I could order one with free delivery from the store's kiosk. It's slated to arrive today. Apparently you can still order one on this Cyber Monday on Staples' Web site (you can probably get a competitor to match the price).

Today the National Retail Federation released a report estimating that overall sales over the Thanksgiving weekend were down 11 percent, and there are a number of theories for why that's the case. The drop in sales does suggest that all of those retailers who felt compelled to open their doors on Thanksgiving Day may want to rethink that strategy for next year.

 

Posted by Jeffrey Schwartz on 12/01/2014 at 12:54 PM0 comments


Microsoft Delivers a Deep Dive on a Planned Office 365 Revamp

Microsoft this week gave developers and IT pros a deep dive on major new features coming to Office 365, which the company has described as the fastest growing new product in its history. The demos, which include several APIs and SDKs aimed at driving existing SharePoint users to Office 365, gave a close look at how building and administering applications for collaboration is going to change dramatically for IT pros, developers and end users alike.

Because Microsoft has made clear that organizations running applications developed in native code for SharePoint won't be able to migrate them to Office 365, the company is trying to convince customers to plan for the eventual move using its new app model. Microsoft is betting that by offering compelling new capabilities, part of what it describes as its "Office Everywhere" effort, organizations will find making the move worthwhile.

The APIs and new Office 365 features demonstrated include the new My Apps user interface, which the company also calls the App Launcher, due out for preview imminently after what the company described as a brief delay. My Apps gives users a customizable interface to applications they use, such as Word, Excel, PowerPoint, contacts, mail and files. They can also add other Microsoft services as well as, ultimately, those of third parties.

Jeremy Thake, a senior Microsoft product manager, demonstrated the new Office 365 platform and underlying API model Thursday at the Live! 360/SharePoint Live! conference in Orlando. Thake said the Microsoft Graph demo was the first given in the United States since the company unveiled it two weeks ago at TechEd Europe, where Microsoft also released the preview of the new Office 365 APIs.

"The Microsoft Graph is essentially allowing me to authenticate once and then go to every single endpoint across Microsoft. And not just Office but Dynamics to Azure and anything I've got running Windows, such as Live, Outook.com and whatnot," Thake said during the demo, noting the plan is to tie it to third-party services that have registered to Microsoft Graph. "It's an access to those things from that one endpoint. This is a really powerful thing that isn't out yet. It's in preview; it will be coming next year."

Consultant Andrew Connell, organizer of the SharePoint Live! track at Live! 360, said the release of the APIs and the Microsoft Graph bode well for the future of Office 365 and SharePoint. "It opens so much more of the company data, not just so much more of our data that we're using in Microsoft services from a uniform endpoint for other companies to interact with and provide additional value on it," he said during the closing conference wrap-up panel. "That's going to be tremendous. That [Microsoft Graph] API is being pushed by the 365 group but it's a Microsoft thing -- it touches everything we do."

Also demonstrated for the first time was the new Azure Active Directory library for JavaScript, announced at TechEd Europe and now in preview. This eliminates the need to call the Office 365 APIs from ASP.NET MVC using managed code, which until now has been the only supported approach. "What this allows me to do is grab a token, totally client-side, and then run with that request," he said. "It's grabbing the client ID and it's asking for the endpoints, and then all I'm really doing is … the simple HTTP request on the front. I'm posting the token in the header here, which is the token I got back from Azure Active Directory, and I'm just using the URL that I've built to call the API, using that header. And this is totally just using JavaScript to the Azure Active Directory library."

This is beneficial if you are already doing Angular JavaScript development with single-page applications that have no server-side back end, he explained. "Once we get these new AAD libraries for JavaScript, I won't need to go to that Web API to get to the Office 365 APIs. I can just call it directly."
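
The pattern he walked through -- obtain a token from Azure Active Directory on the client, then send it as a bearer token on each API request -- looks roughly like the sketch below. This is illustrative only: the demo was done in JavaScript with the AAD library, whereas this sketch uses Python's requests library, and the token value and endpoint URL are hypothetical placeholders.

```python
# Illustrative sketch of the token-in-the-header pattern described above.
# The token string and endpoint URL are hypothetical placeholders.
import requests

access_token = "<token returned by Azure Active Directory>"           # placeholder
files_endpoint = "https://example.sharepoint.com/_api/v1.0/me/files"  # placeholder

response = requests.get(
    files_endpoint,
    headers={
        "Authorization": f"Bearer {access_token}",   # the token rides in the header
        "Accept": "application/json",
    },
)
response.raise_for_status()
print(response.json())
```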

Thake demonstrated numerous other APIs, including a discovery service and the new Android and iOS SDKs, among other things. Major changes are coming to Office 365 in 2015, and they will have a huge impact on the IT pros and developers who build for and manage it. It was a big topic at SharePoint Live! and I'll be sharing the implications of what's in the pipeline in the coming weeks.

Posted by Jeffrey Schwartz on 11/21/2014 at 9:02 AM0 comments


Microsoft's Expanding Data Management Portfolio Gives DBAs and Devs Plenty To Digest

While Microsoft this year has rolled out extensive additions to its data management portfolio as well as business intelligence and analytics tools, SQL Server is still its core database platform. Nevertheless, Microsoft has unleashed quite a few new offerings that DBAs, developers and IT decision makers need to get their arms around.

 "I think Microsoft needs to have the full stack to compete in the big data world," said Andrew Brust, who is research director at Gigaom Research. Brust Tuesday gave the keynote address at SQL Server Live!, part of the Live! 360 conference taking place in Orlando, Fla., which like Redmond, is produced by 1105 Media. Microsoft CEO Satya Nadella has talked of the data culture that's emerging, as noted in the Redmond magazine October cover story.

Brust pointed out that Microsoft has delivered some significant new tools over the past year including its Azure HDInsight, its Apache Hadoop-based cloud service for processing unstructured and semi-structured Big Data. Microsoft recently marked the one-year anniversary of Azure HDInsight with the preview of a new feature, Azure Machine Learning, which adds predictive analysis to the platform.

"Since the summer, they've added half a dozen new data products, mostly in the cloud but they're significant nonetheless," Brust said in an interview, pointing to the variety of offerings ranging from Stream Analytics, the company's real-time events processing engine  to Azure Data Factory, which lets customers provision, orchestrate and process on-premises data such as SQL Server with cloud sources including Azure SQL database, Blobs and tables. It also offers ETL as a service. Brust also pointed to the new Microsoft DocumemtDB, the company's new NoSQL entry, which natively supports JSON-compatible documents.

Microsoft's release of SQL Server 2014, which adds in-memory processing to its flagship database, takes aim at SAP's HANA. "Microsoft is going after it from the point of view that you can have in-memory and just stay in SQL Server instead of having to move to a specialized database," Brust said. "It's a version one, so I don't expect adoption to be huge but it will be better in the next version. They are definitely still working on it. It's not just a one-off that they threw out there -- it's very strategic for them."

Posted by Jeffrey Schwartz on 11/19/2014 at 1:38 PM0 comments


Microsoft Releases CLI To Manage Docker Containers from Windows Clients

Microsoft today said it has merged Windows code into Docker, allowing administrators to manage Docker containers running on Linux hosts from a Windows client. It's the latest move by Microsoft to jump on the Docker bandwagon, which began earlier this year with its support for Linux containers in the Azure public cloud and continued with last month's pact by the two companies to develop native Docker clients for Windows Server.

The company published the code in the form of a command-line interface (CLI), along with reference documentation that illustrates how to compile the Docker client on Windows. "Up 'til today you could only use Linux-based client CLI to manage your Docker container deployments or use boot2docker to set up a virtualized development environment in a Windows client machine," wrote Khalid Mouss, a senior program manager for the Azure runtime, in a blog post.

"Today, with a Windows CLI you can manage your Docker hosts wherever they are directly from your Windows clients," Mouss added. The Docker client is in the official Docker GitHub repository. Those interested can follow its development under Pull Request#9113.

While noteworthy, this is not the arrival of Docker containers on Windows -- it's just a move to enable management of Linux-based containers from a Windows client, said Andrew Brust, research director at Gigaom Research, who, when I saw the news on my phone, happened to be sitting next to me at our Live! 360 conference, taking place this week in Orlando, Fla. "This is simply a client that lets you manage Linux-based Docker containers," Brust said. "It's interesting but it's not a huge deal."

Furthermore, Mouss said that on the heels of Microsoft open sourcing the .NET Framework last week, the company this week also released a Docker image for ASP.NET, enabling developers to create ASP.NET-ready containers from the base image. The image is available from Docker Hub.

See this month's Redmond magazine cover story on Microsoft's move toward containers as potentially the next wave of infrastructure and application virtualization.

Posted by Jeffrey Schwartz on 11/18/2014 at 11:52 AM0 comments


Amazon Seeks To Offer SQL Alternative with Aurora Cloud Database

When Amazon Web Services announced Aurora as its latest database offering last week, the company put the IT industry on notice that it once again believes it can disrupt a key component of application infrastructures.

Amazon debuted Aurora at its annual AWS re:Invent customer and partner conference in Las Vegas. Amazon said the traditional SQL database for transaction-oriented applications, built to run on monolithic software and hardware, has reached its outer limits. Amazon Web Services' Andy Jassy said in the opening keynote address that the company has spent several years developing Aurora in secrecy.

Built on the premise that AWS' self-managed flagship services EC2, S3 and its Virtual Private Cloud (VPC) are designed for scale-out, service-oriented and multi-tenant architectures, Aurora removes half of the database out of the application tier, said Anurag Gupta, general manager of Amazon Aurora, during the keynote.

"There's a brand new log structured storage system that's scale out, multi-tenant and optimized for database workloads, and it's integrated with a bunch of AWS services like S3," said Gupta, explaining Aurora is MySQL-compatible. Moreover, he added, those with MySQL-based apps can migrate them to Aurora with just several mouse clicks and ultimately see a fivefold performance gain.

"With Aurora you can run 6 million inserts per minute, or 30 million selects," Gutpa said. "That's a lot faster than stock MySQL running on the largest instances from AWS, whether you're doing network IOs, local IOs or no IOs at all. But Aurora is also super durable. We replicate your data six ways across three availability zones and your data is automatically, incrementally, continuously backed up to S3, which as you know is designed for eleven nines durability."

Clearly Amazon is trying to grab workloads that organizations have built for MySQL, but the company is also apparently targeting those that run on other SQL engines it now hosts via its Relational Database Service (RDS) portfolio, including Oracle, MySQL and Microsoft SQL Server.
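
What MySQL compatibility means in practice is that an existing application keeps its driver and its SQL and simply repoints the connection at the Aurora cluster endpoint. Here's a minimal sketch using the PyMySQL driver; the endpoint, credentials and table are hypothetical, and nothing in the code is Aurora-specific:

```python
# Hypothetical endpoint, credentials and schema -- the point is that the driver
# and the SQL are plain MySQL; only the host name changes when moving to Aurora.
import pymysql

conn = pymysql.connect(
    host="myapp.cluster-abc123.us-east-1.rds.amazonaws.com",  # Aurora cluster endpoint
    user="appuser",
    password="example-password",
    database="orders",
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM orders WHERE status = %s", ("shipped",))
        print(cur.fetchone())
finally:
    conn.close()
```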

Aurora automatically repairs failures in the background, recovering from crashes within seconds, Gupta added. It replicates six copies of data across three Availability Zones and backs up data continuously to S3. Customers can scale an Aurora database instance up to 32 virtual CPUs and 244GB of memory. Aurora replicas can span up to three availability zones, with storage capacities starting at 10GB and growing as high as 64TB.

Gupta said the company is looking to price Aurora for wide adoption, with pricing starting at 29 cents per hour for a two-virtual-CPU, 15.25GB instance.

The preview is now available. Do you think Amazon Aurora will offer a viable alternative to SQL databases?

Posted by Jeffrey Schwartz on 11/17/2014 at 12:32 PM0 comments


Facebook Reportedly Developing Enterprise Social Networking Tool

Facebook is secretly developing a social network aimed at enterprise users, according to a report published in today's Financial Times. The report said Facebook at Work could threaten Microsoft's Yammer enterprise social network as well as LinkedIn and Google Drive.

At first glance, it's hard to understand how Facebook at Work would challenge both Yammer and LinkedIn. Though they're both social networks, they are used in different ways. Granted, there's some overlap, but Yammer is a social network for a closed group of users. Facebook employees have apparently used Facebook at Work over the past year for internal communications, and the company has let others test it as well.

The report was otherwise quite vague and I wonder if the author even understands the difference between Yammer, LinkedIn and Google Drive. It's not unreasonable to think Facebook would want to offer a business social network similar to Yammer or Salesforce.com's Chatter. But as PC Magazine points out, many businesses might have issues with a service provided by Facebook.

That said, I reached out to SharePoint and Yammer expert Christian Buckley, who recently formed GTConsult, to get his take. Buckley said there's been buzz about Facebook's ambitions for some time but he's skeptical that Facebook could make a serious dent in the enterprise social networking market despite its dominance on the consumer side.

"Honestly I think they're a couple years behind in making any serious move in this space," Buckley said. "They will undoubtedly attract users, and have a number of high-profile deployments, but there is a very real line of demarcation between consumer and business platforms, and I just don't see Facebook as being able to close that gap in any serious way."

Buckley also noted that Google, LinkedIn and Yammer have very different value propositions to enterprises. "Each have their own struggles," Buckley said. "LinkedIn may be displacing Yahoo Groups and other public chat forums, but my understanding is that they are having a difficult time translating that moderate growth into additional revenue beyond job postings. Yammer's difficulties may be a closer comparison and highlight Facebook's uphill battle to win over the enterprise by aligning ad hoc social collaboration capabilities with business processes. Microsoft has Yammer at the core of its inline social strategy, and like SharePoint, the individual Yammer brand will fade (in my view) as the core features are spread across the Office 365 platform. Instead of going to a defined Yammer location, the Yammer-like features will happen in association with your content, your e-mail, your CRM activities, and so forth."

What's your take on this latest rumor?

 

 

Posted by Jeffrey Schwartz on 11/17/2014 at 11:56 AM0 comments


Microsoft's Decision To Open .NET Framework: A Huge Show of Linux Love

When Microsoft CEO Satya Nadella last month said, "Microsoft loves Linux" and pointed to the fact that 20 percent of its Azure cloud is already running the popular open source platform, he apparently was getting ready to put his money where his mouth is.

At its Connect developer conference this week, Microsoft said it will open source its entire .NET Framework core and bring it to both Linux and the Apple Macintosh platform. It is the latest move by Microsoft to open up its proprietary .NET platform. Earlier this year, the company made ASP.NET and the C# compiler open source. This week the company released the .NET Core development stack and in the coming months, Microsoft will make the rest of .NET Core Runtime and .NET Core Framework open source.

Citing more than 1.8 billion .NET installations and over 7 million downloads of Visual Studio 2013 during the past year, Microsoft Developer Division Corporate Vice President S. Somasegar said in a blog post, "we are taking the next big step for the Microsoft developer platform, opening up access to .NET and Visual Studio to an even broader set of developers by beginning the process of open sourcing the full .NET server core stack and introducing a new free and fully-featured edition of Visual Studio." These were all once unthinkable moves.

Just how big a deal is this? Consider the reaction of Linux Foundation Executive Director Jim Zemlin: "These are huge moves for the company," he said in a blog post. "Microsoft is redefining itself in response to a world driven by open source software and collaborative development and is demonstrating its commitment to the developer in a variety of ways that include today's .NET news."

Zemlin lauded a number of Microsoft's open source overtures including its participation in the OpenDaylight SDN project, the AllSeen Alliance Internet of Things initiative and the Core Infrastructure Initiative.

For IT pros, the move is Microsoft's latest affirmation of the company's embrace of open source and Linux in particular. At the same time, while some believe Microsoft is also doing so to deemphasize Windows, the company's plans to provide Docker containers in Windows Server suggests the company has a dual-pronged strategy for datacenter and applications infrastructure: bolster the Windows platform to bring core new capabilities to its collaboration offerings while ensuring it can tie to open source platforms and applications as well.

At the same time, it appears that Microsoft is seeking to ensure that its development environment and ecosystem remains relevant in the age of modern apps. Zemlin believes Microsoft has, in effect, seen the light. "We do not agree with everything Microsoft does and certainly many open source projects compete directly with Microsoft products," he said. "However, the new Microsoft we are seeing today is certainly a different organization when it comes to open source. Microsoft understands that today's computing markets have changed and companies cannot go it alone the way they once did."

Posted by Jeffrey Schwartz on 11/14/2014 at 11:11 AM0 comments


Microsoft Adds Rackspace as New Cloud OS Network Partner

When Microsoft last month announced it has 100-plus partners adopting its burgeoning Cloud OS Network, which aims to provide Azure-compatible third-party cloud services, it left out perhaps one of the biggest fish it has landed: Rackspace.

The two companies are longtime partners, and as I recently reported, Rackspace has extended its Hyper-V-compatible offerings and dedicated Exchange, SharePoint and Lync services. But Rackspace also has a formidable infrastructure-as-a-service cloud that competes with Azure. The news that Rackspace will now provide Azure-compatible cloud services, announced on Monday with Rackspace's third-quarter earnings report, signals a boost for both companies.

For Microsoft, it brings one of the world's largest public cloud and dedicated hosting providers into the Azure fold. Even if it's not all in, or the core of Rackspace's business -- that is still reserved for its own OpenStack-based infrastructure, a healthy VMware offering and the newly launched Google Apps practice -- Rackspace has a lot of Exchange and SharePoint hosting customers who may want to move to an Azure-like model but want it with the service level that the San Antonio, Texas-based company emphasizes.

"Those who are down in the managed 'colo' world, they don't want to be managing the infrastructure. They want us to do that," said Jeff DeVerter, general manager of Microsoft's Private Cloud business at Rackspace. "They're happy to let that go and get back into the business of running the applications that run that business."

Customers will be able to provision Azure private cloud instances in the Rackspace cloud and use the Windows Azure Pack to manage and view workloads. This is not a multitenant offering like Azure or similar infrastructure-as-a-service clouds, DeVerter pointed out. "These are truly private clouds from storage to compute to the networking layer and then the private cloud that gets deployed inside of their environment is dedicated to theirs. We deploy a private cloud into all of our datacenters [and] it puts the customers' cloud dual homing some of their management and reporting back to us so that we can manage hundreds and then thousands of our customers' clouds through one management cloud."

Microsoft first launched the Cloud OS Network nearly a year ago with just 25 partners. Now with more than 100, Marco Limena, Microsoft's vice president of Hosting Service Providers, claimed in a blog post late last month that there are in excess of 600 Cloud OS local datacenters in 100 companies serving 3.7 million customers. The company believes this network model will address the barriers among customers who have data sovereignty and other compliance requirements.

Among the members of the Cloud OS Network listed in an online directory are Bell Canada, CapGemini, Datapipe, Dimension Data and SherWeb. "Microsoft works closely with network members to enable best-practice solutions for hybrid cloud deployments including connections to the Microsoft Azure global cloud," Limena said.

Asked if it's in the works for Rackspace to enable Cloud OS private cloud customers to burst workloads to the Microsoft Azure service, DeVerter said: "Those are active conversations today that we're having internally and having with Microsoft. But right now our focus is around making that private cloud run the best it can at Rackspace."

 

Posted by Jeffrey Schwartz on 11/12/2014 at 1:01 PM0 comments


Virtual Images Gather in the New Microsoft Azure Marketplace

If the Microsoft Azure public cloud is going to be the centerpiece of its infrastructure offering, the company needs to bring third-party applications and tools along with it. That's where the newly opened Microsoft Azure Marketplace comes in. The company announced the Microsoft Azure Marketplace at a press and analyst briefing in San Francisco late last month led by CEO Satya Nadella and Scott Guthrie, executive VP of cloud and enterprise. As the name implies, it's a central marketplace in which providers can deliver their software to customers as virtual images that run in Azure.

A variety of providers have already ported these virtual images to the marketplace -- some are pure software vendors, while others are providers of vertical industry solutions -- and a number of notable offerings have started appearing. Many providers announced their offerings at last month's TechEd conference in Barcelona.

One that Microsoft gave special attention to at the launch of the Azure Marketplace was Cloudera, the popular supplier of an Apache Hadoop distribution. Cloudera has agreed to port its Cloudera Enterprise distribution, on which many Big Data apps are developed, to Microsoft Azure. That's noteworthy because Microsoft's own Azure HDInsight Hadoop-as-a-service offering is based on the Hortonworks Apache Hadoop distribution. While Cloudera Enterprise could cannibalize Azure HDInsight, those already committed to Cloudera would be far less likely to come to Azure if Cloudera weren't there.

"To date, most of our customers have built large infrastructures on premises to run those systems, but there's increasing interest in public cloud deployment and in hybrid cloud deployment, because infrastructure running in the datacenter needs to connect to infrastructure in the public cloud," said Cloudera Founder and Chief Strategy Officer Mike Olsen, speaking at the Microsoft cloud briefing in San Francisco. "This we believe is, for our customers, a major step forward in making the platform more consumable still."

Also up and running in the Azure Marketplace is Kemp Technologies, a popular provider of Windows Server load balancers and application delivery controllers. The Kemp Virtual LoadMaster for Azure lets customers create a virtual machine (VM) optimized to run natively in the Microsoft cloud, said Maurice McMullin, a Kemp product manager.

"Even though Azure itself does have a load balancer, it's a pretty rudimentary one," McMullin said. "Having the Kemp load balancer in there totally integrated into the Azure environment allows you to script some of those environments and application scenarios. The impact of that is, for an organization that's looking toward the cloud, one of the big challenges is trying to maintain the consistency by having a consistent load balancer from on premises, meaning you get a single management interface and consistent management of apps and policies on premises or in the cloud."

Lieberman Software has made available as a virtual image in the marketplace its Enterprise Random Password Manager (ERPM), which the company said provides enterprise-level access controls over privileged accounts throughout the IT stack, both on premises and now in Azure.

The company says ERPM removes persistent access to sensitive systems by automatically discovering, securing and auditing privileged accounts across all systems and apps within an enterprise. Authorized administrators can delegate to users quick access to specific business applications, as well as corporate social media sites, in a secure environment, and those activities are automatically recorded and audited. It also ensures that access to such identities is temporary and helps prevent unauthorized or anonymous access to sensitive data.

Another security tool is available from Waratek Ltd., a supplier of a Java Virtual Machine (JVM) container, which lets enterprises bring their own security to the cloud. Called Runtime Application Self-Protection (RASP), it monitors for key security issues and provides policy enforcement and attack blocking from the JVM.

In the JVM, the company offers a secure container where administrators can remotely control their own security at the application level, said Waratek CEO Brian Maccaba. "This is over and beyond anything the cloud provider can do for you and it's in your control," Maccaba said. "You're not handing it to Microsoft or Amazon -- you're regaining the reins, even though it's on the cloud."

The number of offerings in the Azure Marketplace is still relatively small -- it stands at close to 1,000 based on a search via the portal -- though it is growing.

Posted on 11/10/2014 at 12:44 PM0 comments


Is the Free Microsoft Office 365 a Licensing Nightmare?

Microsoft got some positive ink yesterday when it announced that Office users on iPhones and iPads can now edit their documents for free and that the same capability is coming to Android tablets. Indeed it is good news for anyone who uses one or more of those devices (which is almost everyone these days).

But before you get too excited, you should read the fine print. As Directions on Microsoft Analyst Wes Miller noted on his blog, "Office is free for you to use on your smartphone or tablet if, and only if you are not using it for commercial purposes [and] you are not performing advanced editing."

If you do fit into the above-mentioned buckets, or you want the unlimited storage and new Dropbox integration, you'll need either an Office 365 Personal, Home or commercial Office 365 subscription that comes with the Office 365 ProPlus desktop suite, Miller noted. As Computerworld's Gregg Keizer put it: "What Microsoft did Thursday was move the boundary between free and paid, shifting the line."

In Microsoft's blog post announcing the latest free offering, it does subtly note that this offer may not be entirely free. "Starting today, people can create and edit Office content on iPhones, iPads, and soon, Android tablets using Office apps without an Office 365 subscription," wrote Microsoft Corporate VP for Microsoft Office John Case, though that fine print was at the end of his post. "Of course Office 365 subscribers will continue to benefit from the full Office experience across devices with advanced editing and collaboration capabilities, unlimited OneDrive storage, Dropbox integration and a number of other benefits." Microsoft offers similar wording on the bottom of its press release issued yesterday.

Still, while this is great news for consumers, it's going to be problematic for IT organizations, Miller warned, especially those that have loose BYOD policies. "For commercial organizations, I'm concerned about how they can prevent this becoming a large license compliance issue when employees bring their own iPads in to work."

Are you concerned about this as well?

Posted by Jeffrey Schwartz on 11/07/2014 at 11:04 AM0 comments


Tech & Tools Watch, November 7: BlueStripe, Netwrix, Riverbed Technology

BlueStripe Embeds App Monitor into System Center, Windows Azure Pack
BlueStripe Software is now offering its Performance Center tool as a management pack for Microsoft System Center 2012 R2 Operations Manager. The company earlier this year released the dashboard component of FactFinder, which monitors distributed applications across numerous modern and legacy platforms.

With the addition of Performance Center, the company has embedded its core FactFinder tool into System Center. FactFinder can monitor everything from mainframe infrastructure, including CICS and SAP R/3 transactions, to applications running on Unix, Linux and Windows. BlueStripe said it provides visibility into application components running in physical, virtual and cloud environments and pinpoints the root causes of performance problems. It works with third-party public cloud services as well.

FactFinder integrates with Operations Manager workflows, providing data such as response times, failed connections, application loads and server conditions, the company said. It also maps all business transactions by measuring performance across each hop of a given chain and is designed to drill into the server stack to determine the cause of a slow or failing transaction.

In addition to the new System Center Management Pack, BlueStripe launched Performance Center for the Windows Azure Pack, which is designed to provide administrators common visibility of their Windows Server and Microsoft Azure environments. This lets administrators and application owners monitor the performance via the Windows Azure Pack.

BlueStripe Marketing Manager Dave Mountain attended last week's TechEd Conference in Barcelona and said he was surprised at the amount of uptake for the Windows Azure Pack. "There's a recognition of the need for IT to operate in a hybrid cloud world," Mountain said. "IT's reason for existing is to ensure the delivery of business services. Tools that allow them to focus on app performance will be valuable and that's what we are doing with FactFinder Performance Center for Windows Azure Pack."

Netwrix Tackles Insider Threats with Auditor Upgrade
Netwrix Corp. has upgraded its auditing software to offer improved visibility into insider threats and warn of data leaks more quickly. The new Netwrix Auditor 6.5 offers deeper monitoring of log files and privileged accounts, which in turn provides improved visibility into changes made across a network, including file servers and file shares.

The new release converts audit logs into more human-readable formats, according to the company. It also lets IT managers and systems analysts audit configurations from any point in time, while providing archives of historical data against which to match. Netwrix said this helps ensure compliance with security policies and thwarts rogue employees from making unauthorized changes.

In all, Netwrix said it has added more than 30 improvements to the new release of Auditor, resulting in higher scalability and performance.

Riverbed Extends Visibility and Control
Riverbed Technology this week launched the latest version of its SteelHead WAN optimization platform, including a new release of its SteelCentral AppResponse management tool to monitor hybrid environments, including Software-as-a-Service (SaaS) apps.

Core to the new SteelHead 9.0 is its tight integration with SteelCentral AppResponse, which Riverbed said simplifies the ability to troubleshoot applications using the tool's analytics engine, making it easier to manage such processes as policy configuration, patch management, reporting and troubleshooting. The SteelCentral dashboard lets administrators track the performance of applications, networks and quality of service, and reports on how policies are maintained.

SteelCentral AppResponse 9.5 also gives administrators metrics on end-user experiences of traditional and SaaS-based apps, even if they're not optimized by the SteelHead WAN platform. Riverbed said this information is aimed at letting IT groups respond to business requirements and to issues causing degraded performance. The new SteelHead 9.0 also is designed to ensure optimized performance of Office 365 mailboxes.

 

Posted by Jeffrey Schwartz on 11/07/2014 at 10:50 AM0 comments


IBM CISO Study Warns of Uptick in Security Threats

A majority of chief information security officers (CISOs) at some of the largest organizations strongly believe that the sophistication of attackers is outstripping their own ability to fend them off and that the number of threats has increased markedly. According to IBM's third annual CISO study, 59 percent are concerned about their inability to keep pace, with 40 percent saying it's their top security challenge.

Moreover, 83 percent said external threats have increased over the past three years with 42 percent of them saying the increases were dramatic. IBM revealed results of its study at a gathering of CISOs held at its New York offices.

The survey also found CISOs more frequently questioned by the C-suite and corporate boards, while changes to the global regulatory landscape promise to further complicate efforts to stem threats. Kristin Lovejoy, IBM's general manager of security services, said malware creation is a big business in unregulated countries, which are the origin of the vast majority of attacks.

"Where we say we're worried about external attackers and we're worried about financial crime data theft, there's a correlation between people getting Internet access in unregulated, unlegislated countries where it's an economic means of getting out," Lovejoy said. "When you interview the criminals, they don't even know they're performing a crime -- they're just building code. We have to be careful here, this external attacker thing, it's not going to get any better, it's going to get worse."

Most attackers are able to exploit the naivety of employees, she added, noting 80 to 90 percent of all security incidents were the result of human error. "They're getting in because users are pretty dumb," she said. "They click on stuff all the time. It's going to continue." She added that the organizations that are most secure are those with good IT hygiene, automation, configuration management and asset management, especially those that implement ITIL practices.

Posted by Jeffrey Schwartz on 11/05/2014 at 1:23 PM0 comments


Survey: Most Small and Medium Businesses Unprepared To Recover from IT Outages

A survey of small and medium enterprises found that only 8 percent are prepared to recover from an unplanned IT outage, while 23 percent of them report it would take more than a day to resume operations.

Underscoring the risk to companies with fewer than 1,000 employees, a vast majority of the 453 organizations surveyed have experienced a major IT outage in the past two years. Companies with 50 to 250 employees were especially at risk. A reported 83 percent have gone through a major IT failure, while 74 percent of organizations with 250 to 1,000 employees have experienced a significant outage.

One-third are using cloud-based disaster recovery as a service, which has rapidly started to gain momentum this year, according to the survey, conducted by Dimensional Research and sponsored by DRaaS provider Axcient. Daniel Kuperman, director of product marketing at Axcient, said the results confirmed what the company had suspected. "In a lot of cases, companies still don't put emphasis on disaster recovery," he said.

Axcient didn't reveal whose DRaaS offerings the organizations were using, though Kuperman said Dimensional Research chose which companies to poll from its own resources. DRaaS is one of the leading use cases for organizations making their foray into using cloud services.

A survey by cloud infrastructure provider EvolveIP last month found that nearly 50 percent benefitted from a recovery cloud service by avoiding outages from a disaster. Nearly three quarters, or 73 percent, cited the ability to recover from an outage as the prime benefit of using a cloud service. As a result, 42 percent of those responding to EvolveIP's survey have increased their cloud spending budgets this year, while 54 percent plan to do so in 2015.

Posted by Jeffrey Schwartz on 11/05/2014 at 12:01 PM0 comments


Microsoft Demos Disaster Recovery Feature Slated for Windows Server

In its latest bid to offer better failover and replication in its software and cloud infrastructure, Microsoft demonstrated its new Storage Replica technology at last week's TechEd conference in Barcelona.

Microsoft Principal Program Manager Jeff Woolsey demonstrated Storage Replica during the opening TechEd keynote. Storage Replica, which Microsoft sometimes calls Windows Volume Replication (or WVR), provides block-level, synchronous replication between servers or clusters for disaster recovery, according to a Microsoft white paper published last month. The new replication engine is storage-agnostic, and Microsoft says it can also stretch a failover cluster for high availability.

Most notable is that Storage Replica provides synchronous replication, which, as Microsoft describes it, enables organizations to mirror data within the datacenter with "crash-consistent volumes." The result, says Microsoft, is zero data loss at the file system level. By comparison, asynchronous replication, which Microsoft added to Windows Server 2012 via Hyper-V Replica and updated in last year's Windows Server 2012 R2 release, allows site extension beyond the limitations of a local metropolitan area. Asynchronous replication carries a higher possibility of data loss or delay and may not be suited to scenarios where instantaneous real-time availability is a requirement, though for general purposes it's considered adequate.
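
The practical difference between the two modes comes down to when a write is acknowledged. The following is a conceptual sketch only -- not Microsoft code and not how Storage Replica is implemented -- to show why synchronous replication can promise zero data loss while asynchronous replication leaves a window:

```python
# Conceptual sketch of synchronous vs. asynchronous replication (not Storage
# Replica code). Synchronous mode acknowledges a write only after both copies
# are committed; asynchronous mode acknowledges immediately and ships the
# block later, which is where the potential for data loss comes from.
class Volume:
    def __init__(self, name):
        self.name = name
        self.blocks = []

    def commit(self, block):
        self.blocks.append(block)

def write_synchronous(block, primary, replica):
    primary.commit(block)
    replica.commit(block)              # returns only after the partner has the block
    return "ack"                       # zero data loss at the volume level

def write_asynchronous(block, primary, replica, backlog):
    primary.commit(block)
    backlog.append((replica, block))   # shipped to the partner in the background
    return "ack"                       # acknowledged before the copy lands

ny, nj, backlog = Volume("NY"), Volume("NJ"), []
write_synchronous(b"block-1", ny, nj)
write_asynchronous(b"block-2", ny, nj, backlog)  # NJ lacks block-2 until the backlog drains
```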

In the TechEd demo, Woolsey simulated a scenario with four server nodes, two in New York and the other two across the river in New Jersey. The goal is to ensure that if users are unable to access data on the two nodes in New York, they automatically and transparently fail over to New Jersey without losing any data, Woolsey explained. The setup also uses a new feature in the Microsoft Azure service called Cloud Witness.

"To do a stretch cluster you need to have a vote for the cluster quorum," Woolsey explained. "In the past, this meant extra hardware, extra infrastructure, extra cost. Now we're just making this part of Azure as well. So that's an option to take advantage of the Cloud Witness. As you can see, we're baking hybrid capabilities right into Windows Server."

In the demo, Woolsey accessed the file share data to enable replication via the new storage replication wizard. From there he selected the source log disk, then the destination storage volume and log disk. "Literally in just a few clicks, that's it, I've gone ahead and I've set up synchronous replication," he said.

According to the recently published white paper, the following features are implemented in the Windows Server Technical Preview:


Type: Host-based
Synchronous: Yes
Asynchronous: Yes (server to server only)
Storage hardware agnostic: Yes
Replication unit: Volume (Partition)
Windows Server Stretch Cluster creation: Yes
Write order consistency across volumes: Yes
Transport: SMB3
Network: TCP/IP or RDMA
RDMA: iWARP, InfiniBand*
Replication network port firewall requirements: Single IANA port (TCP 445 or 5445)
Multipath/Multichannel: Yes (SMB3)
Kerberos support: Yes
Over-the-wire encryption and signing: Yes (SMB3)
Per-volume failovers allowed: Yes
Dedup & BitLocker volume support: Yes
Management UI in-box: Windows PowerShell, Failover Cluster Manager

Source: Microsoft

Microsoft also has emphasized that Storage Replica is not intended for backup and recovery scenarios. And because of the general-purpose nature of the product, the company noted it may not be suited to specific application behaviors. In addition, Microsoft is warning that with Storage Replica, organizations could see feature gaps in applications and hence could be better served by app-specific replication technologies.

What's your take on Microsoft's latest efforts to embed disaster recovery into Windows Server and Azure?

 

Posted by Jeffrey Schwartz on 11/03/2014 at 1:26 PM0 comments


Microsoft Azure Cloud in a Box Comes into View

Microsoft used its TechEd conference in Barcelona this week to give customers a first look at the new Azure cloud in a box. The so-called Cloud Platform System (CPS), announced at an event held last week in San Francisco led by CEO Satya Nadella and Executive VP for Cloud and Enterprise Scott Guthrie, is Microsoft's effort to let customers or hosting providers run their own Azure clouds.

The first CPS is available from Dell, though describing the company as the "first" to provide one implies that other major hardware providers may have plans for their own iterations -- or perhaps it's only at the wishful thinking stage. At any rate, CPS has been a long time coming.

As you may recall, Microsoft first announced plans to release such an offering more than four years ago. At the time, Dell, Hewlett Packard and Fujitsu were planning to offer what was then coined the Windows Azure Platform Appliance, and eBay had planned to run one. Though Microsoft took it on a roadshow that year, it suddenly disappeared.

Now it's back and Corporate VP Jason Zander showcased it in his TechEd Europe opening keynote, inviting attendees to check it out on the show floor. "This is an Azure-consistent cloud in a box," he said. "We think this is going to give you the ability to adopt the cloud with even greater control. You energize it, you hook it up to your network and you're basically good to go."

The CPS appears more modest than the original Windows Azure Platform Appliance in that it is sold as a converged rack-based system and doesn't come in prefabricated containers with air conditioning and cooling systems. The racks are configured with Dell PowerEdge servers, storage enclosures and network switches. Each rack includes 32 CPU nodes and up to 282TB of storage. On the software side, customers get Windows Server 2012 R2 with Hyper-V, configured in a virtualized multi-tenant architecture, System Center 2012 R2 and the Windows Azure Pack, which provides the Azure-consistent functionality within a customer's datacenter.

So far, the first two known customers of the CPS are NTTX, which will use it to provide its own Azure infrastructure as a service in Japan, and CapGemini, which will provide its own solutions for customers running in the Azure cloud.

CapGemini is using it for an offering called SkySight, which will run a variety of applications including SharePoint and Lync, as well as a secure, policy-driven orchestration service based on its own implementation of Azure. "SkySight is a hybrid solution where we will deliver a complete integrated application store and a developer studio all using Microsoft technologies," said CapGemini Corporate VP Peter Croes in a pre-recorded video presented by Zander during the keynote. "CPS for me is the integrated platform for public and private cloud. Actually it's the ideal platform to deliver the hybrid solution. That is what the customers are looking for."

Microsoft last week tried to differentiate itself from Amazon Web Services and Google in its hybrid approach. CPS could become an important component of Azure's overall success.

 

Posted by Jeffrey Schwartz on 10/31/2014 at 1:00 PM0 comments


Android Founder Andy Rubin Leaves Google

Andy Rubin is leaving Google to join a technology incubator dedicated to startups building hardware, according to published reports. Whether you are an Android fan or not, it's hard to argue that Google's acquisition of the company Rubin founded wasn't one of the most significant deals made by the search giant.

While Rubin continued to lead the Android team after Google acquired the company in 2005, Google reassigned him last year to lead its push into robotics, which included overseeing the acquisition of numerous startups.

Rubin's departure comes a week after Google CEO Larry Page promoted Sundar Pichai to head up all of Google's product lines except for YouTube. Page in a memo to employees published in The Wall Street Journal said he's looking to create a management structure that can make "faster, better decisions."

That effectively put Pichai in charge of the emerging robotics business as well. A spokesman told The New York Times that James Kuffner, who has worked with Google's self-driving cars, will lead the company's robotics efforts.

The move comes ironically on the same day that Google sold off the hardware portion of its Motorola Mobility business to Lenovo. The two companies yesterday closed on the deal, announced in January. Lenovo said it will continue to leverage the Motorola brand.

As for Rubin, now that he's incubating startups, he'll no doubt be on the sell-side of some interesting companies again.

Posted by Jeffrey Schwartz on 10/31/2014 at 12:58 PM0 comments


Microsoft Boosts Management Automation Features of Azure Hybrid Cloud Platform

Microsoft kicked off what looks to be its final TechEd conference with the launch of new services designed to simplify the deployment, security and management of apps running in its cloud infrastructure. In the opening keynote presentation at TechEd, taking place in Barcelona, officials emphasized new capabilities that enable automation and the ability to better monitor the performance of specific nodes.

A new feature called Azure Operational Insights will tie the cloud service and Azure HDInsight with Microsoft's System Center management platform. HDInsight, the Apache Hadoop-based Big Data analytics service, will monitor and analyze machine data from cloud environments to determine where IT pros need to reallocate capacity.

Azure Operational Insights, which will be available in preview mode next month (a limited preview is currently available), initially will address four key functions: log management, change tracking, capacity planning and update assessment. It uses the Microsoft Monitoring Agent, which incorporates an application performance monitor for .NET apps and the IntelliTrace Collector in Microsoft's Visual Studio development tooling, to collect complete application-profiling traces. Microsoft offers the Monitoring Agent as a standalone tool or as a plugin to System Center Operations Manager.

Dave Mountain, vice president of marketing at BlueStripe Software, was impressed with the amount of information it gathers and the way it's presented. "If you look at it, this is a tool for plugging together management data and displaying it clearly," Mountain said. "The interface is very slick, there's a lot of customization and it's tile-based."

On the heels of last week's announcement that it will support the more-robust G-series of virtual machines, which boast up to 32 CPU cores of compute based on Intel's newest Xeon processors, 450GB of RAM and 6.5TB of local SSD storage, Microsoft debuted Azure Batch, which officials say is designed to let customers use Azure for jobs that require "massive" scale out. The preview is available now.

Azure Batch is based on the job scheduling engine used by Microsoft internally to manage the encoding of Azure Media Services and for testing the Azure infrastructure itself, said Scott Guthrie, Microsoft's executive VP for cloud and enterprise, in a blog post today.

"This new platform service provides 'job scheduling as a service' with auto-scaling of compute resources, making it easy to run large-scale parallel and high performance computing (HPC) work in Azure," Guthrie said. "You submit jobs, we start the VMs, run your tasks, handle any failures, and then shut things down as work completes."

The new Azure Batch SDK is based on the application framework from GreenButton, a New Zealand-based company that Microsoft acquired in May, Guthrie noted. "The Azure Batch SDK makes it easy to cloud-enable parallel, cluster and HPC applications by describing jobs with the required resources, data and one or more compute tasks," he said. "With job scheduling as a service, Azure developers can focus on using batch computing in their applications and delivering services without needing to build and manage a work queue, scaling resources up and down efficiently, dispatching tasks, and handling failures."
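
To make the "job scheduling as a service" idea concrete, the sketch below shows, in Python, how a batch job might be described as a pool of compute resources plus a list of tasks and submitted over HTTP. It is only an illustration of the concept Guthrie describes: the endpoint, field names and key are hypothetical placeholders, not the actual Azure Batch API.

    import requests

    # Hypothetical endpoint and key -- placeholders, not the real Azure Batch service.
    BATCH_ENDPOINT = "https://example-batch-account.example.com/jobs"
    API_KEY = "replace-with-your-key"

    # Describe the job: the compute resources it needs and the tasks to run.
    job = {
        "id": "nightly-encode",
        "pool": {"vmSize": "large", "targetNodes": 16},
        "tasks": [
            {"id": "task-%d" % i, "commandLine": "encode.exe --chunk %d" % i}
            for i in range(100)
        ],
    }

    # Submit the description; the service provisions VMs, dispatches tasks,
    # retries failures and shuts everything down when the work completes.
    resp = requests.post(BATCH_ENDPOINT, json=job, headers={"Authorization": API_KEY})
    resp.raise_for_status()
    print("Job accepted:", job["id"])

The point of the model is that the developer only describes the work; provisioning, dispatching and cleanup are the service's problem.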

Microsoft also said it has made its Azure Automation service generally available. The tool is designed to automate repetitive cloud management tasks that are time consuming and prone to error, the company said. IT pros can use existing PowerShell workflows or deploy their own.

Also now generally available is WebJobs, the component of Microsoft Azure Websites designed to simplify the running of programs, services or background tasks on a Web site, according to a post today on the Microsoft Azure blog by Product Marketing Manager Vibhor Kapoor.

"WebJobs inherits all the goodness of Azure Websites -- deployment options, remote debugging capabilities, load balancing and auto-scaling," Kapoor noted. "Jobs can run in one instance, or in all of them. With WebJobs all the building blocks are there to build something amazing or, small background jobs to perform maintenance for a Web site."

Posted by Jeffrey Schwartz on 10/28/2014 at 10:59 AM0 comments


What To Expect from Microsoft at TechEd

Will Microsoft's final TechEd conference this week in Barcelona go out with a bang? We'll have a better sense of that over the next two days as the company reveals the next set of deliverables for the datacenter and the cloud. Microsoft has kept a tight lid on what's planned but we should be on the lookout for info pertaining to the next versions of Windows Server, System Center and Hyper-V, along with how Microsoft sees containers helping advance virtualization and cloud interoperability.

In case you missed it, this week's TechEd, the twice-yearly conference Microsoft has held for nearly two decades, will be the last. Instead, Microsoft said earlier this month it will hold a broader conference for IT pros and developers called Ignite in Chicago during the first week of May. Ignite will effectively envelop the TechEd, SharePoint and Exchange conferences.

Given the company's statements about faster release cycles, I'd be quite surprised if officials don't reveal what's planned for the next releases of Windows Server, System Center and the so-called "Cloud OS" tools that let Microsoft provide an Azure-like infrastructure in the datacenter, in partner cloud services and in its own public cloud.

If you caught the presentations made last week by CEO Satya Nadella and Scott Guthrie, EVP of Microsoft's cloud and enterprise group, it was clear that besides some noteworthy announcements, they were aimed at priming the pump for future ones. For example, Microsoft announced the Azure Marketplace, where ISV partners can offer virtual images designed to accelerate the use of Azure as a platform. Also revealed was the Azure G-series of virtual machines powered by the latest Intel Xeon processors, which Guthrie claimed will be the largest VMs available in the public cloud -- at least for now -- providing twice the memory of the largest Amazon cloud machine.

As Microsoft steps up its moves into containerization with the recent announcement that it's working with Docker to create Docker containers for Windows Server, it will be interesting to hear how that will play into the next release of the server operating system. It will also be interesting to learn to what extent Microsoft will emphasize capabilities in Windows Server and Azure that offer more automation as the company moves to build on the evolving software-defined datacenter.

The opening keynote is tomorrow, when we'll find out how much Microsoft intends to disclose about what's next for its core enterprise datacenter and cloud platforms. I'd be surprised and disappointed if it wasn't substantive.

Posted by Jeffrey Schwartz on 10/27/2014 at 3:17 PM0 comments


Pressure Remains on Microsoft Despite Strong Surface Growth

While almost every part of Microsoft's business faces huge pressure from disruptive technology and competitors, the software that put the company on the map -- Windows -- continues to show it's not going to go quietly into the night. Given Microsoft's surprise report that Surface sales have surged and the company's promise of new capabilities in the forthcoming release of Windows 10, predictions of the operating system's demise appear at least premature and potentially postponed indefinitely.

Despite the debacle of its first Surface rollout two years ago, this year's release of the Surface Pro 3 and its impressive performance show that Windows still has a chance to remain relevant despite the overwhelming popularity of iOS and Android among consumers and enterprises. Granted, we now live in a multiplatform world, which is a good thing that's not going to change. The only question still to play out is where Windows will fit in the coming years, and that will be determined by Microsoft making the right moves. Missteps by Apple and Google will play a role as well, of course, but the ball is in Microsoft's court to get Windows right.

Amid yesterday's impressive results for the first quarter of Microsoft's 2015 fiscal year were increases along key lines, including Office 365 and its enterprise software and cloud businesses, and the disclosure of $908 million in revenues for its Surface business. That's more than double what Surface devices brought in a year ago, and the report covers the first full quarter that the new Surface Pro 3 has been on the market. Presuming there wasn't significant channel stuffing, this is promising news for the future of Windows overall.

While showing hope, the latest report on Surface sales doesn't mean Windows is out of the woods. Despite the surge in revenues, Microsoft didn't reveal Surface unit sales. And while the company said its Surface business is now showing "positive gross margins" -- a notable milestone given the $900 million charge the company took five quarters ago due to poor device sales -- Microsoft didn't say how profitable the devices are, said Patrick Moorhead, principal analyst with Moor Insights & Strategy.

Marketing and the cost of implementing Microsoft's much-improved global channel and distribution reach likely neutralized or negated much of that gross margin, leaving the overall margin negative, Moorhead suggested. "I can say with 99 percent confidence they are losing money on Surface still," he predicted. "That may not be bad for two reasons. They need to demonstrate Windows 8 can provide a good experience and second of all it puts additional pressure on traditional OEMs that they need to be doing a better job than what they do."

Also worth noting, the $908 million in Surface revenues was about 17 percent of the $5.3 billion Apple took in for iPads during the same period (revenues for Macintoshes, which are in many ways more comparable to the Surface Pro 3, were $6.6 billion, Apple said). Apple's iPads, which often displace PCs for many tasks, are also hugely profitable, though ironically sales of the tablets have declined for the past three quarters amid the sudden surge in Surface sales. Naturally the devices also have different capabilities, but the point is to underscore the positive sign the growth of Surface portends for the future of Windows.

Moorhead said the current quarter, and notably holiday sales of all Windows devices led by an expected onslaught of dirt-cheap Windows tablets (possibly as low as $99), could be an inflection point, though he warned that Microsoft will need to execute. "If Microsoft continues to operate the way they are operating, they will continue to lose considerable consumer relevance," he said. "If during the holidays, they make news and sell big volumes, I would start to think otherwise."

Key to the quarter's turnaround was the company's expanded global distribution and extended sales to corporations through its channel partners, though that effort is still at a formative stage. Despite his skeptical warning, Moorhead believes Google's failure to displace Windows PCs with large Android devices and Chromebooks gives Microsoft a strong shot at keeping Windows relevant.

"Google had this huge opportunity to bring the pain on Microsoft with larger devices and eat into notebooks," he said. "They never did it. They really blew their opportunity when they had it. While Android may have cleaned up with phones, when you think about it what they did was just blocking Microsoft as opposed to going after Microsoft, which would be in larger form factor devices in the form of Android notebooks and Chromebooks. The apps are designed for 4-inch displays, not a 15-inch display or 17-inch display. And with Chrome, its offline capabilities just came in too slowly and there really aren't a lot of apps. They just added the capability to add real apps."

Meanwhile, Moorhead pointed out that Apple this month has delivered on what Microsoft is aiming to do: provide an experience that lets a user begin a task on say an iPhone and resume that task on an iPad or Mac.

Hence keeping Windows relevant, among other things, may rest on Microsoft's ability to deliver a Windows 10 that can do that and improve on the general lack of apps on the OS, which in the long run would give developers an incentive to come back. The promising part of that is the renewed focus on the desktop, Moorhead said. "When they converge the bits in a meaningful way, I think they can hit the long tail because of the way they're doing Windows 10 with Windows apps and the ability to leverage those 300 million units to a 7-inch tablet and a 4-inch phone. I think that is an enticing value proposition for developers."

Given the target audience of business users for the Surface Pro 3, it also is a promising signal for the prospects of Windows holding its own in the enterprise. Do you find the new surge in Surface sales, coupled with the design goals of Windows 10, to be encouraging signs for the future of Windows, or do you see it more as one last burst of energy?

Posted by Jeffrey Schwartz on 10/24/2014 at 12:48 PM0 comments


IBM and Microsoft Forge Cross-Cloud Partnership

Microsoft may be trying to compete with IBM in the emerging market for machine learning-based intelligence, but like many rivals, these two companies with a storied past together have their share of mutual interests, even as they tout competing public enterprise clouds. Hence the two are the latest to forge a cloud compatibility partnership.

The companies said today they are working together to ensure some of their respective database and middleware offerings can run on both the IBM Cloud and Microsoft Azure. Coming to Microsoft Azure are IBM's WebSphere Liberty application server platform, MQ middleware and DB2 database. IBM's Pure Application Service will also run on Microsoft Azure, the two companies said.

In exchange, Windows Server and SQL Server will work on the IBM Cloud. Both companies are collaborating to provide Microsoft's .NET runtime for IBM Bluemix, IBM's new cloud development platform. While the IBM Cloud already has support for Microsoft's Hyper-V, IBM said it will add expanded support for the virtualization platform that's included in Windows Server. It was not immediately clear how they will improve Hyper-V support on the IBM Cloud.

Andrew Brust, a research director at Gigaom Research, said that the IBM Cloud, which is based on the SoftLayer public cloud IBM acquired last year for $2 billion, runs a significant amount of Hyper-V instances. "They explained to me that they have a 'non-trivial' amount of Windows business and that they support Hyper-V VMs," Brust said.

"With that in mind, the announcement makes sense, especially when you consider [Microsoft CEO] Satya's [Nadella] comment on Monday that Azure will 'compose' with other clouds," Brust added. The comment made by Nadella took place Monday when he was articulating on Microsoft's strategy to build Azure into a "hyperscale" cloud. "We are not building our hyperscale cloud in Azure in isolation," Nadella said.  "We are building it to compose well with other clouds."

Nadella spelled out recent efforts to do that including last week's announcement that Microsoft is working with Docker to develop Docker containers for Windows Server, its support for native Java via its Oracle partnership (which, like IBM, includes its database and middleware offerings) as well as broad support for other languages including PHP, Python and Node.js. "This is just a subset of the open source as well as other middle-tier frameworks and languages that are supported on Azure," Nadella said at the event.

Most analysts agree that Amazon, Microsoft and Google operate the world's largest cloud infrastructures but with SoftLayer, IBM has a formidable public cloud as well. Both IBM and Microsoft are seeing considerable growth with their respective cloud offerings but have reasonably sized holes to fill as well.

Nadella said Monday that Microsoft has a $4.4 billion cloud business -- still a small fraction of its overall revenues but rapidly growing. For its part, IBM said on its earnings call Monday that its public cloud infrastructure is at a $3.1 billion annual run rate and its overall cloud business is up 50 percent, though the company's spectacular earnings miss has Wall Street wondering if IBM has failed to move quickly enough. The company's shares have tumbled in recent days and analysts are questioning whether the company needs a reboot similar to the one former CEO Lou Gerstner gave it two decades ago.

"Overall, this looks like a marriage of equals where both stand to gain by working harmoniously together," said PundIT Analyst Charles King. Forrester Research Analyst James Staten agreed. "IBM and Microsoft both need each other in this regard so a nice quid quo pro here," he said.

For Microsoft, adding IBM to the mix is just the latest in a spate of cloud partnerships. In addition to its partnership with Oracle last year, Microsoft recently announced a once-unthinkable cloud partnership with Salesforce.com and just tapped Dell to deliver its latest offering, the new Cloud Platform System, which the company describes as an "Azure-consistent cloud in a box" that it will begin offering to customers next month.

It also appears that IBM and Microsoft held back some of their crown jewels in this partnership. There was no mention of IBM's Watson or Big SQL, which is part of its InfoSphere BigInsights for Hadoop platform, based on the Hadoop Distributed File System (HDFS). During a briefing last week at Strata + Hadoop World in New York, IBM VP for Big Data Anjul Bhambhri described the recent third release of Big SQL in use at some big insurance companies. "Some of their queries which they were using on Hive were taking 45 minutes to run," she said. "In Big SQL those kinds of things [with] 17 rejoins [are] now less than 5 minutes."

Likewise, the announcement doesn't appear to cover Microsoft's Azure Machine Learning or Azure HDInsight offerings. I checked with both companies and while both are looking into it, there was no response as of this posting. It also wasn't immediately clear when the offerings announced would be available.

Update: A Microsoft spokeswoman responded to some questions posed on the rollout of the services on both companies' cloud. Regarding the availability of IBM's software on Azure: "In the coming weeks, Microsoft Open Technologies, a wholly owned subsidiary of Microsoft, will publish license-included virtual machine images with key IBM software pre-installed," she stated. "Customers can take advantage of these virtual machines to use the included IBM software in a 'pay-per-use' fashion. Effective immediately, IBM has updated its policies to allow customers to bring their own license to Microsoft Azure by installing supported IBM software on a virtual machine in Azure."

As it pertains to using Microsoft's software in the IBM Cloud, she noted: "Windows Server and SQL Server are available for use on IBM Cloud effective immediately. IBM will be offering a limited preview of .NET on IBM Cloud in the near future." And regarding plans to offer improved support for Hyper-V in the IBM Cloud: "Hyper-V is ready to run very well on IBM SoftLayer to provide virtualized infrastructure and apps. IBM is expanding its product support for Hyper-V."

Posted by Jeffrey Schwartz on 10/22/2014 at 2:22 PM0 comments


Will Apple Pay Be the Last Straw for Windows Phone?

The launch today of the new Apple Pay service for users of the newest iPhone and iPad -- and ultimately the Apple Watch -- is a stark reminder that Microsoft has remained largely quiet about its plans to pursue this market when it comes to Windows Phone or through any other channels.

If smartphone-based payments or the ability to pay for goods with other peripherals such as watches does take off in the coming year, it could be the latest reason consumers shun Windows Phone, which, despite a growing number of apps, still lags way behind the two market leaders.

So if payments become the new killer app for smartphones, is it too late for Microsoft to add it to Windows Phone? The bigger question may be whether it's too late for Microsoft as a company. Perhaps the simplest way to jump in would be to buy PayPal, the company eBay last month said it will spin off. The problem there is that eBay has an estimated market valuation of $65 billion -- too steep even for Microsoft.

If Microsoft still wants to get into e-payment -- which, in addition to boosting Windows Phone, could benefit Microsoft in other ways including its Xbox, Dynamics and Skype businesses, among others -- the company could buy an emerging e-payment company such as Square, which is said to be valued at a still-steep (but more comfortable) $6 billion.

Just as Microsoft's Bill Gates had visions of bringing Windows to smartphones nearly two decades ago, he also foresaw an e-payments market similar to the one now emerging. Bloomberg Television's Erik Schatzker reminded Gates, in an interview released Oct. 2, that he had described possible e-payment technology in his book, "The Road Ahead."

"Apple Pay is a great example of how a cellphone that identifies its user in a pretty strong way lets you make a transaction that should be very, very inexpensive," Gates said. "The fact that in any application I can buy something, that's fantastic. The fact I don't need a physical card any more -- I just do that transaction and you're going to be quite sure about who it is on the other end -- that is a real contribution. And all the platforms, whether it's Apple's or Google's or Microsoft, you'll see this payment capability get built in. That's built on industry standard protocols, NFC and these companies have all participated in getting those going. Apple will help make sure it gets critical mass for all the devices."

Given Gates' onetime desire for Microsoft to lead in offering digital wallet and payment technology, Schatzker asked him why the company hasn't entered this market already. "Microsoft has a lot of banks using their technology to do this type of thing," Gates said. "In the mobile devices, the idea that a payment capability and storing the card in a nice secret way, that's going to be there on all the different platforms. Microsoft had a really good vision in this." Gates then subtly reminded Schatzker that the point of their interview was to talk about the work of the Bill and Melinda Gates Foundation.

But before shifting back to that topic, Schatzker worked in another couple of questions, notably whether Microsoft should be a player the way Apple is looking to become (and Google already is) with its digital wallet. "Certainly Microsoft should do as well or better but of all the things that Microsoft needs to do in terms of making people more productive in their work, helping them communicate in new ways, it's a long list of opportunities," he said. "Microsoft has to innovate and taking Office and making it dramatically better would be really high on the list. That's the kind of thing I'm trying to help make sure they move fast on."

For those hoping Microsoft does have plans in this emerging segment, there's hope. Iain Kennedy last month left Amazon.com, where he managed the company's local commerce team, to take on the new role of senior director of product management for Microsoft's new commerce platform strategy, according to his LinkedIn profile. Before joining Amazon, Kennedy spent four years at American Express.

Together with Gates' remarks, it's safe to presume that Microsoft isn't ignoring the future of digital payments and e-commerce. One sign is that Microsoft is getting ready to launch a fitness-focused smartwatch within the next few weeks. While that doesn't address e-payments, it's certainly a reasonable way to get into the game. According to a Forbes report today, the watch will work with multiple operating systems.

It's unclear what role Windows Phone will play in bringing a payments service to market, but it's looking less like it will have a starring role. And Microsoft's positioning as a "productivity and platforms" company, along with Gates' shifting of the conversation to Office, may not portend that e-payments are high on its list. Still, if the company moves soon, it may not be too late.

Posted by Jeffrey Schwartz on 10/20/2014 at 1:49 PM0 comments


Microsoft Targets IBM Watson with Azure Machine Learning in Big Data Race

Nearly a year after launching its Hadoop-based Azure HDInsight cloud analytics service, Microsoft believes it's a better and broader solution for real-time analytics and predictive analysis than IBM's widely touted Watson. Big Blue this year has begun commercializing its Watson technology, made famous in 2011 when it came out of the research labs to appear and win on the television game show Jeopardy.

Both companies had a large presence at this year's Strata + Hadoop World Conference in New York, attended by 5,000 Big Data geeks. At the Microsoft booth, Eron Kelly, general manager for SQL Server product marketing, highlighted some key improvements to Microsoft's overall Big Data portfolio since last year's release of Azure HDInsight, including SQL Server 2014 with support for in-memory processing, Power BI and the launch in June of Azure Machine Learning. In addition to bolstering the offering, Microsoft showcased Azure ML's ability to perform real-time predictive analytics for the retail chain Pier One.

"I think it's very similar," in terms of the machine learning capabilities of Watson and Azure ML, Kelly said. "We look at our offering as a self-service on the Web solution where you grab a couple of predictive model clips and you're in production. With Watson, you call in the consultants. It's just a difference fundamentally [that] goes to market versus IBM. I think we have a good advantage of getting scale and broad reach."

Not surprisingly, Anjul Bhambhri, vice president of Big Data for IBM's software group disagreed. "There are certain applications which could be very complicated which require consulting to get it right," she said. "There's also a lot of innovation that IBM has brought to market around exploration, visualization and discovery of Big Data which doesn't require any consulting." In addition to Watson, IBM offers its InfoSphere BigInsights for Hadoop and Big SQL offerings.

As it broadens its approach with a new "data culture," Microsoft has come on strong with Azure ML, noting it shares many of the real-time predictive analytics capabilities of Cortana, the new personal assistant in Windows Phone. Now Microsoft is looking to further broaden the reach of Azure ML with the launch of a new app store-type marketplace where Microsoft and its partners will offer APIs consisting of predictive models that can plug into Azure Machine Learning.

Kicking off the new marketplace, Joseph Sirosh, Microsoft's corporate VP for information management and machine learning, gave a talk at the Strata + Hadoop conference this morning. "Now's the time for us to try to build the new data science economy," he said in his presentation. "Let's see how we might be able to build that. What do data science and machine learning people do typically? They build analytical models. But can you buy them?"

Sirosh said that in the new data section of the Azure Marketplace, developers and IT pros can search for predictive analytics components. It consists of APIs developed both by Microsoft and partners. Among the APIs from Microsoft are Frequently Bought Together, Anomaly Detection, Cluster Manager and Lexicon Sentiment Analysis. Third parties selling their APIs and models include Datafinder, MapMechanics and Versium Analytics.
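
To give a sense of how such marketplace models would be consumed, here is a minimal Python sketch of calling a hosted scoring API over HTTPS; the URL, key and payload fields are hypothetical placeholders rather than the actual Azure Marketplace or Azure ML endpoints.

    import requests

    # Hypothetical scoring endpoint and key for an anomaly detection-style API.
    SCORING_URL = "https://example-marketplace.example.com/anomaly-detection/score"
    API_KEY = "replace-with-your-key"

    # Send recent observations; the hosted model returns a score per data point.
    payload = {"series": [12.1, 12.3, 11.9, 45.7, 12.2]}

    resp = requests.post(
        SCORING_URL,
        json=payload,
        headers={"Authorization": "Bearer " + API_KEY},
    )
    resp.raise_for_status()
    for value, score in zip(payload["series"], resp.json().get("scores", [])):
        print(value, "->", score)

The appeal Sirosh describes is that the model itself stays with its publisher; consumers integrate nothing more than an API call.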

Microsoft's goal is to build up the marketplace for these data models. "As more of you data scientists publish APIs into that marketplace, that marketplace will become just like other online app stores -- an enormous selection of intelligent APIs. And we all know as data scientists that selection is important," Sirosh said. "Imagine a million APIs appearing in a marketplace and a virtual cycle like this that us data scientists can tap into."

Also enabling real-time predictive analytics is support for Apache Storm clusters, announced today. Though it's in preview, Kelly said Microsoft is adhering to its SLAs with the Apache Storm capability, which enables complex event processing and stream analytics, providing much faster responses to queries.

Microsoft also said it would support the forthcoming Hortonworks Data Platform, which has automatic backup to Azure Blob storage, Kelly said. "Any Hortonworks customer can back up all their data to an Azure Blob in a real low cost way of storing their data, and similarly once that data is in Azure, it makes it real easy for them to apply some of these machine learning models to it for analysis with Power BI [or other tools]."
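
HDP's automated backup goes through Hortonworks' own Azure connector, but to illustrate how simply data lands in Blob storage, here is a rough Python sketch of writing a single object through the Azure Blob REST interface using a pre-generated shared access signature; the account, container, file and token values are placeholders.

    import requests

    # Placeholder storage account, container and SAS token.
    ACCOUNT = "examplestorageacct"
    CONTAINER = "hdp-backup"
    SAS_TOKEN = "replace-with-a-sas-token"

    blob_name = "daily/part-00000.gz"
    url = "https://%s.blob.core.windows.net/%s/%s?%s" % (
        ACCOUNT, CONTAINER, blob_name, SAS_TOKEN)

    # Upload the file as a block blob; the service returns 201 Created on success.
    with open("part-00000.gz", "rb") as f:
        resp = requests.put(url, data=f, headers={"x-ms-blob-type": "BlockBlob"})
    resp.raise_for_status()
    print("Uploaded", blob_name)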

Hortonworks is also bringing HDP to Azure Virtual Machines as an Azure certified partner. This will bring Azure HDInsight to customers who want more control over it in an infrastructure-as-a-service model, Kelly said. Azure HDInsight is currently a platform as a service that is managed by Microsoft.

Posted by Jeffrey Schwartz on 10/17/2014 at 9:54 AM0 comments


Microsoft Says Women Earn Same as Men, Steps Up Diversity Plan

Apparently stung by his remarks last week that women shouldn't ask for raises but instead look for "karma," Microsoft CEO Satya Nadella said he is putting controls in place that will require all employees to attend diversity training workshops to ensure not just equal pay for women but opportunities for advancement regardless of gender or race.

Nadella had quickly apologized for his remarks at last week's Grace Hopper Celebration of Women in Computing in Phoenix, but apparently he's also putting his money where his mouth is. Microsoft's HR team reported to Nadella that women in the U.S. last year earned 99.7 percent of what men earned at the same title and rank, according to an e-mail sent to employees Wednesday that was obtained by GeekWire.

"In any given year, any particular group may be slightly above or slightly below 100 percent," he said. "But this obscures an important point: We must ensure not only that everyone receives equal pay for equal work, but that they have the opportunity to do equal work."

Given the attention Microsoft's CEO brought to the issue over the past week, it raises the question: Do women in your company earn the same amount as men for the same job title and responsibilities, and have the same opportunities for advancement, or is there a clear bias? IT is an industry dominated by men, though educators are trying to convince more women to take up computer science.

Posted by Jeffrey Schwartz on 10/17/2014 at 12:59 PM0 comments


Docker Containers Are Coming to Windows Server

In perhaps its greatest embrace of the open source Linux community to date, Microsoft is teaming up with Docker to develop Docker containers that will run on the next version of Windows Server, the two companies announced today. Currently Docker containers, which are designed to enable application portability using code developed as micro-services, can only run on Linux servers.

While Microsoft has stepped up its efforts to work with Linux and other open source software over the years, this latest surprise move marks a key initiative to help make containers portable across Windows Server and Linux server environments and cloud infrastructures. It also underscores a willingness to extend its ties with the open source community as a key contributor to make that happen.

In addition to making applications portable, proponents say containers could someday supersede the traditional virtual machine. Thanks to their lightweight composition, containers can provide the speed and scale needed for next-generation applications and infrastructure components, including applications that make use of Big Data and process complex computations.

Containers have long existed, particularly in the Linux community and from third parties such as Parallels, but each has had its own implementation. Docker has taken the open source and computing world by storm over the past year since the company, launched less than two years ago, released a container format that has become a de facto standard for how applications can extend from one platform to another running as micro-services.

Many companies have jumped on the Docker bandwagon in recent months including Amazon, Google, IBM, Red Hat and VMware, among others. Microsoft in May said it would enable Docker containers to run in its Azure infrastructure as a service cloud. The collaboration between Docker and Microsoft was a closely held secret.

Microsoft Azure CTO Mark Russinovich had talked about the company's work with Docker to support its containers in Azure in a panel at the Interop show in New York Sept. 30 and later in an interview. Russinovich alluded to Microsoft's own effort to develop Windows containers, called Drawbridge. Describing it as an internal effort, Russinovich revealed the container technology is already in use within the company and is now available for customers that run their own machine learning-based code in the Azure service.

"Obviously spinning up a VM for [machine learning] is not acceptable in terms of the experience," Russinovich said during the panel discussion. "We are figuring out how to make that kind of technology available publicly on Windows."

At the time, Russinovich was tight-lipped about Microsoft's work with Docker and the two companies' stealth effort. Russinovich emphasized Microsoft's support for Linux containers on Azure, and when pressed about Drawbridge he described it as a superior container technology, arguing its containers are more secure for deploying micro-services.

As we now know, Microsoft has been working quietly behind the scenes with Docker to enable the Docker Engine, originally architected to run only on Linux servers, to operate with the next version of Windows Server as well.

Microsoft is working to enable Docker Engine images for Windows Server that will be available in Docker Hub, an open source repository housing more than 45,000 Docker applications via shared developer communities. As a result, Docker images will be available for both Linux and Windows Server.

Furthermore, the Docker Hub will run in the Microsoft Azure public cloud, accessible via the Azure Management Portal and Azure Gallery. This will allow cloud developers, including its ISV partners, to access the images. Microsoft also said it will support Docker orchestration APIs, which will let developers and administrators manage applications across both Windows and Linux platforms using common tooling. This will provide portability across different infrastructure, such as from on-premises servers to the cloud. It bears noting the individual containers remain tied to the operating system they are derived from.
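
For a sense of what that portability looks like in practice, the sketch below uses the Docker SDK for Python to pull and run an image from Docker Hub; the image name is a hypothetical Windows Server-based example, since no such images existed at the time of this announcement, but the same calls work against the Linux images available today.

    import docker

    # Connect to the local Docker Engine (Linux today; Windows Server once the port ships).
    client = docker.from_env()

    # Hypothetical image name -- substitute any image published to Docker Hub.
    image_name = "example/windows-server-sample-app"

    client.images.pull(image_name, tag="latest")

    # Run the container, capture its output and remove it when it exits.
    output = client.containers.run(image_name, remove=True)
    print(output.decode("utf-8", errors="replace"))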

The Docker Engine for Windows Server will be part of the Docker open source project where Microsoft said it intends to be an active participant. The result is that developers will now be able to use preconfigured Docker containers in both Linux and Windows environments.

Microsoft is not saying when it will appear, noting that's in the hands of the open source community, according to Ross Gardler, senior technology evangelist for Microsoft Open Technologies. To what extent Microsoft will share the underlying Windows code is not clear. Nor would Gardler say to what extent, if any, the work from the Drawbridge effort will appear here, other than to say the company has gained deep knowledge from that project.

"This announcement is about a partnership of the bringing of Docker to Windows Server to insure we have interoperability between Docker containers," Gardler said. "The underlying implementation of that is not overly important. What is important is the fact that we'll have compatibility in the APIs between the Docker containers on Linux, and the Docker container on Windows."

David Messina, vice president of marketing at Docker, said the collaboration and integration between the two companies on the Docker Hub and the Azure Gallery will lead to the merging of the best application content from both communities.

"If I'm a developer and I'm trying to build a differentiated application, what I want to focus on is a core service that's going to be unique to my enterprise or my organization and I want to pull in other content that's already there to be components for the application," Messina said. "So you're going to get faster innovation and the ability to focus on core differentiating capabilities and then leveraging investments from everybody else."

In addition to enabling faster development cycles, it appears containers will shift focus away from the operating system over time. "It's less about dependencies on the operating system and more about being able to choose the technologies that are most appropriate and execute those on the platform," Microsoft's Gardler said.

Microsoft Azure Corporate VP Jason Zander described the company's reasoning and plan to support Docker in Windows Server and Azure in a blog post. Zander explained how the containers will work:

Windows Server containers provide applications an isolated, portable and resource controlled operating environment. This isolation enables containerized applications to run without risk of dependencies and environmental configuration affecting the application. By sharing the same kernel and other key system components, containers exhibit rapid startup times and reduced resource overhead. Rapid startup helps in development and testing scenarios and continuous integration environments, while the reduced resource overhead makes them ideal for service-oriented architectures.

The Windows Server container infrastructure allows for sharing, publishing and shipping of containers to anywhere the next wave of Windows Server is running. With this new technology millions of Windows developers familiar with technologies such as .NET, ASP.NET, PowerShell, and more will be able to leverage container technology. No longer will developers have to choose between the advantages of containers and using Windows Server technologies.

IDC Analyst Al Hilwa said in an e-mail that Microsoft has taken a significant step toward advancing container technology. "This is a big step for both Microsoft and the Docker technology," he said. "Some of the things I look forward to figuring out is how Docker will perform on Windows and how easy it will be to run or convert Linux Docker apps on Windows."

Posted by Jeffrey Schwartz on 10/15/2014 at 2:10 PM0 comments


Nadella May Have Done Women a Favor by Insulting Them

Satya Nadella's comments suggesting that women shouldn't ask for pay raises or promotions have prompted outrage on social media. But to his credit, he swiftly apologized, saying he didn't mean what he said.

To be sure, Nadella's answer to the question of what advice he has for women uncomfortable asking for a raise was insulting. Nadella said "karma" is the best route for women to a salary increase or career advancement, a comment the male CEO couldn't have made at a worse place: the Grace Hopper Celebration of Women in Computing in Phoenix, where he was interviewed onstage by Maria Klawe, president of Harvey Mudd College in Claremont, Calif. Even more unfortunate, Klawe is a Microsoft board member, one of the people Nadella reports to.

Here's exactly what Nadella said:

"It's not really just asking for the raise but knowing and having faith that the system will give you the right raises as you go along. And I think it might be one of the additional super powers that, quite frankly, women who don't ask for a raise have. Because that's good karma, it will come back, because somebody's going to know that 'that's the kind of person that I want to trust. That's the kind of person that I want to really give more responsibility to,' and in the long-term efficiency, things catch up."

Accentuating his poor choice of words was Klawe's immediate and firm challenge when she responded, "This is one of the very few things I disagree with you on," which was followed by rousing applause. But it also gave Klawe an opportunity to tell women how not to make the same mistakes she has made in the past.

Klawe explained that she was among those who could easily advocate for someone who works for her but not for herself. Klawe related how she got stiffed on getting fair pay when she took a job as dean of Princeton University's engineering school because she didn't advocate for herself. Instead of finding out how much she was worth, when the university asked Klawe how much she wanted to be paid, she told her boss, who was a woman, "Just pay me what you think is right." Princeton paid her $50,000 less than the going scale for that position, Klawe said.

Now she's learned her lesson and offered the following advice: "Do your homework. Make sure you know what a reasonable salary is if you're being offered a job. Do not be as stupid as I was. Second, roleplay. Sit down with somebody you really trust and practice asking for the salary you deserve."

Certainly, the lack of equal pay and advancement for women has been a problem as long as I can remember. On occasion, high-profile lawsuits, often in the financial services industry, will bring it to the forefront and politicians will address it in their campaign speeches. The IT industry, perhaps even more so than the financial services industry, is dominated by men.

Nadella's apology appeared heartfelt. "I answered that question completely wrong," he said in an e-mail to employees almost immediately after making the remarks. "Without a doubt I wholeheartedly support programs at Microsoft and in the industry that bring more women into technology and close the pay gap. I believe men and women should get equal pay for equal work. And when it comes to career advice on getting a raise when you think it's deserved, Maria's advice was the right advice. If you think you deserve a raise, you should just ask. I said I was looking forward to the Grace Hopper Conference to learn, and I certainly learned a valuable lesson. I look forward to speaking with you at our monthly Q&A next week and am happy to answer any question you have."

Critics may believe Nadella's apology was nothing more than damage control. It's indeed the first major gaffe committed by the new CEO, but I'd take him at his word. Nadella, who has two daughters of his own, has encouraged employees to ask if they feel they deserve a raise. If Nadella's ill-chosen comments do nothing else, they'll elevate the discussion within Microsoft and throughout the IT industry and business world at large.

Of course, actions speak louder than words, and that's where the challenge remains.

Posted by Jeffrey Schwartz on 10/10/2014 at 3:00 PM0 comments


New VMware Workstation 11 Supports Windows 10 Technical Preview

In its bid to replace the traditional Windows client environment with virtual desktops, VMware will release major new upgrades of its VMware Workstation and VMware Player desktop virtualization offerings in December. Both will offer support for the latest software and hardware architectures and cloud services.

The new VMware Workstation 11, the company's flagship desktop virtualization product launched 15 years ago, is widely used by IT administrators, developers and QA teams. VMware Workstation 11 will support the new Windows 10 Technical Preview for enterprise and commercial IT testers and developers who want to put Microsoft's latest PC operating system through its paces in a virtual desktop environment.

Built with nested virtualization, VMware Workstation can run other hypervisors inside a VM, including Microsoft's Hyper-V and VMware's own vSphere and ESXi. In addition to running the new Windows 10 Technical Preview, VMware Workstation 11 will add support for other operating systems including Windows Server 2012 R2, Ubuntu 14.10, RHEL 7, CentOS 7, Fedora 20, Debian 7.6 and more than 200 others, the company said.

Also new in VMware Workstation 11 is support for the most current 64-bit x86 processors including Intel's Haswell (released late last year). VMware claims that based on its own testing, using Haswell's new microprocessor architecture with VMware Workstation 11 will offer up to a 45 percent performance improvement for functions such as encryption and multimedia. It will let IT pros and developers build VMs with up to 16 vCPUs, 8TB virtual disks and up to 64GB of memory. It will also connect to vSphere and the vCloud Air public cloud.

For more mainstream users is the new VMware Player 7. Since it's targeted at everyday users rather than just IT pros and administrators, it has fewer of the bells and whistles, but it gains support for the current Windows 8.1 operating system, as well as offering continued support for Windows XP and Windows 7 in desktop virtual environments. "Our goal is to have zero-base support," said William Myrhang, senior product marketing manager at VMware.

VMware Player 7 adds support for the latest crop of PCs and tablets and will be able to run restricted VMs, which, as the name implies, are secure clients that are encrypted, password restricted and can shut off USB access. VMware said the restricted VMs, which can be built with VMware Workstation 11 or VMware Fusion 7 Pro, run in isolation between host and guest operating systems and can have time limits built in.

Posted by Jeffrey Schwartz on 10/08/2014 at 2:25 PM0 comments


Veeam Plots Growth with Free Windows Endpoint Backup Software and Cloud Extensions

Veeam today said it will offer a free Windows endpoint backup and recovery client. The move is a departure from the company's history of providing replication, backup and recovery software for virtual server environments. However, company officials said it is not a change in focus -- which will remain on protecting server virtual machines -- but rather a recognition that most organizations are not entirely virtual.

The new Veeam Endpoint Backup software will run on Windows 7 and later OSes (though not Windows RT) and back up to an internal or external disk (such as a USB drive), a flash drive or a network attached storage (NAS) share within the Veeam environment. The company will issue a beta in the coming weeks and the product is due to be officially released sometime next year. The surprise announcement came on the closing day of VeeamON, the company's inaugural customer and partner conference, held in Las Vegas.

Enterprise Strategy Group Analyst Jason Buffington, who follows the data protection market and has conducted research for Veeam, said offering endpoint client software was unexpected. "At first, I was a little surprised because it didn't seem congruent with that VM-centric approach to things," Buffington said. "But that's another great example of them adding a fringe utility. In this first release, while it's an endpoint solution, primarily, there's no reason you technically couldn't run it on a low-end Windows Server. I'm reasonably confident they are not going to go hog wild into the endpoint protection business. This is just their way to kind of test the code, test customers' willingness for it, as a way to vet that physical feature such that they have even a stronger stranglehold on that midsize org that's backing up everything except a few stragglers."

At a media briefing during the VeeamON conference, company officials emphasized that Veeam remains focused on its core business of protecting server VMs as it plots its growth toward supporting various cloud environments as backup and disaster recovery targets. Doug Hazelman, Veeam vice president of product strategy, indicated that the new client could be used to protect various physical servers as well. Hazelman said the company is largely looking to see how customers use the software, which can perform file-level recoveries. Furthermore, he noted that the endpoint software doesn't require any of the company's other software and vowed it would remain free as a standalone offering.

"We are not targeting this at an enterprise with 50,000 endpoints," Hazelman said. "We want to get it in the hands of the IT pros and typical Veeam customers and see how we can expand this product and see how we can grow it."

Indeed, the VeeamON event was largely held to launch a major new release of the company's flagship suite, to be called the Data Availability Suite v8. Many say Veeam has been the fastest growing provider in its market since its launch in 2006. In his opening keynote address in the partner track, CEO Ratmir Timashev said that Veeam is on pace to post $500 million in booked revenue (non-GAAP) and is aiming to double that to $1 billion by 2018.

In an interview following his keynote, Timashev said the company doesn't have near-term plans for an initial public offering (IPO) and insisted the company is not looking to be acquired. "We're not looking to sell the company," he said. "We believe we can grow. We have proven capabilities to find the next hot market and develop a brilliant product. And when you have this capability, you can continue growing, stay profitable and you don't need to sell."

Timashev added that Veeam can reach those fast-growth goals without deviating from its core mission of protecting virtual datacenters. Extending to a new network of cloud providers will be a key enabler, according to Timashev. The new Data Availability Suite v8, set for release next month (he didn't give an exact date), will incorporate a new interface called Cloud Connect that will let customers choose from a growing network of partners that are building cloud-based and hosted backup and disaster recovery services.

The new v8 suite offers a bevy of other features, including what the company calls "Explorers" that can now protect Microsoft's Active Directory and SQL Server and provide extended support for Exchange Server and SharePoint. Also added are an extension of the WAN acceleration introduced in the last release to cover replication, and a feature called Backup IO, which adds intelligent load balancing.

Posted by Jeffrey Schwartz on 10/08/2014 at 11:30 AM0 comments


HP Split Up Could Reshape IT Landscape: Could Microsoft Follow Suit?

Hewlett-Packard for decades has resisted calls by Wall Street to break itself into multiple companies, but today it heeded the call. The company said it would split itself into two separate publicly traded businesses next year. Both will leverage the existing storied brand: the PC and printing business will be called HP Inc. and the infrastructure and cloud business HP Enterprise.

Once the split is complete, Meg Whitman will lead the new HP Enterprise as CEO and serve only as chairman of HP Inc. The move comes less than a week after eBay said it would spin off its PayPal business unit into a separately traded company. Ironically, Whitman, HP's current CEO, was the longtime CEO of eBay during the peak of the dotcom bubble, and eBay too was recently under pressure from activist investors to spin off PayPal on the belief that both companies would fare better apart.

HP has long resisted calls by Wall Street to split itself into two or more companies. The pressure intensified following the early 2005 departure of CEO Carly Fiorina, whose disputed move to acquire Compaq remains controversial to this day. The company strongly considered selling off or divesting its PC and printing businesses under previous CEO Leo Apotheker. When he was abruptly dismissed after just 11 months and Whitman took over, she continued the review but ultimately decided a "One HP" approach would make it a stronger company.

In deciding not to divest back in 2011, Whitman argued remaining together gave it more scale and would put it in a stronger position to compete. For instance, she argued HP was the largest buyer of CPUs, memory, drives and other components.

Now she's arguing that the market has changed profoundly. "We think this is the best alternative," she told CNBC's David Faber in an interview this morning. "The market has changed dramatically in terms of speed. We are in a position to position these two companies for growth."

Even though the question of HP splitting up its business has come up repeatedly over the years, today's news was unexpected and comes after the company was rumored to be looking at an acquisition of storage giant EMC and, earlier, cloud provider Rackspace. It's not clear how serious any such talks were, if they took place at all.

HP's decision to become smaller rather than larger reflects the growing pressure on large companies to become more competitive against nimbler rivals. Many of the IT giants have faced similar pressure. Wall Street has been pushing EMC to split itself from VMware, IBM last week completed the sale of its industry-standard server business to Lenovo, and Dell, led by its founder and CEO Michael Dell, has become a private company.

Microsoft has also faced pressure to split itself up over the years, dating back to the U.S. government's antitrust case. Investors have continued to push Microsoft to consider splitting off or selling its gaming and potentially its devices businesses since former CEO Steve Ballmer announced he was stepping down last year. The company's now-controversial move to acquire Nokia's handset business for $7.2 billion and the selection of insider Satya Nadella as its new CEO have made that appear less likely. Nadella has said that Microsoft has no plans to divest any of its businesses. But HP's move shows how things can change.

Despite its own "One Microsoft" model, analysts will surely step up the pressure for Microsoft to consider its options. Yet Microsoft may have a better argument for keeping its businesses intact, with the possible exception of Nokia if it becomes a drag on the company.

But as Whitman pointed out, "before a few months ago, we weren't positioned to do this. Now the time is right." And that's why never-say-never is the operative term in the IT industry.

Posted by Jeffrey Schwartz on 10/06/2014 at 11:42 AM0 comments


Windows 10 Preview: Start Button Is Back, Charms In Limbo

Microsoft may have succeeded in throwing a curve ball at the world by not naming the next version of its operating system Windows 9. But as William Shakespeare famously wrote in Romeo and Juliet, "A rose by any other name would smell as sweet." In other words, if Windows 10 is a stinker, it won't matter what Microsoft calls it.

In his First Look at the preview, Brien Posey wondered if trying to come up to speed with Apple's OS X had anything to do with the choice of name -- a theory that quickly came to mind for many others wondering what Microsoft is up to (the company apparently didn't say why it settled on Windows 10). Perhaps Microsoft is trying to appeal to the many Windows XP loyalists?

The name notwithstanding, I too downloaded the Windows 10 Preview, a process that was relatively simple. Posey's review encapsulates Microsoft's progress in unifying the modern interface with the desktop and gives his thoughts on where Microsoft needs to move forward before bringing Windows 10 to market. One thing he didn't touch upon is the status of the Charms feature. Introduced in Windows 8, Charms were intended to provide shortcuts for managing your device.

In the preview, the Charms are no longer accessible with a mouse, only with touch. If you got used to using the Charms with a mouse, you're going to have to readjust to using the Start button again -- an adjustment for some, especially those who continue to use the Charms with their fingers. Would you like to see Microsoft make the Charms available when using a mouse? What would be the downside?

Meanwhile, have you downloaded the Windows 10 Preview? Keep in mind, this is just the first preview, and Microsoft is looking for user feedback to decide which features make the final cut as it refines Windows 10. We'd love to hear your thoughts on where you'd like Microsoft to refine Windows 10 so that it doesn't become a stinker.

Posted by Jeffrey Schwartz on 10/03/2014 at 12:35 PM0 comments


Windows 'Drawbridge' Container Tech Sets Stage for Docker Battle

As the capabilities of virtual machines reach their outer limits in the quest to build cloud-based software-defined datacenters, containers are quickly emerging as their potential successor. Though containers have long existed, notably in Linux, the rise of the Docker open source container has created a standard for building portable applications in the form of micro-services. As they become more mature, containers promise portability, automation, orchestration and scalability of applications across clouds and virtual machines.

Since Docker was released as an open source container for Linux, just about every major company has announced support for it in its operating systems, virtual machines or cloud platforms, including IBM, Google, Red Hat, VMware and even Microsoft, which in May said it would support Linux-based Docker containers in the infrastructure-as-a-service (IaaS) component of its Azure cloud service. Docker is not available in Microsoft's platform as a service (PaaS) because the PaaS doesn't yet support Linux, though it appears only a matter of time before that happens.

"We're thinking about it," said Mark Russinovich, who Microsoft last month officially named CTO of its Azure cloud. "We hear customers want Linux on PaaS on Azure."

Russinovich confirmed that Microsoft is looking to commercialize its own container technology, code-named "Drawbridge," a library OS effort kicked off in 2008 by Microsoft Research Partner Manager Galen Hunt, who in 2011 detailed a working prototype of a Windows 7 library operating system that ran then-current releases of Excel, PowerPoint and Internet Explorer. In the desktop prototype, Microsoft said the securely isolated library operating system instances worked via the reuse of networking protocols. In a keynote address at the August TechMentor conference (which, like Redmond magazine, is produced by 1105 Media) on the Microsoft campus, Redmond magazine columnist Don Jones told attendees about the effort and questioned its future.

During a panel discussion at the Interop conference in New York yesterday, Russinovich acknowledged that Drawbridge is alive and well. While he couldn't speak to plans for the Windows client, and stopped short of saying Microsoft will include it in Windows Server and Hyper-V, he left little doubt that it's in the pipeline for Windows Server and Azure. Russinovich said Microsoft has already used the Drawbridge container technology in its new Azure-based machine learning service.

"Obviously spinning up a VM for them is not acceptable in terms of the experience," Russinovich said. "So we built with the help of Microsoft Research our own secure container technology, called Drawbridge. That's what we used internally. We are figuring out how to make that kind of technology available publicly on Windows." Russinovich wouldn't say whether it will be discussed at the TechEd conference in Barcelona later this month.

Sam Ramji, who left his role as leader of Microsoft's emerging open source and Linux strategy five years ago, heard about Drawbridge for the first time in yesterday's session. In an interview he argued that if Windows Server is going to remain competitive with Linux, it needs to have its own containers. "It's a must-have," said Ramji, who is now VP of strategy at Apigee, a provider of cloud-based APIs. "If they don't have a container in the next 12 months, I think they will probably lose market share."

Despite Microsoft's caginess on its commercial plans for Drawbridge and containers, reading between the lines it appears they're a priority for the Azure team. While talking up Microsoft's support for Docker containers for Linux, Russinovich seemed to position Drawbridge as a superior container technology, arguing its containers are more secure for deploying micro-services.

"In a multi-tenant environment you're letting untrusted code from who knows where run on a platform and you need a security boundary around that," Russinovich said. "Most cloud platforms use the virtual machines as a security boundary. With a smaller, letter-grade secure container, we can make the deployment of that much more efficient," Russinovich said. "That's where Drawbridge comes into play. "

Ramji agreed that the ability to provide secure micro-services is a key differentiator between the open source Docker and Drawbridge. "It's going to make bigger promises for security, especially for third-party untrusted code," Ramji said.

Asked whether cloud platforms like the Red Hat-led open source OpenShift PaaS can make containers more secure, Krishnan Subramanian argued that's not their role. "They are not there to make containers more secure. Their role is for the orchestration side of things," Subramanian said. "Security comes with the underlying operating system that the container uses. If they're going to use one of those operating systems in the industry that are not enterprise ready, probably they're not secure."

Russinovich said customers do want to see Windows-based containers. Is that the case for you? How do you see them playing a role in your infrastructure, and how imperative is it that they arrive sooner rather than later?

 

Posted by Jeffrey Schwartz on 10/01/2014 at 2:25 PM0 comments


Microsoft Opens Polling Prediction Lab

Microsoft Research today opened its new online Prediction Lab, a move it said aims to reinvent the way polls and surveys are conducted. The new lab, open to anyone in the format of a game, seeks to deliver more accurate predictions than today's surveys can.

Led by David Rothschild, an economist at Microsoft Research and also a fellow at the Applied Statistics Center at Columbia University, the Prediction Lab boasts a credible track record even prior to its launch. In one example released today, the lab predicted an 84 percent chance that Scottish voters would reject independence in the referendum held to decide whether Scotland should secede from the United Kingdom.

The predictions, published in a blog post called "A Data-Driven Crystal Ball," also included the winners of all 15 World Cup knockout games this year. And in the 2012 presidential election, the lab got the Obama-versus-Romney results right in 50 of 51 jurisdictions (the states plus Washington, D.C.). The new interactive platform, now released to the public, aims to gather more data and sentiment from the general population.

"We're building an infrastructure that's incredibly scalable, so we can be answering questions along a massive continuum," Rothschild said in the blog post, where he described the Prediction Lab as "a great laboratory for researchers [and] a very socialized experience" for those who participate.

"By really reinventing survey research, we feel that we can open it up to a whole new realm of questions that, previously, people used to say you can only use a model for," Rothschild added. "From whom you survey to the questions you ask to the aggregation method that you utilize to the incentive structure, we see places to innovate. We're trying to be extremely disruptive."

Rothschild also explained why traditional polling technology is outdated and why new methods like the Prediction Lab need to be researched in the era of big data. "First, I firmly believe the standard polling will reach a point where the response rate and the coverage is so low that something bad will happen. Then, the standard polling technology will be completely destroyed, so it is prudent to invest in alternative methods. Second, even if nothing ever happened to standard polling, nonprobability polling data will unlock market intelligence for us that no standard polling could ever provide. Ultimately, we will be able to gather data so quickly that the idea of a decision-maker waiting a few weeks for a poll will seem crazy."

Microsoft is hoping to keep participants engaged with its game-like polling technique, in which players win points for accurate predictions and lose points for wrong ones. This week's "challenge" asks whether President Obama will name a new attorney general before Oct. 5. The second question asks whether the number of U.S. states recognizing gay marriage will change next week, and the final poll asks whether there will be American soldiers in active combat in Syria by Oct. 5.

Whether the Microsoft Prediction Lab will gain the status of more popular surveys such as the Gallup polls remains to be seen. But the work at Microsoft Research shows an interesting use of applied quantitative research. Though Microsoft didn't outline plans to extend the Prediction Lab, perhaps some of its technology will have implications for the company's offerings such as Cortana, Bing and even Delve, the new Office Graph technology formerly code-named "Oslo" for SharePoint and Office 365. Now in preview, Delve is built on Microsoft's FAST enterprise search technology and is designed to work across Office 365 app silos.

 

Posted by Jeffrey Schwartz on 09/29/2014 at 11:54 AM0 comments


A Reimagined Rackspace To Step Up Hyper-V and VMware Support

Rackspace, which over the past few years tried to transform itself into the leading OpenStack cloud provider, is shifting gears. The large San Antonio-based service provider last week began emphasizing a portfolio of dedicated managed services that let enterprises run their systems and applications on their choice of virtual platforms -- Microsoft's Hyper-V, VMware's ESX or the open source OpenStack platform.

The new Hyper-V-based managed services include for the first time the complete System Center 2012 R2 stack, along with Windows Server and Storage Spaces. The orchestrated services are available now as dedicated single-tenant managed services for testing and are set for general availability in the United States in November, followed by the United Kingdom and Sydney in the first quarter of next year. Insiders had long pushed the company to offer Hyper-V-based managed services, but until recently those efforts met resistance from leadership that wanted to stick with VMware for hosted services.

Months after Rackspace CEO Lanham Napier stepped down in February and the company retained Morgan Stanley in May to seek a buyer, Rackspace last week said it didn't find one and decided to remain independent, naming President Taylor Rhodes as the new CEO. A day later, it said it's no longer just emphasizing OpenStack; instead, it's promoting a best-of-breed approach spanning Hyper-V, VMware and OpenStack.

I caught up with CTO John Engates at a day-long analyst and customer conference in New York, where he explained the shift. "The vast majority of IT is still done today in customer-run datacenters by IT guys," Engates explained. "The small fraction of what's going to the cloud today is early stage applications and they're built by sometimes a sliver of the IT organization that's sort of on the bleeding edge. But there are a lot of applications that still move to the cloud as datacenters get old, as servers go through refresh cycles. But they won't necessarily be able to go to Amazon's flavor of cloud. They will go to VMware cloud, or a Rackspace-hosted VMware cloud, or a Microsoft-based cloud."

In a way, Rackspace is going back to its roots. The company was born as a hosting provider; only a few years ago did it decide to base its growth on competing with the large cloud providers by building out its entire cloud infrastructure on OpenStack, offering an alternative to Amazon Web Services with the added benefit of its so-called "fanatical support." Rackspace codeveloped OpenStack with NASA as an open source means of offering portable, Amazon-compatible cloud services. While it continued to offer other hosted services, including Microsoft-centric Exchange, Lync and SharePoint managed services, that was a relatively small portion of its business, ran only on VMware and remained in the shadow of Rackspace's OpenStack push.

A report released by 451 Research shows OpenStack, though rapidly growing, accounts for $883 million of the $56 billion market for cloud and managed services. "Rackspace for a long time was seen as part and parcel of the Amazon world with public cloud and they're clearly repositioning themselves around the fastest growing, most important part of the market, which is managed cloud and private cloud," said 451 Research Analyst and Senior VP Michelle Bailey. "They can do cloud with customer support, which is something you don't typically get with the larger public providers. They have guarantees around availability, and they'll sign a business agreement with customers, which is what you'll see from traditional hosting and service providers."

Like others, Bailey said there's growing demand for managed services based on Hyper-V-based single-tenant servers. "With the Microsoft relationship they're able to provide apps," she said. "So you're able to go up the stack. It's not just the infrastructure piece that you're getting with Microsoft but specifically Exchange, SharePoint and SQL Server. These are some of the most commonly used applications in the market and Microsoft has made it very good for their partners to be able to resell those services now. When you get into an app discussion with a customer, it's a completely different discussion."

Jeff DeVerter, general manager for the Microsoft private cloud practice at Rackspace, acknowledged it was a multi-year effort to get corporate buy-in for offering services on Hyper-V and ultimately the Microsoft Cloud OS stack. "I had to convince the senior leadership team at Rackspace that this was the right thing to do," said DeVerter. "It was easier to do this year than it was in previous years because we were still feeling our way from an OpenStack perspective. If you look at the whole stack that is Microsoft's Cloud OS, it really is a very similar thing to what the whole stack of OpenStack is. Rackspace has realized the world is not built on OpenStack because there really are traditional enterprise applications [Exchange, Lync and SharePoint] that don't fit there. They're not written for the OpenStack world."

DeVerter would know, having come to the company six years ago as a SharePoint architect and helped grow the SharePoint, Exchange and, ultimately, Lync business to $50 million in revenue. Aiding that growth was the February 2012 acquisition of SharePoint 911, whose principals, Shane Young and Todd Klindt, helped make the case for moving those platforms from VMware to Hyper-V.

 

Posted by Jeffrey Schwartz on 09/26/2014 at 7:46 AM0 comments


Larry Ellison Might Have Unfinished Business with Microsoft

Within moments of last week's news that Larry Ellison has stepped down as Oracle's CEO to become CTO, social media lit up. Reactions such as "whoa!" and "wow!" preceded every tweet or Facebook post. In reality, it seemed like a superficial change in titles.

For all intents and purposes, the new CEOs, Mark Hurd and Safra Catz, were already running the company, while Ellison had final say in technical strategy. Hence it's primarily business as usual with some new formalities in place. Could it be a precursor to some bombshell in the coming days and weeks? We'll see but there's nothing obvious to suggest that.

It seems more likely Ellison will fade away from Oracle over time, rather than have a ceremonial departure like Bill Gates did when he left Microsoft in 2008. Don't be surprised if Ellison spends a lot more time on his yacht and on Lanai, the small Hawaiian island he bought in 2012 and is seeking to make "the first economically viable, 100 percent green community," as reported by The New York Times Magazine this week.

For now, Hurd told The Times that "Larry's not going anywhere." In fact, Hurd hinted that, despite directing his wrath at SAP and Salesforce.com over the past decade, Ellison may revisit his old rivalry with Microsoft. In those days he and Scott McNealy, onetime CEO of Sun Microsystems (ironically now a part of Oracle), fought hard and tirelessly to end Microsoft's Windows PC dominance -- in plenty of public speeches, in lawsuits and in a strong effort to displace traditional PCs with their network computers, which were ahead of their time. Ellison also did his part in helping spawn the Linux server movement by bringing new versions of the Oracle database to the open source platform first, and only much later to Windows Server.

While Oracle has fiercely competed with Microsoft in the database market and in the Java-versus-.NET battle over the past decade, with little left to fight about Ellison has largely focused his ire on IBM, Salesforce and SAP. Nevertheless, Oracle's agreement with Microsoft last year to support Java, the Oracle database and Oracle virtual machines natively on the Microsoft Azure public cloud was intriguing, as it would once have been such an unlikely move.

Now it appears Ellison, who years ago mocked cloud computing, has Azure envy, given that Azure boasts one of the more built-out PaaS portfolios. Hurd told The Times that Oracle is readying its own platform as a service (PaaS), aimed at competing with the Azure PaaS and SQL Server, and that Ellison will unveil it in a keynote at next week's Oracle OpenWorld conference in San Francisco. Suggested (but not stated) was that Ellison will try to pitch it as optimized for Microsoft's .NET developers.

Oracle's current pact with Microsoft is primarily focused on Azure's infrastructure-as-a-service (IaaS) offerings, not PaaS. Whether Oracle offers a serious alternative to running its wares on IaaS remains to be seen. If Oracle indeed aims to compete with Microsoft on the PaaS front, it's likely to be an offering that gives existing Oracle customers -- those who would never port their database and Java apps to Azure -- an alternative cloud.

However Ellison positions this new offering, unless Oracle has covertly been building a few dozen datacenters globally that match the scale and capacity of Amazon, Azure, Google and Salesforce.com -- which would set off a social media firestorm -- it's more likely to look like a better-late-than-never service, well suited to and long awaited by many of Oracle's customers.

Posted by Jeffrey Schwartz on 09/24/2014 at 11:14 AM0 comments


Size Does Matter: Apple iPhone Sales 'Shatter' Expectations

Apple today said it has sold 10 million of its new iPhones over the first three days since they arrived in stores and at customers' doorsteps Friday. This exceeds analysts' and the company's forecasts. In the words of CEO Tim Cook, sales of its new iPhone 6 and iPhone 6 Plus models have led to the "best launch ever, shattering all previous sell-through records by a large margin."

Indeed, analysts note that the figures are impressive, especially considering the new iPhones haven't yet shipped in China, where demand is substantial. Whether and how the initial results will bolster the iPhone and iOS, in a market that has ceded leadership share to Android, remains to be seen. But the company's unwillingness to deliver a larger phone earlier clearly cost it share among those with pent-up demand for bigger smartphones. Most Android phones and Windows Phones are larger than the previous 4-inch iPhone 5s, and the majority of devices these days exceed 4.5 inches.

The company didn't break out sales of the bigger iPhone 6 versus the even larger 5.5-inch iPhone 6 Plus, but some believe the latter may have had an edge even though it's in shorter supply. If the sales Apple reported are primarily from existing iPhone users, that will only stabilize its existing share, not extend it. However, as the market share for Windows Phone declines, demand for the iPhone will grow on the back of heavily covered features such as Apple Pay and the forthcoming Apple Watch. That won't help the market for Microsoft-based phones (and could bring some Android users back to Apple).

It doesn't appear the new Amazon Fire phones, technically Android-based devices, are gaining meaningful share. Meanwhile, BlackBerry is readying its first new phone since the release of BlackBerry 10 last year. Despite minuscule market share, BlackBerry CEO John Chen told The Wall Street Journal that the 4.5-inch display on its new Passport will help make it an enterprise-grade device targeted at productivity. It will also boast a battery that can power the phone for 36 hours and a large antenna designed to provide better reception. In addition to enterprise users, the phone will be targeted at medical professionals.

With the growing move by some to so-called phablets, which the iPhone 6 Plus arguably is (some might say devices that are over 6 inches better fit that description), these larger devices are also expected to cut into sales of 7-inch tablets. In Apple's case, that includes the iPad mini. But given the iPad mini's price and the fact that not all models have built-in cellular connectivity, the iPhone 6 Plus could bolster Apple more than hurt it.

Despite Microsoft's efforts to talk up improvements to Windows Phone 8.1 and its emphasis on Cortana, the noise from Apple and the coverage surrounding it is all but drowning out Microsoft's message. As that noise subsides in the coming weeks, Microsoft will need to turn up the volume.

 

Posted by Jeffrey Schwartz on 09/22/2014 at 12:03 PM0 comments


More Microsoft Board Changes: 2 Longtime Directors To Depart

Microsoft earlier this week said two more longtime board members are stepping down, and the company has already named their replacements, whose appointments take effect Oct. 1.

The departing directors are David Marquardt, a venture capitalist who was an early investor in Microsoft, and Dina Dublon, the onetime chief financial officer of J.P. Morgan. Dublon served on Microsoft's audit committee and chaired its compensation committee, The Wall Street Journal noted. The paper also raised an interesting question: Would losing the two board members now, on top of others over the past two years, result in a gap in "institutional knowledge"? Marquardt, with his Silicon Valley ties, played a key role in helping Microsoft "get off the ground and is a direct link to the company's earliest days."

The two new board members are Teri List-Stoll, chief financial officer of Kraft Foods, and Visa CEO Charles Scharf, who once ran J.P. Morgan Chase's retail banking and private investment operations. Of course, the moves come just weeks after former CEO Steve Ballmer stepped down from the board.

Nadella will be reporting to a board with six new members out of a total of 10 since 2012. Indeed that should please Wall Street investors who were pining for new blood during last year's search for Ballmer's replacement and didn't want to see an insider get the job.

But the real question is whether these new voices, teamed with Nadella, can strike the balance needed for Microsoft to thrive in the most fiercely competitive market it has faced to date.

 

Posted by Jeffrey Schwartz on 09/18/2014 at 3:32 PM0 comments


Tech & Tools Watch, September 18: IBM, Unitrends, CloudLink

IBM's New M5 Servers Include Editions for Hyper-V and SQL Server
In what could be its last major rollout of new x86 systems if the company's January deal to sell its commodity server business to Lenovo for $2.3 billion goes through, IBM launched the new System x M5 line. The new lineup of servers includes systems designed to operate the latest versions of Microsoft's Hyper-V and SQL Server.

IBM said the new M5 line offers improved performance and security. With a number of models for a variety of solution types, the new M5 includes various tower, rack, blade and integrated systems that target everything from small workloads to infrastructure for private clouds, big data and analytic applications. The two systems targeting Microsoft workloads include the new IBM System x Solution for Microsoft Fast Track DW for SQL Server 2014 and an upgraded IBM Flex System Solution for Microsoft Hyper-V.

The IBM System x Solution for Microsoft SQL Data Warehouse on X6 is designed for data warehouse workloads running the new SQL Server 2014, which shipped in April. IBM said it offers rapid response to data queries and enables scalability as workloads increase. Specifically, the new systems are powered by Intel Xeon E7-4800/8800 v2 processors. IBM said the new systems offer 100 percent faster database performance than the prior release, with three times the memory capacity and a third of the latency of PCIe-based flash. The systems can also pull up to 12TB of flash memory-channel storage close to the processor.

As a Microsoft Fast Track partner, IBM also added its IBM System x3850 X6 Solution for Microsoft Hyper-V. Targeting business-critical systems, the two-node configuration uses Microsoft Failover Clustering, aimed at eliminating any single point of failure, according to IBM's reference architecture. The Hyper-V role is installed on each clustered server, which hosts the virtual machines.

Unitrends Adds Reporting Tools To Monitor Capacity and Storage Inventory
Backup and recovery appliance supplier Unitrends has added new tools to track storage inventory and capacity, designed to help administrators more accurately gauge their system requirements in order to lower costs.

The free tools provide views of storage, file capacity and utilization. Unitrends said they give administrators single point-in-time snapshots of storage and files distributed throughout the datacenter, helping them calculate how much capacity to set aside for backups and prioritize which files get backed up based on the storage available.

There are two separate tools. The Unitrends Backup Capacity Tool takes a snapshot of all files distributed across an organization's storage systems and servers, offering file-level views of data along with planning reports that outline file usage. The Unitrends Storage Inventory Tool provides a view of an organization's entire storage infrastructure, which the company says delivers detailed inventories of the storage assets in use and flags where there's a risk of exceeding available capacity.

These new tools follow July's release of BC/DR Link, a free online tool that helps organizations create global disaster recovery plans. It includes 1GB of centralized storage in the Unitrends cloud, where customers can store critical documents.

CloudLink Adds Microsoft BitLocker To Secure Workloads Running in Amazon
Security vendor CloudLink's SecureVM, which provides Microsoft BitLocker encryption, has added Amazon Web Services to its list of supported cloud platforms. The tool lets customers apply the native Windows encryption to their virtual machines -- both desktops and servers -- in the Amazon cloud. In addition to Amazon, it works in Microsoft Azure and VMware vCloud Air, among other public clouds.

Because virtual and cloud environments naturally can't utilize the BitLocker encryption keys typically stored on TPM and USB hardware, SecureVM emulates that functionality by providing centralized management of the keys. It also lets customers encrypt their VMs outside of AWS.

Customers can start their VMs only in the intended environment; policies ensure the VMs launch only when authorized. SecureVM also provides an audit trail tracking when VMs are launched, along with information such as IP addresses, hosts and operating systems.
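To illustrate the general idea -- and this is only a rough sketch, not CloudLink's actual API -- a policy-based key-release service might look something like the following Python snippet. Every name here (the approved-environment table, the key store, the request function) is hypothetical; the point is simply that a volume key is released only to VMs that match an approved environment, and that every request is logged for the audit trail.

    import logging
    from datetime import datetime, timezone

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("key-release-audit")

    # Hypothetical policy: which cloud/region/OS combinations may receive keys.
    APPROVED_ENVIRONMENTS = {
        ("aws", "us-east-1", "windows"),
        ("azure", "eastus", "windows"),
    }

    # Hypothetical key store mapping VM IDs to their wrapped volume encryption keys.
    KEY_STORE = {"vm-1234": b"<wrapped-volume-key>"}

    def request_volume_key(vm_id, cloud, region, os_name, source_ip):
        """Release a VM's volume key only if its environment is authorized."""
        authorized = (cloud, region, os_name) in APPROVED_ENVIRONMENTS
        # Every request is recorded, granted or denied, for the audit trail.
        log.info("key request vm=%s cloud=%s region=%s os=%s ip=%s at %s -> %s",
                 vm_id, cloud, region, os_name, source_ip,
                 datetime.now(timezone.utc).isoformat(),
                 "granted" if authorized else "denied")
        if not authorized:
            return None  # VM boots without its data volumes unlocked
        return KEY_STORE.get(vm_id)

    if __name__ == "__main__":
        request_volume_key("vm-1234", "aws", "us-east-1", "windows", "10.0.0.5")
        request_volume_key("vm-1234", "aws", "eu-west-1", "windows", "10.0.0.9")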

Data volumes assigned to an instance are encrypted, and customers can encrypt additional data volumes as well, according to the company. Enterprises maintain control of encryption key management, including the option to store keys in-house.

Posted by Jeffrey Schwartz on 09/18/2014 at 12:09 PM0 comments


Investors Add $40 Million to Docker's Coffers in Push To Advance Containers

As the IT industry looks at the future of the virtual machine, containers have jumped out as the next big thing, and every key player with an interest in the future of the datacenter is rallying around Silicon Valley startup Docker. That includes IBM, Google, Red Hat, VMware and even Microsoft. Whether it is cause or effect, big money is following Docker as well.

Today Sequoia Capital pumped $40 million of Series C funding into Docker, bringing the company's total funding to $66 million and its estimated valuation to $400 million. Early investors in Docker include Benchmark, Greylock Partners, Insight Ventures, Trinity Ventures and Yahoo cofounder Jerry Yang. Docker aims to move beyond the traditional virtual machine with its open source platform for building, shipping and running distributed applications.

As Docker puts it, the limitation of virtual machines is that they include not only the application, but the required binaries, libraries and an entire guest OS, which could weigh tens of gigabytes, compared with just a small number of megabytes for the actual app. By comparison, the Docker Engine container consists of just the application and its dependencies.

"It runs as an isolated process in user space on the host operating system, sharing the kernel with other containers," according to the company's description. "Thus, it enjoys the resource isolation and allocation benefits of VMs but is much more portable and efficient."

With the release of Docker in June, Microsoft announced support for the Linux-based containers by updating the command-line interface in Azure, allowing customers to build and deploy Docker-based containers in Azure. Microsoft also said customers can manage the virtual machines with the Docker client. As reported here by John Waters, Microsoft's Corey Sanders, manager of the Azure compute runtime, demonstrated this capability at DockerCon at the time.

On the Microsoft Open Technologies blog, evangelist Ross Gardler outlined how to set up and use Docker on the Azure cloud service. According to Gardler, common use cases for Docker include the following (a small deployment sketch follows the list):

  • Automating the packaging and deployment of applications
  • Creation of lightweight, private PaaS environments
  • Automated testing and continuous integration/deployment
  • Deploying and scaling web apps, databases and backend services

At VMworld last month, VMware talked up its support for Docker, saying it has teamed with the company, joined by its sister company Pivotal as well as Google, to enable their collective enterprise customers to run and manage apps in containers in public, private and hybrid cloud scenarios as well as on existing VMware infrastructure.

Containers are a technology to watch, whether you're a developer or an IT pro. The entire IT industry has, at least publicly, embraced them as the next generation of virtual infrastructure. And for now, Docker seems to be setting the agenda for this new technology.

 

Posted by Jeffrey Schwartz on 09/16/2014 at 11:57 AM0 comments


Microsoft Looks Past Windows with $2.5 Billion Deal To Acquire Minecraft Developer

Microsoft has pulled the trigger on a $2.5 billion deal to acquire Mojang, the developer of the popular Minecraft game. Rumors that a deal was in the works surfaced last week, though the price tag was initially said to be $2 billion. It looks like the founders of the 5-year-old startup squeezed another half-billion dollars out of Microsoft over the weekend.

Minecraft is the best-selling game on Microsoft's popular Xbox platform. At first glance, this move could be a play to make it exclusive to Microsoft's gaming system and keep it out of the hands of the likes of Sony. It could even signal an effort to boost the company's declining Windows Phone business, or its Windows PC and tablet software. However, if you listen to Xbox head Phil Spencer and consider Microsoft's push to support all device platforms, that doesn't appear to be the plan.

"This is a game that has found its audience on touch devices, on phones, on iPads, on console and obviously its true home on PC. Whether you're playing on an Xbox, whether you're playing on a PlayStation, an Android or iOS device, our goal is to continue to evolve with and innovate with Minecraft across all those platforms," Spencer said in a prerecorded announcement on his blog.

Consider CEO Satya Nadella's proclamation in July that Microsoft is the "productivity and platforms company," and the fact that it spent more than double on Mojang what it paid for enterprise social media company Yammer, and you may wonder how this deal fits that focus. In the press release announcing the deal, Nadella stated: "Minecraft is more than a great game franchise -- it is an open world platform, driven by a vibrant community we care deeply about, and rich with new opportunities for that community and for Microsoft."

That at least hints at the thinking that the platform and community Mojang's founders created could play a role in UI design that doesn't rely on Windows or even Xbox. Others have speculated that this is a move to make Microsoft's gaming business ripe for a spinoff or sale -- something investors want, but a move Nadella has indicated he's not looking to make.

"The single biggest digital life category, measured in both time and money spent, in a mobile-first world is gaming," Nadella said in his lengthy July 10 memo, announcing the company's focus moving forward. "We also benefit from many technologies flowing from our gaming efforts into our productivity efforts --core graphics and NUI in Windows, speech recognition in Skype, camera technology in Kinect for Windows, Azure cloud enhancements for GPU simulation and many more. Bottom line, we will continue to innovate and grow our fan base with Xbox while also creating additive business value for Microsoft."

The deal also makes sense from another perspective: Minecraft is hugely popular, especially with younger people -- a demographic that is critical to the success of any productivity tool or platform. Clearly Nadella is telling the market, and especially critics, that it's not game over for Microsoft.

Posted by Jeffrey Schwartz on 09/15/2014 at 12:24 PM0 comments


Preorders for New iPhone Hit 4 Million in 1 Day

Apple today said preorders of its new iPhone 6 and its larger sibling, the 6 Plus, totaled a record 4 million in the first 24 hours, which doubled the preorders that the iPhone 5 received two years ago. But those numbers may suggest a rosier outlook than they actually portend.

Apple didn't release similar figures for last year's iPhone 5s, which was largely an incremental upgrade over its then year-old predecessor, or for the lower-end 5c, which suggests customers are sitting on a lot of aging iPhones. That includes earlier models that are reaching the point of sluggishness as newer versions of iOS tax their slower processors, and that can only run on 3G networks.

Perhaps boosting demand was an attractive promotion Apple offered for the first time: a $200 trade-in for an older iPhone in working condition. For now, that offer is only good through this Friday, but it wouldn't be surprising if Apple extended it or reintroduced it for the holiday season.

When I visited a local Verizon-owned and -operated store on Saturday, I noticed a few customers interested in trading their larger Android phones for the new 6 Plus. But most of the orders Apple reported must have come from online, because the vast majority of shoppers in the store were looking at Android-based phones. Those wanting a new iPhone, especially the larger one, will have to wait a month or more, although Apple always seems to play up those supply shortages when releasing new devices.

Many customers these days are less loyal to any one phone platform and are willing to switch if another has hardware specs that meet their needs -- perhaps its size, the camera or even the design. For example, I observed one woman who wanted to replace her damaged iPhone with an iPhone 6 Plus, but when the rep told her she'd have to wait a few weeks, she said she'd just take an Android phone since she needed a new one right away. I saw another man warning his son that if he switched to an Android phone, he'd lose all his iOS apps. The teenager was unfazed and bought an Android phone anyway.

Meanwhile, no one was looking at the Nokia Lumia 928s or Icons, and the store employees told me they sell few Windows Phones, which they estimated accounted for less than 5 percent of phone sales. Perhaps that will change following Microsoft's deal to acquire Mojang, purveyor of the popular Minecraft game? That's a discussion for my post about today's $2.5 billion announcement.

For those who did purchase a new iPhone, it appears there was more demand for the larger unit with the 5.5-inch display than for its junior counterpart, which measures 4.7 inches (still larger than the earlier 5/5s models), although Apple didn't provide a breakdown. If you're an iPhone user, do you plan to upgrade to a newer one or switch to another platform? What size do you find most appealing?

Posted by Jeffrey Schwartz on 09/15/2014 at 12:08 PM0 comments


Will HP's Acquisition of Eucalyptus Challenge Microsoft's Amazon Cloud Migration Push?

Hewlett Packard's surprising news that it has agreed to acquire Eucalyptus potentially throws a monkey wrench into Microsoft's recently stepped-up push to enable users to migrate workloads from Amazon Web Services to the Microsoft Azure public cloud.

As I noted last week, Microsoft announced its new Migration Accelerator, which migrates workloads running on the Amazon Web Services cloud to Azure. It's the latest in a push to accelerate its public cloud service, which analysts have recently said is gaining ground.

By acquiring Eucalyptus, HP gains a tool sanctioned by Amazon for running AWS-compatible workloads in private clouds. Eucalyptus signed a compatibility pact with Amazon in March 2012 that lets its open source private cloud operating system software use Amazon APIs, including Amazon Machine Images (AMIs).

The deal, announced late yesterday, also means Eucalyptus CEO Marten Mickos will become general manager of HP's Cloud services and will report to HP Chairman and CEO Meg Whitman. Mickos, the onetime CEO of MySQL, has become a respected figure in infrastructure-as-a-service (IaaS) cloud circles. But the move certainly raised eyebrows.

Investor and consultant Ben Kepes, in a Forbes blog post, questioned whether Eucalyptus ran out of money and was forced into a fire sale, or whether the acquisition was a desperate move by HP to jump-start its cloud business, which has seen numerous management and strategy shifts.

"One needs only to look at the employee changes in the HP cloud division." Kepes wrote. "Its main executive, Biri Singh, left last year. Martin Fink has been running the business since then and now Mickos will take over -- he'll apparently be reporting directly to CEO Meg Whitman but whether anything can be achieved given Whitman's broad range of issues to focus on is anyone's guess."

Ironically, on Redmond magazine's sister site Virtualization Review, well-known infrastructure analyst Dan Kusnetzky had just talked with Bill Hilf, senior vice president of product and service management for HP Cloud, who shared his observations on HP's new Helion cloud offering prior to the Eucalyptus deal announcement. Helion is built on OpenStack, though HP also has partnerships with Microsoft, VMware and CloudStack.

"The deal represents HP's recognition of the reality that much of what enterprise developers, teams and lines of business do revolves around Amazon Web Services," said Al Sadowski, a research director at 451 Research.

"HP clearly is trying to differentiate itself from Cisco, Dell and IBM by having its own AWS-compatible approach," Kusnetzky added. "I'm wondering what these players are going to do once Eucalyptus is an HP product. Some are likely to steer clients to OpenStack or Azure as a way to reduce HP's influence in their customer bases."

It also raises questions about the future of Helion, which, despite HP's partnerships, emphasizes OpenStack -- something Mickos has long and vocally supported. "We are seeing the OpenStack Project become one of the largest and fastest growing open source projects in the world today," Mickos was quoted as saying in an HP blog post.

Hmm. According to a research report released by 451 Research, OpenStack accounted for $883 million of IaaS revenue, or about 13 percent. That's forecast to roughly double to $1.7 billion of a $10 billion market in 2016, or 17 percent. Not trivial, but not a huge chunk of the market either.

HP clearly made this move to counter IBM's apparent headway with its cloud service, even though it appears the three front runners are Amazon, Microsoft and Google. In order to remain a player, HP needs to have compatibility with all three, as well as OpenStack, and acquiring Eucalyptus gives it a boost in offering Amazon compatibility, even if it comes at the expense of its server business, as noted by The New York Times.

In my chats with Mickos over the years, he never ruled out Azure compatibility for Eucalyptus, but the last time we spoke, over a year ago, he admitted it hadn't gone very far. Time will tell if it becomes a greater priority for both companies.

Regardless, 451 Research's Sadowski indicated that HP's move likely targets IBM more than Microsoft. "HP is hoping to capture more of the enterprise market as organizations make their way beyond Amazon (and Azure) and build out their private and hybrid cloud deployments," he said. "We would guess that in acquiring Eucalyptus, the company is seeking to replicate the story that IBM has built with its SoftLayer buy and simultaneous OpenStack support."

Posted by Jeffrey Schwartz on 09/12/2014 at 12:51 PM0 comments


Would Dick Tracy Buy the Apple Watch?

Apple's much-anticipated launch event yesterday "sucked the air out of the room" during the opening keynote session at the annual Tableau customer conference taking place in Seattle, as my friend Ellis Booker remarked on Facebook.

Regardless of how you view Apple, the launch of its larger iPhone 6 and 6 Plus models, along with the new payment service and smartwatch, was hard to ignore. Positioned as an event on par with the launches of the original Macintosh 30 years ago, the iPod, the iPhone and the iPad, the new Apple Watch and Apple Pay made for the largest launch event the company has staged in over four years, and arguably the largest number of new products showcased at a single Apple event. Despite all the hype, it remains to be seen whether it will be remembered as being as disruptive as Apple's prior launches. Yet given Apple's history, I wouldn't quickly dismiss the potential in what it announced yesterday.

The Apple Watch was the icing on the cake for many fans looking for the company to show it can create new markets. But critics were quickly disappointed when they learned the new Apple Watch will only work if linked to a new iPhone (or last year's iPhone 5s). Many were surprised to learn that it will be some time before the circuitry needed to provide all the communications functionality of a phone or tablet can be miniaturized into a watch.

This is not dissimilar to other smartwatches on the market. But this technology is not yet the smartwatch Dick Tracy would get excited about. No one knows if the Apple Watch (or any smartwatch) will be the next revolution in mobile computing or if it will be a flop. And even if it is the next big thing, it isn't a given that Apple, or any other player, will own the market.

Yet Apple does deserve some benefit of the doubt. Remember when the first iPod came out and it only worked with a Mac? Once Apple added Windows compatibility, everything changed. Likewise, when the first iPhone came out in 2007, it carried a price tag of $599. After few sold, Apple quickly cut the price to $399, but it wasn't until the company offered $199 iPhones that the smartphone market took off. It's reasonable to expect affordable smartwatches will follow if the category does become a mass market. I'd think the sweet spot will be under $150, but who knows for sure.

While Apple was expected to introduce a watch, the variety of watches it will offer was surprising. Those pining for an Apple Watch will have to wait until early next year but it's hard to see these flying off the shelves in their current iteration. I am less certain Apple will open up its watch to communicate with Android and Windows Phones, though that would open the market for its new devices just as adding Windows support to its iPods did.

Still, it's looking more likely that those with Android and Windows Phones will choose watches running on the same platforms. Indeed, there are a number of Android-based alternatives, such as Motorola Mobility's new Moto 360 for $249 and the LG G Watch, available now for $179 in the Google Play store. They, too, require connectivity to an Android phone.

For its part, Microsoft is planning its own smartwatch with a thin form factor resembling many of the fitness-oriented watches. It will have 11 sensors and will be cross platform, according to a Tom's Hardware report.

As I noted, the Apple Watch was icing on an event intended to announce the pending availability of the two new iPhones. Arriving as expected, one measures 4.7 inches and the other is a 5.5-inch phablet that is almost the size of an iPad mini. They're poised to appeal to those who want something like Samsung's large Galaxy line of phones but aren't enamored with Android. The new iPhones will also put pressure on Microsoft to promote its large 6-inch Nokia Lumia 1520, which sports a 20-megapixel camera and a 1920x1080 display. Though the new iPhones' cameras have only 8 megapixels, Apple emphasized a new burst mode (60 fps) that can intelligently pull the best photo from a series of images. The new iPhone 6 Plus also has a 1920x1080 display and starts at $299 for a 16GB model (the standard iPhone 6 is still $199).

Besides other incremental improvements in the new iPhones, perhaps the most notable new capability is support for the company's new Apple Pay service. It will allow individuals to make payments at participating merchants using the phone's fingerprint recognition interface, called Touch ID, which works with an NFC chip on the phone that stores encrypted credit card and shipping information.

If Apple (and ultimately all smartphone suppliers) can convince customers and merchants that this is a more secure way of handling payments than traditional credit cards, we could see the dawn of a new era in how transactions are made. A number of high-profile breaches, including this week's acknowledgment by Home Depot that its payment systems were compromised, could over the long run hasten demand for this technology if it's proven to be more secure. Of course, Apple has to convince skeptics that last week's iCloud breach was an isolated incident.

Regardless of your platform preference, or whether you take a best-of-breed approach, we now have a better picture of Apple's next generation of wares. We can expect to hear what's in the pipeline for the iPad in the next month or two; reports suggest a larger one is in the works.

Posted by Jeffrey Schwartz on 09/10/2014 at 12:07 PM0 comments


Microsoft's Revamped MSN Preview Is Worth Checking Out

Back in the 1990s, when America Online ruled the day, Microsoft's entry with MSN followed in AOL's footsteps. Microsoft is hoping its latest MSN refresh, tailored for the mobile and cloud era, will take hold -- and its new interface is quite compelling.

Launched as a public preview today, the new MSN portal is a gateway to popular apps such as Office 365, Outlook.com, Skype, OneDrive and Xbox Music, as well as some outside services, notably Facebook and Twitter. The interface to those services is the Service Stripe. After spending just a short amount of time with it, I'm already considering replacing My Yahoo, my longtime default browser home page.

"We have rebuilt MSN from the ground up for a mobile-first, cloud-first world," wrote Brian MacDonald, Microsoft's corporate VP for information and content experiences, in a blog post. "The new MSN brings together the world's best media sources along with data and services to enable users to do more in News, Sports, Money, Travel, Food & Drink, Health & Fitness, and more. It focuses on the primary digital daily habits in people's lives and helps them complete tasks across all of their devices. Information and personalized settings are roamed through the cloud to keep users in the know wherever they are."

If anything, MacDonald downplayed MSN's ability to act as an interface to some of the most widely used services while providing a rich offering of its own information and personal services. The preview is available now for Windows and will be coming shortly to iOS and Android. Could the new MSN be the portal that gives Microsoft's Bing search engine the boost it needs and someday brings more users to Cortana?

 

Posted by Jeffrey Schwartz on 09/08/2014 at 2:22 PM0 comments


Windows Phone Users Strike Back: Apps Don't Matter

After I critiqued the lack of apps available for Windows Phone last week, dozens of readers took issue with my complaints, effectively describing them as trivial and ill-informed. The good news is there are a lot of passionate Windows Phone users out there -- but, alas, not enough, at least for now, to make it a strong No. 3 player against the iPhone and Android-based devices. Though the odds of it becoming a solid three-horse race appear to be fading, I really do hope that changes.

While I noted the increased number of apps for Windows Phone, many readers came at me saying I overstated how many apps are still missing. "I am not sure what all the griping is about," Sam wrote. "I personally think that this whole 'app thing' is a step backward from a decade ago when we were moving toward accessing information using just a browser. Now you buy dumb devices and apps are a must."

Charles Sullivan agreed. "I have a Windows Phone 7, which is now nearly 4 years old and I rarely think about apps," he said. "Apps, for the most part, are for people that cannot spell their own name, which is a lot of people." Regarding the lack of a Starbucks app, Sam added, "Seriously, if you need an app to buy coffee, your choice of a phone is the least of your problems." Several critics said the apps I cited were missing were indeed available. "Yeah, I took two seconds and searched on my Windows Phone and found most of the apps. Seems like zero research went into this," Darthjr said.

Actually, I did search the Windows Store for every app I listed, some on numerous occasions; a few providers I even called. If I overlooked some apps that are actually there, I apologize. But the truth remains that there's still a vast number of apps available for the iPhone and Android that aren't available for Windows Phone, and in many cases they aren't even in the pipeline. Just watch a commercial or look at an ad in print or online and you'll often be directed to download an app for either iOS or Android.

In some instances, there are similar apps offered by different developers. "Sorry, but your examples are poor and just continue to perpetuate the notion that if you have a Windows Phone, you are helpless," reader The WinPhan lamented. "Everything you listed as your 'daily go-to's' in app usage can be found in the Windows Phone Store, maybe not by same developers/name, but with same functionality and end result, with the exception of your local newspaper and cable. Please do better research."

Acknowledging the lead iOS and Android have, Curtis8 believes the fragmentation of Android will become a bigger issue and Windows Phone will incrementally gain share. "As Windows Phone gets better, with more markets and coverage, devs will start to support it more," Curtis8 noted, adding that iPhone will likely remain a major player for some time. "But I do see Windows Phone gaining traction regardless of the numbers people try to make up. One of our biggest fights is not the consumer, it is the carriers and sales people. Walk into any store and ask about Windows Phone: selection is crap, stock is crap and the sales people will try to convince you on an iPhone or Samsung device. Many people do not feel the missing apps as much if they stay on Windows Phone and find what we have."

That's absolutely true. As a Verizon customer, I've gone into a number of company-owned stores and the story remains the same: you're lucky if you can even find Windows Phones when you're looking for them, and forget about it if you're not. Ask a sales rep about Windows Phone and they'll effectively say no one's buying them. The selection of Windows Phones on AT&T is better, thanks to a partnership with Nokia.

Some respondents argued that the question of apps will become a moot point as Cortana, the voice-activated feature in the latest version of Windows Phone 8.1, catches on, especially if it's also included in the anticipated new Windows 9, code-named "Threshold," as is rumored to be a possibility.

Prabhujeet Singh, who has a Nokia Lumia 1020, indicated he loves its 41-megapixel camera (indeed a key differentiator) and added: "Cortana is amazing. I tell it to set an alarm, set a reminder and when I am driving, I just speak the address. It understands my accent and I am not a native speaker of English. Do I miss apps? Nope. Not at all."

Tomorrow could very well mark an inflection point as Apple launches its next generation of mobile phones -- the larger iPhone 6 models -- and its widely anticipated iWatch. While that's a topic for another day, a report in The Wall Street Journal Saturday said the new Apple iWatch will be enabled by HealthKit, a health platform to be integrated into iOS 8. Several key health care institutions, including Memorial Sloan Kettering Cancer Center in New York, insurance giant Kaiser Permanente and the Mayo Clinic, were said to be on board in some fashion.

If Apple has -- or can gain -- the kind of broad support from the health care industry that it had from the music industry when it introduced the iPod 13 years ago, despite a crowded market of MP3 players, it could spawn a new market. To date, others, including Google with Google Health and Microsoft with its HealthVault community, have seen only limited success. On the other hand, some argue that winning in the health care market is a much smaller fish to reel in than capturing the huge music and entertainment market was over a decade ago. Nevertheless, success in the health and fitness market would be a boon to Apple at the expense of others.

As long as Microsoft says it's committed to its mobile phone business, it would be foolish to write off the platform. Much of the future of its Nokia Lumia phone business could be tied to Microsoft's commitment to hardware (including its Surface business). As Mary Jo Foley noted in her Redmond magazine column this month: "Like the Lumia phones, the role of the Surface in the new Microsoft will be a supporting, not a starring one. As a result, I could see Microsoft's investment in the Surface -- from both a monetary and staffing perspective -- being downsized, accordingly."

Even if that turns out to be the case, it doesn't portend the end for Windows and Windows Phone, especially if Microsoft can get OEMs on board. For now, that's a big if, despite the company removing license fees for OEMs building devices smaller than nine inches. Fortunately for Microsoft, as recently reported, its mobility strategy doesn't depend merely on Windows, but on its support for all platforms as well. The good news is this market is still evolving. For now, though, as long as apps matter, Windows Phone faces an uphill battle despite some of its unique niceties.

 

Posted by Jeffrey Schwartz on 09/08/2014 at 2:52 PM0 comments


New Azure Tool Migrates Workloads from Amazon, VMware and Private Clouds

Microsoft is readying a tool that it says will "seamlessly" migrate physical and virtual workloads to its Azure public cloud service. A limited preview of the new Migration Accelerator, released yesterday, moves workloads to Microsoft Azure from physical machines, VMs (both VMware and Hyper-V-based) and those running in the Amazon Web Services public cloud.

The launch of the new migration tool comes as Microsoft officials are talking up the growth of its Azure cloud service at the expense of Amazon Web Services. Microsoft Technical Fellow in the Cloud and Enterprise Mark Russinovich emphasized that point in a speech at last month's TechMentor conference, which like Redmond magazine is produced by 1105 Media.

Migration Accelerator "automates all aspects of migration including discovery of source workloads, remote agent installation, network adaptation and endpoint configuration," wrote Srinath Vasireddy, a lead principal program manager for enterprise and cloud at Microsoft, in a post on the Microsoft Azure Blog yesterday . "With MA, you reduce cost and risk of your migration project."

The technology enabling the workload migrations comes from Microsoft's July acquisition of InMage, whose Scout software appliances for Windows and Linux physical and virtual instances capture data continuously as changes occur, then simultaneously perform local backups or remote replication via a single data stream. A week after announcing the acquisition, Microsoft said the InMage Scout software will be included in its Azure Site Recovery subscription licenses.

While the tool gives Microsoft a better replication story, the Migration Accelerator appears aimed at pushing customers to use Azure for more than just disaster recovery and business continuity, even though that has emerged as a popular use of public clouds generally.

For example, Vasireddy pointed to the Migration Accelerator's ability to migrate multitier production systems, which he said provides application consistency orchestrated across all tiers. "This ensures multitier applications run the same in Azure, as they ran at the source," he said. "Application startup order is even honored, without the need for any manual configuration."

Vasireddy outlined in his blog post how the Migration Accelerator works and described its components (a rough sketch of how the pieces fit together follows the list):

  • Mobility Service: A lightweight, guest-based, centrally deployed agent that is installed on the source servers (on-premises physical or virtual) to be migrated to target virtual machines on Azure. It is responsible for real-time data capture and synchronization of the selected volumes on source servers to the target servers.
  • Process Server (PS): A physical or virtual server installed on-premises. It facilitates communication between the Mobility Service and target virtual machines in Azure, and provides caching, queuing, compression, encryption and bandwidth management.
  • Master Target (MT): A target for replicating disks of on-premises servers. It is installed within a dedicated Azure VM in your Azure subscription. Disks are attached to the MT to maintain duplicate copies.
  • Configuration Server (CS): Manages communication between the Master Target and the MA Portal. It is installed on a dedicated Azure VM in your Azure subscription. Regular synchronization occurs between the CS and the MA Portal.
  • MA Portal: A multitenant portal to discover, configure protection for and migrate your on-premises workloads into Azure.
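
To make the relationships among these pieces a bit more concrete, here is a minimal conceptual sketch in Python. The class and field names are invented purely for illustration; they are not part of any Microsoft Migration Accelerator API or configuration format, and the flow simply restates the roles Vasireddy describes above.

    # Conceptual sketch only: these names are invented for illustration and are
    # not part of any Microsoft Migration Accelerator API.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Component:
        name: str
        location: str   # "on-premises" or "Azure"
        role: str

    # Model the MA topology described in the list above.
    TOPOLOGY: List[Component] = [
        Component("Mobility Service", "on-premises source server",
                  "captures and synchronizes selected volumes in real time"),
        Component("Process Server", "on-premises",
                  "caching, queuing, compression, encryption, bandwidth management"),
        Component("Master Target", "dedicated Azure VM",
                  "holds replicated copies of on-premises disks"),
        Component("Configuration Server", "dedicated Azure VM",
                  "coordinates the Master Target with the MA Portal"),
        Component("MA Portal", "Azure (multitenant)",
                  "discovery, protection configuration and migration workflow"),
    ]

    def replication_path() -> str:
        """Return the order data flows through the components during replication."""
        return " -> ".join(c.name for c in TOPOLOGY[:3])

    if __name__ == "__main__":
        for c in TOPOLOGY:
            print(f"{c.name:20s} [{c.location}]: {c.role}")
        print("Replication flow:", replication_path())

Running the sketch simply prints each component's location and role, along with the source-to-Azure replication path (Mobility Service to Process Server to Master Target).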

Microsoft's new cloud migration tool also offers automated asset discovery, cutovers to Azure within minutes using in-memory change tracking, dormant target VMs during migration to lower compute costs, automated provisioning, lightweight agents on targets for continuous replication, and automated network adaptation and endpoint reconfiguration, he said. Those interested in testing the Migration Accelerator must sign up for the limited preview.

 

Posted by Jeffrey Schwartz on 09/05/2014 at 12:33 PM


Equity Firms To Acquire BeyondTrust and Compuware, Concur Reportedly In Play

Two key suppliers of enterprise tools yesterday said they're being acquired, and another well-known software-as-a-service provider is reportedly in play. BeyondTrust and Compuware have agreed to be acquired by private equity firms, while shares of SaaS provider Concur have traded higher following reports that it has hired an investment bank to gauge interest and that several software vendors, including Microsoft, Oracle and SAP, have approached it.

Veritas Capital said it will acquire cybersecurity software vendor BeyondTrust, a provider of software for managing user access and privileges, as well as risk-assessment tools. BeyondTrust's PowerBroker software is especially popular among those looking to manage access rights in Microsoft's Active Directory.

BeyondTrust said it has 4,000 customers in a variety of industry sectors including government, technology, aerospace and defense, media/entertainment, telecommunications, healthcare/pharmaceutical, education and financial services. In addition to managing user access, BeyondTrust offers tools designed to defend against cyber attacks. Terms of the deal weren't disclosed.

Application performance management (APM) software supplier Compuware yesterday said it has agreed to be acquired by private equity firm Thoma Bravo for $2.5 billion. The deal represents a 12 percent premium over the company's closing price on Sept. 2 and 17 percent over Friday's close.

Compuware is a leading provider of APM software used to monitor the performance of mobile and enterprise application infrastructure, as well as a line of network testing and management tools. The company also offers a line of mainframe management tools.

Concur, whose popular SaaS software is used to manage business travel and entertainment expenses, saw its stock rise 6.3 percent yesterday, giving it a market cap of $6.1 billion. According to a Bloomberg report, Oracle has decided to pass, while Microsoft and SAP remain interested suitors; none of the companies has commented on a possible acquisition.

Posted by Jeffrey Schwartz on 09/03/2014 at 12:54 PM


Windows Phone Lacking Popular Apps

In my monthly Redmond View column I raised the question, "Can Windows Phone Defy Odds and Gain Share?" It's a valid question given the platform's recent falloff in shipments and the lack of widespread enthusiasm for Windows Phone.

Some of you disagree, but the numbers speak for themselves. Nevertheless, reader John Fitzgerald feels I need to go deeper. "Your article is pretty short on specifics," he wrote. "I have over 100 apps installed, including many useful ones and have no app gap in my life."

Because that was my monthly column for the print magazine, I was constrained by word count and couldn't elaborate. But Fitzgerald raised a valid point: If all the apps (or reasonable substitutes) you use on your iPhone or Android phone are available for Windows Phone, you're in business. Indeed, if Tyler McCabe, the millennial son of SMB Group Analyst Laurie McCabe, objected to the lack of apps on the Lumia 1520 (which he evaluated and described on his mother's blog), he didn't mention it. The computer engineering student is accustomed to iPhones and Android devices.

In today's mobile, consumer-driven world, apps do rule, at least for now. Looking at my iPhone, there are a number of apps I use daily that are not available on Windows Phone or whose functionality is different. Among them are the Starbucks app, which I use to pay for purchases and manage my account, and the MLB app to track baseball scores and news. When I board flights I use my iPhone to access my boarding pass; that's currently not an option with Windows Phone, though Delta and Southwest offer apps for the major phone platforms. When I want to check train schedules, New York's MTA offers an app for iOS and Google Play but not for Windows Phone. The United States Postal Service app I use to check rates and track packages is available only for iPhone, Android and BlackBerry. Google recently made a version of its Chrome Web browser for iOS, and the company said as recently as March that it was investigating bringing it to Windows Phone as well.

An app I use throughout the day to access articles on Newsday, my local newspaper, isn't available for Windows Phone and I was told no plans are under way to develop one. Nor does Newsday's parent company, Cablevision Systems, offer one to provide mobile DVR controls and other capabilities.

The bottom line is that if I were to switch to a Windows Phone, I'd be giving up a lot of convenience I've grown accustomed to over the years. If others feel the same way, that could explain why, despite some otherwise compelling features, Windows Phone faces an uphill battle to gain substantial share. Alas, I believe that ship has sailed.

Posted by Jeffrey Schwartz on 09/03/2014 at 12:55 PM


Ballmer's Departure: A Win for Microsoft and Clippers

Steve Ballmer's decision to step down from Microsoft's board six months after "retiring" as the company's CEO (announced a year ago tomorrow) has once again put him in the limelight in the IT world, at least for a few days. The spotlight is also on Ballmer in the sports world, now that he has shelled out an unprecedented $2 billion to purchase the Los Angeles Clippers and delivered a now-famous rally cry earlier this week about how his team will light the NBA on fire.

More power to him. I hope he brings the Clippers a championship ring -- maybe they'll become a dynasty. But at the same time, Ballmer's decision to step down from Microsoft's board appears to be a good move, whatever may have actually motivated him. Some may argue it's symbolic. Even though Ballmer is no longer on the board with voting rights, he's still the company's largest shareholder, and if he were to align himself with activist investors, he could make his already audible voice heard. As such, it would be foolish to write him off.

Following the news of his resignation from the board, I've seen a number of tweets and commentaries saying critics of Ballmer don't appreciate all he has done for Microsoft. Indeed, Microsoft performed well during most of his 13-year reign, with consistent year-over-year revenue and profit growth, and products such as SharePoint grew from nothing to multi-billion-dollar businesses. Yet under Ballmer's watch, Microsoft seemed to lose its way, and the company that once cut off Netscape's "air supply" lost its cachet to companies like Apple and Google.

To those who think Ballmer is unappreciated, did Microsoft perform well because of him or in spite of him? Consider this: Joe Torre's teams performed abysmally when he managed the Atlanta Braves, New York Mets and St. Louis Cardinals. Yet his four World Series rings and year-after-year playoff appearances as skipper of the Yankees brought him praise as a great manager and a place in the Hall of Fame. As CEO of Novell, Eric Schmidt was unable to return the one-time networking giant to its former glory. As CEO of Google, he was a rock star. If Novell had hired Ballmer away from Microsoft, could he have saved the company from its continued decline? We'll never know for sure.

In my view, it all started with Ballmer's ill-advised attempt to acquire Yahoo in 2008 for $45 billion. Yahoo CEO Jerry Yang's foolhardy refusal to accept that astronomically generous offer at the time probably saved Microsoft from disaster. In retrospect, perhaps Ballmer was desperate, whether he knew it or not at the time. In 2008, Microsoft was late to the party with a hypervisor, delivering the first iteration of Hyper-V even though VMware had already set the standard for server virtualization.

Also at the time, maybe he realized, consciously or otherwise, that shrugging off the iPhone a year earlier had sealed Microsoft's fate in the mobile phone business. Tablets and mobile phones were markets Microsoft could have owned, or at least become a major player in, had there been more vision. I remember Microsoft talking up technologies like the WinPad in the mid-to-late '90s and making feeble attempts to offer tablets that were effectively laptops that accepted pen input.

I wasn't planning to weigh in on Ballmer stepping down from the board until I read a review of the latest Windows Phone from HTC by Joanna Stern of The Wall Street Journal. The headline said it all: "HTC One for Windows: Another Great Phone You Probably Won't Buy." While praising the new phone, which is based on HTC's popular Android device, Stern effectively said that while Windows Phones are good, so were Sony Betamaxes (that's my analogy, not hers). Stern noted IDC's latest smartphone market share report, which shows Windows Phone sitting at a mere 2.5 percent share and not growing; in fact, it declined this quarter.

That got me thinking, once again, that the missteps with Windows Phone and Windows tablets all happened on the watch of Ballmer, who also let Apple and Google dominate the tablet market. To his credit, Ballmer acknowledged earlier this year that missing that market shift was his biggest regret. That still brings little consolation. Now Ballmer's successor, CEO Satya Nadella, is trying to move Microsoft forward with the hand he was dealt. Founder and former CEO Bill Gates sits on the board and is working with Nadella closely on strategic technical matters. Does Nadella also need the person whose mess he's cleaning up (including the fiefdoms that exist in Redmond) looking over his shoulder as a board member?

No one can overlook the passion Ballmer had at Microsoft, but his "developers, developers, developers" rant only went so far. Today most developers would rather program for iOS, Android and other platforms. Hence, it's fortunate that Ballmer has found a new passion. Maybe he can do with the Clippers what he couldn't do for Microsoft.

Posted by Jeffrey Schwartz on 08/22/2014 at 12:51 PM


Tech & Tools Watch, August 22: NetApp, Netwrix, Advanced Systems Concepts

NetApp Adds Hybrid Cloud Storage for Azure
NetApp Inc. is now offering a private storage solution that supports Microsoft private clouds and the Microsoft Azure cloud service. The new NetApp Private Storage for Microsoft Azure is designed to provide storage for hybrid clouds, letting organizations elastically extend storage capacity from their own datacenters to the public cloud service. The offering utilizes the FlexPod converged infrastructure solution, consisting of storage, compute and networking from Cisco Systems Inc. and NetApp, along with the Windows Server software stack. Organizations can create internal private clouds with the Microsoft Cloud OS, which combines Windows Server, System Center and the Windows Azure Pack. Those that require scale can use the Azure cloud service and Microsoft's new Azure ExpressRoute offering, which, through partners such as AT&T, BT, Equinix, Level 3 Communications and Verizon Communications, provides high-speed, dedicated and secure links rather than relying upon the public Internet. NetApp Private Storage for Microsoft Azure offers single- or multiple-region disaster recovery and uses the public cloud service only when failover is necessary or for planned scenarios such as testing. When used with System Center and the NetApp PowerShell Toolkit, customers can manage data mobility between the private cloud and on-site storage connected to the Azure cloud.

Netwrix Adds Compliance to Auditor
Organizations faced with meeting regulatory compliance requirements need to ensure their IT audits are in line with various standards. The new Netwrix Auditor for 5 Compliance Standards addresses five key standards: the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), the Sarbanes-Oxley Act (SOX), the Federal Information Security Management Act (FISMA) and the Gramm-Leach-Bliley Act (GLBA). Netwrix Corp. released the new compliance tool last month. It's designed to help IT pass compliance audits by providing the mandated reports from the Netwrix Auditor AuditArchive, which the company describes as two-tiered storage that can hold data for upward of 10 years. It tracks unauthorized changes to configurations, can issue real-time alerts and comes with 200 preconfigured reports. In addition, it monitors user activities and audits access control changes, which helps discover and preempt theft of confidential information. It builds on the built-in auditing capabilities of widely used platforms including Active Directory, Exchange Server, file servers, SharePoint, SQL Server, VMware and Windows Server, though the company said it covers others, as well.

Advanced Systems Concepts Ties to Configuration Manager
Advanced Systems Concepts Inc. is now offering an extension for Microsoft System Center 2012 R2 Configuration Manager. Along with the company's other System Center extensions, the new release provides common management across on-premises, service provider and Azure environments. By integrating ActiveBatch with Configuration Manager, the company said, IT organizations can reduce administration requirements and lower the overall cost of PC ownership by automating functions such as OS deployment, patch management, client device management and other systems administration tasks. It's designed to eliminate the need for scripting. Among the Configuration Manager capabilities it offers are production-ready job steps covering commonly used Configuration Manager objects and functions, such as building packages, deployment programs and folders; other job steps let administrators modify and delete packages, as well as assign them to distribution points. ActiveBatch already supports most other key components of System Center, with extensions for System Center Operations Manager, System Center Service Manager, System Center Orchestrator and System Center Virtual Machine Manager, as well as Active Directory, SQL Server, Exchange, Azure and Windows Server with the latest rev of Hyper-V. It also works with other key platforms including Amazon Web Services EC2 and VMware ESX.

 

Posted by Jeffrey Schwartz on 08/22/2014 at 11:43 AM


Microsoft To Revive Physical-to-Virtual Conversions in Migration Tool

Microsoft will release the third version of its Virtual Machine Converter (MVMC) this fall, and the key new feature will be the return of support for physical-to-virtual (P2V) conversions. MVMC is Microsoft's free tool for migrating VMware virtual machines to Hyper-V; when the company released version 2.0 earlier this year, it removed P2V support and allowed only virtual-to-virtual (V2V) conversions.

That "disappointed" some customers, said Matt McSpirit, a Microsoft senior technical product marketing manager on the company's cloud and enterprise marketing group. McSpirit joined me in a panel discussion on choosing a hypervisor at last week's TechMentor conference, held on the Microsoft campus in Redmond. While the panel covered many interesting implications of converting to Hyper-V, which I will break out in future posts, one key area of discussion was converting existing VMware VMs to Hyper-V.

Released in late March, Microsoft Virtual Machine Converter 2.0 upped the ante over the inaugural release by adding support for vCenter and ESX 5.5, VMware virtual hardware versions 4 through 10, and Linux guest OS migrations, including CentOS, Debian, Oracle, Red Hat Enterprise, SuSE Enterprise and Ubuntu, as I reported at the time. It also added an on-premises VM to Azure VM conversion tool for migrating VMware VMs directly to Azure.

Another key feature is its native PowerShell interface, enabling customers and partners to script key MVMC 2.0 tasks. These scripts can also be integrated with other automation and workflow tools such as System Center Orchestrator, among others. In addition, Microsoft has released the Migration Automation Toolkit for MVMC 2.0, which is a series of prebuilt PowerShell scripts to drive scaled MVMC 2.0 conversions without any dependency on automation and workflow tools.

The reason for dropping P2V support in that release was apparently to emphasize that MVMC is a VM-to-VM conversion tool. During the panel discussion, McSpirit described P2V as being focused less on virtual-to-virtual conversion and more on converting physical machines.

"A while back, we had within System Center a P2V capability," he explained. "So for customers who were just getting started with virtualization or have some workloads they need to convert from the physical world, we have P2V built into VMM [System Center Virtual Machine Manager]. So as an admin, you've got your Hyper-V host being managed, [you] select a physical server that will actually convert either online or offline, and VMM will handle [the conversion to a virtual machine]  and bring it [into its management tool]. And that functionality was deprecated in 2012 R2 and removed.  And thus, for a period of time, no P2V tool from Microsoft. Yes there's disk to VHD and some other tools but no [fully] supported, production ready tool [from Microsoft, though there are third-party tools]."

In a follow-up e-mail to clarify that point, McSpirit noted that "P2V stands for physical to virtual, thus by definition, it's less focused on virtual conversion and more focused on physical to virtual, but that's not to say it can't be used for converting VMs." He added: "The P2V wizard in the previous release of System Center Virtual Machine Manager (2012 SP1) could still be pointed at an existing VMware virtual machine, after all, regardless of whether it's converting a physical machine or not, it's the OS and its data that's being captured, not the hardware itself. Thus, you could use P2V to do a V2V."

Microsoft confirmed in April that P2V was planned for the fall release. "With MVMC 3, that comes in the fall, P2V is coming back into that, which pleases a lot of customers because it doesn't have that requirement for System Center, which for smaller environments is more applicable, and it enables you to perform P2Vs with a supported tool," McSpirit said.

Update: Just to clarify, MVMC never had P2V functionality. Rather it was offered in System Center 2012 SP1 Virtual Machine Manager (and earlier). When McSpirit said P2V was deprecated, he was referring to System Center 2012 R2 Virtual Machine Manager, which offered V2V only. With the MVMC 3.0 release this fall, P2V will once again be available.

The session confirmed what many already know: MVMC is a good tool for converting a handful of VMs, but it still requires manual configuration and "lacks any sort of bulk conversion mechanism (unless you want to script the conversion process through Windows PowerShell)," wrote Brien Posey in a tutorial outlining how to use MVMC 2.0 earlier this year.

But if you plan to migrate numerous VMware VMs to Hyper-V, you may want to consider third-party tools from the likes of Vision Solutions, NetApp and 5nine Software.

Are you migrating your VMware VMs to Hyper-V? If so, are you using MVMC or one of the third-party tools?

Posted by Jeffrey Schwartz on 08/20/2014 at 9:22 AM


No Plans for Internet Explorer on iOS and Android

Over the weekend I downloaded the Chrome Web browser on my iPad after it showed up as a suggested download. Somehow I forgot Chrome is now available on iOS and I was thrilled to see I could not only run the browser, but instantly have access to my bookmarks and Web browsing history from other PCs on which I work. Then I wondered if Internet Explorer would ever find its way into the iTunes App Store. After all, Office is now available on iOS, as well as other popular Microsoft offerings. However, it doesn't appear that Internet Explorer on iOS is in the cards.

Apparently, Microsoft poured cold water on that idea last week during a Reddit Ask Me Anything (AMA) chat. As reported by "All About Microsoft" blogger Mary Jo Foley, Internet Explorer is one of the Redmond offerings that the company won't deliver on iOS or Android. "Right now, we're focused on building a great mobile browser for Windows Phone and have made some great progress lately," said Charles Morris, a member of the Internet Explorer platform team, during the AMA chat. "So, no current plans for Android/iOS."

That's unfortunate, especially given the rate of growth for Windows Phone, whose shipments actually dropped last quarter. It appears Microsoft is still fixated on the notion that its Internet Explorer Web browser is inextricably tied to Windows.

Also, as noted by Forbes, the Internet Explorer team acknowledged during the AMA chat that Microsoft has considered renaming the browser, presumably to make it more hip. That's a questionable idea considering Internet Explorer is still the most widely used browser, especially among enterprises.

When asked about the potential for a name change, Jonathan Sampson, also on the Internet Explorer platform team, responded: "It's been suggested internally; I remember a particularly long e-mail thread where numerous people were passionately debating it. Plenty of ideas get kicked around about how we can separate ourselves from negative perceptions that no longer reflect our product today."

Asked why the company didn't just change the name, Sampson responded: "The discussion I recall seeing was a very recent one (just a few weeks ago). Who knows what the future holds." Changing the name, of course, won't do much to help the browser's reputation if it's known for the same problems it has had in the past -- namely security and privacy flaws.

If something has a very bad reputation, sometimes rebranding it is the only course of action. But as the old saying goes, "a rose is still a rose ..." Though it no longer dominates the way it once did, Internet Explorer still has the largest market share, according to Net Applications. Even if its share continues its incremental decline, Internet Explorer is a well-known browser and not merely for its flaws. People who don't use Microsoft products, particularly those who prefer Apple offerings or the LAMP stack, aren't going to be moved by a new name for anything coming out of Redmond. Produce a browser that has fewer flaws and advanced features and it'll be popular no matter what it's called.

Regardless of the name, would making Internet Explorer available on iOS and Android stem any declines for a platform that's still widely used, or would it accelerate them? Perhaps the company sees Internet Explorer as critical to holding onto the Windows franchise. If that's the case, the company might want to wait. In the meantime, I look forward to using the Chrome browser on my iPad and will stick to using Internet Explorer only on Windows.

 

Posted by Jeffrey Schwartz on 08/18/2014 at 1:31 PM


Why IT Pros Should Prepare for Microsoft's Stealth Library OS

Could there be a day when the desktop or mobile operating system you use and developers program to don't matter? The age-old question may not be answered any time soon and don't expect to see any of this in Microsoft's next client OS. But IT pros should prepare themselves for the possibility that Microsoft or other players may someday successfully commercialize a "library operating system" where developers rely primarily on APIs to run their applications, not a client OS or even a virtual machine.

While researchers have bandied about the concept of a library OS for some time, Microsoft Research revealed its own work on one back in 2011 with a project called Drawbridge. Microsoft Research Partner Manager Galen Hunt joined others at the time to outline in detail their working prototype of a Windows 7 library operating system that ran then-current releases of Excel, PowerPoint and Internet Explorer. In this desktop prototype, they said, securely isolated library operating system instances achieved desktop sharing through the pragmatic reuse of networking protocols.

"Each instance has significantly lower overhead than a full VM bundled with an application: a typical application adds just 16MB of working set and 64MB of disk footprint," according to a paper published by the widely regarded Associated for Computer Machinery (ACM) at the time. "We contribute a new ABI [application binary interface] below the library OS that enables application mobility. We also show that our library OS can address many of the current uses of hardware virtual machines at a fraction of the overheads."

Microsoft has said little about its library OS efforts since then. Yet three years later, Microsoft's hold on the desktop has been diminished by the rapid growth of BYOD. While Microsoft hopes to revive enthusiasm for Windows with a new release next year, others believe it may be too little, too late. That remains to be seen, of course. Nevertheless, Microsoft could have a lot to gain by someday delivering a library OS.

Don Jones, a Microsoft MVP and Redmond magazine columnist who recently joined IT training firm Pluralsight, revived the argument that a library OS could be the right approach for Microsoft to take. And he did so right in Microsoft's house. Speaking in a keynote address at the TechMentor conference this week at the Microsoft Conference Center on the Redmond campus, Jones explained how a library OS could reduce existing operating systems to the firmware layer, putting the emphasis on APIs and application delivery rather than low-level services.

"We tend to say that developers develop for a specific operating system -- iOS, Android, Windows, Macintosh or Linux," Jones told the audience of IT pros. "It's not actually true. They don't. What they are writing for is an operating environment. What they are writing for is a set of APIs."

The APIs developers code against today are tied to a specific operating system, whether it's Apple's interfaces to iOS or Microsoft's .NET Framework. Efforts to make APIs that work across operating systems have remained largely unsuccessful, Jones noted.

"Even when we make efforts to duplicate those APIs for other operating systems with projects like Mono, attempting to duplicate .NET onto the Linux platform, they are rarely successful because the ties between the APIs and the operating system are so integrated," Jones said. "It's really tough to pull them apart and duplicate them elsewhere."

Jones called for a smarter BIOS so that when users turn on their computers, the firmware knows how to access the network, write to a display or disk, and provide other basic low-level services, including CPU and memory access and storing user credentials in a secure area.

"This is going to sound crazy but if you take all of Windows and you refactor it into a bunch of APIs that talk to that low-level firmware and if a developer wants to code to those, he or she can," Jones said. "But at the same time, if someone is more comfortable programming against a Linux set of APIs, there's no reason those same APIs can't live on that same machine and run in parallel at the same time because it's not taking over the machine, it's the firmware that runs everything. These are just different ways of getting at that firmware that provides a different comfort level for developers."

In this type of scenario, Jones added: "Windows essentially becomes a set of DLLs, which essentially is not that far from what it is. So we move the operating system to a lower level and everything else is a set of APIs that run on top of that. Meaning you gain the ability to run multiple operating system personalities with APIs side by side because none of them owns the machine."

Can, or will, Microsoft bring this to market? Certainly Hunt believed it's feasible. "Our experience shows that the long-promised benefits of the library OS approach -- better protection of system integrity and rapid system evolution -- are readily obtainable," he wrote at the time.

Jones said not to expect it anytime soon, and warned it's possible it may never see the light of day. But he pointed to some meaningful reasons why Microsoft could be leaning in that direction, or at the very least could benefit by doing so sometime in the future. Perhaps the most obvious is that the company's key emphasis and investment is on building out the Microsoft Azure cloud and the Windows Server/System Center-based cloud OS that underpins it.

Jones also pointed to Microsoft's new Azure RemoteApp service, which the company announced at TechEd in May and is available in preview. Microsoft Azure RemoteApp, an application delivery service, will let IT deliver Windows applications to almost any device including Windows tablets, Macs, iPads and Android tablets via the cloud. It will work with 1,300 Windows apps, Microsoft said at the time, though the company has yet to announce its plans to offer Azure RemoteApp commercially.

"RemoteApp delivers an app, not a desktop, not an operating system," Jones said. "What the user gets is an app, because the user has been trained that's all they need." Jones also likes the idea of a library OS because it eliminates virtualization, he added. "We don't have to have an operating system and a virtual machine rolling it up. We just have these little API silos," he said. "Suddenly the idea of running applications in the cloud and delivering them becomes a lot clearer and a lot easier and you can get a lot better densities on your hosts."

The role of desktop administrators may not disappear overnight, but it would not be a bad idea for them to study and gain expertise on the datacenter side of things if this shift comes to pass. As Jones put it: "Look at what's happening in your environment, and whether you think it's a good idea or not, try to predict what you think Microsoft will or might do and then try to make a couple of side investments to line yourself up so you can ride the bus instead of being hit by it."

Would you like to see Microsoft deliver a library OS? Share your views on the potential benefits or pitfalls of such a computing model.

Posted by Jeffrey Schwartz on 08/15/2014 at 12:01 PM


Microsoft Cloud Exec Says Azure Is Putting the Squeeze on Amazon

Growing use of the Microsoft Azure cloud service, rapidly expanding features and a price war are putting the squeeze on cloud market leader Amazon Web Services, according to Microsoft Technical Fellow in the Cloud and Enterprise Division Mark Russinovich.

In the opening keynote at the TechMentor conference at the Microsoft Conference Center on the company's Redmond campus, Russinovich showcased the edge Microsoft believes it has over its key rival. He argued that the squeeze contributed to Amazon's disappointing quarterly earnings report last month, which rattled investors. "Amazon Web Services is struggling in the face of pricing pressure by us and Google," Russinovich said. "People are starting to realize we're in it to win."

Indeed, market researchers last month pointed to Microsoft and IBM as the vendors gaining the most ground on Amazon in the public cloud. Yet when TechMentor attendees were asked whether they had logged into the Microsoft Azure Portal, only about one-quarter of the IT pros in the audience said they had. Disclosure: TechMentor, like Redmond magazine, is produced by 1105 Media.

Pointing to the growth of Azure, Russinovich showcased some recently released figures. The service hosts 300,000 active Web sites and 1 million active database services. The Azure Storage service has just surpassed 30 trillion objects with 3 million active requests per second at peak times. Russinovich also said that 57 percent of the Fortune 500 companies are using Azure. And Azure Active Directory, which organizations can federate with their on-premises versions of Active Directory, now has 300 million accounts and 13 billion authentication requests per week.

Russinovich emphasized Microsoft's advantage with Azure Active Directory and the cloud service's emphasis on identity. "Azure Active Directory is the center of our universe," Russinovich said. "When you take a look at all the places where you need identity, whether it's a [third-party] SaaS service or whether Microsoft's own, like Office 365, you look at custom line-of-business apps that enterprises are developing and they want to integrate their own identity services."

Throughout his talk, Russinovich emphasized Microsoft's focus on its hybrid cloud delivery model and the security layers that can extend to the public Azure cloud. To that point, Azure Active Directory's role in supporting hybrid clouds is "it supports ways you can foist identities to the cloud so that they're accessible from all of these targets, line-of-business apps in the cloud and Microsoft's own cloud platforms."

The primary way to achieve that is to federate an Azure Active Directory tenant with an on-premises Active Directory service, he said. "What that means is when someone goes and authenticates, they get pointed at Azure Active Directory in the cloud. Their identities are already there and the authentication flow goes to an Active Directory federated server on-premises that verifies the password and produces the token."  It uses OAuth 2.0 for authentication, he said.

Microsoft plans to integrate OAuth 2.0 into Windows Server and all Azure services, though Russinovich noted Azure Active Directory also supports other token formats, such as SAML, as well as consumer identity providers such as Facebook and Google.
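
To make the OAuth 2.0 piece more concrete, here is a minimal sketch of a client-credentials token request against an Azure Active Directory tenant. The tenant name, client ID, secret and resource URI below are placeholders rather than values from any real deployment, and a production application would typically use Microsoft's client libraries rather than raw HTTP.

    # Minimal sketch of an OAuth 2.0 client-credentials request against Azure AD.
    # The tenant, client_id, client_secret and resource values are placeholders;
    # substitute values from your own Azure AD tenant.
    import json
    import urllib.parse
    import urllib.request

    TENANT = "contoso.onmicrosoft.com"   # placeholder tenant
    TOKEN_URL = f"https://login.microsoftonline.com/{TENANT}/oauth2/token"

    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": "00000000-0000-0000-0000-000000000000",  # placeholder app ID
        "client_secret": "<app-secret>",                       # placeholder secret
        "resource": "https://graph.windows.net",               # API being called
    }).encode()

    request = urllib.request.Request(TOKEN_URL, data=body, method="POST")
    with urllib.request.urlopen(request) as response:
        token = json.load(response)

    # The returned access token is presented as a bearer token in the
    # Authorization header of subsequent calls to the target API.
    print(token["token_type"], token["access_token"][:40], "...")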

One area Microsoft needs to deliver on for Azure is role-based access control, which is in preview now, Russinovich said. "This will be fleshed out so we have a consistent authorization story all pointing to Azure Active Directory, which lets you not just connect on-premises to Active Directory through [Active Directory Federation Services], but these third-party consumer services, as well. That's a key point, you want identity to be enabled in the cloud," he said.

Another area where Russinovich said Microsoft has shown leadership in the cloud is the company's refusal to give in to law enforcement efforts to subpoena user data; he alluded to last week's order to turn over data stored in a Dublin datacenter, which Microsoft said it will appeal. "Brad Smith, our head of legal, is probably now the face of the tech industry going out and fighting the United States government's efforts to get access to data that they probably won't get access to," he said. "We're really trying to make sure we're a great partner for customers that care about data privacy."

When it comes to the public cloud, Russinovich discussed how there's significant demand for cloud services such as storage and disaster recovery. Russinovich demonstrated a few offerings including the new Azure ExpressRoute, which through third-party ISPs and colocation operators, can provide high-speed dedicated private connections between two datacenters using Azure as a bridge or to connect directly to Azure. Other demos included point-to-site VPN and Web apps using Azure.

Russinovich also gave a nod to Azure's broad support for those who want to create Windows PowerShell scripts to add more automation. "Everything in Azure is PowerShell-enabled because Jeffrey Snover [its revered inventor and Microsoft distinguished engineer and lead architect] makes us enable everything from PowerShell," Russinovich quipped.

During the evening reception at TechMentor, I reconnected with some attendees who had told me the previous evening they wanted to learn more about Azure. Some saw opportunities, but others weren't sure how it fits into what they're doing. Nevertheless, most said they were surprised to learn how Azure has advanced. This is exactly what Russinovich aimed to do in his talk -- to make a case that Microsoft has more to offer than Amazon and other cloud providers.

To be sure, Amazon still has a growing cloud business and is seen as the leader for enterprise Infrastructure as a Service. As noted by investment researcher Zacks, Amazon reported  a 90 percent usage growth for its AWS cloud business. While Amazon doesn't break out revenues and profits for AWS, it falls in the category of "Other," now accounting for 6 percent of revenues. Experts believe AWS generated the majority of revenues in that "Other" category.

"AWS continues to launch new services and enhance the security of its services," according to the Zacks report. "Amazon again reduced prices significantly, which was the reason for the revenue decline in the last quarter."

By all indications here at TechMentor, and based on Microsoft's messaging at the recent Worldwide Partner Conference (and TechEd before that), the company is not letting up on its Azure push. And while many attendees saw for the first time the depth of the progress Microsoft has made with Azure, Rome wasn't built in a day.

 

Posted by Jeffrey Schwartz on 08/13/2014 at 11:34 AM


Google To Reward Web Sites That Boost Security

Google wants Web sites to become more secure and said Wednesday it will do its part by motivating organizations to build stronger encryption for their sites. The company is offering a pretty significant incentive: it will reward those that add support for Transport Layer Security (TLS), the protocol behind HTTPS encryption, by ranking them higher than sites that lack it. Another way to look at it is that Google will punish those that lack the extra encryption.

It's always troubling to hear reports that allege Google is playing with its search algorithm in a way that can unfairly benefit some to the detriment of others. Given its dominance in search, any action, real or perceived, places it under scrutiny and risk of regulators getting on the company's case.

Yet one could argue Google is now putting a stake in the ground in the interest of everyone who uses the Web. By pushing sites to implement stronger encryption via TLS, the company is using its clout to make the Web a safer place. This could have major consequences for the many businesses that live and die by how well they appear in Google search results, especially those that invest in search engine optimization (SEO). But Google is doing so by trying to force those with insecure sites to step up and implement TLS. While not a panacea, it's an improvement.

Google has talked up "HTTPS by default" for years, meaning Search, Gmail and Google Drive automatically use secure connections to Google's sites. At its recent Google I/O developer conference, the company introduced its HTTPS Everywhere push. Webmaster trends analysts Zineb Ait Bahajji and Gary Illyes explained in a post Wednesday how the company plans to rank sites based on their HTTPS/TLS support.

"Over the past few months we've been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms," they wrote. "We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal -- affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content -- while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we'd like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the Web."

In the coming weeks Google said it will publish detailed best practices on how to make it easier to implement TLS at its help center. In the meantime, Google offered the following tips:

  • Decide the kind of certificate you need: single, multi-domain or wildcard certificate.
  • Use 2048-bit key certificates.
  • Use relative URLs for resources that reside on the same secure domain.
  • Use protocol relative URLs for all other domains.
  • Check out our Site move article for more guidelines on how to change your Web site's address.
  • Don't block your HTTPS site from crawling using robots.txt.
  • Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.

Google also recommends that those with sites already serving HTTPS test their security levels and configuration with the Qualys SSL Server Test tool.
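
As a rough illustration of the kind of checks a webmaster could automate while following these guidelines, here is a minimal Python sketch (the host name is a placeholder) that verifies a site answers over HTTPS with a certificate that passes standard validation, and that its robots.txt doesn't block crawling of the secure pages. It's no substitute for the Qualys test, which exercises the TLS configuration far more thoroughly.

    # Minimal sketch: verify a site serves HTTPS with a valid certificate and
    # that robots.txt does not block crawling of the secure pages.
    # "example.com" is a placeholder host.
    import socket
    import ssl
    import urllib.robotparser

    HOST = "example.com"

    def has_valid_https(host: str, port: int = 443) -> bool:
        """Open a TLS connection with certificate and hostname verification."""
        context = ssl.create_default_context()
        try:
            with socket.create_connection((host, port), timeout=10) as sock:
                with context.wrap_socket(sock, server_hostname=host) as tls:
                    return tls.version() is not None
        except OSError:
            return False

    def crawl_allowed(host: str, path: str = "/") -> bool:
        """Check robots.txt on the HTTPS site for a generic crawler."""
        parser = urllib.robotparser.RobotFileParser()
        parser.set_url(f"https://{host}/robots.txt")
        parser.read()
        return parser.can_fetch("*", f"https://{host}{path}")

    if __name__ == "__main__":
        print("Valid HTTPS:", has_valid_https(HOST))
        print("Crawling allowed over HTTPS:", crawl_allowed(HOST))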

What's your take on Google's effort to force the hand of organizations to make their sites more secure? Is it a heavy handed and unfair move by taking advantage of its search dominance or an altruistic use of its clout that could make the Web safer for everyone?

Posted by Jeffrey Schwartz on 08/08/2014 at 12:34 PM


Microsoft's NFL Surface Deal: Touchdown or Hail Mary?

Microsoft has extended its relationship with the NFL in a new deal for its Surface Pro 3 tablet-PCs to be used by coaches on the sidelines during games. Viewers of NFL games this season will prominently see team personnel and coaches using Surface Pro 3 devices to review images of plays on screen rather than on printed photos.

The NFL struck a deal last year with Microsoft for the Surface to be the league's official tablet. Now a new $400 million deal calls for coaches to use Surface Pro 3s on the sidelines during games for five years, according to a Wall Street Journal blog post.

Microsoft said the Surface devices used by teams are equipped with the Sideline Viewing System (SVS). The app, developed by Microsoft, picks up the same digital feed previously sent to the printers. By replacing the printouts with digital images sent to the tablets, coaches will get them faster (four to five seconds versus 30 seconds) and have the ability to use the high resolution touch display to zoom in on a portion of the image and make annotations, Microsoft said.

While coaches will be able to receive images, the tablets won't have access to the public Internet and team personnel won't be permitted to use the devices to take photos or videos during games. It remains to be seen whether teams in future seasons will use the Surface Pro 3s for making play-by-play decisions.

Perhaps more telling will be whether the promotional value of this deal boosts sales of the Surface line of tablets. Certainly the vast viewership of NFL games gives the brand a nice boost. But as Computerworld reported Monday, Microsoft's losses on Surface have swelled to $1.7 billion since the line's 2012 launch, according to a July 22 8-K filing. The publication also calculated that Microsoft lost $363 million on the tablets in the last fiscal quarter, the largest quarterly loss since the company began reporting Surface revenues. Hence, Microsoft needs to score many more deals like this to keep the Surface in the game.

What's your take on this deal? Did Microsoft score big or was this just a Hail Mary pass?

 

Posted by Jeffrey Schwartz on 08/06/2014 at 11:00 AM


Court Orders Microsoft To Turn Over E-Mail in Dublin Datacenter

In a setback for U.S. cloud providers looking to ensure the privacy of data stored in foreign countries, a search warrant ordering Microsoft to turn over e-mail stored in its Dublin, Ireland datacenter was upheld. Judge Loretta Preska of the U.S. District Court for the Southern District of New York upheld a magistrate judge's earlier ruling approving the search warrant. The identity and location of the suspect, who is wanted in connection with an illegal drug-related matter, are not known. Microsoft has said it will appeal last week's ruling.

Several tech companies, including Apple, AT&T, Cisco and Verizon, along with the Electronic Frontier Foundation, had filed briefs in support of Microsoft's appeal of the warrant, which was argued during a two-hour hearing held Tuesday, according to published reports. The outcome of this case could set a precedent for all U.S.-based cloud providers storing data abroad.

"Under the Fourth Amendment of the U.S. Constitution, users have a right to keep their e-mail communications private," wrote Microsoft Chief Counsel Brad Smith in April. "We need our government to uphold Constitutional privacy protections and adhere to the privacy rules established by law. We're convinced that the law and the U.S. Constitution are on our side, and we are committed to pursuing this case as far and as long as needed."

In an OpEd piece published in The Wall Street Journal last week prior to the hearing, Smith furthered that view. "Microsoft believes you own e-mails stored in the cloud, and they have the same privacy protection as paper letters sent by mail," he said. "This means, in our view, that the U.S. government can obtain e-mails only subject to the full legal protections of the Constitution's Fourth Amendment. It means, in this case, that the U.S. Government must have a warrant. But under well-established case law, a search warrant cannot reach beyond U.S. shores."

In upholding the warrant, Judge Preska argued that's not the point. "It is a question of control, not a question of the location of that information," Preska said, according to a report by The Guardian. Hanni Fakhoury, staff attorney for the Electronic Frontier Foundation, told Computerworld he wasn't surprised by the ruling, saying the matter ultimately will be decided by the Second Circuit Court of Appeals. "I hope the Second Circuit looks closely at the magistrate's reasoning and realizes that its decision radically rewrote the Stored Communications Act when it interpreted 'warrant' to not capture all of the limitations inherent in a warrant, including extraterritoriality," he said.

The outcome of this case will have significant ramifications on the privacy of data stored in foreign datacenters. But that's not all Microsoft and its supporters have at stake. Should the warrant ultimately be upheld, The Guardian noted, U.S. companies are concerned they could lose billions of dollars in revenues to foreign competitors not subject to seizure by U.S. law enforcement agencies.

Posted by Jeffrey Schwartz on 08/04/2014 at 12:52 PM


Entry-Level and Higher-End Surface Pro 3 Models Now Available

If you've been holding off on buying Microsoft's Surface Pro 3, awaiting either a less expensive version or the higher-end model with an Intel Core i7 processor, they're now available. Microsoft announced the release of the new Surface Pro 3 models today.

The first units started shipping in June and featured mid-range models with Core i5 processors, priced between $999 and $1,299 (not including the $130 removable keyboard). Now the other units are available, as previously indicated. The unit with a Core i3 processor is priced at $799, while the Core i7 models, targeted at those with high-performance requirements such as Adobe Photoshop or computer-aided design work, will run you quite a bit more. A Surface Pro 3 with a 256GB solid-state drive and 8GB of RAM costs $1,559; if you want to go whole-hog, one with a 512GB SSD will set you back $1,949.

Microsoft also said the $200 docking station for the Surface Pro 3 is on pace to ship in two weeks. The dock will have five USB ports (three USB 3.0 and two USB 2.0), an Ethernet port and a Mini DisplayPort that supports resolutions up to 3840 x 2600.

If you missed it, here's my first take on the mid-range Surface Pro 3. Have you looked at the Surface Pro 3? What's your take? Is it a device you're considering or are you sticking with a device from one of Microsoft's OEM partners?

 

Posted by Jeffrey Schwartz on 08/01/2014 at 11:39 AM


Microsoft Improves Office for the iPad

Microsoft has substantial plans for its flagship Office suite, as noted by Mary Jo Foley in her August Redmond magazine column. But for now its version for the iPad is getting all the love. Just four months after the long-awaited release of Office for iPad, Microsoft has upgraded it with some noteworthy new features. The 1.1 versions of Word, Excel and PowerPoint are now available in Apple's iTunes App Store. Microsoft also updated OneNote for the iPad with its version 2.3 release.

While Microsoft added new features specific to the three key components of Office, the most noteworthy addition across the board is the ability to let users save files in the Adobe PDF format (even for those who don't have Office 365 subscriptions). That was one of the three top feature requests, according to a post in Microsoft's Office Blog announcing the upgrade. Second is the ability to edit photos on the iPad, which users can now do in Word. Though limited, the photo editing feature enables cropping and resetting. The third key feature is support for third-party fonts.

In Excel, Microsoft has added new PivotTable capabilities that let users sort, filter, expand, collapse, display details and refresh PivotTables in a workbook, as well as change the way they're displayed. Another new feature called "flick gesture" aims to simplify working with workbooks, including selecting large ranges of data. "Simply grab the selection handle, flick it in any direction and Excel will automatically select from where you started to the next blank cell," read the blog post, adding that if "you're at the top of a column of data and want to select all the way to the bottom, just flick down and the column is selected automatically."

The new PowerPoint app supports the use of multimedia capabilities on the iPad, specifically audio and video embedded into slides. It also lets users insert videos and photos from their iPad camera rolls. When projecting a presentation from the iPad, users can now enable Presenter View to see their own notes on the device.

Microsoft pointed out this is part of its new effort to deliver continuous updates to Office for iPad. When you download the update, don't do it when you're in a rush -- give yourself about 15 to 30 minutes, depending on your network. I found Word and Excel in the Update section, though I had to search the store for the PowerPoint version.

Share your thoughts on the new update.

Posted by Jeffrey Schwartz on 08/01/2014 at 11:35 AM


Microsoft Offices Raided Throughout China

In its latest apparent effort to rein in large U.S. tech companies from expanding their presence in China, government investigators in China this week raided Microsoft offices throughout the country, according to several reports.

The raids by China's State Administration for Industry and Commerce included offices in Beijing, Shanghai, Guangzhou and Chengdu, according to a report Monday in The New York Times, citing a spokeswoman who declined to elaborate due to the sensitivity of the issue. Another Microsoft spokesman told The Wall Street Journal that the company's business practices are designed to comply with Chinese law.

The raids, in which investigators accused Microsoft of monopolistic practices and other, less clearly defined violations, are the latest salvo in tensions between the two countries that have escalated over the past few years amid spying, malware attacks and hacking allegations. The move could also be retaliation by the Chinese government following indictments by the U.S. government in May that charged five Chinese Army officers with cyber attacks, The New York Times added.

The raids follow a visit to China last week by Qualcomm CEO Steven Mollenkopf, according to the report, which said he held talks with government officials and announced a $150 million "strategic venture fund" to invest in Chinese technology start-up companies, though it's unclear whether the visit sparked the escalation against Microsoft.

Microsoft, which, like many U.S. corporations, has identified China as one of its largest growth markets, is not the only company in the country's crosshairs these days. Government officials have also started to scrutinize Chinese banks' reliance on IBM mainframes and servers, though Big Blue has agreed to sell its x86-based System x server business to Lenovo for $2.3 billion. Apple, Cisco and Google have also faced heightened scrutiny, according to reports.

Approximately 100 investigators raided Microsoft's offices in China, according to The Wall Street Journal, which reported Tuesday that China's State Administration for Industry and Commerce vaguely accused Microsoft of not disclosing certain security features and how the company integrates its various products. China's government, through state-controlled news outlet CCTV, has raised concerns about the security of Windows 8 and also has deemed Apple's iPhone as a danger to the country's national security.

Disclosures by Edward Snowden of U.S. surveillance efforts have also heightened concerns within China's government, several reports noted. Yet some wonder if the moves are more about protectionism for companies based in China than about security concerns.

 

Posted by Jeffrey Schwartz on 07/30/2014 at 12:37 PM


Will Activist Investor Push EMC To Spin Off VMware?

Hedge fund Elliott Management has taken a $1 billion stake in storage giant EMC, according to a report in The Wall Street Journal last week. The activist investor, with an estimated $25 billion under management, is reportedly pushing the company to spin off VMware and Pivotal.

EMC has long opposed spinning off VMware, a move that has often been floated. It most recently came up in late May, when two Wells Fargo analysts advocated the spinoff. Elliott is among those impatient with EMC's stock performance, which has risen only 163 percent over the past decade, compared with the Dow Jones U.S. Computer Hardware Index's 309 percent gain.

There's no reason to believe EMC CEO Joe Tucci, whose retirement is planned for next year, would now welcome such a move, according to the report. Having VMware and Pivotal operate as separate companies in which EMC holds a majority stake lets it pursue its "federation strategy," which allows EMC, VMware, Pivotal and RSA to work together or individually with fewer competitive conflicts. An EMC spokesman told The Journal the company always welcomes discussions with shareholders but had no further comment. Elliott Management did not respond to an inquiry.

Freeing EMC of those companies could make it a potential takeover target for the likes of Cisco, Hewlett-Packard or Oracle, the report noted, citing EMC's 31 percent share of the external storage market. EMC's stake in VMware is approximately 80 percent.

Elliott Management has taken stakes in many companies, including BMC, Juniper and EMC rival NetApp. The firm also made a $2 billion bid to acquire Novell in 2010, which Novell rejected; Novell was later acquired by Attachmate.

Posted by Jeffrey Schwartz on 07/28/2014 at 12:29 PM


Veeam Releases Management Pack for Hyper-V and Microsoft's System Center Operations Manager

Veeam this week released a new management tool that provides visibility into Hyper-V as well as VMware environments. While previous versions of the Veeam Management Pack supported only VMware, the new v7 release provides common visibility into and management of Hyper-V, Microsoft's hypervisor. Administrators can use the Veeam Management Pack from within System Center Operations Manager.

Veeam had demonstrated a near-ready release at the TechEd conference in Houston back in May, among other System Center Operations Manager management packs. Within System Center Operations Manager, the new Veeam Management Pack version 7 for System Center offers a common dashboard that provides monitoring, capacity planning and reporting for organizations using Veeam Backup & Replication.

With the new management pack, System Center Operations Manager administrators can manage both their vSphere and Hyper-V environments together, with complete visibility into physical and virtual components and their dependencies. In addition to offering deeper visibility into both hypervisors within a given infrastructure, the new Veeam Management Pack provides contextual views using color-coded heat maps for various metrics, along with real-time data feeds.

It also lets administrators manage the Veeam Backup & Replication for Hyper-V platform to determine if and when a host or virtual machine (VM) is at risk of running out of storage capacity, Doug Hazelman, the company's vice president of product strategy, said during a meeting at TechEd. "We provide views on networking, storage, heat maps -- the smart analysis monitors, as we call them," Hazelman said. "This is something you don't see in general in System Center."

If memory pressure on a specific VM is too high, the Veeam Management Pack can analyze the environment -- host metrics, the VM's properties, whether the VM is configured with too little memory or whether the host has exhausted its resources -- and provide a dynamic recommendation. While administrators typically default to the Windows Task Manager to gauge utilization of CPU, memory and other common resources on a physical server, Hazelman pointed out that the utility isn't designed to do so for VMs. The Veeam Task Manager addresses that.

The new release will be available with two licensing options. The Enterprise Plus Edition provides real-time forecasting, monitoring and management of the entire infrastructure, including clusters and the complete virtual environment. It's available as a free update to existing Veeam Management Pack 6.5 customers.

A new lower-end Enterprise Edition is a scaled-down release that provides management and monitoring but not the full level of reporting of the Enterprise Plus version. The company is offering 100 cores of the new Enterprise Edition free of charge, including maintenance for one year. The offer is available through the end of this year.

Posted by Jeffrey Schwartz on 07/25/2014 at 3:20 PM0 comments


Microsoft Welcomes New Amazon Fire Phone with OneNote, Skype Support

The release of the Amazon Fire Phone this week adds another wildcard to the smartphone race largely dominated by Apple's iPhone and devices based on Google's Android OS. With Windows Phone in a distant but solid third place in market share, it remains to be seen whether it's too late for any new entrant to make a serious dent in the market.

Microsoft must believe the new Amazon Fire Phone has a chance of gaining at least some share, and as a result it is offering its Skype and OneNote apps in the Amazon Appstore for Android right out of the gate. Even if the Windows Phone team doesn't see the Amazon Fire Phone as a threat, Microsoft also seems to realize that the new phone's success or failure won't rest on the addition of apps such as Skype and OneNote. And if the Amazon Fire Phone should prove to be a wild success, Microsoft, which has already acknowledged that we're no longer in an all-Windows world, likely doesn't want its apps and services to be supplanted by others. This move is not without precedent: Amazon has offered its Kindle app for Windows since the launch of Windows 7 back in 2009, and Amazon also offers Windows Server and SQL Server instances in its Amazon Web Services public cloud.

If any new entrant can gain share in the mobile phone market, it's Amazon. Even Facebook, which reportedly floated the idea of offering its own phone, appears to have realized that a single-purpose device was a risky bet. Amazon has more to gain -- its phone is inherently designed to let people buy goods from its retail store. It even lets consumers capture an image of an item in a store and see if they can get it cheaper, which more often than not they can. And because the phone comes with a full year of Amazon Prime, customers can purchase goods with the phone right away.

Amazon already has experience in the device market with its Kindle Fire tablets, which hold respectable but not dominant share, and it recently launched its own TV set-top box. The Amazon Fire Phone, like the Kindle Fire tablets, runs on a forked version of Android that doesn't support apps from the Google Play store. Amazon's new phone introduces some interesting features, including the ability to flip pages in a calendar and other apps by moving your hand rather than touching the screen. It comes standard with 32GB of storage and has the ability to recognize products.

Critics concede that some of these "whiz bang" features are nice, as are other appealing touches like unlimited photo storage in the cloud, but the phone has a relatively basic design and form factor. While I haven't seen the phone myself, it appears it will appeal to those who use the Amazon Prime service extensively. As an Amazon Prime subscriber myself, I'll settle for using whatever apps Amazon offers for iOS or Windows Phone when selecting my next phone.

Amazon has shown a willingness to break even or even sell products at a loss if it furthers its overall business goals. On the other hand, with yesterday's disappointing earnings report, in which its net loss of 27 cents per share was substantially higher than the 16-cent loss analysts expected, it remains to be seen how long Amazon can sustain losses, especially in new segments. Investor pressure aside, CEO Jeff Bezos shows no sign of backing off on his company's investments in distribution centers for its retail operation, datacenter capacity for the Amazon Web Services public cloud and now its entry into the low-margin phone business.

What's your take on the Amazon Fire Phone? Will it be a hot commodity or will it vindicate Facebook for choosing not to enter the market?

Posted by Jeffrey Schwartz on 07/25/2014 at 10:18 AM0 comments


Nadella Bets Universal Windows Will Advance Cloud and Mobile Share

Despite seeing its profits shrink thanks to its acquisition of Nokia, Microsoft on Tuesday reported a nice uptick in its core business lines -- notably its datacenter offerings -- and strong growth for its cloud services including Office 365 and Azure.

CEO Satya Nadella appeared in his second quarterly call with analysts to discuss Microsoft's fourth quarter earnings for fiscal year 2014. The company exceeded its forecasts for Office 365 subscription growth and saw double-digit gains across its enterprise server lines.

One of the key questions among analysts is what the future holds for Windows and for the company's struggling phone business, whose challenges the Nokia acquisition has only exacerbated. Nadella underscored that bringing a common Windows core across all device types, including phones, tablets, PCs, Xbox and embedded systems, will strengthen Microsoft's push into mobility, as well as the cloud. This is the notion of what Microsoft described to partners last week as the next wave of Windows, which will come in different SKUs but will be built on a common platform -- what Nadella described as "one" Windows that supports "universal" apps.

"The reality is we actually did not have one Windows," Nadella said on Tuesday's call. "We had multiple Windows operating systems inside of Microsoft. We had one for phone, one for tablets and PCs, one for Xbox, one for even embedded. Now we have one team with a layered architecture that enables us to, in fact, for developers, bring [those] collective opportunities with one store, one commerce system, one discoverability mechanism. It also allows us to scale the UI across all screen sizes. It allows us to create this notion of universal Windows apps."

Responding to an analyst question about what it will take to incent developers to build not just for Apple's iOS and Google's Android but also for Windows Phone and Windows-based tablets, Nadella said he believes this concept of "dual use" -- in which people use their devices for work and their personal lives -- will make it attractive for reluctant developers. 

Now that Microsoft has brought all of the disparate Windows engineering teams together into one organization, Nadella said, the next version of Windows, due out next year, will allow customers to use even their core desktop apps on any device. He's betting that this application portability will make it easier and more economical for developers to build more apps for Windows.

"The fact that even an app that runs with a mouse and desktop can be in the store and have the same app in a touch-first, in a mobile-first way, gives developers the entire volume of Windows, which you see on a plethora of units as opposed to just our 4 percent share of mobile in the U.S. or 10 percent in some counties," Nadella said. "That is the reason why we are actually making sure that universal Windows apps are available and developers are taking advantage of it. We have great tooling. That's the way we are going to be able to create the broadest opportunity to your very point about developers getting an ROI for building for Windows."

Yet reading between the lines, the fact that Nadella two weeks ago called Microsoft a "productivity and platforms" company, rather than using the previous "devices and services" descriptor, suggests the emphasis for Windows is as a common platform tied to Office and OneDrive. Microsoft's goal is that this will allow users to do their work more easily, while making it easy for them to use their devices for their personal activities without the two crossing paths. And the most likely way to succeed is to ensure developers who have always built for the Windows platform continue to do so.

Posted by Jeffrey Schwartz on 07/23/2014 at 11:26 AM0 comments


Pluralsight Hires Redmond Columnists Greg Shields and Don Jones

Longtime Redmond magazine columnists Don Jones and Greg Shields have joined online IT training firm Pluralsight, where they will provide courses for IT administrators.

The two will continue to write their respective Decision Maker and Windows Insider columns for Redmond magazine and to contribute other content to the publication's Web site. They will also continue to present at the TechMentor and Live! 360 conferences, which, like Redmond magazine, are produced by 1105 Media.

Pluralsight announced the hiring of Jones and Shields on Wednesday. "As thought leaders, Don and Greg are the cream of the crop in their field, bringing the kind of experience and expertise that will add immense richness to Pluralsight's IT offering," said Pluralsight CEO Aaron Skonnard in a statement. Both Jones and Shields are Microsoft MVPs and VMware vExperts, and Shields is also a Citrix Technology Professional (CTP).

The move means they are leaving the boutique consulting firm they created, Concentrated Technology. Jason Helmick will continue to provide PowerShell training for Concentrated. Helmick, Jones and Shields will be presenting at next month's TechMentor conference, to be held on the Microsoft campus in Redmond, Wash.

Jones' and Shields' Redmond columns cover the gamut of issues that relate to Windows IT professionals. Some columns offer hands-on tips, like a recent one on how to troubleshoot with Microsoft's RDS Quality Indicator and another on what's new in Group Policy settings. Other columns have led to heated debates, such as last summer's "14 Reasons to Fire Your IT Staff." Jones' recent Decision Maker columns have explained how organizations can create a culture of security.

Posted by Jeffrey Schwartz on 07/23/2014 at 2:11 PM0 comments


Intel Sees Continued Refresh of Enterprise and SMB PC Hardware

After a protracted decline in PC sales, Intel last week said that enterprises of all sizes are refreshing their portable and desktop computers. In its second quarter earnings report, Intel said PC shipments rose for the third consecutive quarter. While the company acknowledged that the end of life of Windows XP has helped fuel the revival, the company appears optimistic the trend will continue.

Though company officials didn't give specific guidance, the company is optimistic that the pending delivery of systems based on its new 14nm Broadwell processor will propel demand in the coming quarters. The smaller CPU is expected to yield systems that are lighter and offer better battery life.

Intel said its PC group's revenues of $8.7 billion represented a 6 percent increase over the same period last year, as well as a sequential gain over the prior quarter. The second quarter Intel reported last week covers the period when Microsoft officially stopped releasing regular patches for its Windows XP operating system.

"The installed base of PCs that are at least four years old is now roughly 600 million units and we are seeing clear signs of a refresh in the enterprise in small and medium businesses," said Intel CEO Brian Krzanich, during the company's earnings call. "While there are some signs of renewed consumer interest and activity, the consumer segment remains challenging, primarily in the emerging markets."

Krzanich was particularly optimistic about the newest ultramobile systems that will arrive based on the 14nm Llama Mountain reference design, which he said will result in fanless, detachable two-in-one systems that are 7.2 mm thick and weigh 24 ounces. OEMs demonstrated some of these new systems at the recent Computex show in Taipei. Microsoft showcased many new Windows PCs in the pipeline at last week's Worldwide Partner Conference in Washington, D.C.

Posted by Jeffrey Schwartz on 07/21/2014 at 1:41 PM0 comments


Microsoft's New InMage Scout Added to Azure Site Recovery Licenses

Microsoft moved quickly after last week's acquisition of InMage Systems to say that the InMage Scout software appliances for Windows and Linux physical and virtual instances will be included in its Azure Site Recovery subscription licenses.

Azure Site Recovery, a service announced at Microsoft's TechEd conference in Houston in May, is the rebranded Hyper-V Recovery Manager. Unlike its predecessor, Azure Site Recovery allows customers to use the Microsoft Azure public cloud as a backup target rather than requiring a second datacenter. InMage Scout is an on-premises appliance that continuously captures data changes in real time as they occur, then simultaneously performs local backups or remote replication via a single data stream.

With the addition of InMage Scout to Azure Site Recovery subscription licenses, Microsoft said customers will be able to purchase annual protection for their instances with the InMage Scout offering. Microsoft is licensing Azure Site Recovery on a per-instance basis, whether virtual or physical. The service will be available to customers with Enterprise Agreements on Aug. 1. For now, that's the only way Microsoft is letting customers purchase InMage Scout, though the software isn't required for Azure Site Recovery. The InMage Scout software is available for use on a trial basis through Aug. 1 via the Azure portal.

The company also quietly removed the InMage-4000 appliance from the portfolio, a converged system with compute, storage and network interfaces. The InMage-4000 was available with up to 48 physical CPU cores, 96 threads, 1.1TB of memory and 240TB of raw storage capacity, and it supported 10GigE storage networking along with built-in Gigabit Ethernet connectivity. Though the appliance was still listed on the InMage Web site last Friday, it was removed by Sunday.

A Microsoft spokeswoman confirmed the company is no longer offering the turnkey appliance but hasn't ruled out offering a similar type of system in the future.

Posted by Jeffrey Schwartz on 07/18/2014 at 11:55 AM0 comments


Microsoft To Cut 18,000 Jobs

Microsoft this morning announced what was widely rumored -- it will kick off the largest round of layoffs in the company's history. The company will reduce its workforce by 18,000 employees -- far more than analysts had anticipated. More than two-thirds of the cuts -- 12,500 -- will affect professional and factory workers who came over in the Nokia acquisition, with the rest impacting other parts of the company. The layoffs are aimed at creating a flatter and more responsive organization.

CEO Satya Nadella announced the job cuts just one week after indicating that major changes were in the works. Just yesterday, in his keynote address at Microsoft's Worldwide Partner Conference in Washington, D.C., Nadella reiterated that the company must change its culture and be easier to do business with. Since Microsoft announced its intent to acquire Nokia last year for $7.2 billion, critics have been concerned it would be a drag on the company's earnings. Nadella clearly signaled he is moving to minimize its impact.

The larger-than-expected reductions on the Nokia side are the result of a plan to integrate that company's operations into Microsoft, Nadella said in an e-mail to employees announcing the job cuts, 13,000 of which will take place over the next six months. "We will realize the synergies to which we committed when we announced the acquisition last September," Nadella said. "The first-party phone portfolio will align to Microsoft's strategic direction. To win in the higher price tiers, we will focus on breakthrough innovation that expresses and enlivens Microsoft's digital work and digital life experiences. In addition, we plan to shift select Nokia X product designs to become Lumia products running Windows. This builds on our success in the affordable smartphone space and aligns with our focus on Windows Universal Apps."

Nadella also said that Microsoft is looking to simplify the way employees work by creating a more agile structure that can move faster than it has and make workers more accountable.  "As part of modernizing our engineering processes the expectations we have from each of our disciplines will change," he said, noting there will be fewer layers of management to accelerate decision making.

"This includes flattening organizations and increasing the span of control of people managers," he added. "In addition, our business processes and support models will be more lean and efficient with greater trust between teams. The overall result of these changes will be more productive, impactful teams across Microsoft. These changes will affect both the Microsoft workforce and our vendor staff. Each organization is starting at different points and moving at different paces."

The layoffs don't mean Microsoft won't continue to hire in areas where the company needs further investment. Nadella said he would share more specific information on the technology investments Microsoft will make during its earnings call scheduled for July 22.

Posted by Jeffrey Schwartz on 07/17/2014 at 7:14 AM0 comments


Nadella: Microsoft Must Change Its Culture, Not Its Technology

It's no secret that big changes are coming to Microsoft. CEO Satya Nadella made that clear in his 3,100-word memo to employees late last week. The key takeaways of that message were that Microsoft is now a platforms and productivity company and it intends to become leaner in a way that it can bring products to market faster.

While the new platforms and productivity mantra doesn't mean Microsoft is doing away with its old devices and services model, Nadella is trying to shift the focus to what Microsoft is all about, and he's sticking with a strong cloud and mobile emphasis.

The latter sounds like Nadella wants to accelerate the One Microsoft strategy introduced last year and, reading between the lines, break down the silos and fiefdoms. In his keynote address at Microsoft's Worldwide Partner Conference in Washington, D.C., today, Nadella said the company intends to change its culture.

"We change the core of who we are in terms of our organization and how we work and our value to our customers," Nadella said. "That's the hardest part really. The technology stuff is the simpler thing. We all know that but we need to move forward with the boldness that we can change our culture. It's not even this onetime change, it's this process of continuous renewal that [will] succeed with our customers."

Nadella's push comes when there is unease among the Microsoft ranks. Rumors persist that Microsoft is planning some layoffs. The move isn't unexpected, given Microsoft inherited 25,000 new employees from its $7.2 billion acquisition of Nokia. According to a Reuters report, Microsoft is planning on laying off 1,000 of those employees based in Finland. The layoffs are expected to be the most extensive in Microsoft's history, according to a Bloomberg report.

To date, Nadella has not indicated any planned cutbacks, but he appears well aware that Microsoft needs to rid itself of those who aren't on board with the company's new way of doing business.

Nadella said Microsoft needs to "enable the employees to bring their A game, do their best work, find deeper meaning in what they do. And that's the journey ahead for us. It's a continuous journey and not an episodic journey. The right way to think about it is showing that courage in the face of opportunity."

Posted by Jeffrey Schwartz on 07/16/2014 at 12:40 PM0 comments


What Impact Will the Apple-IBM Mobility Pact Have on Windows?

In what Apple and IBM describe as a "landmark" partnership, the two companies have forged a deal to bring to market 100 industry-specific, enterprise-grade iOS apps and to provide cloud services such as security, analytics and mobile integration for iPads and iPhones. The pact also calls for the two companies to offer AppleCare support for enterprises, with IBM handling device activation, supply and management.

This broad partnership is a significant arrangement for both companies in that it will help IBM advance its cloud and mobility management ambitions, while Apple will gain its largest foothold to date in the enterprise. It bears noting that Apple rarely forms such partnerships, preferring to go it alone. At the same time, the buzz generated by this partnership, though noteworthy, may overstate the impact it will have. The harm it will do to Android and Windows also appears marginal.

To date, Apple has benefited from the BYOD movement of the past few years, which is predominantly why so many iPads and Android-based tablets and smartphones are used by employees. While there's no shortage of enterprise mobile device management platforms to administer the proliferation of user-owned devices, Apple is hoping IBM's foothold in the enterprise, its strong bench of developer tools and its growing cloud infrastructure will lead to more native apps and software-as-a-service offerings.

"This alliance with Apple will build on our momentum in bringing these innovations to our clients globally," said Ginni Rometty, IBM chairman, president and CEO, in a statement. "For the first time ever we're putting IBM's renowned big data analytics at iOS users' fingertips, which opens up a large market opportunity for Apple," added Apple CEO Tim Cook. "This is a radical step for enterprise and something that only Apple and IBM can deliver."

While the pact certainly will give IBM more credibility with its customers, its benefit to Apple appears marginal, which is why the company's stock barely budged on the news last night. "We do not expect the partnership to have a measurable impact on the model given that Apple has already achieved 98 percent iOS penetration with Fortune 500 companies and 92 percent penetration with Global 500 companies," said Piper Jaffray analyst and noted Apple bull Gene Munster in a research note. "While we believe that the partnership could strengthen these existing relationships, we believe continued success with the consumer is the most important factor to Apple's model."

The Apple-IBM partnership certainly won't help Microsoft's efforts to keep intact its Windows foothold, which is already under siege. On the other hand, it's a larger threat to Android than to Windows. The obvious reason is that Android has more to lose, with a much larger installed base of user-owned tablets. Even if the combined share of tablets and PCs running Windows drops to 30 percent by 2017, as Forrester Research is forecasting, enterprises still plan to use Windows for business functions because of its ability to join Active Directory domains and its ties to Windows Server, SharePoint, Office and the cloud (including OneDrive and Azure).

"It makes it more challenging for Windows Phone to gain ground in the enterprise, because IBM bolster's Apple's hardware in the enterprise, for both sales/support and enterprise apps," said Forrester analyst Frank Gillett. "And that indirectly makes it harder for Windows PCs to stay strong also, but that's incremental."

Pund-IT analyst Charles King sees this deal having a grimmer effect on Microsoft. "Microsoft is in the most dangerous position since the company is clearly focusing its nascent mobile efforts on the same organizations and users as IBM and Apple," he said in a research note. The partnership was announced at an inopportune time for Microsoft -- the company is rallying its partners around Windows, among other things, at its Worldwide Partner Conference in Washington, D.C., where it has talked up its commitment to advance Windows into a common platform for devices of all sizes, from phones to large-screen TVs. "The goal for us is to have them take our digital work-life experiences and have them shine," Microsoft CEO Satya Nadella said in his keynote address at WPC today.

While Apple and IBM described the partnership as exclusive, terms were not disclosed. Therefore it's not clear what exclusive means. Does that mean Apple can't work with other IT players? Can IBM work with Google and/or Microsoft in a similar way? At its Pulse conference in Las Vegas back in February, IBM said it would offer device management for Windows Phone devices through its recently acquired MaaS360 mobile device management platform.

Also, while Apple may have broken new ground with its IBM partnership, Microsoft has made a number of arrangements with providers of enterprise and vertical applications to advance the modern Windows platform. Among them are Citrix, Epic, SAP, Autodesk and Salesforce.com (with Salesforce1 becoming available for Windows this fall).

Munster predicted that if half the Fortune 500 companies were each to buy 2,000 iPhones and 1,000 iPads as a result of this deal, above what they were already planning to purchase, it would translate to half of one percent of Apple's revenue in the 2015 calendar year. In addition, he believes IBM will offer similar solutions for Android. Even if Munster is underestimating the impact this deal will have on Apple, there's little reason to believe the pact will move the needle significantly, if at all, for Windows. The fate of Windows is in Microsoft's hands.

Posted by Jeffrey Schwartz on 07/16/2014 at 12:35 PM0 comments


COO Kevin Turner: Microsoft Won't Give Feds Unfettered Access to Data

In his annual address to partners, Microsoft COO Kevin Turner said the company will not provide any government access to customer data. Microsoft will fight any requests by a government to turn over data, Turner told 16,000 attendees at the company's annual Worldwide Partner Conference, which kicked off today in Washington, D.C.

"We will not provide any government with direct unfettered access to customers' data. In fact we will take them to court if necessary," said Turner. "We will not provide any government with encryption keys or assist their efforts to break our encryption. We will not engineer backdoors in the products. We have never provided a business government data in response to a national security order. Never. And we will contest any attempt by the U.S. government or any government to disclose customer content stored exclusively in another place. That's our commitment."

Microsoft will notify business and government customers when it does receive legal orders, Turner added. "Microsoft will provide governments the ability to review our source code, to reassure themselves of its integrity and confirm no backdoors," he said.

The remarks were perhaps the best received by the audience during his one-hour speech, which also covered the progress Microsoft has made for its customers in numerous areas, including the success of Office 365 and Azure, virtualization gains over VMware, and business intelligence, including last year's boost to SQL Server and the release of Power BI with its new push into machine learning. While Microsoft General Counsel Brad Smith has issued a variety of blog posts providing updates and assurances that the company will protect customer data, Turner's public remarks stepped up the tenor of Microsoft's position on the matter.

While Turner did not address former NSA contractor Edward Snowden by name, his comments were a firm and public rebuke of accusations last year that Microsoft provided backdoors to the government. Turner acknowledged that despite its 12-year-old Trustworthy Computing Initiative, its Security Development Lifecycle and a slew of other security efforts, Microsoft needs to (and intends to) emphasize security further. "When you think about the cyber security issues, there's never been an issue like this past year," Turner said. "It is a CEO-level decision and issue."

Turner talked up Microsoft's existing efforts, including its ISO standard certifications, operational security assurance, Windows Defender, the Trusted Platform Module, BitLocker and various point products. He also played up the company's higher-level offerings, such as assessments, threat detection and response services, and its digital crimes unit.

Microsoft has other security offerings and/or efforts in the pipeline, Turner hinted. "We will continue to strengthen the encryption of customer data across our network and services," he said. "We will use world-class cryptography and best-in-class cryptography to do so."

Posted by Jeffrey Schwartz on 07/14/2014 at 2:47 PM0 comments


Microsoft Digs Deeper into Storage with Acquisition of InMage

If you thought Microsoft was looking to disrupt the storage landscape earlier this week when it launched its Azure StorSimple appliances, the company has just upped the ante. Microsoft is adding to its growing storage portfolio with the acquisition of InMage, a San Jose, Calif.-based provider of converged disaster recovery and business continuity infrastructure that offers continuous data protection (CDP). Terms weren't disclosed.

InMage is known for its high-end Scout line of disaster recovery appliances. The converged systems are available in various configurations with compute, storage and network interfaces. Its InMage-4000 is available with up to 48 physical CPU cores, 96 threads, 1.1TB of memory and 240TB of raw storage capacity. It supports 10GigE storage networking and built-in Gigabit Ethernet connectivity.

Over time InMage will be rolled into the Microsoft Azure Site Recovery service to add scale to the company's newly added disaster recovery and business continuity offering. Microsoft had earlier announced plans to enable data migration to Azure with Scout, InMage's flagship appliance.

"InMage Scout continuously captures data changes in real time as they occur and performs local backup or remote replication simultaneously with a single data stream," a description on the company's Web site explained. "It offers instantaneous and granular recovery of data locally and enables push-button application level failovers to remote sites to meet local backup and/or remote DR requirements, thus going above and beyond the protection offered by conventional replication backup and failover automation products alone."

It also captures data from production servers in real time, in memory, before the changes are written to disk, and moves the data to the InMage Scout server, eliminating any added I/O load from the backup or replication process. It also has built-in encryption, compression and WAN acceleration, and it supports backups of Hyper-V, VMware ESX and Xen virtual machines.

The Scout portfolio also protects Linux and various Unix environments, and the company offers specialized appliances for Exchange Server, SAP, Oracle, SQL Server, SharePoint, virtualization and data migration.

"Our customers tell us that business continuity -- the ability to backup, replicate and quickly recover data and applications in case of a system failure -- is incredibly important," said Takeshi Numoto, Microsoft's corporate VP for cloud and enterprise marketing, in a blog post announcing the acquisition.  

These products don't overlap. StorSimple offers primary storage with Azure as a tier, while InMage offers disaster recovery using the cloud or a secondary site as a target.

Posted by Jeffrey Schwartz on 07/11/2014 at 12:23 PM0 comments


Microsoft Aims To Disrupt Storage Industry with Azure StorSimple Arrays

When it comes to enterprise storage, companies such as EMC, Hewlett Packard, Hitachi, IBM and NetApp may come to mind first. But there are a lot of new players out there taking a piece of the pie. Among them are Fusion-io, GridStore, Nimble Storage, Nutanix, Pure Storage, SolidFire and Violin Memory, just to name a few high fliers. Another less obvious but potentially emerging player is Microsoft, which acquired storage appliance maker StorSimple in 2012.

As I noted a few weeks ago, Microsoft is aiming at commoditizing hardware with software-defined storage. In recent months Microsoft has also indicated it has big plans for making StorSimple a key component of its software defined storage strategy, which of course includes Storage Spaces in Windows Server. Microsoft this week announced it is launching the Azure StorSimple 8000 Series, which consists of two different arrays that offer tighter integration with the Microsoft Azure public cloud.

While Microsoft's StorSimple appliances always offered links to the public cloud, the new Azure StorSimple boxes with disks and flash-based solid-state drives use Azure Storage as an added tier of the storage architecture, enabling administrators to create virtual SANs in the cloud just as they do on premises. Using the cloud architecture, customers can allocate more capacity as needs require.

"The thing that's very unique about Microsoft Azure StorSimple is the integration of cloud services with on-premises storage," said Marc Farley, Microsoft's senior product marketing manager for StorSimple, during a press briefing this week to outline the new offering. "The union of the two delivers a great deal of economic and agility benefits to customers."

Making the new offering unique, Farley explained, are two new integrated services: the Microsoft Azure StorSimple Manager in the Azure portal and the Azure StorSimple Virtual Appliance. "It's the implementation of StorSimple technology as a service in the cloud that allows applications in the cloud to access the data that has been uploaded from the enterprise datacenters by StorSimple arrays," Farley explained.

The StorSimple 8000 Series lets customers run applications in Azure that access snapshot virtual volumes matching the VMs on the on-premises arrays. It supports Windows Server and Hyper-V as well as Linux and VMware-based virtual machines. However, unlike earlier StorSimple appliances, the new offering connects only to Microsoft Azure -- not to other cloud service providers such as Amazon Web Services. Farley didn't rule out future releases enabling virtual appliances in other clouds.

The aforementioned new StorSimple Manager consolidates the management and views of the entire storage infrastructure consisting of the new arrays and the Azure Virtual Appliances. Administrators can also generate reports from the console's dashboard, letting them reallocate storage infrastructure as conditions require.

Farley emphasized that the new offering is suited for disaster recovery, noting it offers "thin recoveries." Data stored on the arrays in the datacenter can be recovered from copies of the data stored in the Azure Virtual Appliances.

The arrays support iSCSI connectivity as well as 10Gb/s Ethernet and inline deduplication. When using the Virtual Appliance, administrators can see file servers and create a virtual SAN in the Azure cloud. "If you can administer a SAN on-premises, you can administer the virtual SAN in Azure," Farley said.

Microsoft is releasing two new arrays: the StorSimple 8100, which has 15TB to 40TB of capacity (depending on the level of compression and deduplication implemented) and the StorSimple 8600, which ranges from 40TB to 100TB with a total capacity of 500TB when using Azure Virtual Appliances.

The StorSimple appliances are scheduled for release next month. Microsoft has not disclosed pricing, but the per-GB price will be higher than that of the Microsoft Azure Blob storage offering, taking into account bandwidth and transaction costs.

Posted by Jeffrey Schwartz on 07/09/2014 at 1:50 PM0 comments


CA To Sell Off Arcserve Data Protection Business

CA Technologies on Monday said it is selling off its Arcserve data protection software business to Marlin Equity Partners, whose holdings include Changepoint, Critical Path, Openwave, Tellabs and VantagePoint.

The new company will take on the Arcserve name. Mike Crest, the current general manager of CA's Arcserve business, will become CEO of the new company. Terms of the deal, expected to close at the end of this calendar quarter, were not disclosed.

Developed more than two decades ago by Cheyenne Software, which CA acquired in 1996, Arcserve has a following of enterprise customers who use it to protect mission-critical systems.

CA just released the latest version of Arcserve Unified Data Protection (UDP), which is available as a single offering for Linux, Unix and Windows systems. It includes extended agentless protection for Hyper-V and VMware virtual environments. However, the backup and recovery market has become competitive, and CA has been divesting a number of businesses since its new CEO, Mike Gregoire, took over last year.

Marlin has $3 billion in capital under management. "We are committed to providing the strategic and operational support necessary to create long-term value for Arcserve and look forward to working closely with CA Technologies through the transition," said Marlin VP Michael Anderson in a statement.

In a letter to partners, Chris Ross, VP for worldwide sales for CA's data-protection business, said the move will benefit Arcserve stakeholders. "Greater focus on company business functions, R&D and support will mean higher levels of service and customer satisfaction," Ross said. "Simply put, the new company will be purpose-built end-to-end for Arcserve's unique target customer segment, partner model and overall strategy."

Posted by Jeffrey Schwartz on 07/07/2014 at 9:32 AM0 comments


VMware Predicts Macs Will Invade the Enterprise

A study commissioned by VMware finds enterprise users "overwhelmingly" prefer Macs over Windows PCs. According to the survey of 376 IT professionals conducted by Dimensional Research, 71 percent of enterprises now support Macs and 66 percent have employees who use them in the workplace.

VMware, which of course has a vested interest in the demise of traditional Windows PCs in the enterprise, didn't ask to what extent Macs are deployed within respondents' organizations. While the share of Macs in use overall has increased over the years, IDC data shows Mac share dropped slightly last quarter to 10.1 percent, from 11 percent a year earlier. However, that may reflect the overall decline in PC hardware sales over the past year. Nevertheless, with more employees using their personal Macs at work and execs often preferring them over PCs, their presence in the workplace continues to rise.

Consequently, VMware is asserting that the findings show the dominance of Windows is coming to an end. "For companies, the choice is very clear -- they need to respond to end-user demand for Macs in the enterprise or they will find it difficult to recruit and retain the best talent on the market," said Erik Frieberg, VMware's VP of marketing for End-User Computing in a blog post last week. "They also need to provide IT administrators the tools to support a heterogeneous desktop environment. Otherwise there will be disruption to the business."

Despite user preference, the VMware study shows that 39 percent of the IT pros surveyed believe Macs are more difficult to support and 75 percent don't believe they are any more secure. "While employees clearly prefer Macs, there are challenges from an IT perspective that Macs must overcome before they can replace Windows PCs in the enterprise," Frieberg noted.

Exacerbating the challenge, 47 percent said only some applications that employees need to do their jobs run on Macs and 17 percent report none of their apps can run on Macs.

That trend is good news for Parallels, whose popular Parallels Desktop for Mac allows Windows to run as a virtual machine on Macs. I happened to catch up with Parallels at TechEd in Houston in May, where the company also announced a management pack for Microsoft's System Center Configuration Manager (SCCM). The new tool gives admins full visibility of Macs on an enterprise network, Parallels claims. In addition to network discovery, it includes a self-service application portal and an OS X configuration profile editor, and it can enable FileVault 2 encryption. The management pack can also deploy packages and prebuilt OS X images as well as configuration files.

VMware naturally sees these findings as lending credibility to its desktop virtualization push, including its Fusion Professional offering, which lets IT create virtual desktops for PCs and Macs, as well as its Horizon desktop-as-a-service offerings.

The survey also found that less than half  (49 percent) unofficially support user-owned PCs, while 27 percent officially have such policies in place. The remaining 24 percent don't support user-owned PCs.

Are you seeing the rise of Macs in your organization? If so, would you say an invasion of Macs in the enterprise is coming? How are you managing Macs within your shop?

Posted by Jeffrey Schwartz on 07/07/2014 at 9:30 AM0 comments


Microsoft Targets Windows 8 Holdouts with 'Threshold'

The preview of the next version of Windows could appear in the next few months and will have improvements for those who primarily use the traditional desktop environment for Win32-based applications, according to the latest rumors reported Monday.

"Threshold," which could be branded as Windows 9 (though that's by no means certain) will target large audience of Windows 7 user who want nothing to do with the Windows 8.x Modern user interface, according to a report by Mary Jo Foley in her All About Microsoft blog. At the same time, Microsoft will continue to enhance the Modern UI for tablet and hybrid laptop-tablet devices.

To accomplish this, the Threshold release will have multiple SKUs. For those who prefer the classic desktop and want to run Win32 apps, one SKU will put that front and center, according to Foley. Hybrid devices will continue to support switching between the Modern UI (also referred to as "Metro") and the more traditional desktop interface. And another SKU, aimed at phones and tablets only, will not have a desktop component, which may prove disappointing to some (myself included) who use tablets such as the Dell Venue 8 Pro. At the same time, it appears that SKU will be used for some Nokia tablets and one might presume a future Surface 3 model (and perhaps for a "mini" form factor).

As previously reported, Threshold will get a new Start menu. Microsoft in April released Windows 8.1 Update, which added various improvements for keyboard and mouse users, but no Start menu. Foley pointed out that the mini Start menu demonstrated at Microsoft's Build conference in April is expected to be customizable.

The Threshold release is expected to arrive in the spring of 2015. Meanwhile, Foley also noted a second and final Windows 8.1 Update is expected to arrive next month for Patch Tuesday, though users will have the option of opting out.

Though details of how Microsoft will improve the classic Windows desktop remain to be seen, this should be welcome news to Windows 7 shops (and perhaps some Windows XP holdouts) making long-term migration and system upgrade plans. Our research has suggested all along that shops that plan to pass on Windows 8.x will consider Windows Threshold.

Microsoft said it had no comment on the report.

Posted by Jeffrey Schwartz on 07/02/2014 at 8:09 AM0 comments


Why SharePoint Admins Should Check Out Surface Pro 3

Microsoft's Surface Pro 3 could benefit all types of workers looking for a laptop that they can also use as a tablet. Among them are SharePoint administrators.

As soon as the new Surface Pro 3s went on sale at Best Buy 10 days ago, Tamir Orbach, Metalogix's director of product management for SharePoint migration products, went out and bought one. Having seen my first-look write-up last week, he reached out, wanting to share his observations on the device in general and why he believes every SharePoint administrator would benefit from having one.

Many of his customers who are SharePoint administrators tend to have a small, low end Windows tablet or iPad and a heavy laptop or desktop on their desks. Orbach believes the Surface Pro 3's high resolution, light weight and the coming availability of a unit with an Intel Core i7 processor and 8GB of RAM will make the device suitable as a SharePoint administrator's only PC and tablet.

"Pretty much all of us professionals want or need both a laptop or desktop and a slate," Orbach said. "It's so light that you can carry it anywhere you want and you would barely even feel it. And the screen is big enough, the resolution is good, the functionality is powerful enough to be used as our day-to-day computer."

We chatted about various aspects of the device:

  • New keyboard: The new keyboard is bigger, and we both agreed that the fact it can be locked at an angle is a significant improvement over previous models (which could only be used flat). Orbach said one downside to the new angle is that you can feel the bounce, which is true, but it's not that bad in my opinion. "I'd definitely take it over the flat one though," he said.
  • Cost and configuration: Orbach bought the unit configured with a 128GB SSD and 4GB of RAM. That unit cost $999, plus $129 for the keyboard. A SharePoint administrator would be better off with at least the system with a 256GB drive and 8GB of RAM, but that carries a $300 premium. For one with an i7 processor, you're up to $1,549 without the keyboard.
  • Docking station: If the Surface Pro 3 becomes your only computer, it would be worth adding the docking station if you have a primary work area.

If you're a SharePoint administrator or any type of IT pro, do you think the Surface Pro 3 would help you do your job better?

Posted by Jeffrey Schwartz on 06/30/2014 at 12:28 PM0 comments


OneDrive Still Has Major Drawback Despite New 1TB Capacity

I think it's great that Microsoft is now offering 1TB of capacity in its OneDrive service for Office 365, but that only makes the proverbial haystack even larger, given the lack of a suitable way to find files when using the new Windows 8.x modern UI.

The major drawback of using OneDrive is that it doesn't sort files in the order they were last modified. I think it's fair to say that's the preferred way for most individuals who want to access files they're working with. Instead, when using the modern UI, it sorts them alphabetically. If you have hundreds or thousands of files in a folder, good luck finding a specific file in the way you're accustomed to.

Sure you can use the search interface or go to the traditional desktop (and that's what I'm forced to do). But if Microsoft wants to get people to use its Windows 8.x UI, wouldn't it make sense to make it easier to use?

Now if you use OneDrive for Business with SharePoint Online, it's my understanding you do have the ability to do custom sorts when using the modern UI. So why not offer the same capabilities to the core OneDrive offering? Surely if Microsoft wants to stave off cloud file service providers such as Dropbox, this would be an easy way to accomplish that.

Do you agree?

Posted by Jeffrey Schwartz on 06/27/2014 at 11:07 AM0 comments


SolarWinds Acquires Web Performance Monitoring Provider Pingdom

Infrastructure systems management provider SolarWinds is extending into the Web performance monitoring market with the acquisition of Pingdom. Terms of the deal, announced last week, weren't disclosed. Pingdom's cloud-based service monitors the performance and uptime of Web servers and sites.

Web performance monitoring is a natural extension of SolarWinds' business and a key requirement for those managing their infrastructure, said SolarWinds Executive VP Suaad Sait.

"Our product strategy has always been around delivering IT infrastructure and application performance management products to the market," Sait said. "We had a hole in our portfolio where we wanted to extend the capabilities for cloud-based applications and Web sites. We heard this request from our customers as well from the macro market. Instead of building it organically, we looked for a really good partner and that led to us acquiring Pingdom."

Sait said SolarWinds proactively looked for a company to acquire for two reasons. One is that Web sites have become a critical component not only of a company's presence but also of its lead generation. Second, ensuring the availability of Web-based applications from remote sites requires that they be monitored. "Pingdom rose to the top of the kind of company that fits into the market we serve but also our business model," he said.

Pingdom is a cloud-based offering and the company claims it has 500,000 users. The service monitors Web sites, DNS, e-mail servers and other infrastructure. Setup only requires a URL. The deal has already closed but Sait said the company hasn't determined its integration roadmap for Pingdom.

Posted by Jeffrey Schwartz on 06/27/2014 at 11:11 AM0 comments


Equinix To Support Open Compute Project

Equinix, which operates the world's largest global network of colocation facilities with more than 100 datacenters, plans to join the Open Compute Project and implement some of its specs by early next year. Open Compute is a consortium of vendors initiated by Facebook with a large roster of members that includes AMD, Dell, Hewlett Packard, Intel, Microsoft, Rackspace and VMware.

Ihab Tarazi, CTO of Equinix, said the company hasn't officially announced its plan to join Open Compute, but the move probably won't come as a major surprise to observers, since Facebook is one of Equinix's largest customers. Tarazi said the decision to participate goes beyond Facebook. "Our model is to support the needs of our customers," he said. "There's a whole community on OpenCompute we're going to support."

Among them is Microsoft, which counts Equinix among the several interconnection partners it announced at its TechEd conference last month. With Microsoft's new ExpressRoute service, the company will provide dedicated high-speed links to Equinix datacenters. Microsoft joined Open Compute earlier this year, saying it plans to share some of its Windows Azure datacenter designs as part of that effort.

I sat down with Tarazi, who is in New York for a company investor conference. Despite jumping on the standards bandwagon, Tarazi said he agrees with comments by Microsoft Azure General Manager Steven Martin, who in a speech earlier this month said, "you have to innovate then commoditize and then you standardize."

In his speech Martin added: "When you attempt to standardize first and say 'I want you as vendors, customers and partners, to get together and agree on a single implementation that we're all going to use for years and years and years to come,' the only thing I know for sure is that you're going to stifle anything meaningful being accomplished for years."

Tarazi concurred. "Innovation takes off faster if you are not waiting on a standard, which is what Steve was saying," he said. "As long as you are able to still deliver the service that is the best way to go. You have to sometimes go for a standard where it's impossible to deliver the service without connectivity or standard."

There's good reason for Tarazi to agree. Equinix is stitching together its own Cloud Exchange, announced in late April, with the goal of providing interconnection between multiple cloud providers. In addition to Microsoft, which has started rolling out interconnectivity to some Azure datacenters (with all planned by year's end), Cloud Exchange also connects to Amazon Web Services through its Direct Connect dedicated links.

Others announced include telecommunications provider Orange, Level 3 Communications and TW Telecom (which Level 3 agreed to acquire last week). Tarazi said the company is in discussions with all of the players that have operations in its datacenters. "We have 970-plus networks inside our datacenters," he said. "All of those connect to Microsoft in one way or another."

Though he agreed with Martin that there's a time for standards, Tarazi apparently believes that, in addition to Open Compute, the time has come to support the OpenStack platform. "If you want to move workloads between them, we're going to make that very simple," Tarazi said. "Or if you want to have a single connection and get to all of them, that's really doable as well."

Tarazi said Equinix also plans to support the IETF's Network Configuration (NETCONF) protocol and the YANG modeling language to ease device and network configuration.
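
For readers unfamiliar with NETCONF, the sketch below shows roughly what programmatic configuration retrieval looks like using the open source ncclient Python library. The device address and credentials are hypothetical placeholders, and this is purely an illustration of the protocol in general, not of Equinix's implementation.

    from ncclient import manager

    # Connect to a hypothetical NETCONF-enabled device over SSH.
    # Port 830 is the standard NETCONF port; credentials are placeholders.
    with manager.connect(
        host="203.0.113.10",
        port=830,
        username="admin",
        password="example-password",
        hostkey_verify=False,
    ) as m:
        # Retrieve the running configuration, which the device models in YANG
        # and returns as XML.
        reply = m.get_config(source="running")
        print(reply.data_xml)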

 

Posted by Jeffrey Schwartz on 06/25/2014 at 3:06 PM0 comments


Microsoft Aims To Commoditize Hardware with Software-Defined Storage

While Microsoft has extended the storage features in Windows Server and its Azure cloud service for years, the company is stepping up its ability to deliver software-defined storage (SDS). Experts and vendors have various opinions on what SDS is, but in effect it pools all hardware and cloud services and automates processes such as tiering, snapshotting and replication.

During a webinar yesterday presented by Gigaom Research, analyst Anil Vasudeva, president and founder of IMEX Research, compared SDS to server virtualization. "Software-defined storage is a hypervisor for storage," Vasudeva said. "What a hypervisor is to virtualization for servers, SDS is going to do it for storage. All the benefits of virtualization, the reason why it took off was basically to create the volume-driven economics of the different parts of storage, servers and networks under the control of the hypervisor."

Prominent storage expert Marc Staimer, president and chief dragon slayer of Dragon Slayer Consulting, disagreed with Vasudeva's assessment. "In general, server virtualization was a way to get higher utilization out of x86 hardware," he said. "The concept of a hypervisor, which originally came about with storage virtualization, didn't take off because of what happened with storage virtualization [and] the wonderful storage systems that were being commoditized underneath a storage virtualization layer. What you're seeing today is you're commoditizing the hardware with software-defined storage."

Organizations are in the early stages when it comes to SDS. A snap poll during the webinar found that 18 percent have on-premises SDS deployed, while 11 percent have a hybrid cloud/on-premises SDS in place and 32 percent said they are using it indirectly via a cloud provider. GigaOM research director Andrew Brust, who moderated the panel discussion, warned that the numbers are not scientific but participants agreed the findings are not out of line with what they're seeing.

Siddhartha Roy, principal group program manager for Microsoft (which sponsored the webinar), agreed it is the early days for SDS, especially among enterprises. "Enterprises will be a lot more cautious for the right reasons, for geopolitical or compliance reasons. It's a journey," Roy said. "For service providers who are looking at cutting costs, they will be more assertive and aggressive in adopting SDS. You'll see patterns vary in terms of percentages but the rough pattern kind of sticks."

SDS deployments may be in their early stages today but analyst Vasudeva said it's going to define how organizations evolve their storage infrastructure. "Software defined storage is a key turning point," he said. "It may not appear today but it's going to become a very massive change in our IT and datacenters and in embracing the cloud."

Both analysts agree that the earliest adopters of SDS in cloud environments, besides service providers, will be small and midsize businesses. For Microsoft, its Storage Spaces technology in Windows Server is a core component of its SDS architecture. Storage Spaces lets administrators virtualize storage by grouping commodity drives into pools, which back virtual disks that are exposed to application clusters remotely over standard Server Message Block (SMB) 3.0 shares.

"That end to end gives you a complete software-defined stack, which really gives you the benefit of a SAN array," Roy said. "We were very intentional about the software-defined storage stack when we started designing this from the ground up."

Meanwhile, as reported last week, Microsoft released the Azure Site Recovery preview, which lets organizations use the public cloud as an alternative to a secondary datacenter or hot site, and it has introduced Azure Files for testing. Azure Files exposes file shares using SMB 2.1, making it possible for apps running in Azure to more easily share files between virtual machines using standard file APIs, such as ReadFile and WriteFile; the shares can also be accessed via a REST interface to enable hybrid implementations.
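To make the SMB angle concrete, here is a minimal sketch of how an application in one Azure VM might write to an Azure Files share that another VM also mounts. The mount point is hypothetical (on Windows it might be a mapped drive letter instead), and the point is simply that ordinary file APIs work against the share once it's mounted.

# Minimal sketch: writing to and reading from a mounted Azure Files share.
# The mount point below is a placeholder, not a real path.
from pathlib import Path

SHARE = Path("/mnt/azurefiles")          # hypothetical SMB mount point

log_path = SHARE / "app" / "shared.log"
log_path.parent.mkdir(parents=True, exist_ok=True)

# Standard file I/O (the ReadFile/WriteFile-style calls mentioned above)
# works because the share is exposed over SMB.
with log_path.open("a") as f:
    f.write("written by VM 1\n")

print(log_path.read_text())              # a second VM mounting the same share sees the same file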

Is SDS in your organization's future?

 

Posted by Jeffrey Schwartz on 06/25/2014 at 12:49 PM0 comments


Microsoft Targets Apple with Expansion of Retail Stores


I don't make a habit of attending grand opening ceremonies but when Microsoft opened its second retail store in my backyard Saturday, I decided to accept the company's invitation to check it out. Microsoft opened one of the largest stores to date at the Roosevelt Field mall in Garden City, N.Y. on Long Island (right outside New York City). It's the fifth store in New York and arrives less than two years after opening area locations in Huntington, N.Y. (also on Long Island) and in White Plains, N.Y. in Westchester County. Roosevelt Field is the largest shopping mall in the New York metro area and the ninth largest in the U.S., according to Wikipedia.

The store that opened this weekend is one of the company's largest at 3,775 square feet and 41 employees. It coincidentally opened a day after Friday's Surface Pro 3 launch. "It just worked out that way," said Fazal Din, the store manager, when asked if the opening was timed in coordination with the launch. "But it's a great way to open the store."

While Microsoft's retail stores are primarily intended to draw consumers and are often strategically located near Apple Stores (as this one is), the stores are also targeting business customers, Din said. "We want this store to be the IT department for small businesses," Din said. The store is also reaching out to local partners, he added.

Microsoft corporate VP Panos Panay was on hand for the ribbon cutting ceremony Saturday, where a number of customers later asked him to autograph their new Surface Pro 3s. "This is not only the 98th store, but it's also the 12th store in the tri-state area [New York, New Jersey and Connecticut]. It's kind of a big deal," Panay said. "This is a great area for Microsoft to show its technologies."

Hundreds, if not thousands, of teenagers camped outside the store in the enclosed mall to score tickets for a free Demi Lovato concert Microsoft arranged in the outside parking lot. The company also gave $1 million in donations to the local divisions of several charities including Autism Speaks, United Way, Variety Child Learning Center and the Girl Scouts of Nassau County.

Nine additional stores are in the pipeline, one of which opens this Thursday in The Woodlands, Texas. Most of the stores are in the U.S., with a few in Canada and Puerto Rico. By comparison, Apple, which started opening stores years before Microsoft, has an estimated 424 stores worldwide and 225 in the U.S. Apple's stores generated retail sales of more than $20 billion, representing 12 percent of the company's revenue. Like Apple and Samsung, Microsoft also has its own specialty departments in Best Buy stores.

Though Microsoft is touting the 98th store, by my count only 59 are full retail stores. The rest are smaller specialty stores. It appears Microsoft is largely opening retail stores in the suburbs of large cities rather than in urban locations. For example the only location in New York City is a specialty store in the Time Warner Center in Columbus Circle.

 

Posted by Jeffrey Schwartz on 06/23/2014 at 1:28 PM0 comments


Microsoft Releases Azure Site Recovery Preview

The preview of Microsoft's Azure Site Recovery is now available, the company said on Thursday. Among numerous offerings announced at last month's TechEd conference in Houston, Azure Site Recovery is the company's renamed Hyper-V Recovery Manager for disaster recovery.

But as I reported, Azure Site Recovery is more than just a name change. It represents Microsoft's effort to make Azure a hot site for data recovery. While Hyper-V Recovery Manager, released in January, provides point-to-point replication, Microsoft says Azure Site Recovery aims to eliminate the need to have a secondary datacenter or hot site just for backup and recovery.

"What if you don't have a secondary location?" Matt McSpirit, a Microsoft technical product manager, asked that question during the TechEd opening keynote. "Microsoft Azure Site Recovery, [provides] replication and recovery of your on-premises private clouds into the Microsoft Azure datacenters."

The original Hyper-V Recovery Manager required a secondary datacenter. "When first released, the service provided for replication and orchestrated recovery between two of your sites, or from your site to a supporting hoster's site," the company said in a blog post Thursday. "But now you can avoid the expense and complexity of building and managing your own secondary site for DR. You can replicate running virtual machines to Azure and recover there when needed."

Microsoft says both services offer automated protection, continuous health monitoring and orchestrated recovery of applications. Azure Site Recovery also protects Microsoft's System Center Virtual Machine Manager (SCVMM) clouds by setting up automated, policy-based replication of their VMs. It integrates with Microsoft's Hyper-V Replica and the new SQL Server AlwaysOn feature.

The service remotely and continuously monitors SCVMM-managed clouds, according to Microsoft. All links with Azure are encrypted in transit, with the option to encrypt replicated data at rest. Also, Microsoft said administrators can recover VMs in an orchestrated manner to enable quick recoveries, even in the case of multi-tier workloads.

Customers can test it in the Azure Preview Portal.

 

Posted by Jeffrey Schwartz on 06/20/2014 at 12:08 PM0 comments


Surface Pro 3 Is Now Available with Day-1 Firmware Patch

A month after introducing the new Surface Pro 3 -- which Microsoft advertises as the tablet designed to replace your laptop -- the device is now available for purchase at select retail locations. But the first batch of units will require a quick firmware update to address an issue where the Surface Pro 3 would occasionally fail to boot even when fully charged.

After spending a month with the Surface Pro 3, I can say the device is a really impressive improvement over the first two versions. It's bigger yet still portable, weighing 1.76 pounds with a much thinner form factor. And it has a much more usable keyboard. See my take, which appears in the forthcoming July issue of Redmond magazine.

I didn't mention the problem booting up because I hadn't experienced it when my review went to press. In recent weeks, I have experienced the bug quite regularly. When the problem occurs, typically when Windows goes into sleep mode, I have eventually managed to boot the device, though it has taken anywhere from 15 to 30 minutes to do so. Microsoft last week shared a tip on how to do it faster: it requires a strange combination of pressing the volume button in the up position and holding the power button for 10 seconds with the power adapter plugged in. I initially thought I had a flawed unit, but Microsoft said it was a common problem and the firmware upgrade now available aims to fix it.

The firmware update is intended to fix the power problem, and it's easy enough to install: go to the Settings charm, touch or click Update and Recovery and then check for a Windows update. I attempted to run it last night but received an error message telling me to make sure the system was fully charged and try again. I did so this morning without incident, though it's too early to say whether the patch has worked for me.

Microsoft also issued an update that lets Surface Pen users double-click to capture and save screen grabs, which should be welcome since there's no Print Screen key on the keyboard. This requires installing the June 10 Windows and OneNote updates. The included Surface Pen can also be used to wake the machine right into OneNote to start taking notes.

In my evaluation of the test unit, which I mentioned in my first-look article, I experienced occasional problems with the system failing to find a network connection. In fact, Device Manager would sometimes indicate there was no network adapter at all. It wasn't clear whether this was unique to my test unit or a universal problem; it turns out others have reported the issue as well, Microsoft confirmed. The workaround is to reboot, but a Microsoft spokeswoman said a patch for that problem is forthcoming.

Units with the Intel Core i5 processor are available at Best Buy stores and the Microsoft Store (both in retail locations and online). Versions with i3 and i7 processors will ship in August, with preorders open now. The i7 model is a good fit if you'll be using Adobe's Creative Cloud suite, parts of which Adobe optimized this week for photographers using the Surface Pro 3. The i5 will appeal to most mainstream workers who don't want or need to do complex photo or video editing or computer-aided design (CAD) work.

If you get to a Best Buy or Microsoft Store near you, check it out and share your thoughts.

 

Posted by Jeffrey Schwartz on 06/20/2014 at 10:32 AM0 comments


Adobe Gives Photoshop a Nice Gesture with Windows 8.1 Touch

When Microsoft said it was targeting MacBook users with its new Surface Pro 3 last month, the company demonstrated how much lighter its latest device is by putting the two on a balancing scale. But to really tip the scales for the new tablet PC, Microsoft also talked up its new partnership with Adobe to enhance Photoshop and the rest of the Creative Cloud suite for the new Surface.

Adobe today delivered on that promise with the launch of its new Creative Cloud Photography plan, priced starting at $9.99 per month. The plan includes the new Photoshop CC 2014 release, which is now optimized for Windows 8.1 for those who want to use an electronic stylus or touch.

"We're offering 200 percent DPI support, as well as touch gestures," said Zorana Gee, Adobe's senior product manager for Photoshop during a media briefing. "All of the touch gestures you were able to experience [on traditional PCs and Macs] -- pan, zoom, scale, etc., will now be supported on the new Windows 8 platform."

The optimization for the Surface Pro 3 is available in Photoshop CC 2014, though the company indicated it is looking to optimize other apps in the suite over time as well.

Adobe last year took the bold step of saying it would offer its entire Creative Suite of apps, which includes Photoshop, Dreamweaver, Illustrator and InDesign, only as cloud-based software as a service. At the Surface launch last month, Adobe was among a number of software vendors, including Autodesk and SAP, that said they're optimizing their apps for the touch and gesture features in Windows 8.x.

"It's really, really easy to interact with the screen," said Michael Gough, Adobe's VP of Experience Design, when demonstrating the new Windows 8.1-enabled Photoshop at the Surface Pro 3 launch. "The pen input is natural. The performance is great -- both the performance of the software and the hardware working together."

While Photoshop is bundled with the new plan and is optimized for Surface, the subscription also includes Lightroom and Lightroom mobile, which Adobe has designed for use with Apple's iPhone and iPad.

The new Photoshop release also has a number of other new features including content-aware color adaption improvements, Blur gallery motion effects, Perspective Warp and Focus Mask. The latter automatically selects areas in an image that are in focus. It's suited for headshots and other images with shallow depth of field.

Posted by Jeffrey Schwartz on 06/18/2014 at 8:34 AM0 comments


Microsoft on Cloud Standards: Innovate, Commoditize and then Standardize

If you're wondering where Microsoft stands with cloud standardization efforts such as OpenStack and Cloud Foundry, the general manager for Microsoft Azure gave his take, saying providers should innovate first. In the keynote address at this week's Cloud Computing Expo Conference in New York, Microsoft's Steven Martin questioned providers that have emphasized the development of cloud standards.

"I think we can agree, you have to innovate, then commoditize and then you standardize," Martin said. "When you attempt to standardize first and say 'I want you as vendors, customers and partners, to get together and agree on a single implementation that we're all going to use for years and years and years to come,' the only thing I know for sure is that you're going to stifle anything meaningful being accomplished for years."

The remarks are clearly a veiled reference to major IT providers offering public cloud services, such as IBM, Hewlett-Packard and Rackspace, along with VMware, which is pushing its Cloud Foundry effort. Amazon, Microsoft and Google have the largest cloud infrastructures. According to a report in The New York Times Thursday, Amazon and Google each have 10 million servers in their public clouds, while Microsoft Azure has 1 million across 12 datacenters today, with 16 planned to be operational by year's end. Despite the large footprints, none of the big three has pushed a standard cloud platform stack.

Martin's statements about standards were rather telling, considering Microsoft has always had little to say publicly about OpenStack and Cloud Foundry. While Microsoft has participated in OpenStack working groups and has made Hyper-V compatible with OpenStack clouds, the company has never indicated whether it sees its Azure cloud ever gaining OpenStack compatibility, either natively or through some sort of interface.

 "The best thing about cloud technology is in addition to the data, in addition to the access, is the market gets to decide," he said. "The market will pick winners and losers in this space, and we will continue to innovate." Asked after his session if he sees Microsoft ever supporting OpenStack, Martin reiterated that "we'll let the market decide."

Not long ago, one might have construed that as Microsoft talking up its proprietary platforms. However, Martin was quick to point out that Microsoft Azure treats Linux like a first-class citizen. "Microsoft will use the technologies that make sense and contribute them back to the public," he said. "What will matter is going to be the value that people generate, and how strong and robust the systems are, and the service level agreements you can get."

 

Posted by Jeffrey Schwartz on 06/13/2014 at 11:14 AM0 comments


Microsoft Appeals U.S. Order To Turn Over E-Mail in Foreign Datacenter

As I reported the other day, Microsoft is getting tougher on surveillance reforms. Later that day, Microsoft stepped up its battle against government overreach by releasing a court filing that seeks to overturn an order to turn over e-mail stored in its Dublin datacenter. In the appeal, released Monday, Microsoft argues the search warrant violates international law.

It's believed Microsoft's move is the first time a major company has challenged a domestic search warrant for digital information overseas, The New York Times reported today. Privacy groups and other IT providers are concerned about the outcome of the case, which, the report noted, has international repercussions. Foreign governments are already concerned that their citizens' data is not adequately protected.

Microsoft filed its objection in the U.S. District Court for the Southern District of New York last Friday, saying that if the warrant to turn over the e-mail stored abroad is upheld, it "would violate international law and treaties, and reduce the privacy protection of everyone on the planet." The case involves a criminal inquiry in which a federal judge in New York granted a search warrant back in December. The customer's identity and country of origin are not known.

Search warrants seeking data overseas are rare, according to The Times report, but granting one could pave the way for further cases and international conflicts at a time when foreign governments are already unnerved by the surveillance activities of the United States. In its latest filing, Microsoft is seeking a reversal of the warrant. The report said the case could go on for some time, with oral arguments scheduled for July 31.

The case could add pressure for revisions to the Electronic Communications Privacy Act of 1986, which was written before international electronic communications over the Internet were common.

Posted by Jeffrey Schwartz on 06/11/2014 at 11:54 AM0 comments


Microsoft's Quest for Surveillance Reforms

A year after Edward Snowden stunned the world with revelations that the National Security Agency (NSA) had a widespread digital surveillance effort that included the covert PRISM eavesdropping and data mining program, Microsoft marked the anniversary last week by saying it has unfinished business in the quest for government reforms.

While most cynics presumed intelligence agencies intercepted some communications, Snowden exposed what he and many others believe was broad overreach by the government. Many of the revelations that kicked off a year ago last Thursday even put into question the legality of the NSA's activities and the restrictions imposed on the telecommunications and IT industry by the Foreign Intelligence Surveillance Act (FISA).

The leaked documents implicated the leading telcos, along with Microsoft, Google, Facebook, Yahoo and many others, saying they were giving the feds broader access to the e-mail and communications of suspected terrorists than previously thought. While the government insisted it was acting only when it believed it had probable cause and on court orders, the NSA's broad activities and the compliance of Microsoft and others put into question how private our data is when shared over the Internet, even when stored in cloud services.

Whether you think Snowden is a hero for risking his life and liberty to expose what he believed defied core American freedoms, or you feel he committed treason, as Netscape cofounder and Silicon Valley heavyweight Marc Andreessen believes, the way the world views and individuals treat their data and communications is forever changed.

The revelations were a setback for Microsoft's efforts to advance its "cloud-first" transformation because the leaked NSA documents showed the company was among those that often had to comply with court orders without the knowledge of those under suspicion. To his credit, Microsoft General Counsel Brad Smith has used the revelations to push for a stop to the objectionable activities.

Both Microsoft and Google marked the anniversary last week by showing the progress they have made. Google used the occasion to announce its new end-to-end encryption plugin for the Google Chrome browser and a new section in its Transparency Report that tracks e-mail encryption by service providers. Google announced it is using the Transport Layer Security (TLS) protocol to encrypt e-mail across its Gmail service. Its reason for issuing the Transparency Report was to point out that a chain is only as strong as its weakest link, hoping it would pressure all e-mail providers to follow suit. The report last week showed Hotmail and Outlook.com applying TLS only about 50 percent of the time.
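For context on what implementing TLS means in this report, the sketch below checks whether a given SMTP server advertises STARTTLS, the opportunistic transport encryption the Transparency Report measures. The mail host is a placeholder, and this is an illustration of the mechanism rather than the methodology Google actually used.

# Minimal sketch: does an SMTP server support STARTTLS? The host is a
# placeholder; a real check would first look up the domain's MX records.
import smtplib
import ssl

def supports_starttls(host: str, port: int = 25) -> bool:
    with smtplib.SMTP(host, port, timeout=10) as smtp:
        smtp.ehlo()
        if not smtp.has_extn("starttls"):
            return False
        # Negotiate TLS to confirm the advertised capability actually works.
        smtp.starttls(context=ssl.create_default_context())
        smtp.ehlo()
        return True

if __name__ == "__main__":
    print(supports_starttls("mail.example.com"))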

Microsoft has lately emphasized it is stepping up its encryption efforts this year. Encryption for Office 365 is coming, Microsoft said last month. The company will offer 2,048-bit encryption with Perfect Forward Secrecy as the default for Office 365, Azure, Outlook.com and OneDrive. Next month Microsoft will also offer new technology for SharePoint Online and OneDrive for Business that moves from a single encryption key per disk to a unique encryption key for each file.

Shortly after the Snowden revelations, Microsoft, Google and others filed a lawsuit challenging the Foreign Intelligence Surveillance Act's stipulation that made it illegal for the companies to be more transparent. In exchange for dropping that lawsuit, Microsoft and others were able to make some limited disclosures. But in his blog post last week, Smith said providers should be permitted to provide more details, arguing doing so wouldn't compromise national security.

The unfinished business Smith would like to see resolved includes in summary:

  • Recognize that U.S. search warrants end at U.S. borders: The U.S. government should stop trying to force tech companies to circumvent treaties by turning over data in other countries. Under the Fourth Amendment of the U.S. Constitution, users have a right to keep their e-mail communications private. We need our government to uphold Constitutional privacy protections and adhere to the privacy rules established by law. That's why we recently went to court to challenge a search warrant seeking content held in our data center in Ireland. We're convinced that the law and the U.S. Constitution are on our side, and we are committed to pursuing this case as far and as long as needed.
  • End bulk collection: While Microsoft has never received an order related to bulk collection of Internet data, we believe the USA Freedom Act should be strengthened to prohibit more clearly any such orders in the future.
  • Reform the FISA Court: We need to increase the transparency of the FISA Court's proceedings and rulings, and introduce the adversarial process that is the hallmark of a fair judicial system.
  • Commit not to hack data centers or cables: We believe our efforts to expand encryption across our services make it much harder for any government to successfully hack data in transit or at rest. Yet more than seven months after the Washington Post first reported that the National Security Agency hacked systems outside the U.S. to access data held by Yahoo and Google, the Executive Branch remains silent about its views of this practice.
  • Continue to increase transparency: Earlier this year, we won the right to publish important data on the number of national security related demands that we receive. This helped to provide a broader understanding of the overall volume of government orders. It was a good step, but we believe even more detail can be provided without undermining national security.

President Obama has put forth some recommendations, though some believe they don't go far enough and have yet to effect any major changes. If you saw the interview NBC's Brian Williams conducted with Snowden in Moscow, it's clear that, regardless of the legality of the leaks, this debate is far from over. But if you're concerned about your privacy, encrypting your data at rest and in transit is an important step going forward.

Posted by Jeffrey Schwartz on 06/09/2014 at 1:06 PM0 comments


New Hybrid Windows Devices for Business Users Debut at Computex

Asus, Dell and Hewlett-Packard are among the PC suppliers extending the boundaries of Microsoft's Windows 8.1 operating system with several new enterprise-grade hybrid PC-tablets revealed at this week's annual Computex trade show in Taipei.

Some of the devices could even offer an alternative to Microsoft's new Surface Pro 3, a device the company believes is finally suited to combine all the functions of a commercial-grade laptop and a tablet. If the new PC-tablets challenge the Surface Pro 3, that's a good thing for the advancement of Windows for Microsoft. "Surface is a reference design for Microsoft's OEM partners," said David Willis, Gartner's chief of research for mobility and communications, when I caught up with him yesterday at the Good Technology Xchange user conference in New York.

For example, the new HP Pro x2 612, launched today, has a 12.5-inch full high-definition (FHD) display that's just slightly larger than the Surface Pro 3. HP's detachable tablet is available with either an Intel Core i3 or i5 processor with vPro hardware-based security, solid-state drives and two USB 3.0 ports. It is also available with HP's UltraSlim dock. While the Surface Pro 3 is also available with a Core i7 processor, the i3 and i5 should serve the needs of most mainstream business users. And there's nothing to say HP won't offer an i7-equipped model down the road.

The HP Pro x2 612 will get 8.25 hours of battery life, though an optional power keyboard extends that to 14 hours, the company said. While the Surface Pro 2 is also available with a power keyboard, Microsoft didn't announce one yet for the new Surface Pro 3. In addition to offering hardware-based security with vPro, HP also added other features to offer improved security for the new device, including HP BIOS, HP Client Security, Smart Card Reader, HP TPM and an optional fingerprint scanner for authentication.

HP also announced a smaller version, the HP Pro x2 410, with an 11.6-inch display and a starting price of $849 for a unit with an i3 processor, 128GB of storage and 4GB of RAM. HP didn't announce pricing for the larger HP Pro x2 612, which ships in September.

Meanwhile, Asus rolled out several new Windows devices including the new Zenbook NX500, available with an i7 quad-core processor. It supports an optional NVIDIA GeForce GTX 850M graphics adapter with 2GB of GDDR5 video memory. The new system also includes a Broadcom three-stream 802.11ac Wi-Fi adapter and either SATA 3 RAID 0 or PCIe x4 SSD storage.

Asus said the new NX500 is the first laptop offered by the company with a 4K/UHD display and VisualMaster technology. Its 15.6-inch IPS display offers 3840x2160 resolution. The company did not disclose pricing or availability.

And complementing its Venue 8 Pro tablets, Dell also launched several Inspiron models including the 7000 Series 2-in-1. Due to ship in September, it is also powered by Intel's latest Core processors and comes with a 13.3-inch capacitive touchscreen display. A lower-end 11.6-inch model, the 3000 Series, is also available with a starting price of $449.

In all, Microsoft showcased 40 new Windows PCs, tablets and phones at Computex, according to OEM Corporate VP Nick Parker, who gave a keynote address earlier today. "We're delivering on our vision today with rapid delivery of enhancements in Windows, new licensing and programs for an expanded set of partners," Parker said, in a blog post.

Of course, it wasn't all Windows at Computex. Intel said more than a dozen Android and Windows tablets debuted at the conference, with 130 on its radar for this year overall. And Dell revealed it will offer the Ubuntu 14.04 LTS version of Linux as an option on its new Inspiron 2-in-1 laptop-tablets.

Posted by Jeffrey Schwartz on 06/04/2014 at 2:33 PM0 comments


Microsoft Azure Slowly Gaining on Amazon Web Services

While Amazon Web Services (AWS) remains by far the most widely used cloud provider by enterprises, it appears Microsoft's Azure cloud service has gained significant ground over the past year since releasing its Infrastructure as a Service (IaaS) offering.

Azure was the No. 2 cloud service behind Amazon last year, according to a Redmond magazine reader survey, and that finding remained consistent this year, as well. But given Redmond readers are predisposed to using Microsoft technology, it has always remained a mystery which cloud provider was the greatest alternative to Amazon in the broader IT universe.

Every major IT vendor -- including Google, IBM, Hewlett-Packard, Oracle and VMware -- and the telecommunication service providers offer enterprise public cloud services and want to expand their footprints. Many of them, notably Rackspace, AT&T, IBM and HP, are betting on OpenStack infrastructures, which, besides Amazon, represent the most formidable alternative to Azure.

In the latest sign Azure is gaining ground, Gartner last week released its Magic Quadrant for IaaS providers, where only Amazon and Microsoft made the cut as leaders (a first for Microsoft in that category). Gartner published a measured assessment of Azure IaaS and all of the major cloud service providers.

"Microsoft has a vision of infrastructure and platform services that are not only leading stand-alone offerings, but that also seamlessly extend and interoperate with on-premises Microsoft infrastructure (rooted in Hyper-V, Windows Server, Active Directory and System Center) and applications, as well as Microsoft's SaaS offerings," according to Gartner's report.

"Its vision is global, and it is aggressively expanding into multiple international markets. It is second in terms of cloud IaaS market share -- albeit a distant second -- but far ahead of its smaller competitors. Microsoft has pledged to maintain AWS-comparable pricing for the general public, and Microsoft customers who sign a contract can receive their enterprise discount on the service, making it highly cost-competitive. Microsoft is also extending special pricing to Microsoft Developer Network (MSDN) subscribers."

Azure's wide variety of Platform as a Service (PaaS) features, meanwhile, gives it a significant complementary offering. Microsoft also was one of two vendors described as leaders in Gartner's application PaaS (which it calls aPaaS) Magic Quadrant back in January, bested only by Salesforce.com, now a Microsoft partner.

"The IaaS and PaaS components within Microsoft Azure feel and operate like part of a unified whole, and Microsoft is making an effort to integrate them with Visual Studio, Team Foundation Server, Active Directory, System Center and PowerShell. Conversely, Windows Azure Pack offers an Azure-like user experience for on-premises infrastructure," according to Gartner. "Microsoft has built an attractive, modern, easy-to-use UI that will appeal to Windows administrators and developers. The integration with existing Microsoft tools is particularly attractive to customers who want hybrid cloud solutions."

That's a pretty glowing assessment of Azure, but Gartner also issued some warnings to customers considering Microsoft's cloud service. Notably, Gartner cautioned that Microsoft's infrastructure services are still relatively new -- just over a year old -- while Amazon has offered IaaS since 2006.

"Customers who intend to adopt Azure strategically and migrate applications over a period of two years or more (finishing in 2016 or later) can begin to deploy some workloads now, but those with a broad range of immediate enterprise needs are likely to encounter challenges," according to the Gartner report.

Gartner also warned that Microsoft faces the challenge of operating and managing its Azure at cloud scale and enabling enterprises to automate their infrastructures. In addition, Microsoft is still in the early stages of building out its partner ecosystem and doesn't yet offer a software licensing marketplace, it pointed out. Despite offering some Linux services, Gartner believes Azure is still "Microsoft-centric," appealing primarily to .NET developers. That's an image Microsoft has begun working in earnest to shake. For example Microsoft has open-sourced some of its own .NET offerings, while making Java a first-class citizen on Azure.

Microsoft has 12 datacenters worldwide supporting Azure and that number will reach at least 16 by year's end, the company said. Azure is a key component of Microsoft's hybrid cloud strategy, called Cloud OS, which is based on running multitenant instances using Windows Server, System Center, the Azure Pack (for running Azure-like operations in a private datacenter) and the public cloud.

Azure took center stage at last month's TechEd conference in Houston. It was evident in the keynote, but also in talking with folks on the show floor. "I'm seeing more rapid adoption of Azure overall," said Randy DeMeno, CommVault's chief technologist for Windows.

And speaking during a TechEd session, BlueStripe CTO Vic Nyman noted the benefits of using Azure to scale on demand. "Using Azure, and particularly Platform as a Service and Infrastructure as a Service, is a simple, elegant solution whose presentation layers, turning up and down, is an interesting trend we see."

Are you looking at Azure to scale your infrastructure?

Posted by Jeffrey Schwartz on 06/02/2014 at 12:10 PM0 comments


Google Simplifies Exchange Migration and Data Retention

As Google targets everything from serving ads to your thermostat to driverless cars, machine learning and now broadband communications with its reportedly planned $1 billion investment in satellite technology, the search giant is also stepping up its less glamorous effort to develop alternatives to the everyday enterprise services offered by Microsoft.

Google has won its share of big conversions from Lotus Notes and Microsoft Exchange, but experts say the majority of enterprises moving their messaging and collaboration efforts to the cloud are going with Office 365. Now Google is looking to make the switch easier. Last week, Google said enterprises can migrate from Exchange Server to Google Apps with its cloud-based data migration service, run directly from the Admin console, which moves mail to the Gmail servers.

The direct migration offering replaces the need for the Google Apps Migration for Microsoft Exchange tool, which customers had to install on their local mail servers. The new service also lets administrators monitor the progress of the migration. It currently works only for e-mail, with calendar migration under development. Google is making the new e-mail migration service available on its Gmail servers over the next two weeks.

Google said the migration service currently is suitable for the following:

  • Microsoft Exchange servers that support Exchange Web Services (EWS), specifically Office 365 and Exchange Server 2007 SP1 or higher.
  • IMAP servers, including Gmail, Exchange 2003 or lower, and ISPs like GoDaddy.
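For the IMAP path listed above, the basics of reaching a source mailbox are straightforward. The following is a minimal connectivity sketch using Python's standard imaplib module; the server and credentials are placeholders, and this is not tied to Google's actual migration tooling.

# Minimal sketch: verify IMAP connectivity to a source mailbox and count the
# messages that would be migrated. Server and credentials are placeholders.
import imaplib

HOST = "imap.example.com"
USER = "user@example.com"
PASSWORD = "app-password"

with imaplib.IMAP4_SSL(HOST) as conn:
    conn.login(USER, PASSWORD)

    status, mailboxes = conn.list()          # folders that would be migrated
    print(status, "found", len(mailboxes), "folders")

    status, data = conn.select("INBOX", readonly=True)
    print("messages in INBOX:", data[0].decode())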

Google last month also made it easier to manage retention of mail and documents on Google Apps via its Google Vault service. "The options for setting or modifying a retention period -- the length of time your company's messages are archived in Google Vault -- are now more and we've added safeguards when setting a retention period for a specified number of days," Google said in a blog post last month.

Organizations using Microsoft Outlook with Google Apps can now add, manage and join Hangout video calls by downloading a plug-in to Outlook.

Posted by Jeffrey Schwartz on 06/02/2014 at 8:51 AM0 comments


VMware CEO Rejects Call To Reunify with Parent EMC

Would VMware and its parent EMC be better off as one company? A report last week by two Wells Fargo analysts suggesting the two should combine was rejected by VMware CEO Pat Gelsinger. The analysts argued that the federated solutions EMC plans to offer across the companies it controls (which, in addition to VMware, include RSA and the recently spun-out Pivotal) would make more business sense and deliver more shareholder value if EMC and VMware were recombined.

At its annual EMC World conference earlier this month, the company launched what it calls EMC II, an effort to federate the four companies to offer software-defined datacenter solutions. Despite this new federated business model, EMC said it remains committed to letting customers choose best-of-breed solutions. Wells Fargo analysts Maynard Um and Jason Maynard issued a note suggesting that could be better accomplished by combining EMC and VMware into one company. EMC spun VMware off in 2007.

"What EMC and VMware call federated solutions is, to us, taking the next step in addressing a key trend in the market today of converged solutions," they wrote, as reported by Barron's Tiernan Ray. "Over the past few years, large OEMs such as IBM, HP, Oracle and Dell have built up or acquired a broader capability across the stack and are offering complete converged solutions rather than point products. Cooperation turned into coopetition and will likely become full-on competition -- to us, the friction is fairly evident and we expect this to continue to grow."

Pressed on the matter in an interview on CNBC's Fast Money program Tuesday during the grand opening of VMware's expanded campus in Palo Alto, Calif., Gelsinger said there are no plans to combine the two organizations.

"Simple answer, no," Gelsinger said. "It is working so well. We have this federated model where each company has their own strategic role. We're independent, we're loosely coupled and we're executing like crazy. And it's working for shareholders, our ecosystems, our employees on this beautiful campus here. This has worked and we're going to stay on this model because it's been completely successful."

Speaking at the Sanford Bernstein conference yesterday, EMC chairman and CEO Joe Tucci reiterated the strategy. "In each of these companies the missions are aligned," Tucci said, according to the Seeking Alpha transcript of his remarks. "One depends on the other, built on the other. But, again, you can take these and you can use them as a card giving customers choice, which I think is going to help to find a winner in the third platform. We're not forcing you to use our technologies. You can use Pivotal without using VMware. You can use VMware without using EMC, but when they all work together you get a special layer of magic."

Even though they're separately traded companies, EMC holds an 80 percent stake in VMware and has 97 percent control of voting. Longtime storage industry analyst John Webster wrote the companies will have to deliver the so-called "third platform" it evangelizes more affordably for the EMC II federated strategy to be successful. "EMC will have to deliver on all three aspects of its redefined journey -- inclusion, value and affordability -- if its new Federated EMC strategy is to work as promised at EMC World," Webster noted.

For VMware, it's caught in a tough spot. Microsoft's Hyper-V is gaining ground on the dominant VMware virtualization platform, and the Microsoft Azure public cloud also appears to have a strong head start over VMware's hybrid cloud service. A recent survey of Redmond magazine readers found that 21 percent of those who now use VMware as their primary virtualization platform plan to migrate to Hyper-V.

"I wouldn't want to be on EMC's board," one partner of both companies told me during a conversation at this month's Microsoft TechEd conference in Houston. The only way it appears VMware can stem the migrations to Hyper-V is by lowering its cost, experts say. "The problem is it's their 'Office,'" said one Microsoft exec during an informal discussion at TechEd.

It will be interesting to see how VMware pushes forward in the next few months leading up to its VMworld 2014 conference in late August.

 

Posted by Jeffrey Schwartz on 05/30/2014 at 11:41 AM0 comments


Tool Providers Are Jumping on the Hyper-V Management Bandwagon

One thing that was apparent at this month's TechEd conference in Houston is that seemingly everyone is joining the Hyper-V parade. While VMware still offers the dominant virtualization platform, Hyper-V has steadily gained share in recent years, and as a result quite a few tools have appeared that offer improved support for Microsoft's hypervisor.

Among those talking up their extended Hyper-V support at TechEd were Savision, Veeam and Vision Solutions. Last year's release of Windows Server 2012 R2 and the latest release of System Center included major revisions of Hyper-V, which critics said made it suitable for large deployments. Hyper-V is also the underlying virtualization technology in the Microsoft Azure public cloud.

When VMware's ESX hypervisor first emerged, customers were willing to pay the company's licensing fees because of the savings they achieved by eliminating physical servers in favor of virtual environments. But because Hyper-V is free with Windows Server and it's now up to snuff, many IT decision makers are making the switch. Or at the very least, they are adding it to new server deployments.

"I think 2012 R2 release gave Hyper-V the momentum it needed," said Doug Hazelman, vice president of product strategy at Veeam and one of the hypervisor's early supporters. "Hyper-V is the fastest segment of our backup and recovery business." The company has added Hyper-V support in its new Veeam Management Pack  v7 for System Center.

The new management pack, which already supported VMware vSphere, can now run in Microsoft's System Center Operations Manager to provide improved management and monitoring of the Veeam Backup and Replication platform. Hazelman said administrators can use the Veeam Management Pack for their organizations' VMware and Hyper-V environments, accessing it right from the Operations Manager console.

For its part, Vision Solutions talked up its partnership with Microsoft (inked last fall) to help organizations migrate from VMware to Hyper-V. "We definitely have seen a pretty significant uptick with folks getting off of VMware and moving over to Hyper-V for multi-production servers," said Tim Laplante, director of product management at Irvine, Calif.-based Vision Solutions. "That's especially true when their VMware maintenance is coming due."

The company's Double-Take Move migration tool got a good share of attention at this year's TechEd, even though Vision Solutions has offered it for a while. It's a viable alternative to Microsoft's own recently upgraded Virtual Machine Converter tool.

At TechEd, Savision previewed a new release of its Cloud Reporter, which will generate reports on both Hyper-V and VMware infrastructure, said lead developer Steven Dwyer. "It's capacity planning [and] virtual machine right-sizing for Hyper-V," Dwyer said of the new Cloud Reporter 1.7 release.

Cloud Reporter 1.7 will generate reports that show capacity of both VMware and Hyper-V together, Dwyer explained. In addition it will offer predictive analysis, which administrators can use for planning and budgeting.

 

Posted by Jeffrey Schwartz on 05/30/2014 at 10:38 AM0 comments


Nadella: 'It's Time for Us To Build the Next Big Thing'

While Microsoft CEO Satya Nadella is willing to make acquisitions, he emphasized he's more focused on organic growth than making a big deal.

Taking questions at the Code Conference Tuesday, organized by the operators of the new Re/code site, Nadella was among several CEOs on the roster including Google's Sergey Brin, Intel's Brian Krzanich, Salesforce.com's Marc Benioff and Reed Hastings from Netflix. When asked what companies Nadella would like Microsoft to buy, he didn't tip his hand.

"I think we have to build something big," Nadella said. "If along the way we have to buy things, that's fine. But we have to build something big. We've built three big things, three and half if [we] add Xbox into it. It's time for us to build the next big thing." The focus is on building new platforms and software for productivity, he said.

In a preview of one major effort along those lines, Nadella, joined on stage by Corporate VP Gurdeep Pall, demonstrated the new Skype Translator, which aims to provide real-time language translation. Pall, who leads the Lync and Skype organization, showed how the Skype Translator can enable him to have a conversation with a colleague who only speaks German.

"It's brain-like in the sense of its capabilities," Nadella said. "It's going to make sure you can communicate to anybody without language barriers." In a blog post Tuesday, Pall said Skype Translator is the result of decades of work and joint development by the Skype and Microsoft Translator teams. The demonstration showed near-real-time voice translation from English to German and vice versa. Pall said it combines Skype and instant messaging technology with Microsoft Translator and neural network-based speech recognition.

"We've invested in speech recognition, automatic translation and machine learning technologies for more than a decade, and now they're emerging as important components in this more personal computing era," Pall noted, adding Microsoft will make Skype Translator available as a Windows 8 beta app by year's end. Microsoft also released this post on the research initiative.

Of course while the Skype Translator may represent years of development, Microsoft did acquire Skype for $8.5 billion. Presumably that's what Nadella meant when he said the company may have to buy things along the way. Fortunately, the company has plenty of cash if it needs to fill in where needed.

 

Posted by Jeffrey Schwartz on 05/28/2014 at 2:10 PM0 comments


Like Some MacBook Pros, Microsoft Surface Batteries Are Expensive To Replace

Microsoft set the bar for its new Surface Pro 3 last week when it compared the new device, designed to combine the functions of a tablet and a full-powered computer, to a MacBook Pro. At the launch event in New York last week, Panos Panay, the corporate VP for the Microsoft Surface group, put the two devices on a scale to show how the MacBook Pro weighs more. At the same time, Panay emphasized the optional Intel Core i7 processor with 8GB of RAM in the new Surface Pro 3 makes it powerful enough to run the Adobe Creative Suite, including Photoshop.

But the Surface Pro 3 also shares one of the most undesirable features of the latest crop of high-end MacBook Pros -- a factory sealed battery that isn't user replaceable. To change out the battery once the extended warranties expire, the cost is $200 for both systems. That was one of the top topics during a Reddit IAmA (Ask Me Anything) discussion Tuesday moderated by Panay.

Panay and his team fielded numerous questions about the new Surface Pro 3, a device that breaks quite a bit of ground in the full-featured laptop-tablet field. In addition to the battery-related questions, participants wanted to know when a Surface Mini is coming, how the different processor versions handle power and if Microsoft is walking away from Windows RT.

The battery issue appeared to raise some eyebrows among several participants who were seemingly sold on the device until Reddit member "Caliber" asked why it would cost $450 to replace the battery on existing Surface models. If your system is under warranty, it won't cost anything to have the battery replaced, but Panay (or someone on his team) said replacing the battery on the Surface Pro 3 after the warranty expires will cost $200.

That's still quite expensive -- in fact, it's the same amount it costs to replace the battery in Apple's MacBook Pro with Retina display. Traditional MacBook Airs and Pros have user-replaceable batteries, but two years ago with the Retina models, Apple upped the cost of battery replacement, which requires a technician.

So now the Surface and high-end MacBooks share that unpleasant cost, though users hopefully shouldn't need to replace them very often.

Microsoft's $200 price tag to replace the Surface Pro 3 battery should be more palatable than $450 for earlier models. Wondering if Caliber was given bad information by a Microsoft support rep, I called the Microsoft Store myself. While the rep didn't give me a price, she said it would be cheaper to replace the Surface than to send it back for a new battery (likewise with replacing the display if it cracks, she said). Most would agree $450 for a battery is off the charts, and while $200 is pretty high as well, I don't think it's a deal breaker.

Also, it's possible the cost of replacement batteries could come down in the next four-and-a-half years -- the amount of time Microsoft said it would take before anyone should notice deterioration of the battery. If the device is charged five times per day over that period, the Surface Pro 3 should still maintain 80 percent of its capacity by that point. If indeed this turns out to be the case, replacing a battery after about five years isn't too bad. At that point, many may be ready for a new system, anyway.

Fans of the Surface were also wondering when a Surface Mini will arrive, a question shared by many who expected Microsoft to launch one last week. "Please for the love of God give us some more concrete info on Surface Mini," wrote Reddit member "swanlee597." "I was really disappointed in the lack of a reveal. Is it real? Is it coming out this year? Should I just buy an OEM 8-inch Win 8 tablet instead of waiting for Surface Mini?"

As for the future of Windows RT, offered on the Surface, Surface 2 and some Nokia tablets, the Surface team said that "Windows on ARM continues to be an important part of the Windows strategy." Responding to questions about the difference in battery life among systems equipped with the Intel Core i3, i5 and i7 processors, the company said they will offer the same battery life. When it comes to compute performance, however, "the i7 will see benchmark scores approximately 15 to 20 percent better than the [Surface Pro 3] i5."

One other question that struck a nerve: Why isn't there an i5-based Surface Pro 3 with a 128GB SSD and 8GB of RAM? Microsoft launched one with a 128GB drive and 4GB of RAM for $999 but the next step up is $300 more for a 256GB unit with 8GB of RAM. That question remained unanswered.

Posted by Jeffrey Schwartz on 05/28/2014 at 12:06 PM0 comments


Azure RemoteApp Aims To Be Remote Desktop Service Alternative

Of the slew of announcements at TechEd last week, Microsoft's new Azure RemoteApp was perhaps one of the most noteworthy. It certainly is something IT managers looking to offer secure remote applications or remote desktop services should consider.

Microsoft put a decisive stake in the ground with the preview of Azure RemoteApp, which uses the company's huge global cloud service to project data and applications to most major device types and Windows PCs while keeping the app and the data in the cloud. Or, if you prefer, a hybrid version lets organizations run the apps and data on-premises and use RemoteApp to distribute them and provide compute services.

I say Microsoft is taking a different approach in that Azure RemoteApp is not a desktop-as-a-service (DaaS) offering similar to Amazon WorkSpaces or VMware's Horizon. "We definitely see value in providing full desktop but at this point in time we went after the remote application model because that's what a lot of customers said they really wanted once they started working with this," said Klaas Langhout, principal director of program management for Microsoft's remote desktop team. Langhout demonstrated and let a handful of tech journalists test Azure RemoteApp at a workshop in Redmond earlier this month (just days in advance of the TechEd unveiling).

Microsoft released the preview of Azure RemoteApp last week. Organizations are currently permitted to use it with up to 20 users, but it's scalable to much larger implementations, Langhout said. The preview I got to play with had the Microsoft Office apps, but Microsoft said the complete service will support any application that can run on Windows Server 2012 R2. Langhout said other versions of Windows Server are under consideration, but the decision to support only the latest version was because "we need to look at this from an application compatibility standpoint," he said.

Azure RemoteApp is intended as an alternative to providing Remote Desktop Services (RDS) in an enterprise datacenter, which requires hardware, storage and network infrastructure to quickly onboard employees who need a set of applications and access to data. It's also an alternative to Microsoft's App-V and VDI services. Microsoft may incorporate App-V into future versions of Azure RemoteApp, though that's only under consideration for now.

"We want to provide these applications on any device anywhere while serving it from a multi-datacenter, highly scaled elastic cloud, which allows a very resilient compute fabric to provide these applications no matter where the end user is, and this is extremely fault tolerant," Langhout explained when describing the goal of Azure RemoteApp. "We also want the customer deploying this to be able to set this up without a large capital expense, no purchase order for a lot of servers to be deployed, no setup required for the management side of the infrastructure."

The management burden is reduced in that IT doesn't have to manage the underlying infrastructure, discrete role services, licenses or RDS itself. As long as the remote or mobile user has a network connection, the application is projected via RDS to the endpoint device, which can be running Windows 7, Windows 8, iOS, Mac OS X or Android. Windows RT support will be added to the preview in a month or so.

RemoteApp will also appeal to organizations concerned about protecting data, since the applications and data are never persistent on the device. In addition to protecting against data loss from the user perspective, Langhout said it also protects against denial-of-service and other attacks. It doesn't require any existing infrastructure, including Active Directory, though a user needs either a Windows Live ID or an Azure Active Directory account. A user logs into the Azure portal and selects the RemoteApp service. Then the administrator can select from the gallery image the applications to deploy to users. A hybrid deployment of Azure RemoteApp does require Active Directory on-premises as well as a virtual network.

Microsoft's deliberate move to emphasize a remote application service versus DaaS suggests the company is not concerned about Amazon WorkSpaces or VMware Horizon. That's because Microsoft believes Azure RemoteApp is a better approach to desktop virtualization. "For a lot of scenarios, especially BYOD, they really don't want the Windows shell impeding with the usage of the application," Langhout said. "On iPad, I don't want to go to the Start menu, I just want to get to the application. As long as you can make it seamless to get to the applications, the Windows shell is not as necessary."

The company doesn't have an official delivery date for the service but Langhout indicated his group is shooting for the second half of this year. Microsoft hasn't determined specifics such as the pricing and subscription model.

If you have tested the preview over the past two weeks or you have comments on Azure RemoteApp, feel free to comment below or drop me an email at [email protected]. Is it a better alternative to DaaS?

 

Posted by Jeffrey Schwartz on 05/22/2014 at 1:18 PM0 comments


What Happened to the Surface 'Mini'?

A few weeks ago, Microsoft sent out a press invite for "a small gathering" for news from the Surface team. It wasn't a stretch to presume Microsoft was planning to roll out a Surface "mini" to compete with the slew of 8-inch tablets based on Android, Windows 8.1 and of course the iPad Mini.

The lack of a Surface in that form factor represents a key gap in Microsoft's effort to make Windows a mainstream tablet platform. Analysts say small tablets account for half of all tablets sold. As we now know, there was no Surface "mini." Instead, Microsoft took the wraps off the Surface Pro 3. For IT pros and everyday workers who use both a tablet and PC, Microsoft may have broken new ground with the new Surface Pro because it promises to combine the two, as I told New York Post reporter Kaja Whitehouse yesterday.

Mobile industry analyst Jack Gold said in a research note yesterday that Microsoft made the right move in putting the mini it reportedly had in the queue on hold. "Microsoft finally seems to understand it cannot go head to head with Apple's iPad, and must offer a superior business device leveraging its installed base of infrastructure and applications, in particular the full Office suite," he wrote.

So was Microsoft's invitation a ruse? The prevailing thinking is it wasn't. It appears that when the invites went out two weeks ago, Microsoft was planning a mini, but CEO Satya Nadella decided the device wasn't differentiated enough from other small tablets, according to a Bloomberg report. Microsoft would not comment, but according to Bloomberg, the company had a mini planned that was powered by an ARM-based Qualcomm processor running Windows RT.

Another possible reason is that Microsoft realized that a mini without the touch-optimized "Gemini" Office apps now in development wouldn't make sense, noted All About Microsoft's Mary Jo Foley. Microsoft doesn't appear to have abandoned a mini, and it would be a lost opportunity if it can't get one into the market before the holiday season this year. The only question: will it be Windows RT-based, Windows 8.1-based (like the Surface Pro) or both?

Posted by Jeffrey Schwartz on 05/21/2014 at 10:54 AM0 comments


Surface Pro 3: The Tablet To Replace Laptops?

If you were hoping that Microsoft was planning on launching the rumored Surface "mini" today, you'll have to wait another day. Instead, the company announced the Surface Pro 3, which appears to address key issues of the previous two versions. Microsoft debuted the Surface Pro 3 today at a press event in New York.

If you buy the notion that Microsoft's third attempt at a product is typically when it lands a hit, the Surface Pro 3 at first glance looks poised to maintain that track record. The Surface Pro 3 is markedly thinner and lighter than the Surface Pro 2, at just 0.36 inches thick and 1.76 pounds. And while the Surface Pro 2 comes only with an Intel Core i5 processor, the Surface Pro 3 is available with a choice of Intel Core processors: i3, i5 and i7.

The new 12-inch device is slightly bigger than its 10.6-inch predecessor but at the same time has the feel of a full-sized laptop. In fact Panos Panay, Microsoft's corporate VP for Surface, described the Surface Pro 3 as "the tablet that can replace your laptop," during today's introduction.

I'll give you my take after I spend more time with the system, but at first glance it feels marginally heavier than the Surface 2 -- which runs Windows RT 8.1 -- yet significantly lighter than the Surface Pro 2, and it's much easier on the eyes. At any rate, Microsoft positioned today's launch as a major event. "Our goal is to create new categories and spark new demand for our ecosystem," said Satya Nadella, who made brief introductory remarks at today's event. "Today is a major milestone in that journey."

The 12-inch ClearType HD display offers a much higher resolution of 2160x1440 and a 3:2 aspect ratio. The device supports up to 8GB of RAM and, the company said, can get up to nine hours of battery life. Offering a true tablet that could replace a laptop has been the Surface team's goal from the outset, and there are a number of reasons to believe the new model finally delivers on it, or at least comes much closer.

The systems are enterprise-ready in that a power user can now get a configuration with an Intel Core i7 processor, up to 8GB of RAM and a 512GB solid state drive. For those with more moderate needs, an Intel Core i3 processor is also available with as little as 64GB of storage. Pricing will of course vary by configuration, but it starts at $799.

Microsoft also addressed a couple of key pet peeves of many Surface users, notably the difficulty of using the device on your lap. The new keyboard that clicks into the Surface Pro 3 was designed so that it won't wobble on your lap, the track pad is improved and the keyboard can be adjusted to any position.

Like the Surface Pro 2, the new unit is available with an optional docking station and a new aluminum Surface Pro Pen designed specifically for this device. The Surface Pro Pen offers 256 levels of pressure sensitivity.

Emphasizing that the Surface Pro 3 is targeted at commercial use, the company said it is working with some key ISVs including SAP, Dassault Systèmes and Adobe, among others. Panay invited Michael Gough, Adobe's VP of Experience Design, on stage to reveal plans to optimize Adobe Photoshop CC for touch on the new Surface Pro 3.

"It's a creator's dream come true," Gough said. "It's really, really easy to interact with the screen, the pen  input is natural, the performance is great."

One key thing missing from the accessories lineup was a power keyboard like the one offered with the Surface Pro 2, and company officials wouldn't say whether one is planned. Also, while Microsoft has made key strides in getting popular software vendors to offer their wares in the Windows Store, it still lags behind the Apple iTunes App Store and Google Play.

Microsoft will start accepting preorders tomorrow and the devices are slated to ship June 20.

Posted by Jeffrey Schwartz on 05/20/2014 at 1:44 PM0 comments


Rackspace Up for Sale. Will Microsoft Make a Bid?

Rackspace, the largest independent cloud hosting provider, said that multiple bidders have expressed interest in acquiring it. As a result, Rackspace has made it official that it's weighing a sale, among other options, and has hired the investment bank Morgan Stanley to evaluate proposals.

In a filing late Thursday with the SEC, Rackspace revealed the move, saying it has "been approached by multiple parties who have expressed interest in exploring a strategic relationship with Rackspace, ranging from partnership to acquisition." Rackspace's future has been in question since CEO Lanham Napier stepped down in February. At the time, I wondered if Rackspace would put itself on the market. The board said it is looking for a new CEO but the company has yet to name one.

While Rackspace is profitable, it's being squeezed by larger players such as Amazon Web Services, Microsoft and Google. Rackspace these days is best known for stewarding OpenStack, the open source cloud compute, storage and networking platform -- a move aimed at providing an interoperable cloud and an alternative to market-leader Amazon.

Many key cloud providers support OpenStack including IBM, Hewlett Packard and AT&T, as well as many smaller providers. OpenStack is also working its way into Linux servers, making it the cloud operating system of choice for those users.

But the three largest cloud providers -- Amazon, Microsoft and Google -- don't support OpenStack, though orchestration tools such as Puppet, Chef and numerous other third-party offerings enable some levels of interoperability.

Rackspace's own cloud servers and storage are, of course, now OpenStack-based. The company made a big strategic bet when it teamed up with NASA more than four years ago to codevelop the OpenStack code with the open source community. The transition was costly, and the company's stock is down 50 percent over the past year -- though shares jumped 20 percent this morning on the news. It's ironic that the filing came just as the semi-annual OpenStack Summit took place in Atlanta this week.

Rackspace also has a formidable SharePoint and Exchange hosting service. I can't help but ponder if Microsoft is one of the interested parties, as outlandish as that may sound to some. There's good reason to laugh off Microsoft having any interest in Rackspace. The Microsoft Azure cloud service already has 12 global datacenters online and has four more in the queue for this year. Microsoft Azure is part of the Cloud OS, largely based on Windows Server and Hyper-V.

Rackspace, by comparison, runs an open source infrastructure. And even though it supports Windows Server and Hyper-V, it's a whole different platform. On the other hand, several people in the OpenStack Foundation have lauded Microsoft for making meaningful contributions and participating in activities. But if Microsoft wanted to have a companion network of datacenters based on OpenStack, Rackspace would be an interesting play.

To be sure, this would be a surprising and likely disruptive move. Coming back from TechEd this week, it's clear to me that Microsoft is going to put all of its resources into Azure. Unless those inside the company see OpenStack as a viable threat to Azure's future as a dominant enterprise public cloud, buying anything but the company's SharePoint and Exchange hosting service would be a major departure for Microsoft.

There are likely other interested parties. IBM reportedly was once seriously interested in Rackspace before acquiring SoftLayer for $2 billion. Perhaps IBM has renewed its interest in Rackspace, though a counterargument is that Big Blue is emphasizing higher margin services-based offerings. Though I have no insights as to which companies have expressed interest, here are some possibilities, other than IBM:

  • Hewlett Packard: Like Rackspace, HP has made a major commitment to OpenStack for both its public and private cloud offerings. The company certainly has the resources to build out its own global footprint since it's a major provider of server, storage and network gear. In other words, it doesn't need Rackspace for its footprint but rather its brand and customer base.
  • Cisco: The networking giant recently announced its $1 billion "Intercloud" effort. It also is a significant contributor in the OpenStack community, and perhaps Rackspace could provide the glue for its Intercloud. Still, this doesn't sound like a move CEO John Chambers would make, as he's trying to divest groups that aren't core. Given mixed results with WebEx, another service it acquired, picking up Rackspace may not be a natural fit for Cisco.
  • Google: Another unlikely player, since it has shown no interest in OpenStack, but Google has lots of money to spend and has made more surprising moves in the past. Also, if it has misgivings about passing on OpenStack, this would be an easy way to get on board.
  • AT&T: Perhaps the large telecommunications giant wants to follow in Verizon's footsteps (it bought Terremark a few years ago).
  • Verizon: Even though Verizon has Terremark, like Google, it hasn't jumped on the OpenStack bandwagon.
  • VMware/EMC: VMware has not totally given OpenStack a pass (having bought Nicira), but the VMware Hybrid Cloud service is targeting shops that already run its virtualization infrastructure in their private datacenters.
  • Red Hat: Could the open source software company, which claims to be the largest OpenStack contributor, decide to become a service provider too?
  • Other possibilities: One can't rule out some other companies with deep pockets (or access to capital) such as SAP, Salesforce.com, Dell and Oracle. But I'd say these are all likely longshots.

To be sure, Rackspace said in its filing that it could also go the partnership route or pursue other alternatives. Rackspace has given no timetable for making any type of move, indicating it was just exploring its options. That said, we all know how these things usually work out.

 

Posted by Jeffrey Schwartz on 05/16/2014 at 11:52 AM0 comments


PowerShell 4.0 and Desired State Configuration Talk Draws Crowd at TechEd

It wasn't the most prominent topic at this week's TechEd conference in Houston, but PowerShell certainly wasn't left in the dust either. I caught up with Don Jones, author of Redmond magazine's Decision Maker column, after his TechEd session: "A Practical Overview of Desired State Configuration."

We met up at The Scripting Guys booth, where Don was signing copies of the book he coauthored with Jeffery Hicks: "Learn Windows PowerShell in a Month of Lunches" (Manning 2012). The book sold out at TechEd and Don was inundated with questions ranging from specific scripting practices to IT management issues.

One such question was about how to deal with individuals who aren't on board with the new techniques PowerShell 4.0 introduced such as Desired State Configuration (DSC). Don explained in one of his Decision Maker columns last year that DSC can be disruptive.

"With DSC, you use PowerShell to write very simple (or complex, if you like) declarative scripts," he explained in the column. "Script isn't really the proper word, as there's no actual programming. You're really just building an exaggerated INI-esque file, where you specify configuration items that must or must not be present on a computer."

If you weren't at TechEd, you can view his session on Microsoft's Channel 9. If you want to spend some quality time with Don on the topic, he'll be giving a crash course on PowerShell at our TechMentor conference up in Redmond on the Microsoft campus. Also at TechMentor, which like Redmond magazine is produced by 1105 Media, Don will give a talk on DSC. The conference takes place Aug. 11-15.

Posted by Jeffrey Schwartz on 05/16/2014 at 12:36 PM0 comments


Hyper-V Recovery Manager Gets Rebranding, Retooling

Not long after releasing Hyper-V Recovery Manager, its tool for disaster recovery, Microsoft is giving it a new name: Microsoft Azure Site Recovery. But this is much more than a cosmetic change. Microsoft is stepping up its effort to make Azure your hot site for data recovery.

Released just a few months ago, Hyper-V Recovery Manager is designed to protect important workloads and applications by replicating them and making them available for recovery. The company announced the rebranding Monday at its annual TechEd conference in Houston. Microsoft Azure Site Recovery, available next month, extends the idea beyond replicating to a secondary datacenter by letting organizations replicate on-premises sites to the Azure public cloud service.

"What if you don't have a secondary location?" asked Matt McSpirit, a Microsoft technical product manager, during Monday's opening keynote. "Microsoft Azure Site Recovery, [provides] replication and recovery of your on-premises private clouds into the Microsoft Azure data centers."

As noted Monday, Microsoft also announced plans to release Azure Files, which will let organizations move their virtual machines to Azure and use an SMB storage head as a shared store. Microsoft describes Azure Files as file sharing as a service. It's a platform-as-a-service offering in which administrators can configure their apps in Azure and access shared files without having to explicitly manage the underlying file servers.

 

Posted by Jeffrey Schwartz on 05/15/2014 at 12:51 PM0 comments


Microsoft Injects Azure with Major Network Boosts

If you're attending Microsoft's annual TechEd conference in Houston or watching the keynote and sessions on Channel 9, it's hard to escape hearing about the many upgrades to the Azure cloud service that were made this week. Microsoft is emphasizing the newly added resiliency of its infrastructure, which now has 12 global datacenters in service with four more scheduled to go online by year's end.

As I noted in passing on Monday, Microsoft announced the general availability of ExpressRoute, which lets organizations connect their datacenters directly to Azure without using a public Internet connection. The service is based on MPLS connectivity from carriers and colocation operators including AT&T, BT, Equinix, Level 3, SingTel, TelecityGroup, Verizon and Zadara Storage.

"You can just add us to your existing MPLS contract or your MPLS WAN and we're also redundant," said Brad Anderson, corporate VP for Microsoft's server and tools business. "Literally, we provision two circuits with every single connection so you have that redundancy for this high connection, dedicated pipe that you have."

Letting organizations use carrier-grade connections via ExpressRoute is likely to make Azure more appealing to enterprise users who don't want to use slower and less reliable Internet connections, experts say. That's especially the case when reliability is critical and for organizations moving large amounts of data that require high bandwidth links.

Microsoft's new Azure Import/Export service, which the company this week announced is generally available, lets organizations move extensive amounts of data in and out of storage blobs. The service can leverage ExpressRoute. Enterprises building hybrid clouds may also require more extensive network connections. Microsoft released several new network features to support those requirements.

Azure Virtual Network now supports multiple site-to-site VPN connections, Microsoft announced. Until now it only allowed for a single connection. This new vnet to vnet option lets organizations connect multiple virtual networks. Microsoft said this is suited for disaster recovery, notably when using SQL Server 2014's new AlwaysOn feature.
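As I understand the preview tooling, the extra tunnels and VNet-to-VNet links are declared in the exported network configuration file rather than through dedicated cmdlets, so the round trip with the Azure PowerShell service management module looks roughly like this (the file path is a placeholder):

# Export the current virtual network configuration, edit it to declare the
# additional local network sites or VNet-to-VNet gateway connections, then push it back.
Get-AzureVNetConfig -ExportToFile "C:\azure\netcfg.xml"
# ...edit netcfg.xml...
Set-AzureVNetConfig -ConfigurationPath "C:\azure\netcfg.xml"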

Azure users can now reserve public IP addresses and use them as virtual IP addresses with Microsoft's new IP Reservation option. Microsoft noted this is important for applications requiring static public Internet IP addresses or for swapping reserved IP addresses to update apps.
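The reservation itself is surfaced through the service management cmdlets. A quick sketch, with the name, label and region as placeholders; the reserved IP is then referenced by name when the cloud service deployment is created:

# Reserve a public IP in a region and list the subscription's reservations.
New-AzureReservedIP -ReservedIPName "web-vip" -Label "Web front end" -Location "East US"
Get-AzureReservedIP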

Microsoft's new Azure Traffic Manager is also now generally available. It supports both Azure and external endpoints for applications requiring high availability. And Microsoft said that two new compute-intensive virtual machine instances -- A8 and A9 -- are now available to support faster processors and links, more virtual cores and larger memory.

Posted by Jeffrey Schwartz on 05/14/2014 at 11:56 AM0 comments


Microsoft To Extend Security in Azure and Office 365

Microsoft is extending security for its Azure cloud service with a new antimalware agent and is adding encryption to its Office 365 service. The company talked up the enhanced cloud security offerings at this week's TechEd conference in Houston.

The new antimalware agent, released to preview, is available for both Microsoft's cloud services and virtual machines. Microsoft also announced partnerships with Symantec and Trend Micro, whose antimalware offerings will also be available in Azure.

"You can use these antimalware capabilities to protect your VMs as well as protect the Azure applications that you're building," said Brad Anderson, corporate VP for Microsoft's server and tools business, during Monday's keynote address. Anderson also said Microsoft would offer encrypted storage for Office 365, SharePoint Online storage and OneDrive for Business.

"What this provides is the ability to actually have every single file that is stored in OneDrive for business encrypted with its own key," Anderson said.

Trend Micro said its Deep Security and SecureCloud offerings will provide threat and data protection security controls for virtual machines deployed in Microsoft Azure. The controls include antimalware, intrusion detection, threat prevention and encryption. The company said the offerings will also provide centralized, automated policy management.

In addition, Trend Micro said it will offer its PortalProtect data protection solution for organizations migrating or sharing SharePoint workloads with Azure, along with a Microsoft Agent Extension that lets customers choose Deep Security as a security extension when configuring a VM in Azure. Trend Micro also said customers can use PowerShell extensions when implementing Deep Security, SecureCloud and PortalProtect for Azure VMs and SharePoint workloads.
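Whether it's Microsoft's own agent or a partner offering, these capabilities land on an IaaS VM as extensions. As a rough illustration of the mechanics using the service management PowerShell module, the snippet below attaches an extension to an existing VM; the extension name, publisher, version and configuration JSON are my own placeholders rather than published values, so check the vendor documentation before relying on them:

# Attach an antimalware-style extension to an existing IaaS VM and apply the change.
Get-AzureVM -ServiceName "contoso-web" -Name "web01" |
    Set-AzureVMExtension -ExtensionName "IaaSAntimalware" `
        -Publisher "Microsoft.Azure.Security" -Version "1.*" `
        -PublicConfiguration '{ "AntimalwareEnabled": true }' |
    Update-AzureVM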

Posted by Jeffrey Schwartz on 05/13/2014 at 10:55 AM0 comments


TechEd: Microsoft Previews Azure RemoteApp and Azure Files, Delivers ExpressRoute

Microsoft kicked off its annual TechEd conference today and  underscored its "cloud first, mobile first" mantra by debuting key new wares aimed at advancing the company's effort to deliver access to data and applications to users anywhere and on any device.

Though Microsoft didn't reveal plans for new releases of Windows Server or System Center, nor was it expected to, the company is using this week's event in Houston to emphasize the role its Azure cloud service and Active Directory can play to deliver secure enterprise infrastructure to all forms of mobile devices. Microsoft officials emphasized that these new tools for IT pros and developers will let organizations house their data on-premises, in the public cloud or in a hybrid scenario that will combine the two.

The company's announcements today included Azure RemoteApp, Azure Files and the general availability of ExpressRoute, among a slew of others at TechEd. I'll be drilling into these and many of the new offerings in the coming days and weeks.

Corporate VP Brad Anderson kicked off TechEd with the opening keynote today, saying cloud and mobile go hand-in-hand. "You cannot have a cloud without connected devices," Anderson said. "As you think about the connected devices, without that cloud, all you have is potential that goes with it." He added: "The amount of information that will be at our fingertips will be amazing."

Key to putting that information at users' fingertips will be empowering IT to protect enterprise data, while at the same time giving users the access to information from any device. Microsoft's new Azure RemoteApp will let IT deliver Windows applications to almost any device including Windows tablets, Macs, iPads and Android tablets.

The preview of Microsoft Azure RemoteApp is now available to organizations that want to let up to 20 users test the service. Because this is a preview, the company has not determined how it will price and package the offering. Azure RemoteApp will deliver applications -- initially Office in the preview -- via Microsoft's Remote Desktop Services. The service is designed to let organizations keep data centrally located and will support up to 1,300 Windows-based applications.

Microsoft has not committed to when it will release the offering but the company is targeting the end of the year. "Every organization I talk to has a very large inventory of Windows applications they're looking to deliver to mobile devices," Anderson said. "With Azure RemoteApp, users can scale up and down, so their capital expenditures goes down dramatically."

Anderson also made clear that Microsoft intends to be aggressive on the mobile device management front. Microsoft's recently announced Enterprise Mobility Suite will cost $4 per user per month regardless of the number of devices supported, Anderson announced.

The company also announced that administrators will be able to manage Office for iPad, iPhone and Android devices. This Windows Intune component of the Enterprise Mobility Suite will let administrators manage line-of-business apps running on Android and iOS.

Looking to help IT organizations reduce VM sprawl, Microsoft unveiled Azure Files. Based on the standard SMB 2.1 protocol, Azure Files runs on top of Azure storage and allows for shared readers and writers. It will also work from on-premises environments, allowing users to access their storage accounts without having to spin up a virtual machine and manage an SMB share. In short, Azure Files will let IT organizations create single file shares accessible from multiple virtual machines.
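Because the share is just SMB, mounting it from a VM once a storage account is enrolled in the preview is standard Windows plumbing. A rough sketch, with the storage account, share name and key as placeholders:

# Placeholders: storage account "mystorageacct", share "myshare" and its access key.
$storageKey = "PASTE-STORAGE-ACCOUNT-KEY-HERE"

# Persist the credentials, then map the Azure Files share as drive Z: over SMB.
cmdkey /add:mystorageacct.file.core.windows.net /user:mystorageacct /pass:$storageKey
net use Z: \\mystorageacct.file.core.windows.net\myshare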

Microsoft's ExpressRoute, which lets organizations connect their datacenters directly to Azure without using a public Internet connection, is generally available. The service is based on MPLS connectivity from carriers including AT&T, Verizon and Level 3, among others, as well as through colocation provider Equinix. This will appeal to those who want reliable, faster and inherently more secure connectivity, and Microsoft talked up this capability for those who want to use Azure for disaster recovery and business continuity.

Posted by Jeffrey Schwartz on 05/12/2014 at 10:58 AM0 comments


Microsoft Responds to Reader Demands for Automatic Windows Phone Encryption

Now that Nokia's handset business is part of Microsoft, it'll be interesting to see what compelling features come from the new devices and services group besides Cortana, the recently introduced voice-activated personal assistant. One improvement Microsoft might want to put on the fast track is its approach to encryption with Windows Phone.

The suggestion comes from a reader, who responded to my post a few weeks ago about Microsoft's then-pending, and now completed, acquisition of the Nokia handset business.

The reader had recently switched from an iPhone to a Nokia Lumia 521, which he described as a "very capable utility smartphone." However, he quickly discovered that, unlike on an iPhone, Windows Phone 8.1 BitLocker encryption is not automatically enabled on an unmanaged device when a screen-lock passcode is created.

According to a Microsoft Channel 9 video (about 10 minutes in), he discovered that Windows Phone 8 devices aren't encrypted at all until Exchange ActiveSync (EAS) is activated. The reader wanted to know how to activate the built-in BitLocker encryption, and how to create arbitrary-length alphanumeric passcodes, on any WP8 handset without having to use EAS or mobile device management (MDM). In short, he can't -- at least not now.

That was something he concluded after seeing the Channel 9 video and reading the Microsoft documentation regarding the BitLocker encryption and how it's built into every Windows Phone. The problem, he argues,  is that Microsoft is avoiding the issue. He pointed out that his iPhone offered "on-the-fly device and file encryption as soon as one creates a screen lock password." This is also confirmed by Apple in its documentation (see pages 8-13).

Wondering if there's perhaps some undocumented workaround or if this will be addressed at a later date, I shared the reader's criticism with Microsoft. A company spokeswoman said the behavior the customer observed is consistent with the design of Windows Phone 8/8.1. "Device encryption can only be invoked on devices using remotely provisioned management policy (via EAS or a MDM)," she confirmed.
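For shops that do manage their phones, the policy side of that statement is straightforward. As a hedged sketch, an Exchange ActiveSync mailbox policy along these lines would push device encryption and an alphanumeric passcode to phones that sync with Exchange; the policy name, mailbox and values are illustrative, and Exchange 2013 favors the equivalent New-MobileDeviceMailboxPolicy cmdlet:

# Create an EAS mailbox policy that requires device encryption and a passcode.
New-ActiveSyncMailboxPolicy -Name "WP8-Encryption" `
    -RequireDeviceEncryption $true `
    -DevicePasswordEnabled $true `
    -AlphanumericDevicePasswordRequired $true `
    -MinDevicePasswordLength 8

# Assign it to a mailbox; the phone applies the policy on its next sync.
Set-CASMailbox -Identity "jsmith" -ActiveSyncMailboxPolicy "WP8-Encryption"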

To protect personal information on a Windows Phone, Microsoft said users should set up a numeric PIN code. If the phone is lost, stolen or a malicious user attempts to brute force their way into the device, the device will automatically be wiped. To prevent attacks on the Windows Phone storage, Microsoft said it offers a few different solutions. First, when the phone is attached to a PC using USB, access to the data is gated based on successful entry of the user's PIN. Second, Microsoft said an offline attack affecting physical removable storage is addressed by fixing storage media to the device itself. Finally, users can register their Windows Phone devices which will enable them to locate, ring, lock or even erase the device when the phone is lost or stolen, Microsoft said.

Nevertheless, Microsoft is apparently taking this reader's suggestion to heart. "We will consider providing a means to enable device encryption on unmanaged devices for a future release of Windows Phone," the spokeswoman said. "In the meantime there are a series of effective security mechanisms in place to protect your data."

Is this a showstopper for you?

 

Posted by Jeffrey Schwartz on 05/09/2014 at 12:11 PM0 comments


Microsoft Surface Mini Expected This Month

It looks like Microsoft is set to launch its long-anticipated Surface "mini" this month. Microsoft is holding what it described in an invite to media as "a small gathering" on May 20 in New York. While the invitation didn't offer much detail other than the time and place, it indicated it was a private press event regarding the Surface line of hybrid tablet PCs.

Given the subtle hint in the title of its invitation and the fact a small tablet is one of the fastest-growing device types these days, it's a reasonable assumption the company is finally filling this gaping hole in its Surface line. Also suggesting this will be a major launch, Microsoft CEO Satya Nadella will preside over the event, reported Mary Jo Foley in her All About Microsoft blog.

The big question is whether Microsoft will offer an ARM-based Surface like the Surface 2, which runs Windows RT 8.1, or one powered by an Intel processor running Windows 8.1 Pro. Or perhaps we'll see both? The other key question is size. Will it come with a 7- or 8-inch screen? I'd bet on the latter.

If Microsoft wants its Surface line to succeed, it needs a mini in the lineup. Just ask Apple, which reluctantly released its iPad Mini nearly two years ago, not to mention all the Android-based tablets in that form factor.

Price will also be key. With a slew of sub-$300 mini Android and Windows tablets, cost will be critical. Yet I'd be surprised if it's less than the iPad Mini's $329 price tag and if other features are tacked on, it could cost more. It's also a reasonable bet that Microsoft won't undercut its OEM partners.

Posted on 05/07/2014 at 9:57 AM0 comments


Microsoft Didn't Blink when Offering XP Support in IE Patch

Did Microsoft blink? That's the first reaction one might have had upon learning of the company's decision to include Windows XP in repairing one of the most prominent zero-day vulnerabilities in Internet Explorer in recent memory.

Microsoft could have stuck to its guns by saying it's no longer patching Windows XP and customers are on their own to either upgrade to a newer operating system or seek costlier assistance. The company had long stated that it would stop issuing patches and updates for Windows XP, and it did so last month on April 8. But the fact that this vulnerability -- revealed earlier this week by security firm FireEye -- is so significant, and that some attackers have already exploited it against companies in the financial services industry, necessitated a swift decision by Microsoft.

This vulnerability affected all versions of Internet Explorer running on all releases of Windows, including those running on embedded systems, except for users who configured their browsers to run in Enhanced Protected Mode. The flaw enabled attackers to take advantage of a memory corruption vulnerability in the browser. The exploit aimed to deliver a "newer version of the years-old Pirpi RAT to compromised, victim systems by taking control of their browsers, and in turn, their systems and networks," said Kurt Baumgartner, a researcher at Kaspersky Lab, in a blog post.

While Adrienne Hall, general manager of Microsoft's Trustworthy Computing group, said in a blog post that the flaw resulted in a limited number of attacks and fears were overblown, Baumgartner suggested the threat of wider attacks was real. "Once the update and code is analyzed, it can easily be delivered into waiting mass exploitation cybercrime networks," Baumgartner warned. "Run Windows Update if you are using a Windows system, and cheers to Microsoft response for delivering this patch to their massive user base quickly."

Indeed, Microsoft acted quickly and decisively, but Hall warned that Windows XP users shouldn't be lulled into complacency by yesterday's release of a patch for Internet Explorer running on Windows XP. "Just because this update is out now doesn't mean you should stop thinking about getting off Windows XP and moving to a newer version of Windows and the latest version of Internet Explorer," she warned. "Our modern operating systems provide more safety and security than ever before."

 

Posted by Jeffrey Schwartz on 05/02/2014 at 12:17 PM0 comments


Equinix Launches Multi Cloud Exchange

Equinix, one of the largest datacenter colocation and hosting operators, is rolling out an exchange that will link its facilities to multiple cloud service providers.

The new Equinix Cloud Exchange, launched Wednesday, aspires to create a global cloud interconnection network much like the one Cisco recently announced with its $1 billion Intercloud effort. Just like Cisco's, the Equinix Cloud Exchange is initially available in selected areas. The selection Equinix is starting with, nevertheless, is not trivial.

It initially connects to Amazon Web Services and Microsoft Azure in 13 markets worldwide including Silicon Valley, Washington D.C., New York, Toronto, Seattle, Los Angeles, Atlanta, Dallas, Chicago, London, Amsterdam, Frankfurt and Paris. Six additional locations will be added to the network by the end of this year.

Connections to Amazon and Azure are currently limited to Silicon Valley, Washington, D.C. and London with the rest going online later in the year. The company indicated other cloud services will be added in the future. Through a portal and a set of APIs, customers can move workloads among multiple cloud environments. The portal and APIs let administrators allocate, monitor and provision virtual circuits in near real time, Equinix said.

Equinix already offers Amazon customers connections via its AWS Direct Connect offering, and it added Microsoft to the roster last week, saying it would make Microsoft Azure ExpressRoute available in 16 markets worldwide.

 

Posted by Jeffrey Schwartz on 05/01/2014 at 10:48 AM0 comments


Microsoft Updates Office iPad App with Top-Requested Feature: Printing

When Microsoft released its Office app for iPad users last month, the company left out one key feature: the ability to print files. The company fixed that yesterday with an updated version of the respective Office apps. But if you have an older printer that doesn't support AirPrint, Apple's universal print driver for iOS, you may be out of luck, or at the very least you'll have to find a workaround.

Microsoft said that adding the ability to print Office files from their iPads was the No. 1 request among the 12 million customers who have downloaded the new app, though many wonder why it was left out in the first place. An update to the app, available in Apple's iTunes App Store, lets Office 365 subscribers with at least the $6.99-per-month Personal Subscription, which recently went live, print their documents. You must update each Office app (Word, Excel and PowerPoint).

The update takes a few minutes if you have a good wireless connection. Once you open a document, spreadsheet or presentation, all you need to do is touch the File icon, Print and Select Printer. Upon doing so, I quickly discovered it couldn't find either of my two printers, which both support Wi-Fi. The error message read "No AirPrint Printers Found," as seen in Figure 1.

Figure 1. My iPad was not able to detect my older Wi-Fi-enabled printers.

Both printers are at least five years old and when I called the manufacturer, Brother, the technician said only newer printers have firmware that support AirPrint. If you have an enterprise printer that supports firmware upgrades you might have more success. If you have a printer that can't support AirPrint firmware upgrades, printing files will be difficult (if not impossible).

Apple created AirPrint as an alternative to requiring printer vendors to develop drivers. According to a Microsoft spokeswoman, AirPrint works "with thousands of printers." To see if your printer supports AirPrint, Apple posted a list and other tips. For its part, Brother offers its own iPad printing tool called iPrint & Scan -- though you can't print Office documents from it directly either. However if you use Brother's iPrint & Scan Web interface (an app also in the Apple App Store) and log into OneDrive, you can open your file and print it. It's not the most elegant approach but it works.

The new Office app for the iPad has some other added features, including AutoFit for Excel, which lets users adjust the height of multiple rows or the width of multiple columns simultaneously in a spreadsheet. This feature is designed to let users spruce up the appearance of their spreadsheets and ensure that no content is hidden. For PowerPoint users, Microsoft added SmartGuides, which help users align pictures, shapes and textboxes when moved around on a slide. The app update also includes some bug fixes.

Posted by Jeffrey Schwartz on 04/30/2014 at 1:06 PM0 comments


Nadella Makes a Good Impression on Wall Street

In the first quarterly earnings report since taking over as the third CEO in Microsoft's 39-year history, Satya Nadella appeared on a conference call with Wall Street analysts. While it helped that Microsoft earnings beat estimates, his debut with financial analysts was significant in that he took questions from those who will play a key role in the company's stock valuation.

Microsoft's actual performance under the new CEO's reign of course will ultimately determine how investors value Microsoft. But helping analysts understand Nadella's vision is an important first step. There were no bombshells but Nadella helped guide the analysts on how he intends to monetize the company's "mobile first, cloud first" transition. Nadella's appearance on the call didn't mark a changing of the guard, per se. However, his predecessor Steve Ballmer rarely appeared on quarterly earnings calls.

It's unclear whether Nadella will show up routinely. But it wouldn't be surprising if he becomes a regular on the calls, as his ability to help lift Microsoft's stock price -- virtually flat during Ballmer's 13-year reign -- will be a key measure of his success. While Microsoft's performance and growing share in existing and new markets is table stakes, Wall Street never warmed up to Ballmer despite consistent revenue and profit growth. Nadella also needs to convince analysts that he was the right choice, even as Wall Street was pushing for a seasoned CEO such as Ford chief Alan Mulally or Qualcomm COO (now its CEO) Steve Mollenkopf.

Asked if he has any major strategic changes in the works, Nadella said the company will always be in a state of transition and ready to react to rapid shifts in market demand. "One of the things I strongly believe in is you're planning on a continuous basis, you're executing on a continuous basis," he said. "It's not episodic. The only way we're going to succeed here is by having this notion that you're planning all the time and you're also making the changes to your plans based on the changed circumstances. And I think that's the way you run a company like ours in a marketplace as dynamic as ours."

Nadella said Microsoft needs to continually build and buy new capabilities, and expressed confidence the company can do so, starting with the $7.2 billion acquisition of Nokia, which officially closed today.

Among the noteworthy questions he fielded was whether and how Microsoft can make the transition from a company that relied on one-time license fees to a subscription model -- a change facing all major software providers. Nadella, with CFO Amy Hood at his side, pointed to the growth of Office 365 as a leading indicator. Microsoft added 1.1 million Office 365 subscribers in the last three months, bringing the total to 4.4 million. The company also shared an interesting data point that came up last week during its hybrid cloud webinar: 25 percent of enterprises worldwide are using Office 365.

"We are well on our way to making that transition from moving to pure licenses to long-term contracts as well as a subscription model," Nadella said. "This is a gold rush from being able to capitalize on the opportunity. When it comes to that we have some the broadest SaaS solutions and the broadest platform solutions. That combination of those assets doesn't come often."

To that point, he told the Street what it wants to hear: "What matters to me in the long run is the magnitude of the profits we generate, given a lot of categories that are going to be merged as this transition happens. We have to actively participate in it and drive profit growth."

Posted by Jeffrey Schwartz on 04/25/2014 at 10:15 AM0 comments


Tech & Tools Watch, April 25: Metalogix, LogMeIn, LSI 

Now that Microsoft has reassured customers that it will continue to offer new releases of SharePoint for on-premises implementations, deployments of SharePoint 2013 are on the rise. That's the assessment of several experts, including Metalogix CEO Steven Murphy. "They did an excellent job of clarifying their position that there are in fact two worlds -- on prem and in the cloud," Murphy said. "They will be maintaining both and that's a huge clarification."

The uptick isn't off the charts, he acknowledged, but Metalogix says it's seeing increased demand for its migration and data protection tools for SharePoint. Metalogix this week debuted a new release of its Content Matrix migration tool. Murphy said the new Content Matrix 7 offers substantially improved performance, plus more fine-grained permissions that let IT delegate certain functions -- such as reorganizing content, handling bulk metadata, and copy and move operations -- to trusted business users. It also supports the movement of legacy SharePoint code to SharePoint 2013 and provides Office 365 support for hybrid implementations.

LogMeIn Upgrades Join.me for Enterprise IT
The popular Join.me is best known as the free screen-sharing tool that many IT administrators use to troubleshoot remote PC users, as well as for conducting online meetings. Now its parent company, LogMeIn, is offering an enterprise version, which the company said will be easier to manage, customize and deploy. LogMeIn claims Join.me is used by tens of millions of users, with over 1 million first-time users. The enterprise release will support deployments of more than 25 users and single sign-on via Active Directory Federation Services and Active Directory sync, and will include advanced user policies and permissions for both groups and individuals, better user access controls, 100GB of managed online storage for sharing and recording meetings, and Outlook integration. Join.me enterprise subscriptions will cost $19 per user per month.

Accelerator Cards from LSI Boost Windows Flash Array
As noted in my blog earlier in the week, Windows Storage Server 2012 R2 is optimized to improve performance thanks to Microsoft's collaboration with Violin on the Windows Flash Array. This technology is ideal for improving SQL Server performance, proponents say.

For its part LSI yesterday said its LSI Nytro flash accelerator cards, which list for $3,495, further boost the performance of SQL Server 2014. Like Violin, LSI collaborated with Microsoft to accelerate SQL Server 2014 database transactions by improving I/O performance and thereby minimizing bottlenecks. In addition, the LSI Nytro flash cards reduce latency and boost throughput and reliability by taking advantage of the new SQL Server 2014 Buffer Pool Extension (BPE) capability, the company said.  LSI says BPE functions as a level-two (L2) cache, with the main buffer pool functioning as a level-one (L1) cache. Microsoft in a statement said it tested the cards with SQL Server 2014 and endorsed LSI's claims.
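For reference, BPE itself is enabled with a single T-SQL statement pointing at an SSD-backed file. Here's a minimal sketch run through the SQL Server PowerShell module, with the instance name, file path and size purely illustrative:

# Turn on SQL Server 2014's Buffer Pool Extension against an SSD-backed volume.
Invoke-Sqlcmd -ServerInstance "SQL01" -Query @"
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = N'F:\SSDCACHE\SQLBPE.BPE', SIZE = 64 GB);
"@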

Posted by Jeffrey Schwartz on 04/25/2014 at 1:28 PM0 comments


Microsoft Prepping New Windows Server Flash Array Options

Flash storage is one of the fastest-growing new datacenter technologies these days and while critics warn it can cost a lot, proponents say it can vastly improve performance and reduce operational and capital expenses.

With the release of Windows Server 2012 R2, and more specifically Windows Storage Server 2012 R2, Microsoft is testing the limits of flash storage. Violin, a rapidly growing startup that went public last year, and Microsoft codeveloped the new Windows Flash Array. It's a converged storage-server appliance that includes every component of Windows Storage Server 2012 R2, including SMB Direct (SMB 3.0 over RDMA), and is powered by dual Intel Xeon E5-2448L processors.

The two companies spent the past 18 months developing the new 3U dual-cluster arrays that IT can use as networked-attached storage (NAS), according to Eric Herzog, Violin's new CMO and senior VP of business development. Microsoft wrote custom code in Windows Server 2012 R2 and Windows Storage Server 2012 R2 that interfaces with the Violin Windows Flash Array, Herzog explained. The Windows Flash Array comes with an OEM version of Windows Storage Server.

"Customers do not need to buy Windows Storage Server, they do not need to buy blade servers, nor do they need to buy the RDMA 10-gig-embedded NICs. Those all come prepackaged in the array ready to go and we do Level 1 and Level 2 support on Windows Server 2012 R2," Herzog said.

Based on feedback from 12 beta customers (which include Microsoft), the company claims its new array has double the write performance with SQL Server of any other array, with a 54 percent improvement when measuring SQL Server reads, a 41 percent boost with Hyper-V and 30 percent improved application server utilization. It's especially well-suited for any business application using SQL Server and it can extend the performance of Hyper-V and virtual desktop infrastructure implementations. It's designed to ensure latencies of less than 500 microseconds.

Violin is currently offering a 64-terabyte configuration with a list price of $800,000. Systems with less capacity are planned for later in the year. It can scale up to four systems, which is the outer limit of Windows Storage Server 2012 R2 today. As future versions of Windows Storage Server offer higher capacity, the Windows Storage Array will scale accordingly, according to Herzog. Customers do need to use third-party tiering products, he noted.

Herzog said the two companies will be giving talks on the Windows Flash Array at next month's TechEd conference in Houston. "Violin's Windows Flash Array is clearly a game changer for enterprise storage," said Scott Johnson, Microsoft's senior program manager for Windows Storage Server, in a blog post. "Given its incredible performance and other enterprise 'must-haves,' it's clear that the year and a half that Microsoft and Violin spent jointly developing it was well worth the effort."

Indeed, nothing has curbed investor enthusiasm for enterprise flash. The latest round of high-profile investments goes to flash storage array supplier Pure Storage, which today bagged another round of venture funding.

T. Rowe Price and Tiger Global, along with new investor Wellington Management, added $275 million in funding to Pure Storage, which the company says gives it a valuation of $3 billion. But as reported in a recent Redmond magazine cover story, Pure Storage is in a crowded market of incumbents, including EMC, IBM and NetApp, that have jumped on the flash bandwagon, as well as quite a few newer entrants including FlashSoft (recently acquired by SanDisk), SolidFire and Violin.

Posted by Jeffrey Schwartz on 04/23/2014 at 1:19 PM0 comments


Rumor: Microsoft To Rename Nokia's Device Business 'Microsoft Mobile'

Microsoft's deal to finalize its acquisition of the Nokia Devices and Services business is set for this Friday, April 25, with the Nokia branch rumored to be renamed "Microsoft Mobile."

According to the Web site Ubergizmo, the Nokia handset and services business will remain headquartered in Finland under the new name Microsoft Mobile Oy. As my friend Mike Elgan pointed out, Oy is the equivalent of LLC or Corp. "It's also Yiddish for 'ouch,' but it's likely Microsoft has the Finnish one in mind," Elgan noted. Microsoft isn't commenting on the report. "We have confirmed the acquisition will be completed on April 25," according to a spokeswoman for Microsoft. "At that time we will begin the work of integration."

Also, before the merger becomes official, the terms of the $7.2 billion deal, announced last summer, have been changed.

Though the changes are nothing major, 21 Nokia employees in China who were slated to remain with Nokia will now join Microsoft. Since it was China that held up the deal last month, perhaps these terms were added to appease all parties? The employees work on mobile phones. Microsoft will also now manage the Nokia.com domain and its social media sites for up to a year, and will no longer acquire Nokia's manufacturing facility in Korea.

"The completion of this acquisition follows several months of planning and will mark a key step on the journey towards integration," said Microsoft Chief Counsel Brad Smith in a blog post Monday. "This acquisition will help Microsoft accelerate innovation and market adoption for Windows Phones. In addition, we look forward to introducing the next billion customers to Microsoft services via Nokia mobile phones."

Microsoft has a lot riding on that integration. The deal was long championed by former CEO Steve Ballmer, who recently admitted his biggest regret was missing the mobile wave. The deal involved drawn-out negotiations which originally lacked the support of Founder Bill Gates and current CEO Satya Nadella.

It remains to be seen whether acquiring Nokia's devices and services business turns out to be the savior for Windows Phone and Microsoft's tablet ambitions or what ultimately does it in.

Posted by Jeffrey Schwartz on 04/21/2014 at 3:28 PM0 comments


Don't Look for Azure To Be Built on OpenStack

If you were wondering whether the Microsoft Azure service would ever become an OpenStack cloud, it looks unlikely anytime soon based on statements by company officials Thursday.

Perhaps you never thought that was in the cards anyway, but given Microsoft's more welcoming approach to open source, I've always wondered what the future held for OpenStack on Azure. I usually get blank stares when I raise the issue.

But Microsoft doesn't believe there are any OpenStack clouds that come near the size and scale of Azure, or the services offered by Amazon or Google, said Corporate VP Brad Anderson, answering a question during a company-presented webinar -- the first of its new Hybrid Cloud Series -- held in Redmond (see Kurt Mackie's recap of the presentation here). The hour-long talk is now available on demand.

"I hear the conversation -- is OpenStack delivering this promise of public, hosted and private and I would argue there's not a global public cloud that's built on OpenStack today," Anderson responded. "If you look at these public cloud organizations -- us, Google and Amazon -- none of us have built on OpenStack. And we're the only one of those three that has this promise and a proven track record of taking everything that we're doing in the public cloud and then delivering it across... a hybrid model."

While Rackspace may beg to differ, IBM and Hewlett Packard are among those offering OpenStack-based public clouds. But both are still works in progress. At the same time, OpenStack, like Azure, is designed to run Windows Server instances and Hyper-V virtual machines. The promise of OpenStack, however, is that customers can move their workloads to other OpenStack clouds. Microsoft counters that customers can do the same with in-house Windows Server private clouds, hosting providers that support Microsoft's Cloud OS (few as those may be at this time) and Azure. It's safe to say the OpenStack community wouldn't see that as a valid comparison.

The question came up just as the OpenStack Foundation this week released its semi-annual distribution, called Icehouse, which has 350 new features and targets better scalability for enterprise workloads. Members of the OpenStack community from various companies have consistently described Microsoft as an active participant in committees when it comes to ensuring Hyper-V works well in OpenStack clouds.

Despite questioning the reach of OpenStack, Anderson reiterated the company's commitment to integrating with it. "OpenStack is going to be used in a number of different places so we want to also integrate with OpenStack," he said. "If an organization has made a decision that they're going to use OpenStack, it's a lot like Linux. If I go back and look at Linux 10 years ago, we embraced Linux with System Center. We've got an awful lot of Linux. We look at the number of VMs that are running inside of Azure that are Linux-based, and that's a significant number. We'll do the work on OpenStack to make sure Hyper-V in the Microsoft cloud is a first-class citizen. We will continue that work."

While Anderson was playing both sides, in an ironic sort of way, so was Canonical Cloud Product Manager Mark Baker, whom I chatted with earlier in the week about the company's release of Ubuntu Linux 14.04. Canonical is a major OpenStack participant and Baker claims Ubuntu is a widely used Linux distribution on OpenStack clouds today. At the same time, Baker said that besides Amazon, Microsoft Azure is one of the fastest-growing alternatives when it comes to deployments of Ubuntu.

"Even through people may find it surprising, we have a great working relationship with Microsoft and the Azure team," Baker said. "We see that as one of the fastest-growing clouds, and Ubuntu is growing fast on that."

Regardless, a number of major organizations are using OpenStack clouds, including Samsung, Netflix, Time Warner, Best Buy and Comcast, according to Baker, who acknowledged most are tech-centric enterprises today.

While Anderson didn't actually say whether Azure will ever support OpenStack, his sizing up of the project didn't make it sound imminent. Do you agree with his assessment of OpenStack or is he underestimating it?

Posted by Jeffrey Schwartz on 04/18/2014 at 12:33 PM0 comments


Tech & Tools Watch, April 18: Riverbed, BMC, Netwrix

With so many tools released for IT pros every week, many of them often go under the radar. Looking to address that, I thought it would be a good idea to offer regular roundups with the latest bits of product and technology news. Here in the Schwartz Report, we'll call it "Tech and Tools Watch." Without further ado, here's our first installment:

Riverbed Improves Branch Office Converged Infrastructure
Riverbed has turned granite into steel. The company this week relaunched its Granite Solution, a converged infrastructure appliance it introduced two years ago, with the new name SteelFusion. But the change is more than cosmetic. The new SteelFusion 3.0 offers a six-fold improvement in performance and a three-fold improvement in capacity -- up to 100 terabytes, the company said.

Long known for its branch office WAN optimization hardware, Riverbed's SteelFusion brings the same concept in the form of converged appliances for remote locations. The SteelFusion branch office appliance provides converged compute, storage, networking and virtualization features different from typical converged appliances from the likes of Cisco, Dell and HP. The Riverbed offering stores data centrally at the datacenter or headquarters location and streams it to the branch office rather than storing data at each remote location.

"Data belongs in the datacenter, which is why it's called a datacenter," said Riverbed Director of Technology Rob Whitely. "When putting these in the branch, now I can run my services locally at the branch but I don't want data residing at the branch where it's subject to theft, corruption or downtime."

Riverbed said the new release also offers improved integration with EMC and NetApp SANs, with support for NetApp cluster mode and EMC VNX2 snapshots. It also has improved backup and recovery support with a new recovery agent and an enhanced scale-out architecture, and is better suited for VDI and CAD/CAM-type implementations.

BMC Software's CLM Tool Targets Microsoft Azure
BMC Software's Cloud Lifecycle Management (CLM) software now supports migration to Microsoft Azure. The company said its new 4.0 release, due for GA in early June, makes it simple to migrate from VMware-based clouds to the Microsoft Azure infrastructure-as-a-service platform (IaaS). BMC said the new release of its CLM tool will let IT manage services delivery, operations, planning and compliance via different public cloud service providers' infrastructures from a single management platform and interface.

Workloads designed to run in VMware environments can be easily redirected to Microsoft Azure, according to Steven Anderson, BMC's principal product manager. "You can specify the application stacks, the networking necessities, storage necessities and all those various aspects," he said. "Those parts go into the blueprints and can remain essentially the same. So all you have to do is change the blueprint and point to a different OS image, whatever the image ID is for the platform you're interested in and you can deploy the new instances of those workloads on the new platform in very little time at all."

The CLM tool can integrate through Microsoft's System Center Virtual Machine Manager. Anderson said that while Amazon is by far the most widely deployed cloud, the company is seeing increased usage of Microsoft Azure as well.

Netwrix Survey: IT Pros Admit Undocumented Changes
Netwrix this week released the results of a survey it commissioned which found that more than half (57 percent) of all IT pros surveyed admit they have made undocumented changes that no one else is aware of. These changes put organizations at risk for downtime and security breaches, according to Netwrix, which supplies the Netwrix Auditor for tracking and managing changes.

The company surveyed 577 IT pros for its "2014 State of IT Changes" report. The study shows these changes caused services to stop for 65 percent of those surveyed. It also found the undocumented changes led to daily or weekly downtime (52 percent) and were the root cause of security breaches (39 percent), and that 62 percent of the changes were unauditable. Only 23 percent said they have an auditing or change management solution in place.

That's good news for Netwrix, which last month made its auditing solutions available as specific modules. The lineup now includes standalone offerings under the Auditor brand for Active Directory, file servers, Exchange, SQL Server, Windows Server and VMware.

Posted by Jeffrey Schwartz on 04/18/2014 at 1:02 PM0 comments


Are Business Leaders Seizing Your IT Budgets?

It's no secret that the ease of procuring various cloud-computing applications and infrastructure services and the BYOD trend have impacted IT organizations' influence. Now a survey released yesterday suggests business leaders are broadly seizing influence over IT decisions from CIOs and enterprise IT decision makers.

More than one-third of IT decisions are made by business leaders who don't report to the CIO, according to the survey released by Avanade, a joint venture of Microsoft and Accenture focused on the deployment and support of Microsoft technologies. The survey of 1,003 business and IT executives shows that 79 percent believe business leaders are better equipped to make technology decisions.

This shift means IT organizations are becoming "service brokers," according to Avanade. Under this model, IT organizations consult with the business units to determine their needs and goals. Already 35 percent of IT organizations have transitioned to this service-broker model, according to the survey.

Despite the new shift in control, the survey shows that the vast majority of business leaders (83 percent) still have confidence in IT staff interacting with key stakeholders as consultants and 66 percent plan to expand the role of technologists in becoming business advisors in the coming year. To enable this transition, business leaders are turning to IT organizations to partner with them. The survey found that 44 percent of business leaders are looking to enhance their cloud computing skills and 43 percent are looking to work with IT on systems integration.

This shift has not come without pain. "The tilting balance of control over technology decisions and budget has created a real tension between IT and the business and requires IT to rethink its approach, learn new skills and grow its influence," said Mick Slattery, Avanade executive vice president of Global Service Lines, in a statement. "Forward-looking companies are positioning their IT staff as business advisors and see IT contributing more to accomplishing objectives, and driving positive business results than ever before."

Nevertheless, IT organizations are for the most part (71 percent) cooperating with this shift, according to the survey.

Are the lines of business seizing your IT budget? If so how much tension has this created in your organization?

Posted by Jeffrey Schwartz on 04/16/2014 at 11:59 AM0 comments


The Switch Is On: Are You Moving to Hyper-V?

Nearly six months after Microsoft shipped Windows Server 2012 R2, a growing number of IT pros and third parties believe the Hyper-V hypervisor is ready for prime time and say it's now practical to use it for business-critical workloads.

Before the current release of Hyper-V 3.0, that wasn't the case. While Hyper-V was suitable for various workloads, most enterprises were reluctant to use it for heavy duty virtualization. And even those that were using it certainly weren't displacing existing hypervisors, especially VMware's ESX.

It's not that many shops weren't intrigued by the thought of using Hyper-V, which Microsoft has offered free of charge with Windows Server since 2008. It's just that it lacked the robustness and management capabilities offered by VMware. Many say that while VMware still has a technical edge over Hyper-V, the gap has narrowed to the point that Hyper-V is suitable for a growing number of mainstream use cases. It's even more appealing for those considering Microsoft's hybrid cloud strategy, called Cloud OS, which makes it easier to bridge Windows Server to Azure using Hyper-V.

Microsoft this week moved to make it easier to migrate VMware infrastructure to Hyper-V with the release of Virtual Machine Converter 2.0. The free tool lets IT pros migrate VMware-based virtual machines and virtual disks to Hyper-V-based VMs and virtual hard disks (VHDs).

"Virtual machine migration is an increasing priority for many customers as more and more are exploring and evaluating the Microsoft platform against their existing VMware installed base," the company said in a blog post from its server and cloud team. "Whether it's virtual to virtual (from one hypervisor to another) or physical to virtual, migration provides customers a path for consolidation of workloads and services, and the foundation for cloud."

The new VM Converter 2.0 supports vCenter and ESX 5.5, VMware virtual hardware versions 4 through 10, and Linux guest OS migration, including CentOS, Debian, Oracle, Red Hat Enterprise, SUSE Enterprise and Ubuntu. Microsoft also pointed to two new features. The first is an on-premises VM to Azure VM conversion capability, which lets IT pros migrate their VMware VMs directly to Azure. The second is a PowerShell interface for scripting and automation support, letting IT pros automate migration processes with workflow tools including System Center Orchestrator, among others, Microsoft said.

Microsoft also said MVMC 3.0, slated for this fall, will add physical-to-virtual (P2V) machine conversion for supported versions of Windows.

Do you plan to make the switch or are you sticking with VMware (or looking at KVM or other alternative hypervisors)? Share your views in the comment section below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/11/2014 at 11:23 AM0 comments


Windows XP: October 25, 2001-April 8, 2014

Microsoft yesterday issued its final patch for Windows XP and Office 2003. The operating system, arguably the most popular version of Windows ever, is now officially dead (though it's still a long way from the grave). It still lives on millions of PCs and it is well documented that many of them will continue to run the dead OS indefinitely.

Because Microsoft issued the last patch only yesterday, nothing bad is likely to happen imminently. It will take many weeks and months before it is clear which vulnerabilities are exploited and how severely they impact users.

Some expect little of consequence to happen, while others say those keeping their Windows XP-based PCs will face major problems. For example, Jason Kennedy, a business product marketing director at Intel, told me this week that he's concerned that many unsuspecting users, especially those at small and medium-sized businesses, are headed for disaster.

"I unfortunately expect many of the bad people who are crafting malware or identity theft opportunities have been lying in wait for some time after April 8," Kennedy said. "I do believe sometime after the deadline those attacks will be unleashed. And people will suffer. I hope it's not severe but I expect there will be problems as a result of not taking the threats serious enough and not taking steps to mitigate."

Given the obvious fact that Intel has a vested interest in users moving off Windows XP, since most will have to buy new PCs (with new processors), you may take that with a grain of salt. On the other hand, no one knows what vulnerabilities will surface.

While it remains to be seen if such a dire event happens, those who've decided to stick with Windows XP have made their decisions and are ready to live with the consequences. If you back up your data, chances are the worst that will happen is you'll have to buy a new PC or some other device.

Perhaps you'll give up on Windows altogether? That's what Google, VMware and even Citrix are urging business customers to do. "Many businesses are in a tough spot," Amit Singh, president of Google Enterprise, said in a blog post. "Despite 'significant' security and privacy risks, legacy software or custom-built apps have held businesses back from migrating in time for today's XP support deadline. Companies in this position now find themselves at a timely crossroads. It's time for a real change, rather than more of the same."

Google and VMware teamed up yesterday to announce they will take $200 off Google Chromebooks for Business with VMware Horizon DaaS. The two companies last month announced a pact to bundle the two offerings and this looks to sweeten the deal. Google is offering $100 off Chromebooks for each managed device purchased for a company, and Citrix is offering 25 percent off its Citrix XenApp Platinum Edition, which includes AppDNA, its Windows XP migration acceleration tool.

Windows XP may be dead but as rivals pick at the carcass, it's a long way from being buried.

Posted by Jeffrey Schwartz on 04/09/2014 at 1:24 PM0 comments


Are You Ready for the Last Windows XP Patch?

Tomorrow represents a milestone for many PC and Exchange administrators. It's the long-dreaded day when Microsoft will issue its last patch for Windows XP, Exchange 2003 and Office 2003 (which, of course, includes Outlook). It's also an important day because Microsoft will issue the Windows 8.1 Update.

As reported in this month's Redmond magazine cover story, 23 percent of polled readers will keep their Windows XP-based systems running indefinitely. Only 28 percent of you have completed your migrations or have no Windows XP-based machines left. Even though tomorrow is the end for Windows XP, barring any unexpected events, the day will likely come and go without incident -- though you won't be able to avoid hearing about it if you're watching the evening news or listening to the radio.

Nevertheless, Windows XP systems will be around for the foreseeable future as they slowly fade over time. In the meantime, if you're just using Windows XP for PC apps and are not connecting to the Internet, you shouldn't have any problems. For those still connected, it's advisable to remove the default administrative privileges, enable memory and buffer overflow protection and enable application whitelisting for zero-day vulnerability protection, as noted by security software supplier McAfee.
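On that first point, one quick way to spot accounts that are still running with administrative rights is to check from a script. Here's a minimal sketch in Python (assuming Python is available on the box; IsUserAnAdmin is a standard Windows shell API exposed through ctypes), not an official McAfee or Microsoft tool:

```
# Minimal sketch: flag whether the logged-in user still has admin rights --
# the first mitigation on McAfee's list above. Assumes Python is installed;
# IsUserAnAdmin() is a standard Windows shell API reached through ctypes.
import ctypes
import sys

def is_admin():
    try:
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    except AttributeError:
        return False  # not running on Windows, or the API is unavailable

if __name__ == "__main__":
    if is_admin():
        print("WARNING: this account has administrative rights; "
              "switch day-to-day use to a limited user.")
        sys.exit(1)
    print("OK: running as a limited user.")
```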

For many organizations, upgrading Windows XP PCs is not a simple task, especially for those with apps that can't run on newer versions of Windows. While there are many remedies -- rebuilding apps, using third-party tools or desktop virtualization/VDI -- all come with a cost, and some organizations simply don't see a need to change OSes. Others do, but are simply going to let the deadline pass and either pay extra for support or take other measures -- or perhaps just cross their fingers.

Redmond columnist Greg Shields put it best. In last month's Windows Insider column, he compared replacing Windows XP-based PCs to replacing an aging bridge. "Fixing a bridge or replacing it entirely is an inconvenient activity," he wrote. "Doing so takes time. The process often involves scheduled setbacks, cost overruns and incomprehensible activities that are tough to appreciate when you're idling in construction traffic."

While tomorrow represents the end for Windows XP, Microsoft will issue its Windows 8.1 Update that comes with a more mouse-friendly Start Screen, the ability to pin Windows Store apps to the task bar and APIs that are shared with the forthcoming Windows Phone 8.1.

Right now only a small handful of enterprises are moving to Windows 8.1. But as Microsoft makes more progress in blending the old with the new, perhaps the aversion to moving to the newest version of Windows will subside.


Posted by Jeffrey Schwartz on 04/07/2014 at 12:40 PM0 comments


Microsoft Aims Azure at Dev-Ops and Open Source Communities

Typically when I talk to experts about the public cloud, the usual refrain is that there's Amazon Web Services ... and then there's everyone else. When it comes to everyone else, Microsoft Azure is among the leading players with 12 datacenters now in operation around the globe including two launched last week in China. And with 16 additional centers planned by year's end and 300 million customers, the company has strong ambitions for its public cloud service.

At the Build conference in San Francisco this week, Microsoft showed how serious it is about advancing the appeal of Azure. Scott Guthrie, Microsoft's newly promoted executive VP for cloud and enterprise, said Azure is already used by 57 percent of the Fortune 500 companies and has 300 million users (most of them enterprise users registered with Active Directory). Guthrie also boasted that Azure runs 250,000 public-facing Web sites, hosts 1 million SQL databases and 20 trillion objects in the Azure storage system, and processes 13 billion authentications per week.

Guthrie claims that 1 million developers have registered with the Azure-based Visual Studio Online service since its launch in November. This would be great if the vast majority have done more than just register. While Amazon gets to tout its major corporate users, including its showcase Netflix account, Guthrie pointed to the scale of Azure, which hosts the popular Titanfall game that launched last month for the Xbox gaming platform and PCs. Titanfall kicked off with 100,000 virtual machines (VMs) on launch day, he noted.

Guthrie also brought out NBC executive Rick Cordella to talk about Azure's hosting of the Sochi Olympic Games in February. More than 100 million people viewed the online service, with 2.1 million concurrently watching the men's United States vs. Canada hockey match, which was "a new world record for HD streaming," Guthrie said.

Cordella noted that NBC invested $1 billion in this year's games and said it represented the largest digital event ever. "We need to make sure that content is out there, that it's quality [and] that our advertisers and advertisements are being delivered to it," he told the Build audience. "There really is no going back if something goes wrong," Cordella said.

Now that Azure has achieved scale, Guthrie and his team have been working on rolling out a bevy of significant enhancements aimed at making the service appealing to developers, operations managers and administrators. As IT teams move to a more dev-ops model, Microsoft is taking that into consideration as it builds out the Azure service.

Among the Infrastructure as a Service (IaaS) improvements, Guthrie pointed to the availability of auto-scaling as a service, point-to-site VPN support, dynamic routing, subnet migration, static internal IP addressing and Traffic Manager for Web sites. "We think the combination of [these] really gives you a very flexible environment, a very open environment and lets you run pretty much any Windows or Linux workload in the cloud," Guthrie said.

Azure is a more flexible environment for those overseeing dev-ops thanks to new support for configuring VM images with Puppet and Chef, the popular configuration management and automation tools used on other services such as Amazon and OpenStack. IT can also now use Windows PowerShell and VSD tools.

"These tools enable you to avoid having to create and manage lots of separate VM images," Guthrie said. "Instead, you can define common settings and functionality using modules that can cut across every type of VM you use."

Perhaps the most significant criticism of Azure is that it's still a proprietary platform. In a move to shake that image, Guthrie announced a number of significant open source efforts. Notably, Microsoft made its "Roslyn" compiler and other components of the .NET Framework open source through the aptly titled .NET Foundation.

"It's really going to be the foundation upon which we can actually contribute even more of our projects and code into open source," Guthrie said of the new .NET Foundation. "All of the Microsoft contributions have standard open source licenses, typically Apache 2, and none of them have any platform restrictions, meaning you can actually take these libraries and you can run them on any platform. We still have, obviously, lots of Microsoft engineers working on each of these projects. This now gives us the flexibility where we can actually look at suggestions and submissions from other developers as well and be able to integrate them into the mainline products."

Among some other notable announcements from Guthrie regarding Azure:

  • Revamped Azure Portal: Now available in preview form, the new portal is "designed to radically speed up the software delivery process by putting cross-platform tools, technologies and services from Microsoft and its partners in a single workspace," wrote Azure General Manager Steven Martin in a blog post. "The new portal significantly simplifies resource management so you can create, manage, and analyze your entire application as a single resource group rather than through standalone resources like Azure Web Sites, Visual Studio Projects or databases. With integrated billing, a rich gallery of applications and services and built-in Visual Studio Online you can be more agile while maintaining control of your costs and application performance."
  • Azure Mobile Services: Offline sync is now available. "You can now write your mobile back-end logic using ASP.NET Web API and Visual Studio, taking full advantage of Web API features, third-party Web API frameworks, and local and remote debugging," Martin noted. "With Active Directory Single Sign-on integration (for iOS, Android, Windows or Windows Phone apps) you can maximize the potential of your mobile enterprise applications without compromising on secure access."
  • New Azure SDK: Microsoft released the Azure SDK 2.3, making it easier to deploy VMs and sites.
  • Single Sign-on to Software as a Service (SaaS) apps via Azure Active Directory Premium, now generally available.
  • Azure now includes one IP address-based SSL certificate and five SNI-based SSL certs at no additional cost for each site instance.
  • The Visual Studio Online collaboration as a service is now generally available and free for up to five users in a team.
  • While Azure already supports .NET, Node.js, PHP and Python, it now supports the native Java language thanks to its partnership with Oracle that was announced last year.

My colleague Keith Ward, editor in chief of sister site VisualStudioMagazine.com, has had trouble in the past finding developers who embraced Azure. He now believes that could change. "Driving all this integration innovation is Microsoft Azure; it's what really allows the magic to happen," he said in a blog post today. Furthermore, he tweeted: "At this point, I can't think of a single reason why a VS dev would use Amazon instead of Azure."

Are you finding Azure and the company's cloud OS hybrid platforms more appealing?

Posted by Jeffrey Schwartz on 04/04/2014 at 8:15 AM0 comments


Nadella Strikes Balance Between Universal Windows and a Multiplatform World

Microsoft opened its Build conference for developers with a keynote that focused on the company's attempts at breathing new life into its struggling Windows franchise while simultaneously embracing interoperability with other platforms.

In addition to unveiling its intelligent voice assistant planned for Windows Phone 8.1 and announcing the Windows 8.1 Update, Microsoft's top executives talked of progress toward unifying its operating system across PCs, tablets, phones and its Xbox gaming platform. The company has lately described this, along with efforts to extend to open source and competing platforms, as a "universal Windows."

Underscoring the progress Microsoft has made toward that effort, Microsoft's new CEO Satya Nadella said that 90 percent of its APIs are now common and this should remove some of the barriers to developing for the various system types. "That's fantastic to see," Nadella told the 5,000-plus attendees at the event, held in San Francisco. He said Microsoft will continue to push for a "shared library across a variety of device targets."

Those device targets won't be limited to traditional hardware. Terry Myerson, executive vice president for Microsoft's operating system group, described the company's ambitions for its current-generation operating system, which runs on only a small share of tablets and is still not favored by most PC users.

Those ambitions include not only making Windows tools and frameworks more broadly available but extending them to new types of devices -- including the so-called "Internet of things," which can encompass anything from a piano, as demonstrated, to telemetry components equipped with Intel's x86 system-on-chip called Quark. The component is the size of an eraser, Myerson noted. Such advances will open new opportunities for Windows, he said.

Also in a bid to grow its share of the low-cost tablet and phone market, Myerson emphasized the company's efforts to expand the presence of Windows by making it free to tablet, PC and phone suppliers offering devices with screens of nine inches or less. That promises to take away a key advantage of Android: that it's free.

"We really want to get this platform out there," Myerson said of Windows. "We want to remove all the friction between you and creating these devices."

While Nadella and company are taking steps to expand Windows, they also acknowledged to the company's core audience that it's not going to be a Windows-everywhere world, as evidenced by the long-awaited release of Office for the iPad last week.

One of the biggest barriers to the growth of the modern Windows platform is developers who don't want to be tied to a specific platform. In a move aimed at making their efforts more universal, Corporate Vice President David Treadwell said the company's Windows Library for JavaScript (WinJS) will be cross platform and will be made available to the open source community under the Apache 2 license. It's now available on GitHub.

In a pre-recorded question displayed during the closing of today's presentation, an Android developer asked why he should also develop for Windows. Nadella's answer: "We are the only platform that has APIs with Language bindings across both native, managed and Web. And the fact that that flexibility exists means you can build your core libraries in the language of your choice and those core libraries you can take cross platform. Obviously the Web [is] the one that's easiest to conceptualize and that's what we've done by taking WinJS and putting it into open source and making it a community effort so you can take it cross platform."

Posted by Jeffrey Schwartz on 04/02/2014 at 3:39 PM0 comments


Microsoft Responds to Amazon with Azure Cloud Price Cuts

As predictably as the sun rises, Microsoft yesterday followed Amazon's latest round of price cuts by reducing the rates for its Windows Azure -- rather, Microsoft Azure -- cloud service. (In case you missed it, Microsoft last week shed the Windows name from its cloud service. Hence Windows Azure is now Microsoft Azure.)

Microsoft is cutting the price of its compute services by 35 percent and its storage service 65 percent, the company announced yesterday afternoon. "We recognize that economics are a primary driver for some customers adopting cloud, and stand by our commitment to match prices and be best-in-class on price performance," said Steven Martin, general manager of Microsoft Azure business and operations, in a blog post. The price cuts come on the heels of the company last week expanding Microsoft Azure into China. 

In addition to cutting prices, Microsoft is adding new tiers of service to Azure. On the compute side, a new tier of instances called Basic consists of virtual machine configurations similar to the current Standard tier but won't include the load balancing or auto-scaling offered in the Standard package. The existing Standard tier will now consist of a range of instances from "extra small" to "extra large." Those instances will cost as much as 27 percent less than they do today.

Martin noted that some workloads, including single instances and those using their own load balancers, don't require the Azure load balancer. Also, batch processing and dev and test apps are better suited to the Basic tier, which will be comparable to AWS-equivalent instances, Martin said. Basic instances will be available this Thursday.

Pricing for its Memory-Intensive Instances will be cut by up to 35 percent for Linux instances and 27 percent for Windows Server instances. Microsoft said it will also offer the Basic tier for Memory-Intensive Instances in the coming months.

On the storage front, Microsoft is cutting the price of its Block Blob storage by 65 percent and Geo Redundant Storage (GRS) by 44 percent. Microsoft is also adding a new redundancy tier for Block Blob storage called Zone Redundant Storage (ZRS).

With the new ZRS tier, Microsoft will offer redundancy that stores the equivalent of three copies of a customer's data across multiple locations. GRS by comparison will let customers store their data in two regions that are dispersed by hundreds of miles and will store the equivalent of three copies per region. This new middle tier, which will be available in the next few months, costs 37 percent less than GRS.

Though Microsoft has committed to matching price cuts by Amazon, the company faced a two-pronged attack last week as both Amazon and Google slashed prices -- Google for the first time -- with Google also finally offering Windows Server support. While Microsoft has its eyes on Amazon, it needs to look over its shoulder as Google steps up its focus on enterprise cloud computing beyond Google Apps.

One area where both Amazon and Google have a leg up on Microsoft is their respective desktop-as-a-service (DaaS) offerings. As noted last week, Amazon made generally available its WorkSpaces DaaS offering, which it announced back in November at its re:Invent customer and partner conference. And as reported last month, Google and VMware are working together to offer Chromebooks via the new VMware Horizon DaaS service. It remains to be seen how big the market is for DaaS and whether Microsoft's entrée is imminent.

Posted by Jeffrey Schwartz on 04/01/2014 at 12:54 PM0 comments


Microsoft To Halt Its E-Mail Snooping When Conducting Internal Investigations

Even though Microsoft had strong evidence that a former employee was transmitting stolen code and trade secrets via Hotmail, customers were unnerved to learn the company snooped on the suspect's e-mail account.

As reported two weeks ago, Alex Kibkalo, a former Microsoft architect, was arrested for allegedly stealing trade secrets and leaking Windows 8 code to an unnamed French blogger while working for the company. By delving into his Hotmail account, Microsoft was able to provide evidence to the authorities. Had the suspect been smart enough to use an e-mail service not owned by Microsoft, the company would have needed to get a warrant through law enforcement authorities.

But the law doesn't prohibit the owner of a service from snooping, so Microsoft was within its legal rights to search the suspect's Hotmail account. Nevertheless, Microsoft was well aware it needed to reassure customers it wouldn't take matters into its own hands so blatantly in the future. Due to the public backlash that arose shortly after the incident came to light, Microsoft said that it would turn to a former judge to determine if it had probable cause to look into a suspect's account.

Clearly that wasn't cutting it, since the judge would still be on Microsoft's payroll, and Microsoft again said it would be making a change. Microsoft General Counsel Brad Smith on Friday said the company would turn over any such information to the authorities before taking further action. "Effective immediately, if we receive information indicating that someone is using our services to traffic in stolen intellectual or physical property from Microsoft, we will not inspect a customer's private content ourselves," Smith said in a blog post announcing the change. "Instead, we will refer the matter to law enforcement if further action is required."

Smith pointed out that while the law and the company's terms of service allowed it to access Kibkalo's account, doing so "raised legitimate questions about the privacy interests of our customers." As a result, the company will revise its terms of service and has reached out to The Center for Democracy and Technology (CDT) and the Electronic Frontier Foundation to further discuss the issue of ensuring security without compromising privacy.

It appears this was the right move to take. Does Microsoft's latest move make you feel more comfortable or do you just see it as lip service?

Posted by Jeffrey Schwartz on 04/01/2014 at 12:55 PM0 comments


The Problem with Office for iPad: No Mouse Support

As soon as Microsoft CEO Satya Nadella announced the long-expected release of Office for the iPad last week, I downloaded it on mine. Upon opening Word on the iPad, it displayed all of my documents stored in OneDrive and even sorted them in the order I last accessed them. Frankly, because of how they're organized, Office documents in OneDrive are easier to find and navigate on the iPad than on the Microsoft Surface or Dell Venue 8 Pro using the modern Windows 8.1 interface. Still, because it doesn't have native support for an external mouse, I won't be using Office on the iPad that often.

Even so, if you have an Office 365 subscription and an iPad you should download it and determine how it may best suit your needs. Microsoft optimized Office for the iPad and those that compared it to Apple's own iWork suite believe Microsoft delivered a worthy alternative. But should iPad users who don't have Office 365 subscriptions (the new personal plan costs $69 per year) sign up for it? That depends. Keep in mind if all you want to do is view documents, it's free. Only if you intend to create or edit them do you need an Office 365 subscription. If you don't mind the lack of mouse support, you're set.

But having used a mouse and keyboard with Office for two decades, I've become quite accustomed to working with both, even though I'm open to change and doing things differently. Editors and writers tend to frequently cut and paste phrases, sentences and entire paragraphs, and using a mouse and keyboard has become second nature to me. While I'm not averse to typing with an on-screen keyboard, using touch for cut and paste doesn't work for me. But I never bought an iPad with the intent of writing or editing with it. I may pull up a document or PDF to proof, but that's about it.

I bought the iPad to use other apps, notably to read various publications, access e-mail and connect to the Internet when I'm away from my desk. But if I need to take notes, write or edit away from my office, I use a Windows tablet with its type-style keyboard and the track pad and/or an external Bluetooth mouse.

My understanding is that while iPads support external Bluetooth keyboards, Apple has decided not to support the use of an external mouse on its tablets. Both Android and Windows tablets support them. It's ironic that Apple doesn't, considering it brought the mouse to life with the Macintosh. Clearly Steve Jobs envisioned weaning users off the mouse in favor of the new touch-based world. It remains to be seen whether CEO Tim Cook has a different view.

I think touch is a great evolution of the user experience but not when it comes to writing and editing anything more than a text, e-mail or social media post. Yes there are ways to jailbreak iOS to enable a mouse to work with the iPad. But until iOS natively gains support for a mouse, I have no intention of using my iPad, which I otherwise enjoy, for content creation. Does this make me stodgy and stuck in my ways or am I in good company?

Posted by Jeffrey Schwartz on 03/31/2014 at 12:26 PM0 comments


Protect Your Data on World Backup Day

Today is World Backup Day, created as an independent effort in 2011 on the eve of April Fools' Day under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. The goal is to underscore that it takes a fool not to back up their files regularly. It was founded because many people still don't understand the need to back up their data, and it's aimed at bringing awareness to the issue. While the intent is well grounded, it doesn't seem to have garnered much attention.

As far as I can tell, even some of its sponsors, including Backblaze, LaCie and Western Digital, haven't made much noise about World Backup Day this year. Could it be that most providers of data protection software assume everyone backs up their data? Certainly enterprises do, and the April issue of Redmond magazine looks at improvements to Microsoft's new System Center 2012 R2 Data Protection Manager and how third-party providers of backup and recovery software are offering new ways to replicate server data for specific apps and functions such as SharePoint, SQL Server and VMware-based VMs.

But the majority of people are still careless when it comes to protecting their data. According to the World Backup Day Web site, 30 percent of people have never backed up their data. With 113 phones lost or stolen every minute and 10 percent of computers infected with viruses each month, those 30 percent need to get on board.

If you're one of those who haven't backed up their PCs and devices lately (you know who you are), take the time today to do so. At some point, you'll be glad you did.

Posted by Kurt Mackie on 03/31/2014 at 12:30 PM0 comments


Desktop Virtualization as a Viable XP Migration Option

While virtual desktops represent a small niche of the enterprise client system universe, they're a reasonable option for organizations with PCs still running Microsoft's Windows XP operating system. Unless you've been hiding under a rock, you know Windows XP will shortly lose official support from Microsoft.

As I reported earlier this month, a survey of Redmond magazine readers found that 23 percent will continue to run their Windows XP-based systems after Microsoft releases the final patch for the OS on April 8. And while the survey also showed an overwhelming 85 percent will deploy Windows 7-based PCs and 35 percent will deploy systems running Windows 8 (multiple responses were permitted), 9 percent said they are looking to virtual desktops. That may include some form of VDI or desktop as a service (DaaS).

Evolve IP, a managed services provider with a hosted DaaS offering based on VMware Horizon View, said its own survey showed that 63 percent of respondents will use virtual desktops for at least a portion of their employees. The VDI-as-a-service offering is hosted in the company's cloud, where customers can also host their Active Directory instances to manage users. "It's a good mix for the IT department who needs control, but it's also good because it's not an all-in philosophy," said Scott Kinka, Evolve IP's CTO.

There are a number of solutions from the likes of AppSense, Citrix, Dell/Wyse, HP, NComputing and VMware. Of course, Microsoft's own Remote Desktop Services (RDS) and App-V solutions are viable options as well, either via an MSP or hosted internally. Here's a look at a number of options:

  • AppSense: Using DesktopNow and DataNow, IT can bring together related persona and data to centralize and stream the components to a new desktop. "We don't modify, we just lock down and migrate the settings and other things relative to the application," said Jon Rolls, AppSense vice president of product management.

  • Citrix: With the company's XenDesktop, IT can virtualize Internet Explorer 6 (which can't run on newer operating systems) in a virtual desktop. Likewise, apps that cannot be updated to Windows 7 or Windows 8.1 can run in virtual Windows XP instances.

  • NComputing: The supplier of virtual desktop solutions offers its Desktop and Application Virtualization platform for small and mid-sized businesses looking for more of a turnkey offering. The company plans to further simplify the delivery of virtual solutions with the planned release of its new oneSpace client virtualization platform. The company describes it as a workspace for IT to securely deliver apps and files in BYOD scenarios to any device, including iPads and Android-based tablets. "Users are getting full-featured versions of their Windows applications but we've done our own optimization to allow those apps to be mobile- and touch-friendly," said NComputing's senior director of marketing Brian Duckering. "Instead of using the Windows Explorer experience, we integrated it and unified it so it's Dropbox-like." It's due to hit private beta this spring.

  • Microsoft: Just last week Microsoft took a step toward making it easier for IT to deploy VDI scenarios based on its Remote Desktop Services. Microsoft released the preview of its Virtual Desktop Infrastructure Starter Kit 1.0. As Redmond's Kurt Mackie reported, Microsoft is billing it as something that should not be used for production environments. It's just for testing purposes. The kit "complements" the management console and wizards used with the RDS server role of Windows Server 2012 R2. It comes with apps including Calculator and WordPad for testing virtual desktop access scenarios. The finished Starter Kit product is scheduled for release in the second quarter of this year and the preview is available for download now. Organizations can also pair Microsoft's RD Gateway with Windows Server 2012 to deploy VDI, as explained in a recent article.

Desktop as a Service
This week Amazon Web Services released its WorkSpaces DaaS offering. Amazon first disclosed plans to release WorkSpaces at its re:Invent customer and partner conference in Las Vegas in November. The service will be available with one or two virtual CPUs, either 3.75 or 7.5 GB of RAM and 50 to 100 GB of storage. Per-user pricing ranges from $35 to $75 for each WorkSpace per month. Organizations can integrate the new service with Active Directory.

A wide variety of use cases were tested, from corporate desktops to engineering workstations, said AWS evangelist Jeff Barr in a blog post this week. Barr identified two early testers, Peet's Coffee & Tea and ERP supplier WorkWise. The company also added a new feature called Amazon WorkSpaces Sync. "The Sync client continuously, automatically and securely backs up the documents that you create or edit in a WorkSpace to Amazon S3," Barr said. "You can also install the Sync client on existing client computers (PC or Mac) in order to have access to your data regardless of the environment that you are using."

Google and VMware are also making a big DaaS push. As I reported earlier this month, the two companies teamed up to enable Google Chromebooks to work with VMware's Horizon View offerings.

Things could get really interesting if Microsoft offers its own DaaS service.

Posted by Jeffrey Schwartz on 03/28/2014 at 11:28 AM0 comments


Google Slashes Cloud Pricing, Adds Windows Server Support

Google today gave Microsoft shops a reason to consider its enterprise cloud services by adding Windows Server support and slashing the pricing of its infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) offerings.

At the company's Google Platform Live event in San Francisco, the company also stepped up its effort to extend the appeal of its IaaS and PaaS services to enterprises by introducing a new blend of the two called Managed Virtual Machines, along with an improved big data analytics offering. The company expanded its menu of server operating system instances available with the Google Compute Engine IaaS with the addition of Suse Linux, Red Hat Enterprise Linux and Windows Server.

"If you're an enterprise and you have workloads that depend upon Windows, those are now open for business on the Google Cloud Platform," said Greg DeMichillie, a director of product management at Google. "Our customers tell us they want Windows so of course we are supporting Windows." Back in December when Google's IaaS was first announced, I noted Windows Server wasn't an available option.

Now it is. However, there is a caveat. The company, at least for now, is only offering Windows Server 2008 R2 Datacenter Edition, noting it's still the most widely deployed version of Microsoft's server OS. There was no mention if and when Google will add newer versions of Windows Server. The "limited preview" is available now.

The new Windows Server support was a footnote to an event that emphasized lower and more predictable pricing and a new offering that allows customers to get the best of both the PaaS and IaaS worlds. Given that Amazon Web Services (AWS) and Microsoft frequently slash their prices, Google had little choice but to play the same game. That's especially the case given Amazon's success so far.

"Google isn't a leader in the cloud platform space today, despite a fairly early move in platform as a service with Google App Engine and a good first effort in Infrastructure as a Service with Google Compute Engine in 2013," wrote Forrester analyst James Staten in a blog post. "But its capabilities are legitimate, if not remarkable."

Urs Hölzle, the senior vice president at Google who is overseeing the company's datacenter and cloud services, said 4.75 million active applications now run on the Google Cloud Platform, while the Google App Engine PaaS sees 28 billion requests per day with the data store processing 6.3 trillion transactions per month.

While Google launched one of the first PaaS offerings in 2008, it was one of the last major providers to add an IaaS, and Google Compute Engine only hit general availability back in December. Meanwhile, just about every major provider is trying to catch up with AWS, both in terms of services offered and market share.

 "Volume and capacity advantages are weak when competing against the likes of Microsoft, AWS and Salesforce," Staten noted. "So I'm not too excited about the price cuts announced today. But there is real pain around the management of the public cloud bill."  To that point, Google announced Sustained-Use discounts.

Rather than requiring customers to predict future usage by signing up for reserved instances for anywhere from one to three years, discounts of 30 percent off on-demand pricing kick in once usage exceeds 25 percent of a given month, Hölzle said.

"That means if you have a 24x7 workload that you use for an entire month like a database instance, you get a 53 percent discount over today's prices. Even better, this discount applies to all instances of a certain type. So even if you have a virtual machine that you restart frequently, as long as that virtual machine in aggregate is used more than 25 percent of the month, you get that discount."

Overall, price cuts for other pay-as-you-go services range from 30 to 50 percent, depending on the service. Among the reductions:

  • Compute Engine reduced by 32 percent across all sizes, regions and classes
  • App Engine pricing for instance-hours reduced by 37.5 percent, dedicated memcache by 50 percent and data store writes by 33 percent. Other services including SNI SSL and PageSpeed are now available with all applications at no added cost
  • Cloud Storage is now priced at a consistent 2.6 cents per GB, approximately 68 percent lower.
  • Google BigQuery on-demand prices reduced by 85 percent.

The new Managed Virtual Machines combines the best of IaaS and PaaS, Hölzle said. "They are virtual machines that run on Compute Engine but they are managed on your behalf with all of the goodness of App Engine. This gives you a new way to think about building services," he said. "So you can start with an App Engine application and if you ever hit a point where there's a language you want to use or an open-source package that you want to use that we don't support, with just a few configuration line changes you can take part of that application and replace it with an equivalent virtual machine. Now you have control."

Google also extended the streaming capability of its BigQuery big data service, from 1,000 rows per second when the capability launched in December to 100,000 rows per second now. "What that means is you can take massive amounts of data that you generate and as fast as you can generate them and send it to BigQuery," Hölzle said. "You can start analyzing it and drawing business conclusions from it without setting up data warehouses, without building sharding, without doing ETL, without doing copying."
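For context, streaming ingestion in BigQuery goes through the API's tabledata.insertAll method rather than a batch load job. Below is a minimal sketch using the Google API Python client; the project, dataset and table names are placeholders, and the surrounding authorization setup is assumed rather than shown.

```
# Minimal sketch of a BigQuery streaming insert via tabledata().insertAll(),
# using the Google API Python client. The project, dataset and table names are
# placeholders, and `http` is assumed to be an already-authorized http object.
import uuid
from googleapiclient.discovery import build

def stream_rows(http, rows):
    bigquery = build("bigquery", "v2", http=http)
    body = {
        "rows": [
            # insertId lets BigQuery de-duplicate rows that get retried
            {"insertId": str(uuid.uuid4()), "json": row}
            for row in rows
        ]
    }
    return bigquery.tabledata().insertAll(
        projectId="my-project",   # placeholder
        datasetId="telemetry",    # placeholder
        tableId="events",         # placeholder
        body=body).execute()

# Example: stream_rows(authorized_http, [{"device": "sensor-1", "temp": 21.4}])
```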

Posted by Jeffrey Schwartz on 03/25/2014 at 1:00 PM0 comments


Cisco To Provide 'World's Largest' Cloud

Cisco today is bringing new meaning to the old saying, "if you can't beat them, join them."

The company today said it will invest $1 billion over the next two years to offer what it argues will be the world's largest cloud. But rather than trying to beat Amazon Web Services, Microsoft, Rackspace, IBM, Hewlett Packard, Salesforce.com, VMware and other major providers that offer public cloud services, Cisco said it will "join" them together, figuratively.

Cisco said it will endeavor to build its so-called "Intercloud" -- or cloud of clouds -- aimed at letting enterprise customers move workloads between private, hybrid and public cloud services. Of course, Cisco isn't the only provider with that lofty goal, but Fabio Gori, the company's director of cloud marketing, said it's offering standards-based APIs that will help build applications that can move among clouds and virtual machines.

"This is going to be the largest Intercloud in the world," Gori said. Cisco is building out its own datacenters globally but is also tapping partners with cloud infrastructure dedicated to specific counties to support data sovereignty requirements. Gorisaid Cisco will help build out their infrastructures to spec and those providers will be part of the Intercloud.

Gori emphasized Intercloud will be based on OpenStack, the open source cloud infrastructure platform that many cloud providers, including Rackspace, IBM, HP and numerous others, support. But there are key players, including Amazon, Microsoft and Google, who don't support it. Gori said Cisco can work around that by using the respective providers' APIs and offering its own programming interfaces for partners to deliver application-specific offerings.

Core to this is the Intercloud fabric management software, announced in late January at the Cisco Live! conference in Milan, Italy. Now in trial and slated for release next quarter, it is the latest component of the Cisco One cloud platform, which is designed to securely tie together multiple hybrid clouds.

Among the cloud providers now on board are Australian service provider Telstra, Canadian communications provider Allstream, European cloud provider Canopy, cloud services aggregator and distributor Ingram Micro, managed services provider Logicalis Group, BI software vendor  MicroStrategy, Inc., OnX Managed Services, SunGard Availability Services and outsourcing company Wipro.

Gori insists Cisco is lining up many other partners, large and small, from around the world. It remains to be seen if Amazon, Microsoft and Rackspace are in the mix. Asked how Cisco's effort is different from VMware's, which is also building a public cloud and enhancing it with local partners, Gori pointed out that Cisco's service supports any hypervisor.

Cisco will announce more partners and deliverables at its Cisco Live! conference in San Francisco in May. Whether or not Microsoft is one of those players remains to be seen, he said. "Microsoft is a very big player and is going to be part of this expanded Intercloud," he said. "We are going to do something specific around the portfolio."

Posted by Jeffrey Schwartz on 03/24/2014 at 1:29 PM0 comments


Did Microsoft Cross the Line in Searching Hotmail Account for Employee Theft?

A former Microsoft employee was arrested in Seattle earlier this week after the company searched his Hotmail account and found evidence he was allegedly leaking information and code to a blogger who ended up illegally selling pirated software.

Alex Kibkalo, a former Microsoft architect, is accused of stealing trade secrets and leaking Windows 8 code to an unnamed French blogger while working for Microsoft. Kibkalo, a Russian national who has also worked for Microsoft in Lebanon, also allegedly bragged about breaking into the Redmond campus and stealing the Microsoft Activation Server Software Development Kit, a proprietary solution aimed at preventing unauthorized distribution of the company's software and licenses, SeattlePI reported Thursday.

The move forced Microsoft to admit it had scanned a user account on its Hotmail service to obtain evidence. This comes at a time when many customers lack trust that Microsoft and others are taking enough measures to ensure the privacy of their information in those services. Revelations of the National Security Agency (NSA) surveillance efforts by Edward Snowden, and accusations that Microsoft and others were cooperating with the NSA, have heightened those fears, despite efforts by the players involved to ensure such cooperation is limited to rare instances where there are court orders.

In this case, Kibkalo made it quite easy for Microsoft to discover his alleged acts. One must wonder why he or the blogger would use the company's e-mail service to communicate. Putting that aside, Microsoft accessed the e-mails without a court order because, apparently, the company legally didn't need one to search its own service. But the company did obtain court orders for other aspects of the investigation, said Microsoft Deputy Counsel John Frank in a blog post published last night.

Frank justified Microsoft's decision to access the e-mails in its Hotmail service and it appears Microsoft didn't violate any laws or its own policies, though some question the wisdom of its actions. "We took extraordinary actions based on the specific circumstances," Frank said. "We received information that indicated an employee was providing stolen intellectual property [IP], including code relating to our activation process, to a third party who, in turn, had a history of trafficking for profit in this type of material. In order to protect our customers and the security and integrity of our products, we conducted an investigation over many months with law enforcement agencies in multiple countries. This included the issuance of a court order for the search of a home relating to evidence of the criminal acts involved. The investigation repeatedly identified clear evidence that the third party involved intended to sell Microsoft IP and had done so in the past."

Likely anticipating customers and privacy advocates might be unnerved by the fact that it dipped into its own servers despite the probable cause of the alleged criminal activity, Frank said Microsoft is stepping up its policies for the way it handles such discovery in the future. "While our actions were within our policies and applicable law in this previous case, we understand the concerns that people have," he said.

Moving forward, he said Microsoft will not search customer e-mail or other services unless there's evidence of a crime that would justify a court order. In addition, Microsoft will turn to a former judge to determine whether the probable cause would justify a court order. Even in those instances, searches would be limited to information related to the suspected activity, not other data, and would be supervised by counsel.

To ensure transparency, Microsoft will publish whatever searches it has conducted as part of its biannual transparency reports, he said. "The privacy of our customers is incredibly important to us," he said. "That is why we are building on our current practices and adding to them to further strengthen our processes and increase transparency."

Will appointing a judge to evaluate the merits of the case be enough to settle your concerns that the company won't be looking at your data? Leave your comments below or e-mail me directly.

Posted by Jeffrey Schwartz on 03/21/2014 at 3:11 PM0 comments


Symantec Shows CEO the Door After Reboot Fails

Just over a year ago, Symantec CEO Steve Bennett announced a plan to turn around the largest provider of security and data protection products. But as rivals continued to gain ground on the company, its board ran out of patience and showed him the door yesterday.

Bennett, a former GE executive and onetime CEO of Intuit, lasted less than two years as Symantec's chief after his predecessor Enrique Salem was also ousted. When Bennett presided over last year's two-hour analyst event dubbed Symantec 4.0, he positioned it as a reboot of the company. The reorganization focused on realigning R&D with its disparate product groups, integrating its technologies, removing the siloes and improving the company's lagging software subscription rates.

During the two-hour Webcast of the event, Bennett and his executive team talked about plans to move into new product areas like network security and putting in place functional technology sharing across its businesses. But according to reports, Bennett's efforts never took hold, though the company said he did help reorganize the company and reduce costs. But that wasn't enough to stem declining revenues, a dearth of new technology innovations and an executive exodus that included the company's CFO and several key business unit heads, according to a report in The New York Times.

Symantec said it has appointed board member Michael Brown as interim president and chief executive officer, effective immediately. Brown joined Symantec's board in 2005 following its $13.5 billion acquisition of Veritas. He had once served as chairman and CEO of Quantum. The company said it has hired an executive search firm to recruit a permanent CEO.

Shares in Symantec were off 12 percent Friday afternoon as investors wonder who will take the company forward and if there will be a Symantec 5.0.

Posted by Jeffrey Schwartz on 03/21/2014 at 12:50 PM0 comments


Asus Halts Android/Windows Tablet PC Production

Asus reportedly has put on hold plans to release a tablet PC that can switch between Android and Windows just two months after introducing the multi-mode device.

The company introduced the Transformer Book Duet TD300 at the Consumer Electronics Show in Las Vegas back in January and said it would ship in the first half of this year. But Digitimes last week reported Asus was shelving the release. It stood out as one of the few unique tablet PCs at CES because users can convert the device from a laptop to a tablet and switch between operating systems.

Intel CEO Brian Krzanich demonstrated it in his CES keynote as the type of new device the chipmaker sees boosting demand for its system-on-a-chip processors. Samsung announced a similar device, the Ativ Q, last summer. But as noted by BGR, the company never released it and has removed all references to it from its Web site.

"There are times you want Windows, there are times you want Android," Krzanich said in his CES keynote. Intel's 64-bit SoCs "are the only ones that can offer that capability to seamlessly switch between OSes," he added. "You don't have to make a choice moving forward."

Well, for now you do. But according to The Wall Street Journal, Asus may be facing backlash from both Google and Microsoft, who seemingly would prefer not to see each other's rival operating systems on the same machine.

While Android is available free and hence Google technically can't stop Asus from releasing the device, the search giant does approve what is sold in its Google Play app store. Analyst Patrick Moorhead told The Journal that Google has no incentive to approve dual-OS systems since it would also benefit Microsoft. For its part, Microsoft also has little incentive to give laptop users an additional entrée to the Android marketplace, though Digitimes Research believes Microsoft has more to gain.

The now-postponed TD300 was appealing because it was a hybrid tablet PC and offered what Asus calls Instant Switch, allowing users to quickly switch between Android and Windows rather than rebooting.

If Microsoft can release Office on the iPad, which Mary Jo Foley reported in her ZDNet All About Microsoft blog will happen next week, surely Google and Microsoft could find a way to stay out of the way of OEMs releasing a device that lets users switch between two operating systems.

Posted by Jeffrey Schwartz on 03/19/2014 at 11:31 AM0 comments


Windows Azure Can Now Run Oracle Software

When Microsoft announced last summer it had reached an agreement to run the Oracle database, WebLogic middleware and Java on Windows Azure, it seemed as though two worlds were colliding. After all, the two companies have maintained a bitter rivalry over the years, though tensions have eased recently as Oracle CEO Larry Ellison has had bigger fish to fry -- like IBM, SAP and Salesforce.com.

Following nine months of development, Microsoft on Thursday said Windows Server-based virtual machine images of Oracle solutions are now available on Microsoft's cloud offerings. Licenses are included with the VM images and can be accessed in the Windows Azure Management Console. When logging in, administrators can click New, then select Compute, followed by Virtual Machine and then From Gallery, which then lets them choose images (a scripted way to query the same image gallery is sketched after the lists below). Among those now available are:

Oracle Databases

  • Oracle Database 12c Enterprise Edition on Windows Server 2012
  • Oracle Database 12c Standard Edition on Windows Server 2012
  • Oracle Database 11g R2 Enterprise Edition on Windows Server 2008 R2
  • Oracle Database 11g R2 Standard Edition on Windows Server 2008 R2

WebLogic

  • Oracle WebLogic Server 12c Enterprise Edition on Windows Server 2012
  • Oracle WebLogic Server 12c Standard Edition on Windows Server 2012
  • Oracle WebLogic Server 11g Enterprise Edition on Windows Server 2008 R2
  • Oracle WebLogic Server 11g Standard Edition on Windows Server 2008 R2

Combined Oracle Database/WebLogic VM Images

  • Oracle Database 12c and WebLogic Server 12c Enterprise Edition on Windows Server 2012
  • Oracle Database 12c and WebLogic Server 12c Standard Edition on Windows Server 2012
  • Oracle Database 11g and WebLogic Server 11g Enterprise Edition on Windows Server 2008 R2
  • Oracle Database 11g and WebLogic Server 11g Standard Edition on Windows Server 2008 R2

Java

  • JDK 7 on Windows Server 2012
  • JDK 6 on Windows Server 2012
  • Java Platforms, Standard Edition
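For admins who would rather script this lookup than click through the portal, the short Python sketch below queries the classic Service Management REST API for the image gallery and prints the Oracle entries. It is only a sketch: the endpoint path, API version header and XML namespace are assumptions based on the Service Management API of that era, and the subscription ID and management certificate are placeholders you would supply yourself.

    # Sketch: list Windows Azure gallery images whose labels mention Oracle.
    # Endpoint, API version and namespace are assumptions; the certificate
    # pair and subscription ID are placeholders.
    import xml.etree.ElementTree as ET
    import requests

    SUBSCRIPTION_ID = "<your-subscription-id>"      # placeholder
    MGMT_CERT = ("mgmt_cert.pem", "mgmt_key.pem")   # management certificate pair

    url = f"https://management.core.windows.net/{SUBSCRIPTION_ID}/services/images"
    resp = requests.get(url, cert=MGMT_CERT, headers={"x-ms-version": "2013-08-01"})
    resp.raise_for_status()

    ns = "{http://schemas.microsoft.com/windowsazure}"  # assumed response namespace
    for image in ET.fromstring(resp.content).iter(ns + "OSImage"):
        label = image.findtext(ns + "Label", default="")
        if "oracle" in label.lower():
            print(label, "->", image.findtext(ns + "Name", default=""))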

Although Windows Azure already supported Java, Microsoft CEO Satya Nadella, who was president of Microsoft's server and tools business at the time of last year's announcement, had pointed out that its Java support was based on the OpenJDK. For those who want to use Oracle's Java license, the partnership offers a fully licensed and supported Java on Windows Azure. "We think this makes Java much more first class with Oracle support on Windows Azure," Nadella said at the time.

While Microsoft's initial deal made the Oracle software available on Windows Azure back in September, Thursday's announcement makes it available on the Windows Server stack as well.

Do you plan to run any Oracle software on Windows Azure?

Posted by Jeffrey Schwartz on 03/14/2014 at 12:19 PM


Box CEO and Microsoft Critic Warms Up to Redmond

There are dozens of free or low-cost file storage and sharing services, but one of the cloud pure plays with perhaps the most credibility among enterprise IT managers is Box. The service already has large customers such as eBay, Eli Lilly and Procter & Gamble. Box CEO Aaron Levie has made no secret that he now has his sights on the Microsoft SharePoint market.

Levie stepped up his attack on Redmond back in November, telling Forbes that "Microsoft finds itself in this really challenging position where they're being attacked on all dimensions with people whose business models don't rely on the same kind of revenue and the same kind of profits."

Perhaps it was his New Year's resolution to warm up to Microsoft. In January he praised the company in a tweet for choosing Satya Nadella as its new CEO. Asked about that in a CNBC interview during the annual SXSW Conference taking place in Austin, Texas this week, Levie said he believes Nadella will transform Microsoft into a cloud company and help it become less dependent on the legacy businesses that the Box CEO has often criticized.

"Satya is actually really part of the next generation way of how software is going to be developed and how companies are buying technology. And we actually see [that] Microsoft has a major opportunity to become more open with their technologies than ever before, which is a very good thing for us," Levie told CNBC. "It used to be that in the past Microsoft viewed the world through a Redmond lens -- they had to control all the software and all the technology that was deployed in an enterprise. Satya has brought a different level of openness within that company. So we actually think our ability and our change of working with in a complimentary way with Microsoft increasing pretty dramatically under Satya's leadership. At least in that scenario we're viewing it as a very positive thing."

Could that mean some kind of partnership is in the works? Box already has relationships with Google but that wouldn't preclude a deal in this day and age. But while Microsoft has OneDrive and OneDrive for Business, Carson, Nevada-based All Marketing Systems said today it will add Box into OneBigDrive -- a service that consolidates various cloud services including Google Drive and Microsoft OneDrive. By adding Box, AMS is now providing 32 GB of free storage. The company said it will up that in April to 50 GB with a goal of hitting 100 GB of consolidated cloud storage.

Indeed the cloud file sharing and storage market continues to evolve. Two companies this week also launched cloud file sharing and storage services targeted at enterprises. See my separate post.

Posted by Jeffrey Schwartz on 03/12/2014 at 11:10 AM


Cloud Storage and File Sharing Providers Extend Enterprise Hooks

While almost everyone uses a file sharing and storage service such as Dropbox, Google Drive or Microsoft's SkyDrive, among a slew of other free services, business and IT decision makers want to rein in the use of those services for business.

One popular alternative is LogMeIn, which today launched Cubby Enterprise, described as a business version of its file synchronization and sharing service. The service gives administrators control over data with key security capabilities, including the ability to remotely wipe data off devices, to set and enforce policies (such as how data are shared) and to require four-digit PIN codes to access data.

It also supports Active Directory Federation Services integration for single sign-on, domain-based administration to manage user accounts and the ability to remotely deploy on user systems.  Cubby Enterprise also lets IT monitor in real time what data is shared and with whom. Annual subscriptions cost $39.99 per month for five users when prepaid.

Meanwhile, ownCloud launched its ownCloud 6 Enterprise Edition yesterday. The new release gives administrators more control over enterprise data thanks to a rules engine that can provide refined policies for how employees access data. The service is based on the ownCloud Community Edition, an open-source file sync and share project, which the company claims has more than 1.3 million users.

The company indicated back in December it was readying a new commercial edition of its offering, which provides a Dropbox-like experience. Customers who deploy it locally on an Apache Web server or Microsoft's IIS can integrate it with Active Directory, and it has an LDAP wizard for other directories. It uses SAML authentication and the company offers an API to tie it to other applications. It also comes with a plugin for the enterprise social media tool Jive -- a rival to Microsoft's Yammer service. (As an aside, Jive is exploring a potential sale, according to a report by Re/code, which says possible buyers include Oracle, SAP and Workday.)

Organizations with ownCloud 6 Enterprise Edition can store data on their own servers and/or in public clouds such as Amazon Web Services S3 or any OpenStack Swift-based cloud. It doesn't currently support Windows Azure, though CEO Marcus Rex told me back in December he's eyeing that service as well. Customers can create hybrid storage services using local servers for some data and bursting to the public cloud for other content. Administrators can tie ownCloud 6 Enterprise Edition in with existing systems management suites and backup and recovery tools. Annual subscriptions start at $9,000 for 50 users.
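To illustrate the kind of application tie-in the API and standard protocols allow, here is a minimal Python sketch that pushes a file into an ownCloud folder over the conventional /remote.php/webdav endpoint and then lists that folder with a WebDAV PROPFIND. The server URL, account and folder names are placeholders, and a SAML- or LDAP-backed deployment would authenticate differently.

    # Sketch: upload a file to ownCloud over WebDAV, then list the folder.
    # Server URL and credentials are placeholders.
    import requests

    OWNCLOUD_WEBDAV = "https://files.example.com/remote.php/webdav"  # placeholder server
    AUTH = ("backup-svc", "app-password")                            # placeholder account

    # Upload a local report into a shared folder.
    with open("quarterly-report.pdf", "rb") as f:
        put = requests.put(OWNCLOUD_WEBDAV + "/Reports/quarterly-report.pdf",
                           data=f, auth=AUTH)
    put.raise_for_status()

    # List what the folder now contains (WebDAV PROPFIND, depth 1).
    listing = requests.request("PROPFIND", OWNCLOUD_WEBDAV + "/Reports/",
                               auth=AUTH, headers={"Depth": "1"})
    print(listing.status_code)  # 207 Multi-Status on success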


Posted by Jeffrey Schwartz on 03/12/2014 at 12:55 PM


VMware Targets Windows and Chromebook Users with Desktop as a Service

Looking to replace your traditional Windows desktop infrastructure with virtual desktops but don't want to put in the back-end infrastructure? VMware today is launching a new desktop-as-a-service (DaaS) offering that will extend its Horizon View VDI to organizations that don't want to install hardware and software to support thin-client implementations.

The new VMware Horizon DaaS provides an alternative to the back-end infrastructure required for VDI by hosting it on the company's new vCloud Hybrid Service, the public IaaS it launched last year. Customers adding the new Horizon DaaS can interconnect it with an on-premise Horizon View infrastructure, though that's not required. VMware gained entry to the DaaS market last year with its acquisition of Desktone.

For those looking to replace Windows XP desktops or considering a VDI-like solution, VMware sees Horizon DaaS as an affordable alternative. The monthly per-user cost starts at $35 with 30 GB of storage. Enterprises unable or unwilling to invest in the infrastructure and personnel to manage a VDI deployment will find the service appealing, said Dave Grant, senior director of product marketing and product management for DaaS at VMware. "It reduces some of the barriers we've heard around desktop virtualization," Grant said.

The $35 price for Horizon DaaS is the same as that of Amazon Web Services' similar Amazon WorkSpaces offering, which was launched last year at the company's annual customer and partner conference. Grant acknowledged the new service will compete against Amazon WorkSpaces but argued Horizon DaaS is a broader offering since it will work for organizations that also have VMware View on premise and want to augment it. "Some people might want to burst or want to use it for remote offices," he said.

Administrators can provision the new Horizon DaaS offering to full Windows clients (Windows 7 and Windows 8), iPads, Android-based devices and Chromebooks. VMware now has a stake in the success of Chromebooks: the company last month announced a partnership with Google to offer the Horizon View VDI solution on Chromebooks.

Grant told me the company is seeing increased demand for Chromebooks by enterprises. "For our clients that want to adopt the Chromebook,  they still need to leverage Windows applications in their organization and they use Horizon DaaS to pipe in and stream those Windows applications onto the Chromebook," Grant said.

Customer accounts are in a multitenant environment ensuring security, he added. "Every tenant has its own resources for compliance and security," he said. "That's extremely important as you get into the enterprise."

Those using full Windows desktops can migrate their licenses to Horizon DaaS but don't need to deploy Windows Server.

DaaS proponents argue it is an appealing option for shops migrating from Windows XP, which Microsoft will stop supporting next month. According to a survey of Redmond magazine readers about their Windows XP migration plans, 9 percent are considering some form of VDI. Are you considering DaaS in your organization?


Posted by Jeffrey Schwartz on 03/10/2014 at 11:19 AM


Reader Survey: Many Will Run Windows XP Indefinitely

I was in a hotel lobby last week and saw a kiosk that had obviously suffered a system crash. It wasn't showing the dreaded blue screen of death but it displayed Windows XP powering down. The kiosk apparently froze while trying to reboot. Since it most likely runs Windows XP Embedded, I suspect whoever maintains that hotel's kiosks has no immediate plans to upgrade the operating systems before April 8th -- the last day Microsoft will issue a patch for the aging OS.

Now that Microsoft last month gave Windows XP Embedded a two-plus year reprieve, it will likely live awhile on the numerous kiosks and ATMs running the version of the OS designed for specialized devices. But next month's deadline still holds true for the 30 percent of PCs still running Windows XP. In fact the percentage of systems running Windows XP appears to have inched up a notch.

Indeed many banks, hospitals, schools, government agencies, offices of all sizes and consumers have just one month left before they are running a version of Windows that is no longer supported by Microsoft. It joins the graveyard of its predecessors that include Windows ME, Windows 98, Windows 95, Windows 3.x and others.

Only 28 percent of Redmond magazine readers no longer have any systems running Windows XP. More than 3,000 responded to our online survey, which in itself underscores how many of you have something to say about this. The overwhelming response is triple the number of readers who weigh in on our most popular surveys. Nearly a quarter of respondents (23 percent) have no plans to retire their Windows XP systems. Only 16 percent were scrambling to migrate, while 25 percent planned to do so at some point (but it isn't a major priority) and 8 percent haven't decided what they're going to do.

Why are so many organizations sticking to their guns and planning to run an aging operating system that will put them at risk? It has nothing to do with the fact that Microsoft said in January it will continue to offer antimalware signatures for another year. Microsoft's free Security Essentials tool will no longer protect Windows XP systems, though third-party endpoint protection software providers such as Bit9, McAfee and Symantec say they will offer some options (though those vendors do advise upgrading).

Even though 35 percent in the Redmond survey said their Windows XP machines aren't connected to the Internet, 7 percent said that was the justification for sticking with it. The largest portion of respondents, 39 percent, said they have applications that can't run on newer operating systems such as Windows 7 or Windows 8. Here were some reasons respondents gave for planning to keep their Windows XP-based systems running after April 8:

  • XP suits the needs of our applications.
  • Running 16-bit apps and cannot afford to upgrade.
  • XP is the last bearable OS Microsoft has produced.
  • Hardware cannot run new OSes.
  • Management isn't ready to deal with the upgrade hassle yet.
  • They're running apps that can't run on any newer OS AND they work well.
  • The physical hosts are never connected to the Internet, and therefore neither are the guests.
  • There may come a time to move completely off XP, but security is not a factor in the decision.

One university had the most intriguing reason: to teach students what an unprotected system can do. "Indeed, we are keeping some XP (virtual) machines in order to teach cyber security courses."

Windows XP is a victim of its own success. Many are passionate in their position that Windows XP was the best and most-stable operating system Microsoft ever released. I felt that way until Windows 7 came out, which, while far from perfect, was much more stable and reliable than Windows XP or Windows Vista. But I don't have any critical hardware or software that won't run Windows 7, which is the overwhelming destination for those who are migrating.

The majority (85 percent) plan to deploy Windows 7 systems while 36 percent will deploy Windows 8 (multiple answers were permitted on this question). Those deploying or supporting Windows 8 seem to be doing so in most cases for the handful of executives and power users who prefer the touch-based OS that runs on both PCs and tablets. As I noted last month, while Delta Air Lines is deploying 11,000 Windows RT-based tablet PCs in its aircraft, on the ground it's upgrading office workers and gate agents with Windows 7-based PCs. A few hundred execs are getting Surface Pros running Windows 8 Pro.

Regardless of where you stand with Windows XP, the end is near for its support but it doesn't appear we'll have seen the last of it for many years to come.

Posted by Jeffrey Schwartz on 03/07/2014 at 12:13 PM


Ballmer's Regret: Missing the Smartphone and Tablet Wave

One month removed from Microsoft after decades with the company, the last 14 years as its second CEO, Steve Ballmer made his first public appearance yesterday in an interview with his longtime friend Peter Tufano, a professor at Oxford University.

In an auditorium full of mostly MBA students, Tufano and attendees asked Ballmer a variety of questions during the one-hour session. The most notable moment of the speech came when the former CEO commented on his biggest mistakes and successes while running Microsoft. Not surprisingly, letting Apple and Google dominate the tablet and smartphone market stung the most, especially considering Bill Gates a decade earlier had strong designs on that market.

"There are some things that didn't go as well as we had intended them to," Ballmer said. "We would have a stronger position in the phone market today if I could redo the last 10 years. Yet one of the things you have to say to yourself is 'do you give up?'" When asked how it felt to have Gates trumpet Microsoft's visions of creating a tablet and mobile device market before turning over the company to him, Ballmer said he regretted not moving faster.

Ballmer gave credit to Apple for marrying hardware and software and lamented that Microsoft should have done so earlier as well. Now that Microsoft offers its Surface devices and is expected to close on its $7.2 billion acquisition of Nokia, which he championed last year, Ballmer said it is important to look ahead and not bemoan mistakes of the past. As is famously known, Ballmer back in 2007 said the iPhone would flop, and he was blindsided by how quickly the iPad started cutting into PC usage. Likewise, Google's rapid success with Android appeared unexpected to him.

"There was a little bit of magic too for Android and Samsung coming together," Ballmer said. "But if you really want to bring the vision to market, it is helpful to be able to conceive and deliver the hardware, and our company is in the process of building new muscle, so we're not just thinking of tablets in advance and letting Apple commercialize it."

Ballmer said while most tech companies are "one-trick ponies," including Google and Oracle, he described Apple as one that had two tricks and Microsoft had two and a half. The first was its success with the modern PC with Windows and Office, the second was in the datacenter and the half goes to its rapidly growing Xbox gaming business.

"The fact that we have two and a half, I'm really proud of it and the fact that we've built muscle that lets us do new tricks in the future will distinguish us from all other companies on the planet," he said.

Ballmer, who shared his views on successful leadership, reminded the audience that he has never written a line of production code. "But that didn't let me off the hook for whether we were building the right products with the right quality in the right way," he said. "I won't say things were always perfect, that's not the point. You can't shy away from anything where you alienate people and in some cases if you don't know the details, you learn to ask the right questions. Not everyone does. On the other hand, every company needs to have a measurement system people understand."

That measurement system referred to a company's accountants. "Generally accountants are refs at the gate. They tell you whether the ball went into the goal," he said.

Ballmer had a polarizing tenure at Microsoft and some would question whether he lived up to the leadership principles he spoke of yesterday, despite the strong profit growth on his watch. Either way, he reminded the audience he owns a 4-percent stake in Microsoft, remains on the board and has every interest in seeing Microsoft succeed, saying "I'm available to help if the company needs me."


Posted by Jeffrey Schwartz on 03/05/2014 at 12:50 PM


Microsoft Leadership Shuffle: Reller and Bates Out, Penn Named Chief Strategy Officer

Microsoft's new CEO Satya Nadella this week will continue to reshape his leadership team with the departure of two senior leaders and the naming of Mark Penn as chief strategy officer. While it remains to be seen how much influence Penn will have, the move potentially gives the controversial one-time aide to Bill and Hillary Clinton and current Microsoft marketing executive significant sway over the future direction of the company.

Departing are Marketing Chief Tami Reller and Tony Bates, executive vice president of business development and the onetime CEO of Skype, which Microsoft acquired in 2011. The reshuffling of Nadella's leadership team started last week when Julie Larson-Green was named chief experience officer of the "My Life and Work" team. She will report to Qi Lu, executive vice president of Applications and Services Engineering. The change also makes room for former Nokia CEO Stephen Elop to head the Devices and Studios Group upon the completion of Microsoft's $7.2 billion acquisition of the company's handset unit.

News of the shakeup in Microsoft's executive suite was first reported by Re/code's Kara Swisher, citing unnamed sources close to the company. According to the report, Eric Rudder will temporarily take over Bates' responsibilities and Chris Capossela will take over Reller's role as executive VP for marketing. Reller will apparently stay for a transition period while Bates is leaving immediately.

Nadella reportedly informed insiders of the changes Friday and the company is expected to announce the new executive lineup tomorrow. Swisher speculated the move will give Penn a good look at new product areas and areas where Microsoft can invest in new technologies. However, the move takes Microsoft's huge advertising budget out of Penn's hands and shifts it to Capossela.

Penn was largely responsible for Microsoft's overall messaging, which as The New York Times' Nick Wingfield noted was controversial, notably the "Scroogled" campaign, which raised questions about Google's approach to privacy. While some believed it was lowbrow advertising, Penn loyalists claimed to have data showing it was effective, Wingfield reported.

As the longtime strategist to the campaigns of both Clintons, Penn is no stranger to controversy. He had to step aside as Hillary Clinton's chief strategist during her 2008 presidential campaign after he lobbied for a free-trade pact with Colombia on behalf of Burson-Marsteller, which then-Senator Clinton opposed. Penn was also CEO of the public relations firm at the time. It appears unlikely Clinton will bring him back if she runs for president in 2016, and it would certainly be an issue if Nadella is tapping him for a strategic role.

But how much influence Penn will have is uncertain and Nadella will certainly be adding and subtracting additional people from his inner circle. One looming question is the future of Chief Operating Officer Kevin Turner, who has made no secret of his desire to be a CEO.

Posted by Jeffrey Schwartz on 03/03/2014 at 1:15 PM


Acronis Creates New Image with Revamp of Data Protection Suite

Backup and recovery software supplier Acronis last week launched what it described as a simplified and more complete suite of data protection software for physical, virtual and cloud environments.

The newly branded AnyData line offers a simplified user interface and boasts a performance boost of 50 percent. It offers disk, VM, file, single-pass and sector-by-sector backups, full or fast incremental or differential backups, and allows for the exclusion of files during backups. On the storage side, it offers a unified backup format, universal restore, deduplication, backup and staging to cloud (as well as tape), encryption, staging to tiered storage and multi-destination staging and retention.

At a press briefing in New York last month, CEO Serguei Beloussov explained the new software was designed to address growing data volumes. Acronis' re-branding and new product suite comes nearly a year after Co-Founder Beloussov returned to Acronis. Beloussov, who is also chairman and onetime CEO of Parallels, took the helm at Acronis following a revolving door of chief executives over the years. The most recent before Beloussov was Alex Pinchev, a former Red Hat president who Acronis tapped in January 2012 and only lasted 14 months.

As part of its new focus, Acronis has four business units: personal, business, mobility and cloud. The personal unit offers backup and storage solutions for individuals, the business group is focused on backup and recovery for small- and medium-sized enterprises, mobility provides secure access, file synchronization and sharing tools, and cloud targets managed service providers, telecommunications carriers and hosters with backup and storage software.

Beloussov said that despite the new products and company re-branding, Acronis' business is strong, noting the last quarter was the best in the company's history with a 50 percent year-over-year increase in large purchases and 70 percent EBITDA growth. While he wouldn't disclose actual revenues, Beloussov indicated the company had only $100 million in revenue a few years ago and is now up to "several" hundred million.

The suite includes software that protects both data and applications running on clients and servers in virtual, physical and cloud environments, offering data backup, bare-metal restore capabilities and migration of system environments. It supports Linux and Windows and is compatible with all major file systems including ReFS, FAT16/32, Ext2/3/4, ReiserFS3, XFS and JFS, among others.

AnyData supports all the major virtual platforms including VMware, Hyper-V, Citrix XenServer, Red Hat Enterprise Virtualization and Parallels. It can migrate virtual to virtual, virtual to physical, physical to virtual and physical to physical. It runs agentless in VMware and Hyper-V, supports VMware vCenter integration, simultaneous virtual machine backup, change block tracking, Hyper-V cluster support, any-to-any migration and simultaneous backup in virtual environments.

Acronis is also offering application-specific modules, including ones for Exchange, SQL Server, SharePoint and Active Directory.

While Acronis boasts large customers such as Chevron, Ford, Intel, Honeywell, NASA, Samsung and Wells Fargo, the company's primary customers are groups with several hundred employees. Even its large customers tend to be remote groups or units, Beloussov acknowledged.

"They have really renewed their focus on the small business customer and the consumer," said Robert Amatruda, research director of data-protection and recovery at IDC. "I was skeptical but pleasantly surprised at the rapid speed these guys have reworked the company. The way they have rebuilt this product, it is now feature-rich around virtualization, and around migration of data for physical to virtual and virtual to virtual. I think you will see Acronis in environments where you have remote offices and workgroups in organizations that need these features."


Posted by Jeffrey Schwartz on 03/03/2014 at 11:46 AM


Is Microsoft's DPM a Worthy Data Protection Tool for IT?

When Microsoft jumped into the data protection market back in 2005 with the release of its System Center Data Protection Manager, most providers of backup and recovery and replication software shrugged their shoulders. It was easy to point out that it only supported Windows environments and was not very robust.

Now, the latest DPM release can back up and recover Linux virtual machines, enables deployment in virtual environments by configuring storage as VHD pool disks shared in a System Center Virtual Machine Manager (VMM) library and supports SQL Server clusters, as well as providing Windows Azure backup. DPM has come a long way since its release eight-plus years ago. Yet despite the improvements, most third-party suppliers of data protection software don't see it as a competitive threat.

I spoke with quite a few players over the past week and some argue they aren't seeing DPM used at all and others see DPM running alongside their solutions. Take Simpana from CommVault. Randy DeMeno, chief technologist for the company's Microsoft partnership, says some of his customers use DPM and Simpana runs alongside it. "When you get into long-term storage, e-discovery, heterogeneous virtual environments, heavy e-discovery, search, Exchange, SharePoint, [IBM Lotus] Domino [and] various heterogeneous files, that's where CommVault comes into play," DeMeno says.

"We really do high-speed recovery," says Mike Resseler, the Microsoft evangelist at Veeam Software. "We still don't look at DPM as competition but both can work better together. The reason is Veeam Backup and Replication is on the virtualization layer, DPM on Hyper-V. We support VMware as well. We connect the two and can give an effective and cheap solution to do disaster recovery."

Other tools offer better performance with enhanced data deduplication, adds Subo Guha, VP of product management at Unitrends, who acknowledges the improvements to DPM. "It's still kind of weak compared to what we see from a scalability perspective," Guha argues.

Yet most vendors I talked to acknowledged DPM can complement their own solutions. "They've got the backup piece covered and we've got the piece covered where if there's a disaster we can help get the applications up so it can use the data that they have backed up," says Tim Laplante, director of product strategy at Vision Solutions Inc. His company makes Double Take for high-availability environments (with a focus on business continuity and disaster recovery).

Serguei Beloussov, co-founder and CEO of Acronis, offers perhaps the harshest view of DPM arguing that letting Microsoft protect its own environment is the equivalent of letting the fox in the chicken coop. "What most people want to protect themselves against is errors and failures from Microsoft itself," Beloussov says. "Trying to protect yourself against Microsoft with the tool Microsoft supplies doesn’t sound very competitive."

Despite the richer features providers of data protection software offer, does Microsoft's DPM have a place in your shop? Share your views below or drop me a line at [email protected].


Posted by Jeffrey Schwartz on 02/28/2014 at 2:34 PM


IBM Sees Growing Enterprise Demand for Windows Phone

Even though Windows Phone's consumer base is smaller than those for Apple's and Google's mobile platforms, it's projected to be the fastest-growing mobile OS in the coming years. IDC today issued its latest forecast that predicts Microsoft's Windows Phone, which currently holds 3 percent of the market, will account for 3.9 percent of the market by year's end and 7 percent by 2018. Much of that growth could come from enterprise IT decision makers, according to IBM.

An official at Big Blue this week said he sees growing pockets of interest in Microsoft's Windows Phone 8. Most of that interest is coming from IT, not consumers, but it was enough for IBM to give its newly acquired MaaS360 mobile device management platform support for Windows Phone.

IBM added mobile device management to its portfolio of tools for systems administrators two months ago with its acquisition of Philadelphia-based Fiberlink. The company announced the addition at this week's Mobile World Congress in Barcelona and at the IBM Pulse conference in Las Vegas.

The company's MaaS360 Productivity Suite provides secure e-mail, calendaring, contacts and a browser. The tool lets IT separate personal apps and data from enterprise software and information. IT can remotely wipe and manage enterprise apps and data without touching the personal side of the smartphone.

Jim Szafranski, senior VP of customer platform services at IBM's Fiberlink unit, said many of its enterprise customers would like to see their employees use Windows Phone for work-related activities because of its tight integration with Microsoft's back-end systems. Consumer demand for Windows Phones, of course, continues to trail that of iPhones and Android devices, he said.

"Actual end user momentum is trailing business interest," Szafranski said. "IT likes Microsoft and likes Windows. They've made a lot of investment in things like Active Directory and Exchange and as a result they have a lot of interest in seeing Windows Phone used by employees. I don't think anyone is going to be all Windows on mobile, but enterprises do want it and I think they have a strong opportunity when it comes to the enterprise side of purchase decisions."

Despite the incremental but steady growth predicted for Windows Phone, it appears much of it will come at the expense of BlackBerry, though a small amount will come from Android and iOS. IDC's latest forecast predicts Apple's iOS and Android will continue to account collectively for 90 percent of the market over the next four years. This is slightly down from its current share of 94 percent.


Posted by Jeffrey Schwartz on 02/26/2014 at 11:55 AM


Will Slashing Windows Licensing Help Microsoft Tablet Sales?

Looking to boost its fortunes in the low-end tablet market dominated by Apple iPads and devices running Google's Android operating system, Microsoft is reportedly looking at sharply reducing the fees it charges manufacturers to license Windows 8.1.

The news, first reported Saturday by Bloomberg, could be announced at this week's annual Mobile World Congress taking place in Barcelona. Citing unidentified sources familiar with the plan, Bloomberg said Microsoft will lower the cost manufacturers pay to license Windows 8.1 on tablets by 70 percent, from $50 to $15. The new fees would apply to tablets priced below $250, but those devices could be of any size or type.

While Microsoft didn't comment on the report, it did announce at Mobile World Congress nine new Windows Phone partners. On top of its existing partners Samsung, HTC, Huawei and Nokia, the new influx includes Foxconn, Gionee, JSR, Karbonn, Lava (Xolo), Lenovo, LG Electronics, Longcheer and ZTE.

At Mobile World Congress, Microsoft also announced support for Qualcomm's  Snapdragon 200 and 400 series chipsets and options to support all the major wireless network protocols including LTE (TDD/FDD), HSPA+, EVDO and TD-SCMA (as well as soft keys and dual SIM).

Microsoft's Joe Belfiore also confirmed a Windows 8.1 update is coming this spring that promises to appeal more to mouse-and-keyboard users. While not addressing the Windows licensing price cuts, Belfiore did say in a blog post that Microsoft is moving to lower manufacturers' licensing costs. "We'll enable our partners to build lower cost hardware for a great Windows experience at highly competitive price points," he said.

The company needs to take drastic action if it wants to be a formidable player in the tablet market now dominated by Google and Apple. Google doesn't charge manufacturers any licensing fees for Android or Chrome OS. The added licensing fee of course adds to the price of the device.

For example, a Dell Venue 8 running Android is currently priced at $229, while a similarly configured Dell Venue 8 Pro with Windows 8.1 costs $299 (based on prices listed on the company's Web site). However, the Dell Venue 8 Pro also ships with an Office 2013 license, which adds to the price. But at its current $299 price tag, the Dell Venue 8 Pro wouldn't be eligible for the reduced Windows 8.1 license fees; Dell would need to lower the price to $250.

Also, as the Bloomberg report noted, some of Microsoft's largest OEM partners paid Windows licensing fees closer to $30 after marketing funds and other promotional incentives were taken into account. Meanwhile, Nokia launched its rumored Android phone, the Nokia X, at Mobile World Congress. It will be interesting to see whether Microsoft will continue to support that device after its $7.2 billion deal to acquire Nokia closes.

What do you think about Microsoft lowering its Windows license fees to go after the low-end tablet market?

Posted by Jeffrey Schwartz on 02/24/2014 at 11:36 AM


Can Nadella Stand Up to Bill Gates?

Microsoft's new CEO Satya Nadella has acknowledged that, during the process of selecting who would replace Steve Ballmer, he asked Bill Gates to come back and dedicate time to key initiatives.

Skeptics may argue that of course he's going to say that. In actuality, it could have been either Gates or the board who really made the call. But in any case, Nadella appears comfortable with the arrangement. In what was billed as the first interview Nadella has given since he was named CEO earlier this month, he told The New York Times he's prepared to push back against Gates if he feels it's warranted. Nadella said he knows how to handle Gates.

"He's actually quite grounded," Nadella told The Times. "You can push back on him. He'll argue with you vigorously for a couple of minutes, and then he'll be the first person to say, 'Oh, you're right.' Both Bill and Steve share this. They pressure-test you. They test your conviction."

Nadella also said that he's aware of the challenges facing Microsoft and acknowledges that the historically mighty powers do fall from grace. "One of the things that I'm fascinated about generally is the rise and fall of everything, from civilizations to families to companies," he said. "We all know the mortality of companies is less than human beings. There are very few examples of even 100-year-old companies. For us to be a 100-year-old company where people find deep meaning at work, that's the quest."

Posted by Jeffrey Schwartz on 02/21/2014 at 10:43 AM


Metalogix Targets SharePoint Backup and Recovery

Metalogix this week released its first backup and recovery tool designed solely for protecting SharePoint environments. The new Metalogix Backup 4.0 is the fruit of the company's acquisition of Idera's SharePoint tools business last fall and offers improved performance and support for SharePoint 2013.

But it raises the question: Why would an IT organization want a backup and recovery tool that protects only SharePoint and not the rest of the infrastructure? Metalogix officials argue the new tool is better suited to ensuring recovery of SharePoint than traditional data protection suppliers such as Symantec, CommVault, CA, Veeam, Acronis and Vision Solutions, among others.

For example, Metalogix Backup now integrates with the company's StoragePoint Remote BLOB Storage (RBS) tool, which reduces the size of large SharePoint repositories. The two can back up not only BLOBs that are in SQL Server but also BLOBs that have been externalized through RBS offloading, explained Metalogix Chief Marketing Officer Jignesh Shah.

Metalogix Backup 4.0 also features end user self-service recovery built into the SharePoint Ribbon, Shah emphasized. "This is very important," he said. "It's not an administrative tool that you are going to push out to hundreds or thousands of SharePoint users. You can offer them selective recovery directly from the user interface."

Product manager Steven Goldberg explained why the company believes organizations need a backup and recovery tool designed specifically for SharePoint. "The reality is most SharePoint administrators need better control at the SharePoint level," Goldberg said. "The biggest pain point that often comes up is recovery of documents. If someone destroyed, by mistake, a folder…having to recover that is very difficult to do with the horizontal platforms that don't really have deep support for SharePoint backup and recovery."

But even SharePoint administrators and IT decision makers who agree with that notion naturally have to protect data other than SharePoint, I argued. "In that case we see them complementing the central backup with specialized purpose-built SharePoint backup," Goldberg said.

To make that point, the company is offering the new Metalogix Backup 4.0 for 50 percent of the cost organizations are paying for their existing solutions. Goldberg said Metalogix will offer the same capacity for half their annual maintenance costs, "no matter which one."

Posted by Jeffrey Schwartz on 02/21/2014 at 12:24 PM


Microsoft Kicks Off SkyDrive Transition with OneDrive Launch

Microsoft today launched its new OneDrive cloud storage service, which replaces SkyDrive. The name change comes with a few new bells and whistles. However, OneDrive won't just show up and replace SkyDrive on your system or device -- you must transition to the new service by logging into OneDrive.com.

Other than the new name, OneDrive looks identical to SkyDrive, which Microsoft agreed to rename last summer after a British court ruled it infringed on a trademark held by British Sky Broadcasting Group. Microsoft last month revealed OneDrive as the new name for the service following the ruling.

The new OneDrive also includes real-time co-authoring when used with Office 365, allowing for simultaneous edits of Word, Excel or PowerPoint files. Another new feature is camera backup on Android devices -- a capability already included with iOS and Windows Phone devices.

What Microsoft didn't say is that if you use Windows 8, you'll have to wait a few months for a system update. According to the instructions, to switch from SkyDrive to OneDrive, you must go to the app store of your device platform.

On iOS the transition was pretty seamless. As instructed, I went to the Apple App Store, searched for OneDrive and installed it. Within a minute or so, OneDrive replaced SkyDrive. However, when searching for OneDrive in the Windows Store, it was nowhere to be found.

Upon inquiring with Microsoft, a spokeswoman explained why: "Because OneDrive is built into the OS, OneDrive will be updated in Windows in the coming months." Likewise, if you try to download OneDrive on a Windows 7 PC, it will re-install SkyDrive. I wouldn't go so far as to say Microsoft has left Windows 8 users out in the cold, since SkyDrive will continue to work, but it is rather ironic that Microsoft was able to deliver it for iOS and Android before Windows.

Microsoft still offers 7 GB of storage free of charge and today said it will offer an additional 3 GB to those who use the camera backup feature. For those who purchase additional storage, Microsoft is now offering monthly billing rather than requiring customers to pay a one-time fee.

The SkyDrive Pro service for SharePoint users will be renamed OneDrive for Business. Microsoft said it will disclose plans for that service at next month's SharePoint Conference in Las Vegas.

Posted by Jeffrey Schwartz on 02/19/2014 at 2:20 PM


Why Delta Decided To Fly High with Windows 8

Last September, Microsoft announced that Delta Air Lines would be equipping its cockpits with Surface tablet PCs running Windows RT 8.1 and a Windows Store app called FlightDeck Pro, designed to replace bulky and often-outdated flight manuals and allow for real-time updates. I met Darrell Haskin, IT director for Delta's crew department, at an event in New York a few months ago held by Microsoft. The event showcased a few dozen customers using Windows 8x and Windows Phone devices. Some were giants like Delta, Siemens and Boeing, and others using Windows 8x were small and mid-sized businesses including a real estate firm and a medical clinic. This month's Redmond magazine cover story highlights six IT pros using Windows 8 devices and the OS in their enterprises.

I spent time talking with Haskin and several others on his IT team and I later caught up with him at last month's National Retail Federation convention where he talked up Delta's deployment in a session at Microsoft's booth. It didn't surprise me that Delta was pushing forward with a progressive app like FlightDeck Pro. The airline has a long history of investing in IT. Back in 2001, I met with Delta's IT people at the airline's Atlanta headquarters where they spent an entire day guiding me through their IT operations and talking up their migration from legacy mainframe systems to meet the needs of the new Web-enabled world.

Haskin said he hopes this latest initiative makes customers feel better about flying Delta. Here's an excerpt from our conversation:

What led to your decision to put Surface PCs in the cockpit and give flight attendants Windows Phones?
Haskin: It wasn't that long ago. We started the project about two years ago but we didn't select Windows until a year ago. [Regarding the decision to give flight attendants Windows Phones] we were setting out to find an electronic replacement for our onboard point-of-sale device. That was a proprietary system, an onboard device that just stayed with the airplane. Rather than have 2,000 devices on the aircraft, every flight attendant has their own device they can keep with them. We don't have to keep anything on the airplane any more, we don't have to have people come up and change them because we'd have to keep them charged. We'd have to dock them periodically to download all the information. Now we just do everything wirelessly into the new system.

Before Windows 8, what were you considering?
Haskin: We did an RFP [request for proposal] to seven different vendors. In most cases they were multiple groups of vendors. So when Microsoft actually bid on it, it was actually Microsoft, Avanade and AT&T. We had groups of other vendors that came in with their solutions. We looked at a lot of different options.

Presumably there was a bid that included iPads and other non-Windows tablets?
Haskin: You bet.

So what led you to decide on Surfaces rather than iPads?
Haskin: We really liked the fact that we're a big Windows shop already at Delta Air Lines. We have Active Directory, so we know a flight attendant versus a pilot. We know what type of aircraft he flies, that's all in Active Directory. Plus we use Exchange for e-mail so it just fit into the corporate architecture better for us.

Were you concerned that Windows 8 might not take off, so to speak? Pardon the pun.
Haskin: Yes, of course we had some concern about that but Microsoft is one of our strategic partners. So we sat down with them and conveyed our concerns with them. They were very supportive and said 'we feel this is going to take off'  and now we believe that it will. They were able to allay our concerns about that.

According to Forrester, Windows may only account for 30 percent of all devices by 2017. If that forecast plays out, is that a concern?
Haskin: I think 30 percent is a pretty good number. I think the fact that Delta has chosen to do this in our operation will naturally have some influence on other companies around the globe.

Why did you go with Surface 2s with Windows RT versus Surface Pros running the full version of Windows 8?
Haskin: Because pilots only use our in-house applications. There really wasn't a need for us to have all of the additional functionality that comes with the Surface Pro for this particular application in the flight deck. And the battery lasts longer, which is very important to the pilots in our flight deck who might be on a 15-hour flight.

Will you be rolling out Windows 8 for the ground force or other parts of Delta?
Haskin: We're actually still on Windows XP. We're moving to Windows 7.

Why Windows 7 and not Windows 8?
Haskin: We're just not ready to do that yet.

Will some of those go to Windows 8?
Haskin: Nope, they're all going to Windows 7. That's primarily the back-office desktop users.

What about executives who might want a touch-based interface?
Haskin: Our executives all have Surface Pros already.

All of them do?
Haskin: Vice presidents and above all have Surface Pros. That's about 200 executives or so. They love the ability to be able to write notes on things and send them out. Our executives are on the move -- I mean, we're an airline. They're flying all over the world meeting with our strategic partners around the world. It's a great tool for them. They've given us really positive feedback on the Surface Pros.

Posted by Jeffrey Schwartz on 02/14/2014 at 3:58 PM


Windows Phone Inches Up in Share, BlackBerry Dives and Android Soars

While most people I know have iPhones and some have smartphones based on Android, Google's platform is clearly taking over the global market. Gartner and IDC released their annual smartphone reports and both showed that Android and iOS extended their dominance in the past year.

That may not be a revelation but the numbers show Android has an overwhelming lead over all platforms with 794 million devices shipped. Android phones now account for 79 percent of the market compared with 69 percent last year, according to IDC's report (Gartner's numbers were similar). I find that somewhat alarming given how often I hear Android is inherently more susceptible to malware and security risks than the other platforms.

Runner-up iOS is the only other major player, but despite the fact that Apple shipped 153 million iPhones, its share dropped from 19 percent to 15 percent. Between the two, they account for 94 percent of the smartphone market.

While Microsoft continues to struggle to make a dent in the market, it has solidified its third-place position with 33 million Windows Phones shipped. That's nearly double last year's total. Windows Phone now has a 3.3 percent share of the market, up from 2.4 percent last year. Nokia shipped 89 percent of Windows Phones, which was "a testament to its expanding portfolio that addressed entry level all the way up to large-screen smartphones," wrote IDC analyst Ryan Reith. "What remains to be seen in 2014 is how Microsoft's acquisition of Nokia's smart devices will propel volumes higher."

No surprise is that BlackBerry's precipitous fall continued. The smartphone maker shipped 40 percent fewer devices last year (19 million) and its share dropped from 4.5 percent to 1.9 percent. As Reith noted, it was the only major player to see its share fall. And indeed it fell sharply. "With new leadership, management, and a tighter focus on the enterprise market, BlackBerry may be in a better position, but still finds itself having to evangelize the new platform to its user base," Reith noted. I think that was a kind but optimistic assessment for BlackBerry.

The question remains: can Microsoft find a way to make it a real three-horse race with Windows Phone? Given recent reports that even Nokia is considering an Android phone, it certainly casts doubt on whether even Microsoft's new devices unit can take on the Android juggernaut. Or maybe it's just hedging its bets.

Either way, Windows Phone and BlackBerry never come up in conversations I have with anyone developing an app and when I bring them up, the response is typically a deafening silence. I sometimes wonder if some people realize they exist. I still don't think it's game over for Microsoft's smartphones and preferences can change with the wind. But for Windows Phone and BlackBerry, this year and next will be critical if they're going to change this two-horse race.


Posted by Jeffrey Schwartz on 02/13/2014 at 12:52 PM


OpenDaylight Is Key to Microsoft's SDN Strategy

When the Linux Foundation brought together rival networking and infrastructure providers to join its new OpenDaylight Project last April to provide interoperable software-defined networks, it was interesting to see Microsoft sign on as a "platinum" member. Windows Azure, Windows Server 2012 and System Center 2012 already support the Open Network Foundation's OpenFlow standards for SDNs (and were enhanced with last fall's R2 releases). But by joining the OpenDaylight Project, Microsoft was committing to open source implementations of key SDN standards.

Microsoft is an OpenDaylight Project platinum sponsor along with Brocade, Cisco, Citrix, Ericsson, IBM, Juniper Networks and Red Hat. In addition, Microsoft Group Program Manager Rajeev Nagar, who oversees the Windows Datacenter Networking and Platform team, is on the board and technical steering committee of the project.

In an interview Monday, Nagar explained why Microsoft is putting so much effort behind OpenDaylight. "If you want customers to be able to try out and deploy SDN technologies, if you want to drive interoperability through different vendor implementations, we think participation, encouragement and contribution to industry consortia like OpenDaylight is a valuable thing to do," Nagar explained.

It's still the early days for SDNs but as organizations move to these new virtual network infrastructures to automate their datacenters, Microsoft sees System Center, Windows Server and Windows Azure as critical components of these new architectures.

The OpenDaylight Project last week released its first deliverable, called the Hydrogen distribution. "Hydrogen is a very good start to the effort," Nagar said. "It offers a base controller and then it also offers a slew of services in relation to the controller that people can try out. When folks say 'hey, I want to deploy SDN or try it out, want to deploy an overlay network or I want to control or manage my network through a programmatic manner,' those capabilities are enabled through Hydrogen."
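To give a flavor of what controlling a network "through a programmatic manner" looks like, here is a rough Python sketch that asks a Hydrogen-era controller's northbound REST API which switches it has discovered. The host, port, credentials and endpoint path are assumptions drawn from Hydrogen's defaults, not anything Microsoft has shipped or demonstrated.

    # Sketch: query an OpenDaylight Hydrogen controller for discovered switches.
    # Host, port, credentials and endpoint path are assumptions.
    import requests

    CONTROLLER = "http://odl-controller.example.com:8080"  # placeholder host
    AUTH = ("admin", "admin")                              # assumed default login

    # Switch inventory for the 'default' container, via the base controller's
    # northbound switchmanager API.
    resp = requests.get(CONTROLLER + "/controller/nb/v2/switchmanager/default/nodes",
                        auth=AUTH)
    resp.raise_for_status()

    for node in resp.json().get("nodeProperties", []):
        print("discovered switch:", node.get("node", {}).get("id"))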

At the summit, Microsoft demoed Hydrogen, which is built on Java, on Windows Server and Windows Azure. Nagar wouldn't say if or when it would offer the code with Windows Server, System Center or Windows Azure.

Posted by Jeffrey Schwartz on 02/12/2014 at 12:57 PM


Web Crawler Helped Snowden Find Classified NSA Files in SharePoint

Though it's well documented that former NSA contract employee Edward Snowden accessed many classified files stored in SharePoint repositories, it was the unfettered permission he had that let him steal and later release those files.

Giving him that access turned out to be an epic failure, whether or not you believe he did the right thing by sharing what he knew with the world. From a security standpoint, this administrator was an insider who was able to steal a lot of classified data that the agency didn't want disclosed.

Now it has come to light how he found these files. A report this weekend in The New York Times says Snowden used a Web crawler to find the 1.7 million files he ultimately took. It doesn't say whether he used the search engine in SharePoint (which effectively is a Web crawler), a freely available one found on the Internet or a custom bot he created.

According to the report, NSA officials only said it functioned like the Googlebot Web crawler, also known by some as a spider. What remains a mystery is why the Web crawler's activities of scanning classified files didn't set off any alarms, especially since the NSA rarely uses Web crawlers, according to the report. Because passwords were inserted, "the Web crawler became especially powerful."
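For readers unfamiliar with the mechanics, the short Python sketch below shows what a bare-bones crawler of this sort does: fetch a page, harvest its links and queue the ones on the same site. It is purely illustrative -- the start URL is a placeholder, and nothing here reflects the NSA's systems or the tool Snowden actually used.

    # Sketch: a minimal breadth-first Web crawler (illustrative only).
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    import requests

    class LinkParser(HTMLParser):
        """Collects href targets from anchor tags."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=50):
        seen, queue = set(), deque([start_url])
        site = urlparse(start_url).netloc
        while queue and len(seen) < max_pages:
            url = queue.popleft()
            if url in seen:
                continue
            seen.add(url)
            try:
                page = requests.get(url, timeout=10)
            except requests.RequestException:
                continue
            parser = LinkParser()
            parser.feed(page.text)
            for href in parser.links:
                absolute = urljoin(url, href)
                if urlparse(absolute).netloc == site:  # stay on one site
                    queue.append(absolute)
            print("indexed:", url)

    crawl("https://intranet.example.com/")  # placeholder start page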

The NSA is still unclear how Snowden came up with the search terms needed to seek out the files he was looking for and the NSA said it doesn't believe the searches were directed by a foreign power. But knowing the search terms apparently has helped the agency determine what information Snowden had, among other things.

As Steve Ragan noted today in his CSO blog, The Times report missed an important point:  "This isn't mastermind-level hacking, it's something any network administrator would know how to do," he noted. "But because Snowden was an insider, not to mention a network administrator with legitimate access to the commands and portals he was mirroring, his explanation for the access and archiving was accepted at face value."

Ragan explained how this can affect you. "At the time the investigators were duped, the NSA had the same problem many organizations have; they were more worried about defending the network from threats that came from the outside, and didn't seriously consider the potential for threats from within."

So if the NSA's SharePoint documents could be found with a Web crawler, an administrator at your organization could do the same with yours, leaving you just as susceptible to data loss as the NSA was. Insider threats are nothing new. But what Snowden pulled off last year, and the fact that he did so with a Web crawler, is a reminder that you need to keep an eye on threats from outside and within.

Posted by Jeffrey Schwartz on 02/10/2014 at 2:23 PM


Nadella: Don't Split Up Microsoft!

Ever since former CEO Steve Ballmer announced his retirement late last summer, some on Wall Street were buzzing that now's the time to sell or spin off certain Microsoft businesses "to provide maximum shareholder value."

Proponents of splitting Microsoft up date back to the Department of Justice antitrust trial in the late 1990s. At the time they spoke of creating "Baby Bills," a term derived from the Baby Bells that were spun off of AT&T back in 1984. Founder Bill Gates was dead against that idea for Microsoft and I'd imagine he still is.

But some still think splitting Microsoft up (or at least spinning off a unit or two) is a good idea. Wall Street was enamored with outside candidates because they had less attachment to the so-called sacred cows. Proponents have often touted Microsoft's market-lagging Bing search engine and Xbox gaming business as good candidates for Microsoft to shed.

Just because Microsoft tapped an insider as CEO doesn't mean the idea of spinning off units is DOA. Some would even like to see Microsoft split into multiple companies. Analyst Tim Bajarin of Creative Strategies even suggested in a blog post Tuesday that Microsoft should be split into three companies -- one focused on enterprise IT, the second comprising mobile devices including tablets and the third on its Xbox business.

The decision to shed any major assets will be made by the board, though the CEO presumably will have a strong voice in convincing the board of whatever he thinks is best. Based on his comments this week, I don't get the sense Satya Nadella is pining to sell off Bing or Xbox, but considering he's only been on the job for a few days, he has a lot to delve into. Nadella said as much in his interview with corporate VP Susan Hauser on Tuesday.

Hopefully he won't succumb to pressure from Wall Street to spin off those businesses. While the Bing search engine may never surpass Google in usage or mindshare, the technology in Bing is crucial to Windows, Office, SharePoint and many other initiatives, some of which may not have even borne fruit yet. Likewise with Xbox.

Perhaps the biggest takeaway this week in Microsoft's selection of Nadella as CEO is the company's acknowledgment that it is an enterprise software company first. At the same time, consumers drive enterprises and the lines have blurred between the two. From that perspective, IT pros should welcome the selection of Nadella as Microsoft's new CEO. "It appears he's prepared to carry out the strategy the company has set," said Directions on Microsoft analyst Rob Helm. "It's a choice for continuity; for enterprises that's somewhat of a relief." Helm added that the choice of Nadella is a strong sign Microsoft will stay the course.

Forrester Research analyst Ted Schadler agreed. "Talk of bringing in an outsider tends to suggest that the entire strategy of the company and the company needs to be reconfigured," Schadler said. "I didn't believe that was true. I believe the path the company has been on is the right path. They have to move from licensed software to software as a service, full stop, period."


Posted by Jeffrey Schwartz on 02/07/2014 at 1:55 PM


Does Nadella Really Want Bill Gates Second-Guessing Him?

When Satya Nadella was in discussions with Microsoft's board to succeed Steve Ballmer as CEO, Nadella reportedly wanted to have Gates close at hand to advise him on technology and product strategy.

Perhaps that's true, or maybe that message was purposely leaked because it was the board and/or Gates who wanted the founder to work closely with the new CEO. Yesterday's news that Microsoft has tapped Nadella to take over as the company's CEO immediately is a huge milestone for the company. Yet the fact that Gates is handing off his seat as chairman to John Thompson to spend a third of his time working with Nadella and the product teams raises a number of questions.

Most notably, does Nadella really want Gates second-guessing him? Nadella, a 22-year veteran, has a strong understanding of Microsoft's legacy, business and technology, and now has the monumental task of taking the company forward. Whether you like him or not, Gates is a legend for bringing PCs and office productivity to the masses. But as Microsoft charges into the post-PC era, it's a very different world.

It's rather ironic: Gates always envisioned the post-PC world, but even when he was around, he never got Microsoft to make the transition. Most famously as CEO, Gates overlooked the growth of the Internet in the mid-1990s and, of course, had a hand in the Windows Vista fiasco. Most dramatically, despite Microsoft having projects such as the WinPad and other mobile tablet efforts in the labs, Apple and Google beat it to the punch. Gates' pooh-poohing of the iPod, and later the iPhone when it was still a Windows Mobile vs. BlackBerry world, was a gaffe Microsoft is still trying to recover from, though Ballmer shares that blame as well.

Maybe it wasn't Nadella looking for the comfort of Gates' presence but a board that, even though it was confident in Nadella's ability to lead Microsoft, was aware that Wall Street wanted star power in the mix. Another possibility is Gates, as founder of the company, wanted to become more involved, as some sources told The Wall Street Journal. The board may have also offered Gates this new role to compensate for giving up his seat as chairman.

As Mary Jo Foley has often pointed out, and did once again, Gates' main focus in this chapter of his life is philanthropy -- and rightfully so. The notion that Gates will become more involved at this juncture for any significant period of time sounds questionable, as I also noted just a month before Ballmer announced his retirement. It's also debatable to what extent Thompson and the board want Gates involved.

Gates is a brilliant person and I'm not suggesting he doesn't have a lot to offer to Nadella and the product chiefs at Microsoft. I just wonder if Gates' presence will be helpful or if it will hold Nadella and the company back. What do you think?

Posted by Jeffrey Schwartz on 02/05/2014 at 1:58 PM


It's Official: Satya Nadella To Immediately Take the Helm at Microsoft

Microsoft today named Satya Nadella as the third CEO in the company's history and Bill Gates will step aside as chairman but will serve as a technical advisor. Director John Thompson will replace Gates as chairman. Nadella takes over as CEO immediately.

Reports that the search was winding down and that the two would be named to take the company forward at a critical juncture in its history surfaced late last week. It was looking probable in recent days that Microsoft indeed was set to give Nadella the nod. But given the fits and starts of the search over nearly six months, there was reason to wonder if the decision was certain.

Gates' future role with Microsoft was also in doubt, given his full-time commitment to the Bill and Melinda Gates Foundation. But Gates will step down as chairman and take on the new board position of founder and technology advisor. In that new role, the company said, he will spend more time at Microsoft working with Nadella to develop technology and product direction. It remains to be seen how much more involved Gates becomes.

Despite calls for outside blood to take over Microsoft, in the end the company found it had the best candidate from within. "During this time of transformation, there is no better person to lead Microsoft than Satya Nadella," Gates said in a statement. "Satya is a proven leader with hard-core engineering skills, business vision and the ability to bring people together. His vision for how technology will be used and experienced around the world is exactly what Microsoft needs as the company enters its next chapter of expanded product innovation and growth."

Since Ballmer announced he was stepping down back in August, Nadella was always considered a candidate, though at first he was seen as a long shot. Over time, however, his prospects seemed to improve thanks to his 22-year tenure with Microsoft, during which he worked in a wide range of groups, overseeing its Bing search engine, its Office business and, most recently, its enterprise tools, infrastructure and cloud business. Unlike Ballmer, who was regarded more for his business and marketing acumen, Nadella is an engineer and computer scientist who also has a broad awareness of how technology is applied to business and is seen as having a vision for the future of consumer and enterprise IT.

Nadella also is known to spend a lot of time in Silicon Valley, which should help bring Microsoft into the mainstream of the technology market. His ties to the region promise to help recruit key partners and talent to move Microsoft forward.

When Nadella appeared at a Silicon Valley media event in September to talk about Microsoft's cloud strategy, it was clear the company was floating him as a candidate. But as Wall Street was pushing for an outsider to come in, the search focused on Ford CEO Alan Mulally and several others. It was never clear whether Microsoft offered the job to any of those candidates, with the leading ones publicly bowing out.

Even though Nadella is widely respected, questions have persisted over whether he could run a company with 100,000 employees (and about 30,000 more due to come on board once Microsoft closes its deal to acquire Nokia). With former Symantec CEO Thompson stepping in as chairman and a strong CFO in Amy Hood, Microsoft believes the two will take some of that pressure off of Nadella.

In an e-mail to employees, Nadella emphasized the company's mobile and cloud-first transition. "While we have seen great success, we are hungry to do more," he noted. "Our industry does not respect tradition -- it only respects innovation. This is a critical time for the industry and for Microsoft. Make no mistake, we are headed for greater places -- as technology evolves and we evolve with and ahead of it. Our job is to ensure that Microsoft thrives in a mobile and cloud-first world."

Born in Hyderabad, India, Nadella has a master's degree in computer science from the University of Wisconsin and another master's degree in business administration from the University of Chicago.

In a video interview released by Microsoft, Nadella shared Gates' view that the company needs to have the goal to make profound changes. "Everything is becoming digital and software-driven," he said. "I think of the opportunities being unbounded and we need to be able to pick the unique contributions that we want to bring. That's where our heritage of having been the productivity company to now being the do-more company where we get every individual and every organization to get more out of every moment of their life is what we want to get focused on."

Nadella's track record at Microsoft overseeing enterprise infrastructure and productivity is also a sign the company won't lose sight of its bread and butter as it tries to develop new use cases for individuals and enterprises alike.

As an IT pro, how do you feel about Microsoft's choice?

Posted by Jeffrey Schwartz on 02/04/2014 at 9:58 AM


Microsoft's Ad Scores in Super Bowl XLVIII

No doubt people are feeling euphoric in Redmond and throughout the Pacific Northwest today with their beloved Seattle Seahawks -- owned by Microsoft co-founder Paul Allen -- trouncing the favored Denver Broncos and winning the Super Bowl. Microsoft also scored a victory of sorts last night with its one-minute commercial. This is the first time Microsoft has aired a Super Bowl spot.

Maybe the commercial wasn't the winner (especially if you're a Seinfeld or Stephen Colbert fan), but it's about time Microsoft stepped up and strutted its stuff in front of one of the biggest audiences it can capture in one snap, so to speak. It joins the ranks of companies that, during and since the dotcom bubble, have run Super Bowl commercials, including Apple, Dell, Google, HP, IBM, Oracle, Salesforce.com and SAP.

Many companies spend tens of millions of dollars to produce and run commercials for the Super Bowl, and that can do a number on an ad budget. But Microsoft's advertising budget was estimated at a hefty $1.23 billion in 2012, according to Ad Age, which described last night's spot as a "tear jerker." The commercial got favorable reviews on social media as well, even from some who are not necessarily Microsoft fans. With Microsoft about to undergo one of the most significant leadership transitions in its history and its home team playing for the Lombardi Trophy, what better time for it to redeem itself to the world?

The emotional ad showcased former New Orleans Saints legend Steve Gleason, who now suffers from ALS, the debilitating condition known as Lou Gehrig's disease. Gleason narrated "what technology can do," subtly showcasing how he dictated commands while using eye-tracking technology with his Surface Pro and how doctors use Microsoft's Kinect motion sensors in operating rooms.

The spot also showcased "how Skype brings children around the world together to learn how physically challenged people can continue to pursue their passions in life with the help of technology and the particularly moving story of a mother gaining the ability to hear for the first time," said Mark Penn, Microsoft's executive vice president for advertising and strategy in a blog post. "These are real people telling their own stories in their own words and we hope you feel as inspired by them as we do."

A one-minute commercial isn't going to convince critics or those passive about technology that Microsoft is still in the game. But remember, if a tree falls in the forest and no one hears it, some may argue it didn't really fall. So if the spot reminded some that Microsoft still has fuel in its tank, it was probably worth the $4 million (at least) the company spent on the commercial, especially as Microsoft is set to announce its new CEO, which could come as early as this week. At the same time, Microsoft needs to continue to get the word out on what it offers and where it's going.

What did you think of last night's commercial?

Posted by Jeffrey Schwartz on 02/03/2014 at 10:56 AM


Spotlight Intensifies on Nadella as Microsoft CEO Frontrunner

The tea leaves continue to point to Satya Nadella as the leading contender to replace Steve Ballmer as Microsoft's CEO. Since yesterday's report that Nadella is the leading candidate, additional reports have surfaced that Bill Gates may cede the chairman's role to board director John Thompson.

Gates may step down as chairman yet have more involvement with the company, Bloomberg reported yesterday. Clearly Nadella wasn't the first choice, though some believe outside candidates such as Ford CEO Alan Mulally were placeholders until Nadella was ready to lead Microsoft. Mulally earlier this month said he would not leave Ford this year.

Nadella, a 22-year Microsoft veteran, is viewed as a capable and bright technical visionary but lacking any background in running a large business. Critics have advocated that only an outsider can shake things up in Redmond. Nomura Securities analyst Rick Sherlund believes Nadella is a good choice, especially with CFO Amy Hood at his side.

"He's complimented by a very good CFO and I think Amy Hood will do a lot to manage the cost structure and together with some influence on the board like from ValueAct," Sherlund told CNBC this morning.

The ideal scenario in Sherlund's mind was for Mulally to take the job and groom Nadella, who would have shaped the company's technical and product strategy. While Microsoft's stock would have gotten a more immediate bump from bagging Mulally, Sherlund has a price target of $45 per share on Microsoft's stock. It was trading up nearly 2 percent at $37.49 at midday today, making it the second most-active stock on the Nasdaq.

Nadella, who grew up in India, has a reputation as a "relentless questioner" with substantial technical chops, as Reuters noted in a report today. That's a major departure from Ballmer, who is better known for his marketing prowess.

Would the combination of Nadella and Hood be enough to keep Wall Street happy? "I think the stock will be in a holding pattern here a little bit," Sherlund said. "Initially it traded up, probably under relief that it wasn't some of the other people that people feared would get that job." While he didn't say so specifically, he didn't deny he was referring to Stephen Elop, a onetime Microsoft executive, who was CEO of Nokia and is slated to return to Microsoft as an executive VP after the Nokia deal is finalized. Elop last year was at one time considered a front runner but quickly dropped from contention, while Nadella was preferred for his deeper engineering background.

Critics of choosing an insider shouldn't presume Nadella wouldn't shake things up at Microsoft, and Sherlund said he would need to be willing to cut loose key executives and bring in new talent. Likewise, Sherlund said Microsoft needs to revamp its board. "I think the dynamics on this board are about to change pretty radically," he said. "This board has not done a very good job at looking after shareholder value over the last decade and there's a lot of people don't have a lot of experience in the technology field. This is a tech company, they are going to be run by somebody with a tech background, and I think they have to allow him to make the changes that are necessary."

Though Ballmer was recently re-elected to the board, Sherlund said he wouldn't be surprised if he exits. Of course, while a decision remains to be made, reports are pointing to an announcement as early as next week. And while it doesn't appear the board has made an offer, the focus on this weekend's Super Bowl is cited as one reason not to announce anything sooner.

In case you haven't heard, the Seattle Seahawks (which, by the way, is co-owned by Microsoft co-founder Paul Allen) are going to the big game. Apparently that's the main thing on the minds of employees in Redmond today.

Posted by Jeffrey Schwartz on 01/31/2014 at 12:10 PM


Microsoft Takes New Measure To Circumvent NSA Surveillance

Looking to ensure its foreign customers don't back off from using cloud services, Microsoft plans to give them the option of assuring their data is stored on systems outside the U.S. The move aims to ease concerns by customers outside the U.S. that the National Security Agency (NSA) or other U.S. agencies can intercept their encrypted communications.

NSA surveillance activities such as the PRISM program, revealed last year by former contractor Edward Snowden, have stoked fear in customers that their data stored on U.S. soil isn't secure. Microsoft's response will address those concerns and cover the data sovereignty requirements of customers in foreign countries, particularly in Brazil and throughout the European Union.

Brad Smith, Microsoft's general counsel, revealed the company's plan to give customers the choice of where their data is stored in an interview last week with the Financial Times. "People should have the ability to know whether their data are being subjected to the laws and access of governments in some other country and should have the ability to make an informed choice of where their data resides," Smith said.

Microsoft confirmed his comments but said it has no immediate plans to elaborate. But Jeffrey Chester, executive director of the Center for Digital Democracy, an advocacy organization for privacy, said Microsoft is the first major U.S.-based digital provider to give customers control over where their data is stored.

"Practically the entire industry is strongly opposed to any EU rule requiring that data on its citizens be stored -- and also regulated -- by either member states or other governmental entities," he said in an e-mail yesterday. "This move should boost the company's prospects attracting EU and other privacy concerned businesses or consumers. It's unlikely, however, that others will follow suit, despite Microsoft breaking ranks with the industry lobby."

Asked why, Chester pointed to a number of groups that oppose forcing providers to offer that choice. Among them are the Internet Association, whose members include Amazon, eBay, Facebook and Yahoo; the Business Coalition for Transatlantic Trade (BCTT) Digital Trade Working Group, which includes companies that conduct online international trade, including its corporate chairs Amway, Chrysler, Citigroup, Dow Chemical, FedEx, Ford, GE, IBM, Intel, Johnson & Johnson, JP Morgan Chase, Eli Lilly, MetLife and UPS; and the Information Technology and Innovation Foundation (ITIF), a Washington, D.C.-based think tank that, ironically, is backed by Microsoft, Chester noted.

Will their positions change? "They want to stuff exporting consumer data to the cloud down the throat of EU consumers.  Perhaps demand will over time change their position, but for now they want no local rules." What are your views on Microsoft's plans to allow foreign customers to store their data offshore?

Posted by Jeffrey Schwartz on 01/30/2014 at 11:24 AM


Microsoft CEO Search Reportedly Nearing the Finish Line

The long search for Microsoft's next CEO may be coming to an end, and it looks like the crown may go to an insider now that Ericsson chief Hans Vestberg is no longer in the mix. The leading contender is now enterprise and cloud head Satya Nadella, according to a report this morning by Re/code's Kara Swisher.

An announcement could come within a week, several unnamed sources close to Microsoft told Swisher. Yet other sources tell her the search committee still hasn't ruled out naming an outsider to the position. Two other insiders are also still contenders, according to the report: former Nokia CEO Stephen Elop, who once led Microsoft's Business Division, and Tony Bates, who led Skype and now heads business development and OEM relationships for Microsoft.

CNBC, a Re/code partner, noted this morning that two others are still in the mix: Microsoft COO Kevin Turner and Paul Maritz, a onetime insider who later became CEO of VMware and now runs Pivotal Software, a business of VMware's parent company EMC. Maritz would be a good choice but he has indicated he isn't interested. Turner would be a surprise move and, from what I've heard, not a welcome one among many employees and partners.

The search is into its sixth month, and the long wait to see who will lead Microsoft is taking a toll on morale at the company, according to Swisher. With several Microsoft conferences kicking off in the coming months, including the SharePoint Conference, Build 2014 and TechEd 2014, failure to have named a new CEO could cast a shadow on those key events, which are expected to reveal the direction of the company's key offerings.

Nadella is a strong candidate, as I noted back in October when he talked up Microsoft's cloud and enterprise computing strategy in San Francisco. In addition to working in numerous groups during his two-plus decades at the company, Nadella has run perhaps the most critical and profitable pieces of Microsoft's business.

While I don't often write about rumors, Swisher is on the money more often than not. Either way, hopefully this search will reach the finish line soon.


Posted by Jeffrey Schwartz on 01/30/2014 at 12:09 PM


Cisco Launches Multi-Cloud Infrastructure Software

Cisco yesterday launched a new portfolio of software that it claims will let large enterprises and service providers build more reliable, secure cloud-based infrastructure and, perhaps most pointedly, is designed to connect multiple clouds via key partnerships and industry standards. The company introduced the new cloud offering at its Cisco Live! conference taking place this week in Milan, Italy.

A key component of the new offering is Cisco InterCloud management software. The company said the software lets enterprises build and securely tie together multiple private clouds while ensuring their resiliency. Cisco claims InterCloud, the latest addition to the Cisco One cloud platform, will also reduce the cost of building public, private and hybrid clouds. The software is in field trials now and the company plans to make it available next quarter.

Microsoft is among several key partners in the Cisco One cloud program. Others include EMC, Citrix, Denali Advanced Integration, NetApp, Rackspace, Red Hat, VCE (the company formed by VMware, Cisco and EMC) and Zerto.

Cisco described InterCloud as a software-based tier that lets enterprises and service providers combine and move workloads (including data and applications) across various public or private clouds. Cisco InterCloud is  also designed to carry over security and network policies. Using InterCloud, Cisco said customers can move workloads between Amazon Web Services, Microsoft's Windows Azure and services from BT, CSC/ServiceMesh, CenturyLink Technology Solutions and Virtustream.

Cisco InterCloud will plug into cloud management offerings including Cisco's own Intelligent Automation for Cloud (IAC), CSC/ServiceMesh's Agility Platform, Red Hat CloudForms, among others. Given integration with Cisco's Unified Computing Systems (UCS), Windows Server and System Center, all indications are that Microsoft will tie its own Cloud OS solution (consisting of System Center and Windows Server) to Cisco InterCloud as well.

Brad Anderson, Microsoft corporate VP for cloud and enterprise, noted in a blog post how InterCloud will let mutual customers extend Redmond's Windows Azure service. "Cisco InterCloud provides the necessary gateway and virtual networking infrastructure to enable customers to seamlessly move and run their applications and workloads on Windows Azure," Anderson said.

While touching on Cisco's emphasis on interoperability, Anderson didn't mention Cisco IAC 4.0 management software, which Cisco also announced yesterday. IAC is a separate management offering from Cisco that provides orchestration among multiple cloud infrastructures including Amazon's EC2, VMware's Hybrid Cloud service and clouds based on OpenStack.

Since Cisco did say its new InterCloud would link with IAC, it's also a safe bet that it will tie into Microsoft's System Center at some point, though neither company has officially announced such plans.


Posted by Jeffrey Schwartz on 01/29/2014 at 12:50 PM


Symantec's NetBackup Upgrade Targets VM Backup and Recovery

Symantec last week rolled out the first upgrade in two years to its NetBackup enterprise backup and recovery software. The company said it gave NetBackup 7.6 a significant performance boost and tuned it for VMware vSphere environments using its replication engine.

While Symantec is arguably the leading provider of enterprise backup and recovery software, a slew of challengers are targeting its dominance and have focused on the proliferation of virtual datacenters. Many have argued that NetBackup was not keeping up with this trend.

Though not pointing to any specific problems with NetBackup 7.5, Symantec Senior Product Marketing Manager Glen Simon said there is a companywide emphasis on improving Symantec's software. "Across the board there's an increased emphasis on quality," Simon said. "This release is preparing customers for the next generation of the modern datacenters."

At a high level, Symantec said NetBackup 7.6 is designed for organizations that are evolving their infrastructure toward software-defined datacenters. The new release is designed to automate large-scale data protection even for those on the cusp of making that transition. According to the company's own research, the amount of data organizations are creating is increasing by up to 70 percent yearly, which the new release is designed to address by providing more automation and faster performance.

Simon emphasized that NetBackup 7.6 also addresses the growth of virtual machines and specifically targets VMware environments, using NetBackup Replication Director to protect them. It can also use snapshots taken from NetApp arrays to protect virtualized environments. The new release can recover VMware vSphere VMs 400 times faster than its predecessor, the company claims.

VMware's dominance notwithstanding, it's not the only hypervisor organizations are using. So what about Microsoft's Hyper-V? "Going forward one of the major focuses on the next release will be Hyper-V," Simon said.

Given the competitive landscape and growth of Hyper-V, the company would be wise not to wait another two years for that upgrade.

Posted by Jeffrey Schwartz on 01/27/2014 at 9:54 AM


SkyDrive Now Called OneDrive

It took six months, but Microsoft has found a new name for its SkyDrive service. The new name, OneDrive, will make its debut "soon," the company announced this morning in a blog post. Microsoft didn't say when the new name will appear but urged customers to sign up for alerts.

The new name also applies to its SkyDrive Pro service, which Microsoft will call OneDrive for Business. SkyDrive Pro is the file storage service for Office 365, SharePoint 2013 and SharePoint Online.

"Changing the name of a product as loved as SkyDrive wasn't easy," wrote Ryan Gavin, Microsoft's general manager for consumer apps and services. "We believe the new OneDrive name conveys the value we can deliver for you and best represents our vision for the future."

Microsoft agreed to rename SkyDrive back in July after losing a trademark infringement case with British Sky Broadcasting Group. Though the company said it would do so after "a reasonable amount of time," it appeared Microsoft was having second thoughts about ditching the name as it continued to promote SkyDrive and SkyDrive Pro.

Among the names that some thought were in play to replace SkyDrive were Azure Drive and WinDrive. Perhaps OneDrive wasn't the first choice but it sounds like a suitable name for the service. Do you like it or do you think it was a bad choice?

Posted by Jeffrey Schwartz on 01/27/2014 at 10:23 AM


Microsoft's Strong Quarterly Growth Shows Early Success of Ballmer's 'One Microsoft'

Microsoft's stellar financial results for the second quarter suggest outgoing CEO Steve Ballmer's devices and services strategy and his "One Microsoft" corporate philosophy are working.

The company stunned Wall Street analysts yesterday with better-than-forecast results for its second fiscal quarter. The company's $24.5 billion in revenues for the period ended Dec. 31 were $800 million higher than projected. Earnings came in at a whopping 76 cents a share, 8 cents higher than the 68 cents anticipated.

The surprise uptick was the result of an increase in sales of its Surface devices, which were a black eye for the company last year. A large uptake of Office 365 (with 3.5 million subscribers) and high demand for the company's new Xbox One game console also helped spur the growth. Even so, traditional Office revenue declined 24 percent to $244 million, as reported by Redmond's Kurt Mackie.

Microsoft CFO Amy Hood said on last night's earnings call that the quarter's results validate its transition to a devices and services company. Devices and Consumer revenues of $11.9 billion were up 13 percent. In an indication Microsoft may be turning the corner with its Surface devices, the company said Surface revenues doubled over the previous quarter to $893 million. The increase is the result of the newly released Surface 2 and Surface Pro 2, which received more favorable reviews and a higher level of consumer interest than the previous generation of devices.

"We feel good about the progress we have made over the past couple of quarters, and are enthusiastic about the overall opportunity ahead with Surface," Hood told analysts yesterday on the earnings call. "The Windows ecosystem as a whole is also making important progress."

To that latter point, the progress remains incremental.

With Surface devices ranging in price from $449 to $1,799, depending on model and configuration, Reuters estimated Microsoft sold roughly 2 million Surfaces. By comparison, it is estimated that Apple will announce it sold 20 million iPads during the same period.

Still, Microsoft's advantage over Apple and Google is the fact that despite its consumer push, it's still primarily an enterprise company, as Mary Jo Foley points out in her All About Microsoft blog. Commercial revenues of $12.7 billion were up 10 percent. And as noted by The Wall Street Journal, commercial sales account for two thirds of Microsoft's gross profit.

Hood said commercial bookings grew 12 percent, while license renewal rates were healthy among its key enterprise products, despite a large number of license agreements expiring. Renewals increased 12 percent and contracted (but unearned) revenue came in at a record $23 billion, Hood said. "To me this is important as these long-term commitments demonstrate the confidence customers have in our product roadmap and where we are investing in key areas such as big data, infrastructure management and cloud computing," she told analysts.

Microsoft did not use the earnings release to discuss its search for a new CEO but the latest results suggest whoever gets the nod will have to make a strong case to make any dramatic changes in strategy. That doesn't mean one can't and won't make such a case. But the next CEO could also look to find better ways to execute on the existing plan.

Posted by Jeffrey Schwartz on 01/24/2014 at 12:02 PM


IBM To Hand Off Its x86 Server Business to Lenovo

IBM's long-expected departure from the x86 server business is official. The company today said Lenovo will acquire the commodity business for $2.3 billion, nine years after the Chinese company bought IBM's then-struggling PC business.

The deal includes all IBM servers designed to run Windows and Linux, though the company is not selling off its high-end server business built on its Power processor, which runs both Unix and Linux. With the sale of its commodity server business goes IBM's last major tie to Microsoft as an official partner. All the same, many of IBM's wares and services sit side-by-side with Microsoft-based systems, and Big Blue's consulting and services business supports Redmond's key product lines as well. Just last week, IBM said it will support Microsoft's Dynamics product line.

Lenovo is picking up IBM's BladeCenter and Flex System blade servers and switches, Flex-integrated systems, NeXtScale and iDataPlex servers and related software, the company said. The deal also includes IBM's blade networking and maintenance operations. 

In addition to retaining its Power Systems, IBM said its hardware portfolio will include its System z mainframes, storage, and PureApplication and PureData appliances. Talk of a deal started to surface earlier this week, a year after negotiations fell apart. As IBM apparently became willing to lower its asking price, according to The New York Times, other players including Dell and Fujitsu expressed interest. Instead, IBM decided to finish its aborted negotiations with Lenovo.

It's hardly surprising that IBM decided to exit the commodity server business, given its legacy of focusing on higher-margin hardware, software and services. Just as Lenovo used the acquisition of IBM's PC business to expand its line of desktops and laptops and gain entry into the tablet market, it will be interesting to see how aggressively it moves to undercut Cisco, Dell and Hewlett-Packard.

"Competition will remain fierce, with no tendency for oligopolistic behavior among the remaining participants," said Forrester analyst Richard Fichera, in a blog post. "Overall server market volumes will not change as a result of this acquisition."

As part of the deal, Lenovo will also resell IBM's Storwize disk storage systems, tape storage, General Parallel File System software, SmartCloud Entry offering and other components of IBM's system software offerings such as its Systems Director and Platform Computing offerings. 

Posted by Jeffrey Schwartz on 01/23/2014 at 12:38 PM


Convirture Adds Hyper-V Support to Management Tool

Convirture, which provides a management platform and console for virtual servers and cloud infrastructure, said its solution now supports Microsoft's Hyper-V.

The new release of the company's namesake Convirt Enterprise lets IT pros and administrators provision and manage Hyper-V hosts and virtual machines. Convirt Enterprise had already supported VMware, KVM and Xen hypervisor platforms. It can also manage VMs running in various cloud services, namely Amazon Web Services EC2 as well as those compatible with OpenStack and Eucalyptus Private Clouds.

But before the latest release of Hyper-V (included in Windows Server 2012 R2 last fall), Convirture didn't believe Hyper-V was suited for large-scale virtualization environments, nor did it see many IT organizations using it in large-scale datacenters, said Arsalan Farooq, CEO of the San Mateo, Calif.-based company.

"We were a little surprised that Hyper-V took this long to start showing up, but there were some legitimate concerns around it and I'm talking about bread and butter hypervisor features," Farooq said. "Since Microsoft shipped Windows Server 2012 R2, Hyper-V has now become real enough that that the cost advantage and the push Microsoft is putting behind it, is starting to gain traction."

Farooq said Convirt Enterprise will appeal to shops running Hyper-V that require more functionality than the Hyper-V Manager Microsoft offers. Yet for those that need to manage multiple hypervisor platforms or require centralized management of multiple VMs and hosts, he said it's a fraction of the cost of Microsoft's System Center Operations Manager.

"Hyper-V Manager is a great tool for small scale management of a few hosts, maybe," he said. "But if you want to go beyond that, if you want to do more capable management, more sophisticated management, you have to go into a full centralized management suite. Now you can have a pretty sizeable Hyper-V environment without having to take on all this conceptual overhead of System Center's networking model and configuration."

Convirt Enterprise provides centralized management through a Web-based console, generates historical metrics, handles VM scheduling and provides self-service provisioning, he added. It costs $449 per single-socket host, and existing customers will receive the new Hyper-V support as a free update. It doesn't support Microsoft's Windows Azure cloud service, but Farooq indicated that's on the company's roadmap.

Posted by Jeffrey Schwartz on 01/22/2014 at 10:27 AM


Should Microsoft Choose John Thompson as CEO?

A rumor surfaced yesterday that Microsoft may be considering John Thompson as its next CEO. Thompson, who had a long career at IBM and later served as Symantec's CEO, is now on Microsoft's board and heads the search committee for Ballmer's successor.

According to CNET's Charles Cooper, there's buzz in Redmond that naming Thompson as CEO on an interim basis is a plan B the company is considering in the wake of Ford's Alan Mulally falling off the list. If that were to play out, which I believe is a longshot (though stranger things have happened), it could be a way to groom an internal candidate such as Satya Nadella or Stephen Elop to take over at a future date.

It wouldn't be the first time a board member stepped in as a CEO. HP's Meg Whitman famously did so over two years ago and who can forget Dick Cheney, who led the vice presidential search team for then-Republican nominee George W. Bush. And we know how that turned out.

Thompson certainly has a strong resume and is well respected. But some question whether his decade as CEO of Symantec was a successful run. While he was there, he led Symantec's famous acquisition of Veritas for $13.5 billion.

While Veritas brought Symantec into the data protection (backup and disaster recovery) market, critics argue the price tag was way too high. In addition to a major culture clash, which I heard about from former executives of both companies over the years, rumors have surfaced that Symantec has considered selling or spinning off its data protection business, though nothing has ever come of that.

In addition to his current role on Microsoft's board, Thompson is now CEO of Virtual Instruments, a performance management vendor.

Cooper argues Thompson has credibility with Wall Street,  and that his tenure with Symantec and his reputation in Silicon Valley make him a suitable candidate. Perhaps he'd be fine but I think (and I hope Thompson and the board agree) Microsoft would be better off not naming a CEO who was a plan B.

Posted by Jeffrey Schwartz on 01/22/2014 at 12:36 PM


VMware Steps Up MDM Battle with $1.54 Billion AirWatch Deal

VMware today said it has agreed to acquire mobile device management vendor AirWatch for $1.54 billion. AirWatch is regarded as a leading supplier of software for securing and managing smartphones, tablets and other systems and personal cloud storage services.

IT managers are increasingly letting employees use their personal devices on their networks. Even on networks where they're not allowed, many people use their own devices anyway, and IT managers need to ensure they can control how and where data is accessed and stored. While VMware has been rolling out its own mobile device management tools under the Horizon brand, it appears the company has now opted to go with the more established AirWatch, a privately held company based in Atlanta, Ga., with 10,000 customers and 1,600 employees. What role Horizon will play in VMware's future remains to be seen, but based on the company's initial statements, it looks like Horizon could be left out in the cold.

"AirWatch has a leading position in the standalone MDM market, which VMware hopes to leverage to enhance its own mobile ambitions," said analyst Jack Gold  in a statement released via e-mail. "However, it will be a challenge for VMWare to integrate the AirWatch technology with its own, as is the case with any technology acquisition into an existing base. We expect that AirWatch will become the dominant technology base for any future [VMware] Horizon product, and indeed expect that Horizon will ultimately fade away in favor of the AirWatch brand."

Virtually every major IT vendor is now emphasizing mobile device management. Microsoft has made mobile device management a key feature in System Center 2012 R2. By acquiring AirWatch, VMware also joins a number of established IT vendors that have added device management software to their overall systems administration portfolios. Citrix acquired Zenprise just over a year ago, Oracle in November bought Bitzer and just last week IBM said it is acquiring FiberLink. And Sybase's MDM business was a key reason SAP acquired that company nearly four years ago (in addition to its database business).

At the same time, traditional remote monitoring and management suppliers such as Continuum, GFI Software, Kaseya, LabTech, Level Platforms and N-able are extending their mobile device management features. Many of them have historically sold their wares primarily to managed service providers. However, many have recently begun targeting enterprises.

VMware said the AirWatch operation will be the focal point of its mobile systems management operations. AirWatch will become part of VMware's End User Computing Group, headed by Sanjay Poonen, the company's general manager and executive vice president.

AirWatch founder Alan Dabbiere will report directly to VMware CEO Pat Gelsinger. "With this acquisition VMware will add a foundational element to our end-user computing portfolio that will enable our customers to turbo charge their mobile workforce without compromising security," Gelsinger said in a statement.

Under the terms of the deal, VMware is funding the acquisition with $1.175 billion in cash and $365 million in installment payments, the company said. It's slated to close this quarter.

VMware also issued preliminary financial results for the fourth quarter of 2013, which it will officially release next week. The company said revenues are expected to be $1.48 billion, a 15-percent year-over-year increase. The uptick takes into account divestitures including Pivotal Software. Excluding those divestitures, VMware said revenues increased 20 percent.

Posted by Jeffrey Schwartz on 01/22/2014 at 11:23 AM


Kinect 2 Coming to Windows This Summer

Microsoft's Kinect camera is best known by those who attach it to their Xbox gaming consoles, although Microsoft also offers a version of the sensor for Windows PCs. While Kinect is a toy for some, for others it's enabling new business opportunities.

Kinect was prominent at last week's National Retail Federation show in New York. During my booth tour, I even had the opportunity to chat briefly with Chris White, the senior program manager for Kinect, who oversees its development and marketing. White confirmed that the eagerly anticipated next iteration of Kinect is on pace to arrive for Windows this summer.

The new Kinect 2 will sport an HD (1080p) swivel camera with 1920 x 1080 resolution, support for 30 frames per second (fps) and a 16:9 aspect ratio. Those specs are a big step up from the first-generation Kinect's 640 x 480 (480p) resolution, 30 fps and 4:3 aspect ratio, according to a post by 123Kinect; at 1080p, each frame carries roughly 6.75 times as many pixels. Many retailers and distributors of apparel and other consumer goods should find that major boost appealing for product development.

Microsoft had a number of partners demonstrating Kinect at its NRF booth last week, which it described in a blog post.

One Microsoft partner bullish about the potential for Kinect 2 is FaceCake, a company that has developed what it calls a virtual dressing room. Using its swivel camera, customers can visualize how apparel will look on them -- whether it's a tie, a blouse or any other garment. With Kinect 2's HD capabilities and other features, the swivel camera will also be able to provide better detail and will be useful for gestures, explained Tom Chamberlin, FaceCake's vice president of business development.

Microsoft has hundreds of developers working with the new Kinect 2 SDK, and the deadline for participating in the preview program is Jan. 31. As of last week, Microsoft was believed to have filled 300 of the 500 preview slots.

Among those testing Kinect 2 is NASA's Jet Propulsion Laboratory, which has already used it to control a robotic arm.

While Kinect 2 promises to have a lot of appeal to those building new vertical and industrial applications, I wonder if Kinect 2's improved precision will make it more appealing for mainstream desktop and communication functions, such as video conferencing. If Microsoft's tendency of getting things right the third time holds here, we may have to wait for Kinect 3. But the Kinect 2 looks like it will be a nice improvement over the first version.

Posted by Jeffrey Schwartz on 01/21/2014 at 2:13 PM


HP Kicks Off Aggressive Incentive for Windows 7 PCs

The good news is the share of PCs and tablets running Windows 8/8.1 is on the rise. The bad news is that despite the OS now surpassing the 10-percent share mark, according to Net Applications, some PC vendors are reportedly not pleased with the pace of uptake for the vastly altered operating system.

Many enterprises with Software Assurance or other licensing agreements can deploy Windows 7 when they roll out new PCs to users. Most PC vendors offer consumers a limited number of Windows 7-based systems or give the option to have a system shipped with it. But it's not something they have promoted in the past year. That is until now.

HP has kicked off a promotion offering $150 off for customers who choose a PC loaded with Windows 7. And the promotion isn't buried on the company's Web site. In fact, it couldn't be more prominent: just go to HP.com and it's in your face.

The timing could be good for those who have to bite the bullet and get rid of their Windows XP systems, which Microsoft will stop supporting April 8. Although Microsoft said last week that it will extend antimalware support for Windows XP until July 2015, the company will stop issuing security and other patches for Windows XP once the deadline hits. Hence experts are warning that users should upgrade their systems in advance.

A spokeswoman for HP told me the Windows 7 promotion is not a sign of dissatisfaction with Windows 8. "We've been offering Windows 7 since the availability of Windows 8 and we will continue to offer a broad set of choices for our customers," the spokeswoman said. "There are promotions on HP Shopping all the time; this is just another promotion."

Moreover, this is a brief campaign, she said. It will end this Saturday.

Posted by Jeffrey Schwartz on 01/21/2014 at 12:06 PM


Obama Moves To Limit NSA's Surveillance Activity

Looking to strike a balance between maintaining security against major threats and ensuring individual privacy, President Barack Obama today ordered a halt to the Section 215 bulk metadata program in its current form. The president also recommended a set of reviews and guidelines aimed at putting limits on the National Security Agency's surveillance activities.

The move is the strongest effort yet by the administration to dial back the activities since last year's disclosure of the agency's surveillance programs by former NSA contractor Edward Snowden. However, today's announced changes did not seek to eliminate the programs. Snowden's revelations last June have shattered the trust of individuals and IT pros alike.

In a speech (transcript) this morning at the Justice Department, the president argued that the review group he commissioned last year did not discover any indication of abuse when mining the metadata found in phone records. Obama emphasized a point he, NSA officials and others have frequently made since Snowden disclosed the various surveillance efforts such as PRISM: The government isn't going through records of domestic citizens but only a consolidated database of records service providers already save for billing and other routine purposes.

Because the review panel concluded it saw no signs of abuse, Obama said, "I believe it is important that the capability that this program is designed to meet is preserved," citing its effectiveness in thwarting attacks since Sept. 11, 2001. But he added that the program does need added safeguards moving forward to prevent the potential for abuse that exists.

"This type of program could be used to yield more information about our private lives, and open the door to more intrusive bulk collection programs in the future," Obama said. "They're also right to point out that although the telephone bulk collection program was subject to oversight by the Foreign Intelligence Surveillance Court and has been reauthorized repeatedly by Congress, it has never been subject to vigorous public debate."

Obama said the review board's recommendation to replace the current approach with having a third party hold the records isn't practical, noting it would pose technical questions and new privacy concerns, add cost and reduce accountability. Another option was raised -- maintaining the current capabilities but using a combination of authorities with improved information sharing. "More work needs to be done to determine exactly how this system might work," he said.

As such, Obama ordered that the NSA transition from its current program. The government will only target phone calls that are two steps removed from terrorist organizations rather than the existing three. Obama also ordered Attorney General Eric Holder to collaborate with the Foreign Intelligence Surveillance Court to allow agents to only query the database following a judicial order or if there's an immediate emergency.
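
To put "two steps removed" versus three in concrete terms, here's a small, hypothetical sketch (not based on any actual NSA system or data) of how hop-limited contact chaining works in a call graph, where each hop is one call link away from a seed number:

    # Hypothetical illustration of hop-limited contact chaining; the graph is made up.
    from collections import deque

    def numbers_within_hops(call_graph, seed, max_hops):
        """Return every number reachable from 'seed' in at most 'max_hops' call links."""
        reached = {seed}
        frontier = deque([(seed, 0)])
        while frontier:
            number, hops = frontier.popleft()
            if hops == max_hops:
                continue
            for contact in call_graph.get(number, ()):
                if contact not in reached:
                    reached.add(contact)
                    frontier.append((contact, hops + 1))
        return reached

    # Toy call graph: A talks to B and C, B talks to D, D talks to E.
    graph = {"A": {"B", "C"}, "B": {"A", "D"}, "C": {"A"}, "D": {"B", "E"}, "E": {"D"}}
    print(numbers_within_hops(graph, "A", 2))  # contains A, B, C, D
    print(numbers_within_hops(graph, "A", 3))  # adds E, three links out

Because each additional hop multiplies the pool by the typical number of contacts per phone, dropping the limit from three hops to two shrinks the set of records analysts can touch by orders of magnitude.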

During the transition, the president also asked Holder to determine how the metadata gathered under what is called the Section 215 program can remain accessible without the government holding the data itself. Obama asked for those alternatives by March 28 and said he'll seek permission from Congress to launch any new programs, if required.

"I believe we need a new approach," he said. "I am therefore ordering a transition that will end the Section 215 bulk metadata program as it currently exists, and establish a mechanism that preserves the capabilities we need without the government holding this bulk metadata."

Obama announced the government has declassified over 40 opinions and orders of the Foreign Intelligence Surveillance Court (FISC), which reviews what he said are the most sensitive activities, including the Section 702 program targeting foreign individuals overseas as well as the Section 215 telephone metadata program. Furthermore, he asked Holder and Director of National Intelligence James Clapper Jr. to review the privacy implications of surveillance activities and report to the president and Congress yearly. Obama also called on Congress to approve a panel of advocates from outside the government to weigh in on major cases before the FISC.

To address criticism that the government isn't transparent about its programs, Obama directed Holder to ease the secrecy restrictions on so-called national security letters, which are used to demand data on individuals without their knowledge, so those inquiries can be disclosed once an investigation no longer requires secrecy. "We will also enable communications providers to make public more information than ever before about the orders that they have received to provide data to the government," he said.

While critics see these as a step in the right direction, some argue Obama's statement is incremental at best and not likely to put an end to this debate. "Despite these welcomed reforms, the president's recommendations are still lacking when it comes to striking the appropriate balance between privacy and security," said Internet Infrastructure Coalition (i2Coalition) chairman Christian Dawson in a statement issued after Obama's speech. "Without actions that include meaningful reforms to both bulk surveillance, and the indiscriminate use of National Security Letters, all together such a balance is unlikely to be achieved."

Political rants aside, what's your take on the president's move to curtail bulk collections and review metadata analysis?

Posted by Jeffrey Schwartz on 01/17/2014 at 3:44 PM


Many ATMs Are Still Running Windows XP

It's bad enough that 30 percent of the PCs in the world still run Windows XP and risk running an unprotected OS after April 8. Even more alarming is that many of those machines are ATMs at banks and other locations.

Does that mean ATMs will become vulnerable to malware or data theft once the deadline expires? As Bloomberg BusinessWeek reported yesterday, it depends. Those that run Windows XP Embedded have a little more time as Microsoft will continue to support that version until early 2016. But those with Windows XP will be susceptible to malware and other attacks, said Dean Stewart, an executive at ATM supplier Diebold.

"It's a very real risk," Stewart told the publication. "No ATM operator wants to get his name in the paper."

Only 15 percent of ATMs will be running Windows 7 by the April 8 deadline, added Aravinda Korala, CEO of specialty software supplier KAL.

Those willing to pay up can receive extended support; JP Morgan Chase is doing this for its thousands of ATMs, according to the report. The cost of that extended support could jump fivefold in the second year, according to Korala. And depending on the age of the ATM, an upgrade could require new hardware components that cost a few hundred dollars or, for older machines, several thousand dollars.

That some of the leading banks' ATMs are still running some form of Windows XP is not a surprise. Everywhere I go I see Windows XP machines: at the gym, my own Bank of America branch and numerous retail establishments. I hear many classrooms and computer labs also still have Windows XP-based PCs.

More concerning is that some will feel emboldened to wait to upgrade thanks to this week's news that Microsoft will extend antimalware support for Windows XP until July 14, 2015. However, Microsoft warned that the extended support isn't a good reason to wait longer to upgrade your Windows XP systems, which will remain vulnerable to other attacks after April 8.

If you still have or support Windows XP systems, what are your plans in light of the April 8 deadline? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 01/17/2014 at 1:25 PM


Commercial Windows 8.1 Tablets Showcased at NRF Show

After lying low at the Consumer Electronics Show last week, Microsoft pulled out all the stops at the National Retail Federation show in New York, which kicked off Monday. While the NRF show is focused on retail, the large trade show also encompasses the entire goods supply chain.

Attendance at NRF topped 30,000 with over 500 exhibitors taking up the entire Jacob Javits Convention Center. Microsoft showcased how it is looking to transform business across a number of industries with hardened versions of Windows 8.1, big data analytics and its Dynamics platform. The company talked up planned enhancements for Dynamics for Retail including an unlikely boost from IBM, which will partner with Microsoft on large projects as well as with content management system supplier Sitecore.

I spent the better part of Monday at NRF, where Microsoft hosted a number of sessions to outline how various customers are using the rapidly growing number of Windows 8.1 tablets in retail environments. Ashley Furniture, Avis, Delta Air Lines and Kohl's were there to talk up how they are using Windows 8.1 tablets and handhelds to enable clerks and sales associates to provide more information on the sales floor.

The operative word from Microsoft officials at NRF was "omni-channel": an emphasis on improved Windows-based point-of-sale devices, better information for customers in the store and online, and improved operational efficiency through integration and the use of analytics.

Microsoft's booth was so crowded that it reminded me of its CES and Comdex exhibits. In addition to a display of its Surface tablets, the "device bar" featured laptops, tablets and handheld Windows 8.1 devices from Dell, Hewlett Packard, Lenovo, Panasonic and numerous other players.

"Last year at NRF, in my backpack I had the entire portfolio of Intel Windows 8 devices to show off in the booth," said Paul Butcher, a retail strategist at Intel, speaking at a session on the use of tablets in retail. "This year there was absolutely no way I could have done that. I'm talking about the range of devices available to you as business-minded individuals who need to make a decision about what type of device you're going to deploy."

Tracy Issel, general manager of Microsoft's worldwide retail segment, apologized to a standing-room-only crowd of attendees that it took the company this long to get to this point. "Our customers have been very patient with us," she said. "It's taken us a little time to get where we are. We [can now] enjoy a very healthy share of point-of-sale systems and handheld terminals."

At the booth, Microsoft showcased what it called a "connected fitting room" (developed by Accenture and Avanade for Kohl's) that allows customers to select other merchandise from a touch display. It's designed so a sales associate with a Windows tablet is notified of the request and delivers the item, so the customer doesn't have to leave the dressing room.

Also at the booth, partner FaceCake demonstrated an application that lets customers see how different products would look on them using a Kinect camera in a virtual dressing room. And Razorfish demonstrated a pop-up store that would enable customers to design their own custom surfboards.

In many retail scenarios, customers are using Windows Embedded 8.1, a "hardened" version of Windows 8.1, said Simon Francis, Microsoft's Windows Embedded U.K. enterprise lead, who walked me through the company's booth. The embedded version of Windows 8.1 has the same capabilities as the commercial release, though customers and partners can lock out certain features, such as Internet Explorer or access to the Windows Control Panel. "A lot of these devices are going to be in front of the general public," said Francis. "So you need to lock it down."
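To give a concrete, if simplified, sense of the kind of lockdown Francis describes, here is a minimal, hypothetical Python sketch that hides the Control Panel for the signed-in user by setting the standard NoControlPanel policy value in the registry. It illustrates the general approach on any Windows 8.1 machine; it is not the Embedded-specific lockdown tooling a device builder would actually use.

# Minimal lockdown sketch: hide the Control Panel for the current user
# by setting the well-known NoControlPanel Explorer policy value.
# Illustrative only; run as the kiosk/point-of-sale user on Windows.
import winreg

POLICY_KEY = r"Software\Microsoft\Windows\CurrentVersion\Policies\Explorer"

def hide_control_panel(enable: bool = True) -> None:
    # Create (or open) the per-user Explorer policy key and set the flag.
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, POLICY_KEY) as key:
        winreg.SetValueEx(key, "NoControlPanel", 0, winreg.REG_DWORD,
                          1 if enable else 0)

if __name__ == "__main__":
    hide_control_panel(True)
    print("Control Panel access disabled for the current user.")

The same pattern -- setting documented policy values for the locked-down account -- applies to restricting other shell features, though production kiosks would typically manage this through Group Policy or the Embedded lockdown features rather than a script.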

 

Posted by Jeffrey Schwartz on 01/15/2014 at 8:33 AM0 comments


Will the Latest Employment Reports Impact Your Career Plans?

Among the many priorities IT pros have on their agenda for 2014, one is considering new employment opportunities.

According to a survey of Redmond magazine readers conducted early last month, 26 percent of you will look to change employers this year. That's the same figure as in last summer's annual salary survey, and a two-fold increase over our 2012 salary survey, in which only 13 percent said they might look to change employers. An improving economy and a lower unemployment rate are key reasons, and many IT pros are unhappy with their current compensation, among other factors.

However, according to a recent report, it may be more challenging to find that new job than you thought. The U.S. Department of Labor's monthly employment report Friday showed job growth slowed sharply to a three-year low. Employers added only 74,000 jobs, falling way short of the 200,000 economists had forecast.

It remains to be seen whether it was a year-end aberration or a sign that growth in the economy may be decelerating. IT experts have mixed opinions as to what this means for technology jobs. IT job growth is slowing, says Victor Janulaitis, CEO of Janco Associates, which tracks tech employment. Only 3,200 IT jobs were added in December, 15,900 in the past three months and 74,900 in 2013, Janco reported Friday after the Labor Department released its employment report.

"The employment data is not as good as the fall in the national unemployment rate suggests and it seems to be worse for IT pros," Januliaitis said in a statement. "If you factor in the participation rate, the true national unemployment rate would be around 12 percent. That data is causing many companies to consider whether they should expand IT staffs."

However, others believe the worst may be over. While IT ranked fifth in job cuts last year, the rate of downsizing decreased 24 percent versus 2012, according to outplacement firm Challenger, Gray & Christmas, which forecast on Friday that IT would be one of the largest areas of job growth this year. IT ranked third in hiring with 26,000 tech jobs in the pipeline. Another positive indicator: Forrester earlier this month said IT purchases will increase 6.2 percent this year, with the U.S. leading the way. Perhaps skewing 2013's IT job growth numbers were major cutbacks by large employers in the tech industry, including Cisco, Hewlett-Packard and Intel, according to Dice.

Did last week's report make you feel more optimistic about your career options or do you think it may be more difficult to land a better job than you had anticipated? And for those in IT management who are hiring, are you moving forward with adding new positions this year? If so, where is your highest demand for IT expertise?

Posted by Jeffrey Schwartz on 01/13/2014 at 11:57 AM0 comments


Microsoft's Presence Lives On at CES with Windows 8.1 Devices

After two decades of kicking off the annual Consumer Electronics Show with the opening keynote, Microsoft last year bid the largest tech confab in the U.S. adieu. Even with the first CES without Bill Gates or Steve Ballmer on stage, and no Microsoft booth in the exhibit hall, the company's presence still loomed large in Las Vegas this week.

A number of key Microsoft partner PC makers rolled out a surprisingly large number of new tablets, convertible PCs, laptops and even some desktops running the recently released Windows 8.1 (along with plenty of Android-based tablets). All the new Windows devices in the pipeline should substantially bolster Microsoft's client platform.

Asus, Hewlett Packard, Lenovo, Sony, Panasonic and Toshiba are among those that released new Windows 8.1 devices at CES. Dell was relatively quiet, having rolled out an extensive Windows 8.1 lineup in October. However, it did introduce a gaming system from its Alienware division.

Though CES emphasizes consumer wares, many of the new Windows 8.1 devices are targeted at business and commercial users, especially now in the BYOD age.

One introduction that raised eyebrows at CES (as reported Wednesday) was the first Intel-based dual-boot convertible laptop/tablet that can run both Android 4.2.2 and Windows 8.1, though not simultaneously, from Asus. Asus' new Transformer Book Duet TD 300 got a substantial amount of buzz and now many are wondering whether other such devices will become a trend.

The sheer number of new Windows 8.1 tablets and PCs at CES suggests Microsoft's Surface tablets haven't caused vendors and partners to abandon Windows for Android devices. That could still happen over time. But, so far, it apparently hasn't. Here are some products showcased at CES:

Panasonic's New Toughpad: The FZ-M1
Panasonic's Toughbook line has always been a favorite of those who need a device that can withstand rugged use. The new Toughpad FZ-M1 is a 7-inch tablet that Panasonic says is designed to work in any condition.

The 1.2-pound Toughpad FZ-M1 is powered by an Intel Core i5 vPro processor and runs the 64-bit version of Windows 8.1. It comes with 8 GB of RAM, a 128/256 GB solid-state drive and a 1280 x 800 10-point multi-touch display. Panasonic claims a durability rating of MIL-STD-810G at a five-foot drop.

It also supports an optional dedicated GPS, barcode reader, NFC reader, SmartCard reader, 10-foot range RFID transmitters, magnetic stripe reader and multicarrier-embedded 4G LTE wireless connectivity. Due out this spring, the starting price will be $2,099.

Lenovo ThinkPad Tablet Edition
Lenovo continues to modernize the ThinkPad, which is now available in numerous sizes. The new ThinkPad 8 brings the form factor down to an 8.3-inch device with a 1920 x 1200 display. It is made of aluminum and can function in three modes: desktop, tent and tablet mode.

The ThinkPad 8 is powered by an Intel Atom processor and is available with up to 128 GB of storage, a micro USB 3.0 port and LTE connectivity. Pricing will start at $399. It will compete with the Dell Venue 8 Pro. Asus also showcased a similar form-factor tablet, the VivoTab Note 8, which will top out at 64 GB of storage. Asus didn't disclose pricing and availability.

HP's Second-Generation Workstation
Hewlett-Packard launched the second version of its 27-inch HP Z1 desktop workstation. Due to ship in late January, the Z1 G2 starts at $2,000 and is targeted at CAD specialists, graphic designers and those who require compute-intensive performance.

The Z1 G2 is available with Intel Core and Xeon processors, ECC memory and RAID storage configurations. For those with more mainstream requirements, HP also launched the HP 205 and ProOne 400, the latter available with either a 19- or 21.5-inch display. The HP 205 comes in somewhat smaller at 18.5 inches and is powered by an AMD E-Series dual-core processor. It will start at a reasonable $400.

Sony Flips Again
Following up its fall launch of Flip PCs, Sony added the VAIO Fit 11A, an 11-inch iteration of the lineup. It is a convertible PC/tablet hybrid that complements the 13-, 14- and 15-inch systems in that family. The 2.82-pound VAIO Flip PC will start at $799 and is slated to ship in late February.

Toshiba's Satellite and Tecra
If you're looking for a workstation-class Windows 8.1 mobile system, the Tecra W50 is designed for graphics-intensive applications. It's among the first to have a 15.6-inch Ultra-HD display with a 3840 x 2160 resolution. Ultra-HD digital displays have typically only been found in high-end televisions to this point.

The Tecra W50 is powered by a fourth-generation Intel Core i5 processor enhanced with an NVIDIA Quadro K2100M GPU and 2 GB of dedicated memory. One thing it will lack is a touch display. If that's a deal breaker, Toshiba also launched the Satellite P50t at CES, which sports a 4K touch display. Both will be available later in the year. Pricing wasn't disclosed.

That's just a handful of Windows 8.1 systems and devices to show up at CES. My colleague Scott Bekker at sister publication Redmond Channel Partner has showcased some others.

Posted by Jeffrey Schwartz on 01/10/2014 at 1:12 PM0 comments


Dual-Boot Android/Windows Tablet Debuts at CES

While a number of partners used this week's Consumer Electronics Show in Las Vegas to launch a new crop of Windows 8 tablets and laptops, those same partners also launched new Android-based devices. But for those who want the best of both worlds, Asus announced at CES the first Intel-based dual-boot convertible laptop-tablet that can run both Android 4.2.2 and Windows 8.1, though not simultaneously.

Asus' new Transformer Book Duet TD 300 lets users switch between Android and Windows within 4 seconds, the company said. It's the first such system based on Intel's new dual-boot system-on-a-chip (SoC) processor. Like many convertible devices, the Transformer Book Duet is a laptop with a detachable display that converts into a tablet.

It remains to be seen whether customers are craving tablets that can boot both operating systems, but Intel's new CEO Brian Krzanich said Tuesday evening that they are. "There are times you want Windows, there are times you want Android," Krzanich said in his first CES keynote as CEO. Intel's 64-bit SoCs "are the only ones that can offer that capability to seamlessly switch between OSes," he added. "You don't have to make a choice moving forward."

Asus' new Transformer Book Duet TD 300 is loaded with an Intel Core i7 processor with HD graphics, 4 GB of RAM, a 128 GB SSD and is available with a 1 TB hard drive. The company claims it can run twice as fast as tablets with ARM-based processors. Because it doesn't use OS virtualization, Asus said each operating system utilizes the full capacity of the processor. When running Windows, Asus said it will get 5 hours of battery life and 6 hours when running Android.

With a 13.3-inch 1920 x 1080 Full HD IPS touchscreen, Asus claims it exceeds Microsoft's viewing requirements for Windows 8.1, though it doesn't appear likely Redmond was looking for its partners to bring dual-boot systems to market. But if archrivals Microsoft and Google can each benefit from the emergence of dual-boot systems by taking share from Apple, Microsoft could pick up share it otherwise might not have gained from those who want Android devices. However, Google doesn't appear to need help from Microsoft, given the accelerating growth of Android.

The Transformer Book Duet will reportedly start at $599. Do you think there's a broad market for dual-boot Android-Windows convertibles or will this be a niche product?

 

Posted by Jeffrey Schwartz on 01/08/2014 at 12:37 PM0 comments


Microsoft CEO Candidate Mulally Staying Put at Ford

If there was any lingering doubt that Microsoft would name Alan Mulally to replace Steve Ballmer as CEO, Mulally has put that question to rest.

The Ford CEO intends to stay with the automaker through 2014, he told the Associated Press Tuesday afternoon. While Mulally was a favorite among Wall Street analysts, it became increasingly clear last month that Mulally was not headed to Redmond, nor was it certain he was still the top choice of Microsoft's search committee.

When asked in December, he refused to say outright that he would turn down the job if it were offered, keeping hope alive among Mulally proponents that he wasn't out of the running. But as I noted a few weeks ago when Microsoft extended its search, it appeared Mulally was a dark horse. Now Mulally has made his decision official.

Posted by Jeffrey Schwartz on 01/07/2014 at 3:11 PM0 comments


Are Ballmer and Gates Slowing the Microsoft CEO Search?

One of the biggest stories in Redmond last year was Microsoft CEO Steve Ballmer's unexpected announcement of his upcoming retirement. Whoever succeeds him will undoubtedly define the company's agenda in 2014 and beyond. It remains anyone's guess who Microsoft's board will name as the company's third CEO but it appears the only two to sit at the helm -- Ballmer and Founder Bill Gates -- are the reasons why the search is taking so long.

Gates of course is chairman of the board, while Ballmer is also a director and it remains unclear whether either or both will retain their seats. It's reasonable to presume they don't want to give up their current slots. That may be a key reason why many ideal candidates are uninterested, according to a Wall Street Journal piece over the weekend. The publication noted that no new CEO wants their two predecessors in a position to second guess every move they make.

Citing data from executive compensation firm Equilar, The Journal reported only eight companies in the S&P 500 Index have two former CEOs on their boards. That would certainly explain why a candidate like VMware CEO Pat Gelsinger or his predecessor Paul Maritz, once in the inner circle of Gates and Ballmer, might be unwilling to consider the job. Their presence on the board would certainly make life difficult for others such as Stephen Elop or Satya Nadella.

The counter argument for letting them remain on the board is that both Gates and Ballmer are key parts of Microsoft's DNA. Some stakeholders in Microsoft's future would see that as an asset, while others would consider it a liability.

There are several possible scenarios. They both could retain their seats (with Gates remaining chairman), Gates could step down as chairman but remain a board director or one or both of them could leave the board. Which scenario do you think would give the next Microsoft CEO the best chance of improving the company's fortunes?

Posted by Jeffrey Schwartz on 01/06/2014 at 12:06 PM0 comments


Microsoft Extends CEO Search

While Wall Street seems to want Microsoft to choose Alan Mulally as its next CEO, it's beginning to look more like that ship has sailed. Microsoft Lead Independent Director John W. Thompson, who also heads the search committee to find the next CEO, Tuesday revealed the search will continue into next year.

Technically, the search committee has plenty of time to make a decision. Ballmer gave 12 months' notice of his plan to retire back in August. But most analysts and watchers had been operating under the assumption that the board would announce a new CEO by year's end. Apparently realizing that won't happen, Thompson took the unusual step of saying so in a brief blog post that offered little evidence of where the committee is headed.

Some follow-up reports yesterday suggested the board wants someone from Silicon Valley with a strong background in technology. It's reasonable to assume, as Nomura Securities Analyst Rick Sherlund suggested, that the board hasn't offered Mulally the job -- though it hasn't ruled him out yet. It appears, however, that if it felt Mulally was the best candidate, Microsoft would have picked him by now. The fact that he hasn't committed to staying at Ford, which is irking the automaker's board, suggests he hasn't ruled out a move to Redmond if offered the position.

If by chance Microsoft has offered him the job and he's still dragging his feet, the board should fish or cut bait with him for the sake of both companies. One candidate with both tech and management chops that was under consideration, Qualcomm COO Steve Mollenkopf, fell off the list when the mobile processor vendor named him as its CEO.

Another outside favorite is Paul Maritz, the former CEO of VMware Inc. and now CEO of that company's spinoff, Pivotal. Not only did Maritz boost shareholder value for VMware during his stewardship of the company, he has a vision of technology few leaders have today. Maritz is also a former Microsoft senior exec who spent many years in the inner circle of founder and Chairman Bill Gates and Ballmer. However, while that history may carry positive weight for Maritz, tipping the scales the other way is the fact that many of his talking points as CEO of VMware were aimed at marginalizing the value of Windows, both on the client side and in the datacenter.

While surely he could spin his way around that, it's a moot point. Maritz told Sherlund over lunch last week he's not interested. With Maritz not an option, Sherlund has been talking up Maritz's longtime colleague and successor at VMware, Pat Gelsinger, who also has strong management and technology credentials. But Gelsinger may have his sights on the top job at VMware's parent EMC when its current CEO, Joe Tucci, retires.

Thompson said in his post that Microsoft has homed in on about 20 candidates. Wall Street may want an external heavy hitter, but Microsoft may be best served by an insider like Satya Nadella. Just like making sure a new version of software is ready to ship, Microsoft (which we all know has a checkered past in that regard) is being deliberately careful about who will be only the third CEO in the company's history. On the other hand, Redmond can't sustain the uncertainty for much longer.

Posted by Jeffrey Schwartz on 12/19/2013 at 10:51 AM3 comments


Hands-On: Should Microsoft's Surface 2 Be on Your Wish List?

Business users, field workers, students and anyone else looking for a mobile and functional tablet to run Office, handle e-mail, browse the Web and use various other apps almost anywhere should consider Microsoft's new Surface 2. Whether looking to get some work done on a long flight, at a local Starbucks or on the sofa while watching football, the Surface 2 is the best (relatively) low-cost portable unit for that purpose.

I've spent over a month with the new Surface 2, introduced in September and released in late October, and it's a nice evolutionary improvement over the original Surface RT. To be clear, just like its predecessor, the Surface 2 runs an improved release of the scaled-down version of Windows, now called Windows RT 8.1. That means that if you are looking for a unit that must run traditional apps designed for Windows 7 or earlier, the Surface 2 is not for you, unless you've checked the Windows Store and found an equivalent modern app developed for Windows 8.

Given the state of traditional Windows apps in the new Windows Store, there's a good chance at least some apps you use are not available -- at least for now. In some cases, as with Adobe (noted last week), they may never come. Hopefully that will change; depending on your app usage, you'll have to decide whether that's a deal breaker.

If running a traditional Windows app on the go is a requirement, the new Surface Pro 2 has a nearly identical form factor to the RT model, though it comes loaded with an Intel processor rather than the ARM-based system-on-a-chip architecture. The Surface Pro 2 is somewhat thicker and heavier than the Surface 2 (2 pounds versus 1.5 pounds), and the Pro offers less (though vastly improved) battery life. And you'll pay at least double for the Pro (see Redmond magazine contributor Derek Schauland's take on the Surface Pro 2 here).

But even if all of your desktop or laptop apps aren't available in the Windows Store, that doesn't mean you should rule out the Surface 2. It simply means you shouldn't consider it as a replacement for your existing Windows PC (though that was never the intent of the Surface 2 or its predecessor). Rather it's a perfect companion device for common tasks like using Office to write, create and edit spreadsheets and work in PowerPoint. For many workers, that should suffice.

The Surface 2 I tested is the entry level unit, priced at $449. I used the Type Cover keyboard (at an additional $130 cost), which is now backlit, though a Touch Cover keyboard is also an option for $10 less. Having seen the Touch Cover, I prefer the Type Cover, which feels more like a real keyboard but that's really a matter of preference. It's foolish to get a tablet without some kind of cover. When you want to use it as a pure tablet, you can easily remove the cover and put it aside. But having some form of keyboard option makes sense, especially if you're using Office.

Powered by an ARM-based NVIDIA Tegra 4 quad-core processor and 2 GB of RAM, the entry level Surface 2 comes with a scant 32 GB solid-state flash drive for storage. But with the installed software, less than half of that capacity will be available for apps and data. Though a USB port will allow for additional storage, if you have your sights on installing lots of apps, you may want to spend an extra $100 for the 64 GB version, which, in reality, should have been Microsoft's entry level model.

Equipped with a perpetual Office 2013 license that includes Outlook, the Surface 2 offers good battery life, a crisp 1920x1080 HD display and performs reasonably well presuming you're not running too many apps at once. The Surface 2 has a very similar form factor to the iPad, though Apple's newest iPad Air is lighter. Even though you can also get an optional keyboard and Office-compatible apps for the iPad (but not Office itself, at least not yet, though many predict that will change in the coming year), the wider display on the Surface 2 seems to make it more suitable for work-related tasks such as working in Office or going through e-mail. The new backlit keyboard is a nice feature but you'll want to make sure to close it when it's not in use to save battery life.

The new dual-position kickstand also is a welcome addition to the new Surface, letting you find a suitable angle for working. I connected my small wireless Logitech mouse to Surface 2's USB port but if you spring for a Bluetooth-compatible mouse, that will free up the lone USB port. Of course, you don't need to use a mouse if you're willing to wean yourself off it. The keyboard does have a track pad or you can use the touch interfaces. I still find the mouse easier for certain tasks, especially when working with an Excel file or copy and pasting in Word, on the Surface 2's 10.6-inch display. However, I'm not sure I'd feel any different with a 20-inch touch display.

Over the past few years, when I go to meetings or attend all-day conferences, I have typically taken my aging ASUS Eee PC netbook, which runs Windows 7 and can't be upgraded to Windows 8.x. The Surface 2 is much easier to carry and pop open on the fly. It is perfect for sitting down at an event or meeting to take notes, keep an eye on e-mail, browse the Web and use various social media apps.

While I like the Surface 2, if you already have an iPad, you're not going to want to pass it along to the kids just yet. Chances are you're not going to find all of your favorite iOS apps in the Windows Store. If you do, then you'll find little use for the iPad. Again, in most situations, I doubt that will be the case at this point.

Will that change? As I reported earlier this month, IDC says developer interest in the platform has risen by eight points in the most recent quarter. While that's good news, with 37 percent of developers now saying they want to build Windows apps, that's 35 points below developer interest level in Android and 50 points lower than Apple's iOS. Getting more apps into the new Windows Store ecosystem in the coming year will be critical to the success of the Surface 2 and fellow Windows-based devices that run on low-power system-on-a-chip platforms.

Let's face it though, no one is going to buy a device on the promise that more apps may appear in the future. You're going to consider it based on what it can do for you out of the box. If you're looking to ditch your heavy laptop and all you need is access to files, the ability to work in Office, browse the Web and have it work all day on a single charge (figure eight to nine hours), it'll get you through the day when you're away from the office. Microsoft also has said it will offer a keyboard with a built-in extra battery that will add 50 percent more battery life.

The fact that SkyDrive is integrated nicely into Windows 8.1 makes it easy to synchronize all of the files you use on it with your other systems, including your main computer. That said, as I recently noted, it would be nice if SkyDrive had the same ease of use as Dropbox, which is also available as an app in the Windows Store (other popular services such as Box are available as well).

Some may lament it lacks built-in connectivity to cellular networks, though if you have a phone with a hotspot, it should hold you over when WiFi isn't available. It's likely versions with 4G connectivity will appear or you can also check out the new Nokia Lumia 2520, already equipped for the AT&T network.

There seems to be more demand for the newest crop of Surface systems, as they appear to be hard to come by, Network World reported on Monday. Then again, it could be that Microsoft was more conservative in making supply available.

If neither the Surface 2 nor the Surface Pro 2 suits your needs, there's a slew of alternatives available from third parties such as ASUS, Acer, HP, Dell, Lenovo and Sony. You can bet they will have more consumer-grade and business-class systems in the coming year. Is a Surface 2 on your wish list?

Posted by Jeffrey Schwartz on 12/18/2013 at 10:45 AM0 comments


Microsoft's Kurt DelBene To Oversee HealthCare.gov

Health and Human Services Secretary Kathleen Sebelius today announced that Microsoft Executive Kurt DelBene will be taking over the struggling HealthCare.gov Web site. White House Press Secretary Jay Carney confirmed in a briefing today that DelBene will be a senior advisor reporting directly to Sebelius.

Aimed at providing entry to the health insurance marketplace that is perhaps the signature effort of the Obama presidency, HealthCare.gov was a debacle from the day of its launch on Oct. 1. The site is pivotal to the success of the Affordable Care Act, commonly referred to as Obamacare, and its beleaguered launch is arguably one of the biggest failed IT efforts to date. At the time of its launch, the site repeatedly crashed and few were able to get into the healthcare exchange. Performance has vastly improved since then but the task is far from complete.

DelBene will take over management of HealthCare.gov from Jeffrey Zients, a management expert who was brought in to temporarily oversee the project. Once the transition is complete, Zients will move to his new position as director of the National Economic Council in February.

Microsoft said back in July (at the time of its largest reorganization in many years) that DelBene would be retiring from the company, and it gave no hint at the time of today's announced move.

DelBene is a 21-year Microsoft veteran who most recently served as president of Microsoft's lucrative Office division. During his tenure, according to his bio, he oversaw the Office engineering organization, including Office desktop applications, Office Web applications, SharePoint, Exchange Server, Microsoft Office Communications Server (now Lync) and Office Labs. In addition, he managed document and Web-page authoring and collaboration tools for Office.

Carney described DelBene as "uniquely suited" to overseeing HealthCare.gov, given his management roles at Microsoft, his prior stint as a management consultant with McKinsey and his time as an engineer at AT&T Bell Labs.

 

Posted by Jeffrey Schwartz on 12/17/2013 at 10:46 AM0 comments


Unitrends Acquires PHD Virtual To Widen Its Backup and Recovery Footprint

Looking to fill key gaps in both the technology it offers and its addressable market of backup and recovery solutions, Unitrends Monday said it is acquiring PHD Virtual.

Unitrends itself was acquired earlier this year by the private equity firm Insight Venture Partners, which also counts PHD Virtual as one of its holdings. By combining the two companies, Unitrends can target smaller companies and offer solutions for environments that are purely virtualized, a limitation of its appliance-based solutions.

PHD Virtual competes with the likes of Veeam and Acronis and sells to organizations with fewer than 100 employees, with average deal sizes of about $3,000. Unitrends' enterprise backup appliances are aimed at enterprises with 100 to 500 employees, with deal sizes above $20,000. Unitrends competes against Symantec's Backup Exec, CommVault and Barracuda, said Mike Coney, Unitrends' CEO.

"We saw a real fit with PHD Virtual," Coney said. "Their customers are mostly virtual. Where they lose deals is with requests for physical appliances and where we lose deals is when there's a heavy concentration of virtualized environments." Coney noted that PHD Virtual also recently acquired Reliable DR, which brings the company into the disaster recovery-as-a-service market.

Coney is no stranger to the backup and recovery market. He was on the original Backup Exec team at Veritas before Symantec acquired the company in 2005. So is an IPO in the works for Unitrends at some point? Coney said that's very likely. "Insight has a track record of bringing their companies public," he said. "You have to think that's part of the exit strategy."

Posted by Jeffrey Schwartz on 12/17/2013 at 3:30 PM2 comments


NSA Downplays Scope of Surveillance

Half a year after former National Security Agency contractor Edward Snowden began to release classified documents revealing surveillance of data provided by telecommunications and key cloud and Internet companies, the NSA's top brass spoke out for the first time. But detractors, some of whom don't believe the NSA's claims, argue the agency has only inflamed the situation, judging by those weighing in on social media, blogs and comments added to various reports.

NSA officials gave their first extensive on-the-record interview to 60 Minutes, broadcast last night (transcript), in an effort to do damage control and correct what the NSA disputes as misinformation about some of Snowden's revelations, which have resulted in deep mistrust among users and IT pros (many of you included) about the privacy and security of their data. Critics came down on CBS for having correspondent John Miller, previously an intelligence official himself, conduct the interview, for throwing softballs and for a lack of outside analysis questioning some of the NSA's claims.

General Keith Alexander, who leads the NSA and U.S. Cyber Command, joined by other agency officials, admitted to the damage incurred from Snowden's revelations. At the same time, Alexander and Rick Ledgett, who is tasked with assessing the damage, spoke out in an effort to discredit Snowden and deny some of the claims he has made.

Alexander insisted the NSA isn't reading the contents of e-mail and other online communications, nor is it listening to actual phone conversations. "There's no reason that we would listen to the phone calls of Americans," he said. "There's no intelligence value in that. There's no reason that we'd want to read their e-mail. There is no intelligence value in that."

The only information in the metadata that's analyzed is phone numbers dialed, the parties on the call and the time and day, Alexander said. Only trends that give probable cause are investigated further, he said. "We don't hear the call," he emphasized. "We don't see the names. [We see] the 'to-from number, the duration of the call and the date, time..." He continued by saying the NSA only passes on the specific phone numbers of those communicating with suspicious numbers to the FBI.

But Alexander did acknowledge that the NSA collects the phone records of some 300 million Americans. Asked why, Alexander explained: "How do you know when the bad guys, who are using the same communications that my daughters use, is in the United States trying to do something bad? The least-intrusive way of doing that is metadata."

Furthermore, Alexander argued that if the NSA had the tools to analyze metadata prior to the September 11, 2001 attacks, it may have found evidence of the planned attacks before they took place. But privacy advocates argue accessing metadata isn't as benign as it sounds and is questionable, if not illegal. Others are concerned about the potential for future abuse.

Alexander also denied that the agency had direct links to the datacenters of Google and Yahoo, though the question of whether it had access to Microsoft's facilities, disclosed in July, never came up.

Whether or not you feel CBS let the NSA whitewash its surveillance activities, these first remarks by agency officials underscored the damage Snowden caused. In fact, how much undisclosed information Snowden still has is a mystery. The New York Times reported over the weekend that it is unknown because he hacked firewalls, accessed data with other administrators' passwords and used screen-scraping tools to gather data. That makes it possible that Snowden still has information that could have devastating consequences, Alexander acknowledged.

Ledgett didn't dispute the possibility that Snowden has 1.7 million documents in hand. If Snowden were to release that information publicly or give it to a foreign government, "it would give them a roadmap of what we know, what we don't know, and give them implicitly a way to protect their information from the U.S. intelligence community's view," Ledgett told Miller.

That notion has led to a debate as to whether the U.S. should give Snowden immunity from prosecution in exchange for returning home to answer questions. Ledgett believes "it's worth having a conversation about," with assurances that all data are secured, while Alexander is against that. "I think people have to be held accountable for their actions because what we don't want is for the next people to do the same thing," Alexander said.

Agency officials also described what the NSA is doing to avoid cyber attacks by foreign nations, which it says could do major damage, including bringing down the nation's power grid and financial system. During the broadcast, the NSA revealed it foiled a plot to unleash a virus that would have rendered PCs a "brick." The attack, said to have emanated from China, would have come in the form of an e-mail notifying users of an important software update. Some of the agency's 3,000 cyber analysts tasked with foreseeing such activity caught it before it could do any damage.

It's unfortunate 60 Minutes didn't let critics weigh in during its report, but it's not surprising, given that the program let Amazon.com CEO Jeff Bezos show, on the eve of "Cyber Monday," a video simulation of drones that he said will someday deliver packages to customers' doorsteps. On the other hand, last night's broadcast did showcase the agency's thousands of highly skilled engineers and offered a glimpse of the NSA, albeit a sanitized one, while keeping an important discussion in the spotlight.

As we close the books on 2013, Snowden's leaks were one of the top IT stories of the year and epitomized the power of a rogue systems administrator. Whether you see Snowden as a hero or a traitor, his revelations have forced IT and business decision makers to rethink how they encrypt their data. That will be a key issue in the coming year.

 

Posted by Jeffrey Schwartz on 12/16/2013 at 1:21 PM0 comments


Adobe Shuns Windows Store for Its Creative Cloud

One of the keys to success for Microsoft's Windows Store effort will be getting major traditional software players to develop new modern apps. If you're expecting to see Adobe's Dreamweaver or Photoshop as an app in the Windows Store, that's not in the cards. And it's not because Adobe is ignoring the shift to mobility.

Adobe's move to go all-cloud earlier this year (its software model moved from one-time license fees to subscription-based software as a service) left little room to offer its wares in an app store. Adobe believes its new Creative Cloud is the best path to supporting mobile devices as well as traditional PCs and Macs.

I reached out to Adobe recently to see if it had any plans to offer any of its apps in the Windows Store and a spokesman said no. "The latest versions of Adobe's creative pro offerings are available only through Creative Cloud," he said. "We do not have any current plans to release CC tools outside Creative Cloud."

When Adobe announced back in May that it would force users of its design, Web development and marketing tools off perpetual one-time licenses and onto cloud-based subscriptions, customers were outraged. But the move doesn't seem to have hurt the company, which yesterday reported it has 1.44 million subscriptions, surpassing expectations of 1.25 million.

The way the company looks at it, it has blown past its forecast, thanks to higher-than-anticipated adoption by enterprise customers, CEO Shantanu Narayen told CNBC this morning following yesterday's fiscal fourth quarter and year-end earnings report. At the same time, revenues for the quarter ($1.04 billion) were down 9.7%. Year-end revenue of $4.1 billion was down 6.8% from $4.4 billion.

Adobe stock was trading 5% higher this morning, apparently on investors' beliefs that the company's Creative Cloud transition is working for Adobe. "Adobe employees have embraced the cloud as a much better canvas in order to do their innovation," Narayen told CNBC. "I think it's not just Adobe, but you'll find every single packaged software company embrace and adopt the cloud."

Looking at the company's results, some analysts now say Adobe is "leading the charge" among mainstream ISVs making a wholesale shift to the cloud. That said, does the absence of Adobe apps in the Windows Store make Windows tablets less appealing to those who live in the Adobe universe? Or are Windows 8 tablets still suitable for use with Web-based SaaS solutions from Adobe, Salesforce.com and others, in addition to Windows Store apps?

 

Posted by Jeffrey Schwartz on 12/13/2013 at 2:27 PM0 comments


Microsoft Aims To Extend Cloud Reach with Cloud OS Network

Microsoft today is taking a step forward to advance its Windows Azure infrastructure by launching its new Cloud OS Network. The company now has 25 global partners that will offer cloud services that are effectively compatible with Windows Azure and the latest combination of Windows Server and System Center running in customers' datacenters.

The Cloud OS Network will let organizations create hybrid clouds by extending their Windows Server datacenters to Windows Azure and/or any of Microsoft's Cloud OS Network providers. When Microsoft introduced the Cloud OS term last year upon the release of Windows Server 2012 and System Center 2012, many criticized it as the company latching onto the latest buzzword.

Microsoft indeed was laying the groundwork by positioning Windows Server and System Center as a platform that would let IT managers add capacity to their datacenters by bridging their infrastructures to the public cloud to create a hybrid cloud. But the pieces of the Cloud OS weren't there at the time.

This year Microsoft has made steady progress with the release of Windows Azure Infrastructure Services and the R2 upgrades to Windows Server 2012 and System Center 2012. With a major upgrade to Hyper-V, observers said Microsoft finally had a competitive virtualization platform.

Another key component Microsoft added with the October R2 wave was the Windows Azure Pack, a free download that adds Windows Azure functionality to Windows Server by providing a self-service portal for managing instances and various services including virtual machines, Web sites and platform scaling.

By launching the Cloud OS Network, Microsoft is extending the scale and reach of this cloud platform. With these 25 new partners, the Cloud OS Network adds 425 datacenters in 90 markets around the world that will manage over 2.4 million servers and 3 million customers, according to Microsoft.

The new partners include Alog, Aruba S.p.A., Capgemini, Capita IT Services, CGI, CSC, Dimension Data, DorukNet, Fujitsu Finland Oy., Fujitsu Ltd., iWeb, Lenovo, NTTX, Outsourcery, OVH.com, Revera, SingTel, Sogeti, TeleComputing, Tieto, Triple C Cloud Computing, T-Systems, VTC Digilink and Wortmann AG. 

Piers Linney, co-CEO of U.K.-based Outsourcery, told me that the Windows Azure Pack indeed "provides one pane of glass" across a customer datacenter, Windows Azure and his company's Windows Server-based cloud hosting service. Improvements in network infrastructure will provide smoother migrations among the three (Outsourcery also sells Windows Azure from Microsoft to supplement the infrastructure it offers).

Microsoft has already invested substantially in its own Windows Azure infrastructure, and extending the same platform to partners gives the company's hybrid cloud strategy much more substance. For example, companies in the U.K. have data sovereignty issues with certain information, which therefore has to reside on U.K. soil. In cases where that's not an issue, a provider like Outsourcery can supplement its services with Microsoft's new Windows Azure Backup Services.

Linney said he can also provide turnkey services that Microsoft can't, such as Office 365 and Lync, while offering a mixture of his company's own cloud hosting and Windows Azure. "We will increasingly create hybrid solutions for all three [customer datacenters, its service provider cloud and Windows Azure] but the solution includes different elements," Linney said. "Typically we sell Office 365, our own service provider solution, and we integrate them with Azure. Different solutions require different infrastructures and different designs."

The thinking is there's enough for everyone and it gives customers more options. On the one hand it puts Microsoft in competition with these providers but it's no different than the numerous third-party SharePoint and Exchange providers that compete with Office 365, or for that matter Microsoft offering Surface tablets that compete with its longtime OEM partners.

This concept of offering a global service and augmenting it with hosting partners is hardly unique. It's the same model VMware has employed with its hybrid cloud service bringing partners into the fold. And the OpenStack camp supported by IBM, HP, Rackspace and numerous others has a similar model.

The race by all these camps looking to catch up with Amazon Web Services will be to gain scale and convince IT that they have choice and will not be locked into any one provider. For Microsoft's Cloud OS Network to succeed, it will need many more partners. Microsoft said that's in the works.

 

Posted by Jeffrey Schwartz on 12/12/2013 at 2:15 PM0 comments


Ballmer Reflects on Highs and Lows During Tenure as CEO

As Steve Ballmer prepares to step aside as Microsoft's second CEO after a tumultuous 13-year tenure, his legacy may take years to fully appraise. Since announcing he would be stepping down in late August, some have joyously celebrated a long-awaited change in leadership while others feel, despite some key missteps, that he is worthy of praise for overseeing huge growth during difficult economic times and for making Microsoft a leader in the enterprise.

Ballmer acknowledged his hits and misses with longtime Microsoft watcher and straight-shooter Mary Jo Foley, a Redmond magazine columnist and author of the popular ZDNet All About Microsoft blog. Perhaps no Microsoft outsider knows the innards of Redmond better than Foley, and even insiders have learned a thing or two about their company from her over the years. Foley's knack for unearthing Microsoft news is well known at all levels of the company. Consequently, Ballmer had avoided sitting down with Foley for two decades -- not for lack of trying on her part -- until last month.

The revelations in her interview, which appears today in Fortune magazine, include Ballmer's heavy hand in resolving Microsoft's antitrust litigation in 2000, his push to make Xbox a leading gaming platform and, perhaps most important, his dogged pursuit of the company's enterprise business, which has become a key source of revenue and profit growth.

Regardless of how you feel about Ballmer, the company's profit tripled on his watch. At the same time, Ballmer botched Microsoft's Longhorn effort and ultimately the Windows Vista release. Following that debacle, Ballmer failed to accelerate the company's move into the mobile era, and now the company is struggling to keep Windows relevant.

A poll of Redmond readers in September after Ballmer's retirement was announced showed 10 percent felt he did an excellent job as CEO, 34 percent said he did a good job, 35 percent believed he was an average CEO and 21 percent gave a poor rating.

Those who commented at the time were mostly critical. "Anyone with a pulse could have ridden that cash cow," said one. "After Gates, Ballmer was a manager, not a leader," argued another. Yet many respondents had more positive assessments. "I think the products introduced, along with the financial performance of the company says it all," a commenter said.

Perhaps the most salient comment I've heard about Ballmer concerned his inability to make wise management decisions in a company beleaguered by fiefdoms that in many cases relegated innovations to the back burner. As one reader concluded, "...he had command of so much talent and didn't use it wisely."

Foley's interview has lots of tidbits worth reading about Ballmer, both in Fortune magazine and in her All About Microsoft blog, where he recounts and laments the Longhorn debacle. And of course you can find her latest Redmond magazine column, where she looks at the "10 Biggest Surprises of 2013," here.

 

Posted by Jeffrey Schwartz on 12/11/2013 at 12:54 PM3 comments


Microsoft's Door-Buster Promo Aims To Boost Demand for Windows 8.1

Microsoft will be looking to create some consumer traction for Windows 8.1 devices with a new holiday-themed sale. Its promotion, called "12 Days of Deals," kicked off this morning with a Dell Venue 8 Pro. Normally priced at $299, this door buster was listed at $99 for the first 20 customers at each store and online.

At just $99, it's a steal! Microsoft also promised that after the first 20 people got their $99 tablets, it would offer more at $199 until they ran out. Even though that felt like a bait-and-switch, $199 is not a bad price, so I figured I'd check it out. Since the store was scheduled to open at 10:00 a.m., I figured I'd get there at 9:30, guessing I'd face either a really long line or no crowd at all, since the deal was not widely advertised.

I tried to grab one on the store's Web site right after midnight, but the site was unavailable due to the demand. In person, it turned out the line had about a dozen people in it, but apparently the store had quietly opened at 8:00 a.m., so I was out of luck for the $99 deal. I learned the line began forming at 3 a.m., but it would take more than a deal like that to get me on any line at that hour.

I decided $199 for a tablet running Windows 8.1 Pro could be compelling if it performs reasonably well. The recently released Dell Venue 8 Pro has an 8-inch IPS HD multi-touch display (1280x800) powered by an Intel Atom processor (the Z3740D with 2 MB cache and up to 1.8 GHz quad core), 2 GB of RAM and 32 GB of storage (with support for 64 GB of additional storage by adding a MicroSD card). It also comes with Office Home and Student 2013.

Since I have 14 days to return it, I'll report whether it's a keeper or if I decide to bring it back!  If you're thinking of trying to snag one for $199 at the online Microsoft Store, forget it.  A message on the site says it's sold out.

If you've tried out the Dell Venue 8 Pro, how do you like it?

Posted by Jeffrey Schwartz on 12/09/2013 at 1:09 PM1 comments


Google's New Cloud Service Doesn't Do Windows

Google this week became the latest major player to launch an infrastructure-as-a-service (IaaS) cloud offering with the general availability of the Google Compute Engine. In so doing, Google is now challenging other major providers of IaaS including Amazon Web Services, Microsoft, Rackspace, IBM, HP, AT&T, Verizon and VMware.

But if you're looking to provision Windows Server in Google's new cloud, you'll have to wait. Right now Google Compute Engine doesn't support Windows Server or VMware instances. During the preview, launched in May, Google Compute Engine only supported Debian and CentOS. Now that it's generally available, Google said customers can deploy any out-of-the-box Linux distribution, including Red Hat Enterprise Linux (in limited preview now) and SUSE, as well as FreeBSD.

Despite shunning Windows, at least for now, it's ironic to note that one of the leaders of Google Compute Engine also was a key contributor to Microsoft's original .NET development team over a decade ago. Greg DeMichelle, director of Google's public cloud platform, was responsible for the overall design and feature set for Visual C# and C++ and a founding member of the C# language team.

After leaving Microsoft, DeMichelle joined the research firm Directions on Microsoft and also wrote a column for Redmond sister publication Redmond Developer News magazine, where I was executive editor several years ago. (RDN was folded into Visual Studio Magazine in 2009.) I reached out to DeMichelle but haven't heard back yet; I hope to catch up with him and hear more about Google's plans for supporting Windows -- or lack thereof.

Some analysts believe despite Google's late entry, it will be a force to be reckoned with in the IaaS world. In a blog post Monday announcing the launch, Google pointed to several early customers including Snapchat, Cooladata, Mendelics, Evite and Wix. Google reduced the cost of its service by 10 percent and DeMichelle made no secret in an interview with The New York Times that he believes Google is better positioned to take on Amazon Web Services, where he briefly worked prior to joining Google earlier this year.

Like Microsoft, Google entered the enterprise cloud fray years ago by offering a platform as a service (PaaS), known as Google App Engine. Monday's official release of Google Compute Engine means customers can now deploy virtual machines and stand up servers in its public cloud.
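For a rough sense of what that looks like in practice, here is a minimal Python sketch of provisioning a Linux VM through the Compute Engine v1 REST API. The project, zone, image path and OAuth token are placeholders, and the request fields follow the API as publicly documented, so treat this as illustrative rather than a production recipe.

# Hypothetical sketch: create a small Linux VM via the Compute Engine v1 API.
# PROJECT, ZONE, TOKEN and the image path are placeholders.
import requests

PROJECT = "my-project"        # placeholder project ID
ZONE = "us-central1-a"        # placeholder zone
TOKEN = "ya29.placeholder"    # placeholder OAuth 2.0 access token

def create_instance(name: str) -> dict:
    url = (f"https://www.googleapis.com/compute/v1/projects/{PROJECT}"
           f"/zones/{ZONE}/instances")
    body = {
        "name": name,
        "machineType": f"zones/{ZONE}/machineTypes/n1-standard-1",
        "disks": [{
            "boot": True,
            "autoDelete": True,
            # placeholder boot image; available images vary by account and era
            "initializeParams": {
                "sourceImage": "projects/debian-cloud/global/images/family/debian-11",
            },
        }],
        "networkInterfaces": [{
            "network": "global/networks/default",
            "accessConfigs": [{"type": "ONE_TO_ONE_NAT", "name": "External NAT"}],
        }],
    }
    resp = requests.post(url, json=body,
                         headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    return resp.json()  # an asynchronous zone operation you would then poll

if __name__ == "__main__":
    print(create_instance("demo-vm"))

The point of the sketch is simply that standing up a server is a single authenticated API call plus an operation to poll, which is the level of self-service the IaaS providers named above are all competing on.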

Google is touting the fact that using live migration technology it can perform datacenter maintenance without downtime. "You now get all the benefits of regular updates and proactive maintenance without the downtime and reboots typically required," Google VP Ari Balogh wrote in Monday's blog post. "Furthermore, in the event of a failure, we automatically restart your VMs and get them back online in minutes. We've already rolled out this feature to our U.S. zones, with others to follow in the coming months."

Balogh added that Google is seeing demand for large instances to run CPU- and memory-intensive applications such as NoSQL databases. Google will offer 16-core instances with up to 104 GB of RAM. The company is now offering those large instance types in limited preview only.

As I noted, Google has lowered the price of its standard instances by 10 percent over the price it offered during the preview period. It also lets customers purchase capacity in increments of 10 minutes, according to its price list. With Google now officially in the game, 2014 promises to be a telling year as to which of the major providers can give Amazon a run for its money. But unless Google introduces Windows Server support, it'll miss out on a key piece of the market.

Posted by Jeffrey Schwartz on 12/06/2013 at 12:06 PM3 comments


IDC's Annual IT Predictions: 'Put Up or Shut Up' in 2014

The mantra for 2014 will be "put up or shut up" when it comes to achieving IT revenue growth and market position in the coming year. That was a key theme outlined by IDC chief analyst Frank Gens during a one-hour webcast yesterday to discuss the influential market researcher's annual worldwide IT forecast.

IT spending in 2014, excluding telecommunications services, will grow 5.1 percent to $2.1 trillion, which represents a slight uptick over the current year, Gens said. In the coming year, IDC is forecasting a continued move to what it calls the "3rd Platform," centered around mobile devices and the migration to cloud architectures, with substantially increased investments in enterprise social networking tools and technology that lets users mine big data.

 "2014 will be all about battles across this platform," Gens said. "The past five years of the third platform build out has been all about laying the infrastructure and developer platform foundations. This next chapter is about fostering an explosion of innovation on that foundation, with hundreds of thousands to millions of new killer apps and solutions."

Recalling Microsoft CEO Steve Ballmer's "developers, developers, developers, developers" rant from years ago, Gens emphasized during the call that winning over developers will be critical for those who are going to survive in the coming decade. That is especially true for Microsoft if it's going to be a player in the new mobile era.

Microsoft has about a year to win over developers or it will be doomed in the mobile market, Gens said. The good news is developer interest in the new Windows platform has risen by eight points, according to IDC's latest Appcelerator report. However, only 37 percent of developers say they are very interested in developing mobile apps for Windows, still 35 points below Android and 50 points below Apple's iOS. "Microsoft honestly needs to double that interest level within the next 12 months, or it could be game over," Gens said.

Increasing sales of tablets will continue to take a bite out of the PC market, which is forecast to slide another 6 percent. Tablet shipments will grow by 18 percent and smartphones by 12 percent. Mobile devices will outsell PCs by two and a half to one, Gens said.

Cloud spending, which includes service providers, infrastructure and software, will grow 25 percent, reaching $100 billion. More than a third (35 percent) of that spending will be on cloud service providers and shared hosting facilities. Just like mobile, cloud providers will fight for developers to support their platforms, Gens noted. "Over the next four years, we will see a tenfold increase in the number of apps in the cloud, driven in part by a tripling of the number of developers and contributors to cloud app ecosystems," he said. "Two-thirds of these new apps will have an industry specific or a role specific focus."

With the amount of digital data growing 50 percent in 2014, users will create 6 trillion gigabytes, or 6 zettabytes, of data. That will fuel 30 percent growth in infrastructure and tools to mine big data, with spending exceeding $14 billion. IDC is also predicting that in the next three years, 80 percent of the most successful apps will leverage large data streams. Demand for big data and analytics skills will outstrip supply, Gens said.
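
For reference, the unit conversion behind that forecast works out as follows (using decimal SI prefixes):

```latex
6\ \text{ZB} = 6 \times 10^{21}\ \text{bytes} = 6 \times 10^{12}\ \text{GB} = 6\ \text{trillion GB}
```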

While many enterprises continue to assess whether they'll see any value from investing in social networking, IDC is predicting that within the next three years 80 percent of the Fortune 500 will use it as a key foundation for marketing, selling and maintaining community, up from 30 percent today.

Social networking will also increasingly invade product and service development, according to Gens. By 2016, IDC forecasts, 60 percent of the Fortune 500 will deploy social-enabled platforms -- solutions that gather input from their communities of customers, partners and other components of their supply chains. To enable that, IDC predicts the key social platforms will converge and merge with the major platform-as-a-service (PaaS) clouds in 2015.

It stands to reason the intense competitive environment in IT will not let up in 2014. How does IDC's forecast for 2014 line up with your predictions?

Posted by Jeffrey Schwartz on 12/04/2013 at 2:04 PM0 comments


Down to 2: Mulally and Nadella Reportedly Top Microsoft CEO Contenders

Microsoft's CEO search committee reportedly is homing in on Alan Mulally and Satya Nadella as the top two candidates to succeed Steve Ballmer.

A report by Bloomberg on Thursday said the committee is leaning toward Mulally, now CEO of Ford, and Nadella, who oversees Microsoft's enterprise and cloud business and is well respected within the company. Sources told Bloomberg that internal candidate Tony Bates and former Nokia CEO Stephen Elop "remain in the mix," though they are considered less likely.

The thinking of the committee remains "fluid," according to the report, meaning any candidate could still float to the top. The committee has aimed to wrap up its search by year's end, but the process could extend into next year.

Based on various reports over the past several months, Mulally appears to remain the favorite because of the way he parachuted into Ford last decade and brought it back from the brink, despite being a newcomer to the auto industry (he previously rose through the ranks at Boeing, ultimately leading its commercial airplanes business). As I noted back in September, Mulally has already advised outgoing CEO Steve Ballmer and helped him architect the One Microsoft strategy, modeled in part after One Ford.

Critics argue just because he was able to turn Ford around doesn't mean he can do the same for Microsoft. Many take issue with Mulally's age (68) as well. But perhaps the committee is considering a scenario where Microsoft brings Mulally in for a few years, while naming Nadella president and grooming him as heir-apparent?

The other issue is it is not clear if Mulally wants to, or can, leave Ford in the midst of its own transition. Because Mulally hasn't emphatically ruled out heading to Microsoft, he remains in the mix. But if he does shut the door on leaving Ford, then perhaps Nadella will get the nod?

Posted by Jeffrey Schwartz on 12/02/2013 at 2:55 PM0 comments


Slim Pickings for Windows 8/Surface Cyber Monday Deals

On this Cyber Monday, if you were hoping to find a good deal on a Windows 8 PC or tablet, you'll have to search long and hard. Even the deals that may appeal to cost-conscious shoppers are already sold out.

In an apparent effort to clear out inventory of its first-generation Surface devices, Microsoft slashed the price of its Surface Pro with 64 GB of storage to $649, down from $799. Unfortunately it's sold out. The beefier Surface Pro with 128 GB of storage was still available this morning for $749 (down from $899) at the Microsoft Store. That's $250 less than the Surface Pro 2, though you'll sacrifice battery life and performance -- it's up to you whether the savings are worthwhile.

If you're looking for a hot deal on the first-generation Surface RT, Best Buy had them listed at an attractive $199. The bad news is they're sold out. Microsoft is offering the Surface RT for $299 (down from $349), though it's a refurbished version. The newer Surface 2s cost $449.

The only PC equipped with Intel's 4th-generation (Haswell) processor at the Microsoft Store is the HP Pavilion TouchSmart 15-n011nr touchscreen laptop, offered as a Cyber Monday deal. It's listed at $499, which Microsoft claims is $200 off the regular price. It has a 15.6-inch touchscreen display, an i5 processor, a 500 GB hard drive, 4 GB of RAM and battery life of six hours.

Some other deals tweeted by Microsoft Sales Excellence Program Manager Eric Ligman include the Dell XPS 18 Touchscreen All-in-One for $699 (down from $1,349) and the ASUS VivoTab Smart ME400C-C2-BK touchscreen tablet, reduced $100 to $299 -- but alas, they too are sold out. Others touted as deals are really the same prices they were advertised at in the past, such as the new Dell Venue 11 Pro (64 GB). Also beware of low-cost PCs that don't have touch displays. Even if you don't think you'll ever use the touch features, it makes no sense to buy a Windows 8 machine without a touch interface -- at some point you will need it.

If you're looking for a Cyber Monday deal on a Surface 2 or Surface Pro 2, good luck! If you see any deals on Windows 8 PCs and/or tablets, feel free to share them.

Posted by Jeffrey Schwartz on 12/02/2013 at 2:56 PM3 comments


Major Microsoft Cloud Outage Blamed on DNS Failure

Yesterday's latest Windows Azure cloud crash, caused by a DNS failure, overshadowed an upgrade to the service and briefly interrupted Microsoft's much-anticipated Xbox One launch last night.

The malfunction apparently brought down portions of the Xbox Live service on the eve of the console's midnight consumer unveiling. Microsoft is touting the Xbox One release as its most significant gaming launch to date. Fortunately for Microsoft, the outage's impact on Xbox Live didn't lead to major headlines that could have overshadowed last night's midnight launch.

As speculated, a DNS failure on management servers outside of Windows Azure was indeed the cause of yesterday's failure, Microsoft corporate VP Scott Guthrie confirmed in a tweet last night. "No -- Azure is not having issues (customer apps continue to run fine). The problem is a DNS name server issue outside of azure [sic]," the tweet read.
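
For admins trying to tell a name-resolution problem apart from an actual service failure, a quick check like the one below makes the distinction Guthrie was drawing: if the DNS lookup itself fails, the service behind the name may still be running fine. This is a minimal sketch in standard-library Python, and the hostname is a placeholder rather than a real Azure management endpoint.

```python
import socket

def check_dns(hostname, port=443):
    """Attempt to resolve a hostname; a socket.gaierror here points to a
    DNS (name resolution) problem rather than the service itself being down."""
    try:
        addresses = sorted({info[4][0] for info in socket.getaddrinfo(hostname, port)})
        print(f"{hostname} resolves to: {', '.join(addresses)}")
    except socket.gaierror as err:
        print(f"DNS lookup for {hostname} failed: {err}")

# Placeholder hostname for illustration only.
check_dns("management.example.com")
```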

In addition to Xbox Live inconveniently going down on the eve of the launch of Microsoft's first new Xbox console in eight years, Office 365, Outlook.com and SkyDrive also experienced failures.

Yesterday's failures also overshadowed several key upgrades to Windows Azure that Guthrie announced the same day on his blog, including:

  • BizTalk Services: General Availability Release
  • Traffic Manager: General Availability Release
  • Active Directory: General Availability Release of Application Access Support
  • Mobile Services: Active Directory Support, Xamarin support for iOS and Android with C#, Optimistic concurrency
  • Notification Hubs: Price Reduction + Debug Send Support
  • Web Sites: Diagnostics Support for Automatic Logging to Blob Storage
  • Storage: Support for alerting based on storage metrics
  • Monitoring: Preview release of Windows Azure Monitoring Service Library

Microsoft and its key rivals are all in an aggressive race to gain ground in cloud computing on Amazon Web Services. Of course Amazon has had its own share of embarrassing outages.

Presuming yesterday's outages have been remediated, as the Windows Azure Service Dashboard indicated this morning, the incident will be added to the growing annals of cloud disruptions. It also adds fuel to the fire for opponents of relying on Windows Azure or any cloud service.

 

Posted by Jeffrey Schwartz on 11/22/2013 at 10:58 AM0 comments


Salesforce.com Revamps CRM Platform, Signs HP Deal

Salesforce.com announced Salesforce1, its new sales, marketing and service cloud-based platform, at the company's annual Dreamforce conference in San Francisco this week. Salesforce.com operates the largest software as a service (SaaS) cloud platform and this new addition aims to focus on social features, while making its apps and those of its large ecosystem of ISVs suitable for mobile devices.

Salesforce1 is also architected with cloud-connected devices in mind and introduces a new application infrastructure that enables developers to build apps with social interfaces designed for mobility. The company said the new platform has 10 times more APIs and services.

The new community-oriented Salesforce1 is available to all customers of the Salesforce CRM and Salesforce Platform. The Salesforce Mobile and Salesforce Admin apps are available in the Apple App Store and on Google Play.

"It's the world's first CRM platform for everyone -- for developers, for ISVs, for admins, for end-users and, most of all, for your customers," Salesforce.com CEO Marc Benioff said on the company's third-quarter earnings call, according to a Seeking Alpha transcript. "So you can go social, mobile, cloud and get connected."

Also at Dreamforce, Salesforce.com and Hewlett-Packard inked a deal to let customers run dedicated instances of the Salesforce CRM platform. Using HP's "Converged Infrastructure" of servers, storage and network gear, the companies will jointly build the Salesforce "Superpod."

"The Salesforce Superpod will allow individual customers to have a dedicated instance in the Salesforce multi-tenant cloud," Benioff said in a statement announcing the deal.

However Salesforce will host the Superpods in its own datacenters and not HP's. In fact, the Superpods are identical to the existing 15 pods in Salesforce datacenters used to host the company's CRM platform, InformationWeek reported. The key difference is that Salesforce will equip the Superpods with HP infrastructure.

Furthermore, Salesforce is only offering the Superpods to the largest of enterprises, the InformationWeek report pointed out, adding that they're intended for customers with stringent governance and security requirements. "For the vast majority of customers, this is not appropriate," Benioff reportedly said. "But there are customers who want to go to another level."

Posted by Jeffrey Schwartz on 11/21/2013 at 8:19 AM3 comments


Why Can't SkyDrive's Apps Work Like Dropbox?

With last month's launch of Windows 8.1, the new Surface Pro 2 and the Windows RT-based Surface 2, Microsoft has made its SkyDrive service a key component of the company's "devices and services" strategy. The concept is great but the execution falls short because SkyDrive isn't as easy to use on any device as Dropbox.

That's a major problem and one I believe Microsoft must fix sooner rather than later. Indeed, time is of the essence. Dropbox is reportedly on the verge of securing a whopping $257 million in financing on top of the $250 million it has already raised, Bloomberg Businessweek reported Monday. Dropbox officials believe the company is worth $8 billion, according to the report. The company has 200 million users, though the vast majority of them use the free service. Nevertheless, Dropbox's revenue has grown from $12 million in 2010 to $116 million last year, and it's estimated to exceed $200 million this year, The Wall Street Journal reported this week.

Microsoft has offered its SkyDrive service for many years, and those who were fortunate enough to sign up before April of last year secured 25 GB of capacity for free before the company slashed the amount of complimentary capacity down to 7 GB. Even now, that 7 GB is more than three times the 2 GB Dropbox offers for free, though Dropbox users can receive incremental promotional upgrades. I've managed to up my free Dropbox capacity to 5 GB.

All things being equal I'd rather use SkyDrive as my default personal cloud storage provider. Its integration with Windows 8.1, Office 2013 and Office 365 (which Microsoft says is on a 1.5 million user run rate) make it an ideal way of synchronizing documents across multiple devices and PCs. It also makes the latest versions of Windows and Office quite compelling. Enterprise users with SharePoint Online can use the even more manageable SkyDrive Pro, but that's a separate story.

Unfortunately all things aren't equal. Though close, SkyDrive is no Dropbox -- at least not in its current form. Perhaps the biggest downside to SkyDrive is its interface on the current crop of devices and on Windows 8.1. The Dropbox app on Apple's iOS and Windows 8.x is much easier to use because it displays files and folders in the familiar Windows Explorer-style hierarchy. SkyDrive doesn't; it renders files as icons. While you can search for content, good luck trying to sort files. In fact, the best way to organize files is by using the traditional Windows desktop.

Of course, one way around that is to use Windows Explorer in Windows 8.1, which does provide a good view of all your files. But that defeats the purpose of using the modern app and it isn't even an option with non-Windows devices.

At the same time, there are some key benefits to SkyDrive versus Dropbox, as Microsoft points out, including remote access and the ability to edit and add notes. And Microsoft makes it easier to do certain tasks like attaching a file to a message in Mail from SkyDrive, which isn't easily done with Dropbox.

Still, I find it easier to find files in Dropbox than SkyDrive and at the end of the day, that's what matters. Microsoft needs to address this in its apps if it wants to appeal to those happy with Dropbox. Moreover, there's no shortage of alternative personal cloud services from the likes of Apple and Google as well as those already reaching out to enterprises such as Box, which also has raised a boatload of funding and has a sizeable installed base.

Also rest assured Dropbox isn't sitting still. The company last week announced Dropbox for Business, which addresses a key objection to the free service: the lack of IT control and questions about security. Dropbox could be an attractive acquisition target for Microsoft, Google, Apple and even Amazon. The last pairing could make particular sense, since Dropbox currently hosts its infrastructure on Amazon Web Services' Simple Storage Service (S3).

But Microsoft doesn't need to shell out the billions it would take to acquire Dropbox. All it needs to do is make SkyDrive's user interface more flexible in its modern apps across all platforms.

What's your preferred personal cloud storage service?

 

Posted by Jeffrey Schwartz on 11/20/2013 at 12:23 PM3 comments


Will Microsoft Become More Competitive By Removing Employee Ranking?

Microsoft's decision to do away with its so-called "rank and yank" method of evaluating employee performance (made famous by former GE CEO Jack Welch) is Redmond's latest effort to get employees working more closely toward CEO Steve Ballmer's vision of One Microsoft.

It's no secret that the siloes between divisions in Microsoft have led to bitter disputes over technical and product direction. Critics argue those rivalries and fiefdoms helped pave the way for companies such as Amazon, Apple, Google, Salesforce.com and VMware to lead or eat into markets where Microsoft once had an edge. How much the practice of ranking employees played into those rivalries is hard to say, but removing it should reduce the "Survivor" mentality the system fostered. Experts also argue that companies without rigid employee ranking processes are more attractive to talented developers and are better able to retain valued employees. Of course, the end goal is making all Microsoft employees more focused on customer needs.

The One Microsoft reorganization aimed to bring development, sales and marketing together under a common goal, and this latest move appears to be an outgrowth of that transition. The irony of One Microsoft, of course, is that it's modeled after One Ford, which the auto giant succinctly describes as One Team, One Plan, One Goal. As I noted yesterday, Ballmer has called upon Ford CEO Alan Mulally for advice, drawing on the way Mulally turned Ford around. It's also intriguing given that Ballmer and Chairman and Founder Bill Gates are said to want Mulally to take the reins of Microsoft.

Does removing "rank and yank" further stack the deck for Mulally? That could be one way of looking at it, but regardless of who becomes Microsoft's next CEO, it's in his or her interest to have employees who are on the same page. Only time will tell to what extent this move will accomplish that, or whether a new CEO will have a different philosophy for evaluating employee performance.

Posted by Jeffrey Schwartz on 11/19/2013 at 12:40 PM0 comments


Steve Ballmer's Exit Interview

When Microsoft issued the stunning news that longtime CEO Steve Ballmer would retire, some observers questioned whether he jumped or was pushed. For the first time, Ballmer has answered the question in detail, in an apparent effort to etch his legacy in stone.

In his uncharacteristically self-effacing interview with The Wall Street Journal late last week, Ballmer maintained the decision to leave was his, made while in London back in May. Well aware Microsoft needed to change faster or risk being marginalized by Apple, Google and others, Ballmer recalled concluding that Microsoft would be able to change faster without him.

"At the end of the day, we need to break a pattern," Ballmer told The Journal. "Face it: I'm a pattern."

Later that month Ballmer set the wheels in motion to inform the board. Also weighing in was lead director John Thompson, the onetime CEO of Symantec, who had made clear to Ballmer in January that Microsoft needed to embrace change at a much more accelerated pace.

Ballmer recounted how he met with his close friend Alan Mulally, the Ford CEO who is a candidate to succeed him, though the automaker has stated its chief will stay put through 2014. During that Christmas Eve meeting at a Starbucks near Seattle, the two talked for hours about Microsoft's new "devices and services" strategy and how Mulally turned around Ford by implementing a more team-oriented approach. Looking for a way to overhaul Microsoft, Ballmer saw that the team-oriented model could help eliminate the legendary siloes that have held the company back, and he changed his management style to fit this new idea.

It is well-known Ballmer didn't want to retire until his youngest son graduates in 2017. At the time of his retirement announcement, Ballmer said bringing in a new CEO in the midst of a transition might not be in Microsoft's best interests.

The report is a fascinating story of how Ballmer fell on his sword for the good of the company he loves so much. That may very well be how things played out. On the other hand, a skeptic could argue that if the board (with or without support from founder and Chairman Bill Gates) did force Ballmer's hand, it's not beyond the realm of imagination that -- wanting to spare one of the company's biggest champions and largest shareholders humiliation -- it agreed to give him this graceful and humble exit.

But regardless of how you feel about how Ballmer ran Microsoft over the years, the company grew its revenue and profits consistently throughout his tenure, even if its share price did little to appreciate. An orderly transition is critical, and throwing Ballmer under the bus certainly wouldn't further that cause.

The story Ballmer revealed is quite plausible even if he displayed unusual humility. Unless evidence surfaces to the contrary though, I'll take Ballmer at his word. Do you?

 

Posted by Jeffrey Schwartz on 11/18/2013 at 10:53 AM0 comments


Why Amazon Could See Slow Uptake for Its Cloud VDI Service


When Amazon announced plans to disrupt the virtual desktop infrastructure (VDI) market Wednesday by launching WorkSpaces at its re:Invent customer and partner conference in Las Vegas, Citrix shares dropped 4.5 percent on the news. Amazon pitched its desktop-as-a-service offering as a more affordable approach than the traditional VDI offered by Citrix, VMware and Microsoft. That's because with WorkSpaces, IT can spin up virtual desktops without buying hardware or software, just as it can spin up compute and storage with the rest of Amazon's cloud portfolio.

Given its track record in upending traditional business models, one doesn't want to ignore Amazon when it offers anything new (remember Borders?). But analysts I spoke with following the announcement noted Amazon is not likely to take the VDI world by storm overnight for a variety of reasons. Maybe that's why Citrix shares are inching back up today?

One noteworthy barrier to adoption of Amazon WorkSpaces is the end user. When Amazon launched EC2 over seven years ago, it gave developers a way to bypass IT to quickly procure infrastructure. End users, on the other hand, are not clamoring for VDI, said Forrester analyst David Johnson. "There aren't employees inside a company that are going to run out and sign up for Amazon desktops," Johnson said. Desktop as a service will appeal to those who need "pop-up desktops" for contractors or to quickly get projects started, he added.

A Forrester survey last quarter found that 11 percent of SMBs and enterprises in North America and Europe plan to adopt desktop as a service within the next 12 months, up from 5 percent during the same period last year. Looking beyond one year, 12 percent said they are planning hosted desktops, up from 7 percent last year.

When it comes to overall plans for VDI, 52 percent said it was a high priority, up from 48 percent last year and 43 percent in 2011, according to Forrester. IDC's current forecast for client virtualization spending overall this year is $175 million. It projects next year it will rise to $311 million and hit $600 million by 2016.

Although VDI deployments that use public cloud infrastructure are a small but emerging piece of that market, Microsoft recently made its Remote Desktop Services (RDS) available for Windows Azure. Amazon WorkSpaces gives users their own instance running portions of Windows Server 2008 R2 and renders a user interface that looks like Windows 7. "There are positives and negatives to both approaches but at the end of the day it's similar for the end user," Waldman said.

Meanwhile, VMware has its sights on desktop as a service with its recent acquisition of Desktone, and Citrix is developing a similar offering. But Waldman said large enterprises are wary of putting user data in the cloud. "We see enterprises taking a slow cautious approach to cloud hosted virtual desktops. However, for small and mid-sized companies where VDI is too expensive and complex to get up and running, it makes it more accessible to them."

The most likely candidates for Amazon WorkSpaces are those that are already using Amazon's cloud infrastructure services, Waldman noted. But there's a case to be made that many IT pros will consider Microsoft's RDS, because of the application compatibility, Waldman said.

"While 95 percent of apps can work on client or server, many apps were poorly written and literally hard coded to run on a client operating system," he said. "Even though apps written for Windows can run on Windows Server, there are many instances it would not because of that one bad line of code."

While there are solutions to remediate that, such as Citrix's AppDNA, it could be a showstopper for those looking for quick deployments.

Are you considering a desktop-as-a-service VDI deployment? If so, which offering sounds most appealing?

 

Posted by Jeffrey Schwartz on 11/15/2013 at 1:53 PM0 comments


Amazon Aims To Disrupt VDI Market with Cloud-Based Offering

More than seven years after upending how IT consumes compute, storage and application services, Amazon is going up the infrastructure stack to the desktop. Amazon Web Services today said it's gunning to shake up the struggling VDI market with a cloud-based alternative that requires no hardware, software or datacenter infrastructure.

The company announced plans to offer Amazon WorkSpaces, which it claims can deliver desktops at half the cost of traditional virtual desktop infrastructure platforms, with better performance. Amazon Web Services senior VP Andy Jassy revealed the new cloud-based VDI offering in his opening keynote address at the company's second annual re:Invent customer and partner conference taking place in Las Vegas.

Saying VDI hasn't taken off because it's complex to set up and manage, Jassy told the thousands of attendees and online viewers in his keynote that Amazon WorkSpaces promises to reduce those barriers. It will allow organizations to move their desktop licenses to Amazon and provides integration with Active Directory.

"You can access your Amazon WorkSpace from any of your devices whether it's a desktop, laptop or an iOS device," Jassy said.  "And you get persistent sessions, so if you're using a WorkSpace on your laptop, and you switch to your Android [or any other] device, the session picks up just where you left off. What's also nice, because it's a cloud service, all of the data lives in the cloud -- it doesn't live local to those devices, which of course is a concern for an IT administrator."

The company described in a blog post a use case with 1,000 employees that would cost just $43,333 using Amazon WorkSpaces -- 59 percent less expensive than an on-premises VDI deployment that would cost $106,356 (a figure that includes datacenter investments).

Amazon will initially offer a Standard service that costs $35 per user per month for one virtual CPU, 3.75 GB of memory and 50 GB of storage, and a Performance plan that costs $60 per user per month for two virtual CPUs, 3.75 GB of memory and 100 GB of storage. A Performance Plus package will come with 7.5 GB of memory. Customers that don't have licenses to move over can add licenses for Microsoft Office and antivirus software from Trend Micro for $15 per user per month.
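
As a rough sketch of the arithmetic behind those numbers -- using only the per-user list prices and cost figures cited above, and keeping in mind that Amazon's $43,333 total presumably bakes in its own usage assumptions, so a straight seat-count total won't match it exactly:

```python
# Per-user monthly list prices cited above; the $15 add-on covers Microsoft Office
# and Trend Micro antivirus for customers without licenses to bring along.
PLANS = {"Standard": 35.00, "Performance": 60.00}
OFFICE_AV_ADDON = 15.00

def monthly_cost(users, plan="Standard", with_addon=False):
    """Estimate the monthly WorkSpaces bill from the published per-user prices."""
    per_user = PLANS[plan] + (OFFICE_AV_ADDON if with_addon else 0)
    return users * per_user

print(f"1,000 Standard seats: ${monthly_cost(1000):,.0f} per month")

# Checking the savings claim in Amazon's 1,000-employee example:
workspaces_cost = 43_333   # Amazon's cited WorkSpaces figure
on_prem_cost = 106_356     # cited on-premises VDI cost, including datacenter investments
print(f"Cited savings vs. on-premises VDI: {1 - workspaces_cost / on_prem_cost:.0%}")
```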

Jassy said the company intends to first offer invitation-only trials; he did not disclose a general availability date. Customers can register for the preview now.

Do you think Amazon can change the economics of VDI and make it more appealing? Given Amazon's track record, I wouldn't bet against the company becoming a player in the VDI market.

Posted by Jeffrey Schwartz on 11/13/2013 at 12:49 PM0 comments


Microsoft Explores Putting Fuel Cells in Server Racks

In its quest to build greener datacenters that are also more efficient and reliable, Microsoft is exploring the use of fuel cells installed in the server racks.

Microsoft announced that it is studying the impact of installing fuel cells directly into the racks as a more efficient means of bringing the power plant into the datacenter than using outside generators. A datacenter powered by fuel cells can reduce operational costs by 20 percent, Microsoft projects, according to a research paper the company published.

The study is the latest evolution of Microsoft's Data Plant project, the company's first zero-carbon datacenter launched last year in Cheyenne, Wyo., where it integrated the infrastructure and its components with a wastewater treatment plant. The new study "aims to determine if integrating fuel cells can improve service availability, reduce infrastructure costs and meet our commitments to sustainability," Sean James, senior research program manager for Microsoft's Global Foundation Services, explained in a blog post.

This would extend Microsoft's Data Plant concept by determining how to fit the entire energy supply chain -- from the power plant to server motherboards -- into a single cabinet, James added. In the paper, the authors illustrate how adding a small generator to the server racks can substantially reduce a datacenter's complexity by eliminating much of the electrical distribution between the grid and the datacenter.

By using fuel cells instead of outside power, he notes, the design isn't restricted by the Carnot cycle efficiency limits of traditional power generators. "By integrating fuel cells with IT hardware, we can cut much of the power electronics out of the conventional fuel cell system," he wrote. "What we are left with is a very simple and low cost datacenter and fuel cell system. As the fuel cell industry becomes more mature, especially small form factor fuel cells for automotive and IT applications, the cost of fuel cells will drop. You may end up with one someday delivering clean electricity and heat to your home."
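
For context, the Carnot limit James alludes to caps the efficiency of any heat engine by the temperatures (in kelvin) of its hot and cold reservoirs; a fuel cell converts chemical energy to electricity electrochemically, so that particular ceiling doesn't apply:

```latex
\eta_{\text{Carnot}} = 1 - \frac{T_{\text{cold}}}{T_{\text{hot}}}
```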

James pointed out this study is only in the early stages but it's a noteworthy step in the company's effort to bring fuel cells into the server rack.

Posted by Jeffrey Schwartz on 11/13/2013 at 4:02 PM0 comments


IE Chief Dean Hachamovitch Reassigned

Update: Joe Belfiore, Windows Phone corporate VP, will apparently oversee Internet Explorer's user experience and application development, according to a report by The Verge's Tom Warren.

Dean Hachamovitch, the Microsoft corporate vice president who oversaw the development of the company's Internet Explorer browser for nine years, is taking on a new role in the company. In a cryptic and brief blog post, Hachamovitch on Monday announced he will join a new team within Microsoft.

While he didn't say what new group he's joining, Redmond magazine columnist Mary Jo Foley reported in her ZDNet All About Microsoft blog that he's joining a team focused on data sciences. The move is part of new Windows group head Terry Myerson's effort to assemble his own team, Foley noted. She also pointed out that most of the key personnel who reported to former Windows chief Steven Sinofsky have moved, or are moving, into new roles.

The move comes just one week after Microsoft released Internet Explorer 11 for Windows 7. Beginning with Internet Explorer 7, the company made aggressive moves to improve the browser under Hachamovitch's watch, notably adding support for HTML5.

When Hachamovitch joined the Internet Explorer team nine years ago, Microsoft's browser was falling out of favor. In the wake of the demise of Netscape, which Microsoft had neutralized, Redmond had little incentive to improve its browser. Microsoft's complacency eventually caught up with it, as the Mozilla Firefox browser gained share, followed by Google's launch of Chrome.

The inflection point came at the first-ever Mix conference in 2006, when Hachamovitch followed chairman and founder Bill Gates in apologizing for neglecting the bug-ridden Internet Explorer 6, which was full of security holes, as recalled by GeekWire on Monday. "We messed up," he said at the time.

It doesn't appear Microsoft will tap anyone to oversee Internet Explorer, Foley noted. Does that suggest Microsoft is going to let the browser once again fall by the wayside? Hachamovitch in his brief post said he is confident that won't be the case.

"Microsoft will of course continue to invest in the browser, in Web standards, in developer tooling for the Web, in privacy, and in even more areas than before," Hachamovitch noted. "There's a new set of capable leaders who will continue the strong work."

Of course, what would you expect him to say?

What's your take on Hachamovitch's move? Is that an omen that Internet Explorer will be marginalized if it doesn't get a new chief? Does it still matter at this point? Or are you confident Myerson plans to ensure future development of the browser?

Posted by Jeffrey Schwartz on 11/12/2013 at 3:58 PM3 comments


Microsoft's Plans To Toss Its Cookies

Microsoft is joining the chorus of tech companies, notably Google, that plan to do away with cookies, the tracking component used on the Web that's typically exploited by advertisers.

AdAge last month reported that Microsoft is developing new ad-tracking technology that would work across PCs, tablets, smartphones and its Xbox gaming platform. The new ad-tracking component would also be integrated into Internet Explorer and Bing, the report noted.

The move doesn't appear to be intended for your convenience though. It's more about continuing ad tracking across TV and video broadcast networks. Michael Schoen, EVP-programmatic product management at IPG Mediabrands, told AdAge that cookies have become irrelevant for television and Web-based video delivery. "For the past two to three years now, there has been a lot of talk about the impending death of the third-party cookie," he said.

Microsoft is developing a "device-identifier" to replace cookies, AdAge reported, meaning users would give permission to share information via a device's terms of service.

"Microsoft would then become directly responsible for users' data and -- assuming it doesn't share it with third parties -- confine privacy concerns to the Redmond, Wash.-based company rather than countless companies that currently collect data on people's browsing behaviors," the report explained.

Rather than letting hundreds or thousands of advertisers put cookies in the browser, the "device-identifier" would be the sole component doing the tracking. With Google, Facebook, Amazon, Apple and other large players developing similar technologies, there will ultimately be a smaller pool of those tracking user data.  

Windows 8.1, which shipped last month, includes a new identifier designed to render higher quality and more targeted ads in Windows Store apps, while providing other services including analytics and app discovery, said Steve Guggenheimer, corporate VP for Microsoft's developer platform, in a blog post. Users can turn the advertising ID on and off, Guggenheimer noted.

In addition, Windows 8.1 ships with Internet Explorer 11, which comes with a "do-not-track" feature turned on by default. However, Microsoft's do-not-track feature is just a signal, sent with each Web request, that expresses the user's preference to third-party advertisers. It's up to the advertiser to honor the request or not. Microsoft recently admitted that its efforts to standardize do-not-track browser technology at the World Wide Web Consortium are mired in disagreement among browser makers and other stakeholders.
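
In practice, that signal is simply the DNT request header ("DNT: 1" when the user opts out), and acting on it is entirely up to the receiving site. Here's a minimal, hypothetical sketch of how a server might check for it -- a bare-bones WSGI app in Python, not anything Microsoft ships:

```python
def app(environ, start_response):
    """Bare-bones WSGI app that checks the Do Not Track request header.
    WSGI exposes the 'DNT' header to the application as 'HTTP_DNT'."""
    dnt_enabled = environ.get("HTTP_DNT") == "1"
    body = (b"DNT received; this response would skip tracking."
            if dnt_enabled else b"No DNT preference received.")
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]

if __name__ == "__main__":
    # Quick local test server on port 8000.
    from wsgiref.simple_server import make_server
    make_server("", 8000, app).serve_forever()
```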

While not addressing plans to eliminate cookie use with Internet Explorer, Guggenheimer announced the release of SDKs for developers to implement the advertising ID, though he noted more SDKs are in the works.

What remains to be seen, of course, is whether eliminating cookies will improve the performance of Internet Explorer or if the new ad identifier will come with its own baggage.

 

Posted by Jeffrey Schwartz on 11/08/2013 at 3:42 PM0 comments


Microsoft's Rumored CEO Candidate Short List

Microsoft has narrowed its CEO shortlist to at least three internal candidates and five external candidates, according to a Reuters report.

Among those still in the running to replace Ballmer are Ford CEO Alan Mulally, former Nokia chief Stephen Elop and two internal candidates: Satya Nadella, executive VP for the Cloud and Enterprise group, and Tony Bates, the former CEO of Skype who is now executive VP for the Business Development and Evangelism group. The remaining candidates weren't named, though the report pointed to Computer Sciences Corp. CEO Mike Lawrie as one on Wall Street's desired list.

The search could still take several months, according to the Reuters report, which cited unidentified sources familiar with the matter. Short of Microsoft luring back former exec Paul Maritz, who ran VMware before settling into his current role as CEO of the spinoff Pivotal, Nomura Securities analyst Rick Sherlund said today he believes the job will go to Mulally and that an announcement will come next month. Mulally's name first surfaced as a preferred candidate in late September.

Sherlund believes Ballmer will make a quick exit and the company will buy back his shares, valued at $12 billion. Sherlund said Ballmer will opt to leave the board because he "doesn't want to be second-guessed."

To compensate for Mulally's lack of experience running a tech company, Sherlund believes founder and Chairman Bill Gates will step in to help direct product strategy. "Bill Gates is going to have to roll up his sleeves to complement Mulally," Sherlund told CNBC's Jim Cramer.

A new CEO would be wise to sell or spin off Microsoft's Bing and Xbox businesses, which are huge drains on profits, Sherlund said. Cramer added that the new CEO will have to "blow the company up."

I'm still in the camp that believes Gates isn't coming back, even in the role Sherlund is predicting. As for speculation that Ballmer may walk away from the board and cash out his shares, that's his choice, and I don't see it having a major impact on Microsoft's future direction either way. Despite the latest buzz, there are still many balls in the air. But I do get the sense that the news is going to come sooner rather than later.

Posted by Jeffrey Schwartz on 11/06/2013 at 2:26 PM3 comments


Should Microsoft Pick at BlackBerry's Carcass?

Once again the soap opera surrounding BlackBerry has taken an unexpected turn, and the struggling former mobile handset leader's already uncertain future has become even more questionable. The $4.7 billion buyout agreement reached in September with BlackBerry's largest shareholder, Fairfax Financial Holdings, fell apart yesterday, and CEO Thorsten Heins stepped down in the wake of the news.

As a result of yesterday's drama, which caused the company's stock to plummet 16 percent, Fairfax and undisclosed investors are instead issuing $1 billion in debt securities that can be converted to stock at $10 per share. Meanwhile, taking the helm as "acting" CEO is John Chen, the longtime CEO of Sybase. When Chen took over Sybase, the onetime database powerhouse was on the brink, having fallen on hard times after being squeezed by Oracle, IBM and onetime partner Microsoft -- a company Sybase ironically helped get into the database market.

While Sybase never was able to regain the share it lost to Oracle, IBM and Microsoft, Chen oversaw the company's expansion into mobile middleware and the development of an in-memory database. Those moves made the company attractive to SAP, which acquired Sybase in 2010 for $5.8 billion, a 40 percent premium to its market cap at the time of the deal.

It remains to be seen whether Chen plans to take "acting" off his title or merely stand in until a new CEO is named. But if Chen does have long-term plans for BlackBerry, that might suggest the company isn't destined to be sold off in pieces, as many have predicted.

Indeed, Microsoft, along with Amazon, Ericsson and Google, was among the "possible buyers" for BlackBerry, an informed source told The Wall Street Journal. Each was interested in pieces of the company, but all ultimately walked away.

There appears to be little BlackBerry has to offer Microsoft, as technology journalist Mary Branscombe argued in September: Microsoft has no need for BlackBerry's hardware or operating system. Thanks to Skype, the once popular BBM (BlackBerry Messenger) would offer little value. Nor would there be much value in its enterprise "crown jewel," BlackBerry Enterprise Server (BES), thanks to the mobile device management capabilities in the latest versions of Exchange, System Center and Windows Intune. Given Microsoft's tendency to license patents rather than buy them, Branscombe said it's unlikely Microsoft would buy the company for $5 billion just to get them.

I have to agree with her points, and even though Microsoft shocked those who never thought it would acquire Nokia's handset business, it still appears unlikely the folks in Redmond will spend heavily for BlackBerry. That doesn't mean there won't be licensing deals or other partnership possibilities -- especially if Chen decides to focus on expanding BlackBerry's technology assets rather than take the company to the chop shop.

Posted by Jeffrey Schwartz on 11/05/2013 at 1:58 PM0 comments


SharePoint Shops Eying Windows Azure as Office 365 Alternative


A vast majority of Redmond magazine readers are holding off on moving their SharePoint infrastructures to the cloud, and the small portion that have made the move typically opt for Office 365. However, a growing number of those planning to run SharePoint in the cloud are looking at Windows Azure at the expense of Office 365.

To be sure, even the majority of those planning to move to SharePoint in the cloud are leaning toward Office 365. But an online survey of nearly 500 readers last week showed a surprising and interesting trend: While 66 percent of current SharePoint online users have Office 365 subscriptions (compared with 14 percent using Windows Azure and 15 percent using other cloud providers), of those planning to run SharePoint in the cloud in the future, 55 percent will opt for Office 365 and nearly 29 percent will use Windows Azure.

That points to a segment of SharePoint shops that are turned off by the lack of code portability from older versions of SharePoint. By standing up their own SharePoint servers in Windows Azure, they get the benefit of running their custom or shrink-wrapped apps in the cloud.

"People don't do customization of SharePoint Online using the old method because the product limits what they can do," explained Forrester analyst John Rymer, who, along with colleague Rob Koplowitz, released a study late last month on enterprise customers' reluctance to move SharePoint to the cloud. "Integration, for example, is pretty limited, and Microsoft will not accept 'just any random code,' and the rules [are] indistinct."

Most SharePoint experts I talk to agree with this, but whether Office 365 is a real deal-breaker depends on the customer's application and overall requirements. "If you want to use all of the content management capabilities, deeper integration into other line of business systems, those are the kinds of customers that will continue to run SharePoint either in their own servers or Windows Azure," said Adriaan Van Wyk, CEO of K2, a provider of a SharePoint workflow app for Office 365 that uses Windows Azure.

When I shared the data with Forrester's Koplowitz, he was intrigued by the number of respondents who are looking to run SharePoint in Windows Azure. "That's a real interesting data point," he said.

For now, only 15 percent of respondents to Forrester's August survey said they were using Office 365 SharePoint Online, up just 3 percentage points over last year, prior to the release of the enhanced service. That's relatively consistent with our survey, which shows just 17 percent of our readers are running SharePoint in any cloud service.

While smaller organizations are the most obvious candidates to go to SharePoint Online in Office 365, especially if they don't have a collaboration solution, larger shops have more complicated decisions to make. Whether larger shops use Office 365, Windows Azure, third-party infrastructure-as-a-service (IaaS) or managed services providers -- or any combination of those -- the largest trend is toward hybrid implementations, in which they add cloud capacity to existing SharePoint infrastructure incrementally.

"We're probably seeing 80 percent of our customers go hybrid cloud in some way, maybe for example moving My Sites and some of their extranets to the cloud, and keeping their line of business integration on premise for now," said Ben Curry, managing partner at Summit 7 Systems. Curry and a number of other SharePoint MVPs will be sharing their views on this in two weeks at the SharePoint Live! conference in Orlando, Fla., which, like Redmond magazine, is produced by 1105 Media's Enterprise Computing Group.

Are you among the growing number of SharePoint shops looking at Windows Azure (or other IaaS providers) to make your foray into the cloud? Or do you find SharePoint Online Office 365 more appealing?

 

Posted by Jeffrey Schwartz on 11/04/2013 at 3:34 PM2 comments


William Lowe, Father of the IBM PC, Dies

William Lowe, who led the engineering team responsible for bringing the first mainstream PC to market in 1981, died last month at age 72 from a heart attack.

News of his death was reported earlier this week by The New York Times. Lowe, a longtime engineer at IBM, proposed bypassing IBM's conventional development model and led a team of 12 engineers that produced the IBM Personal Computer 5150 using off-the-shelf parts and software from third parties. The move led to the creation of IBM's PC within a year. Had IBM opted to build it internally, it would have taken several years.

Lowe's effort also put two companies on the map: Intel and Microsoft. The IBM PC 5150 was powered by Intel's 8088 processor running at 4.77 MHz and ran Microsoft's MS-DOS 1.0 operating system. The team engineered the PC in an IBM lab in Boca Raton, Fla. The secret effort was internally known as Project Chess and the PC's code-name was Acorn. It was available with one or two floppy drives and was priced starting at $1,565 (not including a monitor).

The decision to build the IBM PC on an "open architecture" paved the way for the IBM clone market, ultimately dominated by companies such as Compaq and Dell, as well as dozens of other players at the time. While it gave birth to the PC market and Microsoft, it didn't serve IBM well in the end. Lowe would become president of IBM's Entry Systems Division and later a corporate vice president, but he left IBM in 1988 to join Xerox.

Lowe had no apologies for the decision, as The Times noted. "We are committed to the open architecture concept, and we recognize the importance of an open architecture to our customers," Lowe said of IBM's work with Intel and Microsoft. Some say many top IBM executives never believed the PC would amount to anything major -- a key reason Microsoft was permitted to license MS-DOS to other, then-unknown suppliers.

While we know how that played out for IBM, it certainly makes one wonder if Microsoft would exist in its current form had IBM not gone down that path. For that matter what might computing and devices look like today?

Posted by Jeffrey Schwartz on 11/01/2013 at 2:23 PM0 comments


Windows Phone Sales Surge in North America

While sales of Nokia's line of Lumia phones have shown steady growth in Europe and Asia over the past two years, they have failed to make strong inroads in North America. In a surprising turn, the company's third-quarter earnings report yesterday revealed a sudden spike in the region.

Between July and September, Nokia reported it sold 1.4 million Lumias in North America -- a 180 percent increase over last quarter's 500,000 units and a 367 percent rise year over year. Overall, Nokia sold 8.8 million Lumias worldwide last quarter, a 19 percent increase over the prior quarter and 40 percent over the same period a year ago.

Does that mean Windows Phone is catching on? The sudden rise in Lumia sales in North America suggests Windows Phone may be gaining appeal but it will take several more quarters to see if that trend continues.

To be sure, the 8.8 million Lumias sold worldwide pale in comparison to the 33.8 million iPhones Apple sold during the same period (on top of the more than 250 million already in the market). Nokia also indicated that its most popular phone was the Lumia 520, not the high-end Lumia 1020 released at the beginning of the quarter.

So while Lumia -- and by extension Windows Phone -- sales are on the rise, the platform remains a very distant third behind Android and iOS. Microsoft is betting its $7.2 billion acquisition of Nokia's handset business, which closes early next year, will give it further leverage in advancing its mobile phone business. But just as with the company's new line of Surface tablets, success will ride on killer apps in the Windows Store.

Posted by Jeffrey Schwartz on 10/30/2013 at 11:27 AM0 comments


Metalogix Acquires Idera's SharePoint Tools Business

Looking to further solidify its tooling offering for SharePoint administrators, Metalogix last week said it has acquired Idera's SharePoint business. The move aims to bolster Metalogix's backup and recovery software for SharePoint with Idera's monitoring and diagnostic tools, and comes two months after Metalogix acquired Axceler's SharePoint tools business.

Metalogix CEO Steve Murphy has had his sights on Idera's tools business for two years and told me he convinced the company to sell so it could pour the proceeds into its more lucrative SQL Server tools business. Though terms weren't disclosed, it doesn't appear Metalogix intends to spread its wings into offering SQL Server tools. Rather, Murphy's key objective is to challenge AvePoint for leadership in the SharePoint administration market.

"We were looking for a more robust backup play to compete against AvePoint and provide much more robust end-to-end infrastructure management for SharePoint," Murphy said, adding that Idera's diagnostic tools for SharePoint fills a key gap in its offering. "We believe from an infrastructure management perspective, this puts us on par or ahead of our competitors."

For its part, AvePoint doesn't see Metalogix's move as a threat. AvePoint founder and co-CEO Tianyi (TJ) Jiang said in an e-mail what every rival says when a challenger comes its way: that it validates its place in the market.

"It is necessary to provide customers with an integrated, comprehensive enterprise solution that can meet all of their collaboration needs, which AvePoint has done since we first opened our doors in 2001," Jiang said. "While we have seen other vendors try to achieve this by attempting to grow and scale -- be it through acquisition or through internal development -- AvePoint continues to utilize our deep resources, 1,400-plus employees globally, to create the integrated enterprise-grade product experience our customers demand in order to collaborate with confidence with their technology investments."

The acquisition of Idera's SharePoint tools business gives Metalogix a customer base of 13,500, according to Murphy, and a more robust partner network and overall "operational maturity," he argued. "This is a big changing of the guard in the marketplace," he said.

The SharePoint Diagnostic Manager tool Metalogix is acquiring from Idera lets IT monitor the content and server performance of a SharePoint farm and provides custom alerting on pages and controls, managed from its own dashboard or the SharePoint user interface, according to a description of the tool. It also offers historical trending and forecasting, and it does not require IT to deploy agents on server farms.

In addition to SharePoint Diagnostic Manager, Metalogix is acquiring SharePoint Audit Manager, SharePoint Audit, SharePoint Backup, SharePoint Performance Monitor and SharePoint Admin Toolset, according to a FAQ on Idera's Web site. Though these are all now Metalogix products, the transition will take several months and for now Idera is providing technical support.

 

Posted by Jeffrey Schwartz on 10/28/2013 at 1:19 PM3 comments


Enterprise Boost Helps Microsoft Shatter Financial Expectations

Microsoft can talk up "devices and services," the consumerization of IT and BYOD all it wants, but it can thank enterprises for its unexpected surge in revenues and profits.

Overall, the company yesterday reported revenues for the first quarter of its 2014 fiscal year were $18.53 billion, nearly 5 percent higher than the $17.7 billion analysts had expected and up 16 percent year-over-year, while posting earnings of 62 cents per share, compared with consensus estimates of 53 cents per share. Moreover, Microsoft gave a positive outlook for the current quarter which ends Dec. 31. That was a welcome relief to investors after Microsoft reported one of its most disappointing quarters back in July.

The lift came primarily from commercial revenues, which totaled $11.2 billion for the quarter, up 10 percent year over year. Noteworthy bright spots: SQL Server sales grew 30 percent and sales of its Office products increased 11 percent, with Exchange, Lync and SharePoint all growing 30 percent. Server and tools sales increased 12 percent and commercial cloud revenues increased 103 percent.

Improvements in Microsoft's enterprise revenues were especially noteworthy given that IBM and Oracle both fell short during the same period. Microsoft's stock was up 6 percent at midday today on the better-than-expected performance and the positive forecast for the current quarter.

In a category Microsoft calls Commercial Other, which includes enterprise cloud revenues from Windows Azure and Office 365, revenues could reach $1.9 billion in the current quarter at the high end of the forecast, Microsoft said. Revenue in that category for the reported quarter was $1.6 billion, up 28 percent, which the company said reflects increased demand for its cloud services.

Meanwhile, consumer device revenue of $7.46 billion showed modest growth of 4 percent, though device and consumer licensing was down over 7 percent. Microsoft said that was better than it had expected going into the quarter, when it had forecast a decline in the mid-teens. In Microsoft's struggling Windows business, which continues to be hammered by the growth of tablets, the commercial side held its own.

While non-Pro Windows revenue declined 22 percent, Windows OEM Pro revenue grew 6 percent, Microsoft said. Microsoft revealed Office 365 Home Premium subscriptions have now hit the 2 million mark.

After taking a $900 million charge last quarter on unsold Surface inventory, Microsoft said Surface revenue this quarter doubled to $400 million from the prior period. With this week's release of the new Surface 2 and Surface Pro 2, Microsoft CFO Amy Hood believes the current quarter will show further improvement, noting that customers were delaying purchases in anticipation of the new releases. "With Surface, we are making progress with better end market executions," Hood said.

Now that the new line of Surface hardware is shipping along with Windows 8.1, Windows Server 2012 R2, System Center 2012 R2 and upgrades to the Windows Azure portfolio, we'll get a better sense of how enterprises and consumers alike are embracing these new offerings  in three months from now when Microsoft reports its next quarterly results.

Posted by Jeffrey Schwartz on 10/25/2013 at 11:36 AM0 comments


Nokia Tries To Upstage Apple and Microsoft with Launch of Lumia Tablet

Nokia yesterday launched its first tablet, the new 10-inch Lumia 2520, while also adding two new 6-inch smartphones, the Lumia 1520 and Lumia 1320. Rumors of a Windows RT-based tablet from Nokia surfaced over the summer but it fell under the radar when Microsoft announced its plan to acquire the company's mobile handset business for $7.2 billion last month.

But apparently that didn't deter Nokia from launching its new tablet. Of course, the deal hasn't closed and Nokia must run its business as it sees fit until the transaction is complete.  

Nevertheless, Nokia's decision to launch a tablet line is either a foolish act or a brilliant move. That remains to be seen, but it takes a lot of guts for the company to attempt to upstage Apple, which it and everyone else knew was unveiling new iPads the same day -- the iPad Air, a thinner and lighter version of its full-sized tablet, and a new iPad Mini with Apple's Retina display. The Nokia launch also came on the day Microsoft began shipping the new Surface Pro 2 and Surface 2.

Unlike Microsoft's Surface lineup, the Lumia 2520 will ship this quarter with support for 4G LTE connectivity. It also comes with a 6.7 megapixel camera and Zeiss lenses. Besides Microsoft's Surface 2, it's one of the only tablets bundled with Windows RT 8.1 -- which will likely result in a poor reception for the new Nokia tablet, said independent analyst Jack Gold in a note today.

"Windows RT is not popular and is not selling well, for good reason," Gold noted. "It is a 'dumbed down' version of Windows which does not run all the apps Windows users expect. Most users have not been thrilled with the user experience. I don't expect Nokia to do well with this product for that very reason."

Gold nonetheless described Nokia's new tablet as sporting an impressive design and said it will appeal to those who want 4G LTE built in.

This new device will likely serve as a test case for Microsoft, and it will presumably be folded into Microsoft's Surface portfolio once the deal closes.

Posted by Jeffrey Schwartz on 10/23/2013 at 1:07 PM0 comments


HealthCare.gov Debacle: Biggest IT Failure Ever?

The botched rollout of the Web site built to let customers enroll in an insurance plan under the controversial Affordable Care Act -- aka Obamacare -- will go down as one of the most high-profile IT disasters to date.

That's a high bar if you consider all of the major debacles over the years, from the widely publicized E-Trade, Schwab, eBay and Victoria's Secret site meltdowns more than a decade ago to more recent Black Friday retail site failures, and the outages over the past year that have knocked Facebook, Twitter and major Super Bowl advertisers offline. Still fresh on many minds are last year's meltdowns that shut down major portions of Windows Azure, Office 365 and Amazon Web Services -- the latter taking down many key sites, including Netflix, last Christmas Eve.

Yet those seem to pale in comparison to the failed launch of HealthCare.gov, the signature effort of the Obama presidency that has become a lightning rod for opponents and played a key role in this month's shutdown of the federal government. Regardless of where you stand on Obamacare, you never want the president telling the world how mad he is about how poorly your IT project was planned and implemented, which is what happened yesterday in the White House Rose Garden.

"Nobody's madder than me about the fact that the website isn't working as well as it should, which means it's going to get fixed," the president said. But that may not be easy. According to a report in The New York Times yesterday, Healhcare.gov is a text-book-case study in how not to manage a critical IT project of this magnitude and importance. The site may need 5 million lines of new code, according to yesterday's report. Overall, it's built with 500 million lines of code --five times the amount needed to run a large bank's computer systems, according to the report.

One of the key factors that may have led to this failure was that the Centers for Medicare and Medicaid Services, the government agency overseeing the exchange, took the unusual step (for a federal agency) of managing the 55 contractors itself, overseeing the effort to properly integrate their applications and ensure the databases worked together.

While the site has seen incremental improvements, the extensive code rewrite could take several weeks, at the very least. According to The Wall Street Journal, HealthCare.gov's registration application, developed by lead contractor CGI, takes data gathered from registrants creating accounts and transfers it to the Medicare agency's Enterprise Identification Management app, developed by Quality Software Services, a subsidiary of United Health Care, which submits data to the credit reporting service Experian to confirm user identities.

Apparently the data hasn't interfaced well with Oracle Identity Manager. Oracle reportedly has sent engineers to help remediate the problem and add capacity, but a company spokeswoman told the Journal that OIM is not the root of the problem. "Our software is the identical product deployed in most of the world's most complex systems," according to the spokeswoman's statement.

Better preparation in the form of load testing to simulate the anticipated traffic may have helped avoid many of the problems that surfaced at launch, observers note. Some critics are calling for the head of Health and Human Services Secretary Kathleen Sebelius, who announced today that Jeff Zients, until recently acting director of the Office of Management and Budget, will advise on the project as the agency adds a "surge" of tech support to remediate the problems.

"These reinforcements include a handful of Presidential Innovation Fellows," Sebelius said in a blog post. "This new infusion of talent will bring a powerful array of subject matter expertise and skills, including extensive experience scaling major IT systems.  This effort is being marshaled as part of a cross-functional team that is working aggressively to diagnose parts of HealthCare.gov that are experiencing problems, learn from successful states, prioritize issues, and fix them."

As the facts of this story unfold, the failed launch of this site that millions awaited for years is a painful reminder of how a poorly planned development and IT deployment effort can doom a key strategic initiative. While such IT failures happen all too frequently, this one could go down in the annals of all-time flops.


Posted by Jeffrey Schwartz on 10/22/2013 at 2:54 PM2 comments


Violin Gives Microsoft's Windows Server 2012 R2 More Flash

One of the many notable improvements IT pros will find in Windows Server 2012 R2 is its improved support for flash-based solid state drives (SSDs). The new server OS, released last week, now offers automated storage tiering, which improves performance when using flash-based SSDs in servers and storage arrays.

Windows Server 2012 R2 analyzes requests for disk IO in real time and allocates the most frequently accessed blocks of data to the significantly faster SSDs, while moving blocks of data not recently accessed to traditional mechanical hard disk drives (HDDs). That approach should appeal to the growing number of shops that have added flash to their servers and storage arrays.
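
To make the tiering logic concrete, here is a minimal conceptual sketch in Python -- not anything Microsoft ships -- of the heat-based placement decision the feature automates: count how often each block is accessed over a sampling window, keep the hottest blocks on the SSD tier and leave the rest on HDD. The block granularity, window and SSD capacity are hypothetical.

```python
from collections import Counter

SSD_CAPACITY_BLOCKS = 4  # hypothetical: how many "hot" blocks the SSD tier can hold

def plan_tier_placement(io_trace, ssd_capacity=SSD_CAPACITY_BLOCKS):
    """Given a trace of block IDs accessed during a sampling window,
    return which blocks belong on the SSD tier and which on the HDD tier."""
    heat = Counter(io_trace)                      # access count per block = "heat"
    ranked = [blk for blk, _ in heat.most_common()]
    ssd_tier = set(ranked[:ssd_capacity])         # most frequently accessed blocks
    hdd_tier = set(ranked[ssd_capacity:])         # everything else stays on spinning disk
    return ssd_tier, hdd_tier

# Usage: a toy I/O trace where blocks 7 and 3 are clearly the hottest.
trace = [7, 3, 7, 9, 3, 7, 1, 3, 7, 2, 7, 3]
hot, cold = plan_tier_placement(trace, ssd_capacity=2)
print("SSD tier:", hot)   # e.g. {3, 7}
print("HDD tier:", cold)  # e.g. {1, 2, 9}
```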

But for those looking to run massive scale-out type datacenters and clouds, Violin Memory last week said it has worked with Microsoft to optimize the performance of its flash arrays running with the new Windows Server 2012 R2 and System Center 2012 R2. Violin Memory said the two companies started working together 18 months ago to develop arrays that can extend the performance and provide lower latency when the new server OS is used with applications, including SQL Server, SharePoint and Exchange, by taking advantage of the improvements to Hyper-V and Server Message Block (SMB) file services in the new Windows Server 2012 R2.

The new arrays, which will ship in January for a yet-to-be-disclosed price, can perform at more than one million I/Os per second (IOPS) and scale from 8 TB to 64 TB of memory, with sub-second failover and I/O latency measured in milliseconds.

Microsoft optimized the kernel of Windows Server 2012 R2 to run at these memory speeds in collaboration with Violin over the last 18 months, said Narayan Venkat, the company's VP of products. "As a result, the solution offers SMB 3.0 direct support enabling a Windows-to-Windows communication environment for management integration with System Center and other management simplification on top of the world class file access performance for both SMB and NFS," Venkat said.

Violin is far from the only company targeting flash-based SSDs for the datacenter. Among the others are Fusion-io, Intel (which in June upgraded its enterprise flash SSD offerings), LSI, SanDisk's FlashSoft and Stec (acquired in September by Western Digital), as well as key storage and systems vendors Dell, EMC, Hewlett-Packard, IBM (which earlier this year said it will invest $1 billion in enterprise flash storage) and Cisco, which last month said it was acquiring Whiptail for $415 million.

A number of startups are offering enterprise flash storage as well, including Kaminario, Nimbus Data Systems, SolidFire and Pure Storage, which in late August said it raised $150 million, valuing the company at $1 billion and putting it on track for a possible initial public offering. Meanwhile, Violin's closely watched IPO a few weeks ago got off to a rough start: the company priced its shares at $9, raising $147 million, but the stock closed Friday at $7.25.

Do you see yourself taking advantage of Windows Server 2012 R2's flash storage support?

Posted by Jeffrey Schwartz on 10/21/2013 at 1:02 PM0 comments


Salesforce.com Challenges Active Directory with Single Sign-On Service

Salesforce.com's storied strategy of displacing premises-based apps with Software as a Service (SaaS) went deeper this week with the company's release of Salesforce Identity. The single sign-on service aims to displace traditional software like Active Directory as the central repository for user authentication.

The company's new Salesforce Identity service extends beyond traditional enterprise directories like Active Directory by connecting employees, customers and business partners to any application, device or service, said Chuck Mortimore, Salesforce.com's vice president of product management, identity and security. In addition, ISVs and customers can white-label that single sign-on service into their applications.

Salesforce Identity is not the first SaaS single sign-on offering. A number of third parties, including Centrify, Okta, Ping Identity, SailPoint, Symplified and a handful of other players, now offer single sign-on services. I noted late last year that identity management as a service would be a key area of expansion in 2013.

Indeed, Microsoft this year brought Active Directory to the cloud with Windows Azure Active Directory, which Microsoft said at June's TechEd has processed 265 billion authentication requests from around the world. In addition, Brad Anderson, corporate VP of Microsoft's Windows Server and System Center group, said that Windows Azure Active Directory has processed more than 1 million authentication requests in a two-minute period, or roughly 9,000 per second.

Salesforce Identity supports the key identity standards, including OAuth, System for Cross-domain Identity Management (SCIM), Security Assertion Markup Language (SAML) and OpenID Connect, allowing IT to federate sign-on and synchronize identities across providers ranging from Facebook, PayPal, Amazon and Google to Active Directory.

"You can leverage your existing Active Directory, automatically synchronize users between the two, drive authorization out of existing Active Directory Groups and have those drive profiles and permissions and authorizations to any application brokered to the Salesforce Identity platform," Mortimore explained.

Mortimore avoided saying whether Salesforce is gunning to displace traditional authentication systems such as Active Directory, but reading between the lines, that appears to be the goal.

"The identity marketplace is going through a transition," he said. "The old mechanisms of dealing with identity are not really working for the new use cases in front of customers. You see all sorts of different organizations and identity management organizations trying to deal with this reality. Your LDAP directory isn't necessarily going to be the path forward for all of these applications that are no longer inside your firewall."

In addition to applications, Mortimore explained traditional directories were designed to manage identities of employees, while now IT must address identity management of customers and partners and attributes on devices not owned by the enterprise.

"I see solutions like Salesforce Identity initially interacting with existing on-premises directories, but providing a new cloud-based native identity store option for 'long-tail' external identities such as employees of small partners," said Forrester analyst Eve Maler.

Salesforce's move into the identity management space has been long anticipated. The company announced plans for Salesforce Identity at its annual Dreamforce conference last year. Salesforce has also played a key role in several of the standards committees. Gartner analyst Ian Glazer said IT is demanding identity and access management services interwoven with their SaaS-based applications. Salesforce has added identity management services across its entire platform, including Force.com, he noted.

"This announcement represents a fundamental change in the IAM [identity and access management] market in which non-traditional identity companies such as Salesforce are aggressively entering the market with hopes of major disruption," Glazer said. Yet despite that goal, Glazer doesn't see organizations displacing Active Directory with Salesforce Identity.

"I definitely see organizations using Salesforce Identity (or its competitors), but not with the express goal of replacing AD [Active Directory]," Glazer said. "As enterprise computing moves more toward mobile and cloud computing, the value of AD is diminished. As an enterprise directory, AD will remain [in] usage and meaningful, but the locus of control will shift away from AD and it will likely not be the default source of authentication and authorization services in a post-PC world."

KuppingerCole analyst Mike Small noted in a blog post Tuesday that Salesforce.com has long offered an identity service in its traditional CRM and related offerings. Small pointed out that Salesforce Identity includes an extensible cloud directory and the optional Salesforce Identity Connect module, built on ForgeRock's Open Identity Stack, which bridges between existing on-premise directories and Salesforce Identity. Another appealing capability, Small noted, is that Salesforce Identity's monitoring and reporting capabilities let organizations create user activity and compliance reports.

"Through this platform -- Salesforce.com [is] seeking to change the way in which identities are managed by organizations," Small noted. "To alter the perspective away from one focused on internal IT systems and users to an outward-looking one focused on customers and partners whilst retaining internal control: integrating enterprise identity with CRM."

How will the release of Salesforce Identity change the way you manage access to your applications?

Posted by Jeffrey Schwartz on 10/18/2013 at 1:57 PM0 comments


Upgrading to Windows 8.1 Will Require Some Patience

Microsoft's Windows 8.1 arrived yesterday, and as soon as I got to my desk I embarked on the task of upgrading my PC, which was running the preview release.

The experience was more challenging than I expected. In fact, it killed a good part of my morning and was fraught with frustration.

Though Microsoft said yesterday that you can upgrade by simply going to the Windows Store, when I did so, the Windows 8.1 upgrade was nowhere to be found. The first thing I did was to make sure the Windows 8.1 preview on my system was up-to-date. Indeed, I was due for an update and that process took about an hour.

After extensive searching, I finally found a link to perform the upgrade. However, after launching the upgrade, nothing appeared to happen. It wasn't until about an hour later that my screen indicated that my PC was indeed going through the upgrade process, but I waited an additional hour for it to complete.

Microsoft warned users of the preview that certain apps would be gone, so I prepared for the worst. Indeed, quite a few apps were gone, including -- not surprisingly -- Google Chrome. But also gone was Microsoft Office. Those with Office 2013 will need to go to the Office site and re-install it. Make sure you have your license key. The good news is that all your data should be intact.

My mail settings remained intact -- both my connection to my company's Exchange server and my Yahoo Mail account. My colleague, Redmond Channel Partner Editor in Chief Scott Bekker, had a similar experience with his Exchange connection, though alas, he had to reconfigure his Gmail account.

Upgrades are never fun but this seemed more difficult than it had to be. If you are among those who have yet to perform the Windows 8.1 upgrade, consider yourself warned.

Posted by Jeffrey Schwartz on 10/18/2013 at 12:14 PM0 comments


VMware Extends Hyper-V Support to VM and Host Management

While VMware might want to diminish Microsoft's role in the enterprise, the company understands that if it wants to play an even greater role in managing virtual and cloud infrastructures, it needs to acknowledge Hyper-V as a force to be reckoned with.

In a sign it has embraced that reality, VMware has extended its support for Hyper-V.

At this week's VMworld 2013 Europe conference in Barcelona, VMware announced several new products, but notably VMware vCenter Operations Management Suite 5.8, which provides improved performance monitoring of Hyper-V, as well as SQL Server and Exchange.

The vCenter Operations Management Suite was already able to monitor and manage Hyper-V workloads at the guest OS level by placing its Hyperic agent inside the virtual machine. But now, through Microsoft's System Center Operations Manager (SCOM) or VMware's Hyperic management packs for vCenter Operations Manager, it can manage Hyper-V hosts and virtual machines directly. It also adds support for Amazon Web Services (AWS) EC2 infrastructure as a service (IaaS).

The new version collects data on the CPU, network, memory and other components, and feeds that into its analytics engine to separate normal performance behavior from unhealthy activity and then provides alerts.
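
VMware hasn't published the internals of that analytics engine, but the general technique -- learn a rolling baseline per metric and alert when a sample falls well outside it -- can be sketched in a few lines of Python. The window size and the three-sigma threshold below are illustrative assumptions, not vCenter Operations' actual parameters.

```python
from collections import deque
from statistics import mean, stdev

class MetricBaseline:
    """Rolling baseline for one metric (CPU %, memory %, network Mbps, ...).
    Flags samples more than `sigmas` standard deviations from the recent mean."""

    def __init__(self, window=60, sigmas=3.0):
        self.samples = deque(maxlen=window)
        self.sigmas = sigmas

    def observe(self, value):
        alert = False
        if len(self.samples) >= 10:              # need some history before judging
            mu, sd = mean(self.samples), stdev(self.samples)
            if sd > 0 and abs(value - mu) > self.sigmas * sd:
                alert = True                     # outside normal behavior -> raise alert
        self.samples.append(value)
        return alert

# Usage: steady CPU load around 40 percent, then a spike that should trigger an alert.
cpu = MetricBaseline(window=30)
for v in [40, 41, 39, 42, 40, 38, 41, 40, 39, 40, 41, 95]:
    if cpu.observe(v):
        print(f"ALERT: cpu sample {v}% deviates from recent baseline")
```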

Expanding support for Hyper-V is a smart move that will be welcomed by VMware and Microsoft customers alike, said Pund-IT principal analyst Charles King. "By expanding its support for and visibility into Hyper-V and public cloud services like AWS, VMware is highlighting its continuing technical leadership," King said. "Since these new features are also coming with no additional premium, adding them also enhances the value proposition of VMware's solutions and services."

In tandem with the vCenter Operations Management Suite upgrade, VMware also updated its recently launched vCenter Log Insight analytics tool. The new Log Insight 1.5 release provides real-time analytics to provide searchable information and dashboards. Released in June, the Log Insight upgrade supports "content packs" for specific systems such as Exchange Server and SQL Server, as well as products from Hytrust, EMC, NetFlow and VMware's own Nicira software-defined networking platform, said Mark Leake, director of product marketing in VMware's cloud management business unit.

"These new out-of-the-box capabilities enhance the discovery and topology," Leake said. "So you get deeper discovery of app instances and components and you can apply the analytics that we have in vCenter Operations Manager to them."

Posted by Jeffrey Schwartz on 10/16/2013 at 1:13 PM0 comments


Are Dark Clouds Forming Over SharePoint?


Nearly a year after Microsoft disclosed that SharePoint is a $2 billion business, the collaboration platform has become the latest punching bag in the IT community.

SharePoint took its latest hits yesterday at the annual Gartner ITxpo in Orlando, Fla., where research vice president Jeffrey Mann led a session called "Should Microsoft Kill SharePoint?" as reported by Kurt Mackie yesterday. The answer was no, but Mann pointed to dissatisfaction with Microsoft's latest release of SharePoint 2013 and the SharePoint Online service offered with Office 365.

"Many organizations using SharePoint cannot go to the cloud because they have regulatory restrictions or complex, customized implementations that prevent adopting SharePoint Online," Mann said in a prepared Q&A. "Some third-party add-ons they depend on are not available for the cloud version. Others do not trust the cloud or see no reason to change, so they won't make the move."

The second big issue, according to Mann, is IT's frustration with SharePoint. "We regularly hear end users and administrators complain about features or user-experience improvements that they would like to see in SharePoint," Mann noted. "Although they want new functionality, they are less keen to have more upgrades, which are seen as expensive, disruptive and time-consuming. It is difficult to see how users can expect to get changes without implementing new versions. If upgrades were easier, they might be less reticent to install new versions. This is a move Microsoft is trying to address with the app model introduced in SharePoint 2013."

Gartner isn't the only one sounding the alarm on SharePoint. A report released last month by the Association for Information and Image Management (AIIM) found that only 6 percent feel they have achieved success with their SharePoint implementations, while 26 percent say they have struggled to meet expectations and 28 percent report that while it's doing the job, progress has stalled to some extent. Seven percent said their SharePoint implementations were not successful. The findings are based on a survey AIIM conducted in July, consisting of a sample of 620 respondents who are among the trade association's 80,000 members.

As evidence that there's still trepidation about the cloud, the AIIM survey also found that only 9 percent of small organizations plan to move all of their SharePoint content to Office 365, while 2 percent of mid-sized shops and 3 percent of large shops plan to do so. Customers, analysts, consultants, integrators and key vendors including Microsoft agree enterprises will widely deploy hybrid cloud models. The AIIM report found that only 20 percent are looking at hybrid clouds, though half will use third-party cloud providers rather than Microsoft's Office 365. It bears noting that the AIIM study wasn't aimed exclusively at SharePoint and/or Microsoft shops but at its audience of IT managers and decision makers.

The survey also found that 62 percent are using at least one cloud-based product but only 8 percent said it was SharePoint and 7 percent are using Office 365 as a complete software-as-a-service application. Twenty percent said they're using Exchange Online and 39 percent are using the cloud version of Lync.

When I asked Microsoft for its take on the AIIM study, a spokeswoman noted SharePoint is a leader in Gartner's Magic Quadrant, that two out of three information workers in the enterprise have SharePoint and that more than 700,000 developers are building applications for Office 365. "Microsoft remains very committed to SharePoint for the long term, as the product plays a critical role in the future of collaboration and productivity and is a core pillar of the Office 365 suite," the spokeswoman said in an e-mailed statement. "Office 365 is Microsoft's fastest-growing commercial product ever. In fact, the Office 365 business is now on a $1.5 billion annual revenue run rate."

The statement also defended the security of Microsoft's cloud service. "Office 365 offers the most robust set of certifications and standards options of any major cloud based productivity service. In fact, each year, we undergo third-party audits by internationally recognized auditors to validate that we have independent attestation of compliance with our policies and procedures for security, privacy, continuity and data handling."

Another survey by SharePoint tools provider Metalogix Software Corp. earlier this year showed 55 percent intend to continue running SharePoint entirely in-house and only 10 percent plan to run it purely in the cloud. The remaining 35 percent are planning hybrid SharePoint implementations.

While findings naturally vary depending on the audience surveyed, another alarming stat caught my attention back in August: Overall salaries for SharePoint administrators -- though still six figures -- declined by 7 percent this year, according to Redmond magazine's annual salary survey. That especially raised a red flag because our survey showed overall IT salaries increased 3 percent this year. It's not clear what led to the decline, whether it's a glut of SharePoint experts or the fact that some SharePoint workloads now run in the cloud.

Yet many SharePoint consultants are seeing more movement to the cloud than these surveys suggest. "We're probably seeing 80 percent of our customers go towards a hybrid cloud in some way, maybe, for example by moving My Sites and some of their extranets to the cloud, and keeping their line of business applications on-premise for now," said SharePoint MVP Ben Curry, managing partner of Summit 7 Systems, a Huntsville, Ala., consultancy and systems integrator.

Curry knows all the pitfalls of SharePoint 2013 and Office 365 and will be among numerous experts talking about the future of SharePoint design and architecture at the SharePoint Live! track of the Live 360 conference in Orlando next month. Like Redmond magazine, the conference is produced by 1105 Media Inc.

In an interview last week, Curry told me he'll discuss the deployment of business-critical applications using SharePoint and how to maximize the performance of server farms, as hybrid environments emerge. Curry was once a skeptic of using SharePoint in the cloud but he now believes in it. "That's where a lot stuff's going – online. I'd say the rare client is going all SharePoint 2013 on premise. It really is upgrading pieces of it to 2013 on premise and pieces of it to the cloud, which makes performance a little bit more challenging because you have to first decide what workflows are on premise to know what to architect for."

What's your take on the future of SharePoint? Drop me a line at [email protected].


Posted by Jeffrey Schwartz on 10/14/2013 at 11:04 AM2 comments


Did Nadella Just Audition for Microsoft's CEO Position?

When Microsoft invited us to Monday's press conference in San Francisco for a rare appearance by the head of the company's newly created cloud and enterprise group, we were informed he'd discuss how Redmond is planning for future growth in that business and how it will differentiate itself from its competitors.

Though the invitation was packed with typical press-invite vagueness, we certainly weren't going to pass on the opportunity to hear what Satya Nadella had to say. Would there be a major unexpected announcement of a new product or service, perhaps a blockbuster acquisition, or would he merely describe what those who follow Microsoft already know?

As it turned out, Nadella effectively did the latter, formalizing the already disclosed release dates of products like Windows Server 2012 R2 and System Center 2012 R2, among other deliverables. But Nadella's media event was also an open Q&A not unlike a presidential press conference, sans the podium and the moderator on stage (and the cocktails that follow).

Though we at Redmond magazine watched the Webcast, our Silicon Valley contributor John K. Waters was on hand to provide color. While an even mix of tech journalists and business media attended, it appeared this event was staged for Nadella to make his case to skeptical business journalists who, like us, wonder if Microsoft can remain a force in the post-PC and cloud era.

Given his appearance on CNBC earlier that morning, it also appeared as though this event was staged not only to articulate Microsoft's blueprint for growing its key $20 billion enterprise business but to have Nadella audition as a candidate to succeed Steve Ballmer as CEO. Nadella is one of numerous candidates that pundits have floated as potential successors, though typically only in passing; many of them work or have worked at Microsoft. On Monday, Nadella was hammered with questions about nearly everything that may affect Microsoft's future, including how the reorg might be impacted by the next CEO and whether he considered himself a candidate for the job -- asked, predictably, by All Things D's Kara Swisher.

Nadella answered the question just like New Jersey Governor Chris Christie responds when asked if he might run for president in 2016. "Our board's looking for a new CEO and that process is well on our way," Nadella responded. "Steve is very much the CEO today and I'm excited about my job. That's the sum total of what I have to say on that." Do you think he saw that question coming?

Despite the fact that he didn't drop any bombshells, he certainly had answers for a lot of questions that haven't come his way of late. Perhaps the most interesting came from a reporter who asked, and I'm paraphrasing here, how Nadella can effectively run Microsoft's cloud and enterprise group when a key component of the company's enterprise portfolio -- Windows Server, and by extension Hyper-V, which powers Windows Azure -- is controlled by a different group, the operating systems organization headed by Terry Myerson.

"This 'One Microsoft' reorganization is just fantastic from us not having in fact any of these notions of who controls what," he said. "If you look at what I'm doing here, you're talking about everything that's happening across Office 365, Dynamics, CRM. What's happening inside our server products and Azure has not changed. The servers are very much part of my organization, but that's kind of immaterial, because some of the componentry like the hypervisor itself is in the core OS, but the entire server gets put together. And that I think you can say this 'One Microsoft' thing is more a reflection of how we were already working and more formalizes it. I celebrate this notion of being able to decouple what I think are our engineering efforts and what is our marketing and business model, because I think categories are going to rapidly shift. What is a developer product, what is an IT product, what is an end user product, all have to be rethought, we think about this as one unified engineering effort, and one unified go to market effort. Especially with consumerization, that becomes even more important."

It's interesting to hear that Windows Server is part of his organization, even if it's also part of the operating systems group. The implication is that there's more overlap and shared resources in these organizational lines than Microsoft has articulated.

The next test of his well-roundedness on the Microsoft portfolio came in the form of a question regarding the role of tablets with apps like CRM. Of course Nadella worked in the Dynamics group years ago so it wasn't foreign territory. "We're really thrilled about some of the client work around CRM and [how] some of our ERP applications are taking advantage of the new Windows 8 touch form factors to enable real work," he said.

 Nadella then tied that to his own line of business. "Any time you're innovating in the front end for different mobile form factors, you also want to have the back end as a service," he said. "So Azure Mobile Services, which I think is one of the fastest-growing services out there. When it comes to mobile, we've done a fantastic job of making the business applications or application development for these new touch and mobile form factors, with the back end that much more easier."

Not only did Nadella get a pitch in for the company's software-as-a-service (SaaS) story, he related that to the company's devices and services strategy, tying it to Windows 8 and the company's development environment.

Some interesting figures Nadella shared are worth noting:

  • 70 percent of Windows Azure usage is Microsoft's six-month-old infrastructure-as-a-service (IaaS) offering
  • 2.5 million organizational tenants now reside in Windows Azure Active Directory, aided mostly by Office 365. Nadella said that's key for any ISV that can now do single sign-on using WAAD. (Note: Despite the figure he gave, a Microsoft spokeswoman later sent an updated figure of 3 million tenants.)
  • While Microsoft claims its software runs on 75 percent of Intel-class hardware, Nadella says Microsoft's $20 billion in enterprise revenues makes it a small player in what he sees as a $2.2 trillion market. "What I tell my team is it's not about building software for enterprise IT, but the best way to think about it is to build software that we use in our own stuff [Bing, Office 365, Dynamics Online, Xbox Live, etc.], which is powering our own first-party services and by doing so we're going to meet the future demands of enterprises."

Who knows if Nadella will emerge as a leading candidate to succeed Ballmer. While he might not have the aura of an Eric Schmidt or a Paul Maritz, they didn't either before they were thrust into the limelight. Nevertheless, Nadella could become the same type of executive as those two, whether or not he becomes Microsoft's next CEO. What's your take on Nadella's audition?

Posted by Jeffrey Schwartz on 10/09/2013 at 12:16 PM3 comments


Are You Ready for Smartwatches?

While Dick Tracy's notion of a watch that enables two-way communications dates back to the 1960s, it's no longer science fiction. How quickly watches that extend the smartphone to your wrist catch on is anyone's guess but it seems like we're about to find out.

Samsung is kicking off an ambitious marketing plan for its new Galaxy Gear smartwatch (check out this commercial). Reviews have largely panned the $299 device for a variety of reasons, including its currently limited functionality. As David Pogue noted in his review in The New York Times last week, the watch will alert you that you have a message, but you have to look at your Galaxy Note smartphone to see that message. It doesn't yet work with more widely used Galaxy phones such as the Galaxy S4, though that will apparently be rectified next month.

The watch does let users receive and make phone calls when their phone is nearby (via Bluetooth), control music playback and take a small number of photos and videos. Oh, and it still displays the time. The Galaxy Gear must be charged every night, it's not waterproof and it's big (but that's to be expected).

Samsung is surely not the first to introduce a smartwatch -- Kickstarter-backed Pebble Technology, Sony and Motorola already offer them. But Samsung's huge installed base of Galaxy phones and its decision to roll out an aggressive marketing campaign for Galaxy Gear will be a real test of demand for smartwatches. Market researcher Canalys, which expects 500,000 smartwatches to ship this year, recently forecast that number will grow roughly nine-fold next year to 5 million.

Besides Samsung, a widely rumored iWatch from Apple next year could be a hit, according to a survey of 799 U.S. consumers by Piper Jaffray. The survey suggests Apple could sell 5 to 10 million iWatches in the first year they're available. Still, only 12 percent of iPhone users said they'd be interested in purchasing an iWatch, according to the survey. In other words, an overwhelming 88 percent are not interested.

For its part, Microsoft is said to have moved its latest effort to release a smartwatch to the group that designs and markets the company's Surface devices. While The Verge recently reported Microsoft is working on smartwatch prototypes running modified versions of Windows 8, it remains to be seen whether the company's recent move to acquire Nokia's mobile handset business and its search for a new CEO will delay Microsoft's entry into the smartwatch market.

Many scoff at the idea of having a smartwatch. But those same people once never imagined they'd carry a smartphone, either. Are you warming up to the idea of glancing at your wrist to check your e-mail?

Posted by Jeffrey Schwartz on 10/07/2013 at 2:14 PM0 comments


Should Bill Gates Step Down as Microsoft's Chairman?

Three of Microsoft's top-20 investors this week said the unthinkable: The time has come for Bill Gates to step down as chairman. I say "unthinkable" tongue-in-cheek, as many with various motives have undoubtedly thought Gates should go. But it appears this may be the first time investors made a concerted effort to advance the idea.

The three unnamed investors told Reuters they feel Gates holds too much power considering he continues to sell off shares; he holds only 4.5 percent of the company's outstanding stock today, a figure that will leave him with no stake by 2018 if he keeps selling at the pace he has indicated.

The investors reportedly believe Gates' presence on the board interferes with the creation and adoption of new strategies and would hold back a new CEO's ability to make changes. They see Gates' participation on the special committee searching for Ballmer's successor as troublesome, according to the report.

Gates' departure as chairman may be inevitable in the coming years, and while anything could happen, it doesn't seem likely the board will strip the founder of a say in who becomes Microsoft's next CEO. But he won't be able to rubber-stamp anyone if lead independent director John Thompson sticks to his word to take a broader view.

"I have enormous respect for Bill," Thompson told The Wall Street Journal last week. "But I didn't accept the role on the board or the role as the lead independent director to be Bill's pawn." Thompson, the onetime CEO of Symantec and longtime IBM executive is said to have no problem standing up to Gates.

Do you think Gates should remain Microsoft's chairman and remain on the committee considering Ballmer's successor?

Posted by Jeffrey Schwartz on 10/04/2013 at 1:10 PM0 comments


Dell Refreshes Line of Tablets and PCs

Dell this week refreshed its entire line of tablets and notebook PCs, introducing its new Venue brand of tablets alongside updated XPS notebooks in a wide variety of form factors, price points and sizes, ranging from a 7-inch tablet running Google's Android OS to a powerful 15-inch Ultrabook running Microsoft's forthcoming Windows 8.1.

I attended the launch at a hotel in New York and addressed the elephant in the room, asking whether any Windows RT devices are in the pipeline. Since Dell took its existing Windows RT tablet, the XPS 10, off the market last week, I already had an idea how company officials would respond. The answer is no.

"We are not planning to refresh our range of RT products, we've had them on the market for a year. Right now we're focused on the full Windows products," said Neil Hand, vice president of Dell's tools and performance PC group.

Later on I asked him what made the group come to that conclusion.

"I think the market data speaks for itself in terms of what the volume has been," Hand said. "We focused very hard on it. We fully supported it with a great product with the XPS 10. It was an award-winning product. Sometimes the best product doesn't always create customer demand. The application base wasn't quite there."

Dell showcased a slew of new tablets, notebooks and hybrids that will start shipping this month and next, coinciding with the launch of Windows 8.1 and the peak holiday season. Among the refreshed systems for Windows 8.1 are:

  • XPS 11: This Ultrabook starts at 2.5 pounds and converts from a tablet to a notebook via a 360-degree rotating hinge, with a backlit touch keyboard and a Quad HD (2560 x 1440) display. It starts at $999.99.
  • XPS 13: With a similar form factor to the above-mentioned XPS 11, the XPS 13 is targeted at mobile professionals. This Ultrabook starts at 3 pounds and comes with more powerful fourth-generation Intel Core processors, Intel HD 4400 graphics and an improved battery. It has a 1920 x 1080 HD display and starts at $999.99.
  • XPS 15: Targeted at those who want full multimedia capabilities, this 15.6-inch system is available with a Quad HD+ (3200 x 1800) display and Intel's fourth-generation Core i5 and i7 quad-core processors with optional NVIDIA graphics cards. It's available with 500 GB and 1 TB hard disk drives, a 512 GB solid state drive and Intel's Rapid Start Technology. Pricing starts at $1,499.

New Venue
Dell also unveiled its new Venue line of tablets, which includes Windows 8.1-based units in form factors ranging from 8 to 11 inches and Android devices in 7- and 8-inch form factors. The Windows 8.1-based Venue 8 Pro is powered by Intel's latest Atom quad-core processors, code-named "Bay Trail," and starts at $299. The $499 Venue 11 Pro is based on Intel's fourth-generation Core i3 and i5 processors and comes with the option to use Intel's vPro management technology. Both are available with optional keyboards.

The Android-based Venue 7 and Venue 8 tablets are based on Intel's Atom "Clover Trail" Z2760 processors and cost $149.99 and $179.99, respectively.

Posted by Jeffrey Schwartz on 10/04/2013 at 1:07 PM0 comments


Feds Say Windows Azure Meets Key Security Standards

Microsoft's Windows Azure cloud service can now be used for systems that require high levels of security. The U.S. government has granted Windows Azure a FedRAMP Joint Authorization Board (JAB) Provisional Authority to Operate (P-ATO).

That means Windows Azure meets the security requirements of federal agencies looking to use public infrastructure as a service (IaaS) and platform as a service (PaaS), said Susie Adams, chief technology officer for Microsoft Federal, in a blog post.

While many cloud providers already meet FedRAMP requirements, Microsoft claims Windows Azure is the first to receive a P-ATO from the JAB, which includes representatives from the Department of Defense, the Department of Homeland Security and the U.S. General Services Administration (GSA).

 "Securing a P-ATO from the JAB ensures that when government agencies have a need for an Infrastructure as a Service (IaaS) or Platform as a Service (PaaS), they know that Windows Azure has successfully met the necessary security assessments," Adams said in her post. "This not only opens the door for faster cloud adoption, but helps agencies move to the cloud in a more streamlined, cost-effective way. Additionally, since Microsoft datacenters were also evaluated as part of the JAB review process, other Microsoft cloud services are ultimately better aligned to meet these security controls as well."

Certainly customers -- whether government agencies or others that require the highest levels of security -- will welcome this development. But it remains to be seen whether the certification will be enough to overcome concerns over the government's surveillance efforts under programs such as PRISM, as noted in our cover story this month. What's your take on Windows Azure achieving FedRAMP compliance?

Posted by Jeffrey Schwartz on 10/02/2013 at 3:52 PM0 comments


Microsoft Readies Office Store for Subscriptions

Microsoft has taken a key step toward letting SharePoint and Office customers purchase apps on a usage-based subscription from the Office Store. The company is letting ISVs submit subscription apps in its Seller Dashboard starting today. By next month, customers will be able to purchase apps on a subscription basis.

Since its launch last year, the Office Store has sold software only with traditional perpetual-use licenses. Microsoft officials believe offering subscription-based apps in the Office Store will increase the appeal of the Office 365 service and of using SharePoint in the cloud. Two-thirds of the applications in the Office Store are SharePoint apps, given the participation from ISVs and developers, said Dene Cleaver, Microsoft's senior product marketing manager for Office.

"We hope the ISVs and developers can drive innovation and update their apps and I think it allows them to price accordingly, so they can continue to drive value," Cleaver said.

Apps developed for the Office Store can work with Office 365, SharePoint Online and SharePoint 2013. However these apps will not work in earlier versions of SharePoint including SharePoint 2010. "The app model [on-premise] is tied to SharePoint 2013," Cleaver said.

One early supporter of the new model is Adlib, which provides an app called PDF Publisher, software that takes multiple documents in any format and publishes them as a single PDF document. It can also encapsulate them into an internal report. PDF Publisher began as an application that runs on a premises-based server. Adlib President and CEO Peter Duff said the company's cloud-based version offers many of the key features provided in the server implementation.

"We're based in the Windows Azure cloud, and we can integrate that directly into Office 365 as well as on premise," Duff explained. He added that he believes there's pent-up demand to purchase apps for SharePoint on a subscription basis. "People can utilize our technology on a subscription basis without having to implement significant costs associated with on premise hardware and software and operating systems and things like that," he said.

A rich portfolio of apps in the Office Store will be a critical factor in swaying organizations to move to Office 365 and SharePoint Online, experts say, noting many developers are eager to offer them on a subscription basis. "Today it's just pockets of legacy apps that have moved to the app store," said Ben Curry, managing partner and a principal architect at Summit 7 Systems, a Huntsville, Ala., Microsoft partner, consultancy and solutions provider.

Many of his clients have turned to Nintex, a SharePoint workflow provider that is frequently mentioned as one of the first key ISVs to embrace the Office Store with the trial version of its Nintex Workflow Online app. "Clients who love Nintex on premise are now loading the app version. It's an easier model to keep updated."

While Microsoft says there have been 1 million downloads from the store, few are using it in a big way. "I think that will change," Curry said. "As the app store grows, demand will grow."



Posted by Jeffrey Schwartz on 10/01/2013 at 12:03 PM0 comments


Top Brands: Microsoft Stays at 5, Apple Ousts Coke for Top Spot

Despite turmoil in Redmond this year, Microsoft held on as the fifth most valuable brand in Interbrand's Best Global Brands report, released today. Microsoft's key rivals Apple and Google jumped in rank, taking the No. 1 and No. 2 spots, respectively.

By becoming the leading brand, Apple ousted Coca-Cola, which had topped the chart every year since Interbrand launched the ranking 14 years ago. While the ascent of both Apple and Google is hardly surprising, Microsoft's ability to hold on to the No. 5 spot is a feat, considering the weak reception for Windows 8 and its Surface tablets, both released last year.

But Nokia, whose key assets Microsoft is acquiring for $7.2 billion, didn't fare as well. Nokia dropped from 17 last year to 57 -- an overall decline in brand value of 65 percent. After the deal closes of course, Nokia will continue to exist as a company that will retain patents and its telecommunications infrastructure business.

The good news for Microsoft is that its brand equity remains strong in spite of the many who question the company's relevance in the post-PC era. IBM's ranking slipped a notch, while Apple and Google hold the top spots. Also on the rise, and a considerable threat to Microsoft, is Amazon.com, which has a thriving cloud business and remains an aggressive contender in the tablet business.

In case you're wondering, Ford's brand value increased 15 percent, moving the automaker from No. 45 to No. 42. That's relevant only if Microsoft ultimately selects Ford CEO Alan Mulally to replace Steve Ballmer. As reported last week, Mulally is said to be a top contender for the job.

Brands are a critical factor to any company's success. Redmond magazine's reader survey this summer showed the jury is still out on Microsoft, though over 90 percent said they plan to spend the same or more with Microsoft in the coming year.

Microsoft will need all the brand equity it can get as it tries to convince customers to give its next generation of Windows software and devices, as well as cloud services, a chance.

Posted by Jeffrey Schwartz on 09/30/2013 at 3:10 PM0 comments


Enterprise Social Network Firm Axceler Is Now ViewDo Labs

In the wake of Metalogix's acquisition of Axceler's SharePoint business, the remainder of the company is charting a new path and identity. The company, now called ViewDo Labs, will emphasize enterprise social networking, not just for SharePoint but for all collaborative environments.

Part of the deal when Metalogix acquired Axceler's SharePoint business last month was that Metalogix would retain the Axceler brand, hence the new name and focus. ViewDo Labs describes itself as a provider of enterprise social network (ESN) analytics and governance tools. Michael Alden, Axceler's former president and CEO, will hold the same title with ViewDo.

"Our team's goal is to help organizations increase adoption of ESNs by offering visibility into what employees are collaborating on and what platforms are most successful," Alden said in a statement.

ViewDo will release a cross-platform enterprise social network analytics service called ViewPoint Enterprise, which offers a common dashboard view of activity across a variety of enterprise social network environments, including SharePoint, Yammer and Salesforce.com's Chatter. The company plans to support other enterprise social networking platforms in the future.

ViewPoint is designed to help increase usage of these networks while reducing overall risk, such as confidential data getting into the wrong hands. The company says ViewPoint will track who is collaborating and what's driving adoption, and will help determine business metrics from that collaboration.

According to a survey fielded by ViewDo, 46 percent of enterprises don't measure overall productivity and large enterprises are having the most difficult time getting users to adopt social collaboration tools in the workplace.

A trial version of the software is available for download now and is slated for general availability next month.

Posted by Jeffrey Schwartz on 09/29/2013 at 4:32 PM0 comments


Surface 2's 'Workplace Join' Adds MDM and Security Features

Microsoft's first stab at tablets, the Surface RT and Surface Pro, has made few inroads in enterprises, which is hardly a surprise given this summer's $900 million write-down of the devices. The company is hoping it can do better with the Surface 2 and Surface Pro 2, launched Monday and due to ship next month.

One of the biggest non-starters for IT managers with the Surface RT -- and the Windows RT operating system that runs it -- is that IT can't join the tablets to Active Directory domains. While the forthcoming Surface 2 can't be domain-joined either, Windows RT 8.1's new "workplace join" feature could make the new tablets more palatable to some IT pros.

As Kurt Mackie reported in June, the "workplace join" feature adds a security safeguard that ensures only registered devices can connect to a company's data. Microsoft didn't mention it during Monday's launch event in New York, but right after the formal presentation I sat down with Surface director Cyril Belikoff, who emphasized the feature.

"Workplace joins are the access components of a directory service that allows a user to use their ID and password to access their corporate network documents and shares in a secure way," Belikoff said. "It's not a fully domained device but you get the administration of mobile device management and get the access component."

On the server side, IT can assign access rights to document shares and files on the network, and the devices can be managed with MDM tools from MobileIron, AirWatch and Microsoft's own Windows Intune device management service.
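
The distinction Belikoff draws -- a registered device gets scoped access to corporate resources rather than the full access of a domain-joined PC -- can be expressed as a simple policy check. The Python sketch below is purely conceptual; the resource names and access tiers are invented for illustration and are not Microsoft's implementation.

```python
# Conceptual sketch of the access tiers described above -- not Microsoft's code.
FULL_ACCESS = {"file-shares", "intranet-apps", "admin-tools"}
SCOPED_ACCESS = {"file-shares", "intranet-apps"}   # what a workplace-joined device gets

def resources_for(device_domain_joined: bool, device_workplace_joined: bool,
                  user_authenticated: bool) -> set:
    """Return the set of corporate resources a device/user combination may reach."""
    if not user_authenticated:
        return set()                                # no valid ID/password, no access
    if device_domain_joined:
        return FULL_ACCESS                          # fully managed, domain-joined PC
    if device_workplace_joined:
        return SCOPED_ACCESS                        # registered device, e.g. a Surface 2
    return set()                                    # unregistered personal device

# Usage: a workplace-joined Surface 2 with valid credentials.
print(resources_for(device_domain_joined=False,
                    device_workplace_joined=True,
                    user_authenticated=True))
```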

But he acknowledged that might not suit organizations that mandate all devices be joined to an Active Directory domain. "Surface Pro has all the security and management functions those IT organizations need," he said.

Are the Surface 2 and Windows RT 8.1 more appealing to you with workplace join?

Posted by Jeffrey Schwartz on 09/27/2013 at 12:50 PM2 comments


Is Ford CEO Alan Mulally the Prime Candidate To Replace Ballmer?

Ford CEO Alan Mulally is reportedly Microsoft's lead candidate to replace Steve Ballmer, who is set to retire within the next year.

Once viewed as a longshot, Mulally has risen to become the top contender to replace Ballmer, All Things D's Kara Swisher reported yesterday, though it's not clear whether he would take the job if offered. Mulally's work turning around Ford is not done, and he has promised the automaker he would stay on board through the end of 2014.

Swisher, who noted that Mulally is one of the few CEOs who has candidly responded to her e-mails -- and on the record -- said he has gone dark since sources told her he's a front-runner. Bloomberg TV suggested the leak may have been aimed at putting pressure on Mulally to consider breaking his promise to Ford.

It's not surprising that Gates and Ballmer might want to see Mulally take the job. Ballmer himself has deep ties to Ford: his father Frederic was a manager there, and Mulally reportedly advised Ballmer in advance of the company's latest restructuring. The new One Microsoft even echoes One Ford, the automaker's own mission statement to reinvent itself. Indeed, Mulally won huge plaudits for saving the company without accepting the government bailouts that General Motors and Chrysler took back in 2009.

Until yesterday's report, many pundits were betting former Nokia CEO and onetime Microsoft executive Stephen Elop was among the top contenders to take over the top spot at Microsoft. Swisher's report said he remains a top candidate, along with Tony Bates, the former CEO of Skype, which Microsoft acquired in 2011. Microsoft also reportedly has contacted another key former Microsoft exec, Paul Maritz, the former VMware CEO who now runs that company's spinoff, Pivotal.

Ford COO Mark Fields today told reporters at a fundraising event in Detroit that Mulally remains "focused" on the automaker's long-term turnaround.

Would you like to see Mulally take the reins of Microsoft, or would you prefer the board choose an insider or a former company leader for the top slot?


Posted by Jeffrey Schwartz on 09/27/2013 at 12:49 PM3 comments


SharePoint Experts Say Cloud Collaboration Is Inevitable

While the recent NSA leaks have caused IT decision makers and individuals alike to reconsider their use of cloud services to store and selectively share their files, several SharePoint experts today said they believe online collaboration is inevitable. Nevertheless most mid- and large-size organizations that are using cloud-based implementations of SharePoint -- or considering doing so -- will continue to employ a hybrid strategy.

That was the consensus of a number of participants, including myself, in a Tweetjam organized by Metalogix SharePoint evangelist Christian Buckley this morning. "Companies will use more and more hybrid scenarios to find their way into cloud offerings," said Michael Greth, a SharePoint MVP, specialist and community leader.

Antonio Maio, a product manager at TITUS and a SharePoint MVP, agreed. "Hybrid will be around much longer than previously anticipated," he said. "Reality of difficult change."

That change has become more difficult following this summer's NSA leaks. For example, 46.3 percent of 300 Redmond magazine readers responding to an online survey said they were concerned about potential government surveillance and 13 percent said they're pulling back on such initiatives. But the economics of cloud computing are going to ultimately dictate more usage not less over time, many predict.

"You will be left behind if you aren't in the cloud and left to watch the people use the nice new features," tweeted Ben Henderson, client services manager at Colligo. Nevertheless  it will remain a hybrid world for the foreseeable future, Buckley predicted. "Data storage/ownership issues will get resolved," he said.

So what do organizations need to do to prepare to move to the cloud? Buckley tweeted, "1) priorities of each workload, 2) governance strategy between cloud and on prem systems, 3) THEN focus on new functionality."

You can view the entire discussion and feel free to weigh in here.  


Posted by Jeffrey Schwartz on 09/25/2013 at 12:40 PM2 comments


Dell Apparently Last To Abandon Windows RT

It looks like Microsoft is going it alone with tablets running its Windows RT operating system -- the version of Windows 8 that runs only applications that support its Modern UI accessed from the Windows Store.

As reported by The Verge this morning, Dell's XPS 10 tablet, which sported an optional keyboard, is listed as unavailable in Dell's online store. As an alternative, Dell recommends the $499 Latitude 10, powered by an Intel Atom processor and running the full version of Windows 8. As noted last month, the few PC suppliers offering Windows RT tablets were cutting bait, including Acer, Asus, Lenovo and Samsung. I noted at the time that Dell appeared to be the last one standing with the XPS 10, though it could only be found online.

Though Dell is scheduled to announce new tablets Oct. 2, analysts told PCWorld that one running Windows RT 8.1 appears unlikely, leaving Microsoft as the last man standing. That's my bet as well, though it remains to be seen whether OEMs will give Windows RT another chance.

For now, the future of Windows RT is in Microsoft's hands as it releases its second-generation Surface 2, announced on Monday and scheduled for release next month. The question is whether business users will give this revamped model a shot, or whether Microsoft's near-term future in tablets is aligned with the full-blown Windows 8.1. What's your take? Comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 09/25/2013 at 12:41 PM0 comments


Will NSA Surveillance and Security Threats Inhibit Your Cloud Collaboration Plans?

When Edward Snowden revealed the National Security Agency's covert surveillance of communications, carried out with the cooperation of the largest service providers, it validated what many cynics and security experts presumed as fact. But the leaks outlining blatant surveillance involving Verizon, Google, Facebook and Microsoft, among others, caught the large universe of IT decision makers off guard.

As my colleague Chris Paoli and I have noted since the June leaks, many organizations were alarmed to read reports that the NSA could access data stored in the public cloud even though the U.S. government has insisted it was only doing so to track suspected terrorist activities overseas. The revelations also came as a growing number of organizations were starting to use Microsoft's Office 365 and SharePoint Online.

In the wake of the reports, many organizations have pulled their cloud deployments back in house. The Information Technology and Innovation Foundation, a Washington, D.C. think tank, said cloud providers in North America alone stand to lose between $21.5 billion and $35 billion in revenue by 2016 after it came out that the NSA was invoking the Foreign Intelligence Surveillance Act (FISA) and the Patriot Act for programs such as PRISM to obtain and mine data to investigate suspected threats.

I don't believe the damage will be that severe, though. A survey of Redmond magazine's readership last month revealed many shops are putting cloud migration plans on hold, while others are retreating from projects already under way. According to our online survey of 300 readers, 35 percent are putting planned projects on hold in the wake of the NSA leaks, while 13 percent brought cloud projects back in house.

That and other findings from the survey are the basis of our October cover story, which we'll be publishing next week.

But given the large universe of SharePoint administrators, developers and users who make up the readership of Redmond magazine, many decision makers are having new reservations over whether to run SharePoint and other collaboration and enterprise social media tools in the cloud. So when Christian Buckley, the SharePoint MVP evangelist at Metalogix, invited me to participate in a Tweetjam on Security and Cloud Collaboration, I was happy to participate. It takes place Wednesday, Sept. 25 at 11 a.m. ET/8 a.m. PT.  If you can't make it, you can review the comments at your leisure.

Participating in the Tweetjam are a number of SharePoint MVPs: Antonio Maio (@antonionmaio2), product manager at TITUS; Eric Riz (@rizinsights), executive vice president at Concatenate; Tom Resing (@resing), systems engineer at Jive and a SharePoint MCM; and Michael Greth (@mysharepoint), SharePoint specialist and community leader. Buckley, who organized the Tweetjam, outlined in a blog post the questions it will cover:

  • Are most organizations ready to move to the cloud?
  • Did the NSA data breach in the US by Snowden affect your near or long-term plans for the cloud?
  • What are the key risks with moving to the cloud, and what can companies do to mitigate them?
  • Which workloads are most effective in the cloud?
  • How can Microsoft make their cloud options more viable?
  • What 3 things should companies prepare for in their move toward the cloud?
  • What are your cloud predictions for the next 2 to 3 years? 

If these are questions on your mind and you're free, join in the discussion and share your views. Just click here to participate. You can follow me on Twitter @JeffreySchwartz.

Posted by Jeffrey Schwartz on 09/24/2013 at 3:42 PM0 comments


Will Displaced Nirvanix Customers Seek Refuge with Microsoft?

When Nirvanix last week abruptly informed customers they had weeks to find a new cloud storage provider because it was shutting down its operations, it left more than 1,000 enterprises scrambling to save their data. Many are likely to turn to Amazon Web Services, which has the most mature and advanced cloud infrastructure, but Microsoft will also likely become a beneficiary of Nirvanix's demise.

Initially Nirvanix told customers last Monday they had two weeks to find a new home for their data, but later in the week the company extended the deadline to Oct. 15. Still, for those with terabytes or even petabytes of data stored in Nirvanix datacenters, moving all of that data in less than a month is a tall order. Further adding to the difficulty, Nirvanix doesn't have the most robust network infrastructure, and it is being heavily taxed as all of its customers try to pull their data out at once, explained Andres Rodriguez, founder and CEO of Nasuni, which provides its own storage service that once used Nirvanix as its back-end target.

Rodriguez last week told me he saw this coming long ago and tried warning some of his customers who he knew were using Nirvanix for some of their cloud storage that he believed Nirvanix was at risk of going out of business. Now every Nirvanix customer is trying to get their data out. "What's happening now with Nirvanix is the equivalent of bank rush," Rodriguez said. "Everyone is trying to get their data out in a hurry and you know what that does to a network, and it's going to be very hard to get their data out."

When Nasuni used Nirvanix as its cloud storage provider two years ago, Rodriguez became increasingly concerned that it couldn't scale. Nasuni now uses Amazon Web Services Simple Storage Service (S3) for primary storage. Nasuni runs annual tests against what Rodriguez believes are the largest cloud providers. The most recent test results released earlier this year concluded that Amazon S3 and Windows Azure were the only two viable enterprise-grade storage services.

Nasuni just added a mirroring option that lets customers replicate their data stored in Amazon S3 to Windows Azure for added contingency. While Rodriguez believes Amazon S3 and Windows Azure are the most scalable and resilient, he warns it could be years before the majority of customers feel comfortable using the latter as their primary target.
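
To make the mirroring idea concrete, here is a minimal sketch of copying objects from an Amazon S3 bucket into an Azure Blob Storage container using the current Python SDKs (boto3 and azure-storage-blob). It is illustrative only, not Nasuni's implementation; the bucket, container and connection-string values are hypothetical.

# Illustrative sketch: mirror objects from an S3 bucket into Azure Blob Storage
# for contingency. Not Nasuni's implementation; the names below are hypothetical.
import boto3
from azure.storage.blob import BlobServiceClient

S3_BUCKET = "primary-data"                      # hypothetical source bucket
AZURE_CONN_STR = "<azure-storage-connection-string>"
AZURE_CONTAINER = "mirror"                      # hypothetical target container

s3 = boto3.client("s3")
blob_service = BlobServiceClient.from_connection_string(AZURE_CONN_STR)
container = blob_service.get_container_client(AZURE_CONTAINER)

# Walk the bucket page by page and copy each object into the Azure container.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=S3_BUCKET):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        body = s3.get_object(Bucket=S3_BUCKET, Key=key)["Body"].read()
        container.upload_blob(name=key, data=body, overwrite=True)
        print(f"mirrored {key} ({len(body)} bytes)")

A production mirror would stream large objects and copy only what has changed rather than re-reading everything, but the basic flow is the same.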

The Nirvanix demise validates warnings that it's far easier to upload data to the cloud than to recover large quantities of it, and it underscores the need to have contingency and migration plans in place, said Forrester Research analyst Henry Baltazar. "The recent example with Nirvanix highlights why customers should also consider exit and migration strategies as they formulate their cloud storage deployments," Baltazar said in a blog post last week. Now they have a difficult task in gathering their data, he noted.

"One of the most significant challenges in cloud storage is related to how difficult it is to move large amounts of data from a cloud. While bandwidth has increased significantly over the years, even over large network links it could take days or even weeks to retrieve terabytes or petabytes of data from a cloud. For example, on a 1 Gbps link, it would take close to 13 days to retrieve 150 TB of data from a cloud storage service over a WAN link."

Gartner analyst Kyle Hilgendorf also emphasized in a blog post that failure to have an exit strategy when using a cloud service, especially for storage, can be a recipe for disaster. As for this week's Nirvanix news, Hilgendorf said: "What are clients to do? For most -- react...and react in panic. You may be facing the worst company fear -- losing actual data."

Posted by Jeffrey Schwartz on 09/23/2013 at 12:52 PM0 comments


Apple's iOS 7 Includes Improved Enterprise Management Capabilities


In addition to the 9 million who stood in line to get their new iPhones over the weekend, millions of existing iPhone and iPad users were able to download the new iOS 7, released late last week. For those who are holding off, enterprise IT managers have good reason to encourage (perhaps even insist) that users upgrade their iOS devices to the latest operating system.

Here's a list of some of the new capabilities in iOS 7 that Apple highlighted, which promise to improve security and management of corporate data accessed on user-owned devices:

Protect Corporate Data
IT can now manage which applications and accounts are used to open documents and attachments. IT can prevent users from opening personal documents from managed apps, while allowing administrators to configure a list of apps available in the sharing panel.

Per App VPN
Administrators can determine which apps can connect to the VPN, ensuring data transmitted by managed applications only goes through the VPN while personal activities do not travel through it.

App Store License Management
Businesses can now purchase apps on behalf of users while maintaining ownership of the apps and retaining control of the licenses. Enterprises can now purchase licenses through the Volume Purchase Program (VPP) site and use their mobile device management (MDM) platform to assign apps to users. It lets employees enroll with their personal Apple IDs without providing them to the enterprise. IT can also revoke apps and reassign their licenses to others. The VPP also now supports the purchase of Mac apps.

MDM Enhancements
IT can now set up managed apps wirelessly, install custom fonts and configure accessibility options. IT can configure company owned devices in line with corporate settings and policies. It also supports highly managed deployments.

Enterprise Single Sign-On        
Now users can sign on and authenticate across apps including those from the App Store. All apps configured with SSO verify user permissions to access enterprise resources, logging users in without having to re-enter passwords.

Improved Exchange Integration
Exchange 2010 users can synchronize their notes with Outlook, while they can now view PDF annotations. Search is also improved.

Have you tested any of the new enterprise management features added to iOS 7? Share your observations below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 09/23/2013 at 2:43 PM0 comments


4 Ways Microsoft Plans To Grow

Microsoft's departing CEO acknowledged his biggest regret was failing to lead Microsoft into the smartphone and tablet worlds before Apple and Google, which he blamed on Microsoft's botched development of Windows Vista a decade ago.

"I regret that there was a period in the early 2000s when we were so focused on what we had to do around Windows, that we weren't able to redeploy talent to the new device form factor called the phone," Ballmer said at Microsoft's annual financial analyst meeting yesterday. "The time we missed was about the time we were working away on what became Vista, and I wish we'd probably had our resources slightly differently deployed, let me say, during the early 2000s.  It would have been better for Windows and probably better for our success in other form factors."

It will be up to Microsoft's next CEO to take the company forward but Ballmer made clear it will be along the strategy he rolled out in July.

"My greatest desire will be to see the company be so much more successful four or five years from now than it is today," Ballmer said. "That will be grounded in three things:  Number one, we've picked a great new CEO.  Number two, we made fundamental bets before that new CEO came out.  The strategies that we put in place, the approach that I've put in place that the board has endorsed -- those things are important.  The new CEO is important.  And the leadership team and the talent pool that is here is very important."

Ballmer also emphasized Microsoft's financial strength. "The ability to invest, to get into new areas, to think about the next big thing, none of you should take that casually as an investor," he said. Much of what Microsoft outlined at the financial analyst meeting we've heard before, but here are four points Ballmer and the executive team emphasized at the meeting:

One Microsoft Will Empower Devices and Services
Microsoft believes the employee is critical to creating demand for its enterprise IT infrastructure and cloud services. "We do believe that consumers drive a lot of what's coming into the enterprise," Ballmer said. "When I was in Office, Excel was brought into the enterprise because consumers liked Excel and brought it in over Lotus [1-2-3].  We've seen the same thing with the iPad.  Consumers have the iPad and bring it into the business. So we think these end points of these desirable, high-quality devices, combined with the power of our high-value scenarios and services is what's going to propel us forward into reaching customers the way we'd like to."

Having brought the operating systems groups together two months ago, the company is moving forward with tying its platforms together. "We really should have one silicon interface for all of our devices," said Terry Myerson, executive VP for Microsoft's recently formed operating systems group, tasked with unifying Windows. "We should have one set of developer APIs on all of our devices. And all of the apps we bring to end users should be available on all of our devices." Myerson said the focus on devices will be to ensure they are even more integrated with cloud services. "Whether we're branding them Windows or Xbox, we really need one core service which is enabling all of our devices," Myerson said. "We have a very clear vision of what we want to get done, and we're moving very fast."

More Apps Coming
Despite the fact that the Microsoft Store lags the Apple iTunes App Store and Google Play in terms of the number of apps available, Myerson believes the company will catch up. "We really are working our angles on this locally, globally," he said. "We're looking at all of it. We're looking at domains for ISVs. We're looking at consumer apps. We're looking at enterprise apps. We're looking at how the virtuous cycle works on other platforms, how it should work on our platform. And it really is a top priority for me and my team, and we're working it."

Satya Nadella, executive VP for Microsoft's cloud and enterprise group added: "We're really building out our tooling across all of our assets and enabling these developers to exploit our broadest platform, and I think that's another source of innovation around our platforms that I think will translate into sort of unique app experiences for our platforms."

Stop Google
Microsoft is not throwing in the towel on search and it will relentlessly pursue taking on Google from all angles including seeking relief from the antitrust authorities that were once the bane of its own existence. "I do believe that Google's practices are worthy of discussion with competition authority, and we have certainly discussed them with competition authorities," Ballmer said.  "I don't think their practices are getting less meritorious of discussion.  We've highlighted some of their bad practices in our advertising, in our discussions with regulators, the bundling that they're doing with YouTube and Google Maps and some other things.  Anyway, suffice it to say that I think they need pressure from competition authority.  I think they need pressure in the marketplace."

That pressure isn't just going to rely on regulators reining in Google but also on further investments in Bing, Ballmer added. For instance, he cited working with Apple to integrate Bing search into the Siri voice assistant.

Eat its Own
Cannibalization is very much in play, COO Kevin Turner said. "As we move people from on-premise to cloud, yes, we're willing to cannibalize ourselves to do it," Turner said. "And we've embraced that. And when I said that transition is underway, it's underway." Citing Office 365 as a case in point, he added: "I think the cannibalization stuff is imminent."

 

Posted by Jeffrey Schwartz on 09/20/2013 at 2:24 PM0 comments


Are SharePoint Experts Earning Less This Year?

While our annual Redmond Magazine Salary Survey showed average wages increasing 3 percent, we discovered a surprising anomaly in this year's report: SharePoint experts are making less this year.

Granted, the average SharePoint administrator or developer still earns a six-figure salary, but our survey found a decline of 7 percent: Last year the average SharePoint salary was $107,063; this year it dropped to $100,817.

Given the growth of SharePoint and the release late last year of SharePoint 2013 as well as a substantial upgrade to the SharePoint Online feature of Office 365, I found that surprising.

"I've noticed that trend also and I don't understand that either because it seems everybody wants to use it more," said survey respondent Chris, who told me he was equally surprised when I ran the number by him. "I see SharePoint expertise required in job descriptions all the time."

Could it be that there are more SharePoint experts out there, thereby giving employers a larger pool of candidates to select from? Or is it perhaps an aberration in this year's survey?

If you're a SharePoint expert, or one who hires them, please share your observations. You can comment below or send me a note to [email protected].

Posted by Jeffrey Schwartz on 09/18/2013 at 2:20 PM0 comments


Is Microsoft Finally Planning a Siri-Like Feature for Windows Phone?

It appears Microsoft is planning a voice assistant for its Windows Phone platform after all. As Mary Jo Foley noted in her Redmond magazine column this month, Microsoft has spent the past decade working on a natural language platform, though it has stepped up its effort over the past two years.

Since that column went to press, it appears she, along with Tom Warren of The Verge, unearthed information that points to a Siri-like voice assistant coming to Windows Phone early next year. In a post on her All About Microsoft blog last week, she said technology Microsoft has under development, code-named "Cortana," aims to compete with Apple's Siri and Google Now.

While Windows Phone already understands basic voice commands, according to her sources Cortana, named for an artificially intelligent character from Microsoft's Halo game series, "will be able to learn and adapt, relying on machine-learning technology and the 'Satori' knowledge repository powering Bing."

In addition to Windows Phone, Foley learned Cortana could work its way into the core Windows and Xbox operating systems because Microsoft is adding it to its entire services-enabled "shell" -- the services Ballmer described in his July reorg letter.

Following her report, Warren learned Microsoft is testing the Cortana UI's ability to gather notifications, weather reports and calendar information. It also uses location information and has access to Bluetooth controls.

Whether you're a fan of voice assistants, or at least the concept of them, it appears all device makers will be focusing on advancing the technology sooner rather than later. If they get it right, as Google appears to have with its new Moto X phone, that can be a good thing, especially for those who can't seem to keep their hands off their devices.

Posted by Jeffrey Schwartz on 09/16/2013 at 2:59 PM0 comments


Microsoft's Paying for Used iPads

Microsoft's latest attempt to garner interest in its Surface hybrid tablet-PCs is a trade-in program the company launched that will let customers swap out their iPads for a $200 gift card applicable toward a Surface.

Actually the $200 gift card can be used toward anything offered in the Microsoft Store but Microsoft is obviously trying to get people to trade in their iPads for a Surface RT or Surface Pro.

The company quietly kicked off the deal last week but word of it started to spread in recent days. The offer runs through Oct. 27, though I'd bet it will remain permanent – unless it turns out that's the date the new Surfaces are available.

Nevertheless, I'm curious how many people will actually take Microsoft up on this trade-in offer. Most people I know love their iPads and have no desire to get rid of them. Also, the oldest unit Microsoft will take is the iPad 2. In most cases you can get more for it by selling it on eBay or elsewhere than by trading it in to the Microsoft Store.

Perhaps if your device is broken it's a good deal. While the fine print didn't say the device must work, it did say it's up to the store manager's discretion.  

But the move underscores the increasingly aggressive posture Microsoft is taking toward Apple -- following an ad campaign showing Siri explaining everything an iPad can't do that users can perform on a Surface.

Still, I'll be surprised if many people take Microsoft up on its latest offer. Will you?

Posted by Jeffrey Schwartz on 09/13/2013 at 1:37 PM0 comments


New iPhone 5S Could Raise the Bar for Smartphones

By now you've probably heard Apple launched two new iPhones yesterday, the 5S and the 5C. The long-rumored devices were equipped as expected but analysts gave the lower-end 5C a thumbs-down due to its higher-than-expected price tag and lack of broad appeal to emerging markets.

Many reports have described the new higher-end iPhone 5S as an incremental upgrade but I think it could prove to be a key indicator whether the smartphone market will embrace fingerprint-based authentication. It uses technology Apple gained when it acquired AuthenTec for $356 million last year, which lets users unlock their phones and make purchases on the iTunes App Store, instead of using passwords.

The phone's new Touch ID apparently doesn't let users authenticate to other sites but I would imagine the company is waiting to see how users accept this new feature and iron out any kinks before broadening its use. Should users widely embrace this form of authentication on smartphones, I would expect to see all of the other players rush to add similar features. It would naturally spread to tablets as well.

Apple's new iPhone 5S also sports a vastly improved camera, which will put it in competition with the camera offered on Nokia's new Lumia 1020. The sensor on the iPhone 5S is improved and 15 percent larger, the phone has larger pixels and it sports a new ƒ/2.2 aperture. The camera also has an improved LED-based flash. The new ARM-based A7 CPU doubles the speed of the iPhone and is the first based on a 64-bit processor.

As I noted, the 5C is intended as a lower-cost unit, and while it will be available in the U.S., analysts had been banking on Apple breaking into emerging markets with it. Unlike previous units, the 5C is made of plastic rather than aluminum -- and lacks some of the newer features in the 5S such as the improved camera and Touch ID.

The $99 price tag (with a two-year contract), though half the cost of the 5S, was higher than most analysts expected. The phone is not scaled down to the extent anticipated and doesn't appear targeted at emerging markets, according to analysts, who noted it will compete with the newly priced iPhone 5. "The market placement was different than our prior expectations," wrote Piper Jaffray analyst Gene Munster in a research note. "Previously we thought the 5C would be a lower end phone to address emerging markets like China and India."

Apple also didn't announce an anticipated partnership with China Mobile to offer a lower-end iPhone. "We believe that based on the pricing of the 5C ($549 unlocked), the phone is only a replacement for the 5 instead of downgrading the old device, and still plays in the higher end of the market," Munster wrote.

As for the 5S, Munster, a noted Apple bull, also believes Touch ID could pave the way for a broader payments offering. "We view security as the biggest hurdle in offering a successful payments platform and the biometric feature may be able to address that in iOS 8," he noted.

Do you see fingerprint-based authentication becoming the mainstream on all smartphones? Would you like to see it replace passwords? Feel free to comment below or drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 09/11/2013 at 9:09 AM0 comments


Microsoft Reportedly Readying Next Generation of Surface Tablets

Microsoft is hoping those in the market for a hybrid tablet PC will find the second wave of Surface devices more appealing than the first. We'll get a better sense of that later this month when Microsoft likely unveils new Surface RT and Surface Pro devices.

The company is expected to disclose more details at an event in New York. An invitation to the event only indicated Microsoft will discuss "Surface growth and expansion, specifically pertaining to the commercial and business enterprise sector" and "Microsoft's ongoing commitment to Surface and its business customers."

Microsoft declined to elaborate but several press leaks about both devices appeared this week suggesting a new Surface Pro equipped with Intel's long-awaited Haswell processor is in the works (Intel said yesterday those processors are coming shortly, though it didn't specify when).

A Surface Pro with a Haswell chip, which I've long presumed would be in version 2 of the higher-end unit, will address a key objection to the first Surface Pro with regard to power use. An upgraded Surface Pro unit would support 8 hours of battery life, noted Paul Thurrott at Supersite for Windows, rather than the meager four hours that its predecessor offered.

Also reportedly in development is a new Surface RT 2, according to Neowin.net. The upgrade will have the more-powerful Tegra 4 processor, a 1080p display, a two-step kickstand and potentially support up to 8 GB of RAM (the current device maxes out at 4 GB). The site earlier this year reported that a Surface Mini is also in the works.

It will run on a "variant" of the Windows 8.1 RT operating system, sources tell All About Microsoft's Mary Jo Foley.

It's unclear when the new devices will actually ship but presumably the company will want to make them available in the critical fourth quarter holiday shopping season. Do the upgrades sound more appealing to you than the first versions?

Posted by Jeffrey Schwartz on 09/11/2013 at 1:01 PM0 comments


Split Up Microsoft? Most IT Pros Say No

With Microsoft now in search of a new CEO, the noise level suggesting Microsoft should split itself up into multiple companies is predictably up. Yet the steps that "retiring" CEO Steve Ballmer and Microsoft's board have taken make that seem unlikely in the short term, though of course in business anything can happen.

But with last week's deal to acquire Nokia's devices business for $7.2 billion -- short of it unexpectedly imploding before the deal closes early next year -- Microsoft is on a path to add parts, not remove them. Even so, the latest buzz theorizing a Microsoft divestiture comes from Nick Wingfield, the Seattle-based reporter with The New York Times, who today stitched together a scenario that has Microsoft being split into four businesses, based on interviews with longtime Microsoft watcher Rick Sherlund of Nomura Securities and other experts.

Wall Street always becomes infatuated with the notion of breaking up companies investors feel they can chop up to make a quick buck. Sometimes divestitures happen for good reason and they're successful.  While most IT pros might not care if Microsoft were to spin off its Xbox gaming business or Bing search engine, most have a stake in the ties between the client, server and the cloud. Hence few IT pros have interest in Microsoft breaking itself into multiple businesses, especially if those ties are broken.

An online survey of more than 1,100 Redmond magazine readers fielded immediately after Microsoft announced that Ballmer will retire as CEO in the next year shows only 12 percent feel Microsoft's next CEO should split Microsoft into multiple companies. So what do they think the next chief executive should do?

Nearly half, or 49.4 percent, believe the next CEO needs to focus on breaking down the silos and fiefdoms that many believe have stifled Microsoft's ability to provide common services across different product lines. Ballmer's "One Microsoft" strategy, which is the basis of his July corporate re-org, aims to break down those silos. Just as Ballmer solidified his "devices and services" strategy with the Nokia deal, the new CEO will inherit One Microsoft.

Nearly a quarter of respondents (24.7 percent) believe the new CEO should make major acquisitions that would embolden the company's "devices and services" strategy (ironically we fielded the survey just days before Microsoft announced it plans to acquire Nokia's handset business). Only 13.9 percent believe Microsoft should make moves into new markets.

Just as many investors and others on Wall Street had long wanted Ballmer to go, more than half (58.2 percent) of Redmond readers also welcome the change in leadership. Only 10.6 percent were disappointed in the news, believing only Ballmer or Gates could run Microsoft, while 31.2 percent don't believe Ballmer's departure changes anything.

"There's a lot of finger pointing and blame to go around," said Directions on Microsoft analyst Rob Sanfilippo. "Whether Ballmer has caused some of the issues at Microsoft over the last decade, I think [it] is a lot more complicated than could be blamed on one person."

Some survey respondents agree, saying Ballmer deserves more credit than he received. "Microsoft's rise in the enterprise over Ballmer's tenure is explosive, and he's on the right road to fix the consumer missed opportunities," one respondent said. "Mr. Ballmer is a one-of-a-kind individual, in every positive sense of the word," adds another. "He's a human dynamo -- one of the few people in the world besides Bill Gates who could keep a juggernaut like Microsoft on the track."

One reader sees Ballmer's departure as an opportunity for Microsoft to reevaluate everything. "Ballmer's retirement is a chance for Microsoft to reinvigorate itself," the reader said, adding his successor needs to reevaluate Microsoft's Windows strategy, restore the traditional (Windows 7) interface, work closely with application vendors in regard to mobile versus desktop use and make Windows 8 really effective for tablets and phones. The new CEO also must work to make cloud-based applications "the most secure [and] effective available and sell subscriptions at a very reasonable price-point."

While many are relieved Ballmer is leaving, they're also anxious about the uncertainty new leadership might bring. "His successor needs to first define why Microsoft is in business and not just follow Ballmer's plan of [becoming a] services and devices company."

Many survey respondents believe Microsoft's next CEO must accelerate Microsoft's break with past protective practices, such as offering Office on Apple's iOS platform, while breaking down the political divisions across business units that have stifled innovation.

"Whoever succeeds Ballmer must find a way to continue to break down silos and depoliticize if that's even possible for Microsoft, and accelerate innovation," one concluded. "Microsoft may survive forever as an 'also [ran]' company but their time in the limelight (publicly and to investors) will never return if they don't break out of their current ways including kingdom building and other such silliness. I am not convinced that hardware can save Microsoft... too little too late as there are far too many players that are better at it, and faster."

Respondents were divided on what type of leader Microsoft should seek with 41.4 percent favoring an outside tech visionary in the mold of Steve Jobs and 38.8 percent preferring a turnaround expert such as Lou Gerstner, credited with bringing IBM back from the brink in the 1990s, despite coming from outside the technology industry.

Microsoft's next CEO has one thing going for him or her: A vast majority (69.2 percent) say the company will remain a key provider of enterprise software and services. But a healthy one in four (26.2 percent) say Microsoft will become a less influential player over time, while 4.2 percent believe the company will become marginal. And while a return by Gates is extremely unlikely, 9.2 percent believe only he can save Microsoft. Absent that happening, one reader said, "They need to find someone who is a technology-first person like Gates was, not a marketing-first person as Ballmer is."

Time will tell how history treats Ballmer's tenure but for now, many respondents believe Microsoft needs a leader with more vision and less bluster. "Under [Ballmer's] leadership Microsoft has deeply hurt [its] reputation with IT professionals like [me]," opined one critic. "They need someone with enterprise IT vision who understands that driving wedges with IT professionals gives us the freedom to think past our fears of moving our shops to other products."

Posted by Jeffrey Schwartz on 09/09/2013 at 2:19 PM0 comments


Icahn Walks Away from Battle for Dell

After a long and protracted battle, activist shareholder Carl Icahn has walked away from his effort to acquire a controlling interest in Dell, though he hasn't backed away from his insistence that shareholders are accepting an inferior deal.

The tide was clearly against Icahn and partner Southeastern Asset Management, with the shareholder vote on founder and CEO Michael Dell's offer set for Thursday and all indications that the bid, backed by Silver Lake Partners with a lift from Microsoft, had enough support to win. But Icahn maintains the $13.75-per-share deal, valued at $25.5 billion, undervalues Dell, even though Michael Dell and Silver Lake nominally sweetened it in July. After a Delaware court last month ruled in favor of the Dell team's contention that it needed fewer votes in favor of the deal than Icahn's group claimed, momentum was in Dell's favor.

"Dell is paying a price approximately 70 percent below its ten-year high of $42.38," Icahn wrote in a letter to shareholders filed with the SEC today. "The bid freezes stockholders out of any possibility of realizing Dell's great potential."

While expressing regret that he was not successful in his bid, Icahn took solace in the fact that he forced Dell to revise the terms of its bid somewhat. "It certainly makes the loss a lot more tolerable in that as a result of our involvement, Michael Dell/Silver Lake increased what they said was their 'best and final offer.'"  

Posted by Jeffrey Schwartz on 09/09/2013 at 12:00 PM0 comments


Will Acquiring Nokia Help Microsoft Reinvent Itself?

Microsoft and Nokia certainly caught the tech world off guard earlier this week when Redmond said it was acquiring the core business of the struggling phone maker for $7.2 billion. And if you don't think this changes everything for better or worse, think again.

The notion of Microsoft buying any major hardware company, much less a phone maker, was once unthinkable to founder Bill Gates and lame-duck CEO Steve Ballmer. Now Ballmer has effectively described this deal as the missing link to the "devices and services" company he wants Microsoft to be. The first thing to ask: Is that the path the board and the next CEO will see for Microsoft as well? Since they signed off on this deal, that appears to be the marching orders for now -- though we all know how things can change.

Apparently Microsoft investors don't agree Nokia is the missing link. The company's stock has been down since the deal was announced, basically erasing the surge in market cap Microsoft gained when announcing Ballmer's retirement.

Although the deal appeared to be dead after talks fell apart in June, Ballmer and his team have been negotiating all summer, according to a report in The New York Times on how the deal went down. So this is Ballmer's deal but he'll be long gone before he can accept credit or blame.

Critics of the deal argue it pairs two companies that are both afterthoughts in the mobile phone and tablet markets today. Two weak players don't necessarily add up to a strong one. Is this move indeed the missing link that can put Windows Phone and tablets running Windows 8 or Windows RT on the map or is it the ultimate act of desperation?

"I don't think Ballmer's vision matches up with reality," said independent industry analyst Jack Gold, of IT market researcher J. Gold Associates. "I just don't think Microsoft can pull this off effectively. I see it as a knee-jerk reaction to Apple and Google, rather than a real strategy to become a leader in the market."

Yet on a conference call Tuesday, Ballmer said he believes this deal will boost Microsoft's market share in the mobile phone market from 3 percent to 15 percent. An aggressive target, indeed, but depending on how Apple's iOS and the Google Android ecosystem play out, the deal would still render Windows Phone a much smaller player in the market. IDC Wednesday predicted Windows Phone's share will double by 2017 and will cover 10.2 percent of the smartphone market. Google's Android will be the dominant player with 68.3 percent and iOS will be in the middle with 17.9 percent.

Ballmer emphasized on Tuesday's call that acquiring Nokia's handset business will ensure Microsoft bolsters its share in the market -- which he deems critical. "We want to strengthen the overall opportunity for Microsoft from a devices and services perspective and for our partners as well," Ballmer said on the call. "We need to be a company that provides a family of devices -- in some cases we'll build the devices, in many cases third parties, our OEMs, can build the devices -- but a family of devices with integrated services that best empower people and businesses for the activities that they value the most."

The message is that Microsoft not only gains the ability to take charge of how Windows Phones are designed, delivered and marketed; it also gains the same capability to refine its Surface tablets and other hardware it decides to deliver.

As part of the deal, Ballmer said, Microsoft is buying the assigned rights in Nokia's IP license with Qualcomm and other key IP licenses. The company is also licensing, though not buying, patents that can work with Windows Phone and other Microsoft products. Microsoft is also licensing rights to use Nokia's HERE mapping and geospatial location technology, which it wants to use broadly in Microsoft products. According to various reports, Microsoft wanted to buy the patents and HERE technology outright, but Nokia didn't want to part with them. Microsoft sees HERE as critical to breaking Google's hold on mapping.

Whether or not this deal makes you want to run out and buy a Nokia Lumia or have visions of using Bing more often once HERE is integrated, it changes everything about how Microsoft will develop Windows and deliver on its "devices and services" mission.

Do you like this deal or are you concerned about the direction this takes Microsoft? Share your thoughts below or feel free to drop me a line at [email protected].

Posted by Jeffrey Schwartz on 09/06/2013 at 2:33 PM0 comments


Are You Looking for a New Job? You’re Not Alone.

Not happy with the bonus or raise you received this year? Perhaps you're long overdue for an increase, want more recognition or you have a boss who knows less about IT than you do.  If so, you're not alone.

The number of Redmond magazine readers who say they may change jobs has doubled since last year, according to the 18th Annual Redmond magazine Salary Survey, published Tuesday. Salaries overall are up 3 percent, slightly lower than the 3.25 percent average increase in last year's survey. But here's the most interesting finding in this year's report: More than a quarter of the 1,018 qualified respondents (26.8 percent) say they are weighing the possibility of changing employers, up from 13.1 percent a year ago.

The surge in those who plan to put out feelers suggests IT organizations are poised to see marked turnover, which is hardly surprising given that unemployment, though still uncomfortably high, continues to come down. The national unemployment rate early last month, when we analyzed the survey, was 7.4 percent, but IT unemployment was only 3.6 percent, according to the U.S. Department of Labor.

When I shared the findings of our survey with Mike Durney, CEO designate of Dice Holdings, which operates the popular job recruitment service for IT pros and developers, he wasn't at all surprised at the trend. Many IT pros and managers have been stuck in jobs where they're being asked to do more, while salaries have not increased at the pace they did before the financial crisis of 2008.

It all comes down to simple supply and demand, Durney and others I spoke with noted. "There are lots of reasons people are afraid to make the leap, but it looks like this year, they're finally starting to make that leap."

Are you making the leap? Feel free to comment or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 09/04/2013 at 2:06 PM0 comments


Microsoft Buying Nokia's Devices and Services Business for $7.18 Billion

In an unexpected reversal, Microsoft has agreed to acquire Nokia's devices and services business for $7.18 billion (EUR 5.44 Billion), in cash, both companies announced early Tuesday morning in Espoo, Finland, where Nokia is based.

The deal is Microsoft's second largest, exceeded only by the 2011 acquisition of Skype for $8.5 billion, and it puts an even larger bet on the company's expansion into hardware. Microsoft's third largest acquisition was aQuantive for $6 billion, which the company wrote off last year.

From a scale perspective, the deal is huge. When the deal closes in the first quarter of 2014, an estimated 32,000 Nokia employees will transfer to Microsoft, including 4,700 in Finland and 18,300 involved in manufacturing, assembly and packaging of products worldwide, the companies announced.

Terms of today's agreement, approved by both companies' boards, cover the acquisition of all of Nokia's devices and services business, the licensing of Nokia's patents and the licensing of its mapping services.

Such a deal seemed all but dead back in June when the two companies were reportedly in advanced discussions before talks broke down and it appeared unlikely the companies would renew negotiations.

That the two companies had consummated a deal is surprising considering there were no reports they had resumed negotiations. The timing is even more unexpected considering less than two weeks ago Microsoft announced CEO Steve Ballmer will retire within the next 12 months.

Ironically, one of the numerous candidates to succeed Ballmer is Stephen Elop, Nokia's CEO, who under the terms of today's agreement will return to Microsoft as an executive vice president.

"Building on our successful partnership, we can now bring together the best of Microsoft's software engineering with the best of Nokia's product engineering, award-winning design, and global sales, marketing and manufacturing," Elop said in a statement announcing the deal. "With this combination of talented people, we have the opportunity to accelerate the current momentum and cutting-edge innovation of both our smart devices and mobile phone products."

Speculation that Microsoft might one day acquire all or part of Nokia had surfaced back in 2011 when the handset maker chose Microsoft's Windows Phone as its smartphone operating system of choice.  But such a deal had remained remote even though both companies have struggled to gain share over the much more dominant phone and tablet platforms iOS from Apple and Google's Android.

Microsoft is presumably hoping that adding Nokia's handset business and related software to its arsenal will give it the scale to expand the Windows Phone platform. Given reports that Nokia is also developing a Windows RT-based tablet, Microsoft may also be betting that the deal will bolster its own struggling efforts to gain share with its Surface tablet line. Microsoft's first crop of Surface devices has been a disappointment -- the company in July took a $900 million charge on unsold inventory.

Even though the Nokia deal and Microsoft's bigger push into hardware may put the company at further odds with its PC and phone partners, Microsoft may be betting that just as Google has leveraged its acquisition of Motorola, Microsoft can do the same with Nokia using it to bolster its manufacturing capability and leverage Nokia's relationships with wireless carriers.

"It's a bold step into the future – a win-win for employees, shareholders and consumers of both companies," Ballmer said in a statement. "Bringing these great teams together will accelerate Microsoft's share and profits in phones, and strengthen the overall opportunities for both Microsoft and our partners across our entire family of devices and services. Nokia brings proven capability and talent in critical areas such as hardware design and engineering, supply chain and manufacturing management, and hardware sales, marketing and distribution." 

Posted by Jeffrey Schwartz on 09/02/2013 at 2:23 PM0 comments


Metalogix Acquires Axceler's SharePoint Business

Looking to round out its SharePoint migration and management platform, Metalogix Software Corp. today said it has acquired Axceler's SharePoint software business. The deal makes Metalogix one of the largest independent SharePoint ISVs. Terms of the deal were not disclosed but Metalogix is assuming ownership of the entire Axceler brand and SharePoint software portfolio. Neither company will have a stake in the other.

What remains of Axceler is its Lotus Notes business and a new suite of enterprise social media management tools called ViewPoint, launched earlier this year for Yammer and Salesforce.com Chatter environments, among others. Axceler CEO Michael Alden said on a conference call this morning that the company will unveil a new name next week.

It appears Metalogix CEO Steven Murphy wanted to keep his company focused purely on SharePoint migration and administration. Acquiring Axceler's SharePoint governance and permissions tools gives Metalogix a much broader offering, said Forrester Research analyst Alan Weintraub.

"The portfolio is larger which gives Metalogix a broader opportunity in the market," Weintraub said. "Metalogix growth was restricted by the limited amounts of capabilities they were delivering and their acquisition has allowed them to grow without completely overlapping."

While Axceler brings 3,000 new customers and 70 employees to Metalogix, Weintraub said this deal wasn't about bringing in new customers. Rather it was to allow Metalogix to expand into the important area of governance and permissions management, which complements its existing archiving and capacity management suite.

One area where the two companies overlap is SharePoint migration. Metalogix offers its Migration Manager, included in its Content Matrix suite, while Axceler offers ControlPoint. In an interview, Murphy said Metalogix will steer new customers toward the Migration Manager product but will continue to support the Axceler ControlPoint tool and may issue some updates.

"There's not going to be a strategic roadmap on the ControlPoint Migrator but we'll continue to issue enhancements and support as necessary as that installed base is migrated to the Content Matrix platform," Murphy said. "We don't sunset products in this market, we provide enough functionality in the next releases that people move forward to the new platforms. Microsoft helps with that with their migrations [new releases]. They kind of force everyone to move forward."

Murphy said the two companies already have many common customers and Metalogix was constantly hearing that governance is a key concern among SharePoint customers. "It's the SharePoint governance and administration features that overwhelmingly our customers are looking for, that it made sense to link up with our migration, upgrade and ongoing management capabilities of Content Matrix," Murphy said.

The combined company will have over 13,000 customers and 250 employees. Murphy said Axceler customers will immediately have access to its 24x7 live telephone support service.

Among the 70 employees moving from Axceler to Metalogix is Christian Buckley, a SharePoint MVP and evangelist who is widely regarded in the SharePoint community. "There are a lot of interesting integration stories and products in the pipeline," he said.

Posted by Jeffrey Schwartz on 08/28/2013 at 12:44 PM0 comments


How Will the Next CEO Save Microsoft?


While the guessing game over who will replace Steve Ballmer as Microsoft's CEO is on, the bigger question is whether any executive can fix the troubles in Redmond.

Wall Street tech analyst Rick Sherlund this morning told CNBC that perhaps no one can help Microsoft make up for the ground it has lost in the tablet and smartphone race. But he said that may not be the criterion the search committee of Microsoft's board, which includes chairman and founder Bill Gates, will be focused on.

"You can certainly continue to try but I think this is not about fixing the company in that regard," Sherlund said, adding the bigger priority will be creating shareholder value. Does that mean breaking up the company or using Microsoft's mountain of cash to buy back shares or pay hefty dividends? Or does creating value mean making some key acquisitions that would help increase Microsoft's share value, which has held relatively steady over the past decade?

Stakeholders including enterprise IT decision makers and those who manage their infrastructures with Microsoft products -- as well as those who use them for content creation and management --  might have different views on creating value and consequently how Microsoft should evolve. And that will also be critical to creating shareholder value.

The first thing the new CEO will need to consider is whether the company can deliver on the new "One Microsoft" mantra, which really is just a marketing slogan for a concept Microsoft has long aspired to. Remember Ray Ozzie's "three screens and a cloud" message, which referred to Microsoft's goal of tying devices, phones and TVs together with the cloud. And over a decade ago, when Microsoft first announced the .NET Framework, its goal was to create intelligent devices and services through an effort known as Project HailStorm.

"One Microsoft" looks to break down the organizational siloes -- and in many cases fiefdoms -- with a management structure that the company hopes can better achieve that model. Make no mistake, Apple and Google have similar goals and though unstated Amazon.com has shown signs that it also aspires to a similar goal. Companies like Citrix and VMware have similar worldviews.

It seems unlikely Gates would sign off on a CEO who would want to dismember that anytime soon. That would be a last resort. Microsoft's troubles, while real, are not on par with the problems that plagued Apple when it was on the brink of collapse before Steve Jobs returned, or IBM in the early 1990s, when it appeared Big Blue was toast (and its CEO at the time, John Akers, had the wheels in motion to break up the company). Akers was replaced by Lou Gerstner, who was CEO of RJR Nabisco at the time and had no background running a technology company.

Gerstner revived IBM in one of the most unlikely and remarkable corporate turnarounds ever. But I can't imagine that type of executive running Microsoft. And despite Microsoft's troubles, which aren't trivial, I don't see it in the dire straits IBM was in two decades ago. Ironically Microsoft was founded as a company looking to disrupt IBM's business model and prided itself on the fact that it didn't have the corporate makeup and legacy issues that faced Big Blue. In many regards, Microsoft has become what it once rallied against.

Along came Google, VMware and a re-christened Apple, which are now trying to do to Microsoft what it did to IBM. But if companies like Apple and IBM can return from the brink, it's certainly reasonable that the right leader can address Microsoft's less severe, though hardly trivial, issues and put the company on a path to future growth.

In the meantime no one knows how long it will take to name a new CEO or how this will play out. The search committee has a big task on its hands and whoever it chooses will have a major impact on Microsoft's future.

Have any thoughts on who you'd like to see as Microsoft's next CEO and how he or she should take the company forward? Feel free to comment or drop me a line at [email protected].

 

 

Posted by Jeffrey Schwartz on 08/23/2013 at 11:05 AM0 comments


Is There Hope for Windows RT?

While Microsoft continues to promote Windows RT, the version of its client OS designed to work only with software offered via its Windows Store interface, third-party support is fading fast.  

Other than Microsoft's Surface RT, try finding anyone else who offers a tablet with Windows RT. I swung by my nearby Microsoft Store, Best Buy and Staples, and the only Windows RT device I could find was the Surface RT.

In the latest sign that everyone appears to be cutting bait on Windows RT, ASUS last week said it's dropping its Windows RT-based systems, while Acer said it was scaling back. Last month, Lenovo said it would no longer offer the Windows RT-based version of its IdeaPad Yoga. And HP long ago tossed its plans to offer a Windows RT tablet.

Both Acer and ASUS also started selling smaller form-factor Windows 8 tablets, now priced below $400. The ASUS VivoTab Smart ME400, priced at $399, has a 10.1-inch display, weighs 1.3 pounds and has a claimed battery life of 9.5 hours.

The 8.1-inch Acer Iconia W3, which costs $299, is in the same size range as the iPad Mini, most Android tablets and the larger Kindle Fire. While the Acer Iconia W3 hasn't received rave reviews, with full Windows 8 hybrid tablets-PCs hitting the sub-$500 price point, it's no wonder these companies are putting their Windows RT counterparts on ice.

However, not everyone is abandoning ship. Dell still offers the XPS 10, which it introduced last October with the launch of the operating system. Even with that Windows RT tablet, no one would accuse Dell of flaunting it. Has anyone spotted one of these?

If anyone can save Windows RT other than Microsoft itself, it's Nokia. The mobile phone company that has pinned its survival in the smartphone market on Windows Phone is said to have a Windows RT-based tablet in the works. According to Microsoft-News.com, Nokia will announce its Windows RT device at an event in New York. According to the report, the 10.1-inch tablet is based on Qualcomm's quad-core Snapdragon 800 processor and equipped with LTE support.

It will have to be pretty inexpensive and offer long battery life to gain any footing. The keyboard will reportedly have an added battery for extra power. As far as pricing is concerned, consider the new Lenovo ThinkPad 2 LTE tablet: That 10.1-inch device with an Intel Atom processor, 2 GB of RAM and a 64 GB solid-state drive costs $699 for just the base model and $119.99 for the optional keyboard. It gets 10 hours of battery life, according to Lenovo, and comes with Windows 8.

Unless the next iteration of Windows RT devices can outperform their full Windows 8 counterparts -- presumably from the next crop of Surface RTs and possibly from Nokia's offerings -- it could be an uphill battle. That is, of course, unless Microsoft can further lower the economic bar and show that developers are actually on board. It wouldn't hurt to see some more apps surface, pardon the pun.

Posted by Jeffrey Schwartz on 08/21/2013 at 11:32 AM0 comments


Are VMware Veterans Jumping on the Hyper-V Bandwagon?


A venture startup financially backed and run by a deep bench of VMware talent is hoping to re-invent the way IT pros manage their virtual infrastructures using a new cloud-based big data analytics service.

CloudPhysics last week went live with its namesake service aimed at simplifying the administration of virtual machines by using a vast real-time analytics engine that aggregates and analyzes billions of data points. Administrators will be able to use the results of these analytic queries to ease the burden of solving the multitude of complex operational issues that come up, according to the company.

The Mountain View, Calif.-based company also said it has raised $10 million in a second round of venture capital financing from Kleiner Perkins. The company's first round came from Mayfield Fund.

CloudPhysics operates a cloud-based software-as-a-service (SaaS) offering built around what it describes as a sophisticated real-time data analytics engine. This knowledgebase, which constantly takes in new data feeds, diagnoses and troubleshoots thousands of issues that might affect the function of a VMware ESX virtual server cluster environment, such as incorrectly configured scripts, network configuration errors, and memory and I/O utilization issues.

"The administrator has multiple questions, literally thousands of questions that are very well-defined explorations or responses to very well-defined problems," explained Founder and CEO John Blumenthal, who is among the VMware veterans who helped launch CloudPhysics in 2011.

Naturally I asked if this is a VMware-only play or if the company will support the growing presence of virtual machines powered by Microsoft's Hyper-V, as well as Xen and KVM hypervisors. Blumenthal said that is indeed the plan and by the end of the year it will support one of the above-mentioned hypervisors. While there's a good chance it will be Hyper-V, he said it's not a certainty. The company is still weighing whether it should consider KVM before Hyper-V.

"The commercial midmarket user who is our targeted customer as we go to market is looking more curiously at Hyper-V," Blumenthal said. "And as you move up in the size of organizations, we are encountering an increased presence and interest in KVM and OpenStack."

Blumenthal described the service as a big data repository that collects more than 80 billion pieces of data each day from a variety of sources, ranging from technical blogs to configuration data from customers and other sources. The data is all "anonymized" and used to create patterns that are subsequently analyzed.

Data fed from customer datacenters and other sources is anonymized using cryptographic techniques, a measure intended to allay concerns about the privacy and security of that data, Blumenthal said. While I didn't dispute the wisdom of those measures, especially with heightened concerns about surveillance, I asked Blumenthal why an organization would be worried about its memory utilization getting into the wrong hands.

"It's more of a policy issue than anything else," Blumenthal said. "When you talk to users, they make extensive uses of SaaS services, including Salesforce.com, where actually the most sensitive data in a corporation is now off-prem in the form of the customer contact list. Usually, in most of our discussions with our users who raise these concerns, they back down from it very quickly when they stop and think it through."

More than 500 enterprises globally tested the service, which is hosted on Amazon Web Services EC2, though Blumenthal said it can easily be moved to another infrastructure-as-a-service (IaaS) provider.

"It's not tied to Amazon in any way," Blumenthal said. "Amazon's back-end provides the running infrastructure for compliance and security."

Customers install a virtual appliance on their VMware ESX clusters, where it functions as an agent. Administrators can discover and troubleshoot hundreds of operational problems using specific analytic components that CloudPhysics calls Cards, available from an app store-type environment also launched this week. In addition to accessing Cards that offer preconfigured reports, customers can create their own with a tool called Card Builder.
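
To give a feel for the kind of operational check a Card automates, here is a generic sketch of a rule run against collected cluster inventory, flagging hosts whose allocated virtual machine memory far exceeds physical memory. The field names, figures and 1.5x threshold are hypothetical illustrations, not CloudPhysics' actual Card format or API.

```python
# Generic illustration of an analytic check over collected host inventory.
# Field names and the 1.5x threshold are hypothetical, not a CloudPhysics API.
hosts = [
    {"name": "esx-01", "physical_mem_gb": 256, "vm_mem_reserved_gb": 420},
    {"name": "esx-02", "physical_mem_gb": 256, "vm_mem_reserved_gb": 310},
    {"name": "esx-03", "physical_mem_gb": 512, "vm_mem_reserved_gb": 350},
]

OVERCOMMIT_LIMIT = 1.5  # flag hosts where reserved VM memory exceeds 150% of physical

def memory_overcommit_report(hosts):
    """Return (host, ratio) pairs for hosts whose overcommit ratio exceeds the limit."""
    flagged = []
    for h in hosts:
        ratio = h["vm_mem_reserved_gb"] / h["physical_mem_gb"]
        if ratio > OVERCOMMIT_LIMIT:
            flagged.append((h["name"], round(ratio, 2)))
    return flagged

for name, ratio in memory_overcommit_report(hosts):
    print(f"{name}: memory overcommit ratio {ratio} exceeds {OVERCOMMIT_LIMIT}")
```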

The analytics engine is designed to help administrators optimize storage, compute, network and other components using various modeling methods that can address performance and cost benchmarks. A planning component lets administrators simulate the effects of adding new hardware, software and other components.

CloudPhysics offers a free community edition. For a standard edition with more features and e-mail support, pricing starts at $49 per month for customers signing a one-year contract, or $89 per month for those who opt to go month to month. An enterprise edition is available for $149/$189 per month and offers telephone support and the full menu of features.

Posted by Jeffrey Schwartz on 08/19/2013 at 3:32 PM0 comments


Would Transparency by Feds Ease Fears Over Cloud Surveillance?

When President Obama last week called for the government to be more transparent about its data surveillance activities, critics saw it as a step in the right direction, though it's unclear how, when or if that will happen. As I noted at the beginning of the week, claims by Edward Snowden that Microsoft may be feeding the National Security Agency customer data -- which Microsoft insists are not true -- are having a chilling effect on customer confidence that data is safe in the cloud.

Yet well before Snowden disclosed surveillance activity such as PRISM, the Cloud Security Alliance (CSA) had established mechanisms for service providers to disclose their data-protection practices. A key initiative was the Security, Trust & Assurance Registry (STAR), launched by the CSA two years ago, which is where cloud providers like Amazon and Microsoft have provided audited security controls.

Now that Snowden has unleashed a flood of classified information that points to PRISM and the NSA's widespread use of surveillance to thwart terrorism, the CSA has sprung into action, calling attention to its efforts and leading the discussion on the effect of surveillance on cloud security.

The Snowden leaks come just as IT organizations have started to become more comfortable with the notion that data can be securely stored in the public cloud. As I pointed out Monday, less than a third of those surveyed by the CSA in the wake of the Snowden leaks believe there is adequate transparency on how often the government accesses their information. That lack of transparency was a recurring topic in the CSA's first-ever town hall panel held Monday.

"Today, there's no mechanism in place for cloud customers, any user organizations that rely on these cloud providers, to know when their data was exposed," said moderator Elad Yoran, VP of finance with the New York City chapter of the CSA and the CEO of Vaultive, an up-and-coming provider of a cloud encryption service. This is an issue Yoran has studied quite intensely for obvious reasons.

Not only is there a lack of transparency by the NSA and other U.S. law enforcement agencies, but many key cloud providers have complained that their hands are tied in that they're restricted in what they're permitted to disclose.

"This is definitely a hot topic for me," said panelist Peter McGoff, general counsel of Box, the popular cloud storage provider. "One thing we look at as a cloud provider, and what we're asking for, is more transparency in the process. We want to be able to communicate to customers at a minimum the numbers of such requests that we get in and what our process is. Right now, it's not quite super clear that we have that flexibility."

McGoff did offer that Box hasn't received an overwhelming number of warrants for enterprise data.

Until last week, the Obama administration had resisted supporting changes in the disclosure policies, but the president is now proposing that the government step up its efforts to be transparent. The proposal was vague, and opposition from both parties suggested nothing will change in the near term. However, panelists during the hour-long CSA town hall webcast said Obama's proposal was a positive move.

"It's a good first step," Box's McGoff said. "I felt much better with president Obama coming out and putting a bright light on this."

Robert Brammer, a senior advisor to the Internet2 Consortium and CEO of Brammer Technology, agreed. "The review the president has talked about of the intelligence process, with one of the objectives to create more transparency, will improve the level of dialogue on this subject," he said.

While calling for more transparency, Brammer argued there's a lot of misinformation, if not hysteria, about government surveillance activities. "Some of the emotional and superficial and narrowly based commentary that's come out in the media -- either in the newspapers or Sunday morning talk shows -- frankly makes this problem worse," he said. "We need a substantive dialogue on the issues and not a bunch of emotional sound bites."

One substantive point, Brammer noted, was a whitepaper (PDF) released last week by the Obama administration that lays out how telecommunications providers access and analyze metadata gathered from calling information.

"This information is limited to telephony metadata, which includes information about what telephone numbers were used to make and receive the calls, when the calls took place, and how long the calls lasted," according to the whitepaper's executive summary. "Importantly, this information does not include any information about the content of those calls -- the government cannot, through this program, listen to or record any telephone conversations."

While Snowden revealed surveillance efforts that were previously not public, much of the concern that has surfaced is old news, added Francoise Gilbert, founder and managing director of IT Law Group, a law firm focused on domestic and international information privacy and security. The U.S. government has had surveillance initiatives in place dating back to the late 1960s, and the Foreign Intelligence Surveillance Act (FISA) was initiated in 1978, Gilbert pointed out during the CSA panel discussion.

"The topic of government access to data is not something new," she said. "There have been many iterations and many amendments to these laws to keep up with technology, technology progress, and there has been a movement for the past two years to amend one of these laws -- the Electronic Communications Privacy Act -- to also bring it to the 21st century."

Gilbert also pointed to due-process requirements such as the Wiretap Act. While critics of the Foreign Intelligence Surveillance Court (FISC), created under FISA, believe the judges rubber-stamp most law enforcement warrants, Gilbert argued U.S. citizens have more protections than those in many foreign countries such as the United Kingdom.

"There is no FISA court -- they just come in and have access to your information," she said of many foreign counties. "In general, the laws I would say are definitely more favorable to the governments in foreign countries, especially in the U.K.," than in the United States.

Perhaps, but there's a growing chorus of critics in the United States who don't view the current laws along with the Patriot Act as very favorable to their privacy. While the government argues its surveillance efforts have thwarted potentially deadly attacks, even the panelists on this week's CSA webcast concurred that the feds are going to have to look at becoming more transparent.

I'd say that's especially true in the wake of the latest leaks by Snowden, reported yesterday by The Washington Post. The report reveals an audit last year that found the NSA overstepped its legal authority by erroneously tapping both foreign and American targets here in the U.S., typically the result of typographical, operational or computer errors. The audit cited 2,776 such incidents, according to documents Snowden shared with the newspaper. An anonymous NSA source, authorized by the White House to speak, told the Post, "We're a human-run agency operating in a complex environment with a number of different regulatory regimes, so at times we find ourselves on the wrong side of the line."

What effect have the disclosures of programs like PRISM had on your plans to use public cloud services? If you haven't already, please take a few minutes to participate in our brief survey, which can be accessed here.

Posted by Jeffrey Schwartz on 08/16/2013 at 4:25 PM0 comments


Windows 8.1, Windows Server 2012 R2 and System Center 2012 R2 Wave Release Set for Oct. 18

While most watchers presumed Microsoft would deliver its next wave of Windows client and data center products sometime this fall, the company today has made it official: all will be released Oct. 18.

Microsoft didn't say if it was planning a major live launch event, but the company will make all of the recently announced new software available on that date. PC makers will launch new PCs and tablets with Microsoft's new Windows 8.1 client, Microsoft senior marketing communications manager Brandon LeBlanc said in a blog post this morning.

Though there was no mention of a new Surface device on that date, commenters on the blog were already speculating about one. It's possible Microsoft will hold off on that so as not to upstage its already aggravated OEM partners. "We haven't announced RTM today," LeBlanc noted. "This announcement is just for general availability. We also haven't made any new announcements for TechNet subscribers."

At launch, Windows 8 customers can upgrade to the improved Windows 8.1 version via the Windows Store (see MSDN Magazine Editor-in-Chief Mike Desmond's take on some of the key new features in Windows 8.1 here).

In the case of Windows Server 2012 R2 and System Center 2012 R2, existing customers can download them that day, while new customers can buy the new releases on Nov. 1. The fact that Microsoft is launching its client and server upgrades together underscores the so-called seamlessness the company has emphasized since the June TechEd conferences in New Orleans and Madrid.

Brad Anderson, the Microsoft corporate VP who outlined the new features in Windows Server 2012 R2 and System Center 2012 R2 in the opening keynotes at TechEd in June, announced the release date for the new server operating system in a blog post today. The two, along with an upgraded Windows Intune also slated for release that day, are part of Microsoft's coordinated release strategy.

Apparently Anderson sees Windows Server 2012 R2 as a work of art, having noted the release date coincides with the 501st anniversary of Michelangelo exhibiting the ceiling of the Sistine Chapel for the first time. "If you love great works of art, then it's up to you to decide," he said.

Indeed, some IT pros might agree that building a hybrid cloud datacenter architecture takes both artistic and technical skills. Also it's not surprising that Microsoft is looking to tie the launches of the client and server OSes together, especially given Microsoft's emphasis on how IT can provide more seamless management of user-owned Windows PCs and tablets.

Perhaps most noteworthy in the Windows Server 2012 R2 upgrade are the improvements to Hyper-V and easier integration with public cloud services, including Windows Azure. To enhance Windows Server, Microsoft is launching the Windows Azure Pack, though it wasn't immediately clear if that will be included in the Oct. 18 launch.

If you've looked at any of the upgraded wares, feel free to comment below or pass along your thoughts to [email protected].

Posted by Jeffrey Schwartz on 08/14/2013 at 3:35 PM0 comments


IT's Dilemma After NSA Leaks

The leaks by Edward Snowden revealing the U.S. National Security Agency's (NSA's) classified but wide-ranging PRISM data-gathering effort -- which is aimed at intercepting and thwarting terrorist threats -- have had a chilling effect on customer confidence that data is safe in the cloud.

For better or worse, these revelations have also caused consumers and enterprise customers to cast a more skeptical eye on Microsoft and other key tech stalwarts including Google, Apple, Facebook, Amazon and Yahoo. I say "more skeptical" because there was no shortage of cynicism about what role these providers were already playing in sharing their data.

Snowden cast Microsoft as a key villain when one of his leaks charged Redmond was cooperating with the NSA in letting it tap into e-mails from Outlook.com (formerly Hotmail), data stored in SkyDrive, and Skype chat sessions and phone conversations.

Microsoft General Counsel Brad Smith swiftly denied allegations that the U.S. government had back-door access to data and encryption keys. "Microsoft does not provide any government with direct and unfettered access to our customers' data," Smith stated. "Microsoft only pulls and then provides the specific data mandated by the relevant legal demand." The company only responds to requests for specific accounts and identities, and governments must serve court orders or subpoenas for account information, he added.

The problem is the U.S. government has tied the hands of providers as to how much they can reveal about their level of cooperation. Smith argued Microsoft wants to disclose how it handles national security requests for customer information, but as of mid-August, the U.S. Attorney General has denied the company's request to allow it to be more transparent. "We hope the Attorney General can step in to change this situation," Smith said.

Meanwhile, customers and enterprises are rethinking how they use the cloud for their data. The Information Technology & Innovation Foundation (ITIF) in early August released a report predicting that, absent action by the U.S. government, recent security concerns could cost the cloud computing industry anywhere from $22 billion to $35 billion.

A survey by the Cloud Security Alliance (CSA) found 56 percent of respondents outside the United States are less likely to use a domestic cloud provider, while 10 percent have actually cancelled a cloud deployment here. Less than one-third of all participants -- including those domestically -- believe there's adequate transparency on how often the government accesses their information.

It's possible the U.S. government will never let Microsoft and other cloud providers fully disclose what covert activities go on in the name of national security. That's a consideration that has to play into every enterprise IT decision maker's choice to use any cloud service, whether it be Amazon Web Services, Windows Azure, Office 365 or even letting employees use consumer services such as Box, Dropbox and SkyDrive.

President Obama's proposal Friday to improve transparency was a step forward -- but it will certainly face political obstacles.

We want to know how you're addressing these issues. We've fielded a survey to get your views and we'll be reporting on what you can do to protect your organization's data in the cloud. You can also e-mail me your thoughts at [email protected].

Posted by Jeffrey Schwartz on 08/12/2013 at 1:15 PM0 comments


Auditing Vendor Sees More Demand for Google Apps than Office 365

Hardly a week goes by when Google or Microsoft doesn't announce a key win for their respective cloud-based productivity services. While analysts say it's premature to declare a winner, Google Apps continues to gain ground on Microsoft's Office 365.

While none of the major IT researchers have published data, Gartner back in April indicated Google Apps is gaining ground on Office 365. Under what it described as a narrowed analytical framework, Gartner analysts on a webcast suggested Google Apps had anywhere from a 33 to 50 percent share of the cloud productivity app market.

Gartner hasn't published any formal research nor has Forrester Research. "The number of customer acquisition announcements both vendors make and the inquiries that we do on this topic don't really give a clear picture as to which is up, either," analyst T.J. Keitt said in an e-mail. Pointing to a video Forrester posted earlier this year on how to choose between the two, Keitt said "We're too early to declare a winner and loser in this competition."

In a nod to Google, auditing software provider BeyondTrust this week unveiled a version of its PowerBroker Auditor for Google Apps. The company said the free tool, available for download, is designed to simplify the otherwise manual work of tracking configuration and administrative changes for security and compliance reporting.

When I asked if there's a version for Microsoft's Office 365, the company said that will follow in the future. "Customer feedback indicated that Google Apps was being widely deployed, more so at this point than other online collaboration solutions, so we went with that feedback and released support for Google Apps first," said Brad Hibbert, BeyondTrust's vice president for product strategy and operation, in an e-mail. The company says it has 5,000 customers.

Making this more noteworthy is the fact that BeyondTrust lists itself as a Microsoft Gold Partner. BeyondTrust doesn't have a partnership with Google, according to the list of partners on its Web site. However, that's in the works, according to Hibbert. "We've begun the partnering process with Google as a result of this product development," he noted.

Is your organization forsaking Office or Office 365 in favor of Google Apps? Drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 08/09/2013 at 3:06 PM0 comments


How Will Microsoft Rename SkyDrive?

Microsoft's announcement last week that it must find a new name for its SkyDrive and SkyDrive Pro cloud storage services is the latest blow to the company's efforts to maintain consistent branding for its flagship products.

The company must stop using the SkyDrive brand after coming up short in a legal dispute with British Sky Broadcasting Group, which claimed it had dibs on the name. It's a setback for Microsoft, which has heavily marketed SkyDrive over the past several years and has described it as the key cloud storage component for Windows 8.x and Office 365. Likewise, Microsoft last year attached the SkyDrive Pro brand to the cloud-based online storage component for SharePoint 2013 and the SharePoint Online component of Office 365.

So what should Microsoft call it? Should the company emphasize the Azure brand and call it Azure Drive? That's a bit of a tongue-twister. What about WinDrive? That's presuming Microsoft could work out something with Tormax, which has a glass sliding door called Win Drive.  Share your suggestions below.

Speaking of naming changes, one Microsoft quietly made this week involves PowerPivot, which is now spelled as two words: Power Pivot. The company acknowledged the change, explaining it was intended to bring consistency with its other BI offerings. Microsoft also outlined plans this week to release a preview of its new Power BI tool for Office 365.

Posted by Jeffrey Schwartz on 08/09/2013 at 2:15 PM0 comments


Life Insurer Deploys 30,000 Windows 8 Tablets

Meiji Yasuda Life Insurance Company in Japan is rolling out one of the largest known deployments of Windows 8-based tablets. The company is giving Fujitsu tablets to 30,000 of its sales reps, both Fujitsu and Microsoft announced today.

The companies say it's the largest rollout of Windows 8 tablets in Japan, and it probably ranks among the largest known deployments anywhere. Meiji Yasuda Life will start using the tablets next month at 1,200 locations, according to Fujitsu. The two companies worked together to develop what they describe as a sales terminal.

Equipped with a 12.1-inch display and measuring just 15mm (0.59 inches) thick, the device weighs approximately 880 grams (roughly 1.9 pounds) and carries enough battery power to last an entire day. In addition to sporting the typical Windows 8 touch interface, it's designed for handwritten input, so customers can sign documents.

The tablet has a built-in mobile WAN module to work on the wireless NTT DOCOMO LTE-based network. The systems are also designed to encrypt data on the tablets' solid state drives.

Erwin Visser, Microsoft's general manager of Windows Commercial, said in a blog post today that the tablets are replacing Windows XP-based PCs, on which sales reps put together proposals and then printed them out. "Using the Windows 8 tablets, their sales efforts will be more efficient and the customer experience will be greatly improved," Visser said. "The company also expects to process contracts more quickly, while ensuring customer security is protected and eliminating the need for printed documentation altogether."

This is a big win for Windows 8 and Microsoft, and probably welcome news after last quarter's tablet market share report from IDC, released Wednesday. Now Microsoft needs many more big deployments to get enterprises interested in Windows 8.

Many IT pros tell me they plan to pass on major Windows 8 rollouts for the simple reason that they typically skip releases after performing major upgrades. Most organizations have upgraded their Windows XP (and earlier) PCs with Windows 7 and don't see a need to transition again at this time. But there are still a huge number of Windows XP-based systems that will no longer be supported with security patches after April 8 of next year, and Microsoft is encouraging organizations to consider Windows 8 rather than Windows 7.

As more organizations find use for tablets, perhaps the resistance to Windows 8 could subside. It remains to be seen how quickly enterprises will take a keener interest in tablets and even when they do, whether they'll choose those based on Windows 8, iPads or devices loaded with Google's Android or Chrome OS.

Posted by Jeffrey Schwartz on 08/07/2013 at 12:29 PM0 comments


Windows Tablet Market Share Grows Despite Weak Windows RT Sales

Windows was installed on 4 percent of all tablets shipped in the second quarter (1.8 million units), IDC reported today. However, Windows RT shipped on only 200,000 systems, mostly Microsoft's Surface RT.

The findings in IDC's second-quarter Tablet Tracker report presented the latest stinging data point showing that systems with Windows RT are not catching on with consumers, business users or IT pros, even as the overall tablet market grew rapidly during the same period. Because the second quarter ended June 30, the numbers don't take into account the fact that Microsoft last month slashed the price of its Surface RT devices and took a $900 million charge on the extra inventory.

But it doesn't look likely it will have a dramatic effect on the next quarter. "We don't see [Windows RT] making traction at all," said IDC program manager Ryan Reith, in an e-mail. "The bigger problem is hardware partners are beginning to shy away from the platform as they don't see consumer demand or its fit in the industry."

As for tablets running Windows 8, Reith is more optimistic. "As we have said all along, uptake for Windows 8 will be slow but eventually it will stick," he said, acknowledging the latest quarterly report is unlikely to silence critics. "Windows 8 is slowly making progress but it's a huge focus point for the industry and media with a very large target on its back, so I'm not quite sure it has come even close to meeting critics' needs."

Indeed, the 1.8 million tablets running Windows 8 pale in comparison to the 14.6 million iPads sold and the 28.2 million tablets loaded with Android, which account for 32.5 percent and 62.6 percent of the market, respectively.

Meanwhile, Microsoft over the weekend quietly cut the price of its Surface Pro tablets by $100, bringing them down to $799 for a system configured with 64 GB of storage and $899 for a 128 GB version. Keep in mind that doesn't include the price of keyboards, which can add $129 (for the Type version).

While the latest price cut on the Surface Pros may help move the needle a tad, I'm still betting a forthcoming version with Intel's Haswell processors will offer more appeal to users, presuming they offer the all-day battery life that CPU promises. Current Surface Pros only run about 4-5 hours, limiting their appeal.

Top Tablet Operating Systems, Shipments, and Market Share, Second Quarter 2013 (Shipments in Millions)

Operating System | 2Q13 Shipments | 2Q13 Market Share | 2Q12 Shipments | 2Q12 Market Share | Year-over-Year Growth
1. Android | 28.2 | 62.6% | 10.7 | 38.0% | 162.9%
2. iOS | 14.6 | 32.5% | 17.0 | 60.3% | -14.1%
3. Windows | 1.8 | 4.0% | 0.3 | 1.0% | 527.0%
4. Windows RT | 0.2 | 0.5% | N/A | N/A | N/A
5. BlackBerry OS | 0.1 | 0.3% | 0.2 | 0.7% | -32.8%
Others | 0.1 | 0.2% | N/A | N/A | N/A
Total | 45.1 | 100.0% | 28.3 | 100.0% | 59.6%

Source: IDC
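
As a quick check on the arithmetic behind the table, market share is simply a platform's shipments divided by the quarter's total, and year-over-year growth compares shipments against the same quarter a year earlier. The following minimal sketch recomputes both from the IDC unit figures above; small differences from IDC's published percentages come from IDC rounding the unit counts to one decimal place.

```python
# Recompute market share and year-over-year growth from the IDC unit figures above.
shipments_2q13 = {"Android": 28.2, "iOS": 14.6, "Windows": 1.8,
                  "Windows RT": 0.2, "BlackBerry OS": 0.1, "Others": 0.1}
shipments_2q12 = {"Android": 10.7, "iOS": 17.0, "Windows": 0.3, "BlackBerry OS": 0.2}

total_2q13 = sum(shipments_2q13.values())  # 45.0 here; IDC reports 45.1 on unrounded data

for platform, units in shipments_2q13.items():
    share = 100 * units / total_2q13
    prior = shipments_2q12.get(platform)
    growth = f"{100 * (units - prior) / prior:+.1f}%" if prior else "N/A"
    print(f"{platform:14s} {units:5.1f}M  share {share:5.1f}%  YoY {growth}")
```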

Posted by Jeffrey Schwartz on 08/05/2013 at 1:21 PM0 comments


Is Office 365's Growth Lowering Salaries of SharePoint and Exchange Experts?

Salaries for SharePoint developers and administrators have dropped nearly 6 percent this year -- but they're still drawing six figures in pay.

As we pull together the forthcoming 2013 Redmond Salary Survey, which we'll publish in the coming weeks, the data show the median compensation for SharePoint developers and administrators at $100,817, compared with $107,063 last year.

Likewise, the average salary of Exchange administrators was slightly lower -- $87,569 versus $88,889 last year, a 1.5 percent decline. It raises the question: Is the growth of Office 365 cutting into the earnings of those with SharePoint and Exchange expertise? To be sure, those who said they have Office 365 expertise also draw six-figure salaries.
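
For those checking the math, the declines are a simple percent-change calculation on the survey medians cited above:

```python
# Percent change on the survey figures cited above.
def pct_change(prior, current):
    return 100 * (current - prior) / prior

print(f"SharePoint: {pct_change(107063, 100817):.1f}%")  # about -5.8%
print(f"Exchange:   {pct_change(88889, 87569):.1f}%")    # about -1.5%
```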

The majority responding to this year's survey say the cloud is having either no impact or a positive effect on their salaries and careers. At the same time, as noted last week, a growing number of respondents are looking for new jobs.

Are you one of them? Drop me a line at [email protected] and let me know how your career as a Windows IT pro is going.

 

Posted by Jeffrey Schwartz on 08/05/2013 at 1:26 PM0 comments


Dell Committee Accepts Revised Bid as Icahn Sues To Block Changed Vote Date

Ahead of Friday's scheduled vote -- or should I say rescheduled vote -- Dell's Special Committee accepted last week's revised bid from founder Michael Dell and his investors led by Silver Lake Partners.

Consequently, the vote was once again rescheduled, this time for Sept. 12. In return for the sweetened bid, the committee agreed to change the voting standard so that shares not cast will no longer count as "no" votes, a sticking point with last week's revised offer. The committee also agreed to reduce the breakup fee to $180 million from the original $450 million.

"The Committee is pleased to have negotiated this transaction, which provides as much as $470 million of increased value," said Alex Mandl, the committee's chairman, in a statement posted this morning. "We believe modifying the voting standard is in the best interests of Dell shareholders, both because it has enabled us to secure substantial additional value and because it provides a level playing field for the decision facing shareholders. The original voting standard was set at a time when the decision before the shareholders was between a going-private transaction and a continuation of the status quo. Since then, the nature of the choice facing shareholders has changed because of the emergence of an alternative proposal by certain stockholders."

That alternate proposal was the $12-a-share bid from the investor team led by Carl Icahn and Southeastern Asset Management (a bid the group values at $15 to $18 a share because shareholders would retain a portion of their shares after receiving a payout). Icahn and his team contend Dell's offer undervalues the company. Icahn last week filed a lawsuit in Delaware Chancery Court in Wilmington, Del., to block the vote change. Icahn and his investors are aiming to replace the entire Dell board and Michael Dell as CEO.

It appears (so far) that the revised offer from Dell-Silver Lake has a better chance of succeeding than it did before they upped their offer. But anything can happen over the next month and Icahn isn't throwing in the towel just yet.

By going private, Dell and Silver Lake argue they can be more competitive by not having to disclose information, while allowing the company to make long-term bets that may not pass muster with the scrutiny of public shareholders.

So the battle continues.

Posted by Jeffrey Schwartz on 08/05/2013 at 1:28 PM0 comments


Windows XP Migration: The Next Y2K Crisis for IT Pros?

As the clock continues to tick for Microsoft's Windows XP, Microsoft and others are doing everything they can to motivate reluctant users to migrate off the aging operating system. Suffice to say, it remains an arduous process. In a token effort to remind people of the OS's limited life, Microsoft last week said it's holding a virtual retirement party for Windows XP, which it will officially stop supporting April 8 of next year.

Could that day be the closest thing we've had to the Year 2000 (Y2K) scare, when it was feared that any computer not properly updated would be rendered inoperable -- and when some of the legacy systems out there led everyone to wonder whether there would be dial tone or running water? Certainly those at Microsoft looking to get people off Windows XP might want people to give it that same sense of urgency.

For those who haven't been paying attention, Microsoft will no longer issue system or security patches for Windows XP after that date. That means systems still running Windows XP will become a sponge for malware, viruses and other problems. The so-called retirement party comes in the form of an information graphic Microsoft published with snippets outlining why users should get off Windows XP. But it was really just Microsoft's latest gimmick to draw attention to the issue.

Nevertheless, it's not a trivial problem. Larger organizations know what they have to do, and if they don't want to do it themselves, they'll hire outside partners to help with the migration of apps and the configuration of new systems.

Smaller businesses and branch offices of larger enterprises are often the guiltiest of those still saddled with PCs running Windows XP. Some may beg to differ with the term "guiltiest." I have heard from quite a few stalwarts who will go to the grave with their Windows XP systems and are quite angry at Microsoft for pulling the plug on it. But from everything I'm hearing, April 8 is the real deal. No one expects Microsoft to give Windows XP another stay of execution.

There are many who appreciate that sense of urgency and intend to upgrade as the date approaches, though many upgrades are expected to happen after April 8 as well. Many shops will ultimately break down and make the move. Those who choose to do it themselves can use some of the tips adeptly outlined by Redmond magazine's online news editor, Kurt Mackie, back in April. But many liken the task to mowing their own lawn or painting their house. Sure, they could do it, but they'd rather pay someone else a few bucks and not have to deal with the burden.

For decision makers who feel that way about moving their PCs to a new operating system, there are plenty of third-party options, though some might be costly. In a move to offer an inexpensive approach for smaller organizations looking to make the upgrade, Harry Brelsford, founder of SMB Nation, on Thursday, Aug. 1, is launching a new service called XPmigrations.com.

Brelsford has run SMB Nation for over a decade on Bainbridge Island, Wash., and is a fixture in Redmond. While Microsoft won't be promoting XPmigrations.com, officials there are well aware of the effort, Brelsford told me. And XPmigrations.com will use Microsoft Community Connections to help introduce consultants to appropriate business and civic groups.

Here's how XPmigrations.com works: Any qualified Windows consultant can register at the XPmigrations.com site to apply to become a migration expert. SMB Nation itself has a network of 40,000 SMB IT pros, and Brelsford explained to me that the goal is to have migration experts available nationwide. The experts will help customers decide how to handle a migration, including choosing the right PC for their needs, and then move data over and get each desktop or mobile system configured and connected to the network. The cost to upgrade each PC is $200.

Brelsford calls it a co-op, and XPmigrations.com will operate like a temporary employment agency, a business he ran in another lifetime. In effect, XPmigrations.com will function as a sort of online labor pool or marketplace of Windows XP migration experts, he explained. XPmigrations.com performs background checks on prospective consultants, equips them with a migration assessment tool kit and trains them to become a Certified Migration Expert (CME). "We liken the need to migrate off Windows XP as the equivalent of a Year 2000 issue," Brelsford told me.

While the number and scale of affected systems may not measure up to the Y2K crisis, the number of systems running Windows XP is significant. An astounding 41 percent of Redmond magazine readers still have Windows XP-based systems within their organizations, according to the Redmond 2013 Readership Survey. Another interesting figure shows only 18 percent have absolutely no Windows XP-based systems (meaning 82 percent have at least some such PCs), compared with 3 percent who say they don't have any Windows 7-based systems.

On the other hand, only 23 percent report that more than half of their systems are Windows XP-based. Yet less than half -- 45 percent -- say they plan to migrate off of those systems by the end of the year, with another 24 percent planning to do so but yet to establish a timeframe.

But we all know what's going to happen. Most are going to wait until the last minute and beyond. And then there are the diehards.

Posted by Jeffrey Schwartz on 08/01/2013 at 1:15 PM0 comments


Reader Feedback Wanted: Looking for a Higher-Salaried Job?

As I scour the data from this year's Redmond Salary Survey, it appears salaries on average are on the rise. The average increase in compensation is almost as high as in last year's survey, but a notable difference is that a growing number of you are looking for a new job.

While you'll have to wait a month for the final results, I'm looking for some of you to add color to the trends I'm spotting. If the only thing holding you back is that you don't want to tip off your employer or colleagues that you're not happy with your salary or job (or maybe you don't want to admit you're doing better than most), no worries: I won't identify you.

But if you want to share your insights on what's driving compensation for Windows IT pros and how they're navigating their careers these days, your feedback would be welcome. One caveat: I do need to know who you are to ensure I am sharing accurate information but again, I will not identify you unless you want me to. That said, please answer the following questions:

  • Is your experience with any Microsoft-based technology valued by your current employer?
  • How do the raise and bonus you received this year compare with what you've received in previous years?
  • What's your view of why your employer is being either more or less generous with salaries this year?
  • Do you feel the only way you can boost your salary is to move on to a new job?
  • If salary isn't the reason you're looking for a new job, what is?
  • How is the cloud impacting your career?

Other relevant thoughts on the state of compensation and the job market for Windows IT pros are also welcome. You can reach me at [email protected].

 

Posted by Jeffrey Schwartz on 07/31/2013 at 1:15 PM0 comments


Salesforce.com CEO Marc Benioff Mocks Microsoft's Surface

Never one to miss an opportunity to taunt his rivals, Salesforce.com CEO Marc Benioff gave his unbridled assessment of Microsoft's struggling Surface: The alternatives are superior.

"The reason why they're not accelerating growth is for one simple reason," Benioff told The New York Times, in an article over the weekend breaking down Microsoft's quest to keep Windows relevant in the tablet era. "There's a better technology."

I shrugged it off as typical Benioff bravado, as he often takes swipes at Microsoft, which besides Oracle and SAP is perhaps Salesforce.com's largest competitor. Not only does Microsoft pose a formidable threat to Salesforce.com in the market Benioff's company dominates -- CRM -- but the two are also looking to draw customers and developers to their competing cloud platforms, Force.com and Windows Azure, respectively.

Salesforce.com officials coincidentally put substance behind Benioff's remark to The Times today with the release of the company's latest tooling for developers. An upgrade to its Salesforce Platform Mobile Services, launched back in April, provides extended support for those building native iOS and Android apps. The new release even targets Microsoft C# and .NET developers with the Xamarin Mobile Pack, one of the new packs added to Salesforce Mobile Services, which includes the Mono framework.

As Adam Seligman, Salesforce.com's vice president of developer relations, outlined the new additions to the mobile offering and specifically explained the addition of the Xamarin pack, I asked if he foresees adding native support for the new Windows runtime and Microsoft's tile-based app model in Windows 8, Windows RT and Windows Phone. "We have seen no demand for Windows RT," Seligman explained. "What we are doing for these cross platform devices is supporting regular native Web with HTML 5."

While he wouldn't say whether he shares Benioff's opinion that the Surface is inferior, I asked Seligman if ultimately supporting the new Windows runtime is on Salesforce.com's agenda. "Our mobile SDK today supports Android and iOS and if other platforms get some share we'll definitely look at supporting them too."

Many organizations use Salesforce.com's CRM and other SaaS offerings. If using Salesforce.com on Windows 8/RT and Windows Phone is confined to HTML 5 and JavaScript, will that limit the appeal of Surfaces and other Windows-based mobile devices? Share your opinion or drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 07/30/2013 at 1:15 PM0 comments


The Demise of Microsoft's Surface Is Greatly Exaggerated

It really gets under my skin when Microsoft's Surface devices are described as an abject failure. It's fair to say that Microsoft has failed to make a meaningful dent in the tablet market with the Surface, thanks to distribution blunders and poor marketing that led to the company's $900 million write-down. But to suggest the devices are DOA is premature.

The Surface could wind up sharing the same fate as the Zune (it's already outlived the ill-conceived Kin), but the game is far from over despite Microsoft's gaffes with its tablet entries. Forget the failure of the Zune and remember other technology entries where Microsoft was late to the party, such as Windows NT, Active Directory, Hyper-V and even Internet Explorer.

Though the clock is ticking, there's still time for Microsoft to generate demand for the Surface and Windows 8/RT. In his Bits blog in The New York Times yesterday, Nick Bilton referred to the Surface as a failure, though he stopped short of comparing it to the Zune.

"The Surface failed because Microsoft confused consumers who didn't want to think about RT or Pro or what version of Windows their new device would run," he wrote.

Bilton is right in that Microsoft needed to offer customers better clarity about the differences between Surface RT and Surface Pro. A better name for RT (maybe Express or Lite) to better clarify the differences between the two would have helped avoid that confusion.

Microsoft's long-term vision for Windows is a day when people have all but forgotten about the traditional desktop interface that now powers over a billion PCs and have moved on to its new modern tile interface. But presuming Microsoft can convince its base of developers and customers to make that transition, it will be many years before that happens.

In the meantime, Microsoft needs to play to those who only want to shell out a few hundred dollars for a device and those who want a full-function system that can appeal to workers who need more than a tablet intended for information consumption.

Until the economics make it possible to offer Windows RT machines -- including Surface RTs -- for less than $300, they will have a hard time moving the needle. It's getting closer with the latest price drop. Having a low-cost alternative to the iPad and devices based on Google's Android (as well as its Chromebooks) is a must if Microsoft wants to be in the game.

At the same time, Microsoft has to figure out how to walk the fine line for those who want the desktop. It must make that experience appealing while tempting (but not forcing) them to dabble in the new tile interface. And of course getting key apps in the Windows Store is going to be essential -- emphasis on key apps. Microsoft CEO Steve Ballmer last week told employees in a town hall meeting that "We built a few more devices than we could sell," as reported by The Verge. Calling that an understatement, Computerworld's Preston Gralla sniped: "To me, a six million tablet oversupply is quite a bit more than 'a few.'"

A report by Neowin.net noted Ballmer also said new Surface devices will offer "typical" improvements, though The Verge last month reported that the new tablets will come with a Qualcomm Snapdragon processor (the current version has an NVIDIA Tegra 3 chipset) and will include an option for LTE broadband connectivity. Ballmer is also keenly aware that getting quality apps in the Windows Store is critical, according to the Neowin report.

I for one am looking forward to seeing what the next crop of Surface devices offer (hopefully a mini is in the works too) as well as the anticipated onslaught of new devices from Microsoft's partners in the coming months -- especially the new Ultrabooks based on Intel's new Haswell processor -- which I hope Microsoft offers in the next Surface Pro.

With what's in the pipeline, it's too early to bury the Surface. At the same time, it would be premature to do a victory lap.

 

 

Posted by Jeffrey Schwartz on 07/29/2013 at 1:15 PM0 comments


7 Reasons Bill Gates Shouldn't and Won't Return to Microsoft

At a party I attended last weekend, I met someone who provides outsourced support for Windows-based systems and clients. Naturally we got into a conversation about Microsoft and its challenges in the post-PC world.

This gathering was on Saturday, the day after investors bid down Microsoft shares 11.4 percent following the company's dismal fiscal fourth-quarter earnings report, which came in lower than analysts expected. As we were talking, he predicted with a sense of inevitability that Bill Gates will return to Microsoft. I responded that's never going to happen, and we had a friendly but spirited debate on the matter.

Gates has made clear on numerous occasions that he's not returning to Microsoft; his commitment is to tackling poverty and disease and to providing education to those in the poorest parts of the world. This is indeed a much more important endeavor. As we debated the matter at Saturday's party, this person said to me, "if it means saving the company he founded, he'll want to come back," arguing it would give Gates the chance to do for Microsoft what Jobs did for Apple.

I probably would have shrugged off the conversation had an article not appeared in the Australian International Business Times Monday saying there's speculation that Gates may return to Microsoft. The article was devoid of any other substance and didn't even point to unnamed sources.

Again, I was prepared to shrug that off, yet a number of reporters pointed to it including Mary Jo Foley in her ZDNet All About Microsoft blog, where she stuck to her guns in pointing out there's no way Gates is coming back (she has done so in the past in her Redmond magazine column as well).

I'm in full agreement, and I don't think he should return either. Here are seven reasons why:

  1. Just because Steve Jobs brought Apple back from the brink and led it to become the world's largest company (and Howard Schultz was able to return to Starbucks and revive its fortunes) doesn't mean Gates is in a position to do the same for Microsoft -- nor are Microsoft's troubles as dire as Apple's were back in the late 1990s. Consider that founder Michael Dell has yet to completely turn around the company that bears his name, which he came back to fix (and it remains to be seen if he ever will), and that Jerry Yang couldn't do so for Yahoo.

  2. The origins of Microsoft's problems pre-date Gates' departure. While he wasn't CEO when Microsoft released Windows Vista, he was an active chairman and chief software architect. Gates arguably could have made decisions back then that might have put Microsoft in a better competitive position today -- particularly in the tablet and smartphone game. Let's not forget Gates was still at Microsoft in a full-time capacity when the company made its ill-advised $44 billion hostile takeover attempt of Yahoo, a deal that could have proven devastating to Microsoft; fortunately, the aforementioned Yang was foolish enough to fight it off.

  3. As Microsoft's largest shareholder, Gates can steer the company in any direction he wants without running it from day to day. To what extent he's doing so with CEO Steve Ballmer as a conduit is anyone's guess.

  4. Gates founded Microsoft with the vision of putting a computer on every desk and in every home. To a large extent he's seen that vision through, though he was never able to make good on his early promise of making tablets ubiquitous.

  5. When Jobs came back to Apple, he had a vision for the next wave of computing, communications and entertainment and had unfinished business. I've heard little from Gates these days to suggest he shares those same attributes.

  6. Despite all of the challenges facing Microsoft, it's hardly about to fall off a cliff. The company has over $70 billion in cash and many of its businesses are still thriving, while others are in the midst of multiyear transitions.

  7. While many people in developing countries and among the poorest in our own society still can't afford computers or smartphones, Gates' best chance of changing that is through the Bill and Melinda Gates Foundation. There's no reason he should, or will, give up his work with the foundation.

Do you believe Gates should put poverty aside and return to Microsoft? Feel free to comment or drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 07/26/2013 at 1:15 PM0 comments


Can a Solid Windows 8 'Mini' Shake Up the Tablet Market?

Microsoft has hinted for some time that it expects demand for Windows 8 and Windows RT to accelerate with the release of smaller tablets aimed at competing with the iPad Mini, Samsung Galaxy Tab, Amazon Kindle Fire and others.

The topic came up on Microsoft's earnings call last week, when CFO Amy Hood gave a plug for the new, smaller Windows tablets in the pipeline, a form factor that has driven robust sales of competing tablets. I for one am curious whether that means Microsoft is readying a Surface device that will take on the smaller tablets and, if so, whether it will run just Windows RT or a full-blown version of Windows 8. A competitively priced "Surface Mini" could be pretty remarkable if it has the right feature set.

So far the most prominent example of what a miniature Windows tablet might look like is the recently released Acer Iconia W3, an 8.1-inch tablet that runs the full-blown Windows 8. It's bundled with a free copy of Office Home and Student 2013 (sans the Outlook client), weighs just 1.1 pounds and is slightly less than a half-inch thick. Battery life is eight hours, and an optional keyboard is available. It can fit nicely into a purse or jacket pocket.

I took a brief look at the Iconia W3 over the weekend at the Microsoft Store where sales reps were plugging the device. While I spent some time using it in the store, more extensive reviews suggest while it's a good concept, it misses the mark. A Wall Street Journal review published today by columnist Walt Mossberg captured what I've observed.

Although Mossberg pointed to some advantages the Iconia W3 offers over the iPad Mini, notably Office, an SD card slot, USB and HDMI ports and better resolution (1280 x 800 versus 1024 x 768), he concluded those collective extras aren't enough to make Acer's device a superior alternative.

"Overall, I found it to be no match for the iPad Mini," he said. "Compared with the smallest iPad, the Acer features cheaper, bulkier construction; a worse-looking, slower-responding screen; significantly less battery life; and drastically worse cameras. And it's Wi-Fi only, with no cellular data option."

If you're among those who feel Mossberg's reviews are too consumer-focused, keep in mind that if this form factor catches on, your organization will ultimately be supporting these devices in the new era of bring your own device (BYOD). The Iconia W3 also failed to impress PC Magazine reviewer Brian Westover, who found it performed poorly.

Nor did the Iconia excite someone I know who spent considerable time with it and found it frustrating, especially the display, which he described as "pretty much unusable" in that reading Web pages was "difficult to the point of being painful... I thought mine was defective, only to learn that the W3 was designed to look this way."

It wasn't all bad, he said, noting it is easy to handle with one hand, offers a responsive touch-UI and he likes the extra IO. But all of that is negated by the poor display, he concluded.

The fact that Acer has already slashed the price from $380 to $300 after less than two months on the market may suggest it has received a less-than-stellar reception, Mossberg suggested.

If Acer and other OEMs -- and Microsoft itself -- are able to release $300 tablets that fix these shortcomings, would you pick one over an iPad Mini, an Android tablet or a Kindle Fire? Share your thoughts below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 07/24/2013 at 1:15 PM0 comments


Dell Again Postpones Vote with a Cheap 10-Cent Shot

In the latest bid to salvage his offer to take the company he founded private, Michael Dell and his team of equity investors led by Silver Lake Partners sweetened their $24.4 billion offer by 10 cents a share, amounting to about $150 million. But tucked in with the drop-in-the-bucket increase to $13.75 per share is a requested rule change under which shares that aren't voted by proxy or in person would no longer count as "no" votes.

As a result of the new offer and voting rule change, Dell postponed the shareholder vote for the second time -- the first adjournment occurred last Friday. Now the vote is scheduled for the morning of Aug. 2. Dell postponed last week's vote after it appeared evident he didn't have enough major shareholders on board to approve the deal.

Throughout the nearly six months since he first made the offer to take Dell private, Michael Dell's team has adamantly held firm on its price, saying it would not budge despite a counteroffer from an investor team led by activist investor Carl Icahn, who vigorously opposes the Dell offer and has successfully rallied other shareholders to vote against it. The paltry 10-cent-per-share increase is the group's "best and final proposal," as asserted in a statement this morning. Clearly, the sweetened bid looks like a cheap (relatively speaking) way to change the rules in their favor.

However, CNBC is reporting that the Dell special committee evaluating the offer has indicated it will not change the voting rules for a mere dime. If you've been waiting on the edge of your seat to see how this will play out, it seems you're going to have to wait a bit longer.

Posted by Jeffrey Schwartz on 07/24/2013 at 1:15 PM0 comments


Did Microsoft Throw Surface RT Under the Bus?

When Microsoft reported its worst quarter in recent memory last week, the company took its medicine after investors drove down its share price 11.4 percent Friday. On the surface (no pun intended), the culprit was Microsoft's unexpected $900 million charge related to the price reduction of its Surface RT devices.

Indeed the write-down gave analysts and pundits the opportunity to point to the deep-rooted problems of the Surface RT, including weak retail distribution (which it has since broadened), uninspiring marketing and Microsoft's decision to price the devices on par with the iPad rather than lower than its competition.

But there's another reason Microsoft took its medicine last week -- the company missed expectations across all its business units, including Server and Tools and the Microsoft Business Division (aka Office). "It was discouraging to read down the table and see that every division was below expectations," longtime Microsoft watcher Rick Sherlund, an analyst at Nomura Securities, wrote in a research note.

On the other hand, many viewed the Surface RT write-down as a tacit admission that Microsoft didn't have an iPad killer on its hands. Of course, that reality set in long ago. Does that mean the Surface RT and its sibling, the Surface Pro, are dismal failures? I believe it's way too early to write them off, so to speak. Microsoft appears to have cut its losses on its earlier mistakes, while clearing the decks for the next crop of Surface devices.

Presuming the next Surface RT and Surface Pro tablets are vast improvements, and the number of mainstream apps steadily increases, Microsoft is still in the game. It's not clear whether a Surface "Mini" is in the works, but I think many would welcome such a device. Indeed some of the new devices offered by OEM partners give a sense of what's in the pipeline.

I stopped by the Microsoft Store over the weekend and spent some time with the Acer Iconia, an 8-inch Windows 8-based tablet powered by an Intel Atom processor, as well as another new arrival, the 10.1-inch ASUS VivoTab Smart Tablet, priced at $399. Sporting a 64-GB SSD, 2 GB of RAM and a copy of Microsoft Office 2013 Home and Student, the ASUS device weighs 1.28 pounds and has a run time of 9.5 hours on a charge. It's certainly more affordable than the Surface Pro and appears to have better battery life.

Just as Microsoft is working to release faster updates to Windows, the company and its partners need to do the same with its devices. The sooner the better.

Posted by Jeffrey Schwartz on 07/22/2013 at 1:15 PM3 comments


Negative Investor Reaction Brings Halt to Dell Buyout Vote

With a chorus of major investors, including BlackRock and mutual fund giant Vanguard, saying they would vote against Michael Dell's bid to buy out the company's shares and take it private, the vote -- scheduled for yesterday morning -- was postponed at the 11th hour until next Wednesday.

The move buys company officials a few extra days to reverse the growing tide against the deal but it appears as if Dell is only delaying the inevitable.

Of course anything can happen, and the easiest way to save the deal would be for Dell and his investor team, led by Silver Lake Partners, to sweeten the bid. They have steadfastly said they won't raise their $13.65 per share bid, which values the company at $24.4 billion. I suspect that if it becomes clear raising the bid is the only way to get enough votes to salvage the deal, Dell and his team will do so. But they won't do it until the witching hour.

Nevertheless activist investor Carl Icahn is pushing relentlessly for Dell to do so and Icahn has put his money where his mouth is with his counteroffer to acquire Dell's shares for $14 plus warrants that would give investors a continued stake in the company. Icahn continues to maintain his bid is superior and the Dell offer undervalues the company. Dell has argued the offer by Icahn and his team led by Southeastern Asset Management is risky.

In a statement yesterday, Icahn said he wasn't surprised by Dell's 11th-hour delay adding that it validates his contention that investors are underwhelmed by the offer. "We believe that this delay reflects the unhappiness of Dell stockholders with the Michael Dell/Silver Lake offer, which we believe substantially undervalues the company," he said. "This is not the time for delay but the time to move Dell forward."

Though it's far from certain that Icahn's offer would prevail if Dell's is voted down, Icahn has indicated he would replace the entire board and would bring in a new CEO to replace Michael Dell. But for now the focus by both sides is to either seal this deal or vote it down.

Dell will spend the next several days trying to twist quite a few arms and change the minds of those who have indicated they'll vote against the deal. As Yogi Berra famously said: "It ain't over till it's over." Of course, yea or nay, this saga is far from over.

Posted by Jeffrey Schwartz on 07/19/2013 at 1:15 PM0 comments


Shrinking Windows Business and Soft Surface RT Sales Hurt Microsoft

The meltdown in the PC business has finally caught up with Microsoft. Though the company had missed expectations in four of the last five quarters, analysts and investors gave Microsoft a pass. In fact the company's stock has appreciated nicely in recent months.

But yesterday's worse-than-expected fiscal 2013 fourth-quarter report showed Microsoft can no longer hide from the shrinking PC business, with Windows revenue declining 6 percent for the quarter and 1 percent for the entire year on an adjusted basis (an upgrade offer and revenue deferral affected the numbers). Setting those adjustments aside, Microsoft said Windows Division revenue increased 6 percent for the quarter and 5 percent for the year.

While it's certainly no surprise that Microsoft's Surface devices, especially those based on Windows RT, have failed to make a dent in the rapidly growing tablet market, Microsoft's $900 million charge hit the company where it hurts and some analysts on last night's earnings call sounded put off by the surprise. Microsoft said it took the charge after slashing the price of the devices by as much as 33 percent.

The revenue and earnings misses are hitting Microsoft shares hard -- near midday they were down over 10 percent. Presiding over her first earnings call as Microsoft's new CFO, Amy Hood said Microsoft believes the price cuts for Surface RT "will accelerate Surface RT adoption and position us better for long-term success."

Certainly cutting the price could help, but what will more likely position Surface RT and Surface Pro for long-term success is the next crop of devices based on improved processors and storage, longer battery life and more popular apps in the Windows Store. Reading between the lines, it appears new Surface devices are coming. Microsoft is also counting on a fresh crop of systems from OEMs based on Intel's long-awaited Haswell processor that will address the shortcomings in the higher-end devices.

Hood said last week's major reorganization of the company -- dubbed One Microsoft -- along with the planned release of Windows 8.1 and a pipeline of new devices and services, will lead to improved results. The company will hold an analyst meeting in September to spell out how Microsoft will grow its business and provide further details on the reorganization, she added.

Yesterday's news wasn't all bad. The Server and Tools business posted $5.5 billion in revenues, up 9 percent, driven by System Center (up 14 percent), SQL Server (up 16 percent) and a 25 percent increase in Windows Azure customers. But that was obscured by the overall bad news, which resulted in quarterly revenues of $19.2 billion, up 3 percent year-over-year but falling short of the estimated $20.73 billion. Earnings per share were only 59 cents, falling way short of the 75 cents analysts had forecast.

Hood acknowledged that "we have to do better" to gain share in the tablet and phone market. "With over 1.5 billion Windows users around the world, a transition of this magnitude takes time," she said. "We're confident we're moving in the right direction."

Are you as confident as Microsoft is? Comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 07/19/2013 at 1:15 PM1 comments


Dell-Icahn Showdown: Who Will Blink?

Shareholders will finally vote tomorrow on the proposed (and hotly contested) deal to take Dell private. The outcome will either put Dell's fate to rest or set the stage for the next phase of the battle.

How tomorrow's vote will play out is too close to call, but there's a good chance the team led by founder and CEO Michael Dell and Silver Lake Partners may not have enough votes to complete the $13.65-per-share deal, valued at $24 billion -- thanks in part to corporate raider Carl Icahn's counteroffer, which he sweetened Friday.

If Icahn, backed by Southeastern Asset Management, can convince shareholders their offer is superior, and they ultimately accept it, that could have huge ramifications for the future of the storied company. Icahn has made no secret he intends to replace the entire board of the company including Michael Dell.

In a letter to shareholders last night, the company board's special committee, which has recommended the Dell-Silver Lake bid, said that despite the appearance that Icahn is offering a higher per-share price plus provisions to keep part of the company public, Icahn's bid would be riskier. That's because Icahn would presumably have to borrow against Dell's balance sheet, the company has long argued. The Dell letter's key argument:

We consider it unwise to layer substantial financial risk on a company already facing significant challenges from competition and from the rapid pace of technological change. It is, we believe, not an accident that no large publicly traded technology company carries high levels of debt. And while we recognize that, as a private company controlled by Mr. Dell and Silver Lake, the Company will have a significant debt burden, the risks of that capital structure will be borne entirely by the buyers and not by the public stockholders. Moreover, the buyers have the financial resources to invest additional funds if that proves necessary.

Conversely, a leveraged recapitalization of the sort advocated by Mr. Icahn would force Dell stockholders to maintain meaningful equity exposure to a non-investment grade, publicly traded company that we believe would likely be ill-prepared to weather further downturns in the PC business and could be hamstrung in its ability to make the additional investments needed to complete its transformational plan. We believe such a company would face instability that would undermine customer confidence and make it harder to attract and retain the best employees.

For its part, Southeastern Asset Management has argued shareholders are missing out on a superior opportunity for capital appreciation, pointing to the doubling of HP shares since November in a chart it posted showing analyst forecasts for the company at the time, as noted by Barron's Tiernan Ray. Icahn has also accused Dell of using scare tactics.

"They tell us about the profitable PC market drastically declining and point to their quarterly numbers," Icahn said in a letter posted Monday. "But they neglect to point out the reduction in margin is of their own doing because they have, of their own volition, lowered prices which obviously have drastically reduced margins.  But even Dell's own management believes this is temporary."

Meanwhile Forbes reported today the current offer by the Dell-led team is facing an uphill battle, with BlackRock the latest influential shareholder to back away from supporting the deal. Dell has held steadfast in saying it will not sweeten its offer. Whether or not you're a major Dell customer, the outcome could have ramifications on the entire industry.

Will Michael Dell and his team blink, or will they remain firm with their current offer, which Icahn and Southeastern say undervalues the company? And how would you like to see this play out? Comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 07/17/2013 at 1:15 PM0 comments


Amid Declining PC Sales, Lenovo Ousts HP as Leading PC Supplier

While PC shipments continued to fall off a cliff, declining about 11 percent for the second quarter, there was a changing of the guard as well. Lenovo has ousted HP as the leading global PC supplier in terms of units shipped, according to both Gartner and IDC. However in the U.S., Lenovo has a long way to go before catching up with HP and Dell, even though it made significant strides at all of its rivals' expense.

In terms of worldwide shipments, Lenovo hit the top spot with a 16.7 percent share of the market, according to both market researchers. HP had a 16.3 percent share according to Gartner, while IDC said it had a 16.4 percent share. Dell's share was 11.8 percent and 12.2 percent, respectively, the firms said. Asus came in a distant fifth place, falling 21.1 percent according to IDC and 20.5 percent according to Gartner. Falling the hardest was Acer, with a 32.6 percent decline, IDC reported.

Preliminary Worldwide PC Vendor Unit Shipment Estimates for 2Q13 (Units)

Company | 2Q13 Shipments | 2Q13 Market Share (%) | 2Q12 Shipments | 2Q12 Market Share (%) | 2Q12-2Q13 Growth (%)
Lenovo | 12,677,265 | 16.7 | 12,755,068 | 14.9 | -0.6
HP | 12,402,887 | 16.3 | 13,028,822 | 15.3 | -4.8
Dell | 8,984,634 | 11.8 | 9,349,171 | 11.0 | -3.9
Acer Group | 6,305,000 | 8.3 | 9,743,663 | 11.4 | -35.3
ASUS | 4,590,071 | 6.0 | 5,772,043 | 6.8 | -20.5
Others | 31,041,130 | 40.8 | 34,675,824 | 40.6 | -10.5
Total | 76,000,986 | 100.0 | 85,324,591 | 100.0 | -10.9

Source: Gartner

Although Lenovo is a distant fourth in the U.S. (behind HP, Dell and Apple), it showed dramatic growth there, while every other major vendor except Dell saw shipments decline. IDC said shipments of Lenovo PCs in the U.S. increased 19.6 percent during the quarter, compared to the same period last year. Dell rose 5.8 percent, while HP's shipments declined 4.1 percent.

Among Redmond magazine readers, Dell PCs are the most widely procured systems, followed by HP and Lenovo. According to 1,157 respondents to the Redmond Magazine Readership Study, 45 percent said Dell is their preferred PC provider, with 29 percent choosing HP and 10 percent selecting Lenovo. One percent each chose Acer and Asus, while 13 percent said they have no preference.

When it comes to PCs, it's hard to find discernible differences in pricing these days thanks to commoditization. Do you see any measurable differences among vendors in terms of design, reliability, support or other factors that may push one supplier over the top?

Posted by Jeffrey Schwartz on 07/15/2013 at 1:15 PM0 comments


Can Terry Myerson Save Windows?

Microsoft CEO Steve Ballmer's realignment of the company's organizational structure is a major bet that removing the silos that existed in product groups will help fulfill his mandate of transforming Microsoft from a traditional software supplier to a devices and services company. While there are many nuances of the new "One Microsoft" organization Ballmer revealed yesterday, one of the biggest bets Ballmer is making is on Terry Myerson.

Ballmer has tapped Myerson to lead Microsoft's new Operating Systems group. In other words, the future of Windows is on Myerson's shoulders. Just to be clear, that's all of Windows. The new Operating Systems organization is responsible for Windows delivered to Xbox, Windows Phone, PCs, tablets, Windows Server and Windows Azure. Until now, these versions of Windows were spread across three autonomous organizations, which had no stake in working with other groups.

I've read some debate as to whether Myerson should be tasked with ensuring the uniformity of Windows, given the lackluster results of Windows Phone, which he previously oversaw. But that may be an unfair criticism, since by the time Myerson arrived in the Windows Phone group its problems were already entrenched. The problems with Windows Phone aren't issues with the quality of the platform -- in fact, its likeness to Windows 8 makes the tandem an attractive option if the company can find a way to address the marketing obstacles that seem to be stunting the growth of Windows 8.

Myerson joined Microsoft in 1997, when the company acquired Intersé Corp., a Web analytics company he founded at the age of 21, according to his bio. Before leading the Windows Phone group, Myerson led Microsoft's Exchange Server team.

Just as Myerson will be responsible for delivering a unified Windows, the centralization of Microsoft's marketing under Tami Reller aims to provide a common message about the "One Microsoft" as it pertains to Windows. Equally important will be Myerson's ability to collaborate with Microsoft's OEM partners and Julie Larson-Green, whom Ballmer tapped to oversee the development and distribution of Microsoft's own hardware and online services.

"We've got innovative ideas coming from our OEM partners and Julie's team has some very innovative ideas," Myerson said yesterday during a conference call held for media and analysts to discuss the reorg. "Terry and I have worked together for a long time," Larson-Green added. "We both have worked on the operating system side. I've worked on the hardware side and it's a good blending of our skills and our teams to deliver things together. So the structure that we're putting in place for the whole company is about working across the different disciplines and having product champions. So Terry and I will be working to lead delivery to market of our first-party and third-party devices."

It could take years before it's clear whether Ballmer's sweeping reorg works. The success of Windows 8 will be among the most closely watched barometers. And critics and Wall Street won't wait years to weigh in.

What's your take on Ballmer's choice of Myerson to shepherd the unification of Windows -- and the overall decision to break down the organization silos at Microsoft? Feel free to comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 07/12/2013 at 1:15 PM0 comments


Microsoft Pushes the Envelope with Hyper-V Live Migration

When the third iteration of Microsoft's Hyper-V arrived with last year's release of Windows Server 2012, many regarded its virtual machine live migration capability as one of the hypervisor's key improvements. Hyper-V 3.0 offers faster migrations at speeds of up to 10 Gigabits per second, while allowing IT pros to conduct simultaneous live migrations. IT pros can also now perform live migrations outside a clustered environment.

As Microsoft explained last year, "You can configure a virtual machine so that it is stored on an SMB [Server Message Block] file share. You can then perform a live migration on this running virtual machine between non-clustered servers running Hyper-V, while the virtual machine's storage remains on the central SMB share. This allows users to gain the benefits of virtual machine mobility without having to invest in the clustering infrastructure if they do not need guarantees of availability in their environment. (Hyper-V with SMB storage can also be configured with Failover Clustering if you do require high availability.)"

So how is Microsoft upping the ante on live migration in Windows Server 2012 R2? Following up on a demo at TechEd last month, Microsoft Principal Program Manager Jeff Woolsey showed attendees at the company's Worldwide Partner Conference in Houston Monday just how much faster IT pros can perform live migrations with the new release. In the demo, Woolsey showed an 8 GB virtual machine running SQL Server, which he described as a worst-case scenario for live migration.

In the demo scenario, migrating the virtual machine between Windows Server 2012 hosts took just under 1 minute 26 seconds, while the Windows Server 2012 R2 Preview performed the same migration in just over 32 seconds. Then, using remote direct memory access (RDMA) during the live migration process combined with SMB Direct, it took just under 11 seconds, without utilizing added CPU resources.

"With compression we're taking advantage of the fact that we know the servers ship with an abundance of compute resources, and we're taking advantage of the fact that we know that most Hyper-V servers are never compute bound," Woolsey said during the WPC demo. "So we're using a little bit of that compute resource to actually compress the virtual machine inline during the live migration. This allows us to compress it and it's actually done a lot faster and much more efficiently. All of this is built into Windows Server 2012 R2."

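For those who want to try these options outside a keynote demo, the behavior Woolsey described maps to settings exposed by the Hyper-V PowerShell module in Windows Server 2012 R2. Here's a minimal sketch -- the migration subnet is a hypothetical placeholder, and your own network and hardware (RDMA-capable NICs for SMB Direct) will dictate which option makes sense:

# Allow this host to send and receive live migrations on a dedicated network.
Enable-VMMigration
Add-VMMigrationNetwork 192.168.10.0/24

# Choose how migration traffic is moved: TCPIP (plain), Compression (inline
# compression of VM memory, as described above) or SMB (SMB Direct, which can
# use RDMA-capable NICs).
Set-VMHost -VirtualMachineMigrationPerformanceOption Compression

# Permit several simultaneous live migrations.
Set-VMHost -MaximumVirtualMachineMigrations 4
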
Of course this was a demo and mileage will vary. For those testing Windows Server 2012 R2, are you impressed with the improvements to Live Migration in Hyper-V as well as other new capabilities Microsoft is bringing to its hypervisor? Feel free to comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 07/10/2013 at 4:59 PM0 comments


Ballmer Touts Microsoft's 'Better Together' Theme at WPC

While it's hardly common to find someone using a Windows 8 PC and/or tablet, a Windows Phone and Microsoft's portfolio of cloud services all at once, CEO Steve Ballmer today talked up his longstanding mantra that using the company's homogeneous collection of offerings will make individuals and IT pros more productive.

Speaking in the opening keynote address at Microsoft's annual Worldwide Partner Conference (WPC), taking place this week in Houston (keep up-to-date with the WPC news on our sister site, RCPmag.com), Ballmer spoke of that symmetry not only from the standpoint of running Windows 8 on multiple devices but also extended that message to public, private and hybrid clouds, making the claim that customers want one provider for all of those topologies.

Arguing the latter point, Ballmer cited an IDC survey commissioned by Microsoft that found 63 percent of its customers prefer to have a single cloud provider and 67 percent plan to purchase a variety of different cloud offerings from a single operator. The survey also found that 74 percent want the option to take a cloud offering back on premise. "We think we are the only solution and certainly the best solution for customers who want that," Ballmer said.

Also, despite minuscule market share for Windows Phone, Ballmer promoted it as a key pillar of extending Windows 8 from the PC and tablet. "It's a little-known secret how unbelievably amazing those phones are," Ballmer said, as he talked up their tight integration with Office. "The ability to really get work done on a Windows Phone is nothing short of amazing," he said.

Ballmer asserted Microsoft is not backing down on its quest to convince IT that Windows Phone devices are a better alternative to iPhones and Android-based phones, which account for the vast majority of smartphones in use today. "We're going to continue to push hard with the consumer but we think we have a very compelling proposition for the enterprise," he said.

To make his case, Jensen Harris, Microsoft's director of program management for Windows User Experience, demonstrated a new capability introduced in the recently released Windows 8.1 Preview called Miracast that lets a user broadcast the image of their Windows Phone to a PC (or vice versa) using a whiteboard app.

While all of this may sound nice, the reality is Microsoft is living in a heterogeneous world and Ballmer was wise to note the fact that the company is adding support for other platforms, particularly from a systems management perspective.

Indeed Microsoft isn't the only company making its products work better together. Apple, Oracle and VMware, among others have similar philosophies. The question is, do you? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 07/08/2013 at 1:15 PM0 comments


Windows New Start Button Is Not Like the Old Start Button

The Start button is back in Windows 8.1 but depending on how you used it in the past, you may be extremely disappointed. That's especially true if you used the Start button to launch your favorite apps.

The revived Start button in Windows 8.1, released last week at the Build conference in San Francisco, doesn't let you do that. "Microsoft's Start button 'fix' is worse than doing nothing," wrote Earl Fargis, in response to my post Wednesday on the new Windows 8.1 Start button.

So what should you do? You can stick with the third-party suppliers of Start buttons evaluated by David Pogue back in April. Pogue pointed to seven options, some of which are free, others that may set you back no more than $5. I used the 30-day trial of Start8 and it worked just like the classic Start menu. But when the trial period ended, I didn't bother to plunk down $4.99 to keep it.

Instead, I created Desktop icons for the apps I use most often and that really does the trick. Just like in Windows 7, all you have to do is right-click on the Desktop, click New, then Shortcut, and browse your system to the program you want to pin.
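
If you'd rather script it than click through the dialogs, the same shortcut can be created with a few lines of PowerShell. This is just a rough sketch using the WScript.Shell COM object; Notepad is a stand-in for whatever app you actually want on the Desktop:

# Create a Desktop shortcut to Notepad (swap in any executable you like).
$shell = New-Object -ComObject WScript.Shell
$desktop = [Environment]::GetFolderPath("Desktop")
$shortcut = $shell.CreateShortcut((Join-Path $desktop "Notepad.lnk"))
$shortcut.TargetPath = "$env:windir\System32\notepad.exe"
$shortcut.Save()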

Though it doesn't let end users launch their programs, the new Start button will be a welcome addition for IT pros and administrators. Right-clicking the Start button, either on the Desktop or in the Modern interface, pops up a menu of options to administer the machine. Among the features available are power settings, System, Device Manager, Network Connections, Disk Management, Computer Management, Windows PowerShell, Task Manager, access to the Control Panel, File Explorer, Search, Run and the ability to shut down the system.

In the end, the Start button is an improvement for IT pros and for users who configure their PCs. But the days of using it to launch apps appear to be over, unless Microsoft once again succumbs to pressure to "refine the blend" further. Either way, you're not out in the cold. Just set your apps up on the Desktop or use one of the third-party tools and you're in business.

Posted by Jeffrey Schwartz on 07/01/2013 at 1:15 PM0 comments


Build 2013: Skype Moving to Windows Azure

Microsoft is moving Skype to its Windows Azure cloud service, thanks to its new auto scaling and a slew of other features demonstrated yesterday at the Build conference in San Francisco.

Corporate VP for Windows Azure Scott Guthrie disclosed the plan to move Skype off of its own servers onto Windows Azure to showcase the auto-scaling feature Microsoft is planning for the service. The idea is that Skype's need for capacity ebbs and flows and the ability to get dynamic compute and storage will serve Skype well, Guthrie said.

While Windows Azure has always enabled customers to scale up and scale down their apps, it required them to write custom scripts in order to enable that capability, Guthrie said. Moving the Skype service to Windows Azure is a good test case for the auto scaling capability, according to Guthrie. By moving to Windows Azure, Skype can scale to the capacity it requires as fluctuations in usage change, he noted.

"We're going to make this a lot easier by baking in auto-scale capability directly into Windows Azure," Guthrie said during the keynote presentation. "This is going to make it trivially easy for anyone to start taking advantage of this kind of dynamic scale environment and yield the same cost basis."

The auto-scaling feature is now available in preview for those using Windows Azure Web Sites, Cloud Services and Virtual Machines. A menu of other services, including availability, monitoring and alerting, is also available. Only alerts and monitoring are in preview for Windows Azure Mobile Services.

Other enhancements Microsoft has planned for Windows Azure include support for availability, monitoring and alerting. The improved Azure features are available for preview now. The company has not set or disclosed a planned release date for general availability.

Guthrie also demonstrated how Microsoft will at some point let SaaS providers and ISVs authenticate to their applications via Windows Azure Active Directory. In a demonstration, he showed how they can integrate existing enterprise security credentials, enabling single sign-on within the application. "This makes it really, really easy for you now to build your own custom applications, host them in the cloud and enable enterprise security throughout," he said.

In addition, Guthrie previewed how Windows Azure Active Directory will also make it easier for enterprises to integrate existing SaaS-based apps and have the same type of single sign-on support with Active Directory.

The single sign-on preview was just a demo. Microsoft didn't release a beta or preview for the Windows Azure Active Directory improvements. The company will disclose more details in the coming weeks, according to a company spokeswoman.

Microsoft also announced the release of several Windows Azure features that were in preview, including Windows Azure Mobile Services and Windows Azure Web Sites. Windows Azure Web Sites, as the name implies, is aimed at letting developers build and host Web sites, while Windows Azure Mobile Services is designed for developers who want to build cloud-enabled apps for iOS, Android and Windows Phone.

Posted by Jeffrey Schwartz on 06/28/2013 at 1:15 PM0 comments


Who Will Fill Cisco's Shoes in the Server Load Balancing Market? 

When Cisco said it was exiting the server load balancing business last fall, a slew of vendors lined up to fill the void. Among them were Barracuda, Citrix, F5 Networks, Kemp Technologies and Riverbed Technology, looking to position their server load balancers -- also known as application delivery controllers (ADCs) -- as alternatives.

Cisco apparently decided to pull the plug on ACE, a plug-in load balancer the company offered for its Catalyst switches. Cisco first entered the ADC market back in 2000 when it bought a company called ArrowPoint for $5.7 billion, which is the third-largest acquisition it has made.

Now that Cisco has cut its losses by discontinuing sales of ACE, Kemp is looking to show that it has an edge on its rivals. At the Cisco Live conference this week in Orlando, Kemp said its LoadMaster Operating System (LMOS) for Cisco's Unified Computing System (UCS) is now available. Kemp said its LMOS for UCS is certified by Cisco via its Interoperability Validation Testing (IVT) program.

Kemp officials say it's offering the only load balancer operating system to run natively within the Cisco UCS fabric without requiring a server virtualization hypervisor. "The Cisco customer base is crucial to us not just because they've vacated but because the hosting market has been an important element in the way we deliver our technology and the hosters are overwhelmingly going toward Cisco blade infrastructure," said Kemp's co-founder and chief scientist Jonathan Braunhut last week.

While Cisco has also recommended Citrix Netscaler to ACE customers, Kemp approached Cisco with the alternative of offering its ADC within UCS, explained Kemp CEO Ray Downes in an earlier discussion. "With the recent discontinuation of their ACE product, they have a handshake agreement in Citrix Netscaler, however you have a lot of Cisco customers who are looking for a migration path and they would like to continue to work with Cisco," he said.

Because LMOS will run within the UCS fabric, customers will be able to gain more efficiency with the blades, said Iain Kenney, Kemp's director of product management. "Obviously with our Loadmaster for Cisco UCS running inside that fabric means we can load balance your middle-tier workloads, kind of natively within the fabric."

Posted by Jeffrey Schwartz on 06/27/2013 at 4:59 PM0 comments


Build 2013: Ballmer Announces Windows 8.1 Preview Download with Start Button

The biggest objection to Windows 8 since its release last fall is its failure to appeal to longtime users of the operating system's traditional desktop while drawing them to the new modern interface.

Microsoft CEO Steve Ballmer tacitly acknowledged that fact this morning as he gave the opening keynote of the Build conference in San Francisco, clearly the event of the year for Microsoft. Minutes into his keynote, Ballmer officially announced the release of the Windows 8.1 Preview.

Since the release of Windows 8 back in October, Ballmer said about 100,000 applications have appeared in the Windows Store that run on the new modern interface and application model.

But Ballmer also said there are 2 to 3 million applications designed to run on the traditional Windows desktop environment. With Windows 8.1, Microsoft is aiming to make it easier for users to access the environment and applications they prefer, while making clear that over time it hopes to see most key applications move to the new model. In other words, he's putting the training wheels back on.

"What we will show you today is a refined blend of our desktop experience and our modern user interface and application experience," Ballmer said. That wasn't quite a mea culpa but he positioned the move as a re-tweaking that Microsoft hopes will address the key objections existing and potential users have had to Windows 8.

In the worst-kept secret in the world of Windows, Microsoft has indeed brought the Start button back to Windows, along with a bootable desktop.

"You will see that we [brought] back the start button to the desktop," Ballmer said. "If you want to boot to the desktop, you can boot to the desktop." The response was a rousing applause, which isn't surprising given the reaction we've heard from you over the past several months.

However, in a surprising move, Ballmer also announced that Microsoft is embedding its Bing search engine properties across its line of products. That includes adding Bing search to Windows 8.1 as well as to Windows Phone, Office and Skype. Microsoft calls the project Bing as a Platform (see Microsoft's Search blog).

"With Windows 8.1, I would say Bing is inside," Ballmer said. "Our shell experience is powered by Bing. You'll see we are opening up Bing as an application development platform, so you can use all this investment we have put into crawling the Web and understanding entities. You can use that, see that and build that richness into your applications running on top of Windows."

In addition to bringing Bing to Windows, Microsoft is offering Bing Developer Services, which will let those building applications embed search capability in them.

Indeed, if you're tired of reading about the updates coming to Windows 8.1, now you can see it for yourself. Existing Windows 8 users can access the preview in the Windows Store. Microsoft also said it's available for download here.

My colleagues Michael Desmond and John K. Waters are live on the scene at Build and you'll see plenty of coverage from them, myself and my colleagues at Redmond Channel Partner and Visual Studio Magazine in the coming days and weeks. In the meantime, check out the preview and let me know what you think by commenting below or dropping me a line at [email protected].

Posted by Jeffrey Schwartz on 06/26/2013 at 1:15 PM0 comments


XenServer Goes Open Source: Will It Hurt Hyper-V?

Citrix this week took the latest step in its open source onslaught by throwing its XenServer hypervisor platform into the mix. The move offers an affordable alternative to Microsoft's Hyper-V, which is starting to gain appeal with the latest version released last fall, as well as to VMware's hypervisor and cloud management platforms.

Although parts of Citrix XenServer were already open source, the company yesterday released the entire product to the community via xenserver.org. The new XenServer 6.2 release, which boasts substantial performance and scalability improvements, is the same software as the open source Xen, which Citrix released to the Linux Foundation back in April. The move is also consistent with the Citrix push to open source much of its IP including last year's release of its CloudStack cloud compute platform to the Apache Foundation.

"We are making the Citrix XenServer the commercially supported version of the open source [Xen] product," explained Krishna Subramanian, vice president of product marketing for the Citrix Cloud Platforms group. "There's nothing to change when you go from the open source version to the Citrix version. It's just a license key change."

Looking to broaden the appeal of Citrix XenServer, the company is slashing the cost of subscribing to the paid version, which gives administrators support from Citrix. The entry fee is now $500 per socket, compared with an earlier initial cost of $2,500. Citrix is also now offering perpetual licenses starting at $1,250 per server. That provides all the features in its Platinum version, its highest level of support. The company is collapsing its multiple SKUs into a single one that provides all the features offered in Platinum, Subramanian said.

I asked Subramanian what impact this might have on Citrix's revenue, and her response was that by lowering the cost of implementing the high-end version of its hypervisor platform, more shops will find it a viable option. Hence, Citrix believes it will make up the loss through increased overall volume.

"All the features you can get in the highest end version in the past are all available in the open source Xen server, are now in Citrix XenServer," she said. "It is one SKU and is the commercially supported version of the free Xen server. That makes the value proposition of [Citrix] XenServer very, very strong, because now at a fraction of the cost of the alternative, less than half, even a third of the cost of alternatives, you'll have an enterprise grade virtualization solution."

Scott Lindars, Citrix senior product marketing manager, noted that with over 1 million downloads of the free open source Xen hypervisor platform, he believes many of those users are now prospects to license Citrix XenServer, where in the past it may have been cost-prohibitive.

"Now we have that huge base that's already using free and will continue, and there's a compelling reason for them to think about moving to the paid version," Lindars said. "We see this as a big upsell opportunity and expect a groundswell of users going from free to paid."

Does Citrix believe this gives shops eyeing Microsoft's Hyper-V and Cloud OS platform tied to System Center 2012 and Windows Server 2012 a reason to reconsider? Subramanian noted Citrix considers Microsoft a key partner and the company supports Hyper-V, adding that 90 percent of common workloads are Windows based. Those who want to use the complete Microsoft stack will go with Hyper-V, she said, while others who don't want to be tied to System Center will consider Citrix XenServer.

"This is not something we're positioning against Hyper-V," she said. "Most customers that use Hyper-V go with System Center and buy the full Microsoft management stack. Customers that don't want to do that and want to have an open orchestration and management platform that runs on a hypervisor and they don't want to commit to a Microsoft stack but still see value in XenServer."

As I noted last week, 61 percent of 1,153 respondents to Redmond magazine's 2013 Reader Survey said they use VMware for virtualization, while 44 percent said they use Microsoft. Citrix also came in strong with 29 percent, while 12 percent chose Oracle and 8 percent Red Hat's KVM. Subramanian believes this move makes XenServer immediately more attractive to shops running Windows Server since KVM requires Linux.

IDC analyst Al Gillen told me last week that Hyper-V's share of the hypervisor market has increased to 28.4 percent, up from 24.6 percent a year earlier, but he noted very few organizations are ripping out their existing hypervisor platforms.

"With Windows Server 2012, there aren't many customers who would look at it and say it's not ready for prime time," Gillen said. "A lot of the share gain we're seeing is [for new installations] rather than a replacement effect."

While most experts predict Hyper-V will gain share in the hypervisor market, I'm wondering whether the Citrix move will siphon off some of Hyper-V's potential growth. Does the Citrix move change your thinking? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 06/25/2013 at 1:15 PM0 comments


Microsoft Tightens Security for Virtual Machines in Windows Azure

Microsoft has added a new security option for those using its Windows Azure cloud service. Administrators can block unauthorized users from accessing virtual machines, Microsoft quietly announced at its TechEd conference in New Orleans earlier this month.

The new option lets administrators put Access Control Lists (ACLs) on individual endpoints. By putting the ACLs on endpoints or subnets, administrators can control unauthorized access to virtual machines that are protected behind a firewall but are accessible in the public cloud.

"We are adding an additional security option so that administrators can control inbound traffic to Virtual Machine," said Louis Panzano, a Microsoft cloud strategy advisor based in the company's office in Spain, in a blog post. "You simply define how traffic from outside of your corporate firewall communicates with your virtual machine public endpoints through PowerShell and soon it will be available in the management portal."

During a session at Friday's MongoDB Days conference in New York (see this blog post), Microsoft cloud evangelist and architect David Makogon noted the announcement of the new security option, saying it offers an important way to control access to an exposed IP port. As Panzano noted in his blog post, Makogon pointed out the option for now is not available in the Windows Azure management portal (meaning it requires the creation of PowerShell scripts).

Makogon said a good resource for creating that script is available via a blog post by Michael Washam, who until a few weeks ago was a senior program manager at Microsoft responsible for the Windows Azure PowerShell cmdlets for compute (IaaS, PaaS and VNET), the Windows Azure .NET SDK and areas of the Service Management API (RDFE).

"A significant improvement in the security of virtual machines is the ability to lock down an endpoint so that only a specified set of IP addresses can access it," wrote Washam, now a principal cloud architect at integrator Aditi Technologies. In his blog post, Washam explained how to specify ACLs during or after a deployment using PowerShell. "You create a new ACL configuration object using New-AzureAclConfig and then modify it with Set-AzureAclConfig," he noted. "The created ACL object is then specified to the *-AzureEndpoint cmdlet in the -ACL parameter." He shared an example script in his post.

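Based on Washam's description of those cmdlets, a bare-bones version of the approach looks something like the script below. To be clear, this is my own sketch rather than Washam's script, and the subnet, cloud service, VM and endpoint names are hypothetical placeholders:

# Build an ACL that permits only a corporate subnet, then attach it to an endpoint.
$acl = New-AzureAclConfig
Set-AzureAclConfig -AddRule -ACL $acl -Action Permit -RemoteSubnet "203.0.113.0/24" `
    -Order 100 -Description "Allow corporate network only"

# Apply the ACL to an existing public endpoint on the VM and push the update.
Get-AzureVM -ServiceName "myservice" -Name "myvm" |
    Set-AzureEndpoint -Name "web" -Protocol tcp -PublicPort 80 -LocalPort 80 -ACL $acl |
    Update-AzureVM
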
This is an important new option, Makogon emphasized, advising attendees of his presentation that it will keep unauthorized users out of their systems running in Windows Azure. "You probably don't want to have that port hanging out to the public," he said, noting that by implementing the script you "can set Azure ACL configuration and create a rule [to] permit or block a particular subnet."

Posted by Jeffrey Schwartz on 06/24/2013 at 1:15 PM0 comments


MongoDB Emerging as a Key NoSQL Database

Chances are if you're considering a database to host unstructured data, MongoDB is high on your list. Experts say it's the leading so-called NoSQL database, and the key commercial sponsor of the open source repository, 10gen, is clearly on a roll.

The company points to big commercial adopters including Cisco, Craigslist, Disney, eBay, Forbes, Foursquare, Goldman Sachs, Intuit, LexisNexis, Met Life, MTV, Salesforce.com, Shutterfly and Telefonica.

Likewise, 10gen has lined up partnerships with key IT vendors and cloud providers. In addition to Microsoft, other partners include Amazon Web Services, Red Hat and SoftLayer, the company IBM earlier this month agreed to acquire for $2 billion. IBM itself announced a partnership with 10gen earlier this month to collaborate on building mobile enterprise apps. ObjectRocket, a database as a service provider recently acquired by Rackspace, announced a partnership with 10gen on Friday.

In a sign of the times, 10gen recently settled into its new headquarters in the old New York Times building, which is also occupied by Yahoo. I spent a few hours Friday at 10gen's road show, which it calls MongoDB Days, in New York. The event drew more than 1,000 attendees who took part in a wide variety of business-oriented and technical tracks.

Naturally, while attending the crowded gathering, I gravitated to the one session on how to build MongoDB apps running on Microsoft's Windows Azure. Leading the presentation was David Makogon, a cloud evangelist and architect at Microsoft, who works primarily with ISVs.

While only about two dozen people attended the Windows Azure session, Makogon said MongoDB is a popular platform among Windows Azure users and ISVs. He said he gets more inquiries from partners and his support team about MongoDB than anything else.

Now that Windows Azure Infrastructure Services is available, MongoDB is easier to deploy, Makogon said. With the PaaS version of Windows Azure, IT has less control, which, while suitable for those who don't want to deal with lower-level infrastructure, is limiting for others who prefer or need that control.

PaaS has worker roles, which are basically Windows Server without IIS running. "It runs fantastic, however, the problem I have with it is I'm imitating a replica set," he explained. "With Worker Role instances, you can't choose which one to shut down. For instance if you want to specifically shut down the primary node, you can take it offline but you can't exactly choose that one to scale. Let's say you scaled to five instances and want to scale back to three, you can't specify which instances you want to let go. So when it comes to low-level management of your replica set, that's where the worker role tends to fall short, just because you can't map it to the exact server."

With Windows Azure Infrastructure Services, IT has full control, he added. "Say I want this virtual machine to be shut down," he said. "My general recommendation is if you're getting into replica sets or shards, especially because there's no scaffolding available today, virtual machines [Windows Azure Infrastructure Services] are the way to go."
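
To give a sense of what that IaaS route looks like in practice, here's a rough sketch of standing up a single Windows Azure virtual machine that could host a MongoDB replica set member, using the classic Azure PowerShell service management cmdlets. The cloud service, VM name, size and credentials are hypothetical placeholders, and parameter names have shifted a bit between module releases, so treat it as an outline rather than a copy-and-paste script:

# Pick a Windows Server 2012 gallery image and provision one VM for the replica set.
$imageName = (Get-AzureVMImage |
    Where-Object { $_.Label -like "*Windows Server 2012*" } |
    Select-Object -First 1).ImageName

New-AzureVMConfig -Name "mongo-rs0" -InstanceSize "Medium" -ImageName $imageName |
    Add-AzureProvisioningConfig -Windows -AdminUsername "mongoadmin" -Password "<strong password>" |
    New-AzureVM -ServiceName "my-mongo-svc" -Location "East US"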

Indeed while MongoDB is popular, as I reported from the Visual Studio Live! conference in May in Chicago, there's no shortage of NoSQL databases, depending on your requirements. One popular alternative is Cassandra, as are CouchDB, Neo4j, RavenDB and even table storage offered in Windows Azure.

In a brief interview following his keynote address, 10gen CEO Max Schireson told me the company is seeing numerous MongoDB deployments on Windows Azure but noted despite the addition of IaaS, the cloud service still needs to evolve.

"We're seeing a lot on Windows and some of it moving to Azure," he said. "I think it will take a few more Azure iterations for more of the workloads to move there but if there's one thing people know is Microsoft will keep working on it and as it grows and matures people will move more of their workloads there."

Asked how Windows Azure needs to mature, Schireson said, "I think some of it will be newer hardware and when solid state drive technology shows up in Azure, it will boost it. And as Azure and Windows Server continue to come together it will be a big boost."

Are you running MongoDB in Windows Azure? If so, please share your experience by dropping me a line at [email protected].

Posted by Jeffrey Schwartz on 06/24/2013 at 1:15 PM0 comments


Strange Bedfellows: Microsoft and Oracle Eye Cloud Pact

Microsoft CEO Steve Ballmer, Server and Tools President Satya Nadella and Oracle President Mark Hurd will gather for a press teleconference Monday to announce a cloud partnership.

During last night's quarterly earnings call with analysts, Oracle CEO Larry Ellison strategically alluded to the forthcoming partnership with Microsoft to support its next-generation database, 12c. The Microsoft deal is one of several partnerships Oracle is inking in the coming days; others include the likes of Salesforce.com and NetSuite, Ellison said.

Ellison indicated the companies will support 12c -- its new cloud-based, multi-tenant database. The "c" stands for cloud, Ellison said. It's reminiscent of the company's use of the letter "i" when it introduced its Internet database Oracle 8i in 1999. It's unclear whether it will follow the company's historical nomenclature, which would be Oracle 12c. But given its MySQL assets and the move toward NoSQL repositories, anything is possible. In his remarks on the earnings call, Ellison described 12c as "the most important technology we've ever developed for this new generation of cloud security."

Given Nadella's planned presence on the call, it's not a stretch to presume the pact will involve supporting Windows Azure and/or Microsoft's so-called Cloud OS stack, which includes the combination of Windows Server and System Center. While there's no love lost between Microsoft and Oracle, they've largely stayed out of each other's hair these days.

Such an arrangement is certainly not without precedent. Even when their rivalry was more pronounced, Oracle made its flagship products available for Windows Server, even though the company tends to push Linux -- in fact, the first releases of new products these days typically come out on Linux (followed by Windows at some later point).

Despite a chilly relationship of convenience between Microsoft and Oracle, Ellison has saved his ire for IBM, Salesforce.com and, of course, Hurd's former employer HP. Relations between HP and Oracle turned acrimonious following Oracle's hiring of Hurd and its push into the server market with the acquisition of Sun Microsystems. And of course let's not forget Oracle's decision to stop supporting HP's Itanium-based servers. Also, I have heard little to suggest the longstanding animosity between Oracle and SAP has subsided.

Even though Oracle and Microsoft compete in the database market as well as in on-premise and software-as-a-service (SaaS) applications, they've both carved out their own niches -- and neither is hurting, though they do face increased competition from startups and a slew of alternative big data and NoSQL technologies. And the Java versus .NET rivalry has largely become moot these days. Nevertheless, Oracle and Microsoft aren't companies you expect to see on a stage together -- or on a conference call.

It's worth noting that Ellison himself isn't scheduled to be on the Microsoft-Oracle call, though he could decide to parachute in. But he used last night's earnings call with analysts to provide a glimpse of what's coming.

 "Next week we will be announcing technology partnerships with the most important, the largest and most important SaaS companies and infrastructure companies in the cloud," Ellison said. "They will be using our technology -- committing to our technology for years to come. These partnerships I think will reshape the cloud and reshape the perception of Oracle technology in the cloud."

While some reports have suggested Ellison let next week's announcement slip, clearly he planned to articulate that Oracle will be launching its new cloud database and partnerships to support it next week. Invites to the conference call arrived from Microsoft less than two hours after Oracle's earnings call. It appears Microsoft didn't expect anyone from Oracle to tip the company's hand last night.

However meaningful the partnerships Oracle has planned for next week, Ellison accomplished one thing -- providing a distraction from the fact that Oracle missed analyst estimates again for its most recent quarter, this time by $250 million. Revenues of $10.95 billion for the quarter were flat. Will this new partnership with Microsoft amount to more than a distraction? We'll have a better idea next week.

Posted by Jeffrey Schwartz on 06/21/2013 at 1:15 PM0 comments


Time To Jump on the Hyper-V Bandwagon?

It seems Hyper-V is getting more respect every day. Five years after Microsoft entered the hypervisor market to challenge the standard bearer VMware, a growing number of holdouts are now finding Hyper-V has become a viable alternative.

My friend and longtime colleague David Strom last year made the case that Hyper-V has come of age, arguing more and more organizations are using it. He said Hyper-V has become more attractive to IT pros and partners alike, thanks to better management (System Center 2012) and development tools and integration with Windows 8. That argument came last August, a month before Microsoft started shipping Windows Server 2012, which of course included the third iteration of the hypervisor, known as Hyper-V 3.0.

While many argued Hyper-V wasn't a viable alternative to VMware's ESX, the 3.0 release of Hyper-V upped the ante for Microsoft's hypervisor with features such as concurrent live migration, dynamic memory, Hyper-V Replica, major improvements to storage, and continuous availability with clusters that can support 64 nodes and 4,000 virtual machines, up from just 16 nodes. Hyper-V also offers major network virtualization improvements.
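
To make a couple of those features concrete, here's a minimal sketch using the Hyper-V PowerShell module that ships with Windows Server 2012. The VM and replica server names are hypothetical placeholders, not a recommended configuration:

# Dynamic memory: let the guest's memory float between 1 GB and 8 GB on demand.
Set-VMMemory -VMName "sql01" -DynamicMemoryEnabled $true `
    -MinimumBytes 1GB -StartupBytes 2GB -MaximumBytes 8GB

# Hyper-V Replica: asynchronously replicate the VM to a second host for recovery.
Enable-VMReplication -VMName "sql01" -ReplicaServerName "drhost.contoso.local" `
    -ReplicaServerPort 80 -AuthenticationType Kerberos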

And Hyper-V is set to get better with this year's release of Windows Server 2012 R2, which as Redmond magazine online news editor Kurt Mackie pointed out last week will gain UEFI security support, SCSI Boot, automatic activation of VMs, VM copy and paste, Live Migration improvements, Shared VHDX, Linux guest support and Hyper-V Replica improvements. In a separate report, he outlined how the improved Windows Server and Hyper-V releases will let IT pros more easily manage their Linux and Unix environments.

In recent months, it seems enthusiasm for Hyper-V has increased. I was at a conference last week when Bruce Otte, director of IBM's SmartCloud platform and workload, pointed to the growth of Hyper-V. Even The Wall Street Journal recently took note, reporting that according to IDC, the share of servers running Hyper-V has jumped to 27 percent from 20 percent in 2008, while VMware's share has dropped to 56.8 percent from 65.4 percent over the same period. Citrix and Red Hat have smaller shares with their respective hypervisors.

According to Redmond magazine's 2013 Readership Survey, 61 percent of 1,153 respondents said they use VMware for virtualization, while 44 percent said they use Microsoft (naturally multiple answers were permitted here). Citrix also came in strong with 29 percent, while 12 percent chose Oracle and 8 percent Red Hat.

In a brief conversation with IBM's Otte after his presentation last week, he said since the release of Hyper-V 3.0, a growing number of customers are choosing it for new infrastructure. "Hyper-V has a greater amount of flexibility and is a little more stable," Otte said. "The other thing I attribute it to, is that Microsoft provided a virtualized license participation program that allows you to lower your license fees if you use their Hyper-V and monitoring and management stack. It's almost free."

As a result of the long-awaited improvements to Hyper-V, the partner ecosystem is expanding. A growing number of infrastructure vendors that have waited on the sidelines are now finding Hyper-V is ready for prime time. At Microsoft's TechEd conference two weeks ago in New Orleans, a number of suppliers of infrastructure software and hardware announced support for Hyper-V. Among them were A10 Networks, ExtraHop, F5 and Riverbed.

Riverbed launched its new Stingray Traffic Manager 9.2, a load balancer that can run as a software virtual appliance for the latest Hyper-V release. Venugopal Pai, Riverbed's VP of global alliances, hinted a full-fledged appliance would follow. Riverbed also launched a version of its Steelhead WAN optimization appliances for Hyper-V. Asked why it's taken Riverbed five years to support Hyper-V, Pai said that until the release of 3.0, the company didn't see it widely deployed for functions such as network connectivity and disaster recovery.

"With the adoption of Windows Server 2012 becoming stronger, Hyper-V becomes a real choice," Pai said. "We're seeing more and more of our enterprise customers wanting to get that choice from us as well."

F5 also made a big splash at TechEd about its support for Hyper-V, claiming it will be the first vendor to add support for environments with Network Virtualization using Generic Routing Encapsulation (NVGRE). Its BigIP gateways will support NVGRE early next year, making it possible to extend Windows Server networks to Windows Azure and other environments using network virtualization and software defined networking, said Jeff Bellamy, F5's global director of business development focused on the company's partnership with Microsoft.

A10 Networks, which launched its 64-bit AX application delivery controller, added Hyper-V support for the first time. Until now, A10's load balancers were designed to work only with VMware hypervisors. And ExtraHop launched an application performance management offering that runs as a Hyper-V virtual appliance, utilizing the new Hyper-V Virtual Switch capability.

ExtraHop Director of Marketing Eric Giesa said that ExtraHop's solution will let customers achieve cross-tier operational visibility of a Hyper-V implementation, allowing IT pros to correlate metrics across the application, database, network and storage tiers. It will also let administrators monitor private cloud and Windows Azure-based public cloud environments.

With the latest and forthcoming improvements to Microsoft's Hyper-V, and now that it's gaining support from third-party infrastructure providers, do you see it taking on a larger piece of the virtualization pie in your organization? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 06/19/2013 at 1:15 PM0 comments


HP Ships Personal Systems Chief Todd Bradley to China

Todd Bradley, the longtime HP executive vice president who was in charge of the personal systems group (which includes the world's largest PC business along with tablets and printers), has a one-way ticket to China. HP today said Bradley is stepping down to help develop business operations over there. Bradley's official new title is executive vice president, Strategic Growth Initiatives.

Could this be a graceful exit for the exec who once was CEO of Palm and was twice passed over for the top job at HP? Or is Bradley truly looking to help the company enter into new markets? HP CEO Meg Whitman in a statement described Bradley's new role as critical -- and he will work directly with her to expand its business in China. One of his key priorities will be to develop a channel business in China, according to Whitman.

"There's nothing more important to HP than our channel partners and the future of our business in China," said Whitman. "I've asked Todd to use his expertise to focus on these areas. I've also asked him to study the landscape of small companies and startups that could partner with HP to spur growth."

Bradley's replacement is Dion Weisler, currently senior vice president for the group in Asia Pacific and Japan. Interestingly, Weisler joined HP in January 2012 from Lenovo, the company that's gaining the most ground on HP. Weisler served as COO of Lenovo's product and mobile Internet group and also ran the company's businesses in Korea, Southeast Asia, Australia and New Zealand. Weisler also spent 11 years at Acer as managing director for its U.K. business.

Clearly building a presence in emerging markets is an important assignment for any executive, and Bradley certainly is a good choice. But Bradley has also made no secret that he wants to be a CEO again someday, and stepping aside from running a $42 billion business to build new operations is an interesting move. I'd still expect to see Bradley on the list for CEO openings as they arise in the coming years.

If Carl Icahn has his way and his team, along with Southeastern Asset Management, pulls off an upset and trumps Michael Dell's offer to take Dell private, Bradley would be on the shortlist of candidates for the CEO job there, Reuters reported earlier this month.

While HP has the leading PC market share, it's the number two choice of Redmond magazine readers, according to our 2013 Readership Survey. Of 1,157 respondents, 29 percent identified HP as their preferred PC vendor, while 45 percent chose Dell. Lenovo came in third with 10 percent, while other players were in the low single digits. Thirteen percent had no preference.

Clearly the stakes are high in this move as HP looks to win over more IT decision makers and consumers alike in a market that is going through its largest transformation ever.

Bradley was well regarded and many credit him with growing HP's personal systems business. Do you think Bradley was pushed or did he jump to this new opportunity?

 

Posted by Jeffrey Schwartz on 06/18/2013 at 1:15 PM0 comments


When Microsoft Almost Split...

Thomas Penfield Jackson, the outspoken federal judge who oversaw Microsoft's antitrust trial in 1998 and ordered the company to divest its Windows business, died Saturday of cancer at the age of 76.

After finding Microsoft guilty of breaching U.S. antitrust laws and violating the 1994 consent decree in which it had agreed not to tie the sale of products to the sale of Windows, Jackson ordered that Microsoft split itself into two companies -- one focused on Windows and the other on the remaining software.

As we all know, that never came to pass. Microsoft is still one very large company -- in fact, larger. But that may be thanks to unusual behavior by the judge. During the dramatic trial, Jackson, a technology neophyte, was able to disprove testimony from key Microsoft execs -- including founder, chairman and then-CEO Bill Gates -- that Internet Explorer was inextricably tied to Windows by decoupling the browser himself.

Jackson's split-up order was overturned on appeal only because he had discussed his thinking during the trial with a few select journalists under the condition that they not publish his views until after he rendered the verdict. While the appeals court found Jackson was biased, it didn't overturn the verdict, just the divestiture order.

Jackson famously compared Gates to Napoleon and the company's executive team to "drug traffickers." Gates had "a Napoleonic concept of himself and his company, an arrogance that derives from power and unalloyed success, with no leavening hard experiences, no reverses," Jackson said in 2001. Jackson also described Gates' testimony as "inherently without credibility."     

The fact that Jackson unabashedly shared his opinions gave the appeals court latitude to overturn his ruling that Microsoft split itself into two companies. But for several years, Microsoft faced a significant risk that it might have to split into at least two companies, and perhaps even more. Many, myself included, believed that wouldn't happen. But the possibility weighed heavily on Microsoft.

While customers and investors watched closely, and 20 state attorneys general pressed their own lawsuits, there was little evidence that the case affected major decisions by IT pros and investors alike to stand by their investments in Microsoft. Of course, we all hypothesized about what a divested Microsoft would look like. After all, many of us lived through the divestiture of AT&T in 1984, a move that reshaped the telecommunications industry.

Microsoft detractors argued vigorously that a Windows company and one that produces Office, for example, would foster more competition because each would have more freedom to align with parties that the combined company couldn't -- thereby offering customers more choice. While others feared a divested Microsoft would result in fragmentation and interoperability issues, the counterargument was that market forces would require players to make things work together.

As it turned out, market forces did diminish Microsoft's dominance. These days, it would be hard to call Microsoft a monopoly despite its strong presence on PCs today. While there are still plenty of critics who may beg to differ, Forrester Research is predicting only 30 percent of client devices will run Windows in 2017 thanks to the proliferation of iOS- and Android-based smartphones and tablets. Nevertheless Forrester does believe Microsoft's new Windows 8 will start to take hold in 2013.

Despite the diminished influence of PCs, 69 percent of Redmond magazine readers say they plan to upgrade their Windows PCs to Windows 7-based systems and 18 percent to Windows 8, according to our 2013 Reader Survey.

But if Jackson had been more discreet and Microsoft had exhausted all appeals (likely to the Supreme Court), there might have been a very different outcome. What would the IT picture look like today?

Posted by Jeffrey Schwartz on 06/17/2013 at 1:15 PM0 comments


Office 365 App Now Available for iOS

Microsoft today released a version of its Office suite for the iPhone, the first time the company has offered the combination of Word, Excel and PowerPoint on an iOS device. However, the app is currently available only to Office 365 subscribers.

While Microsoft has offered OneNote and Lync clients for iOS, customers have long pushed Microsoft to offer the full Office bundle for the iPhone and iPad. The new Office Mobile for Office 365 is intended for the iPhone, though Microsoft said it's possible to download the new app on the iPad as well. However Microsoft is advising those with iPads to use Office Web Apps with the tablet.

"Like all iPhone apps, Office Mobile can work on iPad, either small or "2X" scaled up, but you'll have a more satisfying experience using Office Web Apps," the company said in a blog post.

I downloaded the new Office bundle onto my iPad to see if it was suitable. While I only spent a few minutes using Word, it works (but is not ideal). The resolution is mediocre and it offers only a limited view because the on-screen keyboard takes up half the display. Though I didn't test it with an external keyboard, I'd think that scenario might offer more on-screen viewing space.

Nevertheless Office Mobile for Office 365 should be a welcome addition for iPhone users -- many of whom are Windows IT pros. According to the Redmond magazine 2013 Readership Survey published last week, 33 percent of 1,163 respondents say they own an iPhone. More importantly, 69 percent say they support iPhones for use by employees within their organizations.

Microsoft no doubt is hoping this will accelerate the momentum of Office 365. According to the Redmond magazine survey, 12 percent of 1,022 respondents said their organizations will deploy Office 365 within the next 12 months, while 49 percent will upgrade to Office 2013. Depending on the license, many of those organizations using Office 2013 will have Office 365 subscriptions as well.

Of course, the big question now is whether (or when) Microsoft will develop a version of Office specifically designed for the iPad. Will the company add support for Office 365 features, such as SharePoint Online? Our research shows that out of 1,159 respondents, 51 percent personally own an iPad and 61 percent say their organizations support iPads. Still, those who don't use or plan to sign on to the subscription-based service will be disappointed Microsoft didn't release a standalone app.

What's your take on this move? If you test the app on either your iPhone or iPad, please give us your thoughts below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 06/14/2013 at 1:15 PM0 comments


Intel Targets New SSDs for Mainstream Datacenter Storage

Intel believes its latest solid-state drives are ready for conventional use in datacenters. The company yesterday took the wraps off its S3500 Series SSDs, which it says can be used for apps that require high-read performance.

"Our third generation of products we believe has the right features, the right cost points, the right capability to unleash SSDs into the mainstream of the datacenter," said Rob Crooke, corporate vice president and general manager of Intel's Non-Volatile Memory Solutions Group, speaking at a press conference in New York. "We're getting increasingly broad capability in our solid-state drive product family both in the breadth of the products as well as in features within those products."

Intel's new S3500 Series drives can run in servers and storage racks, and are available in 1.8-inch and 2.5-inch configurations, ranging in capacity from 80 GB to 800 GB. Dell, Hewlett-Packard and IBM have said they will offer the new drives as options for their server lines, Crooke said, adding that Intel worked closely with HP on the design of the new SSDs. Intel's suggested pricing is $115 for a 1.8-inch 80 GB drive and $979 for a 2.5-inch 800 GB SSD.

This third generation of SSDs from Intel is the first based on its 20-nanometer NAND technology and boasts 75,000 read IOPS and 11,500 write IOPS. Crooke said the drives have a maximum read latency of 500 microseconds 99.9 percent of the time, are rated at 2 million hours mean time between failure and include built-in 256-bit encryption.

While Crooke said the SSDs lend themselves well to systems running big data applications as well as cloud operations, they're also well-suited to everyday systems where I/O performance is critical. Crooke said that by replacing the direct-attached SAS (Serial Attached SCSI) and SATA hard drives in a server running Microsoft Exchange with SSDs, the mail server can shrink from 6U to 2U with a 6x improvement in responsiveness, while reducing the amount of datacenter space needed by 60 percent and requiring 80 percent less power.

"We actually improve the performance and lower the overall TCO of something as simple as the mail server," Crooke said.

So Intel says SSDs have arrived for the datacenter. Are you ready to opt for SSDs over hard disk drives for your everyday Windows Server-based systems?

Posted by Jeffrey Schwartz on 06/12/2013 at 1:15 PM0 comments


Google Chromebooks Find Niche: Education

Google hasn't lit the world on fire with its Chromebooks -- client devices based on the company's Chrome OS. But some proponents of Chromebooks believe they are a viable alternative to Macs, PCs and tablets.

Indeed only 1 percent of those responding to Redmond magazine's 2013 Readership Survey say they envision replacing their Windows 7 PCs with Chromebooks. Nevertheless that's the same percentage who say they'll replace them with Macs. The only option that fared worse was Windows RT, the stripped-down version of Windows 8 that features only the "modern" UI.

Those who have shown enthusiasm for Chromebooks are IT decision makers at schools, and many are already piloting and deploying them. Lisa DeLapo, director of technology at St. Joseph School, part of the Roman Catholic Diocese in Oakland, Calif., last week explained why she and her team chose Chromebooks.

"Whether they're telling stories of famous heroes using Google Sites, making group study guides with Google Forms, or listening to voice comments on their science fair projects in Google Docs, our students learn more from creating than they ever could from only consuming information," DeLapo explained in a post on the Google Enterprise Blog. "We have found Chromebooks to be the perfect tools -- they're portable and easy to use, have a keyboard and a large screen and are secure."

In addition to using Google Apps with the Chromebooks, the school uses Pearson PowerSchool, an app for tracking grades, test scores and attendance as well as collaborating with parents. Because the app is Java-based, and Java isn't supported on Chromebooks, she needed to find a way to access it securely from the Chromebook.

DeLapo came across the Chrome RDP app by Fusion Labs, which provides access to any Windows desktop or server directly from the Chrome browser. "Since it uses Microsoft's native Remote Desktop Protocol, no additional configuration or setup is needed after you install the app," she noted. "It gives us secure access to PowerSchool and other legacy applications, and it's straightforward for teachers to use. They download the Chrome RDP app from the Chrome Web Store, open up the app, and enter their login information for secure access to PowerSchool through the school's firewall."

Back in February, when HP became the latest PC vendor to plunge into the Google Chromebook market, I raised the following question: Are Chromebooks a wild card that could make a dent in the Windows PC market moving forward? I subsequently received an e-mail from Paul Jones, who is involved in national education issues. He said that Chromebooks may gradually have more impact than critics and naysayers think. Jones, who permitted me to share his views, said the greatest percentage of Chromebooks are going into school districts. And he believes this is a "huge" percentage.

"What Google is doing is creating the 'next' generation of device users -- kids like my daughters will grow up being educated with Google and Chrome," Jones said. "My girls already have more expertise in Chrome than most technology centric critics.  When they reach adulthood, Google will have captured them as die-hard Google fans; kids who have grown up with Google, and trust Google with their technological needs." Thanks Paul for reaching out.

There is certainly a lot to be said for the impact the youngest of users will have on the future of computing devices that will populate the desktops and knapsacks of the next generation of enterprise workers. Still, I expect a hard-fought battle by Apple and Microsoft for that audience as well.

Posted by Jeffrey Schwartz on 06/12/2013 at 1:15 PM0 comments


IT Not Supporting Windows 8 Tablets (Yet)

Perhaps it will come as little surprise but IT pros say their organizations readily support iPads much more than tablets running Windows 8. The numbers are 61 percent and 15 percent respectively. This is according to 1,178 Redmond magazine readers who responded to our just-published 2013 Readership Survey. Only 9 percent support Windows RT -- a version designed for ARM-based tablets that only supports Microsoft's modern apps and can't join Active Directory domains.

Now keep in mind, this wasn't an either-or question. iPads started shipping more than three years ago and are hugely popular, while Windows 8/RT tablets only became available in late October and haven't enjoyed the same success so far. The number of Windows IT pros who expect to support Windows 8 tablets a year from now will rise to 39 percent, and to 18 percent for Windows RT, those same respondents said, while 63 percent expect to manage iPads.

The survey showed that 32 percent currently support Android-based tablets, a number that will rise to 41 percent in a year. Nearly a third (30 percent) don't support any tablets -- a figure that's expected to decline to 19 percent.

Market trends predict tablets outpacing PCs in terms of demand. IDC's latest forecast shows tablets will outship PCs within two years. The Redmond magazine readership doesn't reflect that view, but this is hardly surprising. You're IT pros managing the infrastructure for business or public sector employees, who are still more likely to do their work on some form of desktop or notebook device. Increasingly the trend will shift to hybrid devices that function as both. But most don't see a pure tablet replacing a fully functional computing device. I share that view.

But those same employees will increasingly use their tablets and smartphones more when they don't have access to their PCs. And it will be in the interest of IT organizations to support these devices. Microsoft sees that trend, which is why some of the next versions of Windows Server, System Center and Windows Intune, announced at last week's TechEd conference in New Orleans, will offer more secure mobile device management and emphasize what Microsoft is now calling "people-centric IT."

Today the outlook for Windows 8 and its successor release is uncertain at best. But neither was Windows 3.0 when Microsoft introduced it over two decades ago. The challenge for this next generation of Windows is much higher with much different market dynamics. Nevertheless it's too early to dismiss Microsoft's staying power. And there's another key variable. One thing that hasn't changed is IT pros -- like all people -- don't embrace change overnight.

 

Posted by Jeffrey Schwartz on 06/10/2013 at 1:15 PM0 comments


Microsoft Denies Participation in Government Surveillance

Revelations yesterday that telecommunications carriers and key technology providers including Apple, Facebook, Google and Microsoft share information with the intelligence community have put the Obama Administration and Congress on the defensive. But Microsoft yesterday sought to assure critics that its participation is limited to data it is legally compelled to hand over.

"We provide customer data only when we receive a legally binding order or subpoena to do so, and never on a voluntary basis," the company said in a statement. "In addition we only ever comply with orders for requests about specific accounts or identifiers. If the government has a broader voluntary national security program to gather customer data we don't participate in it."

At the same time, leaked PowerPoint slides suggest Microsoft may have been the first IT player to participate in the program, called PRISM, as early as 2007.

In a press conference this morning from San Jose, Calif., the president defended the program, noting it was authorized by Congress. "What the intelligence community is doing is looking at phone numbers and durations of calls, they're not looking at people's names, and they're not looking at content," Obama said. "But by sifting through this so-called metadata, they may identify potential leads with respect to folks who might engage in terrorism."

Obama went on to say if the intelligence community actually wants to listen to a phone call, they must get the approval of a federal judge in a criminal investigation. In addition to Congress, he pointed out the FISA court is overseeing this effort. The FISA court evaluates classified programs to make sure that the government is acting in step with the law and Constitution, he noted.

"With respect to the Internet and e-mails, this does not apply to U.S. citizens and people living in the United States," he added. "In this instance, not only is Congress fully apprised of it, the FISA court has to authorize it."

Privacy experts are outraged, arguing the government is overreaching and has the potential to abuse people's privacy. Others argue that the only way to combat terrorism is to take these measures.

Regardless of where you stand on the issue, that's a discussion to be held elsewhere. The interesting component from an IT perspective is how the intelligence community is using metadata. With a whole new class of technology aimed at analyzing big data, it further highlights how software companies of all sizes are bringing to market new tools to gain intelligence from huge amounts of data, as I recently reported.

From the administration's perspective, it's unfortunate this came to light at all, and the timing of the revelation could hardly have been worse. The president has a long-planned meeting with Chinese President Xi Jinping in California this afternoon to discuss a variety of trade issues.

Clearly on the list of topics discussed during that meeting will be China's brazen efforts to hack into the networks and systems of government agencies, including those holding classified information, as well as recently reported attacks on commercial systems. Those hit include some of the largest U.S. banks, utilities and media companies.

Certainly today's disclosures will make that a... well, let's just say a more awkward conversation. Hopefully the president will find a way to convince Xi that he won't tolerate the persistent attacks and cyber-spying, which have wide-ranging consequences for all of our systems.

Posted by Jeffrey Schwartz on 06/07/2013 at 1:15 PM0 comments


TechEd 2013: Windows Azure AD Serves Up Big Numbers

In the first two months since Microsoft released Windows Azure Active Directory, it has processed 265 billion authentication requests from around the world -- or 9,000 requests per second -- while customers have created 420,000 unique domains.

Brad Anderson, Microsoft's corporate vice president for Windows Server and System Center, revealed those stats in his keynote address at TechEd 2013 in New Orleans, which kicked off Monday and runs through tomorrow.

"Everything starts with the identity of that user inside of Active Directory," Anderson told TechEd attendees. "We've now cloud optimized Active Directory with Windows Azure Active Directory, so now we can extend your capabilities of Active Directory to the cloud with you in complete control about what you want to have appear inside that Azure Active Directory."

Microsoft released Windows Azure Active Directory in early April following a nine month preview and is offering it free of charge. It's the same directory users authenticate with to access Office 365, Windows Intune and now Windows Azure. Prior to the release of Windows Azure Active Directory, Windows Azure users had to authenticate with their Live IDs, which Microsoft is now phasing out in favor of what it generically calls the Microsoft account.

While administrators in organizations of all sizes can now synchronize identities in Windows Server Active Directory with Windows Azure Active Directory using Microsoft's DirSync, there are limitations. At the recent Visual Studio Live! conference in Chicago, Windows Azure MVP Michael Collier, who is a cloud architect at Aditi Technologies, warned developers that Windows Azure Active Directory doesn't support the management of devices, printers or Group Policy. "It's more targeted around users, authentication and properties for those users," Collier said during a talk on Windows Azure Active Directory.
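
For admins who want to verify what DirSync is actually pushing to the cloud directory, the Azure Active Directory Module for Windows PowerShell (the MSOnline cmdlets) offers a quick sanity check. The following is a minimal sketch, not a tested recipe -- confirm the property names against the module's documentation before relying on them:

    # Connect to the Windows Azure AD tenant (prompts for an administrator credential)
    Connect-MsolService

    # Confirm directory synchronization is enabled and when it last ran
    Get-MsolCompanyInformation | Select-Object DirectorySynchronizationEnabled, LastDirSyncTime

    # List accounts that were synchronized from on-premises Active Directory;
    # cloud-only accounts have no LastDirSyncTime value
    Get-MsolUser -All | Where-Object { $_.LastDirSyncTime } |
        Select-Object UserPrincipalName, LastDirSyncTime

Note that this only inspects users and their properties; as Collier points out, devices, printers and Group Policy objects never make the trip.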

"You're not going to enforce Group Policy today with Windows Azure Active Directory, added Eric Boyd, also a Windows Azure MVP and CEO of Chicago-based responsiveX. "You don't join your machines in your domain to a Windows Azure Active Directory like you do an Active Directory on premise," Boyd explained.

While customers have indicated they'd like to see Group Policy in Windows Azure Active Directory, Boyd is urging them not to expect it anytime soon. "There are certainly challenges with doing that, if that's the only source of authentication for your company," he said.

In an interview with Microsoft's Anderson, I asked what the future holds for Group Policy in Windows Azure Active Directory, since it's a topic that comes up frequently in interviews. "With that cloud-optimized mobile device management solution you get Group Policy-like capabilities like setting your network and your wireless settings and setting a power-on password encryption," Anderson said. "Think about Azure Active Directory, Windows Intune, as well as Office 365, really driving the move toward these software-as-a-service [aspects] delivered from Azure with capabilities like lightweight policy management coming with Windows Intune."

Lightweight policy management in Windows Intune is one thing I pressed him on -- specifically, whether the full Group Policy available on premises would come to Windows Azure Active Directory. His response: "I see doing a much more light version of Group Policy but right now we're delivering that through Windows Intune," he emphasized. "So think about these things as all inter-related and things we are building on together. So as we think about Azure Active Directory and Intune, we're doing common planning and engineering milestones across those two things."

I'll take that as a maybe. How does Windows Azure Active Directory fit into your enterprise identity management? Drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 06/05/2013 at 1:15 PM0 comments


TechEd 2013: SQL Server 2014's Big Data and Memory Focus

Microsoft announced a major new version of its flagship database, SQL Server, that aims to extend support for big data and the in-memory processing capability added to the current version. It was one of a slew of new wares Microsoft launched at its annual TechEd conference, which kicked off this morning in New Orleans.

SQL Server 2014 is the official name for the next version of Microsoft's relational database management system, which will take advantage of Windows Azure in hybrid cloud scenarios. It contains the built-in Hekaton technology, which converts tables so they run in memory. Other improvements in SQL Server 2014 include improved backup and availability, said Quentin Clark, a Microsoft VP, on stage during the keynote presentation.

Improved in-memory processing capabilities will support online transactions in near real-time, Clark said. An early tester of the forthcoming SQL Server 2014 is Edgenet, a software-as-a-service (SaaS) provider of inventory management for retailers, Clark said. Retailers often update their inventory management systems using batch processes, meaning that if a customer wants to know whether an item is in stock, the answer may or may not be up to date.

Using the in-memory capabilities to process transactions, they can refresh that data in near real-time, Clark explained. "We're doing it because it's a way to achieve unprecedented latency and scale, low-latency and high scale and throughput for transactional data," he said.

A SQL Server 2014 preview is slated for release later this month, according to Microsoft's SQL Server Blog. Microsoft indicated it will ship shortly after the newly announced Windows Server 2012 R2 and System Center 2012 R2, both due out by year's end. That would imply SQL Server 2014 will likely arrive early next year.
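
To make the Hekaton piece concrete, here is a minimal sketch of what declaring a memory-optimized table looks like, driven from PowerShell with the Invoke-Sqlcmd cmdlet that ships with the SQL Server client tools. The server, database, table and bucket-count values are placeholders, and the database needs a MEMORY_OPTIMIZED_DATA filegroup before the statement will succeed:

    # Load the SQL Server PowerShell module
    Import-Module SQLPS -DisableNameChecking

    # T-SQL for a memory-optimized (Hekaton) table with a hash primary key
    $ddl = "
    CREATE TABLE dbo.ProductStock
    (
        ProductId INT NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
        Quantity  INT NOT NULL,
        UpdatedAt DATETIME2 NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
    "

    # Run the DDL against a SQL Server 2014 preview instance
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "Inventory" -Query $ddl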

 

Posted by Jeffrey Schwartz on 06/03/2013 at 1:15 PM0 comments


TechEd 2013: Windows Server 2012 R2 and System Center 2012 R2 Unveiled

Microsoft outlined plans for Windows Server 2012 R2 and System Center 2012 R2, saying they provide parity between its datacenter infrastructure software and its public cloud portfolio. The upgraded server OS and systems management platform are also designed to help IT better manage user-provided devices by providing extended mobile device management capabilities.

The company, which announced the upgrades at its annual TechEd conference in New Orleans this morning, said it will release previews of the R2 versions of Windows Server 2012 and System Center 2012, which are slated for general availability by year-end. Also planned is an upgrade to its cloud-based Windows Intune systems management service. Microsoft plans to upgrade its entire portfolio of datacenter software and services, as reported by my colleague Kurt Mackie.

Talking up the so-called bring your own device (BYOD) trend, Microsoft corporate VP Brad Anderson, joined on the TechEd 2013 keynote stage by principal program lead Molly Brown, said Windows Server 2012 R2 will let administrators register all user devices by creating a "workplace join" to Active Directory.

In a demo, Brown showed how an administrator can join a Windows 8.1 device to a domain. IT can use a new "Web application proxy," which is tied to Active Directory, to publish corporate resources for access by end users. Admins can also register a device with ADFS. Windows Server 2012 R2, combined with System Center 2012 R2 Configuration Manager, supports enabling a two-factor authentication security feature called "Windows Active Directory Authentication," which is based on the technology Microsoft acquired from PhoneFactor. The new R2 releases let administrators verify the identity of a user by initiating a phone call to a mobile device as a secondary security precaution.
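
For a sense of what publishing through the Web application proxy looks like in practice, here is a rough sketch using the PowerShell cmdlets Microsoft has shown for the role. Treat the URLs, names and certificate thumbprints as illustrative placeholders rather than a tested recipe, and note that the proxy must first be configured against an AD FS farm:

    # Add the Web Application Proxy role service on a Windows Server 2012 R2 box
    Install-WindowsFeature Web-Application-Proxy -IncludeManagementTools

    # Publish an internal web app externally, with AD FS handling pre-authentication
    Add-WebApplicationProxyApplication -Name "Expenses" `
        -ExternalUrl "https://expenses.contoso.com/" `
        -BackendServerUrl "https://expenses.corp.contoso.com/" `
        -ExternalCertificateThumbprint "<thumbprint of the external SSL certificate>" `
        -ExternalPreAuthentication ADFS `
        -ADFSRelyingPartyName "Expenses"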

Windows Server 2012 R2, with the new System Center 2012 R2 upgrade, will also include a feature called selective wipe, which will let administrators remotely remove enterprise applications and data from a user-owned device without deleting any data or apps owned by the user.

Another new feature coming to Windows Server 2012 R2 is Work Folders, which lets users store data on a device, replicate it to the file server and push it out to other registered PCs and devices. With this feature, data is encrypted both on the devices and on the server, Anderson said. "That is truly enabling what we call 'people centric IT' and enables you [to] make your users productive on all their devices."
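
On the server side, Work Folders is exposed as a sync share. As a rough sketch -- the feature and cmdlet names here reflect what Microsoft has shown for the Windows Server 2012 R2 preview, and the path and group are placeholders -- setting one up looks something like this:

    # Add the Work Folders role service to a Windows Server 2012 R2 file server
    Install-WindowsFeature FS-SyncShareService

    # Create a sync share for a group of users, requiring encryption and a
    # lock-screen password on the devices that sync it
    New-SyncShare -Name "WorkFolders" -Path "D:\WorkFolders" `
        -User "CONTOSO\Sales Users" `
        -RequireEncryption $true -RequirePasswordAutoLock $true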

Posted by Jeffrey Schwartz on 06/03/2013 at 4:59 PM0 comments


TechEd 2013: New Azure Services Coming to Windows Server 2012

Looking to flesh out its so-called Cloud OS vision, Microsoft revealed plans to offer the new Windows Azure Pack, which brings the portal interface of Microsoft's cloud service to Windows Server and System Center.

In a keynote address this morning that kicked off the annual TechEd 2013 conference in New Orleans, corporate VP Brad Anderson emphasized that the latest Hyper-V release, introduced in Windows Server 2012 last year, is the same hypervisor offered in its Windows Azure cloud service.

"Consistency across clouds is one of the things you should absolutely add to the top of your list as you're looking at your cloud decisions," Anderson said. "If things are consistent, if you have the same virtualization, same management, the same developer interfaces, you have the same identity and you have consistency across data, what that allows you to do is you can just move VMs and applications across clouds, no converging, no migration, no friction."

With the Windows Azure Pack, administrators can let lines of business configure and consume capacity, Anderson explained. It also enables high-density Web hosting in the datacenter, allowing organizations to deploy 5,000 Web sites on a single Windows Server instance just as they do in Windows Azure. Features such as the Windows Azure Service Bus will also run on this new pack. And Microsoft will offer a set of APIs that let developers build apps and deploy them on premises or in the cloud.

"We're hardening [these] in Azure, and then through the Windows Azure Pack we deliver that and it just drops right on top of Windows Server and on System Center," Anderson said.

Posted by Jeffrey Schwartz on 06/03/2013 at 1:15 PM0 comments


Microsoft Says Start Button Will Return (with a Catch)

Microsoft yesterday made official one of the worst-kept secrets in the world of Windows these days -- that it's bringing the Start button back when it releases Windows 8.1 later this year. Moreover, users will have the option of booting up to the traditional, or classic, Windows desktop rather than the tile-based modern user interface that defines Windows 8.

But these two changes come with a bit of a caveat. First, the Windows 8.1 Start button will be there, but it will use the Windows flag rather than the old round button used in earlier versions of Windows. And by default, clicking on it will take you to the tile-based interface. In order to have it launch the Windows menu, a one-time configuration change is required, The New York Times' Nick Wingfield reports.

Likewise, Windows 8.1 by default will still boot to the tile-based interface and you'll need to configure it to boot to the traditional desktop. Not a big deal but I anticipate grumbling by some already.

The original decision to remove the Start button when releasing Windows 8 angered more than three quarters of Redmond magazine readers, generating more response than any other issue in recent memory. And that response was emotionally charged on both sides of the argument.

Those angered by the move felt Microsoft was making it harder to use Windows in the traditional desktop mode for no beneficial reason. But I heard from plenty of IT pros who felt Microsoft's decision to remove the Start button was necessary to wean users away from it and toward the tile-based touch metaphor. Change happens and we need to adapt to it, the argument went.

But Microsoft failed to realize many business users who have no current need or interest in the tile interface -- and more importantly use machines that are not touch-capable -- resented the removal of the Start button. I happen to like the tile interface. But for applications that run only in the traditional desktop mode, I didn't see the benefit of learning new habits. Nor do I see why leaving it in would discourage me from using the tile interface, which seemed to be Microsoft's concern. Microsoft apparently realized that it was premature to remove the training wheels. I think it was a sensible concession with no downside. Do you?

Posted by Jeffrey Schwartz on 05/31/2013 at 1:15 PM0 comments


Is IT a Barrier to Enterprise Social Networking?

A growing number of employees want to use social networking to improve collaboration, productivity and knowledge sharing. But within some organizations IT is standing in their way. That's the conclusion of a study released by Microsoft Tuesday, which found IT blocking the adoption of social networking in many workplaces.

Thirty percent of those surveyed said IT organizations are putting restrictions on the use of social networking for business use, while 77 percent said they want better collaboration tools and 31 percent are willing to spend their own money to get the tools they want. The survey, commissioned by Microsoft and conducted by Ipsos, is based on a sample of nearly 10,000 individuals in organizations with 100 or more employees in 32 countries.

Microsoft of course has taken a keen interest in enterprise social networking following last year's $1.2 billion acquisition of Yammer and the addition of social networking features to its SharePoint collaboration platform. Microsoft this summer will begin to integrate the features of Yammer into SharePoint, Office 365 and ultimately across its various product lines. Naturally Microsoft would like to see more universal buy-in from IT.

"The tension and rift we're seeing is fascinating," said Brian Murray, director of enterprise strategy at Yammer, now part of Microsoft's Office group. "Many end users don't believe their employers recognize the value of social tools. We are urging those responsible for provisioning technology to look at the merits, listen to employees and look into provisioning these technologies on a wider scale."

Security concerns were by far the leading reason why survey participants believed IT was reluctant to support enterprise social networking, according to 68 percent of those responding. The belief that social networking would hinder productivity was the number two reason, 58 percent said. Those were the two key fears, with a much smaller percentage naming such issues as HR concerns, risk of harming corporate image and data loss.

Responses varied significantly by country. It turns out those in China -- an overwhelming 84 percent -- believe social networking greatly or somewhat improves productivity, while at the other extreme only 24 percent of respondents in the Netherlands believed it provided such gains. In the U.S. only 34 percent are bullish on its benefits, while 46 percent of the worldwide sample believe so.

Does this portend challenges ahead for Microsoft and others such as Salesforce.com, which offers the competing enterprise social networking tool Chatter? For its part, Microsoft said it added 312 new Yammer customers last quarter with a 259 percent increase in revenues.

But Microsoft believes social networking needs to be used more strategically, a conviction shared by others with skin in the enterprise social networking game. "Just putting Yammer into your environment doesn't make it successful," said Gail Shlansky, director of product management at Axceler, a provider of SharePoint management tools, which is adding Yammer governance to its ViewPoint management tool. "It takes advocates, it takes telling the resources in the enterprise who are going to use it into being part of that team."

Are you concerned about the use of enterprise social networking tools in your organization? Has your shop given it the green light? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 05/29/2013 at 1:15 PM0 comments


HP CEO Whitman Says PC Demand Will Bounce Back

When Hewlett-Packard CEO Meg Whitman was asked by CNBC Thursday where she saw the bottom when it comes to declining PC sales, she didn't quite say. But Whitman argued there are 140 million desktops and notebooks in use that are more than four years old. While acknowledging tablets might be growing faster, she said users will ultimately need full-fledged systems as well.

"People are still doing real work on their laptops, and tablets are great. But if you're doing Office or Excel or real creation, it's really hard to do that on your tablet," Whitman said the morning after reporting earnings for its second fiscal quarter.

"We're investing in hybrids where you have a laptop plus a tablet, where the screen comes off, so it's all in one. I think that's the direction the industry will go," Whitman said. "My view is people will want to create, they want to consume, they want to share, and I think that form factor will have a future." But she acknowledged: "The growth is tablets and smart phones, we're in to a whole new era in personal computing, there's no question about it."

Whitman said Windows-based PCs are no longer the only game in town. But nonetheless, she re-affirmed HP's commitment to the WinTel platform. The company yesterday also unleashed a new line of touch-based Windows 8 PCs, including what it describes as a mobile all-in-one desktop. At 12 pounds, the new ENVY Rove 20 isn't something you'd likely take on the road, but users can move it from room to room.

Overall Whitman was in a better mood than usual yesterday. While HP's year-over-year revenues plummeted last quarter, the company stunned Wall Street with much better than expected earnings, leading to a 17 percent rise in its share price Thursday. HP became an instant darling after reporting a surprise $3.6 billion in cash flow from operations, up 44 percent.

Despite the jubilation, things still aren't exactly hunky-dory at HP's Palo Alto, Calif. headquarters, where personal systems revenues declined 20 percent, enterprise systems were down 10 percent, the services business dropped 8 percent and software was 3 percent lower.

As Whitman reiterated, HP is 18 months into a five-year turnaround. At this point, things are slightly ahead of schedule, but she's not celebrating yet. "I feel good about the growth prospects for 2014."

Posted by Jeffrey Schwartz on 05/24/2013 at 1:15 PM0 comments


Windows Server 2012 Waiting in the Wings

More than eight months after Microsoft released Windows Server 2012, and despite an 11 percent uptick in the company's Server and Tools business last quarter, it appears most organizations -- many of them with fewer than 1,000 employees -- have yet to upgrade to or deploy the new operating system.

A report published yesterday by Symantec reveals that while 56 percent of respondents plan to deploy Windows Server 2012, 93 percent of them have yet to do so. The survey covered 530 organizations worldwide. The notion that organizations appear to be moving slowly to Windows Server 2012 is not surprising. Typically IT decision makers are conservative about introducing new platforms into their datacenters, and there's no reason Windows Server 2012 should diverge from that pattern. Nevertheless, IDC in February reported that Windows Server hardware sales grew 3.2 percent overall in the fourth calendar quarter of last year, though it's unclear how much of those sales are systems with Windows Server 2012. I have a query into Microsoft and will update this blog if I receive an answer.

Symantec's survey found of those planning to move to Windows Server 2012, 13 percent are waiting for the first service pack, 15 percent will do it within six months, 17 percent within a year and 11 percent plan to wait longer.

"It's going to be a wait and see," said Susie Spencer, a senior product marketing manager at Symantec. "Those that are more risk takers will jump on board pretty soon but those wanting to make sure everything is tested and tried before they implement it in their organization are going to wait before they move to Windows Server 2012."

For those planning to move to Windows Server 2012, the key reason for doing so is to run SQL Server (67 percent), followed by Active Directory domain controllers (61 percent), Exchange (58 percent), file and print servers (54 percent) and finally SharePoint (52 percent).

The key new capabilities respondents are seeking in Windows Server 2012 are improved VDI, Hyper-V virtualization and the new Resilient File System (ReFS), according to the survey. PowerShell improvements and refined disk de-duplication followed closely, Spencer said.

The survey also found that only 18 percent have more than three-quarters of their IT environments virtualized, while 52 percent plan to be completely virtualized within two years. A survey of Redmond magazine readers found only 12 percent have upgraded to System Center 2012, which, combined with Windows Server 2012, forms the underpinnings of Microsoft's so-called Cloud OS strategy. The good news I'm hearing from various third parties is that improvements to Hyper-V in Windows Server 2012 have for the first time made the hypervisor a viable alternative to the VMware stack.

What are your plans in upgrading to Windows Server 2012 and what do you find most compelling about the new OS and the Cloud OS story? Are you waiting or hoping Microsoft will have more to say about its roadmap at next month's TechEd 2013 conference in New Orleans? Feel free to comment or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 05/22/2013 at 4:59 PM0 comments


Microsoft Ad Finally Describes What Surface Does

Microsoft's Windows 8/RT-based Surface devices are pretty slick and attractive to those who want both a tablet and a PC in one lightweight, ultra-modern unit. Yet most people are unaware of the benefits of the device despite a six-plus month ad blitz that's hard to miss. The commercials, slick in their own right, don't explain what a potential customer can do with a Surface.

I know many people who don't follow technology closely but nevertheless use it. Quite a few have no clue what the Surface does and Microsoft's commercials have done little to advance its cause. Microsoft today is finally acknowledging this.

"As much as we love the ads and others have, they've also been a bit polarizing," Michael Hall, general manager for Surface at Microsoft admitted in a blog post. "Some of our biggest Surface fans have been downright angry (ANGRY!) that we aren't talking about more of what it actually does!"

It appears Microsoft has finally gotten the message that if it wants to convince customers to look at the tablet hybrid, perhaps it should tell them why. The company's new ads do just that. The spots implicitly highlight what you can't do with an iPad by showing off the ability to run Office, the keyboard and support for peripherals via the SD card and USB ports.

Perhaps these new commercials will help move the needle a bit on Surface sales. But I suspect until the next generation of devices hit the market, presumably with Intel's next-generation Haswell processors, it may remain a tough sell. But it's harder to convince people to buy something if they don't know what it is. Hence the new ads are a step in the right direction.

Posted by Jeffrey Schwartz on 05/20/2013 at 1:15 PM0 comments


Yahoo Beats Out Microsoft To Nab Tumblr

In a defining moment for Yahoo, the company today said it has agreed to acquire the popular micro-blogging site Tumblr for $1.1 billion in cash. Rumors had swirled for several days that Yahoo was the leading contender to acquire Tumblr, though Facebook and Microsoft were also reported to have engaged in serious talks to acquire the company.

It's hard not to see just a bit of irony in the fact that Yahoo bested Microsoft for Tumblr, given Microsoft's failed bid to acquire Yahoo in 2008 and the current search partnership between the two companies. Of course the latter is less unusual given that partners almost always compete at the same time.

It's not clear how badly Microsoft wanted Tumblr, a company that boasts 300 million unique visitors monthly, 120,000 new accounts each day and 900 posts per second. Half of those posts are from mobile devices that conduct an average of seven sessions per day, and Tumblr says its collective audience spends 24 billion minutes on its site monthly. It seems Microsoft is itching to do some kind of deal, coming off reports last week that it's also weighing buying Barnes & Noble's Nook business.

In unusual verbiage for an official announcement on an investor relations site, Yahoo said the two companies agreed "not to screw it up" by deciding to keep Tumblr as a subsidiary that will continue to be led by its founder and CEO David Karp. For Yahoo the deal is a big bet. While Tumblr has a huge audience, its revenues are not so enormous -- a paltry $13 million. That's because it has eschewed advertising -- something Yahoo will undoubtedly change.

Microsoft has a vested interest in Yahoo's overall well-being, at least for now. A report in The Wall Street Journal two weeks ago said Yahoo may want out of its deal with Microsoft to use Bing as its search engine. According to that report, Yahoo CEO Marissa Mayer is unhappy with the fact that its revenue per search with Bing is lower than it was when Yahoo ran its own engine.

Mayer's former employer, Google, is waiting in the wings, according to the report, though Microsoft is said to have no interest in letting Yahoo out of the deal before it expires in 2020. One source close to the contract said the earliest Microsoft might let Yahoo walk is 2015.

As for Tumblr, what's your take on Microsoft's alleged interest in acquiring a blogging site? Should Microsoft go down that path, either via acquisition or organic development or should it leave that to the likes of Facebook, Twitter and WordPress?

Posted by Jeffrey Schwartz on 05/19/2013 at 1:15 PM0 comments


Google CEO Larry Page Lashes Out at Microsoft

The rivalry between Google and Microsoft took on a new twist this week as the two companies accused each other of providing selective interoperability. While that's nothing new, Google CEO Larry Page on Wednesday fired off a string of barbs in a surprise appearance at the company's annual developer conference, where he accused Microsoft of impeding interoperability.

During his unscheduled presentation at Google I/O, Page criticized Microsoft for integrating the Gmail chat/IM service into Outlook.com but not letting Google do the same with its service.

"The Web is not advancing as fast as it should be," Page said. "We struggle with companies like Microsoft. We would like to see more open standards and more people involved in those ecosystems."  A day later, Google slapped Microsoft with a demand that it remove its Windows Phone 8 YouTube app from the Windows Store for violations of the company's terms of service.

Is there a double standard here, or does Google have a legitimate beef that Microsoft is tinkering with the app by blocking ad content while failing to reciprocate on interoperability with its IM service? The reality is, when it comes to interoperability, it's not hard to find both companies (actually most IT players) guilty of hypocrisy.

Nevertheless, Microsoft continues to make strides in its support for platforms not born in Redmond, most recently by extending support for Android and Apple's iOS. While that hasn't led to a version of Office for those mobile platforms as many would love to see, I saw first-hand at this week's Visual Studio Live! conference in Chicago, produced by Redmond magazine parent company 1105 Media, that Microsoft has stepped up its tooling for those mobile environments, including its Windows Azure Mobile Services offering.

Suffice to say, this isn't altruism on Microsoft's part. Android and iOS are by far the most widely used platforms for phones and tablets these days. Google revealed that 900 million devices worldwide now run Android, up from 400 million at this time last year and 100 million two years ago, said Sundar Pichai, Google's senior vice president for the Android, Chrome and Apps teams, in his keynote address at the developer conference.

So it's in Microsoft's interest to reach those users lest it provoke further erosion of the Windows and Office platforms. At the same time, Page shouldn't talk out of both sides of his mouth. Instead of effectively saying "can't we all love one another," if he wants interoperability, he might want to refine his approach. It all boils down to selective interoperability and this week's crossfire is only the latest chapter.

Posted by Jeffrey Schwartz on 05/17/2013 at 1:15 PM0 comments


Windows 8.1: What's in a Name?

Yesterday Microsoft released some of the first details of its Windows 8 upgrade, code-named Windows "Blue."

Here's what we know now: The update has the official name of Windows 8.1. And we also know that Windows 8 and Windows RT users won't need to shell out any more money for the update, which may or may not bring back the Start button.

We also have this vague description from Microsoft's Tami Reller on what the offering will be: "Windows 8.1 will advance the bold vision that we set forward with Windows 8 to deliver great PCs and tablets with an experience that does allow you to simply do more," she said during a presentation yesterday.

Even with Reller's presentation, it's still unclear what Windows 8.1 is. In the past, when Microsoft rolled up updates and new features for its OS, these used to be called service packs. But seeing that this will be called Windows 8.1 and not Windows 8 SP1, one might be inclined to believe that this is not the same thing -- that it's something different than a service pack.

However, this is Microsoft -- the company that avoids product name consistency like the plague.

And the few glimmering details we do have (either by rumor or directly from Microsoft) do make it sound like Windows 8.1 will just consist of rolled-up updates and new features. So I'm leaning toward this being more Microsoft naming shenanigans and an attempt by the company to jazz up what is essentially a service pack.

This is a rare case where the naming switch does make sense, if only to get the public to take another look at Windows 8.

Let's face it -- Microsoft is desperate to move some Windows 8 units. It clearly sees that releasing something called Windows 8 Service Pack 1 won't light a fire under an apathetic consumer base. And calling it the Windows 8 "do-over" doesn't really work either. So what can Microsoft name this release to convey to the public that it's a new and improved Windows 8?

Luckily, Apple has already laid the groundwork. For more than 10 years now, Apple has been rolling out new updates to its OS X operating system with a yearly refresh that, in some cases, has been drastically different than the previous year's. But instead of calling it a new OS, Apple sticks a new version number on it (along with throwing a large cat code name on it).

We've been conditioned to understand that Apple's latest version upgrade, OS X v10.8, is almost a completely different product than its previous OS X v10.7 because of a change in one decimal number.

I could see Microsoft wanting to take a page from the market branding that Apple has already established, especially since Redmond desperately needs some help in this department. And it does want to convince consumers that Windows 8.1 is worth giving a shot, even among those who gave vanilla Windows 8 a shot and have already written it off.

However, a name that consumers can get behind can only carry you so far, and a product rarely lives or dies by its brand name (Ok, the Microsoft Kin may be the exception to that).

What we really need to see are the nitty-gritty details of how Microsoft has addressed the legitimate concerns users have with its latest OS. Thankfully, we won't have to wait too long for the missing pieces to drop in during next month's Build conference. And for those looking for a way in the door, Microsoft released some more tickets this morning.

What do you think, is Windows 8.1 Microsoft's attempt to spruce up the service pack name? Let me know in the comments below.

Posted by Chris Paoli on 05/15/2013 at 1:15 PM0 comments


Rackspace Wants the Windows Cloud Market

While Rackspace typically emphasizes that its infrastructure as a service is now based on the open-source OpenStack cloud platform, the hosting provider still has a vested and significant interest in Windows. In addition to offering SharePoint hosting and development services, many of its customers use the Rackspace cloud to run Windows-based apps. The company also points to the improved Hyper-V support in Grizzly, the latest update to the OpenStack cloud OS released last month.

Now Rackspace is trying to make it more attractive to run Windows apps in its OpenStack environment. The company earlier this month added new tools to help IT pros and developers alike build and manage .NET applications. Among them are Rackspace's PowerClient, a PowerShell-based API client for the company's public cloud, and the Rackspace Cloud SDK for Microsoft .NET.

"Now you can start seamlessly and easily integrating your applications into your Windows cloud environments," said Cole Humphreys, a Rackspace senior product marketing manager in a blog post. PowerClient is intended for those who want to build custom Microsoft Powershell-based cmdlets.

While Rackspace and OpenStack have a CLI known as NovaClient (for Nova compute), it's Python-based and designed to run natively in most Linux environments. NovaClient can work with Windows as well, but it's not optimized for that OS, wrote Rackspace sales engineer Mitch Robins in a description of PowerClient posted to GitHub.

For systems administrators and IT pros responsible for ensuring uptime, implementing tools designed to run natively in Linux can be a difficult process in Windows, he noted. "You want to be able to use natively supported and functional tools, without the potential headache of having something fail because it wasn't meant for use within the Windows eco-system."

PowerClient is designed to run on Rackspace's OpenStack cloud servers today, but Robins signaled that the roadmap calls for it to work in any OpenStack environment in the future; however, the OpenStack Foundation hasn't established a timeframe for that.

Meanwhile Rackspace is looking for .NET developers to build modern apps designed for mobile environments where the company sees much of the demand for its cloud services. A Rackspace survey found that 82 percent of IT decision makers believe mobile apps will become a standard method of accessing enterprise data and 28 percent say they will build their own enterprise app stores.

"The trend is toward more and more mobile apps and we are in the early stages," said Rackspace CTO John Engates, who also noted that 84 percent access to information at any time from any place while 55 percent said the cloud and mobile devices are enabling that.

To bring .NET developers into the fold, the Rackspace Cloud SDK for Microsoft .NET library can be installed from within Microsoft's free Visual Web Developer Express 2012 tool using its integrated NuGet extension manager.

Posted by Jeffrey Schwartz on 05/14/2013 at 4:59 PM


Could Big Data Have Prevented the Boston Marathon Attack?

The terrorist attacks of September 11, 2001 exposed the fact that, had the CIA and FBI shared data more effectively, law enforcement might have thwarted the worst attack in U.S. history. Ironically, that failure came to light just a week after Robert Mueller took over as head of the FBI.

As word quickly got out of the CIA's suspicions about the terrorists who carried out the attacks on the World Trade Center and the Pentagon, Mueller's key task as the then new FBI chief was to rectify that problem. Nearly a dozen years later, Mueller is set to retire this summer and he's confronted with the latest revelation that the suspected bombers in last month's deadly Boston Marathon attack had ties to Russian terrorists.

The suspect, Tamerlan Tsarnaev, who reportedly had extreme views, was interviewed in 2011 by the FBI, though the bureau later dropped the case. Now The New York Times points out that as Mueller's career comes to an end, while the two agencies may have improved the way data is shared, the government still has a long way to go before it can predict with better precision when a threat is about to carry out such unthinkable acts.

Perhaps it's unfair to say Mueller didn't get the job done during his tenure -- law enforcement at many levels has thwarted dozens if not hundreds of attacks in the 12 years since 2001. It may be impossible to prevent someone determined enough to initiate an attack, but in the age of IBM's Watson, predictive analytics is already making us safer.

There are numerous companies and approaches aimed at using big data to provide better intelligence. But we're not there yet. President Obama needs to find a successor who is savvy about the role big data can play in predicting the risks without overstepping and encroaching on the privacy and civil liberties of innocent citizens.

Posted by Jeffrey Schwartz on 05/10/2013 at 1:15 PM


Windows 8 Isn't a New Coke Moment

 

A number of media sites over the past day or so have compared Microsoft's failure to wow the market with its Windows 8 operating system to New Coke, the ill-fated attempt in 1985 to change the flavor of the century-old popular soft drink.

The change was so widely rejected that three months later Coca-Cola reversed course and brought back the original formula, branding the revived version Coca-Cola Classic while keeping New Coke on the shelves (it was later renamed Coke II and ultimately phased out).

Whether or not you're old enough to remember the New Coke debacle, it certainly remains a case study in how not to manage a brand. Search for "New Coke" today on Google or Bing and you'll see numerous articles and blog entries comparing Windows 8 to New Coke. However, Windows 8 is not a "New Coke moment" for Microsoft, and I'll explain why.

But first, here's the latest news on Windows 8. Tami Reller, Microsoft's chief marketing officer and CFO of Windows, yesterday revealed on Microsoft's official Windows Blog that a public preview of Windows "Blue" will debut at next month's Build conference in San Francisco, with general release slated by year's end.

Reller tacitly acknowledged Windows 8 hasn't met customer expectations. She also reported that slightly more than six months after its release, 100 million Windows 8 licenses have shipped, a figure that includes license upgrades as well as Microsoft's own Surface devices and the PCs and tablets offered by third-party partners such as Acer, Asus, Dell, HP, Lenovo, Samsung, Sony and Toshiba.

In enterprises, however, 40 percent of those customers have license agreements that allow them to downgrade to Windows 7, IDC analyst Al Gillen told The New York Times. I sent Gillen a note asking how many of those customers exercise those rights. While he didn't have hard data, he said "my sense is that it is pretty high, probably similar to or higher than Windows Vista (meaning majority of those with downgrade rights are probably exercising them). It is a classic avoidance scenario having more to do with existing momentum of Windows 7 than anything else."

In a survey of more than 1,100 Redmond magazine readers planning to replace their PCs, 69 percent said they will install Windows 7, while 18 percent will use Windows 8.

The 100 million Windows 8 licenses Microsoft reported was also in the range most analysts had assumed (or at least in that ballpark), though Microsoft raised eyebrows when it didn't provide an update upon reporting its fiscal third-quarter earnings last month. The last update Microsoft offered was in January, when it said it had sold 60 million Windows 8 licenses.

Now most observers believe Windows Blue will let users configure their PCs to boot up to the classic Windows 7-like desktop rather than to the current tile-based Windows Store interface, which requires them to touch or click on the Desktop tile. Everyone is also expecting Microsoft will bring the Start menu back to the classic desktop. None of this is certain, nor did Reller tip her hand.

Microsoft will face a loud uproar if it doesn't include these changes, and Reller all but indicated that's not what Microsoft wants. "The Windows Blue update is also an opportunity for us to respond to the customer feedback that we've been closely listening to since the launch of Windows 8 and Windows RT," Reller said. "From a company-wide perspective, Windows Blue is part of a broader effort to advance our devices and services for Microsoft."

Also part of that effort is delivering smaller devices that can answer Apple's iPad mini, Amazon's Kindle Fire and the Google Nexus 7. The Wall Street Journal last month reported Microsoft is planning a 7-inch device, based on interviews with Asian component manufacturers.

The BBC yesterday interviewed me for its daily World Business Report for a piece that framed Microsoft's moves as a "New Coke moment." When asked if I saw similarities, I said no, because Microsoft isn't doing an about-face here. Unless planets collide, Microsoft isn't scrapping its tile interface. Microsoft's likely plan is to make Windows 8 more appealing to those who want the traditional interface while allowing them to use the tile interface at their leisure or as the apps they need make it worthwhile.

As far as I can tell, the first to make the New Coke analogy was Steven J. Vaughan-Nichols, who last week maintained Microsoft should respond by "dumping the Metro interface."

I wouldn't bet on that happening, nor should it.

Posted by Jeffrey Schwartz on 05/08/2013 at 1:15 PM


U.S. Accuses China of Cyber Warfare

 

The Pentagon this week accused the Chinese military of what has long been reported: backing cyber-espionage by hackers attacking government and commercial computer systems.

It's the first time the U.S. government has unambiguously accused China of backing and engaging in protracted cyber-warfare, potentially opening a rift in relations between the two countries.

"In 2012, numerous computer systems around the world, including those owned by the U.S. government, continued to be targeted for intrusions, some of which appear to be attributable directly to the Chinese government and military," the Pentagon said in its annual report to Congress.

While the China Ministry of Foreign Affairs denied the accusations, an editorial in The New York Times said "there seems little doubt that China's computer hackers are engaged in an aggressive and increasingly threatening campaign of cyber-espionage directed at a range of government and private systems in the United States, including the power grid and telecommunications networks."

To be sure, the U.S. government has had to answer charges that it and Israel were behind the Stuxnet attacks against Iran, an effort to set back that country's alleged nuclear weapons ambitions.

But a number of major banks and government agencies have sustained crippling attacks, as this month's Redmond magazine cover story points out. President Obama in his February State of the Union address pointed to the cyber threat as a major risk to the nation's security. The warning came with an executive order mandating that government agencies share information about cyber threats with state and local governments and the private sector.

It looks like the Pentagon has upped the ante.

 

Posted by Jeffrey Schwartz on 05/08/2013 at 1:15 PM


More Women in Redmond's Exec Suite

The U.S. Senate has a pretty big problem on its hands. I'm not referring to its inability to arrive at consensus on most issues these days. The latest setback is much more fundamental: the scarcity of ladies' rooms in the Senate chamber.

Until recently that wasn't as big an issue, but with last year's elections, there's now an all-time high of 20 female U.S. senators. That's causing long lines because there are only two stalls for women near the Senate floor. I suspect it's not so bad at Microsoft's headquarters in Redmond but with Amy Hood taking over as the company's chief financial officer last week, the company now has its first woman overseeing its books. 

Like many large companies, Microsoft touts its diversity, and the company has taken a step forward with Hood's promotion.

These days, four of the 16 members of Microsoft's senior leadership team are women, meaning they account for 25 percent. Two of them, Tami Reller and Julie Larson-Green, have become the face of Windows in recent months.

Posted by Jeffrey Schwartz on 05/08/2013 at 1:15 PM


Bill Gates: Users 'Frustrated' with iPads

Apple's iPad may be a market-leading tablet, but Microsoft co-founder and chairman Bill Gates this morning dismissed its usefulness, saying such devices lack the breadth and capabilities of new devices based on Windows 8 (including the company's own Surface).

Acknowledging the iPad's dominance Gates told CNBC the following: "A lot of those users are frustrated. They can't type, they can't create documents, they don't have Office there. We're providing them something with the benefits they've seen that have made that a big category, but without giving up what they expect in a PC."

Gates was sitting next to his close longtime friend, the billionaire investor Warren Buffett, in Omaha, where Berkshire Hathaway had just wrapped up its annual meeting (Gates is on the company's board). If you want to listen to his comments but don't want to hear what he has to say about interest and currency rates, investor confidence in the wake of the three-year anniversary of the Flash Crash (has it been that long?) and his view on board governance, you can scroll to 7:22, when Gates is asked about the plummeting PC business.

In the interview, Gates also scoffed at the notion the PC's best days are over. "PCs are a big market," he said. "It's going to be harder to distinguish products, whether they're tablets or PCs. With Windows 8 Microsoft is trying to gain share in what has been dominated by the iPad-type device."

Regarding Microsoft's Surface hybrid PC-tablets, Gates said: "You've got that portability of the tablet but the richness in terms of the keyboard and Microsoft Office of the PC."

Perhaps Gates' implicit prediction that Windows will have a successful second life is right -- he's defied naysayers before -- but Windows 8 and its successors based on the new Windows Store or "Metro" style model will take time to gain acceptance, according to industry analysts and a survey of Redmond magazine's readership.

More than half, or 54 percent, said they don't have a single Windows 8 PC in their shop, according to our survey, to which more than 1,000 registered and qualified readers of Redmond magazine responded. And 37 percent say Windows 8 accounts for less than 10 percent of their PCs, presumably a handful in most cases. Over the next year, 39 percent say they won't have any Windows 8 PCs, while in two years more than a quarter, 26 percent, still won't have one. Only 8 percent say all of their PCs will be Windows 8-based in two years.

At the moment, the iPad and Android-based devices are the devices of choice. Apple delivered 19.5 million iPads, while Microsoft shipped a mere 900,000 Surface Pros and Surface RTs, market researcher IDC reported last month. Meanwhile 27 million devices based on Google's Linux-based Android operating system have shipped from a number of vendors. Windows 8 pales in comparison accounting for 1.6 million units and Windows RT just 200,000 units.

Even though Windows 8 has gotten off to a slow start, I believe it's too early to write it off. Surveys show what people are contemplating based on what they know now. When Microsoft reveals the future of Windows later this month, the company will need to convince users it has a compelling alternative. If it does, people's thinking can change in a very short amount of time. Either way, the future of Windows will take years to play out.

In addition to his enthusiasm for Windows 8, Gates gave a plug for Microsoft's emphasis on cloud computing. "The cloud is a gigantic opportunity because now there are so many things you can do in computing that just wouldn't have been possible before," he said. "You've got a lot of the top companies going to seize that opportunity."

Posted by Jeffrey Schwartz on 05/06/2013 at 1:15 PM


Can Intel's New Chief Revive the Sagging PC Business?

Intel's decision to name Brian Krzanich as its sixth CEO comes as the CPU giant is looking to breathe new life into its x86 PC architecture, among many other challenges facing the world's largest manufacturer of processors. Krzanich was an odds-on favorite to succeed Paul Otellini, who in November caught Intel's board off guard when he said he would step down three years earlier than expected.

Otellini's premature retirement left Intel's board scrambling to name a successor, with a search that included both outside and internal candidates. The company yesterday stuck with tradition in naming a longtime insider: Krzanich, who joined the company over three decades ago at the age of 22 and has been serving as its chief operating officer.

While many analysts predicted Krzanich would get the top job, the company also promoted long-shot candidate Renée James, Intel's software head, to be its new president. Krzanich is known for his manufacturing chops, while James is credited with advancing Intel into next-generation devices, though that work is far from complete.

Both understand that while the WinTel dynasty that once fueled huge revenue and profit growth no longer embodies everything Intel and Microsoft do, it's still a huge business for both companies. Nevertheless, Intel will never again rely solely on Microsoft to drive its processor business, while Redmond has also turned to many of Intel's rivals, including ARM licensees such as NVIDIA, Qualcomm and Texas Instruments.

With questions about who will fill Otellini's shoes out of the way, both companies have a lot at stake with next month's anticipated announcements of new PCs based on Intel's new low-power Haswell CPUs. If they push the envelope as expected -- offering 2x to 4x improvements in battery life while providing added graphics capabilities -- new Windows 8 machines could become more desirable, especially if the forthcoming upgrade wins over potential customers.

Consumers and enterprises alike are buying new PCs -- they're just not purchasing them in droves the way they once did, as noted in yesterday's report that Windows 8 accounted for 3.8 percent of all PCs in use as of April, up from 3.2 percent in March. They're also stretching the lives of their old machines longer than in the past because PCs are no longer the only devices they rely on. And, for many, there's no compelling reason yet to upgrade, which undoubtedly contributed to last quarter's worse-than-expected decline in PC shipments, reported last month.

With Haswell and Intel's hybrid laptop design called North Cape, it will be up to Intel, Microsoft with Blue (aka Windows 8.1), and the rest of the WinTel ecosystem to change that. Are you optimistic that will happen? Drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 05/03/2013 at 1:15 PM


Would You Wear a Windows Watch?

Many people I know have stopped wearing watches, opting to check their smartphones for the time. I’m among those who feel naked without my Seiko (having experienced that a few weeks ago when the battery suddenly died). I couldn’t change that battery fast enough. At the same time, I found a report in The Wall Street Journal that said Microsoft may be developing a Windows-based watch. Intriguing.

While Microsoft hasn’t commented on the report, The Journal said executives at Asian suppliers said the company has requested components for a potential watch-type computing device. Microsoft, however, is only in the consideration stage, according to the report.

Until now, I’ve rolled my eyes at the spate of reports that Apple and Google have plans to offer wearable devices. Take Google Glass, the eyewear that lets people take pictures using voice commands or certain physical gestures, as well as request directions and other information from their Android phones. Mashable’s Lance Ulanoff gave a demo yesterday on NBC’s The Today Show.

The thought of replacing my glasses with Google's eyewear has as much appeal to me as having a chip implanted in my brain. I just had an eye exam last week, and as it turns out, my prescriptions have changed both for reading and distance. The doctor suggested I consider progressive lenses rather than replacing lenses on two sets of frames. But he warned me it takes some getting used to. Now imagine having progressive lenses that also let you search for information or take pictures? And if you were concerned about privacy before... well that’s a whole separate discussion.

The prototype of the prospective Microsoft watch would reportedly have a 1.5-inch display. I’m not expecting to see a Microsoft watch anytime soon, but I must admit I could see a day when I replace my Seiko with a timepiece that could give me a quick glance at my schedule, messages and other quick tidbits of information including, of course, the time.

A watch-based interface would be much less invasive, distracting and dangerous than eyewear. True, the temptation to glance at my watch would increase, but it would mean less pulling out of the phone to quickly access information. But I suppose it would suddenly become illegal to look at your watch while driving (a small price to pay). Perhaps the biggest fear is we’d see pedestrians walking the streets glued to their watches. However, those same pedestrians are already glued to their phones.

So I’m open to a watch that could let me quickly check my messages and perhaps look up information as long as it doesn’t look too odd and is as comfortable as my current timepiece. Nevertheless, I expect I’ll be changing the battery on my Seiko at least a few more times.

Posted by Jeffrey Schwartz on 05/01/2013 at 1:15 PM


A New Look for Redmondmag.com

If you had to double-check to make sure you didn't go to the wrong Web site, it's understandable. But you're at the correct place. Welcome to the complete makeover of Redmondmag.com! 

As you can see, this isn't just a minor design tweak -- this is an entirely new look and feel, aimed at making it easier for you to find your favorite articles by topic, based on your areas of expertise and interest. While this new modern look may take some getting used to, I have no doubt you will discover more of the rich information available on the site.

Our design team, under the direction of Scott Shultz, worked tirelessly to give the site a much more visually-appealing, welcoming and less cluttered look, while front-end developer Rodrigo Munoz and site administrator Shane Lee executed the new look flawlessly.

We broke down the six most popular subjects: Windows/Windows Server, Active Directory, SharePoint, Exchange, Cloud/Virtualization and Security. At the same time, the site has menus that make it easy to find anything else of interest.

Perhaps the most welcome change is the fact that you'll find Redmondmag.com much easier to read on a mobile phone or tablet. No longer will you have to resize pages in order to read stories on your devices.

While the look has changed, the mission to bring IT pros the most useful information about the rapidly evolving computing environments in enterprises and to live up to our tagline as the leading "Independent Voice of the Microsoft IT community" stays the same. As always, I appreciate your feedback and hope you'll drop me a line to let me know what you think of our new look.

 

Posted by Jeffrey Schwartz on 04/30/2013 at 4:59 PM


Windows 8: Get 'Started' Now!

He's baaaaackkkkkk!!! As I'm sure you recall, The New York Times columnist David Pogue made a guest appearance in the pages of Redmond magazine last month to offer his Top 20 Windows 8 Tips. His first tip -- that those who bemoan the loss of the iconic Start button could use a third-party alternative to bring it back -- caused an uproar.

Scores of respondents flamed us saying Microsoft should bring back the Start button. Some argued failure to do so would be the downfall of Windows 8. While it's unlikely Windows 8 will live or die by whether or not Microsoft brings it back, which, according to a number of reports, is suddenly looking possible -- at least in some form -- I asked Pogue if he would give us his take on the third-party options that are out there. You can find his update here.

There are seven third-party Start buttons for Windows 8 that caught Pogue's eye. Most of them are free, though some cost $5 or less. When he sent me the piece as I requested the other day, I faced a bit of a quandary. At the time I asked him to run down these third-party options a few weeks back, there was no sign whatsoever that Microsoft was even considering bringing the Start button back. The first evidence suggesting that has changed came last week when ZDNet blogger and Redmond magazine columnist Mary Jo Foley reported that it looks likely that Microsoft is bringing back the Start button as an option in Windows Blue, aka Windows 8.1.

If that's the case -- and we all know Mary Jo has good sources -- it made me wonder: should anyone bother with a third-party option when it's possible Microsoft may bring back the Start button? Would it be a disservice to post Pogue's piece with this possibility looming? I thought long and hard about it and came to the following conclusion: Right now there is no Start button and there are at least 60 million Windows 8 machines out there at last count -- and that was in January. Most people I hear from are quite frustrated with the missing Start button, readers, friends and family included. If there's a free or inexpensive tool that could serve some of those people today, then what's the downside to getting "started" (pun intended) right away?

Also, though it's looking likely Microsoft will bring the Start button back, it's still not a certainty. Microsoft could still backtrack since it hasn't made a public decision. And even if it does bring back the Start button, it will be at least several months before Windows 8.1 is available. Perhaps you'll even prefer the third-party Start buttons over Microsoft's once it ships. With that in mind, I didn't feel it could do any harm to post this piece. In fact I believe it will help many of you get "started" right away.

Posted by Jeffrey Schwartz on 04/27/2013 at 1:15 PM


GFI Adds Cloud-Based Patch Management

In a move that should broaden the appeal of its cloud-based antivirus and monitoring service for PCs and servers, GFI Software has added patch management. The company launched GFI Cloud last year and adding patch management addresses a critical pain point for IT pros looking to ensure systems are secure.

With its Web-based interface, GFI Cloud is targeted at mid-sized companies generally with fewer than 1,000 employees. However GFI's CEO Walter Scott told me it is used by some customers with many thousands of employees. As I reported last month, GFI spun off its security business into a new company called ThreatTrack, which will target large enterprises.

Scott and ThreatTrack's CEO Julian Waits are longtime close friends, and while the two companies have the same shareholders, they're separate businesses. ThreatTrack will continue to license the Vipre technology it inherited to GFI. But our discussion yesterday was about the addition of patch management to GFI Cloud.

Adding patch management was a natural way for GFI to expand GFI Cloud since the company already has patch management technology via its LanGuard software. "We're seeing the trends that more and more applications are trying to do automatic patching -- we've seen Microsoft have two big failed patches this year," Scott said. "If that's not a reason for automatic update, turning those types of features off, I think automatic patching is putting people at risk."

In addition, Scott warned that due to the size of some patches -- which sometimes are as large as hundreds of megabytes -- distributing them across 1,000 machines can create network performance issues. In some cases, that could be trivial and brief but in rural areas that don't have access to high speed networks, it could wreak havoc.

Scott said a growing number of IT decision makers are becoming more comfortable with the notion of using a cloud provider for patch management. "We're seeing that more and more business are giving up the control of trying to control the end point," he said.

GFI Cloud installs TeamViewer, a Web-based remote control tool that lets IT pros remotely access a system with an issue, enabling them to repair it. The new patching option provides updates to Windows, Exchange, Office, browsers, Adobe and Java, among other widely used software.

The cost is $12 per system, per service, per year, with a minimum of 10 computers.
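Because the pricing compounds per system and per service, a quick back-of-the-envelope calculation shows how the bill scales. The sketch below uses the figures quoted above; the 150-system, two-service fleet is purely an illustrative assumption.

```python
# Back-of-the-envelope cost for GFI Cloud based on the pricing cited above:
# $12 per system, per service, per year, with a 10-computer minimum.
# The 150-system, two-service fleet is just an illustrative example.
PRICE_PER_SYSTEM_PER_SERVICE = 12  # USD per year
MINIMUM_SYSTEMS = 10

def annual_cost(systems: int, services: int) -> int:
    billable_systems = max(systems, MINIMUM_SYSTEMS)
    return billable_systems * services * PRICE_PER_SYSTEM_PER_SERVICE

print(annual_cost(150, 2))  # 150 systems x 2 services x $12 = $3,600 per year
```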

 

Posted by Jeffrey Schwartz on 04/26/2013 at 4:59 PM


Windows XP for Life

With the clock ticking on Microsoft's plans to pull the plug on Windows XP, analysts are warning the millions of enterprise users still running the 12-year-old operating system that time's running out. A careful migration can take many months and up to a year (depending on the shop), and waiting until the last minute or past the deadline means there will be no more security patches after April 8, 2014.

Microsoft also confirmed it won't support Windows XP Mode in Windows 7, for those who are concerned about that, though if they are running Windows 7 as their host OS, their systems should be protected. There are many options for migrating to Windows 7 and/or Windows 8, and my colleague Kurt Mackie wrote an exhaustive and thoroughly researched analysis on many of those options, which you can find here.

Yet a reader quickly commented: "We have no plans to move off XP after support ends. Our specific compatibility needs are met only by Windows XP and unless Microsoft reverses course and puts back Windows XP features in modern Windows, it is XP that we will continue to use. In fact we find newer Windows quite a downer to productivity and usability."

I have received a few e-mails from readers saying they'll use Windows XP until they die. One Redmond reader, Mike, responded a few weeks ago with the following: "XP will continue to run millions of users' applications for decades. Just the Microsoft support options are going away, not the OS. No reason to worry."

Well, there is a reason to worry, especially if your machine is connected to the Internet, which is pretty much a given these days. Your systems will not only be more vulnerable to attack, but they could very well become purveyors of malware. So if you do plan to stick with Windows XP, you'll need to find some way to protect it -- both for your benefit and others'.

As for Windows XP Mode in Windows 7, one reader e-mailed me the following suggestion:

"I understand Microsoft's position regarding XP Mode. After all, users have had plenty of warning and, after all, XP Mode is Windows XP SP3. Microsoft cannot be expected to continue to issue patches for XP Mode and not XP.  That said, I would hope that Microsoft will not remove XP Mode from Windows 7 via a service pack come next April.  Even if they do, it would still be better for Windows XP users to install Windows XP SP3 under some other VM tool (such as VMplayer) running under Windows 7 than to continue to run Windows XP on bare metal."

While it's true there are some legacy programs that run on Windows XP or below, and some users still have some old MS-DOS-based systems chugging along, most enterprises will be best served by migrating to a newer operating system. The process may be painful, but most should find the benefits worth the effort.

Posted by Jeffrey Schwartz on 04/24/2013 at 1:15 PM


Microsoft's Commitment to Green Datacenters

Many enterprises have stepped up their efforts to reduce the carbon footprints of their facilities to cut the costs of their operations. It makes good business sense, but just as important, reducing emissions is a responsibility every organization should take on to help provide a cleaner environment.

On this Earth Day, many companies are using the occasion to share their contributions toward reducing the amount of power their facilities consume. For its part, Microsoft's multi-year cloud computing transformation has given the company the opportunity to push the envelope in reducing the amount of power required to run its growing global datacenter footprint.

Microsoft says it is one of the first major organizations to impose internal fees on carbon emissions, which has given those who manage its different businesses and operations incentives to conserve energy by using alternative renewable power sources at 100 datacenters worldwide. Robert Bernard, Microsoft's chief environmental strategist, said in a blog post today that the company has so far used $4 million from those fees to invest in renewable energy projects around the world.

"While we still have progress to make in reducing our environmental footprint and realizing the potential of technology to address environmental challenges, I'm pleased to say that we are well on our way to making environmental sustainability a core value at Microsoft," Bernard noted. "We're more confident than ever about the role of IT to address climate change and other important environmental challenges."

As a result of its efforts to run greener datacenters, Microsoft this year ranked No. 2 behind Intel on the EPA's Green Power Partnership list. Microsoft's annual green power usage was about 1.9 billion kilowatt-hours, representing 80 percent of its energy usage, thanks to its implementation of Sterling Planet's bioenergy products and services as well as Microsoft's investments in hydro and biomass renewable energy.

In its new datacenters, Microsoft's Power Usage Effectiveness, or PUE, averages 1.125, compared to an industry average of 1.8. Among other facts noted by Microsoft, its new Quincy, Wash. datacenter primarily uses hydro power generated by the local Columbia River, and its Dublin, Ireland datacenter uses 1 percent of the water typically consumed by a traditional datacenter. Bernard also said last year that for every unit of energy not coming from a renewable source such as hydro or wind, the company purchases renewable energy to offset it.
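
For readers unfamiliar with the metric, PUE is the ratio of total facility energy to the energy delivered to the IT equipment itself, so a perfect score is 1.0. The short sketch below shows what the figures cited above imply; the 1 MW IT load is an assumption chosen only to make the arithmetic concrete.

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every watt goes to servers, storage and networking.
# The 1 MW IT load below is an arbitrary example; the PUE figures are from the post.
def facility_power(it_load_kw: float, pue: float) -> float:
    """Total power a facility draws to deliver a given IT load at a given PUE."""
    return it_load_kw * pue

it_load_kw = 1000.0  # assume a 1 MW IT load
new_ms = facility_power(it_load_kw, 1.125)   # 1,125 kW total
industry = facility_power(it_load_kw, 1.8)   # 1,800 kW total
print(f"Overhead avoided: {industry - new_ms:.0f} kW")  # 675 kW less cooling/distribution overhead
```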

Not every organization has the resources and commitment to invest in green initiatives to that extent but everyone can do their part. En Pointe Technologies, a systems integrator based in Gardena, Calif. last week gave some tips that we all can consider:

  • Conserve power: Using the power management options in Windows and Active Directory, IT can ensure systems are in sleep mode during non-business hours.
  • Power down completely: Shutting down monitors, PCs and printers after business hours can cut power usage by as much as 66 percent. For example, a large enterprise printer in sleep mode still consumes 26 watts of power but only 0.5 watts when turned off (a rough savings estimate appears in the sketch after this list).
  • Reduce paper usage by printing less: According to the EPA, the average user prints 10,000 pages per year. Reducing that could add 3 percent to a company's bottom line.
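
As promised above, here's a rough estimate of what powering printers off outside business hours can save, using the 26-watt versus 0.5-watt figures from the list. The fleet size, weekly off-hours and electricity rate are my own illustrative assumptions, not En Pointe's.

```python
# Rough savings estimate using the printer figures cited above (26 W asleep vs. 0.5 W off).
# Fleet size, off-hours per week and electricity rate are illustrative assumptions.
SLEEP_WATTS, OFF_WATTS = 26.0, 0.5
PRINTERS = 50                    # assumed fleet size
OFF_HOURS_PER_WEEK = 128         # nights and weekends (24*7 minus 40 business hours)
RATE_PER_KWH = 0.12              # assumed electricity price in USD

kwh_saved = (SLEEP_WATTS - OFF_WATTS) / 1000 * OFF_HOURS_PER_WEEK * 52 * PRINTERS
print(f"~{kwh_saved:,.0f} kWh and ~${kwh_saved * RATE_PER_KWH:,.0f} saved per year")
# ~8,486 kWh and ~$1,018 per year for this hypothetical fleet
```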

Earth Day is a good day to renew our commitment to making more efficient use of energy not just to save money but to provide a cleaner environment.

Posted by Jeffrey Schwartz on 04/22/2013 at 1:15 PM


Microsoft and Intel Disavow Fears of Doom

Over the past several weeks, it appeared the sky was falling for Windows and its longtime partner Intel. With PC sales falling more precipitously last quarter than analysts had originally forecast, and IDC blaming it on disappointing uptake of Windows 8, we were all anxiously awaiting this week's earnings reports to hear from the horses' mouths just how bad things actually are.

The good news is Intel and Microsoft didn't deliver even more dire reports or forecasts as feared. As it turned out they fared better than IBM, which sharply missed expectations with a quarter-over-quarter revenue decline of 5 percent, following last month's shortfall reported by Oracle and fears about Apple. IBM's miss was the first in eight years and doesn't bode well for other major IT players.

The bad news is that Microsoft's decision not to provide an update on Windows 8 sales did little to refute IDC's claim, made last week. Noting that Microsoft had sold 100 million Windows 7 licenses six months after that OS was released, All About Microsoft blogger and Redmond columnist Mary Jo Foley suggested the company's silence certainly raises questions. It's no secret Microsoft is competing with itself against the popularity of Windows 7. On Microsoft's earnings call last night, CFO Peter Klein, who is stepping down at the end of June, stated the obvious about Windows 8: The best is yet to come.

Now that Windows 8 has been available for six months, anyone not urgently needing a PC surely doesn't want to buy something they know will be obsolete in a few months (I sure don't). Hence anyone who desires a "new" Windows 8 device (tablet, PC or hybrid) is going to wait for Windows "Blue" to land on high-end systems based on Intel's next-generation Haswell processor and on low-end systems based on the Bay Trail Atom processor.

Indeed, the hardware out there from OEMs and Microsoft's own Surface RT and Surface Pro will clearly appear underwhelming compared with the improvements the new devices will sport once they start to hit the market later this year. Klein suggested as much on yesterday's earnings call.

"We've always felt that with Windows 8 it was a process of the ecosystem really innovating across the board and really starting to see that on the chips," Klein said on the earnings call. "We're very encouraged by both Haswell and some of the Atom processors to really improve the overall user experience that Windows 8 delivers. And over the coming selling season, I think that's very encouraging and we're optimistic about that."

So here's my takeaway from Microsoft's third-quarter earnings report and investor call yesterday:

  • Windows 8 systems are shipping at a weaker pace than Microsoft would like to admit, but the company is bullish that a new crop of PCs, tablets and hybrids later this year will bolster the OS. Our own data shows Redmond magazine readers are deploying Windows 8 incrementally and will continue at that pace for many years.
  • Microsoft's transition to the cloud is going well, specifically with Office 365. Office 365 saw a five-fold increase this quarter over the same period last year, and one in four enterprise customers is using it. Klein said the business is on a $1 billion run rate. While it may be cannibalizing traditional Office licensing sales, this is business Microsoft is not losing to Google, which nevertheless continues to gain momentum with Google Apps. "We expect our transactional customers to increasingly transition to the cloud with Office 365," he said.
  • With an overall 11 percent year-over-year improvement in its flagship enterprise business -- Server and Tools -- multiyear licensing increased 20 percent, with System Center revenue up 22 percent and Hyper-V gaining 4 points of market share, Klein said.
  • This week's launch of Windows Azure Infrastructure Services and Windows Azure Active Directory positions Microsoft to provide a solid public cloud offering and to capture a major piece of the public cloud business. At the same time, Amazon Web Services will remain the market leader for many years to come, and Microsoft will also face a broad array of public cloud competitors from AT&T, Google, Rackspace, HP, IBM, Salesforce.com and those in the Red Hat and VMware ecosystems.

As Microsoft looks to close out its 2013 fiscal year, it will no doubt be a challenging sprint to the finish line as the company continues to transition to new products and services. While Microsoft didn't knock it out of the park yesterday, no one thought it would. By virtue of the fact that Microsoft performed within expectations (depending on which consensus groups you look at it either slightly missed or slightly beat forecasts) and didn't drop any bombshells, the company can keep its head down for now.

Posted by Jeffrey Schwartz on 04/19/2013 at 1:15 PM


IBM Bets Big on Enterprise SSD Flash Storage

IBM may be looking to get out of the commodity x86 server business, but it's making a major bet on extending the use of solid state disk (SSD)-based flash storage in the datacenter, which holds the promise of enabling faster transactional and analytical applications. At the same time, Big Blue believes flash can be used for mainstream workloads, boosting performance and reducing latency for key operational systems such as Exchange Server, SharePoint and SQL Server.

I attended an analyst and press briefing last week at IBM's New York office, where the company trotted out its top technology execs to kick off a major corporate initiative to advance SSD-based flash. In addition to investing $1 billion over the next three years to extend its flash technology and to integrate it into its server, storage and middleware portfolio, IBM is opening 12 centers of competencies around the world for customers and partners to test and conduct proofs of concept using its flash-based arrays.

Though IBM and its rivals have offered SSDs in their storage systems over the past several years, Big Blue believes the economics of flash storage make it increasingly viable for enterprise and cloud-based systems, which led to its acquisition last year of industry leader Texas Memory Systems.

Steve Mills, IBM's senior vice president and group executive for software and systems, pegged the price of low-cost disk drives at $2 per gigabyte and high-performance disks costing $6 per gigabyte. While SSD-based flash costs about $10 per gigabyte, Mills argued that because only a portion of spinning disk can actually be used in high-performance systems, the actual cost is also around $10 per gigabyte.
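Mills' comparison boils down to usable capacity: if only a fraction of a high-performance spinning disk can be used without giving up performance (the classic short-stroking trade-off), the effective cost per usable gigabyte climbs toward flash's sticker price. The sketch below works through that logic; the 60 percent usable-capacity figure is my assumption for illustration, while the per-gigabyte prices are the ones Mills cited.

```python
# A sketch of Mills' cost argument: raw $/GB vs. effective $/GB once you account
# for how much of a drive's capacity is actually usable at the required performance.
# The 60% usable fraction for high-performance HDDs is an assumption for illustration;
# the raw prices are the ones Mills cited.
def effective_cost_per_gb(raw_cost_per_gb: float, usable_fraction: float) -> float:
    return raw_cost_per_gb / usable_fraction

hdd_high_perf = effective_cost_per_gb(6.0, 0.60)   # ~$10/GB once short-stroked
flash = effective_cost_per_gb(10.0, 1.0)           # flash capacity is fully usable
print(f"High-performance HDD: ${hdd_high_perf:.2f}/usable GB vs. flash: ${flash:.2f}/usable GB")
```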

"This is such a profound tipping point," Mills said at last week's event. "There's no question that flash is the least expensive, most economical and highest-performance solution." Over the past decade, processor, memory, network, and bus speed and performance has increased tenfold while the speed of mechanical hard disk drives [HDDs] remains the same, according to Mills. "It has lagged," he said.

"We clearly think that the time [for flash] has come," he added. "This idea of using semiconductor technology to store information on a durable basis is what flash is all about."

Specifically, flash can also offer substantially faster transaction speeds -- on average just 200 microseconds compared with 3 milliseconds, Mills noted. "By reducing the amount of time, the IO wait that the database in the system is experiencing, you're able to accomplish more," he said.
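To put those access times in throughput terms, here's a quick calculation of how many serialized I/O operations a single worker can complete at each latency. The one-outstanding-request model is a deliberate simplification on my part; real systems issue many requests in parallel.

```python
# How the quoted access times translate into serialized I/O throughput.
# Assumes one outstanding request at a time (a deliberate simplification).
FLASH_LATENCY_S = 200e-6   # 200 microseconds, as cited
HDD_LATENCY_S = 3e-3       # 3 milliseconds, as cited

flash_iops = 1 / FLASH_LATENCY_S   # 5,000 operations per second per thread
hdd_iops = 1 / HDD_LATENCY_S       # ~333 operations per second per thread
print(f"Flash: {flash_iops:,.0f} IOPS vs. HDD: {hdd_iops:,.0f} IOPS (~{flash_iops/hdd_iops:.0f}x)")
```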

Several customers were on hand to validate Mills' argument, including Karim Abdullah, director of IT operations at Sprint, which tied IBM's FlashSystem to an IBM SAN Volume Controller (SVC) to improve access to the wireless provider's 121 distributed call centers worldwide. The volume of calls to Sprint's call center increased dramatically two years ago when the company offered its unlimited data plan, leading to much higher volumes of database queries. "It provided a 45-fold boost in access to that piece of data," Abdullah said of the flash systems.

Al Candela, head of technical services at Thomson Reuters, implemented the flash arrays to build a trading system that could offer much lower latency than the existing architecture with HDDs allowed. "I saw benefits of a 10x improvement in throughput and a similar achievement in latency," Candela said.

While some of these early implementations are aimed at very large-scale, high-performance computing systems, the technology also could be used to boost the performance of commodity servers and Windows Server-based systems, said Ed Walsh, VP of marketing and strategy for IBM's system storage business unit. "We see it being used uniformly across Microsoft platforms, Linux platforms, Power platforms and to be honest mainframes," he told me.

Mark Peters, a senior analyst at Enterprise Strategy Group, said flash from IBM and others is rapidly moving from niche, vertical and specialist workloads, such as HPC, to more general workloads including Exchange Server and SharePoint. "People running Microsoft based systems could absolutely find it appealing," Peters said.

Texas Memory Systems traditionally targeted high-end HPC workloads. "IBM's ownership and focus will swing it to the commercial side way more and faster," Peters said.

In addition to allowing systems like Exchange to run faster, Mills said the ability to read and write from flash storage means applications will require fewer server cores, meaning licensing fees for database, operating system and virtualization software, as well as other line-of-business apps, will be much lower.

While that may be true, that doesn't mean some software companies won't try to compensate by raising their licensing fees, warned Pund-IT analyst Charles King. "Oracle, as an exemplar, [is] a company that hasn't been shy about adjusting its pricing schema to ensure its profits in the face of emerging technologies," King said. "However, that could also work in IBM's favor. If the company keeps the licensing cost of DB2 steady and Oracle attempts to rejigger its own pricing, the result could make IBM's new FlashSystem solutions look even more compelling."

Because of the much smaller footprint -- Mills described a two-foot rack of flash systems capable of storing a petabyte of data -- datacenter operators can lower their costs by 80 percent, including the power and cooling expenses.

As noted, IBM is not the only company touting SSDs. A growing number of companies such as SolidFire and STORServer are targeting flash storage at enterprises and cloud providers. Incumbent storage system providers like EMC, Hewlett-Packard and NetApp also offer flash technology. Likewise, key public infrastructure-as-a-service cloud providers including Amazon Web Services, Rackspace and others offer SSD-based storage.

"IBM claims its hardware-based approach offers better performance than what it called 'flash coupled' software-centric solutions from major competitors like EMC and HP, and it didn't really address smaller and emerging players," King said. "Overall, it's going to take some time to sort out who's faster/fastest and what that means to end users, but IBM's argument for the value of flash was broader and sounder than most pitches I've heard."

Scott Lowe, whose Random Access column debuted on RedmondMag.com last week, warned:  "Solid state disks still remain orders of magnitude more expensive than hard disks. Many array solutions include powerful deduplication and other data reduction features that can help to address the capacity question while still providing incredible performance for workloads that need it, such as VDI and big data. Here, the cost per IOPS is low, but the cost per GB is high." Lowe cited SSD flash as one of five options in a modern storage topology.

Do you see using SSD flash in your storage infrastructure? Drop me a line at [email protected].

 

 

Posted by Jeffrey Schwartz on 04/19/2013 at 1:15 PM


Can Microsoft Unseat Amazon in the Cloud?

Game on. Microsoft's launch yesterday of Windows Azure Infrastructure Services, armed with another round of price cuts, puts the two cloud giants in the Pacific Northwest on more equal footing. Three years after launching Windows Azure as a complete platform as a service, or PaaS, Microsoft finally is able to let customers stand up their own virtual machines in the company's cloud. That means for the first time Windows Azure's infrastructure as a service (IaaS) is now a viable alternative to Amazon's widely used EC2.

Thanks to cut-rate pricing and a widely distributed global cloud service, some of the largest cloud implementations run on Amazon's EC2, including Netflix, Nasdaq, Dropbox and even Salesforce.com's Heroku service. Almost everyone I speak with who offers IT services says they use Amazon as their provider. Yet surprisingly, Windows Azure may be more widely used than it appears.

A survey of Redmond magazine readers, who are obviously skewed toward Microsoft offerings, shows 26 percent of 979 respondents use Windows Azure, while an equal percentage use "other" services beyond those listed in our survey (Rackspace, AT&T, Verizon, Google, HP, Dell, IBM and Oracle). That suggests many are using regional hosting providers or nothing at all. As for Amazon, 13 percent indicated they're using EC2. However, the number that plan to use Amazon over the next 12 months jumps to 20 percent, while Windows Azure increases to 28 percent. Those using other services will drop to 18 percent.

It's not just among Redmond magazine readers that Windows Azure appears to be the most widely used cloud provider. Morgan Stanley's April CIO survey found 20 percent use Windows Azure for PaaS functionality compared with 12 percent who use Amazon, though it should be noted that Amazon is primarily an IaaS provider. The Morgan Stanley CIO survey showed 13 percent are using Windows Azure IaaS, presumably the preview release, since it was just made broadly available yesterday. As a result, Morgan Stanley is bullish on Microsoft's cloud prospects, and it was a key reason the investment bank upgraded the company's stock.

"Our research suggests Microsoft's large technology base and captive audience of over 5 million .NET developers has propelled the company into an early leadership position in Cloud Platforms," wrote Keith Weiss and Melissa Gorham in yesterday's research note. "Additionally, the cloud-based Office 365 platform continues to see good adoption in customers of all sizes. The contribution from these cloud initiatives remains relatively small today and the shift to subscription licensing may actually create slight near-term headwinds. However, the potential positive sentiment impact of Microsoft being seen as a 'winner' in the cloud as these offerings gain scale and exposure could act as a positive catalyst for MSFT shares, in our view."

Still, IaaS is where the action is, and Amazon appears to be the company to beat. In addition to Amazon, a large swath of IaaS providers including Rackspace, IBM, HP, AT&T and Piston Cloud, among a slew of others, are building out a formidable rival with OpenStack, and infrastructure providers such as Cisco, Intel and Red Hat, among more than 100 others, have placed big bets on the open source cloud compute, storage and networking platform as well. The OpenStack Foundation, which is holding its biannual summit this week in Portland, Ore., just released its twice-yearly platform update, code-named "Grizzly." Keep an eye on my Cloud Report blog for updates on OpenStack. One thing I should note is that Grizzly includes improved Hyper-V support, something that was notably absent in earlier releases, as I noted last month.

Turning back to Windows Azure, Forrester analyst James Staten said in a blog post yesterday that, despite initial yawns about the long-anticipated Microsoft and OpenStack releases, Redmond stands to gain further credibility as a cloud infrastructure provider with its IaaS release.

"Microsoft has done a good job of balancing the sense of comfort and familiarity with this release between those who have experience with other IaaS platforms and those who want the familiarity of their existing Microsoft corporate environments," Staten wrote. "Microsoft Windows Azure IaaS will feel familiar to developers who have experience with Amazon Web Services' Elastic Compute cloud."

Staten added that System Center admins will find the service familiar and very much like a Hyper-V compute pool. "This is probably the best validation of the hybrid cloud model from a major tech supplier so far this year," he said. "Microsoft is steadily delivering on its promise to make the extended, hybrid cloud the way forward for enterprise customers; especially systems admins, who want the option of extending datacenter workloads to the cloud without giving up the tools and IT processes they have spent years refining."

It appears Windows Azure will appeal to a large audience of enterprise customers, but it also looks like many business units will use different cloud service providers depending on the nature of the application, the development platform and the service levels required. Likewise, there are hundreds of smaller providers out there offering services.

Who is your cloud service provider of choice now and looking ahead? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/17/2013 at 1:15 PM


Foxit Targets Adobe with Free PDF Creator

There are plenty of ways to create PDFs without spending a fortune on Adobe's Acrobat Standard software. Of course you can save Microsoft Office files as PDFs. But if you want to generate PDFs that you can annotate and enable extensive workflow capabilities -- including the ability to electronically sign and stamp them -- Foxit tomorrow will unveil a new free reader that the company said rivals Adobe's Acrobat Standard.

While less expensive alternatives to Acrobat Standard from Avanquest, Nitro, Nuance and Wondershare (among others) offer many of these features, Foxit is upping the ante with its new Foxit Reader 6.0. The Fremont, Calif.-based company says 150 million users have already downloaded a Foxit Reader, which it believes makes it the second most used PDF reader behind Adobe's offering. With tomorrow's release, Frank Kettenstock, Foxit's VP of marketing, told me: "We're expecting that to go up significantly."

The new Foxit Reader 6.0 will include a Microsoft Office ribbon interface that lets users create PDFs from Word, Excel and PowerPoint, as well as from hundreds of other file types via drag-and-drop. It'll also support Microsoft's Active Directory Rights Management Services.

Users will also be able to create PDFs by right clicking on files, scan documents, use a PDF print driver, and cut and paste from the Windows clipboard. Users can download the Foxit Reader here.

For those requiring more extensive editing and form creation capabilities, Foxit tomorrow will also unveil PhantomPDF 6.0, which allows users to reformat text and manipulate objects within a PDF. It also offers extended search capabilities via the company's PDF IFilter, which lets the Windows indexing service and other search technologies index PDFs. Support for Evernote lets users attach PDFs to Evernote notes. Users can also edit scanned documents and apply optical character recognition.

Foxit will offer two versions of PhantomPDF 6.0. The standard edition is priced at $89, and the business edition, priced at $129, offers extended compression, editing and security features for enterprise users.

 

Posted by Jeffrey Schwartz on 04/15/2013 at 1:15 PM


Intel's Uncertain Future

Microsoft took a lot of heat last week when IDC reported a 13.9 percent decline in PC shipments for the first quarter of this year. While it wasn't a shocker that PC sales were declining, the drop far exceeded the 7.7 percent decline IDC had originally forecast.

After IDC blamed disappointing sales of Windows 8, I asked whether Microsoft can turn it around with the next wave of Windows wares code-named "Blue." We'll get more insights on that going into June. However, I'm still of the belief that a better crop of processors from Intel will make Windows 8 more attractive.

Haswell, which promises to offer improved battery life and better graphics rendering, will be key to the delivery of new high-end systems, notably Ultrabooks. Just as important, the redesigned 22 nanometer Atom processor, code-named "Bay Trail," could work its way into new tablets and smartphones. But it faces stiff competition from a number of other players including AMD, Nvidia, Qualcomm and others supporting the low-power processor design of ARM.

As Intel looks to spread its wings, there are a lot of questions about the company's future, including who will replace CEO Paul Otellini, who unexpectedly said he will retire next month -- three years short of the company's mandatory retirement age of 65. Intel apparently did not have a successor lined up, though it appears the company is likely to tap an insider, The New York Times reported today.

Intel is expected to report revenue and earnings tomorrow at the low end of analyst forecasts, and the report, along with any outlook or guidance company officials give, could either amplify or mute IDC's findings.

Regardless, the deliverables from Intel and its partners, along with the new wave of Windows software from Microsoft, should result in some improved new device choices later this year.

Microsoft and Intel may no longer be exclusively tied at the hip but the fact that Wintel has evolved is not a bad thing. What type of Windows 8 device would win you over? Feel free to comment or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/15/2013 at 1:15 PM


Can Microsoft Fix Windows 8?

Microsoft over the past week has taken its lumps -- first with Gartner reporting a 7.6 percent decline in worldwide PC sales last quarter, followed by rival researcher IDC's more dire analysis saying shipments dropped a record 13.9 percent. And if reporting the worst decline in PC sales since IDC first started tracking shipments two decades ago wasn't bad enough, the firm's analysts issued an uncharacteristically direct blow, blaming the deterioration on "weak reception" for Windows 8.

It doesn't matter whose numbers you believe -- every market researcher has its own methodology. We have long understood that demand for PCs has been on a downward spiral, thanks to the fact that tablets and smartphones are rapidly becoming the preferred devices for accessing the Internet. The sucker punch came when IDC said Windows 8 has accelerated the decline in PC sales.

"It seems clear that the Windows 8 launch not only failed to provide a positive boost to the PC market, but appears to have slowed the market," said IDC analyst Bob O'Donnell, in a statement. "While some consumers appreciate the new form factors and touch capabilities of Windows 8, the radical changes to the UI, removal of the familiar Start button, and the costs associated with touch have made PCs a less attractive alternative to dedicated tablets and other competitive devices. Microsoft will have to make some very tough decisions moving forward if it wants to help reinvigorate the PC market."

It appears Microsoft has made some of those decisions, which the company will start to reveal at TechEd North America in early June, with the entire plan unfolding at the Build conference in late June. One encouraging sign is that the so-called Windows Blue wave appears set to facilitate new form factors. As reported by The Wall Street Journal Thursday, Microsoft is developing a new lineup of Surface hybrid PC-tablets that will include a 7-inch device to compete with the Google Nexus 7, Amazon's line of Kindle Fires and the Apple iPad mini, among others in that size range.

Microsoft didn't comment on the Journal's report, nor has the company discussed plans to address that form factor, as noted by ZDNet blogger and Redmond columnist Mary Jo Foley.

Presumably if a smaller Surface is in the works, Microsoft will also let OEM partners build and market smaller Windows 8 tablets as well. But Microsoft needs to offer licensing terms that can enable OEMs to offer devices that are competitive with Android-based tablets. As we all know, OEMs don't pay licensing fees for Android.

Indeed Gartner sees devices based on Google's Android outpacing anything else, running on nearly 1.5 billion units by 2017. Coming in second is Windows at 571 million and those based on Apple's MacOS and iOS will come in third at 504 million. Those forecasts combine PCs, tablets and phones, further skewing in favor of Android, given its dominance in the smartphone market.

If Microsoft was initially reluctant to offer smaller Windows 8 tablets, there's growing hope that mindset has changed. As Microsoft cuts Windows 8 down to size, so is Wall Street. What will make analysts, who have downgraded the company in recent days, more upbeat? Merrill Lynch, which downgraded Microsoft from a buy to neutral in a research note issued today, predicts PC shipments will decline another 20 to 25 percent before picking up steam again. "PC unit growth could improve after reaching that trough level, as a function of enterprise and high-end unit demand," the research note said. "However, enterprise upgrade cycles could get elongated from trends like BYOD [bring your own device] and desktop virtualization."

Tablets have indeed made it easier for consumers to put off upgrading PCs. Enterprises, coming off or in the midst of Windows 7 upgrades, are replacing systems as needed. I'm personally waiting to see what the next crop of devices looks like once PCs based on Intel's new Haswell processor, which promises better graphics and much longer battery life, become available.

The next shoe to drop will be next Thursday evening, when Microsoft is scheduled to report earnings for the quarter. Of course, a lot can happen before that.

Posted by Jeffrey Schwartz on 04/12/2013 at 1:15 PM0 comments


Windows 7 XP Mode Is History Too

When Microsoft stops supporting Windows XP next year, hopefully using Windows XP Mode in Windows 7 isn't your fallback plan. I received an inquiry from a reader asking if Microsoft will continue supporting the virtual XP environment within Windows 7. The answer is no.

"Windows XP Mode in Windows 7 aligns to the same lifecycle as Windows XP," a Microsoft spokeswoman confirmed. Some may see this as a double whammy since that pretty much puts the kibosh on running those legacy Windows XP-based apps that won't work on Windows 7 or above.

"Microsoft made a big deal of having the Virtual XP machine within Windows 7 as a smooth way to transition to the modern world," the reader making the inquiry said, adding users should be able to run Windows 7 as the host operating system and Windows XP Service Pack 3 (SP3) Professional as the guest operating system.

When Microsoft issued its gentle reminder on Monday that Windows XP users have one year to migrate, was Windows XP Mode your fallback plan or did you presume that was on the chopping block as well? Or will you take comfort in Windows 7's security and patches and continue running those apps in Windows XP Mode? Feel free to comment or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/12/2013 at 1:15 PM0 comments


HP Aims To Reshape Datacenters with Moonshot Servers

For once there's noise coming out of Hewlett-Packard this week that doesn't center on the dysfunction that has surrounded the company for the past two-plus years. As expected, the company this week started shipping its next-generation server architecture known as Project Moonshot.

This is not just about the latest update to a server line. Project Moonshot represents an entirely different system architecture, and it introduces the first major shift in server form factors since HP's release of blade servers a decade ago. In these racks are cartridges half the size of a typical laptop PC. The company describes Moonshot-based systems as software-defined servers, with networking and storage interfaces built into the racks as well.

But what's really unique is that these new systems take up 80 percent less space, use 89 percent less energy and cost 77 percent less, the company says. "HP is taking head on the challenges of space, energy, cost and complexity of the datacenters for today and tomorrow," said CEO Meg Whitman, in a Webcast kicking off the launch Monday.

"This is not an incremental change, it's the launch of a new class of servers," added Dave Donatelli, executive VP of HP's enterprise group.  "With current server technology, the economics behind social, mobile, cloud and big data will begin to deteriorate rapidly as requirements for servers continue to escalate,"

The first deliverable is the HP Moonshot 1500, based on a 4.3U chassis that supports up to 45 hot-pluggable server cartridges and shares storage -- both traditional hard disk drives and flash-based solid state drives -- and network interfaces. The initial crop is powered by Intel Atom S1260 processors. Look for future systems to support ARM-based processors, the type used to power tablets and smartphones, later this year as well.

HP is initially only offering support for Linux (Red Hat, SUSE and Ubuntu) and KVM virtual machines but company officials said customers will be able to specify Windows and VMware-based virtualization later this year.

"Windows server support will roll out later in the year and also on additional targeted server cartridges when they're available," explained John Gromala, HP's director of product marketing for industry standard servers and software, in an interview. "It will be specifically targeted towards workloads and applications that are actually taking advantage of those specific types of offerings."

Asked if any customers are testing Moonshot systems bundled with Windows Server, Gromala confirmed they are but he declined to elaborate. It appears HP will only be offering Windows Server 2012.

By virtue of these substantially smaller systems, coupled with dramatically lower power consumption (the equivalent of six 60-watt light bulbs), the release of HP's first Moonshot systems is a key milestone for a company that has struggled to find its way of late. Not that HP pulled Project Moonshot out of a hat: it has been a key initiative at HP Labs for many years.

Presuming these systems perform as advertised, they promise to change the footprint of many datacenters over time. Blade servers didn't initially take off for any vendor either, but the economics ultimately made them appealing. We'll see if Project Moonshot has the same impact.

Posted by Jeffrey Schwartz on 04/10/2013 at 1:15 PM0 comments


Windows XP Farewell Tour Begins

Presuming running a version of Windows supported by Microsoft is a requirement in your shop, you have one year left to rid yourself of Windows XP. On April 8, 2014, Microsoft will no longer support Windows XP. To commemorate -- or considering the wide proliferation of Windows XP, to give customers a kick in the teeth -- Microsoft has launched its one-year countdown.

The pending end of support for Windows XP is a big issue for a lot of businesses. Anecdotally, I can't tell you how many businesses still have Windows XP running rampant. Every time I'm at the gym, the doctor, at many restaurants and stores, and even my local Bank of America branch (which is a relatively new one), I see Windows XP on their desktops.

A year from now, those with Windows XP, as well as Office 2003, will no longer receive security updates or tech support (at least from Microsoft). Microsoft is kicking off this countdown with a promotion that gives those who upgrade from Windows XP to Windows 8 and from Office 2003 to Office 2013 a 15 percent discount through June 30. The offer, dubbed Get2Modern, is available to customers who upgrade through Microsoft's partners, which provide migration assistance.

But regardless of small incentives, many organizations may not feel ready for Windows 8, which Microsoft acknowledges. "For some, moving their full company to Windows 8 will be the best choice, and for others it may be migrating first to Windows 7," Erwin Visser, a senior director in the Windows division, noted in a post on the Windows for Your Business blog today. "Still, for many, it will be deploying Windows 8 side-by-side with Windows 7 for key scenarios, such as Windows 8 tablets for mobile users."

According to the latest market share reports published by Net Market Share, Windows 7 is the most deployed PC operating system, accounting for nearly 48 percent of systems, with Windows XP filling in nearly 39 percent. Windows 8 now accounts for a mere 3.17 percent, though that's up from 2.67 percent a month earlier. Clearly Windows 8 will grow much more incrementally than previous versions, which, given the scope of change it introduces, isn't surprising.

Research of the Redmond magazine readership indicates that within the next year, 61 percent will have deployed Windows 8, though, in most cases, only to a small percentage of employees. Within two years, 74 percent will have deployed Windows 8 on a much broader basis.

Currently 95 percent of Redmond magazine readers have Windows 7 deployed, while 76 percent still have Windows XP, 41 percent have some Windows 8 and 20 percent have Windows Vista. Since multiple responses were permitted, the data doesn't say to what extent these various versions are deployed. But given the survey was based on 1,178 respondents it should give a valid measure and point to the strong proliferation of the various versions of Windows.

For those of you who still have Windows XP, what are your plans? Are you going to let it go down to the wire (or perhaps beyond) or are you planning to act sooner rather than later? Will you go right to Windows 8 or are you going with Windows 7?

Please share your Windows XP migration plans by dropping me a line at [email protected].

Posted by Jeffrey Schwartz on 04/08/2013 at 1:15 PM0 comments


Will IT Nix Facebook Phones?

Miffed that there still isn't a Windows 8/RT app for Facebook nearly six months after the launch of the new operating system, I tried in vain to ignore yesterday's launch of the so-called Facebook phone. The new smartphone software, dubbed Facebook Home, and the HTC First phone coming next week put a customized layer on top of Google's Android that places the social network front and center.

If Facebook Home gains critical mass, we may become a society of zombies -- though some may argue that's already occurred. Either way, it promises to only make things worse. It remains to be seen whether the software -- downloadable April 12 from the Google Play app store for select Android phones from HTC and Samsung -- will catch on that broadly.

"The Android launcher approach allows it to target a huge installed base of hundreds of millions of Android users, which will be a large chunk of Facebook's total user base of more than a billion people," wrote Ovum chief telecom analyst Jan Dawson, in prepared commentary.

Much of the reaction I've seen has given it a thumbs-down. Analyst Jack Gold said in an e-mail to me he predicts only the Facebook diehards will embrace it. "I think this appeals to the 10-20 percent of true junkies," he said. "Others will find it very intrusive."

I agree, though it's the teens and twenty-somethings that will seal its fate either way.

I'd like to think most people don't want Facebook to take over their phones and will be content using the app (that is, unless you're a Windows Phone user, in which case, like those with Windows 8/RT PCs and tablets, you're stuck accessing it via the Web browser or through third-party apps). But if Facebook does what some fear -- offer free phones in exchange for users' privacy -- there are suckers who will go for it without batting an eye.

As IT pros, will the potential proliferation of Facebook Home lead to the demise of BYOD? Probably not but it might require a re-thinking of what IT needs to do to recapture employees' attention while on the job. Does this mean invoking policies that ban the use of this software on BYOD-supported devices? Will IT need to put new rules in policy management systems and security software to put the kibosh on Facebook Home?

"I think this will just exacerbate an already big problem in many companies -- mobile anarchy," Gold said. "Companies have to establish some level of governance on devices (many have already). Some user devices just are not acceptable in the corporate environment. If you buy one, don't expect it to get connected to the corporate network or use the apps."

All of these are issues IT may confront if Facebook Home is even a moderate hit. Meanwhile, now that Facebook has finished developing its phone software, how about giving Windows 8 and Windows Phone users an app?

Posted by Jeffrey Schwartz on 04/05/2013 at 1:15 PM0 comments


Ray Lane Finally Steps Down As HP Chairman

It was long overdue, but Hewlett-Packard chairman Ray Lane said yesterday he is stepping down from that role, though he will remain on the company's board of directors. Two other board members, John Hammergren and G. Kennedy Thompson, said they will exit the board when their terms expire next month.

The three were up for re-election this week and got by with a slim majority, but the vote made clear investors were not happy with them. Many fault Lane for the way he guided former CEO Leo Apotheker and for the chaos that erupted under Lane's reign as chairman.

The dysfunction came to a head in August of 2011 when Apotheker simultaneously announced the company was evaluating the divestiture of its PC group, the demise of its TouchPad tablets less than two months after their release and the $10.3 billion acquisition of Autonomy -- a deal that turned out to be worth a small fraction of what HP paid and that the company has since largely written off. That led to Apotheker's ouster a month later, when Meg Whitman was named CEO.

In a statement, Lane tacitly acknowledged the level of dissatisfaction with him. "After reflecting on the stockholder vote last month," Lane said, "I've decided to step down as executive chairman to reduce any distraction from HP's ongoing turnaround."

Indeed Lane and his board colleagues did the right thing. By most accounts, HP is making progress, albeit incremental, under Whitman. Employee morale is improving and the company's stock is up. But as Whitman warned, it will take years to right HP's ship. Investor confidence is a step in the right direction.

Posted by Jeffrey Schwartz on 04/05/2013 at 1:15 PM0 comments


VDI Demand Remains Tepid

A Forrester report released this week found that only 8 percent of IT shops in North America and Europe plan to deploy VDI and/or hosted Windows desktops. This is consistent with a recently fielded study by Redmond magazine.

Though we asked the question in broader terms than Forrester, of the roughly 1,130 respondents to our online survey, 63 percent said they will have a hosted desktop or VDI implementation -- though mostly for a small percentage of employees -- over the next 12 months, compared with 54 percent last year.

That nine-point increase would suggest roughly 9 percent of shops will implement VDI for the first time this year. The growth, though moderate, is slightly more than the Forrester sample revealed; Forrester found the same percentage had VDI/desktop hosting plans last year.

But as Redmond's Kurt Mackie reports, about one-fourth (24 percent) of respondents were interested in VDI yet had no plans to deploy it, according to the Q4 2012 study. That figure represents a decline from the 33 percent showing interest in VDI in Forrester's Q4 2011 study. It should be noted that Forrester is comparing surveys fielded last year and the previous year -- our conclusions come from a single survey fielded in February.

The Redmond magazine survey, which was fielded to those who subscribe to the magazine, showed 37 percent will have no VDI/hosted desktops this year, compared with 46 percent who say they currently have no such deployments.

Of those who do have VDI/hosted desktops, the largest group of respondents -- 30 percent -- said they have rolled them out only to 10 percent or less of their employees. That suggests it's being used for specialized scenarios such as help desks, employees who may not need to use a PC all day and perhaps contractors. That's not a surprising finding. Those planning rollouts to 10 percent or less of their employees will decline this year to 28 percent.

Interestingly, a growing number of shops planning rollouts in the next 12 months plan to do so to a larger percentage of their workforces, according to our survey. Whereas only 12 percent planned rollouts for up to 25 percent of their workforces, that number will increase to 17 percent this year. And those who plan even larger deployments of up to 50 percent of their employees will increase from 6 percent in 2012 to 11 percent this year. Very few plan rollouts to larger portions of their users and only 1 percent said they plan to do so to their entire workforce.

Are you planning a VDI/hosted desktop implementation -- or expanding/upgrading an existing one? If so, what's your infrastructure of choice? If not, why is it unappealing? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/03/2013 at 1:15 PM0 comments


Victims of Cyber Attacks Lawyer Up

It's no secret to anyone in IT that the number of reported cyber attacks is on the rise. And while victims have historically avoided at all costs disclosing the fact their systems were penetrated, some now have to do so.

The result, The Wall Street Journal reports today, is that many victims are hiring law firms or seeking legal counsel so they can invoke attorney-client privilege. I'm not a lawyer or an expert in compliance but my first thought was: "really?" In one of several such examples cited by the Journal, Nationwide Insurance disclosed a breach in which customer records were accessed. Nationwide reported the breach in compliance with new state laws and under strong urging by the Securities and Exchange Commission that they do so.

The FBI investigated further, even while class action suits were filed on behalf of customers saying that Nationwide failed to protect their information. In response, the insurance giant hired the law firm Ropes & Gray and then declined to comment further, citing the litigation.

While companies need to act in the best interest of their shareholders and customers, clamming up is a double-edged sword. Certainly disclosing more information has its own risks, but hopefully companies like Nationwide are quietly sharing information with the proper authorities so they can better protect themselves and others, as President Obama ordered back in February and emphasized in his State of the Union Address.

The number of breaches disclosed over the past two years has increased 40 percent, according to accounting firm KPMG, the Journal noted, adding that hackers compromised 681 million records between 2008 and 2012. I obtained a copy of the KPMG report, which also noted that 60 percent of all incidents reported were the result of hacking.

The report does show an encouraging development: the healthcare sector, which just a few years ago accounted for the highest percentage of data loss incidents (25 percent) saw that drop to just 8 percent last year. It looks like health care providers are doing something right.

With the recent spate of attacks, such as last week's Spamhaus distributed denial of service (DDoS) attack reported Friday or the recent and quite significant strikes I noted back in March:

We're under siege purportedly by the Chinese, Iranian and Russian governments. Organizations including the Federal Reserve Bank, The New York Times, NBC News, Apple, Facebook, Twitter, heck even Microsoft itself, have all recently sustained cyber-attacks.

As the President noted in his State of the Union Address, "We cannot look back years from now and wonder why we did nothing in the face of real threats to our security and our economy."

Even as we continue to fight back, IT needs to contend with the fact that hackers and cyber-terrorists will only get smarter and find new ways to attack our systems. So what are you doing, or what do you feel needs to be done? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/01/2013 at 1:15 PM0 comments


Microsoft Taps Silicon Valley for Build

Microsoft will court Silicon Valley developers when it holds its Build developer conference June 26-28 in San Francisco, where the company is expected to preview the future of the Windows platform and the apps that will run on it.

Steve Guggenheimer, Microsoft's corporate VP and chief evangelist for the company's developer and platform group, made the announcement at the Visual Studio Live! conference in Las Vegas last week. (Disclosure: Like Redmond magazine, Visual Studio Live! is produced by parent company 1105 Media.) Guggenheimer subsequently posted details on the planned conference in a blog post. Registration opens tomorrow.

Former Redmond magazine editor Keith Ward, who is now editor in chief of Visual Studio Magazine, caught up with Guggenheimer at Visual Studio Live! after the announcement and tried to get some details about what Microsoft will reveal at the upcoming conference.

In the interview, posted on the Visual Studio Magazine Web site, Guggenheimer (not surprisingly) kept details close to the vest. "It's a little early to get into the details," Guggenheimer told Ward. "The key is we'll provide updates across the range of our platforms, and depending on where we are with different pieces we'll give updates. It will be the Windows family of products, Visual Studio, many different things."

As All About Microsoft blogger and Redmond columnist Mary Jo Foley noted in her post, the Build conference will overlap with TechEd Europe, due to take place in Madrid. The implications of that remain to be seen.

Certainly watchers are anticipating Microsoft will have more to say about the next wave of Windows updates, code-named Blue, which is expected to add support for new form factors, among other things. As reported by Foley and Redmond's Kurt Mackie, Microsoft's chief spokesman Frank X. Shaw confirmed the Blue release, albeit with few details.

Not only is it good to hear that Build is in the not-too-distant future, but Microsoft's choice of San Francisco as the venue was a wise move -- the company needs to have Silicon Valley in the fold.

Posted by Jeffrey Schwartz on 04/01/2013 at 1:15 PM0 comments


Spamhaus Suffers Biggest DDoS Attack of All Time

This week's attack on the Spamhaus Project was the worst known distributed denial of service (DDoS) attack, raising the bar on the brute-force weapons at the disposal of cyber assailants.

Spamhaus is often attacked by those who take issue with the fact that it blacklists spammers. But this week's DDoS attack started at 10 gigabits per second and peaked at an unprecedented 300 Gbps, The New York Times reported. "It is the largest DDoS ever witnessed," said Dan Holden, director of Arbor Networks' Security Engineering and Response Team, noting that the unknown attackers were well aware that Spamhaus already had sophisticated cyber defenses.

"It's unique because of the amount of power they've been able to harness," added Dean Darwin, senior vice president for security at F5 Networks. Despite the new level of magnitude, Darwin warned it may just be the tip of the iceberg. "It's the kind of attack we're going to see a lot more of," said Darwin, saying the Spamhaus attack is the latest data point showing the need for CIOs and CSOs to step up their game by providing application-level security to their systems.

"The attacks we've seen in the past are very network centric," he said. "Now we're seeing the sophistication of the attack profiles as being very application centric." In effect, Darwin said unless firewalls, intrusion prevention systems, threat management gateways and malware remediation programs, among other tools, can work intelligently together, victims of DDoS and other attacks will remain vulnerable.

Despite its magnitude, the Spamhaus incident is just the latest in an onslaught of cyber-attacks that have gripped companies of all sizes in recent months, especially some of the nation's largest banks. While DDoS attacks are nothing new, most have lasted a few days or at most a week, Holden said. "To go for months is unprecedented."

Experts also pointed out this week that the largest telecommunications and Internet service providers (ISPs) need to make their networks more intelligent, so they can recognize when a sudden flood of millions of packets aimed at DNS servers is the work of botnets rather than legitimate traffic.

There is a best practice for preventing such spoofed traffic, published by the Internet Engineering Task Force in 2000 and known as BCP38. David Gibson, vice president of strategy at Varonis, a provider of data governance solutions, pointed out in a blog post that roughly 80 percent of providers have implemented it. The problem, he noted, is that 80 percent isn't good enough.

"Just like on the road, where a few (or many) distracted or careless drivers can cause harm to countless others, a group of sloppily configured routers can allow attackers to disrupt critical infrastructure that we've come to depend on," Gibson noted. "We can't turn off DNS. Though it's theoretically possible to make everyone use TCP instead of UDP for DNS queries (which would make these queries much more difficult to spoof), so many people would be adversely affected during the transition that this might make things worse than just living with the DDoS attacks."

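To make the amplification mechanics concrete, here's a minimal sketch in Python using the dnspython library (my choice for illustration -- the library, the resolver address and the query name are assumptions, not details from the Spamhaus attack). It sends the same ANY query over UDP and TCP and compares request and response sizes; the UDP ratio is the "amplification" an attacker gets when the packet's source address is spoofed to point at a victim, while the TCP handshake makes that spoofing impractical.

    # A minimal sketch, assuming the dnspython package is installed.
    import dns.message
    import dns.query
    import dns.rdatatype

    RESOLVER = "192.0.2.53"   # placeholder address -- substitute a resolver you control
    QNAME = "example.com"     # placeholder query name

    # ANY queries are popular in reflection attacks because the answer is
    # far larger than the question.
    query = dns.message.make_query(QNAME, dns.rdatatype.ANY)
    request_bytes = len(query.to_wire())

    # Over UDP there is no handshake, so an attacker can forge the source
    # address and have the much larger answer delivered to a victim instead.
    udp_response = dns.query.udp(query, RESOLVER, timeout=5)
    print("UDP amplification: %.1fx" % (len(udp_response.to_wire()) / float(request_bytes)))

    # Over TCP the three-way handshake defeats that naive spoofing, which is
    # why forcing DNS onto TCP is floated as a (costly) mitigation.
    tcp_response = dns.query.tcp(query, RESOLVER, timeout=5)
    print("TCP response size: %d bytes" % len(tcp_response.to_wire()))
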
Has the onslaught of attacks caused you to change how you defend your company's systems? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/29/2013 at 1:15 PM0 comments


ThreatTrack CEO Explains Why GFI Spun Off Security Business

  • Read Jeff's in-depth interview with ThreatTrack CEO Julian Waits here.

GFI Software earlier this month spun off its security business into a new company called ThreatTrack, whose core assets came from the 2010 acquisition of Sunbelt Software. The move is aimed at creating a separate business targeted at large government agencies and enterprises such as big-box retailers and financial services firms. GFI will continue focusing on its heritage customer base consisting of small and mid-sized companies with fewer than 1,000 users.

Earlier this week, I spent a half-hour chatting with ThreatTrack's CEO Julian Waits, who has spent much of his career in the IT security industry, most recently as general manager of GFI's security business. Waits is a longtime friend of Walter Scott, CEO of GFI, and despite the plethora of companies that offer IT security software and services, the two believe they have a unique set of offerings such as ThreatAnalyzer, based on the company's SandBox technology.

In the interview, Waits emphasized his belief that despite the increased onslaught of attacks from China, Russia, Iran and North Korea, enterprises need to take a more proactive stance in trying to predict where the next threat will originate. "I still believe 80 percent of security problems are risk management problems," Waits said. The company is using Cloudera's Hadoop distribution to apply big data analytics and more proactively predict future threats, Waits explained.

An encouraging development was President Obama's executive order announced last month for the government and private sector to share more information to protect the nation's critical infrastructure. The executive order promises to boost the amount of data that can be collected and analyzed, Waits said.

"The more of that data we can collect, the more fancy analytics stuff we can use to do our job a lot faster," he said. "There's no way we'll be able to anticipate an APT before it's created. God knows they're becoming more and more sophisticated. All we can do is become more sophisticated about how rapidly we can respond to it, and it's going to take a community to do that."

Waits said ThreatTrack has about 300 employees at its Clearwater, Fla., headquarters and is looking for a site in Washington, D.C., where it will staff 50 additional people. The company is also considering a presence in the San Francisco Bay Area, Waits said. In addition, Waits indicated he'll be shopping for companies to acquire, especially those that can help ThreatTrack apply predictive analytics to the big data it collects.

Posted by Jeffrey Schwartz on 03/27/2013 at 1:15 PM0 comments


Bidding War Leaves Dell's Future in Limbo

Dell's bid to go private is very much up in the air, and with it is the future of the company and who will run it. When founder and CEO Michael Dell lined up Silver Lake Partners and an ensemble of other investors, including Microsoft, early last month, it appeared reasonably certain the $24.3 billion deal would sail through.

However, a week after Michael Dell and Silver Lake made the offer, some key shareholders felt it was a lowball bid and indicated they'd vote against it, hoping for a better offer. As the 45-day "go-shop" period to consider superior offers came to a close Friday, the company received two bids.

One was from the private equity fund manager Blackstone Group and the other was from activist investor Carl Icahn, who said he has amassed a 4.6 percent stake in Dell. Under both bids, a portion of the company would remain public.

Icahn wants to buy 58 percent of the company at $15 per share (compared to the existing $13.65 per share offer). Under that scenario, investors could only sell part of their holdings, according to Icahn's proposal, published by Dell this morning. Subsequently, Icahn would hold a 24.1 percent share, Southeastern Asset Management would wind up with a 16.6 percent share, and T. Rowe Price would receive a 9.3 percent share.

Blackstone's offer of $14.25 per share would give shareholders the option of receiving cash or stock, according to the company's proposal. Alex Mandl, chairman of the Special Committee of Dell's board evaluating the bids, issued a statement saying the board will review both bids. "There can be no assurance that either proposal will ultimately lead to a superior proposal," Mandl warned. "While negotiations continue, the Special Committee has not changed its recommendation with respect to, and continues to support, the company's pending sale to entities controlled by Michael Dell and Silver Lake Partners."

If either rival bid is successful, it remains to be seen whether Michael Dell would continue to run the company he built 29 years ago, The Wall Street Journal reported today (subscription required). There were rumblings last week that Blackstone approached former Hewlett-Packard CEO and current Oracle President Mark Hurd about taking the helm of Dell, should Michael Dell not stay on.

Blackstone has more than a passing interest in this deal. The firm hired former Dell exec David Johnson, a key player in the computer giant's acquisition efforts, and he is now "actively involved" in Blackstone's attempt to acquire Dell, sources told the Journal.

While Blackstone has retained Morgan Stanley to help it secure financing of the deal, several reports say GE Capital is a leading contender to acquire Dell's financing arm for $5 billion.

Silver Lake and Dell have insisted they believe their offer fairly values the company and hence don't plan to counter with a higher bid. An increased offer would add to the already high level of debt the investors would take on to go private, thereby extending their risk.

The bigger problem for Dell, however, is customer confidence, warned analyst Roger Kay in a Forbes blog post published today, noting with irony that HP by comparison is suddenly looking more stable than Dell. "During HP's rocky time of management changes and policy reversals, Dell and IBM made as much hay as they could," he said. "Now, HP is looking stable-ish and its customers have begun to calm down."

It's too early to tell whether the committee will accept either offer and, if it does, whether the Dell-Silver Lake team will up the ante. And if the committee rejects both offers in favor of the original one, investors could still nix the deal if they can gather enough votes -- an unusually distinct possibility.

And even though Barron's (subscription required) noted today that HP's situation is much improved and its shares are up 62 percent from its 10-year low in November, the report noted HP's revenues are still declining.

As Dell customers, how would you like to see this play out? And is the current uncertainty leading you to shop elsewhere? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/25/2013 at 4:59 PM2 comments


Will Microsoft Stand Out In the Big Data Fray?

Microsoft took a key step forward in its quest to bring "big data" to the cloud this week when it released the public preview of its Windows Azure HDInsight offering. The cloud-based service, first made available on a limited basis last fall, aims to let enterprise customers process huge volumes of structured and unstructured data using Microsoft's SQL Server and the Hortonworks distribution of Apache Hadoop.

A growing number of organizations have started using file management systems based on the Apache open source Hadoop Distributed File System (HDFS). The Java-based file repository lets users store huge volumes of unstructured information in massively distributed clusters based on commodity servers. Using query tools, users can rapidly find and access that content.

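To give a flavor of how such a cluster processes data (a generic Python illustration using Hadoop Streaming, not anything specific to HDInsight or Microsoft's tooling), here is the classic word-count job expressed as a mapper and a reducer; Hadoop runs copies of each across the cluster, close to where the data blocks live.

    #!/usr/bin/env python
    # mapper.py -- reads raw text on stdin, emits "word<TAB>1" for every word
    import sys

    for line in sys.stdin:
        for word in line.split():
            print("%s\t%d" % (word, 1))

    #!/usr/bin/env python
    # reducer.py -- Hadoop sorts mapper output by key, so identical words
    # arrive together; this script sums the counts for each word
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current_word:
            if current_word is not None:
                print("%s\t%d" % (current_word, current_count))
            current_word, current_count = word, 0
        current_count += int(count)
    if current_word is not None:
        print("%s\t%d" % (current_word, current_count))

A job like this would typically be submitted through Hadoop's streaming JAR, passing the two scripts as the mapper and reducer and pointing the input and output paths at HDFS directories -- or, in HDInsight's case, driven through the service's Web tools and APIs mentioned below.
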
Organizations of all types -- law enforcement agencies, retailers, financial services firms and health care providers, among others -- are rapidly gravitating to Apache Hadoop to store information gathered from sources such as social media, news feeds and user-generated content in order to determine trends to deliver insights and intelligence in near real time.

With scores of startups and established players all jumping on the Hadoop bandwagon, Microsoft has hitched its wagon to the Hortonworks distribution, which the company emphasizes is 100 percent Apache-compatible. The HDInsight service lets organizations spin up Hadoop clusters in Windows Azure in a matter of minutes, noted Eron Kelly, general manager for Microsoft's SQL Server group, in a blog post this week.

"These clusters are scaled to fit specific demands and integrate with simple Web-based tools and APIs to ensure customers can easily deploy, monitor and shut down their cloud-based cluster," Kelly noted. "In addition, [the] Windows Azure HDInsight Service integrates with our business intelligence tools including Excel, PowerPivot and Power View, allowing customers to easily analyze and interpret their data to garner valuable insights for their organization."

Among the first to test HDInsight is Ascribe, a U.K.-based Microsoft partner that provides healthcare management systems for hospitals and large medical practices. Its solution handles the lifecycle of patient care using key new components of Microsoft's portfolio including Windows 8-based tablets, SQL Server 2012 and HDInsight to perform trending analysis using anonymous patient data.

Paul Henderson, Ascribe's head of business intelligence, demonstrated the application at the GigaOM Structure Data conference in New York this week. "Rather than building our own server farm or incurring huge capital costs, HDInsight provides us with the ability to process that volume of stuff at scale and that is a huge benefit," Henderson told me after the demonstration.

But at the Structure Data conference, there were scores of other players talking up new ways of capturing, analyzing and processing huge amounts of data. While Microsoft once only had to worry about players like Oracle, IBM and Teradata, now there are a vast number of players looking to offer modern alternatives to traditional SQL database stores.

For example, a growing number of customers are using NoSQL databases such as those based on MongoDB (the leading player here is 10gen) to store data in the cloud, as well as a number of other approaches I'll touch upon in future posts. "The majors, as we may call them, Amazon, Google and Microsoft all have multiple plays going on in the cloud database world," noted Blue Badge Insights analyst Andrew Brust, who was on a cloud database panel at Structure Data.

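For readers who haven't touched a document store, here's a minimal sketch of what working with MongoDB looks like from Python using the pymongo driver (the connection string, database and field names are invented for illustration). The point is that records are schemaless, JSON-like documents queried by field, rather than rows in a predefined table.

    from pymongo import MongoClient

    # Placeholder connection string -- point this at your own deployment.
    client = MongoClient("mongodb://localhost:27017")
    events = client["demo_db"]["clickstream"]   # database and collection are created lazily

    # Documents need no predeclared schema; each one can carry different fields.
    events.insert_one({"user": "alice", "page": "/pricing", "tags": ["trial", "mobile"]})
    events.insert_one({"user": "bob", "page": "/docs"})

    # Query by field value, much like a WHERE clause in SQL.
    for doc in events.find({"page": "/pricing"}):
        print(doc["user"], doc.get("tags", []))
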
Despite the growing number of players and approaches, Brust believes many customers will look for the mainstream providers to embrace them. "We're seeing specialized products from specialized companies doing things that the major databases have glossed over," Brust said. "That's great, but when it's going to really become actionable for companies is when the mega-vendors either implement this stuff themselves or do some acquisitions and bring these capabilities into their mainstream databases that have the huge installed bases, then it becomes a lot more approachable to enterprise companies."

Noted cloud analyst David Linthicum, also on the panel, was more skeptical. "It pushes them to be more innovative but I haven't seen much innovativeness come out of these larger database organizations in the last couple of years," Linthicum said.

As for Microsoft, the company is addressing growing demand for in-memory databases, brought to the mainstream last year by SAP with HANA. In-memory databases can perform queries much faster than those that read and write to disk. Microsoft revealed its plans to add an in-memory engine, code-named Hekaton, to the next release of SQL Server at the PASS Summit back in November.

"This is a separate engine that's in the same product in a single database and will have tables optimized for either the conventional engine or the in memory engine," Brust said. "You can join between them so you are going more towards an abstraction."

But with a growing number of startups looking to reinvent the data repository, such as NuoDB, Hadapt and the new Pivotal initiative from EMC, Microsoft is now in a more crowded field. While Microsoft has broadened its data management portfolio with SQL Azure and now HDInsight, the requirement to find, process and analyze new types of information is greater than ever. All eyes will be on Hekaton and Microsoft's ability to deliver new levels of performance to SQL Server.

Posted by Jeffrey Schwartz on 03/22/2013 at 1:15 PM0 comments


Should Microsoft Step Up OpenStack Support?

While software and service providers continue to support OpenStack, the open source cloud infrastructure initiative, there are still a healthy number of key players who haven't committed to it (or have done so in a limited way).

The behemoths that come to mind are Amazon Web Services, Google, Microsoft, Oracle and Verizon (including its Terremark business unit). That's quite a few heavy hitters missing, even though some 200 players have joined the OpenStack Foundation including AT&T, Cisco, Citrix, Dell, EMC, HP, IBM, Intel, NetApp, Rackspace, Red Hat, Suse, Ubuntu and VMware.

Yet some of those, notably Citrix and VMware, have offered half-hearted support for OpenStack. Citrix was an original sponsor of the project until last year, when it launched its own CloudStack open source project, handing the cloud management platform from its 2011 cloud.com acquisition to the Apache Foundation, much to the consternation of Rackspace. Oracle hasn't endorsed OpenStack either, though last week's acquisition of Nimbula at least gives it a foot in the door; its intentions remain to be seen.

As for VMware, it became an overnight supporter following last summer's acquisition of software-defined networking (SDN) provider Nicira, a major OpenStack contributor. Still Rackspace execs say VMware needs to step up and make its ESX hypervisor fully OpenStack compatible.

The same holds true for Microsoft. Though not a member of the OpenStack Foundation, Microsoft has participated in various OpenStack meetings and has provided some support for Hyper-V (outlined in this OpenStack wiki). A Microsoft spokeswoman also pointed out that Microsoft is enabling System Center 2012 to monitor, orchestrate and backup open source cloud environments.

The only hypervisor that is fully supported as a host in an OpenStack cloud is the open source KVM, said Jim Curry, a Rackspace senior VP. "There's no reason there shouldn't be five or 10 well-tested and implemented hypervisor choices for customers -- that's the promise of OpenStack," Curry said.

Only 25 percent of Hyper-V's features work in an OpenStack cloud, Curry added. "We can run Windows as a guest, but if customers require it on the host, it would require Microsoft to step up the Hyper-V compatibility," he said.

When it comes to Microsoft's own cloud efforts such as Windows Azure, the company likes to talk up its support for Java, PHP and Python, though it's still geared toward .NET applications. Microsoft is also readying infrastructure as a service capabilities for Windows Azure via support for virtual machines including Linux instances.

Given the limited Hyper-V support for OpenStack, it's probably a long way off, if ever, that Windows Azure would gain OpenStack compatibility -- at least from Microsoft unless there's some stealth effort in Redmond.

Would you like to see Microsoft step up its OpenStack support or is that best left to others? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/20/2013 at 1:15 PM1 comments


Readers Chime In: Bring Back the Start Button for Windows 8!

Last week's post questioning whether Microsoft should have kept the traditional Start button in Windows 8 really hit a nerve. I can't recall a topic of late that has generated such a flood of comments and e-mails.

An overwhelming majority of respondents said Microsoft should bring it back -- some went as far as to say the company never should have ditched it in the first place. The pushback comes as PC sales are on pace to decline for the second year in a row and analysts are predicting Windows 8 tablets will have only a single-digit share in 2017. That's not the only bad news. As I noted Friday, it now appears Microsoft has sold only 1.5 million of its Surface devices, according to a report by Bloomberg.

While it's a stretch to blame all that on the absence of the Start button, the visceral reaction is a clear example of how resistant people are to change. "Microsoft should bring back the Start button, I think that it would definitely aid the transition of all Windows users into Windows 8, increasing sales as the result," wrote Kirk Lewis, a tech support expert in Northridge, Calif. "The start button has been the intuitive glue that all desktop versions of Windows have shared since Windows 95."

An IT pro in Montana who supports PCs running everything from Windows 98 to Windows 8 described the removal of the Start button as "probably the worst thing Microsoft could have done."

One consultant said: "This is honestly the first version of Windows I can't recommend to my business customers," while an IT pro expressed his annoyance with Microsoft. "The fact that Microsoft has once again decided that they are smarter than the thousands of administrators who use it every day has turned me off," he noted. "Why replace it for the simple sake of replacing it?! Bad call. This looks to me to be the next ME or Vista."

For business and enterprise environments, many IT pros are concerned users will struggle to find and easily launch their programs when running Windows 8 in the classic Desktop mode.

"I have used Win8 enough to understand Microsoft's thinking, but I believe they made a mistake by not including a fully functional Win7 style desktop so people could ease into it," argued another IT pro. "Use of Windows 8 on non-touch screen devices is awkward at best, and without the complete Win7 style interface, it requires a pretty steep initial learning curve. This is because you can't just click past the tile start screen and go to work on the desktop. You have to figure out how to use it, and how to mimic gestures with a mouse, to get anything done. I dread trying to support users through that transition. And this is from someone who has supported users thru the green screen to GUI transition, the Win 3.x to Win 95 transition, Win95 to XP, etc."

Also, third-party Start programs may not be an option for many shops, he added. "Many corporate environments will not allow it," he noted.

Despite all the complaints, a vocal minority believes all this fuss is about nothing and people simply need to change their habits. "It has been since 1995 since Microsoft made a real change in the desktop GUI," says Allen McEuin, MCSE from Louisville, Ky. "Leave it be and get used to it. It ain't 1995 anymore." And said another from Brazil: "Bringing back the Start menu would be a major step backwards to this technology."

Then there are some who are actually glad to see it go. "I do not like the Start button and happy that they removed it. It is cleaner and easier to use the 'search' function or the Start screen. I have Win8 on all my computers and it really is not different than Windows 7 except [minus the] Start menu. I understand the annoyance it causes to learn new ways to do it but it is also more efficient and cleaner this way."

Noel from Kentucky agreed: "It took very little time to get to know the interface. The desktop is a click away. You can have all the apps listed with a click or have shortcuts galore. You can still have multiple Windows open. I think some of these respondents may be pushing against Windows 8 for other reasons. Bottom line is that it is a very fast and secure OS."

Microsoft had no comment on the reaction to last week's post on the topic. Though the company has surprised me plenty of times before, I wouldn't bet on Redmond bringing the Start button back.

So how am I dealing with this? I've created shortcuts on the desktop to applications I use most frequently. Using Search is another easy alternative. There are some keyboard shortcuts as well. But it appears many are less forgiving of Microsoft on this one.

Posted by Jeffrey Schwartz on 03/18/2013 at 1:15 PM32 comments


Will Google's Android-Chrome Consolidation Threaten Windows?

Google's decision to pull the man known as the father of Android from that group is a huge gamble by CEO Larry Page and one that poses a new threat to Microsoft and Apple.

The unexpected move to bring Android under the auspices of Sundar Pichai, who already oversees Chrome and Apps, suggests Page remains determined to upend the PC business just as his company has done with tablets and smartphones.

While Page didn't say that outright, read between the lines: "Today we're living in a new computing environment," he said in a blog post announcing the move. "While Andy's a really hard act to follow, I know Sundar will do a tremendous job doubling down on Android as we work to push the ecosystem forward."

Since their debut in 2011, Chromebooks have not moved the needle much. Page's move this week suggests he has a master plan to give Chromebooks a major lift. Windows fans and Microsoft shouldn't shrug this off, at least not yet, lest we forget how many did the same when Google first announced the Android phone OS. By bringing Android, Chrome and Chrome OS together, Google will be able to draw from the Android ecosystem to make Chromebooks a more compelling alternative to Windows PCs (and Macs, for that matter).

In addition to piggybacking on the Android ecosystem for Chrome OS, leveraging technology from Android could have all kinds of ramifications, including the ability to provide application portability and an improved touch interface, among other things. While the initial crop of Chromebooks boasts low price tags of less than $500 (some half that amount), the devices have had limited capabilities -- effectively only allowing you to use the Chrome browser to run Google Apps in the cloud.

Things are changing, though. The latest entry, the Google Chromebook Pixel, starts at a hefty $1,299, which PCWorld labels "an expensive curiosity." And one of the key objections to Chromebooks -- that your Google Apps documents have to live in the cloud -- is going to go away. As Derik VanVleet, a senior solutions engineer on the sales team of Atlanta-based Cloud Sherpas, Google's largest partner, explains, Google will build a native Quickoffice client into Google Apps on Chrome, thanks to the company's acquisition of Quickoffice last year.

"This whole conversation of documents fidelity and document conversion goes away," VanVleet told me. "Quickoffice allows me to natively edit Microsoft documents in their native format with virtually 100 percent fidelity, better than what Microsoft is able to offer in Office 365 Web Apps. Once Quickoffice is fully integrated to Google Apps whole conversation is going to go away and Microsoft is going to have a real problem."

All About Microsoft blogger and Redmond magazine columnist Mary Jo Foley has often pointed to the possibility of Microsoft bringing together its Windows and Windows Phone groups. Weighing in on Google's OS consolidation move this week, she again raised the specter of Microsoft finally merging the two groups and platforms.

There's another factor to consider. PC makers are fuming over Microsoft's covert development and ultimate release of its Surface tablet. That has led quite a few, the latest being HP, to jump on the Chromebook bandwagon. But just as PC makers need to keep Microsoft in check, they need to keep an eye on Google, which is offering its own branded devices and, with its Motorola Mobility unit, has the resources to continue that push.

Google's realignment also comes as Bloomberg today reports that sales of Microsoft's Surface devices are more dismal than originally projected. Microsoft has sold only 400,000 Surface Pros and 1 million Surface RTs -- half of what it projected, sources told Bloomberg. Microsoft did not comment on the report.

Chromebooks haven't made a dent either, so right now Google and Microsoft are in the same boat. When I asked last month whether HP and others jumping on the Chromebook bandwagon might bolster Chrome OS's prospects, the response was that Chromebooks will have their place, but so will Windows 8 and Windows RT.

Do you think Google's move Wednesday will up the ante in making Chromebooks more appealing to consumers and business users? And how do you think Microsoft should respond? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/15/2013 at 1:15 PM4 comments


Google Moving In On Redmond, Literally

Hardly a week goes by when Google doesn't announce an enterprise signing on to its cloud services, notably Google Apps and to a lesser extent its Google App Engine platform as a service (PaaS), usually at the expense of Microsoft. But now Google is stepping up its assault on Redmond in more ways than one -- the company's moving in.

Google is doubling its capacity in nearby Seattle, The New York Times reported last night. In addition to embarking on an aggressive campaign to recruit engineers and other cloud experts in the area, the company is adding 180,000 square feet of datacenter capacity there. The new office in Kirkland, Wash., is about the size of a Walmart Supercenter, making the location its third largest -- outpaced only by its headquarters in Silicon Valley and its facilities in New York.

One of Google's latest recruits from Microsoft is 10-year veteran Brian Goldfarb, who now heads up cloud platform marketing at the search engine giant. He acknowledged to The Times that his new employer is coming from behind. But he insisted Google will be a force to be reckoned with. "We have the best data centers on the planet. You can't really give engineers a bigger, badder thing to work on," he said.

The latest endorsement of Google's cloud offering came yesterday from Swiss-based Holcim, said to be one of the world's leading suppliers of cement and related building materials. The company is rolling out Google Apps worldwide to its 40,000 users.

"We chose Google Apps because it will help us concentrate on our core businesses, and bring our employees, customers and partners across the globe closer together," wrote Holcim CIO Khushnud Irani, in a blog post Tuesday. On top of the Google Apps productivity and collaboration suite, Holcim plans to use the Google Search Appliance, Google App Engine and Google Apps Vault, the latter to archive and manage e-mail content.

"With the introduction of Google Enterprise products, we are embracing modern cloud-based delivery models that ease the technical complexities of internal operations," Irani said. "Such a model takes away the technical complexities of internal operations and instead allows IT personnel to focus on closer involvement with their business counterparts in creating deeper business value."

Among other big wins for Google are Japan's All Nippon Airways, California's City of Monterey and The Chicago Public Schools -- the latter deploying 40,000 seats in 681 locations.

Microsoft has announced its fair share of big Office 365 wins as well, such as the retail giant JC Penney, the Department of Veterans Affairs and the State of Texas, which is deploying 100,000 seats.

So who's winning this battle? "I would say we see them neck and neck at this point," said Sara Radicati, president and CEO of The Radicati Group, a market research firm that tracks enterprise use of e-mail, collaboration, social networking and security software and services. "Google Apps tends to appeal more/gain more traction with SMBs and non-US customers, Microsoft seems to be doing better in small- to medium-size market in the US, so if you look at it on a worldwide basis I would say they are about even right now, whereas in the U.S. I would put Microsoft slightly ahead at this time."

Meanwhile, some of the key arguments Microsoft is making to warn customers of the risks of moving to Google Apps are becoming moot. Perhaps the biggest one, that you can't store documents locally, is about to go away, said Derik VanVleet, a senior solutions engineer on the sales team of Atlanta-based Cloud Sherpas, Google's largest partner.

Google is going to make Quickoffice, the company it acquired last year, a native client with its Chromebooks, effectively letting users store documents on their systems, rather than requiring them to only save them in the cloud, VanVleet pointed out.

"The capability from our understanding is very close," he told me. "What that allows me to do is use the Google Apps platform and this whole conversation of documents fidelity and document conversion goes away. QuickOffice allows me to natively edit Microsoft documents in their native format with virtually 100 percent fidelity, better than what Microsoft is able to offer in Office 365 Web Apps."

The rivalry between Microsoft and Google in the cloud is nothing new. In fact I've followed this for many years. Now with the revamp of Microsoft's Office 365 released two weeks ago, and Google's moves toward reducing the shortcomings of its offering, the battle is entering a new phase.

Who do you think has the edge? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/13/2013 at 1:15 PM0 comments


Should Microsoft Bring Back the Start Button?

Perhaps the most controversial of David Pogue's Windows 8 tips was the first one, suggesting third-party tools that let users effectively reinstate the Start menu that made its debut with Windows 95 but was removed from the new OS.

Pogue said those who bemoan the absence of the Start menu can get it back via third-party apps such as StartIsBack, Classic Shell, Start8, Power8, Pokki and StartW8. Admittedly, I tried Start8 and found it to be a nice crutch. It made me wonder, why doesn't Microsoft just bring it back in the next service pack?

I asked a spokeswoman at Microsoft if there are any plans to bring back the Start menu and she said the company had no comment. One critic slammed Pogue for even suggesting users bring back the Start menu, arguing such a move would be a step back. "I look forward to an article on Windows 8 by a real enthusiast, not someone who explains how to undo the interface," wrote Mark Justice Hinton.

While Microsoft removed the Start menu to wean people off the traditional way of using Windows and onto the new app model delivered through the Windows Store, some habits die hard. Some commenters agreed, asking why Microsoft didn't leave the Start menu intact.

"The start menu would have minimized retraining, provided continuity and allowed both the tablet and desktop communities to be well served," a commenter known as D Weeberly wrote. "A simple option could have been provided to disable it for those with an opposite preference."

For some, the removal of the Start menu has turned them off to Windows 8 altogether, especially those who had to get new PCs and were content with the way earlier versions of Windows worked. "I have several friends and acquaintance that have needed new computers recently and HATE W8," one commenter said. "I have now found StartW8 and 8GadgetPack so I can now at least offer them a working solution."

Keep in mind, if Microsoft wanted to play hardball, it could have architected Windows 8 so it wouldn't allow third-party Start menu add-ons or permit them in the Windows Store. Do you think if Microsoft brought back the Start menu it would be a setback for its effort to move people to its new modern Windows Store model? Or would it actually help ease the transition from Windows XP or Windows 7 to the new OS?

Let me know if you think Microsoft should bring back the Start menu by dropping me a line at [email protected].

Posted by Jeffrey Schwartz on 03/11/2013 at 1:15 PM91 comments


Why Is Microsoft's Cyber Security Leader So Optimistic?

When Scott Charney spoke at last week's annual RSA Conference in San Francisco, he expressed optimism about the future of IT security despite a new onslaught of attacks and threats facing consumers, businesses and the nation's critical infrastructure.

I initially wondered why Charney, the Microsoft corporate VP for Trustworthy Computing, was so upbeat. After all, we're purportedly under siege by the Chinese, Iranian and Russian governments. Organizations including the Federal Reserve Bank, The New York Times, NBC News, Apple, Facebook, Twitter -- heck, even Microsoft itself -- have all recently sustained cyber-attacks.

When President Obama last month issued his Executive Order directing immediate information sharing between the Federal government and the private sector, notably operators of critical infrastructure, he painted a bleak picture of the looming threats in his State of the Union Address. Many in IT were already aware of the problems the President raised, but he nevertheless amplified the issue.

On deeper reflection, however, it bears noting that the President appointed Charney to the National Security Telecommunications Advisory Committee (NSTAC) in 2011, so presumably he has Obama's ear. In a blog post, Charney described the President's order, issued alongside Presidential Policy Directive 21, as a key step forward:

When reviewing the key definitions, approaches and activities outlined in the Executive Order, it is fairly well aligned with a set of global principles essential for enhancing cyber security. More specifically, it recognizes the principles of active collaboration and coordination with infrastructure owners and operators, outlines a risk-based approach for enhancing cyber security, and focuses on enabling the sharing of timely and actionable information to support risk management efforts.

It is important to see these principles reflected in the Executive Order for three reasons. First, it is the private sector that designs, deploys and maintains most critical infrastructure; therefore, industry must be part of any meaningful attempt to secure it. Second, both information sharing and the implementation of sound risk management principles is the only way to manage complex risks. Finally, while critical infrastructure protection is important, it cannot be the only objective of governmental policy; privacy and continued innovation are also critical concerns.

Microsoft itself has come a long way in improving the security of its products, as noted in last year's Redmond magazine cover story "10 Years of Trustworthy Computing." Charney used last week's RSA keynote to reiterate the security improvements introduced in Windows 8, such as support for the Unified Extensible Firmware Interface (UEFI) specification, which enables users to securely boot their systems.

Charney also has played a key role in fostering cloud security. Jim Reavis, executive director of the Cloud Security Alliance, told me last week why the group gave Charney its industry leadership award this year.

"Microsoft was the first major provider to support our STAR registry by being very transparent," Reavis said. "They've also created vendor neutral tools to help assess their own readiness based on the cloud industry's best practices. He's that rare individual that not only does the right thing with their company but with his competitors and the industry at large. Scott is the epitome taking on that shared responsibility."

The latest cyber-attacks are also putting the spotlight on public cloud computing, where security remains the biggest inhibitor to adoption. I was at IBM's Pulse Conference in Las Vegas earlier this week, and the security fears surrounding the cloud were not lost on Big Blue as it talked up its cloud portfolio and launched its open standards-based initiative for cloud computing.

Kris Lovejoy, IBM's general manager for security services and the company's former chief security officer, believes the move to cloud computing will actually provide enterprises with more secure IT than their existing infrastructures.

"Cloud is fundamentally more secure or inherently more securable than traditional infrastructure environments because of the way it's designed and because of the way you can replicate security controls on top of cloud environments," she said during a panel discussion for media and analysts Tuesday at Pulse. "One of the biggest challenges we have in traditional infrastructure is complexity -- too many pieces of technology. In a cloud environment everything can be standardized so we can be secure."

Do you feel your systems are on the road to becoming more secure despite a new crop of sophisticated and persistent threats including the risk of cyber terrorism? Or are you of the mind that for every step forward IT security takes, these new risks are taking us a step or two back? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/06/2013 at 1:15 PM0 comments


Breakdown in Procedures Caused Azure Outage

When Microsoft's Windows Azure cloud storage service went down worldwide late last month, the company confirmed within a few hours the cause of the massive meltdown: an expired SSL certificate crippled the service late Friday, Feb. 22, into the next day.

Furious customers wanted to know how something as simple as renewing an SSL cert could fall through the cracks. Even worse, how could that become a single point of failure capable of bringing down the entire service worldwide? It turns out the cause was a "breakdown in procedures," according to Mike Neil, general manager for Microsoft's Windows Azure service, in a contrite post-mortem posted Friday detailing the cause of the outage and plans to ensure the error isn't repeated.

 "A breakdown in our procedures for maintaining and monitoring these certificates was the root cause," Neil noted. "Additionally, since the certificates were the same across regions and were temporally close to each other, they were a single point of failure for the storage system."

Neil explained that Windows Azure has an internal service called the Secret Store that manages hundreds of certificates used to securely run the cloud service. The Secret Store alerted the team on Jan. 7 that the blob, queue and table certificates would expire on Feb. 22. It turns out the storage team did update the certificates but failed to flag a storage service release as one that included the updated certs.

"Subsequently, the release of the storage service containing the time critical certificate updates was delayed behind updates flagged as higher priority, and was not deployed in time to meet the certificate expiration deadline," Neil explained. "Additionally, because the certificate had already been updated in the Secret Store, no additional alerts were presented to the team, which was a gap in our alerting system." Hence, that's how it fell through the cracks.

So what's Microsoft doing to ensure this doesn't happen again? Neil said the Windows Azure team will improve the process for detecting certificates that need to be renewed. Production certs due to expire in less than three months will generate an operational incident and will be tracked as what he termed a Service Impacting Event.

"We will also automate any associated manual processes so that builds of services that contain certificate updates are tracked and prioritized correctly," he noted. "In the interim, all manual processes involving certificates have been reviewed with the teams. We will examine our certificates and look for opportunities to partition the certificates across a service, across regions and across time so an uncaught expiration does not create a widespread, simultaneous event. And, we will continue to review the system and address any single points of failure."

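To put Neil's new threshold in concrete terms, here's a minimal sketch -- my own illustration, not Microsoft's actual tooling -- of that kind of check: walk an inventory of certificates, as a secret store might expose it, and flag anything expiring within roughly three months so it gets tracked as an incident rather than buried in a routine release queue. The certificate names, dates and 90-day window are assumptions for illustration.

```
# Hypothetical watchdog sketch, loosely modeled on the process Neil describes:
# anything expiring within ~3 months gets escalated as a tracked incident.
from datetime import datetime, timedelta

# Illustrative inventory; a real system would pull this from a secret store.
CERT_INVENTORY = {
    "blob-storage-frontend": datetime(2013, 2, 22),
    "queue-storage-frontend": datetime(2013, 2, 22),
    "table-storage-frontend": datetime(2013, 2, 22),
}

ESCALATION_WINDOW = timedelta(days=90)  # "less than three months"


def certs_needing_escalation(inventory, now=None):
    """Return (name, days_left) pairs for certs inside the escalation window."""
    now = now or datetime.utcnow()
    flagged = [
        (name, (expires - now).days)
        for name, expires in inventory.items()
        if expires - now <= ESCALATION_WINDOW
    ]
    return sorted(flagged, key=lambda item: item[1])


if __name__ == "__main__":
    for name, days_left in certs_needing_escalation(CERT_INVENTORY):
        # A real pipeline would open a tracked incident here, not just print.
        print("ESCALATE: %s expires in %d days" % (name, days_left))
```

The point isn't the code; it's that the check runs independently of whether a fix has already been queued for release, which is precisely the gap Neil describes.
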
Posted by Jeffrey Schwartz on 03/04/2013 at 1:15 PM5 comments


Do SharePoint Users Clamor for Yammer?

The release of SharePoint 2013 and this week's launch of an upgraded Office 365 for small and midsize customers offer enterprises a mind-numbing number of new ways for workers to find and share information. With My Sites and SharePoint Communities, the new collaboration platform offers social networking features such as activity streams aimed at fostering greater interactivity among enterprise users.

But for a richer social networking experience, Microsoft last year spent $1.2 billion to acquire Yammer, one of the leading enterprise social networks. Among other things, Yammer's APIs connect to other enterprise business systems such as SAP. Microsoft has been relatively quiet about its plans for Yammer, though at the SharePoint conference in November Microsoft said Yammer will use SkyDrive Pro to store data, the same cloud-based service it's using for SharePoint 2013 and Office 365.

Also in the pipeline this summer, users will be able to preview and edit files from their Yammer feeds via Office Web Apps. Microsoft this week also announced it will implement technology in Yammer that will translate languages, for those in multi-national and multi-lingual environments.

The expectation is that Microsoft will next year offer a version of Yammer that ties to SharePoint. But it remains to be seen how many organizations want to add that extra component. "There's not much demand for it, to be honest," said Errin O'Connor, founder and chief SharePoint architect at EPC Group, which consults with organizations that typically have more than 5,000 users and is already engaged in 25 SharePoint 2013 implementations. "People want to get their feet wet first with My Sites in SharePoint 2013 before they sign on another big ticket item."

Shane Young, founder of SharePoint911, which Rackspace acquired last year, said a lot of customers are trying to figure out their enterprise social networking strategies, but social is not a key reason organizations are looking to upgrade to SharePoint 2013. "From the people I've talked to at this point, a lot of them have the understanding they need to do more with social and SharePoint offers it, so it has their interest but I'd be lying if I told you a lot of these companies had a solid social strategy in place," said Young, a SharePoint MVP.

Yet others I've talked with say many organizations are clamoring to add social networking to SharePoint. A survey of SharePoint customers conducted back in August by Forrester Research found 38 percent plan to implement SharePoint's social networking features, while 25 percent have no plans. Do you have a social networking strategy? If so, where do SharePoint and Yammer fit in it? Share your thoughts or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/01/2013 at 1:15 PM0 comments


Getting to Know Windows 8 and You

A few months ago, David Pogue, the personal technology columnist at The New York Times, penned his weekly column on how to choose a tablet.

The holiday gift-giving guide recommended the usual suspects: the various iterations of the Apple iPad, including its new sibling the iPad mini; a variety of Android-based tablets; and the latest Kindle Fire upgrade. But conspicuously absent was any mention of the recently released Surface RT from Microsoft and devices from other OEMs running Windows 8 and its Windows Store (aka Metro)-only Windows RT.

I was miffed by the omission because I knew Pogue was quite familiar with the most significant revamp of Windows ever. Having written O'Reilly Media guides on how to use Windows 7 (as well as Vista and XP), Mac OS X Lion and the iPhone, Pogue has a new book out: "Windows 8: The Missing Manual" (also from O'Reilly Media), officially released today.

Wondering what led to the Windows 8 exclusion, I asked Pogue if it ended up on the cutting room floor, or if perhaps he simply didn't consider the Surface RT -- with the new Windows 8 and Windows RT touch-enabled interface -- a legitimate tablet. I made clear my interest was in understanding his point of view on the matter, not to critique his column.

In his response, Pogue reiterated that he'd spent a lot of time with Windows 8 and had already covered it weeks earlier -- and would have more to say in the future (and he did). I invited him to share his favorite Windows 8 tips with Redmond readers, and the result is this month's cover story, "20 Windows 8 Tips." While editing it, I tested his tips, and I hope they're as useful to you as I found them. Be sure to share your own Windows 8 advice at Redmondmag.com or send your tips directly to me at [email protected].

Quite a few Redmond readers have taken a wait-and-see stance regarding Windows 8. Many of you panned the preview version before its Oct. 26, 2012, release, while others have warmed up to it. Personally, the more time I've spent with it, the more I've come to like it. But in order for Windows 8 to succeed, the catalog in the Windows Store is going to need many more apps.

Just over four months after the OS release, Microsoft recently disclosed it has sold 60 million Windows 8 licenses. But PC shipments have declined. Microsoft is betting it can make up for that in sales of Windows tablets.

I encourage you to get to know Windows 8. And I'd like to get to know you better. This is my first Redmond View column as Redmond magazine's new editor (available in March's Redmond magazine), though I'm no stranger here, having covered IT and Microsoft since the early 1990s -- and spent the last six years with this publication and its sister titles.

But enough about me. A number of exciting things are in the pipeline for Redmond, including a refresh of Redmondmag.com and an extensive readership survey that will help us understand where you are and where you're headed. We look forward to publishing the results.

Technology has shifted rapidly over the past few decades, and now it's changing faster than ever. In the meantime, I want to hear from you. Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/27/2013 at 1:15 PM5 comments


Microsoft Is Latest Cyber Attack Victim

Microsoft has joined a large parade of organizations announcing they are victims of hackers who've infiltrated their systems, infected them with malware and/or stolen data.

In recent weeks, The New York Times, Wall Street Journal, NBC News, Apple, Facebook and Twitter are just a handful to come out and say they've been hit. In a blog post late Friday, Matt Thomlinson, Microsoft's general manager for trustworthy computing security, revealed the attack it sustained was similar to those that hit Apple and Facebook. He said there was no evidence that customer data was stolen. Here's what he said:

"As reported by Facebook and Apple, Microsoft can confirm that we also recently experienced a similar security intrusion.

Consistent with our security response practices, we chose not to make a statement during the initial information gathering process. During our investigation, we found a small number of computers, including some in our Mac business unit, that were infected by malicious software using techniques similar to those documented by other organizations. We have no evidence of customer data being affected and our investigation is ongoing.

This type of cyberattack is no surprise to Microsoft and other companies that must grapple with determined and persistent adversaries (see our prior analysis of emerging threat trends). We continually re-evaluate our security posture and deploy additional people, processes, and technologies as necessary to help prevent future unauthorized access to our networks."

Indeed, the growing number of such admissions begs the question: Are we under siege more than we have been in the past? Or are companies putting aside their concerns that such admissions are embarrassing and risk other liabilities in order to ensure they comply with the regulations that govern them? No doubt it's a combination of both.

When President Obama announced his cyber security directive earlier this month in his State of the Union Address, many IT security experts may have rolled their eyes, but it nevertheless appears to have raised the profile of the growing cyber threats and the urgency for organizations to work with the government without compromising customer privacy. It will be interesting to hear what Microsoft's corporate VP for trustworthy computing Scott Charney has to say in his RSA keynote address tomorrow.

Regarding Obama's cyber security directive, Charney echoed concerns that there needs to be a balance between cooperating and maintaining flexibility. In a Feb. 14 blog post two days after Obama's directive, here's what Charney had to say:

"It will remain important that government and industry work together to manage carefully the most significant risks to our most critical infrastructures. To that end, we must remain focused on the desired security outcomes and recognize that owners and operators of critical infrastructures must retain the flexibility to manage risks with agility, implementing practices and controls that are both practical and effective. Continued collaboration between the government and the private sector will be essential in ensuring the success of this Executive Order"

It's clear that the sophistication and determination of cyber attackers continue to rise dramatically. A months-long investigation reported by The Times last week traced a spate of attacks to the Chinese military, a charge China's government vehemently denies despite a deep trove of evidence pointing its way, including an extensively researched 76-page report from the cyber security consultancy Mandiant.

Today The Times reported that, in the wake of President Obama's directive and the latest allegations, the administration is treading carefully and not calling anyone out, noting the sensitivities of challenging China's new president, Xi Jinping. Equally sensitive are other purported purveyors of such attacks, such as those from Iran and Russia.

Nevertheless, the latest report "...illustrates how different the worsening cyber-cold war between the world's two largest economies is from the more familiar superpower conflicts of past decades -- in some ways less dangerous, in others more complex and pernicious."

No doubt this will take center stage at this week's annual RSA Conference 2013, and we'll keep you abreast of what you can do to protect yourself from the growing threats.

Posted by Jeffrey Schwartz on 02/25/2013 at 4:59 PM0 comments


The Irony Behind the Windows Azure Meltdown

Just as I was getting ready to call it a week late Friday afternoon, Microsoft's Windows Azure cloud storage service went down worldwide. As I reported, Windows Azure storage was unavailable because of an expired SSL certificate.

The global outage of Windows Azure late Friday into Saturday is ironic, considering last week's release of a study finding that Windows Azure storage offered the fastest response times of five large cloud networks -- the others being those operated by Amazon Web Services, Google, HP and Rackspace. Good thing for Microsoft that Nasuni, the vendor that ran the shootout, wasn't testing Windows Azure at that time.

Once the service was back up Saturday, I posted an update noting that Microsoft had fixed the problem and users could once again access their data. The company said it was 99 percent available early Saturday and completely restored by 8 p.m. PST. But the damage was already done and many customers and partners were furious.

In comments posted on a Windows Azure forum, Sepia Labs' Brian Reischl, who first pointed to the SSL certificate as the likely culprit, seemed to feel users should cut Microsoft some slack. Reischl said letting an SSL certificate fall through the cracks is a mistake anyone could make. "I know I have. It's easy to forget, right?" he posted. "It's an amateur mistake, but it happens. You end up with some egg on your face, add a calendar reminder for next year, and move on."

But one has to wonder how Microsoft, which has staked its future on the cloud and has spent billions to build Windows Azure into one of the largest global cloud services, could not have put in safeguards to prevent the domino effect that occurred when that cert expired, much less have a mechanism in place to know when all certificates are about to expire. Putting expiration dates in admins' Outlook calendars would be a good start.

Of course there are more sophisticated tools to make sure SSL certificates don't expire. Among them is the certificate monitoring and expiration management component of SolarWinds' Server & Application Monitor, a Redmond reader favorite. Another option, not so coincidentally, hit my inbox this morning: Matt Watson, founder of Stackify, spent a few hours over the weekend developing a free tool called CertAlert.me, which allows a site owner to scan the Web sites it owns and track SSL and domain name expirations.

"It happens a lot," Watson told me in a brief telephone conversation regarding outages such as the one that struck Friday, which affected Stackify. "All you can do is sit on your hands and pray," he said, adding that years ago he had to deal with an expired SSL certificate himself. "You buy them and you forget about them and the next thing you know your site's gone. It's one of those things that get overlooked."

Asked what the business opportunity is in offering this free service, Watson said he saw it as a chance to bring exposure to the startup's namesake offering, a Windows Azure-based server monitoring platform targeted at easing access for developers while ensuring they don't have access to production systems.
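
For the curious, the core of such a check is small. Here's a hedged sketch -- not CertAlert.me's or SolarWinds' actual implementation, and the hostnames are placeholders -- of how a script might pull a site's TLS certificate and report how many days remain before it expires.

```
# Hypothetical sketch: fetch a site's TLS certificate and report days until expiry.
# Hostnames are placeholders; this is not how any particular commercial tool works.
import socket
import ssl
from datetime import datetime

SITES = ["www.example.com", "www.example.org"]  # illustrative list of sites to scan


def days_until_expiry(hostname, port=443, timeout=10):
    """Connect over TLS and return the number of days before the cert expires."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=timeout) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # getpeercert() reports notAfter like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (expires - datetime.utcnow()).days


if __name__ == "__main__":
    for site in SITES:
        try:
            print("%s: certificate expires in %d days" % (site, days_until_expiry(site)))
        except OSError as err:
            print("%s: check failed (%s)" % (site, err))
```

Pointed at the right endpoints and run on a schedule, even something this simple beats relying on memory or a calendar entry.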

Indeed, you can bet Microsoft is going to ensure it doesn't happen again. "Our teams are also working hard on a full root cause analysis (RCA), including steps to help prevent any future reoccurrence," said Steven Martin, Microsoft's general manager of Windows Azure business and operations, in a blog post apologizing for the disruption. Given the scope of the outage, Microsoft will offer credits in conformance with its SLAs, Martin said.

This is not the first outage Microsoft has had to explain and probably won't be the last. And we all know the number of well-publicized outages Amazon Web Services has encountered in recent years.

If you're a Windows Azure customer, did last week's slipup erode your confidence in storing your data in Microsoft's cloud? Drop me a line at [email protected].

Note: This post was updated to clarify that the Windows Azure outage affected Stackify.

Posted by Jeffrey Schwartz on 02/25/2013 at 1:15 PM3 comments


HP Shoots for the Moon with Low-Power Servers

Hewlett-Packard shares shot up 13 percent this afternoon after the company reported better-than-expected -- but still poor -- earnings yesterday evening. Investors seemed to ignore the company's declining PC business, perhaps focusing instead on its plans to deliver its next-generation datacenter offering, called Project Moonshot, next quarter.

Project Moonshot is based on a revamped server architecture announced last year that replaces traditional Intel Xeon and AMD Opteron processors with low-power chips from ARM and Intel, the latter using Atom CPUs, both of which are found in tablets and smartphones. These servers are targeted at enterprises running large datacenters, cloud service providers and others running large Web sites.

CEO Meg Whitman said while servers based on Project Moonshot will start to ship next quarter, HP won't be in full production until 2014. Whitman believes Project Moonshot will have a major impact on its enterprise and datacenter business.

"We expect this to truly revolutionize the economics of the datacenter with an entirely new category of server that consumes up to 89 percent less energy, 94 percent less space and 63 percent less costs [than] traditional x86 server environments," she said on yesterday's call. "This is exactly the technological inflection that can fuel the exponential growth of hyperscale computing."

Whitman said HP has received its first order from a company operating in Japan, which is still reeling from the earthquake and resulting tsunami two years ago that knocked out much of its power grid. As a result, space, energy and the cost of compute are at a premium there, she indicated.

In a CNBC interview this morning, Whitman talked up HP's enterprise business noting it makes up 43 percent of the company's profits. "I don't think many people understand that," she said.

While she may see the enterprise as HP's salvation, HP's deteriorating PC sales -- down 8 percent with margins of just 2.7 percent -- continue to plague the company. Consumer sales declined 13 percent, commercial business was down 4 percent while desktops and notebooks dropped 10 and 14 percent respectively, HP reported.

Though all PC vendors are in similar boats, HP's share of "smart connected devices," which consist of PCs, tablets and smartphones, dropped 8.5 percent in 2012 year-over-year, while Samsung grew 119 percent, Apple 44.3 percent and Lenovo 61.4 percent, IT market researcher IDC reported yesterday. Only Dell saw a greater decline, of 12.9 percent. For the most recent quarter, HP was the only one of the big five to see its share decline.

Whitman said HP will continue to emphasize "personal systems," which include PCs, tablets and yet-unannounced smartphones. She also underscored HP's plan to spread its personal systems portfolio beyond Windows, pointing to this month's launch of Chromebooks, based on Google's Chrome OS.

"We like the overall market but there is a transition from more traditional form factors to new form factors," Whitman said. "There's also a transition from one operating system to multi operating systems and we are pursuing a multi operating system strategy. At HP, we've got to reallocate resources from our PC business to our mobility business, from one operating system to multi operating systems, and we have to allocate resources to services, because profit pools are shifting here."

Responding to a question about longstanding rumors and continued calls to break up HP into separate companies, Whitman didn't entirely shut the door on such a move but indicated it's not what she favors. "No one knows more about computing than HP," she said. "It's a core part of the DNA of this company. We like being able to go from the device to the datacenter but it's a tough transition to manage."

Continuing her refrain that rebuilding HP is a multiyear strategy, Whitman believes the company is on the right track. Do you? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/22/2013 at 1:15 PM1 comments


SharePoint in the Cloud Is 'Up in the Air'

Earlier this month, I pointed to a survey by Metalogix that found 55 percent of enterprises will run SharePoint on premise only, while 45 percent will deploy SharePoint in hybrid clouds that combine on-prem installations with Office 365 or third-party cloud providers. Only 10 percent said they are planning to run SharePoint purely in the cloud.

While that's not far off from the prevailing wisdom, it's important to keep in mind that it's hard to draw firm conclusions on a sample of 100 people, especially a crowd attending Microsoft's SharePoint Conference. Their mere presence suggests they may be ahead of the curve compared with typical SharePoint users.

One of Metalogix's rivals, AvePoint, today announced the release of DocAve 6 Service Pack 2, the latest iteration of its SharePoint management and deployment platform. The new release boasts governance features designed to help organizations determine, based on predefined compliance requirements, what data can safely be moved to the cloud and what information must stay in house. It is also designed to help customers manage SharePoint 2013 on premises and in the cloud along with Office 365.

I asked Shyam Oza, AvePoint's senior product manager for administration, migration and cloud strategy, if he agreed with Metalogix's core findings. "I really do think it's up in the air," Oza said. "While that number might be accurate or a snapshot of right now [actually in late November], it's a number that's shifting very quickly. We've had phone calls with customers who in the middle of last year said they would go to the cloud and are now saying it's too sensitive."

Do you plan to move any of your SharePoint farms to the cloud or deploy new ones via Office 365, Windows Azure or other third-party cloud providers? I'd love to speak with you to hear how you're making the move -- or why you're not. Please share your plans, issues and concerns with me at [email protected].

Posted by Jeffrey Schwartz on 02/20/2013 at 1:15 PM0 comments


NewsGator Looks Beyond SharePoint Amid Yammer Tie

Microsoft today said its $1.2 billion acquisition of Yammer has resulted in record growth for the cloud-based provider of enterprise social networking. At the same time, Yammer's key rival NewsGator plans to extend the reach of its enterprise social networking platform beyond SharePoint.

In a vague news release announcing the growth of Yammer, Microsoft disclosed that sales quadrupled last quarter and the total user base grew to seven million. During the quarter, 290 new customers added Yammer, including GlaxoSmithKline, SABMiller and TGI Fridays.

Since the deal closed last year and Yammer was folded into the Office Division, Microsoft has said little about how it will integrate Yammer into SharePoint. What Microsoft has revealed is that it will integrate Yammer with SkyDrive Pro, the cloud-based storage Microsoft is offering with SharePoint 2013 and Office 365. Yammer will also give SharePoint users the ability to preview and edit SharePoint files from their Yammer news feeds via Office Web Apps.

These two features, slated for release this summer, will make it easier for users to find and share information in the Yammer interface. While Microsoft hasn't said how and when it will more tightly integrate Yammer into SharePoint, NewsGator does not see it as a threat. NewsGator has a large installed base of large enterprise customers (many with tens of thousands of employees) and the company will start to become less focused on SharePoint as its primary platform, as reported by Computerworld last week.

Brian Kellner, NewsGator's VP of products, told me the report was accurate but went a little deeper, explaining that the shift doesn't mean its social networking tools won't continue to support SharePoint. "We're not ending our SharePoint integration," Kellner said. "We have a great business on SharePoint but we have other things we can do to extend that value."

NewsGator is adding connections to Salesforce.com's Chatter, that company's software-as-a-service-based enterprise social networking offering, as well as to SAP applications. Kellner said NewsGator is also focused on helping bridge SharePoint running on premises with Office 365, and on securing mobile access through a Microsoft Windows Azure-based service rather than requiring VPN access.

"These are all investments that are stretching out away from SharePoint but still it's running on top of SharePoint," Kellner said. Looking ahead, NewsGator aims to offer the core logic of its Social Sites platform without requiring SharePoint. "Today Social Sites must have SharePoint but at some point we'll have something that doesn't have to have SharePoint." When? Perhaps later this year, he said.

But he emphasized that doesn't mean the company is walking away from SharePoint. "It's not an either-or proposition," he said, noting last week the company issued a new release that will let organizations use Social Sites on both SharePoint 2010 and 2013 using the same code base. This is important, he noted, because organizations are likely to run SharePoint farms with both versions for many years to come.

If adding social networking to SharePoint is on your roadmap, are you going to rely on what Microsoft brings forward with Yammer, or are you going to use third parties like NewsGator to extend your social networking footprint? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/20/2013 at 1:15 PM0 comments


Microsoft Still Ranked Most 'Indispensable' to CIOs

While Microsoft dropped a few notches on the annual Harris Interactive Reputation Quotient this week, falling from No. 2 to No. 15, Redmond does top a more important status list: it's the most indispensable vendor to big-spending CIOs.

At least that remains the finding in Piper Jaffray's quarterly CIO Survey released to its clients this week, which found Microsoft was by far their most critical "mega-vendor" for the future. In its survey of 135 CIOs, 61 (or 45 percent) picked Microsoft, more than double the mentions of the No. 2 vendor, Oracle. The ranking below those two, in order, includes SAP, Cisco, IBM, EMC, Hewlett-Packard and Apple. Interestingly, 10 percent or fewer picked each of the latter four, with only 4 percent choosing Apple.

While that's not the first time Microsoft has topped that list, the company widened its lead this year with 45 percent giving Redmond the thumbs-up, compared with 33 percent in 2012.

"CIOs state that 'there are really no alternatives to Microsoft,'" said the report. "[Others said] 'MS services are getting better and will allow us to move more to the cloud,' and 'we are highly invested in their technologies and dependent on them extending their platforms.'"

Piper Jaffray senior research analyst Mark Murphy goes on to note: "We believe Microsoft's dominance in the enterprise is underappreciated, and some of the threats against Microsoft, such as alternatives to the Windows desktop OS in the enterprise or productivity software, may be over-hyped in the near term. That said, keep in mind that our CIO survey does not address the large consumer business for Microsoft, which faces much more intense competitive pressures than its enterprise business."

Looking at what that means in terms of actual spending, Microsoft is still on top in market penetration, with 98.5 percent of respondents (all but two of the 135) reporting they run the company's software. Coming in second was VMware with 91.1 percent, followed by Symantec (78.5 percent), Oracle (76.3 percent), Citrix (71.9 percent) and IBM Software (64.4 percent).

For Microsoft, 56 percent of the survey group plans to maintain consistent spending with Microsoft, while 29 percent will increase the amount sent to Redmond by up to 10 percent. A smaller sample (7 percent) will increase spending with Microsoft by more than 10 percent while only 5 percent said their Microsoft spending will decline by up to 10 percent.

Overall that translates to spending growth for Microsoft of 2.1 percent, bested only by fast-growing cloud-based human capital management provider Workday (4.7 percent), VMware (3.7 percent) and ServiceNow, a cloud-based provider of IT management tools (2.9 percent).

While both Linux and Windows Server will show increased growth, Linux is on a higher trajectory, according to the survey. Last year 36.7 percent expected Linux to grow in their organizations, and this year 33.3 percent say it will; for Windows Server, the figure fell from 34.9 percent last year to 14.8 percent this year. Not surprisingly, Unix and mainframe environments will decline further, though a substantial footprint should remain unchanged. When I talk to those who prefer Linux, they make it very clear that Windows is not an option in their eyes.

Microsoft's more obvious threat is on the desktop, including tablets and phones, where investors continue to raise questions on the company's moves. Its calculated risk to offer its own Surface-branded hybrid PC-tablet has had the consequence of motivating OEMs including Samsung, Acer and Hewlett-Packard to offer Chromebooks -- PC-like devices running Google's Chrome OS.

Another shoe dropped this week when a report surfaced that HP is developing Android-based tablets and phones -- not a huge surprise, especially in the wake of Microsoft's growing ties to Dell. Maybe these OEMs would have done this regardless of Microsoft's Surface play, but nonetheless Windows 8 -- good as it is, with strong ties to Windows Server -- is going to have to earn its way into consumer hands and enterprises alike.

Perhaps no one put the changes more poignantly than Tod Nielsen, a former Microsoft executive whom I recall meeting on a number of occasions in the 1990s. Nielsen, who testified in 1998 along with then-Microsoft executive Paul Maritz in the company's defense during its famous antitrust trial, recalled those days in a speech last week at the Parallels Summit in Las Vegas.

"When I was at Microsoft, and I left in 2000, if you told me when I would be in business meetings where everybody would have a computer around the room and the majority of the computing devices would be non-Windows PCs, Macs or new types of tablets, I would have thought you were on drugs. There was no way it's even possible," Nielsen said.

Nielsen, who went on to head BEA Systems, spent time as COO at VMware and is now an exec with its newly launched Pivotal offshoot, added:  "Today in our industry, it's very common to see PCs just being part of the fabric but they're not the dominant part of the fabric that they once were. If you think how many native Win32 applications you use in your daily life, the only one I use is PowerPoint. Everything else is a Web app and is not native to Windows. It's interesting how things have changed."

While Nielsen's loyalties have shifted over the years, Microsoft is still viewed as the most relevant company by two-thirds of software developers surveyed by Evans Data, according to a study released this week. But No. 2 Google was seen as likely to dominate in three years, especially among younger developers under the age of 25.

"The developer landscape is shifting as developer demographics change," said Evans Data CEO Janel Garvin, in a statement. "The age of software developers in North America has been trending younger since 2009, and as a new generation of developers comes on stage they bring new perceptions of the industry and its leaders."

Depending on your point of view, the CIOs surveyed by Piper Jaffray are out of touch, a skewed sample -- or maybe the threats to Microsoft are indeed overstated. But few would dispute that Microsoft needs to get younger developers and users into its camp. What's your take? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/15/2013 at 1:15 PM4 comments


Obama Issues Executive Order To Tackle Cyber Threat

President Barack Obama yesterday issued an executive order mandating that government agencies share information about cyber threats with state and local governments and the private sector.

It's the latest effort by the President, who revealed the order in his State of the Union address, to combat the growing number of attacks that have hit the federal government, businesses and operators of critical infrastructure.

Just last week, the Federal Reserve was the victim of an "Anonymous" hack in which user data from the Fed's Emergency Communications System was breached, though the Fed reportedly said its critical operations weren't affected. Reports of major cyber-attacks across the public and private sectors have become routine, and the President made no bones about the fact that the risks of cyber terrorism loom large.

 "America must also face the rapidly growing threat from cyber-attacks," Obama said in last night's address. "Now, we know hackers steal people's identities and infiltrate private e-mails. We know foreign countries and companies swipe our corporate secrets. Now our enemies are also seeking the ability to sabotage our power grid, our financial institutions, our air traffic control systems. We cannot look back years from now and wonder why we did nothing in the face of real threats to our security and our economy."

The Presidential Policy Directive (PPD) on Critical Infrastructure Security and Resilience directs the heads of all federal agencies and departments to identify and remediate threats to critical infrastructure and to ensure policies for continuity. The directive emphasizes information sharing without disrupting existing privacy policies:

Greater information sharing within the government and with the private sector can and must be done while respecting privacy and civil liberties. Federal departments and agencies shall ensure that all existing privacy principles, policies, and procedures are implemented consistent with applicable law and policy and shall include senior agency officials for privacy in their efforts to govern and oversee information sharing properly.

The directive may get a warmer reception from privacy proponents because, while it orders the government to share information with the private sector and operators of Internet infrastructure, the flow only goes one way. The private sector and operators are not required to share information with the government, Forbes points out.

Nevertheless, the i2Coalition, a lobby group consisting of hosting providers including Rackspace, Softlayer and Hedgehog Hosting, called on the White House to support the controversial Cyber Intelligence Sharing and Protection Act (CISPA) information-sharing bill, which two members of the House of Representatives are set to re-introduce today. Originally introduced last year, it died in the Senate and the White House hasn't supported it to date.

"Companies like those that make up the i2Coalition -- the providers of the nuts and bolts of the Internet -- must have a seat at the table in any discussion about the future of cybersecurity," noted Christian Dawson, the i2Coalition's co-founder and board chair, in a blog post, which included an online petition for Homeland Security Secretary Janet Napolitano. "We must work to achieve voluntary best practices that promote the growth of an open Internet. To be successful, the efforts must be truly voluntary and not a result of heavy-handed 'incentives' that effectively compel compliance."

CISPA is more controversial than the President's order because the former would let companies such as Facebook or Google share information regarding cyber attacks with the Feds, notes PCMag, while the President's order only requires the government to share information with the private sector.

However the battle over CISPA plays out, it looks like the administration has taken a meaningful step toward shoring up defenses against cyber threats.

Posted by Jeffrey Schwartz on 02/13/2013 at 1:15 PM1 comments


Influential Shareholders Oppose Dell Buyout

It's looking more like the leveraged buyout of Dell for $24.4 billion is not a fait accompli -- not that it was ever a sure thing.

When announcing last week's unprecedented agreement for Michael Dell to buy out the company he founded -- backed by an investor group led by Silver Lake Partners, key financial institutions and a $2 billion loan from Microsoft -- the parties left a 45-day window open to consider any better offers that might come along. The prevailing belief last week was that such offers weren't likely.

That may still be the case, but with key investors opposing the deal, the buyers may have to sweeten their offer or the LBO could be on ice. Dell's largest outside shareholder, Southeastern Asset Management, which owns nearly 8.5 percent of the computer giant, last Friday kicked off the opposition to the deal, saying it "grossly undervalues the company."

In a letter to Dell's board, also filed with the Securities and Exchange Commission in an SC 13D filing, Southeastern Asset Management not only stated it would vote against the deal but said it wouldn't rule out a proxy fight or litigation. The chorus has grown louder this week, with Yacktman Asset Management, Pzena Investment Management and Harris Associates speaking out against the deal.

The latest shoe dropped yesterday when mutual fund giant T. Rowe Price, which holds 4.4 percent of Dell's shares, said it opposed the deal. "We believe the proposed buyout does not reflect the value of Dell and we do not intend to support the offer as put forward," T. Rowe Price chairman and chief investment officer Brian Rogers said in an e-mailed statement.

The current opposition amounts to 15 percent of all shares, though in order to nix the deal, 42 percent would need to vote against it, The Wall Street Journal reported.

Dell has predictably kept mum on the matter, though reports say Silver Lake and Michael Dell have no intentions to raise their offer. We'll see if more shareholders step up.

Posted by Jeffrey Schwartz on 02/13/2013 at 1:15 PM2 comments


Surface Pro Sellout!

I took my lumps from some of you over the weekend in response to Friday's prediction that the first edition of the Surface Pro, released over the weekend, would be a bust.

Based on reviews I read from respected critics such as David Pogue, the personal technology columnist for The New York Times, and All About Microsoft blogger and Redmond columnist Mary Jo Foley (granted, she's not a technical reviewer per se), among others, the consensus was that it's a nice machine that lacks one critical component: the ability to run all day. Pogue only got three and a half hours out of his review unit. Microsoft rates the device as having at least 5 hours of battery life.

As far as I was concerned, that was a deal breaker. If I'm going to shell out more than a thousand dollars for a portable machine, I want it to work all day. In response, Dan in Iowa said, "I believe what you're saying is you've never owned a laptop, and you've never seen the Surface Pro. Hmmm... I guess I'll trust your judgement [sic] then?"

Actually I have owned an Asus netbook that has averaged 14 hours a day (and it still does run all day) for three years. I chose it over the iPad, which had just debuted at the time, to replace a Windows XP-based Fujitsu Lifebook. The Lifebook had served me well since 2004, providing 8 hours of battery life. Unfortunately it wasn't upgradeable to Windows 7. The three-pound netbook cost less than an iPad and was sufficient for work in the field. Six years earlier, I had invested nearly $2,000 for the three-pound Fujitsu Lifebook for its long battery life.

So when it comes to power, I'm biased -- maybe spoiled, but that's what I need if I'm using a machine out of the office all day. That said, I'm not a "Microsoft hater," as two respondents concluded. It is my responsibility to advocate for IT pros to ensure they're getting the most from Microsoft and its third-party ecosystem, and I believe people should use the technology that best suits their needs at a price within their budgets.

As promised, I went to a nearby Microsoft retail store yesterday to see the Surface Pro for myself (I would have gone Saturday but a blizzard pre-empted my plans). I spent an hour with it and liked what I saw. It was notably thicker than the Surface RT. That's because it has an Intel i5 processor and needs a fan to keep it cool. That also explains the limitation in battery life.

When I predicted the inaugural Surface Pro would be a flop, it was from the point of view that most business professionals will wait for the next version, which presumably will come out later this year with Intel's next-generation Haswell processor. Some reports suggest that chip will at least double the battery life of the next Surface Pro. Knowing that, why wouldn't one wait? I will note that if Haswell doesn't deliver, as some reports are now suggesting, that would be bad news.

There are rumors that Microsoft may be planning to offer a docking station or extra battery to address the power issue. That would be a good thing but at what price?

Personally, I'm going to wait for the next release of the Surface Pro and compare it with what comes out from Microsoft's OEM partners. Or perhaps I'll sacrifice power and go with an Intel Atom-based device that runs Windows 8 Pro all day long.

Time will tell if I overstated my case that the first edition of the Surface Pro will be a flop. Reports over the weekend that the 128 GB units were sold out on the Microsoft Web site and in stores suggest otherwise. No one seems to want the 64 GB version since it's only $100 cheaper and its capacity is paltry.

At the Microsoft retail store, one sales rep told me that they're flying off the shelves. "My friend works at the Best Buy in Commack [NY] and there were lines outside the store and they sold out," the rep told me. I later went to a Best Buy next door in Huntington, NY and found a display unit past the more prominently showcased iPads. It was one with a 64 GB drive. No one was looking at them but the Best Buy sales rep said his store only received three units, two of which had 128 GB drives and sold out right away.

Back at the Microsoft retail store, I asked a customer there with his young daughter looking at the Surface Pro what he thought of it. "Looks pretty good," he said. "You can do a lot more with it than the iPad, which doesn't have Word or Excel." While that latter point may be a temporary argument (that remains to be seen), I still agree.

Meanwhile, Microsoft must now contend with some angry customers. According to comments underneath the Friday blog post in which Panos Panay, the Microsoft corporate VP and Surface Pro team engineering lead, announced the release, quite a few customers are livid about the unavailability of 128 GB units after the company already missed its January target for releasing the devices.

It's not clear whether Microsoft purposely limited supply in order to create demand or if the company simply missed its delivery targets. Either way, the Surface Pro got more buzz than I had expected.

Posted by Jeffrey Schwartz on 02/11/2013 at 1:15 PM13 comments


Surface Pro: Ironically Powerful While Lacking Power

Microsoft's Surface Pro hybrid tablet-PC goes on sale tomorrow, and based on the early reviews, the debut model is impressive yet shaping up to be a flop. It appears Microsoft is strategically rushing this device out to showcase how powerful a tablet can be, even if it lacks enough battery power to get you through half a business day.

In other words, don't expect the Surface Pro to share the fate of the Kin, Microsoft's short-lived consumer smartphone that the company pulled from the market in 2010 less than two months after releasing it. Rather, consider the Surface Pro a prototype of what's to come later this year.

I haven't seen the new Surface Pro but I plan to check it out this weekend. I'm not sure I'd personally plunk down more than a grand for any PC, but if I did, I'd expect battery life in the range of 10 hours; anything less would be a non-starter. Sight unseen, I wouldn't even consider a portable machine for as little as $500 that couldn't get me through the day -- after all, what's the point?

In his review yesterday, New York Times columnist David Pogue noted the test unit Microsoft sent him ran only 3.5 hours; others have said it gets a paltry four-plus hours. Surely Microsoft can't believe these systems are going to fly off the shelf at the price of $1,129 (for a 128GB unit with a keyboard) with that kind of battery life. At least Microsoft better hope they don't sell like crazy, because if they do, I predict it's going to have a lot of disappointed customers.

More likely, Microsoft is releasing this lacking-in-power Surface Pro to show IT pros how ironically powerful a tablet PC running Windows 8 can be compared with the competing devices that now rule the roost, namely iPads, Kindle Fires and Android-based tablets.

With the Surface Pro's Intel Core i5 Ivy Bridge processor, Microsoft can now stem the bleeding and showcase to CIOs how a Windows 8 tablet can do things that low-power ARM-based devices can't, like running Photoshop and other processor-intensive apps.

In a Reddit AMA (ask me anything) chat Wednesday, Panos Panay, the Microsoft corporate VP leading the Surface Pro engineering team, explained why the company decided to sacrifice battery power for more processing power, so to speak.

The product was designed to take full advantage of Windows 8 coupled with the Ivy Bridge core processor from Intel. We created a product that did not compromise speed, performance in any way. With that, we wanted to be the best notebook/laptop product in its class, but still deliver you the tablet form factor. This product is optimized in every way to take advantage of the full third generation core i5 it runs, yet give the best battery life. If you compare it to say a MacBook Air, you will quickly see that pound for pound in battery size vs battery life, you will find optimizations that puts Surface best in its class. That said we picked a smaller battery to be sure we were able to give you the same performance and to keep it thin. This kept the weight under 2lbs, and still kept it thin enough to take advantage of our great Windows work for inking and give you a great inking experience (like pressure sensitive inking, ability to do kanji, great sketching). While these tradeoffs are challenges as much as they are opportunities, we think given the performance and experience you will be getting, it is an exciting product.

I'm still not buying it, and others, like All About Microsoft blogger and Redmond magazine columnist Mary Jo Foley, aren't either. Come June, when Intel is expected to release its 7-watt Haswell processor, touted to offer the most significant boost in battery life yet, the calculus could change. If Haswell lets Ultrabook manufacturers, including Microsoft, deliver truly high-powered, all-day systems, that could raise the stakes for the Surface Pro and Windows 8 Pro.

That's why the initial Surface Pro may be a dud, but it's certainly not headed down the path of the Kin.

Posted by Jeffrey Schwartz on 02/08/2013 at 1:15 PM13 comments


Microsoft's $2 Billion Dell Loan Could Yield Dividends (but No Guarantees)

If shareholders approve Silver Lake's $24.4 billion leveraged buyout bid for Dell, announced yesterday, Microsoft could reap a variety of dividends for its good will in the form of a $2 billion loan. But as far as I can tell, it's a no-strings-attached investment to offer support for a strategic partner with no assurances that Microsoft will get the return it may desire.

In a brief statement acknowledging the loan, Microsoft cryptically said: "We're in an industry that is constantly evolving. As always, we will continue to look for opportunities to support partners who are committed to innovating and driving business for their devices and services built on the Microsoft platform."

Microsoft's loan is effectively high-yield debt, according to The Wall Street Journal, which reported last night that it doesn't give Redmond any "direct role in the management or oversight of Dell." Nor does it give Microsoft "a board seat, equity ownership or formal strategic involvement." 

Interestingly, the Journal also noted neither Silver Lake, nor Dell, needed Microsoft's $2 billion loan to pull off the LBO but nevertheless accepted it to ensure good will between the two companies in wake of Microsoft's release of its Surface hybrid tablet PCs.  Back in October on the eve of the Windows 8 launch, Michael Dell and Microsoft CEO Steve Ballmer sat down with The New York Times to talk about Dell's comfort with the Surface strategy. It was a noteworthy show of faith when other OEMs reportedly were not happy with Microsoft's foray into the hardware business.

As a Dell creditor, will Microsoft's move further unnerve its other OEM partners including Hewlett-Packard, Lenovo, Acer, Asus, Samsung, Toshiba, Sony and others even more? A number of those players already are offering Android tablets, and a growing number of them are rolling out Chromebooks including HP as the latest to release one, as I noted Monday.

Having propped up underdogs including Barnes & Noble, Yahoo and, of course, Nokia, it seems there's nothing to preclude Microsoft from aiding any of its other major PC suppliers if they were in need. The only player I can see falling into that bucket at the moment might be HP, if it were to once again revisit spinning off its PC business or breaking itself up entirely. Despite a brief rumor yesterday suggesting HP was considering breaking itself up (the rumor that keeps resurfacing), there's also evidence HP continues to see merit in keeping itself intact.

The lack of shackles notwithstanding, Dell does appear committed to Windows PCs and Windows Server-based datacenter and cloud offerings, but that commitment wouldn't have been in question even if Microsoft hadn't ponied up the $2 billion.

While the jury is still out on its Barnes & Noble and Nokia investments, Microsoft has benefitted from pumping money into Apple in the late 1990s when it bought $150 million in Apple stock to prop up the then-struggling company. Of course that wasn't altruistic, given legal disputes between the two, a looming antitrust suit by the U.S. government and another avenue to sell Office. When Microsoft later sold its Apple stock, it profited handsomely.

If indeed Microsoft's loan is the equivalent of high-yield debt, Redmond could see a nice return. While this move is not without risk, let's not forget, $2 billion is not a huge chunk of change for Microsoft. Do you think Microsoft's $2 billion loan to Dell was a good move for both players? Drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 02/06/2013 at 1:15 PM1 comments


Is Google's Chromebook a Windows 8 Killer?

While Hewlett Packard CEO Meg Whitman said she is in support of Windows 8, she never promised her company's exclusive support for the Windows OS. HP is apparently hedging its bets with today's launch of its first Chromebook laptop.

The HP Pavilion 14-c010us Chromebook has a 14-inch display, is equipped with 2 GB of RAM, 16 GB SSD and is powered by an Intel Celeron 847 (1.1 GHz) processor. It weighs 4.25 pounds, and HP claims battery life of 4 hours and 15 minutes. It's priced at $329.99.

HP joins rivals Acer and Samsung at the Chromebook party. To date, Chromebooks have not lit the world on fire. Unlike Windows PCs and Macs, Chromebooks are bundled with Google's suite of productivity tools, and the computers presume you're always connected, using Google's cloud infrastructure as their platform.

It's a different approach and, in many ways, mimics the network computers IBM, Oracle and Sun tried pushing in the late 1990s with little uptake. Until recently, I didn't know anyone who owned a Chromebook. That changed a few weeks ago when Andrew Brust, CEO of Blue Badge Insights, tweeted he just bought a Samsung Chromebook.

Upon learning HP jumped into the Chromebook pool this morning, I checked in with Brust to see how he likes his Chromebook (I had made a mental note to do so anyway). He pointed out he needs more hands-on time with his Chromebook to fairly compare it to Windows 8, which he uses all day. Brust also has an iPad, Kindle Fire, Nexus 7 and MacBook Air.

"People I respect have been saying the second gen Chromebooks were surprisingly good, so I decided to buy one, especially given the low price of $249," Brust noted. "The thing is surprisingly useable.  I still prefer to use Windows, or even MacOS, with a full version of Office. But the fact remains that the presence of a touch pad and keyboard makes the Chromebook a true content creation machine and at a price point that achieves parity with the cheapest of content consumption-oriented tablets like the Kindle Fire and Nexus 7.  And the availability of Chrome Remote desktop also makes Chromebooks useable as thin clients that connect back to beefier Windows machines."

He underscored that the Achilles' heel of the Chromebook is its requirement of a constant Internet connection. "But with the addition of the next generation of Chrome packaged apps, which will work offline by default, and run not only on Chrome OS, but also Windows, MacOS and Linux, Google really has something here."

As Microsoft looks to gain momentum for Windows 8, its primary target is offering a superior alternative to competing tablets such as iPads and Google Android-based devices as well as ever-so-slick MacBooks. Should Microsoft also be worried about the rise of Chromebooks?

"For Microsoft, this may just be a thorn in the side, but it's one of many," Brust said. "And with now four important Windows OEMs hopping on the Chrome OS bandwagon, it's got to be impossible for Redmond to ignore.  Meanwhile, I question how much revenue the OEMs can get on such inexpensive devices."

How much OEMs will emphasize Chromebooks remains to be seen, but one can't blame them for hedging their bets after Microsoft launched the Surface PC/tablet, thereby reneging on its 30-plus-year legacy of not competing with them. Some, including Acer CEO JT Wang, have made their displeasure known, while HP is showing it by throwing its new Chromebook in the mix.

Have you used a Chromebook or are you considering one? How would you compare it to various versions of Windows and other computing devices you have used? Will Chromebooks emerge as a true player, or will they just appeal to a limited niche of users, the fate I have predicted for them since their launch? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/04/2013 at 1:14 PM8 comments


Oracle To Acquire Microsoft Lync Connectivity Tool Company

It appears Oracle wants to add networking to its broadening portfolio of datacenter and applications offerings. Oracle today said it will acquire Acme Packet for $2.1 billion, the company's largest acquisition since it bought Sun Microsystems in 2010. It's also noteworthy because the move puts Oracle in direct competition with Cisco, which was rumored for some time to have had its sights on Acme. Another interesting twist: Acme Packet counts Microsoft as one of its key ecosystem partners -- its Session Border Controllers (SBCs) are used to enhance connectivity of Redmond's Lync Server unified communications (UC) platform.

Enterprises and service providers alike use Acme's appliances to boost the reliability, interoperability and security of IP communications links. Because IP is inherently neither reliable nor secure, Acme targets session delivery networks to enhance session-based voice, video, data and UC. Acme's session delivery networks offer session border control and management to ensure prioritized, secure and trusted delivery of such services and apps.

The company provides enterprise SBCs, which Microsoft recommends to boost the reliability and interoperability of Lync when connected to telecom providers' SIP trunks. According to an Acme description of its Lync integration support:

"The session management function routes sessions between your Lync and legacy IP telephony environments, centralizes dial plan management for the entire infrastructure and provides interoperability with non-Lync communications systems. The Net-Net ESD is fully qualified by Microsoft under its Unified Communications Open Interoperability Program and offered in software and appliance configurations that provide efficient, highly scalable solutions for integrating Lync into your network."

A February 2011 TechNet article describes the various network topologies it recommends.

Acme is seen as the leader in providing SBCs, though its revenues have declined amid growing competition. It claims over 1,850 customers in 109 countries have deployed 20,000 systems. The company counts 89 of the top service providers and 48 of the Fortune 100 as customers.

It will be interesting to see whether Oracle will continue and advance support for Lync, or whether Microsoft will turn its sights to Alcatel-Lucent, Juniper Networks or Sonus. Do you use Acme to provide SBC services to your Lync deployment? If so, how do you feel about Oracle acquiring the company? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/04/2013 at 1:15 PM0 comments


Reckoning Time for Office and SharePoint

This week's launch of Office 2013 has put the spotlight on Microsoft's calculated move to condition individuals to pay a yearly fee to use the suite in tandem with managing their e-mail and using SkyDrive to store content. The notion of installing Office on up to five devices (including tablets) is compelling, as contributing editor Brien Posey pointed out this week. Microsoft is betting the farm that individuals and enterprises alike will pay an annual subscription, much as they do to use all the features of antivirus software.

Perhaps that would be a slam dunk for customers if Office were available for the iPad. But it appears Microsoft doesn't want to help boost the growth of iPads at the risk of cannibalizing Windows 8 and its Surface devices, which so far don't seem to have made a dent in the tablet market. I believe Microsoft will ultimately offer Office for the iPad -- the question is when?

The other shoe is expected to drop this month or next, when Microsoft releases the enterprise versions of Office along with SharePoint 2013. As reported in detail back in December, SharePoint 2013 will offer extensive new enterprise social networking capabilities, search, business intelligence and support for cloud deployments. It will also offer parity with the SharePoint Online edition in Office 365.

Some argue these features will compel a larger percentage of shops to upgrade than typically migrate to newly released versions of SharePoint. Microsoft has released new versions of SharePoint every two to three years over the past decade. SharePoint migration partner Metalogix this week did its own survey of SharePoint customers, which found 64 percent plan to upgrade to the 2013 release.

How accurate this forecast is remains to be seen. Jignesh Shah, Metalogix's chief strategy and marketing officer, told me his team conducted 20-minute interviews with 100 IT decision makers attending last November's Microsoft SharePoint conference in Las Vegas.

The fact that those customers were at a SharePoint event may very well have skewed the results, but Shah countered that this is a higher percentage than he has noticed in the past under similar circumstances. "In 2010 it was less than 50 percent [that migrated to the new SharePoint version] over a period of two years. More than 60 percent plan to upgrade [to SharePoint 2013] in the first year," Shah said, though the earlier data point was not based on a formal survey like the one Metalogix conducted this past November.

What's the reason for the uptick this time around? Companies want to take advantage of the social features and support for mobile devices, according to the survey, which is consistent with what I've heard for some time, regardless of whether Metalogix's findings reflect the sentiments of the overall SharePoint community.

SharePoint shops also have major content management headaches. Three years ago, Shah said, Metalogix's average customer had between 50 GB and 100 GB of data in its SharePoint farms. Now 50 percent have more than 1 TB of data in their SharePoint stores and 15 percent have 10 TB, the survey found. Furthermore, the average shop has seen its SharePoint content grow 75 percent over the past year.

The findings also show 55 percent will keep SharePoint on-premises, while 10 percent will go "all-in" to the cloud. The remaining 35 percent will deploy a hybrid approach, running in-house and using cloud services to augment their SharePoint infrastructures.

If you're running older versions of SharePoint, notably the 2003 and 2007 releases (according to Metalogix, 37 percent still have those in their SharePoint farms), you can't migrate to SharePoint 2013 directly -- you must first deploy SharePoint 2010 and then upgrade that to SharePoint 2013. If as many shops plan to upgrade to SharePoint 2013 as quickly as the survey results suggest, that would be good news for Metalogix and others that offer SharePoint migration tools, including AvePoint, Idera and Quest (now part of Dell's software group).

Metalogix sees this as an opportunity for it and its integration partners to let organizations skip the two-hop upgrade step, while ensuring organizations can undertake a migration without losing use of their existing SharePoint systems during the transition. Metalogix also promises its tools will let organizations migrate content while preserving existing metadata, permissions, formatting, time stamps and revisions.

With the pending release of SharePoint 2013, do you plan to migrate in the near term, and where does the cloud fit into those plans? Likewise with Office 2013 -- do you see yourself moving to the subscription model for Microsoft's productivity suite? Feel free to comment below or drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 02/01/2013 at 1:15 PM2 comments


Can BlackBerry Make a Comeback at Microsoft's Expense?

Everything is on the line today for Research in Motion, which launched its long-awaited BlackBerry 10 at a high-profile event in New York. The company, which also renamed itself BlackBerry today, introduced its first two next-generation smartphones -- the Q10, with its signature physical keyboard, and the touch-only Z10.

"It's a new day in the history of BlackBerry," said the company's CEO Thorsten Heins, speaking at this morning's launch. Despite a much-improved app-centric model that looks to build on BlackBerry's reputation for offering real-time and secure communications for enterprises, no one expects the company to lead the smartphone market it pioneered. The best it can hope for is to duke it out with Microsoft for third place.

That's ironic because five years ago, just months after the first iPhone shipped, Microsoft and RIM were jockeying for market leadership -- at least from an enterprise standpoint. Prospects for the iPhone were uncertain at that point and Google's Android was still gestating under the radar. "BlackBerry is the device to beat," industry analyst Rob Enderle told me at the time.

Fast forward five years and USA Today asked if today's BlackBerry 10 launch is the company's Kodak moment, citing market research forecasts from Gartner that BlackBerry sales will decline 36 percent by 2016, giving it just 1.1 percent share of the market. By comparison, Gartner expects Android to have a commanding 55 percent share of the market, iOS 11 percent and Windows Phone 9.7 percent, according to the report.

Ovum analyst Jan Douglas is also skeptical about BlackBerry's prospects. "At its peak, RIM shipped between 12 and 15 million devices per quarter, but there is no way it can hit this number on a sustainable basis once the BB10 launch filters through," Douglas said in a blog post. "Though the new platform should have significant appeal to existing users, we don't expect it to win significant numbers of converts from other platforms. There is little in the new platform that suggests it will have the compelling apps, content stores, or the broader ecosystem that consumers have come to expect in a competitive smartphone platform."

Not everyone believes it's game over for BlackBerry. RBC Capital Markets technology analyst Mark Sue told CNBC he's forecasting 500,000 BlackBerry 10s will ship in the first month, and the company could sell 10 million units in the first year.

"All their competitors need to be paranoid because if you look at a lot of devices, some of them are looking pretty old," Sue said. "A lot of the designs haven't changed over the last four or five years. Where we see a lot of growth is in the other category this year and I think RIM night have a small chance of opportunity if they execute."

Tech columnist David Pogue of The New York Times was skeptical of BlackBerry's prospects as well, but wrote today that BlackBerry delivered a new platform and product without any gaping holes. Because BlackBerry 10 is built on a new real-time, secure operating system, it won't support old BlackBerry apps, but the company developed software that can run 70,000 existing Android apps.

The BlackBerry 10 is also designed to appeal to enterprises by giving users one view of their business and personal data, thanks to a feature called BlackBerry Hub. The question raised by Times reporter Ian Austin is whether enterprises will want to upgrade their BlackBerry Enterprise Servers with this new software, which also supports other mobile phone platforms.

The trouble for BlackBerry is, so do many other mobile device management (MDM) offerings on the market. At the same time, BlackBerry 10 allows IT to better secure access to enterprise data from the phone and to remove all data if access is revoked, should a device be lost or the user leave the company.

Despite a much improved offering, like others Pogue was reluctant to say BlackBerry will see its fortunes revived. But he did argue the company may not be on the brink. "These days, excellence in a smartphone isn't enough. Microsoft's phone is terrific, too, and hardly anyone will touch it," Pogue noted.

But don't expect Microsoft to let BlackBerry get in its way. It has the cash and marketing might to continue its long battle with the BlackBerry.

With the launch of BlackBerry 10, is it on your short list or are you unmoved by the new offering? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 01/30/2013 at 1:14 PM2 comments


Symantec CEO Reveals Long-Awaited Turnaround Plan

Six months after ousting Enrique Salem as CEO of Symantec and replacing him with Chairman Steve Bennett, the company last week revealed plans for a reboot, which include consolidating its management, reducing overhead and reorganizing into 10 business areas. Bennett, the former CEO of software maker Intuit and onetime GE executive, outlined how he plans to remake the company with a plan called Symantec 4.0.

The biggest news around Symantec 4.0 is that the company doesn't plan to shed any key assets, as many had thought Bennett might do. When Bennett first stepped into the CEO seat, analysts were wondering whether he would spin off Veritas, which Symantec acquired in 2004 for $13.5 billion and which was viewed as the root of the company's problems. While it became clear he wasn't going to shed the company's data protection business, there were rumors on the street that Bennett might spin off or sell Altiris, which turned out to be untrue.

During a two-hour live analyst event that was Webcast, Bennett said Symantec 4.0 centers on adding more value to all of its product lines and realigning R&D to allow technologies and intellectual property developed for one group or product to be shared and used across products when it makes sense. The company has historically developed technology in silos.

Francis de Souza, Symantec's group president for enterprise products and services, shared an example of how Symantec is putting this new cross-functional technology sharing into practice. The company has added file system technology developed by Symantec's Storage and Availability Management Group (SAMG) to its Data Loss Prevention and security offerings.

"What that technology allows us to do is to actually understand who's been accessing important files in your environment, again in SAMG technology in a security context," de Souza explained, adding Symantec is also looking to deploy it into its integrated backup and recovery software.

"We believe that allows us to set a new bar in the scalability of our integrated backup offering." he added. "The idea though is that instead of doing this as one-off integration between products, we'll focus on creating centers of excellence and use those technologies across our portfolio in a lot of cases to solve new unmet customer needs. And this isn't just an internal concept. We're going to take these centers of technology and look to leverage them through partners."

To that last point, he emphasized that Symantec has no plans to get into the network security business. "We have this tremendous Data Loss Prevention [DLP] technology that's actually content-aware," he said. "So more than knowing what application you're using, we can tell you what the content is. It may make sense to leverage that technology through a partner ecosystem to go to the next next-generation of network security."

Symantec 4.0 will also involve centralizing various functions. Symantec has made numerous acquisitions over the years, but the companies were never properly integrated. Noting Symantec has copious point solutions, Bennett believes the company has never integrated them across product lines and groups to add value. The company is also looking to improve what it admitted was a poor rate of product renewals, and it is creating a company-wide team to address that.

While Symantec has struggled to successfully integrate Veritas, suggestions that the company should divest it never made sense to me. Data protection and security should work hand-in-hand. Symantec is a popular company among Redmond magazine readers. In the publication's Third Party Readers Choice Awards, 26 of its products came in first, second or third place across various categories, including security, IT asset management and backup and recovery. Seven products came in first place.

That to me says Symantec is one of your key software providers. How do you feel about Bennett's plan to make it a better company to do business with? Feel free to comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 01/28/2013 at 1:14 PM1 comments


Microsoft Financial Report: Is the Worst Over for Redmond?

When Microsoft reported earnings that slightly missed revenue forecasts but beat profit projections yesterday, investors reacted by trading down the company's stock 2 percent in after-hours trading. However, after sleeping on it, Wall Street seems to be seeing the glass half-full.

Microsoft stock was trading up about 1 percent midday today, following several analyst upgrades. While the jury on Windows 8 is still out and Microsoft certainly didn't hit it out of the park, the 60 million licenses sold -- albeit much from deferred revenues -- was enough to convince analysts that it's not a dud.

"We remain positive on the Windows 8 strategy around client, mobile, server, and cloud," Raymond James analyst Michael Turits noted, according to Barrons. "While W8 has not deflected the downward course of PC shipments thus far we expect the acid test to come in C2H13 with improved OEM hardware and distribution, and increasing value of both the enterprise and home ecosystem."

Citigroup analyst Walter Pritchard also was upbeat and raised his price target for Microsoft's moribund stock to $35; the shares were trading at just under $28 midday Friday, Barron's Tiernan Ray reported. "We'd note that while PC market declines in CYQ4 continued, there are signs that the worst may be behind the company," according to Pritchard. "Looking forward, with 10" iPad sales plateauing, touch coming to more PCs, price points coming down, a lower-power Intel SOC architecture [Haswell] launching CYQ2/Q3 and a likely update to Win8 in the fall, Q4 may be the bottom for MSFT's consumer PC business. We look to see further confirmation of this trend."

Indeed Microsoft CFO Peter Klein indicated on yesterday's earnings call that he believes lower cost systems are in the OEM pipeline. "We are working very closely with our chip partners as well as the OEMs to bring the right mix of devices which means the right set of touch devices at the right price points depending on the unique needs of the individual," Klein said on the call. "I think we've learned a lot from that and one of the things you'll see is a greater variety of devices at a bigger variety of price points that meet the differentiated needs of our consumers."

The performance of Windows overshadowed the best performing part of Microsoft's business: Server and Tools. Revenues of $5.19 billion were up 9 percent, suggesting a healthy lift for Windows Server, SQL Server and System Center.

It remains to be seen whether indeed the worst is over for Microsoft but yesterday's report could have been a lot more alarming.

Posted by Jeffrey Schwartz on 01/25/2013 at 1:14 PM0 comments


Last Chance To Get Windows 8 Pro on the Cheap

At the risk of sounding like a car salesman, time is running out. Next Thursday is the last day to take advantage of Microsoft's $39.99 upgrade to Windows 8 Pro -- $14.99 if you purchased a Windows 7 machine after June 2 of last year. I'm not a pitchman for Microsoft, and at Redmond magazine we clearly understand many enterprises coming off Windows 7 upgrades are in no rush to move to Windows 8. Nevertheless, there are many good reasons why IT pros should use Windows 8.  

Anyone who has followed the pricing history of Windows can attest the company has never (at least in recent memory) offered its flagship PC operating system at such a cut-rate price. Who knows if Microsoft will offer Windows at that price again? It's possible it will, especially if Windows 8 sales don't meet Microsoft's and Wall Street's hopes. But this could also prove to be your last chance to get Windows 8 Pro so cheap (upgrade licenses jump to $119 on Feb. 1).

When Microsoft released Windows 7, the company initially offered a package to consumers that permitted upgrades to three PCs for $149.99, an offer it briefly brought back but never repeated. I took advantage of the offer even though I didn't actually upgrade all of my family PCs right away. But it turned out to be worthwhile a couple of years later.

Even if you don't have touch-enabled PCs -- and most people probably don't -- you should become familiar with Windows 8 and its new Windows Store (aka Metro) interface. It works fine with a mouse and keyboard. And while the store lacks the number of apps available in the iTunes App Store or Google Play, the numbers are increasing. Using apps on a PC is a compelling experience and portends how people will ultimately use their PCs with or without touch.

Perhaps you're worried your existing apps won't work if you install Windows 8. Certainly run the Windows 8 Migration Assistant and make sure your hardware and software are compatible. Presuming your system passes muster, I can say running existing apps through classic Windows 8 has been a charm -- pun intended. So if you want to take advantage of some of the features Windows 8 offers but don't feel like shelling out big bucks for a new touch-enabled machine, Microsoft's soon-to-expire offer is worth considering -- even if you think you might want to perform the upgrade later.

 

 

Posted by Jeffrey Schwartz on 01/25/2013 at 1:14 PM10 comments


Lotus Notes Is Still Thriving -- or Is It?

I recall covering the launch of Lotus Notes two decades ago, when many companies were using it to improve their productivity. IBM's $3.5 billion investment in Lotus Development Corp. has served it well, even as it is now seen by many as a legacy platform.

Nevertheless, I was surprised to read a report in The Wall Street Journal Monday that Notes was a $1.2 billion business as of 2011, according to IDC. IBM, which yesterday exceeded analyst forecasts with revenues of $29.3 billion for the quarter ended Dec. 31 and earnings of $6.1 billion, said its Lotus business grew 9 percent for the period.

However, company officials didn't refer to Notes in their discussion with analysts yesterday, according to a transcript of the earnings call. Rather, they referred obliquely to IBM's social business offerings such as Connections, as well as Kenexa, a cloud-based talent management and recruiting platform the company acquired last month. IBM is placing a big bet on social communications, perhaps hoping it will marginalize e-mail.

That leads me to wonder, how many enterprises are still using the Lotus Domino and Notes portfolio as their core messaging platform? Moreover do you intend to stick with it or are you looking to move to a hosted offering? If so which one? Or perhaps you're considering a move to the forthcoming Exchange Server 2013 and SharePoint 2013 tandem? Feel free to comment or drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 01/23/2013 at 1:14 PM5 comments


Does a Microsoft Investment in Dell Make Sense?

Reports surfaced yesterday that Microsoft might pump up to $3 billion into Dell, a notion that until recently was unthinkable, if not laughable. But times have changed.

As reported last week, Dell is purportedly lining up investors to take the company private, led by private equity firms TPG Capital and Silver Lake Partners. The latter firm, the influential broker that brought the likes of Skype and Yahoo to Microsoft, currently seems to have the upper hand in leading a deal, which would include Microsoft, several sources close to the negotiations told CNBC reporter David Faber yesterday.

Talking about the negotiations this morning, Faber said a deal could come together as early as this week, but he noted it could also unravel. Given that the total leveraged buyout of Dell would exceed $20 billion, according to numerous reports (though unconfirmed by either company), a $3 billion investment in Dell wouldn't give Microsoft a controlling interest in the company.

Nevertheless, the reports have indicated a Microsoft agreement to invest in Dell would come with the assumption the computer giant would make a commitment, perhaps formidable, to the Windows platform and even strengthen the distribution of Microsoft's new Surface, the company's hybrid tablet PC.

Under those circumstances, what's the downside for Microsoft, which has $66 billion in reserves, has already thrown similar amounts of cash at helping Nokia market its Windows Phones and has invested $600 million in Barnes & Noble to prop up its Nook tablet business? Microsoft even spent $8.5 billion on Skype, so what's a few extra billion to ensure the prosperity of perhaps the most strategic partner in the Windows ecosystem?

The downside risk for either company isn't trivial. First off, would the rumored investment itself be enough to achieve the outcome of solidifying Windows? If Microsoft's investment in Dell was to achieve the promise of giving it a competitive advantage, via early access to development or preferred licensing, it would drive a wedge between Redmond and its other strategic partners, notably Hewlett Packard and Lenovo, which are key providers of Windows-based PCs. One executive told The Wall Street Journal that a Microsoft investment in Dell indeed would embolden rivals to advance their support for Android. It could even annoy IBM. Though Big Blue no longer offers PCs, IBM does have a substantial Windows Server business and a Microsoft stake in Dell could drive IBM to further its bet on Linux, which is already significant.

But by deciding to make its own PCs last year with the release of the Surface RT and yesterday's announcement that it will release the Windows 8-based Surface Pro on Feb. 9, Microsoft has already signaled to OEMs that it'll do whatever it has to do to ensure the best prospects for its newly revamped desktop OS.  

So if Microsoft hitches its wagon closer with Dell will HP, Lenovo and others go deeper with Android? My guess is (and I don't have any inside knowledge on this), those companies have already made those decisions regardless of what Microsoft does. At the same time, if Microsoft can ensure strong demand for Windows, it should keep those partners from drastically reducing their commitment or walking away from it.

But even if all its partners were to further empower Google, Microsoft may be betting that an investment in Dell could give it what it needs to ensure it's a dominant supplier of Windows-based PCs, tablets and potentially phones, giving it the development and distribution might that has served Apple well.

Of course this influence would go beyond the device level, which isn't what's going to help Dell prosper in the long run, as the company and others determined long ago. As Dell continues its push into the datacenter and cloud, Microsoft's influence could have equally untold implications. For example, Microsoft could get affordable access to technology to continue its build-out of Windows Azure, while helping it stave off, at least to some extent, the threat of alternative cloud platforms such as Amazon's EC2, OpenStack and VMware's vCloud, among others.

Some might argue Microsoft should acquire Dell outright, and it could do so for much less than the $44.6 billion it almost spent to acquire Yahoo five years ago. But as its ultimate search deal with Yahoo years later showed, Microsoft can get what it needs while taking far less risk, and nothing would preclude a further investment or complete buyout down the road, should that come to make sense.

If anything, Dell taking on too much equity and influence from Microsoft could jeopardize the computer giant's own long-term well-being, should it be pulled away to any extent from other platforms including many popular open source initiatives. It remains to be seen whether Dell would diminish its support for other platforms but I don't think such a shift is likely -- it would cost Microsoft more than $3 billion to make that happen.

Do you think a Microsoft investment in Dell would be good for the future of the Windows platform or would the collateral damage be too significant for your comfort level? Please share your comments below or drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 01/23/2013 at 1:14 PM2 comments


Does HP Want Autonomy?

Hewlett Packard this week denied reports that it's looking to sell its troubled Autonomy unit and the EDS services business it acquired nearly five years ago for $14 billion.

It's the latest scuttlebutt surrounding a company that continues to appear lost and can't seem to escape bad news, such as yesterday's confirmed report that the architect of its public cloud effort, Zorawar "Biri" Singh, senior vice president and general manager, has left the company. Singh's departure was first reported by All Things D. See more on that in my Cloud Report blog.

While the EDS business may have issues at the moment, and HP last year took an $8.9 billion write-down on that deal, services are a key driver of business for rival IBM and it's hard to envision HP selling that unit off at this time.

Then there's Autonomy, which HP famously acquired for the astronomical -- actually, it's fair to say "ludicrous" -- price of $10.3 billion. I was among the skeptics at the time. Its approximately $1 billion in revenues never justified the valuation, a conclusion Oracle reportedly reached many months before HP took the bait.

CEO Meg Whitman ultimately came around to agree that HP not only overpaid for the company but was duped by Autonomy's founders, accusing them of "serious accounting improprieties" and "a willful effort by Autonomy to mislead shareholders," and taking a massive $8.8 billion write-off. Founder Mike Lynch, whom HP fired last May, adamantly denied the accusations, set up a blog to defend his position and accused HP of backtracking.

Former CEO Leo Apotheker pushed for the deal but was unceremoniously dismissed before it closed. Whitman decided to go ahead with the purchase. After a whistleblower apparently came forward, HP investigated the matter and Whitman went on CNBC making no bones about the fact the company felt it was the victim of massive fraud. The company has referred the matter to the SEC and the U.K. Serious Fraud Office (Autonomy was a British company).

An interesting Reuters report said HP's Apotheker and then-chief strategy officer Shane Robison, whom Whitman later dismissed, blindly went into the Autonomy deal out of desperation. Apotheker defended the deal in a statement to Bloomberg.

While all of this went down in the closing weeks of 2012, it resurfaced this week when reports came out that several Silicon Valley companies were interested in acquiring EDS and Autonomy. Whitman's denial certainly could be posturing for a better deal, but many believe Autonomy is of value to HP despite the outlandish price it paid for it.

One such believer is IDC chief research officer Crawford Del Prete. "When you talk to customers, what you find is that if you have an unstructured data problem, if you have a problem around syntax, if you have a problem around search, Autonomy has got some really useful technology," he told CNBC. "Autonomy can really help in a world where you have a mix of structured and unstructured data."

HP also has Vertica, a company it acquired in 2011 to help build data warehouses to discover structured data. The move to manage Big Data is a hot agenda item and an area not lost on other key IT players, including IBM, Oracle, EMC, Google and Microsoft.

Would you like to see HP keep and emphasize Autonomy, or do you envision building your data management and e-discovery efforts around other platforms such as the SharePoint-SQL Server tandem (along with Exchange and Windows Server, of course)? See what SharePoint 2013 will offer in that regard.

What's your take? Drop me a line at [email protected].

 

Posted by Jeffrey Schwartz on 01/18/2013 at 1:14 PM0 comments


New Dynamic Access Control Tips Posted

The cover story of this month's Redmond magazine looks at a key feature in Windows Server 2012 called Dynamic Access Control, designed to improve file server authorization and authentication by reducing Active Directory groups.

Microsoft has described DAC as one of the most important new features in Windows Server 2012. Of course there are many other key new capabilities in the new server OS, as Redmond contributing editor Brien Posey reviewed last summer.

Today MSDN kicks off the first of a four-part series on DAC. Authored by the U.K. Solutions Development Team of Microsoft Consulting Services, the blog post explains how to get started, with images that show how to set up and manage permissions.

DAC is important because it allows IT to secure files, folders and other resources without having to manage groups. Just as the Redmond cover story pointed out, the MSDN post explains: "The main idea here is that a user's access rules are based upon Claims from their Active Directory properties. This makes it much easier to manage which users can and which users cannot access a specific resource."
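To make the claims-based idea concrete, here is a minimal, illustrative sketch in Python -- not Microsoft's actual Windows Server implementation or its PowerShell cmdlets -- of how a conditional access rule can evaluate a user's directory attributes against a resource's classification instead of checking group membership. The attribute names and rules below are hypothetical examples.

# Illustrative sketch only: claims-based access evaluation in the spirit of DAC.
# Attribute names, rules and values are hypothetical, not Microsoft's schema.

def evaluate_access(user_claims: dict, resource_properties: dict, requested: str) -> bool:
    """Grant the requested permission only when the user's claims satisfy a rule's
    condition against the resource's properties, rather than checking group membership."""
    rules = [
        # e.g. allow Read/Write when the user's Department claim matches the
        # resource's Department classification and the user is a full-time employee
        {
            "permissions": {"read", "write"},
            "condition": lambda u, r: u.get("department") == r.get("department")
            and u.get("employee_type") == "FTE",
        },
        # e.g. allow Read for anyone whose Country claim matches the resource
        {
            "permissions": {"read"},
            "condition": lambda u, r: u.get("country") == r.get("country"),
        },
    ]
    return any(
        requested in rule["permissions"] and rule["condition"](user_claims, resource_properties)
        for rule in rules
    )

if __name__ == "__main__":
    doc = {"department": "Finance", "country": "US"}
    print(evaluate_access({"department": "Finance", "employee_type": "FTE", "country": "US"}, doc, "write"))  # True
    print(evaluate_access({"department": "HR", "country": "US"}, doc, "write"))  # False: department mismatch

The point of the sketch is the shape of the decision: permissions follow from who the user is, as described by claims, rather than from which groups an administrator remembered to populate.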

If you want to learn more about DAC, expert Mark Minasi will be giving a talk on it at the TechMentor conference, produced by Redmond magazine publisher 1105 Media, in Orlando in March.

Are you using, or planning to implement, DAC in your organization? Please share your stories or concerns by dropping me a line at [email protected].

 

Posted by Jeffrey Schwartz on 01/18/2013 at 1:14 PM0 comments


Is Dell on the Market?

While Dell hasn't publicly confirmed that it's considering leveraged buyout proposals from private equity firms, Wall Street continues to buzz at the prospect that the computer giant is indeed on the market and a deal could surface in the not-too-distant future.

Two bidders apparently in the mix include TPG Capital and Silver Lake, with Bank of America Merrill Lynch, Barclays, Credit Suisse and the Royal Bank of Canada in the pipeline to provide financing. First reported by Bloomberg on Monday, CEO Michael Dell is not surprisingly a key player in any such deal -- he holds an estimated 16 percent stake valued at about $3.6 billion in the company he founded in his dorm room back in 1984.

The company's stock rose 21 percent as of the close of business yesterday before tapering off more than 4 percent this afternoon. Once valued at more than $100 billion during the dotcom boom, a deal today would be valued at approximately $20 billion, ironically in the ballpark of the amount Hewlett Packard paid for Compaq a decade ago.

Both Dell and HP continue to struggle in the PC market as Asian rivals gain momentum -- most notably Lenovo, which gained prominence after acquiring IBM's PC business in 2004, but also by Acer and Asus.

Like HP, Dell is looking to further extend its footprint into the enterprise, as evidenced by numerous acquisitions in recent years such as SonicWall, EqualLogic, Compellent, Perot Systems, KACE Networks, Ocarina Networks, Force 10, Boomi, Wyse, AppAssure and most recently its push into enterprise software capped by the recent $2.4 billion acquisition of Quest Software.

The question is, would a privately held Dell have the resources to make the acquisitions the company needs in order to continue its push to build upon its enterprise hardware, software and services portfolio? Is a spinoff or outright sale of its PC business in the cards?

What's your view on the potential impact of Dell going private? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 01/16/2013 at 1:14 PM5 comments


Microsoft's Role in Facebook's New Graph Search

In his opening monologue last night, Jimmy Kimmel took note of Facebook's announcement yesterday that it intends to use the mounds of data it has collected from its 1 billion users to enable a new form of search.

The new Graph Search, Kimmel pointed out, provides customized search results by incorporating data from your network of Facebook friends. "So you can ask questions like 'who are my friends that live in San Francisco?' Which by the way if you have to ask that, you don't have any friends in San Francisco," Kimmel quipped. "It's an interesting new feature. Soon you'll be able to find anything you want on Facebook, except for the thousands of hours of your life you lost going on Facebook."

All kidding aside, Facebook is hoping that Graph Search will steer users away from the widely used Google search engine. And apparently acknowledging there's life outside of Facebook, it will continue to use Microsoft's Bing to find information on the Web, which it has done since 2010, as noted by Redmond columnist and All About Microsoft blogger Mary Jo Foley.

Could this Facebook integration also bolster her theory about why Microsoft is in it for the long haul with Bing? Apparently Microsoft, which was an early investor in Facebook, worked closely with the company to ensure Bing's role in extending Graph Search. In a blog post yesterday, Derrick Connell, Microsoft's corporate vice president of Bing, explained how the two work together.

"Now when you do a Web search on Facebook, the new search results page features a two-column layout with Bing-powered Web results appearing on the left-hand side overlaid with social information from Facebook including how many people like a given result," Connell explained. "On the right hand side, you will see content from Facebook Pages and apps that are related to your search."

Naturally both companies have a common goal in taking away search share from Google.

Could adding more Facebook features to Bing be in the works? Connell said, stay tuned. "Over the next several weeks our two teams will continue to experiment and innovate towards our shared vision of giving people access to the wisdom of their friends combined with the information available on the Web."


Posted by Jeffrey Schwartz on 01/16/2013 at 1:14 PM0 comments


Cisco Releases New UC Tool

Cisco Systems this week took the wraps off a unified communications platform that combines chat, presence and voice messaging onto PCs, tablets and smartphones.

Cisco Jabber is the culmination of the company's 2008 acquisition of Jabber Inc. Cisco said Jabber works with or is set to support Windows, iPhone, iPad, Nokia, Android and the BlackBerry platforms. Support for the Mac is slated for this summer.

It also works with Cisco's own video endpoints, including Cisco Unified IP Phones, Cisco WebEx MeetingCenter and Cisco TelePresence.

The company launched the new product at the Enterprise Connect trade show, taking place this week in Orlando, FL. The release of Cisco Jabber sets the stage for a battle with Microsoft, which late last year released Lync Communications Server. Cisco, like Microsoft, is looking to convince customers to move away from traditional PBXes.

While Cisco last week stepped away from a different battle with Microsoft, deciding not to go to market with its Cisco Mail offering, the battle in the UC market between the two companies is alive and well. Don't expect either company to back away.

Based on the standard Extensible Messaging and Presence Protocol (XMPP), Cisco Jabber is interoperable with a variety of presence and instant messaging platforms, according to Cisco. That allows for chat among individuals using other IM platforms from Google, IBM, Microsoft and AOL.
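For readers unfamiliar with XMPP, the short sketch below (Python standard library only, with hypothetical addresses) shows the kind of standards-based message stanza that makes this cross-vendor chat possible: any compliant client or gateway can produce and parse the same XML.

# Illustrative sketch only: a minimal XMPP <message> stanza built with Python's
# standard library. The sender and recipient addresses are hypothetical examples.
import xml.etree.ElementTree as ET

def build_chat_stanza(sender: str, recipient: str, text: str) -> str:
    """Return an XMPP chat message stanza as a string."""
    msg = ET.Element("message", attrib={
        "from": sender,       # JID of the sending user
        "to": recipient,      # JID of the receiving user, possibly on another vendor's service
        "type": "chat",       # one-to-one chat message
    })
    body = ET.SubElement(msg, "body")
    body.text = text
    return ET.tostring(msg, encoding="unicode")

if __name__ == "__main__":
    # Because the payload follows the open standard, a different vendor's client
    # or an interoperability gateway can parse this same stanza.
    print(build_chat_stanza("alice@example-uc.com", "bob@partner-im.example", "Standards-based chat"))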

Cisco also said Jabber can integrate with Microsoft Office, enabling individuals to set up calls or chats within Office.

On the video side, Cisco Jabber supports the H.264 standard and provides high-definition resolution. It allows for multiple parties to participate in a videoconference. However, the latter capabilities aren't set to be available until the second half of the year.

Posted by Jeffrey Schwartz on 03/02/2011 at 1:14 PM0 comments


Can Windows Phone 7 Succeed?

After attending Microsoft's launch of Windows Phone 7 Monday in New York, I walked away feeling that Microsoft has put its best foot forward in attempting to regain share in the hypercompetitive mobile phone market (see Microsoft Launches Windows Phone 7). The defining question: will Windows Phone 7, despite its positive attributes, get lost in the crowd that is clearly dominated these days by Google's Android, Apple's iPhone and Research in Motion's BlackBerry?

"I've never seen anything like it," said Forrester Research analyst Jeffrey Hammond, who was at Monday's launch event. We were talking about the rapid ascent of the Android mobile platform, which had virtually no share a year ago, and now has emerged as the fastest selling smartphone OS, according to data released by Nielsen last week.

With Verizon Wireless reportedly set to start selling the iPhone early next year and the BlackBerry platform holding its own, where does that leave even a respectable Windows Phone 7? Hammond pointed out it is not game-over for Microsoft.

"Right now we see 23 to 25 percent of phones out there in the U.S. that are smart phones, so there's still another 75 percent of the market to convert over," Hammond said. "If they aggressively price Windows phones so they are logical replacements for quick messaging devices and they offer lower cost data plans, they can grow in the market without having to take Android devices out of users' hands."

Plus every two years a good number of users swap out their phones, suggesting the long-term outlook for all platforms could shift. Key to whether Windows Phone 7 will be a viable platform moving forward is whether the .NET developer community mobilizes, so to speak.

IDC analyst Al Hilwa believes they will. "They have a ready base of developers that haven't been very much engaged so far with either Android or Apple that will bring that whole base of developers on board," Hilwa said.

Hammond agrees: "I think there is general interest from the core Microsoft developer network out there and there's a lot of them," he said. There are two things that have to happen, he added. "They've got to get units in market and they've got to make sure they make the on-boarding process as easy and as inexpensive as possible."

Despite a slick batch of devices that will come at launch, this is very much a consumer play. Microsoft still has to evolve Windows Phone 7 into an enterprise-grade platform, adding the ability to remotely manage the devices and embed improved security. Still this is a market where consumers decide first and if a platform succeeds, enterprises will then decide whether or not to support it. So from that perspective, Microsoft's strategy makes sense.

At the same time, though, Microsoft will have to win over consumers who want to use their devices to help them in their jobs. From that standpoint, the SharePoint support and e-mail integration are good first steps.

Microsoft CEO Steve Ballmer summed up Windows Phone 7 as "a different kind of phone." It is from the perspective that it is focused more on what the user wants to do with their phones, Hammond pointed out. "They are focusing on what it allows you to do, as opposed to what it does," he said.

If you’re a Microsoft partner or enterprise developer, I'd like to hear your take on whether Windows Phone 7 still has an opportunity over the long haul. Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 10/13/2010 at 1:14 PM4 comments


Microsoft Extends HPC's Limits

Microsoft is looking to up the ante with its Windows Server HPC platform. The company released its third iteration yesterday and signaled it would like to see broader use of its high-performance computing platform.

"Think of this as one of the key shifts in our fleet for what we look at as this future of technical computing," said Bill Hilf, Microsoft's general manager for technical computing. Hilf made his remarks in his keynote address at the High Performance Computing Financial Markets conference in New York.

"It's a pretty wide initiative where we want to take the technical computing technologies and really make those available and accessible to a broader number of IT end users and developers," explained Bill Hamilton, Microsoft's director of technical computing.

In an interview right after his keynote address, Hilf said he believes HPC applications will drive usage of Windows Azure, thanks to Windows HPC Server's newfound support for Microsoft's cloud service.

"I believe the technical computing workload will be the killer Azure app because the nature of these workloads consume a ton of computer," he said. "We believe having an infrastructure with hundreds of thousands of servers is going to be very compelling."

Some notable facts about Windows HPC Server 2008 R2:

  • It will support 1,000 nodes, up from 256.
  • Excel users will be able to perform computations much faster.
  • Organizations can tap idle Windows 7 desktops in their clusters.

If you're among those that are testing it out, or intend to do so, share your thoughts with me at [email protected].

 

Posted by Jeffrey Schwartz on 09/21/2010 at 1:14 PM0 comments


Microsoft Skipping Verizon for Round 1 of Windows Phone 7

Reports that Windows Phone 7 initially won't be available at launch on the Verizon Wireless network are hardly a surprise, given that all test units were assigned to AT&T. But now comes word that it might be a while before Verizon Wireless users can get their hands on Windows Phone 7 devices.

That's because, according to News.com's Ina Fried, Microsoft will need to create an upgrade to Windows Phone 7 to support CDMA networks. Both Verizon Wireless and Sprint's networks are CDMA-based while AT&T's and T-Mobile's networks are GSM-based.

How extensive that upgrade will be and when it will arrive is unclear, reports All About Microsoft's Mary Jo Foley, but it appears Microsoft is targeting the first half of 2011.

If anything, Microsoft should have aimed to release WP7 on CDMA networks first, and then worry about GSM support.  By not offering support for CDMA from the outset, it only gives Google's Android further opportunity to become entrenched on Verizon's network.

That might not be so terrible if Verizon didn't have the largest percentage of subscribers in the U.S. with a share of over 31 percent, according to comScore. Perhaps Verizon wanted no part of Microsoft after the Kin debacle? Or maybe Microsoft feels it's getting even with Verizon for heavily pushing Android phones.

Whatever the case, Microsoft should have endeavored with more priority to deliver Windows Phone 7 on Verizon's network at launch. If Microsoft wants to have any chance of slowing the momentum of Android, its phones need to be on Verizon's network and prominent among the carrier's dealers.  It wouldn't hurt to have them available with Sprint as well.

What's your take? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 09/17/2010 at 1:14 PM1 comments


Microsoft Downplays Windows 1.0

Windows 1.0 got off to its auspicious start on Thursday Nov. 10, 1983, at the Plaza Hotel in New York City. Invitations to the launch were sent to the press in a box with a squeegee. The header read: "For a clear view of what's new in microcomputer software please join Microsoft and 18 microcomputer manufacturers for a press conference…"

But, like many versions of Windows that would follow it, the first release didn't ship until two years after that fateful press conference, leading many to refer to it as "vaporware." Finally, Microsoft released Windows 1.0 in November 1985 at Comdex.

When it comes to Windows 1.0, Microsoft prefers not to look that far back and has no apparent plans to celebrate its pending 25th anniversary. When we reached out to Microsoft to talk about its first rendition of Windows, the company declined to make anyone who was there at the time available. "We are focusing our anniversary efforts on the Windows 7 first birthday so unfortunately we won't be able to provide a briefing from someone [from] the Windows 1.0 days," a spokeswoman for Microsoft said in declining our request.

Of course, that anniversary is coming next month on Oct. 22. But we want to hear your recollections of Windows 1.0. Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 09/10/2010 at 1:14 PM3 comments


Apple iPads and iPhones to Get Flash?

Apple's decision to loosen the guidelines of its App Store means you could soon be seeing Flash-based content on iPhones and iPads.

But that remains to be seen. While the company's move will allow developers to use third party tools including those used to create Adobe Flash code, that doesn't mean iPads, iPhones and iPod Touch devices will be able to run Web-based Flash content. Still, it suggests that may be in the cards at some point.

What happened to Apple CEO Steve Jobs' insistence that Adobe Flash was too buggy and would be a drag on power on the devices? Will he stick to that stance?

"We have listened to our developers and taken much of their feedback to heart," Apple said in a statement released yesterday. The news drove Adobe's shares up 12 percent yesterday. The unusual about-face by Apple was likely the result of the Federal Trade Commission's June inquiry and success of Google's Android platform, according to The Wall Street Journal. IDC this week said Google's Android and Microsoft's Windows Phone 7 platform will steal share from Apple in the coming quarters.

It remains to be seen whether Apple's relaxed guidelines for developers will increase the appeal of iPhones, iPads and iPods and put a wrench into challenges by Google, Hewlett-Packard, Microsoft, Research In Motion and others.

What's your take on Apple's unexpected move? Drop me a line at [email protected].

Note: Updated September 13th. 

Posted by Jeffrey Schwartz on 09/10/2010 at 1:14 PM3 comments


Who Will Replace Elop?

The news that Stephen Elop is leaving Microsoft is hardly a surprise -- Elop was believed to have been coveting a CEO job for a long time and now he has one.

Elop will take the reins of Nokia Sept. 21, leaving yet another void in the executive ranks at Microsoft. In addition to looking to fill the hole left by the departure of Robbie Bach, who headed Redmond's Entertainment and Devices business, now CEO Steve Ballmer will oversee the Microsoft Business Division until he names a successor.

In an e-mail to employees announcing Elop's departure, Ballmer pointed to MBD's bench, including Chris Capossela, Kurt DelBene, Amy Hood and Kirill Tatarinov, who will report to him until a successor is named.

It's been a few months though since Bach was ousted. Will Ballmer take as long to replace Elop? Or might he use the change to re-organize the different business groups at Microsoft?

What's your prediction? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 09/10/2010 at 1:14 PM0 comments


Will MPN Drive Consolidation?

It's hard to believe it's already September. While that means back to school for many, it also means there's less than one month until the official launch of the long-awaited new Microsoft Partner Network.

Some may dispute whether it's long-awaited, I realize. After all, for some smaller partners, the new certification requirements could mean their once-coveted Gold Certified status will be no longer attainable.

But others are eagerly anticipating the change. One such firm is Tallan, a Microsoft National Systems Integrator (NSI) based in Hartford, Conn. As I reported today, Tallan has acquired twentysix New York, a move it said positions it for the pending changes.

"It wasn't really a concern and not a factor but now that they've announced it and gotten the rules out, it makes me feel easier we will be able to satisfy that we will be able to stay in the top tier of the program," says Craig Branning, Tallan's CEO. In fact, he welcomes the fact that it will be harder to achieve Gold status.

"I like it. It gives a company like ours some advantages because we can set ourselves apart," he says. "It was a little bit too easy to qualify for gold status for some of the smaller players."

How will MPN shape your business? Is it giving you the urge to merge? Are you looking at other partnering opportunities? Please share with us the good, the bad and the ugly as some did at the recent WPC 10 conference in Washington, D.C. You can drop me a line at [email protected].

Posted by Jeffrey Schwartz on 09/02/2010 at 1:14 PM0 comments


Should HP Be Split Up?

There's no shortage of opinions out there as to what should happen now that Hewlett-Packard has ousted CEO Mark Hurd.

Oracle CEO Larry Ellison blasted HP's board for taking what he felt was a minor infraction and cutting Hurd loose as a result. "The HP board just made the worst personnel decision since the idiots on the Apple board fired Steve Jobs many years ago," Ellison wrote to The New York Times. "That decision nearly destroyed Apple and would have if Steve hadn't come back and saved them."

Meanwhile, Kevin Kelleher revives an idea that was frequently broached at the end of the Carly Fiorina era: break up the company.  Kelleher made his proposal in DailyFinance:

"If the board were really honest with itself about the best strategy, it would use Hurd's departure as an occasion to break the company up into three smaller pieces -- a computer hardware maker, an IT services firm and a printer company. Find a solid CEO for each, and let them focus on what they do best."

While that may be extreme, simply staying the course could be a tough act to pull off. True, Hurd grew revenues from $87.6 billion in 2005 to a projected $125 billion this year, but that growth came primarily through mega-acquisitions.

The cost of doing so has meant the company has taken on huge debt. DailyFinance's Peter Cohan points out HP's debt has mushroomed from $3.4 billion to $14 billion on Hurd's watch, while its cash position has slipped from $13.9 billion to $13.3 billion.

Hurd's cost cutting, while improving operations, has had a price. According to yesterday's Wall Street Journal:

"Caris & Co. analyst Robert Cihra estimates that HP's PC business has effectively driven all revenue growth since Mr. Hurd took over. But operating margins in that business are just 5 percent. And in the last two quarters, HP has been losing share to Asian rivals Lenovo Group, Acer, Toshiba and Asustek Computer at a faster clip.

HP's market-leading printing business generates solid 17 percent margins, but growth is a sluggish 3 percent, estimates Mr. Cihra. The same is true of its services business, which became a big part of HP's business with the acquisition of Electronic Data Systems: 16 percent margins but just 2 percent growth."

Meanwhile, Reuters points out that Hurd reduced research and development spending by 20 percent, leaving HP spending only 2.5 percent of sales on R&D. Rivals Apple, Cisco, Dell and IBM averaged more than 6 percent, Reuters notes, meaning HP will need to increase its R&D spending if it wants to keep pace.

What's next for HP is anyone's guess but the status quo likely won't do. Does the board need to do something as drastic as splitting up the company? What's your recommendation? Feel free to comment below or drop me a line at [email protected].

Posted by Jeffrey Schwartz on 08/11/2010 at 1:14 PM3 comments


Microsoft Targets Mission-Critical Systems

Microsoft yesterday launched what it calls Premier Mission Critical Support Service, which, as the name implies, is intended to help users architect and maintain apps and systems that require constant availability.

These long-term services are for customers willing to invest hundreds of thousands of dollars in architectural reviews, implementation and ongoing monitoring services, as reported by my colleague Kurt Mackie.

I spoke with Norm Judah, CTO of Microsoft Consulting Services, to try to get a better understanding of where partners would fit into this service. When asked if partners would be certified to potentially deliver these architectural reviews, he sounded doubtful -- and certainly not in the short term.

"Probably not, at least not initially, because the solutions engineer, the guy we are going to keep on site is generally deeply involved in doing that work to have that context," Judah explained. "There's an opportunity for us, when we have much more knowledge about the system to really investigate what a sell-through would look like, but I don't think we are ready for that. In many cases, the customers are asking for us to do this because they want Microsoft's skin in the game."

That said, Judah does see partners delivering some of the remediation services. "Remediation might be something simple such as tune a database but it could be something very complex, like you need to re-architect the middle tier of your commerce application. Those are all partner opportunities, driven by the customer. That is really the customer's choice who they want to do that with. If they have an existing partner that they work with, maybe it's the guy who wrote the application, no problem in doing that and we would work very closely with them taking the output of their remediation."

What's your take on this new service? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 08/11/2010 at 1:14 PM0 comments


HP's Hurd Had To Go

A top executive charged with sexual harassment or violating company policies is hardly a rarity in today's business world. Yet, the news Friday that the laser-focused Hewlett-Packard CEO Mark Hurd was effectively dismissed for such allegations has rocked Silicon Valley.

As the story played out over the weekend, and no doubt will continue to do in the coming days and weeks, some argue HP's board should have looked the other way, given the fact that the company's market cap has doubled and the company has substantially and consistently grown revenues and profits since Hurd took the reins five years ago.

It's bizarre that the woman at the center of the scandal, marketing consultant and "actress" Jodie Fisher, charged Hurd with sexual harassment and hired celebrity attorney Gloria Allred to represent her, only to then release a statement expressing regret.

"Mark and I never had an affair or intimate sexual relationship," she said in a statement released Sunday. Fisher went on to say that she has reached a private settlement with Hurd, who also denied any intimate relationship. "I was surprised and saddened that Mark Hurd lost his job over this," Fisher said. "That was never my intention."

Are you buying this from either party? Even if all that is true, would HP have let go a CEO who turned the company around merely for fudging expense reports or having a few ill-advised dinners? Perhaps it would have, given the company's strong focus on requiring employees to uphold ethical standards. HP reportedly has let go numerous employees who violated its rules of conduct, and it would have been hypocritical to look the other way when Hurd did so. In the end, Hurd no longer had the confidence of HP's board. While Wall Street loved him, employees and many in Silicon Valley detested him, according to analysts.

Suffice it to say, it appears there's much more to this story. But the bigger question is: what's next for HP? To those who argue that Hurd was to HP what Steve Jobs is to Apple, let's not forget that Apple did just fine during Jobs' six-month leave of absence last year. Hurd's departure is an opportunity for the company to bring innovation forward and strengthen its partnerships.

Internal candidates to replace Hurd include Todd Bradley, who turned around the company's once-struggling PC business. Ann Livermore, who runs HP's huge services business, is another oft-mentioned possibility. Outside candidates include two Softies: Microsoft COO Kevin Turner and Stephen Elop, president of the company's Business Division.

Others include EMC CEO Joe Tucci, IBM executive Michael Daniels, and former Compaq CEO Michael Capellas, The Wall Street Journal reports. Netscape founder and Silicon Valley VC Marc Andreessen is leading the search committee. (Could he be a candidate too?)

HP's board should be looking for a CEO who can execute as well financially as Hurd did -- no easy task -- but also someone who will bring technical and product leadership that companies like Apple, Google and (yes, even) Microsoft are demonstrating these days.

If not, those companies, along with the likes of Acer, Cisco, EMC and IBM, just to name a few, will further erode HP's effort to lead in markets that range from tablets to PCs to communications devices to enterprise infrastructure.

The new CEO will also have to look at HP's partnerships, including the one Hurd extended with Microsoft back in January to develop and bring to market advanced data center technology. While there has been some progress, that partnership looks a little cold these days.

For example, HP phoned it in when it came to endorsing Microsoft's Windows Azure appliance. At Microsoft's Worldwide Partner Conference last month, Dell and Fujitsu had executives front and center to talk up their plans to deliver an Azure appliance. At WPC, HP was nowhere to be found, though Scott Farrand, vice president for the company's Industry Standard Servers and Software business, told me not to read anything into that.

"It was a simple matter of logistics for trade shows and availability of key executives for that kind of time line," said Farrand, who also gave telephone briefings at WPC. "There was no specific message in there that's of significance relative to HP's attitude here. We've had a longstanding partnership with Microsoft."

Also, while HP is now planning to deliver a Windows-based tablet PC, that was not a sure thing following its April announcement that it was acquiring Palm for $1.2 billion. In the months that followed, HP went dark regarding the future of the Windows-based Slate PC that Microsoft CEO Steve Ballmer highlighted earlier this year. It appeared DOA only for the company to recently disclose it will deliver the Windows-based Slate targeted at high-end enterprise users.

HP needs a leader who will not just give lip service to advancing its wares and its partnerships. It's not only important to the future of HP but to the ecosystem that surrounds the company.

What's your take? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 08/09/2010 at 1:14 PM1 comments


IBM vs. Microsoft: The Big Iron Battle

It appears either Microsoft has mainframe-envy or IBM is not too happy about Microsoft's data center ambitions of late. Most likely, it's a combination of both.

Consider the following:

  • Microsoft recently said it is going to offer portable data centers based on its cloud-based Windows Azure platform.
  • IBM just launched a new mainframe that is the first to support virtualization of x86-based blades with Linux, not Windows, as the preferred platform. Big Blue appears cold to the idea of the blade extensions supporting Windows.
  • The European Union last week launched an investigation of IBM's mainframe business. IBM's response: Microsoft and its minions are behind the investigation.

So it was a hot July for two companies who really have bigger fish to fry than each other. It is clear IBM is not happy with Microsoft these days, based on Big Blue's response to the EU investigation:

The accusations made against IBM by TurboHercules and T3 are being driven by some of IBM's largest competitors -- led by Microsoft -- who want to further cement the dominance of Wintel servers by attempting to mimic aspects of IBM mainframes without making the substantial investments IBM has made and continues to make. In doing so, they are violating IBM's intellectual property rights.

T3 and TurboHercules offer mainframe software that competes with IBM's own offerings.

As for the newly launched mainframe, the zEnterprise is the most important new piece of big iron launched by IBM in more than a decade because it is the first to provide integration with its Power7 blade infrastructure and x86-based blade racks running Linux.

Where it breaks new ground is its common virtualized platform capable of running 100,000 VMs simultaneously, while providing a turnkey data center that shares network, storage and power components.

Providing the capability for the mainframe to assign workloads to x86-based blades is the system's Unified Resource Manager, made up of software and embedded hardware, said IBM Distinguished Engineer Donna Dillenberger, in an interview at the launch event.

Her colleague, David Gelardi, IBM's vice president of sales, support and education, told me one could opt to run Windows workloads rather than Linux on the x86 BladeCenter Extensions, dubbed zBX.

"There's no reason you can't use it to run Windows, because Tivoli's provisioning capabilities is operating systems agnostic," he said. "Windows would run on an outboard blade and ultimately would run on an xBlade inside zBX."  

But at the same time, Steve Mills, senior vice president of IBM's software and hardware businesses, was in another room with analysts playing down that notion. When asked if the blades would run Windows, Mills reportedly said that because they are x86-based, Windows could run on them, "but the problem was essentially, to IBM, Windows was too much of a black box to be able to do what they wanted to do with it," recalled RedMonk analyst Michael Cote, in an interview.

"I don't think IBM is especially interested in managing Windows on the zEnterprise," Cote said. "Technologically it wouldn't work out, and they probably are unwilling to do whatever it would take to make Microsoft help them out with it. But I think in the wider context of things, IBM's not really out to help Microsoft out really."

Analyst Joe Clabby of Clabby Analytics, in an interview, explained it would require Microsoft to support IBM's virtualization technology and make tweaks to its own Hyper-V. "Hyper-V is nowhere near IBM from a virtualization and provisioning perspective," Clabby said. "If I were IBM I'd say get that stuff out of the way, use this approach and then you can integrate with our mainframes better, but I don't think that will be received well by Microsoft."

So both Cote and Clabby are in agreement that we shouldn't anticipate the new zEnterprise running Windows workloads -- at least not with the cooperation of Microsoft and IBM -- any time soon.

As for the EU investigation: "I don't know if IBM's allegations that Microsoft is behind it are true, but in this day and age, part of the way you compete is to try to help government agencies do antitrust stuff," Cote said. "Whether it's a good way of competing, it seems like one front in a war of competing."

Is Microsoft really a threat to IBM these days? Cote says certainly more so than it was in the past. "Microsoft wants to expand into the enterprise area," he said. "If you look at the numbers on Windows server usage, it's everywhere. That's a chunk of revenue that IBM is missing out on."

It's not just the data center where IBM is taking on Microsoft. IBM is also taking its best shot at breaking into the desktop with its new CloudBurst offerings and its free Microsoft Office alternative, Lotus Symphony.

While the two companies both compete and partner in many areas, it does appear that the rivalry between them is picking up. "I don't think IBM as a culture has ever forgiven itself for creating Microsoft with DOS licensing and everything, but I see a bit more viciousness when it comes to IBMers talking about Microsoft people these days," Cote said.

All that said, he points out that IBM's true nemesis is Oracle. And Microsoft has made it clear that its two biggest enemies are Apple and Google, with VMware and Oracle clearly in its path, as well. The tensions between IBM and Microsoft "are definitely more active but they're not at each other's throats," Cote said.

Clearly it will be interesting to see the two go head to head in the market for so-called private clouds "in a box." What's your take? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 08/03/2010 at 1:14 PM0 comments


MPN: Going for the Gold

Anyone who was hoping that Microsoft's partner organization would put the brakes on its plans to require unique certifications was disappointed last week.

"It's full speed ahead," said Julie Bennani, general manager for partner programs at Microsoft, in an interview during last week's Worldwide Partner Conference in Washington, D.C. "We are still going on with those requirements and landing those in October."

While she and Microsoft's new partner chief, Jonathan Roskill, signaled they were willing to consider alternatives at a later date, as reported, the plan appears locked in to move forward with the Oct. 1 date for transitioning to the new Microsoft Partner Network (MPN). "If we said, 'If you could get Gold in three of the five in Core IO, we could give you a Core IO competency.' That's one thing that's interesting to think about," Roskill said.

Microsoft last week did say it's renaming the competency and advanced competency designations Silver and Gold, respectively, but many appeared to regard a Silver designation as a booby prize.

At WPC last week, I sat in on a session called MPN: The Good, The Bad, and the Ugly. It was moderated by Mo Edjlali, who, until recently, served five years as president of the International Association of Microsoft Channel Partners' Washington, D.C. chapter.

Now a management consultant, Edjlali looked at the situation from both points of view. "I think Microsoft wants people to have focus, that makes sense," Edjlali said. Indeed there are many partners who are Gold who most would agree don't deserve that designation today.

"But I think the trouble is some products are so closely related that you can't say your good in BI and not in SQL Server, it confuses customers when they might feel that expertise isn't there when it's always been there," he added. "People say 'my staff hasn't changed but now we are going to come across like we're not as sharp as we used to and these big companies that have the manpower will have the credentials but not have the skills or actually put the billable people on project with those skills.'"

That is the center of the fear. The large integrators can afford to have Gold certified engineers across the board, but the small- and mid-sized firms can't. So they will have to decide which disciplines they want to be Gold certified in and accept Silver for the rest.

Perhaps a better idea would have been to introduce a Platinum tier, Edjlali said. "It's going to be difficult for people to go from Gold to Silver," he told me in an interview after the session.

Janice Crosswell of Microsoft Canada's Corporate Assurance Group, who was sitting in on the session, tried to spin the situation. "Silver is better than [the current] Gold," Crosswell said to the group. "When you're talking about some of the math, and I am saying 'I am just Silver,' you actually rank higher than the [current] Gold. There are some more requirements."

That didn't go over too well. "Customers are never going to know that Silver is now better than Gold used to be," a partner in the session replied. "They see Gold and that's what they see."

Posted by Jeffrey Schwartz on 07/12/2010 at 1:14 PM0 comments


Compromise in the Works on MPN Competencies?

There is growing buzz that Microsoft will come up with some compromise over the certification requirements that some partners fear will put them out of business. But it is not clear to what extent.

Details of any changes to the new Microsoft Partner Network (MPN) are expected to be made public next week at the company's annual Worldwide Partner Conference, set to be held in Washington, DC.

"The word on the street is some changes have been made and will probably be announced at WPC," says Howard Cohen, northeast regional chairman of the International Association of Microsoft Channel Partners (IAMCP).

Cohen says it remains unclear what those changes will be but he was optimistic. "The people at Microsoft who were responsible for MPN have told us clearly that they are open to dialog and they appreciate the role IAMCP plays as the voice of their partner channel." A Microsoft spokeswoman said the company does not comment on rumors but said "we will be talking a lot about MPN next week at WPC."

As reported, MPN is set to impose new certification requirements that will force many partners to hire additional engineers in order to maintain multiple advanced certification levels. As the Gold Certified Partner designation fades away, it will give way to specialty-specific designations called Advanced Competencies.

The requirement that bars double-dipping by engineers could impact those organizations with multiple Competencies now because they will have to add engineers to cover multiple disciplines at the Advanced level in the MPN.

Many Microsoft partners, larger ones in particular, say that's a good thing. "We don’t want to compete with someone who paid $1,500 and passed one test and became a virtualization partner," says Thad Morrow, director of sales at Concord, Calif.-based Entisys Solutions.

"Personally I think they are going to have make revisions because it will put too many partners out of business to be quite honest," argues Jeff Goldstein, president of New York-based Queue Associates, winner of Microsoft's CRM Dynamics SL Partner of the Year Award. "I understand what Microsoft is trying to do. Too many people are Gold Certified Partners."

Cohen doesn't dispute the notion that Microsoft has too many partners out there who have abused the Gold designation over the years. By his estimate, of the 12,000 partners in the New York City metropolitan area, 7,000 don't even have registered domains. Of those that do, perhaps only 3,000 to 4,000 are active Microsoft partners looking to grow their businesses, he reasons.

"Nobody who's making investments in building a good practice likes to see anybody level the playing field," Cohen says. "I think MPN is meant to wipe away all of the things that leveled the playing field. This is a good thing. As long as you don’t hurt the people who are playing by the rules, those are the ones we have to make sure we are taking good care of."

Pruning Microsoft's partner base in a way that doesn't take meat off the bone could be a critical challenge for the company's new channel chief Jonathan Roskill, who took over last week after swapping jobs with Allison Watson, who held the job for over seven years.

Another challenge for Roskill will be to get partners comfortable with Microsoft's "we're-all-in" cloud strategy.  What other challenges does Roskill have? If you'd like to make your opinions known, please take a minute to participate in a brief poll. As always, your responses will be kept anonymous unless you invite us to follow up with you about your answers. Click here to take the survey.

Posted by Jeffrey Schwartz on 07/08/2010 at 1:14 PM1 comments


Fireworks in Redmond

Report suggests Microsoft's woes stem from lack of young developers and customers.

It was, no doubt, a rather unsettling beginning of Microsoft's new fiscal year.

First Microsoft kills the Kin, making it arguably its biggest flop since Bob. Then a report by Microsoft Kitchen's Stephen Chapman detailed some specifics about plans for Windows 8, including a Windows Store, faster boot-up, support for slate-type devices and facial recognition. Not a welcome development, as Microsoft is looking to keep the focus on Windows 7.

Then on July 4, The New York Times published a scathing piece that, using the Kin's demise as a backdrop, questioned Microsoft's ability to appeal to young consumers and developers alike.

"We did not get access to kids as they were going through college," Bob Muglia, president of Microsoft’s business software group, told the Times last year. "And then, when people, particularly younger people, wanted to build a start-up, and they were generally under-capitalized, the idea of buying Microsoft software was a really problematic idea for them."

Tim O'Reilly, the influential book publisher and conference organizer, lent credibility to Times reporter Ashlee Vance's assertion:

"Microsoft is totally off the radar of the cool, hip, cutting-edge software developers. And they are largely out of the consciousness of your average developer," O'Reilly was quoted as saying.

O'Reilly in a blog posting said he doesn't recall saying that. "My memory is that Ashlee opened our conversation with that assertion, which I countered by saying that Microsoft still has big, active developer communities, and that you shouldn't assume that just because you can't see them in San Francisco, that they are dead," O'Reilly writes.

"I feel more than a little misrepresented," O'Reilly concludes. "It's sad when the NYT uses "flamebait" techniques in its stories. Rather than real journalism, this felt like a reporter trying to create controversy rather than report news."

Frank Shaw, Microsoft's corporate VP of communications, told Seattle's public radio station KUOW that Microsoft has two strong efforts to recruit young developers -- DreamSpark and its Imagine Cup project taking place now in Poland. "We've got programs designed at young developers," Shaw said. "I look at both those things and think maybe our definition of cool and hip is different."

He points to his recent blog posting, where he highlights Microsoft's net income for its 2009 fiscal year of $14.5 billion, compared to Google's $6.5 billion and Apple's $8.2 billion. "We've grown revenue and profits in a phenomenal way," Shaw said. Of course, a clearer indication will come later this month when Microsoft reports its fiscal 2010 earnings for the year ended June 30.

Do you think Microsoft can pull out of its current morass and appeal to young developers and consumers? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 07/07/2010 at 1:14 PM2 comments


Kin May Outshine Bob as Microsoft's Biggest Dud

Microsoft cut its losses quickly with the Kin, even though it never should have been hatched in the first place. Not that it's a big shock that Kin failed, or that Microsoft had much choice in the decision to kill it. It just seemed that, given the lack of logic in rolling it out, Microsoft might be reluctant to concede defeat so rapidly.

Kin, known internally at Microsoft as Pink, was of course born of Microsoft's acquisition of Danger, maker of the Sidekick, which was marketed by T-Mobile until the carrier pulled the plug on it yesterday as well, according to CNET's Ina Fried. Even though Kin was destined to be a dud, Andrew Brust blogged some interesting theories on why Microsoft had to launch it and let it fail. Check out his top 10 reasons why Microsoft may have launched Kin anyway. Brust notes:

Kin had to be brought to market, and had to fail, in order to topple [corporate VP Roz] Ho from her position and consolidate power in the WP7 [Windows Phone 7] team.

When I saw the Webcast of the Kin's launch in April, I couldn't help but cringe. It came after Microsoft finally had a credible story with Windows Phone 7, even if it still has an uphill battle to cut into the comfortable lead maintained by Apple with the iPhone and Google with its Android platform.

Still, it would be foolish to rule Microsoft out of the mobile enterprise race yet. But when Microsoft launched Kin, it was evident the device had no legs and, moreover, was a clear distraction from the Windows Phone 7 story.

The Kin, if it is remembered at all, may challenge Microsoft's Bob as the company's biggest dud ever. For those who remember Bob, launched 15 years ago to make Windows easier to use, it recently made Time magazine's list of the 50 worst inventions ever.

Now that's a tough act to follow.

Posted by Jeffrey Schwartz on 07/02/2010 at 1:14 PM0 comments


Quest Reaches Out To MSPs

Quest Software wants providers of managed services to use its broad portfolio of systems management, migration and connectivity software. The problem is managed services providers, or MSPs, don't want to pay Quest's traditional lump-sum licensing fees. In order to make its software palatable to managed services providers, Quest has introduced a new consumption-based licensing model that it hopes will broaden its market.

The company earlier this month launched its new Service Provider Program, designed to let MSPs license its software the way service providers are accustomed to doing business -- by the month or quarter and, as applicable, on a per-user or per-account basis.

"We are traditionally targeted at the on-premise customer to give them all the capabilities they need to do things themselves," Darren Swan, Quest's manager of development told me. "We have worked with service providers to help them do it with the customers, but we haven't targeted specifically the service providers to enable them to move on-premise to host and manage."

Quest had to make a fundamental change in its business model to reach this new customer set. That's because the business model for MSPs doesn't encourage them to incur up-front software license fees.

After seeing its revenues decline for the first time in a decade last year, Quest is trying to extend its reach to MSPs by offering its software via a new licensing model that is more conducive to them.

The subscription model is appealing for two reasons: the ongoing revenues and convenience for the end customer, explains Scott Gode, VP of marketing at Azaleos, a managed services provider and Quest partner.

"The unique thing with Quest, particularly around their migration tools, is that it's not at all a niche because migrations are huge and will continue to be," Gode says. "But it's not your typical annuity model because a migration is more often or not a one-time event verses consuming cloud storage space, which is an ongoing annuity model."

But Quest also has monitoring and other tools that are used on an ongoing basis. Yet MSPs are only willing to pay for those tools as they are consumed. One MSP that found Quest's array of tools appealing is DirectPoint Inc., based in Lindon, Utah.

Dan Atkinson, DirectPoint's VP of alliances, hooked up with Quest about a year ago and found many of its monitoring and migration tools suitable for the company's needs. But Atkinson said DirectPoint couldn't introduce new capital expenditures to pay for the software when the company was accustomed to weighting its costs toward operational expenditures. Atkinson said Quest's move from perpetual license fees to quarterly per-user pricing sealed the deal.

"Paying all of those upfront license fees and maintenance fees does not work well for somebody like ourselves that is a pure-play MSP," Atkinson explains. Software vendors who want to do business with MSPs and cloud providers appear to be grudgingly if not gradually moving in this direction, according to Atkinson.

Case in point is CA Technologies, which recently acquired monitoring vendor Nimsoft and cloud virtualization platform vendor 3Tera. "We're beginning to see more and more of it," Atkinson says.

If you're an MSP, are you finding software vendors showing more willingness to work with you on licensing? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 06/29/2010 at 1:14 PM0 comments


Big Changes Afoot for Microsoft Partners

Time will tell whether the reshuffling of responsibilities between soon-to-be-former channel chief Allison Watson and her successor, Jon Roskill, was a chance to let both execs broaden their careers and put new blood into their respective organizations, or whether more pronounced changes are in the works for the way Microsoft goes to market with its partners.

It's hard to be shocked that after nearly seven years Watson is moving to a new role. That's the Microsoft way and anyone who follows the company knows it frequently shuffles execs. By many accounts, Watson was overdue for such a change.

The move caught many partners and even those inside Microsoft off guard -- just a few weeks before Microsoft's Worldwide Partner Conference.

"The new fiscal year is a week away, and that means kicking off and executing the 2011 marketing campaigns and Allison's new role will play an important role there," Directions on analyst Paul DeGroot suspects. "Microsoft's global sales meeting usually happens a week or two after the partner conference, so she'll be getting worldwide visibility right away."

Watson's move as corporate VP for Microsoft's Business and Marketing Organization (BMO) should not be viewed as a kick upstairs, DeGroot added. "BMO is a pretty critical part of Microsoft's sales organization, so leading the U.S. BMO is by no means a trivial role or a sideways move," DeGroot said. "Maybe they are grooming her for future roles in the corporate mainstream."

By most accounts, Watson was well regarded in the partner community. "Allison truly has supported our group and we were very happy to have the opportunity to work with her," said Kerry Gerontianos, president and CEO of  Incremax Technologies Corp. and the IAMCP's national president. "The most rational explanation for the change is she's been there awhile."

Now it is unclear whether she will share the keynote stage with her successor or will cede it to him. The message of course is that the two will work together. Watson said as much in a blog posting yesterday.

"As I shift into my new role, one thing that will not waiver is my passionate dedication to partners," Watson noted. "My commitment to strengthening and evolving our engagement, and the collective learning you've imparted will be a toolset I will continue to employ. Jon and I are eager to hear from you about how we can best serve Microsoft's partners and customers."

Yet if she knew this was coming as recently as two weeks ago, she held it close to the vest. In fact, she spent nearly an hour just two weeks ago talking to me and Redmond Channel Partner editor-in-chief Scott Bekker, outlining the importance that partners follow Microsoft to the cloud, which was to be the basis of her keynote at WPC.

"Microsoft is 'all in' but we haven't really been telling everyone what 'all in' is yet," she told us. "In a comprehensive way, that's obviously a major goal for WPC. And the roadmap is becoming very fleshed out during the course of the next 12 months."

How partners should follow Microsoft to the cloud will surely remain the theme of WPC (stay tuned for the July RCP cover story), but lots of questions still loom. Perhaps the most significant is what this will mean for Julie Bennani, general manager of Microsoft's Worldwide Partner Group and architect of the new Microsoft Partner Network.

Many channel partners are concerned about what the new certification rules will mean to them, particularly smaller ones. Under the new rules, as I reported, there will be no double dipping, but partners are hoping Microsoft will make exceptions.

While it appears the new rules are aimed at partners that latch onto a certification without committing enough to those product lines, others argue the collateral damage will hurt partners who can't afford to hire additional engineers even though the engineers they do have are skilled in multiple disciplines.

"If they want to make bigger changes to MPN, they could use this re-organization as an excuse for a delay," suggests Howard Cohen, regional chairman of the International Association of Microsoft Channel Partners (IAMCP).

Also, it remains to be seen what this will mean for Pam Salzer, Microsoft's senior director of worldwide partner marketing. Many are bracing for additional personnel moves at some level in the coming days and weeks.

What's your take on all these changes? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 06/25/2010 at 1:14 PM0 comments


New Desktop Virtualization Licensing Looms

Microsoft is talking up desktop and application virtualization, and there are some new technology, partnering and licensing considerations on the horizon. Also, in the battle against VMware, look for Citrix and Microsoft to act as true partners-in-crime in taking up the cause of desktop and application virtualization.

First, in case you haven't heard, effective July 1, those customers who don't qualify for Windows Client Software Assurance will require a new license called Windows Virtual Desktop Access, or Windows VDA. According to a posting on Microsoft's site, the company came up with Windows VDA to allow organizations to license virtual copies of Windows in virtual environments for devices that don't qualify for Windows client SA such as third-party contractor PCs and thin clients, among others.

Windows VDA is a device-based subscription license that will cost $100 per device per year. "It will allow organizations to create multiple desktops dynamically, enable user access to multiple virtual machines (VMs) simultaneously and move desktop VMs across multiple platforms, especially in load-balancing and disaster recovery situations," Microsoft says.
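
To put that $100-per-device, per-year subscription in perspective, here's a rough, back-of-the-envelope sketch in Python; the device counts are hypothetical, and only the VDA rate comes from Microsoft's published pricing.

    # Hypothetical fleet; only the $100/device/year VDA rate is Microsoft's figure.
    contractor_pcs = 250    # third-party contractor PCs without Windows client SA
    thin_clients = 150      # thin clients accessing corporate VDI desktops
    vda_rate_per_device_per_year = 100

    annual_vda_cost = (contractor_pcs + thin_clients) * vda_rate_per_device_per_year
    print(f"Annual Windows VDA cost: ${annual_vda_cost:,}")  # -> $40,000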

What's the benefit of that added cost? Among other things, Microsoft says VDA includes the right to run Windows -- including Enterprise editions -- in the data center, rights for a primary user to access corporate VDI desktops from non-corporate PCs such as home systems and kiosks, and access rights to up to four VMs concurrently. It will also support unlimited movement of VMs between servers and storage and unlimited backups of VMs, according to Microsoft.

At a time when Windows 7 sales appear to be going through the roof, Microsoft appears to be making a strong push toward desktop virtualization even as it may cannibalize future Windows licensing.

To strengthen its portfolio, Microsoft is working closely with longtime partner Citrix to take on the virtualization behemoth -- VMware. Brad Anderson, who oversees Microsoft's virtualization efforts, gave a keynote address at Citrix's annual Synergy conference earlier this month, following Citrix's launch of its first bare-metal client hypervisor.

Following the keynote, Anderson and Citrix CTO Simon Crosby posted a recorded video conversation in which Anderson talked up what Microsoft has in store on the App-V front.

"With application virtualization, what we're doing is taking all the assets, all the experience from the desktop, applying it to the server, and this will be released in conjunction with the next version of System Center in 2011," Anderson said. "Think about this as being embedded into virtual machine manager that gives you the ability to separate out your existing applications, so that you can actually have that separation of the app and the OS and dramatically reduce your number of operating system images."

One of the things Anderson demonstrated in his keynote was the next version of System Center Virtual Machine Manager making XenServer a first-class citizen. "The integration is definitely there," Anderson said.

Added Crosby: "So you can drag and drop a multi-tier app from XenApp and XenDesktop, which has tons of components, and just magically shows up. One cool use case: once you've done all this virtualization, it's a good way to deploy things into the cloud, so I can then take an app that I've virtualized and pop it up into Azure."

What's your take on the new VDA licensing? If you're a Microsoft partner, are you looking more closely at Citrix's XenServer platform and the new bare-metal hypervisor the company announced? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 05/28/2010 at 1:14 PM1 comments


Some Partners Unsure About Microsoft Partner Network Changes

Not feeling all that prepared for the forthcoming changes in the Microsoft Partner Network? You're not alone.

According to a poll of a small but very influential group of partners, only 29 percent said they feel very aware of the changes and requirements in the Microsoft Partner Network, while 58 percent are somewhat aware and 12 percent are either unsure or not aware. Meanwhile, 38 percent said they are not familiar with what they have to do to prepare for the changes, while 46 percent said they were somewhat prepared. Only 15 percent feel very prepared.

These stats and others were gathered at last week's first-ever national meeting of the U.S. chapter of the International Association of Microsoft Certified Partners (IAMCP), where I reported on Microsoft's effort to extol the virtues of its move to the cloud and the uneasiness that partners are experiencing.

During the event, which included live meetings around the country and attendees who logged into a Webcast, Microsoft polled the audience to get their feelings on MPN.

While only 26 attendees weighed in, the poll was limited to members of the IAMCP, and these are influential Microsoft partners. So you may choose to take these numbers with a grain of salt, but they are at least an indicator of how some key partners feel.

We’d like to hear from more of you. Please let us know what level of awareness and readiness you have for the forthcoming changes in MPN. Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 05/26/2010 at 1:14 PM0 comments


Are You Ready for MPN?

The International Association of Microsoft Certified Partners held its first-ever national meeting yesterday, where the forthcoming new Microsoft Partner Network (MPN), the Worldwide Partner Conference (WPC) and Microsoft's emphasis on the shift to cloud computing were front and center.

Nearly 1,000 partner firms are members of the IAMCP. The goal of yesterday's event was to reinforce the IAMCP's mandate that partners network with one another and create relationships by which they go to market together in areas where their skills are complementary.

"We see an opportunity to really help you gain more information and connections to grow your business," said Cindy Bates, VP of Microsoft's U.S. Partner Strategy, who was the keynote speaker. Bates used her pulpit to talk up MPN -- launched at last year's WPC but set for some key changes to be rolled out this year. For a deep dive on the MPN see Scott Bekker's full report.

As far as many partners are concerned, MPN spells uncertainty. Approximately 60 to 70 percent of IAMCP members are now Gold Certified Partners, said Kerry Gerontianos, president and CEO of New York-based Incremax Technologies and president of the U.S. IAMCP, in an interview following the presentation. The uncertainty stems from the fact that it is unclear how those partners will rank under the new structure, which looks to create an advanced certification for some of Microsoft's largest partners.

Gerontianos, who hosted the event at Microsoft's New York City office, talked to me about IAMCP members' satisfaction with MPN so far. "I would say it's mixed," he said. "I think there is a lot of concern about MPN." While Bates did little to address how partners may be affected down the road, she did tell the several hundred in attendance that she considers IAMCP an important constituency.

"IAMCP and its very impressive partner community is closely and strategically aligned with Microsoft and, in our view, is one of the most [important] communities in the industry, fostering and facilitating partner-to-partner connections," Bates said.  "I have seen first-hand how partners have increased revenue as a result of p-to-p networking, identifying new customer opportunities through point solution delivery, expanding geographical reach or increasing capabilities through partnerships."

Of course those are mere platitudes to those partners that may find themselves having to invest in further certifications or losing their existing status. To its credit, the IAMCP is pushing hard for Microsoft to take the needs of smaller but high-revenue producing partners into consideration when rolling out MPN. How that will play out remains to be seen.

What's your take on MPN? How do you see it affecting you? And how does Microsoft's "we're-all-in" the cloud emphasis potentially impact your business moving forward? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 05/20/2010 at 1:14 PM2 comments


CardSpace 2.0 Headed for the Heap?

Active Directory Federation Services 2.0 is now shipping but Microsoft postponed the release of CardSpace 2.0, putting its future in doubt.

The current CardSpace is built into Windows 7 and Vista, but it doesn't appear to be widely used. Perhaps that's why Microsoft quietly announced that it was putting on hold the next version, CardSpace 2.0, which was to provide a common user interface for managing multiple logins.

CardSpace 2.0, which had been in beta since last year, supports ADFS 2.0 and includes support for the Windows Identity Foundation. To address the lack of updated Information Card support in the new ADFS 2.0, Microsoft next month is expected to release a Community Technology Preview of an ADFS 2.0 add-on that will enable Windows Server to issue InfoCards.

It appears that Microsoft shifted gears in March at the RSA Conference, when Scott Charney, Microsoft's corporate vice president of Trustworthy Computing, launched the CTP of the company's U-Prove identity technology.

U-Prove centers on the issuance of digital tokens that allow users to control how much information is shared with the recipient of the token. Used with ADFS 2.0, U-Prove lets users federate identities across trusted domains. Microsoft released U-Prove under its Open Specification Promise and also donated two reference toolkits for implementing the algorithms under a BSD license.

Moreover, Microsoft released a second specification under its OSP for integrating U-Prove into open-source identity selectors. How that will play out, in terms of whether the .NET and open-source communities embrace U-Prove, remains to be seen.

But that has many people wondering if there's any future for CardSpace 2.0 and if U-Prove will prove, pardon the pun, to be a viable replacement. "There's certainly support for information cards; our involvement in information cards is alive and well," said Joel Sider, a senior product manager in Microsoft's Forefront security group, in an interview yesterday. Microsoft is not saying when it will update its CardSpace 2.0 plans, but some are wondering whether the technology has a future.

CardSpace 2.0’s uncertain fate is "no surprise given its limited adoption," said Patrick Harding, CTO of Denver-based Ping Identity, a Microsoft partner and competitor. "Unfortunately, it has also really upset all of those people and companies that have bought into the InfoCard model at Microsoft's urging."

What's your take on the CardSpace 2.0 situation? Have you looked at U-Prove? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 05/06/2010 at 1:14 PM0 comments


HP with Palm Reshapes Mobile Landscape

About 10 days ago, I broke down and bought a Palm Pre Plus for a mere $49. The few people I've told have given me blank stares. I love the gesture-based interface and the way the device works. An added bonus was the free mobile hotspot built into the device that I can use to connect my netbook while on the road. I've been reluctant to reveal my purchase because I have 20 days left to return it to Verizon should I conclude it's a dud.

While I have been on the fence, HP announced yesterday that it has agreed to acquire Palm Inc. for $1.2 billion. This leaves me thinking Palm's webOS has a much brighter future. Here's why: The biggest knock against webOS and the Pre is its lack of apps. That has been the source of my dilemma -- not the device itself.

Only 2,000 apps are in Palm's App Catalog, compared to 200,000 or more in Apple's iTunes App Store. Making matters worse, sales of Palm-based devices are anemic, in part because developers have taken a pass on webOS in favor of Apple's iPhone and Google's Android platform.

IDC said in a research note yesterday that it believes Google's Android platform will be number two behind Nokia's Symbian platform. As everyone knows, Apple iPhones and iPads are selling like hotcakes. Then there's the BlackBerry brand and devices based on Microsoft's Windows Phone 7 coming out this fall.

So where does that leave Palm and webOS? Until HP came to the rescue, the future was looking bleak, despite what I think is a superior platform and user interface to Android and BlackBerry. Windows Phone 7 is a dark horse with a lot of potential, but the jury is still out. If I end up disappointed with my Palm Pre, I wouldn't rule out settling for a Windows Phone 7 device or an iPhone as an alternative. But I have no intention of leaving Verizon, and neither is an option there at this time.

The good news for webOS is that HP plans to invest significantly in both sales and marketing as well as in its developer ecosystem, said Todd Bradley, executive vice president of HP's personal systems group, speaking on a call to investors that was webcast. Bradley also sees extending webOS to other form factors and using HP's vast channel and retail presence to reach customers -- both enterprises and consumers.

"Our breadth of products between smart phones, slate and potentially netbooks represents an enormous opportunity for our customers," he said. While he declined to elaborate, the Web site CrunchGear posted five devices it envisions HP developing with webOS.

Bradley also is well aware he needs to get developers as excited about webOS as they are about Android and the iPhone, and indicated he's up for the challenge. "We believe this is a very, very early stage market. I think the developer community will very aggressively, as we invest and provide support, begin to develop that suite of applications for webOS that will make it even more compelling than it is today," Bradley said.

Indeed, HP has the means to quickly give a boost to the webOS developer ecosystem, said Jeffrey McManus, CEO of Platform Associates, in a brief e-mail exchange. McManus, who has given talks in the past on how to develop apps for the iPhone, was among the first to purchase the Pre when it came out last year.

"I'm confident that HP will be a great steward of Palm's webOS platform, which is still the most compelling mobile platform in existence today, particularly from a developer perspective," McManus said.  "It gives developers the ability to build native mobile applications using the same tools and technologies they already use to build ordinary Web sites today (HTML, CSS and Javascript). And WebOS will become even more compelling as HP brings it to more devices, including the Slate (which I'd expect them to do fairly soon)."

Still, IDC and others say the move could strain HP's relationship with Microsoft, whose CEO Steve Ballmer showcased HP's forthcoming Windows 7-based Slate tablet device at the Consumer Electronics Show in January. It also suggests HP's move into the smart phone market will come at the expense of Windows Phone 7, though Bradley was coy as to whether it will have a multi-platform smart phone and tablet strategy -- or emphasis.

Let's not forget that Bradley's successor at Palm, Ed Colligan, licensed Windows Mobile for the Palm Treo, in a widely publicized event with Microsoft chairman Bill Gates. It represented the first non-PalmOS-based platform the company added and, at the time, a key vehicle for Windows Mobile (in 2005 the Treo was the only major smart phone on the market besides the BlackBerry). That had mixed results for both companies, and I guess we'll find out how Bradley might handle Windows Phone 7.

Bradley insisted HP will continue to work closely with Microsoft. "We clearly believe in choice," he said. "We intend to continue to be a strategic partner for Microsoft, they are a huge piece of our business today and will continue to be so."

Still, there are reasons to be skeptical. As analyst Rob Enderle pointed out in a blog posting, HP has a history of missteps in the mobile market. "Palm and HP have both made runs at matching Apple in the past, and fallen flat on their faces," Enderle noted. "But the two companies’ combined resources might be just the secret sauce needed to stand tall beside Cupertino’s Goliath."

Michael Gartenberg, an analyst at Altimeter Group, agreed. "HP is now a force to be reckoned with in the mobile space," Gartenberg said in a blog post. "The combination of Palm technology and brand combined with HP resources and channel partners will be a strong combination for HP to drive their mobile efforts forward."

Having bought my first 3Com PalmPilot 3x in 1999, and having been a longtime user of the Treo, I'd note that what's left of Palm is the company's name and heritage. The Palm Pre and webOS are very different platforms, but in my opinion, webOS is the only platform that currently rivals the iPhone and Droid.

A year ago I asked: Will Palm Get its 'MoJo' Back with webOS and Pre? Things certainly went down a different path than the company and many of its investors and supporters had envisioned and hoped.

With HP agreeing to acquire Palm (and let's not forget the deal could fall through, or another suitor could come along with a better bid), I agree with McManus. The future of webOS is looking brighter. But the question remains: will HP get its mojo back in the mobile market? It's too early to say, but as I pointed out yesterday, it is quite ironic that Palm will once again be reunited with its former CEO Bradley and with 3Com, and will be under the same roof as the once-iconic iPaq. Presuming Jon Rubinstein, the key developer of the Apple iPod, and his team stick around, some interesting things can happen.

I still haven't decided whether I will keep my Pre but I am more inclined to hold onto it than I was yesterday at this time. What's your take on HP's move? If you're a developer, are you more inclined to look at webOS? Should I keep or return my Palm Pre? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/29/2010 at 1:14 PM2 comments


Will ADFS 2.0 Boost Cloud Security?

The pending release of Microsoft's Active Directory Federation Services (ADFS) 2.0 is expected to play a key role in simplifying how organizations provide access control to systems and applications, including those running in the cloud.

Microsoft is expected to release ADFS 2.0, the free Windows 2008 Server add-in to Active Directory, this week, as reported. ADFS 2.0 provides claims-based authentication to applications developed with Microsoft's recently released Windows Identity Foundation (WIF).

While ADFS 2.0 gives single sign-on to .NET applications built with WIF and to systems running Windows 2008 Server instances, it also extends that authentication to Microsoft's Windows Azure cloud service. But just as important, it provides single sign-on to Windows applications running on other cloud-based services, said Jackson Shaw, Quest Software's senior director of product management.

"ADFS 2.0 is really going to shed the spotlight on federation and cloud services and that's something the industry can use," Shaw said, in a telephone interview from the company's TEC 2010 conference in Los Angeles. "You can put an ADFS 2.0 instance up and use it to connect directly to Google or Salesforce.com. It's fairly straightforward."

Key to ADFS 2.0 is its support for the Security Assertion Markup Language (SAML) 2.0 standard, which is widely supported by cloud providers and ISVs. By allowing Windows and .NET apps to make and exchange SAML-based authentication claims, ADFS 2.0 removes a key barrier.
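
For readers who haven't worked with claims-based authentication, the sketch below shows roughly what the payload of a SAML 2.0 assertion looks like to a relying application. It is purely illustrative Python, not Microsoft's implementation -- WIF handles this, along with the signature, issuer and audience checks, for .NET apps.

    # Illustrative only: pull the claims ("attributes") out of a SAML 2.0 assertion.
    # A real relying party must validate the assertion's signature, issuer and
    # audience before trusting any of these values.
    import xml.etree.ElementTree as ET

    SAML = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

    def extract_claims(assertion_xml):
        """Return a {claim name: value(s)} map from a SAML 2.0 assertion."""
        root = ET.fromstring(assertion_xml)
        claims = {}
        for attr in root.findall(".//saml:AttributeStatement/saml:Attribute", SAML):
            values = [v.text for v in attr.findall("saml:AttributeValue", SAML)]
            claims[attr.get("Name")] = values[0] if len(values) == 1 else values
        return claims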

While Shaw sees ADFS 2.0 as a key step forward toward improving cloud security, he cautioned it's not a panacea. "Not every single piece of information about what someone can or can't do is stored in Active Directory," Shaw said. "There may be something about my spending authority in the SAP system, for example. What that means is it forces a customer to synchronize more info into Active Directory."

The problem, he explained, is customers may not always want to do that. "That's part of the evolution of cloud services we have to go through, and that's why I am excited about ADFS 2.0, because as more and more customers start to use this, these types of difficulties are going to be surfaced," Shaw said.

Not lost on him, of course, is the opportunity that presents for third parties such as Quest, Ping Identity, Symplified, CA, Novell and others to offer tools to remediate some of these issues.

Keynoting at this year's TEC 2010 was Conrad Bayer, Microsoft's general manager for Identity and Access solutions. Shaw, who attended the keynote, shared a few observations:

  • Directory technologies have all been brought together into one group at Microsoft, which Bayer will oversee. That includes ADFS, Forefront Identity Manager and Rights Management Server. "This is definitely a step in the right direction from the perspective of actual integration across the product line and hopefully some proper integration with Active Directory," Shaw said in a blog posting released just after we spoke.
  • When Bayer polled the audience to see how many were using ADFS, very few raised their hands. "I believe this will change once ADFS v2.0 releases later this year -- since ADFS is basically free," Shaw noted.
  • CardSpace 2.0 is not ready, Bayer confirmed. "It doesn't go away but it isn't imminent to be released either," noted Shaw. "They want to add OpenID support and they are working on that along with incorporating it into Internet Explorer."

Are you looking to use ADFS 2.0 in your organization or for your clients? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/26/2010 at 1:14 PM5 comments


Inside Microsoft's Private Cloud

I had the opportunity this week to see Microsoft's portable data centers, which the company showcased here in New York.

In honor of Earth Day, I thought it would be fitting to describe what Microsoft is showcasing, because it points to the company's vision for next-generation data centers that have self-cooling systems and servers that don't require fans.

Microsoft first demonstrated the portable data centers at its Professional Developers Conference back in November, in concert with the launch of Windows Azure. They gained further prominence last month when Microsoft CEO Steve Ballmer made his "we're all in the cloud" proclamation at the University of Washington with the huge units in tow. The one I saw in New York was 20 feet long by seven feet wide, but Microsoft also has one that stretches 40 feet.

These portable data centers, which are designed to be housed outdoors, are packed with loads of racks, blade servers, load balancers, controllers, switches and storage all riding on top of Windows Azure and Microsoft's latest systems management and virtualization technology.

But they also have self-cooling systems that pull the hot air out of the servers and use it to generate heat, when needed, in other parts of the data center. In places where the climate is cold, the system brings that cool air into the data center. Otherwise, it takes the outside air and runs it through what are known as adiabatic coolers. These custom-configured containers have sensors that automatically adapt when outside temperatures fall below 50 degrees or rise above 95 degrees, or when humidity levels drop below 20 percent or climb above 80 percent.
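To picture how that envelope works, here's an illustrative sketch. This is emphatically not Microsoft's control software: only the 50/95-degree and 20/80-percent thresholds come from the description above; the mode names, the 75-degree cutoff and the decision rules are my assumptions.

```python
# Illustrative sketch only -- not Microsoft's actual control logic. Only the
# 50-95 degree F and 20-80 percent humidity envelope comes from the container
# description; the mode names and the 75-degree cutoff are assumptions.
def airflow_mode(outside_temp_f: float, humidity_pct: float) -> str:
    outside_envelope = (
        outside_temp_f < 50 or outside_temp_f > 95
        or humidity_pct < 20 or humidity_pct > 80
    )
    if outside_envelope:
        # Sensors detect conditions outside the free-air envelope, so the
        # container adapts instead of pulling outside air straight through.
        return "adapt (recirculate or condition the air)"
    if outside_temp_f > 75:
        return "adiabatic cooling of outside air"
    return "direct outside-air cooling"

for temp, hum in [(45, 40), (72, 45), (90, 30), (100, 85)]:
    print(f"{temp}F / {hum}% -> {airflow_mode(temp, hum)}")
```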

While these portable data centers represent the latest proof-of-concept for where Microsoft sees organizations building on-premises private clouds, they are used to power Microsoft's own Azure-based data centers. "These are actual units that run our data centers today," said Bryan Kelly, a service architect for research and engineering in Microsoft's Global Foundation Services business unit, who demonstrated the portable data center for me.

It is also a reasonable bet that, while they are not on Microsoft's official product roadmap, customers will ultimately be able to buy their own Azure-powered containers that in some way emulate this model, most likely from large systems vendors such as Cisco, Dell, EMC, Hewlett-Packard and IBM, as well as custom system builders.

Speaking at the Microsoft Management Summit in Las Vegas Tuesday, Bob Muglia, president of the company's Server and Tools Business, was the latest to suggest as much. "The work that we're doing to build our massive-scale datacenters we'll apply to what you're going to be running in your datacenter in the future because Microsoft and the industry will deliver that together," Muglia said, according to a transcript of his speech.

Microsoft will buy 100,000 computers this year for its own data centers, Muglia said. They will be housed in these containers weighing roughly 60,000 pounds, equipped on average with 2,000 servers and up to a petabyte of data.

The units I saw showed no brands, so I have no idea whether they were bundled with Cisco, Dell, HP or IBM components, just to name a few. But it doesn't matter, according to Kelly. "This is all commodity hardware," he said.

That may be so, but the configuration of these data centers is anything but commodity. To do it justice, check out this Channel 9 video taken by Microsoft's Scott Hanselman at PDC, where cloud architect Patrick Yantz provides a 16-minute walkthrough of the units.

Do you see these data centers in your future? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 04/22/2010 at 1:14 PM0 comments


Infosys Bags $100 Million from Microsoft

After declining to disclose the value of its outsourcing deal with Microsoft, Infosys' CFO today reportedly revealed that it's worth a whopping $100 million.

Bangalore-based Infosys announced Tuesday that Microsoft is outsourcing its IT help desk, PC, infrastructure and application support to Infosys in a three-year deal that involves 450 Microsoft locations in 104 countries.

CFO V. Balakrishnan revealed the windfall to Dow Jones today. When I spoke Tuesday with Nataraj, Infosys VP and unit head of infrastructure management, he emphasized that the deal will not lead to any job displacement.

Nataraj also said that the work it has picked up was support already farmed out to a multitude of partners in the past -- that this was mainly the consolidation of that work to one partner. "Microsoft has outsourced parts of its internal IT before," notes Directions on Microsoft analyst Paul DeGroot.

Our original news report generated some less than enthusiastic comments. "C'mon Microsoft -- use US-based workers," wrote Bob. "No wonder no one wants to go into the tech field. Even Microsoft drinks the outsourcing Kool-Aid."

Added John: "What bugs me more than anything is that, like anyone who offshores to China or India, the cost of Microsoft products will not be cheaper. On the one hand we're too expensive to employ, on the other hand we live in the U.S. and are paying a premium for goods and services. Well, we may be seeing the last of the great experiment we call the U.S.A."

DeGroot points out that outsourcing of IT support could have other ramifications for Microsoft. For example, with fewer internal operations being performed by Microsoft employees, it could mean that there are no longer as many people internally with easy access to product groups and to highly detailed operational data.

It’s also potentially providing less access to "the world's best Exchange/Windows/SharePoint/SQL Server engineers who they might be able to call on to solve a problem that a lot of customers are having," he said.  

"Microsoft also uses its internal systems to 'dogfood' new products," he added. "For example, new products are often put into full production internally while they are still in external betas. Microsoft users put up with the problems they might encounter because they understand that their experience will help the company make a better product."

Posted by Jeffrey Schwartz on 04/15/2010 at 1:14 PM6 comments


VCs Set Expectations

If you're looking for VC money, don't presume no one else has considered your unique idea, be prepared to show you have a solid customer roster and don't expect to find the easy money of yesteryear.

Those were among the takeaways of a panel presentation I attended earlier this month with five VCs and a company that received venture funding. It was moderated by Bloomberg TV's Taking Stock anchor Pimm Fox. 

As a prelude to the panel, PricewaterhouseCoopers partner David Silverman presented the findings of PwC's annual MoneyTree report, which gave the lowdown on last year's dismal venture funding climate, the tightest in years. Investors pumped only $17.7 billion into companies, down 37 percent from 2008's $28 billion.

There are indications that it is starting to bounce back incrementally this year, with signs trending toward $20 billion, Silverman said. As reported, funding remains tight -- but VCs still see opportunities in areas such as cloud computing, smart phones and green technology. VCs are looking at smaller deals and companies that have proven and viable customer bases.

The takeaway: Those seeking the big payouts of yesteryear should reset their expectations, the VCs warned. "I think the classic VC model is broken," said First Round Capital founder and partner Howard Morgan.

"The mathematics are simple; if you raised a billion dollar fund and you wanted it to return 20 percent, you needed to return $3 billion. If you own 20 percent of your companies that exit, you needed to create $15 billion worth of market cap. If you were in YouTube and Skype and MySpace and a few others, you may be halfway there. That part is broken. What's not broken is that companies need capital."

Vytas Kislieulius, CEO of Collections Marketing Center, a software-as-a-service startup that runs a collections exchange, and a self-described "serial entrepreneur," said it is important to be realistic when making your case to potential investors these days.

"I have never been one to believe that the investors are here for my benefit but for our benefit. If I make it good for them they'll make it good for me," Kislieulius said. “If I can't make it good enough for both of us, I know who’s going to win. It's not me. That's the way the deal works."

That also means startups should go to investors with a solid business case. "There's less willingness to let it ride now than there used to be," Kislieulius said. "It takes so much more proof that there's a real market and that there's real customers."

Finally, he warns, those seeking funds should scope out potential investors carefully. "You have to choose them as carefully as they choose you because if you just take the money and it's a mismatch, it's brutal, I can promise you that."

Have you reached out to the venture community for funding? Or are you seeking alternative forms of funding your business? Drop me a line at [email protected] and follow me on Twitter @JeffreySchwartz.

Posted by Jeffrey Schwartz on 03/31/2010 at 1:14 PM0 comments


Should Microsoft Fear Google Apps?

Can Microsoft convince customers to upgrade to the full version of its forthcoming Office 2010, due for release May 12?

While that question has been looming large for a while, today The Wall Street Journal's Nick Wingfield once again raises it, focusing on the formidable challenge from Google. Wingfield said "Microsoft seems to be staring down the Google threat," pointing to wins by General Motors and Starbucks.

Google has 25 million Google Apps customers, though Gartner says only 1 million are paying customers. The report says there are 40 million paying Microsoft Office online customers, a small but noteworthy fraction of the hundreds of millions of Office users.

With Office 2010, Microsoft is adding its own Web-based client that will extend the use of apps such as Word and Excel to the browser. Moreover, Office 2010's ability to link to the forthcoming SharePoint Server 2010 will make the two a unique pair of products that will enable new levels of collaboration within enterprises and among extended work groups (for a deep dive on Office 2010 see the cover story in the current issue of Redmond magazine).

Microsoft appears to be shrugging off Google Apps as a threat to its Office franchise. But Silicon Alley Insider editor Henry Blodget begs to differ, writing "Microsoft Should Be in Major Panic Mode." Why? Blodget argues that Google has improved the capability of its offering and that many Office users will migrate. Microsoft can add more features, he argues, but those will appeal to a small subset of overall users.

"So don't take the puny size of Google's App business and the fact that big companies aren't seriously considering Apps as an alternative as a sign that Microsoft is safe," he writes.  "Microsoft isn't safe.  Microsoft is very exposed."

Google recently reported that it has signed on nearly 1,000 solution providers since launching its partner program last year. One of them is SADA Systems, a Microsoft Gold Certified Partner that is also a member of Google's program; its president and CEO, Tony Safoian, appeared last week on a Redmond Channel Partner Webcast hosted by editor-in-chief Scott Bekker.

"We feel like there are customers that are a great fit for Google and culturally they may be a little different," Safoian said. "And there's customers who will never get away from a desktop oriented experience or they just love the Outlook interface and they've invested a lot in that technology. We are just being honest and faithful to the market in being able to speak intelligently about both solutions and being able to offer whichever one makes the most sense."

Are you looking to upgrade to Office 2010? If so, why? Or should Microsoft be in panic mode? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/29/2010 at 1:14 PM4 comments


Is Windows Phone 7 Following in Palm's Footsteps?

While many are loath to write off Palm, the company responsible for creating the first generation of PDAs, the prognosis isn't looking too good. At the moment, those predicting Palm's demise seem to heavily outweigh those who believe the company will regain its former glory.

Palm's sustainability came under deeper scrutiny late last week when the company said it had shipped 960,000 units but sold only 408,000 of them, suggesting it has stuffed its channel with a ton of unsold inventory. The news caused Palm's shares to drop 30 percent Friday. Though the stock rallied early on news that AT&T would start selling the company's devices, it closed down half a point.

Critics point to a number of missteps by Palm, from choosing Sprint as its exclusive launch partner (it added Verizon in January) to releasing the new device last summer -- days before Apple started shipping its next-generation iPhone. But the biggest mistake has been how the company treats its partners and developers.

Despite the promise of delivering a software development kit that would be friendly to any JavaScript or HTML developer, the tooling failed to arrive in advance of the device. This dampened the prospect of Palm building a rich partner ecosystem. And now there are only 2,000 apps that support Palm's webOS, compared with 170,000 for the iPhone, 30,000 for Google's Android and 5,000 for Research in Motion's BlackBerry, according to data compiled by Silicon Alley Insider.

The 1990s saw Palm building a vast developer ecosystem with its PalmOS. "Back in the early days, Palm could do no wrong and was an unstoppable force, it dominated the PDA space…" recalls longtime Palm devotee Mark Nielsen in a blog posting last month.

But a key problem this time around was that Palm had lost the developer ecosystem long before webOS, Nielsen continued. Against that backdrop, Nielsen raises the question: is Microsoft following in Palm's footsteps with its Windows Phone 7 Series strategy, which effectively scraps the old Windows Mobile 6.x code in favor of a Silverlight RIA-based architecture and the Zune interface? In other words, just as PalmOS apps were useless on webOS, will the same hold true for .NET Windows Mobile developers?

"I have nothing against the Zune. I own one but it's not Windows Mobile and its navigation UI is not very flexible. On top of that, they choose to not support past apps, which once again I believe was a huge mistake for Palm," Nielsen notes.

"So like Palm, they have chosen to start over and play catch-up on third-party apps when they didn't really have to," said Nielsen. "You've alienated your past developers while hurting their customer-base which is your customer-base. In the meantime, you've positioned your new OS to 'wow' the home consumer and downplay your enterprise strengths. Microsoft, it's not too late to correct some of your decisions. Just look at Palm and see how it has worked for them."

Giovanni Gallucci, organizer of last year's Windows Mobile Developer, says that's not a fair comparison. "The PalmOS ecosystem was dying or dead," he said in an interview. "The Windows Mobile team learned by watching Palm and realized they have to get their code out there fast, early and into everybody's hands. Clearly Palm's approach that no one would get the SDK until after the device shipped was a strategy that failed."

Gallucci dug himself into a hole in early 2009 when he and others launched the Palm PreDevCamp effort; he walked away from it last year after its apparent demise. It should also be noted that Apple wasn't quick to make its SDK available before the iPhone came out, though that didn't hurt Apple the way it hurt Palm.

"Microsoft is going to the opposite extreme saying 'we're going to give you the SDK well before we even call this an alpha,'" he said. "They are taking a risk, much more than any other company does in giving developers access to their code, long before it's fully baked. But it's a risk that's paid off for them in the last three decades."

What's your take? To learn more about Microsoft's new mobile strategy, see: Top 7 Windows Phone 7 Highlights from MIX10. Share your thoughts by dropping me a line at [email protected].

Posted by Jeffrey Schwartz on 03/23/2010 at 1:14 PM4 comments


Remembering Jerome York

Jerome York, best known for his association with the billionaire and activist investor Kirk Kerkorian, passed away late last week just days after suffering a severe brain aneurysm.

York, 71, played a key role in helping save Chrysler and later IBM. He was brought into both companies as CFO when their survival was very much in question. York instituted major cost-cutting initiatives and is credited with contributing to their respective turnarounds.

At the time of his death, York was still sitting on Apple's board, where he had been a director since 1997; he joined just prior to the return of CEO Steve Jobs. "He has been a pillar of financial and business expertise and insight on our board for over a dozen years," Jobs said in a statement. "I will miss him a lot."

More recently, York was in the spotlight for leading Kerkorian's initiatives to salvage General Motors before its meltdown last year, which resulted in its filing for bankruptcy. As his obituary in The New York Times noted, he foresaw many of GM's problems years before they played out, though his warnings were largely ignored.

His obituary pointed to "one rare miss" when he and some investors bought direct systems marketer Micro Warehouse for $275 million, looking to capitalize on the boom for selling IT goods online. I recall sitting down with York at Micro Warehouse's Norwalk, Conn. headquarters. At the time, York was still treading water, trying to transform the company from an inbound seller to an outbound marketer of systems. But Micro Warehouse ultimately filed for bankruptcy and was snapped up by rival CDW.

Posted by Jeffrey Schwartz on 03/22/2010 at 1:14 PM1 comments


Microsoft Plunges Into VDI Pool

While client and desktop virtualization was always something Microsoft knew it couldn't ignore, it has always loomed large as a threat to Redmond's Windows franchise. But a group of coordinated announcements today suggests Microsoft is going to put more emphasis on both application virtualization and virtual desktop infrastructure (VDI) technology.

Microsoft has taken several key steps to make its Application Virtualization (App-V) and VDI stack both more appealing from a licensing perspective, as well as from an implementation standpoint.

"It's a coming out party," IDC analyst Al Gillen said in a telephone interview. "Microsoft had been very disinterested in client virtualization, or at least in promoting client virtualization. This represents a fundamental shift of strategy for them. They really have not endorsed client virtualization anywhere near the level of sincerity that they needed to. It's ground-breaking from my point of view for Microsoft to do this."  By not putting emphasis on VDI, Microsoft risked seeing VMware and Citrix continue to expand its presence, Gillen points out.

Microsoft kicked off its announcement with a webcast talking up its added focus on VDI, featuring a panel of customers, along with Gartner analyst Mark Margevicius, to extol VDI in general and Microsoft's place in the equation.

The popular travel site Expedia Inc., for example, is well into the rollout of a catalog of 600 applications to 7,000 distributed desktop users as part of a migration from Windows XP to Windows 7, and is now doing a proof of concept on VDI, said Chaz Spahn, a senior systems engineer at Expedia, in a telephone interview.

"We looked at SCCM [Microsoft's System Center Configuration Manager] or application virtualization technology and saw it gave us faster time to delivery," Spahn said of the App-V decision, noting it is appealing for use with call center agents and remote developers, who are typically contract workers distributed worldwide.

"We find at Gartner that the level of interest in desktop virtualization without exception is very high right now," Margevicius said on today's Webcast. Customers across sectors ranging from health care, government, finance and manufacturing are all interested in it due to its potential to ease administration and deployment as well as address concerns about compliance and security. "The distributed nature of PCs is very much at risk in terms of data being compromised," he said.

What is so noteworthy about today's announcement?  Microsoft said customers no longer have to purchase separate licenses to access Windows in VDI environments. For non-Software Assurance customers, Microsoft has added a "Windows Virtual Desktop Access subscription" priced at $100 per year per PC or thin client device.

Meanwhile, Microsoft is upgrading its VDI stack, adding its RemoteFX graphics acceleration platform to Windows Server 2008 R2, dynamic memory support that lets the memory allocated to VMs be changed on demand, and the elimination of the need for hardware-based virtualization. Also today, Microsoft added to its longstanding partnership with Citrix Systems, which will extend the Citrix HDX technology in XenDesktop to work with RemoteFX.

"While some if it isn’t quite ready yet, from a competitive perspective they want the market to know what's coming and get the market excited about their portfolio," said Jeff Groudan, director for thin client computing at Hewlett-Packard, in a telephone interview. HP used Microsoft's launch to announce its Remote Desktop Client (RDC) add-on for its portfolio of Windows Embedded Standard (WES)-based thin clients.

Today's announcements also follow Microsoft's recent release of App-V 4.6, an add-on to the Microsoft Desktop Optimization Pack and to Microsoft Application Virtualization for Terminal Services. Key to that upgrade is that it allows organizations to deploy applications on a single storage area network (SAN), rather than requiring them to be spread out across VMs.

Microsoft had no choice but to take the plunge into the client virtualization pool, observers say. Among other reasons, it's critical to Microsoft's effort to become a player in the overall mobile computing space, Gillen says. "Mobile devices are becoming important and it's a space that Microsoft doesn't own," Gillen says. "Client virtualization is one of the things that really marries traditional client computing together with mobile computing, and Microsoft was going to be a non-player if they didn't get in there and compete."

If you're a customer, are you looking at VDI and application virtualization for your organization? And for partners, do you see a rich opportunity for services dollars here? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/18/2010 at 1:14 PM0 comments


Are Microsoft Partners 'All-In' the Cloud?

A week after trying to sell customers on its "we're all in" campaign to the cloud, Microsoft is now trying to bring its vast network of partners onboard.

Allison Watson, the corporate vice president of Microsoft's Worldwide Partner Group, made her pitch Wednesday in a prepared and edited video presented via a 10-minute webcast.

"The cloud is here, the cloud is now, and it is important that each of you embrace understanding what it is," Watson said, after reiterating CEO Steve Ballmer's five "dimensions" about how the cloud will embody all of Microsoft's computing efforts.

But if the number of views tallied on the video is any gauge (fewer than 100 nearly 24 hours after the webcast), it leads me to wonder whether partners are feeling the buzz about Microsoft's cloud campaign. As I watched the video, available on-demand, I wondered: where's the beef?

And without further ado, Watson explained how 1.5 million McDonald's employees at 31,000 stores are using Microsoft's Business Productivity Online Suite (BPOS). "They needed a cloud e-mail solution and Microsoft online services became their choice." (Yes, I know that "where's the beef" was a campaign by McDonald's rival Wendy's, but you get my point.)

Watson used the McDonald's example to explain how BPOS can be integrated with customers' internal systems and partners' own offerings. "I would highly encourage you to actively integrate these offerings within your own larger stack today so you don't miss out on this cloud opportunity now," Watson said.

Microsoft has 7,000 partners offering BPOS with 20,000 active trials under way, she said. And since its launch last month, 200 customers per day are signing on to use Windows Azure, she added. "In many ways, it's still a green field with an upside in trillions of dollars," she said.

Indeed, according to our own survey of 500 Microsoft partners, 18 percent believe cloud computing will have an impact on their business this year. Twenty-six percent believe the impact will come next year, and 16 percent say it will arrive in 2012. Another 10 percent predict it will come after 2013, while 8 percent say it has already arrived.

But in response to a blog post by Watson following the video that effectively reiterated Microsoft's five principles, one partner asked, "Where can I get information on partner opportunities now?" Watson replied that more information will come at the Worldwide Partner Conference (WPC) in July.

There are some actions partners can take in the meantime. She suggested working with the Bing APIs, because search will be central to helping customers find and aggregate data in new ways going forward. "We're developing search technologies that integrate information seamlessly from the cloud, from users, from developers, and we are bringing all of those things together in an integrated way," she said.

Another key area where partners will be able to add value is helping customers address security and privacy, she noted.

OK, so Watson has primed the pump. But many partners are still wondering how this will change their business. What's your take on Microsoft's "we're all in" cloud campaign? Are you "in" or are you still wondering, "Where's the beef?" Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/11/2010 at 1:14 PM0 comments


Will Novell Finally Be Acquired?

In more than two decades of following Novell, I've had many conversations with experts about who might someday acquire the company. In my mind, it was never a question of "if" but "when" Novell would be snapped up. But the company just chugged along.

Could that acquisition finally be arriving?

New York-based hedge fund Elliott Associates LP on Tuesday made a $2 billion bid for Novell -- a 49 percent premium over Novell's share price Tuesday night, before the stock jumped 28 percent yesterday. Elliott already holds an 8.5 percent stake in Novell's common stock. The hedge fund was vague about its intentions for Novell but believes the company is underperforming.

Indeed, Novell has underperformed compared to key rivals Red Hat, Microsoft, Citrix Systems and IBM, wrote Anders Bylund, an analyst and contributor to The Motley Fool. But what will a hedge fund do to turn the company around? Potentially chop it up and sell off the pieces? Might another player -- such as one of its rivals -- be able to add value to its offerings?

"Over the past several years, Novell has attempted to diversify away from its legacy division with a series of acquisitions and changes in strategic focus that have largely been unsuccessful," wrote Elliott portfolio manager Jesse Cohn in a letter to Novell shareholders. "With over 33 years of experience in investing in public and private companies and an extensive track record of successfully structuring and executing acquisitions in the technology space, we believe that Elliott is uniquely situated to deliver maximum value to the company's stockholders on an expedited basis."

Elliott declined to elaborate further and it remains to be seen if a bidding war emerges.

Novell was once a kingpin in the software industry. Its founding CEO, the late Ray Noorda, was a legend in the 1980s and early 1990s, and was perhaps best known for coining the term "coopetition."

Once Microsoft's nemesis, Novell was the first major player to provide the technology for enterprises to interconnect their PCs. These days, though, you'd be hard pressed to find an enterprise of any size still relying on Novell's NetWare.

After a failed bid to acquire Lotus in 1990, Novell later acquired WordPerfect, ultimately selling most of those assets to Corel. The one vestige of WordPerfect still owned by Novell is the technology that is now the basis of GroupWise, also a minor player in messaging compared to Microsoft Exchange and Lotus Notes.

These days, of course, Novell is best known as the No. 2 Linux distributor. But it also has virtualization, systems management, identity management and services offerings.

And ironically, Novell today is a Microsoft partner as Noorda's philosophy of coopetition has come full circle -- much to the consternation of many in the open source community.

How important is Novell's fate to your business, and what are the implications of where the company ends up? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/04/2010 at 1:14 PM0 comments


Will XP Users Upgrade Existing PCs to Win 7?

With today's deadline to move off the Windows 7 RC, many users have to decide whether to go back to Windows XP or Vista, or whether to pony up and upgrade to Windows 7.

Providers of PC migration software like Laplink and Detto Technologies can capitalize on that decision either way. In my news story, I described how I used Laplink's PCmover to upgrade to Windows 7 from the release candidate, but the software is really intended for those with XP or even older versions of Windows looking to a) migrate those systems to brand-new ones, or b) do in-place upgrades of existing PCs from older versions of Windows to Windows 7.

Systems with Vista don't require a clean install when upgrading to Windows 7, though in many instances it might not be a bad idea. But those with XP have no choice but to perform a clean install, and that's where Laplink has set its sights. While PCmover is a retail product, Laplink is also trying to extend its reach to the enterprise. Laplink has OEM arrangements with Dell, Hewlett-Packard and Lenovo, as well as 1,000 channel partners.

Why would a channel partner want to bother with a low-cost tool like PCmover? A $500 starter kit for 25 licenses is a good way to offer small businesses PC migration services, said Mark Chestnut, Laplink's senior VP of business development.

"For someone who is in the business of delivering PC migration as a service, we lower their cost of delivering that service and allow them to make better margins," Chestnut said.

Many small businesses may not have the patience or the resources to re-image their Windows 7 systems, Chestnut said. That offers a services revenue opportunity for solution providers, he added. Microsoft insiders tell him there are still 20 million machines running XP that are eligible to be upgraded to Windows 7.

"The current economic environment being what it is, companies are really obviously clamping down on IT spending, yet Windows 7 has some huge advantages," Chestnut said. "I think they will take a closer look at keeping as many of the old PCs and preserving their previous investments longer, than in the past."

Are you considering upgrading your older hardware to Windows 7? Or, if you're a solution provider, do you see an opportunity? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 03/01/2010 at 1:14 PM1 comments


Look Who's Tweeting: Microsoft's Channel Chief

Microsoft channel chief Allison Watson last week joined the Twitterati and has launched a new blog called Redmond View.

Watson, corporate vice president of Microsoft's Worldwide Partner Group, has invited partners to follow her on Twitter @Allison_Watson or on Facebook "so I can get your feedback and chat with you about what's going on in the marketplace and in your business," she wrote in her inaugural blog post.

Getting right down to business, Watson focuses on a subject near and dear to partners and Microsoft: the Business Productivity Online Suite (BPOS). Pointing to over 1 million BPOS seats, Watson calls on partners to go deeper.

"It's important that you internalize our offerings in your unique business requirements, and then give us feedback about what you need to capture the opportunity," she said. "A lot of partners are asking me, 'How do I make money in this deal?' It depends on whether you're a reseller partner, an ISV partner, or an integrated partner. Based on all the deals we've done to date, we're hearing that the average partner opportunity is about $167 a seat. That includes partner referral fees, the initial setup and migration fees, as well as factoring in some of your managed service fees. That's a pretty big opportunity."

Watson points to four tools Microsoft is offering: the profitability modeling tool, a partner link tool that lets partners embed direct quoting, a tool that allows co-branded billing and a commerce dashboard to help understand the success of sales trials.

Watson's call to action comes as some Microsoft partners are voicing frustration over its pricing moves, and as Google appears to be gaining momentum in the enterprise with its own Google Apps offering (Google this week said it has nearly 1,000 partners in its Google Apps Authorized Reseller program).

What's your take on Microsoft's tools and Google's momentum? Are you looking at Google Apps as an alternative to BPOS, or perhaps an adjunct? Drop me a line at [email protected]. And you can follow me @JeffreySchwartz on Twitter, as well, for other short updates.

Posted by Jeffrey Schwartz on 02/24/2010 at 1:14 PM1 comments


Cisco Declares War on HP

Cisco's decision to pull the plug on its partnership with HP was a major salvo in tensions that have been brewing between the two companies over the past year. Cisco last week said that it's cutting HP off as a Certified Channel and Global Service Alliance partner, a move that could force the companies' respective partners to make some tough choices.

"There may be a push by one or both companies to push channel partners to an either/or situation," said Mark Amtower, a marketing consultant with expertise on selling IT to the federal government, in an e-mail. "Many companies carry both as partners right now -- I don't think that will continue. If you push HP, marketing support from Cisco will disappear and vice versa."

The two companies have been encroaching on each other's turf for some time, with Cisco last year saying it would offer its own blade servers and HP becoming more entrenched in networking by bolstering its ProCurve line and agreeing to acquire 3Com Corp.

With the partnership set to expire April 30, Cisco took the unusual step of publicly announcing it was cutting HP off. HP quickly shot back, accusing Cisco of not working to "best serve clients' needs."

Does this move signal an end to co-opetition? It raises the question of whether we will see more partnerships unravel or, at the very least, become diminished as companies look to become single-source providers.

Or maybe, as Directions on Microsoft analyst Paul DeGroot suggested, the current partnership has become "too all-or-nothing." Perhaps they needed "a more nuanced approach to ensure that joint customers get the support they require, while the other partner doesn't get privileges that it doesn't need for mere interoperability purposes," he said.

Gartner analyst Tiffani Bova agreed. "I wouldn't be surprised if a new arrangement doesn't follow closely behind where they meet each other half way in order to continue to service their joint customers and partners," Bova said.

Indeed, that may happen. On the other hand, what if Cisco means business and wants nothing to do with HP? If indeed these two companies go their own way, we could see Cisco getting closer with IBM and perhaps Oracle/Sun while HP could forge closer ties with the likes of Brocade and Juniper Networks.

Certainly, for Microsoft partners, this also raises some questions since most also carry gear from Cisco, HP or both. What's your take on the implications of Cisco and HP going separate ways? Will we indeed see others follow suit? Among other things, could this lead Microsoft to rethink its strategy of working closer with the likes of Novell, Red Hat and Zend? Could co-opetition as we know it be on the line here, or is this just a case of Cisco playing hardball?

Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/22/2010 at 1:14 PM0 comments


Seeking Funds for SMBs

Small and medium-size businesses have long been the salvation of IT recoveries, but this time that conventional wisdom may be falling flat.

The good news, as I reported earlier this month, is that the economy surged last quarter by 5.7 percent, the largest such expansion in six years. Adding to that optimism, the Federal Reserve yesterday said business equipment output was up 0.9 percent in January, slightly higher than December's 0.7 percent.

IT output jumped 1.7 percent, marking the third consecutive monthly gain of more than 1 percent for IT gear. That has been reflected in strong earnings reports from Cisco, Intel, Microsoft and, yesterday, HP, which posted an 8 percent increase in revenue and boosted its outlook for the year.

That should bode well for SMBs, which are typically the first to lead recoveries from recessions. But a troubling report in BusinessWeek underscores the fact that SMBs this time aren't leading that recovery. Instead, SMBs are continuing to let go of employees and reduce capital spending.

Only 20 percent of those surveyed by the National Federation of Independent Business (NFIB) plan to make capital outlays. Even more concerning, on a net basis just 3 percent see sales increasing, -1 percent plan to hire more employees, 1 percent expect the economy to improve and 5 percent believe it's a good time to expand, according to the NFIB survey (PDF). And a net -13 percent expect credit lines to open up.

Small businesses continue to hurt, that same BusinessWeek piece said, noting a Feb. 1 report by the Federal Reserve saying that banks continue to hold back on offering credit to them.

Probably none of this is surprising, but it is rather sobering. How is this affecting your ability to sell solutions to prospects? Have you found avenues of financing for your own business or that of your clients? Perhaps you've turned to leasing, private equity or even the venture capital community? Please share them with us. Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/18/2010 at 1:14 PM5 comments


Will Partners Embrace New Win Phone 7?

Now that Microsoft has revealed its mobile ambitions, partners must wait to see what's underneath the covers.

Microsoft began its orchestrated rollout of the new Windows Phone 7 Series this week at the Mobile World Congress in Barcelona. The new platform replaces Windows Mobile 6.x with a completely revamped user interface that incorporates Microsoft's Metro, the basis of Zune and Windows Media Center.

Windows Phone 7 Series licensees must adhere to specific integration requirements such as defined screen sizes, support for touch and GPS, among other things. The goal is for Windows Phones that come out later this year to be more architecturally consistent like the BlackBerry and iPhone, while offering a broader ecosystem of devices and form factors.
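To give a flavor of what a chassis-style requirement means for a licensee, here's a hypothetical sketch of the kind of check an OEM might run against a device spec. The field names, the allowed resolution and the checks themselves are illustrative assumptions, not Microsoft's actual specification.

```python
# Hypothetical sketch of a chassis-spec check; field names and values are
# illustrative assumptions, not Microsoft's actual requirements list.
from dataclasses import dataclass

@dataclass
class DeviceSpec:
    screen_resolution: tuple   # (width, height) in pixels
    capacitive_touch: bool
    gps: bool

ALLOWED_RESOLUTIONS = {(480, 800)}  # assumed single allowed screen size

def meets_chassis_spec(spec: DeviceSpec) -> bool:
    return (
        spec.screen_resolution in ALLOWED_RESOLUTIONS
        and spec.capacitive_touch
        and spec.gps
    )

print(meets_chassis_spec(DeviceSpec((480, 800), True, True)))   # True
print(meets_chassis_spec(DeviceSpec((320, 480), True, False)))  # False
```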

If you have a vested interest in the current Windows Mobile, you should take a look at the changes that lie ahead. They're not trivial. This 20-minute Channel 9 video provides a good overview of what Windows Phone 7 Series will look like.

But Microsoft is tight-lipped about the underpinnings of its new platform. While company officials say that's by design -- to keep the focus on the new UI -- it has some wondering whether that portends portability issues.

"I think probably what's going on is it’s a complete break with Windows Mobile 6.5," says Directions on Microsoft analyst Matt Rosoff. "They know that news might not be received well by application developers so they are trying to figure out what the portability story will be."

If Microsoft is headed in a different direction architecturally, it's going to have to shim the old apps to get them to run, says IDC analyst Al Hilwa. "We're talking about various subsets of .NET underneath so it's not that difficult, but the question is whether they have the time to do that," Hilwa says, referring to the planned holiday season release. Partners will get a better picture of what development challenges they face when Microsoft releases the Windows Phone 7 tooling and bits at next month's MIX 10 conference.

"Windows Mobile has a portfolio of business app extensions and, given the new interface, those folks may very well have to re-architect their apps," Hilwa says. "I think they will be more than willing to do that, that’s my sense. They are already partnered and invested in Microsoft technologies. I think they will make that judgment and take the time to refurbish their apps. But as usual with application vendors, not everyone always will, there will be those that can't invest much but I think that's a minority."

More curious: can Microsoft attract those partners who have passed on Windows Mobile but have already built apps for the Apple iPhone, Research in Motion BlackBerry and devices based on Google's Android platform?

For now, Microsoft is emphasizing the consumer aspects of Windows Phone 7 -- the Zune interface, and the ability to aggregate social networks, photos, games via Xbox Live, and media into a common user interface. Though Microsoft hasn't played up the business capabilities, officials say the platform will support OneNote, Exchange, Word, Excel and access to SharePoint. But at this week's debut, Microsoft gave those features mere lip service. "The amount of time devoted during the presentation to 'Productivity' was disappointing," writes Philippe Winthrop, an analyst at Strategy Analytics, in a blog posting.

Enterprises for the most part don't develop mobile apps internally; they rely on the partner community, Hilwa says. The question is whether Windows Phone 7 Series will win over that community. Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/16/2010 at 1:14 PM1 comments


SAP Seeks Happiness

It's been a dramatic week for SAP, whose software runs the operational underpinnings of some of the largest enterprises. The company shook up its executive suite, replacing CEO Leo Apotheker with co-CEOs Bill McDermott and Jim Hagemann Snabe. SAP today also disclosed the departure of executive board member John Schwarz, the former Business Objects CEO.

Listening to founder and chairman Hasso Plattner speak Monday during a webcast press conference, it sounded like a day of reckoning -- the company acknowledging its missteps and apologizing to its customers for gouging them.

Those weren't his exact words but he tacitly acknowledged SAP has to find a new engine of growth besides imposing heavy maintenance and licensing fees. "We are a public company, and profit is everything," Plattner said. "But in order to be profitable, it needs to be a happy company. I will do everything possible to make SAP a happy company again. And in order to be profitable and please the shareholders, we have to focus on our customers, and we have to make the customers and their employees happy, as well."

During the Q&A portion of the call, a reporter asked if Plattner was acknowledging that SAP was an unhappy company. Clearly resenting the question, Plattner responded, "Please don't turn it around that we are unhappy. Take it that we have to be happier. Happy companies are companies who enjoy their success, their strategy, and are marching forward at the highest possible speed without complaining. SAP has the capacity, has the strategy, has the development on its way, and it takes unfortunately some time with our huge customer base."

Forrester Research analyst Paul Hamerman said in a blog post that Plattner said the right things. "He's got this right: taking care of your customers makes your company successful. Forcing profitability via price increases and sales tactics is not a sustainable recipe for success," he wrote.

True happiness for SAP, of course, will come when it can -- among other things -- address its stalled cloud strategy. The company launched its Business ByDesign, a SaaS-based application suite, in 2008 but angered larger enterprise customers by saying it was targeted at organizations with 100 to 500 employees, according to a research alert released by Saugatuck Technology today.

"SAP's strong prevailing culture and its need to protect its R/3 cash flows fundamentally forbade the company from pursuing offerings that could replace it," the report said.

I spoke with one of the report's authors, Saugatuck founder and CEO Bill McNee, who described four challenges facing SAP.

The biggest challenge SAP faces is cultural. "They have a very significant cultural transition where they have focused historically on the large enterprise customer almost to a fault and a legacy around the big deal, to a technology-not-invented-here syndrome," McNee said.

Second, the company needs to accelerate its cloud strategy. "They need to better articulate their cloud vision," he said.

Third, the company needs to figure out how to bring forth the right technology and monetize it.

And finally, if SAP really wants to succeed in targeting the small and medium business market, it needs to come up with an accelerated go-to-market strategy. That also means shedding its legacy of primarily selling direct to customers. "SAP has less experience building partner networks that will enable them to succeed in the small to medium market," McNee said.

If SAP is successful with its Business ByDesign offering and with building up a partner ecosystem, it is likely to butt heads with Microsoft's Dynamics business, McNee said. "Microsoft's channel should stay alert to changing customer requirements, and evolving offerings from Microsoft, going forward."

What will it take to bring happiness to those buying and selling ERP, CRM and other business solutions? Share your thoughts by dropping me a line at [email protected].

Posted by Jeffrey Schwartz on 02/11/2010 at 1:14 PM0 comments


Feeling the Google Buzz?

Google's latest stab at social networking is creating a lot of "buzz," but it remains to be seen whether it will become as dominant as Facebook or Twitter. Based on initial reactions, it doesn't appear to be a threat. The real question, though, is whether it will make Google Apps Premier Edition (GAPE) a stronger contender in the enterprise.

Make no mistake: That's one of the company's goals with Google Buzz, which uses the inbox as a way of bringing together all of one's social networking activities.

"The inbox is the center of attention for many people's online communication, but the way today's social tools interact with e-mail is pretty limited," said Todd Jackson, the product manager for Buzz, speaking at Google's headquarters at an event that was webcast. "With Buzz, we wanted to change that and bring social updates to your inbox in a way that goes beyond normal e-mail."

Buzz got off to a curious start, rattling some of its Gmail users. "OK, Google Buzz, you've made your point. Now how do I shut [you] off?" tweeted Jeffrey McManus, CEO of Platform associates, the developer of Aprover.com and a Gmail user.

I asked McManus, a longtime user of social networks and well-known in the .NET development community, for his thoughts on Google's long-term prospects in the enterprise.

"Buzz brings very little that's new to the table," he responded.

Burton Group analyst Guy Creese agreed in a blog post this morning. Creese and others believe that Microsoft already has a superior answer to Buzz in its Outlook Social Connector, which will appear in Office 2010, due for release this spring. While it remains to be seen how well Outlook Social Connector will be received, Creese believes it's a good start and will appeal to those comfortable with Outlook.

Not surprisingly, Microsoft seems to feel the same way. ZDNet blogger and Redmond columnist Mary Jo Foley said in a blog post yesterday that Microsoft doesn't appear to be concerned. "Are the Softies quaking in their boots? Not exactly," she wrote.

However, some analysts suggest that while Buzz may not displace Facebook and Twitter, it could gain traction. "Despite mediocre past attempts at social networking products such as Orkut or Dodgeball," wrote Interpret analyst Michael Gartenberg in a blog post, "Buzz is likely to attract a strong following by virtue of its tight integration into Gmail and the ability for Google to expose the service to advanced as well as novice users immediately."

I'd like to hear from those who've used Outlook Social Connector and Google Buzz and get your thoughts. And for those in the channel selling Office and SharePoint, what's your take? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 02/10/2010 at 1:14 PM0 comments


Microsoft Clams Up on BPOS-Lite Rumor

Microsoft has removed a job posting seeking a manager for a new hosted offering intended to bring e-mail and collaboration services to SMBs.

The service is code-named "BPOS-Lite," according to text of the posting, which was revealed Monday by ZDNet.com and Redmond columnist Mary Jo Foley. "BPOS 'Lite'...is part of the 'next wave' of services targeting professional individuals and smaller organizations, offering Microsoft's best collaboration, communications and productivity services," the now-removed posting said.

The manager hired for the position will be charged with developing business strategy, including creating a go-to-market model, launching services and developing service enhancements, according to the posting. The manager will "act as strong advocate for BPOS-Lite with corporate, field and partner teams; with analysts; and at industry and customer events," according to the post.

Microsoft isn't commenting, though its partner group has tweeted Foley's post. "We are always working on the next wave of Microsoft Online Services, offering Microsoft's best collaboration, communications and productivity services to businesses of all sizes," said a prepared statement e-mailed by a company spokesperson. "Although we do not have details available to share today, we look forward to sharing more at a later date."

Perhaps that later date will land during Microsoft's Worldwide Partner Conference in Washington, D.C. that's slated for July, speculated The VAR Guy.

One person who's heard rumblings about BPOS-Lite is Bob Leibholz, vice president of business development at New York-based Intermedia, a Microsoft Gold Certified partner and one of the largest BPOS hosting providers with over 250,000 Microsoft Exchange seats. Leibholz said Microsoft hasn't given him any information about the service and he's wondering if it may put a tighter squeeze on him and his partners.

Leibholz made his displeasure known last fall when Microsoft cut the pricing of BPOS from $15 a month per subscriber to $10.

"From my perspective, they devalued BPOS last year when they decreased the price, and a concept of BPOS-Lite, which is basically another price concession, fundamentally continues to miss the understanding of value and rather compete purely on price," Leibholz said in an interview today.

Meanwhile, Microsoft has experienced scattered outages with its BPOS service over the past week, most recently yesterday. According to a letter to customers last week from Microsoft's Online Services team, the root cause of the outages was networking issues. "We hold ourselves to the very highest standard," the letter said. "And yesterday, we didn't meet it."

Posted by Jeffrey Schwartz on 02/02/2010 at 1:14 PM0 comments


Will Apple's iPad Define Slate Computing?

While there's no shortage of opinions as to whether Apple will catch lightning in a bottle for a third time with its new iPad, there's a good case to be made that the initial entry could be a boon to those developing PC-based slates.

As media critic David Carr reports today in The New York Times, the iPad "is a device for consuming media, not creating it." That's not to suggest that future releases won't raise the bar, but as many observers suggest, Apple also has to make sure not to offer too much and risk cannibalizing its MacBook product line.

Ironically, this is the same issue Microsoft faced in its initial hesitation to embrace netbooks. But the real potential of the iPad and similar Windows 7-based devices, such as the one anticipated from Hewlett-Packard, is to let individuals consume content as a companion to their computing experience, not a replacement. That's where the concept of the iPad and Windows 7-based slates could shine.

Among the biggest criticisms of the iPad is that it can't multitask and won't support Adobe's Flash (nor are there known plans for it to support Microsoft's Silverlight). In a Wired magazine report, Apple CEO Steve Jobs was reported to have told employees in a profanity-laden rant that Flash is too buggy and that Adobe is lazy. "No one will be using Flash," Jobs reportedly said. "The world is moving to HTML 5."

Adrian Ludwig, general manager for Adobe's Flash platform product organization, suggests in a blog post that he believes Apple's real motive is control over content. "It looks like Apple is continuing to impose restrictions on their devices that limit both content publishers and consumers," Ludwig wrote. "Without Flash support, iPad users will not be able to access the full range of Web content, including over 70 percent of games and 75 percent of video on the Web."

Several content producers tell The Times that the stalemate could indeed hasten acceptance of HTML 5. John Gruber, author of the popular Mac blog Daring Fireball, wrote that it "used to be you could argue that Flash, whatever its merits, delivered content to the entire audience you cared about. That's no longer true, and Adobe's Flash penetration is shrinking with each iPhone OS device Apple sells."

Meanwhile, as Windows 7-based slates come out this year, it is possible that OEMs will play both sides. Those that support Windows 7 already effectively support Flash, Silverlight and other runtime environments, presuming they don't strip those capabilities out. Because of the broader ecosystem of devices, some will purely access content, while others will both create and view it.

But regardless of how you view the iPad or slate computing in general, Apple has put a stake in the ground for a class of devices that potentially can redefine how we consume content, advancing on what Amazon has done with the Kindle.

Let's see what HP and the rest of its Wintel brethren bring out.

Posted by Jeffrey Schwartz on 02/01/2010 at 1:14 PM0 comments


Undoing a 'Disaster,' Oracle Plans To Take Sun Sales Direct

Today could be a big day for those who implement data center technology, databases, applications and software based on Java.

As reported, Oracle today will outline its plans for integrating Sun Microsystems. Part of that plan includes hiring 2,000 engineers and salespeople to sell integrated appliances that combine databases, application software, servers, storage and networking gear, according to published reports. The integrated appliance model could be a multi-billion-dollar business, Oracle CEO Larry Ellison tells The Wall Street Journal.

However, Oracle will sell its products directly to Sun's top 4,000 customers, Ellison tells The New York Times. Those 4,000 customers account for 70 percent of Oracle's revenues. Ellison indicated Oracle will move away from relying on Sun's partners to serve those customers.

"The partner model was disastrous, and we are immediately changing that," Ellison tells the Times. Such a move could leave a lot of partners out in the cold. Will those displaced partners move to pushing gear from Hewlett Packard, which earlier this month inked a $250 million agreement with Microsoft to jointly develop their own next-generation data center technology?

Or will Dell, which is looking to build its own partner ecosystem, become an attractive haven? How will Oracle's move affect the way all those players treat partners in the future?

Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 01/27/2010 at 1:14 PM0 comments


Accenture's Mistake with Tiger Woods Transcends the Fiasco

When Accenture last week ditched Tiger Woods as its sole pitchman, it served as a key reminder of what happens when you put all your eggs in one basket.

Accenture is one of the largest independent providers of IT consulting, integration and outsourcing services, with annual revenues of $21.58 billion in fiscal year 2009. The company, which had blanketed Woods across all media in its "We Know What it Takes to be a Tiger" campaign, last week scrubbed all vestiges of Woods from its Web site and removed all posters and other collateral from its offices, according to a front-page story in The New York Times.

Until last month, the golf champion had an unblemished image. It all came apart with daily allegations of indiscretions and infidelities that have since dominated the news. Accenture last week issued a statement saying "the company has determined that he is no longer the right representative," and that it will roll out a new campaign in 2010.

The new campaign will continue to carry its "High Performance Delivered" message, Accenture said. While Accenture and its ad agencies are undoubtedly scrambling to come up with a new strategy, it might be advisable not to have that message riding on a single point of failure, especially considering that enterprise customers expect their services providers to prevent that very thing in their own IT environments.

According to the Times report, Woods appeared in 83 percent of Accenture's ads. Besides having so much riding on Woods, columnist Frank Rich yesterday pointed to a conversation he had last week with New York Daily News sports columnist Mike Lupica. "If Tiger Woods was so important to Accenture, how come I didn't know what Accenture did when they fired him?" Lupica asked, as Rich recounted in his weekly column.

Granted, most buyers of IT consulting and integration services are familiar with Accenture, but its revenues and profits have declined over the past year. So maybe it was time for the company to reshape how it delivered its value proposition. Even if the Tiger Woods fiasco hadn't unfolded, perhaps he wasn't the best representative for a company providing IT services after all, noted Directions on Microsoft analyst Paul DeGroot during an e-mail exchange we had last week.

"If you want to come across as hip, fast, physically gifted, by all means hire Tiger Woods," DeGroot said. "The lesson is, align your [message] with your company image."


Posted by Jeffrey Schwartz on 12/21/2009 at 1:14 PM0 comments

